Distributing processing for an application that wasn't designed with that in mind
Posted by Tim on Server Fault, 2009-11-24.
We've got an application at work that sits and does a whole bunch of iterative processing on some data files to run simulations. It's an old Win32 application that isn't multi-processor aware, so our newer multi-core computers and workstations mostly sit idle while running it.
However, since it's deployed by a typical Windows InstallShield installer, I can't seem to install and run multiple copies of the application side by side.
The work can be split up manually before processing, which lets us distribute it across multiple machines, and the results can be joined back together afterwards to make a complete simulation. But we still can't take advantage of multiple cores on any one machine.
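For concreteness, the manual split/join step amounts to something like this Python sketch. The line-based chunking and the file names here are placeholders; the real data format is more involved:

    # Rough sketch: split one input file into N chunks, then merge the
    # per-chunk results back into a single output.
    from pathlib import Path

    def split_input(src: Path, n_chunks: int, out_dir: Path) -> list[Path]:
        lines = src.read_text().splitlines(keepends=True)
        per_chunk = -(-len(lines) // n_chunks)  # ceiling division
        chunks = []
        for i in range(n_chunks):
            chunk = out_dir / f"chunk_{i:03d}.dat"
            chunk.write_text("".join(lines[i * per_chunk:(i + 1) * per_chunk]))
            chunks.append(chunk)
        return chunks

    def merge_results(result_files: list[Path], dest: Path) -> None:
        # Concatenate the per-chunk outputs into one complete simulation.
        with dest.open("w") as out:
            for part in result_files:
                out.write(part.read_text())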
Is there a product out there that would let me "compartmentalize" an installation (or four) so I can take advantage of a multi-core CPU? I had thought of using Microsoft SoftGrid, but I believe that still depends on a remote server to do the heavy lifting (please correct me if I'm wrong).
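The end state I'm after on a single quad-core box is something like the sketch below: N copies of the executable, each running from its own isolated working directory so their files don't collide. Here sim.exe, its command line, and the directory layout are all hypothetical; it assumes the app takes its input path as an argument and writes output relative to its working directory:

    # Sketch: run one instance per chunk in parallel, each in its own cwd.
    import subprocess
    from pathlib import Path

    def run_parallel(exe: Path, chunks: list[Path], work_root: Path) -> None:
        procs = []
        for i, chunk in enumerate(chunks):
            work_dir = work_root / f"instance_{i}"
            work_dir.mkdir(parents=True, exist_ok=True)
            # Each instance gets an isolated working directory.
            procs.append(subprocess.Popen([str(exe), str(chunk)], cwd=work_dir))
        for p in procs:
            p.wait()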
Furthermore, is there a way I can distribute the workload beyond the one machine? So an input could be split into, say, 50 chunks, handed out to 50 machines, and processed in parallel?
All without really changing the original application?
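Something like this hypothetical dispatcher is what I'm imagining, using PsExec from Sysinternals to fan the chunks out to the machines. The machine names, share paths, and executable location are made up, and the error handling is deliberately minimal:

    # Sketch: hand 50 chunks to 50 machines with PsExec and wait for all.
    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    MACHINES = [f"SIMBOX{i:02d}" for i in range(50)]  # placeholder host names
    CHUNKS = [rf"\\fileserver\sims\chunk_{i:03d}.dat" for i in range(50)]

    def run_remote(machine: str, chunk_unc: str) -> int:
        # -c copies the executable to the remote machine before running it.
        cmd = ["psexec", f"\\\\{machine}", "-c", r"C:\tools\sim.exe", chunk_unc]
        return subprocess.call(cmd)

    with ThreadPoolExecutor(max_workers=len(MACHINES)) as pool:
        codes = list(pool.map(run_remote, MACHINES, CHUNKS))
    print("non-zero exit codes:", [c for c in codes if c])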
In a perfect world, I'd get the application to take advantage of a desktop grid such as BOINC, but as with most "mission-critical" corporate applications, the need is there but the money is not.
Thank you in advance (and sorry if this isn't appropriate for Server Fault).