Parallel processing from a command queue on Linux (bash, python, ruby... whatever)

Posted by mlambie on Stack Overflow, 2009-01-21.

I have a list/queue of 200 commands that I need to run in a shell on a Linux server.

I want at most 10 processes (from the queue) running at once. Some processes will take a few seconds to complete; others will take much longer.

When a process finishes I want the next command to be "popped" from the queue and executed.

Does anyone have code to solve this problem?
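
To make it concrete, something along these lines is roughly what I have in mind in Python (an untested sketch, assuming Python 3.2+ for concurrent.futures; the commands list is just a placeholder for the real 200 commands):

    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    # Placeholder list -- in reality there are 200 shell commands.
    commands = ["sleep 5", "sleep 1", "echo done"]

    def run(cmd):
        # Each worker thread blocks on its own subprocess, so the
        # pool size caps how many commands run at the same time.
        return subprocess.call(cmd, shell=True)

    # At most 10 commands run concurrently; as each one finishes,
    # the next command is pulled from the pending list.
    with ThreadPoolExecutor(max_workers=10) as pool:
        exit_codes = list(pool.map(run, commands))

    print(exit_codes)  # exit codes, in the same order as the commands

A bash or ruby equivalent would be just as welcome.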

Further elaboration:

There are 200 pieces of work to be done, held in a queue of some sort. I want at most 10 pieces of work in progress at once. When a thread finishes a piece of work, it should ask the queue for the next piece. If there is no more work in the queue, the thread should die. When all the threads have died, all the work has been done.
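
Spelled out as an untested Python sketch of that worker model (the thread count and the placeholder commands are assumptions; the real work items would come from my command list):

    import queue
    import subprocess
    import threading

    NUM_WORKERS = 10

    # Fill the queue up front with placeholder commands;
    # in practice there would be 200 of them.
    work_queue = queue.Queue()
    for cmd in ["sleep 5", "sleep 1", "echo done"]:
        work_queue.put(cmd)

    def worker():
        while True:
            try:
                # Ask the queue for the next piece of work.
                cmd = work_queue.get_nowait()
            except queue.Empty:
                # No more work in the queue: the thread dies.
                return
            subprocess.call(cmd, shell=True)

    threads = [threading.Thread(target=worker) for _ in range(NUM_WORKERS)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()  # when every thread has died, all the work is done

Because the queue is filled before the workers start, get_nowait() raising Empty really does mean there is no work left.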

The actual problem I'm trying to solve is using imapsync to synchronize 200 mailboxes from an old mail server to a new one. Some users have large mailboxes that take a long time to sync; others have very small mailboxes that sync quickly.
