multiprocessing.Pool.apply_async() eating up memory

Posted by Austin on Stack Overflow, 2010-06-08

I want to use multiprocessing to make my script faster... I'm still new to this. The Python docs assume you already understand multiprocessing and what-not.

So...

I have code that looks like this:

from itertools import izip
from multiprocessing import Pool

p = Pool()
# hugeseta, hugesetb, and number_crunching are defined elsewhere
for i, j in izip(hugeseta, hugesetb):
    p.apply_async(number_crunching, (i, j))

Which gives me great speed!

However, hugeseta and hugesetb are really huge. Pool keeps all of the _i_s and _j_s in memory after they've finished their job (basically, they print output to stdout). Is there any way to del i and j after they complete?
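
One way to keep memory bounded (a sketch, not from the original post) is to stream the work through Pool.imap_unordered instead of queueing every call up front with apply_async: the pool pulls input pairs as workers free up and yields each result as soon as it is ready. The crunch function and the xrange stand-ins below are hypothetical placeholders for number_crunching and the real data sets:

from itertools import izip
from multiprocessing import Pool

def crunch(pair):
    # stand-in for number_crunching; each worker gets one (i, j) pair
    i, j = pair
    return i * j

if __name__ == '__main__':
    hugeseta = xrange(10 ** 6)  # hypothetical stand-ins for the real data
    hugesetb = xrange(10 ** 6)
    p = Pool()
    # imap_unordered streams tasks to the workers in chunks and yields
    # results as they complete, so finished inputs and results can be
    # garbage-collected instead of piling up
    for result in p.imap_unordered(crunch, izip(hugeseta, hugesetb), 1000):
        pass  # consume each result immediately and let it go
    p.close()
    p.join()

A callback on apply_async can achieve something similar for the results, but imap_unordered also keeps the submission loop itself from buffering millions of pending tasks at once.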
