multiprocessing.Pool.apply_async() eating up memory
- by Austin
I want to use multiprocessing to make my script faster...
I'm still new to this. The Python docs assume you already understand multiprocessing and what-not.
So...
I have code that looks like this
from multiprocessing import Pool

p = Pool()
# zip is lazy in Python 3, so the pairs are produced on demand
for i, j in zip(hugeseta, hugesetb):
    p.apply_async(number_crunching, (i, j))
Which gives me great speed!
However, hugeseta and hugesetb are really huge. Pool keeps all of the _i_s and _j_s in memory after their tasks have finished (the tasks basically just print output to stdout). Is there any way to del i and j after they complete?
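For what it's worth, one memory-bounded pattern is to submit the work in fixed-size batches instead of all at once, so only one batch of inputs and results is alive at a time. This is a sketch, not necessarily the only fix; hugeseta, hugesetb, and number_crunching below are stand-ins for the real inputs and worker function:

from itertools import islice
from multiprocessing import Pool

def number_crunching(i, j):
    # stand-in for the real work
    return i + j

def batches(iterable, size):
    """Yield lists of up to `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

if __name__ == "__main__":
    hugeseta = range(1_000_000)  # stand-ins for the real huge inputs
    hugesetb = range(1_000_000)
    with Pool() as p:
        for batch in batches(zip(hugeseta, hugesetb), 1000):
            # starmap blocks until the whole batch is done, so at most
            # 1000 (i, j) pairs and their results are held at once
            for result in p.starmap(number_crunching, batch):
                print(result)

The trade-off is that the pool idles briefly between batches; a larger batch size reduces that overhead at the cost of more memory per batch.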