I have code like this:
from multiprocessing import Pool

def worker(data_row):
    print("In worker", data_row)

if __name__ == '__main__':
    db_conn = init_db_conn()
    rows = db_conn.execute(query).fetchall()
    db_conn.close()

    pool = Pool(4)
    jobs = []
    for row in rows:
        print("In main", row)
        job = pool.apply_async(worker, (row,))
        jobs.append(job)

    for job in jobs:
        job.get()

    pool.join()
    pool.close()
When I execute this, only the "In main" lines are printed and no "In worker" lines appear, so the worker code is never executed. How can I resolve this?
From the docs, you have to call close() before join(); see the corrected sketch after the quoted documentation below.
class multiprocessing.pool.Pool
close()
Prevents any more tasks from being submitted to the pool.
Once all the tasks have been completed the worker processes will exit.
terminate()
Stops the worker processes immediately without completing
outstanding work. When the pool object is garbage collected
terminate() will be called immediately.
join()
Wait for the worker processes to exit.
One must call close() or terminate() before using join().
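For completeness, here is a minimal runnable sketch with the ordering fixed. It is not your exact code: the DB connection and query aren't shown in the question, so they are replaced by a placeholder list of rows. The only functional change from your version is that close() is called before join().

from multiprocessing import Pool

def worker(data_row):
    print("In worker", data_row)

if __name__ == '__main__':
    # Placeholder for the rows fetched from the database in the original post.
    rows = [(1, 'a'), (2, 'b'), (3, 'c')]

    pool = Pool(4)
    jobs = [pool.apply_async(worker, (row,)) for row in rows]

    for job in jobs:
        job.get()    # block until each task finishes; re-raises any worker exception

    pool.close()     # no more tasks will be submitted to the pool
    pool.join()      # wait for the worker processes to exit

With join() called first, the pool is still accepting tasks, which is exactly the situation the docs rule out ("One must call close() or terminate() before using join()").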