I have some simple code like the one shown below. The first task blocks the queue, so none of the later ones complete.
I would like to be able to kill an AsyncResult if it exceeds its .get() timeout, so the pool queue can move on. However, I couldn't find any simple way to do it without modifying "myf". Does anyone have an idea how this could be achieved?
import multiprocessing
import time

def myf(x):
    # x == 0 simulates a task that hangs
    if x == 0:
        time.sleep(100)
    else:
        time.sleep(2)
    return 'done'

pool = multiprocessing.Pool(processes=1)
results = []
for x in range(8):
    results.append(pool.apply_async(myf, args=[x]))
pool.close()

for res in results:
    try:
        print(res.get(3))
    except multiprocessing.TimeoutError:
        print('time out')
multiprocessing.Pool was not designed for this use case.
Forcibly killing one of the workers leads to undefined behaviour, which can range from the pool remaining stuck forever to your program crashing.
There are libraries that solve this problem. pebble lets you set a timeout per task and stops the worker if the time limit is exceeded.