Tags: python, parallel-processing, deadlock, python-multiprocessing

Python: AsyncResult.successful() does not return


I am experimenting with Python multiprocessing. I want several processes that are independent of each other to run in parallel and, as each one returns, to check whether it succeeded using AsyncResult.successful(). However, when I call successful() in the callback for my subprocess, the script hangs.

import multiprocessing as mp
import time

result_map = {}

def foo_pool(x):
    time.sleep(2)
    print(x)
    return x


result_list = []
def log_result(result):
    print(result_map[result].successful())    # hangs
    result_list.append(result)


def apply_async_with_callback():
    pool = mp.Pool()
    for i in range(10):
        result_map[i] = pool.apply_async(foo_pool, args=(i,), callback=log_result)

    pool.close()
    pool.join()

    print(result_list)



if __name__ == '__main__':
    apply_async_with_callback()

Solution

  • You don't need to check successful() in the callback: the pool invokes the callback only when the task succeeded. Calling successful() there is also what causes the hang. As the code below shows, _set() runs the callback before marking the result ready, so successful() fails (it requires a ready result); the exception escapes before the result is marked ready and removed from the pool's cache, so pool.join() waits forever.

    Following is the relevant code (multiprocessing/pool.py - AsyncResult._set):

    def _set(self, i, obj):
        self._success, self._value = obj
        if self._callback and self._success:  # <-----
            self._callback(self._value)       # <-----
        self._cond.acquire()
        try:
            self._ready = True
            self._cond.notify()
        finally:
            self._cond.release()
        del self._cache[self._job]
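
  • If you do want to distinguish successes from failures, a minimal Python 3 sketch: use error_callback (available since Python 3.2) for failed tasks, and defer any successful() calls until after pool.join(), when every result is guaranteed to be ready.

    ```python
    import multiprocessing as mp
    import time

    result_list = []

    def foo_pool(x):
        time.sleep(0.1)
        return x

    def log_result(result):
        # Invoked only for tasks that completed without raising.
        result_list.append(result)

    def log_error(exc):
        # Invoked instead of the callback when a task raises (Python 3.2+).
        print("task failed:", exc)

    def apply_async_with_callback():
        pool = mp.Pool()
        results = [pool.apply_async(foo_pool, args=(i,),
                                    callback=log_result,
                                    error_callback=log_error)
                   for i in range(10)]
        pool.close()
        pool.join()
        # After join() every result is ready, so successful() is safe here.
        return all(r.successful() for r in results)

    if __name__ == '__main__':
        print(apply_async_with_callback())
    ```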