
python3 queue.put() blocking main


On Python 3, queue.put() blocks the main process once the queue size exceeds a particular value (1386).

I use 30 subprocesses and two queues to process integers: each subprocess gets a number from the first queue and puts the result into the second queue. I can see all the subprocesses finish successfully, but the main process stays blocked. The weird thing is that when the length is less than 1387, it works fine. Python version: 3.7.0.

#!/usr/bin/env python
from multiprocessing import Manager, Process, Lock, Queue


def work(q_in, o_out, process, lock):
    print("process ", process, "start")
    while 1:
        lock.acquire()
        if q_in.empty():
            lock.release()
            break
        d1 = q_in.get(timeout=1)
        o_out.put(d1*2)
        print("in process ", process, " queue 2 size", o_out.qsize())
        lock.release()
    print("process ", process, "done")


if __name__ == '__main__':
    length = 1386
    q_in = Queue(length)
    q_out = Queue(length)
    for i in range(length):
        q_in.put(i)
    lock = Lock()
    processes = list()
    for i in range(30):
        p = Process(target=work, args=(q_in, q_out, i, lock))
        processes.append(p)
        p.start()
    for p in processes:
        p.join()
    print("main done")

When length is 1386 or less I see "main done", but with length = 1387 all subprocesses exit, "main done" never appears, and the main process stays in a running state.
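The symptom can be reduced to something smaller: a child process that puts more items into a Queue than the underlying pipe can buffer, with nobody reading, never becomes joinable. This is a hypothetical reduction (the function name filler and the item count are mine, not from the code above):

```python
from multiprocessing import Process, Queue


def filler(q, n):
    # Each put() hands the item to the queue's feeder thread; once the
    # pipe buffer is full, the feeder blocks and the process can't exit.
    for i in range(n):
        q.put(i)


if __name__ == "__main__":
    q = Queue()
    p = Process(target=filler, args=(q, 50_000))
    p.start()

    p.join(timeout=2)  # times out: the child is stuck flushing to the pipe
    print("alive before draining:", p.is_alive())  # True

    # Draining the queue lets the child's feeder thread finish.
    for _ in range(50_000):
        q.get()
    p.join()
    print("alive after draining:", p.is_alive())  # False
```

The child only exits once the parent consumes enough items to empty the pipe.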


Solution

  • The problem is that nothing is consuming the data from q_out. The workers are able to complete their work because the queue is buffered on their side, but (some of) the processes remain alive waiting to be able to flush the data to an underlying pipe. See https://bugs.python.org/issue29797 for more details.

    The pipe seems to be able to hold 1386 items in its buffer in your case.
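A minimal sketch of the fix, assuming the goal is simply to collect all results: drain q_out in the main process before calling join(). The worker here is a simplified stand-in for the question's work function (no lock, timeout-based exit), not the asker's exact code:

```python
from multiprocessing import Process, Queue
from queue import Empty


def work(q_in, q_out):
    # Pull items until the input queue is drained, then exit.
    while True:
        try:
            item = q_in.get(timeout=1)
        except Empty:
            break
        q_out.put(item * 2)


if __name__ == "__main__":
    length = 2000  # comfortably above the ~1386-item pipe buffer seen above
    q_in, q_out = Queue(), Queue()
    for i in range(length):
        q_in.put(i)

    procs = [Process(target=work, args=(q_in, q_out)) for _ in range(4)]
    for p in procs:
        p.start()

    # Consume q_out BEFORE joining; joining first is what deadlocks,
    # because workers block flushing their buffered puts into the pipe.
    results = [q_out.get() for _ in range(length)]
    for p in procs:
        p.join()

    print(len(results))  # 2000
```

An alternative with the same effect is to have a dedicated consumer process read q_out while the main process joins the workers.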