Python Multiprocessing With (LIFO) Queues


I'm trying to use multiprocessing in Python to have a function keep getting called within a loop, and subsequently access the latest return value from the function (by storing the values in a LIFO Queue).

Here is a code snippet from the main program

import Queue
import multiprocessing

q = Queue.LifoQueue()
while True:
    p = multiprocessing.Process(target=myFunc, args=(q,))  # args must be a tuple
    p.daemon = True
    p.start()
    if not q.empty():
        # do something with q.get()

And here's a code snippet from myFunc

def myFunc(q):
    x = calc()  # slow computation
    q.put(x)    # push the latest result onto the queue

The problem is that the main loop always thinks q is empty. However, I've checked that myFunc() is placing values into q (by adding a q.empty() check right after the q.put(x)), so the queue shouldn't be empty.

What can I do so that the main loop can see the values placed in the queue? Or am I going about this in an inefficient way? (I do need myFunc and the main loop to be run separately though, since myFunc is a bit slow and the main loop needs to keep performing its task)


Solution

  • Queue.LifoQueue is not suitable for multiprocessing; only multiprocessing.Queue is, as it is specially designed for this use case. That means values put into a Queue.LifoQueue are only available to the local process, because the queue is not shared between subprocesses (see the first sketch below).

    A possibility would be to use a shared list from a SyncManager (SyncManager.list()) instead. When used with only append and pop, a list behaves just like a LIFO queue (see the second sketch below).
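
A minimal sketch of the multiprocessing.Queue approach, with the value 42 standing in for the question's calc(). Note that multiprocessing.Queue is FIFO rather than LIFO, so if only the latest result matters you would drain the queue and keep the last item.

import multiprocessing

def myFunc(q):
    x = 42  # placeholder for the slow calc() in the question
    q.put(x)

if __name__ == '__main__':
    # multiprocessing.Queue is shared between parent and child processes,
    # so a value put() in the worker is visible to the main process.
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=myFunc, args=(q,))
    p.start()
    print(q.get())  # blocks until the worker has put a value
    p.join()

In the original loop you could call q.get_nowait() and catch the Empty exception instead of blocking.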
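
And a minimal sketch of the manager-based alternative, again with 42 standing in for calc(); the names shared and latest are illustrative. manager.list() returns a proxy to a list held by the manager's server process, and append()/pop() on that proxy give LIFO behaviour.

import multiprocessing

def myFunc(shared):
    x = 42            # placeholder for the slow calc() in the question
    shared.append(x)  # "push" the newest result

if __name__ == '__main__':
    manager = multiprocessing.Manager()  # starts a SyncManager server process
    shared = manager.list()              # list proxy, visible to all processes
    p = multiprocessing.Process(target=myFunc, args=(shared,))
    p.start()
    p.join()
    if len(shared) > 0:
        latest = shared.pop()            # pop() returns the most recent item -> LIFO order
        print(latest)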