python-3.x, python-multiprocessing

python - lock process accessing queue while queue.put() for n seconds


I have the following code (simplified):

from multiprocessing import Process, Queue

def f1(queue):
    while True:
        # do some stuff and get a variable called data
        # ...

        queue.put(data)

def f2(queue):
    while True:
        if not queue.empty():
            data = queue.get(timeout=300)
            print('queue data: ' + str(data))

if __name__ == '__main__':
    q = Queue()
    p1 = Process(target=f1, args=(q,))
    p2 = Process(target=f2, args=(q,))
    p1.start()
    p2.start()
    p1.join()
    p2.join()

The problem I'm facing is that I don't know how to lock the queue in f1 in order to keep putting data for n seconds, before f2 is able to read it.

I tried with timeouts but, of course, it didn't work. Basically, the expected behaviour is that f1 keeps appending data to the queue, and after n seconds f2 can get whatever is in it. So, summarising: f1 should run continuously, and f2 should also run continuously but only access the queue every n seconds.

I can think of not-so-elegant ways of doing this with the time library, but I guess there has to be another way. Maybe the code's approach is wrong and I shouldn't be using Process and Queue but Pipes or something else.

Thanks in advance!


Solution

  • For this particular case, in which I was using the multiprocessing library rather than threading or asyncio, I found the simplest way to do this was a plain sleep, so f2() ends up like:

    import time

    def f2(queue):
        while True:
            time.sleep(300)  # wait 5 minutes before touching the queue
            while not queue.empty():  # drain everything f1 queued meanwhile
                data = queue.get()
                print('queue data: ' + str(data))


    Note the inner while: a single get() would only pop one item per 5-minute cycle while f1 keeps producing, so the backlog would grow forever.

    As I said, maybe not the most elegant solution, but I couldn't come up with anything better for the time being (and this particular use case).
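As an aside (not part of the original answer), the same every-n-seconds batching can also be written without polling empty(), which is documented as unreliable across processes: block in queue.get(timeout=...) until a deadline and collect a batch. A minimal sketch; the helper name drain_for is my own:

```python
import time
from multiprocessing import Queue
from queue import Empty  # multiprocessing.Queue.get raises queue.Empty on timeout

def drain_for(queue, seconds):
    """Collect every item that arrives on `queue` within a window of `seconds`.

    Blocks in get(timeout=...) instead of checking empty() first, so it
    cannot miss items put between an empty() check and the get() call.
    """
    deadline = time.monotonic() + seconds
    batch = []
    while True:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            break
        try:
            batch.append(queue.get(timeout=remaining))
        except Empty:  # nothing more arrived before the deadline
            break
    return batch

if __name__ == '__main__':
    q = Queue()
    for i in range(3):
        q.put(i)
    # with a 0.5 s window, all three queued items are collected in order
    print(drain_for(q, 0.5))
```

With this helper, f2 would just loop over `print(drain_for(queue, 300))` instead of sleeping and draining by hand.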