
What is the best option to send data to another script without being blocked?


I have been working on two scripts: one (call it generate_data.py) generates the data (it looks for new data 24/7), and notifications.py receives the data from generate_data, converts it to JSON, and posts it to my server with a requests call.

One of the problems I have is that they need to run simultaneously. That means that while generate_data.py is looking for new data, handing data off to notifications.py should NOT make it wait for notifications.py to finish. Instead it should keep working the whole time, only sending data to notifications.py when something new is found, and then continue looking for new data.

A scenario that can happen is that generate_data.py gets 5 new pieces of data at the same time, all of which need to be sent to notifications.py, which in turn needs to send the requests to my server as soon as possible without any delay/stopping/blocking.

I got a few suggestions such as using a listener, a queue, RPC, threading, or multiprocessing, but here I am. My question is: what would be my best option here, where generate_data.py runs 24/7 and notifications.py should only post once it gets data from generate_data, posting it to my server as fast as possible without being blocked/stopped? Any example would also be appreciated! (Using Python)
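One of the options listed above (threading plus a queue) can be sketched as follows. This is a minimal illustration, not code from the question: the generator thread never waits for the sender, it just puts items on the queue and keeps looking for data, while the sender thread blocks on the queue and posts items as they arrive (the print stands in for the JSON/requests call).

```python
# Sketch of the threading + queue option: the producer never blocks
# on the consumer; it only enqueues new items and moves on.
import queue
import threading

data_queue = queue.Queue()

def generate_data():
    # Stand-in for the 24/7 loop that looks for new data.
    for i in range(5):
        data_queue.put({"id": i})   # never blocks for an unbounded Queue
    data_queue.put(None)            # sentinel: tell the consumer to stop

def send_notifications():
    while True:
        item = data_queue.get()     # blocks only this sender thread
        if item is None:
            break
        # Here you would json-encode `item` and POST it with requests.
        print("posting", item)
        data_queue.task_done()

producer = threading.Thread(target=generate_data)
consumer = threading.Thread(target=send_notifications)
producer.start()
consumer.start()
producer.join()
consumer.join()
```

In a real setup both loops would run forever; the sentinel and the fixed range of 5 items are only there to make the sketch terminate.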


Solution

  • So while the generate_data script generates data and sends it to the notification script, and the notification script is still processing the data it already received, where are you storing the data generated by generate_data in the meantime, to push to the notification script later?

    You can do it as you asked with just Python: have generate_data always write the generated data to a file, and have the notifications script read the content and delete only the content it has read, so that both can run in parallel.
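A minimal sketch of the file hand-off described above, assuming a shared spool file of one JSON object per line (the file name and the JSON-lines layout are my choices, not from the answer): generate_data.py appends and returns immediately, and notifications.py reads everything, then truncates only what it has read.

```python
# File-based hand-off between the two scripts.
import json
import os

SPOOL = "pending_data.jsonl"  # hypothetical spool file shared by both scripts

def write_item(item):
    # generate_data.py side: append one line and return immediately.
    with open(SPOOL, "a") as f:
        f.write(json.dumps(item) + "\n")

def drain_items():
    # notifications.py side: read all pending lines, then delete
    # only the content that was read.
    if not os.path.exists(SPOOL):
        return []
    with open(SPOOL, "r+") as f:
        lines = f.readlines()
        f.seek(0)
        f.truncate()
    return [json.loads(line) for line in lines]

write_item({"id": 1})
write_item({"id": 2})
items = drain_items()  # each dict would then be POSTed to the server
```

Note that without file locking there is a race: an append that lands between `readlines()` and `truncate()` is lost. An in-process queue (or `multiprocessing.Queue` across processes) avoids that problem entirely.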