I currently have a PHP daemon that needs to run very quick iterations through a never-ending loop.
It looks something like `while (true) { /* do something */ }`.
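For context, here is a minimal sketch of the kind of loop I mean (the URLs and the `file_get_contents` calls are just placeholders for the real work):

```php
<?php
// Simplified sketch of the daemon; the real loop does more than fetch URLs.
$urls = ['http://example.com/a', 'http://example.com/b']; // placeholder targets

while (true) {
    foreach ($urls as $url) {
        // Each request blocks, so total iteration time grows
        // linearly with the number of requests.
        $response = file_get_contents($url);
        // ... decode the JSON response, write to MySQL, etc.
    }
    usleep(100000); // brief pause between iterations (0.1 s)
}
```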
The problem is that my application requires this loop to do more and more work as it grows, but if an iteration takes too long, some actions are missed. Since the loop sends out network requests, each iteration takes time: currently it only sends about 5 requests per loop, but I can already see that growing to around 50 soon.
If each request takes 1 second, a 50-second iteration is unacceptable; information will be missed.
My question is: how can I improve my current system to accommodate a larger number of requests while still keeping loop iterations very quick?
I know PHP isn't ideal for this situation, but at the time it provided a very elegant solution: I need to work with a MySQL database and JSON encoding/decoding, and PHP was perfect for that. If there is a better language to do this in, please let me know.
Start multiple workers, each handling a subset of the URLs. Much depends on the details of what you want to achieve, but a message queue (like RabbitMQ) would probably help too. For example: every worker receives messages from the queue, and each message contains the URL it should take care of.
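Here is a minimal sketch of such a worker using the php-amqplib library; the queue name `url_jobs` and the connection details are only example assumptions:

```php
<?php
require_once __DIR__ . '/vendor/autoload.php';

use PhpAmqpLib\Connection\AMQPStreamConnection;

// Connect to a local RabbitMQ broker (these are the default credentials).
$connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel = $connection->channel();

// Durable queue so messages survive a broker restart; 'url_jobs' is an example name.
$channel->queue_declare('url_jobs', false, true, false, false);

// Hand each worker only one unacknowledged message at a time,
// so the load spreads evenly across however many workers you start.
$channel->basic_qos(null, 1, null);

$callback = function ($msg) {
    $url = $msg->body; // each message carries one URL to handle
    $response = file_get_contents($url);
    // ... process $response: decode the JSON, write to MySQL, etc.
    $msg->ack(); // acknowledge only after the work is done
};

$channel->basic_consume('url_jobs', '', false, false, false, false, $callback);

// Block waiting for messages; run several copies of this script in parallel.
while ($channel->is_consuming()) {
    $channel->wait();
}

$channel->close();
$connection->close();
```

On the dispatch side, your main loop then just publishes one message per URL (via `basic_publish`) instead of performing the requests itself, so each iteration stays fast no matter how many URLs there are.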