Currently, I have a PHP script (e.g. masterProcessor.php) that my server executes at regular intervals via cron. The script has a static list of about 80 URLs it must fetch and process. Since processing each URL takes a few minutes, the script saves time by splitting the 80 URLs into about 10 sets of 8 and firing off a second PHP script (using exec("childProcess.php")) for each set; each child script then processes the 8 or so URLs it was handed by the master.
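For reference, the cron-driven fan-out described above can be sketched roughly like this (the URL list, script name, and JSON hand-off are placeholders for illustration, not the actual code):

```php
<?php
// Sketch of the cron-based master/child pattern: split ~80 URLs into
// batches of 8 and hand each batch to a background child process.

/** Split a flat list of URLs into fixed-size batches. */
function splitIntoBatches(array $urls, int $batchSize): array
{
    return array_chunk($urls, $batchSize);
}

// Stand-in for the real static list of ~80 URLs.
$urls = array();
for ($i = 1; $i <= 80; $i++) {
    $urls[] = "http://example.com/feed/$i";
}

$batches = splitIntoBatches($urls, 8); // ~10 batches of 8

foreach ($batches as $batch) {
    // Backgrounding with "&" and discarding output lets the master
    // fire off all children without waiting for any of them.
    $cmd = 'php childProcess.php ' . escapeshellarg(json_encode($batch))
         . ' > /dev/null 2>&1 &';
    // exec($cmd); // disabled in this sketch; the real script would exec here
    echo $cmd . "\n";
}
```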
My aim is to accomplish this using Iron.io's IronWorker service, but I'm still a bit confused about how to go about it. Much of their documentation is in Ruby, which I don't know, and the few PHP examples they have show code but not how to actually set things up.
Here is how I think this would work, so please let me know if I'm right or wrong here:
Do I have that right? Is it even possible for a worker to fire off multiple other workers/tasks? If so, how is that accomplished with the PHP IronWorker library?
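From what I can tell, a master task can queue further tasks the same way your local machine would, by calling the API with your project credentials. Here is my rough sketch using the iron_worker_php client; the class and method names (IronWorker, postTask), the 'ChildWorker' worker name, the payload shape, and the credential placeholders are all assumptions to be checked against the client's README:

```php
<?php
// Sketch of a "master" worker queuing one child task per URL batch.
// The IronWorker class and postTask() call follow the iron_worker_php
// client's README but should be verified against the installed version.

/** Build one payload array per batch of URLs. */
function buildPayloads(array $urls, int $batchSize): array
{
    $payloads = array();
    foreach (array_chunk($urls, $batchSize) as $batch) {
        $payloads[] = array('urls' => $batch);
    }
    return $payloads;
}

// Placeholder list standing in for the real ~80 URLs.
$urls = array();
for ($i = 1; $i <= 80; $i++) {
    $urls[] = "http://example.com/feed/$i";
}

if (class_exists('IronWorker')) {
    // Credentials would normally come from iron.json or config, not literals.
    $iw = new IronWorker(array(
        'token'      => 'YOUR_TOKEN',       // placeholder
        'project_id' => 'YOUR_PROJECT_ID',  // placeholder
    ));
    foreach (buildPayloads($urls, 8) as $payload) {
        // 'ChildWorker' is a hypothetical worker name you would have
        // uploaded to IronWorker beforehand.
        $iw->postTask('ChildWorker', $payload);
    }
}
```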
Any guidance, tips, or resources would be much appreciated. I apologize for my ignorance; I've been reading and researching, and I've tried experimenting locally, but I can't even get a single worker to run on Windows (it says the task was executed, but no logs are printed).
I'd suggest trying https://github.com/iron-io/iron_combine - it's a small helper framework with the master script already implemented. You'll just need to implement the slave and push one message per URL to the message queue.
In any case, your approach is correct; if you don't want to use iron_combine for whatever reason, just do it as you described in the question. If you run into any problems, Iron.io has a support channel: http://www.hipchat.com/gym1ayjWj
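If you do write your own slave, the child task receives its data as a JSON payload file whose path is passed on the task's command line. A minimal sketch, assuming the runtime passes a -payload flag and the payload looks like {"urls": [...]} (both assumptions; the iron_worker_php runtime may also provide a helper for this, so check the current docs):

```php
<?php
// Sketch of a "slave" worker: locate the JSON payload file the runtime
// hands to the task, then process each URL it contains. The -payload flag
// and the payload shape are assumptions; verify against the IronWorker docs.

/** Find the payload file path in the task's command-line arguments. */
function findPayloadPath(array $args)
{
    foreach ($args as $i => $arg) {
        if ($arg === '-payload' && isset($args[$i + 1])) {
            return $args[$i + 1];
        }
    }
    return null;
}

/** Placeholder for the real per-URL work (fetch, parse, store, ...). */
function processUrl($url)
{
    echo "processing $url\n";
}

$path = findPayloadPath(isset($argv) ? $argv : array());
if ($path !== null && is_readable($path)) {
    $payload = json_decode(file_get_contents($path), true);
    foreach ($payload['urls'] as $url) {
        processUrl($url);
    }
}
```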