Tags: python, scrapy, web-crawler, scrapyd

Running multiple spiders using scrapyd


I have multiple spiders in my project, so I decided to run them by uploading the project to a scrapyd server. I uploaded the project successfully, and I can see all the spiders when I run this command:

curl http://localhost:6800/listspiders.json?project=myproject

When I run the following command:

curl http://localhost:6800/schedule.json -d project=myproject -d spider=spider2

only one spider runs, because only one spider is given. But I want to run multiple spiders here, so is the following command the right way to run multiple spiders in scrapyd?

curl http://localhost:6800/schedule.json -d project=myproject -d spider=spider1,spider2,spider3........

Later I will run this command from a cron job, i.e. I will schedule it to run frequently.


Solution

  • If you want to run multiple spiders using scrapyd, schedule them one by one with a separate schedule.json request per spider. scrapyd queues the jobs and runs them in that order rather than all at the same time.

    See also: Scrapy's Scrapyd too slow with scheduling spiders
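The "one by one" approach above can be sketched in Python using only the standard library. This is a minimal example, assuming the default scrapyd address (`http://localhost:6800`); the spider names are placeholders for the names returned by `listspiders.json`. It builds one `schedule.json` POST per spider, which is what scrapyd expects instead of a comma-separated `spider=` value:

```python
from urllib.parse import urlencode
from urllib.request import Request, urlopen

SCRAPYD = "http://localhost:6800"  # assumed default scrapyd address

def schedule_request(project, spider):
    """Build one schedule.json POST request for a single spider."""
    data = urlencode({"project": project, "spider": spider}).encode()
    return Request(f"{SCRAPYD}/schedule.json", data=data)

# One request per spider: scrapyd queues the jobs and runs them in order.
# Spider names below are placeholders for your own spiders.
requests = [
    schedule_request("myproject", name)
    for name in ("spider1", "spider2", "spider3")
]

# To actually send them, uncomment the loop (requires a running scrapyd):
# for req in requests:
#     print(urlopen(req).read())
```

A script like this can be invoked directly from the cron job you mention, in place of repeating several `curl` calls.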