Tags: python, python-2.7, scrapyd, scrapy

Change the number of concurrently running spiders in scrapyd


Hey, so I have about 50 spiders in my project and I'm currently running them via a scrapyd server. I'm running into an issue where some of the resources I use get locked, which makes my spiders fail or run really slowly. I was hoping there was some way to tell scrapyd to run only 1 spider at a time and leave the rest in the pending queue. I didn't see a configuration option for this in the docs. Any help would be much appreciated!


Solution

  • This can be controlled by scrapyd settings. Set max_proc to 1:

    max_proc

    The maximum number of concurrent Scrapy processes that will be started.
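
    For reference, scrapyd reads its settings from a scrapyd.conf file
    (for example /etc/scrapyd/scrapyd.conf, ~/.scrapyd.conf, or a
    scrapyd.conf in the directory where the daemon is launched). A
    minimal sketch of that file, assuming defaults for everything
    except the process limit:

        [scrapyd]
        # Run at most one Scrapy process at a time; every other
        # scheduled spider stays in the pending queue until the
        # running one finishes.
        max_proc = 1

        # Alternatively, leave max_proc at 0 (the default) and cap
        # the number of processes per CPU instead:
        # max_proc_per_cpu = 1

    Restart scrapyd after editing the file so it picks up the new
    limit. Jobs scheduled afterwards, e.g. via the schedule.json
    endpoint (project and spider names below are placeholders):

        curl http://localhost:6800/schedule.json -d project=myproject -d spider=myspider

    will queue up and be executed one at a time.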