I have multiple spiders running in parallel across four instances, and all of them sit at nearly 100% CPU usage.
I deployed them with scrapyd and tried lowering the concurrency settings (scrapyd's max_concurrent_requests, and Scrapy's CONCURRENT_REQUESTS and CONCURRENT_REQUESTS_PER_DOMAIN) to their minimums, but it made no difference.
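For reference, here is roughly where those knobs live. scrapyd itself caps the number of parallel spider processes via max_proc / max_proc_per_cpu in scrapyd.conf, while the request-concurrency settings go in the project's settings.py. A minimal sketch, with illustrative values rather than a recommendation:

    # settings.py -- the concurrency knobs I lowered (values illustrative)
    CONCURRENT_REQUESTS = 1
    CONCURRENT_REQUESTS_PER_DOMAIN = 1
    # DOWNLOAD_DELAY pauses between requests; an assumption on my part,
    # not something I had tried at the time of posting
    DOWNLOAD_DELAY = 1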
I'm using Python 2.7.5 and Scrapy 0.24.
I googled for a solution and found this thread:
https://groups.google.com/forum/#!topic/scrapy-users/Rgq07ldcoPs
but I couldn't get its suggestions to work.
Thanks in advance.
Fixed this. The issue was frequent MySQL updates from my item pipeline, which eventually put heavy load on the CPU. Introducing a short delay between writes in the pipeline reduced the load and fixed the whole issue.
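For anyone hitting the same thing, here is a minimal sketch of the kind of throttled pipeline I mean. The table name, columns, and connection details are placeholders for my real schema, and I'm using MySQLdb (MySQL-python), which fits Python 2.7:

    import time
    import MySQLdb

    class ThrottledMySQLPipeline(object):
        def __init__(self):
            # Placeholder credentials -- substitute your own
            self.conn = MySQLdb.connect(host='localhost', user='scrapy',
                                        passwd='secret', db='scraping')
            self.cursor = self.conn.cursor()

        def process_item(self, item, spider):
            # One UPDATE per item was hammering MySQL; the table and
            # columns here ('items', 'price', 'url') are placeholders
            self.cursor.execute(
                "UPDATE items SET price = %s WHERE url = %s",
                (item['price'], item['url']))
            self.conn.commit()
            # The short pause that brought the CPU load down. Note that
            # time.sleep() blocks Scrapy's Twisted reactor, so keep it
            # small; batching several items per commit is a gentler
            # alternative.
            time.sleep(0.5)
            return item

        def close_spider(self, spider):
            self.conn.close()

Be aware that a blocking sleep stalls the whole reactor, so this trades crawl throughput for lower CPU. Batching writes (committing every N items) would achieve a similar reduction without pausing the crawl.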