
scrapyd shared middleware and pipeline code


I have several scrapy projects deployed to a scrapyd instance. They all use the same middleware code, which I wrote once and have since duplicated across the projects.

I would like to avoid this duplication of code. Is there a way for scrapy projects deployed on scrapyd to share the same middleware code without resorting to combining all projects into the one project?

Thanks


Solution

  • Yes, it's possible. Package your middleware as a standalone Python package and reference it from each project's settings.py. A good example of this (and one I use often) is https://github.com/svetlyak40wt/scrapy-useragents. Hope it helps.
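To make the idea concrete, here is a minimal sketch of what such a shared package might look like. The package name `shared_middleware` and the `SHARED_USER_AGENT` setting are hypothetical, chosen just for illustration; Scrapy downloader middlewares are plain classes, so the shared module need not even import scrapy:

```python
# shared_middleware/headers.py -- a module in a hypothetical standalone
# package that several Scrapy projects can depend on.

class DefaultHeadersMiddleware:
    """Downloader middleware that fills in a User-Agent header if missing."""

    def __init__(self, user_agent="shared-bot/1.0"):
        self.user_agent = user_agent

    @classmethod
    def from_crawler(cls, crawler):
        # Each project can override the value in its own settings.py;
        # SHARED_USER_AGENT is an illustrative setting name.
        return cls(crawler.settings.get("SHARED_USER_AGENT", "shared-bot/1.0"))

    def process_request(self, request, spider):
        # request.headers is dict-like, so setdefault leaves any
        # explicitly set User-Agent untouched.
        request.headers.setdefault("User-Agent", self.user_agent)
        return None  # continue normal request processing
```

Each project then enables it in its own settings.py without copying any code:

```python
# settings.py (per project)
DOWNLOADER_MIDDLEWARES = {
    "shared_middleware.headers.DefaultHeadersMiddleware": 543,
}
```

One deployment caveat: scrapyd-deploy only bundles the project itself into the egg, so the shared package must be importable in the scrapyd server's environment (e.g. pip-installed there, or declared as a dependency in the project's setup.py).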