I'm trying to push my scraped data to my Firebase account from the cloud, but I'm getting this ImportError when I run the spider there. I tried creating a new project and even reinstalling firebase and shub against a specific Python version, but that didn't help.
The spider runs perfectly on my machine and doesn't show any ImportError. Here is the error log:
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/scrapy/utils/defer.py", line 102, in iter_errback
    yield next(it)
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/middlewares.py", line 30, in process_spider_output
    for x in result:
  File "/usr/local/lib/python2.7/site-packages/scrapy/spidermiddlewares/offsite.py", line 29, in process_spider_output
    for x in result:
  File "/usr/local/lib/python2.7/site-packages/scrapy/spidermiddlewares/referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "/usr/local/lib/python2.7/site-packages/scrapy/spidermiddlewares/urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "/usr/local/lib/python2.7/site-packages/scrapy/spidermiddlewares/depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "/app/__main__.egg/Terminator/spiders/IcyTermination.py", line 18, in parse
    from firebase import firebase
ImportError: No module named firebase
Any help?
I couldn't comment due to reputation, but have you created your requirements.txt? The error means the firebase package is not installed in the cloud environment where the spider runs, even though it is installed locally.
Here you will find how to deploy your own dependencies to Scrapinghub.
Basically, you create a requirements.txt file at the root of your project with one dependency per line, then add

requirements_file: requirements.txt

to your scrapinghub.yml file.
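For example, the two files might look like this. Note these are hedged sketches: the PyPI package that provides `from firebase import firebase` is commonly `python-firebase` (verify with `pip freeze` on the machine where the spider works), and the project ID below is a placeholder.

```
# requirements.txt -- one dependency per line,
# pinned to the versions that work locally (check with `pip freeze`)
python-firebase==1.2
```

```
# scrapinghub.yml -- at the root of the Scrapy project
projects:
  default: 12345          # placeholder: your Scrapinghub project ID
requirements_file: requirements.txt
```

After adding both files, redeploy with `shub deploy` so the cloud environment installs the listed dependencies before running the spider.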