Tags: python, scrapy, scrapyd

Deploy Scrapy on a remote machine


Hello, I am using Scrapy and I have managed to deploy it with Scrapyd. This works perfectly on my localhost, but when I try to run it from another computer it doesn't work.

This command works fine:

curl http://localhost:6800/schedule.json -d project=webplode -d spider=pingwebsite -d file=./testfiles/testfiles.xlsx

But when I run this one:

curl http://myip:6800/schedule.json -d project=webplode -d spider=pingwebsite -d file=./testfiles/testfiles.xlsx

I get the following error message:

Failed to connect to myip port 6800: Connection refused
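
"Connection refused" on the public IP while localhost works usually means the service is only listening on 127.0.0.1. One way to check which address Scrapyd is bound to on Windows (assuming the default port 6800):

netstat -an | findstr 6800

If this only shows 127.0.0.1:6800 rather than 0.0.0.0:6800 or the machine's IP, Scrapyd is not reachable from other machines.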

Here is my scrapy.cfg:

# Automatically created by: scrapy startproject
#
# For more information about the [deploy] section see:
# https://scrapyd.readthedocs.io/en/latest/deploy.html

[settings]
default = webplode.settings

[deploy:local]
url = http://myip:6800/
project = webplode

I can't find scrapyd.conf; I am running this on Windows.


Solution

  • I finally managed to solve my problem. I just had to change the settings of my scraper and add my IP address as shown below:

    scrapy.cfg:

    [settings]
    default = webplode.settings
    
    [scrapyd]
    bind_address = myip
    
    [deploy:local]
    url = http://localhost:6800/
    project = webplode
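
    With the [scrapyd] section in place (and Scrapyd started from the project directory, where it should pick up scrapy.cfg), scheduling from another machine should now work:

    curl http://myip:6800/schedule.json -d project=webplode -d spider=pingwebsite -d file=./testfiles/testfiles.xlsx

    For reference, the same setting can also live in a standalone scrapyd.conf; Scrapyd's default search path on Windows includes c:\scrapyd\scrapyd.conf and a scrapyd.conf in the directory it is started from. A minimal sketch (0.0.0.0 listens on all interfaces; use a specific IP to restrict it):

    [scrapyd]
    bind_address = 0.0.0.0
    http_port = 6800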