I am scraping a website which returns a list of URLs.
Example: scrapy crawl xyz_spider -o urls.csv
It works absolutely fine, but now I want it to create a fresh urls.csv each run instead of appending data to the existing file. Is there any parameter I can pass to enable this?
Unfortunately scrapy can't do this at the moment.
There is a proposed enhancement on github though: https://github.com/scrapy/scrapy/issues/547
However, you can easily send the output to stdout and redirect that to a file:
scrapy crawl myspider -t json --nolog -o - > output.json
-o - means output to minus, and minus in this case means stdout.
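Applied to the CSV export from the question (spider and file names taken from there), the same trick would look like the line below; since the shell's > truncates the target file, you get a fresh urls.csv on every run:

scrapy crawl xyz_spider -t csv --nolog -o - > urls.csv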
You can also create an alias that deletes the file before running scrapy, something like:
alias sc='rm -f output.csv && scrapy crawl myspider -o output.csv'
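If you run the spider from a script instead of the command line, the same idea works in Python: delete the old export before starting the crawl. This is only a minimal sketch, assuming a spider named myspider inside a regular Scrapy project and the older FEED_URI/FEED_FORMAT settings; adjust the names to your project.

import os
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

OUTPUT = "output.csv"  # hypothetical export path

# Remove any previous export so the feed exporter starts from an empty file.
if os.path.exists(OUTPUT):
    os.remove(OUTPUT)

settings = get_project_settings()
settings.set("FEED_URI", OUTPUT)    # same effect as -o on the command line
settings.set("FEED_FORMAT", "csv")  # same effect as -t on the command line

process = CrawlerProcess(settings)
process.crawl("myspider")           # spider name is an assumption
process.start()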