The spider stopped in the middle of a crawl (after a 7-hour run, ~20K requests). The job status is "failure", even though there are no ERROR messages in the log. The log looks as if the code simply stopped executing somewhere in a particular range of lines, with no error reported. It happened inside my spider_idle method override. Logging is enabled, and all preceding INFO messages indicate a normal run of the spider. I don't know how to enable DEBUG messages in the Scrapinghub log.
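Regarding the DEBUG part: as far as I know, Scrapy's log verbosity is controlled by the standard LOG_LEVEL setting, so a minimal sketch like the following in the project's settings.py should surface DEBUG messages (on Scrapy Cloud the same settings can reportedly also be set per project/spider/job in the dashboard's Settings page; the values below are illustrative):

```python
# settings.py -- Scrapy project settings (sketch, standard Scrapy settings)

# LOG_LEVEL is a built-in Scrapy setting; 'DEBUG' is the most verbose level,
# so DEBUG-level messages will appear in the job log.
LOG_LEVEL = 'DEBUG'

# LOGSTATS_INTERVAL (seconds) is also a built-in Scrapy setting; a periodic
# stats line can help pinpoint when a long run actually stalls.
LOGSTATS_INTERVAL = 60.0
```

Note that DEBUG output on a 20K-request crawl can be large, so it may be worth enabling it only for a diagnostic run.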
I checked memory consumption: it is stable, at least in short tests; I'm now waiting for long-run results.
How can I retrieve more information after a job has "failed"?
For comparison: jobs on free accounts are stopped automatically after 24 hours, but in that case the status is normally "cancelled" and the log shows a SIGTERM.