I'd like your help with the following two questions. First, how do I set a handler for each log level, as you can in Python's logging module? Currently, I have
STATS_ENABLED = True
STATS_DUMP = True
LOG_FILE = 'crawl.log'
But the DEBUG messages generated by Scrapy are also added to the log file. They are very long, and ideally I would like DEBUG-level messages to be left on standard error and INFO messages to be dumped to my LOG_FILE.
Secondly, the docs say: "The logging service must be explicitly started through the scrapy.log.start() function."
My question is: where do I run this scrapy.log.start()? Inside my spider?
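For the first question, one possible approach (a sketch using plain Python logging, not Scrapy's own settings) is to attach two handlers with different levels to the same logger: a stderr handler at DEBUG and a file handler at INFO, so verbose DEBUG records never reach the file. The logger name "crawl" and the file name "crawl.log" here are just placeholders:

```python
import logging
import sys

# Sketch: route DEBUG records to stderr and INFO-and-above to a file
# by giving each handler its own level.
logger = logging.getLogger("crawl")
logger.setLevel(logging.DEBUG)

# stderr handler: receives everything at DEBUG and above
stderr_handler = logging.StreamHandler(sys.stderr)
stderr_handler.setLevel(logging.DEBUG)

# file handler: only INFO and above reach the log file
file_handler = logging.FileHandler("crawl.log", mode="w")
file_handler.setLevel(logging.INFO)

logger.addHandler(stderr_handler)
logger.addHandler(file_handler)

logger.debug("debug message")  # goes to stderr only
logger.info("info message")    # goes to stderr and crawl.log
```

The key point is that a handler's level filters records independently of the logger's level, so the same logger can feed different destinations at different verbosities.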
Update: I was able to get logging routed to a file by using
from twisted.python import log
import logging
logging.basicConfig(level=logging.INFO, filemode='w', filename='log.txt')
observer = log.PythonLoggingObserver()
observer.start()
However, I am unable to get the log to display the spider's name, the way Twisted does on standard error. I posted this as a separate question.
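One way to get a name into each log line (a sketch with plain Python logging, not Scrapy's built-in spider logging) is to use a per-spider named logger together with a formatter that includes the %(name)s field; "myspider" below is a placeholder for an actual spider name:

```python
import logging

# Sketch: a named logger plus a formatter with %(name)s prefixes
# every record with the spider's name.
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter("%(name)s %(levelname)s: %(message)s"))

spider_log = logging.getLogger("myspider")  # placeholder spider name
spider_log.setLevel(logging.INFO)
spider_log.addHandler(handler)

spider_log.info("crawled a page")  # record is prefixed with "myspider"
```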