I have a service using NodeJS, RabbitMQ and Python workers. The NodeJS brokers use MongoDB, and the Python workers only have a connection to the RabbitMQ server.
I would like to be able to centralize all the logs from the different languages in a db.
My idea is to push all the logs into a RabbitMQ queue and then insert them into the MongoDB instance used by NodeJS.
Is this a good way to centralize logs, and how can I redirect the Python logging module to a pika consumer?
It sounds like you want to create a custom logging.Handler. You would override the emit method and have it publish the log message to a RabbitMQ queue of your choosing. You'll also need to override close and have it close the RabbitMQ channel/connection, etc.
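The standard logging module doesn't ship a RabbitMQ handler, so you have to write one yourself. Here is a minimal sketch, assuming the pika client, a broker on localhost and an example queue name of 'logs' (adjust these to your setup):

import logging
import pika

class RabbitMQHandler(logging.Handler):
    """Publishes formatted log records to a RabbitMQ queue."""

    def __init__(self, host='localhost', queue='logs'):
        logging.Handler.__init__(self)
        self.queue = queue
        # One blocking connection/channel per handler; declaring the queue
        # makes publishing safe even if no consumer has started yet.
        self.connection = pika.BlockingConnection(
            pika.ConnectionParameters(host=host))
        self.channel = self.connection.channel()
        self.channel.queue_declare(queue=queue)

    def emit(self, record):
        try:
            # Send the formatted log line to the default exchange,
            # routed straight to our queue.
            self.channel.basic_publish(exchange='',
                                       routing_key=self.queue,
                                       body=self.format(record))
        except Exception:
            self.handleError(record)

    def close(self):
        try:
            self.connection.close()
        finally:
            logging.Handler.close(self)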
Then to use the handler, do something like this (see https://docs.python.org/2/howto/logging.html for more info):
import logging
# create logger
logger = logging.getLogger('my_logger')
# create RabbitMQ handler
rh = RabbitMQHandler()  # the handler sketched above
# add rh to logger
logger.addHandler(rh)
# start logging stuff
logger.error("An error!")
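On the other end, a small worker can drain that queue into the MongoDB instance your NodeJS side already uses. A rough sketch with pika (1.x API) and pymongo; the host, database, collection and queue names are placeholders:

import pika
import pymongo

# Placeholders -- point these at your actual MongoDB and RabbitMQ servers.
mongo = pymongo.MongoClient('localhost', 27017)
log_collection = mongo['mydb']['logs']

connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
channel.queue_declare(queue='logs')

def store_log(ch, method, properties, body):
    # Each published log line becomes one MongoDB document.
    log_collection.insert_one({'message': body.decode('utf-8')})
    ch.basic_ack(delivery_tag=method.delivery_tag)

channel.basic_consume(queue='logs', on_message_callback=store_log)
channel.start_consuming()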