
Multiple python scripts sending messages to a single central script


I have a number of scripts written in Python 2.6 that can be run arbitrarily. I would like to have a single central script that collects the output and displays it in a single log.

Ideally it would satisfy these requirements:

  • Every script sends its messages to the same "receiver" for display.
  • If the receiver is not running when the first script tries to send a message, it is started.
  • The receiver can also be launched and ended manually. (Though if ended, it will restart if another script tries to send a message.)
  • The scripts can be run in any order, even simultaneously.
  • Runs on Windows. Multiplatform is better, but at least it needs to work on Windows.

I've come across some hints, and from those pieces I think I could cobble something together. I'm just wondering whether there is an obviously 'right' way of doing this, or whether I could learn from anyone else's mistakes.


Solution

  • I'd consider using logging.handlers.SocketHandler for the message-passing part of this; it sounds like you already have a logging-style use case in mind.

    The standard library's logging facilities are very flexible and configuration-driven, so you should be able to adapt them to your requirements.

    This doesn't handle the automatic-restart part of your question. On UNIX you'd probably just use pid files and os.kill(pid, 0) to check whether the receiver is running, but I don't know what the equivalent would be in the Windows world.
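A minimal sketch of the SocketHandler arrangement, written for Python 3 (on 2.6 the server module is spelled SocketServer rather than socketserver). The RECEIVED list and the function names are mine, purely for demonstration; a real receiver would hand each record to its own configured logger instead:

```python
import logging
import logging.handlers
import pickle
import socketserver
import struct

RECEIVED = []  # demo sink; a real receiver would log or display the records

class LogRecordStreamHandler(socketserver.StreamRequestHandler):
    """Receives pickled LogRecords sent by logging.handlers.SocketHandler."""

    def handle(self):
        while True:
            # SocketHandler prefixes each pickled record with a 4-byte,
            # big-endian length header.
            header = self.connection.recv(4)
            if len(header) < 4:
                break
            slen = struct.unpack(">L", header)[0]
            data = self.connection.recv(slen)
            while len(data) < slen:
                data += self.connection.recv(slen - len(data))
            record = logging.makeLogRecord(pickle.loads(data))
            RECEIVED.append("%s: %s" % (record.name, record.getMessage()))

def run_receiver(host="localhost", port=9020):
    """Central script: 9020 is logging.handlers.DEFAULT_TCP_LOGGING_PORT."""
    server = socketserver.ThreadingTCPServer((host, port),
                                             LogRecordStreamHandler)
    server.serve_forever()

def attach_sender(name, host="localhost", port=9020):
    """Worker script side: just add a SocketHandler to a logger."""
    logger = logging.getLogger(name)
    logger.setLevel(logging.INFO)
    logger.addHandler(logging.handlers.SocketHandler(host, port))
    return logger
```

Because SocketHandler pickles the record rather than a formatted string, the receiver keeps the logger name, level, and timestamp of every message, so one central script can format output from many worker scripts consistently.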
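For the auto-start requirement, one portable alternative to pid files is to probe the receiver's TCP port and spawn the central script when nothing answers. This sketch assumes the receiver lives in a hypothetical receiver.py next to the workers; the name and the one-second startup wait are assumptions, not anything from the question:

```python
import socket
import subprocess
import sys
import time

def ensure_receiver(host="localhost", port=9020, script="receiver.py"):
    """Return True if a receiver is already listening; otherwise spawn one.

    `receiver.py` is a placeholder for whatever central script binds
    the logging port.
    """
    try:
        # If the connect succeeds, a receiver is already up.
        socket.create_connection((host, port), timeout=1).close()
        return True
    except OSError:
        pass
    kwargs = {}
    if sys.platform == "win32":
        # Detach the receiver from this worker's console group on Windows
        # so it keeps running after the worker exits.
        kwargs["creationflags"] = subprocess.CREATE_NEW_PROCESS_GROUP
    subprocess.Popen([sys.executable, script], **kwargs)
    time.sleep(1)  # give the receiver a moment to bind the port
    return False
```

Probing the port sidesteps the pid-file portability problem entirely: on Windows, os.kill with any signal other than CTRL_C_EVENT or CTRL_BREAK_EVENT unconditionally terminates the target process, so the UNIX os.kill(pid, 0) liveness trick does not carry over.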