
Terminate all processes before finishing the main method


I run a separate process for some logging tasks in parallel to my main process. The two share some resources, and I run into issues terminating the logging process before the main process finishes.

Are there any drawbacks to finishing the main Python program and leaving the subprocess running? Can I be sure that it will be terminated when the main program exits? Or would it be better to call Process.terminate() as the last call in my main script?


Solution

  • As long as the processes you're launching are daemons, the main process will terminate them automatically before it exits:

    daemon

    The process’s daemon flag, a Boolean value. This must be set before start() is called.

    The initial value is inherited from the creating process.

    When a process exits, it attempts to terminate all of its daemonic child processes.

    Note that a daemonic process is not allowed to create child processes. Otherwise a daemonic process would leave its children orphaned if it gets terminated when its parent process exits. Additionally, these are not Unix daemons or services, they are normal processes that will be terminated (and not joined) if non-daemonic processes have exited.

    This flag is automatically set to True for worker processes created by a multiprocessing.Pool, but it defaults to False for Process objects. The parent process will try to join all non-daemonic children, so if you have any of those still running, they will prevent the parent from exiting until they finish themselves.
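
  • A minimal sketch of the pattern described above (the `log_worker` function is a hypothetical stand-in for your logging task): set `daemon = True` before `start()`, and the child is terminated automatically when the main process exits, with no explicit `terminate()` call needed.

    ```python
    import multiprocessing
    import time

    def log_worker():
        # Hypothetical logging loop; runs forever on its own,
        # but is killed when the main process exits because the
        # daemon flag is set.
        while True:
            time.sleep(0.1)

    if __name__ == "__main__":
        p = multiprocessing.Process(target=log_worker)
        p.daemon = True  # must be set before start() is called
        p.start()
        # ... main work here ...
        # No terminate()/join() needed: on exit, the main process
        # terminates its daemonic children automatically.
    ```

    If the logging process must flush buffers or close files cleanly, daemonizing it is not enough, since it is terminated rather than joined; in that case, signal it to stop (e.g. via an `Event` or a sentinel on a queue) and `join()` it before the main script ends.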