Tags: python, multiprocessing, python-multiprocessing

How to run a function in Python in parallel with the same arguments?


How do I run several independent processes in parallel when each one takes the same argument? My current (not nice) solution is:

import time
import multiprocessing

def parse_args():
    ...
    return args

def my_function(args):
    ...

if __name__ == '__main__':
    args = parse_args()
    processes = []
    for i in range(5):
        processes.append(multiprocessing.Process(target=my_function, args=(args,)))
        processes[-1].start()
    time.sleep(200)
    for i in range(5):
        processes[i].terminate()

Also, my_function runs indefinitely and doesn't return anything.


Solution

  • I'd join them and make sure they terminate on their own, like so:

    from multiprocessing import Process

    # create nb_processes daemon processes, all running my_function with the same args
    processes = [Process(target=my_function, args=(args,), daemon=True) for _ in range(nb_processes)]
    for p in processes:
        p.start()  # start all processes
    for p in processes:
        p.join()   # wait for all processes to end
    

    You want to make sure that my_function implements some sort of timeout, because join() blocks until each process exits; with a function that runs forever, the joins above would never return.
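
    A minimal sketch of such a timeout, assuming the work can be broken into short iterations (the 200-second figure mirrors the sleep in the question, and the loop body is a placeholder, not your real work):

    import time

    def my_function(args, timeout=200):
        # loop until the deadline passes, then return so join() can succeed
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            time.sleep(0.1)  # placeholder for one unit of real work

    Alternatively, the parent can bound the wait itself with p.join(timeout=200) and terminate whatever is still alive afterwards.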

    As for getting back their results, you could use a queue (see multiprocessing.Queue) or a message broker. I personally like to use Redis for that, but that's very much a matter of opinion.
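
    For instance, a minimal multiprocessing.Queue sketch (the worker body and the "some-args" payload are illustrative placeholders, not code from the question):

    import multiprocessing

    def my_function(args, results):
        # push one result onto the shared queue instead of returning it
        results.put(f"processed {args}")

    if __name__ == '__main__':
        results = multiprocessing.Queue()
        processes = [multiprocessing.Process(target=my_function, args=("some-args", results))
                     for _ in range(5)]
        for p in processes:
            p.start()
        # drain one result per worker before joining, so a full queue can't block shutdown
        collected = [results.get() for _ in processes]
        for p in processes:
            p.join()
        print(collected)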

    As a side note, you probably want to take a look at asyncio if you haven't yet.
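
    If the work is I/O-bound, an asyncio sketch of the same pattern might look like this (placeholder body, with the 200-second timeout from the question); note that asyncio will not parallelize CPU-bound work the way multiprocessing does:

    import asyncio

    async def my_function(args):
        while True:
            await asyncio.sleep(0.1)  # placeholder for one unit of I/O-bound work

    async def main():
        # five tasks with the same argument, cancelled after 200 seconds
        tasks = [asyncio.create_task(my_function("some-args")) for _ in range(5)]
        done, pending = await asyncio.wait(tasks, timeout=200)
        for task in pending:
            task.cancel()

    asyncio.run(main())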