
Execute a third function after two concurrently running functions have completed


I have the following code:

import multiprocessing
from threading import Timer

class RepeatTimer(Timer):
    """
    Extending the threading.Timer class to execute functions at fixed intervals.
    """
    def run(self):
        while not self.finished.wait(self.interval):
            self.function(*self.args, **self.kwargs)

def parallel_process(mp, arguments, cb_fn):
    main_process = multiprocessing.Process(target=mp, args=[arguments, cb_fn])
    side_process = RepeatTimer(30, generate_info)
    final_process = multiprocessing.Process(target=create_final)
    main_process.start()
    side_process.start()
    main_process.join()
    side_process.cancel()
    final_process.start()
    final_process.join()

mp, generate_info, and create_final are the functions I am using. I want to execute mp and generate_info concurrently, and then execute create_final after both of them have completed.

mp and generate_info run concurrently if I don't use the create_final function, but once I add it, create_final also starts running while the other two are still running. I read up on this and learned that the join() method only blocks the calling (main) process, not the other child processes.
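
For reference, this is the join() behaviour I am relying on, shown with a toy example (child and p are just stand-ins, not my real functions):

import multiprocessing
import time

def child():
    time.sleep(2)
    print("child finished")

if __name__ == "__main__":
    p = multiprocessing.Process(target=child)
    p.start()
    p.join()  # blocks only the calling (main) process until child exits
    print("this prints only after child has finished")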

What can I do to get my desired output? I also tried concurrent.futures, and it runs into the same problem.

EDIT

So, taking the suggestion from @Aaron, I did something like this:

class RepeatTimer(Timer):
    """
    Extending the threading.Timer class to execute functions at fixed intervals.
    """
    def run(self):
        while not self.finished.wait(self.interval):
            self.function(*self.args, **self.kwargs)

def parallel_process(mp, arguments, cb_fn):
    main_process = multiprocessing.Process(target=mp, args=[arguments, cb_fn])
    side_process = RepeatTimer(30, generate_info)
    main_process.start()
    side_process.start()
    while main_process.is_alive():
        main_process.join()      # blocks until the main process finishes
        side_process.cancel()    # then stop the repeating timer
    else:
        create_final()           # the else clause runs once the loop condition is false

This seems to perform the way I intended.


Solution

  • The situation you describe has only one point at which multiple things need to happen concurrently, so it stands to reason you only need 2 processes rather than 4 (the main process plus 3 sub-processes). Reducing the number of processes may make it easier to keep track of flow control so you don't accidentally run things out of order. Here's an example of how I would restructure it:

    from multiprocessing import Process
    from time import sleep

    def mp():
        sleep(10)
        print("mp done")

    def generate_info():
        print("info")

    def create_final():
        print("done")

    if __name__ == "__main__":
        mp_runner = Process(target=mp)
        mp_runner.start()
        while mp_runner.is_alive():
            generate_info()
            mp_runner.join(1)  # the join timeout doubles as the loop timer interval
        create_final()
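
    If you want to keep the 30-second interval from the original RepeatTimer, the same pattern should work with a larger join timeout; here is a sketch under that assumption, reusing the same mp, generate_info, and create_final stubs:

    if __name__ == "__main__":
        mp_runner = Process(target=mp)
        mp_runner.start()
        while mp_runner.is_alive():
            generate_info()
            mp_runner.join(30)  # wait up to 30 seconds, or return sooner if mp finishes
        create_final()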