python, multithreading, function, python-asyncio, python-multithreading

Python - How to use the function's old result while waiting for its new result


Let's say I have 2 functions like this:

  • Function 1: calculates a. Each calculation takes 5 hours to return the result of a.
def Calculate_a(x, y, z, t):
    a = ...  # do something to calculate a; takes ~5 hours
    return a
  • Function 2: calculates the sum a + b (b is any number). The requirement is that Function 2 must produce a result every 5 seconds or less.
def Sum(a, b):
    s = a + b
    return s

a = Calculate_a(x, y, z, t)  # a is the return value of Function 1
s = Sum(a, b)
print(s)

The question is: how can I print out the result s every 5 seconds?

If I pass the parameter a as usual, Function 2 has to wait for Function 1 to finish calculating (i.e., it has to wait 5 hours). Thus, I will not be able to get a result from Function 2 every 5 seconds.

Thank you so much!

I tried using a while loop around Function 1, but it didn't work. I thought the solution might lie in threading or asyncio, but I'm not sure how to use those two libraries.


Solution

  • Use two separate processes and a manager to communicate the result of the calculation of a from the first process to the second. That way, the second process continuously produces output using the last known value of a, and as soon as a is updated by the first process, the second process uses the new value as well.

    import time
    from multiprocessing import Process, Manager

    def calc_a(x, y, z, t, manager_namespace):
        # do the long (5-hour) calculation of a, then publish the result
        new_a = x + y + z + t  # placeholder for the real calculation
        manager_namespace.a = new_a

    def sum_ab(manager_namespace, b):
        # print a + b every 5 seconds, always using the latest known a
        while True:
            s = manager_namespace.a + b
            print(s)
            time.sleep(5)

    if __name__ == '__main__':
        x, y, z, t, b = 1, 2, 3, 4, 10  # example inputs

        manager = Manager()
        global_ns = manager.Namespace()
        global_ns.a = 0  # initial value of a until the first calculation finishes

        p1 = Process(target=calc_a, args=(x, y, z, t, global_ns))
        p1.start()

        p2 = Process(target=sum_ab, args=(global_ns, b))
        p2.start()

        p1.join()
        p2.join()  # sum_ab loops forever; stop it with Ctrl+C
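Manager.Namespace() is what lets the two processes see the same a: sum_ab simply reads whatever value calc_a published most recently.

Since you mentioned threading: the same "read the last known value" pattern can also be done with a background thread that updates a shared variable. This is only a minimal sketch with placeholder names and inputs (latest_a, calculate_a_forever, the example numbers), not a drop-in for your real calculation:

    import threading
    import time

    latest_a = 0  # shared value, updated by the background thread

    def calculate_a_forever(x, y, z, t):
        global latest_a
        while True:
            time.sleep(10)            # stands in for the 5-hour calculation
            latest_a = x + y + z + t  # publish the newly calculated a

    threading.Thread(target=calculate_a_forever, args=(1, 2, 3, 4), daemon=True).start()

    b = 10
    while True:
        print(latest_a + b)  # always uses the most recent a
        time.sleep(5)

Keep in mind that a pure-Python, CPU-bound 5-hour calculation in a thread competes with the printing loop for the GIL, which is why the multiprocessing version above is generally the safer choice here.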