python, asynchronous, python-3.5, python-asyncio

Python asyncio, possible to await / yield entire myFunction()


I've written a library of objects, many which make HTTP / IO calls. I've been looking at moving over to asyncio due to the mounting overheads, but I don't want to rewrite the underlying code.

I've been hoping to wrap asyncio around my code in order to perform functions asynchronously without replacing all of my deep / low level code with await / yield.

I began by attempting the following:

async def my_function1(some_object, some_params):
    # Lots of existing code which uses existing objects
    # No await statements
    return output_data

async def my_function2():
    # Does more stuff
    ...

while True:
    loop = asyncio.get_event_loop()
    tasks = my_function1(some_object, some_params), my_function2()
    output_data = loop.run_until_complete(asyncio.gather(*tasks))
    print(output_data)

I quickly realised that while this code runs, nothing actually happens asynchronously; the functions complete synchronously. I'm very new to asynchronous programming, but I think this is because neither of my functions uses the await or yield keyword, so they are not cooperative coroutines: they never yield control, and thus never give the event loop an opportunity to switch to a different coroutine. Please correct me if I am wrong.
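That intuition can be checked directly. The sketch below (the names blocking_task and cooperative_task are mine, not from the question) times two coroutines that block with time.sleep() against two that yield with await asyncio.sleep(): the blocking pair runs back to back, while the cooperative pair overlaps.

```python
import asyncio
import time

async def blocking_task(name, results):
    # No await inside: this blocks the event loop for its full duration.
    time.sleep(0.1)
    results.append(name)

async def cooperative_task(name, results):
    # await asyncio.sleep() yields control back to the event loop,
    # so the other coroutine can run in the meantime.
    await asyncio.sleep(0.1)
    results.append(name)

def run_pair(factory):
    # Run two instances of the given coroutine concurrently and time them.
    results = []

    async def main():
        await asyncio.gather(factory("a", results), factory("b", results))

    start = time.monotonic()
    asyncio.run(main())  # asyncio.run is Python 3.7+
    return time.monotonic() - start

# Two blocking "coroutines" take ~0.2s in total; two cooperative
# ones overlap their sleeps and finish in ~0.1s.
blocking_time = run_pair(blocking_task)
cooperative_time = run_pair(cooperative_task)
print(blocking_time, cooperative_time)
```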

My question is, is it possible to wrap complex functions (which deep down make HTTP / IO calls) in an asyncio await, e.g.

async def my_function():
    print("Welcome to my function")
    data = await bigSlowFunction()

UPDATE - Following Karlson's Answer

Following and thanks to Karlson's accepted answer, I used the following code, which works nicely:

from concurrent.futures import ThreadPoolExecutor
import time    

#Some vars
a_var_1 = 0
a_var_2 = 10

pool = ThreadPoolExecutor(3)

future = pool.submit(my_big_function, object, a_var_1, a_var_2)
while not future.done() :
    print("Waiting for future...")
    time.sleep(0.01)
print("Future done")
print(future.result())

This works really nicely, and watching the future.done() / sleep loop print while the call runs gives a feel for how much work the main thread is free to do while the blocking function runs in the background.
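For completeness, the same thread-pool future can be awaited from asyncio instead of polled with a sleep loop, using asyncio.wrap_future to bridge the concurrent.futures.Future into an awaitable (my_big_function below is a stand-in for the real blocking call):

```python
import asyncio
import time
from concurrent.futures import ThreadPoolExecutor

def my_big_function(a, b):
    # Stand-in for the slow, blocking library call.
    time.sleep(0.05)
    return a + b

pool = ThreadPoolExecutor(3)

async def main():
    # asyncio.wrap_future turns the concurrent.futures.Future returned
    # by pool.submit() into an awaitable asyncio future, so the event
    # loop can do other work instead of busy-waiting on future.done().
    return await asyncio.wrap_future(pool.submit(my_big_function, 1, 2))

result_value = asyncio.run(main())
print(result_value)
```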


Solution

  • The short answer is, you can't have the benefits of asyncio without explicitly marking the points in your code where control may be passed back to the event loop. This is done by turning your IO heavy functions into coroutines, just like you assumed.

    Without changing existing code you might achieve your goal with greenlets (have a look at eventlet or gevent).

    Another possibility is to use Python's Future machinery: submit calls to your already-written functions to a ThreadPoolExecutor and await the resulting Future. Be aware that this comes with all the caveats of multi-threaded programming, though.

    Something along the lines of

    import asyncio
    from concurrent.futures import ThreadPoolExecutor
    
    from thinair import big_slow_function
    
    executor = ThreadPoolExecutor(max_workers=5)
    
    async def big_slow_coroutine():
        # A concurrent.futures.Future is not directly awaitable;
        # asyncio.wrap_future bridges it into an asyncio future.
        await asyncio.wrap_future(executor.submit(big_slow_function))
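Putting that idea into a runnable form (big_slow_function here is a placeholder for the real blocking code, and loop.run_in_executor is another standard way to await work submitted to a ThreadPoolExecutor):

```python
import asyncio
import time
from concurrent.futures import ThreadPoolExecutor

def big_slow_function():
    # Placeholder for the existing blocking HTTP / IO code.
    time.sleep(0.05)
    return "done"

executor = ThreadPoolExecutor(max_workers=5)

async def big_slow_coroutine():
    loop = asyncio.get_running_loop()  # Python 3.7+
    # run_in_executor schedules the blocking call on the thread pool and
    # returns an awaitable, so the event loop stays free while it runs.
    return await loop.run_in_executor(executor, big_slow_function)

async def main():
    # Two blocking calls now overlap on separate threads.
    return await asyncio.gather(big_slow_coroutine(), big_slow_coroutine())

results = asyncio.run(main())
print(results)
```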