Basically what I am asking is whether there is an "await p.join()"-style function that awaits a separate Process while letting the event loop in the main process do other stuff.
I have a Python asyncio architecture in a FastAPI application. When I have a long-running, CPU-heavy operation, I would like to handle it in a separate Process, something like the following toy example:
from multiprocessing import Process

def heavy_duty(input: InputModel) -> None:
    ...  # long running calculation here

async def process(input: InputModel) -> None:
    p = Process(target=heavy_duty, args=(input,))
    p.start()
    await p.join()  # give up the event loop and wait for heavy_duty to finish while the main process can do other stuff
Is this possible in the described way?
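In other words, the behaviour I am after would look roughly like the sketch below, assuming the blocking Process.join can be offloaded to a thread with asyncio.to_thread (available since Python 3.9); awaitable_join is just an illustrative helper name, not part of multiprocessing, and heavy_duty and InputModel are the ones from the toy example above:

import asyncio
from multiprocessing import Process

async def awaitable_join(p: Process) -> None:
    # Run the blocking join() in a worker thread so the event loop stays free
    await asyncio.to_thread(p.join)

async def process(input: InputModel) -> None:
    p = Process(target=heavy_duty, args=(input,))
    p.start()
    await awaitable_join(p)  # the event loop can do other work while the child process runs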
OK, as far as I can tell, my problem is simply solved by using the event loop's run_in_executor:
import asyncio
from concurrent.futures import ProcessPoolExecutor

def heavy_duty(input: InputModel) -> None:
    ...  # long running calculation here

async def process(input: InputModel) -> None:
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor() as pool:
        # run_in_executor returns an awaitable, so the event loop stays free while heavy_duty runs in the pool
        await loop.run_in_executor(pool, heavy_duty, input)
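For completeness, here is a minimal sketch of how this could be wired into a FastAPI endpoint; the route path, the value field on InputModel, and the returned status dict are assumptions for illustration only:

import asyncio
from concurrent.futures import ProcessPoolExecutor

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class InputModel(BaseModel):
    value: int  # assumed field for illustration

def heavy_duty(input: InputModel) -> None:
    ...  # long running calculation here

@app.post("/heavy")
async def heavy_endpoint(input: InputModel) -> dict:
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor() as pool:
        # The event loop keeps serving other requests while heavy_duty runs in a worker process
        await loop.run_in_executor(pool, heavy_duty, input)
    return {"status": "done"}

In practice it is probably better to create a single ProcessPoolExecutor at application startup and reuse it, rather than spinning up a new pool (and new worker processes) on every request as in this sketch.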