Hey, so I run a lot of image manipulation on an API built with FastAPI (async). I would like to run the image manipulation asynchronously, so I used run_in_executor, which I believe runs it in a separate thread. However, I was told that using Python multiprocessing is better instead. Does moving have any advantages?
import asyncio
import functools

from app.exceptions.errors import ManipulationError


def executor(function):
    @functools.wraps(function)
    def decorator(*args, **kwargs):
        try:
            partial = functools.partial(function, *args, **kwargs)
            loop = asyncio.get_event_loop()
            return loop.run_in_executor(None, partial)
        except Exception:
            raise ManipulationError("Unable To Manipulate Image")
    return decorator
I made this decorator to wrap my blocking funcs so they run in an executor.

Two questions:
a) Does moving to multiprocessing have any advantages?
b) How would I do so?
a) Does moving to multiprocessing have any advantages?
Yes. In CPython, threads share the GIL, so CPU-bound work in a thread pool never runs in parallel; a process pool utilizes multiple cores.
b) How would I do so?
By passing an instance of ProcessPoolExecutor to run_in_executor. (The None value you're passing now means "use the default executor provided by asyncio", which is a ThreadPoolExecutor.) For example (untested):
import asyncio
import concurrent.futures
import functools

_pool = concurrent.futures.ProcessPoolExecutor()


def executor(function):
    @functools.wraps(function)
    def decorator(*args):
        loop = asyncio.get_event_loop()
        return loop.run_in_executor(_pool, function, *args)
    return decorator
This will also require that all arguments to the function be serializable, so that they can be transferred to the subprocess.
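The serialization requirement has a subtle consequence for the decorator form: pickle locates a function by its module and name, and applying the decorator with @ syntax rebinds that name to the wrapper, so pickling the original fails. A runnable sketch that sidesteps this by decorating under a new name (square_sum here is just a hypothetical stand-in for real image work):

```python
import asyncio
import concurrent.futures
import functools

_pool = concurrent.futures.ProcessPoolExecutor()


def executor(function):
    """Wrap a blocking function so calling it returns an awaitable
    that runs the work in the process pool."""
    @functools.wraps(function)
    def decorator(*args):
        loop = asyncio.get_running_loop()
        return loop.run_in_executor(_pool, function, *args)
    return decorator


def _square_sum(n):
    # Stand-in for CPU-bound image work; its arguments and return
    # value must be picklable to cross the process boundary.
    return sum(i * i for i in range(n))


# Decorating under a separate name keeps _square_sum importable
# under its own name, which pickle requires; writing
# `@executor` directly above _square_sum would break pickling.
square_sum = executor(_square_sum)


async def main():
    return await square_sum(10)


if __name__ == "__main__":
    print(asyncio.run(main()))  # prints 285
```

The same pattern should drop into a FastAPI handler: await the decorated function inside an async endpoint, and the event loop stays free while the worker process does the heavy lifting.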