I have to call some stored procedures in an Oracle database. I use Celery and Redis to call these stored procedures asynchronously.
tasks.py

# Proc_Carga and Buscar_Conci come from the project's own modules.
from celery import task

@task
def carga_ftp():
    tabla = Proc_Carga()
    sp = tabla.carga()
    return None

@task
def conci(idprov, pfecha):
    conci = Buscar_Conci()
    spconc = conci.buscarcon(idprov, pfecha)
    return None
I need to specify a different concurrency for each task: for the task conci I need a concurrency of 1, and for the task carga_ftp a concurrency of 3 or more. So I start two workers like this:
celery multi start -A provcon conc carga -c:conc 1 -c:carga 3
My Celery settings in settings.py:
BROKER_URL = 'redis://localhost:6379/0'
CELERY_IMPORTS = ("pc.tasks", )
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_RESULT_BACKEND = 'djcelery.backends.cache:CacheBackend'
CELERY_ROUTES = {"tasks.conci": {"queue": "conc"}, "tasks.carga_ftp": {"queue": "carga"}}
But the worker "conc" takes more than one task at a time, and the worker "carga" takes more than three tasks at a time as well.
When I look at the processes with
ps aux | grep 'celery'
I see two processes for the worker "conc" and four processes for the worker "carga".
I don't know if I am missing something or if my command to start Celery is wrong, but I need only one task at a time for the task conci.
Any advice?
Thanks in advance.
When you start a Celery worker, it starts the pool processes plus one parent process that supervises them.
So, if you start a worker with a concurrency of 1, it will have 2 processes, and a worker with a concurrency of 3 will have 4 processes.
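One way to confirm this is to ask the running workers for their pool statistics. The snippet below is only a sketch: it assumes it is run somewhere the Django/Celery settings above are already loaded (for example a Django shell) and that the workers are reachable over the Redis broker.

# Sketch: list the pool processes each running worker reports.
# The extra PID you see in `ps` is the parent process, which is not
# part of this pool.
from celery import current_app

stats = current_app.control.inspect().stats() or {}
for worker_name, info in stats.items():
    pool_pids = info.get("pool", {}).get("processes", [])
    print(worker_name, "has", len(pool_pids), "pool process(es)")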
Celery processes tasks asynchronously: the worker immediately reserves (prefetches) the tasks it is given, but with a concurrency of 1 it still executes only one at a time.
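If the goal is for the "conc" worker to hold only one message at a time, it may also help to limit prefetching. These settings are not part of the original configuration; they are a sketch in the same Celery 3.x style as the settings.py shown above.

# Reserve only one message per pool process instead of the default
# multiplier of 4, so a concurrency-1 worker does not stack up extra
# reserved tasks.
CELERYD_PREFETCH_MULTIPLIER = 1

# Acknowledge a message only after the task finishes, so unprocessed
# tasks go back to the queue if the worker is restarted.
CELERY_ACKS_LATE = True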