Tags: python, redis, rabbitmq, celery

Celery send_task() method


I have my API, and some endpoints need to forward requests to Celery. The idea is to have a dedicated API service that only instantiates a Celery client and uses the send_task() method, and a separate service (workers) that consumes the tasks. The code for the task definitions should live in that worker service. Basically, I am separating the Celery app (API) and the Celery worker into two separate services. I don't want my API to know about any Celery task definitions; endpoints only need to call celery_client.send_task('some_task', (some_arguments)). So on one service I have my API, and on the other service/host I have the Celery code base where my Celery worker will execute tasks.

I came across this great article that describes what I want to do: https://medium.com/@tanchinhiong/separating-celery-application-and-worker-in-docker-containers-f70fedb1ba6d, and this post: Celery - How to send task from remote machine?

I need help with creating routes for tasks from the API. I was expecting celery_client.send_task() to have a queue= keyword, but it does not. I need to have two queues, and two workers that will consume content from these two queues.

Commands for my workers:

celery -A <path_to_my_celery_file>.celery_client worker --loglevel=info -Q queue_1
celery -A <path_to_my_celery_file>.celery_client worker --loglevel=info -Q queue_2

I have also read Celery's "Routing Tasks" documentation, but it is still unclear to me how to establish this communication.


Solution

  • Your API side should hold the router. I guess that's not an issue, because it is only a map of task -> queue (i.e., send task1 to queue1).

    In other words, your celery_client should have task_routes like:

    task_routes = {
        'mytasks.some_task': {'queue': 'queue_1'},
        'mytasks.some_other_task': {'queue': 'queue_2'},
    }