
Running multiple Celery instances in production on the same VM in different virtualenvs


There have been similar questions. I have been running multiple production applications with Django, Celery, and RabbitMQ, and so far so good. However, now there is a customer on whose virtual machine we need to run three separate Django applications, each of which has a Celery app.

When running Celery as a stand-alone service I followed these docs, and they work like a charm. I am talking about the /etc/init.d/celeryd option.

The problem is that the init.d script reads its settings from a file under /etc/default, and there is only one place to set the working directory and the other options that point at the right Django app.

https://docs.celeryproject.org/en/latest/userguide/daemonizing.html#example-configuration

However, I have yet to see any docs explaining which configs need to change when multiple apps share the same VM and the same RabbitMQ server.

In short, how do I run multiple Django apps with Celery and RabbitMQ on a single machine? The apps use different Python virtual environments.


Solution

  • One solution is to have a Celery systemd service script for each Django app. This means you would have appA-celery.service in /usr/lib/systemd/system/, as well as appB-celery.service and appC-celery.service. The same approach applies to the old /etc/init.d (SysV) way of handling services.
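
    As a sketch of the first option, a per-app unit file might look like the following, adapted from the systemd example in the Celery daemonizing docs. All paths, the user/group, and the project name `appA` are placeholders; each app gets its own copy (appB-celery.service, appC-celery.service) pointing at its own virtualenv and its own settings file.

    ```ini
    # Hypothetical /usr/lib/systemd/system/appA-celery.service
    [Unit]
    Description=Celery worker for appA
    After=network.target rabbitmq-server.service

    [Service]
    Type=forking
    User=celery
    Group=celery
    # Per-app settings file, analogous to /etc/default/celeryd
    EnvironmentFile=/etc/conf.d/appA-celery
    WorkingDirectory=/opt/appA
    # The app's own virtualenv supplies its celery binary,
    # so each service can run under a different Python environment.
    ExecStart=/opt/appA/venv/bin/celery multi start worker1 -A appA \
        --pidfile=/run/celery/appA-%n.pid \
        --logfile=/var/log/celery/appA-%n%I.log
    ExecStop=/opt/appA/venv/bin/celery multi stopwait worker1 \
        --pidfile=/run/celery/appA-%n.pid

    [Install]
    WantedBy=multi-user.target
    ```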

    Another solution is to run a single worker subscribed to N different queues (one per application) and configure each app to send its tasks to its dedicated queue(s).
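
For the second option, a minimal sketch of per-app routing on one shared RabbitMQ broker could look like this. The project name `appA`, the queue name, and the `proj/celery.py` location are assumptions; each Django project would carry its own copy with its own names.

```python
# Hypothetical sketch: route each app's tasks to its own queue so workers
# on the shared broker only consume what they should.
#
# In each project's Celery config (e.g. its celery.py) you would assign:
#   app.conf.task_default_queue = DEFAULT_QUEUE
#   app.conf.task_routes = TASK_ROUTES
# and start a worker pinned to that queue:
#   celery -A appA worker -Q appA

DEFAULT_QUEUE = "appA"

TASK_ROUTES = {
    # every task defined under appA's tasks module lands on the "appA" queue
    "appA.tasks.*": {"queue": DEFAULT_QUEUE},
}
```

With one queue per app, a single worker started with `-Q appA,appB,appC` can serve all three applications, or each app can keep its own dedicated worker listening on just its queue.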