I have a cluster of three machines, and I want to run celery beat on them. I have a few related questions:
1. Do I need to persist CELERYBEAT_SCHEDULE at all?
2. Does djcelery.schedulers.DatabaseScheduler automatically take care of concurrent beat daemons? That is, if I just run three beat daemons with DatabaseScheduler, am I safe from duplicate tasks?
3. Is there something like DatabaseScheduler, but based on MongoDB, without the Django ORM? Like Celery's own MongoDB broker and result backend.

Currently Celery doesn't support multiple concurrent celerybeat instances.
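(On question 1: the entries in CELERYBEAT_SCHEDULE are plain Python configuration and need no persistence of their own; beat persists only last-run times, by default to a local celerybeat-schedule shelve file. A minimal entry might look like the sketch below, where the task name tasks.cleanup is hypothetical:)

```python
from datetime import timedelta

# Hypothetical schedule entry; "tasks.cleanup" is a made-up task name.
CELERYBEAT_SCHEDULE = {
    "cleanup-every-hour": {
        "task": "tasks.cleanup",
        "schedule": timedelta(hours=1),
    },
}
```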
You have to ensure only a single scheduler is running for a schedule at a time, otherwise you would end up with duplicate tasks. Using a centralized approach means the schedule does not have to be synchronized, and the service can operate without using locks.
http://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html
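If hard-coding which node runs beat is not acceptable, one common workaround is a small advisory lock in shared storage, so that only the lock holder starts the scheduler. Here is a sketch of the idea, with a plain dict standing in for a shared atomic backend (e.g. Redis SET NX); the function name and lock key are hypothetical:

```python
import socket

def try_acquire_beat_lock(store, key="celery-beat-leader", owner=None):
    """Return True if this node holds (or just took) the scheduler lock.

    `store` stands in for a shared, atomic backend (e.g. Redis SET NX
    with an expiry); a plain dict is used here only to illustrate.
    Only the node for which this returns True should start celerybeat.
    """
    owner = owner or socket.gethostname()
    current = store.setdefault(key, owner)  # set-if-absent stand-in
    return current == owner
```

In a real deployment the lock would also need a TTL and periodic renewal, otherwise a crashed leader would block scheduling forever.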