
Memory leak when running Celery in Django with the gevent strategy and multiple workers


I use Celery with RabbitMQ in Django. My tasks are I/O-bound, so I chose the gevent strategy. Since gevent runs in a single process, I want to spread the Celery tasks across multiple worker processes. However, memory keeps growing for no apparent reason, even when no tasks are running. What is happening?

This is my Celery command:

celery multi start 10 -P gevent -c 500 -A my-app -l info -E --without-gossip --without-mingle --without-heartbeat --pidfile='my-path' --logfile='my-path'

My Django Celery config:

CELERY_IGNORE_RESULT = True
CELERY_WORKER_PREFETCH_MULTIPLIER = 100
CELERY_WORKER_MAX_TASKS_PER_CHILD = 400
CELERYD_TASK_SOFT_TIME_LIMIT = 60 * 60 * 12
CELERYD_TASK_TIME_LIMIT = 60 * 60 * 13

Versions: celery==5.2.7, django==4.1.2



Solution

  • I found the problem:

    Celery warns about this at worker startup: running with Django's DEBUG setting enabled causes a memory leak, because Django keeps a record of every executed SQL query in memory while DEBUG is on. I changed DEBUG to False, and that fixed the memory leak.
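For reference, a minimal sketch of the fix in the Django settings module (the file path is the conventional one, your project layout may differ):

```python
# settings.py
# With DEBUG = True, Django appends every executed SQL query to
# django.db.connection.queries. A long-running Celery worker therefore
# accumulates the query history indefinitely and appears to leak memory.
# Celery prints a warning about exactly this at worker startup.
DEBUG = False
```

If for some reason you must keep DEBUG = True on a long-running process, Django's `django.db.reset_queries()` can be called periodically to clear the accumulated query log instead.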