Tags: python, celery, django-celery

worker_concurrency configuration not valid for celery


Environment: Django 3.1, Celery 5.2.

Django settings.py:

CELERY_RESULT_BACKEND = 'django-db'
CELERY_BROKER_URL = 'redis://127.0.0.1/0'
CELERYD_CONCURRENCY = 2  # number of concurrent workers
# CELERYD_PREFETCH_MULTIPLIER=1
CELERYD_FORCE_EXECV = True 
CELERYD_MAX_TASKS_PER_CHILD = 50  

celery.py

import os

from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'san.settings')

app = Celery('san')
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django apps.
app.autodiscover_tasks()

Startup command: celery -A san worker -l INFO -P gevent

-------------- [email protected] v5.2.3 (dawn-chorus) --- ***** ----- -- ******* ---- Linux-3.10.0-327.el7.x86_64-x86_64-with-glibc2.17 2022-02-18 14:31:44

  • *** --- * ---
  • ** ---------- [config]
  • ** ---------- .> app: san:0x7f7d31a811c0
  • ** ---------- .> transport: redis://127.0.0.1:6379/0
  • ** ---------- .> results:
  • *** --- * --- .> concurrency: 4 (gevent) -- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker) --- ***** ----- -------------- [queues] .> celery exchange=celery(direct) key=celery

The banner shows concurrency: 4. Why doesn't my configuration take effect? Passing the -c parameter on the startup command does work:

- *** --- * --- .> concurrency: 2 (gevent)

What can I do to make my configuration work?

The rest of my configuration works, such as CELERY_BROKER_URL and CELERY_RESULT_BACKEND.


Solution

The namespace='CELERY' argument to config_from_object in your celery.py is the key. As the comment in the standard Django/Celery setup explains, namespace='CELERY' means all Celery-related configuration keys must have a `CELERY_` prefix. The prefix is "CELERY" followed by an underscore, and CELERYD_CONCURRENCY continues with a "D" instead, so it does not match and is silently ignored. CELERY_BROKER_URL and CELERY_RESULT_BACKEND do match the prefix, which is why they work.
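The prefix filtering can be sketched in plain Python (a simplified illustration of the behavior, not Celery's actual implementation):

```python
# Simplified model of what namespace='CELERY' does: keep only keys
# that start with "CELERY_", strip the prefix, and lower-case the
# remainder to get the setting name.
def filter_namespace(settings, namespace='CELERY'):
    prefix = namespace + '_'
    return {
        key[len(prefix):].lower(): value
        for key, value in settings.items()
        if key.startswith(prefix)
    }

django_settings = {
    'CELERYD_CONCURRENCY': 2,        # "CELERYD_" != "CELERY_": dropped
    'CELERY_WORKER_CONCURRENCY': 2,  # kept as worker_concurrency
    'CELERY_BROKER_URL': 'redis://127.0.0.1/0',
}

print(filter_namespace(django_settings))
# {'worker_concurrency': 2, 'broker_url': 'redis://127.0.0.1/0'}
```

With the old name dropped at this stage, the worker never sees a concurrency setting and falls back to its default.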

Change CELERYD_CONCURRENCY to CELERY_WORKER_CONCURRENCY. The other old-style worker settings follow the same pattern: CELERYD_PREFETCH_MULTIPLIER becomes CELERY_WORKER_PREFETCH_MULTIPLIER, and CELERYD_MAX_TASKS_PER_CHILD becomes CELERY_WORKER_MAX_TASKS_PER_CHILD.
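Applied to the settings from the question, the Celery section would look roughly like this (a sketch; CELERYD_FORCE_EXECV is left out because I am not certain it has a new-style equivalent in Celery 5, so check it against the settings table in the Celery docs):

```python
# settings.py - Celery section using new-style, CELERY_-prefixed keys
CELERY_RESULT_BACKEND = 'django-db'
CELERY_BROKER_URL = 'redis://127.0.0.1/0'
CELERY_WORKER_CONCURRENCY = 2            # was CELERYD_CONCURRENCY
# CELERY_WORKER_PREFETCH_MULTIPLIER = 1  # was CELERYD_PREFETCH_MULTIPLIER
CELERY_WORKER_MAX_TASKS_PER_CHILD = 50   # was CELERYD_MAX_TASKS_PER_CHILD
```

After restarting the worker, the startup banner should report concurrency: 2 without needing the -c flag.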