python, django, celery, django-celery, worker

How to start remote Celery workers from Django


I'm trying to use Django in combination with Celery.

While doing so I came across autodiscover_tasks(), and I'm not fully sure how to use it. The Celery workers receive tasks enqueued by other applications (in this case a Node backend).

So far I used this to start the worker:

celery worker -Q extraction --hostname=extraction_worker

which works fine.

Now I'm not sure what the general idea of the Django-Celery integration is. Should workers still be started externally (e.g. with the command above), or should they be managed and started from the Django application?

My celery.py looks like:

import os

from celery import Celery
from django.conf import settings

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'main.settings')

app = Celery('app')

app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
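
For completeness, the project's __init__.py follows the standard pattern from the Celery/Django guide, so the app is loaded when Django starts (I'm assuming that layout here):

# main/__init__.py -- ensures the Celery app is loaded on Django startup
# so that @shared_task binds to it; 'main' matches DJANGO_SETTINGS_MODULE above.
from .celery import app as celery_app

__all__ = ('celery_app',)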

Then I have two apps, each containing a tasks.py file with:

from celery import shared_task


@shared_task
def extraction(total):
    return 'Task executed'
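
For reference, this is roughly how such a task gets onto the worker's queue from Python (a sketch; the Node backend would send the equivalent message by task name):

# Sketch: enqueue the task on the 'extraction' queue the worker listens on.
# delay() would target the default queue, so apply_async() is used with an
# explicit queue; the argument value is just an example.
extraction.apply_async(args=[10], queue='extraction')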

How do I now get Django to register those tasks with the worker?


Solution

  • You just start the worker process as documented; you don't need to register anything else.

    In a production environment you’ll want to run the worker in the background as a daemon - see Daemonization - but for testing and development it is useful to be able to start a worker instance by using the celery worker manage command, much as you’d use Django’s manage.py runserver:

    celery -A proj worker -l info
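
    Applied to the setup in the question, the equivalent invocation would look something like this (a sketch; -A main assumes celery.py lives in the main package named by DJANGO_SETTINGS_MODULE):

    celery -A main worker -Q extraction --hostname=extraction_worker -l info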
    

    For a complete listing of the command-line options available, use the help command:

    celery help
    

    The Celery worker collects and registers tasks when it starts (via autodiscover_tasks() in your case), and then consumes whatever tasks arrive on the queues it listens to.
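
    To tie this back to the question: the worker started with -Q extraction executes any task message that lands on that queue, so producers only need to send the right task name. A minimal sketch of an external producer follows (shown in Python; the Node backend sends the protocol-equivalent message, and the dotted task path and broker URL are assumptions for illustration):

    # Sketch: an external client enqueues work by task name; no task code
    # needs to be importable on the producer side.
    from celery import Celery

    client = Celery(broker='redis://localhost:6379/0')  # assumed broker URL
    client.send_task(
        'app1.tasks.extraction',   # assumed dotted path to the tasks.py above
        args=[10],                 # the 'total' argument
        queue='extraction',        # must match the worker's -Q
    )

    Alternatively, routing can be configured server-side with the CELERY_TASK_ROUTES setting so producers don't have to pass queue= explicitly.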