
Celery + Django not working at the same time


I have a Django 2.0 project that works fine on its own, and it is integrated with Celery 4.1.0. I use jQuery to send an AJAX request to the backend, but I just realized the request loads endlessly because of an issue with Celery.

Celery Settings (celery.py)

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'converter.settings')

app = Celery('converter', backend='amqp', broker='amqp://guest@localhost//')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
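
For the app to be loaded when Django starts (so that @shared_task uses it), the Celery Django integration guide also has you import it in the project package's __init__.py. A minimal sketch, assuming the project package is converter (as in the settings module above):

# converter/__init__.py
from __future__ import absolute_import, unicode_literals

# Make sure the Celery app is loaded when Django starts so that
# the @shared_task decorator will use it.
from .celery import app as celery_app

__all__ = ('celery_app',)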

Celery Tasks (tasks.py)

from __future__ import absolute_import, unicode_literals

from celery import shared_task

@shared_task(time_limit=300)
def add(number1, number2):
    return number1 + number2
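
Once a worker is running, the task can be exercised from the Django shell (python manage.py shell) to confirm the round trip. A quick sketch, assuming tasks.py lives in an app importable as myapp (hypothetical name):

from myapp.tasks import add

result = add.delay(2, 3)   # enqueue the task and get an AsyncResult back
result.ready()             # False until a worker has picked it up and finished
result.get(timeout=10)     # returns 5, or raises TimeoutError if no worker answers in time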

Django View (views.py)

from django.views.generic import View
from braces.views import JSONResponseMixin, AjaxResponseMixin  # assuming django-braces
from celery.result import AsyncResult

from . import tasks


class AddAjaxView(JSONResponseMixin, AjaxResponseMixin, View):
    def post_ajax(self, request, *args, **kwargs):
        url = request.POST.get('number', '')

        # client_ip and tasks.convert are presumably defined elsewhere in the
        # full project; the simplified tasks.py above only shows the add task.
        task = tasks.convert.delay(url, client_ip)
        result = AsyncResult(task.id)

        data = {
            # result.get() blocks until a worker has finished the task
            'result': result.get(),
            'is_ready': True,
        }
        if result.successful():
            return self.render_json_response(data, status=200)
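
As an aside that is not part of the original post: result.get() in the view blocks the HTTP response until a worker has finished the task, which is one reason the request appears to load endlessly when no worker is running. A common non-blocking pattern, sketched here with hypothetical view names, returns the task id straight away and lets the frontend poll for the result:

# Hypothetical sketch, not the code from the question.
from django.http import JsonResponse
from django.views.generic import View
from celery.result import AsyncResult

from . import tasks


class StartAddView(View):
    def post(self, request, *args, **kwargs):
        # Enqueue the task and return immediately with its id.
        task = tasks.add.delay(int(request.POST.get('number1', 0)),
                               int(request.POST.get('number2', 0)))
        return JsonResponse({'task_id': task.id})


class AddResultView(View):
    def get(self, request, task_id, *args, **kwargs):
        # Poll this endpoint from jQuery until is_ready is true.
        result = AsyncResult(task_id)
        data = {'is_ready': result.ready()}
        if result.successful():
            data['result'] = result.get()
        return JsonResponse(data)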

When I send the AJAX request to the Django app it loads endlessly, but when I terminate the Django server and run celery -A demoproject worker --loglevel=info, that is when my tasks run.

Question: How do I automate this so that when I run the Django project, my Celery tasks run automatically when I send an AJAX request?


Solution

  • In a development environment you have to start the Celery worker manually, because it does not run in the background on its own and nothing else processes the jobs in the queue. So for a smooth workflow you need both the Django development server and the Celery worker running (see the commands after the documentation link below). As stated in the documentation:

    In a production environment you’ll want to run the worker in the background as a daemon - see Daemonization - but for testing and development it is useful to be able to start a worker instance by using the celery worker manage command, much as you’d use Django’s manage.py runserver: celery -A proj worker -l info

    You can read their documentation on daemonization: http://docs.celeryproject.org/en/latest/userguide/daemonizing.html
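
    In practice this means two processes during development, started in separate terminals from the project root. A minimal sketch of the commands, assuming the worker should load the Celery app defined in celery.py (the converter package; replace the name if your project package differs, e.g. demoproject in the question):

    # Terminal 1: the Django development server
    python manage.py runserver

    # Terminal 2: the Celery worker that processes the queued tasks
    celery -A converter worker -l info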