Tags: django, celery, cookiecutter-django

Call celery task from signal


I need to import data from several public APIs for a user after they sign up. django-allauth is included, and I have registered a signal handler to call the right methods after allauth emits user_signed_up. Because the data import takes too much time and the request is blocked by the signal, I want to use Celery to do the work.

My test task:

from time import sleep  # needed for the sleep() calls below

@app.task()
def test_task(username):
    print('##########################Foo#################')
    sleep(40)
    print('##########################' + username + '#################')
    sleep(20)
    print('##########################Bar#################')
    return 3

I'm calling the task like this:

from game_studies_platform.taskapp.celery import test_task

@receiver(user_signed_up)
def on_user_signed_in(sender, request, *args, **kwargs):
    test_task.apply_async(args=('John Doe',))

The task should be put into the queue and the request should return immediately. Instead, the request blocks and I have to wait a minute.

The project is set up with https://github.com/pydanny/cookiecutter-django and I'm running it in a Docker container. Celery is configured to use the Django database in development but will use Redis in production.


Solution

  • The solution was to switch CELERY_ALWAYS_EAGER from True to False in local.py. I was pointed to that solution in the Gitter channel of cookiecutter-django.

    The calls mentioned above were already correct.
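For reference, the change looks roughly like this; the file path and comment are assumptions based on the cookiecutter-django layout:

```python
# config/settings/local.py (assumed cookiecutter-django layout)
# With eager mode on, Celery runs tasks synchronously in the calling
# process, so the signal handler blocks the request for the full task
# duration. Turning it off hands tasks to the broker/worker instead.
CELERY_ALWAYS_EAGER = False
```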