Tags: django, docker, docker-compose, redis, celery

Docker Celery configuration - For Django/React/Redis/Celery project. No Celery logo and background task log appearing


I have created a project GitHub repository that uses Docker, Django, React, Redis, and Celery: https://github.com/axilaris/dockerize-django-react-celery-redis. My goal is to get Celery working correctly, with the startup logo appearing and the background task's log prints showing.

This is based on this Docker - React and Django example tutorial code: https://github.com/dotja/authentication_app_react_django_rest

I am also trying to use the Docker - Celery & Redis setup from this tutorial code: https://github.com/veryacademy/docker-mastery-with-django/tree/main/Part-4%20Django%20Postgres%20Redis%20and%20Celery (Part 4 of the tutorial covers Celery & Redis).

Here is the docker-compose.yaml for the redis & celery part:

# Redis
  redis:
    image: redis:alpine
    container_name: redis
    ports:
      - "6379:6379"

# celery
  celery:
    restart: always
    build:
      context: ./backend
    command: celery -A backend worker -l DEBUG
    volumes:
      - .:/django
    container_name: celery  
    depends_on:
      - redis
      - backend
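
For reference, the celery -A backend worker command assumes a Celery app is defined inside the backend Django project and pointed at the redis service. The repository presumably already has an equivalent of this; a minimal sketch of what that command expects looks like:

    # backend/backend/celery.py - minimal sketch of what `celery -A backend` expects
    import os

    from celery import Celery

    # make the Django settings importable for the worker
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "backend.settings")

    app = Celery("backend")
    # read CELERY_* settings, e.g. CELERY_BROKER_URL = "redis://redis:6379/0"
    app.config_from_object("django.conf:settings", namespace="CELERY")
    # discover tasks.py modules such as user_api/tasks.py
    app.autodiscover_tasks()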

Here is how I implemented the background task in user_api/tasks.py (https://github.com/axilaris/dockerize-django-react-celery-redis/blob/main/backend/user_api/tasks.py).

In backend/user_api/tasks.py:

from __future__ import absolute_import, unicode_literals

from celery import shared_task
import logging

@shared_task
def add(x, y):
    logging.debug("XXX add")
    return x + y
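
A side note on the missing task log prints: depending on the -l level the worker runs with, plain logging.debug(...) calls can be filtered out, so the more common pattern is to use Celery's task logger and log at info level. A sketch of the same task with that change (an alternative, not necessarily what the repo must use):

    # backend/user_api/tasks.py - variant using Celery's task logger
    from celery import shared_task
    from celery.utils.log import get_task_logger

    logger = get_task_logger(__name__)

    @shared_task
    def add(x, y):
        # appears in the celery container's output at the worker's default info level
        logger.info("XXX add %s + %s", x, y)
        return x + y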

My project should work very simply by running:

 docker-compose build
 docker-compose up

However, the Celery logo does not appear (1), and I don't see the log prints from the background task (2). I think the task is executing, since result.ready returns, but I want these two for completeness (even though result.ready may indicate that Celery is executing the background task).

It did not print this Celery logo (you can check the full log prints for docker-compose up here: https://gist.github.com/axilaris/a2776fc8f39e53bbc93e0d7a4e0ba0f5):

celery       |  -------------- celery@b755a7cdba8d v5.3.6 (emerald-rush)
celery       | --- ***** ----- 
celery       | -- ******* ---- Linux-6.6.12-linuxkit-aarch64-with 2024-03-02 20:48:06
celery       | - *** --- * --- 
celery       | - ** ---------- [config]
celery       | - ** ---------- .> app:         core:0xffff9bbd7550
celery       | - ** ---------- .> transport:   redis://redis:6379//
celery       | - ** ---------- .> results:     redis://redis:6379/
celery       | - *** --- * --- .> concurrency: 10 (prefork)
celery       | -- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
celery       | --- ***** ----- 
celery       |  -------------- [queues]
celery       |                 .> celery           exchange=celery(direct) key=celery

And here is how I execute the background task, but there are no log prints for it in the Django logs (you can find the full log prints at https://gist.github.com/axilaris/a2776fc8f39e53bbc93e0d7a4e0ba0f5):

% docker exec -it backend_container sh
/app # python manage.py shell
Python 3.9.18 (main, Jan 27 2024, 07:18:02)
[GCC 13.2.1 20231014] on linux
Type "help", "copyright", "credits" or "license" for more information.
(InteractiveConsole)
>>> from user_api.tasks import add
>>> result = add.delay(2, 2)
>>> result.ready
<bound method AsyncResult.ready of <AsyncResult: 9046dd90-f44d-4eba-9881-acc0fbc4278a>>
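
(As the last two lines show, result.ready without parentheses only returns the bound method; to actually check and fetch the result you would call it, for example:)

    >>> result.ready()          # True/False instead of the bound method
    >>> result.get(timeout=10)  # should return 4, assuming a result backend is configured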

My goal is to see that Celery logo printed, and to verify in the Django logs that the background task is executed, just like in https://youtu.be/oBQxFn1CDno?si=58ZRfLZeuCC8fz01&t=1204 (at 20:04).

UPDATE (based on Sujay's answer)

  • I made the 2 suggested changes.
  • Result (see logs here):
    • The Celery logo appears! I think it is running well.
    • The Django container exits due to an entrypoint.sh permission denied error. I even tried giving it the permission, but it doesn't work.

This looks promising, but the Django container error means I cannot run action no. 2, which is the background task.

  • I have checked the code into a separate branch here, if someone wants to take a look.

Solution

  • What I understand is that you want Celery to start working when you run

    docker-compose build 
    

    Firstly, you need to understand how Docker works with Django and Celery here. Your Docker services

    1. Backend
    2. celery

    both need to be on the same shared volume.

    Now, when you copy the Django data into the Docker container (see your Dockerfile for the backend), it is copied to the /app directory by this command:

    COPY . /app

    while in your docker-compose file you have the volume configured as

     - .:/django
    

    so all of your directories need to be streamlined (the bind mount must point at the same path the image uses).

    First, change your Dockerfile as follows:

    FROM python:3.9-alpine
    
    RUN apk update && apk add postgresql-dev gcc python3-dev musl-dev
    
    RUN pip install --upgrade pip
    
    COPY ./requirements.txt .
    RUN pip install -r requirements.txt
    
    COPY . /app
    WORKDIR /app
    
    COPY ./entrypoint.sh .
    
    CMD ["/app/entrypoint.sh"]
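
    One thing to watch out for (this is the permission denied error mentioned in the update above): because the compose file below bind-mounts ./backend over /app, the entrypoint script that actually runs comes from the host, so it needs the execute bit set there. For example, on the host:

        # one possible fix; alternatively use CMD ["sh", "/app/entrypoint.sh"]
        # in the Dockerfile so the execute bit is not required at all
        chmod +x backend/entrypoint.sh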
    

    then change your docker-compose file to:

    version: '3.7'
    
    services:
      # Redis
      redis:
        image: redis:alpine
        container_name: redis
        ports:
          - '6379:6379'
    
      backend:
        volumes:
          - ./backend:/app
          - static:/static
        env_file:
          - .env
        build:
          context: ./backend
        ports:
          - '8000:8000'
        container_name: backend_container
    
      frontend:
        build:
          context: ./frontend
        volumes:
          - frontend:/app/build
        container_name: frontend_container

      nginx:
        build:
          context: ./nginx
        volumes:
          - static:/static
          - frontend:/var/www/frontend
        ports:
          - '80:80'
        depends_on:
          - redis
          - backend
          - frontend
    
      # celery
      celery:
        restart: always
        build: ./backend
        command: celery -A backend worker -l info
        volumes:
          - ./backend:/app
        container_name: celery
        depends_on:
          - redis
          - backend
    
    volumes:
      static:
      frontend:
    

    So all I changed was the last line of the Dockerfile to a CMD ["command"] instruction, so that it is not executed at build time, and I adjusted the volumes in docker-compose so they point to the same directory (/app) that the build copies into.
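
    Also note that the celery service above runs the worker with -l info, so the logging.debug("XXX add") call from tasks.py will be filtered out; run the worker with -l DEBUG, or log at info level as in the task-logger variant shown earlier in the question, to see the task's log print.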

    You should then see your expected output (the Celery startup banner from the question).
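
    To check this, rebuild and follow just the worker's logs, for example:

        docker-compose build
        docker-compose up
        # in another terminal: follow only the celery container's output
        docker-compose logs -f celery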

    I suggest you watch these 2 videos:

    1) https://www.youtube.com/watch?v=mScd-Pc_pX0&ab_channel=LondonAppDeveloper

    2) https://www.youtube.com/watch?v=EfWa6KH8nVI&ab_channel=PriyanshuGupta