I can't figure out why Celery isn't being recognized. Here is the full error I get when I run `docker compose up` (excluding everything else, which works fine):
```
celeryworker | Usage: celery [OPTIONS] COMMAND [ARGS]...
celeryworker | Try 'celery --help' for help.
celeryworker |
celeryworker | Error: Invalid value for '-A' / '--app':
celeryworker | Unable to load celery application.
celeryworker | Module 'my_project' has no attribute 'celery'
celeryworker exited with code 2
```
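For context on what the error means: `celery -A my_project` imports the named module and then looks for an app attribute on it. The sketch below is a simplified stand-in for that lookup (it is not Celery's actual code; the module name and attribute list are illustrative), reproducing the same failure mode when the module exports no app:

```python
import importlib
import sys
import types

# Build a stand-in package that, like the failing my_project, exports no
# "app"/"celery" attribute from its top level.
pkg = types.ModuleType("fake_project")
sys.modules["fake_project"] = pkg

def find_app(module_name):
    """Simplified version of Celery's -A lookup: import the module,
    then look for a likely app attribute on it."""
    module = importlib.import_module(module_name)
    for attr in ("app", "celery"):
        if hasattr(module, attr):
            return getattr(module, attr)
    raise AttributeError(f"Module {module_name!r} has no attribute 'celery'")

try:
    find_app("fake_project")
except AttributeError as exc:
    print(exc)  # Module 'fake_project' has no attribute 'celery'
```

So the error says the package `my_project` was importable, but nothing named `app`/`celery` was found on it at import time.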
I'm using:
Dockerfile:
```dockerfile
FROM python:3.11.0

# Set environment variables
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1

# Set work directory
WORKDIR /code

# Install dependencies
RUN pip install --upgrade pip
COPY requirements.txt /code/
RUN pip install -r requirements.txt

# Copy the Django project
COPY . /code/
```
docker-compose.yml:
```yaml
services:
  db:
    image: postgres:15.2
    restart: always
    volumes:
      - ./data/db:/var/lib/postgresql/data
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
  cache:
    image: redis:7.0.10
    restart: always
    volumes:
      - ./data/cache:/data
  rabbit:
    image: rabbitmq:3.11.8
    restart: always
    ports:
      - 5673:5673
      - 15672:15672
      - 25672:25672 #?
    volumes:
      - ./data/rabbit/data:/var/lib/rabbitmq
      - ./data/rabbit/log:/var/log/rabbitmq
  web:
    build: .
    command: ["./wait-for-it.sh", "db:5432", "--", "uwsgi", "--ini", "/code/config/uwsgi/uwsgi.ini"]
    restart: always
    volumes:
      - .:/code
    environment:
      - DJANGO_SETTINGS_MODULE=my_project.settings.production
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    depends_on:
      - db
      - cache
  nginx:
    image: nginx:1.23.1
    restart: always
    volumes:
      - ./config/nginx:/etc/nginx/templates
      - .:/code
    ports:
      - "80:80"
      - "443:443"
  daphne:
    build: .
    working_dir: /code/my_project/
    command: ["../wait-for-it.sh", "db:5432", "--", "daphne", "-u", "/code/my_project/daphne.sock", "my_project.asgi:application"]
    restart: always
    volumes:
      - .:/code
    environment:
      - DJANGO_SETTINGS_MODULE=my_project.settings.production
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    depends_on:
      - db
      - cache
  celeryworker:
    container_name: celeryworker
    build: .
    volumes:
      - .:/code
    #command: python -m celery -A my_project.celery worker -l info
    #command: ["./wait-for-it.sh", "rabbit:5673", "--", "celery", "-A", "my_project.celery", "worker", "-l", "info"]
    command: ["./wait-for-it.sh", "web:8000", "--", "celery", "-A", "my_project", "worker", "-l", "info"]
    depends_on:
      - rabbit
      - web
```
The wait-for-it script is here. I've tried it with and without that script and get the same issue, so I don't think that is the problem.
my_project/celery.py:
```python
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "my_project.settings.production")

app = Celery("my_project")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()
```
my_project/settings/production.py (excerpt):
```python
CELERY_BROKER_URL = 'amqp://rabbit:5672'
```
`my_project/__init__.py`:
```python
from my_project.celery import app as celery_app

__all__ = ("celery_app",)
```
I've tried all sorts of iterations: changes to the commands in the docker-compose.yml file, changes in settings, with and without the wait-for-it script, as well as various suggestions from the internet, including this, all with no luck.
I don't have any issue running a Celery worker when I run it on my computer without a Docker container.
Does anyone know why I am getting this error and how I can solve it? Let me know if there is anything I left out.
If I change the `-A` value in the docker-compose command to either `my_project.my_project.celery_app` or just `my_project.my_project`, the error changes: I then get `ModuleNotFoundError: No module named 'my_project.settings'`.
Then I figured out that if, rather than using `os.environ.setdefault("DJANGO_SETTINGS_MODULE", "my_project.settings.production")` and `app.config_from_object("django.conf:settings", namespace="CELERY")` in my celery.py file, I create a celeryconfig.py file and use `from . import celeryconfig` with `app.config_from_object(celeryconfig)`, the celeryworker container starts up successfully...but without any tasks registered.
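For reference, a standalone celeryconfig.py is just a module of plain settings (the values below are illustrative, not my exact file). Because it bypasses `django.conf:settings` entirely, Django is never configured, which is consistent with `autodiscover_tasks()` finding no task modules:

```python
# celeryconfig.py -- illustrative standalone Celery config, using
# Celery's lowercase setting names. Loading this with
# app.config_from_object(celeryconfig) sidesteps Django settings
# entirely, so no Django apps (and hence no tasks) are discovered.
broker_url = "amqp://rabbit:5672"
result_backend = "redis://cache:6379/0"
task_serializer = "json"
accept_content = ["json"]
```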
This leads me to believe there is some issue with how Docker is reading my file hierarchy. Any ideas?
Here is what I ended up doing to fix this error: in the celeryworker service of docker-compose.yml, I added

working_dir: /code/my_project/

right under `build:`, and added an `environment:` section under `command:` with

- DJANGO_SETTINGS_MODULE=my_project.settings.production

I'm not 100% sure which part fixed it, or whether both changes were needed, but with those changes Celery started working and recognized the tasks files...but then I started having a problem with Daphne and I ended up abandoning Docker altogether.
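Putting those two changes together, the working celeryworker service looked roughly like this (a sketch: the `../wait-for-it.sh` path and the `web:8000` target follow the daphne service's pattern, and may need adjusting for your layout):

```yaml
celeryworker:
  container_name: celeryworker
  build: .
  working_dir: /code/my_project/
  # With working_dir set one level down, the script lives at ../wait-for-it.sh
  command: ["../wait-for-it.sh", "web:8000", "--", "celery", "-A", "my_project", "worker", "-l", "info"]
  volumes:
    - .:/code
  environment:
    - DJANGO_SETTINGS_MODULE=my_project.settings.production
  depends_on:
    - rabbit
    - web
```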