python, django, docker, redis, celery

Celery crash with redis.exceptions.ResponseError: UNBLOCKED


I am using Celery with Django and Redis for everyday tasks. It mostly works, but sometimes Celery crashes with redis.exceptions.ResponseError, which I can't allow. I saw a solution that suggested configuring Redis authentication, but it didn't work for me (or I did something wrong).

celery auth log
[2024-01-24 08:49:17,548: INFO/MainProcess] Connected to redis://default:**@redis:6379/0
error
[2024-01-24 08:51:10,365: CRITICAL/MainProcess] Unrecoverable error: ResponseError('UNBLOCKED force unblock from blocking operation, instance state changed (master -> replica?)')
Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/celery/worker/worker.py", line 202, in start
    self.blueprint.start(self)
  File "/usr/local/lib/python3.12/site-packages/celery/bootsteps.py", line 116, in start
    step.start(parent)
  File "/usr/local/lib/python3.12/site-packages/celery/bootsteps.py", line 365, in start
    return self.obj.start()
           ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/celery/worker/consumer/consumer.py", line 340, in start
    blueprint.start(self)
  File "/usr/local/lib/python3.12/site-packages/celery/bootsteps.py", line 116, in start
    step.start(parent)
  File "/usr/local/lib/python3.12/site-packages/celery/worker/consumer/consumer.py", line 742, in start
    c.loop(*c.loop_args())
  File "/usr/local/lib/python3.12/site-packages/celery/worker/loops.py", line 97, in asynloop
    next(loop)
  File "/usr/local/lib/python3.12/site-packages/kombu/asynchronous/hub.py", line 373, in create_loop
    cb(*cbargs)
  File "/usr/local/lib/python3.12/site-packages/kombu/transport/redis.py", line 1344, in on_readable
    self.cycle.on_readable(fileno)
  File "/usr/local/lib/python3.12/site-packages/kombu/transport/redis.py", line 569, in on_readable
    chan.handlers[type]()
  File "/usr/local/lib/python3.12/site-packages/kombu/transport/redis.py", line 962, in _brpop_read
    dest__item = self.client.parse_response(self.client.connection,
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/redis/client.py", line 553, in parse_response
    response = connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/redis/connection.py", line 524, in read_response
    raise response
redis.exceptions.ResponseError: UNBLOCKED force unblock from blocking operation, instance state changed (master -> replica?)
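
From what I can tell, the error is raised by the Redis server itself: when an instance changes role from master to replica (for example after a REPLICAOF/SLAVEOF command), it force-unblocks every client that is blocked in BRPOP, and BRPOP is exactly how kombu's Redis transport waits for new messages, hence the dead worker. A minimal script that reproduces the same error against a throwaway local Redis (host, port and queue name are placeholders; don't run it against an instance you care about, since it flips the replication role):

# repro_unblocked.py
import threading
import time

import redis


def blocked_consumer():
    r = redis.Redis(host="localhost", port=6379)
    try:
        # Block forever, like kombu's BRPOP-based message loop does.
        r.brpop("some-queue", timeout=0)
    except redis.exceptions.ResponseError as exc:
        # -> UNBLOCKED force unblock from blocking operation, instance state changed (master -> replica?)
        print("consumer got:", exc)


t = threading.Thread(target=blocked_consumer)
t.start()
time.sleep(1)  # give the consumer time to block on BRPOP

admin = redis.Redis(host="localhost", port=6379)
admin.replicaof("10.0.0.1", "6379")  # demote the instance: master -> replica
t.join()
admin.slaveof()  # no arguments sends SLAVEOF NO ONE, promoting it back to master
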
docker-compose
services:
  redis:
    image: redis:latest
    container_name: redis
    restart: on-failure
    volumes:
      - ./redis.conf:/usr/local/etc/redis/redis.conf
    ports:
      - "6379:6379"

  django:
    image: django-service
    container_name: django
    ports:
      - '8000:8000'
    env_file:
      - .django
    depends_on:
      - redis
    command: /django-start.sh

  celery:
    image: django-service
    container_name: celery
    restart: on-failure
    env_file:
      - .django
    depends_on:
      - django
    command: /celery-start.sh

  celery-beat:
    image: django-service
    container_name: celery-beat
    restart: on-failure
    env_file:
      - .django
    depends_on:
      - django
    command: /celery-beat-start.sh
redis.conf
requirepass someInsanePass
celery-start.sh
#!/bin/bash

celery -A celery_worker.app worker --loglevel=debug -E --uid=nobody --gid=nogroup
celery-beat-start.sh
#!/bin/bash

celery -A celery_worker.app beat --loglevel=debug
Django settings
...
# CELERY
# --------------------------------------------------
CELERY_BROKER_URL = os.environ.get("CELERY_BROKER_URL", "redis://default:someInsanePass@localhost:6379/0")
CELERY_RESULT_BACKEND = os.environ.get("CELERY_RESULT_BACKEND", "redis://default:someInsanePass@localhost:6379/0")
CELERY_TIMEZONE = TIME_ZONE
CELERY_EAGER_PROPAGATES_EXCEPTIONS = True
CELERY_BEAT_SCHEDULE = {
    "some_everyday_task_1": {
        "task": "apps.app1.tasks.task1",
        "schedule": crontab(hour=0, minute=0),
    },
    "some_everyday_task_2": {
        "task": "apps.app2.tasks.task2",
        "schedule": crontab(hour=4, minute=0),
    },
}
...
.django
# ...
DJANGO_SETTINGS_MODULE=config.settings
CELERY_BROKER_URL=redis://default:someInsanePass@redis:6379/0
CELERY_RESULT_BACKEND=redis://default:someInsanePass@redis:6379/0
# ...
celery_worker.app
from celery import Celery, Task

app = Celery("tasks")

app.config_from_object('django.conf:settings', namespace='CELERY')


class BaseTask(Task):
    def on_failure(self, exc, task_id, args, kwargs, einfo):
        raise exc


Solution

  • I found this issue on GitHub, where itamarhaber said:

    I'm taking a wild guess here - your Redis server isn't password protected and is open to the public. If this is the case, a nefarious entity is attempting to hijack the server via the SLAVEOF command.

    In my case, setting a password for Redis wasn't the solution, but closing the publicly exposed port was (a possible reason the password changed nothing is sketched after the snippet below).

    services:
      redis:
        image: redis:latest
        container_name: redis
        restart: on-failure
        volumes:
          - ./redis.conf:/usr/local/etc/redis/redis.conf
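
    Two follow-up notes, both assumptions on my part rather than verified fixes. First, the official redis image only reads a mounted redis.conf when the container command points at it, which may be why requirepass appeared to change nothing; loading it explicitly looks like this:

        command: redis-server /usr/local/etc/redis/redis.conf

    Second, since the hijack described in the quote above goes through SLAVEOF/REPLICAOF, those commands can also be disabled outright in redis.conf (optional extra hardening, not something I ended up needing):

        requirepass someInsanePass
        rename-command REPLICAOF ""
        rename-command SLAVEOF ""

    With the ports: mapping removed, the other containers still reach Redis as redis:6379 over the compose network; only the host-level exposure goes away.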