Tags: python, rabbitmq, queue, celery, celery-task

worker does not consume tasks after celery add_consumer is called


I would like to use Celery (with RabbitMQ as the message broker) to execute tasks of varying flavors via different queues. One requirement is that the workers' consumption from a particular queue can be paused and resumed.

Celery seems to have this capability via the add_consumer and cancel_consumer control commands. While I was able to cancel consumption of tasks from a queue for a particular worker, I cannot get the worker to resume consumption by calling add_consumer. The code to reproduce this issue is provided here. My guess is that I'm missing some parameter, either in the celeryconfig or in the arguments used when starting the workers?

It would be great to get some fresh eyes on this. There is not much discussion of add_consumer on Stack Overflow or GitHub, so I'm hoping some experts here are willing to share their thoughts/experience.

--

I am running the below:

Windows OS, RabbitMQ 3.5.6, Erlang 18.1, Python 3.3.5, celery 3.1.15


Solution

  • To get the Celery worker to resume consuming on Windows, my workaround is listed below.

    • update celery : pip install celery==4.1.0
    • update billiard/spawn.py : wrap lines 338 to 339 in try: ... except: pass
    • (optional) install eventlet: pip install eventlet==0.22.1
    • add --pool=eventlet or --pool=solo when starting workers, per the comment in https://github.com/celery/celery/issues/4178
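The steps above might look like the following on the command line. This is a sketch only: the module name `tasks` is an assumption, and the worker commands require a running RabbitMQ broker.

```shell
# Upgrade Celery and (optionally) install eventlet, per the workaround above
pip install celery==4.1.0
pip install eventlet==0.22.1

# Start the worker with a Windows-friendly pool implementation
# (module name "tasks" is assumed; adjust to your project):
celery -A tasks worker --loglevel=info --pool=solo

# or, if eventlet is installed:
celery -A tasks worker --loglevel=info --pool=eventlet
```

The default prefork pool relies on fork semantics that Windows lacks, which is why switching to the solo or eventlet pool is suggested in the linked issue.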