I'm using PyCharm, and I've noticed that when I attach a debugger to a celery worker process, my tasks never complete and this error is logged to the console:
Traceback (most recent call last):
File "<string>", line 1, in <module>
ImportError: No module named pydevd
This message is logged as well when a celery task is invoked:
[2013-03-24 05:24:26,336: INFO/MainProcess] Got task from broker: celery.group[91218981-204a-414c-a674-fcd8e2b22d23]
However, this task never actually completes.
This is the actual command used to attach the pydevd debugger in PyCharm to my celery worker process:
/home/scottc/venv/myproj/bin/python home/scottc/.IntelliJIdea12/config/plugins/python/helpers/pydev/pydevd.py --multiproc --client 127.0.0.1 --port 60283 --file manage.py celeryd -E -B --loglevel=INFO
When I simply run the process without attaching a debugger in PyCharm, the ImportError message is never displayed and my tasks do complete.
Finally, I know that pydevd is found on my path because I can manually add the following to my code:
from pydev import pydevd
pydevd.settrace('my_host', port=5643, stdoutToServer=True, stderrToServer=True)
and a debugger will successfully attach. The problem, however, is that this is far less convenient than setting breakpoints and clicking 'Debug' in PyCharm.
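In the meantime, one workaround sketch (assuming celery's worker_process_init signal is available in this version) is to trigger that same manual attach from each worker child process as it starts; the PYDEV_DEBUG environment variable below is just an illustrative guard, not something PyCharm sets:
import os

from celery.signals import worker_process_init

@worker_process_init.connect
def attach_debugger(**kwargs):
    # Only attach when explicitly requested, so normal runs are unaffected.
    if os.environ.get('PYDEV_DEBUG'):
        from pydev import pydevd  # same import path used above
        pydevd.settrace('my_host', port=5643, stdoutToServer=True, stderrToServer=True)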
This issue went away when I upgraded celery and billiard:
billiard==2.7.3.23
celery==3.0.17
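As a quick sanity check (assuming both packages expose __version__), you can confirm the upgrade took effect inside the virtualenv the worker actually runs in:
import billiard
import celery

print(celery.__version__)    # should print 3.0.17
print(billiard.__version__)  # should print 2.7.3.23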