The task itself works well (it runs asynchronously), but when I try to use celery beat it doesn't work.
I followed http://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html#beat-custom-schedulers.
This is my Django project structure:
.
.
├── clien
│   ├── __init__.py
│   ├── admin.py
│   ├── management
│   │   ├── __init__.py
│   │   └── commands
│   │       ├── __init__.py
│   │       └── crawl_clien.py
│   ├── migrations
│   ├── models.py
│   ├── tasks
│   │   ├── __init__.py          ## ==> code
│   │   └── crawl_clien_task.py  ## ==> code
│   ├── templates
│   ├── urls.py
│   └── views
├── config
│   ├── __init__.py  ## ==> code
│   ├── celery.py    ## ==> code
│   ├── settings
│   │   ├── __init__.py
│   │   ├── partials
│   │   │   ├── __init__.py
│   │   │   ├── base.py
│   ├── urls.py
│   └── wsgi.py
├── manage.py
.
.
Only clien is registered as an app. Here is the code:
config/celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from celery.schedules import crontab
app = Celery('chois_crawler')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
app.conf.beat_schedule = {
    'my_task': {
        'task': 'tasks.crawl_clien_task',
        'schedule': crontab(minute='*/1'),
    },
}
config/__init__.py
from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app
__all__ = ['celery_app']
clien/tasks/crawl_clien_task.py
from __future__ import absolute_import, unicode_literals
from celery import shared_task, Celery
from clien.management.commands import crawl_clien
@shared_task
def crawl_clien_task():
    print("hi")
clien/tasks/__init__.py
from .crawl_clien_task import *
It doesn't work; it shows this error:
[2017-05-02 09:58:00,027: ERROR/MainProcess] Received unregistered task of type 'tasks.crawl_clien_task'.
The message has been ignored and discarded.
Did you remember to import the module containing this task?
Or maybe you're using relative imports?
Please see
http://docs.celeryq.org/en/latest/internals/protocol.html
for more information.
The full contents of the message body was:
b'[[], {}, {"callbacks": null, "errbacks": null, "chain": null, "chord": null}]' (77b)
Traceback (most recent call last):
File "/Users/Chois/.pyenv/versions/3.5.1/envs/chois_crawler/lib/python3.5/site-packages/celery/worker/consumer/consumer.py", line 559, in on_task_received
strategy = strategies[type_]
KeyError: 'tasks.crawl_clien_task'
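The traceback hints at the cause: the worker keeps a registry keyed by each task's fully qualified name, and the beat message carries a name ('tasks.crawl_clien_task') that the registry doesn't contain. A plain-Python sketch of that lookup (not Celery's actual code; names here just mirror the traceback):

```python
# The worker maps registered task names to handlers; a beat message carries
# only a name string, which is looked up in this registry.
strategies = {
    # an autodiscovered task is registered under its full dotted path
    'clien.tasks.crawl_clien_task': lambda: print("hi"),
}

def on_task_received(type_):
    try:
        return strategies[type_]
    except KeyError:
        # -> "Received unregistered task of type ..." and the message is discarded
        return None

on_task_received('tasks.crawl_clien_task')        # unknown name: discarded
on_task_received('clien.tasks.crawl_clien_task')  # full dotted path: found
```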
So I tried the other way:
config/celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
app = Celery('chois_crawler')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
clien/tasks/crawl_clien_task.py
from __future__ import absolute_import, unicode_literals
from celery import shared_task
from celery.schedules import crontab
from config.celery import app
@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    sender.add_periodic_task(
        crontab(minute='*/5'),
        crawl_clien_task(),
        name="hi",
    )

@shared_task
def crawl_clien_task():
    print("hi")
But that doesn't work either. What's wrong?
You should reference the task by its full dotted path, 'appname.tasks.crawl_clien_task'. So:
app.conf.beat_schedule = {
    'my_task': {
        'task': 'clien.tasks.crawl_clien_task',
        'schedule': crontab(minute='*/1'),
    },
}

Note also that in your second snippet, add_periodic_task expects a task signature: pass crawl_clien_task.s() rather than the result of calling crawl_clien_task().
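To catch this kind of mismatch before beat ever fires, one option is a small sanity check (a hypothetical helper, not part of Celery) that compares the schedule's 'task' names against the names the app actually registered, e.g. set(app.tasks):

```python
def check_beat_schedule(beat_schedule, registered_names):
    """Return {entry_name: task_name} for every schedule entry whose
    'task' string is not among the registered task names."""
    return {
        entry_name: entry['task']
        for entry_name, entry in beat_schedule.items()
        if entry['task'] not in registered_names
    }

registered = {'clien.tasks.crawl_clien_task'}

# the original, failing schedule entry
bad = check_beat_schedule(
    {'my_task': {'task': 'tasks.crawl_clien_task'}}, registered)
# bad flags 'my_task': its name would be rejected by the worker

# the corrected entry
ok = check_beat_schedule(
    {'my_task': {'task': 'clien.tasks.crawl_clien_task'}}, registered)
# ok is empty: every scheduled name is registered
```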