I am trying to have some asynchronous functions run by a worker, following the approach explained here. That is, in my tasks.py file I have:
from django_rq import job

@job
def long_function(one_list):
    # lots of work that should be done asynchronously
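In case it matters, django_rq's job decorator can also take an explicit queue name (per its docs), which makes it obvious which queue the job lands on. A minimal sketch, targeting the same 'default' queue the worker listens on:

from django_rq import job

# Explicitly target the 'default' queue, i.e. the one
# `python manage.py rqworker default` consumes.
@job('default')
def long_function(one_list):
    # lots of work that should be done asynchronously
    ...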
Then in my views.py file:
from django.shortcuts import render

from .tasks import long_function

def render_function(request):
    # some code to get one_list
    long_function.delay(one_list)
    # some more code to render the page
    return render(request, 'results_page.html', context)
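In case it is useful for debugging, delay() returns an rq Job object, so the view can log what was enqueued. A minimal sketch (the prints are just for illustration):

rq_job = long_function.delay(one_list)
print(rq_job.id)            # unique id of the queued job
print(rq_job.get_status())  # typically 'queued' right after enqueueing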
For the moment I am testing locally, so I have two terminals open: one running python manage.py runserver and another running python manage.py rqworker default.
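Both processes read the same RQ_QUEUES setting, so they should be talking to the same Redis instance. For reference, the usual django_rq configuration looks like this (assuming Redis on localhost with the default port):

# settings.py, shared by runserver and rqworker
RQ_QUEUES = {
    'default': {
        'HOST': 'localhost',
        'PORT': 6379,
        'DB': 0,
    },
}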
So when I load results_page.html in the browser, I expect the task to be queued and picked up by the rqworker. The problem is that this happens only some of the time; the rest of the time, the rqworker terminal just shows:
*** Listening on default...
Sent heartbeat to prevent worker timeout. Next one should arrive within 420 seconds.
Sent heartbeat to prevent worker timeout. Next one should arrive within 420 seconds.
My first idea was that, since I am using two different terminals simultaneously, the connection was not being made properly. But that does not make sense, because sometimes the asynchronous task does run.
Why does the worker only see the tasks some of the time?
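To check whether the jobs at least reach Redis, one can inspect the queue from python manage.py shell. A small sketch using rq's queue introspection:

import django_rq

queue = django_rq.get_queue('default')
print(queue.count)    # number of jobs currently waiting
print(queue.job_ids)  # ids of the waiting jobs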
Following this article, I replaced the call to delay in views.py.

From:
from django.shortcuts import render

from .tasks import long_function

def render_function(request):
    # some code to get one_list
    long_function.delay(one_list)
    # some more code to render the page
    return render(request, 'results_page.html', context)
to:
import django_rq

from django.shortcuts import render

from .tasks import long_function

def render_function(request):
    # some code to get one_list
    queue = django_rq.get_queue('default')
    queue.enqueue(long_function, one_list)
    # some more code to render the page
    return render(request, 'results_page.html', context)
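Note that enqueue() also forwards rq's job options; since long_function runs for a while, the job timeout may be worth raising, as rq's default is fairly short. A hedged example (the value of 600 seconds is arbitrary):

# give the job up to 10 minutes instead of rq's default timeout
queue.enqueue(long_function, one_list, job_timeout=600)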
And it seems to be working. No idea why, though...