Tags: django, rabbitmq, django-celery

Running continuous tasks alongside the Django app


I'm building a Django app that lists the hot (according to a specific algorithm) Twitter trending topics.

I'd like to run some processes indefinitely to make Twitter API calls and update the database (PostgreSQL) with the new information, so that the hot trending topic list gets updated asynchronously.

At first, Celery + RabbitMQ seemed like the solution to my problem, but from what I understand they are used within Django to launch scheduled or user-triggered tasks, not indefinitely running ones.

The solution that comes to mind is to write one .py file that continually puts trending topics into a queue, and separate, independently running .py files that pull items off the queue and save the data to the database Django uses, via raw SQL or SQLAlchemy. I think this could work, but I'm pretty sure there is a much better way to do it.
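For reference, here is a minimal sketch of that producer/consumer idea using Python's standard `queue` and `threading` modules. The `fetch_trending_topics()` and `save_topic()` helpers are hypothetical stand-ins for the Twitter API call and the database write:

```python
import queue
import threading

topic_queue = queue.Queue()
saved = []

def fetch_trending_topics():
    # Placeholder for a real Twitter API call.
    return ["#django", "#python"]

def save_topic(topic):
    # Placeholder for writing to the database (e.g. via the Django ORM).
    saved.append(topic)

def producer(rounds):
    # In production this would loop forever, sleeping between API calls;
    # here it runs a fixed number of rounds so the sketch terminates.
    for _ in range(rounds):
        for topic in fetch_trending_topics():
            topic_queue.put(topic)
    topic_queue.put(None)  # sentinel: tell the consumer to stop

def consumer():
    while True:
        topic = topic_queue.get()
        if topic is None:
            break
        save_topic(topic)

t1 = threading.Thread(target=producer, args=(2,))
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()

print(saved)
```

In a real deployment the producer and consumers would be separate processes talking to a real broker (e.g. RabbitMQ) rather than threads sharing an in-process queue.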


Solution

  • If you just need to keep some processes running continually, Supervisor is a nice solution.

    You can combine it with any queuing technology you like to push things into your queues.
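    As a sketch, a supervisord program section for such a worker might look like the following; the program name, paths, and script are assumptions to adapt to your project:

    ```ini
    [program:trends_worker]
    ; Hypothetical long-running worker script; adjust paths to your project.
    command=/path/to/venv/bin/python /path/to/project/trends_worker.py
    directory=/path/to/project
    autostart=true
    autorestart=true
    stderr_logfile=/var/log/trends_worker.err.log
    ```

    With `autorestart=true`, Supervisor restarts the worker if it ever crashes, which is the main thing you want from an "indefinitely running" process.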