Tags: python, rabbitmq, celery, kombu

Celery: enqueuing multiple (100-1000) tasks at the same time via send_task?


We often need to enqueue many messages (we chunk them into groups of 1000) using Celery (backed by RabbitMQ). Does anyone know a good way to do this? We're essentially trying to "batch" a large group of messages into one send_task call.

If I were to guess, we would need to go a step "deeper" and hook into Kombu or even py-amqp.

Regards,
Niklas


Solution

  • No need to "go deeper" and use Kombu directly. There are a few solutions suitable for different use-cases:

    • You may want to use Celery's chunks primitive if you prefer Celery workflows (a sketch follows this list).

    • There is nothing stopping you from calling send_task() thousands of times in a plain loop (second sketch below).

    • If calling send_task() sequentially is too slow, you may want to use a pool of threads that concurrently send N tasks to the queue (third sketch below).
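
For the chunks approach, a minimal sketch is below; the broker URL and the `process_item` task are assumptions for illustration:

```python
from celery import Celery

app = Celery("tasks", broker="amqp://localhost//")  # assumed broker URL

@app.task
def process_item(item_id):
    """Hypothetical task that processes a single item."""
    ...

# Split 1000 argument tuples into chunks of 100 items each; every
# chunk travels to the broker as a single message and is unpacked
# on the worker side.
items = [(i,) for i in range(1000)]
process_item.chunks(items, 100)()
```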
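
Plain send_task() calls need only the task's name, not its code, so the producer stays decoupled from the workers. A sketch, with a hypothetical task name:

```python
from celery import Celery

app = Celery(broker="amqp://localhost//")  # assumed broker URL

# One message is published per call; "tasks.process_item" must be
# registered under that name on the worker side.
for item_id in range(1000):
    app.send_task("tasks.process_item", args=[item_id])
```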
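
And if the sequential loop above is too slow, publishing can be parallelized with a standard-library thread pool. The thread count and task name here are assumptions, not recommendations:

```python
from concurrent.futures import ThreadPoolExecutor

from celery import Celery

app = Celery(broker="amqp://localhost//")  # assumed broker URL

def enqueue(item_id):
    # send_task() acquires a producer from the app's producer pool,
    # so concurrent publishers don't share a channel.
    app.send_task("tasks.process_item", args=[item_id])

# 8 publisher threads is an arbitrary starting point; tune against
# your broker and network.
with ThreadPoolExecutor(max_workers=8) as pool:
    pool.map(enqueue, range(1000))
```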