When I route a task to a particular queue, it works:
task.apply_async(queue='beetroot')
But if I create a chain:
chain = task.s() | task.s()
And then I write
chain.apply_async(queue='beetroot')
It seems to ignore the queue keyword and assigns the tasks to the default 'celery' queue.
It would be nice if Celery supported routing in chains, with all tasks executed sequentially in the same queue.
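For context, here is a minimal sketch of the setup, assuming a hypothetical tasks.py module with an add task and a worker consuming from the 'beetroot' queue:

# tasks.py -- hypothetical module, for illustration only
from celery import Celery

app = Celery('tasks', broker='amqp://localhost')

@app.task
def add(x, y):
    return x + y

# Routing a single task works:
add.apply_async((2, 2), queue='beetroot')

# But the queue option passed to the chain is ignored, and both
# tasks end up on the default 'celery' queue:
chain = add.s(2, 2) | add.s(4)
chain.apply_async(queue='beetroot')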
OK, I got this one figured out.
You have to add the required execution options, such as queue= or countdown=, to the subtask definition, or attach them through a partial:
subtask definition:
from celery import subtask
chain = subtask('task', queue='beetroot') | subtask('task', queue='beetroot')
partial:
chain = task.s().set(queue='beetroot') | task.s().set(queue='beetroot')
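Either way, the option is stored on the signature itself (in its options dict), so it travels with the signature when the chain is built. A quick sketch to check this, using a hypothetical task name 'tasks.add':

from celery import signature  # 'subtask' is the older alias for this

sig = signature('tasks.add', args=(2, 2), queue='beetroot')
print(sig.options)                    # {'queue': 'beetroot'}

# set() attaches more options after the fact and returns the signature:
print(sig.set(countdown=10).options)  # {'queue': 'beetroot', 'countdown': 10}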
Then you execute the chain through:
chain.apply_async()
or,
chain.delay()
And the tasks will be sent to the 'beetroot' queue. Extra execution arguments passed to this last call will not do anything. It would have been nice to be able to apply all of those execution arguments at the chain (or group, or any other Canvas primitive) level.
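For completeness, here is the whole workaround in one place, using the same hypothetical add task and attaching the option to each signature with set():

from celery import chain
from tasks import add  # hypothetical task from the sketch above

# The queue option rides along on each signature, so the chain
# itself needs no extra execution arguments:
routed = chain(
    add.s(2, 2).set(queue='beetroot'),
    add.s(4).set(queue='beetroot'),
)
routed.apply_async()  # both tasks are published to the 'beetroot' queue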