I am using Redis and python-rq to manage a data processing task. I would like to distribute the processing across multiple servers (each server running several RQ workers) while keeping a single queue on a master server. Is there a way to achieve this with python-rq?
Thank you.
It turned out to be easy enough. There are two steps:
1) Configure Redis on the master machine so that it accepts connections from the remote "agent" servers. This is done by editing the bind directive as explained in this post. Make sure to also set a password if you set the bind value to 0.0.0.0, as this opens the Redis connection to anyone who can reach the port.
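Concretely, the relevant directives in redis.conf on the master would look roughly like this (a sketch; the password value and file path are placeholders for your own setup):

```conf
# /etc/redis/redis.conf on the master server

# Listen on all interfaces so the remote "agent" servers can connect
bind 0.0.0.0

# Require a password, since binding to 0.0.0.0 otherwise exposes
# Redis to anyone who can reach the port
requirepass your_master_redis_password
```

After editing the file, restart the Redis service so the changes take effect.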
2) Start the worker on the remote "agent" server, passing the master's connection string via the --url
parameter:
rq worker --url redis://:[your_master_redis_password]@[your_master_server_IP_address]
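The connection string follows the standard redis:// URL scheme, with an empty username before the colon and the password after it. As a sanity check, here is a small sketch (the password and IP address are placeholder values) that builds such a URL and confirms it parses the way RQ and redis-py will read it:

```python
from urllib.parse import urlparse

def make_redis_url(password, host, port=6379, db=0):
    """Build a redis:// URL in the form that `rq worker --url` expects.

    Note the empty username before the colon: redis://:password@host.
    """
    return f"redis://:{password}@{host}:{port}/{db}"

# Placeholder credentials for illustration only.
url = make_redis_url("s3cret", "203.0.113.10")
parsed = urlparse(url)

# The password, host, and port round-trip through standard URL parsing,
# which is how the client library will interpret them.
print(parsed.password, parsed.hostname, parsed.port)
```

If you omit the port, Redis clients default to 6379, so the shorter form shown above (host only) also works.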
On the master server, you can check that the connection was properly made by typing:
rq info --url redis://:[your_master_redis_password]@localhost
If Redis is also reachable on localhost, this should display all the workers registered with Redis
from your "master", including the new worker you started on the remote server.