python, redis, flask, python-rq

Executing a long task on a remote server with python-rq


I have written some code that takes a long time to execute (2-3 days), and I want to push it to a server to be executed there. The code is made up of many classes and functions interacting with each other, but in the end the whole run is driven by a single function (test2). I figured a task queue might be the solution, and since I do not need to execute multiple tasks at the same time, RQ seems to suit my needs.

#action_test.py

import action2

def test1():
    # Short smoke test: 10 iterations
    fl = action2.FollowersList()
    mech = action2.Mechanics()
    manager = action2.Manager()
    manager.launch(mech, fl)
    for _ in range(10):
        manager.iterate(mech, fl)

def test2():
    # Full run: 2000 iterations, collecting every message
    messageList = []
    fl = action2.FollowersList()
    mech = action2.Mechanics()
    manager = action2.Manager()
    manager.launch(mech, fl)
    for _ in range(2000):
        message = manager.iterate(mech, fl)
        messageList.append(message)
    return messageList

I've set up Redis on the remote server and run it in daemon mode. Then I wrote a simple module that just puts my test2 function in a queue.

#app.py

from rq import Connection, Queue
from redis import Redis
from action_test import test2

def main():
    # Tell RQ what Redis connection to use
    redis_conn = Redis()
    q = Queue(connection=redis_conn)  # no args implies the default queue

    # Enqueue the long-running function; timeout is in seconds (3 days)
    job = q.enqueue(test2, timeout=259200)
    print(job.result)   # => None (the job has only been enqueued, not finished)

if __name__ == '__main__':
    main()
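Note that RQ keeps a job's return value in Redis only for a limited time (`result_ttl`, 500 seconds by default), so for a job that runs 2-3 days it is safer to persist the result explicitly from inside the job. A minimal sketch of that idea, where `write_results` and `read_results` are hypothetical helpers and the file path is an assumption:

```python
import json

def write_results(message_list, path="results.json"):
    # Persist the list returned by test2 to disk so it
    # survives RQ's result TTL and worker restarts.
    with open(path, "w") as f:
        json.dump(message_list, f)

def read_results(path="results.json"):
    # Load the persisted results in a later session.
    with open(path) as f:
        return json.load(f)
```

Calling `write_results(messageList)` as the last step of `test2` would let you retrieve the output at any time, independently of the queue.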

Then I encountered an issue: the python-rq docs (http://python-rq.org/docs/workers/) describe launching a worker by executing

$ rqworker

from the shell. But this worker does not start as a daemon, and since I connect to the remote server over ssh, the worker dies as soon as my ssh connection drops, which is not the behavior I want. Keeping an ssh connection open for the 2-3 days my code runs defeats the whole purpose of using python-rq in my case. Is there a way around this? Perhaps the python-rq worker should be launched some other way so it can be daemonized?


Solution

  • You can run the worker in the background (&) and detach it from the terminal with nohup, which also redirects its output to a file:

    nohup rqworker &
    

    By default this writes the output to a file nohup.out in the current directory (or $HOME/nohup.out if that is not writable). You can now close the ssh connection.

    With default settings, rq writes a lot of log output to this file; --quiet helps:

    nohup rqworker --quiet &
    

    See man nohup and how to start jobs in the background.
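
    If the server runs systemd, a more robust alternative to nohup is a service unit, which also restarts the worker if it crashes. A minimal sketch (the unit name, user, and paths below are assumptions you would adapt to your setup):

    ```ini
    # /etc/systemd/system/rqworker.service  (hypothetical path)
    [Unit]
    Description=RQ worker
    After=network.target redis.service

    [Service]
    Type=simple
    User=myuser
    WorkingDirectory=/home/myuser/app
    ExecStart=/usr/local/bin/rqworker --quiet
    Restart=on-failure

    [Install]
    WantedBy=multi-user.target
    ```

    After creating the file, `systemctl enable --now rqworker` starts the worker and keeps it running across logouts and reboots.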