I'm creating a Python script that runs rsync using subprocess and then gets the stdout and prints it. The script runs multiple rsync processes based on a conf file, using this code:
for share in shares.split(', '):
    username = parser.get(share, 'username')
    sharename = parser.get(share, 'name')
    local = parser.get(share, 'local')
    remote = parser.get(share, 'remote')
    domain = parser.get(share, 'domain')
    remotedir = username+"@"+domain+":"+remote
    rsynclog = home + "/.bareshare/"+share+"rsync.log"
    os.system("cp "+rsynclog+" "+rsynclog+".1 && rm "+rsynclog) # Move and remove old log
    rsync = "rsync --bwlimit="+upload+" --stats --progress -azvv -e ssh "+local+" "+username+"@"+domain+":"+remote+" --log-file="+rsynclog+" &"
    # Run rsync for each share
    # os.system(rsync)
    self.rsyncRun = subprocess.Popen(["rsync", "--bwlimit="+upload, "--stats", "--progress", "-azvv", "-e", "ssh", local, remotedir, "--log-file="+rsynclog], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
I think this might not be the best thing to do: running multiple syncs at the same time. How could I set this up so that I wait for one process to finish before the next one starts?
You can find my complete script here: https://github.com/danielholm/BareShare/blob/master/bareshare.py
Edit: And how do I make self.rsyncRun die when done? When rsync is done with all the files, the process seems to keep running, although it shouldn't be.
Calling self.rsyncRun.communicate() will block the main process until the rsyncRun process has finished.
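For example, here is a minimal sketch of your loop run sequentially (the parser lookups are elided; the variable names are taken from your snippet):

import subprocess

for share in shares.split(', '):
    ...  # read username, domain, local, remote, upload, rsynclog as before
    proc = subprocess.Popen(
        ["rsync", "--bwlimit=" + upload, "--stats", "--progress", "-azvv",
         "-e", "ssh", local, remotedir, "--log-file=" + rsynclog],
        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    # communicate() reads all output and waits for rsync to exit,
    # so the next share only starts once this one is done
    out, err = proc.communicate()
    print(out)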
If you do not want the main process to block, then spawn a thread to handle the calls to subprocess.Popen:
import subprocess
import threading

def worker():
    for share in shares.split(', '):
        ...
        rsyncRun = subprocess.Popen(...)
        # Waits for this rsync to exit before the loop starts the next one
        out, err = rsyncRun.communicate()

t = threading.Thread(target=worker)
t.daemon = True
t.start()
t.join()  # blocks until the worker thread (and every rsync) has finished
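communicate() also answers your edit: it only returns once the rsync process has exited, and the exit status is then available as returncode. Note that if you never read from the stdout/stderr pipes, a verbose rsync (-vv) can fill the pipe buffer and appear to hang forever; communicate() drains both pipes, which is likely why your process "continues" without it. A small self-contained sketch (the paths here are hypothetical placeholders; in your script the command comes from the conf file):

import subprocess

cmd = ["rsync", "--stats", "--progress", "-azvv", "/tmp/src/", "/tmp/dst/"]

proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = proc.communicate()  # returns only after rsync has exited
print(out)
if proc.returncode != 0:  # non-zero exit status means rsync reported an error
    print("rsync failed:", err)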