I want to back up a remote directory that contains a lot of files, and because of that I need to compress it before downloading it. I can access this folder through SSH or FTP. The host is running Linux.
I have covered the downloading part with aioftp. I was using paramiko and the Linux tar command to compress the directory on the remote host, but instead I want to use Python modules (from the standard library or not) and avoid calling Linux commands. Maybe a combination of paramiko to open the session, urllib to create the remote object, and tarfile to compress it could do the job, but I haven't found a way.
In the end, I want a directory-backup.tar.gz on my local machine.
How can I accomplish that?
You have to compress the directory using tools on the server.
Compressing with local Python code makes no sense: you would have to download the uncompressed files, compress them locally with your Python code, and upload the resulting archive to the server, only to download it again. That defeats the purpose of the compression, right?
If you want to use Python code for the compression, you have to run that Python code on the server, either by uploading a script and executing it there, or by feeding the code to a remote python process over SSH (see the sketch below). I do not see much advantage of doing that over using the ready-made tar command, though.
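
A minimal sketch of the "feed Python code to a remote python process" approach, assuming key-based SSH authentication, a python3 interpreter on the host, and a hypothetical remote path /home/user/directory (adjust host, user, and paths to your setup):

```python
import paramiko

REMOTE_DIR = "/home/user/directory"            # hypothetical remote directory
REMOTE_ARCHIVE = "/tmp/directory-backup.tar.gz"
LOCAL_ARCHIVE = "directory-backup.tar.gz"

# Python code that will run *on the server*, using the standard tarfile module.
remote_code = f"""
import tarfile
with tarfile.open({REMOTE_ARCHIVE!r}, "w:gz") as tar:
    tar.add({REMOTE_DIR!r}, arcname="directory")
"""

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("example.com", username="user")  # hypothetical host/user

# Feed the code to a remote python3 process via its stdin.
stdin, stdout, stderr = client.exec_command("python3 -")
stdin.write(remote_code)
stdin.channel.shutdown_write()
exit_status = stdout.channel.recv_exit_status()  # wait for the compression to finish
if exit_status != 0:
    raise RuntimeError(stderr.read().decode())

# Download the finished archive over SFTP, then clean up.
sftp = client.open_sftp()
sftp.get(REMOTE_ARCHIVE, LOCAL_ARCHIVE)
sftp.remove(REMOTE_ARCHIVE)
sftp.close()
client.close()
```

With tar instead, the whole middle part collapses to a single `client.exec_command("tar czf ...")` call, which is why I would just use the ready-made command.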