Tags: python, google-cloud-storage, sftp, pysftp

Upload zip file from Google Cloud Storage to SFTP server without loading into memory using Python


I want to upload large zip files from a Google Cloud Storage bucket to an SFTP server, without loading them into memory, from a Cloud Function. I am using pysftp for the SFTP transfer.

import pysftp

cnopts = pysftp.CnOpts()

with pysftp.Connection(host="SFTP Server", username='test', password='test', cnopts=cnopts) as sftp:
    sftp.cwd("/tmp")
    sftp.put("zip_file_from_google_bucket")

Can we access an OS path for the file in the bucket and pass that path to sftp.put(), since a gs:// path will not be recognised by sftp.put()?

Is there any other method to transfer the file?

Please advise.


Solution

  • First, consider using Paramiko instead: pysftp is a dead project. See pysftp vs. Paramiko. In any case, the following works with both Paramiko and pysftp. (A sketch of establishing the Paramiko connection itself follows the examples below.)


  • Combine Paramiko SFTPClient.open with GCS Blob.download_to_file:

    from google.cloud import storage

    # 'credentials' is assumed to hold your service-account credentials
    client = storage.Client(credentials=credentials, project='myproject')
    bucket = client.get_bucket('mybucket')
    blob = bucket.blob('archive.zip')

    # Open the remote file for writing and stream the blob straight into it
    with sftp.open('/sftp/path/archive.zip', mode="w+", bufsize=32768) as f:
        blob.download_to_file(f)
    

    Alternatively, combine SFTPClient.putfo with Blob.open:

    # Open the blob for streamed reading and let Paramiko pull from it in chunks
    with blob.open("rb") as f:
        sftp.putfo(f, '/sftp/path/archive.zip')
    

    (untested, but it should give you the idea at least)
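
    For completeness, here is a minimal sketch of how the sftp object used in the snippets above could be obtained with Paramiko. The hostname and credentials are placeholders; host keys are verified against the system known_hosts file:

    import paramiko

    ssh = paramiko.SSHClient()
    ssh.load_system_host_keys()  # verify the server against ~/.ssh/known_hosts
    ssh.connect('sftp.example.com', username='test', password='test')

    sftp = ssh.open_sftp()
    try:
        # ... run one of the transfer snippets above here ...
        pass
    finally:
        sftp.close()
        ssh.close()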