
How to send data from a file in a Python docker container to remote SFTP server?


I have a Python script that I am trying to run in a Docker container, in order to send a file stored on this container to an SFTP server.

I tried the following:

import paramiko

ssh=paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
session = ssh.connect(hostname="X", port=X, username='X', password="X")

stdin,stdout,stderr = ssh.exec_command('sftp -P X ldzdl@hostname', get_pty=True)

I also tried the paramiko Transport method, but it didn't work either when sending from the remote side (the Docker container) to the remote SFTP server.

In both cases I get the following error: paramiko.ssh_exception.AuthenticationException: Authentication failed.

How can I do this? I don't know if my approach is okay, or if there is a better way to send data from the container to an SFTP server.


Solution

  • The argument given to the exec_command function is not a command run in the local (client) host's shell; it is executed on the remote (server) host. Running a remote command is unlikely to raise an AuthenticationException by itself, but since the question does not include a full traceback, it is hard to tell for sure.
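    To illustrate, here is a minimal sketch of what exec_command actually does, using the ssh client from the question (the ls command is an arbitrary example - it runs on the server, not in the container):

    # this runs `ls -l /tmp` on the remote SSH server, not locally
    stdin, stdout, stderr = ssh.exec_command('ls -l /tmp')
    print(stdout.read().decode())
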
    I suggest checking the following code:

    import paramiko
    
    ssh=paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    session = ssh.connect(hostname="X", port=X, username='X', password="X")
    ### so far, same code as in the question ###
    
    print("Auth OK")
    
    # open an SFTP session over the already-authenticated SSH connection
    sftp_connection = ssh.open_sftp()
    
    # upload the local file to the given remote path
    sftp_connection.put("local/file/path", "remote/file/path")
    
    # clean up the SFTP session and the SSH connection
    sftp_connection.close()
    ssh.close()
    

    If you see the "Auth OK" print, you should be good to go - just replace the file path arguments of the sftp_connection.put() method with the actual local and remote file paths.
    Otherwise, there is a genuine authentication issue (wrong hostname, port, username or password) which should be resolved first.
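    Since the question also mentions trying the paramiko transport method, here is a minimal sketch of the same upload done with paramiko.Transport directly (hostname, credentials and file paths are placeholders, and port 22 - the SSH default - is assumed):

    import paramiko
    
    # connect at the transport level and authenticate with a password
    transport = paramiko.Transport(("X", 22))
    transport.connect(username="X", password="X")
    
    # build an SFTP client on top of the authenticated transport
    sftp = paramiko.SFTPClient.from_transport(transport)
    sftp.put("local/file/path", "remote/file/path")
    
    sftp.close()
    transport.close()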