I know how to list all files in a directory on an FTP server:
import ftplib
ftp = ftplib.FTP()
ftp.connect("192.168.1.18", port=2240)
ftp.login()
ftp.cwd('path/to')
for f in ftp.mlsd():
    print(f)
But what's the best way to obtain a recursive file list (i.e. including files in subdirectories, sub-subdirectories, etc.)? In other words, an FTP equivalent of Python 3's glob.glob('path/to/**/*', recursive=True), which lists all files recursively.
I could do it by entering each directory manually and then redoing an mlsd(), but I fear this would be very slow (listing files over FTP is already slow, as far as I remember), so this is not optimal.
What about SFTP? Would listing all files recursively be easier with SFTP?
I could do it by entering each directory manually and then redoing an mlsd()
And that's the correct solution. The FTP protocol has no better standard way to retrieve a recursive listing, so there is no room for optimization (other than parallelizing the operation). See also Downloading a directory tree with ftplib.
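A minimal sketch of that recursion, assuming the server supports MLSD and reports each entry's kind in the "type" fact (the host, port, and path in the comments are the ones from the question):

```python
import ftplib

def mlsd_walk(ftp, path):
    """Recursively yield (full_path, facts) for every file below `path`.

    One MLSD round trip per directory -- that cost is unavoidable with
    standard FTP, but entries stream back as each reply arrives."""
    for name, facts in ftp.mlsd(path):
        if facts.get('type') in ('cdir', 'pdir'):  # skip '.' and '..'
            continue
        if facts.get('type') == 'dir':
            yield from mlsd_walk(ftp, path + '/' + name)
        else:
            yield path + '/' + name, facts

# Typical use (server details from the question):
#   ftp = ftplib.FTP()
#   ftp.connect("192.168.1.18", port=2240)
#   ftp.login()
#   for full_path, facts in mlsd_walk(ftp, 'path/to'):
#       print(full_path)
```

Parallelizing would mean opening several FTP connections and handing each one a subtree, since a single control connection processes commands serially.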
Some FTP servers support a non-standard -R switch with some file-listing commands (I'm not sure about MLSD). So if you are willing to rely on non-standard functionality and your particular server supports it, you can optimize your code this way. See also Getting all FTP directory/file listings recursively in one call.
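If the server does honor -R, a single retrlines('LIST -R ...') call returns the whole tree as ls -R-style text. This is a hedged sketch: the exact output layout varies by server, and the parser below assumes the common format of a "dir:" header line followed by that directory's entries, with blank lines between blocks.

```python
import ftplib

def parse_list_r(lines):
    """Group 'LIST -R' output lines (ls -R style) by directory.

    A line ending in ':' starts a new directory section; every other
    non-blank line is a raw listing entry belonging to that section."""
    listing, current = {}, ''
    for line in lines:
        if not line.strip():
            continue
        if line.endswith(':'):
            current = line[:-1]
            listing.setdefault(current, [])
        else:
            listing.setdefault(current, []).append(line)
    return listing

# Typical use (non-standard; only if your server accepts -R):
#   ftp = ftplib.FTP()
#   ftp.connect("192.168.1.18", port=2240)
#   ftp.login()
#   lines = []
#   ftp.retrlines('LIST -R path/to', lines.append)
#   tree = parse_list_r(lines)
```

The raw entries still need per-line parsing (permissions, size, name), which is exactly why LIST output is considered fragile compared with MLSD facts.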
For SFTP, see Recursive SFTP listdir in Python?
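With SFTP the recursion itself is no cheaper (still one listing request per directory), but each entry comes back with structured stat attributes, so there is no listing text to parse. A sketch assuming paramiko: the sftp object is expected to behave like paramiko.SFTPClient, and the host and credentials in the comments are placeholders.

```python
import stat

def sftp_walk(sftp, path):
    """Recursively yield full paths of all files below `path`.

    `sftp` is assumed to be a paramiko.SFTPClient (or anything with a
    compatible listdir_attr() returning entries that carry .filename
    and a POSIX .st_mode)."""
    for entry in sftp.listdir_attr(path):
        full = path + '/' + entry.filename
        if stat.S_ISDIR(entry.st_mode):
            yield from sftp_walk(sftp, full)
        else:
            yield full

# Typical use (placeholder host/credentials):
#   import paramiko
#   transport = paramiko.Transport(("192.168.1.18", 22))
#   transport.connect(username="user", password="pass")
#   sftp = paramiko.SFTPClient.from_transport(transport)
#   for full_path in sftp_walk(sftp, "path/to"):
#       print(full_path)
```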