Tags: node.js, ftp, synchronization, rsync

Keep a local folder in sync with a remote FTP folder


I'm looking to build a client Node app that keeps a folder on the client's machine in sync with any changes made to the folder it was cloned from on a remote FTP server.

I could connect to the server and download all the files initially, but I'm not sure how to keep track of file changes.

I could keep a file with filenames/checksums for all files and folders somewhere on the remote server, so I can tell when files need re-downloading. However, I'm not sure how I'd handle file deletions; perhaps if a filename is no longer in the server-side hash file, I know to delete it locally.

I guess what I'm asking is: are there better ways of keeping a local and a remote folder in sync with Node.js?


Solution

  • I solved this by generating an "index" JSON file on the server that lists the path, size, and MD5 hash of every file. I download this file and check everything in the list against the local folder, downloading anything that is missing. I then save a local copy of that index file; on the next sync, anything that exists in my local index but not in the remote one indicates a deletion. Hashing lots of files (~10 GB) on the client side was quite slow, so after the first sync I also store hashes of folders in the index files. I can then just compare folder hash values between the local and remote index, and only do the expensive per-file hashing in folders whose hashes differ. (Rough sketches of both parts follow below.)
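
Here is a minimal sketch of the kind of server-side index generator the answer describes. The exact layout (an `index.json` with `files` and `folders` maps) is my assumption, not something spelled out in the answer:

```js
const fs = require('fs');
const path = require('path');
const crypto = require('crypto');

function md5OfFile(filePath) {
  // Hash the file contents; a streaming hash would be better for very large files.
  return crypto.createHash('md5').update(fs.readFileSync(filePath)).digest('hex');
}

function buildIndex(rootDir) {
  const files = {};   // relative file path -> { size, md5 }
  const folders = {}; // relative folder path -> hash of its contents' hashes

  function walk(dir) {
    const childHashes = [];
    for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
      const full = path.join(dir, entry.name);
      const rel = path.relative(rootDir, full);
      if (entry.isDirectory()) {
        childHashes.push(walk(full));
      } else if (entry.isFile()) {
        const md5 = md5OfFile(full);
        files[rel] = { size: fs.statSync(full).size, md5 };
        childHashes.push(md5);
      }
    }
    // Folder hash = hash of the sorted child hashes, so an unchanged folder
    // can be skipped entirely on later syncs.
    const folderHash = crypto.createHash('md5')
      .update(childHashes.sort().join(''))
      .digest('hex');
    folders[path.relative(rootDir, dir) || '.'] = folderHash;
    return folderHash;
  }

  walk(rootDir);
  return { files, folders };
}

// './shared' is a placeholder for whatever folder is exposed over FTP.
fs.writeFileSync('index.json', JSON.stringify(buildIndex('./shared'), null, 2));
```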
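
And a sketch of the client-side comparison under the same assumed index shape. The actual FTP download/delete calls are left out (a library such as basic-ftp could fill them in), and for brevity this compares the saved local index against the freshly downloaded remote one rather than re-hashing local files; the folder hashes above are what would let a fuller implementation limit any re-hashing to folders that actually changed:

```js
const fs = require('fs');

function diffIndexes(remoteIndex, localIndex) {
  const toDownload = [];
  const toDelete = [];

  for (const [file, meta] of Object.entries(remoteIndex.files)) {
    const local = localIndex.files[file];
    // Download anything we don't have, or whose hash no longer matches.
    if (!local || local.md5 !== meta.md5) toDownload.push(file);
  }

  for (const file of Object.keys(localIndex.files)) {
    // Anything in the local index but not the remote one was deleted upstream.
    if (!remoteIndex.files[file]) toDelete.push(file);
  }

  return { toDownload, toDelete };
}

// 'remote-index.json' is assumed to have been fetched from the FTP server already.
const remoteIndex = JSON.parse(fs.readFileSync('remote-index.json', 'utf8'));
const localIndex = fs.existsSync('local-index.json')
  ? JSON.parse(fs.readFileSync('local-index.json', 'utf8'))
  : { files: {}, folders: {} };

const { toDownload, toDelete } = diffIndexes(remoteIndex, localIndex);
console.log('download:', toDownload, 'delete:', toDelete);

// After downloads and deletions are applied, the remote index becomes the new local index.
fs.writeFileSync('local-index.json', JSON.stringify(remoteIndex, null, 2));
```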