I am writing a utility that zips a file (or set of files) using the LZMA SDK and then sends the result to an FTP server. Usually the compression is faster than the FTP connection. Instead of compressing the file, waiting for it to finish, and then starting the upload, I would like to compress to a temporary file or stream and upload the completed portions while compression is still running.
The question now is how?
One concern is that the files I will be working with can be over 1 GB when compressed, and the systems this will run on have between 512 MB and 2 GB of RAM, so I do not want the compression side to run wild with memory and lock up the system. The approach I have been considering is to run the compression in one thread, queue up 5-10 MB in a memory stream, and send that data to the FTP server from another thread; a rough sketch of what I mean follows below. Is this a good approach, or is there a better way to do it? Are there any gotchas, such as the archive format needing to rewrite a header at the start of the file once compression finishes?
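To make the idea concrete, here is roughly the shape I am picturing: the compressor writes through a small Stream wrapper into a bounded queue, and a second thread drains the queue into the FTP upload stream, so compression stalls automatically once about 10 MB is buffered. The compression call, FTP address, credentials, chunk sizes, and the helper names (CompressToStream, QueueWriteStream) are all placeholders of mine, not any specific LZMA SDK API.

```csharp
using System;
using System.Collections.Concurrent;
using System.IO;
using System.Net;
using System.Threading.Tasks;

class BoundedPipeUpload
{
    const int ChunkSize = 64 * 1024;   // 64 KB per chunk
    const int MaxChunks = 160;         // ~10 MB buffered at most

    static void Main()
    {
        var queue = new BlockingCollection<byte[]>(MaxChunks);

        // Producer: compress into the bounded queue via a stream wrapper.
        var producer = Task.Run(() =>
        {
            try
            {
                using (var pipe = new QueueWriteStream(queue, ChunkSize))
                    CompressToStream(@"C:\data\bigfile.dat", pipe); // placeholder
            }
            finally
            {
                queue.CompleteAdding();
            }
        });

        // Consumer: push chunks to the FTP request stream as they arrive.
        var request = (FtpWebRequest)WebRequest.Create("ftp://example.com/bigfile.lzma");
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Credentials = new NetworkCredential("user", "password");
        using (Stream ftpStream = request.GetRequestStream())
        {
            foreach (byte[] chunk in queue.GetConsumingEnumerable())
                ftpStream.Write(chunk, 0, chunk.Length);
        }
        producer.Wait();
    }

    // Placeholder: run the LZMA (or other) compressor, writing to 'output'.
    static void CompressToStream(string path, Stream output) { /* ... */ }
}

// Write-only Stream that buffers into fixed-size chunks and blocks when the
// bounded queue is full, so the compressor is throttled to the upload speed.
class QueueWriteStream : Stream
{
    readonly BlockingCollection<byte[]> queue;
    readonly int chunkSize;
    MemoryStream current = new MemoryStream();

    public QueueWriteStream(BlockingCollection<byte[]> queue, int chunkSize)
    { this.queue = queue; this.chunkSize = chunkSize; }

    public override void Write(byte[] buffer, int offset, int count)
    {
        current.Write(buffer, offset, count);
        if (current.Length >= chunkSize) Flush();
    }

    public override void Flush()
    {
        if (current.Length == 0) return;
        queue.Add(current.ToArray());  // blocks here when the queue is full
        current = new MemoryStream();
    }

    protected override void Dispose(bool disposing)
    { if (disposing) Flush(); base.Dispose(disposing); }

    public override bool CanRead => false;
    public override bool CanSeek => false;
    public override bool CanWrite => true;
    public override long Length => throw new NotSupportedException();
    public override long Position { get => throw new NotSupportedException(); set => throw new NotSupportedException(); }
    public override int Read(byte[] b, int o, int c) => throw new NotSupportedException();
    public override long Seek(long o, SeekOrigin s) => throw new NotSupportedException();
    public override void SetLength(long v) => throw new NotSupportedException();
}
```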
I plan on writing this in C#, but code examples in C, C++, or Java are fine too.
Thank you for your help.
I'm not familiar with the LZMA SDK, but in C# with the SharpZipLib library it's easy to stream out zip files. You don't have to worry about memory; only the blocks currently being compressed/streamed are in memory at any one time. We use this to compress and stream files via HTTP, but the concept for FTP is the same.
Basically you create a ZipOutputStream that passes data off to the FTP stream, call PutNextEntry at the start of each file, and then stream the file contents. Not much more to it than that.
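Something along these lines (an untested sketch; the FTP address, credentials, buffer size, and file list are placeholders):

```csharp
using System.IO;
using System.Net;
using ICSharpCode.SharpZipLib.Zip;

class StreamZipToFtp
{
    static void Main()
    {
        string[] files = { @"C:\data\report.xml", @"C:\data\images.bin" }; // placeholders

        var request = (FtpWebRequest)WebRequest.Create("ftp://example.com/archive.zip");
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Credentials = new NetworkCredential("user", "password");

        using (Stream ftpStream = request.GetRequestStream())
        using (var zipStream = new ZipOutputStream(ftpStream))
        {
            zipStream.SetLevel(6); // 0 (store) .. 9 (max compression)

            byte[] buffer = new byte[64 * 1024];
            foreach (string path in files)
            {
                // One entry per file; data is compressed and pushed to the
                // FTP stream as it is written, so memory use stays flat.
                zipStream.PutNextEntry(new ZipEntry(Path.GetFileName(path)));
                using (FileStream source = File.OpenRead(path))
                {
                    int read;
                    while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
                        zipStream.Write(buffer, 0, read);
                }
                zipStream.CloseEntry();
            }
            zipStream.Finish(); // writes the central directory at the end
        }
    }
}
```

As for the header concern: the zip central directory is written at the very end of the stream, so there is nothing at the start of the archive that needs to be rewritten once compression finishes, which is what makes this kind of streaming possible.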