Tags: node.js, stream, http, response, large-files

Write multiple files to HTTP response with streams in Node.js


I have an array of files that I have to pack into a gzipped tar archive and send through the HTTP response on the fly. That means I can't store whole files in memory; I have to pipe them into the tar entries one at a time, or everything is going to break.

const fs = require('fs');               //for file streams and stat
const tar = require('tar-stream');      //lib for tar stream
const { createGzip } = require('zlib'); //lib for gzip stream

//large list of huge files.
const files = [ 'file1', 'file2', 'file3', ..., 'file99999' ];
...

//http request handler:
const pack = tar.pack(); //tar stream, creates .tar
const gzipStream = createGzip(); //gzip stream so we can reduce the size

//pipe archive data through gzip stream
//and send it to the client on the fly
pack.pipe(gzipStream).pipe(response);

//The issue comes here, when I need to pass multiple files to pack.entry
files.forEach(name => {
    const src = fs.createReadStream(name);    //create stream from file
    const size = fs.statSync(name).size;      //determine its size
    const entry = pack.entry({ name, size }); //create tar entry

    //and this ruins everything, because if two different streams
    //write into the tar stream at the same time, it fails with an error
    src.pipe(entry);
});

Basically I need to wait for each pipe to finish sending its data (something like await src.pipe(entry);), but pipes in Node.js don't work that way. So is there any way I could get around it?


Solution

  • Never mind, just don't use forEach in this case. forEach starts all the pipes at once; iterate over the files sequentially instead, waiting for each entry to finish before opening the next.
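
A minimal sketch of that sequential version, assuming the same files array and response object from the question; it uses pipeline from stream/promises (available since Node.js 15) to await each file, and the handler name is just a placeholder:

const fs = require('fs');
const tar = require('tar-stream');
const { createGzip } = require('zlib');
const { pipeline } = require('stream/promises');

async function handler(request, response) {
    const pack = tar.pack();
    pack.pipe(createGzip()).pipe(response);

    //a for...of loop lets us await inside it, so only one
    //source stream writes into the tar stream at a time
    for (const name of files) {
        const size = fs.statSync(name).size;      //tar needs the size up front
        const entry = pack.entry({ name, size }); //open the next tar entry

        //pipeline() resolves once this file has been fully
        //written into the entry (and rejects on stream errors)
        await pipeline(fs.createReadStream(name), entry);
    }

    pack.finalize(); //no more entries; flush the end of the archive
}

The same idea works with plain callbacks (recursing into the next file from the entry's callback), but awaiting pipeline keeps backpressure handling and error propagation in one place.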