
Use only one write stream in parent process for multiple child processes


Say I have multiple node.js child processes, and I want their stdout/stderr to all write to the same file.

In the parent process, ideally I could create one stream to the file, something like so:

const cp = require('child_process');
const fs = require('fs');
const async = require('async');

const strm = fs.createWriteStream('bar.log');

async.each([1,2,3], function(item, cb){

 const n = cp.spawn('node', ['foo' + item + '.js']);

 n.stdout.pipe(strm);
 n.stderr.pipe(strm);

 n.on('close', cb);

}, function(err){
 if(err) throw err;
});

What will likely happen is that we'll get an error, because pipe() calls end() on the destination as soon as the first child's output stream ends:

Error: write after end

The following seems to fix the problem: we create a new stream for every child process:

const cp = require('child_process');
const fs = require('fs');
const async = require('async');

async.each([1,2,3], function(item, cb){

 const n = cp.spawn('node',['foo' + item + '.js']);

 // create a new stream for every child process...
 const strm = fs.createWriteStream('bar.log');

 n.stdout.pipe(strm);
 n.stderr.pipe(strm);

 n.on('close', cb);

}, function(err){
 if(err) throw err;
});

Is there a way to "keep the stream open" even after a child's stdout/stderr emits its end event? It seems unnecessary to create a new stream for every child process.


Solution

  • To keep the destination stream open for writing after a source stream ends, set the end option to false when piping:

     n.stdout.pipe(strm, {end: false});
     n.stderr.pipe(strm, {end: false});
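
For completeness, here is a minimal sketch of the original parent script with that option applied. Since pipe() no longer ends the shared stream, the final strm.end() call is an assumption about where you want to close the log once every child has finished:

const cp = require('child_process');
const fs = require('fs');
const async = require('async');

const strm = fs.createWriteStream('bar.log');

async.each([1, 2, 3], function(item, cb){

 const n = cp.spawn('node', ['foo' + item + '.js']);

 // {end: false} prevents pipe() from ending strm when this child's streams end
 n.stdout.pipe(strm, {end: false});
 n.stderr.pipe(strm, {end: false});

 n.on('close', cb);

}, function(err){
 if(err) throw err;
 // all children have closed, so end the shared stream manually
 strm.end();
});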