Here is some Node.js code that spawns a Linux ls
command and prints its result:
const spawn = require('child_process').spawn;
const ls = spawn('ls', ['-l']);
let content = "";
ls.stdout.on('data', function (chunk) {
  content += chunk.toString();
});
ls.stdout.on('end', function () {
  console.log(content);
});
This works well. However, the ls
command is launched asynchronously, completely separate from the main Node.js thread. My concern is that the data
and end
events on the process's stdout
may have fired before I attached the event listeners.
Is there a way to attach the event listeners before starting that sub-process?
Note: I don't think I can wrap a Promise around the spawn
call to make this work, as it would rely on the events being properly caught to trigger success/failure (leading back to the same problem).
There is no problem here.
Readable streams (since node v0.10) have a (limited) internal buffer that stores data until you read from the stream. If the internal buffer fills up, the backpressure mechanism will kick in, causing the stream to stop reading data from its source.
Once you call .read()
or attach a data
event handler, the internal buffer starts to drain, and the stream then resumes reading from its source.