I'm using Node's child_process.execFile() to start and communicate with a process that puts all of its output into its standard output and error streams. The process runs for an indeterminate amount of time and may theoretically generate any amount of output, e.g.:
const { execFile } = require('child_process');

// Named "child" rather than "process" to avoid shadowing Node's global.
const child = execFile('path/to/executable', [], { encoding: 'buffer' });
child.stdout.on('data', (chunk) => {
  doSomethingWith(chunk);
});
child.stderr.on('data', (chunk) => {
  renderLogMessage(chunk);
});
Notice that I'm not using the last argument to execFile() because I never need an aggregated view of all the data that ever came out of either of those streams. Despite this omission, Node appears to be buffering the output anyway, and I can reliably make the process end with the SIGTERM signal just by giving it enough input for it to generate a large amount of output. That is problematic because the process is stateful and cannot simply be restarted periodically.
How can I alter or work around this behavior?
You don't want to use execFile, which will wait for the child process to exit before "returning" (by calling the callback that you're not passing).
The documentation for execFile also describes why your child process is being terminated:

maxBuffer <number>
Largest amount of data in bytes allowed on stdout or stderr. (Default: 200*1024) If exceeded, the child process is terminated.
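You could raise that limit via the maxBuffer option (a sketch; the 64 MiB figure here is arbitrary), but since your process may generate any amount of output, no fixed limit is ever safe:

const { execFile } = require('child_process');

// Raising maxBuffer only postpones the SIGTERM; it does not remove the limit.
const child = execFile('path/to/executable', [], {
  encoding: 'buffer',
  maxBuffer: 64 * 1024 * 1024, // 64 MiB instead of the default 200 KiB
});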
For long-running processes for which you want to incrementally read stdout/stderr, use child_process.spawn().
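A minimal sketch of your example rewritten with spawn (assuming the same hypothetical executable path and handler functions from the question):

const { spawn } = require('child_process');

// spawn streams stdout/stderr as they arrive instead of buffering them,
// so there is no maxBuffer limit to hit. Chunks are Buffers by default.
const child = spawn('path/to/executable', []);

child.stdout.on('data', (chunk) => {
  doSomethingWith(chunk);
});
child.stderr.on('data', (chunk) => {
  renderLogMessage(chunk);
});

// Optional: observe how the long-running process eventually exits.
child.on('exit', (code, signal) => {
  console.log(`child exited with code ${code} and signal ${signal}`);
});

Because spawn's stdout and stderr are ordinary Readable streams, you could also pipe them elsewhere instead of handling each chunk by hand.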