Tags: node.js, stream, fs

Using node to stream file contents to stdout


My NodeJS application uses pino for JSON-based logging. Pino writes the log output to stdout, and I'm simply saving it to a file with ./app.js >> filename.log. I want to set up a small script that will continuously stream the log file, similar to the -f flag on tail. The script should not exit when it reaches the end of the file, but instead wait for additional data to be streamed. I can get the file to echo to stdout, but it does not continue the stream once it reaches the end of the file.

const fs = require('fs');
const split = require('split2')
const pump = require('pump')

// Attempt 1: reads the file, but the stream ends once it hits EOF
fs.createReadStream(file)
  .pipe(split(JSON.parse))
  .on('data', function (obj) {
    console.log(obj);
  });

pump(process.stdout, streamToElastic); // streamToElastic is defined elsewhere

// Attempt 2: also stops streaming once the end of the file is reached
const readStream = fs.createReadStream(logFileLocation);
pump(readStream.pipe(process.stdout), split(JSON.parse));

Solution

  • You'll have to implement it yourself: the built-in streams do not provide that functionality, so you have to watch for file changes or poll the file (a hand-rolled sketch of that approach is at the end of this answer).

    You can use one of these packages, which already implement it:

    Using tail:

    const Tail = require('tail').Tail;
    
    const tail = new Tail("fileToTail");
    
    tail.on("line", function(data) {
      console.log(data);
    });
    
    tail.on("error", function(error) {
      console.log('ERROR: ', error);
    });
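
    Since pino writes one JSON object per line, you can feed each tailed line
    through JSON.parse to get back the same objects the question's first
    attempt produced. A minimal sketch, assuming filename.log is the file the
    application appends to:

    const Tail = require('tail').Tail;
    
    const tail = new Tail("filename.log"); // assumed: the file ./app.js appends to
    
    tail.on("line", function(line) {
      // Each line is one complete pino JSON log entry
      console.log(JSON.parse(line));
    });
    
    tail.on("error", function(error) {
      console.log('ERROR: ', error);
    });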
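
    If you'd rather avoid a dependency, the watch-or-poll approach mentioned
    above can be hand-rolled with fs.watchFile, which polls the file's stats
    and reports size changes. The sketch below streams only the newly appended
    bytes to stdout; the file path and the 500 ms polling interval are
    assumptions, not requirements:

    const fs = require('fs');
    
    const logFileLocation = 'filename.log'; // assumed: the file ./app.js appends to
    
    // Start at the current end of the file so only new data is emitted
    let position = fs.existsSync(logFileLocation)
      ? fs.statSync(logFileLocation).size
      : 0;
    
    fs.watchFile(logFileLocation, { interval: 500 }, function (curr) {
      if (curr.size < position) {
        // File was truncated or rotated: start over from the top
        position = 0;
      }
      if (curr.size > position) {
        const start = position;
        position = curr.size;
        // Stream only the bytes appended since the last poll (end is inclusive)
        fs.createReadStream(logFileLocation, { start: start, end: curr.size - 1 })
          .pipe(process.stdout, { end: false });
      }
    });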