node.js express zlib real-time-data

How to continuously put data into a stream and transmit it while compressing it in Node.js


I am a newbie to JavaScript.

What I am trying to do is fetch data from the database and then transmit it over the internet.

Right now I can only read one entry at a time, but I want to compress all the entries together rather than compressing each entry individually.

I could store all of them in an array and then pass that array to a zlib function, but that takes up a lot of time and memory.

Is it possible in Node.js, with the Express API, to compress the data while it is being read and transmitted, the way streaming servers compress data in real time as they retrieve it from memory and send it to the client?


Solution

  • It's certainly possible. You can play around with this example:

    var express = require('express')
      , app = express()
      , zlib = require('zlib')
    
    app.get('/*', function(req, res) {
      res.status(200)
    
      // Everything written to the gzip stream is compressed incrementally
      // and piped straight into the response, so each entry is sent as
      // soon as it is written -- nothing is buffered up front.
      var stream = zlib.createGzip()
      stream.pipe(res)
    
      var count = 0
      stream.write('[')  // open the JSON array
    
      ;(function fetch_entry() {
         // After 11 entries, close the JSON array and end the stream,
         // which flushes the gzip trailer and finishes the response.
         if (count > 10) return stream.end(']')
    
         // Write one entry at a time; only the current entry is in memory.
         stream.write((count ? ',' : '') + JSON.stringify({
            _id: count,
            some_random_garbage: Math.random()
         }))
    
         count++
         setTimeout(fetch_entry, 100)  // stands in for an async database call
      })()
    })
    
    app.listen(1337)
    
    console.log('run `curl http://localhost:1337/ | zcat` to see the output')
    

    I assume you're streaming JSON, and the setTimeout calls would of course need to be replaced with actual database calls, but the idea stays the same. Note that this example writes raw gzip without a `Content-Encoding: gzip` header, which is why the output is piped through `zcat`; if you set that header with `res.set()`, browsers and `curl --compressed` will decompress the stream transparently.
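
    To make the "replace setTimeout with database calls" part concrete, here is a minimal sketch of the same incremental JSON framing driven by a cursor-like iterator. The names `fakeCursor` and `streamJsonArray` are made up for illustration; a real driver would give you an async cursor you would consume with `for await...of` inside an `async` function, but a synchronous generator keeps the sketch self-contained. A `PassThrough` stands in for the `gzip.pipe(res)` pipeline so the output can be inspected directly:

    ```javascript
    const { PassThrough } = require('stream')

    // Stand-in for a database cursor that yields one row at a time.
    // (Hypothetical data; a real driver exposes an async iterator.)
    function* fakeCursor() {
      for (let i = 0; i < 5; i++) yield { _id: i, value: i * i }
    }

    // Write a JSON array to `out` one entry at a time, so only the
    // current row is ever held in memory.
    function streamJsonArray(out, rows) {
      out.write('[')
      let first = true
      for (const row of rows) {
        out.write((first ? '' : ',') + JSON.stringify(row))
        first = false
      }
      out.end(']')  // close the array and finish the stream
    }

    // In the Express handler you would instead do:
    //   const gzip = zlib.createGzip(); gzip.pipe(res);
    //   streamJsonArray(gzip, cursor)
    const sink = new PassThrough()
    streamJsonArray(sink, fakeCursor())
    const body = sink.read().toString()
    console.log(body)
    ```

    The output is a single well-formed JSON array even though it was never assembled in memory, which is exactly why the comma goes *before* every entry except the first.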