node.js · compression · zlib

Node v12.7: How to implement native Brotli, gzip, and deflate compression on a Buffer


I am a front end developer upping my chops on the backend here.

I have a Node express server that hosts an app and serves a REST api on the same server/AWS EC2 instance.

I was using the express-static-gzip npm package to enable Brotli compression for the static app bundle and assets, which worked great. However, I then had to switch to server-side rendering of the Three.js objects, since phones couldn't handle parsing the massive dataset, and express-static-gzip doesn't apply compression to my REST data.

Currently, as an interim measure, I have disabled express-static-gzip and enabled the compression npm package. That gives me gzip only, but for both the static bundle AND the REST API.

I specifically need Brotli compression with a gzip/deflate fallback on both my static bundle AND my REST API. The largest GET response uncompressed can be 138 MB; gzip gets it down to 12.8 MB, and I want it under 10 MB with Brotli.

My intention is to have express-static-gzip compress my bundle and to compress my REST API responses manually with Node's zlib. If that isn't feasible, then manual zlib compression for everything!
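
To make the intended split concrete, here is a minimal sketch of the setup I have in mind (the 'dist' folder, the route, and the quakeDataHandler name are placeholders, not my actual code):

const express = require('express');
const expressStaticGzip = require('express-static-gzip');
const zlib = require('zlib');

const app = express();

// Static bundle: express-static-gzip serves pre-compressed .br/.gz files
// sitting next to the originals and falls back to the plain file if the
// client doesn't advertise either encoding.
app.use('/', expressStaticGzip('dist', {
    enableBrotli: true,
    orderPreference: ['br', 'gzip']
}));

// REST API: compress each JSON payload manually with zlib inside the
// route handler (as in the code below).
app.get('/quakeData/:index', quakeDataHandler);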

I don't fully understand some things about Buffers and the backend side of this... perhaps you can tell me what I'm doing wrong here:

app.get('/quakeData/:index', function(req, res){
    // Send Specific Selection or All
    const encoding = req.headers['accept-encoding'], 
          index    = req.params.index,
          jsonArr  = index != "all" ? [ quakes[index], threeData[index] ] : [ quakes, threeData ],
          jsonStr  = JSON.stringify(jsonArr),
          bData    = Buffer.from(jsonStr);

    if (encoding.includes('br')) {
        console.log("BROTLI RES");
        zlib.brotliCompress(bData, (err, result) => {
            console.log(result);
            !err ? res.send(result) : console.warn(err);
        });

    } else if (encoding.includes('gzip')) {
        console.log("GZIP RES");
        zlib.gzip(bData, (err, result) => {
            console.log(result);
            !err ? res.send(result) : console.warn(err);
        });

    } else if (encoding.includes('deflate')) {
        console.log("DEFLATE RES");
        zlib.deflate(bData, (err, result) => {
            console.log(result);
            !err ? res.send(result) : console.warn(err);
        })

    } else {
        console.warn("Unsupported Content Encoding Headers");
        res.setHeader('Content-Type', 'application/json');
        res.json(jsonArr);
    }
});

Also, I've realized that the compression module removes the Content-Length header, which is why my XHR Progress API code stopped working. I need a Content-Length header no matter what solution is implemented. How do I go about that? Also, is there a way to set up a GET to receive the content length ahead of time to estimate download times?
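
For the "content length ahead of time" part, one approach I can imagine (the route name and the size map are hypothetical, just to illustrate the idea) is a tiny metadata endpoint the client calls before the real download:

// Hypothetical: byte lengths of the compressed payloads, updated whenever
// the data is (re)compressed, keyed by dataset index ('all', '0', '1', ...).
const compressedSizes = {};

// The client requests this first, then starts the real GET and computes
// progress as loadedBytes / bytes itself instead of relying on event.total.
app.get('/quakeData/:index/size', function (req, res) {
    res.json({ bytes: compressedSizes[req.params.index] || 0 });
});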

Thank you kindly!


Solution

  • I've gotten this to work, although now I have the issue of supporting a dynamically selectable timeframe with different compression options. I will have to modify it, because compressing the data takes quite a long time.

    It is apparent that Brotli is far superior. I imagine that when I refactor and test against my monthly dataset, the data savings will be impressive. Below is the code that got the above working. Granted, it compresses on each API request, which doesn't make sense in production but is good for testing (see the caching sketch after the code below).

    (UPDATE) I refactored the code into a more final implementation that includes writing to .txt files, etc. That code is not supplied; however, here is the final compressed byte-length comparison for Brotli vs. gzip. Gzip compresses faster, albeit to a larger size.

    Brotli: [ 5433, 137501, 952538, 6438971 ]

    GZIP: [ 6818, 194843, 1544908, 10451525 ]

    The byte-length arrays above are for the hourly, daily, weekly, and monthly datasets respectively. The monthly dataset before compression was ~138 MB, so Brotli achieves roughly a 21× reduction versus roughly 13× for gzip.

    Also, FYI: the XHR progress API is busted here, so even if you send the byte size in the content headers it just won't work. I had to create a separate API to send the byte lengths before the data AJAX call.

    const jsonStr = JSON.stringify(jsonArr),
          bData   = Buffer.from(jsonStr, 'utf-8');

    if (encoding.includes('br')) {
        console.log("BROTLI RES");
        zlib.brotliCompress(bData, (err, result) => {
            if (err) return console.warn(err);

            res.writeHead(200, {
                'Content-Type':     'application/json',
                'Content-Encoding': 'br',
                // Content-Length must describe the compressed body that is
                // actually sent, i.e. the result buffer, not the input buffer.
                'Content-Length':   result.length
            });
            res.end(result);
        });

    } else if (encoding.includes('gzip')) {
        console.log("GZIP RES");
        zlib.gzip(bData, (err, result) => {
            if (err) return console.warn(err);

            res.writeHead(200, {
                'Content-Type':     'application/json',
                'Content-Encoding': 'gzip',
                'Content-Length':   result.length
            });
            res.end(result);
        });

    } else if (encoding.includes('deflate')) {
        console.log("DEFLATE RES");
        zlib.deflate(bData, (err, result) => {
            if (err) return console.warn(err);

            res.writeHead(200, {
                'Content-Type':     'application/json',
                'Content-Encoding': 'deflate',
                'Content-Length':   result.length
            });
            res.end(result);
        });

    } else {
        // No supported Accept-Encoding value: fall back to uncompressed JSON.
        console.warn("Unsupported Accept-Encoding header");
        res.setHeader('Content-Type', 'application/json');
        res.json(jsonArr);
    }
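
    As a rough sketch of where I plan to take this (the cache keys and the Brotli quality value are assumptions I still need to benchmark), each timeframe could be compressed once up front and the cached buffers served per request, so Brotli's cost isn't paid on every GET:

    const cache = {};   // e.g. cache['monthly'] = { br: <Buffer>, gzip: <Buffer>, raw: <Buffer> }

    function precompress(key, jsonArr) {
        const bData = Buffer.from(JSON.stringify(jsonArr), 'utf-8');

        cache[key] = {
            // Quality 11 is the Brotli default and very slow on ~138 MB of JSON;
            // a mid-range quality is an assumption worth benchmarking against
            // the byte lengths above.
            br: zlib.brotliCompressSync(bData, {
                params: {
                    [zlib.constants.BROTLI_PARAM_QUALITY]: 9,
                    [zlib.constants.BROTLI_PARAM_SIZE_HINT]: bData.length
                }
            }),
            gzip: zlib.gzipSync(bData),
            raw:  bData
        };
    }

    The route handler would then just pick cache[key].br, cache[key].gzip, or cache[key].raw based on Accept-Encoding and set Content-Length from that buffer's length.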