Tags: node.js, amazon-web-services, aws-lambda, zip, aws-api-gateway

What is a good way to compress an AWS Lambda response to avoid the 6 MB limit?


I have a Lambda function that performs several calls to DynamoDB, builds a large stringified JSON object as a response, and passes it to the client application via API Gateway. Naturally, API Gateway has the "Content Encoding enabled" option set, so all data is passed over the internet in compressed form.

The problem is that the Lambda response itself is not compressed, and it hits the 6 MB response limit. Is it possible to compress the Lambda response and then decompress it on the client side in some natural way?

I've checked Node.js libraries like JSZip and ADM Zip and was surprised that, although they allow in-memory output for decompressed data, they don't allow in-memory input such as a string or a buffer, only files. Lambda already has several restrictions and surprises related to working with files, so I would like to avoid the following redundant workflow:

  1. create JSON object
  2. save it as a temporary file inside the Lambda environment
  3. load the file via a zipping library to compress it and return it to API Gateway

Is there any more natural way to deal with the issue?
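
Ideally I'd like the whole round trip to stay in memory, roughly like this (a minimal sketch using Node's built-in zlib; the payload here is just a placeholder):

    const { gzipSync, gunzipSync } = require("zlib");

    // Compress an in-memory string straight to a Buffer, no temporary files.
    const json = JSON.stringify({ hello: "world" }); // placeholder payload
    const compressed = gzipSync(json);

    // Restore it the same way, entirely in memory.
    const restored = JSON.parse(gunzipSync(compressed).toString());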


Solution

  • I've used the following approach. For the back end:

    const { deflate } = require("zlib");
    const { promisify } = require("util");

    const asyncDeflate = promisify(deflate);

    // Deflate the stringified object and return it as a base64 string.
    async function zip(object) {
        return (await asyncDeflate(JSON.stringify(object))).toString("base64");
    }
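
    A rough sketch of how the helper can be wired into the Lambda handler (the handler shape and the data-fetching call are assumptions, not part of my actual code):

    // Hypothetical handler: gather the data, deflate it, and return the
    // base64 string as the response body for API Gateway to pass through.
    exports.handler = async (event) => {
        const data = await fetchItemsFromDynamoDb(event); // assumed DynamoDB helper
        return {
            statusCode: 200,
            headers: { "Content-Type": "text/plain" },
            body: await zip(data), // zip() from the snippet above
        };
    };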
    

    For the front end:

    import * as pako from "pako";
    
    export function unzip(base64str: string) {
      const strData = atob(base64str);
    
      // Convert binary string to character-number array
      const charData = strData.split("").map((x) => { return x.charCodeAt(0); });
    
      // Turn number array into byte-array
      const binData = new Uint8Array(charData);
    
      return JSON.parse(pako.inflate(binData, { to: "string" }));
    }
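
    And a hypothetical call site on the client (the endpoint is a placeholder; the Lambda returns the base64 string as the response body):

    // Fetch the compressed payload and inflate it back into an object.
    async function loadReport() {
      const response = await fetch("/api/report"); // placeholder endpoint
      return unzip(await response.text());         // unzip() from the snippet above
    }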
    

    So it is rather similar to the recent answer.