amazon-web-services, aws-lambda, aws-cli

AWS Lambda cutting my payload in lambda invocation


I've been frying my brain all day with a weird problem. I'm trying to send a .docx document encoded in base64. Initially I thought that AWS API Gateway was cutting my request body, but by invoking the function directly via the AWS CLI I can see that Lambda itself is cutting part of the body payload. Let me show you.

Here is a snippet of my Lambda function:

import json

def lambda_handler(event, context):
    if event["httpMethod"] == "GET":
        return {
            'statusCode': 200,
            'body': json.dumps('Hello from PDFTron! Please send base64 doc to use this Lambda function!')
        }

    elif event["httpMethod"] == "POST":
        try:
            output_path    = "/tmp/"
            body           = json.loads(str(event["body"])) # <--- the error is thrown here
            base64str      = body["file"]["data"]
            filename       = body["file"]["filename"]
            base64_bytes   = base64decode(base64str)
            input_filename = filename.split('.')[0] + "docx"

You can check the full file in lambda_function.py.

OK, now I try to call my function via the AWS CLI with this command:

aws lambda invoke \
  --profile lab \
  --region us-east-1 \
  --function-name pdf_sign \
  --cli-binary-format raw-in-base64-out \
  --log-type Tail \
  --query 'LogResult' \
  --output text \
  --payload file://payload.json response.json | base64 -d

payload.json
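
The exact contents of payload.json aren't reproduced here, but as a rough sketch (the field names httpMethod, body, file, filename and data are inferred from the handler above, and the input file name is hypothetical), a file with that shape could be generated like this:

import base64
import json

# Hypothetical generator for payload.json; the event shape is inferred from the
# handler above: event["body"] is a JSON string wrapping {"file": {"filename", "data"}}.
with open("document.docx", "rb") as f:
    doc_b64 = base64.b64encode(f.read()).decode("ascii")

event = {
    "httpMethod": "POST",
    "body": json.dumps({"file": {"filename": "document.docx", "data": doc_b64}}),
}

with open("payload.json", "w") as f:
    json.dump(event, f)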

And I receive the error:

C/ZcP1zcsiKiso3qnIUFO0Bk9/LjB7EOzkNAA7GgFDYu2A7RzzmPege9ijNyW/K0LvQKiYYtd21rNKycfuvBIr8syxsO7wi2gebCTwnZmHG+x/9N2jid9MWX+uApnxQ19L5TCPJbetnNGoe94JNV1A5VV5seZPWJ7BMTa7WFKCvBRyBeXWiivCsFH5FY7lRQGmmC8rq6Ezzj4rP3ndEKabbyq9HBRddi8TQILtJ7wfMQQU1sQL8FgwdJFXIqvhhL9a8EHwEJC2oblN8d1U1MbLTqYEnty1Z1EQT/bRCPoNJq18okfXuc70GjC0U0P2m5l6z4riKkoS3YXgWjLLIxbCQD7nzEIGuDHeWe+ADzsBybqyRyBOeBAxk0ED5XN1SITy31hv8QW+ViBw2j1ExOruxU44+sS9d7ZQ9yvXqog7O0v6MhDfxHfPa1W6ULOY7y3Jgt/9XgbuOVptXclLf5GWQesSErNLTXaTWTQTxSI6FL+emt3UJzivnbkQ7rZfxnZXU9K+kbLulko3uYfib5CwAA//8DAFBLAwQUAAYACAAAACEA+36uyJIBAAAiAwAAEQAIAWRvY1Byb3BzL2NvcmUueG1sIKIEASigAAEAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAfJJNa9wwEEDvhf4Ho7tX0uZjg/E6tAmhhAYCdUnoTZUmGzW2JKRJnP33HdtrbzaEgA4zmqeHNKPy/LVtsheIyXq3ZnIhWAZOe2PdZs1+11f5GcsSKmdU4x2s2RYSO6++fil1KLSPcBt9gIgWUkYmlwod1uwRMRScJ/0IrUoLIhwVH3xsFVIaNzwo/aQ2wJdCnPIWUBmFivfCPMxGtlMaPSvDc2wGgdEcGmjBYeJyIfmeRYht+vDAUHlDtha3AT5Ep+JMvyY7g13XLbqjAaX7S35/8/PX8NTcur5XGlhVGl2gxQaqku9DitLz33+gcdyeE4p1BIU+VtcqeZf9UDFaCgZsKvVNf4Jt56NJJDjICDOQdLQBaZSj/mCD6EYlvKHZPlgw37fVtya7pqGmwfSu1uMRXmz/L6rjgZjTSXUbrUMw1VJIkYtlLk9rIQshaP2ZnRNU7iYzPgZMRh0txv5Plbuji8v6iu19ZzXJ5Gr0vTu/F7a7W39uPMnFKpermnQnx4fGSTC29PBXV/8BAAD//wMAUEsDBBQABgAIAAAAIQDKYulMAAEAAOQBAAAUAAAAd29yZC93ZWJTZXR0aW5ncy54bWyU0VFLwzAQB/B3we9Q8r6mHSpS1g5EBj6rHyDNrl1YLhfuMuv89IY6J+LLfLtwuR93/Ffrd/TFG7A4Cq2qy0oVECxtXRhb9fqyWdyrQpIJW+MpQKuOIGrdXV+tpmaC/hlSyj+lyEqQBm2rdinFRmuxO0AjJUUIuTkQo0n5yaNGw/tDXFjCaJLrnXfpqJdVdadODF+i0DA4C49kDwghzfOawWeRguxclG9tukSbiLeRyYJIvgf9l4fGhTNT3/yB0FkmoSGV+ZjTRjOVx+tqrtD/ALf/A5ZnAG3zNAZi0/scQd6kyJjqcgYUk0P3ARviB6ZJgHW30r+y6T4BAAD//wMAUEsDBBQABgAIAAAAIQA1ZRhYpAAAAP4AAAATACgAY3VzdG9tWG1sL2l0ZW0xLnhtbCCiJAAooCAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAACsz00KgzAQhuF9oXcIOYCRLlyICkLd1kKgq26SOJpAfiQZQW/fIKUn6HL4Hl6YRtY8bFFBIhwsKISJ42Ghpe/+2XOzox4mgyb4cZ6NgtFb46HYk6XkhA/hMs6WkhfElGFLK0p2Z32qZUs14lozlpQGJ1IRVvB5m0N0AvMZFxbO8D2ozYFHdivLikkjrQlLFKs+vrG/pLqG/R7urpcPAAAA//8DAFBLAQItABQABgAIAAAAIQCWW+dDiQEAADwGAAATAAAAAAAAAAAAAAAAAAAAAABbQ29udGVudF9UeXBlc10ueG1sUEsBAi0AFAAGAAgAAAAhAB6RGrfvAAAATgIAAAsAAAAAAAAAAAAAAAAAwgMAAF9yZWxzLy5yZWxzUEsBAi0AFAAGAAgAAAAhAFRBdBwzAQAASwUAABwAAAAAAAAAAAAAAAAA4gYAAHdvcmQvX3JlbHMvZG9jdW1lbnQueG1sLnJlbHNQSwECLQAUAAYACAAAACEA5fxODOgMAADnMAAAEQAAAAAAAAAAAAAAAABXCQAAd29yZC9kb2N1bWVudC54bWxQSwECLQAKAAAAAAAAACEArp7hlQsEAQALBAEAFQAAAAAAAAAAAAAAAABuFgAAd29yZC9tZWRpYS9pbWFnZTMuZ2lmUEsBAi0AFAAGAAgAAAAhAJa1reLxBQAAUBsAABUAAAAAAAAAAAAAAAAArBoBAHdvcmQvdGhlbWUvdGhlbWUxLnhtbFBLAQItAAoAAAAAAAAAIQAZDo2bmmgCAJpoAgAVAAAAAAAAAAAAAAAAANAgAQB3b3JkL21lZGlhL2ltYWdlMS5wbmdQSwECLQAUAAYACAAAACEABNjgzVrSAgCcAgcAFQAAAAAAAAAAAAAAAACdiQMAd29yZC9tZWRpYS9pbWFnZTIuZW1mUEsBAi0AFAAGAAgAAAAhABb3sCgZCAAArB0AABEAAAAAAAAAAAAAAAAAKlwGAHdvcmQvc2V0dGluZ3MueG1sUEsBAi0AFAAGAAgAAAAhALqKxOGRAgAAZgkAABIAAAAAAAAAAAAAAAAAcmQGAHdvcmQvZm9udFRhYmxlLnhtbFBLAQItABQABgAIAAAAIQB0Pzl6wgAAACgBAAAeAAAAAAAAAAAAAAAAADNnBgBjdXN0b21YbWwvX3JlbHMvaXRlbTEueG1sLnJlbHNQSwECLQAUAAYACAAAACEASUab3eAAAABVAQAAGAAAAAAAAAAAAAAAAAA5aQYAY3VzdG9tWG1sL2l0ZW1Qcm9wczEueG1sUEsBAi0AFAAGAAgAAAAhAIIHldWPDAAA+ncAAA8AAAAAAAAAAAAAAAAAd2oGAHdvcmQvc3R5bGVzLnhtbFBLAQItABQABgAIAAAAIQCiDemz4QEAAOMDAAAQAAAAAAAAAAAAAAAAADN3BgBkb2NQcm9wcy9hcHAueG1sUEsBAi0AFAAGAAgAAAAhAPt+rsiSAQAAIgMAABEAAAAAAAAAAAAAAAAASnoGAGRvY1Byb3BzL2NvcmUueG1sUEs
BAi0AFAAGAAgAAAAhAMpi6UwAAQAA5AEAABQAAAAAAAAAAAAAAAAAE30GAHdvcmQvd2ViU2V0dGluZ3MueG1sUEsBAi0AFAAGAAgAAAAhADVlGFikAAAA/gAAABMAAAAAAAAAAAAAAAAARX4GAGN1c3RvbVhtbC9pdGVtMS54bWxQSwUGAAAAABEAEQBdBAAAQn8GAAAA'}}}
[ERROR] Runtime.MarshalError: Unable to marshal response: Object of type KeyError is not JSON serializable
Traceback (most recent call last):END RequestId: a26b93bd-bcf2-4072-99da-29ffcc3bc350
REPORT RequestId: a26b93bd-bcf2-4072-99da-29ffcc3bc350  Duration: 11.92 ms      Billed Duration: 12 ms  Memory Size: 1024 MB    Max Memory Used: 93 MB

When I add a debug print to my code, I can see that the payload is cut, and the error log contains only the final part of my payload (e.g. RX4GAGN1c3RvbVhtbC9pdGVtMS54bWxQSwUGAAAAABEAEQBdBAAAQn8GAAAA'}}}).

Reading the AWS docs about Lambda quotas and limits, we can see that the invocation payload limit is 6 MB, far larger than my payload size (232 KB):

Resource: Invocation payload (request and response)
Quota: 6 MB each for request and response (synchronous); 256 KB (asynchronous)

Source: https://docs.aws.amazon.com/lambda/latest/dg/gettingstarted-limits.html
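
As a quick sanity check (a small sketch, assuming payload.json is the file passed to --payload), the request size can be compared against the 6 MB synchronous limit:

import os

# Compare the request payload size against the 6 MB synchronous invocation limit.
size = os.path.getsize("payload.json")
print(f"payload.json: {size / 1024:.1f} KB "
      f"({size / (6 * 1024 * 1024):.1%} of the 6 MB limit)")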

I really think I'm doing everything by the rules. Can someone help me figure out where I'm going wrong or what I'm misunderstanding?

I expect the payload of my request to arrive intact in my Python function.


Solution

  • How did you log the event in your Lambda function? (A minimal logging sketch follows the code below.)

    It seems like the problem is this line: body = json.loads(str(event["body"]))

    You are using the CLI and, based on the docs, the event will be sent as a string, so you need to JSON-parse the event object first:

                output_path    = "/tmp/"
                # Parse whichever level still arrives as a JSON string before indexing into it
                e              = json.loads(event) if isinstance(event, str) else event
                body           = e["body"]
                if isinstance(body, str):
                    body = json.loads(body)
                base64str      = body["file"]["data"]
                filename       = body["file"]["filename"]
                base64_bytes   = base64decode(base64str)
                input_filename = filename.split('.')[0] + ".docx"
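
    As for the first question above (how the event was logged), here is a minimal sketch of logging the raw event and its size before any parsing, so CloudWatch shows whether the payload really arrives truncated; this is only an illustration, not part of the original handler:

        import json

        def lambda_handler(event, context):
            # Log the size and content of the incoming event before touching it,
            # so you can verify in CloudWatch whether the payload was truncated.
            raw = event if isinstance(event, str) else json.dumps(event)
            print(f"event size: {len(raw)} bytes")
            print(raw)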