Tags: python, amazon-web-services, docker, amazon-s3, aws-lambda

How to trigger an S3 event for AWS Lambda locally


I was testing an AWS Lambda Python function locally with Docker. My goal is to trigger the Lambda function when a JSON file is uploaded to an S3 bucket, and to test this locally I need the event data.

I have done almost everything with Docker: I push the Docker image to ECR and then deploy it to AWS Lambda. But how can I get the event data when a file is uploaded to the S3 bucket?

Dockerfile

FROM public.ecr.aws/lambda/python:3.10-x86_64

# Copy requirements.txt
COPY requirements.txt ${LAMBDA_TASK_ROOT}


# Install the specified packages
RUN pip3 install -r requirements.txt --target "${LAMBDA_TASK_ROOT}"

# Copy function code
COPY check_engine_function.py ${LAMBDA_TASK_ROOT}

# Set the CMD to your handler (could also be done as a parameter override outside of the Dockerfile)
CMD [ "check_engine_function.handler" ]

Lambda handler: check_engine_function.py

import json

def handler(event, context):
    # Pull the bucket name and object key out of the S3 event record
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = event['Records'][0]['s3']['object']['key']
    print(bucket, key)
    return {
        'statusCode': 200,
        'body': json.dumps('TTS processing initiated.')
    }
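
One way to exercise this handler with no AWS infrastructure at all is to call it directly with a hand-built event. This is a minimal sketch, assuming check_engine_function.py is importable from the current directory; the event mirrors the shape of a real s3:ObjectCreated:* notification, and the bucket and key are made-up placeholders:

# local_invoke.py -- hypothetical test script, not part of the question's code
from check_engine_function import handler

# Hand-built S3 event containing only the fields the handler reads
fake_event = {
    "Records": [
        {
            "s3": {
                "bucket": {"name": "my-test-bucket"},
                "object": {"key": "uploads/sample.json"}
            }
        }
    ]
}

print(handler(fake_event, None))  # the context argument is unused by this handler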



Here, how can I get the bucket and key when I'm running the Lambda locally, following this article: https://docs.aws.amazon.com/lambda/latest/dg/python-image.html#python-image-instructions
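
The AWS base image used in the Dockerfile above bundles the Lambda Runtime Interface Emulator, so once the container is running it can be invoked over HTTP with any event payload you choose; this is the flow the linked article describes. A sketch, assuming the image is tagged check-engine and the bucket/key values are placeholders:

# Sketch: invoke the local container through the Runtime Interface Emulator.
# First build and start the container, e.g.:
#   docker build -t check-engine .
#   docker run -p 9000:8080 check-engine
import json
import urllib.request

# Sample S3 put-notification event to feed the function
event = {
    "Records": [
        {
            "s3": {
                "bucket": {"name": "my-test-bucket"},
                "object": {"key": "uploads/sample.json"}
            }
        }
    ]
}

# The emulator exposes this fixed invocation endpoint
req = urllib.request.Request(
    "http://localhost:9000/2015-03-31/functions/function/invocations",
    data=json.dumps(event).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.read().decode("utf-8"))

In AWS itself, the bucket's event notification (s3:ObjectCreated:*) has to be wired to the function to get a real trigger; locally, feeding the emulator a representative event like the one above is the usual stand-in.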


Solution

  • The boto3 response object has a number of attributes (see the docs for more details), but the keys in a specific bucket can be retrieved with this script:

    import boto3
    
    s3_client = boto3.client('s3')
    
    # List the bucket's objects; 'Contents' is absent when the bucket is empty
    response = s3_client.list_objects_v2(Bucket='your-bucket-name')
    keys = [obj['Key'] for obj in response.get('Contents', [])]
    
    for key in keys:
        print(key)
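
  • Note that list_objects_v2 returns at most 1,000 keys per call. For larger buckets, a paginator walks every page; a minimal sketch using boto3's built-in paginator (the bucket name is again a placeholder):

    import boto3
    
    s3_client = boto3.client('s3')
    
    # The paginator issues repeated list_objects_v2 calls, following
    # continuation tokens until the whole bucket has been listed
    paginator = s3_client.get_paginator('list_objects_v2')
    for page in paginator.paginate(Bucket='your-bucket-name'):
        for obj in page.get('Contents', []):
            print(obj['Key'])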