I have Python code with the following dependencies:
import json
import pydicom
from pydicom.dataset import Dataset, FileDataset
from pydicom.uid import ImplicitVRLittleEndian
import numpy as np
from PIL import Image
import cv2
import datetime
import base
from io import BytesIO
When I make a zip of my code and dependencies to upload to Lambda, the size limit is exceeded. The issue is with two dependencies, numpy and cv2 (which I'm installing as opencv-python-headless or opencv-python). I tried uploading from an S3 bucket, but the uncompressed size is greater than the 250 MB limit as well. I also tried a Lambda Layer to fix this, but no luck.
Is anyone familiar with this issue?
Lambda.zip, including the required dependencies, is what gets uploaded to AWS Lambda.
As pgrzesik said, a container image would be best, as this raises the limit to 10 GB. One way to do this would be to create a Dockerfile, likely starting like this:
FROM public.ecr.aws/lambda/python:3.8.2023.03.28.11-x86_64
# copy requirements.txt to container
COPY requirements.txt ./
# install the dependencies
RUN pip3 install -r requirements.txt
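To finish the image, the Dockerfile also needs to copy in the function code and tell Lambda which handler to run. A minimal sketch of those remaining lines, assuming the code lives in app.py with a handler function named lambda_handler (both placeholder names):
# copy the function code into the Lambda task root
COPY app.py ${LAMBDA_TASK_ROOT}
# set the handler (module.function) that Lambda should invoke
CMD [ "app.lambda_handler" ]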
This Dockerfile can be used to build an image, which can be stored in AWS ECR. A Lambda function can then be created that uses the image to run a container.
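For reference, the build, push, and function-creation steps might look roughly like this; the repository name, region, account ID, and IAM role below are all placeholders you would replace with your own:
docker build -t dicom-lambda .
aws ecr create-repository --repository-name dicom-lambda
aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com
docker tag dicom-lambda:latest 123456789012.dkr.ecr.us-east-1.amazonaws.com/dicom-lambda:latest
docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/dicom-lambda:latest
aws lambda create-function --function-name dicom-lambda --package-type Image --code ImageUri=123456789012.dkr.ecr.us-east-1.amazonaws.com/dicom-lambda:latest --role arn:aws:iam::123456789012:role/my-lambda-execution-role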
FYI, the requirements.txt file is just a list of pip-installable package names (with versions if desired). Note that these are the PyPI distribution names, which can differ from the import names: cv2 is provided by opencv-python-headless, and standard-library modules such as datetime don't need to be listed at all. For your dependencies it might look like:
pydicom
numpy
opencv-python-headless
Pillow
lightgbm==3.3.5