I tried to deploy my Serverless function on AWS Lambda, but I got this error:
Unzipped size must be smaller than 262144000 bytes.
I can't reduce the function size, because it contains a trained machine learning model (about 400MB).
Is there any way to increase the maximum Lambda size?
Update June 2022: The local Lambda storage in /tmp can now be expanded to up to 10GB, instead of the 512MB mentioned in my answer below.
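If a bigger /tmp alone solves your problem, a minimal sketch of raising it with boto3 follows; the function name is a placeholder:

```python
import boto3

lambda_client = boto3.client("lambda")

# Raise the function's ephemeral /tmp storage; Size is in MB (512 up to 10240).
lambda_client.update_function_configuration(
    FunctionName="my-model-function",  # placeholder name
    EphemeralStorage={"Size": 10240},
)
```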
If your model is 400MB you won't be able to upload it with your Lambda or in a layer.
From my point of view you have three options:
1. Load the model from S3: Store the model in S3 and download it into the function's /tmp storage when the Lambda starts (a sketch follows below). Keep in mind that /tmp is limited to 512MB. Should your model increase its size beyond this, this won't work anymore.
2. Attach an EFS volume: Mount an EFS volume to your Lambda and read the model directly from the volume.
3. Use a container image: Package the function as a container image, which can be up to 10GB in size.

All of those options increase the complexity of your solution. I think the S3 solution (option 1) is the easiest to do, but it might get you into trouble in the future. It will also increase your cold start time considerably, because your Lambda has to download 400MB.
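A minimal sketch of option 1, assuming the model lives under a placeholder bucket and key and is downloaded once per container:

```python
import os

import boto3

# Placeholder bucket and key; replace with your own.
MODEL_BUCKET = "my-model-bucket"
MODEL_KEY = "models/model.pkl"
LOCAL_PATH = "/tmp/model.pkl"

s3 = boto3.client("s3")

def _model_file():
    # Download only on cold start; /tmp persists across warm invocations.
    if not os.path.exists(LOCAL_PATH):
        s3.download_file(MODEL_BUCKET, MODEL_KEY, LOCAL_PATH)
    return LOCAL_PATH

def handler(event, context):
    path = _model_file()
    # ... load the model from `path` and run inference ...
    return {"statusCode": 200}
```

Because the download is guarded by os.path.exists, only the first request in each container pays for the 400MB transfer; warm invocations reuse the file.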
Using an EFS volume should be the best compromise. It should not affect your cold start time too much, since the volume just has to be mounted and no files need to be downloaded. You also won't have the issue of the restricted 512MB /tmp.
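With EFS the handler stays trivial. This is a minimal sketch, assuming the function's fileSystemConfig (an EFS access point ARN plus a localMountPath) mounts the volume at the hypothetical path /mnt/model:

```python
import pickle

# Hypothetical EFS mount path, configured via fileSystemConfig in serverless.yml.
MODEL_PATH = "/mnt/model/model.pkl"

_model = None

def handler(event, context):
    global _model
    if _model is None:
        # Read straight from the mounted volume; no download step needed.
        with open(MODEL_PATH, "rb") as f:
            _model = pickle.load(f)
    # ... run inference with _model ...
    return {"statusCode": 200}
```

Note that the function has to be attached to a VPC that can reach the file system, and the mount path must start with /mnt.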