I'm using the Serverless framework with AWS Lambda to deploy multiple functions. One of them is written in Python 3 and bundled with a few external libraries I'm using (redis, elasticsearch, dateutil).
Running the lambda locally with serverless invoke local --function function-name works fine. Running it locally with python3 function.py fails with ImportError: attempted relative import with no known parent package, because I'm importing a local Python file with a dot prefix, i.e. from .imported_file import one, two, three. Running it in the cloud with serverless invoke --function function-name fails with Unable to import module 'function-name/function': No module named 'redis'.
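For the python3 function.py case, one workaround is to make the import tolerant of being run as a plain script. This is only a sketch and assumes imported_file.py and packages/ sit next to function.py, as in the structure shown further down:

# Top of function.py -- a sketch, assuming imported_file.py and packages/
# live next to this file.
import os
import sys

# Put the bundled third-party libraries on sys.path regardless of the
# working directory (python3 function.py, invoke local, or the cloud).
sys.path.insert(0, os.path.join(os.path.dirname(os.path.abspath(__file__)), "packages"))

try:
    # Relative import works when this file is loaded as part of a package.
    from .imported_file import one, two, three
except ImportError:
    # Fallback for running the file directly: python3 function.py
    from imported_file import one, two, three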
Since I'm deploying multiple functions from the same YAML file, each function has its own folder, and I'm packaging with the following settings:
package:
  individually: true
  exclude:
    - "*/**"
    - "*"
And in each function:
package:
  include:
    - function-name/**
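Put together, the relevant part of serverless.yaml looks roughly like this; the handler name is a placeholder, since the actual handler function isn't shown here:

service: service-name

package:
  individually: true
  exclude:
    - "*/**"
    - "*"

functions:
  function-name:
    handler: function-name/function.handler   # handler name assumed
    package:
      include:
        - function-name/**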
The file structure looks like this:
service-name/
├─ function-name/
│ ├─ packages/
│ │ ├─ dateutil/
│ │ ├─ elasticsearch/
│ │ ├─ redis/
│ ├─ function.py
│ ├─ imported_file.py
├─ function-name-2/
├─ serverless.yaml
I've tried lots of things to overcome this. What mainly seems to work is having all the files and folders in the root folder of the lambda, but I can't accomplish that, because packaging the functions as shown above forces a deployment package structured like this:
./
├─ function-name/
│ ├─ packages/
│ ├─ function.py
Instead of this:
./
├─ packages/
├─ function.py
If someone can explain what's going on and how to get this working in the cloud, it would be much appreciated.
Solved it by setting PYTHONPATH as an environment variable pointing to the function's local packages/ folder.
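For reference, a rough sketch of that in serverless.yaml, assuming the layout above. Lambda unpacks the deployment package under /var/task, so the bundled libraries end up at /var/task/function-name/packages; the handler name is again a placeholder:

functions:
  function-name:
    handler: function-name/function.handler
    environment:
      # The Python runtime picks this up and adds the folder to sys.path,
      # so redis, elasticsearch and dateutil resolve from packages/.
      PYTHONPATH: /var/task/function-name/packages
    package:
      include:
        - function-name/**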