Tags: python, python-3.x, aws-lambda, aws-sam, aws-sam-cli

AWS Lambda in Python: Import parent package/directory in Lambda function handler


I have a directory structure like the following in my serverless application (the simplest possible app, to avoid clutter), which I created using AWS SAM with Python 3.8 as the runtime:

├── common
│   └── a.py
├── hello_world
│   ├── __init__.py
│   ├── app.py
│   └── requirements.txt
└── template.yaml

I would like to import the common/a.py module inside the Lambda handler, hello_world/app.py. I know that I can import it normally in Python by adding its path to PYTHONPATH or sys.path, but this doesn't work once the code runs in Lambda inside its Docker container. When invoked, the Lambda handler runs from the /var/task directory and the project's folder structure is not preserved there.
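
A quick way to see what the runtime actually has is to log the contents of /var/task and sys.path from a throwaway handler (just a diagnostic sketch, not part of the application):

import os
import sys

def lambda_handler(event, context):
    # Show what was deployed and where Python will look for imports
    print("Contents of /var/task:", os.listdir("/var/task"))
    print("sys.path:", sys.path)
    return {"statusCode": 200}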

I tried inserting /var/task/common, /var/common, /var, and even /common into sys.path programmatically, like this:

import sys
sys.path.insert(0, '/var/task/common')
from common.a import Alpha

but I still get the error:

ModuleNotFoundError: No module named 'common'

I am aware of Lambda Layers, but in my scenario I would like to reference the common code directly from multiple Lambda functions without having to upload it to a layer. I want something like the serverless-python-requirements plugin from the Serverless Framework, but in AWS SAM.

So my question is: how should I add the path to common to PYTHONPATH or sys.path? Or is there an alternative workaround, like serverless-python-requirements, that lets me directly import a module from a parent folder without using Lambda Layers?


Solution

  • I didn't find exactly what I was looking for, but I ended up with a solution: create a single Lambda function at the root that handles all the different API calls within the function. My Lambda function is integrated with API Gateway, so I can get the HTTP method and the request path from event["httpMethod"] and event["path"] respectively. I can then put all the packages under the root and import them from one another.

    For example, say I have two API paths, /items and /employees, and each needs to handle both GET and POST; the following routing code inside the handler suffices:

    if event["path"] == '/items':
       if event["httpMethod"] == 'GET':
          ...
       elif event["httpMethod"] == 'POST':
          ...
    elif event["path"] == '/employees':
       if event["httpMethod"] == 'GET':
          ...
       if event["httpMethod"] == 'POST':
          ...
    

    So now I can have as many packages as I want under this Lambda function. For example, this is what the repository looks like now:

    ├── application
    │   └── *.py
    ├── persistence
    │   └── *.py
    ├── models
    │   └── *.py
    ├── rootLambdaFunction.py
    └── requirements.txt
    

    This way, I can import packages freely anywhere within this structure.
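
    As an illustration, here is a minimal sketch of what rootLambdaFunction.py can look like with this layout; the items and employees modules and their functions are hypothetical names, not part of the actual repository:

    import json

    # application, persistence and models sit next to rootLambdaFunction.py,
    # so they are packaged into /var/task alongside the handler and can be
    # imported without any sys.path manipulation.
    from application import items, employees  # hypothetical modules

    def lambda_handler(event, context):
        path = event["path"]
        method = event["httpMethod"]

        if path == '/items' and method == 'GET':
            result = items.list_items()
        elif path == '/items' and method == 'POST':
            result = items.create_item(json.loads(event["body"]))
        elif path == '/employees' and method == 'GET':
            result = employees.list_employees()
        elif path == '/employees' and method == 'POST':
            result = employees.create_employee(json.loads(event["body"]))
        else:
            return {"statusCode": 404, "body": json.dumps({"message": "Not found"})}

        return {"statusCode": 200, "body": json.dumps(result)}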