Tags: python, docker, flask, google-cloud-run

ModuleNotFoundError on Google Cloud Run, not locally


I am deploying a Python Flask application behind gunicorn on Google Cloud Run. The container runs fine locally, but when I build the image and deploy a new service revision, I get an error and my gunicorn workers crash:

ModuleNotFoundError: No module named 'lib'

My directory structure is like so:

├── Dockerfile
├── README.md
├── gunicorn.conf.py
├── lib
│   ├── __init__.py
│   └── misc
│       ├── __init__.py
│       └── daily_message.py
├── requirements.txt
└── server.py

I import the function get_daily_message in server.py like so:

from lib.misc.daily_message import get_daily_message
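
For reference, server.py is just a thin Flask app around that function. A trimmed-down sketch of its shape (the / route and returning the message directly are simplifications, not the exact code):

from flask import Flask

from lib.misc.daily_message import get_daily_message

app = Flask(__name__)

@app.route("/")
def index():
    # get_daily_message() produces the message; return it directly as the
    # response body to keep the example minimal.
    return get_daily_message()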

This all works fine when I run the app directly on my machine, and also when I build and run the container image locally. Here is my Dockerfile:

# Use the official lightweight Python image.
# https://hub.docker.com/_/python
FROM python:3.11-slim

# Allow statements and log messages to immediately appear in the logs
ENV PYTHONUNBUFFERED True

# Copy local code to the container image.
ENV APP_HOME /app
WORKDIR $APP_HOME

# Copy the entire project directory to the container.
COPY . ./

# Install production dependencies.
RUN pip install --no-cache-dir -r requirements.txt

# Run the web service on container startup.
CMD ["gunicorn", "--bind", ":8080", "--workers", "2", "--threads", "4", "--timeout", "0", "server:app"]

Solution

  • Here I am to answer my own question. After pulling my image down from Artifact Registry and inspecting it, I realized the lib directory had never been copied into it. It turns out that gcloud builds submit uses your .gitignore (via a generated .gcloudignore) along with any .dockerignore to decide which files are uploaded and copied, and lib was being excluded. The module could not be found simply because it was never in the image; see the sketch after the deploy commands below for how to verify and fix this.

    For context, I was deploying to Cloud Run with:

    gcloud builds submit ...
    gcloud run deploy ...
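
    If you run into the same problem, gcloud meta list-files-for-upload will print exactly which files gcloud builds submit is going to upload, which makes the missing directory obvious. Adding an explicit .gcloudignore also stops gcloud from generating one out of your .gitignore; a rough sketch (the patterns below are only examples, keep whatever your project really needs to exclude):

    # .gcloudignore -- with this file present, gcloud no longer derives
    # ignore rules from .gitignore, so lib/ is uploaded like everything else.
    .git
    .gitignore
    __pycache__/
    *.pyc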