I am deploying and running Python Jobs to Cloud Run. I can successfully deploy and run my jobs, and they behave as expected.
I am adding import requests to my job, and it is giving me the error:
Traceback (most recent call last):
File "/workspace/main.py", line 3, in <module>
import requests
ModuleNotFoundError: No module named 'requests'
That led me to https://stackoverflow.com/a/60599991/1889720.
I created a Dockerfile:
FROM python:3
RUN pip install requests
My job still deploys and runs successfully, and I no longer receive the error. However, my main.py no longer runs; it just exits immediately:
Container called exit(0).
I get the same early exit if my Dockerfile is simply:
FROM python:3
It feels like the Dockerfile is overriding something.
The addition of the Dockerfile is the only change I made, so I suspect I am creating it incorrectly. I have it in my jobs directory alongside main.py.
How do I correctly set up a Dockerfile for a Cloud Run Job?
Your question is missing some technical context, but the problem is most likely the Dockerfile itself. If it is just:
FROM python:3
RUN pip install requests
then it never copies your application code into the image and never tells the container what to run. Assuming you deploy from source with gcloud: without a Dockerfile, Cloud Run's buildpacks detect your Python code, install your dependencies, and set the entrypoint for you; once a Dockerfile is present, it is used instead, so it has to do all of that itself. With the two-line Dockerfile, the container falls back to the python:3 base image's default command, which starts an interactive Python interpreter that exits immediately because no terminal is attached, hence "Container called exit(0)". A more complete Dockerfile specifies the application files to copy and the command to run:
FROM python:3
WORKDIR /app
COPY . /app
# requests should be listed in requirements.txt, so a separate "RUN pip install requests" is not needed
RUN pip install --no-cache-dir -r requirements.txt
CMD ["python", "main.py"]
If that is the case, I recommend reading up on how to structure a Dockerfile for a Python application; the pattern above (copy the code, install dependencies, set a CMD) is the important part.
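For completeness, one way to rebuild and redeploy the job once the Dockerfile and requirements.txt are in place is roughly the following; the job name my-job, the project ID PROJECT_ID, and the region us-central1 are placeholders, and exact flags may vary with your gcloud version:
gcloud builds submit --tag gcr.io/PROJECT_ID/my-job .
gcloud run jobs deploy my-job --image gcr.io/PROJECT_ID/my-job --region us-central1
gcloud run jobs execute my-job --region us-central1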
Hope it helps. :D