Trying to create two GCP Cloud Run jobs based on the same Docker image, just with different commands. I can run the jobs locally using Docker, but when I try to create Cloud Run jobs on GCP they fail to find the scripts placed in the WORKDIR defined in the Dockerfile. E.g.
#Dockerfile
WORKDIR /app
COPY script1.py .
COPY script2.py .
On my local machine:
docker build -t image .
docker run image python3 script1.py
docker run image python3 script2.py
Both work fine. But if I deploy as a Cloud Run job:
gcloud beta run jobs deploy job1 \
--image=path-to-image:latest \
--region=region \
--command="/usr/bin/python3 script1.py"
Fails with:
terminated: Application failed to start: invalid status ::14: could not start container:
no such file or directory
Note, I seem to need /usr/bin/python3 on Cloud Run even though python3 is on the path when running the Docker container locally.
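To see where the interpreter actually lives inside the image, a quick local check (using the image tag built above, and assuming the image contains a POSIX shell) is:
docker run --rm image sh -c "command -v python3"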
I have tried:
--command="/usr/bin/python3 script1.py"
--command="/usr/bin/python3 script1.py /app/script1.py"
--command="/usr/bin/python3./script1.py"
All fail to find the scripts when running as a cloud run job. Anyone know how to make custom commands work correctly with WORKDIR?
Christian is correct, but for completeness I'm showing how to pass a custom command and args to the job. The thing that confused me (I should have realized) is that the command cannot contain spaces, i.e. it really must be just a command, not a command plus args. So pass the command and the args separately.
# Dockerfile
WORKDIR /app
COPY script1.py .
COPY script2.py .
#CMD whatever, we will override it
To deploy equivalent jobs on GCP:
gcloud beta run jobs deploy job1 \
--image=path-to-image:latest \
--region=region \
--command="/usr/bin/python3" \
--args="script1.py"
gcloud beta run jobs deploy job2 \
--image=path-to-image:latest \
--region=region \
--command="/usr/bin/python3" \
--args="script2.py"