Tags: docker, google-cloud-platform, google-cloud-run

Google Cloud Run job succeeding in 2 seconds without included script running


I have a Google Cloud Run job that contains a script I need to run. The script loads data into Google BigQuery. I have a Dockerfile that defines all the prerequisites I need (installing a custom Python library from a GitHub repo, installing Python libraries, etc.).

When I build the container, the script runs as expected and I see results in BQ.

But when I run the job in Cloud Run it doesn't run as expected.

I tried running the job manually as well as scheduling it with a Cloud Scheduler cron job.

I expected the script to execute and then to see data in BQ.

However, I notice that the job runs in about 2 seconds and succeeds, while BQ shows no changes after the run. Also, the logs only tell me that the container exited with status 0 and give no other information, not even stdout output.

What am I missing here?

UPDATE: ISSUE RESOLVED

However, my answer may not be best practice and I may still be missing something here. I will be happy to accept any answers that help me and the greater Stack Overflow community understand more.

Thanks!


Solution

  • Turns out it was my lack of understanding of what the Docker runtime (engine? I'm not sure of the right term) does with built images.

    I thought that a Docker image would run through all the commands listed in the Dockerfile every time the container was started.

    Thus (assuming an Ubuntu base image for this example):

    FROM ubuntu

    # ... all the awesome image set-up stuff here ...

    RUN python3 my_awesome_script.py

    would mean that every time the container was run, it would end up running my_awesome_script.py.

    I WAS WRONG

    Turns out docker container run my_awesome_image (locally) runs the image as it exists after all the steps in the Dockerfile have completed; the RUN instructions execute at build time, not each time the container starts.
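
    To make that concrete, here is a minimal sketch of the build/run split as I now understand it (image name as above):

    # build: every RUN instruction executes here, including the one that runs my script
    docker build -t my_awesome_image .

    # run: starts a container from the finished image; RUN steps do not execute again,
    # and with nothing long-running defined, the container exits almost immediately
    docker container run my_awesome_image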

    Therefore a container built from a Dockerfile that doesn't start a Flask app or any other long-running process (or otherwise define what to execute at start-up) will simply exit.
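
    What I would actually want is for the script to execute when the container starts, which (as I understand it) is what CMD or ENTRYPOINT is for. A minimal sketch of that idea; the Python base image and pip dependency here are illustrative placeholders, not my real setup:

    FROM python:3.11-slim

    WORKDIR /app
    COPY my_awesome_script.py .

    # RUN only executes while the image is being built
    RUN pip install --no-cache-dir google-cloud-bigquery

    # CMD defines what executes each time the container starts,
    # which is what a Cloud Run job execution actually invokes
    CMD ["python3", "my_awesome_script.py"]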

    This is what I noticed locally and in GCP as well: the job would simply finish in a couple of seconds with a 'success'.

    Running docker container run -ti my_awesome_image gets me a TTY inside the container, where I can issue python3 my_awesome_script.py and get the results I want.
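
    Equivalently, the command can be passed straight to docker container run, since anything after the image name overrides the image's default command (sketch, same placeholder names):

    docker container run my_awesome_image python3 my_awesome_script.py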

    After some trial and error in the Google Cloud Run Jobs interface I found the "Container Arguments" section and was able to issue python3 my_awesome_script.py.

    This resolved my issue.
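
    For anyone who wants to set the same override from the command line instead of the console, I believe the equivalent is the --command and --args flags on gcloud run jobs (sketch; the job name is a placeholder):

    gcloud run jobs update my-awesome-job \
        --command python3 \
        --args my_awesome_script.py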