I have run this example using this job, and everything worked well.
Now I am trying to see if there is a way to pass parameters to jobs running on Cloud Run.
I understand I can use the gcloud command to create jobs with a --message-body argument, like this:
gcloud scheduler jobs create http JOB_NAME \
  --location REGION \
  --schedule="*/3 * * * *" \
  --uri="https://REGION-run.googleapis.com/apis/run.googleapis.com/v1/namespaces/PROJECT_ID/jobs/CLOUD_RUN_JOB_NAME:run" \
  --http-method POST \
  --oauth-service-account-email PROJECT-compute@developer.gserviceaccount.com \
  --message-body="This is the body"
However, while checking the documentation for Cloud Run jobs here, I don't see parameters mentioned anywhere. The idea is that, depending on a JSON payload containing the parameters, we can run different kinds of jobs (it's the same job, changing its operation based on the parameters).
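For example, the message body I'd like to pass would be a small JSON payload along these lines (the field names here are made up, just to illustrate the idea):

```json
{
  "operation": "nightly-export",
  "target_table": "my_dataset.my_table"
}
```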
With Cloud Run jobs, you can't provide parameters (entry point, args, or env vars) at execution time. I discussed that point with the Cloud Run jobs PM to get something implemented, and, unsurprisingly, the other alpha testers raised the same issue, so something will happen :).
My current workaround is to wrap the batch job in a web server, package that web server in a container, and deploy it on Cloud Run as a regular service. Then have Cloud Scheduler send the body with your parameters, parse it in the server, and invoke your batch with those inputs.