
Is it possible to submit a job to a cluster using an initialization script on Google Dataproc?


I am using Dataproc with 1 job on 1 cluster.

I would like to start my job as soon as the cluster is created. I thought the best way to achieve this would be to submit the job from an initialization action, like the script below.

function submit_job() {
  echo "Submitting job..."
  gcloud dataproc jobs submit pyspark ...
}
export -f submit_job

function check_running() {
  echo "checking..."
  gcloud dataproc clusters list --region='asia-northeast1' --filter='clusterName = {{ cluster_name }}' |
  tail -n 1 |
  while read name platform worker_count preemptive_worker_count status others
  do
    if [ "$status" = "RUNNING" ]; then
      return 0
    fi
  done
}
export -f check_running

function after_initialization() {
  local role
  role=$(/usr/share/google/get_metadata_value attributes/dataproc-role)
  if [[ "${role}" == 'Master' ]]; then
    echo "monitoring the cluster..."
    while true; do
      if check_running; then
        submit_job
        break
      fi
      sleep 5
    done
  fi
}
export -f after_initialization

echo "start monitoring..."
bash -c after_initialization & disown -h

Is this possible? When I ran this on Dataproc, the job was not submitted...

Thank you!


Solution

  • Consider using Dataproc Workflows. They are designed for multi-step workflows: creating a cluster, submitting jobs, and deleting the cluster. This is a better fit than initialization actions because workflows are a first-class Dataproc feature: each step is a real Dataproc job resource, and you can view its history. A minimal gcloud sketch follows this list.
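
As a rough sketch (the template name, cluster name, bucket path, and job file below are placeholders, not values from the question), a workflow template with a managed cluster can be set up and run roughly like this:

# Create an empty workflow template (name and region are placeholders).
gcloud dataproc workflow-templates create my-template --region=asia-northeast1

# Attach a managed (ephemeral) cluster that is created when the workflow starts
# and deleted once all steps finish.
gcloud dataproc workflow-templates set-managed-cluster my-template \
  --region=asia-northeast1 \
  --cluster-name=my-cluster \
  --num-workers=2

# Add the PySpark job as a step of the workflow (file path is a placeholder).
gcloud dataproc workflow-templates add-job pyspark gs://my-bucket/my_job.py \
  --step-id=my-pyspark-step \
  --workflow-template=my-template \
  --region=asia-northeast1

# Run the workflow: create cluster -> submit job -> delete cluster.
gcloud dataproc workflow-templates instantiate my-template --region=asia-northeast1

The instantiate command drives the whole sequence, and each step appears as a regular Dataproc job whose status and history you can inspect afterwards.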