
How to monitor Databricks jobs using the CLI or the Databricks API to get information about all jobs


I want to monitor the status of my jobs to see whether they are running over time or have failed. If you have a script or any reference, please share it. Thanks.


Solution

  • You can use the databricks runs list command to list all the job runs. This lists each run with its current state (e.g. RUNNING or TERMINATED) and, once it has finished, its result (e.g. SUCCESS or FAILED).

    If you want to check whether a job is running over time, use the databricks runs get --run-id command to fetch the run's metadata. This returns JSON from which you can parse out the start_time and end_time fields (epoch timestamps in milliseconds), as sketched in the Python snippet below.

    # List job runs and their states.
    databricks runs list

    # Get the metadata for a single run as JSON.
    databricks runs get --run-id 1234
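
    To automate the overtime check, here is a minimal Python sketch that shells out to that same CLI command and flags a run that exceeds a threshold. The MAX_RUNTIME_MINUTES value and the run-id 1234 are placeholders you would replace with your own; start_time and end_time come from the run metadata, and end_time is 0 while the run is still in progress.

    import json
    import subprocess
    import time

    MAX_RUNTIME_MINUTES = 60  # example threshold; tune for your jobs

    def get_run(run_id):
        """Fetch run metadata as a dict via the Databricks CLI."""
        out = subprocess.check_output(
            ["databricks", "runs", "get", "--run-id", str(run_id)]
        )
        return json.loads(out)

    def is_overtime(run):
        """True if the run has lasted longer than the threshold."""
        start_ms = run["start_time"]  # epoch milliseconds
        # end_time is 0 (or absent) while the run is still running,
        # so fall back to the current time for in-flight runs.
        end_ms = run.get("end_time") or int(time.time() * 1000)
        elapsed_min = (end_ms - start_ms) / 1000 / 60
        return elapsed_min > MAX_RUNTIME_MINUTES

    run = get_run(1234)  # placeholder run-id taken from `databricks runs list`
    print(run["state"], "overtime:", is_overtime(run))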
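
    Since you asked about the Databricks API as well: the same information is available from the Jobs REST API, so you can poll it directly instead of going through the CLI. Below is a minimal sketch, assuming your workspace URL and a personal access token are available in the DATABRICKS_HOST and DATABRICKS_TOKEN environment variables (those variable names are just a convention I chose here).

    import os
    import requests

    host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
    token = os.environ["DATABRICKS_TOKEN"]  # personal access token

    # List recent job runs (Jobs API 2.1: GET /api/2.1/jobs/runs/list).
    resp = requests.get(
        f"{host}/api/2.1/jobs/runs/list",
        headers={"Authorization": f"Bearer {token}"},
        params={"limit": 25},
    )
    resp.raise_for_status()

    for run in resp.json().get("runs", []):
        state = run["state"]
        print(
            run["run_id"],
            state.get("life_cycle_state"),   # e.g. PENDING, RUNNING, TERMINATED
            state.get("result_state", "-"),  # e.g. SUCCESS, FAILED (once finished)
            run["start_time"],               # epoch milliseconds
        )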
    

    Hope this helps get you on track!