apache-spark, databricks, job-scheduling, databricks-workflows

How to trigger a Databricks job from another Databricks job?


I'm currently working on a project where I have two distinct jobs on Databricks. The second job is dependent on the results of the first one.

I am wondering if there is a way to automatically trigger the second job once the first one has completed successfully. Ideally, I would like to accomplish this directly within Databricks, without an external scheduling or orchestration tool. Has anyone implemented this type of setup, or does anyone know if it's possible?


Solution

  • Databricks is now rolling out new functionality called "Job as a Task" that allows you to trigger another job as a task within a workflow. The documentation hasn't been updated yet, but you can already see it in the UI (a programmatic sketch follows below these steps).

    • Select "Run Job" when adding a new task:

    enter image description here

    • Then select the specific job to execute as the task.
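
If you prefer to define this setup programmatically instead of through the UI, the same "Run Job" task type can be expressed with the Databricks Python SDK (`databricks-sdk`). The sketch below is a minimal, illustrative example rather than a definitive implementation: the downstream job ID, task keys, and notebook path are placeholders you would replace with your own, and the cluster configuration is omitted for brevity.

```python
# Minimal sketch using the Databricks Python SDK (databricks-sdk).
# Job ID, task keys, and notebook path are placeholders; cluster
# configuration is omitted for brevity.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()  # picks up credentials from env vars or ~/.databrickscfg

DOWNSTREAM_JOB_ID = 123456789  # ID of the existing second job (placeholder)

# First task: the workload of job 1 (here, a notebook).
first_task = jobs.Task(
    task_key="first_job_logic",
    notebook_task=jobs.NotebookTask(notebook_path="/Workspace/first_job_notebook"),
)

# Second task: a "Run Job" task that triggers the downstream job.
# depends_on makes it run only after first_job_logic finishes.
trigger_task = jobs.Task(
    task_key="trigger_second_job",
    run_job_task=jobs.RunJobTask(job_id=DOWNSTREAM_JOB_ID),
    depends_on=[jobs.TaskDependency(task_key="first_job_logic")],
)

created = w.jobs.create(
    name="first-job-then-second-job",
    tasks=[first_task, trigger_task],
)
print(f"Created workflow with job_id={created.job_id}")
```

By default, a downstream task only starts when its dependency completes successfully, which matches the "trigger the second job once the first one has completed successfully" requirement without any external orchestrator.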