Tags: databricks, azure-databricks, databricks-dbx

Databricks DBX pass parameters to notebook job


Given a standard deployment.yaml file for dbx on Databricks, such as the one below:

    workflows:
      - name: "your-job-name"

        job_clusters:
          - job_cluster_key: "basic-cluster"
            <<: *basic-static-cluster
          - job_cluster_key: "basic-autoscale-cluster"
            <<: *basic-autoscale-cluster

        tasks:
          - task_key: "task1"
            python_wheel_task:
              package_name: "some-pkg"
              entry_point: "some-ep"
              parameters: ["param1", "param2"]

          - task_key: "your-task-03"
            job_cluster_key: "basic-cluster"
            notebook_task:
              notebook_path: "/Repos/some/project/notebook"
            depends_on:
              - task_key: "your-task-01"

Is there a way to pass parameters to a notebook task, like the ones shown for the wheel task? How would I do that, and how do I read the parameters inside the notebook?


Solution

  • You can define notebook parameters in your deployment.yaml with base_parameters, as below:

    - task_key: "your-task-03"
      job_cluster_key: "basic-cluster"
      notebook_task:
        notebook_path: "/Repos/some/project/notebook"
        base_parameters:
          param1: "param1-value"
          param2: "param2-value"
      depends_on:
        - task_key: "your-task-01"
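Inside the notebook, each key under base_parameters is exposed as a widget, so the values can be read with dbutils.widgets.get. A minimal sketch (the parameter names match the example above; the get_param helper and its local-default fallback are my own additions for illustration, since dbutils only exists on a Databricks cluster):

```python
def get_param(name, default=None):
    """Read a notebook parameter, falling back to a default outside Databricks."""
    try:
        # dbutils is injected by the Databricks runtime; each base_parameters
        # key from deployment.yaml appears here as a widget of the same name.
        return dbutils.widgets.get(name)  # noqa: F821
    except NameError:
        # Not running on Databricks (e.g. local testing) - use the fallback.
        return default


param1 = get_param("param1", "local-default")
param2 = get_param("param2", "local-default")
```

Values passed via base_parameters always arrive as strings, so cast them (e.g. int(...)) if the notebook expects another type.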