
Databricks Asset Bundles


I have the following Databricks Asset Bundles config file for a project that I am migrating from dbx.

# yaml-language-server: $schema=bundle_config_schema.json

bundle:
  name: usrp

experimental:
  python_wheel_wrapper: true

artifacts:
  my-wheel:
    type: whl
    path: ./

targets:
  default:
    workspace:
      profile: DEFAULT
      root_path: /Shared/dbx/projects/bundels/p1
      artifact_path: "dbfs:/dbx/p1"
    
    resources:
      jobs:
        job1:
          name: job1
          job_clusters:
            - job_cluster_key: common-cluster
              new_cluster:
                spark_version: 12.2.x-scala2.12
                node_type_id: Standard_D4s_v5
                autoscale:
                  min_workers: 2
                  max_workers: 8
                spark_conf: # remove if not needed
                  spark.databricks.delta.preview.enabled: "true"
                  spark.databricks.io.cache.enabled: "true"
                  spark.sql.adaptive.enabled: "true"
                  # For postgres and mssql connection
                  spark.network.timeout: "300000"
                init_scripts: # remove if not needed
                  - workspace:
                      destination: /Shared/init_scripts/pyodbc-install.sh
          tasks:
            - task_key: task-1
              job_cluster_key: common-cluster
              python_wheel_task:
                package_name: dbx_test
                entry_point: main # defined in the setup.py entry_points section; see the sketch after this config
                parameters: ["something"]
              libraries:
                - whl: ./dist/dbx_test-*.whl
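
For context, the entry_point: main above refers to an entry point declared in the wheel's packaging metadata. A minimal setup.py sketch of what that could look like (the dbx_test.main:main module/function pair is a hypothetical placeholder, not taken from the project):

from setuptools import setup, find_packages

setup(
    name="dbx_test",
    version="0.1.0",
    packages=find_packages(),
    entry_points={
        "console_scripts": [
            # "main" is the name the python_wheel_task's entry_point
            # refers to; dbx_test.main:main is a placeholder function.
            "main=dbx_test.main:main",
        ],
    },
)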

I am able to build and deploy the job when using DBR 13.1+, but when I use DBR 12.2 I get the following error:

Error: python wheel tasks with local libraries require compute with DBR 13.1+. Please change your cluster configuration or set experimental 'python_wheel_wrapper' setting to 'true'

This happens even though I have the experimental section defined.

Also, when the job is deployed, it shows up as a notebook job rather than a wheel task. Why is this the case?


Solution

  • The issue has recently been fixed in the Databricks CLI, so upgrade to version 0.208.2, where the fix is released: https://github.com/databricks/cli/issues/892
  • The notebook job is expected behavior: the experimental python_wheel_wrapper setting makes the CLI wrap the wheel task in a generated notebook that installs the wheel and invokes its entry point, which is what allows local wheel libraries to run on clusters below DBR 13.1.
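
As a quick check after upgrading (a sketch; the install script is one documented way to update the standalone Databricks CLI, so adjust to your install method, and the target name below matches the config above):

# Confirm the installed CLI version (should be v0.208.2 or newer)
databricks -v

# One way to install/upgrade the standalone Databricks CLI
curl -fsSL https://raw.githubusercontent.com/databricks/setup-cli/main/install.sh | sh

# Redeploy the bundle against the default target
databricks bundle deploy -t default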