Tags: azure, azure-data-factory, azure-databricks

How to get a Python notebook path from Azure Databricks?


I want to get the exact path of the Python/Scala notebooks that are created at the workspace or user level.

In the Azure Data Factory Python activity, I want to execute a Python notebook that is part of my workspace.

If I upload the .py file to DBFS, the ADF pipeline executes successfully.

But I don't want to upload the file to DBFS.

How can I call Python files in the Python activity of Azure Data Factory?

Thank you


Solution

  • You need to use the Azure Databricks Notebook activity instead: in a Data Factory pipeline, it runs a Databricks notebook (Python/Scala/SQL/R) directly from your Azure Databricks workspace, so nothing needs to be uploaded to DBFS.

    Note: The Azure Databricks Python activity, by contrast, runs a standalone Python file (e.g. one stored in DBFS) on your Azure Databricks cluster, which is why your pipeline only worked after uploading the .py file.
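
    As a rough sketch, a Databricks Notebook activity in an ADF pipeline definition looks like the following. The activity name, linked service name, notebook path, and parameter are placeholders for illustration; `notebookPath` should point at the notebook's path in your workspace (e.g. as shown in the Databricks workspace browser), not a DBFS location.

    ```json
    {
      "name": "RunWorkspaceNotebook",
      "type": "DatabricksNotebook",
      "linkedServiceName": {
        "referenceName": "AzureDatabricksLinkedService",
        "type": "LinkedServiceReference"
      },
      "typeProperties": {
        "notebookPath": "/Users/someone@example.com/MyNotebook",
        "baseParameters": {
          "inputParam": "value"
        }
      }
    }
    ```

    The `baseParameters` map is optional; inside the notebook you would read these values with Databricks widgets (e.g. `dbutils.widgets.get("inputParam")`).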