Tags: python, python-import, azure-databricks, spark-notebook

How to import one Databricks notebook into another?


I have a Python notebook A in Azure Databricks with an import statement like the one below:

import xyz, datetime, ...

I have another notebook, xyz, that is imported into notebook A as shown in the code above. When I run notebook A, it throws the following error:

ImportError: No module named xyz  

Both notebooks are in the same workspace directory. Can anyone help me resolve this?


Solution

  • Notebooks aren't Python modules, so they can't be pulled in with an import statement. The way to import one notebook into another is the %run magic command, which executes the target notebook and makes its definitions available in the calling notebook (see the sketch at the end of this answer):

    %run /Shared/MyNotebook
    

    or with a relative path:

    %run ./MyNotebook
    

    More details: https://docs.azuredatabricks.net/user-guide/notebooks/notebook-workflows.html
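
    Here is a minimal sketch of how this plays out, assuming a helper notebook named MyNotebook in the same folder as notebook A (the notebook name and the greet function are hypothetical):

        # --- ./MyNotebook (hypothetical helper notebook) ---
        def greet(name):
            return f"Hello, {name}!"

        # --- Notebook A, cell 1 (%run must sit alone in its own cell) ---
        %run ./MyNotebook

        # --- Notebook A, cell 2: greet() is now defined in this scope ---
        print(greet("Databricks"))  # Hello, Databricks!

    Note that dbutils.notebook.run, covered on the linked page, behaves differently: it launches the target notebook as a separate job and returns a string result, but it does not bring the target notebook's functions or variables into the caller's scope.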