Tags: azure, pyspark, azure-databricks

How can we save or upload a .py file to DBFS/FileStore?


We have a few .py files on my local machine that need to be stored/saved on the FileStore path on DBFS. How can I achieve this?

I tried the copy actions of the dbutils.fs module.

I tried the code below, but it did not work. I know something is not right with my source path. Or is there a better way of doing this? Please advise.

```
dbUtils.fs.cp ("c:\\file.py", "dbfs/filestore/file.py")
```

Solution

  • It sounds like you want to copy a file from your local machine to a DBFS path on Azure Databricks. However, because the Azure Databricks notebook is a browser-based interactive interface running in the cloud, code in a notebook cannot directly operate on files on your local machine.

    So here are some solutions you can try.

    1. As @Jon said in the comment, you can follow the official document Databricks CLI to install the Databricks CLI on your local machine via `pip install databricks-cli`, and then copy the file to DBFS.
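       The CLI route above can be sketched as follows, assuming the legacy `databricks-cli` package (which provides the `dbfs` command group); the file names are placeholders for your own:

       ```shell
       # Install the legacy Databricks CLI locally
       pip install databricks-cli

       # Configure once; prompts for the workspace URL and a personal access token
       databricks configure --token

       # Copy the local .py file to the FileStore path on DBFS
       dbfs cp ./file.py dbfs:/FileStore/file.py

       # List the target directory to verify the upload
       dbfs ls dbfs:/FileStore/
       ```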

    2. Follow the official document Accessing Data to import data by dropping files into, or browsing to files in, the Import & Explore Data box on the landing page; that document also recommends using the CLI, as in the figure below.

      [Screenshot: the "Import & Explore Data" box on the Azure Databricks landing page]

    3. Upload your files to Azure Blob Storage, then follow the official document Data sources / Azure Blob Storage to access them from Databricks with operations such as `dbutils.fs.cp`.
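       Inside a notebook (where `spark` and `dbutils` are predefined by the Databricks runtime), step 3 might look like the sketch below; the storage account, container, and secret scope names are hypothetical placeholders:

       ```python
       # Runs in a Databricks notebook, where `spark` and `dbutils` are predefined.
       # Storage account, container, and secret names below are hypothetical.
       storage_account = "myaccount"
       container = "mycontainer"

       # wasbs:// URI of the file previously uploaded to Blob Storage
       src = f"wasbs://{container}@{storage_account}.blob.core.windows.net/file.py"

       # Let Spark authenticate to the Blob Storage account with its access key
       spark.conf.set(
           f"fs.azure.account.key.{storage_account}.blob.core.windows.net",
           dbutils.secrets.get(scope="my-scope", key="storage-key"),
       )

       # Copy the file from Blob Storage into the FileStore path on DBFS
       dbutils.fs.cp(src, "dbfs:/FileStore/file.py")
       ```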

    Hope it helps.