Tags: azure-blob-storage, databricks, azure-databricks

Azure Databricks: error 403 while doing dbutils.fs.ls on mounted directory


I have an Azure Databricks workspace, a storage account with hierarchical namespace enabled, and a service principal. I have mounted the storage successfully (the output was True):

configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "[redacted]",
    "fs.azure.account.oauth2.client.secret": "[redacted]",
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/[redacted]/oauth2/token"
}

dbutils.fs.mount(
  source = "abfss://[redacted]@[redacted].dfs.core.windows.net/",
  mount_point = "/mnt/demo",
  extra_configs = configs
)

Now I try to view the mounted directory contents:

dbutils.fs.ls("/mnt/demo")

and I get error:

Operation failed: "This request is not authorized to perform this operation using this permission.", 403, GET, https://[redacted].dfs.core.windows.net/[redacted]?upn=false&resource=filesystem&maxResults=5000&timeout=90&recursive=false, AuthorizationPermissionMismatch, "This request is not authorized to perform this operation using this permission.

I have double-checked that my service principal has the Storage Blob Data Contributor role on the storage account.
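
One way to narrow this down from the notebook itself is to authenticate against the abfss path directly with the same client ID/secret, bypassing the mount. This is only a sketch: the per-account Spark configuration keys below follow the standard OAuth setup for ADLS Gen2, and the placeholder names are illustrative, not values from this workspace.

storage_account = "<storage-account>"
container = "<container>"
tenant_id = "<tenant-id>"

# Same service principal credentials as in the mount configs
spark.conf.set(f"fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{storage_account}.dfs.core.windows.net",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{storage_account}.dfs.core.windows.net", "[redacted]")
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{storage_account}.dfs.core.windows.net", "[redacted]")
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{storage_account}.dfs.core.windows.net",
               f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")

# If this also fails with 403, the problem is the role assignment itself,
# not the mount
dbutils.fs.ls(f"abfss://{container}@{storage_account}.dfs.core.windows.net/")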

What am I doing wrong? Any help will be much appreciated.


Solution

  • Thanks everyone for the useful tips! The problem, however, was more prosaic: it turned out that I had two service principals with the same name. In my code I had used the Client ID/Secret of the first one, but I had granted the Storage Blob Data Contributor role to the other one (see the remount sketch below).
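
If the fix is to point the configs at the service principal that actually holds the role (rather than granting the role to the one already in the code), the mount created with the old credentials has to be recreated; note also that role assignments can take a few minutes to propagate. A minimal sketch, reusing the mount point and configs from the question:

# Drop the mount that was created with the wrong service principal
dbutils.fs.unmount("/mnt/demo")

# Recreate it with configs pointing to the service principal that holds
# the Storage Blob Data Contributor role
dbutils.fs.mount(
  source = "abfss://[redacted]@[redacted].dfs.core.windows.net/",
  mount_point = "/mnt/demo",
  extra_configs = configs
)

# Listing should now succeed
dbutils.fs.ls("/mnt/demo")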