Tags: python, pyspark, azure-table-storage, azure-databricks

Connecting to Azure Table Storage from Azure Databricks


I am trying to connect to Azure Table Storage from Databricks. All the resources I can find cover blob containers, but I have tried modifying that approach for tables.

# Set the account key for the table endpoint
spark.conf.set(
  "fs.azure.account.key.accountname.table.core.windows.net",
  "accountkey")

# Point directly at the table, reusing the blob-style wasbs syntax
blobDirectPath = "wasbs://accountname.table.core.windows.net/TableName"

df = spark.read.parquet(blobDirectPath)

I am assuming for now that tables are stored as Parquet files. I am currently getting authentication errors with this code.


Solution

  • According to my research, Azure Databricks does not support Azure Table Storage as a data source. For more details, please refer to the list of supported data sources at https://learn.microsoft.com/en-gb/azure/databricks/external-data/.

    Besides, if you still want to use table storage, you can use the Azure Cosmos DB Table API instead, although it has some behavioral differences from Azure Table Storage. For more details, please refer to https://learn.microsoft.com/en-us/azure/cosmos-db/faq#where-is-table-api-not-identical-with-azure-table-storage-behavior. A possible workaround for reading an existing table is sketched below.
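
    Since there is no native Databricks connector, one workaround for reading data that already lives in an Azure Table Storage table is to pull the entities with the azure-data-tables Python SDK on the driver and convert them into a Spark DataFrame. This is only a minimal sketch: the connection string and table name are placeholders, it assumes the azure-data-tables package is installed on the cluster, and it assumes the table is small enough to collect on the driver.

    # Sketch: read an Azure Table Storage table via the azure-data-tables SDK,
    # then turn the entities into a Spark DataFrame.
    # Connection string and table name below are placeholders.
    from azure.data.tables import TableServiceClient

    connection_string = (
        "DefaultEndpointsProtocol=https;AccountName=accountname;"
        "AccountKey=accountkey;EndpointSuffix=core.windows.net"
    )
    table_name = "TableName"

    service = TableServiceClient.from_connection_string(conn_str=connection_string)
    table_client = service.get_table_client(table_name=table_name)

    # Each entity behaves like a dict (PartitionKey, RowKey, plus your columns)
    rows = [dict(entity) for entity in table_client.list_entities()]

    # Let Spark infer the schema from the collected rows
    df = spark.createDataFrame(rows)
    df.show()

    Note that this collects everything on the driver first, so it is only suitable for small tables; for larger data you would want to export the table to blob storage (for example as Parquet) and read it with the supported wasbs/abfss paths instead.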