I created an Azure Synapse workspace with a Data Lake storage account attached to it.
1. I uploaded a Parquet file to the data lake that is connected to Synapse.
2. When I access that file from the serverless SQL pool using OPENROWSET, it works.
3. But when I access that file from the dedicated SQL pool, it is not authorized (see the screenshot). The same issue arises when I read this file from the Spark pool.
There is an authorization issue: the default credential works fine with OPENROWSET in the serverless pool, but it does not work in the dedicated pool.
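For reference, the working serverless query has roughly the following shape (the storage account, container, and file names here are placeholders, not the exact query from the question):

```sql
-- Serverless SQL pool: reads the Parquet file directly with the default credential.
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://<storageaccount>.dfs.core.windows.net/<container>/<file>.parquet',
    FORMAT = 'PARQUET'
) AS rows;
```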
I have tried all available credentials mentioned in this documentation.
| | CSV | Parquet | ORC |
|---|---|---|---|
| Azure Blob Storage | SAS/MSI/SERVICE PRINCIPAL/KEY/AAD | SAS/KEY | SAS/KEY |
| Azure Data Lake Gen2 | SAS/MSI/SERVICE PRINCIPAL/KEY/AAD | SAS (blob 1)/MSI (dfs 2)/SERVICE PRINCIPAL/KEY/AAD | SAS (blob 1)/MSI (dfs 2)/SERVICE PRINCIPAL/KEY/AAD |
For ADLS Gen2, you need to provide any one of the above-mentioned credentials.
I recommend using MSI. Use the code below and ensure you have the minimum RBAC roles required:
- Minimum RBAC roles required: Storage Blob Data Contributor or Storage Blob Data Owner for the Azure AD-registered SQL server (in Synapse, this is the workspace managed identity).
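A minimal sketch of the MSI setup in the dedicated SQL pool, assuming placeholder names for the credential, data source, table, and storage paths (adjust the column list to your file's schema):

```sql
-- Dedicated SQL pool: authenticate to ADLS Gen2 with the workspace
-- Managed Service Identity (MSI). All object names and paths are placeholders.

-- A database master key must already exist before creating a scoped credential:
-- CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

CREATE DATABASE SCOPED CREDENTIAL msi_cred
WITH IDENTITY = 'Managed Service Identity';

-- External data source over the ADLS Gen2 container (dfs endpoint for MSI).
CREATE EXTERNAL DATA SOURCE adls_parquet_src
WITH (
    TYPE = HADOOP,
    LOCATION = 'abfss://<container>@<storageaccount>.dfs.core.windows.net',
    CREDENTIAL = msi_cred
);

CREATE EXTERNAL FILE FORMAT parquet_format
WITH (FORMAT_TYPE = PARQUET);

-- External table over the uploaded Parquet file.
CREATE EXTERNAL TABLE dbo.MyParquetData (
    Id INT,
    Name NVARCHAR(100)
)
WITH (
    LOCATION = '/<folder>/<file>.parquet',
    DATA_SOURCE = adls_parquet_src,
    FILE_FORMAT = parquet_format
);
```

Alternatively, if you want to load the data into a regular table, the COPY statement also accepts managed identity directly:

```sql
COPY INTO dbo.MyParquetData
FROM 'https://<storageaccount>.dfs.core.windows.net/<container>/<folder>/<file>.parquet'
WITH (
    FILE_TYPE = 'PARQUET',
    CREDENTIAL = (IDENTITY = 'Managed Identity')
);
```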