I have created a Delta table in Azure Data Lake Gen1, but I cannot see any data in the data lake via the Azure portal. However, when I run the 'ls' command on the Delta table path, I can see the files present.
The code is:
groupsUpdatePath = 'dbfs:/mnt/datalakeg2/referencedata/Groups.csv'
groupsUpdate = spark.read.csv(groupsUpdatePath,
                              inferSchema='true',
                              header='true',
                              mode="DROPMALFORMED",
                              timestampFormat="MM/dd/yyyy hh:mm:ss",
                              escape="\"").distinct()
groupsUpdate.write.option("path", "/mnt/datalake/Experiments/Example").saveAsTable("exampleTable")
Can anyone guide me on what the issue could be? Thanks.
Here is an example of how to write a Delta table to Azure Data Lake Storage Gen1 using the following commands. As Chen Hirsh mentioned, it might be a permission issue.
# Create a mount point for ADLS Gen1
configs = {
    "fs.adl.oauth2.access.token.provider.type": "CustomAccessTokenProvider",
    "fs.adl.oauth2.access.token.custom.provider": spark.conf.get("spark.databricks.passthrough.adls.tokenProviderClassName")
}
dbutils.fs.mount(
    source="adl://<storage-account-name>.azuredatalakestore.net/<directory-name>",
    mount_point="/mnt/<mount-name>",
    extra_configs=configs)
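The mount above relies on Azure AD credential passthrough. As an alternative sketch, the mount can also authenticate with a service principal; note that `<application-id>`, `<directory-id>`, `<scope-name>`, and `<key-name>` below are placeholders I am assuming, not values from the original post:

```python
# Sketch: mounting ADLS Gen1 with a service principal instead of passthrough.
# All angle-bracket values are placeholders for your own Azure AD app registration.
configs = {
    "fs.adl.oauth2.access.token.provider.type": "ClientCredential",
    "fs.adl.oauth2.client.id": "<application-id>",
    "fs.adl.oauth2.credential": dbutils.secrets.get(scope="<scope-name>", key="<key-name>"),
    "fs.adl.oauth2.refresh.url": "https://login.microsoftonline.com/<directory-id>/oauth2/token"
}

dbutils.fs.mount(
    source="adl://<storage-account-name>.azuredatalakestore.net/<directory-name>",
    mount_point="/mnt/<mount-name>",
    extra_configs=configs)
```

The service principal then needs ACLs on the target directory in the data lake, which ties back to the permission point above.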
# Write the DataFrame to ADLS Gen1 as a Delta table
groupsUpdate.write.option("path", "/mnt/mnt1sampledemo1").saveAsTable("exampleTable")
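One thing worth checking here: depending on the Databricks runtime and session defaults, `saveAsTable` without an explicit format may write Parquet rather than Delta. A minimal sketch, reusing the mount path from the example above (adjust to your own mount), that makes the Delta format explicit and then lists the output so you can confirm the files exist:

```python
# Sketch: write explicitly as Delta, then list the output directory.
# The path and table name follow the example above; adjust to your mount.
(groupsUpdate.write
    .format("delta")                        # be explicit about the storage format
    .mode("overwrite")
    .option("path", "/mnt/mnt1sampledemo1")
    .saveAsTable("exampleTable"))

# Confirm the _delta_log directory and parquet files were created
display(dbutils.fs.ls("/mnt/mnt1sampledemo1"))
```

If `_delta_log` is present under the path, the table really is Delta; if you only see loose parquet files, the format was not applied.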
Also, in the ADLS Gen1 account, go to Data explorer >> Access and check that the user you are accessing with has the appropriate permissions.
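If the Data explorer UI does show missing permissions, the ACLs can also be inspected and set from the Azure CLI. A sketch, assuming placeholder account, path, and object ID values (the `az dls` command group targets Data Lake Store Gen1):

```shell
# Show the current ACL on the directory (placeholders for account/path)
az dls fs access show --account <storage-account-name> --path /<directory-name>

# Grant read/write/execute to a user or service principal by object ID
az dls fs access set-entry --account <storage-account-name> \
    --path /<directory-name> \
    --acl-spec user:<object-id>:rwx
```

Remember that in ADLS Gen1 the caller also needs execute (x) permission on every parent folder up to the root to traverse into the directory.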
I can see the files created under the mount point of the ADLS Gen1 folder.