I have been using Databricks for quite some time now. Recently I got a new Databricks environment and also mounted Azure ADLS Gen2 to it. I tested the connection and it looks good, but when I read any file from the mount point, the code just keeps running and never shows a result. Any idea what the reason could be? I even tried to read a CSV file; it's the same.
The code I am running is below.
path = 'some/path/in/ADLS_Gen2/'  # relative path into the ADLS Gen2 mount
df = spark.read.json(path)
df.display()
Also, earlier when I mounted ADLS I used to see it in the DBFS explorer, but this time I am not able to.
dbutils.fs.ls(path)
This shows all the files present in that particular folder of ADLS.
Try prepending the path with "dbfs:" to tell Spark that it should look through the Databricks File System (DBFS):
path = 'dbfs:/some/path/in/ADLS_Gen2/'
df = spark.read.json(path)
df.display()
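Since you mentioned the mount is not visible in the DBFS explorer, it is also worth confirming it is actually registered. dbutils.fs.mounts() lists every mount point with its source, so a quick check (just a sketch, printing whatever mounts exist) looks like:

# Print every registered mount point and the ADLS source it maps to.
for m in dbutils.fs.mounts():
    print(m.mountPoint, '->', m.source)

If your mount is missing from that list, the path will not resolve no matter which prefix you use.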
If that does not work, I would try to find the directory within the Databricks catalog
and copy the path directly from there in the Spark API format.
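For reference, the catalog shows the same location in two formats, and only one of them is what spark.read expects. A minimal sketch with a placeholder mount name ('/mnt/your_mount' is hypothetical; substitute your own):

# Spark API format -- scheme-prefixed, what spark.read expects:
spark_path = 'dbfs:/mnt/your_mount/some/path/'
# File API format -- FUSE path for local-file tools like pandas or open():
file_path = '/dbfs/mnt/your_mount/some/path/'

Copying the File API format into spark.read is a common way to end up with a path Spark cannot resolve.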
The last solution I can think of is changing the API call to:
path = 'dbfs:/your/path/here'
spark.read.format('json').load(path).display()
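One more thing worth knowing: even when the path resolves, spark.read.json scans the files once just to infer a schema, and on a large or slow mount that inference pass alone can make the cell look stuck. Supplying an explicit schema skips it. A minimal sketch with hypothetical column names (replace them with the actual fields in your JSON):

from pyspark.sql.types import StructType, StructField, StringType, LongType

# Hypothetical schema -- swap in the real fields of your JSON files.
schema = StructType([
    StructField('id', LongType(), True),
    StructField('name', StringType(), True),
])

path = 'dbfs:/your/path/here'
df = spark.read.schema(schema).json(path)  # explicit schema, no inference scan
df.limit(10).display()  # cap the rows so the first display stays cheap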