I have an architectural requirement to store data in ADLS under a medallion model, and I'm trying to write to ADLS using Delta Live Tables as a precursor to creating the Delta table.
I've had success using CREATE TABLE {dlt_tbl_name} USING DELTA LOCATION {location_in_ADLS}
to create the Delta table without Delta Live Tables... however, the goal is to use Delta Live Tables, and I don't see how this method is supported there.
Anyone have a suggestion? I'm guessing at this point that writing to ADLS isn't supported.
If you look at the documentation, you can see that you can specify a path
parameter for @dlt.table
(for Python). Similarly, you can specify a LOCATION
clause when using SQL (docs). You just need to make sure that you've provided all the necessary Spark configuration parameters at the pipeline level (with a service principal or SAS token). The following code works just fine:
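For reference, those pipeline-level Spark configuration parameters can be set under the pipeline's settings (the "configuration" section). A minimal sketch for service-principal (OAuth) access to ADLS Gen2 is below; the storage account name, secret scope, and secret key names are placeholders you'd replace with your own:

```
# Pipeline-level Spark configuration (sketch; <account>, <scope>, <sp-secret-key>,
# <application-id>, and <tenant-id> are placeholders, not real values)
fs.azure.account.auth.type.<account>.dfs.core.windows.net OAuth
fs.azure.account.oauth.provider.type.<account>.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
fs.azure.account.oauth2.client.id.<account>.dfs.core.windows.net <application-id>
fs.azure.account.oauth2.client.secret.<account>.dfs.core.windows.net {{secrets/<scope>/<sp-secret-key>}}
fs.azure.account.oauth2.client.endpoint.<account>.dfs.core.windows.net https://login.microsoftonline.com/<tenant-id>/oauth2/token
```

The {{secrets/...}} syntax pulls the client secret from a Databricks secret scope rather than hard-coding it in the pipeline settings.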
In Python:
import dlt

@dlt.view
def input():
    return spark.range(10)

@dlt.table(
    path="abfss://test@<account>.dfs.core.windows.net/dlt/python"
)
def python():
    return dlt.read("input")
In SQL:
CREATE OR REFRESH LIVE TABLE sql
LOCATION 'abfss://test@<account>.dfs.core.windows.net/dlt/sql'
AS SELECT * FROM LIVE.input