I'm trying to create a Delta Live Tables (DLT) table whose source data comes from an Azure Synapse table. I used the spark.read() method and ran it in a DLT pipeline, but the pipeline creates a materialized view instead of a DLT table.
Then I tried spark.readStream(), but it does not support Synapse Analytics as a format.
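For reference, the streaming attempt looked roughly like this (a sketch; `ConnectionURL`, `TempLocation`, and `TableQuery` stand in for my actual values). As I understand it, the Synapse connector is batch-only, so calling `.load()` on `spark.readStream` with this format raises an error to the effect that the data source does not support streamed reading:

```python
# Format name of the Databricks Azure Synapse (SQL DW) connector
SYNAPSE_FORMAT = "com.databricks.spark.sqldw"

def build_stream_reader(spark, connection_url, temp_dir, query):
    # This mirrors the batch reader below, but with spark.readStream.
    # It fails at .load() because the Synapse connector has no
    # streaming-source implementation.
    return (
        spark.readStream
        .format(SYNAPSE_FORMAT)
        .option("url", connection_url)
        .option("tempDir", temp_dir)
        .option("enableServicePrincipalAuth", "true")
        .option("query", query)
        .load()
    )
```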
Can you suggest some methods to create a DLT table whose source is a table in Azure Synapse?
I tried the code below, which creates a materialized view instead of a DLT table. Here TableQuery contains a SELECT statement that fetches records from the Synapse table.
import dlt

@dlt.table(
    name=TargetTableName,
    comment="Landing to Bronze layer",
    table_properties={"quality": "bronze"}
)
def table_load():
    # Batch read from Azure Synapse via the sqldw connector
    df = (
        spark.read
        .format("com.databricks.spark.sqldw")
        .option("url", ConnectionURL)
        .option("tempDir", TempLocation)
        .option("enableServicePrincipalAuth", "true")
        .option("query", TableQuery)
        .load()
    )
    return df