Passing parameters to a Delta Live Tables pipeline through a config file stored in DBFS
I am reading data from an ADLS location and have the connection details. I need to pass these parameters through a config file. There are several parameters for the different layers (bronze and silver) of the DLT pipeline. I tried the following –
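One common pattern is to keep the parameters in a JSON file on DBFS and load it once at the top of the pipeline notebook. Below is a minimal sketch; the file path, key names, and JSON layout are assumptions for illustration, not taken from the question:

```python
import json

def load_pipeline_config(path):
    """Load pipeline parameters (e.g. ADLS connection details and
    per-layer settings for bronze/silver) from a JSON config file."""
    with open(path) as f:
        return json.load(f)

# In a DLT notebook the file would live under a DBFS path such as
# /dbfs/FileStore/configs/dlt_config.json (hypothetical), and the
# loaded values would parameterize the table definitions, e.g.:
#
# config = load_pipeline_config("/dbfs/FileStore/configs/dlt_config.json")
# bronze_source = config["bronze"]["source_path"]
```

An alternative is to set key/value pairs in the pipeline's configuration settings and read them with `spark.conf.get(...)` inside the notebook, which avoids the file read entirely.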
Hard Deletes in Delta Live Tables (DLT)
How are folks handling hard deletes in their Delta Live Tables pipelines? I am working with the source team to see about getting them to update their processes to provide a change log, but for right now that option is off the table, and I need to come up with a way to handle hard deletes in my sources at the apply-changes step.
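Without a change log from the source, one workaround is to derive synthetic delete events by comparing the keys in the target table against the keys in the latest full snapshot: anything that disappeared from the snapshot was hard-deleted upstream. The sketch below shows the idea in plain Python (in a real pipeline this would be an anti-join between the snapshot and the target, feeding the apply-changes step); the key and event field names are assumptions:

```python
def derive_hard_deletes(target_keys, snapshot_keys):
    """Keys present in the target but absent from the latest full
    snapshot were hard-deleted at the source; emit synthetic delete
    events that an apply-changes step can consume."""
    deleted = set(target_keys) - set(snapshot_keys)
    return [{"key": k, "operation": "DELETE"} for k in sorted(deleted)]

# Example: ids 1-3 exist in the target, but the new snapshot only
# contains 1 and 3, so id 2 was hard-deleted upstream.
events = derive_hard_deletes({1, 2, 3}, {1, 3})
# events == [{"key": 2, "operation": "DELETE"}]
```

The obvious caveat is that this requires receiving full snapshots; with incremental extracts, a missing key does not imply a delete.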
Clarification on Databricks DLT resources
When setting up a DLT pipeline, there are three product editions: Core, Pro, and Advanced. Comparing DLT Classic Core with DLT Classic Pro, the difference is that DLT Classic Pro can handle CDC. Does that mean that if I'm using DLT Classic Core, I have to enable the change data feed, read the feed, and do the merging for my daily ingestion pipeline manually, whereas DLT Classic Pro does not require these steps?
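To make the trade-off concrete, here is a toy illustration in plain Python (not the DLT API) of the merge logic you would otherwise maintain by hand: applying a feed of upsert/delete events to a keyed table, which is roughly what the Pro edition's `APPLY CHANGES` automates for you. The event schema is an assumption for illustration:

```python
def apply_change_feed(table, feed):
    """Apply a change feed (list of events, oldest first) to a table
    keyed by 'id'. Inserts/updates are upserts; deletes remove the row."""
    for event in feed:
        if event["operation"] == "DELETE":
            table.pop(event["id"], None)
        else:  # INSERT and UPDATE are both treated as upserts here
            table[event["id"]] = event["data"]
    return table

table = {1: "a", 2: "b"}
feed = [
    {"id": 2, "operation": "UPDATE", "data": "b2"},
    {"id": 1, "operation": "DELETE"},
    {"id": 3, "operation": "INSERT", "data": "c"},
]
apply_change_feed(table, feed)
# table is now {2: "b2", 3: "c"}
```

In practice the hand-rolled version also has to handle out-of-order events, sequencing columns, and schema drift, which is much of what the managed CDC feature buys you.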
Append-only table from non-streaming source in Databricks Delta Live Tables (DLT)
I have a DLT pipeline where all tables are non-streaming (materialized views), except for the last one, which needs to be append-only and is therefore defined as a streaming table.
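The core difficulty is that a materialized view is fully recomputed on each run, so a naive append from it would re-append rows that were already written; an append-only sink effectively has to remember what it has already seen. The sketch below illustrates that in plain Python (not the DLT API), with `id` as an assumed key column:

```python
def append_new_rows(sink, seen_keys, view_rows, key="id"):
    """Append only rows whose key has not been appended before,
    simulating append-only semantics over a fully recomputed view."""
    for row in view_rows:
        if row[key] not in seen_keys:
            sink.append(row)
            seen_keys.add(row[key])
    return sink

sink, seen = [], set()
append_new_rows(sink, seen, [{"id": 1}, {"id": 2}])             # first run
append_new_rows(sink, seen, [{"id": 1}, {"id": 2}, {"id": 3}])  # recompute repeats old rows
# sink holds each row exactly once: ids 1, 2, 3
```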