Pre hook – Macro – Truncate statement
I’m firing a macro via a pre_hook in the pub_table model. The macro’s role is to truncate pub_table, but it doesn’t truncate the table, so I checked whether the TRUNCATE statement fires using two versions of the macro.
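A minimal sketch of how such a macro and hook are commonly wired, assuming the model and macro names from the question (a common cause of a hook that "doesn't fire" is a macro that performs side effects instead of returning SQL for the hook to run):

```sql
-- macros/truncate_pub_table.sql (sketch; schema resolution is an assumption)
-- The macro simply renders the TRUNCATE statement; the pre_hook executes
-- whatever SQL the macro returns.
{% macro truncate_pub_table() %}
  truncate table {{ target.schema }}.pub_table
{% endmacro %}

-- models/pub_table.sql
{{ config(
    materialized='table',
    pre_hook="{{ truncate_pub_table() }}"
) }}
select ...
```

If the macro instead wraps the statement in `{% do run_query(...) %}`, the hook receives an empty string and nothing runs at hook time, which matches the symptom described.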
How to label a dbt project so that I can query the INFORMATION_SCHEMA for all jobs from this project?
dbt-core: 1.8.0, dbt-bigquery: 1.8.1. I have a dbt-core project that includes two tables (one incremental, one that aggregates the incremental table). Since dbt only logs the direct *.sql files, not the incremental-macro magic with the MERGE statements, the bytes_billed logged in /target/run_results.json is missing the majority of the costs when you run […]
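One approach is dbt's query-comment configuration, which on BigQuery can attach the comment as a job label so every job issued by the project (including the generated MERGE statements) is queryable from INFORMATION_SCHEMA. A sketch, assuming a project named `my_project`:

```yaml
# dbt_project.yml (sketch; the comment text is an assumption)
query-comment:
  comment: "my_project"
  job-label: true   # ask dbt-bigquery to attach the comment as a job label
```

With labels in place, total_bytes_billed can be aggregated from `INFORMATION_SCHEMA.JOBS` by filtering on that label, capturing the MERGE costs that run_results.json misses.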
dbt copy_partitions partitions with job parallelization in Bigquery
I am creating dbt models with copy_partitions=true, but the problem is when I need to ingest a number of partitions with a large volume of data, the whole process can take a long time as dbt iterates by calling the Bigquery API for each partition. Is there a way to parallelize those copy_partitions jobs in dbt?
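dbt itself issues the per-partition copy calls sequentially, so one workaround is to run the copies outside dbt with a thread pool. A runnable sketch of the concurrency pattern, with a stub in place of the real BigQuery call (in practice `copy_partition` would wrap `google.cloud.bigquery.Client.copy_table` on a partition decorator such as `table$20240101`; all names here are assumptions, not dbt's API):

```python
# Sketch: parallelize per-partition copy jobs with a thread pool.
from concurrent.futures import ThreadPoolExecutor, as_completed

def copy_partition(partition_id: str) -> str:
    # Placeholder for a real BigQuery copy job; a real implementation
    # would call client.copy_table(f"src${partition_id}", f"dst${partition_id}")
    # and wait on the returned job.
    return f"copied {partition_id}"

def copy_partitions_parallel(partition_ids, max_workers=8):
    # Submit all copies at once; BigQuery copy jobs run server-side,
    # so the threads mostly wait on job completion.
    results = []
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(copy_partition, p): p for p in partition_ids}
        for fut in as_completed(futures):
            results.append(fut.result())
    return results

print(sorted(copy_partitions_parallel(["20240101", "20240102", "20240103"])))
```

Note that dbt's `threads` setting parallelizes across models, not across partitions within one model, which is why an external driver like this is needed.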
Folder-level schema config in dbt using a variable
I have a use case in dbt where I need to set the schema name based on folder structure, since there are many folders and many models. I have done that at folder level in the dbt_project.yml file and it works fine. But I want to make it a bit more generic: assign values to a variable and then use the variable to populate the schema name.
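A sketch of the folder-level config with the schema pulled from a variable (folder and variable names are assumptions; whether `var()` resolves inside dbt_project.yml depends on the dbt version, and `env_var()` is a commonly used alternative for this):

```yaml
# dbt_project.yml (sketch)
vars:
  pub_schema: publishing

models:
  my_project:
    publishing:          # folder under models/
      +schema: "{{ var('pub_schema') }}"
```

Keep in mind dbt's default behavior appends the custom schema to the target schema; overriding the `generate_schema_name` macro changes that if an exact schema name is required.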