I have an incremental model in dbt that reads from a Snowflake stream (set up as a source) and merges the data into a second table.
The problem I'm facing is that the column in question is defined in the 1st table (the one the stream is created on) as col_1 VARCHAR(3), and the corresponding field in the 2nd table is also defined as col_1 VARCHAR(3).
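For reference, here is a simplified sketch of the setup (the name table_1 for the 1st table is assumed, and the real tables have more columns):

CREATE TABLE DB.SCHEMA.table_1 (
    col_1 VARCHAR(3)
    -- , other columns omitted
);

CREATE TABLE DB.SCHEMA.table_2 (
    col_1 VARCHAR(3)
    -- , other columns omitted
);

-- Stream on the 1st table that the model reads from
CREATE STREAM DB.SCHEMA.stream_table_1 ON TABLE DB.SCHEMA.table_1;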
The SELECT statement in the model that performs the merge onto the 2nd table (table_2) is:
SELECT
CAST(REGEXP_REPLACE(col_1, '[^\x00-\x7F]', '') AS VARCHAR(3)) AS col_1,
...
FROM
DB.SCHEMA.stream_table_1
WHERE
METADATA$ACTION = 'INSERT'
QUALIFY ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...) = 1
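The model itself is materialized as an incremental merge; its config block looks roughly like this (the unique_key shown is a placeholder, not the real key):

{{ config(
    materialized = 'incremental',
    incremental_strategy = 'merge',
    unique_key = 'id'   -- placeholder key
) }}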
And the dbt execution log shows:
13:50:01 SQL status: SUCCESS 57 in 0.0 seconds
13:50:01 Changing col type from character varying(3) to character varying(16777216) in table database: "DB" schema: "SCHEMA" identifier: "table_2"
13:50:01 Using snowflake connection "model.workspace.table_2"
13:50:01 On model.workspace.table_2: /* {"app": "dbt", "dbt_version": "1.8.0", "profile_name": "but_worspace", "target_name": "qa", "node_id": "model.workspace.table_2"} */
alter table db.schema.table_2 alter "col_1" set data type character varying(16777216);
13:50:02 Snowflake adapter: Snowflake query id: 01b4587e-0105-28c1-003f-f70708b39e12
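For context, the resulting column type on table_2 can be checked after a run with e.g.:

DESC TABLE DB.SCHEMA.table_2;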
I don't understand what dbt is trying to do here, or why it widens the column to VARCHAR(16777216) when both tables define it as VARCHAR(3) and the SELECT already casts to VARCHAR(3). Any thoughts?