I use Airflow to run a dbt command, which kicks off a long-running query that sometimes takes more than 1 hour. Our Airflow config has scheduler_zombie_task_threshold set to 600,
which is 10 minutes. This is leading to zombie tasks in Airflow.
Googling this, the suggestion I found is to periodically emit some log lines so that Airflow thinks the task is still running.
My dbt command looks like: dbt run --project-dir ./dbt --profiles-dir ./dbt --select tag:SOME_VIEWS
How can I update this command so that it logs some progress while it is waiting for Snowflake to finish executing the query?
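For context, the only workaround I've thought of so far is to wrap the dbt call in a small Python script that prints a heartbeat line while the subprocess runs. This is just a sketch of my idea, not something I've verified fixes the zombie issue; the interval and message are placeholders I made up:

```python
import subprocess
import threading


def run_with_heartbeat(cmd, interval_s=60):
    """Run cmd as a subprocess, printing a heartbeat line every
    interval_s seconds so the task log keeps moving while the
    long-running query executes."""
    proc = subprocess.Popen(cmd)
    stop = threading.Event()

    def heartbeat():
        elapsed = 0
        # Event.wait returns False on timeout, True once stop is set.
        while not stop.wait(interval_s):
            elapsed += interval_s
            print(f"[heartbeat] dbt still running after {elapsed}s", flush=True)

    t = threading.Thread(target=heartbeat, daemon=True)
    t.start()
    try:
        return proc.wait()
    finally:
        stop.set()
        t.join()


if __name__ == "__main__":
    rc = run_with_heartbeat(
        ["dbt", "run", "--project-dir", "./dbt",
         "--profiles-dir", "./dbt", "--select", "tag:SOME_VIEWS"]
    )
    raise SystemExit(rc)
```

I'm not sure whether extra log lines alone are enough to satisfy Airflow's zombie detection, though, hence the question.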