I am using the DbtCloudRunJobOperator in a managed Apache Airflow instance (Google Cloud Composer) to trigger jobs in DBT Cloud. The DBT Cloud jobs themselves run and complete successfully, but the DbtCloudRunJobOperator fails to recognize the successful completion and marks the task as failed. This happens consistently, regardless of the job configuration or DBT Cloud project settings.
- Airflow version: 2.7.3
- Composer version: 2.7.1
- DBT Cloud API version: Latest
- DbtCloudRunJobOperator pip package version: 3.8.0 (the apache-airflow-providers-dbt-cloud provider)
from datetime import timedelta

from airflow.providers.dbt.cloud.operators.dbt import DbtCloudRunJobOperator

trigger_dbt_job = DbtCloudRunJobOperator(
    task_id="trigger_dbt_job",
    job_id=job_id,  # The DBT Cloud job ID, loaded elsewhere in the DAG.
    wait_for_termination=True,  # Wait for the job run to reach a terminal state.
    check_interval=30,  # Seconds between each check for termination.
    execution_timeout=timedelta(minutes=60),  # Maximum time to wait for the job to complete.
    steps_override=steps_override,  # dbt commands to run instead of the job's configured steps.
    retries=config.get("max_retries", 1),  # Use max_retries from the config if present, else 1.
)
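For debugging, the run status Airflow sees can be checked directly through the provider's DbtCloudHook. A minimal sketch; the connection ID "dbt_cloud_default" and the run_id value passed in are assumptions for illustration:

from airflow.providers.dbt.cloud.hooks.dbt import DbtCloudHook, DbtCloudJobRunStatus

def print_run_status(run_id: int, conn_id: str = "dbt_cloud_default") -> None:
    # Fetch the raw run payload from the DBT Cloud API via the provider hook.
    hook = DbtCloudHook(dbt_cloud_conn_id=conn_id)
    run = hook.get_job_run(run_id=run_id).json()["data"]
    # DBT Cloud encodes run statuses as integers (e.g. SUCCESS = 10).
    print("actual status:", run["status"])
    print("expected success value:", DbtCloudJobRunStatus.SUCCESS.value)

Comparing these two values for a run that the operator marked as failed would show whether the API is reporting something other than success.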
I expected the DbtCloudRunJobOperator to recognize and report when a DBT Cloud job completes successfully. Since the operator is configured to wait for the job run's termination, it should update the task status to "success" once the DBT job finishes, so that the Airflow UI shows the task as completed and subsequent tasks in the DAG proceed as planned.
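A possible workaround, sketched here under the assumption that the provider's DbtCloudJobRunSensor polls correctly, would be to trigger the job without waiting and move the status polling into a separate sensor task:

from airflow.providers.dbt.cloud.operators.dbt import DbtCloudRunJobOperator
from airflow.providers.dbt.cloud.sensors.dbt import DbtCloudJobRunSensor

trigger_dbt_job = DbtCloudRunJobOperator(
    task_id="trigger_dbt_job",
    job_id=job_id,
    wait_for_termination=False,  # Return immediately; the sensor below does the waiting.
    steps_override=steps_override,
)

wait_for_dbt_job = DbtCloudJobRunSensor(
    task_id="wait_for_dbt_job",
    run_id=trigger_dbt_job.output,  # The run ID returned (and pushed to XCom) by the operator.
    timeout=60 * 60,  # Fail the sensor after one hour of polling.
)

trigger_dbt_job >> wait_for_dbt_job

This would at least isolate whether the failure comes from the operator's own wait loop or from the status the API reports.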