I have a DAG which loads data into BigQuery table A.
Table A depends on 8 other tables, and the DAGs for those tables are triggered at different times.
I want to create a DAG for table A such that data is loaded into it only after all of the dependent DAGs have been triggered and completed.
How can we do this in Airflow?
Below is a sample I am trying for one table, but it is not working as intended.
I can see it poking the dependent table in the Airflow logs, but it fails after the timeout.
I have also tried triggering the dependent table's DAG first and then executing table A's DAG manually, but I am facing the same issue.
Can anyone suggest what the error could be?
from airflow.sensors.external_task import ExternalTaskSensor

operator = ExternalTaskSensor(
    task_id='wait_for_dag',
    external_dag_id='dependent table',
    external_task_id='Task_dependent',
    allowed_states=['success', 'failed'],
    mode='poke',
    poke_interval=60,
    timeout=180,
)
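One thing I am wondering about (this is my own guess, not something I have confirmed): ExternalTaskSensor matches runs by logical/execution date, and since the dependent DAGs are scheduled at different times than my DAG, the sensor may be poking for a run that never exists and timing out. If that is the cause, I believe I would need to pass execution_delta (or execution_date_fn) with the offset between the two schedules. A small sketch of the offset calculation, using example schedule times I made up:

```python
from datetime import timedelta

# Hypothetical schedules: the dependent DAG runs daily at 02:00 and
# table A's DAG runs daily at 05:00. Since ExternalTaskSensor matches
# on logical date, table A's sensor would need to look back by the
# difference between the two schedule times.
table_a_schedule = timedelta(hours=5)
dependent_schedule = timedelta(hours=2)

execution_delta = table_a_schedule - dependent_schedule
print(execution_delta)  # 3:00:00
```

Is this the right direction, or is something else wrong with my sensor configuration?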