I am learning Airflow and trying to use GCSToBigQueryOperator to load data from GCS into a BigQuery table, but it seems one of the arguments I am passing is not accepted, and I am getting the error below:
airflow.exceptions.AirflowException: Invalid arguments were passed to GCSToBigQueryOperator (task_id: gcs_to_bq_land). Invalid arguments were:
**kwargs: {'DAG': <DAG: Gcs_to_bq>}
Below is my code for the DAG:
from airflow import DAG
from datetime import datetime, timedelta
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

default_args = {
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    start_date=datetime.today(),
    dag_id="Gcs_to_bq",
    default_args=default_args,
) as dag:
    load_from_gcs_to_bq = GCSToBigQueryOperator(
        task_id="gcs_to_bq_land",
        bucket="learn-composer",
        source_objects=["landing_Employee_Ratings.csv"],
        destination_project_dataset_table="uki-composer-sbox2-3350.composer.gcs-bq",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
        DAG=dag,
    )
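From the error it looks like the rejected argument is the DAG=dag kwarg itself: operators take a lowercase dag parameter, and inside a with DAG(...) as dag: block the task is attached to the DAG automatically, so I believe the kwarg can simply be dropped. A minimal sketch of the operator call without it, keeping the same bucket, object, and table names (I have not verified this is the only issue):

    load_from_gcs_to_bq = GCSToBigQueryOperator(
        task_id="gcs_to_bq_land",
        bucket="learn-composer",
        source_objects=["landing_Employee_Ratings.csv"],
        destination_project_dataset_table="uki-composer-sbox2-3350.composer.gcs-bq",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
        # No DAG=dag here: the surrounding `with DAG(...) as dag:` context
        # assigns the task to the DAG; an explicit reference would be dag=dag.
    )

Is that the right fix, or is something else wrong here?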