My goal is to create a DAG with BigQueryOperators that I can trigger in Airflow with a parametrized destination table for my SQL. I have read many topics about how to pass parameters to PythonOperators so they can be set with --conf when triggering a DAG, but I don't know how to apply the same approach to an argument of a BigQueryOperator.
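For context, this is the kind of pattern I mean for PythonOperators (just a minimal sketch; the callable, the task_id and the conf key "my_param" are placeholder names I made up):

from airflow.operators.python_operator import PythonOperator

def my_callable(**context):
    # Whatever was passed with `airflow trigger_dag ... --conf '{"my_param": "..."}'`
    # is exposed on the dag_run object; fall back to a default when it is missing.
    my_param = (context["dag_run"].conf or {}).get("my_param", "default_value")
    print(my_param)

stepPython = PythonOperator(
    task_id="stepPython",
    python_callable=my_callable,
    provide_context=True,  # Airflow 1.x needs this so the context is passed to the callable
    dag=dag,
)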
My dag.py looks like this:
import airflow
from airflow import DAG
from airflow.contrib.operators.bigquery_operator import BigQueryOperator
# ... other imports and default_args elided (the "blabla" part)

with DAG(
    "TestPython",
    schedule_interval=None,
    default_args=default_args,
    max_active_runs=1,
    catchup=False,
) as dag:

    stepOne = BigQueryOperator(
        task_id="stepOne",
        sql="SELECT * FROM `testTable` ",
        destination_dataset_table=" **variableTable** ",  # <- the value I want to parametrize
        write_disposition="WRITE_TRUNCATE",
        use_legacy_sql=False,
    )

    stepOne  # single task, no dependencies
I would like to know whether there is a way to set the destination table name from an airflow trigger_dag command (or some other mechanism), while still having a default value when nothing is passed, so the DAG can still be uploaded to my DAG bucket and run normally.
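For illustration, here is roughly what I am hoping for. If I understand correctly, destination_dataset_table is a templated field of BigQueryOperator, so maybe something like this could work (the conf key dest_table and the table names are placeholders I made up):

stepOne = BigQueryOperator(
    task_id="stepOne",
    sql="SELECT * FROM `testTable` ",
    # use the value passed via --conf when there is one, otherwise a default table
    destination_dataset_table="{{ (dag_run.conf if dag_run and dag_run.conf else {}).get('dest_table', 'my_project.my_dataset.default_table') }}",
    write_disposition="WRITE_TRUNCATE",
    use_legacy_sql=False,
)

and then trigger it with something like:

airflow trigger_dag TestPython --conf '{"dest_table": "my_project.my_dataset.my_table"}'

Would that be the right approach, or is there a better way?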
If something is not clear, I can provide more details and describe the approaches I have already tried.