Save result of operator in Apache Airflow
The way you would use get_data is in the next task, which can be a PythonOperator that processes the data:
from airflow.contrib.operators.bigquery_get_data import BigQueryGetDataOperator
from airflow.operators.python_operator import PythonOperator

get_data = BigQueryGetDataOperator(
    task_id='get_data_from_bq',
    dataset_id='test_dataset',
    table_id='Transaction_partitions',
    max_results='100',
    selected_fields='DATE',
    bigquery_conn_id='airflow-service-account'
)

def process_data_from_bq(**kwargs):
    ti = kwargs['ti']
    # Pull the rows that get_data_from_bq returned (Airflow stored them in XCom).
    bq_data = ti.xcom_pull(task_ids='get_data_from_bq')
    # bq_data now holds your data as a Python list: one entry per row,
    # each entry a list of the selected fields.
    print(bq_data)

process_data = PythonOperator(
    task_id='process_data_from_bq',
    python_callable=process_data_from_bq,
    provide_context=True
)

get_data >> process_data
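For context: an operator's return value is pushed to XCom under the key 'return_value', and that is what xcom_pull with only task_ids retrieves above. Here is a minimal sketch of the same hand-off using two plain PythonOperators; the task ids and sample rows are illustrative, not from the original answer:

from airflow.operators.python_operator import PythonOperator

def produce(**kwargs):
    # Whatever the callable returns is pushed to XCom
    # under the key 'return_value'.
    return [['2019-01-01'], ['2019-01-02']]

def consume(**kwargs):
    # Pulling with only task_ids reads that 'return_value' key.
    rows = kwargs['ti'].xcom_pull(task_ids='produce_data')
    print(rows)

produce_data = PythonOperator(
    task_id='produce_data',
    python_callable=produce,
    provide_context=True
)
consume_data = PythonOperator(
    task_id='consume_data',
    python_callable=consume,
    provide_context=True
)

produce_data >> consume_data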
PS: I am the author of BigQueryGetDataOperator and an Airflow committer / PMC member.