How to trigger an airflow DAG run from within a Python script?

python-3.x


You have a variety of options when it comes to triggering Airflow DAG runs.

Using Python

The airflow Python package provides a local client you can use to trigger a DAG from within a Python script. For example:

    from airflow.api.client.local_client import Client

    c = Client(None, None)
    c.trigger_dag(dag_id='test_dag_id', run_id='test_run_id', conf={})

Using the Airflow CLI

You can trigger DAGs in Airflow manually using the Airflow CLI. More info on how to use the CLI to trigger DAGs can be found here.

Using the Airflow REST API

You can also use the Airflow REST API to trigger DAG runs. More info on that here.
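
For example, with the stable REST API in Airflow 2.x, a minimal sketch using requests could look like this (it assumes basic auth is enabled on the webserver; the host, port, credentials, and DAG id are placeholders):

    import requests

    # Minimal sketch, assuming Airflow 2.x with the stable REST API and
    # basic auth enabled; host, credentials, and dag_id are placeholders.
    response = requests.post(
        "http://localhost:8080/api/v1/dags/test_dag_id/dagRuns",
        auth=("admin", "admin"),   # hypothetical credentials
        json={"conf": {}},         # optional run configuration
    )
    response.raise_for_status()
    print(response.json())         # metadata of the newly created DAG run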


The first option, from within Python, might work best for you (it's also how I've personally done it in the past). But you could theoretically use a subprocess to interact with the CLI from Python, or a library like requests to interact with the REST API from within Python.
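
For instance, a minimal subprocess sketch (assuming Airflow 2.x, where the command is airflow dags trigger; on 1.10.x it would be airflow trigger_dag):

    import subprocess

    # Minimal sketch: shell out to the Airflow CLI from Python.
    # Assumes Airflow 2.x syntax (`airflow dags trigger`); on 1.10.x
    # the equivalent command is `airflow trigger_dag test_dag_id`.
    result = subprocess.run(
        ["airflow", "dags", "trigger", "test_dag_id"],
        capture_output=True,
        text=True,
        check=True,
    )
    print(result.stdout)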


On AWS MWAA (Airflow 1.10.12) I used a solution based on the boto3 library for Python and a REST POST request:

    import boto3
    import requests

    def TriggerAirflowDAG(mwaa_environment, dag_id):
        # Get a short-lived CLI token and the webserver hostname for the environment
        client = boto3.client("mwaa")
        token = client.create_cli_token(Name=mwaa_environment)
        url = "https://{0}/aws_mwaa/cli".format(token["WebServerHostname"])
        # The MWAA CLI endpoint accepts the Airflow CLI command as plain text
        body = f"trigger_dag {dag_id}"
        headers = {
            "Authorization": "Bearer " + token["CliToken"],
            "Content-Type": "text/plain"
        }
        return requests.post(url, data=body, headers=headers)

The user or role that initiates the DAG run must have the AmazonMWAAAirflowCliAccess policy.
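
For example, a hypothetical call (the environment name and DAG id are placeholders) that checks whether the request was accepted:

    # Hypothetical usage; "my-mwaa-env" and "test_dag_id" are placeholders.
    response = TriggerAirflowDAG("my-mwaa-env", "test_dag_id")
    if response.status_code == 200:
        print("DAG run requested")   # response body contains the CLI output
    else:
        print(f"Request failed: {response.status_code} {response.text}")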