How do I stop my dependent tasks if an earlier task fails?


Hi, I want to set up an Airflow workflow schedule. I have three tasks: task_1 > task_2 > task_3.

If the first task, task_1, fails, I need to stop the remaining tasks from being executed.
How can this be handled?
How can I know whether an earlier task failed or succeeded? How do I get a task's status?
apache-spark-sql airflow airflow-scheduler airflow-operator
1 Answer

See the example code below:

from airflow import DAG
from airflow.operators.python_operator import PythonOperator
from airflow.utils.trigger_rule import TriggerRule
import datetime as dt

args = {
    'owner': 'airflow',
    'start_date': dt.datetime(2020, 6, 2)
}

dag = DAG(
    'testing_trigger_rule',
    schedule_interval="@daily",
    default_args=args
)

def task1():
    print('Running task1')
def task2():
    print('Running task2')
def task3():
    print('Running task3')

# ALL_SUCCESS is the default trigger rule: a task runs only when every
# upstream task succeeded, so task2 and task3 are skipped if task1 fails.
Task1 = PythonOperator(
        task_id='task1',
        python_callable=task1,
        trigger_rule=TriggerRule.ALL_SUCCESS,
        dag=dag
    )
Task2 = PythonOperator(
        task_id='task2',
        python_callable=task2,
        trigger_rule=TriggerRule.ALL_SUCCESS,
        dag=dag
    )

Task3 = PythonOperator(
        task_id='task3',
        python_callable=task3,
        trigger_rule=TriggerRule.ALL_SUCCESS,
        dag=dag
    )
Task1 >> Task2 >> Task3

For more information about trigger rules, check this link.
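For the last part of the question (getting an earlier task's status), one option is to read the task instance's state from the runtime context inside a Python callable. `dag_run.get_task_instance(...)` is a real Airflow API; the callable name and task id below are illustrative, and this is a minimal sketch rather than a complete DAG:

```python
# Sketch: inspect an upstream task's state from inside a PythonOperator
# callable. Airflow passes the runtime context as keyword arguments
# (via provide_context=True in Airflow 1.x, or automatically in 2.x).
def report_upstream_state(**context):
    ti = context["dag_run"].get_task_instance("task1")
    # state is e.g. "success", "failed", "skipped", or None if not run yet
    state = ti.state if ti else None
    print(f"task1 finished with state: {state}")
    return state
```

You could wire this into a downstream `PythonOperator` (for example with `trigger_rule=TriggerRule.ALL_DONE` so it runs regardless of upstream failures) to branch or alert based on the earlier task's outcome.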
