How to trigger a single task in a DAG externally (Cloud Composer)

I want a data pipeline that basically looks like this:

Several tasks are each triggered by a corresponding Pub/Sub message and process the data referenced in that message, and the final task only runs once all of those workflows have finished. I managed to trigger the whole DAG with Pub/Sub (following this guide, adapted for Pub/Sub), but that starts the entire DAG rather than a single task. Is there a way to externally trigger just one task inside a DAG (from a Cloud Function / Pub/Sub)?
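
For context, the whole-DAG trigger mentioned above looks roughly like the sketch below. This is only a sketch: it assumes a Composer 2 environment and the Airflow 2 stable REST API, the web server URL is a placeholder, and the 'message' conf key is simply what the DAG code further down reads from dag_run.conf.

# Rough sketch of a Pub/Sub-triggered Cloud Function that starts the whole DAG.
# Assumes Composer 2 and the Airflow 2 stable REST API; URL and DAG id are placeholders.
import base64

import google.auth
from google.auth.transport.requests import AuthorizedSession

WEB_SERVER_URL = "https://<your-environment>.composer.googleusercontent.com"  # assumed
DAG_ID = "clean_am_workflow"


def trigger_dag(event, context):
    # Pub/Sub payload carries the table name (base64-encoded in background functions).
    table_name = base64.b64decode(event["data"]).decode("utf-8")

    credentials, _ = google.auth.default(
        scopes=["https://www.googleapis.com/auth/cloud-platform"])
    session = AuthorizedSession(credentials)

    # Airflow 2 stable REST API: create a DAG run with a custom conf payload.
    response = session.post(
        f"{WEB_SERVER_URL}/api/v1/dags/{DAG_ID}/dagRuns",
        json={"conf": {"message": table_name}},
    )
    response.raise_for_status()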

Edit

Here is a simplified version of the DAG code:

import datetime

import google.cloud.bigquery as bigquery

import airflow
from airflow import DAG
from airflow.contrib.operators.bigquery_operator import BigQueryOperator
from airflow.operators import python_operator
from airflow.operators import dummy_operator


def task1_1(**kwargs):
    # I want this function to take the table name of source 1 from pubsub1, read that table from BigQuery and process it
    client_bq = bigquery.Client()
    table_name = kwargs['dag_run'].conf.get('message')
    data = client_bq.query(f"SELECT * FROM {table_name}").result().to_dataframe()
    # ETL Code
    # ..... 


def task2_1(**kwargs):
    # I want this function to take the table name of source 2 from pubsub2, read that table from BigQuery and process it
    client_bq = bigquery.Client()
    table_name = kwargs['dag_run'].conf.get('message')
    data = client_bq.query(f"SELECT * FROM {table_name}").result().to_dataframe()
    # ETL Code
    # ..... 

def task_combine():
    # This task is triggered when task1_1 and task2_1 are done
    # More ETL code
    pass


with DAG(
        'clean_am_workflow',
        schedule_interval=None,
        start_date=datetime.datetime.today() - datetime.timedelta(days=5),
        catchup=False) as dag:

    source_1 = python_operator.PythonOperator(
        task_id='process_source_1',
        python_callable=task1_1,
        provide_context=True
        )

    source_2 = python_operator.PythonOperator(
        task_id='process_source_2',
        python_callable=task2_1,
        provide_context=True
        )

    combine = python_operator.PythonOperator(
        task_id='combine_sources',
        python_callable=task_combine,
        provide_context=True
        )

    [source_1, source_2] >> combine

What you need is not to trigger the DAG itself, but to trigger the different tasks separately based on BigQuery. This can be achieved with Airflow sensors: https://airflow.apache.org/docs/apache-airflow/stable/_api/airflow/sensors/index.html and, in particular, the SQL sensor: https://airflow.apache.org/docs/apache-airflow/stable/_modules/airflow/sensors/sql.html

In that setup the DAG is triggered by a normal cron schedule. The two sensor tasks query BigQuery periodically, and once the query returns a 'good to go' result the corresponding task starts. Because the two sensors are independent, the final task only executes after both sensors and their downstream tasks have finished.
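
To make that concrete, here is a rough sketch of how the DAG above could be restructured with sensors. The Airflow 2 import paths, the bigquery_check_conn connection id and the my_dataset.load_status control table are assumptions rather than anything from the original post, and the task1_1 / task2_1 / task_combine callables are the ones already defined in the question. SqlSensor needs an Airflow connection whose hook can run the query; a PythonSensor that calls the BigQuery client directly would work just as well.

import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator   # Airflow 2.x import path
from airflow.sensors.sql import SqlSensor             # Airflow 2.x import path


def make_sensor(source):
    # Waits until the (assumed) control table my_dataset.load_status says
    # today's load for this source has landed in BigQuery.
    return SqlSensor(
        task_id=f'wait_for_{source}',
        conn_id='bigquery_check_conn',   # assumed Airflow connection to BigQuery
        sql=f"""
            SELECT COUNT(*)
            FROM `my_dataset.load_status`
            WHERE source = '{source}' AND load_date = CURRENT_DATE()
        """,
        poke_interval=300,        # check every 5 minutes
        timeout=6 * 60 * 60,      # give up after 6 hours
        mode='reschedule',        # free the worker slot between pokes
    )


with DAG(
        'clean_am_workflow',
        schedule_interval='0 * * * *',   # plain cron trigger instead of Pub/Sub
        start_date=datetime.datetime(2021, 1, 1),
        catchup=False) as dag:

    source_1 = PythonOperator(task_id='process_source_1', python_callable=task1_1)
    source_2 = PythonOperator(task_id='process_source_2', python_callable=task2_1)
    combine = PythonOperator(task_id='combine_sources', python_callable=task_combine)

    # Each branch waits for its own source before processing it;
    # combine_sources only runs after both branches have succeeded.
    make_sensor('source_1') >> source_1
    make_sensor('source_2') >> source_2
    [source_1, source_2] >> combine

One thing to keep in mind: with a cron-triggered DAG there is no Pub/Sub payload in dag_run.conf any more, so task1_1 and task2_1 would have to get their table names some other way, for example from the same control table the sensors check.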