Google Cloud DataFlow job not available yet.. in Airflow

I am running a Dataflow job from Airflow. I should mention that I am new to Airflow. The Dataflow job (launched from Airflow) runs successfully, but Airflow seems to have a problem fetching the job status, and I get this message repeated endlessly:

Google Cloud DataFlow job not available yet..

Here is the log from after all the steps were submitted to Dataflow (I substituted {projectID} and {jobID} for the original values):

[2018-10-01 13:00:13,987] {logging_mixin.py:95} INFO - [2018-10-01 13:00:13,987] {gcp_dataflow_hook.py:128} WARNING - b'INFO: Staging pipeline description to gs://my-project/staging'

[2018-10-01 13:00:13,987] {logging_mixin.py:95} INFO - [2018-10-01 13:00:13,987] {gcp_dataflow_hook.py:128} WARNING - b'Oct 01, 2018 1:00:13 PM org.apache.beam.runners.dataflow.DataflowRunner run'

[2018-10-01 13:00:13,988] {logging_mixin.py:95} INFO - [2018-10-01 13:00:13,988] {gcp_dataflow_hook.py:128} WARNING - b'INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-10-01_06_00_12-{jobID}?project={projectID}'

[2018-10-01 13:00:13,988] {logging_mixin.py:95} INFO - [2018-10-01 13:00:13,988] {gcp_dataflow_hook.py:128} WARNING - b'Oct 01, 2018 1:00:13 PM org.apache.beam.runners.dataflow.DataflowRunner run'

[2018-10-01 13:00:13,988] {logging_mixin.py:95} INFO - [2018-10-01 13:00:13,988] {gcp_dataflow_hook.py:128} WARNING - b"INFO: To cancel the job using the 'gcloud' tool, run:"

[2018-10-01 13:00:13,988] {logging_mixin.py:95} INFO - [2018-10-01 13:00:13,988] {gcp_dataflow_hook.py:128} WARNING - b'> gcloud dataflow jobs --project={projectID} cancel --region=us-central1 2018-10-01_06_00_12-{jobID}'

[2018-10-01 13:00:13,990] {logging_mixin.py:95} INFO - [2018-10-01 13:00:13,990] {discovery.py:267} INFO - URL being requested: GET https://www.googleapis.com/discovery/v1/apis/dataflow/v1b3/rest

[2018-10-01 13:00:14,417] {logging_mixin.py:95} INFO - [2018-10-01 13:00:14,417] {discovery.py:866} INFO - URL being requested: GET https://dataflow.googleapis.com/v1b3/projects/{projectID}/locations/us-central1/jobs?alt=json

[2018-10-01 13:00:14,593] {logging_mixin.py:95} INFO - [2018-10-01 13:00:14,593] {gcp_dataflow_hook.py:77} INFO - Google Cloud DataFlow job not available yet..

[2018-10-01 13:00:29,614] {logging_mixin.py:95} INFO - [2018-10-01 13:00:29,614] {discovery.py:866} INFO - URL being requested: GET https://dataflow.googleapis.com/v1b3/projects/{projectID}/locations/us-central1/jobs?alt=json

[2018-10-01 13:00:29,772] {logging_mixin.py:95} INFO - [2018-10-01 13:00:29,772] {gcp_dataflow_hook.py:77} INFO - Google Cloud DataFlow job not available yet..

[2018-10-01 13:00:44,790] {logging_mixin.py:95} INFO - [2018-10-01 13:00:44,790] {discovery.py:866} INFO - URL being requested: GET https://dataflow.googleapis.com/v1b3/projects/{projectID}/locations/us-central1/jobs?alt=json

[2018-10-01 13:00:44,937] {logging_mixin.py:95} INFO - [2018-10-01 13:00:44,937] {gcp_dataflow_hook.py:77} INFO - Google Cloud DataFlow job not available yet..

Do you know what is causing this? I could not find any solution related to this issue. Should I provide more information?

Here is my task in the DAG:

# dataflow task
dataflow_t = DataFlowJavaOperator(
    task_id='mydataflow',
    jar='/lib/dataflow_test.jar',
    gcp_conn_id='my_gcp_conn',
    delegate_to='{service_account}@{projectID}.iam.gserviceaccount.com',
    dag=dag)

and the options for connecting to Dataflow, in the DAG's default_args:

'dataflow_default_options': {
    'project': '{projectID}',
    'stagingLocation': 'gs://my-project/staging'
}

I faced the same issue. I was setting the job name in DataflowPipelineOptions, while Airflow also creates a job name from the task ID you provide.

So there is a conflict, and Airflow is not able to find the actual job name that you created via DataflowPipelineOptions.

Simply remove the job name from DataflowPipelineOptions and it will work.
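The effect of the name conflict can be sketched as follows. This is a simplified illustration with made-up job names, not the actual hook code — the real matching logic in gcp_dataflow_hook differs in detail — but it shows why a name set inside the jar makes Airflow's polling loop never find the job:

```python
# Hypothetical sketch: Airflow derives the Dataflow job name it expects
# from the operator's task_id, then repeatedly scans the project's job
# list for that name. If the jar overrides the name via
# DataflowPipelineOptions, the scan never matches and the
# "Google Cloud DataFlow job not available yet.." message repeats forever.

def find_job(jobs, expected_name):
    """Return the first job whose name matches what Airflow expects, else None."""
    for job in jobs:
        if job['name'] == expected_name:
            return job
    return None  # hook keeps polling and logging "not available yet.."

# Name overridden inside the pipeline jar -> Airflow never finds the job:
jobs = [{'name': 'my-custom-name', 'state': 'JOB_STATE_RUNNING'}]
assert find_job(jobs, 'mydataflow') is None

# With the override removed, the name Airflow generated matches:
jobs = [{'name': 'mydataflow', 'state': 'JOB_STATE_RUNNING'}]
assert find_job(jobs, 'mydataflow') is not None
```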