Airflow doesn't recognise my S3 Connection setting
I'm using Airflow with the Kubernetes executor and testing locally (with minikube). While I can get it up and running, I can't seem to get my logs stored in S3. I have tried all the solutions that have been described, but I still get the following error:
*** Log file does not exist: /usr/local/airflow/logs/example_python_operator/print_the_context/2020-03-30T16:02:41.521194+00:00/1.log
*** Fetching from: http://examplepythonoperatorprintthecontext-5b01d602e9d2482193d933e7d2:8793/log/example_python_operator/print_the_context/2020-03-30T16:02:41.521194+00:00/1.log
*** Failed to fetch log file from worker. HTTPConnectionPool(host='examplepythonoperatorprintthecontext-5b01d602e9d2482193d933e7d2', port=8793): Max retries exceeded with url: /log/example_python_operator/print_the_context/2020-03-30T16:02:41.521194+00:00/1.log (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fd00688a650>: Failed to establish a new connection: [Errno -2] Name or service not known'))
I implemented a custom logging class as described in this article, but still had no success (a rough sketch of it follows the list below).
- I'm using the Puckel Airflow 1.10.9 image
- the stable Helm chart for Airflow from charts/stable/airflow/
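The log_config.py from that approach looks roughly like this (a sketch only, assuming the standard 1.10.x pattern of copying DEFAULT_LOGGING_CONFIG and swapping in S3TaskHandler; the S3 path is a placeholder for my real bucket):

# log_config.py -- sketch of the custom logging config (placeholder bucket path)
from copy import deepcopy

from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG

# Placeholder; in my setup this matches AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER
S3_LOG_FOLDER = 's3://<BUCKET_NAME>/logs'

LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)

# Reuse the local file handler's settings, but hand completed task logs to S3
LOGGING_CONFIG['handlers']['s3.task'] = {
    'class': 'airflow.utils.log.s3_task_handler.S3TaskHandler',
    'formatter': 'airflow',
    'base_log_folder': LOGGING_CONFIG['handlers']['task']['base_log_folder'],
    's3_log_folder': S3_LOG_FOLDER,
    'filename_template': LOGGING_CONFIG['handlers']['task']['filename_template'],
}

# Route task logs through the new handler (matches task_log_reader = s3.task)
LOGGING_CONFIG['loggers']['airflow.task']['handlers'] = ['s3.task']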
My airflow.yaml looks like this:
airflow:
  image:
    repository: airflow-docker-local
    tag: 1
  executor: Kubernetes
  service:
    type: LoadBalancer
  config:
    AIRFLOW__CORE__EXECUTOR: KubernetesExecutor
    AIRFLOW__CORE__TASK_LOG_READER: s3.task
    AIRFLOW__CORE__LOAD_EXAMPLES: True
    AIRFLOW__CORE__FERNET_KEY: ${MASKED_FERNET_KEY}
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://postgres:airflow@airflow-postgresql:5432/airflow
    AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://postgres:airflow@airflow-postgresql:5432/airflow
    AIRFLOW__CELERY__BROKER_URL: redis://:airflow@airflow-redis-master:6379/0
    # S3 Logging
    AIRFLOW__CORE__REMOTE_LOGGING: True
    AIRFLOW__CORE__REMOTE_LOG_CONN_ID: s3://${AWS_ACCESS_KEY_ID}:${AWS_ACCESS_SECRET_KEY}@S3
    AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER: s3://${BUCKET_NAME}/logs
    AIRFLOW__CORE__S3_LOG_FOLDER: s3://${BUCKET_NAME}/logs
    AIRFLOW__CORE__LOGGING_LEVEL: INFO
    AIRFLOW__CORE__LOGGING_CONFIG_CLASS: log_config.LOGGING_CONFIG
    AIRFLOW__CORE__ENCRYPT_S3_LOGS: False
    # End of S3 Logging
    AIRFLOW__WEBSERVER__EXPOSE_CONFIG: True
    AIRFLOW__WEBSERVER__LOG_FETCH_TIMEOUT_SEC: 30
    AIRFLOW__KUBERNETES__WORKER_CONTAINER_REPOSITORY: airflow-docker-local
    AIRFLOW__KUBERNETES__WORKER_CONTAINER_TAG: 1
    AIRFLOW__KUBERNETES__WORKER_CONTAINER_IMAGE_PULL_POLICY: Never
    AIRFLOW__KUBERNETES__WORKER_SERVICE_ACCOUNT_NAME: airflow
    AIRFLOW__KUBERNETES__DAGS_VOLUME_CLAIM: airflow
    AIRFLOW__KUBERNETES__NAMESPACE: airflow
    AIRFLOW__KUBERNETES__DELETE_WORKER_PODS: True
    AIRFLOW__KUBERNETES__KUBE_CLIENT_REQUEST_ARGS: '{\"_request_timeout\":[60,60]}'

persistence:
  enabled: true
  existingClaim: ''
  accessMode: 'ReadWriteMany'
  size: 5Gi

logsPersistence:
  enabled: false

workers:
  enabled: true

postgresql:
  enabled: true

redis:
  enabled: true
I have tried setting up the connection through the UI as well as creating it via airflow.yaml, but nothing seems to work. I've been at this for 3 days with no luck; any help is greatly appreciated.
Screenshots attached for reference.
I'm fairly sure the problem is that the S3 logging configuration is never set on the worker pods. The worker pods do not receive the configuration you pass via environment variables such as AIRFLOW__CORE__REMOTE_LOGGING: True. If you want a variable to be set in the worker pods as well, you have to duplicate it and prefix the copy's name with AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__, e.g. AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__CORE__REMOTE_LOGGING: True.
In your case, you need to duplicate every variable that configures S3 logging and prefix the copies with AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__, as sketched below.
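In your airflow.yaml config block that would look roughly like the following (a sketch showing only the S3 logging keys, mirroring the values you already have; the prefixed duplicates end up in the [kubernetes_environment_variables] section and are injected into every worker pod by the KubernetesExecutor):

config:
  # ...existing settings...
  # S3 logging for the scheduler/webserver
  AIRFLOW__CORE__REMOTE_LOGGING: True
  AIRFLOW__CORE__REMOTE_LOG_CONN_ID: s3://${AWS_ACCESS_KEY_ID}:${AWS_ACCESS_SECRET_KEY}@S3
  AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER: s3://${BUCKET_NAME}/logs
  AIRFLOW__CORE__ENCRYPT_S3_LOGS: False
  AIRFLOW__CORE__LOGGING_CONFIG_CLASS: log_config.LOGGING_CONFIG
  AIRFLOW__CORE__TASK_LOG_READER: s3.task
  # Duplicates that get injected into every worker pod
  AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__CORE__REMOTE_LOGGING: True
  AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__CORE__REMOTE_LOG_CONN_ID: s3://${AWS_ACCESS_KEY_ID}:${AWS_ACCESS_SECRET_KEY}@S3
  AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER: s3://${BUCKET_NAME}/logs
  AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__CORE__ENCRYPT_S3_LOGS: False
  AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__CORE__LOGGING_CONFIG_CLASS: log_config.LOGGING_CONFIG
  AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__CORE__TASK_LOG_READER: s3.task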