Airflow: error: unrecognized arguments: --yes
I'd like to re-run (or backfill) DAGs from Composer. Below is the command I used, but I ran into an exception like this:
kubeconfig entry generated for europe-west1-leo-stage-bi-db7ea92f-gke.
Executing within the following Kubernetes cluster namespace: composer-1-7-7-airflow-1-10-1-db7ea92f
command terminated with exit code 2
[2020-07-14 12:44:34,472] {settings.py:176} INFO - setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
[2020-07-14 12:44:35,624] {default_celery.py:80} WARNING - You have configured a result_backend of redis://airflow-redis-service.default.svc.cluster.local:6379/0, it is highly recommended to use an alternative result_backend (i.e. a database).
[2020-07-14 12:44:35,628] {__init__.py:51} INFO - Using executor CeleryExecutor
[2020-07-14 12:44:35,860] {app.py:51} WARNING - Using default Composer Environment Variables. Overrides have not been applied.
[2020-07-14 12:44:35,867] {configuration.py:516} INFO - Reading the config from /etc/airflow/airflow.cfg
[2020-07-14 12:44:35,895] {configuration.py:516} INFO - Reading the config from /etc/airflow/airflow.cfg
usage: airflow [-h]
{backfill,list_tasks,clear,pause,unpause,trigger_dag,delete_dag,pool,variables,kerberos,render,run,initdb,list_dags,dag_state,task_failed_deps,task_state,serve_logs,test,webserver,resetdb,upgradedb,scheduler,worker,flower,version,connections,create_user}
...
airflow: error: unrecognized arguments: --yes
ERROR: (gcloud.composer.environments.run) kubectl returned non-zero status code.
This is my command; the second line below shows which parameter is which. Can anyone help? Thanks.
gcloud composer environments run leo-stage-bi --location=europe-west1 backfill -- regulatory_spain_monthly -s 20190701 -e 20190702 -t "regulatory_spain_rud_monthly_materialization" --reset_dagruns
gcloud composer environments run <environment-name> --location=europe-west1 backfill -- <DAG name> -s <start date> -e <end date> -t <task in the DAG> --reset_dagruns
To trigger a manual run, you can use the trigger_dag argument:
gcloud composer environments run <COMPOSER_INSTANCE_NAME> --location <LOCATION> trigger_dag -- <DAG_NAME>
I've checked the Airflow backfill subcommand functionality in the gcloud utility from the Google Cloud SDK 300.0.0 toolset, and my test attempt to run a backfill operation failed with the same error:

airflow: error: unrecognized arguments: --yes
Digging deeper into the issue and launching the gcloud composer environments run command with --verbosity=debug, I found the cause of the problem:
gcloud composer environments run <ENVIRONMENT> --location=<LOCATION> --verbosity=debug backfill -- <DAG> -s <start_date> -e <end_date> -t "task_id" --reset_dagruns
DEBUG: Executing command: ['/usr/bin/kubectl', '--namespace',
'', 'exec', 'airflow-worker-*', '--stdin', '--tty',
'--container', 'airflow-worker', '--', 'airflow', 'backfill', '',
'-s', '<start_date>', '-e', '<end_date>', '-t', 'task_id',
'--reset_dagruns', '--yes']
The output above shows how gcloud parcels out the command-line arguments before handing them to the underlying kubectl command. Notably, a --yes argument is appended for some unknown reason and, worse, the remaining arguments end up mispositioned: the DAG name positional after backfill arrives empty.
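To make the mispositioning concrete, here is a small bash sketch (with placeholder values) reproducing the Airflow part of the argv from the DEBUG output above; the empty element and the trailing flag are exactly what Airflow 1.10.1's argparse then rejects:

```shell
# The tail of the argv that gcloud hands to kubectl, copied from the DEBUG
# output above (placeholder values kept as-is). Two problems are visible:
# the DAG positional after "backfill" is empty, and a trailing "--yes" is
# appended even though the backfill subcommand has no such flag.
argv=('airflow' 'backfill' ''
      '-s' '<start_date>' '-e' '<end_date>'
      '-t' 'task_id' '--reset_dagruns' '--yes')

echo "DAG positional: '${argv[2]}'"            # empty - the DAG name was dropped
echo "trailing flag:  ${argv[${#argv[@]}-1]}"  # --yes - rejected by argparse
```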
Looking for a workaround, I composed the equivalent kubectl invocation against the particular Airflow worker Pod, dispatching the Airflow command-line arguments manually:
kubectl exec -it $(kubectl get po -l run=airflow-worker -o jsonpath='{.items[0].metadata.name}' \
  -n $(kubectl get ns | grep composer | awk '{print $1}')) \
  -n $(kubectl get ns | grep composer | awk '{print $1}') \
  -c airflow-worker -- airflow backfill <DAG> -s <start_date> -e <end_date> -t "task_id" --reset_dagruns
So far, the airflow backfill command completes successfully without throwing any errors.
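As a side note, the namespace lookup in the workaround is just a grep | awk pipeline over kubectl get ns. A minimal illustration with mocked kubectl output (so it can be shown without a cluster; the composer-* namespace name is taken from the logs above):

```shell
# Mocked `kubectl get ns` output; the composer-* namespace is the one
# reported in the Composer logs earlier in this post.
ns_listing='NAME                                     STATUS   AGE
default                                  Active   10d
composer-1-7-7-airflow-1-10-1-db7ea92f   Active   10d
kube-system                              Active   10d'

# Same extraction as in the workaround command: keep the composer-* line
# and print only the first column (the namespace name).
ns=$(printf '%s\n' "$ns_listing" | grep composer | awk '{print $1}')
echo "$ns"
```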