Unable to create SparkApplications on a Kubernetes cluster using SparkKubernetesOperator from an Airflow DAG (Airflow 2.0.2 on MWAA)
I am trying to use SparkKubernetesOperator to submit a Spark job to Kubernetes, using the same DAG and YAML file as in this question:
Unable to create SparkApplications on Kubernetes cluster using SparkKubernetesOperator from Airflow DAG
But Airflow shows the following error:
HTTP response headers: HTTPHeaderDict({'Audit-Id': 'e2e1833d-a1a6-40d4-9d05-104a32897deb', 'Cache-Control': 'no-cache, private', 'Content-Type': 'application/json', 'Date': 'Fri, 10 Sep 2021 08:38:33 GMT', 'Content-Length': '462'})
HTTP response body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"the object provided is unrecognized (must be of type SparkApplication): couldn't get version/kind; json parse error: json: cannot unmarshal string into Go value of type struct { APIVersion string \"json:\\"apiVersion,omitempty\\"\"; Kind string \"json:\\"kind,omitempty\\"\" } (222f7573722f6c6f63616c2f616972666c6f772f646167732f636f6e6669 ...)","reason":"BadRequest","code":400}
Any suggestions on how to fix this?
I think you ran into the same problem I did.
from airflow.providers.cncf.kubernetes.operators.spark_kubernetes import SparkKubernetesOperator

SparkKubernetesOperator(
    task_id='spark_pi_submit',
    namespace="default",
    # read the YAML content and pass it instead of the file path (known bug when passing a path)
    application_file=open("/opt/airflow/dags/repo/script/spark-test.yaml").read(),
    kubernetes_conn_id="kubeConnTest",  # connection to the "default" namespace in the Airflow connections UI
    do_xcom_push=True,
    dag=dag,
)
That is how I wrapped it, and it works fine.
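For context, the hex bytes in the error body decode to "/usr/local/airflow/dags/confi..., which indicates the Kubernetes API received the file path as a plain string rather than the parsed SparkApplication manifest; application_file is a templated field, and when the path is not resolved through templating the raw string appears to be what gets posted. Reading the YAML yourself and passing its content avoids that. Below is a minimal sketch of wiring the operator into a DAG this way; the dag_id, start_date, and schedule are placeholders, while the file path and connection id are taken from the snippet above.

from datetime import datetime

from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.spark_kubernetes import SparkKubernetesOperator

with DAG(
    dag_id="spark_pi",                # placeholder DAG id
    start_date=datetime(2021, 9, 1),  # placeholder start date
    schedule_interval=None,
    catchup=False,
) as dag:
    # Read the SparkApplication manifest ourselves and hand the YAML *content*
    # to the operator, so the API server receives the full object instead of
    # the path string that caused the BadRequest above.
    with open("/opt/airflow/dags/repo/script/spark-test.yaml") as f:
        manifest = f.read()

    spark_pi_submit = SparkKubernetesOperator(
        task_id="spark_pi_submit",
        namespace="default",
        application_file=manifest,
        kubernetes_conn_id="kubeConnTest",
        do_xcom_push=True,
    )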