(gcloud.dataproc.batches.submit.spark) unrecognized arguments: --subnetwork=

I am trying to submit a Google Dataproc batch job. According to the Batch Job documentation, we can pass the subnetwork as a parameter. But when I use it, it gives me:

ERROR: (gcloud.dataproc.batches.submit.spark) unrecognized arguments: --subnetwork=

Here is the gcloud command I used:

gcloud dataproc batches submit spark \
    --region=us-east4 \
    --jars=file:///usr/lib/spark/examples/jars/spark-examples.jar \
    --class=org.apache.spark.examples.SparkPi \
    --subnetwork="https://www.googleapis.com/compute/v1/projects/myproject/regions/us-east4/subnetworks/network-svc" \
    -- 1000

According to the Dataproc batches docs, the flag is named --subnet, not --subnetwork:

--subnet: Specifies the subnetwork URI.

Try:

gcloud dataproc batches submit spark \
    --region=us-east4 \
    --jars=file:///usr/lib/spark/examples/jars/spark-examples.jar \
    --class=org.apache.spark.examples.SparkPi \
    --subnet="https://www.googleapis.com/compute/v1/projects/myproject/regions/us-east4/subnetworks/network-svc" \
    -- 1000
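As a side note, gcloud resource flags generally also accept a short name instead of the full resource URI when the resource is in the same project and region. A sketch under that assumption, reusing the subnet name from your command (verify the accepted forms with `gcloud dataproc batches submit spark --help` for your gcloud version):

```shell
# Sketch: assumes the subnet "network-svc" (from the original command)
# is in the same project and in the batch's region (us-east4), so the
# bare subnet name can be passed instead of the full URI.
gcloud dataproc batches submit spark \
    --region=us-east4 \
    --jars=file:///usr/lib/spark/examples/jars/spark-examples.jar \
    --class=org.apache.spark.examples.SparkPi \
    --subnet=network-svc \
    -- 1000
```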