Spark Submit command is returning a missing application resource

First, I created a jar file using How to build jars from IntelliJ properly?.

The path to my jar file is

out/artifacts/sparkProgram_jar/sparkProgram.jar

In general, my Spark program reads a table from MongoDB, transforms it using Spark's MLlib, and then writes it to MySQL. Here is my build.sbt file.

name := "sparkProgram"

version := "0.1"

scalaVersion := "2.12.4"
val sparkVersion = "3.0.0"
val postgresVersion = "42.2.2"

resolvers ++= Seq(
  "bintray-spark-packages" at "https://dl.bintray.com/spark-packages/maven",
  "Typesafe Simple Repository" at "https://repo.typesafe.com/typesafe/simple/maven-releases",
  "MavenRepository" at "https://mvnrepository.com"
)

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-mllib" % sparkVersion,
  // logging
  "org.apache.logging.log4j" % "log4j-api" % "2.4.1",
  "org.apache.logging.log4j" % "log4j-core" % "2.4.1",
  "org.mongodb.spark" %% "mongo-spark-connector" % "2.4.1",

  //"mysql" % "mysql-connector-java" % "5.1.12",
  "mysql" % "mysql-connector-java" % "8.0.18"
)

My main class is in the package com.testing, in a Scala object named mainObject.
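
For context, the overall shape of the program is roughly the sketch below. This is an illustration, not the original source: the Mongo URI, column names, MySQL connection details, and the StringIndexer step are placeholders standing in for the real pipeline.

package com.testing

import org.apache.spark.ml.feature.StringIndexer
import org.apache.spark.sql.SparkSession

object mainObject {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sparkProgram")
      .master("local")
      // Placeholder URI; points the connector at the source collection.
      .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/testdb.myCollection")
      .getOrCreate()

    // Read the MongoDB collection as a DataFrame via mongo-spark-connector.
    val df = spark.read
      .format("com.mongodb.spark.sql.DefaultSource")
      .load()

    // Placeholder MLlib step: index a string column into a numeric one.
    val transformed = new StringIndexer()
      .setInputCol("someColumn")
      .setOutputCol("someColumnIndex")
      .fit(df)
      .transform(df)

    // Write the result to MySQL over JDBC (driver from mysql-connector-java).
    transformed.write
      .format("jdbc")
      .option("url", "jdbc:mysql://localhost:3306/testdb")
      .option("driver", "com.mysql.cj.jdbc.Driver")
      .option("dbtable", "myTable")
      .option("user", "user")
      .option("password", "password")
      .mode("append")
      .save()

    spark.stop()
  }
}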

When I run the spark-submit command below,

spark-submit --master local --class com.testing.mainObject
--packages mysql:mysql-connector-java:8.0.18,org.mongodb.spark:mongo-spark-connector_2.12:2.4.1 out/artifacts/sparkProgram_jar/sparkProgram.jar

I get this error:

Error: Missing application resource.

Usage: spark-submit [options] <app jar | python file | R file> [app arguments]
Usage: spark-submit --kill [submission ID] --master [spark://...]
Usage: spark-submit --status [submission ID] --master [spark://...]
Usage: spark-submit run-example [options] example-class [example args]

Options:


... zsh: command not found: --packages

Then, when I try to run my spark-submit without --packages (just to check what would happen), I get this error.

Command:

spark-submit --master local --class com.testing.mainObject out/artifacts/sparkProgram_jar/sparkProgram.jar

Error:

Error: Failed to load class com.testing.mainObject

I have used spark-submit before and it worked (a few months ago). I'm not sure why it is giving me an error now. My MANIFEST.MF is as follows:

Manifest-Version: 1.0
Main-Class: com.testing.mainObject
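
As an aside, if the jar were built with sbt package rather than through IntelliJ, the same Main-Class entry could be produced from build.sbt; a one-line sketch (this setting is not part of the original file):

// sbt 1.x: tells `sbt package` to record Main-Class in the jar's MANIFEST.MF
Compile / mainClass := Some("com.testing.mainObject")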

My answer so far has been to build the jar file differently (still created through IntelliJ).

File -> Project Structure -> Project Settings -> Artifacts -> Jar. However, instead of extracting the dependencies into the jar, I clicked

Copy to Output and link to manifest

(With the extract option, a dependency's own META-INF/MANIFEST.MF can overwrite yours when everything is merged into one jar, which is one plausible reason the main class could not be loaded.)

From there, I executed my spark-submit command without the --packages portion. It was

spark-submit --class com.testing.mainObject --master local out/artifacts/sparkProgram_jar/sparkProgram.jar

Also watch the spacing when copying and pasting into your terminal: stray whitespace gives you strange errors. In particular, the zsh: command not found: --packages message above appeared because the command was entered across two lines without a continuation, so zsh ran --packages as a separate command.

From there I ran into another error, shown at https://github.com/Intel-bigdata/HiBench/issues/466. The solution was in the comments:

"This seems to happen with hadoop 3. I solved it removing a hadoop-hdfs-2.4.0.jar that was in the classpath."