Trying to compile gensort.scala, getting: [error] impossible to get artifacts when data has not been loaded. IvyNode =net.java.dev.jets3t#jets3t;0.6.1
I'm new to Scala and sbt and not sure how to proceed. Am I missing more dependencies?
Steps to reproduce:
- save the gensort.scala code in ~/spark-1.3.0/project/
- start the build: my-server$ ~/spark-1.3.0/project/sbt
- > run
gensort.scala: gensort source
Build definition file in ~/spark-1.3.0/project/build.sbt:
lazy val root = (project in file(".")).
  settings(
    name := "gensort",
    version := "1.0",
    scalaVersion := "2.11.6"
  )

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-examples_2.10" % "1.1.1",
  "org.apache.spark" % "spark-core_2.11" % "1.3.0",
  "org.apache.spark" % "spark-streaming-mqtt_2.11" % "1.3.0",
  "org.apache.spark" % "spark-streaming_2.11" % "1.3.0",
  "org.apache.spark" % "spark-network-common_2.10" % "1.2.0",
  "org.apache.spark" % "spark-network-shuffle_2.10" % "1.3.0",
  "org.apache.hadoop" % "hadoop-core" % "1.2.1"
)
Any insight on how to move forward would be greatly appreciated. Thanks! -Dennis
You should not mix 2.10 and 2.11; they are not binary compatible. Your libraryDependencies should look like this:
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-examples" % "1.1.1",
  "org.apache.spark" %% "spark-core" % "1.3.0",
  "org.apache.spark" %% "spark-streaming-mqtt" % "1.3.0",
  "org.apache.spark" %% "spark-streaming" % "1.3.0",
  "org.apache.spark" %% "spark-network-common" % "1.2.0",
  "org.apache.spark" %% "spark-network-shuffle" % "1.3.0",
  "org.apache.hadoop" % "hadoop-core" % "1.2.1"
)
%% means the Scala version is appended as a suffix to the library id. After making this change, I got an error because one dependency could not be found. It is located here:
resolvers += "poho" at "https://repo.eclipse.org/content/repositories/paho-releases"
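To illustrate what %% does: sbt appends the project's Scala binary version to the artifact name, so with scalaVersion := "2.10.5" the two declarations below resolve to the same artifact (a minimal sketch, not part of the original build file):

    // With scalaVersion := "2.10.5", sbt expands %% to the _2.10 suffix,
    // so these two lines request the same artifact:
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0"
    libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.3.0"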
However, it seems spark-examples is not available for 2.11. Changing scalaVersion to
scalaVersion := "2.10.5"
resolved all dependency issues and the compilation succeeded.
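Putting it together, a minimal sketch of the resulting build.sbt after both changes (the resolver name "poho" is carried over from above; adjust to taste):

    lazy val root = (project in file(".")).
      settings(
        name := "gensort",
        version := "1.0",
        scalaVersion := "2.10.5"  // 2.10 because spark-examples has no 2.11 build
      )

    // Repository hosting the Paho MQTT dependency pulled in by spark-streaming-mqtt
    resolvers += "poho" at "https://repo.eclipse.org/content/repositories/paho-releases"

    // %% appends the Scala binary version (_2.10) to each Spark artifact id
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-examples" % "1.1.1",
      "org.apache.spark" %% "spark-core" % "1.3.0",
      "org.apache.spark" %% "spark-streaming-mqtt" % "1.3.0",
      "org.apache.spark" %% "spark-streaming" % "1.3.0",
      "org.apache.spark" %% "spark-network-common" % "1.2.0",
      "org.apache.spark" %% "spark-network-shuffle" % "1.3.0",
      "org.apache.hadoop" % "hadoop-core" % "1.2.1"
    )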