sbt YARN dependency error when using Spark
Hi, when I type this command
>sbt
I see this output:
beyhan@beyhan:~/sparksample$ sbt
Starting sbt: invoke with -help for other options
[info] Set current project to Spark Sample (in build file:/home/beyhan/sparksample/)
Then, after I type this command
>compile
I get this error:
[error] {file:/home/beyhan/sparksample/}default-f390c8/*:update: sbt.ResolveException: unresolved dependency: org.apache.hadoop#hadoop-yarn-common;1.0.4: not found
[error] unresolved dependency: org.apache.hadoop#hadoop-yarn-client;1.0.4: not found
[error] unresolved dependency: org.apache.hadoop#hadoop-yarn-api;1.0.4: not found
[error] download failed: org.eclipse.jetty.orbit#javax.transaction;1.1.1.v201105210645!javax.transaction.orbit
[error] download failed: org.eclipse.jetty.orbit#javax.servlet;3.0.0.v201112011016!javax.servlet.orbit
[error] download failed: org.eclipse.jetty.orbit#javax.mail.glassfish;1.4.1.v201005082020!javax.mail.glassfish.orbit
[error] download failed: org.eclipse.jetty.orbit#javax.activation;1.1.0.v201105071233!javax.activation.orbit
[error] Total time: 14 s, completed Oct 16, 2015 3:58:48 PM
My sparksample directory contains the following:
beyhan@beyhan:~/sparksample$ ll
total 20
drwxrwxr-x 4 beyhan beyhan 4096 Eki 16 16:02 ./
drwxr-xr-x 57 beyhan beyhan 4096 Eki 16 15:27 ../
drwxrwxr-x 2 beyhan beyhan 4096 Eki 16 16:02 project/
-rw-rw-r-- 1 beyhan beyhan 142 Eki 15 18:57 simple.sbt
drwxrwxr-x 3 beyhan beyhan 4096 Eki 15 11:14 src/
The src directory also contains
src>main>scala>SimpleCode.scala
My simple.sbt file looks like this:
name := "Spark Sample"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.2.0"
What should I do? I think I get this YARN-related error because I don't have YARN installed.
Thanks.
A dependency like
org.apache.hadoop#hadoop-yarn-client;1.0.4
does not appear to come from your build.sbt. Perhaps there are stale or corrupt cached files in ~/.ivy2 or ~/.m2, or some project/*.sbt file is pulling in extra dependencies.
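A common way to rule out cache corruption behind "unresolved dependency" / "download failed" errors like these is to delete the cache entries for the failing artifacts and let sbt re-resolve them on the next build. A minimal sketch; the exact cache paths are the conventional Ivy/Maven defaults and may differ on your machine:

```shell
# Remove possibly-corrupt Ivy cache entries for the failing artifacts.
rm -rf ~/.ivy2/cache/org.apache.hadoop
rm -rf ~/.ivy2/cache/org.eclipse.jetty.orbit
# Remove the matching local Maven repository entries, if present.
rm -rf ~/.m2/repository/org/apache/hadoop
# Then re-run the build; sbt will re-resolve and re-download:
# sbt compile
```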
It works fine for me, though. Note the %% below: it automatically appends the Scala binary-version suffix, so %% "spark-core" resolves to spark-core_2.11 for your scalaVersion.
build.sbt
$ cat build.sbt
name := "Spark Sample"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
All dependencies resolved:
$ sbt compile
Getting org.scala-sbt sbt 0.13.9 ...
:: retrieving :: org.scala-sbt#boot-app
confs: [default]
52 artifacts copied, 0 already retrieved (17785kB/791ms)
Getting Scala 2.10.5 (for sbt)...
:: retrieving :: org.scala-sbt#boot-scala
confs: [default]
5 artifacts copied, 0 already retrieved (24493kB/306ms)
[info] Set current project to Spark Sample (in build file:/home/tuxdna/tmp/p/)
[info] Updating {file:/home/tuxdna/tmp/p/}p...
[info] Resolving jline#jline;2.12.1 ...
[info] downloading https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.11/1.2.0/spark-core_2.11-1.2.0.jar ...
[info] [SUCCESSFUL ] org.apache.spark#spark-core_2.11;1.2.0!spark-core_2.11.jar (31007ms)
[info] downloading https://repo1.maven.org/maven2/org/apache/spark/spark-network-common_2.11/1.2.0/spark-network-common_2.11-1.2.0.jar ...
[info] [SUCCESSFUL ] org.apache.spark#spark-network-common_2.11;1.2.0!spark-network-common_2.11.jar (1873ms)
[info] downloading https://repo1.maven.org/maven2/org/apache/spark/spark-network-shuffle_2.11/1.2.0/spark-network-shuffle_2.11-1.2.0.jar ...
[info] [SUCCESSFUL ] org.apache.spark#spark-network-shuffle_2.11;1.2.0!spark-network-shuffle_2.11.jar (2122ms)
[info] Done updating.
[success] Total time: 61 s, completed 17 Oct, 2015 12:48:49 AM
Note the Scala and sbt versions I have installed:
$ sbt sbt-version
[info] Set current project to Spark Sample (in build file:/home/tuxdna/tmp/p/)
[info] 0.13.9
$ scala -version
Scala code runner version 2.11.2 -- Copyright 2002-2013, LAMP/EPFL
Could you try these steps as a separate user (or possibly on a separate machine)?
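One quick way to approximate a separate user without actually creating one is to point HOME at an empty directory, so sbt starts with no existing ~/.ivy2 or ~/.m2 caches at all. A sketch, assuming sbt is on your PATH:

```shell
# Create an empty directory to act as a fresh home, so no
# existing Ivy/Maven caches can interfere with resolution.
FRESH_HOME="$(mktemp -d)"
if command -v sbt >/dev/null 2>&1; then
    # Run the build with the fresh home; sbt re-downloads everything.
    env HOME="$FRESH_HOME" sbt compile
else
    echo "sbt not found on PATH"
fi
```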