Error while building Spark 1.3.0 with JDK 1.6.0_45, Maven 3.0.5 on CentOS 6
When I try to build Spark 1.3.0 after adding dependencies to the package, I get errors related to class mismatches:
```
[warn] /u01/spark/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala:23: imported `Clock' is permanently hidden by definition of trait Clock in package spark
[warn] import org.apache.spark.util.{SystemClock, Clock}
[warn] ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala:127: type mismatch;
[error] found : org.apache.spark.util.SystemClock
[error] required: org.apache.spark.Clock
[error] private var clock: Clock = new SystemClock()
[error] ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala:66: reference to Clock is ambiguous;
[error] it is imported twice in the same scope by
[error] import org.apache.spark.util._
[error] and import org.apache.spark._
[error] clock: Clock = new SystemClock())
[error] ^
[warn] /u01/spark/core/src/main/scala/org/apache/spark/deploy/worker/DriverRunner.scala:34: imported `Clock' is permanently hidden by definition of trait Clock in package worker
[warn] import org.apache.spark.util.{Clock, SystemClock}
[warn] ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/deploy/worker/DriverRunner.scala:61: type mismatch;
[error] found : org.apache.spark.util.SystemClock
[error] required: org.apache.spark.deploy.worker.Clock
[error] private var clock: Clock = new SystemClock()
[error] ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/deploy/worker/DriverRunner.scala:190: value getTimeMillis is not a member of org.apache.spark.deploy.worker.Clock
[error] val processStart = clock.getTimeMillis()
[error] ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/deploy/worker/DriverRunner.scala:192: value getTimeMillis is not a member of org.apache.spark.deploy.worker.Clock
[error] if (clock.getTimeMillis() - processStart > successfulRunDuration * 1000) {
[error] ^
[warn] /u01/spark/core/src/main/scala/org/apache/spark/executor/Executor.scala:37: imported `MutableURLClassLoader' is permanently hidden by definition of trait MutableURLClassLoader in package executor
[warn] import org.apache.spark.util.{ChildFirstURLClassLoader, MutableURLClassLoader,
[warn] ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/executor/Executor.scala:319: type mismatch;
[error] found : org.apache.spark.util.ChildFirstURLClassLoader
[error] required: org.apache.spark.executor.MutableURLClassLoader
[error] new ChildFirstURLClassLoader(urls, currentLoader)
[error] ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/executor/Executor.scala:321: trait MutableURLClassLoader is abstract; cannot be instantiated
[error] new MutableURLClassLoader(urls, currentLoader)
[error] ^
[warn] /u01/spark/core/src/main/scala/org/apache/spark/scheduler/local/LocalBackend.scala:89: postfix operator millis should be enabled
[warn] by making the implicit value scala.language.postfixOps visible.
[warn] This can be achieved by adding the import clause 'import scala.language.postfixOps'
[warn] or by setting the compiler option -language:postfixOps.
[warn] See the Scala docs for value scala.language.postfixOps for a discussion
[warn] why the feature should be explicitly enabled.
[warn] context.system.scheduler.scheduleOnce(1000 millis, self, ReviveOffers)
[warn] ^
[warn] /u01/spark/core/src/main/scala/org/apache/spark/util/MutableURLClassLoader.scala:26: imported `ParentClassLoader' is permanently hidden by definition of class ParentClassLoader in package util
[warn] import org.apache.spark.util.ParentClassLoader
[warn] ^
[warn] 5 warnings found
[error] 7 errors found
```
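The `Clock` failures above are name-collision errors: the compiler finds two different `Clock` types in scope at once, which is what happens when classes from an older Spark (where `Clock` lived outside `org.apache.spark.util`) end up on the compile classpath alongside the 1.3.0 sources. A minimal sketch reproducing the ambiguous-import error from DAGScheduler.scala, with hypothetical package names standing in for org.apache.spark and org.apache.spark.util:

```scala
// Two unrelated packages each define a Clock trait, as in the build above.
// Package and member names here are hypothetical stand-ins.
package sparkroot { trait Clock { def now(): Long } }

package sparkutil {
  trait Clock { def getTimeMillis(): Long }
  class SystemClock extends Clock {
    def getTimeMillis(): Long = System.currentTimeMillis()
  }
}

object AmbiguityDemo {
  import sparkroot._
  import sparkutil._

  // Uncommenting the next line reproduces the DAGScheduler error:
  //   reference to Clock is ambiguous; it is imported twice in the same scope
  // private var clock: Clock = new SystemClock()

  // An explicit rename on import (or a fully qualified name) disambiguates,
  // because explicit imports take precedence over wildcard imports:
  import sparkutil.{Clock => UtilClock}
  private var clock: UtilClock = new SystemClock()
}
```

In the actual build the fix is not to rename imports but to remove the stale second `Clock` from the classpath, since 1.3.0 itself compiles cleanly.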
I got the same errors when trying to build with the included Maven and JDK 1.7.

The full build output is on pastebin (id i9PFEVJ8) and the full pom.xml is on pastebin (id 8gEgT5EE).
[Update]

I have changed the Spark versions to match 1.3.0, and now I get a circular dependency error.
```
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.3.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.3.0</version>
</dependency>
```
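These artifacts are modules of Spark itself, so declaring them as dependencies of the Spark build makes Spark depend on its own modules, hence the cycle. A sketch of where they would normally go instead, in the application's own pom.xml; the `provided` scope is an assumption that the cluster already ships spark-streaming, while spark-streaming-kafka is bundled because it is not on the cluster classpath:

```
<!-- Application pom.xml sketch, not Spark's own build. Scope choices
     are assumptions about what the target cluster provides. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.3.0</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.3.0</version>
</dependency>
```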
I realized that the Kafka streaming module already comes with the prebuilt Spark 1.3.0 for MapR 3.x, and that the module and its dependencies are only needed if you generate streams in your own application.
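For reference, a minimal sketch of that application side, where the spark-streaming and spark-streaming-kafka dependencies are actually consumed (the app name, ZooKeeper quorum, group id, and topic below are hypothetical):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object KafkaWordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("KafkaWordCount")
    val ssc = new StreamingContext(conf, Seconds(10))

    // topic name -> number of receiver threads (hypothetical values)
    val topics = Map("events" -> 1)
    val lines = KafkaUtils
      .createStream(ssc, "zkhost:2181", "demo-group", topics)
      .map(_._2) // keep the message value, drop the key

    lines.flatMap(_.split(" ")).map((_, 1L)).reduceByKey(_ + _).print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```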