Spark: A signature in package.class refers to type compileTimeOnly
When trying to build the MLlib examples with Spark 1.2.1 using SBT, I get a whole bunch of strange compilation errors. The same code builds fine with Spark 1.1.0. For Spark 1.2.1 I use the following SBT build file:
name := "Test"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.11" % "1.2.1" % "provided"
As a result I get the following set of strange errors:
[info] Compiling 1 Scala source to /home/test/target/scala-2.10/classes...
[error] bad symbolic reference. A signature in package.class refers to type compileTimeOnly
[error] in package scala.annotation which is not available.
[error] It may be completely missing from the current classpath, or the version on
[error] the classpath might be incompatible with the version used when compiling package.class.
[error] /home/test/src/main/scala/Test.scala:16: Reference to method augmentString in object Predef should not have survived past type checking,
[error] it should have been processed and eliminated during expansion of an enclosing macro.
[error] val parsedData = data.map(s => Vectors.dense(s.split(' ').map(_.toDouble))).cache()
[error] /home/test/src/main/scala/Test.scala:16: Reference to method augmentString in object Predef should not have survived past type checking,
[error] it should have been processed and eliminated during expansion of an enclosing macro.
[error] val parsedData = data.map(s => Vectors.dense(s.split(' ').map(_.toDouble))).cache()
[error] ^
[error] three errors found
[error] (compile:compile) Compilation failed
[error] Total time: 21 s, completed 26.02.2015 17:47:29
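For reference, the line at Test.scala:16 quoted in the errors comes from code along these lines (a minimal sketch; only the parsedData line is taken from the error output, the SparkContext setup around it is assumed):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.linalg.Vectors

object Test {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Test")
    val sc = new SparkContext(conf)

    val data = sc.textFile("data.txt")
    // Line 16 from the error: parse space-separated doubles into MLlib dense vectors
    val parsedData = data.map(s => Vectors.dense(s.split(' ').map(_.toDouble))).cache()

    println(parsedData.count())
    sc.stop()
  }
}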
How can I fix this? It would be great if somebody could post a generic SBT build file for compiling Spark 1.2.1 + MLlib code. Thanks!
Try changing the libraryDependencies line to the following:
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.2.1" % "provided"
You are using Scala 2.10.4, but you are pulling in the Spark library built for Scala 2.11.x. The %% operator automatically selects the Spark artifact that matches your Scala version for you.
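For a generic build, a complete build.sbt along these lines should work. This is a sketch based on the settings from your question, with only the dependency line changed; spark-core is listed explicitly here, although spark-mllib already pulls it in transitively:

name := "Test"

version := "1.0"

scalaVersion := "2.10.4"

// %% appends the Scala binary version suffix (_2.10) automatically
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.2.1" % "provided",
  "org.apache.spark" %% "spark-mllib" % "1.2.1" % "provided"
)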
I was compiling Spark 1.6.0 code with IntelliJ and ran into the same error: [error] error: bad symbolic reference. A signature in package.class refers to type compileTimeOnly.
I solved it by adding the Scala language dependencies to the project. It seems Maven does not pick up IntelliJ's Scala configuration, so the Scala dependencies have to be declared explicitly:
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-reflect</artifactId>
    <version>2.10.6</version>
</dependency>
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.10.6</version>
</dependency>
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-compiler</artifactId>
    <version>2.10.6</version>
</dependency>
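The Spark artifact itself should also carry the _2.10 suffix so that it agrees with the Scala version above. A minimal sketch, assuming spark-mllib 1.6.0 as in my setup:

<!-- Scala artifact suffix (_2.10) must match the scala-* versions declared above -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.10</artifactId>
    <version>1.6.0</version>
</dependency>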