Is there any specific sbt version required to compile the cassandra-spark-connector?
I am building the assembly of the "Cassandra-Spark-Connector". I simply followed these steps:
- Git clone the connector code
- Run "sbt assembly"
During the assembly phase I get the following errors:
[info] Done updating.
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn] * com.eed3si9n:sbt-assembly:0.11.2 -> 0.13.0
[warn] Run 'evicted' to see detailed eviction warnings
[info] Compiling 5 Scala sources to /home/xxxxxx/Development/iAdLearning/spark-cassandra-connector/project/target/scala-2.10/sbt-0.13/classes...
[error] /home/xxxxxx/Development/iAdLearning/spark-cassandra-connector/project/Settings.scala:23: object Plugin is not a member of package sbtassembly
[error] import sbtassembly.Plugin._
[error] ^
[error] /home/xxxxxx/Development/iAdLearning/spark-cassandra-connector/project/Settings.scala:24: not found: object AssemblyKeys
[error] import AssemblyKeys._
[error] ^
[error] /home/xxxxxx/Development/iAdLearning/spark-cassandra-connector/project/Settings.scala:217: not found: value assemblySettings
[error] lazy val sbtAssemblySettings = assemblySettings ++ Seq(
[error] ^
[error] three errors found
[error] (compile:compileIncremental) Compilation failed
I am running sbt 0.13.6.
Building the Spark Cassandra connector requires sbt-assembly version 0.11.2, as defined in the project's plugins.sbt. You probably have a newer sbt-assembly version (0.13.0) installed in the global plugins folder (~/.sbt/0.13/plugins), which is what causes this problem.
Rename the plugins folder under ~/.sbt/0.13 and try building again.
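The rename step above can be sketched as follows (the path is an assumption based on the default sbt 0.13 layout; adjust it if your global sbt directory differs):

```shell
# Move the global sbt plugins folder aside so that only the
# project-level sbt-assembly 0.11.2 ends up on the build classpath.
SBT_GLOBAL="$HOME/.sbt/0.13"
if [ -d "$SBT_GLOBAL/plugins" ]; then
  mv "$SBT_GLOBAL/plugins" "$SBT_GLOBAL/plugins.disabled"
fi
```

Restore the folder afterwards if other builds on your machine rely on your global plugins.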
You can always run the bundled sbt:
./sbt/sbt assembly
This will automatically download and use a valid sbt version.
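For reference, the plugin version the build expects is pinned in the project's plugin definition, roughly like this (a sketch; the version is taken from the eviction warning in the log, and the exact file contents may differ):

```scala
// project/plugins.sbt (sketch): pin sbt-assembly to the version the build expects.
// A newer sbt-assembly (0.12+) moved the API out of sbtassembly.Plugin,
// which is why a globally installed 0.13.0 breaks these imports.
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
```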