`sbt run` results in an error when compiling after adding dependencies
I added the following dependencies to my build.sbt and, after running `sbt run` in a terminal, I get the following error:
$ sbt run
[info] welcome to sbt 1.5.5 (Private Build Java 1.8.0_292)
[info] loading global plugins from /home/hayat/.sbt/1.0/plugins
[info] loading project definition from /home/hayat/myproject/project
[info] loading settings for project root from build.sbt ...
[info] set current project to scala3-simple (in build file:/home/hayat/myproject/)
[info] Updating
[info] Resolved dependencies
[warn]
[warn] Note: Unresolved dependencies path:
[error] sbt.librarymanagement.ResolveException: Error downloading org.apache.spark:spark-streaming:3.1.2
[error] Not found
[error] Not found
[error] not found: /home/hayat/.ivy2/localorg.apache.spark/spark-streaming/3.1.2/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/spark/spark-streaming/3.1.2/spark-streaming-3.1.2.pom
[error] at lmcoursier.CoursierDependencyResolution.unresolvedWarningOrThrow(CoursierDependencyResolution.scala:258)
[error] at lmcoursier.CoursierDependencyResolution.$anonfun$update(CoursierDependencyResolution.scala:227)
[error] at scala.util.Either$LeftProjection.map(Either.scala:573)
[error] at lmcoursier.CoursierDependencyResolution.update(CoursierDependencyResolution.scala:227)
[error] at sbt.librarymanagement.DependencyResolution.update(DependencyResolution.scala:60)
[error] at sbt.internal.LibraryManagement$.resolve(LibraryManagement.scala:59)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate(LibraryManagement.scala:133)
[error] at sbt.util.Tracked$.$anonfun$lastOutput(Tracked.scala:73)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate(LibraryManagement.scala:146)
[error] at scala.util.control.Exception$Catch.apply(Exception.scala:228)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate(LibraryManagement.scala:146)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$adapted(LibraryManagement.scala:127)
[error] at sbt.util.Tracked$.$anonfun$inputChangedW(Tracked.scala:219)
[error] at sbt.internal.LibraryManagement$.cachedUpdate(LibraryManagement.scala:160)
[error] at sbt.Classpaths$.$anonfun$updateTask0(Defaults.scala:3678)
[error] at scala.Function1.$anonfun$compose(Function1.scala:49)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219(TypeFunctions.scala:62)
[error] at sbt.std.Transform$$anon.work(Transform.scala:68)
[error] at sbt.Execute.$anonfun$submit(Execute.scala:282)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:23)
[error] at sbt.Execute.work(Execute.scala:291)
[error] at sbt.Execute.$anonfun$submit(Execute.scala:282)
[error] at sbt.ConcurrentRestrictions$$anon.$anonfun$submitValid(ConcurrentRestrictions.scala:265)
[error] at sbt.CompletionService$$anon.call(CompletionService.scala:64)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] at java.lang.Thread.run(Thread.java:748)
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.spark:spark-streaming:3.1.2
[error] Not found
[error] Not found
[error] not found: /home/hayat/.ivy2/localorg.apache.spark/spark-streaming/3.1.2/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/spark/spark-streaming/3.1.2/spark-streaming-3.1.2.pom
[error] Total time: 7 s, completed Sep 16, 2021 11:21:30 AM
Here is build.sbt:
val scala3Version = "3.0.2"

lazy val root = project
  .in(file("."))
  .settings(
    name := "scala3-simple",
    version := "0.1.0",
    scalaVersion := scala3Version,
    libraryDependencies += "com.novocode" % "junit-interface" % "0.11" % "test",
    libraryDependencies += "org.apache.spark" % "spark-streaming" % "3.1.2",
    libraryDependencies += "org.apache.spark" % "spark-core" % "3.1.2"
  )
- Scala version: 3.0.2
- sbt version: 1.5.5
The libraries spark-streaming and spark-core do not exist as such; the published artifacts are spark-streaming_2.12 and spark-core_2.12, where 2.12 is the Scala version. There are currently no spark-streaming_3.0 or spark-core_3.0 libraries.
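To illustrate the naming convention, here is a minimal sketch of how the artifact suffix is derived. This is a simplified model for illustration only, not sbt's actual implementation; the `crossArtifact` helper is hypothetical.

```scala
// The full Scala version of the project
val scalaFullVersion = "2.12.15"

// Spark artifacts are published per Scala *binary* version (major.minor),
// so 2.12.15 maps to the suffix "2.12"
val scalaBinaryVersion = scalaFullVersion.split('.').take(2).mkString(".")

// Hypothetical helper: builds the artifact name that sbt's %% operator
// would resolve for a given library name
def crossArtifact(name: String): String = s"${name}_$scalaBinaryVersion"

println(crossArtifact("spark-streaming")) // spark-streaming_2.12
println(crossArtifact("spark-core"))      // spark-core_2.12
```

This is why the resolver looked for `org/apache/spark/spark-streaming/3.1.2/` (no suffix) and found nothing, while `spark-streaming_2.12` does exist on Maven Central.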
So, to fix your problem, you need to:
- downgrade your Scala version from 3.0.2 to 2.12.x (latest version: 2.12.15), since there is no Spark release for Scala 3
- use the spark-streaming_2.12 library instead of spark-streaming
- use the spark-core_2.12 library instead of spark-core
To use the _2.12 version of the libraries, you can append _2.12 to the library names:
libraryDependencies += "org.apache.spark" % "spark-streaming_2.12" % "3.1.2",
libraryDependencies += "org.apache.spark" % "spark-core_2.12" % "3.1.2"
Or, better, use %% between the group and the library name so that the Scala version is appended to the library name automatically:
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "3.1.2",
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.1.2"
So your build.sbt should become:
val scala2Version = "2.12.15"

lazy val root = project
  .in(file("."))
  .settings(
    name := "scala2-simple",
    version := "0.1.0",
    scalaVersion := scala2Version,
    libraryDependencies += "com.novocode" % "junit-interface" % "0.11" % "test",
    libraryDependencies += "org.apache.spark" %% "spark-streaming" % "3.1.2",
    libraryDependencies += "org.apache.spark" %% "spark-core" % "3.1.2"
  )