SBT gives error when importing Spark's dependencies

I am new to Spark and this is my first test project. I followed a tutorial where everything worked fine, but when I try to run it on my own machine, the project fails to build. These are the dependencies I am using:

name := "spark"

version := "0.1"

scalaVersion := "2.12.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.3",
  "org.apache.spark" %% "spark-sql" % "2.3.3"
)

I get the following error when importing the dependencies:

[error] sbt.librarymanagement.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;2.3.3: not found
[error] unresolved dependency: org.apache.spark#spark-sql_2.12;2.3.3: not found
[error]     at sbt.internal.librarymanagement.IvyActions$.resolveAndRetrieve(IvyActions.scala:332)
[error]     at sbt.internal.librarymanagement.IvyActions$.$anonfun$updateEither(IvyActions.scala:208)
[error]     at sbt.internal.librarymanagement.IvySbt$Module.$anonfun$withModule(Ivy.scala:239)
[error]     at sbt.internal.librarymanagement.IvySbt.$anonfun$withIvy(Ivy.scala:204)
[error]     at sbt.internal.librarymanagement.IvySbt.sbt$internal$librarymanagement$IvySbt$$action(Ivy.scala:70)
[error]     at sbt.internal.librarymanagement.IvySbt$$anon.call(Ivy.scala:77)
[error]     at xsbt.boot.Locks$GlobalLock.withChannel(Locks.scala:113)
[error]     at xsbt.boot.Locks$GlobalLock.withChannelRetries(Locks.scala:91)
[error]     at xsbt.boot.Locks$GlobalLock.$anonfun$withFileLock(Locks.scala:119)
[error]     at xsbt.boot.Using$.withResource(Using.scala:12)
[error]     at xsbt.boot.Using$.apply(Using.scala:9)
[error]     at xsbt.boot.Locks$GlobalLock.withFileLock(Locks.scala:119)
[error]     at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:71)
[error]     at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:59)
[error]     at xsbt.boot.Locks$.apply0(Locks.scala:47)
[error]     at xsbt.boot.Locks$.apply(Locks.scala:36)
[error]     at sbt.internal.librarymanagement.IvySbt.withDefaultLogger(Ivy.scala:77)
[error]     at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:199)
[error]     at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:196)
[error]     at sbt.internal.librarymanagement.IvySbt$Module.withModule(Ivy.scala:238)
[error]     at sbt.internal.librarymanagement.IvyActions$.updateEither(IvyActions.scala:193)
[error]     at 

...
...
...

[error] (update) sbt.librarymanagement.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;2.3.3: not found
[error] unresolved dependency: org.apache.spark#spark-sql_2.12;2.3.3: not found
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;2.3.3: not found
[error] unresolved dependency: org.apache.spark#spark-sql_2.12;2.3.3: not found
[error] Total time: 1 s, completed 18-Sep-2021 10:33:42
[info] shutting down server

Spark 2.3 requires a compatible Scala version; try using 2.11.x as the Scala version.

Source: [sparkDocs](https://spark.apache.org/docs/2.3.0/)

`Spark runs on Java 8+, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.3.0 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x).

Note that support for Java 7, Python 2.6 and old Hadoop versions before 2.6.5 was removed as of Spark 2.2.0. As of 2.3.0, support for Scala 2.10 is removed.`
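Concretely, a minimal build.sbt sketch that should resolve, assuming you stay on Spark 2.3.3 and drop down to the last Scala 2.11 release (2.11.12):

name := "spark"

version := "0.1"

// Spark 2.3.x only publishes Scala 2.11 artifacts, so the Scala version must be 2.11.x
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.3",
  "org.apache.spark" %% "spark-sql" % "2.3.3"
)

The %% operator appends the Scala binary version to the artifact name, so with scalaVersion 2.12.8 sbt looks for spark-core_2.12;2.3.3, which was never published. If you would rather keep Scala 2.12, upgrading Spark to a 2.4.x release should also work, since Scala 2.12 artifacts are published starting with Spark 2.4.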