Apache Ignite integration with scala-spark using sbt
I am trying to integrate Ignite into my Scala code and run the application using sbt. I cannot use any IDE for this.
Scala version - 2.11.0
Spark version - 2.3.0
Ignite version - 2.8.0
Sbt version - 1.3.3
I tried adding the basic library dependency in build.sbt:
libraryDependencies += "org.apache.ignite" %% "ignite-spark" % "2.8.0"
My complete build.sbt is:
scalaVersion := "2.11.0"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"
libraryDependencies += "org.apache.ignite" %% "ignite-spark" % "2.8.0"
But it still does not work, and I get the following error:
$ sbt package
[info] Loading project definition from /home/testing/project
[info] Loading settings for project testing from build.sbt ...
[info] Set current project to testing (in build file:/home/testing/)
[info] Updating
[info] Resolved dependencies
[warn]
[warn] Note: Unresolved dependencies path:
[error] sbt.librarymanagement.ResolveException: Error downloading org.apache.ignite:ignite-spark_2.11:2.8.0
[error] Not found
[error] Not found
[error] not found: /root/.ivy2/local/org.apache.ignite/ignite-spark_2.11/2.8.0/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/ignite/ignite-spark_2.11/2.8.0/ignite-spark_2.11-2.8.0.pom
[error] at lmcoursier.CoursierDependencyResolution.unresolvedWarningOrThrow(CoursierDependencyResolution.scala:245)
[error] at lmcoursier.CoursierDependencyResolution.$anonfun$update(CoursierDependencyResolution.scala:214)
[error] at scala.util.Either$LeftProjection.map(Either.scala:573)
[error] at lmcoursier.CoursierDependencyResolution.update(CoursierDependencyResolution.scala:214)
[error] at sbt.librarymanagement.DependencyResolution.update(DependencyResolution.scala:60)
[error] at sbt.internal.LibraryManagement$.resolve(LibraryManagement.scala:52)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate(LibraryManagement.scala:102)
[error] at sbt.util.Tracked$.$anonfun$lastOutput(Tracked.scala:69)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate(LibraryManagement.scala:115)
[error] at scala.util.control.Exception$Catch.apply(Exception.scala:228)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate(LibraryManagement.scala:115)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$adapted(LibraryManagement.scala:96)
[error] at sbt.util.Tracked$.$anonfun$inputChanged(Tracked.scala:150)
[error] at sbt.internal.LibraryManagement$.cachedUpdate(LibraryManagement.scala:129)
[error] at sbt.Classpaths$.$anonfun$updateTask0(Defaults.scala:2946)
[error] at scala.Function1.$anonfun$compose(Function1.scala:49)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219(TypeFunctions.scala:62)
[error] at sbt.std.Transform$$anon.work(Transform.scala:67)
[error] at sbt.Execute.$anonfun$submit(Execute.scala:281)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:19)
[error] at sbt.Execute.work(Execute.scala:290)
[error] at sbt.Execute.$anonfun$submit(Execute.scala:281)
[error] at sbt.ConcurrentRestrictions$$anon.$anonfun$submitValid(ConcurrentRestrictions.scala:178)
[error] at sbt.CompletionService$$anon.call(CompletionService.scala:37)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] at java.lang.Thread.run(Thread.java:748)
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.ignite:ignite-spark_2.11:2.8.0
[error] Not found
[error] Not found
[error] not found: /root/.ivy2/local/org.apache.ignite/ignite-spark_2.11/2.8.0/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/ignite/ignite-spark_2.11/2.8.0/ignite-spark_2.11-2.8.0.pom
[error] Total time: 5 s, completed May 28, 2020 6:24:04 AM
I am working on the ubuntu:latest Docker image. I am sure Spark and Ignite themselves are working, because I am also running PySpark and it works fine. Please help me; I think I am making some small beginner mistake that has turned into the main blocker here.
It looks like sbt is asking not just for "ignite-spark" but for "ignite-spark_2.11":
"not found: https://repo1.maven.org/maven2/org/apache/ignite/ignite-spark_2.11/2.8.0/ignite-spark_2.11-2.8.0.pom"
Use
libraryDependencies += "org.apache.ignite" % "ignite-spark" % "2.8.0"
instead of
libraryDependencies += "org.apache.ignite" %% "ignite-spark" % "2.8.0"
%% -> fetches the dependency with the Scala version appended to the artifact name
% -> fetches the dependency without the Scala version appended
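For reference, the full build.sbt with the fix applied would look something like this (a minimal sketch; the name setting is illustrative and was not in the original post):

name := "testing"
scalaVersion := "2.11.0"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"
// ignite-spark is published on Maven Central without a _2.11 suffix, so use a single %
libraryDependencies += "org.apache.ignite" % "ignite-spark" % "2.8.0"

With the single %, sbt resolves org.apache.ignite:ignite-spark:2.8.0, which is the coordinate that actually exists on Maven Central; that artifact is itself compiled against Scala 2.11, so it matches the scalaVersion above.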