How to fix "origin location must be absolute" error in sbt project (with Spark 2.4.5 and DeltaLake 0.6.1)?
I'm trying to set up an sbt project for Spark 2.4.5 with Delta Lake 0.6.1. My build file is shown below.
However, this configuration does not seem to resolve some of the dependencies.
[info] Reapplying settings...
[info] Set current project to red-basket-pipelnes (in build file:/Users/ashika.umagiliya/git-repo/redbasket-pipelines/red-basket-pipelnes/)
[info] Updating ...
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.antlr#antlr4;4.7: org.antlr#antlr4;4.7!antlr4.pom(pom.original) origin location must be absolute: file:/Users/ashika.umagiliya/.m2/repository/org/antlr/antlr4/4.7/antlr4-4.7.pom
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Unresolved dependencies path:
[warn] org.antlr:antlr4:4.7
[warn] +- io.delta:delta-core_2.11:0.6.1 (/Users/ashika.umagiliya/git-repo/redbasket-pipelines/red-basket-pipelnes/build.sbt#L13-26)
[warn] +- com.mycompany.dpd.solutions:deltalake-pipelnes_2.11:1.0
[error] sbt.librarymanagement.ResolveException: unresolved dependency: org.antlr#antlr4;4.7: org.antlr#antlr4;4.7!antlr4.pom(pom.original) origin location must be absolute: file:/Users/ashika.umagiliya/.m2/repository/org/antlr/antlr4/4.7/antlr4-4.7.pom
[error] at sbt.internal.librarymanagement.IvyActions$.resolveAndRetrieve(IvyActions.scala:332)
build.sbt
name := "deltalake-pipelnes"
version := "1.0"
organization := "com.mycompany.dpd.solutions"
// The compatible Scala version for Spark 2.4.1 is 2.11
scalaVersion := "2.11.12"
val sparkVersion = "2.4.5"
val scalatestVersion = "3.0.5"
val deltaLakeCore = "0.6.1"
val sparkTestingBaseVersion = s"${sparkVersion}_0.14.0"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % sparkVersion % "provided",
"org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
"org.apache.spark" %% "spark-hive" % sparkVersion % "provided",
"org.apache.spark" %% "spark-avro" % sparkVersion % "provided",
"io.delta" %% "delta-core" % deltaLakeCore,
"org.scalatest" %% "scalatest" % scalatestVersion % "test",
"com.holdenkarau" %% "spark-testing-base" % sparkTestingBaseVersion % "test"
)
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
assemblyMergeStrategy in assembly := {
case PathList("org", "apache", xs @ _*) => MergeStrategy.last
case PathList("changelog.txt") => MergeStrategy.last
case PathList(ps @ _*) if ps.last contains "spring" => MergeStrategy.last
case x =>
val oldStrategy = (assemblyMergeStrategy in assembly).value
oldStrategy(x)
}
resolvers ++= Seq(
"SPDB Maven Repository" at "https://artifactory.mycompany-it.com/spdb-mvn/",
Resolver.mavenLocal)
publishMavenStyle := true
publishTo := {
val repoBaseUrl = "https://artifactory.mycompany-it.com/"
if (isSnapshot.value)
Some("snapshots" at repoBaseUrl + "spdb-mvn-snapshot/")
else
Some("releases" at repoBaseUrl + "spdb-mvn-release/")
}
publishConfiguration := publishConfiguration.value.withOverwrite(true)
publishLocalConfiguration := publishLocalConfiguration.value.withOverwrite(true)
credentials += Credentials(Path.userHome / ".sbt" / ".credentials")
artifact in (Compile, assembly) := {
val art = (artifact in (Compile, assembly)).value
art.withClassifier(Some("assembly"))
}
addArtifact(artifact in (Compile, assembly), assembly)
parallelExecution in Test := false
Any tips on how to fix this?
I haven't figured out when or why this happens myself, but I have run into similar resolution-related errors before.
Whenever I run into issues like yours, I usually delete the affected directory (e.g. /Users/ashika.umagiliya/.m2/repository/org/antlr) and start over. That usually helps. If it doesn't, I delete ~/.ivy2 as well and start over. It's a sledgehammer, but it gets the job done (aka if all you have is a hammer, everything looks like a nail).
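In concrete terms that boils down to something like the following (a sketch assuming the default cache locations on macOS/Linux; the antlr path is the one from the error above):

# remove only the artifacts the error complains about first
rm -rf ~/.m2/repository/org/antlr
# if that is not enough, wipe the Ivy caches as well
rm -rf ~/.ivy2/cache ~/.ivy2/local
# then let sbt re-resolve everything from scratch
sbt clean update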
I always make sure to use the latest and greatest sbt. You seem to be on macOS, so sdk update early and often.
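If the project pins its own launcher version, the sbt upgrade is a one-line change in project/build.properties (the version below is only an example; use whatever the current release is):

# project/build.properties
sbt.version=1.3.13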
I'd also recommend using the latest and greatest versions of the libraries; more specifically, Spark would be 2.4.7 (in the 2.4.x line) while Delta Lake should be 0.8.0.
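One caveat on that last point: Delta Lake 0.7.0 and later are built for Spark 3.x and Scala 2.12, so there is no delta-core_2.11 artifact at 0.8.0. If you stay on the Spark 2.4.x line, 0.6.x is as far as Delta Lake goes; moving to 0.8.0 implies a migration roughly like this sketch against the build.sbt above (versions illustrative):

// Sketch only: Delta Lake 0.8.0 targets Spark 3.x / Scala 2.12
scalaVersion := "2.12.12"

val sparkVersion = "3.0.1"
val deltaLakeCore = "0.8.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
  "io.delta" %% "delta-core" % deltaLakeCore
)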