Unresolved dependencies path SBT - Scala IntelliJ project

I have a fresh installation of IntelliJ and have set up a Spark, Scala and SBT development environment in it, but when I try to compile with SBT I get unresolved dependency errors.

Below is my SBT file:

name := "xxxxxxxxxxxxxxxxxxxx"
version := "0.1"
scalaVersion := "2.11.8"

val sparkVersion = "2.3.1"
val jacksonCore = "2.6.7"
val publishMavenStyle = true


resolvers ++= Seq(
  "Artifactory" at "https://binrepo.xxxxxx.com/artifactory/xyz/",
  "Artifactory Common" at "https://binrepo.xxxxxx.com/artifactory/data-engineering-gscl-abc/",
  //"ArtifactorySnapShots" at "https://binrepo.xxxxxx.com/artifactory/xyz/SNAPSHOTS/"
  ("Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven").withAllowInsecureProtocol(true)
)

libraryDependencies ++= Seq(


  "org.scalamock" %% "scalamock" % "4.4.0" % Test,

  //dependency of xyz-core.
  "com.tgt.dsc.xyz.datapipeline" % "xyz-core_2.11" % "1.8.1",

  //dependency of fields performance specific  common function
  "abc-common" % "abc-common_2.11" % "3.0.0",

  //dependency of reading configuration
  "com.typesafe" % "config" % "1.3.3",

  //spark core libraries, in the production or for spark-submit in local add provided so that dependent jar is not part of assembly jar
  // ex :"org.apache.spark" %% "spark-core" % sparkVersion % provided,
  "org.apache.spark" %% "spark-core" % sparkVersion  % Provided ,
  "org.apache.spark" %% "spark-sql" % sparkVersion  % Provided ,
  "org.apache.spark" %% "spark-hive" % sparkVersion % Provided ,
  "org.scala-lang" % "scala-library" % scalaVersion.value,

  //logging library
  "org.slf4j" % "slf4j-api" % "1.7.29",

  //for doing testing
  "org.scalatest" %% "scalatest" % "3.1.0" % Test,
  "MrPowers" % "spark-fast-tests" % "0.20.0-s_2.11",
  "mrpowers" % "spark-daria" % "0.35.0-s_2.11"
)

enablePlugins(GitVersioning)
assemblyJarName in assembly := s"${name.value}_${scalaVersion.value}-${version.value}.jar"
assemblyMergeStrategy in assembly := {
  //case "META-INF/services/org.apache.spark.sql.sources.DataSourceRegister" => MergeStrategy.concat
  case PathList("META-INF", xs@_*) => MergeStrategy.discard
  case x => MergeStrategy.first
}

//get the token from secrets
val token = sys.env.getOrElse("SONAR_TOKEN", "")

//configurations for sonar integration
sonarProperties ++= Map(
  "sonar.host.url" -> "http://sonarqube.xxxxxx.com:9000",
  "sonar.scala.version" -> "2.11",
  "sonar.projectName" -> "xyz-starter",
  "sonar.projectKey" -> "xyz-starter",
  "sonar.sources" -> "src/main/scala",
  "sonar.tests" -> "src/test/scala",
  "sonar.scala.coverage.reportPaths" -> "xxxxxx/scala-2.11/coverage-report/cobertura.xml,xxxxxx/scala-2.11/scapegoat-report/scapegoat.xml",
  "sonar.login" -> token,
  "sonar.buildbreaker.skip" -> "false"
)

//how much code coverage is needed
coverageMinimum := 80
//fail the build if code coverage is not met
coverageFailOnMinimum := true

The entire sbt file is shown in red, including name, version and scalaVersion.

When I compile, the following is the error I am getting now:

/Library/Java/JavaVirtualMachines/adoptopenjdk-11.jdk/Contents/Home/bin/java -Djline.terminal=jline.UnsupportedTerminal -Dsbt.log.noformat=true -Dfile.encoding=UTF-8 -Didea.managed=true -Dfile.encoding=UTF-8 -jar /Users/xxxxxxx/Library/Application Support/JetBrains/IdeaIC2020.2/plugins/Scala/launcher/sbt-launch.jar
[info] welcome to sbt 1.4.5 (AdoptOpenJDK Java 11.0.9)
[info] loading global plugins from /Users/xxxxxxx/.sbt/1.0/plugins
[info] loading settings for project xxxxxxxxxxxxxxxxxxxxxxxxxx-build from assembly.sbt ...
[info] loading project definition from /Users/xxxxxxx/Documents/GitClone/xxxxxxxxxxxxxxxxxxxxxxxxxx/project
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] loading settings for project xxxxxxxxxxxxxxxxxxxxxxxxxx from build.sbt ...
[info] set current project to xxxxxxxxxxxxxxxxxxxxxxxxxx (in build file:/Users/xxxxxxx/Documents/GitClone/xxxxxxxxxxxxxxxxxxxxxxxxxx/)
[info] sbt server started at local:///Users/xxxxxxx/.sbt/1.0/server/5231834612cf96406db7/sock
[info] started sbt server
sbt:xxxxxxxxxxxxxxxxxxxxxxxxxx>
;set _root_.scala.collection.Seq(historyPath := None,shellPrompt := { _ => "" }
,SettingKey[_root_.scala.Option[_root_.sbt.File]]("sbtStructureOutputFile")
in _root_.sbt.Global := _root_.scala.Some(_root_.sbt.file("/private/var/folders/6p/qvsthwj11q38nxlpn72hv6fr0000gq/T/sbt-structure.xml"))
,SettingKey[_root_.java.lang.String]
("sbtStructureOptions")
in _root_.sbt.Global := "download, resolveClassifiers")
[info] Defining Global / sbtStructureOptions, Global / sbtStructureOutputFile and 1 others.
[info] The new values will be used by cleanKeepGlobs
[info]  Run `last` for details.
[info] Reapplying settings...
[info] set current project to xxxxxxxxxxxxxxxxxxxxxxxxxx (in build file:/Users/xxxxxxx/Documents/GitClone/xxxxxxxxxxxxxxxxxxxxxxxxxx/)
[info] Applying State transformations org.jetbrains.sbt.CreateTasks from /Users/xxxxxxx/Library/Application Support/JetBrains/IdeaIC2020.2/plugins/Scala/repo/org.jetbrains/sbt-structure-extractor/scala_2.12/sbt_1.0/2018.2.1+4-88400d3f/jars/sbt-structure-extractor.jar
[info] Reapplying settings...
[info] set current project to xxxxxxxxxxxxxxxxxxxxxxxxxx (in build file:/Users/xxxxxxx/Documents/GitClone/xxxxxxxxxxxxxxxxxxxxxxxxxx/)
[warn]
[warn]  Note: Unresolved dependencies path:
[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading MrPowers:spark-fast-tests:0.20.0-s_2.11
[error]   Not found
[error]   Not found
[error]   not found: /Users/xxxxxxx/.ivy2/localMrPowers/spark-fast-tests/0.20.0-s_2.11/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error]   not found: https://binrepo.target.com/artifactory/kelsa/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error]   not found: https://binrepo.target.com/artifactory/data-engineering-gscl-fieldperformance/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error]   download error: Caught java.io.IOException: Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom (Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom) while downloading http://dl.bintray.com/spark-packages/maven/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error] Error downloading mrpowers:spark-daria:0.35.0-s_2.11
[error]   Not found
[error]   Not found
[error]   not found: /Users/xxxxxxx/.ivy2/localmrpowers/spark-daria/0.35.0-s_2.11/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error]   not found: https://binrepo.target.com/artifactory/kelsa/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error]   not found: https://binrepo.target.com/artifactory/data-engineering-gscl-fieldperformance/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error]   download error: Caught java.io.IOException: Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom (Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom) while downloading http://dl.bintray.com/spark-packages/maven/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: Error downloading MrPowers:spark-fast-tests:0.20.0-s_2.11
[error]   Not found
[error]   Not found
[error]   not found: /Users/xxxxxxx/.ivy2/localMrPowers/spark-fast-tests/0.20.0-s_2.11/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error]   not found: https://binrepo.target.com/artifactory/kelsa/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error]   not found: https://binrepo.target.com/artifactory/data-engineering-gscl-fieldperformance/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error]   download error: Caught java.io.IOException: Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom (Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom) while downloading http://dl.bintray.com/spark-packages/maven/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom
[error] Error downloading mrpowers:spark-daria:0.35.0-s_2.11
[error]   Not found
[error]   Not found
[error]   not found: /Users/xxxxxxx/.ivy2/localmrpowers/spark-daria/0.35.0-s_2.11/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error]   not found: https://binrepo.target.com/artifactory/kelsa/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error]   not found: https://binrepo.target.com/artifactory/data-engineering-gscl-fieldperformance/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error]   download error: Caught java.io.IOException: Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom (Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom) while downloading http://dl.bintray.com/spark-packages/maven/mrpowers/spark-daria/0.35.0-s_2.11/spark-daria-0.35.0-s_2.11.pom
[error] Total time: 7 s, completed 19-May-2021, 6:44:44 PM
[info] shutting down sbt server

Any ideas on how to resolve it?

Adding a screenshot of what my console looks like.

"The entire sbt file is shown in red, including name, version and scalaVersion"

This is probably caused by some missing configuration in IntelliJ; you should have seen a popup of some kind asking you to "Configure Scala SDK". If not, you can go to your module settings and add the Scala SDK there.

"When I compile, the following is the error I am getting now"

If you look at the error closely, you should notice this message:

Server returned HTTP response code: 403 for URL: http://dl.bintray.com/spark-packages/maven/MrPowers/spark-fast-tests/0.20.0-s_2.11/spark-fast-tests-0.20.0-s_2.11.pom

The dependencies you are looking for ("MrPowers" % "spark-fast-tests" % "0.20.0-s_2.11" and "mrpowers" % "spark-daria" % "0.35.0-s_2.11") are only available in the repository http://dl.bintray.com/spark-packages/maven/, which seems to require some authentication, as the HTTP 403 error code suggests.

You can either configure that authentication, or use newer versions of these libraries, which are published to the public Maven Central repository:

  • "com.github.mrpowers" %% "spark-daria" % "0.39.0"
  • "com.github.mrpowers" %% "spark-fast-tests" % "0.23.0"

EDIT: how did I find it? I searched for your dependency on https://mvnrepository.com/search?q=spark-daria and found the newer artifact carrying the Repository Central flag.

Also note that there are a few things you might want to change in your build.sbt:
  • Use the "sbt scheme" for Scala dependencies, so that you don't have to set the Scala version by hand:
    • use "com.github.mrpowers" %% "spark-daria" % "0.39.0"
    • instead of "com.github.mrpowers" % "spark-daria" % "0.39.0_2.11" (note the double %% and the absence of the _2.11 suffix)
  • Do not declare the Scala library dependency "org.scala-lang" % "scala-library" % scalaVersion.value; it is provided implicitly.
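To make the %% point concrete, here is a small sketch; the comment describes how sbt expands the artifact name:

//with scalaVersion := "2.11.8", %% makes sbt resolve the artifact spark-daria_2.11
libraryDependencies += "com.github.mrpowers" %% "spark-daria" % "0.39.0"

//equivalent, but with the suffix hard-coded on the artifact name, which breaks when scalaVersion changes
//libraryDependencies += "com.github.mrpowers" % "spark-daria_2.11" % "0.39.0"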

The issue was related to the missing dependencies, namely:

"MrPowers" % "spark-fast-tests" % "0.20.0-s_2.11",
"mrpowers" % "spark-daria" % "0.35.0-s_2.11"

After removing them from my code, I found that they were also used by another dependency JAR:

//dependency of fields performance specific  common function
"abc-common" % "abc-common_2.11" % "3.0.0",

Removing it from there as well resolved the issue.
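For reference, an alternative (untested) sketch would be to keep abc-common but exclude the transitive artifacts at the declaration site; the organization names "MrPowers" and "mrpowers" below are only assumptions taken from the resolve errors above:

  //hypothetical: exclude the bintray-only artifacts that abc-common pulls in transitively
  "abc-common" % "abc-common_2.11" % "3.0.0" excludeAll(
    ExclusionRule(organization = "MrPowers"),
    ExclusionRule(organization = "mrpowers")
  ),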