Loading the right dependencies for sbt console in a multi-project setup causes a Derby security exception
I have an SBT multi-project setup, outlined at https://github.com/geoHeil/sf-sbt-multiproject-dependency-problem, and I want to be able to run sbt console in the root project.
When executing:
import org.apache.spark.sql.SparkSession
val spark = SparkSession.builder().master("local[*]").enableHiveSupport.getOrCreate
spark.sql("CREATE database foo")
the error in the root project's console is:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.derby.jdbc.EmbeddedDriver
Strangely, it works just fine in the sub project:
sbt
project common
console
and then pasting the same code works.
Questions
- How can I fix sbt console so that it directly loads the right dependencies?
- How can I load the console directly for the sub project? sbt common/console does not seem to solve the problem.
Details
The most important settings are:
lazy val global = project
  .in(file("."))
  .settings(
    settings,
    libraryDependencies ++= commonDependencies
  )
  .aggregate(common)
  .dependsOn(common)

lazy val common = project
  .settings(
    name := "common",
    settings,
    libraryDependencies ++= commonDependencies
  )

lazy val dependencies =
  new {
    val sparkV = "2.3.0"

    val sparkBase = "org.apache.spark" %% "spark-core" % sparkV % "provided"
    val sparkSql  = "org.apache.spark" %% "spark-sql"  % sparkV % "provided"
    val sparkHive = "org.apache.spark" %% "spark-hive" % sparkV % "provided"
  }

lazy val commonDependencies = Seq(
  dependencies.sparkBase,
  dependencies.sparkHive,
  dependencies.sparkSql
)

lazy val settings = commonSettings

lazy val commonSettings = Seq(
  fork := true,
  run in Compile := Defaults
    .runTask(fullClasspath in Compile, mainClass.in(Compile, run), runner.in(Compile, run))
    .evaluated
)
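A possible workaround (a sketch based on the SPARK-22918 discussion, not a confirmed fix for this exact build) is to clear the SecurityManager from inside the console session before Hive support initialises Derby, by pre-loading the commands via initialCommands:

```scala
// Sketch: pre-load the sbt console with a workaround for the Derby
// "usederbyinternals" AccessControlException (see SPARK-22918).
// sbt runs the console inside its own JVM, which carries a
// SecurityManager; clearing it lets the embedded Derby driver
// initialise when enableHiveSupport is used.
initialCommands in console := """
  |System.setSecurityManager(null)
  |import org.apache.spark.sql.SparkSession
  |val spark = SparkSession.builder().master("local[*]").enableHiveSupport().getOrCreate()
  |""".stripMargin
```

With this in place, console drops you into a session where spark is already defined and the Derby permission check no longer trips.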
Related questions
- Transitive dependency errors in SBT multi-project
Edit
Strangely enough: with Spark version 2.2.0 this setup works fine. Only 2.2.1 / 2.3.0 cause these problems, and even then it works fine in a single-project setup, or when the console is started in the right sub project.
Also,
java.security.AccessControlException: access denied org.apache.derby.security.SystemPermission( "engine", "usederbyinternals" )
is mentioned in the stack trace.
In fact, using the code:
if (appName == "dev") {
  System.setSecurityManager(null)
}
fixes it for development.
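That guard can be wrapped in a small helper so the SecurityManager is only dropped for local development runs and never in deployed jobs. This is a minimal sketch; the helper name and the "dev" naming convention are assumptions, not part of the original code:

```scala
// Sketch: only clear the SecurityManager for a local dev session, so the
// Derby workaround cannot leak into production jobs. Returns whether the
// manager was cleared, which makes the behaviour easy to assert in tests.
def maybeDisableSecurityManager(appName: String): Boolean =
  if (appName == "dev") {
    // Lets Derby acquire SystemPermission("engine", "usederbyinternals")
    System.setSecurityManager(null)
    true
  } else {
    false
  }
```

Called with any other app name, e.g. maybeDisableSecurityManager("prod"), the helper is a no-op and returns false.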
https://github.com/holdenk/spark-testing-base/issues/148
https://issues.apache.org/jira/browse/SPARK-22918