How can I run DataNucleus Bytecode Enhancer from SBT?

I've put together a proof of concept which aims to provide a skeleton SBT multi-module project that employs the DataNucleus JDO Enhancer with mixed Java and Scala sources.

I'm having difficulties enhancing the persistent classes from SBT. Apparently, I'm not passing the correct classpath when calling Fork.java.fork(...) from SBT.


See also this question:


Exception in thread "main" java.lang.NoClassDefFoundError: Could not initialize class org.datanucleus.util.Localiser
        at org.datanucleus.metadata.MetaDataManagerImpl.loadPersistenceUnit(MetaDataManagerImpl.java:1104)
        at org.datanucleus.enhancer.DataNucleusEnhancer.getFileMetadataForInput(DataNucleusEnhancer.java:768)
        at org.datanucleus.enhancer.DataNucleusEnhancer.enhance(DataNucleusEnhancer.java:488)
        at org.datanucleus.api.jdo.JDOEnhancer.enhance(JDOEnhancer.java:125)
        at javax.jdo.Enhancer.run(Enhancer.java:196)
        at javax.jdo.Enhancer.main(Enhancer.java:130)
[info] Compiling 2 Java sources to /home/rgomes/workspace/poc-scala-datanucleus/model/target/scala-2.11/klasses...
java.lang.IllegalStateException: errno = 1
        at 321831a5683ffa07b5$.runner(build.sbt:230)
        at 321831a5683ffa07b5$$anonfun$model.apply(build.sbt:259)
        at 321831a5683ffa07b5$$anonfun$model.apply(build.sbt:258)
        at scala.Function1$$anonfun$compose.apply(Function1.scala:47)
        at sbt.$tilde$greater$$anonfun$$u2219.apply(TypeFunctions.scala:40)
        at sbt.std.Transform$$anon.work(System.scala:63)
        at sbt.Execute$$anonfun$submit$$anonfun$apply.apply(Execute.scala:226)
        at sbt.Execute$$anonfun$submit$$anonfun$apply.apply(Execute.scala:226)
        at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
        at sbt.Execute.work(Execute.scala:235)
        at sbt.Execute$$anonfun$submit.apply(Execute.scala:226)
        at sbt.Execute$$anonfun$submit.apply(Execute.scala:226)
        at sbt.ConcurrentRestrictions$$anon$$anonfun.apply(ConcurrentRestrictions.scala:159)
        at sbt.CompletionService$$anon.call(CompletionService.scala:28)

For completeness, below you can see the java command line generated by SBT, which can, for example, be executed manually in a separate window. It works fine.

$ java  -cp /home/rgomes/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.11.6.jar:/home/rgomes/.ivy2/cache/com.google.code.gson/gson/jars/gson-2.3.1.jar:/home/rgomes/.ivy2/cache/javax.jdo/jdo-api/jars/jdo-api-3.0.jar:/home/rgomes/.ivy2/cache/javax.transaction/transaction-api/jars/transaction-api-1.1.jar:/home/rgomes/.ivy2/cache/org.datanucleus/datanucleus-core/jars/datanucleus-core-4.0.4.jar:/home/rgomes/.ivy2/cache/org.datanucleus/datanucleus-api-jdo/jars/datanucleus-api-jdo-4.0.4.jar:/home/rgomes/.ivy2/cache/org.datanucleus/datanucleus-jdo-query/jars/datanucleus-jdo-query-4.0.4.jar:/home/rgomes/.ivy2/cache/org.datanucleus/datanucleus-rdbms/jars/datanucleus-rdbms-4.0.4.jar:/home/rgomes/.ivy2/cache/com.h2database/h2/jars/h2-1.4.185.jar:/home/rgomes/.ivy2/cache/org.postgresql/postgresql/jars/postgresql-9.4-1200-jdbc41.jar:/home/rgomes/.ivy2/cache/com.github.dblock.waffle/waffle-jna/jars/waffle-jna-1.7.jar:/home/rgomes/.ivy2/cache/net.java.dev.jna/jna/jars/jna-4.1.0.jar:/home/rgomes/.ivy2/cache/net.java.dev.jna/jna-platform/jars/jna-platform-4.1.0.jar:/home/rgomes/.ivy2/cache/org.slf4j/slf4j-simple/jars/slf4j-simple-1.7.7.jar:/home/rgomes/.ivy2/cache/org.slf4j/slf4j-api/jars/slf4j-api-1.7.7.jar:/home/rgomes/workspace/poc-scala-datanucleus/model/src/main/resources:/home/rgomes/workspace/poc-scala-datanucleus/model/target/scala-2.11/klasses javax.jdo.Enhancer -v -pu persistence-h2 -d /home/rgomes/workspace/poc-scala-datanucleus/model/target/scala-2.11/classes

May 13, 2015 3:30:07 PM org.datanucleus.enhancer.ClassEnhancerImpl save
INFO: Writing class file "/home/rgomes/workspace/poc-scala-datanucleus/model/target/scala-2.11/classes/model/AbstractModel.class" with enhanced definition
May 13, 2015 3:30:07 PM org.datanucleus.enhancer.DataNucleusEnhancer addMessage
INFO: ENHANCED (Persistable) : model.AbstractModel
May 13, 2015 3:30:07 PM org.datanucleus.enhancer.ClassEnhancerImpl save
INFO: Writing class file "/home/rgomes/workspace/poc-scala-datanucleus/model/target/scala-2.11/classes/model/Identifier.class" with enhanced definition
May 13, 2015 3:30:07 PM org.datanucleus.enhancer.DataNucleusEnhancer addMessage
INFO: ENHANCED (Persistable) : model.Identifier
May 13, 2015 3:30:07 PM org.datanucleus.enhancer.DataNucleusEnhancer addMessage
INFO: DataNucleus Enhancer completed with success for 2 classes. Timings : input=112 ms, enhance=102 ms, total=214 ms. Consult the log for full details
Enhancer Processing -v.
Enhancer adding Persistence Unit persistence-h2.
Enhancer processing output directory /home/rgomes/workspace/poc-scala-datanucleus/model/target/scala-2.11/classes.
Enhancer found JDOEnhancer of class org.datanucleus.api.jdo.JDOEnhancer.
Enhancer property key:VendorName value:DataNucleus.
Enhancer property key:VersionNumber value:4.0.4.
Enhancer property key:API value:JDO.
Enhancer enhanced 2 classes.

Below you can see some debugging information passed to Fork.java.fork(...):

=============================================================
mainClass=javax.jdo.Enhancer
args=-v -pu persistence-h2 -d /home/rgomes/workspace/poc-scala-datanucleus/model/target/scala-2.11/classes
javaHome=None
cwd=/home/rgomes/workspace/poc-scala-datanucleus/model/target/scala-2.11/classes
runJVMOptions=
bootJars ---------------------------------------------
/home/rgomes/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.11.6.jar
/home/rgomes/.ivy2/cache/com.google.code.gson/gson/jars/gson-2.3.1.jar
/home/rgomes/.ivy2/cache/javax.jdo/jdo-api/jars/jdo-api-3.0.jar
/home/rgomes/.ivy2/cache/javax.transaction/transaction-api/jars/transaction-api-1.1.jar
/home/rgomes/.ivy2/cache/org.datanucleus/datanucleus-core/jars/datanucleus-core-4.0.4.jar
/home/rgomes/.ivy2/cache/org.datanucleus/datanucleus-api-jdo/jars/datanucleus-api-jdo-4.0.4.jar
/home/rgomes/.ivy2/cache/org.datanucleus/datanucleus-jdo-query/jars/datanucleus-jdo-query-4.0.4.jar
/home/rgomes/.ivy2/cache/org.datanucleus/datanucleus-rdbms/jars/datanucleus-rdbms-4.0.4.jar
/home/rgomes/.ivy2/cache/com.h2database/h2/jars/h2-1.4.185.jar
/home/rgomes/.ivy2/cache/org.postgresql/postgresql/jars/postgresql-9.4-1200-jdbc41.jar
/home/rgomes/.ivy2/cache/com.github.dblock.waffle/waffle-jna/jars/waffle-jna-1.7.jar
/home/rgomes/.ivy2/cache/net.java.dev.jna/jna/jars/jna-4.1.0.jar
/home/rgomes/.ivy2/cache/net.java.dev.jna/jna-platform/jars/jna-platform-4.1.0.jar
/home/rgomes/.ivy2/cache/org.slf4j/slf4j-simple/jars/slf4j-simple-1.7.7.jar
/home/rgomes/.ivy2/cache/org.slf4j/slf4j-api/jars/slf4j-api-1.7.7.jar
/home/rgomes/workspace/poc-scala-datanucleus/model/src/main/resources
/home/rgomes/workspace/poc-scala-datanucleus/model/target/scala-2.11/klasses
envVars ----------------------------------------------

=============================================================
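
In build.sbt terms, the dump above corresponds roughly to the fork call sketched below. This is a hypothetical, simplified sketch: dependencyJars and classesDir are placeholders standing for the jar list and the classes directory shown above, and the actual runner in the project assembles these values dynamically.

// Hypothetical sketch of how the dump above maps onto the sbt 0.13 Fork API.
// dependencyJars and classesDir are placeholders for the values listed above.
val options = ForkOptions(
  javaHome         = None,
  outputStrategy   = Some(StdoutOutput),
  bootJars         = dependencyJars,   // the jars end up here instead of on -cp
  workingDirectory = Some(classesDir),
  runJVMOptions    = Nil,
  connectInput     = false,
  envVars          = Map.empty)

val process = new Fork("java", Some("javax.jdo.Enhancer"))
  .fork(options, Seq("-v", "-pu", "persistence-h2", "-d", classesDir.absolutePath))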

For your convenience, the project is available on GitHub: https://github.com/frgomes/poc-scala-datanucleus

Just download it and type:

./sbt compile

Any help is greatly appreciated. Thanks.

I believe the problem is that you're passing your dependency jars as boot jars instead of as the classpath.

From your poc project it would be something like:

val jvm_ = runJVMOptions.map(p => p.toString) ++
  Seq("-cp", cp_ mkString java.io.File.pathSeparator)

You can employ either java.lang.ProcessBuilder or sbt.Fork.

Please see below a generic javaRunner you can add to your build.sbt, which employs java.lang.ProcessBuilder.

Also see below a generic sbtRunner you can add to your build.sbt, which employs sbt.Fork. Thanks to @dwijnand for providing the insightful information that made sbtRunner work as expected.

def javaRunner(mainClass: String,
               args: Seq[String],
               classpath: Seq[File],
               cwd: File,
               javaHome: Option[File] = None,
               runJVMOptions: Seq[String] = Nil,
               envVars: Map[String, String] = Map.empty,
               connectInput: Boolean = false,
               outputStrategy: Option[OutputStrategy] = Some(StdoutOutput)): Seq[File] = {

  val java_ : String      = javaHome.fold("") { p => p.absolutePath + "/bin/" } + "java"
  val jvm_  : Seq[String] = runJVMOptions.map(p => p.toString)
  val cp_   : Seq[String] = classpath.map(p => p.absolutePath)
  val env_                = envVars.map({ case (k,v) => s"${k}=${v}" })
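  // Build the full java command line: the dependency jars go on -cp here.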
  val xcmd_ : Seq[String] = Seq(java_) ++ jvm_ ++ Seq("-cp", cp_.mkString(java.io.File.pathSeparator), mainClass) ++ args

  println("=============================================================")
  println(xcmd_.mkString(" "))
  println("=============================================================")
  println("")

  IO.createDirectory(cwd)

  import scala.collection.JavaConverters._
  val cmd = xcmd_.asJava

  val pb = new java.lang.ProcessBuilder(cmd)
  pb.directory(cwd)
  pb.inheritIO
  val process = pb.start()
  def cancel() = {
    println("Run canceled.")
    process.destroy()
    1
  }
  val errno = try process.waitFor catch { case e: InterruptedException => cancel() }
  if(errno==0) {
    if (args.contains("-v")) cwd.list.foreach(f => println(f))
    cwd.listFiles
  } else {
    throw new IllegalStateException(s"errno = ${errno}")
  }
}

def sbtRunner(mainClass: String,
           args: Seq[String],
           classpath: Seq[File],
           cwd: File,
           javaHome: Option[File] = None,
           runJVMOptions: Seq[String] = Nil,
           envVars: Map[String, String] = Map.empty,
           connectInput: Boolean = false,
           outputStrategy: Option[OutputStrategy] = Some(StdoutOutput)): Seq[File] = {

  val args_ = args.map(p => p.toString)
  val java_ = javaHome.fold("None") { p => p.absolutePath }
  val cp_   = classpath.map(p => p.absolutePath)
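  // Pass the classpath as a "-cp" JVM option; bootJars (below) stays empty.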
  val jvm_  = runJVMOptions.map(p => p.toString) ++ Seq("-cp", cp_.mkString(java.io.File.pathSeparator))
  val env_  = envVars.map({ case (k,v) => s"${k}=${v}" })

  def dump: String =
    s"""
       |mainClass=${mainClass}
       |args=${args_.mkString(" ")}
       |javaHome=${java_}
       |cwd=${cwd.absolutePath}
       |runJVMOptions=${jvm_.mkString(" ")}
       |classpath --------------------------------------------
       |${cp_.mkString("\n")}
       |envVars ----------------------------------------------
       |${env_.mkString("\n")}
    """.stripMargin

  def cmd: String =
    s"""java ${jvm_.mkString(" ")} ${mainClass} ${args_.mkString(" ")}"""

  println("=============================================================")
  println(dump)
  println("=============================================================")
  println(cmd)
  println("=============================================================")
  println("")

  IO.createDirectory(cwd)
  val options =
    ForkOptions(
      javaHome = javaHome,
      outputStrategy = outputStrategy,
      bootJars = Seq.empty,
      workingDirectory = Option(cwd),
      runJVMOptions = jvm_,
      connectInput = connectInput,
      envVars = envVars)
  val process = new Fork("java", Option(mainClass)).fork(options, args)
  def cancel() = {
    println("Run canceled.")
    process.destroy()
    1
  }
  val errno = try process.exitValue() catch { case e: InterruptedException => cancel() }
  if(errno==0) {
    if (args.contains("-v")) cwd.list.foreach(f => println(f))
    cwd.listFiles
  } else {
    throw new IllegalStateException(s"errno = ${errno}")
  }
}

Then you need to wire the DataNucleus Enhancer into your build process. This is done via the manipulateBytecode sub-task, as demonstrated below:

lazy val model =
  project.in(file("model"))
    // .settings(publishSettings:_*)
    .settings(librarySettings:_*)
    .settings(paranoidOptions:_*)
    .settings(otestFramework: _*)
    .settings(deps_tagging:_*)
    //-- .settings(deps_stream:_*)
    .settings(deps_database:_*)
    .settings(
      Seq(
        // This trick requires SBT 0.13.8
        manipulateBytecode in Compile := {
          val previous = (manipulateBytecode in Compile).value
          sbtRunner(  // javaRunner also works!
            mainClass = "javax.jdo.Enhancer",
            args =
              Seq(
                "-v",
                "-pu", "persistence-h2",
                "-d",  (classDirectory in Compile).value.absolutePath),
            classpath =
              (managedClasspath in Compile).value.files ++
                (unmanagedResourceDirectories in Compile).value :+
                (classDirectory in Compile).value,
            cwd = (classDirectory in Compile).value,
            javaHome = javaHome.value,
            envVars = (envVars in Compile).value
          )
          previous
        }
      ):_*)
    .dependsOn(util)

For a complete example, including some JDO-annotated persistent classes and some basic test cases, please have a look at

http://github.com/frgomes/poc-scala-datanucleus