Building Spark fat jar with Gradle: shadow plugin yields corrupted JAR file
I am trying to build a Spark fat jar with Gradle. The build succeeds, but the resulting file is corrupted in a subtle way: trying to run it produces:
Error: Could not find or load main class shadow_test.Main
Caused by: java.lang.ClassNotFoundException: shadow_test.Main
The JAR itself looks fine: the supposedly missing class is in there, and when I unzip the JAR I can run the project without problems.
Here is the build.gradle file:
plugins {
    id "scala"
    id 'com.github.johnrengelman.shadow' version '7.1.2'
}

ext {
    ver = [
        scala   : '2.11.12',
        scala_rt: '2.11',
        spark   : '2.4.4'
    ]
}

configurations {
    // Dependencies that will be provided at runtime in the cloud execution
    provided
    compileOnly.extendsFrom(provided)
    testImplementation.extendsFrom provided
}

repositories {
    mavenCentral()
}

dependencies {
    implementation "org.scala-lang:scala-library:$ver.scala"
    provided "org.apache.xbean:xbean-asm6-shaded:4.10"
    provided "org.apache.spark:spark-sql_$ver.scala_rt:$ver.spark"
    provided "org.apache.spark:spark-hive_$ver.scala_rt:$ver.spark"
    testImplementation "org.testng:testng:6.14.3"
}

tasks.register("allJar", com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar) {
    manifest {
        attributes "Main-Class": "shadow_test.Main"
    }
    from sourceSets.main.output
    configurations = [project.configurations.runtimeClasspath, project.configurations.provided]
    zip64 true
    mergeServiceFiles()
    with jar
}

test {
    useTestNG()
}
The Gradle version is 7.3.3. The complete code of a minimal project reproducing the issue can be found at https://github.com/SashaOv/shadow-jar-repro
Appreciate any clues.
The issue was resolved by the Gradle community: https://discuss.gradle.org/t/possible-to-build-spark-fat-jar-with-gradle/42235/2?u=sashao
I was missing the excludes for signature files:
exclude 'META-INF/*.DSA'
exclude 'META-INF/*.SF'
The cause: some dependency jars are signed, and their signature files get copied into the fat jar. After merging, the digests in those files no longer match the jar's contents, so the JVM rejects the classes, which surfaces as ClassNotFoundException even though the class is physically present.
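With those lines added, the task from the question becomes the sketch below. Only the exclude calls are new relative to the original; adding `META-INF/*.RSA` as well is not part of the linked answer but is a common companion exclude for RSA-signed dependencies:

```groovy
tasks.register("allJar", com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar) {
    manifest {
        attributes "Main-Class": "shadow_test.Main"
    }
    from sourceSets.main.output
    configurations = [project.configurations.runtimeClasspath, project.configurations.provided]
    // Drop signature files from signed dependency jars; after merging they
    // no longer match the jar's contents and make the JVM reject classes.
    exclude 'META-INF/*.SF'
    exclude 'META-INF/*.DSA'
    exclude 'META-INF/*.RSA'  // assumption: also covers RSA-signed dependencies
    zip64 true
    mergeServiceFiles()
    with jar
}
```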