Spark-HBase - GCP template (1/3) - How to locally package the Hortonworks connector?

I am testing the Spark-HBase connector in a GCP context and trying to follow [1], which requires locally packaging the connector [2] for Spark 2.4 with Maven (I tried Maven 3.6.3). The build fails with the error below.

Error on branch-2.4:

[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project shc-core: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.: NullPointerException -> [Help 1]

References

[1] https://github.com/GoogleCloudPlatform/cloud-bigtable-examples/tree/master/scala/bigtable-shc

[2] https://github.com/hortonworks-spark/shc/tree/branch-2.4

As suggested in the comments (thanks @Ismail!), the fix is to build the connector with Java 8; the NullPointerException appears to be scala-maven-plugin 3.2.2 failing on a newer JDK:

sdk use java 8.0.275-zulu

mvn clean package -DskipTests

The resulting jar can then be imported in the GCP template's Dependencies.scala as follows.

...
val shcCore = "com.hortonworks" % "shc-core" % "1.1.3-2.4-s_2.11" from "file:///<path_to_jar>/shc-core-1.1.3-2.4-s_2.11.jar"
...
// shcCore % (shcVersionPrefix + scalaBinaryVersion) excludeAll(
shcCore excludeAll(
...
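
For context, here is a minimal, self-contained sketch of what the override might look like in the template's Dependencies.scala. The ExclusionRule targets are illustrative assumptions (the template's real exclusion list is elided with "..." above), so adapt them to whatever the template actually excludes:

import sbt._

object Dependencies {
  // Resolve shc-core from the locally built jar instead of a remote repository.
  val shcCore: ModuleID =
    "com.hortonworks" % "shc-core" % "1.1.3-2.4-s_2.11" from
      "file:///<path_to_jar>/shc-core-1.1.3-2.4-s_2.11.jar"

  val dependencies: Seq[ModuleID] = Seq(
    // Hypothetical exclusions: drop transitive deps already provided by the
    // Spark runtime so they do not clash on the cluster classpath.
    shcCore.excludeAll(
      ExclusionRule(organization = "org.apache.spark"),
      ExclusionRule(organization = "org.scala-lang")
    )
  )
}

Once the dependency resolves, one quick way to smoke-test the connector is through shc-core's standard entry point, HBaseTableCatalog. The catalog JSON, table name, and column mapping below are made up for illustration; adjust them to your schema:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog

object ShcSmokeTest {
  // Hypothetical catalog describing a small table with one column family.
  val catalog: String =
    """{
      |  "table": {"namespace": "default", "name": "test_table"},
      |  "rowkey": "key",
      |  "columns": {
      |    "col0": {"cf": "rowkey", "col": "key", "type": "string"},
      |    "col1": {"cf": "cf1", "col": "col1", "type": "string"}
      |  }
      |}""".stripMargin

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("shc-smoke-test").getOrCreate()
    // Read through shc-core's data source; the format string is shc's package name.
    val df = spark.read
      .options(Map(HBaseTableCatalog.tableCatalog -> catalog))
      .format("org.apache.spark.sql.execution.datasources.hbase")
      .load()
    df.show()
    spark.stop()
  }
}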