Dataproc dependency conflict - google-api-client

I'm building a library for fetching encrypted secrets from cloud storage (in Scala, using the Java clients). I'm using the following Google libraries:

"com.google.apis"  % "google-api-services-cloudkms" % "v1-rev26-1.23.0" exclude("com.google.guava", "guava-jdk5"),
"com.google.cloud" % "google-cloud-storage"         % "1.14.0",

Everything works fine locally, but when I try to run my code on Dataproc I get the following error:

Exception in thread "main" java.lang.NoSuchMethodError: com.google.api.client.googleapis.services.json.AbstractGoogleJsonClient$Builder.setBatchPath(Ljava/lang/String;)Lcom/google/api/client/googleapis/services/AbstractGoogleClient$Builder;
    at com.google.api.services.cloudkms.v1.CloudKMS$Builder.setBatchPath(CloudKMS.java:4250)
    at com.google.api.services.cloudkms.v1.CloudKMS$Builder.<init>(CloudKMS.java:4229)
    at gcp.encryption.EncryptedSecretsUser$class.clients(EncryptedSecretsUser.scala:111)
    at gcp.encryption.EncryptedSecretsUser$class.getEncryptedSecrets(EncryptedSecretsUser.scala:62)

The offending line in my code is:

val kms: CloudKMS = new CloudKMS.Builder(credential.getTransport,
      credential.getJsonFactory,
      credential)
      .setApplicationName("Encrypted Secrets User")
      .build()

I see in the documentation that some Google libraries are already available on Dataproc (I'm using a Spark cluster with image version 1.2.15). But as far as I can tell, the transitive dependency on google-api-client is the same version I use locally (1.23.0). So how can the method not be found?
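
(As a hypothetical sanity check, not something in my actual job, printing where the JVM loads the conflicting builder class from would show whether the 1.23.0 jar is really the one on the cluster's classpath:)

// Illustrative Scala snippet: log which jar AbstractGoogleJsonClient$Builder comes from,
// to compare the local classpath with the one on the Dataproc cluster.
val builderClass = Class.forName(
  "com.google.api.client.googleapis.services.json.AbstractGoogleJsonClient$Builder")
println(builderClass.getProtectionDomain.getCodeSource.getLocation)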

Should I set up my dependencies differently for running on Dataproc?

EDIT

I finally solved this in another project. It turns out that besides shading all the Google dependencies (including the gcs-connector!!), you also have to register your shaded class with the JVM as the handler for the gs:// file system. Below is the Maven configuration that worked for me; something similar can be achieved with sbt (a rough sbt-assembly sketch follows after the Maven configuration):

Parent POM:

<project xmlns="http://maven.apache.org/POM/4.0.0"...>
...
<properties>
    <!-- Spark version -->
    <spark.version>[2.2.1]</spark.version>
    <!-- Jackson-libs version pulled in by spark -->
    <jackson.version>[2.6.5]</jackson.version>
    <!-- Avro version pulled in by jackson -->
    <avro.version>[1.7.7]</avro.version>
    <!-- Kryo-shaded version pulled in by spark -->
    <kryo.version>[3.0.3]</kryo.version>
    <!-- Apache commons-lang version pulled in by spark -->
    <commons.lang.version>2.6</commons.lang.version>

    <!-- TODO: need to shade google libs because of version-conflicts on Dataproc. Remove this when Dataproc 1.3/2.0 is released -->
    <bigquery-conn.version>[0.10.6-hadoop2]</bigquery-conn.version>
    <gcs-conn.version>[1.6.5-hadoop2]</gcs-conn.version>
    <google-storage.version>[1.29.0]</google-storage.version>
    <!-- The guava version we want to use -->
    <guava.version>[23.2-jre]</guava.version>
    <!-- The google api version used by the google-cloud-storage lib -->
    <api-client.version>[1.23.0]</api-client.version>
    <!-- The google-api-services-storage version used by the google-cloud-storage lib -->
    <storage-api.version>[v1-rev114-1.23.0]</storage-api.version>

    <!-- Picked up by compiler and resource plugins -->
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>

...

<build>
    <pluginManagement>
        <plugins>
...

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.1.1</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <minimizeJar>true</minimizeJar>
                        <filters>
                            <filter>
                                <artifact>com.google.**:*</artifact>
                                <includes>
                                    <include>**</include>
                                </includes>
                            </filter>
                            <filter>
                                <artifact>com.google.cloud.bigdataoss:gcs-connector</artifact>
                                <excludes>
                                    <!-- Register a provider with the shaded name instead-->
                                    <exclude>META-INF/services/org.apache.hadoop.fs.FileSystem</exclude>
                                </excludes>
                            </filter>
                        </filters>
                        <artifactSet>
                            <includes>
                                <include>com.google.*:*</include>
                            </includes>
                            <excludes>
                                <exclude>com.google.code.findbugs:jsr305</exclude>
                            </excludes>
                        </artifactSet>
                        <relocations>
                            <relocation>
                                <pattern>com.google</pattern>
                                <shadedPattern>com.shaded.google</shadedPattern>
                            </relocation>
                        </relocations>
                    </configuration>
                </execution>
            </executions>
        </plugin>
...
        </plugins>
    </pluginManagement>
</build>

<dependencyManagement>
    <dependencies>
        ...
        <dependency>
            <groupId>com.google.cloud.bigdataoss</groupId>
            <artifactId>gcs-connector</artifactId>
            <version>${gcs-conn.version}</version>
            <exclusions>
                <!-- conflicts with Spark dependencies -->
                <exclusion>
                    <groupId>org.apache.hadoop</groupId>
                    <artifactId>hadoop-common</artifactId>
                </exclusion>
                <!-- conflicts with Spark dependencies -->
                <exclusion>
                    <groupId>org.apache.hadoop</groupId>
                    <artifactId>hadoop-mapreduce-client-core</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>com.google.guava</groupId>
                    <artifactId>guava</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <!-- Avoid conflict with the version pulled in by the GCS-connector on Dataproc -->
            <groupId>com.google.apis</groupId>
            <artifactId>google-api-services-storage</artifactId>
            <version>${storage-api.version}</version>
        </dependency>
        <dependency>
            <groupId>commons-lang</groupId>
            <artifactId>commons-lang</artifactId>
            <version>${commons.lang.version}</version>
        </dependency>
        <dependency>
            <groupId>com.esotericsoftware</groupId>
            <artifactId>kryo-shaded</artifactId>
            <version>${kryo.version}</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>${jackson.version}</version>
        </dependency>
        <dependency>
            <groupId>com.google.api-client</groupId>
            <artifactId>google-api-client</artifactId>
            <version>${api-client.version}</version>
        </dependency>
        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>${guava.version}</version>
        </dependency>
    </dependencies>
</dependencyManagement>

<dependencies>
    <dependency>
        <groupId>com.google.cloud</groupId>
        <artifactId>google-cloud-storage</artifactId>
        <version>${google-storage.version}</version>
        <exclusions>
            <!-- conflicts with Spark dependencies -->
            <exclusion>
                <groupId>com.google.guava</groupId>
                <artifactId>guava</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
    </dependency>
...
</dependencies>

...
</project>

Child POM:

<dependencies>
    <!-- Libraries available on dataproc -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>com.google.cloud.bigdataoss</groupId>
        <artifactId>gcs-connector</artifactId>
    </dependency>
    <dependency>
        <groupId>com.esotericsoftware</groupId>
        <artifactId>kryo-shaded</artifactId>
        <scope>provided</scope><!-- Pulled in by spark -->
    </dependency>
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
        <scope>provided</scope><!-- Pulled in by spark -->
    </dependency>
</dependencies>

And add a file named org.apache.hadoop.fs.FileSystem under path/to/your-project/src/main/resources/META-INF/services, containing the name of your shaded class, e.g.:

# WORKAROUND FOR DEPENDENCY CONFLICTS ON DATAPROC
#
# Use the shaded class as a provider for the gs:// file system
#

com.shaded.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem

(Note that this file is filtered out of the gcs-connector library in the parent POM)
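
For sbt, a rough equivalent of the shading above can be sketched with the sbt-assembly plugin. I only verified the Maven setup on Dataproc, so treat the following as an untested outline; the relocation prefix and the service-file handling simply mirror the Maven configuration:

// build.sbt (sketch, assuming sbt-assembly is on the plugin classpath)
assemblyShadeRules in assembly := Seq(
  // Relocate every com.google class, including those in the gcs-connector,
  // just like the Maven <relocation> block above.
  ShadeRule.rename("com.google.**" -> "com.shaded.google.@1").inAll
)

assemblyMergeStrategy in assembly := {
  // Keep our own META-INF/services/org.apache.hadoop.fs.FileSystem (the copy under
  // src/main/resources that names the shaded GoogleHadoopFileSystem) rather than the
  // ones shipped inside the library jars; this assumes project resources are ordered
  // ahead of dependency jars.
  case PathList("META-INF", "services", "org.apache.hadoop.fs.FileSystem") =>
    MergeStrategy.first
  case other =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(other)
}

As with Maven, the relocation itself does not rewrite the service file inside the gcs-connector jar, which is why the hand-written copy with the shaded class name is still needed.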

It may not be obvious, but the google-api-client version in the latest stable GCS connector is actually 1.20.0.

The reason is that this was the commit which rolled the api client version forward to 1.23.0, and it was part of a series of commits including this dependency-shading commit; the overall goal was to stop leaking transitive dependencies onto the job classpath, precisely to avoid version-conflict problems like this in the future, at the cost of everyone having to carry their own fat jar containing the full api client dependencies.

However, it turned out that many people had come to depend on the GCS connector providing the api client on the classpath, so production workloads would not have survived that change in a minor-version upgrade; the upgraded GCS connector, which uses 1.23.0 but also shades it so that it no longer appears on the job classpath, is therefore being held back for a future Dataproc 1.3+ or 2.0+ release.

In your case, you could try using the 1.20.0 versions of the dependencies (you may also have to downgrade the version of the google-cloud-storage dependency you include, though a build against 1.22.0 might still work, assuming no breaking changes, since setBatchPath was indeed only introduced in 1.23.0); otherwise, you can try to shade all your own dependencies using sbt-assembly.

We can verify that setBatchPath was only introduced in 1.23.0:

$ javap -cp google-api-client-1.22.0.jar com.google.api.client.googleapis.services.AbstractGoogleClient.Builder | grep set
  public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setRootUrl(java.lang.String);
  public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setServicePath(java.lang.String);
  public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setGoogleClientRequestInitializer(com.google.api.client.googleapis.services.GoogleClientRequestInitializer);
  public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setHttpRequestInitializer(com.google.api.client.http.HttpRequestInitializer);
  public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setApplicationName(java.lang.String);
  public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setSuppressPatternChecks(boolean);
  public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setSuppressRequiredParameterChecks(boolean);
  public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setSuppressAllChecks(boolean);

$ javap -cp google-api-client-1.23.0.jar com.google.api.client.googleapis.services.AbstractGoogleClient.Builder | grep set
  public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setRootUrl(java.lang.String);
  public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setServicePath(java.lang.String);
  public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setBatchPath(java.lang.String);
  public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setGoogleClientRequestInitializer(com.google.api.client.googleapis.services.GoogleClientRequestInitializer);
  public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setHttpRequestInitializer(com.google.api.client.http.HttpRequestInitializer);
  public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setApplicationName(java.lang.String);
  public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setSuppressPatternChecks(boolean);
  public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setSuppressRequiredParameterChecks(boolean);
  public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setSuppressAllChecks(boolean);