Spark application has thrown java.lang.NoSuchMethodError: javax.ws.rs.core.Response.readEntity(Ljava/lang/Class;)Ljava/lang/Object;
I have an application in Java that uses Spark and HBase. We need to hit a URL deployed in Tomcat (Jersey), so we used the RESTEasy client to do that.
When I execute standalone Java code to hit the URL using the RESTEasy client, it works fine.
However, when I use the same code inside another application that uses Spark for some processing, it throws the error shown in the title.

I am using Maven as the build tool in Eclipse. After building, I create a runnable JAR and choose the option "extract required libraries into generated jar". To execute the application, I use the command:
nohup spark-submit --master yarn-client myWork.jar myProperties 0 &
Dependencies of the RESTEasy client code:
<dependencies>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>3.8.1</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.jboss.resteasy</groupId>
        <artifactId>resteasy-client</artifactId>
        <version>3.0.11.Final</version>
    </dependency>
</dependencies>
I can't figure it out: at compile time it doesn't throw any error, but at runtime, even though the JAR contains every library (including the Spark and HBase ones), it throws an error saying there is no such method. Please help.
I have tried changing the version of resteasy-client, but it didn't help. At compile time I can see the class; how come it is missing at runtime?
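One way to confirm which JAX-RS-related artifacts the build actually resolves is the standard Maven dependency plugin; a sketch (the filter patterns are illustrative):

# Print the resolved dependency tree, filtered to the RESTEasy and JAX-RS group IDs.
mvn dependency:tree -Dincludes=org.jboss.resteasy
mvn dependency:tree -Dincludes=javax.ws.rs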
Possible causes could be:

1) If you are using a Maven scope, it might be provided, in which case your JAR would not be copied into your distribution. The configuration you described above rules this case out.

2) You are not pointing to the correct location from your execution script, which may be a shell script.

3) You are not passing this JAR via the --jars option, or via --driver-class-path / spark.executor.extraClassPath, etc.
I suspect the issue is due to the second or third cause.

Also have a look at https://spark.apache.org/docs/1.4.1/submitting-applications.html
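For example, a minimal sketch of option 3, assuming the RESTEasy client JAR sits next to the application JAR (file names are illustrative):

# Ship the dependency explicitly: --jars distributes it to the executors,
# --driver-class-path puts it on the driver's classpath.
spark-submit --master yarn-client \
  --jars resteasy-client-3.0.11.Final.jar \
  --driver-class-path resteasy-client-3.0.11.Final.jar \
  myWork.jar myProperties 0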
Edit:
Question: I tried
spark-submit --conf spark.driver.extraClassPath=surfer/javax.ws.rs-api-2.0.1.jar:surfer/jersey-client-2.25.jar:surfer/jersey-common-2.25.jar:surfer/hk2-api-2.5.0-b30.jar:surfer/jersey-guava-2.25.jar:surfer/hk2-utils-2.5.0-b30.jar:surfer/hk2-locator-2.5.0-b30.jar:surfer/javax.annotation-api-1.2.jar artifact.jar againHere.csv
Now it throws a different exception:
Exception in thread "main" java.lang.AbstractMethodError: javax.ws.rs.core.UriBuilder.uri(Ljava/lang/String;)Ljavax/ws/rs/core/UriBuilder;

I have also tried searching for the class Response$Status$Family somewhere on the classpath other than what I am supplying. I used the command
grep 'Response$Status$Family.class' /opt/mapr/spark/spark-1.4.1/lib/*.jar
and found that Spark also has this class. Maybe this is the issue, but how to forcefully tell the JVM to use the class supplied by me at runtime and not Spark's, I don't know! Can you help?
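As an aside, grep against raw JAR bytes only shows that a match exists somewhere; to list exactly which JARs bundle the class, a sketch like this is more precise (standard unzip and grep; the path is taken from the question):

# Print every Spark JAR that contains the JAX-RS Response$Status$Family class.
for j in /opt/mapr/spark/spark-1.4.1/lib/*.jar; do
  # -F treats the pattern literally; single quotes stop the shell
  # from expanding $Status and $Family as variables.
  if unzip -l "$j" 2>/dev/null | grep -qF 'javax/ws/rs/core/Response$Status$Family.class'; then
    echo "$j"
  fi
done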
Since you are providing an external JAR on the classpath, you can tell the framework that it must use the external JAR you supplied. This can be done in two ways:

- through spark-submit
- through conf.set...

Since you are using 1.4.1, refer to these configuration options:
spark.executor.userClassPathFirst (default: false)
(Experimental) Same functionality as spark.driver.userClassPathFirst, but applied to executor instances.

spark.driver.userClassPathFirst (default: false)
(Experimental) Whether to give user-added jars precedence over Spark's own jars when loading classes in the driver. This feature can be used to mitigate conflicts between Spark's dependencies and user dependencies. It is currently an experimental feature. This is used in cluster mode only.
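A minimal sketch of the spark-submit route, reusing file names from the question (paths are illustrative; note the docs caveat above that the driver-side setting takes effect in cluster mode only):

# Ask Spark to prefer user-supplied classes over its own bundled copies.
spark-submit --master yarn-client \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --jars surfer/javax.ws.rs-api-2.0.1.jar \
  myWork.jar myProperties 0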