Error sending result RpcResponse / closing connection - Datastax Enterprise

I'm running a Scala application, built with Maven, that uses spark-submit to send a fat jar from a client node to a Datastax Enterprise cluster (on Azure).

Everything seems to run fine, and the job is indeed submitted to the Spark Worker/Master, but at some point it starts throwing the following over and over and never exits:

[rpc-server-17-1] ERROR org.apache.spark.network.server.TransportRequestHandler - Error sending result RpcResponse{requestId=6159268836916637242, body=NioManagedBuffer{buf=java.nio.HeapByteBuffer[pos=0 lim=47 cap=64]}} to /10.0.0.4:40852; closing connection
java.lang.AbstractMethodError: null
    at io.netty.util.ReferenceCountUtil.touch(ReferenceCountUtil.java:77)
    at io.netty.channel.DefaultChannelPipeline.touch(DefaultChannelPipeline.java:116)
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:810)
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:723)
    at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:111)
    at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:738)
    at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:730)
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:816)
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:723)
    at io.netty.handler.timeout.IdleStateHandler.write(IdleStateHandler.java:302)
    at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:738)
    at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:730)
    at io.netty.channel.AbstractChannelHandlerContext.access$000(AbstractChannelHandlerContext.java:38)
    at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.write(AbstractChannelHandlerContext.java:1081)
    at io.netty.channel.AbstractChannelHandlerContext$WriteAndFlushTask.write(AbstractChannelHandlerContext.java:1128)
    at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.run(AbstractChannelHandlerContext.java:1070)
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:465)
    at io.netty.util.concurrent.SingleThreadEventExecutor.run(SingleThreadEventExecutor.java:884)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.lang.Thread.run(Thread.java:748)
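
For reference, a quick way to see which Netty versions actually end up on the classpath is Maven's standard dependency-tree goal (the -Dincludes filter just narrows the output to io.netty artifacts):

    # List every io.netty artifact this build resolves; mixed versions
    # here are the usual cause of AbstractMethodError at runtime.
    mvn dependency:tree -Dincludes=io.netty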

All dependencies download fine and the project compiles without issues. I have the following in my pom.xml:

<dependency>
  <groupId>com.datastax.dse</groupId>
  <artifactId>dse-spark-dependencies</artifactId>
  <version>6.7.1</version>
  <scope>provided</scope>
</dependency>

<dependency>
  <groupId>com.datastax.dse</groupId>
  <artifactId>dse-byos_2.11</artifactId>
  <version>6.7.1</version>
  <scope>provided</scope>
  <exclusions>
    <exclusion>
      <groupId>io.netty</groupId>
      <artifactId>*</artifactId>
    </exclusion>
  </exclusions>
</dependency>

<dependency>
  <groupId>com.datastax.dse</groupId>
  <artifactId>spark-connector</artifactId>
  <version>6.7.1</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.solr</groupId>
      <artifactId>solr-solrj</artifactId>
    </exclusion>
    <exclusion>
      <groupId>io.netty</groupId>
      <artifactId>*</artifactId>
    </exclusion>
  </exclusions>
</dependency>

<dependency>
  <groupId>com.datastax.oss</groupId>
  <artifactId>java-driver-core-shaded</artifactId>
  <version>4.0.0</version>
</dependency>

<!--
<dependency>
  <groupId>io.netty</groupId>
  <artifactId>netty-transport</artifactId>
  <version>4.1.25.4.dse</version>
</dependency>
<dependency>
  <groupId>io.netty</groupId>
  <artifactId>netty-transport-native-epoll</artifactId>
  <version>4.1.25.Final</version>
  <classifier>linux-x86_64</classifier>
</dependency>
-->

Also, based on this FAQ https://docs.datastax.com/en/developer/java-driver/3.3/faq/ I tried passing the parameter to force NIO (FORCE_NIO), but it made no difference at all.

I even tried running the application like this:

dse -u cassandra -p pass spark-submit --conf "spark.driver.extraClassPath=$(dse spark-classpath)" --class my.package.bde.TestSparkApp target/big-data-engine-0.0.1-jar-with-dependencies.jar -Dcom.datastax.driver.FORCE_NIO=true

but it throws a different error:

Caused by: java.lang.NoClassDefFoundError: Could not initialize class io.netty.channel.epoll.EpollEventLoop
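
Worth noting as an aside: with spark-submit, anything placed after the application jar is passed to the application's main method as a program argument, not to the JVM, so a -D system property in that position never reaches the driver JVM. A sketch of the same command with the property moved into the driver's JVM options instead:

    dse -u cassandra -p pass spark-submit \
      --conf "spark.driver.extraClassPath=$(dse spark-classpath)" \
      --conf "spark.driver.extraJavaOptions=-Dcom.datastax.driver.FORCE_NIO=true" \
      --class my.package.bde.TestSparkApp \
      target/big-data-engine-0.0.1-jar-with-dependencies.jar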

Your build does indeed have incorrect dependencies, as was already pointed out in a previous answer. You need to keep only

<dependency>
  <groupId>com.datastax.dse</groupId>
  <artifactId>dse-spark-dependencies</artifactId>
  <version>6.7.1</version>
  <scope>provided</scope>
</dependency>

and remove everything else: byos, spark-connector, netty, java-driver-core, and so on.
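
Since dse-spark-dependencies is provided-scope, the fat jar then stops bundling Netty altogether (provided dependencies are excluded from a jar-with-dependencies assembly), so the cluster's own Netty is used at runtime, which is what avoids the AbstractMethodError. A minimal sketch of the matching assembly-plugin configuration, assuming the standard jar-with-dependencies descriptor the build already appears to use:

<!-- Sketch: builds *-jar-with-dependencies.jar at package time;
     provided-scope artifacts (the whole DSE/Spark stack, Netty included)
     stay out of the fat jar and are supplied by the cluster. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>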