Failed to send RPC XXXX in spark-shell Hadoop 3.2.1 and spark 3.0.0

I am trying to run spark-shell in pseudo-distributed mode on my Windows 10 machine with 8 GB of RAM. I can submit and run a MapReduce wordcount on YARN, but when I try to start a spark-shell or spark-submit any program with master set to yarn, it fails with a "Failed to send RPC" error. The full error is further below.
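For reference, the launches that fail are essentially the following (shown in minimal form; the class and jar names in the second line are just placeholders):

    spark-shell --master yarn --deploy-mode client
    spark-submit --master yarn --class <MainClass> <app-jar>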

Below is my yarn-site.xml configuration:

    <configuration>
      <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
      </property>
      <property>
        <name>yarn.nodemanager.auxservices.mapreduce.shuffle.class</name>
        <value>org.apache.hadoop.mapred.ShuffleHandler</value>
      </property>
      <property>
        <name>yarn.nodemanager.local-dirs</name>
        <value>C:\study\hadoop-3.2.1\data\nodemanager</value>
      </property>
      <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>127.0.0.1</value>
      </property>
      <property>
        <name>yarn.acl.enable</name>
        <value>0</value>
      </property>
      <property>
        <name>yarn.nodemanager.env-whitelist</name>
        <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PERPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_MAPRED_HOME</value>
      </property>
      <property>
        <name>yarn.nodemanager.vmem-check-enabled</name>
        <value>false</value>
      </property>
      <property>
        <name>yarn.nodemanager.pmem-check-enabled</name>
        <value>false</value>
      </property>

      <!-- Site specific YARN configuration properties -->

    </configuration>

From my initial investigation, this seems to be caused by netty calling the AbstractFileRegion.transferred() method in Spark's network utilities, which apparently does not exist at runtime... The full error follows.

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/study/hadoop-3.2.1/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/study/hadoop-3.2.1/data/nodemanager/usercache/Administrator/appcache/application_1609008428682_0006/container_1609008428682_0006_01_000001/__spark_libs__/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2020-12-27 01:27:52,370 WARN util.Shell: Did not find winutils.exe: {}
java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
    at org.apache.hadoop.util.Shell.fileNotFoundException(Shell.java:548)
    at org.apache.hadoop.util.Shell.getHadoopHomeDir(Shell.java:569)
    at org.apache.hadoop.util.Shell.getQualifiedBin(Shell.java:592)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:689)
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78)
    at org.apache.hadoop.yarn.conf.YarnConfiguration.<clinit>(YarnConfiguration.java:1159)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:858)
    at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:921)
    at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
Caused by: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
    at org.apache.hadoop.util.Shell.checkHadoopHomeInner(Shell.java:468)
    at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:439)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:516)
    ... 5 more
2020-12-27 01:27:52,776 INFO spark.SecurityManager: Changing view acls to: Administrator
2020-12-27 01:27:52,777 INFO spark.SecurityManager: Changing modify acls to: Administrator
2020-12-27 01:27:52,778 INFO spark.SecurityManager: Changing view acls groups to: 
2020-12-27 01:27:52,779 INFO spark.SecurityManager: Changing modify acls groups to: 
2020-12-27 01:27:52,780 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(Administrator); groups with view permissions: Set(); users  with modify permissions: Set(Administrator); groups with modify permissions: Set()
2020-12-27 01:27:53,417 INFO yarn.ApplicationMaster: ApplicationAttemptId: appattempt_1609008428682_0006_000001
2020-12-27 01:27:54,627 INFO client.RMProxy: Connecting to ResourceManager at /127.0.0.1:8030
2020-12-27 01:27:54,727 INFO yarn.YarnRMClient: Registering the ApplicationMaster
2020-12-27 01:27:55,305 INFO client.TransportClientFactory: Successfully created connection to LAPTOP-GQ2OL7O9/192.168.0.106:56588 after 137 ms (0 ms spent in bootstraps)
2020-12-27 01:27:55,341 ERROR client.TransportClient: Failed to send RPC RPC 6402554451456766428 to LAPTOP-GQ2OL7O9/192.168.0.106:56588: io.netty.channel.socket.ChannelOutputShutdownException: Channel output shutdown
io.netty.channel.socket.ChannelOutputShutdownException: Channel output shutdown
    at io.netty.channel.AbstractChannel$AbstractUnsafe.shutdownOutput(AbstractChannel.java:587)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:893)
    at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.flush0(AbstractNioChannel.java:313)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.flush(AbstractChannel.java:847)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.flush(DefaultChannelPipeline.java:1264)
    at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
    at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
    at io.netty.channel.AbstractChannelHandlerContext.flush(AbstractChannelHandlerContext.java:743)
    at io.netty.channel.ChannelOutboundHandlerAdapter.flush(ChannelOutboundHandlerAdapter.java:115)
    at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
    at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
    at io.netty.channel.AbstractChannelHandlerContext.flush(AbstractChannelHandlerContext.java:743)
    at io.netty.channel.ChannelDuplexHandler.flush(ChannelDuplexHandler.java:117)
    at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
    at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
    at io.netty.channel.AbstractChannelHandlerContext.access00(AbstractChannelHandlerContext.java:35)
    at io.netty.channel.AbstractChannelHandlerContext$WriteAndFlushTask.write(AbstractChannelHandlerContext.java:1116)
    at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.run(AbstractChannelHandlerContext.java:1050)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:464)
    at io.netty.util.concurrent.SingleThreadEventExecutor.run(SingleThreadEventExecutor.java:131)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NoSuchMethodError: org.apache.spark.network.util.AbstractFileRegion.transferred()J
    at org.apache.spark.network.util.AbstractFileRegion.transfered(AbstractFileRegion.java:28)
    at io.netty.channel.nio.AbstractNioByteChannel.doWrite(AbstractNioByteChannel.java:228)
    at io.netty.channel.socket.nio.NioSocketChannel.doWrite(NioSocketChannel.java:282)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:879)
    ... 21 more
2020-12-27 01:27:55,353 ERROR yarn.ApplicationMaster: Uncaught exception: 
org.apache.spark.SparkException: Exception thrown in awaitResult: 
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:302)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
    at org.apache.spark.deploy.yarn.ApplicationMaster.runExecutorLauncher(ApplicationMaster.scala:547)
    at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:266)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon.run(ApplicationMaster.scala:890)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon.run(ApplicationMaster.scala:889)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:889)
    at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:921)
    at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
Caused by: java.io.IOException: Failed to send RPC RPC 6402554451456766428 to LAPTOP-GQ2OL7O9/192.168.0.106:56588: io.netty.channel.socket.ChannelOutputShutdownException: Channel output shutdown
    at org.apache.spark.network.client.TransportClient$RpcChannelListener.handleFailure(TransportClient.java:363)
    at org.apache.spark.network.client.TransportClient$StdChannelListener.operationComplete(TransportClient.java:340)
    at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507)
    at io.netty.util.concurrent.DefaultPromise.notifyListeners0(DefaultPromise.java:500)
    at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:479)
    at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:420)
    at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:122)
    at io.netty.util.internal.PromiseNotificationUtil.tryFailure(PromiseNotificationUtil.java:64)
    at io.netty.channel.ChannelOutboundBuffer.safeFail(ChannelOutboundBuffer.java:680)
    at io.netty.channel.ChannelOutboundBuffer.remove0(ChannelOutboundBuffer.java:294)
    at io.netty.channel.ChannelOutboundBuffer.failFlushed(ChannelOutboundBuffer.java:617)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.closeOutboundBufferForShutdown(AbstractChannel.java:627)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.shutdownOutput(AbstractChannel.java:620)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:893)
    at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.flush0(AbstractNioChannel.java:313)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.flush(AbstractChannel.java:847)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.flush(DefaultChannelPipeline.java:1264)
    at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
    at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
    at io.netty.channel.AbstractChannelHandlerContext.flush(AbstractChannelHandlerContext.java:743)
    at io.netty.channel.ChannelOutboundHandlerAdapter.flush(ChannelOutboundHandlerAdapter.java:115)
    at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
    at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
    at io.netty.channel.AbstractChannelHandlerContext.flush(AbstractChannelHandlerContext.java:743)
    at io.netty.channel.ChannelDuplexHandler.flush(ChannelDuplexHandler.java:117)
    at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
    at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
    at io.netty.channel.AbstractChannelHandlerContext.access00(AbstractChannelHandlerContext.java:35)
    at io.netty.channel.AbstractChannelHandlerContext$WriteAndFlushTask.write(AbstractChannelHandlerContext.java:1116)
    at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.run(AbstractChannelHandlerContext.java:1050)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:464)
    at io.netty.util.concurrent.SingleThreadEventExecutor.run(SingleThreadEventExecutor.java:131)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
    at java.lang.Thread.run(Thread.java:748)
Caused by: io.netty.channel.socket.ChannelOutputShutdownException: Channel output shutdown
    at io.netty.channel.AbstractChannel$AbstractUnsafe.shutdownOutput(AbstractChannel.java:587)
    ... 22 more
Caused by: java.lang.NoSuchMethodError: org.apache.spark.network.util.AbstractFileRegion.transferred()J
    at org.apache.spark.network.util.AbstractFileRegion.transfered(AbstractFileRegion.java:28)
    at io.netty.channel.nio.AbstractNioByteChannel.doWrite(AbstractNioByteChannel.java:228)
    at io.netty.channel.socket.nio.NioSocketChannel.doWrite(NioSocketChannel.java:282)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:879)
    ... 21 more
2020-12-27 01:27:55,357 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 10, (reason: Uncaught exception: org.apache.spark.SparkException: Exception thrown in awaitResult: 
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:302)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
    at org.apache.spark.deploy.yarn.ApplicationMaster.runExecutorLauncher(ApplicationMaster.scala:547)
    at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:266)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon.run(ApplicationMaster.scala:890)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon.run(ApplicationMaster.scala:889)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:889)
    at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:921)
    at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
Caused by: java.io.IOException: Failed to send RPC RPC 6402554451456766428 to LAPTOP-GQ2OL7O9/192.168.0.106:56588: io.netty.channel.socket.ChannelOutputShutdownException: Channel output shutdown
    at org.apache.spark.network.client.TransportClient$RpcChannelListener.handleFailure(TransportClient.java:363)
    at org.apache.spark.network.client.TransportClient$StdChannelListener.operationComplete(TransportClient.java:340)
    at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507)
    at io.netty.util.concurrent.DefaultPromise.notifyListeners0(DefaultPromise.java:500)
    at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:479)
    at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:420)
    at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:122)
    at io.netty.util.internal.PromiseNotificationUtil.tryFailure(PromiseNotificationUtil.java:64)
    at io.netty.channel.ChannelOutboundBuffer.safeFail(ChannelOutboundBuffer.java:680)
    at io.netty.channel.ChannelOutboundBuffer.remove0(ChannelOutboundBuffer.java:294)
    at io.netty.channel.ChannelOutboundBuffer.failFlushed(ChannelOutboundBuffer.java:617)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.closeOutboundBufferForShutdown(AbstractChannel.java:627)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.shutdownOutput(AbstractChannel.java:620)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:893)
    at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.flush0(AbstractNioChannel.java:313)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.flush(AbstractChannel.java:847)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.flush(DefaultChannelPipeline.java:1264)
    at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
    at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
    at io.netty.channel.AbstractChannelHandlerContext.flush(AbstractChannelHandlerContext.java:743)
    at io.netty.channel.ChannelOutboundHandlerAdapter.flush(ChannelOutboundHandlerAdapter.java:115)
    at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
    at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
    at io.netty.channel.AbstractChannelHandlerContext.flush(AbstractChannelHandlerContext.java:743)
    at io.netty.channel.ChannelDuplexHandler.flush(ChannelDuplexHandler.java:117)
    at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
    at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
    at io.netty.channel.AbstractChannelHandlerContext.access00(AbstractChannelHandlerContext.java:35)
    at io.netty.channel.AbstractChannelHandlerContext$WriteAndFlushTask.write(AbstractChannelHandlerContext.java:1116)
    at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.run(AbstractChannelHandlerContext.java:1050)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:464)
    at io.netty.util.concurrent.SingleThreadEventExecutor.run(SingleThreadEventExecutor.java:131)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
    at java.lang.Thread.run(Thread.java:748)
Caused by: io.netty.channel.socket.ChannelOutputShutdownException: Channel output shutdown
    at io.netty.channel.AbstractChannel$AbstractUnsafe.shutdownOutput(AbstractChannel.java:587)
    ... 22 more
Caused by: java.lang.NoSuchMethodError: org.apache.spark.network.util.AbstractFileRegion.transferred()J
    at org.apache.spark.network.util.AbstractFileRegion.transfered(AbstractFileRegion.java:28)
    at io.netty.channel.nio.AbstractNioByteChannel.doWrite(AbstractNioByteChannel.java:228)
    at io.netty.channel.socket.nio.NioSocketChannel.doWrite(NioSocketChannel.java:282)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:879)
    ... 21 more
)
2020-12-27 01:27:55,368 INFO util.ShutdownHookManager: Shutdown hook called

Nothing I have found on the internet seems to help with this... Thanks in advance.

Caused by: java.lang.NoSuchMethodError: org.apache.spark.network.util.AbstractFileRegion.transferred()J
    at org.apache.spark.network.util.AbstractFileRegion.transfered(AbstractFileRegion.java:28)
    at io.netty.channel.nio.AbstractNioByteChannel.doWrite(AbstractNioByteChannel.java:228)
    at io.netty.channel.socket.nio.NioSocketChannel.doWrite(NioSocketChannel.java:282)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:879)
    ... 21 more

It looks like you have multiple versions on your classpath. Make sure there is only one version on the classpath (and that it is the correct one).
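A quick way to see the duplicates in this particular setup (a sketch that assumes the Hadoop path from the question and a SPARK_HOME variable pointing at the Spark 3.0.0 install) is to list the netty jars each side puts on the classpath:

    rem netty jars bundled with Hadoop, searched recursively under share\hadoop
    dir /s /b C:\study\hadoop-3.2.1\share\hadoop\*netty*.jar

    rem netty jars bundled with Spark
    dir /b %SPARK_HOME%\jars\*netty*.jar

If the two lists show different netty versions, the YARN containers are likely loading the older one ahead of Spark's, which is exactly the "multiple versions" situation described above; keeping a single, matching netty version across both installs is what "only one version on the classpath" means in practice here.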