SparkR and PySpark throw java.net.BindException on launch, but spark-shell does not?

I have already tried setting SPARK_LOCAL_IP to "127.0.0.1" and checked whether the port is already in use. Here is the full error text:

Launching java with spark-submit command /usr/hdp/2.4.0.0-169/spark/bin/spark-submit "sparkr-shell" /tmp/RtmpZo44il/backend_port998540c56917
/usr/hdp/2.4.0.0-169/spark/bin/load-spark-env.sh: line 72: export: `load-spark-env.sh': not a valid identifier
16/06/13 11:28:24 ERROR RBackend: Server shutting down: failed with exception
java.net.BindException: Cannot assign requested address
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:433)
        at sun.nio.ch.Net.bind(Net.java:425)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
        at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
        at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
        at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
        at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
        at io.netty.bootstrap.AbstractBootstrap.run(AbstractBootstrap.java:348)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
        at io.netty.util.concurrent.SingleThreadEventExecutor.run(SingleThreadEventExecutor.java:111)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
        at java.lang.Thread.run(Thread.java:745)
Error in SparkR::sparkR.init() : JVM is not ready after 10 seconds

The above error appears on launching ./bin/sparkR; spark-shell, once again, executes fine.
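For reference, the checks mentioned above amounted to something like this (a sketch; 12345 is a placeholder, not the actual RBackend port, which is picked at runtime):

    # Pin Spark's bind address to the loopback interface before launching SparkR
    export SPARK_LOCAL_IP=127.0.0.1

    # See whether anything is already listening on a suspect port
    # (12345 is a placeholder; substitute the port in question)
    netstat -tln | grep 12345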

Some more information: on launch, spark-shell automatically searches through ports until it finds one that binds without an exception. Yet even when I set the default SparkR backend port to an unused port, it still fails.
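As far as I understand, the port hunting that spark-shell does is Spark's standard bind-retry behaviour: on a bind failure it increments the port and tries again, up to spark.port.maxRetries attempts (default 16). A sketch of raising that limit:

    # spark-shell retries successive ports on a bind failure;
    # the number of attempts is governed by spark.port.maxRetries
    spark-shell --conf spark.port.maxRetries=32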

I found the problem. Another user had deleted my /etc/hosts file. I reconfigured the file with a localhost entry and sparkR now appears to run. I am still curious how spark-shell managed to run without the file.
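The message "Cannot assign requested address" (as opposed to "Address already in use") fits this explanation: the bind address itself, not a busy port, was the problem, presumably because the RBackend could no longer resolve localhost. For completeness, the entry I restored is the standard loopback mapping (a minimal sketch; any extra hostname lines depend on the machine):

    # /etc/hosts
    127.0.0.1   localhost
    ::1         localhost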