A weird error when trying to write to HDFS

When we try to write to HDFS we are getting a strange, intermittent error. This is the exception:

http://pastebin.com/3YDX4a39

The code looks like this:

http://pastebin.com/h1RW07qv

The exception is thrown the first time the fs field is instantiated (line 12). When I call the method MyWatchService.saveInputDataIntoHDFS, the first thing that runs is the static initializer of the MyHadoopUtils class:

    fs = FileSystem.get(myConf);
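
Since the pastebin links may not stay available, here is a minimal sketch of the pattern described above. Only fs, myConf, FileSystem.get and the class name MyHadoopUtils come from the question; the field types and the write helper are assumptions:

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public final class MyHadoopUtils {

        // Hypothetical reconstruction: the configuration is built once and
        // fs is initialized in the static block, as described in the question.
        private static final Configuration myConf = new Configuration();
        private static FileSystem fs;

        static {
            try {
                // fs.defaultFS is normally picked up from core-site.xml on the classpath
                fs = FileSystem.get(myConf);
            } catch (IOException e) {
                throw new ExceptionInInitializerError(e);
            }
        }

        // Hypothetical helper: write a UTF-8 string to the given HDFS path.
        public static void write(String content, String destination) throws IOException {
            try (FSDataOutputStream out = fs.create(new Path(destination))) {
                out.write(content.getBytes(StandardCharsets.UTF_8));
            }
        }

        private MyHadoopUtils() { }
    }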

That FileSystem.get call is what throws the exception, yet in the output around it I can also see these messages:

    [INFO][FeedAdapter][2015-04-08 09:31:21] MyHadoopUtils:29 - HDFS instantiated! name: hdfs://dub-vcd-vms170.global.tektronix.net:8020
    [INFO][FeedAdapter][2015-04-08 09:31:21] MyHadoopUtils:43 - HDFS fs instantiated? true

How can I get rid of the IOException?

I am running this in a Linux environment:

    2.6.32-504.3.3.el6.centos.plus.x86_64, java version "1.7.0_71"
    OpenJDK Runtime Environment (rhel-2.5.3.2.el6_6-x86_64 u71-b14)
    OpenJDK 64-Bit Server VM (build 24.65-b04, mixed mode)

    <hadoop-hdfs.version>2.5.0-cdh5.2.0</hadoop-hdfs.version>
    <hadoop-common.version>2.5.0-cdh5.2.0</hadoop-common.version>

    <!-- necessary to write within HDFS -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>${hadoop-hdfs.version}</version>
    </dependency>

    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoop-common.version}</version>
    </dependency>

Answering my own question: the problem was solved after setting the HADOOP_HOME variable:

    export HADOOP_HOME=/var/lib/hadoop-hdfs
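
For completeness, a small sketch (an assumption about how one might verify this, not part of the original post) that checks whether the JVM actually sees the setting. Hadoop's Shell utility resolves the home directory from the hadoop.home.dir system property first and then falls back to the HADOOP_HOME environment variable:

    public class HadoopHomeCheck {
        public static void main(String[] args) {
            // Hadoop's Shell class looks at -Dhadoop.home.dir first,
            // then at the HADOOP_HOME environment variable.
            System.out.println("hadoop.home.dir = " + System.getProperty("hadoop.home.dir"));
            System.out.println("HADOOP_HOME     = " + System.getenv("HADOOP_HOME"));
        }
    }

If exporting the variable is not convenient, passing -Dhadoop.home.dir=/var/lib/hadoop-hdfs to the JVM should have the same effect.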