ERROR datanode.DataNode: Exception in secureMain (Hadoop configuration)
I am trying to set up Hadoop on Windows and I get this error:
org.apache.hadoop.util.DiskChecker$DiskErrorException: Too many failed volumes - current valid volumes: 0, volumes configured: 1, volumes failed: 1, volume failures tolerated: 0
at org.apache.hadoop.hdfs.server.datanode.checker.StorageLocationChecker.check(StorageLocationChecker.java:233)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2841)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2754)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2798)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2942)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2966)
2021-10-12 11:07:43,633 INFO util.ExitUtil: Exiting with status 1: org.apache.hadoop.util.DiskChecker$DiskErrorException: Too many failed volumes - current valid volumes: 0, volumes configured: 1, volumes failed: 1, volume failures tolerated: 0
2021-10-12 11:07:43,641 INFO datanode.DataNode: SHUTDOWN_MSG:
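The DiskChecker error means that the DataNode's only configured storage directory failed the startup disk check (missing, unwritable, or an unparseable path), leaving zero valid volumes. On Windows this is usually a path-format or permissions problem in dfs.datanode.data.dir rather than an actual failed disk. A minimal sketch of what a working hdfs-site.xml entry might look like, assuming a directory C:\hadoop\data\datanode that exists and is writable (the path and the file:/// URI form are illustrative, not taken from the question):

<property>
  <!-- Point the DataNode at an existing, writable directory.
       The file:/// URI form sidesteps drive-letter parsing issues on Windows. -->
  <name>dfs.datanode.data.dir</name>
  <value>file:///C:/hadoop/data/datanode</value>
</property>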
I have already tried setting:
<property>
  <name>dfs.datanode.failed.volumes.tolerated</name>
  <value>0</value>
</property>
But that only produced other errors, and it still did not work.
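For what it's worth, 0 is already the default for dfs.datanode.failed.volumes.tolerated, so that setting changes nothing; and with only one configured volume, a value of 1 or more is likely to be rejected at startup, because the DataNode requires the tolerated count to be lower than the number of configured volumes. That rejection is plausibly the "other errors" mentioned above. A hypothetical illustration of the variant that gets rejected:

<property>
  <!-- Rejected with a single data dir: tolerated failures must be
       fewer than the number of configured volumes. -->
  <name>dfs.datanode.failed.volumes.tolerated</name>
  <value>1</value>
</property>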
I fixed this problem simply by setting the following in hadoop-env.sh:
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export HADOOP_CLASSPATH=/usr/lib/jvm/java-8-openjdk-amd64/lib/tools.jar
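Note that these are Linux paths, so this fix applies to a Linux or WSL environment; there the underlying cause is Hadoop failing to find a JDK rather than the data volume itself. On native Windows the equivalent setting lives in etc\hadoop\hadoop-env.cmd, roughly as follows (the JDK path is an assumption; use your own install location, ideally one without spaces):

@rem Point Hadoop at the Windows JDK (hypothetical path; adjust to your install).
set JAVA_HOME=C:\Java\jdk1.8.0_202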