Datanode process is not getting started with Hortonworks sandbox manual set up

I am new to the Hortonworks sandbox. I am trying to set it up manually on my system (Ubuntu 14.04) by following this link:

http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.2.4/HDP_Man_Install_v224/index.html#validating_core_hadoop

I ran the command below to start a DataNode:

/usr/hdp/current/hadoop-hdfs-datanode/../hadoop/sbin/hadoop-daemon.sh --config $HADOOP_CONF_DIR start datanode

The DataNode process does not get started, and there are no errors in the .out file either.

Here is the content of the .out file:

ulimit -a for secure datanode user hdfs
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 62916
max locked memory       (kbytes, -l) 64
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 62916
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited

And here is the content of the DataNode log file:

 2015-05-19 10:37:36,626 INFO  datanode.DataNode StringUtils.java:startupShutdownMessage(633)) - STARTUP_MSG: 
  java.jar:/usr/share/java/mysql.jar::/usr/share/java/mysql-connector-java-5.1.28.jar:/usr/share/java/mysql-connector-java.jar:/usr/share/java/mysql.jar:
    STARTUP_MSG:   build = git@github.com:hortonworks/hadoop.git -r 22a563ebe448969d07902aed869ac13c652b2872; compiled by 'jenkins' on 2015-03-31T19:40Z
    STARTUP_MSG:   java = 1.7.0_80
    ************************************************************/
    2015-05-19 10:37:36,631 INFO  datanode.DataNode (SignalLogger.java:register(91)) - registered UNIX signal handlers for [TERM, HUP, INT]
    2015-05-19 10:37:36,680 WARN  common.Util (Util.java:stringAsURI(56)) - Path /grid/hadoop/hdfs/dn should be specified as a URI in configuration files. Please update hdfs configuration.
    2015-05-19 10:37:36,681 WARN  common.Util (Util.java:stringAsURI(56)) - Path /grid1/hadoop/hdfs/dn should be specified as a URI in configuration files. Please update hdfs configuration.
    2015-05-19 10:37:36,681 WARN  common.Util (Util.java:stringAsURI(56)) - Path grid2/hadoop/hdfs/dn should be specified as a URI in configuration files. Please update hdfs configuration.
    2015-05-19 10:37:36,747 WARN  datanode.DataNode (DataNode.java:checkStorageLocations(2284)) - Invalid dfs.datanode.data.dir /grid/hadoop/hdfs/dn : 
    java.io.FileNotFoundException: File file:/grid/hadoop/hdfs/dn does not exist
        at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:608)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:821)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:598)
        at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:414)
        at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:139)
        at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
        at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2202)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2378)
        at org.apache.hadoop.hdfs.server.datanode.SecureDataNodeStarter.start(SecureDataNodeStarter.java:78)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.commons.daemon.support.DaemonLoader.start(DaemonLoader.java:243)
    2015-05-19 10:37:36,748 WARN  datanode.DataNode (DataNode.java:checkStorageLocations(2284)) - Invalid dfs.datanode.data.dir /grid1/hadoop/hdfs/dn : 
    java.io.FileNotFoundException: File file:/grid1/hadoop/hdfs/dn does not exist
        at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:608)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:821)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:598)
        at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:414)
        at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:139)
        at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
        at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2202)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2378)
        at org.apache.hadoop.hdfs.server.datanode.SecureDataNodeStarter.start(SecureDataNodeStarter.java:78)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.commons.daemon.support.DaemonLoader.start(DaemonLoader.java:243)
    2015-05-19 10:37:36,749 WARN  datanode.DataNode (DataNode.java:checkStorageLocations(2284)) - Invalid dfs.datanode.data.dir /usr/hdp/2.2.4.2-2/hadoop/grid2/hadoop/hdfs/dn : 
    java.io.FileNotFoundException: File file:/usr/hdp/2.2.4.2-2/hadoop/grid2/hadoop/hdfs/dn does not exist
        at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:608)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:821)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:598)
        at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:414)
        at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:139)
        at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
        at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2202)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2378)
        at org.apache.hadoop.hdfs.server.datanode.SecureDataNodeStarter.start(SecureDataNodeStarter.java:78)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.commons.daemon.support.DaemonLoader.start(DaemonLoader.java:243)
    2015-05-19 10:37:36,749 FATAL datanode.DataNode (DataNode.java:secureMain(2385)) - Exception in secureMain
    java.io.IOException: All directories in dfs.datanode.data.dir are invalid: "/grid/hadoop/hdfs/dn" "/grid1/hadoop/hdfs/dn" "/usr/hdp/2.2.4.2-2/hadoop/grid2/hadoop/hdfs/dn" 
        at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2290)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2202)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2378)
        at org.apache.hadoop.hdfs.server.datanode.SecureDataNodeStarter.start(SecureDataNodeStarter.java:78)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.commons.daemon.support.DaemonLoader.start(DaemonLoader.java:243)
    2015-05-19 10:37:36,751 INFO  util.ExitUtil (ExitUtil.java:terminate(124)) - Exiting with status 1
    2015-05-19 10:37:36,752 INFO  datanode.DataNode (StringUtils.java:run(659)) - SHUTDOWN_MSG: 
    /************************************************************
    SHUTDOWN_MSG: Shutting down DataNode at arjun-ubuntu1404/127.0.1.1

In addition, here is what I get on the console:

root@arjun-ubuntu1404:/# /usr/hdp/current/hadoop-hdfs-datanode/../hadoop/sbin/hadoop-daemon.sh --config $HADOOP_CONF_DIR start datanode
starting datanode, logging to /var/log/hadoop/hdfs/hadoop-hdfs-datanode-arjun-ubuntu1404.out
root@arjun-ubuntu1404:/# jps
5240 SecondaryNameNode
5017 NameNode
6368 Jps

How can I fix this?

First, remove everything from the hdfs folder (the value of hadoop.tmp.dir):
rm -rf /grid/hadoop/hdfs
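
Since rm -rf removes the directory itself, recreate the DataNode data directories before changing their ownership below. A minimal sketch, assuming the same three paths that appear in the log (adjust them to whatever dfs.datanode.data.dir holds in your hdfs-site.xml):

# recreate the data directories listed in dfs.datanode.data.dir (paths taken from the log above)
mkdir -p /grid/hadoop/hdfs/dn /grid1/hadoop/hdfs/dn /grid2/hadoop/hdfs/dn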

Make sure the directory has the correct owner and permissions (use the user name appropriate for your system):

sudo chown hduser:hadoop -R /grid/hadoop/hdfs 
sudo chmod 777 -R /grid/hadoop/hdfs

Format the NameNode:

hadoop namenode -format
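
After formatting, start the DataNode again and check that it shows up in jps. A minimal sketch, reusing the start command from the question:

# start the DataNode with the same command as before
/usr/hdp/current/hadoop-hdfs-datanode/../hadoop/sbin/hadoop-daemon.sh --config $HADOOP_CONF_DIR start datanode
# a DataNode entry should now appear alongside NameNode and SecondaryNameNode
jps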

Try this:

sudo chown -R hdfs:hadoop /grid/hadoop/hdfs/dn
sudo chmod -R 755 /grid/hadoop/hdfs/dn
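
It is also worth double-checking dfs.datanode.data.dir: in the log above the third path (grid2/hadoop/hdfs/dn) has no leading slash, so it is resolved relative to the Hadoop install directory (file:/usr/hdp/2.2.4.2-2/hadoop/grid2/...). A quick way, assuming the standard HDFS client tools are on the PATH, to see which directories the DataNode will actually use and whether they exist:

# print the configured data directories; all of them should be absolute paths
hdfs getconf -confKey dfs.datanode.data.dir
# confirm the directories exist and are owned by hdfs (adjust paths to your configuration)
ls -ld /grid/hadoop/hdfs/dn /grid1/hadoop/hdfs/dn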

Then start all the processes again.