
$bin/hadoop namenode --format error

I am getting this error when I try to execute this command: $bin/hadoop namenode -format

/home/MAHI/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 31: unexpected EOF while looking for matching `"'
/home/MAHI/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 58: syntax error: unexpected end of file
# The java implementation to use.  Required.
export JAVA_HOME= "C:\Java\"

# Extra Java CLASSPATH elements.  Optional.
# export HADOOP_CLASSPATH=

# The maximum amount of heap to use, in MB. Default is 1000.
# export HADOOP_HEAPSIZE=2000

# Extra Java runtime options.  Empty by default.
# export HADOOP_OPTS=-server

# Command specific options appended to HADOOP_OPTS when specified
export HADOOP_NAMENODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_NAMENODE_OPTS"
export HADOOP_SECONDARYNAMENODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_SECONDARYNAMENODE_OPTS"
export HADOOP_DATANODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_DATANODE_OPTS"
export HADOOP_BALANCER_OPTS="-Dcom.sun.management.jmxremote $HADOOP_BALANCER_OPTS"
export HADOOP_JOBTRACKER_OPTS="-Dcom.sun.management.jmxremote $HADOOP_JOBTRACKER_OPTS"
export HADOOP_TASKTRACKER_OPTS=
# The following applies to multiple commands (fs, dfs, fsck, distcp etc)
# export HADOOP_CLIENT_OPTS

# Extra ssh options.  Empty by default.
# export HADOOP_SSH_OPTS="-o ConnectTimeout=1 -o SendEnv=HADOOP_CONF_DIR"

# Where log files are stored.  $HADOOP_HOME/logs by default.
# export HADOOP_LOG_DIR=${HADOOP_HOME}/logs

# File naming remote slave hosts.  $HADOOP_HOME/conf/slaves by default.
# export HADOOP_SLAVES=${HADOOP_HOME}/conf/slaves

# host:path where hadoop code should be rsync'd from.  Unset by default.
# export HADOOP_MASTER=master:/home/$USER/src/hadoop

# Seconds to sleep between slave commands.  Unset by default.  This
# can be useful in large clusters, where, e.g., slave rsyncs can
# otherwise arrive faster than the master can service them.
# export HADOOP_SLAVE_SLEEP=0.1

# The directory where pid files are stored. /tmp by default.
# NOTE: this should be set to a directory that can only be written to by
#       the user that will run the hadoop daemons.  Otherwise there is the
#       potential for a symlink attack.

# export HADOOP_PID_DIR=/var/hadoop/pids

# A string representing this instance of hadoop. $USER by default.
# export HADOOP_IDENT_STRING=$USER

# The scheduling priority for daemon processes.  See 'man nice'.
# export HADOOP_NICENESS=10

export JAVA_HOME= "C:\Java\"

This suggests you are using Hadoop in a Windows environment. But:

export HADOOP_PID_DIR=/var/hadoop/pids

This line shows that you provided a folder location as it would appear in a Linux environment.

Do not mix Windows and Linux configuration. Correct the paths to reflect your operating system. Also note that in "C:\Java\" the trailing backslash escapes the closing quote, which is exactly why bash reports unexpected EOF while looking for matching `"'.
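The "unexpected EOF" can be reproduced in isolation. This is a minimal sketch (using throwaway files under /tmp) showing that the trailing backslash in `"C:\Java\"` escapes the closing quote, so bash never finds the end of the string:

```shell
# Write the problematic line to a scratch file (path is just for the demo).
printf '%s\n' 'export JAVA_HOME= "C:\Java\"' > /tmp/broken-env.sh

# bash -n parses without executing; the backslash escapes the closing
# quote, so the string runs to end-of-file and parsing fails.
if ! bash -n /tmp/broken-env.sh 2>/tmp/broken-env.err; then
    # bash's diagnostic matches the one in the question:
    # unexpected EOF while looking for matching `"'
    cat /tmp/broken-env.err
fi
```

Removing the trailing backslash (or the whole Windows-style path) makes the file parse again.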

You have set JAVA_HOME incorrectly in hadoop-env.sh. Give the absolute path for JAVA_HOME. You can find the current java path using the following command:

alternatives --config java

It will list all the Java versions you have installed; choose the appropriate one and set that path in hadoop-env.sh like this (JAVA_HOME must point to the JDK root directory, not to its bin/ subdirectory):

 export JAVA_HOME=/usr/java/jdk1.*
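To make the root-vs-bin distinction concrete, here is a small sketch (the JDK path and version are hypothetical) deriving the correct JAVA_HOME from the full binary path that `alternatives --config java` prints:

```shell
# Suppose alternatives printed this binary path (hypothetical version):
java_bin=/usr/java/jdk1.7.0_45/bin/java

# Strip the trailing /bin/java to get the JDK root -- that root, not the
# bin directory, is what JAVA_HOME must point to.
java_home=${java_bin%/bin/java}

echo "export JAVA_HOME=$java_home"   # -> export JAVA_HOME=/usr/java/jdk1.7.0_45
```

Hadoop's scripts then invoke `$JAVA_HOME/bin/java`, which is why appending /bin yourself breaks them.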

Another approach is to set $JAVA_HOME in the user's .bashrc, so there is no need to set it in hadoop-env.sh.
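A sketch of that approach, using a scratch file in place of the real ~/.bashrc and a hypothetical JDK path:

```shell
# Stand-in for ~/.bashrc so the demo does not touch the real profile.
profile=/tmp/demo-bashrc
rm -f "$profile"

# Append the export; in the real ~/.bashrc this survives new logins.
echo 'export JAVA_HOME=/usr/java/jdk1.7.0_45' >> "$profile"

# Re-read the profile in the current shell, as a new login shell would.
. "$profile"
echo "$JAVA_HOME"   # -> /usr/java/jdk1.7.0_45
```

Because the variable is exported, child processes such as the Hadoop start-up scripts inherit it.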

Install Java on the machine and set the JAVA_HOME path in .bash_profile. After doing this, export the $JAVA_HOME variable so that other processes looking for JAVA_HOME can use this path. Also set the JAVA_HOME path in the /etc/hadoop/conf/hadoop-env.sh file.

Note: some people have multiple Hadoop versions installed. Make sure you are making the changes in the right Hadoop installation. To see which Hadoop you are currently using, list /etc/alternatives (it is a directory of symlinks, not a file, so use `ls -l /etc/alternatives` rather than cat); this shows all the symlinks, and you can make the necessary changes there.
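Since /etc/alternatives is a directory of symlinks, `readlink` (or `ls -l`) shows where each entry points. Here is a sketch on a throwaway link; the directory and hadoop target path are hypothetical:

```shell
# Build a demo symlink; on a real system you would inspect
# /etc/alternatives itself, e.g.:  ls -l /etc/alternatives | grep hadoop
mkdir -p /tmp/alt-demo
ln -sfn /usr/lib/hadoop-1.2.1 /tmp/alt-demo/hadoop   # hypothetical target

# readlink resolves the link, telling you which installation is active.
readlink /tmp/alt-demo/hadoop   # -> /usr/lib/hadoop-1.2.1
```

Whichever installation the links resolve to is the one whose hadoop-env.sh you need to edit.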