How to find the HADOOP_HOME path on Linux?
I am trying to run the following Java compilation step on a Hadoop server:

javac -classpath ${HADOOP_HOME}/hadoop-${HADOOP_VERSION}-core.jar -d wordcount_classes WordCount.java

but I cannot find ${HADOOP_HOME}. I tried `hadoop -classpath`, but it gives output like this:
/etc/hadoop/conf:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/.//*:/usr/lib/hadoop-hdfs/./:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/.//*:/usr/lib/hadoop-yarn/.//*:/usr/lib/hadoop-0.20-mapreduce/./:/usr/lib/hadoop-0.20-mapreduce/lib/*:/usr/lib/hadoop-0.20-mapreduce/.//*
Does anyone have any ideas?
The hadoop-core jar file is under the ${HADOOP_HOME}/share/hadoop/common directory, not directly under ${HADOOP_HOME}.
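Given that location, one way to adapt the compile command from the question is to point `-classpath` at the jars under share/hadoop/common (Java 6+ accepts the `dir/*` wildcard form). A minimal sketch; the fallback path is a placeholder, substitute your own install:

```shell
# Assemble the -classpath argument from the common jar directory.
# The fallback value for HADOOP_HOME is a placeholder.
HADOOP_HOME=${HADOOP_HOME:-/usr/lib/hadoop}
CP="${HADOOP_HOME}/share/hadoop/common/*"
# The actual compile step would then be:
#   javac -classpath "$CP" -d wordcount_classes WordCount.java
echo "$CP"
```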
You can set the environment variable in your .bashrc file:

vim ~/.bashrc

Then add the following line at the end of the .bashrc file:

export HADOOP_HOME=/your/hadoop/installation/directory

Just replace that path with your actual Hadoop installation path.
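If you don't know where Hadoop is installed, one common approach is to resolve the launcher script on your PATH with `readlink -f` and take its grandparent directory as the install root. A sketch; `sh` is used below as a stand-in binary so the snippet runs anywhere, replace it with `hadoop` on your server:

```shell
# Resolve a launcher on PATH to its real location, then take the
# grandparent directory as the install root.
# Replace `sh` with `hadoop` on a real Hadoop server.
LAUNCHER=$(command -v sh)
REAL=$(readlink -f "$LAUNCHER")              # follow any symlinks
INSTALL_ROOT=$(dirname "$(dirname "$REAL")") # e.g. bin/hadoop -> install root
echo "$INSTALL_ROOT"
```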
Navigate to the path where Hadoop is installed and locate ${HADOOP_HOME}/etc/hadoop, for example:
/usr/lib/hadoop-2.2.0/etc/hadoop
When you run ls in this folder, you should see all of these files:
capacity-scheduler.xml httpfs-site.xml
configuration.xsl log4j.properties
container-executor.cfg mapred-env.cmd
core-site.xml mapred-env.sh
core-site.xml~ mapred-queues.xml.template
hadoop-env.cmd mapred-site.xml
hadoop-env.sh mapred-site.xml~
hadoop-env.sh~ mapred-site.xml.template
hadoop-metrics2.properties slaves
hadoop-metrics.properties ssl-client.xml.example
hadoop-policy.xml ssl-server.xml.example
hdfs-site.xml yarn-env.cmd
hdfs-site.xml~ yarn-env.sh
httpfs-env.sh yarn-site.xml
httpfs-log4j.properties yarn-site.xml~
httpfs-signature.secret
The core configuration settings are available in hadoop-env.sh. You can see the classpath settings in that file; I have copied a few lines here for your reference:
# The java implementation to use.
export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_67
# The jsvc implementation to use. Jsvc is required to run secure datanodes.
#export JSVC_HOME=${JSVC_HOME}
export HADOOP_CONF_DIR=${HADOOP_CONF_DIR}
# Extra Java CLASSPATH elements. Automatically insert capacity-scheduler.
for f in $HADOOP_HOME/contrib/capacity-scheduler/*.jar; do
export HADOOP_CLASSPATH=${HADOOP_CLASSPATH+$HADOOP_CLASSPATH:}$f
done
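Once HADOOP_HOME is exported and the shell reloaded (`source ~/.bashrc`), a quick sanity check is to confirm the config directory actually exists before re-running javac. A self-contained sketch; the fallback path and the `mkdir -p` stand-in tree exist only so the demonstration runs anywhere:

```shell
# Create a stand-in install tree so the check is self-contained;
# on a real server HADOOP_HOME would already point at the install.
HADOOP_HOME=${HADOOP_HOME:-/tmp/hadoop-demo}
mkdir -p "$HADOOP_HOME/etc/hadoop"
if [ -d "$HADOOP_HOME/etc/hadoop" ]; then
    STATUS="HADOOP_HOME looks valid: $HADOOP_HOME"
else
    STATUS="HADOOP_HOME is not set correctly"
fi
echo "$STATUS"
```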
Hope this helps!