Error: /usr/local/Cellar/sqoop/1.4.6/../hadoop does not exist! Please set $HADOOP_COMMON_HOME to the root of your Hadoop installation
I installed Hadoop on my Mac with brew and then configured it. I then installed Sqoop, and when I try to run Sqoop I get the following error:
Error: /usr/local/Cellar/sqoop/1.4.6/../hadoop does not exist!
Please set $HADOOP_COMMON_HOME to the root of your Hadoop installation.
My Hadoop is running fine, and I have even set $HADOOP_COMMON_HOME in both ~/.bash_profile and sqoop-env.sh.
Here is my sqoop environment file:
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# included in all the hadoop scripts with source command
# should not be executable directly
# also should not be passed any arguments, since we need original $*
# Set Hadoop-specific environment variables here.
#Set path to where bin/hadoop is available
export HADOOP_COMMON_HOME= /usr/local/Cellar/hadoop/3.0.0
export PATH=$PATH:$HADOOP_COMMON_HOME/bin
#Set path to where hadoop-*-core.jar is available
#export HADOOP_MAPRED_HOME=
#set the path to where bin/hbase is available
#export HBASE_HOME=
#Set the path to where bin/hive is available
#export HIVE_HOME=
#Set the path for where zookeper config dir is
#export ZOOCFGDIR=
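A side note on the file above: in POSIX shells an assignment must not have a space after the equals sign. With that space, the shell treats the path as a command to run and HADOOP_COMMON_HOME never actually gets set when the file is sourced, which would explain the error. A minimal corrected pair of lines (paths taken from the file above, offered only as a sketch) would be:
# no space after '=' so the variable actually receives the path
export HADOOP_COMMON_HOME=/usr/local/Cellar/hadoop/3.0.0
export PATH=$PATH:$HADOOP_COMMON_HOME/bin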
I had previously run brew install hadoop for some Spark work. I don't think I had to edit any configuration in Hadoop itself, and brew install sqoop seemed to find the Hadoop configuration out of the box. I'm not sure where you got your sqoop-env.sh from, but I just installed Sqoop and this is mine:
[jomoore@libexec] $ cat /usr/local/Cellar/sqoop/1.4.6/libexec/conf/sqoop-env.sh
export HADOOP_HOME="/usr/local"
export HBASE_HOME="/usr/local"
export HIVE_HOME="/usr/local"
export ZOOCFGDIR="/usr/local/etc/zookeeper"
If we grep for the place it wants to find hadoop in, namely $HADOOP_HOME/bin:
[jomoore@libexec] $ ls -l /usr/local/bin/ | grep hadoop
lrwxr-xr-x 1 jomoore admin 45 Mar 13 10:24 container-executor -> ../Cellar/hadoop/3.0.0/bin/container-executor
lrwxr-xr-x 1 jomoore admin 33 Mar 13 10:24 hadoop -> ../Cellar/hadoop/3.0.0/bin/hadoop
lrwxr-xr-x 1 jomoore admin 31 Mar 13 10:24 hdfs -> ../Cellar/hadoop/3.0.0/bin/hdfs
lrwxr-xr-x 1 jomoore admin 33 Mar 13 10:24 mapred -> ../Cellar/hadoop/3.0.0/bin/mapred
lrwxr-xr-x 1 jomoore admin 50 Mar 13 10:24 test-container-executor -> ../Cellar/hadoop/3.0.0/bin/test-container-executor
lrwxr-xr-x 1 jomoore admin 31 Mar 13 10:24 yarn -> ../Cellar/hadoop/3.0.0/bin/yarn
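If you would rather not hard-code the Cellar version in your own sqoop-env.sh, a possible alternative (just a sketch, assuming a standard Homebrew setup) is to let brew resolve the prefix through its version-independent opt symlink:
# hypothetical alternative: /usr/local/opt/hadoop tracks the currently linked keg,
# so the version number does not have to appear in the file
export HADOOP_HOME="$(brew --prefix hadoop)"
export PATH="$PATH:$HADOOP_HOME/bin"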
And since I don't have Accumulo, Hive, or Zookeeper installed, the other warnings are just noise:
[jomoore@libexec] $ sqoop
Warning: /usr/local/Cellar/sqoop/1.4.6/libexec/bin/../../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/local/Cellar/sqoop/1.4.6/libexec/bin/../../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /usr/local/Cellar/sqoop/1.4.6/libexec/bin/../../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
ERROR: Cannot execute /usr/local/libexec/hadoop-config.sh.
That last error is actually looking for a file that lives here instead, and which probably shouldn't be nested under libexec/libexec in the Brew package:
$ find /usr/local/Cellar -name hadoop-config.sh
/usr/local/Cellar/hadoop/3.0.0/libexec/libexec/hadoop-config.sh
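A possible workaround, offered only as an untested sketch: point HADOOP_HOME (and HADOOP_COMMON_HOME) at the keg's libexec directory, since that is where the real Hadoop distribution, including libexec/hadoop-config.sh, actually lives:
# untested sketch: with this HADOOP_HOME, hadoop-config.sh resolves to
# /usr/local/Cellar/hadoop/3.0.0/libexec/libexec/hadoop-config.sh, the path found above
export HADOOP_HOME="/usr/local/Cellar/hadoop/3.0.0/libexec"
export HADOOP_COMMON_HOME="$HADOOP_HOME"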