Pydoop error: RuntimeError: java home not found, try setting JAVA_HOME on remote server using CDH5.4

Objective: Read a remote file stored in HDFS from my laptop using Pydoop. I am using PyCharm Professional edition and Cloudera CDH 5.4.

PyCharm configuration on my laptop: in the project interpreter (under Settings), I have pointed the Python interpreter at the remote server, like ssh://remote-server-ip-address:port-number/home/ashish/anaconda/bin/python2.7

There is a file stored at the HDFS location /home/ashish/pencil/someFileName.txt

I then installed pydoop on the remote server using pip install pydoop. After that I wrote this code to read the file from the HDFS location:

import pydoop.hdfs as hdfs

with hdfs.open('/home/ashish/pencil/someFileName.txt') as f:
    for line in f:
        print(line)

When I run it, I get this error:

Traceback (most recent call last):
  File "/home/ashish/PyCharm_proj/Remote_Server_connect/hdfsConxn.py", line 7, in <module>
    import pydoop.hdfs as hdfs
  File "/home/ashish/anaconda/lib/python2.7/site-packages/pydoop/hdfs/__init__.py", line 82, in <module>
    from . import common, path
  File "/home/ashish/anaconda/lib/python2.7/site-packages/pydoop/hdfs/path.py", line 28, in <module>
    from . import common, fs as hdfs_fs
  File "/home/ashish/anaconda/lib/python2.7/site-packages/pydoop/hdfs/fs.py", line 34, in <module>
    from .core import core_hdfs_fs
  File "/home/ashish/anaconda/lib/python2.7/site-packages/pydoop/hdfs/core/__init__.py", line 49, in <module>
    _CORE_MODULE = init(backend=HDFS_CORE_IMPL)
  File "/home/ashish/anaconda/lib/python2.7/site-packages/pydoop/hdfs/core/__init__.py", line 29, in init
    jvm.load_jvm_lib()
  File "/home/ashish/anaconda/lib/python2.7/site-packages/pydoop/utils/jvm.py", line 33, in load_jvm_lib
    java_home = get_java_home()
  File "/home/ashish/anaconda/lib/python2.7/site-packages/pydoop/utils/jvm.py", line 28, in get_java_home
    raise RuntimeError("java home not found, try setting JAVA_HOME")
RuntimeError: java home not found, try setting JAVA_HOME

Process finished with exit code 1

My guess is that it might not be able to find py4j. The location of py4j is

/home/ashish/anaconda/lib/python2.7/site-packages/py4j

When I echo the Java home on the remote server,

 echo $JAVA_HOME

I get this location:

/usr/java/jdk1.7.0_67-cloudera
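
One thing I have not checked is whether the interpreter process that PyCharm starts over SSH actually inherits this variable. As a quick diagnostic (just a sketch), something like this run through the remote interpreter would show what that process sees:

import os

# If this prints None, the process started by PyCharm is not inheriting
# JAVA_HOME even though the interactive shell has it set.
print(os.environ.get('JAVA_HOME'))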

I am not very familiar with programming in Python or with setting up CentOS, so please suggest what I can do to resolve this.

Thanks

You can try setting JAVA_HOME in hadoop-env.sh (it is commented out by default).

Change:

# The java implementation to use.  Required.
# export JAVA_HOME=/usr/lib/j2sdk1.5-sun

To:

# The java implementation to use.  Required.
export JAVA_HOME=/usr/java/jdk1.7.0_67-cloudera

or whatever your Java installation directory is.
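
If editing hadoop-env.sh is not an option, another thing worth trying (a sketch, not something I have tested on CDH 5.4) is exporting JAVA_HOME from the script itself. The traceback shows pydoop calls get_java_home() while it is being imported, so the variable has to be set before the import:

import os

# Make JAVA_HOME visible to this process before pydoop looks for the JVM.
os.environ['JAVA_HOME'] = '/usr/java/jdk1.7.0_67-cloudera'

import pydoop.hdfs as hdfs  # should now be able to locate the JVM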

Well, it looks like I solved it. What I did was use

sys.path.append('/usr/java/jdk1.7.0_67-cloudera')

and I updated the code to:

import os, sys
sys.path.append('/usr/java/jdk1.7.0_67-cloudera')

input_file = '/home/ashish/pencil/someData.txt'
with open(input_file) as f:
    for line in f:
        print line

This code reads the file from HDFS on the remote server and then prints the output in the PyCharm console on my laptop.

By using sys.path.append() you don't have to manually change the hadoop-env.sh file, and you avoid conflicts with other Java configuration.
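
For completeness, if the goal is to read the file through HDFS rather than the local filesystem, the original hdfs.open() approach should look like this once the JVM lookup succeeds (a sketch, assuming JAVA_HOME is visible to the process by one of the methods above):

import pydoop.hdfs as hdfs

# Read through the HDFS client instead of the local filesystem.
with hdfs.open('/home/ashish/pencil/someFileName.txt') as f:
    for line in f:
        print line  # Python 2 interpreter, as in the question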