PyArrow 0.16.0 fs.HadoopFileSystem throws HDFS connection failed

I am currently migrating from the legacy Arrow filesystem interface:

http://arrow.apache.org/docs/python/filesystems_deprecated.html

to the new filesystem interface:

http://arrow.apache.org/docs/python/filesystems.html

I am trying to connect to HDFS using fs.HadoopFileSystem, as shown below:

from pyarrow import fs
import os
os.environ['HADOOP_HOME'] = '/usr/hdp/current/hadoop-client'
os.environ['JAVA_HOME'] = '/opt/jdk8'
os.environ['ARROW_LIBHDFS_DIR'] = '/usr/lib/ams-hbase/lib/hadoop-native'

fs.HadoopFileSystem("hdfs://namenode:8020?user=hdfsuser")

I have tried different URI combinations, and also replaced the URI with fs.HdfsOptions:

connection_tuple = ("namenode", 8020)
fs.HadoopFileSystem(fs.HdfsOptions(connection_tuple, user="hdfsuser"))

All of the above give me the same error:

Environment variable CLASSPATH not set!
getJNIEnv: getGlobalJNIEnv failed
Environment variable CLASSPATH not set!
getJNIEnv: getGlobalJNIEnv failed
/arrow/cpp/src/arrow/filesystem/hdfs.cc:56: Failed to disconnect hdfs client: IOError: HDFS hdfsFS::Disconnect failed, errno: 255 (Unknown error 255) Please check that you are connecting to the correct HDFS RPC port
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "pyarrow/_hdfs.pyx", line 180, in pyarrow._hdfs.HadoopFileSystem.__init__
  File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status
  File "pyarrow/error.pxi", line 99, in pyarrow.lib.check_status
OSError: HDFS connection failed

Since this feature is quite new, there is not much documentation for it, so I'm hoping to get some answers here.

Cheers

Set your HDFS CLASSPATH environment variable:

export CLASSPATH=`$HADOOP_HOME/bin/hdfs classpath --glob`

Locate the hdfs binary under your Hadoop installation's bin directory and use it to set this variable before constructing the filesystem, since libhdfs reads CLASSPATH when it starts the JVM.
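If you prefer to stay inside Python rather than exporting the variable in your shell, the same step can be sketched with a small helper that shells out to `hdfs classpath --glob` and stores the result in `os.environ`. The helper name `set_hadoop_classpath` is hypothetical, and the paths in the usage comments are the ones from the question; adjust them for your cluster.

```python
import os
import subprocess

def set_hadoop_classpath(hadoop_home):
    """Populate CLASSPATH from `hdfs classpath --glob`, which libhdfs
    needs at JVM startup. `hadoop_home` is the Hadoop install root
    containing bin/hdfs. (Hypothetical helper, not a pyarrow API.)"""
    hdfs_bin = os.path.join(hadoop_home, 'bin', 'hdfs')
    classpath = subprocess.check_output([hdfs_bin, 'classpath', '--glob'])
    os.environ['CLASSPATH'] = classpath.decode('utf-8').strip()
    return os.environ['CLASSPATH']

# Example usage (paths taken from the question, adjust as needed):
# os.environ['JAVA_HOME'] = '/opt/jdk8'
# os.environ['ARROW_LIBHDFS_DIR'] = '/usr/lib/ams-hbase/lib/hadoop-native'
# set_hadoop_classpath('/usr/hdp/current/hadoop-client')
#
# from pyarrow import fs
# hdfs = fs.HadoopFileSystem("hdfs://namenode:8020?user=hdfsuser")
```

Setting CLASSPATH this way has to happen before the first fs.HadoopFileSystem call in the process, because the embedded JVM only reads it once at startup.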
