H2O Hadoop requires access to user hdfs's HDFS home folder?

Running h2o (http://h2o-release.s3.amazonaws.com/h2o/rel-yau/5/h2o-3.26.0.5-hdp3.1.zip) on HDP 3.1.4 fails at startup due to access restrictions on the hdfs:///user/hdfs folder:
[root@HW005 h2o-3.26.0.5-hdp3.1]# hadoop jar h2odriver.jar -nodes 4 -mapperXmx 6g
Determining driver host interface for mapper->driver callback...
    [Possible callback IP address: 172.18.4.83]
    [Possible callback IP address: 127.0.0.1]
Using mapper->driver callback IP address and port: 172.18.4.83:37342
(You can override these with -driverif and -driverport/-driverportrange and/or specify external IP using -extdriverif.)
Memory Settings:
    mapreduce.map.java.opts:     -Xms6g -Xmx6g -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -Dlog4j.defaultInitOverride=true
    Extra memory percent:        10
    mapreduce.map.memory.mb:     6758
Hive driver not present, not generating token.
19/09/17 10:38:17 INFO client.RMProxy: Connecting to ResourceManager at hw001.co.local/172.18.4.46:8050
19/09/17 10:38:17 INFO client.AHSProxy: Connecting to Application History server at hw002.co.local/172.18.4.47:10200
ERROR: Permission denied: user=root, access=WRITE, inode="/user":hdfs:hdfs:drwxr-xr-x

This appears to be a requirement, but since I'd like to run h2o as various different users depending on the use case, I don't think it's right to grant those users access to the HDFS home folder of the hdfs user (HDP's default HDFS admin user) just to make this work. Can anyone explain what is going on here and how this is typically handled?

How to manage impersonation in a non-Kerberized Hadoop cluster, in order to...
* create an HDFS HomeDir for an arbitrary Hadoop user
* run jobs as that user (jobs will use the HomeDir to store temporary files)

## create HDFS HomeDir for new user, with "hdfs" privileged account
## note the workaround for the bug in the "chmod" parser which
##  fails on "=<nothing>" in most Hadoop versions
export HADOOP_USER_NAME=hdfs
hdfs dfs -mkdir -p               /user/zorro
hdfs dfs -chown zorro:zorro      /user/zorro
hdfs dfs -chmod u=rwx,g=rx,o-rwx /user/zorro
unset HADOOP_USER_NAME
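
## run the job as the new user "zorro"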
export HADOOP_USER_NAME=zorro
# just to check who's there
hdfs groups

run-my-H2O-job-on-command-line
unset HADOOP_USER_NAME
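
For example (a minimal sketch), run-my-H2O-job-on-command-line above could simply be the same launch command from the question, executed while HADOOP_USER_NAME is still set to the target user; it assumes h2odriver.jar sits in the current directory and omits any site-specific flags:

export HADOOP_USER_NAME=zorro
# the job is now submitted as "zorro", so its staging/temporary
# files should land under /user/zorro instead of /user/root
hadoop jar h2odriver.jar -nodes 4 -mapperXmx 6g
unset HADOOP_USER_NAME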