I have Hadoop 1.2.1 and Hive 0.14.0. When I start the Hive CLI, I get the following error:
hduser@Connected:~$ hive
Logging initialized using configuration in jar:file:/usr/local/hive/lib/hive-common-0.14.0.jar!/hive-log4j.properties
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx--x--x
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:444)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:672)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:616)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx--x--x
    at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:529)
    at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:478)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:430)
I created the Hive warehouse directory in Hadoop:
hadoop fs -mkdir /usr/hive/warehouse
and set permissions for the table:
hadoop fs -chmod g+w /usr/hive/warehouse
But it still doesn't work. What should I do?
The HDFS directory /tmp/hive appears to be missing, or it does not have sufficient permissions for files to be written into it. Run the following commands to assign the appropriate permissions. Switch to the HDFS admin user first (sudo -su hdfs will do), then run:
hadoop fs -chmod 777 /tmp;
hadoop fs -mkdir /tmp/hive;
hadoop fs -chmod -R 777 /tmp/hive;
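If it still fails after this, it may be worth confirming the new permissions actually took effect; a quick check, assuming the same hadoop client is on the PATH:

# /tmp should now appear as drwxrwxrwx in the listing of /, and /tmp/hive as drwxrwxrwx in the listing of /tmp
hadoop fs -ls /
hadoop fs -ls /tmp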
Also check the value of the following property in hive-site.xml, then change the permissions on the folder it points to:
<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/tmp/mydir</value>
  <description>Local scratch space for Hive jobs</description>
</property>
hadoop fs -rmr /tmp/mydir;
hadoop fs -mkdir /tmp/mydir;
hadoop fs -chmod 777 /tmp/mydir;
hadoop fs -chmod -R 777 /tmp/mydir;
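For what it's worth, hive.exec.local.scratchdir controls the scratch space on the local filesystem; the HDFS root scratch dir named in the error ("/tmp/hive on HDFS") is governed by the separate hive.exec.scratchdir property, which defaults to /tmp/hive in Hive 0.14. If your hive-site.xml overrides it, the entry to look for would be something like the sketch below (the value shown is just the default; replace it with whatever your configuration actually uses and fix the permissions on that path instead):

<property>
  <name>hive.exec.scratchdir</name>
  <value>/tmp/hive</value>
  <description>HDFS root scratch dir for Hive jobs; this is the path from the error message</description>
</property>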