java.io.IOException: Merging of credentials not supported in this version of hadoop

I am trying to access a table created in HBase through Hive.

The following commands execute successfully:

hbase(main):032:0> create 'hbasetohive', 'colFamily'
0 row(s) in 1.9540 seconds

hbase(main):033:0> put 'hbasetohive', '1s', 'colFamily:val','1strowval'
0 row(s) in 0.1020 seconds

hbase(main):034:0> scan 'hbasetohive'
ROW                                   COLUMN+CELL                                                                                               
 1s                                   column=colFamily:val, timestamp=1423936170125, value=1strowval                                            
1 row(s) in 0.1170 seconds
-----
hive> CREATE EXTERNAL TABLE hbase_hivetable_k(key string, value string)
    > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    > WITH SERDEPROPERTIES ("hbase.columns.mapping" = "colFamily:val")
    > TBLPROPERTIES("hbase.table.name" = "hbasetohive");
OK
Time taken: 1.622 seconds
hive> Select * from hbase_hivetable_k;
OK
1s    1strowval
Time taken: 0.184 seconds, Fetched: 1 row(s)

But executing a count query fails with an error:

hive> select count(1) from hbase_hivetable_k;
Query ID = hduser_20150216081212_f47b2faa-be53-4eb3-b8dd-b56990455977
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapred.reduce.tasks=<number>
java.lang.RuntimeException: java.io.IOException: Merging of credentials not supported in this version of hadoop
 at org.apache.hadoop.hive.hbase.HBaseStorageHandler.configureJobConf(HBaseStorageHandler.java:485)
 at org.apache.hadoop.hive.ql.plan.PlanUtils.configureJobConf(PlanUtils.java:856)
 at org.apache.hadoop.hive.ql.plan.MapWork.configureJobConf(MapWork.java:540)
 at org.apache.hadoop.hive.ql.plan.MapredWork.configureJobConf(MapredWork.java:68)
 at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:370)
 at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:137)
 at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
 at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
 at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1604)
 at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1364)
 at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1177)
 at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1004)
 at org.apache.hadoop.hive.ql.Driver.run(Driver.java:994)
 at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:247)
 at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:199)
 at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:410)
 at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:783)
 at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
 at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:616)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:601)
 at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
Caused by: java.io.IOException: Merging of credentials not supported in this version of hadoop
 at org.apache.hadoop.hive.shims.Hadoop20SShims.mergeCredentials(Hadoop20SShims.java:527)
 at org.apache.hadoop.hive.hbase.HBaseStorageHandler.configureJobConf(HBaseStorageHandler.java:483)
 ... 23 more
Job Submission failed with exception 'java.lang.RuntimeException(java.io.IOException: Merging of credentials not supported in this version of hadoop)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

I am using Hadoop version 1.2.1, Hive version 0.14.0, and HBase version 0.94.8.

Could you please tell me which version I need to upgrade to get this working?

Regards - Koushik

You should upgrade your Hadoop version; with Hadoop 2.4.0 everything works fine. The stack trace shows why: HBaseStorageHandler.configureJobConf tries to merge the HBase credentials (delegation tokens) into the MapReduce job, but Hadoop20SShims, the compatibility layer Hive 0.14 selects for Hadoop 1.x releases such as your 1.2.1, does not implement credential merging. That is also why SELECT * succeeds (it is served directly from the client, no job is launched, as your log shows) while count(1) fails the moment Hive submits a MapReduce job.
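
Below is a minimal sketch of the two shim code paths involved, assuming the behavior implied by the stack trace; the *Sketch class names are illustrative, not the verbatim Hive source.

import java.io.IOException;
import org.apache.hadoop.mapred.JobConf;

// On Hadoop 1.x Hive loads Hadoop20SShims, whose mergeCredentials() throws
// unconditionally -- the exact IOException seen in the stack trace above.
class Hadoop20SShimsSketch {
    public void mergeCredentials(JobConf dest, JobConf src) throws IOException {
        throw new IOException(
                "Merging of credentials not supported in this version of hadoop");
    }
}

// On Hadoop 2.x the shim can delegate to Credentials.mergeAll(), which exists
// there, so the HBase delegation token collected by HBaseStorageHandler is
// actually copied into the submitted job instead of aborting submission.
class Hadoop23ShimsSketch {
    public void mergeCredentials(JobConf dest, JobConf src) throws IOException {
        dest.getCredentials().mergeAll(src.getCredentials());
    }
}

After the upgrade, running 'hadoop version' should report 2.4.0, and the same count(1) query should submit and complete its MapReduce job instead of failing at the credential-merge step.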