Hive StorageHandler throws exception "Configuration and input path are inconsistent"
I have a HiveStorageHandler. If I run select * from myTable, it returns all rows from the underlying storage.
But when I run something like select col1 from myTable, the underlying MapReduce job throws the following exception:
java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:413)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
at org.apache.hadoop.mapred.Child.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1438)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
... 9 more
Caused by: java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
... 14 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
... 17 more
Caused by: java.lang.RuntimeException: Map operator initialization failed
at org.apache.hadoop.hive.ql.exec.ExecMapper.configure(ExecMapper.java:119)
... 22 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.ql.metadata.HiveException: Configuration and input path are inconsistent
at org.apache.hadoop.hive.ql.exec.MapOperator.setChildren(MapOperator.java:526)
at org.apache.hadoop.hive.ql.exec.ExecMapper.configure(ExecMapper.java:90)
... 22 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Configuration and input path are inconsistent
at org.apache.hadoop.hive.ql.exec.MapOperator.setChildren(MapOperator.java:520)
... 23 more
2015-02-12 15:45:51,881 INFO org.apache.hadoop.mapred.Task: Runnning cleanup for the task
If I add a HiveMetaHook.preCreateTable that sets the location to my input path on HDFS, it works. But my input path changes dynamically, and I cannot keep updating this property:
@Override
public void preCreateTable(Table tbl) throws MetaException {
    if (tbl.getSd().getLocation() != null) {
        throw new MetaException("LOCATION should be null.");
    }
    tbl.getSd().setLocation(*hard-coded-input-path*);
}
I did this based on the following posting.
This happens because I dynamically change mapred.input.dir to point to specific files on HDFS. Once I have finished my own computation, I set mapred.input.dir back to the value that Hive had set.
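For reference, here is a minimal sketch (old mapred API) of what that save-and-restore could look like inside the StorageHandler's InputFormat. The TextInputFormat delegate and the computeActualPaths helper are hypothetical stand-ins for whatever actually reads the files; only the handling of mapred.input.dir and the wrapping of the splits (MySplit is shown further below) reflect what is described here.

import java.io.IOException;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.InputFormat;
import org.apache.hadoop.mapred.InputSplit;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.RecordReader;
import org.apache.hadoop.mapred.Reporter;
import org.apache.hadoop.mapred.TextInputFormat;

public class MyInputFormat implements InputFormat<LongWritable, Text> {

    // hypothetical delegate; in reality this is whatever format reads the actual files
    private final TextInputFormat delegate = new TextInputFormat();

    @Override
    public InputSplit[] getSplits(JobConf job, int numSplits) throws IOException {
        // remember the path Hive put into mapred.input.dir (the table location)
        String hiveInputPath = job.get("mapred.input.dir");

        // temporarily point the job at the files we actually want to read
        FileInputFormat.setInputPaths(job, computeActualPaths(job));
        delegate.configure(job);
        InputSplit[] actualSplits = delegate.getSplits(job, numSplits);

        // restore the value Hive set, so MapOperator's path check still matches
        job.set("mapred.input.dir", hiveInputPath);

        // wrap each real split so FileSplit.getPath() reports the path Hive expects
        // (MySplit is shown further below)
        InputSplit[] wrapped = new InputSplit[actualSplits.length];
        for (int i = 0; i < actualSplits.length; i++) {
            wrapped[i] = new MySplit(actualSplits[i], hiveInputPath);
        }
        return wrapped;
    }

    // hypothetical helper resolving the dynamically changing input files
    private Path[] computeActualPaths(JobConf job) {
        return new Path[] { new Path(job.get("my.actual.input.path", "/tmp/input")) };
    }

    ....
}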
My InputFormat has its own Split, so whenever FileSplit.getPath() is called I return the original mapred.input.dir that Hive expects. Now everything works.
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.InputSplit;

class MySplit extends org.apache.hadoop.mapred.FileSplit {

    private InputSplit actualSplit;

    public MySplit(InputSplit actualSplit, String hiveInputPath) {
        // report the path Hive expects; offset/length are not used here
        super(new Path(hiveInputPath), 0, 0, (String[]) null);
        this.actualSplit = actualSplit;
    }

    // getPath() is deliberately not overridden, so hiveInputPath is returned
    InputSplit getActualSplit() {
        return actualSplit;
    }

    ....
}
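To complete the sketch of the hypothetical MyInputFormat shown above, getRecordReader unwraps the real split again before delegating:

    @Override
    public RecordReader<LongWritable, Text> getRecordReader(InputSplit split, JobConf job,
            Reporter reporter) throws IOException {
        // Hive hands us the wrapper; unwrap it and let the delegate read the real split
        InputSplit actual = ((MySplit) split).getActualSplit();
        return delegate.getRecordReader(actual, job, reporter);
    }

One thing to watch out for (and presumably part of the elided section of MySplit): since FileSplit is a Writable and splits are serialized on their way to the map tasks, a wrapper like this usually also needs a no-arg constructor plus write/readFields that handle the wrapped split, otherwise actualSplit can arrive as null in the task.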