Java Hadoop-lzo Found interface but class was expected LzoTextInputFormat
I'm trying to use the Hadoop-LZO package (built using the steps here). Everything seems to work, since I can convert my lzo file to an indexed file (which returns big_file.lzo.index as expected) via:
hadoop jar /path/to/your/hadoop-lzo.jar com.hadoop.compression.lzo.LzoIndexer big_file.lzo
I then use these files in my mapreduce job (with big_file.lzo.index as the input):
import com.hadoop.mapreduce.LzoTextInputFormat;
....
Job jobConverter = new Job(conf, "conversion");
jobConverter.setJar("JsonConverter.jar");
jobConverter.setInputFormatClass(LzoTextInputFormat.class);
....
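For completeness, a minimal self-contained driver along these lines reproduces the setup (the mapper, key/value types, and I/O paths are placeholders for illustration, not my actual job):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import com.hadoop.mapreduce.LzoTextInputFormat;

public class ConversionDriver {
    // Placeholder pass-through mapper, only here to make the job complete.
    public static class PassThroughMapper
            extends Mapper<LongWritable, Text, LongWritable, Text> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws java.io.IOException, InterruptedException {
            context.write(key, value);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "conversion"); // non-deprecated form of new Job(conf, ...)
        job.setJarByClass(ConversionDriver.class);
        job.setInputFormatClass(LzoTextInputFormat.class); // uses big_file.lzo.index for splits
        job.setMapperClass(PassThroughMapper.class);
        job.setOutputKeyClass(LongWritable.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // dir containing big_file.lzo + .index
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}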
I get the following error:
Exception in thread "main" java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
at com.hadoop.mapreduce.LzoTextInputFormat.listStatus(LzoTextInputFormat.java:62)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:389)
at com.hadoop.mapreduce.LzoTextInputFormat.getSplits(LzoTextInputFormat.java:101)
at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:304)
at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:321)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:199)
at org.apache.hadoop.mapreduce.Job.run(Job.java:1290)
at org.apache.hadoop.mapreduce.Job.run(Job.java:1287)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1595)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
at org.wwbp.JsonConverter.run(JsonConverter.java:116)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.wwbp.JsonConverter.main(JsonConverter.java:74)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
I've seen other questions answer this by saying to rebuild against Hadoop v2, which fits the symptom: JobContext was a class in Hadoop 1.x but became an interface in Hadoop 2.x, so a hadoop-lzo jar built against v1 fails this way at runtime on v2. So I re-downloaded everything from GitHub and ran:
% hadoop version
Hadoop 2.7.0-mapr-1607
Compiled by root on 2016-07-18T07:56Z
Compiled with protoc 2.5.0
This command was run using /opt/mapr/hadoop/hadoop-2.7.0/share/hadoop/common/hadoop-common-2.7.0-mapr-1607.jar
% ant clean compile-native tar -Dhadoopversion=27
....
tar:
[tar] Building tar: ../jars/hadoop-lzo/build/hadoop-lzo-0.4.15.tar.gz
BUILD SUCCESSFUL
Total time: 15 seconds
The build paths are as follows:
C_INCLUDE_PATH=../jars/lzo-2.09/include
LIBRARY_PATH=../jars/lzo-2.09/lib
JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
I'm really not sure what I'm doing wrong. How do I get ant to see Hadoop v2?
Edit 1: Possibly worth noting: when I run both my mapreduce job (which calls LzoTextInputFormat.class) and the lzo converter (on big_file.lzo), my classpath is as follows:
CLASS_PATH=/opt/mapr/hadoop/hadoop-2.7.0/etc/hadoop:/opt/mapr/hadoop/hadoop-2.7.0/share/hadoop/common/lib/*:/opt/mapr/hadoop/hadoop-2.7.0/share/hadoop/common/*:/opt/mapr/hadoop/hadoop-2.7.0/share/hadoop/hdfs:/opt/mapr/hadoop/hadoop-2.7.0/share/hadoop/hdfs/lib/*:/opt/mapr/hadoop/hadoop-2.7.0/share/hadoop/hdfs/*:/opt/mapr/hadoop/hadoop-2.7.0/share/hadoop/yarn/lib/*:/opt/mapr/hadoop/hadoop-2.7.0/share/hadoop/yarn/*:/opt/mapr/hadoop/hadoop-2.7.0/share/hadoop/mapreduce/lib/*:/opt/mapr/hadoop/hadoop-2.7.0/share/hadoop/mapreduce/*:/contrib/capacity-scheduler/*.jar:/opt/mapr/lib/kvstore*.jar:/opt/mapr/lib/libprotodefs*.jar:/opt/mapr/lib/baseutils*.jar:/opt/mapr/lib/maprutil*.jar:/opt/mapr/lib/json-20080701.jar:/opt/mapr/lib/flexjson-2.1.jar:/jars/hadoop-lzo-0.4.15/hadoop-lzo-0.4.15.jar
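As a sanity check, a throwaway diagnostic like the sketch below (run with the same classpath) prints which jar LzoTextInputFormat is actually loaded from; if it points at a stale hadoop-lzo build, that would explain the error persisting after a rebuild:

public class WhichJar {
    public static void main(String[] args) throws ClassNotFoundException {
        // Prints the location of the jar that LzoTextInputFormat was loaded from.
        Class<?> c = Class.forName("com.hadoop.mapreduce.LzoTextInputFormat");
        System.out.println(c.getProtectionDomain().getCodeSource().getLocation());
    }
}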
Edit 2: If I index the lzo file as follows (i.e., indexing via a mapreduce job using DistributedLzoIndexer instead of LzoIndexer), I get a similar error:
> hadoop jar /path/to/your/hadoop-lzo.jar com.hadoop.compression.lzo.DistributedLzoIndexer big_file.lzo
16/12/09 13:06:24 INFO mapreduce.Job: map 0% reduce 0%
16/12/09 13:06:29 INFO mapreduce.Job: Task Id : attempt_1472572940387_0370_m_000000_0, Status : FAILED
Error: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
Not sure why the above wasn't working, so I started from scratch with this repo: https://github.com/twitter/hadoop-lzo instead of the one linked above, and built with maven instead of ant (using all the same settings as above).
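For reference, the maven build was roughly the following (mirroring the repo's README; the lzo paths are the same as above, and the resulting jar lands under target/):

% C_INCLUDE_PATH=../jars/lzo-2.09/include \
  LIBRARY_PATH=../jars/lzo-2.09/lib \
  mvn clean package
% ls target/hadoop-lzo-*.jar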