Getting error:- Type mismatch in key from map: expected org.apache.hadoop.io.Text, recieved org.apache.hadoop.io.LongWritable

I have written a MapReduce job for log file analysis. My mapper emits Text as both key and value, and I have explicitly set the map output classes in my driver.

But I still get the error: Type mismatch in key from map: expected org.apache.hadoop.io.Text, recieved org.apache.hadoop.io.LongWritable

public class CompositeUserMapper extends Mapper<LongWritable, Text, Text, Text> {

IntWritable a = new IntWritable(1);
//Text txt = new Text();

@Override
protected void map(LongWritable key, Text value,
        Context context)
        throws IOException, InterruptedException {
    String line = value.toString();

    Pattern p = Pattern.compile("\\b(\\d{8})\\b");
    Matcher m = p.matcher(line);
    String userId = "";
    String CompositeId = "";
    if(m.find()){

        userId = m.group(1);
    }

     CompositeId = line.substring(line.indexOf("compositeId :")+13).trim();

     context.write(new Text(CompositeId),new Text(userId));


    // TODO Auto-generated method stub
    super.map(key, value, context);
}
}

My driver class is as follows:

public class CompositeUserDriver extends Configured implements Tool {

public static void main(String[] args) throws Exception {

    CompositeUserDriver wd = new CompositeUserDriver();
    int res = ToolRunner.run(wd, args);
    System.exit(res);

}

public int run(String[] arg0) throws Exception {
    // TODO Auto-generated method stub

    Job job=new Job();
    job.setJarByClass(CompositeUserDriver.class);
    job.setJobName("Composite UserId Count" );

    FileInputFormat.addInputPath(job, new Path(arg0[0]));
    FileOutputFormat.setOutputPath(job, new Path(arg0[1]));
    job.setMapperClass(CompositeUserMapper.class);
    job.setReducerClass(CompositeUserReducer.class);
    job.setMapOutputKeyClass(Text.class);
    job.setMapOutputValueClass(Text.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    return job.waitForCompletion(true) ? 0 : 1;
    //return 0;
}

}

Please advise how to resolve this issue.

Remove the super.map(key, value, context); line from your mapper code. It calls the parent class's map method, which is the identity mapper: it writes the key and value passed to it straight to the output, and in this case that key is the LongWritable byte offset from the beginning of the file, which does not match the Text key type you declared.
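
For reference, a minimal sketch of the mapper with that line removed. The 8-digit userId pattern and the "compositeId :" field name are carried over from the question's code as assumptions, so adjust them to your actual log format:

import java.io.IOException;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class CompositeUserMapper extends Mapper<LongWritable, Text, Text, Text> {

    // Matches an 8-digit user id surrounded by word boundaries
    private static final Pattern USER_ID_PATTERN = Pattern.compile("\\b(\\d{8})\\b");

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String line = value.toString();

        String userId = "";
        Matcher m = USER_ID_PATTERN.matcher(line);
        if (m.find()) {
            userId = m.group(1);
        }

        // Everything after "compositeId :" is treated as the composite id
        String compositeId = line.substring(line.indexOf("compositeId :") + 13).trim();

        // Emit (Text, Text) only; with no super.map() call, the identity
        // (LongWritable, Text) pair is never written, so the type mismatch disappears
        context.write(new Text(compositeId), new Text(userId));
    }
}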