java Hadoop: LongWritable cannot be cast to org.apache.hadoop.io.IntWritable

Disclaimer: This page is a translation of a popular StackOverflow question and is provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/14922087/


Hadoop: LongWritable cannot be cast to org.apache.hadoop.io.IntWritable

Tags: java, hadoop

Asked by Junaid

I want to take the mean of the temperatures given in an input file. My Mapper and Reducer syntax seems fine to me, but I am still getting the following error:


 Unable to load realm info from SCDynamicStore
    13/02/17 08:03:28 INFO mapred.JobClient: Task Id : attempt_201302170552_0009_m_000000_1, Status : FAILED
    java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.io.IntWritable
        at org.apache.hadoop.examples.TempMeasurement$TempMapper.map(TempMeasurement.java:26)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

My Mapper function is this:


public static class TempMapper extends Mapper<IntWritable, Text, IntWritable, FloatWritable> {

    @Override
    protected void map(IntWritable key, Text value, Context context)
            throws IOException, InterruptedException {

        // code for getting date and temperature

        String temp = columns.get(3);
        context.write(new IntWritable(year), new FloatWritable(Float.valueOf(temp)));
    }
}
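
The parsing behind the "//code for getting date and temperature" comment is not shown in the post. Judging from the comma-separated sample input further below, it presumably does something along these lines; the class name, field indices and trimming here are assumptions, not the original code:

public class ParseSketch {
    public static void main(String[] args) {
        // One record from the sample input shown later in the question.
        String line = "11111 , 0,19900101, 44.04 ,";
        // Split on commas; field 2 is assumed to be a yyyymmdd date,
        // field 3 the temperature, matching columns.get(3) above.
        java.util.List<String> columns = java.util.Arrays.asList(line.split(","));
        int year = Integer.parseInt(columns.get(2).trim().substring(0, 4)); // 1990
        float temp = Float.valueOf(columns.get(3).trim());                  // 44.04
        System.out.println(year + " -> " + temp);
    }
}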

And the Reducer is:


public static class IntSumReducer
        extends Reducer<IntWritable, FloatWritable, IntWritable, FloatWritable> {
    private FloatWritable result = new FloatWritable();

    public void reduce(IntWritable key, Iterable<FloatWritable> values,
                       Context context) throws IOException, InterruptedException {

        // code for making calculations

        context.write(key, result);
    }
}
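
The calculation itself is elided in the post; for the stated goal of averaging the temperatures per key, the body of reduce would presumably be a simple running mean along these lines (a sketch under that assumption, not the original code):

    public void reduce(IntWritable key, Iterable<FloatWritable> values,
                       Context context) throws IOException, InterruptedException {
        // Sketch of the omitted calculation: average all temperatures for this key.
        float sum = 0f;
        int count = 0;
        for (FloatWritable value : values) {
            sum += value.get();
            count++;
        }
        result.set(sum / count);
        context.write(key, result);
    }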

The input file looks like this:


11111 , 0,19900101, 44.04 ,
11112, 0, 19900102, 50.00,
11113, 3, 19910203, 30.00,

Any help would be appreciated.


Answered by Thomas Jungblut

The key class of a mapper that maps text files is always LongWritable. That is because it contains the byte offset of the current line and this could easily overflow an integer.


Basically you need to change your code to this:


public static class TempMapper extends Mapper<LongWritable, Text, IntWritable, FloatWritable>{

  @Override
  protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
       //code for getting date and temperature
       String temp = columns.get(3);
       context.write(new IntWritable(year), new FloatWritable(Float.valueOf(temp)));
  }
}
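
For completeness: which key type the mapper receives is decided by the job's input format, and with the default TextInputFormat the framework hands the mapper a LongWritable byte offset plus a Text line, which is exactly why the cast to IntWritable fails. A minimal driver consistent with the classes above might look roughly like this, as a main method inside the TempMeasurement class (the job name and argument handling are placeholders, not from the original post; Job, TextInputFormat, FileInputFormat and FileOutputFormat come from the org.apache.hadoop.mapreduce packages):

public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = new Job(conf, "temperature mean");
    job.setJarByClass(TempMeasurement.class);

    // TextInputFormat (the default) emits <LongWritable byte offset, Text line>
    // pairs, so the mapper's input key type has to be LongWritable.
    job.setInputFormatClass(TextInputFormat.class);

    job.setMapperClass(TempMapper.class);
    job.setReducerClass(IntSumReducer.class);

    // Types emitted by the mapper and by the job as a whole.
    job.setMapOutputKeyClass(IntWritable.class);
    job.setMapOutputValueClass(FloatWritable.class);
    job.setOutputKeyClass(IntWritable.class);
    job.setOutputValueClass(FloatWritable.class);

    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
}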