
Wordcount -- MapReduce example -- Reducer


Reducer receives (key, values) pairs, aggregates the values into the desired output format, and then writes the produced (key, value) pairs back to HDFS.
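For the WordCount example, the data flow around the reducer can be sketched as follows (the words and counts are only an illustration):

// Mapper output (all mappers combined): ("hello", 1), ("world", 1), ("hello", 1)
// Shuffle & sort: the framework groups the values by key
// Reducer input:  ("hello", [1, 1]), ("world", [1])
// Reducer output (written to HDFS): ("hello", 2), ("world", 1)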

Reducer Class Prototype:

Reducer<Text, IntWritable, Text, IntWritable> 
// Text:: INPUT_KEY
// IntWritable:: INPUT_VALUE
// Text:: OUTPUT_KEY
// IntWritable:: OUTPUT_VALUE
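The first two type parameters must match the mapper's output key/value types, and the last two must match the output types declared on the job. Assuming the TokenizerMapper from the companion Mapper post (an assumption, since the mapper is not shown here), the correspondence is:

// TokenizerMapper extends Mapper<Object, Text, Text, IntWritable>
//   -> it emits (Text, IntWritable), which is exactly the reducer's (INPUT_KEY, INPUT_VALUE)
// In the driver, the reducer's output types are declared with:
//   job.setOutputKeyClass(Text.class);
//   job.setOutputValueClass(IntWritable.class);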

Reduce Method for Reducer

Method header

public void reduce(Text key, Iterable<IntWritable> values,
                     Context context
                     ) throws IOException, InterruptedException 
// Text key:: declares the data type of the input key;
// Iterable<IntWritable> values:: declares the data type of the input values; (Note: all values emitted by the mappers for this key are grouped into a single Iterable)
// Context context:: declares the output channel. Context is used to collect and emit the output (key, value) pairs.
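A few points about this signature are worth noting (general Hadoop behavior, added here as an aside):

// - reduce(...) is invoked once per distinct key.
// - The values Iterable is streamed from the shuffled data and can normally be traversed only once.
// - Hadoop may reuse the same IntWritable object while iterating, so copy the primitive value
//   out with val.get() (as the summation loop below does) rather than keeping a reference to val.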

Aggregate Values

// Iterate through all the values associated with the key:
int sum = 0;
for (IntWritable val : values) {
  sum += val.get();
}
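In the standard WordCount driver the same class is often registered as the combiner as well; in that case each incoming value may already be a partial sum instead of 1, and the loop above works unchanged:

// Without a combiner:                         values = [1, 1, 1, 1, 1]  -> sum = 5
// With IntSumReducer also used as a combiner: values might be [3, 2]    -> sum = 5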

Building (key, value) pairs

// Convert the primitive int into an IntWritable:
result.set(sum);
// Write the (key, value) pair to the Context to emit it:
context.write(key, result);

Reducer Class Summary

The Reducer class writes each aggregated (key, value) pair to its Reducer.Context object, and the framework serializes those pairs to HDFS.
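Assuming the job uses the default TextOutputFormat (an assumption; the output format is configured in the driver), every emitted pair becomes one tab-separated line in an output file such as part-r-00000, for example:

hello	2
mapreduce	1
world	2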

Overview of Reducer Class

public static class IntSumReducer
     extends Reducer<Text,IntWritable,Text,IntWritable> {
  private IntWritable result = new IntWritable();

  public void reduce(Text key, Iterable<IntWritable> values,
                     Context context
                     ) throws IOException, InterruptedException {
    int sum = 0;
    for (IntWritable val : values) {
      sum += val.get();
    }
    result.set(sum);
    context.write(key, result);
  }
}
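For completeness, here is a minimal driver sketch showing how this reducer is typically wired into a WordCount job. TokenizerMapper and the enclosing WordCount class are assumptions taken from the standard WordCount example; input and output paths come from the command line.

// Requires: org.apache.hadoop.conf.Configuration, org.apache.hadoop.fs.Path,
//           org.apache.hadoop.io.IntWritable, org.apache.hadoop.io.Text,
//           org.apache.hadoop.mapreduce.Job,
//           org.apache.hadoop.mapreduce.lib.input.FileInputFormat,
//           org.apache.hadoop.mapreduce.lib.output.FileOutputFormat
public static void main(String[] args) throws Exception {
  Configuration conf = new Configuration();
  Job job = Job.getInstance(conf, "word count");
  job.setJarByClass(WordCount.class);          // assumed enclosing class name
  job.setMapperClass(TokenizerMapper.class);   // assumed mapper from the companion Mapper post
  job.setCombinerClass(IntSumReducer.class);   // optional: reuse the reducer as a combiner
  job.setReducerClass(IntSumReducer.class);
  job.setOutputKeyClass(Text.class);
  job.setOutputValueClass(IntWritable.class);
  FileInputFormat.addInputPath(job, new Path(args[0]));
  FileOutputFormat.setOutputPath(job, new Path(args[1]));
  System.exit(job.waitForCompletion(true) ? 0 : 1);
}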

Written with StackEdit.


Original post: https://www.cnblogs.com/LexLuc/p/9571033.html
