MapReduce OutputFormat Data Output (from 尚硅谷)

Personal study notes; all material comes from 尚硅谷.
Bilibili course link: 添加链接描述
1 OutputFormat Data Output

1.1 OutputFormat Interface Implementations

OutputFormat is the base class for MapReduce output; every class that implements MapReduce output implements the OutputFormat interface. Below are a few common OutputFormat implementations.

The default output format is TextOutputFormat, which writes each record as key, separator, value (the separator is a tab by default), one record per line.
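A minimal driver-side sketch of that default, for reference (my illustration, not from the tutorial; the "," separator value is an arbitrary example, and the property name is the one used by the Hadoop 2.x/3.x mapreduce API):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

import java.io.IOException;

public class TextOutputFormatDemo {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Optional: change the key/value separator from the default tab ("," is just an example)
        conf.set("mapreduce.output.textoutputformat.separator", ",");
        Job job = Job.getInstance(conf);
        // Explicit here, but TextOutputFormat is already what a job uses when nothing is set
        job.setOutputFormatClass(TextOutputFormat.class);
    }
}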

Custom OutputFormat:

Use cases: writing output to storage systems such as MySQL, HBase, or Elasticsearch.

Steps to customize an OutputFormat: define a class that extends FileOutputFormat, then implement a RecordWriter and override its write() method, which performs the actual output. (A sketch of the MySQL case follows.)
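Since MySQL is named as a target, here is a minimal sketch of what such a RecordWriter could look like (an illustration only, not part of the tutorial: the JDBC URL, credentials, and the log table with its url column are assumptions, and a MySQL JDBC driver must be on the classpath):

import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.RecordWriter;
import org.apache.hadoop.mapreduce.TaskAttemptContext;

import java.io.IOException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

// Sketch: a RecordWriter that inserts each record into MySQL over plain JDBC.
// The URL, credentials, and table/column names below are illustrative assumptions.
public class MysqlRecordWriter extends RecordWriter<Text, NullWritable> {
    private Connection conn;
    private PreparedStatement stmt;

    public MysqlRecordWriter() throws IOException {
        try {
            conn = DriverManager.getConnection("jdbc:mysql://localhost:3306/demo", "user", "password");
            stmt = conn.prepareStatement("INSERT INTO log (url) VALUES (?)");
        } catch (SQLException e) {
            throw new IOException(e);
        }
    }

    @Override
    public void write(Text key, NullWritable value) throws IOException {
        try {
            // One INSERT per record; batching would be the obvious optimization
            stmt.setString(1, key.toString());
            stmt.executeUpdate();
        } catch (SQLException e) {
            throw new IOException(e);
        }
    }

    @Override
    public void close(TaskAttemptContext context) throws IOException {
        try {
            stmt.close();
            conn.close();
        } catch (SQLException e) {
            throw new IOException(e);
        }
    }
}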

1.2 Custom OutputFormat Case Study

(1) Requirement: filter the input log file; lines for sites containing atguigu go to e:/atguigu.log, and lines for sites that do not go to e:/other.log.

(2) Input data:

Data link: 添加链接描述
Extraction code: flnk
(3) Output data: two files, atguigu.log and other.log.

(4) Define a custom OutputFormat class:

The Mapper outputs key Text, value NullWritable.

The Reducer outputs key Text, value NullWritable.

Create a class LogRecordWriter that extends RecordWriter.

Create two file output streams, atguiguOut and otherOut. If an input line contains atguigu, write it to the atguiguOut stream; otherwise, write it to the otherOut stream.

(5) Driver class

Register the custom output format component with the job:

job.setOutputFormatClass(LogOutputFormat.class);

The LogMapper class

package com.atguigu.mapreduce.outputformat;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

import java.io.IOException;

public class LogMapper extends Mapper<LongWritable, Text, Text, NullWritable> {
    @Override
    protected void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
        // Input lines look like:
        // http://www.baidu.com
        // http://www.google.com
        // Pass each line straight through: the line becomes the output key, with no value
        context.write(value, NullWritable.get());
    }
}

The LogReducer class

package com.atguigu.mapreduce.outputformat;

import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

import java.io.IOException;

public class LogReducer extends Reducer<Text, NullWritable, Text, NullWritable> {
    @Override
    protected void reduce(Text key, Iterable<NullWritable> values, Context context) throws IOException, InterruptedException {
        // Identical lines arrive grouped under one key; write once per value
        // so duplicate lines are not lost
        for (NullWritable value : values) {
            context.write(key, NullWritable.get());
        }
    }
}

The LogOutputFormat class

package com.atguigu.mapreduce.outputformat;

import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.RecordWriter;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

import java.io.IOException;

// First, define a class that extends FileOutputFormat
public class LogOutputFormat extends FileOutputFormat<Text, NullWritable> {
    // Override getRecordWriter to return a RecordWriter; we create our own
    @Override
    public RecordWriter<Text, NullWritable> getRecordWriter(TaskAttemptContext taskAttemptContext) throws IOException, InterruptedException {
        // Pass the task context along so the writer can reach the job configuration
        return new LogRecordWriter(taskAttemptContext);
    }
}

The LogRecordWriter class

package com.atguigu.mapreduce.outputformat;

import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.RecordWriter;
import org.apache.hadoop.mapreduce.TaskAttemptContext;

import java.io.IOException;

public class LogRecordWriter extends RecordWriter<Text, NullWritable> { // type parameters match the Reducer output types
    private FSDataOutputStream atguiguOut;
    private FSDataOutputStream otherOut;

    public LogRecordWriter(TaskAttemptContext taskAttemptContext) {
        // Create the two output streams through FileSystem (the HDFS client)
        try {
            // Tie the FileSystem to the task's configuration
            FileSystem fs = FileSystem.get(taskAttemptContext.getConfiguration());
            atguiguOut = fs.create(new Path("D:\\downloads\\hadoop-3.1.0\\data\\output\\Log\\atguigu.log"));
            otherOut = fs.create(new Path("D:\\downloads\\hadoop-3.1.0\\data\\output\\Log\\other.log"));
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void write(Text text, NullWritable nullWritable) throws IOException, InterruptedException {
        // The actual write: route each line by whether it contains "atguigu"
        String log = text.toString();
        if (log.contains("atguigu")) {
            atguiguOut.writeBytes(log + "\n");
        } else {
            otherOut.writeBytes(log + "\n");
        }
    }

    @Override
    public void close(TaskAttemptContext taskAttemptContext) throws IOException, InterruptedException {
        // Close both streams
        IOUtils.closeStream(atguiguOut);
        IOUtils.closeStream(otherOut);
    }
}
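One design note on the writer above: the absolute Windows paths tie it to a single machine. A minimal alternative sketch (my assumption, not the tutorial's approach) derives the file locations from the output directory the driver already configures, via FileOutputFormat.getOutputPath; LogRecordWriter's constructor could call such a helper instead of hard-coding paths:

import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

import java.io.IOException;

// Sketch: open the two log files under the job's configured output directory
// instead of hard-coding absolute Windows paths
public class PortableLogStreams {
    static FSDataOutputStream[] open(TaskAttemptContext context) throws IOException {
        FileSystem fs = FileSystem.get(context.getConfiguration());
        Path outDir = FileOutputFormat.getOutputPath(context); // the directory set in the driver
        return new FSDataOutputStream[]{
                fs.create(new Path(outDir, "atguigu.log")),
                fs.create(new Path(outDir, "other.log"))
        };
    }
}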

The LogDriver class

package com.atguigu.mapreduce.outputformat;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

import java.io.IOException;

public class LogDriver {
    public static void main(String[] args) throws IOException, InterruptedException, ClassNotFoundException {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf);

        job.setJarByClass(LogDriver.class);
        job.setMapperClass(LogMapper.class);
        job.setReducerClass(LogReducer.class);

        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(NullWritable.class);

        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(NullWritable.class);

        // Set the custom OutputFormat
        job.setOutputFormatClass(LogOutputFormat.class);

        FileInputFormat.setInputPaths(job, new Path("D:\\downloads\\hadoop-3.1.0\\data\\11_input\\inputoutputformat"));

        // Although we use a custom OutputFormat, it extends FileOutputFormat, and
        // FileOutputFormat writes a _SUCCESS file, so an output directory must still be set
        FileOutputFormat.setOutputPath(job, new Path("D:\\downloads\\hadoop-3.1.0\\data\\output\\Logoutput"));

        boolean result = job.waitForCompletion(true);
        System.exit(result ? 0 : 1);
    }
}

Final output: atguigu.log holds the lines containing atguigu, other.log holds the rest, and the directory passed to FileOutputFormat.setOutputPath receives a _SUCCESS marker file.
