Flume Series 5: Custom Flume Interceptor

Table of Contents
  • 1. Interceptor Overview
  • 2. Building the Interceptor in IDEA
    • 2.1 pom file
    • 2.2 Java code
    • 2.3 Package with Maven and upload
  • 3. Writing the Flume conf
  • 4. Run and check the results
  • References

1. Interceptor Overview

Interceptors are used to classify log events and to modify or drop unwanted log data. Flume ships with a number of built-in interceptors, and you can also write your own. Below we use a custom interceptor to route events to different destinations based on their content.
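
For reference, a custom interceptor implements the org.apache.flume.interceptor.Interceptor interface from flume-ng-core. A simplified outline of that interface (roughly as it appears in version 1.9.0, trimmed for readability) looks like this:

// Simplified outline of the Flume Interceptor contract.
// A custom interceptor implements these four methods plus a nested Builder.
public interface Interceptor {
    void initialize();                           // called once when the interceptor is created
    Event intercept(Event event);                // process a single event; return null to drop it
    List<Event> intercept(List<Event> events);   // process a batch of events
    void close();                                // called on shutdown

    // Flume creates the interceptor through this builder,
    // which is why the conf file later references ClassName$Builder.
    interface Builder extends Configurable {
        Interceptor build();
    }
}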

2. Building the Interceptor in IDEA

First, create a Maven project in IDEA.

2.1 pom file

Add the following dependency to the pom (flume-ng-core provides the Interceptor API):


    <dependency>
      <groupId>org.apache.flume</groupId>
      <artifactId>flume-ng-core</artifactId>
      <version>1.9.0</version>
    </dependency>
2.2 Java code

The Java code is as follows:

package com.bigdata.study.flume;

import org.apache.flume.Context;
import org.apache.flume.Event;
import org.apache.flume.interceptor.Interceptor;

import java.util.ArrayList;
import java.util.List;
import java.util.Map;


// Implements the methods declared by the Interceptor interface
public class TypeInterceptor implements Interceptor {
    // A list of events used to hold the processed batch
    private List<Event> addHeaderEvents;

    @Override
    public void initialize() {
        // Initialization
        addHeaderEvents = new ArrayList<>();
    }

    @Override
    public Event intercept(Event event) {
        Map<String, String> headers = event.getHeaders();
        String body = new String(event.getBody());
        // 1. Decide which header value to add based on whether the body contains "flume"
        if (body.contains("flume")) {
            // 2. Add the header
            headers.put("type", "with_flume");
        } else {
            // 2. Add the header
            headers.put("type", "without_flume");
        }
        return event;
    }

    @Override
    public List<Event> intercept(List<Event> list) {
        // Clear addHeaderEvents before processing each new batch of events
        addHeaderEvents.clear();
        for (Event event : list) {
            // Run each event through the single-event intercept() and collect the result
            addHeaderEvents.add(intercept(event));
        }
        return addHeaderEvents;
    }

    @Override
    public void close() {

    }

    // A static inner class that builds the interceptor
    public static class Builder implements Interceptor.Builder {

        @Override
        public Interceptor build() {
            return new TypeInterceptor();
        }

        @Override
        public void configure(Context context) {

        }
    }
}
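
Before packaging, you can sanity-check the interceptor locally with a throwaway main method. The class below is a minimal sketch (TypeInterceptorDemo is just an illustrative name, not part of the project above) that runs two sample events through the interceptor and prints the "type" header each one receives:

package com.bigdata.study.flume;

import org.apache.flume.Event;
import org.apache.flume.event.EventBuilder;

import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import java.util.List;

// Quick local check of TypeInterceptor: build two events and inspect the headers it adds.
public class TypeInterceptorDemo {
    public static void main(String[] args) {
        TypeInterceptor interceptor = new TypeInterceptor();
        interceptor.initialize();

        List<Event> events = Arrays.asList(
                EventBuilder.withBody("hello flume", StandardCharsets.UTF_8),
                EventBuilder.withBody("hello world", StandardCharsets.UTF_8));

        for (Event e : interceptor.intercept(events)) {
            // Expected output: "hello flume -> with_flume" and "hello world -> without_flume"
            System.out.println(new String(e.getBody()) + " -> " + e.getHeaders().get("type"));
        }

        interceptor.close();
    }
}
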
2.3 Package with Maven and upload

Package the project into a jar and upload it to Flume's lib directory.

My environment is CDH 6.3.1, so the jar is uploaded to the following path:

/opt/cloudera/parcels/CDH-6.3.1-1.cdh6.3.1.p0.1470567/lib/flume-ng/lib

3. Writing the Flume conf

cd /opt/cloudera/parcels/CDH-6.3.1-1.cdh6.3.1.p0.1470567
vi conf/job/flume_Interceptor.conf

Add the following content:

a1.sources = r1
a1.channels = c1 c2
a1.sinks = k1 k2

a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

a1.sources.r1.interceptors = i1
# Interceptor type: fully-qualified class name + $ + inner Builder class name
a1.sources.r1.interceptors.i1.type = com.bigdata.study.flume.TypeInterceptor$Builder

a1.sources.r1.selector.type = multiplexing
a1.sources.r1.selector.header = type
a1.sources.r1.selector.mapping.with_flume = c1
a1.sources.r1.selector.mapping.without_flume = c2

a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

a1.channels.c2.type = memory
a1.channels.c2.capacity = 1000
a1.channels.c2.transactionCapacity = 100

a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.fileType = DataStream
a1.sinks.k1.hdfs.filePrefix = with_flume
a1.sinks.k1.hdfs.fileSuffix = .csv
# Generated file names begin with the filePrefix configured above
a1.sinks.k1.hdfs.path = hdfs://hp1:8020/user/flume/%Y-%m-%d
a1.sinks.k1.hdfs.useLocalTimeStamp = true
a1.sinks.k1.hdfs.batchSize = 100
a1.sinks.k1.hdfs.rollCount = 0
a1.sinks.k1.hdfs.rollSize = 100
a1.sinks.k1.hdfs.rollInterval = 3

a1.sinks.k2.type = hdfs
a1.sinks.k2.hdfs.fileType = DataStream
a1.sinks.k2.hdfs.filePrefix = without_flume
a1.sinks.k2.hdfs.fileSuffix = .csv
a1.sinks.k2.hdfs.path = hdfs://hp1:8020/user/flume/%Y-%m-%d
a1.sinks.k2.hdfs.useLocalTimeStamp = true
a1.sinks.k2.hdfs.batchSize = 100
a1.sinks.k2.hdfs.rollCount = 0
a1.sinks.k2.hdfs.rollSize = 100
a1.sinks.k2.hdfs.rollInterval = 3

a1.sources.r1.channels = c1 c2
a1.sinks.k1.channel = c1
a1.sinks.k2.channel = c2

4. Run and check the results

# Start the Flume agent
bin/flume-ng agent --conf conf/ --name a1 --conf-file conf/job/flume_Interceptor.conf
# In another terminal, open a netcat connection and type some test lines
nc localhost 44444

Check the results: on HDFS under /user/flume/%Y-%m-%d, events whose body contains "flume" land in files prefixed with with_flume, while all other events land in files prefixed with without_flume.

References:
  1. https://blog.csdn.net/qq_38497133/article/details/108062855
