Flume: From Installation to Collecting Data into HDFS


Preface

One of the most important tasks in data engineering is data collection: the correctness of the collected data directly determines the quality of all downstream analysis, and Flume is one of the key components for this work. Below we walk through the whole process, from installing Flume to collecting an application's log output and storing it directly in HDFS.

Preparing the Installation Files

The installation files are provided below; you can also download them yourself.

Download address: http://archive.apache.org/dist/flume/1.6.0/

Before performing the following steps, make sure Java and Hadoop are already installed; see the Hadoop 2.7.2 cluster installation guide for reference.

File to prepare: spark-2.0.2-bin-hadoop2.7
Download link: https://pan.baidu.com/s/1dlUjcLjemTcm7jnc0HDiwQ  Extraction code: typ4

Installing Flume

Installing Flume is very simple: upload the tarball and extract it with tar.

[root@master201 Soft]# tar -xvf apache-flume-1.6.0-bin.tar.gz
[root@master201 Soft]# cd apache-flume-1.6.0-bin
[root@master201 apache-flume-1.6.0-bin]# bin/flume-ng version
Flume 1.6.0
Source code repository: https://git-wip-us.apache.org/repos/asf/flume.git
Revision: 2561a23240a71ba20bf288c7c2cda88f443c2080
Compiled by hshreedharan on Mon May 11 11:15:44 PDT 2015
From source with checksum b29e416802ce9ece3269d34233baf43f
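If `flume-ng` fails to start, it is usually because Flume cannot find Java. A minimal setup sketch, run inside the extracted `apache-flume-1.6.0-bin` directory; the JDK path here is an assumption, so substitute your own:

```shell
# Create flume-env.sh from the bundled template
cp conf/flume-env.sh.template conf/flume-env.sh
# Point Flume at the JDK; /usr/local/jdk1.8 is an assumed path -- use yours
echo 'export JAVA_HOME=/usr/local/jdk1.8' >> conf/flume-env.sh
```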

Configuring Flume to Collect Application Logs into HDFS

  • Configuration for collecting the web.py log with Flume and writing the events to a logger sink:

    a1.sources = r1            
    a1.sinks = k1
    a1.channels = c1
    # Describe/configure the source
    a1.sources.r1.type = exec
    a1.sources.r1.channels = c1
    a1.sources.r1.command = tail -f /home/soft/web.py-0.37/log/log.log
    # Describe the sink
    a1.sinks.k1.type = logger
    # Use a channel which buffers events in memory
    a1.channels.c1.type = memory
    a1.channels.c1.capacity = 1000
    a1.channels.c1.transactionCapacity = 100
    # Bind the source and sink to the channel
    a1.sources.r1.channels = c1
    a1.sinks.k1.channel = c1
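With the configuration above saved to a file (the file name `conf/webpy-logger.conf` is an assumption), the agent can be started as follows; events tailed from the log file are then printed to the console, which is handy for verifying the pipeline before pointing it at HDFS:

```shell
# Start agent a1 with the logger-sink configuration (file name is an assumption)
bin/flume-ng agent --conf conf --conf-file conf/webpy-logger.conf \
  --name a1 -Dflume.root.logger=INFO,console
```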
  • Configuration for collecting the web.py log with Flume and writing the events to HDFS through an hdfs sink:

    a1.sources = r1
    a1.sinks = k1
    a1.channels = c1
    # Describe/configure the source
    a1.sources.r1.type = exec
    a1.sources.r1.command = tail -f /home/soft/web.py-0.37/log/log.log
    # Describe the sink
    a1.sinks.k1.type = hdfs
    a1.sinks.k1.hdfs.path = hdfs://hadoop100:9000/jiarong/flume/syslogtcp
    a1.sinks.k1.hdfs.filePrefix = Syslog
    a1.sinks.k1.hdfs.round = true
    a1.sinks.k1.hdfs.roundValue = 10
    a1.sinks.k1.hdfs.roundUnit = minute
    # Use a channel which buffers events in memory
    a1.channels.c1.type = memory
    a1.channels.c1.capacity = 1000
    a1.channels.c1.transactionCapacity = 100
    # Bind the source and sink to the channel
    a1.sources.r1.channels = c1
    a1.sinks.k1.channel = c1
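The HDFS variant is launched the same way, and the result can be checked with the HDFS shell. A sketch, assuming the configuration above was saved as `conf/webpy-hdfs.conf` (the file name is an assumption):

```shell
# Start agent a1 with the HDFS-sink configuration (file name is an assumption)
bin/flume-ng agent --conf conf --conf-file conf/webpy-hdfs.conf --name a1
# After some log lines arrive, list the files Flume has written
hdfs dfs -ls /jiarong/flume/syslogtcp
```

Two points worth knowing: by default the hdfs sink writes SequenceFiles, so add `a1.sinks.k1.hdfs.fileType = DataStream` if you want plain-text output; and the `round*` settings only take effect when `hdfs.path` contains time escape sequences such as `%y-%m-%d`.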