Case 1: one-shot log transfer with the avro client
flume-avro-client.conf
avro-client-agent.sources = r1
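Only the first line of the configuration is preserved above. A minimal sketch of a complete avro source to logger sink configuration along those lines might look like the following (the component names k1/c1, the bind address, and port 44444 are assumptions, not taken from the original):

avro-client-agent.sources = r1
avro-client-agent.sinks = k1
avro-client-agent.channels = c1

# Avro source: listens for events pushed by the avro client (bind/port assumed)
avro-client-agent.sources.r1.type = avro
avro-client-agent.sources.r1.bind = 0.0.0.0
avro-client-agent.sources.r1.port = 44444

# Logger sink: prints each event to the agent log at INFO level
avro-client-agent.sinks.k1.type = logger

# Memory channel buffering events between source and sink
avro-client-agent.channels.c1.type = memory

# Wire the source and sink to the channel
avro-client-agent.sources.r1.channels = c1
avro-client-agent.sinks.k1.channel = c1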
Preparing the test file
[hadoop@hadoop001 ~]$ echo "flume" >> data/flume-avro-client.test
Sink-side startup command
flume-ng agent \
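Only the first line of the agent command survived. Assuming the configuration file above sits in the user's home directory, a typical invocation would look roughly like this (the paths are assumptions):

flume-ng agent \
  --name avro-client-agent \
  --conf $FLUME_HOME/conf \
  --conf-file ~/flume-avro-client.conf \
  -Dflume.root.logger=INFO,console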
Source-side startup command
flume-ng avro-client \
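Likewise, only the first line remains. A typical avro-client invocation matching the sketch above would be along these lines (port 44444 and the file path are assumptions; hadoop001 is the host seen in the text-preparation step):

flume-ng avro-client \
  --host hadoop001 \
  --port 44444 \
  --filename ~/data/flume-avro-client.test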
Result:
Source side:
When the command runs, the avro client exits as soon as all events have been transferred. Data appended to the file after it exits is not transferred to the sink, so the avro client approach is not suited to incremental data.
Sink side:
[INFO - org.apache.flume.sink.LoggerSink.process(LoggerSink.java:95)] Event: { headers:{} body: 66 6C 75 6D 65 flume }