Flume HDFS rollSize not working with multiple channels and multiple sinks

Stack Overflow user
Asked on 2016-08-24 10:17:30
1 answer · 384 views

I am trying to use Flume-ng to collect 128 MB of log data and write it to a single file in HDFS. But the HDFS roll options are not working: Flume-ng rolls a new log file every second. How can I fix my flume.conf?

agent01.sources = avroGenSrc
agent01.channels = memoryChannel hdfsChannel
agent01.sinks = fileSink hadoopSink

# For each one of the sources, the type is defined
agent01.sources.avroGenSrc.type = avro
agent01.sources.avroGenSrc.bind = dev-hadoop03.ncl
agent01.sources.avroGenSrc.port = 3333

# The channel can be defined as follows.
agent01.sources.avroGenSrc.channels = memoryChannel hdfsChannel

# Each sink's type must be defined
agent01.sinks.fileSink.type = file_roll
agent01.sinks.fileSink.sink.directory = /home1/irteam/flume/data
agent01.sinks.fileSink.sink.rollInterval = 3600
agent01.sinks.fileSink.sink.batchSize = 100

#Specify the channel the sink should use
agent01.sinks.fileSink.channel = memoryChannel



agent01.sinks.hadoopSink.type = hdfs
agent01.sinks.hadoopSink.hdfs.useLocalTimeStamp = true
agent01.sinks.hadoopSink.hdfs.path = hdfs://dev-hadoop04.ncl:9000/user/hive/warehouse/raw_logs/year=%Y/month=%m/day=%d
agent01.sinks.hadoopSink.hdfs.filePrefix = AccessLog.%Y-%m-%d.%Hh
agent01.sinks.hadoopSink.hdfs.fileType = DataStream
agent01.sinks.hadoopSink.hdfs.writeFormat = Text
agent01.sinks.hadoopSink.hdfs.rollInterval = 0
agent01.sinks.hadoopSink.hdfs.rollSize = 134217728
agent01.sinks.hadoopSink.hdfs.rollCount = 0

#Specify the channel the sink should use
agent01.sinks.hadoopSink.channel = hdfsChannel


# Each channel's type is defined.
agent01.channels.memoryChannel.type = memory
agent01.channels.hdfsChannel.type = memory

# Other config values specific to each type of channel(sink or source)
# can be defined as well
# In this case, it specifies the capacity of the memory channel
agent01.channels.memoryChannel.capacity = 100000
agent01.channels.memoryChannel.transactionCapacity = 10000

agent01.channels.hdfsChannel.capacity = 100000
agent01.channels.hdfsChannel.transactionCapacity = 10000
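As a quick sanity check on the numbers above (not part of the original config), the configured `hdfs.rollSize` of 134217728 bytes is exactly 128 MB, i.e. one default HDFS block:

```python
# Verify that the configured rollSize equals 128 MB (one default HDFS block).
roll_size = 134217728            # hdfs.rollSize from the config above
block_size = 128 * 1024 * 1024   # 128 MB expressed in bytes
assert roll_size == block_size
print(roll_size)
```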

1 Answer

Stack Overflow user

Answered on 2016-08-24 10:56:52

I found the solution: a dfs.replication mismatch was causing the problem.

In my Hadoop config (hadoop-2.7.2/etc/hadoop/hdfs-site.xml) I had:

<property>
  <name>dfs.replication</name>
  <value>3</value>
</property>

I only have 2 datanodes, so I changed it to:

<property>
  <name>dfs.replication</name>
  <value>2</value>
</property>

And I added this setting to flume.conf:

agent01.sinks.hadoopSink.hdfs.minBlockReplicas = 2
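A note on why this works: the Flume HDFS sink checks whether the file it is writing is under-replicated, and if so it forces a roll regardless of `rollSize`. When `dfs.replication` (or `hdfs.minBlockReplicas`, which defaults to the Hadoop config value) exceeds the number of live datanodes, every file looks under-replicated and gets rolled almost immediately. A sketch of the size-only rolling block with the extra setting, using the sink name from the config above (the value 2 assumes a 2-datanode cluster):

```properties
# Roll only on size (128 MB); disable time- and count-based rolls.
agent01.sinks.hadoopSink.hdfs.rollInterval = 0
agent01.sinks.hadoopSink.hdfs.rollSize = 134217728
agent01.sinks.hadoopSink.hdfs.rollCount = 0
# Must not exceed the number of live datanodes, or the sink treats
# every open file as under-replicated and rolls it early.
agent01.sinks.hadoopSink.hdfs.minBlockReplicas = 2
```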

Thanks to:

https://qnalist.com/questions/5015704/hit-max-consecutive-under-replication-rotations-error

Flume HDFS sink keeps rolling small files

Original content provided by Stack Overflow.

Original link: https://stackoverflow.com/questions/39113408
