
Fluentd error: “buffer space has too many data”


The default buffer type is memory; see: https://github.com/uken/fluent-plugin-elasticsearch/blob/master/lib/fluent/plugin/out_elasticsearch.rb#L63

There are two disadvantages to this type of buffer:
- If the pod or container is restarted, logs that are still in the buffer will be lost.
- If all the RAM allocated to Fluentd is consumed, logs will no longer be sent.
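
For reference, a minimal sketch of an output configuration without an explicit <buffer> section, which falls back to the in-memory buffer described above; the host and port here are hypothetical placeholders, not values from this setup:

<match **>
  @type elasticsearch
  host elasticsearch.logging.svc   # hypothetical endpoint, replace with your own
  port 9200
  # No <buffer> section, so the plugin uses its default in-memory buffer:
  # chunks are lost on restart and limited by the pod's RAM.
</match>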

Try using a file-based buffer instead, with a configuration like the one below:

<buffer>
  @type file
  path /fluentd/log/elastic-buffer
  flush_thread_count 8
  flush_interval 1s
  chunk_limit_size 32M
  queue_limit_length 4
  flush_mode interval
  retry_max_interval 30
  retry_forever true
</buffer>
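
For context, a sketch of how the <buffer> section sits inside a complete Elasticsearch <match> block; the host, port, and logstash_format values are assumptions and should be adapted to your cluster:

<match **>
  @type elasticsearch
  host elasticsearch.logging.svc   # hypothetical endpoint, replace with your own
  port 9200
  logstash_format true             # assumption: logstash-style index names

  # File-based buffer: chunks are persisted to disk instead of RAM.
  <buffer>
    @type file
    path /fluentd/log/elastic-buffer
    flush_thread_count 8
    flush_interval 1s
    chunk_limit_size 32M
    queue_limit_length 4
    flush_mode interval
    retry_max_interval 30
    retry_forever true
  </buffer>
</match>

Note that for the buffered chunks to survive a container restart in Kubernetes, /fluentd/log should usually be backed by a volume (for example an emptyDir or hostPath mount) rather than the container's writable layer.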