
Logstash: Handling of large messages


If your elasticsearch output has a document_id set, every event is written to that document ID (the default action in Logstash is to index the data, which overwrites the document if it already exists).

In your case, you'd need to include some unique field in each of your JSON messages and then rely on that field to do the merge in Elasticsearch. For example:

{"key":"123455","attachment1":"something big"}
{"key":"123455","attachment2":"something big"}
{"key":"123455","attachment3":"something big"}
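The intended merge behaviour, where each partial event contributes its fields to one document keyed by "key", can be sketched in plain Python. This is a toy simulation of the end result, not the Elasticsearch API:

```python
# Partial events as they would arrive in Logstash, all sharing one key.
events = [
    {"key": "123455", "attachment1": "something big"},
    {"key": "123455", "attachment2": "something big"},
    {"key": "123455", "attachment3": "something big"},
]

index = {}  # simulates the ES index: document_id -> merged document
for event in events:
    doc = index.setdefault(event["key"], {})
    doc.update(event)  # new fields are added; a repeated field would be overwritten

print(index["123455"])
```

Each event lands in the same document, so after all three are processed the document holds attachment1, attachment2, and attachment3 together.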

And then have an elasticsearch output like:

elasticsearch {
  host => "localhost"
  document_id => "%{key}"
}
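One caveat: with the default index action, a later event that hits the same document_id replaces the whole document rather than merging its fields into it. To get a true field-by-field merge, the output would likely need the update action with doc_as_upsert enabled (both are options of the elasticsearch output plugin); roughly:

```
elasticsearch {
  host => "localhost"
  document_id => "%{key}"
  action => "update"        # merge into the existing document
  doc_as_upsert => true     # create the document if it doesn't exist yet
}
```

This way the first event creates the document and each subsequent event with the same key adds its attachment field to it.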