How not to parse some fields by logstash?


You should be able to do this with a ruby filter:

filter {
    ruby {
        init => "require 'socket'"
        code => "
            event.set('host', Socket.gethostname.gsub(/\..*/, ''))
            event.set('request', event.get('request').to_s)
        "
    }
    if "restapi" in [tags] {
        ruby {
            code => '
                require "json"
                event.set("request", event.get("request").to_json)
            '
        }
        date {
            match  => [ "date_start", "yyyy-MM-dd HH:mm:ss" ]
            target => "date_start"
        }
        date {
            match  => [ "date_end", "yyyy-MM-dd HH:mm:ss" ]
            target => "date_end"
        }
        date {
            match  => [ "date", "yyyy-MM-dd HH:mm:ss" ]
            target => "date"
        }
    }
}

When testing this with a stubbed-out stdin/stdout pipeline:

input {
    stdin { codec => json }
}
# the filter {} block from above goes here
output {
    stdout { codec => rubydebug }
}

And testing like this:

echo '{ "startDate": "2015-05-27", "endDate": "2015-05-27", "request" : {"requestId":"123","field2":1,"field2": 2,"field3":3} }' | bin/logstash -f test.conf

It outputs this:

{     "startDate" => "2015-05-27",       "endDate" => "2015-05-27",       "request" => "{\"requestId\"=>\"123\", \"field2\"=>2, \"field3\"=>3}",      "@version" => "1",    "@timestamp" => "2017-02-09T14:37:02.789Z",          "host" => "xxxx"}

So I've answered your original question. You should ask another question if you can't figure out why your template isn't working.


Elasticsearch analyzes string fields by default. If what you need is simply to not analyze the request field, change how it is indexed by setting "index": "not_analyzed" in that field's mapping.
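
For example, here is a minimal sketch of such a mapping, assuming a hypothetical index test_index with a type logs on a local Elasticsearch 2.x node ("not_analyzed" applies to string fields in 2.x; in 5.x and later you would map the field as type keyword instead):

# hypothetical index/type names; adjust to your own index and document type
curl -XPUT 'http://localhost:9200/test_index' -H 'Content-Type: application/json' -d '
{
  "mappings": {
    "logs": {
      "properties": {
        "request": {
          "type": "string",
          "index": "not_analyzed"
        }
      }
    }
  }
}'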

More info is available in the Elasticsearch mapping documentation.