Logstash date parsing as timestamp using the date filter

I have tested your date filter, and it works for me.

Here is my configuration:

input {
    stdin {}
}
filter {
    date {
        locale => "en"
        match => ["message", "YYYY-MM-dd;HH:mm:ss.SSS"]
        timezone => "Europe/Vienna"
        target => "@timestamp"
        add_field => { "debug" => "timestampMatched" }
    }
}
output {
    stdout {
        codec => "rubydebug"
    }
}

And I use this input:

2014-08-01;11:00:22.123

The output is:

{
       "message" => "2014-08-01;11:00:22.123",
      "@version" => "1",
    "@timestamp" => "2014-08-01T09:00:22.123Z",
          "host" => "ABCDE",
         "debug" => "timestampMatched"
}
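Note how the local Vienna time (11:00, UTC+2 in August) becomes 09:00Z in @timestamp. As a sketch of the same conversion outside Logstash (assuming Python 3.9+ with the stdlib zoneinfo module):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# The same input line the date filter receives
raw = "2014-08-01;11:00:22.123"

# strptime pattern equivalent to "YYYY-MM-dd;HH:mm:ss.SSS"
local = datetime.strptime(raw, "%Y-%m-%d;%H:%M:%S.%f")

# Interpret the naive timestamp in the configured timezone, as the filter does
local = local.replace(tzinfo=ZoneInfo("Europe/Vienna"))

# @timestamp is always stored in UTC
utc = local.astimezone(timezone.utc)
print(utc.isoformat())  # 2014-08-01T09:00:22.123000+00:00
```

This illustrates why the output shows 09:00 rather than 11:00: Vienna is two hours ahead of UTC during summer time, and Elasticsearch stores @timestamp in UTC.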

So please make sure that your logTimestamp field actually has the correct value; the problem is probably elsewhere. Otherwise, please share your log event and Logstash configuration for further discussion. Thank you.


This worked for me, with a slightly different datetime format:

# Sample log line:
# 2017-11-22 13:00:01,621 INFO [AtlassianEvent::0-BAM::EVENTS:pool-2-thread-2] [BuildQueueManagerImpl] Sent ExecutableQueueUpdate: addToQueue, agents known to be affected: []
input {
    file {
        path => "/data/atlassian-bamboo.log"
        start_position => "beginning"
        type => "logs"
        codec => multiline {
            pattern => "^%{TIMESTAMP_ISO8601} "
            charset => "ISO-8859-1"
            negate => true
            what => "previous"
        }
    }
}
filter {
    grok {
        match => [ "message", "(?m)^%{TIMESTAMP_ISO8601:logtime}%{SPACE}%{LOGLEVEL:loglevel}%{SPACE}\[%{DATA:thread_id}\]%{SPACE}\[%{WORD:classname}\]%{SPACE}%{GREEDYDATA:logmessage}" ]
    }
    date {
        match => [ "logtime", "yyyy-MM-dd HH:mm:ss,SSS", "yyyy-MM-dd HH:mm:ss,SSS Z", "MMM dd, yyyy HH:mm:ss a" ]
        timezone => "Europe/Berlin"
    }
}
output {
    elasticsearch { hosts => ["localhost:9200"] }
    stdout { codec => rubydebug }
}
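To see what the grok filter extracts from the sample line, here is a rough Python equivalent of the pattern; the regex is a hand-written approximation of the TIMESTAMP_ISO8601, LOGLEVEL, DATA, WORD, and GREEDYDATA grok patterns, not the exact definitions Logstash ships:

```python
import re

# The sample line from the comment at the top of the config
line = ("2017-11-22 13:00:01,621 INFO "
        "[AtlassianEvent::0-BAM::EVENTS:pool-2-thread-2] "
        "[BuildQueueManagerImpl] Sent ExecutableQueueUpdate: addToQueue, "
        "agents known to be affected: []")

# Approximate regex mirroring the grok pattern's capture groups
pattern = re.compile(
    r"^(?P<logtime>\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2}[,.]\d{3})\s+"  # TIMESTAMP_ISO8601
    r"(?P<loglevel>[A-Z]+)\s+"          # LOGLEVEL
    r"\[(?P<thread_id>[^\]]+)\]\s+"     # [DATA]
    r"\[(?P<classname>\w+)\]\s+"        # [WORD]
    r"(?P<logmessage>.*)$",             # GREEDYDATA
    re.DOTALL,                          # (?m)/(?s)-style multiline messages
)

m = pattern.match(line)
print(m.group("logtime"))    # 2017-11-22 13:00:01,621
print(m.group("loglevel"))   # INFO
print(m.group("classname"))  # BuildQueueManagerImpl
```

The extracted logtime field ("2017-11-22 13:00:01,621") is exactly what the date filter then matches against "yyyy-MM-dd HH:mm:ss,SSS" and converts from Europe/Berlin to UTC for @timestamp.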