elasticsearch - I don't know how to filter my log file with grok and Logstash


I have a small Java app that loads logs similar to the ones below:

Fri May 29 12:10:34 BST 2015 trade id: 2 status :received
Fri May 29 14:12:36 BST 2015 trade id: 4 status :received
Fri May 29 17:15:39 BST 2015 trade id: 3 status :received
Fri May 29 21:19:43 BST 2015 trade id: 3 status :parsed
Sat May 30 02:24:48 BST 2015 trade id: 8 status :received
Sat May 30 08:30:54 BST 2015 trade id: 3 status :data not found
Sat May 30 15:38:01 BST 2015 trade id: 3 status :book not found
Sat May 30 23:46:09 BST 2015 trade id: 6 status :received

I want to use the ELK stack to analyse these logs and filter them. I need at least 3 filters: date, time, trade id, and status.

Here is what I did in the filter part of my Logstash configuration file:

filter {
  grok {
    match => { "message" => "%{DAY} %{MONTH} %{MONTHDAY} %{TIME} BST %{YEAR} trade id: %{NUMBER:tradeid}  status : %{WORD:status}" }
  }
}

At the moment I can't filter the logs the way I want.

You have extra spaces in your pattern around status, and since the status can span several words, you should use GREEDYDATA rather than WORD to capture the rest of the message:

filter {
  grok {
    match => { "message" => "%{DAY:day} %{MONTH:month} %{MONTHDAY:monthday} %{TIME:time} BST %{YEAR:year} trade id: %{NUMBER:tradeid} status :%{GREEDYDATA:status}" }
  }
}
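If you also want events indexed by the log's own timestamp rather than the ingestion time, one option (not part of the original answer, just a sketch) is to join the captured fields back into a single string with the mutate filter and parse it with the date filter. The `logdate` field name is an arbitrary choice:

```
filter {
  # (grok filter from above goes here)

  # Reassemble the captured pieces into one temporary field.
  mutate {
    add_field => { "logdate" => "%{day} %{month} %{monthday} %{time} BST %{year}" }
  }

  # Parse it into @timestamp using a Joda-style format, then drop the
  # temporary field on success.
  date {
    match => [ "logdate", "EEE MMM dd HH:mm:ss zzz yyyy" ]
    remove_field => [ "logdate" ]
  }
}
```

Time-zone abbreviations like BST can be ambiguous, so check that the parsed `@timestamp` matches your expectations.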

For the log line:

Sat May 30 15:38:01 BST 2015 trade id: 3 status :book not found

you end up with JSON like this:

{
       "message" => "Sat May 30 15:38:01 BST 2015 trade id: 3 status :book not found",
      "@version" => "1",
    "@timestamp" => "2015-08-18T18:28:47.195Z",
          "host" => "Gabriels-MacBook-Pro.local",
           "day" => "Sat",
         "month" => "May",
      "monthday" => "30",
          "time" => "15:38:01",
          "year" => "2015",
       "tradeid" => "3",
        "status" => "book not found"
}
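A quick way to iterate on the pattern (a sketch, assuming a local Logstash install; the file name `test.conf` is arbitrary) is a throwaway pipeline that reads lines from stdin and prints each parsed event with the rubydebug codec:

```
# test.conf - paste log lines into stdin, inspect parsed fields on stdout
input { stdin { } }

filter {
  grok {
    match => { "message" => "%{DAY:day} %{MONTH:month} %{MONTHDAY:monthday} %{TIME:time} BST %{YEAR:year} trade id: %{NUMBER:tradeid} status :%{GREEDYDATA:status}" }
  }
}

output { stdout { codec => rubydebug } }
```

Run it with `bin/logstash -f test.conf`, paste one of the log lines, and check that the fields appear and that no `_grokparsefailure` tag shows up.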
