Tags: elasticsearch, logstash, logstash-grok, kibana-4

Logstash output to ElasticSearch With Valid Types


The ELK stack has been set up successfully.

Using grokdebug.herokuapp.com, I have verified that my grok patterns are valid, and the events are getting dumped into Elasticsearch.

filter {
  if [type] == "some_log" {
    grok {
      match => { "message" => '%{WORD:word_1} %{TIME:time_1} %{DATE:date_1} %{NUMBER:number_1}' }
      overwrite => "message"
    }
  }
}

This grok filter parses the input correctly.

The output section is:

output {
  elasticsearch {
    protocol => "http"
  }
}

The problem is that all the dumped fields are of type string.

How can I get them indexed in Elasticsearch with their correct mapping types?

time_1, date_1, and number_1 all end up with the same mapping:

"time_1":{
    "type":"string",
    "norms":{
        "enabled":false
            },
     "fields":{ 
            "raw":{
                 "type":"string",
                 "index":"not_analyzed",
                 "ignore_above":256
                  }
              }
          }

I want date_1 to be indexed as a date type and number_1 to be indexed as a number type in Elasticsearch.
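In other words, I would like to end up with a mapping along these lines (a sketch only; the exact type names, e.g. long vs. integer, and the default date format depend on the Elasticsearch version):

    "date_1": {
      "type": "date"
    },
    "number_1": {
      "type": "long"
    }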

PS: Is it possible to determine the type of an Elasticsearch field from Logstash?

Or: how can I send those fields to Elasticsearch with the proper types?

Thanks


Solution

  • In your grok pattern, use the form %{PATTERN:field:datatype} to turn the captured fields into something other than strings. Valid data types are "int" and "float". In your case you'd e.g. use %{NUMBER:number_1:int} to turn your number_1 field into an integer.

    See the grok filter documentation under Grok Basics.

    Another option is to use the mutate filter to convert the type of existing fields:

    mutate {
      convert => ["name-of-field", "integer"]
    }
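    Putting the first option together with the pattern from the question, the filter could look like this (a sketch; field names are taken from the question, and note that grok's inline casting only supports int and float — getting date_1 indexed as a real date would additionally require something like Logstash's date filter, which is not covered by the answer above):

        filter {
          if [type] == "some_log" {
            grok {
              # :float casts number_1 at capture time instead of leaving it a string
              match => { "message" => '%{WORD:word_1} %{TIME:time_1} %{DATE:date_1} %{NUMBER:number_1:float}' }
              overwrite => "message"
            }
            # Alternatively, convert an already-captured string field afterwards:
            # mutate {
            #   convert => ["number_1", "integer"]
            # }
          }
        }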
    
