elasticsearch, geolocation, logstash, kibana, logstash-jdbc

How to map latitude and longitude values to a geo_point when importing a database with the Logstash JDBC input plugin, for use in Kibana?


I am importing a MariaDB database into Elasticsearch using the JDBC input plugin within Logstash. The objective is to build some geographical reports in Kibana.

The data gets indexed, yet I can't use the indexed longitude and latitude for the reports, because Kibana reports that there is no "geo_point" field within the index.

I tried to configure the JDBC plugin to interpret the geo values, but that doesn't seem to be its job; it looks like I should instead alter the field mappings of the index in Elasticsearch.

Can I alter the mappings of an index after creation? If so, how?


Solution

  • You can add a new field with the geo_point type to the index using the put mapping API: https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-put-mapping.html
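
    For example, assuming your index is named "my-index" (substitute your own index name), a request along these lines adds a "location" field of type geo_point to the existing mapping:

    PUT my-index/_mapping
    {
      "properties": {
        "location": {
          "type": "geo_point"
        }
      }
    }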

    Then, in the Logstash config, you can add a filter to populate this field. Assuming that you added a new field named "location" and you are selecting "lat" and "lon" in the SQL query, the filter should look like this:

    input {
      jdbc {
        ...
      }
    }
    filter {
      # Only build the geo_point when both coordinates are present
      if [lat] and [lon] {
        mutate {
          add_field => { "[location][lat]" => "%{[lat]}" }
          add_field => { "[location][lon]" => "%{[lon]}" }
        }
        # add_field produces strings, so convert them to numbers
        mutate {
          convert => { "[location][lat]" => "float" }
          convert => { "[location][lon]" => "float" }
        }
      }
    }
    output {
      elasticsearch {
        ...
      }
    }
    

    Note: this works in version 7.2.
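
    Once the pipeline is running, you can confirm the mapping took effect with the get mapping API (the index name "my-index" below is just a placeholder for your own):

    GET my-index/_mapping

    The indexed documents should then carry a "location" object such as { "lat": 41.12, "lon": -71.34 }, which Kibana can use for map visualizations.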