If my log prints the latitude and longitude of a given point, how can I capture this information so that it is processed as geospatial data in Elasticsearch?
Below I show an example of a document in Elasticsearch corresponding to a log line:
{
  "_index": "memo-logstash-2018.05",
  "_type": "doc",
  "_id": "DDCARGMBfvaBflicTW4-",
  "_version": 1,
  "_score": null,
  "_source": {
    "type": "elktest",
    "message": "LON: 12.5, LAT: 42",
    "@timestamp": "2018-05-09T10:44:09.046Z",
    "host": "f6f9fd66cd6c",
    "path": "/usr/share/logstash/logs/docker-elk-master.log",
    "@version": "1"
  },
  "fields": {
    "@timestamp": [
      "2018-05-09T10:44:09.046Z"
    ]
  },
  "highlight": {
    "type": [
      "@kibana-highlighted-field@elktest@/kibana-highlighted-field@"
    ]
  },
  "sort": [
    1525862649046
  ]
}
You can first extract LON and LAT into their own fields with a grok filter:
filter {
  grok {
    match => { "message" => "LON: %{NUMBER:LON}, LAT: %{NUMBER:LAT}" }
  }
}
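As a quick sanity check outside Logstash, the same extraction can be sketched with a plain regular expression (Python here; the grok `%{NUMBER}` pattern is approximated by a simple signed-decimal regex):

```python
import re

# Rough stand-in for grok's %{NUMBER} pattern
NUMBER = r"-?\d+(?:\.\d+)?"
pattern = re.compile(rf"LON: (?P<LON>{NUMBER}), LAT: (?P<LAT>{NUMBER})")

# Same message as in the example document
match = pattern.search("LON: 12.5, LAT: 42")
fields = {k: float(v) for k, v in match.groupdict().items()}
print(fields)  # {'LON': 12.5, 'LAT': 42.0}
```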
Once they are separated, you can use a mutate filter to nest them under a parent field, like this:
filter {
  mutate {
    # geo_point objects expect lowercase "lat" and "lon" sub-fields,
    # and numeric values are safest converted to float first
    convert => { "LON" => "float" }
    convert => { "LAT" => "float" }
    rename  => { "LON" => "[location][lon]" }
    rename  => { "LAT" => "[location][lat]" }
  }
}
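Note that renaming alone is not enough: Elasticsearch will not infer a geo_point from dynamic mapping, so the index mapping must declare the location field as geo_point. A minimal index template sketch for an ES 6.x-style index (the template name and index pattern are assumptions, adjust them to your index naming):

```
PUT _template/memo-logstash
{
  "index_patterns": ["memo-logstash-*"],
  "mappings": {
    "doc": {
      "properties": {
        "location": { "type": "geo_point" }
      }
    }
  }
}
```

The template must be in place before the index is created, so either apply it before indexing or reindex afterwards.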
Let me know if this helps.