I am trying to geolocate requests in my Rails application. I have configured Lograge to generate my logs in JSON.
I think Logstash is not able to retrieve the remote_ip from the JSON, so the geoip lookup is never performed.
Here is the decoded JSON, with the empty geoip field, as shown in Kibana:
{
  "_index": "logstash-2016.03.15",
  "_type": "rails logs",
  "_id": "AVN6t1-FkghE9kQv20fc",
  "_score": null,
  "_source": {
    "@version": "1",
    "@timestamp": "2016-03-15T14:39:10.176Z",
    "client": {
      "host": "www.myapp.com",
      "remote_ip": "\"xx.xx.xx.xxx\"",
      "user_agent": "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/48.0.2564.116 Safari/537.36",
      "browser": "Chrome",
      "browser_version": "48.0.2564.116",
      "plateform": "windows"
    },
    "geoip": {}
  },
  "fields": {
    "@timestamp": [
      1458052750176
    ]
  },
  "sort": [
    1458052750176
  ]
}
Here is my logstash.conf:
input {
  file {
    type => "rails logs"
    # * is for indexing rotated logs
    path => "/var/www/myapp/shared/log/production.log*"
  }
}

filter {
  grok {
    match => [
      "message",
      "%{DATA:data}%{LOGLEVEL:loglevel} -- : %{GREEDYDATA:json}({({[^}]+},?\s*)*})?\s*$(?<stacktrace>(?m:.*))?"
    ]
    remove_field => ["message"]
  }
  json {
    source => "json"
    remove_field => ["json"]
  }
  geoip {
    source => "[client][remote_ip]"
    target => "geoip"
    database => "/etc/logstash/GeoLiteCity.dat"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
  }
  mutate {
    convert => [ "[geoip][coordinates]", "float" ]
  }
}

output {
  elasticsearch {
  }
}
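While debugging, it can help to temporarily print each event to the console; the standard stdout output with the rubydebug codec shows what the filters actually produced, including the parsed [client][remote_ip] value:

output {
  elasticsearch {
  }
  # Temporary debug output: prints each processed event to the
  # console so the parsed fields can be inspected.
  stdout {
    codec => rubydebug
  }
}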
Did I miss something in my configuration? Thanks in advance.
It looks like remote_ip is not parsed correctly: its value contains literal double quotes ("\"xx.xx.xx.xxx\"" rather than "xx.xx.xx.xxx"). I suspect the geoip filter does nothing because it does not recognize the quoted string as an IP address.
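If that is the cause, I guess stripping the quotes before geoip runs should fix it. Here is a minimal, untested sketch: a mutate with gsub placed after the json filter and before geoip, assuming the stray double quotes are the only extra characters in the field:

filter {
  mutate {
    # Untested sketch: strip literal double quotes so geoip
    # receives a bare IP address (xx.xx.xx.xxx) instead of
    # the quoted string ("xx.xx.xx.xxx").
    gsub => [ "[client][remote_ip]", "\"", "" ]
  }
}

With the quotes removed, the geoip filter should then be able to look the address up in GeoLiteCity.dat.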