I am using Elasticsearch 1.5.1, Kibana 4.0.2 (linux-x86), and Logstash 1.4.2. My Logstash config looks like this:
input {
  redis {
    data_type => 'list'
    key => 'pace'
    password => 'bhushan'
    type => 'pace'
  }
}
filter {
  geoip {
    source => "mdc.ip"
    target => "geoip"
    database => "/opt/logstash-1.4.2/vendor/geoip/GeoLiteCity.dat"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
  }
}
output {
  if [type] == "pace" {
    elasticsearch {
      template_overwrite => true
      host => "localhost"
      index => 'pace'
      template => "/opt/logstash-1.4.2/mytemplates/elasticsearch-template.json"
      template_name => "bhushan"
    }
  }
  stdout {
    codec => rubydebug
  }
}
My elasticsearch-template.json is:
{
  "template" : "bhushan",
  "settings" : {
    "index.refresh_interval" : "5s"
  },
  "mappings" : {
    "_default_" : {
      "_all" : { "enabled" : true },
      "dynamic_templates" : [ {
        "string_fields" : {
          "match" : "*",
          "match_mapping_type" : "string",
          "mapping" : {
            "type" : "string", "index" : "analyzed", "omit_norms" : true,
            "fields" : {
              "raw" : { "type" : "string", "index" : "not_analyzed", "ignore_above" : 256 }
            }
          }
        }
      } ],
      "properties" : {
        "@version" : { "type" : "string", "index" : "not_analyzed" },
        "geoip" : {
          "type" : "object",
          "dynamic" : true,
          "properties" : {
            "location" : { "type" : "geo_point" }
          }
        }
      }
    }
  }
}
When I query the field mapping with curl http://localhost:9200/pace/_mapping/pace/field/geoip.location?pretty, I get:
{
  "pace" : {
    "mappings" : {
      "pace" : {
        "geoip.location" : {
          "full_name" : "geoip.location",
          "mapping" : {
            "location" : {
              "type" : "double"
            }
          }
        }
      }
    }
  }
}
An example log record looks like this:
{
    "thread_name" => "main",
    "mdc.ip" => "14.X.X.X",
    "message" => "Hii, I m in info",
    "@timestamp" => "2015-05-15T10:18:32.904+05:30",
    "level" => "INFO",
    "file" => "Test.java",
    "class" => "the.bhushan.log.test.Test",
    "line_number" => "15",
    "logger_name" => "bhushan",
    "method" => "main",
    "@version" => "1",
    "type" => "pace",
    "geoip" => {
        "ip" => "14.X.X.X",
        "country_code2" => "IN",
        "country_code3" => "IND",
        "country_name" => "India",
        "continent_code" => "AS",
        "region_name" => "16",
        "city_name" => "Mumbai",
        "latitude" => 18.974999999999994,
        "longitude" => 72.82579999999999,
        "timezone" => "Asia/Calcutta",
        "real_region_name" => "Maharashtra",
        "location" => [
            [0] 72.82579999999999,
            [1] 18.974999999999994
        ],
        "coordinates" => [
            [0] "72.82579999999999",
            [1] "18.974999999999994"
        ]
    }
}
I thought my problem was the same as this one, so I did everything mentioned in that link, like deleting all the old indices and restarting Logstash and Elasticsearch, but no luck. Any help is appreciated.
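For reference, this is roughly how I deleted the old index (the standard delete-index call, using the pace index from my config):

# drop the stale 'pace' index so it gets recreated with the template applied
curl -XDELETE 'http://localhost:9200/pace'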
Your Logstash filter is storing the coordinates in the field geoip.coordinates; however, in your elasticsearch-template.json mapping the field is called geoip.location. This shows up in your sample log record, where you can see the two fields location and coordinates in the geoip sub-object.
I think if you change this in your Logstash filter, you should be good:
From this
add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
To this
add_field => [ "[geoip][location]", "%{[geoip][longitude]}" ]
add_field => [ "[geoip][location]", "%{[geoip][latitude]}" ]
UPDATES
A few more things worth fixing:
- Both add_field directives in the geoip filter can be removed, as they are unnecessary.
- "path": "full" can be removed, as it has been deprecated since ES v1.0.
- The "template" value in elasticsearch-template.json should be pace instead of bhushan, i.e. the name of the index where the log records are stored.
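Putting those updates together, the filter reduces to something like this (a sketch based on the config in the question; as your sample log record shows, the geoip filter already emits geoip.location as [lon, lat] on its own):

filter {
  geoip {
    # no add_field needed: geoip populates geoip.location by itself
    source => "mdc.ip"
    target => "geoip"
    database => "/opt/logstash-1.4.2/vendor/geoip/GeoLiteCity.dat"
  }
}

And elasticsearch-template.json would start with "template" : "pace", so that the template actually matches the index the records are written to.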