Tags: mysql, logstash, elastic-stack, logstash-jdbc

Logstash: getting a timestamp from MySQL doesn't work, but converting it to a string works?


I'm using this conf file to overwrite the @timestamp field in Elasticsearch, but I automatically get a _dateparsefailure tag:

input {
    jdbc {
        jdbc_driver_library => "C:/path/to/mariadb-java-client.jar"
        statement => "SELECT '${FIELD}' as field, from ${TABLE_NAME}"
        tracking_column => "timestamp"
        tracking_column_type => "timestamp" 
    }
}

filter {
    grok {
        match => ["timestamp","%{TIMESTAMP_ISO8601}"]
    }
    date {
        match => ["timestamp", "ISO8601"]
    }
}

Note that I get the same result with or without the grok filter.

The result:

{
    "@timestamp" => 2022-12-13T09:16:10.365Z,
    "timestamp" => 2022-11-23T10:36:13.000Z,              
    "@version" => "1",
    "tags" => [
        [0] "_dateparsefailure"
    ],
    "type" => "mytype",
}

But when I extract the timestamp with this conf:

input {
    *same input*
}

filter {
    grok {
        match => ["timestamp","%{TIMESTAMP_ISO8601:tmp}"]
        tag_on_failure => [ "_grokparsefailure"]
    }
    date {
        match => ["tmp", "ISO8601"]
    }
}

then it gives me the expected result:

{
    "@timestamp" => 2022-11-23T11:16:36.000Z,
    "@version" => "1",
    "timestamp" => 2022-11-23T11:16:36.000Z,
    "tmp" => "2022-11-23T11:16:36.000Z",
}

Can anyone explain why that is, and how I can avoid creating this extra field? Thanks


Solution

  • Ok, the date filter parses strings, I guess, and the grok-extracted tmp field is a string, so it parses fine. But the timestamp field coming out of the jdbc input already has the right type (a timestamp object, not a string), which is why the date filter fails on it. So a copy is enough to save the original value and overwrite the @timestamp field:

    filter {
        mutate {
            copy => { "@timestamp" => "insertion_timestamp" }
            copy => { "timestamp" => "@timestamp" }
            remove_field => [ "timestamp" ]
        }
    }
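
    Alternatively, a sketch if you'd rather keep the grok + date approach from the question: the date filter supports the common remove_field option, which is applied only when the parse succeeds, so the temporary tmp field can be dropped in the same step instead of lingering in the event:

    filter {
        grok {
            match => ["timestamp", "%{TIMESTAMP_ISO8601:tmp}"]
        }
        date {
            match => ["tmp", "ISO8601"]
            # common option: applied only if the date parse succeeds,
            # so tmp never reaches Elasticsearch on the happy path
            remove_field => [ "tmp" ]
        }
    }

    The mutate copy in the accepted answer is still the simpler route here, since it avoids the string round-trip entirely.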