Tags: elasticsearch, logstash, logstash-jdbc

Logstash jdbc plugin: how use a date filter for a column?


In my MS SQL table, I have a column whose dates are stored as strings in the format "yyyy-MM-dd HH:mm:ss.SSS", for example: 1999-10-06 00:00:00.000 or 2000-04-27 00:00:00.000.

During the import, I want to convert this column's values into a proper date type.

I saw that a filter could do this kind of transformation, but the examples I found only parse full-text log lines shipped by Beats, using grok-style regexes.

How can I use a filter on a SQL column? For example, how should I adapt this configuration file to make it work?

    input {
        jdbc {
            jdbc_connection_string => "jdbc:localhost;"
            jdbc_user => "user"
            jdbc_password => "pass"
            jdbc_driver_library => "C:\Program Files (x86)\jdbc\sqljdbc_6.0\enu\sqljdbc42.jar"
            jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
            statement => "SELECT * FROM SOLD"
        }
    }
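
As an aside, the `jdbc_connection_string` above looks incomplete; a SQL Server JDBC URL normally names the host, port, and database. A sketch, where the server, port, and database name are placeholders to adapt:

```
jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=mydb"
```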

    filter {
        date {
            match => [ "DATE_COLUMN", "YYYY-MM-dd 00:00:00.000" ]
        }
    }

    output {
        elasticsearch {
            hosts => "localhost:9200"
            index => "indexname"
            document_type => "typename"
        }
    }

At the moment this configuration does not work; it creates this text field in Elasticsearch:

 "date_column": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
          }
     }
  }
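
This `text` + `keyword` pair is what Elasticsearch's dynamic mapping generates for any JSON string field it has not seen before. You can confirm what was actually mapped with a request like the following (the index name here is the one from the configuration above):

```
GET /indexname/_mapping
```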

Solution

  • The answer was to write the column name in lowercase in the date filter, with a date pattern that matches the stored values:

    filter
    {
         date {
             match => [ "date_column", "yyyy-MM-dd HH:mm:ss.SSS" ]
        }
    }
    

    And to define an explicit Elasticsearch mapping for this date format:

    PUT /index
    {
        "mappings": {
            "type": {
                "properties": {
                    "date_column": {
                        "type": "date",
                        "format": "yyyy-MM-dd HH:mm:ss.SSS"
                    }
                }
            }
        }
    }
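
The lowercase name works because the jdbc input lowercases column names by default (its `lowercase_column_names` option defaults to true), so the event carries a field named `date_column`, not `DATE_COLUMN`. One more point worth knowing: the date filter writes its parsed result to `@timestamp` by default; to overwrite the original field instead, set `target`. A sketch combining both:

```
filter {
    date {
        match => [ "date_column", "yyyy-MM-dd HH:mm:ss.SSS" ]
        # By default the parsed date goes to @timestamp; uncomment
        # to store it back into the original field instead:
        # target => "date_column"
    }
}
```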