Tags: logstash, logstash-configuration, logstash-jdbc

Logstash: TZInfo::AmbiguousTime exception when parsing JDBC column


I am getting this exception when fetching my data with the Logstash JDBC input plugin:

error:
  26413962
  Sequel::InvalidValue
  TZInfo::AmbiguousTime: 2017-11-05T01:30:00+00:00 is an ambiguous local time.

This probably happens because I am already converting the time zone in my JDBC plugin with this parameter:

   jdbc_default_timezone => "America/New_York"

Therefore 1:30am happened twice on November 5th (when the clocks fell back), and I suspect Logstash doesn't know which of the two occurrences to pick and falls into an infinite loop.
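The ambiguity itself can be reproduced outside Logstash. A minimal Python sketch (using the standard-library `zoneinfo` module, which is not part of the original question) shows that 2017-11-05 01:30 in America/New_York maps to two different UTC instants:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

ny = ZoneInfo("America/New_York")

# fold=0: the first occurrence of 01:30 (still on EDT, UTC-4)
first = datetime(2017, 11, 5, 1, 30, fold=0, tzinfo=ny)
# fold=1: the second occurrence of 01:30 (after the fall-back, EST, UTC-5)
second = datetime(2017, 11, 5, 1, 30, fold=1, tzinfo=ny)

print(first.astimezone(timezone.utc))   # 2017-11-05 05:30:00+00:00
print(second.astimezone(timezone.utc))  # 2017-11-05 06:30:00+00:00
```

A library that refuses to choose between the two (as TZInfo does here) raises an error instead of silently picking one.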

As a workaround, I removed the jdbc_default_timezone parameter and instead convert my values to UTC in the select statement, like this:

 DATEADD(hh, DATEDIFF(hh, GETDATE(), GETUTCDATE()), th.[DueDate]) as DueDate 

But this workaround is tedious, since I need to modify the date columns in all of my Logstash inputs.
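For reference, this is roughly what the resulting input block looks like with the workaround applied. The connection string, credentials, and table name below are illustrative placeholders, not details from the original setup:

```conf
input {
  jdbc {
    # Placeholder connection settings -- adapt to your environment
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=mydb"
    jdbc_user => "logstash"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    # Note: no jdbc_default_timezone here; the statement already returns UTC
    statement => "SELECT DATEADD(hh, DATEDIFF(hh, GETDATE(), GETUTCDATE()), th.[DueDate]) AS DueDate FROM TaskHeader th"
  }
}
```

Without jdbc_default_timezone, the plugin never has to interpret an ambiguous local time, because the values it receives are already unambiguous UTC.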

Is there a way to force it to pick either of the two possible times, or a more elegant solution?


Solution

  • It seems that this is a known bug in the Logstash JDBC input plugin; it is flagged as a P2 enhancement: https://github.com/logstash-plugins/logstash-input-jdbc/issues/121

    Meanwhile, the workaround is to convert all date and timestamp columns to UTC in the SQL query, as described in the question above (MS SQL Server version), or like this for Oracle:

    select from_tz(cast(<column> as timestamp), 'CET') at time zone ('EST') "@timestamp"
        from <table>
        where ...
    

    We also need to remove the jdbc_default_timezone parameter from the input file, and from the filter if applicable.