I'm working with GridDB to manage and query time series data. I've set up a container to store time series data points with a timestamp as the primary key. When I query the data using specific time intervals, I've noticed that the results sometimes omit certain data points that should fall within the specified range. This inconsistency seems to happen more frequently when querying large datasets with small time intervals (e.g., querying minute-by-minute data over a month).
Here’s an example of the query I'm using:
SELECT * FROM my_time_series
WHERE timestamp BETWEEN TIMESTAMP('2023-01-01T00:00:00Z') AND TIMESTAMP('2023-01-31T23:59:59Z')
In some cases, the query returns fewer data points than expected, even though the data exists in the container. I’ve verified that the missing data points do fall within the specified range. Has anyone else encountered this issue with GridDB? Is there something I'm missing in how time intervals or time series queries are handled? Any insights or workarounds would be greatly appreciated.
I found that the issue is likely due to a mismatch between the precision GridDB uses for timestamps and the precision specified in the query: GridDB stores TIMESTAMP values at millisecond resolution, while my range boundaries were specified only to the second. A row stamped 2023-01-31 23:59:59.500, for example, falls after the upper bound TIMESTAMP('2023-01-31T23:59:59Z') and is silently excluded from the BETWEEN range. A few steps can be taken to mitigate the issue:

- Specify range boundaries at full millisecond precision (e.g., an upper bound of 23:59:59.999).
- Better yet, use a half-open range (timestamp >= start AND timestamp < start-of-next-period), which captures every row in the period regardless of sub-second components.
- Cross-check results with an aggregation such as COUNT over the same range to confirm no rows are being dropped at the boundaries.
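To make the boundary behavior concrete, here is a minimal Python sketch (timestamps and container contents are illustrative, not taken from my actual data) showing how a closed, second-precision upper bound drops rows with non-zero milliseconds, while a half-open range does not:

```python
from datetime import datetime

# Illustrative millisecond-precision timestamps near the end of January.
rows = [
    datetime(2023, 1, 31, 23, 59, 58, 250_000),  # 23:59:58.250
    datetime(2023, 1, 31, 23, 59, 59, 0),        # 23:59:59.000
    datetime(2023, 1, 31, 23, 59, 59, 500_000),  # 23:59:59.500 -- lost by BETWEEN
    datetime(2023, 2, 1, 0, 0, 0, 0),            # next period; must stay excluded
]

start = datetime(2023, 1, 1, 0, 0, 0)
end_closed = datetime(2023, 1, 31, 23, 59, 59)  # second-precision upper bound
end_open = datetime(2023, 2, 1, 0, 0, 0)        # first instant of the next period

# BETWEEN-style closed interval: the 23:59:59.500 row falls outside it.
between = [t for t in rows if start <= t <= end_closed]

# Half-open interval: keeps all January rows, excludes February.
half_open = [t for t in rows if start <= t < end_open]

print(len(between))    # 2 rows
print(len(half_open))  # 3 rows
```

The same half-open pattern translates directly to the query: replace BETWEEN with `timestamp >= TIMESTAMP('2023-01-01T00:00:00Z') AND timestamp < TIMESTAMP('2023-02-01T00:00:00Z')`, which is correct at any timestamp precision.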