Tags: mysql, sql, database, oracle, database-performance

Is it good to use SQL functions like JSON_EXTRACT for a large number of records?


I'm implementing a database that is going to receive more than 2000 records per day, and I'm going to fetch the data using SQL functions such as MySQL JSON_EXTRACT and Oracle JSON_VALUE, because I'm storing the data in a JSON column.

I will use the MySQL JSON_EXTRACT and Oracle JSON_VALUE functions to search a date range like below:

SELECT *
FROM audit
WHERE json_extract(detail_x, '$.eventDt') > '2017-10-01 00:00:00'
  AND json_extract(detail_x, '$.eventDt') < '2018-11-01 00:00:00'

The max date range will be 30 days, so the max row count will be around 2000 * 30 = 60,000.

My question is: is it fine to use SQL functions such as MySQL JSON_EXTRACT and Oracle JSON_VALUE for this scenario?

Open to discuss. Thanks.


Solution

  • If you know that all rows have eventDt, then you should extract that into a column. An index on the column will then speed many of your queries. Admittedly, the index may not be useful for such a wide timespan, but it should improve performance when you are retrieving a small number of records.

    The purpose of JSON should be to store data that has irregular formats that don't fit into columns. MySQL doesn't index JSON directly.

    You can always add an index using a generated column:

    alter table audit
        add column eventDt datetime
            generated always as (json_unquote(json_extract(detail_x, '$.eventDt'))) stored;

    create index idx_audit_eventDt on audit(eventDt);
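
    Once the generated column and index are in place, the range query can filter on the plain column instead of calling json_extract per row, which lets the optimizer consider the index. A minimal sketch, assuming the column and index defined above:

    SELECT *
    FROM audit
    WHERE eventDt > '2017-10-01 00:00:00'
      AND eventDt < '2018-11-01 00:00:00';

    On the Oracle side, a comparable option is a function-based index on the JSON_VALUE expression (a sketch, assuming Oracle 12.2 or later, where the RETURNING TIMESTAMP clause is available, and the same hypothetical detail_x column):

    CREATE INDEX idx_audit_eventDt_fn ON audit
        (JSON_VALUE(detail_x, '$.eventDt' RETURNING TIMESTAMP));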