I have a complex JSONB object (PostgreSQL 12) with arrays nested inside arrays, and so on. I want to search for all invoices that match specific criteria.
create table invoice (
    invoice_number text primary key not null,
    parts jsonb,
    ...
);
Object:
"parts": [
{
"groups": [
{
"categories": [
{
"items": [
{
...
"articleName": "article1",
"articleSize": "M",
},
{
...
"articleName": "article2"
"articleSize": "XXL",
}
]
}
]
}
]
},
{
"groups": [
...
]
},
]
I've built a native query to search for items with a specific articleName:
select * from invoice i,
jsonb_array_elements(i.parts) parts,
jsonb_array_elements(parts -> 'groups') groups,
jsonb_array_elements(groups -> 'categories') categories,
jsonb_array_elements(categories -> 'items') items
where items ->> 'articleName' like '%name%' and items ->> 'articleSize' = 'XXL';
I assume I could improve search speed with indexing. I've read about trigram indexes. Would that be the best type of index for my case? If yes, how do I build one for such a complex object?
Thanks in advance for any advice.
The only option that might speed this up is to create a GIN index on the parts
column and use a JSON path operator:
select *
from invoice
where parts @? '$.parts[*].groups[*].categories[*].items[*] ? (@.articleName like_regex "name" && @.articleSize == "XXL")'
But I doubt this is going to be fast enough, even if that uses the index.
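For reference, a minimal sketch of the index this refers to (assuming the column is named parts, as in the table definition above). The default jsonb_ops operator class supports the @? operator; jsonb_path_ops also supports @? and usually produces a smaller, faster index, but it does not support the key-existence operators (?, ?|, ?&):

```sql
-- Default operator class (jsonb_ops):
create index invoice_parts_gin on invoice using gin (parts);

-- Alternative: smaller index, still supports @?, @@ and @>,
-- but not the key-existence operators:
create index invoice_parts_path_gin on invoice using gin (parts jsonb_path_ops);
```

Note that the like_regex condition in the path expression is the part the index helps with least; an index lookup works best for exact matches such as the articleSize comparison.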