I have 2 tables:
Table person with columns: person_id, person_name
Table pet with columns: pet_id, owner_id, pet_name
person data:
1, 'John'
2, 'Jill'
3, 'Mary'
pet data:
1, 1, 'Fluffy'
2, 1, 'Buster'
3, 2, 'Doggy'
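
For reference, a minimal setup sketch of the above (the column types are assumptions, they are not given in the question):

CREATE TABLE person (
  person_id   int PRIMARY KEY
, person_name text
);

CREATE TABLE pet (
  pet_id   int PRIMARY KEY
, owner_id int REFERENCES person (person_id)
, pet_name text
);

INSERT INTO person VALUES (1, 'John'), (2, 'Jill'), (3, 'Mary');
INSERT INTO pet    VALUES (1, 1, 'Fluffy'), (2, 1, 'Buster'), (3, 2, 'Doggy');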
How do I write a SELECT query from person LEFT JOIN pet ON person_id = owner_id, with aggregate functions, so that my result data looks like:

1, [{pet_id:1,pet_name:'Fluffy'},{pet_id:2,pet_name:'Buster'}], 'John'
2, [{pet_id:3,pet_name:'Doggy'}], 'Jill'
3, [], 'Mary'
Use LEFT JOIN LATERAL and aggregate in the subquery:
SELECT p.person_id, COALESCE(pet.pets, '[]') AS pets, p.person_name
FROM   person p
LEFT   JOIN LATERAL (
   SELECT json_agg(json_build_object('pet_id', pet.pet_id
                                   , 'pet_name', pet.pet_name)) AS pets
   FROM   pet
   WHERE  pet.owner_id = p.person_id
   ) pet ON true
ORDER  BY p.person_id;  -- optional, Q suggests ordered results
db<>fiddle here
This way you do not need to aggregate in the outer query, which keeps it simpler and cleaner when your outer query is more complex than the example in the question. When aggregating multiple related tables, it even becomes a necessity, because aggregating everything in the outer query would multiply rows across the joins before aggregation:
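
A sketch of that multi-table case, assuming a second, hypothetical table toy (toy_id, owner_id, toy_name); one lateral subquery per table keeps the aggregates independent:

SELECT p.person_id, p.person_name
     , COALESCE(pet.pets, '[]') AS pets
     , COALESCE(toy.toys, '[]') AS toys
FROM   person p
LEFT   JOIN LATERAL (
   SELECT json_agg(json_build_object('pet_id', pet.pet_id
                                   , 'pet_name', pet.pet_name)) AS pets
   FROM   pet
   WHERE  pet.owner_id = p.person_id
   ) pet ON true
LEFT   JOIN LATERAL (
   SELECT json_agg(json_build_object('toy_id', t.toy_id
                                   , 'toy_name', t.toy_name)) AS toys
   FROM   toy t
   WHERE  t.owner_id = p.person_id
   ) toy ON true
ORDER  BY p.person_id;

With plain joins plus aggregation in the outer query, each person's pets and toys would be cross-multiplied before aggregation; the lateral subqueries avoid that.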
It is also typically much faster when there are selective predicates on the outer table person, which is the typical use case.
Make sure there is an index on pet(owner_id) to make it fast. Or even a covering index on pet(owner_id, pet_id, pet_name), or pet(owner_id) INCLUDE (pet_id, pet_name) in Postgres 11 or later, if your rows aren't wide (like in your example) and if you get index-only scans out of it.
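
The corresponding index definitions, roughly (the index names are made up; you would pick one of them):

CREATE INDEX pet_owner_id_idx ON pet (owner_id);

-- covering variants, for index-only scans:
CREATE INDEX pet_owner_covering_idx ON pet (owner_id, pet_id, pet_name);
CREATE INDEX pet_owner_include_idx  ON pet (owner_id) INCLUDE (pet_id, pet_name);  -- Postgres 11+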
Oh, and use json_build_object() to preserve attribute names for arbitrary selections of columns.
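
To illustrate, a quick comparison on the same pet table: an anonymous row constructor loses the column names, while json_build_object() keeps them:

SELECT json_agg((pet_id, pet_name)) AS pets   -- anonymous record: keys become f1, f2
FROM   pet
WHERE  owner_id = 1;
-- [{"f1":1,"f2":"Fluffy"},{"f1":2,"f2":"Buster"}]

SELECT json_agg(json_build_object('pet_id', pet_id
                                , 'pet_name', pet_name)) AS pets
FROM   pet
WHERE  owner_id = 1;
-- [{"pet_id":1,"pet_name":"Fluffy"},{"pet_id":2,"pet_name":"Buster"}]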