I have a PostgreSQL database with a table that holds millions of records. The table has the columns auto_id (backed by a sequence), id, schedule, date, and status.
auto_id is the primary key, with the sequence attached to it.
In my JPA repository I have a method with the following query:

@Modifying
@Transactional
@Query("delete from Rate r where r.id IN :rateIds")
void deleteByIds(@Param("rateIds") List<Integer> rateIds);
The query deletes by the id column, and the list passed in holds around 100 ids. However, for each of those ids there are thousands of rows in the table, so a single call removes a very large number of records, and when I execute this code it takes far too long to respond.
There is an index on this table combining the id, schedule, date and status columns.
Can anyone suggest how to speed up this deletion? Is it OK to add an index on the id column alone, and will that make it faster?
Thank you.
Several solutions can be applied! Here are the ones I can propose, to the best of my knowledge.
An index would certainly help, although indexes are at their best when data needs to be retrieved rather than deleted: the delete still has to locate the matching rows through the index, but every index on the table also has to be maintained while rows are removed.
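If you do add an index on the id column alone, it is a one-line DDL statement. Below is a sketch of running it from Spring; the index name idx_rate_id and the table name rate are assumptions, and in practice the statement would usually live in a Flyway/Liquibase migration or be run once in psql.

import org.springframework.jdbc.core.JdbcTemplate;

public class RateIndexMigration {

    private final JdbcTemplate jdbcTemplate;

    public RateIndexMigration(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    // Single-column B-tree index on rate.id so that
    // DELETE ... WHERE id IN (...) can find matching rows
    // without scanning the whole table.
    public void createIdIndex() {
        jdbcTemplate.execute("CREATE INDEX IF NOT EXISTS idx_rate_id ON rate (id)");
    }
}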
Try moving the delete into a stored procedure, since procedures are pre-compiled on the database side and tend to execute faster than ad-hoc native queries.
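Calling such a procedure from Spring could look roughly like the sketch below. It assumes a PostgreSQL function delete_rates_by_ids(integer[]) has already been created on the database side; the function name and signature are made up for illustration.

import java.util.List;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.PreparedStatementCallback;
import org.springframework.stereotype.Repository;

@Repository
public class RateProcedureDao {

    private final JdbcTemplate jdbcTemplate;

    public RateProcedureDao(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    // Hands the whole id list to the server-side function in one call,
    // so the looping and deleting happen entirely inside PostgreSQL.
    public void deleteRates(List<Integer> rateIds) {
        jdbcTemplate.execute("SELECT delete_rates_by_ids(?)",
                (PreparedStatementCallback<Void>) ps -> {
                    // Bind the Java list as a PostgreSQL integer[] parameter.
                    ps.setArray(1, ps.getConnection()
                            .createArrayOf("integer", rateIds.toArray()));
                    ps.execute();
                    return null;
                });
    }
}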
Give this delete operation an asynchronous touch if possible: accept the delete request, send the acknowledgement back as soon as you receive the call, and let the actual deletion run in the background (your list may grow or shrink, and you cannot always rely on DB performance).
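On the application side this can be a simple Spring @Async hand-off. The sketch below assumes @EnableAsync is configured and that the repository from the question is called RateRepository.

import java.util.List;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;

@Service
public class RateDeletionService {

    private final RateRepository rateRepository;

    public RateDeletionService(RateRepository rateRepository) {
        this.rateRepository = rateRepository;
    }

    // Runs on a separate thread (requires @EnableAsync on a configuration class),
    // so the caller is acknowledged immediately while the slow delete
    // keeps running in the background.
    @Async
    public void deleteRatesInBackground(List<Integer> rateIds) {
        rateRepository.deleteByIds(rateIds);
    }
}

If the caller later needs to know whether the delete finished, the method can return a CompletableFuture instead of void.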
Make use of JdbcTemplate instead of JPA, as it skips the persistence-context overhead and is faster (this article has some related discussion: Spring Data JPA Is Too Slow).
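A minimal sketch of the same delete through NamedParameterJdbcTemplate, which expands the id list into an IN (...) clause and never loads any entities; the table name rate is an assumption, so adjust it to your schema.

import java.util.List;
import java.util.Map;
import org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate;
import org.springframework.stereotype.Repository;

@Repository
public class RateJdbcDao {

    private final NamedParameterJdbcTemplate jdbcTemplate;

    public RateJdbcDao(NamedParameterJdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    // Issues one plain SQL DELETE; the :rateIds collection is expanded
    // into the IN list, and nothing goes through the JPA persistence context.
    public int deleteByIds(List<Integer> rateIds) {
        return jdbcTemplate.update(
                "DELETE FROM rate WHERE id IN (:rateIds)",
                Map.of("rateIds", rateIds));
    }
}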