A Spring application reads around 200 events per minute from a Kafka topic and updates a single table, with a WHERE clause on a combination of two columns. Proper indexes are set up on the composite key.
The processing time for the record upsert is almost constant (<30 ms), but during certain hours the processing time reaches up to 30 seconds for a single insert or update statement. These high processing times show no correlation with peak load times either.
So far we have checked CPU utilisation, memory, thread count and row lock wait times; there seem to be no issues there. Please suggest any other items to analyse for this issue.
Tech stack used for DB operations: Spring 4.3.0, Spring JdbcTemplate, C3P0 connection pool.
Code snippet:

String stmt = ...; // construct query
jdbcTemplate.update(stmt);
logger.info("Insert event is successful");
Sample update:
UPDATE tablename
SET
column1 = 'string1',
column2 = 'string2',
column3 = 0.1,
column4 = 'string3',
column5 = 0.1,
column6 = 'string4',
column7 = to_timestamp('2020-10-21_08.00.02_861', 'YYYY-MM-DD_HH24.MI.SS_FF3')
WHERE
compositeKeyA = 210
AND compositeKeyB = '123';
I ended up implementing the upsert with a dynamically generated prepared statement.
It looks like jdbcTemplate.update(stmt) creates a plain Statement object for execution; I am guessing the slow executions were due to the absence of statement caching, since every concatenated SQL string is unique and has to be parsed from scratch.
Switching to prepared statements seems to have resolved the issue.
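To illustrate why the switch helps, here is a minimal sketch (table and column names are taken from the sample update above; the demo only compares SQL text, it does not touch a database). Statement caches key on the exact SQL string, so text that changes with every event can never hit the cache, while a parameterized query keeps the text constant:

```java
public class StatementCacheDemo {

    // Concatenated form: the SQL text embeds the values, so it differs per event.
    static String concatenated(int keyA, String keyB, String value) {
        return "UPDATE tablename SET column1 = '" + value + "'"
                + " WHERE compositeKeyA = " + keyA
                + " AND compositeKeyB = '" + keyB + "'";
    }

    // Parameterized form: the SQL text is identical for every event, so a
    // statement cache can reuse the already-parsed statement and plan.
    // (Remaining columns would follow the same "column = ?" pattern.)
    static final String PARAMETERIZED =
            "UPDATE tablename SET column1 = ?"
            + " WHERE compositeKeyA = ? AND compositeKeyB = ?";

    public static void main(String[] args) {
        String first = concatenated(210, "123", "string1");
        String second = concatenated(211, "456", "string2");
        // Every event yields a new, never-seen SQL string: a guaranteed cache miss.
        System.out.println(first.equals(second)); // false
        // The parameterized text never changes, whatever the bind values are.
        System.out.println(PARAMETERIZED);
    }
}
```

With JdbcTemplate the parameterized form is executed via the standard varargs overload, e.g. jdbcTemplate.update(PARAMETERIZED, value, keyA, keyB). Note also that prepared statements only pay off if a cache is actually active: c3p0's statement cache is disabled by default (maxStatements defaults to 0), and the Oracle JDBC driver's implicit statement cache likewise has to be enabled explicitly.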
Thanks.