I have a PL/pgSQL function that moves data from a staging table to our target table. The process runs every night. Sometimes, after a server restart or maintenance, we have to run the process manually.
The problem I am facing: whenever we start the process manually after 7 AM, it takes almost 2 hours to complete (reading from the staging table and inserting into the target table). But whenever it runs on schedule, i.e. before 7 AM, it takes 22-25 minutes on average.
What could be the issue? If required, I can share my function snippet here.
The typical reason would be general concurrent activity in the database, which competes for the same resources as your function and may cause lock contention. Check your DB log for activity starting around 7 a.m.
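While a slow run is in progress, you can also check for blocking directly from the system catalogs. A minimal sketch; `pg_blocking_pids()` requires PostgreSQL 9.6 or later:

```sql
-- Sessions that are currently blocked, and which backends block them
SELECT pid,
       pg_blocking_pids(pid) AS blocked_by,
       wait_event_type,
       state,
       now() - xact_start AS xact_age,
       left(query, 60)    AS query
FROM   pg_stat_activity
WHERE  cardinality(pg_blocking_pids(pid)) > 0;
```

If your function's backend shows up here with a long `xact_age`, the `blocked_by` column tells you which sessions to investigate.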
A function always runs as a single transaction. Locks are acquired along the way and only released at the end of the transaction. This makes long-running functions particularly vulnerable to lock contention.
You may be able to optimize general performance as well as the behavior towards concurrent transactions to make it run faster. Or, more radically: if at all possible, split your big function into separate parts, which you call in separate transactions.
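On PostgreSQL 11 or later, one way to do that without an external driver is a procedure that commits between batches, so locks are released after every chunk instead of at the very end. A sketch under assumptions: the table names `staging` and `target`, the `id` column, and the batch size are placeholders for your actual setup:

```sql
CREATE OR REPLACE PROCEDURE copy_staging_to_target(_batch_size int DEFAULT 10000)
LANGUAGE plpgsql AS
$$
DECLARE
   _rows int;
BEGIN
   LOOP
      -- Move one batch from staging to target.
      -- "staging", "target", and "id" stand in for your actual schema.
      WITH del AS (
         DELETE FROM staging
         WHERE  id IN (SELECT id FROM staging LIMIT _batch_size)
         RETURNING *
      )
      INSERT INTO target
      SELECT * FROM del;

      GET DIAGNOSTICS _rows = ROW_COUNT;
      EXIT WHEN _rows = 0;

      COMMIT;  -- releases locks for this batch; only allowed in procedures (PG 11+)
   END LOOP;
END
$$;

-- CALL copy_staging_to_target();  -- call it outside an explicit transaction block
```

Note that `COMMIT` is only legal inside a `PROCEDURE` called with `CALL`, not inside a regular function, which is exactly why splitting the work may require converting your function.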
How to split huge updates:
There are additional things to consider when packing multiple big operations into a single function: