I am using "Postgres" database and pessimistic_locking for locking database row.
I have the following code
@Transactional
public void doProcess(int id, int quantity) {
    Article article = lockArticle(id);
    modifyArticle(article, quantity);
}
This is the lockArticle() function, which locks and fetches the article from the database (Postgres):
public Article lockArticle(int id) {
    return (Article) sessionFactory.getCurrentSession()
            .createCriteria(Article.class)
            .add(Restrictions.eq("id", id))
            .setLockMode(LockMode.PESSIMISTIC_WRITE)
            .setTimeout(10000)
            .uniqueResult();
}
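For completeness, an equivalent way to acquire the same lock is Session.get() with a LockOptions argument. This is only a minimal sketch for comparison (it assumes the Session.get(Class, Serializable, LockOptions) overload of Hibernate is available), not the code I am actually running:

// Requires org.hibernate.LockMode and org.hibernate.LockOptions.
Article lockArticleViaGet(int id) {
    LockOptions lockOptions = new LockOptions(LockMode.PESSIMISTIC_WRITE);
    lockOptions.setTimeOut(10000); // lock wait timeout in milliseconds
    // Hibernate translates PESSIMISTIC_WRITE into SELECT ... FOR UPDATE on Postgres.
    return (Article) sessionFactory.getCurrentSession()
            .get(Article.class, id, lockOptions);
}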
Here is modifyArticle(), which deletes or updates the article entry in the database:
public void modifyArticle(Article article, int quantity) {
    if (article.getQuantity() < quantity) {
        dao.delete(article);
    } else {
        article.setQuantity(article.getQuantity() - quantity);
        dao.updateArticle(article);
    }
}
// DAO method for updateArticle
@Transactional
public void updateArticle(Article article) {
    sessionFactory.getCurrentSession().saveOrUpdate(article);
}
Let's say the article initially has a quantity of 10 in the database.
I spawned two threads to update the article's quantity: the first thread with quantity = 2 and the second with quantity = 3. So ideally, after both threads have finished processing, the quantity in the database should be 5 (10 - 2 - 3).
But the net quantity in the database ends up as either 8 (10 - 2) or 7 (10 - 3).
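For context, this is roughly how the two threads are spawned; the articleService and articleId names and the executor setup are illustrative assumptions, not my exact test code:

// Requires java.util.concurrent.ExecutorService, Executors, and TimeUnit.
void runConcurrentTest() throws InterruptedException {
    ExecutorService executor = Executors.newFixedThreadPool(2);
    executor.submit(() -> articleService.doProcess(articleId, 2)); // first thread: quantity = 2
    executor.submit(() -> articleService.doProcess(articleId, 3)); // second thread: quantity = 3
    executor.shutdown();
    executor.awaitTermination(30, TimeUnit.SECONDS);
    // Expected quantity in the DB afterwards: 10 - 2 - 3 = 5; observed: 8 or 7.
}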
Can someone please help me understand why pessimistic locking (PESSIMISTIC_WRITE) is not working here, and what a workaround to fix this would be?
After acquiring the lock, if I call session.refresh(entity), I get the updated value.
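A minimal sketch of where that refresh call would go, applied to the lockArticle() shown above (I am not sure whether this is the right fix or just masks the underlying problem):

public Article lockArticle(int id) {
    Article article = (Article) sessionFactory.getCurrentSession()
            .createCriteria(Article.class)
            .add(Restrictions.eq("id", id))
            .setLockMode(LockMode.PESSIMISTIC_WRITE)
            .setTimeout(10000)
            .uniqueResult();
    // refresh() re-reads the entity's current state from the database
    // after the lock has been acquired.
    sessionFactory.getCurrentSession().refresh(article);
    return article;
}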