It's going to be hard to provide example code for this one, so let me try to explain it with some pseudo code. If needed, I'll put together a working example.
First I'll demonstrate a variation of the code that works as expected. In this variation I'm iterating over a List of entities. For each entity I call a service method which creates a new transaction and writes something to the DB. If an exception occurs, all the work done in that service method remains in the database as expected because the work was done in new transactions that are independent of the outer transaction.
Here's the pseudo code for the class/method that is iterating over a List of entities and calling a service method for each one.
@Transactional
public void myMethod() {
    List<MyEntity> myEntities = myRepo.findAllBySomething("foo");
    for (MyEntity myEntity : myEntities) {
        myService.doSomeWork();
    }
}
Here is the service method that creates a new transaction to do some work.
@Service
public class MyService {

    @Transactional(propagation = Propagation.REQUIRES_NEW)
    public void doSomeWork() {
        SomeOtherEntity someOtherEntity = new SomeOtherEntity();
        ...
        someOtherRepo.save(someOtherEntity);
    }
}
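One detail worth noting (which the pseudo code above already gets right): REQUIRES_NEW only takes effect when the call goes through the Spring proxy, i.e. when doSomeWork() is invoked on an injected MyService bean rather than via this. A minimal sketch of the assumed wiring in the calling class (the names MyCaller and myService are my own, matching the pseudo code):

```java
@Service
public class MyCaller {

    // Injected proxy; calls through it honor @Transactional(REQUIRES_NEW).
    // A direct this.doSomeWork() call from inside MyService itself would
    // bypass the proxy and run in the outer transaction instead.
    @Autowired
    private MyService myService;
    ...
}
```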
The above example works as expected. If an exception is encountered, all the work done in the MyService.doSomeWork() calls is retained in the database because it was done in independent transactions.
Here's where the problem comes in. The number of entities I'm iterating over is very large and was causing memory issues, so instead of retrieving them in a List I changed the repository to return a Stream. The code now looks something like this...
@Transactional
public void myMethod() {
    try (Stream<MyEntity> myStream = myRepo.findAllBySomething("foo")) {
        // Get an iterator to step through each row.
        Iterator<MyEntity> myStreamIterator = myStream.iterator();
        while (myStreamIterator.hasNext()) {
            MyEntity myEntity = myStreamIterator.next();
            myService.doSomeWork();
        }
    }
}
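For reference, the repository change described above amounts to swapping the return type of the query method. This is a sketch of the assumed Spring Data repository; the names (MyEntityRepository, findAllBySomething, the Long ID type) come from the pseudo code, not from the real project:

```java
public interface MyEntityRepository extends JpaRepository<MyEntity, Long> {

    // Original version: loads every matching row into memory at once.
    // List<MyEntity> findAllBySomething(String something);

    // Stream version: rows are fetched as the stream is consumed, so the
    // underlying JDBC connection stays in use until the stream is closed
    // (hence the try-with-resources in the calling code).
    Stream<MyEntity> findAllBySomething(String something);
}
```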
Now if an exception is thrown, none of the work done in the MyService.doSomeWork() calls is in the database! It's very strange. I have turned on trace logging for org.springframework.transaction. In both versions of the code I can see a new transaction being created on each call to MyService.doSomeWork(). But when the exception is thrown, for some reason the inserts done in MyService.doSomeWork() are lost in the Stream-based version but not in the List-based version.
Is this known behavior that if you create a new transaction while a stream is open that it's somehow tied to that outer transaction? Or is this possibly a Spring bug?
I wrote a small piece of code to illustrate this issue, which made it easier for me to test and pinpoint the problem. What I found was that when running this code in a standalone Spring Boot application, everything worked as expected. When I deployed the code to a JBoss server, the problem occurred.
I was able to come up with a resolution, though I suspect it's more of a work-around than a fix. The problem seems to be with JBoss's "Cached Connection Manager", which is enabled by default on the datasource. Turning off the Cached Connection Manager fixes the issue. I found some older issues where the Spring folks blame the Cached Connection Manager for other transactional problems. I'm probably going to submit tickets to both Red Hat and Spring and see if anyone wants to step up and be accountable. I doubt either will, but at the very least hopefully I can provide some visibility for the next poor soul who runs into this mess.
Until someone gets to the bottom of the issue, to turn off the Cached Connection Manager, update your datasource and set use-ccm to "false" like this...
<datasource jta="true" jndi-name="java:/myDs" pool-name="myDs" use-ccm="false">
or similarly for an XA datasource...
<xa-datasource jndi-name="java:/myXaDs" pool-name="myXaDs" use-ccm="false">
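If you manage the server with the JBoss CLI rather than editing standalone.xml directly, I believe the same setting can be applied like this (substitute your own datasource name for myDs):

```
# Disable the Cached Connection Manager on the datasource, then reload.
/subsystem=datasources/data-source=myDs:write-attribute(name=use-ccm, value=false)
:reload
```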