Tags: maven, jpa, junit, persistence, helidon

How to configure Helidon application to use in-memory database for integration tests?


We are setting up a Helidon MP application that connects to a SQL database and exposes some endpoints for CRUD operations. I am running into issues implementing the integration tests. Our objective is for the application to use the SQL database normally, but to use an in-memory database when running the tests.

  1. I've used this type of setup in other frameworks and programming languages. The initial idea was to access the dependency injection container and change the configuration of the ORM (Hibernate in this case) to use an in-memory database. Unfortunately, I did not manage to do this.

  2. The second approach was to configure another persistence.xml file in the test folder that would override the one from the main folder. By using a different jta-data-source in each of them, I would be able to configure separate connection credentials. I found out that this causes an ambiguous dependency and fails.

content of src/main/resources/META-INF/persistence.xml

<persistence>

    <persistence-unit name="dservice" transaction-type="JTA">
        <jta-data-source>dsource</jta-data-source>
        <class>.....</class>
        .
        .
        .
        <class>.....</class>
        <properties>
            <property name="hibernate.dialect" value="org.hibernate.dialect.OracleDialect"/>
        </properties>
    </persistence-unit>

</persistence>

content of src/test/resources/META-INF/persistence.xml

<persistence>

    <persistence-unit name="dservice" transaction-type="JTA">
        <jta-data-source>dsource_test</jta-data-source>
        <class>.....</class>
        .
        .
        .
        <class>.....</class>
        <properties>
            <property name="hibernate.dialect" value="org.hibernate.dialect.H2Dialect"/>
            <property name="jakarta.persistence.sql-load-script-source" value="META-INF/init_script.sql"/>
            <property name="jakarta.persistence.schema-generation.database.action" value="drop-and-create"/>
        </properties>
    </persistence-unit>

</persistence>

content of src/main/resources/META-INF/microprofile-config.properties

# used for build
oracle.ucp.jdbc.PoolDataSource.dsource.URL=jdbc:oracle:something
oracle.ucp.jdbc.PoolDataSource.dsource.connectionFactoryClassName=oracle.jdbc.pool.OracleDataSource
oracle.ucp.jdbc.PoolDataSource.dsource.user=some_user
oracle.ucp.jdbc.PoolDataSource.dsource.password=some_password

# used for in-memory testing
oracle.ucp.jdbc.PoolDataSource.dsource_test.URL=jdbc:h2:mem:depServerDb;DB_CLOSE_DELAY=-1
oracle.ucp.jdbc.PoolDataSource.dsource_test.connectionFactoryClassName=org.h2.jdbcx.JdbcDataSource
oracle.ucp.jdbc.PoolDataSource.dsource_test.user=db_user
oracle.ucp.jdbc.PoolDataSource.dsource_test.password=user_password
  3. I added another persistence unit with a different name to src/main/resources/META-INF/persistence.xml and tried to create the entity manager manually using Persistence.createEntityManagerFactory(), with a provider class to access the entity manager (a sketch of what this attempt looked like follows the persistence.xml below). Unfortunately, this attempt also failed.

content of src/main/resources/META-INF/persistence.xml


<persistence>

    <persistence-unit name="dservice" transaction-type="JTA">
        <jta-data-source>dsource</jta-data-source>
        <class>.....</class>
        .
        .
        .
        <class>.....</class>
        <properties>
            <property name="hibernate.dialect" value="org.hibernate.dialect.OracleDialect"/>
        </properties>
    </persistence-unit>


    <persistence-unit name="dservice_test" transaction-type="JTA">
        <jta-data-source>dsource_test</jta-data-source>
        <class>.....</class>
        .
        .
        .
        <class>.....</class>
        <properties>
            <property name="hibernate.dialect" value="org.hibernate.dialect.H2Dialect"/>
            <property name="jakarta.persistence.sql-load-script-source" value="META-INF/init_script.sql"/>
            <property name="jakarta.persistence.schema-generation.database.action" value="drop-and-create"/>
        </properties>
    </persistence-unit>

</persistence>
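For reference, the manual bootstrap in this attempt might have looked roughly like the sketch below. The class name and members are hypothetical; it only illustrates the approach, since Persistence.createEntityManagerFactory() bootstraps JPA in application-managed mode, outside Helidon's container-managed JPA/JTA integration, which is one likely reason the attempt failed.

import jakarta.annotation.PreDestroy;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.persistence.EntityManager;
import jakarta.persistence.EntityManagerFactory;
import jakarta.persistence.Persistence;

@ApplicationScoped
public class ManualEntityManagerProvider {

    // Bootstraps the extra "dservice_test" unit directly, bypassing the container.
    private final EntityManagerFactory emf =
            Persistence.createEntityManagerFactory("dservice_test");

    // Hands out application-managed entity managers created from that factory.
    public EntityManager createEntityManager() {
        return emf.createEntityManager();
    }

    @PreDestroy
    void close() {
        emf.close();
    }
}
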
  4. I ended up with a solution that I am not satisfied with. I kept the persistence.xml from scenario 3 and added an entity manager provider class into which I inject two entity managers, one for each persistence unit. In the test class I added @AddConfig(key = "app.testing", value = "true"). This makes the entity manager provider deliver the "dservice" entity manager when the application runs normally and the "dservice_test" one when I run the "mvn test" command.

content of provider class

import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;
import jakarta.persistence.EntityManager;
import jakarta.persistence.PersistenceContext;
import org.eclipse.microprofile.config.inject.ConfigProperty;

@ApplicationScoped
public class PersistenceUnitProvider {

    // Container-managed entity manager for the real persistence unit.
    @PersistenceContext(unitName = "dservice")
    private EntityManager em_application;

    // Container-managed entity manager for the in-memory test unit.
    @PersistenceContext(unitName = "dservice_test")
    private EntityManager em_test;

    private String testing = "false";

    @Inject
    public PersistenceUnitProvider(@ConfigProperty(name = "app.testing") String testing) {
        this.testing = testing;
    }

    // Returns the test unit when app.testing=true, otherwise the application unit.
    public EntityManager getPersistenceUnit() {
        if ("true".equals(this.testing)) {
            return em_test;
        }
        return em_application;
    }
}
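
For completeness, a consuming bean calls getPersistenceUnit() wherever it needs an EntityManager. A minimal, hypothetical example (the resource path and class name are illustrative only):

import jakarta.enterprise.context.RequestScoped;
import jakarta.inject.Inject;
import jakarta.persistence.EntityManager;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.MediaType;

@Path("/status")
@RequestScoped
public class StatusResource {

    @Inject
    private PersistenceUnitProvider persistenceUnitProvider;

    @GET
    @Produces(MediaType.TEXT_PLAIN)
    public String status() {
        // Uses whichever entity manager the provider selected (application or test).
        EntityManager em = persistenceUnitProvider.getPersistenceUnit();
        return em.isOpen() ? "persistence unit available" : "persistence unit closed";
    }
}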

content of the JUnit test class that sets the app.testing property to true and uses the in-memory database

@HelidonTest
@AddConfig(key = "app.testing", value = "true")
class MainTest {
    .
    ..
    ...
    ....
}
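
For illustration, a test method in this style might look like the following, assuming an endpoint such as the hypothetical /status resource sketched earlier (the io.helidon.microprofile.tests.junit5 package name is the one used by Helidon 2.x/3.x and may differ in other versions):

import io.helidon.microprofile.tests.junit5.AddConfig;
import io.helidon.microprofile.tests.junit5.HelidonTest;
import jakarta.inject.Inject;
import jakarta.ws.rs.client.WebTarget;
import jakarta.ws.rs.core.Response;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;

@HelidonTest
@AddConfig(key = "app.testing", value = "true")
class MainTest {

    // @HelidonTest starts the server and makes a client WebTarget injectable.
    @Inject
    private WebTarget target;

    @Test
    void statusEndpointResponds() {
        // Runs against the in-memory H2 database seeded by init_script.sql.
        Response response = target.path("/status").request().get();
        Assertions.assertEquals(200, response.getStatus());
    }
}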

The issue is that this makes the application establish two connections when it's run or tested. Is there a better way to achieve this?

..............................................................................................................................................

UPDATE: Following the solution Laird mentioned in the accepted answer, we now rely on the Maven build process, binding plugin executions to build phases to copy whichever config files are needed to change the behavior of the Helidon application.

We created a folder to store all the config files used for development, test and production, with the following structure:

_config
   development
       (files that use a local SQL database)
   tests
        (files that use an in-memory database)
   production
       (files that use a development SQL database)

In the pom.xml file we switch the config files as needed for the different build phases:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-resources-plugin</artifactId>
    <executions>
        <execution>
            <id>copy-resources-dev</id>
            <phase>compile</phase>
            <goals>
                <goal>copy-resources</goal>
            </goals>
            <configuration>
                <outputDirectory>${basedir}/target/classes/META-INF</outputDirectory>
                <resources>
                    <resource>
                        <directory>_config/development</directory>
                        <filtering>true</filtering>
                    </resource>
                </resources>    
            </configuration>
        </execution>

        <execution>
            <id>copy-resources-test</id>
            <phase>test-compile</phase>
            <goals>
                <goal>copy-resources</goal>
            </goals>
            <configuration>
                <outputDirectory>${basedir}/target/classes/META-INF</outputDirectory>
                <resources>
                    <resource>
                        <directory>_config/tests</directory>
                        <filtering>true</filtering>
                    </resource>
                </resources>
            </configuration>
        </execution>

        <execution>
            <id>copy-resources-packaging</id>
            <phase>prepare-package</phase>
            <goals>
                <goal>copy-resources</goal>
            </goals>
            <configuration>
                <outputDirectory>${basedir}/target/classes/META-INF</outputDirectory>
                <resources>
                    <resource>
                        <directory>_config/production</directory>
                        <filtering>true</filtering>
                    </resource>
                </resources>
            </configuration>
        </execution>
    </executions>
</plugin>

This enables us to use a localhost database when we run "helidon dev", an in-memory database when we execute the tests, and whatever database we need when we package the application and run it on a server somewhere.


Solution

  • There are many, many, many things going on here. I'll try to keep it short; a full JPA tutorial is beyond the scope of this question and this website.

    The short answer is: (1) in JPA, a persistence.xml is definitionally environment-specific, and (2) persistence.xmls don't "override" each other. When seen this way, the problem reduces to: I want two environments in the same project and can't figure out how to turn them on and off selectively.

    There are a variety of (non-Helidon-specific) ways you can do this sort of thing:

    1. Use the maven-resources-plugin to defer copying src/main/resources/META-INF/persistence.xml into target/classes/META-INF/ until after unit tests have run (i.e. exclude persistence.xml from <resources> in your pom.xml and then bind the maven-resources-plugin:copy-resources goal to the prepare-package phase). Now src/test/resources/META-INF/persistence.xml will be in effect at unit test time, and your (untested) src/main/resources/META-INF/persistence.xml will be the one you deploy with.
    2. Do amazing things with MicroProfile Config configuration profiles if the only thing you need to change is data source information, which is already external to a container-mode-JPA persistence.xml file, but from your example it seems that you need to change <property> elements as well.
    3. Recognize that since persistence.xmls are inherently environment-specific, just don't include a src/main/resources/META-INF/persistence.xml at all in your library project, since a library project, by definition, is supposed to be used in a variety of environments. Instead, place a persistence.xml in its own "thin" project and combine that project with your library project to form an application. You can of course test these combinations in all sorts of other ways.