Tags: java, spring, postgresql, hibernate, c3p0

Hibernate: C3P0 pool configuration slowing down the entire server


I am working on a Spring-MVC application in which we are using Hibernate and C3P0 for database transactions and connection pooling. Most of the time it works really well, with no issues. But in certain situations I have to copy a lot of objects and files in the current transaction. When that happens, the entire server slows down and I eventually start getting a "could not rollback" exception. Is there anything wrong with my c3p0 settings? Thank you.

pom.xml :

 <!-- Hibernate dependencies -->
        <dependency>
            <groupId>org.hibernate</groupId>
            <artifactId>hibernate-validator</artifactId>
        </dependency>

        <dependency>
            <groupId>org.hibernate</groupId>
            <artifactId>hibernate-core</artifactId>
        </dependency>
        <dependency>
            <groupId>org.hibernate</groupId>
            <artifactId>hibernate-entitymanager</artifactId>
        </dependency>
        <dependency>
            <groupId>org.hibernate</groupId>
            <artifactId>hibernate-c3p0</artifactId>
            <version>4.3.9.Final</version>
        </dependency>

root-context.xml :

<beans:bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource"
            destroy-method="close">
    <beans:property name="driverClassName" value="org.postgresql.Driver"/>
    <beans:property name="url"
                    value="jdbc:postgresql://localhost:PORT/DB_NAME"/>
    <beans:property name="username" value="USERNAME"/>
    <beans:property name="password" value="PASSWORD"/>
    <beans:property name="removeAbandoned" value="true"/>
    <beans:property name="removeAbandonedTimeout" value="20"/>
    <beans:property name="defaultAutoCommit" value="false"/>
</beans:bean>

<!-- Hibernate 4 SessionFactory Bean definition -->
<beans:bean id="hibernate4AnnotatedSessionFactory"
            class="org.springframework.orm.hibernate4.LocalSessionFactoryBean">
    <beans:property name="dataSource" ref="dataSource"/>
    <beans:property name="packagesToScan" value="com.ourapp.spring.model"/>
    <beans:property name="hibernateProperties">
        <beans:props>
            <beans:prop key="hibernate.dialect">org.hibernate.dialect.PostgreSQL9Dialect</beans:prop>
            <beans:prop key="hibernate.show_sql">false</beans:prop>
               <!--<beans:prop key="hibernate.jdbc.batch_size">1000</beans:prop>-->
            <beans:prop key="hibernate.hbm2ddl.auto">update</beans:prop>
            <beans:prop key="cache.use_second_level_cache">true</beans:prop>
            <beans:prop key="cache.use_query_cache">true</beans:prop>
            <beans:prop key="hibernate.order_updates">true</beans:prop>
            <beans:prop key="show_sql">false</beans:prop>
            <beans:prop key="hibernate.c3p0.min_size">1</beans:prop>
            <beans:prop key="hibernate.c3p0.max_size">750</beans:prop>
            <beans:prop key="hibernate.c3p0.acquire_increment">1</beans:prop>
            <beans:prop key="hibernate.c3p0.idle_test_period">1000</beans:prop>
            <beans:prop key="hibernate.c3p0.max_statements">150</beans:prop>
            <beans:prop key="hibernate.c3p0.timeout">1200</beans:prop>
            <beans:prop key="hibernate.connection.release_mode">auto</beans:prop>
        </beans:props>
    </beans:property>

</beans:bean>

Thank you.


Solution

  • For starters, you aren't actually using C3P0: you have configured org.apache.commons.dbcp.BasicDataSource as your DataSource and are injecting that into the LocalSessionFactoryBean. This renders all the hibernate.c3p0 settings useless, as they are simply ignored (see the sketch right below for how an actual pool could be wired as the Spring DataSource).
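
    If you really do want C3P0 to do the pooling, a minimal sketch would be to define its ComboPooledDataSource as the Spring dataSource bean instead of the DBCP one. The pool-size values below are illustrative assumptions, not recommendations, and the connection details are the ones from your existing bean:

    <beans:bean id="dataSource" class="com.mchange.v2.c3p0.ComboPooledDataSource"
                destroy-method="close">
        <!-- Same PostgreSQL connection details as the current DBCP bean -->
        <beans:property name="driverClass" value="org.postgresql.Driver"/>
        <beans:property name="jdbcUrl" value="jdbc:postgresql://localhost:PORT/DB_NAME"/>
        <beans:property name="user" value="USERNAME"/>
        <beans:property name="password" value="PASSWORD"/>
        <!-- Illustrative pool sizing; tune for your actual workload -->
        <beans:property name="minPoolSize" value="1"/>
        <beans:property name="maxPoolSize" value="50"/>
        <beans:property name="acquireIncrement" value="1"/>
    </beans:bean>

    The hibernate.c3p0.* properties only take effect when Hibernate manages the connections itself; with a Spring-managed DataSource the pool has to be configured on the bean, as above.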

    Next, you have issues with processing large amounts of data, and I highly doubt the problem is your DataSource or connection pool; it is more likely the way you are processing your entities and the way you have configured Hibernate.

    To speed up batch processing you want to flush every x records to the database and clear the first-level cache. Why would you want to do this? It has everything to do with how Hibernate works: when you persist an entity, Hibernate adds it to the first-level cache (the Session, or the EntityManager in the case of JPA). Each time you add an item to the first-level cache, a dirty check is performed on ALL of the entities in that cache to determine whether something needs to be flushed. This is fast for the first few entities but becomes increasingly slower as the cache grows.

    Let's configure and code things for a batch size of 50.

    First, you want to configure Hibernate with a proper batch size and to order the insert and update statements. If you do this, you can benefit from JDBC batch updates (i.e. one batched insert or update statement modifying 50 records instead of 50 individual insert/update statements).

    Hibernate Configuration

    <beans:bean id="hibernate4AnnotatedSessionFactory"
                class="org.springframework.orm.hibernate4.LocalSessionFactoryBean">
        <beans:property name="dataSource" ref="dataSource"/>
        <beans:property name="packagesToScan" value="com.ourapp.spring.model"/>
        <beans:property name="hibernateProperties">
            <beans:props>
                <beans:prop key="hibernate.dialect">org.hibernate.dialect.PostgreSQL9Dialect</beans:prop>
                <beans:prop key="hibernate.show_sql">false</beans:prop>
                   <!--<beans:prop key="hibernate.jdbc.batch_size">1000</beans:prop>-->
                <beans:prop key="hibernate.hbm2ddl.auto">update</beans:prop>
                <beans:prop key="hibernate.cache.use_second_level_cache">true</beans:prop>
                <beans:prop key="hibernate.cache.use_query_cache">true</beans:prop>
                <beans:prop key="hibernate.jdbc.batch_size">50</beans:prop>
                <beans:prop key="hibernate.order_inserts">true</beans:prop>
                <beans:prop key="hibernate.order_updates">true</beans:prop>
                <!-- If you use versioned entities set this to true as well -->
                <beans:prop key="hibernate.jdbc.batch_versioned_data">true<beans:prop> 
            </beans:props>
        </beans:property>
    </beans:bean>
    

    Code Modification

    public void yourLargeDataSetProcessingMethod(List<YourItem> items) {
        Session session = sessionFactory.getCurrentSession();

        int i = 0;
        for (YourItem item : items) {
            i++;
            // Your processing / entity creation goes here

            if (i % 50 == 0) {
                // Flush the current batch to the database and clear the
                // first-level cache so dirty checking stays cheap
                session.flush();
                session.clear();
            }
        }
        // Flush and clear whatever remains from the last partial batch
        session.flush();
        session.clear();
    }
    

    This will probably speed up your processing and reduce database locking.

    A final note: instead of Commons DBCP or C3P0, I would suggest using HikariCP as the connection pool. It is very small, very fast, and actively maintained (whereas C3P0 has been dormant for quite some time already); see the sketch at the end of this answer.

    Here is a nice resource (with benchmarks etc.) on what each setting does, what it adds or removes performance-wise, and how to configure things properly.
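
    If you do switch, a minimal HikariCP bean could look like the sketch below. It reuses the PostgreSQL connection details from your current DBCP bean; the maximumPoolSize value is an assumption to be tuned, and the hibernate.c3p0.max_size of 750 in your current settings is far larger than what is usually recommended for PostgreSQL:

    <beans:bean id="dataSource" class="com.zaxxer.hikari.HikariDataSource"
                destroy-method="close">
        <!-- Same PostgreSQL connection details as the DBCP bean in root-context.xml -->
        <beans:property name="driverClassName" value="org.postgresql.Driver"/>
        <beans:property name="jdbcUrl" value="jdbc:postgresql://localhost:PORT/DB_NAME"/>
        <beans:property name="username" value="USERNAME"/>
        <beans:property name="password" value="PASSWORD"/>
        <!-- Assumed pool size; a small fixed pool usually outperforms a huge one -->
        <beans:property name="maximumPoolSize" value="20"/>
    </beans:bean>

    You would also add the com.zaxxer:HikariCP dependency to the pom.xml and drop the Commons DBCP one; the SessionFactory bean can keep referencing the same dataSource id.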