I am using MyBatis-Spring with Java, and I need to insert more than 10,000 records into a table in one transaction. To do it, I use this mapper:
<insert id="saveBulk" parameterType="List">
    INSERT INTO "quote" ("id", "mi_id", "timestamp", "open", "close", "low", "high", "volume", "period")
    VALUES
    <foreach collection="list" item="item" separator=",">
        (#{item.key}, #{item.marketInstrumentKey}, #{item.timestamp}, #{item.open}, #{item.close}, #{item.low},
         #{item.high}, #{item.volume}, #{item.period}::quote_period)
    </foreach>
</insert>
And I pass a List to this statement. It worked extremely slowly even for 2,000-3,000 records, and 10,000 records take more than 4 minutes to insert (I had to increase the timeout interval)! The same 10,000 records are inserted into the same DB via PgAdmin in less than 10 seconds. I traced the processing of this operation and found the bottleneck:
public int doUpdate(MappedStatement ms, Object parameter) throws SQLException {
    Statement stmt = null;
    try {
        Configuration configuration = ms.getConfiguration();
        StatementHandler handler = configuration.newStatementHandler(this, ms, parameter, RowBounds.DEFAULT, null, null);
        stmt = prepareStatement(handler, ms.getStatementLog());
        return handler.update(stmt);
    } finally {
        closeStatement(stmt);
    }
}
Building the StatementHandler takes a few minutes, and prepareStatement takes a few minutes more. I understand why it happens: 10,000 records with 9 fields each means roughly 90,000 values that all have to be bound as parameters into a single statement. How can I accelerate this process?
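To put the scale in perspective (the arithmetic here is my own illustration, not from the question): every row contributes one placeholder group that MyBatis must first assemble into the SQL text from the <foreach>, and every placeholder then becomes a separately bound parameter. A quick back-of-the-envelope check in plain Java, no database needed:

```java
public class ParamCount {
    public static void main(String[] args) {
        int rows = 10_000;
        int cols = 9;
        // One placeholder group per row: (?,?,?,?,?,?,?,?,?)
        String group = "(" + "?,".repeat(cols - 1) + "?)";
        // Every placeholder becomes a separately bound parameter
        int totalParams = rows * cols;
        System.out.println(totalParams);                  // 90000
        // Rough size of just the VALUES clause MyBatis must assemble
        long valuesLength = (long) rows * (group.length() + 1);
        System.out.println(valuesLength);                 // 200000
    }
}
```

So a single multi-row INSERT of this size means a ~200 KB statement string and 90,000 parameter bindings, which is exactly where the time goes.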
UPDATE:
I implemented a batch save using the "BATCH" executor type of the session factory together with @Transactional. This is the mybatis-spring XML configuration:
<bean id="sqlBatchTemplate" class="org.mybatis.spring.SqlSessionTemplate">
    <constructor-arg index="0" ref="sqlSessionFactory"/>
    <constructor-arg index="1" value="BATCH"/>
</bean>
<bean id="quoteBatchMapper" class="org.mybatis.spring.mapper.MapperFactoryBean">
    <property name="mapperInterface" value="tafm.dataaccess.mybatis.mapper.QuoteMapper"/>
    <property name="sqlSessionFactory" ref="sqlSessionFactory"/>
    <property name="sqlSessionTemplate" ref="sqlBatchTemplate"/>
</bean>
<bean id="dataAccessBatch" class="tafm.dataaccess.DataAccess">
    <property name="quoteMapper" ref="quoteBatchMapper"/>
</bean>
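For reference, the key part of the same wiring in Spring Java configuration might look like this (a sketch assuming mybatis-spring on the classpath; the class name BatchMapperConfig is mine, the bean name mirrors the XML above):

```java
import org.apache.ibatis.session.ExecutorType;
import org.apache.ibatis.session.SqlSessionFactory;
import org.mybatis.spring.SqlSessionTemplate;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class BatchMapperConfig {

    // Template bound to the BATCH executor: repeated inserts are queued
    // as a JDBC batch instead of being executed one statement at a time.
    @Bean
    public SqlSessionTemplate sqlBatchTemplate(SqlSessionFactory sqlSessionFactory) {
        return new SqlSessionTemplate(sqlSessionFactory, ExecutorType.BATCH);
    }

    // The mapper bean would be wired against this template in the same
    // way the quoteBatchMapper bean is in the XML above.
}
```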
Then I implemented a "batch" save method:
@Transactional
public void saveBulk(List<Quote> quotes) {
    for (Quote q : quotes) {
        mapper.save(q);
    }
}
mapper is the XML mapper for the Quote entity:
<insert id="save">
    INSERT INTO "quote" ("id", "mi_id", "timestamp", "open", "close", "low", "high", "volume", "period")
    VALUES (#{key}, #{marketInstrumentKey}, #{timestamp}, #{open}, #{close}, #{low},
            #{high}, #{volume}, #{period}::quote_period)
</insert>
It works fast now.
You can do something like this.
Mapper.xml
<insert id="saveBulk" parameterType="List">
    INSERT INTO "quote" ("id", "mi_id", "timestamp", "open", "close", "low", "high", "volume", "period")
    VALUES
    <foreach collection="list" item="item" separator=",">
        (#{item.key}, #{item.marketInstrumentKey}, #{item.timestamp}, #{item.open}, #{item.close}, #{item.low},
         #{item.high}, #{item.volume}, #{item.period}::quote_period)
    </foreach>
</insert>
In the Java file:
public void insertBulkData(List<Item> list)
{
    int total = list.size();
    int interval = 1000;
    // Walk the list in chunks of at most `interval` rows; an empty
    // list results in no calls (a giant foreach with zero rows would
    // otherwise produce invalid SQL).
    for (int from = 0; from < total; from += interval)
    {
        int to = Math.min(from + interval, total);
        saveBulk(list.subList(from, to));
    }
}
The code above uses an interval of 1000, so 1000 records are inserted at a time; you can change the interval to suit your setup.
Just call insertBulkData(list).
I hope this helps.
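To see the chunking behave without a database, here is a self-contained harness that mirrors the loop above; the saveBulk stub here just records how many rows each call receives (the class and field names are illustrative, not from the original code):

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkDemo {
    static final List<Integer> batchSizes = new ArrayList<>();

    // Stand-in for the real saveBulk: records the size of each batch.
    static void saveBulk(List<Integer> batch) {
        batchSizes.add(batch.size());
    }

    static void insertBulkData(List<Integer> list) {
        int total = list.size();
        int interval = 1000;
        for (int from = 0; from < total; from += interval) {
            int to = Math.min(from + interval, total);
            saveBulk(list.subList(from, to));
        }
    }

    public static void main(String[] args) {
        List<Integer> data = new ArrayList<>();
        for (int i = 0; i < 2500; i++) data.add(i);
        insertBulkData(data);
        System.out.println(batchSizes);  // [1000, 1000, 500]
    }
}
```

2,500 rows produce two full batches of 1,000 and a final batch of 500, so every row is inserted exactly once.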