I'm running an application with Spring Batch jobs. When I try to collect and publish some data from one data source to another, I get the following exception:
o.s.batch.core.step.AbstractStep - Encountered an error executing step upload in job reviewsToYtBatchJob
java.lang.OutOfMemoryError: GC overhead limit exceeded
at com.mysql.jdbc.Buffer.<init>(Buffer.java:59)
at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:1967)
at com.mysql.jdbc.MysqlIO.readSingleRowSet(MysqlIO.java:3401)
at com.mysql.jdbc.MysqlIO.getResultSet(MysqlIO.java:483)
at com.mysql.jdbc.MysqlIO.readResultsForQueryOrUpdate(MysqlIO.java:3096)
at com.mysql.jdbc.MysqlIO.readAllResults(MysqlIO.java:2266)
at com.mysql.jdbc.ServerPreparedStatement.serverExecute(ServerPreparedStatement.java:1485)
at com.mysql.jdbc.ServerPreparedStatement.executeInternal(ServerPreparedStatement.java:856)
at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:2318)
at com.zaxxer.hikari.pool.ProxyPreparedStatement.executeQuery(ProxyPreparedStatement.java:52)
at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.executeQuery(HikariProxyPreparedStatement.java)
at org.springframework.batch.item.database.JdbcCursorItemReader.openCursor(JdbcCursorItemReader.java:126)
My question is: why does this step run out of memory? It works only with a small amount of data. I've also tried this:

reader.setFetchSize(CHUNK_SIZE); // JdbcCursorItemReader
uploadStep.chunk(CHUNK_SIZE);    // SimpleStepBuilder

I tried CHUNK_SIZE values from 100 to 10000. If I limit the selected data to roughly that size, it works and the heap is not exceeded.
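One thing I have not verified yet: MySQL Connector/J is documented to buffer the entire result set in client memory by default, regardless of fetchSize, which would explain why only limiting the query helps. Row-by-row streaming reportedly requires a fetch size of Integer.MIN_VALUE on a forward-only, read-only statement (which is what JdbcCursorItemReader creates). A sketch of that variant, untested on my side:

// Untested: ask Connector/J to stream rows one at a time instead of
// buffering the whole result set client-side
reader.setFetchSize(Integer.MIN_VALUE);
// ResultSet.getRow() can be unreliable on a streaming MySQL result set,
// so skip the reader's cursor-position check
reader.setVerifyCursorPosition(false);

Here is my configuration: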
protected ItemReader<Review> reader() {
    JdbcCursorItemReader<Review> reader = new JdbcCursorItemReader<>();
    reader.setDataSource(dataSource);
    reader.setSql(
            //sql query
    );
    reader.setFetchSize(CHUNK_SIZE);
    reader.setRowMapper(
            (rs, rowNum) -> new Review(
                    rs.getLong("reviewId"),
                    //map data
            )
    );
    return reader;
}
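An alternative I have seen suggested (also untested here) is the driver's useCursorFetch connection property, which switches Connector/J to a server-side cursor so that a normal positive fetchSize actually bounds how many rows sit in memory at once. A minimal sketch with HikariCP; the JDBC URL is a placeholder:

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

HikariConfig config = new HikariConfig();
// Placeholder URL; with useCursorFetch=true the driver fetches rows through
// a server-side cursor in fetchSize-sized batches
config.setJdbcUrl("jdbc:mysql://localhost:3306/reviews?useCursorFetch=true");
HikariDataSource dataSource = new HikariDataSource(config);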
private ItemProcessor<Review, ReviewTo> processor() {
    return review -> new ReviewTo(
            //parameters
    );
}
private ItemWriter<ReviewTo> writer() {
    // ItemWriter is an interface, so it cannot be instantiated directly;
    // the lambda hands each chunk to the client (method name illustrative)
    return items -> client.write(items);
}
private TaskletStep uploadStep() {
    return stepBuilderFactory.get("upload")
            .<Review, ReviewTo>chunk(CHUNK_SIZE)
            .reader(reader())
            .processor(processor())
            .writer(writer())
            .allowStartIfComplete(true)
            .build();
}
@Bean
public Job reviewsToYtBatchJob() {
    return jobBuilderFactory.get(JOB_NAME)
            .start(stepBuilderFactory.get("generateTable")
                    .tasklet(/* generate table */)
                    .build())
            .next(stepBuilderFactory.get("createTmpTable")
                    .tasklet(/* step */)
                    .build())
            .next(uploadStep())
            .next(stepBuilderFactory.get("moveTmpTableToDestination")
                    .tasklet(/* step */)
                    .build())
            .build();
}
Update: the problem was simply a lack of heap space. It worked with CHUNK_SIZE = 100000 and -Xmx4g. There was a config file with JVM arguments where I could increase the heap size.
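For reference, the flag just raises the maximum heap in the launch command; something like this in the startup script (the jar name is a placeholder):

java -Xmx4g -jar reviews-batch-job.jar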