I am planning to batch insert around 2000 records per insert using jdbcTemplate batch update. Each record averages between 2 KB and 10 KB in size, so the total payload per batch update will be roughly 4 MB to 20 MB.
Is BatchPreparedStatementSetter good enough for this use case, or do I really need to use ParameterizedPreparedStatementSetter?
I am using Spring Boot 2.0 and PostgreSQL 9.6. These updates will run twice every minute.
BatchPreparedStatementSetter is designed exactly for batch operations, so it is a good fit for your case.
Setting reWriteBatchedInserts=true on the PostgreSQL JDBC driver (the Postgres counterpart of MySQL's rewriteBatchedStatements) will improve the performance of batch operations by reducing the number of round trips to the database.
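Assuming you use the standard PostgreSQL JDBC driver (pgjdbc), the flag can be appended to the JDBC URL; the host, port, and database name below are placeholders:

spring.datasource.url=jdbc:postgresql://localhost:5432/mydb?reWriteBatchedInserts=true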
When you have many records, it is good to split them into smaller batches; this greatly improves performance and is easy to implement with BatchPreparedStatementSetter (see the chunked usage sketch after the example below).
Example:
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

import org.springframework.jdbc.core.BatchPreparedStatementSetter;

public class UserBatchPreparedStatementSetter implements BatchPreparedStatementSetter {

    private final List<User> users;

    public UserBatchPreparedStatementSetter(List<User> users) {
        this.users = users;
    }

    @Override
    public void setValues(PreparedStatement ps, int i) throws SQLException {
        // Bind the i-th user's fields to the statement parameters.
        // setValues is declared to throw SQLException, so there is no need
        // to swallow it with a try/catch here.
        User user = users.get(i);
        ps.setString(1, user.getUserName());
        ps.setString(2, user.getPassword());
        ps.setDate(3, new java.sql.Date(user.getCreatedTime().getTime()));
        ps.setDate(4, user.getUpdatedTime() == null ? null : new java.sql.Date(user.getUpdatedTime().getTime()));
        ps.setString(5, user.getUserType().toString());
        ps.setDate(6, new java.sql.Date(user.getDateofBirth().getTime()));
    }

    @Override
    public int getBatchSize() {
        // Number of statements in this batch.
        return users.size();
    }
}
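A minimal sketch of how you could invoke it in chunks; the SQL statement, table and column names, and the chunk size of 500 are assumptions, not taken from your schema:

import java.util.List;

import org.springframework.jdbc.core.JdbcTemplate;

public class UserBatchInserter {

    // Hypothetical SQL; adjust the table and column names to your schema.
    private static final String INSERT_SQL =
            "INSERT INTO users (user_name, password, created_time, updated_time, user_type, date_of_birth) "
            + "VALUES (?, ?, ?, ?, ?, ?)";

    private static final int CHUNK_SIZE = 500; // assumed; tune for your record sizes

    public void saveUsers(JdbcTemplate jdbcTemplate, List<User> users) {
        // Run one JDBC batch per chunk instead of a single 2000-row batch.
        for (int from = 0; from < users.size(); from += CHUNK_SIZE) {
            List<User> chunk = users.subList(from, Math.min(from + CHUNK_SIZE, users.size()));
            jdbcTemplate.batchUpdate(INSERT_SQL, new UserBatchPreparedStatementSetter(chunk));
        }
    }
}

For completeness: the batchUpdate overload that takes a ParameterizedPreparedStatementSetter together with a batchSize argument does this chunking for you, but as shown above, BatchPreparedStatementSetter is sufficient for your volume.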