I need to create a generic framework to archive tables and put the archived data into CSV files. We will have multiple tables, which will be picked up through a config table in the database. Since there are multiple tables, I don't want to write a reader, writer, and POJO specific to each table. I want to make it generic, so that I only need to add a new entry to our config table and the framework does the work without any code changes.
My question: is it feasible to achieve this without POJOs in Spring Batch?
I have tried it using a HashMap, but the performance was very slow. I am open to enhancing that approach if the performance can be improved; otherwise I need a fresh suggestion.
Edit 1: The select query will be picked from the config table.
Reader and writer which I have tried with a HashMap:
@Bean
public ItemReader<Map<String, Object>> jdbcItemReader() {
    JdbcCursorItemReader<Map<String, Object>> reader = new JdbcCursorItemReader<>();
    reader.setDataSource(dataSource);
    reader.setSql("select a1,a2,a3 from Table");
    // maps each row to a Map keyed by column name, so no table-specific POJO is needed
    reader.setRowMapper(new ColumnMapRowMapper());
    return reader;
}
@Bean
public ItemWriter<Map<String, Object>> jdbcItemWriter() {
    JdbcBatchItemWriter<Map<String, Object>> writer = new JdbcBatchItemWriter<>();
    writer.setDataSource(dataSource);
    writer.setSql("INSERT INTO table1_hist (a1,a2,a3) VALUES (:a1,:a2,:a3)");
    writer.setItemSqlParameterSourceProvider(new ItemSqlParameterSourceProvider<Map<String, Object>>() {
        @Override
        public SqlParameterSource createSqlParameterSource(Map<String, Object> item) {
            MapSqlParameterSource mapSqlParameterSource = new MapSqlParameterSource();
            mapSqlParameterSource.addValues(item);
            return mapSqlParameterSource;
        }
    });
    return writer;
}
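For the CSV part of the archive, my idea is to reuse the same Map items with a generic line aggregator instead of a per-table POJO. A minimal sketch of what I have in mind (the output path and the plain comma join are illustrative; ColumnMapRowMapper preserves the select-list column order):
@Bean
public FlatFileItemWriter<Map<String, Object>> csvItemWriter() {
    FlatFileItemWriter<Map<String, Object>> writer = new FlatFileItemWriter<>();
    // illustrative path; in practice this would come from the config entry
    writer.setResource(new FileSystemResource("archive/table1.csv"));
    // join the column values of each row; order follows the select list
    writer.setLineAggregator(item -> item.values().stream()
            .map(String::valueOf)
            .collect(Collectors.joining(",")));
    return writer;
}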
Edit 2: Adding the step and job configuration as well:
@Bean
public Step myStep(ItemReader<Map<String, Object>> reader, ItemWriter<Map<String, Object>> writer) {
    return stepBuilderFactory.get("myStep")
            .<Map<String, Object>, Map<String, Object>>chunk(10000)
            .reader(reader)
            .writer(writer)
            .build();
}

@Bean
public Job myJob(Step myStep) {
    return jobBuilderFactory.get("myJob")
            .start(myStep)
            .build();
}
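To cover Edit 1 (picking the select query from the config table), the direction I'm considering is late-binding the SQL into the reader through a job parameter via @StepScope, roughly like this (a sketch; the selectSql parameter name is just illustrative):
@Bean
@StepScope
public JdbcCursorItemReader<Map<String, Object>> jdbcItemReader(
        @Value("#{jobParameters['selectSql']}") String selectSql) {
    JdbcCursorItemReader<Map<String, Object>> reader = new JdbcCursorItemReader<>();
    reader.setDataSource(dataSource);
    // the select statement read from the config table is passed in as a job parameter
    reader.setSql(selectSql);
    reader.setRowMapper(new ColumnMapRowMapper());
    return reader;
}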
As you are copying the data to a history table, which apparently is in the same schema, I would suggest simply dropping Spring Batch and using the database for that, with an INSERT INTO ... SELECT query, which does exactly that: copies the data from one table to another.
As you are using Spring Boot, you already have a pre-configured JdbcTemplate which you can use.
var sql = "INSERT INTO table1_hist (a1,a2,a3) SELECT a1,a2,a3 FROM table1";
jdbcTemplate.update(sql);
This will save you from transferring all the data from the database to your Java program, constructing an additional query for each row, and sending all the data back to the database again. It will save you a lot of time on the network and on executing single queries.
You could even execute this in a loop or make it multi-threaded (one thread per table).
public class DataArchiveService {

    private final JdbcTemplate jdbc;
    private final AsyncTaskExecutor executor;

    public DataArchiveService(JdbcTemplate jdbc, AsyncTaskExecutor executor) {
        this.jdbc = jdbc;
        this.executor = executor;
    }

    public void archive() {
        // loadArchiveConfigs() is a placeholder: get your metadata from somewhere, e.g. the config table
        var archiveConfigs = loadArchiveConfigs();
        for (var toArchive : archiveConfigs) {
            var sql = toArchive.getSql();
            // run each table's copy query on its own thread
            executor.execute(() -> jdbc.update(sql));
        }
    }
}
Now you have several threads each running a rather simple query to copy the data. This should all be fairly quick, or at least quicker than anything you do on the Java side to transfer the data back and forth.
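For completeness, pulling the per-table statements out of the config table could be as simple as the following (a sketch; the archive_config table and its archive_sql/enabled columns are assumptions standing in for your actual config schema):
List<String> statements = jdbc.queryForList(
        "SELECT archive_sql FROM archive_config WHERE enabled = 1", String.class);
statements.forEach(sql -> executor.execute(() -> jdbc.update(sql)));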