I have a legacy database that stores data as JSON in CLOB columns. The JSON needs to be normalized and loaded into the corresponding tables to support a new solution.
Each CLOB produces one Company record and multiple Department records, and each Department can have any number of Employees. All of this data is to be persisted into 3 tables, i.e. Company, Department & Employee. I want to read the source data only once and perform the "data split" operation in a single item-writer class using Spring Batch.
Which item writer is more appropriate for this use case: ClassifierCompositeItemWriter or CompositeItemWriter?
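For illustration only, the CLOB JSON is shaped roughly like this (field names are made up; only the one-company / many-departments / many-employees nesting matters):
{
  "company": { "name": "..." },
  "departments": [
    { "name": "...", "employees": [ { "name": "..." }, { "name": "..." } ] }
  ]
}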
The processor seems to be simple:
public class ClientProfileEVProcessor implements ItemProcessor<ClientProfile, ClientCompositeVO> {

    @Override
    public ClientCompositeVO process(ClientProfile item) throws Exception {
        ClientEntity client = populateClientProfile(item);
        List<DeptEntity> deptEntities = populateDept(item);
        List<Employee> evEntities = populateEmployee(item);
        return new ClientCompositeVO(client, deptEntities, evEntities);
    }
}
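ClientCompositeVO itself is not shown; a minimal sketch of it, as assumed from the constructor call above (field names are assumptions):

// Sketch only: the real ClientCompositeVO is not part of the post.
public class ClientCompositeVO {

    private final ClientEntity client;
    private final List<DeptEntity> departments;
    private final List<Employee> employees;

    public ClientCompositeVO(ClientEntity client, List<DeptEntity> departments, List<Employee> employees) {
        this.client = client;
        this.departments = departments;
        this.employees = employees;
    }

    public ClientEntity getClient() { return client; }
    public List<DeptEntity> getDepartments() { return departments; }
    public List<Employee> getEmployees() { return employees; }
}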
Here is my attempt at the Classifier. I am getting a compilation error on clazz instanceof List. In the code below, how should I check whether the item is a List and look up the writer by list.get(0).getClass()?
public class ClientClassifier<T> implements Classifier<T, ItemWriter<? super T>> {

    private final Map<Class<T>, ItemWriter<T>> itemWriterMap;

    public ClientClassifier(Map<Class<T>, ItemWriter<T>> itemWriterMap) {
        this.itemWriterMap = itemWriterMap;
    }

    @Override
    public ItemWriter<? super T> classify(T classifiable) {
        Class<?> clazz = classifiable.getClass();
        if (this.itemWriterMap.containsKey(clazz)) {
            return this.itemWriterMap.get(clazz);
        } else if (clazz instanceof List) { // <-- compilation error here
            List<?> list = (List<?>) classifiable;
            return this.itemWriterMap.get(list.get(0).getClass());
        } else {
            throw new IllegalArgumentException("No writer found for domain class: " + clazz.getTypeName());
        }
    }

    public Map<Class<T>, ItemWriter<T>> getItemWriterMap() {
        return itemWriterMap;
    }
}
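For reference, ClassifierCompositeItemWriter is normally wired like this when each item handed to the writer is a single entity. This is only a sketch: the ItemWriter<Object> typing and the writer bean names are assumptions, not code from the post:

@Bean
public ClassifierCompositeItemWriter<Object> classifierCompositeItemWriter(
        ItemWriter<Object> companyItemWriter,
        ItemWriter<Object> deptItemWriter,
        ItemWriter<Object> employeeItemWriter) {
    ClassifierCompositeItemWriter<Object> writer = new ClassifierCompositeItemWriter<>();
    // The classifier is called once per chunk item and picks the delegate writer for it.
    writer.setClassifier(item -> {
        if (item instanceof ClientEntity) {
            return companyItemWriter;
        }
        if (item instanceof DeptEntity) {
            return deptItemWriter;
        }
        if (item instanceof Employee) {
            return employeeItemWriter;
        }
        throw new IllegalArgumentException("No writer found for " + item.getClass());
    });
    return writer;
}

Because the classifier sees one chunk item at a time, it only helps when the step emits individual entities; with a processor that returns one composite object (or a List) per source row, the classifier never sees the entities inside it, which is what the delegate-writer approach below works around.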
Any suggestions for a simpler solution approach are also welcome.
The following solution works. My domain classes extend a common BaseEntity, and the processor now returns a List<BaseEntity> containing the company, its departments and their employees for each source record. I wrote writers for the 3 entities, i.e. Company, Department & Employee.
I then wrote a delegate writer as follows:
public class DelegateItemWriter implements ItemWriter<List<BaseEntity>> {

    private final LinkedHashMap<Class<? extends BaseEntity>, ItemWriter<? extends BaseEntity>> itemWriterMap;

    public DelegateItemWriter(
            LinkedHashMap<Class<? extends BaseEntity>, ItemWriter<? extends BaseEntity>> itemWriterMap) {
        this.itemWriterMap = itemWriterMap;
    }

    @Override
    @SuppressWarnings("unchecked")
    public void write(Chunk<? extends List<BaseEntity>> chunk) throws Exception {
        // Each chunk item is the list of entities produced by the processor for one source record.
        for (List<BaseEntity> entities : chunk.getItems()) {
            for (BaseEntity entity : entities) {
                ItemWriter<BaseEntity> writer = (ItemWriter<BaseEntity>) this.itemWriterMap.get(entity.getClass());
                writer.write(new Chunk<>(List.of(entity)));
            }
        }
    }
}
Bean definitions for itemWriterMap & DelegateItemWriter:
@Bean
public LinkedHashMap<Class<? extends BaseEntity>, ItemWriter<? extends BaseEntity>> itemWriterMap(MongoTemplate mongoTemplate) {
    LinkedHashMap<Class<? extends BaseEntity>, ItemWriter<? extends BaseEntity>> itemWriterMap = new LinkedHashMap<>();
    itemWriterMap.put(CompanyEntity.class, companyItemWriter(mongoTemplate));
    itemWriterMap.put(DepartmentEntity.class, deptEntityWriter(mongoTemplate));
    itemWriterMap.put(Employee.class, employeeItemWriter(mongoTemplate));
    return itemWriterMap;
}

@Bean
public DelegateItemWriter delegateItemWriter(LinkedHashMap<Class<? extends BaseEntity>, ItemWriter<? extends BaseEntity>> itemWriterMap) {
    return new DelegateItemWriter(itemWriterMap);
}
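The three entity writers referenced in the map (companyItemWriter, deptEntityWriter, employeeItemWriter) are not shown. A minimal sketch of one of them, assuming Spring Batch's MongoItemWriterBuilder and a made-up collection name:

@Bean
public MongoItemWriter<CompanyEntity> companyItemWriter(MongoTemplate mongoTemplate) {
    // Sketch only: the "company" collection name is an assumption, not taken from the post.
    return new MongoItemWriterBuilder<CompanyEntity>()
            .template(mongoTemplate)
            .collection("company")
            .build();
}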
Step Definition:
@Bean
public Step step1(@Qualifier(value = "dataSource") final DataSource dataSource,
        final JobRepository jobRepository,
        final PlatformTransactionManager transactionManager,
        final DelegateItemWriter delegateItemWriter) {
    return new StepBuilder("step1", jobRepository)
            .<ClientProfile, List<BaseEntity>>chunk(10, transactionManager)
            .reader(reader(dataSource))
            .processor(profileProcessor())
            .writer(delegateItemWriter)
            .build();
}
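For completeness, the step's <ClientProfile, List<BaseEntity>> typing implies the processor from the question now returns a List<BaseEntity> instead of ClientCompositeVO. A sketch of that adjustment, assuming the populate* helpers from the question return BaseEntity subclasses:

public class ClientProfileEVProcessor implements ItemProcessor<ClientProfile, List<BaseEntity>> {

    @Override
    public List<BaseEntity> process(ClientProfile item) throws Exception {
        // Company first, then departments, then employees, so parents are written before children.
        List<BaseEntity> entities = new ArrayList<>();
        entities.add(populateClientProfile(item));
        entities.addAll(populateDept(item));
        entities.addAll(populateEmployee(item));
        return entities;
    }
}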