Tags: java, spring, file, java-io

Optimize database read and file write


I have a scenario where my repository retrieves 1 million+ rows. Is there any way to retrieve the records incrementally? Let's say I read only 100k rows at a time, writing each batch to the file before fetching the next.

By the way, I don't have the option to use Spring Batch.

List<Data> dataList = dataRepository.findAll(); // This will retrieve 1 million+ rows

FileWriter fw = new FileWriter(path);
BufferedWriter bw = new BufferedWriter(fw);
for (Data data : dataList) {
    bw.write(data.toString()); // I need to write the data in the file incrementally
}




Solution

  • You can use the findAll(Pageable pageable) variant that comes with PagingAndSortingRepository, which returns a Page, like so:

    Pageable request = PageRequest.of(0, 100000); // first page of 100,000 rows; declared as Pageable so request.next() assigns cleanly
    Slice<Data> dataList;
    // try-with-resources closes the writer and flushes buffered output
    try (BufferedWriter bw = new BufferedWriter(new FileWriter(path))) {
        do {
            dataList = dataRepository.findAll(request);
            for (Data data : dataList) {
                bw.write(data.toString()); // write the current page before fetching the next
            }
            request = request.next(); // turn the page
        } while (dataList.hasNext());
    }
    

    dataList.hasNext() returns false once there is no more data to retrieve, which exits the do-while loop.
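
  • One caveat with offset paging in JPA: the entities loaded for each page stay attached to the persistence context, so heap use can still grow across pages unless the context is cleared between them. If your setup allows it, Spring Data JPA can also stream the result set through a repository method that returns java.util.stream.Stream, materializing one row at a time instead of re-running an offset query per page. The following is a minimal sketch of that alternative; the names findAllBy() and DataExporter are illustrative, not from the original post, and the stream must be consumed inside an open read-only transaction:

    // Assumed repository declaration (Spring Data JPA supports Stream return
    // types for query methods; findAllBy() is a hypothetical derived query):
    // public interface DataRepository extends JpaRepository<Data, Long> {
    //     Stream<Data> findAllBy();
    // }

    import java.io.BufferedWriter;
    import java.io.IOException;
    import java.io.UncheckedIOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.stream.Stream;

    import org.springframework.stereotype.Service;
    import org.springframework.transaction.annotation.Transactional;

    @Service
    public class DataExporter {

        private final DataRepository dataRepository;

        public DataExporter(DataRepository dataRepository) {
            this.dataRepository = dataRepository;
        }

        // The transaction must stay open while the Stream is consumed,
        // hence the @Transactional boundary around the whole export.
        @Transactional(readOnly = true)
        public void export(Path path) throws IOException {
            try (BufferedWriter bw = Files.newBufferedWriter(path);
                 Stream<Data> rows = dataRepository.findAllBy()) {
                rows.forEach(data -> {
                    try {
                        bw.write(data.toString());
                        bw.newLine(); // one record per line
                    } catch (IOException e) {
                        throw new UncheckedIOException(e);
                    }
                });
            }
        }
    }

    Depending on the JPA provider, a fetch-size query hint may also be needed so the JDBC driver does not buffer the whole result set in memory.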