I have a Spring Integration pipeline which fetches documents from an SFTP server and persists them to a PostgreSQL database (via Spring Data/Hibernate). After a successful SFTP fetch, the pipeline gets the fileName and content (as byte[]) and persists them to the database.
I have two main problems:
The problem is that I did not anticipate that clients would upload 100 MB - 200 MB zip files, which the pipeline reads but fails to persist.
Sometimes (NOT always) a "java.lang.OutOfMemoryError: Java heap space" is thrown. Increasing the heap size resolved this temporarily, but is there a solution that does not require loading the whole file content into memory before persisting it to the database?
The only log message recorded during the transaction is:
"thread":"task-scheduler-8","location":"org.springframework.integration.file.remote.synchronizer.AbstractInboundFileSynchronizer.copyFileToLocalDirectory(AbstractInboundFileSynchronizer.java:430)","level":"WARN","message":"The remote file '-r-x------ 1 0 0 81062125 May 15 15:06 test.zip' has not been transferred to the existing local file './transfered-files/test.zip'. Consider removing the local file."}
I understand that I should share code, but I really can't (legal issues).
Consider using an SftpStreamingMessageSource
instead of copying the file over the local file system: https://docs.spring.io/spring-integration/docs/current/reference/html/#sftp-streaming
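A minimal sketch of what that could look like, assuming Spring Integration 5.x (Jsch-based SFTP support) and annotation-based configuration. The host, credentials, remote directory, table name, and the `documents(file_name, content)` schema are all placeholders, not taken from your setup. The streaming source hands you the remote file as an InputStream, which you can then pipe into PostgreSQL with JDBC's setBinaryStream so the heap never holds the whole file:

```java
import java.io.InputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;
import javax.sql.DataSource;

import com.jcraft.jsch.ChannelSftp;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.annotation.InboundChannelAdapter;
import org.springframework.integration.annotation.Poller;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.integration.core.MessageSource;
import org.springframework.integration.file.FileHeaders;
import org.springframework.integration.file.remote.session.CachingSessionFactory;
import org.springframework.integration.file.remote.session.SessionFactory;
import org.springframework.integration.sftp.inbound.SftpStreamingMessageSource;
import org.springframework.integration.sftp.session.DefaultSftpSessionFactory;
import org.springframework.integration.sftp.session.SftpRemoteFileTemplate;
import org.springframework.messaging.Message;

@Configuration
public class SftpStreamingConfig {

    @Autowired
    private DataSource dataSource;

    // Placeholder connection details -- replace with your real values.
    @Bean
    public SessionFactory<ChannelSftp.LsEntry> sftpSessionFactory() {
        DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
        factory.setHost("sftp.example.com");
        factory.setUser("user");
        factory.setPassword("secret");
        factory.setAllowUnknownKeys(true);
        return new CachingSessionFactory<>(factory);
    }

    // Emits messages whose payload is an InputStream over the remote file;
    // nothing is copied to local disk and nothing is read fully into memory.
    @Bean
    @InboundChannelAdapter(channel = "stream", poller = @Poller(fixedDelay = "5000"))
    public MessageSource<InputStream> sftpMessageSource() {
        SftpStreamingMessageSource source =
                new SftpStreamingMessageSource(new SftpRemoteFileTemplate(sftpSessionFactory()));
        source.setRemoteDirectory("/upload");
        return source;
    }

    // Streams the payload straight into the database; setBinaryStream lets the
    // JDBC driver consume the InputStream in chunks instead of a byte[].
    @ServiceActivator(inputChannel = "stream")
    public void persist(Message<InputStream> message) throws Exception {
        String fileName = (String) message.getHeaders().get(FileHeaders.REMOTE_FILE);
        try (InputStream in = message.getPayload();
             Connection con = dataSource.getConnection();
             PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO documents(file_name, content) VALUES (?, ?)")) {
            ps.setString(1, fileName);
            ps.setBinaryStream(2, in);
            ps.executeUpdate();
        }
    }
}
```

Note that with the streaming adapter you are responsible for closing the underlying SFTP session when you are done with the stream; the reference documentation linked above describes the closeable-resource header the adapter sets for this purpose. Also check your JDBC driver's behavior: some drivers still buffer bytea parameters, in which case storing the content as a PostgreSQL large object may be the more memory-friendly option.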