I am working on an application that archives data on Linux (RHEL 7). The application reads files from the source and writes a tar/tar.gz to an archive folder. This worked fine with a small amount of data, but when I tried it with a large data set I got the following exception:
Caused by: java.io.IOException: File too large
at java.io.UnixFileSystem.createFileExclusively(Native Method) ~[na:1.8.0_92]
at java.io.File.createNewFile(File.java:1012) ~[na:1.8.0_92]
The source has more than 6,000,000 files, and the process breaks at around the 3,280,000 file count. We are archiving all files into a single folder. The system has enough space available.
Is there a limit on the number of files per folder on Linux?
I have also checked /etc/security/limits.conf, but it has no such setting, and the entire file is commented out.
IMPORTANT: The files are being written to NFS.
The IOException with the message "File too large" occurred because the application was writing the files to a NetApp NFS share, which has a limit on the number of files a single directory can hold.
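A common workaround is to spread the archive files across subdirectories so that no single directory accumulates millions of entries. Below is a minimal sketch of that idea; the class name `BucketedArchive`, the helper `bucketedTarget`, and the bucket count of 256 are illustrative assumptions, not part of the original application.

```java
import java.io.File;
import java.io.IOException;

public class BucketedArchive {

    // Hypothetical helper: derive a stable bucket subdirectory from the
    // file name so entries are spread evenly across `buckets` directories,
    // keeping each directory well below any per-directory file limit.
    static File bucketedTarget(File archiveRoot, String fileName, int buckets) {
        // floorMod avoids a negative index when hashCode() is negative
        int bucket = Math.floorMod(fileName.hashCode(), buckets);
        File dir = new File(archiveRoot, String.format("%03d", bucket));
        dir.mkdirs(); // create the bucket directory if it does not exist yet
        return new File(dir, fileName);
    }

    public static void main(String[] args) throws IOException {
        File root = new File("archive");
        // Instead of archive/data-0001.tar.gz, the file lands in
        // archive/<bucket>/data-0001.tar.gz
        File target = bucketedTarget(root, "data-0001.tar.gz", 256);
        System.out.println(target.getPath());
        target.createNewFile();
    }
}
```

With 256 buckets, 6,000,000 files average out to roughly 23,000 entries per directory, which also keeps directory listings and lookups fast.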