I"m running a Java application on a Linux system. I noticed that the application seemed to consume a lot of file handles (I get "Too many open files" after a few days).
So when I use the 'lsof' command to dump all the files associated with the Java application, I get something like this:
java 2690 root 239u REG 3,2 428057 94300 /tmp/jar_cache5782499018536796385.tmp (deleted)
java 2690 root 240u REG 3,2 58955 94360 /tmp/jar_cache3818842806647031366.tmp (deleted)
java 2690 root 241u REG 3,2 28673 94301 /tmp/jar_cache8793213887943479521.tmp (deleted)
java 2690 root 242u REG 3,2 67115 94302 /tmp/jar_cache3648070144390426051.tmp (deleted)
I'm only showing 4 here, but there are actually 87 of them, and the number grows over time.
From what I've read online, Java creates these temporary files internally and they are supposedly normal.
As the output above shows, they are marked as deleted, and I can confirm that they no longer exist on the file system.
But what I'm worried about is that the JVM is not releasing the file descriptors or the associated memory... Does anyone know anything about these '/tmp/jar_cache####.tmp' files, or have experience with them?
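For reference, a quick way to watch the descriptor count from inside the JVM on Linux is to count the entries in /proc/self/fd (just a rough sketch; the class name is arbitrary):

```java
import java.io.File;

// Rough Linux-only sketch: count this process's open file descriptors
// by listing /proc/self/fd.
public class OpenFdCount {
    public static void main(String[] args) {
        String[] fds = new File("/proc/self/fd").list();
        // Note: listing the directory briefly opens one extra descriptor itself.
        System.out.println("Open file descriptors: " + (fds == null ? "unknown" : fds.length));
    }
}
```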
The /tmp/jar_cache files are produced when a JAR is loaded via a URLClassLoader. I suspect some components of the application are being reloaded, which results in old jar_cache files being deleted and new ones being created. The fact that the file handles are not released looks like a JVM issue, though - I've seen the same behavior with the same JDK version.
There are comments along these lines on this JVM bug: https://bugs.java.com/bugdatabase/view_bug?bug_id=4166799, although the issue was closed a while ago.
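If the reload path is in your own code, one thing worth trying on Java 7+ is explicitly closing the URLClassLoader when you are done with it: close() releases the JAR files the loader has opened, which should also let go of the descriptor on any cached copy. A minimal sketch (the URL and class name below are just placeholders):

```java
import java.net.URL;
import java.net.URLClassLoader;

public class PluginLoadSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder URL: loading a JAR over a non-file URL is the kind of
        // situation where the runtime may stage it as /tmp/jar_cache*.tmp.
        URL jarUrl = new URL("http://example.com/plugins/plugin.jar");

        URLClassLoader loader = new URLClassLoader(new URL[] { jarUrl });
        try {
            // Placeholder class name; load whatever the component needs.
            Class<?> cls = loader.loadClass("com.example.PluginMain");
            System.out.println("Loaded " + cls.getName());
        } finally {
            // Java 7+: URLClassLoader implements Closeable. Closing it tells the
            // loader to release the JAR files it has opened, so an old loader
            // should no longer keep the descriptor on the (deleted) cache file alive.
            loader.close();
        }
    }
}
```

If the loaders are created inside a framework or container you may not be able to do this directly, but it at least helps narrow down whether the old loaders or the JVM itself are holding the descriptors.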