A Java application I have made is apparently using shared memory, which is interfering with the deployment of other programs in my client's Unix environment.
It was never the intention to use any shared memory, and I need to identify where it is being consumed.
Program Overview:
- Generic executable jar which reads a configuration file, executes a Unix command-line script, and posts the entirety of the output to a REST service hosted elsewhere. This is repeated every hour (it monitors the health of the environment).
- There are 3 separate instances of the jar running at the same time, but reading different configuration files. At no point do they share any files, and the log files they create are separate.
- All of the BufferedReaders, InputStreams, etc. are opened and closed appropriately.
After starting up the program, I check its presence in shared memory by grepping for the process IDs in the output of ipcs -a, where they are all listed.
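Concretely, the check looks something like this (the PID 12345 is a placeholder for one of the real JVM process IDs):

```shell
# ipcs -mp lists each System V shared memory segment together with the
# creator PID (cpid) and last-attach PID (lpid), which is what I grep for.
JVM_PID=12345   # placeholder: one of the three JVM process IDs
ipcs -mp | grep -w "$JVM_PID" || echo "no segments for $JVM_PID"
```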
Are there any warning signs here for people familiar with IPC and Java? Is there any launch option I can use to prevent shared memory being used?
Cheers
Edit:
@Aaron - There are no errors, but when the other programs are deployed on the environment, they check for processes using shared memory. If there is such a process, it halts the deployment. There must be a reason for that check, but I don't know it...
I can think of a couple of possible explanations:
Apparently, if you run a JVM with -XX:+UseLargePages enabled, the JVM uses shared memory: see Cannot create JVM with -XX:+UseLargePages enabled
If two JVMs open a MappedByteBuffer on the same file, they are effectively using shared memory: see Java NIO - Memory mapped files
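To illustrate the second case, here is a minimal sketch (the class name, file name, and value are arbitrary) showing that two mappings of the same file region are backed by the same memory; two JVMs mapping the same file behave analogously:

```java
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class MmapDemo {
    // Maps the same file twice and shows that a write made through one
    // mapping is visible through the other: READ_WRITE mappings are
    // MAP_SHARED, so the OS backs both views with the same pages.
    static long demo() throws IOException {
        Path p = Files.createTempFile("shm-demo", ".bin");
        try (FileChannel ch = FileChannel.open(p,
                StandardOpenOption.READ, StandardOpenOption.WRITE)) {
            MappedByteBuffer a = ch.map(FileChannel.MapMode.READ_WRITE, 0, 8);
            MappedByteBuffer b = ch.map(FileChannel.MapMode.READ_WRITE, 0, 8);
            a.putLong(0, 42L);
            return b.getLong(0);
        } finally {
            Files.deleteIfExists(p);
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(demo()); // prints 42
    }
}
```

Such a mapping shows up as shared, file-backed memory in the process's memory map rather than as a System V segment, but deployment checks may not distinguish the two.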
It is also possible to write JNI native libraries that use shmat and so on to create shared memory segments. One of your Java application's 3rd-party libraries could be doing this behind the scenes.
As to finding the culprit, maybe you should try running java under strace. This blog post tells you how:
However, going from the raw traces to a definite diagnosis may not be easy.
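You can narrow the trace to just the System V shared-memory syscalls up front; in this sketch, java -version stands in for your real java -jar command line:

```shell
# -f follows forked children; -e trace=... keeps only the four SysV
# shared-memory calls, so the log stays small enough to read.
strace -f -e trace=shmget,shmat,shmdt,shmctl -o shm-trace.log \
    java -version
# Any shmget/shmat lines in the log identify code creating or
# attaching segments, with arguments and returned segment ids.
grep -E 'shmget|shmat' shm-trace.log || echo "no SysV shm calls traced"
```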
When the other programs are deployed on the environment, they check for processes using shared memory. If there is such a process, it halts the deployment. There must be a reason for that check, but I don't know it...
That could be a defence against subversion of some application's license checking ... or something like that. I would be inclined to talk to the providers / vendors of those programs to find out why their software is failing. Ask >>them<< for an acceptable workaround to get their code to co-exist with Java and/or your Java apps.