Many a carefully crafted piece of Java code has been laid to waste by java.lang.OutOfMemoryError. There seems to be no relief from it; even production-class code gets downed by it.
The question I wish to ask is: are there good programming/architecture practices by which you can avoid hitting this error?
So the tools at a Java programmer's disposal seem to be:
So the thought I had is: could one write factory methods which check whether the system has adequate memory left before attempting to allocate it? In C, for example, malloc would fail and you would know that you had run out of memory -- not an ideal situation, but you wouldn't just drop dead from a java.lang.OutOfMemoryError aneurysm.
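A minimal sketch of that idea, assuming a factory that refuses to allocate when the estimated free heap drops below a threshold (the class name, threshold value, and `byte[]` product are all illustrative, not from any real API):

```java
import java.util.Optional;

public class GuardedFactory {
    // Hypothetical floor: refuse to allocate if the estimated free heap
    // would drop below this many bytes. The value is illustrative.
    private static final long MIN_FREE_BYTES = 10L * 1024 * 1024;

    /** Estimates heap still available: (max - allocated) + free within allocated. */
    static long estimatedFreeBytes() {
        Runtime rt = Runtime.getRuntime();
        return rt.maxMemory() - rt.totalMemory() + rt.freeMemory();
    }

    /** Allocates a buffer only if enough heap appears to remain; otherwise
     *  returns empty so the caller can degrade gracefully instead of dying. */
    static Optional<byte[]> createBuffer(int size) {
        if (estimatedFreeBytes() - size < MIN_FREE_BYTES) {
            return Optional.empty();
        }
        return Optional.of(new byte[size]);
    }
}
```

Note the caveats: the check is only an estimate (another thread can allocate between the check and the `new`, and a GC can free memory at any moment), so this softens the failure mode rather than guaranteeing safety.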
One suggested approach is to do better memory management, plug memory leaks, or simply allocate more memory -- I do agree these are valuable points, but let's look at the following scenarios:
Thanks in advance.
We handled JVM memory more as a tuning parameter than as something to manage actively with the application. We have a MemoryInfo class (which wraps several of the Runtime memory-info methods).
While the application is running, we track free memory in the application as:

Runtime.getRuntime().maxMemory() - Runtime.getRuntime().totalMemory() + Runtime.getRuntime().freeMemory();

Max memory is the -Xmx JVM arg, total memory is what the JVM has already allocated to the application heap, and free memory is how much of the allocated heap is still available. (If your -Xms parameter is the same as your -Xmx parameter, then freeMemory() is all you need to check.)
If we get above 70% memory usage, we send alerts to our monitoring system. At that point we decide whether we can limp through the rest of the day, or whether to adjust the -Xmx parameter and restart. Although this seems a bit messy, in practice, once we have tuned a system we never run into memory problems after that. (Once you get above 90% of max memory used, the JVM will GC extremely frequently to try to prevent running out of memory.)
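A minimal sketch of such a wrapper, assuming the 70% alert level described above (the class name matches the MemoryInfo mentioned earlier, but the method names and threshold constant are assumptions for illustration):

```java
public class MemoryInfo {
    // Alert level from the text: flag once usage crosses 70% of -Xmx.
    private static final double ALERT_THRESHOLD = 0.70;

    /** Bytes of heap still available: (max - allocated) + free within allocated. */
    public long freeBytes() {
        Runtime rt = Runtime.getRuntime();
        return rt.maxMemory() - rt.totalMemory() + rt.freeMemory();
    }

    /** Fraction of the -Xmx ceiling currently in use, in [0, 1]. */
    public double usedFraction() {
        return 1.0 - (double) freeBytes() / Runtime.getRuntime().maxMemory();
    }

    /** True when usage has crossed the alert threshold. */
    public boolean shouldAlert() {
        return usedFraction() >= ALERT_THRESHOLD;
    }
}
```

A monitoring thread could poll shouldAlert() periodically and push the result to whatever alerting system is in place.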
I think the approach of managing memory with every construction is draconian, but if you need absolute control, then maybe it makes sense. Another approach is to make sure that any memory caches you use have an LRU or expiration-and-reload mechanism, so you can better limit the number of objects kept in memory.
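A minimal sketch of the LRU-cache idea, using java.util.LinkedHashMap's access-order mode and removeEldestEntry hook (the class name and size limit are illustrative; the original text names no specific implementation):

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** A size-bounded LRU cache: when the entry count exceeds maxEntries,
 *  the least-recently-accessed entry is evicted automatically. */
public class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public LruCache(int maxEntries) {
        super(16, 0.75f, true); // accessOrder=true yields LRU iteration order
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxEntries;
    }
}
```

Bounding caches this way caps their heap footprint regardless of how many keys the application touches, which directly limits one common source of OutOfMemoryError.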
That said, our approach is to keep as much as possible in memory and just allocate plenty of RAM. Our big systems have 28G of RAM allocated (we use between 40-60% of that on average).