Tags: java, jvm, weblogic, out-of-memory

How to find which Class is causing OutOfMemory for the JVM?


Of late, our WebLogic server has been crashing frequently with an OutOfMemory error. Is there any way I can monitor the JVM to find out which classes are hogging the memory and have the maximum number of objects?


Solution

  • Yes. The way I did it was to configure the JVM to create a heap dump on OutOfMemoryError, then I pulled the heap dump down and ran it through jvisualvm. Computing the retained sizes took a long time, but it made it very clear what the offender was.

    You can also attach jvisualvm to a running instance, but you need to configure the JVM to accept the connection; that way you can watch the heap grow in real time. See this guide (it's for JBoss, but the setup should be very similar): https://wiki.projectbamboo.org/display/BTECH/VisualVM+Profiler

    I think it is easier to get to the answer once you have a heap dump, though, because when you watch it in real time things keep getting garbage collected out from under you. (For a quick look at a live server without a full dump, see the jmap sketch below.)

    EDIT -- here are my startup configs.

    -XX:+PrintGCDetails -XX:+PrintGCTimeStamps  
    -Xloggc:/path/to/memlogs/memlog.txt -XX:+PrintTenuringDistribution   
    -Xms1024m -Xmx2048m -XX:MaxPermSize=128m   
    -server -Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.port=xxxx   
    -Dcom.sun.management.jmxremote.ssl=false -Dcom.sun.management.jmxremote.authenticate=false   
    -Djava.rmi.server.hostname=<ip-address> -XX:+HeapDumpOnOutOfMemoryError   
    -XX:HeapDumpPath=/path/to/heapdumps/ -XX:+CMSClassUnloadingEnabled   
    -XX:+CMSPermGenSweepingEnabled -XX:+UseConcMarkSweepGC  
    

    Because I configured it to drop memory logs, I can tail the memlog.txt file in real time to see what happened. I can connect to the JVM if I want, but as I said, I just analyze the heap dump after a crash to see what the issue is, because after the fact it's really clear (see the heap-dump analysis sketch below).
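
    If you just want a quick look at a live server before it actually crashes, the JDK's own command-line tools can print a per-class histogram without taking a full dump. This is only a minimal sketch, assuming a HotSpot JDK is on the PATH; the PID here is a placeholder for whatever jps reports for your WebLogic server:

        # find the PID of the WebLogic JVM
        jps -l

        # print a histogram of live objects (instance count and bytes per class);
        # :live forces a full GC first, so be careful on a busy production box
        jmap -histo:live 1234 | head -30

    The histogram only shows shallow sizes, so treat it as a quick hint rather than a substitute for computing retained sizes over the heap dump in jvisualvm.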
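
    Once -XX:+HeapDumpOnOutOfMemoryError and -XX:HeapDumpPath (from the startup configs above) have produced an .hprof file, you can inspect it in jvisualvm (File > Load) or straight from the command line. A rough sketch, assuming a JDK 6-8 install that still ships jhat; the dump file name is just an example of the default java_pid<pid>.hprof naming:

        # see what the JVM wrote after the crash
        ls /path/to/heapdumps/

        # jhat parses the dump and serves it at http://localhost:7000;
        # -J-Xmx gives jhat's own JVM enough heap to load the dump
        jhat -J-Xmx2048m /path/to/heapdumps/java_pid1234.hprof

    In the jhat pages (or in jvisualvm's heap view), sorting classes by instance count or size shows which classes are holding on to the memory.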