Problem: memory utilization keeps growing in my program while parsing XML strings.
I have an XML stream coming in live on a TCP port.
I use Netty to pull the XML data out, and then XStream to deserialize/unmarshal each XML string into a tree of Java objects.
When I monitor memory consumption with the NetBeans profiler, I see the heap growing over time until the JVM finally throws an OutOfMemoryError. I traced the call stack and have attached a screenshot of one of my tests in action.
The memory growth seems to happen when I deserialize/unmarshal the XML strings into Java objects.
I've tried different parsers within XStream (XPP3, kXML, StAX) to do the parsing, and even JAXB instead of XStream to unmarshal. No matter which parser I use, the problem persists. These are the drivers I've tried so far:
xstream1 = new XStream(new KXml2Driver());
xstream2 = new XStream(new StaxDriver());
xstream3 = new XStream(new JDomDriver());
xstream4 = new XStream(new Xpp3Driver());
Like I mentioned, I've even tried JAXB to unmarshal instead of XStream; still the same issue.
If you look at the attached image, it's the Arrays.copyOfRange call right under the char[] entry that shows up. No matter which parser I use, this call always appears at the top of the trace.
I'm completely lost on how to approach or fix this problem.
Please note: I'm not reading the XML from a file. I receive a live stream of data containing small XML chunks, and I extract each chunk to convert it into Java objects for further processing.
Thanks
Answer
Well, on the evidence you've shown us, the simplest explanation is that your JVM's heap is too small. Try adding an "-Xmx" option, as described in the manual entry for the java command.
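For example (the heap size and jar name below are illustrative; pick a size that fits your machine):

```shell
# Start the JVM with a 1 GiB maximum heap.
# "-Xmx1g" is an illustrative value; "your-app.jar" is a placeholder.
java -Xmx1g -jar your-app.jar
```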
If that doesn't work, then you need to take a deeper look at what your application is doing:
Is there an upper bound on the size of these "small" XML chunks? Could you be getting a chunk that is bigger than you have allowed for?
Is your application's processing keeping data from each chunk in a long-lived in-memory data structure? Can you place an upper bound on the size of that structure, perhaps by throwing old entries out, or by using "weak references" so that the GC can reclaim them?
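One common way to bound such a structure is a size-capped LinkedHashMap that evicts its eldest entry on insertion. This is only a sketch; the class name and capacity are hypothetical, not taken from your code:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// A cache that evicts its eldest entry once it grows past maxEntries,
// so parsed chunks cannot accumulate without bound.
// The name "BoundedChunkCache" is hypothetical.
class BoundedChunkCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    BoundedChunkCache(int maxEntries) {
        this.maxEntries = maxEntries; // tune to your memory budget
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Called after each put(); returning true removes the oldest entry.
        return size() > maxEntries;
    }
}
```

With a cap of 2, inserting a third entry silently drops the first one inserted, keeping memory use flat no matter how many chunks arrive.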
Is your application leaking memory?
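To investigate a leak, you can ask the JVM to write a heap dump when the OutOfMemoryError is thrown, then open the dump in a profiler (NetBeans can read it) to see which objects dominate the heap. The path and jar name below are illustrative:

```shell
# Write a heap dump to /tmp when an OutOfMemoryError occurs.
# "/tmp" and "your-app.jar" are placeholders.
java -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp -jar your-app.jar
```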