Tags: java, memory, stanford-nlp

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space not fixed


This is not a duplicate question; I have already seen this. I want to run a Java program, and I get this error:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at edu.stanford.nlp.ie.crf.CRFLogConditionalObjectiveFunction.empty2D(CRFLogConditionalObjectiveFunction.java:892)
    at edu.stanford.nlp.ie.crf.CRFLogConditionalObjectiveFunction.<init>(CRFLogConditionalObjectiveFunction.java:134)
    at edu.stanford.nlp.ie.crf.CRFLogConditionalObjectiveFunction.<init>(CRFLogConditionalObjectiveFunction.java:117)
    at edu.stanford.nlp.ie.crf.CRFClassifier.getObjectiveFunction(CRFClassifier.java:1792)
    at edu.stanford.nlp.ie.crf.CRFClassifier.trainWeights(CRFClassifier.java:1798)
    at edu.stanford.nlp.ie.crf.CRFClassifier.train(CRFClassifier.java:1713)
    at edu.stanford.nlp.ie.AbstractSequenceClassifier.train(AbstractSequenceClassifier.java:763)
    at edu.stanford.nlp.ie.AbstractSequenceClassifier.train(AbstractSequenceClassifier.java:751)
    at edu.stanford.nlp.ie.crf.CRFClassifier.main(CRFClassifier.java:2917)

According to this, I tried:

java -Xms2000m -cp stanford-ner.jar edu.stanford.nlp.ie.crf.CRFClassifier -prop fa.prop 

but the error was not fixed and I see it again. When I set a value higher than 2000m, my OS crashed, or I got this output:
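One thing worth checking: `-Xms` only sets the *initial* heap size; the *maximum* is controlled by `-Xmx`. The command above passes only `-Xms2000m`, so the actual heap ceiling may still be the platform default. A small sketch to print what the running JVM really allows (the class name `HeapCheck` is my own, not part of Stanford NER):

```java
// HeapCheck.java -- print the heap limits the running JVM actually uses.
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // maxMemory() reflects -Xmx (or the platform default if -Xmx is absent).
        System.out.println("Max heap (MB):     " + rt.maxMemory() / (1024 * 1024));
        // totalMemory() starts near -Xms and grows toward the maximum as needed.
        System.out.println("Current heap (MB): " + rt.totalMemory() / (1024 * 1024));
        // "os.arch" shows the JVM's architecture, e.g. "amd64" for a 64-bit VM.
        System.out.println("JVM arch:          " + System.getProperty("os.arch"));
    }
}
```

Running this with the same flags (e.g. `java -Xms2000m HeapCheck`) shows whether the maximum ever moved; adding `-Xmx` is what actually raises it.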

...
...
//stanford log
...

Time to convert docs to data/labels: 8.8 seconds
Killed

How can I fix it?

Edit:

And for this command:

[stanford-ner]$ java -Xms1G -Xmx50G -cp stanford-ner.jar edu.stanford.nlp.ie.crf.CRFClassifier -prop fa.prop

I get this error:

[1000][2000][3000][4000][5000][6000]OpenJDK 64-Bit Server VM warning: INFO: os::commit_memory(0x00007f04c7c00000, 1225785344, 0) failed; error='Cannot allocate memory' (errno=12)
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Native memory allocation (malloc) failed to allocate 1225785344 bytes for committing reserved memory.
# An error report file with more information is saved as:
# /stanford-ner/hs_err_pid1536.log

Solution

  • Given the software's purpose, it is likely very memory-hungry, so it is reasonable to assume that 1 GB of heap just isn't sufficient; you'll have to increase the heap size further.

    The messages you get when you try this imply that you are using either

    • a 32-bit OS, or
    • a 32-bit VM,

    both of which can limit you to a maximum heap size of about 1.5 GB (at least on Windows).

    So make sure you use a 64-bit VM on a 64-bit OS, and then try increasing the heap size again.
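The checks above can be done from the shell before retrying (the heap values below are placeholders, not recommendations):

```shell
# 1) The version banner of a 64-bit JVM contains "64-Bit Server VM":
java -version 2>&1 | head -n 3

# 2) Make sure no per-process virtual-memory cap is set in this shell;
#    "unlimited" (or a value well above -Xmx) is what you want, since
#    errno=12 means the OS itself refused the allocation:
ulimit -v

# 3) Then retry with BOTH the initial (-Xms) and maximum (-Xmx) heap set,
#    keeping -Xmx below the machine's physical RAM (shown, not executed here):
# java -Xms1g -Xmx4g -cp stanford-ner.jar edu.stanford.nlp.ie.crf.CRFClassifier -prop fa.prop
```

Note that `-Xmx50G` on a machine with far less physical RAM is exactly what produces the `os::commit_memory ... Cannot allocate memory` failure in the edit above.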