Tags: java, elasticsearch, out-of-memory, logstash, heap-size

Heap space error while executing Logstash with a large dictionary (translate filter)


While running Logstash with a large (353 MB) dictionary in the translate filter, I get this error:

    java.lang.OutOfMemoryError: Java heap space

I use the dictionary to do a lookup on my input data.

I tried to allow the JVM to use more memory (with java -Xmx2048m), but it has no effect; I suppose I am doing it wrong.

I tested my config file with a smaller dictionary and it worked fine. Any help, please? How can I give Logstash enough RAM so that it doesn't die?

My config file looks like this:

input {
  file {
    type => "MERGED DATA"
    path => "C:\logstash-1.4.1\bin\..."
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  grok {
    match => [ "message", "..." ]
  }

  if (...) {
    translate {
      dictionary_path => "C:\logstash-1.4.1\bin\DICTIONARY.yaml"
      field => "Contact_ID"
      destination => "DATA"
      fallback => "no match"
      refresh_interval => 60
    }

    grok {
      match => [ "DATA", "..." ]
    }

    mutate {
      remove_field => ...
    }
  }

  else if ...

  else if ...

  mutate { ... }
}

output {
  if [rabbit] == "INFO" {
    elasticsearch {
      host => "localhost"
    }
    stdout {}
  }
}

Solution

  • To increase the heap size, set the LS_HEAP_SIZE environment variable before launching Logstash:

    LS_HEAP_SIZE=2048m
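
    For example, since this config lives under C:\logstash-1.4.1, the variable can be set in the same shell session before starting Logstash. This is a sketch, assuming the Logstash 1.4.x wrapper scripts read LS_HEAP_SIZE; logstash.conf is a placeholder for your own config file:

        :: Windows (cmd.exe): set the heap size, then launch Logstash from the same shell
        SET LS_HEAP_SIZE=2048m
        C:\logstash-1.4.1\bin\logstash.bat agent -f logstash.conf

        # Linux/macOS equivalent, under the same assumption
        LS_HEAP_SIZE=2048m bin/logstash agent -f logstash.conf

    This also explains why java -Xmx2048m had no effect: Logstash is started through its wrapper script, which builds its own java command line, so options passed to a separate java invocation never reach the Logstash JVM.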