We have a Rails runner script (invoked via `rails runner`) that loads a set of JSON schema files, processes them, and writes the processed data out to files.
If I run the script repeatedly, it raises `failed to allocate memory (NoMemoryError)` roughly once every 15-20 runs. I compared the memory usage of failed and successful runs just before termination, and both were ~178 MB. The VM has ~2 GB of free memory.
Based on the log lines I added for debugging, it fails when the code calls `JSON.pretty_generate(schema)`.
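For context, the relevant part of the script looks roughly like this (a simplified sketch; `process_schema!` and `output_path_for` are stand-ins for the real, app-specific helpers):

```ruby
require "json"

# Simplified sketch of the processing loop; helper names are illustrative.
Dir.glob(Rails.root.join("schemas", "*.json").to_s).sort.each do |path|
  schema = JSON.parse(File.read(path))
  process_schema!(schema)  # app-specific transformation, mutates in place
  Rails.logger.info("writing output for #{path}")
  File.write(output_path_for(path), JSON.pretty_generate(schema)) # <- NoMemoryError here
end
```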
Observations:
- The script processes the same set of files on every run, and I don't understand why `NoMemoryError` is raised only intermittently.
- The files are processed in the same order on every run.
- Every failed run failed while processing the same particular file. I fired up IRB, performed the same set of operations on that file, and measured the process memory before and after (see the first snippet after this list); the operations increased memory by only ~100 KB.
- I tried raising `RUBY_GC_HEAP_INIT_SLOTS` so that the process memory size is 10x that of a successful or failed run, but still no luck. There are lots and lots of `heap_free_slots` (54x `heap_live_slots`) when `NoMemoryError` is raised (see the second snippet after this list).
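Roughly what I ran in IRB to measure the per-file memory delta (RSS read via `ps`; the file path is a placeholder):

```ruby
require "json"

# RSS of the current process in kilobytes (Linux/macOS).
def rss_kb
  `ps -o rss= -p #{Process.pid}`.to_i
end

before = rss_kb
schema = JSON.parse(File.read("path/to/the_suspect_file.json"))
pretty = JSON.pretty_generate(schema)
after = rss_kb
puts "RSS delta: #{after - before} KB"  # => ~100 KB
```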
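And this is roughly how I inspected the GC state at the point of failure (the rescue block and script path are illustrative, not verbatim from the script):

```ruby
# Invoked with a larger initial heap, e.g.:
#   RUBY_GC_HEAP_INIT_SLOTS=6000000 bin/rails runner script/process_schemas.rb
begin
  File.write(out_path, JSON.pretty_generate(schema))
rescue NoMemoryError
  stats = GC.stat
  # heap_free_slots was ~54x heap_live_slots when the error fired.
  Rails.logger.error(
    "heap_live_slots=#{stats[:heap_live_slots]} " \
    "heap_free_slots=#{stats[:heap_free_slots]}"
  )
  raise
end
```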
My questions:
- Why can't Ruby allocate more memory when the VM has ~2 GB free?
- Any pointers or suggestions?
Ruby version: 2.7.1
Rails version: 5.2.4.2
Thank you!