Tags: java, memory, memory-management, heap-memory, directmemory

Setting MaxDirectMemory and MaxHeapMemory for Java Applications


For my Java application, I was trying to limit the heap memory and direct memory usage using command line options.

I came across the following VMware article when I was trying to understand more about the Java application memory layout.

From the article, I assumed that the -Xmx setting can be used to limit heap usage, while the -XX:MaxDirectMemorySize setting can be used to limit the native memory that lies outside the heap (Guest OS Memory in the diagram). But the results were different when I ran a simple program. I used ByteBuffer.allocateDirect to allocate native memory and ByteBuffer.allocate to allocate heap memory.

It is a 64-bit processor (OS X) and a 64-bit JVM.
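
For reference, here is a minimal sketch (the class name MemoryLimitsCheck is mine, not from the post) that prints the maximum heap size the JVM reports and the current direct buffer pool usage, using the standard Runtime and BufferPoolMXBean APIs; running it alongside the experiments below can help show which limit each kind of allocation counts against:

import java.lang.management.BufferPoolMXBean;
import java.lang.management.ManagementFactory;

public class MemoryLimitsCheck {

    public static void main(String[] args) {
        // Maximum heap the JVM will attempt to use (influenced by -Xmx).
        System.out.println("Max heap (bytes): " + Runtime.getRuntime().maxMemory());

        // Buffer pool statistics; the "direct" pool tracks ByteBuffer.allocateDirect allocations.
        for (BufferPoolMXBean pool : ManagementFactory.getPlatformMXBeans(BufferPoolMXBean.class)) {
            System.out.println(pool.getName()
                    + ": used=" + pool.getMemoryUsed()
                    + ", capacity=" + pool.getTotalCapacity());
        }
    }
}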

First Experiment

import java.nio.ByteBuffer;
import java.util.ArrayList;

public class javalimits {

    public static void main(String[] args) throws Exception {

        // Hold references so the buffers are never garbage collected.
        ArrayList<ByteBuffer> al = new ArrayList<>();
        for (int i = 0; i < 100; i++) {
            // Allocate 1 GB of native (off-heap) memory per iteration.
            ByteBuffer bb = ByteBuffer.allocateDirect(1024 * 1024 * 1024);
            al.add(bb);
            System.out.println(" Buffer loop " + i);
            Thread.sleep(500);
        }
        Thread.sleep(10000);
    }
}

When I ran the above program without any options, it crashed after 3.6G of memory allocation. When I used the "-XX:MaxDirectMemorySize=100g" option or the "-Xms100g -Xmx100g" option, it crashed after 65 loops, i.e. around 65G of memory allocation.

I don't understand:

  1. Since my physical RAM is just 16G, why didn't it crash after 16G of memory allocation? What is special about 64G of native memory allocation?
  2. How do the native memory allocation limits change when I use "-Xms100g -Xmx100g"? I assumed that native memory limits are controlled only by the "-XX:MaxDirectMemorySize=100g" option, as per the diagram in the link I provided. But the results are different: the heap size settings changed the direct memory buffer limits too.
  3. What is special about 3.6G of memory allocation when no command-line options are provided?

Second Experiment

I changed ByteBuffer.allocateDirect to ByteBuffer.allocate to allocate heap memory instead of native memory.

import java.nio.ByteBuffer;
import java.util.ArrayList;

public class javalimits {

    public static void main(String[] args) throws Exception {

        // Hold references so the buffers are never garbage collected.
        ArrayList<ByteBuffer> al = new ArrayList<>();
        for (int i = 0; i < 100; i++) {
            // Allocate 1 GB on the Java heap per iteration.
            ByteBuffer bb = ByteBuffer.allocate(1024 * 1024 * 1024);
            al.add(bb);
            System.out.println(" Buffer loop " + i);
            Thread.sleep(500);
        }
        Thread.sleep(10000);
    }
}

When I ran the above program without any options, it crashed after 2.7G of memory allocation. When I used the "-XX:MaxDirectMemorySize=100g" option, it didn't have any effect; it still crashed after 2.7G of memory allocation. That made sense to me. But when I added the "-Xms100g -Xmx100g" option, it crashed after 48 loops, i.e. around 48G of memory allocation.

I don't understand why:

  1. Since my physical RAM is just 16G, why didn't it crash after 16G of memory allocation? What is special about 48G of heap memory allocation?
  2. What is special about 2.7G of memory allocation when no command-line options are provided?

Third Experiment

I enabled both the allocateDirect and allocate calls inside the loop (a sketch of the modified loop is shown below). When I added the "-Xms100g -Xmx100g" option, it crashed after 24 loops, or effectively 48G of memory allocation combining both (24G of native memory + 24G of heap memory).
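
The post does not show the modified program; assuming the loop simply performs both allocations on every iteration, it would look roughly like this:

// Assumed shape of the third experiment's loop: one direct (native) buffer
// and one heap buffer are allocated and retained on every iteration.
ArrayList<ByteBuffer> al = new ArrayList<>();
for (int i = 0; i < 100; i++) {
    al.add(ByteBuffer.allocateDirect(1024 * 1024 * 1024)); // 1 GB off-heap
    al.add(ByteBuffer.allocate(1024 * 1024 * 1024));       // 1 GB on the heap
    System.out.println(" Buffer loop " + i);
    Thread.sleep(500);
}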

Can someone help me understand where I am going wrong in my understanding of the Java memory layout? (Referring to the diagram in the link.)


Solution

  • A pretty good explanation of memory management can be found here: https://smarttechie.org/2016/08/15/understanding-the-java-memory-model-and-the-garbage-collection/

    To answer your questions:

    I don't understand why:

    1. Since my physical RAM is just 16G, why didn't it crash after 16G of memory allocation? What is special about 48G of heap memory allocation?

    Physical memory isn't the limit for the system; it can use swapping, which lets the system move infrequently accessed modified pages out of physical memory so that physical memory is used more efficiently for the more frequently accessed pages.

    What is special about 48G might simply be that your system is only capable of handling that amount of memory. You can try to play with the swap settings and let the system allocate all 100G.

    2. What is special about 2.7G of memory allocation when no command-line options are provided?

    Before you run your Java program from the command line, check this:

    Windows

    java -XX:+PrintFlagsFinal -version | findstr /i "HeapSize PermSize ThreadStackSize"
    

    Linux

    java -XX:+PrintFlagsFinal -version | grep -iE 'HeapSize|PermSize|ThreadStackSize'
    

    You can also monitor memory allocation visually with JConsole.
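
    As an alternative to JConsole, here is a small sketch (an addition, not part of the original answer) that polls heap and non-heap usage programmatically through the standard MemoryMXBean:

    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryMXBean;
    import java.lang.management.MemoryUsage;

    public class MemoryMonitor {

        public static void main(String[] args) throws InterruptedException {
            MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
            for (int i = 0; i < 10; i++) {
                // Snapshot heap and non-heap usage once per second.
                MemoryUsage heap = memory.getHeapMemoryUsage();
                MemoryUsage nonHeap = memory.getNonHeapMemoryUsage();
                System.out.println("heap used=" + heap.getUsed() + " max=" + heap.getMax()
                        + " | non-heap used=" + nonHeap.getUsed());
                Thread.sleep(1000);
            }
        }
    }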

    To understand why different amounts of memory could be allocated, you should read this:

    ByteBuffer.allocate() vs. ByteBuffer.allocateDirect()
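
    To see that difference directly, here is a minimal sketch (an addition to the original answer): a buffer from allocate() is backed by a byte[] on the Java heap and counts against -Xmx, while a buffer from allocateDirect() lives in native memory and counts against -XX:MaxDirectMemorySize:

    import java.nio.ByteBuffer;

    public class BufferKinds {

        public static void main(String[] args) {
            // Backed by a byte[] on the Java heap; limited by -Xmx.
            ByteBuffer heapBuffer = ByteBuffer.allocate(16 * 1024 * 1024);

            // Allocated in native (off-heap) memory; limited by -XX:MaxDirectMemorySize.
            ByteBuffer directBuffer = ByteBuffer.allocateDirect(16 * 1024 * 1024);

            System.out.println("heapBuffer:   isDirect=" + heapBuffer.isDirect()
                    + ", hasArray=" + heapBuffer.hasArray());   // false, true
            System.out.println("directBuffer: isDirect=" + directBuffer.isDirect()
                    + ", hasArray=" + directBuffer.hasArray()); // true, false
        }
    }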