My code:
public class Memory {

    // Dummy entity representing a typical data object. All four fields are
    // object references (Double and Integer are boxed), and they stay null here.
    private static class Entity {
        public String name;
        public String detail;
        public Double amount;
        public Integer age;
    }

    // A linked list offers fast inserts and helps illustrate the issue,
    // since every entry carries multiple references (the list node plus the entity's fields).
    public static java.util.LinkedList<Entity> entities = new java.util.LinkedList<>();

    // The threshold is twice -Xmn (2 x 50 MB); it ensures the loop stops before a full GC happens.
    private static final int MB = 1024 * 1024;
    private static final int THRESHOLD = 100 * MB;

    public static void main(String[] args) {
        System.out.println("Total Memory (in bytes): " + Runtime.getRuntime().totalMemory());
        System.out.println("Free Memory (in bytes): " + Runtime.getRuntime().freeMemory());
        System.out.println("Max Memory (in bytes): " + Runtime.getRuntime().maxMemory());
        while (true) {
            appendEntitiesToDataStructure();
            terminateBeforeFullGCorOOMEcanHappen();
        }
    }

    private static void appendEntitiesToDataStructure() {
        entities.add(new Entity());
    }

    private static void terminateBeforeFullGCorOOMEcanHappen() {
        if (Runtime.getRuntime().freeMemory() < THRESHOLD) {
            System.out.println("Elements created and added to LinkedList: " + entities.size());
            System.exit(0);
        }
    }
}
Launch:
java -Xms${i}g -Xmx${i}g -Xmn50m memory.java >> output.txt
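For reference, here is one way the runs could be driven (a sketch based on my assumptions: that ${i} comes from a shell loop over the heap sizes, and that each collector is selected per run with its standard flag; neither appears in the original command):

# Assumption: ${i} is supplied by a loop like this, one JVM run per heap size.
# Swap -XX:+UseG1GC for -XX:+UseZGC or -XX:+UseParallelGC to test the other collectors.
for i in $(seq 1 100); do
  java -XX:+UseG1GC -Xms${i}g -Xmx${i}g -Xmn50m memory.java >> output.txt
done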
Idea: I want to observe how the JVM behaves as the heap size grows.
Heap size: from 1 to 100 GB
GCs tested: G1, ZGC, and Parallel GC
Result: with G1 and Parallel GC there is a regression starting at 32 GB: I can store fewer elements in my data structure even though the heap keeps growing.
Chart: X axis: size of my data structure; Y axis: heap size
I don't know how to explain this behavior. Any ideas?
Refer to this article: https://www.baeldung.com/jvm-compressed-oops
First, let's find out what an oop is:
The HotSpot JVM uses a data structure called oops, or Ordinary Object Pointers, to represent objects. These oops are equivalent to native C pointers. The instanceOop is a special kind of oop that represents an object instance in Java. Moreover, the JVM also supports a handful of other oops that are kept in the OpenJDK source tree.
Furthermore, a very neat feature is the compression of oops down to 32 bits:
As it turns out, the JVM can avoid wasting memory by compressing the object pointers or oops, so we can have the best of both worlds: allowing more than 4 GB of heap space with 32-bit references in 64-bit machines!
Finally:
To enable oop compression, we can use the -XX:+UseCompressedOops tuning flag. The oop compression is the default behavior from Java 7 onwards whenever the maximum heap size is less than 32 GB. When the maximum heap size is more than 32 GB, the JVM will automatically switch off the oop compression. So memory utilization beyond a 32 GB heap size needs to be managed differently.
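You can verify this on your own runs. The following is a minimal sketch of mine (not from the article) that prints the effective flag value; it uses the HotSpot-specific com.sun.management.HotSpotDiagnosticMXBean, so it will not work on non-HotSpot JVMs:

import java.lang.management.ManagementFactory;
import com.sun.management.HotSpotDiagnosticMXBean;

public class CompressedOopsCheck {
    public static void main(String[] args) {
        // HotSpot-specific diagnostic bean; getPlatformMXBean throws if it is unavailable.
        HotSpotDiagnosticMXBean hotspot =
                ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
        // Prints the effective value: true below the ~32 GB -Xmx threshold, false above it.
        System.out.println(hotspot.getVMOption("UseCompressedOops"));
    }
}

Alternatively, running "java -Xmx33g -XX:+PrintFlagsFinal -version | grep UseCompressedOops" shows the same information without writing any code.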
So, the answer to your question is: you simply crossed the 32 GB maximum-heap-size threshold. Beyond it, compressed oops are switched off, every object reference doubles from 4 to 8 bytes, and therefore fewer entries fit in the heap.
Proof
Look at your graphs: the regression we are talking about starts exactly at the 32 GB mark.
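To see why the drop is that large, here is a back-of-the-envelope estimate. It is my own sketch, not measured data: the 12/16-byte headers, 4/8-byte references, and 8-byte alignment are typical for 64-bit HotSpot and may vary by JVM version. Each list entry allocates one LinkedList node (three references: item, next, prev) and one Entity (four reference fields):

public class OopFootprint {
    // Round up to the default 8-byte object alignment.
    static long align(long size) { return (size + 7) & ~7L; }

    static long bytesPerEntry(boolean compressedOops) {
        long header = compressedOops ? 12 : 16; // mark word + (compressed) class pointer
        long ref = compressedOops ? 4 : 8;      // size of one object reference
        long node = align(header + 3 * ref);    // LinkedList.Node: item, next, prev
        long entity = align(header + 4 * ref);  // Entity: name, detail, amount, age (all null)
        return node + entity;
    }

    public static void main(String[] args) {
        System.out.println("compressed oops:   ~" + bytesPerEntry(true) + " bytes/entry");  // ~56
        System.out.println("uncompressed oops: ~" + bytesPerEntry(false) + " bytes/entry"); // ~88
    }
}

Under these assumptions an entry costs roughly 56 bytes with compressed oops and 88 bytes without, so just past 32 GB the same heap holds about a third fewer elements, which is consistent with a sharp drop at exactly that point.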