
What is the best practice to utilize a large memory size in Java?


I have an application server with 64 GB of memory, and a Java web application running on it. What is the best practice to utilize all of these 64 GB? I need to store a large set of objects (a HashSet). Is setting -Xms and -Xmx the best solution, and will GC still work well? Or should I use third-party solutions such as caching libraries (memcached, etc.)?


Solution

  • A pretty generic question, so a broad answer.

    • Albeit doing better lately, neither of the "free" JVMs (Oracle, IBM) is famous for supporting very large heaps. If you really need a single JVM for whatever reason, then you should probably bite the bullet and get a product such as Zing from Azul. Zing supports heaps of up to 2 TB per JVM, and it is also designed to minimize GC pausing. (That is the real problem with the other VMs: at least some years back, their GC pauses would grow linearly with heap size.)
    • But then, the better way in 2018: scale out instead of up. Meaning: run multiple JVMs (each with, say, 4, 8, or 16 GB of heap) and use load balancing to keep them busy.

    In other words: sure, if you have one large monolithic application that can only be "scaled" by adding more RAM, then you have to live with that. But if you are wondering about smarter ways to spend your money, look into microservices and how you can use them to break up that monolith into many small parts (where scaling happens by instantiating more of those small parts).
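
If you do stay with a single JVM, heap sizing is done with -Xms/-Xmx as the question suggests. A minimal sketch of a launch command; the 32 GB heap size, the pause-time goal, and myapp.jar are purely illustrative assumptions, not recommendations:

```shell
# Illustrative flags only; values are assumptions, tune for your workload.
# Setting -Xms equal to -Xmx pins the heap, avoiding resize-related pauses.
# G1 (the default collector since JDK 9) targets predictable pauses on
# multi-gigabyte heaps; -Xlog:gc* enables unified GC logging (JDK 9+).
java -Xms32g -Xmx32g \
     -XX:+UseG1GC -XX:MaxGCPauseMillis=200 \
     -Xlog:gc*:file=gc.log \
     -jar myapp.jar
```

Watching the resulting gc.log under realistic load is the usual way to decide whether one big heap is viable or whether the scale-out approach above is the safer bet.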