How much data is too much for an on-heap cache like Ehcache?
I'm getting a server with 24 GB of RAM. I'll probably start off devoting 2-4 GB to caching, but may end up devoting 20 GB or so. At what point should I worry that GC for an on-heap cache will take too long?
By the way, is DirectMemory the only open-source off-heap cache available? Is it ready for prime time?
It depends on your JVM and especially on the GC you're using. Older GCs in particular weren't really capable of handling very large heaps, but there has been an increasing effort to fix that.
Azul Systems, for example, sells hardware running heaps of hundreds of GB without problems (i.e. GC pauses measured in milliseconds, not half-minutes) thanks to their special GC, so it's not a limitation of Java per se. I have no idea how far HotSpot or IBM's JVM have come over time, though. Then again, a 24 GB heap isn't that large anyhow; G1 should probably do a good enough job there.
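If most of that 20 GB really does end up as cache, another way to keep GC pauses bounded is to keep only a small, hot tier on the heap and push the bulk off-heap. Here's a minimal sketch, assuming Ehcache 3's tiering API is an option for you; the cache name, key/value types, and sizes are just illustrative:

    import org.ehcache.Cache;
    import org.ehcache.CacheManager;
    import org.ehcache.config.builders.CacheConfigurationBuilder;
    import org.ehcache.config.builders.CacheManagerBuilder;
    import org.ehcache.config.builders.ResourcePoolsBuilder;
    import org.ehcache.config.units.MemoryUnit;

    public class TieredCacheSketch {
        public static void main(String[] args) {
            // Small on-heap tier (subject to GC) backed by a large off-heap tier
            // (outside the GC'd heap, so it doesn't contribute to pause times).
            CacheManager cacheManager = CacheManagerBuilder.newCacheManagerBuilder()
                .withCache("bigCache",
                    CacheConfigurationBuilder.newCacheConfigurationBuilder(
                        Long.class, String.class,
                        ResourcePoolsBuilder.newResourcePoolsBuilder()
                            .heap(2, MemoryUnit.GB)
                            .offheap(18, MemoryUnit.GB)))
                .build(true);

            Cache<Long, String> cache = cacheManager.getCache("bigCache", Long.class, String.class);
            cache.put(1L, "value");
            cacheManager.close();
        }
    }

The off-heap tier lives in direct memory, so you'd also have to allow enough of it (e.g. raise -XX:MaxDirectMemorySize), and entries stored there pay a serialization/deserialization cost on each access, which is the usual trade-off for taking them out of the GC's view.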