
Can you still use a ConcurrentLinkedHashMap with Caffeine?


Caffeine provides a great alternative to Guava's caching library, but can you still use just the ConcurrentLinkedHashMap class itself, or a Cache that supports concurrent, in-order iteration? As far as I can see it no longer appears in Caffeine, although I'm sure it's using something similar under the hood.

I have tested the Caffeine Cache and it does not iterate over the elements in insertion order, and I don't see any options on the Caffeine builder that would make me think this feature still exists. But maybe I'm missing something.

Any ideas?


Solution

  • ConcurrentLinkedHashMap offered access-order iteration via its ascending and descending snapshots. This was useful for cases like persisting the hottest entries for a warm server restart. Otherwise, views and iterators were in the encounter order of the underlying ConcurrentHashMap.
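
    For reference, a rough sketch of how those snapshots were exposed (the method names are recalled from memory and the library is no longer maintained, so treat this as illustrative rather than authoritative):

    // Legacy ConcurrentLinkedHashMap, bounded by a weighted capacity
    ConcurrentLinkedHashMap<String, Integer> map =
        new ConcurrentLinkedHashMap.Builder<String, Integer>()
            .maximumWeightedCapacity(1_000)
            .build();
    
    map.put("a", 1);
    map.put("b", 2);
    map.get("a"); // promotes "a" toward the hot end of the access order
    
    // Snapshots in access order: least-recently used first, and the reverse
    Set<String> coldestFirst = map.ascendingKeySet();
    Set<String> hottestFirst = map.descendingKeySet();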

    Caffeine offers snapshots in the eviction policy's hottest/coldest order and the expiration policy's youngest/oldest order. See Cache.policy().
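
    As a minimal sketch of those inspection points (the size, expiration, and limits here are illustrative):

    Cache<String, Integer> cache = Caffeine.newBuilder()
        .maximumSize(1_000)
        .expireAfterWrite(Duration.ofMinutes(10))
        .build();
    
    // Eviction-policy order: coldest(n) are the next eviction candidates, hottest(n) the most retained
    cache.policy().eviction().ifPresent(eviction -> {
      Map<String, Integer> coldest = eviction.coldest(10);
      Map<String, Integer> hottest = eviction.hottest(10);
    });
    
    // Expiration order: oldest(n) were written longest ago, youngest(n) most recently
    cache.policy().expireAfterWrite().ifPresent(expiration -> {
      Map<String, Integer> oldest = expiration.oldest(10);
      Map<String, Integer> youngest = expiration.youngest(10);
    });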

    Insertion order is not that difficult to implement yourself, because writes are rare and exclusive, and items are not being shifted around frequently. The simplest approach is to use a Map computation to maintain a secondary data structure. If you want that in addition to the cache, then Caffeine's asMap() view offers atomic versions of those computations (sketched after the example below). This way you can have fast random-access reads via a ConcurrentHashMap and ordered iteration for a background process.

    var data = new ConcurrentHashMap<String, Integer>();
    var order = Collections.synchronizedMap(new LinkedHashMap<String, Integer>());
    
    data.compute("a", (key, oldValue) -> {
      order.put(key, 1);
      return 1;
    });
    
    // Locks the map during access, blocking writes; consider snapshotting first
    order.forEach((key, value) -> System.out.printf("%s=%s%n", key, value));
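
    If the ordered view should shadow a Caffeine cache rather than a plain ConcurrentHashMap, the same pattern works through the cache's asMap() view, since it is a ConcurrentMap whose compute methods run atomically per key. A minimal sketch (the cache configuration is illustrative, and evicted entries would also need to be pruned from the ordered view, e.g. via a removal listener):

    Cache<String, Integer> cache = Caffeine.newBuilder()
        .maximumSize(1_000)
        .build();
    
    // "order" is the synchronized LinkedHashMap declared above; the computation runs
    // atomically for the key, keeping the cache and the ordered view consistent
    cache.asMap().compute("a", (key, oldValue) -> {
      order.put(key, 1);
      return 1;
    });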
    

    Vavr's LinkedHashMap might be useful if you need concurrent iteration or want to avoid the cost of a full copy. It is a persistent data structure, where writes produce a new immutable version. It could be used in the approach above, or directly if you don't need any caching concepts.

    // io.vavr.collection.Map and LinkedHashMap (persistent, insertion-ordered), not java.util
    volatile Map<Integer, Integer> data = LinkedHashMap.empty();
    final Lock lock = new ReentrantLock();
    
    // Single writer: each put returns a new immutable map, published via the volatile field
    lock.lock();
    try {
      data = data.put(1, 2).put(3, 4);
    } finally {
      lock.unlock();
    }
    
    // Multiple readers: iterate a consistent snapshot without locking
    System.out.println(data);
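
    Readers never block with this arrangement: a volatile read always observes a complete, immutable snapshot, while the lock only serializes the read-modify-write sequence among writers.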