Tags: java, image, caching, google-guava-cache

Limit cache by deep (as opposed to shallow) size of objects in Java?


I am caching images to speed up repeated loading from a network share and/or the internet. Currently, this cache is held in memory.

Is it possible to limit the cache by the deep size of its objects (primarily the size of the BufferedImages)?

The cache is initialized as follows:

import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;

import java.io.IOException;
import java.util.concurrent.TimeUnit;

....

imageCache = CacheBuilder.newBuilder()
         .maximumSize(cacheSize)
         .expireAfterWrite(10, TimeUnit.MINUTES)
         .build(
            new CacheLoader<String, DeferredImage>() {
               @Override
               public DeferredImage load(String pathname) throws IOException {
                  return new DeferredImage(pathname);
               }
            });

Here DeferredImage is a wrapper around BufferedImage that loads the image in a separate thread.

Is it possible to add a size check that takes into account not the number of entries in the cache, but their total size in bytes?

UPDATE

I found the maximumWeight() method, but I am not sure whether it is what I am looking for: the documentation nowhere states that it is the sum of the entries' weights that is limited.


Solution

  • A short description is given in Caches Explained - Size-based Eviction:

    Alternately, if different cache entries have different "weights" -- for example, if your cache values have radically different memory footprints -- you may specify a weight function with CacheBuilder.weigher(Weigher) and a maximum cache weight with CacheBuilder.maximumWeight(long). In addition to the same caveats as maximumSize requires, be aware that weights are computed at entry creation time, and are static thereafter.

    In your case it is something like this (Weigher is com.google.common.cache.Weigher):

    imageCache = CacheBuilder.newBuilder()
         .maximumWeight(maximumCacheSizeInBytes)
         .weigher(new Weigher<String, DeferredImage>() {
            @Override
            public int weigh(String pathname, DeferredImage image) {
               // an entry's weight is its size in bytes
               return image.size();
            }
         })
         .build(
            new CacheLoader<String, DeferredImage>() {
               @Override
               public DeferredImage load(String pathname) throws IOException {
                  return new DeferredImage(pathname);
               }
            });


    The value type DeferredImage could cause some trouble here, because the weight needs to be known at the moment the value is inserted into the cache. The extra loading thread then becomes useless: the thread that inserts into the cache has to block until the image is loaded and its size is known.
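
    To make this concrete, here is a hypothetical sketch of such a DeferredImage (the two-argument constructor, the executor, and the 4-bytes-per-pixel ARGB estimate are assumptions, not code from the question). Calling size() from the weigher blocks the inserting thread:

    import javax.imageio.ImageIO;
    import java.awt.image.BufferedImage;
    import java.io.File;
    import java.util.concurrent.ExecutionException;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Future;

    class DeferredImage {
       private final Future<BufferedImage> image;

       DeferredImage(String pathname, ExecutorService loader) {
          // loading starts in the background...
          this.image = loader.submit(() -> ImageIO.read(new File(pathname)));
       }

       // ...but the weigher calls size() at insertion time, so the inserting
       // thread ends up waiting for the load to finish anyway
       int size() {
          try {
             BufferedImage img = image.get(); // blocks until loaded
             return (int) Math.min(Integer.MAX_VALUE, 4L * img.getWidth() * img.getHeight());
          } catch (InterruptedException | ExecutionException e) {
             throw new IllegalStateException("image could not be loaded", e);
          }
       }
    }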

    Two alternatives may be worth mentioning:

    The API and functionality of Caffeine are similar to the Guava cache, but it also supports asynchronous loaders. This allows better control over the number of image-loading threads.
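
    A minimal sketch of that idea, assuming Caffeine's AsyncLoadingCache with the same made-up 4-bytes-per-pixel weight estimate (maximumCacheSizeInBytes as above):

    import com.github.benmanes.caffeine.cache.AsyncLoadingCache;
    import com.github.benmanes.caffeine.cache.Caffeine;

    import javax.imageio.ImageIO;
    import java.awt.image.BufferedImage;
    import java.io.File;
    import java.util.concurrent.CompletableFuture;
    import java.util.concurrent.Executors;

    AsyncLoadingCache<String, BufferedImage> imageCache = Caffeine.newBuilder()
         .maximumWeight(maximumCacheSizeInBytes)
         // rough estimate: 4 bytes per pixel, clamped to the int range
         .weigher((String path, BufferedImage img) ->
               (int) Math.min(Integer.MAX_VALUE, 4L * img.getWidth() * img.getHeight()))
         .executor(Executors.newFixedThreadPool(4)) // bounds the loading threads
         .buildAsync(path -> ImageIO.read(new File(path)));

    // the caller gets a future immediately instead of blocking on the load
    CompletableFuture<BufferedImage> image = imageCache.get("/some/image.png");

    As far as I can tell, the weight of an asynchronously loaded entry is accounted for once the future completes, so the insert itself does not have to wait for the size.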

    Ehcache provides size-based eviction by measuring the actual memory consumption of the values, so it does not need a Weigher function.
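
    For comparison, a minimal Ehcache 3 sketch (the cache alias "images" and the 64 MB cap are made-up values). Capping the heap tier in bytes makes Ehcache measure the size of the stored objects itself:

    import org.ehcache.Cache;
    import org.ehcache.CacheManager;
    import org.ehcache.config.builders.CacheConfigurationBuilder;
    import org.ehcache.config.builders.CacheManagerBuilder;
    import org.ehcache.config.builders.ResourcePoolsBuilder;
    import org.ehcache.config.units.MemoryUnit;

    import java.awt.image.BufferedImage;

    CacheManager cacheManager = CacheManagerBuilder.newCacheManagerBuilder()
         .withCache("images", CacheConfigurationBuilder
               .newCacheConfigurationBuilder(String.class, BufferedImage.class,
                     // sizing the heap tier in bytes triggers Ehcache's own
                     // object-size measurement; no Weigher needed
                     ResourcePoolsBuilder.newResourcePoolsBuilder().heap(64, MemoryUnit.MB)))
         .build(true);

    Cache<String, BufferedImage> imageCache =
         cacheManager.getCache("images", String.class, BufferedImage.class);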