Tags: java, caching, ignite, apache-ignite

The retrieval time for the Apache Ignite cache is too long


I want to use Apache Ignite as a caching layer, and I'm using an Ignite thick client. I've inserted up to 400,000 records into the server for caching, but retrieving data takes 2-3 seconds. How can I reduce this delay? Additionally, when I attempt to load more than 400,000 records, they don't get cached; I'm trying to load over 10 million records onto the server at startup. What could be the issue? Please give me a clear solution; I'm new to Ignite.

Server Configuration for reference:

import java.util.Arrays;

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.CacheAtomicityMode;
import org.apache.ignite.cache.CacheMode;
import org.apache.ignite.cache.CacheWriteSynchronizationMode;
import org.apache.ignite.cluster.ClusterState;
import org.apache.ignite.configuration.CacheConfiguration;
import org.apache.ignite.configuration.DataRegionConfiguration;
import org.apache.ignite.configuration.DataStorageConfiguration;
import org.apache.ignite.configuration.IgniteConfiguration;

public class MemoryAndCacheMonitoring {

    public static void main(String[] args) throws Exception {
        IgniteConfiguration cfg = new IgniteConfiguration();
        cfg.setIgniteInstanceName("Instance");
        cfg.setConsistentId("Node");

        DataStorageConfiguration storageCfg = new DataStorageConfiguration();
        DataRegionConfiguration regionCfg = new DataRegionConfiguration();
        regionCfg.setMaxSize(6L * 1024 * 1024 * 1024);
        regionCfg.setPersistenceEnabled(true);
        regionCfg.setInitialSize(4L * 1024 * 1024 * 1024);
        regionCfg.setMetricsEnabled(true); 

        storageCfg.setDefaultDataRegionConfiguration(regionCfg);
        cfg.setDataStorageConfiguration(storageCfg);

        CacheConfiguration<String, String> cacheCfg = new CacheConfiguration<>();
        cacheCfg.setName("myCache");
        cacheCfg.setAtomicityMode(CacheAtomicityMode.TRANSACTIONAL);
        cacheCfg.setCacheMode(CacheMode.REPLICATED);
        cacheCfg.setWriteSynchronizationMode(CacheWriteSynchronizationMode.FULL_SYNC);

        cfg.setCacheConfiguration(cacheCfg);
        cfg.setPeerClassLoadingEnabled(true);

        Ignite igniteServer = Ignition.start(cfg);
        igniteServer.cluster().state(ClusterState.ACTIVE);

        IgniteCache<String, String> myCache = igniteServer.getOrCreateCache("myCache");
        igniteServer.resetLostPartitions(Arrays.asList("myCache"));
        igniteServer.cluster().baselineAutoAdjustEnabled(true);
    }
}
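
For context, the thick client mentioned in the question joins this cluster in client mode. Below is a minimal, hypothetical sketch of such a client; the class name and discovery address are assumptions for illustration, not taken from the question.

import java.util.Collections;

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.configuration.IgniteConfiguration;
import org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi;
import org.apache.ignite.spi.discovery.tcp.ipfinder.vm.TcpDiscoveryVmIpFinder;

public class ThickClientExample {

    public static void main(String[] args) {
        IgniteConfiguration cfg = new IgniteConfiguration();
        cfg.setClientMode(true); // thick client: joins the cluster topology but stores no cache data

        // Point discovery at the server node; this address is an assumption for illustration.
        TcpDiscoveryVmIpFinder ipFinder = new TcpDiscoveryVmIpFinder();
        ipFinder.setAddresses(Collections.singletonList("127.0.0.1:47500..47509"));
        cfg.setDiscoverySpi(new TcpDiscoverySpi().setIpFinder(ipFinder));

        try (Ignite client = Ignition.start(cfg)) {
            IgniteCache<String, String> cache = client.cache("myCache");
            System.out.println("myCache size: " + cache.size());
        }
    }
}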

Function for streaming bulk data:

 @Async
    public CompletableFuture<Void> processAllRecords() {
        long startTime = System.currentTimeMillis();

        List<ProductLines> records = productLinesRepo.findRecordsWithPanNotNull();
        if (!records.isEmpty()) {
            igniteCacheService.streamBulkData("myCache", records);
            totalProcessedRecords += records.size();
            logger.info("Processed {} records", totalProcessedRecords);
            System.out.println("Processed batch of records until now: " + totalProcessedRecords);
        }

        long endTime = System.currentTimeMillis();
        long totalTime = endTime - startTime;
        logger.info("Total time taken for processing all records: {} milliseconds", totalTime);

        return CompletableFuture.completedFuture(null);
    }
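
The streamBulkData service itself isn't shown in the question. A rough sketch of what such a helper could look like when built on IgniteDataStreamer follows; the class name, the injected Ignite instance, and the getPanNumber() accessor on ProductLines are assumptions for illustration.

import java.util.List;

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteDataStreamer;

public class IgniteCacheService {

    private final Ignite ignite; // assumed to be the thick-client Ignite instance

    public IgniteCacheService(Ignite ignite) {
        this.ignite = ignite;
    }

    public void streamBulkData(String cacheName, List<ProductLines> records) {
        // IgniteDataStreamer batches and parallelizes puts, which is the usual way
        // to load millions of entries instead of calling cache.put() one key at a time.
        try (IgniteDataStreamer<String, ProductLines> streamer = ignite.dataStreamer(cacheName)) {
            streamer.allowOverwrite(true); // replace entries if the same keys are reloaded
            for (ProductLines record : records) {
                streamer.addData(record.getPanNumber(), record); // key accessor is an assumption
            }
        } // closing the streamer flushes any remaining buffered entries
    }
}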

Controller for cache retrieval:

 @GetMapping("/getCache/{cacheName}/{panNumber}")
    public Map<String, ProductLines> getCachebyId(@PathVariable String cacheName, @PathVariable String panNumber) {
        IgniteCache<String, ProductLines> cache = cacheService.getCache(cacheName);

        if (cache != null) {
            Map<String, ProductLines> thatoneCache = new HashMap<>();

            // Iterate over all entries in the cache
            for (Cache.Entry<String, ProductLines> entry : cache) {
                if(entry.getKey().toString().equals(panNumber))
                    thatoneCache.put(entry.getKey(), entry.getValue());
            }
            return thatoneCache;
        } else {
            throw new RuntimeException("Requested Cache '" + cacheName + "' cannot be found"); // Handle error appropriately
        }
    }

Solution

  • Iterating over the entire cache is never a good idea.

    You already have the key, so use it to retrieve the entry directly:

    ProductLines res = cache.get(panNumber);
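
Applied to the controller in the question, the endpoint reduces to a single lookup. A sketch is below; the controller class name and constructor injection are assumptions, while the request mappings and cacheService follow the question.

import org.apache.ignite.IgniteCache;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class CacheLookupController {

    private final IgniteCacheService cacheService; // same service the question injects

    public CacheLookupController(IgniteCacheService cacheService) {
        this.cacheService = cacheService;
    }

    @GetMapping("/getCache/{cacheName}/{panNumber}")
    public ProductLines getCacheById(@PathVariable String cacheName, @PathVariable String panNumber) {
        IgniteCache<String, ProductLines> cache = cacheService.getCache(cacheName);

        if (cache == null) {
            throw new RuntimeException("Requested Cache '" + cacheName + "' cannot be found");
        }

        // Single-key lookup: one round trip to the cluster instead of scanning every entry.
        return cache.get(panNumber);
    }
}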