I am using Spring Data Redis. If a Redis SET has millions of values, will fetching its members (I am using the members() function) create a Java Set in the heap with millions of values? Or are the values fetched lazily, only as required?
If all the values are fetched in one go, will it throw an out-of-memory error when the SET is huge? If so, how can I overcome that? I have the same doubt about range() on LIST and ZSET.
The Spring Data Redis 1.2 implementation of RedisSet has no lazy-loading support. However, commands like add and remove are delegated to the underlying RedisConnection, so they are performed on the server without affecting any local data.
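A minimal sketch of that delegation, not the Spring Data Redis source itself: `FakeConnection` and `DelegatingRedisSet` are illustrative stand-ins for `RedisConnection` and `RedisSet`. The point is that add/remove send a single value to the server and keep no local copy of the set.

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Stand-in for the Redis server side of a RedisConnection (illustrative only).
class FakeConnection {
    private final Map<String, Set<String>> server = new HashMap<>();

    long sAdd(String key, String value) {
        return server.computeIfAbsent(key, k -> new HashSet<>()).add(value) ? 1 : 0;
    }

    long sRem(String key, String value) {
        Set<String> s = server.get(key);
        return (s != null && s.remove(value)) ? 1 : 0;
    }

    long sCard(String key) {
        Set<String> s = server.get(key);
        return s == null ? 0 : s.size();
    }
}

// Stand-in for RedisSet: every mutation is delegated to the connection,
// so only the single value travels over the wire; nothing is cached locally.
class DelegatingRedisSet {
    private final FakeConnection conn;
    private final String key;

    DelegatingRedisSet(FakeConnection conn, String key) {
        this.conn = conn;
        this.key = key;
    }

    boolean add(String value)    { return conn.sAdd(key, value) == 1; }
    boolean remove(String value) { return conn.sRem(key, value) == 1; }
    long size()                  { return conn.sCard(key); }
}
```

Because nothing is materialized on the client, these operations stay cheap regardless of how many members the set holds.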
Calling RedisSet.iterator(), by contrast, executes SMEMBERS and loads the entire response into memory, which is likely to consume a lot of heap for a set of that size. The same goes for the List and Map implementations.
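To avoid materializing millions of values at once, Redis itself offers cursor-based iteration (SSCAN for sets); newer Spring Data Redis versions expose this, though the 1.2 RedisSet discussed here does not. A rough sketch of the pattern, with `forEachMember` standing in for repeated SSCAN round trips (no real Redis calls, illustrative only): only one batch of members lives in the client heap at a time.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Sketch of SSCAN-style consumption: fetch fixed-size batches instead of
// one giant SMEMBERS reply, so at most `batchSize` values are held locally.
class ScanSketch {
    static void forEachMember(List<String> serverSideMembers, int batchSize,
                              Consumer<String> action) {
        int cursor = 0;
        while (cursor < serverSideMembers.size()) {
            int end = Math.min(cursor + batchSize, serverSideMembers.size());
            // One "round trip": only this slice is copied into client memory.
            List<String> batch = new ArrayList<>(serverSideMembers.subList(cursor, end));
            batch.forEach(action);
            cursor = end;
        }
    }
}
```

The same idea applies to LIST and ZSET: page through range() with explicit start/end offsets rather than asking for the whole structure in one call.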