Tags: redis, redis-cluster

Thousands of Redis Sorted Sets vs. Millions of Simple Sets


I have come up with two options for solving a problem I have with AWS ElastiCache (Redis).

I was able to find the differences between these two approaches in terms of time complexity (Big O) and other aspects. However, one question still bothers me:

Is there any difference for a Redis cluster (in memory consumption, CPU, or any other resources) between handling thousands of Sorted Sets versus millions of Simple Sets?

Thanks in advance for the help :)


Solution

  • You are comparing two different data types, so it is best to benchmark both and check which one's memory consumption is lower using INFO memory. In what follows, I assume both options store entries of the same length.
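
    A minimal sketch of how such a benchmark could be read: load each layout into a fresh instance, capture `used_memory` from the text `INFO memory` returns, and compare. Only the parsing step is shown here; the sample payload and its values are illustrative, not real measurements.

    ```python
    # Sample of the key:value text that `INFO memory` returns (values are made up).
    SAMPLE_INFO = """\
    # Memory
    used_memory:1032464
    used_memory_human:1008.27K
    used_memory_peak:1133568
    """

    def parse_info(info_text: str) -> dict:
        """Parse Redis INFO output (key:value lines, '#' comment lines) into a dict."""
        fields = {}
        for line in info_text.splitlines():
            if line and not line.startswith("#") and ":" in line:
                key, _, value = line.partition(":")
                fields[key] = value
        return fields

    used = int(parse_info(SAMPLE_INFO)["used_memory"])
    print(used)  # 1032464
    ```

    Running this once per candidate layout (against otherwise empty instances) gives two `used_memory` figures that can be compared directly.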

    If you use the `set-max-intset-entries` config and stay within its limit (let's say 512) while adding to the set, then your memory consumption will be lower than with your first option (given equal value lengths and the same total number of entries). Note that the compact intset encoding only applies when every member of the set is an integer. And it doesn't come for free.

    The documentation states:

    This is completely transparent from the point of view of the user and API. Since this is a CPU / memory trade off it is possible to tune the maximum number of elements and maximum element size for special encoded types using the following redis.conf directives.
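
    For reference, the encoding thresholds the quote refers to look like this in redis.conf (the values shown are the defaults in recent Redis versions; the `listpack` directives exist only on newer releases, older ones use `ziplist` names instead):

    ```
    # A set of integers stays intset-encoded up to this many members:
    set-max-intset-entries 512

    # Small sorted sets use the compact listpack encoding within these limits:
    zset-max-listpack-entries 128
    zset-max-listpack-value 64
    ```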