Tags: redis, storage, aerospike, large-data

Storing very large list of data


I need to store a very large list (up to 80 MB in size). I could chunk the data and store it across multiple keys - A_1, A_2, and so on. That solution is far from elegant, and it also carries the overhead of maintaining a lookup dictionary {'A': [1, 2, ....]} to know how many chunks a particular document has when reading it back.

Is there a better way to do this in Redis or Aerospike? I am not hell-bent on using Redis; any other storage (except relational) would be just as fine.
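For reference, the chunked-key approach described above can be sketched as follows. A plain dict stands in for a real Redis client (the assignments and reads would map to SET/GET), and the 1 MB chunk size is an assumed tuning knob, not something from the question:

```python
CHUNK_SIZE = 1024 * 1024  # 1 MB per chunk; an assumed value

def store_chunked(store, key, data, chunk_size=CHUNK_SIZE):
    """Split `data` into chunks under key_1, key_2, ... and return the
    lookup dictionary needed to reassemble the value later."""
    indices = []
    for i, start in enumerate(range(0, len(data), chunk_size), start=1):
        store[f"{key}_{i}"] = data[start:start + chunk_size]
        indices.append(i)
    return {key: indices}  # the lookup dictionary the question mentions

def load_chunked(store, key, lookup):
    # Reassemble the document by concatenating its chunks in order.
    return b"".join(store[f"{key}_{i}"] for i in lookup[key])

store = {}
payload = b"x" * (3 * CHUNK_SIZE + 10)
lookup = store_chunked(store, "A", payload)
assert load_chunked(store, "A", lookup) == payload
```

This makes the overhead concrete: every read needs the lookup dictionary, and the dictionary must be kept in sync with the chunk keys.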


Solution

  • Based on your description, a Redis list should fit well; you can run a simple benchmark to confirm. By using a list, I assume you will access the data by range.

    Use lpush to push all your data, http://redis.io/commands/lpush

    Use lrange to retrieve chunks of data, http://redis.io/commands/lrange

    To give you a rough idea: your list would hold on the order of 10,000 items. Pushing the full 80 MB of data takes around 1 s, and retrieving a chunk of about 500 items with a range query takes around 200 ms. These numbers will vary with your data, RTT, etc.
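    The push/range pattern above can be sketched like this. A tiny in-memory stand-in replaces a live connection so the snippet runs without a server; with redis-py the same calls would be `r.rpush(key, *values)` and `r.lrange(key, start, stop)`. Note that RPUSH preserves insertion order for LRANGE reads, whereas LPUSH prepends:

    ```python
    class FakeRedisList:
        """Minimal in-memory stand-in for a Redis connection (illustration only)."""
        def __init__(self):
            self.data = {}

        def rpush(self, key, *values):
            # Append values to the tail of the list, creating it if needed.
            self.data.setdefault(key, []).extend(values)
            return len(self.data[key])

        def lrange(self, key, start, stop):
            # Redis LRANGE treats `stop` as inclusive; -1 means the last element.
            items = self.data.get(key, [])
            if stop == -1:
                return items[start:]
            return items[start:stop + 1]

    r = FakeRedisList()
    r.rpush("doc:A", *[f"item-{i}" for i in range(10_000)])
    chunk = r.lrange("doc:A", 500, 999)  # a 500-item chunk, as in the rough numbers above
    assert len(chunk) == 500 and chunk[0] == "item-500"
    ```

    The key name `doc:A` and the item payloads are hypothetical; the point is that LRANGE lets you read any contiguous slice without a lookup dictionary.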