Cache huge data in-memory


A Redis Cluster sounds like a good fit for your use case!

Redis Cluster shards data by means of hash slots: the key space is divided into 16384 slots, which are distributed evenly across the nodes when you set up the cluster.

Whenever you store a value in the cluster, the hash slot for the given key is calculated and the data is forwarded to the node responsible for that slot. Queries are routed the same way. So the answer to your question is certainly yes.
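To make the routing concrete, here is a small sketch of how Redis Cluster maps a key to a slot: it takes CRC16 of the key (the XMODEM variant, polynomial 0x1021) modulo 16384, and if the key contains a non-empty `{...}` section ("hash tag"), only that section is hashed, so related keys can be pinned to the same slot. The function names here are my own, not part of any client library.

```python
def crc16_xmodem(data: bytes) -> int:
    # CRC16-CCITT (XMODEM): poly 0x1021, init 0, no reflection.
    crc = 0
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ 0x1021) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

def hash_slot(key: str) -> int:
    # If the key contains a non-empty {...} section, only that part
    # is hashed, so related keys land on the same slot ("hash tag").
    start = key.find("{")
    if start != -1:
        end = key.find("}", start + 1)
        if end != -1 and end != start + 1:
            key = key[start + 1:end]
    return crc16_xmodem(key.encode()) % 16384
```

For example, `hash_slot("user:{1000}:profile")` and `hash_slot("user:{1000}:settings")` return the same slot, because only `1000` is hashed; in a real cluster you'd ask the server with `CLUSTER KEYSLOT <key>`.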

However, the maximum value size per key is 512 MB. I'm not sure I read your storage requirement correctly; I assume the 5 GB is the estimated total across all users.
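If a single value ever did approach the 512 MB limit, one common workaround is to split it across several keys and record the chunk count. Here is a minimal sketch of that idea; a plain dict stands in for the Redis SET/GET calls, and the key naming scheme (`<key>:chunks`, `<key>:<i>`) is just an illustrative convention.

```python
CHUNK = 100 * 1024 * 1024  # stay well below the 512 MB per-value limit

def store_chunked(store, key, blob: bytes, chunk=CHUNK):
    # Write the chunk count, then each slice under its own key.
    n = (len(blob) + chunk - 1) // chunk or 1
    store[f"{key}:chunks"] = str(n)
    for i in range(n):
        store[f"{key}:{i}"] = blob[i * chunk:(i + 1) * chunk]

def load_chunked(store, key) -> bytes:
    # Read the count back and reassemble the slices in order.
    n = int(store[f"{key}:chunks"])
    return b"".join(store[f"{key}:{i}"] for i in range(n))
```

Note that in a cluster these chunk keys would hash to different slots (and possibly different nodes), which is usually fine; if you wanted them co-located on one node, you could wrap the shared part of the key in a `{...}` hash tag.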

Check out the Redis Cluster tutorial.


You can also look into NCache (.NET) and TayzGrid (Java) by Alachisoft.

Both of these solutions provide distributed caching with dynamic clustering, which lets you add or remove nodes at runtime without losing any data. An intelligent client also makes sure requests go to the appropriate node when fetching or storing a record for any key.