We have a dataset with more logical records than our current cluster capacity allows for. The records are really small, so it should be easy enough to group multiple records together into one Aerospike record. I can obviously come up with a custom solution to do this, but I'm wondering if there are any best practices we should be following. I would think this is a fairly common problem.
It's called the "Modeling Tiny Records" problem. You can store each small record as a key-value pair inside a Map-type bin, where the key of the mega-record holding that map is some number of bits of the RIPEMD160 hash of the map key. Downsides: in EE you can't use XDR to ship individual records, and you lose the record-level time-to-live (TTL) option. This technique was discussed at the Aerospike User Summit 2019. The slide snapshot is here:
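To make the idea concrete, here is a minimal sketch of the bucketing step: hash each logical record key, take the leading bits of the digest as the bucket (mega-record) number, and group entries per bucket. The talk suggests RIPEMD160; SHA-1 is used here as a stand-in because RIPEMD160 availability varies across OpenSSL builds. `BUCKET_BITS`, `bucket_id`, and `group_records` are illustrative names, not anything from the Aerospike client.

```python
import hashlib

# Number of leading hash bits used to pick a bucket ("mega-record").
# 12 bits => up to 4096 buckets; tune to your cluster's record budget.
BUCKET_BITS = 12

def bucket_id(record_key: str, bits: int = BUCKET_BITS) -> int:
    """Map a logical record key to a deterministic bucket number.

    SHA-1 stands in for the RIPEMD160 hash mentioned in the talk.
    """
    digest = hashlib.sha1(record_key.encode("utf-8")).digest()
    as_int = int.from_bytes(digest, "big")
    # Keep only the top `bits` bits of the digest.
    return as_int >> (len(digest) * 8 - bits)

def group_records(records: dict) -> dict:
    """Group tiny logical records into per-bucket maps.

    Against a live cluster you would instead issue a map operation
    (a map put) on a Map-type bin of the record keyed by bucket_id,
    rather than building these dicts in memory.
    """
    buckets: dict = {}
    for key, value in records.items():
        buckets.setdefault(bucket_id(key), {})[key] = value
    return buckets
```

With this layout, reading one logical record becomes two steps: compute its bucket id, then do a map-get by key on that mega-record's Map bin.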