Tags: c#, memory-mapped-files

MemoryMappedFile and b-tree for cache application


This is just an idea; I don't have any code yet, so I need some design advice. I would implement a cache (non-distributed in the first instance) using MemoryMappedFile in C#. I think it would be good to have a B-tree as the underlying structure, but that is debatable as well. So the questions are:

  • Is a B-tree a good strategy for fast item lookup when the underlying storage is a memory-mapped file?
  • What tips and tricks are there for memory-mapped files? How large can a view be, and what are the drawbacks when it is too small or too large?
  • Multithreading considerations: how do we deal with memory-mapped files and concurrency? A cache is expected to be hit heavily by clients, so what strategy gives the best performance?

As @Internal Server Error asked, I'll add this to the question: the key would be a string, about 64 characters maximum. The data would be a byte[] up to about 1024 bytes long, but with an average around 128 bytes. More precisely, what I want to cache are OR/M entities, so consider how long a serialized entity is in bytes with something like a BSON serializer.
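
To get a feel for the actual entry sizes, one way is to serialize a representative entity to BSON in memory and look at the byte count. A minimal sketch, assuming Json.NET's BSON writer (the Newtonsoft.Json.Bson package); any BSON serializer would do the same job:

```csharp
using System.IO;
using Newtonsoft.Json;
using Newtonsoft.Json.Bson;

static class EntitySize
{
    // Serialize one entity to BSON in memory and report how many bytes the
    // cache would have to store for it (sanity-checks the ~128/1024 byte guess).
    public static long Measure(object entity)
    {
        using (var stream = new MemoryStream())
        {
            using (var writer = new BsonDataWriter(stream))
            {
                new JsonSerializer().Serialize(writer, entity);
            }
            return stream.Length;
        }
    }
}
```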


Solution

    • A B-tree is good (with memory-mapped files), but if the file is not always kept entirely resident in memory, a page-aligned B+Tree is much better; see the page-layout sketch after this list.
    • The trick with memory-mapped files is to use a 64-bit process so that you can map the entire file into memory; otherwise you would have to map only parts of it, and plain cached file reads might end up faster than the mmap approach. A mapping sketch also follows this list.
    • Try CAS (compare-and-swap) over the shared memory; a sketch of that is shown below as well.
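
To illustrate the page-aligned idea, here is a minimal sketch that treats every B+Tree node as exactly one 4 KB page read or written through a MemoryMappedViewAccessor. The PageSize value and the PagedNodes class are assumptions for illustration, not part of any particular B+Tree library:

```csharp
using System.IO.MemoryMappedFiles;

// Minimal sketch: every B+Tree node occupies exactly one 4 KB page, so a node
// access touches a single, aligned page of the mapping.
class PagedNodes
{
    public const int PageSize = 4096; // typical OS page size (assumption)

    readonly MemoryMappedViewAccessor _view;

    public PagedNodes(MemoryMappedViewAccessor view)
    {
        _view = view;
    }

    public void ReadPage(long pageNumber, byte[] buffer)
    {
        // buffer is expected to be PageSize bytes; the offset is always page-aligned.
        _view.ReadArray(pageNumber * PageSize, buffer, 0, PageSize);
    }

    public void WritePage(long pageNumber, byte[] buffer)
    {
        _view.WriteArray(pageNumber * PageSize, buffer, 0, PageSize);
    }
}
```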
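For the second point, a sketch of mapping the whole cache file in a single view on a 64-bit process; the file path, map name and 1 GB capacity are placeholder values:

```csharp
using System.IO;
using System.IO.MemoryMappedFiles;

static class CacheFile
{
    const long Capacity = 1L << 30; // 1 GB backing file, placeholder value

    // Open (or create) the backing file and map its full capacity into one view;
    // the OS pages the data in lazily, so mapping everything up front is cheap.
    public static MemoryMappedViewAccessor OpenView(string path)
    {
        var mmf = MemoryMappedFile.CreateFromFile(
            path, FileMode.OpenOrCreate, "cache-map", Capacity,
            MemoryMappedFileAccess.ReadWrite);
        return mmf.CreateViewAccessor(0, Capacity, MemoryMappedFileAccess.ReadWrite);
    }
}
```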
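And for the last point, a sketch of a compare-and-swap on a 64-bit slot inside the mapped view, using Interlocked.CompareExchange over a raw pointer (requires compiling with unsafe). The slot offset is whatever your file layout defines, and the view is assumed to start at offset 0 of the mapping:

```csharp
using System.IO.MemoryMappedFiles;
using System.Threading;

// Sketch: lock-free publication of a 64-bit value (e.g. a root-page number)
// living at a fixed byte offset inside the mapped view.
unsafe class SharedSlot
{
    readonly MemoryMappedViewAccessor _view;
    readonly long _offset; // byte offset of the slot inside the view (assumption)

    public SharedSlot(MemoryMappedViewAccessor view, long offset)
    {
        _view = view;
        _offset = offset;
    }

    public bool TryPublish(long expected, long newValue)
    {
        byte* basePtr = null;
        _view.SafeMemoryMappedViewHandle.AcquirePointer(ref basePtr);
        try
        {
            long* slot = (long*)(basePtr + _offset);
            // Atomically install newValue only if nobody changed the slot meanwhile.
            return Interlocked.CompareExchange(ref *slot, newValue, expected) == expected;
        }
        finally
        {
            _view.SafeMemoryMappedViewHandle.ReleasePointer();
        }
    }
}
```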