
Best choice for growing, fixed capacity generic sequence


I'm trying to keep an in-memory list of entries in my .NET cache, representing the last N HTTP requests to my app. What's the best .NET sequence to use for this?

Requirements

  • Fixed number of items (e.g. 50)
  • Serializable (I need to add the sequence to the .NET cache)
  • When I try to add entry max+1, the oldest item is automatically removed to make room
  • Order of items doesn't really matter
  • Need to be able to get all items in a single operation, in order to perform aggregate calculations
  • Thread-safe
  • Non-unique (e.g. List<T>, not Dictionary<TKey,TValue>). I might hit the URL /foo 10 times; those entries aren't unique, but each one needs to be added to the sequence.

Off the top of my head, I was thinking I could use a Queue<T>: when enqueuing, check the length, and if it has hit capacity, dequeue the oldest item. But I'm concerned about thread safety (ConcurrentQueue<T>, perhaps?) and about picking the best approach, since this is a 'hot' path in my app that needs to be optimal.

Thanks!


Solution

  • It really depends on what you specifically mean by "oldest entry".

    If you are looking for a FIFO structure, you can extend ConcurrentQueue<T> to eject the oldest item (the first item entered). (Copied from this answer.)

    public class FixedSizedQueue<T> : ConcurrentQueue<T>
    {
        private readonly object syncObject = new object();
    
        public int Size { get; private set; }
    
        public FixedSizedQueue(int size)
        {
            Size = size;
        }
    
        // Note: "new" hides ConcurrentQueue<T>.Enqueue rather than overriding it,
        // so trimming only happens when callers use this type directly.
        public new void Enqueue(T obj)
        {
            base.Enqueue(obj);
            lock (syncObject)
            {
                // Trim under a lock so concurrent enqueues can't race each
                // other and shrink the queue below its fixed size.
                while (base.Count > Size)
                {
                    T outObj;
                    base.TryDequeue(out outObj);
                }
            }
        }
    }
    
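    As a quick sketch of how this meets the requirements list, here is a hedged usage example: enqueue more items than the capacity, let the oldest be ejected, and use ToArray() (which takes a consistent snapshot of a ConcurrentQueue<T>) for the aggregate calculation. The class is repeated from above so the snippet compiles on its own; the capacity of 3 and the duration values are made up for illustration.

    ```csharp
    using System;
    using System.Collections.Concurrent;
    using System.Linq;

    // Keep the last 3 request durations (the question uses 50; 3 keeps the
    // example short) and compute an aggregate over a snapshot.
    var requests = new FixedSizedQueue<double>(3);
    foreach (var ms in new[] { 10.0, 20.0, 30.0, 40.0 })
        requests.Enqueue(ms);               // 10.0 is ejected once capacity is hit

    double[] snapshot = requests.ToArray(); // single consistent snapshot
    double average = snapshot.Average();    // 30.0 over { 20, 30, 40 }

    // FixedSizedQueue repeated from the answer above so this sketch stands alone.
    public class FixedSizedQueue<T> : ConcurrentQueue<T>
    {
        private readonly object syncObject = new object();

        public int Size { get; private set; }

        public FixedSizedQueue(int size)
        {
            Size = size;
        }

        public new void Enqueue(T obj)
        {
            base.Enqueue(obj);
            lock (syncObject)
            {
                // Trim under a lock so the queue never exceeds its fixed size.
                while (base.Count > Size)
                {
                    T outObj;
                    base.TryDequeue(out outObj);
                }
            }
        }
    }
    ```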

    If you are looking for a cache that keeps track of when the last item was accessed, and ejects the one that was accessed least recently (sort of like the sliding expiration functionality in System.Runtime.Caching), you could use a Least Recently Used (LRU) cache.
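    To make the "least recently accessed" semantics concrete, here is a minimal single-threaded LRU sketch, for illustration only: a dictionary for O(1) lookup plus a linked list ordered by recency, with the tail ejected on overflow. This is not the LurchTable implementation mentioned below, just the underlying idea.

    ```csharp
    using System;
    using System.Collections.Generic;

    // Minimal single-threaded LRU cache sketch -- illustration only,
    // not a thread-safe, production-ready implementation.
    public class LruCache<TKey, TValue>
    {
        private readonly int capacity;

        // Map for O(1) lookup; linked list ordered most -> least recently used.
        private readonly Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>> map
            = new Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>>();
        private readonly LinkedList<KeyValuePair<TKey, TValue>> order
            = new LinkedList<KeyValuePair<TKey, TValue>>();

        public LruCache(int capacity)
        {
            this.capacity = capacity;
        }

        public void Put(TKey key, TValue value)
        {
            if (map.TryGetValue(key, out var existing))
            {
                order.Remove(existing);   // re-inserting counts as an access
            }
            else if (map.Count == capacity)
            {
                // Evict the least recently used entry (the tail of the list).
                var lru = order.Last;
                map.Remove(lru.Value.Key);
                order.RemoveLast();
            }
            map[key] = order.AddFirst(new KeyValuePair<TKey, TValue>(key, value));
        }

        public bool TryGet(TKey key, out TValue value)
        {
            if (map.TryGetValue(key, out var node))
            {
                // Reading an entry moves it to the front (most recently used).
                order.Remove(node);
                order.AddFirst(node);
                value = node.Value.Value;
                return true;
            }
            value = default(TValue);
            return false;
        }
    }
    ```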

    There is a high performance thread-safe .NET implementation of one called the LurchTable in the CSharpTest.Net.Collections project, which is available on NuGet.

    Introducing the LurchTable as a C# version of LinkedHashMap

    For other options, see