Tags: caching, data-structures, dictionary, hashmap, lru

Regarding LRU Cache Design Using a HashMap and Linked List Combination


Referring to the LRU cache design:

I have a question regarding the answer.

Say my hash map is full (the interviewer gave me a max size). I understand that if I need to fetch a pair already present in the map, I move the corresponding list entry to the front to indicate recent use.

But what if I have an entry to be added whose key hashes to the same position as a different key (a collision)? How do I handle that?

Do I use chaining or probing? If I use chaining, should I increase the map size? Removing the oldest entry frees a location in my hash map, but a new entry might not hash to that location; it might hash to a slot already occupied by a different key-value pair. How do I solve this?


Solution

  • This design does not involve chaining, because what we are building here is effectively a direct-mapped cache, and the known trade-off is that a direct-mapped cache considers only the recency of an entry when removing it from the cache, not how frequently it has been requested.

    The max size limit is imposed on the linked list, and every time we try to add a new entry while the linked list is full, the least recently used entry (the tail of the list) and its corresponding map entry are removed. The location where the new entry is inserted is independent of what was removed; see the sketch after this list.

    For more details on concurrency, check out this link.
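
To make the eviction behaviour above concrete, here is a minimal sketch of such a cache in Java. It is an illustration under assumptions, not code from the original answer: names like LruCache, moveToFront, and the sentinel head/tail nodes are my own, and a positive max size is assumed. The size limit is enforced on the doubly linked list, the least recently used node (at the tail) is evicted when the list is full, and any bucket collisions are left entirely to HashMap, which resolves them internally.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal LRU cache sketch: HashMap for O(1) lookup, doubly linked list for recency order.
// Illustrative only; class and method names are not from the original answer.
public class LruCache<K, V> {

    // List node; the head side is most recently used, the tail side is least recently used.
    private static final class Node<K, V> {
        K key;
        V value;
        Node<K, V> prev, next;
        Node(K key, V value) { this.key = key; this.value = value; }
    }

    private final int maxSize;                               // limit on the list, not on map buckets
    private final Map<K, Node<K, V>> map = new HashMap<>();
    private final Node<K, V> head = new Node<>(null, null);  // sentinel: most recent side
    private final Node<K, V> tail = new Node<>(null, null);  // sentinel: least recent side

    public LruCache(int maxSize) {
        this.maxSize = maxSize;
        head.next = tail;
        tail.prev = head;
    }

    public V get(K key) {
        Node<K, V> node = map.get(key);
        if (node == null) return null;
        moveToFront(node);                                   // fetching marks the entry as recently used
        return node.value;
    }

    public void put(K key, V value) {
        Node<K, V> node = map.get(key);
        if (node != null) {                                   // key already cached: update value, refresh recency
            node.value = value;
            moveToFront(node);
            return;
        }
        if (map.size() >= maxSize) {                          // list is full: evict the least recently used entry
            Node<K, V> lru = tail.prev;
            unlink(lru);
            map.remove(lru.key);                              // drop the corresponding map entry as well
        }
        Node<K, V> fresh = new Node<>(key, value);
        map.put(key, fresh);                                  // HashMap resolves any bucket collision internally
        linkAfterHead(fresh);
    }

    private void moveToFront(Node<K, V> node) {
        unlink(node);
        linkAfterHead(node);
    }

    private void unlink(Node<K, V> node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    private void linkAfterHead(Node<K, V> node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
    }
}
```

With this structure both get and put run in O(1) expected time: the HashMap provides the lookup, and the doubly linked list provides constant-time move-to-front and eviction, regardless of which bucket the new key hashes to.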