LRU Cache
Your desk has room for only 5 folders. When you need a 6th, you clear off the folder you haven't touched in the longest time to make space. That's the Least Recently Used (LRU) eviction policy, one of the most common cache strategies in operating systems, databases, and browsers.
An LRU cache supports two operations:
- get(key): return the value if it's in the cache, and mark it as recently used
- put(key, value): add the item; if the cache is full, evict the least recently used item first
Both must run in \(O(1)\). Achieving this requires combining two data structures: a doubly linked list and a hash map.
(Interactive demo: LRU Cache: Get & Put)
The data structure combination
The doubly linked list keeps items in order of recency. The head holds the most recently used item; the tail holds the least recently used. Moving a node to the head or removing the tail node both take \(O(1)\) with a doubly linked list.
The hash map maps each key directly to its node in the linked list. This gives \(O(1)\) lookup — no need to scan the list to find a key.
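As a minimal sketch of these two pieces (Python, with illustrative names): each cache entry is a node in a doubly linked list, and a plain dict maps keys straight to nodes.

```python
class Node:
    """One cache entry; also a link in the doubly linked recency list."""
    def __init__(self, key, value):
        self.key = key        # stored on the node so eviction can delete the map entry
        self.value = value
        self.prev = None
        self.next = None

# The hash map: key -> its Node, so no list traversal is ever needed.
index = {}
index["a"] = Node("a", 1)
index["b"] = Node("b", 2)

# Link the two nodes: "a" is more recent than "b".
index["a"].next = index["b"]
index["b"].prev = index["a"]
```

Note that each node stores its own key: when the tail node is evicted, the cache needs that key to delete the matching hash-map entry in \(O(1)\).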
get(key)
- Look up the key in the hash map. If absent, return -1.
- If found, move that node to the head of the list (mark it as most recently used).
- Return its value.
put(key, value)
- If the key already exists, update its value and move it to the head.
- If the key is new: create a new node, add it to the head, and store it in the hash map.
- If the cache is now over capacity: remove the node at the tail and delete its key from the hash map.
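A direct translation of the get and put steps above might look like the following sketch (illustrative names, no sentinels yet, so the empty-list edge cases are handled with explicit None checks):

```python
class Node:
    def __init__(self, key, value):
        self.key, self.value = key, value
        self.prev = self.next = None

class LRUCache:
    """Doubly linked list + dict, with bare head/tail pointers."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}          # key -> Node
        self.head = None       # most recently used
        self.tail = None       # least recently used

    def _unlink(self, node):
        if node.prev: node.prev.next = node.next
        else:         self.head = node.next
        if node.next: node.next.prev = node.prev
        else:         self.tail = node.prev

    def _push_front(self, node):
        node.prev, node.next = None, self.head
        if self.head: self.head.prev = node
        self.head = node
        if self.tail is None: self.tail = node

    def get(self, key):
        if key not in self.map:
            return -1
        node = self.map[key]
        self._unlink(node)          # move to head: now most recently used
        self._push_front(node)
        return node.value

    def put(self, key, value):
        if key in self.map:                # existing key: update and refresh
            node = self.map[key]
            node.value = value
            self._unlink(node)
            self._push_front(node)
            return
        node = Node(key, value)            # new key: add to head and map
        self.map[key] = node
        self._push_front(node)
        if len(self.map) > self.capacity:  # over capacity: evict the tail
            lru = self.tail
            self._unlink(lru)
            del self.map[lru.key]
```

The None checks in `_unlink` and `_push_front` are exactly the edge cases the sentinel trick in the next section removes.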
Sentinel nodes
A clean implementation uses two dummy nodes — a permanent head sentinel and a permanent tail sentinel. Real nodes live between them. This avoids null-pointer edge cases when inserting or removing the first/last real node.
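With sentinels, every real node always has a live neighbor on both sides, so the list helpers shrink to unconditional pointer swaps. A sketch (illustrative names):

```python
class Node:
    def __init__(self, key=None, value=None):
        self.key, self.value = key, value
        self.prev = self.next = None

class RecencyList:
    """Doubly linked list with permanent head/tail sentinel nodes."""

    def __init__(self):
        self.head = Node()            # sentinel just before the MRU node
        self.tail = Node()            # sentinel just after the LRU node
        self.head.next = self.tail
        self.tail.prev = self.head

    def remove(self, node):
        # node is never a sentinel, so both neighbors always exist
        node.prev.next = node.next
        node.next.prev = node.prev

    def add_to_front(self, node):
        node.prev = self.head
        node.next = self.head.next
        self.head.next.prev = node
        self.head.next = node

    def pop_lru(self):
        lru = self.tail.prev          # least recently used real node
        self.remove(lru)
        return lru
```

Compare with the None-checked helpers above: no branch ever asks whether the list is empty or whether the node is first or last.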
Complexity
Both get and put run in \(O(1)\) time: each is a hash-map lookup plus a constant number of pointer updates. Space is \(O(\text{capacity})\) for the map and the list combined.
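In practice, Python's standard-library `collections.OrderedDict` is itself backed by a hash map over a doubly linked list, so it gives the same \(O(1)\) bounds in a much shorter sketch (here the *end* of the dict plays the role of the head, i.e. most recently used):

```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()     # ordering doubles as recency

    def get(self, key):
        if key not in self.data:
            return -1
        self.data.move_to_end(key)    # end = most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # front = least recently used
```

`move_to_end` and `popitem(last=False)` are the library's \(O(1)\) equivalents of "move node to head" and "remove tail node".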
(Interactive demo: LRU Cache: Hit Rate Simulation)