
LRU Cache

Keep the most recent items fast — evict the forgotten ones
get: \(O(1)\) · put: \(O(1)\) · space: \(O(capacity)\)

Your desk has room for only 5 folders. When you need a 6th, you clear off the folder you haven't touched in the longest time to make space. That's the Least Recently Used (LRU) eviction policy — and it's the most common cache strategy in operating systems, databases, and browsers.

An LRU cache supports two operations:

  • get(key) — return the value if it's in the cache; mark it as recently used
  • put(key, value) — add the item; if the cache is full, evict the least recently used item first

Both must run in \(O(1)\). Achieving this requires combining two data structures: a doubly linked list and a hash map.

LRU cache access flow: accessed items move to the head; when the cache is full, a miss triggers eviction from the tail.


LRU Cache: Get & Put

from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.cap = capacity
        self.cache = OrderedDict()  # keys kept in recency order: oldest first

    def get(self, key):
        if key not in self.cache:
            return -1
        self.cache.move_to_end(key)  # mark as most recently used
        return self.cache[key]

    def put(self, key, value):
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.cap:
            self.cache.popitem(last=False)  # evict least recently used

cache = LRUCache(3)
cache.put(1, 'a')
cache.put(2, 'b')
cache.put(3, 'c')
print(cache.get(1))  # a (1 becomes most recent)
cache.put(4, 'd')    # evicts key 2 (least recent)
print(cache.get(2))  # -1 (evicted)
print(cache.get(4))  # d

Output
a
-1
d

The data structure combination

The doubly linked list keeps items in order of recency. The head holds the most recently used item; the tail holds the least recently used. Moving a node to the head or removing the tail node both take \(O(1)\) with a doubly linked list.

The hash map maps each key directly to its node in the linked list. This gives \(O(1)\) lookup — no need to scan the list to find a key.
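A minimal sketch of the pairing (the `Node` class and `index` dict are illustrative names, not part of the snippet above): every hash-map entry points directly at its list node, so finding a node is a single dict access, never a list scan.

```python
class Node:
    """One entry in the recency list."""
    def __init__(self, key, value):
        self.key = key      # kept on the node so a tail eviction can delete its map entry
        self.value = value
        self.prev = None    # neighbour toward the head (more recent)
        self.next = None    # neighbour toward the tail (less recent)

# The hash map points straight at nodes, not at plain values:
index = {}                  # key -> Node
index[1] = Node(1, 'a')

# Lookup is one dict access; the list is never scanned:
print(index[1].value)       # a
```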

get(key)

  1. Look up the key in the hash map. If absent, return -1.
  2. If found, move that node to the head of the list (mark it as most recently used).
  3. Return its value.

put(key, value)

  1. If the key already exists, update its value and move it to the head.
  2. If the key is new: create a new node, add it to the head, and store it in the hash map.
  3. If the cache is now over capacity: remove the node at the tail and delete its key from the hash map.

Sentinel nodes

A clean implementation uses two dummy nodes — a permanent head sentinel and a permanent tail sentinel. Real nodes live between them. This avoids null-pointer edge cases when inserting or removing the first/last real node.

head ↔ [A] ↔ [B] ↔ [C] ↔ tail
(A is most recent; C is least recent and evicted next)
Note: In Python, collections.OrderedDict keeps keys in insertion order and offers \(O(1)\) move_to_end and popitem, which makes an LRU cache almost trivial to write. In an interview, build it from scratch using a dict + doubly linked list — that's what interviewers want to see.
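A from-scratch sketch in that interview style, assuming a dict plus a doubly linked list with head/tail sentinels (the class and helper names here are my own):

```python
class Node:
    def __init__(self, key=None, value=None):
        self.key, self.value = key, value
        self.prev = self.next = None

class LRUCache:
    def __init__(self, capacity):
        self.cap = capacity
        self.map = {}            # key -> Node
        self.head = Node()       # sentinel on the most-recent side
        self.tail = Node()       # sentinel on the least-recent side
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        """Unlink a node from the list in O(1)."""
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add_front(self, node):
        """Insert a node right after the head sentinel."""
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.map:
            return -1
        node = self.map[key]
        self._remove(node)       # move to front = mark most recently used
        self._add_front(node)
        return node.value

    def put(self, key, value):
        if key in self.map:
            self._remove(self.map[key])   # will be re-added at the front
        node = Node(key, value)
        self.map[key] = node
        self._add_front(node)
        if len(self.map) > self.cap:
            lru = self.tail.prev          # real node closest to the tail sentinel
            self._remove(lru)
            del self.map[lru.key]         # node stores its key for exactly this step

cache = LRUCache(2)
cache.put(1, 'a')
cache.put(2, 'b')
print(cache.get(1))   # a
cache.put(3, 'c')     # evicts key 2
print(cache.get(2))   # -1
```

The sentinels mean `_remove` and `_add_front` never have to check for a None neighbour, which is exactly the edge-case elimination described above.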

Complexity

  • get(key) — hash map lookup + move node to head: \(O(1)\)
  • put(key, value) — hash map insert + add node to head + optional tail eviction: \(O(1)\)
  • Space — the cache never grows beyond the set capacity: \(O(capacity)\)

LRU Cache: Hit Rate Simulation

from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.cap = capacity
        self.cache = OrderedDict()

    def get(self, key):
        if key not in self.cache:
            return None
        self.cache.move_to_end(key)
        return self.cache[key]

    def put(self, key, value):
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.cap:
            self.cache.popitem(last=False)

requests = [1, 2, 3, 1, 4, 2, 5, 1]
cache = LRUCache(3)
hits = 0
for key in requests:
    if cache.get(key) is not None:
        hits += 1
    else:
        cache.put(key, key)

print(f"Hit rate: {hits}/{len(requests)} = {hits/len(requests):.0%}")

Output
Hit rate: 1/8 = 12%
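The hit rate is sensitive to capacity: with room for 3 keys, only the second request for key 1 hits, because key 2 and later key 1 itself get evicted before they recur. Replaying the same request stream at several capacities makes this visible (hit_count is an illustrative helper built on the same OrderedDict approach):

```python
from collections import OrderedDict

def hit_count(requests, capacity):
    """Replay a request stream through an LRU cache and count hits."""
    cache, hits = OrderedDict(), 0
    for key in requests:
        if key in cache:
            hits += 1
            cache.move_to_end(key)          # mark as most recently used
        else:
            cache[key] = key
            if len(cache) > capacity:
                cache.popitem(last=False)   # evict least recently used
    return hits

requests = [1, 2, 3, 1, 4, 2, 5, 1]
for cap in range(1, 6):
    print(f"capacity {cap}: {hit_count(requests, cap)}/{len(requests)} hits")
# capacity 1: 0/8 hits
# capacity 2: 0/8 hits
# capacity 3: 1/8 hits
# capacity 4: 3/8 hits
# capacity 5: 3/8 hits
```

Capacity 5 is the point where nothing is ever evicted (the stream has only 5 distinct keys), so every repeat request hits.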

Quick check

An LRU cache has capacity 3 and currently holds [A, B, C] with A most recent. You call get(C). What is the new order?

Continue reading

  • Doubly Linked List — a two-way chain: each node knows both its neighbour ahead and the one behind
  • Hash Functions — turn any key into a bucket number, instantly
  • Caching (System Design) — remember answers so you don't compute them twice
  • Sets & Maps — instant membership and key-value lookup: the everyday hash table