
Recurrence Relations

A recipe that calls itself — until it's simple enough to solve
Merge sort: \(T(n) = 2T(n/2) + O(n)\) → \(O(n \log n)\)
Binary search: \(T(n) = T(n/2) + O(1)\) → \(O(\log n)\)
Master Theorem: \(T(n) = aT(n/b) + f(n)\)

Imagine a recipe for cooking a feast that says: "To cook for N people, split the guests in half, cook each half separately, then combine the two meals." The recipe refers to itself! That self-referencing is exactly what a recurrence relation is in algorithms.

When a recursive algorithm solves a problem of size n by breaking it into sub-problems and combining the results, its running time T(n) depends on T(smaller n). A recurrence relation is the equation that captures this dependency — and solving it gives us the final Big-O.

Reading T(n) = aT(n/b) + f(n)

This is the standard template for divide-and-conquer. a = how many sub-problems you create. n/b = the size of each sub-problem. f(n) = the work done outside the recursive calls (splitting, combining).

Merge sort: \(T(n) = 2T(n/2) + O(n)\) — two halves (\(a = 2\)), each half-sized (\(b = 2\)), plus \(O(n)\) work to merge (\(f(n) = n\)).

Binary search: \(T(n) = T(n/2) + O(1)\) — one sub-problem (\(a = 1\)), half-sized (\(b = 2\)), just one comparison (\(f(n) = 1\)).

The Master Theorem — Your Shortcut

The Master Theorem solves \(T(n) = aT(n/b) + f(n)\) by comparing \(f(n)\) to the "natural" recursion cost \(n^{\log_b a}\):

Case 1 — Recursion dominates: \(f(n)\) is polynomially smaller than \(n^{\log_b a}\). Then \(T(n) = \Theta(n^{\log_b a})\). Most work happens deep in the recursion tree.

Case 2 — Both match: \(f(n) = \Theta(n^{\log_b a})\). Then \(T(n) = \Theta(n^{\log_b a} \log n)\). Every level of the tree does equal work, so we multiply by the number of levels.

Case 3 — Combining dominates: \(f(n)\) is polynomially larger than \(n^{\log_b a}\) (with a regularity condition). Then \(T(n) = \Theta(f(n))\). Most work is at the top level.
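When \(f(n) = \Theta(n^d)\) is a polynomial, the three cases reduce to comparing the exponent \(d\) against the critical exponent \(\log_b a\). Here's a minimal classifier sketch (my own helper, not a library function; it assumes polynomial \(f(n)\) and trusts exact float comparison, which is fine for these small inputs):

```python
import math

def master_theorem(a, b, d):
    """Classify T(n) = a*T(n/b) + Theta(n^d) by the Master Theorem."""
    c = math.log(a, b)  # critical exponent log_b(a)
    if d < c:
        return f"Case 1: Theta(n^{c:.3f})"   # recursion dominates
    if d == c:                                # float equality: OK for these inputs
        return f"Case 2: Theta(n^{d} log n)"  # every level does equal work
    return f"Case 3: Theta(n^{d})"            # combining dominates

print(master_theorem(2, 2, 1))  # merge sort     -> Case 2
print(master_theorem(1, 2, 0))  # binary search  -> Case 2
print(master_theorem(7, 2, 2))  # Strassen       -> Case 1
```

A real implementation would compare exponents with a tolerance and check Case 3's regularity condition; this sketch just mechanizes the "compare \(d\) to \(\log_b a\)" step.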

The Recursion Tree — Visual Intuition

Draw the recursion as a tree. Each node shows how much non-recursive work is done at that level. For merge sort: level 0 has 1 node doing \(O(n)\); level 1 has 2 nodes each doing \(O(n/2)\) = \(O(n)\) total; level 2 has 4 nodes each doing \(O(n/4)\) = \(O(n)\) total. Every level does \(O(n)\) work, and there are \(O(\log n)\) levels — so total cost is \(O(n \log n)\). The tree makes the multiplication intuitive.
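The level-by-level accounting is easy to verify directly. This small sketch (assuming \(n\) is a power of two) lists the total non-recursive work at each level of merge sort's recursion tree:

```python
def level_work(n):
    """For T(n) = 2T(n/2) + n: total work done at each level of the tree."""
    levels = []
    nodes, size = 1, n
    while size >= 1:
        levels.append(nodes * size)  # `nodes` subproblems, each doing `size` work
        nodes, size = nodes * 2, size // 2
    return levels

print(level_work(8))  # [8, 8, 8, 8] -- every level totals n, with log2(n)+1 levels
```

Every level sums to \(n\), and there are \(\log_2 n + 1\) levels, so the grand total is \(\Theta(n \log n)\).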

Note: The Master Theorem is a calculator for recursion: plug in a, b, and f(n), compare \(f(n)\) to \(n^{\log_b a}\), and read off the answer. Merge sort: \(\log_2 2 = 1\), \(f(n) = n = n^1\) — they match! That's Case 2 → \(\Theta(n \log n)\).

Solving Merge Sort's Recurrence — Recursion Tree

def merge_sort(arr, depth=0):
    indent = "  " * depth
    n = len(arr)
    if n <= 1:
        print(f"{indent}base case: {arr}")
        return arr
    mid = n // 2
    left = merge_sort(arr[:mid], depth + 1)
    right = merge_sort(arr[mid:], depth + 1)
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged += left[i:] + right[j:]
    print(f"{indent}merge({left}, {right}) -> {merged} [O(n) work at this level]")
    return merged

result = merge_sort([5, 2, 8, 1, 9])
print("Sorted:", result)
Output
    base case: [5]
    base case: [2]
  merge([5], [2]) -> [2, 5]  [O(n) work at this level]
    base case: [8]
    base case: [1]
    base case: [9]
  merge([1], [9]) -> [1, 9]  [O(n) work at this level]
  merge([8], [1, 9]) -> [1, 8, 9]  [O(n) work at this level]
merge([2, 5], [1, 8, 9]) -> [1, 2, 5, 8, 9]  [O(n) work at this level]
Sorted: [1, 2, 5, 8, 9]

Complexity

🔍 Binary search: \(T(n) = T(n/2) + O(1)\). Case 2: \(\log_2 1 = 0\), \(f(n) = O(1) = \Theta(n^0)\) — both match → multiply by \(\log n\).
Logarithmic: \(O(\log n)\)

🔀 Merge sort: \(T(n) = 2T(n/2) + O(n)\). Case 2: \(\log_2 2 = 1\), \(f(n) = \Theta(n^1)\) — match → \(\Theta(n \log n)\).
Linearithmic: \(O(n \log n)\)

🧮 Strassen: \(T(n) = 7T(n/2) + O(n^2)\). Case 1: \(\log_2 7 \approx 2.807 > 2\) — recursion dominates.
Sub-cubic: \(O(n^{2.807})\)

📦 Naive matrix multiply: \(T(n) = 8T(n/2) + O(n^2)\). Case 1: \(\log_2 8 = 3 > 2\) — recursion dominates.
Cubic: \(O(n^3)\)

📝 Linear scan: \(T(n) = T(n-1) + O(1)\). Not in standard Master form — substitution gives \(T(n) = O(n)\).
Linear: \(O(n)\)
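For recurrences outside the Master form, like the linear scan above, substitution (unrolling) still works: \(T(n) = T(n-1) + 1 = T(n-2) + 2 = \dots = T(0) + n = O(n)\). A quick sketch of that unrolling (helper name is my own):

```python
def T_scan(n):
    # T(n) = T(n-1) + 1, with T(0) = 0: unrolls to exactly n units of O(1) work
    total = 0
    while n > 0:
        total += 1  # the O(1) work done at this "level" of the recurrence
        n -= 1
    return total

print(T_scan(1000))  # 1000 -- linear, just as substitution predicts
```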

