Complexity & Analysis

Big-O, Theta, Omega

How fast does your recipe slow down as the guest list grows?
\(O(1)\) constant: same time, always. \(O(\log n)\) logarithmic: grows very slowly. \(O(n)\) linear: grows with input. \(O(n^2)\) quadratic: slows fast.

Imagine you're cooking for a dinner party. A recipe for grilled cheese takes about the same time whether you're feeding 2 people or 200 — you just stack them on the grill together. That's \(O(1)\): constant time, no matter the size.

But if the recipe says "taste every dish and compare it to every other dish", cooking for 200 people means roughly 200 × 200 = 40,000 comparisons. That's \(O(n^2)\): it explodes as the guest list grows.

Big-O notation is just a way to describe that growth rate — how many more steps does an algorithm need as the input gets bigger? We don't care about exact milliseconds (those depend on your computer). We care about the shape of the growth.

The Core Idea: Drop the Details

If an algorithm does \(3n^2 + 100n + 42\) steps, Big-O says that's simply \(O(n^2)\). Here's why: when n is a million, the \(n^2\) term completely drowns out everything else. So we drop constants and lower-order terms and just keep the dominant piece.

Two rules to remember: (1) Drop constants — \(5n\) is \(O(n)\). (2) Drop smaller terms — \(n^2 + n\) is \(O(n^2)\). Sequential steps add; nested loops multiply.
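Both rules show up directly in code. A minimal sketch (function names are illustrative): two back-to-back passes cost \(n + n = 2n\) steps, which is still \(O(n)\), while nesting one loop inside another multiplies to \(n \times n\), which is \(O(n^2)\).

```python
def sequential(arr):
    # Two separate passes: n + n = 2n steps — constants drop, still O(n).
    smallest = min(arr)   # one pass: n steps
    largest = max(arr)    # another pass: n steps
    return smallest, largest

def nested(arr):
    # A loop inside a loop: n * n steps — O(n^2).
    pairs = 0
    for a in arr:
        for b in arr:
            pairs += 1
    return pairs

print(sequential([3, 1, 4, 1, 5]))  # (1, 5) — two O(n) passes add to O(n)
print(nested([3, 1, 4, 1, 5]))      # 25 — 5 × 5: nested loops multiply
```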

The Most Common Complexities

\(O(1)\) — Constant. Grabbing an array element by index. Same speed no matter what.

\(O(\log n)\) — Logarithmic. Binary search on a sorted list. Each step cuts the problem in half, so even a billion items only takes ~30 steps.

\(O(n)\) — Linear. Scanning every item once. Double the input, double the time.

\(O(n \log n)\) — Linearithmic. Merge sort, heap sort. The sweet spot for comparison-based sorting.

\(O(n^2)\) — Quadratic. A loop inside a loop. 10× more input = 100× more time.

\(O(2^n)\) — Exponential. Brute-force subset search. Completely infeasible past ~30 items.
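To get a feel for how differently these curves climb, here's a rough step-count comparison (the printed numbers are illustrative; only the shape of the growth matters). It also checks the binary-search claim: a billion items needs only about 30 halvings.

```python
import math

# Illustrative step counts for each growth rate as n increases.
for n in (10, 1_000, 1_000_000):
    print(f"n={n:>9,}  log n ~ {math.log2(n):.0f}  "
          f"n log n ~ {n * math.log2(n):,.0f}  n^2 = {n * n:,}")

# Halving a billion items down to one takes ceil(log2(10^9)) steps.
print(math.ceil(math.log2(1_000_000_000)))  # 30
```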

Big-O, Big-Omega, Big-Theta

Big-O is an upper bound — the worst it can get. Big-Omega (Ω) is a lower bound — the best it can ever do. Big-Theta (Θ) is both at once: the algorithm grows exactly like that function, up to constants. Merge sort is \(\Theta(n \log n)\) in all cases — best, worst, and average.

Note: Big-O is like a speed limit sign — it tells you the worst case, not the typical speed. An algorithm that's \(O(n)\) might usually finish much sooner, but its running time will never grow faster than linearly.
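Linear search makes that gap between the bound and the typical run concrete. A sketch (the step counter is added purely for illustration): the worst case is a full \(O(n)\) scan, but the best case — target at the front — is a single step, which is why the lower bound is \(\Omega(1)\).

```python
def linear_search(arr, target):
    # Upper bound O(n): in the worst case we scan every element.
    # Lower bound Omega(1): in the best case the first element matches.
    steps = 0
    for i, x in enumerate(arr):
        steps += 1
        if x == target:
            return i, steps
    return -1, steps

arr = list(range(1_000_000))
print(linear_search(arr, 0))        # (0, 1) — best case: one step
print(linear_search(arr, 999_999))  # (999999, 1000000) — worst case: full scan
```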

Seeing Big-O in Code

def o_1(arr):   # O(1): one step, regardless of input size
    return arr[0]

def o_n(arr):   # O(n): one pass over the input
    total = 0
    for x in arr:
        total += x
    return total

def o_n2(arr):  # O(n²): a loop inside a loop
    count = 0
    for i in arr:
        for j in arr:
            count += 1
    return count

arr = list(range(1000))
print(o_1(arr))   # 0 — always 1 step
print(o_n(arr))   # 499500 — 1,000 steps
print(o_n2(arr))  # 1000000 — 1,000,000 steps
Output
0
499500
1000000

Complexity

⚡ \(O(1)\) Constant: array index lookup, hash table get. Instant, never grows.
🪵 \(O(\log n)\) Logarithmic: binary search; 1 billion items takes ~30 steps. Grows very slowly.
📏 \(O(n)\) Linear: single scan through an array. Proportional to input.
📈 \(O(n \log n)\) Linearithmic: merge sort, heap sort; optimal for comparison sorting. Slightly more than linear.
🐢 \(O(n^2)\) Quadratic: bubble sort, nested loops; 10× input = 100× time. Slows fast; avoid for large n.
💥 \(O(2^n)\) Exponential: brute-force subsets; infeasible past n ≈ 30. Doubles with every added item.
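The logarithmic row is worth seeing in action. A sketch of binary search with a step counter (the counter is added purely for illustration): each comparison halves the remaining range, so a million sorted items never needs more than about 20 comparisons.

```python
def binary_search(arr, target):
    # Each comparison halves the search space: O(log n) steps total.
    lo, hi, steps = 0, len(arr) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid, steps
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps

arr = list(range(1_000_000))
idx, steps = binary_search(arr, 777_777)
print(idx, steps)  # found at index 777777 in at most ~20 comparisons
```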

Quick check

An algorithm takes \(8n^2 + 3n + 100\) steps. What's its Big-O?

Continue reading

Amortized Analysis
One big bill, paid in tiny installments
Recurrence Relations
A recipe that calls itself — until it's simple enough to solve
Binary Search
Cut the problem in half — every single time