
Karatsuba Multiplication

3 multiplications instead of 4 at each recursive level

Karatsuba: \(O(n^{1.585})\) | Naive: \(O(n^2)\) | Insight: compute ad + bc without multiplying ad or bc directly

Suppose you need to multiply two 1,000-digit numbers — like the RSA keys used in your browser right now. The grade-school long multiplication approach requires about 1,000,000 single-digit multiplications. That's already slow, and for 10,000-digit numbers it becomes 100,000,000.

In 1960, a 23-year-old mathematician named Anatoly Karatsuba found a trick that cuts this dramatically. The idea: split each number in half and use 3 multiplications instead of 4. At first that sounds like a rounding error. But recursively applied — splitting again and again — 3 vs 4 compounds into \(O(n^{1.585})\) vs \(O(n^2)\). For large enough numbers, the gap is enormous.

The setup

Take two n-digit numbers x and y. Split each at the midpoint:

\(x = a \cdot 10^{n/2} + b\) and \(y = c \cdot 10^{n/2} + d\), where a and c are the high halves and b and d are the low halves.

Then: \(x \cdot y = ac \cdot 10^n + (ad + bc) \cdot 10^{n/2} + bd\)

The naive approach computes four products: ac, ad, bc, bd.
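For concreteness, here is what the split looks like in Python using divmod (the numbers here are just an illustrative example):

```python
# Split two 4-digit numbers at the midpoint: x = a * 10^2 + b
x, y = 1234, 5678
a, b = divmod(x, 10**2)  # a = 12, b = 34
c, d = divmod(y, 10**2)  # c = 56, d = 78

# Reassembling the halves recovers the originals
assert a * 10**2 + b == x
assert c * 10**2 + d == y
```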

The trick: save one multiplication

Observe that (a+b)(c+d) = ac + ad + bc + bd.

So: ad + bc = (a+b)(c+d) − ac − bd

We only need three products: \(P_1 = ac\), \(P_2 = bd\), \(P_3 = (a+b)(c+d)\). Then \(ad + bc = P_3 - P_1 - P_2\) comes for free. The final result: \(P_1 \cdot 10^n + (P_3 - P_1 - P_2) \cdot 10^{n/2} + P_2\).

The additions and shifts cost \(O(n)\). Only the three recursive multiplications drive the recurrence.
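A quick numeric check of the identity, continuing the example split of 1234 and 5678 from above:

```python
# Halves from splitting 1234 and 5678 at the midpoint
a, b, c, d = 12, 34, 56, 78

p1 = a * c              # ac
p2 = b * d              # bd
p3 = (a + b) * (c + d)  # one multiplication covers both cross terms

# The cross terms fall out by subtraction
assert p3 - p1 - p2 == a * d + b * c

# Full product via the three-product formula (n = 4, half = 2)
assert p1 * 10**4 + (p3 - p1 - p2) * 10**2 + p2 == 1234 * 5678
```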

Why does saving one multiplication help so much?

Recurrence: \(T(n) = 3\,T(n/2) + O(n)\). By the Master Theorem with a = 3, b = 2: \(n^{\log_2 3} \approx n^{1.585}\) dominates \(O(n)\), so Case 1 applies, giving \(T(n) = O(n^{1.585})\).

With four multiplications it would be \(T(n) = 4\,T(n/2) + O(n) \rightarrow O(n^2)\). That one extra multiplication per level compounds across \(O(\log n)\) levels into a fundamentally different complexity class.
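One way to see the compounding: after \(\log_2 n\) levels of halving, the number of base-case multiplications is \(3^{\log_2 n} = n^{\log_2 3}\) for Karatsuba versus \(4^{\log_2 n} = n^2\) for the four-product split. A small counting sketch (illustrative only; it counts recursion-tree leaves for power-of-two digit counts and ignores the linear combine work):

```python
import math

def leaf_count(n, branches):
    """Base-case multiplications for an n-digit input when each
    level spawns `branches` half-size subproblems."""
    levels = math.ceil(math.log2(n))
    return branches ** levels

for n in (1024, 1_048_576):
    k, naive = leaf_count(n, 3), leaf_count(n, 4)
    print(f"n={n:>9,}: Karatsuba {k:,} vs naive {naive:,}")
```

For 1,024 digits that is 59,049 versus 1,048,576 base-case multiplications; the ratio only widens as n grows.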

Modern fast multiplication algorithms (Schönhage-Strassen, Harvey-van der Hoeven) use FFT to push further toward \(O(n \log n)\), but Karatsuba was the first crack in the belief that \(O(n^2)\) was optimal — a classic divide and conquer breakthrough — and it's still the go-to for moderately large numbers.

Note: Python's built-in integer multiplication uses Karatsuba automatically for large numbers. When you write a * b for big integers in Python, you're using Karatsuba — without doing anything special.

Karatsuba Multiplication

def karatsuba(x, y):
    # Base case: single-digit factors are multiplied directly
    if x < 10 or y < 10:
        return x * y

    # Number of digits in the larger number
    n = max(len(str(x)), len(str(y)))
    half = n // 2

    # Split: x = a * 10^half + b, y = c * 10^half + d
    split = 10 ** half
    a, b = x // split, x % split
    c, d = y // split, y % split

    # Three recursive multiplications
    p1 = karatsuba(a, c)          # ac
    p2 = karatsuba(b, d)          # bd
    p3 = karatsuba(a + b, c + d)  # (a+b)(c+d)

    # Combine: ac*split^2 + (p3-p1-p2)*split + bd
    return p1 * (split ** 2) + (p3 - p1 - p2) * split + p2

print(karatsuba(1234, 5678))  # 7006652
print(1234 * 5678)            # verify: 7006652
Output
7006652
7006652

Complexity

🐌 Grade-school multiplication: n rows, each requiring \(O(n)\) work — about a million operations for 1,000-digit numbers
Quadratic in digit count: \(O(n^2)\)

⚡ Karatsuba multiplication: \(O(n^{\log_2 3})\), significantly faster for large n — the first algorithm to beat \(O(n^2)\)
Sub-quadratic: \(O(n^{1.585})\)

🔁 Recursive structure: three multiplications instead of four — one saved per level
Three half-size calls: 3 · T(n/2)

➕ Combine step: additions and digit shifts — cheap compared to multiplications
Linear in digits: \(O(n)\)

🚀 FFT-based multiplication: the 2019 Harvey-van der Hoeven algorithm — far more complex, used in very specialized contexts
Near-linear (reference point): \(O(n \log n)\)

Quick check

What is the key algebraic trick that reduces 4 multiplications to 3?
