Divide & Conquer

D&C Pattern

Split the problem, solve the pieces, combine the results
Template: \(T(n) = aT(n/b) + f(n)\). Master Theorem: solves most D&C recurrences.

Your room is a disaster. Cleaning it feels overwhelming — until you realize you could split it in half. You clean one half, your sibling cleans the other. Suddenly what felt like one giant job is two smaller jobs happening at the same time. When you're both done, the room is clean.

That's divide and conquer. Take a hard problem, break it into pieces, solve each piece independently, then combine the results. The magic is that many problems which are hard at scale become trivially easy when the input is small enough.

The three phases

Divide: Split the input into two or more smaller subproblems. Often this means splitting an array in half, narrowing a search range, or partitioning a set of points.

Conquer: Solve each subproblem recursively. The base case is when the subproblem is so small you can solve it directly — an array of size 1 is already sorted, for example.

Combine: Merge the subsolutions into a solution for the original problem. This is where the interesting work often lives — the merge step in merge sort, for instance.
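The three phases can be sketched with a tiny toy problem. The function name `max_dc` and the choice of "find the maximum" are illustrative only, not part of the pattern itself:

```python
def max_dc(arr, lo, hi):
    # Base case: a single element -- solve directly
    if lo == hi:
        return arr[lo]
    mid = (lo + hi) // 2                  # Divide: split the range in half
    left_max = max_dc(arr, lo, mid)       # Conquer: left half
    right_max = max_dc(arr, mid + 1, hi)  # Conquer: right half
    return max(left_max, right_max)       # Combine: one comparison

print(max_dc([4, 1, 7, 3, 9, 2], 0, 5))  # -> 9
```

Here the combine step is a single comparison, so all the cost lives in the recursion itself.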

Canonical examples

Merge Sort: Divide array in half, sort each half, merge. Combine step costs \(O(n)\). Recurrence: T(n) = 2T(n/2) + \(O(n)\) → \(O(n \log n)\).

Binary Search: Discard half the search space at each step. No combine step needed. Recurrence: T(n) = T(n/2) + \(O(1)\) → \(O(\log n)\).
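A minimal iterative sketch of binary search over a sorted list (returning `-1` for a missing target is a convention chosen here, not required by the technique):

```python
def binary_search(arr, target):
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid          # Found: return the index
        elif arr[mid] < target:
            lo = mid + 1        # Discard the left half
        else:
            hi = mid - 1        # Discard the right half
    return -1                   # Target not present

print(binary_search([3, 9, 10, 27, 38, 43, 82], 27))  # -> 3
```

Each iteration halves the search space, which is exactly the \(T(n) = T(n/2) + O(1)\) recurrence above.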

Strassen Matrix Multiplication: Split matrices into quadrants, use 7 recursive multiplications instead of 8. Recurrence: T(n) = 7T(n/2) + \(O(n^2)\) → \(O(n^{2.807})\).
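A sketch of Strassen's scheme, assuming square matrices whose side is a power of two (real implementations pad inputs and fall back to the naive product for small blocks):

```python
def add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def sub(A, B):
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def strassen(A, B):
    n = len(A)
    if n == 1:                          # Base case: 1x1 product
        return [[A[0][0] * B[0][0]]]
    h = n // 2                          # Divide: split into quadrants
    A11 = [r[:h] for r in A[:h]]; A12 = [r[h:] for r in A[:h]]
    A21 = [r[:h] for r in A[h:]]; A22 = [r[h:] for r in A[h:]]
    B11 = [r[:h] for r in B[:h]]; B12 = [r[h:] for r in B[:h]]
    B21 = [r[:h] for r in B[h:]]; B22 = [r[h:] for r in B[h:]]
    # Conquer: 7 recursive products instead of 8
    M1 = strassen(add(A11, A22), add(B11, B22))
    M2 = strassen(add(A21, A22), B11)
    M3 = strassen(A11, sub(B12, B22))
    M4 = strassen(A22, sub(B21, B11))
    M5 = strassen(add(A11, A12), B22)
    M6 = strassen(sub(A21, A11), add(B11, B12))
    M7 = strassen(sub(A12, A22), add(B21, B22))
    # Combine: assemble the four result quadrants
    C11 = add(sub(add(M1, M4), M5), M7)
    C12 = add(M3, M5)
    C21 = add(M2, M4)
    C22 = add(add(sub(M1, M2), M3), M6)
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bot = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bot

print(strassen([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # -> [[19, 22], [43, 50]]
```

The saving of one multiplication per level is what drops the exponent from 3 to \(\log_2 7 \approx 2.807\).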

Setting up the recurrence

Every divide-and-conquer algorithm fits the form T(n) = a·T(n/b) + f(n), where a is the number of recursive calls, b is how much the input shrinks per call, and f(n) is the cost of the divide and combine steps. The Master Theorem then solves this by comparing f(n) to \(n^{\log_b a}\).
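For the common case where f(n) is a polynomial \(n^d\), the comparison can be mechanized. The helper below (`master_case`, a hypothetical name for illustration) handles only that polynomial case; the full theorem also covers non-polynomial f(n) and a regularity condition in Case 3:

```python
import math

def master_case(a, b, d):
    """Classify T(n) = a*T(n/b) + O(n^d) by the Master Theorem
    (polynomial f(n) only -- an illustrative simplification)."""
    crit = math.log(a, b)            # critical exponent log_b(a)
    if math.isclose(d, crit):
        return f"Case 2: O(n^{d} log n)"   # work balanced across levels
    elif d < crit:
        return f"Case 1: O(n^{crit:.3f})"  # subproblems dominate
    else:
        return f"Case 3: O(n^{d})"         # combine step dominates

print(master_case(2, 2, 1))  # merge sort    -> Case 2: O(n^1 log n)
print(master_case(7, 2, 2))  # Strassen      -> Case 1: O(n^2.807)
print(master_case(1, 2, 0))  # binary search -> Case 2: O(n^0 log n), i.e. O(log n)
```

Note that binary search lands in Case 2 with d = 0, and \(O(n^0 \log n)\) simplifies to \(O(\log n)\).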

When D&C beats other approaches

Reach for divide and conquer when subproblems are independent (no shared state) and the combine step is cheap. If subproblems overlap — computing the same sub-answers repeatedly — switch to dynamic programming. If a simple greedy choice always works, keep it even simpler.

The divide-and-conquer pattern: every call either solves a tiny base case or splits, recurses, and combines.
Note: The key word is 'independent.' D&C only works well when solving one half doesn't depend on the other. That's what separates it from dynamic programming, where subproblems share results and you need to cache them.

Merge Sort — Classic Divide & Conquer

def merge_sort(arr):
    if len(arr) <= 1:                  # Base case: already sorted
        return arr
    mid = len(arr) // 2                # Divide
    left = merge_sort(arr[:mid])       # Conquer left
    right = merge_sort(arr[mid:])      # Conquer right
    return merge(left, right)          # Combine

def merge(left, right):
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i]); i += 1
        else:
            result.append(right[j]); j += 1
    return result + left[i:] + right[j:]   # Append whichever side remains

arr = [38, 27, 43, 3, 9, 82, 10]
print(merge_sort(arr))
Output
[3, 9, 10, 27, 38, 43, 82]

Complexity

✂️ Divide — Splitting an array by index is \(O(1)\); copying is \(O(n)\). Usually linear or constant: \(O(1)\)–\(O(n)\).
🔁 Conquer (recurse) — a subproblems, each of size n/b. Depends on branching: a · T(n/b).
🔗 Combine — Merge sort: \(O(n)\); binary search: \(O(1)\). Varies by algorithm: f(n).
📘 Master Theorem Case 1 — When \(f(n) = O(n^{\log_b a - \varepsilon})\). Subproblems dominate: \(O(n^{\log_b a})\).
📗 Master Theorem Case 2 — When \(f(n) = \Theta(n^{\log_b a})\) — merge sort falls here. Balanced, evenly split: \(O(n^{\log_b a} \log n)\).
📕 Master Theorem Case 3 — When \(f(n) = \Omega(n^{\log_b a + \varepsilon})\). Combine step dominates: \(O(f(n))\).

Quick check

Merge Sort has recurrence T(n) = 2T(n/2) + \(O(n)\). Which Master Theorem case applies?

Continue reading

Merge Sort
Split, sort each half, merge — guaranteed \(O(n \log n)\) every time
Recurrence Relations
A recipe that calls itself — until it's simple enough to solve
Count Inversions
Measure how shuffled a sequence is — in \(O(n \log n)\)