D&C Pattern
Your room is a disaster. Cleaning it feels overwhelming — until you realize you could split it in half. You clean one half, your sibling cleans the other. Suddenly what felt like one giant job is two smaller jobs happening at the same time. When you're both done, the room is clean.
That's divide and conquer. Take a hard problem, break it into pieces, solve each piece independently, then combine the results. The magic is that many problems which are hard at scale become trivially easy when the input is small enough.
The three phases
Divide: Split the input into two or more smaller subproblems. Often this means splitting an array in half, narrowing a search range, or partitioning a set of points.
Conquer: Solve each subproblem recursively. The base case is when the subproblem is so small you can solve it directly — an array of size 1 is already sorted, for example.
Combine: Merge the subsolutions into a solution for the original problem. This is where the interesting work often lives — the merge step in merge sort, for instance.
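The three phases map directly onto code. A minimal merge sort sketch in Python, with each phase labeled (illustrative, not a production sort):

```python
def merge_sort(items):
    # Base case: a list of 0 or 1 elements is already sorted.
    if len(items) <= 1:
        return items

    # Divide: split the input in half.
    mid = len(items) // 2

    # Conquer: recursively sort each half.
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])

    # Combine: merge the two sorted halves in linear time.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

Note that all of the interesting work sits in the combine step; the divide step is a single slice.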
Canonical examples
Merge Sort: Divide the array in half, sort each half, merge. The combine step costs \(O(n)\). Recurrence: \(T(n) = 2T(n/2) + O(n)\) → \(O(n \log n)\).
Binary Search: Discard half the search space at each step. No combine step needed. Recurrence: \(T(n) = T(n/2) + O(1)\) → \(O(\log n)\).
Strassen Matrix Multiplication: Split matrices into quadrants, use 7 recursive multiplications instead of 8. Recurrence: \(T(n) = 7T(n/2) + O(n^2)\) → \(O(n^{2.807})\).
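Binary search shows the degenerate case of the pattern: only one subproblem survives each divide, and there is no combine step at all. A recursive sketch:

```python
def binary_search(sorted_items, target, lo=0, hi=None):
    """Return the index of target in sorted_items, or -1 if absent."""
    if hi is None:
        hi = len(sorted_items)

    # Base case: empty range means the target is not present.
    if lo >= hi:
        return -1

    mid = (lo + hi) // 2
    if sorted_items[mid] == target:
        return mid

    # Divide: recurse into the only half that can contain the target.
    # No combine step — the recursive result is the answer.
    if sorted_items[mid] < target:
        return binary_search(sorted_items, target, mid + 1, hi)
    return binary_search(sorted_items, target, lo, mid)
```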
Setting up the recurrence
Most divide-and-conquer algorithms fit the form \(T(n) = a \cdot T(n/b) + f(n)\), where \(a\) is the number of recursive calls, \(b\) is the factor by which the input shrinks per call, and \(f(n)\) is the cost of the divide and combine steps. The Master Theorem then solves the recurrence by comparing \(f(n)\) to \(n^{\log_b a}\).
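For the common case where \(f(n) = n^d\), that comparison can be mechanized. A small helper (hypothetical, not a library function) applying the polynomial form of the Master Theorem:

```python
import math

def master_theorem(a, b, d):
    """Solve T(n) = a*T(n/b) + O(n^d) via the polynomial-case Master Theorem."""
    # Critical exponent: log_b(a). The recursion tree has a levels of
    # branching per factor-of-b shrink, so leaf work grows as n^crit.
    crit = math.log(a, b)
    if d < crit:
        return f"O(n^{crit:.3f})"    # leaves dominate: recursion wins
    if d == crit:
        return f"O(n^{d} log n)"     # every level costs the same
    return f"O(n^{d})"               # root dominates: combine step wins
```

Plugging in the three examples above: merge sort is `master_theorem(2, 2, 1)`, binary search is `master_theorem(1, 2, 0)`, and Strassen is `master_theorem(7, 2, 2)`, which lands in the leaf-dominated case with critical exponent \(\log_2 7 \approx 2.807\).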
When D&C beats other approaches
Reach for divide and conquer when subproblems are independent (no shared state) and the combine step is cheap. If subproblems overlap — computing the same sub-answers repeatedly — switch to dynamic programming. If a simple greedy choice always works, keep it even simpler.
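The overlap failure mode is easy to see with Fibonacci, a deliberately non-D&C example: the naive recursion splits into subproblems that share work, so it recomputes the same values exponentially many times, while caching each answer once (the core dynamic-programming move) makes it linear. A sketch using the standard library's memoization decorator:

```python
from functools import lru_cache

def fib_naive(n):
    # Subproblems overlap: fib(n-1) and fib(n-2) both recompute fib(n-3),
    # and so on — exponential time, not a good fit for plain D&C.
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_cached(n):
    # Dynamic programming via memoization: each subproblem is solved once.
    return n if n < 2 else fib_cached(n - 1) + fib_cached(n - 2)
```

Merge sort's halves, by contrast, never share elements, which is exactly why plain divide and conquer suffices there.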