Longest Increasing Subsequence
Picture a sequence of random numbers: 3, 1, 4, 1, 5, 9, 2, 6. Hidden inside is a "staircase" — a chain of numbers where each is strictly bigger than the last. You can skip numbers, but you can't reorder them. The longest such staircase is [1, 4, 5, 9] or [1, 4, 5, 6] — both of length 4.
That's the Longest Increasing Subsequence (LIS). It comes up in scheduling problems, in analyzing algorithm complexity, and even in card games.
The \(O(n^2)\) DP Approach
Define dp[i] = length of the longest increasing subsequence that ends at index i. Base case: dp[i] = 1, since every element by itself is an increasing subsequence of length 1. For each i, look back at every j < i where A[j] < A[i] and take dp[i] = max(dp[i], dp[j] + 1). The answer is the maximum across all dp[i] values.
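A minimal sketch of this recurrence in Python (the function name is mine):

```python
def lis_length_quadratic(seq):
    """O(n^2) DP: dp[i] = length of the LIS ending at index i."""
    if not seq:
        return 0
    dp = [1] * len(seq)          # base case: each element alone
    for i in range(len(seq)):
        for j in range(i):       # every earlier index j < i
            if seq[j] < seq[i]:  # strictly increasing, so A[j] < A[i]
                dp[i] = max(dp[i], dp[j] + 1)
    return max(dp)

print(lis_length_quadratic([3, 1, 4, 1, 5, 9, 2, 6]))  # 4
```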
This is \(O(n^2)\) time — fine for n ≤ 10,000, but too slow for n = 100,000.
The \(O(n \log n)\) Patience Sorting Trick
Imagine playing a card game: you lay cards into piles, and each new card goes on the leftmost pile whose top card is greater than or equal to the new card. If no such pile exists, start a new one. The number of piles at the end equals the LIS length.
In code, maintain a tails array where tails[k] is the smallest possible tail element of all increasing subsequences of length k+1 found so far. For each element x:
If tails is empty or x is greater than its last element — append x (the LIS just got longer).
Otherwise — binary search for the leftmost tails[pos] ≥ x, and replace it with x.
The tails array is always sorted — replacing tails[pos] with x keeps it sorted, because pos is the leftmost index with tails[pos] ≥ x, so tails[pos-1] < x ≤ tails[pos] — and that invariant is what makes the binary search valid. Each element costs \(O(\log n)\), giving \(O(n \log n)\) total.
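The whole loop fits in a few lines. A sketch using Python's standard `bisect` module (the function name is mine):

```python
from bisect import bisect_left

def lis_length(seq):
    """O(n log n) LIS length via the patience-sorting tails array."""
    # tails[k] = smallest possible tail of an increasing subseq of length k+1
    tails = []
    for x in seq:
        pos = bisect_left(tails, x)   # leftmost pos with tails[pos] >= x
        if pos == len(tails):
            tails.append(x)           # x extends the longest subsequence
        else:
            tails[pos] = x            # x becomes a smaller, better tail
    return len(tails)

print(lis_length([3, 1, 4, 1, 5, 9, 2, 6]))  # 4
```

Note the choice of `bisect_left`: it finds the leftmost slot with tails[pos] ≥ x, which enforces *strictly* increasing. Swapping in `bisect_right` would instead compute the longest non-decreasing subsequence.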
Why Replacing Works
Replacing tails[pos] with a smaller x doesn't break any existing subsequence — it just opens the door to longer subsequences in the future. A smaller tail is strictly better for future extensions: more elements in the array will be able to follow it.
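To see this on the running example, here is a short trace of the tails array after each element (the snapshot bookkeeping is mine):

```python
from bisect import bisect_left

def tails_snapshots(seq):
    """Return (and print) the tails array after processing each element."""
    tails, snaps = [], []
    for x in seq:
        pos = bisect_left(tails, x)
        if pos == len(tails):
            tails.append(x)
        else:
            tails[pos] = x
        snaps.append(list(tails))
        print(x, "->", tails)
    return snaps

snaps = tails_snapshots([3, 1, 4, 1, 5, 9, 2, 6])
# The 2 replaces the 4 (tails becomes [1, 2, 5, 9]), and later the 6
# replaces the 9: the final tails is [1, 2, 5, 6], so a future 7 or 8
# could still extend the subsequence — which [1, 4, 5, 9] could not.
```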