Algorithm Design Techniques: Divide and Conquer and Beyond
Algorithm design techniques are reusable patterns for turning a problem into an efficient solution. This post gives an overview of the main ones—divide and conquer, greedy, dynamic programming—and when to reach for each.
Divide and conquer
- Idea: Split the problem into smaller subproblems (often half the size), solve them recursively, then combine the results.
- When it fits: Problem naturally breaks into independent or nearly independent pieces; combination cost is modest.
- Examples: Merge sort (split array, sort halves, merge), quicksort (partition, sort parts), binary search (one half only), closest pair of points.
- Complexity: Often O(n log n) or similar; the master theorem resolves recurrences of the form T(n) = a T(n/b) + f(n).
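As a minimal sketch of the pattern, here is merge sort in Python: split the array, sort the halves recursively, then merge them in linear time. The recurrence T(n) = 2T(n/2) + O(n) gives O(n log n) by the master theorem.

```python
def merge_sort(a):
    """Divide and conquer: split, sort halves recursively, merge."""
    if len(a) <= 1:                    # base case: already sorted
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])         # conquer the two halves
    right = merge_sort(a[mid:])
    # Combine: merge two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # → [1, 2, 5, 5, 6, 9]
```

Note that the two recursive calls are independent, which is what makes the technique apply cleanly here.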
Greedy
- Idea: At each step, make the locally optimal choice; never backtrack.
- When it fits: The problem has the greedy-choice property (a locally optimal choice is part of some global optimum) and optimal substructure.
- Examples: Activity selection, Huffman coding, Dijkstra, MST (Kruskal, Prim), fractional knapsack.
- Complexity: Often O(n log n) from sorting or a priority queue; sometimes O(n) or O(E log V).
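The greedy pattern can be sketched with activity selection: sort the intervals by finish time, then repeatedly take the first activity that starts no earlier than the last chosen one finished. Sorting dominates, so the whole thing is O(n log n).

```python
def select_activities(intervals):
    """Greedy: sort by finish time, keep each activity that doesn't overlap."""
    chosen, last_finish = [], float("-inf")
    for start, finish in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_finish:       # locally optimal choice, never revisited
            chosen.append((start, finish))
            last_finish = finish
    return chosen

acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11)]
print(select_activities(acts))  # → [(1, 4), (5, 7), (8, 11)]
```

Finishing earliest leaves the most room for the remaining activities, which is exactly the greedy-choice property for this problem.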
Dynamic programming
- Idea: Solve overlapping subproblems once; store results and reuse (memoization or tabulation).
- When it fits: Optimal substructure and overlapping subproblems; greedy doesn’t work.
- Examples: Fibonacci, coin change, longest common subsequence, knapsack, many optimization problems.
- Complexity: Depends on the number of subproblems and the work per subproblem; often O(n²), or O(n × W) for knapsack-style problems where W is the capacity or target value.
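A minimal tabulation sketch for coin change (fewest coins summing to a target amount): each subproblem is an amount from 0 up to the target, each is solved once and stored, so the runtime is O(amount × number of coin denominations).

```python
def min_coins(coins, amount):
    """DP tabulation: best[x] = fewest coins summing to x (inf if impossible)."""
    INF = float("inf")
    best = [0] + [INF] * amount
    for x in range(1, amount + 1):
        for c in coins:
            if c <= x and best[x - c] + 1 < best[x]:
                best[x] = best[x - c] + 1   # reuse the stored subresult
    return best[amount] if best[amount] < INF else -1

print(min_coins([1, 5, 10, 25], 63))  # → 6  (25 + 25 + 10 + 1 + 1 + 1)
```

This is a problem where greedy happens to work for US coin denominations but fails for arbitrary ones (e.g. coins [1, 3, 4] with amount 6), which is why DP is the safe choice.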
Brute force and backtracking
- Brute force: Try all possibilities; accept when input is tiny.
- Backtracking: Explore choices incrementally (e.g. place a queen, recurse on the next row); prune whenever a partial solution can’t be extended to a valid one. Used for enumeration and constraint satisfaction (e.g. N-queens, Sudoku).
- When: Problem size is small, or you need all solutions; pruning makes it feasible for medium sizes.
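A compact N-queens sketch shows the place/prune/undo cycle: try each column in the current row, skip any column attacked by an earlier queen, recurse, then pop the choice and try the next.

```python
def n_queens(n):
    """Backtracking: place one queen per row; prune conflicting columns/diagonals."""
    solutions = []

    def place(row, cols):
        if row == n:                          # all rows filled: record a solution
            solutions.append(cols[:])
            return
        for col in range(n):
            # Prune: column or diagonal conflict with any earlier queen.
            if all(col != c and abs(col - c) != row - r
                   for r, c in enumerate(cols)):
                cols.append(col)              # make the choice
                place(row + 1, cols)          # explore
                cols.pop()                    # undo and try the next column

    place(0, [])
    return solutions

print(len(n_queens(6)))  # → 4 solutions on a 6×6 board
```

Without pruning this would examine n^n placements; rejecting doomed prefixes early is what makes medium board sizes feasible.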
Which technique to try
- Can you split and combine? → Divide and conquer.
- Does the best local choice always lead to a global optimum? → Greedy.
- Overlapping subproblems and optimal substructure but no greedy? → Dynamic programming.
- Need to enumerate or search with pruning? → Backtracking (or search + heuristics).
Often the same problem can be solved with different techniques; pick the one that gives the right complexity and is clearest to implement and maintain.
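Fibonacci is a convenient illustration of that last point: naive recursion (a brute-force search over an exponential call tree), memoization, and tabulation all give the same answer at very different costs.

```python
from functools import lru_cache

def fib_naive(n):
    """Brute-force recursion: recomputes subproblems, exponential time."""
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    """DP memoization: each subproblem solved once, O(n) time."""
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

def fib_tab(n):
    """DP tabulation, bottom-up: O(n) time, O(1) extra space."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fib_naive(10), fib_memo(10), fib_tab(10))  # → 55 55 55
```

All three are correct; only the last two have a complexity you would accept in production, and the tabulated version is arguably the clearest to maintain.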