
Algorithm Complexity and Big O: A Practical Intro


When you pick an algorithm or data structure, you care how it scales with input size. Big O notation is the standard way to describe that scaling. This post gives a practical intro so you can read and use complexity in everyday code.


What Big O describes

Big O describes how runtime or space grows as the input size n grows. We usually care about the worst case (or sometimes average case), and we ignore constant factors and lower-order terms. So we say “order of n” (O(n)) or “order of n squared” (O(n²)), not “exactly 3n + 2 steps.”
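A tiny sketch of why the constants drop: suppose a loop does 3 operations per element plus 2 setup steps. The hypothetical `count_steps` below just tallies those steps; doubling n roughly doubles the total, so we call it O(n) rather than "3n + 2".

```python
def count_steps(arr):
    """Tally steps for a loop doing 3 operations per element plus 2 setup steps."""
    steps = 2  # setup cost, independent of n
    for _ in arr:
        steps += 3  # constant work per element
    return steps

print(count_steps(range(10)))  # 3*10 + 2 = 32
print(count_steps(range(20)))  # 3*20 + 2 = 62 — roughly double, i.e. linear
```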


Common complexities (quick reference)

  • O(1): Constant. Same cost regardless of n (e.g. array index by i, hash lookup by key).
  • O(log n): Logarithmic. Grows slowly (e.g. binary search in a sorted array).
  • O(n): Linear. Doubling n roughly doubles cost (e.g. one pass over an array).
  • O(n log n): Linearithmic. Typical for good general-purpose sorting (e.g. merge sort, quicksort average).
  • O(n²): Quadratic. Nested loops over the same input (e.g. naive comparison of every pair).
  • O(2^n) or worse: Exponential. Only acceptable for very small n; usually a red flag.
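To get a feel for how these classes diverge, this sketch evaluates each growth function at a few sizes (counts are rough operation counts, not real timings):

```python
import math

# Rough operation counts per complexity class at a few input sizes.
for n in (10, 100, 1000):
    print(
        f"n={n:>4}: "
        f"log n ≈ {math.log2(n):.0f}, "
        f"n = {n}, "
        f"n log n ≈ {n * math.log2(n):.0f}, "
        f"n² = {n ** 2}, "
        f"2^n has {len(str(2 ** n))} digits"
    )
```

At n = 1000, n² is already a million, and 2^n is astronomically large, which is why exponential algorithms are only acceptable for very small n.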

Why it matters in code

  • Small n: O(n²) might be fine (e.g. n < 100). For large n, it can be too slow.
  • Hot paths: In loops or frequent calls, prefer better complexity (e.g. use a Set for membership instead of scanning an array).
  • Data structure choice: Array index = O(1); array search by value = O(n). Hash map lookup = O(1) average; balanced binary search tree = O(log n) for search/insert. Choosing the right structure often fixes performance without changing the algorithm idea.
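The membership point in Python terms (a list scan is O(n) per lookup; a hash-based `set` is O(1) on average, so for many lookups the difference dominates):

```python
# Hypothetical allow-list used only to illustrate the lookup cost.
allowed_list = ["alice", "bob", "carol"]
allowed_set = set(allowed_list)  # one-time O(n) conversion

found_in_list = "bob" in allowed_list  # O(n): walks the list until a match
found_in_set = "bob" in allowed_set    # O(1) average: one hash lookup
print(found_in_list, found_in_set)     # True True
```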

Examples

Linear search in an array: Check each element until found or end. Worst case n comparisons → O(n).
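A minimal linear-search sketch:

```python
def linear_search(arr, target):
    """Return the index of target, or -1 if absent.

    Worst case (target missing or last) checks all n elements -> O(n).
    """
    for i, value in enumerate(arr):
        if value == target:
            return i
    return -1

print(linear_search([4, 8, 15, 16], 15))  # 2
print(linear_search([4, 8, 15, 16], 99))  # -1 (worst case: 4 comparisons)
```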

Binary search in a sorted array: Halve the search space each step. Worst case about log₂(n) steps → O(log n).
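And the binary-search version, which only works because the input is sorted:

```python
def binary_search(sorted_arr, target):
    """Return the index of target in a sorted list, or -1 if absent.

    Each iteration halves the remaining range -> about log2(n) steps, O(log n).
    """
    lo, hi = 0, len(sorted_arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_arr[mid] == target:
            return mid
        elif sorted_arr[mid] < target:
            lo = mid + 1  # discard the lower half
        else:
            hi = mid - 1  # discard the upper half
    return -1

print(binary_search([2, 5, 8, 12, 16, 23], 12))  # 3
```

For a million sorted elements this is about 20 comparisons in the worst case, versus up to a million for linear search.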

Two nested loops over the same list: For each of n elements, compare with up to n others → O(n²).
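As a concrete nested-loop example, here is a naive duplicate check that compares every pair:

```python
def has_duplicate_pair(items):
    """Naive duplicate check: compare each element with every later one.

    Up to n*(n-1)/2 comparisons -> O(n^2). (A set-based check would be O(n).)
    """
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):
            if items[i] == items[j]:
                return True
    return False

print(has_duplicate_pair([3, 1, 4, 1]))  # True
print(has_duplicate_pair([3, 1, 4]))     # False
```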

One loop that does a constant-time thing per element: n × O(1) → O(n).


Summary

  • Big O describes how time or space grows with input size n; we focus on worst (or average) case and ignore constants.
  • O(1), O(log n), O(n), O(n log n), O(n²) are the ones you’ll see most; prefer lower complexity on hot paths and for large n.
  • Use complexity to choose algorithms and data structures (e.g. hash vs array for lookups, sort algorithms) and to spot when code will not scale.