The Ultimate Big O Notation Cheat Sheet: Algorithm Complexity Explained

Introduction: What is Big O Notation and Why It Matters

Big O Notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity. In computer science, it’s used to classify algorithms according to how their run time or space requirements grow as the input size grows.

Why it matters:

  • Helps predict algorithm performance as data scales
  • Enables comparison between different algorithms
  • Essential for technical interviews and efficient software development
  • Guides optimization efforts where they’ll have the most impact

Core Concepts and Principles

Key Terminology

Term                   Definition
Time Complexity        How runtime increases with input size
Space Complexity       How memory usage increases with input size
Best Case (Ω)          Performance under optimal conditions
Average Case (Θ)       Expected performance under typical conditions
Worst Case (O)         Performance under worst possible conditions
n                      The input size (number of elements)
Asymptotic Analysis    Focuses on performance as input approaches infinity

Important Rules

  • Drop Constants: O(2n) → O(n)
  • Drop Lower-Order Terms: O(n² + n) → O(n²)
  • Consider Dominant Terms Only: Focus on what happens as n gets very large
  • Variables in Exponents Matter: O(2ⁿ) is very different from O(n²)
  • Consider All Operations: Assignments, comparisons, arithmetic operations, etc.
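
For instance, the first two rules mean that two sequential passes over the same input are still linear. A minimal sketch (the function name is illustrative, not from a library):

```python
# Hypothetical two-pass function: each loop is O(n), so the total is
# O(2n), which simplifies to O(n) after dropping the constant factor.
def min_and_max(arr):
    smallest = arr[0]
    for x in arr:          # first pass: O(n)
        if x < smallest:
            smallest = x
    largest = arr[0]
    for x in arr:          # second pass: O(n)
        if x > largest:
            largest = x
    return smallest, largest   # O(n) + O(n) = O(2n) -> O(n)
```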

Common Big O Complexities (Best to Worst)

Notation     Name           Description                                     Example Operations
O(1)         Constant       Runtime is independent of input size            Array access, simple calculations
O(log n)     Logarithmic    Runtime grows logarithmically with input size   Binary search, balanced trees
O(n)         Linear         Runtime grows linearly with input size          Linear search, traversing arrays
O(n log n)   Linearithmic   Runtime grows as n × log(n)                     Efficient sorting (merge sort, quicksort on average)
O(n²)        Quadratic      Runtime grows quadratically with input size     Nested loops, bubble/insertion/selection sorts
O(n³)        Cubic          Runtime grows cubically with input size         Triple nested loops, some matrix operations
O(2ⁿ)        Exponential    Runtime doubles with each additional element    Recursive Fibonacci, combinatorial problems
O(n!)        Factorial      Runtime grows factorially with input size       Traveling salesman (brute force), permutations

Determining Big O Complexity: Step-by-Step Process

  1. Identify the operations: Count assignments, comparisons, arithmetic operations
  2. Identify variables: Determine which variables represent input size
  3. Count operations in terms of input size: Analyze how operations scale with input
  4. Focus on the dominant term: Keep the term that grows fastest as n increases
  5. Remove constants and coefficients: Simplify to standard Big O notation
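
The five steps can be walked through on a small example. The pair-sum check below is hypothetical, chosen only to illustrate the process:

```python
# Applying the five steps to a brute-force pair-sum check.
def has_pair_sum(arr, target):
    n = len(arr)                          # step 2: n is the input size
    for i in range(n):                    # outer loop runs n times
        for j in range(i + 1, n):         # inner loop runs up to n-1 times
            if arr[i] + arr[j] == target: # step 1: one comparison per pair
                return True
    return False
# Step 3: about n*(n-1)/2 comparisons = 0.5n² - 0.5n
# Steps 4-5: drop the lower-order term and the coefficient -> O(n²)
```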

Common Code Patterns and Their Complexities

# O(1) - Constant time
def constant_time(arr):
    return arr[0]

# O(log n) - Logarithmic time
def binary_search(arr, target):
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = (left + right) // 2
        if arr[mid] == target: return mid
        if arr[mid] < target: left = mid + 1
        else: right = mid - 1
    return -1

# O(n) - Linear time
def linear_time(arr):
    total = 0
    for item in arr:
        total += item
    return total

# O(n log n) - Linearithmic time
def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(left, right):
    # Combine two sorted lists in O(n)
    result, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i]); i += 1
        else:
            result.append(right[j]); j += 1
    return result + left[i:] + right[j:]

# O(n²) - Quadratic time
def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        for j in range(0, n-i-1):
            if arr[j] > arr[j+1]:
                arr[j], arr[j+1] = arr[j+1], arr[j]
    return arr

# O(2ⁿ) - Exponential time
def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n-1) + fibonacci(n-2)

Algorithm Comparison Tables by Category

Sorting Algorithms

Algorithm        Best Case    Average Case   Worst Case   Space Complexity
Bubble Sort      O(n)         O(n²)          O(n²)        O(1)
Selection Sort   O(n²)        O(n²)          O(n²)        O(1)
Insertion Sort   O(n)         O(n²)          O(n²)        O(1)
Merge Sort       O(n log n)   O(n log n)     O(n log n)   O(n)
Quick Sort       O(n log n)   O(n log n)     O(n²)        O(log n)
Heap Sort        O(n log n)   O(n log n)     O(n log n)   O(1)
Radix Sort       O(nk)        O(nk)          O(nk)        O(n+k)
Counting Sort    O(n+k)       O(n+k)         O(n+k)       O(k)

Data Structure Operations

Data Structure       Access      Search      Insertion   Deletion
Array                O(1)        O(n)        O(n)        O(n)
Linked List          O(n)        O(n)        O(1)        O(1)
Stack                O(n)        O(n)        O(1)        O(1)
Queue                O(n)        O(n)        O(1)        O(1)
Hash Table           N/A         O(1)*       O(1)*       O(1)*
Binary Search Tree   O(log n)*   O(log n)*   O(log n)*   O(log n)*
AVL Tree             O(log n)    O(log n)    O(log n)    O(log n)
B-Tree               O(log n)    O(log n)    O(log n)    O(log n)
Red-Black Tree       O(log n)    O(log n)    O(log n)    O(log n)
Heap                 O(1)        O(n)        O(log n)    O(log n)

*Average case; the worst case is O(n) (hash collisions, or an unbalanced tree degenerating into a linked list)
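
The hash-table row is why "use a set/dict for lookups" is such common advice. A minimal sketch contrasting a linear scan with an average-O(1) hash lookup (the helper names are illustrative):

```python
# The same membership test against a list (linear scan) and a set
# (hash table). Both return the same answer; only the growth differs.
def in_list(items, x):       # O(n): may scan every element
    for item in items:
        if item == x:
            return True
    return False

def in_set(items_set, x):    # O(1) on average: one hash lookup
    return x in items_set

data = list(range(1000))
assert in_list(data, 999) == in_set(set(data), 999)
```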

Graph Algorithms

Algorithm                    Time Complexity                                  Space Complexity
BFS (Breadth-First Search)   O(V + E)                                         O(V)
DFS (Depth-First Search)     O(V + E)                                         O(V)
Dijkstra's Algorithm         O(V² + E), or O((V+E) log V) with min-heap       O(V)
Bellman-Ford                 O(V × E)                                         O(V)
Floyd-Warshall               O(V³)                                            O(V²)
Prim's Algorithm             O(V² + E), or O(E log V) with min-heap           O(V)
Kruskal's Algorithm          O(E log E)                                       O(V)
Topological Sort             O(V + E)                                         O(V)
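
To see where O(V + E) comes from, here is a sketch of BFS over an adjacency-dict graph (the graph representation is an assumption for the example): each vertex is enqueued once and each edge is examined once.

```python
from collections import deque

def bfs(graph, start):
    visited, order = {start}, []
    queue = deque([start])
    while queue:
        v = queue.popleft()      # each vertex dequeued once: O(V)
        order.append(v)
        for w in graph[v]:       # each edge scanned once in total: O(E)
            if w not in visited:
                visited.add(w)
                queue.append(w)
    return order

g = {0: [1, 2], 1: [2], 2: [3], 3: []}
assert bfs(g, 0) == [0, 1, 2, 3]
```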

Common Challenges and Solutions

Challenge 1: Nested Loops with Different Indices

Problem: Determining complexity when loops have different bounds or patterns
Solution: Multiply the complexities of nested (dependent) loops; add the complexities of sequential loops

# What's the complexity?
for i in range(n):          # O(n)
    print(i)
for j in range(n*n):        # O(n²)
    print(j)
# Total: O(n) + O(n²) = O(n²)
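
The complementary case is nesting: when one loop runs inside another, their complexities multiply. A minimal sketch:

```python
# Dependent (nested) loops multiply instead of add.
def nested(n):
    count = 0
    for i in range(n):        # O(n)
        for j in range(n):    # O(n) per iteration of i
            count += 1
    return count              # n * n operations -> O(n²)

assert nested(10) == 100
```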

Challenge 2: Recursive Functions

Problem: Analyzing recursive function complexity
Solution: Use the master theorem or the recursion-tree method

Master Theorem for recurrences of the form T(n) = aT(n/b) + f(n), where f(n) = Θ(n^c):

  • If c < log_b(a), then T(n) = O(n^(log_b(a)))
  • If c = log_b(a), then T(n) = O(n^c * log n)
  • If c > log_b(a), then T(n) = O(n^c)
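
A worked example, written as runnable comments: merge sort's recurrence is T(n) = 2T(n/2) + O(n), which lands in the middle case.

```python
import math

# Merge sort: T(n) = 2T(n/2) + O(n), so a = 2, b = 2, and c = 1.
a, b, c = 2, 2, 1
critical = math.log(a, b)   # log_b(a) = log_2(2) = 1.0
# c == log_b(a) -> middle case: T(n) = O(n^c * log n) = O(n log n)
assert c == critical
```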

Challenge 3: Amortized Analysis

Problem: Some operations occasionally cost more but are rare
Solution: Consider the average cost over a long sequence of operations

Example: Dynamic arrays (like ArrayList in Java, vector in C++, list in Python)

  • Most insertions are O(1)
  • Occasional resizing is O(n)
  • Amortized analysis shows O(1) per insertion on average
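
The doubling behavior can be simulated directly. The sketch below counts element copies for n appends to a capacity-doubling array (a simplified model, not any library's actual implementation): the total stays under 2n, so each append is O(1) amortized.

```python
def count_copies(n):
    capacity, size, copies = 1, 0, 0
    for _ in range(n):
        if size == capacity:   # array is full: resize by doubling
            copies += size     # copy every existing element: O(size)
            capacity *= 2
        size += 1
    return copies

# 1000 appends trigger resizes at sizes 1, 2, 4, ..., 512:
# 1 + 2 + 4 + ... + 512 = 1023 copies total, i.e. about 1 per append.
assert count_copies(1000) < 2 * 1000
```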

Challenge 4: Multiple Variables

Problem: Analyzing complexity with multiple input variables
Solution: Express complexity in terms of all relevant variables

# Complexity depends on both m and n
def two_variable_complexity(m, n):
    for i in range(m):
        for j in range(n):
            print(i, j)
    # Complexity: O(m × n)

Best Practices and Practical Tips

For Technical Interviews

  • Communicate your thought process: Explain how you’re analyzing complexity
  • Consider all cases: Best, average, and worst-case scenarios
  • Practice common patterns: Recognize nested loops, recursive calls, etc.
  • Simplify first: Drop constants and lower-order terms early for clarity
  • Ask clarifying questions: Input size, constraints, expected behavior

For Algorithm Design

  • Choose the right data structure: Appropriate choices can drastically improve performance
  • Consider input characteristics: Sorted data? Mostly unique values? Size constraints?
  • Space-time tradeoffs: Sometimes using more memory can significantly improve runtime
  • Avoid premature optimization: Focus on correct implementation first, then optimize if needed
  • Benchmark with realistic data: Theoretical analysis is important but test with actual data

Time Complexity Optimization Techniques

  • Avoid nested loops when possible
  • Use hash tables for O(1) lookups
  • Replace recursion with iteration to avoid stack overflow
  • Use dynamic programming to avoid redundant calculations
  • Consider greedy algorithms for appropriate problems
  • Precompute and cache frequently needed values
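
The dynamic-programming and caching tips combine neatly on the exponential Fibonacci shown earlier: memoizing it computes each value 0..n exactly once, turning O(2ⁿ) into O(n).

```python
from functools import lru_cache

@lru_cache(maxsize=None)       # cache results to skip repeated subproblems
def fib(n):
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)   # each subproblem now computed once

assert fib(10) == 55
assert fib(50) == 12586269025  # instant; the naive version is infeasible
```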

Visualizing Big O Complexities

Growth rates visualization:

    n    O(1)   O(log n)    O(n)   O(n log n)     O(n²)       O(2ⁿ)       O(n!)
   10      1        3         10         30         100        1024     3628800
   20      1        4         20         86         400     1048576   too large
   50      1        6         50        283        2500   too large   too large
  100      1        7        100        664       10000   too large   too large
 1000      1       10       1000       9966     1000000   too large   too large

Visual Comparison

O(1)        ▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁
O(log n)    ▁▁▁▁▁▁▁▁▁▁▁▁▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂
O(n)        ▁▁▁▁▁▁▁▁▁▁▁▁▁▁▃▃▃▃▃▃▃▃▃▃▃▃▃▃▃▃▃
O(n log n)  ▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▃▃▃▃▃▃▃▃▃▅▅▅▅▅▅▅
O(n²)       ▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▂▂▃▃▃▃▅▅▅█████▓
O(2ⁿ)       ▁▁▁▁▁▁▁▁▁▂▂▂▃▃▅▅█████████████████
O(n!)       ▁▁▂▃▅███████████████████████████

Resources for Further Learning

Books

  • “Introduction to Algorithms” by Cormen, Leiserson, Rivest, and Stein
  • “Algorithms” by Robert Sedgewick and Kevin Wayne
  • “Grokking Algorithms” by Aditya Bhargava
  • “The Art of Computer Programming” by Donald Knuth

Online Courses

  • MIT OpenCourseWare: “Introduction to Algorithms”
  • Coursera: “Algorithms” by Princeton University
  • Udemy: “JavaScript Algorithms and Data Structures Masterclass”
  • edX: “Algorithm Design and Analysis”

Websites and Tools

  • Big-O Cheat Sheet: bigocheatsheet.com
  • VisuAlgo: visualgo.net – Algorithm visualization
  • LeetCode: Practice problems with complexity analysis
  • HackerRank: Algorithm challenges with complexity requirements
  • GeeksforGeeks: Detailed articles on algorithm analysis

YouTube Channels

  • MIT OpenCourseWare
  • Back To Back SWE
  • CS Dojo
  • Abdul Bari
  • mycodeschool

Practice Platforms

  • LeetCode
  • HackerRank
  • CodeSignal
  • AlgoExpert
  • Codeforces