Algorithms and Big O Notation: The Ultimate Developer’s Cheatsheet

Introduction to Big O Notation

Big O Notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity. In computer science, it’s used to classify algorithms based on their performance or space requirements as input size grows. Understanding Big O allows developers to make informed decisions about algorithm selection and optimization, especially when dealing with large datasets.

Core Concepts of Algorithmic Complexity

Time Complexity

Measures how runtime increases as input size grows.

Space Complexity

Measures how memory usage increases as input size grows.
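
As a quick illustrative sketch (hypothetical functions), both of the following run in O(n) time, but only the second uses O(n) extra space:

```python
def total(nums):
    # O(n) time, O(1) extra space: a single running accumulator
    s = 0
    for x in nums:
        s += x
    return s

def doubled(nums):
    # O(n) time, O(n) extra space: builds a new list as large as the input
    return [2 * x for x in nums]
```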

Common Complexity Classes (Ordered from Best to Worst)

| Notation   | Name         | Description                                             |
|------------|--------------|---------------------------------------------------------|
| O(1)       | Constant     | Runtime remains the same regardless of input size       |
| O(log n)   | Logarithmic  | Runtime grows logarithmically with input size           |
| O(n)       | Linear       | Runtime grows linearly with input size                  |
| O(n log n) | Linearithmic | Runtime grows in proportion to n log n                  |
| O(n²)      | Quadratic    | Runtime grows in proportion to the square of input size |
| O(n³)      | Cubic        | Runtime grows in proportion to the cube of input size   |
| O(2ⁿ)      | Exponential  | Runtime doubles with each additional element            |
| O(n!)      | Factorial    | Runtime grows factorially with input size               |
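
A few of these classes can be made concrete with small (hypothetical) functions:

```python
def constant(nums):
    # O(1): one index lookup, independent of input size
    return nums[0]

def linear(nums, target):
    # O(n): may scan every element once
    return target in nums

def quadratic(nums):
    # O(n²): examines every pair of elements
    return [(a, b) for a in nums for b in nums]

def exponential(n):
    # O(2ⁿ): naive Fibonacci, two recursive calls per level
    return n if n < 2 else exponential(n - 1) + exponential(n - 2)
```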

Growth Rate Visualization

  • O(1): ▬▬▬▬▬▬▬▬▬▬
  • O(log n): ▬▬▬▬▬▬▬▬▬⬈
  • O(n): ▬▬▬▬▬▬▬▬▬↗
  • O(n log n): ▬▬▬▬▬▬▬▬↗↗
  • O(n²): ▬▬▬▬▬▬▬↗↗↗
  • O(2ⁿ): ▬▬▬▬▬↗↗↗↗↑
  • O(n!): ▬▬▬↗↗↗↑↑↑↑

Big O Analysis Methodology

Step-by-Step Process for Determining Big O

  1. Break down the algorithm into basic operations
  2. Count how many times each operation is executed
  3. Express the count as a function of input size n
  4. Eliminate non-dominant terms and constants
  5. Express the result using Big O notation
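
Applying these steps to a small made-up function: the inner comparison runs n times for each of the n outer iterations, giving roughly n² + 2 basic operations, which simplifies to O(n²):

```python
def count_pairs(nums):
    count = 0                  # 1 operation
    for a in nums:             # outer loop runs n times
        for b in nums:         # inner loop runs n times per outer iteration
            if a < b:          # comparison executed n * n times
                count += 1
    return count               # 1 operation
# Total ≈ n² + 2 → drop constants and lower-order terms → O(n²)
```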

Simplification Rules

  • Drop constants: O(2n) → O(n)
  • Drop lower-order terms: O(n² + n) → O(n²)
  • Keep only the dominant term: O(n + log n) → O(n)

Common Algorithms and Their Complexities

Sorting Algorithms

| Algorithm      | Time Complexity (Best) | Time Complexity (Average) | Time Complexity (Worst) | Space Complexity | Stable |
|----------------|------------------------|---------------------------|-------------------------|------------------|--------|
| Bubble Sort    | O(n)                   | O(n²)                     | O(n²)                   | O(1)             | Yes    |
| Selection Sort | O(n²)                  | O(n²)                     | O(n²)                   | O(1)             | No     |
| Insertion Sort | O(n)                   | O(n²)                     | O(n²)                   | O(1)             | Yes    |
| Merge Sort     | O(n log n)             | O(n log n)                | O(n log n)              | O(n)             | Yes    |
| Quick Sort     | O(n log n)             | O(n log n)                | O(n²)                   | O(log n)         | No     |
| Heap Sort      | O(n log n)             | O(n log n)                | O(n log n)              | O(1)             | No     |
| Counting Sort  | O(n+k)                 | O(n+k)                    | O(n+k)                  | O(n+k)           | Yes    |
| Radix Sort     | O(nk)                  | O(nk)                     | O(nk)                   | O(n+k)           | Yes    |
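
As one illustrative sketch from the table, a textbook merge sort: log n levels of splitting with O(n) merge work per level gives O(n log n) time, and the merged output accounts for the O(n) space:

```python
def merge_sort(nums):
    # Base case: a list of 0 or 1 elements is already sorted
    if len(nums) <= 1:
        return nums
    mid = len(nums) // 2
    left = merge_sort(nums[:mid])
    right = merge_sort(nums[mid:])
    # Merge the two sorted halves; <= keeps equal elements in their
    # original relative order, which is what makes the sort stable
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```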

Searching Algorithms

| Algorithm            | Time Complexity (Average) | Time Complexity (Worst) | Space Complexity |
|----------------------|---------------------------|-------------------------|------------------|
| Linear Search        | O(n)                      | O(n)                    | O(1)             |
| Binary Search        | O(log n)                  | O(log n)                | O(1)             |
| Hash Table Search    | O(1)                      | O(n)                    | O(n)             |
| Depth-First Search   | O(V+E)                    | O(V+E)                  | O(V)             |
| Breadth-First Search | O(V+E)                    | O(V+E)                  | O(V)             |
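
Binary search is worth sketching, since it is the canonical O(log n) algorithm: each comparison halves the remaining search range (the input must already be sorted):

```python
def binary_search(sorted_nums, target):
    # O(log n): each iteration halves the remaining range [lo, hi]
    lo, hi = 0, len(sorted_nums) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_nums[mid] == target:
            return mid
        if sorted_nums[mid] < target:
            lo = mid + 1   # target is in the upper half
        else:
            hi = mid - 1   # target is in the lower half
    return -1              # not found
```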

Data Structure Operations

| Data Structure     | Access     | Search     | Insertion  | Deletion   |
|--------------------|------------|------------|------------|------------|
| Array              | O(1)       | O(n)       | O(n)       | O(n)       |
| Linked List        | O(n)       | O(n)       | O(1)       | O(1)       |
| Hash Table         | N/A        | O(1)*      | O(1)*      | O(1)*      |
| Binary Search Tree | O(log n)*  | O(log n)*  | O(log n)*  | O(log n)*  |
| AVL Tree           | O(log n)   | O(log n)   | O(log n)   | O(log n)   |
| B-Tree             | O(log n)   | O(log n)   | O(log n)   | O(log n)   |
| Heap               | O(n)       | O(n)       | O(log n)   | O(log n)   |

\* Average case; the worst case may differ.
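
The hash table row is the one that matters most in everyday code. A small illustrative sketch: checking membership in a Python list scans element by element (O(n) per lookup), while a set hashes the key and checks its bucket (O(1) on average):

```python
items = list(range(100_000))
as_set = set(items)            # building the hash table once is O(n)

# List membership scans elements one by one: O(n) per lookup
found_in_list = 99_999 in items
# Set membership hashes the key and checks its bucket: O(1) on average
found_in_set = 99_999 in as_set
```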

Common Algorithmic Techniques

Divide and Conquer

  1. Pattern: Break problem into smaller subproblems, solve each, then combine results
  2. Examples: Merge sort, quick sort, binary search
  3. Typical Complexity: O(n log n)
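
The pattern in miniature (a deliberately simple, hypothetical example — finding the maximum by splitting the list, solving each half, and combining with one comparison):

```python
def dc_max(nums):
    # Divide and conquer: split the problem, solve each half
    # recursively, then combine the sub-results
    if len(nums) == 1:
        return nums[0]           # base case: trivially solved
    mid = len(nums) // 2
    left_best = dc_max(nums[:mid])
    right_best = dc_max(nums[mid:])
    return max(left_best, right_best)   # combine step
```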

Dynamic Programming

  1. Pattern: Break problem into overlapping subproblems, store solutions for reuse
  2. Examples: Fibonacci, knapsack problem, shortest path
  3. Typical Complexity: Varies, but improves naive recursive solutions
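
The Fibonacci example shows the improvement clearly: storing each subproblem's answer turns the naive O(2ⁿ) recursion into O(n):

```python
def fib(n, memo=None):
    # Dynamic programming via memoization: each subproblem is solved
    # once and stored, so the overall cost drops from O(2ⁿ) to O(n)
    if memo is None:
        memo = {}
    if n < 2:
        return n
    if n not in memo:
        memo[n] = fib(n - 1, memo) + fib(n - 2, memo)
    return memo[n]
```

Without the memo, `fib(50)` would make on the order of 2⁵⁰ calls; with it, about 50.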

Greedy Algorithms

  1. Pattern: Make locally optimal choice at each step
  2. Examples: Dijkstra’s algorithm, Huffman coding
  3. Typical Complexity: Varies, often O(n log n)
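
A compact greedy example (interval scheduling, a standard illustration rather than one of the algorithms named above): sorting by finish time is the O(n log n) step, and the locally optimal choice — always take the earliest-finishing compatible interval — happens to be globally optimal here:

```python
def max_non_overlapping(intervals):
    # Greedy: sort by finish time (O(n log n)), then repeatedly take
    # the earliest-finishing interval that doesn't overlap the last pick
    last_end = float("-inf")
    count = 0
    for start, end in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_end:
            count += 1
            last_end = end
    return count
```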

Backtracking

  1. Pattern: Build solutions incrementally, abandoning paths that fail constraints
  2. Examples: N-Queens, Sudoku solver, graph coloring
  3. Typical Complexity: Often exponential O(2ⁿ) or worse
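
A sketch of the N-Queens pattern: place one queen per row, and abandon any branch as soon as a placement attacks an earlier queen:

```python
def count_n_queens(n):
    # Backtracking: build a solution row by row, pruning any branch
    # where the new queen shares a column or diagonal with an earlier one.
    # Worst-case work is exponential in n
    def place(row, cols, diag1, diag2):
        if row == n:
            return 1  # all n queens placed: one complete solution
        total = 0
        for col in range(n):
            if col in cols or (row - col) in diag1 or (row + col) in diag2:
                continue  # attacked: prune this branch
            total += place(row + 1, cols | {col},
                           diag1 | {row - col}, diag2 | {row + col})
        return total
    return place(0, set(), set(), set())
```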

Common Challenges and Solutions

Challenge: Nested Loops

Problem: Nested loops often lead to quadratic complexity.

Solutions:

  • Use a hash table to reduce from O(n²) to O(n)
  • Consider divide and conquer approach
  • Pre-process data to allow faster lookup
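
The classic instance of the first solution is the two-sum check: the nested-loop version compares every pair (O(n²)), while a hash set remembers what has been seen so far (O(n)):

```python
def has_pair_sum_slow(nums, target):
    # O(n²): compares every pair with two nested loops
    for i in range(len(nums)):
        for j in range(i + 1, len(nums)):
            if nums[i] + nums[j] == target:
                return True
    return False

def has_pair_sum_fast(nums, target):
    # O(n): one pass, trading O(n) extra memory for the inner loop
    seen = set()
    for x in nums:
        if target - x in seen:
            return True
        seen.add(x)
    return False
```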

Challenge: Recursive Calls

Problem: Naive recursion can lead to exponential complexity.

Solutions:

  • Use memoization to store already computed results
  • Convert recursive solutions to iterative when possible
  • Use tail recursion when supported by language
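
The first two solutions, sketched on Fibonacci: Python's built-in `functools.lru_cache` handles the memoization, and the iterative version removes recursion entirely:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_memo(n):
    # Memoized recursion: O(n) calls instead of the naive O(2ⁿ)
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

def fib_iter(n):
    # Iterative conversion: O(n) time, O(1) space, no recursion depth limit
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```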

Challenge: Large Input Sets

Problem: Algorithms with higher complexity fail with large datasets.

Solutions:

  • Use sampling or approximation algorithms
  • Consider parallel processing when applicable
  • Apply data reduction techniques before processing

Best Practices for Algorithm Design

  • Start simple: Get a working solution before optimizing
  • Measure, don’t guess: Profile your code to find actual bottlenecks
  • Space-time tradeoff: Sometimes using more memory can drastically reduce time complexity
  • Consider the average case: Algorithms with worse worst-case behavior may perform better in practice
  • Select appropriate data structures: The right data structure can significantly improve performance
  • Know your input: Optimize for expected input patterns and sizes
  • Optimize hot paths: Focus optimization on frequently executed code
  • Use built-in functions: Language libraries are often highly optimized

Resources for Further Learning

Books

  • “Introduction to Algorithms” by Cormen, Leiserson, Rivest, and Stein
  • “Algorithms” by Robert Sedgewick and Kevin Wayne
  • “The Algorithm Design Manual” by Steven Skiena

Online Courses

  • MIT OpenCourseWare: Introduction to Algorithms
  • Coursera: Algorithms Specialization by Stanford
  • Udacity: Data Structures and Algorithms

Websites and Interactive Tools

  • LeetCode.com
  • HackerRank.com
  • VisuAlgo.net
  • BigOCheatSheet.com
  • GeeksForGeeks.org

YouTube Channels

  • MIT OpenCourseWare
  • Back To Back SWE
  • mycodeschool
  • CS Dojo

Quick Reference: Big O Complexity of Common Operations

Array Operations

  • Access by index: O(1)
  • Search unsorted: O(n)
  • Search sorted: O(log n)
  • Insert/remove at end: O(1)
  • Insert/remove at beginning/middle: O(n)

String Operations

  • Access by index: O(1)
  • Concatenation: O(n)
  • Substring: O(n)
  • String comparison: O(n)
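
Because strings are immutable in many languages, the O(n) concatenation cost compounds in loops. An illustrative Python sketch: repeated `+=` copies the growing string each time (O(n²) overall in the general case, though CPython sometimes optimizes it), while `str.join` copies each piece once (O(n) total):

```python
def concat_slow(parts):
    # Repeated += may copy the growing string on each iteration:
    # O(n²) in the general case
    s = ""
    for p in parts:
        s += p
    return s

def concat_fast(parts):
    # str.join measures the total length first and copies each
    # piece exactly once: O(n) total
    return "".join(parts)
```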

Graph Operations

  • Traverse: O(V+E)
  • Shortest path (Dijkstra’s): O(E log V)
  • Minimum spanning tree (Prim’s): O(E log V)
  • Minimum spanning tree (Kruskal’s): O(E log E)
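
The Dijkstra bound deserves a sketch, since the O(E log V) comes from the priority queue: each edge can push one heap entry, and each pop costs O(log V). This version assumes a hypothetical adjacency-list shape `{node: [(neighbor, weight), ...]}`:

```python
import heapq

def dijkstra(graph, source):
    # graph: {node: [(neighbor, weight), ...]} — assumed adjacency-list shape.
    # O(E log V) with a binary heap: each edge may push one heap entry
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale entry: a shorter path was already found
        for neighbor, weight in graph.get(node, []):
            nd = d + weight
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))
    return dist  # shortest known distance to every reachable node
```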