Ultimate Common Algorithms Cheat Sheet: Master Guide with Implementations & Complexity Analysis

Introduction: Understanding Algorithms

Algorithms are step-by-step procedures for solving problems or performing tasks, serving as the foundation of computer science and software development. They provide efficient solutions to computational problems by defining precise instructions that transform inputs into desired outputs. Mastering key algorithms is essential for writing efficient code, optimizing performance, solving complex problems, acing technical interviews, and understanding the theoretical underpinnings of computer science.

Core Algorithm Concepts

  • Time Complexity: How runtime grows relative to input size (Big O notation)
  • Space Complexity: How memory usage grows relative to input size
  • Algorithm Efficiency: Balance between time, space, and implementation simplicity
  • Correctness: Algorithm must produce correct output for all valid inputs
  • Determinism: Same input always produces same output
  • Optimality: Achieving best possible performance for a specific problem
  • Scalability: Performance behavior as input sizes grow very large

Common Complexity Classes

| Notation | Name | Example Algorithms | Real-world Interpretation |
| --- | --- | --- | --- |
| O(1) | Constant | Hash table lookup, array access | Instant regardless of size |
| O(log n) | Logarithmic | Binary search, balanced BST operations | Doubling input adds constant time |
| O(n) | Linear | Linear search, array traversal | Proportional to input size |
| O(n log n) | Linearithmic | Efficient sorting (merge, heap, quick) | Slightly worse than linear |
| O(n²) | Quadratic | Simple sorting (bubble, insertion) | Practical for small inputs only |
| O(n³) | Cubic | Simple matrix multiplication | Very limited practical use |
| O(2ⁿ) | Exponential | Recursive Fibonacci, brute force | Practical only for tiny inputs |
| O(n!) | Factorial | Brute force TSP, permutations | Practical only for minimal inputs |

Sorting Algorithms

Quick Reference Table

| Algorithm | Best Case | Average Case | Worst Case | Space | Stable | Method | Best Use Case |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes | Divide & Conquer | When stability matters and O(n) extra space is acceptable |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | No | Divide & Conquer | General purpose, when average performance matters most |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) | No | Selection | When space is at a premium and stability not required |
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) | Yes | Insertion | Small arrays or nearly sorted data |
| Bubble Sort | O(n) | O(n²) | O(n²) | O(1) | Yes | Exchange | Educational purposes, tiny arrays |
| Selection Sort | O(n²) | O(n²) | O(n²) | O(1) | No | Selection | Simple implementation, minimal memory |
| Counting Sort | O(n+k) | O(n+k) | O(n+k) | O(n+k) | Yes | Counting | Small range of integers |
| Radix Sort | O(nk) | O(nk) | O(nk) | O(n+k) | Yes | Digit by digit | Integers or strings with fixed length |
| Bucket Sort | O(n+k) | O(n+k) | O(n²) | O(n+k) | Yes | Distribution | Uniformly distributed data |

Step-by-Step Implementations

Merge Sort

function mergeSort(arr)
    if length of arr ≤ 1
        return arr
    
    mid = length of arr / 2
    left = mergeSort(first half of arr)
    right = mergeSort(second half of arr)
    
    return merge(left, right)

function merge(left, right)
    result = empty array
    
    while left and right both have elements
        if first element of left ≤ first element of right
            append first element of left to result
            remove first element from left
        else
            append first element of right to result
            remove first element from right
    
    append remaining elements of left to result
    append remaining elements of right to result
    
    return result
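
As a runnable sketch, the pseudocode above maps directly to Python. Using index cursors rather than repeatedly removing the first element keeps the merge linear:

```python
def merge_sort(arr):
    """Recursively split, then merge sorted halves (stable, O(n log n))."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    return merge(merge_sort(arr[:mid]), merge_sort(arr[mid:]))

def merge(left, right):
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:       # <= preserves order of equal keys (stability)
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])           # at most one of these has leftovers
    result.extend(right[j:])
    return result
```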

Quick Sort

function quickSort(arr, low, high)
    if low < high
        pivot_index = partition(arr, low, high)
        quickSort(arr, low, pivot_index - 1)
        quickSort(arr, pivot_index + 1, high)

function partition(arr, low, high)
    pivot = arr[high]
    i = low - 1
    
    for j = low to high - 1
        if arr[j] ≤ pivot
            i = i + 1
            swap arr[i] and arr[j]
    
    swap arr[i + 1] and arr[high]
    return i + 1
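
In Python, the Lomuto partition scheme shown above might look like this, sorting in place (the `high=None` default is an illustrative convenience, not part of the pseudocode):

```python
def quick_sort(arr, low=0, high=None):
    if high is None:
        high = len(arr) - 1
    if low < high:
        p = partition(arr, low, high)
        quick_sort(arr, low, p - 1)   # recurse on each side of the pivot
        quick_sort(arr, p + 1, high)

def partition(arr, low, high):
    pivot = arr[high]                 # Lomuto scheme: last element as pivot
    i = low - 1
    for j in range(low, high):
        if arr[j] <= pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1
```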

Heap Sort

function heapSort(arr)
    n = length of arr
    
    // Build max heap
    for i = n/2 - 1 down to 0
        heapify(arr, n, i)
    
    // Extract elements one by one
    for i = n - 1 down to 1
        swap arr[0] and arr[i]
        heapify(arr, i, 0)

function heapify(arr, n, i)
    largest = i
    left = 2*i + 1
    right = 2*i + 2
    
    if left < n and arr[left] > arr[largest]
        largest = left
    
    if right < n and arr[right] > arr[largest]
        largest = right
    
    if largest ≠ i
        swap arr[i] and arr[largest]
        heapify(arr, n, largest)
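
A Python sketch of the same max-heap approach, sorting in place:

```python
def heap_sort(arr):
    n = len(arr)
    for i in range(n // 2 - 1, -1, -1):   # build max heap bottom-up
        heapify(arr, n, i)
    for i in range(n - 1, 0, -1):         # repeatedly move the max to the end
        arr[0], arr[i] = arr[i], arr[0]
        heapify(arr, i, 0)                # restore heap over the shrunken prefix

def heapify(arr, n, i):
    """Sift arr[i] down until the subtree rooted at i is a max heap."""
    largest = i
    left, right = 2 * i + 1, 2 * i + 2
    if left < n and arr[left] > arr[largest]:
        largest = left
    if right < n and arr[right] > arr[largest]:
        largest = right
    if largest != i:
        arr[i], arr[largest] = arr[largest], arr[i]
        heapify(arr, n, largest)
```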

Searching Algorithms

Quick Reference Table

| Algorithm | Best Case | Average Case | Worst Case | Space | Requirements | Best Use Case |
| --- | --- | --- | --- | --- | --- | --- |
| Linear Search | O(1) | O(n) | O(n) | O(1) | None | Unsorted data, small arrays |
| Binary Search | O(1) | O(log n) | O(log n) | O(1) | Sorted array | Sorted data, efficient search |
| Hash Table Lookup | O(1) | O(1) | O(n) | O(n) | Hash function | Fast lookups, insertions, deletions |
| Binary Search Tree | O(log n) | O(log n) | O(n) | O(n) | Balanced tree | Sorted dynamic data |
| Breadth-First Search | O(V+E) | O(V+E) | O(V+E) | O(V) | Graph structure | Shortest path in unweighted graph |
| Depth-First Search | O(V+E) | O(V+E) | O(V+E) | O(V) | Graph structure | Maze solving, connectivity |

Step-by-Step Implementations

Binary Search

function binarySearch(arr, target)
    left = 0
    right = length of arr - 1
    
    while left ≤ right
        mid = floor((left + right) / 2)
        
        if arr[mid] == target
            return mid
        else if arr[mid] < target
            left = mid + 1
        else
            right = mid - 1
    
    return -1  // Target not found
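
As a runnable Python sketch (Python's `//` gives the floor directly; in fixed-width languages, `left + (right - left) / 2` avoids integer overflow):

```python
def binary_search(arr, target):
    """Return the index of target in sorted arr, or -1 if absent."""
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = (left + right) // 2     # floor division; Python ints don't overflow
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1            # target can only be in the right half
        else:
            right = mid - 1           # target can only be in the left half
    return -1
```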

Breadth-First Search (BFS)

function BFS(graph, start)
    queue = new Queue()
    visited = boolean array (one entry per vertex) initialized to false
    
    visited[start] = true
    queue.enqueue(start)
    
    while queue is not empty
        vertex = queue.dequeue()
        process vertex
        
        for each neighbor of vertex
            if not visited[neighbor]
                visited[neighbor] = true
                queue.enqueue(neighbor)
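
Assuming the graph is represented as an adjacency dict (vertex → list of neighbors), a Python version might be:

```python
from collections import deque

def bfs(graph, start):
    """Return vertices in breadth-first order from start."""
    visited = {start}
    order = []
    queue = deque([start])
    while queue:
        vertex = queue.popleft()      # deque gives O(1) dequeue
        order.append(vertex)          # "process vertex"
        for neighbor in graph[vertex]:
            if neighbor not in visited:
                visited.add(neighbor)  # mark when enqueued, not when dequeued
                queue.append(neighbor)
    return order
```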

Depth-First Search (DFS)

function DFS(graph, start)
    visited = boolean array (one entry per vertex) initialized to false
    DFS_util(graph, start, visited)

function DFS_util(graph, vertex, visited)
    visited[vertex] = true
    process vertex
    
    for each neighbor of vertex
        if not visited[neighbor]
            DFS_util(graph, neighbor, visited)
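
Under the same adjacency-dict assumption, the recursive DFS translates to:

```python
def dfs(graph, start, visited=None, order=None):
    """Return vertices in depth-first (preorder) order from start."""
    if visited is None:               # first call: create the shared state
        visited, order = set(), []
    visited.add(start)
    order.append(start)               # "process vertex"
    for neighbor in graph[start]:
        if neighbor not in visited:
            dfs(graph, neighbor, visited, order)
    return order
```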

Graph Algorithms

Quick Reference Table

| Algorithm | Time Complexity | Space Complexity | Purpose | Key Characteristics |
| --- | --- | --- | --- | --- |
| Breadth-First Search | O(V+E) | O(V) | Shortest path (unweighted) | Level by level traversal |
| Depth-First Search | O(V+E) | O(V) | Graph traversal, cycle detection | Path exploration |
| Dijkstra’s Algorithm | O(E log V) | O(V) | Shortest path (positive weights) | Greedy approach |
| Bellman-Ford | O(V×E) | O(V) | Shortest path (any weights) | Handles negative edges |
| Floyd-Warshall | O(V³) | O(V²) | All-pairs shortest paths | Dynamic programming |
| Prim’s Algorithm | O(E log V) | O(V) | Minimum spanning tree | Greedy, grows single tree |
| Kruskal’s Algorithm | O(E log E) | O(V) | Minimum spanning tree | Greedy, connects forests |
| Topological Sort | O(V+E) | O(V) | Ordering of directed graph | For DAGs only |
| Strongly Connected Components | O(V+E) | O(V) | Finding connected subgraphs | Kosaraju’s or Tarjan’s |

Step-by-Step Implementations

Dijkstra’s Algorithm

function dijkstra(graph, start)
    distances = array of size |V| initialized to ∞
    distances[start] = 0
    priority_queue = empty min heap
    priority_queue.insert(start, 0)
    
    while priority_queue is not empty
        u = priority_queue.extractMin()
        // re-inserting v below can leave stale entries in the queue;
        // u may be skipped when its extracted key exceeds distances[u]
        
        for each neighbor v of u
            if distances[u] + weight(u,v) < distances[v]
                distances[v] = distances[u] + weight(u,v)
                priority_queue.insert(v, distances[v])
    
    return distances
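
Using Python's `heapq` as the min-heap, and assuming the graph is a dict mapping each vertex to `(neighbor, weight)` pairs, a sketch with the stale-entry skip looks like this:

```python
import heapq

def dijkstra(graph, start):
    """Single-source shortest paths; all edge weights must be non-negative."""
    distances = {v: float('inf') for v in graph}
    distances[start] = 0
    pq = [(0, start)]                 # (distance, vertex) pairs
    while pq:
        d, u = heapq.heappop(pq)
        if d > distances[u]:          # stale entry from an earlier insert; skip
            continue
        for v, w in graph[u]:
            if d + w < distances[v]:  # relaxation step
                distances[v] = d + w
                heapq.heappush(pq, (distances[v], v))
    return distances
```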

Kruskal’s Algorithm

function kruskal(graph)
    mst = empty list
    sort edges of graph by weight in non-decreasing order
    disjoint_set = initialize disjoint set for all vertices
    
    for each edge (u,v) in sorted edges
        if disjoint_set.find(u) ≠ disjoint_set.find(v)
            mst.add(edge)
            disjoint_set.union(u, v)
    
    return mst
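
A Python sketch with a minimal disjoint set; the `(weight, u, v)` edge tuples and integer vertex labels are illustrative assumptions:

```python
def kruskal(num_vertices, edges):
    """edges: list of (weight, u, v); vertices labeled 0..num_vertices-1."""
    parent = list(range(num_vertices))

    def find(x):                      # find root with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):     # non-decreasing weight order
        ru, rv = find(u), find(v)
        if ru != rv:                  # edge joins two different trees
            parent[ru] = rv           # union
            mst.append((w, u, v))
    return mst
```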

Topological Sort

function topologicalSort(graph)
    visited = array of size |V| initialized to false
    stack = empty stack
    
    for each vertex in graph
        if not visited[vertex]
            topologicalSortUtil(graph, vertex, visited, stack)
    
    return contents of stack in reverse order

function topologicalSortUtil(graph, vertex, visited, stack)
    visited[vertex] = true
    
    for each neighbor of vertex
        if not visited[neighbor]
            topologicalSortUtil(graph, neighbor, visited, stack)
    
    stack.push(vertex)
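
A Python sketch for a DAG stored as an adjacency dict; as in the pseudocode, each vertex is pushed only after all of its descendants:

```python
def topological_sort(graph):
    """Return one valid topological order of a DAG (adjacency dict)."""
    visited = set()
    stack = []

    def visit(vertex):
        visited.add(vertex)
        for neighbor in graph[vertex]:
            if neighbor not in visited:
                visit(neighbor)
        stack.append(vertex)          # pushed after all descendants

    for vertex in graph:
        if vertex not in visited:
            visit(vertex)
    return stack[::-1]                # reverse of finish order
```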

Dynamic Programming Algorithms

Quick Reference Table

| Problem | Time Complexity | Space Complexity | Pattern | Classic Example |
| --- | --- | --- | --- | --- |
| Fibonacci | O(n) | O(1) or O(n) | Bottom-up/memoization | Calculating Fibonacci numbers |
| 0/1 Knapsack | O(n×W) | O(n×W) | 2D table | Item selection with weight constraint |
| Longest Common Subsequence | O(m×n) | O(m×n) | 2D table | Finding common sequence in strings |
| Edit Distance | O(m×n) | O(m×n) | 2D table | Minimum operations to transform string |
| Matrix Chain Multiplication | O(n³) | O(n²) | Interval DP | Optimally ordering matrix multiplications |
| Longest Increasing Subsequence | O(n²) or O(n log n) | O(n) | 1D table | Finding increasing sequence in array |
| Coin Change | O(n×amount) | O(amount) | 1D table | Ways to make change with coins |
| Subset Sum | O(n×sum) | O(sum) | 2D table | Finding subset that sums to target |

Step-by-Step Implementations

Fibonacci with Dynamic Programming

function fibonacci(n)
    if n ≤ 1
        return n
    
    dp = array of size n+1
    dp[0] = 0
    dp[1] = 1
    
    for i = 2 to n
        dp[i] = dp[i-1] + dp[i-2]
    
    return dp[n]
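
Since each step only needs the previous two values, the table can be collapsed to O(1) space, as the quick-reference table notes:

```python
def fibonacci(n):
    """Bottom-up Fibonacci keeping only the last two values."""
    if n <= 1:
        return n
    prev, curr = 0, 1
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr   # slide the two-value window forward
    return curr
```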

0/1 Knapsack Problem

function knapsack(values, weights, capacity)
    n = length of values
    dp = 2D array of size (n+1) × (capacity+1) initialized to 0
    
    for i = 1 to n
        for w = 1 to capacity
            if weights[i-1] ≤ w
                dp[i][w] = max(values[i-1] + dp[i-1][w-weights[i-1]], dp[i-1][w])
            else
                dp[i][w] = dp[i-1][w]
    
    return dp[n][capacity]
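
The same 2D-table recurrence in Python:

```python
def knapsack(values, weights, capacity):
    """Maximum total value within the weight capacity (0/1 knapsack)."""
    n = len(values)
    dp = [[0] * (capacity + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for w in range(1, capacity + 1):
            if weights[i - 1] <= w:   # choose the better of taking or skipping item i-1
                dp[i][w] = max(values[i - 1] + dp[i - 1][w - weights[i - 1]],
                               dp[i - 1][w])
            else:
                dp[i][w] = dp[i - 1][w]
    return dp[n][capacity]
```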

Longest Common Subsequence

function LCS(X, Y)
    m = length of X
    n = length of Y
    dp = 2D array of size (m+1) × (n+1) initialized to 0
    
    for i = 1 to m
        for j = 1 to n
            if X[i-1] == Y[j-1]
                dp[i][j] = dp[i-1][j-1] + 1
            else
                dp[i][j] = max(dp[i-1][j], dp[i][j-1])
    
    return dp[m][n]
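
A direct Python translation of the LCS table:

```python
def lcs(X, Y):
    """Length of the longest common subsequence of X and Y."""
    m, n = len(X), len(Y)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if X[i - 1] == Y[j - 1]:             # characters match: extend
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:                                # otherwise drop one character
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]
```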

String Algorithms

Quick Reference Table

| Algorithm | Time Complexity | Space Complexity | Purpose | Key Characteristic |
| --- | --- | --- | --- | --- |
| Naive String Matching | O((n-m+1)×m) | O(1) | Pattern matching | Simple but inefficient |
| KMP Algorithm | O(n+m) | O(m) | Pattern matching | Uses prefix function |
| Rabin-Karp | O(n+m) avg, O((n-m+1)×m) worst | O(1) | Pattern matching | Uses hashing |
| Boyer-Moore | O(n×m) worst, O(n/m) best | O(m+k) | Pattern matching | Bad character heuristic |
| Trie | O(m) for operations | O(ALPHABET_SIZE×LENGTH×WORDS) | Prefix search | Tree structure for strings |
| Suffix Array | O(n log² n) construction | O(n) | String analysis | Sorted array of suffixes |
| Longest Palindromic Substring | O(n²) | O(1) or O(n²) | Finding palindromes | Expansion around centers |

Step-by-Step Implementations

KMP (Knuth-Morris-Pratt) Algorithm

function KMP_search(text, pattern)
    n = length of text
    m = length of pattern
    lps = computeLPSArray(pattern)
    
    i = 0  // text index
    j = 0  // pattern index
    
    while i < n
        if pattern[j] == text[i]
            i = i + 1
            j = j + 1
        
        if j == m
            // Found pattern at index i-j
            j = lps[j-1]
        else if i < n and pattern[j] != text[i]
            if j != 0
                j = lps[j-1]
            else
                i = i + 1

function computeLPSArray(pattern)
    m = length of pattern
    lps = array of size m
    lps[0] = 0
    
    len = 0
    i = 1
    
    while i < m
        if pattern[i] == pattern[len]
            len = len + 1
            lps[i] = len
            i = i + 1
        else
            if len != 0
                len = lps[len-1]
            else
                lps[i] = 0
                i = i + 1
    
    return lps
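
A Python version that collects all match positions; the key point, as in the pseudocode, is that the text index `i` never moves backward:

```python
def kmp_search(text, pattern):
    """Return the start index of every occurrence of pattern in text."""
    n, m = len(text), len(pattern)
    lps = compute_lps(pattern)
    matches = []
    i = j = 0
    while i < n:
        if text[i] == pattern[j]:
            i += 1
            j += 1
            if j == m:
                matches.append(i - j)   # full match ends at i-1
                j = lps[j - 1]          # continue looking for overlapping matches
        elif j > 0:
            j = lps[j - 1]              # fall back in the pattern, keep i fixed
        else:
            i += 1
    return matches

def compute_lps(pattern):
    """lps[i] = length of the longest proper prefix of pattern[:i+1]
    that is also a suffix of it."""
    lps = [0] * len(pattern)
    length = 0
    i = 1
    while i < len(pattern):
        if pattern[i] == pattern[length]:
            length += 1
            lps[i] = length
            i += 1
        elif length > 0:
            length = lps[length - 1]    # try the next-shorter border
        else:
            lps[i] = 0
            i += 1
    return lps
```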

Trie Implementation

class TrieNode
    children = array of size ALPHABET_SIZE initialized to null
    is_end_of_word = false

function insert(root, key)
    node = root
    
    for i = 0 to length of key - 1
        index = key[i] - 'a'  // Assuming lowercase alphabet
        
        if node.children[index] is null
            node.children[index] = new TrieNode()
        
        node = node.children[index]
    
    node.is_end_of_word = true

function search(root, key)
    node = root
    
    for i = 0 to length of key - 1
        index = key[i] - 'a'  // Assuming lowercase alphabet
        
        if node.children[index] is null
            return false
        
        node = node.children[index]
    
    return node.is_end_of_word
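
A Python sketch using a dict of children instead of a fixed-size array, which avoids committing to a particular alphabet:

```python
class TrieNode:
    def __init__(self):
        self.children = {}            # char -> TrieNode; sparse, any alphabet
        self.is_end_of_word = False

class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, key):
        node = self.root
        for ch in key:                # create missing nodes along the path
            node = node.children.setdefault(ch, TrieNode())
        node.is_end_of_word = True

    def search(self, key):
        node = self.root
        for ch in key:
            if ch not in node.children:
                return False
            node = node.children[ch]
        return node.is_end_of_word    # a prefix alone is not a stored word
```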

Data Structures and Their Algorithms

Quick Reference Table

| Data Structure | Access | Search | Insertion | Deletion | Key Characteristics |
| --- | --- | --- | --- | --- | --- |
| Array | O(1) | O(n) | O(n) | O(n) | Contiguous memory, random access |
| Linked List | O(n) | O(n) | O(1) | O(1) | Dynamic size, efficient insertions/deletions |
| Stack | O(n) | O(n) | O(1) | O(1) | LIFO principle, push/pop operations |
| Queue | O(n) | O(n) | O(1) | O(1) | FIFO principle, enqueue/dequeue operations |
| Hash Table | N/A | O(1) avg | O(1) avg | O(1) avg | Key-value mapping, fast lookups |
| Binary Search Tree | O(log n) avg | O(log n) avg | O(log n) avg | O(log n) avg | Ordered operations, hierarchical |
| Heap | O(1) for max/min | O(n) | O(log n) | O(log n) | Priority queue, efficient max/min access |
| Graph | O(1) | O(V+E) | O(1) | O(V+E) | Relationships, connectivity |
| Trie | O(m) | O(m) | O(m) | O(m) | String operations, prefix matching |

Common Data Structure Operations

Binary Search Tree Operations

function insert(root, key)
    if root is null
        return new Node(key)
    
    if key < root.key
        root.left = insert(root.left, key)
    else if key > root.key
        root.right = insert(root.right, key)
    
    return root

function search(root, key)
    if root is null or root.key == key
        return root
    
    if root.key < key
        return search(root.right, key)
    
    return search(root.left, key)

function delete(root, key)
    if root is null
        return root
    
    if key < root.key
        root.left = delete(root.left, key)
    else if key > root.key
        root.right = delete(root.right, key)
    else
        // Node with only one child or no child
        if root.left is null
            return root.right
        else if root.right is null
            return root.left
        
        // Node with two children
        root.key = minValue(root.right)
        root.right = delete(root.right, root.key)
    
    return root

function minValue(node)
    current = node
    
    while current.left is not null
        current = current.left
    
    return current.key
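
The insert and search operations translate to Python as below (deletion is omitted here for brevity; it follows the pseudocode above):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = self.right = None

def bst_insert(root, key):
    """Insert key and return the (possibly new) subtree root."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = bst_insert(root.left, key)
    elif key > root.key:
        root.right = bst_insert(root.right, key)
    return root                       # duplicates are silently ignored

def bst_search(root, key):
    """Return the node holding key, or None if absent."""
    if root is None or root.key == key:
        return root
    if key > root.key:
        return bst_search(root.right, key)
    return bst_search(root.left, key)
```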

Heap Operations

function insert(heap, key)
    heap.push(key)
    n = length of heap - 1
    heapifyUp(heap, n)

function extractMax(heap)
    if heap is empty
        return error
    
    max = heap[0]
    heap[0] = heap[length of heap - 1]
    heap.pop()
    heapifyDown(heap, 0)
    
    return max

function heapifyUp(heap, i)
    parent = floor((i - 1) / 2)
    
    if parent >= 0 and heap[parent] < heap[i]
        swap heap[i] and heap[parent]
        heapifyUp(heap, parent)

function heapifyDown(heap, i)
    n = length of heap
    largest = i
    left = 2*i + 1
    right = 2*i + 2
    
    if left < n and heap[left] > heap[largest]
        largest = left
    
    if right < n and heap[right] > heap[largest]
        largest = right
    
    if largest != i
        swap heap[i] and heap[largest]
        heapifyDown(heap, largest)
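
In practice, most standard libraries ship a binary heap; Python's `heapq` is a min-heap, so the max-heap behavior above can be simulated by negating keys:

```python
import heapq

def max_heap_drain(nums):
    """Build a max heap from nums, then repeatedly extractMax."""
    heap = [-x for x in nums]         # negate keys: min-heap acts as max-heap
    heapq.heapify(heap)               # O(n) bottom-up build, like the pseudocode
    return [-heapq.heappop(heap) for _ in range(len(heap))]
```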

Advanced Algorithm Paradigms

Divide and Conquer

  • Key Principle: Break problem into smaller subproblems, solve recursively, combine solutions
  • Common Examples: Merge Sort, Quick Sort, Binary Search, Strassen’s Matrix Multiplication
  • General Pattern:
    function divideAndConquer(problem)
        if problem is small enough
            solve directly
        else
            divide problem into smaller subproblems
            recursively solve each subproblem
            combine solutions to subproblems
    

Greedy Algorithms

  • Key Principle: Make locally optimal choice at each step to find global optimum
  • Common Examples: Huffman Coding, Dijkstra’s Algorithm, Fractional Knapsack, Minimum Spanning Trees
  • When Applicable: Problems with “greedy choice property” and “optimal substructure”
  • Limitation: Does not always yield optimal solution for all problems

Dynamic Programming

  • Key Principle: Break problem into overlapping subproblems, solve each once and store results
  • Common Patterns:
    • Memoization (top-down)
    • Tabulation (bottom-up)
  • Steps:
    1. Define subproblems
    2. Find recurrence relation
    3. Determine computation order
    4. Add base cases
    5. Optimize space if needed

Backtracking

  • Key Principle: Build solutions incrementally, abandoning a path when it fails to satisfy constraints
  • Common Examples: N-Queens, Sudoku Solver, Hamiltonian Path, Graph Coloring
  • General Pattern:
    function backtrack(partial_solution)
        if partial_solution is complete
            return partial_solution
        
        for each possible next element
            if element is valid
                add element to partial_solution
                result = backtrack(partial_solution)
                if result is a solution
                    return result
                remove element from partial_solution
        
        return no solution
    
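
The pattern above can be sketched with the classic N-Queens example: place one queen per row, prune any column or diagonal conflict, and undo the placement when a branch dead-ends:

```python
def solve_n_queens(n):
    """Count placements of n mutually non-attacking queens on an n×n board."""
    count = 0
    cols, diag1, diag2 = set(), set(), set()   # occupied columns and diagonals

    def place(row):
        nonlocal count
        if row == n:                  # partial solution is complete
            count += 1
            return
        for col in range(n):
            if col in cols or row + col in diag1 or row - col in diag2:
                continue              # constraint violated: prune this branch
            cols.add(col); diag1.add(row + col); diag2.add(row - col)
            place(row + 1)            # extend the partial solution
            cols.discard(col); diag1.discard(row + col); diag2.discard(row - col)

    place(0)
    return count
```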

Common Challenges and Solutions

Challenge: Choosing the Right Algorithm

  • Solution: Analyze the problem constraints (time/space limits)
  • Solution: Consider input size and expected growth patterns
  • Solution: Compare algorithm trade-offs for specific use cases

Challenge: Optimizing Performance

  • Solution: Identify and eliminate bottlenecks
  • Solution: Use appropriate data structures
  • Solution: Consider space-time tradeoffs (e.g., caching results)

Challenge: Handling Edge Cases

  • Solution: Test with boundary values (empty input, single element, etc.)
  • Solution: Consider extreme cases (very large inputs, degenerate inputs)
  • Solution: Validate inputs and add error handling

Challenge: Designing Efficient Algorithms

  • Solution: Start with a naive solution, then optimize
  • Solution: Look for mathematical insights or patterns
  • Solution: Apply known algorithmic paradigms (divide and conquer, dynamic programming, etc.)

Challenge: Debugging Complex Algorithms

  • Solution: Test with small, manageable inputs
  • Solution: Trace the execution step by step
  • Solution: Use visualization tools to understand algorithm behavior

Best Practices for Algorithm Implementation

  • Start simple: Implement a basic working solution before optimizing
  • Test thoroughly: Use a variety of inputs, including edge cases
  • Measure performance: Profile code to identify bottlenecks
  • Comment and document: Explain algorithm approach and complexity
  • Modularize code: Break complex algorithms into smaller functions
  • Use descriptive names: Make code self-documenting
  • Consider readability: Balance optimization with maintainability
  • Reuse standard algorithms: Don’t reinvent the wheel for solved problems
  • Validate inputs: Handle invalid inputs gracefully
  • Learn algorithm patterns: Recognize and apply common problem-solving techniques

Algorithm Selection Guidelines

When to Use Sorting Algorithms

  • Quick Sort: General-purpose sorting, when average case matters most
  • Merge Sort: When stability is required or guaranteed O(n log n) needed
  • Heap Sort: When space is limited and stability not required
  • Insertion Sort: For nearly sorted data or small arrays
  • Counting/Radix Sort: For integers with limited range

When to Use Graph Algorithms

  • BFS: Shortest path in unweighted graphs, level-order traversal
  • DFS: Maze solving, cycle detection, topological sort
  • Dijkstra’s: Single-source shortest path with positive weights
  • Bellman-Ford: When negative edges exist
  • Floyd-Warshall: All-pairs shortest paths in dense graphs
  • MST Algorithms: Network design, clustering

When to Use Dynamic Programming

  • Optimization problems: Maximum/minimum value with constraints
  • Counting problems: Number of ways to achieve something
  • Problems with overlapping subproblems and optimal substructure
  • When recursive solutions are inefficient due to repeated calculations

Resources for Further Learning

Books

  • Introduction to Algorithms by Cormen, Leiserson, Rivest, and Stein
  • Algorithms by Robert Sedgewick and Kevin Wayne
  • The Algorithm Design Manual by Steven Skiena
  • Grokking Algorithms by Aditya Bhargava

Online Courses

  • MIT OpenCourseWare: Introduction to Algorithms
  • Stanford Algorithms Specialization on Coursera
  • Princeton’s Algorithms course on Coursera
  • Khan Academy: Computer Science Algorithms

Practice Platforms

  • LeetCode
  • HackerRank
  • Codeforces
  • TopCoder
  • Project Euler

Tools and Visualizations

  • VisuAlgo (visualgo.net)
  • Algorithm Visualizer (algorithm-visualizer.org)
  • Python’s algorithm libraries (collections, heapq, itertools)
  • Big-O Cheat Sheet (bigocheatsheet.com)

Remember that understanding the fundamentals of algorithms is more important than memorizing implementations. Focus on grasping the core principles, complexity analysis, and appropriate use cases for each algorithm to solve problems efficiently.
