Chapter 2

Master Theorem for Recurrences


Learning Objectives

By the end of this section, you will be able to:

  1. Understand what recurrence relations are and how they arise in divide-and-conquer algorithms
  2. Apply the Master Theorem to solve recurrences of the form T(n) = aT(n/b) + f(n)
  3. Identify which of the three cases applies by comparing f(n) to n^(log_b a)
  4. Visualize recurrence trees to build intuition about where work is done
  5. Analyze famous algorithms like merge sort, binary search, and Strassen's algorithm
  6. Recognize when the Master Theorem does not apply
Why This Matters: The Master Theorem is one of the most powerful tools in algorithm analysis. It lets you read off the complexity of most divide-and-conquer algorithms without a tedious derivation, which is essential for technical interviews and for making informed algorithm design decisions.

The Story: Divide and Conquer

Many of the most elegant and efficient algorithms follow a pattern called divide and conquer:

  1. Divide: Break the problem into smaller subproblems
  2. Conquer: Solve the subproblems recursively
  3. Combine: Merge the subproblem solutions into a solution for the original problem

Consider sorting an array of n elements. The naive approach compares all pairs, taking O(n²) time. But merge sort divides the array in half, recursively sorts each half, then merges them:

Step | Operation | Cost
Divide | Split array in half | O(1)
Conquer | Recursively sort left half (n/2 elements) | T(n/2)
Conquer | Recursively sort right half (n/2 elements) | T(n/2)
Combine | Merge two sorted halves | O(n)

This gives us a recurrence relation: T(n) = 2T(n/2) + O(n). But what does this equal? Is it faster than O(n²)?

The Master Theorem answers this question instantly: T(n) = Θ(n log n). Much faster than the naive O(n²) approach!

Historical Context

The Master Theorem was introduced by Jon Bentley, Dorothea Haken, and James Saxe in 1980. It unified and simplified the analysis of divide-and-conquer algorithms, which had previously required ad-hoc mathematical techniques for each algorithm.

Recurrence Relations

A recurrence relation expresses the running time of a recursive algorithm in terms of smaller inputs. For divide-and-conquer algorithms, recurrences have this general form:

T(n) = aT(n/b) + f(n)

Where:

  • a ≥ 1: Number of subproblems (recursive calls)
  • b > 1: Factor by which problem size shrinks
  • f(n): Cost of dividing and combining (non-recursive work)

Examples of Recurrences

Algorithm | Recurrence | a | b | f(n)
Binary Search | T(n) = T(n/2) + 1 | 1 | 2 | 1
Merge Sort | T(n) = 2T(n/2) + n | 2 | 2 | n
Quick Sort (avg) | T(n) = 2T(n/2) + n | 2 | 2 | n
Binary Tree Traversal | T(n) = 2T(n/2) + 1 | 2 | 2 | 1
Strassen's Matrix | T(n) = 7T(n/2) + n² | 7 | 2 | n²
Karatsuba Multiply | T(n) = 3T(n/2) + n | 3 | 2 | n

The key question: given a, b, and f(n), what is the total running time T(n)?
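For the common case f(n) = Θ(n^k), this comparison can be automated. Below is a small classifier sketch; the `master_theorem` helper and its output format are our own, not from any library:

```python
import math

def master_theorem(a, b, k):
    """Classify T(n) = a*T(n/b) + Theta(n^k) and return (case, solution)."""
    crit = math.log(a, b)  # the critical exponent log_b(a)
    if math.isclose(k, crit):
        return 2, f"Θ(n^{crit:.2f} log n)"
    if k < crit:
        return 1, f"Θ(n^{crit:.2f})"
    # Case 3 also needs the regularity condition, which any n^k satisfies.
    return 3, f"Θ(n^{k:.2f})"

print(master_theorem(2, 2, 1))  # Merge Sort    -> Case 2
print(master_theorem(7, 2, 2))  # Strassen      -> Case 1
print(master_theorem(1, 2, 0))  # Binary Search -> Case 2
```

The same three-way comparison is exactly what the theorem below formalizes.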


The Master Theorem

The Master Theorem provides a "cookbook" solution for recurrences of the form above. The answer depends on how f(n) compares to a critical value: n^(log_b a).

The Master Theorem

Let T(n) = aT(n/b) + f(n) where a ≥ 1 and b > 1. Then:

Case 1: If f(n) = O(n^(log_b a − ε)) for some ε > 0, then T(n) = Θ(n^(log_b a))

Case 2: If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) log n)

Case 3: If f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and af(n/b) ≤ cf(n) for some c < 1 and all sufficiently large n, then T(n) = Θ(f(n))

The Critical Value: n^(log_b a)

The magic number n^(log_b a) represents the cost of all the leaves in the recursion tree. Why? Because:

  • The tree has height log_b n (we divide by b at each level until reaching the base case)
  • At each level, the number of subproblems is multiplied by a
  • Total leaves = a^(log_b n) = n^(log_b a)

The Master Theorem compares the work at the leaves (n^(log_b a)) to the dividing-and-combining work (f(n)) to determine which dominates.
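The identity a^(log_b n) = n^(log_b a) is easy to sanity-check numerically (a throwaway sketch, choosing a = 4, b = 2):

```python
import math

a, b = 4, 2
for n in [16, 256, 4096]:
    leaves_via_height = a ** math.log(n, b)    # a^(log_b n): branching a over log_b n levels
    leaves_via_identity = n ** math.log(a, b)  # n^(log_b a): the "critical value" form
    assert math.isclose(leaves_via_height, leaves_via_identity)
    print(n, round(leaves_via_height))
```

For a = 4, b = 2 both expressions equal n², which is exactly the critical value in the Case 1 example below.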

Quick Check

For the recurrence T(n) = 4T(n/2) + n, what is n^(log_b a)?


Case 1: Recursion Dominates

When it applies: f(n) = O(n^(log_b a − ε)) for some ε > 0

Intuition: The work at each level increases as we go down the tree. Most work is done at the leaves. The recursion overwhelms the combining work.

Result: T(n) = Θ(n^(log_b a))

Example: T(n) = 4T(n/2) + n

Let's analyze step by step:

  1. Identify parameters: a = 4, b = 2, f(n) = n
  2. Calculate critical value: log₂ 4 = 2, so n^(log_b a) = n²
  3. Compare: f(n) = n vs n². Since n = O(n^(2−1)) = O(n¹), we have f(n) = O(n^(log_b a − ε)) with ε = 1
  4. Case 1 applies: T(n) = Θ(n²)

Why? We make 4 recursive calls on problems half the size. The branching factor (4) exceeds the shrinkage (2), so work explodes down the tree. The n work at each level is dwarfed by the n² leaves.
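We can confirm this by evaluating the recurrence exactly for powers of two (taking T(1) = 1, an assumption for illustration). The ratio T(n)/n² settles toward a constant, as Case 1 predicts:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """Exact value of T(n) = 4T(n/2) + n with T(1) = 1."""
    if n <= 1:
        return 1
    return 4 * T(n // 2) + n

for n in [2**4, 2**8, 2**12]:
    print(n, T(n) / n**2)  # the closed form here is 2n² − n, so the ratio tends to 2
```
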

Real Algorithm: Naive Divide-and-Conquer Multiplication

The recurrence T(n) = 4T(n/2) + n arises in the naive divide-and-conquer integer multiplication, which splits each number in half and computes four half-size products — the approach Karatsuba later improved to three. It appears whenever an algorithm creates too many subproblems relative to how much it shrinks them.

Case 2: Balanced Work

When it applies: f(n) = Θ(n^(log_b a))

Intuition: The work is equal at every level of the recursion tree. No level dominates; all contribute equally. Total work = (work per level) × (number of levels).

Result: T(n) = Θ(n^(log_b a) log n)

Example: T(n) = 2T(n/2) + n (Merge Sort)

This is the most famous Case 2 example:

  1. Parameters: a = 2, b = 2, f(n) = n
  2. Critical value: log₂ 2 = 1, so n^(log_b a) = n¹ = n
  3. Compare: f(n) = n = Θ(n). They match exactly!
  4. Case 2 applies: T(n) = Θ(n log n)

Why? At each level of the recursion tree, the total work is exactly n:

Level | Subproblems | Size Each | Work/Problem | Total Work
0 | 1 | n | n | n
1 | 2 | n/2 | n/2 | n
2 | 4 | n/4 | n/4 | n
... | ... | ... | ... | n
log n | n | 1 | 1 | n

Each of the log n levels contributes n work, giving Θ(n log n) total.
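The level-by-level totals above can be generated directly (a quick sketch for n a power of two):

```python
n = 1024
level, nodes, size = 0, 1, n
while size >= 1:
    total = nodes * size  # work at this level when f(n) = n
    print(f"level {level:2d}: {nodes:5d} problems of size {size:5d} -> total {total}")
    assert total == n  # every level contributes exactly n
    level, nodes, size = level + 1, nodes * 2, size // 2
```
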

Merge Sort - Case 2 in Action

The annotated implementation below (merge_sort.py) shows where each term of T(n) = 2T(n/2) + n comes from:

def merge_sort(arr):
    # Base case: arrays with 0 or 1 elements are already sorted — T(1) = O(1)
    if len(arr) <= 1:
        return arr

    # Divide: find the midpoint, splitting into two roughly equal halves
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])    # contributes T(n/2)
    right = merge_sort(arr[mid:])   # another T(n/2) contribution

    # Combine: merging is O(n) — the f(n) = n term in the recurrence
    return merge(left, right)

def merge(left, right):
    """Combine two sorted lists into one sorted list in O(n) time."""
    result = []
    i = j = 0
    # Compare front elements and append the smaller; runs at most n times total
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result

Quick Check

Binary Search has recurrence T(n) = T(n/2) + 1. What is its complexity?


Case 3: Root Dominates

When it applies: f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and the regularity condition af(n/b) ≤ cf(n) holds for some c < 1

Intuition: The work decreases as we go down the tree. Most work is done at the root. The combining work overwhelms the recursion.

Result: T(n) = Θ(f(n))

Example: T(n) = 2T(n/2) + n²

  1. Parameters: a = 2, b = 2, f(n) = n²
  2. Critical value: n^(log₂ 2) = n
  3. Compare: f(n) = n² = Ω(n^(1+1)) = Ω(n^(log_b a + ε)) with ε = 1
  4. Check regularity: 2·(n/2)² = n²/2 ≤ (1/2)·n². Yes, with c = 1/2.
  5. Case 3 applies: T(n) = Θ(n²)

Why? The work at each level shrinks geometrically:

Level | Subproblems | Work/Problem | Total Work
0 (root) | 1 | n² | n²
1 | 2 | (n/2)² = n²/4 | n²/2
2 | 4 | (n/4)² = n²/16 | n²/4
... | ... | ... | ...

Total work = n² + n²/2 + n²/4 + ⋯ < 2n² = Θ(n²). The geometric series converges, and the root dominates.
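Summing the levels numerically shows the total staying below 2n² (a quick sketch for n a power of two):

```python
n = 1024
total, nodes, size = 0, 1, n
while size >= 1:
    total += nodes * size**2  # level work: 2^i * (n/2^i)^2 = n^2 / 2^i
    nodes, size = nodes * 2, size // 2

print(total / n**2)  # just under 2: the geometric series 1 + 1/2 + 1/4 + ...
assert n**2 <= total < 2 * n**2
```
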

Don't Forget the Regularity Condition!

Case 3 requires checking that af(n/b) ≤ cf(n) for some c < 1. For polynomial f(n), this almost always holds, but it can fail for unusual functions.


The Three Cases at a Glance

When f(n) = Θ(n^k), compare k to log_b a:

Case | Condition | Solution | Intuition
1: Recursion Dominates | k < log_b a | Θ(n^(log_b a)) | Work increases as we go down the tree; most work is done at the leaves.
2: Balanced | k = log_b a | Θ(n^k log n) | Same work at each level; total is one level's work times the tree height (log n).
3: Root Dominates | k > log_b a | Θ(f(n)) | Work decreases as we go down; most work is done at the root.


The Recursion Tree Method

The Master Theorem is actually derived from analyzing recursion trees. Understanding this visualization gives deep intuition for why the three cases exist.

Structure of a Recursion Tree

  • Root: The original problem of size n, doing f(n) work
  • Level i: a^i nodes, each of size n/b^i
  • Height: log_b n levels until we reach the base case
  • Leaves: a^(log_b n) = n^(log_b a) base cases

The total work is the sum of work at all levels. Depending on whether work increases, stays constant, or decreases as we go down, we get Cases 1, 2, or 3.
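A short helper makes the three regimes visible by listing the work at each level; `level_work` is a hypothetical name of our own, not from the text:

```python
def level_work(a, b, f, n):
    """Work done at each level of the recursion tree for T(n) = a*T(n/b) + f(n)."""
    works, nodes, size = [], 1, n
    while size >= 1:
        works.append(nodes * f(size))
        nodes, size = nodes * a, size // b
    return works

n = 256
print(level_work(4, 2, lambda m: m, n))      # grows down the tree   -> Case 1
print(level_work(2, 2, lambda m: m, n))      # flat                  -> Case 2
print(level_work(2, 2, lambda m: m * m, n))  # shrinks down the tree -> Case 3
```
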

Worked Example: T(n) = 2T(n/2) + n with n = 16

Level | Nodes | Problem Size | Work/Node | Total Work
0 | 2⁰ = 1 | 16 | 16 | 16
1 | 2¹ = 2 | 8 | 8 | 16
2 | 2² = 4 | 4 | 4 | 16
3 | 2³ = 8 | 2 | 2 | 16
4 | 2⁴ = 16 | 1 | 1 | 16
Total | | | | 80

Work is balanced across levels (Case 2): each level contributes equally, so multiply one level's work by the tree height.

Building Intuition

Work out the level sums for different values of a, b, and k (with f(n) = n^k) and notice how:
  • When a > b^k, work increases down the tree (Case 1)
  • When a = b^k, work is balanced (Case 2)
  • When a < b^k, work decreases down the tree (Case 3)

Famous Algorithm Examples

Let's apply the Master Theorem to some of the most important algorithms in computer science.

Binary Search - O(log n)

Recurrence: T(n) = T(n/2) + 1

Binary Search - Only ONE Recursive Call

Binary search (binary_search.py) halves the search space each time and makes only one recursive call per level, so a = 1, b = 2, f(n) = 1:

def binary_search(arr, target, left=0, right=None):
    if right is None:
        right = len(arr) - 1

    # Base case: empty search space (left > right) means the target is absent — T(1) = O(1)
    if left > right:
        return -1

    # O(1) work per call: one midpoint computation and one comparison — the f(n) = 1 term
    mid = (left + right) // 2

    if arr[mid] == target:
        return mid
    elif arr[mid] < target:
        # ONE recursive call on n/2 elements: T(n) = T(n/2) + 1
        return binary_search(arr, target, mid + 1, right)
    else:
        # Still only one call per level, just on the left half
        return binary_search(arr, target, left, mid - 1)

Analysis: a = 1, b = 2, f(n) = 1. Since log₂ 1 = 0, we have n^(log_b a) = n⁰ = 1 = Θ(f(n)). Case 2: T(n) = Θ(n⁰ log n) = Θ(log n).

Merge Sort - O(n log n)

Recurrence: T(n) = 2T(n/2) + n

Analysis: a = 2, b = 2, f(n) = n. Since log₂ 2 = 1, we have n^(log_b a) = n = Θ(f(n)). Case 2: T(n) = Θ(n log n).

Strassen's Matrix Multiplication - O(n^2.81)

Recurrence: T(n) = 7T(n/2) + n²

Strassen's Algorithm - The Power of 7 Instead of 8

Strassen discovered a way to multiply matrices using 7 recursive half-size products instead of 8, changing the recurrence from T(n) = 8T(n/2) + O(n²) to T(n) = 7T(n/2) + O(n²) — and since log₂ 7 ≈ 2.81 < 3 = log₂ 8, that is an asymptotic win. The sketch below (strassen.py) elides the quadrant splitting and combining; expressions like A11 + A22 denote element-wise matrix addition, not Python list concatenation:

def strassen(A, B):
    n = len(A)
    # Base case: 1×1 matrices are multiplied directly in O(1)
    if n == 1:
        return [[A[0][0] * B[0][0]]]

    # Split A and B into quadrants A11, A12, A21, A22 and B11, B12, B21, B22
    # ... (splitting code)

    # The magic: 7 clever products instead of 8 recursive multiplications
    P1 = strassen(A11 + A22, B11 + B22)
    P2 = strassen(A21 + A22, B11)
    P3 = strassen(A11, B12 - B22)
    P4 = strassen(A22, B21 - B11)
    P5 = strassen(A11 + A12, B22)
    P6 = strassen(A21 - A11, B11 + B12)
    P7 = strassen(A12 - A22, B21 + B22)

    # Combine the 7 products with O(n²) additions and subtractions — the f(n) = n² term
    # C11 = P1 + P4 - P5 + P7
    # ... (combination code)
    return C

Analysis: a = 7, b = 2, f(n) = n². Since log₂ 7 ≈ 2.81, we have n^(log_b a) ≈ n^2.81. Since n² = O(n^(2.81 − ε)) for any ε up to ≈ 0.81, Case 1 applies: T(n) = Θ(n^2.81).

This is better than the naive O(n³)! Strassen's key insight was reducing from 8 to 7 recursive multiplications, changing log₂ 8 = 3 to log₂ 7 ≈ 2.81.
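Counting scalar multiplications makes the payoff concrete: with 8 half-size products the count solves to n³, with 7 it solves to 7^(log₂ n) = n^(log₂ 7). A quick sketch (`mults` is our own name):

```python
def mults(branch, n):
    """Scalar multiplications for an n×n product (n a power of 2) with `branch` recursive products."""
    return 1 if n == 1 else branch * mults(branch, n // 2)

n = 1024
naive, strassen = mults(8, n), mults(7, n)
print(naive)             # 8^10 = 1024³ = 1073741824
print(strassen)          # 7^10 = 282475249
print(strassen / naive)  # ≈ 0.26 — about a quarter of the multiplications at this size
```
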

Summary Table

Algorithm | Recurrence | Case | Complexity
Binary Search | T(n) = T(n/2) + 1 | 2 | Θ(log n)
Binary Tree Traversal | T(n) = 2T(n/2) + 1 | 1 | Θ(n)
Merge Sort | T(n) = 2T(n/2) + n | 2 | Θ(n log n)
Quick Sort (avg) | T(n) = 2T(n/2) + n | 2 | Θ(n log n)
Karatsuba | T(n) = 3T(n/2) + n | 1 | Θ(n^1.58)
Strassen | T(n) = 7T(n/2) + n² | 1 | Θ(n^2.81)
Median Finding | T(n) = T(n/5) + T(7n/10) + n | not standard form | Θ(n)

When the Master Theorem Doesn't Apply

The Master Theorem is powerful but not universal. It fails in several situations:

1. Non-Polynomial Gaps

The theorem requires f(n) to be polynomially smaller or larger than n^(log_b a). A logarithmic gap doesn't qualify:

T(n) = 2T(n/2) + n/log n

Here n/log n is not O(n^(1−ε)) for any ε > 0, nor Θ(n), nor Ω(n^(1+ε)). The Master Theorem doesn't apply. (The answer is Θ(n log log n), found by other methods.)

2. Unequal Subproblem Sizes

The Master Theorem assumes all subproblems have the same size n/b. Consider:

T(n) = T(n/3) + T(2n/3) + n

This has two subproblems of different sizes (n/3 and 2n/3). The standard Master Theorem doesn't apply, though the Akra-Bazzi theorem can handle this case.
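Even without the Master Theorem, such a recurrence can be evaluated numerically; its growth tracks n log n, matching the Akra-Bazzi prediction for this recurrence. A rough sketch, taking T(1) = 1 and integer division as simplifying assumptions:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = T(n/3) + T(2n/3) + n with T(1) = 1."""
    if n <= 1:
        return 1
    return T(n // 3) + T(2 * n // 3) + n

for n in [10**3, 10**4, 10**5]:
    print(n, T(n) / (n * math.log2(n)))  # ratio stays roughly constant
```
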

3. Variable Number of Subproblems

If the number of recursive calls depends on the input:

T(n) = √n · T(√n) + n

This doesn't fit the aT(n/b) + f(n) form because a = √n changes with n.

4. Non-Constant Base Case

If the base case isn't Θ(1), additional analysis may be needed.

Alternative Methods

When the Master Theorem fails, try:
  • Recursion Tree: Draw the tree and sum work at all levels
  • Substitution Method: Guess the answer and prove by induction
  • Akra-Bazzi Theorem: Generalizes Master Theorem to different-sized subproblems


Summary

The Master Theorem provides a powerful shortcut for analyzing divide-and-conquer algorithms:

Key Concepts

Concept | Description
Recurrence Form | T(n) = aT(n/b) + f(n)
Critical Value | n^(log_b a), the number of leaves in the recursion tree
Case 1 | f(n) polynomially smaller → T(n) = Θ(n^(log_b a))
Case 2 | f(n) matches the critical value → T(n) = Θ(n^(log_b a) log n)
Case 3 | f(n) polynomially larger → T(n) = Θ(f(n))

Quick Reference

The 3-Second Test: For f(n) = n^k, compare k to log_b a:

• k < log_b a → Recursion dominates → Θ(n^(log_b a))
• k = log_b a → Balanced → Θ(n^k log n)
• k > log_b a → Root dominates → Θ(n^k)

Why This Matters

  • Technical Interviews: You'll be asked to analyze recursive algorithms. The Master Theorem gives instant answers.
  • Algorithm Design: Knowing how a and b affect complexity helps you design better divide-and-conquer algorithms.
  • Optimization Insights: Strassen saved 0.19 in the exponent by reducing from 8 to 7 subproblems. Understanding this tradeoff is crucial.

Exercises

Basic Analysis

  1. Apply the Master Theorem to: T(n) = 9T(n/3) + n
  2. Apply the Master Theorem to: T(n) = T(n/2) + n²
  3. Apply the Master Theorem to: T(n) = 4T(n/2) + n²
  4. Apply the Master Theorem to: T(n) = 2T(n/4) + √n

Algorithm Analysis

  1. An algorithm divides a problem of size n into 3 subproblems of size n/2, and spends O(n) time on dividing and combining. Write the recurrence and solve it.
  2. If we could improve Strassen's algorithm to make only 6 recursive calls instead of 7, what would the new complexity be?
  3. A recursive algorithm has recurrence T(n) = 8T(n/2) + n². What is its complexity? How does this compare to the naive cubic matrix multiplication?

Deeper Understanding

  1. Explain why reducing from 8 to 7 subproblems in matrix multiplication (Strassen) provides such a significant improvement for large matrices.
  2. For what values of k does the recurrence T(n) = 4T(n/2) + n^k satisfy Case 1? Case 2? Case 3?
  3. Why doesn't the Master Theorem apply to T(n) = 2T(n/2) + n log n? What additional condition would be needed?

Solutions Hints

  • Exercise 1: a=9, b=3, log₃(9)=2. Compare f(n)=n to n².
  • Exercise 2: a=1, b=2, log₂(1)=0. Compare f(n)=n² to 1.
  • Exercise 3: a=4, b=2, log₂(4)=2 and f(n)=n² — exactly Case 2, balanced!
  • Exercise 4: a=2, b=4, log₄(2)=0.5. Compare √n to n^0.5.
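The logarithms in these hints check out in a couple of lines:

```python
import math

assert math.isclose(math.log(9, 3), 2.0)   # Exercise 1: log₃(9) = 2
assert math.log(1, 2) == 0.0               # Exercise 2: log₂(1) = 0
assert math.isclose(math.log(4, 2), 2.0)   # Exercise 3: log₂(4) = 2
assert math.isclose(math.log(2, 4), 0.5)   # Exercise 4: log₄(2) = 0.5
print("all hints verified")
```
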

You've now mastered one of the most important tools in algorithm analysis. The Master Theorem transforms complex recurrence analysis into a simple comparison, enabling you to quickly assess the efficiency of any divide-and-conquer algorithm.

In the next chapter, we'll explore Mathematical Foundations that underpin algorithm analysis, including logarithms, summations, and proof techniques that complement the Master Theorem.