In the analysis of algorithms, the master theorem for divide-and-conquer recurrences provides an asymptotic analysis for many recurrence relations that occur in the analysis of divide-and-conquer algorithms. The approach was first presented by Jon Bentley, Dorothea Blostein (née Haken), and James B. Saxe in 1980, where it was described as a "unifying method" for solving such recurrences. The name "master theorem" was popularized by the widely used algorithms textbook Introduction to Algorithms by Cormen, Leiserson, Rivest, and Stein.
Not all recurrence relations can be solved by this theorem; its generalizations include the Akra–Bazzi method.
Consider a problem that can be solved using a recursive algorithm such as the following:
```
procedure p(input x of size n):
    if n < some constant k:
        Solve x directly without recursion
    else:
        Create a subproblems of x, each having size n/b
        Call procedure p recursively on each subproblem
        Combine the results from the subproblems
```
The above algorithm divides the problem into a number of subproblems recursively, each subproblem being of size $n/b$. Its solution tree has a node for each recursive call, with the children of that node being the other calls made from that call. The leaves of the tree are the base cases of the recursion, the subproblems (of size less than $k$) that do not recurse. The above example would have $a$ child nodes at each non-leaf node. Each node does an amount of work that corresponds to the size of the subproblem passed to that instance of the recursive call, given by $f(n)$.

The runtime of an algorithm such as the above on an input of size $n$, usually denoted $T(n)$, can be expressed by the recurrence relation

$$T(n) = a\,T\!\left(\frac{n}{b}\right) + f(n),$$

where $f(n)$ is the time to create the subproblems and combine their results in the above procedure.
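To make the recurrence concrete, here is a short numerical sketch in Python (not part of the original presentation; the parameter names `a`, `b`, `f`, `cutoff`, and `base_cost` are illustrative) that evaluates $T(n)$ for a given splitting/combining cost $f(n)$:

```python
# Illustrative sketch: evaluate T(n) = a*T(n/b) + f(n) numerically,
# treating T(n) as a constant below a small cutoff (the base case).
def T(n, a, b, f, cutoff=2, base_cost=1):
    if n < cutoff:                    # base case: solve directly, constant work
        return base_cost
    return a * T(n // b, a, b, f, cutoff, base_cost) + f(n)

# Example: the merge-sort-like recurrence T(n) = 2*T(n/2) + n
print(T(1024, a=2, b=2, f=lambda n: n))   # grows roughly like n*log2(n)
```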
The master theorem always yields asymptotically tight bounds to recurrences from divide-and-conquer algorithms that partition an input into smaller subproblems of equal sizes, solve the subproblems recursively, and then combine the subproblem solutions to give a solution to the original problem. The time for such an algorithm can be expressed by adding the work performed at the top level of the recursion (to divide the problem into subproblems and then combine the subproblem solutions) to the time taken in the recursive calls of the algorithm. If $T(n)$ denotes the total time for the algorithm on an input of size $n$, and $f(n)$ denotes the amount of time taken at the top level of the recurrence, then the time can be expressed by a recurrence relation of the form

$$T(n) = a\,T\!\left(\frac{n}{b}\right) + f(n).$$

Here $n$ is the size of an input problem, $a$ is the number of subproblems in the recursion, and $b$ is the factor by which the subproblem size is reduced in each recursive call ($b > 1$). Crucially, $a$ and $b$ must not depend on $n$. The theorem below also assumes that, as a base case for the recurrence, $T(n) = \Theta(1)$ when $n$ is less than some bound $\kappa > 0$, the smallest input size that will lead to a recursive call.
Recurrences of this form often satisfy one of the three following regimes, based on how the work to split/recombine the problem, $f(n)$, compares to the critical exponent $c_{\operatorname{crit}} = \log_b a$:

| Case | Description | Condition on $f(n)$ in relation to $c_{\operatorname{crit}} = \log_b a$ | Master theorem bound | Notational examples |
|---|---|---|---|---|
| 1 | Work to split/recombine a problem is dwarfed by subproblems, i.e. the recursion tree is leaf-heavy | When $f(n) = O(n^c)$ for some $c < c_{\operatorname{crit}}$ (upper-bounded by a lesser-exponent polynomial) | ... then $T(n) = \Theta\!\left(n^{c_{\operatorname{crit}}}\right)$ (the splitting term does not appear; the recursive tree structure dominates) | If $b = a^2$ and $f(n) = O(n^{1/2-\epsilon})$, then $T(n) = \Theta(n^{1/2})$. |
| 2 | Work to split/recombine a problem is comparable to subproblems | When $f(n) = \Theta(n^{c_{\operatorname{crit}}}\log^{k} n)$ for a $k \ge 0$ (rangebound by the critical-exponent polynomial, times zero or more optional logs) | ... then $T(n) = \Theta\!\left(n^{c_{\operatorname{crit}}}\log^{k+1} n\right)$ (the bound is the splitting term, where the log is augmented by a single power) | If $b = a^2$ and $f(n) = \Theta(n^{1/2})$, then $T(n) = \Theta(n^{1/2}\log n)$. If $b = a^2$ and $f(n) = \Theta(n^{1/2}\log n)$, then $T(n) = \Theta(n^{1/2}\log^{2} n)$. |
| 3 | Work to split/recombine a problem dominates subproblems, i.e. the recursion tree is root-heavy | When $f(n) = \Omega(n^c)$ for some $c > c_{\operatorname{crit}}$ (lower-bounded by a greater-exponent polynomial) | ... this doesn't necessarily yield anything. Furthermore, if $a f\!\left(\frac{n}{b}\right) \le k f(n)$ for some constant $k < 1$ and sufficiently large $n$ (the regularity condition), then the total is dominated by the splitting term $f(n)$: $T(n) = \Theta\!\left(f(n)\right)$ | If $b = a^2$, $f(n) = \Omega(n^{1/2+\epsilon})$, and the regularity condition holds, then $T(n) = \Theta(f(n))$. |
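As a rough illustration of how the three regimes are told apart, the following Python helper (a sketch, not a library function; it assumes $f(n) = \Theta(n^c \log^k n)$ and that the regularity condition holds in case 3) reports which case applies:

```python
import math

def master_bound(a, b, c, k=0):
    """Asymptotic bound for T(n) = a*T(n/b) + Theta(n**c * log(n)**k)."""
    crit = math.log(a, b)                     # critical exponent log_b(a)
    if math.isclose(c, crit):                 # Case 2: balanced (needs k >= 0)
        return (f"Theta(n^{crit:g} * log(n)^{k + 1})" if k >= 0
                else "basic form does not apply (see extended Case 2)")
    if c < crit:                              # Case 1: leaf-heavy
        return f"Theta(n^{crit:g})"
    # Case 3: root-heavy; regularity condition assumed to hold
    return f"Theta(n^{c:g})" if k == 0 else f"Theta(n^{c:g} * log(n)^{k:g})"

print(master_bound(8, 2, 2))   # Case 1 example below: Theta(n^3)
print(master_bound(2, 2, 1))   # Case 2 example below: Theta(n^1 * log(n)^1)
print(master_bound(2, 2, 2))   # Case 3 example below: Theta(n^2)
```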
A useful extension of Case 2 handles all values of the parameter $k$:

| Case | Condition on $f(n)$ in relation to $c_{\operatorname{crit}} = \log_b a$ | Master theorem bound | Notational examples |
|---|---|---|---|
| 2a | When $f(n) = \Theta(n^{c_{\operatorname{crit}}}\log^{k} n)$ for any $k > -1$ | ... then $T(n) = \Theta\!\left(n^{c_{\operatorname{crit}}}\log^{k+1} n\right)$ (the bound is the splitting term, where the log is augmented by a single power) | If $b = a^2$ and $f(n) = \Theta(n^{1/2}/\log^{1/2} n)$, then $T(n) = \Theta(n^{1/2}\log^{1/2} n)$. |
| 2b | When $f(n) = \Theta(n^{c_{\operatorname{crit}}}\log^{k} n)$ for $k = -1$ | ... then $T(n) = \Theta\!\left(n^{c_{\operatorname{crit}}}\log\log n\right)$ (the bound is the splitting term, where the log reciprocal is replaced by an iterated log) | If $b = a^2$ and $f(n) = \Theta(n^{1/2}/\log n)$, then $T(n) = \Theta(n^{1/2}\log\log n)$. |
| 2c | When $f(n) = \Theta(n^{c_{\operatorname{crit}}}\log^{k} n)$ for any $k < -1$ | ... then $T(n) = \Theta\!\left(n^{c_{\operatorname{crit}}}\right)$ (the bound is the splitting term, where the log disappears) | If $b = a^2$ and $f(n) = \Theta(n^{1/2}/\log^{2} n)$, then $T(n) = \Theta(n^{1/2})$. |
As an example of the first case, consider the recurrence

$$T(n) = 8T\!\left(\frac{n}{2}\right) + 1000n^2.$$

As one can see from the formula above,

$$a = 8,\quad b = 2,\quad f(n) = 1000n^2,$$

so

$$f(n) = O\!\left(n^c\right), \text{ where } c = 2.$$

Next, we see if we satisfy the case 1 condition:

$$\log_b a = \log_2 8 = 3 > c.$$

It follows from the first case of the master theorem that

$$T(n) = \Theta\!\left(n^{\log_b a}\right) = \Theta\!\left(n^3\right).$$

(This result is confirmed by the exact solution of the recurrence relation, which is $T(n) = 1001n^3 - 1000n^2$, assuming $T(1) = 1$.)
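As an informal check (not part of the original derivation), the quoted exact solution can be verified numerically for a few powers of two:

```python
# Check that T(n) = 8*T(n/2) + 1000*n**2 with T(1) = 1 matches the
# closed form 1001*n**3 - 1000*n**2 quoted above (powers of two only).
def T(n):
    return 1 if n == 1 else 8 * T(n // 2) + 1000 * n * n

for n in (1, 2, 4, 64, 1024):
    assert T(n) == 1001 * n**3 - 1000 * n**2
print("closed form confirmed for the sampled powers of two")
```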
As an example of the second case, consider the recurrence

$$T(n) = 2T\!\left(\frac{n}{2}\right) + 10n.$$

As we can see in the formula above, the variables get the following values:

$$a = 2,\quad b = 2,\quad c = 1,\quad f(n) = 10n,$$

$$f(n) = \Theta\!\left(n^{c}\log^{k} n\right), \text{ where } c = 1,\ k = 0.$$

Next, we see if we satisfy the case 2 condition:

$$\log_b a = \log_2 2 = 1,$$

and therefore $c = \log_b a$. So it follows from the second case of the master theorem that

$$T(n) = \Theta\!\left(n^{\log_b a}\log^{k+1} n\right) = \Theta\!\left(n^{1}\log^{1} n\right) = \Theta\!\left(n\log n\right).$$

Thus the given recurrence relation $T(n)$ is in $\Theta(n\log n)$.

(This result is confirmed by the exact solution of the recurrence relation, which is $T(n) = n + 10n\log_2 n$, assuming $T(1) = 1$.)
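The same kind of informal numeric check works here (powers of two only; `n.bit_length() - 1` gives $\log_2 n$ exactly for such $n$):

```python
# Check that T(n) = 2*T(n/2) + 10*n with T(1) = 1 matches the exact
# solution n + 10*n*log2(n) quoted above (powers of two only).
def T(n):
    return 1 if n == 1 else 2 * T(n // 2) + 10 * n

for n in (1, 2, 4, 256, 4096):
    assert T(n) == n + 10 * n * (n.bit_length() - 1)   # bit_length-1 == log2(n)
print("exact solution confirmed for the sampled powers of two")
```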
As an example of the third case, consider the recurrence

$$T(n) = 2T\!\left(\frac{n}{2}\right) + n^2.$$

As we can see in the formula above, the variables get the following values:

$$a = 2,\quad b = 2,\quad f(n) = n^2,$$

$$f(n) = \Omega\!\left(n^c\right), \text{ where } c = 2.$$

Next, we see if we satisfy the case 3 condition:

$$\log_b a = \log_2 2 = 1,$$

and therefore $c > \log_b a$. The regularity condition also holds:

$$2\left(\frac{n^2}{4}\right) \le k n^2, \text{ choosing } k = 1/2.$$

So it follows from the third case of the master theorem that

$$T(n) = \Theta\!\left(f(n)\right) = \Theta\!\left(n^2\right).$$

Thus the given recurrence relation $T(n)$ is in $\Theta(n^2)$, matching the $f(n)$ term of the original formula.

(This result is confirmed by the exact solution of the recurrence relation, which is $T(n) = 2n^2 - n$, assuming $T(1) = 1$.)
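Again, an informal numeric check of the quoted exact solution:

```python
# Check that T(n) = 2*T(n/2) + n**2 with T(1) = 1 matches the exact
# solution 2*n**2 - n quoted above (powers of two only).
def T(n):
    return 1 if n == 1 else 2 * T(n // 2) + n * n

for n in (1, 2, 8, 512, 2048):
    assert T(n) == 2 * n**2 - n
print("exact solution confirmed for the sampled powers of two")
```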
The following equations cannot be solved using the master theorem:[3]
- $T(n) = 2^{n} T\!\left(\frac{n}{2}\right) + n^{n}$ : $a$ is not a constant; the number of subproblems should be fixed
- $T(n) = 2T\!\left(\frac{n}{2}\right) + \frac{n}{\log n}$ : non-polynomial difference between $f(n)$ and $n^{\log_b a}$ (see below)
- $T(n) = 64T\!\left(\frac{n}{8}\right) - n^2\log n$ : $f(n)$, which is the combination time, is not positive
- $T(n) = T\!\left(\frac{n}{2}\right) + n(2 - \cos n)$ : case 3 but regularity violation
In the second inadmissible example above, the difference between $f(n)$ and $n^{\log_b a}$ can be expressed with the ratio

$$\frac{f(n)}{n^{\log_b a}} = \frac{n/\log n}{n^{\log_2 2}} = \frac{n}{n\log n} = \frac{1}{\log n}.$$

It is clear that $\frac{1}{\log n} < n^{\epsilon}$ for any constant $\epsilon > 0$, so the difference between $f(n)$ and $n^{\log_b a}$ is not polynomial and the basic form of the master theorem does not apply. The extended form (case 2b) does apply, giving the solution $T(n) = \Theta(n\log\log n)$.
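A rough numeric illustration of this conclusion (a sketch that assumes a constant base case and $n$ a power of two, not a proof):

```python
import math

# T(n) = 2*T(n/2) + n/log2(n): the basic master theorem cannot handle this,
# but the printed ratios T(n) / (n * log2(log2(n))) stay bounded and change
# only slowly, consistent with the Theta(n * log log n) bound quoted above.
def T(n):
    return 1 if n <= 2 else 2 * T(n // 2) + n / math.log2(n)

for exp in (8, 12, 16, 20):
    n = 2 ** exp
    print(n, round(T(n) / (n * math.log2(math.log2(n))), 3))
```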
The table below applies the theorem (or, for the sorted matrix search, the Akra–Bazzi method) to some common algorithms:

| Algorithm | Recurrence relationship | Run time | Comment |
|---|---|---|---|
| Binary search | $T(n) = T\!\left(\frac{n}{2}\right) + O(1)$ | $O(\log n)$ | Apply Master theorem case $c = \log_b a$, where $a = 1$, $b = 2$, $c = 0$, $k = 0$ |
| Binary tree traversal | $T(n) = 2T\!\left(\frac{n}{2}\right) + O(1)$ | $O(n)$ | Apply Master theorem case $c < \log_b a$, where $a = 2$, $b = 2$, $c = 0$ |
| Optimal sorted matrix search | $T(n) = 2T\!\left(\frac{n}{2}\right) + O(\log n)$ | $O(n)$ | Apply the Akra–Bazzi theorem for $p = 1$ and $g(u) = \log(u)$ to get $\Theta(2n - \log n)$ |
| Merge sort | $T(n) = 2T\!\left(\frac{n}{2}\right) + O(n)$ | $O(n\log n)$ | Apply Master theorem case $c = \log_b a$, where $a = 2$, $b = 2$, $c = 1$, $k = 0$ |
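For instance, merge sort (the last row) fits this pattern directly; a minimal sketch (an ordinary textbook-style implementation, not tied to any particular library):

```python
# Merge sort: a concrete algorithm whose running time follows
# T(n) = 2*T(n/2) + O(n), i.e. case 2 of the master theorem, O(n log n).
def merge_sort(xs):
    if len(xs) <= 1:                  # base case: constant work
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])       # a = 2 recursive calls on halves (b = 2)
    right = merge_sort(xs[mid:])
    merged, i, j = [], 0, 0           # merge step: the f(n) = O(n) work
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 3, 8, 1, 9, 2]))   # [1, 2, 3, 5, 8, 9]
```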