Since divide and conquer algorithms occur quite frequently and produce recursive equations for their run times, it is important to be able to solve such equations. Last lecture we “solved” the recurrence equation for merge sort using a method known as a recursion tree. While this method can in theory always be used, it is often cumbersome. For problems that divide into a fixed number of identically sized recursive pieces, there is a much simpler method known as the master theorem that can be used to solve certain recursive equations almost “by inspection”.
Master Theorem
The master theorem can be employed to solve recursive equations of the form
T(n) = aT(n/b) + f(n)
where a ≥ 1, b > 1, and f(n) is asymptotically positive. Intuitively for divide and conquer algorithms, this equation represents dividing the problem up into a subproblems of size n/b with a combine time of f(n). For example, for merge sort a = 2, b = 2, and f(n) = Θ(n). Note that floors and ceilings for n/b do not affect the asymptotic behavior or the results derived using the theorem.
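To make the roles of a, b, and f(n) concrete, here is a minimal Python sketch of merge sort (the example above) with the divide and combine steps annotated; the implementation details are illustrative, not prescriptive:

```python
def merge_sort(items):
    """Merge sort: T(n) = 2T(n/2) + Theta(n), i.e. a = 2, b = 2, f(n) = Theta(n)."""
    # Base case: problems of constant size are solved directly.
    if len(items) <= 1:
        return items
    # Divide: a = 2 subproblems, each of size n/b = n/2.
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Combine: merging two sorted halves is the f(n) = Theta(n) term.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```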
If the recursion is in the form shown above, then the recurrence can be solved depending on one of three cases (which relate the dominance of the recursive term to the combine term):
Case 1 (recursion dominates)
If
f(n) = O(n^(log_b a - ε))
for some constant ε > 0, then the solution to the recurrence is given by
T(n) = Θ(n^(log_b a))
In this case, f(n) is polynomially bounded above by n^(log_b a) (which represents the run time of the recursive term), i.e. the recursive term dominates the run time.
Case 2 (recursion and combine are “equal”)
If
f(n) = Θ(n^(log_b a))
then the solution to the recurrence is given by
T(n) = Θ(n^(log_b a) lg n)
In this case, f(n) is “equal” to n^(log_b a), i.e. neither term dominates, hence the extra lg n factor in the bound.
Case 3 (combine dominates)
If
f(n) = Ω(n^(log_b a + ε))
for some constant ε > 0 AND if a f(n/b) ≤ c f(n) for some constant c < 1 and sufficiently large n (this additional constraint is known as the regularity condition), then the solution to the recurrence is given by
T(n) = Θ(f(n))
In this case, f(n) is polynomially bounded below by n^(log_b a) (which represents the run time of the recursive term) with the additional regularity condition, i.e. the combine term dominates the run time.
Thus in all three cases it is important to compute the recursive term run time n^(log_b a) and compare it asymptotically to the combine term run time f(n) to determine which case holds. If the recursive equation satisfies either Case 1 or Case 2, the solution can then be written by inspection. If the recursive equation satisfies Case 3, then the regularity condition must be verified in order to write down the solution.
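For instance, applying this comparison to the merge sort recurrence that we solved last lecture with a recursion tree reproduces the same bound by inspection:

```latex
% Merge sort from last lecture: T(n) = 2T(n/2) + \Theta(n),
% so a = 2, b = 2, and f(n) = \Theta(n).
\[
n^{\log_b a} = n^{\log_2 2} = n,
\qquad
f(n) = \Theta(n) = \Theta(n^{\log_b a}) \;\Rightarrow\; \text{Case 2},
\]
\[
T(n) = \Theta(n^{\log_b a} \lg n) = \Theta(n \lg n).
\]
```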
Note: There are gaps between the cases where the theorem cannot be applied (as well as recursive equations that do not fit the form required by the theorem exactly). In these cases, other techniques must be used which will not be covered in this course (refer to Lecture 6 for details).
Procedure
To apply the master theorem, use the following procedure (a code sketch of this procedure is given after the list). Given the recursive equation
T(n) = aT(n/b) + f(n)
- Convert any asymptotic bounds for f(n) to functional notation, e.g. Θ(n^2) ⇒ cn^2
- Identify a, b, and f(n)
- Compute n^(log_b a)
- Compare f(n) with n^(log_b a) to determine the proper case and write the solution to the recursive equation
- Case 1 (f(n) “<” n^(log_b a)): Give an ε to show f(n) = O(n^(log_b a - ε)), then give the solution T(n) = Θ(n^(log_b a))
- Case 2 (f(n) “≈” n^(log_b a)): Show f(n) = Θ(n^(log_b a)), then give the solution T(n) = Θ(n^(log_b a) lg n)
- Case 3 (f(n) “>” n^(log_b a)): Give an ε to show f(n) = Ω(n^(log_b a + ε)), AND show a f(n/b) ≤ c f(n) for some c < 1, then give the solution T(n) = Θ(f(n))
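As a rough illustration of this procedure, the sketch below classifies recurrences whose combine term has the special polylogarithmic form f(n) = Θ(n^k lg^m n), which covers all four examples that follow; the helper name master and its interface are made up for this note, and it is not a general implementation of the theorem, since an arbitrary f(n) cannot be compared symbolically this way.

```python
import math

def master(a, b, k, m=0):
    """Classify T(n) = a T(n/b) + Theta(n^k lg^m n) by the master theorem.

    A sketch restricted to combine terms of this polylogarithmic form;
    for such f(n), the Case 3 regularity condition holds automatically
    whenever k > log_b(a).
    """
    e = math.log(a, b)  # the critical exponent log_b(a)
    if math.isclose(k, e):
        # Case 2 (the generalized form when m > 0): add one lg factor.
        return f"Theta(n^{e:g} lg^{m + 1} n)"
    if k < e:
        # Case 1: the recursion dominates.
        return f"Theta(n^{e:g})"
    # Case 3: the combine term dominates.
    return f"Theta(n^{k:g} lg^{m} n)" if m else f"Theta(n^{k:g})"

# The four examples worked below:
print(master(9, 3, 1))     # Example 1: Theta(n^2)
print(master(1, 1.5, 0))   # Example 2: Theta(n^0 lg^1 n) = Theta(lg n)
print(master(3, 4, 1, 1))  # Example 3: Theta(n^1 lg^1 n) = Theta(n lg n)
print(master(2, 2, 1, 1))  # Example 4: Theta(n^1 lg^2 n)
```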
Examples
Example 1
Solve the recursive equation
T(n) = 9T(n/3) + n
1. f(n) = n does not contain any asymptotic bounds that need conversion
2. For this equation
a = 9
b = 3
f(n) = n
Intuitively this equation would represent an algorithm that divides the original inputs into nine groups each consisting of a third of the elements and takes linear time to combine the results.
3. Computing
n^(log_b a) = n^(log_3 9) = n^2
4. We see that f(n) = n “<” n^2 so we try to show Case 1, i.e.
f(n) = n = O(n^(log_b a - ε)) = O(n^(2 - ε))
which is satisfied for any ε ≤ 1, e.g. choose ε = 1 giving n = O(n^(2 - 1)) = O(n). Hence the equation satisfies Case 1 so we can give the solution as
T(n) = Θ(n^(log_b a)) = Θ(n^2)
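As an informal sanity check (not a proof), we can evaluate the recurrence numerically for powers of 3, assuming a base case of T(1) = 1, and watch T(n)/n^2 settle toward a constant:

```python
def T(n):
    # T(n) = 9 T(n/3) + n, with T(1) = 1 assumed as the base case.
    return 1 if n <= 1 else 9 * T(n // 3) + n

for k in range(1, 9):
    n = 3 ** k
    # For n = 3^k the exact solution is n^2 + n(n - 1)/2,
    # so the ratio tends to 1.5, consistent with Theta(n^2).
    print(n, T(n) / n ** 2)
```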
Example 2
Solve the recursive equation
T(n) = T(2n/3) + Θ(1)
1. f(n) = Θ(1), hence we convert the asymptotic bound to functional notation: f(n) = c
2. For this equation
a = 1
b = 3/2 (note 2n/3 = n/(3/2), so b = 3/2)
f(n) = c
Intuitively this equation would represent an algorithm that only uses two thirds of the elements at each step and takes constant time to combine the results.
3. Computing
n^(log_b a) = n^(log_{3/2} 1) = n^0 = 1
4. We see that f(n) = c “≈” 1 so we try to show Case 2, i.e.
f(n) = c = Θ(n^(log_b a)) = Θ(1)
which is satisfied. Hence the equation satisfies Case 2 so we can give the solution as
T(n) = Θ(n^(log_b a) lg n) = Θ(1 · lg n) = Θ(lg n)
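Again as an informal check, taking the combine cost as the constant 1 and assuming T(1) = 1, the ratio T(n)/lg n levels off near 1/lg(3/2) ≈ 1.71, consistent with Θ(lg n):

```python
import math

def T(n):
    # T(n) = T(2n/3) + 1, with T(1) = 1 assumed as the base case.
    return 1 if n <= 1 else T(2 * n // 3) + 1

for k in range(2, 8):
    n = 10 ** k
    # T(n) is roughly log_{3/2}(n) = lg(n) / lg(3/2), so the ratio flattens.
    print(n, T(n) / math.log2(n))
```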
Example 3
Solve the recursive equation
T(n) = 3T(n/4) + 2n lg n
1. f(n) = 2n lg n does not contain any asymptotic bounds that need conversion
2. For this equation
a = 3
b = 4
f(n) = 2n lg n
Intuitively this equation would represent an algorithm that divides the original inputs into three groups each consisting of a quarter of the elements and takes n lg n time to combine the results.
3. Computing
n^(log_b a) = n^(log_4 3) ≈ n^0.79
4. We see that f(n) = 2n lg n “>” n^0.79 so we try to show Case 3, i.e.
f(n) = 2n lg n = Ω(n^(log_b a + ε)) = Ω(n^(0.79 + ε))
which (since 2n lg n = Ω(n)) is satisfied for any ε ≤ 0.21, e.g. choose ε = 0.2 giving 2n lg n = Ω(n^(0.79 + 0.2)) = Ω(n^0.99). Hence the equation satisfies Case 3 provided we can also show the regularity condition
a f(n/b) ≤ c f(n) for some constant c < 1
Checking,
a f(n/b) = 3 · 2(n/4) lg(n/4) = (3/4) · 2n lg(n/4) ≤ (3/4) · 2n lg n = (3/4) f(n)
Thus regularity holds by choosing c = 3/4 (which is < 1). So we can give the solution as
T(n) = Θ(f(n)) = Θ(2n lg n) = Θ(n lg n)
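The regularity condition can also be spot-checked numerically: with f(n) = 2n lg n as above, the ratio a·f(n/b)/f(n) stays below the constant c = 3/4 derived in the proof:

```python
import math

def f(n):
    # The combine term from this example: f(n) = 2 n lg n.
    return 2 * n * math.log2(n)

a, b = 3, 4
for k in range(2, 8):
    n = 10.0 ** k
    # a f(n/b) / f(n) = (3/4)(lg n - 2)/lg n, approaching 3/4 from below.
    print(n, a * f(n / b) / f(n))
```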
Example 4
Solve the recursive equation
T(n) = 2T(n/2) + n lg n
1. f(n) = n lg n does not contain any asymptotic bounds that need conversion
2. For this equation
a = 2
b = 2
f(n) = n lg n
Intuitively this equation would represent an algorithm that divides the original inputs into two groups each consisting of half of the elements and takes n lg n time to combine the results.
3. Computing
n^(log_b a) = n^(log_2 2) = n^1 = n
4. We see that f(n) = n lg n “>” n so we try to show Case 3, i.e.
f(n) = n lg n = Ω(n^(log_b a + ε)) = Ω(n^(1 + ε))
While n lg n ≥ n asymptotically, n lg n is not polynomially greater than n, i.e. there is no ε > 0 that satisfies the above equation, therefore Case 3 does not apply (and clearly n lg n ≠ Θ(n), so we are not in Case 2 either).
However, we can use a generalization of Case 2 which states that if
f(n) = Θ(n^(log_b a) lg^k n) for some constant k ≥ 0 (the standard Case 2 corresponds to k = 0)
then the solution to the recurrence is given by
T(n) = Θ(n^(log_b a) lg^(k+1) n)
Hence in this example
f(n) = n lg n = Θ(n^(log_b a) lg^k n) = Θ(n lg n) for k = 1
so we can give the solution as
T(n) = Θ(n^(log_b a) lg^(k+1) n) = Θ(n lg^2 n)
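As a final informal check with an assumed base case of T(1) = 1, the ratio T(n)/(n lg^2 n) settles toward a constant while T(n)/(n lg n) keeps growing, confirming that the plain Case 2 bound would be too small:

```python
import math

def T(n):
    # T(n) = 2 T(n/2) + n lg n, with T(1) = 1 assumed as the base case.
    return 1 if n <= 1 else 2 * T(n // 2) + n * math.log2(n)

for k in range(4, 21, 4):
    n = 2 ** k
    lg = math.log2(n)
    # For n = 2^k, T(n) = n + n k(k + 1)/2, so T(n)/(n lg^2 n) -> 1/2
    # while T(n)/(n lg n) grows like lg(n)/2.
    print(n, T(n) / (n * lg ** 2), T(n) / (n * lg))
```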