Master Theorem


The master theorem is a simple way to find the time complexity of a divide-and-conquer recurrence. It works only for recurrences of the form T(n) = aT(n/b) + c1*n^k, where c1 and k are constants. Questions 6 to 8 in Solving Recurrence Relation can be solved with this theorem. Do not solve questions 9 and 10 with this method, because it will give incorrect results.

 

Let’s say T(n) = aT(n/b) + c1*n^k
where a ≥ 1, b > 1, c1 ≥ 0 and k ≥ 0 are constants

 

(A) If n^(log_b a) > c1*n^k, then the answer is O(n^(log_b a)).
(B) If n^(log_b a) < c1*n^k, then the answer is Ω(c1*n^k).
(C) If n^(log_b a) = c1*n^k, then the answer is Θ(n^(log_b a) * log n).
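The three comparisons above can be turned into a tiny checker. The sketch below is an illustration added here (the helper name master_case is made up, not part of the original notes): it compares log_b(a) with k in Python and prints which case applies.

import math

def master_case(a, b, k):
    # compare the exponent of the recursion term, log_b(a), with k
    # (exact float comparison is fine for these small integer examples;
    #  real-world use would need a tolerance)
    e = math.log(a, b)
    if e > k:
        return f"Case A: O(n^{e:g})"
    if e < k:
        return f"Case B: Omega(n^{k:g})"
    return f"Case C: Theta(n^{k:g} * log n)"

print(master_case(1, 2, 1))  # Q6 -> Case B
print(master_case(2, 2, 0))  # Q7 -> Case A
print(master_case(2, 2, 1))  # Q8 -> Case C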

 

 

Q6- Find the time complexity of T(n) = T(n/2) + n.

Ans –

Apply the master theorem.

In this equation, a = 1, b = 2, c1 = 1 and k = 1

-> n^(log_b a) = n^(log_2 1) = n^0 = 1

-> c1*n^k is n

-> Case B occurs

-> (B) If n^(log_b a) < c1*n^k, then the answer is Ω(c1*n^k).

-> Ω(n) is the answer.
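As a quick sanity check (an illustrative sketch, not part of the original solution, assuming a base case T(1) = 1), evaluating the recurrence directly shows T(n)/n settling near a constant (about 2), which is consistent with the Ω(n) answer.

def T(n):
    # direct evaluation of T(n) = T(n/2) + n with base case T(1) = 1
    if n <= 1:
        return 1
    return T(n // 2) + n

for n in [2**10, 2**15, 2**20]:
    print(n, T(n), T(n) / n)   # the ratio approaches ~2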

 

 

Q7- Find the time complexity of T(n) = 2T(n/2) + 1.

Ans –

Apply the master theorem.

In this equation, a = 2, b = 2, c1 = 1 and k = 0

-> n^(log_b a) = n^(log_2 2) = n^1 = n

-> c1*n^k is 1

-> Case A occurs

-> (A) If n^(log_b a) > c1*n^k, then the answer is O(n^(log_b a)).

-> O(n) is the answer.
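A similar numerical check for Q7 (again only an illustrative sketch, assuming a base case T(1) = 1): evaluating T(n) = 2T(n/2) + 1 on powers of two gives T(n)/n close to 2, matching the O(n) answer.

from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # direct evaluation of T(n) = 2T(n/2) + 1 with base case T(1) = 1
    if n <= 1:
        return 1
    return 2 * T(n // 2) + 1

for n in [2**10, 2**15, 2**20]:
    print(n, T(n), T(n) / n)   # the ratio approaches ~2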

 

 

Q8- Find the time complexity of T(n) = 2T(n/2) + n.

Ans –

Apply the master theorem.

In this equation, a = 2, b = 2, c1 = 1 and k = 1

-> n^(log_b a) = n^(log_2 2) = n^1 = n

-> c1*n^k is n

-> Case C occurs

-> (C) If n^(log_b a) = c1*n^k, then the answer is Θ(n^(log_b a) * log n).

-> Θ(n*log n) is the answer.

You can check these results by applying the iteration method.
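For example, here is one way to check Q8 numerically (an illustrative sketch, assuming a base case T(1) = 1): the ratio T(n) / (n * log2(n)) tends to 1, which matches Θ(n*log n).

import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # direct evaluation of T(n) = 2T(n/2) + n with base case T(1) = 1
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

for n in [2**10, 2**15, 2**20]:
    print(n, T(n) / (n * math.log2(n)))   # the ratio approaches 1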

 

In questions 9 and 10, you should not apply the master theorem, because the extra work term contains a log n factor and is therefore not of the form c1*n^k.

Let’s check

 

Q9- Find the time complexity of T(n) = T(n/2) + log n.

Ans –

Apply the master theorem.

In this equation, a = 1 and b = 2, but the extra work term is log n rather than c1*n^k

-> n^(log_b a) = n^(log_2 1) = n^0 = 1

-> the extra work term is log n, which is larger than 1

-> so it appears that Case B occurs

-> (B) If n^(log_b a) < c1*n^k, then the answer is Ω(c1*n^k)

-> That would give O(log n), but it is incorrect. By applying the iteration method, the answer is O(log n * log n) = O(log^2 n). So this method cannot solve recurrences whose extra work is expressed in terms of log n.
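The mismatch is easy to see numerically (again just an illustrative sketch, assuming a base case T(1) = 1): T(n)/log2(n) keeps growing, while T(n)/(log2(n))^2 settles near 0.5, confirming the log^2 n answer from the iteration method rather than the O(log n) suggested by the wrongly applied master theorem.

import math

def T(n):
    # direct evaluation of T(n) = T(n/2) + log n with base case T(1) = 1
    if n <= 1:
        return 1
    return T(n // 2) + math.log2(n)

for n in [2**10, 2**20, 2**40]:
    L = math.log2(n)
    print(n, T(n) / L, T(n) / (L * L))
# the first ratio keeps growing, the second settles near 0.5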

Test Yourself

Q1- Can you explain the intuition behind each case of the Master Method?

Case 1: If f(n) is asymptotically smaller than the work done by the recursive calls, i.e., f(n) = O(n^(log_b(a) – ε)) for some ε > 0, then the time complexity is dominated by the recursive calls, resulting in Θ(n^(log_b(a))).

 

Case 2: If f(n) is asymptotically equal to the work done by the recursive calls, i.e., f(n) = Θ(n^(log_b(a))), then the time complexity becomes Θ(n^(log_b(a)) * log n).

 

Case 3: If f(n) is asymptotically larger than the work done by the recursive calls, i.e., f(n) = Ω(n^(log_b(a) + ε)) for some ε > 0, and f(n) satisfies the regularity condition (a*f(n/b) ≤ c*f(n) for some constant c < 1 and sufficiently large n), then the time complexity is determined by f(n), resulting in Θ(f(n)).
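The regularity condition in Case 3 is easy to check numerically for a concrete recurrence. The sketch below uses a hypothetical example (not one of the questions above), T(n) = 2T(n/2) + n^2, where a*f(n/b)/f(n) is exactly 0.5 for every n, so the constant c = 0.5 < 1 works.

def f(n):
    # driving function of the hypothetical recurrence T(n) = 2T(n/2) + n^2
    return n * n

a, b = 2, 2
for n in [10, 100, 1000, 10**6]:
    print(n, a * f(n / b) / f(n))   # prints 0.5 every time, so c = 0.5 < 1 works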

Q2- What are the advantages and limitations of the Master Method in analyzing algorithm time complexity?

The Master Method provides a straightforward and efficient way to determine the time complexity for specific types of recursive algorithms. However, it is limited to a specific form of recurrence relations and may not be applicable to all recursive algorithms.

Q3- Can the Master Method be used to analyze space complexity as well?

No, the Master Method is specifically designed for analyzing the time complexity of recursive algorithms. It does not directly apply to space complexity analysis.

Q4- Use the Master Theorem to determine the time complexity of the following recursive function:
T(n) = 4T(n/2) + n^2

To apply the Master Theorem, we need to express the recursive relation in the standard form: T(n) = a * T(n/b) + f(n)

In this case,

a = 4

b = 2

f(n) = n^2

Now, let’s compare f(n) with n^(log_b(a)) = n^(log_2 4) = n^2. They are the same.

According to the Master Theorem, Case 2 occurs

Case 2: If f(n) = Θ(n^(log_b(a))), then T(n) = Θ(n^(log_b(a)) * log n).

In our case, f(n) = n^2 and n^(log_b(a)) = n^2. Since f(n) is equal to n^(log_b(a)), the time complexity is Θ(n^2 * log n).
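As a final numerical check (an illustrative sketch, assuming a base case T(1) = 1), evaluating T(n) = 4T(n/2) + n^2 on powers of two gives T(n) / (n^2 * log2(n)) close to 1, consistent with Θ(n^2 * log n).

import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # direct evaluation of T(n) = 4T(n/2) + n^2 with base case T(1) = 1
    if n <= 1:
        return 1
    return 4 * T(n // 2) + n * n

for n in [2**10, 2**15, 2**20]:
    print(n, T(n) / (n * n * math.log2(n)))   # the ratio approaches 1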
