# Master Method for Solving Recurrences

When analyzing an algorithm, we only care about its asymptotic behavior. Recursive algorithms are no different. Rather than solving the recurrence relation associated with the cost of an algorithm exactly, it is enough to give an asymptotic characterization. The main tool for doing this is the master method.

**Master Method**

The Master Method provides a **“cookbook”** method for solving recurrences of the form

$$\bbox[pink,5px]{T(n) = a T\left(\frac{n}{b}\right) + f(n)}$$

where \(a \ge 1\) and \(b \gt 1\) are constants and \(f(n)\) is an asymptotically positive function (that is, there exists some value \(n_0\) such that \(f(n) \gt 0\) for all \(n \ge n_0\)).

The Master Method requires memorizing the following three cases:

**Case I: If** \( f(n) = \mathcal O(n^{\log_b a \,-\, \epsilon})\) **for some constant** \(\epsilon \gt 0\)**, then**

$$\bbox[pink,5px]{ T(n) = \Theta (n^{\log_b a})}$$

**Case II: If** \( f(n) = \Theta(n^{\log_b a} \log^k n)\) **for some constant** \(k \ge 0\)**, then**

$$\bbox[pink,5px]{ T(n) = \Theta (n^{\log_b a}\log^{k+1}n)}$$

**Case III: If** \( f(n) = \Omega(n^{\log_b a \,+\, \epsilon})\) **for some constant** \(\epsilon \gt 0\) **and** \(f(n)\) **satisfies the regularity condition, then**

$$\bbox[pink,5px]{ T(n) = \Theta (f(n))}$$

The regularity condition is given below:

$$\bbox[pink,5px]{a f\left(\frac{n}{b}\right) \le c f(n)}$$ for some constant \(c \lt 1\) and all sufficiently large \(n\).
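As a quick sanity check on the three cases, the sketch below (the sample recurrences and helper name are my own, not from the article) evaluates \(T(n)\) directly and compares it against the predicted bound: \(T(n)=8T(n/2)+n\) falls in Case I (\(n^{\log_2 8}=n^3\)), \(T(n)=2T(n/2)+n\) in Case II, and \(T(n)=2T(n/2)+n^2\) in Case III. The ratios stay bounded, as the theorem predicts.

```python
import math
from functools import lru_cache

def solve(a, b, f, n):
    """Evaluate T(n) = a*T(n//b) + f(n) directly, with T(1) = 1."""
    @lru_cache(maxsize=None)
    def T(m):
        if m <= 1:
            return 1
        return a * T(m // b) + f(m)
    return T(n)

# Case I:   T(n) = 8T(n/2) + n    -> Theta(n^3)      (n^{log_2 8} = n^3)
# Case II:  T(n) = 2T(n/2) + n    -> Theta(n log n)  (n^{log_2 2} = n)
# Case III: T(n) = 2T(n/2) + n^2  -> Theta(n^2)      (regularity holds)
n = 2 ** 16
print(solve(8, 2, lambda m: m, n) / n ** 3)              # -> about 4/3
print(solve(2, 2, lambda m: m, n) / (n * math.log2(n)))  # -> about 1
print(solve(2, 2, lambda m: m * m, n) / n ** 2)          # -> about 2
```

The constant factors (4/3, 1, 2) come from summing the per-level work; the point is only that each ratio converges to a constant, confirming the Θ bound.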

**Explanation of Master Method**

The recurrence relation of the Master Theorem

$$\bbox[pink,5px]{T(n) = a T\left(\frac{n}{b}\right) + f(n)}$$

signifies that a problem of size \(n\) is divided into \(a\) subproblems, each of size \(\frac{n}{b}\), and the work done in dividing and combining is \(f(n)\). The recursion tree of this recurrence relation is shown below.

Let \(k\) be the height of the tree. Then

\begin{align}
\frac{n}{b^k} = 1\\
\Rightarrow b^k = n\\
\Rightarrow k = \log_b n
\end{align}

So the number of leaf nodes in the recursion tree is \(a^k\), where \(k\) is the height of the tree. Putting in the value of \(k\), we get

\begin{align}
a^k = a^{\log_b n}\\
= n^{\log_b a}
\end{align}
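The identity used in the last step, \(a^{\log_b n} = n^{\log_b a}\), follows by taking \(\log_b\) of both sides. A quick numeric check (sample values are my own):

```python
import math

a, b, n = 8, 2, 1024            # sample values; any a >= 1, b > 1, n > 1 work
leaves = a ** math.log2(n)      # a^{log_b n}: leaf count as branching^height
same = n ** math.log2(a)        # n^{log_b a}: the same count, rewritten
print(leaves, same)             # both equal 8^10 = 2^30 = 1073741824
```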

So now we have two terms.

- Work done at the root which is equal to \(f(n)\).
- Work done at the leaf nodes which is equal to \(n^{\log_b a}\).

Do you see any similarity between these values and the three cases of the Master Theorem? Let us revisit the Master Theorem in light of these two values.

We compare the work done at the root, \(f(n)\), with the work done at the leaf nodes, \(n^{\log_b a}\). The following three cases are possible.

**Case 1: When the work done at the leaf nodes is greater than the work done at the root:**

In this case \(f(n)\) is not only less than \(n^{\log_b a}\), it must be polynomially smaller. That is, \(f(n)\) must be asymptotically smaller than \(n^{\log_b a}\) by a factor of \(n^\epsilon\) for some constant \(\epsilon \gt 0\). In this case the solution of the recurrence relation is

$$\bbox[pink,5px]{ T(n) = \Theta (n^{\log_b a})}$$

**Case 2: When the work done at the leaf nodes is the same as the work done at the root:**

In this case the work is evenly distributed throughout the recursion tree. Here \(f(n)\) need not be exactly equal to \(n^{\log_b a}\); it may differ by a polylog factor \(\log^k n\). In this case the solution of the recurrence relation is

$$\bbox[pink,5px]{ T(n) = \Theta (n^{\log_b a}\log^{k+1}n)}$$

If \(f(n)\) is equal to \(n^{\log_b a}\) and the polylog factor is one (i.e., \(\log^{k}n = 1 \Rightarrow k = 0\)), then

$$\bbox[pink,5px]{ T(n) = \Theta (n^{\log_b a}\log n)}$$
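For a concrete instance of Case 2 with a non-trivial polylog factor (my own example, not from the article): \(T(n) = 2T(n/2) + n\log_2 n\) has \(f(n) = \Theta(n^{\log_2 2}\log^1 n)\), so \(k = 1\) and the theorem predicts \(\Theta(n\log^2 n)\). A direct evaluation sketch:

```python
import math

def T(n):
    """T(n) = 2T(n/2) + n*log2(n), with T(1) = 1 -- Case 2 with k = 1."""
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n * math.log2(n)

n = 2 ** 16
# Unrolling gives T(2^j) = 2^j * (1 + j(j+1)/2), i.e. Theta(n log^2 n).
print(T(n) / (n * math.log2(n) ** 2))  # ratio approaches 1/2
```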

**Case 3: When the work done at the leaf nodes is smaller than the work done at the root:**

In this case \(f(n)\) is not only larger than \(n^{\log_b a}\), it must be polynomially larger. That is, \(f(n)\) must be asymptotically larger than \(n^{\log_b a}\) by a factor of \(n^\epsilon\) for some constant \(\epsilon \gt 0\). In this case the solution of the recurrence relation is

$$\bbox[pink,5px]{ T(n) = \Theta (f(n))}$$

In the third case one more condition is required: the work done must keep decreasing from level to level. This condition is called the **Regularity Condition**.

The work done at the first level (the root) is \(f(n)\).

The work done at the second level is \(af\left(\frac{n}{b}\right)\).

The regularity condition requires that the work done at the root is always greater than the work done at the next level:

$$\bbox[pink,5px]{a f\left(\frac{n}{b}\right) \le c f(n)}$$ for some constant \(c \lt 1\) and all sufficiently large \(n\).
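A numeric spot check of this condition can build intuition (the helper name and sample sizes are mine; a finite check is evidence, not a proof). For \(f(n) = n^2\) with \(a = b = 2\), we get \(a f(n/b) = n^2/2\), so any \(c \in [1/2, 1)\) works, while \(c = 0.4\) does not:

```python
def regularity_holds(a, b, f, c, sizes):
    """Spot-check a*f(n/b) <= c*f(n) at the given sizes (evidence, not a proof)."""
    return all(a * f(n / b) <= c * f(n) for n in sizes)

# f(n) = n^2 with a = b = 2: a*f(n/b) = n^2/2, so any c in [1/2, 1) works.
print(regularity_holds(2, 2, lambda n: n * n, 0.5, range(2, 10000)))  # True
print(regularity_holds(2, 2, lambda n: n * n, 0.4, range(2, 10000)))  # False
```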

**How to Apply the Master Method**

- Compare the given recurrence with \( T(n) = a T\left(\frac{n}{b}\right) + f(n)\) and find the values of \(a\), \(b\), and \(f(n)\).
- Compute \(n^{\log_b a}\) and compare it with \(f(n)\).
- Find the complexity of the algorithm by applying the following rule.

$$
T(n) =
\begin{cases}
\Theta (n^{\log_b a}), & \text{if $f(n)$ is polynomially smaller than $n^{\log_b a}$ (by a factor $n^\epsilon$)} \\[2ex]
\Theta (n^{\log_b a}\log^{k+1}n), & \text{if $f(n) = \Theta(n^{\log_b a}\log^{k}n)$} \\[2ex]
\Theta (f(n)), & \text{if $f(n)$ is polynomially larger than $n^{\log_b a}$ (by a factor $n^\epsilon$)}
\end{cases}
$$

In Case 3 we are required to check one more condition, the regularity condition, which ensures that the work done at the root, \(f(n)\), is always greater than the work done at the next level, \(af\left(\frac{n}{b}\right)\), i.e.,

$$\bbox[pink,5px]{a f\left(\frac{n}{b}\right) \le c f(n)}$$
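The recipe above can be sketched as a small classifier for the common shape \(f(n) = \Theta(n^d \log^k n)\) (the function name, float tolerance, and output format are my choices, not part of the theorem):

```python
import math

def master(a, b, d, k=0):
    """Classify T(n) = a*T(n/b) + Theta(n^d * log^k n) using the three rules.

    Returns a Theta(...) string; assumes f(n) has polynomial-times-polylog shape.
    """
    crit = math.log(a) / math.log(b)   # critical exponent log_b a
    eps = 1e-12                        # tolerance for float comparison
    if d < crit - eps:                 # f polynomially smaller -> Case 1
        return f"Theta(n^{crit:g})"
    if abs(d - crit) <= eps:           # f matches up to a polylog -> Case 2
        return f"Theta(n^{d:g} log^{k + 1} n)"
    # f polynomially larger -> Case 3; regularity holds automatically for
    # f(n) = n^d log^k n when d > log_b a, since a*f(n/b) ~ (a/b^d)*f(n).
    return f"Theta(n^{d:g} log^{k} n)" if k else f"Theta(n^{d:g})"

print(master(8, 2, 1))  # T(n) = 8T(n/2) + n    -> Theta(n^3)
print(master(2, 2, 1))  # T(n) = 2T(n/2) + n    -> Theta(n^1 log^1 n)
print(master(2, 2, 2))  # T(n) = 2T(n/2) + n^2  -> Theta(n^2)
```

Note the float tolerance when comparing \(d\) with \(\log_b a\): the comparison must be "polynomially smaller/larger", so exact equality up to rounding belongs to Case 2.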

**Pitfall of Master Method**

We cannot use the Master Method in the following cases:

- **\(T(n)\) is not monotone**, e.g., \(T(n) = \sin n\).
- **\(f(n)\) is not a polynomial**, e.g., \(T(n) = 2 T\left(\frac{n}{2}\right) + 2^n \).
- **\(b\) cannot be expressed as a constant**, e.g., \( T(n) = T(\sqrt n)\).


This article is contributed by Ram Kripal. If you like eLgo Academy and would like to contribute, you can mail your article to admin@elgoacademy.org. See your article appearing on the eLgo Academy page and help other Geeks. Please write comments if you find anything incorrect, or you want to share more information about the topic discussed above.