# Finding Time Complexity of a Program

We discussed the RAM model of computation, time complexity, and asymptotic notations in the previous post. In this post we discuss how to compute the time complexity of different types of programs. First, let us revisit the \(\mathcal O\)-notation, because it is the most important asymptotic notation and the one used most of the time in asymptotic analysis. \(\mathcal O\) is defined as:

$$\mathcal O(g(n)) = \{ f(n): \exists\ \text{positive constants}\ c\ \text{and}\ n_0\ \text{such that}\ 0\le f(n) \le cg(n)\ \forall\, n \ge n_0\}$$

From the above definition, it is clear that \(\mathcal O(g(n))\) is the set of functions whose order of growth is at most that of \(g(n)\). For example, \(\mathcal O(n^2)\) contains functions such as \(1\), \(n\), and \(n\log n\), so the sets \(\mathcal O(1)\), \(\mathcal O(n)\), and \(\mathcal O(n\log n)\) are all subsets of \(\mathcal O(n^2)\).
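As a quick check of the definition (a worked example added here, not from the original derivation), we can verify that \(3n + 2 \in \mathcal O(n)\) by picking \(c = 4\) and \(n_0 = 2\):

$$0 \le 3n + 2 \le 4n \quad \forall\; n \ge 2$$

The inequality \(3n + 2 \le 4n\) is equivalent to \(n \ge 2\), so the chosen constants satisfy the definition and \(3n + 2\) grows no faster than \(n\).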

**Guidelines for Finding Time Complexity**

There are some general rules to help us in determining the running time of an algorithm.

**1. Loops**

The running time of a loop is, at most, the running time of the statements inside the loop (including tests) multiplied by the number of iterations.

```c
for (i = 1; i <= n; i++) {  // loop executes n times
    m = m + 2;              // constant time using RAM model
}
```

To analyze the complexity of this program, note that the loop executes \(n\) times, and assume that the statement m = m + 2; takes constant time \(c\).

$$\text{Total time} = c \cdot n$$

**Hence time complexity is:** \( \mathcal O(n)\)
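As a sanity check (a sketch added here, not part of the original post), we can count the loop-body executions directly; `count_ops_single` is a hypothetical helper written for this illustration:

```c
#include <assert.h>

/* Hypothetical helper: count how many times the loop body above runs. */
long count_ops_single(long n) {
    long ops = 0;
    long m = 0;
    for (long i = 1; i <= n; i++) {
        m = m + 2;  /* the constant-time statement from the example */
        ops++;      /* one basic operation per iteration */
    }
    (void)m;        /* silence unused-variable warnings */
    return ops;
}
```

Doubling \(n\) doubles the count, which is exactly the signature of linear, \(\mathcal O(n)\), growth.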

**2. Nested Loops**

To find the time complexity of a nested loop, analyze it from the inside out. The total running time is the product of the sizes of all the loops.

```c
for (i = 1; i <= n; i++) {      // outer loop executes n times
    for (j = 1; j <= n; j++) {  // inner loop executes n times
        m = m + 2;              // constant time 'c' using RAM model
    }
}
```

To analyze the complexity of this program, first check the inner loop, which executes \(n\) times, and then the outer loop, which also executes \(n\) times. Assume that m = m + 2; takes constant time \(c\).

$$\text{Total time} = c \cdot n \cdot n$$

**Hence time complexity is:** \( \mathcal O(n^2)\)
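The product rule can be checked empirically the same way (again a sketch with a hypothetical helper, `count_ops_nested`):

```c
#include <assert.h>

/* Hypothetical helper: count body executions of the nested loop above. */
long count_ops_nested(long n) {
    long ops = 0;
    long m = 0;
    for (long i = 1; i <= n; i++) {      /* outer loop: n iterations */
        for (long j = 1; j <= n; j++) {  /* inner loop: n iterations each */
            m = m + 2;
            ops++;                       /* one body execution */
        }
    }
    (void)m;  /* silence unused-variable warnings */
    return ops;
}
```

For \(n = 10\) the body runs \(100\) times: the counts grow as \(n^2\), not \(n\).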

**3. Consecutive Statements**

If the program consists of consecutive statements, add the time complexities of the individual statements.

```c
x = x + 1;                      // constant time c0
for (i = 1; i <= n; i++)        // executed n times
    m = m + 2;                  // constant time c1
for (i = 1; i <= n; i++) {      // outer loop executed n times
    for (j = 1; j <= n; j++) {  // inner loop executed n times
        k = k + 1;              // constant time c2
    }
}
```

$$\text{Total time} = c_0 + c_1 \cdot n + c_2 \cdot n^2$$

**Hence time complexity is:** \( \mathcal O(n^2)\)
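Putting the three consecutive pieces into one function makes the sum \(1 + n + n^2\) visible directly; `count_ops_sequence` is a hypothetical helper for this sketch:

```c
#include <assert.h>

/* Hypothetical helper: count the operations of the three consecutive
   statements above: one assignment, one simple loop, one nested loop. */
long count_ops_sequence(long n) {
    long ops = 0;
    long x = 0, m = 0, k = 0;
    x = x + 1; ops++;                    /* c0 part: executed once */
    for (long i = 1; i <= n; i++) {      /* c1 part: n executions */
        m = m + 2; ops++;
    }
    for (long i = 1; i <= n; i++) {      /* c2 part: n * n executions */
        for (long j = 1; j <= n; j++) {
            k = k + 1; ops++;
        }
    }
    (void)x; (void)m; (void)k;           /* silence unused warnings */
    return ops;
}
```

For \(n = 10\) this gives \(1 + 10 + 100 = 111\) operations; the \(n^2\) term dominates, matching the \(\mathcal O(n^2)\) verdict.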

**4. If-then-else Statements**

To get the worst-case time complexity of a program with if-then-else, add the running time of the test to the larger of the running times of the then part and the else part.

```c
if (length() == 0) {                              // test: constant time c0
    return false;                                 // constant time c1
} else {
    for (int n = 0; n < length(); n++) {          // loop runs n times
        if (!list[n].equals(otherList.list[n])) { // test: constant time c2
            return false;                         // constant time c3
        }
    }
}
```

Observing the above program, the then part takes \((c_0 + c_1)\) time and the else part takes \((c_0 + (c_2 + c_3)*n)\) time. Clearly, the else part takes more time than the then part in the worst case. Hence

$$\text{Total time} = c_0 + (c_2 + c_3) \cdot n$$

**Hence time complexity is:** \( \mathcal O(n)\)
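The same worst-case reasoning can be sketched for plain int arrays; `compare_count` is a hypothetical stand-in for the equals-based loop above, returning the number of element comparisons actually performed:

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical sketch: count element comparisons when checking two
   int arrays for equality, mirroring the if-then-else above. */
long compare_count(const int *a, const int *b, size_t len) {
    long comparisons = 0;
    if (len == 0) {
        return comparisons;             /* then part: constant time */
    }
    for (size_t i = 0; i < len; i++) {  /* else part: at most n tests */
        comparisons++;
        if (a[i] != b[i]) {
            break;                      /* early exit on first mismatch */
        }
    }
    return comparisons;
}
```

The worst case occurs when the arrays are equal (or differ only at the last position): all \(n\) comparisons run, which is why we charge the else branch with the full \((c_2 + c_3) \cdot n\).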

**5. Logarithmic complexity**

An algorithm is \( \mathcal O(\log n)\) if it takes constant time to cut the problem size by a fraction (usually 1/2). As an example let us consider the following program:

```c
for (i = 1; i <= n; ) {
    i = i * 2;  // i doubles every iteration
}
```

If we observe carefully, the value of \(i\) doubles every iteration. Initially \(i = 1\), in the next step \(i = 2\), and in subsequent steps \(i = 4, 8, \ldots\) and so on. Suppose the loop executes \(k\) times. After the \(k^{th}\) step \(i = 2^k\), and the loop terminates when \(2^k = n\). Taking the logarithm of both sides gives

$$\log_2(2^k) = \log_2 n$$

$$k\log_2 2 = \log_2 n$$

$$k = \log_2 n$$

**Hence time complexity is:** \( \mathcal O(\log_2 n)\)

Similarly, for the case below, the worst-case time complexity is also \( \mathcal O(\log_2 n)\). The same argument holds for a decreasing sequence as well.

```c
for (i = n; i >= 1; ) {  // note the condition is i >= 1, not i <= 1
    i = i / 2;           // problem size halves every iteration
}
```
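The halving loop can be counted the same way (a sketch with a hypothetical helper, `halving_steps`); for powers of two it performs exactly as many iterations as the doubling loop:

```c
#include <assert.h>

/* Hypothetical helper: count iterations of the halving loop above. */
long halving_steps(long n) {
    long steps = 0;
    for (long i = n; i >= 1; ) {
        i = i / 2;  /* problem size halves every iteration */
        steps++;
    }
    return steps;
}
```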

This article is contributed by Ram Kripal.