Lesson 4, Topic 2

# Time Complexity


The time complexity of an algorithm is the amount of computer time it needs to run to completion.

The time T(P) taken by a program P is the sum of the compile time and the run (or execution) time. The compile time does not depend on the instance characteristics, and we may assume that a compiled program will be run several times without recompilation. Consequently, we concern ourselves with just the run time of a program. This run time is denoted by tp(instance characteristics).

Because many of the factors tp depends on are not known at the time a program is conceived, it is reasonable to attempt only to estimate tp. If we knew the characteristics of the compiler to be used, we could determine the number of additions, subtractions, multiplications, divisions, compares, loads, stores, and so on, that would be made by the code for P. So, we could obtain an expression for tp(n) of the form:

tp(n) = ca·ADD(n) + cs·SUB(n) + cm·MUL(n) + cd·DIV(n) + ···

where n denotes the instance characteristics; ca, cs, cm, cd, and so on, respectively, denote the time needed for an addition, subtraction, multiplication, division, and so on; and ADD, SUB, MUL, DIV, and so on, are functions whose values are the numbers of additions, subtractions, multiplications, divisions, and so on, that are performed when the code for P is used on an instance with characteristic n.

We can determine the number of steps needed by a program to solve a particular problem instance in one of two ways. In the first method, we introduce a new variable, count, into the program. This is a global variable with initial value 0. Statements to increment count by the appropriate amount are introduced into the program. This is done so that each time a statement in the original program is executed, count is incremented by the step count of that statement.

Ex 1: When the statements to increment count are introduced into Algorithm 2, the result is Algorithm 4. The change in the value of count by the time this program terminates is the number of steps executed by Algorithm 2.

Since we are interested in determining only the change in the value of count, Algorithm 4 may be simplified to Algorithm 5. For every initial value of count, Algorithms 4 and 5 compute the same final value for count. It is easy to see that in the for loop, the value of count will increase by a total of 2n. If count is zero to start with, then it will be 2n + 3 on termination. So each invocation of Sum (Algorithm 2) executes a total of 2n + 3 steps.

Ex 2: When the statements to increment count are introduced into Algorithm 3, Algorithm 6 is obtained. Let tRSum(n) be the increase in the value of count when Algorithm 6 terminates. We see that tRSum(0) = 2. When n > 0, count increases by 2 plus whatever increase results from the invocation of RSum from within the else clause. From the definition of tRSum it follows that this additional increase is tRSum(n - 1). So, if the value of count is zero initially, its value at the time of termination is 2 + tRSum(n - 1), n > 0.

In Table 1, the number of steps per execution and the frequency of each of the statements in Sum (Algorithm 2) have been listed. The total number of steps required by the algorithm is determined to be 2n + 3. It is important to note that the frequency of the for statement is n + 1 and not n. This is so because i has to be incremented to n + 1 before the for loop can terminate.
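The n + 1 frequency of the for statement can be seen directly by counting condition evaluations. The sketch below (an illustration, not part of the textbook's algorithms) rewrites a for loop "for i := 1 to n" explicitly and counts how many times its test is evaluated: n times with the test true, plus one final evaluation that is false and ends the loop.

```python
def for_test_frequency(n):
    """Count how many times the loop test i <= n is evaluated for 'for i := 1 to n'."""
    tests = 0
    i = 1
    while True:
        tests += 1          # one evaluation of the for-statement's test
        if not (i <= n):
            break           # the (n+1)-th evaluation is false and ends the loop
        i += 1
    return tests
```

For n = 5 the test is evaluated 6 times; even for n = 0 it is evaluated once, which is why the for statement's frequency column reads n + 1.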

Table 2 gives the step count for RSum (Algorithm 3). Notice that under the s/e (steps per execution) column, the else clause has been given a count of 1 + tRSum(n - 1). This is the total cost of this line each time it is executed. It includes all the steps that get executed as a result of the invocation of RSum from the else clause. The frequency and total steps columns have been split into two parts: one for the case n = 0 and the other for the case n > 0. This is necessary because the frequency (and hence total steps) for some statements is different for each of these cases.