Time complexity is the amount of time an algorithm takes to run, expressed as a function of the length of its input. Rather than measuring an algorithm's actual execution time, it counts the cost of executing each statement and describes how the total running time grows or shrinks as the number of operations increases or decreases. In other words, the time required is determined solely by the length of the input.
The time complexity of an algorithm denotes the total time the program needs to run to completion. It is written using asymptotic notation, most commonly big O notation. In the following tutorial, we will go through it in depth.
The most popular way to evaluate time complexity is to count the number of elementary steps an algorithm takes to finish. For example, a loop that executes n times takes at least n steps, and the time taken grows as n grows, whereas a statement that never depends on n and always produces its result in one step has constant time complexity. Because an algorithm's performance can also vary with the kind of input it receives, we usually report the worst-case time complexity, since that is the most time the algorithm can require for any input of a given size.
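To make this concrete, here is a minimal C sketch of the two situations described above; the function names, the array, and the sample values are only assumptions for illustration. The first function does work proportional to n, while the second always finishes in one step:

#include <stdio.h>

/* Linear time: the loop body runs once per element, so the work grows with n. */
int sum_of_array(int arr[], int n) {
    int total = 0;
    for (int i = 0; i < n; i++) {   /* executes n times */
        total += arr[i];
    }
    return total;
}

/* Constant time: one step, regardless of how large n is. */
int first_element(int arr[], int n) {
    (void)n;                        /* n does not affect the work done */
    return arr[0];
}

int main(void) {
    int data[] = {4, 8, 15, 16, 23, 42};
    int n = 6;
    printf("sum = %d, first = %d\n", sum_of_array(data, n), first_element(data, n));
    return 0;
}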
Example 1:
for(i=0; i<n; i++){      // requires n+1 units of time
    statement.......;    // requires n units of time
}
Time analysis: f(n) = (n+1) + n = 2n + 1, which means this algorithm requires time of order n, i.e. O(n) [the highest power of n in the time function].
Example 2:
for(i=n; i>0; i--){      // requires n+1 units of time
    statement.......;    // requires n units of time
}
Time analysis: f(n) = (n+1) + n = 2n + 1, which means this algorithm requires time of order n, i.e. O(n) [the highest power of n in the time function].
Example 3:
for(i=0; i<n; i=i+2){    // requires n/2 + 1 units of time
    statement.......;    // requires n/2 units of time
}
Time analysis: f(n) = (n/2 + 1) + n/2 = n + 1, which means this algorithm requires time of order n, i.e. O(n) [the highest power of n in the time function].
Example 4:
for(i=0; i<n; i++){          // requires n+1 units of time
    for(j=0; j<n; j++){      // requires n(n+1) units of time
        statement.......;    // requires n*n units of time
    }
}
Time analysis: f(n) = (n+1) + n(n+1) + n*n = 2n^2 + 2n + 1, which means this algorithm requires time of order n^2, i.e. O(n^2) [the highest power of n in the time function].
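As a quick check of the count 2n^2 + 2n + 1, the following sketch tallies every condition test and every execution of the statement; the counter variables and the sample value of n are illustrative assumptions, not part of the example itself:

#include <stdio.h>

int main(void) {
    long n = 50;                  /* any sample input size */
    long outer_tests = 0, inner_tests = 0, body_runs = 0;

    for (long i = 0; outer_tests++, i < n; i++) {       /* counts every test of i < n */
        for (long j = 0; inner_tests++, j < n; j++) {   /* counts every test of j < n */
            body_runs++;                                 /* the statement itself */
        }
    }

    long counted = outer_tests + inner_tests + body_runs;
    printf("counted = %ld, formula 2n^2 + 2n + 1 = %ld\n", counted, 2*n*n + 2*n + 1);
    return 0;
}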
Example 5:
for(i=0; i<n; i++){
    for(j=0; j<i; j++){
        statement.......;
    }
}
Time analysis: for each value of i, the statement executes i times, and the final failed test of j < i adds one more step, so the i-th pass of the outer loop costs i + 1 steps.

i        j takes the values        statement executes
0        (none)                    0
1        0                         1
2        0, 1                      2
3        0, 1, 2                   3
.        .                         .
.        .                         .
.        .                         .
n-1      0, 1, ......, n-2         n-1

f(n) = 1 + 2 + 3 + ...... + n = n(n+1)/2 = (n^2 + n)/2, which means this algorithm requires time of order n^2, i.e. O(n^2) [the highest power of n in the time function].
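The sum n(n+1)/2 can be verified empirically. The sketch below (the counter name and the sample n are assumptions for illustration) counts each execution of the statement plus the one failed inner test per pass, exactly as in the analysis above:

#include <stdio.h>

int main(void) {
    long n = 40;
    long steps = 0;

    for (long i = 0; i < n; i++) {
        for (long j = 0; j < i; j++) {
            steps++;            /* the statement: runs i times for this i */
        }
        steps++;                /* the one failed test of j < i that ends the inner loop */
    }

    printf("counted = %ld, formula n(n+1)/2 = %ld\n", steps, n*(n+1)/2);
    return 0;
}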
Example 6:
p = 0;
for(i=1; p<=n; i++){
    p = p + i;
}
Time analysis:
The loop stops as soon as p > n. After the k-th iteration:

i        p
1        0+1 = 1
2        1+2 = 3
3        1+2+3 = 6
4        1+2+3+4 = 10
.        .
.        .
.        .
k        1+2+3+......+k = k(k+1)/2

So p = k(k+1)/2 = (k^2 + k)/2. The loop stops when p > n, i.e. when k(k+1)/2 > n; ignoring the lower-order term and the constant factor, k^2 > n, so k > √n.
f(n) = √n, which means this algorithm requires time of order √n, i.e. O(√n) [the dominant term of the time function].
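The square-root growth can be observed directly. The following sketch runs the loop from this example and compares the iteration count with √(2n), which is proportional to √n; the sample n and the use of sqrt() from <math.h> are illustrative choices:

#include <stdio.h>
#include <math.h>

int main(void) {
    long n = 1000000;
    long p = 0, iterations = 0;

    for (long i = 1; p <= n; i++) {    /* the loop from Example 6 */
        p = p + i;
        iterations++;
    }

    /* iterations is about sqrt(2n); the constant factor does not change the order */
    printf("iterations = %ld, sqrt(2n) = %.1f\n", iterations, sqrt(2.0 * n));
    return 0;
}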
Example 7:
for(i=1; i<n; i=i*2){
    statement.......;
}
Time analysis:
i takes the values 1, 2, 2^2, 2^3, ......, 2^k.
The loop stops when i >= n, i.e. when 2^k >= n.
Taking 2^k = n gives k = log n (to base 2).
f(n) = log n, which means this algorithm requires time of order log n, i.e. O(log n) [the dominant term of the time function].
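A small counter makes the logarithmic growth visible. The sketch below runs the doubling loop (with i starting at 1, as above) for a few sample sizes; the chosen sizes are arbitrary:

#include <stdio.h>

int main(void) {
    long sizes[] = {16, 1024, 1000000};

    for (int s = 0; s < 3; s++) {
        long n = sizes[s];
        long iterations = 0;
        for (long i = 1; i < n; i = i * 2) {   /* the doubling loop from Example 7 */
            iterations++;
        }
        printf("n = %ld -> iterations = %ld (about log2 n)\n", n, iterations);
    }
    return 0;
}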
Example 8:
for(i=n; i>=1; i=i/2){
    statement.......;
}
Time analysis:
i takes the values n, n/2, n/2^2, n/2^3, ......, n/2^k.
The loop stops when i < 1, i.e. when n/2^k < 1.
Taking n/2^k = 1 gives n = 2^k, so k = log n (to base 2).
f(n) = log n, which means this algorithm requires time of order log n, i.e. O(log n) [the dominant term of the time function].
Example 9:
for(i=0; i*i<n; i++){
    statement.......;
}
Time analysis:
The loop runs as long as i*i < n and stops as soon as i*i >= n.
Taking i^2 = n gives i = √n.
f(n) = √n, which means this algorithm requires time of order √n, i.e. O(√n) [the dominant term of the time function].
Example 10:
for(i=0; i<n; i++){      // requires n+1 units of time
    statement.......;    // requires n units of time
}
for(j=0; j<n; j++){      // requires n+1 units of time
    statement.......;    // requires n units of time
}
Time analysis: f(n) = (n+1) + n + (n+1) + n = 4n + 2, which means this algorithm requires time of order n, i.e. O(n) [the highest power of n in the time function].
Example 11:
p = 0;
for(i=1; i<n; i=i*2){    // runs about log n times
    p++;                 // so p is about log n when this loop ends
}
for(j=1; j<p; j=j*2){    // runs about log p = log log n times
    statement.......;
}
Time analysis: the first loop runs about log n times, leaving p ≈ log n; the second loop then runs about log p = log log n times. So f(n) = log n + log log n, which is of order log n, i.e. O(log n); the second loop by itself contributes only O(log log n) [the dominant term of the time function].
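To see how the first loop dominates, the sketch below counts the iterations of both loops separately; the sample value of n is an arbitrary illustration:

#include <stdio.h>

int main(void) {
    long n = 1000000;
    long p = 0, first_loop = 0, second_loop = 0;

    for (long i = 1; i < n; i = i * 2) {   /* about log2 n iterations */
        p++;
        first_loop++;
    }
    for (long j = 1; j < p; j = j * 2) {   /* about log2 p = log2 log2 n iterations */
        second_loop++;
    }

    printf("first loop = %ld (~ log n), second loop = %ld (~ log log n)\n",
           first_loop, second_loop);
    return 0;
}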
Example 12:
for(i=1; i<n; i++){          // requires about n units of time
    for(j=1; j<n; j=j*2){    // requires about n log n units of time
        statement.......;    // requires about n log n units of time
    }
}
Time analysis: f(n) ≈ n + n log n + n log n = 2n log n + n, which means this algorithm requires time of order n log n, i.e. O(n log n) [the dominant term of the time function].
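The n log n count can be checked the same way. In the sketch below, the comparison value n * log2(n) from <math.h> is only an illustrative reference, not part of the example:

#include <stdio.h>
#include <math.h>

int main(void) {
    long n = 4096;
    long body_runs = 0;

    for (long i = 1; i < n; i++) {
        for (long j = 1; j < n; j = j * 2) {
            body_runs++;             /* the statement: about log2 n times per i */
        }
    }

    printf("statement ran %ld times, n*log2(n) = %.0f\n", body_runs, n * log2((double)n));
    return 0;
}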
for (i=0; i<n ; i++) = O(n)
for (i=1; i<n; i=i*3) = O(log n)
for (i=0; i<n ; i=i+2) = O(n)
for (i=n; i>0; i--) = O(n)
for (i=1; i<n; i=i*2) = O(log n)
for (i=n; i>=1; i=i/2) = O(log n)