Data Structures and Algorithms
Chapter 1 – Algorithm Analysis - 2
Dr. Georges Badr
Time Complexity
The time complexity of an algorithm is typically one of the following:
▪ O(1): Constant
▪ O(log n): Logarithmic
▪ O(n): Linear
▪ O(n*log n): Linearithmic
▪ O(n²): Quadratic
▪ O(n³): Cubic
▪ O(Cⁿ): Exponential
▪ O(n!): Factorial
Time Complexity: O(1)
The time complexity of a function is O(1) if it contains no loops, no recursion,
and no calls to other non-constant-time functions.
For example: a set of non-recursive, non-loop statements, such as the swap
function shown below.
A loop or recursion that runs a constant number of times is also considered
O(1), because the number of iterations does not depend on the input size.
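A minimal sketch of such a constant-time swap in C (the pointer-based signature and the main driver are illustrative assumptions, not taken from the slides):

#include <stdio.h>

/* Swaps two integers through pointers.
   Runs a fixed number of statements regardless of input size: O(1). */
void swap(int *a, int *b)
{
    int tmp = *a;   /* one assignment */
    *a = *b;        /* one assignment */
    *b = tmp;       /* one assignment */
}

int main(void)
{
    int x = 3, y = 7;
    swap(&x, &y);
    printf("%d %d\n", x, y);   /* prints: 7 3 */
    return 0;
}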
Time Complexity: O(log n)
Logarithmic time complexities usually apply to algorithms that divide problems
in half every time.
For instance, let’s say that we want to look for a person in an old phone book. It
has every name sorted alphabetically. There are at least two ways to do it:
▪ Algorithm A
- Start at the beginning of the book and go in order until you find the contact you are looking for. Run-time O(n)
▪ Algorithm B
- Open the book in the middle and check the first name on it.
- If the name you are looking for is alphabetically bigger, look in the right half; otherwise, look in the left half. Repeat until found.
Run-time O(log n)
Time Complexity: Example O(log n)
The binary search algorithm has a complexity of O(log n).
At iteration 1, length of the array = n
At iteration 2, length of the array = n/2
At iteration 3, length of the array = (n/2)/2 = n/2²
Therefore, after iteration k, length of the array = n/2ᵏ
We also know that after k divisions the length of the array becomes 1:
n/2ᵏ = 1  =>  n = 2ᵏ
Applying the log function to both sides:
log₂(n) = log₂(2ᵏ)
log₂(n) = k · log₂(2)
Since logₓ(x) = 1:
k = log₂(n)
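For illustration, a minimal binary search sketch in C, assuming a sorted array of ints (the function name and signature are assumptions, not taken from the slides):

#include <stdio.h>

/* Returns the index of key in the sorted array arr[0..len-1], or -1 if absent.
   Each iteration halves the search range, so the loop runs about log2(len) times. */
int binarySearch(const int *arr, int len, int key)
{
    int low = 0, high = len - 1;
    while (low <= high)
    {
        int mid = low + (high - low) / 2;  /* middle of the current range */
        if (arr[mid] == key)
            return mid;
        else if (arr[mid] < key)
            low = mid + 1;                 /* search the right half */
        else
            high = mid - 1;                /* search the left half  */
    }
    return -1;                             /* not found */
}

int main(void)
{
    int a[] = {2, 5, 8, 12, 16, 23, 38, 56, 72, 91};
    printf("%d\n", binarySearch(a, 10, 23));  /* prints: 5 */
    return 0;
}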
Time Complexity: O(n)
Linear time complexity O(n) means that as the input grows, the algorithm takes
proportionally longer: a function with linear time complexity has a growth
rate of n.
For example:
▪ Get the max/min value in an array
▪ Find a given element in a collection
Time Complexity: Example O(n)
1  int findMax(int *arr, int len)
2  {
3      int i = 0;
4      int max = arr[0];
5      for(i = 1; i < len; i++)
6      {
7          if(arr[i] >= max)
8              max = arr[i];
9      }
10     return max;
11 }
Counting the operations, you get something like this:
- Lines 3-4: 2 operations
- Line 5: a loop of size n
- Lines 6-9: 2 operations inside the loop → 2 * n
- Line 10: 1 operation
So this gives 2n + 3 → keeping only the most significant term, we get n. Finally, using the big O notation, we get O(n).
Time Complexity: O(n log n)
Linearithmic time complexity is slightly slower than a linear algorithm but still
much better than a quadratic algorithm.
Example of linearithmic algorithms:
▪ Efficient sorting algorithms like merge sort, quick sort and others.
Time Complexity: Example O(n log n)
We have already learned that whenever we halve the problem at every step, the
number of steps can be represented by a logarithmic function (log n).
Also, finding the middle of any subarray is a single-step operation, i.e. O(1).
Finally, merging the n elements of the subarrays requires O(n) time.
Hence, the total time of merge sort is O(n * log n).
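For illustration, a minimal merge sort sketch in C, assuming an auxiliary buffer may be used (the function names and buffer handling are assumptions, not taken from the slides):

#include <stdio.h>
#include <stdlib.h>

/* Merges the two sorted halves arr[lo..mid] and arr[mid+1..hi] using tmp: O(n). */
static void merge(int *arr, int *tmp, int lo, int mid, int hi)
{
    int i = lo, j = mid + 1, k = lo;
    while (i <= mid && j <= hi)
        tmp[k++] = (arr[i] <= arr[j]) ? arr[i++] : arr[j++];
    while (i <= mid) tmp[k++] = arr[i++];
    while (j <= hi)  tmp[k++] = arr[j++];
    for (k = lo; k <= hi; k++)
        arr[k] = tmp[k];
}

/* Recursively splits the range in half (about log2(n) levels),
   merging each level in O(n), for a total of O(n log n). */
static void mergeSort(int *arr, int *tmp, int lo, int hi)
{
    if (lo >= hi)
        return;
    int mid = lo + (hi - lo) / 2;     /* O(1) to find the middle */
    mergeSort(arr, tmp, lo, mid);
    mergeSort(arr, tmp, mid + 1, hi);
    merge(arr, tmp, lo, mid, hi);
}

int main(void)
{
    int a[] = {5, 2, 9, 1, 7, 3};
    int n = 6;
    int *tmp = malloc(n * sizeof(int));
    mergeSort(a, tmp, 0, n - 1);
    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);           /* prints: 1 2 3 5 7 9 */
    printf("\n");
    free(tmp);
    return 0;
}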
Time Complexity: O(n²)
A function with a quadratic time complexity has a growth rate of n².
Here are some examples of O(n²) quadratic algorithms:
▪ Check if an array has duplicated values
▪ Sorting elements in an array using bubble sort, insertion sort and selection sort.
Time Complexity: O(n²)
Find duplicates in an unsorted array: O(n²)
for(i = 0 → n)
    for(j = i+1 → n)
        if(array[i] == array[j])
Find duplicates in a sorted array: O(n)
for(i = 0 → n-1)
    if(array[i] == array[i+1])
Alternative: sort the array, then search for duplicates.
Merge sort: O(n log n) + find duplicates: O(n)
O(n log n + n) = O(n log n) < O(n²)
A runnable sketch of both checks is given below.
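A minimal C sketch of both checks (the function names and driver are illustrative assumptions):

#include <stdbool.h>
#include <stdio.h>

/* Unsorted array: compare every pair once, roughly n*(n-1)/2 comparisons: O(n^2). */
bool hasDuplicateUnsorted(const int *arr, int n)
{
    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++)
            if (arr[i] == arr[j])
                return true;
    return false;
}

/* Sorted array: duplicates must be adjacent, so one pass is enough: O(n). */
bool hasDuplicateSorted(const int *arr, int n)
{
    for (int i = 0; i < n - 1; i++)
        if (arr[i] == arr[i + 1])
            return true;
    return false;
}

int main(void)
{
    int a[] = {4, 1, 7, 1, 9};
    int b[] = {1, 2, 3, 5, 8};
    printf("%d %d\n", hasDuplicateUnsorted(a, 5), hasDuplicateSorted(b, 5)); /* prints: 1 0 */
    return 0;
}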
Time Complexity: O(nᶜ)
Polynomial running time is represented as O(nᶜ) with c > 1. As you already saw,
two nested loops translate to O(n²), since for every element the algorithm has
to go through the array again.
Are three nested loops cubic? In most cases, yes!
Usually, we want to stay away from polynomial running times (quadratic, cubic,
O(nᶜ)) since they take much longer to compute as the input grows.
However, they are not the worst.
Time Complexity: O(Cⁿ)
Exponential running time means that the work performed by the algorithm
doubles every time the input grows by one element.
Example of exponential runtime algorithms:
▪ Power set: finding all the subsets of a set
▪ Recursive Fibonacci
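For the second example, a naive recursive Fibonacci sketch in C; the exponential cost comes from recomputing the same subproblems over and over (the driver is an illustrative assumption):

#include <stdio.h>

/* Naive recursion: each call spawns two more calls, so the number of calls
   roughly doubles per level of recursion -> commonly bounded as O(2^n). */
long long fib(int n)
{
    if (n <= 1)
        return n;
    return fib(n - 1) + fib(n - 2);
}

int main(void)
{
    printf("%lld\n", fib(10));   /* prints: 55 */
    return 0;
}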
Time Complexity: Example O(2ⁿ)
Let’s imagine you are buying a pizza. The store has many toppings that you can
choose from like pepperoni, mushrooms, bacon, and pineapple.
Let’s call each topping A, B, C, D. What are your choices?
You can select:
▪ no topping
▪ one topping
▪ a combination of two
▪ a combination of three
▪ all of them.
Time Complexity: Example O(2ⁿ)
If you plot the number of toppings n against the number of possible
combinations f(n), you will notice that it grows exactly like the function 2ⁿ.
This algorithm (enumerating all the combinations) has a running time of O(2ⁿ).
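A minimal sketch that enumerates all 2ⁿ topping combinations with a bitmask in C (the bitmask approach is an illustrative assumption; the topping names A, B, C, D come from the example above):

#include <stdio.h>

int main(void)
{
    const char *toppings[] = {"A", "B", "C", "D"};
    int n = 4;

    /* There are 2^n subsets: each bit of mask decides whether a topping is included. */
    for (int mask = 0; mask < (1 << n); mask++)
    {
        printf("{");
        for (int i = 0; i < n; i++)
            if (mask & (1 << i))
                printf(" %s", toppings[i]);
        printf(" }\n");
    }
    /* Prints 16 combinations, from the empty set { } to { A B C D }. */
    return 0;
}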
Time Complexity: O(n!)
The factorial of n (written n!) is the product of all positive integers less than or equal to n.
As you might guess, you want to stay away if possible from algorithms that have
this running time.
Example of O(n!) factorial runtime algorithms:
▪ Permutations of a string
▪ Brute-force search
Time Complexity: Example O(n!)
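A minimal sketch that generates all permutations of a string in C, a classic O(n!) procedure (the function names are illustrative assumptions):

#include <stdio.h>
#include <string.h>

static void swapChars(char *a, char *b)
{
    char tmp = *a;
    *a = *b;
    *b = tmp;
}

/* Fixes one character at position l, then recurses on the rest:
   n choices, then n-1, then n-2, ... -> n! permutations in total. */
static void permute(char *str, int l, int r)
{
    if (l == r)
    {
        printf("%s\n", str);
        return;
    }
    for (int i = l; i <= r; i++)
    {
        swapChars(&str[l], &str[i]);
        permute(str, l + 1, r);
        swapChars(&str[l], &str[i]);   /* undo the swap (backtrack) */
    }
}

int main(void)
{
    char s[] = "abc";
    permute(s, 0, (int)strlen(s) - 1);   /* prints the 3! = 6 permutations */
    return 0;
}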
Graphs of all the running-time complexities (comparison figure)
Exercises
1st loop: O(N)
2nd loop: O(M)
Whole code: O(N + M)
If M = N → O(2N) = O(N)
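The exercise's code is not reproduced in this handout; a sketch consistent with the analysis above (two independent, sequential loops over inputs of sizes N and M) could look like this (names are illustrative):

/* Two sequential loops: O(N) + O(M) = O(N + M). */
int countOps(int N, int M)
{
    int ops = 0;
    for (int i = 0; i < N; i++)   /* 1st loop: O(N) */
        ops++;
    for (int j = 0; j < M; j++)   /* 2nd loop: O(M) */
        ops++;
    return ops;                   /* N + M operations in total */
}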
Exercises
1st fragment: 2 nested loops: O(N * N) = O(n²)
2nd fragment: 1 loop: O(N) = O(n)
Total: O(n² + n) = O(n²)
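Likewise, a sketch consistent with this analysis (a nested double loop followed by a single loop) could be (names are illustrative):

/* Fragment 1: nested loops -> n * n operations: O(n^2).
   Fragment 2: one loop -> n operations: O(n).
   Total: O(n^2 + n) = O(n^2). */
int countOps2(int n)
{
    int ops = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            ops++;                /* fragment 1 */
    for (int k = 0; k < n; k++)
        ops++;                    /* fragment 2 */
    return ops;
}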
Exercises
N = 4
I    J values     number of times the statements are executed
0    4 3 2 1      4  -> N
1    4 3 2        3  -> N-1
2    4 3          2  -> N-2
3    4            1  -> N-3
Total number of times = 4 + 3 + 2 + 1
                      = N + (N-1) + (N-2) + ... + 1
                      = N * (N+1) / 2 = (N² + N) / 2
O((N² + N) / 2) = O(N²)
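A sketch consistent with the table above (an inner loop that shrinks as the outer index grows) could be (names are illustrative):

/* Inner loop runs N-i times for each i, giving N + (N-1) + ... + 1 = N*(N+1)/2
   executions of the innermost statement: O(N^2). */
int countOps3(int N)
{
    int ops = 0;
    for (int i = 0; i < N; i++)
        for (int j = N; j > i; j--)   /* 4, 3, 2, 1 iterations when N = 4 */
            ops++;
    return ops;
}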
Exercises
Exercise 2:
for(int i = 0; i < n; i += 2)            // n/2 iterations
{
    for(int j = 0; j < 6; j++)           // 6 iterations
    {
        for(int k = 0; k < n; k++)       // n iterations
        {
            for(int m = 0; m < 5; m++)   // 5 iterations
            {
                // operations;
            }
        }
    }
}
O(n/2 * 6 * n * 5) = O(30 * n² / 2) = O(n²)
Exercises
Exercise 3:
for(int i = 1; i <= n; i *= 2)
{
    // statements;
}
n = 1000
i = 1, 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024   (2¹⁰ = 1024)
O(log n)
In general, the loop stops when 2ᵏ = n:
log(2ᵏ) = log(n)
k · log(2) = log(n)
k = log(n)
Exercises
Exercise 4:
for(int i = 0; i < n; i++)           // O(n)
{
    for(int j = 0; j < n; j++)       // O(n)
    {
        // operations;
    }
    for(int k = 0; k < n; k++)       // O(n)
    {
        // operations;
    }
}
Inside the outer loop: O(n + n) = O(2n)
Total: O(n * 2n) = O(n²)
Exercises
Exercise 5:
for(int i = n; i > 1; i = i / 2)
{
// statements;
}
n = 1024
i = 1024, 512, 256, 128, 64, 32, 16, 8, 4, 2   (loop continues while i > 1)
10 iterations
log₂(1024) = 10
➔ O(log n)
Exercises
Exercise 5:
int fct(int n) {
    int sum = 0;
    for(int i = 0; i < n; i++) {
        for(int j = 0; j < n; j++) {
            sum++;
        }
    }
    return sum;
}
Calculate
1- Res = fct (10)
2- Complexity?
Res = 100
Complexity: O(n²)
Exercises
Exercise 6:
int fct(int n) {
    int sum = 0;
    for(int i = 0; i < n; i++) {
        for(int j = 0; j < n; j++) {
            if(j < n/2) {
                for(int k = 0; k < n; k++) {
                    for(int m = 0; m < n; m++) {
                        sum++;
                    }
                }
            }
        }
    }
    return sum;
}
Calculate
1- Res = fct(10)
2- Complexity?
For n = 10: Res = 10 (1st loop) * 5 (2nd loop: the condition j < n/2 holds for 5 values of j) * 10 (3rd loop) * 10 (4th loop) = 5000
Complexity = O(n⁴)
Exercises
Exercise 7:
int fct(int n) {
    int sum = 0;
    for(int i = 0; i < n/2; i += 2) {
        for(int j = 0; j < 10; j++) {
            for(int k = 0; k < n; k++) {
                for(int m = 0; m < 10; m++) {
                    sum++;
                }
            }
        }
    }
    return sum;
}
Calculate
1- Res = fct(10)
2- Complexity?
For n = 10: Res = 3 (1st loop: i = 0, 2, 4) * 10 (2nd loop) * 10 (3rd loop) * 10 (4th loop) = 3000
Complexity = O(n/4 * 1 * n * 1) = O(n²/4) = O(n²)   (the loops of fixed size 10 are constant)
Exercises
Exercise 8:
int fct(int n) {
    int sum = 10;
    for(int i = 0; i < 2 * n; i++) {
        for(int j = 0; j < n; j++) {
            sum++;
            return sum;
        }
    }
}
Calculate
1- Res = fct(10)
2- Complexity?
Res = 11 (the function returns during the very first inner iteration)
Complexity: O(1)

Exercise 9:
int fct(int n) {
    int sum = 10;
    for(int i = 0; i < 2 * n; i++) {
        for(int j = 0; j < n; j++) {
            sum++;
            break;
        }
    }
    return sum;
}
Calculate
1- Res = fct(10)
2- Complexity?
Res = 30 (sum starts at 10 and the inner loop adds 1 in each of the 2n = 20 outer iterations: 10 + 20 = 30)
Complexity: O(2n) = O(n)