Unit-2-Sorting (Merge+Quick+Heap+Binary Search).ppt
Divide and Conquer
•Divide the problem into a number of subproblems that are smaller instances of the same problem.
•Conquer the subproblems by solving them recursively. If they are small enough, solve the subproblems as base cases.
•Combine the solutions to the subproblems into the solution for the original problem.
Algorithms covered under the divide-and-conquer technique:
•Merge sort
•Quick Sort
•Heap Sort
•Binary search
Merge sort
•Divide and Conquer
•Recursive
•Out-of-place
•Space complexity: O(n)
•Time complexity: O(n log n) in the worst case
Merge Sort Working Process:
Think of it as a recursive algorithm that continuously splits the array in half until it cannot be divided further. If the array is empty or has only one element left, the dividing stops: this is the base case of the recursion. If the array has multiple elements, split it into halves and recursively invoke merge sort on each half. Finally, when both halves are sorted, the merge operation is applied. Merging is the process of taking two smaller sorted arrays and combining them into a single larger sorted one.
Pseudo-Code
Merge_sort(A, p, r)
{
    if (p < r)
        q = ⌊(p + r) / 2⌋
        Merge_sort(A, p, q)
        Merge_sort(A, q + 1, r)
        Merge(A, p, q, r)
}
Pseudo-Code
Merge(A, p, q, r)
{
    n1 = q - p + 1                  // number of elements in the first list
    n2 = r - q                      // number of elements in the second list
    let L[1 .. n1+1] and R[1 .. n2+1] be two new arrays
    for i = 1 to n1
        L[i] = A[p + i - 1]         // copy the first half into L
    for j = 1 to n2
        R[j] = A[q + j]             // copy the second half into R
    L[n1 + 1] = ∞                   // sentinels mark the end of each list
    R[n2 + 1] = ∞
    i = 1
    j = 1
    for k = p to r                  // k advances one position per iteration
        if (L[i] <= R[j])
            A[k] = L[i]
            i = i + 1
        else
            A[k] = R[j]
            j = j + 1
}
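The pseudo-code above uses 1-based indexing and ∞ sentinels. As a runnable reference, here is a minimal sketch in Python (my own translation, not from the slides), using math.inf as the sentinel and Python's 0-based indexing:

import math

def merge(A, p, q, r):
    """Merge the sorted runs A[p..q] and A[q+1..r] (inclusive, 0-based)."""
    L = A[p:q + 1] + [math.inf]    # sentinel marks the end of the left run
    R = A[q + 1:r + 1] + [math.inf]
    i = j = 0
    for k in range(p, r + 1):      # k advances one position per iteration
        if L[i] <= R[j]:
            A[k] = L[i]
            i += 1
        else:
            A[k] = R[j]
            j += 1

def merge_sort(A, p, r):
    """Sort A[p..r] in place."""
    if p < r:
        q = (p + r) // 2           # floor((p + r) / 2)
        merge_sort(A, p, q)
        merge_sort(A, q + 1, r)
        merge(A, p, q, r)

arr = [15, 10, 5, 20, 25, 30, 40, 35]
merge_sort(arr, 0, len(arr) - 1)
print(arr)  # [5, 10, 15, 20, 25, 30, 35, 40]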
Merge sort with Example
Time complexity of Merge Sort
Merge_sort(A, p, r)                 // T(n)
{
    if (p < r)                      // O(1)
        q = ⌊(p + r) / 2⌋           // O(1)
        Merge_sort(A, p, q)         // T(n/2)
        Merge_sort(A, q + 1, r)     // T(n/2)
        Merge(A, p, q, r)           // O(n)
}
Recurrence Relation of Merge Sort
T(n) = 1,            if n = 1
T(n) = 2T(n/2) + n,  if n > 1
Solve by recursion tree or the Master Theorem:
T(n) = O(n log n)
Space complexity = O(n)
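For reference, the Master Theorem applies directly: with a = 2, b = 2 and f(n) = n, we get n^(log_b a) = n^(log_2 2) = n = Θ(f(n)), so case 2 of the theorem gives T(n) = Θ(n log n).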
Merge Sort Recursive Tree
Practice: sort the following numbers using merge sort:
<15, 10, 5, 20, 25, 30, 40, 35>
and show the recursive tree.
HEAP SORT
Heap sort combines the better attributes of merge sort and insertion sort:
•Like merge sort, but unlike insertion sort, its running time is O(n lg n).
•Like insertion sort, but unlike merge sort, it sorts in place.
Heap Properties:
1. A heap is a binary tree.
2. Every heap is an almost-complete binary tree.
3. Each level is filled from left to right before moving to the next level.
Heap length (A.length): the total number of elements in the array.
Heap size (A.heapsize): the number of elements in the array that currently form a heap.
Position of elements in a heap:
If the position of a parent node is i, then:
Position of left child = 2i
Position of right child = 2i + 1
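As a quick sanity check of these formulas, a minimal sketch in Python (the helper names are my own, and the parent formula ⌊i/2⌋ is the standard counterpart, added for completeness):

def left(i):
    """1-based index of the left child of node i."""
    return 2 * i

def right(i):
    """1-based index of the right child of node i."""
    return 2 * i + 1

def parent(i):
    """1-based index of the parent of node i (standard counterpart, not on the slide)."""
    return i // 2

print(left(2), right(2), parent(5))  # 4 5 2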
Types of Heap:
Max heap: the root is always the maximum element.
Min heap: the root is always the minimum element.
Elements                     A.length    A.heapsize
25, 12, 16, 13, 10, 8, 14    7           1
25, 14, 16, 13, 10, 8, 12    7           7
25, 14, 13, 16, 10, 8, 12
25, 14, 12, 13, 10, 8, 16
14, 13, 12, 10, 8
14, 12, 13, 8, 10
14, 13, 8, 12, 10
14, 13, 12, 8, 10
89, 19, 40, 17, 12, 10
2, 5, 7, 11, 6, 9, 70
MAX HEAP
•getMax(): returns the root element of a max heap. The time complexity of this operation is O(1).
•If an array is sorted in descending order, it already forms a max heap.
MIN HEAP
•getMin(): returns the root element of a min heap. The time complexity of this operation is O(1).
•If an array is sorted in ascending order, it already forms a min heap.
Algorithm
Heap_sort(A)
{
    Build_Max_Heap(A)
    for i = A.length down to 2
        exchange A[1] with A[i]           // move the current maximum to the end
        A.heapsize = A.heapsize - 1       // shrink the heap by one
        Max_Heapify(A, 1)
}
Algorithm
Build_Max_Heap(A)
{
    A.heapsize = A.length                 // initially treat the whole array as the heap
    for i = ⌊A.length / 2⌋ down to 1      // start from the last non-leaf node
        Max_Heapify(A, i)
}
Max_Heapify(A, i)
{
    l = 2i
    r = 2i + 1
    if (l <= A.heapsize and A[l] > A[i])       // left child exists and is greater than parent
        largest = l
    else
        largest = i
    if (r <= A.heapsize and A[r] > A[largest])
        largest = r
    if (largest != i)
        exchange A[i] with A[largest]
        Max_Heapify(A, largest)
}
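Putting the three routines together, here is a runnable sketch in Python (my own translation, not from the slides). Python lists are 0-based, so the children of node i are at 2i + 1 and 2i + 2 instead of 2i and 2i + 1:

def max_heapify(A, heap_size, i):
    """Float A[i] down until the subtree rooted at i is a max heap."""
    l, r = 2 * i + 1, 2 * i + 2              # 0-based children
    largest = i
    if l < heap_size and A[l] > A[largest]:
        largest = l
    if r < heap_size and A[r] > A[largest]:
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        max_heapify(A, heap_size, largest)

def build_max_heap(A):
    for i in range(len(A) // 2 - 1, -1, -1):  # last non-leaf node down to the root
        max_heapify(A, len(A), i)

def heap_sort(A):
    build_max_heap(A)
    for i in range(len(A) - 1, 0, -1):
        A[0], A[i] = A[i], A[0]              # move the current maximum to the end
        max_heapify(A, i, 0)                 # heap shrinks by one each pass

arr = [27, 13, 3, 16, 13, 10, 1, 5]
heap_sort(arr)
print(arr)  # [1, 3, 5, 10, 13, 13, 16, 27]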
Time Complexity of Heap Sort:
The height of a heap
The height of a node in a heap is the number of edges on the longest simple downward path from the node to a leaf, and the height of the heap is the height of the root, which is Θ(lg n).
For example, in the heap pictured on the original slide:
the height of node 2 is 2
the height of the heap is 3
The number of calls to Max_Heapify depends on the height of the node that i points to.
•Max_Heapify runs in O(log n) time to maintain the heap property.
•Build_Max_Heap makes O(n) calls to Max_Heapify, each costing at most O(log n).
•Hence the running time is O(n log n).

Height of node i          No. of times Max_Heapify() will be called
1                         1
2                         2
3                         3
…                         …
log n (height of heap)    log n
a.) To access the largest element in a max heap (the root): O(1).
b.) To delete the largest element and restore the heap: O(log n).
c.) To delete all n elements from a max heap: O(n log n).
Now sorting with a heap requires:
1. arranging the elements into a heap, i.e. O(n log n), and
2. deleting the elements one by one after the swapping operation, O(n log n).
This gives heap sort complexity = O(n log n) + O(n log n)
= 2·O(n log n) = O(n log n),
with a space complexity of O(1).
Practice: sort the following numbers using heap sort:
<27, 13, 3, 16, 13, 10, 1, 5, 7, 12, 4, 8, 9, 0>
Show every step.
QUICK-SORT
Quick sort
•Divide and Conquer
•Recursive
•In-place (the only extra space is the recursion stack)

Case            Time Complexity    Space Complexity
Best Case       O(n log n)         O(log n)
Average Case    O(n log n)         O(log n)
Worst Case      O(n^2)             O(n)
Quicksort picks an element as the pivot and then partitions the given array around it. A large array is divided into two sub-arrays: one holding values smaller than the pivot, and another holding values greater than the pivot. The left and right sub-arrays are then partitioned using the same approach, continuing until each sub-array contains at most a single element.
Choosing the pivot
•The pivot can be either the rightmost or the leftmost element of the given array.
•Alternatively, select the median as the pivot element.
Partition(A, p, r)
{
    x = A[r]                        // pivot = rightmost element
    i = p - 1
    for j = p to r - 1
    {
        if (A[j] <= x)
        {
            i = i + 1
            exchange A[i] with A[j]
        }
    }
    exchange A[i+1] with A[r]       // place the pivot at its final position
    return (i + 1)
}
Quick_sort(A, p, r)
{
    if (p < r)
    {
        q = Partition(A, p, r)
        Quick_sort(A, p, q - 1)
        Quick_sort(A, q + 1, r)
    }
}
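Here is a runnable Python sketch of the Lomuto partition scheme given above (my own translation, not from the slides); indices are 0-based and the pivot is the rightmost element:

def partition(A, p, r):
    """Place the pivot A[r] at its sorted position and return that index."""
    x = A[r]                        # pivot = rightmost element
    i = p - 1
    for j in range(p, r):
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]
    return i + 1

def quick_sort(A, p, r):
    if p < r:
        q = partition(A, p, r)
        quick_sort(A, p, q - 1)
        quick_sort(A, q + 1, r)

arr = [36, 15, 40, 1, 60, 20, 55, 25, 50, 20]
quick_sort(arr, 0, len(arr) - 1)
print(arr)  # [1, 15, 20, 20, 25, 36, 40, 50, 55, 60]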
Best case Time Complexity of Quick Sort
The best case of quick sort occurs when the sorted position of the pivot element falls almost in the middle of the array, so that the array is partitioned into nearly equal halves (a balanced recursion tree).
Quick_sort(A, p, r)                 // T(n)
{
    if (p < r)                      // O(1)
    {
        q = Partition(A, p, r)      // O(n)
        Quick_sort(A, p, q - 1)     // T(n/2)
        Quick_sort(A, q + 1, r)     // T(n/2)
    }
}
Hence the recurrence relation of quick sort in the best case is:
T(n) = 2T(n/2) + n
Solve by substitution, the Master Theorem, or a recursion tree.
Best case time complexity of quick sort: O(n log n)
Worst case Time Complexity of Quick Sort
The worst case of quick sort occurs when the sorted position of the pivot partitions the array so that one side has (n - 1) elements and the other side has zero (an unbalanced recursion tree).
Quick_sort(A, p, r)                 // T(n)
{
    if (p < r)                      // O(1)
    {
        q = Partition(A, p, r)      // O(n)
        Quick_sort(A, p, q - 1)     // T(n-1)
        Quick_sort(A, q + 1, r)     // T(0)
    }
}
Hence the recurrence relation of quick sort in the worst case is:
T(n) = T(n-1) + n
Solve by substitution or a recursion tree.
Worst case time complexity of quick sort: O(n^2)
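For reference, unrolling this recurrence shows where the quadratic bound comes from: T(n) = T(n-1) + n = T(n-2) + (n-1) + n = … = 1 + 2 + … + n = n(n+1)/2 = O(n^2).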
Practice: sort the following numbers using quick sort:
<36, 15, 40, 1, 60, 20, 55, 25, 50, 20>
Show every step.
Binary Search Algorithm:
The basic steps to perform binary search are:
•Begin by comparing the search key with the middle element of the whole array.
•If the search key is equal to the middle element, return its index.
•If the search key is less than the middle element, narrow the interval to the lower half.
•Otherwise, narrow it to the upper half.
•Repeat from the second step until the value is found or the interval is empty.
Binary Search Algorithm
binarySearch(arr, x, low, high)
    repeat while low <= high
        mid = (low + high) / 2
        if (x == arr[mid])
            return mid
        else if (x > arr[mid])      // x is in the right half
            low = mid + 1
        else                        // x is in the left half
            high = mid - 1
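A runnable Python sketch of the iterative version above (my own translation, not from the slides); it returns -1 when x is absent, a common convention:

def binary_search(arr, x):
    """Return the index of x in the sorted list arr, or -1 if absent."""
    low, high = 0, len(arr) - 1
    while low <= high:              # interval is non-empty
        mid = (low + high) // 2
        if arr[mid] == x:
            return mid
        elif x > arr[mid]:          # x is in the right half
            low = mid + 1
        else:                       # x is in the left half
            high = mid - 1
    return -1

print(binary_search([1, 5, 10, 15, 20, 25], 15))  # 3
print(binary_search([1, 5, 10, 15, 20, 25], 7))   # -1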
Time Complexity of Binary Search
Best Case Time Complexity of Binary Search
The best case of binary search occurs when the target element is at the central index. In this situation there is only one comparison, so the best case time complexity of binary search is O(1).
Average/Worst Case Time Complexity of Binary Search
In each following iteration, the size of the subarray is reduced using the result of the previous comparison:
Initial length of array = n
Iteration 1: length of array = n/2
Iteration 2: length of array = (n/2)/2 = n/2^2
Iteration k: length of array = n/2^k
After k iterations, the size of the array becomes 1 (narrowed down to a single element):
n/2^k = 1  =>  k = log2(n), i.e., O(log n)
Time and Space Complexity
•Best Case Complexity: in binary search, the best case occurs when the element to search for is found in the first comparison, i.e., when the first middle element itself is the element being searched. The best-case time complexity of binary search is O(1).
•Worst Case Complexity: in binary search, the worst case occurs when we have to keep reducing the search space until it has only one element. The worst-case time complexity of binary search is O(log n).
The space complexity of binary search is O(1).
Linear Search
Linear search is a sequential search algorithm that starts at one end and goes through each element of a list until the desired element is found; otherwise the search continues to the end of the data set. It is the simplest searching algorithm.
Write the algorithm of linear search yourself.
Time complexity: O(n)
Auxiliary space: O(1)
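For comparison with binary search, here is one possible solution sketch in Python (try writing your own first):

def linear_search(arr, x):
    """Scan arr from the front; return the first index of x, or -1 if absent."""
    for i, value in enumerate(arr):
        if value == x:
            return i
    return -1

print(linear_search([36, 15, 40, 1, 60], 1))   # 3
print(linear_search([36, 15, 40, 1, 60], 99))  # -1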