
Data Structures and Algorithms
Chapter 5: Sorting Algorithms


Introduction
 Sorting and searching are among the most common operations
performed by programs.
 We want to keep information in a sensible order.
 alphabetical order
 ascending/descending order
 order according to names, ids, years, departments etc.
 The aim of sorting algorithms is to put unordered information in an
ordered form.
 There are many sorting algorithms, such as:
 Selection Sort
 Bubble Sort
 Insertion Sort
 Merge Sort
 Quick Sort
 The first three are simple quadratic-time algorithms; they provide the
foundation for understanding the faster, more efficient ones.
What is Sorting?
 Rearrange the items of a given list in ascending order.
 Input: a sequence of n numbers ⟨a1, a2, …, an⟩.
 Output: a reordering ⟨a′1, a′2, …, a′n⟩ of the input sequence such
that a′1 ≤ a′2 ≤ … ≤ a′n.
 Example:
 Input: 〈 31,41,59,26,41,58 〉
 Output: 〈 26,31,41,41,58,59 〉
Characteristics of Sorting Algorithms
 In-place Sorting vs Not-in-place Sorting
 In-place – the algorithm needs no more than a constant amount of
extra space; sorting happens within the array itself. Bubble sort is
an example of in-place sorting.
 Not in-place – the algorithm requires extra space proportional to (or
larger than) the number of elements being sorted. Merge sort is an
example of not-in-place sorting.
 Stable vs Not Stable sorting
 Stable - a stable sort preserves the relative order of elements with
equal keys; they appear in the output in the same order as in the
input.
 Non Stable - an unstable sort may change the relative order of
elements with equal keys.


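As a small illustration, the following Python snippet (the records and names are made up
for this example) shows what stability means; Python's built-in sorted() is stable:

    # Records are (name, key) pairs; "alice" and "carol" share the key 3.
    records = [("alice", 3), ("bob", 1), ("carol", 3), ("dave", 2)]

    # Sort by the numeric key only.
    by_key = sorted(records, key=lambda r: r[1])
    print(by_key)
    # [('bob', 1), ('dave', 2), ('alice', 3), ('carol', 3)]
    # A stable sort keeps 'alice' before 'carol', their order in the input;
    # an unstable sort would be free to reverse them.
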
Selection Sort
 The list is divided into two sublists, sorted and unsorted, which are
separated by an imaginary wall.
 We find the smallest element in the unsorted sublist and swap it
with the element at the beginning of the unsorted data.
 After each selection and swap, the imaginary wall between the
two sublists moves one element ahead, increasing the number of
sorted elements and decreasing the number of unsorted ones.
 Each time we move one element from the unsorted sublist to the
sorted sublist, we say that we have completed a sort pass.
 A list of n elements requires n - 1 passes to completely sort the
data.
Selection Sort Example
Selection Sort Algorithm
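The algorithm slide is not reproduced in this text version; the following is a minimal
Python sketch of selection sort as described above (the function name selection_sort is
ours):

    def selection_sort(a):
        """Sort the list a in place using selection sort."""
        n = len(a)
        for i in range(n - 1):                 # a[:i] is the sorted sublist (the "wall" is at i)
            smallest = i
            for j in range(i + 1, n):          # find the smallest item in the unsorted part
                if a[j] < a[smallest]:
                    smallest = j
            a[i], a[smallest] = a[smallest], a[i]   # swap it to the front of the unsorted part
        return a

    # Example: selection_sort([31, 41, 59, 26, 41, 58]) -> [26, 31, 41, 41, 58, 59]
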
Bubble Sort
 The list is divided into two sublists: sorted and
unsorted.
 The smallest element is bubbled from the
unsorted list and moved to the sorted sublist.
 Each time an element moves from the unsorted
part to the sorted part one sort pass is completed.
 Given a list of n elements, bubble sort requires up to n - 1 passes
to sort the data.
 Bubble sort was originally written to “bubble up” the highest
element in the list.
 From an efficiency point of view, it makes no difference whether
the highest or the lowest element is bubbled.
Trace of the 1st Pass of the Example
Bubble Sort Algorithm
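As above, the algorithm slide is not included here; below is a minimal Python sketch of
bubble sort, written to bubble the smallest element of the unsorted part toward the front
as the slides describe (the function name is ours):

    def bubble_sort(a):
        """Sort the list a in place using bubble sort."""
        n = len(a)
        for i in range(n - 1):                 # a[:i] is the sorted sublist
            for j in range(n - 1, i, -1):      # walk from the end toward the wall
                if a[j] < a[j - 1]:            # smaller element bubbles leftward
                    a[j], a[j - 1] = a[j - 1], a[j]
        return a

    # Example: bubble_sort([31, 41, 59, 26, 41, 58]) -> [26, 31, 41, 41, 58, 59]
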
Insertion Sort
 Most common sorting technique used by card
players.
 Again, the list is divided into two parts: sorted
and unsorted.
 In each pass, the first element of the
unsorted part is picked up, transferred to
the sorted sublist, and inserted at the
appropriate place.
 A list of n elements will take at most n - 1 passes to sort the data.
Insertion Sort Example

Insertion Sort Algorithm
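A minimal Python sketch of insertion sort as described above (the function name is ours):

    def insertion_sort(a):
        """Sort the list a in place using insertion sort."""
        for i in range(1, len(a)):             # a[:i] is the sorted sublist
            key = a[i]                         # pick up the first unsorted element
            j = i - 1
            while j >= 0 and a[j] > key:       # shift larger sorted elements one slot right
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = key                     # insert it at the appropriate place
        return a

    # Example: insertion_sort([31, 41, 59, 26, 41, 58]) -> [26, 31, 41, 41, 58, 59]
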
Insertion Sort - Summary
 Advantages
 Good running time for “almost sorted” arrays: Θ(n)
 Disadvantages
 Θ(n²) running time in the worst and average case
 ≈ n²/2 comparisons and exchanges

Merge Sort
 Apply divide-and-conquer to the sorting problem.
 Problem: given n elements, sort them into non-decreasing order.
 Divide-and-conquer:
 If n = 1, terminate (every one-element list is already sorted).
 If n > 1, partition the elements into two or more sub-collections;
sort each; combine them into a single sorted list.
 How do we partition? (A code sketch of the whole algorithm
follows the merging walkthrough below.)
Merge Sort Example

Original:      24 13 26 1 12 27 38 15
Divide in 2:   [24 13 26 1] [12 27 38 15]
Divide in 4:   [24 13] [26 1] [12 27] [38 15]
Divide in 8:   [24] [13] [26] [1] [12] [27] [38] [15]
Merge 2:       [13 24] [1 26] [12 27] [15 38]
Merge 4:       [1 13 24 26] [12 15 27 38]
Merge 8:       1 12 13 15 24 26 27 38
Merging

 The key to merge sort is merging two sorted lists into one: given
two sorted lists X = (x1, x2, …, xm) and Y = (y1, y2, …, yn), the
result is a single sorted list Z = (z1, z2, …, zm+n).
 Example:
L1 = { 3 8 9 }   L2 = { 1 5 7 }
merge(L1, L2) = { 1 3 5 7 8 9 }
Merging (cont.)

X: 3 10 23 54 Y: 1 5 25 75

Result:
Merging (cont.)

X: 3 10 23 54 Y: 5 25 75

Result: 1
Merging (cont.)

X: 10 23 54 Y: 5 25 75

Result: 1 3
Merging (cont.)

X: 10 23 54 Y: 25 75

Result: 1 3 5
Merging (cont.)

X: 23 54 Y: 25 75

Result: 1 3 5 10
Merging (cont.)

X: 54 Y: 25 75

Result: 1 3 5 10 23
Merging (cont.)

X: 54 Y: 75

Result: 1 3 5 10 23 25
Merging (cont.)

X: Y: 75

Result: 1 3 5 10 23 25 54
Merging (cont.)

X: Y:

Result: 1 3 5 10 23 25 54 75
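A minimal Python sketch of the merge step traced above, and of merge sort built on top
of it (the function names merge and merge_sort are ours):

    def merge(x, y):
        """Merge two sorted lists x and y into one sorted list."""
        result = []
        i = j = 0
        while i < len(x) and j < len(y):
            if x[i] <= y[j]:                   # take the smaller front element
                result.append(x[i])
                i += 1
            else:
                result.append(y[j])
                j += 1
        result.extend(x[i:])                   # at most one of these is non-empty
        result.extend(y[j:])
        return result

    def merge_sort(a):
        """Sort the list a using merge sort (returns a new sorted list)."""
        if len(a) <= 1:                        # a one-element (or empty) list is already sorted
            return a
        mid = len(a) // 2                      # divide in the middle
        return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

    # Example: merge_sort([24, 13, 26, 1, 12, 27, 38, 15])
    # -> [1, 12, 13, 15, 24, 26, 27, 38]
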
Quick Sort
 Like binary search, this method uses a
recursive, divide and conquer strategy.
 The basic idea is to separate a list of items
into two parts, surrounding a distinguished
item called the pivot.
 At the end of the process, one part will
contain items smaller than the pivot and
the other part will contain items larger
than the pivot.
 The set is partitioned into two parts around a randomly chosen
pivot.
Quick Sort – cont’d
 Divide: Partition the n elements into 3 groups: left, middle and right.
 The middle group contains only the pivot element.
 All elements in the left group are ≤ pivot.
 All elements in the right group are ≥ pivot.
 Formally, A[p…r] is partitioned (rearranged) into two (possibly empty)
subarrays A[p…q-1] and A[q+1…r] such that each element of A[p…q-1]
is less than or equal to A[q], which in turn is less than or equal to each
element of A[q+1…r]. The index q is computed as part of the
partitioning; A[q] is the pivot.
 Conquer: The two subarrays are sorted by recursive calls to
quicksort.
 Combine: The answer is the sorted left group, followed by the
middle group, followed by the sorted right group.
 Unlike merge sort, no work is needed to combine them: the
subarrays are already sorted in place.
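A minimal Python sketch of quicksort following the description above, with a partition
that produces A[p…q-1] ≤ A[q] ≤ A[q+1…r]. The slides leave the pivot choice open; this
sketch simply uses the last element as the pivot, and the function names are ours:

    def partition(a, p, r):
        """Rearrange a[p..r] around the pivot a[r]; return the pivot's final index q."""
        pivot = a[r]
        q = p
        for j in range(p, r):
            if a[j] <= pivot:                  # grow the left group (elements <= pivot)
                a[q], a[j] = a[j], a[q]
                q += 1
        a[q], a[r] = a[r], a[q]                # place the pivot between the two groups
        return q

    def quick_sort(a, p, r):
        """Sort a[p..r] in place by recursive partitioning."""
        if p < r:
            q = partition(a, p, r)             # divide: a[p..q-1] <= a[q] <= a[q+1..r]
            quick_sort(a, p, q - 1)            # conquer the left group
            quick_sort(a, q + 1, r)            # conquer the right group
                                               # combine: nothing to do, already in place

    # Example: data = [8, 23, 6, 4, 1]; quick_sort(data, 0, len(data) - 1)
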
Quick Sort –Pros and Cons
 Quicksort pros (advantages):
 Sorts in place
 Runs in O(n lg n) time in the average case
 Very efficient in practice; it’s quick
 Quicksort cons (disadvantages):
 Runs in O(n²) time in the worst case
 However, the worst case doesn’t happen often (it typically arises
when the input is already sorted and a poor pivot is chosen)
Quick Sort Example
 If an unsorted list (represented as vector a) originally
contains:

 We might select the item in the middle position, a[5],


as the pivot, which is 8 in our illustration. Our process
would then put all values less than 8 on the left side
and all values greater than 8 on the right side.
 This first subdivision produces
Quick Sort Example – cont’d
 We need to identify a left_arrow and right_arrow on
the far left and far right, respectively.
 The left_arrow and right_arrow initially represent
the lowest and highest indices of the array items.
 Starting on the right, the right_arrow is moved left
until a value less than or equal to the pivot is
encountered. This produces
Quick Sort Example – cont’d
 In a similar manner, left_arrow is moved right until a
value greater than or equal to the pivot is
encountered.
 Now the contents of the two vector items are
swapped to produce:
Quick Sort Example – cont’d
 We continue by moving right_arrow left to produce

 and moving left_arrow right yields

 These values are exchanged to produce


Quick Sort Example – cont’d
 This process stops when left_arrow > right_arrow is TRUE.
Since this is still FALSE at this point, the next right_arrow
move produces

 and the left_arrow move to the right yields


Quick Sort Example – cont’d
 Because we are looking for a value greater than or equal to the
pivot when moving left_arrow, left_arrow stops moving and an
exchange is made to produce

 Notice that the pivot, 8, has been exchanged to occupy a new


position. This is acceptable because pivot is the value of the
item, not the index. As before, right_arrow is moved left and
left_arrow is moved right to produce
Quick Sort Example – cont’d
 Since right_arrow < left_arrow is TRUE, the first
subdivision is complete. At this stage, numbers
smaller than pivot are on the left side and numbers
larger than pivot are on the right side. This produces
two sublists that can be envisioned as

 Each sublist can now be sorted by the same function.


This would require a recursive call to the sorting
function. In each case, the vector is passed as a
parameter together with the right and left indices for
the appropriate sublist.
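A minimal Python sketch of the arrow-based partitioning walked through in this example,
using the item in the middle position as the pivot value and the left_arrow / right_arrow
names from the slides. The original code is not reproduced here, so the stopping details
below are one consistent way to implement it (the function name is ours):

    def quick_sort_arrows(a, left, right):
        """Sort a[left..right] in place using the two-arrow partitioning scheme."""
        if left >= right:
            return
        pivot = a[(left + right) // 2]          # item in the middle position
        left_arrow, right_arrow = left, right
        while left_arrow <= right_arrow:
            while a[right_arrow] > pivot:       # move right_arrow left: stop at a value <= pivot
                right_arrow -= 1
            while a[left_arrow] < pivot:        # move left_arrow right: stop at a value >= pivot
                left_arrow += 1
            if left_arrow <= right_arrow:       # exchange and step both arrows inward
                a[left_arrow], a[right_arrow] = a[right_arrow], a[left_arrow]
                left_arrow += 1
                right_arrow -= 1
        quick_sort_arrows(a, left, right_arrow)  # recursively sort the left sublist
        quick_sort_arrows(a, left_arrow, right)  # recursively sort the right sublist

    # Example: data = [6, 2, 14, 8, 1, 9, 3, 5, 11, 7]
    #          quick_sort_arrows(data, 0, len(data) - 1)
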
