College of Computing and Informatics
Department of Computer Science-Postgraduate Program
            DESIGN AND ANALYSIS OF ALGORITHMS: Sorting Algorithms
Individual Assignment Prepared By:
No.          Name                    IDNo
 1    Nuredin Abdumalik        PGP/1056/14
Submitted to: Dr. Million M
                                                     Mar 7, 2022
                                            Haramaya University, Ethiopia
Table of Contents
1. Overview of Sorting Algorithms
   1.1 Importance of Sorting Algorithms
2. Advantages and disadvantages of sorting algorithms
   2.1 Advantages of sorting algorithms
   2.2 Disadvantages of sorting algorithms
   2.3 Applications of sorting algorithms
3. Types of Sorting Algorithms
   1. Bubble sort
   2. Selection Sort
   3. Insertion Sort
   4. Merge Sort
      4.1 Merge Sort: Analysis
      4.2 Time Complexity Analysis of merge sort
      4.3 Space Complexity Analysis of merge sort
      4.4 Advantages and disadvantages of merge sort
   5. Quick Sort
      5.1 Quick Sort Algorithm
      5.2 Quick Sort Pseudo code
      5.3 Analysing Quicksort
6. CONCLUSIONS
7. RECOMMENDATIONS
1. Overview of Sorting Algorithms
Sorting is the process of arranging elements in either ascending or descending order, and sorting
algorithms can decrease the complexity of a problem. A sorting algorithm is used to reorder a group of
items into a specified order, such as alphabetical, increasing, or decreasing order. The main aim of
using sorting algorithms is to make records easier to search, insert into, and delete from.
A sorting algorithm will put items in a list into an order, such as alphabetical or numerical order. For
example, a list of customer names could be sorted into alphabetical order by surname, or a list of people
could be put into numerical order by age. Sorting a list of items can take a long time, especially if it is a
large list. A computer program can be created to do this, making sorting a list of data much easier.
The rapid growth of information in our world continues to drive the development of sorting
algorithms, with the aim of improving performance and reducing complexity.
Mainly, there are two types of sorting:
    • Internal sorting
    • External sorting
Sorting algorithms can be classified based on the following parameters:
    A. Stability
A sorting algorithm is stable if it preserves the relative order of elements with equal keys; that is,
similar elements retain their relative positions after sorting.
    B. Adaptivity
A sorting algorithm is adaptive if it sorts sequences that are close to sorted faster than random
sequences. For some sorting algorithms (for example, quick sort), the complexity changes with the
pre-sortedness of the input: the pre-sortedness of the input affects the running time.
    C. Time complexity
The time complexity of an algorithm signifies the total time required by the program to run to
completion. Algorithms have different cases of complexity which are the best case, the average case,
and the worst case. The time complexity of an algorithm is represented using the asymptotic notations.
Asymptotic notations provide the lower bound and upper bound of an algorithm.
    D. Space complexity
The space complexity of an algorithm is also important; it is the number of memory cells that the
algorithm needs. Space complexity is calculated from both the auxiliary space and the space used by the input.
1.1 Importance of Sorting Algorithms
In computer science, arranging data in an ordered sequence is called "sorting". Sorting is a common
operation in many applications, and efficient algorithms to perform it have been developed.
The most common uses of sorted sequences are:
  • making lookup or search efficient;
  • making merging of sequences efficient;
  • enabling processing of data in a defined order.
Efficient sorting is important for optimizing the efficiency of other algorithms (such as search and
merge algorithms) that require input data to be in sorted lists. Sorting is also often useful for
canonicalizing data and for producing human-readable output. For sorting, either a weak order,
"should not come after", can be specified, or a strict weak order, "should come before" (specifying one
also defines the other, since the two are the complement of the inverse of each other). For the sorting to
be unique, these two are restricted to a total order and a strict total order, respectively.
Sorting n-tuples (depending on context also called records, consisting of fields) can be done based on
one or more of their components. More generally, objects can be sorted based on a property. Such a
component or property is called a sort key.
For example, if the items are books, the sort key might be the title, subject, or author, and the order is alphabetical.
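To make the idea of a sort key concrete, here is a small hedged C++ sketch (the Book type, the field
names, and the sample data are invented for this illustration): std::sort is given a comparator that
looks only at the chosen key.

    #include <algorithm>
    #include <iostream>
    #include <string>
    #include <vector>

    // Hypothetical record type used only for this illustration.
    struct Book {
        std::string title;
        std::string author;
    };

    int main() {
        std::vector<Book> shelf = {
            {"Introduction to Algorithms", "Cormen et al."},
            {"The Art of Computer Programming", "Knuth"},
            {"Algorithm Design", "Kleinberg and Tardos"},
        };
        // The sort key is the title: the comparator compares only that field.
        std::sort(shelf.begin(), shelf.end(),
                  [](const Book& a, const Book& b) { return a.title < b.title; });
        for (const Book& b : shelf)
            std::cout << b.title << " by " << b.author << '\n';
        return 0;
    }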
2. Advantages and disadvantages of sorting algorithms
2.1 Advantages of sorting algorithms
    • The quick sort is often regarded as the best sorting algorithm because of its significant
      advantage in efficiency: it deals well with a huge list of items.
    • Because quick sort is an in-place sorting algorithm, no additional temporary storage is
      required beyond what is needed to hold the original list.
2.2 Disadvantages of sorting algorithms
    • The selection sort, for example, requires on the order of n² steps to sort n elements.
2.3 Applications of sorting algorithms
    • Uniqueness testing
    • Deleting duplicates
    • Prioritizing events
    • Frequency counting
    • Reconstructing the original order
    • Set intersection/union
    • Finding a target pair x, y such that x + y = z
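The last application in the list above shows why sorted order helps: once the array is sorted, a pair
x, y with x + y = z can be found by a single two-pointer scan. A minimal C++ sketch (added as an
illustration; the function name and interface are invented):

    #include <algorithm>
    #include <vector>

    // Returns true if some pair of elements in 'a' sums to z.
    // Sorting costs O(n log n); the scan afterwards is O(n).
    bool hasPairWithSum(std::vector<int> a, int z) {
        std::sort(a.begin(), a.end());
        int lo = 0, hi = static_cast<int>(a.size()) - 1;
        while (lo < hi) {
            int sum = a[lo] + a[hi];
            if (sum == z) return true;   // found x = a[lo], y = a[hi]
            if (sum < z)  ++lo;          // sum too small: move the left pointer right
            else          --hi;          // sum too large: move the right pointer left
        }
        return false;
    }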
3. Types of Sorting Algorithms
There are many types of sorting algorithms; in this paper we discuss five of them, together with their
complexity and stability.
        1. Bubble sort
        2. Selection Sort
        3. Insertion Sort
        4. Merge Sort
        5. Quick Sort
1. Bubble sort
The bubble sort algorithm works by repeatedly swapping adjacent elements that are not in order until
the whole list of items is in sequence. In this way, items can be seen as bubbling up the list according
to their key values. The primary advantage of the bubble sort is that it is popular and easy to
implement. Furthermore, in the bubble sort, elements are swapped in place without using additional
temporary storage, so the space requirement is at a minimum. The main disadvantage of the bubble
sort is the fact that it does not deal well with a list containing a huge number of items. This is because
the bubble sort requires on the order of n² processing steps for n elements to be sorted. As
such, the bubble sort is mostly suitable for academic teaching, not for real-life applications.
Algorithm for bubble sort
1/ Input "n" elements of the array.
2/ Initialize i=0 and repeat through step 4 while (i < n).
3/ Initialize j=0 and repeat through step 4 while (j < n-1-i).
4/ If (array[j] > array[j+1]):
       temp = array[j]
       array[j] = array[j+1]
       array[j+1] = temp
5/ Display the sorted elements of the array.
6/ Exit.
Bubble Sort: Implementation
#include <algorithm>   // for std::swap

void bubbleSort2(int a[], int n) {
    for (int i = n - 1; i >= 1; i--) {
        bool is_sorted = true;               // assume this pass needs no swap
        for (int j = 1; j <= i; j++) {
            if (a[j-1] > a[j]) {
                std::swap(a[j], a[j-1]);
                is_sorted = false;           // any swap invalidates the assumption
            }
        }                                    // end of inner loop
        if (is_sorted) return;               // a full pass with no swaps: the array is sorted
    }
}
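A minimal driver for the routine above (an illustrative addition; it assumes bubbleSort2 is defined in
the same file), sorting the array that is traced step by step in the example below:

    #include <iostream>

    int main() {
        int a[] = {55, 11, 44, 22, 88};
        int n = sizeof(a) / sizeof(a[0]);
        bubbleSort2(a, n);                 // sorts the array in place
        for (int i = 0; i < n; i++)
            std::cout << a[i] << ' ';      // prints: 11 22 44 55 88
        std::cout << '\n';
        return 0;
    }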
Example: First Pass:
(55 11 44 22 88) –> (11 55 44 22 88): the algorithm compares the first two elements and swaps them, since 55 > 11.
(11 55 44 22 88) –> (11 44 55 22 88): swap, since 55 > 44.
(11 44 55 22 88) –> (11 44 22 55 88): swap, since 55 > 22.
(11 44 22 55 88) –> (11 44 22 55 88): no swap, since 55 < 88.
Second Pass:
(11 44 22 55 88) –> (11 44 22 55 88): no swap, since 11 < 44.
(11 44 22 55 88) –> (11 22 44 55 88): swap, since 44 > 22.
(11 22 44 55 88) –> (11 22 44 55 88): no swap, since 44 < 55.
Now the array is sorted, but the algorithm needs one whole pass without any swap to verify that it is sorted.
Third Pass:
(11 22 44 55 88) –> (11 22 44 55 88): no swap, since 11 < 22.
(11 22 44 55 88) –> (11 22 44 55 88): no swap, since 22 < 44.
The pass completes without a single swap, so the algorithm stops: the array is sorted.
Bubble Sort: Analysis
    • Each iteration of the inner loop (test and swap) requires time bounded by a constant c
     Two nested loops
               Outer loop: exactly n iterations
               Inner loop:
                           when i=0, (n−1) iterations
                           when i=1, (n−2) iterations
                           …
                           when i=(n−1), 0 iterations
                           Total number of iterations = 0+1+…+(n−1) = n(n−1)/2
                           Total time = c · n(n−1)/2 = O(n²)
Performance
    • Worst case performance: O(n²)
    • Best case performance: O(n)
    • Average case performance: O(n²)
    • Worst case space complexity: O(n) total, O(1) auxiliary
2. Selection Sort
The selection sort works by repeatedly going through the list of items, each time selecting an item
according to its ordering and placing it in the correct position in the sequence.
The main advantage of the selection sort is that it performs well on a small list. Furthermore, because
it is an in-place sorting algorithm, no additional temporary storage is required beyond what is needed
to hold the original list. The primary disadvantage of the selection sort is its poor efficiency when
dealing with a huge list of items. Similar to the bubble sort, the selection sort requires n-squared
number of steps for sorting n elements. Additionally, its running time is not helped by a favourable
initial ordering of the items: it performs the same number of comparisons even on an already sorted
list. Because of this, the selection sort is only suitable for a list of few elements.
Basic Idea:
    • Scan the array to find the smallest element and swap it with the element in the first position.
    • Scan the remainder of the array to find its smallest element and swap it with the element in the second position.
    • Continue in this way until all elements are placed in their correct positions.
Algorithm
1/ Input "n" elements of the array.
2/ Initialize i=0 and repeat through step 5 while (i < n).
       min = array[i]
       loc = i
3/ Initialize j=i+1 and repeat through step 4 while (j < n).
4/ If (array[j] < min):
       min = array[j]
       loc = j
5/ If (loc != i):
       temp = array[i]
       array[i] = array[loc]
       array[loc] = temp
6/ Display the sorted elements of the array.
7/ Exit.
Selection sort: implementation
               Selection Sort Code
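The code figure from the original is not reproduced here; a minimal C++ sketch that follows the
pseudocode above would be:

    #include <algorithm>   // std::swap

    // Selection sort: repeatedly select the smallest remaining element and
    // swap it into the next position of the growing sorted prefix.
    void selectionSort(int a[], int n) {
        for (int i = 0; i < n - 1; i++) {
            int loc = i;                      // index of the smallest element found so far
            for (int j = i + 1; j < n; j++)
                if (a[j] < a[loc])
                    loc = j;
            if (loc != i)
                std::swap(a[i], a[loc]);      // place the minimum at position i
        }
    }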
            Example of Selection Sort
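The example figure from the original is likewise not reproduced; a pass-by-pass trace on the array
used earlier for bubble sort would look like this:

    Pass 1: (55 11 44 22 88) –> (11 55 44 22 88)   smallest element 11 is swapped into position 1
    Pass 2: (11 55 44 22 88) –> (11 22 44 55 88)   smallest remaining element 22 is swapped into position 2
    Pass 3: (11 22 44 55 88) –> (11 22 44 55 88)   44 is already in position 3
    Pass 4: (11 22 44 55 88) –> (11 22 44 55 88)   55 is already in position 4; the array is sorted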
Time Complexity Analysis of selection sort
Owing to the two nested loops, selection sort has O(n²) time complexity in every case:

    Case              Time Complexity
    Best Case         O(n²)
    Average Case      O(n²)
    Worst Case        O(n²)
3. Insertion Sort
The insertion sort repeatedly scans the list of items, each time inserting an item from the unordered
part into its correct position in the sorted part. The insertion sort algorithm picks elements one by one
and places each one at the position where it belongs in the sorted list of elements.
The insertion sort works just like its name suggests - it inserts each item into its proper place in the final
list. The simplest implementation of this requires two list structures - the source list and the list into
which sorted items are inserted. To save memory, most implementations use an in-place sort that works
by moving the current item past the already sorted items and repeatedly swapping it with the preceding
item until it is in place. It is the most intuitive type of sorting algorithm. The approach is the same
approach that you use for sorting a set of cards in your hand. While playing cards, you pick up a card,
start at the beginning of your hand and find the place to insert the new card, insert it and move all the
others up one place.
Basic Idea: Find the location for an element and move all others up, and insert the element.
The process involved in insertion sort is as follows:
    • First, consider the first two elements: if they are out of order, swap them.
    • Next, consider the third element: insert it into its correct location relative to the 2 elements preceding it.
    • Next, consider the fourth element: insert it into its correct location relative to the 3 elements preceding it.
    • Continue in this way up to the last element.
Algorithm
1/ Input "n" elements of the array.
2/ Initialize i=1 and repeat through step 4 while (i < n).
       temp = array[i]
       pos = i - 1
3/ Repeat step 3 while (temp < array[pos] and pos >= 0):
       array[pos+1] = array[pos]
       pos = pos - 1
4/ array[pos+1] = temp
5/ Display the sorted elements of the array.
6/ Exit.
Insertion Sort: Implementation
void insertionSort(int a[], int n) {
    for (int i = 1; i < n; i++) {
        int next = a[i];                     // element to insert into the sorted prefix a[0..i-1]
        int j;
        for (j = i - 1; j >= 0 && a[j] > next; j--)
            a[j+1] = a[j];                   // shift larger elements one position to the right
        a[j+1] = next;                       // drop the element into its correct position
    }
}
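Note that the inner-loop test a[j] > next is strict, so elements equal to next are never shifted past it;
this detail is what keeps insertion sort stable.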
Insertion Sort: Analysis
    • The outer loop executes (n − 1) times.
    • The number of times the inner loop executes depends on the input:
          o Best case: the array is already sorted and (a[j] > next) is always false, so no shifting of data is necessary.
          o Worst case: the array is reversely sorted and (a[j] > next) is always true, so the insertion always occurs at the front.
    • Therefore, the best-case time is O(n)
    • and the worst-case time is O(n²)
4. Merge Sort
Merge sort is based on the divide-and-conquer paradigm. It splits the input array into two halves,
calls itself for the two halves, and then merges the two sorted halves.
1. Divide step: If the given array Arr[x ... z] has zero or one element, simply return; it is already
sorted. Otherwise, split Arr[x ... z] into two subarrays Arr[x ... y] and Arr[y + 1 ... z], each containing
about half of the elements of Arr[x ... z]. That is, y is the midpoint of Arr[x ... z].
2. Conquer step: Sort the two subarrays Arr[x ... y] and Arr[y + 1 ... z] recursively with merge sort.
3. Combine step: Merge the two sorted subarrays Arr[x ... y] and Arr[y + 1 ... z] back into the single
sorted array Arr[x ... z]. Merge sort is an O(n log n) comparison-based sorting algorithm and a classic
example of the divide-and-conquer algorithmic paradigm.
Merge sort incorporates two main ideas to improve its runtime:
    • A small list will take fewer steps to sort than a large list.
    • Fewer steps are required to construct a sorted list from two sorted lists than from two unsorted lists.
Idea (use merge to sort n items)
             Merge each pair of elements into sets of 2
             Merge each pair of sets of 2 into sets of 4
             Repeat previous step for sets of 4 …
             Final step: merge 2 sets of n/2 elements to obtain a fully sorted set
Algorithm
    Conceptually, a merge sort works as follows:
        If the list is of length 0 or 1, then it is already sorted. Otherwise:
        Divide the unsorted list into two sublists of about half the size.
        Sort each sublist recursively by re-applying merge sort.
        Merge the two sublists back into one sorted list
Merge Sort: Implementation
void merge(int a[], int low, int mid, int high);   // merges two sorted halves (see sketch below)

void mergeSort(int a[], int low, int high) {
    if (low < high) {
        int mid = (low + high) / 2;      // split point
        mergeSort(a, low, mid);          // sort the left half
        mergeSort(a, mid + 1, high);     // sort the right half
        merge(a, low, mid, high);        // merge the two sorted halves
    }
}
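The merge() helper is not shown in the document. A minimal sketch that uses a temporary buffer
(one common choice, consistent with the Θ(n) extra space noted in Section 4.3) is:

    #include <vector>

    // Merge the two sorted runs a[low..mid] and a[mid+1..high] into one sorted run.
    void merge(int a[], int low, int mid, int high) {
        std::vector<int> tmp;
        tmp.reserve(high - low + 1);
        int i = low, j = mid + 1;
        while (i <= mid && j <= high)
            tmp.push_back(a[i] <= a[j] ? a[i++] : a[j++]);   // take the smaller head; <= keeps equal keys stable
        while (i <= mid)  tmp.push_back(a[i++]);             // copy any leftovers from the left run
        while (j <= high) tmp.push_back(a[j++]);             // copy any leftovers from the right run
        for (int k = 0; k < static_cast<int>(tmp.size()); k++)
            a[low + k] = tmp[k];                             // copy the merged result back
    }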
Merge Sort: Example
4.1 Merge Sort: Analysis
    • Level 0: 0 calls to merge()
    • Level 1: 1 call to merge() with n/2 items in each half, O(1 × 2 × n/2) = O(n) time
    • Level 2: 2 calls to merge() with n/2² items in each half, O(2 × 2 × n/2²) = O(n) time
    • Level 3: 2² calls to merge() with n/2³ items in each half, O(2² × 2 × n/2³) = O(n) time
      …
    • Level lg(n): 2^(lg(n)−1) (= n/2) calls to merge() with n/2^lg(n) (= 1) item in each half, O(n) time
    • Total time complexity = O(n lg n)
    • Merge sort is an optimal comparison-based sorting method.
4.2 Time Complexity Analysis of merge sort
In merge sort, we divide the array into two (nearly) equal halves, sort each half recursively using
merge sort itself, and then merge the two sorted halves.
If T(n) is the time required by merge sort to sort an array of size n, then the recurrence relation for the
time complexity of merge sort is
    T(n) = 2T(n/2) + Θ(n) for n > 1, with T(1) = Θ(1),
which solves to T(n) = Θ(n log n).
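One standard way to solve this recurrence is to unroll it; the derivation below is a sketch added for
completeness, writing the linear term as cn:

    \begin{align*}
    T(n) &= 2\,T(n/2) + cn \\
         &= 4\,T(n/4) + 2cn \\
         &= 8\,T(n/8) + 3cn \\
         &\;\,\vdots \\
         &= 2^{k}\,T(n/2^{k}) + k\,cn.
    \end{align*}

Setting $k = \log_2 n$ gives $T(n) = n\,T(1) + c\,n\log_2 n = \Theta(n \log n)$, which agrees with the
level-by-level argument in Section 4.1.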
4.3 Space Complexity Analysis of merge sort
    • Merge sort uses additional memory for the left and right subarrays.
    • Hence, a total of Θ(n) extra memory is needed.
4.4 Advantages and disadvantages of merge sort
Advantages of merge sort
    • It is quicker for larger lists because, unlike insertion sort and bubble sort, it does not go through
      the whole list several times.
    • It has a consistent running time: it performs a similar amount of work at each stage regardless of
      the initial order of the data.
Disadvantages of merge sort
    • For small datasets, merge sort is slower than other sorting algorithms.
    • For the temporary array, merge sort requires additional space of O(n).
    • Even if the array is already sorted, merge sort goes through the entire process.
5. Quick Sort
Quick sort is one of the fastest sorting algorithms and is part of many sorting libraries. The running
time of quick sort depends heavily on the choice of the pivot element. Quick sort also belongs to the
divide-and-conquer category of algorithms, and it depends on the partition operation. To partition an
array, an element called a pivot is selected.
During partitioning, the quick sort algorithm maintains the following regions of the array Arr[p ... r]
while an index k scans it:
    • Arr[r] = x is the pivot value.
    • Arr[p ... j − 1] contains the elements that are less than x.
    • Arr[j ... k − 1] contains the elements that are greater than or equal to x.
    • Arr[k ... r − 1] contains the elements that are currently unexplored.
Since the selection of the pivot element can be treated as random, the average-case and best-case
running times are O(n log n). However, the worst-case time complexity is O(n²). Quick sort is not a
stable sorting algorithm.
5.1 Quick Sort Algorithm
Using the pivot algorithm recursively, we end up with smaller and smaller partitions. Each partition is
then processed by quick sort. We define the recursive algorithm for quicksort as follows:
Step 1 − Make the right-most index value pivot
Step 2 − partition the array using pivot value
Step 3 − quicksort left partition recursively
Step 4 − quicksort right partition recursively
5.2 Quick Sort Pseudo code
To get more into it, let us see the pseudo code for the quick sort algorithm:
procedure quickSort(left, right)
    if right-left <= 0
       return
    else
       pivot = A[right]
       partition = partitionFunc(left, right, pivot)
       quickSort(left,partition-1)
       quickSort(partition+1,right)
    end if
end procedure
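A concrete C++ version of this pseudocode (a sketch that assumes the Lomuto partition scheme,
matching the choice of the right-most element as pivot; unlike the pseudocode, the array is passed
explicitly and the pivot is read inside the partition function):

    #include <algorithm>   // std::swap

    // Partition a[left..right] around the pivot a[right]; returns the pivot's final index.
    int partitionFunc(int a[], int left, int right) {
        int pivot = a[right];
        int p = left;                        // boundary of the "less than pivot" region
        for (int j = left; j < right; j++)
            if (a[j] < pivot)
                std::swap(a[j], a[p++]);     // grow the "less than pivot" region
        std::swap(a[p], a[right]);           // put the pivot between the two regions
        return p;
    }

    void quickSort(int a[], int left, int right) {
        if (right - left <= 0) return;       // 0 or 1 element: already sorted
        int p = partitionFunc(a, left, right);
        quickSort(a, left, p - 1);           // sort the elements smaller than the pivot
        quickSort(a, p + 1, right);          // sort the remaining elements
    }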
5.3 Analysing Quicksort: The Worst Case T(n) ∈ Ω(n²)
The choice of a pivot is most critical:
            The wrong choice may lead to the worst-case quadratic time complexity.
            A good choice equalises both sublists in size and leads to linearithmic (“n log n”) time
               complexity.
The worst-case choice: the pivot happens to be the largest (or smallest) item.
            Then one subarray is always empty.
            The second subarray contains n − 1 elements, i.e. all the elements other than the pivot.
            Quicksort is recursively called only on this second group. However, quicksort is fast on
               the “randomly scattered” pivots.
Proof. The partitioning step performs at least n − 1 comparisons.
• At each subsequent step the subproblem shrinks by one element, so the number of comparisons is one
less; hence T(n) = T(n − 1) + (n − 1), with T(1) = 0.
• "Telescoping" T(n) − T(n − 1) = n − 1 and summing over all sizes from 2 up to n, the intermediate
terms cancel:
      T(n) − T(1) = (n − 1) + (n − 2) + … + 2 + 1,
so that
      T(n) = (n − 1) + (n − 2) + … + 2 + 1 = n(n − 1)/2.
This yields T(n) ∈ Ω(n²).
6. CONCLUSIONS
A sorting algorithm is said to be stable if two objects with equal keys appear in the same order in sorted
output as they appear in the input unsorted array.
Some sorting algorithms are stable by nature, such as insertion sort, merge sort, and bubble sort. In this
paper we introduced, among others, the merge sort algorithm, which runs in O(n log n) time, and the
bubble sort algorithm, which runs in O(n²) time.
Strengths of bubble sort: it is easy to understand, easy to implement, and makes no demand for large
amounts of memory; once sorted, the data are available for processing.
Weaknesses: sorting takes a long time.
Strengths of merge sort: it is well suited to accessing data with slow access rates, typically tape drives
and hard disks; the size of the file does not adversely affect its performance; and it is good for sorting
data sets that are accessed in sequence.
Weaknesses: it is not easy to implement and it requires additional storage during the merging operation.
7. RECOMMENDATIONS
First, I would like to congratulate the scholars, educators, and professors who have worked on these
useful algorithms. I have truly wondered how sorting algorithms work and how they reduce the
worst-case time complexity, especially quick sort, and it took me some time to understand how it
works. Secondly, I would like to recommend that computer scientists and other scholars who work on
these algorithms continue to improve the performance and efficiency of sorting. An important key to
algorithm design is to use sorting as a basic building block, because once a set of items is sorted, many
other problems become easy.