Advanced Analysis of Algorithms
Dr. Qaiser Abbas
Department of Computer Science & IT,
University of Sargodha, Sargodha, 40100, Pakistan
qaiser.abbas@uos.edu.pk
Divide and Conquer
• Like Greedy and Dynamic Programming, Divide and Conquer is an algorithmic paradigm. A typical Divide and Conquer algorithm solves a problem using the following three steps.
– 1. Divide: Break the given problem into subproblems of the same type.
– 2. Conquer: Recursively solve these subproblems.
– 3. Combine: Appropriately combine the answers.
• The following standard algorithms are Divide and Conquer algorithms.
– 1) Binary Search is a searching algorithm. In each step, the algorithm compares the input element x with the value of the middle element of the array. If the values match, it returns the index of the middle element. Otherwise, if x is less than the middle element, the algorithm recurses on the left side of the middle element, else it recurses on the right side of the middle element.
– 2) Quicksort is a sorting algorithm. The algorithm picks a pivot element and rearranges the array elements in such a way that all elements smaller than the picked pivot move to the left side of the pivot, and all greater elements move to the right side. Finally, the algorithm recursively sorts the subarrays on the left and right of the pivot element.
– 3) Merge Sort is also a sorting algorithm. The algorithm divides the array into two halves, recursively sorts them and finally merges the two sorted halves.
– 4) Closest Pair of Points: the problem is to find the closest pair of points in a set of points in the x-y plane. The problem can be solved in O(n^2) time by calculating the distance between every pair of points and comparing the distances to find the minimum. The Divide and Conquer algorithm solves the problem in O(n log n) time.
– 5) Strassen’s Algorithm is an efficient algorithm to multiply two matrices. A simple method to multiply two matrices needs 3 nested loops and takes O(n^3) time. Strassen’s algorithm multiplies two matrices in O(n^2.8074) time.
– 6) Cooley–Tukey Fast Fourier Transform (FFT) algorithm is the most common algorithm for FFT. It is a divide and conquer algorithm which works in O(n log n) time.
– 7) Karatsuba algorithm for fast multiplication: it multiplies two n-digit numbers using at most n^(log_2 3) ≈ n^1.585 single-digit multiplications in general (and exactly n^(log_2 3) when n is a power of 2). It is therefore faster than the classical algorithm, which requires n^2 single-digit products. If n = 2^10 = 1024, in particular, the exact counts are 3^10 = 59,049 and (2^10)^2 = 1,048,576, respectively.
• We will study some of these in separate lectures. Read Binary Search and Merge Sort yourself, since they were part of the “Fundamentals of Algorithms” course; a short sketch of binary search follows for reference.
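A minimal sketch of the recursive binary search described in 1) above, assuming a sorted Python list and 0-based indexing (the function name and signature are illustrative):

def binary_search(arr, x, lo=0, hi=None):
    # Return the index of x in the sorted list arr, or -1 if x is absent.
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:
        return -1                                    # empty subarray: x is not present
    mid = (lo + hi) // 2                             # index of the middle element
    if arr[mid] == x:
        return mid                                   # values match: return the middle index
    elif x < arr[mid]:
        return binary_search(arr, x, lo, mid - 1)    # recurse on the left side of the middle
    else:
        return binary_search(arr, x, mid + 1, hi)    # recurse on the right side of the middle

# Example: binary_search([2, 5, 8, 12, 23, 38], 23) returns 4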
• Divide and Conquer (D & C) vs Dynamic Programming (DP):
Both paradigms (D & C and DP) divide the given problem into subproblems and solve those subproblems. How do we choose between them for a given problem?
• Divide and Conquer should be used when the same subproblems are not evaluated many times. Otherwise, Dynamic Programming or Memoization should be used.
• For example, Binary Search is a Divide and Conquer algorithm because we never evaluate the same subproblem again. On the other hand, for the Optimal Binary Search Tree problem, Dynamic Programming should be preferred (see previous lectures for details).
Quick Sort
• Like Merge Sort, QuickSort is a Divide and Conquer algorithm.
• It picks an element as the pivot and partitions the given array around the picked pivot. There are many different versions of quickSort that pick the pivot in different ways.
– 1) Always pick first element as pivot.
– 2) Always pick last element as pivot (as in this lecture)
– 3) Pick a random element as pivot.
– 4) Pick median as pivot.
• The key process in quickSort is partition().
• The target of partition() is: given an array and an element x of the array as the pivot, put x at its correct position in the sorted array, put all elements smaller than x before x, and put all elements greater than x after x. All of this should be done in linear time.
• Here is the three-step divide-and-conquer process for sorting a typical subarray A[p…r]:
– Divide: Partition (rearrange) the array A[p…r] into two
(possibly empty) subarrays A[p…q-1] and A[q+1…r] such
that each element of A[p…q-1] is less than or equal to
A[q], which is, in turn, less than or equal to each element
of A[q+1…r]. Compute the index q as part of this
partitioning procedure.
– Conquer: Sort the two subarrays A[p…q-1] and A[q+1…r]
by recursive calls to quicksort.
– Combine: Because the subarrays are already sorted, no
work is needed to combine them: the entire array A[p…r]
is now sorted.
• The following procedure implements quicksort:
• To sort an entire array A, the initial call is
QUICKSORT(A,1,A.length())
• The key to the algorithm is the PARTITION procedure, which
rearranges the subarray A[p…r] in place.
• The running time of PARTITION on the subarray A[p…r] is O(n)
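The QUICKSORT and PARTITION procedures from the slide are not reproduced above, so here is a sketch in Python following the scheme described (last element as pivot, in-place partition). Note that the pseudocode call QUICKSORT(A,1,A.length()) is 1-based; the sketch below uses 0-based indices, so the corresponding initial call is quicksort(A, 0, len(A) - 1).

def quicksort(A, p, r):
    # Sort the subarray A[p..r] in place (0-based, inclusive indices).
    if p < r:
        q = partition(A, p, r)       # pivot ends up at index q
        quicksort(A, p, q - 1)       # recursively sort elements smaller than the pivot
        quicksort(A, q + 1, r)       # recursively sort elements greater than the pivot

def partition(A, p, r):
    # Partition A[p..r] around the last element A[r]; return the pivot's final index.
    x = A[r]                         # pivot: last element of the subarray
    i = p - 1                        # boundary of the "<= pivot" region
    for j in range(p, r):
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]  # place the pivot in its final position
    return i + 1

# Example: for a whole list A, the initial call is quicksort(A, 0, len(A) - 1)
A = [2, 8, 7, 1, 3, 5, 6, 4]
quicksort(A, 0, len(A) - 1)          # A becomes [1, 2, 3, 4, 5, 6, 7, 8]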
• Analysis of QuickSort: The time taken by QuickSort can in general be written as follows.
– T(n) = T(k) + T(n-k-1) + O(n)
– The first two terms are for the two recursive calls; the last term is for the partition process.
– k is the number of elements that are smaller than the pivot.
• The time taken by QuickSort depends upon the input array and the partition strategy.
• Worst Case: The worst case occurs when the partition process always picks the greatest or smallest element as the pivot.
• If we consider the previous partition strategy, where the last element is always picked as the pivot, the worst case occurs when the array is already sorted in increasing or decreasing order.
• In this case, partitioning produces one subproblem with n-1 elements and one with 0 elements, so the recurrence for the worst case is
– T(n) = T(0) + T(n-1) + O(n)
– which is equivalent to T(n) = T(n-1) + O(n)
– Expanding the recurrence gives c(n + (n-1) + … + 1), so the solution of the above recurrence is O(n^2) by the arithmetic series formula (eq. A.2).
• Best Case: In the most even possible split, PARTITION produces two subproblems, each of size no more than n/2, since one is of size ⌊n/2⌋ and one of size ⌈n/2⌉-1.
• In this case, quicksort runs much faster. In other words, the best case occurs when the partition process always picks the median as the pivot.
• The recurrence for the best case is
– T(n) = 2T(n/2) + O(n)
– The solution of the above recurrence is O(n log n). It follows from case 2 of the Master Theorem, since a = 2, b = 2, and f(n) = Θ(n) = Θ(n^(log_2 2)).
• Although the worst-case time complexity of QuickSort is O(n^2), which is worse than that of many other sorting algorithms such as Merge Sort and Heap Sort, QuickSort is faster in practice, because its inner loop can be implemented efficiently on most architectures and for most real-world data.
• QuickSort can be implemented in different ways by changing the choice of pivot, so that the worst case rarely occurs for a given type of data; a sketch of one such variant follows.
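For instance, a commonly used variant (a sketch only, not the version analyzed above) chooses a random pivot, swaps it to the last position, and then reuses the partition() sketched earlier, so that no fixed input systematically triggers the worst case:

import random

def randomized_quicksort(A, p, r):
    # QuickSort with a random pivot: move a random element to index r, then partition as before.
    if p < r:
        k = random.randint(p, r)     # choose a random pivot index in [p, r]
        A[k], A[r] = A[r], A[k]      # swap it to the end so partition() can be reused
        q = partition(A, p, r)
        randomized_quicksort(A, p, q - 1)
        randomized_quicksort(A, q + 1, r)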
Strassen’s Matrix Multiplication
• Given two square matrices A and B of size n x n each, find their product matrix C = A × B (the straightforward naive method is sketched below).
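The naive solution shown on the slide is not reproduced here; as a sketch, the triple-loop method (the “naive method” referred to later) for n x n matrices stored as lists of lists looks like this:

def naive_multiply(A, B):
    # Multiply two n x n matrices with three nested loops: O(n^3) time.
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
    return C

# Example: naive_multiply([[1, 2], [3, 4]], [[5, 6], [7, 8]]) returns [[19, 22], [43, 50]]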
• Divide and Conquer: The following is a simple Divide and Conquer method to multiply two square matrices.
– Divide matrices A and B into 4 sub-matrices of size N/2 x N/2, writing A = [[a, b], [c, d]] and B = [[e, f], [g, h]].
– Calculate the following blocks of the product recursively: ae + bg, af + bh, ce + dg, and cf + dh (a sketch follows).
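A sketch of this simple divide-and-conquer method, assuming N is a power of 2 (the helper names split and add are illustrative, not from the slides):

def split(M):
    # Split an N x N matrix (N even) into four N/2 x N/2 blocks:
    # top-left, top-right, bottom-left, bottom-right.
    n = len(M) // 2
    return ([row[:n] for row in M[:n]], [row[n:] for row in M[:n]],
            [row[:n] for row in M[n:]], [row[n:] for row in M[n:]])

def add(X, Y):
    # Element-wise sum of two equally sized matrices: O(N^2) time.
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def dc_multiply(A, B):
    # Simple divide-and-conquer multiplication: 8 recursive products, T(N) = 8T(N/2) + O(N^2).
    if len(A) == 1:
        return [[A[0][0] * B[0][0]]]
    a, b, c, d = split(A)
    e, f, g, h = split(B)
    top = [r1 + r2 for r1, r2 in zip(add(dc_multiply(a, e), dc_multiply(b, g)),
                                     add(dc_multiply(a, f), dc_multiply(b, h)))]
    bottom = [r1 + r2 for r1, r2 in zip(add(dc_multiply(c, e), dc_multiply(d, g)),
                                        add(dc_multiply(c, f), dc_multiply(d, h)))]
    return top + bottom

# Example: dc_multiply([[1, 2], [3, 4]], [[5, 6], [7, 8]]) returns [[19, 22], [43, 50]]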
• In the previous method, we do 8 multiplications of matrices of size N/2 x N/2 and 4 additions. Addition of two matrices takes O(N^2) time, so the time complexity can be written as
– T(N) = 8T(N/2) + O(N^2)
– From the Master Theorem (4.5), the time complexity of the above method is O(N^3): here a = 8 and b = 2, so N^(log_2 8) = N^3 dominates f(N) = O(N^2) (case 1). This is unfortunately the same as the naive method above.
• Simple Divide and Conquer also leads to O(N^3); can there be a better way?
– In the previous divide and conquer method, the main contributor to the high time complexity is the 8 recursive calls.
– The idea of Strassen’s method is to reduce the number of recursive calls to 7.
– Strassen’s method is similar to the previous simple divide and conquer method in the sense that it also divides the matrices into sub-matrices of size N/2 x N/2 as before, but in Strassen’s method the four sub-matrices of the result are calculated using the following formulae (sketched in code below).
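The formulae from the slide are sketched below in code, using the block names a–d and e–h defined earlier and the standard seven Strassen products p1..p7 (assuming N is a power of 2; helper names are illustrative):

def strassen_multiply(A, B):
    # Strassen's method: 7 recursive products instead of 8, so T(N) = 7T(N/2) + O(N^2).
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    half = n // 2
    # Split A and B into N/2 x N/2 blocks a..d and e..h.
    a = [row[:half] for row in A[:half]]; b = [row[half:] for row in A[:half]]
    c = [row[:half] for row in A[half:]]; d = [row[half:] for row in A[half:]]
    e = [row[:half] for row in B[:half]]; f = [row[half:] for row in B[:half]]
    g = [row[:half] for row in B[half:]]; h = [row[half:] for row in B[half:]]

    def add(X, Y): return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]
    def sub(X, Y): return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

    # The seven Strassen products.
    p1 = strassen_multiply(a, sub(f, h))
    p2 = strassen_multiply(add(a, b), h)
    p3 = strassen_multiply(add(c, d), e)
    p4 = strassen_multiply(d, sub(g, e))
    p5 = strassen_multiply(add(a, d), add(e, h))
    p6 = strassen_multiply(sub(b, d), add(g, h))
    p7 = strassen_multiply(sub(a, c), add(e, f))

    # The four blocks of the result C = AB.
    c11 = add(sub(add(p5, p4), p2), p6)    # equals ae + bg
    c12 = add(p1, p2)                      # equals af + bh
    c21 = add(p3, p4)                      # equals ce + dg
    c22 = sub(sub(add(p5, p1), p3), p7)    # equals cf + dh
    return ([r1 + r2 for r1, r2 in zip(c11, c12)] +
            [r1 + r2 for r1, r2 in zip(c21, c22)])

# Example: strassen_multiply([[1, 2], [3, 4]], [[5, 6], [7, 8]]) returns [[19, 22], [43, 50]]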
• Time Complexity of Strassen’s Method: Addition and subtraction of two matrices take O(N^2) time, so the time complexity can be written as
– T(N) = 7T(N/2) + O(N^2)
– From the Master Theorem, the time complexity of the above method is O(N^(log_2 7)), which is approximately O(N^2.8074).
• Generally, Strassen’s method is not preferred for practical applications, for the following reasons.
– The constants hidden in Strassen’s method are high, and for a typical application the naive method works better.
– For sparse matrices, there are better methods especially designed for them.
– The sub-matrices in the recursion take extra space.
– Because of the limited precision of computer arithmetic on non-integer values, larger errors accumulate in Strassen’s algorithm than in the naive method.
Home Work # 5