Divide-and-Conquer Technique
The most well-known algorithm design strategy:
1. Divide instance of problem into two or more smaller instances
2. Solve smaller instances recursively
3. Obtain a solution to the original (larger) instance by combining these solutions
Divide-and-Conquer Technique (cont.)
[Diagram: a problem of size n (an instance) is divided into subproblem 1 and subproblem 2, each of size n/2; a solution to subproblem 1 and a solution to subproblem 2 are combined into a solution to the original problem.]
It generally leads to a recursive algorithm!
Divide-and-Conquer Examples
Merge sort
Quicksort
Binary search
Matrix multiplication: Strassen’s algorithm
Multiplication of large integers
Analysis of Mergesort
Let T(n) denote the time complexity of sorting n elements using merge sort. Then:
T(n) = 2T(n/2) + cn, T(1) = 0, for some constant c
For simplicity, assume n = 2^k for some positive integer k.
T(n) = 2T(n/2) + cn, where c is a constant
T(n) = 2[2T(n/2^2) + cn/2] + cn = 2^2 T(n/2^2) + 2cn
T(n) = 2^2 [2T(n/2^3) + cn/2^2] + 2cn = 2^3 T(n/2^3) + 3cn
………..
T(n) = 2^k T(n/2^k) + kcn
Therefore, T(n) = 2^k T(1) + kcn = kcn (since T(1) = 0),
which implies T(n) = O(kn) = O(n log n) [since n = 2^k, k = log n].
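The derivation above counts one merge pass of cost cn per level of recursion. As a concrete companion, here is a minimal merge sort sketch (the function names are our own, not from the slides):

```cpp
#include <vector>

// Combine step: merge the sorted halves a[lo..mid] and a[mid+1..hi].
void merge(std::vector<int>& a, int lo, int mid, int hi) {
    std::vector<int> tmp;
    tmp.reserve(hi - lo + 1);
    int i = lo, j = mid + 1;
    while (i <= mid && j <= hi) tmp.push_back(a[i] <= a[j] ? a[i++] : a[j++]);
    while (i <= mid) tmp.push_back(a[i++]);
    while (j <= hi)  tmp.push_back(a[j++]);
    for (int k = lo; k <= hi; ++k) a[k] = tmp[k - lo];
}

// Divide: split in half. Conquer: sort halves recursively. Combine: merge.
void mergeSort(std::vector<int>& a, int lo, int hi) {
    if (lo >= hi) return;              // a small instance is already sorted
    int mid = lo + (hi - lo) / 2;
    mergeSort(a, lo, mid);
    mergeSort(a, mid + 1, hi);
    merge(a, lo, mid, hi);             // cost proportional to hi - lo + 1
}
```

Each level of the recursion does a total of Θ(n) merging work, and there are log n levels, matching the O(n log n) bound derived above.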
Analysis of Mergesort
(Using the Master Theorem)
T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0
Master Theorem: If a < b^d, T(n) ∈ Θ(n^d)
If a = b^d, T(n) ∈ Θ(n^d log n)
If a > b^d, T(n) ∈ Θ(n^(log_b a))
For merge sort, T(n) = 2T(n/2) + cn, T(1) = 0, for some constant c.
Comparing the recurrence with the Master Theorem, we get a = 2, b = 2, d = 1.
Since a = b^d, T(n) ∈ Θ(n^d log n) => T(n) ∈ Θ(n log n)
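The three cases reduce to comparing a with b^d, which can be captured in a tiny helper (an illustrative function of our own, not part of any library):

```cpp
#include <cmath>
#include <string>

// Classify T(n) = aT(n/b) + Theta(n^d) by the three Master Theorem cases.
// Comparing doubles for equality is acceptable here because std::pow(b, d)
// is exact for the small integer parameters used on these slides.
std::string masterCase(double a, double b, double d) {
    double bd = std::pow(b, d);
    if (a < bd) return "Theta(n^d)";
    if (a > bd) return "Theta(n^(log_b a))";
    return "Theta(n^d log n)";
}
```

For merge sort (a = 2, b = 2, d = 1) this lands in the a = b^d case, i.e., Θ(n^d log n) = Θ(n log n).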
Quick Sort
• A small instance has n <= 1. Every small instance is a sorted instance.
• To sort a large instance, select a pivot element from among the n elements.
• Partition the n elements into 3 groups: left, middle, and right.
• The middle group contains only the pivot element.
• All elements in the left group are <= pivot.
• All elements in the right group are >= pivot.
• Sort the left and right groups recursively.
• The answer is the sorted left group, followed by the middle group, followed by the sorted right group.
Example
Partition 6 2 8 5 11 10 4 1 9 7 3 using 6 as the pivot.
Left group (<= 6): 2 5 4 1 3; middle group: 6; right group (>= 6): 8 11 10 9 7.
Sort the left and right groups recursively.
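The steps above can be sketched in C++. This version uses the leftmost element as the pivot; the function and helper names are our own:

```cpp
#include <vector>
#include <utility>

// Partition a[lo..hi] around the leftmost element (the pivot).
// After the loop, elements < pivot sit in a[lo+1..i]; swapping the pivot
// into position i puts <=-pivot values left of it and >=-pivot values right.
// Returns the pivot's final position. Runs in O(n) time.
int partition(std::vector<int>& a, int lo, int hi) {
    int pivot = a[lo];
    int i = lo;
    for (int j = lo + 1; j <= hi; ++j)
        if (a[j] < pivot) std::swap(a[++i], a[j]);
    std::swap(a[lo], a[i]);
    return i;
}

void quickSort(std::vector<int>& a, int lo, int hi) {
    if (lo >= hi) return;             // small instance: already sorted
    int p = partition(a, lo, hi);
    quickSort(a, lo, p - 1);          // sort left group recursively
    quickSort(a, p + 1, hi);          // sort right group recursively
}
```

No combine step is needed: once the pivot is in its final position and both groups are sorted, the whole array is sorted.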
Choice Of Pivot
• Pivot is the leftmost element in the list to be sorted.
When sorting a[6:20], use a[6] as the pivot.
• Randomly select one of the elements to be sorted as the pivot.
When sorting a[6:20], generate a random number r in the range [6, 20] and use a[r] as the pivot.
• Median-of-three rule: from the leftmost, middle, and rightmost elements of the list to be sorted, select the one with the median key as the pivot.
When sorting a[6:20], examine a[6], a[13] ((6+20)/2), and a[20], and select the element with the median (i.e., middle) key.
Complexity
• O(n) time to partition an array of n elements.
• Let t(n) be the time needed to sort n elements.
• t(0) = t(1) = d, where d is a constant.
• When n > 1,
t(n) = t(|left|) + t(|right|) + cn,
where c is a constant.
• t(n) is maximum when either |left| = 0 or |right| = 0 following each partitioning.
• This happens, for example, when the pivot is always the smallest or largest element.
• For the worst-case time,
t(n) = t(n-1) + cn, n > 1.
• Use repeated substitution to get t(n) = O(n^2).
• The best case arises when |left| and |right| are equal (or differ by 1) following each partitioning.
• For the best case, the recurrence is the same as for merge sort: t(n) = 2t(n/2) + cn, where c is a constant.
Analysis of Quicksort
Best case: split in the middle (same as merge sort): O(n log n)
Worst case: already-sorted array! O(n^2)
T(n) = T(n-1) + cn, T(1) = 0
T(n) = T(n-1) + cn = T(n-2) + c(n-1) + cn
………
T(n) = T(1) + c(n + (n-1) + … + 2) = O(n^2)
Average case: random arrays: Θ(n log n)
Analysis of Quicksort Best Case
(Using the Master Theorem)
T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0
Master Theorem: If a < b^d, T(n) ∈ Θ(n^d)
If a = b^d, T(n) ∈ Θ(n^d log n)
If a > b^d, T(n) ∈ Θ(n^(log_b a))
For the best case, T(n) = 2T(n/2) + cn.
Comparing the recurrence with the Master Theorem, we get a = 2, b = 2, d = 1.
Since a = b^d, T(n) ∈ Θ(n^d log n) => T(n) ∈ Θ(n log n)
Average Time Complexity of Quick Sort
• Let t(n) denote the average time complexity of sorting n elements using quick sort.
• For n <= 1, t(n) = d, for some constant d.
• t(n) <= cn + (1/n) ∑_{s=0}^{n-1} [t(s) + t(n-s-1)], where s and n-s-1 denote the number of elements in the left segment and the right segment, respectively.
Proof by induction: we show that t(n) <= k n log n, where k = 2(c + d).
Base case: n = 2, t(2) <= 2(c + d).
Assume the claim is true for all n < m; it remains to prove it for n = m.
Binary Search
int binarySearch(int a[], int n, const int& x, int left, int right)
{
  while (left <= right)
  {
    int middle = (left + right) / 2;
    if (x == a[middle]) return middle;
    else if (x > a[middle]) left = middle + 1;   // search right half
    else right = middle - 1;                     // search left half
  }
  return -1; // x not found
}
Binary Search Complexity
T(n) = T(n/2) + c, T(1) = d, where c and d are constants
T(n) = T(n/2) + c = T(n/2^2) + 2c = ….. = T(n/2^k) + kc
= kc + d (assuming n = 2^k, which implies k = log n)
= O(log n)
Analysis of Binary Search
(Using the Master Theorem)
T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0
Master Theorem: If a < b^d, T(n) ∈ Θ(n^d)
If a = b^d, T(n) ∈ Θ(n^d log n)
If a > b^d, T(n) ∈ Θ(n^(log_b a))
For binary search, T(n) = T(n/2) + c.
Comparing the recurrence with the Master Theorem, we get a = 1, b = 2, d = 0.
Since a = b^d, T(n) ∈ Θ(n^d log n) => T(n) ∈ Θ(log n)
Binary Tree
A binary tree T is defined as a finite set of nodes that is either empty or consists of a root and two disjoint binary trees TL and TR, called, respectively, the left and right subtrees of the root.
Height of Binary Tree
The height of an empty tree is taken to be -1. We make one recursive call corresponding to each node in the tree (therefore in total we make n recursive calls), and the cost associated with each call is constant. Therefore, the time complexity of this recursive algorithm is O(n).
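A sketch of the height computation described above, using the convention that an empty tree has height -1 (the Node type and function name are our own illustrative choices):

```cpp
#include <algorithm>

struct Node { int data; Node* left; Node* right; };

// Height = number of edges on the longest root-to-leaf path.
// One recursive call per node, constant work per call => O(n) overall.
int height(Node* t) {
    if (t == nullptr) return -1;   // empty tree has height -1
    return 1 + std::max(height(t->left), height(t->right));
}
```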
Binary Tree Traversal
The most important divide-and-conquer algorithms for binary trees are the three classic traversals: pre-order, in-order, and post-order. All three traversals visit the nodes of a binary tree recursively, i.e., by visiting the tree’s root and its left and right subtrees. They differ only in the timing of the root’s visit:
– Pre-order traversal: the root is visited before the left and right subtrees (in that order).
– In-order traversal: the root is visited after its left subtree but before its right subtree.
– Post-order traversal: the root is visited after the left and right subtrees (in that order).
inorder(Node)
  if (Node != null)
    inorder(left_child)
    Process the Node
    inorder(right_child)
  End if
End inorder

preorder(Node)
  if (Node != null)
    Process the Node
    preorder(left_child)
    preorder(right_child)
  End if
End preorder

postorder(Node)
  if (Node != null)
    postorder(left_child)
    postorder(right_child)
    Process the Node
  End if
End postorder
Pseudocode for these traversals is straightforward. We make one recursive call corresponding to each node in the tree (therefore in total we make n recursive calls), and the cost associated with each call is constant. Therefore, the time complexity of each of these recursive algorithms is O(n).
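The in-order pseudocode above translates directly to C++; here "Process the Node" is taken to mean appending the key to an output vector (the Node type and parameter names are our own):

```cpp
#include <vector>

struct Node { int data; Node* left; Node* right; };

// In-order traversal: left subtree, then root, then right subtree.
// One recursive call per node, constant work per call => O(n) overall.
void inorder(Node* t, std::vector<int>& out) {
    if (t == nullptr) return;
    inorder(t->left, out);
    out.push_back(t->data);    // "Process the Node"
    inorder(t->right, out);
}
```

On a binary search tree, this visits the keys in sorted order.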
Multiplication of Large Integers
Some applications, notably modern cryptography, require manipulation of integers that are over 100 decimal digits long. Since such integers are too long to fit in a single word of a modern computer, they require special treatment. This practical need motivates the investigation of algorithms for efficient manipulation of large integers.
Obviously, if we use the conventional pen-and-pencil algorithm for multiplying two n-digit integers, each of the n digits of the first number is multiplied by each of the n digits of the second number, for a total of n^2 digit multiplications. (If one of the numbers has fewer digits than the other, we can pad the shorter number with leading zeros to equalize their lengths.)
Recurrence Relation (Using the Master Theorem)
The divide-and-conquer approach (Karatsuba's algorithm) computes the product of two n-digit integers using only three multiplications of n/2-digit integers, so the number of digit multiplications satisfies T(n) = 3T(n/2), T(1) = 1.
T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0
Master Theorem: If a < b^d, T(n) ∈ Θ(n^d)
If a = b^d, T(n) ∈ Θ(n^d log n)
If a > b^d, T(n) ∈ Θ(n^(log_b a))
Comparing the recurrence with the Master Theorem, we get a = 3, b = 2, d = 0.
Since a > b^d, T(n) ∈ Θ(n^(log_2 3)) => T(n) ∈ Θ(n^1.585)
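The a = 3 comes from Karatsuba's observation that both cross products can be recovered from a single extra multiplication: xh*yl + xl*yh = (xh + xl)(yh + yl) - xh*yh - xl*yl. A minimal sketch on machine integers (real large-integer code works on digit arrays; the helper names are our own):

```cpp
#include <cstdint>
#include <algorithm>

int numDigits(int64_t v) { int d = 0; while (v > 0) { ++d; v /= 10; } return d; }
int64_t ipow10(int e) { int64_t r = 1; while (e-- > 0) r *= 10; return r; }

// Karatsuba: three recursive multiplications of roughly half-length numbers.
int64_t karatsuba(int64_t x, int64_t y) {
    if (x < 10 || y < 10) return x * y;               // small instance
    int n = std::max(numDigits(x), numDigits(y));
    int64_t p = ipow10(n / 2);                        // split at 10^(n/2)
    int64_t xh = x / p, xl = x % p;
    int64_t yh = y / p, yl = y % p;
    int64_t a = karatsuba(xh, yh);                    // high * high
    int64_t b = karatsuba(xl, yl);                    // low * low
    int64_t c = karatsuba(xh + xl, yh + yl) - a - b;  // xh*yl + xl*yh
    return a * p * p + c * p + b;
}
```

Four half-size multiplications would give T(n) = 4T(n/2) and hence Θ(n^2); dropping to three is exactly what yields Θ(n^(log_2 3)).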
Analysis of Matrix Multiplication
(Using the Master Theorem)
T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0
Master Theorem: If a < b^d, T(n) ∈ Θ(n^d)
If a = b^d, T(n) ∈ Θ(n^d log n)
If a > b^d, T(n) ∈ Θ(n^(log_b a))
For the straightforward divide-and-conquer algorithm, T(n) = 8T(n/2) + cn^2.
Comparing the recurrence with the Master Theorem, we get a = 8, b = 2, d = 2.
Since a > b^d, T(n) ∈ Θ(n^(log_b a)) => T(n) ∈ Θ(n^(log_2 8)) = Θ(n^3)
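The recurrence T(n) = 8T(n/2) + cn^2 corresponds to the straightforward block decomposition: each of the four quadrants of C is the sum of two products of n/2 × n/2 blocks. A sketch for n a power of 2 (the Matrix alias and helper names are our own):

```cpp
#include <vector>

using Matrix = std::vector<std::vector<long long>>;

Matrix add(const Matrix& A, const Matrix& B) {
    int n = A.size();
    Matrix C(n, std::vector<long long>(n));
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j) C[i][j] = A[i][j] + B[i][j];
    return C;
}

// Extract the n/2 x n/2 block of A whose top-left corner is (r, c).
Matrix block(const Matrix& A, int r, int c) {
    int h = A.size() / 2;
    Matrix S(h, std::vector<long long>(h));
    for (int i = 0; i < h; ++i)
        for (int j = 0; j < h; ++j) S[i][j] = A[r + i][c + j];
    return S;
}

// Write block S into C with top-left corner at (r, c).
void put(Matrix& C, const Matrix& S, int r, int c) {
    int h = S.size();
    for (int i = 0; i < h; ++i)
        for (int j = 0; j < h; ++j) C[r + i][c + j] = S[i][j];
}

// Divide-and-conquer multiplication of n x n matrices, n a power of 2.
// Eight recursive half-size products + Theta(n^2) block bookkeeping.
Matrix multiply(const Matrix& A, const Matrix& B) {
    int n = A.size();
    if (n == 1) return {{A[0][0] * B[0][0]}};
    int h = n / 2;
    Matrix A11 = block(A,0,0), A12 = block(A,0,h), A21 = block(A,h,0), A22 = block(A,h,h);
    Matrix B11 = block(B,0,0), B12 = block(B,0,h), B21 = block(B,h,0), B22 = block(B,h,h);
    Matrix C(n, std::vector<long long>(n));
    put(C, add(multiply(A11, B11), multiply(A12, B21)), 0, 0);
    put(C, add(multiply(A11, B12), multiply(A12, B22)), 0, h);
    put(C, add(multiply(A21, B11), multiply(A22, B21)), h, 0);
    put(C, add(multiply(A21, B12), multiply(A22, B22)), h, h);
    return C;
}
```

Strassen's insight is that seven cleverly chosen half-size products suffice instead of eight, dropping a from 8 to 7 in the recurrence.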
Analysis of Strassen’s Algorithm
(Using the Master Theorem)
T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0
Master Theorem: If a < b^d, T(n) ∈ Θ(n^d)
If a = b^d, T(n) ∈ Θ(n^d log n)
If a > b^d, T(n) ∈ Θ(n^(log_b a))
For Strassen’s algorithm, T(n) = 7T(n/2) + cn^2.
Comparing the recurrence with the Master Theorem, we get a = 7, b = 2, d = 2.
Since a > b^d, T(n) ∈ Θ(n^(log_b a)) => T(n) ∈ Θ(n^(log_2 7)) ≈ Θ(n^2.807)
General Divide-and-Conquer Recurrence
T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0
Master Theorem: If a < b^d, T(n) ∈ Θ(n^d)
If a = b^d, T(n) ∈ Θ(n^d log n)
If a > b^d, T(n) ∈ Θ(n^(log_b a))
Note: the same results hold with O instead of Θ.
Examples:
T(n) = 4T(n/2) + n => T(n) ∈ ?
T(n) = 4T(n/2) + n^2 => T(n) ∈ ?
T(n) = 4T(n/2) + n^3 => T(n) ∈ ?