Divide and Conquer in DAA. For B.Tech CSE
Divide-and-Conquer
Divide-and-Conquer Technique
The best-known algorithm design strategy:
1. Divide an instance of the problem into two or more smaller instances
2. Solve the smaller instances recursively
3. Obtain the solution to the original (larger) instance by combining these solutions
Divide-and-Conquer Technique (cont.)
[Diagram: a problem of size n (the instance) is divided into subproblem 1 and subproblem 2, each of size n/2; a solution to subproblem 1 and a solution to subproblem 2 are combined into a solution to the original problem.]
It generally leads to a recursive algorithm!
• Merge sort
• Quicksort
• Binary search
• Matrix multiplication: Strassen’s algorithm
• Multiplication of large integers
Divide-and-Conquer Examples
Mergesort
Pseudocode of Mergesort
Time complexity of merging two sorted lists of sizes p and q: Θ(p + q) = Θ(n) comparisons
Pseudocode of Mergesort
• B = [2, 4, 15, 20], C = [1, 16, 100, 120]; merge B and C into A
• i=0, j=0, k=0
• B[i]=2 > C[j]=1, therefore A[0]=1; k=1, j=1
• B[i]=2 < C[j]=16, therefore A[1]=2; k=2, i=1
• B[i]=4 < C[j]=16, A[2]=4; k=3, i=2
• B[i]=15 < C[j]=16, A[3]=15; k=4, i=3
• B[i]=20 > C[j]=16, A[4]=16; k=5, j=2
• B[i]=20 < C[j]=100, A[5]=20; k=6, i=4
• i > 3, therefore copy the remaining elements of C into A
• A[6]=100, A[7]=120
• A = [1, 2, 4, 15, 16, 20, 100, 120]
Merge
Split:  [8 3 2 9 7 1 5 4]
        [8 3 2 9] [7 1 5 4]
        [8 3] [2 9] [7 1] [5 4]
        [8] [3] [2] [9] [7] [1] [5] [4]
Merge:  [3 8] [2 9] [1 7] [4 5]
        [2 3 8 9] [1 4 5 7]
        [1 2 3 4 5 7 8 9]
Merge sort Example
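The split-then-merge levels above can be sketched as a recursive C++ function. The in-place style with a temporary buffer is one possible implementation, not the slides' exact pseudocode:

```cpp
#include <vector>

// Recursive mergesort: divide a[lo..hi] in half, sort each half,
// then merge the two sorted halves through a temporary buffer.
void mergeSort(std::vector<int>& a, int lo, int hi) {
    if (lo >= hi) return;                     // size <= 1: already sorted
    int mid = (lo + hi) / 2;
    mergeSort(a, lo, mid);                    // sort left half
    mergeSort(a, mid + 1, hi);                // sort right half
    std::vector<int> tmp;                     // merge the two sorted halves
    int i = lo, j = mid + 1;
    while (i <= mid && j <= hi) tmp.push_back(a[i] <= a[j] ? a[i++] : a[j++]);
    while (i <= mid) tmp.push_back(a[i++]);
    while (j <= hi)  tmp.push_back(a[j++]);
    for (int k = 0; k < (int)tmp.size(); k++) a[lo + k] = tmp[k];
}
```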
Analysis of Mergesort
Let T(n) denote the time complexity of sorting n elements using merge sort. Then:
T(n) = 2T(n/2) + cn, T(1) = 0, for some constant c
For simplicity, assume n = 2^k for some positive integer k.
T(n) = 2T(n/2) + cn
     = 2[2T(n/2^2) + cn/2] + cn = 2^2 T(n/2^2) + 2cn
     = 2^2 [2T(n/2^3) + cn/2^2] + 2cn = 2^3 T(n/2^3) + 3cn
     ...
     = 2^k T(n/2^k) + kcn
Therefore, T(n) = 2^k T(1) + kcn = kcn (since T(1) = 0),
which implies T(n) = O(kn) = O(n log n) [since n = 2^k, k = log n].
Analysis of Merge sort
(Using Master's Theorem)
T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0
Master Theorem: If a < b^d, T(n) ∈ Θ(n^d)
                If a = b^d, T(n) ∈ Θ(n^d log n)
                If a > b^d, T(n) ∈ Θ(n^(log_b a))
T(n) = 2T(n/2) + cn, T(1) = 0, for some constant c
Comparing the recurrence with the Master Theorem, we get a = 2, b = 2, d = 1.
Since a = b^d, T(n) ∈ Θ(n^d log n)
=> T(n) ∈ Θ(n log n)
Quick Sort
• A small instance has n <= 1. Every small instance is already sorted.
• To sort a large instance, select a pivot element from among the n elements.
• Partition the n elements into 3 groups: left, middle, and right.
• The middle group contains only the pivot element.
• All elements in the left group are <= pivot.
• All elements in the right group are >= pivot.
• Sort the left and right groups recursively.
• The answer is the sorted left group, followed by the middle group, followed by the sorted right group.
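The steps above can be sketched in C++. The three-group partition and the leftmost-element pivot follow the slides; the function name and vector-based layout are illustrative:

```cpp
#include <vector>

// Quicksort on a[lo..hi]: partition into left (<= pivot), middle (the pivot),
// and right (>= pivot) groups, then sort the left and right groups recursively.
void quickSort(std::vector<int>& a, int lo, int hi) {
    if (lo >= hi) return;                      // small instance: already sorted
    int pivot = a[lo];                         // pivot = leftmost element
    std::vector<int> left, right;
    for (int k = lo + 1; k <= hi; k++)
        (a[k] <= pivot ? left : right).push_back(a[k]);
    int p = lo;
    for (int v : left)  a[p++] = v;            // left group
    int pivotPos = p;
    a[p++] = pivot;                            // middle group: just the pivot
    for (int v : right) a[p++] = v;            // right group
    quickSort(a, lo, pivotPos - 1);            // sort left group recursively
    quickSort(a, pivotPos + 1, hi);            // sort right group recursively
}
```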
Quick Sort
Example
6 2 8 5 11 10 4 1 9 7 3
Use 6 as the pivot.
After partitioning: the left group (elements <= 6), the pivot 6, and the right group (elements >= 6).
Sort left and right groups recursively.
Choice Of Pivot
• The pivot is the leftmost element of the list to be sorted.
  – When sorting a[6:20], use a[6] as the pivot.
• Randomly select one of the elements to be sorted as the pivot.
  – When sorting a[6:20], generate a random number r in the range [6, 20]. Use a[r] as the pivot.
Choice Of Pivot
• Median-of-Three rule. From the leftmost, middle, and rightmost elements of the list to be sorted, select the one with the median key as the pivot.
  – When sorting a[6:20], examine a[6], a[13] ((6+20)/2), and a[20]. Select the element with the median (i.e., middle) key.
Complexity
• O(n) time to partition an array of n elements.
• Let t(n) be the time needed to sort n elements.
• t(0) = t(1) = d, where d is a constant.
• When n > 1,
t(n) = t(|left|) + t(|right|) + cn,
where c is a constant.
• t(n) is maximum when either |left| = 0 or |right| =
0 following each partitioning.
Complexity
• This happens, for example, when the pivot is always the
smallest or largest element.
• For the worst-case time,
t(n) = t(n-1) + cn, n > 1
• Use repeated substitution to get t(n) = O(n^2).
• The best case arises when |left| and |right| are equal (or
differ by 1) following each partitioning.
• For the best case, the recurrence is the same as for
merge sort: t(n) = 2t(n/2) + cn, where c is a constant.
Analysis of Quicksort
Best case: split in the middle (same as merge sort), giving O(n log n)
Worst case: an already-sorted array, giving O(n^2)
T(n) = T(n-1) + Θ(n), T(1) = 0
T(n) = T(n-1) + cn = T(n-2) + c(n-1) + cn
     = ...
     = T(1) + c(n + (n-1) + ... + 2) = O(n^2)
Average case: random arrays, Θ(n log n)
Analysis of Quicksort Best Case
(Using Master's Theorem)
T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0
Master Theorem: If a < b^d, T(n) ∈ Θ(n^d)
                If a = b^d, T(n) ∈ Θ(n^d log n)
                If a > b^d, T(n) ∈ Θ(n^(log_b a))
T(n) = 2T(n/2) + cn
Comparing the recurrence with the Master Theorem, we get a = 2, b = 2, d = 1.
Since a = b^d, T(n) ∈ Θ(n^d log n)
=> T(n) ∈ Θ(n log n)
• Let t(n) denote the average time complexity of sorting n elements using quick sort.
• For n <= 1, t(n) = d, for some constant d.
• t(n) <= cn + (1/n) ∑_{s=0}^{n-1} [t(s) + t(n-s-1)], where s and n-s-1 denote the number of elements in the left segment and the right segment, respectively.
Average Time Complexity of Quick Sort
Proof by induction: we show that t(n) <= k n log n, where k = 2(c + d).
Base case n = 2: t(2) <= 2(c + d).
Assume the claim holds for all n < m; we then prove it for n = m.
Binary Search
int binarySearch(int a[], int n, const int& x, int left, int right)
{
    if (left > right) return -1; // x not found
    int middle = (left + right) / 2;
    if (x == a[middle]) return middle;
    else if (x > a[middle]) return binarySearch(a, n, x, middle + 1, right);
    else return binarySearch(a, n, x, left, middle - 1);
}
Binary Search
T(n) = T(n/2) + c, T(1) = d, where c and d are constants
T(n) = T(n/2) + c = T(n/2^2) + 2c = ... = T(n/2^k) + kc
     = kc + d (assuming n = 2^k, which implies k = log n)
     = O(log n)
Binary Search Complexity
Analysis of Binary Search
(Using Master's Theorem)
T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0
Master Theorem: If a < b^d, T(n) ∈ Θ(n^d)
                If a = b^d, T(n) ∈ Θ(n^d log n)
                If a > b^d, T(n) ∈ Θ(n^(log_b a))
T(n) = T(n/2) + c
Comparing the recurrence with the Master Theorem, we get a = 1, b = 2, d = 0.
Since a = b^d, T(n) ∈ Θ(n^d log n)
=> T(n) ∈ Θ(log n)
Binary tree traversals and related
Properties
A binary tree T is defined as a finite set of nodes that is either empty or consists of a root and
two disjoint binary trees TL and TR called, respectively, the left and right subtree of the root.
Binary Tree
Height of Tree
Height of Binary Tree
The height of an empty tree is defined to be -1. We make one recursive call for each node in the tree (n recursive calls in total), and the cost associated with each call is constant. Therefore, the time complexity of this recursive algorithm is O(n).
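The height computation can be sketched as follows. The minimal Node struct is illustrative, not from the slides:

```cpp
// Height of a binary tree: -1 for an empty tree,
// otherwise 1 + the larger of the two subtree heights.
struct Node { Node *left, *right; };

int height(const Node* t) {
    if (t == nullptr) return -1;                  // empty tree
    int hl = height(t->left), hr = height(t->right);
    return 1 + (hl > hr ? hl : hr);               // one call per node => O(n)
}
```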
Binary Tree Traversal
The most important divide-and-conquer algorithms for binary trees are the three classic traversals: pre-order, in-order, and post-order. All three traversals visit the nodes of a binary tree recursively, i.e., by visiting the tree’s root and its left and right subtrees. They differ only in the timing of the root’s visit:
– Pre-order traversal: the root is visited before the left and right subtrees (in that order).
– In-order traversal: the root is visited after visiting its left subtree but before visiting the right subtree.
– Post-order traversal: the root is visited after visiting the left and right subtrees (in that order).
inorder(Node)
  if (Node != null)
    inorder(Node.left_child)
    Process Node
    inorder(Node.right_child)
  End if
End inorder

preorder(Node)
  if (Node != null)
    Process Node
    preorder(Node.left_child)
    preorder(Node.right_child)
  End if
End preorder

postorder(Node)
  if (Node != null)
    postorder(Node.left_child)
    postorder(Node.right_child)
    Process Node
  End if
End postorder
Pseudocode for these traversals is straightforward. We are making one recursive call corresponding
to each node in the tree (Therefore in total we are making n recursive calls) and the cost associated
with each call is constant. Therefore, the time complexity of each of these recursive algorithms is
O(n)
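As a concrete C++ version of one of these traversals, in-order can record its processing order in a vector. The TNode struct and output parameter are illustrative choices, not from the slides:

```cpp
#include <vector>

// In-order traversal: left subtree, then the node itself, then right subtree.
struct TNode { int key; TNode *left, *right; };

void inorder(const TNode* node, std::vector<int>& out) {
    if (node != nullptr) {
        inorder(node->left, out);    // visit left subtree first
        out.push_back(node->key);    // then process the node
        inorder(node->right, out);   // then visit the right subtree
    }
}
```

On a binary search tree, this visits the keys in sorted order.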
Multiplication of Large Integers
Some applications, notably modern cryptography, require manipulation of integers that are over 100 decimal digits long. Since such integers are too long to fit in a single word of a modern computer, they require special treatment. This practical need motivates the study of algorithms for efficient manipulation of large integers.
Obviously, if we use the conventional pen-and-pencil algorithm for multiplying two n-digit integers, each of the n digits of the first number is multiplied by each of the n digits of the second number, for a total of n^2 digit multiplications. (If one of the numbers has fewer digits than the other, we can pad the shorter number with leading zeros to equalize their lengths.)
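The divide-and-conquer improvement analyzed below uses three recursive multiplications instead of four (Karatsuba's trick). A sketch on machine-word integers, to show the splitting and combining; a real large-integer version would operate on digit arrays:

```cpp
#include <algorithm>

// Karatsuba multiplication sketch: split x = a*10^h + b and y = c*10^h + d,
// then compute xy from three recursive products ac, bd, and (a+b)(c+d).
unsigned long long karatsuba(unsigned long long x, unsigned long long y) {
    if (x < 10 || y < 10) return x * y;            // base case: single digit
    int n = 0;                                     // digits of the larger operand
    for (unsigned long long t = std::max(x, y); t > 0; t /= 10) n++;
    int h = n / 2;
    unsigned long long p = 1;
    for (int i = 0; i < h; i++) p *= 10;           // p = 10^h
    unsigned long long a = x / p, b = x % p;       // x = a*p + b
    unsigned long long c = y / p, d = y % p;       // y = c*p + d
    unsigned long long ac = karatsuba(a, c);
    unsigned long long bd = karatsuba(b, d);
    unsigned long long mid = karatsuba(a + b, c + d) - ac - bd;  // = ad + bc
    return ac * p * p + mid * p + bd;              // three multiplications total
}
```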
Recurrence relation using Back Substitution
Recurrence relation using Master's Theorem
T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0
Master Theorem: If a < b^d, T(n) ∈ Θ(n^d)
                If a = b^d, T(n) ∈ Θ(n^d log n)
                If a > b^d, T(n) ∈ Θ(n^(log_b a))
Comparing the recurrence with the Master Theorem, we get a = 3, b = 2, d = 0.
Since a > b^d, T(n) ∈ Θ(n^(log_2 3))
=> T(n) ∈ Θ(n^1.585)
Matrix Multiplication
for (int i = 0; i < n; i++)
  for (int j = 0; j < n; j++)
  {
    c[i][j] = 0;
    for (int k = 0; k < n; k++)
      c[i][j] += A[i][k] * B[k][j];
  }
O(n^3)
Conventional Matrix Multiplication
Matrix Multiplication: Divide and Conquer
T(n) = 8T(n/2) + cn^2 for n >= 2, and T(1) = d
Solve!
• T(n) = 8T(n/2) + cn^2
       = 8[8T(n/2^2) + c(n/2)^2] + cn^2 = 8^2 T(n/2^2) + cn^2 [1 + 2]
       = 8^2 [8T(n/2^3) + c(n/2^2)^2] + cn^2 [1 + 2] = 8^3 T(n/2^3) + cn^2 [1 + 2 + 2^2]
       ...
       = 8^k T(n/2^k) + cn^2 [1 + 2 + 2^2 + ... + 2^(k-1)] = O(n^3) (since n = 2^k)
Analysis of Matrix Multiplication
(Using Master's Theorem)
T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0
Master Theorem: If a < b^d, T(n) ∈ Θ(n^d)
                If a = b^d, T(n) ∈ Θ(n^d log n)
                If a > b^d, T(n) ∈ Θ(n^(log_b a))
T(n) = 8T(n/2) + cn^2
Comparing the recurrence with the Master Theorem, we get a = 8, b = 2, d = 2.
Since a > b^d, T(n) ∈ Θ(n^(log_b a))
=> T(n) ∈ Θ(n^(log_2 8)) = Θ(n^3)
Strassen’s Matrix Multiplication
Formulas for Strassen's Algorithm
The seven Strassen products and the combination of the result blocks:
M1 = (A11 + A22)(B11 + B22)
M2 = (A21 + A22) B11
M3 = A11 (B12 - B22)
M4 = A22 (B21 - B11)
M5 = (A11 + A12) B22
M6 = (A21 - A11)(B11 + B12)
M7 = (A12 - A22)(B21 + B22)
C11 = M1 + M4 - M5 + M7    C12 = M3 + M5
C21 = M2 + M4              C22 = M1 - M2 + M3 + M6
Seven multiplications instead of eight give the recurrence:
T(n) = 7T(n/2) + cn^2, T(2) = d, for some constant d
Analysis of Strassen’s Algorithm
T(n) = 7T(n/2) + cn^2, T(1) = d, for some constant d
T(n) = 7T(n/2) + cn^2
     = 7[7T(n/2^2) + c(n/2)^2] + cn^2 = 7^2 T(n/2^2) + cn^2 [1 + 7/4]
     = 7^3 T(n/2^3) + cn^2 [1 + 7/4 + (7/4)^2]
     ...
     = 7^k T(n/2^k) + cn^2 [1 + 7/4 + (7/4)^2 + ... + (7/4)^(k-1)]
     = 7^k T(n/2^k) + cn^2 [(7/4)^k - 1] / [7/4 - 1] = O(7^k)
     = O(7^(log n)) = O(n^(log 7)) = O(n^2.81)
Analysis of Strassen’s Algorithm
(Using Master's Theorem)
T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0
Master Theorem: If a < b^d, T(n) ∈ Θ(n^d)
                If a = b^d, T(n) ∈ Θ(n^d log n)
                If a > b^d, T(n) ∈ Θ(n^(log_b a))
T(n) = 7T(n/2) + cn^2
Comparing the recurrence with the Master Theorem, we get a = 7, b = 2, d = 2.
Since a > b^d, T(n) ∈ Θ(n^(log_b a))
=> T(n) ∈ Θ(n^(log_2 7)) ≈ Θ(n^2.81)
General Divide-and-Conquer Recurrence
T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0
Master Theorem: If a < b^d, T(n) ∈ Θ(n^d)
                If a = b^d, T(n) ∈ Θ(n^d log n)
                If a > b^d, T(n) ∈ Θ(n^(log_b a))
Note: The same results hold with O instead of Θ.
Examples: T(n) = 4T(n/2) + n   => T(n) ∈ ?
          T(n) = 4T(n/2) + n^2 => T(n) ∈ ?
          T(n) = 4T(n/2) + n^3 => T(n) ∈ ?
Example: Strassen’s Matrix Multiplication
Exercise
Multiply the following two matrices using Strassen's matrix multiplication method. Multiply the 2 x 2 submatrices conventionally.
A = | 5 10 15 20 |      B = |  1  2  3  4 |
    | 5 10 15 20 |          |  5  6  7  8 |
    | 1  2  3  4 |          |  9 10 11 12 |
    | 5  6  7  8 |          | 13 14 15 16 |
Result:
A x B = | 450 500 550 600 |
        | 450 500 550 600 |
        |  90 100 110 120 |
        | 202 228 254 280 |
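The product stated in the exercise can be checked with conventional multiplication (the Strassen recursion itself is left to the reader; the matrix values below are the exercise operands, the helper names are illustrative):

```cpp
#include <array>

using Mat4 = std::array<std::array<int, 4>, 4>;

// Conventional O(n^3) multiplication of two 4x4 matrices, for checking answers.
Mat4 multiply(const Mat4& A, const Mat4& B) {
    Mat4 C{};                                   // zero-initialized result
    for (int i = 0; i < 4; i++)
        for (int j = 0; j < 4; j++)
            for (int k = 0; k < 4; k++)
                C[i][j] += A[i][k] * B[k][j];
    return C;
}
```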
Exercise
Solve the following recurrence equations using the substitution method and Master's theorem. In all the examples considered, take t(1) = 1: