Recurrence Relations
CS 4102: Algorithms
Fall 2021
Mark Floryan and Tom Horton
1
2
Recurrence Relations
3
Solving Recurrence Relations
 Several (four) methods for solving:
 Directly Solve
 Substitution method
 In short, guess the runtime and solve by induction
 Recurrence trees
 We won’t see this in great detail, but a graphical view of the recurrence
 Sometimes a picture is worth 2^10 words!
 “Master” theorem
 Easy to find Order-Class for a number of common cases
 Different variations are called different things, depending on the source
4
Directly Solving (or Iteration Method)
5
Directly Solve (unrolling the recurrence)
 For Mergesort:
 T(n) = 2*T(n/2) + n
 Do it on the board
6
Another Example!!
 Consider:
 T(n) = 3*T(n/4) + n
7
Unroll the recurrence
 T(n) = 3*T(n/4) + n
 T(n) = 3*[3*T(n/16)+n/4] + n
 = 9T(n/16) + (7/4)n
 T(n) = 9T(n/16) + (7/4)n
 T(n) = 9[3T(n/64) + n/16] + (7/4)n
 T(n) = 27*T(n/64) + 9n/16 + 7n/4
 T(n) = 27*T(n/64) + 37n/16 //Pattern??
 T(n) = 3^d * T(n/4^d) + n * ∑_(i=1..d) (3/4)^(i-1)
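Before going further, a quick numeric sanity check of that pattern (a minimal sketch in Python; the base case T(1) = 1 and the test size n = 4^6 are assumptions chosen for illustration, not from the slides):

def T(n):
    # Recursive definition T(n) = 3*T(n/4) + n, with T(1) = 1 assumed
    return 1 if n <= 1 else 3 * T(n // 4) + n

def unrolled(n, d):
    # The pattern after d levels: 3^d * T(n/4^d) + n * sum_{i=1..d} (3/4)^(i-1)
    return 3**d * T(n // 4**d) + n * sum((3/4)**(i - 1) for i in range(1, d + 1))

n = 4**6
for d in range(1, 7):
    assert abs(T(n) - unrolled(n, d)) < 1e-9
print("pattern matches the recurrence for d = 1..6")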
8
Unroll the recurrence
 T(n) = 3^d * T(n/4^d) + n * ∑_(i=1..d) (3/4)^(i-1)
 We hit base case when:
 n/4^d = 1
 n = 4^d
 d = log4(n) //seem familiar??
9
Unroll the recurrence
 T(n) = 3^d * T(n/4^d) + n * ∑_(i=1..d) (3/4)^(i-1)
 Let’s do one term at a time.
 3^d * T(n/4^d)
 = 3^log4(n) * T(1) //using d = log4(n)
 3^log4(n) = n^log4(3) //huh? this is a log rule
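A one-line justification of that log rule, written out (it uses only 4^log4(3) = 3 and the power rule (x^a)^b = x^(a*b)):

3^{\log_4 n} = \left(4^{\log_4 3}\right)^{\log_4 n} = 4^{(\log_4 3)(\log_4 n)} = \left(4^{\log_4 n}\right)^{\log_4 3} = n^{\log_4 3}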
10
Unroll the recurrence
 T(n) = 3^d * T(n/4^d) + n * ∑_(i=1..d) (3/4)^(i-1)
 Let’s do one term at a time.
 n * ∑_(i=1..d) (3/4)^(i-1) //note summation part approaches 4 as d grows
 n * ∑_(i=1..d) (3/4)^(i-1) <= 4*n = Θ(n)
11
Unroll the recurrence
 T(n) = 3^d * T(n/4^d) + n * ∑_(i=1..d) (3/4)^(i-1)
 T(n) = 3^log4(n) + Θ(n)
 T(n) = n^log4(3) + Θ(n) //log rules
 T(n) = o(n) + Θ(n) //since log4(3) < 1
 T(n) = Θ(n)
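A hedged numeric check of that conclusion (T(1) = 1 and powers of 4 as test sizes are assumptions for illustration): if T(n) really is Θ(n), the ratio T(n)/n should level off at a constant as n grows.

def T(n):
    return 1 if n <= 1 else 3 * T(n // 4) + n

for k in range(2, 11):
    n = 4**k
    print(n, round(T(n) / n, 4))   # the ratios climb toward 4 and flatten out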
12
Substitution Method
13
Iteration or Substitution Method
 Strategy:
 1. Consider the Mergesort recurrence
 T(n) = 2*T(n/2) + n
 2. Guess the solution
 Let’s go with n*log(n) **Remember logs are all base 2 (usually)
 3. Inductively prove that the recurrence is in the guessed order class
 For n*log(n), we need to prove that T(n) <= c*n*log(n)
 for some constant ‘c’ and for all n >= n0
 Remember, we get to choose the ‘c’ and ‘n0’ values
 Do it on the board
14
Substitution Method: Subtleties
 Consider:
 T(n) = 2*T(n/2) + 1, with T(1) = 1
 Let’s make our guess:
 We are thinking O(n)
 Try to prove:
 T(n) <= c*n
 What happens? How do we fix this issue?
 On the board
15
Substitution Method: Subtleties
 Consider:
 T(n) = 2*T(n/2) + 1
16
Substitution Method: Subtleties
 Summary of the problem / issue:
 T(n) = 2*T(n/2) + 1
 T(n) <= 2(c*(n/2)) + 1
 T(n) <= c*n + 1
 What is the issue here?
 c*n + 1 is TOO LARGE.
 Need to prove exact form of inductive hypothesis
17
Substitution Method: Subtleties
 Here is how we fix the issue: subtract a lower-order term.
 Inductive Hypothesis:
 T(n) <= c*n – d //d is a constant. Note c*n - d <= c*n
 Fix:
 T(n) = 2*T(n/2) + 1
 T(n) <= 2*(c*(n/2) - d) + 1
 T(n) <= c*n - 2d + 1 <= c*n - d //as long as d >= 1
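A small check of the strengthened hypothesis (a sketch; T(1) = 1, powers of two, and the particular constants c = 2, d = 1 are assumptions for illustration). On powers of two this recurrence solves exactly to 2n - 1, so T(n) <= c*n - d holds:

def T(n):
    return 1 if n <= 1 else 2 * T(n // 2) + 1

c, d = 2, 1
for k in range(1, 16):
    n = 2**k
    assert T(n) == 2 * n - 1          # exact solution on powers of two
    assert T(n) <= c * n - d          # the strengthened inductive bound
print("T(n) <= c*n - d verified for n = 2 .. 2^15")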
18
Substitution Method: Another Pitfall
 Consider Mergesort recurrence again:
 T(n) = 2*T(n/2) + n
 Let’s make our guess:
 We are thinking O(n) (note: this guess is INCORRECT!)
 Try to prove:
 T(n) <= c*n
 What happens?
 On the board
19
Substitution Method: Another Pitfall
 Consider Mergesort recurrence again:
 T(n) = 2*T(n/2) + n
20
Substitution Method: Pitfall Example
 Attempt to prove:
 T(n) = 2*T(n/2) + n
 T(n) <= 2*(c*n/2) + n
 T(n) <= c*n + n
 Again, we need to prove the EXACT form of the inductive hypothesis.
 Subtracting off a lower-order term won’t help.
 Why? Because the leftover term is n, not a constant, so no constant we subtract can absorb it; the guess of O(n) is simply too low.
21
Recursion Tree Method
22
Recursion Tree Method
 Evaluate: T(n) = 2*T(n/2) + n
 Work copy: T(k) = T(k/2) + T(k/2) + k
 For k = n/2, T(n/2) = T(n/4) + T(n/4) + (n/2)
 Each tree node is labeled [size | non-recursive cost]
23
Recursion Tree: Total Cost
 To evaluate the total cost of the recursion tree,
 sum the non-recursive costs of all nodes
 = sum over all depths of rowSum(depth), the cost of all nodes at the same depth
 Determine the maximum depth of the recursion tree:
 For our example, at tree depth d the size parameter is n/2^d
 The recursion bottoms out when the size parameter reaches the base case (size 1),
 i.e., n/2^d = 1,
 so d = lg(n)
 The rowSum for each row is n
 Therefore, the total cost is T(n) = n lg(n)
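The same bookkeeping, spelled out as a small sketch (n = 1024, a power of two, is an assumption for illustration): at depth d there are 2^d nodes of size n/2^d, each contributing non-recursive cost n/2^d, so every row sums to n.

from math import log2

n = 1024
depth = int(log2(n))                      # n/2^d = 1  =>  d = lg(n)
for d in range(depth + 1):
    nodes, size = 2**d, n // 2**d
    print(f"depth {d}: {nodes} nodes of size {size}, row sum = {nodes * size}")
print("roughly lg(n) rows of cost n each -> total on the order of n*lg(n) =", n * depth)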
24
The Master Theorem
25
The Master Theorem
 Given: a divide-and-conquer algorithm
 An algorithm that divides a problem of size n into a subproblems, each of size n/b
 Let the cost of each stage (i.e., the work to divide the problem + combine solved subproblems) be described by the function f(n)
 Then, the Master Theorem gives us a cookbook for the algorithm’s running time
 Some textbooks have a simpler version they call the “Main Recurrence Theorem”
 We’ll split it into individual parts
26
The Master Theorem (from Cormen)
 If T(n) = a*T(n/b) + f(n)
 then let k = lg(a) / lg(b) = logb(a) (the critical exponent)
 Then three common cases:
 1. If f(n) ∈ O(n^(k-ε)) for some positive ε, then T(n) ∈ Θ(n^k)
 2. If f(n) ∈ Θ(n^k), then T(n) ∈ Θ(f(n) * log(n)) = Θ(n^k * log(n))
 3. If f(n) ∈ Ω(n^(k+ε)) for some positive ε, and a*f(n/b) <= c*f(n) for some c < 1 and sufficiently large n (the “regularity” condition), then T(n) ∈ Θ(f(n))
 Note: none of these cases may apply
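A minimal sketch of how the three cases play out when f(n) is a pure power n^p, so that “polynomially smaller/larger” reduces to comparing exponents (the helper classify() and the restriction to polynomial f(n) are illustrative assumptions, not part of the theorem as stated):

from math import log, isclose

def classify(a, b, p):
    """Classify T(n) = a*T(n/b) + n^p, assuming a >= 1, b > 1, f(n) = n^p."""
    k = log(a) / log(b)                    # critical exponent, log_b(a)
    if isclose(p, k):
        return f"Case 2: Theta(n^{p} * log n)"
    if p < k:
        return f"Case 1: Theta(n^{round(k, 3)})"
    # For f(n) = n^p with p > k, regularity holds: a*(n/b)^p = (a/b^p)*n^p and a/b^p < 1
    return f"Case 3: Theta(n^{p})"

print(classify(9, 3, 1))   # 9T(n/3) + n    -> Theta(n^2)
print(classify(2, 2, 1))   # 2T(n/2) + n    -> Theta(n * log n), like Mergesort
print(classify(7, 3, 2))   # 7T(n/3) + n^2  -> Theta(n^2)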
27
Using the Master Theorem
 T(n) = 9T(n/3) + n
 a = 9, b = 3, f(n) = n
 Master Theorem:
 k = lg(9) / lg(3) = log3(9) = 2
 Since f(n) = n ∈ O(n^(log3(9) - ε)), with ε = 1, case 1 applies: T(n) ∈ Θ(n^k)
 Thus the solution is T(n) = Θ(n^2) since k = 2
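A hedged numeric check of that answer (T(1) = 1 and powers of 3 as test sizes are assumptions for illustration): if T(n) = Θ(n^2), the ratio T(n)/n^2 should level off.

def T(n):
    return 1 if n <= 1 else 9 * T(n // 3) + n

for k in range(2, 12):
    n = 3**k
    print(n, round(T(n) / n**2, 4))   # ratios settle near a constant (about 1.5)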
28
Problems to Try
 Can you use a theorem on these?
 Assume T(1) = 1
 T(n) = T(n/2) + lg n
 T(n) = T(n/2) + n
 T(n) = 2T(n/2) + n (like Mergesort)
 T(n) = 2T(n/2) + n lg n
29
More Master Theorem Examples
30
Problems to Try
 Let’s try these:
 T(n) = 7T(n/3) + n^2
 T(n) = 3T(n/3) + n/2
 T(n) = 4T(n/2) + n / log(n)
 T(n) = 3T(n/3) + n / log(n)
31
Problems to Try: Solutions
 T(n) = 7T(n/3) + n^2, so a = 7, b = 3, f(n) = n^2
 k = log3(7) ≈ 1.77, and n^2 ∈ Ω(n^(k+ε)) (e.g., ε = 0.2)
 Case 3:
 regularity: a*f(n/b) = 7*(n/3)^2 = (7/9)*n^2 <= c*n^2 with c = 7/9 < 1 //YES
 So T(n) = Θ(n^2)
32
Problems to Try: Solutions
 T(n) = 3T(n/3) + n/2, so a = 3, b = 3, k = log3(3) = 1
 f(n) = n/2 ∈ Θ(n^1), so Case 2 applies: T(n) = Θ(n*log(n))
33
Problems to Try: Solutions
 T(n) = 4T(n/2) + n/log(n), so a = 4, b = 2, k = log2(4) = 2
 f(n) = n/log(n) ∈ O(n^(2-ε)) (e.g., ε = 0.5)
 Case 1 applies: T(n) = Θ(n^2)
34
Problems to Try: Solutions
 T(n) = 3T(n/3) + n/log(n), so a = 3, b = 3, k = log3(3) = 1
 Case 1 doesn’t apply because f(n) = n/log(n) is smaller than n^k = n, but not polynomially smaller
 e.g., n/log(n) !<= n^0.99 for sufficiently large n, so f(n) is not O(n^(1-ε)) for any ε > 0
 Cases 2 and 3 don’t apply either, so the Master Theorem cannot be used here
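To make “for large n” concrete, the comparison can be checked in log space (the exponent 0.99 comes from the slide; the sample values of lg(n) are assumptions for illustration). n/lg(n) does eventually exceed n^0.99, but only at astronomically large n, and the same thing happens against n^(1-ε) for any fixed ε > 0, which is exactly what “not polynomially smaller” means.

from math import log2

# n/lg(n) > n^0.99  <=>  0.01*lg(n) > lg(lg(n)); let m = lg(n) to avoid float overflow
for m in (20, 100, 500, 1000, 1200):
    print(f"n = 2^{m}:  n/lg(n) > n^0.99 ?  {0.01 * m > log2(m)}")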
35
Solutions
Solutions to problems that aren’t directly in the slides above
36
Directly Solve (unrolling the recurrence)
 For Mergesort:
 T(n) = 2*T(n/2) + n
 Do it on the board
37
Directly Solve (unrolling the recurrence)
T(n) = 2*T(n/2) + n
T(n) = 2*[2 * T(n/4) + n/2] + n //unroll one level
= 4*T(n/4) + 2n
= 4*[2*T(n/8) + n/4] + 2n //unroll another level
= 8*T(n/8) + 3n
= 8*[2*T(n/16) + n/8] + 3n //one more time
= 16*T(n/16) + 4n
38
Directly Solve (unrolling the recurrence)
T(n) = 4*T(n/4) + 2n
= 8*T(n/8) + 3n
= 16*T(n/16) + 4n
= … //what is the general pattern??
= 2^d * T(n/2^d) + d*n //where d is the depth of recursion
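A quick check of that pattern (a sketch; T(1) = 1 and n = 2^10 are assumptions for illustration): after d levels of unrolling, T(n) = 2^d * T(n/2^d) + d*n.

def T(n):
    return 1 if n <= 1 else 2 * T(n // 2) + n

n = 2**10
for d in range(1, 11):
    assert T(n) == 2**d * T(n // 2**d) + d * n
print("pattern 2^d * T(n/2^d) + d*n verified for d = 1..10")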
39
Directly Solve (unrolling the recurrence)
T(n) = 2^d * T(n/2^d) + d*n //where d is the depth of recursion
n/2^d = 1 //when do we hit T(1)?
d = log(n) //recursion ends when d is log(n)
T(n) = 2^log(n) * T(1) + n*log(n) //sub back in for d
T(n) = n*T(1) + n*log(n)
T(n) = n + n*log(n) //using T(1) = 1
T(n) = Θ(n*log(n))
40
Iteration or Substitution Method
 T(n) = 2*T(n/2) + n
 Guess n*log(n)
 Base case (n = 2):
 T(2) = 2*T(1) + 2 = 4
 Need 4 <= c*2*log(2) = 2c //true if c >= 2
41
Iteration or Substitution Method
 T(n) = 2*T(n/2) + n
 Guess n*log(n)
 Inductive Hypothesis:
 Assume for all m < n that T(m) <= c*m*log(m)
42
Iteration or Substitution Method
 T(n) = 2*T(n/2) + n
 Guess n*log(n)
 Inductive Step:
 T(n) = 2*T(n/2) + n <= 2*(c*(n/2)*log(n/2)) + n
 = c*n*(log(n) - 1) + n
 = c*n*log(n) - c*n + n <= c*n*log(n) //if c >= 1
43
Problems to Try
 Can you use a theorem on these?
 Assume T(1) = 1
 T(n) = T(n/2) + lg n
 T(n) = T(n/2) + n
 T(n) = 2T(n/2) + n (like Mergesort)
 T(n) = 2T(n/2) + n lg n
44
Problems to Try
 T(n) = T(n/2) + lg(n): a = 1, b = 2, k = log2(1) = 0, f(n) = lg(n)
 Case 3 does not apply! lg(n) is bigger than n^0 = 1, but not polynomially bigger (not Ω(n^ε) for any ε > 0)
 T(n) = T(n/2) + n: a = 1, b = 2, k = 0, f(n) = n ∈ Ω(n^(0+ε)) (e.g., ε = 1)
 Case 3: regularity: a*f(n/b) = n/2 <= (1/2)*n with c = 1/2 < 1 //YES, so T(n) = Θ(n)
45
Problems to Try
 T(n) = 2T(n/2) + n (like Mergesort)
 a = 2, b = 2, k = log2(2) = 1, f(n) = n ∈ Θ(n^1)
 Case 2: T(n) = Θ(n*log(n))
 T(n) = 2T(n/2) + n*lg(n)
 k = 1 again, but f(n) = n*lg(n)
 Is n*lg(n) ∈ Ω(n^(1+ε)) for some ε > 0? //NO! n^1 is not polynomially smaller than n*lg(n)
 (and n*lg(n) is not Θ(n^1) either, so none of the three cases applies)
 Master theorem cannot be used
