Greedy with Task Scheduling Algorithm
The Greedy Method
Outline and Reading
The Greedy Method Technique (§5.1)
Fractional Knapsack Problem (§5.1.1)
Task Scheduling (§5.1.2)
Minimum Spanning Trees (§7.3) [future lecture]
The Greedy Method Technique
The greedy method is a general algorithm design paradigm, built on the following elements:
• configurations: different choices, collections, or values to find
• objective function: a score assigned to configurations, which we want to either maximize or minimize
It works best when applied to problems with the greedy-choice property:
• a globally-optimal solution can always be found by a series of local improvements from a starting configuration.
Making Change
Problem: A dollar amount to reach and a collection of coin denominations to use to get there.
Configuration: A dollar amount yet to return to a customer plus the coins already returned.
Objective function: Minimize the number of coins returned.
Greedy solution: Always return the largest coin you can.
Example 1: Coins are valued $.32, $.08, $.01
• Has the greedy-choice property, since no amount over $.32 can be made with a minimum number of coins by omitting a $.32 coin (and similarly for amounts over $.08 but under $.32).
Example 2: Coins are valued $.30, $.20, $.05, $.01
• Does not have the greedy-choice property, since $.40 is best made with two $.20’s, but the greedy solution picks three coins (which ones? see the sketch below).
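As an aside, here is a minimal Python sketch of the greedy change-making rule described above, run on both coin systems; the function name greedy_change and the cents-based representation are illustrative choices, not part of the original slides.

def greedy_change(amount, coins):
    # Greedy rule: always return the largest coin that does not exceed
    # the remaining amount. Amounts are in cents to avoid floating point.
    returned = []
    for c in sorted(coins, reverse=True):
        while amount >= c:
            returned.append(c)
            amount -= c
    return returned  # reaches exact change as long as a 1-cent coin exists

print(greedy_change(40, [32, 8, 1]))      # [32, 8] -> greedy is optimal here
print(greedy_change(40, [30, 20, 5, 1]))  # [30, 5, 5] -> 3 coins, though [20, 20] is better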
The Fractional Knapsack Problem
Given: A set S of n items, with each item i having
• bi - a positive benefit
• wi - a positive weight
Goal: Choose items with maximum total benefit but with weight at most W.
If we are allowed to take fractional amounts, then this is the fractional knapsack problem.
• In this case, we let xi denote the amount we take of item i
• Objective: maximize \sum_{i \in S} b_i (x_i / w_i)
• Constraint: \sum_{i \in S} x_i \le W
Example
Given: A set S of n items, with each item i having
• bi - a positive benefit
• wi - a positive weight
Goal: Choose items with maximum total benefit but with weight at most W (“knapsack” capacity: 10 ml).

Item:           1      2      3      4      5
Weight:         4 ml   8 ml   2 ml   6 ml   1 ml
Benefit:        $12    $32    $40    $30    $50
Value ($/ml):   3      4      20     5      50

Solution:
• 1 ml of item 5
• 2 ml of item 3
• 6 ml of item 4
• 1 ml of item 2
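As a quick sanity check on the value row and the stated solution, the snippet below recomputes the benefit-to-weight ratios and the total benefit of the greedy choice; the (benefit, weight) list layout is an assumption made only for this check.

items = [(12, 4), (32, 8), (40, 2), (30, 6), (50, 1)]  # (benefit $, weight ml) for items 1..5
print([b / w for b, w in items])                       # [3.0, 4.0, 20.0, 5.0, 50.0] $ per ml
print(50 * 1 + 20 * 2 + 5 * 6 + 4 * 1)                 # 124 = total benefit of the solution above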
The Fractional Knapsack Algorithm
Greedy choice: Keep taking the item with the highest value (benefit-to-weight ratio).
• This works because \sum_{i \in S} b_i (x_i / w_i) = \sum_{i \in S} (b_i / w_i) x_i, so each unit of weight taken from item i contributes exactly its value vi = bi / wi to the objective.
• Run time: O(n log n). Why?
Correctness: Suppose there is a better solution.
• Then it takes less of some higher-value item i than the greedy solution does and a positive amount of some lower-value item j: xi < wi, xj > 0, and vi > vj.
• If we substitute some of j with i, we get an even better solution.
• How much of i: min{wi - xi, xj}
• Repeating this exchange never decreases the benefit and eventually yields the greedy solution, so there is no better solution than the greedy one.

Algorithm fractionalKnapsack(S, W)
  Input: set S of items w/ benefit bi and weight wi; max. weight W
  Output: amount xi of each item i to maximize benefit w/ weight at most W
  for each item i in S
    xi ← 0
    vi ← bi / wi   {value}
  w ← 0   {total weight}
  while w < W
    remove item i w/ highest vi
    xi ← min{wi, W - w}
    w ← w + min{wi, W - w}
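For concreteness, here is a Python sketch that follows the pseudocode above, except that it sorts the items by value once up front instead of repeatedly removing the highest-value item from a priority queue; either way the running time is O(n log n). The function name fractional_knapsack and the (benefit, weight) pair representation are assumptions, not part of the original slides.

def fractional_knapsack(items, W):
    # items: list of (benefit, weight) pairs; W: maximum total weight.
    # Greedy choice: take items in decreasing order of value b_i / w_i.
    order = sorted(range(len(items)), key=lambda i: items[i][0] / items[i][1], reverse=True)
    x = [0.0] * len(items)        # x[i] = amount taken of item i
    total_benefit, w = 0.0, 0.0   # accumulated benefit and total weight so far
    for i in order:
        if w >= W:
            break                 # knapsack is full
        b_i, w_i = items[i]
        x[i] = min(w_i, W - w)    # take as much of item i as still fits
        total_benefit += (b_i / w_i) * x[i]
        w += x[i]
    return total_benefit, x

# The items from the example slide: (benefit in $, weight in ml), W = 10 ml.
items = [(12, 4), (32, 8), (40, 2), (30, 6), (50, 1)]
print(fractional_knapsack(items, 10))
# -> benefit 124.0, taking 1 ml of item 5, 2 ml of item 3, 6 ml of item 4, 1 ml of item 2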
Task Scheduling
Given: a set T of n tasks, each having:
• A start time, si
• A finish time, fi (where si < fi)
Goal: Perform all the tasks using a minimum number of “machines.”
[Figure: tasks scheduled on Machines 1-3 along a timeline from 1 to 9]
Task Scheduling Algorithm
Greedy choice: consider tasks in order of start time, scheduling each on an existing machine when one is free and opening a new machine only when none is.
• Run time: O(n log n). Why?
Correctness: Suppose there is a better schedule.
• That is, suppose k - 1 machines suffice while the algorithm uses k.
• Let i be the first task scheduled on machine k.
• Task i must conflict with k - 1 other tasks, one already running on each of the other machines when i starts.
• But then k tasks overlap at that moment, so there is no non-conflicting schedule using k - 1 machines.

Algorithm taskSchedule(T)
  Input: set T of tasks w/ start time si and finish time fi
  Output: non-conflicting schedule with minimum number of machines
  m ← 0   {no. of machines}
  while T is not empty
    remove task i w/ smallest si
    if there’s a machine j free for i then
      schedule i on machine j
    else
      m ← m + 1
      schedule i on machine m
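The following Python sketch is one way to realize the pseudocode above: a min-heap of per-machine finish times stands in for the "is there a machine j free for i?" test, which together with the initial sort gives the O(n log n) bound. It assumes tasks are (start, finish) pairs and that a task may start at the exact moment another finishes on the same machine; both the representation and the function name task_schedule are illustrative.

import heapq

def task_schedule(tasks):
    # tasks: list of (start, finish) pairs; returns the number of machines used.
    # Greedy choice: consider tasks by start time; reuse a machine whose last
    # task has already finished, otherwise open a new machine.
    free_at = []                            # min-heap of finish times, one entry per machine
    for s, f in sorted(tasks):              # consider tasks by smallest start time first
        if free_at and free_at[0] <= s:     # some machine is free by time s
            heapq.heapreplace(free_at, f)   # schedule the task on that machine
        else:
            heapq.heappush(free_at, f)      # open a new machine for the task
    return len(free_at)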
Example
Given: a set T of n tasks, each having:
• A start time, si
• A finish time, fi (where si < fi)
• Tasks: [1,4], [1,3], [2,5], [3,7], [4,7], [6,9], [7,8] (ordered by start)
Goal: Perform all tasks on a minimum number of machines.
[Figure: the seven tasks above scheduled on Machines 1-3 along a timeline from 1 to 9]
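Running the task_schedule sketch from the previous slide on this instance reproduces the three machines shown in the figure:

tasks = [(1, 4), (1, 3), (2, 5), (3, 7), (4, 7), (6, 9), (7, 8)]
print(task_schedule(tasks))  # prints 3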