Iris Hui-Ru Jiang Fall 2014
CHAPTER 4
GREEDY ALGORITHMS
IRIS H.-R. JIANG
Outline
- Content:
  - Interval scheduling: The greedy algorithm stays ahead
  - Scheduling to minimize lateness: An exchange argument
  - Shortest paths
  - The minimum spanning tree problem
  - Implementing Kruskal’s algorithm: Union-find
- Reading:
  - Chapter 4
Greedy algorithms
2
Greedy Algorithms
- An algorithm is greedy if it builds up a solution in small steps, choosing a decision at each step myopically to optimize some underlying criterion.
- It’s easy to invent greedy algorithms for almost any problem.
  - Intuitive and fast
  - Usually not optimal
- It’s challenging to prove that a greedy algorithm solves a nontrivial problem optimally. Two proof techniques:
  1. The greedy algorithm stays ahead.
  2. An exchange argument.
The greedy algorithm stays ahead
Interval Scheduling
The Interval Scheduling Problem
- Given: A set of requests {1, 2, …, n}; the ith request corresponds to an interval with start time s(i) and finish time f(i)
  - interval i: [s(i), f(i))
- Goal: Find a compatible subset of the requests of maximum size
[Figure: eight requests on a timeline from 0 to 11; compatible requests don’t overlap. Maximum compatible subset: {2, 5, 8}.]
Greedy Rule
- Repeat
  - Use a simple rule to select a first request i1.
  - Once i1 is selected, reject all requests not compatible with i1.
- Until we run out of requests.
- Q: How do we choose a greedy rule that gives a good algorithm?
- A: Four natural candidates:
  1. Earliest start time: min s(i)
  2. Shortest interval: min {f(i) - s(i)}
  3. Fewest conflicts: min over i of |{j : j is not compatible with i}|
  4. Earliest finish time: min f(i)
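To see why the choice of rule matters, the candidate rules can be run on small instances. In this sketch (the helper `greedy_by` and the sample instances are illustrative, not from the slides), earliest start time and shortest interval each fail on an instance where earliest finish time finds the optimum:

```python
def greedy_by(intervals, key):
    """Consider intervals in the order given by `key`, keeping each one
    that is compatible with everything chosen so far."""
    chosen = []
    for s, f in sorted(intervals, key=key):
        # half-open intervals [s, f) are compatible iff they don't overlap
        if all(s >= cf or f <= cs for cs, cf in chosen):
            chosen.append((s, f))
    return chosen

# Rule 1 fails: the earliest-starting interval blocks three shorter ones.
R1 = [(0, 10), (1, 2), (3, 4), (5, 6)]
print(len(greedy_by(R1, key=lambda iv: iv[0])))       # earliest start: 1
print(len(greedy_by(R1, key=lambda iv: iv[1])))       # earliest finish: 3

# Rule 2 fails: the shortest interval straddles two compatible long ones.
R2 = [(0, 5), (4, 7), (6, 11)]
print(len(greedy_by(R2, key=lambda iv: iv[1] - iv[0])))  # shortest: 1
print(len(greedy_by(R2, key=lambda iv: iv[1])))          # earliest finish: 2
```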
The Greedy Algorithm
- The 4th greedy rule (earliest finish time) leads to an optimal solution.
  - We first accept the request that finishes first.
  - Natural idea: our resource becomes free as soon as possible.
- The greedy algorithm:
  - Q: Feasible?
  - A: Yes! Line 5 ensures that A is a compatible set of requests.
  - Q: Optimal? Efficient?
Interval-Scheduling(R)
// R: undetermined requests; A: accepted requests
1. A = ∅
2. while (R is not empty) do
3.   choose a request i ∈ R with minimum f(i) // greedy rule
4.   A = A ∪ {i}
5.   R = R − {i} − X, where X = {j : j ∈ R and j is not compatible with i}
6. return A
(∅: the empty set, {})
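The pseudocode translates almost line for line into Python. This sketch represents each request as a (start, finish) pair (the names and sample intervals are illustrative); the repeated minimum-finish scan makes it the straightforward O(n^2) version:

```python
def interval_scheduling(R):
    """Greedy interval scheduling on a list of (s, f) request pairs."""
    R = list(R)
    A = []                                  # line 1: accepted requests
    while R:                                # line 2
        i = min(R, key=lambda iv: iv[1])    # line 3: minimum finish time f(i)
        A.append(i)                         # line 4
        # line 5: remove i and every request incompatible with it;
        # j is compatible with i iff their half-open intervals don't overlap
        R = [j for j in R if j[1] <= i[0] or j[0] >= i[1]]
    return A

# Example: (2, 5) conflicts with (0, 3); greedy keeps (0, 3) and (4, 7).
interval_scheduling([(0, 3), (2, 5), (4, 7)])   # -> [(0, 3), (4, 7)]
```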
The Greedy Algorithm Stays Ahead
- Q: How do we prove optimality?
  - Let O be an optimal solution. Prove A = O? Prove |A| = |O|?
- The greedy algorithm stays ahead:
  - We compare the partial solutions of A and O and show that the greedy algorithm does at least as well in a step-by-step fashion.
- Claim: Let A = {i1, …, ik} be the output of the greedy algorithm, in the order the requests were added, and let O = {j1, …, jm} be an optimal solution, in ascending order of start (equivalently, finish) times. For all indices r ≤ k, we have f(ir) ≤ f(jr).
- Pf: By induction on r.
  - Basis step: true for r = 1, since the greedy rule picks the request with the minimum finish time, so f(i1) ≤ f(j1).
  - Induction step: suppose the claim holds for r-1.
    - f(ir-1) ≤ f(jr-1) by the induction hypothesis.
    - O is compatible, so f(jr-1) ≤ s(jr).
    - Hence f(ir-1) ≤ s(jr); jr is still in R after ir-1 is selected in line 5.
    - By line 3, f(ir) ≤ f(jr).
[Figure: ir-1, ir compared with jr-1, jr. Q: can ir finish later than jr?]
The Greedy Algorithm Is Optimal
- Claim: The greedy algorithm returns an optimal set A.
- Pf: By contradiction.
  - If A is not optimal, an optimal set O must contain more requests, i.e., |O| = m > k = |A|.
  - Since f(ik) ≤ f(jk) and m > k, there is a request jk+1 in O.
  - f(jk) ≤ s(jk+1), so f(ik) ≤ s(jk+1).
  - Hence jk+1 is compatible with ik, so R should still contain jk+1 after ik is selected.
  - However, the greedy algorithm stops with request ik, and it only stops when R is empty. Contradiction.
[Figure: A = {i1, …, ik} aligned above O = {j1, …, jm}; jk+1 starts after ik finishes.]
Implementation: The Greedy Algorithm
- Running time: improved from O(n^2) to O(n log n)
  - Initialization:
    - O(n log n): sort R in ascending order of f(i)
    - O(n): construct an array S with S[i] = s(i)
  - Lines 3 and 5:
    - O(n): scan R once
    - We need not delete every incompatible request in line 5 one by one: scanning R in order of finish time, we simply skip each request that starts before the most recently selected request finishes.
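The O(n log n) refinement described above can be sketched as a single pass over the requests sorted by finish time; incompatible requests are skipped rather than deleted (illustrative code, not from the slides):

```python
def interval_scheduling_fast(R):
    """O(n log n) greedy interval scheduling; R is a list of (s, f) pairs."""
    A = []
    last_finish = float("-inf")
    for s, f in sorted(R, key=lambda iv: iv[1]):  # sort by finish time
        if s >= last_finish:    # compatible with everything accepted so far
            A.append((s, f))
            last_finish = f     # the resource becomes free again at f
    return A
```

After the sort, the scan does O(1) work per request, so the sort dominates the running time.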
The Interval Scheduling Problem
[Figure: nine requests on a timeline from 0 to 16; maximum compatible subset {1, 3, 5, 8}.]
Interval coloring
Interval Partitioning
What if We Have Multiple Resources?
- The interval partitioning problem:
  - a.k.a. the interval coloring problem: one resource = one color
  - Use as few resources as possible
- Given: A set of requests {1, 2, …, n}; the ith request corresponds to an interval with start time s(i) and finish time f(i)
  - interval i: [s(i), f(i))
- Goal: Partition the requests into a minimum number of compatible subsets, each corresponding to one resource
[Figure: a set of intervals on a timeline from 0 to 13, scheduled on 3 resources.]
How Many Resources Are Required?
- The depth of a set of intervals is the maximum number of intervals that pass over any single point on the time-line.
- Claim: In any instance of interval partitioning, the number of resources needed is at least the depth of the set of intervals.
- Pf:
  - Suppose a set of intervals has depth d, and let I1, …, Id all pass over a common point on the time-line.
  - Each of these intervals must be scheduled on a different resource, so the whole instance needs at least d resources.
[Figure: a set of intervals with depth 3; the depth is a lower bound on the number of resources.]
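The depth itself is easy to compute with a sweep over the 2n interval endpoints. This sketch (the helper name `depth` is ours, not the slides') counts how many intervals are open at once, honoring the half-open [s(i), f(i)) convention:

```python
def depth(intervals):
    """Maximum number of intervals passing over any single point.
    Intervals are half-open [s, f), so a finish at time t does not
    overlap a start at time t; finishes are processed before starts."""
    events = []
    for s, f in intervals:
        events.append((s, 1))    # interval opens
        events.append((f, -1))   # interval closes
    events.sort(key=lambda e: (e[0], e[1]))  # -1 sorts before +1 at ties
    best = cur = 0
    for _, delta in events:
        cur += delta
        best = max(best, cur)
    return best
```

For example, `depth([(0, 3), (2, 5), (4, 7)])` is 2, while the touching pair `[(0, 3), (3, 5)]` has depth 1.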
Can We Reach the Lower Bound?
- The depth d is a lower bound on the number of required resources.
- Q: Can we always schedule all requests using only d resources?
- A: Yes.
- Q: How do we prove optimality here?
- A: Find a bound that every possible solution must meet, then show that the algorithm under consideration always achieves this bound.
The Greedy Algorithm
- Assign a label to each interval. Possible labels: {1, 2, …, d}.
- Overlapping intervals must receive different labels.
- Implementation:
  - Lines 3-5: find a resource (label) compatible with Ij and assign it to Ij
    - Record the finish time of the last added interval for each label
    - Compatibility check: s(Ij) ≥ the recorded finish time of the label
    - Use a priority queue to maintain the labels
Interval-Partitioning(R)
1. {I1, …, In} = sort intervals in ascending order of their start times
2. for j from 1 to n do
3. exclude the labels of all assigned intervals that are not compatible with Ij
4. if (there is a nonexcluded label from {1, 2, …, d}) then
5. assign a nonexcluded label to Ij
6. else leave Ij unlabeled
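The priority-queue implementation suggested above can be sketched as follows: a min-heap keyed on each label's recorded finish time, reusing a label exactly when s(Ij) is at or after its earliest recorded finish (illustrative code; the bookkeeping names are ours):

```python
import heapq

def interval_partitioning(intervals):
    """Label each (s, f) interval with a resource number 1..d.
    Returns ((s, f), label) pairs in ascending order of start time."""
    free = []          # min-heap of (last_finish_time, label), one per label
    out = []
    next_label = 0
    for s, f in sorted(intervals):           # line 1: sort by start time
        if free and free[0][0] <= s:         # some label is compatible with Ij
            _, lbl = heapq.heappop(free)     # reuse it (lines 3-5)
        else:                                # otherwise open a new label
            next_label += 1
            lbl = next_label
        out.append(((s, f), lbl))
        heapq.heappush(free, (f, lbl))       # record the label's finish time
    return out
```

The number of distinct labels this sketch uses equals the depth of the instance, matching the lower bound above; each interval costs O(log d) heap work, for O(n log n) overall including the sort.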
The Interval Partitioning Problem
[Figure: ten intervals on a timeline from 0 to 13, depth 3; the greedy algorithm partitions them using 3 labels (resources).]
Optimality (1/2)
- Claim: The greedy algorithm assigns every interval a label, and no two overlapping intervals receive the same label.
- Pf:
  1. No interval ends up unlabeled.
    - Suppose interval Ij overlaps t intervals that come earlier in the sorted list.
    - These t + 1 intervals all pass over a common point, namely s(Ij).
    - Hence t + 1 ≤ d, so t ≤ d - 1; at least one of the d labels is not excluded, and such a label can be assigned to Ij.
    - i.e., line 6 never occurs!
Optimality (2/2)
- Pf (cont’d):
  2. No two overlapping intervals are assigned the same label.
    - Consider any two intervals Ii and Ij that overlap, with i < j.
    - When Ij is considered (line 2), Ii is among the assigned intervals whose labels are excluded (line 3).
    - Hence the algorithm will not assign to Ij the label used for Ii.
  - Since the algorithm uses at most d labels, the greedy algorithm always uses the minimum possible number of labels, i.e., it is optimal!
An exchange argument
Scheduling to Minimize Lateness
IRIS H.-R. JIANG
The Interval Partitioning Problem
Greedy algorithms
17
Time
0 1 2 3 4 5 6 7 8 9 10 11
depth: 3
12 13
9
8
10
6
5
4
1
2
3
7
1
2
3
4
5
6
7
8
9
10
1
2
3
IRIS H.-R. JIANG
Optimality (1/2)
¨ The greedy algorithm assigns every interval a label, and no two
overlapping intervals receive the same label.
¨ Pf:
1. No interval ends up unlabeled.
n Suppose interval Ij overlaps t intervals earlier in the sorted list.
n These t + 1 intervals pass over a common point, namely s(Ij).
n Hence, t + 1 ! d. Thus, t ! d – 1; at least one of the d labels is
not excluded, and so there is a label that can be assigned to Ij.
n i.e., line 6 never occurs!
Greedy algorithms
18
Interval-Partitioning(R)
1. {I1, …, In} = sort intervals in ascending order of their start times
2. for j from 1 to n do
3. exclude the labels of all assigned intervals that are not compatible with Ij
4. if (there is a nonexcluded label from {1, 2, …, d}) then
5. assign a nonexcluded label to Ij
6. else leave Ij unlabeled
IRIS H.-R. JIANG
Optimality (2/2)
¨ Pf: (cont’d)
2. No two overlapping intervals are assigned with the same label.
n Consider any two intervals Ii and Ij that overlap, i<j.
n When Ij is considered (in line 2), Ii is in the set of intervals
whose labels are excluded (in line 3).
n Hence, the algorithm will not assign to Ij the label used for Ii.
¤ Since the algorithm uses d labels, we can conclude that the
greedy algorithm always uses the minimum possible number of
labels, i.e., it is optimal!
Greedy algorithms
19
Interval-Partitioning(R)
1. {I1, …, In} = sort intervals in ascending order of their start times
2. for j from 1 to n do
3. exclude the labels of all assigned intervals that are not compatible with Ij
4. if (there is a nonexcluded label from {1, 2, …, d}) then
5. assign a nonexcluded label to Ij
6. else leave Ij unlabeled
An exchange argument
Scheduling to Minimize Lateness
20
Greedy algorithms
IRIS H.-R. JIANG
The Interval Partitioning Problem
Greedy algorithms
17
Time
0 1 2 3 4 5 6 7 8 9 10 11
depth: 3
12 13
9
8
10
6
5
4
1
2
3
7
1
2
3
4
5
6
7
8
9
10
1
2
3
IRIS H.-R. JIANG
What If Each Request Has a Deadline?
¨ Given: A single resource is available starting at time s. A set of
requests {1, 2, …, n}, request i requires a contiguous interval of
length ti and has a deadline di.
¨ Goal: Schedule all requests without overlapping so as to minimize
the maximum lateness.
¤ Lateness: li = max{0, f(i) – di}.
Greedy algorithms
21
[Figure: six requests scheduled back to back on a timeline from 0 to 15; each bar is annotated with its length and deadline; max lateness = 1.]
IRIS H.-R. JIANG
Greedy Rule
¨ Consider requests in some order.
1. Shortest interval first: Process requests in ascending order of ti
2. Smallest slack: Process requests in ascending order of di – ti
3. Earliest deadline first: Process requests in ascending order of di
Greedy algorithms
22
Counterexample for rule 1 (shortest interval first):
job i: 1, 2; ti: 1, 10; di: 100, 10
It schedules job 1 first, so job 2 finishes at 11 with lateness 1; scheduling job 2 first gives maximum lateness 0.
Counterexample for rule 2 (smallest slack):
job i: 1, 2; ti: 1, 10; di: 2, 10
The slacks are 1 and 0, so it schedules job 2 first and job 1 finishes at 11 with lateness 9; earliest deadline first gives maximum lateness 1.
Jobs with earlier deadlines
get completed earlier
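Both counterexamples can be checked numerically by computing the maximum lateness of each processing order. A small sketch (the helper name max_lateness and the dict encoding are my assumptions):

```python
def max_lateness(order, t, d, start=0):
    """Max lateness when jobs run back to back in the given order."""
    f, worst = start, 0
    for j in order:
        f += t[j]                      # finishing time f(j)
        worst = max(worst, f - d[j])   # lateness max{0, f(j) - d(j)}
    return worst

# Rule 1 fails: t = (1, 10), d = (100, 10)
t1, d1 = {1: 1, 2: 10}, {1: 100, 2: 10}
shortest_first = max_lateness([1, 2], t1, d1)  # job 1 is shorter
edf1 = max_lateness([2, 1], t1, d1)            # job 2 has the earlier deadline

# Rule 2 fails: t = (1, 10), d = (2, 10), slacks 1 and 0
t2, d2 = {1: 1, 2: 10}, {1: 2, 2: 10}
smallest_slack = max_lateness([2, 1], t2, d2)  # job 2 has the smaller slack
edf2 = max_lateness([1, 2], t2, d2)            # earliest deadline first
```

Running this reproduces the lateness values of 1 vs. 0 and 9 vs. 1 claimed above.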
IRIS H.-R. JIANG
Minimizing Lateness
¨ Greedy rule: Earliest deadline first!
Greedy algorithms
23
Min-Lateness(R,s)
// f: the finishing time of the last scheduled request
1. {d1, …,dn} = sort requests in ascending order of their deadlines
2. f = s
3. for i from 1 to n do
4. assign request i to the time interval from s(i) = f to f(i) = f + ti
5. f = f + ti
6. return the set of scheduled intervals [s(i), f(i)) for all i = 1..n
[Figure: the same six requests scheduled in ascending order of deadline, starting at time s; max lateness = 1.]
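The Min-Lateness pseudocode translates almost line for line into Python. A sketch under my own naming, where requests maps each request i to its pair (ti, di):

```python
def min_lateness(requests, s=0):
    """Earliest deadline first: schedule all requests back to back from s.
    requests: dict i -> (t_i, d_i). Returns {i: (s(i), f(i))}."""
    order = sorted(requests, key=lambda i: requests[i][1])  # ascending d_i
    f, schedule = s, {}
    for i in order:
        t_i, _ = requests[i]
        schedule[i] = (f, f + t_i)   # contiguous intervals: no idle time
        f += t_i
    return schedule
```

Because line 4's interval always starts at the previous finishing time f, the resulting schedule has no idle time, which the next slide exploits.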
IRIS H.-R. JIANG
No Idle Time
¨ Observation: The greedy schedule has no idle time.
¤ Line 4!
¨ There is an optimal schedule with no idle time.
Greedy algorithms
24
Min-Lateness(R,s)
// f: the finishing time of the last scheduled request
1. {d1, …,dn} = sort requests in ascending order of their deadlines
2. f = s
3. for i from 1 to n do
4. assign request i to the time interval from s(i) = f to f(i) = f + ti
5. f = f + ti
6. return the set of scheduled intervals [s(i), f(i)) for all i = 1..n
[Figure: a schedule with idle gaps between jobs with d = 4, d = 6, d = 12, versus the same jobs shifted left to close the gaps; removing idle time never increases any lateness.]
IRIS H.-R. JIANG
No Inversions
¨ Exchange argument: Gradually transform an optimal solution
to the one found by the greedy algorithm without hurting its
quality.
¨ An inversion in schedule S is a pair of requests i and j such
that s(i) < s(j) but dj < di.
¨ All schedules without inversions and without idle time have the
same maximum lateness.
¨ Pf:
¤ If two different schedules have neither inversions nor idle time,
then they can only differ in the order in which requests with
identical deadlines are scheduled.
¤ Consider such a deadline d. In both schedules, the jobs with
deadline d are all scheduled consecutively (after all jobs with
earlier deadlines and before all jobs with later deadlines).
¤ Among them, the last one has the greatest lateness, and this
lateness does not depend on the order of the requests.
Greedy algorithms
25
IRIS H.-R. JIANG
Optimality
¨ There is an optimal schedule with no inversions and no idle
time.
¨ Pf:
¤ There is an optimal schedule O without idle time. (done!)
1. If O has an inversion, there is a pair of jobs i and j such that j is
scheduled immediately after i and has dj < di.
2. After swapping i and j we get a schedule with one less
inversion.
3. The new swapped schedule has a maximum lateness no larger
than that of O.
n Other requests have the same lateness.
n j now finishes earlier, so its lateness can only decrease; i takes
over j's old finishing time f, and since dj < di its new lateness
f – di is less than j's old lateness f – dj.
Greedy algorithms
26
[Figure: swapping an adjacent inversion: before, i runs immediately before j; after, j runs immediately before i; i's new finishing time equals j's old finishing time fi, and dj < di.]
IRIS H.-R. JIANG
Optimality: Exchange Argument
¨ Theorem: The greedy schedule S is optimal.
¨ Pf: Proof by contradiction
¤ Let O be an optimal schedule with no idle time and, among such
schedules, the fewest inversions.
¤ If O has no inversions, then S and O have the same maximum
lateness (previous slide), so S is optimal. done!
¤ If O has an inversion, let i-j be an adjacent inversion.
n Swapping i and j does not increase the maximum lateness and
strictly decreases the number of inversions.
n This contradicts definition of O.
Greedy algorithms
27
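The theorem can also be sanity-checked by brute force on small instances, comparing EDF against the best of all n! processing orders. A sketch (function names are mine; feasible only for tiny n):

```python
from itertools import permutations

def max_lateness(order, t, d):
    """Max lateness when jobs run back to back in the given order."""
    f, worst = 0, 0
    for j in order:
        f += t[j]
        worst = max(worst, f - d[j])
    return worst

def edf(t, d):
    """Maximum lateness of the earliest-deadline-first schedule."""
    return max_lateness(sorted(t, key=lambda j: d[j]), t, d)

def brute_force_opt(t, d):
    """Minimum achievable maximum lateness over all processing orders."""
    return min(max_lateness(p, t, d) for p in permutations(t))
```

On every instance, edf should agree with brute_force_opt, which is exactly what the exchange argument guarantees.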
IRIS H.-R. JIANG
Summary: Greedy Analysis Strategies
¨ An algorithm is greedy if it builds up a solution in small steps,
choosing a decision at each step myopically to optimize some
underlying criterion.
¨ It’s challenging to prove greedy algorithms succeed in solving
a nontrivial problem optimally.
1. The greedy algorithm stays ahead: Show that after each step of
the greedy algorithm, its partial solution is better than the
optimal.
2. An exchange argument: Gradually transform an optimal solution
to the one found by the greedy algorithm without hurting its
quality.
Greedy algorithms
28
Edsger W. Dijkstra 1959
Shortest Paths
29
Greedy algorithms
IRIS H.-R. JIANG
Edsger W. Dijkstra (1930—2002)
¨ 1972 recipient of the ACM Turing Award
Greedy algorithms
30
The question of whether computers can think is like
the question of whether submarines can swim.
http://www.cs.utexas.edu/users/EWD/obituary.html
If you want more effective programmers,
you will discover that they should not waste their time debugging,
they should not introduce the bugs to start with.
Program testing can be a very effective way to show the presence of bugs,
but it is hopelessly inadequate for showing their absence.
-- Turing Award Lecture 1972, the humble programmer
IRIS H.-R. JIANG
Google Map
Greedy algorithms
31
Shortest path from
San Francisco Shopping Centre to
Yosemite National Park
IRIS H.-R. JIANG
Floor Guide in a Shopping Mall
¨ Direct shoppers to their destinations in real-time
Greedy algorithms
32
Edsger W. Dijkstra 1959
Shortest Paths
29
Greedy algorithms
IRIS H.-R. JIANG
Edsger W. Dijkstra (1930—2002)
¨ 1972 recipient of the ACM Turing Award
Greedy algorithms
30
The question of whether computers can think is like
the question of whether submarines can swim.
http://www.cs.utexas.edu/users/EWD/obituary.html
If you want more effective programmers,
you will discover that they should not waste their time debugging,
they should not introduce the bugs to start with.
Program testing can be a very effective way to show the presence of bugs,
but it is hopelessly inadequate for showing their absence.
-- Turing Award Lecture 1972, the humble programmer
IRIS H.-R. JIANG
Google Map
Greedy algorithms
31
Shortest path from
San Francisco Shopping Centre to
Yosemite National Park
IRIS H.-R. JIANG
Floor Guide in a Shopping Mall
¨ Direct shoppers to their destinations in real-time
Greedy algorithms
32
Edsger W. Dijkstra 1959
Shortest Paths
29
Greedy algorithms
IRIS H.-R. JIANG
Edsger W. Dijkstra (1930—2002)
¨ 1972 recipient of the ACM Turing Award
Greedy algorithms
30
The question of whether computers can think is like
the question of whether submarines can swim.
http://www.cs.utexas.edu/users/EWD/obituary.html
If you want more effective programmers,
you will discover that they should not waste their time debugging,
they should not introduce the bugs to start with.
Program testing can be a very effective way to show the presence of bugs,
but it is hopelessly inadequate for showing their absence.
-- Turing Award Lecture 1972, the humble programmer
IRIS H.-R. JIANG
Google Map
Greedy algorithms
31
Shortest path from
San Francisco Shopping Centre to
Yosemite National Park
IRIS H.-R. JIANG
Floor Guide in a Shopping Mall
¨ Direct shoppers to their destinations in real-time
Greedy algorithms
32
Edsger W. Dijkstra 1959
Shortest Paths
29
Greedy algorithms
IRIS H.-R. JIANG
Edsger W. Dijkstra (1930—2002)
¨ 1972 recipient of the ACM Turing Award
Greedy algorithms
30
The question of whether computers can think is like
the question of whether submarines can swim.
http://www.cs.utexas.edu/users/EWD/obituary.html
If you want more effective programmers,
you will discover that they should not waste their time debugging,
they should not introduce the bugs to start with.
Program testing can be a very effective way to show the presence of bugs,
but it is hopelessly inadequate for showing their absence.
-- Turing Award Lecture 1972, the humble programmer
IRIS H.-R. JIANG
Google Map
Greedy algorithms
31
Shortest path from
San Francisco Shopping Centre to
Yosemite National Park
IRIS H.-R. JIANG
Floor Guide in a Shopping Mall
¨ Direct shoppers to their destinations in real-time
Greedy algorithms
32
IRIS H.-R. JIANG
The Shortest Path Problem
¨ Given:
¤ Directed graph G = (V, E)
n Length ℓe = length of edge e = (u, v) ∈ E
n Distance; time; cost
n ℓe ≥ 0
n Q: what if undirected?
n A: 1 undirected edge = 2 directed ones
¤ Source s
¨ Goal:
¤ Shortest path Pv from s to each other node v ∈ V – {s}
n Length of path P: ℓ(P) = Σe∈P ℓe
Greedy algorithms
33
http://upload.wikimedia.org/wikipedia/commons/4/45/Dijksta_Anim.gif
ℓ(a→b) = ℓ(1→3→6→5)
= 9+2+9 = 20
IRIS H.-R. JIANG
Dijkstra’s Algorithm
Greedy algorithms
34
E. W. Dijkstra: A note on two problems in connexion with graphs.
In Numerische Mathematik, 1 (1959), S. 269–271.
Dijkstra(G,l)
// S: the set of explored nodes
// for each u Î S, we store a shortest path distance d(u) from s to u
1. initialize S = {s}, d(s) = 0
2. while S ≠ V do
3. select a node v ∉ S with at least one edge from S for which
4. d'(v) = min over e = (u, v) with u ∈ S of d(u) + ℓe
5. add v to S and define d(v) = d'(v)
shortest path to some u in the
explored part, followed by a
single edge (u, v)
[Figure: example graph on nodes s, u, x, v, y, z with edge lengths 1–4; as S grows, the tentative values update from d'(x) = 2, d'(y) = 4, d'(z) = 5 to d'(y) = 3, d'(z) = 4.]
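Dijkstra's loop maps naturally onto a min priority queue. A minimal Python sketch (the adjacency-dict representation and names are my assumptions, not from the slides); since Python's heapq offers no ChangeKey, the sketch re-inserts a node with each improved d'(v) and skips stale entries on extraction (lazy deletion), giving O(m log m) = O(m log n) overall:

```python
import heapq

def dijkstra(graph, s):
    """graph: dict u -> list of (v, length) with length >= 0.
    Returns d: shortest-path distance from s to every reachable node."""
    d = {}                      # the explored set S with final distances
    pq = [(0, s)]               # candidate d'(v) values, possibly stale
    while pq:
        dv, v = heapq.heappop(pq)
        if v in d:              # stale entry: v was already explored
            continue
        d[v] = dv               # v minimizes d' among unexplored nodes
        for w, length in graph.get(v, ()):
            if w not in d:      # relax each outgoing edge (v, w), w not in S
                heapq.heappush(pq, (dv + length, w))
    return d
```

Re-inserting instead of decreasing a key trades a slightly larger heap for much simpler code; with a Fibonacci heap and a true ChangeKey the bound improves to O(m + n log n), as the table on the Implementation slide shows.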
IRIS H.-R. JIANG
Correctness
¨ Loop invariant: Consider the set S at any point in the
algorithm’s execution. For each node u Î S, d(u) is the length
of the shortest s-u path Pu.
¨ Pf: Proof by induction on |S|
¤ Basis step: trivial for |S| = 1.
¤ Induction step: hypothesis: true for |S| = k ≥ 1.
n Grow S by adding v;
let (u, v) be the final edge on our s-v path Pv.
n By induction hypothesis, Pu is the shortest s-u path.
n Consider any other s-v path P; P must leave S somewhere; let
y be the first node on P that is not in S, and x ∈ S be the node
just before y.
n P cannot be shorter than Pv because it is already at least as
long as Pv by the time it leaves the set S.
n At iteration k+1, d(v) = d'(v) = d(u) + ℓ(u, v) ≤ d(x) + ℓ(x, y) ≤ ℓ(P)
Greedy algorithms
35
[Figure: path P leaves the explored set S through edge (x, y), while Pv reaches v directly. The algorithm stays ahead!]
Q: Why ℓe ≥ 0? Once P leaves S through (x, y), nonnegative edge
lengths guarantee its remaining portion cannot make it shorter.
IRIS H.-R. JIANG
Implementation
¨ Q: How to do line 4 efficiently? d'(v) = min over e = (u, v), u ∈ S of d(u) + ℓe
¨ A: Explicitly maintain d'(v) for each unexplored node v, rather than
recomputing it from S
¤ Next node to explore = node with minimum d'(v).
¤ When exploring v, update d'(w) for each outgoing (v, w), w ∉ S.
¨ Q: How?
¨ A: Implement a min priority queue nicely
Greedy algorithms
36
Operation    Calls in Dijkstra   Array   Binary heap   Fibonacci heap
Insert       O(n)                O(n)    Θ(lg n)       Θ(1)
ExtractMin   O(n)                O(n)    Θ(lg n)       O(lg n)
ChangeKey    O(m)                O(1)    Θ(lg n)       Θ(1)
IsEmpty      O(n)                O(1)    Θ(1)          Θ(1)
Total                            O(n²)   O(m lg n)     O(m + n lg n)
IRIS H.-R. JIANG
The Shortest Path Problem
¨ Given:
¤ Directed graph G = (V, E)
n Length le = length of edge e = (u, v) Î E
n Distance; time; cost
n le ³ 0
n Q: what if undirected?
n A: 1 undirected edge = 2 directed ones
¤ Source s
¨ Goal:
¤ Shortest path Pv from s to each other node v Î V – {s}
n Length of path P: l(P)= SeÎP le
Greedy algorithms
33
http://upload.wikimedia.org/wikipedia/commons/4/45/Dijksta_Anim.gif
l(a®b) = l(1®3®6®5)
= 9+2+9 = 20
IRIS H.-R. JIANG
Dijkstra’s Algorithm
Greedy algorithms
34
E. W. Dijkstra: A note on two problems in connexion with graphs.
In Numerische Mathematik, 1 (1959), S. 269–271.
Dijkstra(G,l)
// S: the set of explored nodes
// for each u Î S, we store a shortest path distance d(u) from s to u
1. initialize S = {s}, d(s) = 0
2. while S ¹ V do
3. select a node v Ï S with at least one edge from S for which
4. d'(v) = mine = (u, v): uÎS d(u)+le
5. add v to S and define d(v) = d'(v)
shortest path to some u in
explored part, followed by a
single edge (u, v)
s
u
x
v
y
z
1
2
1
2
4
3
3
1
2
S: explored
d'(x) = 2; d'(y) = 4; d'(z) = 5
d'(y) = 3; d'(z) = 4
IRIS H.-R. JIANG
Correctness
¨ Loop invariant: Consider the set S at any point in the
algorithm’s execution. For each node u Î S, d(u) is the length
of the shortest s-u path Pu.
¨ Pf: Proof by induction on |S|
¤ Basis step: trivial for |S| = 1.
¤ Induction step: hypothesis: true for k ³ 1.
n Grow S by adding v;
let (u, v) be the final edge on our s-v path Pv.
n By induction hypothesis, Pu is the shortest s-u path.
n Consider any other s-v path P; P must leave S somewhere; let
y be the first node on P that is not in S, and x Î S be the node
just before y.
n P cannot be shorter than Pv because it is already at least as
long as Pv by the time it has left the set S.
n At iteration k+1, d(v) = d'(v) = d(u) + le=(u, v) £ d(x) + le'=(x, y) £ l(P)
Greedy algorithms
35
s
x
u
y
v
S P
Pv
The algorithm
stays ahead!
Q: Why le ³ 0?
IRIS H.-R. JIANG
Implementation
¨ Q: How to do line 4 efficiently? d'(v) = mine = (u, v): uÎS d(u)+le
¨ A: Explicitly maintain d'(v) in the view of each unexplored node
v instead of S
¤ Next node to explore = node with minimum d'(v).
¤ When exploring v, update d'(w) for each outgoing (v, w), w Ï S.
¨ Q: How?
¨ A: Implement a min priority queue nicely
Greedy algorithms
36
Operation Dijkstra Array Binary heap Fibonacci heap
Insert O(n) O(n) Q(lg n) Q(1)
ExtractMin O(n) O(n) Q(lg n) O(lg n)
ChangeKey O(m) O(1) Q(lg n) Q(1)
IsEmpty O(n) O(1) Q(1) Q(1)
Total O(n2) O(mlg n) O(m+nlg n)
Operation Dijkstra Array Binary heap Fibonacci heap
Insert O(n)
ExtractMin O(n)
ChangeKey O(m)
IsEmpty O(n)
Total
Operation Dijkstra Array Binary heap Fibonacci heap
Insert
ExtractMin
ChangeKey
IsEmpty
Total
IRIS H.-R. JIANG
The Shortest Path Problem
¨ Given:
¤ Directed graph G = (V, E)
IRIS H.-R. JIANG
The Shortest Path Problem
¨ Given:
¤ Directed graph G = (V, E)
n Length le = length of edge e = (u, v) ∈ E
n Distance; time; cost
n le ≥ 0
n Q: what if undirected?
n A: 1 undirected edge = 2 directed ones
¤ Source s
¨ Goal:
¤ Shortest path Pv from s to each other node v ∈ V – {s}
n Length of path P: l(P) = Σe∈P le
Greedy algorithms
33
http://upload.wikimedia.org/wikipedia/commons/4/45/Dijksta_Anim.gif
l(a→b) = l(1→3→6→5)
= 9+2+9 = 20
IRIS H.-R. JIANG
Dijkstra’s Algorithm
Greedy algorithms
34
E. W. Dijkstra: A note on two problems in connexion with graphs.
In Numerische Mathematik, 1 (1959), S. 269–271.
Dijkstra(G,l)
// S: the set of explored nodes
// for each u ∈ S, we store a shortest path distance d(u) from s to u
1. initialize S = {s}, d(s) = 0
2. while S ≠ V do
3. select a node v ∉ S with at least one edge from S for which
4. d'(v) = min e = (u, v): u∈S d(u) + le
5. add v to S and define d(v) = d'(v)
(d'(v): shortest path to some u in the explored part, followed by a single edge (u, v))
[Figure: example graph with nodes s, u, x, v, y, z; S: explored.
First iteration: d'(x) = 2; d'(y) = 4; d'(z) = 5.
After adding x to S: d'(y) = 3; d'(z) = 4]
IRIS H.-R. JIANG
Correctness
¨ Loop invariant: Consider the set S at any point in the
algorithm’s execution. For each node u ∈ S, d(u) is the length
of the shortest s-u path Pu.
¨ Pf: Proof by induction on |S|
¤ Basis step: trivial for |S| = 1.
¤ Induction step: hypothesis: the invariant holds when |S| = k for some k ≥ 1.
n Grow S by adding v;
let (u, v) be the final edge on our s-v path Pv.
n By the induction hypothesis, Pu is the shortest s-u path.
n Consider any other s-v path P; P must leave S somewhere; let
y be the first node on P that is not in S, and x ∈ S be the node
just before y.
n P cannot be shorter than Pv because it is already at least as
long as Pv by the time it has left the set S.
n At iteration k+1, d(v) = d'(v) = d(u) + l(u, v) ≤ d(x) + l(x, y) ≤ l(P)
Greedy algorithms
35
[Figure: set S containing s, x, u; tree path Pv reaching v via u; another
path P leaving S along edge (x, y)]
The algorithm stays ahead!
Q: Why le ≥ 0?
IRIS H.-R. JIANG
Implementation
¨ Q: How to do line 4 efficiently? d'(v) = min e = (u, v): u∈S d(u) + le
¨ A: Explicitly maintain d'(v) in the view of each unexplored node
v instead of S
¤ Next node to explore = node with minimum d'(v).
¤ When exploring v, update d'(w) for each outgoing (v, w), w ∉ S.
¨ Q: How?
¨ A: Implement a min priority queue nicely
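The pseudocode and the priority-queue idea above can be sketched in Python. This is a minimal sketch: Python's heapq has no ChangeKey, so stale queue entries are skipped on pop ("lazy deletion"), which still gives the O(m lg n) binary-heap bound. The example graph is assumed, not the one in the figure.

```python
import heapq

def dijkstra(graph, s):
    """Dijkstra's algorithm with a binary heap; graph[u] is a list of
    (v, length) pairs with nonnegative lengths. Returns d(u) for all
    nodes reachable from s."""
    d = {s: 0}
    pq = [(0, s)]                      # entries (tentative d'(v), v)
    explored = set()                   # the set S of the pseudocode
    while pq:
        dv, v = heapq.heappop(pq)      # v outside S with minimum d'(v)
        if v in explored:
            continue                   # stale entry: v already explored
        explored.add(v)
        for w, length in graph.get(v, []):
            if w not in explored and dv + length < d.get(w, float('inf')):
                d[w] = dv + length     # improved d'(w) via edge (v, w)
                heapq.heappush(pq, (d[w], w))
    return d

# Small assumed example (the slide's 6-node figure is not fully recoverable):
g = {'s': [('u', 1), ('x', 2)],
     'u': [('v', 3)],
     'x': [('y', 1), ('u', 1)],
     'y': [('v', 1)],
     'v': []}
print(dijkstra(g, 's'))   # {'s': 0, 'u': 1, 'x': 2, 'v': 4, 'y': 3}
```

Lazy deletion trades a slightly larger queue for not needing ChangeKey; with a Fibonacci heap and true DecreaseKey the bound improves to O(m + n lg n), matching the table below.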
Greedy algorithms
36
Operation    Dijkstra   Array   Binary heap   Fibonacci heap
Insert       O(n)       O(n)    Θ(lg n)       Θ(1)
ExtractMin   O(n)       O(n)    Θ(lg n)       O(lg n)
ChangeKey    O(m)       O(1)    Θ(lg n)       Θ(1)
IsEmpty      O(n)       O(1)    Θ(1)          Θ(1)
Total                   O(n²)   O(m lg n)     O(m + n lg n)
IRIS H.-R. JIANG
Example
Greedy algorithms
37
[Figure: Dijkstra example, Cormen, 3rd ed.]
Binary Tree Application
Recap Heaps: Priority Queues
38
Greedy algorithms
IRIS H.-R. JIANG
Priority Queue
¨ In a priority queue (PQ)
¤ Each element has a priority (key)
¤ Only the element with highest (or lowest) priority can be deleted
n Max priority queue, or min priority queue
¤ An element with arbitrary priority can be inserted into the queue
at any time
Greedy algorithms
39
Operation    Binary heap   Fibonacci heap
FindMin      Θ(1)          Θ(1)
ExtractMin   Θ(lg n)       O(lg n)
Insert       Θ(lg n)       Θ(1)
ChangeKey    Θ(lg n)       Θ(1)
The time complexities are worst-case for the binary heap and amortized
for the Fibonacci heap.
Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein
Introduction to Algorithms, 2nd Edition. MIT Press and McGraw-Hill, 2001.
Fredman M. L. & Tarjan R. E. (1987). Fibonacci heaps and their uses in improved
network optimization algorithms. Journal of the ACM 34(3), pp. 596-615.
IRIS H.-R. JIANG
Heap
¨ Definition: A max (min) heap is
¤ A max (min) tree: key[parent] >= (<=) key[children]
¤ A complete binary tree
¨ Corollary: Who has the largest (smallest) key in a max (min) heap?
¤ Root!
¨ Example
[Figure: an example max heap with root 14 (keys 14, 12, 7, 10, 8, 6) and an
example min heap with root 2]
Greedy algorithms
40
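Once the complete binary tree is stored in an array level by level (as on the next slides), the parent/child key condition above can be checked directly. A minimal sketch; the example keys are taken from the slide figures:

```python
def is_max_heap(a):
    """Check the max-heap property on an array-stored complete binary
    tree: every key is <= its parent's key (the parent of a[i] is
    a[(i - 1) // 2] in 0-indexed storage)."""
    return all(a[(i - 1) // 2] >= a[i] for i in range(1, len(a)))

print(is_max_heap([14, 12, 7, 10, 8, 6]))   # True: the max-heap example
print(is_max_heap([2, 7, 4, 10, 8, 6]))     # False: a min heap, not a max heap
```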
IRIS H.-R. JIANG
Class MaxHeap
¨ Implementation?
¤ Complete binary tree ⇒ array representation
Greedy algorithms
41
[Figure: heap = [14, 12, 7, 10, 8, 6] stored in array slots [0]–[5] of a
capacity-11 array; the tree (root 14; children 12, 7; then 10, 8, 6) maps
to the array level by level]
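In the array representation above, parent and child positions are pure index arithmetic. A small sketch (0-indexed, matching the [0]–[10] slots in the figure):

```python
# Index arithmetic for a heap stored in a 0-indexed array:
def parent(i): return (i - 1) // 2
def left(i):   return 2 * i + 1
def right(i):  return 2 * i + 2

heap = [14, 12, 7, 10, 8, 6]   # the max heap from the slide
# Root 14 has children heap[1] = 12 and heap[2] = 7; 12's parent is heap[0].
print(heap[left(0)], heap[right(0)], heap[parent(1)])   # 12 7 14
```

No pointers are needed: completeness guarantees the array has no holes, which is why a complete binary tree is the right shape for a heap.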
IRIS H.-R. JIANG
Insertion into a Max Heap (1/3)
¨ Maintain the heap property at all times
¨ Push(1)
Greedy algorithms
42
[Figure: Push(1) on heap = [20, 15, 2, 14, 10], heapSize 5 → 6 (capacity 10);
the new node starts at slot [5]; since 1 <= its parent 2, no bubbling is
needed: heap = [20, 15, 2, 14, 10, 1]]
IRIS H.-R. JIANG
Insertion into a Max Heap (2/3)
¨ Maintain heap ⇒ bubble up if needed!
¨ Push(21)
Greedy algorithms
43
[Figure: Push(21) on heap = [20, 15, 2, 14, 10, 1]; 21 starts at slot [6];
21 > its parent 2, swap: [20, 15, 21, 14, 10, 1, 2]; 21 > its new parent 20,
swap: [21, 15, 20, 14, 10, 1, 2]]
IRIS H.-R. JIANG
Insertion into a Max Heap (3/3)
¨ Time complexity?
¤ How many times to bubble up in the worst case?
¤ Tree height: Θ(lg n)
Greedy algorithms
44
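The insertion slides above can be sketched as code: append at the first free slot, then bubble up along the parent chain. The keys reproduce the slide's Push(1) and Push(21) examples:

```python
def push(heap, key):
    """Insert key into a max heap stored in a 0-indexed array:
    append at the end, then bubble up while it beats its parent.
    Worst case: tree height, Θ(lg n) swaps."""
    heap.append(key)
    i = len(heap) - 1
    while i > 0 and heap[(i - 1) // 2] < heap[i]:
        heap[i], heap[(i - 1) // 2] = heap[(i - 1) // 2], heap[i]
        i = (i - 1) // 2

h = [20, 15, 2, 14, 10]
push(h, 1)    # 1 <= its parent 2: no bubbling
push(h, 21)   # 21 bubbles past 2 and then 20, up to the root
print(h)      # [21, 15, 20, 14, 10, 1, 2]
```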
IRIS H.-R. JIANG
Deletion from a Max Heap (1/3)
¨ Maintain heap ⇒ trickle down if needed!
¨ Pop()
45
Greedy algorithms
[Figure: Pop() on heap = [21, 15, 20, 14, 10, 1, 2], heapSize 7 → 6; remove
root 21, move the last key 2 to the root: [2, 15, 20, 14, 10, 1]; 2 < its
larger child 20, swap: [20, 15, 2, 14, 10, 1]]
IRIS H.-R. JIANG
Deletion from a Max Heap (2/3)
¨ Maintain heap ⇒ trickle down if needed!
¨ Pop()
Greedy algorithms
46
[Figure: Pop() on heap = [20, 15, 2, 14, 10, 1], heapSize 6 → 5; remove
root 20, move the last key 1 to the root: [1, 15, 2, 14, 10]; swap 1 with
its larger child 15: [15, 1, 2, 14, 10]; swap 1 with 14: [15, 14, 2, 1, 10]]
IRIS H.-R. JIANG
Deletion from a Max Heap (3/3)
¨ Time complexity?
¤ How many times to trickle down in the worst case? Θ(lg n)
Greedy algorithms
47
IRIS H.-R. JIANG
Max Heapify
¨ Max (min) heapify = maintain the max (min) heap property
¤ What we do to trickle down the root in deletion
¤ Assume i’s left & right subtrees are heaps
n But key[i] may be < (>) key[children]
¤ Heapify i = trickle down key[i]
⇒ the tree rooted at i is a heap
Greedy algorithms
48
[Figure: heapifying the root of [1, 15, 2, 14, 10]: 1 trickles down, giving
[15, 1, 2, 14, 10] and then [15, 14, 2, 1, 10]]
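The deletion and max-heapify slides above can be sketched together: Pop() moves the last key to the root and then max-heapifies (trickles down). The keys reproduce the slide's two Pop() examples:

```python
def max_heapify(heap, i):
    """Trickle the key at index i down until the subtree rooted at i
    satisfies the max-heap property (i's subtrees are assumed heaps)."""
    n = len(heap)
    while True:
        largest, l, r = i, 2 * i + 1, 2 * i + 2
        if l < n and heap[l] > heap[largest]:
            largest = l
        if r < n and heap[r] > heap[largest]:
            largest = r
        if largest == i:
            return                     # heap property restored
        heap[i], heap[largest] = heap[largest], heap[i]
        i = largest

def pop(heap):
    """Remove and return the max: move the last key to the root,
    then trickle it down. Worst case Θ(lg n)."""
    top = heap[0]
    last = heap.pop()
    if heap:
        heap[0] = last
        max_heapify(heap, 0)
    return top

h = [21, 15, 20, 14, 10, 1, 2]
print(pop(h), h)   # 21 [20, 15, 2, 14, 10, 1]
print(pop(h), h)   # 20 [15, 14, 2, 1, 10]
```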
IRIS H.-R. JIANG
How to Build a Max Heap?
¨ How to convert any complete binary tree to a max heap?
¨ Intuition: Max heapify in a bottom-up manner
¤ Induction basis: Leaves are already heaps
¤ Induction steps: Start at parents of leaves, work upward till root
¤ Time complexity: O(n lg n); a more careful analysis gives O(n)
Greedy algorithms
49
[Example (Cormen): build a max heap from a = [4, 1, 3, 2, 16, 9, 10, 14, 8, 7]
by heapifying each internal node, from the last parent up to the root,
yielding [16, 14, 10, 8, 7, 9, 3, 2, 4, 1]]
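The bottom-up construction above can be sketched directly (0-indexed; leaves at indices n//2 .. n-1 are already one-node heaps, so heapifying starts at the last parent). The input is Cormen's example:

```python
def build_max_heap(a):
    """Convert an array (viewed as a complete binary tree) into a max
    heap by trickling down every internal node, from the last parent
    up to the root."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):   # leaves are already heaps
        j = i
        while True:                        # max-heapify a[j] in place
            largest, l, r = j, 2 * j + 1, 2 * j + 2
            if l < n and a[l] > a[largest]:
                largest = l
            if r < n and a[r] > a[largest]:
                largest = r
            if largest == j:
                break
            a[j], a[largest] = a[largest], a[j]
            j = largest
    return a

print(build_max_heap([4, 1, 3, 2, 16, 9, 10, 14, 8, 7]))
# [16, 14, 10, 8, 7, 9, 3, 2, 4, 1]
```

Each heapify costs O(height of its subtree); summing heights over all nodes gives the tighter O(n) bound, while O(n lg n) is the loose per-node estimate.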
Robert C. Prim 1957 (Dijkstra 1959)
Joseph B. Kruskal 1956
Reverse-delete
Minimum Spanning Trees
50
Greedy algorithms
[Figure: running example, a weighted graph on nodes A–G with edge costs
5, 5, 6, 7, 7, 8, 8, 9, 9, 11, 15]
IRIS H.-R. JIANG
Minimum Spanning Graphs
¨ Q: How can a cable TV company lay cable to a new
neighborhood, of course, as cheaply as possible?
¨ A: Curiously and fortunately, this is a problem that many
greedy algorithms solve optimally.
¨ Given
¤ Undirected graph G = (V, E)
n Nonnegative cost ce for each edge e = (u, v) ∈ E
n ce ≥ 0
¨ Goal
¤ Find a subset of edges T ⊆ E so that
n The subgraph (V, T) is connected
n Total cost Σe∈T ce is minimized
Greedy algorithms
51
[Figure: the example graph on A–G and a minimum-cost connected subgraph]
IRIS H.-R. JIANG
Minimum Spanning Trees
¨ Q: Let T be a minimum-cost solution. What should (V, T) be?
¨ A:
¤ By definition, (V, T) must be connected.
¤ We show that it also contains no cycles.
¤ Suppose it contained a cycle C, and let e be any edge on C.
¤ We claim that (V, T – {e}) is still connected:
¤ any path that previously used e can be rerouted along C – {e} instead.
¤ It follows that (V, T – {e}) is also a valid solution, but cheaper.
¤ Hence, (V, T) is a tree.
¨ Goal
¤ Find a subset of edges T ⊆ E so that
n (V, T) is a tree,
n Total cost Σe∈T ce is minimized.
Greedy algorithms
52
Minimum Spanning ????
[Figure: the example graph on A–G and its minimum spanning tree]
IRIS H.-R. JIANG
Greedy Algorithms (1/3)
¨ Q: What will you do?
¨ All three greedy algorithms produce an MST.
¨ Kruskal's algorithm:
¤ Start with T = {}.
¤ Consider edges in ascending order of cost.
¤ Insert edge e in T as long as it does not create a cycle;
otherwise, discard e and continue.
Greedy algorithms
53
[Figure: Kruskal's algorithm on the example graph, inserting edges in
ascending order of cost]
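Kruskal's algorithm as described above can be sketched with a union-find structure (previewing the "Implementing Kruskal's algorithm: Union-find" topic from the outline): the cycle test "does e connect two different trees?" becomes two find operations. The example graph here is assumed, since the slide's A–G figure is not fully recoverable:

```python
def kruskal(n, edges):
    """Kruskal's MST algorithm; edges are (cost, u, v) tuples on nodes
    0..n-1 (distinct costs, connected graph assumed). Returns the MST
    edge list."""
    parent = list(range(n))            # union-find forest

    def find(x):                       # find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    T = []
    for cost, u, v in sorted(edges):   # ascending order of cost
        ru, rv = find(u), find(v)
        if ru != rv:                   # e joins two components: no cycle
            parent[ru] = rv            # union the two components
            T.append((cost, u, v))
        # else: e would create a cycle, discard it
    return T

# Small assumed example:
edges = [(1, 0, 1), (4, 0, 2), (2, 1, 2), (5, 1, 3), (3, 2, 3)]
mst = kruskal(4, edges)
print(mst, sum(c for c, _, _ in mst))   # MST cost 1 + 2 + 3 = 6
```

Sorting dominates the running time: O(m lg m) = O(m lg n), since each union-find operation is nearly constant amortized.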
IRIS H.-R. JIANG
Greedy Algorithms (2/3)
¨ Q: What will you do?
¨ All three greedy algorithms produce an MST.
¨ Prim's algorithm: (c.f. Dijkstra’s algorithm)
¤ Start with a root node s.
¤ Greedily grow a tree T from s outward.
¤ At each step, add the cheapest edge e to the partial tree T that
has exactly one endpoint in T.
Greedy algorithms
54
[Figure: Prim's algorithm on the example graph; nodes are added to the
tree in the order D, A, F, B, E, C, G]
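Prim's algorithm above can be sketched with the same heap machinery as Dijkstra, which is exactly the c.f. the slide draws: pop the cheapest edge with one endpoint in the tree, skip it if both endpoints are already inside. The example graph is assumed (the same 4-node graph as in the Kruskal sketch):

```python
import heapq

def prim(graph, s):
    """Prim's MST algorithm with a binary heap, grown from root s.
    graph[u] is a list of (v, cost) pairs (undirected: both directions
    listed); returns the MST edge list, assuming the graph is connected."""
    in_tree = {s}
    pq = [(c, s, v) for v, c in graph[s]]   # edges leaving the tree
    heapq.heapify(pq)
    T = []
    while pq:
        c, u, v = heapq.heappop(pq)    # cheapest edge with u in the tree
        if v in in_tree:
            continue                   # both endpoints already in the tree
        in_tree.add(v)
        T.append((u, v, c))
        for w, cw in graph[v]:
            if w not in in_tree:
                heapq.heappush(pq, (cw, v, w))
    return T

# Same assumed example as for Kruskal, in adjacency-list form:
g = {0: [(1, 1), (2, 4)], 1: [(0, 1), (2, 2), (3, 5)],
     2: [(0, 4), (1, 2), (3, 3)], 3: [(1, 5), (2, 3)]}
mst = prim(g, 0)
print(sorted(mst), sum(c for _, _, c in mst))   # total cost 6
```

As with Dijkstra, a binary heap gives O(m lg n); both algorithms produce the same tree here when edge costs are distinct.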
IRIS H.-R. JIANG
Greedy Algorithms (3/3)
¨ Q: What will you do?
¨ All three greedy algorithms produce an MST.
¨ Reverse-delete algorithm: (reverse of Kruskal’s algorithm)
¤ Start with T = E.
¤ Consider edges in descending order of cost.
¤ Delete edge e from T unless doing so would disconnect T.
Greedy algorithms
55
[Figure: reverse-delete on the example graph, removing edges in descending
order of cost unless removal would disconnect the graph]
IRIS H.-R. JIANG
Cut Property (1/2)
¨ Simplifying assumption: All edge costs ce are distinct.
¨ Q: When is it safe to include an edge in the MST?
¨ Cut Property: Let S be any subset of nodes, and let e = (v, w) be
the minimum cost edge with one end in S and the other in V–S.
Then every MST contains e.
¨ Pf: Exchange argument!
¤ Let T be a spanning tree that does not contain e. We need to
show that T does not have the minimum possible cost.
¤ Since T is a spanning tree, it must contain an edge f with one end
in S and the other in V–S.
¤ Since e is the cheapest edge with this property, we have ce < cf.
¤ Hence, T – {f} + {e} is a spanning tree that is cheaper than T.
¨ Q: What’s wrong with this proof?
¨ A: Take care about the definition!
¤ Spanning tree: connected & acyclic
[Figure: a cut S with crossing edges e = (v, w), e' = (v', w'), and f]
Greedy algorithms
56
IRIS H.-R. JIANG
Greedy Algorithms (1/3)
¨ Q: What will you do?
¨ All three greedy algorithms produce an MST.
¨ Kruskal's algorithm:
¤ Start with T = {}.
¤ Consider edges in ascending order of cost.
¤ Insert edge e in T as long as it does not create a cycle;
otherwise, discard e and continue.
Greedy algorithms
53
A
B
E
D
C
F
7
5
7
9
15
8
8
5
6
G
9
11
A
B
E
D
C
F
7
5
7
9
15
8
8
5
6
G
9
11
IRIS H.-R. JIANG
Greedy Algorithms (2/3)
¨ Q: What will you do?
¨ All three greedy algorithms produce an MST.
¨ Prim's algorithm: (c.f. Dijkstra’s algorithm)
¤ Start with a root node s.
¤ Greedily grow a tree T from s outward.
¤ At each step, add the cheapest edge e to the partial tree T that
has exactly one endpoint in T.
Greedy algorithms
54
A
B
E
D
C
F
7
5
7
9
15
8
8
5
6
G
9
11
D
A
F
B
E
C
G
A
B
E
D
C
F
7
5
7
9
15
8
8
5
6
G
9
11
IRIS H.-R. JIANG
Greedy Algorithms (3/3)
¨ Q: What will you do?
¨ All three greedy algorithms produce an MST.
¨ Reverse-delete algorithm: (reverse of Kruskal’s algorithm)
¤ Start with T = E.
¤ Consider edges in descending order of cost.
¤ Delete edge e from T unless doing so would disconnect T.
Greedy algorithms
55
A
B
E
D
C
F
7
5
7
9
15
8
8
5
6
G
9
11
A
B
E
D
C
F
7
5
7
9
15
8
8
5
6
G
9
11
IRIS H.-R. JIANG
Cut Property (1/2)
¨ Simplifying assumption: All edge costs ce are distinct.
¨ Q: When is it safe to include an edge in the MST?
¨ Cut Property: Let S be any subset of nodes, and let e = (v, w) be
the minimum cost edge with one end in S and the other in V–S.
Then every MST contains e.
¨ Pf: Exchange argument!
¤ Let T be a spanning tree that does not contain e. We need to
show that T does not have the minimum possible cost.
¤ Since T is a spanning tree, it must contain an edge f with one end
in S and the other in V–S.
¤ Since e is the cheapest edge with this property, we have ce < cf.
¤ Hence, T – {f} + {e} is a spanning tree that is cheaper than T.
¨ Q: What’s wrong with this proof?
¨ A: Be careful with the definition!
¤ Spanning tree: connected & acyclic
[Figure: a cut S and V–S with crossing edges e = (v, w), e' = (v', w'), f, and h]
Greedy algorithms
56
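On a small graph the Cut Property can be checked by brute force: enumerate all spanning trees, take the cheapest, and confirm that the minimum-cost edge crossing a chosen cut belongs to it. A sketch with a hypothetical 4-node graph whose edge costs are distinct, as the slide assumes:

```python
from itertools import combinations

def is_spanning_tree(n, subset):
    """n-1 edges with no cycle over nodes 0..n-1 => spanning tree."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            x = parent[x]
        return x
    for _, u, v in subset:
        ru, rv = find(u), find(v)
        if ru == rv:
            return False                    # subset contains a cycle
        parent[ru] = rv
    return True

# Hypothetical example graph with distinct edge costs: (cost, u, v)
edges = [(5, 0, 1), (7, 0, 2), (9, 1, 2), (8, 1, 3), (6, 2, 3)]
n = 4
trees = [t for t in combinations(edges, n - 1) if is_spanning_tree(n, t)]
mst = min(trees, key=lambda t: sum(c for c, _, _ in t))

S = {0, 1}                                  # one choice of cut
crossing = [e for e in edges if (e[1] in S) != (e[2] in S)]
assert min(crossing) in mst                 # cut property holds here
```

This only verifies the property on one instance, of course; the proof itself is the exchange argument above, once the exchanged edge is chosen correctly.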
  • 1. Iris Hui-Ru Jiang Fall 2014 CHAPTER 4 GREEDY ALGORITHMS IRIS H.-R. JIANG Outline ¨ Content: ¤ Interval scheduling: The greedy algorithm stays ahead ¤ Scheduling to minimize lateness: An exchange argument ¤ Shortest paths ¤ The minimum spanning tree problem ¤ Implementing Kruskal’s algorithm: Union-find ¨ Reading: ¤ Chapter 4 Greedy algorithms 2 IRIS H.-R. JIANG Greedy Algorithms ¨ An algorithm is greedy if it builds up a solution in small steps, choosing a decision at each step myopically to optimize some underlying criterion. ¨ It’s easy to invent greedy algorithms for almost any problem. ¤ Intuitive and fast ¤ Usually not optimal ¨ It’s challenging to prove greedy algorithms succeed in solving a nontrivial problem optimally. 1. The greedy algorithm stays ahead. 2. An exchange argument. Greedy algorithms 3 The greedy algorithm stays ahead Interval Scheduling 4 Greedy algorithms
  • 2. Iris Hui-Ru Jiang Fall 2014 CHAPTER 4 GREEDY ALGORITHMS IRIS H.-R. JIANG Outline ¨ Content: ¤ Interval scheduling: The greedy algorithm stays ahead ¤ Scheduling to minimize lateness: An exchange argument ¤ Shortest paths ¤ The minimum spanning tree problem ¤ Implementing Kruskal’s algorithm: Union-find ¨ Reading: ¤ Chapter 4 Greedy algorithms 2 IRIS H.-R. JIANG Greedy Algorithms ¨ An algorithm is greedy if it builds up a solution in small steps, choosing a decision at each step myopically to optimize some underlying criterion. ¨ It’s easy to invent greedy algorithms for almost any problem. ¤ Intuitive and fast ¤ Usually not optimal ¨ It’s challenging to prove greedy algorithms succeed in solving a nontrivial problem optimally. 1. The greedy algorithm stays ahead. 2. An exchange argument. Greedy algorithms 3 The greedy algorithm stays ahead Interval Scheduling 4 Greedy algorithms
  • 3. Iris Hui-Ru Jiang Fall 2014 CHAPTER 4 GREEDY ALGORITHMS IRIS H.-R. JIANG Outline ¨ Content: ¤ Interval scheduling: The greedy algorithm stays ahead ¤ Scheduling to minimize lateness: An exchange argument ¤ Shortest paths ¤ The minimum spanning tree problem ¤ Implementing Kruskal’s algorithm: Union-find ¨ Reading: ¤ Chapter 4 Greedy algorithms 2 IRIS H.-R. JIANG Greedy Algorithms ¨ An algorithm is greedy if it builds up a solution in small steps, choosing a decision at each step myopically to optimize some underlying criterion. ¨ It’s easy to invent greedy algorithms for almost any problem. ¤ Intuitive and fast ¤ Usually not optimal ¨ It’s challenging to prove greedy algorithms succeed in solving a nontrivial problem optimally. 1. The greedy algorithm stays ahead. 2. An exchange argument. Greedy algorithms 3 The greedy algorithm stays ahead Interval Scheduling 4 Greedy algorithms
  • 4. Iris Hui-Ru Jiang Fall 2014 CHAPTER 4 GREEDY ALGORITHMS IRIS H.-R. JIANG Outline ¨ Content: ¤ Interval scheduling: The greedy algorithm stays ahead ¤ Scheduling to minimize lateness: An exchange argument ¤ Shortest paths ¤ The minimum spanning tree problem ¤ Implementing Kruskal’s algorithm: Union-find ¨ Reading: ¤ Chapter 4 Greedy algorithms 2 IRIS H.-R. JIANG Greedy Algorithms ¨ An algorithm is greedy if it builds up a solution in small steps, choosing a decision at each step myopically to optimize some underlying criterion. ¨ It’s easy to invent greedy algorithms for almost any problem. ¤ Intuitive and fast ¤ Usually not optimal ¨ It’s challenging to prove greedy algorithms succeed in solving a nontrivial problem optimally. 1. The greedy algorithm stays ahead. 2. An exchange argument. Greedy algorithms 3 The greedy algorithm stays ahead Interval Scheduling 4 Greedy algorithms
  • 5. IRIS H.-R. JIANG The Interval Scheduling Problem ¨ Given: Set of requests {1, 2, …, n}, ith request corresponds an interval with start time s(i) and finish time f(i) ¤ interval i: [s(i), f(i)) ¨ Goal: Find a compatible subset of the requests with maximum size Greedy algorithms 5 requests don’t overlap optimal Time 0 1 2 3 4 5 6 7 8 9 10 11 6 7 8 5 1 2 3 4 Maximum compatible subset {2, 5, 8} 8 5 2 IRIS H.-R. JIANG Greedy Rule ¨ Repeat ¤ Use a simple rule to select a first request i1. ¤ Once i1 is selected, reject all requests not compatible with i1. ¨ Until run out of requests. ¨ Q: How to decide a greedy rule for a good algorithm? ¨ A: 1. Earliest start time: min s(i) 2. Shortest interval: min {f(i) - s(i)} 3. Fewest conflicts: mini=1..n |{j: j is not compatible with i}| 4. Earliest finish time: min f(i) Greedy algorithms 6 IRIS H.-R. JIANG The Greedy Algorithm ¨ The 4th greedy rule leads to the optimal solution. ¤ We first accept the request that finish first ¤ Natural idea: Our resource becomes free ASAP ¨ The greedy algorithm: ¤ Q: Feasible? ¤ A: Yes! Line 5. n A is a compatible set of requests. ¤ Q: Optimal? Efficient? Greedy algorithms 7 Interval-Scheduling(R) // R: undetermined requests; A: accepted requests 1. A = Æ; 2. while (R is not empty) do 3. choose a request iÎR with minimum f(i) // greedy rule 4. A = A + {i} 5. R = R – {i} – X, where X = {j: jÎR and j is not compatible with i) 6. return A Æ: empty set = {} IRIS H.-R. JIANG The Greedy Algorithm Stays Ahead ¨ Q: How to prove optimality? ¤ Let O be an optimal solution. Prove A = O? Prove |A| = |O|? ¨ The greedy algorithm stays ahead. ¤ We will compare the partial solutions of A and O, and show that the greedy algorithm is doing better in a step-by-step fashion. ¨ Let A be the output of the greedy algorithm, A = {i1, …, ik}, in the order they were added. Let O be the optimal solution, O = {j1, …, jm} in the ascending order of start (finish) times. For all indices r ! k, we have f(ir) ! f(jr). 
¨ Pf: Proof by induction! ¤ Basis step: true for r = 1, f(i1) ! f(j1). ¤ Induction step: hypothesis: true for r-1. n f(ir-1) ! f(jr-1) n O is compatible, f(jr-1) ! s(jr) n Hence, f(ir-1) ! s(jr); jrÎR after ir-1 is selected in line 5. n According to line 3, f(ir) ! f(jr). Greedy algorithms 8 jr jr-1 ir ir-1 Q: ir finishes later?
  • 6. IRIS H.-R. JIANG The Interval Scheduling Problem ¨ Given: Set of requests {1, 2, …, n}, ith request corresponds an interval with start time s(i) and finish time f(i) ¤ interval i: [s(i), f(i)) ¨ Goal: Find a compatible subset of the requests with maximum size Greedy algorithms 5 requests don’t overlap optimal Time 0 1 2 3 4 5 6 7 8 9 10 11 6 7 8 5 1 2 3 4 Maximum compatible subset {2, 5, 8} 8 5 2 IRIS H.-R. JIANG Greedy Rule ¨ Repeat ¤ Use a simple rule to select a first request i1. ¤ Once i1 is selected, reject all requests not compatible with i1. ¨ Until run out of requests. ¨ Q: How to decide a greedy rule for a good algorithm? ¨ A: 1. Earliest start time: min s(i) 2. Shortest interval: min {f(i) - s(i)} 3. Fewest conflicts: mini=1..n |{j: j is not compatible with i}| 4. Earliest finish time: min f(i) Greedy algorithms 6 IRIS H.-R. JIANG The Greedy Algorithm ¨ The 4th greedy rule leads to the optimal solution. ¤ We first accept the request that finish first ¤ Natural idea: Our resource becomes free ASAP ¨ The greedy algorithm: ¤ Q: Feasible? ¤ A: Yes! Line 5. n A is a compatible set of requests. ¤ Q: Optimal? Efficient? Greedy algorithms 7 Interval-Scheduling(R) // R: undetermined requests; A: accepted requests 1. A = Æ; 2. while (R is not empty) do 3. choose a request iÎR with minimum f(i) // greedy rule 4. A = A + {i} 5. R = R – {i} – X, where X = {j: jÎR and j is not compatible with i) 6. return A Æ: empty set = {} IRIS H.-R. JIANG The Greedy Algorithm Stays Ahead ¨ Q: How to prove optimality? ¤ Let O be an optimal solution. Prove A = O? Prove |A| = |O|? ¨ The greedy algorithm stays ahead. ¤ We will compare the partial solutions of A and O, and show that the greedy algorithm is doing better in a step-by-step fashion. ¨ Let A be the output of the greedy algorithm, A = {i1, …, ik}, in the order they were added. Let O be the optimal solution, O = {j1, …, jm} in the ascending order of start (finish) times. For all indices r ! k, we have f(ir) ! f(jr). 
¨ Pf: Proof by induction! ¤ Basis step: true for r = 1, f(i1) ! f(j1). ¤ Induction step: hypothesis: true for r-1. n f(ir-1) ! f(jr-1) n O is compatible, f(jr-1) ! s(jr) n Hence, f(ir-1) ! s(jr); jrÎR after ir-1 is selected in line 5. n According to line 3, f(ir) ! f(jr). Greedy algorithms 8 jr jr-1 ir ir-1 Q: ir finishes later?
  • 7. IRIS H.-R. JIANG The Interval Scheduling Problem ¨ Given: Set of requests {1, 2, …, n}, ith request corresponds an interval with start time s(i) and finish time f(i) ¤ interval i: [s(i), f(i)) ¨ Goal: Find a compatible subset of the requests with maximum size Greedy algorithms 5 requests don’t overlap optimal Time 0 1 2 3 4 5 6 7 8 9 10 11 6 7 8 5 1 2 3 4 Maximum compatible subset {2, 5, 8} 8 5 2 IRIS H.-R. JIANG Greedy Rule ¨ Repeat ¤ Use a simple rule to select a first request i1. ¤ Once i1 is selected, reject all requests not compatible with i1. ¨ Until run out of requests. ¨ Q: How to decide a greedy rule for a good algorithm? ¨ A: 1. Earliest start time: min s(i) 2. Shortest interval: min {f(i) - s(i)} 3. Fewest conflicts: mini=1..n |{j: j is not compatible with i}| 4. Earliest finish time: min f(i) Greedy algorithms 6 IRIS H.-R. JIANG The Greedy Algorithm ¨ The 4th greedy rule leads to the optimal solution. ¤ We first accept the request that finish first ¤ Natural idea: Our resource becomes free ASAP ¨ The greedy algorithm: ¤ Q: Feasible? ¤ A: Yes! Line 5. n A is a compatible set of requests. ¤ Q: Optimal? Efficient? Greedy algorithms 7 Interval-Scheduling(R) // R: undetermined requests; A: accepted requests 1. A = Æ; 2. while (R is not empty) do 3. choose a request iÎR with minimum f(i) // greedy rule 4. A = A + {i} 5. R = R – {i} – X, where X = {j: jÎR and j is not compatible with i) 6. return A Æ: empty set = {} IRIS H.-R. JIANG The Greedy Algorithm Stays Ahead ¨ Q: How to prove optimality? ¤ Let O be an optimal solution. Prove A = O? Prove |A| = |O|? ¨ The greedy algorithm stays ahead. ¤ We will compare the partial solutions of A and O, and show that the greedy algorithm is doing better in a step-by-step fashion. ¨ Let A be the output of the greedy algorithm, A = {i1, …, ik}, in the order they were added. Let O be the optimal solution, O = {j1, …, jm} in the ascending order of start (finish) times. For all indices r ! k, we have f(ir) ! f(jr). 
¨ Pf: Proof by induction! ¤ Basis step: true for r = 1, f(i1) ! f(j1). ¤ Induction step: hypothesis: true for r-1. n f(ir-1) ! f(jr-1) n O is compatible, f(jr-1) ! s(jr) n Hence, f(ir-1) ! s(jr); jrÎR after ir-1 is selected in line 5. n According to line 3, f(ir) ! f(jr). Greedy algorithms 8 jr jr-1 ir ir-1 Q: ir finishes later?
  • 8. IRIS H.-R. JIANG The Interval Scheduling Problem ¨ Given: Set of requests {1, 2, …, n}, ith request corresponds an interval with start time s(i) and finish time f(i) ¤ interval i: [s(i), f(i)) ¨ Goal: Find a compatible subset of the requests with maximum size Greedy algorithms 5 requests don’t overlap optimal Time 0 1 2 3 4 5 6 7 8 9 10 11 6 7 8 5 1 2 3 4 Maximum compatible subset {2, 5, 8} 8 5 2 IRIS H.-R. JIANG Greedy Rule ¨ Repeat ¤ Use a simple rule to select a first request i1. ¤ Once i1 is selected, reject all requests not compatible with i1. ¨ Until run out of requests. ¨ Q: How to decide a greedy rule for a good algorithm? ¨ A: 1. Earliest start time: min s(i) 2. Shortest interval: min {f(i) - s(i)} 3. Fewest conflicts: mini=1..n |{j: j is not compatible with i}| 4. Earliest finish time: min f(i) Greedy algorithms 6 IRIS H.-R. JIANG The Greedy Algorithm ¨ The 4th greedy rule leads to the optimal solution. ¤ We first accept the request that finish first ¤ Natural idea: Our resource becomes free ASAP ¨ The greedy algorithm: ¤ Q: Feasible? ¤ A: Yes! Line 5. n A is a compatible set of requests. ¤ Q: Optimal? Efficient? Greedy algorithms 7 Interval-Scheduling(R) // R: undetermined requests; A: accepted requests 1. A = Æ; 2. while (R is not empty) do 3. choose a request iÎR with minimum f(i) // greedy rule 4. A = A + {i} 5. R = R – {i} – X, where X = {j: jÎR and j is not compatible with i) 6. return A Æ: empty set = {} IRIS H.-R. JIANG The Greedy Algorithm Stays Ahead ¨ Q: How to prove optimality? ¤ Let O be an optimal solution. Prove A = O? Prove |A| = |O|? ¨ The greedy algorithm stays ahead. ¤ We will compare the partial solutions of A and O, and show that the greedy algorithm is doing better in a step-by-step fashion. ¨ Let A be the output of the greedy algorithm, A = {i1, …, ik}, in the order they were added. Let O be the optimal solution, O = {j1, …, jm} in the ascending order of start (finish) times. For all indices r ! k, we have f(ir) ! f(jr). 
¨ Pf: Proof by induction! ¤ Basis step: true for r = 1, f(i1) ! f(j1). ¤ Induction step: hypothesis: true for r-1. n f(ir-1) ! f(jr-1) n O is compatible, f(jr-1) ! s(jr) n Hence, f(ir-1) ! s(jr); jrÎR after ir-1 is selected in line 5. n According to line 3, f(ir) ! f(jr). Greedy algorithms 8 jr jr-1 ir ir-1 Q: ir finishes later?
The Greedy Algorithm Is Optimal
¨ The greedy algorithm returns an optimal set A.
¨ Pf: Proof by contradiction.
¤ If A is not optimal, an optimal set O must contain more requests, i.e., |O| = m > k = |A|.
¤ Since f(ik) ≤ f(jk) and m > k, there is a request jk+1 in O.
¤ f(jk) ≤ s(jk+1), so f(ik) ≤ s(jk+1).
¤ Hence jk+1 is compatible with ik, and R should still contain jk+1.
¤ However, the greedy algorithm stops with request ik, and it only stops when R is empty. →← (contradiction)
Greedy algorithms 9

Implementation: The Greedy Algorithm
¨ Running time: improved from O(n²) to O(n log n)
¤ Initialization:
n O(n log n): sort R in ascending order of f(i)
n O(n): construct S, with S[i] = s(i)
¤ Lines 3 and 5 of Interval-Scheduling:
n O(n): scan the sorted R once
n We do not delete all incompatible requests in line 5 — during the scan we simply skip each request that starts before the finish time of the most recently selected request.
Greedy algorithms 10

The Interval Scheduling Problem
[Figure: nine requests on a time-line 0–16; maximum compatible subset {1, 3, 5, 8}]
Greedy algorithms 11

Interval Partitioning (interval coloring)
Greedy algorithms 12
What if We Have Multiple Resources?
¨ The interval partitioning problem:
¤ a.k.a. the interval coloring problem: one resource = one color
¤ Use as few resources as possible
¨ Given: A set of requests {1, 2, …, n}; the ith request corresponds to an interval with start time s(i) and finish time f(i)
¤ Interval i: [s(i), f(i))
¨ Goal: Partition the requests into a minimum number of compatible subsets, each corresponding to one resource
[Figure: time-line 0–13; the instance needs 3 resources]
Greedy algorithms 13

How Many Resources Are Required?
¨ The depth of a set of intervals is the maximum number of intervals that pass over any single point on the time-line.
¨ Lower bound: In any instance of interval partitioning, the number of resources needed is at least the depth of the set of intervals.
¨ Pf:
¤ Suppose a set of intervals has depth d, and let I1, …, Id all pass over a common point on the time-line.
¤ Each of these intervals must be scheduled on a different resource, so the whole instance needs at least d resources.
[Figure: time-line 0–13; depth 3]
Greedy algorithms 14

Can We Reach the Lower Bound?
¨ The depth d is a lower bound on the number of required resources.
¨ Q: Can we always use d resources to schedule all requests?
¨ A: Yes.
¨ Q: How does this prove optimality?
¨ A: Find a bound that every possible solution must meet, then show that the algorithm under consideration always achieves this bound.
Greedy algorithms 15

The Greedy Algorithm
¨ Assign a label to each interval; possible labels: {1, 2, …, d}.
¨ Assign different labels to overlapping intervals.
Interval-Partitioning(R)
1. {I1, …, In} = sort intervals in ascending order of their start times
2. for j from 1 to n do
3.   exclude the labels of all assigned intervals that are not compatible with Ij
4.   if (there is a nonexcluded label from {1, 2, …, d}) then
5.     assign a nonexcluded label to Ij
6.   else leave Ij unlabeled
¨ Implementation:
¤ Lines 3–5: find a resource compatible with Ij and assign its label
n Record the finish time of the last added interval for each label
n Compatibility check: s(Ij) ≥ finish time of the label's last interval
n Use a priority queue to maintain the labels
Greedy algorithms 16
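The implementation note above (track each label's last finish time in a priority queue) can be sketched in Python. The function and interval representation are illustrative assumptions; labels are created on demand, so the number of labels ever created equals the depth d.

```python
import heapq

def interval_partitioning(intervals):
    """intervals: list of distinct (start, finish) pairs for [s, f).

    Returns a dict mapping each interval to its label (1, 2, ...).
    """
    labels = {}
    heap = []        # min-heap of (finish time of label's last interval, label)
    next_label = 1
    for s, f in sorted(intervals):          # ascending start time (line 1)
        if heap and heap[0][0] <= s:        # some label's resource is already free
            _, lab = heapq.heappop(heap)    # reuse it (lines 3-5)
        else:                               # every current label conflicts
            lab = next_label                # open a fresh label
            next_label += 1
        labels[(s, f)] = lab
        heapq.heappush(heap, (f, lab))      # record this label's new finish time
    return labels
```

Each interval costs one heap push and at most one pop, so the whole pass runs in O(n log n) after sorting.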
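The depth lower bound can also be computed directly with an endpoint sweep; this hypothetical checker (not part of the slides) is handy for confirming on any instance that the labeling really uses only d labels. Intervals are half-open [s, f), matching the slides, so a closing endpoint is processed before an opening endpoint at the same time.

```python
def depth(intervals):
    """intervals: list of (start, finish) pairs for [s, f).

    Returns the depth: the max number of intervals over any single point.
    """
    events = []
    for s, f in intervals:
        events.append((s, 1))    # interval opens
        events.append((f, -1))   # interval closes
    # At ties, -1 sorts before 1, so [1,3) and [3,5) do not overlap.
    events.sort()
    best = cur = 0
    for _, delta in events:
        cur += delta
        best = max(best, cur)
    return best
```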
The Interval Partitioning Problem
[Figure: time-line 0–13, depth 3; ten intervals scheduled on resources 1–3]
Greedy algorithms 17

Optimality (1/2)
¨ Claim: The greedy algorithm assigns every interval a label, and no two overlapping intervals receive the same label.
¨ Pf:
1. No interval ends up unlabeled.
n Suppose interval Ij overlaps t intervals earlier in the sorted list.
n These t + 1 intervals pass over a common point, namely s(Ij).
n Hence t + 1 ≤ d, so t ≤ d – 1; at least one of the d labels is not excluded, and that label can be assigned to Ij.
n i.e., line 6 never occurs!
Greedy algorithms 18

Optimality (2/2)
¨ Pf (cont'd):
2. No two overlapping intervals are assigned the same label.
n Consider any two intervals Ii and Ij that overlap, with i < j.
n When Ij is considered (in line 2), Ii is among the intervals whose labels are excluded (in line 3).
n Hence the algorithm will not assign to Ij the label used for Ii.
¤ Since the algorithm uses at most d labels, the greedy algorithm always uses the minimum possible number of labels, i.e., it is optimal!
Greedy algorithms 19

An exchange argument
Scheduling to Minimize Lateness
Greedy algorithms 20
What If Each Request Has a Deadline?
¨ Given: A single resource available starting at time s, and a set of requests {1, 2, …, n}; request i requires a contiguous interval of length ti and has a deadline di.
¨ Goal: Schedule all requests without overlapping so as to minimize the maximum lateness.
¤ Lateness: li = max{0, f(i) – di}.
[Figure: six jobs with lengths and deadlines on a time-line 0–15; max lateness = 1]
Greedy algorithms 21

Greedy Rule
¨ Consider requests in some order:
1. Shortest interval first: process requests in ascending order of ti
2. Smallest slack first: process requests in ascending order of di – ti
3. Earliest deadline first: process requests in ascending order of di
¨ Rule 1 fails: with t1 = 1, d1 = 100 and t2 = 10, d2 = 10, the short job goes first and job 2 finishes late, whereas scheduling job 2 first makes both jobs on time.
¨ Rule 2 fails: with t1 = 1, d1 = 2 and t2 = 10, d2 = 10, the slacks are 1 and 0, so job 2 goes first and job 1 becomes very late.
¨ Rule 3 works: jobs with earlier deadlines get completed earlier.
Greedy algorithms 22

Minimizing Lateness
¨ Greedy rule: earliest deadline first!
Min-Lateness(R, s) // f: the finishing time of the last scheduled request
1. {d1, …, dn} = sort requests in ascending order of their deadlines
2. f = s
3. for i from 1 to n do
4.   assign request i to the time interval from s(i) = f to f(i) = f + ti
5.   f = f + ti
6. return the set of scheduled intervals [s(i), f(i)) for all i = 1..n
Greedy algorithms 23

No Idle Time
¨ Observation: The greedy schedule has no idle time — line 4 starts each request the moment the previous one finishes.
¨ There is an optimal schedule with no idle time.
[Figure: a schedule with a gap (deadlines 4, 6, 12) vs. the same jobs shifted left to close the gap]
Greedy algorithms 24
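Min-Lateness above can be sketched in a few lines of Python. Representing jobs as (ti, di) pairs is an assumed encoding; the code sorts by deadline, runs jobs back-to-back from time s (no idle time), and tracks the maximum lateness as it goes.

```python
def min_lateness(jobs, s=0):
    """jobs: list of (length t_i, deadline d_i); resource is free from time s.

    Returns (schedule, max lateness), where schedule is a list of
    (start, finish, deadline) triples in the order scheduled.
    """
    schedule = []
    f = s
    max_late = 0
    for t, d in sorted(jobs, key=lambda j: j[1]):  # earliest deadline first
        start, f = f, f + t                        # back-to-back: no idle time
        schedule.append((start, f, d))
        max_late = max(max_late, f - d)            # lateness is max(0, f(i) - d(i))
    return schedule, max_late
```

Because `max_late` starts at 0, early finishes (f < d) never drive it negative, matching li = max{0, f(i) – di}.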
No Inversions
- Exchange argument: gradually transform an optimal solution into the one found by the greedy algorithm without hurting its quality.
- An inversion in schedule S is a pair of requests i and j such that s(i) < s(j) but dj < di.
- Claim: all schedules with no inversions and no idle time have the same maximum lateness.
- Pf:
  - If two different schedules have neither inversions nor idle time, they can differ only in the order in which requests with identical deadlines are scheduled.
  - Consider such a deadline d. In both schedules, the jobs with deadline d are scheduled consecutively (after all jobs with earlier deadlines and before all jobs with later deadlines).
  - Among them, the last one has the greatest lateness, and this lateness does not depend on the order of the requests.

Optimality
- Claim: there is an optimal schedule with no inversions and no idle time.
- Pf:
  - There is an optimal schedule O without idle time (shown above).
  1. If O has an inversion, there is a pair of jobs i and j such that j is scheduled immediately after i and has dj < di.
  2. After swapping i and j, we get a schedule with one less inversion.
  3. The swapped schedule has a maximum lateness no larger than that of O:
     - all other requests keep the same lateness; j only finishes earlier, and i's new lateness f(j) – di is less than j's old lateness f(j) – dj, since dj < di.
[Figure: adjacent jobs i and j with dj < di, before and after the swap]

Optimality: Exchange Argument
- Theorem: the greedy schedule S is optimal.
- Pf (by contradiction):
  - Let O be an optimal schedule with no idle time and the fewest possible inversions.
  - If O has no inversions, then S and O have the same maximum lateness (by the claim above), so S is optimal.
  - If O has an inversion, let i-j be an adjacent inversion.
    - Swapping i and j does not increase the maximum lateness and strictly decreases the number of inversions.
    - This contradicts the choice of O.

Summary: Greedy Analysis Strategies
- An algorithm is greedy if it builds up a solution in small steps, choosing a decision at each step myopically to optimize some underlying criterion.
- It is challenging to prove that a greedy algorithm solves a nontrivial problem optimally. Two main strategies:
  1. The greedy algorithm stays ahead: show that after each step of the greedy algorithm, its partial solution is at least as good as the optimal one's.
  2. An exchange argument: gradually transform an optimal solution into the one found by the greedy algorithm without hurting its quality.
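Since the instances here are tiny, the optimality theorem can also be sanity-checked by brute force: compare earliest deadline first against every permutation. A hypothetical checker (not from the slides), assuming the same (length, deadline) job encoding:

```python
from itertools import permutations

def max_lateness(jobs, order, s=0):
    """Maximum lateness of scheduling (length, deadline) jobs in the given order."""
    f, late = s, 0
    for i in order:
        t, d = jobs[i]
        f += t                     # no idle time: start each job at f
        late = max(late, f - d)
    return late

def check_edf_optimal(jobs):
    """Brute-force check that earliest deadline first achieves the optimum."""
    edf = sorted(range(len(jobs)), key=lambda i: jobs[i][1])
    best = min(max_lateness(jobs, p) for p in permutations(range(len(jobs))))
    return max_lateness(jobs, edf) == best
```

This only scales to a handful of jobs, but it makes the exchange argument concrete: no ordering beats the inversion-free one.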
Shortest Paths
Edsger W. Dijkstra, 1959

Edsger W. Dijkstra (1930–2002)
- 1972 recipient of the ACM Turing Award (http://www.cs.utexas.edu/users/EWD/obituary.html)
- "The question of whether computers can think is like the question of whether submarines can swim."
- "If you want more effective programmers, you will discover that they should not waste their time debugging; they should not introduce the bugs to start with."
- "Program testing can be a very effective way to show the presence of bugs, but it is hopelessly inadequate for showing their absence."
  -- Turing Award Lecture 1972, "The Humble Programmer"

Google Map
- Shortest path from San Francisco Shopping Centre to Yosemite National Park

Floor Guide in a Shopping Mall
- Direct shoppers to their destinations in real time
The Shortest Path Problem
- Given:
  - A directed graph G = (V, E)
    - Length le = length of edge e = (u, v) ∈ E (a distance, time, or cost), with le ≥ 0
    - Q: What if the graph is undirected? A: Replace each undirected edge by 2 directed ones.
  - A source s
- Goal:
  - A shortest path Pv from s to each other node v ∈ V – {s}
  - Length of a path P: l(P) = Σe∈P le
[Figure: animated example, http://upload.wikimedia.org/wikipedia/commons/4/45/Dijksta_Anim.gif; l(a→b) = l(1→3→6→5) = 9+2+9 = 20]

Dijkstra's Algorithm
E. W. Dijkstra: A note on two problems in connexion with graphs. Numerische Mathematik, 1 (1959), pp. 269–271.

Dijkstra(G, l)
// S: the set of explored nodes
// for each u ∈ S, we store a shortest-path distance d(u) from s to u
1. initialize S = {s}, d(s) = 0
2. while S ≠ V do
3.   select a node v ∉ S with at least one edge from S for which
4.     d'(v) = min over edges e = (u, v) with u ∈ S of d(u) + le
5.   add v to S and define d(v) = d'(v)

- Intuition: the shortest path to v is a shortest path to some u in the explored part, followed by a single edge (u, v).
[Figure: S = {s, u, x, …} growing; first d'(x) = 2, d'(y) = 4, d'(z) = 5, then d'(y) = 3, d'(z) = 4]

Correctness
- Loop invariant: consider the set S at any point in the algorithm's execution. For each node u ∈ S, d(u) is the length of the shortest s-u path Pu. (The algorithm stays ahead!)
- Pf: by induction on |S|
  - Basis step: trivial for |S| = 1.
  - Induction step (hypothesis: the invariant holds for |S| = k ≥ 1):
    - Grow S by adding v; let (u, v) be the final edge on our s-v path Pv.
    - By the induction hypothesis, Pu is the shortest s-u path.
    - Consider any other s-v path P; P must leave S somewhere. Let y be the first node on P that is not in S, and x ∈ S the node just before y.
    - P cannot be shorter than Pv, because it is already at least as long as Pv by the time it leaves S: at iteration k+1,
      d(v) = d'(v) = d(u) + l(u,v) ≤ d(x) + l(x,y) ≤ l(P).
- Q: Where did we use le ≥ 0? (In the last step: with nonnegative lengths, P cannot get shorter after leaving S.)
[Figure: set S containing s, u, x, with Pv ending at v and P leaving S over edge (x, y)]

Implementation
- Q: How do we compute line 4, d'(v) = min over e = (u, v) with u ∈ S of d(u) + le, efficiently?
- A: Explicitly maintain d'(v) from the point of view of each unexplored node v, instead of recomputing the minimum over S:
  - The next node to explore is the one with minimum d'(v).
  - When exploring v, update d'(w) for each outgoing edge (v, w) with w ∉ S.
- Q: How? A: With a well-implemented min priority queue:

Operation   | # calls in Dijkstra | Array | Binary heap | Fibonacci heap
Insert      | O(n)                | O(n)  | Θ(lg n)     | Θ(1)
ExtractMin  | O(n)                | O(n)  | Θ(lg n)     | O(lg n)
ChangeKey   | O(m)                | O(1)  | Θ(lg n)     | Θ(1)
IsEmpty     | O(n)                | O(1)  | Θ(1)        | Θ(1)
Total       |                     | O(n²) | O(m lg n)   | O(m + n lg n)
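The binary-heap variant can be sketched in Python with the standard heapq module. Since heapq has no ChangeKey, this sketch pushes a fresh entry on each improvement and skips stale entries when popped, which preserves the O(m lg n) bound; the adjacency-list encoding and the test graph are illustrative assumptions, not the slide's example:

```python
import heapq

def dijkstra(adj, s):
    """Dijkstra's algorithm with a binary min heap (lazy deletion).

    adj maps each node to a list of (neighbor, edge_length) pairs,
    with all lengths non-negative.  Returns shortest distances from s.
    """
    d = {s: 0}
    pq = [(0, s)]                  # entries (tentative distance d'(v), v)
    while pq:
        dist, u = heapq.heappop(pq)        # ExtractMin
        if dist > d.get(u, float("inf")):
            continue                       # stale entry: u already explored
        for v, w in adj.get(u, []):        # update d'(w) for edges out of u
            nd = dist + w
            if nd < d.get(v, float("inf")):
                d[v] = nd
                heapq.heappush(pq, (nd, v))  # replaces ChangeKey
    return d
```

With nonnegative lengths, each node is finalized the first time it is popped, exactly as in the loop-invariant proof above.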
Example
[Figure: step-by-step run of Dijkstra's algorithm, from Cormen et al., 3rd ed.]

Recap (Binary Tree Application): Heaps as Priority Queues

Priority Queue
- In a priority queue (PQ):
  - Each element has a priority (key).
  - Only the element with highest (or lowest) priority can be deleted: a max priority queue or a min priority queue.
  - An element with arbitrary priority can be inserted into the queue at any time.

Operation  | Binary heap | Fibonacci heap
FindMin    | Θ(1)        | Θ(1)
ExtractMin | Θ(lg n)     | O(lg n)
Insert     | Θ(lg n)     | Θ(1)
ChangeKey  | Θ(lg n)     | Θ(1)

(The time complexities are worst-case for the binary heap and amortized for the Fibonacci heap.)
- Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein. Introduction to Algorithms, 2nd Edition. MIT Press and McGraw-Hill, 2001.
- Fredman, M. L. & Tarjan, R. E. (1987). Fibonacci heaps and their uses in improved network optimization algorithms. Journal of the ACM 34(3), pp. 596–615.

Heap
- Definition: a max (min) heap is
  - a max (min) tree: key[parent] >= (<=) key[children], and
  - a complete binary tree.
- Corollary: who has the largest (smallest) key in a max (min) heap? The root!
[Figure: an example max heap and an example min heap]
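As a concrete instance of the binary-heap column, Python's heapq module maintains a min priority queue in a plain list (a minimal sketch; the keys here are arbitrary example values):

```python
import heapq

pq = []                      # a binary min heap stored in a plain list
for key in [14, 2, 10, 7, 8, 6, 12]:
    heapq.heappush(pq, key)  # Insert: Theta(lg n)

smallest = pq[0]             # FindMin: Theta(1), the root of the heap
first = heapq.heappop(pq)    # ExtractMin: Theta(lg n)
```

heapq offers no ChangeKey; the usual workaround (used in the Dijkstra sketch earlier) is to push a new entry and discard stale ones on pop.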
Class MaxHeap
- Implementation? A complete binary tree ⇒ array representation.
[Figure: the max heap 14, 12, 7, 10, 8, 6 stored in a capacity-10 array]

Insertion into a Max Heap (1/3)
- Maintain the heap property at all times.
- Push(1): the new node is placed at the next free leaf position; here 1 is not larger than its parent, so no further work is needed.
[Figure: heap 20, 15, 2, 14, 10 (heapSize = 5, capacity = 10) becomes 20, 15, 2, 14, 10, 1 (heapSize = 6)]

Insertion into a Max Heap (2/3)
- Maintain the heap ⇒ bubble up if needed!
- Push(21): 21 starts at the next free leaf, then repeatedly swaps with its parent while it is larger.
[Figure: 20, 15, 2, 14, 10, 1 with 21 appended → 20, 15, 21, 14, 10, 1, 2 → 21, 15, 20, 14, 10, 1, 2]

Insertion into a Max Heap (3/3)
- Time complexity? How many times can a key bubble up in the worst case?
- At most the tree height: Θ(lg n).
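The bubble-up insertion can be sketched as a plain function over a 0-indexed Python list (an illustrative sketch, not the course's class interface); replaying the slide's pushes of 20, 15, 2, 14, 10, then 1, then 21 reproduces its final array 21, 15, 20, 14, 10, 1, 2:

```python
def heap_push(heap, key):
    """Insert key into a max heap stored in a 0-indexed list (bubble up)."""
    heap.append(key)                 # initial location: the next free leaf
    i = len(heap) - 1
    while i > 0:
        parent = (i - 1) // 2
        if heap[parent] >= heap[i]:  # heap property holds: stop
            break
        heap[i], heap[parent] = heap[parent], heap[i]  # bubble up one level
        i = parent                   # at most tree height = Theta(lg n) swaps
```

With 0-indexing, node i's parent is (i - 1) // 2 and its children are 2i + 1 and 2i + 2.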
  • 45. Deletion from a Max Heap (1/3) (slide 45)
  - Maintain the heap ⇒ trickle down if needed!
  - Pop(): remove the root (21), move the last element (2) into the root slot (heapSize 7 → 6), then trickle 2 down past 20 until the heap property is restored: 20, 15, 2, 14, 10, 1.
  Deletion from a Max Heap (2/3) (slide 46)
  - Pop() again: remove 20, move 1 into the root slot (heapSize 6 → 5), and trickle 1 down past 15 and 14: 15, 14, 2, 1, 10.
  Deletion from a Max Heap (3/3) (slide 47)
  - Time complexity? How many times do we trickle down in the worst case? Θ(lg n).
  Max Heapify (slide 48)
  - Max (min) heapify = maintain the max (min) heap property; it is what we do to trickle down the root during deletion.
  - Assume node i's left and right subtrees are heaps, but key[i] may be smaller (larger) than its children's keys.
  - Heapify(i) = trickle key[i] down ⇒ the tree rooted at i becomes a heap.
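The deletion procedure on these slides (move the last element into the root slot, then trickle down) is exactly max heapify applied at the root. A minimal Python sketch in the same spirit as before; `max_heapify` and `pop` are illustrative names:

```python
def max_heapify(heap, i, size):
    """Trickle heap[i] down: assuming i's subtrees are already heaps,
    swap with the larger child until the subtree rooted at i is a heap."""
    while True:
        l, r, largest = 2 * i + 1, 2 * i + 2, i
        if l < size and heap[l] > heap[largest]:
            largest = l
        if r < size and heap[r] > heap[largest]:
            largest = r
        if largest == i:                   # heap property restored
            return
        heap[i], heap[largest] = heap[largest], heap[i]
        i = largest

def pop(heap):
    """Remove and return the max: move the last element to the root,
    shrink the heap, then trickle the new root down."""
    top = heap[0]
    heap[0] = heap[-1]
    heap.pop()
    if heap:
        max_heapify(heap, 0, len(heap))
    return top

# The slides' example: popping 21 lets 2 trickle down past 20.
heap = [21, 15, 20, 14, 10, 1, 2]
pop(heap)    # returns 21; heap becomes [20, 15, 2, 14, 10, 1]
```

Each trickle-down step descends one level, so deletion is also bounded by the tree height, Θ(lg n).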
  • 49. How to Build a Max Heap? (slide 49)
  - How do we convert an arbitrary complete binary tree into a max heap?
  - Intuition: max heapify in a bottom-up manner.
  - Induction basis: the leaves are already heaps.
  - Induction step: start at the parents of the leaves and work upward to the root.
  - Time complexity: O(n lg n) (a tighter analysis gives O(n)).
  Minimum Spanning Trees (slide 50)
  - Three classical greedy algorithms: Kruskal's algorithm (Joseph B. Kruskal, 1956), Prim's algorithm (Robert C. Prim, 1957; Dijkstra, 1959), and reverse-delete.
  Minimum Spanning Graphs (slide 51)
  - Q: How can a cable TV company lay cable to a new neighborhood, of course, as cheaply as possible?
  - A: Curiously and fortunately, this is a problem that many greedy algorithms solve optimally.
  - Given: an undirected graph G = (V, E) with a nonnegative cost ce ≥ 0 for each edge e = (u, v) ∈ E.
  - Goal: find a subset of edges T ⊆ E so that the subgraph (V, T) is connected and the total cost Σe∈T ce is minimized.
  Minimum Spanning Trees (slide 52)
  - Q: Let T be a minimum-cost solution. What must (V, T) look like?
  - A: By definition, (V, T) must be connected. We show it also contains no cycles: suppose it contained a cycle C, and let e be any edge on C. Then (V, T – {e}) is still connected, since any path that previously used e can go along C – {e} instead. It follows that (V, T – {e}) is also a valid solution, but cheaper, a contradiction. Hence (V, T) is a tree.
  - Goal (restated): find a subset of edges T ⊆ E so that (V, T) is a tree and Σe∈T ce is minimized.
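The bottom-up construction described above can be sketched as follows. The input array is the classic textbook build-heap example, which the slide's figure appears to reproduce; treat that correspondence as an assumption:

```python
def max_heapify(a, i, n):
    """Trickle a[i] down within a[0:n], assuming i's subtrees are heaps."""
    while True:
        l, r, largest = 2 * i + 1, 2 * i + 2, i
        if l < n and a[l] > a[largest]:
            largest = l
        if r < n and a[r] > a[largest]:
            largest = r
        if largest == i:
            return
        a[i], a[largest] = a[largest], a[i]
        i = largest

def build_max_heap(a):
    """Convert an arbitrary array (complete binary tree) into a max heap:
    leaves are trivially heaps, so heapify from the last parent up to the root."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):   # parents of leaves first, then upward
        max_heapify(a, i, n)
    return a

a = build_max_heap([4, 1, 3, 2, 16, 9, 10, 14, 8, 7])
```

The loop performs n/2 heapify calls of O(lg n) each, giving the slide's O(n lg n) bound; summing the actual subtree heights tightens this to O(n).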
  • 53. Greedy Algorithms (1/3) (slide 53)
  - Q: What would you do? All three of the following greedy algorithms produce an MST.
  - Kruskal's algorithm: start with T = {}; consider edges in ascending order of cost; insert edge e into T as long as it does not create a cycle, otherwise discard e and continue.
  Greedy Algorithms (2/3) (slide 54)
  - Prim's algorithm (cf. Dijkstra's algorithm): start with a root node s; greedily grow a tree T from s outward; at each step, add the cheapest edge e that has exactly one endpoint in T.
  Greedy Algorithms (3/3) (slide 55)
  - Reverse-delete algorithm (the reverse of Kruskal's): start with T = E; consider edges in descending order of cost; delete edge e from T unless doing so would disconnect T.
  Cut Property (1/2) (slide 56)
  - Simplifying assumption: all edge costs ce are distinct.
  - Q: When is it safe to include an edge in the MST?
  - Cut Property: let S be any subset of nodes, and let e = (v, w) be the minimum-cost edge with one end in S and the other in V–S. Then every MST contains e.
  - Pf (exchange argument): let T be a spanning tree that does not contain e; we need to show that T does not have the minimum possible cost. Since T is a spanning tree, it must contain an edge f with one end in S and the other in V–S. Since e is the cheapest edge with this property, we have ce < cf. Hence T – {f} + {e} is a spanning tree that is cheaper than T.
  - Q: What's wrong with this proof? A: Take care with the definition! A spanning tree must be connected and acyclic, and T – {f} + {e} need not be a tree for an arbitrary choice of f.
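Kruskal's algorithm pairs naturally with the union-find structure named in the chapter outline: an edge creates a cycle exactly when its endpoints are already in the same component. A minimal Python sketch; the 4-node graph is a hypothetical example, not the slides' 7-node figure, and the simple union (no union by rank) is used only to keep the sketch short:

```python
def kruskal(n, edges):
    """Kruskal's algorithm with union-find: scan edges in ascending cost
    order, adding an edge iff its endpoints lie in different components."""
    parent = list(range(n))              # union-find forest, one set per node

    def find(x):                         # find root, with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst, total = [], 0
    for cost, u, v in sorted(edges):     # ascending order of cost
        ru, rv = find(u), find(v)
        if ru != rv:                     # different components: no cycle
            parent[ru] = rv              # union the two components
            mst.append((u, v, cost))
            total += cost
    return mst, total

# Hypothetical example: edges given as (cost, u, v) over nodes 0..3.
edges = [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]
mst, total = kruskal(4, edges)
```

Sorting dominates at O(m lg m); with near-constant-time union-find operations the scan itself is almost linear in m.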