__Exercises 16.2-1:__

Prove that the fractional knapsack problem has the greedy-choice property.

__Answer:__

The greedy-choice property: a globally optimal solution can be arrived at by making a locally optimal (greedy) choice (textbook, page 380).

To prove that the fractional knapsack problem has this property, we must show two things: that the choice we make is the best choice available at that moment, and that some optimal solution to the whole problem contains that choice.

Use the textbook instance, in which the knapsack capacity is 50 lb and an optimal load must be worth at least $220. Let items i1, i2, i3 have values v1, v2, v3 and weights w1, w2, w3. The greedy choice takes as much as possible of the remaining item with the highest value per pound. If an optimal solution took less of that item than possible, we could replace an equal weight of some lower-ratio item with more of it without decreasing the total value; hence some optimal solution contains the greedy choice.

The greedy choice is selected based on the "dollars per pound" value of each item:

Item 1 = $60 / 10 lb = 6

Item 2 = $100 / 20 lb = 5

Item 3 = $120 / 30 lb = 4

To get the most dollar value, the thief selects the highest $/lb first and fills the knapsack to its maximum capacity: this is the greedy choice. Therefore we have:

First, take item 1: total value = $60, total weight = 10 lb.

Second, take item 2: total value = $160, total weight = 10 + 20 = 30 lb.

Third, take 2/3 of item 3: total value = $160 + (2/3)($120) = $240, total weight = 10 + 20 + (2/3)(30) = 50 lb.

So w1 = 10, w2 = 20, and the fraction of item 3 weighs (2/3)(30) = 20, contributing (2/3)($120) = $80. The total value, $240, is at least $220, so the greedy choice produces an optimal load for this instance.
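The greedy strategy above can be sketched in a few lines of Python (the function name and structure are illustrative, not from the textbook; the item data are the exercise's):

```python
def fractional_knapsack(items, capacity):
    """items: list of (value, weight) pairs; returns total value taken."""
    # Greedy choice: consider items by highest value-per-pound first.
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    total = 0.0
    for value, weight in items:
        take = min(weight, capacity)      # take as much of this item as fits
        total += value * take / weight    # value of the fraction taken
        capacity -= take
        if capacity == 0:
            break
    return total

print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))  # 240.0
```

The dominant cost is the sort, so this runs in O(n lg n) time.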

__Exercises 16.2-2:__

Give a dynamic-programming solution to the 0-1 knapsack problem that runs in O(nW) time, where n is the number of items and W is the maximum weight of items that the thief can put in his knapsack.

__Answer:__

Assume that W is the maximum weight the knapsack can hold, n is the number of items the thief may take, and each item i has value vi and weight wi. The optimal substructure is: the thief wants as valuable a load as possible weighing at most W pounds, so if we remove an item j from an optimal load, the remainder must be an optimal load of weight at most W − wj chosen from the remaining n − 1 items.

The second step is to define the value of an optimal solution recursively. Let c[i, w] be the value of an optimal solution using items 1 through i with a knapsack of capacity w:

c[i, w] = 0, if i = 0 or w = 0

c[i, w] = c[i−1, w], if wi > w

c[i, w] = max(vi + c[i−1, w−wi], c[i−1, w]), if i > 0 and w ≥ wi

The table c has one entry for each of the n items and each weight 0, 1, ..., W, and each entry is filled in constant time, so the running time is O(nW).
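A minimal bottom-up implementation of this recurrence (names are illustrative) might look like:

```python
def knapsack_01(values, weights, W):
    """Bottom-up DP over the table c[i][w]; runs in O(n*W) time."""
    n = len(values)
    c = [[0] * (W + 1) for _ in range(n + 1)]   # c[0][w] = c[i][0] = 0
    for i in range(1, n + 1):
        for w in range(W + 1):
            if weights[i - 1] > w:               # item i does not fit
                c[i][w] = c[i - 1][w]
            else:                                # better of skipping / taking item i
                c[i][w] = max(c[i - 1][w],
                              values[i - 1] + c[i - 1][w - weights[i - 1]])
    return c[n][W]

# The textbook instance: the 0-1 optimum takes items 2 and 3.
print(knapsack_01([60, 100, 120], [10, 20, 30], 50))  # 220
```

The two nested loops make the O(nW) bound immediate: (n+1)(W+1) table entries, constant work each.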

Reference: http://www.cs.nyu.edu/courses/fall02/V22.0310-002/class-notes.html and Katherine Macropol (assignment 4).

__Exercise 16.5-1:__

| ai | 1 | 2 | 3 | 4 | 5 | 6 | 7 |
|----|----|----|----|----|----|----|----|
| di | 4 | 2 | 4 | 3 | 1 | 4 | 6 |
| wi | 10 | 20 | 30 | 40 | 50 | 60 | 70 |

Solve the instance of the scheduling problem given in the table above.

__Answer:__

Consideration points: the greedy algorithm considers the tasks in order of decreasing penalty wi, and accepts a task only if all accepted tasks can still be completed by their deadlines. First, sort the table by descending weight:

| ai | 7 | 6 | 5 | 4 | 3 | 2 | 1 |
|----|----|----|----|----|----|----|----|
| di | 6 | 4 | 1 | 3 | 4 | 2 | 4 |
| wi | 70 | 60 | 50 | 40 | 30 | 20 | 10 |

In this order, a7, a6, a5, a4, and a3 are all accepted: scheduled by increasing deadline (a5, a4, a6, a3, a7) they occupy time slots 1 through 5 and each finishes by its deadline. Accepting a2 (d2 = 2) or a1 (d1 = 4) would place five tasks with deadline at most 4 into only four slots, so both are rejected and must run late. Scheduling the accepted tasks earliest deadline first, with the rejected tasks last:

| ai | 5 | 4 | 6 | 3 | 7 | 2 | 1 |
|----|----|----|----|----|----|----|----|
| di | 1 | 3 | 4 | 4 | 6 | 2 | 4 |
| wi | 50 | 40 | 60 | 30 | 70 | 20 | 10 |

Therefore the optimal schedule is a5, a4, a6, a3, a7, a2, a1, with total penalty w2 + w1 = 20 + 10 = 30.
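As a sanity check, the greedy procedure can be sketched in Python (assuming the accept-only-if-still-schedulable test described above; all names are illustrative):

```python
def schedule(tasks):
    """tasks: list of (name, deadline, weight). Consider tasks by decreasing
    weight; accept a task if the early set can still meet every deadline."""
    early = []
    for name, d, w in sorted(tasks, key=lambda t: -t[2]):
        trial = sorted(early + [(name, d, w)], key=lambda t: t[1])
        # Feasible iff, run in deadline order, slot t+1 never exceeds the deadline.
        if all(slot + 1 <= task[1] for slot, task in enumerate(trial)):
            early = trial
    late = sorted([t for t in tasks if t not in early], key=lambda t: -t[2])
    order = [t[0] for t in early] + [t[0] for t in late]
    penalty = sum(t[2] for t in late)
    return order, penalty

tasks = [('a1', 4, 10), ('a2', 2, 20), ('a3', 4, 30), ('a4', 3, 40),
         ('a5', 1, 50), ('a6', 4, 60), ('a7', 6, 70)]
print(schedule(tasks))  # (['a5', 'a4', 'a6', 'a3', 'a7', 'a2', 'a1'], 30)
```

Running it on the exercise's table reproduces the schedule above with penalty 30.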

__Problem 16-2:__

Suppose you are given a set *S* = {a1, a2, ..., an} of tasks, where task ai requires pi units of processing time to complete once it has started. Let ci be the completion time of task ai. Your goal is to minimize the average completion time, that is, to minimize (1/n) ∑ ci for i = 1 to n. For example, with two tasks, running the shorter one first yields a smaller average completion time than running the longer one first.

a- Give an algorithm that schedules the tasks so as to minimize the average completion time. Each task must run non-preemptively, that is, once task ai is started it must run continuously for pi units of time.

__Answer:__

There are two scenarios for the greedy choice:

· Schedule the tasks in order of processing time, from shortest to longest. Examine the set S = {a1, a2, a3} with processing times p1 < p2 < p3, say p1 = 2, p2 = 3, p3 = 4. The completion times are c1 = 2, c2 = 2 + 3 = 5, and c3 = 2 + 3 + 4 = 9, so the average completion time is (2 + 5 + 9)/3 = 16/3. This schedule satisfies the given constraint that "once task ai is started it must run continuously for pi units of time."

· The second scenario is to schedule the tasks from longest to shortest, p3 > p2 > p1. The completion times become 4, 4 + 3 = 7, and 4 + 3 + 2 = 9, and the average is (4 + 7 + 9)/3 = 20/3, which is worse. In general, each task's processing time is added to the completion time of every task scheduled after it, so running the shorter tasks first minimizes the total, and hence the average, completion time.
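The arithmetic above can be checked with a small helper (a sketch; the function name is illustrative):

```python
def avg_completion(processing_times):
    """Average completion time when tasks run non-preemptively in the given order."""
    elapsed, total = 0, 0
    for p in processing_times:
        elapsed += p        # this task finishes at time `elapsed`
        total += elapsed
    return total / len(processing_times)

print(avg_completion([2, 3, 4]))  # shortest first: (2 + 5 + 9)/3 = 16/3
print(avg_completion([4, 3, 2]))  # longest first:  (4 + 7 + 9)/3 = 20/3
```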

b- Suppose now that the tasks are not all available at once. That is, each task has a *release time* ri before which it is not available to be processed. Suppose also that we allow preemption, so that a task can be suspended and restarted at a later time. Give an algorithm that schedules the tasks so as to minimize the average completion time in this new scenario.

__Answer:__ **Credit goes to Kathy Macropol; I fully agree with her explanation, which is copied here from her assignment 4.**

The greedy algorithm for this is to choose the task that currently has the shortest remaining processing time, and run it. If another task becomes available midway through, check whether the new task's processing time is shorter than the current task's remaining processing time. If it is, suspend the current task (putting it back at the front of the list of tasks) and run the new task instead. If not, add the new task to the list of tasks to be run (sorted from shortest processing time to longest) and continue with the current task. This takes O(*n* lg *n*) time to sort and insert the *n* tasks into the list: no matter when the tasks become available, exactly one task is added to the list each time a task arrives, whether it is the suspended old task or the new one.

To see that this algorithm minimizes the average completion time, consider the situations individually. At any point in time there is a list of tasks ready to be processed; this is equivalent to the situation in the previous problem, where we know that running the task with the shortest (remaining) processing time first minimizes the average completion time. Whenever a new task arrives, we have a slightly modified list of ready tasks and must again run the task with the shortest remaining processing time first, which again minimizes the average completion time for that subproblem. Since every subproblem's average completion time is minimized, the average completion time of the entire problem is minimized as well.
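The shortest-remaining-processing-time rule can be simulated with a short sketch under the stated assumptions (release times known in advance; all names and the sample instance are illustrative):

```python
import heapq

def srpt_average_completion(tasks):
    """tasks: list of (release_time, processing_time) pairs.
    Preemptive shortest-remaining-processing-time-first scheduling."""
    tasks = sorted(tasks)                    # by release time
    n = len(tasks)
    ready, i, now, done, total = [], 0, 0, 0, 0
    while done < n:
        # Admit every task released by the current time.
        while i < n and tasks[i][0] <= now:
            heapq.heappush(ready, tasks[i][1])   # heap key = remaining time
            i += 1
        if not ready:                        # idle until the next release
            now = tasks[i][0]
            continue
        rem = heapq.heappop(ready)
        # Run until this task finishes or the next task arrives, whichever is first.
        nxt = tasks[i][0] if i < n else now + rem
        run = min(rem, nxt - now)
        now += run
        if run == rem:
            done += 1
            total += now                     # record the completion time
        else:
            heapq.heappush(ready, rem - run) # preempt: push back the remainder
    return total / n

# A long task released at t=0 is preempted by a short one released at t=1.
print(srpt_average_completion([(0, 5), (1, 2)]))  # (3 + 7)/2 = 5.0
```

Without preemption the same instance would finish at times 5 and 7, for an average of 6.0, so the preemptive rule strictly improves it here.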