This topic cannot be made clearer. Love his subtle humor too.
My favorite professor. Thank you so much for making this subject interesting & easy.
Excellent video. I love this subject but I can't find that many videos on it, unfortunately =/ The Williamson-Shmoys book is great for certain algorithms, but not all of them. It's always nice to have complementary materials. Thanks again for another great lecture!
I'd recommend Abdul Bari channel
30:00 I think the correct way to say it would be that there won't be any vertices in our vertex cover that share an edge.
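For anyone following along, here is a minimal sketch of the matching-based 2-approximation for vertex cover that the lecture covers (the graph representation and function name are my own):

```python
def vertex_cover_2approx(edges):
    """Greedily pick an uncovered edge and add BOTH of its endpoints.

    The picked edges form a maximal matching (no two of them share a
    vertex), and any optimal cover must contain at least one endpoint
    of each matched edge, which gives the factor-2 guarantee.
    """
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.add(u)
            cover.add(v)
    return cover

# Path graph 1-2-3-4: picks edges (1,2) and (3,4), cover = {1, 2, 3, 4}
cover = vertex_cover_2approx([(1, 2), (2, 3), (3, 4)])
```

On a path of 4 vertices the optimum has size 2, so the output of size 4 is exactly at the 2x bound.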
So it's simple yet fiddly: with colours, partial reductions to 1s and 2s can detect a broken grid with 2 or more colours, which quickly yields enough vital clues to solve grids of any size. P = NP? Gemini said yes, that's true, but it isn't sure where its solution bounds are; it could probably be coded up in Prolog.
You can use 2-SAT partial reductions to deduce broken partial 3-SAT reductions, which makes it quicker to solve.
This dude is awesome
It feels like the prof made a mistake when explaining approx partition. He first said (when he was building the algorithm at 1:03:52) m < n, and then when he was proving he said to imagine that the second part never executed because m is large. However, the only case in which the second part never executes is m = n, since otherwise there will still be leftover elements that are not added to any partition.
The input elements are sorted in decreasing order, so you can imagine that only 'small' values were left, which don't change the sums that much; they were added to only one of the sets, and thus the bigger set (aka A) stays the same after the initial first stage. The second stage always executes (if m < n), so if the lecturer said that, then he made a mistake.
He meant that if m is large it might be possible to never add elements to the bigger partition, which he denoted with A. The reason is that all the elements left might be really small and not really affect W(B), so we just add them there without exceeding W(A).
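To make this thread concrete, here is a rough sketch of the two-phase partition scheme being discussed (variable names and the brute-force phase are my own reconstruction, not the professor's exact code): brute-force the best split of the m largest elements, then greedily add each leftover to the lighter set, so the heavier set A may indeed never grow.

```python
from itertools import combinations

def approx_partition(values, m):
    """Phase 1: optimally split the m largest elements by brute force.
    Phase 2: greedily add each remaining (small) element to the set
    with the smaller sum, so the heavier set from phase 1 can stay
    untouched, as described in the lecture."""
    values = sorted(values, reverse=True)
    head, tail = values[:m], values[m:]
    best = None
    for r in range(len(head) + 1):
        for combo in combinations(range(len(head)), r):
            a = [head[i] for i in combo]
            b = [head[i] for i in range(len(head)) if i not in combo]
            if best is None or max(sum(a), sum(b)) < max(sum(best[0]), sum(best[1])):
                best = (a, b)
    a, b = list(best[0]), list(best[1])
    for x in tail:  # phase 2: leftovers go to the lighter set
        if sum(a) <= sum(b):
            a.append(x)
        else:
            b.append(x)
    return a, b
```

For example, `approx_partition([7, 5, 4, 2, 2], 2)` first splits {7, 5} optimally into [7] vs [5], then distributes 4, 2, 2 greedily.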
Can't find a complexity class from MIT without the blackboards and chalk, for some reason.
53:50, how do you get (1 - 1/t)^k?
some simple calculus
Taylor series expansion: e^x = 1 + x + x^2/2! + ...
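To spell out the calculus step: truncating the Taylor series gives 1 - x <= e^(-x) for all x, so each greedy round shrinks the uncovered set by a factor of at most (1 - 1/t), and after k rounds you get (1 - 1/t)^k <= e^(-k/t). A quick numeric sanity check of the inequality (the loop ranges are just illustrative):

```python
import math

# 1 - x <= e^(-x) for all real x, hence (1 - 1/t)^k <= e^(-k/t):
# the fraction of the universe still uncovered after k greedy rounds.
for t in (2, 10, 100):
    for k in (1, 5, 50):
        assert (1 - 1 / t) ** k <= math.exp(-k / t)
print("bound holds")
```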
For set cover, can't we reduce it to vertex cover and apply the 2-approximation method?
how k/n > lgn becomes k/n
Nitish Sandhu did you know now?
do*
I'm also stuck in this part.
Once k/t > ln(n), the algorithm is done. In other words, once k > t * ln(n), the algorithm is done. Since k is discrete and goes up by one each time, that means by the time the algorithm is done, k is at most t * ln(n) + 1, i.e., k = O(t * ln(n)).
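A minimal sketch of the greedy set-cover loop this round bound refers to (assuming the subsets do cover the universe; names are my own):

```python
def greedy_set_cover(universe, subsets):
    """Repeatedly pick the subset covering the most still-uncovered
    elements; each round the uncovered set shrinks by a (1 - 1/t)
    factor, which is where the t * ln(n) round bound comes from.
    Assumes the subsets jointly cover the universe."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        i = max(range(len(subsets)), key=lambda j: len(subsets[j] & uncovered))
        chosen.append(i)
        uncovered -= subsets[i]
    return chosen

# Universe {0..5}; the two big triples get picked before any pair.
picks = greedy_set_cover(range(6), [{0, 1, 2}, {3, 4, 5}, {0, 3}, {1, 4}, {2, 5}])
```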
19:10 don't really get it...can someone help?
This is an example showing that the max-degree heuristic can lead to an approximation factor that grows with n, whereas we would rather have a fixed approximation factor, like the 2-approximation found in this lecture.
Brother, what a lesson he has taught!
Could someone explain the deduction at 1:18:03?
It's confusing because you think it follows from the statement he said just before it. It doesn't: the prior equation does not imply the statement made at your timestamp (I thought that as well); it's a separate claim.
That statement is separate. You prove it from the fact that k > m plus the fact that the s_i are sorted in descending order, so s_k <= s_m, which yields the result.
@53:22 shouldn't it be X_(k-1) on the left side of the inequality?
No, since X is shrinking because it is the set of elements remaining to be covered by the algorithm.
4:54 correction in the caption: "rho of n" instead of "row of n".
Same mistake appears 8 times
Would be better if the cameraman stayed still on the board instead of following the prof.
Now I can't stop noticing it, thank you.
He doesn't explain enough details to follow his arguments easily when it comes to calculating the approximation factor.