One of the best things about the age we live in is that we all have FREE access to amazing lectures like these from MIT, no matter where we are
agreed lol.
Especially during the pandemic and lockdown.
And we recognize that watching videos of lectures is meaningless for most people.
we know
And one of the worst things about the age we live in is that we have to spend 8-9 hours a day in front of a computer screen wasting our lives on menial corporate tasks instead of watching lectures like these and applying what we learned from them to do something really meaningful.
Professor Guttag gives simple, understandable explanations of otherwise pretty complex optimization problems (especially discrete optimization). It is so nice that MIT is making these lectures public
*My takeaways:*
1. Prerequisites for MIT 6.0002 2:16
2. What is a computational model 4:17
3. Optimization models 5:47
- Knapsack problem 8:04
- Solutions to the knapsack problem: brute-force algorithm 16:18, greedy algorithm 19:38, and a problem with the greedy algorithm 37:05
thanks a lot, it's really helped me
@@vegitoblue21 nice GH Z 👍
GH Z you’re welcome
W, need more comments like these
For anyone interested, this course starts in March 2021 on edX. It's free, with an optional certificate for $75.
I'm working on an MS in data science, and man do I wish I had this guy. My professors overcomplicate everything.
Thank you so much MIT. I am a Colombian student, and without you I wouldn't be able to take courses like this
I am amazed that these courses are freely available. Thank you, MIT!
This course is gold. This quality does not exist anywhere else. I read the book, watched all the videos, and solved the programming assignments. Thanks MIT and Professor Guttag!
You can find assignment solutions for 6.0001 and 6.0002 on my github account: github.com/emtsdmr
Hey, do we get a certificate on completion? Just curious.
Hey, I'm having a hard time completing the last problem set. Can you please help me?
Expanding your imagination is the single most valuable skill to learn, because it assists all further learning. This imagination comes in forms like the mind palace, aka the art of memory, and maybe "Learning How to Learn"... This lecture made me think about why I became interested in machine learning and made the path seem less intimidating, which makes me glad that I found this lecture playlist and YouTube channel
Personal Notes.
1. The key function maps elements (items) to numbers; it tells us what we mean by "best". Here, the professor wants to use the same algorithm independently of his definition of best.
2. lambda creates anonymous functions (great for one-liners): it takes a sequence of parameters and evaluates a single expression. (lambda <parameters>: <expression>)
3. Greedy algorithms don't guarantee an optimal solution. Different greedy strategies: greedy by value (pick the highest value first), greedy by cost (pick the cheapest items first, hoping to fit as many items as possible), and greedy by density (pick the highest value per cost first); see the sketch below.
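A minimal sketch of how these notes fit together (the names, menu numbers, and budget here are illustrative, not the lecture's exact code):

```python
# Greedy 0/1 knapsack: one algorithm, many definitions of "best".
class Item:
    def __init__(self, name, value, cost):
        self.name, self.value, self.cost = name, value, cost

    def density(self):
        return self.value / self.cost

def greedy(items, max_cost, key_function):
    """Take items in order of key_function (best first) while they fit the budget."""
    taken, total_value, total_cost = [], 0, 0
    for item in sorted(items, key=key_function, reverse=True):
        if total_cost + item.cost <= max_cost:
            taken.append(item)
            total_cost += item.cost
            total_value += item.value
    return taken, total_value

menu = [Item('wine', 89, 123), Item('beer', 90, 154), Item('apple', 50, 95)]

# Three one-line lambdas, each a different definition of "best":
by_value   = lambda item: item.value       # greedy by value
by_cost    = lambda item: 1 / item.cost    # greedy by cost (cheapest first)
by_density = lambda item: item.density()   # greedy by density (value per cost)

for key in (by_value, by_cost, by_density):
    taken, value = greedy(menu, 200, key)
    print([item.name for item in taken], value)
```

Same greedy function every time; only the key function changes, and with this menu each of the three strategies happens to pick a different item.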
Thank You MIT
same bro lmao, i apologize for my broke ass
yeah, same goal
Now you have money, so donate already
donate bro
Now its time
Just finished 6.0001. If you want to go through 6.0002 with me, I'm starting today!
What a brilliant lecture and an amazing professor. He reminded me of what a pleasure it is to attend university.
It is so nice that MIT is making these lectures public 🎉
Great lecture. Really looking forward to diving into this second part of the course; thank you MIT for uploading these
This is the best teacher. I've realized that most MIT teachers are great; wish I could study there
A good example of the global vs local optimum is:
Problem: consider vals = {1/2, 1/3, 1/4}, and then find the subset of values in vals such that the sum of values in this subset is as large as possible, but is also constrained to be at most 5/8.
A greedy algorithm using 'next largest' takes 1/2 first; then neither 1/3 (1/2 + 1/3 = 5/6 > 5/8) nor 1/4 (1/2 + 1/4 = 3/4 > 5/8) fits, so greedy stops at 1/2. However, not confined to this greedy algorithm, you can see that 1/3 + 1/4 = 7/12, which is less than 5/8 but better than our greedy result of 1/2. So the point is that greedy algorithms give you different results on the knapsack problem depending on what your metric is (our greedy metric here was 'next largest', but we could have chosen something else; in fact, 'next smallest' would have found the global optimum!). "Local optimum" in this context refers to the optimal solution *for a given metric* ('next largest', which yielded our result of 1/2), which, as mentioned, isn't necessarily the same as the best possible global solution (our result of 7/12) to a knapsack (optimisation) problem.
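A quick way to check this example (a hypothetical snippet, not from the lecture; exact arithmetic via fractions):

```python
from fractions import Fraction

vals = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 4)]
budget = Fraction(5, 8)

def greedy_sum(values, limit, largest_first):
    # Take values in the chosen order while the running total stays under the limit
    total = Fraction(0)
    for v in sorted(values, reverse=largest_first):
        if total + v <= limit:
            total += v
    return total

print(greedy_sum(vals, budget, largest_first=True))   # 1/2  ('next largest' -> local optimum)
print(greedy_sum(vals, budget, largest_first=False))  # 7/12 ('next smallest' -> global optimum)
```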
Thank you! I was confused that he was describing a local optimum with those examples, because the metrics he uses are qualitatively different; i.e., it might be more desirable to me to have slightly fewer overall calories while maximising "value" (how much I like the food) rather than minimising cost. What seems significant for determining the optimum is the _order_ of the elements, and the metric (or the key function) determines the order. So the global optimum is the solution with the biggest total across all orderings.
Hyperparameter tuning is making so much sense now! Thank you so much for this.
how???
wots that ?
Great content, teacher and course. Thank you so much for uploading this course.
How does optimization work?
6:08 E.g., a route by car from A to B.
Objective: minimize travel time.
So objective function = sum(minutes spent) from A to B.
On top of that, layer a set of constraints (possibly empty).
The fast way to Boston is by plane, but impossible on a $100 budget.
Time constraint: need to be there before 5 pm.
The bus is only $15 but can't get there before 5, so infer it's better to drive.
Constraints help eliminate some solutions.
This asymmetry (objective vs constraints) is handled differently.
Knapsack: a burglar with limited space, and more items than he can take.
11:00 Continuous problems are solved by a greedy algorithm: take the best first.
0/1 knapsack: each decision affects the other decisions.
Could end up with multiple solutions, 1300 or 1450; greedy does not guarantee the best answer.
Assume n items:
0. Total max weight w.
1. Set of available items L.
2. V[i] = 1 if item i is taken.
16:30 Brute-force algorithm (see the sketch after these notes):
Generate all subsets (of items),
i.e., the power set.
23:31
Key function used to sort the items
(based on some criterion).
Take an item, subtract its calories from the budget.
The next-best item is found, but we can't leave the loop yet 🤔:
if an item goes over budget, "wait and see" and check the others.
Algorithm efficiency?
Python's built-in sort is timsort,
same O(n log n) as mergesort.
n = len(items).
n log n + n (constant)
= order n log n.
Fine even for large numbers of items (1M).
Greedy by cost: not the most costly, but the cheapest ones first.
We get different answers with greedy:
only locally optimal choices are made at each point,
so it can get stuck at a local optimum, not the best one.
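For the 16:30 brute-force part, a sketch over the power set (hypothetical numbers; each item is a (value, calories) pair):

```python
from itertools import combinations

def brute_force_knapsack(items, max_calories):
    """Generate every subset (the power set) and keep the best one under budget."""
    best_value, best_subset = 0, ()
    for r in range(len(items) + 1):
        for subset in combinations(items, r):  # all subsets of size r
            value = sum(v for v, c in subset)
            calories = sum(c for v, c in subset)
            if calories <= max_calories and value > best_value:
                best_value, best_subset = value, subset
    return best_value, best_subset

# The power set has 2**n subsets, so this is only feasible for small n --
# which is exactly why the lecture turns to greedy next.
items = [(89, 123), (90, 154), (95, 25), (100, 162), (90, 350)]
print(brute_force_knapsack(items, max_calories=750))
```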
The fact that his name basically 'means' "goodday" in German and "abdominal label" in English cheers me up for some reason.
Great content and teacher.
A little remark about the code:
names, values, and calories are not of the same length; names has 9 entries, so cake is indeed excluded
Easy introduction;
Using the human mind as an example for understanding how mental cognition takes place: logic sets lead to more logic sets, taken in relation to personal information that is believed through correlation with past believed information, which foundationally supports anything believed by that individual to be true.
*Because beliefs equal what we deem to be real (more on that later). For example, Artificial Intelligence is computationally created (unintentionally) but found to be necessary based upon exposure to beliefs, or purposely created by its creators (humans) without knowledge of the methods being used by an outside source of creation.
This is the greatest factor of creation. It is statistically possible to re-create what has been proven, and even possible to prove that nothing is random, in the event that it understands the mirrored language which it comparatively recognizes as belonging to a "conscious" observation of some outcome. If the created language is newly acquired and unknown, then no phenomena are observed to validate its existence. Therefore, no new DATA is confirmed, and a moment for observational phenomena was lost (some call this luck). In the event that new information is realized and then turns into data through conscious observation, it will be consciously compared to what is known in some context that cognitively validates a past experience that has been deemed factual and correct, therefore creating a sense of belief. *If the Universe offers assistance to the creation of other Universes, and its nature is to produce systems that are mirrored in reproduction, then it would seem relative. Some of these observations would be similar, metaphor-like, opposite, symbolically important, or whatever is consciously observed and held to be factual, or possibly thought of and believed to have somehow shaped or formed the connected understandings of the unique observer.
We could jump into many academic subjects and show how conscious creation, by cross-sourcing one subject to the next, helps to identify the creation of anything, because everything is a "system", per se...
Anyone here because of the damn quarantine?
i suppose you are optimizing your time
I don't even know how I got here lol
Maybe want to become bald.
If you are wondering when "Wednesday" is: yes, it is "2. Optimization Problems", next on autoplay
I just have two words: Thank You
It feels funny to hear absolute silence in response to some questions; even MIT students don't know, or are afraid of answering wrong.
Wish I could attend in person. Great lecture; just sad there's not enough interaction.
What a personable prof!
4:10
Start
Fantastic course, thank you to MIT, like many here I will donate when I start earning!
He is a legend, a great explainer
The length of the list of names is 9, but the lengths of the lists of values and calories are 8. Therefore, no value or calories are assigned to the cake. But the lecture is really great... minor mistake...
It is such a shame that this video has 287K views and the last video has only 20K. Why don't people complete the course?
Woahh, nice video. Didn't expect the knapsack algorithm to show up in data science... We learned it in design and analysis of algorithms... Interesting. I got an idea... maybe I can do something innovative 🤔
By the way, love from India
The 'no good solution' statement for the 0/1 knapsack problem is true only if we assume P ≠ NP
Thank you MIT
This John Guttag guy, I like his style
I love the way they teach us..... Awesome, I have had a great experience..... #Great Content and Also Valuable......
This Parachute is a knapsack! XD
What a cliffhanger to end on! :)
I can't believe this is for free
I love this guy! Man literally threw out candy to encourage students to answer questions, that’s so cute lol
thanks, MIT
Thanks for the assist Ana (heart emoji)
thank uu mit ocw
Thank you for these lectures. If I come into money I will make a large donation.
32:14 I feel so bad for the prof... he's trying so hard to build a connection with his students...
This food-reward thing reminds me of my relationship with my dog. :) Anyhow, good explanation and overall definition of these concepts!
31:40 The moment the professor discovers that no one understood anything.
Because he is teaching the wrong folk.
Jesus that was so cringe
@dothemathright 1111 that is so true, haha
dothemathright 1111 by this definition, no person at time t will understand lambda functions unless they already know them; and if we let t = 0, no one understands lambdas, therefore no one will ever be able to understand lambdas, and therefore lambdas become useless
@@ramind10001 It's almost as if he was joking ...
[36:00]
I don't get why we get different answers from the greedy algorithm as long as we use the same items and the same key function.
It does local optimization, but that doesn't mean local optimization gives a different result each time we run the program with the same parameters.
very good lecture
Thank you from Algeria
This is amazing
Thank You
32:31 Really, no one can answer?!
Great!
thanks
LOLLLL I love when no one can answer his questions. Omg, I feel so bad for that professor.
thnx MIT
The professor knows how to solve complex optimization problems but doesn't know what to do when the screen freezes. Calls the assistant.
Cheers!!!
Great course...but objectively speaking...we are always looking for a=b...now subjectively speaking...a=whatever the *user* wants... which brings us back to why we stick to frigging applying a linear transform on everything...
Just finished the exam for this... If only this had been uploaded a few months ago...
What about a genetic tournament algorithm?
Thank you I enjoyed it
36:48 The donut should show 95 calories instead of the 195 in the result, and the apple should be 150, not 95.
👍 Good morning, good video
Dr. Ana Bell from 6.0001 pops up in this video... Did any of you guys notice???
35:18
Quicksort worst case is O(n^2). The professor probably wanted to say average case complexity.
It probably was a white lie; having to explain the actual difference between average and worst-case time complexity would draw people's attention away from the actual problem, imo.
Would've been better if he had just used mergesort, which the students already knew, though
Maybe he was saying worst case for Timsort is O(n log(n)). en.wikipedia.org/wiki/Timsort
But timsort is not a quicksort, it is more like a mergesort.
So I have learned machine learning, Python, SQL, Tableau, Power BI, and Flask in 10 months, thanks to corona, ugggh
What have you put into practice?
@@ArunKumar-yb2jn Got a job in a business analyst role
@@axa3547 What does a business analyst do? Work with Excel or coding?
@@ArunKumar-yb2jn Depends on which tool you want to use; I use both
Did you get the job without a diploma in those, simply by skill?
6:14 Shouldn't this be an objective value rather than an objective function? What am I missing? Minimum time would be a value, right?
Boy, talk about your cliffhangers.
damn i'm hooked
I coded exactly what's in the video, but when I run it the error "name 'Food' is not defined" appears at line 17 (buildMenu). Does anyone have any ideas? 😢
Timsort is a variant of Quick Sort? AND QS has worst case complexity similar to merge sort?? I guess I don't understand Computational Complexity that well :(
La Casa De Papel knows the 0/1 knapsack problem, omg!
I could have gotten the candy reward; it was so obvious that the answer was Food.
RE: Carnegie Hall Joke.
--> Is that where Inglourious Basterds got the line from?
36:06 Do you mean calories as weight, sir?
cool
Where did the I[i] come from? Shouldn't it be L[i]?
He didn't define it at the beginning as a list, but it is the list of item values and weights.
Which book is used for this course, and how can I practice the different topics covered in the course? If there are any exercises...
The textbook is Guttag, John. Introduction to Computation and Programming Using Python: With Application to Understanding Data. 2nd ed. MIT Press, 2016. ISBN: 9780262529624. It is available both in hard copy and as an e-book. (mitpress.mit.edu/9780262529624). The course materials are available on MIT OpenCourseWare at: ocw.mit.edu/6-0002F16. Best wishes on your studies!
Is there a specific order in which I should watch the different playlists for ML?
Yes; it depends on what you want to learn.
Are the numbers inside the 'values' array randomly picked by the instructor, or do they act as a grading scale for each menu item?
I think they are a grading scale he has chosen to order the items according to how much value they have to him (how much he likes them).
What are the prerequisites for this course?
6.0001 Introduction to Computer Science and Programming in Python is the prerequisite for the course. See the course (and the prerequisite) on MIT OpenCourseWare at: ocw.mit.edu/6-0002F16. Best wishes on your studies!
28:42 What is "item" used for?
The 'list' of Food items, i.e., the menu
Excellent! Could you likewise upload physics and mathematics videos with Spanish subtitles or translated into Spanish? Thanks.
@Nicolás Gómez Aragón roflmao, i got it.
Learn English
26:30
error?
Americans love examples with food
0:23
Why are there 9 names and only 8 values and calories?
I think it's a minor mistake. You have to omit cake.
source code of that example program please.
ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-0002-introduction-to-computational-thinking-and-data-science-fall-2016/lecture-slides-and-files/
😂funny when he threw the first candy
27:11 So n = len(items) has a computation time of O(n log n), huh? I just understood it now. Thank you, sir
No, "itemsCopy = sorted(itmes, key = keyFunction, reverse = True)" has a complexity of O(nlogn) as the fastest sorting algorithm has that complexity. by "n = len(items)" the professor means that in O(nlogn) n is equal to the number of items we have to sort.
Finally, I can watch the lecture at double speed.
There's no value and calories for the name 'cake' :(
11:44
20:00
22:00
Love the prof and the content, but what's with the lame students? I was yelling FOOD! FOOD! while watching this around the 32:00 mark, and the students were just not interested in answering or participating. Don't they know how lucky they are to be sitting there?
Tbh I had to pause and go back a couple of times to understand what the prof was saying, so maybe they were just a bit confused.