1:16 least squares explanation
4:42 fit line equations
6:39 fit parabola equations
Nice explanation!! Very clear!! If you want to know WHY the least squares method works, then watch this explanation. Only three data points are used here so as not to lose the method in a sea of unnecessary complication.
Very rarely do the presenters on MIT OpenCourseWare take the time to SLOWLY and THOROUGHLY explain key concepts/steps and methods. This was brilliant; I love the way you broke everything down to a T and made the learning process so simple. Thank you! Please keep making more videos.
The best explanation of the last decades, and of more to come! Linear least squares. Could you do a non-linear LS as well? :D We are suckers for real teachers and materials that simplify concepts for proper digestion.
The "challenge" was the real explanation.
- Regarding the parabola:
With a bit of "contemplation" one can see that the Y values seem to be the ones of a parabola with equation Y = X².
However, the Y values are not symmetrical with respect to X = 0, but rather to X = 1.
In other words it seems that the parabola Y = X² was shifted to the right by 1.
That means the parabola Y = (X - 1)² will fit the data points perfectly, and so that must be the solution: Y = X² - 2X + 1.
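For anyone who wants to check that numerically, here is a quick sketch, assuming the video's three data points are (0, 1), (2, 1), and (3, 4), which is what the sums quoted in other comments imply:

```python
import numpy as np

# Assumed data points from the video: (0, 1), (2, 1), (3, 4).
x = np.array([0.0, 2.0, 3.0])
y = np.array([1.0, 1.0, 4.0])

# Least-squares fit of a parabola y = a*x^2 + b*x + c.
# With three points and three coefficients the fit is exact.
a, b, c = np.polyfit(x, y, deg=2)
print(a, b, c)                                    # ~ 1.0, -2.0, 1.0  ->  y = x^2 - 2x + 1
print(np.allclose(np.polyval([a, b, c], x), y))   # True: the curve passes through every point
```

Since three points determine a parabola exactly, the least-squares fit here goes through every point, matching Y = (X - 1)².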
OMG!!!!!! We're doing Regression Analysis in Engineering Stats, which is exactly this!!!! Thanks so much for making it SO much easier to understand!!!!!!!! Much love!
This made the concept of least squares even clearer than I had it before. Thanks for the great video.
Thank you, you don't know how helpful these lectures can be.
Finally I found a good explanation for this subject :) Keep up the good work.
0:43 She literally went out and came back. She is so cute 🤣 Not to mention that she is such a great professor.
Really helped me to understand the C implementation of the algorithm...
condensing my 2 and 1/2 hour long lecture into 10 minutes 🙏, thank you
Thanks, very clear by keeping it in its simplest form.
This is extremely good. I've been trying to wrap my head around this for a bit and this is the clearest explanation.
Amazing explanation as I understood what least squares is at the two minute mark!
Perfect class. Nice explanation. Thanks so much! It helped me a lot. :)
First time I understood why we square the values... Awesome, thank you.
TQVM! My first exposure (self-study) to statistics, starting with least squares.
After solving the problem: a = 1, b = -2, c = 1,
so
y = x^2 - 2x + 1
Thanks for the good explanation, couldn't be better!! Awesome work
Love it! Great video that I will recommend to my students. Thank you!
the value of b is 11/7 at time 5:43
OMG you are so good, thanks!
what a legend, thank you
brilliant explanation! very clear!
Thank you Baris hocam for the video :)
Thanks, I found what I was looking for: a non-linear model using the least squares method.
I'm sure you'll put the non-linear model on the exam, since you didn't show it on the board.
great lecture and teacher. Thank you!
It would be nice if you could show how you derived those two equations!!
Loved the challenge problem! Very engaging recitation video.
Thank you professor.
My thought & guess: instead of finding the least squares, what we are really after is the least distance (the absolute value, since some points are above and some are below the fit function). However, we use least squares because, when we take the gradient, it makes the problem easy to solve.
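That's essentially it: with squared errors the total error E(a, b) = Σ (y_i - (a·x_i + b))² is smooth, and its partial derivatives with respect to a and b are linear in a and b, so setting them to zero gives two easy linear equations (worked out further down in this thread). With absolute values the total error has corners (|r| is not differentiable at r = 0), so there is no equally clean closed-form solution.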
Holy crap, she's a great teacher.
How could 9 people dislike this video???!!!! This video is very helpful for understanding the least squares method!
They are jealous perhaps. Hahaha
Thank you so much. I need those formulas for faster GPS ambiguity estimation.
Least squares is a data-fitting technique discovered by the mathematician Gauss. The system of equations that you see are established rules the mathematician arrived at by trial and error, so you don't need to break your head over them. Hence, by merely plugging in the x values you can not only interpolate but also extrapolate the y value. As for the second question, it is a challenge question. If you have done it, you can post it here or hint at your approach, and I can say whether you are right or not.
Thank you. This is the point I was missing!
You are a great teacher.
Very, very helpful video... and very easy to understand... also, the concept is explained very clearly.
We were looking for this explanation thank you so much 😊
Thanks, it was simple and clear, and it didn't make me feel stupid because I understood 100% of what you said.
Very beautifully explained... thanks a lot!! It helped me for my semester exams.
@SpitTanker Given certain assumptions there are different solutions and estimators. If all the Gauss-Markov assumptions hold, you use ordinary OLS, which is BLUE (best linear unbiased estimator) under these assumptions. You minimize the sum of squared residuals; the residuals are the differences between the true values of y and the estimated values y-hat. You do this by differentiating the objective with respect to b (the coefficient on the regressor). Kinda complicated to explain on YT.
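For what it's worth, here is a rough sketch of where that leads in the simple one-regressor case, writing the fitted line as y-hat = a + b·x (so b here is the slope coefficient). Setting the derivatives of the sum of squared residuals to zero gives the standard OLS estimators

b-hat = Σ (x_i - x̄)(y_i - ȳ) / Σ (x_i - x̄)²,  a-hat = ȳ - b-hat·x̄

and under the Gauss-Markov assumptions these are BLUE, as the comment above says.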
Thank you very much and best wishes
Thank you. Very good session!!
What function are they putting in terms of a and b at 3:36? I do not see any derivative being taken.
Thank you very much... for helping us by spreading knowledge!!!!!
Truly, you are great!!!
I got more work to do but that was very helpful. Thank you.
How do we differentiate when we have a summation? Would the summation go away, since it is like an integral?
3:25 Hey there, I would like to ask you about the function you mention at this point in the video; you ask us to take the derivative of the function with respect to a and b, but where is that function, if I may ask? Furthermore, when I try to find the equation of the scatter plot, my result is evidently incorrect! May I please get your insight and assistance on that?
I think my calculation might be wrong, but could anyone tell me if there should be a factor of 2 in front of the latter two terms in equation 1 at 3:32?
This is very helpful, thanks Ma'am Christine!
Too Good. Thanks very much
Thank you
AWW YEAH MIT.
Thanks for the video, I found it very straightforward. Can you please explain further how you found the two equations you used to solve for a and b? Are you taking the partial derivatives of y = ax + b? And do you have a video for your bonus question? I would like to see if what I worked out is correct. Thanks!!
I realize this is way too late, but I'll leave this here in case it helps someone :D
th-cam.com/video/UYe98CcxPbs/w-d-xo.html
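For anyone still wondering where the two equations come from, here is a rough sketch. The error being minimized is E(a, b) = Σ (y_i - (a·x_i + b))², summing over the data points (note you differentiate E, not y = ax + b itself). Setting ∂E/∂a = 0 and ∂E/∂b = 0 and dividing out the factor of -2 gives

a·Σx_i² + b·Σx_i = Σx_i·y_i
a·Σx_i + b·n = Σy_i

With the sums quoted elsewhere in this thread (Σx_i = 5, Σx_i² = 13, Σy_i = 6, Σx_i·y_i = 14, n = 3) these become exactly 13a + 5b = 14 and 5a + 3b = 6.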
what if "Y" is a function of a lot of variables say for example 4 or 5 variables and each time we combine values of these variables we get a new value of "Y"......in this case how can i proceed to get an approximate function of y?? THANK YOU SO MUCH :-)
Segmented Least Squares: Multi-way Choices
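The same least-squares idea extends directly to several input variables: you fit Y ≈ c1·X1 + ... + ck·Xk + d by stacking one column per variable (plus a column of ones for the intercept) and solving the resulting overdetermined system. A minimal sketch with made-up data, assuming NumPy is available:

```python
import numpy as np

# Made-up example: Y depends on 4 variables, with 20 observed combinations.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 4))                       # each row is one combination of the 4 variables
Y = X @ np.array([2.0, -1.0, 0.5, 3.0]) + 1.5      # "true" relationship used to fake the data
Y += 0.1 * rng.normal(size=20)                     # plus a little noise

# Design matrix: the 4 variables plus a column of ones for the intercept.
A = np.column_stack([X, np.ones(len(Y))])

# Least-squares solution: one coefficient per variable, intercept last.
coeffs, residuals, rank, _ = np.linalg.lstsq(A, Y, rcond=None)
print(coeffs)                                      # ~ [2.0, -1.0, 0.5, 3.0, 1.5]
```

Each row of X is one combination of the variables and each entry of Y is the value observed for that combination, exactly as described in the question.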
Patents and copyrights expire eventually for a good reason. Generally I appreciate that we honor the ones who found something out sooner than the rest of the world, even if in fact that often happens by mere accident.
very nice explanation. thank you very much :)
Hi Madam,
Thank you very much indeed for the nice explanation.
I really want to draw a best-fit curve through all the points; the curve may have a lot of fluctuations, rather than being a line or a parabola. Can you tell me how to do that?
I am desperately in need of this answer.
I appreciate your help in advance.
Looking forward to hearing from you,
Aram
Did you solve it sir?
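In case it still helps anyone: a curve with lots of fluctuations can be handled with the same least-squares machinery by using more basis functions (for example a higher-degree polynomial), or with a spline if you want a smooth curve through every single point. A rough sketch with made-up data, assuming NumPy and SciPy:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Made-up fluctuating data.
x = np.linspace(0.0, 10.0, 15)
y = np.sin(x) + 0.1 * np.random.default_rng(1).normal(size=x.size)
x_fine = np.linspace(0.0, 10.0, 200)

# Option 1: least-squares fit with a higher-degree polynomial
# (follows the wiggles, but need not hit every point exactly).
poly = np.polyfit(x, y, deg=6)
y_poly = np.polyval(poly, x_fine)

# Option 2: a cubic spline, which passes exactly through every data point.
spline = CubicSpline(x, y)
y_spline = spline(x_fine)
```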
Hi, how do I solve something like this: f = 1/{K1(x1^a1)(y1^b1)(z1^c1)} + 1/{K2(x2^a2)(y2^b2)(z2^c2)}, where x1, x2, y1, y2, z1, z2 are all variables and a1, a2, b1, b2, c1, c2, K1, K2 are all constants that need to be found.
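That one is a nonlinear least-squares problem (the unknown constants sit inside powers), so there is no closed-form solution like the line and parabola in the video; it is usually fitted numerically. A rough sketch, assuming SciPy is available and that you have measurements of f for many combinations of the six variables; all the data here is made up:

```python
import numpy as np
from scipy.optimize import curve_fit

def model(v, K1, a1, b1, c1, K2, a2, b2, c2):
    x1, y1, z1, x2, y2, z2 = v
    return (1.0 / (K1 * x1**a1 * y1**b1 * z1**c1)
            + 1.0 / (K2 * x2**a2 * y2**b2 * z2**c2))

# Made-up measurements: 50 observations of the six variables, and of f.
rng = np.random.default_rng(2)
data = rng.uniform(1.0, 3.0, size=(6, 50))          # rows: x1, y1, z1, x2, y2, z2
f_obs = model(data, 2.0, 1.0, 0.5, 1.5, 3.0, 0.8, 1.2, 0.4)

# Nonlinear least squares for the 8 constants; the starting guess p0 matters a lot here.
popt, pcov = curve_fit(model, data, f_obs, p0=np.ones(8), maxfev=20000)
print(popt)
```

In practice a model like this needs a sensible starting guess, and the two terms can trade places (the fit is symmetric in them), so expect to experiment with p0 for your real data.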
very nice explanation. thank you very much
2 hours of class in 10min
thank you teacher
Tqvm madam
This was a good explanation, but I was kind of disappointed that the instructor didn't explain how she got the equations used to find the least squares fit.
She is mighty smaht - Chuckie Sullivan of Good Will Hunting
I really like the least squares video from you.
In the challenge problem, how do we ensure that the error is not being maximized, given that we are trying to minimize the error?
I don't understand how 13a + 5b = 14 and 5a + 3b = 6 lead to a = 6/7 and b = 4/7. Can someone please explain the steps at the end that reach that conclusion?
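For anyone stuck on that last step, here it is worked out: multiply 13a + 5b = 14 by 5 to get 65a + 25b = 70, and multiply 5a + 3b = 6 by 13 to get 65a + 39b = 78. Subtracting the first from the second gives 14b = 8, so b = 8/14 = 4/7. Then 5a + 3·(4/7) = 6 gives 5a = 30/7, so a = 6/7.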
Awesome!!! Thanks a lot
Thank you, that was very helpful.
Mind Blown
Did anyone do the challenge? I got a=1, b= -2, and c=1. Can anyone confirm the answers?
Thank you very much!
It was very helpful thank you so much
nice video!
very useful video.
You are awesome!! Thank you!! You should be cloned and put into every college and school in the world.
That would be, no. Differential?
I appreciate your line of thought, and the way you put it forth. But what would your opinion be on patent policies and rights? If I were to agree with what you say, it would mean no one would ever have a legal right to anything. In fact, as you said and claimed earlier, someone would somehow at least stumble on the thought of inventing all the existing concepts and their applications, and that would mean Rutherford, J.J. Thomson, Bohr and Einstein were just there by accident?
Holy shit. Now I get it!
Very good video
Nice lecture....thanks a lot....:))
very nice!!
It would have been much better if the instructor had given due credit to the inventor/discoverer, Carl Gauss. After all, without him there would never have been a technique such as least squares.
That's clearly wrong! Many inventions and discoveries in history were made almost simultaneously and independently. It would've been a mere matter of time until someone else found it.
And don't get me wrong, Gauss was one of the greatest geniuses ever!
got lost at 3:20
Question... Isn't Σxi squared just 5 squared, i.e. 5x5 = 25? Why do you have 13? And isn't Σxiyi just 5x6 = 30? Why do you have 14? Just started statistics and I can't see it... help! Stewy
Σxiyi is not equal to Σxi * Σyi.
It's equal to the sum of each xi multiplied by its corresponding yi.
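To make that concrete, assuming the video's three data points are (0, 1), (2, 1) and (3, 4) (which matches the sums used in the question): Σxi·yi = 0·1 + 2·1 + 3·4 = 14, not Σxi · Σyi = 5·6 = 30. The same goes for Σxi²: square each xi first and then add, 0² + 2² + 3² = 13, which is not (Σxi)² = 5² = 25.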
thanks
Wow, very illustrative, I wish you were my lecturer. Thanks, but how should we send you the assignment mentioned at the end of the lecture?
well explained!!!!!!!!
great
I have fallen in love.
wow great
Thank you prof! Are you Elon's sister...😂
There is no mathematical proof so far that proves this method is the best way, but I have a simple proof that this method is the best way in this field.
Can you provide the proof? I am thinking that we must reduce the error, so we have that formula for the MSE; can you explain yours?
Not even close to being correct; why is this video still up here...
Hi sir/ma'am, I am doing research on image processing, so I want to know about the local mean estimator using the LMS algorithm. In this topic I want to find the mean of (noise + image) and after that I want to find the output. Can you help me? I have a paper with me named "Two-Dimensional Local Mean Estimator Adaptive Filter Using LMS Algorithm".
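The LMS (least mean squares) algorithm is the iterative cousin of the least squares fit in this video: instead of solving the normal equations in one shot, it nudges the filter weights a little after every sample. A minimal 1D sketch, assuming NumPy (a 2D local-mean image filter would apply the same update per pixel neighborhood, which is beyond this toy example):

```python
import numpy as np

def lms_filter(x, d, n_taps=5, mu=0.01):
    """Adapt FIR weights w so the filter output tracks the desired signal d (least mean squares)."""
    w = np.zeros(n_taps)
    y = np.zeros_like(d)
    for k in range(n_taps, len(x)):
        window = x[k - n_taps:k]      # the most recent n_taps input samples
        y[k] = w @ window             # current filter output
        e = d[k] - y[k]               # error against the desired signal
        w = w + 2 * mu * e * window   # LMS update: a gradient step on the squared error e^2
    return y, w

# Toy usage: adapt towards a clean sinusoid given a noisy copy of it.
rng = np.random.default_rng(3)
clean = np.sin(np.linspace(0.0, 20.0, 500))
noisy = clean + 0.3 * rng.normal(size=clean.size)
output, weights = lms_filter(noisy, clean)
```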