Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
Everyone should buy this book if you want to learn machine learning. It is the greatest 20 bucks that I have ever spent in my entire life.
@@weixiangzhao561 BAM! Thank you very much! :)
This video isn't available in Nepal. Can you please make it available sir? I really love your content. 🙏
@@dipenpandit684 I'm working on it. It's some strange thing with youtube and I've contacted them.
I love this book! I have taken Stats so many times but I still learned so much. Thank you for writing this book!
I have been enrolled in a graduate machine learning course for about a month now and you have just demystified so many details around Linear Regression. Please do more ML videos! They are so clear and helpful. If you can, please do one on Regularization and Decision Forests.
You have no idea how much I appreciate the clarity and simplicity of this explanation, you deserve a medal
Thank you!
You deserve more subscribers; your videos are so intuitive that even a high school student can understand them. Please keep uploading videos, keep going. I really think you're going to get more subscribers in the future.
You were right
Can’t agree more
Definitely! I'm glad that as of March 2024, he's got over one million subs (1.13 to be exact)!
After weeks of research and frustration, I have finally understood the concept of least squares so well! You explained the concepts so simply and logically!! Thank you so much for this amazing video. Much appreciated.
Thank you very much!!! :)
I actually took machine learning as the elective subject for my final year of engineering, and I pretty much guess that this channel's going to teach me everything!
Hooray!!! :)
@@statquest your vids are so good I watch your videos for entertainment lol
How'd the machine learning class turn out?
I must admit sir, you are one of the best teachers I've ever had. Thank you for being so awesome!
Wow, thank you!
Thank you!! After somehow passing 2 PhD quantitative methods modules and not really understanding why we did any of it, your channel has finally cleared a lot of stuff up!
Great to hear!
How the hell did you pass
@@frankchen4229 do what you're told and don't ask why
@@maggiechen1141 huh. Maybe PhD isn't so bad.
@@maggiechen1141 best advice so far
English is not my first language, but I can clearly understand your explanation. Thank you, sir!
Thank you!
Your presentation style is way ahead of anything else on this platform. You don't have the inefficient habit of deriving everything from first principles but allow a holistic intuition to develop. Absolute magic!
Thank you! :)
I keep coming back for more! THANK YOU SO MUCH!!! This is more clearly explained than any other tutorial/video/in-class session I've ever listened to! You are the best!
Wow! Thank you very much!
There are so many videos on this subject where they say what the least squares method is and how to use it to find the line of best fit, but you are the only one who explained the concept behind this method. Thank you.
Thanks!
This is the best TH-cam channel ever, thank you Josh for all your work you doing awesome!!
Wow, thanks!
The quality of these videos seems to have improved greatly over the years, but the simplicity was always there. Amazing!
Some of the early videos are still the best.
I always wanted to know why the formula squares the distances and then takes the root instead of using the absolute value. You're the first one to explain this. Thank you!
Thanks!
So can you explain in your own words in short then? Thanks
Josh your songs and teaching are excellent, you are doing something no one else has done in my life: inspiring me to become a Data scientist as well as a composer
Wow! That is awesome! BAM! :)
I don't think I have ever clicked on the subscribe button that fast. Absolutely amazing
Awesome! :)
Me too
Josh, seriously, thanks for all these videos. My ML journey is smoother thanks to them.
Glad you like them!
Done, thanks!
We take the square to make all the errors positive (we want to find the total error of the points from the line).
We want to find the optimal values for a and b in the equation of a line that minimize the sum of squared errors. We can express the sum of squares as a function of a and b and take the derivative to optimize it. (5:45)
We find the slope that minimizes the error by finding the minimum of the multivariable function (the variables are a and b). (7:30)
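A minimal sketch of the idea in the notes above, using made-up data: setting the derivatives of the sum of squared errors to zero for a line y = a*x + b yields the familiar closed-form slope and intercept.

```python
# Closed-form least squares for y = a*x + b: the optimal a and b come
# from setting the partial derivatives of the sum of squared errors to zero.

def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope a = sum((x - mean_x)(y - mean_y)) / sum((x - mean_x)^2)
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x  # intercept
    return a, b

xs = [1, 2, 3, 4]
ys = [2, 4, 6, 8]  # made-up data lying exactly on y = 2x
a, b = fit_line(xs, ys)
print(a, b)  # 2.0 0.0
```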
Hello Josh, Thank you for sharing. Every time I sign in, I learn something new from you. Fantastic presentation . Stay blessed.
Thank you! You too!
Your way to explain these abstract concepts is simply AMAZING!!!! Thank you so much for these incredible videos!
Thank you very much! :)
I have watched so many videos and read so many articles, but this video explains the use of the sum of squared errors and why it's important. Thank you!!
Hooray! :)
It literally only took me 15 seconds to subscribe because of his unique 15 second intro
That's awesome! :)
@@statquest i think you should add fitting a curve to data in this playlist
Bro, I appreciate every single video that you've made. I just want to say thanks a ton; I don't even skip the ads in your videos. Love from Indonesia!
I appreciate that!
This is absolutely fantastic! I am so glad that I found this channel on TH-cam while doing my data science self-study. I now understand the concept of OLS, which had stressed me out for a week before I found this video. Big thanks!
I'm glad it was helpful! :)
Never been more excited than after watching this video. Truly intuitive and an amazing intro to linear regression.
Thank you! :)
What a teaching style... 200% what I was searching for...
Excellent work
Thank you very much! :)
I'm new to data science and you just nailed it... amazing explanation... after so many videos... I finally understood what the heck linear regression is... Thank you so much...
Hooray! :)
I've read a few undergrad texts and none of them actually explain the origins of the idea behind least squares, at least not in a simplified visual form; you're usually just shown the problem and slapped with the simple linear model and then the generalized version... This gives some insight into the motivation behind this technique. Thank you for donating your time to such an altruistic cause. You a real one!
Thank you! :)
OMG!! You really have the best way of explaining this concept, while most books just want to show off how good their English is.
Thanks! :)
Awesome. I am screaming with happiness. Thanks, StatQuest. The intuition you conveyed to us is priceless.
Thanks!
Your videos are helping me write my thesis because I don't have a stats background and went into a science heavy masters. this is just the best!!!!
Good luck! :)
This video made me stop crying from stress of barely understanding anything in my class. Thank you
Glad it helped!
Right?! This channel is so underrated, we have to change that!
@@bigvinweasel1050 Wow! Thank you!
@@statquest I'm serious! If you ever need anything done on Python for content - I would be more than happy to write it out as clearly and as elegantly as I can so you can use it for content.
I never understood why we needed to plot a line. Now I do. It is amazing
What a legend you are! No words to express my gratitude. You are a blessing to everyone wanting to learn these concepts! Wish you good health and loads of happiness. :)
Thank you!
This is the real teaching. Respect
AMAZING VIDEO. MAKES STATS AND MATHS MORE LOGICAL AND REAL
Thanks!
You have the best teaching skill in the universe.
Wow! Thanks!
StatQuest Team - Thank you so much for all your efforts. For the last few months, I felt like that mouse stuck in a wheel going round and round with concepts as I got deeper into ML. A definite recommendation to everyone and anyone irrespective of their ML proficiency.
Awesome! Good luck with your ML studies! :)
Thanks very much for this. I watched it last year when I was looking to change careers. Rewatching now that I'm enrolled in some real training. And wow!
BAM! Good luck with your course.
Oh my God this is just the best video I've ever seen about Linear Regression! Thank you very much! I subscribed just after the video, please do not stop!
Awesome! Thank you!
maths is so crazy, the explanation was amazing - the people who figure this stuff out are geniuses
bam! :)
This is the best tutorial channel ever!!
Thank you very much! :)
I think that the Blue Lake in New Zealand has some competition for ~most clarity~. This channel is amazing!!
BAM! :)
Josh. I've seen this explanation for nearly 40 years, always re-watching for someone to explain the approach as well as the instructor who introduced it to me in 1980... yours is the best so far.
But you got away with some simplification by starting with a zero-slope line, and the calculation of the line's value was sort of 'lost' when you leapt to non-zero-slope lines...
Jussayin.
Cheers
DocV
Glad it was helpful!
The best teaching video I have ever seen. What a great work!
Thank you! :)
superb explanation, crystal clear with graphics is good way to comprehend. Thank you for this.
Hooray!
I've taken Andrew's course and bought books, but that intro and vibe is the real intro to ML; I learned that the hard way. Thanks for your songs.
:)
@@statquest hey man, I'd like to request you to kindly make a video on how to become an ML engineer from scratch! I am a self taught aspirant. please make the roadmap for people like us
@@supriyamanna715 To be honest, you could just start at the top of this and work your way down, through the webinars: statquest.org/video-index/
Best and simplest explanation ever !!
Glad you liked it!
This is an amazing video on the intuition behind Fitting lines to data. Loved this video, it gave me a recap on some of the concepts I've learnt years ago and have forgotten.
Hooray! :)
Yes this is exactly what I've been looking for. Great video. It has made my life a lot easier.
Thanks!
I just "discovered" by myself that lines generated by linear regression always pass through the point (M(X), M(Y)) (which, when you think about it, is quite intuitive), and thus we can add the constraint
b = M(Y) - a*M(X), which allows us to solve the equation for just a single variable (the slope) instead of two (slope and intercept). That's for sure basic, but I'm nevertheless proud of myself :D Great channel BTW.
bam!
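A quick numerical check of the property the comment above describes, using made-up data: because the intercept is b = mean(y) - a*mean(x), the fitted line evaluated at mean(x) always returns mean(y).

```python
# Verify that the least-squares line passes through (mean(x), mean(y)).

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx  # slope, intercept

xs = [1.0, 2.0, 4.0, 7.0]   # made-up data
ys = [3.0, 5.0, 4.0, 10.0]
a, b = fit_line(xs, ys)
mx, my = sum(xs) / 4, sum(ys) / 4
# The fitted line evaluated at mean(x) equals mean(y):
print(abs((a * mx + b) - my) < 1e-12)  # True
```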
The slope explanation gives a good intuitive sense of how to find the best-fit slope and y-intercept for least squares, especially if you have a background in calculus. In contrast, the linear algebra solution of OLS is just that much more shocking/amazing; that the same result can be calculated algebraically for n-space >without< any geometric intuition, without any search in the solution space for slope and y-intercept. Your visual explanation is more intuitive and memorable. The linear algebra approach feels more magical, as I find it harder to remember the derivation.
Noted
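The linear-algebra route mentioned above solves the normal equations (XᵀX)β = Xᵀy. For a single feature the design matrix has columns of ones and x values, so the 2x2 system can be solved by hand; this is a minimal sketch with made-up data.

```python
# Normal equations for simple linear regression: with design matrix
# X = [[1, x_i]], solving (X^T X) beta = X^T y gives [intercept, slope].

def ols_normal_equations(xs, ys):
    n = len(xs)
    sx = sum(xs)
    sxx = sum(x * x for x in xs)
    sy = sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    # 2x2 system: [[n, sx], [sx, sxx]] @ [b, a] = [sy, sxy]
    det = n * sxx - sx * sx
    b = (sxx * sy - sx * sxy) / det  # intercept
    a = (n * sxy - sx * sy) / det    # slope
    return a, b

print(ols_normal_equations([0, 1, 2], [1, 3, 5]))  # (2.0, 1.0), data on y = 2x + 1
```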
your method of teaching is awesome, thank you!
Thank you! :)
Thank you for the vid. It was so easy to understand the concept of least squares using visualization. I will use this as my reference for my demo teaching in stats. Hoping for more stat videos and data analysis tricks & tips...
Glad it was helpful!
Sir, your explanations are crystal clear. Thank you
Thank you! :)
I just wanted to say that you were born to teach.
The book? Are you kidding me? Perfection.
I would advise to give the first chapter as free content so everybody can have a taste of your abilities.
Favorite quote so far: "The Binomial Distribution makes me want to run away and hide. :) "
Thank you very much! That's a good idea. I wonder how I can do that.
Favorite new quote just dropped: "the Normal distribution is awesome, and, to be honest, it sort of looks like you..." lol
@@franciscoicarocs Haha! I'm glad you're enjoying it. I just started (in earnest) to write my next book on neural networks (from simple to state of the art)
I've seen many of your videos; they are amazing, good stuff!
I just wanted to point out something in this one: you calculate "b" for the first horizontal line, and then you start rotating it to find the best slope.
But you never explain WHERE THE ROTATION POINT IS! This is crucial!
I mean, to give the intuition that you can assume any intercept and then rotate using it as the rotation point, and it won't change the result.
Noted.
I actually LOVE your video style!
Thank you! :)
You are an incredible teacher.
Thank you! :)
Hello, you have done a wonderful explanation. I loved it.
And I am requesting you to do a video on the assumptions made in linear regression.
This will help us a lot.
BAM 😅😅😅🎉 I reached the bottom of the _STACK_ Quest (finally) 🎉❤ Wow 😮 I am on this quest to find out where this will stop, since I just learned that I must watch the video *Fitting a Line to Data,* also known as *Linear Regression,* before I can watch *Gradient Descent, Step-by-Step!!!* so that I can watch the video related to *Neural Networks part 2,* which I must watch before I can watch *The StatQuest Introduction To PyTorch...* before I can watch *Introduction to coding neural networks with PyTorch and Lightning* 🌩️ (it's something related to the cloud, I understand)
I am genuinely so happy to learn about this stuff with you, Josh ❤ I will go watch the other videos first and then I will back propagate to this first video... And now I feel like it took me 6 years to find this wonderful video. I can't wait to see it 😊
You finally made it!
Bro, you are a genius. It just sinks into the mind, the way you explain it. Kindly guide me to some YouTube channels which explain other math concepts, like calculus etc., in a similar way, or I request you to create them as well from the perspective of DS.
Thanks! Lots of other people like 3Blue1Brown for math.
Please do more of these, I think I will be able to pass my econometrics test thanks to you.
Good luck!
Simply explained...just makes it beautiful to watch! Thanks!! :-)
Thank you! :)
Great video on least square method.
Thanks!
Great video! It is admirable the effort you put on teaching...
Just a suggestion: It would be great if you could put the link of the complementary videos you describe in each video. That way, it is easier to keep on track.
Thanks for the tip. I generally try to do that (add the links in the description), but sometimes I forget. If you have time, it would be great if you could post which videos need links to complementary videos in a comment and then I'll take care of the rest.
Concise and precise, well done Sir.
Thank you! :)
Wowie kabawie wowie ZOWIE! I'm still in high school! And I understand most fundamentals in the video!😊
double bam!
I think another reason to take the squared error is that it creates an actual geometric square whose side is the error (area of the square, A = L*W). Add them all up to get a 2D representation of the error, whereas adding all the absolute error lines is only 1D. Sometimes shapes are easier to visualize. Amazing videos and pedagogy style. Props.
Noted.
Thanks for this content!!! I am very happy to understand these concepts watching this awesome explanation!
Glad it was helpful!
Hey josh, I absolutely love your explanations! It's given me a completely different perspective on how i think about machine learning and made the topics intuitive. I wonder if you can compile some notes for all of the content that can be reviewed in under a day and a mind-map that can be used to put all the pieces together. that would be really AWESOME!
Something like this? app.learney.me/maps/StatQuest
Hi, Josh. You are an amazing person. Your videos are very helpful to me. Your talent for explaining complicated things simply is magnificent! I hope you will go on and help a lot of people like me.
But I am a bit confused about some moments; I hope you can help me through this.
It's about the graph where we plot the sum of squared residuals against the different rotations.
If the derivative = 0, it means that our function is horizontal, doesn't it? In my head we just have the horizontal line as the optimal line, but it cannot be so. Please clear it up.
Thank you very much!
The point of rotating the line and showing different sums of the squared residuals was simply to help people understand the concept of the goal of finding the optimal line. However, in practice, we just take the derivative of the function and set it to 0 (just like you said). BAM! :)
@@statquest thank you very much! Such a quick response!
I am a bit in the middle between "simple explanation" and "complex explanation", so it confused me a bit). You are a great human being! Good luck)
@@АлексейШаков-ь4и Thank you! Now, even though we can solve for optimal line by setting the derivative = 0 and solving for the slope and intercept, a more general solution, that works in a lot of different situations, is called Gradient Descent. Gradient Descent is the backbone of Machine Learning and is used in this situation, as well as for Deep Learning and all that fancy stuff. For details on how Gradient Descent works, see: th-cam.com/video/sDv4f4s2SB8/w-d-xo.html
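The gradient descent approach described in the reply above can be sketched in a few lines. This is a minimal illustration with made-up data and a hand-picked learning rate, not how a production library would do it: both parameters are nudged opposite their partial derivatives of the sum of squared residuals until they settle at the optimum.

```python
# Gradient descent on SSR(a, b) = sum((y - (a*x + b))^2): repeatedly step
# the slope a and intercept b opposite their partial derivatives.

def gradient_descent(xs, ys, lr=0.01, steps=20000):
    a, b = 0.0, 0.0
    for _ in range(steps):
        # partial derivatives of SSR with respect to a and b
        da = sum(-2 * x * (y - (a * x + b)) for x, y in zip(xs, ys))
        db = sum(-2 * (y - (a * x + b)) for x, y in zip(xs, ys))
        a -= lr * da
        b -= lr * db
    return a, b

a, b = gradient_descent([1, 2, 3], [3, 5, 7])  # made-up data on y = 2x + 1
print(round(a, 4), round(b, 4))  # 2.0 1.0
```

For this tiny dataset the closed-form solution is of course faster; the point, as the reply says, is that the same loop works in many settings where no closed form exists.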
Thanks for the video. It was a great refresher.
Thanks!
awesome explanation, best ever explanation, made it look so easy.....
Thanks!
Great explanation. Thank you. God bless you!
Thank you!
The best explanation to date!
Thanks!
Your method of explanation is great. Please keep uploading tutorials. I would like to see tutorials about Deep Learning and Boosting (XGBoost, Catboost, etc.) algorithms which are popular lately. Thanks.
I already have videos on deep learning here: th-cam.com/video/CqOfi41LfDw/w-d-xo.html and XGBoost here: th-cam.com/video/OtD8wVaFm6E/w-d-xo.html All of my videos are organized here: app.learney.me/maps/StatQuest
This is a brilliant video. You're a few minutes from explaining how to derive the LR formula. I seem to recall from my high school days - that you work out two partial derivatives (dSum/da - treating 'b' as a constant and then dSum/db treating 'a' as a constant) and equate these to 0 (to find minimum). You should be left with a couple of equations for intercept and slope.
That's exactly right (and I show how at least part of this works in my video on the Chain Rule th-cam.com/video/wl1myxrtQHQ/w-d-xo.html ). However, what I don't like about the LR formula is that it only works in this specific situation. In contrast, Gradient Descent gives us pretty good parameter estimates and works in a million other situations. For more details on Gradient Descent, see: th-cam.com/video/sDv4f4s2SB8/w-d-xo.html
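Sketching the derivation the comment above recalls: with SSR(a, b) = Σᵢ (yᵢ - (a xᵢ + b))², setting both partial derivatives to zero gives two linear equations whose solution is the usual slope and intercept.

```latex
\frac{\partial\,\mathrm{SSR}}{\partial a}
  = -2\sum_i x_i\bigl(y_i - a x_i - b\bigr) = 0,
\qquad
\frac{\partial\,\mathrm{SSR}}{\partial b}
  = -2\sum_i \bigl(y_i - a x_i - b\bigr) = 0
\\[4pt]
\Rightarrow\quad
a = \frac{\sum_i (x_i-\bar x)(y_i-\bar y)}{\sum_i (x_i-\bar x)^2},
\qquad
b = \bar y - a\,\bar x
```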
Just the right amount of theory and math. You should consider teaching stat for students in health and biological studies.
Great video. I love how intuitive your videos are! 👍🏻
Thank you! :)
Really good explanation, thank you !
Thanks!
clearly explained! lots of thanks
Glad it was helpful!
thanks a lot for your useful content , ❤ from Iran
Hello Iran!!! Thank you! :)
Sorry for commenting several times throughout the series, but I would like to point out you should probably move this video + "Linear Regression, Clearly Explained!!!" before ROC/AUC in your ML playlist since you suggest understanding these basics beforehand.
Ok thanks!
Clearly explained indeed. Thank you.
Glad it was helpful!
Best explanation ever!
Wow, thanks!
Awesome Information.Thank you Sir
Thanks!
God bless you and these videos. These are so helpful
Thanks!
Hi Josh. Love your videos!! They give the best intuitive explanation. Can't thank you enough :).
Please make a video on curve fitting for linear equations using normal equations vs using gradient descent. Thank You!
My video on Gradient Descent compares that method to the analytical solution: th-cam.com/video/sDv4f4s2SB8/w-d-xo.html
Each time I hear your song, I just laugh. You brighten my day!
bam! :)
Now that's a good explanation.
Thanks!
Amazing clarity.... and I don't just mean the singing.
:)
Great Video! I believe correction needed at 8:09 , it should be "Taking the derivatives of SSR with respect to both slope and intercept ... "
Sure, I should have been a little more careful with my words there.
You have the best videos on the internet on Stat & data science.
Thank you very much! :)
This is so awesome. I finally get it! Have you written a book?
Yes, I have written a book and it should come out this spring. Subscribe to stay in the loop! :)
When a clip about stats starts of with singing that makes you laugh before you start it's a very good thing.
bam!
Wow ! this such good visuals ! Interesting!
Thanks!
10 videos in I have started dancing to the song at the beginning.
bam! :)
very interesting, i can see also the usage of optimization as a course
Thanks!