Hey, I'm a Brazilian student/software engineer studying Rec Systems and ML. Tons of articles, papers and videos did not do what you've just done. Now everything is crystal clear, thanks for the explanation.
OMG! Thank you! My power went out and I figured I would try to learn gradient descent on my phone. This is the first time it's made sense. All those experienced mathematicians suck at being teachers, making it sound all crazy complicated and shit. You sir are amazing.
My wifi just went out and instead of using it as an excuse, I am using my laggy phone to try and learn.
Going through Andrew Ng's Coursera course... got stuck on how the cost function derivatives/partial derivatives are obtained... 11:00 and on... Oh... MY... GOSH... this is GOLD!!! Thank you so much!
Same. Haven't done calculus since 1999, this helps a lot.
Same here. I was so confused.
@calluma8472 gosh, how old are you?
Spot on... This is a small bridge to Andrew Ng's Coursera course. Especially the explanation of how the chain rule and power rule come into the picture here really helps.
found my coursera classmate
After reading 20-30 articles and watching 20+ videos, I've finally watched the best video ever on gradient descent.
I'm falling in love with you.
Best explanation.
Just wanted to say that this is easily the best and clearest explanation of gradient descent I've come across, on the web and in the books I've read. Thank you, sir.
Coming here from Andrew Ng's ML course. Got confused with Gradient Descent. This is Gold. You explained Linear regression so well.
Usually math makes me cry, but while watching this I was learning and laughing at the same time. How cool is that? Lol. All thanks to you, bro! Keep up the good work. Cheers!!
Thank you so much. After failing my exam on machine learning, I was searching for videos on the topic of gradient descent. After watching so many videos, I landed on this page. By far this is the best video. You are simply a great teacher; because you understand the topic so well, you are able to explain it in a really simple way. Thanks a million!!
I had searched on this subject and watched several other videos before I could find this amazing video on the topic; I am more than happy that I am now able to explain this concept to anyone. It is so much clearer now, thank you sir!
I'm studying up for an interview to transfer to the Machine Learning department Wednesday. This is enormously helpful in providing an actual mathematical (not just conceptual) understanding of gradient descent. Thanks!
It was Teachers' Day yesterday here in India, and today I have found this amazing teacher. Thank you!
This is a clearer explanation than Professor Ng's explanation in his machine learning video series. Ng denotes m and b as theta1 and theta0, respectively. He also reverses the terms in his line equation, which confuses the hell out of everybody. In addition, he doesn't take you through how the partial derivative is worked out, and he doesn't show the code. A great explanation in only 22 minutes.
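For anyone mapping between the two notations, here is a quick side-by-side (my own note, assuming the video's y = mx + b line and Ng's usual hypothesis form):

\[
y = mx + b \quad \text{(video)} \qquad\qquad h_\theta(x) = \theta_0 + \theta_1 x \quad \text{(Ng)}
\]

So b corresponds to \(\theta_0\) (the intercept) and m to \(\theta_1\) (the slope); it is the same line, just with the two terms written in the opposite order.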
Thank you so much for this. One of the best explanations of gradient descent on YouTube. So far I'm loving your Intelligence and Learning series. Think I'm gonna binge watch the entire series now.
You nailed it, sir! I was confused when the partial dJ/dm was 2*Error at one moment and then suddenly had the partial dError/dm glued onto it, but your explanation at 19:15 cleared it up. Please keep making videos!
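For anyone stuck on the same step, that "glued on" factor is just the chain rule written out; a minimal sketch, assuming the video's setup where error = y - (mx + b) and the loss is the squared error:

\[
J = \text{error}^2, \qquad \text{error} = y - (mx + b)
\]
\[
\frac{\partial J}{\partial m} = \underbrace{2\,\text{error}}_{\text{power rule}} \cdot \underbrace{\frac{\partial\,\text{error}}{\partial m}}_{\text{chain rule}} = 2\,\text{error} \cdot (-x),
\qquad
\frac{\partial J}{\partial b} = 2\,\text{error} \cdot (-1)
\]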
I have seen some of your videos to get some concepts that I didn't get the first time in my ML class, and I'm truly convinced that these are the best tutorials about ML on YouTube. You make every concept so simple to understand and funny at the same time. Thanks a lot!!! Keep making this great content!!!
I searched for a video like this for a long time, and the only one I could clearly understand was yours. Thank you so much, and congrats on the explanation.
3Blue1Brown's was hard for me, your explanations are waaay better.
You did it fantastically! These are concepts that I already know well but find difficult to explain, so I'll recommend your videos when pure IT guys (and less educated audiences) ask me about the internals of the ML algorithms that I use.
you're a very good teacher, a bit crazy though lol
The great kind of crazy tho!
@Dennis4Videos yes!
I have never seen a tutor put so much emotion into a video like you do lol
Excellent channel! Thanks so much
Your videos are the only calculus and ML videos I can understand; you are the best! I just subbed. One small note: 'minimum' is singular and 'minima' is plural.
You went into the details and explained the concept. Loved your fun-filled way of teaching. Thank you!
This is the best gradient descent video I have ever seen! Great work!
You deserve 200 million subscribers... more than that, your personality is really great!!!
Spot on... This is a small (but very important) bridge to Andrew Ng's Coursera course. Especially the explanation of how the chain rule and power rule come into the picture here really helps.
Thanks for doing this. Within 20 minutes you made it clear, compared to the many hours of other people's videos and articles I have watched and read that were totally confusing. Keep doing this... you made me unafraid of all the math.
You are so engaging that you turn this boring math into something actually interesting. Thank you so much!
Amazing explanation, easy enough for a high school student to learn from. Amazing how simple you made this complex concept. You, sir, are a genius!
I was literally so frustrated with these things messing up my head... thank you, sir, for helping me survive... you are just fantastic🙏🙏
After watching this I finally figured out the calculus behind backpropagation. Thank you! BIG LIKE
Really awesome, what an elegant style of delivering the concepts; mind-boggling. I wish and dream I could work and study under his supervision. Moreover, the gestures, tone, and humor were outstanding; I'm speechless. :) I must say this is the best explanation of gradient descent I've seen so far. Thanks a lot.
Thank you!!! I’m a computer science graduate student and trying to understand gradient descent. This video is awesome, can’t wait to watch more of your videos.
I've been studying this subject for a couple of months in my final semester of college, and for some reason the connection between the loss function and the parabola just made it all click.
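That connection is worth spelling out; a small worked sketch of my own, assuming a single data point and the video's squared-error loss, with b held fixed:

\[
J(m) = \big(y - (mx + b)\big)^2 = x^2 m^2 - 2x(y - b)\,m + (y - b)^2
\]

Viewed as a function of m alone, this is an upward-opening parabola (the coefficient of \(m^2\) is \(x^2 \ge 0\)), and gradient descent simply slides m downhill along it until the slope \(\partial J / \partial m\) hits zero at the bottom.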
Was looking for an accessible explanation of gradient descent, and this was by far the best one I found--thanks!
Daniel, I have been following you since you had 2000 subs. I've always enjoyed your videos, man. I started learning deep learning on my own and got stuck on understanding gradient descent, and since I know it is the backbone of ML and DL, I want to know it deeply. I watched around 3 videos before this one, and your video just explains it beautifully. Thanks for this video, it helped me a lot. Please keep making these kinds of videos that explain the math behind ML and DL algorithms, and again, thank you for your videos. :) I'm gonna follow you more and more from now on. If it's possible, try to make an awesome course on Udemy with the math and programming of ML and DL. Thank you again.
I wish I had a teacher like you. You are amazing, sir. I think it doesn't matter what you are studying; your teacher has the power to make the concept easy or difficult.
And you are the one who makes everything extremely easy!
And yeah, you are damn funny.
Man, that's awesome. Being Russian, I understood every single thing. Keep it up!
I cannot believe this video aired just when I needed it, thank you so much!
I cannot describe how useful this was to me! Thank you!
A great YouTuber recommending another one! Although I know calculus from college, I think you did a great job explaining some of the rules. Keep it up, Daniel.
Superb breakdown of this often poorly explained concept! A+
Man, you are super!! I had a hard time understanding the mathematics of gradient descent and you made it very easy. Thank u
A beautiful mathematical explanation of Gradient Descent! Way to go man...
Thank you for everything you do. I'm a C++ guy, but your videos are very interesting.
Hey, can you tell me the best place to learn C++?
Plus, I would not jump into C++ as my first language; try learning an easier language and then start with C++. Also, you need to be sure what you're going to use C++ for.
Deitel and Deitel's book.
gibson If his goal is to only learn C++, then learning C first is unnecessary. I would even argue it is a big mistake.
I think you made it so simple. I was looking for a proper explanation of this formula. Liked. Subscribed.
Your videos are entertaining and informative at the same time. Love it!
This was the best tutorial on this subject that I've found, thank you for this too!
This video demystified everything from the previous one, thank you so much.
At first I thought this was BS; now I'm so thankful.
Excellent videos. I just went through the playlist and they explain the concepts really well. You sir are a hero!!
It's a great video. Simple and easy language is used to explain every concept. Great work!!
Amazing! The best explanation so far
Amazing explanation! One of the best explanations on all of YouTube, IMO.
Glad to hear, thank you!
@TheCodingTrain thanks for the explanation. I couldn't find the next video where you explained batch gradient descent.
OHH man.. you're one hell of a teacher... Loved it
It's awesome, I understood the concept of minimizing the error and jumped over here to comment.
Very, very, very helpful!!! I'm in grade 12 and was researching how exactly calculus could be applied to comp sci, and this was a life saver! I had no idea what I was doing before this XD Thanks!
So glad to hear it, thanks for the nice comment!
An excellent video. The best video on the internet for the gradient descent algorithm. Thank you so much :) Keep posting videos like this!
After watching a ton of videos I finally understand it thanks to you. Thank you so much!
It's crystal clear even for a person like me who lacks an understanding of calculus.
No worries, I understood this lecture and appreciate it. I studied Calculus 1 & 2 in high school.
That is one of the best explanations I've seen on YouTube. Thanks a lot!
One of the best video tutorials I've come across!!
Thank you for explaining this, and even better than what my professor did in multiple lectures.
This really helped me understand the MSE derivative. Great job!
It's interesting how different YouTube channels become different classes ^^ Khan Academy gives you all the calculus you could ever want if you're a beginner.
I love this.
"I tried again." Love this.
Simple and easy to understand. Thanks for sharing other important links.
Well done!!
One of the best explanations of GD
Nice one. I actually had a doubt about GD, but after watching your video I think I'm a bit clearer.
Dude, you are completely mad. But in the most noble sense of the word ;)! Fantastic way of explaining an actually quite complex piece of math. And it's very funny too ;). Congrats and hats off. You're an excellent educator.
Best explanation of gradient descent ever.
Take a bow!! That's the best explanation of gradient descent!!
So lucky that when I was in high school I did a lot of math exercises on derivatives, and I loved it.
You're an amazing teacher! I wish I had a Math teacher like you!
This is EXACTLY what I had been searching for this past week, pfft.. Thank you, sir ^_^
Wonderful way of teaching and just a fantastic video. Just loved it, man...!!!
You are awesome; at least you tried not to skip the mathematics, unlike most of them who just run away from the math.
Well, since I gave you a negative review on the calculus video, I feel I owe you one here.
I thought that was great! The only thing you glossed over was the fact that the cost function is actually a summation of all the errors of each x value. But, since the derivative of a sum is simply the sum of the derivatives, putting the computation of m and b inside the for loop works fine. (Seeing that in your code at first actually bothered me, but now I see that it's no problem--it's exactly what's needed.) I found it fascinating how simple everything turns out after going through all the calculus. And I think that was the important point. Nice job.
Thank you, I really appreciate it. I think there is still more room for improvement and, in a way, I'm just making these videos to help give myself the background for future videos. But I'm glad this one seems to be better received!
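To make the summation point above concrete, here is a rough Python sketch rather than the video's own code; the function name, variable names, and learning rate are all my own assumptions. Because the total cost is a sum over data points, its derivative is the sum of the per-point derivatives, so accumulating the gradient inside the loop and stepping once gives batch gradient descent:

learning_rate = 0.01

def gradient_step(m, b, xs, ys):
    # One batch gradient-descent step for y = m*x + b with squared-error loss.
    dm, db = 0.0, 0.0
    for x, y in zip(xs, ys):
        error = y - (m * x + b)      # per-point error
        dm += 2 * error * (-x)       # d(error^2)/dm for this point
        db += 2 * error * (-1)       # d(error^2)/db for this point
    # step against the gradient of the summed cost
    return m - learning_rate * dm, b - learning_rate * db

# Tiny made-up example: points on the line y = 2x + 1
m, b = 0.0, 0.0
xs, ys = [0.0, 0.5, 1.0], [1.0, 2.0, 3.0]
for _ in range(5000):
    m, b = gradient_step(m, b, xs, ys)
print(m, b)  # should approach slope 2 and intercept 1

Updating m and b inside the loop for every point, as the commenter describes the video's code doing, is the stochastic variant of the same idea: each per-point gradient is just one term of that sum.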
After 5:30 your expression is like👏👏👏👏👏😂😂😂😂😂
Very energetic presentation... loved it!
Thank you!
this video is fantastic, you are a very talented teacher
The best explanation of gradient descent! Thank you very much.
Thank you for making this look so simple!! You are an amazing teacher!!
Instant subscription, I adore the passion you have for what you do :)
Awesome explanation of the cost function, derivatives, and their usage in ML.
Thank you, sir. This teaching gained you a subscriber.
Thank u so much!!! This is just what I needed... U rock!!
Your aura is life saving👌
Wow! Thanks! It would be so fun for people who work with you. May Allah bless you.
Thank you, it's a very simple yet amazing explanation.
Great explanation thank you! I kept seeing the chain rule, which I understand, but no one was explaining explicitly which chain of functions we are using it on and that the loss function is at the end of the chain.
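To spell that chain out explicitly (a sketch assuming the video's single-point, straight-line setup): the parameters feed the prediction, the prediction feeds the error, and the error feeds the loss, so the chain rule multiplies one derivative per link:

\[
(m, b) \;\longrightarrow\; \hat{y} = mx + b \;\longrightarrow\; \text{error} = y - \hat{y} \;\longrightarrow\; J = \text{error}^2
\]
\[
\frac{\partial J}{\partial m}
= \frac{\partial J}{\partial\,\text{error}} \cdot \frac{\partial\,\text{error}}{\partial \hat{y}} \cdot \frac{\partial \hat{y}}{\partial m}
= 2\,\text{error} \cdot (-1) \cdot x
\]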
Superb, as always! Knowledge of the more advanced concepts/techniques, especially higher-level math and abstractions, is precisely what separates the boys from the men / the script kiddies from the software engineers. Fear it NOT! He plays the part so well, mostly as a great teaching strategy to reach as many people as possible and make them feel they aren't alone, even if he really doesn't like it. I'd like to think it's mostly an act anyway. And the Oscar goes to: Dan Shiffman! He does know his shiz, though! These vids have been priceless to me. Thanks! Can't wait for the rest of the Neural Networks series.
great video! Gradient descent makes more sense now
Awesome, you really gave a different way to look at GD. Would love to see more ML videos from you. Awesome work, bro!
I love your energy! Nice explanation btw
You make learning math fun.
Thank you so much. Finally I found an explanation that I could understand. Good job, Daniel :D
Dan, I have been a math tutor for 1.5 years. I know what a derivative is, yet gradient descent was still confusing for me; at least, if someone asked me, I had no idea how to explain it in less than 3 sentences. Your explanation is mind-blowing. I guess you will make a lot of universities go bankrupt. People will just open your channel instead.
Thank you so much!
Excellent explanation. Although I must point out that we travel in the direction of the negative of the gradient. So we multiply by -(learning rate)
Sir, can you explain please? I don't get it: why does this guy add (+) the gradient?
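A possible answer, sketched under the assumption that the video defines error = y - (mx + b): the update really is minus the learning rate times the gradient, but the gradient itself carries a minus sign from the chain rule, and the two cancel:

\[
\frac{\partial J}{\partial m} = 2\,\text{error} \cdot (-x) = -2\,\text{error}\,x
\]
\[
m \;\leftarrow\; m - \eta \frac{\partial J}{\partial m} = m + 2\eta\,\text{error}\,x
\]

(and similarly for b), so code that adds learning_rate * error * x is still stepping in the direction of the negative gradient.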
You did a great job explaining the material.