Join Shortform for awesome book guides and get 5 days of unlimited access! shortform.com/artem
Can you talk about liquid neural networks? I'm interested to know whether it's revolutionary work that deserves more recognition and a following.
arxiv.org/pdf/2006.04439.pdf
Artem, I want to thank you, not only for publishing excellent material (of the hundreds of DL/ML videos I've saved, yours is top 5 - really), but also for your great intonation, which helps A LOT in capturing attention in this day and age of constant distractions. THANK YOU 🙌
Back prop is a hard, heavy thing to explain, and this video does it extremely well. I mean, that section 'Computational Graph and Autodiff' might be the best explanation of that subject on the internet. I'm very impressed - well done!
You two are the best channels I have found in the SoME episodes. It's great to see this interaction between you guys.
Love your videos
If there's no mention of sine waves in neural networks, then it won't be complete.
Where is that section, 'Computational Graph and Autodiff'?
Yeah really helped me get the significance of autodiff
It's probably the best explanation of backward propagation. Hats off to your hard work; saving this very valuable content.
"Wait, It's all derivatives?"
"Always has been"
Great work pal. Provides excellent clarity.
Looking forward to the second part.
😂 Turns out back propagation isn’t just magic
Funnily enough, the calculus portion of the video is probably one of the best explained I've seen
Why would that be 'funnily enough'? What a diss lmao.
@@George70220 I don't think CuriousLad meant it as a diss. It's just that when Artem made the video, he presented the calculus section as background information. Partial derivatives and gradient descent weren't the main topic of the vid, yet you could show this to a Calculus I student and they would thank him for the explanation, even if they had no interest in learning backpropagation! That's why, funnily enough, while the intro calc topics weren't the main part of the video, that portion would be very helpful to anyone starting out in calc!
I don't agree. For example, the act of minimizing the loss function and gradient descent were not properly linked; they were just two pieces of unprocessed information dumped in series.
I found it unnecessary. Anyone who clicks on a video about back propagation likely already knows calculus, and if they don’t, that short primer is not going to be enough foundation for the rest of the content.
Nasdaq please buy toggle 0:25
This is by far the clearest explanation and simplification of backpropagation I have watched.
By far the best ML explanation I have seen on the internet.
It makes sense that you would cover both computational neuroscience AND machine learning since they both play a significant role in AI research. The sort of content you're making is definitely 3Blue1Brown level. Keep up the good work!
He also managed to squeeze an entire calc 1 course into this single video. It's amazing
This just might be the most underrated video on Back Propagation that I've ever seen! I hope more people come across this
The visuals in this video are from another planet. So good!!!!!!!!
This has to be one of the greatest explanations of the inner workings of learning in ML. I love it!
indeed
I knew that calculus is important for machine learning, but I never knew that 12th-grade derivatives were this important.
When you talked about the chain rule, it brought me back to my school days. I never thought that derivatives, integration, and probability would be used this way in the future.
Well-explained video.
Thanks for sharing this knowledge and conveying the process so simply.
Very True!
Remember when people said nobody needs higher dimensions except those stupid quantum scientists and nothing useful would come of it? yeah...^^
I've been trying to get into ML for quite a while now. This is by far the best explanation of gradient descent and back propagation hands down!!!
Amazing work!!!
I actually pictured this all in my head successfully where I thought I had everything in a canonical deep neural network figured out the other day. It’s one thing to hold it, it’s another to do the detailed, gritty work of explaining it in video format. Very well done.
Most comprehensive explanation EVER.
My opinion: better than 3b1b. No offence to 3b1b; he's great at it and one of the pioneers who did these kinds of visual explanations.
But I like your explanation, as it is slow-paced & comprehensive.
Yeah, 3b1b definitely deserves respect from me, but I think even he would recognize that this video is very carefully done.
I like that these people just care about truth and perfection, and, even with a little bit of envy, care about the best product being made.
That was an outstanding explanation. Your ability to explain higher mathematical concepts in such simple terms is really an amazing service to the rest of us who wanna understand these subjects but don’t have a mathematics degree. Thank you.
I've seen probably 20 videos on this and your explanation of the derivatives for someone not in calculus was really helpful. thanks.
I just made that in Python for a simple quadratic equation..... THANK YOU!!!! I just learned Python and machine learning!!!!!!!!!!
Using a desired y = 0, I could also find one solution of the equation... wow, I love this so much!!
The only difference is that I made x the weight rather than the coefficients, which I wanted to be fixed inputs.
What you helped me realise is that backpropagation can be embedded in any system that can be put into a computational graph like the one at 30:04, regardless of what it is. THANK YOU, I'm out of words.
Also, when the next loss after one iteration was bigger than or equal to the previous loss, I divided the learning rate by a factor of 2 or 10 for more accuracy, and when the next loss was smaller than the previous one, I multiplied the learning rate by a factor of 1.1 to 1.5 to speed up the process, thus getting results in hundreds or even thousands fewer generations/iterations and in less time!!!!!
I can use this to optimize my desired outputs in any system!!! JUST WOW!!
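A minimal sketch of what this commenter describes: gradient descent on a quadratic, treating x as the trainable weight, with their adaptive learning-rate heuristic. The specific function, constants, and names below are illustrative assumptions, not code from the video.

```python
# Gradient descent on a quadratic f(x) = a*x^2 + b*x + c, treating x as
# the "weight" and the coefficients as fixed inputs, with the adaptive
# learning-rate heuristic from the comment above.

def f(x, a=1.0, b=-3.0, c=2.0):
    return a * x**2 + b * x + c

def df_dx(x, a=1.0, b=-3.0):
    # Analytic derivative: df/dx = 2*a*x + b
    return 2 * a * x + b

def find_root(x=0.0, lr=0.1, steps=10_000, tol=1e-12):
    # Loss = f(x)^2; driving the loss to 0 drives f(x) to 0,
    # so x converges to a root of the quadratic.
    loss = f(x) ** 2
    for _ in range(steps):
        # Chain rule: d(loss)/dx = 2 * f(x) * f'(x)
        candidate = x - lr * 2 * f(x) * df_dx(x)
        new_loss = f(candidate) ** 2
        if new_loss >= loss:
            lr /= 2.0            # loss got worse: shrink the step
        else:
            lr *= 1.2            # loss improved: cautiously speed up
            x, loss = candidate, new_loss
        if loss < tol:
            break
    return x

print(find_root())  # converges to a root of x^2 - 3x + 2 (here, x ≈ 1)
```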
I just have to say this goes way beyond the quality of the many chain-rule videos I've seen so far. Good job, man; you've got some impressive skills to keep me watching a math video and taking notes past my usual bedtime.
you take notes?
@@marc_frank It's generally a good idea if you are trying to learn. Don't be passive if you want it to stick.
Taking notes, making sketches of the ideas, doing the math are excellent learning techniques. Old timers like me always do that 👍
Understanding something complex requires high intelligence. Explaining it simply requires even higher intelligence. You are one of the best teachers that I have encountered in my life! I'm grateful!
Damn, I was wondering where you'd been for over half a year, whilst I was stuck on backpropagation😂 and here you came back like a true mind reader. Glad to see you back❤
He was calculating your backward step so you can make your next forward step (sorry, couldn't resist) XD
@@highchiller He just gave you the right gradient so that you can optimize your loss function 😂
There could not have been a better explanation. Hats off to you
Guru of fundamentals. I can't resist subscribing to your channel and watching all of your videos. The way you explained the chain rule, and the logic behind it, is awesome. I am trying to visualize the quotient rule of derivatives in your way. A good teacher always makes you THINK 🙏
All these basic concepts, such as derivatives and the least-squares method, are what I'm learning in college right now. Watching these kinds of machine learning videos has made me understand the practical applications of these theoretical concepts a bit better 😌
Absolutely one of the best videos explaining data points and regression formulas I have ever seen. Amazing work
This is one of, if not the, best videos I've seen that thoroughly explains backpropagation. It will definitely help me better explain the algorithm to others, so thank you for creating it.
Dude, this is the most beautiful ML video I've ever seen. Highly informative, yes, but also beautifully made. Thank you for your work.
Excellent explanation!! You have done a selfless service to humanity.
I should be watching gameplays, but here I am procrastinating. Jokes aside, I'm 13 minutes into the video and astonished by your crystal-clear explanations and the quality of your material. This is gold.
This is a visual masterpiece! Well done!
Much of this was a review for me, as I took the time to go through all this last year. I implemented the MNIST handwritten-digit neural network and had to learn all the calculus covered here to work out the backpropagation math. You really do have to dig into it to get a good handle on it, but it's fun stuff.
As a student in this business, who has passed through a bunch of professors, I can say with confidence! With this trader, you will both learn and earn and, importantly, receive advice. Everything is competent and clear, without a bunch of any unnecessary movements! Keep up the good work!🤣
He is back! Greetings from Brazil, we've all been waiting for this release!
I have been doing ML research for a few years now, but somehow I was drawn to this video. I am glad to say that it did not disappoint! You have done an amazing job, putting things in perspective and showing respect to calculus where it is due. We forget how simple derivatives power all of ML. Thank you for the reminder!
Thank you! That’s really nice to hear!
This is the best educational video I've ever seen on the internet, explaining backpropagation with visualisation. Amazingly super 😊😊
The best and most understandable explanation I have ever seen. You explained the essential basis of Artificial Neural Networks so beautifully. I really congratulate you
In simple words: backpropagation is the method for computing the gradients that gradient descent uses. To minimize the loss, you need to find how the loss depends on each parameter, and you get that via the chain rule, which is based on derivatives and calculus. It works in the backward direction, hence the name backward propagation.
I still have quite a bit of confusion in my mind regarding this process.
The video is very useful and the editing is extraordinary.
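For anyone with the same lingering confusion, a tiny hand-worked sketch of the chain-rule idea behind backpropagation may help. The two composed functions below are made-up illustrations, not the network from the video.

```python
# Toy illustration of the chain rule behind backpropagation.
# Forward pass: y = g(f(x)) with f(x) = x**2 and g(u) = 3*u + 1.
x = 2.0
u = x ** 2           # f(x) = 4.0
y = 3 * u + 1        # g(u) = 13.0

# Backward pass: multiply local derivatives from output back to input.
dy_du = 3.0          # dg/du
du_dx = 2 * x        # df/dx = 4.0
dy_dx = dy_du * du_dx    # chain rule: 3 * 4 = 12

# Check against the analytic derivative of y = 3*x**2 + 1, i.e. 6*x.
assert dy_dx == 6 * x
print(dy_dx)  # 12.0
```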
This is the best resource I have found to learn backpropagation. The visualization of each concept made this very clear. I can't even imagine the amount of effort you have put into this video.
This is the best ever explanation I have seen. Thanks for taking the time and doing something extraordinary.
I just watched this video after completing my first Deep Learning lecture on Backpropagation and Gradient Descent.
Thanks, man! Appreciated. Really solid content.
Where were you all this time...? I've been trying to understand this concept for the past 2 years, and now it's clear after watching this video. Honestly, my maths was not up to the mark. After seeing your video, so many important concepts have become clear.
I don't have enough words to thank you.... God bless you.
I'll share your videos with my friends. Please keep it up.....🙏🙏🙏🙏🙏🙏🙏🙏🙏
What a great explanation and clarification, especially of all the mathematics required to understand the backprop algorithm. I appreciate this so much.
The best explanation about Deep Learning. Grateful.
31 years old now, had like 13 years of math in school and another 5 at university, and this is the first time I really understood how derivatives work, because of visualisation instead of "you calculate it this way and derive it that way, now memorize".
May I ask which university you went to?
@@ArnaudMEURET Sorry, but I don't believe that anybody who has no idea what the tangent line of a function at a point x looks like, and what it means (despite a million exercises teaching its meaning, and dozens of graphs in literally every workbook), could actually get through 5 years of a math-related subject. This guy is straight up lying or trolling.
@@WsciekleMleko exactly. People shit on school because "muh education system bad" and forget about all of the interesting stuff they actually teach.
No, you're not heroes trying to fight against the big bad. You're just lazy and want an excuse to keep being one.
Um, how long did it take you to graduate grade school? 13 years of math! Even if you did start learning maths in PK or K-12, it's basic mathematics until about grade 4, when you start building on that foundation with algebra, geometry, etc.
In other words, the maths ain't mathing!
So clear and concise! Thank you for creating this.
This has to be the best explanation of the chain rule ever! Thanks
The world needs more of you bro
I cannot tell you how excited this video got me once I realized I was understanding every single step effortlessly.😂😂😂
Thanks so much for the explanation.
God bless you!🙏🙏🙏🙏🙏🙏
I think I just found my favourite channel of all times.
I've been on YT since 2011 and never had a crush on a YT channel before today é.è
Finally, a solid explanation of backpropagation. Thank you!!
This is the most intuitive video I have ever come across. Amazing work!!!!!
What an amazing video. I hope one day they come up with some world prize for 'free education heroes'. 173k views for a video like this is simply disgusting. This guy deserves maybe 2 billion views. God damn it, that makes me mad.
Ehm, ya do realize this flies over the heads of most people; you'd have to stack thousands up to find one person who is interested and can understand this properly. It's also not really needed for a plumber or a bakery cashier to understand ML improvement/approach velocity, which is what I'd call this in a sense. Or, a visual way to pick a good method for it.
bro, I'm 2 minutes in and your graphics are insanely good. I can already tell this is going to be a treat. Holy smokes, man, I'm having a graphicgasm
It's very, very nice to see that you are uploading again.
Some people just want to see the world learning. Great Video Artem!
Best graphical experience with clear information. Really enjoyed it throughout the video!!!
Hands down the best explanation there is to backprop
Hands down the best explanation I have seen so far! So clear and easy to understand!!
This is incredibly well done and helped me visualize derivatives comprehensively. Thank you.
Man, you really nailed it, especially the Computational Graph and Autodiff part. I'd heard about them so many times in lectures from Stanford and others, but this was impressive.
So much effort in this video; the quality of the content is on the same level as 3B1B. Keep it going, man.
Magnificent work, from the beautiful, creative, elegant design, to the mastery in teaching. Thank you!
Best description on the topic on the internet!
This video would have saved me the many days I spent researching backpropagation 2 years ago.
That was fire, bro! Gonna have to rewatch it to understand the backward step, but it's a lot clearer than most videos.
simply the best presentation on the subject
This video has an amazing and easy-to-understand explanation of the basics of Calculus. Many Thanks to the Creator 🙏🏼
This video explains the mathematical basis of neural networks in a way I understood the first time, well enough to be able to explain it to somebody else. Thank you for that. I can't even imagine how much work you put into the animations. A masterpiece!
This video is an absolute masterpiece, congratulations
The best explanation of machine learning I have ever seen on YouTube. Amazing work, thank you 👍
One of the best visual explanations of the backpropagation algorithm I've seen! The animations are really good.
Sure that it was the back propagation algorithm?
Thanks!
Wow, amazing, thank you. I've read and watched many videos on this topic, and this is the one where I finally "got it".
This is the best youtube channel in my feed, and I have many.
This is the best video about this topic. Learned a lot of things. Took me 2 or more hours but I understand it now. Thank you!
Excellent explanation of back-propagation, the building block of machine learning. Thanks a lot.
Another gem of a video, well done Artem!! This channel deserves 1M+ subscribers; there's nothing else like it on YouTube.
So theoretical! When you actually implement this algorithm in C++, it becomes clearer how the variables should be adjusted to minimize the loss.
I wish the Chain Rule was explained in this manner when I was in university. I understood how to do it on paper just fine, but this explanation makes the reasoning behind it make complete sense.
Very well explained how backpropagation works and how the loss function helps in determining the optimal minimum using calculus. Great detail, which helps newbies like me understand this complex topic much better.
Amazing how you can explain it so well, so simply. You have a subscriber!
Glad to see an ML-related video from you! As you have a neuroscience background, I would love to see a video comparing current state-of-the-art ML architectures with some of the inner workings of the brain. For example, whether there are any structures in the brain with some resemblance to the GPT/transformer architecture. Even though the brain is light-years away, I think that could be interesting :)
Subscribed after just watching five mins.. 😊
Thanks
Great job.. my 12th-grade mathematics marks suggest I am good at differential calculus.. now you've made me realize it's true..
If you couldn't understand this explanation, this visualization, this clarity… then nothing else can work for you, I swear.
Shit, that was great. Backprop was a real pain in the ass. This is the channel that deserves the most subscribers in the world; it's even better than 3blue1brown.
Thanks, this is the best channel for these setups; everything works, I'm going to try it.
Brilliant video! The math, detailed visuals and explanation are excellent. Thank you.
Protect this guy at all cost please
27:27. It clicked here.
Seriously amazing video. Honestly, all your videos are.
Thank you so much.
Excellent explanation. I am going to rewatch this a few more times. Well done and thank you.
Very good video, very well explained. But there is one problem you didn't mention: when training very deep neural networks with a sigmoid or tanh activation function, backpropagation loses its "powers". The learning process becomes extremely slow and the results are suboptimal. One of many solutions is to use a ReLU or ELU function in the hidden layers instead of sigmoid or tanh; another is how we initialize the weights at the beginning, for example with He initialization...
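A rough numerical sketch of the vanishing-gradient effect and the He initialization this comment mentions; the depth, layer sizes, and distributions below are illustrative assumptions, not the setup from the video.

```python
# Why deep sigmoid networks learn slowly: the local derivative s*(1-s)
# of a sigmoid never exceeds 0.25, so a gradient backpropagated through
# many such layers shrinks geometrically.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

grad = 1.0
for _ in range(30):                  # 30 sigmoid layers
    s = sigmoid(rng.normal())
    grad *= s * (1.0 - s)            # multiply by the local derivative
print(f"{grad:.3e}")                 # vanishingly small, ~1e-20 or less

# He initialization, one common fix for ReLU networks: draw weights from
# N(0, 2 / fan_in) so the variance of activations stays roughly constant
# from layer to layer.
fan_in, fan_out = 256, 128
W = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))
```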
This video has amazing animations. You/your team clearly have very high attention to detail.
Your approach to trading is truly impressive. Thank you for teaching me so much!
Waiting patiently for the second video 🫰♥️. Much love from Kenya, thank you for making me understand back propagation. Started watching your channel because of Obsidian, stayed for the AI lessons 🫰.
As soon as I saw this video, I knew it was going to be the best of this kind on the Internet. And it was. Fantastic video!
Excellent explanation - I already understood this conceptually but this video gives a very good intuition for the repeated chain rule application
That's the most amazing way of explaining such hard-to-understand things.
criminally underrated
Wow, hats off to you! Can't even imagine how long it takes to make something like this