Who else think Luis Serrano is a genius teacher ? Wow !
Thank you. :)
You are welcome professor !
He explained quite complex problems in a very intuitive and easy-to-understand way
Amazing explanation
@@SerranoAcademy Really great lecture :) Thanks
"If you can't explain it simply, you don't understand it well enough..."
You proved it can be done...!
if you can't explain it with at least one bad pun, then you don't understand the concept of humour well enough!
Watched 7:30 mins and before i complete the rest of the video i felt an overwhelming need to tell you that you taught this concept in a brilliant manner
Lol I actually did the same and went straight to the comment section in the same minute
Brilliant. I love how you spell out the matrices that implement the rules of the neural network. Great job pulling back the curtain on the Wizard.
Also Mt Kilimanjerror!
And thanks for the shoutout :)
Thanks! Coming from you, this is very high praise, higher than Mt. Kilimanjerror! (actually, was between that one and Mt. Rainierror... maybe for the next error function) :)
Before reading this comment I was just about to say that it's a cool approach with matrices!
Thank you Luis. It's a rare talent, to explain things in such a clear and simple way.
I've been "just getting through" my machine learning class for the last 9 weeks and now after watching this video I finally feel like I understand these concepts!
All the other tutorials just explained NN as a black box. Your use of matrices for the explanation really helped strengthen the understanding! :)
Intelligence, simplicity, and didactics: three ingredients of a brilliant Machine Learning teacher!
I have looked at many videos and I rarely comment so my words carry a lot of weight. This is hands down the best tutorial I have seen yet for machine learning.
This is hands down one of the best tutorials I've ever seen on a Machine Learning topic. The quality and the ease of explanation with which the video was made and presented really helped me understand the scary concept of RNN in a very uncomplicated way. Thank you very much.
Just want to leave a comment so that more people could learn from your amazing videos! Many thanks for the wonderful and fun creation!!!
My hunt for clarity on RNNs ended with this video. I had read many Medium articles and watched the videos too; putting all of those together still can't reach this one. Thank you,
Luis Serrano
I really love your ability to convert extremely complex concepts into simple things by giving day to day life examples. Hats off to you!!!!
I have no words for this guy, what a legend! Thank you for being such a great teacher!
Watching this video for the first time exactly after 6 years.
simplest and amazing explanation thank you sir
Best video on the topic so far. It first explains the topic in an almost oversimplified manner, then gradually increases the complexity and the difficulty of the terminology. Perfect teaching style!
I have been through every single RNN video trying to understand it, and you are the only one that has explained it well. I am sick of abstract topics such as NNs being unapproachable because of teachers who don't know how to explain things with a tamer vocabulary and EXAMPLES. Lots and lots of examples.
You are the best. All of your videos are awesome. Even a 5-year-old can understand what you are saying. I respect your contribution !!
Sir, you have a talent for representing complicated things in the simplest manner.
The most intuitive introduction to RNNs that I've come across thus far! Thank you!
By far the clearest and most approachable intro to recurrent NNs I've come across!
17:50 is the most important figure of the whole video. The explanation was very good, simple and easy!
Real classic intro to a complicated topic. Love the smooth introduction. Just perfect.
This is easily the best RNN explanation on the internet.
I can't believe I got to learn this for free, thank you!
Fantastic!! By presenting simple neural network operations as matrix multiplications you have explained the basics of RNNs to me in a way that no one on YouTube was able to do! You're fantastic Luis💛
This is a kickass explanation of RNNs. You are a genius at teaching. Trust me, I am a student at one of the renowned institutes of the world, but I never got to hear such a simple and effective way of teaching.
Before I go any further, I really liked how you stated what Machine Learning does to us.
Genius!!!
Your method of teaching with all those images is really awesome
This is gold! How do you like a YT video more than once?
The errorrest and kilimanjerror pun was perfect!
Luis,
Great video as always ! I am in udacity machine learning nanodegree program and I love your teaching style. Please keep making videos you are making a big difference for people like us.
Thank you for your message, Avinash! Great to hear that you enjoy the program! Definitely, as time permits, I'll keep adding videos here. Cheers!
Seriously, thank you for putting it in such an easy way. You gave a great idea of how it works internally through matrices rather than nodes and edges, which are way too difficult to understand. Really, thank you for making it so easy.
One of the best videos for beginning DNNs. It sets our psyche properly for all the things to come in Deep Neural Networks.
"The Vector of the Chicken." I wonder how many times in the history of humanity that phrase has been uttered.
Dmitry Karpovich
hahaha, I wonder if it's the first time! :)
I feel like there is a joke or pun somewhere in there, but I cant find it...
a must video for everybody trying to understand RNN. Really appreciate your work to make basic concepts simpler for audience.
God! This one is a saviour. It changed my perspective towards NNs.
Luis Serrano, you have an incredible ability to represent tough concepts in such an interesting way
Hello Luis, please don't stop making these videos. Your NN series are awesome. I had to come back to comment on this. Thanks a lot man.
Dude you can teach this supposedly extremely advanced theory to a primary school kid with your brilliant way of explanation. Respect and thank you so much!
I wish I had teachers like you in UNI. Thank you!!!
Wow! This is the exact tutorial I've been looking for for ages: starting with examples and motivation, then moving into matrix multiplication. Now I think even a dummy like me can understand how an RNN works...
Your PyTorch Udacity courses helped me out, but without understanding the topic I went to YouTube and got help again. Thanks for the course. You explain so that even a child can understand.
Best explanation of RNNs I found on YouTube. Thanks a ton.
I am normally very lazy in commenting, but this guy made me do it. You Sir are awesome!!!
You are one of the best tutors here. You make complex things look damn easy. Thanks a lot for all your videos.
Your voice and teaching skills are both very soothing.
The best, easiest, and simplest explanation of RNNs.
Keep up the great work. Thanks!
Now I can practically teach my students about gradient descent; very intuitive lessons here. Thanks a lot.
This is the best and most easily understanding introduction I have ever heard. Fantastic!
I must say, what an excellent teacher you are. Really, I had been searching this topic again and again and was confused. Today I watched your video. Well done, it was clearly described. I am impressed; keep it up.
This tutorial has reinforced my understanding and let me see it in a new light. Superb explanation. Very, very clear. Thanks very much.
It really was an amazing video. It was really nice to see how such an esoteric topic was presented in really simple way. Keep it going dude!
Thank you so much, brother, you are a genius. You've enlightened my mind towards RNNs. I've watched plenty of videos trying to figure out what's going on, but your video gave me hope. Thank you so much.
Congrats Luis, what an awesome video! The concept of RNN was broken down to the bare minimum and the rest of the explanation stemmed from this simple principle, brilliant!
Of all the explanations I've seen on the internet so far, yours is by far the best. Unlike the conventional approach, you take care to clearly explain this concept, which at first is extremely abstract. A lot of complex material I couldn't claim to understand before, I now realize I'm starting to grasp. Thank you very much!
A very clear explanation. You did a lot of work to come out with really clear teaching. Thank you very much
Bingo ! My journey to understand RNN intuitively finally ends, thanks for this great video.
Whoa, this helps a lot. I watched a bunch of videos about this and I kept getting confused. Glad I found this video. Thank you!
Sir, this is really amazing. Loved this example in general because Neural Networks as a linear transformation in general sounds so cool!
Thank you for making this video! Most articles on RNNs didn't explicitly explain how two inputs were added to make a proper output
This is the best for beginners. You deserve more likes!
Congrats Luis! It explains quite complex problems in a very intuitive and easy-to-understand way.
Great explanation, sir, in very simple language. That's the sign of the best teacher.
This is the best video I have seen that explains complex things in a simple way.
3 years in AI engineering and I've never seen an interpretation of neural network like this. Amazing Sir!
Thank you very much for that prompt response, Luis!! You really have the knack for clarifying these fundamental issues. I understand clearly now the motivation for recurrent neural networks.
After watching 50 videos about RNNs, finally this one taught me the idea.
Thank you for demystifying the RNN. It is really beginner-friendly, thank you.
Luis! I love your NN series, but a question threw me off a bit. @18:06 when you add the inputs, how did you get [0,1,0,1,2,1] when the first node is 1+0 and the second node is 0+0? Shouldn't it equal [1,0,0,1,2,1], or is there some other input that I am missing? I mean, ultimately it's irrelevant because after the non-linear function it transforms into a 0, but I just want to make sure I'm not missing anything there, haha.
Dang! Yes you're totally right, that's a typo, it should be 1,0,0,1,2,1... Thank you!
And yes, also right that the non-linear function makes it 0 anyway, but yeah, I put the one in the wrong place.
Phew, okay cool. And thanks for the video! I'd yet to come across a simple explainer on how to write an LSTM RNN, and this did the trick for me. Keep up the great work!
Jabrils I appreciate your recommendation for this video ;)
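The arithmetic in the thread above is easy to check. A minimal sketch, assuming the non-linearity implied by the discussion is a step function that keeps only the entries reaching 2 (the exact function in the video may differ):

```python
def nonlinear(v):
    # Assumed non-linearity: an entry "fires" (becomes 1) only when it
    # reaches 2; every other entry is zeroed out.
    return [1 if x >= 2 else 0 for x in v]

typo      = [0, 1, 0, 1, 2, 1]  # vector as shown at 18:06 in the video
corrected = [1, 0, 0, 1, 2, 1]  # corrected vector from the reply

# Both collapse to the same output after the non-linearity,
# which is why the misplaced 1 turns out to be harmless:
print(nonlinear(typo))       # [0, 0, 0, 0, 1, 0]
print(nonlinear(corrected))  # [0, 0, 0, 0, 1, 0]
```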
I LOVED THE INTRODUCTION WITH YOUR PICTURES, exactly what happened to me.
Especially the mapping between the operations on matrices and the network of nodes helps visualize the topic. Great job, sir, indeed! Thank you
the best RNN tutorial period. Thanks
I am really enjoying learning from your Neural Networks playlist. Thank you so much for such amazing teaching and great quality content.
Amazing .. wonderful.. What a great teacher you are!! Lot of prep required to explain a complicated subject in few minutes with an easy example.
Best explanation on RNN I have seen so far. Thanks for doing this
This is just wow! Such a lucid explanation of RNN
Fantastic explanation through a tangible example and simple (and right) math principle evidence.
Very nice knowledge transfer, Luis! This is my first week of introduction to Neural Networks! I followed you completely until we got into the food matrix at the 13:25 minute mark, where I "tapped out." This (simple-man) explanation encourages me to stick with my learning, and I'm sure after watching this a few more times I will gain a better understanding. Great job!! Thank you. P.S. Bought your book ;-)
Thank you so much for the neural network series. Such simple explanations without the mathematics.
Bravo, I liked your simplistic illustration of RNNs and how you related that to matrices 👍
Best explanation I found for RNN.
This was an outstanding explanation of RNN... Thanks for making this :)
The best YouTube video I've ever seen
Thank you for the video, your explanation is clear as crystal
Excellent video. I understood it only after watching this video. Tried many earlier. Good Service
Great explanation!!! It really can't get simpler than this. I've watched most of the videos on the subject but this was the one that really made it clear. Thanks.
Simple! Lucid explanation! Clear methodology!
SUBSCRIBED !!!
Your way of explaining is very cool; it's like you open the person's mind and put the material inside. Please, we want more videos about deep learning applications like object tracking, and even videos on the programming languages used to build deep learning.
What an explanation, thank you so much for making it so understandable and interesting!
Hands down the best vid I have ever seen. Great job mate. Great job.
Thanks!
Thank you for making this video! It's allowed me to understand RNNs in terms of matrices much more clearly!
Love it! discovered that NN can be represented as a matrix.
Didn't know this as well! Is this always the case?
Indeed a great teacher. Loved your explanation.
Can someone help @14:00: How a sunny matrix [1,0] gets mapped to [111], and still be defined as a sunny matrix, when sunny matrix is defined as [1 0]? Or whichever are 111's let the sunny through?
My intuition is that the 6x3 weather matrix simply expands the sunny / rainy matrices. So the [1,0] sunny matrix becomes a [1,1,1,0,0,0] matrix, and the [0,1] rainy matrix becomes a [0,0,0,1,1,1] matrix. These dimensions allow the matrix addition in the merge step. Functionally, the top 3 values in the expanded sunny matrix being 1 mean that the '2' that appears in the matrix sum will always appear in the top half, i.e. the same food. When it is rainy, the bottom half is 1's, meaning the '2' will appear in the bottom half, and the next food in order will be made.
The graphic is slightly confusing, the [1,0] isn't mapped just to [1,1,1], but the entire 6x1 matrix
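The expand-and-merge intuition described above can be sketched in a few lines. All shapes and values here are assumptions reconstructed from the comment, not taken from the video (note: a 2-dim one-hot weather vector needs a 6x2 matrix, rather than the 6x3 mentioned, to reach 6 dimensions):

```python
def matvec(W, x):
    # Multiply a matrix (a list of rows) by a vector.
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

# Assumed 6x2 expansion matrix for the weather input:
W_weather = [
    [1, 0], [1, 0], [1, 0],  # top half lets "sunny" through
    [0, 1], [0, 1], [0, 1],  # bottom half lets "rainy" through
]

sunny = [1, 0]
rainy = [0, 1]
print(matvec(W_weather, sunny))  # [1, 1, 1, 0, 0, 0]
print(matvec(W_weather, rainy))  # [0, 0, 0, 1, 1, 1]

# Merge step with a hypothetical expanded 6-dim food vector: the 2 in
# the sum can only land in the half the weather vector switched on.
food = [1, 0, 0, 0, 1, 0]
merged = [w + f for w, f in zip(matvec(W_weather, sunny), food)]
print(merged)  # [2, 1, 1, 0, 1, 0] -> the 2 lands in the top (sunny) half
```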
Nice video for people to understand one-layer neural networks, multi-layer neural networks, and deep learning.
Thanks for your video, I did get a friendly introduction to RNNs :) It reminds me so much of Hidden Markov Models (HMMs for short); here, what he cooks is the hidden state and the weather is the observation in your diagram at 10:39 of this video. I guess I'll do some searching on how HMMs and RNNs are connected! Your comments are most welcome here!
4:01 shows how a NN can map simple inputs to specific outputs.
Uses the same notation as Stanford for the feed-forward step, Wx.
Weights matrix: each row holds the weights coming into a node in the hidden/output layer.
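The Wx convention mentioned above, sketched with hypothetical numbers (none of these weights come from the video):

```python
def feed_forward(W, x):
    # Wx convention: row i of W holds the weights coming INTO node i,
    # so node i's pre-activation is the dot product of row i with x.
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

# Hypothetical 3x2 weights matrix and 2-dim input:
W = [
    [1, 0],  # weights into node 1
    [0, 1],  # weights into node 2
    [1, 1],  # weights into node 3
]
x = [2, 3]
print(feed_forward(W, x))  # [2, 3, 5]
```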
Excellent! You make it simple and clear to understand. Awesome Luis Sir...
Very well explained in simple words. Thank you so much.