Old video but luckily youtube suggested this one to me. Watched other videos first and none succeeded at explaining this concept as effectively.
Absolutely marvellous lecture! Never heard a lecture of this quality and clarity with absolutely the right speed. Taken me five decades to understand these terms! Thank you!
I am enjoying your lectures, starting from the basics. Many things I failed to understand as a student finally make sense. You are a marvelous teacher. Thank you so much!
Straightforward explanation. Thank you! I don't know why all the other videos on this topic make things so overcomplicated.
Thank you very much for making this video. I don't have a statistics background, yet it was easy to comprehend. The pace of teaching was perfect, and there is a lot of clarity in explaining the concept.
Glad you enjoyed it!
hands down the best video on eigenvalues & -vectors on youtube
Best tutorial ever on Eigenvectors and Eigenvalues. Thanks
Some people are born to teach. Thanks a lot for this amazing series of videos about multivariate analysis.
I just wanted to say THANK YOU for making such a wonderful tutorial! I finally have a proper understanding of this topic.
This channel should have over 1 million subscribers. Its content is far better than 99% of the YouTube maths and ML content.
Truly feel this is the best explanation I’ve read or seen. I knew the details were lurking in plain sight. Thank you. This channel deserves millions of views.
I wish all YouTube channels taught as concisely and clearly as you do.
I Sub'd right after 30 seconds.
Congratulations you broke through my barrier.❤
Great. After wandering through hundreds of videos, I now know what an eigenvector and an eigenvalue are 👍
This is the most clear and simple demo I have seen. Thank you very much.
The best explanation I got from this channel regarding eigenvectors, eigenvalues, and orthogonal eigenvectors. Thanks a ton.
Thank you very much for this channel; it is like a treasure.
This is the best lecture on eigenvectors and eigenvalues. I now have a good basic understanding of what these are. Thank you so much sir
Excellent, the approach, the pace, the details!
I absolutely love this channel. The explanations are so clear that I understand every bit of this complex topic
This important topic has been eluding me for several years. But this video made it clear about the concept with appropriate examples. ❤❤❤❤
The best training on eigenvectors
You are an amazing teacher, thanks.
Great explanation about eigenvectors and eigenvalues! My favorite one! Thank you very much! :))))
You are amazing. All my life I've been fearful of this topic, but I totally get it now. What I really like is the fact that you use a very simple data table to show things in concrete terms, and you work through the example using that table. Thank you so very much!!
awesome teaching skill..thank you so much boss
Awesome explanation 👍 Eigenvector vs eigenvalue.
What an excellent description of things. Hats off to you.
Thank you!
Simple and easy video on eigenvalues and eigenvectors
Great explanation, really helpful for beginners..thank you so much!!
Thank you!
excellent series.
At 08:20, I wonder how you would normalize the eigenvector if there were three rows in the vector instead of two.
Just extend the equation with the third value: sum the three squared values and then take the square root of the sum.
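Not from the video, but a quick numpy sketch of that normalization step, using a hypothetical three-component vector:

```python
import numpy as np

# Hypothetical three-component eigenvector (example values, not from the video)
v = np.array([2.0, 3.0, 6.0])

# Sum the three squared values, then take the square root of the sum
length = np.sqrt(np.sum(v ** 2))  # sqrt(4 + 9 + 36) = 7.0

# Dividing by that length gives the normalized (unit-length) eigenvector
v_normalized = v / length
print(v_normalized)
```

The same pattern extends to a vector of any size: square each component, sum, take the square root, divide.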
Bro, great job, love the way you explain things. You might see this comment copied and pasted across a few of your other videos; I am just doing this for the algorithm.
Thank you!
Good explanation
very well explained
Thanks Man ❤❤❤
This explanation is super sweet, and I have seen it a couple of times on other sites and channels. But when I see applications of SVD, PCA, or face recognition, that intuition about "stretched and not rotated" or "it's just a multiple of the vector" gets lost somewhere. And what does that have to do with determinant = 0?
Yes, the math behind PCA is not easy. However, the purpose of this video is to introduce students to eigenvectors before they watch my videos about PCA. I would recommend that you watch my videos about multivariate statistics in order at my homepage:
www.tilestats.com/
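As a small numpy sketch of both ideas in the question (a hypothetical matrix, not the one from the video): an eigenvector is only stretched, so A @ v equals λ·v, and det(A − λI) = 0 is exactly the condition that allows such a nonzero v to exist.

```python
import numpy as np

# Hypothetical 2x2 matrix (example values)
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
lam = eigenvalues[0]     # one eigenvalue
v = eigenvectors[:, 0]   # its eigenvector

# "Just a multiple of the vector": A @ v is v stretched by lam, not rotated
print(A @ v)
print(lam * v)

# det(A - lam*I) = 0 is what makes a nonzero eigenvector possible
print(np.linalg.det(A - lam * np.eye(2)))
```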
Absolutely ❤❤❤❤
Splendid. Thank you.
Thank you!
Thank you
Thanks
Thanks!!
Hi, I was just confused about why an n×n matrix has n eigenvectors and eigenvalues.
It has to do with the degree of the polynomial in the calculations. For example, a two-by-two matrix results in a polynomial of degree 2, which can have a maximum of two roots. In the next video, I show how the polynomial is generated in the calculations:
th-cam.com/video/JtcNe--fsyA/w-d-xo.html
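A numpy sketch of that degree-2 idea, using a hypothetical 2×2 matrix (example values, not from the video): the characteristic polynomial det(A − λI) has degree 2, so it has at most two roots, which are the eigenvalues.

```python
import numpy as np

# Hypothetical 2x2 matrix
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])

# Coefficients of the characteristic polynomial det(A - lambda*I):
# lambda^2 - 7*lambda + 10, i.e. degree 2
coeffs = np.poly(A)

# Its roots are the eigenvalues; a degree-2 polynomial has at most two
eigenvalues = np.roots(coeffs)
print(coeffs)       # [1, -7, 10]
print(eigenvalues)  # the two roots, 5 and 2
```

For an n×n matrix the polynomial has degree n, hence up to n eigenvalues.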
How can I download the slides?
www.tilestats.com/shop/
great
100 likes for you
Thank you for that great video! I wonder if we can calculate eigenvalues and eigenvectors with a 3x2 matrix.
No, it has to be a square matrix.
@@tilestats Thank you for your response. So, can we not use canonical correlation analysis for an n-by-5 matrix?
Sure, because you compute the eigenvectors on the covariance matrix, which is a square matrix:
th-cam.com/video/2tUuyWTtPqM/w-d-xo.html
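A short numpy sketch of that point, with made-up data: the data matrix itself is n-by-5 (not square), but its covariance matrix is 5×5 and square, so the eigendecomposition is done on the covariance matrix.

```python
import numpy as np

# Hypothetical n-by-5 data matrix (30 observations, 5 variables)
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))

# The covariance matrix of the 5 variables is square (5x5) and symmetric
C = np.cov(X, rowvar=False)

# So eigenvalues/eigenvectors can be computed on it (eigh: symmetric matrix)
eigenvalues, eigenvectors = np.linalg.eigh(C)
print(C.shape)            # (5, 5)
print(eigenvalues.shape)  # (5,)
```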
@@tilestats Thank you!