I like the way you get into the "math behind" all data science concepts. Thank you!
I was waiting for exactly this topic after watching the two previous episodes. Thanks very much!
Why must this channel be so awesome.
Nice. A concise refresher is just what I needed.
Glad it was helpful!
Linear independence is necessary but not sufficient for orthonormality: the vectors also need to be unit length, and their pairwise dot products need to be zero. I noticed your choice of words in both this video and the prior one didn't adequately capture this nuance, and it's worth clarifying for your viewers. Thanks.
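A minimal numpy sketch of that distinction, using two made-up vectors chosen purely for illustration: they are linearly independent but neither unit length nor orthogonal, and Gram-Schmidt turns them into an orthonormal pair.

import numpy as np

# Linearly independent, but NOT orthonormal.
a = np.array([1.0, 1.0])
b = np.array([1.0, 2.0])
print(np.dot(a, b))       # 3.0 -> dot product is not zero
print(np.linalg.norm(a))  # ~1.414 -> not unit length

# Gram-Schmidt: remove the component of b along a's direction, then normalize.
u1 = a / np.linalg.norm(a)
w = b - np.dot(b, u1) * u1
u2 = w / np.linalg.norm(w)
print(np.dot(u1, u2))                          # ~0.0 -> orthogonal
print(np.linalg.norm(u1), np.linalg.norm(u2))  # 1.0 1.0 -> unit length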
I was dissatisfied with the previous video because it did not explain what u_i and v_i are. Now I'm very happy. Thanks a lot!
Great series of insights. Much appreciated.
Glad you like them!
Watching your videos makes me happy and sad at the same time. Happy because they clarify so many doubts, and sad because I get demotivated to start my own channel. Who would really need anything else after this?
I'm glad the video made you happy! Please, please start your own channel. People learn in very different ways; there isn't a single teaching style that works for everyone.
@ritvikmath Thanks for the encouragement. I sure will.
I think the correct definition of an orthonormal matrix is that the columns are orthogonal to each other and of unit length. That's how U^T U = I.
duh
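A quick numpy sketch of that point, using a random tall matrix with arbitrary sizes: with orthonormal columns you get U^T U = I, but U U^T = I only holds when U is square.

import numpy as np

# Build a 5x3 matrix U with orthonormal columns via QR factorization.
rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((5, 3)))

print(np.allclose(U.T @ U, np.eye(3)))  # True: columns are orthonormal
print(np.allclose(U @ U.T, np.eye(5)))  # False: U is not square, so U U^T is only a projector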
you are an absolute legend
Fucking hell. Not one other resource I found explained this like you did. Bless you and the work that you're doing!
Please do a video on cointegration and another one on Bayesian Vector Autoregression
Great job! A video on LSTMs for time series analysis would be cool too.
I thought orthogonal matrices have to be square. Could you please explain why you assumed U and V are m×p and p×n?
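For what it's worth, numpy implements both conventions, so the shapes are easy to compare. A sketch with arbitrary sizes m = 5, n = 3:

import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 3))

# Full SVD: U is m x m and V^T is n x n, both square orthogonal.
U_full, s, Vt_full = np.linalg.svd(M, full_matrices=True)
print(U_full.shape, Vt_full.shape)  # (5, 5) (3, 3)

# Reduced SVD: U is m x p with p = min(m, n); here U only has
# orthonormal columns, so it need not be square.
U_econ, s, Vt_econ = np.linalg.svd(M, full_matrices=False)
print(U_econ.shape, Vt_econ.shape)  # (5, 3) (3, 3)

# Both reconstruct M; the extra columns of the full U meet only zeros in Sigma.
print(np.allclose(U_econ @ np.diag(s) @ Vt_econ, M))  # True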
The nicest and most concise on YouTube!
That explanation was so good, omg. Thank you!
I thought U is m×m, V is n×n, and Σ is m×n.
2:25 important
2:52 important
4:26 important to notice that this is capital lambda (Λ), not A :)
Great video. However, I have one doubt: aren't orthonormal matrices always square? If so, shouldn't U and V be square matrices?
M Mᵀ being a correlation matrix? So is it correct to say that eigenvalues are singular values for a correlation matrix?
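Not exactly: the nonzero eigenvalues of M Mᵀ are the squares of M's singular values. For a symmetric positive semi-definite matrix like a correlation matrix, though, eigenvalues and singular values do coincide. A quick numpy check, with a random M chosen purely for illustration:

import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 3))

# Nonzero eigenvalues of M M^T are the SQUARES of M's singular values.
s = np.linalg.svd(M, compute_uv=False)
eig = np.linalg.eigvalsh(M @ M.T)  # eigvalsh: for symmetric matrices, ascending
print(np.sort(s**2))               # matches the three largest eigenvalues below
print(np.sort(eig)[-3:])

# For a correlation matrix (symmetric PSD), eigenvalues ARE the singular values.
C = np.corrcoef(rng.standard_normal((3, 50)))  # 3x3 correlation matrix
print(np.sort(np.linalg.eigvalsh(C)))
print(np.sort(np.linalg.svd(C, compute_uv=False)))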
Great explanation!
holy shit that was amazing, thank you
That's it? OMG 😮Awesome !
A fascinating explanation!
Can anyone explain how to get eigenvectors from SVD?
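One route, sketched in numpy with a random matrix: the columns of U are eigenvectors of M Mᵀ, and the columns of V are eigenvectors of Mᵀ M, each with eigenvalue s_i squared.

import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 3))
U, s, Vt = np.linalg.svd(M, full_matrices=False)

# Column u_i of U is an eigenvector of M M^T, and row v_i of V^T
# (i.e. column i of V) is an eigenvector of M^T M, with eigenvalue s_i^2.
for i in range(len(s)):
    u_i, v_i = U[:, i], Vt[i, :]
    print(np.allclose(M @ M.T @ u_i, s[i]**2 * u_i))  # True
    print(np.allclose(M.T @ M @ v_i, s[i]**2 * v_i))  # True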
Hey man have you been deleting/privating some of your videos? Just wondering.. I had saved many in my watch later but don’t see them anymore. Thanks bro!
Mindblown!
Holy crap it's connected
Thankkkk you so much!
Wow, that easy?
I'm the first one!