How in the world there are only 13 comments but 142 likes on this video is even more mind-blowing to me! But the video is amazing! I cannot stress this enough, I can only say thank you! You are the chosen one!
You're the best teacher I've seen on YouTube insofar as PCA is concerned. How do I do the analysis in R?
I was searching for detailed math with an example for PCA, and I was not disappointed. Keep up the good work.
Really like this video, thanks a lot! And I feel lucky to have found it. As a statistics student, the material sometimes gets too interwoven with many proofs. However, this is such a good video that its overview of the ideas gives me a clear mind. Beautiful PowerPoints as well!
Thank you very much! Excellent explanation! Great approach to use both analytical and graphical ways of representation! Really surprising that there are relatively few subscribers...
This video is mind-blowing. Everything is explained so well.
Thank You so much for such an excellent lecture series!
Thanks for providing answers to my questions related to PCA.
Thank you!
Please provide a link when you refer to your previous video at 06:04. Overall, I liked the way you explain difficult concepts so easily.
Hello, I have recently started working on a PCR project. I am stuck at a point and could really use some help.
Thanks a lot in advance.
I am working in Python. We have created a PCA instance using PCA(0.85) and transformed the input data.
We have run a regression on the principal components explaining 85 percent of the variance (say N components), so we now have a regression equation in terms of the N PCs. We have taken this equation and tried to express it in terms of the original variables.
Now, in order to QC the coefficients in terms of the original variables, we took the N components (85% variance), reconstructed the data from them, and ran a regression on this reconstructed data, hoping it would give the same coefficients and intercept as in the regression equation derived above.
The issue is that the coefficients do not match when we take N components, but when we take all the components the coefficients and intercept match exactly.
Also, the R-squared value and the predictions from the two equations are exactly the same, even though the coefficients do not match.
I am so confused right now as to why this is happening. I might be missing something about the concept of PCA. Any help is greatly appreciated. Thank you!
It sounds like you are trying to do principal component regression. I have a video on that:
th-cam.com/video/SWfucxnOF8c/w-d-xo.html
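For anyone reading this thread later, here is a minimal Python sketch of the workflow described in the question, assuming scikit-learn's PCA and LinearRegression (the data, the 85% threshold, and all variable names are made up for illustration). It also hints at why the coefficients can differ while the predictions and R-squared stay identical: the reconstructed predictors are rank-deficient, so the coefficient vector is not unique.

```python
# Minimal sketch of the PCR workflow described above, assuming scikit-learn
# (PCA, LinearRegression) and NumPy. Data and names are illustrative only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
base = rng.normal(size=(100, 2))                     # two underlying factors
X = base @ rng.normal(size=(2, 5)) + 0.1 * rng.normal(size=(100, 5))
y = X[:, 0] - 2 * X[:, 3] + rng.normal(size=100)

Xs = StandardScaler().fit_transform(X)               # standardize first
pca = PCA(0.85)                                      # keep PCs explaining 85% of variance
scores = pca.fit_transform(Xs)                       # the N retained components

reg_pc = LinearRegression().fit(scores, y)           # regression on the N PCs

# Express the PC coefficients in terms of the (standardized) original
# variables; the intercept absorbs the mean shift.
beta_original = pca.components_.T @ reg_pc.coef_

# Reconstruct the data from only the N components and refit.
X_back = pca.inverse_transform(scores)
reg_back = LinearRegression().fit(X_back, y)

# Both fits live in the same N-dimensional subspace, so the predictions
# (and R-squared) agree. But X_back has rank N < 5, so its coefficient
# vector is not uniquely determined: different solvers or pipelines can
# report different coefficients that all give the same fitted values.
print(np.allclose(reg_pc.predict(scores), reg_back.predict(X_back)))
print(beta_original)
print(reg_back.coef_)
```

The upshot for the question above: non-matching coefficients are expected whenever N is smaller than the number of original variables; matching predictions and R-squared are the meaningful check.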
I really like this video, well explained. Please, which software can I use to compute eigenvalues and eigenvectors?
Thank you! I would recommend R
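In R, eigen() on the covariance or correlation matrix does this step directly. For anyone working in Python instead, here is a purely illustrative NumPy equivalent (the small data matrix is made up):

```python
# Purely illustrative NumPy version of the eigen decomposition step in PCA
# (the reply above recommends R, where eigen(cor(X)) does the same job).
import numpy as np

X = np.array([[2.0, 4.0], [3.0, 5.0], [4.0, 7.0], [5.0, 8.0]])   # made-up data
Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)                 # standardize

cov = np.cov(Xs, rowvar=False)                   # covariance of standardized data
eigenvalues, eigenvectors = np.linalg.eigh(cov)  # eigh is for symmetric matrices

order = np.argsort(eigenvalues)[::-1]            # sort largest eigenvalue first
print(eigenvalues[order])                        # variance captured by each PC
print(eigenvectors[:, order])                    # loadings, one column per PC
```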
Is there any source, like a video or article, on how to take the extracted PCA components and use them in machine learning?
Do you mean that you would like to extract components to use for classification? If so, I would recommend using LDA instead of PCA.
@tilestats Why LDA?
Because LDA maximizes the separation between the groups. Have a look at my LDA video where I show the difference to PCA.
th-cam.com/video/julEqA2ozcA/w-d-xo.html
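To make the suggestion concrete, here is a small scikit-learn sketch (the iris data and the two-component choice are just for illustration): PCA projects the data without looking at the class labels, while LDA uses the labels to maximize the separation between groups.

```python
# Hedged sketch: PCA vs. LDA as a dimensionality-reduction step before
# classification, using scikit-learn (dataset choice is illustrative).
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)

X_pca = PCA(n_components=2).fit_transform(X)                            # ignores labels
X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)  # uses labels

print(X_pca[:3])   # unsupervised projection
print(X_lda[:3])   # supervised projection, built to push the classes apart
```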
At 13:23, can you explain how the sum of the two small values is also between 80-90%?
Not sure I understand your question. If you add the variances of the first two PCs, you explain more than 80-90%, which means that it is enough to use just the first two PCs.
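A quick way to check that sum, assuming scikit-learn (the correlated toy data below are made up):

```python
# Illustrative check: sum the explained-variance ratios of the first two PCs.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
base = rng.normal(size=(50, 2))                      # two underlying factors
X = np.column_stack([base[:, 0],
                     base[:, 0] + 0.1 * rng.normal(size=50),
                     base[:, 1],
                     base[:, 1] + 0.1 * rng.normal(size=50)])

pca = PCA().fit(X)
print(pca.explained_variance_ratio_)                 # fraction of variance per PC
print(pca.explained_variance_ratio_[:2].sum())       # first two PCs together
```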
I would say that it is not because of the units... it is better explained by the scale or range of the magnitudes. You can have many variables with different units, but what if all of the data points go from 1 to 10? Would it be necessary to scale the data just because of the units? As your own video states, they need to be on the same scale.
In that specific case, you do not need to scale.
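To illustrate the point about range rather than units, here is a small scikit-learn sketch (the two made-up columns differ only in spread): the large-range column dominates the first PC unless the data are standardized.

```python
# Illustrative sketch: PCA with and without scaling when one column has a
# much larger range (assumes scikit-learn; data are made up).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
X = np.column_stack([rng.normal(0, 1, 100),       # small range
                     rng.normal(0, 1000, 100)])   # large range

pca_raw = PCA().fit(X)
pca_scaled = PCA().fit(StandardScaler().fit_transform(X))
print(pca_raw.explained_variance_ratio_)      # first PC dominated by the large-range column
print(pca_scaled.explained_variance_ratio_)   # roughly balanced after standardizing
```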