PCA: standardization and how to extract components

  • Published on Dec 2, 2024

Comments • 22

  • @jannroche
    @jannroche 2 years ago +8

    How in the world this video has only 13 comments alongside 142 likes is even more mind-blowing to me!!! But the video is amazing!!! I cannot stress this enough; I can only say thank you! You are the chosen one!

  • @baridomakemsi557
    @baridomakemsi557 1 day ago

    You're the best teacher I've seen on YouTube as far as PCA is concerned. How do I do the analysis in R?

  • @keerthanakalburgivenkatesh8111
    @keerthanakalburgivenkatesh8111 1 year ago +3

    Was searching for detailed math with an example for PCA, and I was not disappointed. Keep up the good work.

  • @JulieZhou-1017
    @JulieZhou-1017 3 months ago +1

    Really like this video, thanks a lot! And thanks to my luck for finding it. As a statistics student, the material sometimes gets too interwoven
    with many proofs. However, this video distills that knowledge so well that it leaves me clear-minded. Beautiful PowerPoint slides too!

  • @Sergei-ld1iv
    @Sergei-ld1iv 1 year ago +1

    Thank you very much!!! Excellent explanation! Great approach to use both analytical and graphical ways of representation!!! Really surprising that there are relatively so few subscribers...

  • @benjaminbrodeur8537
    @benjaminbrodeur8537 2 years ago +2

    This video is mind-blowing. Everything is explained so well.

  • @asima444
    @asima444 11 months ago

    Thank you so much for such an excellent lecture series!

  • @priyankaverma4053
    @priyankaverma4053 3 years ago

    Thanks for providing answers to my questions related to PCA.

  • @mrbilalkhan
    @mrbilalkhan 4 months ago

    Please provide a link when you refer to your previous video at 06:04. Overall, I liked the way you explain difficult concepts so easily.

  • @bommubhavana8794
    @bommubhavana8794 2 years ago +1

    Hello, I have just started working on a PCR project. I am stuck at a point and could really use some help...asap.
    Thanks a lot in advance.
    I am working in Python. We have created a PCA instance using PCA(0.85) and transformed the input data.
    We have run a regression on the principal components explaining 85 percent of the variance (say N components). Now we have a regression equation in terms of the N PCs. We have taken this equation and tried to express it in terms of the original variables.
    Now, in order to QC the coefficients in terms of the original variables, we took the N components (85% variance), derived the data back from them, and applied a regression on that data, hoping it would give the same coefficients and intercept as the equation derived above.
    The issue is that the coefficients do not match when we take N components, but when we take all the components, the coefficients and intercept match exactly.
    Also, the R-squared value and the predictions provided by the two equations are exactly the same even though the coefficients do not match.
    I am so confused right now as to why this is happening. I might be missing something about the concept of PCA. Any help is greatly appreciated. Thank you!

    • @tilestats
      @tilestats  2 years ago

      It sounds like you are trying to do principal component regression. I have a video on that
      th-cam.com/video/SWfucxnOF8c/w-d-xo.html
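      For what it's worth, a minimal sketch of why this can happen (made-up data; not the original poster's code or the video's): once only N < p components are kept, the reconstructed X has rank N, so a regression on it has infinitely many coefficient vectors that produce exactly the same fitted values, and the solver simply returns a different one than the back-transformed PC equation. With all p components the fit is full rank, so the coefficients are unique and match.

```python
# Illustrative sketch only (random data); all names and values here are assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))                       # 2 underlying factors
X = latent @ rng.normal(size=(2, 5)) + 0.3 * rng.normal(size=(200, 5))
X *= np.array([1.0, 10.0, 100.0, 5.0, 50.0])             # different scales/units
y = X @ np.array([0.5, 0.1, -0.02, 1.0, 0.03]) + rng.normal(size=200)

scaler = StandardScaler().fit(X)
Xs = scaler.transform(X)

pca = PCA(0.85).fit(Xs)                    # keep PCs explaining >= 85% variance
Z = pca.transform(Xs)                      # scores on the N retained PCs
reg_pc = LinearRegression().fit(Z, y)      # regression in PC space

# Express the PC-space equation in terms of the original (unscaled) variables
beta_std = pca.components_.T @ reg_pc.coef_
beta_orig = beta_std / scaler.scale_
intercept_orig = reg_pc.intercept_ - scaler.mean_ @ beta_orig

# "QC" step: rebuild the data from the N PCs and refit a regression on it
X_rec = scaler.inverse_transform(pca.inverse_transform(Z))   # rank-deficient when N < 5
reg_rec = LinearRegression().fit(X_rec, y)

# The fitted values are unique, so the predictions agree...
print(np.allclose(X @ beta_orig + intercept_orig, reg_rec.predict(X_rec)))
# ...but the coefficient vectors generally differ, because the rank-deficient
# X_rec admits many coefficient solutions that all give the same fit.
print(beta_orig)
print(reg_rec.coef_)
```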

  • @gheasandrinemawen5363
    @gheasandrinemawen5363 3 years ago

    I really like this video, well explained. Please, which software can I use to compute eigenvalues and eigenvectors?

    • @tilestats
      @tilestats  3 years ago

      Thank you! I would recommend R
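      For reference, the eigen-decomposition behind PCA takes only a couple of lines in most environments (in R, eigen(cor(X)) does it); a minimal sketch in Python with numpy on made-up toy data:

```python
import numpy as np

# Toy data: rows = observations, columns = variables (purely illustrative)
X = np.array([[2.0, 4.0, 1.0],
              [3.0, 6.0, 2.0],
              [5.0, 5.0, 3.0],
              [4.0, 7.0, 2.5],
              [6.0, 8.0, 3.5]])

R = np.corrcoef(X, rowvar=False)        # correlation matrix (standardized variables)
eigvals, eigvecs = np.linalg.eigh(R)    # eigh is intended for symmetric matrices
order = np.argsort(eigvals)[::-1]       # sort PCs from largest to smallest eigenvalue
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

print(eigvals)                  # variance explained by each PC
print(eigvals / eigvals.sum())  # proportion of total variance per PC
print(eigvecs)                  # columns are the eigenvectors (PC loadings)
```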

  • @kyleevalencia1827
    @kyleevalencia1827 2 years ago

    Is there any source, like a video or article, on how to implement these extracted PCA components and use them in machine learning?

    • @tilestats
      @tilestats  2 years ago

      Do you mean that you would like to extract components to use for classification? If so, I would recommend using LDA instead of PCA.

    • @kyleevalencia1827
      @kyleevalencia1827 2 years ago

      @@tilestats Why LDA?

    • @tilestats
      @tilestats  2 years ago

      Because LDA maximizes the separation between the groups. Have a look at my LDA video where I show the difference from PCA.
      th-cam.com/video/julEqA2ozcA/w-d-xo.html
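      To make the PCA-vs-LDA point concrete, a minimal sketch (using scikit-learn's bundled iris data, nothing from the video) of feeding extracted components into a classifier; LDA uses the class labels, so its components are chosen to separate the groups, which usually helps classification:

```python
# Illustrative sketch only; accuracy numbers will depend on the data used.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)

# Reduce to 2 components with PCA (unsupervised) or LDA (supervised), then classify
pca_clf = make_pipeline(StandardScaler(), PCA(n_components=2),
                        LogisticRegression(max_iter=1000))
lda_clf = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis(n_components=2),
                        LogisticRegression(max_iter=1000))

print(cross_val_score(pca_clf, X, y, cv=5).mean())   # accuracy with PCA features
print(cross_val_score(lda_clf, X, y, cv=5).mean())   # accuracy with LDA features
```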

  • @krish_krish354
    @krish_krish354 20 days ago

    At 13:23 in the video, can you explain how the sum of the two small values is also between 80-90%?

    • @tilestats
      @tilestats  20 days ago

      Not sure I understand your question. If you add the variances of the first two PCs you explain more than 80-90%, which means that it is enough to use just the first two PCs.
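      A quick way to check this on any dataset is the cumulative explained variance ratio; a minimal sketch with made-up data (not the video's example):

```python
# Illustrative sketch only (random data with two underlying factors).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
latent = rng.normal(size=(100, 2))                       # two hidden factors
X = latent @ rng.normal(size=(2, 6)) + 0.2 * rng.normal(size=(100, 6))

pca = PCA().fit(StandardScaler().fit_transform(X))
ratio = pca.explained_variance_ratio_

print(ratio)             # proportion of variance explained by each PC
print(ratio[:2].sum())   # PC1 + PC2 together; if this exceeds ~0.8-0.9,
                         # keeping just the first two components is enough
```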

  • @davidguardamino
    @davidguardamino 2 years ago

    I would say that it is not because of the unit... it is better explained by the scale or range of the magnitude. You can have many variables with different units, but what if all of the data points go from 1 to 10? Would it be necessary to scale the data just because of the units?... But as your own video states, they need to be on the same scale.

    • @tilestats
      @tilestats  2 years ago

      In that specific case, you do not need to scale.
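      The point about scale versus units can be seen directly: a minimal sketch (made-up data) of what PCA does when one variable has a much larger range than the others, with and without standardization:

```python
# Illustrative sketch only (random data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 3))
X[:, 0] *= 100                     # same kind of unit, but a much larger range

ratio_raw = PCA().fit(X).explained_variance_ratio_
ratio_std = PCA().fit(StandardScaler().fit_transform(X)).explained_variance_ratio_

print(ratio_raw)   # PC1 is dominated by the wide-range variable
print(ratio_std)   # variance spreads roughly evenly after standardizing
```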