The Unexpected Power of Orthogonal Matrices

  • Published Sep 30, 2024

Comments • 16

  • @stanislausstein3047 · 1 year ago +5

    I think you confused linear independence with orthonormality in the verbal definition.
    We say that two vectors are orthogonal if their inner product is zero. Linear independence doesn't suffice for this: for example, (1, 0)^T and (1, 1)^T are linearly independent, but
    (1, 0) · (1, 1)^T = 1,
    not 0.
    I love your content btw, just wanted to point that out.

    • @robharwood3538 · 1 year ago +3

      Was going to comment the same point. Linear independence just means that no vector in the set is a linear combination of the others. It doesn't mean the dot product is 0.

    • @ritvikmath · 1 year ago +4

      thanks for the correction! I'll put this note in the video description
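
A quick numerical check of the point made in this thread, as a minimal NumPy sketch (not from the video): (1, 0) and (1, 1) are linearly independent, yet their dot product is 1, so they are not orthogonal.

```python
import numpy as np

# The pair from the thread above: linearly independent but NOT orthogonal.
u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

# Linear independence: the matrix with u, v as columns has full rank (rank 2).
rank = np.linalg.matrix_rank(np.column_stack([u, v]))
print("rank:", rank)                 # 2 -> linearly independent

# Orthogonality would require a zero dot product, but it is 1.
print("dot product:", np.dot(u, v))  # 1.0 -> not orthogonal
```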

  • @muntedme203 · 1 year ago +3

    Unintended pun....sits squarely.....

  • @sand9282 · 1 year ago +1

    May I request a video explaining how L1 regularization creates a sparse matrix? I have already read a few articles on the internet, but I still couldn't convince myself that I fully understand the process. Your explanations of data science topics are consistently clear and concise, and I am eager to watch a video on this specific topic soon. Thank you for providing such valuable content on YouTube.
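
On the L1 question above: the video doesn't cover it, but one standard way to see why an L1 penalty produces exact zeros is the one-variable lasso problem min_w 0.5*(w - z)^2 + lam*|w|, whose solution is the soft-thresholding operator. A minimal NumPy sketch with made-up coefficient values, purely for illustration:

```python
import numpy as np

def soft_threshold(z, lam):
    # Solution of min_w 0.5*(w - z)**2 + lam*abs(w):
    # entries with |z| <= lam are set exactly to zero, the rest are shrunk by lam.
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

# Hypothetical unregularized coefficients (illustrative values only).
z = np.array([2.5, -0.3, 0.05, 1.2, -0.8, 0.01])
lam = 0.5

w_l1 = soft_threshold(z, lam)   # lasso-style update
w_l2 = z / (1.0 + lam)          # ridge-style update for penalty 0.5*lam*w**2

print("L1:", w_l1)  # small entries become exactly 0 -> sparse
print("L2:", w_l2)  # every entry shrunk toward 0, but none exactly 0
```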

  • @knok16 · 7 months ago

    2:46 Why does linear independence mean that the dot product is zero? For example, the vectors (1, 0) and (1, 1) are linearly independent, but their dot product is 1*1 + 0*1 = 1 ≠ 0. As I understand it, the dot product is zero for orthogonal vectors.
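
The earlier thread settles this: linear independence alone does not force a zero dot product; orthogonality does. As a side illustration, not from the video, one Gram-Schmidt step turns the linearly independent pair (1, 0), (1, 1) into an orthonormal pair; a minimal NumPy sketch:

```python
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])   # linearly independent of u, but u . v = 1, not 0

# One Gram-Schmidt step: remove from v its component along u, then normalize both.
q1 = u / np.linalg.norm(u)
q2 = v - np.dot(v, q1) * q1
q2 = q2 / np.linalg.norm(q2)

print(np.dot(q1, q2))                          # ~0.0 -> orthogonal
print(np.linalg.norm(q1), np.linalg.norm(q2))  # 1.0 1.0 -> orthonormal
```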

  • @walterreuther1779 · 1 year ago +1

    This took me ages (and a lot of pain) to understand in my Multivariate Statistics class. I wish this video had been around back then... Possibly I find this video so good because I have already learned the basics, but I think this was probably the most understandable explanation of Orthogonal Matrices and PCA I have ever heard.
    So... thanks! 😅

  • @miguelcampos867 · 11 months ago

    This channel is awesome. Subscribed

  • @Set_Get · 1 year ago +1

    Thank you for teaching/refreshing the algebra for us in relatively short sessions. If possible, please include a numerical example in these videos. I love the topics you choose.
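
Along the lines of the numerical-example request, and not taken from the video: a 2x2 rotation matrix is a concrete orthogonal matrix, so Q^T Q = I and multiplying by Q preserves lengths. A small NumPy sketch:

```python
import numpy as np

theta = np.pi / 6   # a 30-degree rotation
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# The columns of Q are orthonormal, so Q^T Q is the identity.
print(np.round(Q.T @ Q, 10))

# Multiplying by an orthogonal matrix preserves lengths.
x = np.array([3.0, 4.0])
print(np.linalg.norm(x), np.linalg.norm(Q @ x))   # both 5.0 (up to rounding)
```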

  • @awakerain9680 · 11 months ago

    Such a high-quality video! I was very uncomfortable studying these theories without knowing the underlying principles, and this video finally helped me clearly figure out how they were worked out! Thank you so much!

  • @ericxue1002 · 1 year ago

    Hi, great video! Could you talk about Gaussian processes in future videos? Thanks very much.

  • @manishbhanu2568 · 1 year ago

    @ritvikmath Hi, can you give more insight into the condition number of orthogonal matrices and how it helps when dealing with noise?
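
On the condition-number question, a fact not shown in the video itself: in the 2-norm, an orthogonal matrix has condition number exactly 1, so it does not amplify noise when you apply or invert it, while an ill-conditioned matrix can magnify small perturbations enormously. A rough NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# An orthogonal matrix (here Q from a QR factorization) has 2-norm condition number 1.
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))
print(np.linalg.cond(Q))    # ~1.0

# An ill-conditioned matrix, by contrast, has a huge condition number.
A = np.diag([1.0, 1.0, 1.0, 1.0, 1e-6])
print(np.linalg.cond(A))    # ~1e6

# Perturb the right-hand side of a linear system by a little noise and compare.
b = rng.standard_normal(5)
noise = 1e-6 * rng.standard_normal(5)
err_Q = np.linalg.norm(np.linalg.solve(Q, b + noise) - np.linalg.solve(Q, b))
err_A = np.linalg.norm(np.linalg.solve(A, b + noise) - np.linalg.solve(A, b))
print(err_Q, err_A)         # tiny for Q, roughly a million times larger for A
```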

  • @pipertripp · 1 year ago

    Nicely explained!

  • @MathOrient · 1 year ago

    Great video :)