Eigenvalues & Eigenvectors : Data Science Basics

  • Published 27 Jan 2025

Comments • 141

  • @romainthomas8238
    @romainthomas8238 3 years ago +16

    As a native French speaker, I understand your videos better than most of the French courses I've read or watched! Thanks a lot, you saved me a lot of time and desperation :D

    • @ritvikmath
      @ritvikmath  3 years ago +8

      Je suis heureux d'aider! (Sorry I only took 3 years of french in high school 😁)

  • @batuhantekmen6607
    @batuhantekmen6607 4 years ago +78

    Last 2-3 mins are invaluable. I knew eigenvalues and eigenvectors but didn't know where to use them. Thanks very much!

    • @wirelessboogie
      @wirelessboogie 10 months ago

      Same here, cheers to the author of the video!

  • @sepidehmalektaji3770
    @sepidehmalektaji3770 3 years ago +12

    And the prize for best eigenvector/eigenvalue explanation goes to.... this guy!... good job man... great video

  • @visheshmp
    @visheshmp 2 years ago +3

    You are amazing... anyone who is watching this video, please don't miss the last few minutes.

  • @tobydunbar1153
    @tobydunbar1153 4 years ago +22

    You have an easy, inclusive and coherent way of teaching. Great job!!👌

  • @maditea
    @maditea 4 years ago +117

    *me learning linear algebra for the first time even though i passed the class three or four years ago*

    • @EdeYOlorDSZs
      @EdeYOlorDSZs 3 years ago

      Oof

    • @jonathanokorie9857
      @jonathanokorie9857 3 years ago +1

      lol

    • @spiderjerusalem4009
      @spiderjerusalem4009 2 years ago +2

      hate it. Seems more theory than intuition, let alone a super rigorous, overrated book such as Axler's. All students would end up memorizing for the sake of getting through the complexity, since the intuition is nowhere to be found

  • @saraaltamirano
    @saraaltamirano 4 years ago +5

    I have searched and searched for an explanation like this one, took me months to finally found an explanation that anyone can understand. You are a talented teacher, thank you!

    • @marcowen1506
      @marcowen1506 4 years ago

      there's a really old maths book by Bostock and Chandler that has a good explanation of this too.

  • @paulntalo1425
    @paulntalo1425 4 years ago +2

    You have not just made my day but my career. I have been following you for two days and it seems you just keep cracking the rocket science. I'm doing an MSc in Data Science. Thank you so much

  • @anujasebastian8034
    @anujasebastian8034 3 years ago +1

    I've been learning eigenvalues and eigenvectors, solving a bunch of problems without even understanding what I was doing... Thanks a lot for that explanation!!!

  • @siddhantrai7529
    @siddhantrai7529 3 years ago +3

    The last 30 secs taught me more than the last 3 months. Thank you, sir. Your way of teaching is impeccable. I am absolutely stunned by the last-minute intuition. MIND = BLOWN

  • @nexus1226
    @nexus1226 5 years ago +29

    Just came across your channel .. Your videos are absolutely amazing! I'm in a multivariate analysis course, where I need to refresh my linear algebra skills, so these videos are really helpful.

  • @sirginirgin4808
    @sirginirgin4808 3 years ago +2

    Many thanks. You summarised an important chapter of linear algebra in less than 12 minutes.

  • @donalddavis8033
    @donalddavis8033 3 years ago +2

    This is way better than the explanation I had in my linear algebra course long ago!

  • @sirivilari6796
    @sirivilari6796 4 years ago +6

    Really did a good job man. Appreciate your time and valuable information

  • @saulflores5052
    @saulflores5052 4 years ago +3

    Thank you! I took linear over a year ago and your explanations clear up so many questions I had.

  • @theodoresweger4948
    @theodoresweger4948 1 year ago +1

    You not only explained the math operations, but also gave the insight into why we are doing it. Thanks for the enlightenment.

  • @ryansolomon2778
    @ryansolomon2778 4 years ago +3

    How this man has not blown up bigger than someone like blackpenredpen is beyond me. I am in Calculus II right now, and this video made perfect sense to me.

    • @ritvikmath
      @ritvikmath  4 years ago +1

      That's the goal :)

  • @TheDroidMate
    @TheDroidMate 4 years ago +11

    Just came to your vids by accident.. now I'm asking how I didn't get here earlier..!

  • @CaterpillarOGM
    @CaterpillarOGM 2 years ago +1

    It would be useful to be able to like these videos more than once to express how appreciated they are for a newbie! Thank you!

  • @komelmerchant6772
    @komelmerchant6772 4 years ago +2

    These are awesome videos! They really Intuitively connect theoretical concepts in linear algebra with application in ways that I was never explicitly taught! Keep up the great work!

  • @arsemabes
    @arsemabes 4 years ago +2

    this just made me so happy... THANK U!

  • @danieljulian4676
    @danieljulian4676 1 year ago

    Your presentation skills are top-notch. Since this is the first of your videos I've watched, I don't yet know whether you devote another video to other properties of eigenvectors. You stress the collinearity, but don't talk about the way the hypervolume of some set of vectors collapses. Maybe you do this in a video where you define the determinant. Maybe your mentioning the null space of the matrix covers this. At any rate, I'll say at this point that I'll probably find all your presentations worthwhile. Best wishes in growing your channel.

  • @yingchen8028
    @yingchen8028 4 years ago +1

    I love your videos .. super helpful not just to refresh the knowledge but also understand it in a more intuitive way!! Thank you so much !

  • @123gregery
    @123gregery 2 years ago +1

    You are good. I knew some linear algebra but I couldn't get the "feel" of it. Watching this Data Science Basics series has changed it.

  • @scottlivezey9479
    @scottlivezey9479 4 years ago +1

    I haven’t had a reason to dive into this kind of topic for over 20 years: only saw it during undergrad & grad school. But I enjoyed your technique of going through it.

  • @marcogelsomini7655
    @marcogelsomini7655 7 months ago

    11:35 useful to think about the same "ratio", thank you boss

  • @chathuraedirisuriya6535
    @chathuraedirisuriya6535 4 years ago +2

    Excellent explanation. Very useful in my mathematical modelling of infectious disease learning. Thank you

  • @blueis910
    @blueis910 3 years ago +1

    You're helping me a lot refreshing these concepts! So happy I found your channel!

    • @ritvikmath
      @ritvikmath  3 years ago

      Awesome! Thank you!

  • @user-xj4gg9jm3q
    @user-xj4gg9jm3q 2 years ago

    this cleared up so much about why we need eigenvectors, tysm!!

  • @user-or7ji5hv8y
    @user-or7ji5hv8y 3 years ago +2

    Wow, I don’t know why professors rarely provide motivation like this.

  • @sarfrazjaved330
    @sarfrazjaved330 3 years ago

    This man is a real gem.

  • @gello95
    @gello95 4 years ago +1

    I appreciate you making these videos as they have helped so much in understanding abstract machine learning/data science concepts! :) Cheers to you!

    • @sahil0094
      @sahil0094 3 years ago

      Cheers to me. I taught him

  • @Ivan-mp6ff
    @Ivan-mp6ff 8 months ago +1

    What an amazing eigenvector example: helping a fish find fish with the same "figure". Surely this has been used in dating apps 😂

  • @JasonBjörne89
    @JasonBjörne89 4 years ago

    Your videos are absolutely top-notch. Keep it up!

  • @NishantAroraarora007
    @NishantAroraarora007 4 years ago +1

    I have a quick question: at 1:01 you mention that lambda is a real number. Can't this be extended to complex numbers as well? Btw, thanks for your great work!
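
A quick NumPy sketch of this point (illustrative, not from the video): a real matrix can indeed have complex eigenvalues. A 90-degree rotation matrix preserves no real direction, so its eigenvalues come out as the complex pair +1j and -1j.

```python
import numpy as np

# A 90-degree rotation preserves no real direction, so its
# eigenvalues are complex even though the matrix is real.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
eigvals, eigvecs = np.linalg.eig(R)
print(eigvals)  # the complex pair +1j and -1j (order may vary)
```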

  • @ChristianLezcano-n2u
    @ChristianLezcano-n2u 1 year ago

    thank you, I learned faster and easier with your explanation, you rock!

  • @VictorOrdu
    @VictorOrdu 1 year ago

    Lovely! I have a question: if the scalar (eigenvalue) is negative, when multiplied is the vector's direction not changed by 180 degrees?

  • @MichaelGoldenberg
    @MichaelGoldenberg 3 years ago

    Very clean and clear presentation.

  • @minxxdia1132
    @minxxdia1132 4 years ago +1

    this video made it all look so so simple, thankyou very much!

  • @amjedbelgacem8218
    @amjedbelgacem8218 2 years ago

    You are literally a godsend and a savior. Machine learning is becoming clearer with every video of yours I watch, fam. Thank you!

  • @matthewchunk3689
    @matthewchunk3689 4 years ago +2

    2:51 Thank you. A teacher who melds big pictures with equations

  • @mustafizurrahman5699
    @mustafizurrahman5699 2 years ago

    Awesome splendid mesmerising

  • @chandrikasaha6301
    @chandrikasaha6301 10 months ago

    Which video to follow for the importance of invertibility?

  • @agamchug595
    @agamchug595 2 years ago

    This is amazing! Thank you so much for making these videos.

  • @CLL-mr3kz
    @CLL-mr3kz 19 days ago

    I'm taking CS229 and it helps a lot, thank you!

  • @rajgopalmanoharan
    @rajgopalmanoharan 1 year ago

    Thank you, really helpful, awesome explanation :)

  • @edphi
    @edphi 2 years ago

    Best video on eigen

  • @bocckoka
    @bocckoka 4 years ago

    I think it's important to point out what an operator can do to a vector (A*x) in general, and then point out that these eigen directions are special, because here the operator's effect is just scaling. And this is useful, because...

  • @alinazem6662
    @alinazem6662 1 year ago

    Killed it. Period.

  • @kewtomrao
    @kewtomrao 3 years ago

    How did you assume the shape of x was (2,1)?
    Was it because of the two eigenvalues?

  • @kosnik88
    @kosnik88 2 years ago

    really nice explanation thank you!!

  • @lilmoesk899
    @lilmoesk899 5 years ago +3

    great explanation. thanks.

  • @ernestanonde3218
    @ernestanonde3218 3 years ago

    just found you & I LOVE YOU

  • @SoreneSorene
    @SoreneSorene 3 years ago

    Hey, special thanks for that last application example 😊

  • @DataScience-s8q
    @DataScience-s8q 1 year ago

    WOW!! bless you man🌺

  • @AnDr3s0
    @AnDr3s0 4 years ago +1

    Thanks man! Really good explanation! Keep it up!

    • @ritvikmath
      @ritvikmath  4 years ago

      Thanks, will do!

  • @sanketannadate4407
    @sanketannadate4407 8 months ago

    God Tier Explanation

  • @sali6989
    @sali6989 2 years ago

    Thank you, I love this topic. It gives me a lot of the foundation for basic data science, particularly in machine learning.

  • @abhinavmishra9401
    @abhinavmishra9401 4 years ago +1

    This is amazing. You are amazing.

  • @EdeYOlorDSZs
    @EdeYOlorDSZs 3 years ago +1

    Funny how the mathematical understanding behind it is very important to grasp, yet we will never have to calculate the eigenvectors and eigenvalues by hand after university.

  • @MohamedMostafa-kg6gk
    @MohamedMostafa-kg6gk 3 years ago +1

    Thank you for this great explanation .

    • @ritvikmath
      @ritvikmath  3 years ago

      Glad it was helpful!

  • @danielwiczew
    @danielwiczew 4 years ago +1

    Great video!

  • @jairomejia616
    @jairomejia616 1 year ago +1

    Take the last section of the video knowing that "eigen" is the German word for "own", and you will never forget the importance of eigenvalues and eigenvectors.

  • @AG-dt7we
    @AG-dt7we 5 months ago

    How does this property of an eigenvector (staying on its own span even after transformation by A) help in problem solving (related to ML)?

  • @prakashb1278
    @prakashb1278 3 years ago

    This helps a lot!! Thank you so much

  • @karthikeya9803
    @karthikeya9803 4 years ago

    Explanation is awesome

  • @mcwulf25
    @mcwulf25 4 years ago +2

    Good explanation of the math. But after 40 years I still struggle with what eigenvalues really are. Your fish example was better than most I have heard, but I am still missing something vital.

    • @ritvikmath
      @ritvikmath  4 years ago +1

      It's definitely a tricky concept and I'm glad this video helped a little bit. Took me a long time to understand too. I think the easiest explanation is that an eigenvector is one where the matrix will map a vector to a multiple of itself (so that the input vector and the output vector both point in the same direction). Why does this matter? Because the same direction ensures the same ratios between each individual vector component which loosely means that the input and output vectors have the same proportions.
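
A minimal NumPy sketch of this reply (not from the video; the matrix here is chosen arbitrarily): the matrix maps an eigenvector to a scalar multiple of itself, so input and output are collinear and keep the same component ratios.

```python
import numpy as np

# A maps its eigenvector v to lam * v: the output points in the
# same direction as the input, just rescaled.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, eigvecs = np.linalg.eig(A)   # eigenvalues 3 and 1
v, lam = eigvecs[:, 0], eigvals[0]    # one eigenpair
print(np.allclose(A @ v, lam * v))    # True: A only rescales v
```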

    • @mcwulf25
      @mcwulf25 4 years ago

      @@ritvikmath That helps. I have come across it in PCA and in quantum mechanics. (Yes, I am eclectic).
      Another question: do eigenvalues HAVE to be real?

  • @brownsugar85
    @brownsugar85 4 years ago

    These videos are amazing!

  • @roastdchestnuts
    @roastdchestnuts 26 days ago

    why are there only 2 items in the x column vector?

  • @nandakumarcheiro
    @nandakumarcheiro 4 years ago

    On aerodrome screens, could the eigenvectors and eigenvalues for different landing planes be manipulated to avoid collision, with the graphics adjusted accordingly? That might have been a better explanation.

  • @andreo1030
    @andreo1030 4 years ago

    Thanks for your clear explanation

  • @luiswilbert2377
    @luiswilbert2377 2 years ago

    great job

  • @himanihasani
    @himanihasani 4 years ago

    great explanation!!

  • @theodoresweger4948
    @theodoresweger4948 4 years ago

    Thank you, very well explained, and I like the fish analogy.

  • @paulbrown5839
    @paulbrown5839 4 years ago

    Good video! Thanks.

  • @rezaerabbi2492
    @rezaerabbi2492 3 years ago

    Could you upload a video on Hat Matrix?

  • @unzamathematicstutormwanaumo
    @unzamathematicstutormwanaumo 11 months ago

    You are good 👍

  • @user-or7ji5hv8y
    @user-or7ji5hv8y 3 years ago

    Does matrix A have to be square?

  • @marcosfuenmayor563
    @marcosfuenmayor563 3 years ago

    amazing !!

  • @omkarsawant9267
    @omkarsawant9267 2 months ago

    Imagine squishing or stretching a balloon by pressing on it. Certain parts of the balloon will always expand in specific directions, even if the whole shape changes. That's like how eigenvectors and eigenvalues show up when transformations are applied.
    So, eigenvectors give us those "directions that don't change direction," and eigenvalues tell us how much things get stretched or shrunk along those directions!
    Simple example --
    Suppose we have a dataset with information on height and weight for a group of people. By calculating the eigenvectors and eigenvalues of this dataset's covariance matrix, we can:
    Find the direction in which height and weight are most correlated (an eigenvector showing the pattern).
    See how important this correlation is with an eigenvalue, which tells us if this pattern captures a big or small portion of the data's overall "shape."
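
The steps in this comment can be sketched in a few lines of NumPy. The height/weight numbers below are made up purely for illustration; for strongly correlated columns like these, the top eigenvalue captures almost all of the variance.

```python
import numpy as np

# Made-up height (cm) / weight (kg) data, just to illustrate.
X = np.array([[160.0, 55.0],
              [170.0, 65.0],
              [175.0, 72.0],
              [180.0, 80.0],
              [165.0, 60.0]])
Xc = X - X.mean(axis=0)              # center each column
cov = Xc.T @ Xc / (len(X) - 1)       # sample covariance matrix
eigvals, eigvecs = np.linalg.eig(cov)
top = np.argmax(eigvals)
print(eigvecs[:, top])               # direction of greatest joint variation
print(eigvals[top] / eigvals.sum())  # share of total variance it captures
```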

  • @ranjbar
    @ranjbar 2 years ago

    my man gave the fish a mohawk :))) thanks for the content though. much love

  • @Fat_Cat_Fly
    @Fat_Cat_Fly 4 years ago +1

    soooo good!

  • @reemalshanbari
    @reemalshanbari 4 years ago

    So are we always gonna use only one eigenvalue, am I right?

  • @ahnafislam6933
    @ahnafislam6933 7 days ago

    Please continue making data science related math videos

    • @ritvikmath
      @ritvikmath  7 days ago

      that's the plan !

  • @paingzinkyaw331
    @paingzinkyaw331 4 years ago +1

    I just subscribed!

  • @beaudjangles
    @beaudjangles 4 years ago

    Fantastic

  • @umehmoses8118
    @umehmoses8118 2 years ago

    Thank you

  • @sahirbansal7027
    @sahirbansal7027 4 years ago

    hey, can we subtract the mean of each column from that column to make it zero-mean before calculating the cov matrix? Also, in some textbooks it is divided by n-1 instead of n. Why is that? Thanks

    • @neel_in_germany
      @neel_in_germany 4 years ago

      I think because with (n-1) the estimator is unbiased...

    • @paingzinkyaw331
      @paingzinkyaw331 4 years ago

      It is because of the difference between "population" and "sample": when you estimate the variance from a sample, dividing by n-1 corrects the bias, so we use n-1 for better accuracy.
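
A quick check of the replies above (illustrative sketch; the data values are arbitrary): dividing by n-1 (Bessel's correction) gives the unbiased sample variance, and NumPy's `cov` uses n-1 by default.

```python
import numpy as np

# Compare dividing by n (population-style) with n-1 (sample estimate).
x = np.array([1.0, 2.0, 4.0, 7.0])
xc = x - x.mean()                    # center the data
var_n   = (xc @ xc) / len(x)         # divide by n   -> 5.25
var_nm1 = (xc @ xc) / (len(x) - 1)   # divide by n-1 -> 7.0
print(var_nm1, float(np.cov(x)))     # np.cov matches the n-1 version
```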

  • @neurite001
    @neurite001 4 years ago

    Talking about fishy vectors... 8:31

  • @adisimhaa65
    @adisimhaa65 4 months ago

    thank you soooo much

    • @ritvikmath
      @ritvikmath  4 months ago

      You're welcome!

  • @SNSaadu1999
    @SNSaadu1999 4 years ago

    does Ax = λx hold for all x?
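
No: the equality holds only along the eigen-directions. A small sketch (matrix chosen for clarity; its eigenvectors are the coordinate axes, with eigenvalues 2 and 3):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
v = np.array([1.0, 0.0])          # an eigenvector (eigenvalue 2)
x = np.array([1.0, 1.0])          # not an eigenvector
print(np.allclose(A @ v, 2 * v))  # True: v is only rescaled
Ax = A @ x
print(Ax)                         # [2. 3.]: the component ratios differ
                                  # from x's, so Ax != lambda * x for any lambda
```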

  • @djangoworldwide7925
    @djangoworldwide7925 1 year ago +1

    Evals and Evecs are everywhere in DS

  • @saadelmadafri8050
    @saadelmadafri8050 5 years ago +1

    great fish !

  • @robertpenoyer9998
    @robertpenoyer9998 4 years ago

    Math and engineering classes always seem to treat Ax = λx as an abstraction. I wish someone would say at the beginning of the discussion that Ax = λx means that an eigenvector is a vector that points in the same direction after it's been operated on by A.

    • @s25412
      @s25412 3 years ago

      they could point in a different direction though, if lambda is a negative number

    • @robertpenoyer9998
      @robertpenoyer9998 3 years ago

      @@s25412 My comment about direction was a generality. Of course, A might transform x so that it points in the opposite direction, but the eigenvector will point along the same line as it was pointing before being operated on by A. A scalar multiple of an eigenvector is also an eigenvector.
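
A tiny sketch of this exchange (matrix chosen arbitrarily for illustration): a negative eigenvalue flips the eigenvector, but the output still lies on the same line through the origin.

```python
import numpy as np

A = np.array([[-2.0, 0.0],
              [ 0.0, 1.0]])
v = np.array([1.0, 0.0])   # eigenvector with eigenvalue -2
print(A @ v)               # [-2.  0.]: opposite direction, same line as v
```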

    • @s25412
      @s25412 3 years ago

      @@robertpenoyer9998 Thank you

  • @尾﨑元恒-u8q
    @尾﨑元恒-u8q 4 years ago

    I've discovered a new theorem.

  • @urbansamurai_o4
    @urbansamurai_o4 11 days ago

    A very simply put idea of eigenvectors, but there's a significant point missed: how a transformation that changes a vector in direction and scale can be replaced by just scaling a vector (the eigenvector) by a value (the eigenvalue). I saw this explanation on Khan Academy's YT channel.
    th-cam.com/video/QOTjdgmNqlg/w-d-xo.htmlsi=bEgKfKdgnwz5Qy2X

  • @nicoleluo6692
    @nicoleluo6692 1 year ago

    🌹🌹🌹

  • @nicholasnelson1005
    @nicholasnelson1005 1 year ago

    Didn't really go over finding the eigenvector 😕 just solved the system of equations and left it there.

  • @mlaursen
    @mlaursen 7 months ago

    Why couldn’t you have been my teacher when I was studying eigenvectors. Sigh.

  • @premstein16
    @premstein16 5 years ago

    Hi, could you please do the computation for eigenvalue -2? Eager to know how to plug in x1 and x2.