Singular Value Decomposition (SVD): Matrix Approximation

  • Published 27 Sep 2024

Comments • 211

  • @greenpumpkin172
    @greenpumpkin172 4 years ago +209

    This channel is so underrated; your explanations and overall video presentation are really good!

    • @dombowombo3076
      @dombowombo3076 4 years ago +3

      Don't know why you think it's underrated...
      Everyone who is watching these videos knows how great they are.

  • @smilebig3884
    @smilebig3884 4 years ago +38

    The best thing about your lectures is that you do coding implementations along with the heavy math. That makes you different from the rest of the traditional instructors. Kudos to you!!!

    • @Eigensteve
      @Eigensteve  4 years ago +8

      It's my pleasure

  • @AdityaDiwakarVex
    @AdityaDiwakarVex 4 years ago +35

    SVD was at the very end of my college LinAlg class, so I never got a very good understanding of it before the final - this is truly amazing; you say "thank you" at the end of every video, but it should be us saying it to you - keep doing your thing! I'm loving it.

  • @ris2043
    @ris2043 4 years ago +47

    The best explanation of SVD. Your videos are excellent. Thank you very much!

  • @alexpujoldartmouth
    @alexpujoldartmouth 3 years ago +2

    You have a talent for taking complicated topics and breaking them down into digestible pieces. That's the sign of a good teacher. Thank you.

  • @omniscienceisdead8837
    @omniscienceisdead8837 2 years ago +3

    you explain math in such a way as to not make someone feel stupid, but feel like they're taking steps into understanding a larger concept, and the tools they need are the ones we already have, big ups

  • @douglasespindola5185
    @douglasespindola5185 2 years ago +3

    Gosh, what a class! As Mr. Ayush said, this was indeed by far the best SVD explanation I've seen. You've made such a complicated subject way more approachable! I wish you all the best, Steve! Greetings from Brazil!

    • @Eigensteve
      @Eigensteve  2 years ago +2

      Thanks so much! That is great to hear!!

  • @nathannguyen2041
    @nathannguyen2041 2 years ago +1

    This was, by far, the most comprehensible explanation of what the SVD is, mathematically and visually. The SVD is an incredible algorithm! Amazing how little you can keep and still understand the original system.

  • @skilambi
    @skilambi 4 years ago +3

    Please keep making these high quality lectures. They are some of the best I have seen on YouTube, and that goes a long way because I watch a lot of lectures online.

  • @rajkundaliya7796
    @rajkundaliya7796 2 years ago +1

    It doesn't get better than this. I am so thankful to you. I don't know how to repay this help.... And yes, this is a highly underrated channel

  • @AkshatJha
    @AkshatJha 1 year ago +1

    What a wonderful way to simplify a complicated topic such as SVD - I wish more people in academia emulated your way of teaching, Mr. Brunton.

  • @wackojacko1997
    @wackojacko1997 1 year ago +1

    Not an engineer/student, but I'm watching this to get a better understanding of PCA in statistics. I'm going to check the book and research this, but my only (nit-picky) complaint is trying to tell the difference between "M" and "N" when Steve speaks, which I know refer to the number of rows or columns of the matrix. But really, this was great, and I am thankful that this is something I can study on my own. Much appreciated.

  • @fabou7486
    @fabou7486 3 years ago +1

    One of the best channels I have ever followed, appreciate it so much!

  • @YYchen713
    @YYchen713 2 years ago +2

    Thank you for making linear algebra less boring and really connecting it to data science and machine learning; this series is so much more interpretable than what my professor explains

    • @PunmasterSTP
      @PunmasterSTP 1 year ago

      Hey I know it's been nine months but I just came across your comment and was curious. How'd the rest of your class go?

  • @peymanzirak5400
    @peymanzirak5400 2 years ago +1

    I find everything about these courses, even the way the board is arranged, just great. Many, many thanks for this wonderful explanation and all your effort to make it understandable and yet complete.

  • @ayushsaraswat866
    @ayushsaraswat866 4 years ago +132

    This series is by far the best explanation of SVD that I have seen.

  • @bnglr
    @bnglr 4 years ago

    every time I think it's time to pause and comment "awesome" on this video, it surprises me with another informative perspective. great job

  • @malekbaba7672
    @malekbaba7672 4 years ago +1

    The best explanation of SVD I have ever seen!

  • @NickKingIII
    @NickKingIII 4 years ago +4

    Wonderful explanation, clear and easy to understand. Thank you very much

  • @kansasmypie6466
    @kansasmypie6466 3 years ago +5

    Can you do a series on QR decomposition as well? This is so useful!

  • @nicholashawkins1017
    @nicholashawkins1017 2 years ago

    Lightbulbs are finally going off when it comes to SVD, can't thank you enough!

  • @FezanRafique
    @FezanRafique 3 years ago +1

    Steve is a magician of explanation.

  • @saitaro
    @saitaro 4 years ago +3

    It was a pleasure to watch. You should do more educational videos, Mr. Brunton.

  • @zepingluo694
    @zepingluo694 3 years ago +1

    Thank you for giving us such an amazing experience learning about SVD!

  • @MassimoMorelli
    @MassimoMorelli 3 years ago

    Extremely clear. Just want to point out a fact which at first did not seem obvious to me: the outer product has rank 1 because all the columns are proportional to the first vector of the outer product, hence they are linearly dependent.
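
    A quick numerical check of that rank-1 observation (a minimal sketch; the vectors here are arbitrary, not taken from the video):

    ```python
    import numpy as np

    u = np.array([1.0, 2.0, 3.0])    # arbitrary "column" vector
    v = np.array([4.0, 5.0])         # arbitrary "row" vector

    A = np.outer(u, v)               # 3x2: every column is a multiple of u
    print(A)
    # [[ 4.  5.]
    #  [ 8. 10.]
    #  [12. 15.]]
    print(np.linalg.matrix_rank(A))  # 1 -- the columns are linearly dependent
    ```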

  • @Nana-wu6fb
    @Nana-wu6fb 3 years ago

    Literally the best SVD explanation, so meaningful

  • @nikosips
    @nikosips 4 years ago +2

    Thank you very much for these videos; they are very explanatory. Keep up the good work, we need your lessons for our academic improvement.

  • @yasirsultani
    @yasirsultani 2 years ago

    These are the best videos out there. Biggest fan Steve, keep it up.

  • @bryan_hiebert
    @bryan_hiebert 1 year ago

    Thank you so much for posting the course material. I was asking ChatGPT some questions about eigenvectors/eigenvalues and revisiting some linear algebra when I stumbled upon transition probability matrices (Markov matrices), PCA, and SVD as I was getting back to my Data Science studies. This is very exciting stuff, and your presentation is very clear and understandable.

  • @mkhex87
    @mkhex87 2 years ago

    To the point. Answers all the important questions. I mean, you should come to the party knowing some lin alg, but it's great for the intermediate level

  • @wudiNB
    @wudiNB 1 year ago

    best teacher that I have ever met

  • @alwaysaditi2001
    @alwaysaditi2001 5 months ago

    Thank you so much for this easy to understand explanation. I was really struggling with the topic and this helped a lot. Thanks again 😊

    • @Eigensteve
      @Eigensteve  5 months ago

      Glad it was helpful!

  • @xiaoyu5181
    @xiaoyu5181 4 years ago

    This is also the best explanation of SVD I have seen! Thanks for sharing!

  • @kiaranr
    @kiaranr 2 years ago

    I've read about and even used SVD. But I never really understood it in this way. Thank you!

  • @neurochannels
    @neurochannels 1 year ago

    I never *really* appreciated SVD until I saw this video. Mind blown!

  • @Phi1618033
    @Phi1618033 1 year ago

    This all sounded like gibberish until I started to think of the first term of the expansion (sigma_1 u_1 v_1^T) as the (strongest) "signal" and the rest of the terms as ever-decreasing amounts of "signal" and ever-increasing amounts of "noise". So the last term (sigma_m u_m v_m^T) is essentially all background "noise" in the data. Thinking of it that way, it all makes perfect sense.
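
    This signal-plus-noise reading is easy to test numerically. A minimal sketch (the rank-1 "signal" and the noise level are made up for illustration):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    signal = np.outer(rng.standard_normal(100), rng.standard_normal(50))  # rank-1 "signal"
    noise = 0.01 * rng.standard_normal((100, 50))                         # background "noise"
    X = signal + noise

    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    print(s[:4])  # the first singular value dwarfs the rest, which sit at the noise floor

    X1 = s[0] * np.outer(U[:, 0], Vt[0, :])            # keep only the strongest term
    print(np.linalg.norm(X - X1) / np.linalg.norm(X))  # small relative error
    ```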

  • @HuadongWu
    @HuadongWu 4 years ago

    the best lecture on SVD I have ever seen!

  • @pilmo11
    @pilmo11 1 year ago

    super informative series on the SVD

  • @andrezabona3518
    @andrezabona3518 3 years ago +1

    What happens for m > n? (For example, what happens if my dataset is composed of 5000 images of 32x32?)

  • @sabarathinamsrinivasan8832
    @sabarathinamsrinivasan8832 2 months ago

    Very nice lecture and clearly understandable... Thanks Steve... 🤗

  • @RajeshSharma-bd5zo
    @RajeshSharma-bd5zo 3 years ago

    Such a cool concept of decomposition and brilliantly explained here. Big thumbs up!!

  • @prabalghosh2954
    @prabalghosh2954 2 years ago

    Very, very nice explanation and presentation. Thank you!

  • @tobyleung96
    @tobyleung96 3 years ago +1

    @14:52
    No Steve, thank YOU!

  • @garrettosborne4364
    @garrettosborne4364 3 years ago

    This is answering a lot of my questions on SVD.

  • @tusharnandy6711
    @tusharnandy6711 4 years ago

    Gentleman, you have done a very impressive job. I have just started exploring data science and have recently completed my college course in Linear Algebra. This was quite interesting.

  • @Streamoon
    @Streamoon 2 years ago

    Thank you Prof. Brunton, excellent explanation! Just came from MIT 18.06.

  • @mohiuddinshojib2647
    @mohiuddinshojib2647 3 years ago +1

    This is everything that I need. Thanks for the nice explanation.

    • @Eigensteve
      @Eigensteve  3 years ago

      You are welcome!

  • @patrickgilbert6170
    @patrickgilbert6170 3 years ago

    Great video. Should be required viewing for anybody learning the SVD!

  • @khim2970
    @khim2970 1 year ago

    Really appreciate your efforts. Wish you all the best.

  • @LyndaCorliss
    @LyndaCorliss 1 year ago

    Top-rate education, I'm happily learning a lot.
    Nicely done. Thank you

  • @raven5165
    @raven5165 11 months ago

    In the example, m is the number of images you have and n is their pixel values, so m doesn't have to be enough to represent your data.
    It is like saying: if you have 2 photos, 2 dimensions are enough to represent the image features.

  • @juangoog
    @juangoog 3 years ago

    Wow, what a wonderful presentation. Congratulations.

  • @mohammedal-khulaifi7655
    @mohammedal-khulaifi7655 2 years ago

    You are at the tip-top, I like your explanation

  • @mdmamunurrashid2945
    @mdmamunurrashid2945 3 years ago

    Love his explanation style

  • @chenqu773
    @chenqu773 3 years ago

    Good explanation! Many thanks! How does one manage to get this stuff explained in such an elegant way?

  • @deepthikiran8345
    @deepthikiran8345 3 years ago

    The explanation is really wow!! Very intuitive... thank you so much!!

  • @4096fb
    @4096fb 1 year ago

    Thank you for this great explanation. I just lost you on one point: why does this matrix multiplication equal sigma_1 u_1 v_1^T + sigma_2 u_2 v_2^T + ... + sigma_m u_m v_m^T?
    Can someone explain how it completes the entire matrix multiplication? I somehow got lost in the columns of U and the rows of V.
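
    Since Sigma is diagonal, the triple product collapses into one rank-1 layer per singular value, which a short check makes concrete (minimal sketch with a random matrix, not the video's data):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.standard_normal((6, 4))
    U, s, Vt = np.linalg.svd(X, full_matrices=False)

    # Rebuild X one rank-1 layer at a time: sigma_k * u_k * v_k^T
    X_sum = sum(s[k] * np.outer(U[:, k], Vt[k, :]) for k in range(len(s)))
    print(np.allclose(X, X_sum))  # True: the layers add up to U @ diag(s) @ Vt
    ```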

  • @liuhuoji
    @liuhuoji 3 years ago

    love the video, well explained and aesthetically pleasing.

  • @parnashish1910
    @parnashish1910 3 years ago

    Beautifully explained.

  • @arthurlee8961
    @arthurlee8961 2 months ago

    That's so helpful! Thank you!

  • @maciejmikulski7287
    @maciejmikulski7287 1 year ago

    The assumption n >> m is contrary to what we often have in data science. In many problems, the number of samples (here m) is bigger than the number of features (here n). In such a case, do we just take the transpose and keep going the same way? Or are there some additional considerations (other than swapping the interpretations of the eigenvectors, etc.)?
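
    As far as I can tell, yes: if X = U*Sigma*V^T, then X^T = V*Sigma*U^T, so transposing the data just swaps the roles of U and V (up to per-column sign choices). A minimal check:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    X = rng.standard_normal((10, 3))                   # tall: n >> m, as in the video
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    U2, s2, Vt2 = np.linalg.svd(X.T, full_matrices=False)

    print(np.allclose(s, s2))                          # same singular values
    # Left singular vectors of X.T match right singular vectors of X, up to sign:
    print(np.allclose(np.abs(U2), np.abs(Vt.T)))
    ```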

  • @murphp151
    @murphp151 2 years ago

    This is so excellent. I just have one question: you keep saying that you will multiply a column of U with a row of V. But matrix multiplication is row by column. So how are you doing it the other way around?
    What am I missing?
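
    Both groupings compute the same entries; writing out entry (i, j) of the product shows the row-by-column view and the column-times-row view are the same sum (a standard identity, not specific to this lecture):

    ```latex
    \[
      X_{ij} = \sum_{k} u_{ik}\,\sigma_k\,v_{jk}
      \qquad\Longleftrightarrow\qquad
      X = \sum_{k} \sigma_k\, u_k v_k^{\mathsf{T}}
    \]
    % Left: entry-wise, a row of U against a column of Sigma V^T.
    % Right: the same terms regrouped as a sum of rank-1 (column-times-row) matrices.
    ```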

  • @psycheguy503
    @psycheguy503 6 months ago

    How do you determine the criterion for truncation? It might depend on the specific case, but is there a general guideline for it?
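
    One common heuristic is to keep enough modes to capture a chosen fraction of the "energy" in the singular values. A minimal sketch (the random matrix and the 90% threshold are arbitrary choices for illustration):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    X = rng.standard_normal((100, 30))
    U, s, Vt = np.linalg.svd(X, full_matrices=False)

    energy = np.cumsum(s**2) / np.sum(s**2)      # fraction of energy captured
    r = int(np.searchsorted(energy, 0.90)) + 1   # smallest r reaching 90%
    print(r, energy[r - 1])

    Xr = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]   # rank-r truncation of X
    ```

    For noisy data, Brunton also covers the Gavish-Donoho optimal hard threshold in a companion video, which gives a more principled cutoff.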

  • @adelheidgang8217
    @adelheidgang8217 2 years ago

    incredible explanation!

  • @yutengfeng377
    @yutengfeng377 3 years ago

    For the last point, (u~)*transpose(u~) is still an identity matrix, but it is an n by n matrix instead of an r by r matrix. Am I right?
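
    It's actually the reverse, which a quick experiment shows (minimal sketch with a random tall matrix): for the economy SVD, Uhat^T * Uhat is the small identity, while Uhat * Uhat^T is an n-by-n projection, not the identity.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    X = rng.standard_normal((8, 3))                      # n = 8, m = 3
    Uhat, s, Vt = np.linalg.svd(X, full_matrices=False)  # economy SVD: Uhat is 8x3

    print(np.allclose(Uhat.T @ Uhat, np.eye(3)))  # True: 3x3 identity
    P = Uhat @ Uhat.T                             # 8x8 -- NOT the identity:
    print(np.allclose(P, np.eye(8)))              # False
    print(np.allclose(P @ P, P))                  # True: projection onto col(Uhat)
    ```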

  • @JCatharsis
    @JCatharsis 2 years ago

    Thank you so much professor.

  • @非常大的圆白菜
    @非常大的圆白菜 2 years ago

    I don't understand why we are multiplying the columns of U by the rows of V... shouldn't it be the opposite?

  • @inazuma3gou
    @inazuma3gou 3 years ago

    Excellent, excellent content. Thank you so much!

  • @hugeride
    @hugeride 3 years ago

    Just amazing explanation.

  • @maipyaar
    @maipyaar 3 years ago

    Thank you for this video series.

  • @wonderworm232
    @wonderworm232 1 year ago

    My man writing in mirror mode. Absolute legend

    • @andrelorenzob
      @andrelorenzob 9 months ago

      I was just thinking about that! But I think he just writes in normal mode and mirrors the video

  • @harrypotter1155
    @harrypotter1155 2 years ago

    Mindblowing!

  • @fabiopadovani2359
    @fabiopadovani2359 4 years ago

    Thank you very much. Excellent explanations.

  • @zrmsraggot
    @zrmsraggot 2 years ago

    Is truncating at 'r' similar to filtering out the highest frequencies in the FFT?

  • @eeeeee132
    @eeeeee132 3 years ago

    When you compare the vector x with the water flow, it is very clear that 'U' is the eigenflows and 'V' is how the eigenflows evolve in time. Likewise, for the faces it is very clear that 'U' is the eigenfaces, but I wonder what the 'V' vectors would be?

  • @mr.jizhouwubs7256
    @mr.jizhouwubs7256 3 years ago

    Great video from a linear-space point of view. One naive question: can we make use of the Lanczos algorithm to pick out the most significant eigenvalues for the approximation, in order to circumvent the full diagonalization of the whole large matrix?
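
    Iterative methods along exactly those lines are standard for large matrices; for instance, SciPy's svds (ARPACK, a Lanczos-type solver) computes only the k leading singular triplets without a full decomposition. A minimal sketch:

    ```python
    import numpy as np
    from scipy.sparse.linalg import svds

    rng = np.random.default_rng(5)
    X = rng.standard_normal((2000, 500))

    U, s, Vt = svds(X, k=10)    # only the 10 largest singular values/vectors
    idx = np.argsort(s)[::-1]   # svds returns ascending order; flip to convention
    U, s, Vt = U[:, idx], s[idx], Vt[idx, :]
    print(s[:3])
    ```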

  • @zhichaozhao172
    @zhichaozhao172 2 years ago

    Prof. Steve, thanks for your explanations. But what is the difference between POD and SVD for aerodynamic analyses?

  • @wmafyouni
    @wmafyouni 1 year ago

    Why is the X matrix n by m, but your SVD is written out for X as an m by n matrix?

  •  3 years ago

    Shouldn't the u vectors be row vectors? How do you multiply a column vector with another column vector in the other matrix?

  • @momoh6696
    @momoh6696 10 months ago

    Hello, I can't find a way to compress all my images into one X matrix; the only examples I've seen do the SVD for one image at a time. How do I make one X matrix from images, please?
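
    The usual construction is to flatten each image into a column vector and stack the columns side by side. A minimal sketch (the image sizes and the random data stand in for real loaded images):

    ```python
    import numpy as np

    m, h, w = 20, 64, 48                 # m images, each h x w pixels (made up)
    images = np.random.rand(m, h, w)     # stand-in for your loaded image stack

    # Flatten each image into a column of length n = h*w, then stack:
    X = images.reshape(m, h * w).T       # X is n x m (pixels x images)
    print(X.shape)                       # (3072, 20)

    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    first_mode = U[:, 0].reshape(h, w)   # a column of U, reshaped back to an image
    ```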

  • @namex26
    @namex26 2 years ago

    Great lectures. Thanks so much. Kind of hard to distinguish the m's and n's sometimes. I hoped you'd use some other letters :D

  • @regbot4432
    @regbot4432 3 years ago

    Wow, you are a really good teacher.

  • @dewinmoonl
    @dewinmoonl 3 years ago

    others: "wow great explanation thanks for the lesson"
    me: how is this man writing perfectly backwards onto thin air?
    Mr. Brunton: I'm glad the green screen, glass board, and $1000 of Adobe products really paid off

  • @momoh6696
    @momoh6696 10 months ago

    Hello, do we pass the U, S, or VT as input into the SINDy algorithm, or do we pass in the approximation of X obtained using the U, S, and VT?

  • @isaaclakes-trapbeatspianob5955
    @isaaclakes-trapbeatspianob5955 3 years ago

    What if U is a 2x2 matrix and V a 3x3 matrix? Then how would you calculate that outer product UV?

  • @florawoflour4501
    @florawoflour4501 1 year ago

    thank you so much sir, very helpful

  • @YasserHawass
    @YasserHawass 1 year ago

    It took me some time to accept the conclusion at 1:22, since, if I understood you right, we have an n-dimensional vector space in which our data can live, so it should be okay to use all n vectors of U as the new basis, unless we want dimensionality reduction and not just a matrix decomposition. Or am I just missing something?

  • @Catwomen4512
    @Catwomen4512 2 years ago

    Is the economy SVD not also an approximation of X? (Since we lose some columns of U)
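
    As I understand it, no: the dropped columns of U are multiplied by zero rows of Sigma, so the economy SVD still reconstructs X exactly; only truncating below the rank loses information. A quick check:

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    X = rng.standard_normal((10, 4))

    U, s, Vt = np.linalg.svd(X, full_matrices=False)  # economy: U is 10x4, not 10x10
    print(np.allclose(X, U @ np.diag(s) @ Vt))        # True: exact reconstruction
    ```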

  • @amirhamzekhoshnam4755
    @amirhamzekhoshnam4755 5 months ago

    What if we change our norm? Does this remain the best approximation or not?

  • @Dominus_Ryder
    @Dominus_Ryder 3 years ago

    I'm confused by the transpose notation. Does the SVD of X give U*Sigma*V, for which you manually have to transpose V afterwards, or does the SVD of X give you U*Sigma*V-transpose, such that the SVD transposes V for you automatically upon completion of its calculation?
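
    This depends on the library. In NumPy, for example, the third return value is already transposed, while MATLAB's svd returns V itself. A minimal check:

    ```python
    import numpy as np

    X = np.random.rand(5, 3)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)

    # NumPy hands back V already transposed (often called Vh or Vt):
    print(np.allclose(X, U @ np.diag(s) @ Vt))  # True -- no extra transpose needed
    # (In MATLAB, [U, S, V] = svd(X) returns V, so there you would use V'.)
    ```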

  • @ParijatKar
    @ParijatKar 2 years ago

    I am interested in knowing what software he is using to write.

  • @akramz2341
    @akramz2341 3 years ago

    incredible!

  • @manuel56354
    @manuel56354 4 years ago

    What happens when m >> n? Does something change in all that has been said?

  • @matthewprestifilippo7673
    @matthewprestifilippo7673 2 years ago

    I found this hard to follow, as it doesn't teach any ways to think about this intuitively.

  • @warrior_1309
    @warrior_1309 3 years ago

    I wish I had been his student during my college days; my grades would have improved...

  • @santoshshanbhogue
    @santoshshanbhogue 3 years ago

    Such a lucid explanation! A question though: if I do an SVD on the toy matrix X = [1, 2; 1, 2], the second singular value is zero, as expected. But I am unable to reconstruct X using sigma1*u1*v1^T, i.e. just the first singular value (the matrix is reconstructed as [1, -2; 1, -2]). If I add infinitesimal noise to X, though, it works. Is this a well-documented numerical issue?
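
    For what it's worth, the rank-1 reconstruction does work when u1 and v1 come from the same SVD call; a [1, -2; 1, -2] result suggests a sign mismatch between separately computed vectors (the signs of a u/v pair are only determined jointly). A minimal check:

    ```python
    import numpy as np

    X = np.array([[1.0, 2.0],
                  [1.0, 2.0]])
    U, s, Vt = np.linalg.svd(X)
    print(s)                                 # [3.1623..., ~0] -> rank 1

    X1 = s[0] * np.outer(U[:, 0], Vt[0, :])  # sigma1 * u1 * v1^T
    print(np.allclose(X, X1))                # True when u1 and v1 are paired
    ```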

  • @arbitrandomuser
    @arbitrandomuser 1 year ago

    If the U columns after m don't matter... why is U unique?

  • @hiryu87
    @hiryu87 3 years ago

    Wow superb!

  • @Klompe2003
    @Klompe2003 1 year ago

    @11:08 his face when he said Frobenius norm xD

  • @mourenlee5403
    @mourenlee5403 4 years ago +6

    How can he write in mirror image? Amazing!

    • @NG-lx1kx
      @NG-lx1kx 4 years ago

      I also feel amazed by that

    • @parrychen4738
      @parrychen4738 4 years ago

      @@NG-lx1kx Remember that image processing is one of the applications of SVD. Now think, how does image processing relate to this video?

    • @douglashurd4356
      @douglashurd4356 3 years ago

      In post-processing he flips the video left-right. Notice which hand has the wedding ring.