33. Left and Right Inverses; Pseudoinverse

  • Published Jan 2, 2025

Comments • 124

  • @ozzyfromspace
    @ozzyfromspace 4 years ago +178

    I've been taking proper notes, doing the readings, and all the things you would expect of a student (short of taking the quizzes and final). I'm surprised at myself for having the will, interest, and sustained performance to get here. I'm tooting my own horn because I think I deserve it. Thanks to Professor Strang, I haven't just learned one of the most beautiful frameworks in mathematics. I have also proven that I can focus and get things done. If you knew me, you'd know that I'm unreliable, highly motivated at first, but that I ultimately lose steam and perform poorly. This class taught me that when I choose to do something to completion, and mean it, good things can happen. Thank you from the bottom of my heart, Professor G. Strang! Seriously, I'll always hold you in high regard. You gave me the MIT experience. You made me really wanna roll up my sleeves and learn deeply, learn how to think, which is impressive because I'm a college dropout who wanted nothing to do with education (well, schooling, you come across that line of reasoning, I'm sure). This course has improved my confidence. I just wanted to let you know, the special touch you placed on every lecture was precisely what I needed to start to see education in a whole new light. I've said it a million times now, but thank you Professor Strang! And of course, many, many thank you's to the MIT OpenCourseWare for making such high-quality, timeless content available to millions like myself for free. You're changing minds.

    • @owen7666
      @owen7666 4 years ago +7

      There is a wonderful momentum to this course that few courses can match. And Prof Strang puts such great effort into building a good intuition for the subject that when much more advanced topics appear, they just fall into place. Really, the breadth and depth of this "introductory" course is amazing

    • @Hotheaddragon
      @Hotheaddragon 4 years ago +3

      You are not alone in that feeling.

    • @lzawbrito
      @lzawbrito 4 years ago +4

      May I ask if you're still using linear algebra after completing the course, and if so, whether you think learning it from Prof. Strang served you well long-term? And congrats. It takes serious dedication to do something like this without the threat of bad grades and all that.

    • @ozzyfromspace
      @ozzyfromspace 4 years ago +5

      @@lzawbrito thanks for checking in. Yes, I’m still using linear algebra in my own private studies. I’ve since gone on to self-study Sturm–Liouville theory, the theory of integral transforms, and complex analysis, which are really just about infinite-dimensional vector spaces. Just a few minutes ago, I was solving the Poisson equation in a compact space in R2 (I just took a break, came online, and found your comment 😅).
      Physicists always use the so-called fundamental solution (which is really just a radial basis function) to solve Poisson’s equation, motivated by the fact that the Laplacian has rotational symmetry in the usual sense. However, I’m doing research to show that there is a more general solution involving non-radial basis functions (which makes sense as something you’d look for if you’ve taken a linear algebra course as fine as Professor Strang’s). I find this line of inquiry interesting because Maxwell’s equations in classical physics are really just a form of higher-order Poisson equations, and their underlying potential fields are expressed as radially symmetric convolutions. But the two issues are that 1) potentials only work over all space, not in compact geometries (Helmholtz decomposition theorem), and 2) you can’t capture all the expected dynamics if you’re only composing your functions with radial bases. So, basically, the retarded potential solutions of E&M are just a projection of a function onto a radial subspace. Linear algebra is a good way to see this :)
      So I want to figure out the most general structure of solutions to Maxwell’s equations, and then write a machine learning algorithm that can minimize parameters on the equation to predict the mathematically ideal electronic circuit topology given a set of constraints. As you can tell, I’m a little “all over the place”, but I still insist that this was the first class I took that gave me the confidence to do math and stick to it.
      As for your final point, I’ve found that the threat of bad grades generally hurts my understanding, because then I’m learning to pass rather than to deeply understand. My motivation is that I’m working towards a novel technology and it needs a lot of mathematics. I care deeply about said technology (the float circuit - hence the pseudonym 😅), so I’m internally motivated. I dream lucidly about math. I can’t turn it off. I have a deep appreciation for why I’m trying so hard. Best wishes with everything, Lucas!
      Edit -
      one day I’ll start making YT videos documenting my way of understanding things, to help people that want to “go deeper” on concepts. I just don’t feel ready (knowledgeable enough) today, but that’s the plan. Hopefully by this coming January. I’m just doing a measure theory course to tie up some loose ends in understanding for my first course on PDEs.

    • @lzawbrito
      @lzawbrito 4 years ago +2

      @@ozzyfromspace This is awesome. The applications of linear algebra in machine learning are many, I imagine. I self-studied using Strang's course exactly for physics (QM) and computer science, since those subjects are what I plan to major in. And your research is this cross-section of so many different fields... I subscribed just to see what you come out with.
      It's also funny you mention PDEs, as I'm taking a PDE course next semester!

  • @gamingChinaNo1
    @gamingChinaNo1 4 years ago +143

    I can't believe this class is coming to an end so soon. Watched from lecture 1 to here.

  • @MaozhuPeng
    @MaozhuPeng 9 months ago +1

    Just finished the whole course. Thank you Professor Gilbert Strang

  • @YtEssEss
    @YtEssEss 4 years ago +64

    Absolutely wonderful series of lectures. Thanks a zillion to Prof Strang and OCW for making this available. These are absolutely priceless!

    • @YtEssEss
      @YtEssEss 2 years ago

      @aDBo'Ch 1 Thanks for pointing out the typo.

  • @AnhTran-sf3hf
    @AnhTran-sf3hf 3 days ago +1

    I'm here, I can't believe how I learned math in the past. This class will benefit me for years of my life. Thank you, Mr Strang. Can't wait to continue on 18.065

  • @jinzhonggu8276
    @jinzhonggu8276 2 years ago +4

    The charm of this course is that after all these lectures, you feel so empowered and confident that you're willing to give any linear algebra problem a try.

  • @MrLakastro
    @MrLakastro 4 years ago +47

    "If a matrix takes a vector to zero, there is no way its inverse can bring it back to life!" - Gil Strang

  • @naterojas9272
    @naterojas9272 5 years ago +30

    Strang: "You know what that matrix is"
    Me: "I do?.... Wait a minute... I do!"
    Classic moments in Linear Algebra with Prof. Strang 👌👌👌

  • @mazyarmazari3346
    @mazyarmazari3346 3 years ago +2

    Prof. Strang is by far the best teacher I've ever known in my life. He teaches things that are very complicated as if they were super easy. Thanks to MIT OpenCourseWare for making this lecture available; your work is priceless.

  • @georgesadler7830
    @georgesadler7830 3 years ago +4

    This is another fine lecture on inverses. This is the first time that I have seen left, right, and pseudoinverses in a linear algebra class. I really learned these topics from Dr. Strang. These topics are important in linear algebra, signals and systems theory, and control engineering.

  • @neoblackcyptron
    @neoblackcyptron 3 years ago +6

    I have huge respect for this lecture series and Professor Strang. This exact syllabus is what we have to learn in 2020 as one of our core courses in a CSE MS in AI/Robotics and ML. It is going to take multiple revisions to really internalize all that this lecture series has to offer. Professor, I hope to build some good things and help usher in the AI revolution that is happening. Thank you for sharing your knowledge with us, the next generation, to push things a little bit further into the future.

  • @nota2938
    @nota2938 2 years ago +2

    The bijection between the row space and the column space had been hinted at quite nicely in previous lectures and in the textbook by Dr. Strang, e.g. he stressed the fact that an identity appears when we do row reduction. So nice that I figured this out when I walked through the previous materials. But still, he managed to show a (to me) new point of view on this bijection: the pseudoinverse works as *the* inverse function of the bijection, and the pseudoinverse *also eliminates* a (resp. left) null space, just like the original does. Such symmetry, such beauty.

  • @quirkyquester
    @quirkyquester 4 years ago +3

    this is the beauty of life, thank you so much Professor Strang and MIT!

  • @onnalinkajornklam5936
    @onnalinkajornklam5936 7 years ago +9

    Thank you, Professor Gilbert! What you did is incredible; this helps take me where I want to be in life!

  • @attilakun7850
    @attilakun7850 7 months ago

    For 8:22, the null space of A^t A being the same as A's null space can be proven as follows. Suppose x is in the null space of A^t A:
    A^t A x = 0 implies
    x^t A^t A x = 0, i.e.
    (A x)^t (A x) = ||Ax||^2 = 0,
    which forces Ax = 0, so x is in the null space of A too (and conversely, Ax = 0 clearly gives A^t A x = 0). When A's columns are linearly independent, that common null space is just {0}.
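A quick numerical sketch of this argument (the matrices are my own illustrative choices, not from the lecture): with independent columns, A^t A has full rank, and even for a rank-deficient matrix the ranks (hence null spaces) of A and A^t A agree.

```python
import numpy as np

# A with independent columns: A^T A is invertible, so N(A^T A) = {0} = N(A).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])
AtA = A.T @ A
print(np.linalg.matrix_rank(AtA))  # 2: full rank, null space is {0}

# Rank-deficient B: the null spaces of B and B^T B still coincide,
# since B^T B x = 0  =>  ||Bx||^2 = 0  =>  Bx = 0.
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.matrix_rank(B.T @ B) == np.linalg.matrix_rank(B))  # True
```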

  • @Zanco
    @Zanco 13 years ago +3

    that illustration and this intuitive way of looking at pseudo inverse is great! good job!

  • @moumitabiswas3199
    @moumitabiswas3199 4 years ago

    Thanks a lot, MIT OCW, for these free lectures, and I am grateful to MIT for introducing the TEACHER Mr. Gilbert to the world. I wish him a happy and healthy life. You are a wonderful teacher. Thanks.

  • @aoiroadelacvg7489
    @aoiroadelacvg7489 5 years ago +2

    the best course to learn linear algebra

  • @emenikeanigbogu9368
    @emenikeanigbogu9368 4 years ago

    It’s been a journey Professor Strang. It’s quite sad that this has come to an end, but what a beautiful way to come full circle.

    • @angfeng9601
      @angfeng9601 4 years ago +1

      18.065 follows this course perfectly

    • @emenikeanigbogu9368
      @emenikeanigbogu9368 4 years ago

      @@angfeng9601 100% on it. Thank you brother!!

  • @erikumble
    @erikumble 4 years ago

    Thank you Professor Strang for the great course, and MIT for making it accessible for us. This is such a great opportunity for those of us who want to learn more than what is offered at our high schools.

  • @pankajvadhvani9278
    @pankajvadhvani9278 3 years ago

    What a stunning way in which you have given this lecture series, sir.
    You are the best teacher, Gilbert Strang sir. And very many thanks to MIT OCW for giving this beautiful lecture series totally free, and for introducing the world to this amazing professor.

    • @vicevirtuoso123
      @vicevirtuoso123 3 years ago

      Brother, will I be able to understand it? I'm a BTech 1st-year student.

    • @pankajvadhvani9278
      @pankajvadhvani9278 3 years ago

      @@vicevirtuoso123 You will, brother, but there is a deep meaning to every line of sir's, and you can't get everything in one watch. But surely you will get a geometrical sense of every concept, which gives you satisfaction.

    • @pankajvadhvani9278
      @pankajvadhvani9278 3 years ago

      One more suggestion: please read Gilbert Strang sir's book too, because it will tell you how much sense you got, and will fill the gaps in understanding.

    • @vicevirtuoso123
      @vicevirtuoso123 3 years ago

      @@pankajvadhvani9278 Thanks, brother.
      By the way, which course were you studying this for?

  • @npadmanabhan6980
    @npadmanabhan6980 3 years ago

    Fantastic way to introduce Pseudoinverse. GS is a great teacher. Thanks OCW for making this available

  • @supersnowva6717
    @supersnowva6717 1 year ago

    I am not ready for this to end!

  • @MrSyrian123
    @MrSyrian123 6 years ago +1

    Thanks MIT OCW and Prof. Gilbert for your hard work.

  • @davidhcefx
    @davidhcefx 5 years ago +4

    That was a really great course review! There was something new to learn at the same time.

  • @ummwho8279
    @ummwho8279 3 years ago

    22:03 "Where in calculus, you really, you know, you're trying to visualize these things, well 2 or 3 dimensions is kind of the *_limit_* ..."
    Professor Gilbert Strang: Come for the fantastic teaching, stay for the fantastic puns.

  • @avinavkashyap8802
    @avinavkashyap8802 2 years ago

    Literally, the legend of linear algebra

  • @lucasrodriguesdelima8697
    @lucasrodriguesdelima8697 1 year ago

    So happy with this journey!! Thanks Doctor Strange of Linear Algebra!!

  • @ramonmassoni9657
    @ramonmassoni9657 4 years ago

    This class was pure art

  • @solfeinberg437
    @solfeinberg437 5 years ago +1

    In addition to shouting out answers I knew, I would also applaud right when he said thanks.

  • @mauriciobarda
    @mauriciobarda 5 years ago +3

    For all of you who made it to this point in this great course, I recommend watching this short series before taking the final reviews. Full of graphics with some good aha moments. th-cam.com/video/fNk_zzaMoSs/w-d-xo.html

  • @adamcanterbury0707
    @adamcanterbury0707 1 year ago

    Thanks Prof Strang, you helped me a lot

  • @행복한나무-i9r
    @행복한나무-i9r 3 years ago

    Thanks, professor! It really, really helps in understanding how 3D computer vision works.

  • @Kenji314159
    @Kenji314159 11 years ago +12

    Wrong? He's awesome, he has the best explanations! You should see my teachers :o

  • @bingin15
    @bingin15 13 years ago

    i love these lectures! they are fantastic. i love the book too. i think it's really well written. overall, i really enjoyed this course

  • @rinzindorjee31
    @rinzindorjee31 10 years ago +10

    It may not be important to go to university. I love your lectures; they give me enjoyment.

  • @mav45678
    @mav45678 5 years ago +1

    The camera crew does not understand what is important and which shots to take and later include in the final video... Way too many close-ups. In general, if the Professor is pointing to something on the blackboard and it is not included in the shot, it's a major fail.

  • @MrKrishtal
    @MrKrishtal 13 years ago +4

    Even though I'm studying in Europe, it's still interesting, useful, and easy to understand. Thanks for making these lectures available.

    • @lucasm4299
      @lucasm4299 6 years ago +1

      MrKrishtal
      MIT
      🇺🇸📈🏆

  • @BigBen866
    @BigBen866 1 year ago

    “If a matrix takes a vector to zero it’s no way an inverse can bring it back to life.” I love it😇😇😎😎

  • @sansha2687
    @sansha2687 4 years ago

    19:00, 21:10, 22:50, 31:40

  • @kehrierg
    @kehrierg 4 years ago

    the pseudoinverse is denoted A+. Prof. Strang said pseudoinverses wouldn't be on the final exam. Therefore there were no A+'s on the final. :/ ...

  • @weiliangxu796
    @weiliangxu796 4 years ago

    I wish Professor Strang could have explained the rank decomposition for calculating the pseudoinverse. Damn time limit.
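For the curious: one route the comment above alludes to is a rank factorization A = BC (B with independent columns, C with independent rows), from which the pseudoinverse has a closed form. A minimal sketch with an arbitrary toy matrix of my own choosing, checked against NumPy's SVD-based `pinv`:

```python
import numpy as np

# Rank factorization A = B C gives the pseudoinverse as
#   A+ = C^T (C C^T)^{-1} (B^T B)^{-1} B^T
# Toy rank-1 example (illustrative, not from the lecture):
B = np.array([[1.0], [2.0], [3.0]])   # 3 x 1, independent column
C = np.array([[1.0, 1.0]])            # 1 x 2, independent row
A = B @ C                             # 3 x 2 matrix of rank 1

A_plus = C.T @ np.linalg.inv(C @ C.T) @ np.linalg.inv(B.T @ B) @ B.T
print(np.allclose(A_plus, np.linalg.pinv(A)))  # True
```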

  • @avinavkashyap8802
    @avinavkashyap8802 2 years ago +1

    The ONE TRUE KING 😍😍😍😍😍😍😍😍😍😍😍😍😍😍😍😍😍😍😍😍😍😍

  • @neoneo1503
    @neoneo1503 3 years ago

    One to One 24:00

  • @Qladstone
    @Qladstone 9 years ago +1

    I never really got the whole picture thing, even until now. Does it give any conceptual meaning beyond the definitions embodied by it? Or is it merely a memory aid for remembering the definitions? The definitions embodied are:
    For any m by n matrix A, the row space and nullspace are orthogonal complements in Rn, and the column space and left nullspace are orthogonal complements in Rm.
    I.e., does it supply meaning to the 4 subspaces the way a graph would supply meaning to a function? From what I see, no... Or am I missing something?

    • @720SouthCalifornia
      @720SouthCalifornia 9 years ago +14

      +Quanxiang Loo
      Yes it is like a function, hence the name linear map. When we multiply by a matrix we take combinations of the columns, so the range of the mapping is the column space. If the column space doesn't span Rm, then we can see there is no way to get answers that have any components orthogonal to the column space. We call this 'orthogonal space' the left nullspace.
      Similarly, when we multiply by a matrix A we take the inner product of a vector x with the rows of A. Recall that the dot product v·w can be thought of as capturing how parallel v is to w, scaled by their magnitudes (‖v‖‖w‖cos θ, where θ is the angle between them). Therefore in multiplying Ax = b, only the part of x parallel to the rows of A is relevant to determining b. In other words, components of x are either in the rowspace or in the nullspace. The nullspace doesn't contribute to b.
      Input : Rn = Domain = (Row Space + Null Space)
      Output : Rm = (Range / Column Space) + Left Null Space
      Taking the transpose flips the input and output. This is why for orthogonal matrices, which have orthogonal unit vectors, the inverse is the transpose. They aren't scaling or skewing the coordinate vectors, so we can just transpose to flip the input and output and the mapping between vectors in the row and column spaces remains intact.

    • @jamjayhamz8309
      @jamjayhamz8309 9 years ago +3

      +720SouthCalifornia Thank you for this comment! Very informative!

    • @leojanssens1130
      @leojanssens1130 8 years ago +2

      Great answer... hear, hear!!
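The input-side decomposition described in the thread above (domain = row space + null space, with only the row-space part reaching b) can be sketched numerically; the 2x3 matrix and vector here are arbitrary illustrative choices:

```python
import numpy as np

# A full-row-rank 2x3 matrix: domain R^3 = row space (dim 2) + null space (dim 1).
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 0.0, 1.0]])
x = np.array([3.0, 1.0, 2.0])

# Project x onto the row space via P_row = A^T (A A^T)^{-1} A.
P_row = A.T @ np.linalg.inv(A @ A.T) @ A
x_row = P_row @ x
x_null = x - x_row

print(np.allclose(A @ x_null, 0))      # True: the null-space part vanishes
print(np.allclose(A @ x, A @ x_row))   # True: only x_row determines b
print(np.isclose(x_row @ x_null, 0))   # True: the two parts are orthogonal
```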

  • @oakschris
    @oakschris 8 years ago

    What does he mean "A projection matrix is the identity matrix where it can be and everywhere else it's the zero matrix" at 19:20?

    • @antoniolewis1016
      @antoniolewis1016 8 years ago +5

      A projection matrix doesn't change any vector in its column space, but it kills any vector in its left null space.
      In that sense, it is the "identity" on the column space, and the "zero" or "killer" on the left null space.

    • @davidlovell729
      @davidlovell729 7 years ago +2

      That's not quite the full explanation, because there is a third possibility. The matrix A maps a vector that is in the row space but neither the column space nor the null space onto the column space. Thus, he shouldn't have said "everywhere else" because it's not true.

    • @jagwasenibanerjee1424
      @jagwasenibanerjee1424 4 years ago

      @@davidlovell729 Ax will always be in the column space of A, right? Then can you please explain what you mean by A mapping a vector that is in the row space but not in the column space?

    • @phogbinh
      @phogbinh 3 years ago +1

      @@jagwasenibanerjee1424 Antonio was totally correct, and David was right in saying that the professor shouldn't have said "everywhere else".
      Here is my summary (hopefully this is totally correct):
      b is in R^M (A is an MxN matrix).
      If b is in C(A), then the projection matrix acts on b like the identity matrix.
      If b is in N(A'), then the projection matrix acts on b like the zero matrix.
      If b is in neither C(A) nor N(A'), then the projection matrix does its job: project b onto C(A).
      Note: A' stands for A transpose.

    • @woddenhorse
      @woddenhorse 3 years ago

      @@phogbinh Thanks
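The summary in the thread above can be checked directly: with P = A(A^T A)^{-1}A^T, P fixes vectors in C(A) and annihilates vectors in N(A'). The matrix and vectors below are arbitrary illustrative choices:

```python
import numpy as np

# P = A (A^T A)^{-1} A^T for a full-column-rank A.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T

b_col = A @ np.array([2.0, -1.0])     # a vector already in C(A)
print(np.allclose(P @ b_col, b_col))  # True: P acts as the identity on C(A)

b_left = np.array([1.0, -1.0, 1.0])   # orthogonal to both columns of A
print(np.allclose(A.T @ b_left, 0))   # True: b_left is in N(A^T)
print(np.allclose(P @ b_left, 0))     # True: P acts as the zero matrix there
```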

  • @yuwuxiong1165
    @yuwuxiong1165 3 years ago

    "projectors are trying to be identity, but it's impossible."

  • @lincyan784
    @lincyan784 3 years ago

    It is awesome! Thank you professor!

  • @wowlikefun
    @wowlikefun 5 years ago +1

    Damn, this guy rocks 🎸

  • @geethabr374
    @geethabr374 5 years ago

    You are a great teacher!

  • @samruby82
    @samruby82 13 years ago +2

    simply amazing!

  • @AngeloYeo
    @AngeloYeo 6 years ago +1

    should be the legend

  • @pelemanov
    @pelemanov 13 years ago

    @Zanco I agree, it's almost magical :-). Or as he puts it: elegant.

  • @yongjieee
    @yongjieee 11 years ago

    Prof. G. S. is awesome

  • @kingsal
    @kingsal 10 months ago

    There shouldn't be the power of minus one and multiple 'T' - it's excluding matter.

    • @kingsal
      @kingsal 10 months ago

      I mean, it's no longer inverses anymore by using this equation.

    • @kingsal
      @kingsal 10 months ago

      '34:12'

  • @thejameskan
    @thejameskan 13 years ago

    Some great information here, thanks.

  • @saurabhshrivastava224
    @saurabhshrivastava224 3 years ago +1

    21:28 "Well, I shouldn't say anything bad about calculus, but I will." 😂

  • @imegatrone
    @imegatrone 13 years ago +1

    I Really Like The Video Left and Right Inverses; Pseudoinverse From Your

  • @ruiruihuang
    @ruiruihuang 10 years ago +5

    The best

  • @georgeyu7987
    @georgeyu7987 4 years ago

    This short 40-minute lecture contains lots of juicy stuff; watch it twice.

  • @jinzhonggu8276
    @jinzhonggu8276 2 years ago

    What does it mean when the projection matrix is trying to be the identity matrix?

    • @Robocat754
      @Robocat754 2 years ago

      Only when A is invertible does that projection matrix equal the identity. Since A is invertible, it has n independent columns. What happens when you project a vector onto a subspace that the vector is actually inside? The projection matrix is the identity, isn't it? It's like the least-squares problem Ax=b where b is in the column space.

    • @Robocat754
      @Robocat754 2 years ago

      But elsewhere the projection matrix is the zero matrix; how?

    • @Robocat754
      @Robocat754 2 years ago

      The "elsewhere" the professor refers to could be the rest of R^m, since the projection matrix projects onto the column space of A, and the column space and left null space together span R^m.
      For future learners who may have doubts about this. 😅

  • @mengmengcui8861
    @mengmengcui8861 8 years ago

    A good class! Thanks for all your effort (extra thanks to the teacher)!

  • @苗子琛-f4y
    @苗子琛-f4y 5 years ago

    Insightful!

  • @thejameskan
    @thejameskan 13 years ago

    interesting video and very informative

  • @simpleboy5100
    @simpleboy5100 5 years ago

    Thank you really

  • @AZZEDDINE2801
    @AZZEDDINE2801 11 years ago +1

    Thank you, Prof, but I would like to find a video about applied mathematics, especially focused on power systems. Thank you.

  • @huangmingya2905
    @huangmingya2905 3 years ago

    thank you!

  • @jayejayeee
    @jayejayeee 13 years ago

    very interesting video thanks

  • @alekssandroassisbarbosa3749
      @alekssandroassisbarbosa3749 7 years ago +3

    lovely 21:14

    • @Horesmi
      @Horesmi 6 years ago +1

      I shouldn't say anything bad about calculus.
      *But I will*

  • @solfeinberg437
    @solfeinberg437 5 years ago

    Does he want students to shout out answers? Because I'd be talking a lot. I'd have to ask him whether he just meant for us to think about it or to say it.

  • @RohitSingh-nm9wd
    @RohitSingh-nm9wd 3 years ago

    Wow this is the end

  • @muhammadsarimmehdi
    @muhammadsarimmehdi 4 years ago

    How can you use right pseudo-inverse to solve Ax = b?

    • @ramkrishna3256
      @ramkrishna3256 4 years ago +1

      Ax = b
      Here we need to project b onto the column space of A,
      so we need the left pseudoinverse:
      (A+)Ax = (A+)b
      x' = (A+)b
      where x' is the approximate (least-squares) solution,
      and AA+ is the projection matrix onto the column space of A.
      So I guess in this case we don't use the right pseudoinverse.
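The answer above can be illustrated with a tiny least-squares example (the data is my own arbitrary choice): for full column rank, the left pseudoinverse (A^T A)^{-1} A^T reproduces NumPy's least-squares solver.

```python
import numpy as np

# Full column rank: the left pseudoinverse A+ = (A^T A)^{-1} A^T gives the
# least-squares solution of Ax = b. Arbitrary illustrative data:
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

A_plus = np.linalg.inv(A.T @ A) @ A.T
x_hat = A_plus @ b

# Same answer as NumPy's least-squares routine.
print(np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0]))  # True
```

(The right pseudoinverse A^T (A A^T)^{-1} is the full-row-rank counterpart: there x = A+ b solves Ax = b exactly and has minimum norm.)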

  • @chengweili9516
    @chengweili9516 7 years ago

    Awesome!

  • @JH-jv1ee
    @JH-jv1ee 4 years ago +1

    Is this real? He is the legend..

  • @shivammalviya1718
    @shivammalviya1718 5 years ago

    Thanks sir

  • @thegeffc
    @thegeffc 13 years ago

    very interesting thanks

  • @ChessMemer69
    @ChessMemer69 2 years ago

    "Calculus sucks" - Strang, 2005 😆

  • @hemantyadav1047
    @hemantyadav1047 5 years ago

    I think about this in my sleep.............lmao.

  • @shuier525
    @shuier525 4 years ago

    147K views with 700 likes; human beings are not very generous with their likes.

  • @AmanSingh-xk2lv
    @AmanSingh-xk2lv 9 years ago +1

    In which lecture did he speak about full column rank?

    • @mitocw
      @mitocw 9 years ago +8

      +Aman Singh Judging from our quick searching, full column ranks are discussed starting in lecture 8.

  • @RrNefriana
    @RrNefriana 4 years ago

    I am a statistician watching this video :)))

  • @spandanbasu5653
    @spandanbasu5653 4 years ago

    How is A(A^TA)^(-1)A^T a projection matrix?
    Please help.

    • @saintelohim
      @saintelohim 4 years ago

      Check whether the matrix satisfies P^2 = P.

    • @arijitghosh2400
      @arijitghosh2400 4 years ago

      You can always consult lecture 15; the matrix essentially projects any vector onto its column space, which is deemed the best solution to a problem that has no solution.
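The check suggested in this thread is easy to run: P = A(A^T A)^{-1}A^T satisfies P^2 = P and P^T = P, the two defining properties of an orthogonal projection. The matrix below is an arbitrary full-column-rank example:

```python
import numpy as np

# An arbitrary 3x2 matrix with independent columns.
A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 0.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T

print(np.allclose(P @ P, P))  # True: projecting twice changes nothing
print(np.allclose(P, P.T))    # True: P is symmetric
```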

  • @3andhalfpac
    @3andhalfpac 12 years ago

    haha i love this guy,

  • @kjjonsify
    @kjjonsify 13 years ago

    What's wrong with that teacher? lol