Least squares approximation | Linear Algebra | Khan Academy

  • Published on Dec 10, 2024

Comments • 100

  • @sarabenavidez3863
    @sarabenavidez3863 5 years ago +237

    "Some of you might already know where this is going.."
    Me: Nope

  • @aashaytambi3268
    @aashaytambi3268 3 years ago +66

    When I get a real job, I will donate my bonus to Khan Academy. This has saved me so much time and you are so awesome.

  • @nicholascunningham6936
    @nicholascunningham6936 9 months ago +2

    This channel is a blessing. I've had some really bad professors and I've had some really good professors. But even the really good professor never made the concepts click with me as well as these videos do. Like, not only do I understand the math better, but just the little diagram you drew showing A's column space, and visibly showing how b is outside of A's column space yet could still be approximated using a vector v in A's column space, like idk how else to describe it but that just made it click for me.
    Edit: I guess _one_ way to describe it and how it clicked for me:
    So we use a line to approximate a bunch of data points on a graph, or plane. If these data points were in a straight line, the "approximation" would have no error. However, this is often not the case.
    Now, think about the equation y=mx+b. Let's use c instead of b to avoid confusion in the next step. So we have y=mx+c. This is the equation used to represent our line. Suppose b=y-c. Then we have mx=b, which looks a lot like Ax=b. And it is! A is just a 1x1 matrix.
    So the line is bounded by the column space of A, or m, and our variable(s) (in this case, just x) can be changed to get b. Just basic algebra: if m=3 and b=6, then x=2. But say b is a 2D vector, e.g. b=(1, 2)^T. Well now, no matter what x you use, you can't get b (unless b just happens to lie on the line). You can only get as close to b as the column space of A will allow you.
    In the diagram drawn in the video, the column space of A is a plane, so the column space of A is 2-dimensional. For simplicity, let's suppose A is a 3x2 matrix (geometrically, its column space is a 2D plane "floating" in 3D space). b appears to be a 3D vector (so while the column space of A is only a 2D slice of the 3D space, b is a point that could be anywhere in that 3D space). So, just like before, we stay within the column space of A and try to get as close to b as possible by changing our variables (in this case, x1 and x2).
    Correct me if my understanding is wrong :)
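
    A quick NumPy sketch of this picture, with made-up values for A (3x2, so its columns span a plane in R^3) and for b (a point generally off that plane):

        import numpy as np

        # Hypothetical 3x2 matrix: its column space is a 2D plane inside R^3.
        A = np.array([[1.0, 0.0],
                      [0.0, 1.0],
                      [1.0, 1.0]])

        # A point b that, in general, does not lie in the column space of A.
        b = np.array([1.0, 2.0, 0.0])

        # Least-squares solution x*: solve the normal equation (A^T A) x = A^T b.
        x_star = np.linalg.solve(A.T @ A, A.T @ b)

        # A @ x_star is the projection of b onto C(A): the closest point in the plane to b.
        proj = A @ x_star
        residual = b - proj

        print(x_star)           # coefficients of the closest point
        print(proj)             # closest vector in C(A) to b
        print(A.T @ residual)   # ~0: the residual is orthogonal to C(A)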

  • @danielerusso_1589
    @danielerusso_1589 2 years ago +12

    This lesson is fantastic! I understood the problem in only 15 minutes! You're absolutely better than my numerical analysis professor at university, who can't properly explain a topic in two hours! Thank you!

  • @haojiang4882
    @haojiang4882 7 years ago +64

    Comes in handy while studying machine learning.

    • @KeystoneScience
      @KeystoneScience 7 years ago +4

      yes, same

    • @mwaleed2082
      @mwaleed2082 4 years ago +3

      Very true. When I was studying the "normal equation" in ML, I really thought I had seen it somewhere. Then I realized I had studied it in linear algebra.

  • @ozzyfromspace
    @ozzyfromspace 5 years ago +26

    I was doing an online machine learning course and got lost when the lecturer introduced the normal equation (which this is, with a different name). Needless to say, I'm finna binge-watch your linear algebra lectures now because I get insecure about using equations I don't understand. Thanks for the playlist, I really wanna put ML in my toolset so we're doing this!

    • @khaledsherif7056
      @khaledsherif7056 4 years ago

      Can you please mention the name/link of the course?

    • @mwaleed2082
      @mwaleed2082 4 years ago

      @@khaledsherif7056 Not sure which course he used for ML, but I'm studying Machine Learning by Andrew Ng on Coursera. When he was teaching the normal equation as an alternative to gradient descent in Week 2 of the course, I realized I had seen this in linear algebra but with a different name, which is the title of this video.

  • @aidawall8
    @aidawall8 12 years ago +31

    You are like a billion times better than my professor... and my professor isn't even bad. On the contrary he's my favorite! You're just even better at explaining things.
    Plus it's impossible for me to lose focus with the pretty colors and your beautiful handwriting. lol
    I have my Linear Algebra final tomorrow (technically today) and I owe the A that I'm sure to get to you and all your helpful videos!

    • @potatojam6519
      @potatojam6519 4 years ago +10

      7 years later... did you get an A? :)

    • @HustleHeaven247
      @HustleHeaven247 1 year ago

      11 years later did you get that A?

    • @arontinkl8782
      @arontinkl8782 1 month ago

      12 years later did you get that A?

  • @samirrimas
    @samirrimas 11 years ago +23

    Very useful, man. You are doing an amazing job. This literally saved me hours of searching and reading; can't thank you enough :)

  • @fireheart9715
    @fireheart9715 11 months ago +1

    This was incredible, I started this video off being so confused about the least squares, and I just get it entirely now! Thank you so much :)

  • @陈明年
    @陈明年 2 years ago +1

    Best linear algebra playlist.

  • @n07kiran43
    @n07kiran43 3 months ago +2

    Indebted to Khan academy forever!

  • @jibran6635
    @jibran6635 4 years ago +1

    This is super useful for solving assignments. Thanks, Khan Academy.

  • @dakota5569
    @dakota5569 1 month ago

    This is a good preface to machine learning. The star notation denotes the optimal/best solution, and you can use gradient descent to minimize the squared error.
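
    A rough sketch of that idea, assuming a small made-up data set and a hand-rolled gradient descent loop (the learning rate and iteration count are arbitrary):

        import numpy as np

        # Made-up data: fit y ≈ m*x + c by minimizing the squared error.
        x = np.array([0.0, 1.0, 2.0, 3.0])
        y = np.array([1.0, 3.0, 5.0, 7.0])
        A = np.column_stack([x, np.ones_like(x)])   # design matrix [x, 1]

        theta = np.zeros(2)     # parameters (m, c)
        lr = 0.01               # arbitrary step size
        for _ in range(10000):
            grad = 2 * A.T @ (A @ theta - y)        # gradient of ||A @ theta - y||^2
            theta -= lr * grad

        # Should be close to the closed-form ("star") solution from the normal equation.
        theta_star = np.linalg.solve(A.T @ A, A.T @ y)
        print(theta, theta_star)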

  • @johnfykhikc
    @johnfykhikc 6 years ago

    Best approach to the problem. No gradients, no multivariable calculus. You're a master!

  • @seprage
    @seprage 9 years ago +65

    It would be great to have links when he says "I explained (whatever) in a different video" so we can access that explanation. In this case I wanted to know why C(A)^⊥ = N(A^T).
    Thanks!

    • @Daski69
      @Daski69 8 years ago

      +Sergio Prada same thing here

    • @196phani
      @196phani 6 years ago +10

      www.khanacademy.org/math/linear-algebra/alternate-bases/othogonal-complements/v/linear-algebra-orthogonal-complements go through this to understand why C(A)^⊥ = N(A^T).

    • @MrVishyG
      @MrVishyG 6 years ago

      +1

    • @gulshanjangid3470
      @gulshanjangid3470 5 years ago +5

      Consider any vector x perpendicular to the column space of A, i.e. x belongs to C(A)^⊥.
      Then the dot product of each column of A with x is 0, i.e. (A^T)(x) = 0.
      Now let B = A^T; the equation above is then Bx = 0, i.e. x lies in the null space of B.
      Thus x lies in the null space of A^T.
      Also, as stated in the first line, x belongs to C(A)^⊥,
      thus C(A)^⊥ = N(A^T).
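
      A compact version of the same argument (in LaTeX notation; note that each step is reversible, which is what gives the equality rather than just one inclusion):

          x \in C(A)^{\perp}
          \iff a_i \cdot x = 0 \ \text{for every column } a_i \text{ of } A
          \iff A^{T}x = 0
          \iff x \in N(A^{T})

      hence C(A)^{\perp} = N(A^{T}).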

  • @ottoomen5076
    @ottoomen5076 6 years ago +3

    Excellent explanation of a valuable technique.

  • @mattralston4969
    @mattralston4969 4 years ago +1

    Thank you Salman Khan. I appreciate the opportunity to relearn the method here. You can never hear this stuff enough times.

  • @tranzconceptual
    @tranzconceptual 9 years ago +29

    God dang it, I knew I should have chosen another bachelor thesis..

    • @user-xedwsg
      @user-xedwsg 8 years ago

      haha!!!!

    • @OurEverlastingYouth
      @OurEverlastingYouth 7 years ago

      just realizing this now as well

    • @rob6129
      @rob6129 4 years ago

      first semester stuff at my uni

    • @gangigooga7710
      @gangigooga7710 4 years ago

      @@rob6129 what uni u attending?

  • @Jshizzle2
    @Jshizzle2 5 years ago +1

    Helpful exploration of least square properties

  • @rodrigo100kk
    @rodrigo100kk 4 months ago

    Awesome explanation! Keep up the good work!

  • @rajj1567
    @rajj1567 12 years ago +1

    Your videos are just great !!! The concepts with geometrical examples make very good sense !!! Thanks a lot

  • @budharpey
    @budharpey 12 years ago +1

    Very useful! In my lecture slides I had the term Hx = z for the same problem, and I couldn't make sense of how we could get to this as the best solution: x = (H^T H)^{-1} H^T z.
    Now I understand :-)
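
    A quick NumPy check of that formula, with a made-up overdetermined H and z (np.linalg.lstsq appears only as a reference solver for comparison):

        import numpy as np

        # Made-up overdetermined system H x = z (more equations than unknowns).
        H = np.array([[1.0, 1.0],
                      [1.0, 2.0],
                      [1.0, 3.0],
                      [1.0, 4.0]])
        z = np.array([6.0, 5.0, 7.0, 10.0])

        # x = (H^T H)^{-1} H^T z  -- the normal-equation form from the slides.
        x_normal = np.linalg.inv(H.T @ H) @ H.T @ z

        # Same answer from the library least-squares solver (numerically preferred).
        x_lstsq, *_ = np.linalg.lstsq(H, z, rcond=None)

        print(x_normal, x_lstsq)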

  • @adamhuang2421
    @adamhuang2421 12 years ago +1

    very helpful! Thanks a lot! you are doing great things! I also listened to your other videos, all very wonderful!

  • @batmendbatbaatar4290
    @batmendbatbaatar4290 4 years ago

    This is surprisingly easy

  • @rob6129
    @rob6129 4 years ago

    Nice derivation of the normal equation

  • @Awhobiwom
    @Awhobiwom 5 years ago

    Thank you so much. You just simplified long, boring hours of confusing lectures.

  • @lancelofjohn6995
    @lancelofjohn6995 3 years ago

    It seems I have seen the best video!

  • @Matterhorn1125
    @Matterhorn1125 13 years ago +1

    Can you teach me cubic expressions and cubic equations? :)
    e.g. solve the equation x^3 - 2x^2 - x + 2 = 0
    by using the factor theorem :)

  • @adithyavarma758
    @adithyavarma758 1 year ago

    thank you very much sir

  • @xesan555
    @xesan555 7 years ago

    Thanks so much Khan...wonderful explanation in two videos that explains everything...great. You are wonderful

  • @elmiramb
    @elmiramb 14 years ago

    Thanks a lot, very comprehensive! Great job!

  • @EWang-yn5sy
    @EWang-yn5sy 6 years ago

    This guy is good...........

  • @inserthere6387
    @inserthere6387 6 years ago +2

    great geometric intuition of linear regression

  • @kalvinsackey1804
    @kalvinsackey1804 11 months ago

    Can we please get a video on maximum likelihood estimation?

  • @박주은-f4x
    @박주은-f4x 2 years ago

    You are my savior. What a wonderfully clear lecture. Thank you!! 👍👍👍

  • @roy5180
    @roy5180 3 years ago

    thank you sir

  • @kartarsingh7776
    @kartarsingh7776 6 years ago

    Super clarity......

  • @mustafasabeeh8893
    @mustafasabeeh8893 3 years ago

    thanks

  • @fascist27
    @fascist27 14 years ago

    really helpful

  • @BlackfireGippal
    @BlackfireGippal 12 years ago

    I wish to know how to solve this: x has the values -2, 0, 1, 2, 3 and y: 17, 5, 2, 1, 2, and I'm asked to use the least squares method, but I've been absent and I don't know exactly what my teacher meant by that or what that method consists of. Can anyone help me solve this?
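
    Not a complete answer, but a sketch of what the method does with those numbers, assuming a straight-line fit y ≈ m*x + c (the question doesn't say which model is wanted; NumPy is used here just to show the arithmetic):

        import numpy as np

        # Data from the question above.
        x = np.array([-2.0, 0.0, 1.0, 2.0, 3.0])
        y = np.array([17.0, 5.0, 2.0, 1.0, 2.0])

        # Design matrix with columns [x, 1]; least squares minimizes ||A @ [m, c] - y||^2.
        A = np.column_stack([x, np.ones_like(x)])

        # Solve the normal equations (A^T A) [m, c]^T = A^T y.
        m, c = np.linalg.solve(A.T @ A, A.T @ y)

        print(m, c)                   # slope and intercept of the least-squares line
        print(A @ np.array([m, c]))   # fitted values at each x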

  • @arico94
    @arico94 6 years ago

    Should have used n instead of k; it's usually m×n in R^n.

  • @luffy08dn
    @luffy08dn 13 years ago

    Thanks

  • @91leonetammie
    @91leonetammie 4 years ago +1

    This is the first Khan Academy video I watch and don't understand...

    • @mwaleed2082
      @mwaleed2082 4 years ago

      For that you need to study orthogonal complements and the concept of spanning sets, which leads to the concepts of column space, null space, etc.

  • @rbfreitas
    @rbfreitas 14 years ago

    Good video!!!! And nice work! Good luck with the KhanAcademy :)

  • @MistrVahag
    @MistrVahag 13 years ago

    Excellent video.
    Thanks much :))))))))
    Vahag

  • @MrZulfiqar37
    @MrZulfiqar37 10 years ago +3

    I have a question..
    Does the least squares approximation always have a solution?

    • @CR-iz1od
      @CR-iz1od 9 years ago +2

      +Zulfiqar Ali not if you don't solve it.

    • @Daski69
      @Daski69 8 years ago +1

      +Conor Raypholtz it still has a universally reasonable solution

    • @spindash64
      @spindash64 8 years ago +2

      I'm pretty sure that is the idea of least squares: to provide a close answer when you can't give an exact one

    • @shredding121
      @shredding121 7 years ago

      It does always have one - if Ax = b has a solution, then it's a vector in the column space of A, and if not, it's the projection of b onto the column space of A.

    • @natebush26
      @natebush26 6 years ago

      There is always a solution to the least squares problem. Why? The projection of b onto C(A) lies in colspace(A) by definition, so there is always a set of weights x* whose linear combination of the columns of A equals that projection.

  • @ArafatAmin
    @ArafatAmin 12 years ago

    What happens when A^T A is singular? How do we solve for the least squares solution?

  • @zhiqiguo803
    @zhiqiguo803 10 years ago +1

    love this guy

  • @shinigummyl1586
    @shinigummyl1586 6 years ago +25

    2018? I'm alone :(

    • @diesel7777777
      @diesel7777777 6 years ago +4

      I'm here.

    • @bli240
      @bli240 5 years ago +5

      Onto 2019!

    • @biswa_407
      @biswa_407 11 months ago

      2024 here

  • @SanwaOfficial
    @SanwaOfficial 5 years ago

    I have one question: is the least squares solution (LSS) always consistent? If yes, how can I prove it? Please answer.

    • @mwaleed2082
      @mwaleed2082 4 years ago

      Hi, not sure if you're still looking for the answer, but could you please describe what you mean by consistent?

    • @biswa_407
      @biswa_407 11 months ago

      It means whether we can always find a least squares solution of a system.

  • @utte12
    @utte12 12 years ago

    Nice vid, but why did you take the length squared? I understand that the length of the vector would be sqrt(b1^2 + b2^2 + ... + bn^2), but why did you square even that?

    • @lucasm4299
      @lucasm4299 6 years ago

      utte12
      Because it’s easier to work with minimizing the sum of squares than minimizing the square root of a sum of squares, and since the square root is increasing, both are minimized by the same x. That’s my guess.

  • @trejkaz
    @trejkaz 2 years ago

    I tried using this trick for the problem I'm facing, but it turns out that when I multiply A^T by A, I get a matrix which isn't invertible, so I still can't solve it. LOL
    This _still_ seems odd to me, because even if some element in the input matrix A was contributing 0 to the result b, it should _still_ be possible to get a point as close as possible to the result.

  • @winnies1001
    @winnies1001 7 years ago

    How did you know that it was a projection onto Col(A) and not anything else, like the Range(A)?

    • @lucasm4299
      @lucasm4299 6 years ago

      Winnie Shi
      Col(A) already is the range of A.

  • @priestofrhythm
    @priestofrhythm 13 years ago +1

    I am the 60th guy liking it !! :P :D
    Great vid, thank you. :)

  • @Ben.N
    @Ben.N 3 years ago +1

    Big brain

  • @kavishdoshi2408
    @kavishdoshi2408 8 years ago

    It's good (accha hai)

  • @hugoderuyver
    @hugoderuyver 3 years ago

    ICAM ! ICAM ! .... .. ...... !

  • @nilsclaessens5203
    @nilsclaessens5203 11 months ago

  • @aysegocer3308
    @aysegocer3308 2 years ago +1

    🤩

  • @jihyepark9139
    @jihyepark9139 4 years ago

    Sometimes I can't see what he's writing.

  • @dion9795
    @dion9795 3 years ago

    bro just do an example lol

  • @spechtbert
    @spechtbert 12 years ago

    n1

  • @山田林-f5b
    @山田林-f5b 2 years ago

    gorgeous
