Proof (part 3) minimizing squared error to regression line | Khan Academy

  • Published 23 Oct 2024

Comments • 25

  • @孟令軻
    @孟令軻 8 years ago +4

    I really love this series of statistics sagas. Whenever I want to start a data analysis in Excel or some other tool, this basic knowledge helps me better understand what I am doing.

  • @aquilazyy1125
    @aquilazyy1125 4 years ago

    This is from one of the homework problems I'm currently working on, a formula for computing m & b: let vector x = [x1,...,xn]^T, y = [y1,...,yn]^T, matrix A = [x, [1,...,1]^T]; then [m,b]^T = (A^T•A)^{-1} • A^T • y.
    If you move the matrix inverse from the right side to the left and then expand out the matrix multiplication, it actually gives you the system of equations at the end of the video. I couldn't understand why at first, but your video explains this much better.
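The normal-equations formula in the comment above can be checked numerically. Below is a minimal sketch in pure Python with made-up data (not from the video): it solves the 2x2 system (A^T A)[m, b]^T = A^T y by Cramer's rule, then cross-checks the slope against the mean-based closed form that the video derives.

```python
# Least-squares line y = m*x + b via the normal equations (A^T A)[m, b]^T = A^T y,
# where A has rows (x_i, 1). Data values here are made up for illustration.

def fit_line(xs, ys):
    n = len(xs)
    sxx = sum(x * x for x in xs)               # sum of x_i^2   (entry of A^T A)
    sx = sum(xs)                               # sum of x_i
    sxy = sum(x * y for x, y in zip(xs, ys))   # sum of x_i*y_i (entry of A^T y)
    sy = sum(ys)                               # sum of y_i
    det = sxx * n - sx * sx                    # determinant of A^T A
    m = (sxy * n - sx * sy) / det              # Cramer's rule for the 2x2 system
    b = (sxx * sy - sx * sxy) / det
    return m, b

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.1, 5.9, 8.0]
m, b = fit_line(xs, ys)   # roughly the line y = 1.98*x + 0.05

# Cross-check: the slope from the mean-based form in the video,
# m = (mean(x)*mean(y) - mean(x*y)) / (mean(x)**2 - mean(x**2))
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
xybar = sum(x * y for x, y in zip(xs, ys)) / n
x2bar = sum(x * x for x in xs) / n
m_means = (xbar * ybar - xybar) / (xbar ** 2 - x2bar)
```

Expanding the matrix product row by row reproduces exactly the two linear equations in m and b that the video reaches by setting both partial derivatives to zero.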

  • @oling2812
    @oling2812 11 years ago +8

    To everyone who's wondering where part 2 is, here's the link (can't post the link, so here's the URL after the .com): /watch?v=f6OnoxctvUk

    • @SimplyAndy
      @SimplyAndy 3 years ago

      This comment needs more votes. Relevant even after 8 years.

  • @ananthakrishnank3208
    @ananthakrishnank3208 1 year ago

    Elegant proof! Many thanks.

  • @anweshadutta8782
    @anweshadutta8782 4 years ago

    Thank you so much.

  • @DevShah-z2d
    @DevShah-z2d 8 months ago

    thanks

  • @gupta942
    @gupta942 4 years ago

    Really Love ❤️ your explanation...!!!
    Thank you ...!!!

  • @andregomes2629
    @andregomes2629 5 years ago

    This is awesome, thanks Khan!

  • @80amnesia
    @80amnesia 3 years ago

    thanks! You're always the best

  • @ad2181
    @ad2181 14 years ago

    Excellent insight.

  • @NorwegianExplorer
    @NorwegianExplorer 9 years ago +12

    This playlist still needs some tidying up. Just move proof part 2 between parts 1 and 3, please.

  • @charliean9237
    @charliean9237 5 years ago +1

    at 10:31 the "two" points are actually the same point: (x_bar^2/x_bar, (x_bar * y_bar)/x_bar) is exactly (x_bar, y_bar)

    • @muhammadaliburhanuddin9484
      @muhammadaliburhanuddin9484 5 years ago

      no, it's not the same point..

    • @aquilazyy1125
      @aquilazyy1125 4 years ago

      It's x^2_bar (the mean of the squares), not x_bar^2 (the square of the mean). Remember that these are actually averages of n terms, so you can't just cancel them out.
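A quick numeric illustration of the reply above, using arbitrary made-up values: the mean of the squares differs from the square of the mean for any non-constant data, so the x_bar factors in the point's coordinates cannot simply be cancelled.

```python
# Mean of squares vs. square of the mean: they differ for non-constant data.
xs = [1.0, 2.0, 3.0]   # arbitrary illustrative values
n = len(xs)

mean_of_squares = sum(x * x for x in xs) / n   # (1 + 4 + 9) / 3, about 4.667
square_of_mean = (sum(xs) / n) ** 2            # 2 ** 2 = 4.0

print(mean_of_squares, square_of_mean)         # not equal
```

The two quantities agree only when every x value is identical, which is exactly the degenerate case where no regression line can be fit.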

  • @aditijuneja1848
    @aditijuneja1848 2 years ago

    The condition of the partial derivatives being 0 is true for maxima too. So how do we know that our point is a minimum and not a maximum?

  • @clar331
    @clar331 5 years ago +1

    this goes too fast for me, I'm going to rewatch x(

  • @InternetDarkLord
    @InternetDarkLord 3 years ago

    This playlist is out of order. Part 1 is #52, part 2 is #56, and part 3 is #53.

  • @pradoshkumarsahu2706
    @pradoshkumarsahu2706 6 years ago +3

    Watch Part 2 here before you follow part 3:
    th-cam.com/video/f6OnoxctvUk/w-d-xo.html

  • @giveitago7745
    @giveitago7745 1 year ago

    If you wonder where these derivatives came from, let me tell you: there is another video, a few videos after this one, that should have been placed before this one. It is actually misplaced.

  • @bartcase
    @bartcase 13 years ago

    you're awesome

  • @ChimaevMikelson
    @ChimaevMikelson 5 months ago

    💯

  • @sagar7958
    @sagar7958 7 years ago +2

    How did he decide that the figure would be a 3D figure?

    • @mrroo9658
      @mrroo9658 7 years ago +11

      Because there are two variables (m, b). When y is a function of only one variable, e.g. y = f(x), its graph is 2D, and when y is a function of two variables, e.g. y = f(x, z), its graph is 3D.
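The reply above can be made concrete with a small sketch (made-up data): the squared error SSE depends on the two variables m and b, so its graph is a surface over the (m, b)-plane, a 3D figure with a single lowest point.

```python
# SSE as a function of the two variables (m, b): its graph is a 3D surface.
# Made-up data lying exactly on y = 2x, so SSE(2, 0) = 0 is the minimum.

xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]

def sse(m, b):
    """Sum of squared vertical distances from the points to the line y = m*x + b."""
    return sum((y - (m * x + b)) ** 2 for x, y in zip(xs, ys))

print(sse(2.0, 0.0))   # the true line fits exactly, so the error is zero
print(sse(1.0, 0.0))   # residuals 1, 2, 3 squared and summed give 14
print(sse(2.0, 1.0))   # shifting b up by 1 adds 1 squared per point, giving 3
```

Moving (m, b) in any direction away from the optimum raises the error, which is the bowl shape the video draws when it sets both partial derivatives to zero.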