
Derivation of OLS coefficients

  • Published Aug 10, 2015
  • The simple maths of OLS regression coefficients for the simple (one-regressor) case.
    This video screencast was created with Doceri on an iPad. Doceri is free in the iTunes app store. Learn more at www.doceri.com
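    The closed-form estimates the video derives can be sketched in a few lines of code. A minimal illustration (the data below are made up purely for this example): b1 = sum((x_i - mean(x))*(y_i - mean(y))) / sum((x_i - mean(x))^2) and b0 = mean(y) - b1*mean(x).

    ```python
    def ols_simple(x, y):
        """Closed-form OLS estimates for the one-regressor case."""
        n = len(x)
        x_bar = sum(x) / n
        y_bar = sum(y) / n
        # Slope: covariance-style sum over variance-style sum
        b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
             / sum((xi - x_bar) ** 2 for xi in x)
        # Intercept: forces the fitted line through (mean(x), mean(y))
        b0 = y_bar - b1 * x_bar
        return b0, b1

    x = [1, 2, 3, 4, 5]
    y = [2.1, 3.9, 6.2, 8.0, 9.9]
    b0, b1 = ols_simple(x, y)
    print(b0, b1)   # roughly 0.11 and 1.97
    ```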

Comments • 33

  • @averymojica3742
    @averymojica3742 6 years ago +5

    Clear and to the point. I'm going to be referencing your videos for my econometrics class! Thank you so much!

  • @preritgoyal9293
    @preritgoyal9293 3 years ago +2

    Sir you are great!!
    Very easily explained such a complex derivation.
    😍😍

  • @amritraj3934
    @amritraj3934 2 years ago +1

    ooooh god ... you saved my degree sir!!!! hats off!!

  • @trchri
    @trchri 2 years ago

    This is the one that finally made it click. Thank you

  • @grkoica
    @grkoica 2 years ago

    Best regards to Mr.Mathematics...thank you

  • @jameschen2097
    @jameschen2097 3 years ago

    So clear. GOD u saved my life

  • @AJ-et3vf
    @AJ-et3vf 2 years ago

    awesome video sir. Thank you

  • @lovenessphiri2213
    @lovenessphiri2213 6 months ago

    Thank you for this😊

  • @user-io4sr7vg1v
    @user-io4sr7vg1v 11 days ago

    Bravo.

  • @kundananji.simutenda
    @kundananji.simutenda 2 years ago

    Well explained! What books would you recommend for econometrics?

  • @lefokomafoko2814
    @lefokomafoko2814 11 months ago

    exam in 2 weeks. thank you 😀

  • @tmrmbx5496
    @tmrmbx5496 4 years ago +1

    Thank you legend

  • @paultoronto42
    @paultoronto42 4 years ago

    Excellent explanation. There is just one step that I'm not 100% convinced about. No doubt this has to do with my own ignorance about summation. In the part where he solves for b_0 I can't quite see how he goes from Sigma (b_1 * X_i) to (b_1 * n * mean(X)). With my limited understanding I would have thought it should be (b_1 * n^2 * mean(X)). He calculates n but I'm thinking it should be n^2.

    • @billsundstrom8948
      @billsundstrom8948 4 years ago +1

      This follows from something mentioned around 6:50, namely by definition mean(X) = (1/n)sigma (X_i), so multiplying both sides by n we have sigma (X_i) = n*mean(X). Then note that because b_1 is a parameter we can factor it out of the sum, so Sigma (b_1 * X_i) = b_1 * Sigma (X_i) = b_1 * n*mean(X). There is no square involved because when we take the derivative with respect to b_0 in the minimization, the square term goes away.

    • @paultoronto42
      @paultoronto42 4 years ago

      @@billsundstrom8948 Thanks, but I'm still not 100% clear. I understood the derivative with respect to b_0. That is not where I got the n squared from. This is my thinking which I know is wrong but I don't quite understand why. If Sigma(b_1) = n*b_1, and Sigma(X_i) = n * mean(X) then wouldn't Sigma(b_1 * mean(X)) be [n * b_1 * n * mean(X)] = n^2 * b_1 * mean(X)?

    • @billsundstrom8948
      @billsundstrom8948 4 years ago +1

      ​@@paultoronto42 It helps to write out the sum: Sigma (b_1 * X_i) = b_1 * X_1 + b_1 * X_2 + b_1 * X_3... + b_1 * X_n and now factor out b_1 to get b_1 * (X_1 + X_2 + X_3... + X_n) = b_1 * n* mean(X)

    • @paultoronto42
      @paultoronto42 4 years ago

      @@billsundstrom8948 Thanks, that does help!

    • @paultoronto42
      @paultoronto42 4 years ago

      @@billsundstrom8948 Thanks also for you video on Summation Notation. I should have watched that one first.
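    The identity discussed in this thread is easy to verify numerically. A minimal sketch, using arbitrary made-up numbers: because b_1 is a constant, it factors out of the sum once, not n times.

    ```python
    # Check: sum(b1 * x_i) equals b1 * n * mean(x).
    b1 = 2.5                        # arbitrary constant, purely illustrative
    x = [1.0, 4.0, 7.0, 10.0]
    n = len(x)
    x_bar = sum(x) / n

    lhs = sum(b1 * xi for xi in x)  # b1*x_1 + b1*x_2 + ... + b1*x_n
    rhs = b1 * n * x_bar            # b1 factored out: b1 * sum(x_i) = b1 * n * mean(x)
    print(lhs, rhs)                 # both 55.0
    ```

    The n^2 intuition would only apply if each term of the sum were itself multiplied by n, which never happens here.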

  • @whitefiberman2094
    @whitefiberman2094 4 years ago

    Excellent derivation!

  • @darkhansaidnassimov4407
    @darkhansaidnassimov4407 3 years ago

    On 12:00, how did the sum of x_i*y_i - n*mean(x)*mean(y) turn into the second last row? Didn't quite get it. Thanks in advance

    • @williamsundstrom6103
      @williamsundstrom6103  3 years ago +2

      If you take the LHS of the second to last row and expand it, you will see that some terms can be collected and you get the LHS of the third to last row. The derivation of this equivalency is very similar to the derivation around 7:15.
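      The algebraic equivalence in question can be checked with a quick numerical sketch (the data below are made up purely for illustration): sum((x_i - mean(x))*(y_i - mean(y))) equals sum(x_i*y_i) - n*mean(x)*mean(y).

      ```python
      # Numeric check of the identity discussed above.
      x = [1.0, 2.0, 3.0, 4.0]
      y = [2.0, 1.0, 4.0, 3.0]
      n = len(x)
      x_bar, y_bar = sum(x) / n, sum(y) / n

      # Deviation form: sum of products of deviations from the means
      lhs = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
      # Expanded form: cross-product sum minus the n * means correction
      rhs = sum(xi * yi for xi, yi in zip(x, y)) - n * x_bar * y_bar
      print(lhs, rhs)   # both 3.0
      ```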

  • @quinnpisani180
    @quinnpisani180 4 years ago

    Would you take the same steps for the OLS estimate of Beta-hat2 if there was another slope estimator of Beta-hat2 in the regression line?

    • @billsundstrom8948
      @billsundstrom8948 4 years ago

      Yes, only then we have three equations in three unknowns so the formulas get more complicated.

    • @quinnpisani180
      @quinnpisani180 4 years ago

      Thank you
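    The two-regressor case mentioned in this thread can be sketched numerically: the three first-order conditions form the normal equations (X'X)b = X'y in the three unknowns b0, b1, b2. A minimal illustration (the video covers only the one-regressor case; the data and use of NumPy here are my own, purely illustrative):

    ```python
    import numpy as np

    # Made-up data generated from an exact linear relationship,
    # so OLS should recover the coefficients b0=1, b1=2, b2=3.
    x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    x2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
    y = 1.0 + 2.0 * x1 + 3.0 * x2

    # Design matrix with a column of ones for the intercept
    X = np.column_stack([np.ones_like(x1), x1, x2])

    # Solve the three normal equations (X'X) b = X'y
    b = np.linalg.solve(X.T @ X, X.T @ y)
    print(b)   # approximately [1, 2, 3]
    ```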

  • @shivammishrashashwat
    @shivammishrashashwat 2 years ago

    Awesome 🙏🙏

  • @1UniverseGames
    @1UniverseGames 3 years ago

    How can we obtain the intercept B0 and slope B1 after shifting line l to l'?

    • @williamsundstrom6103
      @williamsundstrom6103  3 years ago

      Not exactly sure what you are asking, Jahid. The B0 and B1 derived in this video are estimates completely dependent on the data... so if we have new or different data that imply a different line, we will have different calculated values of B0 and B1.

  • @rohtashbhall2671
    @rohtashbhall2671 5 years ago

    Very nice 👍

  • @NaveenKumar-bi2ku
    @NaveenKumar-bi2ku 4 years ago

    Thanks... it's awesome :)

  • @ashishchauhan7343
    @ashishchauhan7343 7 years ago +1

    I want the multiple regression derivation

    • @pianotalent
      @pianotalent 6 years ago

      I will be posting a comprehensive proof and derivation of the multiple regression formulas in a couple of weeks; busy at the moment...

    • @msfasha
      @msfasha 1 year ago

      Very clear, brilliant 👍