Econometrics: Endogeneity in Ordinary Least Squares Regression

  • Published Jan 25, 2025

Comments • 19

  • @RightAIopen · months ago

    Best professor EVER!

  • @oscarrosalescorzo · 2 years ago +3

    I'm quite grateful for your videos. They help me to be the economist I want to be.

  • @jdmresearch · 4 years ago +2

    Really great. Thanks.

  • @gianlucalepiscopia3123 · 3 years ago

    Hello teacher, isn't the priority the same for an econometrician and a data scientist? I mean, accurate prediction means accurate parameter estimation. Am I wrong? Thanks

  • @joylm9108 · 2 years ago

    Does that mean you have to introduce the missing variable to reduce the bias and endogeneity? In your simulation you introduced both study time and interest, so why was there an increase in bias instead of a reduction?

    • @NickHuntingtonKlein · 2 years ago

      Yes, we'd need to bring any omitted variables, like interest, into the model as controls to remove the bias.

    • @joylm9108 · 2 years ago

      @@NickHuntingtonKlein In your simulation example did you have two X variables or only one X variable?

    • @NickHuntingtonKlein · 2 years ago +1

      @@joylm9108 Only one. You can see the regression equation as ClassTime ~ StudyTime on the slide at 6:00
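
    The exchange above can be reproduced with a small simulation. This is a sketch, not the video's actual code: the variable names (study_time, interest, grade) and all coefficients are assumed for illustration. Omitting a variable that drives both the regressor and the outcome leaves it in the error term, which biases the OLS slope; adding it as a control removes the bias.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000

    # "Interest" is the omitted variable: it raises both study time and grades.
    interest = rng.normal(size=n)
    study_time = 0.5 * interest + rng.normal(size=n)
    grade = 2.0 * study_time + 3.0 * interest + rng.normal(size=n)

    # Short regression: interest stays in the error term -> endogeneity.
    X_short = np.column_stack([np.ones(n), study_time])
    b_short, *_ = np.linalg.lstsq(X_short, grade, rcond=None)

    # Long regression: bringing interest into the model removes the bias.
    X_long = np.column_stack([np.ones(n), study_time, interest])
    b_long, *_ = np.linalg.lstsq(X_long, grade, rcond=None)

    print(b_short[1])  # noticeably above the true slope of 2 (upward bias)
    print(b_long[1])   # close to the true slope of 2
    ```

    Here the short regression has only one X variable, as in the video's slide, and its slope absorbs part of interest's effect because the two regressors are correlated.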

  • @tshinangakeith5819 · 4 years ago

    You just saved me. Thanks!

  • @filipp800 · 3 years ago

    What if in the example we bring out Interest from the error term? Then x1 and x2 will be correlated, is it a problem?

    • @NickHuntingtonKlein · 3 years ago +1

      Not a problem at all for the predictors to be correlated. At worst if they're *super strongly correlated* it can inflate their standard errors.

    • @faiqkhan489 · 3 years ago

      It's touch and go: as you find x2 and put it into the model, you're also exposing the model to multicollinearity, which can weaken its predictive power.

    • @NickHuntingtonKlein · 3 years ago +2

      @@faiqkhan489 multicollinearity isn't really a problem in OLS unless it's very strong multicollinearity. Just having two predictors with a nonzero correlation is no problem.

    • @filipp800 · 3 years ago

      @@NickHuntingtonKlein that’s because we interpret the coefficients ceteris paribus, right?

    • @NickHuntingtonKlein · 3 years ago +1

      @@filipp800 I'm not sure I'd say that's *why* it's not a problem, but it is a reminder that, if the goal is to interpret a particular coefficient, the main reason for including other predictors is to reduce omitted variable bias, which will only happen if the new predictors *are* related to the existing ones.
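
    The point about multicollinearity can be checked numerically. This is a sketch on made-up data (the correlation values and coefficients are assumed, not from the video): the standard error of a coefficient grows with the correlation between predictors, but only blows up when that correlation is near 1.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 5_000

    def slope_se(rho):
        """Standard error of the x1 coefficient when corr(x1, x2) = rho."""
        x1 = rng.normal(size=n)
        x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.normal(size=n)
        y = x1 + x2 + rng.normal(size=n)
        X = np.column_stack([np.ones(n), x1, x2])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        sigma2 = resid @ resid / (n - X.shape[1])  # residual variance
        return np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])

    print(slope_se(0.3))   # mild correlation: SE barely inflated
    print(slope_se(0.99))  # near-collinear: SE several times larger
    ```

    With moderate correlation the coefficients remain precisely estimated, which matches the replies above: correlated predictors are fine in OLS, and in fact controls only remove omitted variable bias when they *are* correlated with the existing regressors.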

  • @mullerqenawy7715 · 4 years ago

    Very beneficial