Best professor EVER!
I'm quite grateful for your videos. They help me to be the economist I want to be.
Really great. Thanks.
Hello teacher, isn't the priority the same for an econometrician and a data scientist? I mean, accurate prediction means accurate parameter estimation. Am I wrong? Thanks
Does that mean you have to introduce the missing variable to reduce the bias and endogeneity? In your simulation you introduced both study time and interest, so why was there an increase in bias instead of a reduction?
Yes, we'd need to bring any omitted variables, like interest, into the model as controls to remove the bias.
@@NickHuntingtonKlein In your simulation example did you have two X variables or only one X variable?
@@joylm9108 Only one. You can see the regression equation as ClassTime ~ StudyTime on the slide at 6:00
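A minimal R sketch of that one-predictor setup, and of what bringing Interest into the model would do; the variable names match the video's example, but the coefficient values are illustrative assumptions, not the video's exact numbers:

set.seed(123)
n <- 10000
Interest  <- rnorm(n)                                 # omitted variable, hiding in the error term
StudyTime <- 0.5 * Interest + rnorm(n)                # correlated with the omitted variable
ClassTime <- 1 * StudyTime + 1 * Interest + rnorm(n)  # true effect of StudyTime is 1

coef(lm(ClassTime ~ StudyTime))                       # short regression: coefficient biased upward (about 1.4)
coef(lm(ClassTime ~ StudyTime + Interest))            # control for Interest: coefficient near the true 1

The short regression loads part of Interest's effect onto StudyTime, which is the omitted variable bias being discussed; adding Interest as a control removes it.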
You just saved me. Thanks!
What if, in the example, we bring Interest out of the error term? Then x1 and x2 will be correlated; is that a problem?
Not a problem at all for the predictors to be correlated. At worst, if they're *super strongly correlated*, it can inflate their standard errors.
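A quick sketch of that standard-error point with made-up data: moderate correlation between predictors barely moves the standard errors, while near-perfect correlation inflates them badly.

set.seed(1)
n <- 1000
x1 <- rnorm(n)
x2_mild   <- 0.5 * x1 + rnorm(n)                      # moderately correlated with x1
x2_severe <- x1 + rnorm(n, sd = 0.01)                 # almost perfectly correlated with x1
y_mild   <- x1 + x2_mild + rnorm(n)
y_severe <- x1 + x2_severe + rnorm(n)

summary(lm(y_mild ~ x1 + x2_mild))$coefficients       # standard errors stay small
summary(lm(y_severe ~ x1 + x2_severe))$coefficients   # standard errors blow up on x1 and x2_severe

The coefficient estimates remain unbiased in both cases; only their precision suffers under severe multicollinearity.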
It's touch and go: as you find x2 and put it into the model, you're also exposing your model to multicollinearity, which makes the model have weaker predictive power.
@@faiqkhan489 multicollinearity isn't really a problem in OLS unless it's very strong. Just having two predictors with a nonzero correlation is no problem.
@@NickHuntingtonKlein that’s because we interpret the coefficients ceteris paribus, right?
@@filipp800 I'm not sure I'd say that's *why* it's not a problem, but it is a reminder that, if the goal is to interpret a particular coefficient, the main reason for including other predictors is to reduce omitted variable bias, which will only happen if the new predictors *are* related to the existing ones.
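To illustrate that last point with a small simulated example (the names and numbers are hypothetical, in the spirit of the video's setup): a control that is unrelated to the existing predictor does nothing to the bias, while a related one removes it.

set.seed(42)
n <- 10000
Interest  <- rnorm(n)                                 # confounder, correlated with StudyTime
Unrelated <- rnorm(n)                                 # predictor of the outcome, unrelated to StudyTime
StudyTime <- 0.5 * Interest + rnorm(n)
ClassTime <- StudyTime + Interest + Unrelated + rnorm(n)  # true StudyTime effect is 1

coef(lm(ClassTime ~ StudyTime))                       # biased upward (about 1.4)
coef(lm(ClassTime ~ StudyTime + Unrelated))           # still biased: the new control isn't related to StudyTime
coef(lm(ClassTime ~ StudyTime + Interest))            # bias removed: Interest is related to StudyTime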
Very Beneficial