Super helpful! This basically means that whatever is in the error term is uncorrelated with the x variable(s), which is why this may never actually hold in social science research: there may be some overlap that we just don't know about yet.
thank you for this explanation lol
@@josh.c36 be careful! zero conditional mean rules out non-linear relationships as well, and it is a stronger assumption than uncorrelatedness
Really good supplementary material for my Econometrics class, Thank you!
You are the hero of the year, thanks Ben! for all the videos!
Why does E[u|x]=0 imply cov[u,x]=0?
Using the discrete case (continuous is the same), say E[u] = 0. For E[u|x]: E[u|x] = sum(u * P_u|x(u)). If x and u are independent, the term P_u|x(u) can be expanded (from the conditional probability formula) as the joint PMF of u and x divided by the PMF of x: P_u|x(u) = P_u,x(u,x) / P_x(x). Since they are independent, P_u,x(u,x) = P_u(u)*P_x(x), so P_u|x(u) = P_u(u)*P_x(x) / P_x(x) = P_u(u), and therefore E[u|x] = sum(u * P_u(u)) = E[u]. Since we already said E[u] = 0, we get E[u|x] = E[u] = 0. QED.

Second, cov[u,x] can be expressed as E[(u - mu_u)(x - mu_x)], where mu is the mean (expected) value. Expanding this formula gives E[ux - u*mu_x - mu_u*x + mu_u*mu_x]; since each mu is a constant, and the expected value of a constant is that constant, this simplifies to E[ux] - E[u]E[x], which is the expression for covariance. (When u and x are the same variable, this becomes the formula for the variance of u: var[u] = E[u^2] - E[u]^2.)

Back to the topic: since we assumed u and x are independent (note the direction here: independence implies they are uncorrelated, not the other way around), we have E[ux] = E[u]E[x], a property you can verify. Then cov(u,x) = E[ux] - E[u]E[x] = E[u]E[x] - E[u]E[x] = 0. QED
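A quick numerical check of the two claims above (my own Python sketch, not from the original comment): draw u and x independently with E[u] = 0, then verify that the mean of u within each value of x is near E[u] = 0, and that the sample covariance is near 0.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.integers(0, 4, size=n)        # discrete x taking values 0..3
u = rng.normal(0.0, 1.0, size=n)      # drawn independently of x, E[u] = 0

# E[u|x] should be approximately E[u] = 0 for every value of x
cond_means = [u[x == v].mean() for v in range(4)]
print(cond_means)                     # each entry close to 0

# cov(u, x) = E[ux] - E[u]E[x] should also be close to 0
print(np.cov(u, x)[0, 1])
```

With a million draws the conditional means and the covariance all come out within sampling noise of zero, matching the derivation.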
Knowing the value of x doesn't change what you expect from u, so they don't vary together, which is quantified by a covariance of 0.
given E(u|x)=0, there is no need for a further assumption of independence to demonstrate no correlation, Cov(u,x)=0:

E(u*x) = E(E(u*x|x)) ........... law of iterated expectations; the outer expectation is taken over x
E(E(u*x|x)) = E(x*E(u|x)) ...... since x is held constant inside the inner conditional expectation
E(x*E(u|x)) = E(x*0) = 0 ....... using the given Gauss-Markov assumption, E(u|x)=0

So,
Cov(u,x) = E(u*x) - E(u)*E(x) = E(u*x) - E(E(u|x))*E(x) = 0 - 0*E(x) = 0
@@kottelkannim4919 great explanation thanks
@@nadekang8198 THIS IS THE BEST EXPLANATION I'VE EVER SEEN IN MY LIFE. BETTER THAN ALL THE CLASSES I'VE HAD THIS SEMESTER. THANK YOU SO MUCH!!!!
But we are assuming least squares so doesn't the line of best fit always lie right between the observations thus having a zero conditional mean? I don't quite understand
That would be the line of best fit if you plot y against x, but in this case we're plotting u against x.
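The distinction in the reply above can be made concrete (a minimal sketch of my own, assuming a simple linear model): with an intercept included, OLS mechanically forces the fitted residuals to have zero sample mean and zero sample covariance with x, whatever the data look like. The zero-conditional-mean assumption E(u|x)=0 is about the unobserved population errors, not about these fitted residuals.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
x = rng.uniform(0.0, 10.0, size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)   # any data-generating process works here

b1, b0 = np.polyfit(x, y, 1)             # OLS slope and intercept
resid = y - (b0 + b1 * x)

print(resid.mean())                      # ~0 by construction (normal equations)
print(np.cov(resid, x)[0, 1])            # ~0 by construction, too
```

That is why the sample residuals always "balance" around the fitted line; it says nothing about whether the true errors satisfy E(u|x)=0, which is an assumption about the population, not a property of least squares.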
@@Donutscanfly Thanks for the help, I passed my statistics class :)
Many thanks mate!!!!
Thank you so much!!!!! super helpful and will let me pass my course
Please I don't really understand the concept behind zero mean
Hi can you explain why you use ui given xi I don't understand the intuition behind that?
Cool !!!
I don't understand a single thing here, not even to save my life.
You must be related to my professor. 0 to 60 in under 4 seconds. You left me behind : /