I genuinely love you for this
I love all your videos. I have been struggling with some concepts
Why did you assume \sum(x_i - x_bar) = 0 at 5:00 but then take a conditional expectation of the same variables at 8:00? I would have just assumed both sums equal zero and let the \sum(x_i - x_bar)^2 cancel. You would be left with beta_1_hat = beta_1.
Great video either way. I am just not sure why you didn't assume \sum(x_i - x_bar) = 0 in both cases.
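A quick numeric check (made-up data, just to illustrate the identity) shows why only the plain sum vanishes: \sum(x_i - x_bar) = 0 always, but the weighted sums \sum(x_i - x_bar)^2 and \sum(x_i - x_bar) y_i do not, so they cannot be cancelled the same way.

```python
# Made-up data: the plain deviations always sum to zero,
# but the squared and y-weighted deviations do not.
x = [1.0, 2.0, 4.0, 7.0]
y = [2.1, 3.9, 8.2, 13.8]
xbar = sum(x) / len(x)

s1 = sum(xi - xbar for xi in x)                      # algebraic identity: 0
s2 = sum((xi - xbar) ** 2 for xi in x)               # > 0 whenever the x_i vary
s3 = sum((xi - xbar) * yi for xi, yi in zip(x, y))   # generally nonzero

print(s1, s2, s3)  # s1 = 0.0, s2 = 21.0, s3 nonzero
```

So \sum(x_i - x_bar) = 0 kills the beta_0 term, while the u_i term survives and has to be handled with the conditional expectation.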
Why is the sum of (x_i - xbar) constant? Here, x_i takes different values!
And why do you take a conditional expectation?
Does this apply to multivariate regressions as well as univariate? Thank you, great videos!
I am not sure I understand your question. Are you asking whether the proof applies to multivariate regression?
For multivariate regression, there is a similar proof, but it uses matrix algebra.
@@RemiDav Hi, have you made any video applying this method to multivariate regression, for example a regression with 2 variables? I think we need to use matrices as you suggested, but the calculation is too complicated for me. Thank you
@@TheVista255 I didn't make any matrix version, sorry.
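For what it's worth, here is a minimal sketch of the matrix version with 2 regressors (not from the video; the data and coefficient values are made up, and it assumes NumPy is available). The point is only the formula beta_hat = (X'X)^{-1} X'y, which is the multivariate analogue of the estimator in the video.

```python
import numpy as np

# Sketch of matrix OLS for y = b0 + b1*x1 + b2*x2 + u (made-up data).
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
u = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + u

# Design matrix: a column of ones for the intercept, then the regressors.
X = np.column_stack([np.ones(n), x1, x2])

# beta_hat = (X'X)^{-1} X'y, computed via a linear solve for stability.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)  # close to the true values [1.0, 2.0, -0.5]
```

The unbiasedness proof then mirrors the scalar one: substitute y = X beta + u to get beta_hat = beta + (X'X)^{-1} X'u, and take the conditional expectation given X.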
Thank you for this video, really helpful. Just a small doubt: in the video, the summation of (x_i - xbar) is constant because we condition on x (which then takes a fixed value, say c), so (x_i - xbar) = (c - xbar), and thus we are left with E[u|x].
6:12 How do you get to the last line from the previous one? What is the operation?
Distribute the 1/\sum(...) over the terms in the parentheses and simplify.
@@RemiDav Thanks for that! It was under my nose really! Just got it 5 minutes ago after a closer look.
I really enjoy your videos!
Cheers!
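For anyone else stuck at 6:12, here is the step written out (a standard reconstruction in the notation used in these comments; the video's exact symbols may differ). Substitute y_i = \beta_0 + \beta_1 x_i + u_i into the estimator and distribute:

```latex
\hat{\beta}_1
  = \frac{\sum_i (x_i - \bar{x})\, y_i}{\sum_i (x_i - \bar{x})^2}
  = \frac{\beta_0 \sum_i (x_i - \bar{x})
        + \beta_1 \sum_i (x_i - \bar{x})\, x_i
        + \sum_i (x_i - \bar{x})\, u_i}{\sum_i (x_i - \bar{x})^2}
  = \beta_1 + \frac{\sum_i (x_i - \bar{x})\, u_i}{\sum_i (x_i - \bar{x})^2}
```

using \sum_i (x_i - \bar{x}) = 0 to drop the \beta_0 term and \sum_i (x_i - \bar{x}) x_i = \sum_i (x_i - \bar{x})^2 to simplify the \beta_1 term.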
Can we consider the elements in the equation of the estimator b1 as random variables?
I don't understand your question. You would have to tell me which elements exactly you are referring to.
Note that if you are a Bayesian statistician, everything can be considered a random variable.
@@RemiDav I mean that, in the general case, every observation from a random sample is a random variable; so when doing the calculation, can we treat the observations as random variables?
@@RemiDav I meant exactly what this guy did in his video th-cam.com/video/5tMMESxjDBg/w-d-xo.html :)
You made me understand, thank you... Do you have any video on the LM statistic to share, please?
Are you talking about the Lagrange Multiplier test?
yes sir
Why is the sample y equal to the population Y?
Please indicate a time in the video