Awesome lecture. Thank you!!!😊
@darrenpeck156 Thanks so much!
Thanks, Prof, for your regular publications! :)
Thanks Arman! I will try to film more!
Short and clear, thanks!
this helps me a lot, thanks!
I’m glad! Thanks for watching! @strxngerr
@mikethemathematician if you don't mind, can you explain the last blue part?
Is there a reason you write X like that? 😅 BTW, this is a great video series! Would love it if you could expand this further into even deeper mathematical statistics, or more on the applied side with econometrics methods, etc.
Probably to distinguish from χ
Awesome, this video helped me a lot 👍
@NurbolBelyal Thanks so much!
Great video!
Thanks!
Well, this makes sense to me, but can someone please tell me why the proof in my textbook complicates it so much more than this? It used an orthogonal matrix to transform some random variables, and utilized some covariance matrices to get to the final point.
@charlesAcmen Thanks for the comment! I try to present the easiest approach that comes to my mind. Your textbook using orthogonal matrices is an indication of the power of linear algebra, so the authors probably want you to know that orthogonal matrices are incredibly useful in statistics... think about principal component analysis, for example. In PCA you are trying to write linear combinations of the explanatory variables in such a way as to capture as much variance in the data set as possible while building new orthogonal variables!
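Chiming in with a minimal numpy sketch of that PCA point (a toy example of my own, not from the video or the textbook): the eigenvectors of the covariance matrix form an orthogonal matrix, and projecting the data onto them yields new uncorrelated variables, ordered by how much variance they capture.

```python
import numpy as np

rng = np.random.default_rng(0)
# Correlated 2-D data (hypothetical example)
X = rng.multivariate_normal([0, 0], [[3.0, 1.5], [1.5, 1.0]], size=2000)

# PCA: eigenvectors of the covariance matrix form an orthogonal matrix
cov = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # columns of eigvecs are orthonormal

# Project onto the principal axes: the new variables are uncorrelated,
# and their variances are exactly the eigenvalues
Z = X @ eigvecs
print(np.round(np.cov(Z, rowvar=False), 3))  # near-diagonal covariance matrix
```

The orthogonality is the whole trick: rotating by an orthogonal matrix preserves total variance while decorrelating the coordinates, which is also the move your textbook's proof is exploiting.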
@@mikethemathematician never thought u would reply(˃ ⌑ ˂ഃ ),thank you thank you
So the sample version drops a single degree of freedom.
@darrenpeck156 Yes! Thanks for watching!
\bar X and X_i are not independent. The additivity is a little problematic.
@geolab6193 You are correct that \bar X and X_i are positively correlated. We don't need any expected values in this proof, so that will not be an issue. Thanks for commenting!
@@mikethemathematician Thanks for your reply. My point is that when two chi-squared distributions are added to get a third, the underlying normal distributions need to be independent, which is not the case here. So the last step of the proof does not seem perfectly tight. The conclusion is still correct, though. Thanks for the lectures.
@@geolab6193 Agreed. It seems that he is relying on a Corollary of Cochran's theorem from Cochran, 1934. I am not sure that the chi-squared subtraction property mentioned at 7:04 is true in general (I kind of doubt it).
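For anyone following this thread, here is a quick numpy simulation (my own sanity check, not a proof) of the two claims at issue: that (n-1)S²/σ² behaves like a χ² with n-1 degrees of freedom, and that \bar X is uncorrelated with the residuals X_i - \bar X, which is the independence fact that makes the additivity/subtraction step legitimate for normal samples.

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps, sigma = 5, 100_000, 2.0
X = rng.normal(0.0, sigma, size=(reps, n))  # reps samples of size n

xbar = X.mean(axis=1)
s2 = X.var(axis=1, ddof=1)           # sample variance S^2
stat = (n - 1) * s2 / sigma**2       # claimed chi-squared with n-1 df

# A chi^2_{n-1} has mean n-1 and variance 2(n-1); here n-1 = 4,
# so these should come out close to 4 and 8
print(stat.mean(), stat.var())

# xbar and the residuals X_i - xbar should be uncorrelated
# (and, for normal data, actually independent)
resid = X - xbar[:, None]
print(np.corrcoef(xbar, resid[:, 0])[0, 1])  # should be near 0
```

This doesn't settle whether chi-squared subtraction holds in general (I share the doubt above; it needs the independence that Cochran's theorem supplies here), but the simulated moments match χ²_{n-1} nicely.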