Awesome, simple and to the point, very easy to understand. I have two quick questions: 1. May I say GMM is a way to calculate variance? 2. It is kind of a silly question, but is there any reason we only use the 2nd and the 4th moment equations?
1. It is a way to find an ESTIMATOR for the variance; that is an important distinction in statistics. 2. No, in theory we could use infinitely many moments. However, it turns out that there is a trade-off between the bias and the variance of the estimator: if we use too many moments, our estimator's variance increases. Thus, there are moment selection criteria (reference is Hansen 1987) to select the optimal number of moments.
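To make the reply concrete, here is a minimal NumPy sketch of GMM estimation of a variance using the two moment conditions mentioned in the thread (for x ~ N(0, sigma^2): E[x^2] = sigma^2 and E[x^4] = 3*sigma^4). The variable names and the identity weighting matrix are my own illustrative choices, not from the video:

```python
import numpy as np

# Simulate data with a known true variance so we can check the estimator.
rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=2.0, size=200_000)  # true sigma^2 = 4

def gmm_objective(s2):
    # Stack the two moment conditions for N(0, sigma^2):
    #   E[x^2] - sigma^2   = 0
    #   E[x^4] - 3*sigma^4 = 0
    g = np.array([np.mean(x**2) - s2,
                  np.mean(x**4) - 3 * s2**2])
    return g @ g  # quadratic form with an identity weighting matrix

# Two conditions, one parameter: over-identified, so minimize the objective
# (here by a simple grid search to keep the sketch dependency-free).
grid = np.linspace(0.5, 10.0, 2000)
s2_hat = grid[np.argmin([gmm_objective(s) for s in grid])]
print(s2_hat)  # close to the true value 4
```

An efficient GMM implementation would replace the identity matrix with the inverse of the moment covariance matrix; the sketch keeps the identity for readability.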
Please do a video on convergence in probability and convergence in distribution, if possible covering Chebyshev's inequality etc. It would be very helpful. Thanks!
I have not yet come across an explanation for GMM which is as simple to understand as this. Thank you very much.
Best simple explanation of GMM I've seen so far. Thank you!
I'm so glad that I found this video. It was very easy to understand!
Very good and easy to understand material. Thanks for your work!
This video was simply explained thank you so much. God bless you!
Thank you very much, well explained and very clear, congratulations and thank you for sharing
very good job... thanks for the video!
This was so well explained. Thanks
Awesome video. Congratulations!
Really well explained. Thank you
Cool explanation!
A video on the different modes of convergence would be great: convergence in probability vs. convergence in distribution, etc.
@@FinAndEcon Wow, thank you for your prompt reply providing useful information and great explanation.
Can you explain the connection between GMM and OLS?
Thank you!!!
Extraordinary
If I get a good grade tomorrow, it will be because of you. Thanks so much!
Hello, thank you for the explanation. But I believe it should be 3*sigma^4, not 3*sigma^2, no?
Hi, this is a great question I am pondering too; does anyone have a definitive answer?
I think you are right. Kurtosis = fourth central moment / (second moment)^2, and for the normal distribution k = 3, so the fourth moment is 3*sigma^4.
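A quick numerical check of the point raised in this thread: for x ~ N(0, sigma^2), the fourth central moment is 3*sigma^4 (kurtosis 3), not 3*sigma^2. This is a standalone sketch; the variable names are my own:

```python
import numpy as np

sigma = 2.0  # so sigma^4 = 16, and 3*sigma^4 = 48 while 3*sigma^2 = 12
rng = np.random.default_rng(1)
x = rng.normal(0.0, sigma, size=1_000_000)

m4 = np.mean(x**4)        # sample fourth central moment (mean is 0)
print(m4 / sigma**4)      # close to 3, confirming E[x^4] = 3*sigma^4
```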
Great effort! Can you please make the same for OLS estimators? Many thanks in advance!
I will put it on my list :)