At 59:57, defining the Brownian bridge, I believe there might be an error. An expectation of random variables can't have a distribution, so I think he may have meant X_t | X_s ~ N(0, t-s) instead of the expectation.
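A quick Monte Carlo sketch of the point being made (my own illustration, not from the lecture, with made-up times s and t): for standard Brownian motion it is the increment X_t - X_s, a random variable rather than an expectation, that has the N(0, t-s) distribution.

```python
import numpy as np

# Illustration: for standard Brownian motion, the increment X_t - X_s
# is distributed N(0, t - s). A random variable has a distribution;
# an expectation is just a number, which is the commenter's point.
rng = np.random.default_rng(0)

s, t = 0.4, 0.8          # two fixed times, s < t (arbitrary choices)
n_paths = 50_000         # Monte Carlo sample size
n_steps = 100            # grid points per path on [0, t]

dt = t / n_steps
# Build Brownian paths on [0, t] as cumulative sums of N(0, dt) steps.
steps = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
paths = np.cumsum(steps, axis=1)

i_s = int(round(s / dt)) - 1   # grid index of time s
diff = paths[:, -1] - paths[:, i_s]   # X_t - X_s across all paths

print(diff.mean())   # close to 0
print(diff.var())    # close to t - s = 0.4
```

The empirical mean and variance of the increment match 0 and t - s, consistent with X_t - X_s ~ N(0, t-s).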
At 0:35, chi-squared is said to be the ASYMPTOTIC distribution, which means it does not hold for small n. Is this because (to use an example) the test statistic contains other estimators that need a large n (say n > 30) for the guarantees of the LLN to kick in? For example, the sample variance is built from both the X_i and the sample mean Xn_bar, and we see that Sn/sigma^2 is approximately chi-squared with n-1 degrees of freedom. But Xn_bar is only guaranteed to approach E[X_i] when the LLN kicks in, which requires a large sample size. Thus chi-squared is the asymptotic/limiting distribution here, because (X_i - Xn_bar)^2 behaves like a squared N(0, 1) only when n is large, i.e., when Xn_bar --> E[X_i]. Does that cover the reason for calling it asymptotic, or am I missing something?
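One way to see the "asymptotic" part concretely is by simulation. A rough sketch (my own illustration, with arbitrary choices of K, n, and cell probabilities): the Pearson goodness-of-fit statistic for a multinomial is only approximately chi-squared, and for moderate-to-large n its empirical mean and variance should match those of a chi-squared with K - 1 degrees of freedom.

```python
import numpy as np

# Sketch: the Pearson statistic T = sum_j (N_j - n p_j)^2 / (n p_j)
# for a multinomial with K cells is *asymptotically* chi-squared
# with K - 1 degrees of freedom under the null.
rng = np.random.default_rng(42)

K = 3
p = np.full(K, 1.0 / K)   # true cell probabilities (null hypothesis)
n = 2000                  # sample size per experiment
reps = 20_000             # Monte Carlo repetitions

counts = rng.multinomial(n, p, size=reps)   # reps x K count table
expected = n * p
T = ((counts - expected) ** 2 / expected).sum(axis=1)

# A chi-squared_{K-1} variable has mean K - 1 and variance 2(K - 1).
print(T.mean())   # close to 2
print(T.var())    # close to 4
```

Rerunning this with a much smaller n (say n = 10) shows a visibly worse match, which is the sense in which the chi-squared distribution is only a limiting one.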
In the uploaded lecture slides on parametric hypothesis testing there is a typo: the equals sign in H1 on slides 29/37 and 32/37 should be a not-equals sign. The same slides shown in the video (1:17 and 8:09) don't have that error.
Thank you so much for providing these lectures for free. This course is awesome.
Testing of Goodness of Fit: 22:19
thank you :)
The chi-squared test for contingency tables only holds for large n, because you need the Taylor expansion of the log-likelihood ratio to be valid.
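To make the Taylor-expansion point concrete: for large counts, Pearson's X² is the second-order Taylor approximation of the likelihood-ratio statistic G² = 2 Σ O log(O/E), so the two should nearly coincide. A small sketch with a made-up 2×2 table (the counts are hypothetical, chosen to be large):

```python
import numpy as np

# Hypothetical 2x2 contingency table with large counts.
observed = np.array([[420.0, 380.0],
                     [360.0, 440.0]])

# Expected counts under independence: row_total * col_total / grand_total.
row = observed.sum(axis=1, keepdims=True)   # shape (2, 1)
col = observed.sum(axis=0, keepdims=True)   # shape (1, 2)
n = observed.sum()
expected = row @ col / n                    # outer product of margins / n

# Pearson chi-squared statistic (quadratic Taylor approximation).
x2 = ((observed - expected) ** 2 / expected).sum()

# Likelihood-ratio (G) statistic: 2 * sum O * log(O / E).
g2 = 2.0 * (observed * np.log(observed / expected)).sum()

print(x2, g2)   # nearly equal when all cell counts are large
```

With small cell counts the higher-order Taylor terms are no longer negligible, X² and G² diverge, and neither is well approximated by its limiting chi-squared distribution.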
56:29 subtitle correction: instead of "personality density" it should be "probability density." Thank you for the content =)
Where is the class 10 video?
Sorry, videos for Lectures 10 and 16 are not available.
Could you please upload them from a different year, as you did for lecture #1? It would really help making this course complete!
@@mitocw Please upload them; it would complete the whole course.
@@mitocw It would be great if the lecture series were to be completed with the appropriate lectures from 2017 (as was done with lecture 1). Thanks for everything though.
@@mitocw Seconded. A recording from another year totally works and would make this lecture series a golden one among the intermediate stat courses available online. Thanks a lot!
Can someone guide me to the ANOVA lectures, please? MIT or anyone, please reply 🙏
The TH-cam playlist for this course is: th-cam.com/play/PLUl4u3cNGP60uVBMaoNERc6knT_MgPKS0.html. You can find the course materials on MIT OpenCourseWare at: ocw.mit.edu/18-650F16. Best wishes on your studies!
@@mitocw Thanks so much, MIT!
What happened to Lecture 10? (Multinomial Chi-square?)
The videos for Lectures 10 and 16 are not available.
I saw some viewers say that Lecture 10 was the midterm.
Making H_0 = Gaussian is cheating, isn't it?