Philippe has cleared up many confusions that easily arise in other, similar courses. For example, he goes into the details of whether we should keep t or its transpose, and how that relates to whether we should keep x^T y or y^T x (10:05). He dives into many similar details in each of his other lectures. He has strong empathy for students' learning process. Philippe is a genius professor and a nice person.
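(On the x^T y vs. y^T x point, assuming x and y are column vectors here: x^T y is a 1x1 scalar, so it equals its own transpose, x^T y = (x^T y)^T = y^T x, and the two forms are interchangeable.)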
Heaps of dot-connecting mental exercises for me. Thanks, professor. Hopefully your High-Dimensional Statistics course will be available on YouTube as well.
It really feels like the more you know, the more you can get out of this class. Many thanks~
Yeah, with the Brownian bridge and gradients, it's not a beginner-level class. But it's a good way to help me connect different knowledge points.
38:56 Distribution of \hat{\beta}
54:45 MSE (quadratic risk) of \hat{\beta} is the trace of its covariance matrix.
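For reference, the two facts at those timestamps (standard results for the Gaussian linear model y = X\beta + \varepsilon with \varepsilon \sim N_n(0, \sigma^2 I_n) and deterministic design): the least-squares/MLE estimator satisfies \hat{\beta} \sim N_p(\beta, \sigma^2 (X^T X)^{-1}), and since \hat{\beta} is unbiased, E\|\hat{\beta} - \beta\|^2 = E[\mathrm{tr}((\hat{\beta} - \beta)(\hat{\beta} - \beta)^T)] = \mathrm{tr}(\mathrm{Cov}(\hat{\beta})) = \sigma^2 \mathrm{tr}((X^T X)^{-1}).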
thanks for these lectures
thanks ♥️🤍
A question: for the MLE in the multivariate case (for example, see the slide at 57:45), does the expression for the quadratic risk of \hat{\beta} imply that the quadratic risk goes to zero as n becomes large (i.e., with more and more data)?
Yes, the risk is approximately \sigma^2 p / n for large n, so with p fixed it goes to zero as n grows.
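To see this numerically, here is a minimal Monte Carlo sketch in NumPy. The sizes n = 1000, p = 10, sigma = 2, and the identity-covariance random design are arbitrary illustration choices, not from the lecture; with these values the empirical risk should land near \sigma^2 p / n = 0.04.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, sigma, reps = 1000, 10, 2.0, 500  # illustrative sizes, not from the lecture
beta = rng.normal(size=p)               # fixed "true" coefficient vector

risks = []
for _ in range(reps):
    X = rng.normal(size=(n, p))                # random design, E[X^T X] = n * I
    y = X @ beta + sigma * rng.normal(size=n)  # Gaussian linear model
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)  # least-squares / MLE fit
    risks.append(np.sum((beta_hat - beta) ** 2))      # squared error ||beta_hat - beta||^2

print("empirical quadratic risk:", np.mean(risks))   # close to sigma^2 * p / n
print("sigma^2 * p / n         :", sigma**2 * p / n)
```

Doubling n (with p and sigma fixed) roughly halves the empirical risk, which is the 1/n decay the \sigma^2 p / n expression predicts.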
👍👍👍👍👍
this was hard
Agreed, I couldn't quite follow
this is helpful ♥️🤍