How do we calculate the mixture weights? I think this is for the univariate case. Could you also explain the multivariate case? Because in the multivariate case, we can't use maximum likelihood estimation.
It depends on your prior assumptions about the model, but you're right: if your prior is too complicated, you can use Gibbs sampling or variational inference to update the model.
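For the standard maximum-likelihood setting (no prior), the mixture weights fall out of the EM algorithm's M-step, and the update looks the same in the multivariate case. Here is a minimal sketch of one EM weight update, assuming numpy and scipy are available; the function and variable names are my own, not from the video:

```python
import numpy as np
from scipy.stats import multivariate_normal

def em_weight_update(X, means, covs, weights):
    """One EM iteration for the mixture weights pi_k of a multivariate GMM.

    X:       (n, d) data matrix
    means:   list of K mean vectors, each shape (d,)
    covs:    list of K covariance matrices, each shape (d, d)
    weights: (K,) current mixture weights, summing to 1
    """
    n, K = X.shape[0], len(weights)
    # E-step: responsibilities r[i, k] = p(z_i = k | x_i)
    r = np.zeros((n, K))
    for k in range(K):
        r[:, k] = weights[k] * multivariate_normal.pdf(X, mean=means[k], cov=covs[k])
    r /= r.sum(axis=1, keepdims=True)
    # M-step for the weights: pi_k = (1 / n) * sum_i r[i, k]
    return r.mean(axis=0)

# Toy example with two components like the ones in the slides
# (means [-1, -1] and [1, 1], identity covariances):
X = np.vstack([np.random.randn(300, 2) - 1, np.random.randn(100, 2) + 1])
new_weights = em_weight_update(
    X,
    means=[np.array([-1.0, -1.0]), np.array([1.0, 1.0])],
    covs=[np.eye(2), np.eye(2)],
    weights=np.array([0.5, 0.5]),
)
print(new_weights)  # should move toward roughly [0.75, 0.25]
```

The same responsibilities `r` also drive the mean and covariance updates; it's only when you put a nontrivial prior on the parameters that you need Gibbs sampling or variational inference instead.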
Splendid explanation!
Thanks! I really appreciate it!
Nice explanation. I watched it many times. Thank you, Dr. Jordan Boyd-Graber!
One of the best and simplest explanations out there.
*I think there is a mistake in the slides at **6:36**. I believe that instead of*
$\mu_1 = [-1, -1], \sigma_1^2 = [1, 1], \mu_1 = [1, 1], \sigma_1^2 = [1, 1]$
*it should be*
$\mu_1 = [-1, -1], \sigma_1^2 = [1, 1], \mu_2 = [1, 1], \sigma_2^2 = [1, 1]$.
So what I found by reading papers is correct: in GMM we need means, covariances, and weights. Thank you, sir!
Yes, otherwise, there isn't much advantage to GMM over k-means!
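To make that concrete, here is a minimal sketch (assuming scikit-learn; not code from the lecture) showing that a GMM fits per-component weights and full covariances, which k-means cannot represent:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Two anisotropic Gaussian clusters of unequal size: a hard case for k-means,
# which implicitly assumes equal-weight, spherical clusters.
X = np.vstack([
    rng.multivariate_normal([-1, -1], [[1.0, 0.8], [0.8, 1.0]], size=300),
    rng.multivariate_normal([1, 1], [[1.0, -0.8], [-0.8, 1.0]], size=100),
])

gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(X)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

print("GMM weights:", gmm.weights_)      # learned mixture weights (~0.75, ~0.25)
print("GMM means:\n", gmm.means_)        # learned component means
print("k-means centers:\n", km.cluster_centers_)
```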
Thank you, sir!
Thanks for watching!
Thank you for the lecture. So awesome. Great teacher
Thank you!
Amazing!!
Thank you so much, it really helped a lot. I have my exams tomorrow.
Best of luck!
Well said, thank you.
Nice