The best explanation of GMMs on YouTube
agreed
I really love that you talked about the big picture, what the variables mean, and how they relate to each other. The last 2 minutes of the video really explains everything to me. My professor in school spends most of the time showing us the technicalities in the math part, like how to take the derivatives, which seems to totally miss the important points but is just a big showoff.
WOW you are honestly an incredible teacher. My normal experience learning some ~"deep" concept like this is comprehending 30% of the material and needing to search at length to find answers to the questions I have about the remaining 70%. Watching this video, despite the fact that I had almost no background, you paused to answer 95% of the questions I would've otherwise had (e.g. "what is this big pi notation?"). I wish I had teachers like you in college - you truly are making me believe in the value of a competent professor over merely reading textbooks (which is the only way I got any learning done in college).
The best way to explain a concept is by keeping it simple. Hands down, the best explanation ever!
Whenever I need to understand a new concept, I search your channel. Thank God, you have this!
I love that you talk about its assumptions, the reason for using it vs K means, etc. All the important questions answered.
Thanks!
Way better than my professor, who couldn't explain it in one semester. You are a great teacher. Thanks very much.
Happy to help!
The way you explained it is so practical as well as theoretical, and on top of that, easy to understand. Respect to you, sir. Thanks a lot
You are most welcome
bro i just turned on notifications for this channel, you are a blessing sent from the heavens
Wow, thanks!
Man, I SWEAR, you are a legend. I watched videos from faculty at very famous universities, and they didn't explain it half as clearly and simply as you did.
THANK YOU, THANK YOU, THANK YOU :-)
I came here looking for clarity on an adjacent topic that is quite different from what you were teaching, but I have a good idea of how my own troublesome matter is supposed to work. Thanks for this awesome explanation!
Really gives a simple but clear explanation of the model! Better than the book's explanation!
You are the most underrated machine learning conceptual lecturer I can find on YouTube...
I really really appreciate the effort in making this video. Truly helped in understanding the Gaussian mixture model
thank you for the kind words :)
Couldn't understand GMM in my expensive six-month-long semester classes, but understood it now in 7.5 minutes by watching this free video at 2x! Thanks Ritvik! Weird world we live in lol
*Probably the best explanation out there. Taught better than my professor.*
I came from the Deep Learning book (Goodfellow, Bengio et al.), and I must say you have done a phenomenal job explaining GMM as compared to that literature.
By far the best explanation I found!
Thanks!!!!!!
Thank you so much; watching your EM algorithm video before watching this one helped me understand things even better.
Glad it helped!
The best resource for understanding this topic. I've been searching for two days and this is the most useful explanation I've encountered so far. Thank you.
Wow, you helped me on my journey of understanding the spectral mixture kernel, which requires knowledge of mixtures of Gaussians. Thank you
No problem !
Dude, you broke it down so clearly even I could understand it! Well done!
Keep it going! Your hand-written notes with that color scheme could be even more impactful if you used OneNote. Get a writing tablet for little money; it really is a game-changer. Try it if you fancy! Anyhow, please keep doing these videos.
one of the best explanations on this topic
Thanks a lot for explaining this concept so simply. Hats off to you.
You are most welcome
The most easy-to-understand video about GMM on YouTube
Wow! God bless you for explaining it so beautifully!
Very clear introduction; I have understood Gaussian mixture models now.
Purely logical and very smooth explanation!
Thanks a ton! I'm obsessed with this channel. Would love to watch you explain 'Deep Learning' topics.
You should be our professor at our university.
My new favorite stats channel! Can you make a video on multivariate normal distribution? You referenced it after all 😀
I wish I had you as one of my profs, someone who could make the concept easy and teach it, rather than complicating it with all the math
Thanks a lot, I have my exam tomorrow, and I am saved.
Best simple explanation👏
This was a fantastic explanation!
Incredible explanation, thank you.
You're very welcome!
Thank you for another clear explanation
Thanks sir, outstanding explanation!
I wish I found your channel earlier. Great content !
best explanation ever
Great video! Can you please consider making one on Gaussian Processes?
Great video, outstanding job. It got me 99% of the way to understanding the concept. I have one question:
During expectation-maximization, you recompute mu_k, Sigma_k, and pi_k.
For Sigma_k, what does the variable T denote? Transpose?
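Yes, the T is a transpose: in the Sigma_k update, (x_i − mu_k)(x_i − mu_k)^T is an outer product, which turns each centered point into a D×D matrix. A minimal NumPy sketch of the M-step updates (toy data and made-up responsibilities, not the video's code):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))          # toy data: 100 points in 2-D
N, D = X.shape
K = 2                                  # number of components

# Pretend these responsibilities gamma[i, k] came from the E-step.
gamma = rng.random((N, K))
gamma /= gamma.sum(axis=1, keepdims=True)

N_k = gamma.sum(axis=0)                # effective number of points per component
pi = N_k / N                           # mixing weights pi_k
mu = (gamma.T @ X) / N_k[:, None]      # weighted means mu_k

Sigma = np.zeros((K, D, D))
for k in range(K):
    diff = X - mu[k]                               # (N, D) centered points
    # sum_i gamma_ik * diff_i diff_i^T -- the transpose is this outer product
    Sigma[k] = (gamma[:, k, None] * diff).T @ diff / N_k[k]
```

Each Sigma[k] comes out symmetric D×D, which is what you'd expect from a covariance estimate.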
Very well done! Thanks for your explanation!
Hello,
Would it be possible for you to make videos on belief propagation algorithms?
Your videos are of great quality, and considering there aren't many good-quality explanations out there, it would be really helpful
amazing one for learning about GMMs. do you have any video on the use of GMMs in sensor fusion?
Great explanation thanks
Thank you very much! That was a very clear explanation!
Brilliant explanation!
Does capital Sigma have to be a 2x2 matrix? Width, length, and angle can also be represented by 3 values.
Clearest explanation ever.
Great Explanation
Wow .... keep it up
Incredible !! Thanks gigantically
The punch line is, "That's the Gaussian Mixture Model in a nutshell."
Anyway, great video, thank you...
Does this work for other distributions, like t-distributions, etc.?
Ritvik, thanks for explaining this. Using the Mclust package in R, what are the different parameters like EEV? What is BIC in the context of mixture models? This is an awesome explanation. I'm using mixture models in mclust to classify a small group of clusters that are distinct from the population and carry significant meaning. Could you please introduce the Dirichlet concept as an extension to this class? Thank you.
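On the BIC part of this question: in the mixture-model context, BIC trades fit (log-likelihood) against model size, and it's commonly used to pick the number of components; mclust's three-letter codes like EEV describe the covariance structure. A minimal stdlib sketch of the textbook formula (note that mclust reports BIC with the opposite sign convention, 2·logL − k·log n, so there larger is better):

```python
import math

def bic(log_likelihood: float, n_params: int, n_obs: int) -> float:
    # Textbook BIC: penalty for parameter count minus twice the fit.
    # Lower is better under this sign convention.
    return n_params * math.log(n_obs) - 2.0 * log_likelihood

# Example: a 3-component 1-D GMM has 3 means + 3 variances + 2 free
# mixing weights = 8 free parameters (hypothetical numbers).
score = bic(log_likelihood=-120.0, n_params=8, n_obs=200)
```

Comparing this score across candidate numbers of components (each fit by EM) is the usual model-selection recipe.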
thanks a lot, i really understand it now!!!
This was amazing
With GMM and EM, does it mean that at the end we classify each point by the highest value of P(x_i)?
What a great way to use those Crayolas
When you describe the value of P(x) as "the probability that we see x", did you mean "likelihood" instead of "probability"? The pdf doesn't give the probability, but rather the likelihood (density) that point x belongs to the distribution.
Thanks for the video though. Good stuff!
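The commenter has a point: for a continuous variable, p(x) is a density, not a probability, and it can even exceed 1; probabilities only come from integrating the density over an interval. A quick stdlib-only sketch:

```python
import math

def gauss_pdf(x: float, mu: float, sigma: float) -> float:
    # Gaussian density evaluated by hand.
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

# With a small sigma the density at the mean exceeds 1 -- so it cannot be
# "the probability of seeing x"; only integrals of it are probabilities.
print(gauss_pdf(0.0, 0.0, 0.1))   # ~3.99
```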
In the multivariate Gaussian mixture, how can I insert the proportion in R?
wow thank you so much sir
the fucking 🐐
Hi Ritvik, great video, but why does gamma have a summation from 1 to K? I think it should be without the summation, since you said it's the probability of being in class k given an observation. Please correct me if I understood that correctly.
It's just a hyperparameter, just like in K-means
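On the summation question itself: the sum over components sits in gamma's denominator, where it normalizes the responsibilities so that each point's responsibilities add to 1; the numerator is for the single class k. A small NumPy sketch with toy numbers (not from the video):

```python
import numpy as np

def gauss(x, mu, var):
    # 1-D Gaussian density, vectorized.
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

x = np.array([0.2, 1.5, 3.0])          # toy 1-D data points
pi = np.array([0.5, 0.5])              # mixing weights (assumed)
mu = np.array([0.0, 3.0])              # component means (assumed)
var = np.array([1.0, 1.0])             # component variances (assumed)

num = pi * gauss(x[:, None], mu, var)  # (N, K): pi_k * N(x_i | mu_k, var_k)
gamma = num / num.sum(axis=1, keepdims=True)   # denominator: sum over k
```

Without that sum in the denominator, gamma would just be an unnormalized joint density, not a probability of class membership.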
Could you please point to a source for the math, especially the derivatives part?
So is EM like maximum likelihood?
Is salmon longer than tuna?
So helpful!
Thanks!
terrific
Thank you!!!!!!!!!!!!
Thank you
Welcome!
What a boss
ur awesome!
You rock!
Please provide R code for modelling the dependence between trial and success (the herons case by J Zhu), including the EM algorithm for a beta-binomial Poisson mixture model. Please help me
Thank you!
Good explanation. But I would have liked you to go over the math a bit more.
I came here after breaking my brain reading books for 2 days... seems like I only needed 15 minutes.
brilliant
Woow
What is capital N sub k?
pls look at 10:16
Simplified explanation.
nice :))
Please make a post-pruning video. Thanks!