Clustering (4): Gaussian Mixture Models and EM

  • Published on Mar 5, 2015
  • Gaussian mixture models for clustering, including the Expectation Maximization (EM) algorithm for learning their parameters.

Comments • 108

  • @hoavu4430
    @hoavu4430 8 years ago +45

    Great explanation indeed!
    For those who already understand K-means: Gaussian mixture models take not only the means into account but also the covariances when forming a cluster, and they use maximum likelihood to fit the model, just as K-means finds its cluster centers.
    Let's go through the other videos by this guy.

    • @humeil3629
      @humeil3629 4 years ago

      th-cam.com/channels/wftHr2cf_jpiezE294UwqQ.htmlplaylists

    • @hypebeastuchiha9229
      @hypebeastuchiha9229 2 years ago +1

      Thanks!
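
A rough illustration of the point in the top comment of this thread (GMMs fit means and full covariances by maximum likelihood and give soft cluster memberships, where K-means only fits centers and gives hard assignments). This is a minimal sketch, not code from the video, and it assumes scikit-learn and NumPy are available:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Two elongated, overlapping blobs: here the covariances matter, not just the means.
X = np.vstack([
    rng.multivariate_normal([0, 0], [[3.0, 2.5], [2.5, 3.0]], size=200),
    rng.multivariate_normal([4, 0], [[3.0, -2.5], [-2.5, 3.0]], size=200),
])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(X)

print(km.labels_[:5])            # hard 0/1 assignments from K-means
print(gmm.predict_proba(X)[:5])  # soft responsibilities r_ic in [0, 1]
print(gmm.means_)                # fitted component means (cf. K-means centers)
print(gmm.covariances_.shape)    # (2, 2, 2): one full covariance matrix per component
```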

  • @littlefiend100
    @littlefiend100 5 years ago +17

    This is by far the most well-explained lecture on GMM!

  • @ruinsaneornot
    @ruinsaneornot 5 years ago +2

    Incredibly concise and informative explanation. Thank you!

  • @Noah-jz3gt
    @Noah-jz3gt 3 months ago

    Very clear and straightforward while containing all the necessary content to understand the concept!

  • @DmitryIvanovDfcreative
    @DmitryIvanovDfcreative 7 years ago

    Clear and lucid; I finally got it, thank you!
    The best among the other videos, like Stanford ML 12, etc.

  • @sparkequinox
    @sparkequinox 8 years ago +41

    I agree, I was lost on this even after textbooks and other videos on youtube, and this just made everything clear. Thank you! Simple, concise and well structured.

  • @J235304204
    @J235304204 7 years ago +1

    Love this video; very straightforward, clear, and systematic.

  • @laktozza
    @laktozza 7 years ago +1

    Totally awesome, very clear and easy to comprehend. Helped me to dramatically improve my ML algorithm.

  • @pedroaragon3435
    @pedroaragon3435 several months ago

    By far the best explanation of GMMs and especially the EM algorithm.

  • @waitingtilltheveryla
    @waitingtilltheveryla 9 years ago +31

    Awesome, I like your teaching style!

  • @rickclark4832
    @rickclark4832 7 months ago

    Exceptionally clear explanation of the use of EM with Gaussian Mixture Models

  • @dinoanastasopoulos8511
    @dinoanastasopoulos8511 2 years ago +7

    This is hands down the most thorough and intuitive explanation I've ever heard for GMMs and EM. Thanks for your work, will definitely be subscribing and watching more!

    • @rahulay
      @rahulay 2 years ago

      Completely agree with this!

    • @Sagar_smh
      @Sagar_smh 2 years ago

      100%

  • @itarabichi
    @itarabichi 1 year ago

    Great explanation! Every bit of it can be comprehended. Well Done!

  • @BOURNE399
    @BOURNE399 3 years ago

    I would say this professor is the best teacher in ML I have ever come across.

  • @Joe-qp1ow
    @Joe-qp1ow 5 years ago

    Best explanation of GMM I have ever seen, thank you.

  • @kazakiewicz
    @kazakiewicz 5 years ago

    The best lecture on GMM I've seen

  • @swetharajagopalan3781
    @swetharajagopalan3781 4 years ago

    Thank you so much! Your video explains GMM really clearly!

  • @GiiWiiDii
    @GiiWiiDii 4 years ago

    Thanks! Watched so many videos, but yours finally made it clear for me :D

  • @CentAurI0s
    @CentAurI0s 9 years ago +4

    I first read Bishop's chapter on Gaussian mixtures and was completely lost. Your explanation just made everything very clear.

  • @gentleplatypus4334
    @gentleplatypus4334 4 years ago

    Thank you M. Ihler. Fantastic explanation.

  • @antiagonista
    @antiagonista 8 years ago +1

    Congrats on the video... you are a very good speaker!

  • @marcospolanco3272
    @marcospolanco3272 2 years ago

    The GOAT explanation of GMM.

  • @fatemehcheginisalzmann2189
    @fatemehcheginisalzmann2189 9 years ago +1

    Thank you so much for these videos :) please continue

  • @simpleworld542
    @simpleworld542 3 years ago

    Thank You Professor. Your lectures are the best

  • @rorals4814
    @rorals4814 7 years ago

    I'm new to this field and I don't even have a good background in statistics.
    I think this video doesn't deserve 270 likes; it deserves 270 million.
    For three days I had been trying to find a simple explanation of GMM and couldn't find one.
    All the videos and tutorials go into too much mathematical detail; it was like someone threw me into the ocean and I don't know how to swim lol. Even though I need to watch it at least one more time, thank you for really making my life easier.

  • @vahidvajihinejad3178
    @vahidvajihinejad3178 2 years ago

    Awesome simple explanation - appreciate it!

  • @nirvanaherbs6374
    @nirvanaherbs6374 5 years ago

    Thanks for the great video. Sufficient information as well as nicely explained.

  • @Ezechielpitau
    @Ezechielpitau 8 years ago +4

    Very well explained. Have an exam coming up about this and feel like I finally understood how this works :) Keep it up. Oh, and subscribed by the way ;)

  • @tomatocc4565
    @tomatocc4565 4 years ago +1

    Finally I understand what's going on; much better than my prof. 💪

  • @ZLYang
    @ZLYang 10 months ago +1

    The best explanation I have ever seen. I hope you can talk a bit about how to derive the equations.

  • @ryankarel420
    @ryankarel420 2 years ago

    This was very helpful and clear, thank you.

  • @Oblivionator8000
    @Oblivionator8000 6 years ago

    Thanks for this - helped me understand this tricky topic

  • @bradhatch5608
    @bradhatch5608 7 years ago

    Best explanation out there!

  • @ismaelali7795
    @ismaelali7795 6 years ago

    Thanks for making my life much easier! =)

  • @anwarshome
    @anwarshome 5 years ago

    Beautiful, thank you so much.

  • @Aakash-mi8xq
    @Aakash-mi8xq 1 year ago

    Very well explained. Thank you!

  • @jiastone5622
    @jiastone5622 6 years ago +1

    Really fantastic. Thank you a lot! Awesome

  • @raghavmittal1702
    @raghavmittal1702 7 years ago

    Explained well. Thanks!

  • @mihir.khandekar
    @mihir.khandekar 4 years ago

    Good video. I was able to follow and understand it well.

  • @mirettegeorgy5123
    @mirettegeorgy5123 2 months ago

    Thank you for this video, it helps a lot!

  • @haroldsu1696
    @haroldsu1696 5 years ago

    thank you for the great explanation!

  • @vigneshmanoharan3770
    @vigneshmanoharan3770 8 years ago +1

    Thank you so much!!! To the point and very easy to follow for newbies like me

  • @NazaninYari
    @NazaninYari 4 years ago

    Great explanations! I wish Bishop could explain things as clearly as you did here in his textbook! Thanks a lot.

    • @AmeeliaK
      @AmeeliaK 3 years ago

      that's true but Bishop at least has the formula for Sigma correct ;-)

  • @mahdishafiei7230
    @mahdishafiei7230 8 years ago

    very eloquent.
    Thanks

  • @doaa1918
    @doaa1918 8 years ago +1

    Excellent ! :)

  • @starcalibre
    @starcalibre 8 years ago

    outstanding! thanks

  • @jnspincliffe2
    @jnspincliffe2 2 years ago

    awesome video, thank you!

  • @user-hj1co9ub9w
    @user-hj1co9ub9w 1 year ago +1

    Excellent explanation!! (One correction: the outer product was written as an inner product in the slides.)

  • @liaothomas7405
    @liaothomas7405 6 years ago

    Very helpful. Thank you

  • @mwenyamulenga8435
    @mwenyamulenga8435 7 years ago

    Thanks a lot, very helpful

  • @ruyingsun309
    @ruyingsun309 7 years ago

    Really helpful ! thanks

  • @hminhph
    @hminhph 6 years ago

    thx for your efforts

  • @bonnevalor
    @bonnevalor 6 years ago

    BLESS YOU!

  • @gobluebalter
    @gobluebalter 6 years ago

    Awesome. Thanks.

  • @bobbynazaris750
    @bobbynazaris750 8 years ago +1

    Thank you

  • @Raven-bi3xn
    @Raven-bi3xn 3 years ago

    The best!

  • @paullouw6080
    @paullouw6080 4 years ago

    Thank you!!

  • @AHMADKELIX
    @AHMADKELIX 2 years ago

    Permission to learn, sir 🙏. Thank you.

  • @valeriap9726
    @valeriap9726 7 years ago +1

    Is it possible to have example MATLAB code for the EM algorithm?
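
There does not appear to be official code posted for the video, but the updates discussed in these comments (responsibilities r_ic, counts m_c, weights pi_c = m_c / m, weighted means, and outer-product covariances) are the standard EM updates for a Gaussian mixture. Below is a minimal NumPy sketch of them (Python rather than MATLAB); the function name and details are this sketch's own, not the lecturer's:

```python
import numpy as np

def fit_gmm_em(X, k, n_iters=100, seed=0):
    """Minimal EM for a Gaussian mixture model; a sketch, not production code."""
    rng = np.random.default_rng(seed)
    m, d = X.shape
    # Initialization: k random data points as means, the overall data covariance
    # for every component, and uniform mixture weights.
    mu = X[rng.choice(m, size=k, replace=False)]             # (k, d)
    Sigma = np.stack([np.cov(X.T) + 1e-6 * np.eye(d)] * k)   # (k, d, d)
    pi = np.full(k, 1.0 / k)                                 # (k,)

    for _ in range(n_iters):
        # E-step: responsibilities r[i, c] proportional to pi_c * N(x_i | mu_c, Sigma_c).
        r = np.empty((m, k))
        for c in range(k):
            diff = X - mu[c]                                  # (m, d)
            quad = np.einsum("ij,jk,ik->i", diff, np.linalg.inv(Sigma[c]), diff)
            norm = np.sqrt(((2 * np.pi) ** d) * np.linalg.det(Sigma[c]))
            r[:, c] = pi[c] * np.exp(-0.5 * quad) / norm
        r /= r.sum(axis=1, keepdims=True)                     # each row sums to 1

        # M-step: total responsibilities, mixture weights, means, covariances.
        m_c = r.sum(axis=0)                                   # (k,)
        pi = m_c / m
        mu = (r.T @ X) / m_c[:, None]
        for c in range(k):
            diff = X - mu[c]
            # Weighted sum of outer products (x_i - mu_c)(x_i - mu_c)^T, plus jitter.
            Sigma[c] = (r[:, c, None] * diff).T @ diff / m_c[c] + 1e-6 * np.eye(d)

    return pi, mu, Sigma, r
```

Usage would be along the lines of `pi, mu, Sigma, r = fit_gmm_em(X, k=3)`, after which `np.argmax(r, axis=1)` gives hard cluster assignments.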

  • @user-wj8gb2oh8r
    @user-wj8gb2oh8r 7 years ago

    Excellent! Can you put up a link to that presentation?

  • @hippopotamus2212
    @hippopotamus2212 4 years ago

    Sorry if I misunderstood, but does EM initialize its parameters only once and always converge to the global optimum? And is k-means the one that resets cluster centers each time?
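
For what it's worth: plain EM, like k-means, is initialized once per run and is only guaranteed to converge to a local optimum of the likelihood, so in practice both are usually run from several random initializations and the best run is kept. A small sketch of that idea using scikit-learn's n_init option (the data here is a placeholder, not from the video):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))  # placeholder data; substitute your own

# n_init runs EM from several random initializations and keeps the fit
# with the highest training log-likelihood, mitigating bad local optima.
gmm = GaussianMixture(n_components=3, n_init=10, init_params="random", random_state=0)
gmm.fit(X)
print(gmm.lower_bound_)  # log-likelihood lower bound of the best run
```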

  • @RostovRedDevil
    @RostovRedDevil 8 years ago

    Hi Alexander, is it not possible to upload your presentation somewhere? Thank you

  • @charmendro
    @charmendro 4 years ago

    Any prerequisites for understanding all this better? I have a good background in linear algebra and multivariable calculus, but I've never really done statistics.

  • @dsgarden
    @dsgarden 3 years ago

    Small observation: m is actually the number of data points, since the probabilities sum to 1 for each data point.

  • @anhthungo9101
    @anhthungo9101 2 years ago

    The character in the video is great; I like it a lot $$

  • @NishantAgarwal7
    @NishantAgarwal7 4 years ago

    Does it come under partitional clustering or agglomerative clustering?

  • @BruinChang
    @BruinChang 2 years ago +1

    GMM looks like the finite element method. Both of them use superposition to fit the ground truth.

  • @elanmiller5142
    @elanmiller5142 6 years ago +1

    How do you determine, at 3:29, what pi of c is?

  • @vincenzo4259
    @vincenzo4259 2 years ago

    Thanks

  • @Anameplsszs
    @Anameplsszs 4 years ago

    thank you

  • @001zeal
    @001zeal 8 years ago

    Great video. Very clear explanation! Thank you.
    Could you redirect me to some practice questions on mixtures of Gaussians?

  • @user-pp1nv6le3w
    @user-pp1nv6le3w 5 years ago

    Great video! Is there some code to study?

  • @BrijeshJanardhanan
    @BrijeshJanardhanan 8 years ago

    At around 8:25, isn't there an error? Shouldn't the summation for the weighted mean go from i = 1 to m?
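
For reference, the weighted-mean update being asked about sums over the data points i = 1, ..., m, weighted by the responsibilities, in its standard form:

```latex
\mu_c = \frac{\sum_{i=1}^{m} r_{ic}\, x_i}{\sum_{i=1}^{m} r_{ic}}
```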

  • @amrdel2730
    @amrdel2730 6 years ago

    Thanks very much, Prof. Ihler. I am a PhD student from Algeria and your videos are very useful for us. I'd like to ask if you could add some MATLAB simulations, implementations, and code for these algorithms (classification, ensembles, clustering, and so on). Thanks in advance.

  • @liubo19831214
    @liubo19831214 1 year ago

    Prof. Ihler, could you provide the reference for hard EM (in the last slide)? Thx!

  • @manueljenkin95
    @manueljenkin95 1 year ago

    This video was so hard to follow and watch (too wordy, with very few pauses), but I'm thankful anyway, since I eventually came to understand it by thinking about it and rewatching a few times.

  • @ishanbhatt6067
    @ishanbhatt6067 3 years ago +3

    For some reason I couldn't grasp most of the material presented here. Judging from the comments though, it seems like I am the only one.

  • @tobyto4614
    @tobyto4614 2 years ago

    Are the initial mu_c, sigma_c, pi_c randomly initialized?

  • @tobipertlwieser3452
    @tobipertlwieser3452 4 years ago +4

    Nice video! At 4:48 the second term should be transposed, not the first one; we want Sigma to be a (d x d) matrix :)

    • @AmeeliaK
      @AmeeliaK 3 years ago

      Yes, that's very confusing if you want to reimplement it from this video. I wasted a lot of time before I figured out this mistake.
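
For anyone reimplementing from the video: the covariance update with the outer product (which gives the d x d matrix the two comments above are pointing at) is, in its standard form,

```latex
\Sigma_c = \frac{\sum_{i=1}^{m} r_{ic}\,(x_i - \mu_c)(x_i - \mu_c)^{\mathsf{T}}}
                {\sum_{i=1}^{m} r_{ic}}
```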

  • @mehmetnaml5073
    @mehmetnaml5073 3 years ago

    K-means won't recover that shape with 2 clusters, but it could possibly create 5 clusters out of it.

  • @LongTran-og7ji
    @LongTran-og7ji 6 years ago

    Can you please explain to me what 'm' is in the formula pi_c = m_c / m?
    By the way, I wonder whether I understand r_ic correctly. As you said, r_ic is the probability that sample i belongs to cluster c, so 0 ≤ r_ic ≤ 1?

    • @LongTran-og7ji
      @LongTran-og7ji 6 years ago

      Your lecture is the most awesome among those videos about GMM, sir. Thanks a lot!
      I'd appreciate it if you could answer my questions above.

    • @hminhph
      @hminhph 6 years ago

      Hi Long Tran,
      I'll try to answer this, but don't take it as guaranteed; I'm also just getting into this topic.
      m seems to me to be the number of samples (you denoted it as n, but I will call it m here).
      Then there is m_c, called the "total responsibility allocated to cluster c", which is the sum of r_ic over all samples i = 1, ..., m.
      r_ic is the probability that sample x_i was really drawn from cluster (normal distribution) c. In fact 0 ≤ r_ic ≤ 1.
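
In the standard notation (consistent with the reply above): the responsibilities for each data point sum to one over the clusters, so the total responsibilities sum to the number of data points m, which is why pi_c = m_c / m gives valid mixture weights:

```latex
r_{ic} = \frac{\pi_c\, \mathcal{N}(x_i \mid \mu_c, \Sigma_c)}
              {\sum_{c'} \pi_{c'}\, \mathcal{N}(x_i \mid \mu_{c'}, \Sigma_{c'})},
\qquad \sum_{c} r_{ic} = 1,
\qquad m_c = \sum_{i=1}^{m} r_{ic},
\qquad \sum_{c} m_c = m,
\qquad \pi_c = \frac{m_c}{m}
```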

  • @rorals4814
    @rorals4814 7 years ago

    Could anyone tell me what the prerequisites for GMM are?
    What should I have known before this?

    • @solsticetwo3476
      @solsticetwo3476 5 years ago

      Ror Als:
      - probabilities / marginal probability
      - Bayesian inference
      - k-means clustering
      - multivariate Gaussian distributions
      - function optimization

  • @kdipakj
    @kdipakj 2 years ago

    At 4:51, instead of "Mean is the first moment of the data," it should be "Mean is the first moment of the data about zero (i.e., about an arbitrary reference point)."

  • @manjeetchahal5548
    @manjeetchahal5548 6 years ago

    Where is the software?

  • @tmoldwin
    @tmoldwin 7 years ago

    The different distributions shouldn't have different areas; as probability density functions, the integral should be 1 for all of them.

    • @DotcomL
      @DotcomL 7 years ago

      Think of it as weights: one distribution has more "importance" than the other. For instance, if you have 100 values from N(0,1) and 1 value from N(100,10), then the first distribution should weigh 100x more.

    • @solsticetwo3476
      @solsticetwo3476 5 years ago

      Each has an area of 1, but a different fractional area in the combined sum.
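
Both replies are consistent with the standard mixture form: each component density integrates to 1 on its own, but it contributes to the combined density only in proportion to its weight:

```latex
p(x) = \sum_{c=1}^{k} \pi_c\, \mathcal{N}(x \mid \mu_c, \Sigma_c),
\qquad \pi_c \ge 0,
\qquad \sum_{c=1}^{k} \pi_c = 1
```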

  • @Dudufang
    @Dudufang 4 years ago

    Great vid! Though I would recommend pronouncing c and z more distinctly. It's a little unclear to me; or rather, the z (zed) is pronounced rather like a c (cee).

    • @AlexanderIhler
      @AlexanderIhler  4 years ago

      That's how we all say it in America. :) Thanks!

  • @Lunkford
    @Lunkford 6 years ago

    Around 8:17, Sigma_c should be the sum of outer products, not dot products.

  • @remandev9074
    @remandev9074 7 months ago

    This is not a very good explanation at all. There's WAY too much theorem dumping with difficult-to-parse variables all over the place, and a big lack of tangible examples. I don't know what other people see in this video.

  •  8 years ago +11

    useless without examples

    • @aparnaanaharajan5486
      @aparnaanaharajan5486 7 years ago +6

      Just saw the same comment from you on all the good videos. Stop demotivating viewers.

    • @gobluebalter
      @gobluebalter 6 years ago +2

      What the hell are you talking about, there was an example you twerp

    • @gobluebalter
      @gobluebalter 6 years ago +1

      Oh and also, you look like a douche