EM algorithm: how it works

  • Published on 27 Sep 2024
  • Full lecture: bit.ly/EM-alg
    Mixture models are a probabilistically sound way to do soft clustering. We assume our data is sampled from K different sources (probability distributions). The expectation maximisation (EM) algorithm allows us to discover the parameters of these distributions and, at the same time, figure out which source each point comes from.
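
    A minimal sketch of that idea for a one-dimensional mixture of two Gaussians, written for this summary rather than taken from the lecture (names such as mu_a, var_a, p_a are illustrative):

        import numpy as np

        def em_two_gaussians(x, n_iters=100):
            """Fit a two-Gaussian mixture to 1-D data x with EM (illustrative sketch)."""
            # Initial guesses: means at the extremes, shared overall variance, equal weights.
            mu_a, mu_b = float(x.min()), float(x.max())
            var_a = var_b = float(x.var())
            p_a = p_b = 0.5  # mixture weights P(a), P(b)

            def gauss(x, mu, var):
                # Gaussian density N(x | mu, var)
                return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

            for _ in range(n_iters):
                # E-step: soft assignment P(source | x_i) for every point.
                wa = p_a * gauss(x, mu_a, var_a)
                wb = p_b * gauss(x, mu_b, var_b)
                r_a = wa / (wa + wb)
                r_b = 1.0 - r_a

                # M-step: re-estimate parameters as responsibility-weighted averages.
                mu_a = np.sum(r_a * x) / np.sum(r_a)
                mu_b = np.sum(r_b * x) / np.sum(r_b)
                var_a = np.sum(r_a * (x - mu_a) ** 2) / np.sum(r_a)
                var_b = np.sum(r_b * (x - mu_b) ** 2) / np.sum(r_b)
                p_a, p_b = r_a.mean(), r_b.mean()

            return (mu_a, var_a, p_a), (mu_b, var_b, p_b)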

Comments • 216

  • @biancaluedeker
    @biancaluedeker 3 years ago +36

    I am a PhD student. I have seen the material in three classes and had no idea what was going on. You made it crystal clear in 8 minutes.

    • @snackmaster35
      @snackmaster35 1 year ago

      You must have had comp stat with Tony too lol

    • @Beowulf245
      @Beowulf245 1 year ago

      I don't mean to be rude but what area of research do you specialize in? Because that's some pretty basic math here that they teach in your freshman or sophomore year in college.

    • @TorSTARAGARIO
      @TorSTARAGARIO 11 months ago +10

      @@Beowulf245 Very intentionally rude. I've had the same issue, and without solid intuition or clear explanations like this video, the math looks like a confusing mess that would go right over a lot of undergraduates' heads.

    • @ChadieRahimian
      @ChadieRahimian 8 months ago

      @@Beowulf245 I have two master's degrees, in Astrophysics and Computer Science, and am currently a PhD student in computer science. I needed a refresher on EM and was failing to find anything useful online, and the couple of book chapters I went through were not enough for me to grasp the concept at a practical, implementation level for my problem right now. This video helped. So…

    • @fbng
      @fbng 4 months ago

      I'm doing this shit in my Bachelor's and it's kicking my ass. I guess we're all united by the bad teaching of this topic, regardless of our level.

  • @omidmo7554
    @omidmo7554 8 years ago +262

    I was struggling with EM for a long time. You explained it very simply. I believe it is a kind of art when someone explains something hard in an easy way. That makes a difference. I appreciate your help.

    • @joshwolff4592
      @joshwolff4592 4 years ago +21

      It's also an art when someone explains something quite easy in a very hard way. I know a few Picassos in this genre

    • @michaelle1229
      @michaelle1229 2 years ago

      This helped me a lot during my exam for Modern Applied Statistics! I got a bit lost computing the derivations and their parameters with the Gaussian.

  • @aprilsun2572
    @aprilsun2572 4 years ago +15

    Omg. U don’t know how long I have struggled with this algo. Such a nice explanation!!!

  • @impzhu3088
    @impzhu3088 3 years ago +3

    Omg, now I know the intuition behind the EM algo. This is art and you are fantastic! They say if you can’t explain some concepts in an easy way, you don’t know it well enough. I guess that’s why you can explain this so clearly and why many teachers can’t. Thank you so much!

  • @mr6462
    @mr6462 4 years ago +1

    I like how you explained the causation: parameters -> which group, or which group -> parameters, but we have neither. It is truly a beauty to recognize this nuance.

  • @daattali
    @daattali 8 years ago +7

    What a great and clear explanation! I watched a few videos on EM/GMM and didn't quite get it as well as I do now. Your explanation of the chicken-and-egg problem and the intro before that really make it so much more intuitive. Thanks!

  • @KoLMiW
    @KoLMiW 3 years ago +1

    In our lectures they jumped straight into 2D examples and it was very hard to comprehend the formulas; it helped me a lot that you explained it with a 1-D example. Thank you very much!

  • @adawang9147
    @adawang9147 7 years ago +1

    I'll have to say, I looove your lectures. I watched your Decision Tree lecture 2 days ago, today I was looking for EM lecture, and you explain them all through. Thank you so much for sharing

  • @jeremyborg1365
    @jeremyborg1365 7 years ago +2

    The best introductory video to EM by far. thanks

  • @xiayisun8570
    @xiayisun8570 6 years ago +2

    Sir you are my hero! You always set up things so intuitively.

  • @ks34199
    @ks34199 7 years ago +3

    Hello Prof,
    The breakdown of complex algorithms in simple steps is excellent.

  • @pereeia9048
    @pereeia9048 1 year ago +1

    Amazing video, perfectly explained the concepts without getting bogged down in the math/technical details.

  • @comatosetorpor3602
    @comatosetorpor3602 3 years ago

    YouTube has opened doors for so many people who would not otherwise have gotten such good lectures.

  • @roman5932
    @roman5932 2 years ago

    I'm a Russian student, 2nd year. I've seen 3 lectures in Russian, but only now do I get this algorithm. This is professionalism, I'm sure.
    Thank you sir

  • @sandlinjames
    @sandlinjames 1 year ago

    Wonderful explanation. I've been watching my class presentations over and over with no result. Now I get it. :)

  • @lijun2031
    @lijun2031 8 years ago

    This is my first time leaving a comment!!! You are awesome!! I come from NYU. I am struggling with my final project. It uses EM. You really saved my time!!! This video is wonderful!!!!

  • @1982Dibya
    @1982Dibya 8 years ago

    Awesome video... Now my concept of EM is finally clear... Nobody could make me understand it the way you did... So many thanks... Waiting for more interesting videos on machine learning... You are simply awesome, hats off

  • @emmettmcdow9916
    @emmettmcdow9916 5 years ago

    Rarely comment, but this was a fantastic video. Very few youtube videos on a subject as small as EM are this informative.

  • @bionh
    @bionh 10 years ago +3

    Thanks Victor, you really have a great educational style. Please keep making videos explaining important topics in data science in a clear way--I will keep watching them :)

    • @vlavrenko
      @vlavrenko  10 years ago

      Bion Howard Thank you for the kind words. Really happy you find these videos helpful.

  • @ape1eat
    @ape1eat 10 years ago

    I love this kind of explanation. A simple practical example that anyone can understand without tons of abstract mathematical expressions. It helped me to understand the concept, and then it's much easier to go deeper. I wish more teachers taught this way.

  • @TankNSSpank
    @TankNSSpank 9 years ago +5

    We need more of these sir.

    • @vlavrenko
      @vlavrenko  9 years ago +2

      Thanks, I'm working on getting more uploaded.

  • @maxweera7897
    @maxweera7897 2 years ago

    Best lecture on EM on youtube. Well done!

  • @shm2157
    @shm2157 7 years ago

    Superb and simplified way to explain the essence of GMM and EM..!

  • @annlee8239
    @annlee8239 3 years ago

    Impressive! You made it sound so clear, and I appreciate how you compared it with k-means.

  • @supacopper4790
    @supacopper4790 7 years ago

    Very helpful guide! I spent some time reading the paper and tutorials online and had trouble understanding the logic, and you just led me to the point of understanding this logic in just 8 minutes!

  • @ct528
    @ct528 4 years ago

    Seriously. I have a very rough time understanding my professor over zoom and his lecturing style doesn't help. This was easier to understand. Thank you.

  • @danielalfonsetti6602
    @danielalfonsetti6602 3 years ago

    This is by far the best and clearest explanation of the intuition of EM that I've heard. This is amazing. Thank you so much.

  • @Kruuppe
    @Kruuppe 5 years ago +1

    My god this was an amazing video. I think I've already commented on this before but watching it again was really helpful! Cheers

  • @dmitryzabavin319
    @dmitryzabavin319 5 years ago +3

    Thank you so much. This is the clearest explanation of EM I have found. The only question remaining is: how do you calculate P(a) and P(b) in the P(b | xi) formula?
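
    (A standard answer to this question, added as a note here rather than quoted from the video: the priors start as guesses, e.g. P(a) = P(b) = 0.5, and are then re-estimated from the responsibilities on every iteration.)

        P(b \mid x_i) = \frac{P(x_i \mid b)\,P(b)}{P(x_i \mid a)\,P(a) + P(x_i \mid b)\,P(b)},
        \qquad
        P(b) \leftarrow \frac{1}{n}\sum_{i=1}^{n} P(b \mid x_i), \quad P(a) \leftarrow 1 - P(b).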

  • @steveshank9674
    @steveshank9674 9 years ago +2

    Subbed and will be going through your entire collection of videos. This video helped me take a step forward with my research. Thank you for your efforts!

  • @sarthak8786
    @sarthak8786 6 years ago

    this is so far the best explanation on this topic

  • @mahdishafiei7230
    @mahdishafiei7230 8 years ago +1

    Very, very eloquent.
    Thanks for the time you spend teaching.

  • @vikalpmehta6019
    @vikalpmehta6019 4 years ago

    I was struggling to get a big picture of EM algorithm. What a great explanation you provided. Thank you. 😁

  • @hakimazman488
    @hakimazman488 3 years ago

    The comparison with K-means made it click for me. Thank you!!

  • @lordnicholasbuzanthefearle2155
    @lordnicholasbuzanthefearle2155 9 years ago +22

    You just earned my sub

    • @vlavrenko
      @vlavrenko  9 years ago +2

      Thank you!

  • @ishitaraj7723
    @ishitaraj7723 2 months ago

    Very smooth explanation. Loved it!

  • @jameshighland6769
    @jameshighland6769 9 years ago

    Very clear and simple explanation. Fantastic. Thanks for this. Please upload more videos like these.

    • @vlavrenko
      @vlavrenko  9 years ago

      Thank you for the kind words!

  • @cycman98
    @cycman98 2 years ago

    The comparison with k-means opened my eyes. Thank you

  • @nkapila6
    @nkapila6 6 months ago

    Thanks for this. Your video helped bring clarity to the problem statement.

  • @beatlekim
    @beatlekim 3 years ago

    BEST VIDEO ON EM ALGORITHM

  • @ruoyuguo9134
    @ruoyuguo9134 3 years ago

    Great EM algorithm explanation!

  • @samuelcoromandel7392
    @samuelcoromandel7392 7 years ago +1

    How do you calculate priors?

  • @lucianotarsia9985
    @lucianotarsia9985 4 years ago

    Great video Victor. Very simple to understand. Thank you for the help

  • @xianda9648
    @xianda9648 3 years ago +1

    Hello, how do you update an existing GMM when new data comes in?

  • @raoufkeskes7965
    @raoufkeskes7965 7 months ago

    At 3:08 the variance estimator should be divided by (nb - 1), the corrected estimate, and not by nb. That's what we call Bessel's correction.
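
    (For reference, the two estimators being contrasted: the 1/n_b form is the maximum-likelihood estimate typically used inside EM, while dividing by n_b - 1 gives the unbiased, Bessel-corrected estimate.)

        \hat\sigma^2_{\mathrm{ML}} = \frac{1}{n_b}\sum_{i=1}^{n_b}(x_i-\mu_b)^2,
        \qquad
        \hat\sigma^2_{\mathrm{unbiased}} = \frac{1}{n_b-1}\sum_{i=1}^{n_b}(x_i-\mu_b)^2 .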

  • @rlobo2535
    @rlobo2535 5 years ago +22

    Your voice is like "Gale Boetticher" the lab guy from Breaking Bad, good video though

  • @cageybee777
    @cageybee777 4 years ago +1

    At 5:37, how do we know P(b) and P(a)?

  • @abhishekagnihotri9233
    @abhishekagnihotri9233 5 years ago

    The way of explanation is very good.

  • @sreejadevisetti
    @sreejadevisetti 1 year ago +1

    Awesome explanation !!

  • @siyuanxiang1636
    @siyuanxiang1636 2 years ago

    Thanks for this amazing video! It clarifies EM and it's really helpful! Thanks for making it!

  • @desitravellers2023
    @desitravellers2023 5 years ago

    Clear, concise and insightful. Thank you.

  • @200415670
    @200415670 5 years ago

    This is crazy. You explain it so clearly!! Thanks a lot!!

  • @dilettachiaro5322
    @dilettachiaro5322 4 years ago

    so grateful to you man for this explanation!

  • @yooneylee6694
    @yooneylee6694 4 years ago

    In EM, do we know how many sources there are, or do we have to guess that as well?

  • @cp3shadow
    @cp3shadow 4 years ago

    At 5:25, P(x_i|b) equals a PDF as stipulated by your equation. However, since the random variables are continuous, shouldn't the likelihood equal a density function f, NOT a probability? You'd need to integrate over some interval in order to claim the Gaussian PDF equals a probability
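
    (A short note on why the normalised quantity is still a genuine probability even though Gaussian densities, not probabilities, are plugged in: the infinitesimal interval width cancels in the ratio.)

        P(b \mid x_i) \approx \frac{f_b(x_i)\,P(b)\,dx}{f_a(x_i)\,P(a)\,dx + f_b(x_i)\,P(b)\,dx}
                      = \frac{f_b(x_i)\,P(b)}{f_a(x_i)\,P(a) + f_b(x_i)\,P(b)} .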

  • @krystaljinluma
    @krystaljinluma 4 years ago

    How do you calculate the new parameters in EM? In k-means you would compute the means for each feature, but I'm confused about how you do that in EM?

    • @krystaljinluma
      @krystaljinluma 4 years ago

      Nvm I found your next video explaining exactly what I was asking. Great videos
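
    (For the question above about re-estimating parameters: a common form of the weighted M-step updates, noted here rather than quoted from the follow-up video, replaces k-means' hard per-cluster averages with responsibility-weighted ones.)

        \mu_b = \frac{\sum_i P(b \mid x_i)\, x_i}{\sum_i P(b \mid x_i)},
        \qquad
        \sigma_b^2 = \frac{\sum_i P(b \mid x_i)\,(x_i-\mu_b)^2}{\sum_i P(b \mid x_i)} .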

  • @martijnhuijnen1
    @martijnhuijnen1 11 months ago

    Thanks so much! I will refer my students to your webpage!

  • @adityapandey5264
    @adityapandey5264 5 years ago

    This is so easy to understand. Thank you.

  • @mavaamusicmachine2241
    @mavaamusicmachine2241 2 years ago

    thank you for this lecture, extremely well explained

  • @erictao8396
    @erictao8396 1 month ago

    Great explanation, thanks!

  • @phuongdinh5836
    @phuongdinh5836 7 years ago

    Beautifully explained.

  • @ahmadjaradat3011
    @ahmadjaradat3011 29 days ago

    Such a nice explanation!!!

  • @havayastik
    @havayastik 10 years ago

    thanks man, this video saved me so much time to understand it

    • @vlavrenko
      @vlavrenko  10 years ago +2

      Thanks, glad this was helpful.

  • @camilledingam8210
    @camilledingam8210 3 years ago

    Hello, please can you show me a reference where I can find the cluster-membership expression for expectation maximization? I will be very grateful for your help, thank you

  • @billbentley3
    @billbentley3 4 years ago

    Is it possible to use the EM technique with categorical variables and if so, how does the technique change to do that?

  • @Code-09
    @Code-09 5 years ago

    Best explanation. Thank you very much.

  • @EliBiomedEng
    @EliBiomedEng 3 years ago

    AWESOME EXPLANATION, THANK YOU

  • @kevinsong1056
    @kevinsong1056 8 years ago

    Very very good explanation! Thank you!

  • @peterreichwald4400
    @peterreichwald4400 10 years ago

    Really well explained. Which book is used in the course?

    • @vlavrenko
      @vlavrenko  10 years ago

      Thank you. The lectures aren't based on any single textbook.

  • @iamconnected5991
    @iamconnected5991 3 years ago

    Thank you so much for this great video!

  • @SrikrishnaBhat
    @SrikrishnaBhat 5 years ago

    @2:40 the expression should be (x1-mu_b)^2 instead of (x1-mu_1)^2

  • @michaelkuzmin
    @michaelkuzmin 5 years ago

    Great video. The only part that I found confusing was the last sentence: it will re-estimate to fit the assignments a little bit better. If P(xi|a)=0.18 and P(xi|b)=0.17, is it going to try to maximize just one of them, or a combination of both, or what?

    • @michaelkuzmin
      @michaelkuzmin 5 years ago

      ah, nevermind, I watched your example video, it is now perfectly clear. th-cam.com/video/iQoXFmbXRJA/w-d-xo.html

  • @CT99999
    @CT99999 1 year ago

    Great explanation!

  • @dayeonoh80
    @dayeonoh80 3 years ago

    Thank you for the video, it helped me a lot!!!!!! Great explanation

  • @erlendlangseth4672
    @erlendlangseth4672 6 years ago

    Thanks, very helpful! My course notes are stuffed with math notation. This is much better.

  • @sabbirneplumpstein334
    @sabbirneplumpstein334 2 months ago +1

    You're amazing

  • @ramkufavourites
    @ramkufavourites 8 years ago

    Hats off to you sir.

  • @DharshiniMBAnu
    @DharshiniMBAnu 6 years ago

    Really, it is very simple to understand. Good one :)

  • @JohnJones1987
    @JohnJones1987 7 years ago

    Would it be possible to do a Kernel Density Estimate on the raw values, build up a bimodal smoothed distribution, then use that to find the underlying distributions? Maybe that's already done and has a name; I'm just not very good at statistics ^_^

  • @desisto007
    @desisto007 7 years ago

    Thank you so much for this lecture!

  • @sumowll8903
    @sumowll8903 2 years ago

    Well done! Thank you so much!

  • @YYchen713
    @YYchen713 2 years ago

    this is awesome! really helpful!

  • @kenway346
    @kenway346 4 years ago

    this video is legendary!

  • @jianishen5656
    @jianishen5656 7 years ago

    So helpful! Love your videos! Thanks!

  • @ting-yuhsu4229
    @ting-yuhsu4229 4 years ago

    Thanks for the video! It was really helpful :D

  • @Ergydion
    @Ergydion 4 years ago

    Thanks, great explanation!

  • @ibrizottingrid
    @ibrizottingrid 8 years ago

    Great video! Thank you very much.

  • @DrKnowsMore
    @DrKnowsMore 8 months ago

    Outstanding!

  • @anilsarode6164
    @anilsarode6164 6 years ago

    Great, sir... hats off to you.

  • @janmusil5783
    @janmusil5783 9 years ago

    Thanks a lot! I'm going to the finals in 2 days so if I draw EM as my question (which I hope for now), I will use this example :-)

    • @vlavrenko
      @vlavrenko  9 years ago

      Thank you! I hope you do well on your final.

    • @janmusil5783
      @janmusil5783 9 years ago

      Thanks! I did, eventually I drew Bayesian and Non-Bayesian tasks, but I think I would have made it with EM too :-)

  • @statisticstime4734
    @statisticstime4734 4 years ago

    Excellent!

  • @itzikgutzcha4779
    @itzikgutzcha4779 4 years ago

    This was helpful, thank you.

  • @HebrewSongPod
    @HebrewSongPod 5 years ago

    I love you. You saved my day.

  • @shaukat6
    @shaukat6 8 years ago

    Could you please share the slides?

  • @vanyas8281
    @vanyas8281 9 years ago

    Very clear!!!! Thanks so much!!!!

  • @siddhantsaurabh6098
    @siddhantsaurabh6098 9 years ago

    Nice explanation. Thanks a lot

    • @vlavrenko
      @vlavrenko  9 years ago

      You're welcome.

  • @lizaleon5854
    @lizaleon5854 5 years ago

    omg it was so useful! THANK YOU!

  • @haojiang5618
    @haojiang5618 8 years ago

    great lecture, tks

  • @jaroslawoska
    @jaroslawoska 10 years ago

    Great, that was what I was looking for.

  • @dhineshkumarramasubbu1190
    @dhineshkumarramasubbu1190 6 years ago

    Is EM similar to the Fuzzy C-means algorithm? It would be really helpful if someone could help me.