(ML 16.6) Gaussian mixture model (Mixture of Gaussians)

  • Published 25 Nov 2024

Comments • 54

  • @mohamedgamal-gi5ws · 3 years ago +1

    This video is gold. I tried to understand this from other resources but couldn't. Thanks a lot, sir, you are a real math monk!

  • @zx1987mdb · 3 years ago

    Very nicely explained. I will jump to your EM video right away! Thank you!

  • @2008astro · 13 years ago +1

    @apostoloumichail Determining the number of Gaussians is an NP-hard problem, so we have to make a guess. For example, we assume our process is the result of one Gaussian, two Gaussians, or three Gaussians. Then we estimate the parameters with the EM algorithm under each hypothesis. To decide which hypothesis fits our data best, we can apply Wilks' theorem to compare the likelihood of our data under these hypotheses and choose the hypothesis with the maximum likelihood.
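
A rough sketch of the procedure this comment describes, assuming scikit-learn's GaussianMixture (which fits the parameters by EM) and SciPy's chi-squared tail function for a Wilks-style likelihood-ratio comparison; the data array X below is a made-up placeholder, not anything from the video.

    # Sketch: fit GMMs with 1, 2, 3 components by EM and compare the fits.
    # Assumes scikit-learn and SciPy; X is a made-up toy data set.
    import numpy as np
    from scipy.stats import chi2
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-2.0, 1.0, size=(100, 1)),
                   rng.normal(3.0, 0.5, size=(100, 1))])   # toy 1-D data

    loglik = {}
    for m in (1, 2, 3):
        gmm = GaussianMixture(n_components=m, n_init=5, random_state=0).fit(X)
        loglik[m] = gmm.score(X) * X.shape[0]   # total log-likelihood of the data

    # Likelihood-ratio statistic for 1 vs 2 components (Wilks-style; the usual
    # regularity conditions fail at the boundary, so treat the p-value as a rough guide).
    lr = 2.0 * (loglik[2] - loglik[1])
    extra_params = 3        # one extra mean, variance, and mixing weight in 1-D
    p_value = chi2.sf(lr, df=extra_params)
    print(loglik, lr, p_value)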

  • @Tetraxzx9 · 3 years ago

    Excellent video, great explanation and proof. Thanks a lot.

  • @yethusithole4695 · 3 months ago

    Practical to follow, thanks.

  • @sangeethamshare · 10 years ago +14

    I am lost as well. It is good practice to always give examples at each stage instead of just throwing out a lot of equations.

    • @rickyhan7023 · 7 years ago

      I am sorry to break this to you, but you are a brainlet and should steer clear of this excellent video.

    • @woohoo_head_in_the_game5154 · 7 years ago +12

      Could you be any more condescending? He is trying to learn.

    • @adityasista1836 · 7 years ago

      If you hate equations, you are studying the wrong subject, my friend...

    • @erickightley7672 · 7 years ago +1

      The first three minutes of the video consist of examples in one and two dimensions, without a single equation.

  • @MrYachtie · 9 years ago +1

    I like your videos very much. Thank you.

  • @kreechapuphaiboon4886 · 6 years ago

    Great explanation, liked the colouring

  • @kimjong-un4574 · 2 years ago

    Very useful video for my studies.

  • @pankajgabale3210 · 6 years ago

    Precise and clear information, Thank you!

  • @gautamsvid · 11 years ago +4

    what is the relationship between x and Z?

  • @AnnevanRossum · 12 years ago +1

    Check out nonparametric Bayesian models if you don't have the number of classes in advance.
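
A sketch of that idea using scikit-learn's BayesianGaussianMixture, a variational approximation to a Dirichlet-process mixture (not something covered in the video). With a "dirichlet_process" prior it can shrink the weights of unneeded components, so the effective number of classes is inferred from the data; X below is a made-up placeholder.

    # Sketch: let the model prune unneeded components instead of fixing m in advance.
    # Assumes scikit-learn; X is a made-up toy data set.
    import numpy as np
    from sklearn.mixture import BayesianGaussianMixture

    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(-3.0, 1.0, size=(200, 2)),
                   rng.normal(2.0, 0.7, size=(200, 2))])    # toy 2-D data

    dpgmm = BayesianGaussianMixture(
        n_components=10,                                     # an upper bound, not the answer
        weight_concentration_prior_type="dirichlet_process",
        max_iter=500, random_state=0).fit(X)

    # Components whose posterior weight stays near zero are effectively unused.
    print(np.round(dpgmm.weights_, 3))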

  • @windessohn · 4 years ago

    Thanks for the good video.

  • @kurienabraham8739 · 2 years ago

    Why is the latent variable z a vector, and specifically one of the unit vectors along each dimension? Can't we just work with a latent variable z that is a scalar?
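
For what it's worth, the one-hot convention is mainly notational: a scalar label in {1, ..., m} carries exactly the same information, but writing z as a unit vector lets the joint density be written as one product over components (standard GMM notation, not a transcription of the board):

    z \in \{e_1, \ldots, e_m\}, \qquad P(z_k = 1) = \alpha_k,
    \qquad p(x, z) = \prod_{k=1}^{m} \bigl[\alpha_k\, N(x \mid \mu_k, \Sigma_k)\bigr]^{z_k}

Since exactly one z_k equals 1, only one factor in the product is active, so nothing is lost with a scalar label; the vector form just makes the EM derivations tidier.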

  • @ryuwtnb · 13 years ago

    Thank you very much, it's really helpful.

  • @FloppyDobbys · 7 years ago

    Some Wolfram code:
    (* interactive PDF of an equal-weight two-component Gaussian mixture;
       the sliders a and b move the two component means *)
    Manipulate[
      Plot[
        PDF[MixtureDistribution[{5, 5},
            {NormalDistribution[a, 2], NormalDistribution[b, 2]}], x],
        {x, -6, 6}, Filling -> Bottom, ColorFunction -> "Rainbow"],
      {a, -3, 1}, {b, -1, 3}]

  • @macomartinez2408 · 3 years ago

    Some segments in the video have timestamps that are not adjacent to each other.

  • @topherpalmtree · 12 years ago

    Great videos! Have you thought about posting the pdf of your drawings from your videos as notes?

  • @nehildans8066 · 5 years ago

    Wait a second, isn't the formulation at 10:55 wrong? It should be a sum, not a product. Basically what we need to do is marginalize over z (in our case z is a discrete latent variable, so we need a sum rather than an integral) to get the marginal distribution, and then, to get the distribution of all N points, we take the product. This is the reason we apply the EM algorithm: when we take the logarithm of the resulting equation we can't simplify it directly because of the sum inside, and the EM algorithm comes into play to deal with this difficulty. Am I wrong?
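
Writing out the two steps the comment describes (marginalize over the discrete z for one point, then take the product over the N independent points); this is the standard GMM likelihood, not a transcription of the board:

    p(x_n) = \sum_{z} p(x_n, z) = \sum_{k=1}^{m} \alpha_k\, N(x_n \mid \mu_k, \Sigma_k),
    \qquad \log p(x_1, \ldots, x_N) = \sum_{n=1}^{N} \log \sum_{k=1}^{m} \alpha_k\, N(x_n \mid \mu_k, \Sigma_k)

The sum inside the logarithm is what blocks a closed-form maximum-likelihood solution, which is the usual motivation for EM.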

  • @HanXiao-y3f · 10 years ago +1

    Gautam Garg I think Z is the underlying generating mechanism of x. That is why Z is called a latent variable.

    • @addiesbliss6954 · 9 years ago

      I concur.

    • @vikram360 · 9 years ago +1

      肖晗 Well, I don't think Z is really an underlying generating mechanism of x. Let's say you have k buckets of numbered balls. The numbers in each bucket are distributed according to a Gaussian, each bucket with its own parameters (\mu and \sigma^2). The Z's merely represent which bucket you pick, from what I understand of it. The \alpha_k's determine the probability that you pick the k'th bucket.
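
A minimal NumPy sketch of that "pick a bucket, then draw from it" story; the weights and parameters below are made up for illustration and are not from the video.

    # Sketch of the GMM generative process: z picks a component, x is drawn from it.
    # Assumes NumPy; alpha, mu, sigma are made-up illustrative values.
    import numpy as np

    rng = np.random.default_rng(0)
    alpha = np.array([0.5, 0.3, 0.2])     # mixing weights, sum to 1
    mu = np.array([-4.0, 0.0, 3.0])       # component means
    sigma = np.array([1.0, 0.5, 2.0])     # component standard deviations

    def sample_gmm(n):
        z = rng.choice(len(alpha), size=n, p=alpha)   # latent component labels
        x = rng.normal(mu[z], sigma[z])               # draw from the chosen Gaussian
        return x, z

    x, z = sample_gmm(1000)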

  • @hassanshakeel854 · 5 years ago

    Is it necessary to intersect the contours the way you did?

  • @salahiddinaltahat1749 · 10 years ago +2

    Thanks. Very helpful. May I ask what technology you use to produce these lectures?

  • @apostoloumichail · 13 years ago +4

    The video is great! But I have a question.
    As I can see, a mixture of m Gaussians is defined here, where m is the number of classes. What do you do if you do not know the number of classes?
    Thanks,
    Mike

    • @AizenAwakened · 4 years ago

      For anyone else with this question, using the Bayesian Information Criterion (BIC) will help.
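
A sketch of the BIC-based selection the reply mentions, assuming scikit-learn's GaussianMixture (its bic method returns the Bayesian Information Criterion; lower is better); X below is a made-up placeholder data set.

    # Sketch: choose the number of components by BIC (lower is better).
    # Assumes scikit-learn; X is a made-up toy data set.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(2)
    X = np.vstack([rng.normal(-3.0, 1.0, size=(150, 2)),
                   rng.normal(2.0, 0.7, size=(150, 2))])    # toy 2-D data

    bics = {m: GaussianMixture(n_components=m, n_init=5, random_state=0).fit(X).bic(X)
            for m in range(1, 7)}
    best_m = min(bics, key=bics.get)
    print(bics, best_m)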

  • @zhaoc033 · 9 years ago

    So in this video we have finitely many values of z. Is it possible to expand z to an uncountable set? What would the joint pdf be in that case?
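
For the continuous case the comment asks about, the sum over z becomes an integral (this is the general latent-variable form, not anything specific to the video); with a Gaussian prior on z and a linear-Gaussian p(x | z), this gives models such as factor analysis or probabilistic PCA:

    p(x, z) = p(z)\, p(x \mid z), \qquad p(x) = \int p(x \mid z)\, p(z)\, dz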

  • @ruilinchen1846 · 11 years ago

    I don't quite understand why P(x) is a sum of alpha*N(x) terms but P(x, z) is a product of them. Shouldn't they be the same?

  • @kmoria2011 · 9 years ago

    Very useful.

  • @wucebrain · 9 years ago +1

    Nice explanation. Which software do you use for the presentation?

  • @nabbuhaq · 7 years ago +1

    good mythical morning!

  • @riomanty · 6 years ago

    m=2 here?

  • @I77AGIC · 7 years ago

    thanks for the video but it would be a lot better if it were legible

  • @franciscomelojunior2535 · 7 years ago

    Good explanation, the written part could improve though.

  • @kavyaashok6465 · 6 years ago +2

    Around 1:07, why do you say it is going to be a convex combination of the individual pdfs? Why should it be convex?

    • @HimanshuAroraa · 6 years ago

      Consider the critical points:
      1. where Gaussian 1 and Gaussian 2 intersect
      2. where Gaussian 2 and Gaussian 3 intersect
      Although all three Gaussians are nonzero at any given x, it is the interaction between these specific pairs of Gaussians that mostly shapes the combined PDF near these points.
      Near point 1, Gaussian 1's PDF is decreasing while Gaussian 2's is increasing, which gives the combined PDF a convex shape near that point (decreasing and then increasing).
      Near point 2, Gaussian 2's PDF is decreasing while Gaussian 3's increases a little and then decreases, which gives the combined PDF a little convexity near that point before it becomes concave again.
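
For what it's worth, "convex combination" here refers to the mixing weights rather than to the curvature of the resulting density: the coefficients are nonnegative and sum to one, so the mixture is a weighted average of the component pdfs (and therefore itself a valid pdf). In symbols:

    p(x) = \sum_{k=1}^{m} \alpha_k\, N(x \mid \mu_k, \Sigma_k), \qquad \alpha_k \ge 0, \quad \sum_{k=1}^{m} \alpha_k = 1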

  • @gucko123 · 10 years ago +6

    I'm lost too

  • @gautam2usa · 10 years ago +6

    Seems you are very much confused... because whenever you get stuck, you try to skip over that doubt!
    I have observed this in almost all of your videos.

  • @shrivatsankchari1729 · 6 years ago

    probability and vectors together? Oh man

  • @AnnevanRossum · 12 years ago

    I do not seem to be able to link to "video lectures dot net", but you will find some really good explanations there.

  • @rajeev3131 · 12 years ago

    I am lost!!!

  • @adilfsr1 · 13 years ago

    thx :-)

  • @adityasista1836 · 7 years ago

    How wrong would it be if you had just said "A Gaussian mixture model is simply a linear combination of m Gaussians with means mu_k and covariances C_k"? It would have saved the first 10 minutes...

    • @kakolaukiommarzouk3332 · 7 years ago

      In fact, a GMM is not a LINEAR combination of m Gaussians. Rather, it is a CONVEX combination of m Gaussians.

    • @loic4269 · 7 years ago

      Actually, all CONVEX combinations are LINEAR combinations but not all linear combinations are convex.

    • @kakolaukiommarzouk3332 · 7 years ago +1

      Yeah, true. I wanted to say that a GMM is not just any linear combination; it is specifically of the convex type.

  • @Secondbloodbrother · 3 years ago +1

    Not really helpful as an introduction to be honest.