Dirichlet Distribution | Intuition & Intro | w/ example in TensorFlow Probability

  • Published on Jan 21, 2025

Comments • 34

  • @realgiesebrecht2180
    @realgiesebrecht2180 2 years ago +2

    Awesome explanation, thank you!

  • @franciscos.2301
    @franciscos.2301 3 years ago +5

    Extremely high quality. Thank you so much. If your other videos are of similar quality, you deserve many more subs.

    • @MachineLearningSimulation
      @MachineLearningSimulation  3 years ago +1

      Thank you so much :)
      Feel free to look around. And if you like the content, I would be happy if you share the channel with your colleagues and peers :)

  • @sandeepsingh-ly9nh
    @sandeepsingh-ly9nh 1 year ago +1

    Thanks, man. One of the best videos on the Dirichlet distribution. I finally have an intuition for it.

  • @Mustafa-sj6hi
    @Mustafa-sj6hi 2 years ago +1

    Glad I found this channel. Even as a master's student, I still hadn't been able to figure out what the Dirichlet distribution is :)

    • @MachineLearningSimulation
      @MachineLearningSimulation  1 year ago

      Welcome to the channel :). Glad I can help.
      Rest assured, when I started my master's, I also did not know about the Dirichlet distribution. :D Probability theory is just such a huge field, and it takes time to comprehend; I hope I can contribute to that understanding. :)
      Good luck with your studies ☺️

  • @Henrypassau
    @Henrypassau 2 years ago +1

    Thanks!

  • @AmIR.W98
    @AmIR.W98 1 year ago +1

    exactly what I was looking for! Thank you so much

  • @user-or7ji5hv8y
    @user-or7ji5hv8y 3 years ago +3

    Really clear to follow and understand.

  • @mukunthanr2514
    @mukunthanr2514 3 years ago +2

    Excellent explanation

  • @naterthot
    @naterthot 1 year ago +1

    Excellently explained. Thank you!

  • @sumailsumailov1572
    @sumailsumailov1572 3 years ago +2

    Thanks for the video, very clear explanation.

  • @林彥承-l6e
    @林彥承-l6e 1 year ago +1

    Thank you! This really helped me a lot!

  • @habilismayilov838
    @habilismayilov838 1 year ago +1

    excellent explanation!!

  • @xiaoweidu4667
    @xiaoweidu4667 3 years ago +2

    very good tutorial, keep it up and thank you

    • @MachineLearningSimulation
      @MachineLearningSimulation  3 years ago

      You're welcome :)
      Thank you for the motivation. It's amazing to see that the videos have so much value.

  • @royvivat113
    @royvivat113 2 years ago +1

    Thank you for the content!

  • @Bryan-jb6xu
    @Bryan-jb6xu 2 years ago +1

    Hi, thanks for the great video! Could you make a video deriving the variational inference formulas of "The author-topic model for authors and documents"?

    • @MachineLearningSimulation
      @MachineLearningSimulation  2 years ago

      Hi,
      thanks a lot for the kind comment and feedback :).
      I will put your suggestion on my list of possible video ideas, but for now I do not want to go too deep into NLP; it's not my field of expertise or research.

  • @ccuuttww
    @ccuuttww 3 years ago

    alpha = the shape of the Dirichlet
    theta or x_i = the points on the simplex
    But what does it mean that it is a uniform distribution if I set all alphas to 1?
    I cannot understand.

    • @MachineLearningSimulation
      @MachineLearningSimulation  3 years ago

      Hey, do you have a particular timestamp you are referring to in the video?
      Maybe some general thoughts: the Dirichlet distribution is defined over a D-dimensional vector whose elements add up to one. This is a necessary property for that vector to be a valid input to a Categorical distribution.
      The Dirichlet distribution itself is parameterized by another D-dimensional vector that I called alpha here. This alpha moves the probability density around on the (D-1)-dimensional simplex, i.e., it controls whether high values in some components of the D-dimensional vector are more likely than in others.
      A uniform prior on theta means the density is constant over the simplex, which is exactly what setting all alpha entries to 1 gives you: every valid theta vector (non-negative entries summing to one) is then equally likely. Note that the mean of this uniform Dirichlet is the vector with each entry equal to 1/D.
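      The uniform case (all alpha_i = 1) can also be checked numerically. A minimal sketch using `scipy.stats.dirichlet` (SciPy stands in here for the video's TensorFlow Probability; the two simplex points are arbitrary examples):

```python
import numpy as np
from scipy.stats import dirichlet

# With all concentration parameters alpha_i = 1, the Dirichlet density
# is constant over the simplex: every valid theta is equally likely.
alpha = np.ones(3)

# Two different (arbitrary) points on the 2-simplex; entries sum to 1.
theta_a = np.array([0.2, 0.3, 0.5])
theta_b = np.array([0.6, 0.1, 0.3])

# Both points have the same density, Gamma(3) = 2 on the 2-simplex.
print(dirichlet.pdf(theta_a, alpha))  # 2.0
print(dirichlet.pdf(theta_b, alpha))  # 2.0

# The mean is the vector with each entry 1/D.
print(dirichlet.mean(alpha))  # [1/3, 1/3, 1/3]
```

      The equivalent object in TensorFlow Probability is `tfp.distributions.Dirichlet(concentration=[1., 1., 1.])`, whose `prob` method gives the same constant density.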

  • @fizoolplayer
    @fizoolplayer 1 year ago +1

    Awesome explanation! Thanks