Awesome explanation, thank you!
Thanks a lot :)
Extremely high quality. Thank you so much. If your other videos are of similar quality, you deserve many more subs.
Thank you so much :)
Feel free to look around. And if you like the content, I would be happy if you share the channel with your colleagues and peers :)
Thanks, man! One of the best videos on the Dirichlet distribution. I finally have an intuition for it.
You're very welcome 🤗
Glad I found this channel. Even as a master's student, I still couldn't figure out what the Dirichlet distribution is :)
Welcome to the channel :). Glad I can help.
Rest assured, when I started my master's, I also did not know about the Dirichlet distribution. :D Probability theory is just such a huge field, and it takes time to comprehend it. I hope I can contribute to that understanding. :)
Good luck with your studies ☺️
Thanks!
You're very welcome :)
Also thanks a lot for the donation ☺️
Exactly what I was looking for! Thank you so much.
Thanks for the kind comment. I am glad I could help :)
Really clear to follow and understand.
Thanks 😊 I really appreciate the feedback.
Excellent explanation
Thank you very much 😊
Excellently explained. Thank you!
You're welcome, thanks for the kind comment 😊
Thanks for the video, very clear explanation.
You're welcome :)
Thanks for the feedback. Glad you enjoyed it :)
Thank you! This really helped me a lot!
You're welcome 🤗 I'm glad I could help.
Excellent explanation!
Thanks :)
Very good tutorial, keep it up, and thank you!
You're welcome :)
Thank you for the motivation. It's amazing to see that the videos have so much value.
Thank you for the content!
You're welcome 😊
Hi, thanks for the great video! Could you make a video to derive the variational inference formula of "The author-topic model for authors and documents" ?
Hi,
thanks a lot for the kind comment and feedback :).
I will put your suggestion on my list of possible video ideas, but for now I do not want to go too deep into NLP; it's not my field of expertise and research.
alpha = the shape parameter of the Dirichlet
theta or x_i = the points on the simplex
But what does a uniform distribution mean if I set all alphas to 1?
I cannot understand it.
Hey, do you have a particular timestamp you are referring to in the video?
Maybe some general thoughts: the Dirichlet distribution is defined over a D-dimensional vector whose elements are non-negative and add up to one. This is a necessary property for this vector to be a valid input to a categorical distribution.
The Dirichlet distribution itself is parameterized by another D-dimensional vector that I called alpha here. This alpha moves the probability density around on the (D-1)-dimensional simplex, i.e., it controls whether some components of the D-dimensional theta vector are likely to take high values or not.
Setting all entries of alpha to 1 gives a uniform prior over this simplex: every valid theta vector is equally likely. The mean of this uniform Dirichlet is the vector whose entries are all 1/D, which, as required, has all components adding up to one.
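You can check this numerically. A minimal sketch using NumPy's Dirichlet sampler (the variable names here are just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
D = 3  # dimension of the theta vector

# alpha = (1, ..., 1) gives a uniform density over the (D-1)-simplex.
uniform_alpha = np.ones(D)
samples = rng.dirichlet(uniform_alpha, size=1000)

# Every sample is a valid categorical parameter: components sum to one.
print(samples.sum(axis=1)[:3])

# The sample mean approaches (1/D, ..., 1/D), the mean of the uniform Dirichlet.
print(samples.mean(axis=0))

# Scaling alpha up concentrates the density near that mean,
# so the per-component spread shrinks.
concentrated = rng.dirichlet(50.0 * uniform_alpha, size=1000)
print(samples.std(axis=0))
print(concentrated.std(axis=0))
```

With alpha = (1, 1, 1) the draws scatter evenly over the triangle (the 2-simplex), while alpha = (50, 50, 50) pulls them tightly around (1/3, 1/3, 1/3).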
Awesome explanation! Thanks
You're welcome! 😊