This self-multiplication is similar to the Attention mechanism, where you have Keys and Queries (and Values, all built from basically the same inputs) that create the content.
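As a rough illustration of that analogy, here is a minimal self-attention sketch in NumPy. The projection matrices and toy dimensions are made up for illustration; the point is that Queries, Keys, and Values are all derived from the same input X, so the attention scores come from the input multiplied with itself.

import numpy as np

rng = np.random.default_rng(0)
n_tokens, d_model = 4, 8

X = rng.normal(size=(n_tokens, d_model))        # the shared input
W_q = rng.normal(size=(d_model, d_model))       # hypothetical projection matrices
W_k = rng.normal(size=(d_model, d_model))
W_v = rng.normal(size=(d_model, d_model))

Q, K, V = X @ W_q, X @ W_k, X @ W_v             # all three derived from the same X

scores = Q @ K.T / np.sqrt(d_model)             # token-by-token similarity
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax

output = weights @ V                            # content built from the input itself
print(output.shape)                             # (4, 8)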
I had been racking my brain over why the training process was working. I really appreciate your amazing teaching.
Coffee with my fav Grokking Book and this session again made my day. Thank you again, sir.
It's a great explanation, and connecting weighted graphs to energy is a very different perspective. I like it very much; thank you so much.
Thank you for your effort and fantastic content. I am amazed by the simple, clear and practical explanations. I am not sure if the topic of importance sampling has been already covered in one of your lectures. I will look for it. If not, maybe we will be lucky to have it covered in a future video. Thank you once more for your incredible work and generosity.
Thank you so much, I'm glad you liked it! Thanks for the suggestion, yes, importance sampling is a great topic, I'll add it to the list! :)
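For anyone curious what importance sampling refers to while waiting for that video, here is a minimal sketch under toy assumptions: we estimate an expectation under a target distribution p (here a standard normal) while only drawing samples from a proposal q, reweighting each sample by p(x)/q(x).

import numpy as np

rng = np.random.default_rng(0)

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

f = lambda x: x ** 2                                    # E_p[x^2] under N(0, 1) is 1

x = rng.normal(loc=1.0, scale=2.0, size=100_000)        # sample from proposal q = N(1, 2)
w = normal_pdf(x, 0.0, 1.0) / normal_pdf(x, 1.0, 2.0)   # importance weights p(x)/q(x)

print(np.mean(w * f(x)))                                # ≈ 1.0, the expectation under p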
Great video!
25:48 I believe you misspoke. It should be "Low energy for the good configuration" instead of "Low energy for the bad configuration."
Ay, you’re correct, thank you! Yes, I should have said low energy for the good configuration.
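To make the corrected statement concrete, here is a minimal sketch assuming a Hopfield-style energy E(s) = -1/2 * s^T W s with Hebbian weights (the lecture's exact setup may differ): the stored, "good" configuration sits at low energy, while a random, "bad" one sits higher.

import numpy as np

rng = np.random.default_rng(0)
n = 16

pattern = rng.choice([-1, 1], size=n)            # the configuration we want to store
W = np.outer(pattern, pattern).astype(float)     # Hebbian weight matrix
np.fill_diagonal(W, 0.0)                         # no self-connections

def energy(s):
    return -0.5 * s @ W @ s

bad = rng.choice([-1, 1], size=n)                # an arbitrary configuration
print(energy(pattern), energy(bad))              # the stored pattern has the lower energy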
Great lecture, thank you very much!
Crazy simple, thank you!
Just wow, it was my best coffee break ever.
Thank you, so glad to hear! :)
Does this mean that the whole thing starts by identifying the correlation between two features, let's say x1 and x2?
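That is roughly the Hebbian reading of it. As a sketch under that assumption (the toy data and the co-activation rule below are illustrative, not the lecture's exact procedure), the weight between two features is just their average co-activation, so strongly correlated features like x1 and x2 end up with a large weight.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: x1 and x2 tend to agree, x3 is independent noise (all values ±1).
x1 = rng.choice([-1, 1], size=1000)
x2 = np.where(rng.random(1000) < 0.9, x1, -x1)   # agrees with x1 90% of the time
x3 = rng.choice([-1, 1], size=1000)

X = np.stack([x1, x2, x3], axis=1)

W = X.T @ X / len(X)          # average co-activation <x_i * x_j>
np.fill_diagonal(W, 0.0)      # ignore self-correlations

print(np.round(W, 2))         # large positive w_12; w_13 and w_23 near zero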