Nice explanation... But how does the embedding layer convert tokens into vector representations? If you could take an example and explain this, it would be helpful for those who are watching this video...
The embedding layer learns the relationships between words. For more info, read about the word2vec model. I'll definitely make a video about how it works in the near future.
@@DevelopersHutt Ok, waiting for your video... My suggestion for you: the description of the Transformer you are giving is already available in write-ups as well as videos, but viewers want to see an explanation using examples, like taking a sentence and showing how things happen to it step by step. Nice start, though...
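To make the embedding answer concrete: an embedding layer is just a learnable lookup table that maps each token id to a vector, and those vectors are adjusted during training so related words end up close together. A minimal PyTorch sketch (the toy vocabulary and dimensions here are made up for illustration, not taken from the video):

```python
import torch
import torch.nn as nn

# Toy vocabulary; real models use a tokenizer over tens of thousands of tokens.
vocab = {"why": 0, "is": 1, "the": 2, "sky": 3, "blue": 4}

# An embedding layer is a learnable table of shape (vocab_size, embedding_dim).
# Each row is the vector for one token, updated by backprop during training.
embedding = nn.Embedding(num_embeddings=len(vocab), embedding_dim=8)

token_ids = torch.tensor([vocab[w] for w in ["why", "is", "the", "sky", "blue"]])
vectors = embedding(token_ids)  # shape (5, 8): one 8-d vector per token

print(vectors.shape)  # torch.Size([5, 8])
```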
Every time I need to remember key concepts in transformers, I go back to this video. God bless you, bro.
Thanks bro ❤️
A very good and simplified video. Thank you.
I cannot like this enough!!!!!! This channel is SOOO underrated
I'm glad you liked it ❤️
This is really good man - Nice video and you've done a fantastic job explaining some very complex ideas.
Impressively clear and easy to follow while still being precise!
Beautiful explanation brother!
This is very clear. I understood a lot through this video about transformers. Thank you.
Glad you liked it!
Thank you, very clear visualization of the processes involved in the AIAYN (Attention Is All You Need) concept...
best explanation of transformers ever, thank you 👍
Glad you liked it!
Very nice explanation. Absolutely excellent!!! Thanks.
@8:47 should be: the query (Q) comes from the decoder. The key (K) and value (V) come from the encoder.
Yep
So at 7:47, the encoder output should actually read Ke and Ve, right?
@@adscript4713, that is correct.
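For anyone following this thread, here is a rough sketch of the encoder-decoder (cross) attention being corrected above: Q is projected from the decoder state, while K and V are projected from the encoder output. The shapes and weight matrices below are illustrative assumptions, not values from the video:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
d_model = 64

encoder_output = torch.randn(10, d_model)  # 10 source tokens from the encoder
decoder_state = torch.randn(7, d_model)    # 7 target tokens so far in the decoder

# Learned projection matrices in a real model; random here for illustration.
W_q = torch.randn(d_model, d_model)
W_k = torch.randn(d_model, d_model)
W_v = torch.randn(d_model, d_model)

Q = decoder_state @ W_q   # queries come from the decoder
K = encoder_output @ W_k  # keys come from the encoder
V = encoder_output @ W_v  # values come from the encoder

scores = Q @ K.T / d_model ** 0.5    # (7, 10) scaled dot-product logits
weights = F.softmax(scores, dim=-1)  # each decoder token attends over the source
context = weights @ V                # (7, 64) context vectors fed onward

print(context.shape)  # torch.Size([7, 64])
```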
You saved my day!
Thank you so much!
very easy to understand, nice job!
Thank you for the informative video. I was wondering where I can find the image captioning video that you mentioned at the end of your video?
th-cam.com/video/3e4hsHLDHZA/w-d-xo.html?si=9zsaiRaCKkmRLJEF
Brilliant 👏
wow, well explained!
Such amazing work! thank you sir!
Very well explained
Hi there ! Excellent work thanks !
But: at 5:06, should the PE vectors be APPENDED to the word embeddings and not ADDED?
Yes, correct. I will put a note in the description.
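For reference, the sinusoidal positional encoding from the paper; note that in "Attention Is All You Need" the PE matrix is added element-wise to the embeddings, while concatenating (appending) is a variant some implementations use. A small NumPy sketch with made-up sizes showing both:

```python
import numpy as np

def sinusoidal_pe(seq_len: int, d_model: int) -> np.ndarray:
    """Sinusoidal positional encoding from 'Attention Is All You Need'."""
    positions = np.arange(seq_len)[:, None]   # (seq_len, 1) token positions
    dims = np.arange(0, d_model, 2)[None, :]  # even dimension indices
    angles = positions / np.power(10000.0, dims / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)  # sine on even dimensions
    pe[:, 1::2] = np.cos(angles)  # cosine on odd dimensions
    return pe

embeddings = np.random.randn(5, 8)  # 5 tokens, toy 8-d word embeddings
pe = sinusoidal_pe(5, 8)

added = embeddings + pe                              # what the paper does
appended = np.concatenate([embeddings, pe], axis=1)  # the concatenation variant

print(added.shape, appended.shape)  # (5, 8) (5, 16)
```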
Thanks❤
What tool do you use for animations?
After Effects
Thanks!
Thank you so much 🙏
@@DevelopersHutt Keep up the good work. Your short and to-the-point videos are unique and amazing!
Very nice job. Can you please tell me what software you are using to make such amazing animated presentations? It would be very helpful.
After Effects
In the Attention Filter example, the dot product between query "WHY" and key "WHY" is 0. This doesn't seem possible unless the embedding of "WHY" is 0.
Let me check
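A quick sketch of why a zero self-score can actually happen: if query and key were the raw embedding itself, the self dot product would be a squared norm, zero only for the zero vector, just as the comment says. But with separate learned Q and K projections, as in a real Transformer, the projected vectors can be orthogonal even when the embedding isn't zero. The vectors and matrices below are made up for illustration:

```python
import numpy as np

x = np.array([1.0, 2.0])  # hypothetical embedding of "WHY"

# Raw embedding as both query and key: the self-score is a squared norm,
# strictly positive unless x is the zero vector.
print(np.dot(x, x))  # 5.0

# With separate projections, q and k are different vectors, so their
# dot product can be exactly 0 even though x is not the zero vector.
W_q = np.eye(2)                            # q = x = [1, 2]
W_k = np.array([[2.0, 0.0], [-1.0, 0.0]])  # chosen so k is orthogonal to q
q = W_q @ x  # [1.0, 2.0]
k = W_k @ x  # [2.0, -1.0]
print(np.dot(q, k))  # 0.0
```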
👌
You are a badass