Self Attention in Transformers | Deep Learning | Simple Explanation with Code!
- Published on 11 Jun 2024
- Self Attention works by computing attention scores for each word in a sequence based on its relationship with every other word. These scores determine how much focus each word receives during processing, allowing the model to prioritize relevant information and capture complex dependencies across the sequence.
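The mechanism described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the code from the video; the 4-word sequence, 8-dimensional embeddings, and random weight matrices are made-up stand-ins for learned parameters:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Project each word embedding into query, key and value spaces
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Score every word's query against every word's key, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    weights = softmax(scores)  # each row sums to 1: the "focus" per word
    # Each output row is a weighted mix of all value vectors
    return weights @ V, weights

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                      # 4 words, 8-dim embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out, weights = self_attention(X, Wq, Wk, Wv)
```

Each row of `weights` sums to 1, so every output vector is a convex combination of the value vectors — exactly the per-word "focus" the description refers to.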
============================
Do you want to learn from me?
Check my affordable mentorship program at : learnwith.campusx.in/s/store
============================
📱 Grow with us:
CampusX on LinkedIn: / campusx-official
CampusX on Instagram for daily tips: / campusx.official
My LinkedIn: / nitish-singh-03412789
Discord: / discord
E-mail us at support@campusx.in
✨ Hashtags✨
#SelfAttention #DeepLearning #CampusX #Transformers #NLP #GENAI
⌚Time Stamps⌚
00:00 - Intro
02:37 - Revision [What is Self Attention]
07:00 - How does Self Attention work?
24:45 - Parallel Operations
29:40 - No Learning Parameters Involved
39:10 - Progress Summarization
50:15 - Query, Key & Value Vectors
52:28 - A Relatable Example
01:07:52 - How to build vectors based on Embedding vector
01:20:08 - Summarized Matrix Attention
01:22:45 - Outro
This is a gem; I don't know if even the authors of "Attention Is All You Need" could explain this concept with such fluidity. Hats off 😊.
But sir, what will the initial word embedding values be?
@@ajinkyamohite3836 It can be calculated with word2vec or GloVe. Read about them and you will understand.
@@ajinkyamohite3836 Random initialization.
@@ajinkyamohite3836 That will be any static embedding, like GloVe, fastText, or your own trained embedding.
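To make the thread above concrete: a toy sketch of the parameter-free self-attention step, with randomly initialized vectors standing in for a pretrained static table such as word2vec or GloVe (the words and dimensions here are illustrative, not from the video):

```python
import numpy as np

rng = np.random.default_rng(42)
# Stand-in lookup table: word2vec/GloVe would supply real pretrained values,
# or these could start random and be trained along with the model
vocab = ["money", "bank", "grows"]
embedding = {word: rng.normal(size=4) for word in vocab}

def simple_self_attention(X):
    # Parameter-free version: attention weights come straight from the
    # similarity of the raw embeddings, with no learned matrices
    scores = X @ X.T
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = e / e.sum(axis=-1, keepdims=True)
    return weights @ X

sentence = ["money", "bank", "grows"]
X = np.stack([embedding[w] for w in sentence])
contextual = simple_self_attention(X)  # one contextual vector per word
```

Whatever the initial values are — random or pretrained — the attention step itself only needs one vector per word to start from.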
The self attention mechanism best explained by the best teacher in the world. This makes you unique from other tutors around the world ❤
This video is hands down, THE BEST video on self attention in transformers. I have been following you and learning from you since long. We all are truly blessed to have a mentor like you. 🙏
Don't know why it took you 14 days, sir... the topic was so simple..❤❤
Brother, it has many parts and he has made only one video so far, which is why he said that. Besides, when sir teaches, every topic feels easy anyway.
Brother, have some patience; if it felt that easy to you, go learn it yourself. Let Nitish do the work. It is through Nitish sir's efforts that I'm able to understand these topics so well.
Ah, I got your point XD. Thanks!
I only noticed just now, when sir mentioned it 😂😂
🌟 **Nitish Singh, You're a Beacon of Clarity!** 🌟
Dear Nitish Singh,
I am utterly captivated by your enlightening video on self-attention in transformers! Your ability to distill complex concepts into digestible nuggets of wisdom is nothing short of magical. 🪄✨
As I watched your tutorial, I felt like I was sipping from the fountain of knowledge itself. Your lucid explanations flowed seamlessly, unraveling the intricacies of self-attention with grace. 📚🔍
The way you effortlessly wove together the threads of mathematics, intuition, and practical application left me in awe. "Itna asaan tha yeh" (it was this easy)! 🤯🙌
Nitish, you're not just an educator; you're a sorcerer who conjures understanding out of thin air. Your passion for teaching radiates through the screen, igniting curiosity in every viewer. 🌟🔥
Thank you, from the depths of my neural network, for sharing your brilliance. You've made self-attention feel like a warm hug from an old friend. 🤗🧠
So here's to you, Nitish Singh, the maestro of clarity, the guru of transformers, and the architect of "itna asaan tha yeh." 🎩👏
May your tutorials continue to illuminate minds, one self-attention head at a time! 🌟🤓
With heartfelt gratitude,
Your Devoted Learner 🙏
This lecture is an absolute masterpiece.
The way you explain the topic is epic. Keep it up, and never stop, because the way you explain really helps in building intuition. I believe this is true not only for me but for others too, and it will definitely bring change. Thank you.🙏
Best Teacher ever ! 😊
One of the best explanations by Nitish sir. We are lucky to have teachers like you, sir.
I know enough about transformers to handle the stuff I needed, but I never understood self-attention in this much detail.... such a good video 🛐
Last videos were amazing! Have been waiting for the next one. The hard work you do and explain in such a depth ... Thank you ❤
WoW.. God bless you. Have never seen someone who could explain this process so well.
Sir, you are great ❤❤❤ glory to you!
Appreciate all your efforts sir❤
Outstanding explanation of such a concept!!!
Very nice way to make anyone understand what self-attention is. Your examples make it really simple and intuitive to grasp the core concepts. Thanks a lot :)
Thank you so much for the video! You teach so well that I wish your tutorial videos would never end! Please do not change this teaching style! Please continue uploading the other parts! We really appreciate the effort you put into making the tutorial!
Thank you sir 🙏 you are really great 👍
Great lecture 🎉
goosebumps ❤ waiting for more
Brother, it took me 2-3 years to understand this... finally, finally that eureka moment... thanks a lot, bro... keep going....
You made us wait so long, sir...❤❤
It was an amazing video; I learnt a lot from it. Keep up the good work, God bless you. Thank you.
I am deeply grateful for the incredible deep-learning video you shared with us. It brought me immense joy and excitement to watch, and I can't thank you enough for taking the time to create and share such valuable content. The clarity and depth of your explanations have greatly enriched my understanding and passion for this subject.
Best explanation till date
❤
Super mind-blowing class, sir; it's really heart-touching 😮
No words for this video. This content is unmatched, no one even comes close to this quality of content. Keep it up, Sir !!
Great explanation!
Lectures I can never get enough of. The best content I have discovered so far. Kudos to you, Nitish Sir. Life is easy when you are the mentor.♥♥♥
Thank You soo much !
Teaching at its peak.
Much-awaited video; thumbs up for your work.
Want to clear my concepts:
1. How are the probabilities for context assigned to a sentence? I mean, why 0.7 and not 0.6?
2. With dynamic context, can it treat "River Bank" as a single word, rather than "River" and "Bank" as separate words?
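On the second question, a small sketch can show what "dynamic context" means even when "bank" keeps one static embedding and is never merged into a single "River Bank" token (all vectors below are random illustrations, not values from the video):

```python
import numpy as np

rng = np.random.default_rng(7)
# Toy static embeddings; "bank" has a single fixed vector here
emb = {w: rng.normal(size=4) for w in ["money", "river", "bank"]}

def contextualize(words):
    X = np.stack([emb[w] for w in words])
    scores = X @ X.T                       # pairwise similarities
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = e / e.sum(axis=-1, keepdims=True)
    return weights @ X                     # blend neighbours into each word

bank_near_money = contextualize(["money", "bank"])[1]
bank_near_river = contextualize(["river", "bank"])[1]
# The static vector for "bank" is identical in both phrases, yet the
# contextual outputs differ because the surrounding words differ
```

So no database lookup of fused words is needed: the same word ends up with different output vectors depending on its neighbours.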
I really can't believe such a complex topic can be taught so easily.... Sir is not only a good data scientist but also an excellent teacher. Lots of respect for you, sir.
Brother, you are one of the greatest data science teachers.
thank you 🙏
Great, sir ji 👍
Sir, from the heart... You are a blessing to the community. I really respect your hard work and dedication. You have the aura to change someone's life journey and perspective.
Respects to you and your whole team.
Very good lecture; waiting for the other parts.. please release them as soon as possible.
Great experience ❤ from Pakistan
What a smooth learning experience!! I got a lot of value from this.... One suggestion: when I wanted to learn this, I searched for "transformer" and this video did not rank, so please see which keywords are most used and add them to the title... an easy approach is to look at the titles your competitors are using.
Hi Nitish, I am from Pakistan, studying in the US, and I must say that you are THE BEST teacher in the world. Thanks a lot, brother. God bless you.
Where are you? I'm in Chicago.
An amazing teacher who takes ownership of teaching his students. The world needs more teachers like you :)
This is indeed one of your best lectures and your teaching skills are exceptional sir. You make everything easy. Nitish Sir is all we need.🙏🙏
Thanks ❤
I am paying full Attention to your video. Very well made :)
A masterpiece of deep learning. We would love a playlist of full deep learning projects.
Love From Pakistan....
Hands down the most simple and comprehensible explanation.
probably the best video on self attention
Always amazing content. Are you going to release videos on time series?
Great as always!! It would be super amazing if you could code along and implement the same...
You are truly one of the most amazing teachers / data scientists.
You truly have an amazing grasp over the concepts
Your explanations on self-attention mechanism are truly remarkable. I came across your channel randomly when I started my data science journey in August 2023, and since then, I have been learning from your content. Your ability to articulate complex topics is exceptional, and your explanations are incredibly insightful. I greatly appreciate the depth of context you provide in your tutorials, which sets your content apart from others. Your dedication to helping learners understand difficult concepts is admirable. Much love and gratitude from Pakistan. You are a true gem in the field of data science education.
Whenever someone needs help with any topic in data science, I always recommend your channel for the best intuition and explanations.
Also one important thing: "Don't know why it took you 14 days, sir... the topic was so simple."
Thank you, sir. Super explanation. The greatest on this topic on the internet, I guess.
I have been trying to clarify the attention concept for the last few days, and I have to admit this is one of the most valuable and informative videos I have found on the topic on YouTube.
Loved the simple way and the explanations with which you taught this topic, sir.
It was the most simple and great explanation; it was better than Jay Alammar's blog and other YouTube videos... you should write a book, as there are none as simple as this.
One of the best content on Self attention so far.
God Level Teaching is here.
Hidden gem. Nitish sir, you have no idea how many students have benefited from your videos. Hats off; I would love to subscribe if there are any courses related to deep learning or NLP.
Sir, please keep uploading the further videos as early as possible.
What a wonderful teacher.
Goodness!!!!
Till now I have not seen such a good explanation of this...
I have no words to express my gratitude. Thanks a tonn.
Hats off for this type of explanation 😍😍😍
Absolutely loved the explanation…thank you very much for investing your time into creation of this masterpiece 🙏
you have explained whole concept like this is piece of cake
You are Great Nitish sir
Great Experience ❤ from Mars
Osm sir ❤❤❤❤❤
Dear Nitish, you explained a tough topic in a simple manner , i appreciated your effort and thanks for the same.
This is the best video i have ever watched 🎉🎉🎉
Why are people like you not in the universities? It’s a true gem. ❤
Maybe you are/were at the wrong university, then 😅
One of the best videos I have seen so far👍🏻
this will be the best video ever on transformers
You deserve a national category award... truly spellbound! 🙏🙏
It wasn't easy, but the explanation was so great that it finally made sense ♥♠
I've watched almost all the videos on this topic from channels like Serrano, StatQuest, and recently 3Blue1Brown. I found this to be the best explanation of all. But the videos from those other channels are also valuable; they provide a different perspective on the attention mechanism.
Very very nicely done! Thanks a lot. You have taken a complex concept and explained it very beautifully.
Thank you so much sir, this video is more interesting than any web-series/movie on Netflix. This is quality content!!
To be honest, I had studied the same thing with other videos, but only here did I get a proper understanding. Hats off, Nitish.
This lecture is a demonstration of how a great teacher can make a very complicated topic so easy. Thank you so much..!!!
Brother, this is amazing... the topic was so simple, how did it take you so much time?
Wow, wow, woooooow,
what a great teacher, what a great teaching style, hats off to you.
thank you.
Author example 💥💥💥💥. very informative video.
The best video on the self-attention mechanism that I've ever seen 💌💌💌💌💌
The best video I have seen on this topic to date.... Please upload more videos; we are following you, sir.
You made the self-attention model very simple for all of us to learn. Thank you, sir. Appreciate your work.
This is a gem of explanation. Awesome🤩
Your words regarding the demand for GANs are so true. My background is in mechanical engineering, and at this very stage I have seen 4-5 graduates working on both MS & PhD theses on GANs at one of the top-ranked universities in Korea.
You explain such a difficult concept so simply. Even after watching NPTEL and other sources, I had so many doubts. You have cleared them all in one video.
Great video, and what a great example at 52:24 👏👏
Really appreciate the efforts behind this video! Waiting for the next one 🙌
I am at the 2:29 timestamp and I am already sure that you have given us the best explanation. That's how much faith we have in you, sir.
This video is absolutely amazing. Thank you so much. The topic itself is so good and your explanation on top of it is crystal clear.❤
I knew you would make it simple.........but at this level....man, I wasn't expecting that.....terrific explanation, sir 🙇♂
Thanks for such an easy explanation, sir..!
Such a great explanation!!🤗
Awesome
I had a huge doubt about the Q, K, V part; now it is clarified. Please continue with the transformer architecture, especially the mathematical parts, e.g. positional embedding, multi-headed attention, etc. You are the beacon of education in data science.
Swift explanation sir! You made it super simple. Please continue the series
I'm literally spellbound after this explanation....I have watched hundreds of foreign YouTube channels for this attention mechanism....they are nowhere close to you.....Sir, you are really amazing....please, please, please continue this series....and teach up to LLMs, GPT, Llama 2, all that stuff 😊....I'm very, very eager to learn from you.
The hardest concept got the best example 😁. Thank you, sir, for explaining this beautiful concept to us.
By far the best and simplest explanation. Kudos to you. If you provide the audio in English, it will become even more popular. Thanks.