Self Attention in Transformers | Deep Learning | Simple Explanation with Code!

  • Published on Jun 11, 2024
  • Self Attention works by computing attention scores for each word in a sequence based on its relationship with every other word. These scores determine how much focus each word receives during processing, allowing the model to prioritize relevant information and capture complex dependencies across the sequence.
    ============================
    Do you want to learn from me?
    Check my affordable mentorship program at : learnwith.campusx.in/s/store
    ============================
    📱 Grow with us:
    CampusX's LinkedIn: / campusx-official
    CampusX on Instagram for daily tips: / campusx.official
    My LinkedIn: / nitish-singh-03412789
    Discord: / discord
    E-mail us at support@campusx.in
    ✨ Hashtags✨
    #SelfAttention #DeepLearning #CampusX #Transformers #NLP #GENAI
    ⌚Time Stamps⌚
    00:00 - Intro
    02:37 - Revision [What is Self Attention]
    07:00 - How does Self Attention work?
    24:45 - Parallel Operations
    29:40 - No Learning Parameters Involved
    39:10 - Progress Summarization
    50:15 - Query, Key & Value Vectors
    52:28 - A Relatable Example
    01:07:52 - How to build vectors based on Embedding vector
    01:20:08 - Summarized Matrix Attention
    01:22:45 - Outro
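The description above can be made concrete with a short, self-contained NumPy sketch of scaled dot-product self-attention. The matrix sizes, random weights, and variable names below are illustrative assumptions, not values from the video.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max before exponentiating, for numerical stability.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of embeddings X."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv        # query, key, value vectors per word
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)         # similarity of every word with every other word
    weights = softmax(scores, axis=-1)      # attention scores: each row sums to 1
    return weights @ V                      # one contextual vector per word

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                 # toy input: 4 words, 8-dim embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)                            # (4, 8)
```

Each row of `weights` sums to 1, so every word's output is a weighted mix of all the value vectors; this is what lets the model prioritize relevant words and capture dependencies across the whole sequence.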

Comments • 251

  • @ankitbhatia6736 · 4 months ago · +55

    This is a gem; I don't know if even the authors of "Attention Is All You Need" would be able to explain this concept with such fluidity. Hats off 😊.

    • @ajinkyamohite3836 · 3 months ago

      But sir, what will the word embedding values be initially?

    • @ShubhamSingh-iq5kj · 3 months ago

      @@ajinkyamohite3836 It can be calculated with word2vec or GloVe. Read about them and you will understand.

    • @Findout1882 · months ago

      @@ajinkyamohite3836 Random, initially.

    • @mehulsuthar7554 · 3 days ago

      @@ajinkyamohite3836 That will be any static embedding like GloVe, FastText, or your own trained embedding.
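As the replies above note, the initial embeddings are either random (and then learned during training) or copied from a static model such as word2vec, GloVe, or FastText. A minimal sketch of the lookup-table view, with a made-up toy vocabulary:

```python
import numpy as np

rng = np.random.default_rng(42)
vocab = {"the": 0, "river": 1, "bank": 2}   # toy vocabulary (illustrative)

# Option 1: random initialization; the values are then adjusted by training.
E = rng.normal(scale=0.02, size=(len(vocab), 4))   # vocab_size x embedding_dim

# Option 2 (conceptually): copy rows from a pretrained table such as
# word2vec, GloVe, or FastText instead of using random numbers.

sentence = ["the", "river", "bank"]
X = E[[vocab[w] for w in sentence]]         # embedding lookup: one row per word
print(X.shape)                              # (3, 4)
```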

  • @rohitdahiya6697 · 4 months ago · +26

    The self-attention mechanism, best explained by the best teacher in the world. This makes you unique among tutors around the world ❤

  • @d1pranjal · 7 days ago · +1

    This video is, hands down, THE BEST video on self attention in transformers. I have been following and learning from you for a long time. We are all truly blessed to have a mentor like you. 🙏

  • @_TheDataScienceGuy_ · 4 months ago · +60

    Don't know why it took you 14 days, sir... the topic was so simple.. ❤❤

    • @dnyaneshsable1512 · 2 months ago

      Arre bhai, it has many parts and he made one video out of them; that's why he said that. And anyway, when sir teaches, every topic feels easy.

    • @vidhikumar1664 · 2 months ago

      Bhai, have some patience; if it felt that easy to you, go learn it yourself. Let Nitish do the work. It is by the efforts of Nitish sir that I'm able to understand these topics so well.

    • @vidhikumar1664 · 2 months ago · +4

      Achha, I got your point XD. Thanks!
      I only saw it now, when sir said it 😂😂

  • @arri5812 · 3 months ago · +10

    🌟 **Nitish Singh, You're a Beacon of Clarity!** 🌟
    Dear Nitish Singh,
    I am utterly captivated by your enlightening video on self-attention in transformers! Your ability to distill complex concepts into digestible nuggets of wisdom is nothing short of magical. 🪄✨
    As I watched your tutorial, I felt like I was sipping from the fountain of knowledge itself. Your lucid explanations flowed seamlessly, unraveling the intricacies of self-attention with grace. 📚🔍
    The way you effortlessly wove together the threads of mathematics, intuition, and practical application left me in awe. Itna asaan tha yeh ("it was that easy")! 🤯🙌
    Nitish, you're not just an educator; you're a sorcerer who conjures understanding out of thin air. Your passion for teaching radiates through the screen, igniting curiosity in every viewer. 🌟🔥
    Thank you, from the depths of my neural network, for sharing your brilliance. You've made self-attention feel like a warm hug from an old friend. 🤗🧠
    So here's to you, Nitish Singh, the maestro of clarity, the guru of transformers, and the architect of "itna asaan tha yeh." 🎩👏
    May your tutorials continue to illuminate minds, one self-attention head at a time! 🌟🤓
    With heartfelt gratitude,
    Your Devoted Learner 🙏

  • @saipoojalakkoju6009 · 4 months ago · +7

    This lecture is an absolute masterpiece.

  • @nikhileshnarkhede7784 · 8 days ago · +1

    The way you explain the topic is epic. Keep it up, and never stop, because the way you explain really helps in building intuition. I believe that holds not only for me but for others too, and it will definitely bring change. Thank you. 🙏

  • @sangram7153 · 4 months ago · +4

    Best Teacher ever ! 😊

  • @ravikanur · 5 days ago

    One of the best explanations by Nitish sir. We are lucky to have teachers like you, sir.

  • @T3NS0R · 3 months ago · +2

    I know enough about transformers to handle the stuff I needed, but I never understood self attention in this much detail... such a good video 🛐

  • @ariousvinx · 4 months ago · +3

    The last videos were amazing! I have been waiting for the next one. The hard work you put in, and the depth in which you explain... Thank you ❤

  • @cool12345687 · 4 months ago · +1

    Wow... God bless you. I have never seen someone who could explain this process so well.

  • @harsh_sharma260 · 4 months ago · +2

    Sir ji, you are great ❤❤❤ glory to you!

  • @sagarbhagwani7193 · 4 months ago · +2

    Appreciate all your efforts sir❤

  • @nileshgupta543 · 4 months ago · +2

    Outstanding explanation of such a concept!!!

  • @somdubey5436 · 4 months ago · +1

    A very nice way to make anyone understand what self-attention is. Your examples make the core concepts really simple and intuitive to grasp. Thanks a lot :)

  • @NabidAlam360 · 4 months ago · +1

    Thank you so much for the video! You teach so well that I feel your tutorial videos never end! Please do not change this teaching style, and please continue uploading the other parts! We really appreciate the effort you put into making the tutorial!

  • @shreeganeshak6346 · 4 months ago

    Thank you sir 🙏 you are really great 👍

  • @mallemoinagurudarpanyadav4937 · 4 months ago · +1

    Great lecture 🎉

  • @teksinghayer5469 · 4 months ago

    goosebumps ❤ waiting for more

  • @user-jb5vm2uo4u · 3 months ago · +2

    Bhai, it took 2-3 years to understand... finally, finally, that eureka moment... thanks a lot bro... keep going....

  • @UPSCTHINKER · 4 months ago

    You made us wait so long, sir... ❤❤

  • @rahulgaikwad5058 · 4 months ago · +4

    It was an amazing video; I learnt a lot from it. Keep up the good work, and God bless you. Thank you.

  • @khushipatel2574 · 4 months ago · +2

    I am deeply grateful for the incredible deep-learning video you shared with us. It brought me immense joy and excitement to watch, and I can't thank you enough for taking the time to create and share such valuable content. The clarity and depth of your explanations have greatly enriched my understanding and passion for this subject.

  • @himanshukale9684 · 4 months ago

    Best explanation till date

  • @bimalpatrapatra7742 · 4 months ago

    Super mind-blowing class, sir; it's really heart-touching 😮

  • @shivombhargava2166 · 4 months ago · +1

    No words for this video. This content is unmatched, no one even comes close to this quality of content. Keep it up, Sir !!

  • @EMWave · 4 months ago

    Great explanation!

  • @mimansamaheshwari4664 · months ago

    Lectures I can never get enough of. The best content I have discovered so far. Kudos to you, Nitish Sir. Life is easy when you are the mentor. ♥♥♥

  • @kalyanikadu8996 · 4 months ago

    Thank you so much!

  • @fact_ful · 4 months ago · +1

    Teaching at its peak.

    • @izainonline · 4 months ago

      Much-awaited video; thumbs up for your work.
      I want to clear up my concepts:
      1. How is the probability over a sentence's context assigned, i.e. why 0.7 and not 0.6?
      2. With dynamic context, can it detect that "River Bank" is used as a single phrase rather than "River" and "Bank" as separate words?
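On the first question above: the 0.7-vs-0.6 style weights are not assigned by hand; they fall out of a softmax over learned similarity scores, so they change whenever the scores change. A tiny illustration with made-up scores:

```python
import numpy as np

def softmax(x):
    # Stable softmax: shift by the max, exponentiate, normalize to sum to 1.
    e = np.exp(x - x.max())
    return e / e.sum()

# Raw similarity scores of one query word against 3 words (illustrative numbers).
scores = np.array([2.0, 1.2, 0.1])
weights = softmax(scores)
print(weights.round(2))   # → [0.63 0.28 0.09]
```

Nudge any score and every weight shifts, but the weights always sum to 1; that is the only constraint, and the exact values (0.7, 0.6, ...) are whatever the learned scores produce.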

  • @harshitasingh902 · 3 months ago

    I really can't believe such a complex topic can be taught with such ease... sir is not only a good data scientist but also an excellent teacher. Lots of respect for you, sir.

  • @DeyozRayamajhi · months ago · +1

    Bhaisaab you are one of the greatest data science teachers.

  • @suryasiripurapu4954 · 4 months ago

    thank you 🙏

  • @MrSat001 · 4 months ago

    Great, sir ji 👍

  • @subhankarghosh1233 · 2 months ago

    Sir, from the heart... you are a blessing to the community. I really respect your hard work and dedication. You have the aura to change someone's life journey and perspective.
    Pranam to you and your whole team.

  • @akshitnaranje7426 · 4 months ago

    Very good lecture; waiting for the other parts. Please release them as soon as possible.

  • @saifshamil4690 · 4 months ago · +5

    Great experience ❤ from Pakistan

  • @UnbeaX · months ago · +2

    What a smooth learning experience!! I got a lot of value from this... one suggestion: when I wanted to learn this, I searched for "transformer", and this video did not rank for it. So, sir, a suggestion: see which keywords are most used and add them to the title. An easy approach is to look at the titles your competitors are using.

  • @SipahEScience · 4 months ago

    Hi Nitish, I am from Pakistan and studying in the US, and I have to say that you are THE BEST teacher in the world. Thanks a lot, brother. God bless you.

    • @ankanmazumdar5000 · 4 months ago

      Where are you? I'm in Chicago.

  • @pavangoyal6840 · 3 months ago

    You are an amazing teacher who takes ownership of teaching his students. The world needs more teachers like you :)

  • @mohitjoshi8818 · 2 months ago

    This is indeed one of your best lectures, and your teaching skills are exceptional, sir. You make everything easy. Nitish sir is all we need. 🙏🙏

  • @not_amanullah · 4 months ago

    Thanks ❤

  • @sagemaker · 4 months ago · +1

    I am paying full Attention to your video. Very well made :)

  • @syedhassan158 · months ago

    The masterpiece of deep learning. We would love a playlist of full deep learning projects.
    Love from Pakistan....

  • @haz5248 · 4 months ago

    Hands down the simplest and most comprehensible explanation.

  • @arijitgoswami7388 · 2 months ago

    Probably the best video on self attention.

  • @bilal-khan · 4 months ago

    Always amazing content. Are you going to release videos on time series?

  • @rog0079 · 4 months ago

    Great as always!! It would be super amazing if you could code along and implement the same...

  • @darshanv1748 · 3 months ago

    You are truly one of the most amazing teachers / data scientists.
    You truly have an amazing grasp of the concepts.

  • @mentalgaming2739 · 3 months ago

    Your explanations on self-attention mechanism are truly remarkable. I came across your channel randomly when I started my data science journey in August 2023, and since then, I have been learning from your content. Your ability to articulate complex topics is exceptional, and your explanations are incredibly insightful. I greatly appreciate the depth of context you provide in your tutorials, which sets your content apart from others. Your dedication to helping learners understand difficult concepts is admirable. Much love and gratitude from Pakistan. You are a true gem in the field of data science education.
    Whenever someone needs help with any topic in data science, I always recommend your channel for the best intuition and explanations.
    Also, one important thing: "pata nahi sir apko 14 din kyun lage... Topic to bht simple tha" ("don't know why it took you 14 days, sir... the topic was so simple").

  • @ParthivShah · months ago · +1

    Thank you, sir. Super explanation. The greatest on this topic on the internet, I guess.

  • @oo_wais · 2 months ago

    I have been trying to clarify the attention concept for the last few days, and I have to admit this is one of the most valuable and informative videos I have found on the topic on YouTube.

  • @AyushGupta-bd2mx · 7 days ago

    Loved the simple way and the explanations with which you taught this topic, sir.

  • @adityanagodra · 4 months ago

    It was the simplest and greatest explanation, better than Jay Alammar's blog and other YouTube videos... you should write a book, as there are none as simple as this.

  • @ankitkumar-iitbombay186 · 3 months ago

    Some of the best content on self attention so far.

  • @ParthivShah · months ago · +1

    God-level teaching is here.

  • @nagendraharish6026 · 3 months ago

    Hidden gem. Nitish sir, you have no idea how many students have benefited from your videos. Hats off; I would love to subscribe if there are any courses related to deep learning or NLP.

  • @kayyalapavankumar2711 · 4 months ago · +1

    Sir, please keep uploading the further videos as early as possible.

  • @surajbherwani5488 · 3 months ago

    What a wonderful teacher.

  • @swarnpriyaswarn · 2 months ago

    Goodness!!!!
    Till now I have not seen such a good explanation of this...
    I have no words to express my gratitude. Thanks a ton.

  • @rutvikkapuriya2033 · 3 months ago · +1

    Hats off for this kind of explanation 😍😍😍

  • @sreekanthbezzanki528 · 15 days ago

    Absolutely loved the explanation... thank you very much for investing your time in the creation of this masterpiece 🙏

  • @rohitshere5328 · months ago

    You have explained the whole concept as if it were a piece of cake.
    You are great, Nitish sir.

  • @Akuma7499 · 4 months ago

    Great experience ❤ from Mars

  • @surajhulke6196 · 4 months ago

    Awesome, sir ❤❤❤❤❤

  • @pratapsurwase7428 · 2 months ago

    Dear Nitish, you explained a tough topic in a simple manner. I appreciate your effort, and thanks for the same.

  • @samirchauhan6219 · 4 months ago

    This is the best video I have ever watched 🎉🎉🎉

  • @koushik7604 · months ago

    Why are people like you not in the universities? It’s a true gem. ❤

    • @NikhilKumar-yu3zh · months ago · +1

      Maybe you are/were in the wrong university then 😅

  • @aditiseetha1 · months ago

    One of the best videos I have seen so far👍🏻

  • @priyanshushukla1295 · 14 days ago

    This might be the best video ever on transformers.

  • @rajsuds · 25 days ago

    You deserve a national category award... truly spellbound! 🙏🙏

  • @jangiramitkumar439 · 4 months ago · +1

    It wasn't easy, but the explanation was so great that it made sense ♥♠

  • @arpitsinghal07 · months ago

    I've watched almost all the videos on this topic from channels like Serrano, StatQuest, and recently 3Blue1Brown. I found this to be the best explanation of all. But the videos from those channels are also valuable; they provide a different perspective on the attention mechanism.

  • @epkomarla1943 · 2 months ago

    Very very nicely done! Thanks a lot. You have taken a complex concept and explained it very beautifully.

  • @nileshpandey7776 · 29 days ago

    Thank you so much, sir; this video is more interesting than any web series or movie on Netflix. This is quality content!!

  • @ankitgupta1806 · 2 months ago

    To be honest, I have studied the same topic through other videos, but only here did I get a proper understanding. Hats off, Nitish.

  • @vimalshrivastava6586 · 4 months ago

    This lecture is a demonstration of how a great teacher can make a very complicated topic so easy. Thank you so much..!!!

  • @Findout1882 · months ago · +1

    Bhai, it's amazing, yaar... the topic was so simple, so how did it take you so long?

  • @TalibYoutube · 3 months ago

    Wow, wow, woooow!
    What a great teacher, what a great teaching style. Hats off to you.
    Thank you.

  • @himanshujoshi5908 · months ago

    The author example 💥💥💥💥. Very informative video.

  • @AlAmin-xy5ff · 4 months ago

    The best video on the self-attention mechanism that I've ever seen 💌💌💌💌💌

  • @akshayshelke5779 · 3 months ago

    The best video I have seen on this topic to date... Please upload more videos; we are following you, sir.

  • @_we_r_ · 3 months ago

    You made the self attention model very simple for all of us to learn. Thank you, sir. Appreciate your work.

  • @bhavikdudhrejiya852 · 3 months ago

    This is a gem of an explanation. Awesome 🤩

  • @ali75988 · 3 months ago

    Your words regarding the demand for GANs are so true. My background is in mechanical engineering, and even at this stage I have seen 4-5 graduate students working on MS and PhD theses on GANs at one of the top-ranked universities in Korea.

  • @akashdutta6211 · 4 months ago

    You explain such a difficult concept so simply. Even after watching NPTEL and other sources, I had so many doubts; you cleared them all in one video.

  • @user-hv2uu8vp9m · 9 days ago

    Great video, and what a great example at 52:24 👏👏

  • @tishaghevariya · 3 months ago

    Really appreciate the efforts behind this video! Waiting for the next one 🙌

  • @AbhijeetKumar-cj4fe · 3 months ago

    I am at the 2:29 timestamp and I am pretty sure that you have given us the best explanation. That's how much faith we have in you, sir.

  • @TehreemAyesha · 20 days ago

    This video is absolutely amazing. Thank you so much. The topic itself is so good and your explanation on top of it is crystal clear.❤

  • @chandank5266 · 4 months ago

    I knew you would make it simple... but at this level... man, I wasn't expecting that... what an incredible explanation, sir 🙇‍♂

  • @kaivalyakulkarni4250 · 4 months ago

    Thanks for such an easy explanation, sir..!

  • @ruchikakumar4283 · 25 days ago

    Such a great explanation!!🤗

  • @velloer · 4 months ago

    Awesome

  • @aienthu2071 · 2 months ago

    I had a huge doubt about the Q, K, V part; now it is clarified. Please continue with the transformer architecture, especially the mathematical parts, e.g. positional embedding, multi-head attention, etc. You are the beacon of education in data science.
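For readers who, like the comment above, want a head start on the positional-embedding part: one common formulation is the sinusoidal positional encoding, sketched below with illustrative sizes. This is an assumption about where the series goes next, not content from the video.

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding: sin on even dims, cos on odd dims."""
    pos = np.arange(seq_len)[:, None]        # positions 0..seq_len-1 as a column
    i = np.arange(d_model)[None, :]          # embedding-dimension indices as a row
    angles = pos / np.power(10000, (2 * (i // 2)) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])    # even dimensions use sine
    pe[:, 1::2] = np.cos(angles[:, 1::2])    # odd dimensions use cosine
    return pe

pe = positional_encoding(10, 16)             # toy sizes: 10 positions, 16 dims
print(pe.shape)                              # (10, 16)
```

These vectors are simply added to the word embeddings so the attention layers, which are otherwise order-blind, can tell positions apart.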

  • @ansariarbaz3374 · 3 months ago

    Swift explanation, sir! You made it super simple. Please continue the series.

  • @basabsaha6370 · 2 months ago

    I'm literally spellbound after this explanation... I have watched hundreds of foreign YouTube channels for this attention mechanism... they are nowhere close to you... Sir, you are really amazing... please, please continue this series and teach up to LLMs, GPT, Llama 2, all of it 😊... I'm very eager to learn from you.

  • @abhijitrai1349 · 3 months ago

    The hardest concept got the best example 😁. Thank you, sir, for explaining this beautiful concept to us.

  • @amitabhgupta9475 · 4 months ago

    By far the best and simplest explanation. Kudos to you. If you provide the audio in English, it will become even more popular. Thanks.