Attention Is All You Need

  • Published Sep 26, 2024

Comments • 301

  • @finlayl2505
    @finlayl2505 3 years ago +346

    Friendship ended with LSTM, transformer is now my best friend.

    • @EvgenSuit
      @EvgenSuit 2 years ago +1

      As far as I know, there are only a small number of transformers for audio problems

    • @electric_mind
      @electric_mind 2 years ago +4

      LSTMs generally perform better when it comes to short sequences, and remember that the LSTM is the revolution that led to the birth of the Transformer. I love both of them!

    • @klam77
      @klam77 1 year ago +1

      LSTM sequentialization is kludged inside transformers. Pay attention

    • @jamesbedwell4715
      @jamesbedwell4715 1 year ago

      Same but with GRU

    • @st0a
      @st0a 1 year ago +1

      Friendship ended with Transformers, Retentive Networks are now my best friend.

  • @tanmayjain6791
    @tanmayjain6791 6 months ago +83

    Nobody knew this paper would change the world

    • @blitz1867
      @blitz1867 4 months ago

      Did it?

    • @yanniammari1491
      @yanniammari1491 4 months ago +21

      @@blitz1867 Hell yeah it did. It's cited 124k times, so it definitely did.

    • @JohnDoe-pq8yw
      @JohnDoe-pq8yw 4 months ago +3

      @@yanniammari1491 And we're just getting started.

    • @AmanSingh-xk2lv
      @AmanSingh-xk2lv 3 months ago +1

      @@blitz1867 yes, definitely.

    • @MM-by6qq
      @MM-by6qq 2 months ago

      very true

  • @RobotProctor
    @RobotProctor 3 years ago +283

    I've watched this maybe 5 times over 1 year, each time getting more and more from it. I think I finally intuitively understand how this works. Thanks for your work and your time!

    • @niedas3426
      @niedas3426 1 year ago +22

      This has been my experience with ML in general: I have to re-read papers and books over and over again, and each time I understand more. It's hard, but it pays off to finally grasp such an almost mystical concept.

    • @StoutProper
      @StoutProper 1 year ago +4

      It’s a little bit more complicated than just predicting the next word based on the last, which is the take a lot of people have on it.

    • @electric_sand
      @electric_sand 1 year ago

      @@niedas3426 How's it going...honestly this is how I feel sometimes, having to go through multiple videos and blogposts just to grasp concepts.

    • @niedas3426
      @niedas3426 1 year ago +3

      @@electric_sand Honestly, still making steady progress. I am now at a place where I am much, much further. I've mainly been preoccupied with datasets (e.g. reducing file storage size, faster reading and calculations, pytorch iterdatapipes) and realised it'd help me to go back more to the fundamentals (linear algebra, calculus, probability, pandas, numpy, data structures, builtin methods etc). It's been fun, overall. :)

    • @electric_sand
      @electric_sand 1 year ago

      @@niedas3426 Thanks for your response. Same here, I decided to go back to the fundamentals as well...I simply got tired of struggling through papers. Wish you the best mate.

  • @TimKaseyMythHealer
    @TimKaseyMythHealer 1 year ago +13

    Finally, someone is drawing vectors to describe what is meant by encoding with vectors, and how the vectors relate to one another. So many talk about this, but barely understand the details.

  • @kema8628
    @kema8628 5 years ago +59

    The explanation of querying a key-value pair is really nice

    • @gorgolyt
      @gorgolyt 3 years ago +5

      I recommend looking at the paper, because they use exactly this analogy. I found their description very helpful:
      "An attention function can be described as mapping a query and a set of key-value pairs to an output,
      where the query, keys, values, and output are all vectors. The output is computed as a weighted sum
      of the values, where the weight assigned to each value is computed by a compatibility function of the
      query with the corresponding key."
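
      The paper's weighted-sum description maps almost line-for-line onto code. A minimal NumPy sketch of that attention function (my own illustration; the shapes and names here are hypothetical, not from the paper):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Compatibility of each query with each key (scaled dot product),
    # then the output is a weighted sum of the value vectors.
    d_k = K.shape[-1]
    weights = softmax(Q @ K.T / np.sqrt(d_k))
    return weights @ V

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))   # 4 query vectors of dimension 8
K = rng.normal(size=(6, 8))   # 6 key vectors of the same dimension
V = rng.normal(size=(6, 16))  # 6 value vectors of dimension 16
out = attention(Q, K, V)      # one output vector per query: shape (4, 16)
```

      Note how the queries, keys, values, and outputs are all just vectors, exactly as the quoted passage says.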

  • @herp_derpingson
    @herp_derpingson 5 years ago +128

    I was searching for a channel like "Two minute papers" but not two mins in length and goes in depth. I think I found it!
    Subbed!

  • @jugsma6676
    @jugsma6676 6 years ago +317

    By far the best explanation of the paper "Attention Is All You Need". Well explained. Thanks, Yannic Kilcher!

    • @jugsma6676
      @jugsma6676 4 years ago +5

      @bunch of nerds, BERT is also a transformer, but bidirectional (it moves forward and backward over the input sentence).
      And GPT is a generative (autoregressive) version of the Transformer.
      Both are language models able to predict and understand the input sentence(s).

    • @jugsma6676
      @jugsma6676 4 years ago

      @bunch of nerds, it's much better to use inbuilt, stable tools than to build from scratch.

    • @clray123
      @clray123 3 years ago +12

      It's still like listening to a bad student struggling to explain something they don't themselves understand. And shame on the Google researchers for doing such a shit job explaining themselves, but I guess that's typical of the majority of research papers out there - these people just don't care to teach their ideas to others (except maybe a very narrow circle with whom they have already communicated via other channels).

    • @rednas195
      @rednas195 1 year ago +2

      @@clray123 What parts do you think are explained poorly? To me it feels like Yannic understands the paper quite well, but I'm interested in what you think he might not understand all too well.

  • @dariodemattiesreyes3788
    @dariodemattiesreyes3788 4 years ago +90

    Really good explanation. You know how to convey the essence without getting lost in details. Details might be important later, but the most important thing at first is the very nature of the strategy, and you presented it crystal clear. Thanks!!!

  • @prashanthkurella4500
    @prashanthkurella4500 1 year ago +9

    Who knew this paper would change how we look at sequences forever

  • @deleteme924
    @deleteme924 6 years ago +244

    You have the best videos about machine learning I've seen, comparable only perhaps to 3blue1brown, though his videos don't cover topics this advanced. It would be really nice if you could make more!

    • @snippletrap
      @snippletrap 4 years ago +8

      Arxiv Insights and Henry AI are pretty good too

    • @ambujmittal6824
      @ambujmittal6824 4 years ago +8

      @@snippletrap Arxiv Insights sadly stopped posting a long time back, and I personally find Henry AI's discussions superficial. Try Chris McCormik and the reading groups by Rachael, though. :)

    • @peterhojnos6705
      @peterhojnos6705 3 years ago +5

      Who do you mean by “Rachael reading groups”?

    • @48956l
      @48956l 2 years ago +3

      Definitely both great channels but the comparison doesn't do justice to just how good 3b1b's animations are. Here Yannic writes on a tablet lol. Not really comparable.

  • @owenmarkley446
    @owenmarkley446 9 months ago +6

    This is by far the best explanation I've seen of this paper. I'm writing a review of this paper for a class and wouldn't have been able to do it without your video! Immensely grateful!

  • @languagemodeler
    @languagemodeler 1 year ago +7

    It's amazing to have this explanation of the paper that is responsible for all of the AI interest and innovation happening now, described as 'interesting' shortly after it came out. I love it.

  • @akhilvenkataraju7791
    @akhilvenkataraju7791 4 years ago +17

    Thank you so much Yannic Kilcher, the paper seemed complex but you "encoded", performed "multi-head attention" and "decoded" it in such a simple way (: An amazing job! Undoubtedly the best explanation

  • @PaulFidika
    @PaulFidika 10 months ago +7

    I'm watching the history of AGI being built right here

  • @TijsZwinkels
    @TijsZwinkels 1 year ago +1

    Yeah, I'm late to the party, but I'd say this video is still very relevant. I've read the paper several times and gone through multiple blog posts and videos, but the Q, K, V mechanism never really clicked until watching this. Using dot products between Q and K as a lookup mechanism. Ingenious! Thanks for this video!

  • @rommeltito123
    @rommeltito123 3 years ago +2

    Good that you were interrupted at 17:15. I had to strain my ears and go full volume to hear you. After that it was better.

  • @spinner4
    @spinner4 1 year ago +1

    Why could there not be such a YouTube explanation from the authors of the paper? It would be very helpful for humanity right now. But this is quite helpful.

  • @YtongT
    @YtongT 4 years ago +7

    An amazing explanation. Truly amazing. I can't say how much I appreciate you putting the dot product and softmax into intuitive, easy-to-understand words. Very grateful.

  • @lleger
    @lleger 2 months ago

    I can't believe I just learned the intuition behind softmax. Yannic, your videos are pure gold. I hope life is treating you well!

  • @magnuswiklander8204
    @magnuswiklander8204 2 years ago +2

    Fun to see this today after all the recent successful transformer results! (June 2022) Thanks Yannic, keep it up!!

  • @carlkenner4581
    @carlkenner4581 1 year ago +6

    This will never catch on. (Kidding)

  • @BrettHannigan
    @BrettHannigan 1 year ago +3

    Excellent explanation of Transformers. Clear, easy to follow, and great information. Thanks!

  • @Luxcium
    @Luxcium 6 months ago +1

    This sounds like someone reading a paper without realizing it would become the third-largest thing to happen to humanity, after a pandemic and, from my own perspective, an invasive war in Europe: the spark of AI that came with ChatGPT and the expansion of generative imaging like Stable Diffusion and Midjourney 😅🎉 I would love to know how many subscribers you got from back then to just before ChatGPT, and from ChatGPT up to nowadays 😅 You are such an amazing communicator ❤

  • @RealStonedApe
    @RealStonedApe 8 months ago +1

    Love how 'All you need is attention' also applies to me in terms of understanding this video. Time to chug down some Adderall and take notes!!
    Also, probably not a good start when I have no idea what a vector even is... Take it in bit by bit.
    Robert Pirsig reading Thoreau style, ya dig?!
    Anyways, plz pray for me. Any God, Joseph Smith, Allah, Bill Murray, etc. And he will do. Pray for my attention, pray for my soul, pray for Bill Murray. Love❤🎉

  • @shandou5276
    @shandou5276 5 years ago +3

    Very well done! I agree with the other comments that this is the clearest explanation I have seen so far. Thanks for the great work!

  • @tassoskat8623
    @tassoskat8623 3 years ago +2

    Great video and very unique amongst most machine learning videos on youtube.
    Thank you!

  • @vijeta268
    @vijeta268 4 years ago +2

    You have done an excellent job in explaining attention method in simple words. Thanks so much!

  • @WaylonFlinn
    @WaylonFlinn 2 years ago

    You need to remake this video. You've gotten so much better at doing this since you made this video and this topic is so foundational.

  • @bpolat
    @bpolat 4 months ago +2

    This paper changed the world. The AI revolution started after this paper.

  • @revanthvejju3727
    @revanthvejju3727 1 year ago +3

    the paper that changed everything

  • @patpearce8221
    @patpearce8221 1 year ago

    1. The words are converted into vector embeddings, then positionally encoded using the sine and cosine functions.
    2. This vector is copied, with one copy passed through the multi-head attention layer to be contextualised by projecting each word into a key, query and value, which are learned. There are separate key, query and value vectors for each word.
    3. The query is matrix-multiplied with the key after both pass through a linear layer, divided by the square root of the dimensionality of the key, passed through a softmax function, and then matrix-multiplied with the value.
    4. It is then added to the other copy of the positionally encoded vector.
    5. It is then normalised.
    6. It then passes into the FFNN, which has three layers.
    7. In the dense layers it is matrix-multiplied with the weights and then has a bias added, both of which are learnt.
    8. It has the ReLU function applied to each word.
    9. It is added to the residual (the vector before it passed through the FFNN).
    10. It is then normalised again...
    Then voilà... what am I missing here?
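
    The steps in that list can be sketched as a single encoder layer in NumPy. This is only a rough single-head illustration under my own assumptions (hypothetical parameter names; multi-head splitting, learned layer-norm gains/biases, and dropout are omitted):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(-1, keepdims=True))
    return e / e.sum(-1, keepdims=True)

def layer_norm(x, eps=1e-5):
    # Normalise each position's vector to zero mean, unit variance
    return (x - x.mean(-1, keepdims=True)) / (x.std(-1, keepdims=True) + eps)

def encoder_layer(x, Wq, Wk, Wv, W1, b1, W2, b2):
    # Steps 2-3: project to queries/keys/values, scaled dot-product attention
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    attn = softmax(Q @ K.T / np.sqrt(K.shape[-1])) @ V
    # Steps 4-5: residual add, then normalise
    x = layer_norm(x + attn)
    # Steps 6-8: position-wise feed-forward with ReLU
    ff = np.maximum(x @ W1 + b1, 0.0) @ W2 + b2
    # Steps 9-10: residual add, then normalise again
    return layer_norm(x + ff)

rng = np.random.default_rng(0)
d, d_ff, n = 8, 32, 5  # model dim, hidden dim, sequence length
params = [rng.normal(scale=0.1, size=s) for s in
          [(d, d), (d, d), (d, d), (d, d_ff), (d_ff,), (d_ff, d), (d,)]]
y = encoder_layer(rng.normal(size=(n, d)), *params)  # shape (n, d)
```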

  • @fahds2583
    @fahds2583 4 years ago +1

    You have such a cool state of mind... it really adds to making your teaching style more interesting

  • @astrobearmusic1977
    @astrobearmusic1977 3 years ago

    I had to revisit this video several times, but I think transformers finally clicked for me. Thank you!

  • @deaths1l3nce
    @deaths1l3nce 4 years ago +5

    Thank you very much! This has helped me a lot. All I could find on this specific paper was confusing and hard to understand, I think it was explained extremely well in your video! Please make more of these, I think you might help lots of people :D

  • @renehaas7866
    @renehaas7866 4 years ago +1

    I really appreciate that you are making these videos.

  • @RobotProctor
    @RobotProctor 3 years ago +4

    Question: if there is a finite max length of an input/output sequence, why do you need a positional encoding? Wouldn't the network have a static place for the 1st word, 2nd word, ..., nth word in its inputs? I'm struggling to understand the need for the positional encoder.

    • @RobotProctor
      @RobotProctor 3 years ago +5

      Never mind, I think I understand. Since the multi-head attention mechanism uses the same scaled dot-product computation for each word in the sequence (and not different parts of a NN, for example), the positional encoding is necessary in order to get different answers for the same word at different locations in the sequence.

    • @YannicKilcher
      @YannicKilcher 3 years ago +2

      you got it :)
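
      The sinusoidal encoding discussed in this thread is short to write down. A minimal sketch of the paper's formula (my own illustration) that shows how identical word vectors at different positions become distinguishable:

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    # PE[pos, 2i]   = sin(pos / 10000^(2i/d_model))
    # PE[pos, 2i+1] = cos(pos / 10000^(2i/d_model))
    pos = np.arange(seq_len)[:, None]
    i = np.arange(0, d_model, 2)[None, :]
    angles = pos / np.power(10000.0, i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

pe = positional_encoding(seq_len=50, d_model=16)
# The same word embedding added to pe[3] vs pe[7] yields different
# inputs, so the shared attention computation can tell positions apart.
```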

  • @bayesianlee6447
    @bayesianlee6447 9 months ago

    Seeing this 6 years after "Attention Is All You Need", the growth has just been crazily rapid.

  • @ostensibly531
    @ostensibly531 1 year ago +1

    Love the relaxing voice. Way better than reading the paper myself. Now I can be on the elliptical and still ingest the gist of papers. Thank you for making this!

  • @Julian-tf8nj
    @Julian-tf8nj 5 years ago +9

    VERY helpful, thanks! I'd love to see a "part 2" ...

  • @aidangomez6004
    @aidangomez6004 3 years ago +3

    This is a great summary, thanks for making this!!

  • @sophiaxia3240
    @sophiaxia3240 3 years ago

    by far the most intuitive explanation. Thanks!

  • @arslanali900
    @arslanali900 6 months ago +1

    You are amazing!

  • @thearianrobben
    @thearianrobben 4 years ago +3

    So: a representation in one natural language, into the universal language of math, into another natural language.

  • @goatpepperherbaltea7895
    @goatpepperherbaltea7895 1 year ago +1

    5 years later, I'm out here like damn

  • @Don-gk9ss
    @Don-gk9ss 5 years ago +2

    the best transformer video I have watched. Well explained

  • @nchahine
    @nchahine 3 years ago

    I always thought about starting a YouTube channel like this, but I guess I don't need to because you are so good at this. Thanks!

  • @MuslimFriend2023
    @MuslimFriend2023 6 months ago

    Man. I just found your channel. All the best insh'Allah

  • @alexandrostsagkaropoulos
    @alexandrostsagkaropoulos 1 year ago

    Just exceptional explanation. You clear things up so much!

  • @jackiekuen
    @jackiekuen 1 month ago

    Thank you sir, very clear explanation

  • @jsphyan
    @jsphyan 1 year ago

    This is beautiful, I really appreciate your work! Thank you

  • @goelnikhils
    @goelnikhils 1 year ago

    Such a clear explanation of Attention. I was struggling to understand attention and I would have watched over 20 videos for this but got no clarity.

  • @starlite5097
    @starlite5097 1 year ago

    Thanks, nice video. You've come a long way since then, I'm sure, especially with the open assistant stuff.

  • @julinamaharjan6987
    @julinamaharjan6987 4 years ago +1

    Very intuitive explanation. Thank you!

  • @chandlerclement1365
    @chandlerclement1365 6 years ago +2

    Excellent video, thank you so much for illustrating these concepts so clearly.

  • @qidichen1756
    @qidichen1756 4 years ago +1

    One of the best explanations!!!

  • @MrChristian331
    @MrChristian331 3 years ago +2

    What does "Add & Norm" mean at each step of the network in the architecture?

  • @ziku8910
    @ziku8910 1 year ago

    Very intuitive explanation here, thank you!

  • @anisakhlyan8581
    @anisakhlyan8581 4 years ago +2

    Thank you! This is a very good explanation which I actually used in presenting this paper. Cheers man!

  • @teddy5474
    @teddy5474 2 years ago

    Best explanation I've seen on this topic!

  • @fisherh9111
    @fisherh9111 1 year ago

    this is excellent. Thank you so much for sharing!

  • @sebastianreyes8025
    @sebastianreyes8025 2 years ago

    Pro tip: print out this paper and take notes on it as you try to follow along with this; also read on your own and note the questions you have.

  • @hamedgholami261
    @hamedgholami261 2 years ago

    I really understood the subject, thanks for your clear explanation.

  • @tobiaszb
    @tobiaszb 2 years ago

    Thank You for the overview.
    (Let's take care of human attention, train its span.)

  • @faysoufox
    @faysoufox 4 years ago

    Nice explanation. Note that by the time somebody gets to the part where you explain dot products, he or she would likely already know what a dot product is.

  • @olegshpynov
    @olegshpynov 4 years ago +1

    Great explanation of the transformer model. Thanks a lot!

  • @VinBhaskara_
    @VinBhaskara_ 6 years ago +1

    Great explanation. Please keep posting such summaries of great papers, thanks!

  • @ankitbhardwaj1956
    @ankitbhardwaj1956 4 years ago +1

    Thanks a lot for this explanation video!!

  • @AlimamiHD
    @AlimamiHD 3 months ago +1

    Bro reminded me of that one dude who posted a video in 2011 begging people to buy 1$ of bitcoin

  • @alfred17686
    @alfred17686 5 years ago +7

    This was such a good explanation. I've been trying to really understand these, but until now I haven't found a good resource. Cheers man!

  • @xingyubian5654
    @xingyubian5654 2 years ago

    Always wondered what keys, values, and queries are. Thank you for the clear explanation!

  • @prasitamukherjee5864
    @prasitamukherjee5864 3 years ago

    Thank you for the super neat explanation- Cleared a lot of stuff.

  • @arunantony3207
    @arunantony3207 4 years ago +1

    Great explanation!

  • @luischinchilla-garcia4840
    @luischinchilla-garcia4840 4 years ago +5

    Amazing explanation! Some constructive criticism: The visuals and hand writing can often be too messy.

  • @simons6512
    @simons6512 2 years ago

    Super explanation, one of the best I've found so far.
    It's cool to see someone from Switzerland engaging on this platform.
    Keep it up!

  • @michaelmuller136
    @michaelmuller136 4 years ago +3

    Thank you, that was very informative and explained well!

  • @rupjitchakraborty8012
    @rupjitchakraborty8012 4 years ago +1

    This is a great video. Please make a video on Hierarchical Neural Networks.

  • @suyashshrivastava8317
    @suyashshrivastava8317 3 years ago

    Thank you so much for this. Excellent explanation

  • @aminzaiwardak6750
    @aminzaiwardak6750 4 years ago +1

    Thanks a lot, you explained it very well.

  • @danecchio6621
    @danecchio6621 2 years ago

    Thank you so much for the explanation.

  • @vikaskumarjha9
    @vikaskumarjha9 5 years ago

    Thank you so much. You explain it so well in very simple terms.

  • @jabusch24
    @jabusch24 5 years ago +1

    Very good intro. Many videos don't focus on visual explanation, which you definitely did cover. I'd be thrilled to see a video that goes more into depth, also covering how exactly the decoding is done once the model is trained and how embeddings could be obtained for other tasks. But other than that, very, very well done!

  • @marjansherafati6913
    @marjansherafati6913 4 years ago

    Thank you very much, amazing explanation! 🙏🏼🙏🏼

  • @norik1616
    @norik1616 4 years ago +2

    Well explained. After watching your newer videos, you could do an even better job now! Please give your (filmed) attention to "Attention Is All You Need" again.

  • @nguyentrung1452
    @nguyentrung1452 2 years ago

    great explanation. Thank you, master

  • @kevind.shabahang
    @kevind.shabahang 3 years ago

    Thank you. Very clear.

  • @sahanagk4011
    @sahanagk4011 3 years ago

    This explanation is amazing!! Thank you for this

  • @sebchap24
    @sebchap24 1 year ago

    Quite an amazing explanation! Thanks a lot.

  • @bernhardvoggenberger9850
    @bernhardvoggenberger9850 3 years ago

    As a student your videos are very helpful!

  • @pladselsker8340
    @pladselsker8340 1 year ago +2

    So the attention mechanism is basically a dictionary... What the heck lol
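
    It really is close to one: a Python dict does a hard, exact-key lookup, while attention does a soft lookup where every key matches a little and the result is a weighted average of the values. A toy sketch of that analogy (my own hypothetical example, not from the video):

```python
import numpy as np

# Hard lookup: a dict returns exactly one value for an exact key.
table = {"cat": 1.0, "dog": 2.0}

# Soft lookup: the query is compared with every key, and the result
# blends all values, weighted by softmax similarity.
def soft_lookup(query, keys, values):
    scores = keys @ query
    weights = np.exp(scores) / np.exp(scores).sum()
    return weights @ values

keys = np.array([[1.0, 0.0],   # key standing in for "cat"
                 [0.0, 1.0]])  # key standing in for "dog"
values = np.array([1.0, 2.0])
q = np.array([10.0, 0.0])      # a query that strongly matches the "cat" key
result = soft_lookup(q, keys, values)  # very close to table["cat"]
```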

  • @twobob
    @twobob 2 years ago

    Yeah, that was a really good explanation.

  • @shrikanthsingh8243
    @shrikanthsingh8243 4 years ago +1

    Thank you so much, it was a very good explanation.

  • @halehdamirchi146
    @halehdamirchi146 4 years ago +1

    This was really helpful, thank you!

  • @GunHolsters
    @GunHolsters 2 years ago

    Thank you, now I understand.

  • @MehranZiadloo
    @MehranZiadloo 5 years ago +2

    While this video did add to my knowledge of the Transformer model, I think it did a poor job overall. I still cannot say that I have a general understanding of how this model works. What I'm interested to know is whether the training and inference models are the same (in LSTM and GRU, the inference decoder is used differently from the training decoder), and also what the inputs and outputs of this model are (I mean the exact structure, maybe even some examples).

  • @dailygrowth7967
    @dailygrowth7967 2 years ago

    Thanks, I really enjoy your content!

  • @sarahpanda1167
    @sarahpanda1167 4 years ago +1

    Very helpful!! Thank you!

  • @youngchaizheng6560
    @youngchaizheng6560 4 years ago +1

    Good explanation

  • @tyfoodsforthought
    @tyfoodsforthought 4 years ago

    This was wonderful. Thank you!!!

  • @vsiegel
    @vsiegel 2 years ago

    I see that this thing seems to exist, even if it is hard to believe. But that it was created by learning, starting with nothing, is literally unbelievable to me.
    Did it start from something else? But what could that be?

  • @garrettosborne4364
    @garrettosborne4364 3 years ago

    Thanks Yannic, great videos.