Encoder-decoder architecture: Overview

  • Published Jan 25, 2025

Comments • 33

  • @googlecloudtech • 1 year ago +3

    Subscribe to Google Cloud Tech → goo.gle/GoogleCloudTech

  • @gkay007 • 1 year ago +13

    You need to use an encoder-decoder to understand this course.

    • @AshkaCambell • 1 year ago

      RTS TEXTS AND FEEDS INSTEAD OF SMS
      /FEEL/

    • @Vivienna-k4v • 22 days ago

      True 😭😭🤣🤣

  • @DogBlessGeesus • 1 year ago +28

    I came to the comments to see if I was the only one struggling with this explanation. This was a very difficult video to follow; does anyone have a better way of summing it up?

    • @Udayanverma • 1 year ago +1

      I am also looking for a proper course.

    • @jeffmathew6238 • 1 year ago +1

      I think Prof. Ghassemi's Lectures and Tutorials give a proper explanation... I watched his lecture 6 playlist and found it helpful... hope it helps you too.

  • @debadri • 1 year ago +12

    Does anyone have a link to an easier explanation? I could not decode this lesson.

    • @danish5326 • 1 year ago

      th-cam.com/video/L8HKweZIOmg/w-d-xo.html

    • @AshkaCambell • 1 year ago

      Try Machine Learning AI on your own created audio

    • @debadri • 1 year ago

      @AshkaCambell You didn't have to display your stupidity in full view.

  • @ThisIsAnInactiveChannel • 1 year ago +35

    I think there's definitely a simpler way to explain how this architecture works; this one is too hard to understand, tbh.

  • @abdulanzil5224 • 1 year ago +8

    I understood nothing. What is happening? A simple transformer itself contains an encoder and a decoder, so why is the explanation done with RNNs? At the end it says RNNs are replaced by transformers. That makes me more confused!

    • @opplli4942 • 1 year ago +1

      RNNs were replaced by transformers, which work on the attention mechanism. If you want to know more about transformers, follow the learning path suggested in the video.
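
A minimal sketch of the attention mechanism mentioned in the reply above, assuming NumPy; function and variable names are illustrative, not taken from the video:

```python
# Scaled dot-product attention: the mechanism transformers use in place of
# recurrence. Each position takes a weighted average over all positions.
# Purely illustrative; shapes and names are assumptions.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K: (seq_len, d_k); V: (seq_len, d_v). Returns (seq_len, d_v)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # token-to-token similarity
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ V                               # weighted sum of values

# Toy self-attention over 3 tokens with 4-dimensional embeddings.
x = np.random.default_rng(0).normal(size=(3, 4))
print(scaled_dot_product_attention(x, x, x).shape)   # (3, 4)
```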

  • @hlodwig0_049 • 1 year ago +3

    "shifted to the left"...? 5:01

  • @vq8gef32 • 11 months ago +1

    Amazing, thank you!

  • @SuperShawermaXS • 1 year ago +8

    Damn, sounds like Chamber in the Valorant game.

  • @aryansudan2239 • 2 months ago

    Très bien, monsieur (very good, sir).

  • @2NormalHuman • 1 year ago +2

    I think this video omits so many details that I don't see the point of recording it. It doesn't even show how the data is embedded for the encoder or how the vector is produced.

    • @kisame3151 • 1 year ago

      The goal of an encoder-decoder architecture is to "compress" high-dimensional information, be it an image or a natural-language sentence, into a dense, lower-dimensional representation.
      To minimize the loss during training, the encoder is forced to learn to represent the input in that lower-dimensional space without losing so much information that the decoder can no longer recover the input. You can think of the encoder as learning to "summarize" its input.
      The encoder learns to encode natural language into vector-space representations the same way the decoder learns to decode them: through training and backpropagation. Showing this process in any meaningful way is difficult because it happens inside a deep neural network built from RNN blocks. If you don't know what RNNs are, I recommend learning about that architecture first.
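
A minimal sketch of the RNN-based encoder-decoder idea described in the reply above, assuming PyTorch; module and variable names and all sizes are illustrative, not taken from the video:

```python
# Toy sequence-to-sequence model: the encoder compresses the source sequence
# into a single hidden "context" vector; the decoder generates the target
# sequence from that vector. Illustrative only; sizes are arbitrary.
import torch
import torch.nn as nn

class EncoderDecoder(nn.Module):
    def __init__(self, vocab_size=100, emb_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, src_ids, tgt_ids):
        # Encoder: read the whole source and keep only the final hidden state,
        # i.e. the dense lower-dimensional summary of the input.
        _, context = self.encoder(self.embed(src_ids))
        # Decoder: start from that context and predict the next token at each
        # step (teacher forcing: the target sequence is fed as decoder input).
        dec_states, _ = self.decoder(self.embed(tgt_ids), context)
        return self.out(dec_states)          # per-step logits over the vocabulary

# Usage: a batch of 2 "sentences" of token ids; training would apply
# cross-entropy between these logits and the shifted target ids.
model = EncoderDecoder()
src = torch.randint(0, 100, (2, 5))
tgt = torch.randint(0, 100, (2, 6))
print(model(src, tgt).shape)                 # torch.Size([2, 6, 100])
```

In the framing the comments describe, newer models swap these RNN blocks for transformer layers, but the encode-then-decode structure stays the same.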

  • @akheelkale2937 • 1 year ago +7

    What am I meant to understand from this?

  • @aashishrulz • 1 year ago

    The enunciation and audio are so suppressed.

  • @Neil1701 • 1 year ago +7

    Far too complicated and difficult to understand.

    • @AshkaCambell • 1 year ago

      Supply chains.
      Are you the consumer, the user, or the creator?

  • @karthickkuduva9819 • several months ago

    I'm not alone 😂

  • @prose_mozaic • 1 year ago +2

    *frantic notation intensifies*

  • @mtare8942 • 5 months ago +1

    Really, Google? We can do better than this 🙄

  • @abhisheksharmac3p0 • 6 months ago

    Worst explanation.

  • @AshkaCambell • 1 year ago

    /feel/