Deep Learning (CS7015): Lec 5.9 (Part-2) Bias Correction in Adam

  • Published 31 Oct 2024

Comments • 10

  • @OmPrakash-vt5vr 4 months ago +1

    Concise and to the point! Thank You

  • @a.h.2138 4 years ago +6

    Cheers from Hungary!

  • @ranjithkumarkalal1810 5 years ago +2

    Nice explanation sir

  • @sadafwaqas5972 4 years ago +3

    Nice explanation... are there kids in the background?

  • @VishalSharma-gp6dm a year ago +2

    Sir, your voice changed from this lecture.

  • @phani_kiran 4 years ago

    Since E[m_t] equals the true expected value of the gradient after bias correction, does this mean the loss will always decrease?

    • @pradeepkumar-qo8lu 4 years ago

      Not sure about the loss in this case. Bias correction is used to bring the exponentially weighted running average in line with the gradients, which is what he proved mathematically in this lecture, i.e., the expectation of the bias-corrected exponentially weighted running average equals the expectation of the gradients.
      PS: do correct me and explain if I'm wrong.

    • @kunalbharali5181 3 years ago

      If your entire dataset is noisy, then it won't work; however, if you have noise in some of the data and not the entire set, then yes, it may remove the noise.

    • @sudharshantr8757 2 years ago

      The bias correction just multiplies m_t by a constant (1/(1 − β^t) at step t), and scaling by a constant is something the learning rate could already absorb. The loss doesn't always decrease either: we have seen the error overshoot due to the presence of momentum. As long as E[m_t] = c·E[g_t], we are on the right track, as the learning rate will take care of the rest (a short sketch illustrating the correction follows this thread).
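
A minimal sketch of the point discussed in this thread (my own illustration, not code from the lecture; true_mean, beta, and the noise scale are arbitrary toy values). With m_0 = 0, the raw running average m_t is biased toward zero in early steps, and dividing by (1 − β^t) removes that bias so the corrected estimate tracks E[g_t] from the start:

import numpy as np

rng = np.random.default_rng(0)
beta = 0.9        # decay rate of the first-moment estimate (Adam's beta1)
true_mean = 2.0   # toy "true" expected gradient (an assumption for the demo)
m = 0.0           # zero initialization is what causes the bias

for t in range(1, 11):
    g = true_mean + rng.normal(scale=0.5)  # noisy gradient sample
    m = beta * m + (1 - beta) * g          # exponentially weighted average
    m_hat = m / (1 - beta ** t)            # bias-corrected estimate
    print(f"t={t:2d}  m={m:.3f}  m_hat={m_hat:.3f}")

At t = 1, m is only (1 − β)·g ≈ 0.1·g, far below the true mean, while m_hat already sits near it; as t grows, (1 − β^t) → 1 and the two estimates coincide.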