Exponential Weighted Average || Lesson 12 || Deep Learning || Learning Monkey ||

  • Premiered Oct 31, 2024

Comments • 17

  • @utkarshsharma4041 • 1 year ago

    🤟 rocking

  • @papa4669 • 2 years ago

    Wonderful teaching 👍

  • @shivamkumar-qp1jm • 2 years ago

    Wow you teach nicely understood everything in the first attempt

  • @shahomaghdid9591 • 5 months ago

    Thank you so much

  • @dinoyjohny5211 • 2 years ago

    Super. Thankyou so much

  • @hassanrevel • 2 years ago

    Super cool man. I understood everything.

  • @lennonturner9032 • 3 years ago

    I'm surprised this has so few views. the concepts were explained so well. Thank you for taking the time out to talk about this :)

  • @Gamezone-kq5sx • 3 years ago

    excellent.....your explanation is very good....Many Thanks

  • @biginepallinagasai4403 • 3 years ago

    Kudos for your work bro. The content is really informative and keep up the good work.

  • @saimafazal2517 • 2 years ago

    B20 or b2???

  • @intelligentinvesto9060 • 3 years ago

    Can you pls explain how the no. of previous points = 1/(1-beta)

    • @LearningMonkey • 3 years ago +1

      Hi
      It's an approximate equation.
      As beta increases, the denominator (1 - beta) decreases, and as the denominator decreases, the number of points increases. That is why that equation is used.
      As the beta value increases, the weights of more previous points contribute meaningfully to the average. It does not mean the remaining points are ignored; they are just given less weightage.
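The approximation in the reply can be sketched in a few lines of Python. This is an assumption-laden illustration, not code from the lesson: `ewa` implements the standard recurrence v_t = beta * v_{t-1} + (1 - beta) * theta_t, and the comment below shows why 1/(1 - beta) is quoted as the rough number of points that dominate the average.

```python
def ewa(values, beta):
    """Exponentially weighted average of a sequence (v_0 = 0)."""
    v = 0.0
    out = []
    for theta in values:
        v = beta * v + (1 - beta) * theta
        out.append(v)
    return out

# The weight on the point k steps back is (1 - beta) * beta**k.
# beta**k drops below 1/e after roughly k = 1 / (1 - beta) steps,
# so points older than that contribute little (but not zero) weight.
beta = 0.9
effective_window = 1 / (1 - beta)   # about 10 points for beta = 0.9
```

With beta = 0.9 the last ~10 points carry most of the weight; raising beta to 0.98 widens that to ~50 points, which matches the reply's claim that a larger beta averages over more previous points.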