Batch Normalization - Part 3: Backpropagation & Inference

  • Published Feb 1, 2025

Comments • 14

  • @marcoxs36 · 1 year ago +2

    This is the most complete explanation of BN I've seen on YouTube. Thank you!

    • @MLForNerds · 1 year ago

      Glad it was helpful!

  • @ThineshKumar-gk8hb · 10 months ago +1

    Good explanation and efficient knowledge transfer; expecting more videos like this from you.

  • @ankurdas1477 · 1 year ago +4

    This Batch Normalization video series is a real gem. Before watching it, I thought I knew everything about BN😅. Another request from my side: please complete the object detection playlist.

  • @anantmohan3158 · 1 year ago +1

    Thank you for creating such wonderful videos.
    The maths behind BN was beautifully explained. Eagerly waiting for the next videos on the Python and PyTorch implementations of BN.
    Thank you!

  • @lokeshborawar9899 · 25 days ago

    @MLForNerds Why do running_mean and running_var continue to be updated during training, even after the dataset has been processed for a full epoch?
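    (The question above is about the exponential-moving-average update that frameworks such as PyTorch use for these buffers. A minimal NumPy sketch, with an assumed momentum value, shows why updating them on every training step is useful: the EMA keeps tracking the batch statistics as they drift when earlier layers' weights change.)

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    momentum = 0.1  # assumed value (PyTorch's BatchNorm default)

    running_mean = np.zeros(3)  # one entry per feature/channel
    running_var = np.ones(3)

    # Every forward pass in training mode updates the running stats,
    # no matter how many epochs have already been processed:
    for step in range(100):
        batch = rng.normal(loc=2.0, scale=1.5, size=(32, 3))  # (N, C)
        batch_mean = batch.mean(axis=0)
        batch_var = batch.var(axis=0)
        # exponential moving average: old stats decay toward the latest batch
        running_mean = (1 - momentum) * running_mean + momentum * batch_mean
        running_var = (1 - momentum) * running_var + momentum * batch_var

    # After many steps the buffers approximate the true data statistics
    # (mean 2.0, variance 1.5**2 = 2.25) and are used at inference time.
    print(running_mean)
    print(running_var)
    ```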

  • @sridharchandrasekar7787 · 10 months ago +1

    Sir, I have one doubt: what does "batch" refer to here? E.g., in a CNN we have a 255x255 matrix; how do we calculate the mean and variance?

    • @MLForNerds · 10 months ago

      Batch is an additional dimension on top of 255x255 in your case. Suppose you have a grayscale image of 255x255. If there are 10 images in the batch, the dimension becomes 10x255x255. You calculate the mean and variance along the batch dimension for each pixel position.
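      (The reply above can be sketched in NumPy with the same shapes. Note this follows the reply literally, taking per-pixel statistics along the batch axis; standard BatchNorm2d in frameworks like PyTorch additionally averages over the spatial dimensions, producing one statistic per channel.)

      ```python
      import numpy as np

      rng = np.random.default_rng(42)

      # A batch of 10 grayscale images, each 255x255, as in the reply.
      batch = rng.normal(size=(10, 255, 255))

      # Mean and variance along the batch dimension (axis 0) give
      # one statistic per pixel position:
      mean = batch.mean(axis=0)  # shape (255, 255)
      var = batch.var(axis=0)    # shape (255, 255)

      # Normalize each pixel position with its own batch statistics:
      eps = 1e-5
      normalized = (batch - mean) / np.sqrt(var + eps)

      print(mean.shape, var.shape, normalized.shape)
      ```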

    • @sridharchandrasekar7787 · 10 months ago

      @MLForNerds Does batch mean the mini-batch size? If I set the mini-batch size to 32, then how is it calculated, sir?

  • @muthukamalan.m6316 · 1 year ago +1

    Please cover group norm and layer norm as well.

    • @MLForNerds · 1 year ago

      I will cover them in the coming videos. Thank you.

  • @sandeepanand3834 · 1 year ago

    Please add:
    1. Forward/backward prop for dropout, maxpool, residual connections...
    2. Forward/backward prop when we use Adam instead of SGD.
    Please!