This is the most complete explanation of BN I've seen on YouTube. Thank you!
Glad it was helpful!
Good explanation and efficient knowledge transfer. Expecting more videos like this from you.
The video series on Batch Normalization is really a gem. Before watching it, I thought I knew everything about BN 😅. Another request from my side: please complete the object detection playlist.
Thank you 😊
Thank you for creating such wonderful videos.
The maths behind BN was beautifully explained. Eagerly waiting for the next videos on the Python and PyTorch implementations of BN.
Thank you..!
Thank you 👍
@MLForNerds Why do running_mean and running_var continue to be updated during training, even after the dataset has been processed for an epoch?
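For context on this question: in most implementations (e.g. PyTorch's BatchNorm layers), running_mean and running_var are exponential moving averages that are updated on every forward pass in training mode, not once per epoch, which is why they keep moving across epochs. A minimal NumPy sketch with illustrative values (the momentum value and shapes are assumptions, matching PyTorch's default momentum of 0.1):

```python
import numpy as np

# Illustrative setup: 3 features, PyTorch-style momentum (default 0.1).
momentum = 0.1
running_mean = np.zeros(3)
running_var = np.ones(3)

rng = np.random.default_rng(0)
for step in range(5):  # one update per training batch, every epoch
    batch = rng.random((32, 3))  # hypothetical mini-batch of 32 samples
    batch_mean = batch.mean(axis=0)
    batch_var = batch.var(axis=0)
    # Exponential moving average: each new batch nudges the running
    # statistics toward the current batch statistics, so they are
    # never "finished" while training continues.
    running_mean = (1 - momentum) * running_mean + momentum * batch_mean
    running_var = (1 - momentum) * running_var + momentum * batch_var

print(running_mean, running_var)
```

At inference time these accumulated running statistics are used instead of per-batch statistics, which is why they need to track the data distribution for as long as training runs.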
Sir, I have one doubt: what does "batch" refer to here? E.g., in a CNN we have a matrix of 255 x 255; how do we calculate the mean and variance?
Batch is an additional dimension on top of the 255x255 in your case. Suppose you have a grayscale image of 255x255. If there are 10 images in the batch, the dimension becomes 10x255x255. You calculate the mean and variance along the batch dimension for each pixel.
@MLForNerds Does "batch" mean the mini-batch size? If I set the mini-batch size to 32, how will it be calculated, sir?
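The per-pixel calculation described in the reply above can be sketched in NumPy (the batch of 32 random 255x255 images is illustrative; note that for conv layers, frameworks typically also average over the spatial dimensions per channel, not per pixel):

```python
import numpy as np

# Hypothetical mini-batch: 32 grayscale images, each 255 x 255.
rng = np.random.default_rng(0)
batch = rng.random((32, 255, 255))

# Mean and variance along the batch axis (axis 0): one statistic
# per pixel position, computed across the 32 images.
mean = batch.mean(axis=0)  # shape: (255, 255)
var = batch.var(axis=0)    # shape: (255, 255)

# Normalize every image in the batch with the per-pixel statistics.
eps = 1e-5  # small constant for numerical stability
normalized = (batch - mean) / np.sqrt(var + eps)

print(mean.shape, var.shape, normalized.shape)
```

So a mini-batch size of 32 just means the statistics are averaged over 32 images at each pixel position.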
Please cover Group Norm and Layer Norm as well.
I will cover them in the coming videos. Thank you.
Please add:
1. Forward/backward prop through dropout, max pooling, residual connections...
2. Forward/backward prop when we use Adam instead of SGD.
Please!