Before watching this, I tried to implement the LeNet architecture myself first, and it worked exactly like your class-based LeNet implementation. Thanks to you, I am able to connect all the videos of this PyTorch playlist. Cheers :D
This is quite great!! Implementing various architectures in PyTorch. Please don't stop with the basic architectures; try to put out videos on advanced content as well, maybe advanced GAN architectures in computer vision :)
So happy to see you explaining things as simply as possible...
Please keep making videos on implementing various deep learning architectures using PyTorch.
I really appreciate your kind comment :) More architectures coming up!
same man! :)
Thanks for implementing the architectures from scratch! It is extremely helpful!
8:28 I know that feeling bro! Great video, as always :)
Quick and easy, thanks
Very good video, keep it up, buddy!
Just a question: what is the difference between nn.Linear and nn.Sequential? Why are you using nn.Linear here and not nn.Sequential?
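Not from the video itself, but a minimal sketch of the difference (the layer sizes below are illustrative): nn.Linear is a single fully connected layer, while nn.Sequential is just a container that chains several modules together in order.

```python
import torch
import torch.nn as nn

# nn.Linear is one fully connected layer: y = x @ W.T + b
fc = nn.Linear(in_features=120, out_features=84)

# nn.Sequential is a container that runs its modules in order
block = nn.Sequential(
    nn.Linear(120, 84),
    nn.ReLU(),
    nn.Linear(84, 10),
)

x = torch.randn(4, 120)   # batch of 4 feature vectors
print(fc(x).shape)        # torch.Size([4, 84])
print(block(x).shape)     # torch.Size([4, 10])
```

So they are not alternatives: a Sequential typically contains one or more Linear layers. Writing the layers out individually (as in the video) just makes the forward pass explicit.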
Why is the softmax activation not applied at the last layer?
Because softmax is already included in the CrossEntropyLoss class.
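To illustrate the answer above, a small sketch (with made-up logits and targets): nn.CrossEntropyLoss expects raw logits because it applies log-softmax internally, so it matches log_softmax followed by NLL loss.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

logits = torch.randn(4, 10)           # raw network outputs, no softmax applied
targets = torch.tensor([1, 0, 3, 9])  # class indices for each sample

# CrossEntropyLoss = log-softmax + negative log-likelihood in one step
ce = nn.CrossEntropyLoss()(logits, targets)

# equivalent: apply log_softmax manually, then NLL loss
manual = F.nll_loss(F.log_softmax(logits, dim=1), targets)

print(torch.allclose(ce, manual))     # True
```

Applying softmax yourself and then using CrossEntropyLoss would effectively softmax twice, which is why the model's last layer outputs plain logits.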
Don't we specify a loss function and an optimizer like Adam? Also, don't we specify an initialization scheme like Xavier initialization?
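Yes, those usually appear in the training loop rather than in the model class. A minimal sketch of how they are commonly wired up in PyTorch (the model, layer sizes, and learning rate here are illustrative, not from the video):

```python
import torch
import torch.nn as nn

# illustrative stand-in model
model = nn.Sequential(nn.Linear(784, 120), nn.ReLU(), nn.Linear(120, 10))

# Xavier (Glorot) initialization applied to every Linear layer
def init_weights(m):
    if isinstance(m, nn.Linear):
        nn.init.xavier_uniform_(m.weight)
        nn.init.zeros_(m.bias)

model.apply(init_weights)

criterion = nn.CrossEntropyLoss()                            # loss function
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)    # optimizer

# one dummy training step on random data
x = torch.randn(8, 784)
y = torch.randint(0, 10, (8,))
optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
print(loss.item())
```

If no initializer is set explicitly, PyTorch layers fall back to their default initialization, which is often fine for small architectures like LeNet.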
Thank you for your video!