The code you wrote in the video and the code in the repo are different. If I follow the code from the video, main.py calls the Model class but is missing the arguments, and when I checked the repo, the arguments had been removed from the Layer class initialization. There are also a few other places where errors come up, like ValueError: operands could not be broadcast together with shapes (32,64) (32,10)
Hey, thanks for bringing this up.
I forgot to mention in the video that I refactored the code a little between the forward pass and backpropagation videos to remove some redundancies (they were primarily included for explanatory purposes). Also, while implementing backpropagation in the video, there were some bugs (e.g., I forgot to take the transpose of a couple of matrices during the backward step) that I fixed towards the end of the video.
Looking at your ValueError, I believe you are not taking the transpose of a matrix in some step (I suspect the backward step).
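To make the shape bookkeeping concrete, here's a minimal NumPy sketch of a dense layer's forward and backward step using the shapes from your error message. The names (X, W, d_out) are just for illustration and won't match the video's Layer class exactly:

```python
import numpy as np

# Illustrative shapes taken from the error: batch of 32, 64 inputs, 10 outputs.
X = np.random.randn(32, 64)      # layer input
W = np.random.randn(64, 10)      # layer weights
d_out = np.random.randn(32, 10)  # gradient arriving from the next layer

# Forward pass: (32, 64) @ (64, 10) -> (32, 10)
out = X @ W

# Backward pass: the transposes are what keep the shapes compatible.
dW = X.T @ d_out   # (64, 32) @ (32, 10) -> (64, 10), same shape as W
dX = d_out @ W.T   # (32, 10) @ (10, 64) -> (32, 64), same shape as X

# If you drop the transpose and do an element-wise multiply instead
# (e.g. X * d_out), NumPy tries to broadcast (32, 64) against (32, 10)
# and raises exactly the ValueError you're seeing.
```

If you print the shapes at each step of your backward method, the mismatch usually points straight at the missing transpose.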
As for the GitHub repo, it differs from the code in the video because I have included additional activation functions, some debugging methods for the classes, and minor refactoring to remove explanatory code. The repo is meant to be a reference/full-fledged library that you can start playing with.
I understand it can be frustrating to deal with errors when you are following a guide, and I'll try to make the explanations more comprehensive in the future. Let me know if you have any more questions/issues.
@DeCoder157 Understood, mate. I'll work through them from scratch; hopefully that'll help me learn much more!
Wow, this is perfect for starting with neural networks. I recently started learning ML and made some projects using Linear Regression, Logistic Regression, Decision Trees, and Random Forests. Now I wanted to try neural networks but had no idea how. This will be my starting point. Hopefully I can do it without going to the GitHub. Thank you for providing such a detailed video.
That's seriously amazing, thanks for sharing.
I'm looking forward to the next videos, appreciate your work.
Great video.
I loved the animation
Awesome!