Understanding The Computational Graph in Neural Networks
- Published Jun 19, 2024
- Do you know what the computational graph used by deep learning frameworks like TensorFlow or PyTorch is? No? Let me tell you then!
The whole logic behind how neural networks learn is the backpropagation algorithm. This algorithm allows us to update the weights of the network so that it can learn. The key requirement of this algorithm is that we can compute the derivatives, or gradients, of very complex functions. That is the whole point of the computational graph! It lets us backpropagate those derivatives through the entire network, however deep it may be. So let me show you how it works!
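To make the idea concrete, here is a minimal sketch of a computational graph with reverse-mode automatic differentiation, the mechanism behind backprop in PyTorch and TensorFlow. All names here (`Node`, `mul`, `backward`, etc.) are illustrative choices for this sketch, not any framework's real API, and it assumes a tree-shaped graph for simplicity (real frameworks use a topological sort to handle shared nodes).

```python
import math

class Node:
    """One node in the computational graph: a value plus edges to its parents."""
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        # Each entry is (parent_node, local_gradient of this node w.r.t. that parent)
        self.parents = parents

    def backward(self, upstream=1.0):
        # Chain rule: accumulate the upstream gradient, then pass it to each
        # parent scaled by the local derivative along that edge.
        self.grad += upstream
        for parent, local_grad in self.parents:
            parent.backward(upstream * local_grad)

def mul(a, b):
    # d(a*b)/da = b,  d(a*b)/db = a
    return Node(a.value * b.value, [(a, b.value), (b, a.value)])

def sigmoid(a):
    s = 1.0 / (1.0 + math.exp(-a.value))
    # d(sigmoid(a))/da = s * (1 - s)
    return Node(s, [(a, s * (1.0 - s))])

# Forward pass builds the graph: y = sigmoid(w * x)
x = Node(2.0)
w = Node(-1.0)
y = sigmoid(mul(w, x))

# Backward pass: seed dy/dy = 1 and walk the graph in reverse.
y.backward()
print(x.grad, w.grad)
```

Each operation records the local derivatives of its output with respect to its inputs as it runs; the backward pass then just multiplies those local derivatives together along every path from the output back to the weights, which is why depth is no obstacle.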
More good content indeed good one❤
Love the way you teach every point please start teaching this way
💯💯💯
Is it rnn 😅
I've got it now. I wonder why we can't calculate the x gradient by starting the backward pass closer to x instead of going through all the activations.
I am not sure I understand the question.