Understanding The Computational Graph in Neural Networks

  • Published on Jun 19, 2024
  • Do you know what the computational graph used by deep learning frameworks like TensorFlow or PyTorch is? No? Let me tell you then!
    The whole logic behind how neural networks learn is the back-propagation algorithm. This algorithm allows us to update the weights of the network so that it can learn. The key aspect of this algorithm is making sure we can compute the derivatives, or gradients, of very complex functions. That is the whole point of the computational graph! It ensures we can backpropagate those derivatives through the whole network, however deep it may be. So let me show you how it works!
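    To make the idea concrete, here is a minimal PyTorch sketch (the scalar values and the toy function y = (w*x + b)^2 are only an illustration, not taken from the video): PyTorch records each operation as a node of the computational graph during the forward pass, and backward() walks that graph in reverse to fill in the gradients.

    ```python
    import torch

    # Toy scalar example: y = (w * x + b)**2
    # PyTorch builds the computational graph as these operations run,
    # then backward() traverses it in reverse to accumulate gradients.
    x = torch.tensor(2.0, requires_grad=True)
    w = torch.tensor(3.0, requires_grad=True)
    b = torch.tensor(1.0, requires_grad=True)

    z = w * x + b      # intermediate node in the graph (z = 7)
    y = z ** 2         # output node (y = 49)

    y.backward()       # back-propagation through the recorded graph

    # Chain rule by hand: dy/dz = 2*z = 14
    print(x.grad)  # tensor(42.) -> dy/dx = 2*z*w = 14*3
    print(w.grad)  # tensor(28.) -> dy/dw = 2*z*x = 14*2
    print(b.grad)  # tensor(14.) -> dy/db = 2*z
    ```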

Comments • 6

  • @IkhukumarHazarika 5 days ago +1

    More good content indeed good one❤

  • @IkhukumarHazarika 5 days ago +1

    Love the way you teach every point please start teaching this way

  • @randomaccessofshortvideos6214 7 days ago

    💯💯💯

  • @IkhukumarHazarika 5 days ago +1

    Is it rnn 😅

  • @godzilllla2452 1 day ago

    I've got it now. I wonder why we can't calculate the x gradient by starting the backward pass closer to x instead of going through all the activations.

    • @TheMLTechLead 19 hours ago

      I am not sure I understand the question.