Neural Networks in pure JAX (with automatic differentiation)

  • Published 28 Nov 2024

Comments • 9

  • @DefinitelyNotAMachineCultist 1 year ago +3

    Thanks a bunch.
    I can never find enough JAX tuts lol
    Even large language models like ChatGPT often have outdated info because of their knowledge cutoff dates.
    It's just this and the docs for me hehe...

  • @juancolmenares6185 1 year ago +2

    Very nice explanation!

  • @christiansinger2497 1 year ago +2

    Very informative video as always!

  • @minhphan8552 4 months ago

    Thanks a lot for the nice video! I have a naive question. The layers of the neural network seem similar to a polynomial. How is a neural network better than a polynomial fit?

  •  1 year ago +1

    Great, seems like magic. Thanks!

    • @MachineLearningSimulation 1 year ago +1

      Nice, thanks :).
      Indeed, JAX is really smooth to use. I find the automatic differentiation interface better than in TensorFlow or PyTorch.
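
      The AD interface mentioned above can be illustrated with a minimal sketch (not taken from the video; the network shape, parameter names, and data here are illustrative). `jax.grad` turns a scalar-valued loss function into a function that returns gradients with respect to its first argument:

      ```python
      import jax
      import jax.numpy as jnp

      def predict(params, x):
          # A tiny one-layer network: linear map followed by tanh.
          W, b = params
          return jnp.tanh(x @ W + b)

      def loss(params, x, y):
          # Mean squared error between predictions and targets.
          return jnp.mean((predict(params, x) - y) ** 2)

      # Illustrative parameters and toy data.
      key = jax.random.PRNGKey(0)
      W = jax.random.normal(key, (2, 1)) * 0.1
      b = jnp.zeros((1,))
      params = (W, b)
      x = jnp.ones((4, 2))
      y = jnp.zeros((4, 1))

      # jax.grad differentiates `loss` w.r.t. `params`; the result
      # mirrors the structure of `params` (a tuple of arrays here).
      grads = jax.grad(loss)(params, x, y)
      ```

      Because the gradient object mirrors the parameter structure, a plain gradient-descent step is just an element-wise update over that structure.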