Lecture 2 Part 1: Derivatives in Higher Dimensions: Jacobians and Matrix Functions
- Published Nov 29, 2024
- MIT 18.S096 Matrix Calculus For Machine Learning And Beyond, IAP 2023
Instructors: Alan Edelman, Steven G. Johnson
View the complete course: ocw.mit.edu/co...
YouTube Playlist: • MIT 18.S096 Matrix Cal...
Description: Derivatives as linear operators give a fresh perspective on multivariable calculus, gradients, and Jacobian matrices; but we can now generalize to matrix functions.
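As a concrete instance of the description's theme (a worked example added here for illustration, not taken from the course page): viewing the derivative as a linear operator lets you differentiate a matrix function such as f(A) = A^2 directly, without flattening A into a vector.

% Worked example (editorial sketch): the derivative of f(A) = A^2
% as a linear operator acting on the perturbation dA.
% Expand f at A + dA and drop the second-order term:
%   f(A + dA) = (A + dA)^2 = A^2 + A\,dA + dA\,A + (dA)^2
\[
  df = f'(A)[dA] = A\,dA + dA\,A .
\]
% Since matrices need not commute, A\,dA + dA\,A \ne 2A\,dA in general,
% which is exactly why the linear-operator view is needed here.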
License: Creative Commons BY-NC-SA
More information at ocw.mit.edu/terms
More courses at ocw.mit.edu
Support OCW at ow.ly/a1If50zVRlQ
We encourage constructive comments and discussion on OCW’s YouTube and other social media channels. Personal attacks, hate speech, trolling, and inappropriate comments are not allowed and may be removed. More details at ocw.mit.edu/co....
This explanation of backpropagation is better than any other CS professor’s videos on the web. Steven really went straight to the heart of the issue and showed its beauty and simplicity at the same time.
Massive respect when instructors interact with each other.
Such a complex field discussed in such a clear fashion. Thank you, professors.
This is such a good introduction to the idea of backpropagation! Looking forward to learning more now. Kudos for the two-teacher model as well. Grateful to be able to access this lecture for free!
Is that Philip in the background? Buddy looks chill.
Great course!
It's kind of odd that the students are barely participating in these questions. I wonder why.
Excellent lecture. Learned a lot from a mathematical perspective.
It took me 30 minutes to prove that x ↦ ax is the only form a linear operator on ℝ can take. I feel so dumb.
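For the record, the standard one-line argument (an editorial addition, not from the original thread): a linear f : ℝ → ℝ is pinned down by its value at 1.

% Any linear map on the reals is multiplication by a = f(1):
\[
  f(x) = f(x \cdot 1) = x\,f(1) = a x, \qquad a := f(1).
\]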
Lecture 3 is up for today, and I'm going to complete it.
We should normalize seeing corgis paying careful attention to math classes.
Phillip!