Full video list and slides: www.kamperh.co... Errata: 6:10 - The Jacobian is actually something different (the partial derivatives of a vector function).
Great video. That meow from the cat though
I'm confused about the derivative of a vector function at 5:40. I think the gradient of a function f: R^n → R^m should be a matrix of size m×n, but I'm not sure about it.
I am learning and completely fascinated, but the cat interrupting was hilarious as well.
4:49 if anyone asks what you're doing: watching cat videos on the Internet
🤣
Great video! thank you so much!
Can you put the links in the description?
I came here because of Engineer Hammad
He posted about you, legend!
Great video! Been writing equations in componentwise notation without really knowing why. Now I do! Thanks a bundle
Thank you for the video !!! Also which book would you recommend to learn vector/matrix calculus or multivariate analysis intuitively?
For a long time the books in this area were quite poor, but I love the newer Mathematics for Machine Learning book: mml-book.github.io/book/mml-book.pdf
@@kamperh Great, thanks! Do you also suggest any book for pure vector calc/multivariate analysis? A book that teaches the concepts completely, ideally with proofs.
Nice video for sure, but for a principle seeker it was pretty weird to see a formula appear out of nowhere, like: the partial derivative of x^T A x with respect to the vector x is (A^T + A)x.
It's a really nice exercise to try and prove some of these identities, like the one you mention! That would be a great way to lay down the principles.
@@kamperh I've proved it successfully! That was indeed a nice experience; thanks for your precious advice.
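The identity discussed in this thread can also be checked numerically before proving it. Below is a small sketch (my own example with NumPy, not from the video) that compares a finite-difference gradient of x^T A x against the closed form (A^T + A)x:

```python
import numpy as np

# Numerical check of the identity d/dx (x^T A x) = (A^T + A) x.
# A and x are arbitrary test values, chosen here for illustration.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
x = rng.standard_normal(3)

f = lambda v: v @ A @ v  # the scalar function x^T A x

# Central finite-difference gradient, one coordinate at a time.
eps = 1e-6
grad_fd = np.array([
    (f(x + eps * e) - f(x - eps * e)) / (2 * eps)
    for e in np.eye(3)
])

grad_closed = (A.T + A) @ x
assert np.allclose(grad_fd, grad_closed, atol=1e-5)
```

Since f is quadratic, the central difference is essentially exact here; the check would agree for any A and x.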
You look like Benedict Cumberbatch
The nicest thing that anyone has ever said!
Thanks for this video. Kind regards!
Thank you so much for this explanation!!! This helped me tremendously.
Thanks for your video, really helpful!
6:10 This is not a Jacobian matrix; the Jacobian is defined for a vector function f: R^n → R^m.
Thanks for pointing this out: you are absolutely right! I've added it to the errata.
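To make the shape concrete: for a vector function f: R^n → R^m, the Jacobian collects the partials df_i/dx_j, giving an m×n matrix in numerator layout (its transpose in denominator layout). A minimal sketch, using a made-up example map rather than anything from the video:

```python
import numpy as np

def f(x):
    # Example map from R^3 to R^2 (made up for illustration).
    return np.array([x[0] * x[1], np.sin(x[2])])

def jacobian_fd(f, x, eps=1e-6):
    # Finite-difference Jacobian: one column per input coordinate.
    cols = [(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
            for e in np.eye(len(x))]
    return np.stack(cols, axis=1)

x = np.array([1.0, 2.0, 3.0])
J = jacobian_fd(f, x)
print(J.shape)  # (2, 3): m x n for f mapping R^3 to R^2
```

The first row is [x2, x1, 0] and the second is [0, 0, cos(x3)], matching the partials of each output component.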
I’m so happy I found your channel! Beautiful diagrams and explanations, subscribed!
Sir, after taking partial derivatives with respect to x1 and x2 we get two functions which represent slopes. My question is: do those functions represent the slope of a plane, or something else?
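One way to see what those two slope functions give you: at any point, the partials are the slopes of the surface along the x1 and x2 axes, and together they define the tangent plane at that point. A small sketch with my own example surface (not from the video):

```python
# The tangent plane at (a, b) is
#   z = f(a, b) + f_x1(a, b) * (x1 - a) + f_x2(a, b) * (x2 - b)
f   = lambda x1, x2: x1**2 + 3 * x2  # example surface
fx1 = lambda x1, x2: 2 * x1          # slope along the x1 direction
fx2 = lambda x1, x2: 3.0             # slope along the x2 direction

a, b = 1.0, 2.0
tangent = lambda x1, x2: (f(a, b)
                          + fx1(a, b) * (x1 - a)
                          + fx2(a, b) * (x2 - b))

# Near (a, b) the plane matches the surface to first order:
# the leftover error is second order (here, 0.01**2 = 1e-4).
print(f(1.01, 2.01) - tangent(1.01, 2.01))
```

So each partial on its own is just a slope along one axis; it is the pair of them that pins down the tangent plane.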
Wow, I've been looking for six months for a simple explanation like this
At 4:41, each entry in that column of the partials of f with respect to the x's should read the partial of f transposed with respect to the x's, since you have defined the vector function f as a column vector. The way you have it written would give an (NM)×1 column matrix.
Hey @James Greenwood. Appreciate you watching and giving feedback! I think it depends on how you define the partial derivatives of a vector-by-scalar (which is different from the partial derivatives of a scalar-by-vector defined at the top of the slide). You will see on Wikipedia that vector-by-scalar is the transpose of scalar-by-vector, which means that in this case vector-by-scalar would be a row vector and you don't need a transpose. Have a look here: en.wikipedia.org/wiki/Matrix_calculus#Derivatives_with_vectors. Just note that they use numerator layout while in the slides I use denominator layout.
@@kamperh Thank you for your reply. This makes sense now.
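The layout point in this thread can be shown with a tiny toy example (my own, not from the slides): for a scalar function f(x) = a^T x, the partials df/dx_i fill a vector whose orientation depends on the convention, and the two conventions are transposes of each other.

```python
import numpy as np

# Gradient of f(x) = a^T x under the two layout conventions:
#   denominator layout -> column vector, same shape as x: (n, 1)
#   numerator layout   -> row vector, the transpose:       (1, n)
a = np.array([[1.0], [2.0], [3.0]])  # column vector in R^3

grad_denominator = a    # shape (3, 1)
grad_numerator = a.T    # shape (1, 3)

assert np.array_equal(grad_numerator, grad_denominator.T)
```

The slides use denominator layout, so vector-by-scalar comes out as a row there, which is why no extra transpose is needed.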
@@kamperh Hi, it's a great video. I would suggest Ch. 9 of Rudin's Principles of Mathematical Analysis, on differentiation. Early on, scholars settled on the convention that a vector is a column vector (it could have been a row vector, but it isn't). To avoid confusion, and possibly incorrect theorems and proofs, it would be useful to adopt this terminology.
Goat
Source ?
Great video. Thank you
you are a great guy
Thanks Gio Gio! :D
You are simply lethal at elucidating such a convoluted topic
🙏🙏
Nice and easy to follow, thanks!
I'd rather see some example questions than horrible Greek-letter formulas