I have a PhD in machine learning, yet I regularly come back to revisit concepts you are covering here (one I vividly remember is your description of kernel methods). Thank you for your work
You have a gift for calm, simple, sequential explanation. Thank you!
You're very welcome!
I just looked at your old videos and this one, there's quite a step up in the way you explain.
Just quickly popping in to say I really like your content so far. It helps me quickly recall some of these concepts I haven't used in a while and kind of understand again why things work the way they do.
Thanks!
Amazing explanation. I was just looking for a good definition of a jacobian matrix, but came back with some new insights in ML. Thanks!
Glad it was helpful!
Bro you make my life way more enjoyable and easier. Thank you so much
What an incredible video! Your channel must go tenfold within the next year.
Something new learned! Thank you for your efforts, Ritvik!
of course!
This was incredibly helpful - thank you! This is the first time Jacobians have made a semblance of sense to me.
Dear Ritvik, I am studying 6-axis robot control and your videos immensely help me understand what all this stuff means.
Thanks a lot for all the effort you put into these videos...
Glad to hear that
Awesome explanation, Ritvik. In addition to this, can you please cover how autodiff works by making use of these Jacobians?
Great suggestion!
Thanks for talking about the application. That made it very clear.
Clear and short vid that is really helpful for recalling some of the Jacobian matrix use cases
Thanks for such a simplified explanation 👍👍
My pleasure
Oh my god! This is the absolute best explanation of the Jacobian matrix
Thank you so much!
Straight forward explanation of the Jacobians. It helped me a lot. Thanks!
As always, great videos man! :)
Thanks :)
Superb explanation! you explained the complicated topic in a simple way.
Damn!! That was really cool. Thank you for such a wonderful explanation.
No problem!
thank you very much, simple and intuitive presentation of the material!
Very well done Ritvik, thanks for sharing! Could be nice to throw in the calculus "chain rule" for Jacobians when you talk about their multiplication? And why stop there, it could be even cooler to hear the story of higher-order derivatives!! Looking forward :)
Great suggestion!
Thank you! Even though I use the Jacobian matrix in pose estimation from images, I feel like the more ways there are to look at it, the better.
Great explanation delivered using plain language. Thanks!
great explanation and great example.
Thanks!!
Can you also make a video about the Hessian? Thank you
great suggestion!
Thanks man, the other way to calculate the derivative w.r.t. input was always non-intuitive for me. This is so easy to understand...
I'm glad!
this is exactly what i was looking for, thank you!!
This was freaking awesome. Bravo!
Excellent explanation. Thanks
This made the Jacobian way less intimidating. Thank you!
Great to hear!
Thanks, so good to get an insight (my head still hurts) into how Jacobians relate to ML... even if it is without the activation function... fair play.
Nicely done bro! got a lot of help!
Great explanation. Good topic.
appreciated the content and the way you are teaching!
It seems the Jacobian has a lot of different implications. I learned it by myself by watching different YouTube videos, and yours is one of the best I've come across. If you can upload more videos about the different underlying meanings and applications of the Jacobian, that would be great! Thanks a lot!
Jack Lam from Hong Kong
Glad it was helpful!
Wow! Awesome video, right to the point, well explained. Thanks a lot!
Very impressive!
This was a very good explanation, thanks you!
very clear explanation
thank you
Great video 👍
just a few questions I had while watching:
1- what is the meaning of a high Jacobian?
2- are there boundaries on the Jacobian's values?
3- is it always a scalar value?
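Not from the video, but a quick numerical sketch (my own toy example) that may help with question 3: for a vector-valued function of several variables, the Jacobian is a matrix of partial derivatives, not a scalar. Here it is approximated with central finite differences.

```python
import numpy as np

def f(v):
    # f: R^2 -> R^2, f(x, y) = (x^2 * y, x + y)
    x, y = v
    return np.array([x**2 * y, x + y])

def jacobian(f, v, eps=1e-6):
    # Central finite-difference approximation of the Jacobian of f at v.
    # Column j holds the partial derivatives with respect to input j.
    v = np.asarray(v, dtype=float)
    J = np.zeros((f(v).size, v.size))
    for j in range(v.size):
        dv = np.zeros_like(v)
        dv[j] = eps
        J[:, j] = (f(v + dv) - f(v - dv)) / (2 * eps)
    return J

J = jacobian(f, [1.0, 2.0])
# Analytic Jacobian is [[2xy, x^2], [1, 1]], so at (1, 2) it is [[4, 1], [1, 1]].
print(J)
```

The Jacobian collapses to a scalar only in the 1-input, 1-output case, where it is just the ordinary derivative.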
Really nice explanation. Well done!
How does it relate to the backpropagation algorithm? Maybe it is worth mentioning briefly? Thank you!
great suggestion and thanks!
Really nice explanation.
Great Video! Thank you so much!
This is an awesome explanation, thanks!
awesome vid always
Thanks!
Thanks for clear explanation. :)
This is a really good video!
best explanation on the web
I don't know why we cannot use actual values to represent the flow of the steps. Mathematics isn't abstract until we try to generalize and use all these subscripts. I understand the effort you put into these videos; I hope my suggestion can improve the understanding. Thank you
In the second and third layers, are we taking derivatives with respect to a1 and a2 and not further breaking down a1 and a2?
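For what it's worth, multiplying the layer-wise Jacobians is exactly what "breaks a1 down further": by the chain rule, dy/dx = (dy/da1) @ (da1/dx). A minimal sketch, assuming a purely linear two-layer network (my own example, not the one in the video):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two linear layers: a1 = W1 @ x, then y = W2 @ a1.
W1 = rng.standard_normal((4, 3))   # first layer,  R^3 -> R^4
W2 = rng.standard_normal((2, 4))   # second layer, R^4 -> R^2

# Layer-wise Jacobians: dy/da1 = W2 and da1/dx = W1, so the
# chain rule gives dy/dx = (dy/da1) @ (da1/dx) = W2 @ W1.
J_total = W2 @ W1

def network(x):
    return W2 @ (W1 @ x)

# For a linear map, column j of the Jacobian is network(e_j),
# so feeding in the identity matrix recovers dy/dx directly.
print(np.allclose(J_total, network(np.eye(3))))
```

With nonlinear activations the same product structure holds, except each layer's Jacobian then depends on the point at which it is evaluated.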
Thank you!
Will this work when an activation function is included? How will that look? And is it a preferred way of dealing with it?
You are just awesome!
Hey, I got confused at the very beginning... why does f(x) = x^2 lead to df(x)/dx = 2x?
Does that make sense? Sorry if I'm making a stupid mistake
This video put me out of my misery.
Bless you sir.
So clear. Congrats on the didactics.
thank you
Excellent presentation. My only request is to slow down and/or pause more. It takes my tiny brain a second to process each step. Thanks so much!
Thanks for the tip!
Jacobians are used in the Gauss–Newton method
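Indeed. For anyone curious, here is a minimal Gauss–Newton sketch (my own toy example, not from the video): fitting y = exp(a·x) by least squares, where the Jacobian of the residual vector drives each parameter update.

```python
import numpy as np

# Toy data generated from the true parameter a = 0.7.
x = np.array([0.0, 0.5, 1.0, 1.5])
y = np.exp(0.7 * x)

a = 0.0  # initial guess
for _ in range(20):
    r = np.exp(a * x) - y             # residual vector r(a)
    J = (x * np.exp(a * x))[:, None]  # Jacobian dr/da, shape (4, 1)
    # Gauss-Newton update: solve the normal equations (J^T J) da = -J^T r
    da = np.linalg.solve(J.T @ J, -J.T @ r)
    a += da.item()

print(a)  # converges toward the true value 0.7
```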
this is gold
🎉
0:01 - left hand: hail satan
Damn, I learnt about the Jacobian without even knowing the "when" and "why" of using it
Very goood
Hessian next? 🥺
Good suggestion!
this is cool. 👍
The goal is to optimize the weights of the network. So why are you taking partial derivatives with respect to x instead of w?
thanks
You're welcome!
love u
👏🏾👏🏾👏🏾👏🏾👏🏾👏🏾👏🏾👏🏾👏🏾
And my Mathematese keeps getting neater...
super
good
You are a math prophet sent by God to guide us ..
w vid
Thank you sir, are you Indian ?
How about stopping the nagging about the Jacobian and moving on to the Hessian