If I am not wrong, you have used the softmax function to normalize, i.e., to make the probabilities sum to 1, but in your examples it is
.147 + .540 + .133 + .180 = 1
.160 + .323 + .357 + .160 = 1
.188 + .118 + .715 + .079 = 1.1
Can you please help me understand the above discrepancy?
You can see that they messed up and wrote .188 instead of .088 when transferring from the softmax output to the y-hat vector. I guess they entered the y-hat values manually, resulting in the mistake.
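For anyone who wants to check the normalization themselves, here is a minimal sketch of the softmax step. The logits below are illustrative (the video's actual pre-softmax values are not quoted in this thread), though they happen to reproduce the first row above to three decimals; the point is that genuine softmax outputs always sum to 1, which is why the 1.1 row can only come from a transcription typo:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax: shift by the max, exponentiate, normalize.
    e = np.exp(z - np.max(z))
    return e / e.sum()

# Illustrative logits -- not taken from the video, just chosen so the output
# roughly matches the first row quoted in the question above.
z = np.array([1.0, 2.3, 0.9, 1.2])
p = softmax(z)
print(np.round(p, 3))            # [0.147 0.54  0.133 0.18 ]
print(round(float(p.sum()), 6))  # 1.0 -- softmax outputs always sum to 1
```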
It's a good video but good grief, encourage people to take a week-long course in linear algebra. If you keep feeding them summation symbols and indices they will never do it. It's HARDER, not easier, to spell it all out. Professor Strang's course is probably still on YouTube if you are interested. You will gain back that week by being twice as productive in the week after, not to mention the rest of your life.
Hello Lucy, thank you for your input! Always interested in comments to improve the videos. Would you suggest any additional material to address the summation issue? I learned it in high school as a prelim to proof by induction, a long time ago.
This is a hidden gem! Thank you!
Excellent video series. I love the question-and-answer format! Thanks!
Wow! Hats off to you guys, perfect in demystifying categorical cross-entropy... thank you!
Good video. The course format makes the material so easy to understand.
Thank you guys! Such a great explanation!
Very good explanation! It's been so hard to find a numerical example. Thank you guys!
Great video, thank you.
I came looking for copper and found gold!!!!
At 6:44 there is a mistake in the equation: .715 should be in the last log parenthesis instead of .357.
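For readers checking that step, here is a minimal sketch of the corrected term. It assumes, as the comment implies, that the third class is the true label for that example, uses the .088 value from the typo fix discussed above, and uses log base 2 as in the video:

```python
import numpy as np

# Softmax output for the third example, with .088 in place of the .188 typo.
y_hat = np.array([0.088, 0.118, 0.715, 0.079])

# Assumed one-hot target: we suppose the third class is the true one,
# so .715 is the value that belongs inside the last log parenthesis.
y_true = np.array([0, 0, 1, 0])

# Categorical cross-entropy for this single example, with log base 2.
loss = -np.sum(y_true * np.log2(y_hat))
print(round(float(loss), 3))  # -log2(0.715) is approximately 0.484
```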
Thanks guys!
How much is an intercontinental ballistic missile?
Why were you using logarithm base 2?
Great !
Awesome
Thank you very much for the clear explanation, love it sir.
Wonderful
Very good explanation
wowww
Brilliant
Batman and Robin