Categorical Cross-Entropy Loss Softmax

  • Published on 4 Feb 2025

Comments • 24

  • @rickragv
    @rickragv 4 years ago +14

    this is a hidden gem! Thank you!

  • @abiolaadeye2961
    @abiolaadeye2961 3 years ago +3

    Excellent Video Series. I love the question and answer format! Thanks!

  • @veganath
    @veganath 1 year ago +2

    Wow! Hats off to you guys, perfect in demystifying Categorical Cross-Entropy... thank you!

  • @bengonoobiang6633
    @bengonoobiang6633 3 years ago +1

    Good Video. The course format makes it look so easy to understand.

  •  4 years ago +3

    thank you guys! such a great explanation!

  • @g.jignacio
    @g.jignacio 4 years ago +1

    Very good explanation! It's been so hard to find a numerical example. Thank you guys!

  • @keiran110
    @keiran110 4 years ago +1

    Great video, thank you.

  • @trajanobertrandlleraromero6579
    @trajanobertrandlleraromero6579 11 months ago +1

    I came looking for copper and found gold!!!!

  • @MultiTsunamiX
    @MultiTsunamiX 4 years ago +6

    At 6:44 there is a mistake in the equation: .715 should be in the last log parenthesis instead of .357.
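
    For readers checking the arithmetic in comments like this one, here is a minimal sketch of the categorical cross-entropy the video discusses, assuming one-hot targets and the base-2 logarithm the video appears to use; the numbers below are illustrative, not the video's exact figures.

```python
import numpy as np

def categorical_cross_entropy(y_true, y_pred, eps=1e-12):
    """Loss = -sum_i y_i * log2(y_hat_i) for a one-hot target vector."""
    y_pred = np.clip(y_pred, eps, 1.0)   # guard against log(0)
    return -np.sum(y_true * np.log2(y_pred))

# Illustrative one-hot target and softmax output (not the video's exact example)
y_true = np.array([0.0, 0.0, 1.0, 0.0])
y_pred = np.array([0.160, 0.323, 0.357, 0.160])
print(categorical_cross_entropy(y_true, y_pred))  # -log2(0.357) ≈ 1.49
```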

  • @himanshuvajaria6426
    @himanshuvajaria6426 3 years ago +1

    Thanks guys!

  • @raymondchang9481
    @raymondchang9481 1 year ago

    how much is an intercontinental ballistic missile?

  • @wesuli
    @wesuli 4 years ago +1

    why were you using logarithm base 2?
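
    A possible answer to this question (not from the video itself): the base of the logarithm only rescales the loss by a constant factor, since log2(x) = ln(x) / ln(2), so the minimizer and the relative ordering of predictions are unchanged; a quick numerical check:

```python
import numpy as np

p = np.array([0.160, 0.323, 0.357, 0.160])   # example softmax probabilities quoted in a comment below
loss_base2 = -np.log2(p)    # cross-entropy terms with log base 2
loss_nat   = -np.log(p)     # the same terms with the natural log
# Each element differs by the same constant factor 1/ln(2) ≈ 1.4427
print(loss_base2 / loss_nat)
```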

  • @cedricmanouan2333
    @cedricmanouan2333 4 years ago +1

    Great!

  • @hassanmahmood7284
    @hassanmahmood7284 3 years ago +1

    Awesome

  • @rizalalfarizi9196
    @rizalalfarizi9196 4 years ago +1

    thank you very much, clear explanation, love it sir

  • @shivamgupta187
    @shivamgupta187 4 years ago +2

    if I am not wrong, you have used the softmax function to normalize, i.e. so the probabilities sum to 1,
    but in your examples it is
    .147 + .540 + .133 + .180 = 1
    .160 + .323 + .357 + .160 = 1
    .188 + .118 + .715 + .079 = 1.1
    can you please help me understand the above discrepancy?

    • @horvathbenedek3596
      @horvathbenedek3596 4 years ago

      You can see that they messed up, and wrote .188 instead of .088 when transferring from the softmax to the y-hat vector. I guess they added y-hat manually, resulting in the mistake.
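
    As this thread points out, softmax outputs always sum to 1 by construction, so a row summing to 1.1 indicates a transcription slip rather than a property of softmax; a minimal sketch with illustrative logits (not the video's values):

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax: shift, exponentiate, then normalize to sum to 1."""
    e = np.exp(z - np.max(z))
    return e / e.sum()

z = np.array([1.0, 2.0, 0.5, 0.8])   # illustrative logits
p = softmax(z)
print(p, p.sum())                    # the probabilities sum to 1.0
```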

  • @keshavkumar7769
    @keshavkumar7769 3 years ago

    wonderful

  • @RitikDua
    @RitikDua 4 years ago +1

    Very good explanation

  • @igomaur4175
    @igomaur4175 2 years ago

    wowww

  • @KuldeepSingh-cm3oe
    @KuldeepSingh-cm3oe 4 years ago +1

    Brilliant

  • @BrandonSLockey
    @BrandonSLockey 4 years ago +1

    batman and robin

  • @lucyfrye6723
    @lucyfrye6723 1 year ago

    It's a good video, but good grief, encourage people to take a week-long course in linear algebra. If you keep feeding them summation symbols and indices they will never do it. It's HARDER, not easier, to spell it all out. Professor Strang's course is probably still on YouTube if you are interested. You will gain back that week by being twice as productive in the week after, not to mention the rest of your life.

    • @mattyedlin7292
      @mattyedlin7292 1 year ago

      Hello Lucy
      Thank you for your input! Always interested in comments to improve videos. Would you suggest any additional material to address the summation issue. I learned it in high school as a prelim to proof by induction - a long time ago.