Cross-Entropy Loss Log-likelihood Perspective

  • Published Feb 9, 2025
  • This is a video that covers Cross-Entropy Loss Log-likelihood Perspective
    Attribution-NonCommercial-ShareAlike CC BY-NC-SA
    Authors: Matthew Yedlin, Mohammad Jafari
    Department of Computer and Electrical Engineering, University of British Columbia.

Comments •

  • @victorialeigh2726
    @victorialeigh2726 3 years ago +1

    I really like that you guys hand-clean the clear board. Feels like being back in the classroom, plus the math teacher's small math talk!

  • @vaibhavarora9408
    @vaibhavarora9408 4 years ago +4

    Great lectures. Thank you Matt Yedlin.

  • @yanbowang4020
    @yanbowang4020 4 years ago +3

    Love your videos, hope to see more of them.

  • @yjy8
    @yjy8 3 years ago +1

    You just cleared the doubt about likelihood and log-likelihood that I'd had for the past year and a half.
    Thank you so much

  • @GenerativeDiffusionModel_AI_ML
    @GenerativeDiffusionModel_AI_ML 3 years ago +1

    Well explained using human language.

  • @davidlearnforus
    @davidlearnforus 2 years ago

    Hi, thank you so much! I am a self-learner without much formal background. Can you please explain how SUM p_i log q_i is entropy, since it does not have a minus sign? If it were log(1/q_i) we would get a minus sign out of it, but it's not. I'm stuck there...
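
    The sign puzzle in the comment above can be checked numerically: every log q_i is at most 0 (since 0 < q_i <= 1), so the plain sum Σ p_i log q_i is non-positive, and cross-entropy is conventionally defined with a leading minus, H(p, q) = -Σ p_i log q_i, so the value comes out non-negative. A minimal sketch, with made-up distributions p and q chosen only for illustration:

    ```python
    import math

    # Hypothetical distributions for illustration: p = true, q = predicted.
    p = [0.5, 0.25, 0.25]
    q = [0.4, 0.4, 0.2]

    # Each log(q_i) is <= 0, so the plain sum is non-positive...
    plain_sum = sum(pi * math.log(qi) for pi, qi in zip(p, q))

    # ...and the conventional cross-entropy carries a leading minus sign:
    cross_entropy = -plain_sum

    print(plain_sum)      # negative
    print(cross_entropy)  # positive
    ```

    Writing log(1/q_i) instead of log q_i, as the comment suggests, is exactly the same fix: log(1/q_i) = -log q_i, so Σ p_i log(1/q_i) equals -Σ p_i log q_i.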

  • @AChadi-ug9pg
    @AChadi-ug9pg 4 years ago +2

    Muhammad is a good student

  • @GenerativeDiffusionModel_AI_ML
    @GenerativeDiffusionModel_AI_ML 3 years ago

    I want to know how you can write on the mirror and record it.

    • @mattyedlin7292
      @mattyedlin7292  3 years ago +1

      The camera sees the writing flipped through a mirror.

  • @TylerMatthewHarris
    @TylerMatthewHarris 4 years ago

    thanks!

  • @peizhiyan2916
    @peizhiyan2916 4 years ago

    Nice

  • @jimbobur
    @jimbobur 2 years ago +1

    *(EDIT: Solved it, see comment reply)*
    I don't follow how you go from the numerical example, where the likelihood is a product of the probabilities q_i each raised to the number of times the corresponding outcome occurs, to the algebraic expression for the likelihood where you take the product of q_i raised to N * p_i (or is that N_p_i? I'm a little unsure whether the p_i is a subscript of the N or multiplied by it).

    • @jimbobur
      @jimbobur 2 years ago +2

      I worked it out. The key is that the number of times outcome i (with probability p_i) occurs can be found by rearranging the definition p_i = N_p_i / N into N_p_i = N * p_i. Substituting this into the likelihood in the general form that follows from the numerical example,
      L = Π q_i ^ (N_p_i),
      gives
      L = Π q_i ^ (N * p_i)
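
      The identity in the reply above is easy to sanity-check numerically. A minimal sketch with made-up counts and model probabilities (N, counts, and q are illustrative, not from the video):

      ```python
      import math

      # Hypothetical data: N observations, counts[i] = N_p_i = times outcome i occurred,
      # q[i] = model probability assigned to outcome i.
      N = 8
      counts = [4, 2, 2]
      q = [0.5, 0.3, 0.2]

      # Empirical probabilities p_i = N_p_i / N.
      p = [c / N for c in counts]

      # Likelihood from raw counts: L = prod over i of q_i ** N_p_i.
      L_counts = math.prod(qi ** c for qi, c in zip(q, counts))

      # Same likelihood with the exponent rewritten as N * p_i.
      L_freq = math.prod(qi ** (N * pi) for qi, pi in zip(q, p))

      print(math.isclose(L_counts, L_freq))  # True
      ```

      Taking log of either form and dividing by N recovers Σ p_i log q_i, which is how the likelihood connects back to the cross-entropy in the video.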

  • @garrettosborne4364
    @garrettosborne4364 3 years ago

    Can the old guy.