Uncertainty Quantification and Deep Learning | Elise Jennings, Argonne National Laboratory

  • Published Dec 21, 2024

Comments • 16

  • @EigenA
    @EigenA 2 years ago +4

    Love how she handled the questions in the middle of the presentation. Great work on the research too!

    • @jijie133
      @jijie133 1 year ago

      me too.

  • @a2002
    @a2002 2 years ago +2

    Great presentation. Can we get a copy of the code or the github link? Thank you

  • @corentink3887
    @corentink3887 3 years ago +4

    Good presentation, do we have access to the code?

  • @jiongwang7645
    @jiongwang7645 9 months ago

    At around 10:00, last line, it should be an integration over theta, correct?
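
    For reference, the posterior predictive that the slide is presumably showing does integrate over theta (the network weights), weighted by their posterior:

        p(y^* \mid x^*, \mathcal{D}) = \int p(y^* \mid x^*, \theta) \, p(\theta \mid \mathcal{D}) \, d\theta

    so an integration over theta is indeed what one would expect there.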

  • @KarriemPerry
    @KarriemPerry 1 year ago

    Outstanding presentation!!

  • @nickrhee7178
    @nickrhee7178 11 months ago

    I guess the size of the uncertainty will depend on the dropout rate. How can I determine the optimal dropout rate?
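
    A minimal sketch of how the dropout rate could be chosen for MC dropout, assuming a small PyTorch regression model; the architecture, the candidate rates, the toy data, and the validation Gaussian NLL used for selection are all illustrative assumptions, not taken from the talk:

        import torch
        import torch.nn as nn

        def make_model(p_drop: float) -> nn.Module:
            # dropout sits between hidden layers so every stochastic pass perturbs the features
            return nn.Sequential(
                nn.Linear(1, 64), nn.ReLU(), nn.Dropout(p_drop),
                nn.Linear(64, 64), nn.ReLU(), nn.Dropout(p_drop),
                nn.Linear(64, 1),
            )

        def mc_predict(model: nn.Module, x: torch.Tensor, T: int = 100):
            model.train()  # keep dropout active at inference time (MC dropout)
            with torch.no_grad():
                samples = torch.stack([model(x) for _ in range(T)])  # shape (T, N, 1)
            return samples.mean(0), samples.std(0)  # predictive mean, epistemic std

        # toy data: y = sin(x) + noise, with a separate validation set
        x = torch.linspace(-3, 3, 200).unsqueeze(1)
        y = torch.sin(x) + 0.1 * torch.randn_like(x)
        x_val = torch.linspace(-3, 3, 50).unsqueeze(1)
        y_val = torch.sin(x_val) + 0.1 * torch.randn_like(x_val)

        best = None
        for p in (0.05, 0.1, 0.2, 0.5):  # candidate dropout rates
            model = make_model(p)
            opt = torch.optim.Adam(model.parameters(), lr=1e-2)
            for _ in range(500):  # short training loop, enough for a sketch
                opt.zero_grad()
                nn.functional.mse_loss(model(x), y).backward()
                opt.step()
            mu, sigma = mc_predict(model, x_val)
            # score the predictive distribution on held-out data; lower NLL is better
            nll = nn.functional.gaussian_nll_loss(mu, y_val, sigma.clamp_min(1e-3) ** 2).item()
            if best is None or nll < best[0]:
                best = (nll, p)
        print("dropout rate chosen by validation NLL:", best[1])

    Grid search like this is only the simplest baseline; Concrete Dropout (Gal, Hron and Kendall, 2017) learns the rate during training instead.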

  • @masisgroupmarinesoftintell3299
    @masisgroupmarinesoftintell3299 3 years ago +1

    How do you interpret the uncertainties in the predictions?
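
    One common way to read MC-dropout outputs, assuming T stochastic forward passes f_{\hat\theta_t}(x) and an estimate \sigma^2 of the observation noise (both assumptions here, not something taken from the talk), is to report the sample mean as the prediction and split the predictive variance into aleatoric and epistemic parts:

        \hat{y}(x) = \frac{1}{T} \sum_{t=1}^{T} f_{\hat\theta_t}(x),
        \qquad
        \operatorname{Var}[y \mid x] \approx
            \underbrace{\sigma^2}_{\text{aleatoric}}
            + \underbrace{\frac{1}{T} \sum_{t=1}^{T} f_{\hat\theta_t}(x)^2 - \hat{y}(x)^2}_{\text{epistemic}}

    The epistemic term grows on inputs unlike the training data (the model is unsure of its weights), while the aleatoric term reflects noise that more data of the same kind would not remove; \hat{y} \pm 2\sqrt{\operatorname{Var}[y \mid x]} then gives a rough 95% predictive interval.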

  • @alexandterfst6532
    @alexandterfst6532 3 years ago +2

    that was an excellent explanation

  • @saderick52
    @saderick52 1 year ago

    I feel there is a big gap between the lecture and the audience. Variational inference is a pretty complicated process by itself. It's difficult to introduce BNNs without talking about how variational inference works.
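
    For readers feeling that gap, the one-line summary: variational inference replaces the intractable weight posterior p(\theta \mid \mathcal{D}) with a tractable family q_\phi(\theta) and fits \phi by maximizing the evidence lower bound (ELBO),

        \mathcal{L}(\phi) = \mathbb{E}_{q_\phi(\theta)}\!\left[ \log p(\mathcal{D} \mid \theta) \right]
            - \mathrm{KL}\!\left( q_\phi(\theta) \,\|\, p(\theta) \right)

    so training a Bayesian neural network with Bayes by Backprop looks like ordinary backpropagation on a noisy loss: an expected data-fit term plus a KL penalty pulling q_\phi toward the prior.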

  • @ivotavares6576
    @ivotavares6576 3 years ago

    This was a really interesting presentation!!!

  • @charilaosmylonas5046
    @charilaosmylonas5046 3 years ago +2

    13:31 - It's a really interesting mistake that she mixes up "Laplace" (which is the distribution she actually wanted to say) with the Poisson distribution! It has to do with PDEs: the Laplace PDE is the homogeneous version of the Poisson PDE! hehe (I could easily make the same mistake)

    • @siddhantrai7529
      @siddhantrai7529 2 years ago

      Hi Charilaos,
      Could you please describe how L1 corresponds to Poisson, as she mentioned, and how Laplace, as you mentioned, is the correction of that? I can see why L2 and a normal prior make sense, but for L1 I feel a bit clueless. I would really appreciate your guidance on this. Thank you

    • @charilaosmylonas5046
      @charilaosmylonas5046 2 years ago +2

      @@siddhantrai7529 Check any reference on the "Bayesian interpretation of regularization" - I answered before, but the comment seems to have disappeared for some reason! Also, note the exp(-|X|^2) dependence in the Gaussian PDF versus the exp(-|X|^1) dependence in the Laplace (not Poisson!) PDF. There is a "Poisson" distribution, but it's not relevant to L1 regularization! She makes an honest mistake because of the connection between Poisson and Laplace in the diffusion PDEs (there is also a Poisson PDE and a Laplace PDE - that's what my comment was about!). A worked version of this correspondence follows after this thread.

    • @siddhantrai7529
      @siddhantrai7529 2 years ago

      @@charilaosmylonas5046 Thank you for the reply, it makes sense now. I will definitely look into the "Bayesian interpretation of regularization" as you mentioned. Thanks again. 😁😁
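
    A compact worked version of the point made in this thread: MAP estimation with an i.i.d. prior on the weights turns the negative log-prior into the regularizer, a Gaussian prior giving L2 (weight decay) and a Laplace prior giving L1:

        \hat{w}_{\mathrm{MAP}} = \arg\max_w \; \log p(\mathcal{D} \mid w) + \log p(w)

        p(w_i) \propto e^{-w_i^2 / 2\sigma^2}
            \;\Rightarrow\; -\log p(w) = \tfrac{1}{2\sigma^2} \lVert w \rVert_2^2 + \text{const}
            \qquad \text{(Gaussian prior, L2)}

        p(w_i) \propto e^{-\lvert w_i \rvert / b}
            \;\Rightarrow\; -\log p(w) = \tfrac{1}{b} \lVert w \rVert_1 + \text{const}
            \qquad \text{(Laplace prior, L1)}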

  • @michaelsprinzl9045
    @michaelsprinzl9045 2 years ago

    "How do you parameterize a distribution?" Answer: "Like you parameterize every distribution." OK, I got it.