Love how she handled the questions in the middle of the presentation. Great work on the research too!
Me too.
Great presentation. Can we get a copy of the code or the GitHub link? Thank you!
Good presentation. Do we have access to the code?
At around 10:00, the last line should be an integration over theta, correct?
Outstanding presentation!!
I guess the size of the uncertainty will depend on the dropout rate. How can I determine the optimal dropout rate?
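For context, the uncertainty from dropout is usually estimated with MC dropout: keep dropout active at test time, run several stochastic forward passes, and take the spread of the predictions as the uncertainty. A minimal sketch in PyTorch (the toy model and the rate p = 0.2 are just placeholders for illustration):

import torch
import torch.nn as nn

def mc_dropout_predict(model, x, T=100):
    model.train()  # keep dropout layers active at inference time
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(T)])  # T stochastic passes
    return preds.mean(0), preds.std(0)  # predictive mean and spread (uncertainty)

# Toy regressor; a larger p widens the spread of the T passes, which is why the
# dropout rate is usually tuned on validation likelihood/calibration rather than fixed a priori.
p = 0.2  # try 0.05 vs 0.5 and compare the returned std
model = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Dropout(p), nn.Linear(64, 1))
mean, std = mc_dropout_predict(model, torch.randn(8, 1))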
How do you interpret the uncertainties in the predictions?
That was an excellent explanation.
I feel there is a big gap between the lecture and the audience. Variational inference is a pretty complicated process by itself, and it's difficult to introduce BNNs without talking about how variational inference works.
This was a really interesting presentation!!!
13:31 - It's a really interesting mistake that she mixes up the Laplace distribution (which is the one she meant) with the Poisson distribution! It has to do with PDEs: the Laplace PDE is the homogeneous version of the Poisson PDE! hehe (I could easily make the same mistake)
Hi Charilaos,
Could you please describe how L1 corresponds to Poisson, as she mentioned, and how the Laplace, as you mentioned, is the correction? I can understand why L2 and the normal distribution go together, but for L1 I feel a bit clueless. I would really appreciate your guidance on this. Thank you!
@@siddhantrai7529 Check any reference on "Bayesian interpretation of regularization" - I answered before but the comment seems to have disappeared for some reason! Also, note the exp(−|x|²) dependence in the Gaussian PDF and the exp(−|x|) dependence in the Laplace (not Poisson!) PDF. There is a "Poisson" distribution, but it's not relevant to L1 regularization! She makes an honest mistake because of the connection between Poisson and Laplace in the diffusion PDEs (there is also a Poisson PDE and a Laplace PDE - that's what my comment was about!).
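To make the connection concrete, here is the standard MAP sketch (λ is just the prior scale):

w_MAP = argmax_w [ log p(D|w) + log p(w) ]
Gaussian prior p(w) ∝ exp(−λ‖w‖₂²)  ⇒  penalty λ‖w‖₂²  (L2 / ridge)
Laplace prior  p(w) ∝ exp(−λ‖w‖₁)   ⇒  penalty λ‖w‖₁   (L1 / lasso)

So maximizing the posterior is the same as minimizing the loss plus the corresponding penalty.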
@@charilaosmylonas5046 Thank you for the reply, it makes sense now. For sure, I will look into "Bayesian interpretation of regularization" as you mentioned. Thanks again. 😁😁
"How do you parameterize a distribution?" Answer: "Like you are parameterize every distribution". Ok I got it.