At 50:06, the way the equation is written is unusual. One way to write it is: P(w=1 | s1=0, s2=0) = P(s1=0, s2=0 | w=1)·P(w=1) / [P(s1=0, s2=0 | w=1)·P(w=1) + P(s1=0, s2=0 | w=0)·P(w=0)]. The calculation itself is correct.
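To make the arithmetic easy to check, here is the same computation in Python. The numbers below are placeholders, not the values used in the video; substitute the actual priors and likelihoods from the example at 50:06.

```python
# Placeholder values -- swap in the priors and likelihoods actually
# used in the example at 50:06.
p_w1 = 0.5                    # prior P(w=1)
p_w0 = 1 - p_w1               # prior P(w=0)
lik_w1 = 0.1 * 0.1            # P(s1=0, s2=0 | w=1), assuming s1 and s2
lik_w0 = 0.9 * 0.9            # are conditionally independent given w

# Bayes' rule, with the denominator expanded by total probability
posterior = lik_w1 * p_w1 / (lik_w1 * p_w1 + lik_w0 * p_w0)
print(posterior)              # P(w=1 | s1=0, s2=0)
```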
I was drawn to these videos because I wanted to see what PyMC is doing under the hood, for example, with respect to Gibbs sampling. I was following the presentation just fine until I got to the big messy conditional probability expression for tau, at which point I hit a brick wall. Chris Fonnesbeck indicates you "need to do a little mathematical wrangling" to derive this categorical distribution. Where should I look to find out how the "wrangling" is done? I haven't read Gelman, but I own it, and it is mentioned several times in this conference lecture. Any tips on what I ought to read to understand the details of Gibbs sampling (preferably not entire books) would be much appreciated.
I'll reply with "MCMC is not easy". If you want to understand Gibbs sampling, you should (briefly) look at Markov chains and read Casella and George's paper "Explaining the Gibbs Sampler". It should be noted that the Gibbs sampler is just a special case of the more general Metropolis-Hastings algorithm. If you really want to understand what's going on "under the hood" of MCMC methods, then you'll need to beef up on your math (not to be rude). It takes a good level of punishment to really understand some of this material.
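As a concrete illustration of the "wrangling": in the standard Poisson changepoint model for the coal-mining data, the full conditional for tau comes from writing out the joint density and keeping only the factors that involve tau, which leaves p(tau = t | lambda1, lambda2, y) proportional to the product of Poisson(y_i; lambda1) for i < t times the product of Poisson(y_i; lambda2) for i >= t. Normalizing over the n possible values of t gives a categorical distribution. A minimal sketch, assuming that standard model (not necessarily the exact parameterization used in the lecture):

```python
import numpy as np
from scipy.stats import poisson

def tau_full_conditional(y, lam1, lam2):
    # p(tau = t | lam1, lam2, y) is proportional to the likelihood of the
    # data with rate lam1 before the changepoint and lam2 from it onward
    n = len(y)
    logp = np.array([
        poisson.logpmf(y[:t], lam1).sum() + poisson.logpmf(y[t:], lam2).sum()
        for t in range(n)
    ])
    logp -= logp.max()              # avoid underflow before exponentiating
    p = np.exp(logp)
    return p / p.sum()              # normalize: a categorical distribution

# One Gibbs step for tau, given current values of lam1 and lam2:
rng = np.random.default_rng(0)
y = rng.poisson(3.0, size=60)       # fake data standing in for the coal series
y[40:] = rng.poisson(1.0, size=20)  # rate drops after index 40
tau = rng.choice(len(y), p=tau_full_conditional(y, lam1=3.0, lam2=1.0))
print(tau)                          # should land near 40
```

A full Gibbs sweep would alternate this categorical draw with draws of lambda1 and lambda2 from their (gamma) full conditionals; each conditional draw is a Metropolis-Hastings proposal that is always accepted, which is the sense in which Gibbs is a special case.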
A little theory goes a long way. Finally, (introductory) Bayesian theory for the layperson from a real statistician.
I can't read the notebook. How can I view the code for the coal mining disasters problem?
The negative binomial distribution is wrong @56:15; it should be a binomial coefficient, not a ratio.
For other readers: en.wikipedia.org/wiki/Negative_binomial_distribution
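For reference, one common parameterization (X = number of failures k before the r-th success, with success probability p) does carry a binomial coefficient:

```latex
P(X = k) \;=\; \binom{k + r - 1}{k} \, p^{r} \, (1 - p)^{k},
\qquad k = 0, 1, 2, \ldots
```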
I couldn't find the link to the material on GitHub. Could somebody post the link?
github.com/fonnesbeck/scipy2014_tutorial
@Ben Vincent Thanks!