I think the biggest problem with this video is that the presenter is using a laser pointer we can't see. So he says "this is this and this is that" and we never know which elements on the slides to follow. There were also technical problems with things not showing up on the slides.
It would be nice to see a video on nested sampling. I've read some literature and coded some models in R, but I'm still a little shaky on some things (e.g., dynamic nested sampling, extracting parameter samples proportional to posterior density [I'm getting good estimates with ±sd, but would prefer output similar to MCMC]). Throwing it out there since there are not a ton of resources, and because of the tease at the beginning :)
28:06 The presenter should mention that he cannot compute the posterior itself, only the ratio between the trial point and the previous one. Otherwise it can confuse people: since it looks like you have computed P, why bother?
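To make the point above concrete: Metropolis only ever needs the *ratio* of posterior densities, so the unknown evidence Z cancels and never has to be computed. A minimal Python sketch (my own toy example, not from the talk), fitting the mean of Gaussian data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 50 noisy measurements of a constant whose true value is 2.0.
data = rng.normal(loc=2.0, scale=1.0, size=50)

def log_unnorm_posterior(mu):
    """log(prior x likelihood) -- note the evidence Z never appears."""
    if not (-10.0 < mu < 10.0):              # flat prior on [-10, 10]
        return -np.inf
    return -0.5 * np.sum((data - mu) ** 2)   # Gaussian log-likelihood, sigma = 1

# Metropolis: accept with probability min(1, P_trial / P_current).
# Both densities carry the same unknown 1/Z factor, so Z cancels in the ratio.
mu, chain = 0.0, []
for _ in range(5000):
    trial = mu + rng.normal(scale=0.5)
    log_ratio = log_unnorm_posterior(trial) - log_unnorm_posterior(mu)
    if np.log(rng.uniform()) < log_ratio:
        mu = trial
    chain.append(mu)

burn_in = 500
print(np.mean(chain[burn_in:]))  # close to 2.0 after discarding burn-in
```

So "why bother" has a clean answer: you never computed P, only an unnormalized density, and the ratio is all the algorithm needs.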
This is definitely not a beginner's guide. I know a little bit about Bayes' theorem, priors, posteriors, etc., but Dr. Kipping ensured that I had to open up the books to sort out the mess he created in my mind. Definitely won't recommend it to anyone.
Too many conceptual jumps, leaving big blanks. Something as important and as central to Bayesian analysis as how to define the likelihood function is left unexplained. And at 27:35 he says that Metropolis does not consider the evidence!! Absurd!! It is precisely in the definition of the likelihood that the evidence is considered.
The way to teach MCMC should start from the definition of each letter of the acronym and explain how it computes the posterior, not messy terminology mixed with an extremely personal viewpoint on MCMC'ers. I have been in similar situations, and I know what the worst kind of teaching looks like.
I had a question about the plots where you show the jumps trying to find the "green zones". The axis labels are "a" and "b", with a_min and a_max as the range for variable a, and b_min and b_max for variable b. Are a and b some sort of high-dimensional space variables, like principal components? Basically, what are these variables you are plotting against to visualize the jumps/path the MCMC algorithm takes?
Lillian R Ashmore, the variables are the parameters of the model, theta. A and B are two model parameters, but there will likely be many others. He's just offering a simpler scenario.
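To visualize what the reply above describes, here is a hypothetical two-parameter sketch (the names a and b, their flat prior ranges, and the toy target are mine, just matching the slide labels): a Metropolis walker jumping around the (a, b) plane until it settles in the high-posterior "green zone".

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 2D log-posterior over two model parameters (a, b);
# in the talk these stand in for any two components of theta.
A_TRUE, B_TRUE = 1.0, -0.5

def log_post(a, b):
    # Flat priors: [a_min, a_max] x [b_min, b_max], here both (-5, 5)
    if not (-5.0 < a < 5.0 and -5.0 < b < 5.0):
        return -np.inf
    return -0.5 * ((a - A_TRUE) ** 2 + (b - B_TRUE) ** 2)

a, b = 0.0, 0.0
path = [(a, b)]
for _ in range(5000):
    a_try, b_try = a + rng.normal(scale=1.0), b + rng.normal(scale=1.0)
    if np.log(rng.uniform()) < log_post(a_try, b_try) - log_post(a, b):
        a, b = a_try, b_try
    path.append((a, b))          # the jump path you would plot in the (a, b) plane

path = np.array(path)
print(path[500:].mean(axis=0))   # concentrates near (A_TRUE, B_TRUE)
```

Plotting `path[:, 0]` against `path[:, 1]` reproduces the kind of jump/path figure in the slides, just in two of the model's dimensions.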
Too many unnecessary details and no clear logical presentation. The slides are hard to follow; it just makes people who already understand MCMC more confused.
I think this is a talk designed for people who have just started a PhD in astrophysics. We use a very strict definition of the word 'beginner' in astrophysics...
I'm sure this dude is very smart, intelligent, and successful, but honestly he is a terrible explainer of complex material. Nervous laughs while pointing a laser at a written PDF. Ugh!
Best illustration of the burn-in period!
Thank you for your help!
9:36 posterior
16:11 Bayes' theorem
29:18 Metropolis rule
40:35 walker(emcee)
6:15 skip to MCMC
You are my hero. =)
You are the best.
MCMC starts 6:17
He conveys a great point, and I understand that rigor is not the point of this video, but the statistician in me was screaming most of the time.
Quite clear and clever explanation. Thanks a lot for sharing.
You are welcome!
This was super helpful. Thanks!
Glad it was helpful!
Very clear and helpful lecture! Thank you!
is there a video that discusses which situations are best for this approach
Video starts from @6:15
31:10 What about the case (Metropolis rule) where P_trial = P_i? Do we accept the trial, or set theta_{i+1} = theta_i?
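For what it's worth, under the standard Metropolis rule the acceptance probability is min(1, P_trial / P_i), so if P_trial = P_i the ratio is exactly 1 and the trial point is always accepted (theta_{i+1} = theta_trial). A tiny sketch, assuming that standard rule:

```python
import numpy as np

def metropolis_accept(p_trial, p_current, rng):
    """Standard Metropolis rule: accept with probability min(1, p_trial / p_current)."""
    return rng.uniform() < min(1.0, p_trial / p_current)

rng = np.random.default_rng(0)

# Equal densities: the ratio is exactly 1, so the trial is always accepted.
print(all(metropolis_accept(0.7, 0.7, rng) for _ in range(1000)))  # True

# Lower density: accepted only sometimes (with probability ~0.5 here).
accepts = sum(metropolis_accept(0.35, 0.7, rng) for _ in range(10000))
print(accepts / 10000)  # roughly 0.5
```

So ties never stall the walker; only downhill moves can be rejected.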
Laplace started Bayesian statistics at least as much as Bayes!
Great lecture ! really nice introduction to MCMC
The main concern is that the presenter doesn't know how to deliver his points. Hope this feedback helps.
Do you know a beginners guide to Gibbs Sampling?
Stopping after 14 minutes, this is a garbled mess.
How do you calculate the odds ratio in Bayesian ordered logistic regression? Please tell me.
This is not for beginners. You did not explain anything; you just keep saying "Monte Carlo does this or that" without ever explaining it.
People on YouTube think "MCMC for beginners" means "MCMC for those who missed all the math classes", smh.
@yevgenydevine haha
You tried; anyway, it's difficult ;)
Presentation PDF download:
github.com/davidkipping/sagan2016/blob/master/MCMC.pdf