Wow, the best explanation and example I've ever seen ^^ Fantastic.
Excellent
Exactly. Those patient-disease examples were driving me nuts.
This is by far the most accessible explanation of Bayes' theorem. Well done, Brandon!
Loved the analogies with real life philosophies, brilliant!
There are never lines at the men's room.
haha.
lol
Unless it's cocaine.
So the probability of this sample being true is 0%, hahaha
You have never been to developer conferences :)
Best explanation of Bayes theorem I have seen. Fantastic teaching.
This is the most accessible explanation about Bayesian Inference. Thank you Brandon for the time taken to prepare this video. You rock !
Brandon I just want to tell you that you are a fantastic teacher.
Thank you very much Shirshanya. That is a huge compliment. I'm honored.
Please make more statistics videos! I have suggested your channel to my biostat teacher.
I know the video is old but I have to agree with the pinned comment. I already knew Bayes' theorem, but as I don't use it often, I have to constantly refresh the details in my mind. The YouTube algorithm recommended this video and it's hands down the best I have ever watched.
Thank you. I really appreciate that.
Wow! One of the clearest explanations of Bayes Theorem I’ve come across!
Thanks!
I don't know what to say; I'm a computer science student and I have never seen an explanation better than this... thank you veryyyyy much
Thank you for this excellent explanation. You are a patient and well-spoken teacher.
The way you connect things with appropriate, easy-to-follow examples... Amazing...
This is the first time I've felt like I've actually understood this... and it's such a simple concept! Thank you!
Excellent explanation! This is the manner in which mathematics should be explained, with cases of practical applicability. Good job, Mr. Brandon!
Thanks Razvan. I'm so happy you enjoyed it.
For 5 years I kept Bayes aside; you are the guru in teaching this stuff. God bless you Brandon
Hey man, this is the best explanation of conditional probability I've ever heard.
Thanks!
Definitely the best explanation of the theorem, told in an easily understandable way, that I could find on the internet...
EXCELLENT EXPLANATION!!! I am learning graphical modeling and a lot of these concepts were a bit unclear to me. Examples given here are absolutely to the point and demystify a lot of concepts. Thank you and looking forward to more videos.
When your teacher doesn't make sense, you have to go through teaching videos online; that's how I came across this one...
Lucky, lucky, lucky! Thank you, sir!
This is the best explanation yet; it helped me get a greater intuitive sense of Bayesian inference.
Yes, it was great. It feels like running into Feigenbaum's math or similar.
Stop looking for a decent tutorial... this one is the best!
Thank you. Your video has been of great help. I have tried different resources to wrap my head around Bayes' theorem and always got knocked out at the front door. Excellent explanation.
This is fantastic. Thank you so much! I have been exposed to BT before, but had never understood it. As sad as it sounds, I didn't realize it was composed of joint probability, which is composed of conditional probability and marginal probability. Conditional probability, joint probability, and Bayes' theorem all just looked the same. This really helped clarify things for me.
I must say, this is the best explanation of Bayes' theorem I have ever seen..... PERFECT!!!!!
Great example! Very easy to follow and understand. On a side note: I showed your video to my students and some of them objected rather "emphatically".
They said it was too sexist. Crazy times we live in.... Instead of math and statistics, they wanted to discuss gender roles and stereotypes in a Stat class. Gosh!
Thanks John! I agree with your students. When I watch this now, I cringe. I definitely need to re-do it with a better example, one that doesn't reinforce outdated gender norms.
@@BrandonRohrer No....Please, do not follow the mad crowds... this is an innocent, simple math example. People are getting crazy and finding excuses to feel offended and start meaningless fights!
So sad. Long hair and standing in the women's restroom line, and we can't even use Bayes' theorem to assume it's a woman 😂
This was a very intuitive explanation; man, do more!
The knowledge that Bayes was a theologian and that his theory requires at least some belief or faith in improbable things earns a "Well played, Mr. Bayes" slow clap. I've been enjoying your videos Brandon, thanks for keeping things approachable!
I'm not sure that Bayesian epistemology requires belief in improbable things. I love this video, but I think that's an overstatement. I do think it requires us to be open to the possibility that improbable things may be true. It does not require me to have faith in anything improbable, but rather to proportion my beliefs to the evidence (probabilities), which is the antithesis of faith, while accepting the possibility of being wrong. To accept as true something that is improbable is intellectually irresponsible and lacks due caution and humility. But to withhold belief (proportionally to the evidence) from improbable things is intellectually responsible and does not exclude being open to surprise, to the possibility that something improbable is true. I don't think Bayesian epistemology intrinsically expects you to hold some improbable thing as true (faith). Abstinence from faith is acceptable in all cases as long as the possibility of error is operative. This suggestion that it's necessary in Bayesian epistemology to believe something that is improbable was the only sloppy part of the video, no? I'm open to correction…
I think you guys got it the opposite way: the video was trying to say to be open to believing the improbable things that come from the data (evidence), rather than only holding on to a prior belief.
I have viewed many explanations of Bayes' rule, but this is no doubt the best! Thanks Brandon
I have been struggling with Bayesian inference and your tutorial makes it so easy to understand! Thank you! Keep up the good work.
I'm very happy to hear it Fahad. Thanks.
I had been reading about Bayes' theorem for months! And this is the first time I understand the concept!! Wow!! Such an amazing way of teaching!!
I'm so happy to hear it Taghreed. That was exactly my hope.
I wish everyone taught like this. Your presentation was awesome. Thank you
Great explanation and simplification of a difficult concept. The three quotations at the end are poetic and purposeful. Thanks
I found them surprisingly relevant too. Thanks Sridhar.
Thanks for the excellent presentation! One question though:

At 17:49, P(m=[13.9, 14.1, 17.5] | w=17) is factorized as follows:

P(w=17 | m=[13.9, 14.1, 17.5])
= P(m=[13.9, 14.1, 17.5] | w=17)
= P(m=13.9 | w=17) * P(m=14.1 | w=17) * P(m=17.5 | w=17)

Then at 20:47, P(m=[13.9, 14.1, 17.5] | w=17) * P(w=17) is expanded into:

P(w=17 | m=[13.9, 14.1, 17.5])
= P(m=[13.9, 14.1, 17.5] | w=17) * P(w=17)
= P(m=13.9 | w=17) * P(w=17) *
  P(m=14.1 | w=17) * P(w=17) *
  P(m=17.5 | w=17) * P(w=17)

How do you get that P(m=[13.9, 14.1, 17.5] | w=17) * P(w=17) is equal to P(m=13.9 | w=17) * P(m=14.1 | w=17) * P(m=17.5 | w=17) * P(w=17)^3? Thanks in advance!
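For anyone stuck on the same step: in the standard formulation the prior enters the product only once, P(w | m) ∝ P(w) * Π P(m_i | w), not once per measurement. A minimal numeric sketch of that version, assuming a normal measurement model with a 1 lb spread and a 14.2 lb prior (both assumptions, not values stated in this thread):

```python
import numpy as np
from scipy.stats import norm

measurements = [13.9, 14.1, 17.5]
w = 17.0      # candidate true weight
spread = 1.0  # assumed measurement spread

# Independent measurements: the joint likelihood is a product of
# one Gaussian density per measurement.
likelihood = np.prod([norm.pdf(m, loc=w, scale=spread) for m in measurements])

# The prior is applied once to the whole product, not once per factor.
prior = norm.pdf(w, loc=14.2, scale=1.0)  # assumed prior
unnormalized_posterior = likelihood * prior
```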
I hate stats because of those things... my teacher was teaching us utility analysis and said, "satisfaction is measured in utils." To that I asked, "tell me how satisfied you are with your job, and answer in the form: n utils..."
I still haven't gotten an answer!
0:40 Bayes wrote 2 books, one about theology and one about probability.
He planned to write a third book inferring the existence / non-existence of God with probability (likelihood distribution = humanity, prior distribution = miracles!).
How do the shape and mean of our prior distribution determine the best end belief?
Our prior knowledge is that the puppy weighed 14.2 lbs last time and that weight changes aren't noticeable.
What if the mean of our prior distribution is chosen to be larger/smaller than 14.2 lbs?
And what if we take that distribution to be narrower/broader (e.g., more or less than 1 lb std)?
As an extreme case, what if we first guess that the puppy's weight is *exactly* 14.2 lbs?
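One way to explore those questions numerically is to put the prior on a grid and vary its mean and width. With a very narrow prior (approaching "exactly 14.2 lbs"), the posterior stays pinned near 14.2 regardless of the scale readings; with a vague prior, it follows the measurements. A rough sketch, with assumed example values throughout:

```python
import numpy as np
from scipy.stats import norm

w = np.linspace(10, 20, 1001)        # grid of candidate weights
measurements = [13.9, 14.1, 17.5]    # assumed scale readings

def posterior_mode(prior_mean, prior_std):
    p = norm.pdf(w, prior_mean, prior_std)   # prior on the grid
    for m in measurements:
        p *= norm.pdf(m, w, 1.0)             # assumed 1 lb measurement spread
    return w[np.argmax(p)]                   # most probable weight

print(posterior_mode(14.2, 1.0))    # moderate prior
print(posterior_mode(14.2, 0.01))   # near-certain prior: stays ~14.2
print(posterior_mode(14.2, 10.0))   # vague prior: tracks the measurements
```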
I am from Brazil. What a fantastic explanation!
Thanks Jose! Welcome to the channel
Thank you for this excellent presentation!
I was looking for intuitive content to introduce me to the essence of Bayes' theorem in statistics; thanks for this. Luckily I found your blog about machine learning and robotics. That's all I wanted under one roof: robotics, data science, and machine learning.
Amazing explanation and graphics!
Thanks!
Excellent explanation. The 15:20 mark and beyond is when everything really started to come together. Also, thanks for deriving the formula at the 7:10 mark.
Brandon, GREAT explanations!! I am taking a "Math for Data Sciences" class and had been flying through it until the final week and Bayes' theorem. Achk...... It was poorly explained and very confusing. I was going to drop the class as I just couldn't get it. After watching your YouTube explanation I am excited about the possibilities and understand the way it works - cool stuff! Thank you for all you do!!!
This was the best explanation of Bayes I've ever heard; I had such a hard time wrapping my head around it from other sources.
Best video I've found, with all the information that I needed in one place.
Thanks.
Thank you for the video, it helped me understand the concept of Bayesian inference.
The concept is simple. In a nutshell, you have an idea about what the quantity is and then you use the measurements to sharpen your assumption.
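In symbols, that nutshell is a single line: for a quantity w and measurements m,

P(w | m) ∝ P(m | w) * P(w)

i.e., the prior idea P(w) of what the quantity is gets reweighted by how well each candidate value of w explains the measurements, and the result is renormalized into the sharpened belief.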
You're so much better than my Statistics teacher, thank you so much for this explanation!
Thanks Mathias!
Excellent. For those for whom this is the first lesson on Bayes, you've left out a few steps here and there. But still excellent. It's difficult to make things understandable. You're excellent at it.
Thanks Dov! And good callout - this focuses on the concepts and doesn't tell you quite enough to code it up. That will be the subject of a future course on e2eml.school
You're very good at explaining, and you also go into some detail, which is nice. Too often YouTube tutorials are too simple. Keep going.
Now that you have said that (a year ago), I kinda feel like finding the likelihood of a YouTuber making too-simple tutorials!
I found this quite helpful in giving Bayesian probability a more intuitive appeal. Bayes' idea had previously been presented to me as "a priori probability," and I had always been troubled by the a priori part. But I guess a good way to think about it is like this: when we say, "All else being equal, such-and-such is the case," we mean (or ought to mean) "Assuming that the variables which we have not measured or aren't even aware of have the values they most likely have when our one measured variable has the value measured or assumed by us, then such-and-such is most likely the case."
I believe you don't know much about statistics (the impossible thing), but I do believe you really know how to explain Bayesian Inference. Great video.
Great explanation and video lesson production. Best Bayesian lesson I've found on YouTube.
Amazing. I already knew what Bayes theorem was, but you have an awesome intro to Bayes. Thanks for the video.
Superb lecture - esp. the MLE explanation!
This video deserves more thumbs up. I understood a lot on a lazy Sunday evening :) Great explanation.
I like the quotes you put at the end and how you reword them.
Hey @Brandon Rohrer,
At 18:12 the y-axis is likelihood, not probability. Probability is the area under the curve for this graph.
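That distinction is easy to check numerically: the curve's height at a point is a density, while probability comes from the area under the curve over an interval. A tiny sketch with assumed numbers:

```python
from scipy.stats import norm

# Height of the curve at m = 15 (a density, not a probability):
density = norm.pdf(15.0, loc=17.0, scale=1.0)

# Probability of a measurement landing within half a pound of 15
# (the area under the curve between 14.5 and 15.5):
probability = norm.cdf(15.5, 17.0, 1.0) - norm.cdf(14.5, 17.0, 1.0)
print(density, probability)
```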
Brandon, this was great, thank you. Very easy to follow and really interesting and concise!
Thanks Erin!
"...one of the top 10 math tattoos of all time." 😂
Such a well-thought-through video, with very good explanations at every step; the ending was a bonus. Loved it, thank you.
Thank you Lilit! I appreciate that.
Absolutely brilliant! Your presentation, examples, etc. were perfect and applicable! Thanks!
Thank you very much :)
Thank you sooo much Brandon for explaining the concepts so clearly.
Hands down the best explanation I've seen, thank you.
I thought this was not only a great example of Bayes but also a nice intro to Cox's theorem. Nice job!
* quickly looks up Cox's Theorem *
Why, yes it does Donna. Thank you! :)
I had been searching for an explanation like this for some time; a big WOW to this guy. Wonderful explanation!!
That's the best explanation ever❤️❤️❤️❤️
Please consider doing a longer version of the video. The way you have introduced this concept is nice, but you'll lose the intended audience, who aren't well versed in Bayesian statistics, if they can't keep up with the pace of this video. I have some basics, but I still had to pause the video a lot to be able to read the material before it changed to the next slide and also listen to what you were saying. Also, a cursor or pointer would improve the experience.
The best explanation I've ever seen! Super clear.
Thumbs up after watching the first 3 minutes. Finally, the "prior" makes sense to me!!!!!!
The best explanation I have seen of Bayes' theorem... awesome... thanks a ton....
Bayes' theorem (1) only describes 2 events A and B that overlap (A+B).
Bayes' theorem states that, independent of the magnitude of A+B,
the relationship between the proportion of A = P(A) and the proportion of B = P(B)
is always the inverse of the conditional proportions P(A | B) and P(B | A), or

P(A) / P(B) = P(A | B) / P(B | A)    (1)

So if we know in your example the proportions:

P(A) = P(M) = 0.50
P(B) = P(LH) = 0.25

then P(A) / P(B) = 0.50 / 0.25 = 2:1,
which is true for any P(A+B).

Then according to (1):

P(A | B) / P(B | A) = P(A) / P(B)
P(A | B) / P(B | A) = 2 / 1

When P(B | A) = P(LH | M) = 0.04, the equation can be solved:

P(M | LH) / 0.04 = 2 / 1
P(M | LH) = 0.08
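That ratio identity follows directly from Bayes' theorem, and the men/long-hair numbers above check out numerically. A quick sketch:

```python
# P(M) = 0.50, P(LH) = 0.25, P(LH|M) = 0.04, from the comment above.
p_m, p_lh = 0.50, 0.25
p_lh_given_m = 0.04

# Bayes' theorem directly:
p_m_given_lh = p_lh_given_m * p_m / p_lh
print(p_m_given_lh)  # 0.08

# The ratio identity P(A|B) / P(B|A) == P(A) / P(B):
assert abs(p_m_given_lh / p_lh_given_m - p_m / p_lh) < 1e-12
```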
Nothing much to say, only thank you! You may have helped me clear my exam!
I'm confused: how do you calculate the probabilities at 17:56, P(m=13.9 | w=17) and so on?
Best explanation on YouTube so far.
THANK GOD YOU EXIST. Finally somebody that fucking breaks down the most important part of it (namely, how you actually calculate the likelihood in practice). I hope life rewards you beautifully.
Great video!! A very nice and easy-to-digest explanation of Bayes' theorem! Thank you very much for sharing this excellent material. I have gotten a better understanding of how to apply it to my problems. Keep up the great work!
Thank you so much. You explained it in an awesome way and saved my exam.
Glad to hear it
Simply the best ! Thank you Brandon
Dude your analogies are on point.
Thanks Julian
The best explanation on YouTube.
Thank you, man.
Great video. I'm slightly confused about the dog example at around 20:00: why can we use the standard error of the prior distribution in the likelihood computation for the measurements? Don't we have to model the distribution's mean and spread separately, i.e., explore many different standard error values? Generally, is w supposed to represent only one number (e.g., the true dog weight in the example) or an entire distribution that can be characterised by its moments (with the true dog weight simply being its first moment)?
Best explanation I've found on the topic so far. Great work!!!
Excellent examples and explanation! Now everything is so much clearer. :)
Wow, can't believe I only came across this video now. This is by far the best explanation of Bayes, with great examples! Thanks @BrandonRohrer !! Love the example with the weight of the puppy! May I ask if you have code to deal with multiple priors / multiple events? Say, as an extension of the puppy's weight: if the weight change is more than one pound, plus she may be showing some other symptoms (say, losing appetite), the likelihood of her being sick from something is x. Or, losing appetite could just be due to the weather being too hot. So the loss of one pound since the last vet visit plus a loss of appetite may not be significant at all and may not warrant the multiple expensive tests suggested by the vet.
Terrific examples and terrific explanation down to such applicable quotes!
I don't understand your method of getting the posterior distribution. There is nothing in Bayes' formula that implies a summation over all possible parameters in the likelihood function. It seems to me that the way to get the posterior is to compute the likelihood distribution based on measured samples and then for each possible weight to multiply the value of the (static) likelihood distribution by the value of the prior distribution. Can you explain why your method works? Thank you.
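For what it's worth, the pointwise recipe described here (evaluate the likelihood from the samples, multiply by the prior at each possible weight, then normalize) is the standard grid approximation; no summation appears except in the normalizing constant. A minimal sketch, with the example's numbers assumed:

```python
import numpy as np
from scipy.stats import norm

w_grid = np.linspace(10, 20, 1001)   # candidate weights
measurements = [13.9, 14.1, 17.5]

prior = norm.pdf(w_grid, loc=14.2, scale=1.0)   # assumed prior
likelihood = np.ones_like(w_grid)
for m in measurements:
    likelihood *= norm.pdf(m, loc=w_grid, scale=1.0)  # assumed spread

posterior = prior * likelihood            # pointwise multiply...
posterior /= np.trapz(posterior, w_grid)  # ...then normalize to area 1
```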
Really, I want to thank you so much.
This is the best thing that I have ever heard about Bayes.
You have illustrated everything and simplified the method for me.
Regards
What a stunning explanation. Speechless
💜 🧡 🖤 💚 🤎 💛 💙
Wonderful explanation. Thank you. The Mark Twain quote at the end is apocryphal though.
Thanks Matthew. I'll have to asterisk that. :)
Thanks for the excellent video. A good refresher! Keep up the good work!
Start with more slides like your last two. Thanks, this was insightful.
Great explanation, but I'm having a hard time understanding why you should use the standard error as the width when you calculate the likelihood, and not the standard deviation. Doesn't this mean that you are calculating the probability that your measurement is the true mean, and not the probability of getting that measurement given your mean? Or perhaps that's what you're supposed to do?
You hit it spot on, jeppe. The assumption that I glossed over is that Reign's actual weight will be the mean of all the measurements (if we kept taking measurements forever). However, since we only have a few measurements, we have to be content with an estimate of that true mean. The standard error helps us to determine the distribution of that estimate. We don't care about how far an individual measurement tends to be off, we only care about how far our estimate of the weight is off.
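A small sketch of that distinction, using the three example measurements (assumed here): the standard deviation describes how far an individual reading scatters, while the standard error (std / sqrt(n)) describes how far the estimated mean tends to be off, which is why it sets the width in the likelihood above.

```python
import numpy as np

m = np.array([13.9, 14.1, 17.5])   # example measurements
std = m.std(ddof=1)                # scatter of individual readings
sem = std / np.sqrt(len(m))        # uncertainty of the estimated mean
print(m.mean(), std, sem)
```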
When you said small human, I imagined a small adult
Imagine calling Tyrion cute
Great video presentation, Brandon. Please try to make more videos on other machine learning algorithms.
Hi! This was such a clear explanation. It would be great if you could make one on hidden markov models.
Thanks Rachel! Hidden Markov Models are an excellent idea. I'll put it in my to do list.
That was fantastically done.
Thank you :)
Hi Brandon, your video was simple, superb, and stupendous!
17:57 One query here. How is the P(m | w=17) distribution function calculated? What is the spread (S.D.)? How do we, for example, arrive at a particular value for the probability of getting m = 15.6 lb given that the true weight is 17 lb?
Thanks! Nice explanation.
Bravo! Wonderful presentation. Thank you for this presentation.
_" When you have excluded the impossible, whatever remains, however improbable, must be __-the truth-__ _*_possible_*_ "_
There 'ya go Sheerluck Holmes, I've fixed it for 'ya : )
Thank you very much for the best explanation. It's very interesting.
Excellent introduction, thanks. Is there also a continuation covering graphical prior selection and Jeffreys priors?