How Bayes Theorem works

  • Published on Nov 23, 2024

Comments • 410

  • @phytasea
    @phytasea 7 years ago +241

    Wow, best explanation and example I ever saw ^^ Fantastic.

    • @marciasola462
      @marciasola462 5 years ago +1

      Excellent

    • @romanemul1
      @romanemul1 8 months ago

      Exactly. Those patient-disease examples were driving me nuts.

  • @welcome33333
    @welcome33333 7 years ago +55

    this is by far the most accessible explanation of Bayes theorem. Well done Brandon!

  • @syedmurtazaarshad3434
    @syedmurtazaarshad3434 7 months ago +1

    Loved the analogies with real life philosophies, brilliant!

  • @Skachmo1
    @Skachmo1 7 years ago +234

    There are never lines at the men's room.

    • @selamysfendegu8239
      @selamysfendegu8239 6 years ago +1

      haha.

    • @dinikosama
      @dinikosama 6 years ago +1

      lol

    • @Beebo
      @Beebo 5 years ago +5

      Unless it's cocaine.

    • @masterpogi1818
      @masterpogi1818 4 years ago +2

      So the probability of this sample being true is 0%, hahhaha

    • @dhanushram6137
      @dhanushram6137 4 years ago +3

      You have never been to developer conferences :)

  • @danrattray8884
    @danrattray8884 7 years ago +43

    Best explanation of Bayes theorem I have seen. Fantastic teaching.

  • @toufikhamdani5044
    @toufikhamdani5044 1 year ago +5

    This is the most accessible explanation of Bayesian inference. Thank you Brandon for the time taken to prepare this video. You rock!

  • @shirshanyaroy287
    @shirshanyaroy287 8 years ago +43

    Brandon I just want to tell you that you are a fantastic teacher.

    • @BrandonRohrer
      @BrandonRohrer  8 years ago +3

      Thank you very much Shirshanya. That is a huge compliment. I'm honored.

    • @shirshanyaroy287
      @shirshanyaroy287 8 years ago +2

      Please make more statistics videos! I have suggested your channel to my biostat teacher.

  • @claudiorio
    @claudiorio 3 years ago +4

    I know the video is old but I have to agree with the pinned comment. I already knew Bayes' Theorem, but as I don't use it often, I have to be constantly refreshing the details in my mind. The YouTube algorithm recommended this video and it's hands down the best I have ever watched.

    • @BrandonRohrer
      @BrandonRohrer  3 years ago +2

      Thank you. I really appreciate that.

  • @richardgordon
    @richardgordon 5 months ago +4

    Wow! One of the clearest explanations of Bayes Theorem I’ve come across!

  • @theinsanify7802
    @theinsanify7802 6 years ago +1

    i don't know what to say, i'm a computer science student and i have never seen an explanation better than this ... thank you veryyyyy much

  • @rjvaal
    @rjvaal 7 years ago +5

    Thank you for this excellent explanation. You are a patient and well-spoken teacher.

  • @jithunniks
    @jithunniks 6 years ago +1

    The way you connect things with appropriate, easy-to-understand examples... Amazing...

  • @nutrinogirl456
    @nutrinogirl456 7 years ago +18

    This is the first time I've felt like I've actually understood this... and it's such a simple concept! Thank you!

  • @razvanastrenie1455
    @razvanastrenie1455 3 years ago +3

    Excellent explanation! This is the manner in which mathematics must be explained: with cases of practical applicability. Good job, Mr. Brandon!

    • @BrandonRohrer
      @BrandonRohrer  3 years ago

      Thanks Razvan. I'm so happy you enjoyed it.

  • @rameshmaddali6208
    @rameshmaddali6208 4 years ago

    For 5 years I kept Bayes aside; you are the guru of teaching this stuff.. God bless you Brandon

  • @bharathwajan6079
    @bharathwajan6079 1 year ago +1

    hey man, this is the best explanation of conditional probability I have ever heard

  • @SupremeSkeptic
    @SupremeSkeptic 7 years ago

    Definitely the best explanation of the theorem, told in an easily understandable way, that I can find on the internet...

  • @nginfrared
    @nginfrared 7 years ago +6

    EXCELLENT EXPLANATION!!! I am learning graphical modeling and a lot of these concepts were a bit unclear to me. Examples given here are absolutely to the point and demystify a lot of concepts. Thank you and looking forward to more videos.

  • @maxinelyu7875
    @maxinelyu7875 7 years ago +1

    When your teacher doesn't make sense, you have to go through teaching videos online, and I came across this one...
    Lucky lucky lucky! Thank you Mr!

  • @regozs
    @regozs 5 years ago +3

    This is the best explanation yet, it helped me get a greater intuitive sense of Bayesian inferences.

    • @brendawilliams8062
      @brendawilliams8062 2 years ago

      Yes, it was great. It seems like running into Feigenbaum maths or similar

  • @yurysambale
    @yurysambale 7 years ago

    Stop looking for a decent tutorial... this one is the best!

  • @redserpent
    @redserpent 6 years ago +2

    Thank you. Your video has been of great help. I have tried different resources to wrap my head around Bayes' theorem and always got knocked out at the front door. Excellent explanation

  • @kimnguyen1227
    @kimnguyen1227 7 years ago +4

    This is fantastic. Thank you so much! I have been exposed to BT before, but have never understood it. As sad as it sounds, I didn't realize it was composed of joint probability which is composed of conditional probability, and marginal probability. Conditional probability and joint probability, and Bayes theorem all just looked the same. This really helped clarify things for me.

  • @himanshu8006
    @himanshu8006 6 years ago +1

    I must say, this is the best explanation of Bayes' theorem I have ever seen..... PERFECT!!!!!

  • @johneagle4384
    @johneagle4384 2 years ago +2

    Great example! Very easy to follow and understand. On a side note: I showed your video to my students and some of them objected rather "emphatically".
    They said it was too sexist. Crazy times we live in... Instead of math and statistics, they wanted to discuss gender roles and stereotypes in a Stat class. Gosh!

    • @BrandonRohrer
      @BrandonRohrer  2 years ago +1

      Thanks John! I agree with your students. When I watch this now, I cringe. I definitely need to re-do it with a better example, one that doesn't reinforce outdated gender norms.

    • @johneagle4384
      @johneagle4384 2 years ago +2

      @@BrandonRohrer No... Please, do not follow the mad crowds... this is an innocent, simple math example. People are getting crazy and finding excuses to feel offended and start meaningless fights!

    • @Terszel
      @Terszel 11 months ago

      So sad. Long hair and standing in the women's restroom line and we can't even use Bayes' theorem to assume it's a woman 😂

  • @robertoarce-tx8yt
    @robertoarce-tx8yt 1 year ago +1

    this was a very intuitive explanation, man, do more!

  • @ukktor
    @ukktor 7 years ago +17

    The knowledge that Bayes was a theologian and that his theory requires at least some belief or faith in improbable things earns a "Well played, Mr. Bayes" slow clap. I've been enjoying your videos Brandon, thanks for keeping things approachable!

    • @themennoniteatheist1255
      @themennoniteatheist1255 3 years ago +1

      I’m not sure that Bayesian epistemology requires a belief in improbable things. I love this video, but I think that’s an overstatement. I do think that it requires us to be open to the possibility that improbable things may be true. It does not require me to have faith in anything improbable, but rather to proportion our beliefs to the evidence (probabilities)- which is the antithesis of faith-while accepting the possibility of being wrong. To accept something as true that is improbable… is intellectually irresponsible and lacks due caution and humility. But to withhold belief (proportionally to the evidence) from improbable things is intellectually responsible and does not exclude being open to surprise-to the possibility that something improbable is true. I don’t think Bayesian epistemology intrinsically expects you to hold as true some improbable thing (faith). Abstinence from faith is acceptable in all cases as long as the possibility of error is operative. This suggestion that it’s necessary in Bayesian epistemology to believe something that is improbable was the only sloppy part of the video, no? I’m open to correction…

    • @ahmedemadsamy4244
      @ahmedemadsamy4244 2 years ago

      I think you guys got it the opposite way; the video was trying to say, be open to believing the improbable things that come from the data (evidence), rather than only holding on to a prior belief.

  • @liucloud6317
    @liucloud6317 5 years ago +2

    I have viewed many explanations about Bayes rule but this is no doubt the best! Thanks Brandon

  • @fahad3802
    @fahad3802 8 years ago +4

    I have been struggling with bayesian inference and your tutorial makes it so easy to understand! Thank You! Keep up the good work.

    • @BrandonRohrer
      @BrandonRohrer  8 years ago +2

      I'm very happy to hear it Fahad. Thanks.

  • @taghreedalghamdi6812
    @taghreedalghamdi6812 5 years ago +1

    I was reading about Bayes' theorem for months! And this is the first time I understand the concept!! Wow!! Such an amazing way of teaching!!

    • @BrandonRohrer
      @BrandonRohrer  5 years ago

      I'm so happy to hear it Taghreed. That was exactly my hope.

  • @raghurrai
    @raghurrai 6 years ago +1

    I wish everyone taught like this. Your presentation was awesome. Thank you

  • @ssundaraju
    @ssundaraju 5 years ago +1

    Great explanation and simplification of a difficult concept. The three quotations at the end are poetic and purposeful. Thanks

    • @BrandonRohrer
      @BrandonRohrer  5 years ago

      I found them surprisingly relevant too. Thanks Sridhar.

  • @evgenykriukov4239
    @evgenykriukov4239 6 years ago +2

    Thanks for the excellent presentation! One question though (see the note after this thread):
    at 17:49 the P(m=[13.9, 14.1, 17.5]|w=17) is factored as follows:
    P(w=17|m=[13.9, 14.1, 17.5])
    = P(m=[13.9, 14.1, 17.5]|w=17)
    = P(m=13.9|w=17) * P(m=14.1|w=17) * P(m=17.5|w=17)
    Then at 20:47 the P(m=[13.9, 14.1, 17.5]|w=17) * P(w=17) is expanded into:
    P(w=17|m=[13.9, 14.1, 17.5])
    = P(m=[13.9, 14.1, 17.5]|w=17) * P(w=17)
    = P(m=13.9|w=17) * P(w=17) *
    P(m=14.1|w=17) * P(w=17) *
    P(m=17.5|w=17) * P(w=17)
    How do you get that P(m=[13.9, 14.1, 17.5]|w=17) * P(w=17) is equal to P(m=13.9|w=17) * P(m=14.1|w=17) * P(m=17.5|w=17) * P(w=17)^3? Thx in advance!

    • @keokawasaki7833
      @keokawasaki7833 6 years ago

      I hate stats because of those things... my teacher was teaching us utility analysis, and said "satisfaction is measured in utils". To that I asked, "tell me how satisfied you are with your job and answer in the form: n utils...".
      I still haven't gotten an answer!
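
A hedged note on the factorization question in this thread (the video itself is not reproduced here, so this is only the standard bookkeeping): when the three measurements are treated as independent given the true weight w, the likelihood factors into a product, but the prior enters Bayes' rule exactly once; it is not raised to the third power.

```latex
% Likelihood factorization under conditional independence of the measurements,
% with a single factor of the prior:
P(w{=}17 \mid m_1, m_2, m_3)
  \;\propto\; P(m_1, m_2, m_3 \mid w{=}17)\, P(w{=}17)
  \;=\; \Bigl[ \textstyle\prod_{i=1}^{3} P(m_i \mid w{=}17) \Bigr]\, P(w{=}17)
```

In other words, P(m=[13.9, 14.1, 17.5]|w=17) * P(w=17) equals the product of the three per-measurement likelihoods times one factor of P(w=17), not P(w=17)^3.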

  • @WahranRai
    @WahranRai 3 years ago +3

    0:40 Bayes wrote 2 books, one about theology and one about probability.
    He planned to write a third book inferring the existence / non-existence of God with probability (likelihood distribution = humanity, prior distribution = miracles!)

  • @GeoffryGifari
    @GeoffryGifari 1 year ago +1

    How do the shape and mean of our prior distribution determine the best end belief?
    Our prior knowledge is that the puppy weighed 14.2 lbs last time and that weight changes aren't noticeable.
    What if the mean of our prior distribution is chosen to be larger/smaller than 14.2 lbs?
    And what if we take that distribution to be narrower/broader (more than 1 lb std)?
    An extreme case: what if we first guess that the puppy's weight is *exactly* 14.2 lbs?
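
Since the question above has no reply in the thread, here is a minimal sketch of how the prior's width plays out, using a grid approximation with illustrative numbers (the measurement spread, the grid, and the use of Gaussian curves are assumptions for this sketch, not values taken from the video). A very narrow prior keeps the posterior pinned near 14.2 lb, an exactly-14.2 point prior leaves the data no room to move the estimate at all, and a broad prior lets the posterior follow the measurements.

```python
import numpy as np
from scipy.stats import norm

# Illustrative numbers loosely based on the puppy example: a prior belief
# centered at 14.2 lb and the three scale readings discussed in the comments.
measurements = [13.9, 14.1, 17.5]
meas_sd = 1.0                       # assumed spread of a single reading (not from the video)
w = np.linspace(10.0, 20.0, 1001)   # grid of candidate true weights

def grid_posterior(prior_mean, prior_sd):
    # posterior(w) is proportional to prior(w) times the product of the likelihoods
    prior = norm.pdf(w, loc=prior_mean, scale=prior_sd)
    likelihood = np.ones_like(w)
    for m in measurements:
        likelihood *= norm.pdf(m, loc=w, scale=meas_sd)
    post = prior * likelihood
    return post / post.sum()        # normalize over the grid

for prior_sd in (0.1, 1.0, 5.0):    # very confident, moderate, and vague priors
    post = grid_posterior(14.2, prior_sd)
    print(f"prior sd = {prior_sd:3.1f}  ->  posterior mean = {np.dot(w, post):.2f} lb")
```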

  • @joserobertopacheco298
    @joserobertopacheco298 2 years ago +1

    I am from Brazil. What a fantastic explanation!

    • @BrandonRohrer
      @BrandonRohrer  2 years ago

      Thanks Jose! Welcome to the channel

  • @bga9388
    @bga9388 1 year ago +1

    Thank you for this excellent presentation!

  • @KunwarPratapSingh41951
    @KunwarPratapSingh41951 6 years ago

    I was looking for intuitive content to introduce me to the essence of Bayes' theorem in statistics, thanks for this. Luckily I found your blog about machine learning and robotics. That's everything I wanted under one roof: robotics, data science and machine learning.

  • @williamliamsmith4923
    @williamliamsmith4923 1 year ago +1

    Amazing explanation and graphics!

  • @joshuafancher3111
    @joshuafancher3111 6 years ago

    Excellent explanation. At 15:20 and beyond is when everything really started to come together. Also thanks for deriving the formula at the 7:10 mark.

  • @billgiles9662
    @billgiles9662 7 years ago

    Brandon, GREAT explanations!! I am taking a "Math for Data Sciences" class and have been flying through it until the final week and "Bayes Theorem". Achk...... It was poorly explained and very confusing. I was going to drop the class as I just couldn't get it. After watching your YouTube explanation I am excited about the possibilities and understand the way it works - cool stuff! Thank you for all you do!!!

  • @xxlolxx447
    @xxlolxx447 6 years ago

    This was the best explanation of Bayes I've ever heard, I had such a hard time wrapping my head around it from other sources

  • @karthiksalian5715
    @karthiksalian5715 4 years ago +1

    Best video I found with all the information that I needed at one place.
    Thanks.

  • @houssemguidara4467
    @houssemguidara4467 4 years ago

    Thank you for the video, it helped me understand the concept of Bayesian inference.
    The concept is simple. In a nutshell, you have an idea about what the quantity is and then you use the measurements to sharpen your assumption.
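
That summary matches the math. As a hedged illustration (this is the closed form for the special case of a Gaussian prior and Gaussian measurement noise; the video may well do the same thing numerically instead), combining a prior N(mu_0, sigma_0^2) with n independent measurements whose mean is m-bar and whose noise variance is sigma^2 gives a sharper posterior:

```latex
\sigma_{\mathrm{post}}^{2} = \left( \frac{1}{\sigma_0^{2}} + \frac{n}{\sigma^{2}} \right)^{-1},
\qquad
\mu_{\mathrm{post}} = \sigma_{\mathrm{post}}^{2} \left( \frac{\mu_0}{\sigma_0^{2}} + \frac{n\,\bar{m}}{\sigma^{2}} \right)
```

The posterior variance is always smaller than the prior variance, which is exactly the "sharpening" described above.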

  • @mathiasmews1122
    @mathiasmews1122 4 years ago

    You're so much better than my Statistics teacher, thank you so much for this explanation!

  • @Zachor-v-Aseh
    @Zachor-v-Aseh 5 years ago +2

    Excellent. For those for whom this is the first lesson on Bayes, you've left out a few steps here and there. But still excellent. It's difficult to make things understandable. You're excellent at it.

    • @BrandonRohrer
      @BrandonRohrer  5 years ago

      Thanks Dov! And good callout - this focuses on the concepts and doesn't tell you quite enough to code it up. That will be the subject of a future course on e2eml.school

  • @attrapehareng
    @attrapehareng 7 years ago +17

    You're very good at explaining, and you also go into some detail, which is nice. Too often YouTube tutorials are too simple. Keep going.

    • @keokawasaki7833
      @keokawasaki7833 6 years ago

      now that you have said that (a year ago), I kinda feel like finding the likelihood of a YouTuber making too-simple tutorials!

  • @cliffordhodge1449
    @cliffordhodge1449 6 years ago

    I found this quite helpful in giving Bayesian probability a more intuitive appeal. Bayes' idea had previously been presented to me as "a priori probability", and I had always been troubled by the a priori part. But I guess a good way to think about it is like this: When we say, "All else being equal, such-and-such is the case," we mean (or ought to mean) "Assuming that the variables which we have not measured or aren't even aware of have the values they most likely have when our one measured variable has the value measured or assumed by us, then such-and-such is most likely the case."

  • @wysiwyg2489
    @wysiwyg2489 6 years ago +1

    I believe you don't know much about statistics (the impossible thing), but I do believe you really know how to explain Bayesian Inference. Great video.

  • @SamuelShaw1986
    @SamuelShaw1986 6 years ago +1

    Great explanation and video lesson production. Best Bayesian lesson I've found on youtube

  • @SuperJg007
    @SuperJg007 6 years ago +1

    Amazing. I already knew what Bayes theorem was, but you have an awesome intro to Bayes. Thanks for the video.

  • @lenkapenka6976
    @lenkapenka6976 3 years ago +1

    Superb lecture - esp. the MLE explanation!

  • @Blooddarkstar
    @Blooddarkstar 8 years ago +4

    This video deserves more thumbs up. I understood a lot on a lazy Sunday evening :) great explanation.

  • @lemyul
    @lemyul 5 years ago +1

    i like the quotes you put at the end and how you reword them

  • @AakashBhardwaj-dk3mi
    @AakashBhardwaj-dk3mi 1 year ago

    Hey @Brandon Rohrer,
    At 18:12, the y-axis is likelihood, not probability. Probability is the area under the curve for this graph.
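
A hedged aside on this point: for a continuous weight, the height of the curve is a density (or, read as a function of w, a likelihood), so probabilities come from areas under it rather than from the height at any single point:

```latex
P(a \le m \le b \mid w) = \int_{a}^{b} p(m \mid w)\, dm
```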

  • @Erin-uk2jj
    @Erin-uk2jj 3 years ago +2

    Brandon, this was great, thank you. Very easy to follow and really interesting and concise!

  • @viviankoneke1389
    @viviankoneke1389 4 years ago +17

    "...one of the top 10 math tattoos of all time." 😂

  • @Atoyanable
    @Atoyanable 3 years ago +1

    such a well-thought-through video, very good explanations for every instance, the ending was a bonus, loved it, thank you

    • @BrandonRohrer
      @BrandonRohrer  3 years ago +1

      Thank you Lilit! I appreciate that.

  • @Nifty-Stuff
    @Nifty-Stuff 2 years ago +1

    Absolutely brilliant! Your presentation, examples, etc. were perfect and applicable! Thanks!

  • @amanjain3341
    @amanjain3341 7 years ago +1

    Thank you sooo much Brandon for explaining the concepts so clearly.

  • @EANTYcrown
    @EANTYcrown 7 years ago +1

    hands down best explanation i've seen, thank you

  • @dmhowe2001
    @dmhowe2001 7 years ago +1

    I thought this was not only a great example of Bayes but also a nice intro for Cox's Theorem. Nice job!

    • @BrandonRohrer
      @BrandonRohrer  7 years ago +2

      * quickly looks up Cox's Theorem *
      Why, yes it does Donna. Thank you! :)

  • @sreekanthk2911
    @sreekanthk2911 5 years ago

    I have been searching for an explanation like this for some time, and a big WOW to this guy. Wonderful explanation!!

  • @moazelsayed541
    @moazelsayed541 1 year ago +1

    That's the best explanation ever❤️❤️❤️❤️

  • @sunilmathew2914
    @sunilmathew2914 3 years ago

    Please consider doing a longer version of the video. It's a nice way you have introduced this concept, but you'll lose the intended audience who isn't well versed in Bayesian statistics and can't keep up with the pace of this video. I have some basics, but I still had to pause the video a lot to be able to read the material before it changed to the next slide and also listen to what you are saying. Also, if you had a cursor or pointer, that would improve the experience.

  • @s45510325
    @s45510325 6 years ago +1

    the best explanation I have ever seen! Super clear.

  • @angli1054
    @angli1054 5 years ago

    Thumbs up after watching the first 3 minutes. Finally, the "prior" makes sense to me!!!!!!

  • @kirankk9565
    @kirankk9565 7 years ago +1

    the best explanation I have seen of Bayes' theorem... awesome... thanks a ton....

  • @RobvanMechelen
    @RobvanMechelen 7 years ago

    Bayes Theorem (1) only describes 2 events A and B that overlap (A+B).
    Bayes Theorem states that, independent of the magnitude of A+B,
    the relationship between the proportion of A = P(A) and the proportion of B = P(B)
    is always the inverse of the conditional proportions P(A | B) and P(B | A), or
    P(A) / P(B) = P(A | B) / P(B | A) (1)
    So if we know in your example the proportions:
    P(A) = P(M) = 0.50
    P(B) = P(LH) = 0.25
    then P(A) / P(B) = 0.50/0.25 = 2:1
    which is true for any P(A+B)
    then according to (1):
    P(A | B) / P(B | A) = P(A) / P(B)
    P(A | B) / P(B | A) = 2/1
    when
    P(B | A) = P(LH | M) = 0.04
    the equation can be solved:
    P(A | B) / P(B | A) = P(M | LH) / 0.04 = 2 / 1
    P(M | LH) = 0.08
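
A quick numeric check of the ratio argument above, sketched in Python with the numbers quoted in the comment (0.50, 0.25, 0.04); both routes give the same 8%.

```python
# Numbers as quoted in the comment above (the restroom-line example).
p_m = 0.50            # P(man)
p_lh = 0.25           # P(long hair)
p_lh_given_m = 0.04   # P(long hair | man)

# Bayes' rule directly: P(man | long hair) = P(long hair | man) * P(man) / P(long hair)
p_m_given_lh = p_lh_given_m * p_m / p_lh
print(p_m_given_lh)   # 0.08

# Ratio form used in the comment: P(A | B) / P(B | A) = P(A) / P(B)
assert abs(p_m_given_lh / p_lh_given_m - p_m / p_lh) < 1e-12
```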

  • @gautamjain2487
    @gautamjain2487 6 years ago +1

    Nothing much to say, only thank you! You may have helped me clear my exam!

  • @fredvin27
    @fredvin27 6 years ago +2

    I'm confused: how do you calculate the probabilities at 17:56, P(m=13.9|w=17) and so on?
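
One common way to get numbers like P(m=13.9|w=17) is to model each reading as Gaussian noise around the candidate weight and evaluate that bell curve at the measurement. This is only a sketch under that assumption; the spread used here is illustrative, not necessarily the value used in the video (see the standard-error discussion in a later comment).

```python
from scipy.stats import norm

candidate_weight = 17.0   # the w = 17 hypothesis from the question above
spread = 1.0              # assumed width of the measurement curve (illustrative)

for m in (13.9, 14.1, 17.5):
    # likelihood of reading m if the true weight were 17, under a Gaussian noise model
    print(f"P(m={m} | w=17) ~ {norm.pdf(m, loc=candidate_weight, scale=spread):.4f}")
```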

  • @StephenHsiang
    @StephenHsiang 7 years ago

    best explanation on youtube so far

  • @specialkender
    @specialkender 5 years ago

    GOD THANKS FOR EXISTING. Finally somebody that fucking breaks down the most important part of it (namely, how you actually calculate the likelihood in practice). I hope life rewards you beautifully.

  • @mariorodriguesperes1501
    @mariorodriguesperes1501 5 years ago +1

    Great video!! A very nice and easy-to-digest explanation of Bayes' theorem! Thank you very much for sharing this excellent material. I have got a better understanding of how to apply it to my problems. Keep up the great work!

  • @b.e.3940
    @b.e.3940 2 years ago +1

    thank you so much. you explained it in an awesome way and saved my exam.

  • @dalelu9422
    @dalelu9422 7 years ago +2

    Simply the best ! Thank you Brandon

  • @julianjohnmert8658
    @julianjohnmert8658 4 years ago +1

    Dude your analogies are on point.

  • @AnasHawasli
    @AnasHawasli 11 months ago

    The best explanation on YouTube
    thank you man

  • @ingobethke2413
    @ingobethke2413 5 years ago +1

    Great video. I'm slightly confused about the dog example at around 20:00: Why can we use the standard error of the prior distribution in the likelihood computation for the measurements? Don't we have to model the distribution mean and spread separately, i.e. explore many different standard error values? Generally, is w supposed to represent only one number (e.g. the true dog weight in the example) or an entire distribution that can be characterised by its moments (with the true dog weight simply being its first moment)?

  • @jacktretton7815
    @jacktretton7815 4 years ago

    best explanation I found on the topic so far. great work!!!

  • @mariamedrano4348
    @mariamedrano4348 6 years ago

    Excellent examples and explanation! Now everything is so much clearer. :)

  • @MonsieurSchue
    @MonsieurSchue 1 year ago

    Wow, can't believe I only came across this video now. This is by far the best explanation of Bayes with great examples! Thanks @BrandonRohrer !! Love the example with the weight of the puppy! May I ask if you have code to deal with multiple priors / multiple events? Say, as an extension of the weight of the puppy: if the weight change is more than one pound, plus she may be showing some other symptoms (say losing appetite), the likelihood of her being sick from something is x. Or even, losing appetite can be just due to the weather being too hot. So the loss of one pound of weight since the last vet visit plus losing appetite may not be significant at all and doesn't warrant the multiple expensive tests suggested by the vet.

  • @eunicepark6860
    @eunicepark6860 6 years ago

    Terrific examples and terrific explanation down to such applicable quotes!

  • @rasraster
    @rasraster 2 years ago +1

    I don't understand your method of getting the posterior distribution. There is nothing in Bayes' formula that implies a summation over all possible parameters in the likelihood function. It seems to me that the way to get the posterior is to compute the likelihood distribution based on measured samples and then for each possible weight to multiply the value of the (static) likelihood distribution by the value of the prior distribution. Can you explain why your method works? Thank you.
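
A hedged note, since this question has no reply in the thread: in the continuous form of Bayes' rule, the numerator really is the pointwise product of likelihood and prior at each candidate weight, exactly as described above; the only integral (or sum, on a grid of weights) is the normalizing constant in the denominator, and it is the same for every candidate weight.

```latex
P(w \mid m) = \frac{P(m \mid w)\, P(w)}{\displaystyle\int P(m \mid w')\, P(w')\, dw'}
```

If a summation shows up when the posterior is plotted, it is usually only this normalization being approximated over the grid of candidate weights.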

  • @Qarout2021
    @Qarout2021 6 years ago

    Really, I want to thank you so much.
    The best thing that I have ever heard about Bayes.
    You have illustrated everything and simplified the method for me.
    Regards

  • @Daniboy370
    @Daniboy370 1 year ago

    What a stunning explanation. Speechless

    • @BrandonRohrer
      @BrandonRohrer  1 year ago

      💜 🧡 🖤 💚 🤎 💛 💙

  • @MatthewFricke
    @MatthewFricke 4 years ago

    Wonderful explanation. Thank you. The Mark Twain quote at the end is apocryphal though.

    • @BrandonRohrer
      @BrandonRohrer  4 years ago

      Thanks Matthew. I'll have to asterisk that. :)

  • @eyesonthetube
    @eyesonthetube 6 years ago +1

    Thanks for the excellent video. A good refresher! Keep up the good work!

  • @missh1774
    @missh1774 1 year ago

    Start with more slides like your last two. Thanks this was insightful.

  • @jeppejwo
    @jeppejwo 7 years ago +4

    Great explanation, but I'm having a hard time understanding why you should use the standard error as the width when you calculate the likelihood, and not the standard deviation. Doesn't this mean that you are calculating the probability that your measurement is the true mean, and not the probability of getting that measurement given your mean? Or perhaps that's what you're supposed to do?

    • @BrandonRohrer
      @BrandonRohrer  7 years ago +1

      You hit it spot on, jeppe. The assumption that I glossed over is that Reign's actual weight will be the mean of all the measurements (if we kept taking measurements forever). However, since we only have a few measurements, we have to be content with an estimate of that true mean. The standard error helps us to determine the distribution of that estimate. We don't care about how far an individual measurement tends to be off, we only care about how far our estimate of the weight is off.
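
A small sketch of the distinction made in this reply, using the three measurements from the example: the sample standard deviation describes how scattered individual readings are, while the standard error (sd / sqrt(n)) describes how uncertain the estimate of the mean is, and it shrinks as more readings come in.

```python
import numpy as np

readings = np.array([13.9, 14.1, 17.5])    # the three scale readings from the example

sd = readings.std(ddof=1)                  # sample standard deviation of individual readings
se = sd / np.sqrt(len(readings))           # standard error of the estimated mean

print(f"mean = {readings.mean():.2f} lb")
print(f"sample standard deviation = {sd:.2f} lb")
print(f"standard error of the mean = {se:.2f} lb")
```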

  • @dorinori8189
    @dorinori8189 7 years ago +35

    When you said small human, I imagined a small adult

    • @tofu.delivery.
      @tofu.delivery. 5 years ago

      Imagining calling Tyrian cute

  • @hadyaasghar7680
    @hadyaasghar7680 6 years ago

    great video presentation Brandon. please try to make more videos on other machine learning algorithms

  • @Ms2loopy
    @Ms2loopy 4 years ago +1

    Hi! This was such a clear explanation. It would be great if you could make one on hidden markov models.

    • @BrandonRohrer
      @BrandonRohrer  4 years ago +1

      Thanks Rachel! Hidden Markov Models are an excellent idea. I'll put it in my to do list.

  • @KazaaModo
    @KazaaModo 3 years ago +1

    That was fantastically done.

  • @balasubramanianilangovan888
    @balasubramanianilangovan888 4 years ago

    Hi Brandon, your video was simple, superb, and stupendous!

  • @UpayanRoy-n6u
    @UpayanRoy-n6u 3 months ago

    17:57 One query here. How is the P(m | w=17) distribution function calculated? What is the spread (S.D.)? How do we, for example, arrive at any certain value of the probability of getting m = 15.6lb given the true weight is 17lb?
    Thanks! Nice explanation.

  • @WilliamHarbert69
    @WilliamHarbert69 7 years ago +1

    Bravo! Wonderful presentation. Thank you for this presentation.

  • @garychap8384
    @garychap8384 4 years ago +2

    "When you have excluded the impossible, whatever remains, however improbable, must be -the truth- *possible*"
    There ya go, Sheerluck Holmes, I've fixed it for ya :)

  • @mohamedanasselyamani4323
    @mohamedanasselyamani4323 5 years ago +2

    Thank you very much for the best explanation, it's very interesting

  • @thomasprobst8601
    @thomasprobst8601 1 year ago

    Excellent introduction, thanks. Is there also a continuation concerning graphical prior selection and Jeffreys priors?