Accept-Reject Sampling : Data Science Concepts

  • Published Jan 12, 2021
  • How to sample from a distribution WITHOUT the CDF or even the full PDF!
    Inverse Transform Sampling Video: • Inverse Transform Samp...

Comments • 150

  • @MarioAguirre-jr1pm 14 days ago

    I'm doing my internship in a pretty heavy statistics role where I have to sample from very weird custom distributions; thanks for saving my life with these sampling vids.

  • @jochem2710 3 years ago +62

    Great explanation! A lot of professors just go through the formulas without providing an intuition for what's going on. I love that in the field of AI and Data Science there are so many great lectures and tutorials online. Makes you wonder how useful university really is. Keep up the good work!

    • @shivamverma3686 2 years ago

      Can you please name a few? I am new to this and would very much appreciate your help. Thanks in advance. :)

  • @Peter1992t 2 years ago +22

    I am so glad I found this channel right as I started my PhD program in biostatistics. You straddle the line between proof/mechanics and intuition perfectly. So many videos on these topics are either way too surface level that I can't immediately apply it, or way too technical that I don't develop the intuition for what's going on. This is not the first video of yours that I have felt this way about.
    These videos are so good that sometimes after you reach a milestone in an explanation (like explaining how the acceptance probability drives the mechanics for this method), I just sit in awe at how good of an explanation it was that I lose track of the 15 seconds that has passed and I have to rewind the video. You are doing an incredible job; please keep it up.

    • @ritvikmath 2 years ago +3

      Your words mean a lot, thank you. And I wish you best of luck in your PhD!

  • @azamatbagatov4933 2 years ago +1

    I just recently discovered your channel and I am glad I did! Clear, concise instruction. Thank you!

  • @vutsuak 2 years ago +9

    Huge fan! By taking the pain to explain the intuition (often ignored), application as well as maths, you've created an amazing series.

  • @jiayuzhou6051 1 year ago +2

    It's really nice for tutors to promise to come back later each time they want to introduce state-of-the-art solutions. It keeps students on track and motivates thinking.

  • @prasanthdwadasi6449 2 years ago

    Your video was a great help. Thanks for taking time and explaining the math and intuition clearly.

  • @RECIPE4DISASTR 2 years ago

    Thank you! After looking at six other sources that all explained it the same way and coming up short, I really appreciated the effort you took to explain it differently and intuitively. And great pajamas!

  • @maxgotts5895 1 year ago

    An excellent explanation of some really beautiful data science!! Thank you so much!

  • @sheetalmadi336 5 months ago

    You are very much underrated, man!! You deserve similar appreciation to any other highly rated channel like 3B1B or Veritasium. Maybe people nowadays go for animated videos only, but your words are very valuable, I could see that. You are trying to teach us the exact same way you learned it from scratch, and that helps a lot.

  • @fengjeremy7878 2 years ago

    Intuition is very important for understanding math. You make my learning journey much more comfortable!

  • @bhujaybhatta3239 1 year ago

    Truly Amazing Explanation

  • @accountantguy2 7 months ago

    Thank you! This explanation is so much better than what I got in my masters program.

  • @ingenierocivilizado728 3 months ago

    All your videos are very clear and useful. Thank you very much for your help and your effort!!!

  • @qiguosun129 2 years ago

    Clear explanation and the most intuitive ideas, cool!

  • @nayabkhan7564 1 year ago

    the only person that knows how to teach data science

  • @sinextontechnologies9484 3 years ago +5

    Couple of tricks for sampling: if you need to sample from a normal distribution, you can take N uniformly distributed random numbers and add them up (rand + rand + rand ...); you can then scale the result horizontally and vertically if you need to, and the result will be approximately normally distributed. Also, many times I need sampling from exponential distributions to get extreme behavior of the random variable; for this I take 1/rand or ln(rand)^2. These methods are pretty fast and robust.
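
    A minimal sketch of the sum-of-uniforms trick described above, plus the standard inverse-transform draw for an exponential (a common alternative to the 1/rand or ln(rand)^2 transforms mentioned). NumPy and the function names here are illustrative assumptions, not anything from the video:

        import numpy as np

        rng = np.random.default_rng(0)

        def approx_normal(n_samples, n_uniforms=12, mean=0.0, std=1.0):
            # A sum of n_uniforms iid U(0,1) draws has mean n/2 and variance n/12,
            # so shifting and rescaling gives an approximately normal sample (CLT).
            u = rng.uniform(size=(n_samples, n_uniforms)).sum(axis=1)
            return mean + std * (u - n_uniforms / 2.0) / np.sqrt(n_uniforms / 12.0)

        def exponential_inverse_transform(n_samples, rate=1.0):
            # Standard inverse transform for an exponential: -ln(U) / rate.
            return -np.log(rng.uniform(size=n_samples)) / rate

        print(approx_normal(5))
        print(exponential_inverse_transform(5, rate=2.0))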

  • @ankushkothiyal5372 2 years ago

    Thank you for these lectures.

  • @Gabriel-oy5kw 2 years ago

    Happy Holidays my fellow! Your content is marvelous......

  • @momotabaluga2417 10 months ago

    such a good explanation. 10/10

  • @milescooper3322 3 years ago +6

    The world needs people like you as teachers. Thanks.

  • @nicolebaker2902 3 years ago +1

    This is what I needed! I have gone through video after video trying to understand this. Fantastic job -- thank you!

    • @ritvikmath 3 years ago

      Glad it was helpful!

  • @phoebesteel5874 1 year ago

    love your videos bro they got me through my statistics paper xx

  • @ec-wc1sq 3 years ago

    great explanation, so much better than my professor....thanks for creating this video

  • @patrick_bateman-ty7gp 3 months ago

    Many articles go through the algorithm, but it never really made sense to me as to why this works. This is a crazy good explanation of why it works (especially the Bayes' theorem part for accepting a sample).

  • @daveamiana778 3 years ago +1

    Thanks for clarifying this to me. It greatly helped me get through.

  • @katieforthman3384 1 year ago

    Thank you for this great explainer! I would love to see a video on importance sampling.

  • @Phosphophyllite-lz4mb 7 months ago

    Great videos! Have been learning from them for a long time.👍👍👍

  • @yachtmasterfig 1 year ago

    ur so good at explaining this concept! Wow

  • @phuvuong9062 1 year ago

    Thank you very much. Great explanation.

  • @yaaaaaadiiiiiii 5 months ago

    Excellent! better than my teacher's 1 hour rambling words

  • @julialikesscience 7 months ago

    The method is so well-explained. Thanks a lot!

    • @ritvikmath 7 months ago

      You are welcome!

  • @olivier306 1 year ago

    Legendary explainer thanks!!

  • @prajwalomkar 3 years ago +1

    You're just brilliant. I wish my professors made it this easy. Thanks Ritvik

    • @ritvikmath 3 years ago +1

      My pleasure 😊

  • @luisrperaza 2 years ago

    Great explanation, many thanks for the video.

  • @hmingthansangavangchhia4913 1 year ago

    I was looking for accept/reject algorithm for generating rv's. So not actually what I was looking for but glad I stumbled on your channel. Subscribed.

    • @ritvikmath 1 year ago +1

      Glad you’re here!

  • @seansteinle2950 2 months ago

    Thank you so much for these videos! You are a life-saver.

    • @ritvikmath 2 months ago

      You're very welcome!

  • @sneggo 2 years ago

    Amazing explanation!! Thank you

  • @brady1123 3 years ago +8

    Very nice explanation!
    We use a similar technique in physics for molecular Monte Carlo simulations where we don't know the value of the partition function (i.e. the normalization constant) but we do know a state's Boltzmann factor (i.e. the numerator value). So when a new molecular state is proposed during the MC sim, you take a ratio of the two states' Boltzmann factors and that gives you the accept/reject probability.
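
    A rough sketch of the Metropolis-style accept/reject step this comment describes, for a toy one-dimensional state; the quadratic energy function and the proposal width are illustrative assumptions, not from the video or the comment:

        import math
        import random

        random.seed(0)

        def energy(x):
            # Stand-in for a real molecular energy function.
            return x ** 2

        def metropolis_step(x, beta=1.0, step_size=0.5):
            # The partition function cancels in the ratio of Boltzmann factors,
            # so only the unnormalized weights exp(-beta * E) are needed --
            # the same spirit as using f(s) without its normalizing constant.
            x_new = x + random.uniform(-step_size, step_size)
            ratio = math.exp(-beta * (energy(x_new) - energy(x)))
            if random.random() < min(1.0, ratio):
                return x_new   # accept the proposed state
            return x           # reject: keep the current state

        x = 0.0
        for _ in range(10000):
            x = metropolis_step(x)
        print(x)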

    • @ritvikmath 3 years ago +5

      Hey, that's super cool! I'm clearly more of a math person so I always love hearing when people have an actual application of some of these topics. Thanks!

  • @abroy77 3 years ago +1

    thanks a ton for all your content. It's incredibly helpful and beautifully composed. Best wishes

    • @ritvikmath 3 years ago

      You're most welcome!

  • @yuanhu7264 3 years ago +2

    This should be the quality of all UCLA discussion sessions, great job!

  • @komuna5984 1 year ago

    Great explanation!

  • @sksridhar1018 2 years ago

    Great explanation!!

  • @peterszyjka7928 1 year ago +1

    Magnifique! Do another AR video with some (one or two) examples! ... Maybe you did and I just haven't seen it... I jumped on this one since it was very good, easy to follow, and, as you stress, intuitive! "Right On," as we used to say back in the 60's out there in LA.

  • @edwardmartin100 3 years ago

    Brilliant. Thanks so much

  • @emiliaverdugovega7189 2 years ago

    thanks!! it was very helpful

  • @andreveiga1 2 years ago

    Great! Proof + intuition. Awesome!

  • @timlonsdale 2 years ago

    Thanks, this is great!

  • @SarthakMotwani a month ago

    Beautifully Explained.

    • @ritvikmath a month ago +1

      Thank you 🙂

  • @PatrickSVM 2 years ago

    Thanks, very good explanation!

  • @thegreatestshowstopper5860 2 years ago

    0:00 - 1:10 Why do we need Accept-Reject Sampling?
    1:10 - 4:00 The case problem
    4:00 - 6:25 How to sample from p(s) using g(s)
    6:25 - 8:20 Criteria for accepting/rejecting a sample from g(s)
    8:20 - 12:20 Mathematical explanation of why the acceptance criterion works for samples from g(s), using Bayes' Theorem
    12:20 - 15:08 Intuitive explanation of why the acceptance criterion works, by really understanding what f(s)/g(s) means
    15:08 - 17:20 Limitations of Accept-Reject Sampling and the importance of choosing the right g(s)
    17:20 - end of video: Conclusions and ending
    Thanks for the video! I love your explanations of this concept, especially the intuitive understanding part.

  • @RaviShankar-de5kb 1 year ago

    It's like magic!!! Thanks for explaining; 7:38 was a big key for me, I didn't get the magic at first.

  • @FUCKOFFYOUTUBEWITHTHISBULLSHIT 1 year ago

    Life saver!

  • @MasterMan2015 1 year ago

    Amazing content! Maybe we need a video about diffusion models and particle filter 😀

  • @andblom 23 days ago

    Well done!

  • @user-im7yo7hn5z 3 months ago +1

    definitely should have more followers!

  • @mino99m14 10 months ago

    Thank you for the amazing video. It's very useful when someone gives some intuition. Just one observation.
    Looking at Wikipedia, I can tell your proof is a bit misleading. You forgot to mention that the probability of acceptance is defined using a uniform distribution, instead of just getting there from the fact that
    P(A) = int( g(s) * P(A|s) ds ).
    With this you get to the same expression you use for P(A), but you also let your audience know that you need to use a uniform distribution to decide whether to reject or accept a sample.
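
    Spelling that step out in the video's notation, where f(s) = NC * p(s) and the acceptance probability is P(A|s) = f(s) / (M * g(s)), the standard argument runs:

        P(A) = int( g(s) * P(A|s) ds )
             = int( g(s) * f(s) / (M * g(s)) ds )
             = (1/M) * int( f(s) ds )
             = NC / M

    and then P(s|A) = g(s) * P(A|s) / P(A) = (f(s) / M) / (NC / M) = f(s) / NC = p(s), i.e. accepted samples follow the target. Drawing u ~ Uniform(0,1) and accepting when u < f(s)/(M*g(s)) is exactly what makes P(A|s) equal that ratio in the first place.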

  • @SpazioAlpha 2 years ago

    WAO! Great explanation! thanks thanks thanks!

  • @nudelsuppe2090 2 years ago

    Thank you!!

  • @levmarcus8198 3 years ago +3

    I've been hooked and watching through your videos in the past week. Do you have any favorite books or resources that you use for reference on the mathematical side of data science?

    • @ritvikmath 3 years ago +6

      Hey, thanks for the kind words. I get this question often and the admittedly unsatisfying answer is no. I've found that different resources out there do a really good job at different things or at least offer different ways of viewing the same problem. When I try and learn a new topic, or review an old topic when making a video, I'll look around at lots of different resources to see which path I want to take in explaining it. That said, I think the most important part for learning (in my opinion) is to write some basic code, doesn't have to be pretty, which implements the method. That way, you can do sanity checks to see if your understanding matches to how real data would behave. Plus, you get some coding experience out of it. Sorry to not have an answer to your initial question but I hope this helps regardless!
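
      As a concrete version of that "write some basic code and sanity-check it" advice, here is a small accept-reject sketch; all choices in it (the toy f, the normal proposal, the grid-based bound M) are illustrative assumptions, not from the video:

          import numpy as np

          rng = np.random.default_rng(0)

          def f(s):   # unnormalized target we pretend is hard to sample from
              return np.exp(-0.5 * (s - 1.0) ** 2) * (1.0 + 0.8 * np.tanh(s))

          def g(s):   # easy proposal density: Normal(1, 2)
              return np.exp(-0.5 * ((s - 1.0) / 2.0) ** 2) / (2.0 * np.sqrt(2.0 * np.pi))

          grid = np.linspace(-8.0, 10.0, 4001)
          M = 1.05 * np.max(f(grid) / g(grid))     # rough numerical bound so f <= M * g

          samples = []
          while len(samples) < 50000:
              s = rng.normal(1.0, 2.0)             # propose from g
              if rng.uniform() < f(s) / (M * g(s)):
                  samples.append(s)                # accept with probability f/(M*g)
          samples = np.array(samples)

          # Sanity check: normalized histogram vs. the normalized target p(s) = f(s)/NC.
          nc = np.sum(f(grid)) * (grid[1] - grid[0])
          hist, edges = np.histogram(samples, bins=60, range=(-8.0, 10.0), density=True)
          centers = 0.5 * (edges[:-1] + edges[1:])
          print("max abs deviation from p(s):", np.max(np.abs(hist - f(centers) / nc)))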

    • @levmarcus8198 3 years ago

      @@ritvikmath No problem. Thanks for the long response!

  • @amithanina25 2 years ago

    Thanks for the great explanation!
    Do you have any references for books about Accept-Reject Sampling?

  • @RomanNumural9 3 years ago +2

    Amazing video. Keep up the amazing work :)

    • @ritvikmath 3 years ago

      Thank you! Will do!

  • @vs7185 2 years ago

    Excellent explanation and mathematical proof! By the way, is it the same as or related to "Importance sampling"?

  • @Underwatercanyon 2 years ago +4

    Great explanation! One question though: if we have an f(s) that is easy to sample from, why can't we just directly sample from it and be done with it, rather than going through the sample-from-g(s) steps?

  • @ravinder1022 3 years ago +1

    Great Explanation brother !

    • @ritvikmath 3 years ago

      Glad it was helpful!

  • @sharmilakarumuri6050 3 years ago

    Ur explanation was awesome

    • @ritvikmath 3 years ago

      Thanks a lot 😊

  • @khalilibrahimi3807 3 years ago +1

    Man you're good. Thanks

    • @ritvikmath 3 years ago

      I appreciate that!

  • @shrill_2165 4 months ago

    Thanks dawg

  • @alsadab7151 1 year ago

    Thanks a bunch

  • @sukursukur3617 3 months ago

    Very good

  • @MrTSkV 3 years ago +1

    I think this looks kinda similar(-ish) to the MCMC algorithm? Maybe it's a good idea to cover MCMC in one of the next videos, since they are related.
    Anyway, that was a great video, I really enjoyed it. Keep up the good work!

    • @ritvikmath 3 years ago +3

      You're reading my mind. I put this video out first so that in the MCMC videos (releasing next week onward), we can compare it against this. Stay tuned :)

  • @zakariaaboulkacem7098 3 years ago +3

    Nice, thank you

  • @maximegrossman2146 3 years ago +1

    Excellent video!

    • @ritvikmath 3 years ago

      Thank you very much!

  • @renemanzano4537 1 year ago

    Superb

  • @smilebnnbnn 8 months ago

    NICE

  • @Kazzintuitruaabemss 1 year ago

    Thank you for the great explanation. I am studying this concept for an actuarial exam, and my textbook says the probability of accepting a sample is 1/M "on average." Is this just because they are assuming f(x) is a pdf already? The book doesn't mention normalizing constants at all.

  • @tianjoshua4079 3 years ago +2

    This is a great explanation! I do have a specific practical question though. In the student score example, how do we practically get f(s)? Since the setup is that we know f(s) but don't know (and want to know) p(s), it seems very curious to me how we could get f(s) in such an abstract math form, or any math form, in the first place.

    • @ritvikmath 3 years ago +3

      This is a valid question and indeed something that I also had confusion over for a long time. This is a common case in Bayesian stats.
      For example, P(A|B) = P(B|A)P(A) / P(B) by Bayes' theorem. We might care about sampling from P(A|B), which is the posterior, but we don't know its full form since the denominator P(B) might be difficult to compute. So, we can use Accept-Reject sampling to still sample from the posterior given only the numerator in Bayes' theorem.
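
      A toy sketch of that situation (all numbers and names here are illustrative, not from the video): a coin-bias posterior where we can evaluate the numerator, likelihood times prior, but never compute the evidence P(B), and we still sample the posterior by accept-reject with a Uniform(0,1) proposal:

          import numpy as np

          rng = np.random.default_rng(1)
          heads, flips = 7, 10     # hypothetical data

          def unnormalized_posterior(theta):
              # likelihood * prior; the evidence (the hard integral) is never computed
              return theta ** heads * (1 - theta) ** (flips - heads) * 1.0  # flat prior

          grid = np.linspace(1e-6, 1 - 1e-6, 10001)
          M = 1.05 * unnormalized_posterior(grid).max()   # bound vs. g(theta) = 1 on (0,1)

          samples = []
          while len(samples) < 5000:
              theta = rng.uniform()                        # propose from g = Uniform(0,1)
              if rng.uniform() < unnormalized_posterior(theta) / M:
                  samples.append(theta)

          print(np.mean(samples))   # should be near the analytic posterior mean 8/12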

    • @tianjoshua4079 3 years ago

      @@ritvikmath That makes much sense. P(B) is the normalizing constant. It is interesting that questions like this come up all over the place in engineering.

  • @samwhite4284 3 years ago +1

    Question - is it assumed that the threshold for classification (Accept vs Reject) from that probability function [f(x)/Mg(x)] is at 0.5?

    • @ritvikmath 3 years ago +1

      So, [f(x)/Mg(x)] will be some number, say it's 0.1. Then, we accept that sample with probability 0.1. That means we generate some random number "u" from the Uniform(0,1) distribution, and if it is less than 0.1 we accept the sample; otherwise we reject it.
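
      A one-function sketch of that step (the function name and argument layout are just for illustration):

          import random

          def accept(f_s, g_s, M, rng=random):
              # Accept the proposed sample with probability f(s) / (M * g(s)).
              u = rng.uniform(0.0, 1.0)        # u ~ Uniform(0, 1)
              return u < f_s / (M * g_s)       # accept iff u falls below the ratio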

  • @eliacharles5835 3 years ago +2

    Love the video. This may sound like a silly question, but do you use some sort of threshold to decide whether you accept or reject something? I get the intuition behind the ratio, but what's the process of actually accepting or rejecting?

    • @awangsuryawan7320 2 years ago

      Up

    • @zgbjnnw9306 2 years ago

      I have the same question.

    • @xinzhou4360 2 years ago

      Hi, the threshold is f(s)/(Mg(s)), which is in (0,1). The larger this ratio (f(s) larger, g(s) smaller), the better the sample represents f, so it should be accepted with greater probability. So now we can just generate u ~ U(0,1) and s ~ g(s); if u < the threshold, we accept. In other words, the closer f(s)/(Mg(s)) is to 1, the higher the probability that u falls below it, and the more likely the sample is accepted.

  • @JakeGreeneDev 2 years ago

    Great video but I have a follow-up question: we were told to assume that our equation for P(A|s) can be interpreted as a probability. Why? Can you point to a proof for this?

  • @aswathmg 1 year ago

    Hi, for that last case (where M would be large if we choose a Normal), how would you suggest solving that? Do we consider another distribution, or can we consider multiple distributions, like piece-wise, for sample creation?

    • @zenchiassassin283 1 year ago

      Both. There's something called adaptive rejection sampling, which creates a "proposal" distribution on the fly (see Bishop's book for more details). In general, rejection sampling has issues in high dimensions; Bishop's book explains this well, at the end of the section on adaptive rejection sampling.

  • @neelabhchoudhary2063 27 days ago

    how do you know whether to accept or reject your probability?

  • @Jameshazfisher a month ago

    Maybe we don't need f(s) to always be lower than Mg(s), if we allow outputting the sample multiple times. E.g. if f(s)/Mg(s) = 2, then we'd output s twice.

  • @juanete69 1 year ago

    When here you say "a sample" do you mean all the observations of a sample with a given size? Or do you mean the mean of that sample?

  • @bennettcohen130 2 years ago

    Holy fuck this is so clear

  • @Briefklammer1 3 years ago +6

    If you need a good g in ARS, why not just use g in place of p? The aim is to find a good density for your samples, right? So by finding a good enough g for ARS, you've already found a good density approximation. You don't need ARS at all, in my intuition. What is the advantage of ARS? Maybe to make the approximation even better?

    • @ritvikmath 3 years ago +2

      You bring up a very interesting question. Usually, the distributions, p, that we want to sample from are not very nice looking (can have many peaks, noisy, etc.), so finding a distribution g that is "similar" to p can be challenging or impossible. So, instead we use a g that is "close enough" to the target and use ARS to actually sample from the target.

    • @Briefklammer1 3 years ago

      @@ritvikmath Thanks for answering my question. So ARS can smooth the potentially noisy density, or find an easy alternative for p, if you have an approximate/good candidate g for p, right? But what is the advantage over kernel density estimation (KDE) with a special kernel k?

  • @BruinChang 2 years ago +1

    I am a little bit confused about inverse sampling. If I already have a pdf, why do I still need to bother with the inverse transformation to simulate a random number from the pdf I already obtained?

  • @_STEFFVN_ 1 year ago

    Wouldn't the NC multiply the integral of f(s) ds to make it equal 1? Therefore it should be 1/NC = the integral of f(s) ds, no?

  • @Pruthvikajaykumar 2 years ago

    With this method, we're trying to find p(s) right? and p(s) is f(s)/constant. To use this method we need to know what f(s) is. Then don't we already know what p(s) is? Can someone explain?

  • @sjb27182 2 years ago

    So just to be sure, M*g(s) isn't technically a probability density because integrating M*g(s) over s would give a value greater than 1 right?
    ie: M scales probabilities of s, not the observable values of s?
    [edit] Actually the scaling makes sense I think, I was confusing your f(s) which isn't a pdf, for p(s)

  • @Yohan87845 2 years ago

    Thank you very much. What is the level of probability to accept the sample?

    • @azamatbagatov4933 2 years ago

      To accept the sample s, you need to sample from the uniform distribution on the interval [0, M*g(s)]. Let's say it is u. If u is less than or equal to f(s), you accept, otherwise you reject.

  • @vasylcf 2 years ago +1

    Magic )

  • @zgbjnnw9306 2 years ago +1

    At 7:50, how do you decide whether a sample is accepted or rejected? Is it accepted if f(x)/(Mg(x)) > 0.5?

    • @mino99m14 10 months ago

      The ratio you get from f(X)/Mg(X) gives you the probability of acceptance. Then you use a uniform distribution from 0 to 1 and if the value is less than the ratio, you accept. If it's bigger you reject.

  • @marc-aureleagoua4918 1 year ago

    How can we choose M?

  • @Flaaazed a month ago

    You're saying it's hard to integrate some difficult pdf f(x) from -inf to +inf, but that integral is equal to 1, right, since it's a pdf? So it's not hard?

  • @omidomatali4510 2 years ago

    13:10, dude, a high ratio doesn't mean that f(s) is high; it's a ratio. And you started with "we know f", which I think is the problem: we don't. And the modes and whole curvature of f and g are kind of assumed to be parallel; in the case of a multimodal g and a unimodal f, I think this is not a good way to calculate p(s), it would damp the data beneath one of the modes of g. Still a good explanation.

  • @user-or7ji5hv8y 3 years ago +1

    Why do we know the pdf? Can you provide a real example of how we know the pdf, even though it may be hard to sample from it? Like the example you provided above, with exponentials. How did we even know that the pdf had that analytical form?

  • @jonathanparlett1172 1 year ago

    You say we need g(s) to be close to p(s), but you also say we don't know p(s) and in the illustration you show g(s) close to f(s), not p(s). Also in most of the other materials I see covering this method they seem to say that f(s) is in fact the known target distribution you want to sample from, just that it might be difficult to sample from directly.

  • @alexivanov8800 3 years ago +1

    How do you choose M?

    • @jsalca52 2 years ago

      You can use calculus. Find the maximum value of the ratio f(x)/g(x). If you then set M in the denominator, f(x)/(M g(x)), you ensure the ratio won't be greater than 1.
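
      A small sketch of finding M that way, done numerically for an illustrative pair (standard normal target, standard Cauchy proposal); SciPy's optimizer stands in for doing the calculus by hand:

          import numpy as np
          from scipy.optimize import minimize_scalar
          from scipy.stats import cauchy, norm

          f = norm.pdf        # target density
          g = cauchy.pdf      # proposal density

          # M = max_x f(x)/g(x); for this pair the calculus gives sqrt(2*pi/e) ~ 1.52.
          res = minimize_scalar(lambda x: -f(x) / g(x))
          M = f(res.x) / g(res.x)
          print(M, np.sqrt(2 * np.pi / np.e))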

  • @soumyajitganguly2593 1 year ago

    I don't get where the numerator assumption comes from. In real life I would just have the scores of the students, like 75, 63, 91, etc. Yes, I can create a histogram from them, but what is the numerator here?