Sampling from Bayes' Nets

  • Published Oct 16, 2024

Comments • 62

  • @curdyco · 9 months ago +5

    Even after 11 years it's one of the best videos. Thank you

  • @sofiayfantidou6457 · 9 years ago +43

    Best and simplest thing I've found on the web so far about sampling on Bayes' nets!

  • @narjissyy5865 · 2 months ago

Excellent explanation. I appreciate it so much. Now I finally understand this after struggling so much with other resources. Thank you!

  • @adarbayan719 · 2 years ago +6

    Dear Mr. Abbeel, thank you for your explanation! It has been 10 years and this tutorial still rocks :)

  • @VonGazza82 · 9 years ago +26

    At 24:00, you added the last sample that had +d even though you were conditioning on -d. Just a small error.

  • @nikolatesla9598 · 4 months ago

Greatest explanation on an example. Thank you, Professor Pieter!

  • @kshitijtakarkhede7833 · 6 months ago +1

    Thank you sir!!!

  • @dadamkd · 4 years ago +4

    Good video, it's actually very simple when you see it in practice and not just as an abstract set of equations.

  • @aaronmailhot9370 · 3 years ago +5

Solid tutorial, thanks so much! My one suggestion for making it even better would be to tweak the details of the Likelihood Weighting example a bit (starts at 17:50). I was confused for a while about how one could get different weight values from the samples, because in this example the samples will indeed always return the same weights, since the weights are controlled only by evidence variables. I think changing the example to 'find P(-d | -b, -c)' would work (swap a out for b, so that a can be randomized and change which part of P(b | a) and P(c | a) the samples pull from).

  • @cangozpinar · 3 years ago

    Waaay better than anything I've seen so far. Much respect!

  • @ahmedlakhel5383 · 1 year ago

    Thank you so much you saved my life

  • @aplicano0921 · 5 years ago

This is the best video and the best teacher!

  • @bonkers33331 · 3 years ago +1

    Excellent description of Bayesian network sampling!

    • @PieterAbbeel · 3 years ago

      thank you, and glad it's helpful!

  • @hypebeastuchiha9229 · 2 years ago +1

    Thank you so so so much
    My professor could learn a thing from you

  • @sumowll8903 · 2 years ago

    This video is so good and clear!

  • @NazerkeSafina · 4 years ago

Thank you very much for this! I hope you get everything you want in life.

    • @figuraclass444 · 4 years ago

      what the hell? are you from kz?

    • @NazerkeSafina · 4 years ago

@@figuraclass444 Yes, I am

  • @mohammadabbasi5952 · 1 year ago

Great tutorial, Mr. Abbeel :)❤

  • @AshutoshSahuMRM · 10 months ago

    Great explanation !!!

  • @jazibjamil9883 · 5 years ago +6

In the last part you fixed B = -b and A = -a, but the samples you showed did not reflect this. That part is extremely wrong and needs rectification.

  • @jeffkral5016 · 3 years ago +1

At 9:04, does it really map to +c? 0.04 seems like it would map to -c.

  • @mustafaghaleb · 7 years ago

    The best tutorial about sampling. Thanks :)

  • @sparshjain6077 · 5 years ago

Very clear explanation. Awesome video. Thanks a lot!

  • @nealpobrien · 5 months ago

    Excellent video

  • @bhaitato · 6 years ago +1

    Thank you for this simple lesson!

  • @kudamushaike · 6 years ago

    Thank you sooo much. You just saved my exam

  • @aram4165 · 11 years ago

Thank you, professor, I really became a fan of your teaching style. One question: is likelihood weighting the same as Gibbs sampling?

  • @PieterAbbeel · 11 years ago +2

Hi Aram, yes, likelihood weighting is an instantiation of importance sampling. In general, in importance sampling you are interested in getting samples from a distribution Q1, but unfortunately you don't know how to sample efficiently from Q1, so you instead sample from another distribution Q2 that is easier to sample from --- and then you reweight your samples by the ratio Q1/Q2. In likelihood weighting: Q1 = P(unobserved variables | observed variables), and Q2 is the distribution defined by the sampling process.
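[Editor's sketch of the importance-sampling recipe described in the reply above. The two-node net A → B and all probabilities below are hypothetical, not the video's example.]

```python
import random

# Hypothetical net: A -> B, with P(+a) = 0.3, P(+b | +a) = 0.9, P(+b | -a) = 0.2.
P_A = 0.3
P_B_GIVEN_A = {True: 0.9, False: 0.2}

def likelihood_weighting(n_samples, evidence_b=True, seed=0):
    """Estimate P(+a | B = evidence_b) by likelihood weighting.

    Q2 samples only the unobserved variable A; the evidence B is fixed,
    and each sample is weighted by P(evidence | parents) = the Q1/Q2 ratio.
    """
    rng = random.Random(seed)
    total_w = query_w = 0.0
    for _ in range(n_samples):
        a = rng.random() < P_A                 # sample the unobserved A from its prior
        p_b = P_B_GIVEN_A[a]
        w = p_b if evidence_b else 1.0 - p_b   # weight = P(B = evidence | A = a)
        total_w += w
        if a:                                  # query event: A = +a
            query_w += w
    return query_w / total_w

# Exact answer by Bayes' rule: 0.3*0.9 / (0.3*0.9 + 0.7*0.2) = 0.27/0.41 ≈ 0.6585
estimate = likelihood_weighting(100_000)
```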

  • @prayaglehana7187 · 5 years ago

Best explanation of all!

  • @muhammedyusufsener1622 · 5 years ago

    Great explanation sir, thank you very much!

  • @shixinli2818 · 4 years ago

    Very helpful! Thank you very much

  • @AishwaryaRadhakrishnan34 · 5 years ago

    Best explanation!

  • @zhuoerlyu4705 · 5 years ago

Still the best video as of 2019

  • @Dr.hayder374 · 2 years ago

Dear Dr., at 10:06, is answering probabilistic queries used for creating conditional probabilities in the Genie software?

  • @dr.p.m.ashokkumar5344 · 4 years ago

Excellent. But how do you do sampling when the variables are continuous?

  • @preetivyas_ · 7 years ago

While doing likelihood weighting for the last question, how would you proceed when collecting samples, given there are multiple conditional queries?

    • @ManishPrajapatchamp · 6 years ago +1

You can directly add up the weights of the samples matching both the evidence and the queries, then divide by the sum of the weights over all samples matching the evidence.
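[Editor's sketch of the estimator described in the reply above. The weighted samples and variable names are made up for illustration; in likelihood weighting the evidence is already fixed in every sample, so only the query variables vary.]

```python
# Hypothetical (sample, weight) pairs produced by likelihood weighting.
weighted_samples = [
    ({"c": True,  "d": False}, 0.40),
    ({"c": False, "d": False}, 0.25),
    ({"c": True,  "d": True},  0.40),
    ({"c": True,  "d": False}, 0.10),
]

def estimate(query, samples):
    """P(query | evidence) ≈ (sum of weights where the query holds) / (sum of all weights)."""
    matching = sum(w for s, w in samples
                   if all(s[var] == val for var, val in query.items()))
    return matching / sum(w for _, w in samples)

# Samples 1 and 4 match the query, so the estimate is (0.40 + 0.10) / 1.15.
p = estimate({"c": True, "d": False}, weighted_samples)
```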

  • @nckporter · 7 years ago

    Hi, Where did you get the examples?

  • @lydiama7803 · 7 years ago

    Very clear. Thank you.

  • @breakinggood-r2v · 10 months ago

    excellent

  • @aram4165 · 11 years ago

Excuse me, I meant importance sampling. Is likelihood weighting the same as importance sampling?

  • @jarlaxleow · 2 years ago

    Bless

  • @zhuoerlyu4705 · 5 years ago

    Best video

  • @salimakraa5027 · 6 years ago +6

How did you find the weights of each sample at 20:58? Thanks in advance.

    • @sathblr · 6 years ago +3

I have the same question. Given the evidence (-a, -b), we don't sample 'a' and 'b' from the distribution; we sample only c and d. But there are still samples with the +a,+b / +a,-b / -a,+b combinations at 20:58!! How are the combined weights calculated? Thanks in advance, and thanks for the clear and simple explanation.

    • @gabrielemazzola9652 · 6 years ago

      @@sathblr In general, when you do likelihood weighting you have to weigh each sample by the likelihood of the evidence under that generated sample ("how much sense that sample makes, given the evidence of your query").
      This means the total weight is the product of the weights for each evidence variable (remember: evidence variables are always fixed by the query), i.e. the conditional probability of each evidence variable given its parents.
      In this particular case, the evidence variables are 'a' and 'b':
      - 'a' doesn't have parents.
      - 'b' has only 'a' as a parent.
      For this reason, the weight of each sample is P(A) * P(B|A), where A and B are the actual settings of those variables in the current sample.
      Example: the last sample (21:01) is "+a, +b, -c, -d" --> weight = P(+a) * P(+b | +a) = 0.8 * 0.8 = 0.64.
      For this reason, I believe the weights the teacher provided are just for the sake of explaining the concepts.
      Please correct me if I'm wrong. I hope this helped.
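[Editor's check of the weight arithmetic in the reply above, using the comment's illustrative CPT numbers (0.8 and 0.8).]

```python
# CPT entries quoted in the comment above (illustrative, not verified against the video).
p_plus_a = 0.8              # P(+a)
p_plus_b_given_plus_a = 0.8  # P(+b | +a)

# Weight of a sample with evidence settings +a, +b: the product of the
# conditional probabilities of each evidence variable given its parents.
weight = p_plus_a * p_plus_b_given_plus_a  # = 0.64
```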

  • @tega2754 · 7 years ago

    Nice explanation

  • @gourangpathak4443 · 1 year ago

Good explanation!

  • @vikankshnath8068 · 4 years ago

    Please make more small AI videos on different topics having good question examples.

  • @hardikchawla4966 · 5 years ago

I should be paying my college fees to this guy.

  • @DanielVazquez · 4 years ago

    Go bears!

  • @Wodro · 6 years ago

    0:00 earrape alert

  • @Workshirt · 7 years ago +1

    19:46 0.2 * 0.5 =/= 0.1

  • @manas_singh · 3 years ago

    From Rejection Sampling onwards, the explanation is bad. It is too fast.

  • @milos-simic · 7 years ago +1

    This is a good video, but you're talking too fast.

    • @buddhabrot · 5 years ago +1

      this video is understandable even without audio

  • @sheheryar89 · 5 years ago

    Good, but you were too fast, and the words were hard too.

  • @trollerxoxox · 9 years ago +1

    Can ya talk any fucken faster, who ya tryin to impress, a machinegun?

    • @970teejay · 7 years ago

      lmfao :'D

  • @peterpfankuchen · 6 years ago +1

Dude, get some space between your mouth and the mic, and talk a bit slower...