14. Poisson Process I

  • Published Dec 26, 2024

Comments • 51

  • @matsiv5707
    @matsiv5707 10 months ago +4

    If you're wondering, a simple way to solve the limit is to take the log of the expression, compute the limit of that logarithm, and then take the exponential of the limit. You can use the fact that log(n!) grows like n log n. The rest of the computation is trivial.
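The trick described above can be sketched as follows (my notation, not the commenter's):

```latex
\lim_{n\to\infty} f(n) \;=\; \exp\!\Bigl(\lim_{n\to\infty}\log f(n)\Bigr)
\quad\text{(continuity of } e^{x}\text{)},
\qquad
\log n! \;=\; n\log n - n + O(\log n) \;\sim\; n\log n .
```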

  • @eneserdogan34
    @eneserdogan34 3 years ago +12

    To derive the formula at 26:07 (Poisson distribution) from 25:06 (Binomial distribution), you need to write the combination as factorials, then use Stirling's approximation for the factorials and the definition of e.

    • @maxdickens9280
      @maxdickens9280 1 year ago +3

      Well said! But, actually, Stirling is not a necessity. As n -> infinity, we have (n choose k) = (n * (n-1) * ... * (n - k + 1)) / k! -> n^k / k!
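A quick numerical check of the limit being discussed (a sketch; the function names are mine): holding lambda = n*p fixed, the Binomial(n, p) pmf approaches the Poisson(lambda) pmf as n grows.

```python
import math

def binom_pmf(k, n, p):
    """P(exactly k successes) under Binomial(n, p)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """P(k arrivals) under Poisson(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Fix lambda = n*p and let n grow: the binomial pmf approaches the Poisson pmf.
lam, k = 3.0, 2
for n in (10, 100, 10_000, 1_000_000):
    print(n, binom_pmf(k, n, lam / n))
print("Poisson limit:", poisson_pmf(k, lam))
```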

    • @oyaoya2468
      @oyaoya2468 1 year ago +1

      you don't need Stirling's approximation for that

    • @Longpan898
      @Longpan898 1 year ago

      Thanks 🙏🏻

  • @MayankGoel447
    @MayankGoel447 1 year ago +3

    This is a fantastic lecture on Poisson processes and where they come from, probably the best.

  • @dania_884
    @dania_884 3 years ago +10

    He teaches excellently! He makes complicated things easy to understand; that's the magic!

  • @nadekang8198
    @nadekang8198 6 years ago +26

    Even Professor Tsitsiklis might get stuck in the middle of the proof of the Poisson distribution PMF... but the proof is perfectly written in his textbook, which is by far the BEST I've ever read at the intro level for probability and statistics. It took me a long time to fully understand the Poisson process... thanks to MIT OCW and Professor Tsitsiklis.

    • @elmrabti
      @elmrabti 4 years ago

      Hi, where can I find his textbook, please?

    • @nadekang8198
      @nadekang8198 4 years ago +2

      @@elmrabti amazon

    • @elmrabti
      @elmrabti 4 years ago

      @@nadekang8198 Do you have it?

  • @elenag.224
    @elenag.224 3 years ago +4

    A big thank you, from Greek students!

  • @Naz-pk4ll
    @Naz-pk4ll 5 years ago +7

    You are a star, dear lecturer.

  • @hrodrick2890
    @hrodrick2890 11 years ago +20

    @sacredsoma The lecturer doesn't explain it, but in that part he is working with one of the several definitions of the Poisson process, in which the process is defined as a Lévy process. Deep knowledge of mathematical analysis, specifically measure theory, is required to understand these infinitesimal properties of Lévy processes; that's why he doesn't say much about those second-order terms.

    • @JaviOrman
      @JaviOrman 3 years ago +1

      Thanks for that clarification. I wrote in my notes that this part needed some more explanation.

  • @KyleGoryl
    @KyleGoryl 7 years ago +16

    Poisson begins at 9:20

  • @SomeHeavensStation
    @SomeHeavensStation 9 months ago

    I'm wondering, at 14:15, when the professor is discussing arrival independence (second bullet point): how can the arrivals be truly "independent" if this is being modeled by a PMF that must sum to 1? In other words, don't we create a kind of dependency exactly when an event has occurred?

  • @rikenm
    @rikenm 7 years ago +26

    my probability professor skipped the Poisson distribution, saying it is not so important.

  • @aniketsaha7455
    @aniketsaha7455 6 years ago +2

    Although it is intuitively true that T2, T3, ... are all independent exponential random variables, the pdf of the time of the k-th arrival for k = 2, 3, 4 is not exponential... How can you argue that? Anyone?
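The thread never resolves this, so here is a quick simulation sketch (the names are my own): the interarrival times are Exp(lambda), but the time of the k-th arrival is their sum, which is Erlang(k, lambda) with mean k/lambda, not exponential with mean 1/lambda.

```python
import random

# Time of the k-th arrival = sum of k independent Exp(lam) interarrival times.
def kth_arrival_time(k, lam, rng):
    return sum(rng.expovariate(lam) for _ in range(k))

rng = random.Random(0)
lam, k, n = 2.0, 3, 200_000
samples = [kth_arrival_time(k, lam, rng) for _ in range(n)]
mean = sum(samples) / n
# An Exp(lam) variable would have mean 1/lam = 0.5; the 3rd arrival time
# is Erlang(3, lam) with mean k/lam = 1.5.
print(mean)
```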

  • @xozhang
    @xozhang 7 years ago +7

    Excellent explanation !

  • @luigixu3251
    @luigixu3251 3 years ago +1

    51:46 he probably meant "this plus that" instead of "this plus that".

  • @adityasahu96
    @adityasahu96 4 years ago +3

    Too much information to grasp in one go... whoa, nice lecture!

  • @anexocelisia9377
    @anexocelisia9377 3 years ago

    the process has the memorylessness property (provided the person entering the room can't see the future)
    continuous-time variations
    definition of the Poisson process; we use the concept of memorylessness here
    time homogeneity, since time is continuous
    lambda*tau gives the expected number of arrivals in a very small interval
    different time slots are independent of each other
    the NUMBER of arrivals in disjoint time intervals will be independent of each other
    delta is very small
    the pmf is the binomial pmf
    here we take a big interval and split it into many infinitesimally small intervals
    the probability of an arrival in one of those intervals is then the same as a Bernoulli trial
    for the Poisson limit we let delta tend to 0
    merging of Poisson processes
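The last bullet (merging) can be sanity-checked with a small simulation sketch (the helper names are my own): merging two independent Poisson processes of rates l1 and l2 yields a Poisson process of rate l1 + l2.

```python
import random

def poisson_arrivals(lam, t_end, rng):
    """Arrival times of a rate-lam Poisson process on [0, t_end]."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(lam)  # Exp(lam) interarrival times
        if t > t_end:
            return times
        times.append(t)

rng = random.Random(1)
l1, l2, horizon = 2.0, 3.0, 10_000.0
merged = sorted(poisson_arrivals(l1, horizon, rng) + poisson_arrivals(l2, horizon, rng))
empirical_rate = len(merged) / horizon  # should approach l1 + l2 = 5
print(empirical_rate)
```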

  • @subhashjain2294
    @subhashjain2294 6 years ago +4

    thank you so much sir, you made learning very easy....

  • @animagous1
    @animagous1 8 years ago +1

    Why is the probability of k > 1 arrivals in time delta zero?

  • @zihaocheng6438
    @zihaocheng6438 9 years ago +5

    wonderful lecture

  • @achillesarmstrong9639
    @achillesarmstrong9639 6 years ago +2

    a masterpiece of a video

  • @grantguo9399
    @grantguo9399 7 months ago

    I enjoyed watching this math lecture like watching a movie

  • @richardellard
    @richardellard 5 years ago +13

    Not even a chuckle at the "B movie" joke?

  • @nguyentin6294
    @nguyentin6294 3 years ago

    I am having a hard time making sense of the notation at 11:22. I believe the notation should be the conditional probability P(k|t) rather than P(k,t). I interpreted the latter as a joint probability, and if that were the case, the summation of P(k,t) over all k for a fixed t could not equal 1. Anyone, please help knock some sense into my head!

    • @schobihh2703
      @schobihh2703 3 years ago +1

      t is of a different nature; it is not a random variable.

    • @nguyentin6294
      @nguyentin6294 3 years ago

      @@schobihh2703 could you clarify it a bit more?

    • @schobihh2703
      @schobihh2703 3 years ago +1

      @@nguyentin6294 As mentioned in the video, k is a random variable, i.e., the summation over all k is 1. "t" just specifies omega, the set of all k's; it is only a parameter, not an event. Or, the other way around, ask yourself: if t were an event, what would P(t) be? It simply doesn't make sense to ask that without specifying how many arrivals we are talking about.
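In the lecture's notation, this reply can be restated as (my notation):

```latex
P(k, t) \;\equiv\; \Pr\bigl(N_t = k\bigr),
\qquad
\sum_{k=0}^{\infty} P(k, t) = 1 \ \text{for each fixed } t,
```

so t indexes a family of PMFs over k; it is a parameter, not a random variable.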

    • @nguyentin6294
      @nguyentin6294 3 years ago

      @@schobihh2703 thank you for the insight :)

  • @blogginbuggin
    @blogginbuggin 5 years ago +2

    Prof. John's lecture on the Poisson PMF is available at: th-cam.com/video/eTLa9GDID8A/w-d-xo.html

  • @suyashmuley9096
    @suyashmuley9096 7 years ago

    What is the reason for considering second-order terms of delta, since it is just an interval?

  • @sacredsoma
    @sacredsoma 11 years ago

    I don't get the bit @ 15:50 about small-interval probabilities being a limiting case. Why are there any second-order terms?

    • @aryanparikh9085
      @aryanparikh9085 4 years ago +5

      I doubt that my comment will be useful to you at all since it's been 7 years, but I'll comment anyway for people who view this lecture in the future.
      The second-order terms exist because P(k, d) isn't 0 when k >= 2 (d here stands for delta).
      Regardless of how small an interval you pick, it is possible for more than one success to occur in it. That probability, however, converges to 0 much "faster" than d, so we can ignore it as negligible.
      This is an important assumption for using the binomial model, which only works when each slot has two possible outcomes: success and failure. If we allow multiple successes per slot, the model cannot be used.
      It follows that any process in which the probability of multiple successes in a given interval does not converge to 0 "faster" than the interval itself cannot be treated as a Poisson process. In our course notes at the University of Waterloo, this property is referred to as "individuality".
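The "faster than delta" claim in this reply can be checked exactly for a Poisson process (a sketch; the function name is mine): P(>= 2 arrivals in delta) = 1 - e^(-lam*delta)(1 + lam*delta) ~ (lam*delta)^2 / 2, so its ratio to delta vanishes as delta shrinks.

```python
import math

def prob_two_or_more(lam, delta):
    """Exact P(at least 2 arrivals in an interval of length delta) for rate lam."""
    return 1.0 - math.exp(-lam * delta) * (1.0 + lam * delta)

lam = 1.0
for delta in (1e-1, 1e-2, 1e-3, 1e-4):
    # The ratio P(>=2)/delta shrinks with delta, so P(>=2) is o(delta).
    print(delta, prob_two_or_more(lam, delta) / delta)
```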

    • @priyamdey3298
      @priyamdey3298 3 years ago

      @@aryanparikh9085 If possible, can you share the notes of this particular characteristic for qualifying a distribution as Poisson?

  • @bosepukur
    @bosepukur 8 years ago +1

    Good explanation, though a slightly more rigorous derivation would have helped a lot... I understand it's intended as an overview.

    • @Isaac668
      @Isaac668 7 years ago +2

      the course is applied probability; the point is not rigorous derivation

    • @JaviOrman
      @JaviOrman 3 years ago

      I would look into a textbook for complete rigorous derivations.

  • @WayWaterWay
    @WayWaterWay 4 years ago

    This is amazing

  • @huasongyin3711
    @huasongyin3711 9 years ago +4

    I really enjoyed this series of lectures just to refresh my memory. Thanks. But I am also surprised that an MIT professor of statistics can get stumped by such a simple task as deriving the Poisson PMF.

    • @niko97219
      @niko97219 8 years ago +27

      Anyone can get stumped by something once in a while.

    • @freeeagle6074
      @freeeagle6074 2 years ago +1

      He wrote the textbook, which contains the detailed derivation. He would probably have figured out how to derive the final Poisson formula if he had spent a few more minutes, but he decided to leave it for the students to read in the textbook themselves. By the way, some people do have the ability to memorize every detail at a glance; but that is a painful disease.

  • @mpakojohnk
    @mpakojohnk 3 years ago

    Proud to see another Greek teaching here

    • @lekokotonteso
      @lekokotonteso 3 years ago

      How do you know he is Greek? Please ignore any mistakes; I'm using translation software.

    • @mpakojohnk
      @mpakojohnk 3 years ago

      @@lekokotonteso Tsitsiklis is a Greek surname. Also, listen to his accent.

  • @Старкрафт2комедия
    @Старкрафт2комедия 9 years ago +1

    hahaha, this guy does not get ready for lectures hahah. so funny..