What is Fisher Information?

  • Published 12 Jun 2022
  • Explains the concept of Fisher Information in relation to statistical estimation of parameters from random measurements. Gives an example of parameter estimation in Gaussian noise, and shows the component functions to help provide intuition (a minimal numerical sketch of this example follows the video list below).
    Related videos: (see iaincollings.com)
    • What is Least Squares Estimation?
    • What is a Random Variable?
    • What is a Probability Density Function (pdf)?
    • What is a Multivariate Probability Density Function (PDF)?
    • What is the Kalman Filter?
    • What is a Cumulative Distribution Function (CDF) of a Random Variable?
    • What is a Moment Generating Function (MGF)?
    • What is a Random Process?
    • Expectation of a Random Variable Equation Explained
    • What is a Gaussian Distribution?
    • How are Matched Filter (MF), Zero Forcing (ZF), and MMSE Related?
    For a full list of Videos and Summary Sheets, go to: iaincollings.com
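    As a companion to the Gaussian example described above, here is a minimal Python sketch of the idea (not from the video; the Monte Carlo check and all variable names are illustrative assumptions). It estimates the Fisher Information of the mean of a Gaussian by averaging the squared score, and compares the result against the closed-form value 1/sigma^2:

      import numpy as np

      # Model: y = theta + noise, with noise ~ N(0, sigma^2), so y ~ N(theta, sigma^2).
      # Score: d/d(theta) ln f(y; theta) = (y - theta) / sigma^2
      # Fisher Information: I(theta) = E[score^2], which works out to 1/sigma^2.
      rng = np.random.default_rng(0)
      theta, sigma = 1.0, 0.5                  # true parameter and noise std dev
      y = rng.normal(theta, sigma, 1_000_000)  # a large batch of random measurements

      score = (y - theta) / sigma**2           # derivative of the log-pdf at theta
      print(np.mean(score**2))                 # Monte Carlo estimate, approx 4.0
      print(1 / sigma**2)                      # closed form: 1/0.25 = 4.0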

Comments • 82

  • @akaakaakaak5779
    @akaakaakaak5779 1 year ago +11

    Love the format of the video. Emphasising the intuition just makes everything else clearer

  • @NONAME_G_R_I_D_
    @NONAME_G_R_I_D_ 1 year ago +10

    Great video! I have been following your content for quite a while now, and you always try to give the intuition behind each process. This really helps to understand the material :) Much appreciated!!!

    • @iain_explains
      @iain_explains  1 year ago +1

      I'm so glad to hear that you like the videos, and find the intuition helpful.

  • @user-cl7vh1tz3t
    @user-cl7vh1tz3t 6 months ago +1

    This is really a great explanation. You made a difficult concept (at least for me) very easy to understand. I’ve been watching other videos with animations and all, but I only understood this well after watching your explanation. Thank you very much.

    • @iain_explains
      @iain_explains  6 months ago

      That's so great to hear. I'm glad it was helpful!

  • @ImranMoezKhan
    @ImranMoezKhan 2 years ago +2

    What a wonderful coincidence! Here I am deriving a CRB for a noise variance model I'm researching, and running MLE simulations to verify it, and your video with this great explanation of FI comes up :-). I've read that the multivariate FIM can be considered a metric in the parameter space, and with your explanation of how the derivative takes into account the variation of the PDF w.r.t. the parameter, I can almost visualize it for an intuitive understanding - fascinating concepts. Thanks Iain!

    • @iain_explains
      @iain_explains  1 year ago +1

      I'm so glad it was helpful. The concept is not particularly intuitive, especially when considering the multivariate case with the FIM. Perhaps the FIM can be the topic of a future video.
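    For readers following this thread: the multivariate Fisher Information Matrix (FIM) mentioned above is conventionally defined, under the usual regularity conditions, as

      [\mathcal{I}(\boldsymbol{\theta})]_{ij} = \mathbb{E}\!\left[ \frac{\partial \ln f(\mathbf{y};\boldsymbol{\theta})}{\partial \theta_i} \, \frac{\partial \ln f(\mathbf{y};\boldsymbol{\theta})}{\partial \theta_j} \right]

    which reduces to the scalar Fisher Information from the video when theta has a single component.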

  • @sdsa007
    @sdsa007 11 months ago +1

    I'm a visual learner, so the graphs really helped! I almost gave up halfway through the video, but I'm glad I hung on!

    • @iain_explains
      @iain_explains  11 months ago +1

      I'm glad you found the graphs helpful.

  • @mahtabsv
    @mahtabsv 6 hours ago

    Thank you very much for this amazing video! You made understanding this concept very easy.

  • @SavasErdim-ly8xo
    @SavasErdim-ly8xo 1 year ago

    Great video to understand the Fisher Information intuitively. Thank you Prof. Iain.

  • @sintumavuya7495
    @sintumavuya7495 7 months ago

    Thank you for explaining the logic behind that formula. Knowing the why helps me remember easily and just makes it all make sense.

    • @iain_explains
      @iain_explains  7 months ago +1

      That's great to hear. I'm glad you found the video helpful.

  • @sharp8710
    @sharp8710 1 month ago

    Thank you for the video. Love the way you used simple examples to explain the theory intuitively and decomposed the expression explaining the meaning of each part!

    • @iain_explains
      @iain_explains  1 month ago

      Glad it was helpful!

  • @chengshen7833
    @chengshen7833 2 years ago

    Thanks a lot. This really provides an excellent complementary explanation to S. Kay's book. In the book, Fisher Information is interpreted as the 'curvature of the log-likelihood function', where the expectation of the squared 1st derivative can be converted to the negative of the expectation of the 2nd derivative (the identity is written out after this thread), and f is viewed as a function of theta with Y fixed. The meaning of the natural log becomes more subtle when it comes to the derivation of the CRLB.

    • @iain_explains
      @iain_explains  1 year ago

      Glad it was helpful! It's a concept that took me a long time to get intuition on, when I first learned about it.
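    The curvature interpretation mentioned in this thread is the standard identity (it holds under regularity conditions that permit exchanging differentiation and expectation):

      I(\theta) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial \theta} \ln f(Y;\theta)\right)^{2}\right] = -\,\mathbb{E}\!\left[\frac{\partial^{2}}{\partial \theta^{2}} \ln f(Y;\theta)\right]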

  • @aliosmancetin9542
    @aliosmancetin9542 1 year ago

    Awesome video! You concentrate on giving the intuition, thanks!

  • @mikewang4626
    @mikewang4626 1 year ago

    Thanks a lot for your intuitive explanation with diagrams. The explanation about why Fisher Information looks like that is quite useful to understand the definition!

    • @iain_explains
      @iain_explains  1 year ago

      That's great to hear. I'm glad you liked the video.

  • @marirsg
    @marirsg 1 year ago

    Beautifully explained! Thank you

  • @oO_toOomy_Oo
    @oO_toOomy_Oo 6 months ago

    I appreciate your work Mr. Iain, it is very helpful; it gives a great sense of signals and systems.

    • @iain_explains
      @iain_explains  6 months ago

      Glad it was helpful!

  • @menglu5776
    @menglu5776 8 months ago

    Thank you so much. I was doing literature research for a novel Cramer-Rao lower bound application. Your video helped me a lot!

  • @qudratullahazimy4037
    @qudratullahazimy4037 1 year ago

    Absolutely great explanation! Made my life easy.

  • @SignalProcessingWithPaul
    @SignalProcessingWithPaul 9 months ago

    Hey Iain, great content. Have you considered doing a video on the Cramer-Rao bound (and how it relates to Fisher Information)? I was thinking of doing some statistical signal processing videos on my channel, but you've covered so much already haha.

    • @iain_explains
      @iain_explains  9 months ago

      Thanks for the suggestion. Yes, I've thought about it. It's the inverse of the Fisher information, but there's really not much intuition as to why this is the case - except that the maths turns out that way.
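    For reference, the relationship described in this reply is the Cramer-Rao lower bound: for any unbiased estimator \hat{\theta} of theta,

      \operatorname{var}(\hat{\theta}) \geq \frac{1}{I(\theta)}

    so a larger Fisher Information means a smaller achievable estimator variance.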

  • @stillwalking78
    @stillwalking78 7 months ago

    The most informative video on Fisher information I have seen, pun intended! 😄

  • @user-fk2pc9zp3t
    @user-fk2pc9zp3t 1 year ago

    Very well explained, thank you!

  • @ZardoshtHodaie
    @ZardoshtHodaie 1 year ago

    The beauty of math becomes evident when a good teacher teaches it :) ... thank you! thank you!

  • @mustaphasadok3172
    @mustaphasadok3172 2 years ago

    Amazing... Thank you professor. In the literature, clear books on the subject are rare, besides Prof. Steven M. Kay's collection.

  • @cerioscha
    @cerioscha 1 year ago

    Great video, thanks!

  • @lostmylaundrylist9997
    @lostmylaundrylist9997 1 year ago +1

    Excellent!

  • @bobcoolmen1571
    @bobcoolmen1571 1 month ago

    Excellent video thank you sir.

    • @iain_explains
      @iain_explains  1 month ago

      Glad it was helpful!

  • @nzambabignoumba445
    @nzambabignoumba445 6 months ago

    Wonderful!!!!

    • @iain_explains
      @iain_explains  6 months ago

      Glad you like it!

  • @rohansinghthelord
    @rohansinghthelord 3 months ago

    I'm a little confused as to why we take the log. Specifically, wouldn't we want the part of the function that changes the most to have more weight in the expectation? Aren't small changes not that notable in comparison?

  • @niveditashrivastava8374
    @niveditashrivastava8374 1 year ago

    Very informative video. The normal distribution is plotted for a particular value of the mean. How can we perform differentiation w.r.t. the mean? Am I missing something here?

    • @iain_explains
      @iain_explains  1 year ago

      In the definition of Fisher Information, there is a log function. This cancels out the exponential function in the pdf f(y;theta).
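      To make this concrete (using the video's Gaussian measurement model, y = theta + n with n ~ N(0, sigma^2)):

        \ln f(y;\theta) = -\tfrac{1}{2}\ln(2\pi\sigma^{2}) - \frac{(y-\theta)^{2}}{2\sigma^{2}}, \qquad \frac{\partial}{\partial\theta}\ln f(y;\theta) = \frac{y-\theta}{\sigma^{2}}

      so the log turns the exponential into a polynomial that is easy to differentiate with respect to the mean, and I(\theta) = \mathbb{E}[((y-\theta)/\sigma^{2})^{2}] = 1/\sigma^{2}.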

  • @govindbadgoti8678
    @govindbadgoti8678 1 year ago

    I am from India. Your video is so informative.

    • @iain_explains
      @iain_explains  1 year ago

      I'm glad you found it helpful.

  • @pitmaler4439
    @pitmaler4439 2 years ago

    Thanks. Is the FI only useful when you compare situations with identical PDFs?
    There doesn't seem to be a unit for the FI, or can you compare parameters with different PDFs?

    • @iain_explains
      @iain_explains  2 years ago +2

      Just to clarify, the FI is not "comparing situations". It is a measure of the information in a random variable drawn from a (single) PDF. Of course the FI measure for different random variables (PDFs) can be compared.

    • @pitmaler4439
      @pitmaler4439 2 years ago +1

      @@iain_explains Yes, I just thought: you get a number for the FI, but for what purpose? Without a unit you cannot put the number in relation to anything. Now I've read that the value is used for the Cramer-Rao bound.

    • @khalifi2100
      @khalifi2100 2 years ago +2

      Example: for the uniform distribution, the FI is zero, and this is why they call the uniform PDF "uninformative". So FI is a helpful measure even outside of Cramer-Rao bound calculations.

  • @arjunsnair4986
    @arjunsnair4986 2 years ago

    Thank you sir

  • @natsuaya2859
    @natsuaya2859 1 year ago

    Great video, thank you! What about Fisher Information and the CRLB?

    • @iain_explains
      @iain_explains  1 year ago

      Yes, thanks for the suggestion. It's on my "to do" list.

  • @ujjwaltyagi3030
    @ujjwaltyagi3030 1 year ago

    Also, in the video we never talked about which measurement gives the most info about theta. Is it the pills, the coffee, or the arm?

  • @xinpeiwu6086
    @xinpeiwu6086 1 year ago

    Absolutely made everything understandable, better than my college professor 😁

    • @iain_explains
      @iain_explains  1 year ago

      Great. I'm glad you found it useful.

  • @a.nelprober4971
    @a.nelprober4971 1 year ago

    Am I a dummy? For the Fisher Information of theta I have computed theta/(population variance). (I have theta instead of 1.)

  • @gamingandmusic9217
    @gamingandmusic9217 1 year ago

    Sir, can you please tell the difference between
    1. Maximum Likelihood (ML)
    2. Maximum A Posteriori (MAP)
    3. Least Squares (LS)
    4. Minimum Mean Square Error (MMSE)
    5. Zero Forcing (ZF)?
    Moreover, are the equalizer and receiver the same?
    If possible, please post a video on this topic sir. Thank you so much for inspiring us sir.

    • @iain_explains
      @iain_explains  1 year ago +2

      Have you checked out my webpage, iaincollings.com? I've already got videos on all the topics you ask about. There are lots of them, but the three most relevant would be: "What are Maximum Likelihood (ML) and Maximum a posteriori (MAP)?" th-cam.com/video/9Ahdh_8xAEI/w-d-xo.html and "How are Matched Filter (MF), Zero Forcing (ZF), and MMSE Related?" th-cam.com/video/U3qjVgX2poM/w-d-xo.html and "What is Least Squares Estimation?" th-cam.com/video/BZ9VlmmuotM/w-d-xo.html

    • @gamingandmusic9217
      @gamingandmusic9217 1 year ago +1

      @@iain_explains Thank you so much sir. You have taken the time to give me a reply and all the links. Thank you so much again sir.

  • @vedantnipane2268
    @vedantnipane2268 6 months ago

    🎯 Key Takeaways for quick navigation:
    00:01 🌡️ *Fisher Information measures information content in measurements about a parameter. An example is given with various methods to measure stomach temperature.*
    03:23 📊 *The Fisher Information formula is explained, involving the expected value of the squared derivative of the log of the probability density function (pdf) of a random variable.*
    07:22 📉 *Fisher Information is inversely proportional to the variance of noise in measurements. Small noise leads to high information, while large noise results in low information.*
    14:22 🔄 *The log function in the Fisher Information formula enhances small values in the pdf, ensuring contributions from all parts of the distribution, giving a comprehensive measure.*
    16:50 📈 *Fisher Information decreases as noise (standard deviation) increases, illustrated with visualizations of pdf changes and their impact on the information measure.*
    Made with HARPA AI

  • @tuongnguyen9391
    @tuongnguyen9391 2 years ago

    Could you kindly explain "what is polarization" in polar codes? :)

    • @iain_explains
      @iain_explains  1 year ago +1

      I'll have to give that one some more thought. I don't really have a good intuitive explanation right now.

    • @tuongnguyen9391
      @tuongnguyen9391 1 year ago +1

      @@iain_explains Thank you professor

  • @InquilineKea
    @InquilineKea 1 month ago

    12:00

  • @thomascook7948
    @thomascook7948 7 months ago

    I wish you were my professor.

    • @iain_explains
      @iain_explains  7 months ago

      That's nice of you to say. I'm glad you like the videos.

  • @ujjwaltyagi3030
    @ujjwaltyagi3030 1 year ago

    Mistake at 15:07: Fisher Information would be 16, as 1/(0.25)^2 = 16.

    • @iain_explains
      @iain_explains  1 year ago

      No, you're mistaken. I said the variance is 0.25. The symbol for variance is sigma^2. (The arithmetic is spelled out after this thread.)

    • @ujjwaltyagi3030
      @ujjwaltyagi3030 1 year ago +1

      @@iain_explains OK, thanks, my bad.
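    For anyone else tripped up here: with the variance \sigma^{2} = 0.25 (i.e. \sigma = 0.5), the Fisher Information is

      I(\theta) = \frac{1}{\sigma^{2}} = \frac{1}{0.25} = 4

    whereas 1/(0.25)^2 = 16 would be correct only if 0.25 were the standard deviation sigma rather than the variance.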

  • @musaadalruwaili5772
    @musaadalruwaili5772 2 years ago

    Hi, I really enjoy your videos, and I have learned a lot. I tried to find your email to contact you, but I could only find your University email. I am a Ph.D. student, and I am working on D2D based on NOMA systems. So, could you please explain the D2D system and how it works? Thank you

    • @iain_explains
      @iain_explains  1 year ago

      Thanks for the suggestion. I'll put it on my "to do" list.