An Introduction to the F Distribution
- Published Feb 6, 2025
- A brief introduction to the F distribution, an important continuous probability distribution that frequently arises in statistical inference. I discuss how the F distribution arises, its pdf, mean, median, and shape.
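For reference (a standard summary, not text from the video itself), the F distribution with $\nu_1$ numerator and $\nu_2$ denominator degrees of freedom arises as the ratio of two independent chi-square random variables, each divided by its degrees of freedom:

$$F = \frac{U/\nu_1}{V/\nu_2}, \qquad U \sim \chi^2_{\nu_1}, \; V \sim \chi^2_{\nu_2} \text{ independent},$$

and its probability density function, which is where the gamma function appears, is

$$f(x) = \frac{\Gamma\!\left(\tfrac{\nu_1+\nu_2}{2}\right)}{\Gamma\!\left(\tfrac{\nu_1}{2}\right)\Gamma\!\left(\tfrac{\nu_2}{2}\right)} \left(\frac{\nu_1}{\nu_2}\right)^{\nu_1/2} x^{\nu_1/2-1} \left(1+\frac{\nu_1}{\nu_2}x\right)^{-(\nu_1+\nu_2)/2}, \qquad x > 0.$$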
THANK YOU SO MUCH for all of your videos!! You will be the reason I don't fail this class!
How'd your class end up going?
These videos have really helped me understand probability and statistics. I got more out of them than I did in the classroom. Thank you so much!
Watching this in 2018, and probably someone else will in years to come. Thanks!
You're welcome! I definitely hope they provide a little value to the world for a long time. Cheers.
Yessir
I'm glad to be of help!
This is fantastic!!! I will watch this every time before my homework.
Out of curiosity, how did your class go?
These videos are very helpful. You have made it so easy for us to learn statistics. Truly appreciate it.
sheenu sharma I'm glad you've found them helpful! All the best.
Can I please request that you upload a video on Bayes' theorem?
Thank you, I can understand every video you upload very clearly. Keep it up!
I have learnt many things from your channel. Thank you!
You are very welcome.
You're a lifesaver, seriously!
Thanks for the explanation. It's interesting to see that the F distribution is a ratio of variances, and how the gamma function comes into play in the definition of the probability density function of F.
Thanks for your video.
I saw that the F distribution is also widely used in ANOVA and regression, where the F test is used to test many things.
Could you make some videos about the application of the F test in these areas, and also a video about MANOVA? I found it very difficult to understand MANOVA and how to interpret its results.
Thank you so much, and I hope to see more videos.
Thanks! I'm glad you liked it!
Thanks for your video. I have a question: what is the F distribution important for?
Thank you for being way more helpful than my teachers! There is a small error at 2:50: according to Google, v should have a minus sign in the denominator. This explains why mu is smaller than 1, but I don't know why mu is negative.
The expression I give for mu at 2:50 is correct. Mu can't possibly be negative, as none of the possible values of the random variable are negative.
@jbstatistics I agree that mu is positive. I was confused because mu looked less than 1, but actually I was looking at the mode. Anyway, I understand now :) Thank you~
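For reference, the standard formulas being compared in this exchange, for an F distribution with $\nu_1$ numerator and $\nu_2$ denominator degrees of freedom, are

$$\mu = \frac{\nu_2}{\nu_2-2} \quad (\nu_2 > 2), \qquad \text{mode} = \frac{\nu_1-2}{\nu_1}\cdot\frac{\nu_2}{\nu_2+2} \quad (\nu_1 > 2),$$

so the mean is always greater than 1 while the mode is always less than 1, which is the source of the confusion above.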
Nice job, jb!!! I like all your videos; they're short and easily understandable. Keep it up!
Thanks Pranav. I'm glad you like my videos. I'll definitely be adding more in the not-too-distant future.
Do you have a discussion of the Weibull and Erlang distributions?
Your videos are simply outstanding. I have only one question. When conducting an F test, the F statistic is usually given by σ_a^2/σ_b^2. However, when deriving the pdf of the F distribution, the ratio of two independent χ^2-distributed variables (each with some degrees of freedom) is considered. How do we come to know that σ_a^2 or σ_b^2 follows a χ^2 distribution? Otherwise, the ratio σ_a^2/σ_b^2 cannot be expressed as an F statistic.
First, sigma^2 is a fixed-value parameter, whereas S^2 is a statistic. So it is only S^2 that is a random variable with a probability distribution. But, with the appropriate changes to notation, the answer is that it is a consequence of sampling from a normal distribution. If we're sampling n observations from a normally distributed population that has a variance of sigma^2, then the quantity S^2(n-1)/sigma^2 has a chi-square distribution with n-1 degrees of freedom. This is often covered in a first course in mathematical statistics.
So when sampling from normally distributed populations A and B, S_a^2(n_a - 1)/sigma_a^2 ~ chi-square with n_a - 1 DF, and S_b^2(n_b - 1)/sigma_b^2 ~ chi-square with n_b - 1 DF. If the samples are independent, then under the null hypothesis of equality of population variances (sigma_a^2 = sigma_b^2, and let's just call this sigma^2), the ratio of these independent chi-squares, each divided by its respective degrees of freedom, is [S_a^2(n_a - 1)/sigma^2] / (n_a - 1) divided by [S_b^2(n_b - 1)/sigma^2] / (n_b - 1), which reduces to S_a^2/S_b^2. Since S_a^2/S_b^2 is a ratio of independent chi-squares over their respective degrees of freedom (n_a - 1 and n_b - 1), S_a^2/S_b^2 ~ F(n_a - 1, n_b - 1).
If we're not sampling from normally distributed populations, then all bets are off.
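As an illustration of the reply above, here is a minimal simulation sketch (not from the video) checking that, when sampling from normal populations with equal variances, S_a^2/S_b^2 behaves like an F(n_a - 1, n_b - 1) random variable. The sample sizes, common sigma, and number of replications are arbitrary choices, and NumPy/SciPy are assumed to be available.

```python
# Simulation sketch: ratio of sample variances from two independent normal
# samples with a common variance, compared to the F(n_a - 1, n_b - 1) distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_a, n_b, sigma, reps = 8, 12, 2.0, 100_000  # arbitrary illustrative choices

ratios = np.empty(reps)
for i in range(reps):
    a = rng.normal(0.0, sigma, n_a)            # sample from population A
    b = rng.normal(0.0, sigma, n_b)            # sample from population B
    ratios[i] = a.var(ddof=1) / b.var(ddof=1)  # S_a^2 / S_b^2

print("simulated mean:  ", ratios.mean())
print("theoretical mean:", stats.f.mean(n_a - 1, n_b - 1))  # (n_b - 1)/(n_b - 3)
print("KS p-value:      ", stats.kstest(ratios, "f", args=(n_a - 1, n_b - 1)).pvalue)
```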
@jbstatistics Thank you so much for your lucid explanation. Now it's clear.
Is it Snedecor's F distribution, or Fisher's?
Dude, I have watched a few of your videos and I have to say that you are a very good explainer (if that is even a word!) of things. Please keep it up! Can I request videos on moment generating functions and probability generating functions, if possible?
So for fit statistics, if I want to compare two models, is it F = (x_1^2/v_1) / (x_2^2/v_2)?
Thanks for the playlist, but video #9 is private/missing.
Useful video and macho voice, bro.
Very good.
How do I calculate F distribution probabilities using tables? Do you have a video for that?
+gaurav gregrath I show how to use the F table in this video: th-cam.com/video/mSn55vREkIw/w-d-xo.html.
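As an aside not covered in the linked video: instead of a printed table, the same tail areas and critical values can be computed directly in software, for example with SciPy (the degrees of freedom and the value 3.0 below are arbitrary illustrative choices).

```python
# F-distribution probabilities and critical values without a table
# (5 and 10 degrees of freedom are arbitrary illustrative choices).
from scipy import stats

dfn, dfd = 5, 10                      # numerator and denominator degrees of freedom
print(stats.f.cdf(3.0, dfn, dfd))     # P(F <= 3.0)
print(stats.f.sf(3.0, dfn, dfd))      # P(F > 3.0), the upper-tail area a table gives
print(stats.f.ppf(0.95, dfn, dfd))    # 95th percentile, i.e. the 0.05 critical value
```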
Watching in 2024
Thank you! Great video :)
Thank you very much.
+Benjamin Franklin You are very welcome.
It was useful... thank you!
You are very welcome!
Rahul Pradeep session watched👍
I love you.
Thank you!
Thank you so much.
You are very welcome!
Why did you stop making videos?
Cuz he deaad
F-this, but in the best way possible! 👍
Fuck I still don't understand
Thank you so much