Maybe it's relevant to mention that the probability ~0.0134 can be estimated numerically in R with the code: many = 10000000; s = 0; for (i in 1:10) s = s + runif(many); sum(s >= 7) / many
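For readers who prefer Python, a minimal sketch of the same Monte Carlo estimate (the function name and trial count are my own choices; being Monte Carlo, the result varies run to run):

```python
import random

def estimate_tail(trials=200_000):
    # Monte Carlo estimate of P(X >= 7), where X is a sum of 10 Uniform(0,1) draws
    hits = sum(
        1 for _ in range(trials)
        if sum(random.random() for _ in range(10)) >= 7
    )
    return hits / trials

print(estimate_tail())  # typically lands near 0.0134
```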
Do you have any video about Chernoff bounds?
Please reply, I need this urgently.
10:21 If the CLT provides the best upper bound, do we still need to know the Chebyshev and Markov inequalities? If the answer is yes, when should we use each of them?
The CLT is a limiting case and not an upper bound
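To illustrate this reply: the CLT yields an approximation, not a guaranteed bound. A Python sketch of the normal approximation to P(X >= 7), assuming X is the sum of 10 Uniform(0,1) variables as in the video:

```python
import math

mu = 10 * 0.5        # E[X] = 10 * E[Xi] = 5
var = 10 * (1 / 12)  # Var(X) = 10 * Var(Xi) = 10/12
z = (7 - mu) / math.sqrt(var)

# upper tail of the standard normal via the complementary error function
p_clt = 0.5 * math.erfc(z / math.sqrt(2))
print(p_clt)  # about 0.014, close to the true value ~0.0134
```

Note the approximation can land on either side of the true probability, which is why it is not a bound.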
very clear explanation! appreciate the mini lecture 👍
This is really really great. Thank you very much!
The central limit theorem only applies asymptotically, with convergence at rate 1/sqrt(n). This is a misleading video, in my opinion.
There is some concept about the Expectation value that I am not getting and how you are allowed to move around the numbers like that for Cheyshevs. What is it?
Very nice explanation thank you
At 4:21, how do you get from 1/2 * (Var(X) / 2^2) to 1/8 * (10 * 1/12)?
The 1/8 is the 1/2 * 1/(2^2). The 10*1/12 is Var(X) = n*Var(Xi). Here n = 10 and Xi is a uniform RV, which have variance (b-a)^2 / 12, or in our case (1-0)^2 / 12 = 1/12.
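The arithmetic in this reply can be checked in a few lines. A Python sketch (the extra factor 1/2 comes from the symmetry argument discussed elsewhere in the thread):

```python
n = 10
var_xi = (1 - 0) ** 2 / 12    # Var of Uniform(0,1) = (b - a)^2 / 12 = 1/12
var_x = n * var_xi            # Var(X) = n * Var(Xi) = 10/12, by independence
bound = 0.5 * var_x / 2 ** 2  # (1/2) * Var(X) / 2^2 = 1/8 * (10 * 1/12)
print(bound)  # 5/48, about 0.104
```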
Sorry for the noobish question but what formula did you use to calculate variance? Is (b-a)^2 / 12 a standard formula for variance? Can I calculate variance through E(X) too?
Yes, for a uniform R.V. on an interval [a,b], the variance is known to be (b-a)^2 / 12. You can easily find the derivation via google. One method indeed does use E[X], as Var(X) = E[X^2] - (E[X])^2.
This way it's: E[X^2] = E[X] = 1/2; E[X]^2 = 1/4; E[X^2] - E[X]^2 = 1/2 - 1/4 = 1/4. Therefore the bound should be 5/16.
Never mind, we have a continuous RV, so E[X^2] is the integral of x^2 from 0 to 1, which is 1/3, not 1/2.
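To spell out the correction above: for a continuous Uniform(0,1), E[X^2] = 1/3, and both routes then give the same variance. A sketch:

```python
e_x = 1 / 2                       # E[X] for Uniform(0,1)
e_x2 = 1 / 3                      # E[X^2] = integral of x^2 dx over [0, 1]
var_moments = e_x2 - e_x ** 2     # Var(X) = E[X^2] - (E[X])^2
var_formula = (1 - 0) ** 2 / 12   # Var(X) = (b - a)^2 / 12
print(var_moments, var_formula)   # both equal 1/12, about 0.0833
```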
Dear Professor Xu, Is there a distribution for which this inequality becomes an equality?
Both the Markov and Chebyshev inequalities are best possible, meaning there are distributions that attain the bounds. But those are discrete distributions, which attain the bounds only at specific points.
I still don't get it. At 2:53, why is P(X - 5 >= 2) == P(-(X - 5) >= 2) == P(5 - X >= 2)?
The second equality is true without any assumption on X. But here X = U_1 + ... + U_10 with each U_i ~ Uniform(0,1), so X has mean 5 and the pdf of (X - 5) is symmetric about 0. Therefore the first equality holds as well.
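The symmetry claim can be sanity-checked by simulation; a Python sketch (tail counts will vary run to run):

```python
import random

trials = 200_000
upper = lower = 0
for _ in range(trials):
    x = sum(random.random() for _ in range(10))  # one draw of X
    if x - 5 >= 2:
        upper += 1
    if 5 - x >= 2:
        lower += 1

# the two tail frequencies should be close to each other (and to ~0.0134)
print(upper / trials, lower / trials)
```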
How can you get 5?
Xi is uniform [0,1] -> E[Xi] is 1/2 or 0.5
@@mehdihachimi9624 But it is not 0.5, it is 5. How do they just take away the decimal point? Why multiply by 10?
It's E(X), not E(Xi). E(Xi) = 0.5, and E(X) = E(sum of the Xi's), which are independent.
@@emilyhuang2759 So E(X) is like the cumulative density function, and E(Xi) is the probability density function?
You invoke a basic property of the expectation: linearity. That is, E(X1 + ... + Xn) = E(X1) + ... + E(Xn). In this case each Xi has an expectation of 1/2 because it's a Uniform(0,1) RV, and when you add 10 of them you get 10 * 1/2 = 5.
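Linearity can also be illustrated by simulation: the sample mean of many such sums should land close to 10 * 1/2 = 5. A sketch:

```python
import random

trials = 100_000
mean_of_sums = sum(
    sum(random.random() for _ in range(10))  # one draw of X = X1 + ... + X10
    for _ in range(trials)
) / trials
print(mean_of_sums)  # close to E[X] = 10 * 0.5 = 5
```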