What a wonderfully clear and precise presentation! It cleared up several doubts, especially the distinction between a probability mass function and a random variable.
Would be interested to see an example of a more complex problem that’s more easily solved through random variables than by probability mass function.
(Incidentally - is the converse ever true, i.e. is there a problem that’s more easily solved using probability mass function rather than the random variable? Assuming that the random variable is known.)
Some of my earlier Summer of Math Exposition videos show problems with tricky ways to manipulate the random variables to get the answer: The ABRACADABRA theorem th-cam.com/video/t8xqMxlZz9Y/w-d-xo.html and there is another about estimating Pi with spaghetti th-cam.com/video/e-RUyCs9B08/w-d-xo.html . For many "standard" problems, using the probability distribution is the most straightforward way to go, which is why this is the main way people are taught about random variables.
Student: What's a Dr. Pepper?
Me: Well for starters it is not a doctor and it's not a pepper.
In physics we have: "What is particle spin?"
"Well imagine a ball is spinning, except it's not a ball and it's not spinning"
I study maths but always avoided statistics; this video actually got me interested in learning more
I had a similar experience! I was scared away by early courses, and didn't get into it until my PhD. The problem is really that statistics is too useful so there are lots of sources that try to make it accessible and end up hiding the details!
@@MihaiNicaMath engineers ruin everything
Prob theory and stats overall are very cool areas! They can feel shady at first because things get "simplified" and it can seem like results just happen by magic. After intro to prob, this happens less and less, making these areas more and more fun!
It's a lot of function theory, which is great!
I was first introduced to statistics in high school and it was incredibly boring. In uni I really enjoyed probability theory, it’s a different beast. Maybe it’s just me but getting to see the underlying mathematical machinery is really cool. You learn things like the difference between impossible and probability 0, the fact that there are many different types of convergence with different implications, and of course the Central Limit Theorem.
@@adam_jri dude that's so awesome that you enjoyed prob theory in uni! It only ever gets better! If you had fun with prob and enjoy applications as well, stats can be quite a lot of fun! And handy!
This is masterful. I'm starting stochastics and always get confused about what domain the variables live in, etc. Best vid so far
A surprising amount of reading math is constantly asking yourself: "wait what type of thing is this?". If you can keep all the domains/ranges/functions straight you are halfway there!
This was the clearest explanation of random variables and their relation to probability distributions I’ve had in 35 years of being around probability and statistics. Very nice job. I hope to see more explanations like this, I’m subscribing!
Glad you liked it! Thanks so much for the kind words :)
This was such a lovely explanation for a topic that so often confuses students!
Thanks so much! It's amazing how sometimes the simple things get overlooked/underexplained in courses. It's easy to forget when teaching what it was like before you knew something!
I've been waiting for this video for a long time. I've long understood random variables at a surface level, and I've even used them to model things, but I've never had a good, solid understanding of what they are as mathematical objects. This video does what no statistics textbook or professor has done for me; namely, to give random variables a solid base for building intuition on. And treating them as deterministic functions of random inputs also helps me a lot.
The description of the distinction between probability distribution and random variable was similarly helpful. I've long understood most of the consequences of saying "X is a Normally distributed Random Variable": it means I can calculate probabilities of X taking values between certain ranges based on the definite integration of a well-understood function. But now I think I understand what it _means_; namely, that X is a function which assigns a value to events. Perhaps events are pencils and the function the weight of the pencil. Then, if I understand the terminology, the probability distribution (which is a normal distribution) lets me measure the portion of the events where pencils take weights between some values I select.
Wonderful video, and thank you for it!
Thanks so much for the kind words, I'm glad you liked the video! Another fun example I've heard: go to the library and look at the first book someone walks out with. The probability space is the set of all possible books, and you can make many random variables: how many pages is this book? How much does it weigh? What color is it?
If you're interested in the more robust formulation of random variables, you could read about probability using measure theory; you'll get a few interesting insights like the ones this video describes.
1:35 "You should think of them as 16 different atoms."
The funny thing is that it agrees with the technical definition of an atom: a singleton set with nonzero measure. In this case, for any x in our Omega, we have mu({x}) = 1/16.
It's not a coincidence! I'm well aware of what an atom means in probability :)
In Spanish we have two different words for that, and it sometimes helps to keep things clear.
When we say "azar" (azarous) we generally mean a sequence of events which don't appear to follow any pattern or rule (whether the absence of pattern is apparent or real), at least in the order of elements (and not necessarily a group pattern; that is, you can't predict the next event, but sometimes it's possible to build a distribution).
When we say "aleatorio" (random) we mean an azarous sequence where each event is equally probable at all times.
Like a type (azar) and a subtype (aleatorio). So basically it's all just mappings where you don't know if there's a rule behind that mapping or not.
The best explanation I’ve come across so far
Thank you. It's really the one problem that kept nagging at me when I was studying probability.
This is a great video! I love how your explanation also captures the measure theory underlying probability. I would love to hear how you conceptualise conditional expectations on measure spaces :D (I'm currently struggling with the intuition there 😆)
1:05 isn’t a probability space the triplet (omega, sigma, P) with sigma being the sigma field and P being the measure? I’ve always heard omega called the fundamental set.
Yes: in general what you said is correct. For discrete spaces with finitely many elements, you can just make sigma all the subsets so it's "trivial" in some sense, and Omega, P is all you need
20:17 So we can imagine P(Z=z) as shorthand notation for the probability of the level set, P({w : Z(w) = z}), right?
Yes exactly!
@@MihaiNicaMath awesome, thanks!
I am a professional musician in my 40s and I have no idea why I found this video so damn interesting that I ended up watching till the end 😅
@@hatebreeder999 This is one of my favourite genre of comments! "I am not normally into math BUT...". It's so great to show people how cool it can be!
Great! Super nice! Loved it!
Aren't the two ways of computing the expected value sort of the Lebesgue way and the Riemann way?
For continuous random variables, the Riemann way and the Lebesgue way are two ways to rigorously define it (expected values are integrals), but there are many different ways to calculate it (just like there are many different integration techniques)
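In symbols, the continuous analogue of the two computations looks like this (a sketch in standard measure-theoretic notation, not taken from the video): for a random variable Z on (Omega, P) with distribution mu_Z,

```latex
\mathbb{E}[Z] \;=\; \int_{\Omega} Z(\omega)\, d\mathbb{P}(\omega)
\;=\; \int_{\mathbb{R}} z\, d\mu_Z(z),
\qquad \text{where } \mu_Z(A) = \mathbb{P}(Z \in A).
```

The first integral sums over outcomes in the probability space; the second groups outcomes by the value Z takes. Their equality is the change-of-variables formula, exactly like the sum rearrangement in the discrete case.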
@MihaiNicaMath thank you Sir, I am thankful for this video as I'm using it to polish and refresh knowledge (right now I'm working with DL models). I'm happy youtube recommended me this video.
Glad it was helpful!
This ambient music with the statistic determinism of the dice provoked some uninvited existential dread
Haha sorry
Z: “it ain’t much, but it’s honest work”
Z is honestly one of the cooler letters in math: third dimension, complex domain, and random variable. x, y, i, j, k are all cool too, but nothing says cool af like $\int_0^x z^2/4 \, dz$ or f(z) = g(z)·h(z)
@@adissentingopinion848 domain expansion
“Is a function a variable? Not really.”
The lambda calculus: bet.
@MihaiNicaMath This video is an amazing explanation and I hope it hits a million views and gets liked and shared over and over. It has been very helpful in inspiring me to think about Quantum Mechanics in a different way and hopefully I will be able to repay the favour with an elegant theory that references this video in the near future. Keep up the good work
Thanks so much! Indeed, Quantum Mechanics has a lot going on so it's easy to get confused by what is/isn't random if you aren't careful
i love this video format!!!
Thank you so much! I'm curious what exactly you mean by "format". Like lecture style with person in corner? Or what exactly did you like about it? Thanks for the feedback!
Holy hell, you made me realise why some prob classes touch upon measure theory: a random variable is a measurable function, and its distribution is a measure
It's totally insane to me that it's possible to take a measure theory class and never mention the connection to probability. (My first measure theory class was like this! I thought I hated measure theory!!!!)
This was one of the first things in stats that needed clarification. A random variable is a FUNCTION. It's a function from the sample space to the set of real numbers. Its job is to assign "numerical values" to the elements of the sample space (like, say, 0 to a tail and 1 to a head in a single coin toss).
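As a minimal Python sketch of exactly that definition (the coin example is from the comment above; the code and names are my own illustration):

```python
import random

# Sample space for a single coin toss, with a probability for each outcome.
Omega = ["T", "H"]
P = {"T": 0.5, "H": 0.5}

# The random variable itself is a plain (deterministic) function Omega -> R.
def X(outcome):
    return 1 if outcome == "H" else 0

# Nothing random has happened so far; randomness only enters when we
# sample an outcome and feed it through X.
omega = random.choice(Omega)
print(omega, X(omega))
```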
very well done explainer, thank you
Thank you!
To solve the exercise you mentioned at 21:00, first note that the nonzero contribution to the sum is over the image of Z, which is discrete by assumption (otherwise you would need an integral). Then (in LaTeX) you have $\sum_{\omega \in \Omega} Z(\omega) \mathbb{P}(\omega) = \sum_{z \in \mathrm{Im}(Z)} \left( \sum_{\omega \in Z^{-1}(z)} Z(\omega) \mathbb{P}(\omega) \right) = \sum_{z \in \mathrm{Im}(Z)} z \left( \sum_{\omega \in Z^{-1}(z)} \mathbb{P}(\omega) \right)$, as $Z(\omega) = z$ is constant over the inner sum, and then, using the additivity axiom of a probability measure, we have $= \sum_{z \in \mathrm{Im}(Z)} z\, \mathbb{P}\left( \bigsqcup_{\omega \in Z^{-1}(z)} \{\omega\} \right) = \sum_{z \in \mathrm{Im}(Z)} z\, \mathbb{P}(Z = z)$.
Also, to show the equality mentioned at 25:19, proceed as follows. First, the $k = 0$ term of the sum is zero. Then we have $\sum_{k=1}^n k \binom{n}{k} \frac{1}{2^n} = \sum_{k=1}^n \frac{n!}{(k-1)!(n-k)!} \frac{1}{2^n}$ after unpacking the $\binom{n}{k}$ and cancelling the factor of $k$ in the numerator and denominator. Pull an $n$ out of the $n!$ to get $\sum_{k=1}^n n \frac{(n-1)!}{(k-1)!(n-k)!} \frac{1}{2^n} = \frac{n}{2^n} \sum_{k=1}^n \binom{n-1}{k-1}$, where terms which don't depend on $k$ have been pulled out as they are constant with respect to the summation index $k$. Finally, the binomial formula $\sum_{k=1}^n \binom{n-1}{k-1} = 2^{n-1}$ yields the desired result $n/2$. Some mention of these details would have been nice to see in the video.
Note: if anyone reading this comment is not familiar with LaTeX, a good place to start is Overleaf.com. You will need to import some math packages - copy and paste \usepackage{amsmath, amssymb, amsfonts} at the top of your document - and the above LaTeX code should compile.
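As a quick numerical sanity check of the grouping identity above, here is a minimal Python sketch (the two-dice example and all names are my own, not from the video), computing E[Z] both over the probability space and over the distribution:

```python
from itertools import product
from fractions import Fraction
from collections import defaultdict

# Probability space: all 16 outcomes of two fair 4-sided dice.
omega = list(product([1, 2, 3, 4], repeat=2))
P = {w: Fraction(1, 16) for w in omega}

Z = lambda w: w[0] + w[1]  # the random variable: sum of the two dice

# Way 1: sum over the probability space.
E1 = sum(Z(w) * P[w] for w in omega)

# Way 2: group outcomes into level sets, then sum over the distribution.
dist = defaultdict(Fraction)
for w in omega:
    dist[Z(w)] += P[w]          # P(Z = z) = P({w : Z(w) = z})
E2 = sum(z * p for z, p in dist.items())

assert E1 == E2 == Fraction(5)  # both ways give E[Z] = 5
```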
Yes, that's one of the most insane things about most statistical textbooks (they also confuse models with fitted models, samples and groups, etc.).
IMO, the only way one can really understand statistics is through programming, because you can see how these abstract concepts map to operations on real data (and there are a few textbooks that do this)
Taking statistical estimation of dynamical systems, this really helped!
Such a great explanation!
A "random variable" X conceptually means to extract partial)information or translate information of an event space that you are interessted in modelling to another hopefully more simple/useful event space (often IR). The informations distribution in the simplified event space is determined by the distribution on the original event space as well as X. That's also why measureability as a requirement makes sense. This seemlessly extends to Operators we call "expection": They coarsen the "resolution" of the random variable depending on how much information you assume to be known (i.e. your a priori knowledge about which events will happen), e.g. the usual expectation value we know is a constant function but with full information it would be the random variable X itself. The expectation (given sub information modelled by a sub sigma algebra) operator is optimal in the sense as it gives the best prognosis of how X behaves on the random event space as possible - it is an optimal approximation of X based on the a priori given knowledge of events and in fact its a measureable function too! Another useful thing is thinking about distributions as analogues to laws of motion in physics and this is even less strange while thinking about QM.
I will now watch the video and see how you explain things.
Oh that's pretty cool. I am working on a video about looking at discrete distributions like vectors and doing projections on them. I was confused about RVs and PDFs in that context, and this gives a good foundation to think about them
That sounds cool! Doing a projection must be equivalent to a kind of conditional expectation. Although I'm not sure of the details!
My god this is a great video - thank you Dr!
You're welcome! Glad it was helpful!
23:54 The size of the sample space should be n!, don't you think?
I think the sample space of equally likely events is sequences of length n of 0s and 1s ( i.e. {0,1}^n), which has 2^n elements
@@MihaiNicaMath But the set (say) A = {0,1}^n also contains, say, w ∈ A such that |w|
@@Erebius-mj7qh Does |w| mean the length of w or the number of 1s in w? The space we want is all possible outcomes for flipping n coins, so each w is length n (but the number of 1s/Hs can be different).
@@MihaiNicaMath Got it! Thanks for the reply, and |w| is length btw
Wow this was a neat explanation
Love this video!! Awesome stuff
We assign a probability to each point in the sample space Omega? How do we make sure that the sum over the sample space equals 1?
As long as they are all positive, you can always divide by the total!
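For example, a tiny sketch of that normalization trick (the weights here are made up for illustration):

```python
weights = {"HH": 3.0, "HT": 1.0, "TH": 1.0, "TT": 3.0}  # any positive weights
total = sum(weights.values())                           # divide by the total...
P = {w: v / total for w, v in weights.items()}
assert abs(sum(P.values()) - 1.0) < 1e-12               # ...so probabilities sum to 1
```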
A good intuition for the probability distribution solution at 23:15 is to think of a coin flip as a binary digit and therefore a sequence of n coin flips as a binary number with n digits. Clearly there are 2^n binary numbers of length n. Now you can think of heads as a 1 (or 0, it doesn't matter, it's just a label) and find how many ways there are to have x 1s in a binary number of length n. That is, how many ways are there to choose x digits of our binary number to be a 1, with the rest staying 0, i.e. nCx (n choose x). Since there are 2^n binary numbers of length n, the probability that a binary number of length n has x 1s in it is nCx / 2^n.
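A brute-force check of that counting argument, as a short Python sketch (my own illustration, not from the video):

```python
from itertools import product
from math import comb

n = 8
outcomes = list(product("HT", repeat=n))   # all 2**n equally likely sequences
assert len(outcomes) == 2 ** n

for x in range(n + 1):
    count = sum(1 for w in outcomes if w.count("H") == x)
    assert count == comb(n, x)             # nCx sequences with exactly x heads
    # so P(X = x) = comb(n, x) / 2**n
```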
Thank you for the title of the video. Finally someone spoke THE TRUTH.
my friends asked me: "What is Linear Algebra and what do you learn in there?"
Me: "Well to start, it's not exactly Linear and it's not exactly Algebra"
@@blacksnow7106 OK, I agree with the "algebra" part, but everything in the Linear Algebra I know *IS* actually linear.
Why isn't it algebra @@MihaiNicaMath
@@Patrick-x3g9i An "algebra" means a space where you can add/subtract/multiply elements. the core thing in linear algebra are vector spaces, where you can add/stretch/shrink but not multiply elements. So linear algebra should actually be called "linear stuff on vector spaces". (Note that the set of matrices IS an algebra...sometimes that's what people mean)
Interesting angle! Thanks for sharing. **SUBSCRIBED**
My cousin in Romania is a vampire I'm pretty sure
This is why he’s called (count) Dracula I suppose
@@xTheUnknownAnimator
a count of what? 4?
@@freedomgoddess 4 what? Cows? Dogs?
Student: What's a X Y?
Me: Well for starters it's not an X and it's not a Y.
Substitute X Y for: Random Variable, Big Bang, Halting Problem, Neural Network, Elliptic Curve, Fundamental (theorem of) Algebra.
Naming things is hard...
The way I think of it is... it's just a function that maps events to the real numbers.
The occurrence of those events can itself be random, and thus the output of the function as well. Hence the "random" in random variable.
A function on a sample space has an associated probability distribution. This is what leads to the designation "random" in the name. As giants of probability theory have said, "random function" would be a more proper name, but the term "random variable" stuck for its intuitive interpretation.
Great simplification!
Which software did you use for this animation sir?
Manim!
@@MihaiNicaMath And sir, which software did you use for the slides?
It seems like there must be some general relationship between how structure is imposed on random outcomes and the information we can extract from that imposition.
Could you suggest some good books for advanced probability theory with number theory connections??
I like the books "a first look at rigour probability" by Rosenthal and "probability with martingales" by Williams (featured in my other video: th-cam.com/video/t8xqMxlZz9Y/w-d-xo.html )
@@MihaiNicaMath Thanks a lot..
I love probability because it is a beautiful intersection of philosophy and math. I would disagree that Z isn't random. Take the function Z = X_1 + X_2 where X_1, X_2 represent the different dice rolls. X_1 and X_2 are in their own right functions. They are maps from (1,2,3,4) -> (1/4, 1/4, 1/4, 1/4) where the domain is the outcomes and the codomain is the probabilities of each outcome. We consider X_1, X_2 to be random because they're mappings from outcomes to probabilities.
If you agree with this last statement, then it's clear that Z is in itself also random. For our function Z = X_1 + X_2, you can cut out the middle man in a way and make Z a mapping from the set of outcomes of X_1, X_2 to probabilities. So the domain of Z would be the set {(1,1), (1,2), (1,3), (1,4), (2,1), etc.} and the codomain would be {1/16, 1/16, 1/16, etc.}. So if you accept X_1, X_2 to be random, then Z must be as well, because it is also fundamentally a mapping from outcomes to probabilities.
Love the video!
Thanks for the detailed comment and glad you liked the video! I agree that Z can be treated as random exactly like you said. Mathematically though, the point of the video is the function Z:Omega -> R is a non-random function, and it's the output Z(omega) that is random. So Z is the non-random machine, and the output (which we also call Z sometimes!) which is random.
@@MihaiNicaMath Ah I understand, I had it all wrong. Thank you for the response! I'm an undergraduate in mathematics and statistics (dual major) and I love learning more about probability and statistics. I see the point you make, because the image of Z isn't a set of probabilities, then by definition it is not random (even though it feels like it should be!). Thank you for the explanation, I feel stronger in the concepts now.
Maybe the true random variable is the friends we made along the way
Well done. Respect.
Great video!
Images and preimages of functions are important to understanding random variables.
I believe it was a bit too fast when you jumped to the E[X1]+E[X2] segment to show that they were equal. Good video, thanks
Thanks for the feedback! A lot of my videos use this somewhere so I might do a little compilation of why it's true and some uses
great video!
Thank you!
I think this depends on how one was taught probability theory. When I learned it, we defined a random variable to be a measurable map from our probability space to some arbitrary measure space. So for us, it is just a measurable function.
...this is precisely the point of the video: to explain in what way a random variable is actually just a function. What do you mean by "it depends"? Maybe you didn't watch the video and are just replying to the title...
@@MihaiNicaMath It was more about the way probability is taught in most places. Most people don't learn measure theory before they learn what a random variable is, when a short introduction to measure theory would be so much better before moving on to proper probability theory - that way, they can rigorously prove all the theorems
@@williamxion2806 I completely disagree. Probability is accessible, useful and full of super cool math. Delaying it until after measure theory would be such a shame.
this guy will look me straight in the eyes at 4:00 and tell me events are not random. Answer this question: which event will occur when I throw 2 fair 4-sided dice?
***Which*** event happens IS random, exactly as you said. The events themselves are not random (example: the collection of all numbers that sum to 5 is a set of numbers... everyone has the same set). That's the perspective/philosophical shift that is the point of the video!
A random variable is a function.
@@wherengoes Spoiler alert jeez
I understand that the interpretation of them is different, but could you say a probability distribution is a special case of a random variable?
I think it's misleading to say it's a special case. Two things I would say:
1. Every RV *has* a probability distribution. (Described in video)
2. Given a probability distribution, you can create an RV that has that distribution. (Set \Omega to be the set of values it takes with P(\omega) given, and then make Z(\omega)=\omega). This is sometimes called the "canonical" match between an RV and a distribution. I didn't mention this in the video!
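Point 2 is easy to make concrete. A minimal Python sketch of the canonical construction (the example distribution here is my own, not from the video):

```python
from fractions import Fraction
from collections import defaultdict

# Any probability distribution, given as value -> probability.
dist = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

# Canonical construction: Omega is the set of values, P is the given
# distribution, and the random variable is the identity Z(omega) = omega.
Omega = list(dist)
P = dist
Z = lambda omega: omega

# The distribution ("law") of Z is exactly the one we started from.
law_of_Z = defaultdict(Fraction)
for w in Omega:
    law_of_Z[Z(w)] += P[w]
assert law_of_Z == dist
```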
Love the video, but now you've got me thinking.
If I roll a die with 4 sides, but each of these 4 sides does not have a value quite yet. Each of the sides rolls a 4-sided die to determine its value. But wait, those 4 sides don't have a value quite yet...
Let Z be the average value of this dice roll. Is Z now an actually random variable? 🤔 Can we even calculate E(Z)? I think E(Z) = 4 times E(a normal 4-sided die), but I'm curious on your reasoning for how we can still call this Z non-random.
Each side has a value of E(4 sided die)
E(Z) = 4 * 1/4 * E(4 sided die)
4 sides * probability of a side * value of the side
To understand this simply, to determine the result of rolling the die, roll a 4 sided die to determine what is on the face. But that happens no matter what you roll on the original die
@@BryanLu0 You're right. Each die face would tend to the E(Z) so then the total value would tend towards E(Z). Not 4 times E(Z). Thanks for the correction!
Unrelated to your comment, because I was thinking about the problem again:
Here is a different example to illustrate the point.
At first have the normal 4-sided die, with the middle two numbers becoming negative.
. Each recursive new die does this to the faces currently on the die before it: Multiply by a random real number, from 1->infinity.
Random number=5
So 1,2,3,4 become 5,-10,-15,20
Repeat again. Let's say 10
50,-100,-150,200
etc.
Now when it does get back up to the original dice, the one we actually care about, what will the E(Z) be?
It will be 0.
Yet is Z an RV that isn't an RV? Or is it? Are these "RVs" _random_ or not?
Which I'm not sure is or isn't "random".
I think it still isn't really random. But it also kind of feels like any such dice are actually random. Because Z can end up being anything, even without infinite recursion. Though, then the definition of "choose a random number" gets called into question. How does one choose from 1->infinity?
Idk, but as a thought experiment, I'm curious what the counter-claim would be such that this "Z" is actually essentially deterministic.
@@elunedssong8909 E(Z) = 0 doesn't mean Z is deterministic / not an RV. E.g. E(normal distribution) = 0
Yes, that's an RV. Indeed, plugging an RV into the expectation operator E gives you a random variable - a constant one!
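A quick Monte Carlo sketch of the re-labelled die from earlier in this thread (the simulation setup is my own): labelling each face with an independent 4-sided-die roll and then rolling the re-labelled die gives the same expectation as a single plain roll.

```python
import random

def roll():                              # one fair 4-sided die
    return random.randint(1, 4)

def relabelled_roll():
    faces = [roll() for _ in range(4)]   # label each face with its own die roll
    return random.choice(faces)          # then roll the re-labelled die

n = 100_000
print(sum(roll() for _ in range(n)) / n)             # ~2.5
print(sum(relabelled_roll() for _ in range(n)) / n)  # also ~2.5
```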
I found these things so confusing, so I just tell myself they are measurable functions. Much nicer
...that's the point of the video?
Students replying to other students be like "Well for starters..."
To understand this, start with a concept or idea in your head. Concepts are not measurable, e.g. I am "happy". To measure it, you need to find a variable in the empirical world, like all scientists do, e.g. ask him to rate his level of happiness, H, on a scale of 1-5. It is a "random" variable, meaning that we do not know the number until he has rated it. It varies, unlike a constant, e.g. H = 3, which does not change. So a random variable is something that varies in the empirical world. Hope it helps.
THIS is how you explain complex things to beginners.
Thank you!
What happens with quantum mechanics? There the RVs are really random, I think
It's the same thing actually! In classical mechanics, randomness comes from your lack of knowledge of the exact state, so the probability space can be thought of as being over the set of classical states. In quantum mechanics, the state alone is not enough to determine the outcome, so the probability space is larger than just the state space. But the fact that the random variable is a function on the probability space is the same in both!
Why is it that (expected value of a sum) = sum (expected values).....?
@@victorzurkowski2388 it was proven in the video (for the finite and discrete case)
A couple things:
1. "Disjoint" and "independent" sound similar but actually mean completely different things. Don't mix them up!
2. The proof works whenever Z=X_1+X_2...you don't need any other conditions or independence of anything else. All you have to do is rearrange the sum in the definition of E(X). (For non-discrete distributions it's still true but proof is slightly more complicated!)
@@MihaiNicaMath it's also not true in the infinite (but still discrete) case, unless you come up with a canonical order for the summands (if it diverges, you have the Riemann rearrangement theorem to deal with, and yet these scenarios may appear when doing game theory thought experiments)
@@MagicGonads yes, in the infinite case you need an extra condition on the finiteness of the expected value!
@@MihaiNicaMath you are very correct
Proving E[X] = n/2 for a binomial random variable X is pretty simple to do; all it requires is the identity $\binom{n}{x} x = \binom{n-1}{x-1} n$. This gets rid of the $x$ inside the sum, that is, $\frac{1}{2^n} \sum_{x=0}^n x \binom{n}{x} = \frac{1}{2^n} \sum_{x=0}^n n \binom{n-1}{x-1} = \frac{n}{2^n} \sum_{x=0}^{n} \binom{n-1}{x-1} = \frac{n}{2^n} \cdot 2^{n-1} = n/2$.
In the second-to-last equality I used the binomial theorem: $\sum_{x=0}^{n} \binom{n-1}{x-1} = \sum_{j=0}^{n-1} \binom{n-1}{j} = (1+1)^{n-1} = 2^{n-1}$. Anywhere the binomial coefficient is negative or otherwise does not make sense we define it as 0, so we don't even have to reindex.
That's a neat trick! I haven't seen that one before. Normally I would do it with a generating function and take the derivative, which is a bit trickier
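For anyone who wants to double-check that identity numerically, a short sketch:

```python
from math import comb

for n in range(1, 12):
    # the identity x * C(n, x) == n * C(n-1, x-1) for x = 1..n
    assert all(x * comb(n, x) == n * comb(n - 1, x - 1) for x in range(1, n + 1))
    # ...and the resulting expected value of a Binomial(n, 1/2):
    assert sum(x * comb(n, x) for x in range(n + 1)) / 2**n == n / 2
```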
Good video, sir
well explained
It is a "chance function", "random variable" is a mistranslation from French.
How is it in the original french?
@@victorscarpes It's random variable; not sure what he meant.
@@NC-hu6xdmaybe a difference between the direct translation and the actual meaning?
@@nobody08088 The french is "variable aléatoire" which directly translates to random variable. I dont understand his comment nor the 20 upvotes
@@victorscarpes I wondered too; here is what ChatGPT had to say: "The terminology of "random variable" was introduced by the Russian mathematician Andrey Kolmogorov in the 1930s. His foundational work in probability theory, particularly his 1933 book Foundations of the Theory of Probability, formalized the modern axiomatic approach to probability."
Why does everyone keep saying "dice" as the singular lately? This isn't just you, it seems to be everywhere suddenly. What's going on? Did the singular "die" get replaced with the plural or something?
Guess it’s easier to say and also it’s close to death.
In the UK people (including old people) generally say dice, so I don't think it's a recent change. Although people do say "the die is cast", most people don't actually know what it's referring to xD
It's even worse than that, back in my day they would say "alea iacta est"! Kids these days....
@@tommyphillips1030 Oh interesting. I always thought "the die is cast" referred to the process of die casting, like pouring molten metal into a form until it sets.
It seems you're right, though, it means throwing a die. That doesn't seem very final to me, though. Just pick it up and throw it again?
@@severoon I believe it's referring to games of old where you would wager, for example, your cow or pig or firstborn child on the outcome of a cast of a die. Please correct me if I'm wrong.
Wohhhhhhaaaa😳👏🏼👏🏼👏🏼
What's so hard about saying "it's a randomly generated variate from a computable distribution" or "it's a number with a computable surprisal"?
Indeed, it's easy to say that, and that's pretty close to the "it's a random number" idea that is a good 1st level understanding of what an RV is. It's just that this is NOT the mathematical definition of an RV. The video has the details!
@@MihaiNicaMath Since I have had a long career designing RNGs and trying to bridge the gap between formalism and physical design for nondeterministic behaviour in RNGs, I think of the randomness as coming partly from ignorance of state (HILL entropy) and partly from quantum uncertainty in the generation of bits, so in that perspective the randomness appears at a well defined point. The distribution comes from analysis of the source and post-processing.
"Professor, what is Neoliberalism?"
"Well, for starters, it's not new and not liberal"
Use the mean-average, which if carefully maintained can approximate any random input. I'm triggered by "random is real"!
Not to forget parallel universes, which in reality are not parallel, and not universes either! As found out by the great scientist D. Adams.
It should be called a "meddling constant".
It's a function.
Interesting
Why the unsettling music 🫨
Saying that random variables are (technically) functions doesn't help, because every variable is a function, random or not... something has been left unsaid in this explanation.
Omg. ❤
What in math *IS* a variable? Seems to me that all names are immutably bound to some mathematical object.
Unbound variable: an arbitrary symbol/name in a mathematical expression. This expression can be evaluated with the variable bound to a specific mathematical object (eg. a Real Number).
For any unbound variable, it is usually implicitly understood that it can only be bound to mathematical objects of some specific type T, and only operations that make sense on objects of type T can be performed on the unbound variable.
Around 11:15, when he says you can "use it as a variable", I think he means you can use it as a variable of type Real Number.
Yes exactly! So like even though Z is a function (and it's incorrect to say it has a particular value), you can do any operation that you can do on real numbers to it.
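If a concrete illustration of "Z is a function, but you can still add it like a number" helps, here is a minimal Python sketch (the two-dice setup and all names are my own, purely for illustration):

from itertools import product
from fractions import Fraction

# Sample space: the 36 equally likely outcomes of rolling two dice.
omega = list(product(range(1, 7), range(1, 7)))
prob = {w: Fraction(1, 36) for w in omega}

# Random variables are just functions on the sample space.
X1 = lambda w: w[0]                # value of the first die
X2 = lambda w: w[1]                # value of the second die
Z = lambda w: X1(w) + X2(w)        # their sum, defined pointwise

# Expectation is a weighted sum over outcomes -- a plain number, not random.
def E(X):
    return sum(prob[w] * X(w) for w in omega)

print(E(X1), E(X2), E(Z))          # 7/2 7/2 7, so E(Z) = E(X1) + E(X2)

Note that Z itself never "changes"; only which outcome w you feed into it varies.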
It's a function
Yes indeed: that's the point of the video! Pedagogically, the situation is that students see functions in a Calculus-like context (e.g. f(x)=x^2) and need a nice explanation of how random variables also fit into that idea :)
I admire the effort. But I'm skeptical about what purpose this simplification serves. People who would follow this simplification are most likely people who can already think of random variables in the measure theoretic sense.
Do you just mean because the probability space has finitely many points (as opposed to being an arbitrary set/measure)? My feeling is it's easier and more helpful for undergraduate students to first understand the simpler case before moving on to the general case.
@@MihaiNicaMath Maybe it's just me. But I think the idea of a random variable as a function only feels natural once you can appreciate the requirement that the function be measurable. And to appreciate the requirement, we need to appreciate what a probability triple is. In my opinion, once you skip over the "technical details" of sigma algebras and probability measures, the idea of random variable as a function invites complication without the reward of greater sophistication and generalization.
With this video, those who would understand the measure theoretic definition may feel that something is missing. And those who would never understand the measure theoretic definition probably just got more confused. But I must admit I have grown rather cynical about most people's capacity for abstract thinking.
my hyperborean ancestors studied mathematics too
Very good video, but your use of the words x-axis and y-axis is incorrect imho. They're only an x-axis and a y-axis if the variables on them are x and y respectively. If you put events and probabilities on them, then you should call them the event axis and the probability axis. Or you can refer to their visual orientation and call them the horizontal axis and the vertical axis.
WHAT
What’s this? Probability theory for babies? Where are the sigma-algebras?
The sigma algebra is just all subsets when it's a discrete space. I'm surprised by the number of people who are worried about the sigma algebra and measurability when those aren't issues here!
@@MihaiNicaMath Isn't it still necessary to mention what a probability space is? Yes, the sigma-algebra is at most the set of all subsets of Omega. But I might be interested in a smaller set.
@@berryesseen The only reason to do that in the discrete case is if you have more than one sigma algebra on the go (e.g. if you are setting up a filtration). But you can always make the finest sigma algebra equal to all the subsets by choosing the atoms correctly.
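To make that "atoms" remark concrete (a small example of my own): for two coin flips take \Omega = \{HH, HT, TH, TT\}. The finest sigma-algebra is all 2^4 = 16 subsets, while the sub-sigma-algebra generated by the atoms \{HH, HT\} and \{TH, TT\}, namely \{\emptyset, \{HH, HT\}, \{TH, TT\}, \Omega\}, encodes knowing only the first flip - exactly the kind of coarser sigma-algebra that shows up in a filtration.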
The singular of “dice” is “die”.
Yet it's no less clear, since when spoken, "die" sounds the same as "die" (to perish) or "dye".
I agree: I decided years ago it's always clearer to say "dice"!
@@MihaiNicaMath Well, you know the old expression: “Never say die!”
@@malvoliosf Haha yes exactly!
OK, in terms of computer programming: it's not a constant, ergo it is a variable; and it varies in an apparently chaotic way, ergo it is random. I call it a random variable and I call this clickbait.
Much like x=x+1 means something different in programming and in math, so too does the word "variable". This video is about the math definition of random variable! (And of course the interpretation as "these numbers are random" is used a lot in computer programming!)
@@MihaiNicaMath I know, but I am not a mathematician - and you managed to nerd-snipe me into watching for a bit - until I realised this was not going to be about coding. Doh! Anyhoo, you got plenty of great feedback from people who came here for the math. 🙂
@@smudgerdave1141 Thanks for the feedback! If you like randomness in computer programming, you might like my earlier video about using a GPU to estimate Pi by simulating a bunch of noodles th-cam.com/video/po_pmPrO2YY/w-d-xo.html
a random variable is definitely a variable in how you would interpret it while inverting a probability distribution
Statisticians, please refrain from calling yourself mathematicians. Mathematicians did nothing to you to deserve that.
This is needlessly splitting hairs over terms that ultimately make no difference to our approach. Does Z change? Yes. What is a variable? Something that changes (randomly or otherwise). So Z is a variable. It doesn't matter whether we can also interpret it as something else. Is a dog an animal or a mammal? Exactly.
As for the random part, we could have simply said that it can be random, but it can also not be random.
End of video.
Well the actual point of the video is to give a mathematical definition of a random variable so we can do math and understand the last 100 years of literature on the subject :) I'm sorry it was so upsetting for you
"Does Z change? Yes."
No! You can have many different samples of Z, but Z itself is fixed. That was the point of the video.
The machinery is general enough that you can do probability without necessarily involving a 'time' aspect.
@Lolwut You have a low understanding and appreciation of the foundations of probability theory, based on your terse "explanation". I would not go with you if I needed help proving a theorem.
And you clearly didn’t watch the end of the video where he shows how one approach easily got the expectation of the binomial distribution while the other approach was more tedious. There are different approaches all based on solid mathematical theory. Rigor is vital
And a continuous random variable isn't always continuous.