"I'm your son's maths teacher, Mr. Ian, but you can call me Al. I sometimes pretend to threaten my students with impossible problems to see what they can achieve."
@@DynestiGTI In olden days, it was quite common to publish with titles like "on a problem in set theory" "on a problem on polynomials" etc. So yes, most probably.
You clearly know your stuff, and this is an interesting topic, but your content might benefit from longer videos where you go into more details about your explanations. I found that most of the things you were talking about went over my head, not because I don’t think I’d understand, but because it was glossed over too quickly.
Reminds me of the classic joke: An engineer is working at his desk in his office. His cigarette falls off the desk into the wastebasket, causing the papers within to burst into flames. The engineer looks around, sees a fire extinguisher, grabs it, puts out the flames, and goes back to work. A physicist is working at his desk in another office and the same thing happens. He looks at the fire, looks at the fire extinguisher, and thinks "Fire requires fuel plus oxygen plus heat. The fire extinguisher will remove both the oxygen and the heat in the wastebasket. Ergo, no fire." He grabs the extinguisher, puts out the flames, and goes back to work. A mathematician is working at his desk in another office and the same thing happens. He looks at the fire, looks at the fire extinguisher, and thinks for a minute, says "Ah! A solution exists!" and goes back to work.
Just found this channel. Good stuff! Most of this went over my head but you present math in a very approachable manner! Can't wait to see more videos of this sort of stuff.
Your presentation skills are on point. The pace at which you tackle this is anything but. Most of the video flew way over my head, it'll take like 5 rewatches and a lot of pausing for me to actually take in the math. I reckon some people got a lot out of it though, but I only really got a very broad overview.
@@onecommunistboiListening to a lecture at the speed it was spoken is like watching paint dry, unless I'm actively engaged in a dialogue, then I need them to go slow so I can formulate my responses. My default is 2x to maximize efficiency but I scale it down as needed for full comprehension. Rarely go below 1.5 unless they have an accent or it's really dense or complicated. A lot of my friends do the same so I'm not sure what's "normal".
@@megamaser My roommate does like 3-4x speed, and then pauses for meaningful amounts of time in between to think, while I tend to 1x speed with liberal amounts of skipping parts (or just don't watch the lecture at all and skim the notes). Different strokes for different folks
@@megamaser I get that, and oftentimes I do that too. Still I think it's weird to say a lecture is too fast because you have to watch at the intended speed:D Wouldn't it make more sense to say all the other lectures are slow?
I was staring at the equation at 2:47 and couldn't see it... until I realized that if you draw a graph with k vertices, then you can just have a binary sequence like (0,1,0,0,0,1,0,1,...,1) with a term for every place an edge could possibly go, with 1 indicating the presence of an edge and 0 indicating the absence of an edge, and there are clearly 2^m such sequences, one of which represents a clique, so the chance of a clique is 1/(2^m) where m is the number of edges in a connected graph with k vertices.
Where did you first come into contact with this kind of method of proof? I remember that my first exposure to similar ideas to this was in a grad course on graph theory, but I assume there are other avenues of entry as well? In either case, it is a truly surprising way to prove mathematical statements, and I think the video did a good job of explaining how absurdly powerful the technique (that feels like it really shouldn't work) actually is. Good luck in SoME2, I think this video deserves to do well.
Yeah, also first encountered this in graph theory. Specifically, the probabilistic proof that every graph G with m edges has a bipartite subgraph of at least m/2 edges. For anyone who hasn't seen it: For each vertex in G, randomly assign it a colour (red or blue) with equal probability. Now, for each edge, there are three possibilities: 1) the edge joins two red vertices (prob = 1/4) 2) it joins two blue vertices (prob = 1/4) 3) it joins a red and a blue vertex (prob = 1/2). Define a subgraph H with vertex set equal to that of G, and edge set consisting of all edges that join one red and one blue vertex. The random colouring of G is a 2-colouring of H. Since each edge is in H with (independent) probability 1/2, the expected number of edges in H is m/2. Thus there must be at least one colouring of G that results in a subgraph of at least this size.
My first introduction to this was primality tests. In my CS data structures course, a few of the assignments were about implementing bignums (for storing and operating on huge integers, like > 2^64), and one of them was about checking whether the number was prime. Brute-force tests using traditional methods are not possible at that scale, but there are probabilistic tests like Miller-Rabin that determine either that a number is definitely composite or that it may be prime. It has another integer input that is a random variable. For prime numbers, all random seeds will give you the same "may be prime" outcome. For composite numbers, there are some fraction of the random seeds that give you the "definitely composite" outcome - they are "witnesses" to the compositeness of the number. You repeat the test with different random inputs until either you find a witness, so it is composite, or you reach a certain number of iterations where the probability of it being composite despite not finding any witnesses is so astronomically low that you can assume that it is prime.
The covering points with disjoint unit disks problem has been my favorite for the past few months. I hope to see progress made on it, but I'm not super optimistic given that the last progress made was in 2012.
What if there are dots covering ~ 90.7% of the paper? Like somany dots packed together, but still covers less area compared to coins? I might be wrong but i don't expect coins to cover all dots in that case cos coin areas aren't spread like dots, cos coin areas are like small concentrated chunks
Yo, agree, but my point is the coins won't be able to cover all the points. Well if there are also infinite tiny coins, they might cover actually. Nvm I'm drunk 🥴
0:58 I don't understand how a packing efficiency of 90.7% means that on a random shift the E[# covered] is 9.07 for all possible configurations of points. Why isn't it that for some configurations of points E[# covered] is < 9.07 and > 9.07 for some others? Interesting video, btw :)
This uses a property called “linearity of expectation.” If we let X_i be 1 when point i is covered and 0 otherwise, then E[X_i] = .907. X = X_1 + … + X_10 is the total number of points covered and by this property, E[X] = E[X_1] + … + E[X_10].
@@ASackVideo But doesnt the expected value assume the dots are randomly arranged? Why would we not assume there might exist an intentionally arranged configuration that will always fall outside of this packing grid arrangement? Even if the expected value of a randomly arranged 10 dots would have slightly over 10 dots covered, that doesnt mean 10 intentionally placed dots can fully be covered. There may exist specific situations where the expected value should really be 9 for some specific arrangements of ten dots.
@@eragon78 ....no. What it assumes is the random position of the coins. Try to look from the perspective of each and every single dot instead as say the hexagonal coin packing configuration is shifted across every possible position. Of those possible positions, exactly 90.69% of them will cover the dot and the rest doesn't. This is true for every single dot, making the average coverage be necessarily 9.069 dots across all packing position. The rest of the argument is the same as in the video. P.S.: I wonder if this will make it more clear... I wish I could visualize it but for now I will just try to explain with words. Keep track of the center of one coin across the space of all possible possibilities for the packing patterns; which is in the shape of -an equilateral triangle- _a parallelogram made of two equilateral triangles_ whose side is equal to the diameter of the coin. Now look at one dot, and color the paralelogram with red and white such that if the center of the coin is in red it means the dot is covered by the hexagonal packing otherwise not. You will find that 90.69% of the paralelogram's area is going to be colored red. Do the same thing for all dots -- the result will be the exact same; it's just that the position of the white area will be different. In any cases, the total sum of the red areas from all ten paralelograms will be 9.069 paralelograms. Now stack all of the paralelograms above one another and try to look at every single point of the paralelogram stack, and evaluate for any single point how many layers are colored red (imagine suppose we are 'compacting' the red areas down). Since the total sum of the areas is more than 9 paralelograms, it is necessary that there are some points in the paralelogram stack where all ten layers are red. P.S. 2 (5 days after original comment): I made a desmos interactible for fun: www.desmos.com/calculator/l3zhktbwez I originally made this for fun although I decided to share this to disprove someone being confidently incorrect before, although fotrunately the reply in question has now been deleted lol
@@Anonymous-jo2no no, i think I see it now. I did some more thinking into it and I think I understand now. I was getting confused because I was thinking of another similar problem this kind of reminded me of. There are some key differences though which is why it doenst apply. But I was thinking of that one problem where you have 100 boxes numbered 1 through 100 and also 100 slips of paper 1 through 100 and randomly place them into each box. Then you get 100 people each 1 through 100 and they have to go into the room one at a time and open 50 boxes. If ANY SINGLE person of the 100 fails to find their number, the entire group fails and is executed. However, there exist a strategy for this game where you can couple the success/fail together and instead of each person succeeding or failing independently of each other, they all succeed or fail at the same time. If you havent heard of this problem, the solution is to have everyone follow a loop starting with the box of their number and going to the next box that the paper inside each box they open says. So if they are number 13, they start on box 13, if box 13 has a paper with 84 on it, they open box 84, etc. This means the ENTIRE group will succeed if there are no loops of 51 or greater, or the MAJORITY will fail if there is a loop of 51 or greater. This couples all their win/losses together. Each individual person still has a 50/50 chance of being victorious or not, but that win chance is no longer independent of the other contestants. This method gets you around a 1/3 chance of success iirc. Of course I do see now some of the key differences. Each person's individual chance of success is still 50%. It doesnt change by coupling them to the success chance of other people. So for the dots, since the chance of being covered is greater than 90%, this means that even if they are coupled, they MUST retain that 90.7% chance of being covered on a random shift of the coin grid. The only way this is possible is if there exist at LEAST one situation in which all 10 coins are covered. So I do see the solution now. I was just confusing myself at first by drawing analogies to another problem it reminded me of, but thinking deeper about it I see my mistake now. Thanks for the explanation though, helped me see my mistake.
Excellent! Finally a short and fast math video! (Like a one-page proof... You have to reread it twice, but at least it doesn't take two hours. And there aren't those annoying musical interludes!)
This video is really cool and I defently enjoyed it but I think you really need to slow down a bit. The explanations were well done but felt a bit rushed out so I had to pause every few seconds which kinda destroyed the flow of the video :/
My way of explaining this (or a similar) concept was a little different (and perhaps useless, but it did get people to understand) Imagine a man with a infinite life. His life's mission is to win the lottery, so he plays every day. Every time he plays, he has a tiny chance of winning it, right? Let's say it is k. We know that 0 < k < 1. The probability of him not winning the lottery in a given day is 1-k. Therefore, the chance of him not winning it in n days is (1-k)^n. We know that (1-k)^n → 0 as n → ∞. In other words, it is impossible for him not to win it in an infinite amount of days. Therefore, it is proven that, if the man has a probability k of winning the lottery in a given day, he will eventually win it as long as he plays enough times. The same logic can be applied to different scenarios. Sometimes I use it to defend the possibility of the existence of alien life: The universe is absolutely gigantic, besides being quite old. There are (and have been) so many planets around that you could say the number is close to infinite (I know saying this is an atrocity, but you get the idea). If you consider that each planet has a probability k of generating life, the previous logic applies here and we conclude that alien life must exist. Obviously the number of planets isn't truly infinite, but still I think this is at least a very interesting argument.
Ah yes, the quintessential fractal, where the title describes the video and vice versa Just finished watching, this was absolutely great! Super engaging, super informative, super intriguing
The really wild thing is that there are constructive proofs of certain theorems in probabilistic combinatorics (see: constructive lovász local lemma). You can then follow the proofs of these constructive probabilistic theorems to generate examples that witness the existence you are proving! This has been done recently to prove sharp bounds on certain counting problems.
This is kind of the basis of Murphy's law as well - everyone knows this as "if something can go wrong, it will" which most people think means is something about people having bad luck, but really it's a precept of engineering that assumes this same kind of probability problem, if you build something, and there's a 1% chance that what you've built may break, and you build more than 100 of them, then it's inevitable that there WILL be users of your product that gets something broken and they WILL be calling you to fix it. This is very important in engineering a nuclear reactor and why we they're not designed the way Chernobyl was designed. It HAS to be assumed that at some point, some day, something WILL go wrong, therefore the default state of the control rods must be CLOSED, so that if power fails, or there is a hurricane, or whatever thing might go wrong, if it gets shut off, the control rods will no longer be held open, and will fall to a closed position, and stop the reaction, instead of the control rods actively needing to be closed which could go wrong and cause a meltdown like what happened in Chernobyl.
For some reason, I assumed this was a big video made by a decent size channel with 10K subscribers at a minimum. This is good content for such a relatively low number. May the algorithm bless this video.
As you've said, this method pops up a lot in Ramsey Theory and of course was used immensely by Paul Erdos. In particular, my favorite application is one I sadly don't remember the details of...but this was showing that there exists some graph with 2 different properties P1 and P2. You do so by showing that the probability of a graph having P1 is > .5 and the probability of it having P2 is also > .5, which means that there must be some graph that has both properties. I'd love if someone reminded me what the specifics were! Something like "there exists a graph with chromatic number k and largest independent set size m..." but maybe something less related... And I'm almost positive I found this in van Lint and Wilson's "A course in Combinatorics" Last note - Ramsey graphs are incredibly interesting and cleverly built! Easily one of my favorite things from Ramsey Theory.
You might be thinking of the proof that there are graphs with arbitrarily large girth and chromatic number? (You’re right, that is a really fantastic application of the technique!)
Thanks for the video. Another book on the topic that might be useful for anyone interested in computer science is "Probability and computing". I'd especially recommend this book to anyone who is curious about theoretical aspects of machine learning - the new version has a chapter on the famous Vapnik-Chervonenkis dimension.
Just to clarify the argument in a way I can follow. For some configuration of 10 points there is a probability Pi that exactly i points are covered by disks. The expected number of points being covered can then also be calculated by Sum(i*Pi)=P1+2*P2+3*P3+...+10*P10. If for any configuration P10 were to be zero, then that sum can at most be 9 (when P9=1). Since another method of calculating the expected number of points being covered resulted in a value higher than 9, you know that P10 cannot be zero. If you were dealing with a configuration of 11 points though, you cannot use this method that easily. Because 11*90.7% = 9.977 < 10, so it could be that that P11 is zero if P10>0.977. But if you can prove that P10 must always be smaller than that, then I guess you can prove that 11 points must always be possible to cover as well.
In my opinion the probabilistic method becomes more intuitive when you look at the converse version. If the probability is less than 1, then your set can not cover all cases. It is all about finding a sense of volume by which your set of interest has strictly less volume than the whole space, implying it can not cover the whole space and hence there exist points not in your set
@@KGafterdark Well I am only saying that if the volume is strictly smaller then you will not cover all cases, you raised an issue with the reverse implication, if someone were to claim it.
This is so incredibly silly, I love it. It makes sense but feels wrong, like a cheap party trick. Lovely video. The actual bit at the end is honestly super cool.
I’m going to have to revisit this video at a later date. It’s 2am and my head hurts, but it sounds like there’s a possible application to the happy endings theorem.
@14:13 " . . and solve them in the realm of probability . . " You mean assign the guess a probability of being correct. To solve them, you have to actually determine the correct value, not just the chances of your best guess being correct.
Ramsey Numbers we’re so unlearnable for me when I first took a graph theory class, I ended up dropping the class, but this video gave me so much more understanding, and I’m so glad because I’m retaking graph theory this upcoming semester!
I wonder if a similar principle can be used to philosophically deduce something like circular time. Like: Anything that can happen (chance >0) eventually will happen, given infinite time. Anything that has happened _can_ happen (is possible by definition). Therefore, everything that has happened will eventually happen again. Including complex things like "a Big Bang that unfolds exactly like this one" or "you and me being born and living these same lives the same way".
I'm going to argue that your first proposition is built on an incorrect proposition, namely that the probability of an event occurring does not change over time. I'm going to propose that it does, and that the probability for events tends downwards over large timescales.
@@dojelnotmyrealname4018 It is not built on the proposition that probabilities of events remain constant. It only requires that the probability for events will not go to zero and stay there forever. Your alternative -- that the probability for events tends downwards over large timescales -- seems reasonable enough if you only try to model eternity _forward_ The probability of things happening would stabilize at zero, (because that is the smallest number in a context like this where negative numbers aren't applicable) and the model will predict an eternity of nothing happening. But what happens when you look in the opposite direction? Probability of events should trend upwards the further back you go, eventually approaching some eternal singularity of maximum happenings. Since there is no biggest number, and since we know the past wasn't eternal anyway (on account of it being the present now), it appears to me that the model breaks down here.
Unless I am overlooking something, the proof at 1:05 is wrong. Yes, every point has 90.7% probability to be covered in a randomly shifted tiling. But this does not imply that average number of covered points is 9.07 because in general you do NOT get Binomial distribution as as sum Bernoulli distributed random variables which are NOT statistically independent.
We use a property called linearity of expectation. Here's a response I gave to another commenter: You're right that the probability of being covered is dependent. However, the property we use for that calculation is called "linearity of expectation", which works regardless of whether or not events are dependent. So in fact, regardless of the arrangement of the points, the expectation remains 9.07. Here are some more details copy/pasted from a previous comment of mine: If we let X_i be 1 when point i is covered and 0 otherwise, then E[X_i] = .907. X = X_1 + … + X_10 is the total number of points covered and by this property, E[X] = E[X_1] + … + E[X_10].
@@ASackVideo Sorry. You are right. I think I have mistaken it with multiplication. No I understand the proof. It's like pigeon hole principle but continuous. Very cool.
i remember that someone told me "if you can imagine it, then it does exist because you can't imagine something that doesn't exist. if you can't find it here, then it exist in another parallel universe" sounds dumb at first then i just go "OK it's cool" while imagining I'm riding a unicorn
This is fine if you use it as a motivational poster to tell yourself that you can do anything you set your mind to. But from a scientific standpoint, think about it logically for a minute: Why would you not be able to imagine things that don't exist? That's literally what imagination is all about.
It doesn't make sense to me why R(5) is difficult, so I'm gonna throw down some values. For a group of n people, there are `choose n 2 = n * (n - 1) / 2` edges (quadratic), and thus `2^(n * (n - 1) / 2)` possible graphs (exponential of quadratic. Ok starting to get it now.), and for each quintuplet of edges, we have to verify the are not all the same. So it's gonna be `choose n 5 * 2^(choose n 2)` of those checks for every n (well, the final n. All the others can stop early, when they find a configuration that avoids [anti-]cliques.) That's O(2^(n^2)).
I figured from the title that this would be a modal logic proof, but I'm delightfully surprised to see another cool method to prove existence from a probability of existence! For those who are curious, there are philosophical arguments that derive existence from a probability of existence using concepts of modal logic. One famous example is the modal ontological proof for the existence of God. Before presenting the argument, you should know that a _necessary being_ is defined as a being who must exist and cannot fail to exist.

With this in mind, the modal ontological argument runs something like this: There is at least probably a chance that God exists. If God probably exists, then there is some possible world in which God does exist. God by definition is a necessary being, which means He cannot fail to exist. Establishing that a necessary being probably exists is the same as saying that a necessary being exists in at least one possible world. And if a necessary being exists in at least one possible world, then it must exist in all possible worlds, including the actual world. In other words, in some possible world there exists a being who cannot fail to exist, and if a being cannot fail to exist in even just one possible world, then it cannot fail to exist in any possible world, including the actual world. Therefore, God exists.

Of course your intuition should lead you to suspect that this argument is pretty dang flawed, and it is indeed, for a number of reasons, but nevertheless it is a very interesting argument since it ostensibly establishes existence from probable existence. One quick way to see the flaw is to begin with the mirror premise, namely that it is probably the case that God does *not* exist, and follow the same chain of reasoning to establish that God does not exist.
I'm glad that you covered this in the way you did. My first thought listening to this was that some religious fundamentalists will happen upon this video and start using this argument ad nauseam. If we run with this argument as you've described it, then every god/goddess/other being exists everywhere all at the same time, along with every being ever mentioned in folklore, and the world would be a very different place, to say the least.
@@mechanomics2649 Thanks for the compliment! There's a lot of places where this argument fails. Another faulty premise in this argument is right off the bat: "there is probably a chance that God exists". I think there is an equivocation fallacy going on here, because theists who are presenting this argument will get an atheist to affirm that premise by accepting that there is some probable chance that God exists, and there seems nothing wrong with granting that. But the problem is that there is an ambiguity in the word "probability", and it can mean two different things.

The physicist Sean Carroll wrote an excellent article in Quanta Magazine entitled _Where Quantum Probability Comes From_ where he expounds two different versions of probability. The first is the *objective* meaning of probability, which treats probability as a fundamental feature of a system. One example of this is frequentist probability, which defines probability as the frequency with which things happen over many trials. If I flip a coin and I say "there is a 50% chance it lands on heads and a 50% chance it lands on tails", what I really mean to say is that if I flipped a coin infinite times, half of those trials will be heads and the other half will be tails.

The other meaning is a *subjective* version of probability. When I say something like "there is probably a chance that candidate X will be president", I don't mean to say that given infinite trials candidate X will be president this many times. Rather, it is reflective of an epistemic shortcoming. It reflects my lack of knowledge or incomplete information about what is true or what will happen, and so I assign a probability to my degree of certainty of what is true or what will happen. An example of a subjective version of probability is Bayesian probability.

So when theists are presenting this argument to an atheist and they ask "do you think there is a chance that God exists?", many atheists will answer according to a *subjective* version of probability. They do not have complete information about everything, and so they assign a degree of certainty to the proposition that God exists. Since this is the case, an atheist will affirm the first premise of the argument, but this is where the bait and switch comes into play, because unbeknownst to the atheist the theist will run the argument as if the atheist affirmed an *objective* version of probability. The idea that God probably exists in some possible world is not a subjective probability assessment; it does not measure a degree of belief; rather, the probability is an actual feature of the many possible worlds. When the first premise is understood in this way, it is much more controversial, and the entire argument collapses straight from the get-go.
I don’t know if the first proof is solid without more analysis of the hexagonal pattern. Consider a disk of radius 5. Let’s say we can put two dots anywhere in it and we have one coin of radius 4 to cover both. Now with your technique the packing efficiency would be 16/25 = 0.64, and E[# covered] = 2(.64) = 1.28 > 1, which would be enough to conclude that there must be a position of the coin where both points are covered. However, we can easily see a counterexample where one coin cannot cover both, by considering any two points that are more than a distance of 8 apart, as that is the diameter of the coin.
Not all shifts of the coin in your example lie entirely in the radius 5 disk, so we don't cover all points with probability .64. The important detail for why this isn't a problem in the video is that ANY hexagon of that size shifted any amount will have 90.7% of it covered. (Or equivalently, any shift of the pattern covers 90.7% of the hexagon.)
This idea reminds me a little bit of what physicist Murray Gell-Mann said: "Anything not prohibited is compulsory"....as well as Murphy's Law: "Anything that can go wrong, will go wrong" which most people assume is a pessimistic viewpoint, but its original intent was a recognition of the infallible reliability of causality.
1:00 - "The expected number of points we cover is 9.07" --> Then some way has to cover SOME ARRANGEMENT of 10 points. It doesn't imply that FOR ANY arrangement of 10 points, some way to cover them all exists...
"Did you do your math homework?"
"Well, there's a nonzero chance that I did."
"Fair enough. A+."
"I'm your son's maths teacher, Mr. Ian, but you can call me Al. I sometimes pretend to threaten my students with impossible problems to see what they can achieve."
Doing one's homework isn't the same as CORRECTLY doing one's homework. So the median: C.
@@ericdew2021 well there is a probability that i did my homework correctly. That means i did.
@@M1551NGN0 And there is a probability that you incorrectly did your homework, that means you did. A basic symmetry argument.
@@mathlitmusic3687 there is a probability that there is a probability of me doing probability
virgin mathematician: "Prove its existence!"
gigachad mathematician: "It can exist, therefore it does."
I think it's more like saying "The probability measure function must have measured some event where it exists, therefore it exists."
Modal realists: This, but unironically
yea, if i exist, i am
Or rather, it says "it can't not exist (else its P=0), so it does exist."
@@mathlitmusic3687 but how? a fair coin toss has a non zero P, but still can land tails, right?
Ramsey is such a cool figure. Published so little but everything he published was super impactful.
Were all his titles super vague?
Not only can he cook, he also does sick maths
@@DynestiGTI In olden days, it was quite common to publish with titles like "on a problem in set theory" "on a problem on polynomials" etc. So yes, most probably.
He should have done more meth like Erdos.
You clearly know your stuff, and this is an interesting topic, but your content might benefit from longer videos where you go into more details about your explanations. I found that most of the things you were talking about went over my head, not because I don’t think I’d understand, but because it was glossed over too quickly.
Yeah, that
yup either make it more simple and easier to understand, or go longer with more details so we have some more time to think about it.
Or because I'm wasted
You wouldn’t understand
Totally agree. Great content/topics, pacing needs work.
This is such a well made video! One of my favourite some2 submissions - you had me hooked the entire time!
Glad you enjoyed it!
@@ASackVideo Reads title: "Probably a billion USD in my bank account." Checks account: disappointment. Clickbait title
so well made that I can't understand what the fuck he's talking about
@@ASackVideo Since TRICKS ARE DIRTY AND SHOULD NOT COUNT, how can you solve it without the damn trick??
@@ASackVideo Thanks for sharing. I have no idea what SoME2 is... are you a mathematician?
My dad probably exists
w
He probably went to get milk.
Reminds me of the classic joke:
An engineer is working at his desk in his office. His cigarette falls off the desk into the wastebasket, causing the papers within to burst into flames. The engineer looks around, sees a fire extinguisher, grabs it, puts out the flames, and goes back to work.
A physicist is working at his desk in another office and the same thing happens. He looks at the fire, looks at the fire extinguisher, and thinks "Fire requires fuel plus oxygen plus heat. The fire extinguisher will remove both the oxygen and the heat in the wastebasket. Ergo, no fire." He grabs the extinguisher, puts out the flames, and goes back to work.
A mathematician is working at his desk in another office and the same thing happens. He looks at the fire, looks at the fire extinguisher, and thinks for a minute, says "Ah! A solution exists!" and goes back to work.
Just found this channel. Good stuff! Most of this went over my head but you present math in a very approachable manner! Can't wait to see more videos of this sort of stuff.
Your presentation skills are on point. The pace at which you tackle this is anything but. Most of the video flew way over my head, it'll take like 5 rewatches and a lot of pausing for me to actually take in the math. I reckon some people got a lot out of it though, but I only really got a very broad overview.
Yeah I had to play in 1x speed and rewind a lot of times to justify the details of his examples. He glossed over a lot.
@@megamaser You had to play in 1x speed? Isn't that like.. the normal speed?
@@onecommunistboiListening to a lecture at the speed it was spoken is like watching paint dry, unless I'm actively engaged in a dialogue, then I need them to go slow so I can formulate my responses. My default is 2x to maximize efficiency but I scale it down as needed for full comprehension. Rarely go below 1.5 unless they have an accent or it's really dense or complicated. A lot of my friends do the same so I'm not sure what's "normal".
@@megamaser My roommate does like 3-4x speed, and then pauses for meaningful amounts of time in between to think, while I tend to 1x speed with liberal amounts of skipping parts (or just don't watch the lecture at all and skim the notes). Different strokes for different folks
@@megamaser I get that, and oftentimes I do that too.
Still I think it's weird to say a lecture is too fast because you have to watch at the intended speed:D
Wouldn't it make more sense to say all the other lectures are slow?
Nature: Whatever is not expressly forbidden is required.
I was staring at the equation at 2:47 and couldn't see it... until I realized that if you draw a graph with k vertices, then you can just have a binary sequence like (0,1,0,0,0,1,0,1,...,1) with a term for every place an edge could possibly go, with 1 indicating the presence of an edge and 0 indicating the absence of an edge, and there are clearly 2^m such sequences, one of which represents a clique, so the chance of a clique is 1/(2^m) where m is the number of edges in a complete graph with k vertices.
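For anyone who wants to see where that count leads: here is a minimal Python sketch (my own illustration, not from the video) of the union bound this sets up. If C(n,k) * 2^(1 - C(k,2)) < 1, then a random 2-colouring avoids monochromatic k-sets with positive probability, so R(k,k) > n.

```python
from math import comb

def bound_holds(n: int, k: int) -> bool:
    # P(a fixed k-set is monochromatic) = 2^(1 - C(k,2));
    # union bound over all C(n,k) vertex subsets.
    return comb(n, k) * 2 ** (1 - comb(k, 2)) < 1

# Largest n this certifies for each k, i.e. a lower bound R(k,k) > n:
for k in range(4, 9):
    n = k
    while bound_holds(n + 1, k):
        n += 1
    print(f"R({k},{k}) > {n}")
```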
Thank you for making this short enough to fit my attention span! Fascinating stuff.
The hardest problem for mathematicians, social interactions. i'm dead 💀
0:49 i think you just blew my mind
An absolutely brilliant video on an intriguing topic with crystal clear explanations. Looking forward to seeing more of your content in the future!
Where did you first come into contact with this kind of method of proof? I remember that my first exposure to similar ideas to this was in a grad course on graph theory, but I assume there are other avenues of entry as well? In either case, it is a truly surprising way to prove mathematical statements, and I think the video did a good job of explaining how absurdly powerful the technique (that feels like it really shouldn't work) actually is. Good luck in SoME2, I think this video deserves to do well.
Thank you! I learned this in a grad course on the probabilistic method.
for me it was a CS undergrad course in probabilistic algorithms
Yeah, also first encountered this in graph theory. Specifically, the probabilistic proof that every graph G with m edges has a bipartite subgraph of at least m/2 edges.
For anyone who hasn't seen it:
For each vertex in G, randomly assign it a colour (red or blue) with equal probability. Now, for each edge, there are three possibilities:
1) the edge joins two red vertices (prob = 1/4)
2) it joins two blue vertices (prob = 1/4)
3) it joins a red and a blue vertex (prob = 1/2).
Define a subgraph H with vertex set equal to that of G, and edge set consisting of all edges that join one red and one blue vertex. The random colouring of G is a 2-colouring of H. Since each edge is in H with (independent) probability 1/2, the expected number of edges in H is m/2. Thus there must be at least one colouring of G that results in a subgraph of at least this size.
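Here is that proof as a quick simulation, a minimal Python sketch (the 5-cycle example and trial count are my own arbitrary choices): the average cut size hovers around m/2, so at least one colouring must reach m/2.

```python
import random

def random_cut_sizes(edges, trials=10_000):
    # Colour each vertex red/blue uniformly at random and count the
    # edges joining differently coloured vertices (the edges of H).
    vertices = {v for e in edges for v in e}
    total = best = 0
    for _ in range(trials):
        colour = {v: random.choice("RB") for v in vertices}
        crossing = sum(colour[u] != colour[v] for u, v in edges)
        total += crossing
        best = max(best, crossing)
    return total / trials, best

# A 5-cycle has m = 5 edges, so some colouring gives >= 2.5, i.e. >= 3.
cycle = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
print(random_cut_sizes(cycle))  # average ~2.5; the best cut found is 4
```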
My first introduction to this was primality tests. In my CS data structures course, a few of the assignments were about implementing bignums (for storing and operating on huge integers, like > 2^64), and one of them was about checking whether the number was prime.
Brute-force tests using traditional methods are not possible at that scale, but there are probabilistic tests like Miller-Rabin that determine either that a number is definitely composite or that it may be prime. It has another integer input that is a random variable. For prime numbers, all random seeds will give you the same "may be prime" outcome. For composite numbers, there are some fraction of the random seeds that give you the "definitely composite" outcome - they are "witnesses" to the compositeness of the number. You repeat the test with different random inputs until either you find a witness, so it is composite, or you reach a certain number of iterations where the probability of it being composite despite not finding any witnesses is so astronomically low that you can assume that it is prime.
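For reference, a standard formulation of Miller-Rabin in Python (a sketch from memory, not the assignment's actual code; each random base that fails to be a witness cuts the residual error probability by a factor of at least 4):

```python
import random

def is_probable_prime(n: int, rounds: int = 40) -> bool:
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    # Write n - 1 = d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)   # the random "seed" base
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False   # a is a witness: n is definitely composite
    return True            # no witness found: n is very probably prime
```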
Make a video on p-adics
The covering points with disjoint unit disks problem has been my favorite for the past few months. I hope to see progress made on it, but I'm not super optimistic given that the last progress made was in 2012.
What if there are dots covering ~ 90.7% of the paper? Like so many dots packed together, but still covering less area compared to coins?
I might be wrong but i don't expect coins to cover all dots in that case cos coin areas aren't spread like dots, cos coin areas are like small concentrated chunks
@@GiggityGlen what do you mean by dots covering 90.7% of the paper? Dots don't cover anything, they have no area...
Fair point, what if the dots are like rational numbers? Take any small area, there's infinite points in there but there's more space to fill
@@GiggityGlen well yeah even points at all rational coordinates have no area because they're countably infinite
Yo, agree, but my point is the coins won't be able to cover all the points.
Well if there are also infinite tiny coins, they might cover actually. Nvm I'm drunk 🥴
this video blew my mind, really well done
Great video, no filler at all and got straight to the point. Awesome!
0:58 I don't understand how a packing efficiency of 90.7% means that on a random shift the E[# covered] is 9.07 for all possible configurations of points.
Why isn't it that for some configurations of points E[# covered] is < 9.07 and > 9.07 for some others?
Interesting video, btw :)
This uses a property called “linearity of expectation.” If we let X_i be 1 when point i is covered and 0 otherwise, then E[X_i] = .907.
X = X_1 + … + X_10 is the total number of points covered and by this property, E[X] = E[X_1] + … + E[X_10].
@@ASackVideo But doesn't the expected value assume the dots are randomly arranged? Why would we not assume there might exist an intentionally arranged configuration that will always fall outside of this packing grid arrangement? Even if a randomly arranged 10 dots would have slightly over 9 dots covered in expectation, that doesn't mean 10 intentionally placed dots can always be fully covered. There may exist specific arrangements of ten dots where the expected value should really be 9.
@@eragon78 we computed that for every arrangement, the expectation of shifting the hexagons is 9.07 using this linearity of expectation property.
@@eragon78 ....no. What it assumes is the random position of the coins. Try to look from the perspective of each and every single dot instead as, say, the hexagonal coin packing configuration is shifted across every possible position. Of those possible positions, exactly 90.69% of them will cover the dot and the rest don't. This is true for every single dot, making the average coverage necessarily 9.069 dots across all packing positions. The rest of the argument is the same as in the video.
P.S.: I wonder if this will make it more clear... I wish I could visualize it but for now I will just try to explain with words. Keep track of the center of one coin across the space of all possibilities for the packing patterns, which is in the shape of -an equilateral triangle- _a parallelogram made of two equilateral triangles_ whose side is equal to the diameter of the coin. Now look at one dot, and color the parallelogram with red and white such that if the center of the coin is in red it means the dot is covered by the hexagonal packing, otherwise not. You will find that 90.69% of the parallelogram's area is going to be colored red. Do the same thing for all dots -- the result will be exactly the same; it's just that the position of the white area will be different. In any case, the total sum of the red areas from all ten parallelograms will be 9.069 parallelograms. Now stack all of the parallelograms above one another, look at every single point of the parallelogram stack, and evaluate for any single point how many layers are colored red (imagine we are 'compacting' the red areas down). Since the total sum of the areas is more than 9 parallelograms, it is necessary that there are some points in the parallelogram stack where all ten layers are red.
P.S. 2 (5 days after original comment): I made a desmos interactive for fun: www.desmos.com/calculator/l3zhktbwez
I originally made this for fun, although I decided to share it to disprove someone being confidently incorrect before; fortunately the reply in question has now been deleted lol
@@Anonymous-jo2no no, i think I see it now. I did some more thinking into it and I think I understand now.
I was getting confused because I was thinking of another similar problem this kind of reminded me of. There are some key differences though, which is why it doesn't apply.
But I was thinking of that one problem where you have 100 boxes numbered 1 through 100 and also 100 slips of paper 1 through 100, randomly placed into the boxes. Then you get 100 people, numbered 1 through 100, and they have to go into the room one at a time and open 50 boxes. If ANY SINGLE person of the 100 fails to find their number, the entire group fails and is executed. However, there exists a strategy for this game where you can couple the successes/failures together, and instead of each person succeeding or failing independently of each other, they all succeed or fail at the same time. If you haven't heard of this problem, the solution is to have everyone follow a loop starting with the box of their number and going to the next box that the paper inside each box they open says. So if they are number 13, they start on box 13; if box 13 has a paper with 84 on it, they open box 84, etc. This means the ENTIRE group will succeed if there are no loops of 51 or greater, or the MAJORITY will fail if there is a loop of 51 or greater. This couples all their wins/losses together. Each individual person still has a 50/50 chance of being victorious or not, but that win chance is no longer independent of the other contestants. This method gets you around a 1/3 chance of success iirc.
Of course I do see now some of the key differences. Each person's individual chance of success is still 50%. It doesn't change by coupling it to the success chance of other people.
So for the dots: since the chance of being covered is greater than 90%, even if they are coupled, they MUST retain that 90.7% chance of being covered on a random shift of the coin grid. The only way this is possible is if there exists at LEAST one situation in which all 10 dots are covered. So I do see the solution now. I was just confusing myself at first by drawing analogies to another problem it reminded me of, but thinking deeper about it I see my mistake now.
Thanks for the explanation though, helped me see my mistake.
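For anyone who wants to poke at the coverage argument from this thread numerically, here is a rough Monte Carlo sketch (entirely my own construction, not from the video): fix any 10 points, sample random shifts of a hexagonal packing of unit coins, and tally coverage. The average comes out near 9.069 for any arrangement, and the sampler also stumbles on shifts covering all ten.

```python
import math, random

R = 1.0                         # coin radius
AX = 2 * R                      # lattice basis: A = (2R, 0)
BX, BY = R, R * math.sqrt(3)    #                B = (R, R*sqrt(3))

def covered(px, py, sx, sy):
    # Is (px, py) within distance R of some coin centre of the
    # hexagonal packing translated by (sx, sy)?
    x, y = px - sx, py - sy
    b = y / BY                  # lattice coordinates of the point
    a = (x - b * BX) / AX
    for i in (-1, 0, 1):        # check the nearby lattice centres
        for j in (-1, 0, 1):
            ca, cb = round(a) + i, round(b) + j
            cx, cy = ca * AX + cb * BX, cb * BY
            if (x - cx) ** 2 + (y - cy) ** 2 <= R * R:
                return True
    return False

random.seed(0)
points = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(10)]

trials, total, all_ten = 100_000, 0, 0
for _ in range(trials):
    u, v = random.random(), random.random()   # shift within one cell
    sx, sy = u * AX + v * BX, v * BY
    c = sum(covered(px, py, sx, sy) for px, py in points)
    total += c
    all_ten += (c == 10)

print(total / trials)    # ~9.069 = 10 * pi / (2*sqrt(3)), for ANY points
print(all_ten / trials)  # positive: some shifts cover all ten points
```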
I now understand what my poor dog must go through when I talk to him
😅
This is the kind of thing that motivates me to study more math. What an amazing concept!
if it probably exists, then it doesn't.
Excellent! Finally a short and fast math video!
(Like a one-page proof... You have to reread it twice, but at least it doesn't take two hours. And there aren't those annoying musical interludes!)
i really like the pacing. i wish my uni lectures could go at that speed lol
2:30 I don't understand, isn't this a problem that's really easy to just brute force with a powerful computer?
This video is really cool and I definitely enjoyed it, but I think you really need to slow down a bit. The explanations were well done but felt a bit rushed, so I had to pause every few seconds, which kinda destroyed the flow of the video :/
You can slow the video down using the YouTube settings. You can slow it down all the way to 1/4 speed.
@@jjjccc728 yea..sure..
Yeah agreed, I think the content was very good, but the delivery was a bit too rushed.
YouTube recommended me this. Although I didn't follow all the numbers, I got the concept, which was extremely well explained! I like it!
0:50 Yet more proof that hexagons are, in fact, the bestagons
My way of explaining this (or a similar) concept was a little different (and perhaps useless, but it did get people to understand)
Imagine a man with an infinite life. His life's mission is to win the lottery, so he plays every day.
Every time he plays, he has a tiny chance of winning it, right? Let's say it is k. We know that 0 < k < 1. The probability of him not winning the lottery in a given day is 1-k. Therefore, the chance of him not winning it in n days is (1-k)^n.
We know that (1-k)^n → 0 as n → ∞. In other words, the probability that he never wins shrinks to zero as the number of days grows.
Therefore, it is proven that, if the man has a probability k of winning the lottery in a given day, he will eventually win it as long as he plays enough times.
The same logic can be applied to different scenarios. Sometimes I use it to defend the possibility of the existence of alien life:
The universe is absolutely gigantic, besides being quite old. There are (and have been) so many planets around that you could say the number is close to infinite (I know saying this is an atrocity, but you get the idea). If you consider that each planet has a probability k of generating life, the previous logic applies here and we conclude that alien life must exist.
Obviously the number of planets isn't truly infinite, but still I think this is at least a very interesting argument.
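A quick numerical check of the (1-k)^n claim above, with a made-up one-in-a-million daily chance:

```python
k = 1e-6                      # assumed chance of winning on a given day
for n in (10**6, 10**7, 10**8):
    # chance of still never having won after n plays
    print(n, (1 - k) ** n)    # ~0.37, ~4.5e-05, ~3.7e-44: heads to zero
```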
I actually needed this to win an argument, so thanks.
hmmm can't wait to see this channel evolve
When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth
The probabilistic proof for the quarter problem works for n ≤ 10
Ah yes, the quintessential fractal, where the title describes the video and vice versa
Just finished watching, this was absolutely great! Super engaging, super informative, super intriguing
The really wild thing is that there are constructive proofs of certain theorems in probabilistic combinatorics (see: the constructive Lovász local lemma). You can then follow the proofs of these constructive probabilistic theorems to generate examples that witness the existence you are proving! This has been done recently to prove sharp bounds on certain counting problems.
This is kind of the basis of Murphy's law as well. Everyone knows it as "if something can go wrong, it will", which most people take to be about having bad luck, but really it's a precept of engineering that rests on this same kind of probability argument: if you build something with a 1% chance of breaking, and you build more than 100 of them, then there WILL be users whose product breaks, and they WILL be calling you to fix it. This is very important in engineering a nuclear reactor, and it's why they're not designed the way Chernobyl was designed. It HAS to be assumed that at some point, some day, something WILL go wrong, so the default state of the control rods must be CLOSED: if power fails, or there is a hurricane, or whatever else goes wrong, the control rods are no longer held open and fall to the closed position, stopping the reaction, instead of needing to be actively closed, which could itself fail and cause a meltdown like what happened at Chernobyl.
For some reason, I assumed this was a big video made by a decent size channel with 10K subscribers at a minimum. This is good content for such a relatively low number.
May the algorithm bless this video.
It also becomes invaluable when dealing with quantum mechanical probability states.
As you've said, this method pops up a lot in Ramsey Theory and of course was used immensely by Paul Erdos.
In particular, my favorite application is one I sadly don't remember the details of...but this was showing that there exists some graph with 2 different properties P1 and P2. You do so by showing that the probability of a graph having P1 is > .5 and the probability of it having P2 is also > .5, which means that there must be some graph that has both properties.
I'd love if someone reminded me what the specifics were! Something like "there exists a graph with chromatic number k and largest independent set size m..." but maybe something less related...
And I'm almost positive I found this in van Lint and Wilson's "A course in Combinatorics"
Last note - Ramsey graphs are incredibly interesting and cleverly built! Easily one of my favorite things from Ramsey Theory.
You might be thinking of the proof that there are graphs with arbitrarily large girth and chromatic number? (You’re right, that is a really fantastic application of the technique!)
@@zachgz That's 100% it! Thank you so much!
It's like the pigeonhole principle, but used for continuous things.
Thanks for the video.
Another book on the topic that might be useful for anyone interested in computer science is "Probability and computing". I'd especially recommend this book to anyone who is curious about theoretical aspects of machine learning - the new version has a chapter on the famous Vapnik-Chervonenkis dimension.
Ooo I feel so early to what is clearly about to be a huge channel. pre-congratulations.
Just to clarify the argument in a way I can follow. For some configuration of 10 points there is a probability Pi that exactly i points are covered by disks. The expected number of points being covered can then also be calculated by Sum(i*Pi)=P1+2*P2+3*P3+...+10*P10. If for any configuration P10 were to be zero, then that sum can at most be 9 (when P9=1). Since another method of calculating the expected number of points being covered resulted in a value higher than 9, you know that P10 cannot be zero.
If you were dealing with a configuration of 11 points though, you cannot use this method that easily. Because 11*90.7% = 9.977 < 10, it could be that P11 is zero if P10 > 0.977. But if you can prove that P10 must always be smaller than that, then I guess you can prove that 11 points must always be possible to cover as well.
and this is exactly why I stopped going to parties
So happy that I stumbled upon your channel. Great video, good content.
In my opinion the probabilistic method becomes more intuitive when you look at the converse version. If the probability is less than 1, then your set can not cover all cases. It is all about finding a sense of volume by which your set of interest has strictly less volume than the whole space, implying it can not cover the whole space and hence there exist points not in your set
Slight issue is that the volume of "valid entries" can be 1, but still be unequal to the entire sample space.
@@KGafterdark Well I am only saying that if the volume is strictly smaller then you will not cover all cases, you raised an issue with the reverse implication, if someone were to claim it.
Please do more of this, they are really interesting
0:35 we can use smaller coins.
Ramsey numbers example was very cool
Great. Now use this to solve the Riemann zeta function
This sounds like the premise of the finite and infinite improbability drives in The Hitchhiker's Guide to the Galaxy.
Amazing video dude
Thank you!!
This is so incredibly silly, I love it.
It makes sense but feels wrong, like a cheap party trick.
Lovely video. The actual bit at the end is honestly super cool.
bro just said words and numbers and I think that is cool
I’m going to have to revisit this video at a later date. It’s 2am and my head hurts, but it sounds like there’s a possible application to the happy endings theorem.
This is so interesting. I like speed but I didn't understand it well. A follow up would be great going through the examples more slowly
Really enjoyed this! If you feel like making more videos in the future, please do!
That trick with the expected value is simultaneously the most cursed and incredible thing I have ever witnessed.
@14:13 " . . and solve them in the realm of probability . . "
You mean assign the guess a probability of being correct.
To solve them, you have to actually determine the correct value, not just the chances of your best guess being correct.
I had to stop only because I was worried my brain was gonna explode
If u get the entire party stoned enough then they all click
Writes a book about a party.
The party: Okeeeee lets gooooooo
This Ramsey guy must be fun at parties
I'll take your word for it and get ready to fight the aliens
Saw u on TikTok, feeling great to find u here on YouTube
Thanks for sharing this content. You're doing a great job.
I thought this was going to be a discussion on modal logic, but I was pleasantly surprised to learn something new
@Michael Ah, but possible necessity entails necessity, so logicians are always looking within the weak sauce for the strong sauce
2:22 if WHAT?
Ramsey numbers were so unlearnable for me when I first took a graph theory class that I ended up dropping the class, but this video gave me so much more understanding, and I'm so glad because I'm retaking graph theory this upcoming semester!
This also applies to R34
I wonder if a similar principle can be used to philosophically deduce something like circular time.
Like: Anything that can happen (chance >0) eventually will happen, given infinite time. Anything that has happened _can_ happen (is possible by definition). Therefore, everything that has happened will eventually happen again. Including complex things like "a Big Bang that unfolds exactly like this one" or "you and me being born and living these same lives the same way".
I'm going to argue that your first proposition is built on an incorrect proposition, namely that the probability of an event occurring does not change over time. I'm going to propose that it does, and that the probability for events tends downwards over large timescales.
@@dojelnotmyrealname4018 It is not built on the proposition that probabilities of events remain constant. It only requires that the probability for events will not go to zero and stay there forever.
Your alternative -- that the probability for events tends downwards over large timescales -- seems reasonable enough if you only try to model eternity _forward_
The probability of things happening would stabilize at zero (because that is the smallest number in a context like this where negative numbers aren't applicable) and the model will predict an eternity of nothing happening.
But what happens when you look in the opposite direction? Probability of events should trend upwards the further back you go, eventually approaching some eternal singularity of maximum happenings. Since there is no biggest number, and since we know the past wasn't eternal anyway (on account of it being the present now), it appears to me that the model breaks down here.
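For what it's worth, the precise version of "will not go to zero and stay there forever" is the territory of the Borel–Cantelli lemmas. Here's a sketch of both directions (my own framing, not from the video, and it assumes the events A_n are independent, an assumption this cosmology argument would also need):

```latex
% Second Borel--Cantelli lemma (requires independence of the A_n):
\[
  \sum_{n=1}^{\infty} P(A_n) = \infty
  \;\Longrightarrow\;
  P(A_n \text{ occurs infinitely often}) = 1.
\]
% If P(A_n) >= epsilon > 0 for infinitely many n, the sum diverges,
% so "eventually happens" follows. The first lemma is the other direction:
\[
  \sum_{n=1}^{\infty} P(A_n) < \infty
  \;\Longrightarrow\;
  P(A_n \text{ occurs infinitely often}) = 0,
\]
% which is exactly the "probabilities decay too fast" failure mode
% described in the reply above.
```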
a lot of this went right over my head
That quarter example is wild. Holy frick.
I hate how he says aliens, but I love the logic in this video....
1/10, it gets my like.
is this like continuous pigeon hole?
It has a similar flavour. In my mind, at least.
The Pentagram of Social Interaction
I knew it all along
Unless I am overlooking something, the proof at 1:05 is wrong. Yes, every point has a 90.7% probability of being covered in a randomly shifted tiling. But this does not imply that the average number of covered points is 9.07, because in general you do NOT get a Binomial distribution as a sum of Bernoulli-distributed random variables that are NOT statistically independent.
We use a property called linearity of expectation. Here's a response I gave to another commenter:
You're right that the events of different points being covered are dependent. However, the property we use for that calculation is called "linearity of expectation", which works regardless of whether or not events are dependent.
So in fact, regardless of the arrangement of the points, the expectation remains 9.07.
Here are some more details copy/pasted from a previous comment of mine:
If we let X_i be 1 when point i is covered and 0 otherwise, then E[X_i] = .907.
X = X_1 + … + X_10 is the total number of points covered and by this property, E[X] = E[X_1] + … + E[X_10].
@@ASackVideo Sorry. You are right. I think I had mistaken it with multiplication. Now I understand the proof. It's like the pigeonhole principle, but continuous. Very cool.
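For anyone who wants to check this numerically, here's a quick Monte Carlo sketch (mine, not the video's code; it assumes unit-radius coins centered on the hexagonal lattice with basis (2, 0) and (1, √3), so neighboring coins touch and the covered fraction is π/(2√3) ≈ 0.907). For any fixed arrangement of 10 points, the average number covered under a uniformly random shift lands near 9.07, dependence and all:

```python
import math
import random

SQRT3 = math.sqrt(3.0)
A1 = (2.0, 0.0)      # lattice basis for unit-radius coins that touch
A2 = (1.0, SQRT3)
DET = A1[0] * A2[1] - A1[1] * A2[0]  # fundamental cell area = 2*sqrt(3)

def covered(px, py):
    """True if (px, py) lies inside some coin of the hexagonal packing."""
    u = (px * A2[1] - py * A2[0]) / DET  # fractional lattice coordinates
    v = (py * A1[0] - px * A1[1]) / DET
    # only nearby lattice centers can be within distance 1 of the point
    for i in range(math.floor(u) - 1, math.floor(u) + 2):
        for j in range(math.floor(v) - 1, math.floor(v) + 2):
            cx, cy = i * A1[0] + j * A2[0], i * A1[1] + j * A2[1]
            if (px - cx) ** 2 + (py - cy) ** 2 <= 1.0:
                return True
    return False

random.seed(0)
# Any fixed arrangement of 10 points works; dependence between the
# coverage events does not affect the expectation.
points = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(10)]

trials, total = 20_000, 0
for _ in range(trials):
    s, t = random.random(), random.random()  # uniform shift over one cell
    sx, sy = s * A1[0] + t * A2[0], s * A1[1] + t * A2[1]
    total += sum(covered(px - sx, py - sy) for px, py in points)

print(total / trials)  # ~9.07 = 10 * pi / (2 * sqrt(3))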
I have no clue why this was recommended to me but I enjoyed feeling my brain break in real time
i remember that someone told me "if you can imagine it, then it does exist, because you can't imagine something that doesn't exist. if you can't find it here, then it exists in another parallel universe"
sounds dumb at first then i just go "OK it's cool" while imagining I'm riding a unicorn
This is fine if you use it as a motivational poster to tell yourself that you can do anything you set your mind to.
But from a scientific standpoint, think about it logically for a minute:
Why would you not be able to imagine things that don't exist? That's literally what imagination is all about.
I don't know why I watched this, but I did. Did I understand it? Not a word. Did I enjoy it? Strangely, yes!
This just came on my recommended and I start my first stat and prob class tomorrow, kinda scared ngl
It doesn't make sense to me why R(5) is difficult, so I'm gonna throw down some values.
For a group of n people, there are `choose n 2 = n * (n - 1) / 2` edges (quadratic), and thus `2^(n * (n - 1) / 2)` possible graphs (exponential of a quadratic. Ok, starting to get it now.), and for each quintuplet of vertices, we have to verify its edges are not all the same. So it's gonna be `choose n 5 * 2^(choose n 2)` of those checks for every n (well, the final n. All the others can stop early, when they find a configuration that avoids [anti-]cliques.) That's O(2^(n^2)).
Yeah, 2^(43^2) is the kind of number that's many times greater than the number of nanoseconds since the start of the universe.
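A quick sketch in Python makes the scale concrete (my own back-of-envelope numbers, just plugging n = 43 into the counts from the comment above):

```python
from math import comb

n = 43
edges = comb(n, 2)               # n*(n-1)/2 = 903 possible edges
colorings = 2 ** edges           # every red/blue coloring of those edges
checks = comb(n, 5) * colorings  # 5-vertex subsets to test per coloring

print(f"edges = {edges}")
print(f"colorings: {len(str(colorings))} digits")  # 272 digits, ~10^271
print(f"checks: {len(str(checks))} digits")        # 278 digits, ~10^277

# Nanoseconds since the Big Bang (~13.8 billion years), for scale:
ns = int(13.8e9 * 365.25 * 24 * 3600 * 1e9)
print(f"ns since the Big Bang: {len(str(ns))} digits")  # 27 digits, ~10^26
```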
thanks a lot!!! such nice reverb
This title and video reek of a video that'll eventually get millions of views
“Yeah, it might. Therefore it does.”
1:10 Had to watch this twice to understand it, then it was like ..."wow, brilliant"
I figured from the title that this would be a modal logic proof, but I’m delightfully surprised to see another cool method to prove existence from a probability of existence!
For those who are curious, there are philosophical proofs that prove existence from a probability of existence using concepts of modal logic. One famous example of this is the modal ontological proof for the existence of God. Before presenting the argument, you should know that the definition of a _necessary being_ is a being who necessarily must exist and cannot fail to exist. With this in mind, the modal ontological argument runs something like this:
There is at least probably a chance that God exists. If God probably exists, then that means there is some possible world in which God does exist. God by definition is a necessary being, which means He cannot fail to exist. If the existence of a necessary being can probably be established, it is the same as saying that a necessary being exists in at least one possible world. If a necessary being is established to exist in at least one possible world, then it must also exist in all possible worlds, including the actual world. In other words, this is the same as saying that in some possible world there exists a being who cannot fail to exist, and if a being cannot fail to exist in even just one possible world, then it must also exist in every possible world, including the actual world. Therefore, God exists.
Of course your intuition should lead you to suspect that this argument is pretty dang flawed, and it is indeed, for a number of reasons, but nevertheless it is a very interesting argument since it ostensibly establishes existence from probable existence. One quick way to see how this argument is flawed is by beginning with the opposite premise, namely, that it is probably the case that God does *not* exist; you can then follow the same chain of reasoning to establish that God does not exist.
I'm glad that you covered this in the way you did. My first thought listening to this was that some number of religious fundamentalists will happen upon this video and start using this argument ad nauseam.
If we run with this argument the way you've described, then every god/goddess/other being exists everywhere all at the same time, along with every being ever mentioned in folklore, and the world would be a very different place, to say the least.
@@mechanomics2649 Thanks for the compliment! There are a lot of places where this argument fails. Another faulty premise comes right off the bat: “there is probably a chance that God exists”. I think there is an equivocation fallacy going on here, because theists presenting this argument will get an atheist to affirm that premise by accepting that there is some probable chance that God exists, and there seems to be nothing wrong with granting that. But the problem is that there is an ambiguity in the word “probability”: it can mean two different things. The physicist Sean Carroll wrote an excellent article in Quanta Magazine entitled _Where Quantum Probability Comes From_ where he expounds two different versions of probability.
The first is the *objective* meaning of probability, which treats probability as a fundamental feature of a system. One example of this is frequentist probability, which defines probability as the frequency with which things happen over many trials. If I flip a coin and I say “there is a 50% chance it lands on heads and a 50% chance it lands on tails” what I really mean to say is that if I flipped a coin infinite times, half of those trials will be heads and the other half will be tails.
The other meaning is a *subjective* version of probability. When I say something like “there is probably a chance that candidate X will be president” I don’t mean to say given infinite trials candidate X will be president this many times. Rather, it is reflective of an epistemic shortcoming. It reflects my lack of knowledge or incomplete information about what is true or what will happen, and so I assign a probability to my degree of certainty of what is true or what will happen. An example of a subjective version of probability is Bayesian probability.
So when theists present this argument to an atheist and ask “do you think there is a chance that God exists?”, many atheists will answer according to a *subjective* version of probability. They do not have complete information about everything, and so they assign a degree of certainty to the proposition that God exists. Since this is the case, an atheist will affirm the first premise of the argument, but this is where the bait and switch comes into play, because unbeknownst to the atheist, the theist will run the argument as if the atheist had affirmed an *objective* version of probability. The idea that God probably exists in some possible world is not a subjective probability assessment; it does not measure a degree of belief. Rather, the probability is an actual feature of the many possible worlds. When the first premise is understood in this way, it is much more controversial and the entire argument collapses straight from the get-go.
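The frequentist reading in particular is easy to see in code. A toy sketch (mine, not from Carroll's article): the empirical frequency of heads settles near 0.5 as trials accumulate, which is exactly the "probability as long-run frequency" picture:

```python
import random

random.seed(1)
for trials in (10, 100, 10_000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(trials))
    print(f"{trials:>9} flips: frequency of heads = {heads / trials:.4f}")
# The frequency converges toward 0.5: the objective (frequentist) notion.
# A Bayesian credence ("I'm 50% sure it's heads") needs no repeated trials.
```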
@@mechanomics2649 The ontological proof of God is already used ad nauseam
I don’t know if the first proof is solid without more analysis of the hexagonal pattern. Consider a disk of radius 5. Let’s say we can put two dots anywhere and we have one coin of radius 4 to cover both. Now with your technique the packing efficiency would be 16/25 = 0.64, so E[# covered] = 2(.64) = 1.28 > 1, which would be enough to conclude that there must be a configuration of the coin where both points are covered. However, we can easily see a counterexample where one coin cannot cover both by considering any two points that are greater than a distance of 8 apart, as that is the diameter of the coin.
Not all shifts of the coin in your example lie entirely in the radius-5 disk, so each point is not covered with probability .64. The important detail for why this isn't a problem in the video is that ANY hexagon of that size, shifted any amount, will have 90.7% of it covered. (Or equivalently, any shift of the pattern covers 90.7% of the hexagon.)
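To spell out exactly which hypothesis the disk example violates, here's the averaging argument in symbols (my paraphrase of the reply above, not a quote from the video):

```latex
% Let X_i = 1 if point i is covered by the random shift, X = X_1 + ... + X_n.
% If EVERY point is covered with the same probability p, then by
% linearity of expectation
\[
  E[X] \;=\; \sum_{i=1}^{n} P(X_i = 1) \;=\; np .
\]
% Since X only takes values in {0, 1, ..., n}, E[X] > n - 1 forces
% P(X = n) > 0: some particular shift covers all n points at once.
% The disk counterexample breaks the hypothesis, not the logic: a shift
% constrained to keep the radius-4 coin inside the radius-5 disk does NOT
% cover each point with probability 0.64, so E[X] = 1.28 was never justified.
```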
2:05 "Any party with six people is doomed."
Introverts know that intuitively.
I love this information, but the video was so fast paced. I would really like to see similar stuff explained with further detail in 10-20 minutes.
I understood until R(6) and then everything went over my head. But I am going to understand this so that I can avoid sticky situations.
This only works most of the time
This idea reminds me a little bit of what physicist Murray Gell-Mann said: "Anything not prohibited is compulsory"....as well as Murphy's Law: "Anything that can go wrong, will go wrong" which most people assume is a pessimistic viewpoint, but its original intent was a recognition of the infallible reliability of causality.
It will happen or it will not happen. The probability of you randomly appearing on the moon is there, but that doesn't mean you ever will, even given time.
Another amazing video! :)
i'm impressed, you managed to mention ramsey and the r of k problem without bringing up graham
Some nerds were really overthinking going to parties.
Very interesting, and such a good exposition.
*Mathematics:* "You should fight aliens."
1:00 - "The expected number of points we cover is 9.07" --> Then some way has to cover SOME ARRANGEMENT of 10 points. It doesn't imply that FOR ANY arrangement of 10 points, some way to cover them all exists...
The point arrangement isn't what's random here. We calculated that for EVERY point arrangement, the expected number covered is 9.07.
@@ASackVideo Aaah, that's true. Thanks!
This is amazing, keep going :)
my happiness probably exists, i think.