I also prefer the s(n) definition of the busy beaver, where s(n) is the number of steps the longest-running machine takes before halting. This is more directly useful - if you have s(n), and an n-state machine, you can just run your machine for s(n) steps: if it halted, great; if it didn't, it'll never halt. The "max number of 1s" definition requires another level of indirection to actually be useful - many infinite-looping machines will produce fewer 1s than that, so you can't tell if *your* machine will infinite loop unless you're lucky enough to have it produce more than Σ(n) 1s. Instead you have to run *all* machines until every one that has produced Σ(n) or fewer 1s has either halted or been proven to infinite loop, then you can use that max number of steps to check *your* machine. That's a lot harder to work with! The two definitions are equivalent for the purpose of the proof, it's just that s(n) is more directly useful and it's clearer why it's uncomputable.
You're right, and it took me a while to realize this. Exactly, S(n) is useful because you know when to stop running the machine. I went with Sigma(n) honestly to avoid confusion. It's weird to compare the output of a computable function to the number of steps the TM ran for. I tried both, and both of them brought a burden of extra explanation. In the follow-up video, I'll discuss S(n).
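For anyone who wants the idea in code: a minimal Python sketch of why knowing s(n) would decide halting for n-state machines. The value `S_N` is a made-up placeholder, since the real s(n) values are unknown and astronomically large:

```python
S_N = 10**6  # hypothetical placeholder standing in for the true (unknown, astronomical) s(n)

def halts(machine, s_n=S_N):
    """Decide halting for an n-state machine, assuming s_n really is s(n):
    any halting n-state machine must halt within s(n) steps."""
    tape, head, state = {}, 0, 'A'
    for _ in range(s_n):
        if state == 'H':                              # halt state reached
            return True
        write, move, state = machine[(state, tape.get(head, 0))]
        tape[head] = write
        head += 1 if move == 'R' else -1
    return state == 'H'

# A trivial 1-state machine that halts immediately, just to exercise the function:
print(halts({('A', 0): (1, 'R', 'H'), ('A', 1): (1, 'R', 'H')}))   # True
```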
Oh, I saw your post on Twitter some weeks ago, didn't expect you'd make a video out of this! Regarding 11:10, while it's true that in theory knowing a specific BB number lets us determine which machines halt, this also requires actually running said machine for that many steps! So considering BB(27) is already larger than your question-mark machinery, you don't really have any chance of accomplishing any of this even if an oracle told you the value.
It's actually not too hard to get an intuition for why there can be no computable function that grows faster than BB. Let's consider the even more absurd function S(x), which counts the largest number of steps any halting Turing machine with x states takes (instead of counting the number of 1s it produces). By definition S(x) has to grow faster than any computable function, because if there were one that grows faster than S(x), we'd have an upper bound on how long we have to wait until we know a Turing machine can no longer possibly halt, and the halting problem would be solved, which is impossible. Now this doesn't prove that BB(x) also grows this fast, but it would be very surprising if the growth rates of BB(x) and S(x) weren't related in some way.
I think the reverse is even crazier. Any halting computation fits within this busy beaver bound for its size, meaning we have an upper limit on all possible computations (well, we are essentially doing that anyway). And even if you increase your tape alphabet beyond binary... it can always be done in binary.
The title and thumbnail are a little misleading, because they suggest this line perfectly separates the computable from the uncomputable, even though there are uncomputable functions that are "below" the busy beaver.
3 things to say: 1. Mind blown. 2. Are there any imaginable use cases should BB(5), BB(6), BB(7), ... be found? 3. Would quantum computers be any better suited to these kinds of calculations than current computers?
10:09 It likely overtakes the weird ? function at around 7 or 8. People have already proved that Σ(6) > 15?, and I wouldn't be surprised if one extra state makes the number a lot bigger by adding extra recursion.
If I understood correctly, he was comparing Sigma(n) to n? with two dashes, not sure if you were also referring to that or just the simple question mark
I just took a class on computational theory last semester at ASU, and it definitely blew my mind and most of my classmates' minds too. It would be absolutely terrible to be the one who has to write the proofs for some of these things! It was really cool to hear you talk about this stuff, though; it brought back some of the things I had forgotten, like the Church-Turing thesis.
Nicely explained, really. The subject is fascinating, like a science fiction movie can be, but when it goes towards "if we can compute something that cannot be computed, we disprove the Riemann hypothesis", then I think it's worth taking a step back so things don't turn into navel-gazing, like any area where something gets more and more theoretical (string theory, for example) and less connected to what I hope is the actual point of science, especially in times like these - applying its knowledge to improve society and people's lives, not just "do something because we can". This has been a realization of mine after spending a lot of time on things like this. It's interesting, satisfying, comfortable, but in the end rather pointless, as a few world champions have described after their pursuits have ended. We need the connection to reality, so we can see the ever-increasing questions and lack of answers as a hint, instead of pushing forward in vain hope.
Really interesting topic, thanks for making a video on it! Also, weird that there was no mention of compression, since the 4-state busy beaver (Σ(4) = 13) is the shortest way to express "make 13 ones" computationally.
Here's a followup question I have on the Halting problem: Let A be an algorithm that receives - any state table, and - any finite input tape and can output either - that the Turing machine halts, - that it doesn't halt, or - that A doesn't know whether the Turing machine halts, and it is not allowed to be wrong about the first two options. Then, what portion of the possible inputs, as input size goes to infinity, can A compute (i.e. output that it halts or doesn't halt)? If the percentage is zero, how does the computable portion compare to the frequency of prime numbers, squares, or powers of 2 in the integers?
Love these videos about absurdly large numbers. They remind me of those juvenile days when we argued over who could come up with the larger number. Except these are not even close to anything a small kid could imagine.
This was the first video I've seen on your channel and it was insanely cool, I can't wait to see more! I also appreciate the time you spent on the description of the video with the extra information and references.
Nice video! It’s worth noting that the existence of the Goldbachs conjecture Turing machine isn’t too surprising. I could write a rather simple computer program that simply goes through the even natural numbers in order exhaustively checking if I can write it as a sum of two primes, and halts when it arrives at a number that cannot be written as such. This program halts if and only if Goldbachs conjecture is false.
While it's true that Goldbach's conjecture being expressible as a halting question isn't surprising, I think what is more surprising is that both the Goldbach and Riemann hypotheses are equivalent to the halting of some n-state machine, which makes me think that probably all unsolved mathematical conjectures are related to some n-state machine.
@@3kelvinhong I agree that the Riemann Hypothesis is more interesting. It's a more complex conjecture, which I guess necessitates the ~1000 states needed to disprove it. I didn't mean to suggest the video isn't interesting, if that's how my comment came across. Theoretically, any conjecture that one could disprove by iterating through all possible counterexamples can be translated into a halting problem. Like much of mathematics, it's a connection that one doesn't immediately think of, but once you hear it, it seems very natural. The halting problem is hard precisely because solving it would give you a shortcut to proving many (all?) such conjectures.
The picture is misleading. The number of functions is uncountable (the cardinality of the continuum), while the number of Turing machines is countably infinite. So there are way more uncomputable things than computable ones.
Surely if one can encode a brute force attack on the Goldbach Conjecture as a 27 state Turing machine so that halting means that it has found a counter example, then this is not that surprising.
Awesome video, just turned me into a subscriber! From the aesthetics of the video, down to the content and calm presentation you deserve serious recognition. Nice work, this channel WILL blow up :)
Seems like almost the same idea. Rayo's number is the largest finite number that can be defined in less than a certain number of symbols in a particular mathematical language, right? Busy beavers seem like the same thing, just using a different language to define the numbers.
Aaronson discusses this. He says: if we're given the value of Shift(27) (this is the shift function, which counts the total number of steps taken by the machine rather than the number of 1s written), then that reduces Goldbach to a finite procedure. You just run the Goldbach machine for Shift(27) steps. If it hasn't halted by then, you know Goldbach is true! But this is not a practical route at all.
The number 744 is prominent in the q-expansion of Klein's j-invariant, namely j(tau) = 1/q + 744 + 196884q + ... which manifests in Ramanujan's constant exp(pi*sqrt(163)) being approximately 640320^3 + 744. Coincidence?
I was going to say maybe, but after searching a tiny bit, the next term in the expansion is -196884/640320^3, which makes me a lot more inclined to believe that there is some connection there.
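The near-integer is easy to check numerically; a quick sketch with mpmath (high-precision arithmetic) shows the residual really does sit close to the claimed next term:

```python
from mpmath import mp, exp, pi, sqrt, mpf

mp.dps = 40                                    # 40 decimal digits of working precision
ramanujan = exp(pi * sqrt(163))                # the famous "almost integer"
print(ramanujan)                               # 262537412640768743.99999999999925...
print(mpf(640320)**3 + 744 - ramanujan)        # ~7.499e-13
print(mpf(196884) / mpf(640320)**3)            # ~7.499e-13, matching the next term of the expansion
```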
This might be the definitive mind-blowing video on the BB topic that I'll ever see. Now I can close this case in my mind and go find other conclusions in this awesome field.
Very cool video, but I'm not getting the "There exists a 27-state machine that halts iff..." part; in that case, can't you just construct a trivial machine that halts on each internal state no matter the input obtained from the tape? EDIT: nvm I think I got it; there's a SPECIFIC 27-state machine, which halts iff Goldbach's Conjecture is false
FYI, this topic is much bigger than what a 13 minute video can represent, so I'm planning a follow up video. That'll address the natural questions that follow from this one.
And second, I'd like to say that this isn't a normal video for me. I'm not a complexity theory expert by any standard (in fact, I recruited an expert to help on this one), so I'll be getting back to machine learning topics in the future.
The topic of non computable numbers is absolutely fascinating and is really well explored here. (There doesn't seem to be a huge amount on the topic across the internet, given its profundity)
I'd be keen to learn more about how certain state machines map onto things like the Goldbach and Riemann hypotheses.
Hope you tackle some weird ideas like this one from time to time. I had never heard of the implications of BB that you mention, and it really gave me a new perspective on the topic.
the duration was on purpose, wasn't it?
Great video. But I would like to point out that you pronounced Alan Turing wrongly. It should sound like "Tu-ring", not "Tur-ring".
@@yuan-jiafan9998 oh wow I didn't know that, I appreciate the note
"the solution is easy, if the computer does not halt within a reasonable time, i smash it with a large mallet" - Alan turing.
I've found an elegant proof of Goldbach's Conjecture, but this binary Turing machine is too small to contain it.
This is some serious math humor
This hits too close to home
lol
@@orik737 ah, Fermat's last theorem is unknown to people outside of math
Pierre not again!
I actually think the name Busy Beaver is great. Because for most of us regular folks, we ran into this number when looking for something that beats the TREE function. Beavers chomp down trees. Perfectly fitting
Genius
Rayo has a flamethrower.
And Rayo in Spanish means thunderbolt, which destroys trees and beavers
@@gustavojuan5746 I guess big foot is immune to lightning
No, I think it's a stupid name
I always thought Graham's number was a really underwhelming name, but "busy beaver" so massively understates what it's describing.
I find it oddly fitting
@@iamthecondor I was picturing something more like THE COMPUTABILITY BOUNDARY instead of 🌲🐿
@@MogaTange with only 1 tree, that beaver's clearly not busy enough
@@iamthecondor are you suggesting I add my backups, tree 2 and tree 3
Language and math as we know it is insufficient to describe just how busy those beavers are
Fun fact: there are only finitely many states in the observable universe. You can estimate an upper bound on the number of states of a system as e^S, where S is the entropy of a black hole the same size as the system. For our universe this ends up being around S = 10^123. This means that we could only ever hope to compute up to BusyBeaver(e^(10^123)), as later busy beaver numbers need computations that our observable universe is too small to contain.
🤯
Then, the question is, if the entire Universe is your Turing machine, what are you going to use as tape?
For some reason, I find this really unsettling...
@@alexeyvlasenko6622 the universe is the tape according to op. the question then is what will be the computer
@@ksalarang Stephen Wolfram has entered the chat.
The most amazing thing about this is that the human brain has the ability to describe even the unimaginable in just a few symbols.
It's more pointing at something than giving a description.
If it’s unimaginable then we can, by definition, not describe it. We might have a rough concept of it but that’s just describing the concept of something we know exists but can not fully understand, or in this case, imagine.
We have a symbol for infinity but there is no person on earth able to fully understand actual infinity, because our brains are not wired that way.
The easiest thought experiment I can give you is the following: tell me the number separated by exactly the smallest possible amount from ♾️. So basically the number that comes right before you reach infinity. Now, you'll figure out very quickly that you just can't do it because you'll always be able to add something to it, which means you are never going to reach infinity. But we do have the concept of infinity.
To know something is there and be able to fully understand it are two different things.
@@sudowtf You are not thinking about this correctly. Infinity is not a number at all, so the words "the smallest possible number that precedes infinity" are nonsense. These words don't mean anything.
However, "infinity" expresses precisely the concept that one can always add something to a number to make it larger. There is no upper limit. How is a statement about the absence of a limit impossible to understand?
@@alexeyvlasenko6622 Infinity can not be expressed by a number, and numbers are by definition not infinite. There is always a limit. You can not just keep adding +1 to any number because eventually you’ll run out of storage, in whatever form it may be, to store your new number. So again, what is that number right before you reach infinity? There must be a point where infinity is reached. A sort of switch. So think of that, but then add +1 to that number and keep doing that.
My thought experiment merely shows that we can not imagine, express or think of the unimaginable. If you want to dive deep into the mathematics then we can, but it’s a different topic really.
@@alexeyvlasenko6622 And infinity doesn’t exist in a finite universe. Nothing is infinite. You could say the universe itself is infinite because it’s expanding but there is no proof that it’ll keep expanding forever. Since “forever” isn’t even defined here. If we go by the normal definition then forever itself is an expression of infinity and again, we don’t know if the universe will expand forever.
When he started explaining Goldbach's conj. I thought to myself "that can't possibly be true" and started with even numbers less than 10
and then started having an existential crisis as I realised every even number I tried could be made up of two prime numbers
I think it has been tested for numbers up to 4x10^14
@@adonaihilario1006 But imagine the fame and glory to the person who thinks up the 20-digit number that's the counterexample! :D
@@AlaiMacErc If we're going to dream about crazy shit like that, imagine the person who finally finds we need to nuance the statement 1+1=2.
My mind got blown when I read this and didn’t know what a prime number was.
I thought the opposite. It is such a frustrating conjecture to me because it seems to be obviously true. As you get larger, there are more and more ways that you can construct a number with 2 primes, like how 4 can only be written as 2+2, but 16 can be written as both 11+5 and 13+3. And since it's been proven that there are an infinite amount of primes, it would be extremely strange for the conjecture to be false.
Forget the busy beaver for a moment - this explained the halting problem better than most videos focused completely on that
This was the most clear and concise explanation of how turing machines work in relation to what a BB number is, thank you
Formal Languages and Automata as well as first and second order logic was one of my favorite parts of my CS undergrad. Thanks for the nice video!
11:26 A closely related problem to the busy beavers is the maximum shifts function S, which counts the most shifts a Turing machine of a certain size can make while still halting. If we calculated S(27) _without_ proving or disproving Goldbach's conjecture, we could actually (theoretically) use our result to prove or disprove Goldbach's conjecture. Simply run the Turing machine that terminates iff Goldbach's conjecture is false and see whether it halts within S(27) shifts. If the machine terminates within S(27) shifts, then obviously Goldbach's conjecture must be false. But if it is still running after S(27) shifts, then Goldbach's conjecture must be true, because no halting machine of that size can take more than S(27) steps. It must run forever, so there exists no counterexample to the conjecture.
Well, using it to prove Goldbach's conjecture wouldn't actually be possible though, right? Since probably even something like S(6) would already be bigger than the number of atoms in our universe.
@@tcoren1 True, that's why I said "theoretically." But this logic allows mathematicians to use busy beavers as an upper bound for how difficult a problem is to solve. The smaller the Turing machine that halts upon disproving the problem, the "easier" the problem could possibly be.
But what if you determine S(27) and it’s a number that you would need to invent a new notation to even describe? How would you even run a machine that many steps to see if Goldbach was right?
@@cara-seyun The shift function should always output an integer. Even if that integer is incomprehensibly large, and in reality would require special notation to represent, in _theory_ it is still just an integer. With enough space, time, and resources, you could theoretically represent this integer in a standard form. Whether this is actually possible in real life is a different story.
Perhaps in a trillion years there will still be people puzzling over how they might solve Goldbach's conjecture! Perhaps it is easier to just solve entropy and run a perfectly durable machine for an infinity-1 amount of time to calculate S(27).
My favorite mind-blowing fact is that the sum of 2^-BB(k) from k=1 to infinity definitely converges and we know what it's equal to up to an insane number of digits, but there is some precision which we will never be able to achieve.
The result of that sum is uncomputable. And since the vast majority of reals are uncomputable (the computable numbers have Lebesgue measure zero), this weird number is somehow more typical than any other real number you've ever used.
doesn't it converge to 0?
@@pedroivog.s.6870 The limit of the sequence itself converges to 0, but we’re talking about the summation of the sequence here.
@@zhg4485 sorry, misread
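For a sense of scale, the partial sum over the known values Σ(1)=1, Σ(2)=4, Σ(3)=6, Σ(4)=13, Σ(5)=4098 is easy to write down (a sketch; every term beyond k=5 is unknown, and past some point unknowable):

```python
from fractions import Fraction

known_sigma = [1, 4, 6, 13, 4098]                  # Σ(1)..Σ(5); nothing beyond this is known
partial = sum(Fraction(1, 2**s) for s in known_sigma)
print(float(partial))                              # ~0.5782470703125, plus an uncomputable tail from k >= 6
```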
@@artemdown6609 Yeah, I guess we could say that there is some precision which we will never be able to achieve with e, but this seems even less achievable, since there is no computable algorithm which approaches it. I guess if it had a finite amount of digits, then it would be technically computable by hardcoding the digits into the states, so it should have infinite digits.
@@artemdown6609 OP is talking about something else. e is a computable number, since there is an algorithm which approximates it to arbitrary precision. This BB-sum, however, is uncomputable: there is no algorithm for even approximating it beyond a certain degree. Huge difference.
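To make "approximates it to arbitrary precision" concrete, here is a minimal sketch of such an algorithm for e, using the factorial series and its tail bound; nothing like this can exist for the BB-sum:

```python
from fractions import Fraction

def approx_e(k):
    """Return a rational q with |q - e| < 2**-k, witnessing that e is computable."""
    eps = Fraction(1, 2**k)
    q, term, i = Fraction(0), Fraction(1), 0   # term = 1/i!
    while 2 * term >= eps:                     # the not-yet-added tail of sum(1/i!) is < 2*term
        q += term
        i += 1
        term /= i
    return q

print(float(approx_e(50)))                     # 2.718281828459045
```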
5:40 BB(5) is solved now! The longest-running 5-state machine takes 47,176,870 steps, and the most 1s written is 4,098.
One of the best videos I've seen on this topic. What a great, simple explanation without treating us like idiots.
I have not had my mind blown to this scale ever. Quantum mechanics used to be the biggest mindfuck for me but this definitely takes the cake now. I swear we must be doing math wrong.
Lol yea complexity theory is an absolute brain buster. Though so is QM..
Going back to quantum mechanics AFTER having learned complexity theory and looking at how it changes in quantum information is also a major mindfuck
We're not doing math wrong, it's just that our fields of math are vulnerable to Gödel attacks
@bosoncolider quantum information and complexity
oh lolol, on point... stunning thought
Busy Beaver is not math, it's meth
technically we don't know if BB(googol) is bigger than TREE(googol)
we can't prove at what point BB overruns TREE
only that it does that eventually
I think it's pretty easy to see that BB(googol) > anything humans can come up with, since you can describe any human algorithm in fewer than a googol states of a Turing machine.
Interesting, yea we land in the same place. By my understanding, the tree sequence is computable. As you point out, we have no clue where the cross over happens.
I'd be quite surprised if it was so high though.. then again, intuition is pretty useless here
I mean, we can construct a good upper bound. Let's say we can make 3 Turing machines:
Turing machine 1 starts with a tape full of zeroes and writes TREE(googol) encoded in binary at a specific known point on the tape (with the head some known relative distance x to the left of the first 1, say), and uses K states to do so.
Turing machine 2 shifts the tape x to the right. Can do anything else (should be doable with K_2
We can prove it by creating a Turing machine which will write TREE(googol) ones with less than googol states, and this can trivially be done.
The TREE sequence can be explained in a youtube video. A youtube video takes up much less than a googol bits. Thus a Turing machine producing the tree sequence can be coded with much less than a googol states. Thus BB beats Tree.
And to say it is not even close would be an understatement. We can't compute the crossover point, but we can easily make upper limits.
Here's a fun (for some definition of "fun") exercise I did once: get a Turing machine simulator and initialize it with one of the leftover 5-state candidates of Busy Beaver TMs where people haven't figured out the long-term behavior yet. You can easily find such simulators and machine descriptions online. And then just run it and watch it. There is something fascinating about running these machines. Sometimes you think you see a pattern, but then you're still perplexed as to what they're doing with their 5 little states. And you wonder. Is this it? Am I already looking at the border between the computable and the noncomputable? Or more dramatically, as you watch the little 0s and 1s flash by, you wonder: am I now already standing at the abyss that separates what is fundamentally knowable from what is fundamentally unknowable? And you know you might already have plunged into it, without knowing it. Because at some point, there is no knowing it.
"for some definition of 'fun'" might just be the most mathematician way to describe an activity I've ever read.
What's also really crazy is that a recent bachelor thesis reduced the bound of Busy Beaver to 745! One off from the Riemann Hypothesis machine! Meaning if we manage to reduce the bound by 1, we will have proven that RH is INDEPENDENT of our current understanding of mathematics; neither provable nor disprovable!!
The name of the person is Johannes Riebel, the thesis is called "bachelor-thesis-undecidability-bb748". Scott Aaronson briefly mentions it as well in blog post 7388.
I'll check it out - thanks!
This does not make sense. I assume the bound you are talking about is for BB(5) which is irrelevant to the RH machine. If you had an upper bound for BB(744) then that would decide the RH, not prove it undecidable. You just run that program and if it doesn't halt by the time it reaches your upper bound that would prove it doesn't halt.
I've looked at the paper now. The bound is on the first value of the BB sequence that is uncomputable. I still don't see how lowering that bound would be relevant to RH. Showing that there exists an undecidable 744 state machine wouldn't say anything about the specific machine that's equivalent to RH.
which bound?
Well, no, if we reduce it by 1, it is still possible that the Riemann Hypothesis has a counterexample.
Just because BB(n) isn't computable, doesn't mean every n-state Turing machine cannot be shown to halt.
Example: it's easy to prove that any machine whose A0 instruction does not change the state never halts (on a blank tape it just keeps executing A0 forever), so that machine's halting behavior is decided.
I loved this video for 2 reasons: nice clear explanation, and no background music.
I find the Busy Beaver function fascinating because it shows that there is a whole world of mathematics that our current math is not yet able to comprehend, and every day our horizon of conceivability broadens a little bit more in this infinite sea of undiscovered knowledge.
Mathematicians must have felt the same way thousands of years ago, like when the Pythagoreans learned about irrational numbers.
@@thomasb44223.14159..26535… oh dear Neptune, *it just keeps going!!!* 🤯
This comment is poetic in a way. Like Daedalus and Icarus looking out over the (to them, infinite) sea after escaping the massive and complex labyrinth Daedalus himself had built, and seeing how much larger it is than his creation. How much more there was out there.
"Not yet" is misleading there are things about math we cannot know and we will not know. It has been proven that not all theorems can be proven no matter how hard we try.
@@thomasb4422 there are no feelings in mathematics.
Whoa - this is mind-blowing! Up to now, I used to think about making up algorithms for producing insanely big numbers (like in the Numberphile videos mentioned at the start) as rather boring child's play - like: who can name the biggest number (always the child who answers last, of course - so what's the point, I thought). But I would not have thought that there is so much more depth to that. Like the existence of non-computable numbers and their connections to famous unsolved problems like the Riemann hypothesis.
Almost every number that exists is a non-computable number. We do not know what they are, we can not name them, we can't construct them, we can't work with them, because our math tools are computation and algorithms (hence, non-computable), but we know they exist out there, beyond the boundary of computation and that their space is so much larger that it is as if all our computable numbers are mere footnotes relegated to a corner of a page in comparison...
@@tylisirn What does it mean for a number to exist? If we "don't know" that number, in what sense does it exist?
@@kburtsev They exist because the number line is continuous; it has no gaps. The computable numbers, however, leave gaps in the number line, so there have to be non-computable numbers filling those gaps. And compared with the numbers we can compute, there are far more gaps left in the number line than filled points.
We do know a few individual non-computable numbers and some of their properties, but obviously can't compute their exact values. (For instance Chaitin's constant, which is a family of non-computable constants from algorithmic information theory.)
@@tylisirn Given infinitely many bits, all numbers, even ones that aren't computable, can be stored; finding the non-computable numbers is the tricky part.
@@tylisirn luckily all the numbers that anyone actually cares about, except for the busy beavers, are computable
On the topic of computability, I personally suggest you also take a look at Kolmogorov complexity, which has strong links to learning theory and AI via Solomonoff induction and AIXI.
It clarifies a lot about what learning actually means; even though Kolmogorov complexity is uncomputable, you still obtain a framework that works well if you have a computable approximator, like a large language model calculating entropy!
You have me figured out - Kolmogorov Complexity is on my list. I haven't refined my perspective on it yet though.. sounds like you have an interesting angle. I may message you on Twitter when the time comes to create the vid
@@Mutual_Information Sure! Would be happy to help you back, your RL playlist was really useful!
@@mgostIH yo are you the geometry dash hacker person
why should he take a look at it?
@@mrosskne His other videos are about machine learning; that's why
that is an appreciably monstrous rate of growth
lol yea.. pretty much any description in words feel like an understatement.
Whoa ! This channel is such a Gem
The 3b1b of Computer Science.
First time seeing a Turing machine in Manim. Grant would be proud!
Thank you! That's quite a compliment! I'm happy to look as high quality as Manim. However, I'm not actually using Manim.. just an Altair-based personal library I've built. It's not nearly as polished as Manim.
This is the best explanation on Busy Beavers, hands down. Thanks
I love the name. It's kinda like the function is so much higher than us that it doesn't even understand our concepts of prestige and dignity. It's just toodling around, doing its thing
I can see the charm in it
Your explanation of a Turing machine is about the best I've ever seen.
It's worth pointing out that other authors define the busy beaver as the maximum number of _steps_ that a halting n-state Turing machine can take, rather than the number of ones that it can write. E.g. Scott Aaronson uses the former in the paper you linked to, calling it "the most natural choice" (at the bottom of page 2).
Yea, this divide is unfortunate. I'll be introducing the shift function in the next video.
@@Mutual_Information Divide? Unfortunate? By 'divide' do you mean 'error'? Why don't you correct it?
@@PeterMeadows42 Because it's not an error. It's a preference that has people split: some prefer to deal with the shift function and others prefer the ones function. They are both called Busy Beaver. I spoke with people on both sides, and I just went with the historical version of "Busy Beaver", which is the ones function.
@@PeterMeadows42 Yes, it's an unfortunate divide. Are you slow?
Thanks. Computability was my favorite subject in my bachelor's studies. Nice to see it getting posted on YouTube!!!
11:26 That’s why I prefer to think of the maximum shifts function S instead of busy beaver; it’s trivial (though computationally expensive) to go from S(27) to an answer on Goldbach. I think in theory you can do the same with busy beaver, but I’m not sure how.
*casually drops the best explanation of a turing machine i’ve ever heard before digging into the crazy math they actually wanted to talk about*
DJ, your video was both excellent and beautifully presented. I had seen the Numberphile video on the Busy Beaver and don't think I fully appreciated it. Your presentation was so clear -- especially your rendering of the n-state Turing machine definition! Thanks for sharing this content and I look forward to seeing more.
Thank you Greg, much appreciated. Plus good timing, since I'm currently working on the next one!
I just had my computability course. If my roommate wasn't sleeping next door, I would be screaming. What do you mean that we can't make claims about Σ(1000)?? What 🤯🤯🤯🤯
I am in awe right now
Lol I was actually shouting with my brother when I read about this thing. We're not too far apart in our experience!
@@Mutual_Information I will send your video to my brother to watch after work. He is a mechanical engineer though, so I don't know how much it will blow his mind. I know he will be impressed.
On your surprise about patrons: your content is always great and well presented. The topics are rarely covered elsewhere, probably due to the scientific depth, beyond basic information theory. What I enjoy the most is the intuition you provide, such as in the probability distributions video and the Fisher information video, just to give a pair of examples. I would love to see this channel and its supporters grow.
Morpheus : I've seen an agent punch through a concrete wall; men have emptied entire clips at them and hit nothing but air; yet, their strength, and their speed, are still based in a world that is built on rules. Because of that, they will never be as strong, or as fast, as you can be.
Watching this video not knowing what the hell a turing machine even is in this case is a wild ride.
I'm so excited that you posted!!! Your videos are truly one of a kind. Thx to u, I can grasp AI and RL concepts rly well!
thank you! And excellent, that's exactly my goal :)
don't mind me, just gonna binge all your content over the next few days. love this vid
What a fantastic video. I'm not really an expert in this, but I know a lot about it, and I didn't notice anything that was glaringly wrong, and your animation style and delivery are both excellent.
my favorite part of BB is that the mind-blowing fact about its uncomputability is incredibly easy to prove.
Have you tried putting the Y-axis on a LOG-Scale tho?
haha oh duh
I’ve had this in my watch later for a while. Great video! “A shot at the king” and onwards especially
That is a great video; the function you've created (the one like Knuth notation) is impressive. It's crazy to think about how the Busy Beaver sequence would outpace functions with a large number of arguments. We know that there's no limit to recursion arguments, and theoretically, you could dwarf the Ackermann function by adding just one argument alongside (a, b). Remarkably, Loader created a function that grows faster than TREE, using only one argument!
Knuth up arrow involving single digit integers is a scary monster.
Knuth up arrow involving n - digit integers is time for me to take a lie down.
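For reference, the two-argument Ackermann function mentioned above takes only a few lines to define yet already outgrows every primitive recursive function, which is why stacking extra arguments or up-arrows escalates so quickly (a sketch, small inputs only):

```python
import sys
sys.setrecursionlimit(10_000)                 # the naive recursion gets deep quickly

def ackermann(m, n):
    """Classic two-argument Ackermann function; for m >= 3 it equals
    2 raised by (m - 2) Knuth up-arrows to (n + 3), minus 3."""
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))

print([ackermann(m, 3) for m in range(4)])    # [4, 5, 9, 61]
print(ackermann(3, 5))                        # 253; ackermann(4, 2) already has 19,729 digits
```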
The cost of housing grows faster than the busy beaver.
NO, it doesn't.
That was such a good video and I'm a new subscriber now. If you keep producing such good content your channel's growth might surpass the busy beaver's growth.
haha! Talk about high expectation!
@@Mutual_Information A few subscribers today, enough to collapse the earth into a black hole via information density within a couple weeks.
I don't know how I wandered into math YouTube, but you do an excellent job presenting such complex information in a way that I, whose mathematical knowledge ends with Calculus 101, am able to comprehend the beauty/insanity of the concept. Great video, thank you!
Awesome Marty - exactly what I'm hoping for
Wow, am I glad to stumble upon your video. Thanks so much for making it, you fascinated the life out of me. The last 3 days I was reading and watching everything I could find about it. Haven't been so excited about math since my early 20s. A function that "breaks" math at
yes exactly! you feel how I feel
You know, studying astrophysics sometimes really drives you insane and makes you question the meaning of existence. Whenever I feel that way, math has been a haven; it has been a simple playground in abstract space. But now I feel the same existential dread from math as well. Thank you very much!
Rewatching this now for the 3rd time. This just keeps getting funnier. Insane topic, hilarious narration. 3? stars
I binged the googology wiki years ago and had my mind blown by the fact that simple algorithms can generate numbers beyond any practical analogy in only 6 or 7 states. But what no one ever thinks to mention is that these massive numbers are actually islands of low entropy surrounded by many integers which can never be expressed by humans. Imagine you have a 10,000-state Turing machine. That's 10^10^4.96 possible configurations, which is much greater than the number of unique numbers that can be expressed, since many of these machines will be redundant or never halt. Imagine you covered the entire surface of the Earth (510,072,000 km^2) with hard drives, with areal storage capacities at today's benchmark of 1.1 terabytes (8.8 terabits) per sq inch, and instead of bits, each unit represents a single state on a Turing machine (up to 1.364*10^22 states in the algorithm). That's 10^10^23.79 possible configurations (see the sketch below for a quick check of these counts).
So let's say 10^10^24 is a ridiculously high upper bound on unique numbers we can represent. This means that most of the numbers, for instance, between 10^10^26 and 10^10^27 cannot be represented without approximating them, in any shape or form even if we covered the whole planet with hard drives.
And yet, we can describe Graham's number in only 19 symbols on a Turing machine.
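A minimal sketch of where those configuration counts come from, assuming the standard count of (4(n+1))^(2n) transition tables for an n-state, 2-symbol Turing machine (2 symbols to write x 2 head moves x n+1 next states, including halt, for each of the 2n state/symbol pairs):

from math import log10

def log10_log10_machine_count(n: int) -> float:
    # log10( log10( (4(n+1))^(2n) ) ): rough count of n-state, 2-symbol machine tables
    log10_count = 2 * n * log10(4 * (n + 1))
    return log10(log10_count)

print(log10_log10_machine_count(10_000))         # ~4.96  -> about 10^10^4.96 tables
print(log10_log10_machine_count(int(1.364e22)))  # ~23.79 -> about 10^10^23.79 tables

Both outputs line up with the 10^10^4.96 and 10^10^23.79 figures quoted above, so the ballpark holds under that counting convention.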
Interesting! If I'm interpreting you correctly, something like BB(20) is described simply (I just did it), but it's so ridiculously huge that virtually every number that might be considered 'close' to it.. can't be simply described.. because we don't have any machine.. or algorithm.. or anything that can describe it. Even doing things like BB(20) - BB(10) or any other combination of operations on familiar objects won't work, because it'll miss the vast, vast majority of numbers in that range.
@@Mutual_Information Yeah, you said it much clearer than me!
Cool video for someone like me who isn't in to complexity theory or math generally. I respect that you hired an expert to make sure you got it right.
Awesome overview. Thanks for the link to my blog post :) So exciting to see more people exploring Busy Beavers. Let me know if you want to collaborate on a future video. I'd be glad to chat about some of the stuff going on in the community :)
you were the 256th comment
I'd say that's a pretty cool number
Wow, great to hear from you! I'd love to chat for the follow up video. Reach me on LinkedIn? Searching "Duane Rich" should do the trick. Twitter DM works too (see description)
Ooh, you reached me. That means your video's going to go far. A layman's explainer.
I also prefer the S(n) definition of the busy beaver, where S(n) is the number of steps the longest-running machine takes before halting. This is more directly useful - if you have S(n) and an n-state machine, you can just run your machine for S(n) steps: if it halted, great; if it didn't, it'll never halt. The "max number of 1s" definition requires another level of indirection to actually be useful - many infinite-looping machines will produce fewer 1s than that, so you can't tell if *your* machine will loop forever unless you're lucky enough to have it produce more than Σ(n) 1s. Instead you have to run *all* machines until every one that has produced Σ(n) or fewer 1s has either halted or been proven to loop forever, then you can use that max number of steps to check *your* machine. That's a lot harder to work with!
The two definitions are equivalent for the purpose of the proof; it's just that S(n) is more directly useful, and it's clearer why it's uncomputable.
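A minimal sketch of that procedure in Python, assuming a hypothetical oracle s_oracle(n) that returns S(n) and a hypothetical step-limited simulator run_for(machine, max_steps). Neither is actually implementable in general (S is uncomputable); the code only illustrates the logic described above:

def halts(machine, n_states, s_oracle, run_for) -> bool:
    # Decide halting for an n-state machine, given an oracle for the max-steps busy beaver S(n).
    budget = s_oracle(n_states)        # longest any halting n-state machine can possibly run
    return run_for(machine, budget)    # True if it halted within S(n) steps; otherwise it never halts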
You're right, and it took me a while to realize this. Exactly, S(n) is useful because you know when to stop running the machine.
I went with Sigma(n) honestly to avoid confusion. It's weird to compare the output of a computable function to the number of steps the TM ran for. I tried both, and both of them brought a burden of extra explanation.
In the follow up video, I'll discuss S(n).
No one actually builds Turing machines, so it doesn't matter
That link between Goldbach's Conjecture and the Busy Beavers is wild!
Oh I saw your post on twitter some weeks ago, didn't expect you'd make a video out of this!
Regarding 11:10, while it's true that in theory knowing about a specific BB number leads to us knowing what machines halt or not, this also requires actually running said machine for that many steps!
So considering BB(27) is already larger than your question mark machinery, you don't really have any chance of accomplishing any of this even if an oracle told you the value.
ha yes totally true! For those reasons, yea, no open problems are gonna get cracked by working on computing BB of anything..
I just about understood that. Certainly enough to feel suitably 'mind blown'. Them's some hefty numbers, you's toting Mister.
Who is the audience for this? General enquiry, not a dig at the channel. I loved this video.
It's actually not too hard to get an intuition for why there can be no computable function that grows faster than BB.
Let's consider the even more absurd function S(x), which counts the largest number of steps any halting Turing machine with x states takes (instead of counting the number of 1s it produces).
By definition, S(x) has to grow faster than any computable function: if there were a computable function growing faster than S(x), we'd have an upper bound on how long we have to wait until we know a Turing machine can no longer possibly halt, and the halting problem would be solved, which is impossible.
Now this doesn't prove that BB(x) also grows this fast, but it would be very surprising if the growth rates of BB(x) and S(x) weren't related in some way.
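The same argument written out as a one-line implication (a sketch; HALT here stands for the halting problem for Turing machines started on a blank tape):

\[
\exists\, f \text{ computable with } f(n) \ge S(n)\ \forall n
\;\Longrightarrow\;
\big(\,M \text{ with } n \text{ states halts} \iff M \text{ halts within } f(n) \text{ steps}\,\big),
\]
\[
\text{which would decide HALT; since HALT is undecidable, no computable } f \text{ dominates } S.
\]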
This totally blew my mind, thanks for making this!
My pleasure!
Mind blowing - and extremely informative!
You should consider submitting this to the Summer of Math Exposition 3, if you haven't already.
I think the reverse is even crazier. Any task can be reduced to one that falls within this busy beaver bound, meaning we have an upper limit on all possible computations (well, we are essentially doing that anyway).
And if you, for example, increase your tape alphabet beyond binary... it can always be done in binary.
The title and thumbnail are a little misleading, because they suggest this line perfectly separates computable from uncomputable, even though there are uncomputable functions that are "below" the busy beaver.
That.. is something I simply didn't think of. Good point indeed.
thank you so much, this explanation of the busy beaver function is the clearest that I've found
That was the best Turing machine explanation I've seen.
3 things to say
1. mind blown
2. are there any imaginable, suitable use cases should the BB of 5, 6, 7, ... be found?
3. Would quantum computing be more suitable for these types of calculations than current computers?
The way you express the mathematical concepts is really entertaining!
10:09 It likely overtakes the weird ? function at around 7 or 8. People have already proved that Σ(6)>15?, and I wouldn't be surprised if one extra state makes the number a lot bigger by adding extra recursion.
If I understood correctly, he was comparing Sigma(n) to n? with two dashes, not sure if you were also referring to that or just the simple question mark
@@MIKIBURGOS I was referring to that, because I said the "weird ? function" not the usual ?
@@zswu31416 There's ?, ? with a dash, and ? with two dashes
This video looks like a teaser trailer to math
I want a full 100 hour version now
I just took a class on computational theory last semester at ASU, and it definitely blew my mind and most of my classmates' minds too. It would be absolutely terrible having to be the one to write the proofs for some of these things! Was really cool to hear you talk about this stuff though; it brought back some of the things I had forgotten about, like the Church-Turing thesis.
Ouch! A new world opens. You brought me to a new frontier I'd never visited.
This was one of, if not the, most interesting videos on math I have ever seen.
Nicely explained, really. The subject is fascinating, like a science fiction movie can be, but when it goes towards "if we can compute something that cannot be computed, we disprove the Riemann hypothesis", then I think it's worth taking a step back so things don't turn into navel-gazing, like any area where something gets more and more theoretical (string theory, for example) and less connected to what I hope is the actual point of science, especially in times like these: applying its knowledge to improve society and people's lives, not just "do something because we can".
This has been a realization of mine after spending a lot of time on things like this. It's interesting, satisfying, comfortable, but in the end rather pointless, as a few world champions have described after their pursuits have ended. We need the connection to reality, so we can see the ever-increasing questions and lack of answers as a hint, instead of pushing forward in vain hope.
Really interesting topic, thanks for making a video on it!
Also, weird that there was no mention of compression, since the 4-state busy beaver is the shortest way to express "make 13 ones" computationally.
Indeed.. Compression is another angle I'd love to cover.
One of the craziest videos I have seen about math. Thanks for creating it.
the only function larger than the busy beaver is the number of times my mind was blown after n seconds of this video
youtube's autoplay has brought me here... I have no idea what's being said, but i can't stop watching...
Here's a followup question I have on the Halting problem:
Let A be an algorithm that receives
- any state table, and
- any finite input tape
and can output either
- that the Turing machine halts,
- that it doesn't halt, or
- that A doesn't know whether the Turing machine halts,
and it is not allowed to be wrong about the first two options.
Then, what portion of the possible inputs, as input size goes to infinity, can A compute (i.e. output that it halts or doesn't halt)? If the percentage is zero, how does the computable portion compare to the frequency of prime numbers, squares, or powers of 2 in the integers?
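One way to picture such an A, as a toy sketch in Python. The simulator hooks passed in (initial_config, is_halted, snapshot, step) are hypothetical placeholders, and a real A could use any sound battery of tests; the point is only the three-valued, never-wrong-on-the-first-two contract:

from enum import Enum

class Verdict(Enum):
    HALTS = 1      # certain the machine halts on this input
    LOOPS = 2      # certain it never halts
    UNKNOWN = 3    # refuses to answer

def partial_decider(initial_config, is_halted, snapshot, step, step_budget=10_000):
    # The four callables are hypothetical simulator hooks supplied by the caller.
    seen = set()                       # full configurations (state, head, tape) observed so far
    config = initial_config()
    for _ in range(step_budget):
        if is_halted(config):
            return Verdict.HALTS
        snap = snapshot(config)        # must capture the *entire* configuration for soundness
        if snap in seen:               # exact configuration repeated => provably loops forever
            return Verdict.LOOPS
        seen.add(snap)
        config = step(config)
    return Verdict.UNKNOWN             # budget exhausted: decline rather than risk being wrong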
Yes
Love these videos about absurdly large numbers. Reminds me of those juvenile days when we argued over who could come up with the larger number. Except these are not even close to anything a small kid could even imagine.
This was the first video I see on your channel and it was insanely cool, I can't wait to see more!
I also appreciate the time you spent on the description of the video with the extra information and references.
and now we have BB(5)
About two minutes in you hit my "I'll-Take-Your-Word-For-It boundary!"
Nice video! It’s worth noting that the existence of the Goldbach's conjecture Turing machine isn't too surprising. I could write a rather simple computer program that simply goes through the even natural numbers in order, exhaustively checking whether each can be written as a sum of two primes, and halts when it arrives at a number that cannot be written as such. This program halts if and only if Goldbach's conjecture is false.
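A sketch of that program in Python (naive trial-division primality, purely for illustration); it returns a counterexample if one exists and otherwise runs forever, mirroring the halts-iff-false machine:

def is_prime(k: int) -> bool:
    if k < 2:
        return False
    d = 2
    while d * d <= k:
        if k % d == 0:
            return False
        d += 1
    return True

def goldbach_counterexample_search() -> int:
    n = 4
    while True:
        # does some prime p <= n/2 have a prime partner n - p?
        if not any(is_prime(p) and is_prime(n - p) for p in range(2, n // 2 + 1)):
            return n    # even number that is not a sum of two primes: Goldbach is false
        n += 2          # otherwise move to the next even number, possibly forever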
While it is true that tying Goldbach's conjecture to a halting machine is not surprising, I think what is more surprising is that both the Goldbach and Riemann hypotheses can be made equivalent to some n-state machine halting, which makes me think probably all unsolved mathematical conjectures are related to some n-state machine.
@@3kelvinhong I agree that Riemann's hypothesis is more interesting. It's a more complex conjecture, which I guess explains the ~1000 states needed to disprove it.
I didn't mean to suggest the video isn't interesting, if that's how my comment came across. Theoretically, any conjecture that one could disprove by iterating through all possible counterexamples can be translated into a halting problem. Like much of mathematics, it's a connection that one doesn't immediately think of, but once you hear it, it seems very natural. The halting problem is hard because solving it would basically give you a shortcut to prove many (all?) such conjectures.
I've seen Numberphile's video on BB a few times, but I didn't really get it until this video.
Great video - first one of yours I've seen!
The picture is misleading. The number of functions has the cardinality of the continuum, while the number of Turing machines is countably infinite. So there are way more uncomputable things than there are computable ones.
glad to see this video exploded in views! have always loved this channel and thought it deserved more viewers !
Surely if one can encode a brute force attack on the Goldbach Conjecture as a 27 state Turing machine so that halting means that it has found a counter example, then this is not that surprising.
What a delightfully wonderful piece of work! Thank you for this video. Keep up the impeccable work.
I'll keep the explanations up, but a topic like the Busy Beavers is hard to repeat..
FYI: BB(5) has recently been found to be equal to 47,176,870
I know! Huge news!
Awesome video, just turned me into a subscriber! From the aesthetics of the video, down to the content and calm presentation you deserve serious recognition. Nice work, this channel WILL blow up :)
Thank you Peter, means a lot and I hope you're right!
The busy beavers are on par with Rayo's number; they are crazy.
Seems like almost the same idea. Rayo's number is the largest finite number that can be defined in less than a certain number of symbols in a particular mathematical language, right? Busy beavers seem like the same thing, just using a different language to define the numbers.
I'm with you to the ends of the world wide web. I'm down your rabbit hole!
Very good video. Do we have some Busy Beavers that could have proven some theories, if only we managed to “compute” them?
Aaronson discusses this. He says.. if we're given the value of Shift(27) (this is the shift function, which is the total number of steps taken by the machine, rather than the count of 1's written), then that reduces Goldbach to a finite procedure. You just run the Goldbach machine for Shift(27) steps. If it hasn't halted by then, you know Goldbach is true! But this is not a practical route at all
@@Mutual_Information Wow... Still it is crazy that it hypothetically would work
Amazing video, comment for the algorithm.
Made my brain melt
The number 744 is prominent in the q-expansion of Klein's j-invariant, namely j(tau) = 1/q + 744 + 196884q + ... which manifests in Ramanujan's constant exp(pi*sqrt(163)) being approximately 640320^3 + 744. Coincidence?
"There are no coincidences in math!" - someone smart I think
I was going to say maybe, but after searching a tiny bit, the next term in the expansion is -196884/640320^3, which makes me a lot more inclined to believe that there is some connection there.
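Spelled out (a sketch of the standard calculation; with tau = (1 + sqrt(-163))/2, so q = -e^(-pi*sqrt(163)) and j(tau) is exactly -640320^3):

\[
j(\tau) = \frac{1}{q} + 744 + 196884\,q + \cdots,
\qquad q = -e^{-\pi\sqrt{163}},
\qquad j(\tau) = -640320^3,
\]
\[
\Rightarrow\; e^{\pi\sqrt{163}} \;\approx\; 640320^3 + 744 - \frac{196884}{640320^3 + 744}.
\]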
@@tonydai782 I think it was related to representations of the monster group, but I don't remember
This might be the definitive mind-blower video on the BB topic that I'll ever see. Now I can close this case in my mind and go find other conclusions in this awesome field.
Crazy that BB(5) was found.
BB(6) is almost certainly impossible though.
This is the video I needed all those years ago
Very cool video, but I'm not getting the "There exists a 27-state machine that halts iff..." part; in that case, can't you just construct a trivial machine that halts on each internal state no matter the input obtained from the tape?
EDIT: nvm I think I got it; there's a SPECIFIC 27-state machine, which halts iff Goldbach's Conjecture is false
Exactly, it just tests all numbers and halts if it finds a counterexample.
I've seen a few videos on busy beavers but this video is the first one that I've actually understood what it's describing. Thanks for that