Memorising Graham’s Number Creates Black Holes | Entropy

  • Published on 31 Jul 2024
  • Entropy is really important... somehow? What exactly is it, and why does entropy appear in physics' most philosophical problems?
    How can Graham's Number collapse my brain into a black hole?
    Notes:
    1. Notice that the definition of "unusable" depends on what we define as useful. Different definitions of "useful" differ from each other by a fixed constant. Usable energy is relative, just like energy itself: only differences matter, and the definition can't change between calculations. Ignoring this is a common basis for pseudoscientific statements, so it's good to be aware of.
    2. F = U - TS is only one definition of free energy: the Helmholtz free energy, which applies at constant temperature and volume. The reason there are many types of free energy is that there are six thermodynamic variables (temperature, entropy, pressure, volume, internal energy, particle number). That's too many degrees of freedom for any one general equation, so commonly used assumptions get us to useful free energy equations. Others include the Gibbs free energy, used in chemistry at constant temperature and pressure.
    3. The Carnot engine is the theoretically most efficient engine possible. Viascience has a beautiful series going into more detail (Thermodynamics 4a is the video that begins on entropy.)
    4. Differential calculus does assume that gases are smooth and continuous, but a small enough particle size would still be best approximated by a differential equation. Part of the reason people were so mad about atoms existing was a philosophical debate on the reality of the universe.
    5. Energy alone doesn’t determine the number of possible microstates. Gas amount, potential (like gravity), electric fields etc. all contribute. This function assumes those are constant/zero, as the value that matters in this derivation is the parameter of total energy.
    6. The function of microstates is just a count of all possible configurations of energy across particles. It doesn’t matter exactly what is being sorted, whether it be baseballs in bins, pigeons in holes, or a gas’ energy over its particles. Therefore, the same function can apply to both gases.
    7. A fair die produces outcomes 1, 2, 3, 4, 5 and 6, each with probability 1/6. The information associated with each outcome is Q = −k log(1/6) = k log 6, and the Shannon entropy is S = log2(6) ≈ 2.58 bits. A weighted die produces outcomes 1, 2, 3, 4, 5 and 6 with probabilities 1/10, 1/10, 1/10, 1/10, 1/10 and 1/2. The information contents of the outcomes are k log 10 for the first five and k log 2 for the last (3.32 bits and 1 bit respectively). The Shannon entropy is then S = k(5 × (1/10) log 10 + (1/2) log 2) = k log √20 ≈ 2.16 bits.
    8. Hawking radiation is not that simply proven. All matter particles are waves because of Quantum Field Theory, which doesn’t take gravity into account. This off-the-cuff argument gives a potential intuition as to whether Hawking radiation might exist, but it took Stephen Hawking and others to prove it in gravitational regimes beyond those where QFT has proven itself.
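    The dice entropies in note 7 are easy to verify numerically. A minimal Python sketch (not from the video; the probability lists are just the two dice above):

```python
import math

def shannon_entropy_bits(probs):
    """Shannon entropy in bits: S = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs)

fair = [1/6] * 6               # fair die
weighted = [1/10] * 5 + [1/2]  # weighted die from note 7

print(round(shannon_entropy_bits(fair), 2))      # 2.58 bits, i.e. log2(6)
print(round(shannon_entropy_bits(weighted), 2))  # 2.16 bits, i.e. log2(sqrt(20))
```

    Note how loading the die lowers the entropy: the outcome is more predictable, so each roll carries less information on average.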
    ------------------
    Timestamps:
    0:00 - Intro
    0:55 - Entropy from Thermodynamics (Free Energy)
    3:06 - Entropy from Statistical Mechanics (Microstates)
    6:40 - Entropy from Information Theory (Shannon Entropy)
    10:35 - Making predictions with entropy
    11:03 - Black hole entropy
    12:13 - Maxwell's demon can't reduce entropy
    14:25 - Numbers have mass
    15:36 - Closing Thoughts
    ------------------
    Music: Mark Tyner - Close To You

Comments • 60

  • @desicmanifold4025
    @desicmanifold4025 11 months ago +10

    As always, immensely impressed by your clarity of explanation and the utter charm you convey it with! The idea of numbers having mass/energy is a concept I'd never heard of before in that context, never going to forget that now.

  • @TheSummoner
    @TheSummoner 11 months ago +8

    That's a great overview of all the different faces that entropy takes! Personally my favorite application of the notion of entropy is the so-called maximum entropy principle, used to decide which probability distribution best fits what is observed about a phenomenon.

  • @tcaDNAp
    @tcaDNAp 11 months ago +3

    It took me a year to get back to both SoME videos on this channel, but now I'm hooked on these deeper connections 🤝

  • @sgtstull
    @sgtstull 8 months ago +1

    This channel is disgustingly underrated.

    • @mindmaster107
      @mindmaster107  8 months ago

      Thanks so much for the kind word!

  • @skadoosh1729
    @skadoosh1729 11 months ago +3

    This is so underrated. Your voice is akin to 3b1b's actually

  • @hedgehog3180
    @hedgehog3180 4 months ago +1

    Slight correction: Carnot did not have the concept of entropy, since he believed in the caloric theory of heat. Therefore, in his original description of the Carnot cycle, the engine takes as much heat, Q, out of the hot reservoir as it returns to the cold reservoir. So he didn't believe that a heat engine does work by extracting heat from a heat difference, and the concept of efficiency, how much work the engine can extract from the heat, did not exist in his conception. He did sort of prefigure the idea of the 2nd law of thermodynamics with his proof that no heat engine can be more efficient than the equivalent Carnot engine, but since he didn't conceive of heat as energy, he also didn't think that the heat the engine delivered to the cold reservoir was lost energy.
    The version of the Carnot cycle you have up, and the formula for the Carnot efficiency, were conceived by Clausius in order to reconcile Carnot with the modern molecular theory of heat; Clausius also coined the term entropy and the most common formulation of the laws of thermodynamics. So he probably deserves a lot of the credit.
    Though one thing that's sorta neat is that Carnot described a heat engine as "something that interrupts the free fall of heat", which is very close to a more modern understanding of heat "falling" from a state of low entropy to a state of high entropy and heat engines accelerate that fall by extracting work from the heat difference.

  • @fluffy_tail4365
    @fluffy_tail4365 11 months ago +4

    Another banger, you can explain certain concepts with an energy that is contagious

    • @mindmaster107
      @mindmaster107  11 months ago +1

      I have the energy of a perpetual motion machine
      Thank you so much for your comment!

  • @momothx1
    @momothx1 11 months ago +4

    i wish i had you as a teacher! you have such a nice way of providing information and you’re passionate about what you do. keep up the good work ❤

    • @mindmaster107
      @mindmaster107  11 months ago +2

      Thanks for the comment!

  • @ClemoVernandez
    @ClemoVernandez 11 months ago +2

    Great to see you back uploading! :)

    • @mindmaster107
      @mindmaster107  11 months ago

      Good to see you comment how happy you are!

  • @cycklist
    @cycklist 11 months ago +2

    Wonderful. Thank you.

  • @matiasnovabaza8208
    @matiasnovabaza8208 11 months ago +1

    This year it happened to me that, looking for videos about entropy, I couldn't find anything that actually explains it. I think you did an excellent job here, thanks

    • @mindmaster107
      @mindmaster107  11 months ago

      This is the first ever super comment I’ve ever got! Thank you very much!
      I don’t make videos too often, but I make sure that each one is a treat worth waiting for.

  • @Susul-lj2wm
    @Susul-lj2wm 11 months ago +2

    i love your voice it makes listening to these concepts so pleasant

    • @mindmaster107
      @mindmaster107  11 months ago +2

      I appreciate the compliment :D
      I’m thinking of doing slightly different formats of videos, for instance a 1 hour lightly edited talk on science history.
      You might enjoy that it seems

  • @docopoper
    @docopoper 11 months ago +1

    I love this channel.

    • @mindmaster107
      @mindmaster107  11 months ago

      I love you.

    • @docopoper
      @docopoper 11 months ago +1

      @@mindmaster107 Awwwww

  • @Jaylooker
    @Jaylooker 9 months ago +2

    Wick’s rotation connects entropy to quantum mechanics by way of statistical mechanics.
    The prime number theorem can be defined using the offset integral Li(x) = ∫ li(z) dz. Notably, Li(x) bounded between 0 and 1 equals −ln 2, like the information content defined at 7:45 and like how probability (and information) were defined as S = k_B ln W at 9:22. Also, Chebyshev’s functions for prime numbers are defined similarly to Shannon’s entropy at 8:38. This suggests the primes follow some entropy law and randomness.
    Thermodynamics, and the dissipation it entails through entropy, have solutions that are described using Gaussians and Fourier series. These solutions generalize to harmonic analysis, automorphic functions, and automorphic forms such as modular forms, and provide a mathematical basis for doing entropy.

  • @arbodox
    @arbodox 11 months ago +3

    Yay, another mindmaster upload! Great video as always, I appreciate your in-depth and clear explanations of physics topics. I'm looking forward to revisiting your videos when I begin my physics undergrad next month. :D

    • @mindmaster107
      @mindmaster107  11 months ago +2

      Good luck on your university journey!
      Remember to enjoy both the learning, and meeting amazing people along the way.

    • @pyrenn
      @pyrenn 11 months ago +3

      Same for me as well! On all the points

  • @arminhrnjic1678
    @arminhrnjic1678 11 months ago +1

    One of the greatest entries for SoME3! Hope you win!

    • @mindmaster107
      @mindmaster107  11 months ago +1

      Thank you so much for the kind words! I feel like I have already watched and voted on 25 entries that are better than mine, but I’ll hold my fingers crossed as you never know.

  • @mohamedmouh3949
    @mohamedmouh3949 2 months ago +1

    thank you sooo much 🤩🤩🤩

  • @dcs_0
    @dcs_0 11 months ago +2

    great video, it serves beautifully as a sequel to the Veritasium video on the same topic, diving more into the actual mathematics behind entropy after getting a feel for what it is, which is insanely satisfying lol

    • @mindmaster107
      @mindmaster107  11 months ago

      Glad you enjoyed it.

  • @hedgehog3180
    @hedgehog3180 4 months ago +1

    You could totally make a video just about thermodynamics because after studying it I've found that most popular understandings are like slightly wrong in a way that critically skews perception. To give some examples:
    The first law is often said to be "energy can neither be created nor destroyed, only transformed". This is not wrong, but it's basically the same idea as conservation of energy, which is just a basic physics thing, not really a thermodynamics thing. The formulation I prefer is "the internal energy of a system is equal to the heat added to it plus the work performed on it", or U = Q + W. I prefer this formulation because it actually makes a statement about work and its relationship to heat, and it clarifies the concept of internal energy as distinct from just heat. All of this is way more useful when doing thermodynamics.
    The second law has a similar problem. The most popular formulation is "the entropy of the universe tends towards a maximum", or something similar, but this formulation kinda says nothing. Like, what is entropy? (I know, wrong video to say that.) And why does it increase? Another, much better formulation is "It is impossible to realize a reversible cyclic process where work is performed by extracting heat from a single reservoir that remains at the same temperature". This of course sounds like nonsense, but if you understand the Carnot cycle it basically boils down to saying "no engine can be more efficient than the equivalent reversible Carnot engine", and that of course means that a heat engine must deliver some amount of waste heat to the cold reservoir. Another formulation that is also somewhat common, and in my opinion pretty good, is "heat cannot flow from a cold body to a hot body without work being performed". You can see how this is equivalent to the other one I liked if you perform a thought experiment where you have a Carnot engine and then some magical substance that can transfer heat from a cold body to a hot body. In that case what you end up with is the cold reservoir remaining at the same temperature while all of the heat energy of the hot reservoir gets turned into work.
    Other than that, actually putting the Carnot cycle in its proper historical context is really interesting. Carnot was trying to improve steam engines, and if you just take the conclusions of the Carnot cycle you can explain basically all the technological developments of the steam engine. Firetubes in boilers are a way to raise the temperature of the hot reservoir; compound expansion engines are a way to let the steam undergo adiabatic expansion for as long as possible and thus get as close to its condensation temperature as possible; and the limiting case of an extremely high number of pistons is basically just a steam turbine, which is why they're so efficient. Some early steam engines had their pistons contained inside the boiler, but this obviously means that there is direct contact between the hot and cold reservoirs, and thus it made the engine less efficient, even though it seems like a smart way to provide insulation. Superheated steam is another effort to make the engines as reversible as possible, since the Carnot engine assumes an ideal working gas and wet steam is very much not an ideal gas (which follows intuitively from the kinetic theory of heat); however, by superheating the steam it does start to act more like an ideal gas.
    Maybe I'm just saying all of this because I just wrote about it but I think it could make for a good video, if I at some point have time myself I'd probably give it a shot.

    • @mindmaster107
      @mindmaster107  4 months ago +1

      Genuinely, make that video!
      I made my videos because I found no one doing it for this level of understanding. If you want to take it to the next level, you have my full encouragement!

    • @hedgehog3180
      @hedgehog3180 4 months ago

      @@mindmaster107 Thanks!

  • @HimanshuSingh-ej2tc
    @HimanshuSingh-ej2tc 9 months ago +1

    loved it

  • @EntropicalNature
    @EntropicalNature 11 months ago +1

    Great video. Just a note on the zeroth law (without trying to be pedantic): it's more a law which defines the notion of temperature and its accompanying scale without the explicit invocation of entropy. It can be stated as: "if a body C be in thermal equilibrium with two other bodies, A and B, then A and B are in thermal equilibrium with one another." Seems like a moot observation, but it ensures we can safely do our maths and calculate things like pressure, chemical potential or other nice thermodynamic properties between systems, and say something useful about them IF the systems are in thermal equilibrium, i.e. have the same temperature.

    • @mindmaster107
      @mindmaster107  11 months ago

      Absolutely! It’s part of why it’s called the zeroth law, as it’s often just assumed and moved on from.

  • @filker0
    @filker0 11 months ago +1

    Though it isn't comprehensive, I think this video is very good at explaining information entropy and how that relates to the heat death of the universe. Unfortunately, this itself contributes to the entropy of the universe, which as far as we know is a closed system, thus accelerating the aforementioned heat death by some tiny amount...

    • @mindmaster107
      @mindmaster107  11 months ago

      Think about how daydreaming speeds up the end of the universe

    • @NiceGameInc
      @NiceGameInc 11 months ago

      If we already know the heat death of the universe will eventually come, why don't we stop right there on the spot and call it a day? In my humble opinion, entropy is still very poorly defined and i won't go into detail here on why i think so.

    • @oosmanbeekawoo
      @oosmanbeekawoo 11 months ago

      @@NiceGameInc Hey, can I ask you to explain how? I know we don't have a definitive definition for Entropy like we have for Work. But the only reason I think Entropy is poorly defined is because I can list all the conflicting definitions of Entropy from textbook to textbook. I can't find an ‘authoritative’ explanation why Entropy is poorly defined. If you have something to say, please do so!

  • @omicrontheta38
    @omicrontheta38 11 months ago +2

    This video was incredible

    • @mindmaster107
      @mindmaster107  11 months ago +1

      I’m glad all my effort was worth it

    • @omicrontheta38
      @omicrontheta38 11 months ago +2

      @@mindmaster107 Entropy has been a big confusion for me for ages and this video was perfect! I am really grateful for all your talent and effort with your videos.

  • @hazimahmed8713
    @hazimahmed8713 4 days ago +1

    Why did you stop uploading? Your videos are very good.

    • @mindmaster107
      @mindmaster107  10 hours ago

      I'm still around, just things are currently busy in my life.
      A video will come one day, don't worry :D

  • @SuperMarioOddity
    @SuperMarioOddity 11 months ago +3

    This is some sci-fi shit

  • @OliBomby
    @OliBomby 11 months ago +2

    Entropy, thermodynamics, probabilities…to a common man like myself, these words mean less than dirt. But in the hands of a physician? Let’s just say, things can get interesting.

    • @mindmaster107
      @mindmaster107  11 months ago +3

      Physics isn't so great? Are you kidding me? When was the last time you saw a subject with such depth and mathematical? Physics puts the game in another level, and we will be blessed if we ever see a subject with potential skill and passion for science again. Chemistry breaks records. Biology breaks records. Physics breaks the rules. You can keep your experiments. I prefer the magic.

    • @petevenuti7355
      @petevenuti7355 11 months ago

      ​@@mindmaster107 breaks the rules? Don't you mean makes the rules?

  • @tcaDNAp
    @tcaDNAp 11 months ago +1

    Is this one more reason that it's impossible to know everything about anything? I hope UpAndAtom enjoys this video

    • @mindmaster107
      @mindmaster107  11 months ago

      This is one more reason :D

  • @2wr633
    @2wr633 11 months ago +1

    I don't get how the number of possible microstates can be maximised; isn't the number of possible microstates of a closed system a constant?

    • @mindmaster107
      @mindmaster107  11 months ago +1

      I didn’t spend as much time as I should have on this part of the video, so I’ll go into detail here.
      Microstates absolutely, for a given amount of energy and a closed system, should be a constant. However, we can sub-divide groups of microstates even further into microstates corresponding to a set energy, volume, etc.
      In the video, I assumed since the two gases were being mixed, we could ignore all these other factors, since both gases shared the same conditions. This is absolutely NOT assumable for the general case, which is why the full derivation is much more thorny. You can find a video from Viascience on this, as his videos are a beautiful showcase of the mathematics.
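      The "fixed count for fixed energy" point above can be made concrete with the stars-and-bars count from note 6: spreading E indistinguishable energy quanta over N distinguishable particles gives C(E+N−1, N−1) microstates. A toy Python sketch (an illustration under those assumptions, not the video's full derivation):

```python
from math import comb

def microstates(quanta, particles):
    """Number of ways to spread `quanta` indistinguishable energy units
    over `particles` distinguishable particles (stars and bars)."""
    return comb(quanta + particles - 1, particles - 1)

# Fixed total energy and particle number => a fixed microstate count.
print(microstates(10, 3))  # C(12, 2) = 66
# More energy to share => more microstates, hence higher entropy.
print(microstates(20, 3))  # C(22, 2) = 231
```

      The count only changes when a constraint (energy, volume, particle number) changes, which is exactly why mixing two gases, which relaxes a constraint, raises the total.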

  • @lake5044
    @lake5044 11 months ago +1

    Can someone help me? I got curious about the maximum bits that can be stored in a 1kg mass. I used the equations here 15:11 (replacing ln(10) with ln(2)), and I got that about 4456 TB is the maximum in one kg of mass. Hmm, is that correct?

    • @mindmaster107
      @mindmaster107  11 months ago +1

      Yep! Keep in mind that the maximum number of bits increases with mass squared, so it isn’t a directly proportional relationship.

    • @mindmaster107
      @mindmaster107  10 months ago

      I feel like I should go more into detail here.
      Increasing the volume within which information can be stored itself increases entropy.
      You can imagine increasing the volume of a cloud of gas.
      Therefore, while it is correct that a 1kg black hole will store that amount of data, any storage device larger than that could eclipse it.
      My argument of minimum mass holds up when considering larger numbers, where the black hole radius is larger than human scales, say Graham's number, which is much larger than Earth.
      For 1kg, the black hole is smaller than a proton, so we have a bit of wiggle room there.

    • @lake5044
      @lake5044 10 months ago

      @@mindmaster107 What do you mean by "storage devices larger than the volume of a 1kg black hole would eclipse it"?
      Also, does the max information in a black hole depend on its mass? Its surface area? Or volume? Meaning, I assumed that the max info in a volume is the same as the info of a black hole of the same volume, but is that true?

    • @mindmaster107
      @mindmaster107  10 months ago +1

      @@lake5044 For a given volume, absolutely the maximum information that can be stored is a black hole of that size.
      For a given mass, because spreading out mass increases entropy, and 1kg gas can take up loads more space than a 1kg black hole, this relationship isn't strictly valid.
      It still validates the fact that memorising numbers generates heat.
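      The 1 kg figure in this thread can be sanity-checked from the Bekenstein–Hawking entropy, S/k = 4πGM²/(ħc), divided by ln 2 to convert nats to bits. A rough Python sketch with rounded SI constants (so the ballpark, not the exact 4456 TB, is what to compare against):

```python
import math

# Rounded SI constants; the last digits are approximate.
G    = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.055e-34   # reduced Planck constant, J s
C    = 2.998e8     # speed of light, m/s

def black_hole_bits(mass_kg):
    """Bekenstein-Hawking entropy in bits: S/k = 4*pi*G*M^2 / (hbar*c),
    divided by ln 2 to convert from nats to bits."""
    return 4 * math.pi * G * mass_kg**2 / (HBAR * C * math.log(2))

bits = black_hole_bits(1.0)
print(f"{bits:.2e} bits, roughly {bits / 8 / 1e12:.0f} TB")
```

      This lands within about 10% of the 4456 TB quoted above, with the gap plausibly down to rounding of constants; the M² scaling mentioned in the replies is visible directly in the formula.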

  • @johanyim3097
    @johanyim3097 8 months ago +1

    Your math warning was 4 minutes and 17 seconds too late into the video