Entropy is NOT About Disorder

  • Published on 15 Sep 2024
  • Entropy is often taught as a measure of how disordered or how mixed up a system is, but this definition never really sat right with me. How is "disorder" defined and why is one way of arranging things any more disordered than another? It wasn't until much later in my physics career that I learned a much better way of thinking of entropy: in terms of information.
    In statistical mechanics, we trade exact descriptions of systems for descriptions based on probabilities and average states. However, in doing so, we sacrifice information about the precise state of the system, so we introduce entropy to account for this ignored information. With this in mind, we can derive a relatively simple expression for the entropy corresponding to the large-scale behaviors of a system (the expression is written out just below this description). Once we do that, we can apply it to a simple system to make sure that our definition works as it is supposed to.
    As it turns out, this definition of entropy has applications beyond thermodynamics. It is often used to study quantum entanglement, and when applied to black holes, it sparked the search for a quantum theory of gravity.
    Check out Higgsino Physics's video on entropy: • Entropy Visually Expla...
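
    For reference, the expression referred to above is presumably the standard Gibbs/Shannon form that also appears in the comments below (a sketch, written with Boltzmann's constant k_B and natural logarithms):

    S = k_B * sum_i [ p_i * ln(1/p_i) ]

    where p_i is the probability of finding the system in microstate i. When all N microstates are equally likely (p_i = 1/N), this reduces to Boltzmann's S = k_B * ln(N).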

Comments • 89

  • @zapphysics
    @zapphysics  5 years ago +6

    Wow, you guys are awesome! Looks like I have a few more people to try to impress with the next video lol. Thank you so much for all of the support!

  • @mtbabels
    @mtbabels 5 years ago +22

    Physics undergrad over here. This. Was. Crystal. Clear. Thanks sir.

    • @zapphysics
      @zapphysics  5 years ago +3

      That's great to hear! Glad it helped!

    • @Ryuuuuuk
      @Ryuuuuuk 5 years ago +2

      Also a reminder that you'll learn this in more detail during your statistical mechanics class.

    • @Jaggerbush
      @Jaggerbush 3 years ago

      I appreciate this, however I found this even more confusing

  • @itsawonderfullife4802
    @itsawonderfullife4802 several months ago +1

    To see why order/disorder is also a good way of understanding entropy, one has to think carefully, even philosophically, about what order and disorder mean:
    Order ultimately reduces to low information: when every entity in our system can be described with a few parameters in a formula (i.e. low information), they are all obeying a universal, system-wide rule/pattern/law. That commonality/correlation between the entities' properties is the source of the low information (order). So "one law/pattern to rule them all", "one formula to describe them all" means low information (low entropy).
    Disorder fundamentally means a high amount of information (high entropy): every entity in our system is doing its own thing, not governed by a universal law/rule/pattern or parameter, so we need lots of individual pieces of information to capture the whole system.
    I hope that was helpful.

  • @ThisisBarris
    @ThisisBarris 5 years ago +4

    Brilliant video man! Love that you're filming as you draw; it makes your videos much more dynamic. And when I learned that entropy wasn't disorder, it blew my mind, but it made so much sense - after all, there is absolutely no non-human appreciation of disorder or order!
    Congrats on the video gaining so many views man, it deserved it. Almost at 400!

    • @zapphysics
      @zapphysics  5 years ago +1

      Thanks! Glad you enjoyed it! I definitely agree that the filmed drawing makes it a lot less static and jumpy

  • @Hermetic7
    @Hermetic7 2 years ago

    This was very interesting. Came over here from Curt Jaimungal’s TOE channel at his recommendation. Specifically curious about entropy since Curt mentioned it with respect to Thomas Campbell’s big TOE. However, it seems to me your explanation bolsters Tom’s theory, as he views consciousness as an information system. And entropy in his model specifically relates to ignored or lost information (a simple model of 1s and 0s being more random with less meaning vs. more ordered into patterns with meaning, i.e. when an entropic cause increases you can have file corruption in a digital system). I thought that the idea of “zero” entropy is an impossibility, however…being that you can only approach it asymptotically but never actually get there…hence you can speak of this going on forever as long as there is work being done to reduce entropy in the system. Your explanation is perfectly elegant. Thank you.

  • @daniel.scheinecker
    @daniel.scheinecker 5 years ago +8

    Probably the best and clearest explanation I’ve heard so far!
    Good job 👍🏻

    • @zapphysics
      @zapphysics  5 years ago

      Thank you! I'm glad you enjoyed it!

  • @ReidarWasenius
    @ReidarWasenius 2 years ago

    Well said! Great crystallisation. Thank you for this video! Greetings from Finland.

  • @rovrola
    @rovrola 5 years ago +9

    When the dice are coloured, it seems to me we could have two ways of 3+3. Why not so?

    • @zapphysics
      @zapphysics  5 years ago +7

      (Holy cow, you're totally right, that's a mistake on my part! Thanks for pointing that out!!) (original comment). Actually, I take that back: there are only five states, because there is still only one state corresponding to both dice being threes. Each die only has one three on it, so I only get one state. I get two states for the others because, for example, each die has both a four and a two, so by switching the number on each die, I get a distinct state. Sorry for the confusion!
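
      A quick way to check this counting for yourself (a minimal sketch in Python, not from the video; it just enumerates ordered rolls of two distinguishable dice and keeps those summing to 6):

      from itertools import product

      # Ordered outcomes (die A, die B) of two distinguishable six-sided dice whose sum is 6.
      states = [(a, b) for a, b in product(range(1, 7), repeat=2) if a + b == 6]
      print(states)        # [(1, 5), (2, 4), (3, 3), (4, 2), (5, 1)] -- five states, only one (3, 3)
      print(len(states))   # 5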

    • @rovrola
      @rovrola 5 years ago +2

      @@zapphysics Ah, yes! That makes sense.
      Great vid btw.. I'm familiar with the multiplicity of microstates definition, but your epistemic angle makes it really clear.
      Also, can I have my cow back?

    • @StaticBlaster
      @StaticBlaster 4 years ago +1

      @@rovrola lol 😂

    • @sinekonata
      @sinekonata 4 years ago +1

      @@zapphysics
      OMG you're right ! *sweating bullets*
      OMG no I was right ! *sweet relief disseminating throughout your body*

    • @zapphysics
      @zapphysics  4 years ago +1

      @@sinekonata lol, perfect summary of what happened

  • @kaybei9771
    @kaybei9771 2 years ago +1

    @ZAP Physics Where did you learn your physics? I have seen a few of your other videos, like the one where you use symmetries to justify Newton's equation. I really like your style where you reason what the form of the equation should be based on some observations, just like in this video where you reason out the form of the entropy function. Is this how theoretical physicists work? I wonder how common it is for physicists to guess at their equations. I'm wondering if there are textbooks you learned from which also follow this approach to doing/practicing physics. It makes more sense to me, and I find it really shows how these laws could have been guessed at. So if you know any books which show this approach at the undergraduate level, such as a first-year general physics book, or perhaps even some problem-solving books... that would be AMAZING! :) Thanks.

    • @zapphysics
      @zapphysics  2 years ago +2

      @Kay Bei thank you very much for saying this, I really appreciate it!!! To quickly answer your question: for me personally, I got a lot of inspiration for how to build equations from intuition from Steven Weinberg's books on QFT, but I would say those are probably more graduate-level. Weinberg has some other textbooks aimed at a more undergraduate level, but I haven't read these as much, so I can't necessarily say how much he emphasizes this idea. Weinberg's books are also somewhat notorious for being a bit...dense...so depending on where you are at, they may or may not be a good fit.
      Feynman was also somewhat famous for his intuitive explanations of physics concepts, so one place to start is maybe his lectures on physics which are available for free on Cal Tech's website (www.feynmanlectures.caltech.edu/). I generally really like these, especially for a first-year undergrad, though I don't always love the order in which he talks about things.
      More generally, I really think that this is a skill that comes from practice and it definitely isn't easy at first (at least it wasn't for me!). I think that the best advice I can give for someone starting out is to think of the math as a different language instead of a box that you plug stuff into and get an answer out of. A good exercise to go through is to try to "translate" the equations that you use into regular language, extracting as much information as you can. For example, take Newton's law of gravity, F_r = -G*m*M/r^2. You could say something like: the force of gravity between two masses is always attractive, and its strength decreases as the separation increases. We can also notice that it *only* depends on the separation (not the direction, meaning we have a symmetry under rotations) and weights the masses feeling the force evenly, so that the two masses feel the same force, which we know must be the case from Newton's third law. So you see that there is a lot of information wrapped up in the math! An important thing is to not just re-state the equation in words, so don't just say "the force of gravity is inversely proportional to the square of the distance between two objects." While this is of course correct, it doesn't really tell you anything more than if you were to just look at the equation.
      Once you start to get comfortable with translating math to everyday language, it becomes easier to go the other direction: you can start to write down your own equations just based on the properties you want them to have. This is definitely something common in theoretical physics and how a lot of theories/models are built from the ground up!
      Another thing which might help for inspiration is to learn a bit about the history of physics and the stories behind some of the equations you see. A lot of the time, the Wikipedia page about an equation typically has some information on this as a starting place. This can often help to de-mystify the equations to show that they don't just come out of thin air!

  • @Ryuuuuuk
    @Ryuuuuuk 5 years ago +3

    I can't see where disorder is subjective. Disorder - meaning the lack of information - is an absolutely equivalent property, just "inverse".
    E.g. if you have a perfectly ordered system you've got maximum information about it and thus minimum entropy. On the other hand it's just a hand waving way to explain it to people who don't study physics or something related.

    • @zapphysics
      @zapphysics  5 years ago +4

      One big problem with this is that sometimes, what we think of as "disorder" or "mixed-up-ness" doesn't necessarily align with lacking information. The two best examples are the ones that I addressed in the video: black holes and entanglement. Black holes are incredibly simple systems with hardly any degrees of freedom (all you need is the mass, charge and angular momentum to completely classify it). However, they can have huge amounts of entropy, which makes sense from the information perspective (we can't see past the horizon from the outside, but that information is still accessible if we jump into the black hole), but not so much from the disorder perspective.
      Similarly, when two particles become entangled, when we consider just one of the two, that particle will have non-zero entropy. So now, we are talking about a single particle (the most ordered a system could possibly get) with entropy. Again, this makes sense from the point of view of information, because it tells us that there is more to the system (the other particle) that we aren't taking into account.
      Sure, we can arbitrarily define disorder however we want, but if we are just going to define disorder in terms of information, why not just use information? I guess my point is that information is no more complicated than the idea of "disorder," so why not make our hand-wavy explanations a little more accurate?
      (P.S. sorry for the rant lol, and I appreciate your comment!)

    • @Ryuuuuuk
      @Ryuuuuuk 5 years ago

      @@zapphysics Thanks for the detailed answer, nothing to be sorry for.

    • @NightmareCourtPictures
      @NightmareCourtPictures 2 years ago

      @@zapphysics I agree. People always confuse this concept with "chaos," saying the universe is heading toward "more disorder and more chaos," which also makes no sense and leads to more confusion about "chaos theory" and other theories related to complexity in system evolution, like complexity theory.
      Cheers

  • @shadowqcd3197
    @shadowqcd3197 2 years ago +1

    Great video, love your teaching style! One question though: for the scenario with identical dice, why aren't there only 3 states each with probability 1/3? My thinking is that for the dice to be identical, they must obey either bosonic or fermionic statistics, and in either case there are 3 equally likely states that give a sum of 6, e.g. (33), (24)+(42), and (15)+(51) for bosons

    • @zapphysics
      @zapphysics  2 years ago

      @ShadowQCD Good question! First and foremost, here I assumed that we are working with classical dice, so we won't get such superpositions since the dice have a definite history despite them being identical.
      However, if we want to talk about quantum dice, we really only have one state (the combined state of the dice) where each combination of values that each die takes is equally likely. After measuring the sum of the dice, we collapse the state onto the sum-6 subspace where each outcome is equally likely, i.e. (forgive the formatting)
      |sum = 6>_b = 1/√5*(|1, 5> + |2, 4> + |3, 3> + |4, 2> + |5, 1>)
      for bosonic dice and
      |sum = 6>_f = 1/2*(|1, 5> + |2, 4> - |4, 2> - |5, 1>)
      for fermionic dice. So you see that for the bosons, there are still five equally likely outcomes for individual measurements of the dice while for the fermions there are only four possibilities since the likelihood of measuring |3, 3> is zero due to anti-symmetry under exchange. Notice that we don't include any restrictions other than symmetry of the wavefunction since all we know is the total sum of the dice.
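
      If it helps, here is a small numerical check of those two states (a sketch assuming only numpy; ket, psi_b, and psi_f are hypothetical names for the product-basis vectors and the two states written above):

      import numpy as np
      from itertools import product

      dim = 6

      def ket(i, j):
          # Product-basis vector |i, j> for two six-level "dice" (faces 1..6).
          v = np.zeros(dim * dim)
          v[(i - 1) * dim + (j - 1)] = 1.0
          return v

      # Bosonic (symmetric) sum-6 state: five equally likely outcomes, including |3, 3>.
      psi_b = (ket(1, 5) + ket(2, 4) + ket(3, 3) + ket(4, 2) + ket(5, 1)) / np.sqrt(5)
      # Fermionic (antisymmetric) sum-6 state: |3, 3> drops out by antisymmetry.
      psi_f = (ket(1, 5) + ket(2, 4) - ket(4, 2) - ket(5, 1)) / 2

      for name, psi in [("bosonic", psi_b), ("fermionic", psi_f)]:
          probs = {}
          for i, j in product(range(1, 7), repeat=2):
              p = float(abs(psi @ ket(i, j)) ** 2)
              if p > 1e-12:
                  probs[(i, j)] = round(p, 3)
          print(name, probs)
      # bosonic: five outcomes at probability 0.2 each; fermionic: four outcomes at 0.25 each.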

  • @quahntasy
    @quahntasy 5 years ago +1

    Came from Reddit. This is the best explanation of entropy I have heard in a long time.
    Thanks for this video.

  • @Higgsinophysics
    @Higgsinophysics 5 years ago +2

    1:39 haha that image is perfect! I want to have a cup with that.. Anyways thank you for the shout-out! Brilliant video

  • @Exodust145
    @Exodust145 5 years ago +2

    Thank you thank you! This clears so many things. Would you mind covering Maxwell's demon and what is known about it till now? Thanks!

    • @zapphysics
      @zapphysics  5 years ago +1

      I would definitely be interested in making a video on Maxwell's demon! I have a few videos in line right now to make, but after that, I certainly could!

  • @OmniscientTroll
    @OmniscientTroll 5 years ago +1

    There are many functions such that f(1) = 0 but the limit as x approaches 0 from above is -∞. Why do we want to use a logarithmic function over these? For example, I could take -log(1-log(x)) or 1-1/sqrt(x)

    • @zapphysics
      @zapphysics  5 years ago +1

      That's a good question! This video was in no way intended to be an extensive proof on why we have to use this function and you're exactly right that it did not rule out every possible function besides the one used. I just more wanted to point out that the logarithmic function used satisfies all the constraints that we need it to satisfy for it to be consistent with our desired definition of entropy, and that is a lot easier to see when you can see how something doesn't satisfy the constraints.
      To answer your question, I'm not actually sure why these functions are ruled out. I would bet that other constraints arise, such as a need for slow divergences, that would whittle down the possibilities, but I don't know for sure. Or maybe, since entropy isn't really a measurable quantity, we just use the simplest equation that works with all of our requirements and call it a definition.

  • @xemlfvrx
    @xemlfvrx 5 years ago +1

    Great video! Do you consider creating a video explaining entropy in quantum entanglement in the future?

    • @zapphysics
      @zapphysics  5 years ago +1

      Hopefully, yes! The problem is that it can be a somewhat hairy topic, so it will require a lot of thought on how best to go about it. But it is definitely a very interesting topic that I would love to talk about!

  • @imh3r3now1
    @imh3r3now1 5 years ago

    What a great video!
    A ridiculous nitpick, but I definitely was sad when at 1m in your micro variables didn't line up with their corresponding macro variables (position->density, kinetic energy to T, etc.)

    • @zapphysics
      @zapphysics  5 years ago +1

      Haha I didn't even think of that! That would have been much better!

  • @LuisMateusReis
    @LuisMateusReis 5 years ago +1

    Can anyone add subtitles? For those who aren't native English speakers. Thank you.

  • @SweetieDancer24
    @SweetieDancer24 5 years ago +2

    If entropy can be defined in two ways (the logarithmic equation and the polynomial function), shouldn't they give the same outcome? Why would the polynomial function be used at all if the limit doesn't make sense?

    • @zapphysics
      @zapphysics  5 years ago +6

      Ahh yeah, I can see how that could have been confusing. The short answer is that the polynomial function isn't used. I more just brought it up to show that the logarithmic function wasn't the only function that satisfies the first four requirements on the definition of entropy. But then, when all of the probabilities are equal, we introduce a fifth requirement that entropy should blow up for a large number of possible states. Taking all five of these requirements into account tells us that the logarithmic formula is the correct one.
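
      For concreteness, a quick check of that fifth requirement in the equal-probability case, using the S = sum_i p_i * ln(1/p_i) form that appears elsewhere in this thread (a sketch, with k_B dropped): if there are N equally likely states, then p_i = 1/N and

      S = N * (1/N) * ln(N) = ln(N)

      which grows without bound as N gets large, exactly as the fifth requirement demands.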

    • @SweetieDancer24
      @SweetieDancer24 5 years ago

      @@zapphysics Oh I see, that makes much more sense! Thank you!

  • @justinshadrach829
    @justinshadrach829 2 years ago

    I'm not understanding..... 😞 Could you conclude, then, that while entropy is chaos, there is also a power of order, and so complete entropy doesn't exist? But the appearance of order means we have an understanding of some aspects of entropy, and so it seems less

  • @alienzenx
    @alienzenx 3 years ago

    Disorder is easy to define I think. It is just the amount of information required to describe a system. Entropy is the lack of information about a system. So they are very closely related. The lack of information cannot be larger than the total information in a system.

  • @danielchong8222
    @danielchong8222 5 years ago +2

    Every time I read/watch about entropy I have an aha moment

  • @mohamedmouh3949
    @mohamedmouh3949 4 months ago

    thank you sooo much 🤩🤩🤩

  • @danbhakta
    @danbhakta 5 years ago

    Entropy is described as a "tendency" towards disorder. However, it does not preclude the possibility, however slim, of a unique state or situation that leads to a more ordered state. Entropy is probabilistic; therefore a system can arise that tends towards order.
    I.e. a cyclical universe can still occur.

    • @NightmareCourtPictures
      @NightmareCourtPictures 2 years ago +1

      Ya. Look up Entropy Bounce. Cyclical universe models exist based on this premise.

    • @danbhakta
      @danbhakta 2 years ago

      @@NightmareCourtPictures Thx for the trip down memory road.

  • @SheffieldMarkE
    @SheffieldMarkE 5 years ago +1

    Does this support a system collapsing into a BEC? I don't see how the polarity of the equation can be rationalized, as that shows the condensate at higher entropy.

    • @zapphysics
      @zapphysics  5 years ago

      So as a disclaimer, I'm going to say that I'm not a condensed matter person, so definitely take this with a grain of salt.
      However, this formula should be consistent with a system condensing. The probability for the system to be in a given state decreases exponentially with the energy of that state divided by the temperature. So as I turn down the temperature, I'm also turning down the probability of the system being in any state other than the ground state. So very close to absolute zero, all of my particles (as long as they are bosons) should be in the same lowest energy state, giving me a BEC. But again, that's from a very surface-level understanding of BECs and how they work. Great question, though!
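
      Spelled out, the Boltzmann factor being referred to here (a sketch, assuming the standard canonical ensemble with Boltzmann's constant k_B) is

      p_i = exp(-E_i / (k_B * T)) / Z,   with   Z = sum_j exp(-E_j / (k_B * T)),

      so as T -> 0, every state with E_i above the ground-state energy is exponentially suppressed relative to the ground state, which is why the (bosonic) particles pile up there.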

    • @SheffieldMarkE
      @SheffieldMarkE 5 years ago

      ​@@zapphysics Thought more about it...: I was mistakenly looking at the BEC collapse with distinguishable particles, leading to a wrong conclusion. For a 4 particle system, a random distribution for a disordered system would be P1*ln(1/P1) + P2*ln(1/P2) + P3*ln(1/P3) + P4*ln(1/P4). The orig. comment was truncated...continued below:

    • @SheffieldMarkE
      @SheffieldMarkE 5 years ago

      The error in my thinking was here: when the system collapses, I posited that for something like a 4-particle system the entropy would be S = 4*Pc*ln(1/Pc). However, when there are 5 particles in the system, the entropy is 5*Pc*ln(1/Pc), which is different. Hence the earlier question.

    • @SheffieldMarkE
      @SheffieldMarkE 5 years ago

      In a BEC, the entropy should be the same for any number of particles. On thinking this through, this is only true when the probability of a particle (or all particles) being in a particular state is 1 (or incrementally close to 1), which is the definition of the BEC: all the bosons share the same state.
      TLDR: It all works

  • @raddrift-xr5wy
    @raddrift-xr5wy 3 years ago

    All I got from this is that entropy is a measure of disorder, which it is not. Or was I not paying attention?

  • @ThanosSofroniou
    @ThanosSofroniou 3 years ago

    I'm still confused. WHY then does it always increase? What macrostate information are we progressively losing over time?

    • @zapphysics
      @zapphysics  3 years ago +1

      This is a great question. It turns out that we don't lose *macrostate* information; instead, what decreases is the amount of *microstate* information available to us. The reason is that the highest-entropy macrostates are the ones which have the most microstates accessible to them. This makes sense in the case that all microstates are equally probable, since there would then be a higher probability for the system to be in one of the many microstates corresponding to that particular macrostate.
      Let's go back to the dice analogy. If I roll 2 dice, there are 6 ways that I can roll a 7, making the "7 state" the most probable macrostate. Compare this to, for example, if the sum of the dice is a 2. There is only a single way of rolling a 2, making the "2 state" less likely than the "7 state." In terms of information, it then makes sense that we have less information if the system is in a "7 state" rather than a "2 state" since we have 6 states to choose from in the "7 state" whereas we know the exact state of the dice in the "2 state" since there is only a single possible microstate. So, if we put dice into the machine in a "2 state" and the machine shakes them up and they come up in the "7 state," we would say that the amount of information that we have access to has decreased since we now know less about the exact state of the system, hence we would say entropy has increased. Since the "7 state" is the most likely state, the system will prefer to be in this state and we can see that, statistically, entropy will tend to increase.
      In this example, it's easy to imagine a case where entropy decreases, but the second law applies to *thermodynamic systems* which have huge numbers of particles in the system. So, if instead of just 2 dice, we have a million (which is still nowhere close to large enough to consider it a thermodynamic system), there will only be a handful of most probable sums that will statistically appear because the multiplicity of these few sums massively dominate over any other. So, we will never get a sum of 1,000,000 (all the dice come up ones) because there is only one state corresponding to this. Compare this even to when the sum is 1,000,001 (one two, rest ones). In this case, any one of the dice could be the two, so we get one million microstates corresponding to this one macrostate. I'm sure you can imagine how huge the numbers soon become for the most probable states of this system.
      TL;DR entropy maximizes because the highest entropy macrostates correspond to the most probable states, which are the ones with the most possible microstates. Since there are so many possible microstates, we have far less information about the exact microstate of the system.
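
      A minimal sketch (Python, not from the video) that makes this counting explicit for two dice; multiplicities is a hypothetical helper that tallies how many ordered rolls give each sum:

      from itertools import product
      from collections import Counter
      from math import log

      def multiplicities(n_dice):
          # Number of ordered outcomes (microstates) for each possible sum (macrostate).
          return Counter(sum(roll) for roll in product(range(1, 7), repeat=n_dice))

      counts = multiplicities(2)
      print(counts[2], counts[6], counts[7])   # 1, 5, 6: the "7 state" has the most microstates
      print(log(counts[2]), log(counts[7]))    # 0.0 vs ln(6) ~ 1.79: more microstates, higher entropy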

    • @ThanosSofroniou
      @ThanosSofroniou 3 years ago

      @@zapphysics OMG thank you I finally get it. I missed how different microstates can yield the same macrostate. This was extremely helpful thank you very much

  • @jupiter3093
    @jupiter3093 2 years ago

    I don’t see it as accurate to define entropy as disorder.
    An atom in hot air has more kinetic energy and more movement, therefore it occupies more space and becomes freer from the forces that restrict its motion, like van der Waals forces or others, while atoms in cold air are more restricted.
    Therefore heat moves from hot to cold objects according to this movement.

  • @danmar007
    @danmar007 a year ago

    People confuse entropy with chaos.

  • @mikebellamy
    @mikebellamy 4 years ago

    He is wrong about entropy as disorder. If the dice are thrown and we see them, we have complete knowledge of the system, BUT the probabilities, like that of a double six being 1/36, are not changed by what we know... It does not become 1..?

    • @zapphysics
      @zapphysics  4 years ago

      @Mike Bellamy I think I see where the confusion is here and perhaps I should have been clearer. The idea is not that, when we have complete knowledge of the system, there is a 100% chance that when we roll the dice, we will get two sixes, for example. Instead, we are only talking about AFTER the dice have been rolled. So if I roll the dice and it comes up as double sixes, I know with 100% certainty that the final state is double sixes when I observe it. However, if I don't observe it, I don't know what the final state is, and the best I can do is describe it with a probability of what the final state could be. Hopefully that makes sense!

    • @mikebellamy
      @mikebellamy 4 years ago

      @@zapphysics Yes, that makes sense about why we have a probability distribution, but why should my knowledge of a system change its entropy?
      Let me put it this way: you said "the more exactly we can describe the system, the lower the entropy," and that doesn't work for me. The system will always end up or be in just ONE microstate after the roll of the dice. The entropy according to Boltzmann is the log of the count of microstates in the MACROSTATE. We cannot normally observe ALL the microstates in a macrostate. Say I CHOOSE the macrostate of sum equal to six. Now whether I observe any of the [1,5] [5,1] [2,4] [4,2] [3,3] possible results, it makes no difference to the entropy of that result; it will always be k*log(5) AFTER the throw. Before the throw all microstates are equally probable, but that is not a system in any state, so there is no entropy to calculate. Throwing produces a result which then has an entropy. OK, to calculate the entropy I must observe what the state is, but it doesn't lower the entropy from any previous state.
      What I am suggesting is the real problem here is not the definition of entropy but the definition of information.

    • @zapphysics
      @zapphysics  4 years ago

      @Mike Bellamy I think i understand what you are asking now. I totally agree that the system will always be in a single microstate. The idea is that entropy is a somewhat artificial thing which arises due to what's known as "coarse graining." All this means is that we are not describing the system using a fundamental theory like Newtonian mechanics, quantum mechanics, etc. Instead, we choose to trade the "exactness" of these theories in favor of being able to actually do calculations and make measurements. The way this is done is by not describing a system by its exact state, but instead by the probabilities of what states it could be in at any given time. This is the idea of information: how much of the exactness of the theory are we losing by intentionally choosing to describe systems in this probabilistic way. Entropy is just another word we use for this.
      So I'm not saying that the dice don't exist in an exact state, of course they do. What I'm saying is that, if we instead choose to give up the exactness of our measurements, and instead have a machine which only measures the sum, we can describe the state as having some entropy which accounts for the fact that we have given up the exact knowledge of the system.
      This is somewhat trivial in this simple example, but instead consider something like a room full of gas. How do I describe it? I can describe it by its temperature, by its pressure, by its density, and so on. But that isn't the whole picture. Every particle in every atom in every molecule is in an exact state in the system. If we were able to calculate this or measure this, there wouldn't be any need to talk about thermodynamic quantities like temperature, etc since we would just be able to do the exact physics (just like knowing the exact outcome of the dice). But it isn't possible to calculate/measure these exact states so instead we intentionally give up information about the exact state so we can describe the system in terms of average/thermodynamic quantities (like the sum of the dice). Entropy is just a way of book-keeping the information that we are giving up in order to do this.

    • @mikebellamy
      @mikebellamy 4 years ago

      @@zapphysics Well, it seems that if we define ORDER as improbable and DISORDER as probable, that would fit with every example we have discussed. So why say entropy is not a measure of disorder? That would be consistent with the old statement of the second law as "All systems tend to move to their most probable state".

    • @zapphysics
      @zapphysics  4 years ago

      @Mike Bellamy certainly in all of these examples, the two could be interchanged. However, as mentioned in the video, there are other systems which have entropy but describing them as disordered doesn't make sense. For example, there is some entropy associated with a single particle if it is entangled with a second. Yet saying that a single particle is "disordered" seems somewhat non-sensical. The point is that the more fundamental idea behind entropy is information, not disorder, though in many cases, the two could certainly be considered interchangeable.
      Edited to add: thinking about this more, disorder seems to be a somewhat subjective way of describing entropy anyway. For example, how could I say that the randomly distributed atoms in box A are more disordered than the randomly distributed atoms in box B when the temperature of box B is less than that of box A? Whereas, when we look at the definition of entropy, when we say that box A has higher entropy, we are saying that there are more possible states that box A could be in. Thus, we know less about the exact state of box A than box B, so information/knowledge about the system seems to be a more intuitive way of describing entropy. Especially because disorder typically means that things are out of place, meaning you know where things are, but they are mixed up from where they "should" be. But this is not at all what entropy is describing. This is more of an opinion, though.

  • @ChuckCreagerJr
    @ChuckCreagerJr 5 years ago

    In general, this is a good video. However, while it is true that entropy is not itself disorder, they are related in that disorder results from randomness while an ordered state results from a deliberate action. Meanwhile, entropy is clearly connected to the randomness of a system.

    • @zapphysics
      @zapphysics  5 years ago

      But this depends on how you define "randomness," no? Do you mean truly stochastic in nature? Because in that case, no classical system should ever be considered "disordered" then, since classical theories are entirely deterministic. Entropy only arises when we ignore this deterministic behavior in order to describe the system in terms of macroscopic/average quantities like temperature, internal energy, etc. Plus, entropy can be assigned to systems where "random" and "disordered" are either meaningless or incorrect ways to describe them. Perhaps the best example is a black hole. A black hole is entirely characterized by three numbers: mass, charge, and angular momentum. Not random or disordered at all. However, black holes have HUGE amounts of entropy associated with them. This can really only be attributed to the fact that you've taken all of the information that was in the stuff that fell into the black hole and are now trying to describe all of it with just those three numbers. Hence, the entropy comes from ignoring all of that information (in principle, you could still jump into the black hole and get all of the information back), not from any randomness, chaos, or disorder.

    • @ChuckCreagerJr
      @ChuckCreagerJr 5 years ago

      @@zapphysics By random, classically, it primarily refers to the inability to have sufficient information to predict the results, and that is how I meant it. For example, the throw of the dice is still considered random because you cannot have all the information necessary to calculate the result. As a result, classically, entropy would not be the information we ignore but rather the information we do not have access to. Also, at the molecular level, things are never constant, as molecules are always moving in a way we can never have sufficient information about. This would also, by definition, apply to things such as black holes.
      However, this is the case only with classical mechanics. When you enter the world of quantum mechanics, where much of thermodynamics actually takes place (such as molecular motion), the deterministic aspects of classical mechanics no longer apply. In fact, not only do I not have the information, but until we actually look at the system a definite state does not exist; the system exists in a probabilistic state. As a result, at the molecular level, entropy would represent this degree of quantum randomness in the system.

    • @zapphysics
      @zapphysics  5 years ago

      See, I disagree. When you throw the dice, classically it is entirely possible for the system to be deterministic. Very very difficult, since you need to know all of the initial conditions and parameters of the problem to extremely high precision, but possible nonetheless. By choosing the practical route of describing the system with probabilities, we intentionally neglected this information. Similarly with systems on the molecular level. If we ignore the quantum effects for now, we could in principle have an entirely deterministic system, since we could measure all of the positions, momenta, and angular momenta of each particle in the system exactly. It isn't feasible, but it's classically possible. We choose to describe the system using average and macroscopic quantities instead and this is where entropy arises from.
      Now the story changes completely when we have quantum systems. Since quantum systems are allowed to exist in superpositions, we are only physically allowed to have information about the system up to this superposition. Basically, if I can write down a single wave function for the system, I know all I can about it, even if I don't know the exact outcomes of measurements of the system. So here's where defining entropy in terms of "randomness" breaks down yet again. You agree that if my system is in a superposition of eigenstates, the outcomes of certain measurements are truly, unambiguously random, correct? As it turns out, there is a well-defined quantity that measures the entropy of a quantum system (very analogous to the definition presented in this video), known as the von Neumann entropy. Now, the von Neumann entropy says that if I can write down an exact wave function for a system, its entropy is zero. So despite the possibility that my system is fundamentally random, it has zero entropy. So if we want to invoke quantum mechanics, defining entropy as disorder or randomness makes even less sense than for classical systems.

    • @ChuckCreagerJr
      @ChuckCreagerJr 5 years ago

      @@zapphysics "See, I disagree. When you throw the dice, classically it is entirely possible for the system to be deterministic. Very very difficult, since you need to know all of the initial conditions and parameters of the problem to extremely high precision, but possible nonetheless."
      The problem with what you are saying is that while it is true that, in theory, under classical mechanics you could predict the dice if you had the initial conditions and all the forces involved, not being able to do so is more than a matter of practicality; it is actually impossible because we can't have all the information. You might be able to pull it off if you made a dice-throwing robot, but with a human being throwing the dice it is impossible, because even ignoring free will you cannot possibly know all the states of the person's brain. If you include free will it becomes totally impossible.
      You: "By choosing the practical route of describing the system with probabilities, we intentionally neglected this information."
      Once again, it is not a case of intentionally neglecting this information but of not having access to it.
      ----------------------------
      You: "Similarly with systems on the molecular level. If we ignore the quantum effects for now, we could in principle have an entirely deterministic system, since we could measure all of the positions, momenta, and angular momenta of each particle in the system exactly. It isn't feasible, but it's classically possible."
      There are 2.588 x 10^25 air molecules per cubic meter. This is not only impractical, it is literally impossible to measure all of those positions and velocities.
      You: "We choose to describe the system using average and macroscopic quantities instead and this is where entropy arises from."
      Once again, it is not a case of intentionally neglecting this information but of not having access to it.
      -------------------------
      You: "Now, the von Neumann entropy says that if I can write down an exact wave function for a system, its entropy is zero."
      Please cite a source for this; based on what I know about von Neumann entropy and the quantum mechanical wave function, this claim is bogus.

    • @zapphysics
      @zapphysics  5 years ago

      I think our disagreement is on what constitutes "possible." I'm arguing from a purely theoretical viewpoint that all of these things are possible. I agree that we cannot do it in practice, but the definitions and the theory don't care about what we can do in practice. Either way, I think we are arguing semantics at this point. You seem to agree with me that there is information we don't have about the system, be it from intentionally ignoring or practical limitations, and that is where entropy arises from. I think we agree on that point. So my question is why bother introducing this new thing of "disorder" to mean exactly the same thing as "information we don't have?" It seems like a somewhat arbitrary definition and I see it as an extra point of confusion when we could simply just explain entropy as coming from information we don't have.
      Also, a very quick Google found several very accessible sources on von Neumann entropy:
      Wikipedia (though I understand it's not to be taken as the most credible source): en.m.wikipedia.org/wiki/Von_Neumann_entropy
      Lecture notes on the subject from Berkeley: people.eecs.berkeley.edu/~vazirani/s07quantum/notes/qinfo.pdf
      Paper from the mathematical physics arxiv talking about thermodynamic operators in quantum mechanics: arxiv.org/pdf/math-ph/0102013.pdf
      From all of these sources, it is clear that a state I can write a wave function (i.e. a ket) for has zero von Neumann entropy. This is incredibly easy to try for yourself, since the diagonalization of the density matrix is trivial for a density matrix which is a single outer product of one ket. Clearly, this density matrix has eigenvalues of 1 and 0 and the von Neumann entropy vanishes.
      It is only when I trace out parts of a pure system that I can get a non-zero von Neumann entropy since the reduced density matrix cannot always be written as a single outer product and therefore has eigenvalues other than 1 and 0. The Bell pair example in the Wikipedia article is an excellent example.
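
      For anyone who wants to try that quick check, a minimal numerical sketch (assuming only numpy; von_neumann_entropy is a hypothetical helper, and the Bell pair is the usual (|00> + |11>)/sqrt(2)):

      import numpy as np

      def von_neumann_entropy(rho):
          # S = -sum_i lambda_i * ln(lambda_i) over the eigenvalues of the density matrix.
          evals = np.linalg.eigvalsh(rho)
          evals = evals[evals > 1e-12]          # 0 * ln(0) -> 0 by convention
          return float(-np.sum(evals * np.log(evals)))

      # A pure single-qubit state |+> = (|0> + |1>)/sqrt(2): its density matrix is one outer product.
      plus = np.array([1.0, 1.0]) / np.sqrt(2)
      print(von_neumann_entropy(np.outer(plus, plus)))    # ~0.0

      # A Bell pair (|00> + |11>)/sqrt(2): trace out the second qubit and the entropy is ln 2.
      bell = np.zeros(4)
      bell[0] = bell[3] = 1 / np.sqrt(2)
      rho_full = np.outer(bell, bell).reshape(2, 2, 2, 2)
      rho_reduced = np.trace(rho_full, axis1=1, axis2=3)  # partial trace over qubit 2
      print(von_neumann_entropy(rho_reduced))             # ~0.693 = ln 2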

  • @mrnarason
    @mrnarason 5 years ago +2

    Better than minutephysics

  • @a_little_bit_of_wisdom
    @a_little_bit_of_wisdom 4 years ago

    First, the title is deceiving, and plays into the anti-science idea that entropy isn't true. Information is an excellent way of talking about entropy. So is Gibbs free energy, and so is disorder. It depends on circumstance. Gibbs is important in far-from equilibrium systems. Disorder explains entropy at high temperatures.
    Second, the device that tells you the total readings of the dice inside does not have entropy. It has not lost information. Entropy would result if the device first knew the states of each die, then forgot them. But Zap didn't say that. Because if he did, then he'd have to explain how that info was lost, i.e., where it went. Then it would be recoverable.
    However each die inside has entropy. No one knows any particular die's configuration, not even the machine.
    Third, there is not "only one way to roll two threes." There are two ways. Each die is independent. I appreciate it makes the equation easier to solve. That's not a good enough excuse.

    • @zapphysics
      @zapphysics  4 years ago

      Hello, and thank you for the critiques! If you don't mind, I will respond to your points and hopefully back up the decisions that I made in making this video.
      To the first point, I certainly did not intend for the title to be deceiving and I apologize if it comes across that way. It was simply meant to capture the idea of the video: that being the fact that "disorder" is not the most generalizable idea behind entropy as we now understand it. I don't disagree that other ways of explaining entropy can work in different regimes, but information is the fundamental connection between a macroscopic "coarse-grained" theory and a microscopic "exact" theory. As I mentioned in the video, our thermodynamic intuition of entropy as disorder fails in more extreme cases, such as systems with very few degrees of freedom (e.g. black holes or EPR pairs). In these cases, the idea of entropy as information still holds up and so I would argue that it is the more "universal" way of describing entropy (not that the other methods aren't useful in their own right!).
      On your second point, I will agree that I was a bit sloppy in defining what my system is. However, the entropy produced by the machine "forgetting" the state of the dice after summing them is exactly the entropy that I am talking about in the video. The dice do exist in an exact, zero entropy, state. That is for sure. But by introducing a machine which obscures the information from us (i.e. a mechanism of coarse-graining), it creates entropy. We know this from the information perspective due to the fact that we no longer have all of the information of the state that we could have, but we also know it from a more thermodynamic point of view due to the fact that the machine has to measure the dice and use some energy to evaluate the sum. So say instead that I now program the machine to only say whether the sum is even or odd. I now have less information about the system, and therefore the entropy must be higher. But we know this must be the case since the machine must first evaluate the sum and then figure out if it is even or odd, therefore using more energy, creating more entropy. Perhaps I should have included this discussion in the video, but I didn't feel as though it was misleading without it, since the entropy I discussed and the entropy you are concerned with are equivalent.
      Finally, it is certainly news to me that there are two ways to roll two threes with two dice. If this was the case, it would have actually made the math somewhat easier due to the fact that everything would simply be multiplied by two. I actually disagree that the two dice are independent as long as I know the exact numbers that are shown on the dice (but not necessarily which number goes to which die). For a particular state, since I only have two dice, the number of one die actually exactly determines the number of the other die. Say, for example, I know I rolled a two and a three before looking at the dice. If die A is a two, then I know that die B must be a three. There is no choice in the matter. I don't even have to look at die B. But since die A can either be a two or a three, the system can exist in two possible states: {A=2, B=3} or {A=3, B=2}. When both dice are threes, I know that die A must ALWAYS be a three. Keeping in mind that the number on die A uniquely determines the number on die B (assuming I know the exact numbers for the state), there can only be a single state with two threes, {A=3, B=3}, since die A can only roll a three in one way.
      Hopefully this addresses your concerns!

  • @Tkis01gl
    @Tkis01gl 5 years ago +1

    Just an average Joe with no higher education and I get what you are telling me in the video. I can't do the math, but I get the concept.

    • @zapphysics
      @zapphysics  5 years ago

      I'm glad to hear it! Understanding the concepts is about 80% of the battle (if not more) so that's awesome!

    • @Tkis01gl
      @Tkis01gl 5 years ago

      @@zapphysics I do have a theory concerning the double slit experiment, quantum fluctuations and dark matter/dark energy but do not have math to support it. I just have an idea based on how nature and silt in turbulent water works. Care to listen?

    • @zapphysics
      @zapphysics  5 years ago

      @@Tkis01gl Sure, I'd be happy to listen!

    • @Tkis01gl
      @Tkis01gl 5 years ago

      ZAP Physics we need to figure out a private method to communicate. I’m not sure YouTube comments have enough space to explain it all.

  • @felinefriend6101
    @felinefriend6101 a year ago

    Wow… confused as hell. Then I read compliments about how this video made things crystal clear! I feel like a doorknob… OK, I'll go play chess to regain some self-esteem

    • @zapphysics
      @zapphysics  a year ago

      Hi, I apologize if the video wasn't clear! Please, feel free to ask any questions and I will try to clarify any points of confusion! I know this is a very challenging topic, so I would hate to discourage you if understanding doesn't come immediately.

  • @wirelesskiller8075
    @wirelesskiller8075 5 years ago +1

    A new subscriber

  • @lyrical7683
    @lyrical7683 3 years ago

    All this time I thought it was disorder, I thought it was one of the laws of thermodynamics, nooooooooo! 😫

  • @poorboy2772
    @poorboy2772 5 years ago +2

    I'm nearly convinced that reality is a simulation

  • @matthewdemauro6996
    @matthewdemauro6996 5 years ago

    Dragon FOX DAnce