Claim your SPECIAL OFFER for MagellanTV here: sponsr.is/magellantv_actionlab and start your free trial TODAY so you can watch Venus: Death of a Planet about how Venus became the planet it is today: www.magellantv.com/video/venus-death-of-a-planet-
I love this subject; I liked it when I studied it too... When entropy decreased we called that an open system, otherwise it was closed. Life on Earth, for example, has decreasing entropy. That's an open system. But once we include the sun and the energy coming from it, the system has increasing entropy; now the system is closed...
In the specific simulation you showed at 4 minutes the particles actually attract each other which is why it springs back. So the particles are pulled to the center of mass, but one end is anchored causing all particles to migrate left.
I'm a physics student, and to my limited knowledge, the entropy of the dice did decrease. But the process still happened because of the release of potential energy when a die falls. The spontaneity may be determined by the change in the Gibbs free energy, which is given by dG = dQ - TdS (assuming an isothermal and isobaric process). If dG < 0, the process is spontaneous. This is because the increase in entropy of the surroundings (-dQ/T) compensates for the decrease in entropy of the system (dS), thus abiding by the second law of thermodynamics. In this particular case presented in the video, since the system is not isolated (meaning work and heat can cross the boundary of the system), the decrease in entropy of the dice is compensated by the increase in entropy of the surroundings, caused by the heat released as the dice fall. I genuinely enjoy every video of yours! Thanks for uploading such an amazing video :)
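A minimal numerical sketch of the bookkeeping in the comment above, ΔS_total = ΔS_dice + ΔS_surroundings, where the surroundings gain Q/T from the dissipated potential energy. Every number below is an assumed illustration, not a measurement from the video:

```python
# Sketch of the second-law bookkeeping described above.
# All quantities are made-up, order-of-magnitude assumptions.
import math

k_B = 1.380649e-23        # J/K

m_total = 100 * 0.004     # kg: assume 100 dice of ~4 g each
g = 9.81                  # m/s^2
dh = 0.02                 # m: assumed drop of the centre of mass on packing
T = 293.0                 # K: room temperature

Q = m_total * g * dh      # potential energy released as heat/sound (J)
dS_surr = Q / T           # entropy gained by the surroundings (J/K)

# Assume packing removes roughly k_B*ln(10) of configurational entropy per die.
dS_dice = -100 * k_B * math.log(10)

print(f"dS_dice = {dS_dice:.2e} J/K, dS_surroundings = {dS_surr:.2e} J/K")
print("Total entropy change positive?", dS_dice + dS_surr > 0)
# The surroundings' gain (~1e-4 J/K) dwarfs the dice's configurational
# loss (~1e-21 J/K), which is the commenter's point.
```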
This makes intuitive sense. The demonstrations in the video are wonderful for teaching how crystals form: the particles slot into the positions with the lowest bonding potential energy, in the same way the dice and nails slot into the positions with the lowest gravitational potential energy. Would you agree? (PS honest question: I'm a high school teacher who has to teach chemistry with a degree in botany 😅)
My understanding is that increases in entropy are increases in homogeneity. In other words, one subregion of a high-entropy region should be difficult or impossible to tell apart from an adjacent subregion with the same level of entropy. To me, that sounds like a crystal lattice. The term "order" was, IMO, originally slightly misused when the terminology for talking about entropy was getting established, since some of the physical effects of increased entropy are at odds with our natural language use of the term "order" (and its inverse, disorder) to refer to things like how tidy your room is, or how regular a printed pattern is.
@@Mikee512 the third principle of thermodynamics can be stated as "a perfect crystal at 0K has 0 entropy", which to me seems at odds with this interpretation. A perfect crystal would be perfectly homogeneous, with each region indistinguishable from any other, corresponding to a single microstate. Am I wrong? 🤔
Another thing to consider: the dice and nails also reached a state where the gravitational potential energy of the system was minimized; the disordered arrangement took up more volume, meaning there were lower states the dice could be in if organized. The gravitational potential energy was thus dissipated as sound and heat. This is equivalent to the nature of living systems. They exist in what we would call a more ordered state, but their capacity for converting useful forms of energy to waste heat more than offsets any apparent decrease in any other aspect of entropy they cause.
Seems work was done on the system with the twisting torque input or shaking the nail box. That kinetic energy converted to waste sound/heat means entropy increases. Gravity is another static input to the system.
Gotta disagree with the reasoning that there is more entropy in the ordered dice because they are more compact and "have more room to move around." They only have more room compared to the height of the disordered state. The dice are not actually concerned about the height of the disordered state, so I would argue that they are in fact at lower entropy since the room they care about (to the sides and below them due to gravity), is less, and there are fewer opportunities for the dice to move around.
I think this is true if you only look at the dice. But we are looking at and comparing the system in a packed state vs a random orientation state. When looking at the system as a whole we must keep the volumes the same.
Yeah. Entropy relates to the number of microscopic states that have the same macroscopic properties. The ordered dice have lower entropy since there are fewer such ordered states than unordered states (roughly, you can orient an ordered die in 24 ways, an unordered one in many more ways). The fact that the system spontaneously becomes ordered does not mean that that order has higher entropy.
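A toy version of the orientation-counting in the comment above: S = k_B ln Ω with Ω = (orientations per die)^N. The per-die counts are illustrative assumptions (24 is the size of a cube's rotation group; 10,000 is an arbitrary discretisation for a tilted die), not physical values:

```python
# Orientational entropy only: S = k_B * ln(Omega), Omega = (choices per die)^N.
import math

N = 100                                   # number of dice (assumed)
ln_omega_aligned = N * math.log(24)       # axis-aligned: 24 cube rotations each
ln_omega_tilted = N * math.log(10_000)    # tilted/jammed: arbitrary 10,000 each

print("ln(Omega), aligned:", round(ln_omega_aligned, 1))
print("ln(Omega), tilted :", round(ln_omega_tilted, 1))
# Orientation counting alone favours the jumbled state; the video's argument
# rests on the extra positional freedom the packed dice gain, which this
# deliberately simplified count ignores.
```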
This is incorrect. When the dice are disordered, many of them are effectively jammed and can't move at all. On the other hand, when the dice are ordered, each die is free to move a little bit. They don't have much room, but remember that entropy is related to the number of possible microstates. It increases with the factorial of the number of dice when they can all move. Entropy-driven crystallization is an entire field of study. It's really counter-intuitive, but the explanation in this video is correct.
@@cwpeterson87 Dice don't have to move for entropy calculations; moreover, if they are well packed they would also not move. That is a red herring imho. I could for instance swap dice (even if they are blocked, I can still imagine two states that differ by a swap); thát gives multiple microstates with the same macrostate. Similar other processes like that. Most compelling to me is that this is completely analogous to crystallisation, and a crystal has lower entropy than - say - the corresponding gas. There are lots of other effects here. Like, the dice fall down, thus decrease in gravitational energy, which energy is dissipated such that the dice warm up. And thát heating still increases entropy in compensation. (They release heat, just like crystallisation does btw.) The argument in the video that the packed dice have air above them to move to is not correct either in my view: it requires energy for dice to move up there against gravity, so that is no longer the same macrostate; it has different total energy. (And on top of that, the jumbled dice leave exactly the same room for air among each other.)
I was really just thinking about what implications this phenomenon has on living organism behavior, not just these inanimate objects that are being observed for the video
Excellent video, but I think you have it wrong with regard to the dice and nails. They settle when jostled into the lowest energy state - that being the lowest overall point in the gravitational field they are in. Do the same thing in a location without gravity and you'll get different results. I agree with the rubber band example, however. Note that stretching a rubber band heats up the rubber, and it cools again when it contracts. This is consistent with entropy. The nails and dice would also heat up (or increase in average speed) if you jostled them in zero gravity, just like the rubber, by the amount of energy expended to jostle/stretch. Entropy is simply a measure of how likely a configuration is. Sure, it's possible that the rubber band could stretch itself just sitting on my desk, but with hundreds of billions of molecules in it all needing to do essentially the same thing at the same time over a long period of time (contextually speaking), a mathematician would call it a statistical impossibility.
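To put a rough number on that "statistical impossibility": assuming, purely for illustration, that each of N chain segments independently has a 1-in-2 chance of momentarily pointing "the stretched way", the chance that all of them do it at once is:

```python
# Back-of-the-envelope probability that a rubber band stretches itself.
import math

p = 0.5                 # assumed per-segment probability of the "stretched" state
N = 1e11                # the commenter's "hundreds of billions" of molecules
log10_prob = N * math.log10(p)
print(f"P(all segments align at once) ~ 10^{log10_prob:.3g}")
# ~10^(-3e10): not forbidden by the equations of motion, just never observed.
```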
I'm not sure about the rubber band one personally, and if there is a contribution, the force produced would be almost nothing compared to the intra- and inter-molecular forces involved in an elastic band.
@knurlgnar24 Agreed, Action Lab reversed the scale conventionally used to measure entropy. The disordered box of nails should be said to have relatively higher entropy than the more ordered box. I stopped watching after that botched intro.
I think if whoever first used the word "disordered" to describe entropy had instead used "homogeneous" or something, we'd be like 50 years ahead with our understanding of physics and information theory
Doesn't work either: case in point, oil does not mix with water, and homogeneous emulsions separate into distinct layers. The first use of entropy came from the observation that heat spontaneously transfers from hot stuff to cold stuff, and that this puts a hard limit on how efficient a thermal engine can be. The modern understanding of entropy is that it is a measure of the number of microscopic states a system can be in, given the values of observables that can be measured at our scale. The channel alphaphoenix made a really good video recently if you want a more in-depth explanation.
"homogenous" is closer but not quite right. I think "likelihood" is a better description but still not perfect. If you allow a system to evolve undisturbed for an arbitrarily long amount of time, taking observations at random times, higher entropy states are more likely to be observed over time.
To me the best description of entropy is that it's the act of something reaching its most stable state. With these dice being organized like that, the dice are more stable than when they are disorganized. Rust is chemically more stable than iron.
2:35 sounds fishy... please check: 1. Microstates in the Boltzmann formula have the same energy. Here, if any die goes to the free space above, the system will have its energy increased by the work against gravity. This breaks the condition of the microcanonical ensemble. 2. Microstates are equally probable; however, we don't see dice jump into the free space of their own accord, therefore the shown state has a much larger probability of being occupied, and this breaks the condition of the microcanonical ensemble
I agree with your description, but bear in mind that this system wouldn't be described by a microcanonical ensemble, because it is not isolated. In fact, Boltzmann entropy would not be the correct expression for entropy in this case I reckon. This looks more like a (N, g, T) ensemble, where available states are definitely not iso-energetic. Further thought is necessary on my part
@@andreacosta7712 I guess you meant the canonical NVT ensemble. I agree with you, it should be more suitable in this case. However, I played along with the author, who decided not to introduce Gibbs' formula. Even if the energy distribution of microstates were considered, my point (2) would still hold. I'm actually trying to lead to the idea that the author's words only make sense if he never stops shaking the jar, and even in this case it could be hard to say something specific about the energy distribution of states and the resulting entropy
Definitely, any kind of thermodynamical reasoning only holds if the shaking is ongoing, aka constant temperature. What I meant was indeed NgT. Volume is not constant, but gravity is. In a way, a constant-force ensemble is equivalent to a constant-pressure one, as in NpT, which is described by Gibbs free energy. My general point is that in this demo, the interplay of energy and entropy is what actually explains the phenomenon, so reasoning in a microcanonical framework does not give you the full picture, because entropy is constant in it. But yes, point 2) still holds.
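For readers following this exchange, the standard definitions being contrasted, stated compactly (general textbook formulas, not something derived from the dice demo itself):

```latex
% Microcanonical ensemble (isolated system, fixed energy E):
% all \Omega(E) accessible microstates are equally likely.
S = k_B \ln \Omega(E)

% General (Gibbs) entropy for arbitrary microstate probabilities p_i:
S = -k_B \sum_i p_i \ln p_i

% Canonical ensemble (fixed N, V, T), with partition function
% Z = \sum_i e^{-E_i / k_B T}:
F = -k_B T \ln Z, \qquad S = \frac{U - F}{T}
```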
@@JackAtlass Basically, these are bad examples using the wrong math, failing to state the things he wants to state even though what he's saying is true. He's wrong in every step of explaining why it's true.
This is a wrong explanation. The spontaneous orderly arrangement in the example is due to the effect of gravity. Dice can lower their overall center of gravity when arranged neatly, and the same is true for nails. If gravity is removed, such as by astronauts conducting experiments in the space station, you will see that neither the dice nor the nails will spontaneously organize neatly.
@@andymouse Maybe It approximates the dark matter flowing over and through us. Galaxies start out as ring galaxies and "acquire" arms and bars and get warped along their rotational equator and things due to perturbations from other stimulus. An in space simulation would be all the dice in the cylinder in a random meander. Similar to a space containing a gas. It would take longer than it takes for tar to drip to get the data.
But if you remove the gravity of Earth or any other massive object the dice themselves would still gravitationally attract each other and they would form a clump with mostly face to face contact, very similar to the way they are arranging in the bottom of the tube.
'Amount of free volume per dice increases': I thought that the amount of free volume is the same before and after aligning the dice in a finite system; it is just that the same 'old' free volume, once scattered between the dice, is now concentrated into one continuous 'new' free volume above them.
yep - If we follow this guy's process and keep the volumes of the two systems (1/higgly-piggle dice and 2/ordered dice) constant then what he shows us (higgle-piggle / ordered) each represents a single microstate/config of the same macrostate. Both systems can reach all the other's possible microstates given some input of energy. They have the same number of possible microstates and the same macro. The entropy of both systems is the same. (If we are allowed to reduce the volume in the ordered dice system, Then it has lower entropy.)
Once the universe reaches full entropy, if ever, that will be perfect order. Or as close as you can get to it. How people come to the conclusion that entropy is chaos is beyond me
To my understanding, entropy describes particles' "desire" to reduce their energy to reach the most stable state. The dice prefer the tightly packed state because there they have the lowest amount of potential energy. It also takes a lot more energy to break that state apart: the previous unorganized state could be broken apart by the tapping/rotating, but the new state is hardly affected by it.
Indeed. When you have packed particles, if you free them into a bigger room, they redistribute to reduce their energy. In the dice scenario, this also happens to reduce their energy.
The trick here is not to compare the original disordered dice configuration to the final ordered one, but rather to imagine the distribution of dice orientations while the container is being shaken. At equilibrium, most of the dice align and the ones near the top are constantly and rapidly bouncing around. If the container is not being shaken, the entropy of both configurations is near zero because the dice have almost no probability of changing state. And analyzing the situation becomes harder because the disordered configuration will have a higher energy level, due to potential energy, than the ordered configuration.
This is true, but a more practical route would involve calculating the partition function. We need to incorporate positional and configurational entropy due to the limited volume. This would be a more detailed approach and would incorporate both positional and potential energy considerations.
If we replaced the dice with smaller particles, like those Einstein first observed Brownian motion in, it seems clear the unpacked, disorganized dice are much higher entropy. Random disturbances are much more likely to lead to an unpacked vs packed state. For every one packed permutation, there are an enormous number of translated but otherwise identical ones. Am I missing something?
3:33 This is semantically untrue. The reason the rubber band returns to an unstretched state is because of electrostatic attraction and repulsion between the molecules within the rubber, which respectively resist stretching and compression, and thus effectively function as an elastic force. You can explain this as a mechanism that acts to increase entropy, but this does not change the fact that the force is elastic in nature.
Reminds me of how the initial state of the early universe would have appeared about as random/disordered as possible yet gravitationally speaking was very low entropy.
The initial state of the universe is extremely ordered. All matter existing in the same point. There is only one configuration for that to be the case.
@@JamesDevlin-d4z It wasn't a single point. The universe before the big bang was still infinitely large, it was just much, much more tightly packed. At least, according to the widely accepted theories.
"When enough chaotic and random events happens. Eventually, it leads to very predictable events." Thats quite philosophical. So after enough observations, everything will be understood.
Yet the definitions of chaotic and random preclude the possibility of prediction. So are the events truly chaotic or random? If they are predictable, then they can't be.
@@PhokenKuul this is incorrect. There is nothing in the definitions of chaotic or random that precludes prediction. Dice rolls are random, but I can accurately predict that each number will appear 1/6th of the time in a large number of rolls.
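A tiny simulation of that point, for anyone who wants to see the long-run frequencies emerge (the roll count is arbitrary):

```python
# Individual rolls are unpredictable; the frequencies are not.
import random
from collections import Counter

n = 600_000
rolls = Counter(random.randint(1, 6) for _ in range(n))
for face in range(1, 7):
    print(face, round(rolls[face] / n, 4))   # each close to 1/6 ~ 0.1667
```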
The ordered jar has less entropy: with the dice compacted there are fewer positions they can actually occupy while still looking the same in bulk; but the total entropy of the Universe increased with the heat of the friction and collisions, though. Just like life: we may seem to violate the laws of thermodynamics with all our growing and reproducing, but it's actually the opposite, we spread heat more easily than still rocks.
The entropy of the dice themselves decreased, but the entropy inside the jar as a whole increased as a result of compacting the volume. Honestly though, he is arguing semantics.
That's incorrect: energy neither increases nor decreases; the universe is moving towards the most statistically likely configuration, which is the true definition of entropy.
*The 3 levels of entropy from less correct to more correct*
1: A system moving in the direction of a more disorderly state
2: A system moving towards a state of equilibrium
3: Energy and matter displaying the most statistically likely configurations
*Creating the arrow of time by moving towards that given configuration, which can be more or less statistically likely, but the ones which are the most likely are considered entropy*
@@notaffiliatedwith7363 It's not about geometry and order, it's about the most likely configuration in a given system with its laws of physics at work
@@markmuller7962 Yep, "order" and "disorder" are subjective, so they do not always match the entropy. A good example is oil and water separating over time, when with most other liquids we see the opposite.
@@jacobblaustein7779 If the jar's entropy has increased, it's due to the heating from the wild shaking. The jar is not a closed system. The explanation is not valid, or at least not complete, as long as heat, sound noise, and the random shaking are not considered.
Action Lab reversed the scale conventionally used to measure entropy. The disordered box of nails should be said to have relatively higher entropy than the more ordered box.
he's wrong - If we follow this guy's process and keep the volumes of the two systems (1/higgly-piggle dice and 2/ordered dice) constant then what he shows us (higgle-piggle / ordered) each represents a single microstate/config of the same macrostate. Both systems can reach all the other's possible microstates given some input of energy. They have the same number of possible microstates and the same macro. The entropy of both systems is the same. (If we are allowed to reduce the volume in the ordered dice system, Then it has lower entropy.)
*The 3 levels of entropy from less correct to more correct*
1: A system moving in the direction of a more disorderly state
2: A system moving towards a state of equilibrium
3: Energy and matter displaying the most statistically likely configurations
*Creating the arrow of time by moving towards that given configuration that can be more or less statistically likely but the ones which are the most likely are considered entropy*
(3) needs a tweak: the whole point is that all microscopic configurations are equally statistically likely (at the same energy), but there are just so many more with the "right" macroscopic variables that it always looks right. E.g., the configuration of N2 and O2 molecules in the room (all your rooms) right now has the same likelihood as the one with all O2 on the left side and all N2 on the right side, but there is only one state that looks like the latter, and more than Avogadro's number to the sixth factorial that look like the former, and (N_A^6)! >>>>>>>>>>>>>>> 1.
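An order-of-magnitude version of the N2/O2 example above, using Stirling's approximation via math.lgamma; the 10^24 molecule count is an assumed round number rather than the exact contents of anyone's room:

```python
# ln(number of "evenly mixed" left/right arrangements) vs the single sorted one.
from math import lgamma, log

N = 1e24                                          # assumed number of molecules
ln_mixed = lgamma(N + 1) - 2 * lgamma(N / 2 + 1)  # ln of binomial(N, N/2)
print(f"log10(even-split arrangements) ~ {ln_mixed / log(10):.3e}")
print("log10(fully sorted arrangements) = 0  (there is exactly one)")
# Every individual arrangement is equally likely; the mixed macrostate wins
# only because ~10^(3e23) arrangements look mixed.
```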
The "arrow of time" is not necessarily a more correct part of your difinition. An interesting idea, but not demonstrable. Sean Carroll (the physicist who proposed this) is a *theoretical* physicist and his idea arguably enters into an unprovable area.
@@PsymonHawkeye Actually the arrow of time is a side addition to the third concept just to make it clearer. And btw it's not a matter of hypothesis, the main role of entropy in the arrow of time is a theoretical physics *fact* aka scientific theory which doesn't have anything to do with the English popular use of "theory". It's a scientific theory, learn how science works
@@markmuller7962 agreed. Idk, that was my favorite part of statistical mechanics. I use it to dissuade lottery players: Me: “You should pick 1, 2, 3, 4, 5, 6.” Them: “Why? That will never come up!” Me: “Exactly.”
What was said at the beginning of the video, that the 'ordered' systems (dice or spheres) have more entropy than the disordered ones, is not correct (if we look at the isolated entropy of the packing of dice and spheres). You might consider correcting this. These systems do not reach a state of maximum entropy, but the minimum of an energy potential (like the Gibbs potential in thermodynamics). For the dice and the spheres this potential contains a gravitational energy term, +m*g*h, where m is the total mass of the dice or spheres, g the gravitational acceleration and h the height of the center of mass of all the objects in question. The second term is an entropic one, -c*S (c is some constant depending on the thermal energy, or the shaking intensity in this case). The system then tends to adopt a configuration of minimum m*g*h-c*S. Close packing reduces h, this term dominates, and therefore the dice or spheres adopt a packing where h is as small as possible even if S is not at its maximum. Here minimal h and maximum S cannot be reached simultaneously. To understand this, consider there is no gravity (i.e. g=0); then only the entropic term -c*S is left in the potential, and the system will adopt the state of maximum entropy (i.e. all dice and spheres floating randomly in the cylinder) at the mildest shaking (i.e. c is not zero). The second case is strong, continued shaking. In this case c will be very large and m*g*h will become negligible (basically g can be assumed to be 0). Then again the dice and spheres will 'float' randomly in the cylinder due to the shaking.
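A toy numerical version of the m*g*h - c*S potential described above; the mass, heights and entropy values are invented solely to show how the minimum flips from the packed to the jumbled state as the shaking intensity c grows:

```python
# Compare Phi = m*g*h - c*S for two assumed configurations at different
# shaking intensities c (the analogue of temperature in this argument).
g = 9.81
m = 0.4                                   # kg of dice (assumed)

configs = {
    "packed":  {"h": 0.04, "S": 1.0},     # low centre of mass, low entropy
    "jumbled": {"h": 0.06, "S": 3.0},     # higher centre of mass, higher entropy
}

for c in (0.0, 0.01, 0.1):                # gentle -> violent shaking
    phi = {name: m * g * cfg["h"] - c * cfg["S"] for name, cfg in configs.items()}
    winner = min(phi, key=phi.get)
    print(f"c={c}: packed={phi['packed']:.4f}, jumbled={phi['jumbled']:.4f} -> {winner}")
# At small c the gravitational term wins (packed); at large c the entropic
# term wins (jumbled), as the comment argues.
```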
If he said 'maximum entropy' then he's wrong on that as you say. He's also mixed up on another count i'd say: If we follow the guy's process and keep the volumes of the two systems (1/higgly-piggle dice and 2/ordered dice) constant then what he shows us (higgle-piggle / ordered) each represents a single microstate/config of the same macrostate. Both systems can reach all the other's possible microstates given some input of energy. They have the same number of possible microstates and the same macro. The entropy of both systems is the same. (If we are allowed to reduce the volume in the ordered dice system, Then it has lower entropy.)
@@nonabroladze8203 No, it's not mess or disorder; they are just the more probable outcomes. It just measures the number of arrangements particles can have (it has more to do with the distribution of energy than with the particles). A cube of ice has almost one arrangement of atoms, while vapor can have an astronomical number of arrangements. Vapor has more entropy than a cube of ice; it's not about disorder. Disorder just tends to correlate with it in most cases.
@@nonabroladze8203 As the guy showed in the video, there are more ways to arrange the dice in an "ordered" manner than there are to arrange them in a disordered manner, so even if the entropy is high, the cube arrangement will get more ordered with time.
My understanding is that the jar with the dice packs in an ordered way because that decreases the total gravitational potential energy of the dice, which makes that state more likely. If the jar were in a state of weightlessness, the disorganised state would be more likely.
Hey! I like your video, but I have to ask for clarity. You treated the bowl of dice as a closed system while exerting outside forces. I believe you when you say the total entropy of the final system (inside the bowl only) increased, and that the entropy of the drill also increased. Why was the outside force ignored? In another system where outside forces are ignored, you may have decreases in entropy. But ignoring the outside force leaves the question open: is the entropy decrease in the bowl balanced by the entropy increase in the drill? Of course, you explained that when looking at possible states it is favorable to pack. Further, thanks for the video; we fail to remember that "ordered" is a human-imposed concept. If we define it as the low-entropy states, then the bowl of non-packed dice is more ordered than the packed one. But our brains like to call the packed state more ordered.
As an aside, regarding "ordered" being a human-imposed concept, I tend to think of "order" vs. disorder as recognizable patterns vs. a lack of recognizable patterns. Of course, taking macro and micro into account, there are patterns that we don't always immediately recognize as such. You can quickly get into "does a tree falling in the woods make a sound if there's no one there to hear it" territory (we know the answer to that, of course) - referring to whether patterns/order require observation to be considered patterns/order vs. random chaos. The science of the impact, or lack thereof, of observation of phenomena in the universe continues to be an intriguing topic.
One way to clarify what he's talking about is to understand entropy as hidden information rather than disorder (the latter of which I despise as an explanation of entropy). In this sense, free space has no information content, and thus no hidden information. By lowering the average free space in the local vicinity of the dice, we are hiding more information, because now any given die could have 6 other dice (or 5, if it's against a wall) directly pressing up against it. This is the massive increase in translational entropy that he's referring to.
Nice video, but entropy in the original statistical mechanics sense of S=log(Omega) is a good assumption ONLY in an energy-conserving equilibrium system, what is generally called a microcanonical condition. In the case of macroscopic spheres (or dice) you have a dissipative system in which a lot of energy is dissipated into heat during collisions, and this is why you have to provide energy by vibration. This is a steady state, not an equilibrium one, and entropy technically is not defined. Clearly you can use the dice example for pedagogical reasons, but with a bit of care. You said "any system wants to maximize entropy" - this is not true. Let me give an example. If you had a mixture of two sizes of macroscopic spheres in a shaker, they would have a strong tendency to demix, with the big spheres going to the top (the Brazil nut effect). In an equilibrium system this would be much more unlikely (especially if the size ratio of the two spheres is not too large) because mixing increases the number of configurations and consequently the entropy.
I watch your channel mainly for how you make science topics fun and engaging with clever and novel experiments, but then every so often you deliver powerful insights expressed really intuitively and clearly. Thank you. Excellent work.
This is not entirely correct and in fact could be incorrect. It's not black and white. We have to keep this simple and not try to manipulate individual concepts, but instead focus on the combined perspective. Configurational entropy and positional entropy both need to be considered and calculated. If the extent of free volume significantly affects the number of microstates that are accessible, then it's possible the organized dice may have more entropy due to positional freedom. If the volume is more constrained, randomly oriented dice will have more entropy. We cannot make the claim that one has more than the other without proper measurement.
If we follow this guy's process and keep the volumes of the two systems (1/higgly-piggle dice and 2/ordered dice) constant then what he shows us (higgle-piggle / ordered) each represents a single microstate/config of the same macrostate. Both systems can reach all the other's possible microstates given some input of energy. They have the same number of possible microstates and the same macro. The entropy of both systems is the same. (If we are allowed to reduce the volume in the ordered dice system, then it has lower entropy.)
I'm joining team "I don't agree", because the dice in the ordered state have fewer opportunities to change position. If you keep doing that start-stop motion, they no longer move, whereas in the disordered state they do move. Meaning the disordered state has more opportunities for the dice to move and therefore a higher level of entropy.
This really changed my understanding of entropy, and I appreciate the perspective that life is an emergent property of a system that has evolved to increase entropy while efficiently dissipating energy. The variety of life on Earth shows that a relatively small number of different chemical elements can assume quite a variety of arrangements. Carbon is entropy's strongest proponent in that realm.
1:45 Well Said Remember Folks, AI will give you the Popular Answer, not Necessarily the Correct Answer. [Usually Popular = Correct AND you also want to Double Check some other way]
I always understood entropy as the drive towards a stable state where no work is done, e.g. when all heat is evenly distributed in the universe, there is no transfer of energy (because heat doesn't pass from cooler to hotter - thank you to Flanders and Swann for their song about the laws of thermodynamics) and everything is equally hot. The dice stop moving once they are aligned because they are then at their most stable configuration. In an earthquake, things fall down because being on the floor is their most stable state.
Entropy? :D This is more about a constant field of gravity acting on the contents, locking them in place, and the constant work input from agitation that allows each die/bead to move out of its current local minimum and fall into its new place. Once slotted into position, gravity and force from their neighbors restrict them from moving out of place. This effectively locks them in place. It just so happens that the most 'restrictive' spaces are also the most space-efficient structures (highest packing efficiency), and crystalline structures are quite space-efficient. Hence spheres forming a face-centered cubic structure instead of body-centered cubic or other patterns. Given the magnitude of the forces at play, they drown out any entropy changes you see in the system. Try doing this test without agitation and gravity messing with the system. Try suspending them in a solution of similar density. If you let the solution evaporate or drain slowly, it might shed a decent light on crystal formation too. (Drain too fast and they will look more disordered, and 'crumbly' too.) :D
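For reference on the packing-efficiency remark above, the standard packing fractions for equal spheres (well-known geometric constants, just computed here):

```python
# Fraction of space filled by equal spheres in common lattices.
from math import pi, sqrt

fcc_hcp = pi / (3 * sqrt(2))   # ~0.7405, the densest possible packing
bcc = pi * sqrt(3) / 8         # ~0.6802
simple_cubic = pi / 6          # ~0.5236

print(f"FCC/HCP: {fcc_hcp:.4f}   BCC: {bcc:.4f}   simple cubic: {simple_cubic:.4f}")
```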
I prefer to think of entropy as pointing in the direction of the arrow of time - jiggling and jostling over time will tend to produce the most common configurations. This only holds if all configurations are equally available and aren't sticky. Natural filters in life put a non-random bias against this jostling, from your example here to the gravity wells of stars and the motion of beaches, or Life's Ratchet in the cellular machinery. If once a random state occurs that locks in a position (like your orderly dice that can no longer move) then they will tend to order up even if order up states are less common, because they stick. No matter how slow the decay is or how uncommon the state of a random decay is, Schrodinger's cat will die when it happens and will stay dead (cat will randomly die but not randomly resurrect), so over long time you are much more likely to find a dead cat than a live one. Entropy was meant to describe states that are fully free to interact in the system, not natural filters. It doesn't break physics, it just doesn't fit that model. If you think of entropy not as of particle concentration or arrangement, but as the most likely outcomes over long times scale including non-random natural filters, then a compact star or beautifully orderly beach sand are actually low entropy as the dead end of the jostling options. This kind of one-way-ness also applies in other realms like gambling - where no matter how much you win you still can still risk all and lose all, but once you are broke you cannot un-lose. The house always wins even if you have slightly favorable odds, because even a less common disaster cannot be recovered while all cumulative success is just one bad roll from being wiped away.
This is why science is so great. It's about studying what's actually going on regardless of preconceived ideas or beliefs. Properties of the Universe can get confusing (especially when there are multiple things going on and many variables). Science is studying each one, recording data, repeating, recording, etc. Humans love to see one event and assume they know everything from it... we know that's not the best way, yet here we are in 2024 with tens of millions who would rather believe what they want to more than what is real.
yep - If we follow this guy's process and keep the volumes of the two systems (1/higgly-piggle dice and 2/ordered dice) constant then what he shows us (higgle-piggle / ordered) each represents a single microstate/config of the same macrostate. Both systems can reach all the other's possible microstates given some input of energy. They have the same number of possible microstates and the same macro. The entropy of both systems is the same. (If we are allowed to reduce the volume in the ordered dice system, Then it has lower entropy.)
My thermodynamics professor had something he repeated constantly, "now we just count the states." Right, "just". The text (which he wrote and distributed free) had an explanation of how we knew all the states had been counted. Entropy was simple, the more states possible the higher the entropy and the more probable that the system would be in one of those states. Amazingly most of the laws of gases, liquids and even solids could be reduced from this "simple" counting of states. I'm not going to attempt counting the states of dice in random vs. ordered configurations but I'm pretty sure that from this perspective there are more states of randomly ordered dice. My lesson I learned is that the properties which apply to atomic level particles are not necessarily those that apply to macro sized objects.
@@JK_Vermont Not the issue. The video was claiming that volume was created. Not that usable volume was created or more useful volume was created. Details matter. "It's physics for Godsakes".
Great topic! I always learn something, realize how little I know about something and not understand something altogether in every episode, it’s awesome. Keep up the great work!
I run a Precious Plastic recycling workspace here in Italy in my spare time. Usually I wash the plastic I collect in my washing machine (the European kind, with a horizontal shaft), in a pillowcase, when doing laundry. Often I have some boxes with sliding covers, the ones that contain carbide lathe inserts, or telescopic boxes for drills, and so on. Sometimes I also have to wash child-safety bottle caps (from friends' and colleagues' vaping fluids), made in two parts. I break the external part to separate them, and then put them in the pillowcase. Practically every time, all of the parts come out of the washing machine re-assembled. It's amazing.
Since when does ChatGPT "represent most people"? STEM bros really crumble at the thought of social research, huh? You could've easily made a poll of your audience or cited popular definitions of entropy in recognized dictionaries but you chose the laziest way possible, somehow equating an LLM with "most people" as if it's a fact.
I saw it differently. The random dice in the glass have a higher potential energy than the organized, stacked state; hence the jiggling tends to organize the dice.
I love how your simple questions and demonstrations arouse the curiosity of so many. I never considered the entropy of a cup of dice before, for example.
Finally someone who gets it and can actually explain it in a good way.. Damn, not even my university profs got the concept of entropy! The whole "you can't unmix the marbles!" and other demonstrations of randomness to explain entropy completely ruined the idea in the public mind. We need more of these videos!
4:07 You say there are no forces acting on the particles making them coil together, but in the simulator you used, the electromagnetic force is simulated, which is what's pulling them together
I've quite recently finished writing my PhD thesis on entropic phase transitions in some specific systems. I have a whole section of a chapter devoted to "trading" one type of entropy for another type for a net entropic gain. I find this phenomenon very fascinating and almost each one of my scientific papers has a short, "intuitive" discussion how it is applied to the case on hand.
Cool video! I have seen many confusing explanations of entropy, but one of the better ones is "Entropy states 'that which is most probable will probably happen.'" In a random system, the probabilities reflect that in disorder. If the rules of the system are ordered, the result will generally be ordered. In application it's a little more complicated than that, but in general essence it seems to be right. So I guess the entropic force is the tendency of the system to align with its underlying probabilities.
I think the way that communicates best how I understood entropy is "The inverse deviation from an object's state at the end of time," meaning, the more entropy increases, the more closely an object will reflect the state it would be in as time approaches infinity, and that entropy is used as a way to gauge how far away an object is in an initial state from the state it will reach at the end of time
You're forgetting the most important part of the equation, you. Conscious intervention. You interfered. You added order to a disordered system. You put it back together. You pulled the rubber band. If you haven't looked into Emergence Theory, you should.
The second law of thermodynamics applies to closed systems, not all systems in the universe, nor all theoretically possible universes... But an interesting video that will be fun to think about.
Just for this... I subscribe. I didn't expect to learn anything new... but this gives me a new understanding of how to perceive the world. No one ever explained entropy to me in class, and I guess I missed out a little in my understanding of this concept.
I’m pretty sure the example of the rubber band stops being only entropic force the moment the band has to be stretched rather than just loosely in a line.
This video reminds me of my younger days back in the 90s. I used to prepare for raves by visiting a stimulating trader I knew. I was still young and naive and didn't understand why this friend would always take out a large bowl filled with dice and marbles. He explained that sometimes its not enough to hide the screwdriver. As my experience grew and I met different people I understood, Dude was just protecting his electronics with that bowl.
In thermodynamics, we were taught that entropy is the amount of energy that is not available to do useful work. As such then, something that is not available cannot create a force. External work was done on the dice and nails to get them into the ordered pattern. They did not get that way by themselves.
Fun fact: entropy is why loose wires/strings get tangled all the time. One thing always helps: connect one end to the other; this restricts the number of variations in knots it can have (to 0).
I'm not an expert on entropy, but the easier explanation in my mind is that the aligned dice simply have more possible states available. Consider 100 dice stacked in the bottom of the container - each cube can be swapped for any other so you have a total of 100! possible states. Now imagine that 20 of them are sitting on the edge - each of the cubes lying flat can be swapped with each other flat cube (80! possible states), and each on-edge cube can be swapped with each other on-edge cube (20! possible states). So the mixed orientation configuration has 80! x 20! possible states which is SIGNIFICANTLY fewer states and therefore lower entropy. Adding additional orientations here lowers the entropy. (Yes, this is a simplification of the situation) I'm not yet totally convinced that the "free volume" concept is very helpful in understanding the increase in entropy with the dice here even though the difference in volume will certainly play a role in the number of states. Although it's pretty surprising, I wouldn't say there is anything "mysterious" about this "force". Where there are MANY more possible ways for a system to do something, the system will tend to eventually do that thing. At least, that's how I like to think about it.
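A quick check of the factorial comparison above, done with log-factorials (math.lgamma) so the numbers stay printable:

```python
# log10(100!) vs log10(80! * 20!) for the swap-counting argument above.
from math import lgamma, log

def log10_factorial(n):
    return lgamma(n + 1) / log(10)

all_flat = log10_factorial(100)                       # one orientation class
mixed = log10_factorial(80) + log10_factorial(20)     # two orientation classes
print(f"log10(100!)      = {all_flat:.1f}")
print(f"log10(80! * 20!) = {mixed:.1f}")
print(f"ratio ~ 10^{all_flat - mixed:.1f}")           # = binomial(100, 20) ~ 5e20
```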
When you heat a steel spring, its spring force decreases as the steel gets softer. When you heat a rubber band, its spring force increases, because the molecular chains wiggle faster.
When you jostle the dice, their gravitational potential energy is being dissipated as sound and heat, so they tend towards a "ground state" where their potential energy is minimized (by being packed as low as possible). The entropy of the dice does indeed decrease because there is only one "ground state" (up to rotation of each layer). It's possible for the dice's entropy to decrease because they are not a closed system - you are not counting the microscopic degrees of freedom where the sound and heat end up as part of the system, and if you did, total entropy would increase. Another way of looking at it is if there were no energy dissipation into those microscopic degrees of freedom, then the dice collisions would be perfectly elastic and they would keep bouncing around instead of settling.
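A quick sketch of that last point: with a coefficient of restitution below 1 the bounce energy decays geometrically and the dice settle, while a perfectly elastic value of 1 never settles. The numbers are arbitrary:

```python
# Bounce height after each collision: h_next = e^2 * h (energy scales with e^2).
def bounce_heights(h0, e, n):
    h, heights = h0, []
    for _ in range(n):
        h *= e ** 2
        heights.append(round(h, 4))
    return heights

print("e = 0.6:", bounce_heights(0.05, 0.6, 8))   # settles quickly
print("e = 1.0:", bounce_heights(0.05, 1.0, 8))   # would bounce forever
```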
While the entropic force is a major factor in a rubber band's retraction, it's not the sole force at play. There are also:
Enthalpic Forces: These arise from changes in the internal energy of the rubber band when stretched. Stretching stores energy in the polymer bonds, and releasing the band allows this energy to dissipate.
Cross-linking: The polymer chains in rubber are often cross-linked, creating a network that resists deformation. This contributes to the overall restoring force.
The Dominant Force: In most everyday situations, the entropic force is the dominant factor in a rubber band's behavior. This is why heating a rubber band (increasing entropy) causes it to contract, and cooling it (decreasing entropy) makes it stretchier.
Important Note: The balance between entropic and enthalpic forces can change under extreme conditions (e.g., very high or low temperatures) or with different types of rubber.
In summary: The entropic force is a significant factor in a rubber band's tendency to return to its relaxed state, but it's not the only force involved. Enthalpic forces and cross-linking also play a role, although their contribution is usually smaller in typical scenarios.
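To make the entropic part concrete, the simplest textbook estimate is the ideal freely-jointed chain in the Gaussian limit, where the retracting force is F = 3·k_B·T·x/(N·b²) and grows with temperature; the chain parameters below are illustrative guesses, not properties of a real rubber band:

```python
# Entropic spring force of one ideal polymer chain, F = 3*k_B*T*x/(N*b^2).
k_B = 1.380649e-23   # J/K
N = 1000             # segments per chain (assumed)
b = 5e-10            # segment length, m (assumed)
x = 5e-8             # end-to-end extension, m (assumed)

for T in (280.0, 300.0, 350.0):
    F = 3 * k_B * T * x / (N * b * b)
    print(f"T = {T:.0f} K  ->  F per chain ~ {F:.2e} N")
# The force rises linearly with temperature: the signature of an entropic
# spring, and why a loaded rubber band contracts when heated.
```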
This video goes about 95% of the way to explaining Erik Verlinde's Entropic Gravity, which is impressive because Leonard Susskind has a hard time explaining it in plain language.
An interesting fact is that with the most symmetric geometrical shape, the sphere, you cannot shake your way to a single ordered packing. This is because there are two ways of 'closest sphere packing' (cubic and hexagonal). Both packing structures will occur when you pile up balls/spheres and shake them. They will form random groups within the pile of spheres, and the boundaries between those groups will only get more defined the more you shake.
Keep in mind this analogy does not transfer over to the naturalistic model. Entropy does not produce information, far less a progression to a higher state. It does not increase the complexity of design, it degrades it - the same with information, as information theory states. Probability does not produce rational outcomes in any reliable sense; believing that it would is a belief in magic, aka the core of the flawed naturalistic model.
I always looked at entropy as the ending of something, or decay, but there's more to it. The heat death of the universe was the main example of entropy I had used to try to figure out what entropy is/does. Even in this video, towards the beginning, it seemed to me more like the tendency to even out, before you explained it more. So what I get from this is that entropy is the tendency toward, or force of, changing to a state of more potential from a state with less potential.
You can still call entropy disorder, but just be careful of what you define as your system. For example, the ordered dice plus the volume they free up have higher entropy, but if you consider the system as only the dice, and therefore don't count that bit of volume the ordering "creates", the entropy is lower. On a microscopic level it's easy to call entropy disorder, but on a macroscopic level defining systems is hard. The example of the elastic band and the curled string of particles was really cool!
Entropy is not chaos, but the measurement of the chaos that a location or event contains. Chaos is not randomness; instead, it takes all expected data and outcomes and rearranges them, slightly or greatly, into an unpredicted state.
Thank you very much for this. In calculations of steam engines, entropy is fairly simple. In theory, it leaves me repeatedly perplexed.
This is all good, but can I melt a steel object by placing an incredibly hot tungsten ball on it?
Notice the dice align from outside inward as they themselves mimic the shape of the cylinder in cylindrical layers
So all I have to do is spin my room?!
You spin your room right round baby right round
@@btd5311 like a record baby right round round round...
Or i rock ur world
If it's in boxes yes, or shake it really hard if its spheres. 😁
Um should i spin the world?! Wait a minute... its already spinning... Wait what
Does entropy also explain why elementary school students never line up in a straight line?
Did you try shaking them?
Random and chaotic.
Bro…it might.
Entropy was devised before we even knew there were atoms; they had some "energy available to do work" thing going on.
Doesn't work either : case in point oil does not mix with water, and homogeneous emulsions separate into distinct layers.
The first use of entropy came from the observation that heat transferts spontaneously goes from hot stuff to cold stuff, and that this puts a hard limit on how efficient a thermal engine can be.
The modern understanding of entropy is that it is a measure of the number of microscopic states a system can be in based on the values of observables that can be measured at our scale.
The channel alphaphoenix made a really good video recently if you want a more in-depth explanation.
"homogenous" is closer but not quite right. I think "likelihood" is a better description but still not perfect. If you allow a system to evolve undisturbed for an arbitrarily long amount of time, taking observations at random times, higher entropy states are more likely to be observed over time.
To me the best description of entropy is that it's the act of something reaching its most stable state. Like with these dice: organized like that, the dice are more stable than when they are disorganized. Rust is chemically more stable than iron.
Yeah a splattering of blue and red paint has less entropy than a smooth coating of purple paint (or however the paint combines)
2:35 sounds fishy... please check:
1. Microstates in the Boltzmann formula have the same energy. Here, if any die goes into the free space above, the system's energy increases by the work done against gravity. This breaks the condition of the microcanonical ensemble.
2. Microstates are equally probable, however we don't see dice jump into the free space of their own accord, therefore the shown state has a much larger probability of being occupied, and this breaks the condition of the microcanonical ensemble.
I agree with your description, but bear in mind that this system wouldn't be described by a microcanonical ensemble, because it is not isolated. In fact, Boltzmann entropy would not be the correct expression for entropy in this case I reckon.
This looks more like a (N, g, T) ensemble, where available states are definitely not iso-energetic. Further thought is necessary on my part
@@andreacosta7712 I guess you meant the canonical NVT ensemble. I agree with you, it should be more suitable in this case. However, I played along with the author, who decided not to introduce the Gibbs formula. And even if the energy distribution of microstates were considered, my point (2) would still hold.
What I'm actually trying to get at is that the author's words only make some sense if he never stops shaking the jar, and even in that case it could be hard to say anything specific about the energy distribution of states and the resulting entropy.
Definitely, any kind of thermodynamical reasoning only holds if the shaking is ongoing, aka constant temperature.
What I meant was indeed NgT. Volume is not constant, but gravity is. In a way, a constant-force ensemble is equivalent to a constant-pressure one, as in NpT, which is described by Gibbs free energy.
My general point is that in this demo, the interplay of energy and entropy is what actually explains the phenomenon, so reasoning in a microcanonical framework does not give you the full picture, because entropy is constant in it. But yes, point 2) still holds.
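For readers following this thread, these are the standard textbook expressions being contrasted here (general statistical mechanics, not something taken from the video):

```latex
% Boltzmann entropy -- microcanonical ensemble (isolated system at fixed energy E),
% where every accessible microstate is equally likely:
S = k_B \ln \Omega(E)

% Gibbs entropy -- canonical ensemble (system exchanging energy with a bath at
% temperature T), with Boltzmann-weighted probabilities:
S = -k_B \sum_i p_i \ln p_i ,
\qquad
p_i = \frac{e^{-E_i/(k_B T)}}{Z},
\qquad
Z = \sum_i e^{-E_i/(k_B T)}
```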
🤭 I'm not smart enough to understand
@@JackAtlass Basically, these are bad examples using the wrong math, failing to state the things he wants to state even though what he's saying is true. He's wrong in every step of explaining why it's true.
This is a wrong explanation.
The spontaneous orderly arrangement in the example is due to the effect of gravity. Dice can lower their overall center of gravity when arranged neatly, and the same is true for nails. If gravity is removed, such as by astronauts conducting experiments in the space station, you will see that neither the dice nor the nails will spontaneously organize neatly.
In the animation the gravity is switched off, so you have a point.
@@andymouse Maybe it approximates the dark matter flowing over and through us. Galaxies start out as ring galaxies and "acquire" arms and bars and get warped along their rotational equator and things, due to perturbations from other stimuli. An in-space simulation would be all the dice in the cylinder in a random meander, similar to a space containing a gas. It would take longer than it takes for tar to drip to get the data.
But if you remove the gravity of Earth or any other massive object the dice themselves would still gravitationally attract each other and they would form a clump with mostly face to face contact, very similar to the way they are arranging in the bottom of the tube.
Shaking in combination with gravity, yes.
@@aSphericalCow618 It's shaking plus gravity. Otherwise they sit as at start of video, further apart and unoriented.
Now try the same experiment without gravity pulling all items into the same direction.
Entropy is a function of gravity
No, it is actually only a function of the number of possible states. Just check the Boltzmann equation
'Amount of free volume per dice increases'. I thought that the amount of free volume is the same before and after aligning the dice in a finite system; it is just that the same 'old' free volume, formerly scattered between the dice, is now concentrated into one continuous 'new' volume above them.
You are correct
yep - If we follow this guy's process and keep the volumes of the two systems (1: higgledy-piggledy dice and 2: ordered dice) constant, then each of the things he shows us (higgledy-piggledy / ordered) represents a single microstate/config of the same macrostate. Both systems can reach all the other's possible microstates given some input of energy. They have the same number of possible microstates and the same macrostate. The entropy of both systems is the same.
(If we are allowed to reduce the volume in the ordered-dice system, then it has lower entropy.)
Yep, exactly. Though my chemist heart immediately made me instinctively want to say ‘surface tension’😂
Yep
Once the universe reaches full entropy, if ever, that will be perfect order. Or as close as you can get to it. How people come to the conclusion that entropy is chaos is beyond me
To my understanding, entropy describes particles' "desire" to reduce their energy to reach the most stable state. The dice prefer the tightly packed state because there they have the lowest amount of potential energy. It also takes a lot more energy to break that state apart: the previous unorganized state could be broken apart by the tapping/rotating, but the new state is hardly affected by it.
Indeed. When you have packed particles and you free them into a bigger room, they redistribute to reduce their energy. In the dice scenario the same thing happens: they rearrange to reduce their energy.
isn't that enthalpy?
That is enthalpy. Entropy is simply disorder.
2:54 It would have been a good idea to start the video with “entropy is defined as”
The trick here is not to compare the original disordered dice configuration to the final ordered one, but rather to imagine the distribution of dice orientations while the container is being shaken. At equilibrium, most of the dice align and the ones near the top are constantly and rapidly bouncing around.
If the container is not being shaken, the entropy of both configurations is near zero because the dice have almost no probability of changing state. And analyzing the situation becomes harder because the disordered configuration will have a higher energy level, due to potential energy, than the ordered configuration.
This is true, but a more practical route would involve calculating the partition function. We need to incorporate positional and configurational entropy due to the limited volume. This would be a more detailed approach and would incorporate both positional and potential-energy considerations.
If we replaced the dice with smaller particles, like those Einstein first observed Brownian motion in, it seems clear the unpacked, disorganized dice are much higher entropy. Random disturbances are much more likely to lead to an unpacked vs. packed state. For every one packed permutation, there is an enormous number of translated identical ones. Am I missing something?
So the few at the top have more entropy but most are trapped in the structure and have less.
All those squares make a circle ... all those squares make a circle ... all those squares make a circle. ..
Every tweaker spontaneously reached for the dice and knocked their screen to the ground.
He's succeeded in circling the squares!
I also enjoy Dragonball z abridged
They all align except for that CHAD nail at the bottom at 0:57
Sigma nail behaviour
Sheeples, am I right?
Very based. "I don't conform to your laws of physics, you science bitch!" - Nail Chad.
I'm nothing like yall 😎
that was actually just a nail like object
3:33 This is semantically untrue. The reason the rubber band returns to an unstretched state is because of electrostatic attraction and repulsion between the molecules within the rubber, which respectively resist stretching and compression, and thus effectively function as an elastic force. You can explain this as a mechanism that acts to increase entropy, but this does not change the fact that the force is elastic in nature.
How does that argument account for the temperature dependence of E?
Reminds me of how the initial state of the early universe would have appeared about as random/disordered as possible yet gravitationally speaking was very low entropy.
You’re living in a dream world neo.
Much ❤ Love
🌎🌏🌍☯️⚡️
World🌞Peace
but it's hot af.
The initial state of the universe is extremely ordered: all matter existing at a single point. There is only one configuration for that to be the case.
@@JamesDevlin-d4z It wasn't a single point. The universe before the big bang was still infinitely large, it was just much, much more tightly packed. At least, according to the widely accepted theories.
In the beginning God created the heaven and the earth.
"When enough chaotic and random events happens. Eventually, it leads to very predictable events."
Thats quite philosophical. So after enough observations, everything will be understood.
We are imperfect to understand all. Only God knows.
Yet the definitions of chaotic and random preclude the possibility of predicting. So are the events truly chaotic or random? If they are predictable then they can't be.
@@PhokenKuul No, not if you are part of the system being analyzed yourself eg human nature.
@@PhokenKuul Made me realize that I find it philosophical because of how paradoxical it is.
@@PhokenKuul This is incorrect. There is nothing in the definitions of chaotic or random that precludes prediction. Dice rolls are random, but I can accurately predict that each number will appear 1/6th of the time in a large number of rolls.
Ah entropy, the 3rd wheel of thermodynamics
three wheel bicycle. 😂
Third wheel is what makes it a tricycle
More like a fifth wheel, you know, the company nobody wants around 🤣
Quite the opposite. It helps you so much to find properties of many systems.
Ah
0:10 As an ITALIAN I recognise the “why” 😂
🤌
The ordered jar has less entropy: with the dice compacted, there are fewer positions they can actually occupy while still looking the same in bulk; but the total entropy of the Universe increased with the heat of the friction and collisions. Just like life, we may seem to violate the laws of thermodynamics with all our growing and reproducing, but it's actually the opposite, we spread heat more easily than still rocks.
The entropy of the dice themselves decreased, but the entropy inside the jar as a whole increased as a result of compacting the volume. Honestly though, he is using semantics.
That's incorrect, energy neither increases nor decreases; the universe is moving towards the most statistically likely configuration, which is the true definition of entropy.
*The 3 levels of entropy from less correct to more correct*
1: A system moving in the direction of a more disorderly state
2: A system moving towards a state of equilibrium
3: Energy and matter displaying the most statistically likely configurations
*Creating the arrow of time by moving towards those configurations; configurations can be more or less statistically likely, and the most likely ones are what entropy points towards
@@notaffiliatedwith7363 It's not about geometry and order, it's about the most likely configuration in a given system with its laws of physics at work
@@markmuller7962 Yep, "order" and "disorder" are subjective, so they do not always match with the entropy. A good example is oil and water separating over time when with most other liquids we see the opposite.
@@jacobblaustein7779 If the jar's entropy has increased, it's due to the heating from the wild shaking. The jar is not a closed system. The explanation is not valid, or at least not complete, as long as heat, sound noise, and the random shaking are not considered.
I feel like I've been brainwashed
Action Lab reversed the scale conventionally used to measure entropy. The disordered box of nails should be said to have relatively higher entropy than the more ordered box.
he's wrong - If we follow this guy's process and keep the volumes of the two systems (1: higgledy-piggledy dice and 2: ordered dice) constant, then each of the things he shows us (higgledy-piggledy / ordered) represents a single microstate/config of the same macrostate. Both systems can reach all the other's possible microstates given some input of energy. They have the same number of possible microstates and the same macrostate. The entropy of both systems is the same.
(If we are allowed to reduce the volume in the ordered-dice system, then it has lower entropy.)
AL is an idiot.
*The 3 levels of entropy from less correct to more correct*
1: A system moving in the direction of a more disorderly state
2: A system moving towards a state of equilibrium
3: Energy and matter displaying the most statistically likely configurations
*Creating the arrow of time by moving towards those configurations; configurations can be more or less statistically likely, and the most likely ones are what entropy points towards
(3) needs a tweak: the whole point is that all microscopic configurations are equally statistically likely (at the same energy), but there are just so many more with the "right" macroscopic variables that it always looks right.
e.g., the configuration of N2 and O2 molecules in the room (all your rooms) right now has the same likelihood as the one with all the O2 on the left side and all the N2 on the right side, but there is only one state that looks like the latter, and more than (Avogadro's number to the sixth power) factorial that look like the former, and (N_A^6)! >>> 1.
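A quick way to see the size of that imbalance with numbers small enough to compute directly (a hypothetical toy with 30 molecules of each gas, not the Avogadro-scale room above):

```python
# 2N distinguishable molecules, half "N2" and half "O2", placed into two halves
# of a room (N slots per side). Every specific placement is equally likely;
# what differs is how many placements produce each macrostate.
from math import comb

N = 30  # molecules of each kind (toy number)
total = comb(2 * N, N)              # ways to choose which N molecules sit on the left
fully_sorted = 1                    # exactly one choice puts all O2 left, all N2 right
half_and_half = comb(N, N // 2)**2  # choices giving a 50/50 mix on each side

print(f"total ways to fill the left half: {total:.3e}")
print(f"ways that look 'fully sorted':    {fully_sorted}")
print(f"ways that look 'well mixed':      {half_and_half:.3e}")
```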
The "arrow of time" is not necessarily a more correct part of your difinition. An interesting idea, but not demonstrable. Sean Carroll (the physicist who proposed this) is a *theoretical* physicist and his idea arguably enters into an unprovable area.
@@PsymonHawkeye Actually the arrow of time is a side addition to the third concept just to make it clearer. And btw it's not a matter of hypothesis, the main role of entropy in the arrow of time is a theoretical physics *fact* aka scientific theory which doesn't have anything to do with the English popular use of "theory". It's a scientific theory, learn how science works
@@DrDeuteron Yes everything is emergence but the definition was getting too long already for a YT comment
@@markmuller7962 agreed. Idk, that was my favorite part of statistical mechanics.
I use it to dissuade lottery players:
Me: “ you should pick 1,2,3,4,5,6”
Them: “why? That will never come up!”
Me: “exactly.”
1:30 most people don't know what entropy is at all though
Every time I'm pretty sure I have a solid understanding of entropy, something comes along to prove me wrong! Nicely done.
What was said in the beginning of the video, that the 'ordered' systems (dice or spheres) have more entropy than the disordered ones, is not correct (if we look at the isolated entropy of the packing of the dice and spheres). You might consider correcting this.
These systems do not reach a state of maximum entropy, but the minimum of an energy potential (like the Gibbs potential in thermodynamics). For the dice and the spheres this potential contains a gravitational energy term, +m*g*h, where m is the total mass of the dice or spheres, g the gravitational acceleration and h the height of the center of mass of all the objects in question. The second term is an entropic one, -c*S (c is some constant depending on the thermal energy, or the shaking intensity in this case). The system then tends to adopt a configuration of minimum m*g*h - c*S. Close packing reduces h, this term dominates, and therefore the dice or spheres adopt a packing where h is as small as possible even if S is not at its maximum. Here minimal h and maximum S cannot be reached simultaneously.
To understand this, consider the case of no gravity (i.e. g=0): then only the entropic term -c*S is left in the potential and the system will adopt the state of maximum entropy (i.e. all dice and spheres floating randomly in the cylinder) at the mildest shaking (i.e. whenever c is not zero). The second case is strong, continued shaking. In this case c will be very large and m*g*h will become negligible (basically g can be assumed to be 0). Then again the dice and spheres will 'float' randomly in the cylinder due to the shaking.
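A toy numerical version of the m*g*h - c*S argument above, with made-up values for the mass, heights, entropies and the shaking constant c, just to show which configuration minimizes the potential in each regime:

```python
# Compare F = m*g*h - c*S for a "packed" and a "loose" arrangement of the same
# dice, at mild and strong shaking. All numbers are invented for illustration;
# only the trend matters.
g = 9.81   # m/s^2
m = 0.30   # kg, total mass of the dice

configs = {
    #          h (center-of-mass height, m)   S (arbitrary entropy units)
    "packed": (0.05,                          1.0),
    "loose":  (0.09,                          3.0),
}

for label, c in [("mild shaking (small c)", 0.02), ("strong shaking (large c)", 1.0)]:
    print(label)
    for name, (h, S) in configs.items():
        F = m * g * h - c * S
        print(f"  {name:6s}: m*g*h - c*S = {F:+.3f}")
    print()
# With small c the packed configuration has the lower potential; with large c
# the loose, high-entropy configuration wins, matching the comment's argument.
```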
If he said 'maximum entropy' then he's wrong on that, as you say. He's also mixed up on another count, I'd say: if we follow the guy's process and keep the volumes of the two systems (1: higgledy-piggledy dice and 2: ordered dice) constant, then each of the things he shows us (higgledy-piggledy / ordered) represents a single microstate/config of the same macrostate. Both systems can reach all the other's possible microstates given some input of energy. They have the same number of possible microstates and the same macrostate. The entropy of both systems is the same.
(If we are allowed to reduce the volume in the ordered-dice system, then it has lower entropy.)
Now I don't understand entropy.
😂
Basically the smart way of saying disorder or mess
@@nonabroladze8203 No, it's not mess or disorder; those are just the more probable outcomes. It just measures the number of arrangements particles can have (it has more to do with the distribution of energy than with the particles). A cube of ice has almost one arrangement of atoms while vapor can have an astronomical number of arrangements. Vapor has more entropy than a cube of ice; it's not about disorder. Disorder just tends to correlate with it in most cases.
@@nonabroladze8203 As the guy showed in the video, there are more ways to arrange the dice in an "ordered" manner than there are to arrange them in a disordered manner, so even if the entropy is high the cube arrangement will get more ordered with time.
@azouitinesaad3856 ty
I read it that way in a book and that's how I knew it 😅
My understanding is that the jar with the dice packs in an ordered way because that decreases the total gravitational potential energy of the dice, which makes that state more likely. If the jar were in a state of weightlessness, the disorganised state would be more likely.
Hey! I like your video, but I have to ask for clarity. You treated the bowl of dice as a closed system while exerting outside forces. I believe you when you say the total entropy of the final system (inside the bowl only) increased, and the entropy of the drill also increased. Why was the outside force ignored? In another system where outside forces are ignored you may have decreases in entropy. But missing the outside force leaves the question open: is the entropy decrease in the bowl balanced by the entropy increase in the drill? Of course, you explained that when looking at possible states it is favorable to pack. Further, thanks for the video; we fail to remember that "ordered" is a human-imposed concept. If we define it as the low-entropy states, then the bowl of non-packed dice is more ordered than the packed one. But our brains like to call the packed state more ordered.
As an aside, regarding "ordered" being a human-imposed concept, I tend to think of "order" vs. disorder as recognizable patterns vs. a lack of recognizable patterns. Of course, taking macro and micro into account, there are patterns that we don't always immediately recognize as such. You can quickly get into "does a tree falling in the woods make a sound if there's no one there to hear it" territory (we know the answer to that, of course), referring to whether patterns/order require observation to be considered patterns/order vs. random chaos. The science of the impact, or lack thereof, of observation of phenomena in the universe continues to be an intriguing topic.
One way to clarify what he's talking about is to understand entropy as hidden information rather than disorder (the latter of which I despise as an explanation of entropy). In this sense, free space has no information content, and thus no hidden information. By lowering the average free space in the local vicinity of the dice, we are hiding more information, because now any given die could have any of 6 other dice (or 5, if it's against a wall) directly pressing up against it. This is the massive increase in translational entropy that he's referring to.
Nice video, but entropy in the original statistical-mechanics sense of S = log(Omega) is a good assumption ONLY in an energy-conserving equilibrium system, what is generally called a microcanonical condition. In the case of macroscopic spheres (or dice) you have a dissipative system in which a lot of energy is dissipated into heat during collisions, and this is why you have to provide energy by vibration. This is a steady state and not an equilibrium one, and entropy technically is not defined. Clearly you can use the dice example for pedagogical reasons, but with a bit of care. You said "any system wants to maximize entropy"; this is not true. Let me give an example. If you had a mixture of two sizes of macroscopic spheres in a shaker, they would have a strong tendency to demix, with the big spheres going to the top (the Brazil nut effect). In an equilibrium system this would be much more unlikely (especially if the size ratio of the two spheres is not too large) because mixing increases the number of configurations and consequently the entropy.
This is what I couldn't understand when I was watching it... "aren't you putting energy into the system when you shake it?"
I watch your channel mainly for how you make science topics fun and engaging with clever and novel experiments, but then every so often you deliver powerful insights expressed really intuitively and clearly. Thank you. Excellent work.
This is not entirely correct and in fact could be incorrect. It’s not black and white. We have to keep this simple and not try to manipulate individual concepts but instead focus on the combined perspective.
Configurational entropy and positional entropy both need to be considered and calculated. If the extent of free volume significantly affects the number of Microstates that are accessible, then it’s possible the organized dice may have more entropy due to positional freedom. If the volume is more constrained, randomly orientated dice will have more entropy. We can not make the claim that one has more than the other without proper measurement.
If we follow this guy's process and keep the volumes of the two systems (1: higgledy-piggledy dice and 2: ordered dice) constant, then each of the things he shows us (higgledy-piggledy / ordered) represents a single microstate/config of the same macrostate. Both systems can reach all the other's possible microstates given some input of energy. They have the same number of possible microstates and the same macrostate. The entropy of both systems is the same.
(If we are allowed to reduce the volume in the ordered-dice system, then it has lower entropy.)
I'm joining team "I don't agree".
Because the dice in the ordered state have fewer opportunities to change position. If you keep doing that start-stop shaking, they no longer move, whereas in the disordered state they do move. Meaning the disordered state has more opportunities for the dice to move and therefore a higher level of entropy.
Greetings from Slovakia
The entropy gets even higher when you also start shaking perpendicular to the rotational shaking. It never gets ordered.
1:10 gravity, the pieces want to be at the lowest point possible so they pack together to make them as low as possible
This really changed my understanding of entropy, and I appreciate the perspective that life is an emergent property of a system that has evolved to increase entropy while efficiently dissipating energy. The variety of life on Earth shows that a relatively small number of different chemical elements can assume quite a variety of arrangements. Carbon is entropy's strongest proponent in that realm.
Which scientific articles or books were used in the video?
He made it the F--- UP!
They're in the description
1:45 Well said. Remember folks, AI will give you the popular answer, not necessarily the correct answer.
[Usually popular = correct, AND you also want to double-check some other way]
if only science class was this interesting
Your videos make every hard concept seem so fun and easy
I always understood entropy as the drive towards a stable state where no work is done, e.g. when all heat is evenly distributed in the universe there is no transfer of energy, because heat doesn't pass from cooler to hotter (thank you to Flanders and Swann for their song about the laws of thermodynamics) and everything is equally hot. The dice stop moving once they are aligned because they are then at their most stable configuration. In an earthquake, things fall down because being on the floor is their most stable state.
Entropy? :D
This is more about a constant field of gravity acting on the contents, locking them in place, and the constant work input by agitation that allows each die/bead to move out of its current local minimum and fall into its new place. Once slotted into position, gravity and the force from their neighbors restrict them from moving out of place. This effectively locks them in place. It just so happens that the most 'restrictive' spaces are also the most space-efficient structures (highest packing efficiency), and crystalline structures are quite space-efficient.
Hence spheres forming a face-centered cubic packing instead of body-centered cubic or other patterns.
Given the magnitude of forces at play, they drown out any entropy changes you see in the system.
Try doing this test without agitation and gravity messing with the system. Try suspending them in a solution of similar density. If you let the solution evaporate or drain slowly, it might shed a decent light on crystal formation too. (Drain too fast and they will look more disordered, and 'crumbly' too.) :D
This is why I love Thermodynamics. This is not only about "X" particles or things moving and doing things, it's something more fundamental.
(0:02) Let me guess... You bought a cheap bag of D6 dice that needed to have their dots filled in with a marker. 😉
Why else would you need 💯 dice anyway?
I prefer to think of entropy as pointing in the direction of the arrow of time - jiggling and jostling over time will tend to produce the most common configurations. This only holds if all configurations are equally available and aren't sticky. Natural filters in life put a non-random bias against this jostling, from your example here to the gravity wells of stars and the motion of beaches, or Life's Ratchet in the cellular machinery. If once a random state occurs that locks in a position (like your orderly dice that can no longer move) then they will tend to order up even if order up states are less common, because they stick. No matter how slow the decay is or how uncommon the state of a random decay is, Schrodinger's cat will die when it happens and will stay dead (cat will randomly die but not randomly resurrect), so over long time you are much more likely to find a dead cat than a live one. Entropy was meant to describe states that are fully free to interact in the system, not natural filters. It doesn't break physics, it just doesn't fit that model.
If you think of entropy not as particle concentration or arrangement, but as the most likely outcomes over long time scales including non-random natural filters, then a compact star or beautifully orderly beach sand is actually low entropy as the dead end of the jostling options. This kind of one-way-ness also applies in other realms like gambling - where no matter how much you win you can still risk all and lose all, but once you are broke you cannot un-lose. The house always wins even if you have slightly favorable odds, because even an uncommon disaster cannot be recovered from, while all cumulative success is just one bad roll from being wiped away.
Good Morning!😃
I saw your journey on the Alan Pan video recently, and watched some of your content after. You should do more YouTube videos.
This is why science is so great. It's about studying what's actually going on regardless of preconceived ideas or beliefs. Properties of the universe can get confusing (especially when there are multiple things going on and many variables). Science is studying each one, recording data, repeating, recording, etc.
Humans love to see one event and assume they know everything from it... we know that's not the best way, yet here we are in 2024 with tens of millions who would rather believe what they want to than what is real.
Free volume is not created. The many random small amounts of volume have been combined into one larger volume but no extra has been added.
THIS.
yep - If we follow this guy's process and keep the volumes of the two systems (1: higgledy-piggledy dice and 2: ordered dice) constant, then each of the things he shows us (higgledy-piggledy / ordered) represents a single microstate/config of the same macrostate. Both systems can reach all the other's possible microstates given some input of energy. They have the same number of possible microstates and the same macrostate. The entropy of both systems is the same.
(If we are allowed to reduce the volume in the ordered-dice system, then it has lower entropy.)
My thermodynamics professor had something he repeated constantly: "now we just count the states." Right, "just". The text (which he wrote and distributed free) had an explanation of how we knew all the states had been counted. Entropy was simple: the more states possible, the higher the entropy and the more probable that the system would be found among those states. Amazingly, most of the laws of gases, liquids and even solids could be deduced from this "simple" counting of states.
I'm not going to attempt counting the states of dice in random vs. ordered configurations but I'm pretty sure that from this perspective there are more states of randomly ordered dice.
My lesson I learned is that the properties which apply to atomic level particles are not necessarily those that apply to macro sized objects.
Except with the disordered dice the volume is largely inaccessible because each open volume is less than the size of a die.
@@JK_Vermont Not the issue. The video was claiming that volume was created. Not that usable volume was created or more useful volume was created. Details matter. "It's physics for Godsakes".
Great topic! I always learn something, realize how little I know about something and not understand something altogether in every episode, it’s awesome. Keep up the great work!
I run a Precious Plastic recycling workspace, here in Italy, in my spare time.
Usually I wash the plastic I collect in my washing machine (the European kind, with a horizontal shaft), in a pillowcase, when doing laundry.
Often I have some boxes with sliding covers, the ones that contain carbide lathe inserts, or telescopic boxes for drills, and so on.
Sometimes I also have to wash bottle caps with child-safety mechanisms (something from vaping fluids from friends and colleagues), made in two parts.
I break the external part to separate them, and then put them in the pillowcase.
Practically every time all of the parts come out from the washing machine re-assembled.
It's amazing.
Since when does ChatGPT "represent most people"? STEM bros really crumble at the thought of social research, huh? You could've easily made a poll of your audience or cited popular definitions of entropy in recognized dictionaries but you chose the laziest way possible, somehow equating an LLM with "most people" as if it's a fact.
You are the first person to actually explain entropy simply.
and wrongly.
Yeah, it's simple and wrong, so...
Explain your case.
2:09 you lost me
I saw it differently. The random dice in the glass are in a higher potential-energy state than the organized stacked state, hence the jiggling tends to organize the dice.
I love how your simple questions and demonstrations arouse the curiosity of so many. I never considered the entropy of a cup of dice before, for example.
Which simulation software at around 4:00?
I also want to know this
Finally someone who gets it and can actually explain it in a good way.. Damn, not even my university profs got the concept of entropy! The whole "you can't unmix the marbles!" and other demonstrations of randomness to explain entropy completely ruined the idea in the public mind. We need more of these videos!
4:07 You say there are no forces acting on the particles making them coil together, but actually in the simulator you used the electromagnetic force is simulated, which is what's pulling them together
I've quite recently finished writing my PhD thesis on entropic phase transitions in some specific systems. I have a whole section of a chapter devoted to "trading" one type of entropy for another type for a net entropic gain. I find this phenomenon very fascinating, and almost every one of my scientific papers has a short, "intuitive" discussion of how it applies to the case at hand.
Cool video! I have seen many confusing explanations of entropy, but one of the better ones is "Entropy states 'that which is most probable will probably happen.'" In a random system, the probabilities reflect that in disorder. If the rules of the system are ordered, the result will generally be ordered. In application it's a little more complicated than that, but in general essence it seems to be right. So I guess the entropic force is the tendency of the system to align to its underlying probabilities.
Really cool video, and I love how you just casually throw in ChatGPT like "if I ask it, the answer WILL be wrong". 10/10, chef's kiss
I finally get the description of Entropy in a clear short way. Thank you 👍🏻
I think the description that best communicates how I understood entropy is "the inverse deviation from an object's state at the end of time," meaning the more entropy increases, the more closely an object will reflect the state it would be in as time approaches infinity, and entropy is used as a way to gauge how far an object in its initial state is from the state it will reach at the end of time.
You're forgetting the most important part of the equation, you. Conscious intervention. You interfered. You added order to a disordered system. You put it back together. You pulled the rubber band. If you haven't looked into Emergence Theory, you should.
Just stop. Go home.
Well, this was absolutely fascinating.
The second law of thermodynamics applies to closed systems, not all systems in the universe, nor all theoretically possible universes... But an interesting video that will be fun to think about.
Just for this... I subscribe. Didn't expect to learn anything new... but this gives me a new understanding of how to perceive the world. No one ever explained entropy to me in class, and I guess I missed out a little in my understanding of this concept.
This is one of the most interesting takes on entropy I have ever seen. Very nice explanation!
I recall reading that this is why cables in a box tend to get tangled with each other (or headphones in your pocket).
I’m pretty sure the example of the rubber band stops being only entropic force the moment the band has to be stretched rather than just loosely in a line.
This video reminds me of my younger days back in the 90s. I used to prepare for raves by visiting a stimulating trader I knew.
I was still young and naive and didn't understand why this friend would always take out a large bowl filled with dice and marbles. He explained that sometimes it's not enough to hide the screwdriver.
As my experience grew and I met different people I understood, Dude was just protecting his electronics with that bowl.
I came out of this video feeling like I have a better understanding of entropy than ever before. Well done!
Careful, I think this is one of this channel's rare erroneous videos
@5th_decile Yeah he is wrong and Killmajaro's understanding just got worse
It still seems weird to call it a "force", but that's a great explanation of the increased entropy of "free space".
In thermodynamics, we were taught that entropy is the amount of energy that is not available to do useful work. As such then, something that is not available cannot create a force. External work was done on the dice and nails to get them into the ordered pattern. They did not get that way by themselves.
The best explanation of entropy on the internet
Fun fact, entropy is why loose wires/strings get tangled all the time. One thing always helps: connect one end to the other, this restricts the number of variations in knots it can have (to 0).
I'm not an expert on entropy, but the easier explanation in my mind is that the aligned dice simply have more possible states available. Consider 100 dice stacked in the bottom of the container - each cube can be swapped for any other so you have a total of 100! possible states. Now imagine that 20 of them are sitting on the edge - each of the cubes lying flat can be swapped with each other flat cube (80! possible states), and each on-edge cube can be swapped with each other on-edge cube (20! possible states). So the mixed orientation configuration has 80! x 20! possible states which is SIGNIFICANTLY fewer states and therefore lower entropy. Adding additional orientations here lowers the entropy. (Yes, this is a simplification of the situation)
I'm not yet totally convinced that the "free volume" concept is very helpful in understanding the increase in entropy with the dice here even though the difference in volume will certainly play a role in the number of states.
Although it's pretty surprising, I wouldn't say there is anything "mysterious" about this "force". Where there are MANY more possible ways for a system to do something, the system will tend to eventually do that thing. At least, that's how I like to think about it.
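For anyone curious, the factorial comparison in the comment a couple of lines up (100! vs 80!·20!) can be checked in logarithms; this is just a quick arithmetic check of the counting as stated there, not a verdict on the physics:

```python
# lgamma(n + 1) = ln(n!), so the huge factorials stay manageable as logs.
from math import lgamma, log

ln_all_flat = lgamma(101)              # ln(100!)  - all 100 dice interchangeable
ln_mixed    = lgamma(81) + lgamma(21)  # ln(80! * 20!) - two separate groups
print(f"ln(100!)      = {ln_all_flat:.1f}")
print(f"ln(80! * 20!) = {ln_mixed:.1f}")
# The ratio 100!/(80!*20!) is the binomial coefficient C(100, 20):
print(f"ratio = C(100,20) ~ 10**{(ln_all_flat - ln_mixed) / log(10):.1f}")
```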
I took physics at McGill university with Adv Stat Mech I and II and never heard of a better explanation. Another amazing video!
Love that you need to toot your own horn as if anyone cares that you went to school. There are Christians with PhDs. Your degree means nothing.
When you heat a steel spring, its spring force decreases as the steel gets softer.
When you heat a rubber band, its spring force increases, because the molecular chains wiggle faster.
When you jostle the dice, their gravitational potential energy is being dissipated as sound and heat, so they tend towards a "ground state" where their potential energy is minimized (by being packed as low as possible). The entropy of the dice does indeed decrease because there is only one "ground state" (up to rotation of each layer). It's possible for the dice's entropy to decrease because they are not a closed system - you are not counting the microscopic degrees of freedom where the sound and heat end up as part of the system, and if you did, total entropy would increase.
Another way of looking at it is if there were no energy dissipation into those microscopic degrees of freedom, then the dice collisions would be perfectly elastic and they would keep bouncing around instead of settling.
While the entropic force is a major factor in a rubber band's retraction, it's not the sole force at play. There are also:
Enthalpic Forces: These arise from changes in the internal energy of the rubber band when stretched. Stretching stores energy in the polymer bonds, and releasing the band allows this energy to dissipate.
Cross-linking: The polymer chains in rubber are often cross-linked, creating a network that resists deformation. This contributes to the overall restoring force.
The Dominant Force:
In most everyday situations, the entropic force is the dominant factor in a rubber band's behavior. This is why heating a rubber band (increasing entropy) causes it to contract, and cooling it (decreasing entropy) makes it stretchier.
Important Note: The balance between entropic and enthalpic forces can change under extreme conditions (e.g., very high or low temperatures) or with different types of rubber.
In summary: The entropic force is a significant factor in a rubber band's tendency to return to its relaxed state, but it's not the only force involved. Enthalpic forces and cross-linking also play a role, although their contribution is usually smaller in typical scenarios.
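A minimal sketch of how the entropic contribution scales with temperature, using the textbook ideal-chain (Gaussian) result F = 3*kB*T*x/(N*b^2); the chain parameters and extension below are invented purely for illustration:

```python
# Entropic restoring force of an ideal (Gaussian) polymer chain.
# The force grows linearly with temperature, consistent with the observation
# above that heating a stretched rubber band makes it pull back harder.
kB = 1.380649e-23   # Boltzmann constant, J/K

def entropic_force(x, T, N=1000, b=5e-10):
    """Force (N) for a chain of N segments of length b (m), stretched by x (m), at T (K)."""
    return 3 * kB * T * x / (N * b**2)

x = 5e-8  # 50 nm end-to-end extension (illustrative)
for T in (280, 300, 350):
    print(f"T = {T} K: F ~ {entropic_force(x, T):.3e} N per chain")
```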
I love this video, you explained so much
The ideas of self organizing systems and entropy seem totally at odds until you bring up these cool examples
This video goes about 95% of the way to explaining Erik Verlinde's Entropic Gravity, which is impressive because Leonard Susskind has a hard time explaining it in plain language.
Interesting fact: with the most symmetric geometrical shape, the sphere, you cannot shake your way to a single ordered packing. This is because there are two ways of 'closest sphere packing' (cubic and hexagonal). Both packing structures will occur when you pile up balls/spheres and shake them. They will form random groups within the pile of spheres, and the boundaries between those groups will only get more defined the more you shake.
Keep in mind this analogy does not transfer over to the naturalistic model. Entropy does not produce information, far less a progression to a higher state. It does not increase complexity of design, it degrades it; the same goes for information, as information theory states. Probability does not produce rational outcomes in any reliable sense; believing it would is a belief in magic, aka the core of the flawed naturalistic model.
I always looked at entropy as the ending of something, or decay, but there's more to it. The heat death of the universe was the main example of entropy I had used to try to figure out what entropy is/does. Even in this video, towards the beginning, it seemed to me more like the tendency to even out, before you explained it more.
So what I get from this is that entropy is the tendency, or force, of changing to a state with more potential from a state with less potential.
Finally! Entropy explained in such an eloquent and simple way! Thank you for clearing up this seemingly arcane concept!
I've learned more from this channel than my whole schooling history !!!
You can still call entropy disorder, but just be careful of what you define as your system. For example, the dice ordering and freeing up volume has higher entropy, but if you were to consider the system as only the dice, and therefore not consider that bit of volume that gets "created" later, the entropy is lower.
On a microscopic level it's easy to call entropy disorder, but on a macroscopic level defining systems is hard
The example of the elastic band and the curled string of particles was really cool!
This was very informative, thanks for the video. It would be really cool to see more everyday examples of entropic forces
Your vids and explanations have really improved lately! Keep goin good sir
That "ow" was expected but it still threw me off guard
Very good explanation, lots of anecdotal examples.
Entropy has nothing to do with disorder, unlike what statistical mechanics describes; it is related to heat.
Entropy is not chaos, but the measurement of chaos that a location or event contains.
Chaos is not randomness; instead it takes all expected data and outcomes and rearranges them, slightly or greatly, into an unpredicted state.