@aud_io to turn off the fusion reaction youd have to figure out a way to negate gravity, it's the gravitational pull of an star that makes the core dense enough to fuse, so in order to stop fusion youd have to cancel out all of the pressure caused by gravity
@aud_io But you would increase entropy by trying to stop the fusion in stars. The energy it would take to do that, would probably be more than if we just let it happen.
Once we watched this video and understood it and are as clever as him in this particular 11:42 mins of subject, we will not watch it again, does that prove knowledge or emotional entropy
P Hampton I don't want to play with that fire too much, but there is following phenomena: syntropy. This the new organization. E.g. when one's organization gets bad, and it gets reorganized; etc...etc...
@@ericklopes4046 Usable energy anyway. I suppose that's splitting hairs, though. Entropy increases as one runs steam through a turbine and enthalpy decreases. Then one condenses it and pumps it back to the boiler where entropy is decreased and enthalpy increased so it can be sent back to the turbine to do more useful work. Enthalpy is order and thus usable energy where entropy is disordered unusable energy.
First watched this video when I was in high school, found it absolutely fascinating and was at a time in my life where videos like yours were interesting me into pursuing STEM education. Now, years later, doing my MEng degree and learning about thermodynamics, stirling engines, etc in university, so I came back here to recap (you explained it a lot better than my lecturer). Thank you for your contribution to the educations of myself and so many others
I'm an Aerospace Engineer, studied at the University of Maryland. This is by far the best explanation of Entropy I've ever heard. You get into another level of complexity when solving for Entropy (S) as a function of heat transfer (Q) and absolute temperature (T), but having this foundational understanding gives context to really comprehend your solution. Good stuff!
Thanks a lot for the video! I struggled to understand what entropy really is, when I was first introduced to it at University (I’m still in first year physics). After watching this video it makes so much sense now! The way you defined it relating to the spreading out of energy helped me so much!
Well he looks at the engine so he can figure out how it works. But no matter how long a mans observations of a women, he knows he will never figure her out.
@@themask706 you have touched on an excellent point..in fact females or shall we say gynovoltaic engines have typically taken all of the energy i have ever given them and efficiently shared it with the rest of the universe..relieving me of the burden of storing it. Thankyou ladies!
@Clutch Inson. I too have pumped a steady stream of explosive energy via ‘small bangs’ into a continuous array of gynovoltaic engines over time in my endeavour to understand the nature of the universe and experience entropy first hand.
Sean- one of my small bangs has grown big enough to borrow the car ! May you continue your explosive research..and may your experiements confirm and exceed your wildest hypothesis.
This particular way of describing entropy is why you always find your earbud cords always tangled up: *there’s only 1 way* to have the cords straight and untangled but, as you jostle them around in your pocket or backpack, *there’s a myriad of ways* for them to become tangled.
Maybe entanglement is inverse to dimensionality. A loop in 3d becomes a knot when the dimensions are reduced to 2d under conditions of confinement like a backpack or desk drawer. Entropy means the knot or entanglement doesn't revert to a loop on re-exposure to 3d, so the cord is still in a tangle even after opening the desk drawer again.
@@danielterra4773 Sometimes you do, but it's less often than not, so that's what we remember. We're also tricked by our eyes, we think it should be a 50/50 chance that the USB is the correct way around, but it's one out of three: USB plugs are in fact four-dimensional objects, requiring you to turn it 180 degrees TWICE to rotate the plug to the superposition where it will fit the socket. ;)
This really bridged the gap for me between the statistical “ping pong ball” analogies ive heard, and the teachings of physically irreversible processes when describing entropy. Great video!
The CO2 pseudoscience is so spread that even a guy have a channel talking science parroting CO2-global warming nonsense. It is sad to see so many people not using their head, taking the liberal media's position as truth.
@@seanleith5312 I feel bad for people like you who can't understand very basic science concepts. The truth of the matter is that some materials absorb infrared radiation (heat) better than others. That's a well established fact. For example, CO2 absorbs infrared radiation much better than nitrogen or oxygen. This can result in heat from the sun becoming trapped in the atmosphere as it bounces between the ground and CO2 in the air (in the form of infrared radiation). You can argue how much of an impact this has on the global climate, but here's another couple of well established facts. The Earth's temperature is rising, and the amount of CO2 in the atmosphere is rising as well. The heat-trapping I mentioned is evidence that they aren't simply correlated. CO2 is causing a temperature increase.
@@mattn.8941 Your description has something mixed up. If the CO2 ABSORBS the infrared radiation (your first claim), then it is not "bouncing between the ground and CO2 in the air"...it's being absorbed by the CO2. If it's bouncing back & forth between the CO2 in the air and the ground (your second claim), then it's not being absorbed by ANYTHING (otherwise, there'd be nothing left to 'bounce'). Things that make you go "Hmmmmmm..."
I watch this video before and after my thermodynamics class. Now I understand the video better but I have to say that you explain better than my professor!
The problem with Professors is that they think everyone understands what they are talking about instead of realizing in comparison to himself they are all a class of Chimps sitting in front of him, this is why I couldn't learn Algebra at school, I do understand it now as a good mate that could explain things the way Steve Mould does actually taught me Algebra it in about 10 minutes when I was 28 lol.
I have a degree in mechanical engineering from a respectable university. I did thousands of thermodinamical calculations using entropy. Yet, you helped me understand entropy better... thank you!
I like how ppl complain about spelling, in a time when AI is almost ready to spell correct everything. Worrying about spelling is about the dumbest thing you can do. Also, I'm an astronaut millionaire.
@@TotalDec ALMOST ready? When AI finally gets it right, then I’ll take comments seriously. Until then, I’ll just skip reading comments with poor spelling, no punctuation and illiterate grammar. If you can’t bother to proof your comments or learn how to spell, why should I try to translate your incomprehensible posts? Astronaut millionaire? Sure! An Astronaut who can’t spell is the dumbest thing you can do. Can you even count?
don't worry you won't be around besides way before heat death occurs the Milky Way and our neighboring galaxy Andromeda will collide and our spec of a solar system will be no more. so don't fret ;)
@@redpillpusher I doubt that. There's a uniqueness to our home system and should humanity exist by then (and I very much expect it should) we ought to have the technology and infrastructure to keep such an important star safe.
Er, if you gave it a moment's thought I think you would see that it's far more likely they had both, but you were only allowed to choose one or the other.
Yes! This is what am talking about. Entropy is increasing not about disorder it's about fair even distribution or spread out. And this also applies to abstract things, for example a high entropy random generator is well spread out evenly. A high entropy checksum is also well spread out that a small change in input make much of the checksum to change into any possible value.
As far as I know, out of all entropy videos, only this one mentions why the concept of entropy was introduced and gives a clear reason as to why entropy always increases. You are doing god's work my man. AWESOME explanation. :)
I've always disliked "disorder" as a description for entropy, When energy distribution is homogenized, with everything spread out evenly, that's about as orderly as anything can get.
I don't like the disorder definition because it's easy to misconstrue and doesn't do a good job of giving someone an intuitive explanation of entropy, but it is correct. There are more ways to create "identical" disordered states than there are ways to create "identical" ordered states, but at that point why aren't you just giving the actual and correct Boltzmann definition? It's really not that hard to understand. You can even use dice to explain it, there's only 1 way to roll a 2, but there's 7 ways to roll a 7. Entropy is defined as the the number of ways you can get a particular "big picture" measurement. When you randomly roll 2 dies, you're more likely to roll a 7 than a 2. Congratulations, you just described both what entropy is and why the second law happens with nothing more than something everyone has seen and addition. It's even easy to relate it to Shannon entropy from here. And after revisiting this video after a year, I still don't like it. Maybe it's useful for the way an engineer thinks about thermodynamics, but it's useless for how a chemist thinks about it. Give me the boltzmann definition and I can instantly see that solids are lower entropy than gases, but this one? That's not at all clear.
It's a crappy description also because it's one word trying to represent a somewhat complex idea. I mean if someone asks you to define it you need to spend a little time explaining properly. If you are a professor standing at a chalk board you'd likely throw in some equations as well. And at the end of it, because as a society we have the attention of gold fish, we say something like, "Sooo... disorder?" And in frustration we say, "No, the answer is that thing I just took five minutes to tell you."
"An argument can be made that time is itself a statistical phenomenon." You're just gonna leave it there? Please go on. Seriously. Help. Go on. Tell me more.
I could explain this to you if you want to but you’re gonna have to wait till I’m on a computer and not about to go to bed. Cause I’m on my phone. And in bed. Reply to this so I remember
Thank you so much for this video Steve! Studying thermodynamics at uni right now, and this video is the ONLY one that I have understood. Physics, nature and our universe is just so beautiful, and your video made that ever more clear to me!
I wish we had TH-cam when I did my physics degree thirty five years ago. It would have made many concepts less of a struggle to get my head around (And I probably would have got a higher classification!)
My favorite definition of entropy cames from Shannon which roughly is the amount of information of a system, or how much you can compress information in a system. This is also consistent with the entanglement network we have as our reality, and its information increase over time attempting to "write" information on its state.
Yes. There is a great story about that. I think it was von Neuman who suggested that they call that Shannon's phenomena Entropy because of the similarity to the equations of (mechanical) entropy. So Shannon Entropy was discovered later and was only called Entropy as an analogy. But really it is more fundamental I think. You can explain thermodynamic Entropy in terms of Shannon Entropy but you can't really explain Shannon Entropy in terms of thermodynamic Entropy.
Excellent explanation! This video also touched on the difference between _temperature_ (a measure of the average kinetic energy of the molecules) versus _heat_ (which is the FLOW of energy). Basically, in order to perform work energy has to flow from a higher potential to a lower potential. This is ▲T (i.e.-temperature difference) in thermodynamics. Entropy is the energy "under" the ▲T that is not available to perform work. Enthalpy is the energy ABOVE the ▲T that is available to do work (technically enthalpy is the heat potential energy plus the pressure + volume potential that is available to do work). If you use the analogy of an hourglass (gravitational potential energy contained in sand particles), work is performed by the kinetic energy of the sand as it falls through the bottle neck [and say, turns a paddlewheel on its way through]. Enthalpy is the amount of sand in the top of the hourglass. Entropy is the amount of sand that has already fallen. The height of the hourglass is the potential (temperature). The volume of sand contained is the energy in the system. The size of the opening is the rate at which energy is transferred (heat). In real life you can never turn the hourglass "upside down" to start over. Once that potential energy is gone, it's gone.
@@nonchip True that, sir! I just couldn't figure out how to type a proper uppercase delta using alt code. When I type Alt+235 it comes out lower case 'δ', even though it's supposed to be upper case. Alt+30 comes out '▲' for me. How are you typing that if you don't mind me asking? (I'm using a Windows PC)
@@nonchip Afterthought: It probably would have been more correct if I wrote "pressure x volume potential" now that I re-read that as well; had to take some liberties trying to explain the idea without writing an equation. Oh well. ;o)
I just recently purchased a Sterling Engine model (and a Tensegrity Table) to Science educate my grand-kids (they'll face challenges greater than I've faced, and I'll be 'gone' in 20-30 years, optimistically). In the course of exploring, THIS post was suggested... and I've shared. Thank you!! I feel it was brilliantly explained and worthy of interest. Just sitting on a train, in a station, not moving yet... and an 11 minute compilation of Brilliance I just discovered when I needed it.
7:30 -- Here's a related question for you to consider answering perhaps! Solar panels have an efficiency measurement -- let's say they capture about 70% of the available energy from sunlight; the 30% which is "lost" is lost to entropy, in the form of the panels getting physically hotter in temperature despite the photo-electric potentials. Plants, via photosynthesis, manage to capture 98%... that's some fine negotiations with entropy, IMO.
Hey bro, I know this is old but where are you getting your stats from? From sun to stored energy, plants have about 3 to 6% efficiency. Solar panels are about ~20% efficient now. Perhaps you are mistakenly referring to the % yield of one of the reactions involved
Everyone else when the train isn't moving : Frustrated. Angry. Play games. Talks to someone Steve mould : "Hm... Time to teach kids what entropy really is"
+Ashwin Singh He says that the energy will evenly spread out and never clump together, which is not true. Temperature and energy is not the same. If you have a piece of mettal next to piece of wood, both with the same temperature, then no net energy will transfer from one of the materials to the other. But there will still be much more energy clumped up with in the piece of mettal. The laws of the universe governs the nature of how energy transfer and organizes it self. Thermodynamics gets aplied to these laws that we have come up with. This allso paints up a world where the laws of universe are very simple at a microscopic level. But that doesn't have to be the reality. It could hypothetically be as intresting laws that governs the macro- and larger- scope of the universe as the onese governing the microspoe of the universe. Just that we have not noticed it yet. I mean does these very simplified asumptions about the nature of thermodynamics aply to the realm of quantumphysics? If the laws of the universe are much more complex and dynamic than what we think then we could hypothetically exist in the "heat death" right now (a much more exciting heat death). That is what thermodynamics says.
I like the ping-pong ball box analogy, as it's also useful for explaining Chaos. You could fill the box with half red and blue balls in the same way, and use a machine to precisely shake the box, then note the position of the balls. Put the balls back in their half/half configuration, and repeat the exercise. The chances are the balls will settle into a different position every time, and you never effectively predict where the balls will land. This system is chaotic. Now I think that arguably, if the balls are set up precisely enough, and the box is maintained precisely enough, and the machine shakes precisely enough, and the temperature is precisely maintained, and it can be isolated from all external interference, perhaps you could predict the outcome of the test, and get the same result each time. However even the tiniest change in any factor will produce wildly different results, such that we cannot conceivably engineer such precise conditions to make the outcome predictable, even if we had an inconceivable amount of analytical computing power. That is chaos.
Sonic Reaper Some interactions result in less local entropy, but still tend toward the larger entropy of the universe. I believe he discusses that in the video 🙂
@@fluent_styles6720 Endothermic reactions happen (in terms of energy levels, there's lots of equivalent ways to describe them) because more energy levels become accessible to the products as the reaction progresses, outweighing those lost in the reactants due to the loss in temperature. So, overall theres an increase in entropy (which is the log of the available microstates). More specifically, the ratio of the partition functions of the products and reactants increases with temperature and so at some point the equilibrium will favour the product (there's also an enthalpic contribution due to the boltzmann factor, but that's less significant).
This video explains thermodynamics wrong. It is obvious that Steve Mould doesn't understand the difference between temperature and thermal energy. If you have a material A with a very high thermal capacity next to a material B with a very low thermal capacity then A and B will try to reach the same temperature, but not the same energy dencity. So in other word you will find a huge gradient between the energy density of material A and B, even when they reach the same temperature. Lets say that material A is much more dence than B and both have the same bolume. This means that it is more likely that the energy will be transfered from B to A than from A to B. Because there is more mass in A than in B that can hold on to the energy. So in the end it is all about probability. It is more probable that the energy will be transfered to the material of high thermal capacity than the one with low thermal capacity. The most probable balance of energy depends on the laws that governs the universe. In the case of two simple materials like this it is governed by the laws that define the property of different materials. In a hypothetical case our universe could be governed by laws that we may not allready know of; very complex laws that states that our universe is allready existing in thermal equalibrium.
Electro-Cute high energy density doesn't mean it will transfer more energy to a material of lower energy density at the same temperature, it just means it has more energy states accessible to store energy. Two touching materials at the same temperature will have an even energy transfer in a closed system no matter what their heat capacity is. Their energy content will not converge unless the materials themselves decay into each other to form a homogeneous spread of atoms.
I know that, it was allso kind of my point. Temperature is a thermodynamical perspective on a very simplified system. What the guy in the video said was that the systems will reach equal thermal energy distribution (or at least that is how I interpreted it). But that is only true for few special cases.
The context of the discussion and accompanying diagram make it clear that he is talking about two identical slabs of the same metal. He could, of course, divert to say that the thermal energy distribution would be different for slabs of different material but that would be unnecessary for this discussion (he's not talking about the relationship of thermal energy and temperature) and make the discussion overly complex.
@@andrewrobertson444 I don't agree that this is obvious, and even if it was how does this explanation explain what happens when the two metals aren't the same? That's not at all obvious to me, and at this point I've had about two years of thermodynamics classes ranging from gen chem to graduate level statistical mechanics. I don't see how someone who is totally ignorant of the topic is supposed to arrive at reasonable conclusions from it. But really, the most damning thing is that this video simply isn't an explanation for entropy. What he is describing is not entropy. He is describing a consequence of the concept of equally probable microstates that happens to work well for the few systems he described and few others.
it's actually a kit, so you also have to build it. it's made by kontax i believe, out of germany. they're not cheap but they are neat. big recommend from me, i run mine on a cup of coffee for guests
@@SassyTesla here we are in the future - my wife bought me one as a gift - yes I built it - and your coffee idea is being put into effect right now. It’s the gift that keeps on giving 😁...
You ought to do a video about that sodium acetate hand warmer you used. They’re really a pretty brilliant example of the latent heat of fusion given off when you have a phase change from liquors to solid. Orange farmers in Florida use the same phenomenon in water to protect oranges from freezing. They spray water on them and when the water freezes it keeps the oranges from freezing.
cool thought experiment: assume the universe is deterministic (law of concervation of information applies), press pause on the universe, play it backwards so that entropy will get lower, and displace a single atom somewhere. chaos theory will dictate that the cumulative effect of that displaced atom on that backwards running universe will cause an alternate progression of events where entropy ack goes up, but only to the objects where its influence has time to reach. hence, everything outside this luminar-speed bubble will run backwards and everything inside, will run normally. if you look at it from the outside, it will be a collapsing bubble where things are running backwards. bouncy balls will jump onto ledges, and coffee cups will unstir themselves until the bubble suface arrives!
DT28469. there is a potential issue with this time reversal anyway. there are simple configurations in newtonian kinematics, that are non-deterministic. Certain collisions between 3 (ideal) elastic collisions that are n-d. So reversing "time" would almost guarantee a different backward sequence.
Great explanation! Another fun tidbit is that "entropy" is also used to describe how much information is encoded in something. This seems counter-intuitive at first when compared to the heat death of the universe, but it makes sense if you compare it to Steve's example of the balls in the box. For example, your hard drive is in a more orderly (less entropy) state when it's empty and all the bits are zeroes; when you fill it with data it has more information but you also can't easily predict if any given bit is a zero or a one (more entropy). James Gleick has a great book on this for those with some "temporal entropy" to kill.
Indeed, I heard something similar when talking about compression of data. In short more you compress data, more you are reducing orderly, predictable parts, which means if you compress something to maximum then all you are left with is pure chaos.
Please do, they're great, this is my first time watching you(apart from brady's videos), and out of all the videos I watched on entropy(because I had difficulty accepting it) yours was THE BEST, even better than MITOCW.
This video explains thermodynamics wrong. It is obvious that Steve Mould doesn't understand the difference between temperature and thermal energy. If you have a material A with a very high thermal capacity next to a material B with a very low thermal capacity then A and B will try to reach the same temperature, but not the same energy dencity. So in other word you will find a huge gradient between the energy density of material A and B, even when they reach the same temperature. Lets say that material A is much more dence than B and both have the same bolume. This means that it is more likely that the energy will be transfered from B to A than from A to B. Because there is more mass in A than in B that can hold on to the energy. So in the end it is all about probability. It is more probable that the energy will be transfered to the material of high thermal capacity than the one with low thermal capacity. The most probable balance of energy depends on the laws that governs the universe. In the case of two simple materials like this it is governed by the laws that define the property of different materials. In a hypothetical case our universe could be governed by laws that we may not allready know of; very complex laws that states that our universe is allready existing in thermal equalibrium.
Cool ! A good demonstration for non physicists. No confusion, no philosophical considerations : just definition and good interpretation. Clausius and Boltzman concepts simply and methodically explained. Nice !
He totally blasts you with a deep philosophical argument near the end. "An argument can be made that time itself is a statistical phenomenon." If that doesn't get your Stirling engine warmed up I don't know what will.
10:00 no it's not just blind chance. Moving from low entropy to high entropy is moving from high potential to low potential. You need to do work to move a rock up, but you don't need energy to drop it. You need energy to heat or cool things, but you don't need that to spread heat from hot to cold. It's equilibrium not chance. Nature tend to seek equilibrium and low potential and high entropy. You need to force it otherwise by exercising work.
Thanks for the video! I've never understood why a more entropic state is described as more disorderly? Isn't the universe in more 'order' with energy more equally distributed? To me, clumping energy together seems more like the chaotic state of the universe. The universal heat death is when everything is evenly spread out and in order.
That confuses me as well, but I guess I do consider my home more in order when my stuff is concentrated in a few spots than when everything is "distributed evenly" across the place 🙂
I gave my Second Law lecture to my Freshman chemistry class this morning and that is exactly the way I say it too: increasing entropy is about spreading energy out. Our illustrations are at the molecular scale, but it always comes back to energy spreads out because states of the system that overwhelmingly most likely are those with the energy spread out.
What always confused me about the common definition of entropy is that to me it makes more sense that evenly spread particles and energy is more orderly than the clumps of energy. This definition of "increase in the spread of energy" makes much more sense to me.
@@iamtheiconoclast3 The rationale has more to do with "information" than "order". If all the particles are in a single corner of the room, you can describe the system with something like "1000 particles are in a square in the bottom left corner of the room with a distance of 1 unit between each". That's a single sentence which you can use to determine the exact location of every single particle in the room. If they are spread out throughout the room, you must give precise coordinates for every single particle individually, which in this example would be 1000 sentences long, to achieve the same level of knowledge. You require more information to fully describe high entropy systems than low entropy systems, and in that sense they are "disordered".
Come on man. Either gravity or electric forces, started getting shit together. Formed s black hole and then a black asshole. Bang, clampped universal turd energy
I have that engine plus another Stirling. Now I know how to introduce my grandkids to entropy. A great explanation that helps one visualize. My intro to entropy was in 1967 2nd year thermodynamics in a classical physical sciences program. The prof was really good but visual props were limited to spelling out Noel complete with the 2 dots over the e in a pre-Christmas review of various equations
I remember in high school a girl in my physics class concluded half jokingly that everything would eventually end up as heat, and the teacher agreed. Never knew that it was called entropy tho... Cool video!
Dear Steve, First of all, many sincere thanks for your efforts! ...By the way, ENTROPY is not only about particular engines/motors, etc. It is about principal impossibility of PERPETUUM MOBILE in general, according to Nicky Carnot. Now let us go... A. There is ONLY ONE BASIC, fundamental Energy Conservation and Transformation Law. It is definitely unique and conceptually indivisible delivering two logically joint concepts - these are Energy Conservation - and Energy Transformation. Still, a more-then-100-years-old conceptual failure has brought us to two separate thermodynamic laws - but this has nothing in common with the actual physics. To come back, they have coined two more fake thermodynamic laws, employed the Probability Theory + Mathematical Statistics, and this has helped formulate the Quantum Mechanics, which is thus a basically metaphysical conceptual construction and thus ought to be only restrictedly fruitful. B. By dividing the basically indivisible law, you are telling about Combinatorics, you are touching Probability Theory, you are even stepping back to Thermodynamics for a while, but... You are NOT answering the poser: WHAT IS ENTROPY, sorry! 1. In the formula S = kB * ln(Ω) you imply, Ω means not a "Huge Number of Microstates", not "Probability", which numerically ranges between [0,1], not even "Wavefunction", which ought to be a purely metaphysical notion, as it is... In effect, Ω ought to be a simplistic algebraic function of Lord Kelvin's Absolute Temperature. This result has been published 100 years ago in JACS. 2. WHAT-ENTROPY-IS-poser has been answered not by Clausius, not by Boltzmann, etc., but by Goethe, who has introduced Mephistopheles, the philosophical embodiment of ENTROPY. 3. Newton did basically know WHAT ENTROPY IS - A Counteraction. 4. That Counteractions do not grow to infinity with the growing Actions, but MUST reach their MAXIMUM values, is the result by Nicky Carnot formalized by Clausius... 5. In effect, Gibbs Energy formula renders implicit the interplay among ALL the relevant Actions (the Enthalpic term) and ALL the Counteractions (the Entropic term). 6. The standard approach you are reporting about is OK for the implicit Enthalpy-Entropy picture, employing it for studying reaction mechanism details is likewise eating soup with fork.🧐
I've always used a cup of coffee as an example of energy & entropy. You pour cold creamer into hot coffee, and the creamer will appear to distribute it's self through the coffee until it reaches equilibrium. This will not work if the coffee and creamer are the same temperature.
Tony Tuite For me I think of two spheres connected by a tube with a lot of balls on one sphere which represents the energy and they randomly move around and over time it will eventually fill up the second sphere and since it has much more space then it has much more probability of the balls ending up in there until they reach a state where both spheres have the same probability of the balls ending up on either of them.
9:15 there is actually more than one way red and blue can end up on each side. If Nb is number of blue balls and Nr number of red than there is = (Nb)! * (Br)! permutations it can happen.
This might be the most elegant explanation of entropy I’ve ever heard and I’ve been educated in Aerospace Engineering, so this says a lot. Entropy is so complicated to even the smartest people that explanations are faulty or sometimes wrong, even when described by the smartest people.
Entropy - measure of how spread out your energy is Well since our universe is expanding continuously The entropy of our universe is continuously increasing
I really did not want him to stop talking!!! I felt i was just getting into it when it cut to the external engine at the end. I love when topics such as these are explained in layman's terms, thank you 😊
Entropy is the cause of the existence of time. And if there is time, there is entropy. In quantum systems, entropy may increase or decrease. Essentially, as long as there is an arrow of time, there is entropy. Because of the way Quantum mechanics works, time can be turned backward and reverse entropy. There is still a backward arrow of time (so a backward entropy). Entropy is required for time. If entropy did not exist, we would not even have a universe to live in. Rather, we would have a static universe (maybe because our universe is expanding, it could last forever). Time exists on the local level inside a fixed size universe, if at all. This means things can still happen but will cease to exist after entropy is completed on all local scales. Entropy isn't just the transfer of heat, but the transfer of information. The transfer of information takes time. Therefore, the speed of information transfer is the speed of light. The speed of light is the fastest speed possible for a massless, yet information bearing, particle or wave. That means information, through light (or any other light speed particle/wave), will set the maximum speed of information transfer. In a world where light travels instantaneously across any space, cause and effect break down. Light from a billion miles away will reach you at the same instant as light an inch away. Light from any distance will reach you at the same time. Cause and effect become simultaneous and can't be told apart. This is the view of the photon. Its life begins just as it ends as seen from the view of the photon. For time to exist, you must be able to tell cause from effect. It is in the definition of time that you must be able to see cause than effect. In conclusion, entropy is the cause of the arrow of time, it forces information from one place to another or else the universe shall be static. This information travels at a finite speed to preserve time as the transfer of information from one cause to the effect starting its own cause and effect. In a world of teleportation of information, you take the view of photons (or other lightspeed things) and pop into existence at the same moment you stopped existing. In a world of infinite speed particles or waves, the information could just choose to stay in the position as it was before and do nothing until it does, but because our worlds speed of information transfer is finite, we will see time constantly moving through each small cause and effect building up to a larger cause and effect.
Yes I agree - without entropy, the universe would be useless, pretty much, with no movement or anything. So to get something to move or exist, you need some measure of entropy to achieve it, suggesting entropy allows for the formation of galaxies, planets, life etc. I do believe however that the more we look into quantum mechanics, I think we are going to find more interesting laws which builds on what we have already discovered.
A few observations First, there are more sources of energy than fossil and solar. (Fossil being basically stored solar) There's also nuclear decay and lunar (tidal). These have nothing to do with solar energy, to my understanding. Second, there's more than one way for all the red balls to be on one side and blue balls on the other. For example, red ball #1 and red ball #2 could swap places. I guess it would be a permutation of the number of red balls and the number of blue balls, which explains why if there are only 2 red and 2 blue, there's a decent chance that all the blue will be on one side and all the red on the other.
Thanks for this video! It is a much better explanation of entropy than the one we learned at school. Just a super nitpick (to be fair, you used the term ‘for statistical rigour’ so I think I can make this point!) but having all the red ping pong balls on one side and blue on the other is not just ONE of the possible arrangements because each of the reds are interchangeable offering a very high number of possibilities of red on one side (and of blue in the other side which are also self interchangeable). But even that high number of possibilities is thoroughly dwarfed by the number of overall possibilities, therefore we never see it. That being said, obviously, I know you were keeping things simple!
Yes and double that, because the reds could be on either side with the blues on the opposite side. But it is thoroughly dwarfed by the total number of overall possibilities. Statistics you are a cruel mistress! Great video thanks!
@@TennisCoachChip Statistically, the red and blue balls would rearrange themselves back into all red on one side and all blue on another, though far less frequently than all other possible combinations. What if you just had one red ball and one blue? Or (the next step up) two red and one blue? And so on.
"In thermodynamics, heat is energy in transfer to or from a thermodynamic system, by mechanisms other than thermodynamic work or transfer of matter." Heat is a transfer of energy, not a state of being hot. You absolutely need heat to run the sterling engine.
No you dont need heat. You need temperature difference between the two plates. The temperature difference is going to cause the heat to flow, which causes the engine to run. So heat flow is caused by a difference in temperature. Ultimately, what runs your engine is the temperature difference. Once the two plates are in equilibrium, your engine stops.
I think Steve's point is that you don't supply the heat, the energy transfer via heat happens as a result of the difference in temperature. I guess you need heat in the sense of you need the concept of heat to exist, but you don't need to supply any.
Alan Guth once said something like: ~gravity is the ultimate free lunch~, e.g. stars. It seems like instead of being spread out, energy collects in black holes, but is *unavailable*. Same result of course, but for different reasons.
Thank you I actually understood this. I've always thought that entropy was energy spreading out but then I heard the definition of disorder and chaos and I got really confused.
I remember I've always been wondering what entropy really is because wether a room is messy or not is subjective, the word disorder is also kind of subjective.
The degree of disorder that qualifies as messy is subjective, but the fact that "less disorder is less messy" is the objective part. If someone splashed paint on your wall it would be considered messy, until you find out their name is Jackson Pollock. In that case, there are relatively few random configurations that are similar to this highly specific, one-of-a-kind work of art.
Well I'm really glad this video came in my suggestions page, I'm in love with your channel, the video was perfectly executed, brilliantly explained. Great video and well, you just earned yourself a new subscriber :)
Steve Mould Whoa! Wait a second? Are you "The Steve Mould"? The Mould Effect is named after you right? I just realised that, that's seriously cool. I've watched your several videos, and you're doing a great job. Really glad came across your channel. what more effects you've in you mind to be named after you? :D
@@P_Ezi it does, unfortunately. The subtlety is in how we use the words in common language and while dealing with terms technically. Equilibrium is a state which every system desires to have, eventually. And when it is achieved, no system wants to move out of it. In other words, there is nothing left for it to do now, other than maintain the equilibrium. I'm repeating it again, "there's nothing left to do", which implies all energy has been spread out, and now there's no scope for work or heat. All energy has been spread out, and using the definition of entropy given in this video, the system has achieved maximum entropy. Now, the "most dispersed state" may seem like "the most ordered" when you look at it from the macroscopic level. At the statistical level, it is the most randomly distributed state.
@@tusharpal8041 well said. It has always bothered me when people say that things move toward disorder. "Disorder" and "randomly distributed" seem like very different concepts to me.
@Dr Deuteron Thanks for correcting me! I was wrong in phrasing things to make it sound like there is only one state for maximum entropy, or the equilibrium state. I realize now that that statement is true for macrostates, not microstates.
P.S. if time was infinite there would be another big bang, since energy clumping together is just really unlikely, thus entropy doesn't NEED to increase it's just really likely, but if time is infinite everything with a non 0% probability will happen no matter how low it is, like all the energy clumping in one spot.
If it doesn't have ice it can't be in America, it must be a European or English train. Granted the fact it's a passenger train pretty much suggests it isn't in America!
With you so far. The big question is why, if it is so statistically improbable, do we find energy so clumped together in the past? If the energy distribution of "heat death" is so much more statistically likely than very clumped-energy states of the universe, why isn't "heat death" the state of the universe in both the future AND the past? Also, if you observe the changing universe in temporal reverse, can you identify which physical laws are causing energy to defy statistical probability, and clump together? In other words, which physical laws underpin entropy and thus the direction of time? Maybe you could make a video about this?
If you accept the big bang theory plus the idea that our universe is the "white side" of some sort of black hole, it is gravity that first clumped everything together. Then, when all that energy was released, there were infinite ways to spread it out, sometimes making entropy seem to go temporarily backward, like star formation or human technology but there is still a statistical tendancy towards more entropy.
When I was younger, I used to think time didn't exist, that it was made up by humans who wanted to keep track of things like dentist appointments and yoga classes. Your description of time measured by entropy really made me think about the nature of time. Theoretically, if, in a freak scenario of statistical unlikelihood, entropy in the universe started going down, would that mean time is flowing backward? Great video, you have some quite profound insights into entropy in ways I've never considered.
The time would probably not go backwards, since there are many possible arrangements. Like in the example of the mixing balls. Red on the left and blue on the right. When the mixed state decreases in entropy, there may be a result where all the red balls are on the right side. Which has the same entropy as in the beginning, but is a different "past".
I think its more like since entropy always increases we can use that as the definition for passage of time as time also always goes forward (pls dont say time travel. It doesnt exist yet). In the statistical unlikely event that entropy decreases, our definition of time being related to entropy will be wrong. So time will still flow forwards. It's possible I'm wrong though.
No, while it's an interesting question, a decrease in the universe's entropy does not imply a reversal in the direction of time. If time were started to flow in reverse, then the total entropy of the universe would decrease, but if the total entropy of the universe decreased it would not imply time started flowing backwards. Also, a reversal in the direction of time seems to violate the fundamental nature of time since if time reversal started at some time A would it continue backwards in time, or would it reverse back to the direction we're familiar with at some time in the past B? In either case it seems we're left with paradoxical situations since if time continued to flow backwards indefinitely, it would progress to very the beginning of the universe. And if on the other hand, time suddenly began flowing forwards again, then once we reach time A again, we have reached the moment where time starts flowing backwards, so the universe would be permanently oscillating between points A and B. My argument relies on the assumption that the universe is deterministic though.
My description of entropy is: S = k log W where W is the number of microstates, k is the Boltzmann constant (found by Planck), and S is the entropy, of course... Another good description of entropy is a science fiction story by Isaac Asimov, entitled "The Last Question"...
Thanks, Steve. Definitely the most intuitively understandable definition of entropy I've heard. Good on ya for giving the Stirling engine company a shout-out without compensation. And finally, nice job decreasing entropy by filming your video at a time you were required to be sitting around waiting anyway. 💡
@@liotedupuis5311 Dude, It is just fiction... All he did was create a spy thriller with his own twist to the tale. If you really care about physics go watch interstellar(even that film had fringe science)
@@aravindmuthu95 interstellar is also a disaster on the physic aspects. However i never said that the film was bad, only that the physic inside is completely wrong (i do love fiction too ). So it doesn't make sense to use tenet as a physic reference, it's like quoting donald trump about global warming issues.
I wish some had made this kind of video when I was in my school struggling to get physics in to my head......I cant thankyou enough for this video......Simply superb
Hey mr.steve i really like your in depth definations the afterwards knowledge we have after watching your videos is just off the charts really appreciate your hardwork and thanks for educating these various complex yet beautiful topics hats off to you man
That was really interesting for this reason: given the amount of time the Universe has until it's heat death, that is enough time to allow for the statistical possibility that all the red and blue ping pong balls order themselves just once, and when that happens (if) then that would be a scary amount of enthalpy that could suddenly become available. Given that the Universe is finite yet boundless (and really, really big) I know that conventional wisdom says that it cannot happen but in all the time available there could be prolific pockets of enthalpy eruptions. I wonder if that actually happens quite regularly at the quantum level... I'm joking of course, since the discovery that entropy can decrease over time in a quantum system and quantum effects can be used to "clean up the states of systems". Interesting huh :) What I wonder is that if the quantum example can extrapolate to the macro system, or rather how it does that, similarly to how known quantum effects extrapolate to the macro universe we experience around us...
I'm afraid not, especially if you're thinking that maybe it would work with an actual fluid with a realistic number of particles. If your box of ping pong balls had 100 balls in it (so 50 of each colour), then the chances of a single throw yielding all the blue balls on the left is 1 in 100891 trillion trillion (that's 10^29) - far less than the number of years until the heat death of the universe (10^100 years according to this video, or 10^106 years according to first Google result) but still huge. If you have 1000 ping pong balls then the chance of a single throw yielding all blue balls on the left is 10^299 instead! (Too many trillions to write out here.) Even if you threw your box a trillion trillion trillion times per second, it would still take 10^256 years to be likely to get a neat arrangement of balls, which is a trillion trillion trillion trillion trillion trillion trillion trillion trillion trillion trillion trillion times as long as the time until the heat death of the universe. That's just 1000 balls! Now consider trillions of particles in a relatively small container, and you can see that it's just impossible for that to spontaneously organise itself, even on the timescale of the universe.
Absolutely great video! Not only your definition of entropy is, in my opinion, a lot better than the "measure of disorder" one, but also yours was the very best explanation of how the Stirling engine works (I love Sterling engines). Your way to explain stuff is second to none. Thank you.
I'll be curious to see how gravity fits into this model when we understand it fully. Gravity being the natural clumping force of matter, it almost seems anti-entropic.
I know it's been ages since the comment, but I think you still might be interested. There are some people that actually try to describe gravity as an entropic force itself. What does this mean? If you think about it, when you aggregate mass together you put it in state in which more configurations are possible. And thus it's increasing the entropy. Many particles apart have only one configuration, many particles together have loads of them. I always thought that it was a beautiful idea, but I don't know how seriously people take this route of research...
Oops! As Steve was making a point about "statistical rigour" (around 09:00), he made a little mistake by saying there is "only one way to have all the ping pong balls that are red on one side..." and all the blue ones on the other side. Actually, depending on how many balls there are, there will be lots of ways to have the same picture. You can do a great number of permutations where all the red and blue balls are separated on each side but not in the same place (as would be more evident if each ball was tagged with a number). But of course, there are still much much more different arrangement of balls where they are all mixed up. Nonetheless, a great video!
Well i think part of the point is that all same coloured balls are indistinguishable from eachother (which would make any configuration of all reds on one side and all blues on the other the same in the statistical calculations (assuming you also can't distinguish directions, so all red on the left and all red on the right would also be the same).
Statistics may be applied categorically as well as to permutations, don't forget. "One way to divide balls" may serve perfectly well as the statistical "E", the event to be studied; E is not restricted only to the specific arrangement of hot/cold units. Further, if we discuss a specific system, one more relevant to the subject of entropy, in which the blue and red balls are hydrogen atoms, say, and their only difference is the individual kinetic energy, all within a band near to room-temperature, there is no other meaningful difference between one hydrogen atom and it's neighbor. They are identical, and any one atom's location has no meaning to a study of their possible permutations. It's why the heat death of the universe remains absolute even if you somehow swapped the locations of some absolute-cold atoms with other identical, absolute-cold atoms: "death" has occurred and time has stopped because time has ceased to have any meaning. It's important to remember that maths does not exist in any objective manner; mathematics is merely a language convenient for humans to describe what the universe means. It cannot out-endure, or restore relevance to, a system devoid of meaning. If the red balls corralled on one side could be made entirely identical, some other arrangement of red balls has no different meaning, and there *IS* "only one way" to discuss their separation from the blue. Besides, atoms don't really have precisely meaningful "locations". That's just an added little googly thrown at us by a universe with a wicked-ass sense of humor.
burning fuels produces CO (carbon monoxide) which is a pollutant which is produced in more abundance than CO2 (Carbon Dioxide) which is needed for plant life, which plants give us O2 for animals to breath so that more fuel can be burned. Yet for some reason CO2 seen as pollutant.
So time and entropy could simply be two different ways of describing fundamentally the same phenomenon. Yes, why not? After all, it used to be thought that heat and light were completely different things, until we realized that they were simply different manifestations of the same entity - energy. Then later, Einstein showed us that space and time, thought to be completely different things, were actually just 2 aspects of the same phenomenon (space-time) and that energy and matter, again originally considered unrelated, were made of fundamentally the same stuff. So it doesn't seem in any way preposterous to suggest that time and entropy, rather than 2 distinct things, could turn out actually to be just one ("time-entropy"). Factor in a few other assorted fields and forces and - who knows? - we could just end up with one single, all-encompassing cosmos-wide phenomenon: matter-energy-gravity-magnetism-spacetime-entropy-...
Awesome video! Love the explanation, makes you think about renewables. Energy from the sun also creates the wind etc. Not sure how tidal fits in though, if we use too much tidal energy will the moon fall to earth ?
The reason that train isn't moving is because the driver wants to postpone the heat death of the universe by not increasing entropy.
+Gordon X. Frohman :)
Gordon X. Frohman Hahaha!!!😀
@aud_io a great way of extending the life of the universe, but alas still does not prevent the inevitable end
@aud_io to turn off the fusion reaction youd have to figure out a way to negate gravity, it's the gravitational pull of an star that makes the core dense enough to fuse, so in order to stop fusion youd have to cancel out all of the pressure caused by gravity
@aud_io But you would increase entropy by trying to stop the fusion in stars. The energy it would take to do that, would probably be more than if we just let it happen.
Sir, you are a huge clump of energy and enthusiasm!
But Steve is an exception to the rule - he spreads the energy around all the time, but it never loses its magic.
Check Out our Video on Entropy!!
th-cam.com/video/pKGhC2GKpm8/w-d-xo.html
Once we watched this video and understood it and are as clever as him in this particular 11:42 mins of subject, we will not watch it again, does that prove knowledge or emotional entropy
Entropy increases and so does my understanding of entropy, thanks to this video!
Great!
P Hampton I don't want to play with that fire too much, but there is following phenomena: syntropy. This the new organization. E.g. when one's organization gets bad, and it gets reorganized; etc...etc...
So...
Battery charged: Low entropy
Battery discharged: High entropy?
So entropy is kind of the opposite of available energy?
Erick Lopes yep
@@ericklopes4046 Usable energy anyway. I suppose that's splitting hairs, though. Entropy increases as one runs steam through a turbine and enthalpy decreases. Then one condenses it and pumps it back to the boiler where entropy is decreased and enthalpy increased so it can be sent back to the turbine to do more useful work. Enthalpy is order and thus usable energy where entropy is disordered unusable energy.
First watched this video when I was in high school, found it absolutely fascinating and was at a time in my life where videos like yours were interesting me into pursuing STEM education.
Now, years later, doing my MEng degree and learning about thermodynamics, stirling engines, etc in university, so I came back here to recap (you explained it a lot better than my lecturer). Thank you for your contribution to the educations of myself and so many others
Currently watching this for the exact same reason
Noice👍
I'm an Aerospace Engineer, studied at the University of Maryland. This is by far the best explanation of Entropy I've ever heard. You get into another level of complexity when solving for Entropy (S) as a function of heat transfer (Q) and absolute temperature (T), but having this foundational understanding gives context to really comprehend your solution. Good stuff!
+Angelo Collins thank you!
Thanks a lot for the video! I struggled to understand what entropy really is, when I was first introduced to it at University (I’m still in first year physics). After watching this video it makes so much sense now! The way you defined it relating to the spreading out of energy helped me so much!
how is aerospace @ Maryland? I'll be applying to colleges next fall, for Ms.
tsm
sorry but the earth is flat
Information spread over degrees of freedom to be more specific :p
I want someone to look at me the way Steve looks at his Stirling engine
I'd never have thought of that! More like "why doesn't he take his face out of the way so I can see the engine". Thank God for the difference.
Well he looks at the engine so he can figure out how it works. But no matter how long a mans observations of a women, he knows he will never figure her out.
@@themask706 you have touched on an excellent point..in fact females or shall we say gynovoltaic engines have typically taken all of the energy i have ever given them and efficiently shared it with the rest of the universe..relieving me of the burden of storing it. Thankyou ladies!
@Clutch Inson. I too have pumped a steady stream of explosive energy via ‘small bangs’ into a continuous array of gynovoltaic engines over time in my endeavour to understand the nature of the universe and experience entropy first hand.
Sean- one of my small bangs has grown big enough to borrow the car ! May you continue your explosive research..and may your experiements confirm and exceed your wildest hypothesis.
This particular way of describing entropy is why you always find your earbud cords always tangled up: *there’s only 1 way* to have the cords straight and untangled but, as you jostle them around in your pocket or backpack, *there’s a myriad of ways* for them to become tangled.
I sometimes think earbud cords tangle even when they're not being jostled, just confined like in a desk drawer.
yeah but theres always lots of other orderly ways to have a pair of earbuds that aren't tangled. not just one...
Maybe entanglement is inverse to dimensionality. A loop in 3d becomes a knot when the dimensions are reduced to 2d under conditions of confinement like a backpack or desk drawer. Entropy means the knot or entanglement doesn't revert to a loop on re-exposure to 3d, so the cord is still in a tangle even after opening the desk drawer again.
that doesn't explain why you never plug usb the right way though
@@danielterra4773 Sometimes you do, but it's less often than not, so that's what we remember. We're also tricked by our eyes, we think it should be a 50/50 chance that the USB is the correct way around, but it's one out of three: USB plugs are in fact four-dimensional objects, requiring you to turn it 180 degrees TWICE to rotate the plug to the superposition where it will fit the socket. ;)
This really bridged the gap for me between the statistical “ping pong ball” analogies ive heard, and the teachings of physically irreversible processes when describing entropy. Great video!
05:27 to 06:10 bookmark for myself to keep revisiting the definition
The CO2 pseudoscience is so spread that even a guy have a channel talking science parroting CO2-global warming nonsense. It is sad to see so many people not using their head, taking the liberal media's position as truth.
@@seanleith5312 Hello! You've reached the TH-cam Reply Hotline.
Please press 1 for troll, 2 for idiot, or 3 for joke.
@@aidenlilley1319 3
@@seanleith5312 I feel bad for people like you who can't understand very basic science concepts. The truth of the matter is that some materials absorb infrared radiation (heat) better than others. That's a well established fact. For example, CO2 absorbs infrared radiation much better than nitrogen or oxygen. This can result in heat from the sun becoming trapped in the atmosphere as it bounces between the ground and CO2 in the air (in the form of infrared radiation).
You can argue how much of an impact this has on the global climate, but here's another couple of well established facts. The Earth's temperature is rising, and the amount of CO2 in the atmosphere is rising as well. The heat-trapping I mentioned is evidence that they aren't simply correlated. CO2 is causing a temperature increase.
@@mattn.8941 Your description has something mixed up. If the CO2 ABSORBS the infrared radiation (your first claim), then it is not "bouncing between the ground and CO2 in the air"...it's being absorbed by the CO2. If it's bouncing back & forth between the CO2 in the air and the ground (your second claim), then it's not being absorbed by ANYTHING (otherwise, there'd be nothing left to 'bounce'). Things that make you go "Hmmmmmm..."
I finally understand, after so many videos and articles - now, the clumped energy of my frustration has been dispersed! Thank you!
It would be very nice if there was a universal law that said frustration would always tend to become dispersed. It seems the opposite may be true.
😄
Check Out our Video on Entropy!!
th-cam.com/video/pKGhC2GKpm8/w-d-xo.html
I watch this video before and after my thermodynamics class. Now I understand the video better but I have to say that you explain better than my professor!
The problem with Professors is that they think everyone understands what they are talking about instead of realizing in comparison to himself they are all a class of Chimps sitting in front of him, this is why I couldn't learn Algebra at school, I do understand it now as a good mate that could explain things the way Steve Mould does actually taught me Algebra it in about 10 minutes when I was 28 lol.
@@MrMambott congrats on finally learning algebra
Check Out our Video on Entropy!!
th-cam.com/video/pKGhC2GKpm8/w-d-xo.html
I have a degree in mechanical engineering from a respectable university. I did thousands of thermodinamical calculations using entropy. Yet, you helped me understand entropy better... thank you!
Did entropy increase when you substituted an “i” in “thermodynamical” for the “y”? Or, maybe you’re an engineer, not an English major :) ?
I like how ppl complain about spelling, in a time when AI is almost ready to spell correct everything. Worrying about spelling is about the dumbest thing you can do.
Also, I'm an astronaut millionaire.
@@TotalDec ALMOST ready? When AI finally gets it right, then I’ll take comments seriously. Until then, I’ll just skip reading comments with poor spelling, no punctuation and illiterate grammar. If you can’t bother to proof your comments or learn how to spell, why should I try to translate your incomprehensible posts?
Astronaut millionaire? Sure! An Astronaut who can’t spell is the dumbest thing you can do. Can you even count?
@@quixote5844 🤓☝️
@@vijay-jw8gq🤓🤓
This is the most intuitive video on thermodynamics I've ever watched. Thank you for finally making me understand what entropy is : )
What I came for: better understanding of entropy
What I left with: existential anxiety about the heat death of the universe
Yes yes yes that's what im processing too!!
Definitely 😂 Heyy, Kayla. I know you from your studygram :))
don't worry you won't be around besides way before heat death occurs the Milky Way and our neighboring galaxy Andromeda will collide and our spec of a solar system will be no more. so don't fret ;)
@@redpillpusher I doubt that. There's a uniqueness to our home system and should humanity exist by then (and I very much expect it should) we ought to have the technology and infrastructure to keep such an important star safe.
You won't live long enough for it to become a problem so relax
So they had Stirling engines but no ice? One weird train indeed.
+Frederik Tessmann haha! Yes, very strange.
They save the ice to power the Sterling engines that run the train... ;-)
It's only a half train.
It's not a train, it's just clever green screening!
Er, if you gave it a moment's thought I think you would see that it's far more likely they had both, but you were only allowed to choose one or the other.
Yes! This is what am talking about. Entropy is increasing not about disorder it's about fair even distribution or spread out. And this also applies to abstract things, for example a high entropy random generator is well spread out evenly. A high entropy checksum is also well spread out that a small change in input make much of the checksum to change into any possible value.
As far as I know, out of all entropy videos, only this one mentions why the concept of entropy was introduced and gives a clear reason as to why entropy always increases. You are doing god's work my man. AWESOME explanation. :)
+Siddhant Chaudhari thank you :)
I've always disliked "disorder" as a description for entropy, When energy distribution is homogenized, with everything spread out evenly, that's about as orderly as anything can get.
I don't like the disorder definition because it's easy to misconstrue and doesn't do a good job of giving someone an intuitive explanation of entropy, but it is correct. There are more ways to create "identical" disordered states than there are ways to create "identical" ordered states, but at that point why aren't you just giving the actual and correct Boltzmann definition? It's really not that hard to understand. You can even use dice to explain it, there's only 1 way to roll a 2, but there's 7 ways to roll a 7. Entropy is defined as the the number of ways you can get a particular "big picture" measurement. When you randomly roll 2 dies, you're more likely to roll a 7 than a 2. Congratulations, you just described both what entropy is and why the second law happens with nothing more than something everyone has seen and addition. It's even easy to relate it to Shannon entropy from here.
And after revisiting this video after a year, I still don't like it. Maybe it's useful for the way an engineer thinks about thermodynamics, but it's useless for how a chemist thinks about it. Give me the boltzmann definition and I can instantly see that solids are lower entropy than gases, but this one? That's not at all clear.
It works when you're learning the statistical mechanical derivation of the Boltzmann equation. It's how we learned it at university.
It's a crappy description also because it's one word trying to represent a somewhat complex idea. I mean if someone asks you to define it you need to spend a little time explaining properly. If you are a professor standing at a chalk board you'd likely throw in some equations as well. And at the end of it, because as a society we have the attention of gold fish, we say something like, "Sooo... disorder?" And in frustration we say, "No, the answer is that thing I just took five minutes to tell you."
Ron Arguelles that’s how it was explained to me as a kid.
@@Mezmorizorz Bouth is in same walue...
"An argument can be made that time is itself a statistical phenomenon."
You're just gonna leave it there? Please go on. Seriously. Help. Go on. Tell me more.
Hello!
@@NeilMarcellini Hi there
I could explain this to you if you want to but you’re gonna have to wait till I’m on a computer and not about to go to bed. Cause I’m on my phone.
And in bed. Reply to this so I remember
@@hudsoncaceres6820 That would be helpful! Thanks!
@@hudsoncaceres6820 still waiting
Thank you so much for this video Steve! Studying thermodynamics at uni right now, and this video is the ONLY one that I have understood. Physics, nature and our universe is just so beautiful, and your video made that ever more clear to me!
I wish we had TH-cam when I did my physics degree thirty five years ago. It would have made many concepts less of a struggle to get my head around (And I probably would have got a higher classification!)
Surely you would have known this before studying that at uni, right?
My favorite definition of entropy cames from Shannon which roughly is the amount of information of a system, or how much you can compress information in a system. This is also consistent with the entanglement network we have as our reality, and its information increase over time attempting to "write" information on its state.
Yes. There is a great story about that. I think it was von Neuman who suggested that they call that Shannon's phenomena Entropy because of the similarity to the equations of (mechanical) entropy. So Shannon Entropy was discovered later and was only called Entropy as an analogy. But really it is more fundamental I think. You can explain thermodynamic Entropy in terms of Shannon Entropy but you can't really explain Shannon Entropy in terms of thermodynamic Entropy.
"Entropy is a measure of how spread out your energy is." PERFECT!
Excellent explanation! This video also touched on the difference between _temperature_ (a measure of the average kinetic energy of the molecules) versus _heat_ (which is the FLOW of energy).
Basically, in order to perform work energy has to flow from a higher potential to a lower potential. This is ▲T (i.e.-temperature difference) in thermodynamics. Entropy is the energy "under" the ▲T that is not available to perform work. Enthalpy is the energy ABOVE the ▲T that is available to do work (technically enthalpy is the heat potential energy plus the pressure + volume potential that is available to do work).
If you use the analogy of an hourglass (gravitational potential energy contained in sand particles), work is performed by the kinetic energy of the sand as it falls through the bottle neck [and say, turns a paddlewheel on its way through]. Enthalpy is the amount of sand in the top of the hourglass. Entropy is the amount of sand that has already fallen. The height of the hourglass is the potential (temperature). The volume of sand contained is the energy in the system. The size of the opening is the rate at which energy is transferred (heat). In real life you can never turn the hourglass "upside down" to start over. Once that potential energy is gone, it's gone.
it's Δ "uppercase delta", not ▲"black up-pointing triangle", by the way ;)
@@nonchip True that, sir! I just couldn't figure out how to type a proper uppercase delta using alt code. When I type Alt+235 it comes out lower case 'δ', even though it's supposed to be upper case.
Alt+30 comes out '▲' for me.
How are you typing that if you don't mind me asking? (I'm using a Windows PC)
@@nonchip Afterthought:
It probably would have been more correct if I wrote "pressure x volume potential" now that I re-read that as well; had to take some liberties trying to explain the idea without writing an equation. Oh well. ;o)
I just recently purchased a Sterling Engine model (and a Tensegrity Table) to Science educate my grand-kids (they'll face challenges greater than I've faced, and I'll be 'gone' in 20-30 years, optimistically). In the course of exploring, THIS post was suggested... and I've shared. Thank you!! I feel it was brilliantly explained and worthy of interest. Just sitting on a train, in a station, not moving yet... and an 11 minute compilation of Brilliance I just discovered when I needed it.
7:30 -- Here's a related question for you to consider answering perhaps!
Solar panels have an efficiency measurement -- let's say they capture about 70% of the available energy from sunlight; the 30% which is "lost" is lost to entropy, in the form of the panels getting physically hotter in temperature despite the photo-electric potentials.
Plants, via photosynthesis, manage to capture 98%... that's some fine negotiations with entropy, IMO.
Photosynthesis has a 3 billion year lead on solar panels. I bet solar panels will be pretty great in 3 billion years.
Hey bro, I know this is old but where are you getting your stats from?
From sun to stored energy, plants have about 3 to 6% efficiency. Solar panels are about ~20% efficient now.
Perhaps you are mistakenly referring to the % yield of one of the reactions involved
@@DarkChasmGamers this man is right. Plants aren't that good at photosynthesis (all types of photosynthesis)
Source: plant biology
Everyone else when the train isn't moving : Frustrated. Angry. Play games. Talks to someone
Steve mould : "Hm... Time to teach kids what entropy really is"
This is the first satisfying explanation I've heard. Thank you!
It is actually wrong.
Electro-Cute why is it wrong?
it is only over simplified
+Ashwin Singh
He says that the energy will evenly spread out and never clump together, which is not true. Temperature and energy is not the same. If you have a piece of mettal next to piece of wood, both with the same temperature, then no net energy will transfer from one of the materials to the other. But there will still be much more energy clumped up with in the piece of mettal.
The laws of the universe governs the nature of how energy transfer and organizes it self. Thermodynamics gets aplied to these laws that we have come up with.
This allso paints up a world where the laws of universe are very simple at a microscopic level. But that doesn't have to be the reality. It could hypothetically be as intresting laws that governs the macro- and larger- scope of the universe as the onese governing the microspoe of the universe. Just that we have not noticed it yet.
I mean does these very simplified asumptions about the nature of thermodynamics aply to the realm of quantumphysics? If the laws of the universe are much more complex and dynamic than what we think then we could hypothetically exist in the "heat death" right now (a much more exciting heat death). That is what thermodynamics says.
Electro-Cute look up "heat equilibrium". The definition he given works if we assume that system is infinitely close to heat equilibrium.
I like the ping-pong ball box analogy, as it's also useful for explaining Chaos.
You could fill the box with half red and blue balls in the same way, and use a machine to precisely shake the box, then note the position of the balls. Put the balls back in their half/half configuration, and repeat the exercise. The chances are the balls will settle into a different position every time, and you never effectively predict where the balls will land. This system is chaotic.
Now I think that arguably, if the balls are set up precisely enough, and the box is maintained precisely enough, and the machine shakes precisely enough, and the temperature is precisely maintained, and it can be isolated from all external interference, perhaps you could predict the outcome of the test, and get the same result each time. However even the tiniest change in any factor will produce wildly different results, such that we cannot conceivably engineer such precise conditions to make the outcome predictable, even if we had an inconceivable amount of analytical computing power. That is chaos.
Entropy is the property of energy to tend toward uniform distribution, rather than concentration. That's how I've always described it 🙂Great video!
But then why do some exothermic chemical reactions have decreasing entropy values?
Sonic Reaper Some interactions result in less local entropy, but still tend toward the larger entropy of the universe. I believe he discusses that in the video 🙂
So basically diffusion
@@fluent_styles6720 Endothermic reactions happen (in terms of energy levels, there's lots of equivalent ways to describe them) because more energy levels become accessible to the products as the reaction progresses, outweighing those lost in the reactants due to the loss in temperature. So, overall theres an increase in entropy (which is the log of the available microstates). More specifically, the ratio of the partition functions of the products and reactants increases with temperature and so at some point the equilibrium will favour the product (there's also an enthalpic contribution due to the boltzmann factor, but that's less significant).
its not only a property of energy, you could use for information too... or whatever that has a statistical approach.
And here we find our Hero Cpt. Mould struck by the insane desire to clear up a basic physics concept during his daily train commute.
This video explains thermodynamics wrong. It is obvious that Steve Mould doesn't understand the difference between temperature and thermal energy.
If you have a material A with a very high thermal capacity next to a material B with a very low thermal capacity then A and B will try to reach the same temperature, but not the same energy dencity.
So in other word you will find a huge gradient between the energy density of material A and B, even when they reach the same temperature.
Lets say that material A is much more dence than B and both have the same bolume. This means that it is more likely that the energy will be transfered from B to A than from A to B. Because there is more mass in A than in B that can hold on to the energy.
So in the end it is all about probability. It is more probable that the energy will be transfered to the material of high thermal capacity than the one with low thermal capacity.
The most probable balance of energy depends on the laws that governs the universe. In the case of two simple materials like this it is governed by the laws that define the property of different materials.
In a hypothetical case our universe could be governed by laws that we may not allready know of; very complex laws that states that our universe is allready existing in thermal equalibrium.
Electro-Cute high energy density doesn't mean it will transfer more energy to a material of lower energy density at the same temperature, it just means it has more energy states accessible to store energy. Two touching materials at the same temperature will have an even energy transfer in a closed system no matter what their heat capacity is. Their energy content will not converge unless the materials themselves decay into each other to form a homogeneous spread of atoms.
I know that, it was allso kind of my point. Temperature is a thermodynamical perspective on a very simplified system.
What the guy in the video said was that the systems will reach equal thermal energy distribution (or at least that is how I interpreted it). But that is only true for few special cases.
The context of the discussion and accompanying diagram make it clear that he is talking about two identical slabs of the same metal. He could, of course, divert to say that the thermal energy distribution would be different for slabs of different material but that would be unnecessary for this discussion (he's not talking about the relationship of thermal energy and temperature) and make the discussion overly complex.
@@andrewrobertson444 I don't agree that this is obvious, and even if it was how does this explanation explain what happens when the two metals aren't the same? That's not at all obvious to me, and at this point I've had about two years of thermodynamics classes ranging from gen chem to graduate level statistical mechanics. I don't see how someone who is totally ignorant of the topic is supposed to arrive at reasonable conclusions from it.
But really, the most damning thing is that this video simply isn't an explanation for entropy. What he is describing is not entropy. He is describing a consequence of the concept of equally probable microstates that happens to work well for the few systems he described and few others.
My takeaway from this: that is one cool toy and I’m gonna order one 😁
Lol same 😂
5:51
@@WanderTheNomad bro we understood but its like that interested us more haha
it's actually a kit, so you also have to build it. it's made by kontax i believe, out of germany. they're not cheap but they are neat. big recommend from me, i run mine on a cup of coffee for guests
@@SassyTesla here we are in the future - my wife bought me one as a gift - yes I built it - and your coffee idea is being put into effect right now. It’s the gift that keeps on giving 😁...
You ought to do a video about that sodium acetate hand warmer you used. They’re really a pretty brilliant example of the latent heat of fusion given off when you have a phase change from liquors to solid. Orange farmers in Florida use the same phenomenon in water to protect oranges from freezing. They spray water on them and when the water freezes it keeps the oranges from freezing.
Liquors you say?
@@linuslundquist3501 - Liquid to solid is what he meant, a phase change.
I know, but it was a funny typo@@WJV9
cool thought experiment: assume the universe is deterministic (law of concervation of information applies), press pause on the universe, play it backwards so that entropy will get lower, and displace a single atom somewhere. chaos theory will dictate that the cumulative effect of that displaced atom on that backwards running universe will cause an alternate progression of events where entropy ack goes up, but only to the objects where its influence has time to reach. hence, everything outside this luminar-speed bubble will run backwards and everything inside, will run normally. if you look at it from the outside, it will be a collapsing bubble where things are running backwards. bouncy balls will jump onto ledges, and coffee cups will unstir themselves until the bubble suface arrives!
+dt28469 nice!
dt28469 j
If I get rich enough to buy my own universe Im going to try this.
DT28469. there is a potential issue with this time reversal anyway. there are simple configurations in newtonian kinematics, that are non-deterministic. Certain collisions between 3 (ideal) elastic collisions that are n-d. So reversing "time" would almost guarantee a different backward sequence.
Great explanation! Another fun tidbit is that "entropy" is also used to describe how much information is encoded in something. This seems counter-intuitive at first when compared to the heat death of the universe, but it makes sense if you compare it to Steve's example of the balls in the box. For example, your hard drive is in a more orderly (less entropy) state when it's empty and all the bits are zeroes; when you fill it with data it has more information but you also can't easily predict if any given bit is a zero or a one (more entropy). James Gleick has a great book on this for those with some "temporal entropy" to kill.
Indeed, I heard something similar when talking about compression of data. In short more you compress data, more you are reducing orderly, predictable parts, which means if you compress something to maximum then all you are left with is pure chaos.
"You never see it in reverse" Me, an intellectual: *clicks rewind*
_Reality Can Be Whatever I Want_
ok funny guy
thats how the idea of tenet came
😂😂 you are a genius
Check Out our Video on Entropy!!
th-cam.com/video/pKGhC2GKpm8/w-d-xo.html
You make complex subjects simple and fascinating . I can’t ask for more than that.
Probably your best video yet. Very well explained, with great examples. Great job Steve! Waiting for the next. :)
+Dragos Puri thank you! I really like doing these explainer videos. Will do more I think.
I agree!
This was beautifully explained. I am so glad I'm not the only one in conflict with the conventional definition of entropy.
Please do, they're great, this is my first time watching you(apart from brady's videos), and out of all the videos I watched on entropy(because I had difficulty accepting it) yours was THE BEST, even better than MITOCW.
Thank you Varad. Really appreciate comments like this.
This video explains thermodynamics wrong. It is obvious that Steve Mould doesn't understand the difference between temperature and thermal energy.
If you have a material A with a very high thermal capacity next to a material B with a very low thermal capacity then A and B will try to reach the same temperature, but not the same energy dencity.
So in other word you will find a huge gradient between the energy density of material A and B, even when they reach the same temperature.
Lets say that material A is much more dence than B and both have the same bolume. This means that it is more likely that the energy will be transfered from B to A than from A to B. Because there is more mass in A than in B that can hold on to the energy.
So in the end it is all about probability. It is more probable that the energy will be transfered to the material of high thermal capacity than the one with low thermal capacity.
The most probable balance of energy depends on the laws that governs the universe. In the case of two simple materials like this it is governed by the laws that define the property of different materials.
In a hypothetical case our universe could be governed by laws that we may not allready know of; very complex laws that states that our universe is allready existing in thermal equalibrium.
Cool ! A good demonstration for non physicists. No confusion, no philosophical considerations : just definition and good interpretation. Clausius and Boltzman concepts simply and methodically explained. Nice !
+François Cauneau thank you!
He totally blasts you with a deep philosophical argument near the end. "An argument can be made that time itself is a statistical phenomenon." If that doesn't get your Stirling engine warmed up I don't know what will.
Aah how satisfying it is to here such a clear explanation of entropy. Nice work.
+美琴御坂 thanks!
10:00 no it's not just blind chance. Moving from low entropy to high entropy is moving from high potential to low potential. You need to do work to move a rock up, but you don't need energy to drop it. You need energy to heat or cool things, but you don't need that to spread heat from hot to cold. It's equilibrium not chance. Nature tend to seek equilibrium and low potential and high entropy. You need to force it otherwise by exercising work.
Steve you are so good at explaining difficult concepts! Could you do a video on pKa and pH please?
Thanks for the video! I've never understood why a more entropic state is described as more disorderly? Isn't the universe in more 'order' with energy more equally distributed? To me, clumping energy together seems more like the chaotic state of the universe. The universal heat death is when everything is evenly spread out and in order.
That confuses me as well, but I guess I do consider my home more in order when my stuff is concentrated in a few spots than when everything is "distributed evenly" across the place 🙂
0:30 What is entropy?
3:20 How does a stirling engine work?
10:45 Where can I get one?!
I gave my Second Law lecture to my Freshman chemistry class this morning and that is exactly the way I say it too: increasing entropy is about spreading energy out. Our illustrations are at the molecular scale, but it always comes back to energy spreads out because states of the system that overwhelmingly most likely are those with the energy spread out.
This has to be one of the best explanations for entropy I have heard so far!
What always confused me about the common definition of entropy is that to me it makes more sense that evenly spread particles and energy is more orderly than the clumps of energy.
This definition of "increase in the spread of energy" makes much more sense to me.
Yes!! The popular use of "chaos" and "order" always seemed entirely backwards to me.
@@iamtheiconoclast3 The rationale has more to do with "information" than "order".
If all the particles are in a single corner of the room, you can describe the system with something like "1000 particles are in a square in the bottom left corner of the room with a distance of 1 unit between each". That's a single sentence which you can use to determine the exact location of every single particle in the room. If they are spread out throughout the room, you must give precise coordinates for every single particle individually, which in this example would be 1000 sentences long, to achieve the same level of knowledge.
You require more information to fully describe high entropy systems than low entropy systems, and in that sense they are "disordered".
The real question is not why entropy increases but how the universe was able to be created from Entropy itself
That one-in-a-gazillion chance.
Incredibly unlikely does not equal impossible.
It's not exactly an every-day event.
@@antonystringfellow5152 only if you assume infinity time!
@@ffccardoso ...and space.
Come on man. Either gravity or electric forces, started getting shit together. Formed s black hole and then a black asshole. Bang, clampped universal turd energy
This is why the big crunch hypothesis makes most sense to me
I have that engine plus another Stirling. Now I know how to introduce my grandkids to entropy. A great explanation that helps one visualize. My intro to entropy was in 1967 2nd year thermodynamics in a classical physical sciences program. The prof was really good but visual props were limited to spelling out Noel complete with the 2 dots over the e in a pre-Christmas review of various equations
Always watch Steve's videos until the very end. Don't ever leave the video thinking it's basically over.
Or a kitten will die.
will definitely be getting a wee sterling engine for my shelf!
I remember in high school a girl in my physics class concluded half jokingly that everything would eventually end up as heat, and the teacher agreed. Never knew that it was called entropy tho... Cool video!
Dear Steve,
First of all, many sincere thanks for your efforts!
...By the way, ENTROPY is not only about particular engines/motors, etc. It is about principal impossibility of PERPETUUM MOBILE in general, according to Nicky Carnot. Now let us go...
A. There is ONLY ONE BASIC, fundamental Energy Conservation and Transformation Law. It is definitely unique and conceptually indivisible delivering two logically joint concepts - these are Energy Conservation - and Energy Transformation. Still, a more-then-100-years-old conceptual failure has brought us to two separate thermodynamic laws - but this has nothing in common with the actual physics. To come back, they have coined two more fake thermodynamic laws, employed the Probability Theory + Mathematical Statistics, and this has helped formulate the Quantum Mechanics, which is thus a basically metaphysical conceptual construction and thus ought to be only restrictedly fruitful.
B. By dividing the basically indivisible law, you are telling about Combinatorics, you are touching Probability Theory, you are even stepping back to Thermodynamics for a while, but...
You are NOT answering the poser: WHAT IS ENTROPY, sorry!
1. In the formula S = kB * ln(Ω) you imply, Ω means not a "Huge Number of Microstates", not "Probability", which numerically ranges between [0,1], not even "Wavefunction", which ought to be a purely metaphysical notion, as it is... In effect, Ω ought to be a simplistic algebraic function of Lord Kelvin's Absolute Temperature. This result has been published 100 years ago in JACS.
2. WHAT-ENTROPY-IS-poser has been answered not by Clausius, not by Boltzmann, etc., but by Goethe, who has introduced Mephistopheles, the philosophical embodiment of ENTROPY.
3. Newton did basically know WHAT ENTROPY IS - A Counteraction.
4. That Counteractions do not grow to infinity with the growing Actions, but MUST reach their MAXIMUM values, is the result by Nicky Carnot formalized by Clausius...
5. In effect, Gibbs Energy formula renders implicit the interplay among ALL the relevant Actions (the Enthalpic term) and ALL the Counteractions (the Entropic term).
6. The standard approach you are reporting about is OK for the implicit Enthalpy-Entropy picture, employing it for studying reaction mechanism details is likewise eating soup with fork.🧐
I've always used a cup of coffee as an example of energy & entropy. You pour cold creamer into hot coffee, and the creamer will appear to distribute it's self through the coffee until it reaches equilibrium. This will not work if the coffee and creamer are the same temperature.
Tony Tuite For me I think of two spheres connected by a tube with a lot of balls on one sphere which represents the energy and they randomly move around and over time it will eventually fill up the second sphere and since it has much more space then it has much more probability of the balls ending up in there until they reach a state where both spheres have the same probability of the balls ending up on either of them.
But entropy could actually decrease in the open system of the coffee cup if enough heat is lost to surrounding air.
TJ Simmons Yes because a coffee cup is not a true closed system. It's merely an analogy
But creamer does distribute itself throughout the coffee assuming the same temperature...
For me it was similar, but the example we used was milk... because we're not savages
9:15 there is actually more than one way red and blue can end up on each side. If Nb is number of blue balls and Nr number of red than there is = (Nb)! * (Br)! permutations it can happen.
Now that I finally understand entropy I can say I'm lazy because I don't want the universe to die faster.
reset*
Welcome to the Negentropy Alliance! [Look up "Orion's Arm" "Negentropy"]
This might be the most elegant explanation of entropy I’ve ever heard and I’ve been educated in Aerospace Engineering, so this says a lot. Entropy is so complicated to even the smartest people that explanations are faulty or sometimes wrong, even when described by the smartest people.
No beaker, but at least you've tried :)
+LittlePeng9 I had to work with what was on the train :)
LittlePeng9 DOOOWEEEOOOOOOOOOHHHHHHH
Beaker wasn't a terribly good scientist, so better not to use him. Actually, neither was Professor Bunsen Honeydew... meep
I strongly recommend to watch the 3 brown 1 blue video on the heat equation.
Entropy - measure of how spread out your energy is
Well since our universe is expanding continuously
The entropy of our universe is continuously increasing
continuously but not indefinitely....
Those two things aren't necessarily related. You can still have "clumps" of energy spread out in a confined space.
@@Relaxing137 The heat death would kill the expansion?
I really did not want him to stop talking!!! I felt i was just getting into it when it cut to the external engine at the end. I love when topics such as these are explained in layman's terms, thank you 😊
Explained so well and I didn’t have to work to stay focused like so many of these videos!
Entropy is the cause of the existence of time. And if there is time, there is entropy.
In quantum systems, entropy may increase or decrease.
Essentially, as long as there is an arrow of time, there is entropy.
Because of the way Quantum mechanics works, time can be turned backward and reverse entropy.
There is still a backward arrow of time (so a backward entropy).
Entropy is required for time. If entropy did not exist, we would not even have a universe to live in.
Rather, we would have a static universe (maybe because our universe is expanding, it could last forever).
Time exists on the local level inside a fixed size universe, if at all.
This means things can still happen but will cease to exist after entropy is completed on all local scales.
Entropy isn't just the transfer of heat, but the transfer of information.
The transfer of information takes time.
Therefore, the speed of information transfer is the speed of light.
The speed of light is the fastest speed possible for a massless, yet information bearing, particle or wave.
That means information, through light (or any other light speed particle/wave), will set the maximum speed of information transfer.
In a world where light travels instantaneously across any space, cause and effect break down.
Light from a billion miles away will reach you at the same instant as light an inch away.
Light from any distance will reach you at the same time.
Cause and effect become simultaneous and can't be told apart.
This is the view of the photon.
Its life begins just as it ends as seen from the view of the photon.
For time to exist, you must be able to tell cause from effect.
It is in the definition of time that you must be able to see cause than effect.
In conclusion, entropy is the cause of the arrow of time, it forces information from one place to another or else the universe shall be static. This information travels at a finite speed to preserve time as the transfer of information from one cause to the effect starting its own cause and effect.
In a world of teleportation of information, you take the view of photons (or other lightspeed things) and pop into existence at the same moment you stopped existing.
In a world of infinite speed particles or waves, the information could just choose to stay in the position as it was before and do nothing until it does, but because our worlds speed of information transfer is finite, we will see time constantly moving through each small cause and effect building up to a larger cause and effect.
Therefore entropy is a stabilizer.
Ron Harley Pantaleon What is entropy the stabilizer of?
Yes I agree - without entropy, the universe would be useless, pretty much, with no movement or anything. So to get something to move or exist, you need some measure of entropy to achieve it, suggesting entropy allows for the formation of galaxies, planets, life etc. I do believe however that the more we look into quantum mechanics, I think we are going to find more interesting laws which builds on what we have already discovered.
Hey bro.. can't understand ur depth...but am desperate to understand....can u help me?
I've watched this like 30 times now
Kyun ek baar main samajh nahi aaya🤣
A few observations
First, there are more sources of energy than fossil and solar. (Fossil being basically stored solar) There's also nuclear decay and lunar (tidal). These have nothing to do with solar energy, to my understanding.
Second, there's more than one way for all the red balls to be on one side and blue balls on the other. For example, red ball #1 and red ball #2 could swap places. I guess it would be a permutation of the number of red balls and the number of blue balls, which explains why if there are only 2 red and 2 blue, there's a decent chance that all the blue will be on one side and all the red on the other.
Thanks for this video! It is a much better explanation of entropy than the one we learned at school. Just a super nitpick (to be fair, you used the term ‘for statistical rigour’ so I think I can make this point!) but having all the red ping pong balls on one side and blue on the other is not just ONE of the possible arrangements because each of the reds are interchangeable offering a very high number of possibilities of red on one side (and of blue in the other side which are also self interchangeable). But even that high number of possibilities is thoroughly dwarfed by the number of overall possibilities, therefore we never see it. That being said, obviously, I know you were keeping things simple!
Yes and double that, because the reds could be on either side with the blues on the opposite side. But it is thoroughly dwarfed by the total number of overall possibilities. Statistics you are a cruel mistress! Great video thanks!
@@TennisCoachChip Statistically, the red and blue balls would rearrange themselves back into all red on one side and all blue on another, though far less frequently than all other possible combinations. What if you just had one red ball and one blue? Or (the next step up) two red and one blue? And so on.
Thanks, Steve! I had the basic idea, but you really brought it home for me. And that was a nice tie-in to the heat death of the Universe.
"In thermodynamics, heat is energy in transfer to or from a thermodynamic system, by mechanisms other than thermodynamic work or transfer of matter."
Heat is a transfer of energy, not a state of being hot. You absolutely need heat to run the sterling engine.
No you dont need heat. You need temperature difference between the two plates. The temperature difference is going to cause the heat to flow, which causes the engine to run. So heat flow is caused by a difference in temperature. Ultimately, what runs your engine is the temperature difference. Once the two plates are in equilibrium, your engine stops.
@@menerke Heat isn't defined the way you are defining it
I just finished a degree in physics and didn't even know this, but it makes total sense. Christ that's embarrassing.
@@menerke As you say, there needs to be a transfer of energy from the hot plate to the cold plate. That _is_ heat.
I think Steve's point is that you don't supply the heat, the energy transfer via heat happens as a result of the difference in temperature. I guess you need heat in the sense of you need the concept of heat to exist, but you don't need to supply any.
Alan Guth once said something like: ~gravity is the ultimate free lunch~, e.g. stars.
It seems like instead of being spread out, energy collects in black holes, but is *unavailable*.
Same result of course, but for different reasons.
Thank you I actually understood this. I've always thought that entropy was energy spreading out but then I heard the definition of disorder and chaos and I got really confused.
This video should be titled "THE best description of entropy" :) thanks for it. It really helped me understand entropy
I remember I've always been wondering what entropy really is because wether a room is messy or not is subjective, the word disorder is also kind of subjective.
The degree of disorder that qualifies as messy is subjective, but the fact that "less disorder is less messy" is the objective part. If someone splashed paint on your wall it would be considered messy, until you find out their name is Jackson Pollock. In that case, there are relatively few random configurations that are similar to this highly specific, one-of-a-kind work of art.
Thank you for explaining this so clearly!
The only bad thing about this channel, is that it doesn't have more videos.
+Alejandro Ferrari :) working on it!
Well I'm really glad this video came in my suggestions page, I'm in love with your channel, the video was perfectly executed, brilliantly explained. Great video and well, you just earned yourself a new subscriber :)
+Gautam Passi thank you :)
Steve Mould Whoa! Wait a second? Are you "The Steve Mould"? The Mould Effect is named after you right? I just realised that, that's seriously cool. I've watched your several videos, and you're doing a great job. Really glad came across your channel. what more effects you've in you mind to be named after you? :D
Thanks Gautam! Always looking for the next Mould Effect :)
Entropy just sounds like a convoluted way of explaining equilibrium.
Actually, we can define that a system has reached equilibrium if it has reached the maximum amount of entropy.
But doesn't equilibrium sound like the ultimate order?
@@P_Ezi it does, unfortunately. The subtlety is in how we use the words in common language and while dealing with terms technically. Equilibrium is a state which every system desires to have, eventually. And when it is achieved, no system wants to move out of it. In other words, there is nothing left for it to do now, other than maintain the equilibrium. I'm repeating it again, "there's nothing left to do", which implies all energy has been spread out, and now there's no scope for work or heat. All energy has been spread out, and using the definition of entropy given in this video, the system has achieved maximum entropy. Now, the "most dispersed state" may seem like "the most ordered" when you look at it from the macroscopic level. At the statistical level, it is the most randomly distributed state.
@@tusharpal8041 well said. It has always bothered me when people say that things move toward disorder. "Disorder" and "randomly distributed" seem like very different concepts to me.
@Dr Deuteron Thanks for correcting me! I was wrong in phrasing things to make it sound like there is only one state for maximum entropy, or the equilibrium state. I realize now that that statement is true for macrostates, not microstates.
P.S. if time was infinite there would be another big bang, since energy clumping together is just really unlikely, thus entropy doesn't NEED to increase it's just really likely, but if time is infinite everything with a non 0% probability will happen no matter how low it is, like all the energy clumping in one spot.
Only if the universe isn’t infinitively large …
Boltzmann’s great contribution to science, as Leonard Susskind says.
4:20 So I'm just going to reiterate that: THERE IS NO ICE ON THIS TRAIN.
@aboctok I think we all saw it.
If it doesn't have ice it can't be in America, it must be a European or English train. Granted the fact it's a passenger train pretty much suggests it isn't in America!
Giorno and all his friends could be dead if they used this train
@@resonmon Poor Mista
With you so far. The big question is why, if it is so statistically improbable, do we find energy so clumped together in the past? If the energy distribution of "heat death" is so much more statistically likely than very clumped-energy states of the universe, why isn't "heat death" the state of the universe in both the future AND the past?
Also, if you observe the changing universe in temporal reverse, can you identify which physical laws are causing energy to defy statistical probability, and clump together? In other words, which physical laws underpin entropy and thus the direction of time?
Maybe you could make a video about this?
You might not like my answer but statistic is about observation, not about a rule.
If you accept the big bang theory plus the idea that our universe is the "white side" of some sort of black hole, it is gravity that first clumped everything together. Then, when all that energy was released, there were infinite ways to spread it out, sometimes making entropy seem to go temporarily backward, like star formation or human technology but there is still a statistical tendancy towards more entropy.
The big bang was the game changer, and it's hard to think how that could happen unless there are other universes somehow linked to ours.
When I was younger, I used to think time didn't exist, that it was made up by humans who wanted to keep track of things like dentist appointments and yoga classes. Your description of time measured by entropy really made me think about the nature of time. Theoretically, if, in a freak scenario of statistical unlikelihood, entropy in the universe started going down, would that mean time is flowing backward? Great video, you have some quite profound insights into entropy in ways I've never considered.
That's an interesting way to look at it!
The time would probably not go backwards, since there are many possible arrangements. Like in the example of the mixing balls. Red on the left and blue on the right. When the mixed state decreases in entropy, there may be a result where all the red balls are on the right side. Which has the same entropy as in the beginning, but is a different "past".
I think its more like since entropy always increases we can use that as the definition for passage of time as time also always goes forward (pls dont say time travel. It doesnt exist yet). In the statistical unlikely event that entropy decreases, our definition of time being related to entropy will be wrong. So time will still flow forwards. It's possible I'm wrong though.
True facts, scientists only know about time in its relation to space, they dont know what causes time to pass
No, while it's an interesting question, a decrease in the universe's entropy does not imply a reversal in the direction of time. If time were started to flow in reverse, then the total entropy of the universe would decrease, but if the total entropy of the universe decreased it would not imply time started flowing backwards. Also, a reversal in the direction of time seems to violate the fundamental nature of time since if time reversal started at some time A would it continue backwards in time, or would it reverse back to the direction we're familiar with at some time in the past B? In either case it seems we're left with paradoxical situations since if time continued to flow backwards indefinitely, it would progress to very the beginning of the universe. And if on the other hand, time suddenly began flowing forwards again, then once we reach time A again, we have reached the moment where time starts flowing backwards, so the universe would be permanently oscillating between points A and B. My argument relies on the assumption that the universe is deterministic though.
My description of entropy is:
S = k log W
where W is the number of microstates, k is the Boltzmann constant (found by Planck), and S is the entropy, of course...
Another good description of entropy is a science fiction story by Isaac Asimov, entitled "The Last Question"...
Thanks, Steve. Definitely the most intuitively understandable definition of entropy I've heard. Good on ya for giving the Stirling engine company a shout-out without compensation. And finally, nice job decreasing entropy by filming your video at a time you were required to be sitting around waiting anyway. 💡
I always describe entropy as amount of energy in a system unable to do work
That is, in fact, one of the standard definitions of entropy.
Here in 2020 after watching tenet, makes a lot more sense why they kept on going on about entropy.
Tenet is just a disaster on the physic aspect, absolute non-sense
@@liotedupuis5311 Dude, It is just fiction... All he did was create a spy thriller with his own twist to the tale. If you really care about physics go watch interstellar(even that film had fringe science)
@@aravindmuthu95 interstellar is also a disaster on the physic aspects. However i never said that the film was bad, only that the physic inside is completely wrong (i do love fiction too ). So it doesn't make sense to use tenet as a physic reference, it's like quoting donald trump about global warming issues.
I wish some had made this kind of video when I was in my school struggling to get physics in to my head......I cant thankyou enough for this video......Simply superb
Hey mr.steve i really like your in depth definations the afterwards knowledge we have after watching your videos is just off the charts really appreciate your hardwork and thanks for educating these various complex yet beautiful topics hats off to you man
I would really like to take the seat besides Steve, when on a train (actually, anywhere)!
That was really interesting for this reason: given the amount of time the Universe has until it's heat death, that is enough time to allow for the statistical possibility that all the red and blue ping pong balls order themselves just once, and when that happens (if) then that would be a scary amount of enthalpy that could suddenly become available. Given that the Universe is finite yet boundless (and really, really big) I know that conventional wisdom says that it cannot happen but in all the time available there could be prolific pockets of enthalpy eruptions. I wonder if that actually happens quite regularly at the quantum level... I'm joking of course, since the discovery that entropy can decrease over time in a quantum system and quantum effects can be used to "clean up the states of systems". Interesting huh :) What I wonder is that if the quantum example can extrapolate to the macro system, or rather how it does that, similarly to how known quantum effects extrapolate to the macro universe we experience around us...
My basic understanding is a star being born from gravity is the clumping of entropy...
Michael Borne,
If i understood the question, it Still wouldn't allow me an answer, ie: i don't know . . .
I'm afraid not, especially if you're thinking that maybe it would work with an actual fluid with a realistic number of particles. If your box of ping pong balls had 100 balls in it (so 50 of each colour), then the chances of a single throw yielding all the blue balls on the left is 1 in 100891 trillion trillion (that's 10^29) - far less than the number of years until the heat death of the universe (10^100 years according to this video, or 10^106 years according to first Google result) but still huge. If you have 1000 ping pong balls then the chance of a single throw yielding all blue balls on the left is 10^299 instead! (Too many trillions to write out here.) Even if you threw your box a trillion trillion trillion times per second, it would still take 10^256 years to be likely to get a neat arrangement of balls, which is a trillion trillion trillion trillion trillion trillion trillion trillion trillion trillion trillion trillion times as long as the time until the heat death of the universe. That's just 1000 balls! Now consider trillions of particles in a relatively small container, and you can see that it's just impossible for that to spontaneously organise itself, even on the timescale of the universe.
Absolutely great video! Not only your definition of entropy is, in my opinion, a lot better than the "measure of disorder" one, but also yours was the very best explanation of how the Stirling engine works (I love Sterling engines).
Your way to explain stuff is second to none.
Thank you.
TECHNICALLY heat is the movement of thermal energy, so TECHNICALLY you need heat with a nonzero magnitude ;p
I'll be curious to see how gravity fits into this model when we understand it fully. Gravity being the natural clumping force of matter, it almost seems anti-entropic.
I know it's been ages since the comment, but I think you still might be interested.
There are some people that actually try to describe gravity as an entropic force itself. What does this mean? If you think about it, when you aggregate mass together you put it in state in which more configurations are possible. And thus it's increasing the entropy. Many particles apart have only one configuration, many particles together have loads of them.
I always thought that it was a beautiful idea, but I don't know how seriously people take this route of research...
5:51 so my preferred one-line definition of entropy is that it's a measure of how spread out your energy is.
Isn't this like the famous definition of entropy - the measure of how distributed the energy is in a system?
I've probably watched 10 videos on entropy on TH-cam to try and understand it as my book teaches. This one is the best.
Oops! As Steve was making a point about "statistical rigour" (around 09:00), he made a little mistake by saying there is "only one way to have all the ping pong balls that are red on one side..." and all the blue ones on the other side. Actually, depending on how many balls there are, there will be lots of ways to have the same picture. You can do a great number of permutations where all the red and blue balls are separated on each side but not in the same place (as would be more evident if each ball was tagged with a number). But of course, there are still much much more different arrangement of balls where they are all mixed up.
Nonetheless, a great video!
Well i think part of the point is that all same coloured balls are indistinguishable from eachother (which would make any configuration of all reds on one side and all blues on the other the same in the statistical calculations (assuming you also can't distinguish directions, so all red on the left and all red on the right would also be the same).
Statistics may be applied categorically as well as to permutations, don't forget. "One way to divide balls" may serve perfectly well as the statistical "E", the event to be studied; E is not restricted only to the specific arrangement of hot/cold units. Further, if we discuss a specific system, one more relevant to the subject of entropy, in which the blue and red balls are hydrogen atoms, say, and their only difference is the individual kinetic energy, all within a band near to room-temperature, there is no other meaningful difference between one hydrogen atom and it's neighbor. They are identical, and any one atom's location has no meaning to a study of their possible permutations. It's why the heat death of the universe remains absolute even if you somehow swapped the locations of some absolute-cold atoms with other identical, absolute-cold atoms: "death" has occurred and time has stopped because time has ceased to have any meaning. It's important to remember that maths does not exist in any objective manner; mathematics is merely a language convenient for humans to describe what the universe means. It cannot out-endure, or restore relevance to, a system devoid of meaning. If the red balls corralled on one side could be made entirely identical, some other arrangement of red balls has no different meaning, and there *IS* "only one way" to discuss their separation from the blue.
Besides, atoms don't really have precisely meaningful "locations". That's just an added little googly thrown at us by a universe with a wicked-ass sense of humor.
wow who would have thought a fungus could teach better physics than community college professors
derp derpina xD has to look at his name twice to get this
@@samk6042
Oh that makes more sense, I thought it was supposed to mean that he is a fungi.
Probably community college students
Check Out our Entropy Video!!
th-cam.com/video/pKGhC2GKpm8/w-d-xo.html
I was fighting the never-ending entropy on my desk, but I got distracted by this video.. xD
burning fuels produces CO (carbon monoxide) which is a pollutant which is produced in more abundance than CO2 (Carbon Dioxide) which is needed for plant life, which plants give us O2 for animals to breath so that more fuel can be burned. Yet for some reason CO2 seen as pollutant.
Came here from Tom Scott
Really cool video! (and very emotive eyebrows :P)
+Ben786 thank you :). Can't tame the eyebrows!
He was an apprentice of Ze Frank
@@SteveMould 6:02 Can you explain how running it in reverse would cause one side to get hot and the other cold.
So time and entropy could simply be two different ways of describing fundamentally the same phenomenon. Yes, why not? After all, it used to be thought that heat and light were completely different things, until we realized that they were simply different manifestations of the same entity - energy. Then later, Einstein showed us that space and time, thought to be completely different things, were actually just 2 aspects of the same phenomenon (space-time) and that energy and matter, again originally considered unrelated, were made of fundamentally the same stuff. So it doesn't seem in any way preposterous to suggest that time and entropy, rather than 2 distinct things, could turn out actually to be just one ("time-entropy"). Factor in a few other assorted fields and forces and - who knows? - we could just end up with one single, all-encompassing cosmos-wide phenomenon: matter-energy-gravity-magnetism-spacetime-entropy-...
Awesome video! Love the explanation, makes you think about renewables. Energy from the sun also creates the wind etc. Not sure how tidal fits in though, if we use too much tidal energy will the moon fall to earth ?
I've learned so much and forgotten all of it at the same time