Entropy and Time
- Published on 24 Dec 2024
- Explains how going forward in time is different from going backwards in time.
For more information, I recommend The Road to Reality, Chapter 27 by Roger Penrose.
Created by Paul Merrell
------------------------------
Music: 9th Symphony, Finale - Beethoven • 9th Symphony, Finale -...
------------------------------
Images by Frederick Sargent, Louis-Leopold Boilly, Messrs Dickinson, NASA,
Taryn Elliott, Polina Tankilevitch
Weather Data: earth.nullscho...
Correction: Entropy does not decrease in a room when you run a refrigerator. I had gotten the concepts of entropy and exergy (Gibbs free energy) mixed up. The principle that entropy can decrease somewhere as long as it increases by more somewhere else is correct.
Instead of the entropy of matter, can you make a vid on the entropy of gravity, as per Roger Penrose's research on asymmetrical time and the de Broglie-Einstein relation? Thanks
@user-ky5dy5hl4d We always try to define concepts in terms of more basic concepts. You can define things like entropy, energy, speed, etc. in terms of more basic concepts, but at some point you will reach the most basic fundamental concepts. Time is, as far as we can tell, a basic building block of the universe. I don't know how to define it in terms of something else.
I don't think there would be a better explanation of entropy on the entire TH-cam than this.
I haven't come across a more wonderful way of explaining the whole idea of entropy so far. Good work!
When the world needed him most
*He returned*
So glad I found this channel! My mind has been blown so many times
This is one of the best - if not the best - films I have ever seen to explain entropy and the first 2 laws of thermodynamics to people who are not physicists. Beautifully done. Wonderfully performed.
One of the best videos on the subject of entropy on YT.
Wonderful explanation... now I understand what entropy actually means after 10 years of study 👏🏻👏🏻 Appreciate your efforts 😇 Love from 🇮🇳
Great explanation! And that ending 😂 Glad to see you're back!
THANK YOU SO MUCH! After listening to many explanations of what entropy is (and never really understanding it), I am finally getting a grasp on it.
U did a great job! Thanks
Hey man, your videos are incredibly well made. Learned a lot. I can't believe I've never heard of you until now.
Thanks, spread the word!
Love the smash-house ending! Epic.
Welcome back!!!
Yay you're back. 👏👍
Beautifully explained.
Chris Nolan: "Hold my beer."
I also read somewhere (I can't recall the book, I think it was "The Collapse of Chaos" or maybe a book by Paul Davies) of the use of simple arithmetic to explain the complication of time reversal. It's used merely as an illustration and not a direct explanation. But if you take a number like 4, you can say 1+3, or 2+2, or 4+0 all equal 4. That would represent the normal arrow of time. But if you reverse it, it becomes difficult to discern what the 4 came from, due to convergence and dynamics. I'm probably not explaining it well, but at the time I found it helpful to picture the problems inherent in actual time reversal.
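A tiny Python sketch of that illustration (just a toy, assuming nonnegative integer addends): going forward, several different pairs land on the same sum, but going backward you can't tell which pair you came from.

```python
# Toy illustration of time-reversal ambiguity: many "pasts" (a, b)
# converge on the same "present" a + b, so the reverse map is not unique.

def forward(a, b):
    return a + b

def backward(total):
    # every past consistent with the observed present
    return [(a, total - a) for a in range(total + 1)]

print(forward(1, 3), forward(2, 2), forward(4, 0))  # 4 4 4
print(backward(4))  # [(0, 4), (1, 3), (2, 2), (3, 1), (4, 0)]
```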
The same Paul Davies who couldn't dismiss a God explanation in 'Demon in the Machine'? Get serious. He doesn't have sufficient scepticism.
Yay glad you are back!
There is a little bit of kinetic energy imparted to each particle by colliding with the oncoming piston, but this is minimal.
In an AC, this compression occurs outside, I believe, and then the liquid is allowed to gasify indoors. The reverse occurs: particles with a lot of kinetic energy (at or above escape velocity) are able to leave the liquid state, but they are slowed down by their attraction to the liquid. This is how the evaporating refrigerant cools.
Really cool video, thank you 👍
1:08 It's not the same - the vectors have different directions
I'm not saying you're wrong about refrigerators using mechanical work to produce a temperature difference. But mostly the way they work is that as you compress the refrigerant, the refrigerant particles are more likely to attract each other and be able to condense into a liquid. The attraction of each particle to the forming liquid speeds it up, thus increasing the kinetic energy of the particle (the thermal energy of the system). It's exactly analogous to the massive increase in kinetic energy, and thus thermal energy, in the formation of planets and stars from the collapse of all that matter into one body, except, of course, that planets and stars also get a lot of heat from nuclear reactions.
If you're still doing these, I'd love to hear you explain differential rotation and the math and astronomy of eclipses.
I'm working on a video about eclipses. Although it may not be ready for some time.
In a way it makes sense that the universe was extremely well organized in the beginning. If it hadn't even existed yet, there was nothing about it that could be disorganized, and as it disorganized into more variety, there was opportunity for even more disorganization, in a continuous exponential cycle.
You’re the best!
I love this video. This is one of my favorite subjects.
Personally, I don't think the entropy of the universe is increasing. I feel like saying the universe is unstable requires us to postulate a cause for the beginning, which requires us to postulate other universes and their causes and so forth. It's much simpler to just postulate a stable universe.
I think the thing is that we just have observed a lot of low-entropy situations decaying into high-entropy situations and rarely the opposite, so we assume there's a universal trend in that direction. But that could occur if we were born into a time and place that just had that trend.
It's easy to imagine this since most forces (nuclear and gravity) tend to oppose entropy. I know that's hand-wavy, but so is the argument that entropy is universally increasing.
Imagine if the universe consisted of a bunch of masses and the only force were gravity. With all the orbiting, there would be times that the whole cluster of mass would be at a minimum density. After that, we may observe "the universe tends towards entropy." Well, it would until it then started contracting again.
I don't see it as being all that different if you just add several more forces and conditions for those forces and several more types of mass.
It’s a similar process with crystallization. Particles are attracted into the crystalline matrix, so of course they are accelerated as they “drop” into place. This creates heat. So in this case, you are decreasing entropy in those particles while creating heat. I guess most would say that entropy is created elsewhere. I don’t see where.
How does life work? It's insane how cells organize themselves and "battle" to exist despite the energy science saying they shouldn't.
If the atoms move very fast when we heat them, then there must be vibrations when we heat any object, right?
Yes, that's right. When you heat a gas, the molecules are flying around like I showed. When you heat a solid object, the atoms start vibrating fast. Absolute zero temperatures mean no vibration.
Not sure what you mean. Another way to heat objects is to send electromagnetic radiation at them, like how the sun heats the Earth.
Bekenstein was an absolute force of nature
Destruction and Beethoven, you got me
It brings me joy as Ludwig would have wanted.
What a great video!
Wow u came back 🗿
The long waiting ends
I'm stoned and this is tripping the shit out of me.
So this is a great explanation and I love it. But I have a request. I get how entropy works, but I'm having trouble imagining the scenario where all the molecules end up in a corner of the room. I understand that it's a one in a huge number probability. I get that. I'm just having trouble even understanding how such a scenario could even happen, I mean the scenario leading up to it.
It occurs to me that it would be easier for me to imagine if I saw one of these reverse videos, where you ran a simulation that started all the particles in one corner of the "room" with different velocities, and then played it backwards to show how it could conceivably happen. Do you think you could do that, please?
OK, I made a special animation just for you: th-cam.com/video/PtMoX4RI5WY/w-d-xo.html
Notice that none of the laws of physics are being broken here (besides the 2nd law). The scenario is possible, just very unlikely.
@@ItsJustAstronomical Thanks. I love how the wall comes down like "Let's just make this permanent."
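For anyone who wants to play with the idea themselves, here is a rough Python sketch of how such a reversed simulation can be set up (assumed toy parameters, not the actual animation code): start the particles in one corner, let them spread out, then flip every velocity and step the same number of times. Nothing but the 2nd law is violated, and the gas "un-mixes" back into the corner.

```python
import numpy as np

rng = np.random.default_rng(0)
L, n, dt, steps = 1.0, 200, 0.01, 500   # unit box, 200 particles (toy values)

pos = rng.uniform(0.0, 0.1, size=(n, 2))   # everyone starts in one corner
vel = rng.normal(0.0, 1.0, size=(n, 2))    # random velocities

def step(pos, vel):
    """Free flight plus elastic reflection off the walls of the unit box."""
    pos = pos + vel * dt
    over, under = pos > L, pos < 0.0
    pos[over] = 2 * L - pos[over]
    pos[under] = -pos[under]
    vel = vel.copy()
    vel[over | under] *= -1
    return pos, vel

for _ in range(steps):
    pos, vel = step(pos, vel)
print("after spreading:", pos.min(), pos.max())   # fills most of the box

vel = -vel                                        # "run the film backwards"
for _ in range(steps):
    pos, vel = step(pos, vel)
print("back near the corner:", pos.max())         # ~0.1, up to rounding error
```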
Bro i wait for your videos ❤️🔥🔥
this is amazing
Hey, I've been into geology lately, and I want to learn more about astronomy, so I'm going to go back to your videos and try to understand them better. The Coriolis one broke my mind.
I don't know where you discuss it, but you once talked to me about convection cells in the atmosphere. Since I've been studying geology, I've been thinking a lot about convection cells in the mantle. My theory is that hot spots (like Yellowstone) are actually the centers of convection cells. But I'm not smart enough (or educated enough) to know how to think about the issue. I think it would involve:
- Understanding the properties of the magma, and what causes it to cycle in the first place.
- Some complicated magnetic crap.
- Some complicated crap pertaining to the spinning of the globe and the Coriolis force and stuff.
- Accounting for whatever is causing the poles to flip, or at least its effect on convection cells, if there is one.
- The resistance of the continents and the sinking of plates and other chaotic issues.
I bring all this up because while I don't have the ability to think about this, you do, and if you have any interest in this subject, you might be able to figure out some interesting things.
Because it is theoretically possible to go from high to low entropy, despite being extremely unlikely, I am captivated with the possibility of a 'dead' universe billions of years in the future randomly turning into another singularity and restarting the cycle!
You need to think much much longer than billions of years. The universe will be alive and well in billions of years.
I thought you were going to say that you were intrigued by a universe with extremely low entropy, like a universe consisting of a room where all the air is in one corner!
What word would describe the opposite of entropy?
Oh, and in reverse, particles hot enough to escape the crystal matrix are slowed by the attraction of the matrix as they leave it, leading to cooling. Hence ice can absorb our heat without heating up itself.
When the Universe was young, everything was a soup of particles moving around randomly. That is high entropy. How did it ever become low from there? You can't get more unorganized than that.
It's a great question. And one I find quite confusing. I was oversimplifying a bit when I said entropy = disorder. Entropy works completely differently when you consider gravity. Gravity is always trying to clump things together. So the "natural" high-entropy state of objects under gravity is to be clumped together. The fact that the universe was incredibly homogeneous and not clumped is the reason why it had such low entropy.
There's a lot of this that I don't fully understand, so I'll just quote from Sir Roger Penrose "As a way of appreciating the problem posed by the absurdly tiny phase-space volume, we can imagine the Creator trying to use a pin to locate this tiny spot in the phase-space, so as to start the universe off in a way that resembles what we know of it today... The Creator's pin has to find a tiny box, just 1 part in 10^10^123 in order to create a universe with as special a Big Bang as that we actually find... If the Creator were to miss this spot by just the tiniest amount and plunge the pin effectively randomly in to the maximum entropy region, then an uninhabitable universe would be the result, in which there is no Second Law to define a statistical time-directionality."
@@ItsJustAstronomical I was going to suggest the same thing. So here's a question. Overall entropy can be increased by simply expanding the universe, and what's inside that "perimeter" can become more organized, thus sort of trading space for order. Can the universe be infinitely ordered, since it can also expand to infinite space?
@@mitchjohnson4714 Entropy can decrease in one place while increasing in another. This sounds a little bit like the thought experiment that led to the idea of Bekenstein-Hawking entropy. Scientists were puzzled by the idea that you could create some entropy and then throw it into a black hole. But then they expanded the concept of entropy so that black holes have entropy too.
So here’s what I don’t get, and maybe you can help me understand. I understand entropy at a conceptual level, but I don’t understand it at a mathematical level.
You showed us a discrete model, but how does it work with continuous phenomena, like positions in space? I've thought about entropy in a chemistry context, so I'm also thinking about rotation of bonds, flexing of bonds, orbital configurations... basically all the ways that matter and energy can be distributed. I know that might not be your background, so I don't expect a detailed answer in terms of physical chemistry. I'm just giving the context for what I mean when I ask about how we calculate entropy for situations where a discrete model doesn't work, like protein folding.
For example, where do we get the "S" in the Gibbs free energy equation? Is that just empirical? Do you, or does anyone, know of ways to calculate entropy using nondiscrete methods? Maybe it doesn't matter, because maybe discrete models are used to understand it, and then for applications it's just measured. I don't know.
I don't know enough about the Gibbs free energy equation to give much insight into it. In Penrose's book he says something like: there are many different ways you could define entropy, but you always come to the same conclusion, that the universe is in an unlikely state and moving towards a likely state. Here's a simple way you could measure entropy for the example I showed with particles in a box: count the number of particles on one side of the box. Their positions are continuous variables, but you can measure how far the system is from equilibrium by counting the number of particles on each side.
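To put toy numbers on that counting idea (assuming 100 particles, a number picked purely for illustration): the number of ways W to have n of the N particles on the left half is the binomial coefficient C(N, n), and a Boltzmann-style entropy is S = k_B ln W, so the 50/50 split corresponds to vastly more arrangements than all-on-one-side.

```python
from math import comb, log

N = 100                 # assumed particle count for the toy example
k_B = 1.380649e-23      # Boltzmann constant, J/K

for n_left in (100, 75, 50):
    W = comb(N, n_left)          # ways to pick which particles sit on the left
    S = k_B * log(W)             # Boltzmann-style entropy of that macrostate
    print(f"{n_left}/{N} on the left: W = {W:.3e}, S = {S:.3e} J/K")
```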
@@ItsJustAstronomical Yeah, I guess the S is probably empirical because it's probably a lot easier to measure than to calculate from some theory.
But what I realize is that I'm not really trying to ask a scientific question. I'm trying to ask a mathematical question. It doesn't really matter how "scientists" define entropy, what matters is (a) what is reality, and (b) what tools we use to understand what is real. So although I'm vaguely interested in (c) the tools that "scientists" use to understand what is real, it's a minor concern to me.
So I realize that you may not be familiar with how physical chemists define or use entropy. I'm more just trying to ask a mathematical question.
So let me try to be more clear: You've used a discrete model, but I'm having trouble thinking, mathematically, about how one would deal with this in a continuous situation. Let's say that you added one more variable to your dice: which side they were facing. So you have 18 dice of each color (and 18 positions), but each die also has six states. So that significantly increases the entropy, or decreases the likelihood that all the blues would be on one side and all showing us "six". This is still a discrete model.
But let's consider it in a continuous model. You have 18 blue dice and 18 red dice in a cube of, say, 8 cubic feet. They're all bouncing randomly around with no gravity, and totally elastic collisions, each free to rotate on three axes.
So it's a similar situation, where you can ask whether they are "ordered", in that all the blue dice are on one side of the cube and you can see the six face of all of them. Of course they will be much more likely to be in one of the many, many states of disorder.
But while I can perfectly see how you considered the discrete case, I have no idea how you could think about the continuous case.
And since the real world is generally continuous like this, I'd really like to understand how one would think about it in such a continuous state.
Thanks so much for your answers.
@@mitchjohnson4714 What was wrong with my previous description? Divide the 8 ft cube in half. Count the number of dice in each half of the cube. How many are at a position z < 4 ft? 18 red dice on one side and 18 blue dice on the other is very unlikely.
Also, I found the Roger Penrose quote I was looking for "There is still something very subjective about this definition of S ... Be that as it may, it is remarkable how little effect the arbitrariness in coarse graining has in the calculations of thermodynamics. It seems that the reason for this is that, in most considerations of interest, one is concerned with absolutely enormous ratios between the sizes of the relevant phase-space box volumes and it makes little difference where the boundaries are drawn, provided that the coarse graining 'reasonably' reflects the intuitive idea of when systems are to be considered to be macroscopically indistinguishable."
@@ItsJustAstronomical I'm not saying I can't understand it using your example. I'm saying I don't know how to calculate the number of different configurations. But writing this out is helping me to think about it.
I realized that whereas in a discrete model you can say, like, a gas particle is at pixel (x,y), in a continuous model you probably have to think of it in terms of a range of values. Like the probability of it being in one exact location is zero, but the probability that it is within a small sub-box can be calculated (or the fraction that will be randomly found within or touching that sub-box can be calculated).
Similarly, with the idea of adding in the orientation of the dice as a factor, the probability they are exactly facing a direction is zero, but the fraction that are randomly within a range of orientations can be calculated.
So one could use a similar method to the discrete method you show, but it would be a little different for continuous phenomena.
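One sketch of how the continuous case can be handled numerically (toy numbers chosen here only for illustration, following the coarse-graining idea in the Penrose quote above): divide the box into cells, count how many particles land in each, and take S = ln W with W = N! / (n_1! n_2! ...). The cell size is an arbitrary choice, but as the quote says, the ratios involved are so enormous that it barely matters.

```python
import numpy as np
from math import lgamma

rng = np.random.default_rng(1)
N, cells_per_side = 1000, 4   # assumed toy numbers: 1000 particles, 4x4 coarse cells

def coarse_entropy(pos):
    """ln W for the coarse-grained occupancy, with W = N! / prod(n_i!)."""
    counts, _, _ = np.histogram2d(pos[:, 0], pos[:, 1],
                                  bins=cells_per_side, range=[[0, 1], [0, 1]])
    n_total = len(pos)
    return lgamma(n_total + 1) - sum(lgamma(n + 1) for n in counts.ravel())

clumped = rng.uniform(0.0, 0.25, size=(N, 2))    # everything inside one corner cell
spread = rng.uniform(0.0, 1.0, size=(N, 2))      # spread through the whole box
print("clumped ln W:", coarse_entropy(clumped))  # 0.0: only one coarse arrangement
print("spread  ln W:", coarse_entropy(spread))   # a few thousand, roughly N*ln(16)
```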
That's why we consider heat to be low-grade energy: because of the 2nd law of thermodynamics, we can't fully convert heat into work.
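A quick worked example of that limit (the standard Carnot bound, with temperatures chosen just for illustration): an ideal engine running between a hot reservoir at T_h and a cold one at T_c can convert at most eta_max = 1 - T_c/T_h of the heat it absorbs into work, so for T_h = 600 K and T_c = 300 K the ceiling is 1 - 300/600 = 50%, and the rest must be rejected as waste heat.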
I have 4 millennial children, my contribution to entropy is immeasurable.
edit: 1 millennial and 3 gen z'ers
It’s about human population more than CO2. The CO2 trapping solar heat got there because of human population. But that’s the one thing no one says.
U must be our professor
Wow...
I may be low entropy but my grades weren't
Any evidence that Time is not an abstract? Any logical reason to try and conflate Entropy with Time? PseudoScience or PseudoPhysics perhaps?
If Time is reversible then Entropy should decrease - so there is a Saviour because Maths (a study of abstracts) has found it but can't prove it. Like religion. How cute.
A lot, actually - I mean, I don't want to be rude, but do you know anything about physics and the history of physics? Space-time is a thing, and that is proven.
@@xBINARYGODx Strong argument. Basically 'leave it to the high priests'; the Arg. from Authority fallacy. You are certainly worthy of that title.
SpaceTime was merely Einstein's science-free 'happy thought' in an effort to surpass Newton. All the 'evidence' for it could fit on one page.
So I'm saying you do know a lot, but it isn't going to help anyone.
BTW everyone else is giggling because you took the bait.
@@peterclark6290 "everyone is giggling" is something that's only happening in your mind. That's called relativity.
@@poetryflynn3712 Black holes, Dark Matter, Wormholes, Multiverses,... are the results of the same fantasy science (actually Maths) that 'finds' or 'discovers' these phenomena. e.g. Did you see the public 'evidence' for time dilation a few years ago? That was a nerd joke. An undergraduate geek prank (trying to be kind here). But they were dead serious. Y'all need some 'scepticism' transplants. You are trapped in a vortex which is a criminal waste of otherwise capable minds.
@@SiriusSphynx Sceptics such as myself are usually ignored by the cognoscenti dude, but add your name - I will get through to a few.
To wit: There is no magic in the Cosmos: just energy (two state, perpetual), infinite space and eternity (linear, stable). Any laws are contained in the electro-mechanical outcome of the interactions between the energy sub-particles; with just a handful of constants describing the limits.
From this comes intelligent sensate life which is owed nothing, inherits zilch but is afflicted with a desire for survival. So wasting energy on infantile projections is not advisable. No musical accompaniment required.