The Startling Reason Entropy & Time Only Go One Way!

  • Published on Nov 21, 2024

Comments • 1.2K

  • @krnathan
    @krnathan 1 year ago +271

    This man deserves an award for explaining complex concepts with so much creativity and simplicity!

    • @CosmicNihilist
      @CosmicNihilist 1 year ago +6

      I agree, wow, what a stunning episode. Btw his lovely voice and tonality help too.

    • @Mr0rris0
      @Mr0rris0 1 year ago

      We send him a bonjovi pasta sauce

    • @hyperduality2838
      @hyperduality2838 1 year ago +2

      Randomness (uncertainty, entropy) is dual to order (certainty, syntropy).
      Certainty is dual to uncertainty -- the Heisenberg certainty/uncertainty principle.
      Signals (patterns, syntropy) are dual to noise (entropy) -- the signal to noise ratio in electronics.
      "Entropy is a measure of randomness" -- Roger Penrose.
      Syntropy is a measure of order.
      Repetition (patterns, order, syntropy) is dual to variation (disorder) -- music.
      Increasing the number of dimensions or states is an entropic process -- co-homology.
      Decreasing the number of dimensions or states is a syntropic process -- homology.
      Increasing is dual to decreasing.
      Homology (syntropy, convergence) is dual to co-homology (entropy, divergence).
      Syntropy (prediction) is dual to increasing entropy -- the 4th law of thermodynamics!
      Teleological physics (syntropy) is dual to non-teleological physics (entropy).
      Potential energy is dual to kinetic energy -- gravitational energy is dual.
      Synergy is dual to energy -- energy is dual.
      Energy is duality, duality is energy.
      The conservation of duality (energy) will be known as the 5th law of thermodynamics!
      Energy is dual to mass -- Einstein.
      Dark energy is dual to dark matter -- the universe is dual.
      "Always two there are" -- Yoda.
      Integration (syntropy, summation) is dual to differentiation (entropy, differences) - abstract algebra.
      The 4th law of thermodynamics is hardwired into mathematics & mathematical thinking.

    • @tonib5899
      @tonib5899 1 year ago

      @@hyperduality2838 Brilliantly put, and citing Roger Penrose really nailed it. Well said.

    • @naughtyat25
      @naughtyat25 1 year ago

      Totally! Never thought time could be a consequence of entropy

  • @stoneysdead689
    @stoneysdead689 1 year ago +21

    This is the first explanation of entropy that explains why the universe started out in such a low entropy state: it was so much smaller than it is now, and so much simpler because the symmetries had yet to be broken, so it just makes sense that there would've been far fewer ways that the system (the universe) could be arranged. Why haven't I heard this before? I guess they thought it was just apparent, but I had never even thought about it at all; it's so simple, yet it explains so much. Entropy itself makes more sense when you think about the energy as being "less useful" even though it's conserved. I am definitely going to check out more of your videos. Bravo man, well done.

    • @glasslinger
      @glasslinger 4 months ago

      This reasoning breaks down when the first femtoseconds of the universe expanding are considered. We don't have any idea at all what was in the "universe" during that situation.
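
      The "far fewer ways to arrange a smaller, simpler system" intuition above can be sketched with a toy count of arrangements. This is my own illustration (the particle and cell counts are arbitrary), not something from the video:

      ```python
      # Sketch (my own illustration): the number of distinct ways to
      # arrange a system explodes with its size, so a much smaller,
      # simpler early universe would have had comparatively few
      # possible arrangements, i.e. low entropy.
      from math import comb

      def microstates(particles: int, cells: int) -> int:
          # Ways to distribute indistinguishable particles among cells
          # ("stars and bars"): C(particles + cells - 1, cells - 1).
          return comb(particles + cells - 1, cells - 1)

      small = microstates(10, 4)     # a tiny "universe": 286 arrangements
      big = microstates(1000, 400)   # same density, 100x the size

      print(small)                   # 286
      print(big > small ** 100)      # True: the count grows explosively
      ```

      The point is only the scaling: taking the logarithm of these counts (Boltzmann's S = k ln W) is what turns the explosive growth into the familiar additive entropy.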

  • @cowgoesmoo2
    @cowgoesmoo2 6 months ago +15

    Jesus christ. I went through high school thinking about and being confused by entropy. Then I did physics in college, and in thermodynamics the professor said that he did not like the confusion and misconceptions that everyone knows exist about entropy, and said he would like to explain it once and for all. I have never been so interested to hear an explanation. He explained it. I immediately became more confused about entropy than ever. This video, by 10:00, has explained the entire concept, everything I ever interacted with up through early college physics, more clearly than I've ever heard it. I actually feel motivated to go and understand more about entropy now

    • @ArvinAsh
      @ArvinAsh  6 months ago +2

      Glad it was helpful. I agree with you. It is one of the most difficult concepts in chemistry. It is much easier to get, imo, if one looks at it in terms of probabilities and energy, as I tried to do here.

  • @Afrobelly
    @Afrobelly 1 year ago +8

    This was amazing. Until now I'd never been able to make sense of the concept of "increasing entropy in a system" which as a science concept seemed hopelessly vague. Explaining it as a function of end state probabilities made the connection for me. Thank you for that. As a side note, I once dropped a penny onto a table where it bounced a bit until finally coming to rest on edge. I couldn't believe my eyes. In amazement, I left that penny sitting there for maybe 20 minutes. But as you explained--improbable, not impossible.

  • @videosbymathew
    @videosbymathew 1 year ago +83

    Finally, a great explanation of entropy. I always intuitively had this answer, but no one else ever seems to have described it correctly in videos before. Thank you.

    • @JohnnyAngel8
      @JohnnyAngel8 1 year ago +3

      I agree. I've tried to wrap my brain around other attempts but the demonstrations and explanations in this video finally gave me a better understanding. Just don't ask me to recite it back to you! LOL!

    • @videosbymathew
      @videosbymathew 1 year ago +1

      @@JohnnyAngel8 Lol, indeed!

    • @jsEMCsquared
      @jsEMCsquared 1 year ago

      It cracks me up that entropy, the theory that things are getting smaller, is actually getting bigger!

    • @elio7610
      @elio7610 1 year ago

      @@jsEMCsquared Since when was entropy about stuff getting smaller?

  • @johandam4992
    @johandam4992 1 year ago +10

    Way back when I graduated, my thesis was about exergy (useful enthalpy, for a short explanation). This video captured that concept. This is quite an achievement.

  • @Razor-pw1xn
    @Razor-pw1xn 1 year ago +45

    Hi Arvin, could you make a video about the new discovery about the double slit experiment but instead of taking place in space it takes place in time? It would be very interesting to discuss what consequences it implies. "A team led by Imperial College London physicists has performed the experiment using 'slits' in time rather than space. They achieved this by firing light through a material that changes its properties in femtoseconds (quadrillionths of a second), only allowing light to pass through at specific times in quick succession"

    • @b1gb017
      @b1gb017 1 year ago +4

      yh I saw an article on that as well, seemed very interesting, would be good to have Arvin’s take on it! 😎

    • @Rami-ll2bq
      @Rami-ll2bq 1 year ago

      ❤🧠❤

  • @starrynightlyrics7559
    @starrynightlyrics7559 1 year ago +2

    I will remember your name ten years from now. You explained that concept about life better than my teachers ever did. THAAANKS A LOT !!!

  • @Laser593
    @Laser593 1 year ago +6

    Now I know why things keep changing in the universe. The culprit is entropy. Everything changes because of the transfer of energy from one thing to another. Thanks, Mr. Arvin Ash.

  • @hisss
    @hisss 1 year ago +7

    I may now, for the first time in my life, finally, have a slight bit of a beginning of an understanding of what entropy means. "Disorder" was always so vague, I never got who or what decided what was ordered and what was disordered. I don't quite speak maths, so things like statistics and probability are over my head, but at least the concept is starting to make sense. My teachers could never achieve that, so thank you. There might still be hope for me.

    • @evgenistarikov3386
      @evgenistarikov3386 1 year ago

      To continue moving along the way you have kindly chosen, please also check my humble comment to this stream

    • @Afrobelly
      @Afrobelly 1 year ago

      Same here, but I think you said it more plainly than I could. Glad I stumbled onto this video lecture.

  • @matkosmat8890
    @matkosmat8890 1 year ago +26

    Time as a statistical phenomenon! Thank you, Arvin, your explanations are always spot-on.

  • @lookmath4582
    @lookmath4582 1 year ago +35

    Your videos have that really philosophical, metaphysical twist in them which makes them special and understandable 👍❤

  • @macsarcule
    @macsarcule 1 year ago +8

    This so clearly answered questions I’ve been puzzling over for years. I have new questions to puzzle over now, and that’s extra awesome! Thank you, Arvin! ✌️🙂

  • @gafyndavies
    @gafyndavies 1 year ago +5

    No matter how bad a day I'm having, when I hear "That's coming up, right now!" a huge smile erupts on my face 😊

  • @bmaverickoz
    @bmaverickoz 1 year ago +9

    I like that you referenced Lee Smolin, who is so invested in probing the status of time as fundamental (well, causation anyway, which is a manifestation of time). I think his work is really valuable in this space, and your popularising of the subject is truly inspired, Arvin :)

  • @AndrewUnruh
    @AndrewUnruh 1 year ago +3

    This was essentially a very good introductory lesson on statistical thermodynamics. I never really understood some thermodynamics concepts until I took a course on statistical thermodynamics as a grad student.

  • @jeffreymartin8448
    @jeffreymartin8448 1 year ago +4

    There is nothing more satisfying than when that light bulb goes on, often seemingly on its own when least expected. You realize: of course! Thank you once again, Arvin!

  • @kanakTheGold
    @kanakTheGold 1 year ago +1

    This approach to understanding entropy and time is in agreement with what we observe in the universe and, at the same time, is the simplest explanation I have seen so far, and hence the best way of describing both entropy and time.

  • @Baka_Komuso
    @Baka_Komuso 1 year ago +73

    Arvin's qualitative explanations of these mathematically complex phenomena are especially valuable to me, having been frustrated by the Principia, and his perfect articulation makes me say "eureka".

    • @sethrenville798
      @sethrenville798 1 year ago

      @@hyperduality2838 Interesting take. I am more of the Stephen Wolfram mind, in that I honestly just think everything in this universe is driven by computation, and the most fundamental building block of our universe is information. These lightning-quick computations drive entropy in an irreversible direction, as the computations collapse the wave functions of the probabilistic futures into actualized specifics, and it actually works directly with the Maxwell's demon explanation of entropy. I also find that his work seems to apply in quite a few interesting ways to other parts of physics, such as gravity, and has another set of really nifty applications of the main physics equations: Einstein's, the Dirac equation, and the Yang-Mills.

    • @hyperduality2838
      @hyperduality2838 1 year ago

      @@sethrenville798 Spin up is dual to spin down, particles are dual to anti-particles -- The Dirac equation.
      From a converging, convex (lens) or syntropic perspective everything looks divergent, concave or entropic -- the 2nd law of thermodynamics!
      All observers have a syntropic perspective according to the 2nd law of thermodynamics.
      My syntropy is your entropy and your syntropy is my entropy -- duality.
      Mind (the internal soul, syntropy) is dual to matter (the external soul, entropy) -- Descartes or Plato's divided line.
      Convex is dual to concave -- lenses, mirrors.
      Your mind is syntropic as it creates or synthesizes reality (non duality).
      Concepts are dual to percepts -- the mind duality of Immanuel Kant.
      Concepts are syntropic representations built from perceptions or measurements -- category theory.
      The Einstein reality criterion:-
      "If, without in any way disturbing a system, we can predict with certainty (i.e., with probability equal to unity)
      the value of a physical quantity, then there exists an element of reality corresponding to that quantity."
      (Einstein, Podolsky, Rosen 1935, p. 777)
      Internet Encyclopedia of Philosophy:-
      www.iep.utm.edu/epr/
      According to Einstein reality is predicted into existence -- a syntropic process!
      "We predict ourselves into existence" -- Anil Seth, neuroscientist, watch at 56 minutes:-
      th-cam.com/video/qXcH26M7PQM/w-d-xo.html
      Making predictions to track targets, goals & objectives is a syntropic process -- teleological.
      Physicists ignore the mind, but all observers have a mind which is syntropic.
      The observed is dual to the observer -- David Bohm.
      Syntropy (Prediction) is dual to increasing entropy -- the 4th law of thermodynamics!
      Main stream physics is currently dominated by teleophobia and eliminative materialism.
      Teleophilia is dual to teleophobia.
      New laws of physics based upon teleology do not go down to well in physics departments -- so you are unlikely to hear about this new law.
      "Philosophy is dead" -- Stephen Hawking.

    • @hyperduality2838
      @hyperduality2838 1 year ago

      @@sethrenville798 All energy is dual -- electro is dual to magnetic -- Maxwell's equations.
      Null vectors or light rays are perpendicular or dual to themselves from our perspective -- self duality.
      The inner product is dual to the cross product -- Maxwell's equations.
      Positive is dual to negative -- electric charge.
      North poles are dual to south poles -- magnetic fields.
      Everything in physics is made out of energy (duality).
      The word "information" means something is being formed or created, synthesized -- a syntropic process!

  • @DrZedDrZedDrZed
    @DrZedDrZedDrZed 1 year ago +2

    I'm really glad Arvin touched on the expansion of the universe in this video; it's inextricably linked to the concept of entropy in a counterintuitive way that most science communicators don't talk about. It's also important to note that, if we ONLY consider the contents of the universe at the Big Bang, without considering time, then the universe was extremely homogeneous, disordered, and random; there was no "difference" to be had to put it in a low entropy state, unless you consider the creation of space itself as giving that hot plasma more and more places to go, and doing so extremely quickly. This thinking leads to the best interpretation of entropy I've read to date. It's less about order/disorder, or even the arrow of time. Entropy is the LOOSENING OF CONSTRAINTS. Hat tip to Terrence Deacon for that one. Anyone interested in this topic MUST read Incomplete Nature.

  • @timjohnson979
    @timjohnson979 1 year ago +24

    "Time being a statistical phenomenon" is an interesting thought, but is that reality? If it is, then what does that mean in the context of spacetime?
    Love your videos, Arvin. They are clear and thought provoking.

    • @macysondheim
      @macysondheim 1 year ago

      All of this fancy science jargon is total nonsense. At the end of the day none of this bogus can be proven in a lab. It’s just blind faith.

    • @sunny_senpai
      @sunny_senpai 1 year ago

      would like to know this as well

  • @NoActuallyGo-KCUF-Yourself
    @NoActuallyGo-KCUF-Yourself 1 year ago +13

    Great timing on this one! I watched it right before one of my chemistry students asked about thermodynamics. I definitely passed this on as a study aid.

    • @chriskennedy2846
      @chriskennedy2846 1 year ago

      I recommend the book: Introduction to Molecular Thermodynamics by Robert M Hanson and Susan Green. Easy to read and covers a lot of ground.

    • @NoActuallyGo-KCUF-Yourself
      @NoActuallyGo-KCUF-Yourself 1 year ago

      @@hyperduality2838
      Will you please go away with this nonsense and stop spamming every comment with your crackpot ideas?

  • @georgerevell5643
    @georgerevell5643 1 year ago +10

    I love it when you go beyond the generally accepted answer and give your well thought through opinion. Your so brilliant Arvin!

    • @ArvinAsh
      @ArvinAsh  1 year ago +4

      I appreciate that, but it's not just my opinion. This is the generally accepted way, for now at least, that we understand why these things happen.

    • @evgenistarikov3386
      @evgenistarikov3386 1 year ago

      @@ArvinAsh Absolutely so, guys! My humble comment to this stream ought to shed light upon other plausible explanations as for WHY THINGS HAPPEN AT ALL.

    • @agranero6
      @agranero6 3 months ago

      "Accepted opinion"? This explanation was exactly what I was taught in college. I still have Callen's book, which was used in that Thermodynamics course, on my bookshelves. But for the Statistical Mechanics course we used no book at all; we usually took one we liked or one suggested at the library, plus printed notes by the professor. I think I took Sommerfeld.

    • @AstroTibs
      @AstroTibs 1 month ago

      *You're

  • @ToddRickey
    @ToddRickey 1 year ago +1

    Brilliance, at a conversational level, with amazing truths!

  • @themcchuck8400
    @themcchuck8400 1 year ago +4

    Completely wonderful video explaining the real meaning of entropy.
    It goes off the rails right at the end (and in the title) when he gets philosophical about time, getting things exactly backwards.

    • @mochiebellina8190
      @mochiebellina8190 1 year ago

      What?

  • @thiagocastrodias2
    @thiagocastrodias2 1 year ago +1

    I remember listening once to an explanation where you put a number of marbles inside a box (all grouped according to their colors) and shake it. Every time, you will get a more homogeneous configuration of marbles. This is due mostly to simple probability, since there is a huge number of possible configurations, but only a few where the marbles are organized according to their colors. Now apply this to the universe and its enormous number of atoms. It's also interesting because things like living organisms and human intention can break this trend and make stuff more organized. Natural selection and human intent and creativity don't operate at the level of pure chance, so naturally improbable states become more likely.
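
    The marble experiment described above is easy to simulate directly. A minimal sketch (the box contents, 5 red and 5 blue marbles, are my own assumption):

    ```python
    # Shake a box of marbles many times and count how often a random
    # arrangement comes out fully grouped by color. Of the C(10, 5) = 252
    # equally likely color patterns, only 2 are fully grouped, so the
    # "ordered" outcome is rare -- improbable, not impossible.
    import random

    GROUPED = (list("RRRRRBBBBB"), list("BBBBBRRRRR"))
    box = list("RRRRRBBBBB")        # start sorted by color

    random.seed(42)                 # reproducible demo
    trials, hits = 200_000, 0
    for _ in range(trials):
        random.shuffle(box)         # one "shake" = one random arrangement
        if box in GROUPED:          # fully sorted by color again?
            hits += 1

    print(round(hits / trials, 3))  # typically near 2/252 ≈ 0.008
    ```

    With more marbles (or the universe's atoms) the fraction of "grouped" arrangements shrinks astronomically, which is the whole statistical content of the second law.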

  • @pwnedd11
    @pwnedd11 1 year ago +3

    Wow, this video made so many things that I am currently looking into clear to me. And it has helped me know what I need to study next. Sorry for being so vague, but again... this is so helpful! Thank you so very much!!!

  • @jeronimoolivavelez1299
    @jeronimoolivavelez1299 1 year ago +1

    Great explanation as always.
    So in other words, when we are sleeping we are "at rest", a low-energy state. The "at rest" energy is transformed, spread, and transported to the bed, increasing the energy of the bed. The bed is working well and is at a high-energy state.

  • @duukvanleeuwen2293
    @duukvanleeuwen2293 1 year ago +43

    I like the way entropy and time are related to probability; never really thought about it that way 🤔

    • @ROHITKINGC
      @ROHITKINGC 1 year ago +2

      Boltzmann understood it 120 years ago, and then committed suicide.

    • @user-nu8in3ey8c
      @user-nu8in3ey8c 1 year ago

      @@ROHITKINGC Of course it is related to probability: if you wait long enough, all the particles will tunnel back to one spot and the big bang will happen again. Or, even more likely according to Boltzmann, particles would tunnel back to one spot to make a small isolated Boltzmann brain. So entropy can reverse if one looks at the math; you just have to wait long enough.

    • @JamesBrown-fd1nv
      @JamesBrown-fd1nv 1 year ago

      ​@@user-nu8in3ey8c there never was a big bang.

    • @user-nu8in3ey8c
      @user-nu8in3ey8c 1 year ago

      @@JamesBrown-fd1nv If there was not a big bang, then what was it? How did we get here?

  • @AlexthunderGnum
    @AlexthunderGnum 1 year ago +1

    Great video. Thank you! One remark from me: energy being more or less useful, or able to do work, is a very anthropocentric definition. It implies "useful to me" or "able to work for me".

    • @ArvinAsh
      @ArvinAsh  1 year ago +1

      It is the ability to be transformed into other forms of energy, particularly kinetic energy. Gravitational potential energy can be transformed into other forms.

    • @AlexthunderGnum
      @AlexthunderGnum 1 year ago

      @@ArvinAsh Energy in any form can be transformed to another form, given circumstances, no?

  • @superipermagererata5084
    @superipermagererata5084 1 year ago +25

    I really love your videos and the way you explain these complex things; you make them sound easy! Thanks to you I've learned so many things, my interest in science and physics has grown a lot, and I've discovered my love for this subject :>

  • @daniloonuk
    @daniloonuk 1 year ago +1

    Great one. I heard somewhere that entropy is hidden information; this explanation told me that increasing entropy means a lot of giving up.

  • @Iuliuss_
    @Iuliuss_ 1 year ago +9

    My chemistry professor, who only used to read his slides, underlining and circling basically everything, left me really dubious about entropy. I was like, "yeah, ok, the logarithm; the disorder thing makes way more sense in some way". It's incredible how right now the logarithm explains things so well and makes such wonderful sense. Had I seen things like this before, maybe I'd have enjoyed chemistry or other entropy-related courses more. Amazing video!!

    • @chrisrace744
      @chrisrace744 ปีที่แล้ว

      You were in the wrong class. Take physics.

    • @Iuliuss_
      @Iuliuss_ ปีที่แล้ว

      @@chrisrace744 I know and I am in physics, it was a mandatory course for physics tho lol

    • @NoActuallyGo-KCUF-Yourself
      @NoActuallyGo-KCUF-Yourself ปีที่แล้ว

      ​@@hyperduality2838
      < reported for spam >

  • @milanocomprendo7318
    @milanocomprendo7318 ปีที่แล้ว +1

    Thanks for the lovely and simple explanation. I can use this to explain to those I discuss this with.

  • @djfaber
    @djfaber ปีที่แล้ว +8

    It's interesting that you mention time being a statistical exercise. In software engineering, where you control the universe (to a certain extent), the timekeeping mechanisms of monotonic time and real time differ: on a system that is too busy, it's entirely possible (and highly probable) that wall-clock time can stick or go backwards, and the monotonic timer was invented to address this.
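    The distinction the comment describes is visible directly in Python's standard library: time.time() reads the adjustable wall clock, while time.monotonic() is guaranteed by the OS never to run backwards. A small sketch (the 10 ms sleep is just a stand-in for real work):

```python
import time

# Wall-clock time: subject to NTP adjustments and manual changes, so two
# successive readings can, in principle, go "backwards".
wall_start = time.time()

# Monotonic time: guaranteed never to decrease, which makes it the right
# clock for timeouts and elapsed-time measurement.
mono_start = time.monotonic()

time.sleep(0.01)  # stand-in for real work

wall_elapsed = time.time() - wall_start        # usually fine, but not guaranteed
elapsed = time.monotonic() - mono_start        # always >= 0, by contract
print(f"wall: {wall_elapsed:.4f} s, monotonic: {elapsed:.4f} s")
```

This is why duration measurements and timeouts should use the monotonic clock, and time.time() should be reserved for timestamps that must correspond to calendar time.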

    • @jack.d7873
      @jack.d7873 ปีที่แล้ว +1

      It's always fascinating to notice similarities between our computerised machinery and the machinery of the Universe. And the way our Universe works, block time of quantum field excitations, is suspiciously similar to the way a super advanced computer system would operate.

  • @jyotismoykalita
    @jyotismoykalita ปีที่แล้ว +2

    This has to be the best explanation on entropy ever.

  • @davidchung1697
    @davidchung1697 ปีที่แล้ว +3

    There is a simpler explanation of entropy. Consider 3 particles in a box. If you put 2 particles abutting one another at the XYZ coordinate (0, 0, 0) (motionless) and then smash them with the third particle, the 3 particles will begin to bounce around in the box. For the 3 particles to return to their initial state, all 3 would have to retrace their motions, which is highly improbable. This irreversibility is exactly why entropy does not decrease.
    More generally, in a multi-particle closed system, for the particles to return to their prior state all of them would have to move in a particular, concerted fashion, which is an extremely low-probability event.
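    The combinatorics behind that argument can be sketched numerically. Coarse-graining space into cells (an illustrative simplification, not the commenter's exact setup), the chance of a random arrangement hitting one specific configuration is 1/(cells^particles), which collapses fast as the particle count grows:

```python
import random

def return_probability(n_particles: int, n_cells: int) -> float:
    """Chance that n distinguishable particles, each equally likely to occupy
    any of n_cells coarse-grained cells, land in one specific configuration."""
    return 1.0 / (n_cells ** n_particles)

# Even modest particle counts make a spontaneous return wildly improbable.
print(return_probability(3, 100))   # 1e-06
print(return_probability(30, 100))  # 1e-60

# Monte Carlo sanity check on a tiny case: 2 particles, 10 cells, so the
# target configuration should turn up about 1% of the time.
random.seed(0)
n_cells, trials = 10, 100_000
target = (0, 0)
hits = sum(
    1 for _ in range(trials)
    if (random.randrange(n_cells), random.randrange(n_cells)) == target
)
print(hits / trials)  # close to 0.01
```

Real dynamics are deterministic rather than random draws, but the counting is the same: one target microstate against an exponentially large state space.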

    • @davidchung1697
      @davidchung1697 ปีที่แล้ว

      Just to add a bit more - what the above thought experiment tells you is that entropy is a consequence of having particles in space-time; it becomes obvious that time is NOT a consequence of the second law of thermodynamics.

  • @paulhofmann3798
    @paulhofmann3798 ปีที่แล้ว +1

    Nice explanation of entropy and maybe intuitive for some people, connecting entropy to probability of states as Boltzmann has done first. In this paradigm though we cannot explain the Saturn rings for example. But one can do better. One can go beyond probability of states to measure theory to quantify chaos. Kolmogorov has used measure theory to define Kolmogorov complexity which has Boltzmann entropy as a special case in its belly. Kolmogorov complexity and the KAM theorem are able to explain why there are unstable chaotic orbits of the Saturn rings between the stable orbits. Further, classically there is no explanation for the second law of thermodynamics. It’s just an unexplainable law. Roger Penrose claims the fact that entropy grows in the universe stems from a symmetry breaking at the Big Bang. Other universes may thus not have a second law of thermodynamics. By the way, there are ways where two liquids unmix, eg the Belousov Zhabotinsky reaction. I loved the look on the face of the students when I showed it during my lectures. There is more to physics than equilibrium Boltzmann type thermodynamics. Think of your pencil, or the physical pendulum. They show bifurcation and chaotic behavior.

  • @aaronaragon7838
    @aaronaragon7838 ปีที่แล้ว +9

    This man really explains the nuts and bolts of physics in a clear fashion.🎉🎊

    • @stevegalaxidas458
      @stevegalaxidas458 ปีที่แล้ว

      Yes. Great explanation, but I now have a nagging suspicion that we drilled down from one fundamental principle, such as time or spacetime, to the principle of probabilities. There should be a video on the concept of physical laws and what we really mean by that.

  • @juannarvaez5476
    @juannarvaez5476 ปีที่แล้ว +1

    Reading one of Asimov's short stories about the question of entropy.
    I always loved the idea that after several cosmological decades, after protons decayed, after the universe has grown cold and dark, after an incalculable, near-infinite amount of time, an event occurs so improbable as to have a probability of 0.0 followed by a nearly endless run of zeros before a 1 finally appears. At that point entropy resets to what it was at the beginning of the big bang, "Let there be light" happens again, and another big bang or something similar occurs.

    • @claycooper9955
      @claycooper9955 หลายเดือนก่อน

      You should write a science fiction novel.

    • @claycooper9955
      @claycooper9955 หลายเดือนก่อน

      ..novel or short story.

  • @CeeJay591
    @CeeJay591 ปีที่แล้ว +4

    Arvin, I’ve watched many of your videos but this one was really amazing - thanks so much

  • @johnvosarogo1785
    @johnvosarogo1785 ปีที่แล้ว +1

    Some thoughts I've had regarding this topic: I think the best or most useful conceptualization of time is as a measurement of relative change. Any time "time" is referenced, change must also be referenced. So change is fundamental to time, and given a static closed system there can be no internal clock.
    An implication of this seems to be that once the universe evolves to its final state of absolute heat death, where any remaining undecayed particles are so isolated that they can never interact with each other, there is for all intents and purposes no longer any relative change in the universe, and all that's left, besides individual isolated particles, is the quantum fluctuations of empty space. When the universe arrives at that state, time has essentially stopped, since there are no longer any events by which to measure it. From that point on, anything that does happen could be said to happen instantly afterward.
    So, imagining an external clock: even if a near-infinite amount of time elapsed on that external clock before the quantum foam produced a fluctuation that evolved into the next iteration of a material universe, from the internal perspective, which lacks any concept of time following the heat death of the previous iteration, the emergence of the next could be said to have happened instantly after the end of time of the previous one. It therefore becomes easy to imagine that the near-infinite amount of "time" required for the quantum foam to produce such a fluctuation is actually no time at all. The ultimate implication is that the emergence of a universe out of the quantum fluctuations of empty space is not only probabilistically likely, it's inevitable, since whatever time is required is available, and is relatively instantaneous from the perspective of a changeless universe. All of this, however, assumes eternally expanding space filled with quantum foam.
    How that originated, and why there's even that as opposed to true nothingness, probably has to do with probability and the uncertainty principle in a way that's difficult for me as a layperson to conceptualize or explain, but I feel like I understand it intuitively. True nothingness is a form of absolute certainty, and the uncertainty principle is such that a state of absolute true nothingness must instantly and randomly evolve into a state with non-zero energy. Or something like that. @ArvinAsh I'd love to know if you, or anyone else reading this, thinks any of it makes sense lol 🙏🏽🙌🏽🌌

  • @LoanwordEggcorn
    @LoanwordEggcorn ปีที่แล้ว +3

    Not sure if you've already done one (or several), but it would be interesting to hear how entropy relates to quantum mechanics. Quantum mechanics is also about probabilities of different states, for example Feynman diagrams.

  • @SergeyNeskhodovskiy
    @SergeyNeskhodovskiy 6 หลายเดือนก่อน

    I also go from "high ability to do work", to "low ability to do work" exactly when my working day starts - now I know why! Thank you!

  • @0-by-1_Publishing_LLC
    @0-by-1_Publishing_LLC ปีที่แล้ว +7

    *TIME* is a measurement of change and nothing more. *Example:* Let's say the only thing in existence was an unlit lightbulb. Without any observable change happening, it logically remains in a *timeless state.* The lightbulb would appear exactly the same, so no measurement of any time passing can take place. However, once the lightbulb turns on, then that represents a "change" and also the beginning of time.
    ... But even that is not enough to represent a measurement called "TIME."
    You would need a minimum of one other instance of change to assess how much time has passed. So, if the lightbulb turns itself back off, then the amount of TIME between the two instances of change can be measured.

    • @CarlosElio82
      @CarlosElio82 ปีที่แล้ว +1

      Hence, you need two different events, on and off. Who is the agent that flips the switch? In math, Peano uses 0 and 1 to create numbers; similarly, light needs electricity and magnetism to exist and propagate. In all cases, an agent performing the transformation is needed. In Peano it is the "successor" operation, and in light we have fields. Leibniz tried hard to find monads but found them not.

    • @ArvinAsh
      @ArvinAsh  ปีที่แล้ว +4

      It is possible. The main argument against this would be that in General Relativity one can define a perfectly good spacetime without the need for any mass or object at all. So time would still exist in a mass-less or matter-less universe.

    • @0-by-1_Publishing_LLC
      @0-by-1_Publishing_LLC ปีที่แล้ว +1

      @@ArvinAsh *"The main argument against this would be that in General Relativity one can define a perfectly good spacetime without the need for any mass or object at all. So time would still exist in mass-less or matter-less universe."*
      ... I understand your reasoning, but I argue that time is not dependent on any type of structure, and that any "theoretical" state where a change is taking place can be chronicled via time. If time is indeed a "measurement of change," then it can be applied to all aspects of existence (even nonphysical abstractions).
      *Example:* The numbers "0" and "1" have no material structure, but should a 0 change to a number 1, then this represents a change. Should this number 1 change back to 0, then the duration of this mathematical event can be measured via time.
      ... Excellent video, as always!

    • @blackshard641
      @blackshard641 ปีที่แล้ว +1

      @@ArvinAsh I go back and forth on this. Logically, I just don't see how it's possible to define time independently of a change between two or more distinct states, just like I don't see how it's possible to define space except as a relationship between two or more distinct points. A dimension without a metric is undefined at best, meaningless at worst, right? But it's also hard to deny that GR and QFT, two of our most successful models of the universe, both seem to imply that space itself has a kind of independent existence of its own. 20th century science sure seemed to nod slyly in the direction of block spacetime.

    • @0-by-1_Publishing_LLC
      @0-by-1_Publishing_LLC ปีที่แล้ว

      @@blackshard641 *"Logically, I just don't see how it's possible to define time independently of a change between two or more distinct states, just like I don't see how it's possible to define space except as a relationship between two or more distinct points."*
      ... I'm in the same camp. Modern science seems to want to attach as many properties as possible to easily explained phenomena. Push that mindset far enough and we end up with a new religion!

  • @MrKelaher
    @MrKelaher ปีที่แล้ว +1

    My take: entropy is all about "summary states", states you cannot, or choose not to, distinguish at some degree of "resolution". ALL observations happen at a distance and/or with fundamental limits on precision, so there are fundamentally more states that look the same, i.e. whose fields have the same influence on an observer frame at a distance, than ones that look unique as a remote frame evolves kinematically.

  • @SumitPrasaduniverse
    @SumitPrasaduniverse ปีที่แล้ว +17

    Very excited to see Arvin's explanation on time & entropy 😃

  • @Terrapin47-s8y
    @Terrapin47-s8y 5 หลายเดือนก่อน +2

    I'd say the universe is rather getting more orderly. Just because this order is sometimes not useful to humans, we call it disorderly, not realizing that the thing we're trying to do is the disorderly thing. The universe comes before the human. Entropy is order.

  • @glaucosaraiva363
    @glaucosaraiva363 ปีที่แล้ว +6

    The second law states that the entropy of a closed system tends to increase with time, implying a preferential direction for time. This tendency of increasing entropy is associated with the irreversibility of many physical processes, which may partially explain why time seems to flow in a specific direction

    • @drbuckley1
      @drbuckley1 ปีที่แล้ว

      The entropy of an open system may be slowed but not reversed.

    • @glaucosaraiva363
      @glaucosaraiva363 ปีที่แล้ว +3

      @@drbuckley1 local order increases at the expense of greater disorder in the surroundings

    • @luudest
      @luudest ปีที่แล้ว +1

      What has always confused me about the term 'arrow of time': while entropy appears to be very random at the microscopic level, time seems to be very 'precise' and not random. So what makes time so 'precise'?

    • @drbuckley1
      @drbuckley1 ปีที่แล้ว +1

      @@glaucosaraiva363 That's what I intended to say, if by "local order" you mean an "open system," and by "surroundings" you mean a closed system. A "local order" may "slow" the rate of entropy but it cannot avoid increasing entropy, much less reverse it.

    • @drbuckley1
      @drbuckley1 ปีที่แล้ว

      @@luudest "Time" is observer dependent. Your worldline is different from mine. "Now" is an illusion because humans cannot perceive the miniscule differences in their respective worldlines. These differences are imperceptible but not immeasurable.

  • @shethtejas104
    @shethtejas104 ปีที่แล้ว +1

    Many thanks Arvin 👍👍 Kudos for thinking up the pencil and table example. Its simplicity allows viewers to get the not-so-simple concept of entropy. If the expansion of the universe can be explained by its natural tendency to find more ways to distribute energy, why are we looking for the mysterious dark energy? And also, why does this tendency not act at smaller scales? For example, our planet could expand while conserving energy and therefore exist in a higher entropy state. Finally, on a related note, it would be interesting to think about gravity as an enemy of entropy. At interstellar scales, it doesn't allow the expansion to occur. But at intergalactic scales, somehow entropy is taking over (the tendency to expand and find more ways to distribute energy).
    You did a damn good job as always. You got me thinking, wondering and asking questions!

  • @davidgracely7122
    @davidgracely7122 ปีที่แล้ว +3

    Very well presented. It was very important that this video pointed out that the words "entropy" and "disorder" are interchangeable in meaning.
    The discovery that the disorganization of a system is directly related to the probability of such a system coming into existence by random processes was a great breakthrough. And it gives the formula for how to calculate this probability. It not only shows what chemical and physical reactions will likely take place spontaneously under normal conditions, but also shows us that the overall entropy or disorder in the universe is increasing with time.
    However, Boltzmann's formula is a description of what is observed. The idea that probability is the causative factor in the second law of thermodynamics is a philosophical position. An equation describes what is observed and what to expect under certain conditions. It does not tell us why such a law in our natural world exists, nor why it follows the mathematical pattern that it does, nor by what means such a law came into existence.
    "The heavens shall wax old as doth a garment". This was declared in Scripture long ago. This increase in disorder, as a layman like myself understands it, is an aging process, and our common everyday observations of the world around us shows that it is true for both living and non-living systems.
    If you were to start a plumb bob swinging back and forth, you would notice that each swing was becoming less and less. Even if you were to put the pendulum underneath an evacuated bell jar, the dying away of the motion, while taking place at a slower rate, would still be observable. In like manner, if you took an original photo and made multiple copies of it, each copy being made from the preceding copy, it would be noticed that the copies would look worse and worse the further away you got from the original.
    The second law of thermodynamics shows us that nothing in our universe works with 100% efficiency. Energy is expelled from a process in the form of heat, which my physics textbook calls "disordered energy". While the total amount of energy is still the same, it has been spread out into other less usable forms just as explained in this video. That is why it is impossible to create a perpetual motion machine.
    What are all living things physically speaking? They are genetic copies of copies of copies going back further and further in time. The second law of thermodynamics makes a prediction, namely that the genetic copying process is not going to take place with 100% efficiency and that this inefficiency of genetic transfer from generation to generation has to show up somehow. And it does. As detrimental mutations. As to any supposedly beneficial change, it is necessary to be able to rule out the possibility that this is a reverting back to a previously undamaged condition, especially since there is a backup to the genetic code which is incorporated into the cell, with a computer-like checksum process kicking in to minimize the harmful effects of damaged genes being transmitted to the next generation of living things. This shows that the gene pools of all living things in our universe are aging.
    The theory of evolution should have been discarded when the mathematical understanding of the second law of thermodynamics was discovered. The fact that it hasn't been is a testament to the fact the scientific world is not as objective as advertised.

  • @philippefossier7178
    @philippefossier7178 ปีที่แล้ว +2

    Excellent presentation as usual: clear and well explained by someone who is obviously excited about the material. Thank you Arvin. I love your videos about physics and chemistry.

  • @dougieh9676
    @dougieh9676 ปีที่แล้ว +12

    Entropy is my favorite physics subject. The fact it’s linked to the arrow of time is profound and disturbing.

  • @HeavyBrocks
    @HeavyBrocks ปีที่แล้ว +1

    The simple way I see time is that it is movement. When particles move, it creates time. If particles didn't move, time would no longer exist. This explains why it is impossible for time to go in reverse: there is no way to go slower than stopped.

  • @kirksneckchop7873
    @kirksneckchop7873 ปีที่แล้ว +7

    Your team did a great job! These concepts are quite complicated and nuanced. While some of the ideas are elementary (i.e., the concept of a state space), others you might only see as a graduate student (e.g., fluctuation theorems in statistical mechanics).

  • @brenlee9325
    @brenlee9325 ปีที่แล้ว +2

    Thank you so much for making these concepts which always seem so difficult and mysterious, understandable.

  • @mr.cosmos5199
    @mr.cosmos5199 ปีที่แล้ว +3

    You just mean there are more ways to be disorganized than to be organized, right?
    Therefore it’s more probable to be disorganized.
    Great video ❤

    • @thedeemon
      @thedeemon ปีที่แล้ว +1

      The thing is: such macro states are only called "disorganized" because there are more microstates in them, i.e. we know less about which state the system is in exactly if we know it's in this macro state. There are more ways to be disorganized because states where there are more ways to be are called disorganized.

    • @UnitSe7en
      @UnitSe7en ปีที่แล้ว

      Makes me feel better about the state of my room.

    • @UnitSe7en
      @UnitSe7en ปีที่แล้ว

      @@hyperduality2838 IS a list of descriptive definitions the only thing you have to say? It's totally non-relevant in every comment you've posted it in. (And there's been a lot)

    • @hyperduality2838
      @hyperduality2838 ปีที่แล้ว

      @@UnitSe7en Male is dual to female synthesizes children or offspring -- the Hegelian dialectic.
      The double helix should actually be called the dual helix as the code of life is dual -- two strands.
      Hydrophilic is dual to hydrophobic -- hydrogen bonding in DNA.
      A is dual to T.
      C is dual to G -- base pairs.
      The Penrose tribar:-
      th-cam.com/video/l1PCE47jwng/w-d-xo.html
      The Penrose tribar is only consistent from two antipodal points or perspectives -- antipodal points identify for the rotation group SO(3).
      Gluons are force carriers as they attract and repel quarks:- attraction is dual to repulsion -- forces are dual.
      Action is dual to reaction -- Sir Isaac Newton or the duality of force.
      Push is dual to pull, stretch is dual to squeeze -- forces are dual.
      Protons (positive) are dual to electrons (negative) synthesize photons (neutral, neutrons) or pure energy.
      Duality (thesis, anti-thesis) synthesizes reality or non duality -- the time independent Hegelian dialectic.
      Stability is dual to instability -- optimized control theory.
      Space is dual to time -- Einstein, space is actually 4 dimensional.
      Waves are dual to particles -- quantum duality.
      The rule of two -- Darth Bane, Sith Lord.
      Subgroups are dual to subfields -- the Galois correspondence.
      Addition is dual to subtraction (additive inverses) -- abstract algebra.
      Multiplication is dual to division (multiplicative inverses) -- abstract algebra.
      The Penrose tribar is an impossible construction but consistent from only a dual (antipodal) perspective.
      Gravitation is equivalent or dual (isomorphic) to acceleration -- Einstein's happiest thought, the principle of equivalence, duality.
      Once you take duality seriously you can create new laws of physics, syntropy is dual to entropy -- the 4th law of thermodynamics!
      You are built from DNA (duality).
      Energy is duality, duality is energy -- waves, particles.

  • @tonib5899
    @tonib5899 ปีที่แล้ว +1

    There is sometimes only one way for order to exist, while there are millions of ways of disorder. The logic and maths can be simple if you allow them to be.

  • @orbitalengineer2800
    @orbitalengineer2800 ปีที่แล้ว +4

    Does that mean that at any given moment, there is a non-zero probability that the universe will wind up back in its original state of singularity?
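    A back-of-the-envelope version of that question: the probability that all N gas molecules spontaneously fluctuate into, say, the left half of a box is (1/2)^N, which is nonzero but vanishes absurdly fast with N. A quick Python check (the molecule counts are illustrative):

```python
import math

def all_left_probability(n_molecules: int) -> float:
    """Probability that n independent molecules all sit in the left half of a box."""
    return 0.5 ** n_molecules

for n in (10, 100, 1000):
    print(n, all_left_probability(n))  # ~1e-3, ~8e-31, ~9e-302

# For a macroscopic sample (~10^23 molecules) the float underflows, so work
# with the base-10 logarithm of the probability instead.
log10_p = -(10 ** 23) * math.log10(2)
print(f"p ~ 10^({log10_p:.3g})")  # roughly 10^(-3e22): nonzero, but effectively never
```

So the answer the statistics gives is "yes in principle, never in practice": the probability is strictly positive but unimaginably smaller than anything realizable within the lifetime of the universe.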

    • @ВладіславЗаморський
      @ВладіславЗаморський ปีที่แล้ว

      You are right; it's pretty much the same way one divides by 0 for a derivative. It's not the number 0, just a number very close to it, and that doesn't really affect anything for the problem being solved. For example, people often say "instantaneous velocity," but it doesn't really make sense on its own: a car in a photograph doesn't move (just 1 frame, no time here), so for velocity we need an arbitrarily small nudge along the time axis. Thus, "instantaneous" here means a time window so small that it could have been an instant for our problem. And when we can't choose this arbitrarily small nudge anymore, we get quantum mechanics. In the same way, "improbable" here means the event just won't happen during the lifetime of the universe (but the probability isn't zero).

  • @binbots
    @binbots ปีที่แล้ว +2

    The arrow of time points forward in time because of the wave function collapse. Because causality has a speed limit (c) every point in space that you observe it from will appear to be the closest to the present moment. When we look out into the universe we see the past which is made of particles (GR). When we try to look at smaller and smaller sizes and distances, we are actually looking closer and closer to the present moment (QM). The wave property of particles appears when we start looking into the future of that particle. It is a probability wave because the future is probabilistic. Wave function collapse happens when we bring a particle into the present/past. GR is making measurements in the predictable past. QM is trying to make measurements of the probabilistic future.

    • @blackshard641
      @blackshard641 ปีที่แล้ว +1

      I tend to lean toward a similar explanation, myself. Wave function collapse / decoherence seems to amount to some kind of dispersion of information into the environment whenever any previously independent particles interact. I picture this collapse as an exchange of information, sent and received in every temporal direction along the particle's worldline. Think of it like nature "error checking" to make sure it's behaving consistently and coherently. Since we are macroscopic objects, caught up in the flow of time toward the future, the past looks fixed to us while the future appears indeterminate, but in a sense, the future and past are both fixed AND contingent. Contingent in the sense that everything depends on everything else. Fixed in the superdeterministic sense that a spacetime-wide web of contingencies wouldn't leave much room to escape determinism.

    • @wheelswheels9199
      @wheelswheels9199 ปีที่แล้ว

      There is no evidence that the wave function is even a real thing, let alone that a collapse actually happens. It's a model that gives us some results that agree with observations.

    • @hyperduality2838
      @hyperduality2838 ปีที่แล้ว

      Randomness (uncertainty, entropy) is dual to order (certainty, syntropy).
      Certainty is dual to uncertainty -- the Heisenberg certainty/uncertainty principle.
      Signals (patterns, syntropy) are dual to noise (entropy) -- the signal to noise ratio in electronics.
      "Entropy is a measure of randomness" -- Roger Penrose.
      Syntropy is a measure of order.
      Repetition (patterns, order, syntropy) is dual to variation (disorder) -- music.
      Increasing the number of dimensions or states is an entropic process -- co-homology.
      Decreasing the number of dimensions or states is a syntropic process -- homology.
      Increasing is dual to decreasing.
      Homology (syntropy, convergence) is dual to co-homology (entropy, divergence).
      Syntropy (prediction) is dual to increasing entropy -- the 4th law of thermodynamics!
      Teleological physics (syntropy) is dual to non-teleological physics (entropy).
      Potential energy is dual to kinetic energy -- gravitational energy is dual.
      Synergy is dual to energy -- energy is dual.
      Energy is duality, duality is energy.
      The conservation of duality (energy) will be known as the 5th law of thermodynamics!
      Energy is dual to mass -- Einstein.
      Dark energy is dual to dark matter -- the universe is dual.
      "Always two there are" -- Yoda.
      Integration (syntropy, summation) is dual to differentiation (entropy, differences) - abstract algebra.
      The 4th law of thermodynamics is hardwired into mathematics & mathematical thinking.

  • @mysterio7807
    @mysterio7807 ปีที่แล้ว +3

    You know, just commenting from an old video of yours regarding the Eye Of Agamotto in the MCU. Since entropy only increases in time, it seemed to me watching Doctor Strange last night that the Eye Of Agamotto reverses entropy. How does it do that? No idea. But the MCU used to be very clever with its science fiction until it became all over the place in Endgame.

    • @ArvinAsh
      @ArvinAsh  ปีที่แล้ว +3

      Yes, that's why I don't really enjoy those marvel movies so much. For a nerd like me, I just see all kinds of scientific flaws instead of enjoying the action! lol. Oh the agony of being a nerd!

    • @hyperduality2838
      @hyperduality2838 ปีที่แล้ว

      @@ArvinAsh Action is dual to reaction -- Sir Isaac Newton or the duality of force.
      Attraction is dual to repulsion, push is dual to pull, stretch is dual to squeeze -- forces are dual.
      Positive is dual to negative -- the electromagnetic force field is dual.

  • @Yasmin-pi5pr
    @Yasmin-pi5pr ปีที่แล้ว +2

    So, if this "force" (or 2nd law of thermodynamics) is the tendency to increase entropy, transforming energy into one that has less potential to do work, what is the opposite?
    What's the force that concentrates energy and increases its potential to do work?
    I once read that life is the opposite of entropy. The symmetry of there being 2 opposing forces makes some scientific sense.
    Thank you very much, your explanations are simply perfect.

  • @s1gne
    @s1gne ปีที่แล้ว +4

    I love entropy, it's a great excuse for lots of things.

  • @LuigiSimoncini
    @LuigiSimoncini ปีที่แล้ว +1

    The statistics explanation (# of configurations before and after the energy change) is spot on; the concept of "useful" used in the previous explanation is... not useful, given that the definition of usefulness depends on the observer.

  • @KatjaTgirl
    @KatjaTgirl ปีที่แล้ว +8

    Another great video Arvin, thanks!
    Since our universe is expanding and the speed of light is limited, in the future we will be able to interact with fewer and fewer particles. Does this mean that in the far future when our light cone gets emptier, entropy will actually go down while time is still moving forward? If so what is the moment of maximum entropy of our universe?

    • @tektrixter
      @tektrixter ปีที่แล้ว +6

      That concept is called "heat death". Given unlimited expansion of the universe, eventually all remaining particles will be evenly distributed and outside one another's light cones. At that point time itself may end as without interaction there can be no events to have intervals between. Entropy will be maximized.

    • @the6millionliraman
      @the6millionliraman ปีที่แล้ว +3

      That's a really interesting question imo.
      As Roger Penrose has theorized, the universe's remote future of maximum entropy (heat death) is, at least mathematically speaking, fundamentally indistinguishable from the minimum entropy state of the universe at the Big Bang.
      Both instances are in thermal equilibrium.
      So it's almost like the maximum entropy state is (mathematically) exactly the same as the minimum entropy state.
      Penrose posits that in the remote future, after black holes have evaporated and there are just massless and timeless photons bombing around, the universe basically "forgets" how big it is and in some way returns to its Big Bang state. He uses MC Escher's fractals as an analogy. Fascinating. Who knows.

    • @hyperduality2838
      @hyperduality2838 ปีที่แล้ว

      @@the6millionliraman Maximum is dual to minimum.

  • @Hossak
    @Hossak ปีที่แล้ว +1

    I felt like standing up and applauding after this video. Thank you so much, my friend :)

  • @0-by-1_Publishing_LLC
    @0-by-1_Publishing_LLC ปีที่แล้ว +2

    I equally argue that time cannot be reversed and can only move forward. In the "scrambled eggs" example even if the entropy reversed itself, there would still be a *beginning* (the egg), a *middle* (the egg being scrambled) and an *end* (the egg returning to its original state of entropy).
    *The Past* is a database of all events that take place.
    *The Present* is now.
    *The Future* is a specific degree of probability based on past and present events.

    • @AdvaiticOneness1
      @AdvaiticOneness1 ปีที่แล้ว +2

      It's always Now, Past and Future are just illusions.

    • @0-by-1_Publishing_LLC
      @0-by-1_Publishing_LLC ปีที่แล้ว +1

      @@AdvaiticOneness1 *"It's always Now, Past and Future are just illusions."*
      ... That is a *semantic paradox* resulting from the words we are forced to use to communicate. However, your paradox has no bearing on reality. There is, was, and always will be a past, present, and a future.

    • @drbuckley1
      @drbuckley1 ปีที่แล้ว

      Minkowski proposed the existence of hyperspace, a region of "now" that cannot be perceived by any observer. "Now" may be the illusion, since no two observers share the same "worldline."

    • @AdvaiticOneness1
      @AdvaiticOneness1 ปีที่แล้ว +1

      ​@@0-by-1_Publishing_LLC Our perception of time as a continuous flow is created by our brain's processing of sensory information and memories. Time is not an objective feature of the universe, but rather a product of our consciousness.

    • @0-by-1_Publishing_LLC
      @0-by-1_Publishing_LLC ปีที่แล้ว

      @@AdvaiticOneness1 *"Our perception of time as a continuous flow is created by our brain's processing of sensory information and memories."*
      ... Yes, and "brains" extrapolate *logic* from whatever we observe in a universe steeped in logic, and logic states that all events that have already taken place are recorded in the past; now is an immeasurable instant within the present, and the future is only a specific degree of probability based on data derived from past and present events.
      *"Time is not an objective feature of the universe, but rather a product of our consciousness."*
      ... Like it or not, our individual consciousnesses are an equal part of the universe ... just like everything else. We don't marginalize our ability to think, conceive, and process logic just because we are the ones wielding the power to do so.

  • @lukedowneslukedownes5900
    @lukedowneslukedownes5900 ปีที่แล้ว +1

    The very short summary, or the answer to this, is that everything is connected to everything else through infinite time and space, which means every piece of matter has some sort of relative effect that can alter another piece of matter, and this is exponential except at the quantum scale, from what we know in science so far.

  • @mmtrichey
    @mmtrichey ปีที่แล้ว +16

    34 People liked it before it even played. Well done Arvin! :-)

    • @stefaniasmanio5857
      @stefaniasmanio5857 ปีที่แล้ว +4

      I understand this is not scientific at all... But Arvin is Arvin.... ❤

    • @blackshard641
      @blackshard641 ปีที่แล้ว +3

      Retrocausality.

    • @tayt_
      @tayt_ ปีที่แล้ว

      Reported for hacking.

    • @tonalambiguity3345
      @tonalambiguity3345 ปีที่แล้ว +1

      Probably had an early release for his patreon or something

    • @dsera2721
      @dsera2721 ปีที่แล้ว

      you mean hes paying for views?

  • @andyonions7864
    @andyonions7864 ปีที่แล้ว +2

    Loved the 2D animations of Hydrogen and Oxygen combining, showing clearly the incomplete hydrogen shell having only 1 of 2 electrons, and the oxygen having two shells, the innermost complete with 2 electrons and the outer shell missing 2 electrons (having 6 out of 8). Then after the combination, both hydrogens share 2 electrons and the oxygen shares 8 in their outer shells. Just as I learnt at school all those years ago...

  • @alfadog67
    @alfadog67 ปีที่แล้ว +6

    OUTSTANDING! Thanks, Professor Ash!
    If entropy is increasing, does that mean the speed of time is also decreasing? Could that look like universal expansion?

    • @ArvinAsh
      @ArvinAsh  ปีที่แล้ว +9

      Well, I'm not sure what speed of time means. There is nothing like a standard time in the universe. Clocks tick faster or slower depending on your reference frame compared to other reference frames. This is what we learned from Special Relativity.

    • @alfadog67
      @alfadog67 ปีที่แล้ว

      @@ArvinAsh The way I'm visualizing it is that before, when there was less entropy, a second was a second, and today with our current entropy, a second is still a second... but when we compare the two, today's second is relatively longer than before.

    • @drbuckley1
      @drbuckley1 ปีที่แล้ว +1

      @@alfadog67 Entropy increases in closed systems, but may be slowed (if not reversed) in open systems. The Earth is an open system, deriving most of its energy from the Sun. Entropy results in the Arrow of Time; the experience of time is observer dependent.

    • @iam6424
      @iam6424 ปีที่แล้ว

      Probably, since it all came out as a probability phenomenon!
      But I wonder, what is "probability"?

    • @drbuckley1
      @drbuckley1 ปีที่แล้ว

      @@iam6424 "Probability" is a human construct intended to approximate their expectations. It is not reality, which could not care less about what humans expect.

  • @TedToal_TedToal
    @TedToal_TedToal 6 หลายเดือนก่อน +1

    Another great video! I wanted to hear something more about what distinguishes usable energy from unusable energy. I would think it would have to do with how spread out the energy is. If it's spread all over the place then it's not very useful.
    I have a personal way of thinking about increasing entropy and it comes from making chocolate chip cookies. When it is time to add chocolate chips to the dough, it's always very difficult to stir them and you wonder whether you can even make them mix in at all. Yet gradually they do become scattered throughout the dough. Why? It's entropy again. The entropy of the chocolate chips all clustered together at the start is very low, they're highly organized. But the number of possible ways that they can be spread around is practically infinite, whereas there is only one way to keep them all together in a clump. So probability-wise, as you stir, the chocolate chips are far more likely to end up in different places than to stay bunched together.
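
The chip-counting argument above can be sketched with a toy simulation (entirely my own construction, with made-up numbers): the chips start clustered at one end of the dough, random "stirs" swap positions, and a crude spread measure goes up purely by counting, because almost all arrangements are spread-out ones.

```python
import random

random.seed(0)

dough = [1] * 10 + [0] * 90   # 10 chips clustered at the front of the dough

def spread(state):
    """Mean distance of the chips from the front -- small when clustered."""
    positions = [i for i, x in enumerate(state) if x == 1]
    return sum(positions) / len(positions)

before = spread(dough)            # 4.5: chips occupy slots 0-9
for _ in range(10_000):           # each "stir" swaps two random slots
    i = random.randrange(len(dough))
    j = random.randrange(len(dough))
    dough[i], dough[j] = dough[j], dough[i]
after = spread(dough)

print(before, after)              # clustered ~4.5 vs. roughly uniform
```

No swap prefers spreading; the spread state simply corresponds to vastly more arrangements than the clump, which is the whole argument.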

    • @garethrobinson2275
      @garethrobinson2275 หลายเดือนก่อน

      Be careful, by increasing entropy you are speeding the demise of the universe!

  • @anthonycarbone3826
    @anthonycarbone3826 ปีที่แล้ว +5

    I think the real question is how everything got to such a high energy state in the first place. I could sum it up better as: why did the universe have so much order to begin with, when the original universe was full of disorder, and only shortly after the beginning coalesced into an ordered, meaningful universe?

    • @anthonycarbone3826
      @anthonycarbone3826 ปีที่แล้ว +3

      Arvin answered my question very well. But still it boggles the mind to think the early universe was more orderly when no atom could exist until the universe could cool down enough for photons to go their own way and allow for atoms to form.

    • @vitovittucci9801
      @vitovittucci9801 ปีที่แล้ว

      Just before the Big Bang, all matter and energy were compressed into an extremely small space. Because of the Uncertainty Principle, all particles (whatever they may be) were arranged in relatively precise positions (low entropy).

    • @anthonycarbone3826
      @anthonycarbone3826 ปีที่แล้ว +1

      @@vitovittucci9801 There was no matter or energy before the big bang. To talk about anything before the Big Bang goes into the metaphysical realm and into the complete unknown. You mean afterwards up to the time the universe cooled enough for matter to form.

  • @ANGROCEL
    @ANGROCEL ปีที่แล้ว +2

    I had to learn entropy and equilibrium last month. I like how you explained it.

    • @evgenistarikov3386
      @evgenistarikov3386 ปีที่แล้ว +1

      But to really learn what entropy in fact is, please see my commentary on this stream. Please bear in mind that any equilibrium is basically the result of the relevant action-counteraction interplay.

  • @rafanifischer3152
    @rafanifischer3152 ปีที่แล้ว +53

    I am an expert on probability. And I will now enlighten you: Never go on vacation to Las Vegas. You can thank me later.

    • @ArvinAsh
      @ArvinAsh  ปีที่แล้ว +6

      Wise words! But the shows are nice.

    • @jasonspades1265
      @jasonspades1265 11 หลายเดือนก่อน +1

      Then you know playing blackjack isn't too much of a risk if you know what you're doing

    • @rafanifischer3152
      @rafanifischer3152 11 หลายเดือนก่อน

      @@jasonspades1265 Flipping a coin is not too much of a risk, but I wouldn't bet my house on it.

    • @baxakk7374
      @baxakk7374 10 หลายเดือนก่อน +1

      Skilled poker players against stupid people on vacation could win

    • @caveman3592
      @caveman3592 9 หลายเดือนก่อน +1

      😂

  • @beatricechauvel8237
    @beatricechauvel8237 ปีที่แล้ว

    It's not physics but probability that governs entropy. Nice video. Thank you for your work.

  • @mhouslay7281
    @mhouslay7281 3 หลายเดือนก่อน +1

    That was such an awesome video.
    Lightbulb 💡 moment for me.
    You videos are always superb but this was a cracker 🎉
    Thanks so much. 👍😊

  • @Dyslexic-Artist-Theory-on-Time
    @Dyslexic-Artist-Theory-on-Time ปีที่แล้ว

    The one-way characteristic of electromagnetic waves is that light always moves from the past into the future: light waves radiate out until they hit something. When this happens, a new photon-electron coupling, or dipole moment, forms as a probabilistic future unfolds. In such a theory, the mathematics of QM represents the physics of time as a geometrical process that gives rise to statistical entropy and the Second Law of thermodynamics. All we would need is symmetry, in the form of a geometrical shape, to form naturally in the Universe; when that symmetry breaks, it could produce statistical entropy, or disorganization. There would also be a small potential for the geometry to form greater complexity and diversity. A process of spherical symmetry forming and breaking could produce the characteristics of our Universe.
    The interior of a sphere is naturally three dimensional, giving us our three-dimensional space, with the process forming one variable in the form of time. When the spherical symmetry breaks, it has the potential to form the most beautiful of geometrical shapes: the Fibonacci spiral.

  • @mrparkerdan
    @mrparkerdan ปีที่แล้ว +2

    I’ve been learning about energy and entropy for 30 years. Not a single teacher or textbook ever mentioned “probability “ 😒

    • @thedeemon
      @thedeemon ปีที่แล้ว

      Bad choice of books. Get a statistical mechanics textbook for once.

  • @webx135
    @webx135 ปีที่แล้ว +2

    One way this could be worded is that all possible configurations are equally likely; there are just far fewer configurations of the kind you are looking for.
    So say you roll a 20-sided die hoping for a 20. The piece educators tend to miss is that, the way they explain it, it often sounds like a weighted die. But really it's the fact that you are hoping for a 20, and there are 19 other equally likely states that AREN'T 20. Those other states aren't individually more likely than the 20.
    So with the scrambled egg analogy, it's not that the "unscrambled" state is less likely than, say, one very, very SPECIFIC way for the egg to be scrambled. It's just that we define "unscrambled" in such a specific way that it is absurdly unlikely for the egg to ever be in that state. "What if I rolled a 20?"
    You could also ask, "What are the chances the egg would be scrambled in a way that the yolk spells out the complete works of Shakespeare?" There is an absurd number of combinations in total, so the likelihood of that specific state is insanely low. This would be like asking, "What if I rolled a 19?"
    Or you could ask, "What are the chances the egg would be about halfway scrambled?" Now you've included a TON of states that fit the description, so you're pretty likely to hit one of them, even though each state individually is no more likely than the others. This would be like asking, "What if I rolled something higher than 10?"
    The likelihood of the individual states doesn't change. The number of states you lump together is what changes.
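
The d20 analogy above can be made concrete in a few lines (a sketch of the commenter's point, not anything from the video): each face is one equally likely "microstate", and the probability of an outcome depends only on how many faces we lump into it.

```python
from fractions import Fraction

# Every face of a fair d20 is an equally likely "microstate"; what changes
# is how many faces we lump together into the outcome (the "macrostate").
faces = range(1, 21)
p_face = Fraction(1, 20)

p_exactly_20 = sum(p_face for f in faces if f == 20)   # 1 face   -> 1/20
p_above_10   = sum(p_face for f in faces if f > 10)    # 10 faces -> 1/2

print(p_exactly_20, p_above_10)  # 1/20 1/2
```

"Above 10" is ten times as likely as "exactly 20" not because any face is weighted, but because ten microstates count as a hit instead of one.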

  • @williammartine5168
    @williammartine5168 ปีที่แล้ว +2

    well done and well handled. Entropy changes can be quite deceptive, like living things developing---growing into an organized entity from the disorganized surroundings. The time/entropy connection has always interested me, I wonder if time and entropy are equivalent, when viewed from a higher dimension. Then there is the concept that time is only an illusion---well, it would seem that entropy can govern causality with as much direction as the apparent 'arrow of time'. Thanks for all your awesome work on the videos.

  • @buddharuci2701
    @buddharuci2701 ปีที่แล้ว +1

    Love the busted-up boulder. Sisyphus is off the hook.

  • @germanjimenez5336
    @germanjimenez5336 ปีที่แล้ว +1

    Here's a different take... we experience time flowing this way because ALL our systems and mental processes depend on them to create memories. If we had a different way of creating and keeping memories that aren't tied up to the chemistry we know, we'd experience reality a different way. Quantum particles might be different and experience other laws.
    A being that creates memories and experiences through non chemical processes could be able to move through time at will.

    • @UnitSe7en
      @UnitSe7en ปีที่แล้ว

      What we call time is just a high-level mental construct.

  • @LQhristian
    @LQhristian ปีที่แล้ว +2

    It would seem more intuitive to correlate entropy with 'decay.' I.e.: The higher the energy level of the object/system, the lower its entropy/decay rate. This would explain why time would appear slower, the higher the dimension (with higher mass particles, stronger gravity). Just a thought :-).

  • @wayneyadams
    @wayneyadams ปีที่แล้ว +1

    0:47 LOL I used to say that to my Physics students for decades. Nature is lazy is also a way to remember how inertia works.
    3:22 There is no "...and so on." It is all friction, and air resistance is a form of friction; there is nothing else. Even the deformation of the ball as it rolls is the result of energy losses to friction within the material.
    7:59 Again, be specific, there is ONLY one way for the gravitational potential energy of the pencil to do work, i.e., the pencil falls to the table. What pray tell do YOU think are other ways for the gravitational energy to do work?
    The concept of time is a human construct that we impose on nature to understand phenomena and make sense of events happening around us. Events happen as you described whether we are there to say time flows forward or backward or not. If the universe were constructed in such a way that natural phenomena occurred opposite to what we experience in this universe, then we would define the forward flow of time as the reverse of what we do today.

  • @ajayshinde1571
    @ajayshinde1571 ปีที่แล้ว +1

    really impressive video .. the best explanation I have ever seen

  • @shreeram5800
    @shreeram5800 ปีที่แล้ว +2

    Super concept, and I can't believe it was simplified in such a beautifully elegant way ☺️

  • @Mikey-mike
    @Mikey-mike ปีที่แล้ว +2

    Good video.
    This would be different for anti-matter but gravity is attractive for both matter and anti-matter.

  • @Andrew-lo5sc
    @Andrew-lo5sc ปีที่แล้ว +1

    I think there are only two finite directions in the universe. Something is either going into a dimension or out of a dimension. Dimensional organization. On one hand entropy would seem more chaotic, but spacetime itself is becoming more organized and predictable.

  • @patinho5589
    @patinho5589 ปีที่แล้ว +1

    Q: Why do things tend towards their lowest energy stage?
    A: because we’ve defined what is higher or lower energy based on what we observe, and we have defined states as lower energy if they are the ones tended to.
    Real question: is this what has happened? Or not?

  • @faikerdogan2802
    @faikerdogan2802 ปีที่แล้ว +1

    God-tier video. I've watched so many videos about entropy, and this one is just the best.

  • @GururajBN
    @GururajBN ปีที่แล้ว +2

    Entropy plays a role on my work table too. If I do not periodically rearrange the things, they become so mixed up and chaotic!

  • @abhishekc232
    @abhishekc232 ปีที่แล้ว +1

    One of the best videos on entropy.

  • @plutoisacomet
    @plutoisacomet ปีที่แล้ว

    Great presentation. The question of how still remains. The use of probability theory is somewhat of an excuse for not knowing the exact mechanism of how something works or occurs.

  • @kwokchuchan7793
    @kwokchuchan7793 ปีที่แล้ว +2

    Another simple explanation of entropy, by Stephen Hawking: when you shake a jigsaw puzzle in a box, there is a much higher probability of finding the pieces in a random order (higher entropy) than of all the pieces being magically assembled in the right order to form the picture (lower entropy).
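
A quick back-of-the-envelope version of the jigsaw picture (my own numbers; the example doesn't specify a piece count): with n distinct pieces there are n! equally likely orderings, and only one of them is the assembled picture.

```python
from math import factorial

# With n distinct jigsaw pieces, shaking the box lands on one of n!
# equally likely orderings; exactly one of them is the assembled picture.
n = 50                      # a made-up 50-piece puzzle
microstates = factorial(n)  # number of possible orderings
p_assembled = 1 / microstates

print(f"{microstates:.3e} orderings, P(assembled) = {p_assembled:.1e}")
```

Even a modest 50-piece puzzle has on the order of 10^64 orderings, so "magically assembled" is not forbidden, just hopeless on any realistic timescale.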

    • @joevignolor4u949
      @joevignolor4u949 ปีที่แล้ว

      Right. Then it's interesting to take it a step further. When you're putting the puzzle together your brain and other organs are working together to convert food into energy, which increases the level of entropy in the universe. The energy is then used to put the puzzle together, which decreases the level of entropy in the puzzle. In the end, as long as the net gain of entropy in the entire universe exceeds the reduction of entropy in the puzzle, then the 2nd law of thermodynamics isn't being violated.

  • @tedwalford7615
    @tedwalford7615 several months ago

    You've just proved not just intelligent design but conscious, intentional creation of the universe.
    If the "universe was at lowest entropy in the beginning," "in its lowest entropy state," then in the beginning it was in its least-probable state. And not just as improbable as one balanced pencil; rather, ALL the constituents of the universe were at their least probable, including not just its elements but also all the physical laws and principles under which it would expand and agglomerate and in which life would arise.
    And in our experience, it is life that rearranges matter and energy in ways that create low-probability, low-entropy arrangements of matter. Logic and natural laws tell us, then, that the lowest-probability, lowest-entropy initial universe--components, laws, energy, and time--had a Creator.

    • @kuroshmunshi
      @kuroshmunshi 7 days ago

      You forget the part where, probabilistically, the highly improbable event closes the cycle ;)

  • @nickmelkata3950
    @nickmelkata3950 1 year ago +1

    I think my issue with connecting time itself to entropy is special relativity. Here we see that time itself can slow down, which in turn means the random events that drive changes in entropy slow down as well. This would seem to be possible only if time dictated entropy, not vice versa. The fact that time can vary based on things other than entropy seems to break the causal link. Entropy always increases due to probabilistic statistics based on other mechanically driven fluctuations (such as diffusion), but I think it is more an indicator of time passing than the cause. It may be one of the most fundamental quantities we can ascribe to the passing of time, but that doesn't mean it's causal. Also, at the atomic level we can have reactions that would appear to break the direction of time, like a particle absorbing a photon before it's emitted, but these can also be interpreted in terms of the Heisenberg uncertainty principle in time and energy, so they don't necessarily require increasing entropy to push time forward.

  • @Handelsbilanzdefizit
    @Handelsbilanzdefizit 1 year ago +1

    I'm not convinced,
    because a correlation (statistics ↔ time direction) is not an explanation.
    A wet street doesn't explain rain.
    In a backward-time universe, people would say:
    Processes always go from "likely" to "unlikely". Entropy decreases.
    That's the way it is!
    Because they never experienced it differently, just like us.

  • @royalusala8527
    @royalusala8527 1 year ago +1

    True, there are more possible ways for something to go wrong than to go right... the law of entropy.

    • @ArvinAsh
      @ArvinAsh 1 year ago

      There is a lot of truth to that statement!
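
A toy microstate count (my own sketch, not from the video) makes that statement concrete: with 20 coins there is exactly one "all heads" arrangement, but hundreds of thousands of "roughly half heads" arrangements.

```python
from math import comb

N = 20
total = 2 ** N  # 1,048,576 equally likely microstates

ordered = comb(N, N)  # the "all heads" macrostate: exactly 1 microstate
# a "roughly half heads" macrostate (8 to 12 heads): vastly more microstates
disordered = sum(comb(N, k) for k in range(8, 13))

print(ordered, disordered, total)  # 1 772616 1048576
```

Random dynamics land in the big macrostate almost every time simply because there are more ways to be there; that counting asymmetry is all the statistical "arrow" amounts to.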

  • @ShaziaQ
    @ShaziaQ 8 months ago

    Arvin is undoubtedly the best physics teacher ever