Pure Information Gives Off Heat

Comments • 1.7K

  • @patrickhanft
    @patrickhanft 2 years ago +121

    I do have a degree in computer science, and I find you again and again to be one of my best CS teachers on topics that were either never discussed or badly explained during my studies!

    • @zen1647
      @zen1647 2 years ago +2

      Great video! You're awesome!

    • @hyperduality2838
      @hyperduality2838 2 years ago +1

      There is a 4th law of thermodynamics:-
      Equivalence, similarity = duality (isomorphism).
      An "equivalence gate" measures similarity, sameness or dualness.
      Homology is dual to co-homology -- topology.
      Increasing the number of dimensions or states is an entropic process -- co-homology.
      Decreasing the number of dimensions or states is a syntropic process -- homology.
      From a converging, convex (lens) or syntropic perspective everything looks divergent, concave or entropic -- the 2nd law of thermodynamics.
      All observers have a syntropic perspective according to the 2nd law.
      My syntropy is your entropy and your syntropy is my entropy -- duality.
      Syntropy (prediction, convergence) is dual to increasing entropy -- the 4th law of thermodynamics!
      Teleological physics (syntropy) is dual to non-teleological physics (entropy).
      Convergence (syntropy, homology) is dual to divergence (entropy, co-homology).
      The word entropy means "a tendency to diverge" or differentiate into new states, reductionism -- science.
      The word syntropy means "a tendency to converge" or integrate into a single whole state, holism -- religion.
      Differentiation is dual to integration, division is dual to unity, reductionism is dual to holism.
      Syntropy is dual to entropy.
      "Science without religion is lame, religion without science is blind" -- Einstein.
      Science is dual to religion -- the mind duality of Albert Einstein.
      Your mind uses information to converge or create optimized predictions -- a syntropic process.
      Making predictions to track targets, goals & objectives is a syntropic process -- teleological.
      Duality creates reality.
      "Always two there are" -- Yoda.
      The 4th law of thermodynamics is hardwired into mathematics.

    • @sherpajones
      @sherpajones 1 year ago +1

      16:23 What if the logic gate operated by interpreting the interference pattern of light? If you only had one light source, there would be no pattern. If you had two, there would be a pattern. The presence or absence of a pattern could be your 0 or 1. This should easily be reversible.

    • @moth5799
      @moth5799 1 year ago

      @@hyperduality2838 Christ mate you really think quoting Yoda makes you look smart lmfao? There are 4 laws of thermodynamics but that's only because we call one of them the Zeroth law. Your one is just made up.

    • @hyperduality2838
      @hyperduality2838 1 year ago

      @@moth5799 Subgroups are dual to subfields -- the Galois correspondence.
      The Galois correspondence in group theory is based upon duality.
      There are new laws of physics -- Yoda is correct.
      Energy is dual to matter -- Einstein.
      Dark energy is dual to dark matter.
      Energy is duality, duality is energy -- the 5th law of thermodynamics!
      Potential energy is dual to kinetic energy -- gravitational energy is dual.
      Energy is measured in Joules (duals, jewels) in physics.

  • @compuholic82
    @compuholic82 2 years ago +457

    Fun fact: Reversible logic is really important in quantum computing. Since all state changes can be represented as unitary matrices, quantum gates are always reversible.
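    A tiny sketch of that point (my example in Python/NumPy, not something from the video): the CNOT gate's matrix is unitary, and since CNOT is its own inverse, applying it twice recovers any input state.

    ```python
    # Unitarity (U @ U^dagger = I) is exactly the reversibility guarantee.
    import numpy as np

    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=complex)

    assert np.allclose(CNOT @ CNOT.conj().T, np.eye(4))  # unitary

    state = np.array([0, 0, 1, 0], dtype=complex)        # the basis state |10>
    assert np.allclose(CNOT @ (CNOT @ state), state)     # reversible
    ```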

    • @hyperduality2838
      @hyperduality2838 2 years ago +38

      There is a 4th law of thermodynamics:-
      Equivalence, similarity = duality (isomorphism).
      An "equivalence gate" measures similarity, sameness or dualness.
      Homology is dual to co-homology -- topology.
      Increasing the number of dimensions or states is an entropic process -- co-homology.
      Decreasing the number of dimensions or states is a syntropic process -- homology.
      From a converging, convex (lens) or syntropic perspective everything looks divergent, concave or entropic -- the 2nd law of thermodynamics.
      All observers have a syntropic perspective according to the 2nd law.
      My syntropy is your entropy and your syntropy is my entropy -- duality.
      Syntropy (prediction, convergence) is dual to increasing entropy -- the 4th law of thermodynamics!
      Teleological physics (syntropy) is dual to non-teleological physics (entropy).
      Convergence (syntropy, homology) is dual to divergence (entropy, co-homology).
      The word entropy means "a tendency to diverge" or differentiate into new states, reductionism -- science.
      The word syntropy means "a tendency to converge" or integrate into a single whole state, holism -- religion.
      Differentiation is dual to integration, division is dual to unity, reductionism is dual to holism.
      Syntropy is dual to entropy.
      "Science without religion is lame, religion without science is blind" -- Einstein.
      Science is dual to religion -- the mind duality of Albert Einstein.
      Your mind uses information to converge or create optimized predictions -- a syntropic process.
      Making predictions to track targets, goals & objectives is a syntropic process -- teleological.
      Duality creates reality.
      "Always two there are" -- Yoda.
      The 4th law of thermodynamics is hardwired into mathematics.

    • @laughingone3728
      @laughingone3728 2 years ago +4

      @@hyperduality2838
      Nicely stated. Thanks for that.

    • @Snowflake_tv
      @Snowflake_tv 2 years ago +2

      Really?

    • @hyperduality2838
      @hyperduality2838 2 years ago +12

      @@laughingone3728 You're welcome, it gets better:-
      There is also a 5th law of thermodynamics, energy is duality, duality is energy!
      Energy is dual to mass -- Einstein.
      Dark energy is dual to dark matter.
      Action is dual to reaction -- Sir Isaac Newton (the duality of force).
      Attraction is dual to repulsion, push is dual to pull -- forces are dual.
      If forces are dual then energy must be dual.
      Energy = force * distance.
      Electro is dual to magnetic -- Maxwell's equations
      Positive is dual to negative -- electric charge.
      North poles are dual to south poles -- magnetic fields.
      Electro-magnetic energy is dual.
      "May the force (duality) be with you" -- Jedi teaching.
      "The force (duality) is strong in this one" -- Jedi teaching.
      There are new laws of physics! Your mind creates or synthesizes syntropy!
      Thesis is dual to anti-thesis creates the converging thesis or synthesis -- the time independent Hegelian dialectic.
      Duality creates reality!
      Certainty is dual to uncertainty -- the Heisenberg certainty/uncertainty principle.
      Everything in physics is made from energy hence duality.
      Concepts are dual to percepts -- the mind duality of Immanuel Kant.

    • @hyperduality2838
      @hyperduality2838 2 years ago

      @@Snowflake_tv There is also a 5th law of thermodynamics, see my next comment.

  • @domotheus
    @domotheus 2 years ago +440

    Good stuff! If you're interested there's an even weirder (and very theoretical) application of reversible computing called a "Szilard engine" where you can go back and forth between waste data and waste energy. Using the wasted bits of reversible computing you can theoretically extract energy out of a system that's at an equilibrium state, basically meaning you can convert energy into data and data into energy
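    For scale, a one-line sketch (assuming room temperature; k*T*ln 2 is the standard Szilard/Landauer figure, not a number taken from this comment):

    ```python
    # Maximum work extractable per bit of known data -- equivalently, the
    # minimum heat paid per bit erased -- is k*T*ln(2).
    import math

    k = 1.380649e-23  # Boltzmann constant, J/K
    T = 300           # assumed temperature, K
    print(f"{k * T * math.log(2):.2e} J per bit")  # ~2.87e-21 J
    ```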

    • @zdlax
      @zdlax 2 years ago +20

      mass = energy = information

    • @landspide
      @landspide 2 years ago +19

      isn't this like Maxwell's demon?

    • @abdobelbida7170
      @abdobelbida7170 2 years ago +2

      How so?

    • @MasterHigure
      @MasterHigure 2 years ago +25

      @@abdobelbida7170 Maxwell's demon can separate a fluid into fast / slow atoms, or a fluid mix into its constituent parts. You can extract energy from the recombining. The cost is that the demon's knowledge of the state of the system becomes less and less useful as it does its work.

    • @lubricustheslippery5028
      @lubricustheslippery5028 2 years ago +10

      I am confused. I thought that information needs a physical representation according to information theory, and thus something like pure information should not be a thing, just as pure energy is not a thing. Is it then something like converting between entropy and enthalpy? In chemistry you calculate with the Gibbs free energy, the combination of entropy and enthalpy, to see whether a reaction can occur.

  • @demetrius235
    @demetrius235 2 years ago +96

    I worked in the semiconductor industry (DRAM) for a few years and now one of the courses I teach is Thermodynamics. I had no idea about the Landauer limit so thanks for teaching me something new! Also, good work pointing out that a completely reversible process is not possible as there is always some energy loss (collisions in your billiard ball case). This was an excellent video!

    • @anntakamaki1960
      @anntakamaki1960 2 years ago

      Is it in Russia?

    • @demetrius235
      @demetrius235 2 years ago +3

      @@anntakamaki1960 "it"?? I did not work in Russia and I have never been to Russia. I have no desire to set foot in Russia.

    • @mathslestan3323
      @mathslestan3323 2 years ago +4

      @@demetrius235 Wow so much love for Russia 😑

    • @Nehmo
      @Nehmo 1 year ago +2

      How can you be associated with semiconductors in any way and not know about the Landauer limit? Sorry to be critical, but it's rather basic.
      Now that you know about it, you'll notice it coming up again and again.

    • @katrinabryce
      @katrinabryce 1 year ago +5

      @@Nehmo Possibly because it is so small that for all practical purposes it is zero? The Landauer limit of a typical modern 35W CPU is about 35 nanowatts. And for DRAM it is actually 0W, because the whole point of RAM is that you get back out what you put in, so it is reversible.

  • @fmeshna
    @fmeshna 1 year ago +15

    Jade, your ability to explain complex quantitative concepts so clearly is exceptional. We need more teachers like you.

  • @gotbread2
    @gotbread2 2 years ago +102

    While the second law gives a mathematical justification for that energy loss, it does not give a deeper "why" that is the case. The fundamental issue is that of information erasure itself. This comes down to collapsing a state to a single value. Imagine the 2 bits getting reduced to 1 bit. This means we force one bit from having a variable state (either 0 or 1) to a fixed state. It can be any value, it does not matter, but it is now a constant and no longer a variable. This is where the loss occurs.
    One helpful visual is a ball in a potential with 2 valleys (as a stand-in for a particle in a bipotential). Now this ball can be in either of the 2 valleys initially. By definition we don't know which, else it would be a known constant and not a variable. Let's say we want to move this ball into the left valley, from any starting valley. The issue here is that whatever we come up with needs to work for both starting valleys. Similarly, the bit erasure must be able to set a 0 to a 0, but also a 1 to a 0. In the ball case, you can move it over to the other valley, but then it will have some speed left, which you need to dissipate in order for it to come to rest at the bottom and keep this state. This is exactly where the loss happens. You can add some kind of reversible damping to "catch" this energy, but then it won't work for the case where the ball was already in the correct valley. Whatever case you design it for will always cause an energy loss for the other case, since you need to move from a case with potentially "some" kinetic energy to a state with "zero" kinetic energy, without knowing the direction of the motion. (This is similar to Maxwell's demon.)
    Now how much energy do we need to dissipate? Also easy to see. In order to differentiate between the 2 bit states, there needs to be a potential barrier between them. This barrier needs to be high enough to prevent thermal movement from flipping the bit on its own. The energy you need to dissipate while "catching" the bit and bringing it to rest comes directly from the energy you need to expend to cross this barrier. Since the barrier is temperature related (more temperature -> more thermal energy -> higher barrier needed to avoid flips), the energy loss is also temperature dependent. This is where the "T" in the equation comes from. The Boltzmann constant, in a way, is mandatory to match the units. The last piece of the puzzle is the ln(2). We can be satisfied with using the second law as a shortcut here, but the ln(2) can also be derived directly from the "geometry" of this "information bit in 2 potential wells" problem.
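    A quick numeric sketch of that temperature dependence (assumed temperatures; Python):

    ```python
    # Minimum heat per erased bit, Q = k*T*ln(2), at a few temperatures.
    import math

    k = 1.380649e-23  # Boltzmann constant, J/K
    for label, T in [("room temp", 300.0), ("liquid N2", 77.0), ("liquid He", 4.2)]:
        print(f"{label} ({T} K): {k * T * math.log(2):.2e} J per bit")
    ```

    At 300 K this comes out to about 2.9e-21 J per bit and scales linearly with T, which is the "T" in the equation referred to above.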

    • @dtkedtyjrtyj
      @dtkedtyjrtyj 2 years ago +11

      Wow. I actually think I understood some of that. It makes intuitive sense...?

    • @rewe3536
      @rewe3536 2 years ago +6

      Thank you! The video makes it seem like it's just magic; it just happens.

    • @garyw.9628
      @garyw.9628 2 years ago +9

      Really nice analysis of why erasing information necessitates a loss of energy. Also very appropriate to mention Maxwell's demon, since his thought experiment cleverly demonstrated the important link between information and energy. But in the derivation of the Landauer Limit, and of the Boltzmann constant itself, there seems to be the assumption of a system consisting of the atoms and molecules of a gas. What if the computing device consisted of something smaller than atoms, like photons, neutrinos or quarks? Would the corresponding Landauer Limit then, by necessity, have a much lower value?

    • @gotbread2
      @gotbread2 2 years ago +5

      @@garyw.9628 We did not make any assumptions about what the system is made of. All we need to assume is "some element" (here a particle, but even a wave function works too), which can be in different states, and that the 2 states are separated by an energy barrier. We need this barrier in order to preserve the state (else it would evolve between the states over time), and the height of the barrier is based on the expected disturbances (thermal energy). Further, we assume that crossing this barrier of potential energy requires a certain kinetic energy, which we eventually need to dissipate again. Notice how abstract that setup is: it makes no mention of the particle type, or even a particle at all (even a field configuration would work here).
      You are correct about the Boltzmann constant, however. This carries some assumptions with it. Since this constant relates the temperature of a gas to its average kinetic energy, it all comes down to how you define "temperature" for your system. If you use different particles, or something different entirely, your definition of what a temperature is may change, thus changing the Boltzmann constant.

  • @bobdiclson4173
    @bobdiclson4173 2 years ago +439

    I love Jade's vibe of having a secret to share

    • @vigilantcosmicpenguin8721
      @vigilantcosmicpenguin8721 2 years ago +34

      Yeah, that's a perfect description. The way she grins when she gets to the juicy part of the secret.

    • @Blue.star1
      @Blue.star1 2 years ago

      She looks like a meson

    • @PhysioAl1
      @PhysioAl1 2 years ago +4

      You're right!

    • @sabouedcleek611
      @sabouedcleek611 2 years ago

      @@김민-o2k Looks a bit like Modified Newtonian dynamics, though I don't think √(1+(GM/(RC²))²) satisfies the interpolation requirement in the overview of the wiki.

    • @communitycollegegenius9684
      @communitycollegegenius9684 2 years ago

      Too bad. No one would watch or take her seriously without her vibe. Science and humanity lose.

  • @Alestrix76
    @Alestrix76 2 years ago +28

    I was wondering about the "many million more" at 10:00 and did some math, as I thought this sounded a little too high. But it turns out it's about right: a somewhat modern 64-bit x86 CPU has around 5*10^8 logic gates. Let's say with each cycle 1% of those gates get flipped, and there are 2*10^9 cycles per second (2 GHz); then we end up with around 1 µW. Modern power-efficient x86 processors need roughly 10 W, which is 10 million times this. Not sure what the numbers are like with an ARM processor in a smartphone. Of course this is just ballpark math.
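    Redoing that ballpark in code (same assumed inputs as the comment above; the arithmetic is mine, so treat it as a sketch):

    ```python
    # Landauer floor for an assumed CPU: 5e8 gates, 1% flipped per cycle,
    # 2e9 cycles per second, one bit erased per flip, at an assumed 300 K.
    import math

    k = 1.380649e-23                        # Boltzmann constant, J/K
    T = 300                                 # assumed temperature, K
    erasures_per_second = 5e8 * 0.01 * 2e9  # gates * flip fraction * clock
    print(f"{erasures_per_second * k * T * math.log(2):.1e} W")  # ~2.9e-05 W
    ```

    With these exact numbers the floor comes out closer to 30 µW than 1 µW, but the conclusion holds: a ~10 W processor sits several orders of magnitude above the Landauer minimum.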

    • @greenaum
      @greenaum 1 year ago +3

      The latest AMD CPUs have 8.2*10^10 transistors, or 82 billion. A logic gate might have maybe 5 transistors, so you're off by a factor of about 100.
      A lot more than 1% of the gates get flipped. CPUs are designed to use as much of the hardware as possible, all the time, to get more processing done. You don't want bits of the chip sitting around idle. This is why you might have "4 cores 8 threads". Each core runs two "threads". That is, if, say, the chip's multiplier unit is being used by one instruction, there might be a memory access it can make, using its memory access hardware, for another instruction. It runs two instructions at once, but it's not two entire processors. Instead, it's two front-ends that work out what instructions can be run with the currently unused parts of the CPU. So it's like you get, say, 1.8 effective processors per core, with just the addition of a second front-end and a bit of logic to figure out what's compatible.
      There's also pipelining, where an instruction might take, say, 5 stages of operations to complete. The 5 stages are built separately, and as an instruction leaves the first stage, a new one is brought in, so all 5 stages are busy, all the time, with 5 instructions.
      Then there's out-of-order execution and all sorts of other mad tricks to try and get processing done in almost no time at all. CPUs will have multiple adder units and other parts. It's not just having multiple cores, each core does multiple things.
      So, they're busy, by design, to produce the most throughput of calculation for the silicon you buy. To have circuits sitting idle is wasteful, and indeed they analyse that at the factory, and if a part isn't pulling its weight, they'll replace it with something else that does. It's all about getting the most processing possible done, because that's what they compete on, and what they set their prices by.
      In power-saving chips, like for phones, the opposite is true. They try to switch off as many sub-units as possible while still providing just enough CPU power to do whatever you're doing at that moment. Entire cores will usually be shut down, but they can wake up quickly. Plus modern phones might have 4 high-power, fast processors and 4 slower, very low-power ones, designed with different techniques, switched on and off as the operating system decides it needs them.

  • @E4tHam
    @E4tHam 2 years ago +87

    I’m currently pursuing a masters in VLSI, so thanks for introducing these concepts to people! The built-in impedances in metal and semiconductors will always overshadow the Landauer limit by several orders of magnitude, but this is an interesting thought experiment.

    • @adamnevraumont4027
      @adamnevraumont4027 2 years ago +10

      always is a long time to make a promise for

    • @adivp7
      @adivp7 2 years ago +12

      @@adamnevraumont4027 Even if you eliminate metal impedances with super-conduction, you definitely need semi-conducting material for a transistor. And semi-conductors will always produce heat.

    • @adamnevraumont4027
      @adamnevraumont4027 2 years ago +6

      @@adivp7 which you then follow with a formal proof that all computation requires transistors made of semiconductors? No?
      Well then making a promise for "always" is beyond your pay grade. Always is a very long time. Always is not just 10 years, it is 100 years, it is 10000 years, it is 100,000,000 years, it is 10^16 years, it is 10^256 years, it is 10^10^10^10^10 years, it is G64 years, it is TREE3 years, it is BB(50) years.
      It is a really long time.

    • @dot32
      @dot32 2 years ago +9

      ​@@adamnevraumont4027 lmao, it's physics. You need semiconductors for transistors. If you found something other than a transistor, you may not need semiconductors, but semiconductors are what transistors are afaik

    • @adivp7
      @adivp7 2 years ago +13

      @@adamnevraumont4027 Technically, none of those are "always", but fair point. What I meant to say is you need energy for switching. I can't see how you can have switching that doesn't use or release any energy.

  • @stufarnham
    @stufarnham 1 year ago +6

    This has become my favorite YouTube channel. These short, digestible discussions of deep topics are endlessly fascinating. I especially enjoy the discussions of paradoxes. Also, you are a great presenter - clear and engaging. Keep it up, please! ❤

  • @heartofdawn2341
    @heartofdawn2341 2 years ago +81

    The question then is, where does that second bit of output information go? How is it stored and used? If you simply discard it later, all that happens is that you push the source of the Landauer limit further downstream.

    • @JB52520
      @JB52520 2 years ago +32

      No one really knows because there's no design for a reversible computer yet. The billiard ball example shows how the balls might be returned to the correct place with a logic gate that doesn't expend energy, but it doesn't show where they're stored or how they'll return at the precise time (as far as I remember; it's been a while since I read about this).
      I'm just guessing, but a useful metaphor might be to picture a mechanical computer where each of the waste bits is stored in a tiny spring, such that computing would be like winding a clock. Once the result is obtained, the program runs in reverse to unwind the system and return the stored energy. (How it would actually work, I have no idea.) It's also like the difference between standard car brakes and regenerative braking. The former just radiates heat, and the latter runs one or more generators, storing energy to accelerate the car later.
      As far as I can tell, reversible computing doesn't have to be perfect, just like regenerative brakes. Even if a program can only run backward part way before releasing the remainder of its stored energy as heat, that's still better than releasing all of it, and it might be enough for a computer of the distant future to sidestep the Landauer limit.

    • @erkinalp
      @erkinalp 2 years ago +15

      @@JB52520 All quantum computers use reversible computational elements to prevent immediate collapses of superposition.

    • @glenncurry3041
      @glenncurry3041 2 years ago +4

      @@erkinalp Qubits are not single binary bits.

    • @brandonklein1
      @brandonklein1 2 years ago +3

      @@erkinalp but you're still subject to the Landauer limit once you make a measurement.

  • @thattimestampguy
    @thattimestampguy 2 years ago +2

    0:55 Heat Production & Waste
    1:55 Information has an energy cost ℹ️ The Landauer Limit
    2:25 Logic Gate
    More than one Logic Gate makes a Logic Network
    3:50
    4:04 4 Possible Combinations, 2 Possible Outputs
    00
    01
    10
    11
    6:37 Entropy Formula
    8:57 Fewer Outputs Than Input States
    10:26 Cooling The Computer 💻 🧊
    11:37 Reversible Computers
    12:07 Irreversible Operation, You can’t determine the inputs from the output
    13:36 It’s Possible To Reverse
    14:24
    15:28 Billiard Balls
    17:04

  • @pedromartins1474
    @pedromartins1474 2 years ago +56

    This took me back to my statistical physics classes! Wonderfully explained! Thank you so much!

  • @Dadniel1st
    @Dadniel1st 2 years ago

    Thanks

  • @danielschein6845
    @danielschein6845 2 years ago +77

    Amazing to think about. I spent 10 years designing actual microprocessors and always thought of energy in terms of the electrical current flowing through the device.

    • @ralfbaechle
      @ralfbaechle 2 years ago +15

      To be honest, the Landauer limit is so low that we can spend another century without reaching it. Figuratively speaking, that is. I've not done an estimate of how long we'd probably need to reach the Landauer limit from where technology is now, because, let's face it, such estimates usually are wrong :-) So it's perfectly OK to concentrate on all the other losses.

    • @DrewNorthup
      @DrewNorthup 2 years ago +10

      And even that is an oversimplification… The need to understand the impact of both the resistance and the reactance escapes a good many people. Switching losses are so large compared to Landauer I'd not expect the latter to factor in meaningfully for quite some time.

    • @triffid0hunter
      @triffid0hunter 2 years ago +9

      Sure, but the Landauer limit says that a modern CPU must use at least a nanowatt or so, and since they _actually_ use about a hundred watts, we've got a _long_ way to go before having to deal with the limit - Wikipedia's article says maybe 2080 if Koomey's law holds, although it doesn't mention which Koomey's law figure (there are two in the relevant article) was used to derive that figure.

    • @Evolved_Skeptic
      @Evolved_Skeptic 2 years ago +1

      @@ralfbaechle Perhaps at the scale of a single CPU in a home computer, but at the scale of the massive data-processing centers for internet servers, or the supercomputers used to simulate the climate/weather patterns of the entire Earth, the need to reduce the tremendous heat being generated (both in terms of efficiency and especially the cost of cooling these mega-systems) makes current efforts to overcome the Landauer limit financially viable.
      Just look at how much energy is estimated to be diverted from electrical power grids towards cryptocurrency farming, and try to estimate the vast amount of waste heat being generated (especially in hotter regions like Texas, where there's already a profound energy cost in maintaining functional computer environments) due to all this data processing.
      Any tweak to the design of computer processors which either overcomes, or more likely reduces (given our current incomplete understanding of quantum physics), the Landauer limit is going to be a worthwhile achievement.

  • @0ptikGhost
    @0ptikGhost 2 years ago +34

    I love this video as it pertains to theoretical computation. Realizable computers today generally use electric flow through substrates that always have some level of impedance. The real reason our current-day computers generate heat has nothing to do with the Landauer Limit but rather with the technology we use to build said computers. We don't stand a chance of getting anywhere near the Landauer Limit, regardless of the temperature we run the computer at, unless we figure out how to make computers using superconductors.

    • @david203
      @david203 2 years ago

      Yes, that is today's understanding. But the theory itself doesn't require superconductors. And CMOS does actually use less energy than, say, TTL.

    • @TheDavidlloydjones
      @TheDavidlloydjones 2 years ago

      @@david203
      You rather seem to miss Ghost's sensible point. Try reading it over again. Or look for E4tHam's sensible post, below in my feed, making Ghost's point in slightly different form.

    • @david203
      @david203 2 years ago +1

      @@TheDavidlloydjones I fully agree with Ghost's point, thanks. The current required to activate computer logic causes vastly more heat than the actual processing of information does. That's why it's called a Limit.

    • @greenaum
      @greenaum 1 year ago +1

      Right. Charging and discharging the gates of umpty-billion transistors requires dumping the energy as heat. Resistors make heat, though CMOS is usually quite high-impedance.
      There's reversible logic, but frankly it looks like a pain in the arse and it seems like most of it happens only on paper and the rest might be that someone strings together a couple of logic gates, not an entire CPU. Even then it's probably made of snooker balls!

  • @JanStrojil
    @JanStrojil 2 years ago +178

    The fact that information contains energy always boggles my mind.

    • @goldenwarrior1186
      @goldenwarrior1186 2 years ago +3

      It makes sense. There’s nothing to really think of (don’t mean to be mean)

    • @MyMy-tv7fd
      @MyMy-tv7fd 2 years ago +33

      that is because it does not. There is no necessary information in a logic gate switching; it could randomly switch, or just be set to oscillate. No information is involved in mere switching, but energy obviously is. Either this is clickbait, or she does not understand that information is the intentional switching of logic gates to produce a certain storage pattern, which may be volatile, like RAM, or non-volatile, like an SSD.

    • @dominobuilder100
      @dominobuilder100 2 years ago +5

      @@MyMy-tv7fd give an example then of any sort of information that does not contain energy

    • @MyMy-tv7fd
      @MyMy-tv7fd 2 years ago +38

      @@dominobuilder100 - no information whatsoever contains energy; it is conceptual. But there is always a physical substrate, whether it be the page of a book or a RAM stick. The change in the substrate could contain information, or just be random, but the change itself will always require energy. The information is the 'ghost in the machine'.

    • @paulthompson9668
      @paulthompson9668 2 years ago +2

      @@goldenwarrior1186 Hindsight is always 20/20

  • @ericmedlock
    @ericmedlock 2 years ago +6

    Great video! I learned a bunch of this in university a million years ago but you do a super job of simplifying a really complex set of concepts. Kudos!

  • @cykkm
    @cykkm 2 years ago +8

    There is a little big problem with the reversible billiard ball (RBB) computer (tangentially, the problem was noticed by Landauer himself in the original paper, but I'll translate it into the RBB language). Suppose you place ideal compressive springs at all outputs of the RBB logical circuit to _actually_ reverse the computation. This works indeed, but then you _don't know the result of the computation!_ You can adiabatically _uncompute_ any computation as long as you don't read its result. If you want to know the result, i.e. whether or not a ball popped out from an output and bounced back, you have no option but to touch the system. Even a single photon interacting with a ball, such that you can detect whether there was a ball bouncing off the spring or not, transfers momentum to the ball, breaking the time symmetry of the RBB. A reversible computation is possible, _as long as you uncompute it all without reading out the result!_ The act of reading a result must increase the computer's entropy, even if the computer is reversible. This was one of Landauer's main results. His paper connected Shannon and Boltzmann entropy so clearly.
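    The _uncompute_ step can be sketched classically (a toy model of Bennett-style compute-copy-uncompute using reversible Toffoli and CNOT gates; the function names are mine):

    ```python
    # bits is a list of 0/1 values mutated in place; both gates are their
    # own inverses, so applying one twice undoes it.
    def toffoli(bits, c1, c2, t):
        if bits[c1] and bits[c2]:
            bits[t] ^= 1  # flip target iff both controls are 1

    def cnot(bits, c, t):
        if bits[c]:
            bits[t] ^= 1  # flip target iff control is 1

    bits = [1, 1, 0, 0]     # [a, b, scratch, output]
    toffoli(bits, 0, 1, 2)  # compute a AND b into the scratch bit
    cnot(bits, 2, 3)        # copy the result to the output bit
    toffoli(bits, 0, 1, 2)  # uncompute: scratch returns to 0
    print(bits)             # [1, 1, 0, 1]
    ```

    The scratch bit is cleaned up reversibly; the entropy cost only appears when a result is read out or a bit is finally erased.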

    • @JB52520
      @JB52520 2 years ago

      I hadn't read that Landauer was working from a time symmetry perspective. If you're going to run the universe backwards, there's no need for a special computer. Heat will flow back into a normal computer, cooling it and pushing electricity back into the wall with perfect efficiency.
      If time symmetry must actually remain unbroken and that's not a joke, there's nothing clear about this concept. Can you have a system that's not influenced by the probabilistic nature of quantum effects? Even if that's not a problem, reversible computing couldn't give off photons, including infrared, because they'd have to turn around and hit the computer precisely where and when they left. Any irreversible interaction with the environment would also be forbidden.
      This means reversible computing would require impossibly perfect efficiency in perfect isolation, ignoring quantum effects and spontaneously traveling backward in time, while being theoretically guaranteed to produce no results.
      I don't know anything anymore. This explains nothing "so clearly". At least you easily understand the incomprehensible and see utility in the useless. Screw it, everyone else is awesome and I'm wrong about everything. The more I try to learn or think, the more I realize it's a miracle I ever learned to tie my shoes. Being born was a terrible idea. Hey, I finally understand something.

    • @michaelharrison1093
      @michaelharrison1093 2 years ago

      This is along the same lines as the argument that I made in a comment I submitted.
      Reversibility does not eliminate the change in entropy.

    • @mihailmilev9909
      @mihailmilev9909 2 years ago

      @@JB52520 lmao well fucking said dude

    • @mihailmilev9909
      @mihailmilev9909 2 years ago

      @@JB52520 one of my favorite comments ever

    • @mihailmilev9909
      @mihailmilev9909 2 years ago

      @@michaelharrison1093 what was the context? Who's in ur pfp btw, some professor?

  • @nbooth
    @nbooth 2 years ago +4

    Minor quibble but the second law doesn't imply that entropy can only increase, only that it is more likely to. You can always get lucky and have total entropy go down. I'm sure it happens all the time.
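    A toy illustration of how rare that luck is (my model: N independent molecules, each equally likely to sit in either half of a box):

    ```python
    # Probability that all N molecules spontaneously gather in the left half,
    # i.e. a visible entropy decrease.
    for n in [10, 100, 1000]:
        print(f"N={n}: 2^-{n} = {2.0**-n:.2e}")
    ```

    Small dips in small systems are common; macroscopic ones are astronomically rare.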

    • @nmarbletoe8210
      @nmarbletoe8210 2 years ago +2

      indeed! And at the maximum entropy state, it is more likely that entropy will increase. Randomness is fun

  • @sachamm
    @sachamm 2 years ago +23

    The thought experiment I was given when learning about reversible computing referred to the elasticity of atomic bonds and how energy could be returned when a molecule returned to its original conformation.

    • @slevinchannel7589
      @slevinchannel7589 2 years ago

      Not many have the intellectual integrity to watch the harsh history coverage that Some More News did in the video 'Our Fake Thanksgiving'.

    • @david203
      @david203 2 years ago

      I don't see it. You would have to identify where the "atomic bonds" are in the logic circuits. Read Gotbread's analysis in another comment here. It makes more sense.

  • @saggezza-artificiale
    @saggezza-artificiale 2 years ago +6

    Very nice video. I'd just suggest clarifying that, if we consider real use cases, we'll probably never get around Landauer's principle, even if we developed perfect reversible computing. This is because reversible computing allows us to handle information without erasing it, but unless we rely on a device that can store an infinite quantity of information, sooner or later we'll need to delete some of it, and this can't be done in a reversible way.

  • @fios4528
    @fios4528 2 years ago +61

    Thank you for making this video. I've genuinely spent many a sleepless night thinking about the lifespan and logistics of a minimum energy computer for a future civilization that lives in a simulation.

    • @berniv7375
      @berniv7375 2 years ago +1

      Could you possibly do a video about data banks and how they are taking over the world? I was astonished to learn how much energy is required just to take a digital photograph and how much energy is required to cool down data banks. 🌱

    • @cate01a
      @cate01a 2 years ago +1

      if the universe is a simulation, why care for energy? like the people controlling the simulation could either supply it with nuclear or better energy that would last many universes' lifetimes. or more likely they'd speed up the simulation speed to like tree(10^^^^^^^^^^^^^^^^^^^^^^^^^10), so that maintenance is a non-issue because they (might, haven't done the maths, can you even calculate that?? no probably not, since iirc we only know a COUPLE digits of tree(3), so fuck that bigass number) would have already simulated more universes than we could literally comprehend in less than a microsecond.
      though that'd take fucking insane technology, probably impossible for the laws of physics, so then the simulator controllers would need to exist in a much more advanced, cooler universe, but then the persons controlling THEM would need even IMMENSELY more fucking power, like holy shit, not even a trillion bajillion shit tonne quatrillion lifecycles of the entire universe/existence could even come CLOSE to the amount of damn energy needed for that to happen
      so uh i guess simulation is outta the question unless like god is real and on his impossible computer he's playing the sims10 or some shit, though yknow that makes zero fucking sense too

    • @anywallsocket
      @anywallsocket 2 years ago

      @@cate01a A^^A = A^A

    • @paulmichaelfreedman8334
      @paulmichaelfreedman8334 2 years ago

      Funny thing is, if it turns out we're bound to our solar system (FTL completely impossible for complexly arranged matter), at least part of the planet's population will choose to live in a simulation created by ourselves by means of VR immersion, with whatever technology is invented for this purpose in the future. Games like EVE, WoW and Elite Dangerous are examples of current-day escape to a universe in which much more is possible than in the real one. If you have seen the movie Surrogates or Ready Player One you'll catch my drift.

    • @cate01a
      @cate01a 2 years ago

      @@paulmichaelfreedman8334 faster-than-light travel for complex matter (humans, ships, plants etc.) should be possible (not feasible (yet)) by manipulating spacetime, which is possible - by making the spacetime go faster than light rather than the matter, the matter is still travelling whilst not going against the laws of physics: th-cam.com/video/HUMGc8hEkpc/w-d-xo.html&ab_channel=PBSSpaceTime
      haven't watched that specific vid, but it probably explains the same concept

  • @esquilax5563
    @esquilax5563 2 years ago +2

    This is one of my favourite videos of yours. Most of them cover topics I have a fair bit of familiarity with, but your "where's the mystery?" intro made me realise I've barely thought about this at all

  • @davestopforth
    @davestopforth 2 years ago +11

    I'm struggling with the definition of information here. Surely information is based on our perception of states and conditions. Of course, for information to exist requires a transfer of energy at some point, and the information literally exists because it is a state, but it doesn't become information until we begin to perceive it.
    For example, a particle travelling through space can have its direction and velocity determined, but the particle doesn't care; it just exists, and it just weighs X whilst travelling at Y towards Z. That's it. For that to become meaningful information it requires some perception.
    Would it not be better to say it requires energy for the creation, transfer or manipulation of information?

    • @heinzerbrew
      @heinzerbrew 2 years ago +4

      Yeah, it would have been nice if she had defined "pure information", because the information I know about doesn't need energy simply to exist. You are correct that it is the creation, changing, and refreshing that requires energy. Not really sure how she doesn't get that.

    • @DrewNorthup
      @DrewNorthup 2 years ago

      Information at rest does actually have an energy component, but she'd be here all week explaining it.

    • @compuholic82
      @compuholic82 2 years ago

      @Dave "Would it not be better to say it requires energy for the creation, transfer or manipulation of information?"
      But that also means that information itself must be associated with an energy state, does it not? If the manipulation (i.e. change of state) requires or releases energy there must have been an energy level before the manipulation and an energy level after the manipulation. The difference in these energy levels is the amount of energy needed for the manipulation.
      In that way it is no different than any other measurement of energy. Take gravitational potential energy. If you drop a ball you can assign an energy level to the ball before and after the drop and the difference in energy levels is released as kinetic energy.

  • @mchammer5026
    @mchammer5026 2 years ago +1

    anyone notice that the truth table for the EQV+ gate at 13:40 makes absolutely no sense? It's just an exact copy of the inputs. In order to function anything like an XNOR gate, there would need to be one column reading 1 0 0 1 from the top down, representing the output of the XNOR, and then a second column can be added to that to make it "reversible".
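    For reference, here is one consistent way such a table can work (a sketch; the exact column layout in the video may differ): keep input a alongside the XNOR output, and the mapping becomes a bijection.

    ```python
    # (a, b) -> (a, a XNOR b); the XNOR column reads 1 0 0 1 from the top,
    # and all four outputs are distinct, so the inputs can be recovered.
    from itertools import product

    mapping = {(a, b): (a, 1 - (a ^ b)) for a, b in product((0, 1), repeat=2)}
    print(mapping)
    assert len(set(mapping.values())) == 4  # bijective => reversible
    ```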

    • @bubbacat9940
      @bubbacat9940 11 months ago +1

      She decided to draw arrows from the inputs to the outputs instead of lining them up, for some reason, but if you match them up with what the arrows show, it does work.

    • @mchammer5026
      @mchammer5026 11 months ago

      @@bubbacat9940 Ah of course you're right, how silly of me

  • @jesuss.c.8869
    @jesuss.c.8869 2 years ago +11

    Great video, Jade. Thank you for introducing such a complex topic in an easy and fun way. 👍

  • @bryanmorwood
    @bryanmorwood 4 months ago +1

    9:18 "it is sometimes said that the Landauer limit gives the energy value of information itself."
    Causing considerable confusion! Information itself has no energy value. It is processing information that requires energy (using logic gates etc., as per Jade's example). If you have a bit string 011000101011 with an information value, it has no energy value - it is constant, and can remain constant for years without requiring any energy.

    • @bryanmorwood
      @bryanmorwood 4 months ago

      One way of resolving the confusion is to equate the creation of information with the creation of order.
      The Landauer limit is then the minimum amount of energy required to create one bit of information (order).
      Interestingly, this means that the creation of information is analogous to life:
      these are both processes which employ energy to create order out of disorder.

    • @lepidoptera9337
      @lepidoptera9337 1 month ago

      @@bryanmorwood The problem with your poetry is that "order" is exactly the opposite of the actual definition of "information" both in physics and in computer science. The systems with the highest order contain the least information. The most ordered book has 80% completely white pages followed by 20% completely black pages. It has almost no information content other than the relative ratio of black to white pages. The book with the highest possible information content would look like a QR code page after page.

    • @bryanmorwood
      @bryanmorwood 1 month ago

      @@lepidoptera9337 Hi, I suspect your comment was a reply to someone else. I didn't mention 'order' at all, I was referring to the supposed energy value of information, which has nothing to do with order, and is null.

    • @lepidoptera9337
      @lepidoptera9337 1 month ago

      @@bryanmorwood You wrote "One way of resolving the confusion is to equate the creation of information with the creation of order. "
      No offense, but have you been drinking? You seem to suffer from memory loss.

    • @bryanmorwood
      @bryanmorwood 1 month ago

      @@lepidoptera9337 Reverting to the energy perspective, we can see that while it takes energy to create information, it also takes energy to create order (Maxwell's Demon). Interesting.

  • @trewaldo
    @trewaldo 2 years ago +8

    I went to watch this video with the preconceived notion about entropy's connection with energy and information. But this is different. The explanation made it better! Thanks, Jade. Cheers! 🥰🤓😍

    • @pandoorloki1232
      @pandoorloki1232 2 years ago

      Shannon entropy is not thermodynamic entropy--they are fundamentally different things.

  • @surya_11
    @surya_11 2 years ago +2

    14:48
    So the output zero needs to be a component of the output from the gate itself.
    So we're looking for a function that looks like f(x, y)=(x, f(x, y)).
    Such a function would produce an output containing infinitely many xs, which makes reversible gates impossible even theoretically.
    Because if f(x, y)=(x, g(x, y)) where f≠g, g would itself form a gate and produce heat.

    • @Santor-
      @Santor- 2 years ago +1

      Exactly. A computer built like this would serve all the inputs as the output, which defeats its purpose.

  • @havenbastion
    @havenbastion 2 years ago +9

    "Pure information" still exists as a pattern in a physical substrate and the movement of physical things is heat. There's no mystery unless you imagine pure information to be non-physical, which is an existential metaphysical category error.

    • @thelocalsage
      @thelocalsage 2 years ago

      yeah yeah yeah and really there are no exact circles in nature and technically pi is a rational number because there aren’t *really* uncountably many things to construct pi from and blah blah blah we’re dealing in mathematical abstraction here you don’t need to “well, actually” a really good piece of science communication by telling mathematicians they’re performing errors of metaphysics when it takes only a modicum of emotional intelligence to see these ideas are a platform for discussing purely mathematical systems

    • @pandoorloki1232
      @pandoorloki1232 2 years ago

      "unless you imagine pure information to be non-physical, which is an existential metaphysical category error."
      That is precisely backwards. Confusing information with physical implementations is a category mistake.
      P.S. The response is more of the same conceptually confused nonsense. Information is a mathematical abstraction ... it doesn't need a physical instantiation or a "physical need"--that confuses the colloquial meaning of information with the formal meaning.

    • @havenbastion
      @havenbastion 2 years ago +1

      @@pandoorloki1232 The information always has a physical instantiation or it couldn't be information. In order for it to exist in any meaningful sense there must be a physical need by which it may be acquired and manipulated.

    • @thelocalsage
      @thelocalsage 2 years ago

      @@havenbastion a circle always has a physical instantiation or it couldn’t be a circle. in order for it to exist in any meaningful sense there must be a physical need by which it may be acquired and manipulated.

  • @_kopcsi_
    @_kopcsi_ 2 years ago +1

    the statement at 6:25 is wrong.
    Entropy CAN decrease, even in a thermodynamically closed system, but it has a very, very small chance. The entropy law is valid only globally (in the long run). This is a probabilistic description, which is not surprising knowing that entropy, information and probability are closely connected to each other.

  • @hoptanglishalive4156
    @hoptanglishalive4156 2 years ago +21

    Disinformation gives me the chills but information warms my soul. Like Prometheus giving the fire of knowledge to humanity, science educators like shining Jade are doing vital work.

  • @IllIl
    @IllIl 2 years ago +5

    Thank you very much for the video! This is by far the best explanation of the Landauer limit that I've heard, it actually makes sense to me now. One question I still have is that this equivalence between information and entropy seems to be a purely theoretical limit that comes out of the math when we take the 2nd law as axiomatic and then "balance the entropy books" of a gate that has less information on the output than the input. But this seems to make an assumption that "information" obeys the 2nd law. In reality, the information of a logic gate is something that a human interprets. The reality is just physical conductors and semi-conductors. The physical reality of _any_ implementation of _any_ logic gate should always be sufficient to preserve the 2nd law. Or is that the point? That if we found the ultimate physical implementation of a logic gate, its waste heat would be equal to (or greater than) that limit?

  • @dandelion2948
    @dandelion2948 2 ปีที่แล้ว

    Thanks!

  • @elroyfudbucker6806
    @elroyfudbucker6806 2 ปีที่แล้ว +14

    Practically speaking, the heat generated within a microprocessor comes during the infinitesimally short time that the MOSFETs that make up the logic gates change state. When they are either conducting or not conducting current, they generate no heat. Multiply this extremely tiny amount of heat by the millions of MOSFETs that are changing state millions of times a second & you see the need to have a cooling fan on a heatsink mounted directly on the microprocessor & why it's not a good idea for your daughter to have her laptop on the bed.

    • @htomerif
      @htomerif 2 ปีที่แล้ว +3

      Minor correction: what you said was exactly true 20 years ago. Right now, with the size of junctions in gates, a good chunk of the heat generated by processors is from quantum tunneling (i.e. leakage current intrinsic to the gate). I say "gate" because in modern processors you don't have individual FETs. You have compound devices that serve the purpose of a logic gate without actually having a recognizable FET.
      The only way to stop that kind of power drain is to kill the power entirely to large sections of a processor, otherwise they constantly drain power, flipping bits or not.
      I realize I used the word "gate" both to mean the insulated physical gate that controls current flow in a semiconductor device and to mean "logic gate" here. Hopefully it's relatively clear which is which.

    • @KirbyZhang
      @KirbyZhang 2 ปีที่แล้ว

      @@htomerif does this mean older FET processes can produce more power efficient chips?

    • @htomerif
      @htomerif 2 ปีที่แล้ว +1

      @@KirbyZhang It's not so much about "older", it's about what they were designed for. OP isn't wrong that FETs changing state *used* to be the primary power drain, or that it is *still* a significant contributor to TDP. It's just that CPUs are now optimized for minimum power per computation, and that means gate leakage outweighs state changes.
      Unfortunately, this gets complicated. You probably know that past a certain point (I don't know, about 100nm?) the process nodes have nothing to do with size anymore. For example, the 7 "nm" node can't make features that are 7 nanometers. Its minimum feature size depends on the type of feature, but it's up around 50 actual nanometers. It's kind of a big fat lie.
      Ultimately, the answer to your question is: using current process technology, you can optimize for power efficiency by increasing gate sizes and increasing gate dielectric thickness, but it comes at the cost of more silicon real estate. They do use older processes and equipment for microcontrollers like TI's MSP430 series or Microchip's ATmega series, but it's older equipment that's been continuously developed for making these slow, extremely power-efficient processors.
      I guess it really depends what you mean by "power efficient". If you mean "minimum standby TDP" then yes. If you mean "minimum power clocked at 100MHz, with a modern design" then also yes. If you mean "absolute minimum cost per floating point operation" then no. And all of those are important considerations. A rackmount server might be going all-out for 5 years straight. A desktop processor will probably spend most of its life at a much lower clock rate than maximum, and a mobile (i.e. phone) processor will probably spend a whole lot of its life in standby at or near absolute minimum power.
      I hope that helped. It's hard to pack a whole lot of information into a small space that someone would theoretically read.
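      To put very rough numbers on the two contributions (every constant below is made up for illustration, not taken from any real process node):

        # Dynamic (switching) power vs. static (leakage) power, order of
        # magnitude only -- all values are illustrative.
        C_gate = 0.5e-15    # switched capacitance per gate, farads
        V = 0.8             # supply voltage, volts
        f = 3e9             # clock frequency, Hz
        alpha = 0.1         # activity factor: fraction of gates toggling per cycle
        n_gates = 1e9       # gate count
        I_leak = 30e-9      # leakage current per gate, amps

        P_dyn = alpha * n_gates * C_gate * V**2 * f   # the classic C*V^2*f term
        P_leak = n_gates * V * I_leak                 # flows whether bits flip or not

        print(f"switching: {P_dyn:.0f} W, leakage: {P_leak:.0f} W")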

  • @NotHPotter
    @NotHPotter 2 ปีที่แล้ว +10

    Makes sense. In order to maintain a system of any kind of order (and thus store useful information), the system internally needs to expend energy to resist the natural entropy that seeks to degrade that information. Some kind of input is gonna be necessary to resist that decay.

    • @heinzerbrew
      @heinzerbrew 2 ปีที่แล้ว

      we have lots of methods of storing information that don't require energy to maintain. Sure eventually entropy will destroy those storage devices, but we don't operate on that time scale.

    • @NotHPotter
      @NotHPotter 2 ปีที่แล้ว +1

      @@heinzerbrew Those more stable methods are also a lot slower, although even early CDs and DVDs are approaching the point where they're no longer readable. Books weather and wear. Ultimately, it doesn't matter, though, because even if you're going to quibble over time scales, there is still some necessary effort made to preserve data in a useful state.

  • @zscriptwriter
    @zscriptwriter 9 หลายเดือนก่อน

    Back in college in 1984 I created a circuit gate simulator on my TI99 computer that allowed the user to place circuit gates onto the screen, connect them, input values, and then compute the output. Given enough memory, the simulator could reverse-lookup the initial values.
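    The reverse-lookup trick is a few lines today: simulate every input combination, then invert the (one-to-many) output map. A toy version in Python (my sketch, obviously not the original TI99 program):

      from itertools import product

      # A tiny combinational circuit: out = NAND(a AND b, c).
      def circuit(a, b, c):
          x = a & b
          return 1 - (x & c)

      # Forward-simulate all 8 input states and build the reverse table.
      reverse = {}
      for a, b, c in product((0, 1), repeat=3):
          reverse.setdefault(circuit(a, b, c), []).append((a, b, c))

      print(reverse[0])   # [(1, 1, 1)] -- the only input state giving output 0
      print(reverse[1])   # the other seven input states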
    Thank you Jade for reminding me how much fun I had in college.
    You are an awesome person with an unlimited imagination.

  • @itsawonderfullife4802
    @itsawonderfullife4802 2 ปีที่แล้ว +4

    Insightful video.
    The "reversible logic gate" (referred to near the end of the video) is simply using (controlled and confined) classical scattering to compute. And we already know that scattering is a reversible process because it is directly dictated by the laws of mechanics (and conservation laws) such as Newton's 2nd law. And Newton's 2nd law is evidently time reversible (as are other fundamental laws of nature) because they are expressed as 2nd order differential equations (involving acceleration) which are invariant under a time-reversal transformation (=keep positions and reverse velocities).
    The question then again goes back to this: given that the fundamental laws of nature are time-reversible, how come we have a thermodynamic arrow of time and irreversible macro processes (such as a traditional irreversible logic gate operating)? A common answer is that irreversibility (=thermodynamic arrow of time) is an emergent feature of an ensemble of many particles (it's simply mathematics and probability).
    So the model "reversible logic gate" solves really nothing. It's just a toy model for a controlled Newtonian scattering, which we have known for hundreds of years. That does not tell us how to build computers which do not increase entropy.
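    The time-reversal point is easy to check numerically (a sketch of a 1D elastic collision of equal masses: run it forward, flip the velocities, run the same rule again, and the initial state comes back):

      def step(x1, v1, x2, v2, dt):
          # free flight, then elastic contact: equal masses swap velocities
          x1, x2 = x1 + v1 * dt, x2 + v2 * dt
          if x1 >= x2:
              v1, v2 = v2, v1
          return x1, v1, x2, v2

      state = (0.0, 1.0, 5.0, -1.0)      # x1, v1, x2, v2
      for _ in range(100):
          state = step(*state, dt=0.1)

      x1, v1, x2, v2 = state
      state = (x1, -v1, x2, -v2)         # time reversal: flip the velocities
      for _ in range(100):
          state = step(*state, dt=0.1)

      print(state)   # ~(0.0, -1.0, 5.0, 1.0): starting positions recovered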

  • @robbeandredstone7344
    @robbeandredstone7344 2 ปีที่แล้ว +1

    9:40 For anyone wondering, that is approximately 75 eV, to put it into perspective.

  • @atrus3823
    @atrus3823 2 ปีที่แล้ว +11

    Great video, as always! Based on my absolutely zero research, my first thoughts are that though the entropy change (assume lossless collisions) of the gate is 0, half of the output energy is not used in the final calculation. There would need to be a way in the system as a whole to make use of the waste balls, or else you're right back where you started.

    • @waylonbarrett3456
      @waylonbarrett3456 2 ปีที่แล้ว +4

      This has been considered. See the Szilard engine.

    • @atrus3823
      @atrus3823 ปีที่แล้ว

      Thanks for the replies. Someday, I'll hopefully find time to follow up on this.

  • @vishalmishra3046
    @vishalmishra3046 2 ปีที่แล้ว

    3:45 Largest prime under 1 billion is *999,999,937* . Computers are indeed amazing in their ability to answer unexpected questions with such low effort, cost and time.
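    Easy to double-check with a brute-force sketch (trial division up to the square root is plenty fast at this size):

      import math

      def is_prime(n):
          if n < 2 or n % 2 == 0:
              return n == 2
          return all(n % d for d in range(3, math.isqrt(n) + 1, 2))

      n = 10**9 - 1
      while not is_prime(n):   # walk down from 999,999,999
          n -= 1
      print(n)                 # 999999937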

  • @AdrianBoyko
    @AdrianBoyko 2 ปีที่แล้ว +4

    What a great presentation of these topics. This is Carl Sagan level exposition!

  • @Snowflake_tv
    @Snowflake_tv 2 ปีที่แล้ว

    8:45 So, the number of input states that hasn't actually occurred transforms into heat entropy, which means it must consume a certain quantity of energy? To keep entropy from decreasing?

  • @danielclv97
    @danielclv97 2 ปีที่แล้ว +7

    I love the video, but I'd like a follow-up, maybe with Maxwell's demon. Like, if this can solve the need for energy consumption during information manipulation, what stops Maxwell's demon from breaking the laws of thermodynamics? Or is it still physically impossible to store information with no expense of energy? And if yes, then isn't the billiard-ball computer useless, in the sense that it will consume the same minimum amount of energy? You can't use information if you don't store it to interpret it, after all. You'll have to at least read the positions of the balls after they pass through the computer.

    • @geraldsnodd
      @geraldsnodd 2 ปีที่แล้ว +1

      Maxwell's Demon is a cool topic in itself :)
      Check out Eugene Khutoryansky's video.

    • @christopherknight4908
      @christopherknight4908 2 ปีที่แล้ว +6

      If I'm not mistaken, Maxwell's demon has to expend energy to erase information. Would reversible computing eliminate this requirement? Perhaps information storage is just a different problem that would need to be solved, in addition to the energy usage of computing.

    • @_kopcsi_
      @_kopcsi_ 2 ปีที่แล้ว +1

      @@christopherknight4908 yes, information storage is totally different from computing, but interestingly the two problems have the same source:
      "Landauer's principle is a physical principle pertaining to the lower theoretical limit of energy consumption of computation. It holds that "any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information-bearing degrees of freedom of the information-processing apparatus or its environment"." -- en.wikipedia.org/wiki/Landauer%27s_principle
      storage of information is static, manipulation of information is dynamic. these two have totally different nature. but when we erase information, we actually manipulate it. erasure is dynamic. that's why the same principle can provide a solution for the problem of Maxwell's demon and an explanation for the fundamental energy cost of irreversible logical operators.
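      to put a number on the erasure cost: the floor is k_B*T*ln(2) joules per erased bit, so even wiping a whole gigabyte is absurdly cheap at the theoretical limit (a sketch, assuming T = 300 K):

        import math

        k_B = 1.380649e-23    # J/K
        T = 300.0             # K (assumed)
        bits = 8 * 10**9      # one gigabyte

        E = bits * k_B * T * math.log(2)
        print(f"{E:.1e} J to erase 1 GB")   # ~2.3e-11 J; real hardware pays
                                            # many orders of magnitude more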

    • @_kopcsi_
      @_kopcsi_ 2 ปีที่แล้ว

      @@hyperduality2838 dude smoke less weed

  • @Kimwilliams45
    @Kimwilliams45 2 ปีที่แล้ว +2

    Thank you. I had never heard about the Landauer limit even though I was a physics student. Very good explanation.

  • @douggale5962
    @douggale5962 2 ปีที่แล้ว +5

    The fundamental component of modern computers is the field effect transistor. They are metal oxide semiconductors, so they are known as MOS transistors. There are positive and negative constructions, the two being suited for pulling (voltage) up and pulling down respectively. Those two constructions are "complementary", so the fundamental component is, more specifically, the CMOS FET. You switch a FET by charging or discharging its gate layer. The power used by a processor is the dissipation due to the resistance of conductors transferring charge in and out of the gate. Holding a 1 or a 0 state does not consume power; only the changes dissipate significant power.

    • @abdobelbida7170
      @abdobelbida7170 2 ปีที่แล้ว

      Is that what "switch losses" means?

    • @douggale5962
      @douggale5962 2 ปีที่แล้ว +2

      @@abdobelbida7170 Yes. Usually, switching losses primarily refer to the time when a transistor is partially on, and a voltage drop across the source and drain (the switched path) dissipates power. In a switching power supply with large currents, the partially-on pulse of losses is large. That's one huge transistor with one huge pulse of losses.
      In a CPU, there are no large currents going through any individual transistor, but an enormous number of gates being charged or discharged simultaneously, so an enormous number of small losses sum up to a large loss that would be correctly called switching loss.
      All of the power that goes into a CPU is lost, all of it becomes waste heat, eventually. Another way of looking at it is, CPUs use no power, other than the power they need for the losses, once they get going.
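      For scale, the energy per transition is roughly the energy parked on the gate capacitance, and it dwarfs the Landauer floor (illustrative numbers, not a specific process):

        C = 1e-15     # gate capacitance, farads (illustrative)
        V = 1.0       # supply voltage, volts

        E_switch = 0.5 * C * V**2    # ~5e-16 J dissipated per transition
        landauer = 2.87e-21          # k_B*T*ln(2) at 300 K, in joules
        print(f"{E_switch:.1e} J per switch, "
              f"~{E_switch / landauer:.0f}x the Landauer limit")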

  • @allanwrobel6607
    @allanwrobel6607 ปีที่แล้ว +1

    A fundamental concept explained so clearly. You have a rare talent; keep going with these.

  • @Mike500912
    @Mike500912 2 ปีที่แล้ว +4

    1. Microprocessors do output energy via driving external ports, used for driving other subcircuits.
    2. A logic gate isn't a fundamental building block. It itself is made up of transistors, a more fundamental building block.

  • @gbear1005
    @gbear1005 2 ปีที่แล้ว +1

    Richard Feynman (the Nobel physicist / father of QED) gave talks on reversible computing and energy use

  • @Zeero3846
    @Zeero3846 2 ปีที่แล้ว +4

    I got so excited that I actually managed to guess your topic of reversible computing. I had only recently learned about it through some crazy crypto proposal for an energy-backed stablecoin, which fundamentally relied on the idea of reversible computing. In particular, it suggested a way to transfer energy over the internet. A computation doesn't necessarily have to be located on a single computer; it could happen over the network, if you imagine the network as one giant computer. In performing the computation, the input side would generate heat while the output side absorbed an equivalent amount, and you essentially have a new kind of heat pump. Of course, this sounds way too easy to be true, and that's because it is, but it's still definitely cool to think about. It's somewhat reminiscent of Nikola Tesla's wireless energy tower, except perhaps a little less scary, as it's not trying to transmit energy directly through some physical medium, like the earth and sky, but rather over our telecommunications infrastructure as data.

  • @itsawonderfullife4802
    @itsawonderfullife4802 2 ปีที่แล้ว +1

    Great video as always. One little reminder though:
    At 7:53 you also have to consider the power plant (and include it in the "system") that makes the electricity used to produce the work (=energy for the refrigerator's compressor) needed for the refrigerator to reverse and decrease the entropy of its interior, in order to say that the 2nd law is maintained overall and the entropy of a closed system (most often) increases.

  • @landsgevaer
    @landsgevaer 2 ปีที่แล้ว +4

    After having spent so much time thinking about how to carefully explain an unintuitive topic, and managing to produce such a wonderful video, it must be so disappointing to read so many comments here that show nothing has landed.
    Fortunately, there are a few exceptions too.
    Rest assured that it is the YT audience that is lacking, not the producer...
    😉

    • @En_theo
      @En_theo 2 ปีที่แล้ว

      Well, she kinda misunderstood some concepts herself, since she said @6:24 that entropy "never" goes down, which is false. It tends to increase, but there is still a possibility that it goes down. If the universe is eternal, then that possibility becomes a certainty and you find yourself with renewed energy again (after a veryyyy long time though).

    • @landsgevaer
      @landsgevaer 2 ปีที่แล้ว

      @@En_theo That may be a theoretical note to make, but I wouldn't say that makes the statement "wrong" for this audience.
      We can hopefully agree that many comments are much worse... 😉

    • @En_theo
      @En_theo 2 ปีที่แล้ว

      @@landsgevaer
      People are entitled to their comments, as long as they don't pretend to be physicists bringing new knowledge. I'm not blaming her; it's a common misconception that entropy "always" increases. A simple sentence like "it tends to, but the details are for another video" would be nice.

  • @ThatJay283
    @ThatJay283 11 หลายเดือนก่อน

    6:30 so, this would be important because in a computer, entropy can go down: it can simply clear chunks of memory to zero, load programs, etc. so in order to decrease the entropy in the computer, an increase in entropy must happen elsewhere (e.g. by fissioning uranium atoms in a power plant) to counteract it. that is the theoretical limit.

  • @davidh.4649
    @davidh.4649 2 ปีที่แล้ว +8

    So Jade ... does the Landauer Limit apply to synapses in the human brain as well? 😁 Good thought provoking video as always!

    • @gotbread2
      @gotbread2 2 ปีที่แล้ว

      It does. It's a fundamental consequence of information erasure.

  • @TejasIsAmazing
    @TejasIsAmazing ปีที่แล้ว

    8:01 The second law of thermodynamics is about thermodynamic entropy, but we are talking about information entropy for the logic gates. Would the law still apply? Are the two entropies connected in some way?

  • @paulthompson9668
    @paulthompson9668 2 ปีที่แล้ว +6

    Hi Jade, this video seems like a wonderful lead-in to a video on how quantum computers will require less energy due to the reversibility of their gates.

    • @dodokgp
      @dodokgp 2 ปีที่แล้ว +1

      Not unless the cryogenic operation is obviated.

    • @paulthompson9668
      @paulthompson9668 2 ปีที่แล้ว +1

      @@dodokgp So let's get cracking on quantum computing at room temperature!

  • @LMProduction
    @LMProduction 2 ปีที่แล้ว

    The Landauer limit can be circumvented by noting that the entropy cost can be paid in other conserved quantities, not just energy. A paper from 2009 (and a few more papers in the years since) by J. Vaccaro and others showed that in principle you can transfer the k_B*ln(2) of entropy per erased bit to an angular momentum reservoir, thereby inducing no energy loss. We're currently working on a qubit operation to experimentally show a single-bit transfer of thermal energy from the environment into a spin bath.

  • @MrAttilaRozgonyi
    @MrAttilaRozgonyi 2 ปีที่แล้ว +3

    I remember reading somewhere that information actually has weight. In other words, a hard drive full of data is measurably a tiny bit heavier than that same drive when erased. I wish I could remember the actual reason why. Maybe that could make an interesting video? All the best! ☺️☺️

    • @Snowflake_tv
      @Snowflake_tv 2 ปีที่แล้ว

      I'm interested in that, too.

    • @markricher2065
      @markricher2065 2 ปีที่แล้ว

      I was thinking the very same thing 🤔

    • @Snowflake_tv
      @Snowflake_tv 2 ปีที่แล้ว

      Is information then mass???

    • @MrAttilaRozgonyi
      @MrAttilaRozgonyi 2 ปีที่แล้ว +1

      I have a vague recollection that it *may* be limited to the older style HDD’s which have magnetic platters and it may not be the case in SSD drives… but that’s all I can remember. :) I’d love to know more about this.

    • @RobBCactive
      @RobBCactive 2 ปีที่แล้ว +2

      As HDD platters' coating is magnetised when the drive heads set bits, and the drive sits in the Earth's magnetic field, it's possible the measured weight changes a little from the added magnetic force, but that is not gaining or losing mass.

  • @username-mu8yf
    @username-mu8yf 2 ปีที่แล้ว

    13:42 incorrect table: the 1s in the output do not correspond to equal inputs. Seems like the output is copy-pasted from the input.

  • @micro2743
    @micro2743 2 ปีที่แล้ว +4

    "Pure Information" (Data) can be stored on a Floppy Drive/Hard Drive/CD/DVD/Thumb Drive/SSD/etc... and it requires no energy and gives off no heat. Moving information requires Energy and produces Heat! Data stored in active memory must be refreshed and also requires Energy and produces Heat, and of coarse physical hard drives spin for a while even when they are idle. You covered logic gates, but it should also be noted that often computers are just moving data i.e. copying an image from a web server to your local hard drive, then to active memory, and eventually to your graphics card so you can see it on your screen.

    • @StefanNoack
      @StefanNoack 2 ปีที่แล้ว +1

      Well, the floppy media requires mass, and that is equivalent to energy ;-)

    • @Rithmy
      @Rithmy 2 ปีที่แล้ว

      And the wear and tear of the floppy media can be said to equal energy loss. Almost like radiation, on a large scale.

    • @micro2743
      @micro2743 2 ปีที่แล้ว

      @@StefanNoack A mass at rest is potential energy and I don't think it produces heat.

    • @micro2743
      @micro2743 2 ปีที่แล้ว

      @@Rithmy I am talking about cold storage, and I know a floppy can last 20+ years. I don't think a CD-R/DVD-R lasts that long. Retail CDs/DVDs are usually pressed and degrade much more slowly. Since energy cannot be destroyed, you are correct: the magnetic field has to be going somewhere else as the floppy degrades, theoretically creating a small amount of heat. Would we even be able to measure it?

    • @Rithmy
      @Rithmy 2 ปีที่แล้ว

      @@micro2743 Doesn't mass alone create heat through pressure?

  • @vansf3433
    @vansf3433 2 ปีที่แล้ว

    Whenever you have current running through a logic gate or gated latch, an "on" signal comes out of an output, corresponding to 1; when there is no current at an output, the lack of a signal is interpreted as "off", or 0. And when you have a bunch of gated latches arranged in matrices to keep such bits of 1 and 0, you have a computer memory.

  • @bntagkas
    @bntagkas 2 ปีที่แล้ว +4

    i'm just a stupid highschool dropout, but to me it seems all of these energies are really kinetic energy. whether it moves a car, manipulates information or produces heat, if you zoom in you are moving atoms in a way that benefits you: you move atoms to move the car = kinetic, you move atoms/electrons/photons to manipulate information, you move atoms etc. to produce heat. so it seems to me all kinds are really one, kinetic

  • @commandershepard6189
    @commandershepard6189 2 ปีที่แล้ว

    Cool! Good video... Some people don't understand that 0 is an open circuit while 1 is a closed circuit, meaning 0 is the absence of a bit while 1 is a bit. In microchips, a 0 still has work or force being applied, due to heat transfer from the nearby transistors and the opening of that transistor... this equates to energy loss in the application. Yeah, work applied but no work output. Cool stuff. The problem: we'll never get around thermodynamics.

  • @MemphiStig
    @MemphiStig 2 ปีที่แล้ว +1

    Jade's just as good as any other science channel out there, and she's more charismatic (and much cuter!) than most, if not all. She really deserves more subs and views.

  • @granduniversal
    @granduniversal 2 ปีที่แล้ว

    This approach to entropy is like making a database for an application. You make that based upon the business rules. When you are trying to find out the business rules, you will have to deal with a lot of uncertainty. That uncertainty is not unlike the challenges that surround trying to figure out chaos math. Anyway, the more uncertainty, the more entropy. And the natural state for most people to be in is that of ignorance. When you are figuring out the business rules you will find varying interpretations, and varying ideas about what kind of a world those can go in. That part is not unlike how consciousness is not a broad spectrum thing, but rather about what we can focus on at any one point in time. In the end you hope to get to the truth. The truth is the only "state" that can answer all of the questions, logically lead to the end result desired for a particular circumstance.
    This thing about getting at the business rules, it is important as a metaphor. It speaks about how you can try to do math between apples and oranges sometimes, but we want there to be an equivalency. Well, there will be, if you borrow from the outside. I don't mean using things like big data to help you find the business rules. That will most likely result in scope creep, as it points out to you more ways to make differentiations, flinging you into what could be before you understand what is. I mean wrestling with it by using as little information to deduce what you have to as possible. If there is an equivalency, then the correct answer should be surrounded by a lot of falsehood, that is interest in remaining ignorant, or potential other path traces, or heat. Because this is hard work, and nobody should go around saying it isn't. If you are right about knowledge, or wisdom, there will always be a lot of people going around telling you that you are wrong, in other words. That's no reason to give up.
    The example also points out something about linguistics that I find fascinating. It only affects the consistency of one side of the two sides that should be active to gain effective communication. But we see from things like animal training that is really all that is necessary to achieve success, for the master. It is about reward. And reward is about what an object needs or wants, not what the master demands it needs or wants. Making databases, however, is about making it easier for everybody to do things repeatedly, with less error. Does this lesson tell us that is achievable with only one side being consistent? You know, with the system being built correctly? I guess we find that out when we use a good app versus a bad one?

  • @Dyslexic-Artist-Theory-on-Time
    @Dyslexic-Artist-Theory-on-Time 2 ปีที่แล้ว

    One-way to think of ‘information’ is that the spontaneous absorption and emission of light photon ∆E=hf energy is forming potential photon energy into the kinetic energy of electrons. Kinetic Eₖ=½mv² energy is the energy of what is actually happening. This process forms an uncertain probabilistic future that is continuously coming into existence with the exchange of photon energy. The wave particle duality of light and matter in the form of electrons is forming a blank canvas for us (atoms) to interact with; we have waves over a period of time and particles as an uncertain future unfolds. The mathematics of quantum mechanics represents the physics of time with classical physics represents processes over a ‘period of time’ as in Newton's differential equations. In this theory the mathematics of quantum mechanics represents geometry, the Planck Constant ħ=h/2π is linked to 2π circular geometry representing a two dimensional aspect of 4π spherical three-dimensional geometry. We have to square the wave function Ψ² representing the radius being squared r² because the process is relative to the two-dimensional spherical 4π surface. We then see 4π in Heisenberg’s Uncertainty Principle ∆×∆pᵪ≥h/4π representing our probabilistic temporal three dimensions life. The charge of the electron e² and the speed of light c² are both squared for the same geometrical reason. This process forms a continuous exchange of energy forming what we experience as the continuous passage of time.

  • @ramuk1933
    @ramuk1933 2 ปีที่แล้ว +1

    Could you extract energy from the computation if you had more inputs than outputs?

  • @hynol
    @hynol 2 ปีที่แล้ว

    6:19 - closed system. If you pump energy into a system, its entropy can go down.

  • @TechnoMageB5
    @TechnoMageB5 2 ปีที่แล้ว

    2 things:
    1) With current electronic computers, there is no way, even with reversible logic gates, to make a net-zero-energy computer. The architecture itself requires energy to operate, so the laws of entropy cannot be circumvented. Quantum computing still needs to solve the architecture problem, but at least the computing itself could theoretically work.
    2) This video reminds me of a discussion at work I had with a girl years ago, who was a college student working there at the time, about logic states. She was curious as to how computers "think". I started with a similar example, the light being on or off (while switching the light in the room on and off as an illustration), that by storing and comparing millions of on/off states, computers evaluate conditions for us, and that those "light switches" are represented by the presence or absence of voltage - more basic than this video. Next time I visit that location, she tells me that literally a few days after our talk, her professor starts talking about the same thing, and here she is in class smiling to herself the whole time thinking "I know where this is going..."

  • @key_bounce
    @key_bounce 2 ปีที่แล้ว

    One big problem: There are three-input, three-output versions of ALL the normal logic gates. These output the same inputs, just "switched".
    So you have the same number of input states as output states.
    (Checking google / wiki, one example is the Fredkin gate: if A is true, swap B and C; if A is false, leave B and C unchanged. This has been implemented in quantum computers; a 5-qubit computer can do a full add of two bits with no information loss. There may be other examples as well -- I thought there were but cannot find them.)
    As far as we can tell, the energy cost in this case is dependent entirely on the time factor -- since you care about the output on one line (the one that goes to the next gate), it takes time (and power) to change the state (voltage, current, or whatever is being used). Or put another way -- at some point, all your spare output lines have to hit a resistor and then the ground line, and that resistor has to generate heat (how much depends on R and I, which in turn means speed.)
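    A sketch of that Fredkin behaviour in Python, showing the gate is its own inverse, so no two input states ever merge:

      from itertools import product

      def fredkin(a, b, c):
          # controlled swap: if A is 1, swap B and C; otherwise pass through
          return (a, c, b) if a else (a, b, c)

      # Applying the gate twice returns every state: the map is a bijection,
      # so no information is destroyed.
      for state in product((0, 1), repeat=3):
          assert fredkin(*fredkin(*state)) == state
      print("reversible on all 8 states")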

  • @lensman7519
    @lensman7519 2 ปีที่แล้ว

    Great content, ty.
    Nice convergences with Planck length, spooky connectivity, entropy and quantum computers. Thx for the headache.

  • @TheViolaBuddy
    @TheViolaBuddy 2 ปีที่แล้ว

    Is the Landauer limit why quantum logic gates have to have the same number of output and input bits? Or is that a different restriction that happens to have the same solution of adding bits to make the logic reversible?

    • @KohuGaly
      @KohuGaly 2 ปีที่แล้ว +1

      Yes, that is the reason. The irreversible classical gates leak the extra quantum information as heat. What exactly heat is on a quantum level is a very interesting subject.

  • @gsittly
    @gsittly 2 ปีที่แล้ว

    Channel: ♥️
    Topics: ♥️♥️
    Explanations: ♥️♥️♥️
    Jade: ♥️♥️♥️♥️
    Thank you for this great stuff you share with us 😊

  • @TheHallucinati
    @TheHallucinati 2 ปีที่แล้ว

    I remember how my friends and I were tripping on acid around a bonfire during one of our camping excursions, and someone (can't remember who it was exactly) asked something like: "Can every question theoretically be answered at some point?". And somehow it steered the conversation to the point where I was like: " - We don't have enough energy to answer some of our questions" , and it just sounded hilarious, like "we're too tired to answer your question now, go away". So I tried to explain what I meant. Unfortunately I quickly realised that although I was in no danger of running out of energy, my verbal eloquence was very quickly exhausted. I always envied people who not only could explain things like that in a clear and concise form, but actually keep the listeners interested.

  • @glenncurry3041
    @glenncurry3041 2 ปีที่แล้ว +1

    What an interesting approach, from a strictly information-physics angle! Being old, my formal electronics education was discrete components. ICs were first coming out and the CPU did not exist. IOW what you show as a symbol for a logic gate was, to me, originally a batch of steering diodes and mono- or bi-stable multivibrators. To me the total energy loss for that gate is based on having to overcome the barrier voltages at each of the solid-state devices' internal junctions, e.g. 0.7 V for a silicon PN junction. Plus additional components, wiring,.... Now cascade them... and suddenly multiply that by billions on one chip!
    Even with today's abilities to cram massive numbers of these onto one chip, they are each still discrete devices and junctions with those needs. While you are working at a completely different level. And I see a great gap between them yet.

  • @mrnobody2873
    @mrnobody2873 2 ปีที่แล้ว

    I have recently started questioning the Boltzmann use of entropy. The number of possible states is the issue for me. Looking at iterative computations like you would use in cellular automata, where you have a simple ruleset giving rise to a complex system, which then has a basic ruleset coalesce and produce another layer of complexity, and so on, would seem to indicate that the entropy calculated at any point downstream of the primary iteration would be different from one calculated before that iteration.
    The flaw here is the assumption of a primary iteration. By formulating the rules a closed system will start with, you inject an infinite number of pre-existing layers of computation, meaning, entropy is a relative comparison between infinitely complex and unknowable states making it ultimately useless. Why?
    At the moment I start the first iteration, I could have created many different permutations based on the complexity of my system and all of the systems of complexity that go into making me. At the start, I have an already infinitely complex system of possible states, only one of which is being evaluated. Without knowing all of my possible states, it is then impossible to get a true evaluation of entropy. This reduces entropy to a single bit output always evaluating to 1.

  • @phatrickmoore
    @phatrickmoore 2 ปีที่แล้ว +1

    Since at some point information will be extracted from any computing system, I believe the Landauer limit cannot be evaded. Reversibility seems possible if the system remains undisturbed, but otherwise, the signal must at some point either be duplicated or wholly returned.

    • @compuholic82
      @compuholic82 2 ปีที่แล้ว

      Yes and no. You are right that, in some sense, it is not possible to evade the Landauer limit.
      A good way to see that is to look at mathematical calculations that are fundamentally not invertible. These are calculations that destroy information. To finish such a calculation you might need many computational steps. During the computation you can actually cheat the fundamental irreversibility of the calculation by using reversible logic. The cost of this is that you will need to add "waste bits" to the output which are not part of the actual answer.
      So you are actually deferring the fundamental irreversibility of the calculation till the end. After all, you are only interested in the answer and not the waste bits. And by discarding the waste bits you are then destroying information and the Landauer limit will hit.
      But not all of these "waste bits" are necessarily uncorrelated. So even if you are discarding N bits of data, you might actually not be discarding N bits of information. And that means that if the computation was implemented in non-reversible logic, the computation was less efficient than it needed to be.

    • @phatrickmoore
      @phatrickmoore 2 ปีที่แล้ว

      @@compuholic82 hi, thank you for your thought-provoking response!
      I would need to think about some examples of your arguments for mathematical irreversibility and the ability to cheat it with reversible logic. For one, I know there are both reversible and irreversible mathematical operations. A reversible one is adding a constant to a number. An irreversible one is taking the floor of a number (or any modulus-type operation). I also know that in any finite-memory digital system there is of course some inherent irreversibility due to precision losses. I'd have to think about whether any of this matters :)
      But! I think your thoughts about waste bits is a big part of this. Ultimately, it is believed that information is conserved at the quantum level. So, there is no information lost ever. The situation Landauer’s limit applies to then seems to shift to the energy equivalence specifically related to the amount of extracted information.
      Then this becomes super interesting, because we would need to pin down exactly what extracted information means. Likely this would come down to the information that is causally linked to future events of the extracting system (that is, in the past light cone).
      I guess this is obvious, but the Landauer limit seems like a direct analog of the thermodynamic limits on the work you can extract from a system (in terms of energy). This may have already been stated in the video or somewhere!! But I think, in general, the Landauer limit is precisely about the limits on the extraction of information.

  • @whimsy5623
    @whimsy5623 2 ปีที่แล้ว

    5:36 What is the ordered list containing the states of each and every lightbulb within the system?

  • @jolodojo
    @jolodojo 2 ปีที่แล้ว +1

    Great video. What I do not understand, however, is that the wish of every physicist seems to be to make something irreversible reversible. It is like the quest for a perpetuum mobile all over again. In my mind this is the true waste of energy. Think what possibilities there would be if everybody started accepting that there is no reversible universe. There is no going back in time, only forward.

  • @kozmizm
    @kozmizm 2 ปีที่แล้ว

    In classical computing some information is destroyed(lost) when doing a calculation/computation, but if you do things cleverly, you can create an odd symmetry where the calculation done forward in time is also essentially the same as the calculation done backward in time, also referred to as reversible computing, in which no information is lost, even though it requires that many more states are preserved, and the result is sort of like the same thing as the initial state in the first place, adding quite a lot of difficulty into the design of the machine. Yet, in a superconducting environment, this is 100% efficient, theoretically, leading to zero energy wasted. That is, if my understanding of all that is correct. Hence, it is theoretically possible to have a special quantum computer that does not waste any electricity. Instead of destroying information by starting with multiple pieces of information that combine down into fewer bits of information, we preserve all the information during the transformation using our special knowledge of physics, and the number of inputs equals the number of outputs and everything is able to be deterministically reversed. Even though we only care about a portion of those outputs, no energy was destroyed. Every bit of energy has been transformed but preserved without any destruction. It's hard to wrap your head around, but it seems to be a sound idea. Kudos to whoever came up with that idea! Essentially it preserves everything so there is basically zero entropy and therefore no loss. Even if there are minuscule amounts of entropy, the circuit is so efficient that the energy loss will be almost zero compared to classical computing. You can't do this using regular transistors. You have to build this symmetry into the physical design of the system itself, using the natural properties of nature to outsmart entropy. It's not a simple problem, but in quantum gates using quantum computing, it is theoretically possible, I'm using that word again, "theoretical". Honestly, from a purely theoretical point of view it makes sense. Just look at superconductors and we see that entropy can be overcome in certain circumstances. Now, imagine if we can do superconducting materials combined with form of super-symmetrical logic gates to overcome entropic waste. it's like taking superconducting to a meta level. It's the supersymmetry hypercube, man. Like, totally dude! Far out!

  • @michaelzumpano7318
    @michaelzumpano7318 2 ปีที่แล้ว

    I have a question. Is it wrong to say it is the “observation” of the state of the system (which reduces our uncertainty) that results in heat? In the case of the first binary gate, the observation is represented by the single output of the discriminator that tells us the state. The billiard ball gate still has outputs, which will require observation to reduce our uncertainty. Wouldn’t there still be a decoherence event? Our uncertainty collapses. The billiard ball computer has three separate outputs, because we are evaluating which of three states the system possessed. In order to answer the question of which state the system was in when it was measured we need to use the three outputs to determine a single answer. That would require two comparisons. So wouldn’t that produce two units of heat? So I’m not sure the billiard ball gate does anything.

    • @KohuGaly
      @KohuGaly 2 ปีที่แล้ว

      Yes, that would be true if you actually observe the state. If you just wish to feed the state into the next logic gate, no observation is being performed. The observation is performed at the output of the whole circuit. All the stuff in between input and output can perform an arbitrary amount of computation and still use 0 energy.

  • @cmilkau
    @cmilkau 2 ปีที่แล้ว

    If you want to use reversibility to eliminate the entropy going down, you have to make the gates produce their outputs by overwriting their inputs (e.g. like quantum computers do). If you have an extra output wire, and the gate changes it to the "correct" value, no matter what value it had before, that gate reduces entropy by 1 ("output wire has any value" to "output wire has a definite value"). The only way to compensate for that would be to "forget" (overwrite) one of the input values. The billiards ball machine does that, as the balls leave the input pipes. Quantum computers also do that, as the inputs can no longer be measured after they have been processed.

  • @helicalactual
    @helicalactual ปีที่แล้ว +1

    How can you not model the system in all permutation states prior to the NAND operation and figure out which it was? It seems short-sighted to say that it's irreversible or "unknowable" when you can model both system states perfectly fine. The bottom line is that there is information stored in the position operator, and given an "ideal" system, you could trace the "steps" of other parts of the system and re-derive, not assume, where the constituent parts of the system are located, and re-derive all the particles, including the one that obfuscated the system, the step before the NAND.

  • @cmilkau
    @cmilkau 2 ปีที่แล้ว

    Hmm, I don't think it matters whether the gate has more inputs or outputs. The signal at the inputs is still there when the output is calculated. If you think that the output is known after the calculation, but not before, entropy in the circuit always goes down by the number of outputs, no matter how many inputs. Usually, as stated earlier in the video, the Landauer energy is associated with changing one bit (e.g. a signal on a wire), or more precisely, *overwriting* it (throwing the previous value away).

  • @atomicsoham4864
    @atomicsoham4864 2 ปีที่แล้ว

    Hi Jade, I had one QUESTION 🙋: at 6:22 you talked about the relation of entropy to the 2nd law of ThermoD. However, is information entropy similar or related to heat, i.e. physical, entropy? I thought information entropy was just an analogy to explain disorder in information. Does that mean it will follow the ThermoD laws too?

  • @NikolaosSkordilis
    @NikolaosSkordilis 2 ปีที่แล้ว

    Very informative article, but what about Fredkin and Toffoli gates? They are both reversible (and universal) 3-bit input | 3-bit output gates, differing slightly in principle. I believe there are suggestions for both classic and quantum variants.
    Fredkin and Toffoli's billiard-ball computer is related (and more easily understood, since it's about macro objects), but it's rather less rigorous.
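    For reference, the Toffoli gate is just as easy to sketch: it is its own inverse, and fixing its third input to 1 turns it into a reversible NAND:

      from itertools import product

      def toffoli(a, b, c):
          # CCNOT: flip the target c only when both controls a and b are 1
          return (a, b, c ^ (a & b))

      # Its own inverse, hence reversible on all 8 states...
      for s in product((0, 1), repeat=3):
          assert toffoli(*toffoli(*s)) == s

      # ...and with c fixed to 1, the third output is NAND(a, b).
      for a, b in product((0, 1), repeat=2):
          assert toffoli(a, b, 1)[2] == 1 - (a & b)
      print("Toffoli is reversible and embeds NAND")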

  • @ZMacZ
    @ZMacZ 2 ปีที่แล้ว

    The cost of information is based on the expenditure of keeping it intact as well as forming it. So the minimum energy for any current storage system is based on the smallest particle you can hold, which must then be stable as well. A photon is currently the smallest particle of information. In the future this may include quarks. The minimum expenditure for keeping a particle in its state varies with the sort of particle used. If you'd use a proton set (H2), it would stay stable at a minimum cost, while being supercooled to near-0K temperatures. With enough space and a lack of energy to apply heat, this would stay in position. Ofc this is far from the Landauer limit, since the value there is much smaller, and only theoretically feasible. Last time I checked, photons would not want to stay in the same state even when held at a temperature of 0K.

    • @ZMacZ
      @ZMacZ 2 ปีที่แล้ว

      Economically lowering the Landauer limit by cooling can only be done in energy-deficient space, like space outside the solar system. On Earth you need really efficient insulation and much cooling. Not economical. In fact you can show that information, and with that energy, is best stored in information-deficient areas of space, since there the ambient energy is lowest, and with it the density of information already present to disturb yours. Also, information = energy = mass means that the lowest-mass particle marks the practical approach to the lowest entropy of a single bit of information, which can never actually be reached, since the expense of keeping it that way is above zero.

  • @Bassotronics
    @Bassotronics 2 ปีที่แล้ว

    Great video.
    Could a gate with photonic cache memory be possible for irreversibility? Nahh.. that's silly.

  • @marcinpietrzak9358
    @marcinpietrzak9358 2 ปีที่แล้ว

    16:10 it is not reversible - there are 3 out pipes, so 8 possible output states vs. 4 input states.

  • @alengm
    @alengm ปีที่แล้ว

    Nitpick: the truth table seems incorrect at 13:52. It doesn't do anything

  • @jonarmani8654
    @jonarmani8654 2 ปีที่แล้ว +1

    17:20 The Landauer Devourer

  • @Handelsbilanzdefizit
    @Handelsbilanzdefizit 2 ปีที่แล้ว +1

    Quantum computers do fully reversible operations.
    They stay cool, if you cool them down.
    But does this have something to do with the Shannon entropy of information?
    So, how are Shannon entropy and thermodynamic entropy related?
    Are they equivalent?

    • @katkatfarkat
      @katkatfarkat 2 ปีที่แล้ว

      I don't quite like thermodynamic entropy, what it stands for, or rather what people say it means (basically inevitable, irreversible "random" "chaos" which the universe strives for, for some dumb reason) 😅 I refuse to accept/believe this lol
      Whereas entropy according to Shannon describes, quite literally, how much information a message possesses; there is a formula with which you can calculate, e.g., how much this very comment has. Luckily for me the actual content is irrelevant ^^. The value you would calculate solely depends on how unexpected or surprising the message is; although one could understand this at a high level too, it boils down to, and is primarily meant at, the bit level. E.g. our alphabet is redundant, which causes each letter to have lower entropy. To illustrate: u can c y n y it is tru :)
      This is btw how some lossless algorithms can compress huge files into small sizes; a very basic one would count the zeros in 1000 0001 instead of sending each bit.
      I think entropy in information theory makes intuitive sense: if you hear the same news over and over again, it is no news anymore => less info-density => less entropy. There is also a maximum entropy, cause you can only say so much in one comment ;)
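      if you want to actually compute it: H = -sum(p * log2(p)) over the symbol frequencies (a quick sketch):

        import math
        from collections import Counter

        def shannon_entropy(text):
            # H = -sum(p * log2(p)), in bits per symbol
            n = len(text)
            return -sum((c / n) * math.log2(c / n)
                        for c in Counter(text).values())

        print(shannon_entropy("aaaaaaaa"))       # 0.0 -- zero surprise
        print(shannon_entropy("abababab"))       # 1.0 bit per symbol
        print(shannon_entropy("u can c y n y"))  # ~2.35 bits per symbol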
      peace
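
      The formula in question is Shannon's H = -Σ p(x) log2 p(x). A minimal
      Python sketch that estimates the per-symbol entropy of a message from its
      letter frequencies (illustrative only; a real estimate needs a proper
      source model, not just single-letter counts):

          from collections import Counter
          from math import log2

          def shannon_entropy(message):
              # H = -sum(p * log2 p), written as sum(p * log2(1/p))
              # to keep the sign tidy.
              counts = Counter(message)
              total = len(message)
              return sum((n / total) * log2(total / n) for n in counts.values())

          # Redundant text carries fewer bits per symbol than varied text.
          print(shannon_entropy("aaaaaaaa"))  # 0.0: no surprise at all
          print(shannon_entropy("abcdefgh"))  # 3.0: the maximum for 8 symbols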

  • @HeavyMetalMouse
    @HeavyMetalMouse 2 years ago

    The billiard ball gate makes me think there ought to be something analogous that could be done with beam splitters and pulses of light interfering constructively or destructively. Photons lose much less energy on reflection than macroscopic objects do, making 'optical gates' a strong candidate.
    Another difficulty is the 'null input' issue - we would have a hard time creating a gate that produces an output when no input at all is given (the XNOR gate is a good example of this - it gets 0,0 in and outputs 1). As a result, whatever system we use to represent inputs and outputs, we can't reasonably assign 'no input' to a '0' without some extra considerations that would cost additional energy. Ideally we would need some physical input that could be in one of two states when it is input, could then be made to interact within the gate, and would then output the results of the interaction, ideally in the same format. Perhaps polarizations of light, vertical or horizontal, instead of phases interfering? Hmm....
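
    Whatever the physical carrier ends up being, the logical requirement is the
    same: the gate's truth table must be a bijection. The Toffoli
    (controlled-controlled-NOT) gate is the textbook example; it embeds AND
    reversibly by carrying its inputs through. A minimal sketch:

        def toffoli(a, b, c):
            # Flip the target bit c only when both controls a and b are 1.
            # The controls pass through unchanged, so nothing is erased.
            return a, b, c ^ (a & b)

        # With the target initialised to 0, the third output is AND(a, b)...
        for a in (0, 1):
            for b in (0, 1):
                print((a, b), "->", toffoli(a, b, 0))

        # ...and the gate is its own inverse: applying it twice restores the input.
        assert toffoli(*toffoli(1, 1, 0)) == (1, 1, 0)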

  • @Elear.
    @Elear. 2 years ago

    I have been wondering about this for years. Finally. Thank you

  • @rikardlalic7275
    @rikardlalic7275 2 years ago

    During our lifetime we are solving equations all the time, by acting, thinking, and making decisions, constantly and irreversibly reducing the space of possible solutions. Is it this that creates the illusion of time flowing, and flowing in one fictional direction only, making up such a strange kind of awareness of the past, the present and the future? Does our time flow equal energy loss? What happens with the exhaust, the used, discarded (and lost) bits?

  • @offroadr
    @offroadr 2 years ago

    I may be mistaken, but it seems to me the billiard ball gate has three outputs for two inputs, or in reverse, three inputs for two outputs. Maybe this means no loss going forward through the circuit, but then what do we do with all three outputs when we only need one? Doesn't dumping that information cause the same heat loss?

  • @mioszlinkiewicz4272
    @mioszlinkiewicz4272 1 year ago

    OK, so what if we do a computation using a board in a superconducting state, specifically using Cooper pairs (electrons)? Wouldn't this violate entropy, since theoretically no energy is being lost?

  • @tsalVlog
    @tsalVlog 1 year ago

    Why not collide light in a photonic system? You could use transforms to decode the input waveforms from the output waveform, if you have set rules around the system.

  • @ciroguerra-lara6747
    @ciroguerra-lara6747 1 year ago

    Another way to look at the Landauer principle, from what I've read, is as the energy needed to erase one bit. I've also read that thermodynamic and computational reversibility are not always equivalent, and that applying the Landauer limit to a Maxwell's demon implies that, since in reality the demon will eventually have to erase information (we do not have infinite resources to keep it), irreversibility and entropy increase eventually catch up with us.
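
    For scale, the limit itself is simple arithmetic: E = k_B T ln 2, roughly
    2.9e-21 J per erased bit at room temperature. A quick check:

        from math import log

        k_B = 1.380649e-23   # Boltzmann constant, J/K (exact, 2019 SI)
        T = 300              # room temperature, K

        # Landauer's minimum energy cost to erase one bit.
        E = k_B * T * log(2)
        print(f"{E:.3e} J per bit")  # ~2.871e-21 J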