Intro to Information Theory | Digital Communication | Information Technology

  • Published 27 Jun 2018
  • Shannon Entropy in Information theory. Compression and digital communication in systems and technology. The Entropy of English.
    The video we did over on Mark's channel! • Why is English spellin...
    Subscribe to Mark's channel Alliterative! / alliterative
    Hi! I'm Jade. Subscribe to Up and Atom for new physics, math and computer science videos every second week!
    This video was co-written by my super smart hubby Simon Mackenzie.
    SUBSCRIBE TO UP AND ATOM / upandatom
    Follow me @upndatom
    INSTAGRAM: / upndatom
    TWITTER: upndatom?lang=en
    A big thank you to my AMAZING PATRONS!
    Alan McNea, Daniel Tan-Holmes, Simon Mackenzie, Yoseph, Andrew Pann, Dave, Anne Tan, Todd Loreman, David, Susan Jones, Stephen Veitch, Dave Mayer, Renato Pereira, Simon Dargaville, Dean Madden, Noah McCann, Robert Frieske, Magesh.
    If you'd like to consider supporting Up and Atom, head over to my Patreon page :)
    / upandatom
    For a one time donation, head over to my PayPal :)
    www.paypal.me/upandatomshows
    Other videos you might like:
    What is a Singularity, Exactly? • What is a Singularity,...
    When to Think Less (According to Math) • The Accuracy Paradox -...
    When to Quit (According to Math) • When To Quit (Accordin...
    Here's more stuff to read if you're interested :
    www.princeton.edu/~wbialek/ro...
    people.seas.harvard.edu/~jones...
  • Science & Technology

Comments • 491

  • @xSkyWeix
    @xSkyWeix 1 year ago +66

    It is an old video and probably no one will ever read this. But I am always amazed at how much you, Jade, care for details and the cinematic side of your videos. Truly a creative treat each time. For me, all these little antics are the best part.

    • @kayakMike1000
      @kayakMike1000 1 year ago +3

      I read this comment.

    • @billyalarie929
      @billyalarie929 1 year ago +1

      Not only did we read this but the very creator of this thing wanted to let YOU know that YOUR COMMENT is important. Even years after the fact.

    • @xSkyWeix
      @xSkyWeix 1 year ago

      @@billyalarie929 Yeah, it always cracks a smile on my face when I get prompts about this comment :)

    • @joeheintz
      @joeheintz 8 months ago

      that's all she cares about, because she doesn't understand the science side of any of her videos.

  • @edmundkemper1625
    @edmundkemper1625 2 years ago +22

    That "Least amount of Yes/No questions to ask to arrive at the answer" part is spot on, one of the most intuitive explanations of entropy!
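The "fewest yes/no questions" intuition from the comment can be checked numerically: Shannon entropy H = -Σ p·log2(p) is exactly the average number of optimal yes/no questions. A minimal sketch (the uniform-alphabet and biased-coin examples are my own, not from the video):

```python
import math

def shannon_entropy(probs):
    """Average number of yes/no questions (bits) needed to identify one outcome."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform 26-letter alphabet: every letter equally likely.
uniform = [1 / 26] * 26
print(round(shannon_entropy(uniform), 2))  # 4.7, i.e. log2(26)

# A biased coin carries less than 1 bit per flip.
print(round(shannon_entropy([0.9, 0.1]), 2))  # 0.47
```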

  • @graciouscompetentdwarfrabbit
    @graciouscompetentdwarfrabbit 5 years ago +106

    The fact that English is my second language and I read it in a blink (probably 3 or 4 blinks tbh, but I ain't counting my blinks) makes me pretty happy

    • @alexjordan8838
      @alexjordan8838 5 years ago +5

      Gabriel M. English is my first language and I couldn’t even read it! Lol, keep it up!

    • @vishwasnegi5184
      @vishwasnegi5184 5 years ago +1

      Nice sense of humor 😂

    • @graciouscompetentdwarfrabbit
      @graciouscompetentdwarfrabbit 5 years ago +1

      Blinks are a good unit for sentences and possibly paragraphs. It's like using pats or 4 syllables words for measuring the hug duration (btw my go to hug-wise are 3 pats and 2 words for casual hugs, if that hug means a little more than the usual, at least double the word count and DO NOT PAT, unless you think it's enough hugging)

    • @monad_tcp
      @monad_tcp 5 years ago +1

      yes, probably it depends on how many hours you spend reading

    • @chrisschock9283
      @chrisschock9283 5 years ago

      Same :D

  • @upandatom
    @upandatom 5 years ago +138

    Really so excited to be back guys :)

    • @piyushh8859
      @piyushh8859 5 years ago +1

      Up and Atom🙋🙋🙋🙋🙋 welcome back 🙌🙌🙌🙌🙌

    • @piyushh8859
      @piyushh8859 5 years ago +1

      Up and Atom you have forgotten to pin your comment 😅😅😅

    • @DrychronRed
      @DrychronRed 2 years ago +1

      The volume of the music is a little high in my view. It's hard for me to grok what you're saying and to process it a moment later. This is feedback meant to be constructive. Love your videos!

    • @mathew4181
      @mathew4181 2 years ago

      Shannon's paper "A Mathematical Theory of Communication" defined how information is encoded, transmitted, and decoded. His most dramatic discovery was a remarkable parallel: the math that describes the uncertainty of noise is exactly the same as thermodynamic entropy. He called this uncertainty "information entropy."
      Formally, heat entropy is the principle in thermodynamics that says the path from order to disorder is irreversible.
      Information entropy is similar, but it applies to data instead of heat. It's not just a pattern of disorder; it's also a number that measures the loss of information in a transmitted signal or message. A unit of information, the bit, measures the number of choices (the number of possible messages) symbols can carry. For example, an ASCII string of seven bits has 2^7 or 128 message choices. Information entropy measures the uncertainty that noise imposed upon the original signal. Once you know how much information you started with, entropy tells you how much you've lost and can't get back.
      Information entropy is not reversible, because once a bit has been lost and becomes a question mark, it's impossible to get it back. Worse yet, the decoder doesn't report the question mark! It assigns it a 1 or 0. Half the time it will be right. But half the time it will be wrong, and you can't know which bits are the originals and which bits are only guesses.
      The question marks in the received signal are bits that have become lost in noise.
      Noise equals uncertainty. When there's no noise and you receive a 1, you are 100 percent sure the transmitter sent a 1. The more noise there is, the less certain you are what the original signal was. If there's lots of noise, you're only 50 percent sure the transmitter sent a 1, because your decoder gets it wrong half the time. Nobody knows what was originally said.
      Because signals (encoded messages) and noise are polar opposites, coded information can never come from noise. A broken machine makes a horrible squeal. But there's no encoder, so the squeal is not code. (It's not digital either. It's an analog sound wave.) An intelligent agent has to encode that squeal into digital symbols and interpret their meaning before it can be considered information.
      Any randomness-based theory of evolution violates the laws of information entropy. Music doesn't get better when you scratch CDs. Organisms do not gain new features when their DNA mutates through damage or copying errors. Instead they get cystic fibrosis or some other birth defect, like legs instead of antennae growing out of a fruit fly's head. Natural selection can clear competition by killing off inferior rivals. But it can't work backward from a random mutation and undo the damage.
      For many decades, the Neo-Darwinian Modern Synthesis has claimed that adding noise to a signal can occasionally improve its content. Beneficial random mutations, together with natural selection, were allegedly the key to everything. If this were actually the case, I would have to agree that Mother Nature would possess a truly amazing built-in tool of continuous improvement.
      How intriguing it was, then, to confirm that in computer science, networking, and telecommunications, the concept of adding noise to a signal to improve its contents simply does not exist at all, neither in theory nor practice. Claude Shannon's work showed the exact opposite.
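Two of the quantitative claims in this comment are easy to demonstrate: seven bits give 2^7 = 128 choices, and a channel that flips bits half the time destroys all information. A toy sketch (the binary-symmetric-channel model and all names are mine, not from the comment):

```python
import random

# Seven bits distinguish 2**7 = 128 different messages.
assert 2 ** 7 == 128

def noisy_channel(bits, flip_prob, rng):
    """Toy binary symmetric channel: flip each bit independently with probability flip_prob."""
    return [b ^ (rng.random() < flip_prob) for b in bits]

rng = random.Random(0)
msg = [rng.randint(0, 1) for _ in range(10_000)]

# At flip_prob = 0.5 the received bits agree with the originals only
# about half the time, which is no better than guessing.
received = noisy_channel(msg, 0.5, rng)
agreement = sum(a == b for a, b in zip(msg, received)) / len(msg)
print(agreement)  # close to 0.5
```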

  • @bkuls
    @bkuls 4 years ago +10

    I have a master's in wireless and signal processing. And I'll tell you, you have better knowledge than most of my peers and so-called "engineers". Kudos to your channel!

  • @wrathofsocrus
    @wrathofsocrus 5 years ago +18

    This is really important for the hearing impaired as well. Having as much information as possible gives a much higher probability of someone with poor hearing understanding you. Saying words without context, in strange sequences, and without complete sentences will greatly reduce the chances that it will be understood correctly.

  • @doodelay
    @doodelay 5 years ago +2

    I love your channel and just now found you today! You're one of the best of the best in math and physics YouTube because you cover a HUGE variety of topics and give yourself enough time to do so WHILE adding animations and music! This channel's an absolute gold mine :D

  • @amaurydecizancourt4741
    @amaurydecizancourt4741 1 year ago +1

    Probably the best and clearest presentation of a subject that has been close to my heart and neurons for the last twenty years. Bravo and thank you.

  • @ecsodikas
    @ecsodikas 4 years ago +7

    I love how Mark looks exactly like a guy who knows a whole lot about words. ❤️

  • @fuzzmeister
    @fuzzmeister 1 year ago +2

    Your work is simply brilliant. So helpful, put together with passion, and carefully curated for the education of your audience. Abundance - people like you are working hard to create it! 😍 - thank you!

  • @schmetterling4477
    @schmetterling4477 2 years ago +1

    That is one of the best explanations I have ever seen. Excellent.

  • @JasonSpano
    @JasonSpano 5 years ago +1

    Glad you're back! This was an awesome video. Thanks for posting! :)

  • @user-qb4gb2ev5f
    @user-qb4gb2ev5f 4 years ago +1

    Very helpful tutorial! Finally I understand entropy! Thanks!

  • @NMIC374
    @NMIC374 3 years ago +1

    I love your videos!! my new favorite math, physics, scientific philosophy (etc.) YouTuber!!!!!

  • @ARTiculations
    @ARTiculations 5 years ago +5

    This video totally blew my mind and I am loving this collab so much ❤️

  • @IdeaStudioBKK
    @IdeaStudioBKK 5 years ago +11

    Fantastic video. I am a massive fan of Shannon's work; it was really hammered into me in my undergrad days.

  • @LouisHansell
    @LouisHansell 3 years ago

    Jade, I always enjoy your videos. You restore my wave function.

  • @glaucosaraiva363
    @glaucosaraiva363 5 years ago

    You are in the right direction for a huge success! Your videos are getting better and better...Congratulations from Brazil!

  • @MMarcuzzo
    @MMarcuzzo 3 years ago

    Have been watching your videos and really liking them. And surprisingly, I see something of my favourite subject: information theory!

  • @blijebij
    @blijebij 2 years ago

    Very interesting and explained with care and passion! Love it!

  • @Giganfan2k1
    @Giganfan2k1 2 years ago +1

    Had a stroke that affected my language center. Took me a tick or two. So happy I could read. Thanks!!!
    Sorry, have to do this P.S. As a person on the autism spectrum, I am really going to have to digest the last 1/4-1/3 of that for a while.
    My lexicon/big-word-making/vocabulary sometimes has alienated the people around me. I am almost paralyzed daily trying to express myself concisely.
    So instead of using five sentences on everything going on I fall back on:
    I amble down the street.
    Or
    I saunter down the street.
    Everyone asks, "Why can't you just walk down the street?"
    I say, "Because I don't walk correctly because of joint problems. So I have to take breaks. When I do I am taking mental inventory, or trying to be in the moment. So I could look disheveled to the casual onlooker.
    *I didn't want to say all that.*
    So I ambled down the street. Whereas a saunter might be something you do walking around a music festival."

  • @testme2026
    @testme2026 4 years ago

    This is by far the best explanation ever, yet when you search for it, it comes well down the list.

  • @laurenr8977
    @laurenr8977 3 years ago

    Thank you for this! Was the most accessible video on this that I've found.

  • @SeanVoltaire
    @SeanVoltaire 3 years ago

    This is REALLY well done and does a GREAT job of distilling complex concepts into simple steps for laypeople :)

  • @AvenMcM
    @AvenMcM 5 years ago

    Hurray! Great to see this, welcome back!

  • @KuraSourTakanHour
    @KuraSourTakanHour 5 years ago +14

    Now I want to know the entropy of Chinese and other languages 😅

  • @KnowingBetter
    @KnowingBetter 5 years ago +98

    This was epic. You've really upped your game after that film course... need to sign up for one of those. Glad you're back!

    • @upandatom
      @upandatom 5 years ago +23

      thank you! and yeah you should it was amazing. but i actually filmed that before the course lol

    • @johnshumate8112
      @johnshumate8112 3 years ago +1

      Holy crap, Knowing Better's here! I love your videos, KB, keep up the great work

    • @mattgraves3709
      @mattgraves3709 2 years ago

      @@upandatom You have great creative talent.

    • @fuckyoutubengoogle2
      @fuckyoutubengoogle2 2 years ago

      I was just saying "I think the channels [Up and Atom and Knowing Better] thoughtlessly praise each other for more clicks. I happened to have a real distaste for Knowing Better because I found out how deceptive his vids on Christopher Columbus are after watching really good rebuttals by the channel Bad Empanada. I commented about this under the Knowing Better vid but my comments were selectively deleted. They left some of my comments up but just a few out of context to make me look bad. I didn't care much about how I appear but the links and mentions of the rebuttal were removed and this is such an important topic having deadly serious real consequences."

  • @petrskupa6292
    @petrskupa6292 4 years ago

    You are great! Funky, clever, informative...

  • @HYPERC666
    @HYPERC666 5 years ago

    Great channel. It's in my daily dosage with Sixty Symbols. Keep up the awesome work.

  • @xacharon
    @xacharon 5 years ago +2

    The coin flip demo in this video (and the use of "funky" and "fair" coins) reminded me of the "Quantum coin toss" video featuring you and PhysicsGirl. :)

  • @noahmccann4438
    @noahmccann4438 5 years ago +1

    Mark's walk through the history of writing was very enlightening. At first it seems nonsensical that early writers would omit things like spaces and punctuation, but framed in the context of a culture with a small number of writers it makes more sense. If you only write for, and consume from, a small group of people, you can afford to have very unique writing rules. We even see this today - when writing a note for yourself you may write illegibly or omit information because you expect you'll remember part of it anyway.
    As a software developer I’ve seen something very similar at play in my coding - personal projects I work on won’t follow standards as closely, and I’m more likely to come up with my own patterns loosely based on common ones. But at work, we try to stick to standards from multiple levels - standards in the team, in the project, and in the industry.
    That said, in both the writing and coding examples above, there are other things at play - I understand that early writing was very much driven by the medium being used, and I’m sure there were time constraints that encouraged conciseness over exhaustiveness (if you have to manually copy all texts, you probably don’t want to do more work than needed assuming the recipient will understand anyways).

  • @thesentientneuron6550
    @thesentientneuron6550 5 years ago +1

    3:56 I like that song. Gives me happy thoughts.

  • @zrodger2296
    @zrodger2296 3 years ago

    Just watched a documentary on Claude Shannon. Wanted to dig deeper. Best explanation of this I've seen yet!

  • @juliocardenas4485
    @juliocardenas4485 3 years ago

    This is absolutely fabulous!!

  • @tinaburns1376
    @tinaburns1376 3 years ago

    I LOVE THIS VIDEO. It made it so easy to understand entropy. It is also funny.

  • @maccarrena
    @maccarrena 5 years ago +1

    Cool topic and cool video. I recently read a paper about the entropy of the most commonly spoken languages; it stated that English has a lot of redundancy, the Asian languages had much higher entropy (around 3.5 - 4.5 bits), and I think French was the least efficient of those chosen (around 1.5 bits).

  • @SuviTuuliAllan
    @SuviTuuliAllan 5 years ago +4

    I use Cosmic Microwave Background radiation for all my entropy needs!

  • @Russet_Mantle
    @Russet_Mantle 5 months ago

    4:56 that aggressive screen shake got me rolling

  • @Ghost572
    @Ghost572 2 years ago

    I think this is the first time I've binge-watched a YouTube channel, all the titles appeal to me. I'm surprised this didn't come into my recommended earlier.

  • @lowhanlindsey
    @lowhanlindsey 5 years ago +3

    Your channel is the best thing on the internet!

    • @upandatom
      @upandatom 5 years ago

      aww thank you! n_n

  • @baharehshahpar8674
    @baharehshahpar8674 2 years ago

    Amazing explanation thank you 🙏🏼🙏🏼

  • @gypsyjr1371
    @gypsyjr1371 3 years ago +2

    Thanks for your cool and educational videos! This is one I can possibly even understand fully. Long ago, when MOSFETs had first come on the scene, I graduated from West Point with enough mathematics for a BS, enough electrical engineering credits for a BS in that, and enough computer use (I was a part-time mainframe computer tech too) and credits for a CS degree, if only it had existed then. But being the Army, I got a BS in General Science (meaning strategy and tactics) instead. That's all they gave out.
    So over the years, I worked for the Department of Defense, for contractors to same, for the Navy, and for a black-box contractor to DoD who had a device (in 1980) that could sit in a van and read a computer screen inside a building it was parked next to. No longer classified, and long replaced by better technology. I wrote AI, telecommunications systems, video and voice storage in databases with advanced search abilities, the first bank computer system which answered your calls and frustrated you to no end (not proud of that, but it *was* and *is* a technology marvel). Among other things, I wrote code that allowed an encrypted message to be sent over the fledgling internet to a server, and then it would be relayed by satellite to the destination specified in the header of the encoded message. So this time, this video, I don't have to concentrate much to understand. :)

    • @TheJacklwilliams
      @TheJacklwilliams 2 years ago

      @Gypsy JR, freaking wow. What amazes me is the number of people over the years that leap into tech chasing the money train. Which of course, we are all somewhat guilty of. However, I learned quite some time ago, having left the business for about 5 years, that the real pull for me is simple intellectual curiosity and a deep love for all things tech. You have had an incredible career. Mine has some significant highlights; however, if I could rewind, I'd have gone down the dev rabbit hole in a similar fashion. I've dabbled. The last chapter for me is dev. I'll be here until the lights go out. So, question: out of all of the experiences you've had, which did you enjoy the most?

  • @1495978707
    @1495978707 5 years ago

    4:25 4.7 is the base 2 logarithm of 26. Shannon entropy uses the base 2 logarithm to estimate the number of questions you have to ask because the way she divided the alphabet to ask questions is the most efficient way to get to an answer. Always divide up the possibilities into halves, and then the log tells you how many times you need to halve to get to knowing the answer.
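The halving strategy this comment describes can be simulated directly: repeatedly discard the half of the alphabet that a yes/no question rules out, and count questions until one letter remains. A sketch (the split order is an arbitrary choice of mine):

```python
import math
import string

def questions_needed(alphabet, target):
    """Identify target by asking 'is it in this half?' until one candidate remains."""
    candidates = list(alphabet)
    count = 0
    while len(candidates) > 1:
        half = candidates[: len(candidates) // 2]
        count += 1  # one yes/no question asked
        candidates = half if target in half else candidates[len(half):]
    return count

# Worst case over all 26 letters matches ceil(log2 26) = 5 questions;
# the average sits near the entropy, log2(26) ~ 4.7 bits.
worst = max(questions_needed(string.ascii_lowercase, c) for c in string.ascii_lowercase)
print(worst)  # 5
```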

  • @dannybaseball2444
    @dannybaseball2444 2 years ago

    Great teaching as always and I love the way this video anticipates the satisfaction of solving wordle.

  • @danielbrockerttravel
    @danielbrockerttravel 2 months ago

    I came up with similar ideas years ago on objective uncertainty and belief possibility. Unfortunately I had never heard of information theory. I would have benefitted a lot from seeing this video!

  • @will4not
    @will4not 5 years ago +1

    Oh my goodness: I came for an interesting video on understanding language. I never thought it would involve entropy. That’s so cool.

    • @upandatom
      @upandatom 5 years ago

      haha thanks! Glad you learnt something new :)

  • @originalveghead
    @originalveghead 3 years ago

    Really well explained. Thanks.

  • @varshneydevansh
    @varshneydevansh 8 months ago

    Beautifully explained

  • @rakeshbhaipatel1714
    @rakeshbhaipatel1714 1 year ago +1

    That's a pretty good intuitive explanation of entropy. It would be great if you could make a video on mutual information.

  • @izzyhaze7347
    @izzyhaze7347 4 years ago

    Exactly what I needed to make my day

  • @gaeb-hd4lf
    @gaeb-hd4lf 5 years ago

    Awesome video, you deserve WAY more subscribers!

  • @Jamie-st6of
    @Jamie-st6of 3 years ago +1

    you're absolutely correct, but i think it's worth mentioning that the more confusing redundancies in english are mostly inherited from french (and probably some kind of proto-germanic, but i don't know much at all about the germanic languages).
    the silent 'e' on the end of words actually comes from the french spelling system (a silent 'e' was added to make the final consonant pronounced rather than silent, and sometimes used to indicate grammatical gender). 'qu' clarifies the pronunciation of 'c' before certain vowels ('i' and 'e'), similar to the silent 'u' in 'guerilla' and the french 'ç' (cedilla is sort of the opposite of 'qu'). some digraphs also come from french, such as the 'au' in 'aura'.
    unlike everything else, most of english's vowel pronunciation weirdness is an indigenous phenomenon mostly developed by the english themselves. (if you're curious, look up 'The Great Vowel Shift')
    side note: it's somewhat odd to use the past tense to refer to scripts without upper/lower case and without vowels, given that multiple extant scripts have those properties (hebrew, arabic, syriac, aramaic (though only a handful of villages still use aramaic script), and arguably chinese, but it's so different it may as well not count).

  • @dhurtt22
    @dhurtt22 5 years ago

    That was great! It’s good to see you back!

    • @upandatom
      @upandatom 5 years ago

      thank you it's great to be back!

  • @yuxiaoluo9414
    @yuxiaoluo9414 4 years ago

    very good presentation, really liked the example of the noisy environment.

  • @Dixavd
    @Dixavd 5 years ago +3

    I love the cat's turn at 7:32 and then the fact it's missing at 7:41

    • @upandatom
      @upandatom 5 years ago +2

      haha he's quick!

  • @jupahe6448
    @jupahe6448 5 years ago +3

    Even as a nonnative speaker this is very interesting, so glad you're back 😊
    Greetings from Germany 🇩🇪

    • @upandatom
      @upandatom 5 years ago

      thank you so glad to be back! :)))

  • @celsorosajunior
    @celsorosajunior 5 years ago +1

    Really cool! Any additional video about the importance of this subject to encoding digital information? It would be great!

    • @upandatom
      @upandatom 5 years ago

      a lot of people have been asking me about this so I'll keep it on my radar :)

  • @rodrigotravitzki7535
    @rodrigotravitzki7535 2 years ago

    totally wonderful!
    just thinking... would this "before and after the flip" be related to the Bayesian approach?

  • @AnandKumar-ql1sv
    @AnandKumar-ql1sv 5 years ago

    I don't comment on videos, but yours are worth commenting on... Loved the way you explained...

  • @archeronaute5041
    @archeronaute5041 5 years ago

    Physical entropy is also a measure of uncertainty. The entropy of the macroscopic state of a system can be defined by the Boltzmann equation as S = k·ln(W), where W is the number of possible microscopic states for this macrostate. So the bigger the entropy is, the more possible microstates there are, which means the less you know about a system when you know its macrostate.
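The Boltzmann relation this comment quotes can be made concrete in a couple of lines (the microstate counts below are made-up illustrative numbers):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact since the 2019 SI redefinition)

def boltzmann_entropy(microstates):
    """S = k_B * ln(W): entropy of a macrostate with W possible microstates."""
    return K_B * math.log(microstates)

# Doubling the number of microstates adds k_B * ln(2) of entropy,
# the thermodynamic counterpart of one extra bit of uncertainty.
delta = boltzmann_entropy(2 * 10**6) - boltzmann_entropy(10**6)
print(round(delta / K_B, 3))  # 0.693, i.e. ln(2)
```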

  • @burningsilicon149
    @burningsilicon149 1 year ago

    I like to think of entropy as the total number of possible states, and information as something that reduces the number of possible states. Like if you flip 2 coins, before looking you have 4 possible states: TT, TH, HT, HH. If someone tells you the second one is a head, this reduces the possible states by a factor of 2: TH, HH. That amount of reduction is the information in the statement "the second coin is heads".
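This "information removes possible states" view is easy to enumerate with the comment's own two-coin example (a sketch; the variable names are mine):

```python
import itertools
import math

# All states of two coin flips: TT, TH, HT, HH.
states = list(itertools.product("HT", repeat=2))
assert len(states) == 4

# "The second coin is heads" keeps only the consistent states.
consistent = [s for s in states if s[1] == "H"]
assert len(consistent) == 2

# Information gained = log2(states before / states after) = 1 bit.
print(math.log2(len(states) / len(consistent)))  # 1.0
```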

  • @fredochs
    @fredochs 5 years ago

    She's back! Yay!

  • @matthewpull9178
    @matthewpull9178 5 years ago

    When the alien saw "U SUCK LOL" I nearly fell off my seat laughing! Glad you're back ☺️

  • @doylesaylor
    @doylesaylor 3 years ago

    I think overall these are excellent posts on a lot of topics I like to hear you discuss. My ‘question’ is what is the physical representation of the word, question, itself? The theory is posed in terms of text and the logic of writing information. There are strange physical concepts in such efforts like characterizing gaps in steps as instantiations. Disentangling notations and or scripts we might find interesting approaches by finding the physical meaning of ‘question’.

  • @leesweets4110
    @leesweets4110 2 years ago

    That's some good dubstep. Surprising to hear it in one of your videos.

  • @jasperh6618
    @jasperh6618 5 years ago

    There's an interesting link to be made between redundancies in written communication and the (efficiency of) transmission of bits in computers.
    In computers, sending a message once is cheap and fault sensitive while sending a message twice is expensive, so you want to encode the message in such a way you get some benefits of both worlds. I wonder how much redundant information in written communication can be left out until only the equivalent of "send a message once" remains
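The trade-off this comment describes, sending once is cheap but fragile while repeating is robust but expensive, is exactly what error-correcting codes balance. A minimal sketch of the crudest scheme, a 3x repetition code with majority vote (my own illustration, not from the comment):

```python
def encode(bits):
    """Repeat every bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority vote over each group of three; corrects any single flip per group."""
    return [int(sum(received[i:i + 3]) >= 2) for i in range(0, len(received), 3)]

msg = [1, 0, 1, 1]
sent = encode(msg)          # 12 bits on the wire for 4 bits of message
sent[4] ^= 1                # one bit corrupted in transit
print(decode(sent) == msg)  # True: the flip is corrected
```

Real codes (Hamming, Reed-Solomon, LDPC) achieve the same protection with far less overhead, which is the "both worlds" benefit the comment alludes to.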

  • @deadman746
    @deadman746 7 months ago

    There are a couple of things I find particularly interesting.
    One is that in general it is not possible to fix the entropy of an utterance by itself. It is only possible to do so if you know all the information used in encoding and decoding, which you don't. This is the same with thermodynamic entropy.
    Also, the psycholinguists noticed that diachronic changes in language tend to follow a surprisingly precise balance between redundancy and efficiency. As soon as a language gets 3% off optimal, organically, it will adjust to correct given a large enough community of speakers.

  • @travisterry2200
    @travisterry2200 5 years ago +1

    This channel is way more my level than MindYourDecisions or even Numberphile. 👍

  • @subramaniannk3364
    @subramaniannk3364 4 years ago

    Nice video! Could you make a video on Shannon's contributions: the noisy-channel coding theorem and the source coding theorem?

  • @joshsatterwhite1571
    @joshsatterwhite1571 5 years ago +4

    Jesus, Jade, 25K subs already? You're blowing up, and you certainly deserve it.

    • @upandatom
      @upandatom 5 years ago +1

      thank you! but it's all thanks to my collaboration with physics girl

    • @PhingChov
      @PhingChov 5 years ago

      Physics Girl brought us here, but you're the reason we stay and subscribe. Jade, keep up the quality work and we'll be back for more!

    • @PetraKann
      @PetraKann 4 years ago

      how many followers would Jade get if she decided to only dub her voice or exact audio over the top of the video footage?

    • @danielchettiar5670
      @danielchettiar5670 3 years ago

      @@upandatom 2 years and 300k+ subs later.... You deserve it!

  • @gamewarmaster
    @gamewarmaster 5 years ago +2

    Yaay, someone played Banjo-Tooie too! 0:53

  • @shiblyahmed3720
    @shiblyahmed3720 4 years ago

    Hey, I like your style and the way you talk!! Didn't I see someone just like you who talks about math & physics on YouTube?

  • @awuuwa
    @awuuwa 1 year ago

    the music at the end is brilliant

  • @otakuribo
    @otakuribo 5 years ago +2

    *Hypothesis:* dubstep contains more entropy on average than other forms of electronic music, but oddly not as much as rap which has more redundant backbeats but lots and lots of lyrical information

    • @upandatom
      @upandatom 5 years ago +1

      this is an interesting theory
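    A hypothesis like the one above is actually testable: Shannon entropy can be estimated from observed symbol frequencies. Below is a minimal sketch in Python; the short example strings stand in for real musical data (e.g. quantized notes or beats), which would need to be extracted separately:

    ```python
    from collections import Counter
    import math

    def empirical_entropy(symbols) -> float:
        """Estimate Shannon entropy in bits per symbol from observed frequencies."""
        counts = Counter(symbols)
        n = len(symbols)
        # H = sum over symbols of p * log2(1/p), with p = count / n
        return sum((c / n) * math.log2(n / c) for c in counts.values())

    print(empirical_entropy("aaaa"))  # 0.0 -- a fully repetitive sequence carries no surprise
    print(empirical_entropy("abab"))  # 1.0 -- alternating between two symbols
    print(empirical_entropy("abcd"))  # 2.0 -- four equally likely symbols
    ```

    Comparing averages of this estimate across tracks from each genre would be one rough way to check the claim.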

  • @yaminohotoke
    @yaminohotoke 5 years ago

    Cool video! Lots of information... wonder what's the entropy of a YouTube video?!

  • @empire-classfirenationbatt2691
    @empire-classfirenationbatt2691 5 years ago

    I knew about this concept, but I watched the video anyway because I knew I'd learn something new. It's been a while since I've seen one of your videos... and you're cute😂😂😍
    Keep up the good work😂👍😝

  • @michaelblosenhauer9887
    @michaelblosenhauer9887 2 years ago

    The cat at 7:35 made me laugh out loud.

  • @JimLoganIII
    @JimLoganIII 5 years ago

    What is the first piece of music in your video? I really like it. It reminds me of the music in the show “Doc Martin”. Oh, and your video was very helpful!

  • @klaasterpstra6119
    @klaasterpstra6119 3 years ago

    Good explanation of a difficult concept!

  • @awuuwa
    @awuuwa 1 year ago

    excellent content

  • @Keronin
    @Keronin 5 years ago

    Loved the video! I'm a big fan of The Endless Knot already, so I enjoyed the collaboration!
    Out of curiosity, what was the music you used at the end/during the "concert"?

    • @upandatom
      @upandatom  5 years ago

      concert? like at the credits? it's a song from epidemicsound.com

  • @TheMultipower47
    @TheMultipower47 5 years ago +4

    how can i move to bossville?

  • @scoreprinceton
    @scoreprinceton 3 years ago

    @upandatom Very well presented, congrats!! Isn't information content, as defined and measured here, incomplete?

  • @Ureallydontknow
    @Ureallydontknow 3 years ago

    Before we can say that the average entropy of English is 2.62 bits per character, some conditions must be met. For example, a message of length 1 does not average 2.62 bits: the entropy of a single character is between 4 and 5 bits for a 26-letter alphabet, and exactly 5 bits for a 32-letter alphabet (assuming a q without a following u is possible, and that whitespace is not in the alphabet).
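    The single-character numbers in this comment are easy to check: a symbol drawn uniformly from a k-letter alphabet has entropy log2(k) bits. A quick sketch in Python:

    ```python
    import math

    def max_entropy_bits(alphabet_size: int) -> float:
        """Entropy in bits of one symbol drawn uniformly from the alphabet."""
        return math.log2(alphabet_size)

    print(round(max_entropy_bits(26), 2))  # 4.7 -- between 4 and 5 bits for 26 letters
    print(max_entropy_bits(32))            # 5.0 -- a 32-symbol alphabet is exactly 5 bits
    ```

    The 2.62 bits/character figure for English is lower because real text is not uniform: letter frequencies and context (like u following q) add redundancy.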

  • @wolfstormwizard424
    @wolfstormwizard424 3 years ago

    I love your vids so much I really want to major in CS 😁😁

  • @hyperduality2838
    @hyperduality2838 3 years ago

    Syntropy (prediction) is dual to increasing entropy -- the 4th law of thermodynamics!
    Repetition (redundancy) is dual to variation.
    Certainty is dual to uncertainty -- the Heisenberg certainty/uncertainty principle.
    Randomness (entropy) is dual to order (predictability). "Always two there are" -- Yoda.

  • @meissner07
    @meissner07 1 year ago

    I literally have a Chuck E. Cheese coin sitting on my desk at all times. lol.

  • @eduardolima5495
    @eduardolima5495 5 years ago

    Hahahahhah the best part of the video is you saying “I’m back” thanks so much for producing more content!!!!
    By the way, it happens in Portuguese too; I think all languages have some redundancy, right?
    Keep up the good work, u r amzg grl!!

  • @acommenter
    @acommenter 5 years ago +71

    U always comes after a Q!?
    I'll go tell my Iraqi friends about that one!

    • @pierreabbat6157
      @pierreabbat6157 5 years ago +1

      Donnez-leur cinq coqs qataris. ("Give them five Qatari roosters.")

    • @alexv3357
      @alexv3357 5 years ago +6

      For a moment, that joke went over my head and I thought you just misspelled Iroquois

    • @ptousig
      @ptousig 5 years ago +4

      Qat... a very useful Scrabble word.

    • @srinivastatachar4951
      @srinivastatachar4951 4 years ago +1

      Qantas doesn't think so...

    • @Salsuero
      @Salsuero 3 years ago

      All that Qatari money is betting against this truth.

  • @informationtheoryvideodata2126
    @informationtheoryvideodata2126 2 years ago

    It is important to clarify that the number of questions we have to ask defines the entropy; it does not define information in a general way. This distinction is fundamental, especially in relation to new developments in information theory such as Set Shaping Theory (SST).

  • @tadasleksas6312
    @tadasleksas6312 5 years ago

    This video was really good!

  • @tatjanag.7959
    @tatjanag.7959 5 years ago

    Jade!!! No credits in the end for the car scene?? I'm shocked! 🙀😂
    Congrats on the 25k!! You absolutely deserve it 😊🎉💛

    • @upandatom
      @upandatom  5 years ago

      haha oh hey!

  • @bluefire9697
    @bluefire9697 5 years ago

    Yo, this is an amazing video!!!! I got confused by your explanation for the first time lol 😂!!!!

    • @bluefire9697
      @bluefire9697 5 years ago

      The meaning of information entropy btw.

    • @upandatom
      @upandatom  5 years ago

      oh no where did you get confused?!

    • @bluefire9697
      @bluefire9697 5 years ago

      It was in your explanation about the meaning of information entropy, like your definition. But I watched it twice and now I get it!!! (what you meant with your coin concept)

    • @upandatom
      @upandatom  5 years ago

      ok good! :)

  • @Quadr44t
    @Quadr44t 1 year ago

    1:27 So, statistical thermodynamics is also information theory? Because a course in that subject finally sort of helped me make sense of the concept of entropy in general..

  • @progyanmahanta8547
    @progyanmahanta8547 5 years ago

    I love your videos...😍

  • @JTheoryScience
    @JTheoryScience 5 years ago

    'Information' was NOT spelled incorrectly, because I understood what you wrote, and the point of writing is to convey information to someone so they comprehend it. 100% totally perfect, no mistake here in my opinion. Bloody delightful to get another Australian/New Zealander onto YouTube science-ing.

  • @Raffo42
    @Raffo42 5 years ago

    Uhh, uhh, are you going into something like cracking simple codes (a.k.a. "monoalphabetic substitution") based on letter and syllable frequencies? (Or is that a question I should ask on Mark's channel? ^^)

  • @mobile2
    @mobile2 3 years ago

    If I could have watched videos like yours when I was a secondary school student, I would have found learning physics more interesting.
    The channel capacity C = B*log2(1 + S/N) given by the Shannon-Hartley theorem is famous in telecommunications.
    I am a cellular radio network engineer. I was afraid to study semiconductors and electromagnetism when I studied electronic engineering. The maths are very difficult (e.g. Maxwell's equations).
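    The Shannon-Hartley formula mentioned above is simple to compute directly. A minimal sketch in Python; the 3 kHz / 30 dB example values are illustrative, not from the comment:

    ```python
    import math

    def channel_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
        """Shannon-Hartley limit C = B * log2(1 + S/N), in bits per second."""
        return bandwidth_hz * math.log2(1 + snr_linear)

    # A hypothetical 3 kHz voice channel at 30 dB SNR (linear S/N = 1000):
    print(round(channel_capacity_bps(3000, 1000)))  # 29902
    ```

    This is the theoretical ceiling for error-free transmission over a noisy channel; real modems approach but never exceed it.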

  • @zincwing4475
    @zincwing4475 3 years ago

    Shannon, my hero as a teenager.