The last example is actually pretty good, too bad that you didn't do the matching calculation in the video itself to show why probability can matter *a lot* in compression:
1/2 of the time you send 0 = 0.5 * 1 = 0.5 bits
1/4 of the time you send 10 = 0.25 * 2 = 0.5 bits
1/8 of the time you send 110 = 0.125 * 3 = 0.375 bits
1/8 of the time you send 111 = 0.125 * 3 = 0.375 bits
Which sums to only 1.75 bits per state, saving 12.5% of bandwidth, and shows nicely how knowledge about the to-be-compressed data lets you beat the standard approach of taking log2(number of states) as the number of bits needed, which is pretty much the worst case for compression.
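A quick sanity check of that arithmetic in Python (the probabilities and code lengths are the ones from the comment above):

    import math

    # probability and code length for each state, per the comment above
    states = {
        "sunny":  (0.500, 1),  # code "0"
        "cloudy": (0.250, 2),  # code "10"
        "rainy":  (0.125, 3),  # code "110"
        "foggy":  (0.125, 3),  # code "111"
    }

    expected_bits = sum(p * n for p, n in states.values())
    entropy = -sum(p * math.log2(p) for p, _ in states.values())
    print(expected_bits)  # 1.75 -- average bits per report
    print(entropy)        # 1.75 -- this code exactly meets the entropy bound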
This guy looks so incredibly passionate about what he is teaching. If he were at my Uni, I would not regret paying my tuition if I knew he was getting paid to teach me.
Damn. Makes me wanna be a professor.
His voice is so relaxing. I would love to hear an audiobook read by him haha.
I love that on your channels you only find talented enthusiasts that not only explain stuff very clearly but also make it sound fun. Good job, Brady.
That professor is such a badass :)
You should have uploaded this video in 144p
I have spent the last 3 years of mathematics thinking logs were useless. Thank you for proving me wrong.
I have no idea how I passed my chemistry tests on entropy; the way those books explain those concepts is so terrible. I just remember memorizing something along the lines of "entropy is a measure of disorder, and it always increases", and I didn't have a freaking clue what that even meant or why it was significant. The state of education is so bad: it's just a plug-and-chug system, no creativity. Absolutely no passion for teaching or learning is instilled in people.
Great video!
What a pleasure it is to hear somebody who knows what he is talking about answering sensible questions.
Annotation added for absolute clarity (though the Prof says it almost in his next breath) >Sean
Once again Brady asks the perfect questions. I think he has a great gift for helping get the information across to viewers.
This is precisely the type of content this channel needs.
PNG: each scanline is fed through a filter, which predicts each pixel value from nearby pixels, and the best filter choice is encoded on a per-scanline basis. The output is then fed through Deflate, which may recognize patterns but more often does RLE, and which also applies Huffman coding.
For example, each pixel could be predicted by subtracting the value of the pixel to its left. If neighbouring pixels are the same color, you get "0,0,0,0", and with a lot of this the image compresses nicely.
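A rough sketch of that left-neighbour prediction on one scanline, assuming a single 8-bit channel (the real Sub filter works byte-wise, and PNG picks a filter per scanline):

    def sub_filter(scanline):
        # predict each pixel from its left neighbour, store the difference mod 256
        out, left = [], 0
        for px in scanline:
            out.append((px - left) % 256)
            left = px
        return out

    print(sub_filter([200, 200, 200, 200]))  # [200, 0, 0, 0] -- runs of 0 deflate well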
This guy has an amazing way of holding my attention for long videos.
A big part of the greatness comes from the questions asked by the student.
Imagine having him as a lecturer, he's definitely one of the best!
"We edit nothing out of you" and everything befor and after that is pure gold.
Do steganography one of these times pls.
I found myself drawn into this video more than I usually am with these kinds of videos. The topic was presented in a very interesting way, and the professor's voice is quite enjoyable to the ears.
I'm a CS major and this channel is teaching me more than my class so far.
Okay, I'll admit I've already learned what this class is teaching from youtube as well, but that's beside the point.
Probably the best one Computerphile has done so far. This covered so much ground clearly and in only 12 minutes.
This professor is awesome... He seems like a GREAT teacher, if allowed to teach
I loved the style of this video. It's great seeing the professor talk to Brady. It brings another level of humanity to the conveyance of the topic.
I would pay anything to get a class with this guy. So inspiring!
Yeah, he quickly gets to the point, he doesn't mess around when explaining and explains extremely well. On top of that he just seems so passionate about what he does, if that were my field of study, you couldn't ask for a better teacher.
I love Brady's incisive, curious and critical style of interviewing.
the limit of Brady's channels as it approaches infinity = an intellectual society.
This is my favorite phile site!
Great job delivering educational and fun material,
thanks guys
Wow, I actually learned something new from this video. I had never thought about this in all my time programming and working with computers. Keep it coming, Computerphile. Also, I am glad I subbed to this channel.
This was the best computerphile yet :) You really can't separate computers and maths (if you are talking about the internal workings at least). Computers are logical systems... math is logic.
Great video. FYI the encoding method he is referring to for encoding varied probability symbols (7:38) is called Huffman encoding.
You should specify that the entropy limit applies only to statistical compression algorithms; with LZW, for example, you can go below it. You should do a video about it, because it's quite interesting. I was mesmerized the first time I learned about it.
I hope we get much more from Professor Brailsford, he's great.
It was a bit confusing, but the formula -(p*log(p)) is normally not used alone; rather, one sums it over all possible events, and the p factor just makes it a weighted sum, since what one is really interested in is the average number of bits per event.
The -log(2^-2) formula comes from that summation: you get (-1/4*log(1/4)) + (-1/4*log(1/4)) + (-1/4*log(1/4)) + (-1/4*log(1/4)) = -log(1/4), but this only works because the events are equiprobable.
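A one-off check of that in Python:

    import math

    probs = [0.25] * 4
    print(sum(-p * math.log2(p) for p in probs))  # 2.0
    print(-math.log2(0.25))                       # 2.0, same only because p is uniform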
After this video, an introduction to Huffman coding is an absolute must-have. Once you know that you should give shorter codes to more probable events, Huffman coding is the next step in deciding which codes should be used. It's also dead simple to teach.
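In the meantime, here's a minimal (unoptimized) Huffman builder in Python; ties are broken arbitrarily, so an equally valid codebook may come out with the 0s and 1s swapped:

    import heapq

    def huffman(freqs):
        # heap items: (weight, tiebreak, {symbol: code-so-far})
        heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
        heapq.heapify(heap)
        count = len(heap)
        while len(heap) > 1:
            w1, _, c1 = heapq.heappop(heap)  # take the two lightest subtrees...
            w2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + c for s, c in c1.items()}        # ...merge them,
            merged.update({s: "1" + c for s, c in c2.items()})  # prefixing 0/1
            heapq.heappush(heap, (w1 + w2, count, merged))
            count += 1
        return heap[0][2]

    print(huffman({"sunny": 0.5, "cloudy": 0.25, "rainy": 0.125, "foggy": 0.125}))
    # e.g. {'sunny': '0', 'cloudy': '10', 'rainy': '110', 'foggy': '111'}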
I hope to see more from this gentleman. It is mind-blowing how one's passion for something can transfer to others through a youtube video. Inspiring!
You're actually correct. In this case your interpretation is very valid. As long as the receiver and the sender both sync up at a particular time to communicate, then you can use that scheme. However, what he described in the video is the precursor to large file compression. If you had to send weather data from many different cities, you can have a bit stream that starts like "000000...". The receiver would not be able to tell if that's 6 sunny or 3 rainy. The method here is Huffman Coding
Getting closer to Huffman coding used by LZH / ZIP compression. I hope we'll see more about that. Also as long as we're looking at image compression it would be cool to see a demo in slow motion of a JPEG, GIF, and PNG decompressing into a visible buffer.
This guy is awesome.
He should have his own TV show or movie documentary.
I'm really enjoying these computerphile videos.
This is a good one. Don't shy away from the details!
Right, I'm going to clear some stuff up for those who want to get around this problem.
sending data at x mins past the hour
christian wagener -
"
Send a "0" on the hour = weather A
Send a "0" one min past the hour = weather B
Send a "0" two min past the hour = weather C
Send nothing = weather D
Single bit at $750 on average:)
"
The problem is: what if the weather changes from weather B to weather A 30 seconds after the hour? You'd be left with a problem. Even if you think "oh, just give the data recorded on the hour", the problem is I don't want to wait 2 minutes to get the info; I don't want to wait any amount of time. Timing can carry information, sure, but if I ask for the weather, I need it now.
Another problem: what if the connection breaks? Then we think they have weather D constantly.
MRAROCKERDUDE -
"
I don't know if this is just reading too much into the weather metaphor but could you not encode the different weather states as 1, 0, 00, 11?
"
The problem here is that it's not like speaking: you can't just stop halfway through a sentence to add meaning like you can in speech. Think of it as a bunch of 1s and 0s going down a data line.
Say I have 4 weathers and I encode them as 0, 10, 111, 110. This is good because I can send 01001000110 and know exactly what it means: with weather 1 being a, weather 2 being b and so on, that decodes as a, b, a, b, a, a, d.
However, if it were 0, 1, 00, 11... well, try decoding this: 01001010110010
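Here's a tiny decoder sketch for the good prefix code above (a=0, b=10, c=111, d=110); the greedy first-match works precisely because no codeword is a prefix of another, which is exactly what breaks for the 0/1/00/11 version:

    codebook = {"0": "a", "10": "b", "111": "c", "110": "d"}

    def decode(bits):
        out, buf = [], ""
        for bit in bits:
            buf += bit
            if buf in codebook:       # first match is the only possible match
                out.append(codebook[buf])
                buf = ""
        return out

    print(decode("01001000110"))  # ['a', 'b', 'a', 'b', 'a', 'a', 'd']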
You sum up that result for all your states. All the probabilities have to sum to 1. You've got 4 states of 1/4 each: -(1/4 * log2(1/4)) = 1/2, and 1/2 + 1/2 + 1/2 + 1/2 = 2.
Given context, you can compress 4 states to fewer than 2 bits.
For example, you could say that if the weather is unchanged, only send one low bit. This requires the listener to be able to determine whether it is a 1-bit or 2-bit signal, which means either having metadata such as a length header, or a timeout to determine the end of the signal.
It's great to see videos about entropy. I hope to see more on information theory, which is exciting and unfortunately underrated.
The presenter always asks good questions
A "baud" is a physical link-level symbol that can have any number of states, not just two. Many modulation schemes allow for encoding multiple bits in a single baud/symbol - don't worry, this is widely known and used.
Please go over to the wikipedia page on bauds and symbol rates and read up (and stop by the page on QAM as well). I can't repeat what's on there in 500 chars, and the articles address your questions quite exhaustively.
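For illustration, the number of bits carried per symbol is just log2 of the number of distinguishable symbol states:

    import math

    for name, states in [("BPSK", 2), ("QPSK", 4), ("16-QAM", 16), ("256-QAM", 256)]:
        print(name, math.log2(states), "bits/symbol")  # 1, 2, 4, 8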
He's not dumb, he represents the common man, that's why he always talks to experts on all of his channels and asks probing questions.
Very nice explanation indeed. These concepts are so important these days for machine learning algorithms :) KD
This is awesome!! I always wanted to know how this worked! What a great guest!!
In a real-life data format, the Huffman tables are also typically sent prior to any globs of encoded data, so a single decoder can deal with multiple sets of data. In a format like Deflate, the tables are themselves entropy-coded.
Some formats (such as MPEG) use fixed tables, but still send a synchronization code and basic headers for each frame
(some others send tables only on I-frames).
The Huffman table basically tells how to map particular symbols to particular bit patterns.
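As a simplified illustration of why that's cheap to send (not the exact Deflate wire format): given only the code lengths, a decoder can rebuild a canonical Huffman code deterministically:

    def canonical_codes(lengths):  # {symbol: bit length}, assumed to form a valid code
        code, prev_len, out = 0, 0, {}
        # assign codes in (length, symbol) order, incrementing and left-shifting
        for sym, ln in sorted(lengths.items(), key=lambda kv: (kv[1], kv[0])):
            code <<= ln - prev_len
            out[sym] = format(code, "0{}b".format(ln))
            code += 1
            prev_len = ln
        return out

    print(canonical_codes({"sunny": 1, "cloudy": 2, "rainy": 3, "foggy": 3}))
    # {'sunny': '0', 'cloudy': '10', 'foggy': '110', 'rainy': '111'}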
I just didn't get this back when I was in my teens; I was sure there must be a way around it. It's only when I got older that it seemed obvious. Lossy compression is another story: there may be ways to improve that.
that's the storage vs. time thing I'm sure we'll get to eventually.
The various notions of optimality in computer science will surely make for some nice videos.
Maybe even how various compressions and codings relate to temperatures in physics; I've recently read a paper on a basically 1:1 mapping between code complexity and thermodynamics.
The reason why prefix codes are used instead of what you propose is because a continuous sequence of such codes would be ambiguous. For instance in your proposed code the decoder can't distinguish whether "0001" means "foggy(00) sunny(0) cloudy(1)" or "sunny(0) sunny(0) rainy(01)" or any other valid combination. You'd have to waste more bits for a 'word length' prefix or some framing structure to allow you to detect word boundaries, and at that point you might as well use a prefix code.
I could listen to him for hours and retain every word.
I want to say that practically you can send the LA weather report in one bit if you use time differences: sunny - send 0 at 5:30; rainy - send 1 at 5:30; foggy - send 0 at 6:00; cloudy - send 1 at 6:00.
It is amazing that you made such a simple and easily corrected mistake, but that you do not have enough intellectual humility to just understand where you've gone wrong, accept it, and move on.
Time is information. You may physically send one bit, but the time is an implicit source of information no matter how precise you want to be. When we talk information theory, we're interested in all factors that may constitute "information".
And it still ignores the various possible problems that may occur in transit, which can include unpredictable delays in timing, throwing off the system. As mentioned.
Insane video for these guys, for their memory. I wonder how their lives have changed since then.
I wish I had such a professor in my college
This only works if you send a code each day. If, as he said later in the video, you send all the information for a whole week at the end of the week (without breaks in between), you won't be able to determine which days were foggy.
There are some underlying issues that are specific to network theory and confidence in the received data that they cover *very* briefly in the beginning when they discuss sending a zero for sunny in the Sahara "just to be certain". You need high confidence that you actually received the correct message, and treating null as a state ignores many other possibilities in this scenario (cut wire, building on fire, etc.).
In the video, yes, basically.
They didn't talk about the (relatively funky) arithmetic coding, though, which can also use fractional bits but is still limited by the entropy bound. It typically compresses slightly better than Huffman at a significant speed cost (Huffman is generally preferable as it is much faster, and the size difference is usually fairly minor).
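A toy sketch of the interval narrowing behind arithmetic coding, using floats for clarity (real coders renormalise with integers); the symbol probabilities here are made up:

    # cumulative probability ranges: a has p=1/2, b has p=1/4, c has p=1/4
    ranges = {"a": (0.0, 0.5), "b": (0.5, 0.75), "c": (0.75, 1.0)}

    def encode_interval(message):
        low, high = 0.0, 1.0
        for sym in message:
            lo, hi = ranges[sym]
            span = high - low
            low, high = low + span * lo, low + span * hi
        return low, high  # any number in [low, high) identifies the message

    print(encode_interval("aab"))  # (0.125, 0.1875): width 1/16, i.e. ~4 bits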
What you describe is a modulation scheme and there the units of transmission are not called "bits" but "symbols". There are plenty of link modulation schemes which encode more than a single bit in a single symbol (google: QAM or QPSK, etc.), but these do not alter the fact that to describe 4 states you need at least 2 bits. Also look at "symbol rate" on wikipedia, which explains a lot of the general ideas behind this as well.
You could send one click at a certain time for each different message: send at 8:00 if sunny, send at 8:01 if rainy. Right?
I love this man. Make him a regular, please :3
I'm not saying a signal should be sent more than once a day. For example, as the professor in the video stated, if we have an assigned time at which the signal should be sent (say, 3pm), then one would send a beep in the first ten seconds after 3:00pm if the weather is cloudy, one beep in the second ten-second window after 3:00 to signal some other weather, and so on up to the fourth window if necessary. All you need to do, then, is send one signal instead of 2 each time.
Like they mentioned later on, a code cannot be a prefix of another code. In your example, if you get a series of codes like 1010, is that sunny, rainy, sunny, rainy, or is it foggy, foggy? I don't think they did the best job of describing this, but it's in there!
It would be nice to see an episode on gray code counters and their applications. I was intrigued by that when I learned about it in comp sci classes years ago. I imagine they're used in countdown timer circuits to avoid transient states associated with critical events.
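For reference, the usual binary-to-Gray conversion is a single XOR; a sketch:

    def to_gray(n):
        return n ^ (n >> 1)   # adjacent values differ in exactly one bit

    def from_gray(g):
        n = 0
        while g:              # XOR together all right-shifts of g
            n ^= g
            g >>= 1
        return n

    print([format(to_gray(i), "03b") for i in range(8)])
    # ['000', '001', '011', '010', '110', '111', '101', '100']
    assert all(from_gray(to_gray(i)) == i for i in range(256))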
Awesome video! It gives an interesting insight to the statistical meaning of entropy
For this example he said that 0 equals a short pulse and 1 equals a longer pulse. Also, the guy in Reno is always awaiting 2 bits from LA. In actual Morse code you leave a pause of one "dit" length between the elements of a letter, three dits (the length of a "dah") between letters, and seven dits between words. Other examples are ASCII and CSV files: ASCII has a fixed code length, and for a CSV file you need to specify a separator.
One of the hardest things about the client-server relationship in a time-dependent environment is getting both sides to "think" in the same frame of time. Have a read about video game client-server synchronization techniques in particular and you might understand just how complicated this issue is.
The probabilities of a state have nothing to do with how many bits you need except when all states have equal probability. If there was one chance in a million of rain in the Sahara and one million minus one out of a million of sunny, then you still need only one bit to determine the state. The limiter is how many states you wish to report on.
We can send the 4 weathers in 0.75 bit:
1. Sunny: 0 sent @ x hours
2. Cloudy: no message sent @ x & (x+∆x) hours
3. Rainy: 1 @ x hours
4. Foggy: 0 @ (x+∆x) hours
Disadvantage: you have to trade time for compression.
I love seeing these compression and data videos
I like this new channel a lot.
You're almost correct. Most serial communication systems use a clock rate called the "baud rate", but they use negative volts for logic high, positive volts for logic low, and idle at the negative (mark) level. Old RS-232 used ±12 V.
Ergo, it'd be ambiguous if it were automatically decoded. You are exactly right: the code has to be the expected length. But when you've only got four conditions, my supposition was that the telegraph operator would manually decode it. He was speaking to the fundamental elements of a complete and unambiguous code. I didn't realise you were already informed while I was relating the simplest situation I could as an answer to your question.
Perhaps it should be mentioned that the formula for the number of bits per state, -p*log2(p), comes from minimizing the expected total cost of transmission.
Clock synchronization is a hard problem; instead, you'd work with a different type of encoding. You don't encode your data bits directly to on/off; rather, you use the /change/ (on->off) to indicate a one, and the lack of change to indicate a zero (or the other way around). This also circumvents some other problems caused by long runs of ones (or zeroes). If you're interested, look up NRZI and Manchester encoding on Wikipedia.
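A small sketch of that transition-based scheme (the NRZI flavour), just to illustrate the idea:

    def nrzi_encode(bits, level=0):
        out = []
        for b in bits:
            if b == 1:
                level ^= 1    # a 1 flips the line level, a 0 leaves it alone
            out.append(level)
        return out

    def nrzi_decode(levels, prev=0):
        out = []
        for lv in levels:
            out.append(1 if lv != prev else 0)  # transition = 1, no transition = 0
            prev = lv
        return out

    data = [1, 0, 1, 1, 0, 0, 1]
    assert nrzi_decode(nrzi_encode(data)) == data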
True, that works. The drawback is performance: you can only send a very limited amount of data unless you can make the timings stricter. If I wanted to send 2 units of information, I would have to wait from 5 to 8 times the interval. This could be cut down to 3 to 4 times the interval by sending a 0 during the same interval to mean another state (i.e. 0 from 1-15 means rainy).
If you had more states, you would have to sacrifice something else to keep your message just as clear; fewer bits per second, perhaps. There is a huge amount of science in making sure signals are transmitted or stored without data loss: extra data is added so bits can be reconstructed if they are lost, alternating bits are used to keep track of which bit number you are up to, and binary words are mapped to longer words so that patterns that are hard to read aren't used.
I would like to see a three-part series on the transistor.
one on Computerphile, one on Periodic Videos and one on Sixty Symbols.
On Computerphile you can talk about how they are used in computers.
On the other two channels you can talk about the chemistry and physics that make them work.
It could get confused between 0 and 01, but we suppose we don't send 2 different messages in the same minute.
Each weather is a 25% chance, BUT maybe (surely) the sunny weather is more stable: the weather changes less often when it's sunny, and sunny spells last longer. So giving sunny the short code means it uses 1 bit far more often than 2. Same idea for the rest: if cloudy is the rarest weather after sunny, we put 01 for cloudy.
Damn it Brady, where do you find these amazing people?
In addition to the length of the tone that others have mentioned, there are also other ways of differentiating high from low bits. For example, they could be at different amplitudes (volume) or different frequencies (pitch, or even colour). This is known as modulation.
No, I'm talking about the objects in the box. If I'm sending umbrellas and only 1 umbrella can fit in a box, then I can only send 1 umbrella per box; but if I learn that you can close/fold up an umbrella, then I can send more than 1 umbrella per box.
If I always have to send 2 boxes and can fit 3 folded umbrellas in a box, then I can have 6 states compared to the 2 the previous way allowed. More information, same package. I could also lower the cost by saying unfolded = 3 as well.
Hello! Considering that there is another video on compression (just as there are multiple videos on sorting, for example), a suggestion: put a link in each video's description pointing to the next and/or previous video. This would keep everything together.
You have to see this in the context of precursors to compression as well. Say you try to compress Sunny-Sunny-Cloudy-Cloudy with 1-bit codes: it would give you 0011, but that could also mean Rainy-Stormy.
Now take his Los Angeles code: you have 4 codes, 0, 10, 110 and 111. No matter what order you put them in, the stream can always be read back unambiguously.
So if 1000 messages were sent, the 2-bit method vs this method for Los Angeles comes to 2000 vs 1750 bits.
What you're describing is a lossy compression, where the exact moment the weather changes can be extrapolated from the data stream... Yes, I do know weather changes gradually, but it's also never /just/ sunshine or /just/ fog; this was /just/ an example of how compression works.
As I said, if you really wanna save on data, report the weather once a week; the cost cutting would be huge!
You can't, because if you transmit "01", you can't know whether it's cloudy, or rainy and then sunny. The bits a file is made of have no spaces between them, and therefore no code can be the same as the beginning of another.
I suggest you look up "Huffman coding" on Wikipedia and how it works; it's a really interesting algorithm whose purpose is to build a key for a bit sequence, just like what was shown in the video.
To compress (and, more pertinently, decompress) you have to use a specified algorithm. Each of your signals would have to contain two binary digits in order to be decoded correctly.
The prefix property, as explained in the video. With your version you cannot uniquely decode, for example, 1010: is it 1, 0, 1, 0, or 10, 10, or 10, 1, 0, or 1, 0, 10...? To distinguish them you need extra information, and the minimum amount of extra information you can get away with is achieved by using the prefix property. It doesn't matter how you transmit your information (timing, pigeon, ...), because you can always reduce it to a number of bits. Bits are easier to reason about than pigeons.
He tries to make things more clear for the viewers. I think he's quite smart actually.
This guy is an excellent instructor!
This video goes quite nice with the most recent Crash Course Chemistry video
Well, the thing is that, theoretically, you are still using more "bits" of information:
- The 1 or 0 denoting the weather
- The time the message is received
So yes, you are sending fewer bits through the cable. But the message is meaningless without knowing the time, so the actual information (rainy/sunny/etc.) still requires those extra "bits" of information (the time of receiving). You don't save anything overall, just the number of transferred bits.
What you are asking is not strictly a question on information theory but rather about engineering and how to build a system with certain properties. The problems you describe are real and there are engineering solutions around them. As you guessed correctly, they require a bit of overhead, called "signalling". That's why e.g. your 54Mbps WiFi never goes at 54Mbps - it's the physical "line" signalling rate (also called "baud rate").
It could be done in one bit.
If, as the guy says, the signal is transmitted at the same time every day (say 12:00), then a delay could be introduced that varies according to the weather:
Sunny: Send one bit at 12:00 exactly.
Cloudy: Send one bit at 12:00.00.01
Rainy: Send one bit at 12:00.00.02
Foggy: Send one bit at 12:00.00.03
Or a dot at ...01 or ...02, or a dash at ...01 or ...02, which would take the same time as sending the two bits in the example in the video.
Hi, I was thinking along the same lines. Simply send a "1" between 1 and 15 seconds past the minute for sunny, then a "1" between 16 and 30 seconds past the minute for fog, etc. You only need one bit for each state.
That is an interesting idea. One thought, though: you are in effect using two bits anyway, since there are two different sending times, which can be considered 0 and 1. Given the price per bit, you are of course right, though. This is known as a covert channel, meaning one communicates with more than just the bits sent; it might be an interesting topic for a video.