Regarding your statement that it's impossible to reach full entropy (19:30), I have a question:
Imagine you are flipping a biased coin with HEADS (let's call it 1) having probability p (so by definition its min-entropy is -log2(max(p, 1-p))).
Now, make two strings of throws:
S0 = 011001...
S1 = 010111...
Let's extract 1 bit of information from this as follows:
Find the first position where the strings don't match. That is, find the smallest i such that S0[i] != S1[i]. Then our output shall be S0[i].
Assuming that all flips are independent and p is constant, shouldn't this generate perfectly unbiased 50/50 bits, and therefore 1 bit of entropy per output bit?
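For what it's worth, here's a quick simulation I sketched myself (not from the video) of the scheme above. It flips the two coin streams index by index and outputs S0[i] at the first disagreement; the function and variable names are my own:

```python
import random

def biased_bit(p, rng):
    """One flip of the biased coin: 1 (HEADS) with probability p, else 0."""
    return 1 if rng.random() < p else 0

def extract_bit(p, rng):
    """Flip one bit of S0 and one bit of S1 per index until they differ;
    return S0's bit at the first disagreement."""
    while True:
        a = biased_bit(p, rng)  # S0[i]
        b = biased_bit(p, rng)  # S1[i]
        if a != b:
            return a

rng = random.Random(42)
p = 0.8            # heavily biased coin
n = 100_000
ones = sum(extract_bit(p, rng) for _ in range(n))
print(ones / n)    # should be close to 0.5 regardless of p
```

This is essentially von Neumann's debiasing trick in disguise: at the first differing index, P(S0[i]=1, S1[i]=0) = p(1-p) = P(S0[i]=0, S1[i]=1), so conditioned on a disagreement the output is exactly unbiased.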
I haven't seen the next video yet, so I apologize if this has been covered there.