As the course TA, Costin B., correctly pointed out to me, I made an error in this lecture when I said that Feynman only thought about quantum computers simulating quantum systems, and not about the possibility of them simulating general computation. In fact, Feynman did consider the possibility of quantum computers simulating general deterministic and probabilistic computation as early as 1982. Please check out his extremely readable paper here: www.cs.cmu.edu/~odonnell/quantum18/feynman-simulating-physics-with-computers.pdf
This is brilliant. I absolutely love the way you teach. Really cool. Love the blackboard too. The content, together with the attached lecture notes, is an invaluable resource for studying this topic. Thank you for taking the effort to record it.
I'm very curious about the nature of "probabilistic computation". It seems like there may be several different kinds. For example, the "probabilistic" aspect of the Miller-Rabin probabilistic primality testing algorithm resides in the choice of random numbers. Aside from the choice of the random number, it is perfectly deterministic. It would seem that such a probabilistic algorithm has a very different intrinsic property compared to, say, a "fuzzy logic" system. It'd be interesting to go into the science of the "different kinds" of probabilistic algorithms, and how that relates to quantum. I suspect that the probabilistic nature of quantum computing is a whole different type... although probably quite similar to the type of Miller-Rabin. I also wonder where neural networks fall into this spectrum. Is it also a "probabilistic algorithm" in the same sense that the weights of a neural network are like the random numbers in MR?
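To make the point in this comment concrete, here is a minimal Python sketch of Miller-Rabin (my own illustration, not from the lecture). The only nondeterminism is the choice of the random witnesses `a`; given those, every step is a fixed deterministic computation:

```python
import random

def miller_rabin(n, rounds=20, rng=random):
    """Probabilistic primality test. The ONLY random step is the
    choice of witnesses `a`; everything else is deterministic."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # Write n - 1 = d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = rng.randrange(2, n - 1)   # <-- the probabilistic part
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False              # `a` witnesses that n is composite
    return True                       # "probably prime"
```

Note the structure: the randomness only affects *which* witnesses get checked, and hence the (tiny) error probability, at most 4^(-rounds), never the arithmetic itself.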
Hi Professor Ryan, I have a question about 29:00: you mentioned that if we generate 1000 random bits in an array, then one would need 2^1000 numbers to describe the final state of the array. But since we are generating each bit independently, can't we describe the state using just 1000 numbers, where the i-th number gives the probability that the i-th bit is 0 (the probability of that bit being 1 is then 1 minus that)?
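The commenter's observation is correct for *independent* bits: a product distribution over n bits is fully specified by n marginal probabilities, and any joint probability is recovered by multiplying them. The 2^n numbers are only needed once the bits can be correlated, which is the generic situation after a computation acts on them. A small sketch (variable names are my own):

```python
from itertools import product

# Marginals for 3 independent random bits: p[i] = Pr[bit i == 0].
p = [0.5, 0.9, 0.25]

def joint_prob(bits, p):
    """Probability of one specific outcome of independent bits:
    just multiply the per-bit marginals."""
    out = 1.0
    for b, p0 in zip(bits, p):
        out *= p0 if b == 0 else (1 - p0)
    return out

# All 2^n joint probabilities are determined by the n marginals,
# and they sum to 1 as a sanity check.
total = sum(joint_prob(bits, p) for bits in product([0, 1], repeat=3))

# For correlated bits, marginals are NOT enough: two perfectly
# correlated fair bits have Pr[00] = 0.5, but the product of their
# marginals predicts 0.25 — hence 2^n numbers in the general case.
```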
Awesome, thank you!
thank you for reminding me to charge my battery pack! ⚡🤖 #stayjuicy
Brilliant summary of optical quantum computing:
- Initialize 1000 photons
- Run thru obstacle course
43:00
Oh. That escalated quickly
You know, we are at 1:08:29 of 1:20:23, and the "rotate, compute, rotate" line hasn't been mentioned yet. I know we're rolling up to it, but...
Ah - 1:19:17 and we're finally hearing the words.
Hang in there till Lecture 11 :) It's a leitmotif for the whole course...
Man, this "waving off" the difference between N and N*log(N) is bugging me... All dependence on N counts!