This guy looks like every computer scientist to ever exist, in the same body
the Avatar
I immediately thought of Brent Spiner's character in Independence Day.
@@JackVermicelli same!
🤣🤣🤣🤣🤣🤣🤣🤣🤣
Yesssss and talks like Dr Strangelove
I don't know about coding but I am kind of an expert in Laziness.
brambedkar59 and you’re not alone
Indeed. Not to brag, but I'm one of the foremost experts in the field.
if IFeelLikeIt:
    yield work
To be fair, a lot of coding is about laziness, so you probably have an aptitude for it.
@@nobodyofconsequence6522 this is... surprisingly true. Especially when you get to the point of trying to make your program more efficient. You're really just trying to get it to do as much with as little time as possible, and sometimes that may be through laziness of not handling something that's out of the scope of the original program design. Basically saying "not my problem; I'm just going to do what I was told to do, nothing more". And, surprisingly, limiting the scope like this actually makes future programming more modular. In other words, it works.
Ok, since I am at it, here is another comment: no, the sieve of Eratosthenes is not an efficient method to compute primes; I just used it as a way to illustrate computing with infinite data structures. It has been known for a while that PRIMES is in RP, that is, randomised polynomial time. Some years ago it was even shown to be in P, i.e. you don't even need dice.
Professor, could you then kindly suggest a more efficient method of code, to find primes?
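For reference, the randomised polynomial-time test alluded to above is usually Miller–Rabin; here is a minimal sketch (an illustration only, not code from the video):

import random

def is_probable_prime(n, rounds=20):
    # Miller-Rabin: a randomised primality test (PRIMES is in RP)
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:          # write n - 1 as d * 2^r with d odd
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False       # a is a witness that n is composite
    return True

print([n for n in range(2, 40) if is_probable_prime(n)])  # primes below 40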
The sieve is efficient. But you implemented trial division.
@@jens6398 I don't know what you mean by "trial division" but I have implemented the sieve.
@@ThorstenAltenkirch No. There is *never* any divisibility test in the sieve. Instead there is elimination of multiples, as you actually explained in the beginning of the video.
To quote Wikipedia: Primes can also be produced by iteratively sieving out the composites through divisibility testing by sequential primes, one prime at a time. It is not the sieve of Eratosthenes but is often confused with it, even though the sieve of Eratosthenes directly generates the composites instead of testing for them. Trial division has worse theoretical complexity than that of the sieve of Eratosthenes in generating ranges of primes.
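To make the distinction concrete, here is a minimal classic sieve in Python that generates the composites directly, with no mod anywhere (a sketch for illustration, not the video's code):

def primes_below(n):
    # cross out multiples of each prime; no divisibility test anywhere
    is_prime = [True] * n
    is_prime[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            for m in range(p * p, n, p):   # strike out the composites directly
                is_prime[m] = False
    return [i for i in range(n) if is_prime[i]]

print(primes_below(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]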
I hope TH-cam won't completely mangle my code here. I grabbed a convenient library that provides a `primes` in Haskell and set out to calculate, on the one hand, the count of times that a number is crossed out when finding all the primes less than some given n using the sieve of Eratosthenes:
> sieved n = sum . map (\p -> length . takeWhile ( length . takeWhile (\p -> c `mod` p /= 0) . takeWhile (
4:10
“3+4 is... nothing. Oh, it’s 7. Okay, that’s good”
I don’t know why this had me laughing so hard
That made me laugh too. Like a smart professor struggling with a gui interface
Bacon u making things boring
My face hurts from laughing at that.
same with me.
and then he messed up trying to call nats, not originally expecting it to return and iterator "that was not very clever". They got a kick out of me, sort of reminding me I'm not the only one that occasionally forgets these things. Thanks for leaving that one in @Computerphile
Professor Altenkirch is hysterical. I wish I had him during my CS classes.
Hilarity?
Serdar Kocak hysterical is often used as “even more than hilarious”
@sirati97 he means hilarious
Lol take it easy
@@chrisbreidenstein3831 he meant hilarious
Dude, I feel so relaxed just looking at this guy. It’s a true inspiration, thank you. I need a beer now.
TLDR: "yield" in python is used in generators to loop through something and store the data lazily so when you call the function next it doesn't have to start at the beginning but rather picks up where it left off (often in a loop)
Ty!
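A tiny toy example of that pause-and-resume behaviour:

def countdown(n):
    # execution pauses at each yield and resumes right here on the next next()
    while n > 0:
        yield n
        n -= 1

g = countdown(3)
print(next(g))  # 3
print(next(g))  # 2 -- picks up where it left off
print(next(g))  # 1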
Laziness in Python is writing code without getting annoyed by ';' and understanding the beauty of a < x < y < b.
Lex Fridman
@@omri9325 yep, he described this feature of python as the most beautiful thing a programming language can ever have.
I dislike python but love a < b < c
It's genius.
I know I could look it up, but I'd love for you to tell me what that does. Does it just test all those conditions, i.e. returning true only if a is less than x, x is less than y, and y is less than b?
@@ZipplyZane See it in this way:
Given expression: a < x < y < b
Python chains comparisons: the expression is equivalent to (a < x) and (x < y) and (y < b), except that each operand is evaluated only once.
Evaluation short-circuits left to right, so if a < x is already False, the rest is never evaluated and the whole expression is False.
So yes: it returns True only if a < x, x < y, and y < b all hold.
Don't think of it in terms of stacks and ASTs. It is better to see it as a mathematical expression.
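A quick check with toy values:

a, x, y, b = 1, 2, 3, 4
print(a < x < y < b)                    # True
print((a < x) and (x < y) and (y < b))  # True -- equivalent, except the chained
                                        # form evaluates each operand only once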
This is honestly some of the most beautiful code I've ever seen. I've made many a sieve, but never a lazy sieve. I can't believe you can do this incredible calculation with only a few lines of code and an understanding of coroutines.
No you cannot multiply strings
python: print("haha string goes b" + "r"*10)
>haha string goes brrrrrrrrrr
I'm stealing that
I see, you have been on r/programmerhumor today.
alternatively
print('ha' * 2 + ' string go b' + 'r' * 10)
reddit
this is str*int, not str*str
Agreed. He is the most likable professor ever. And in keeping with the important lesson on laziness, I’m typing this lying down.
@1:30 he can say the word "really" without moving a single muscle.
This video is all about laziness, I guess
Laziness lol
This cracked me up :D
I hadn’t considered Nottingham as a potential university before but just looked at it now and will definitely be considering it as a first/second choice!
This is a perfect example of real life coding.
me: nooo, you can't define a recursive function without an exit condition!!
Altenkirch: haha, stack goes brrrrr.
haha.
We could have done
for i in range(n):
    yield i
@@MubashirullahD That's basically a circular definition. Range() needs to be defined as well
@@ThomasBomb45 I don't know what you mean by circular? :) I'm assuming everyone knows range is lazy in python 3.
@@MubashirullahD because his nats() function's output is exactly the same as range() except it is infinite. Rather than use range internally, he explicitly calls yield to demonstrate how it works
@@ThomasBomb45 # Range is not needed
i = 0
while True:
    i += 1
    yield i
I'VE WATCHED SO MANY VIDEOS FROM COMPUTERPHILE AND NEVER SEEN THE VOICE BEHIND THE CAMERA. FINALLY.
This is because with social distancing via Zoom (or similar) there is already a camera on the interviewer.
Pre-Covid it would be non-lazy to add a second camera pointed at the camera operator.
Or to put it another way, in a video conference everyone is a voice behind a camera, including the professor
Just wanted to say that I really enjoyed your short paper about the contribution of Martin Hofmann. Thanks!
I think it's between Altenkirch and Brailsford as my favorite overall Computerphile presenters in terms of raw depth-of-content. Pound is a close third.
- "WHEHRE IS THIS THING???"
- "ok ok yeah yeah imma do it"
Best lines ever
This is a good explanation about yield. With regard to Sudoku, all good Sudoku puzzles only have one solution.
I guess I'll be That Guy today:
"Now is better than never.
Although never is often better than *right* now."
The zen of python is never a bad thing to reference
So Zen! :-)
>>> import this
'now' == 'right now'. 'Right' is just stress with no real meaning.
Your sentence is basically 'Now is better than never. Never is better than now.' This can be reduced to False.
@@y.z.6517 I think you might've been staring at your IDE too long
This yield statement puzzles me so hard man
It's like "return" but it starts from that line the next time you run the function.
It just "pauses" at that point, waiting for you to call it again.
yield is multiple return
yield makes your function "G" a generator, which yields a sequence.
when you call next(g) it continues running until the next yield statement and the function next() returns the thing you yielded.
@@raingram I've literally never seen it that way. You're amazing. Thanks for your comment.
I get it now
@@raingram great explanation
The content is great and his warm dry humor makes me giggle.
"I wanted to talk about laziness, but I was to lazy to do talk about it"
Missed opportunity 🙃
You are not lazy, or you wouldn't have commented.
If you want a lazy natural number generator that doesn't blow the stack after a few iterations, do it manually:
class Nats:
    def __init__(self, n):
        self.n = n - 1
    def __next__(self):
        self.n += 1
        return self.n
    def __iter__(self):
        return self
Now you can use "Nats(2)" just like he used "nats(2)" in the video. Of course the sieve generator still reaches maximum recursion depth pretty quickly, but it gets way further with this Nats implementation.
Edit: I just noticed that "blowing the stack" isn't quite right here. Since it's python, it's all safe and heap allocated anyway, so it just hits the recursion limit.
You don't even need a class here:

def nats(n):
    while True:
        yield n
        n += 1

Or better yet (this one needs import sys):

def nats(s):
    yield from range(s, sys.maxsize)

And lastly:

s = range(2, sys.maxsize)
@@corynivens6510 Haha you're right. The while True solution is much cleaner. I don't like the sys.maxsize thing though. I know it's usually gonna be big enough, but theoretically Python ints can grow infinitely big (until your RAM is full), so that implementation is technically not equivalent to the other ones.
Lol I just looked through the itertools module and of course this iterator exists in the std lib! So here's the definite solution:
from itertools import count
(then use "count" instead of "nats")
The range-based versions simply stop at sys.maxsize rather than going on forever. The only caveat with s = range is that range doesn't support next(), so you'd technically need s = iter(range(...))
Ah man, the itertools solution really is the one isn’t it? itertools or nothing haha
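A tiny usage sketch for anyone wanting to try it:

from itertools import count, islice

nats = count(2)               # lazy and infinite, with no recursion involved
print(list(islice(nats, 5)))  # [2, 3, 4, 5, 6]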
Enjoyed this... He had a great sense of humour... Intellect with humility is the greatest... Thanks
I'm always happy to see a new episode with Prof. Altenkirch :>
Trying to show your homies a cool programming trick be like:
''...so now when I press this button... oh it doesn't work...yyym'''
For a moment I thought I didn't understand the sieve, but suddenly the professor corrects it. 😅 It's beautiful to show that it's ok to make mistakes
As soon as I see the title, I figured it would be about generators. Very useful things.
There is a fundamental difference between iterators and laziness: lazy evaluation memoizes results of subcomputations whereas iterators do not.
For example, the Haskell version of the infinite list of Fibonacci numbers can be defined as
fibs = 0:1:zipWith (+) fibs (tail fibs)
If you translate this definition to iterators you'd end up with something like the following:
def fibs():
    yield 0
    yield 1
    yield from (i + j for i, j in zip(fibs(), tail(fibs())))

def tail(it):   # couldn't find this in the standard library...
    next(it)    # drop the first element
    while True:
        n = next(it)
        yield n
However, the Python code above takes exponential time whereas the Haskell one takes linear time (ignoring the log factor for the increasing sizes of numbers). To get linear time using iterators you have to manually code the memoization (using a loop for simplicity):
def fast_fibs():
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b
@@mrdkaaa Lazy evaluation, a.k.a. call by need, requires memoization of results; otherwise it is simply call by name. For reference, check the chapter on graph reduction in SPJ's "The Implementation of Functional Programming Languages" (it's out of print but freely available from his web page).
I don't understand why Computerphile videos would have any dislikes. That said: NICE.
I know i am 39 minutes late but I have never been this early before. Seeing only 62 comments is so weird. And this is nicely timed as I was just researching about yield statements.
Doesn't this create a callstack that gets bigger and bigger with every iteration (like all non-tail-recursions) or am I missing something?
Aalex not quite, actually: generators, and iterators in general, hold a reference to the particular value they are currently on and move to the "next" one when next() is called
Yes. Also, python doesn't support tail recursion.
Edit: as pointed out, of course python supports tail recursion. :-) What I meant is that it doesn't support _tail call elimination / tail call optimization_.
Tried it with the exact code and for default settings python 3.7 it gave an "RecursionError: maximum recursion depth exceeded" after the prime 733.
@@noobinvestor3180 it will, the yield from is the issue here, it's basically recursion
@@chris8443 not yet :)) with the new PEG parser, we might get something nice
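A quick toy demonstration of the missing tail-call elimination:

def last(n):
    # the recursive call is in tail position, but CPython does not
    # eliminate tail calls, so the stack still grows with n
    return 0 if n == 0 else last(n - 1)

try:
    last(10 ** 6)
except RecursionError:
    print("no tail-call optimization in CPython")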
You can tell this guy knows what he's talking about here. He looks like an EXPERT in the art of lazy.
This is probably the shortest pythoniest-python program I have ever seen!
This method is paramount for feeding data / data augmentations into machine learning model training
I now finally see the big advantage of using generators!!! Great video!!
I could be wrong, but I am pretty sure this will blow the stack if you do this in python
i love his coffee mug
The Sieve code is interesting, but I really worry about the efficiency!
There are only two ways to generate the next prime here, and that is to either remember all previous primes, and iterate until you find one which is not a multiple of any of them (but only up to sqrt(n)), or you have to use standard sieve thinking and strike out at least a bunch of multiples each time you find a new prime. The second approach has much more useful Big-O properties.
The Sudoku solver otoh will work with constant memory, it should be more or less equal to a standard recursive solver with backtracking?
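For the curious, the strike-out-multiples approach can also be written lazily; a minimal sketch of an incremental sieve that crosses out multiples using a dict (an illustration, not the video's code):

from itertools import count, islice

def primes():
    composites = {}                # maps each known composite to a prime factor
    for n in count(2):
        if n not in composites:
            yield n                # n was never crossed out, so it is prime
            composites[n * n] = n  # the first multiple worth crossing out is n^2
        else:
            p = composites.pop(n)
            m = n + p
            while m in composites: # slide to the next free multiple of p
                m += p
            composites[m] = p

print(list(islice(primes(), 10)))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]

This remembers only one pending multiple per prime found, so memory stays proportional to the number of primes generated.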
"oh it's ... That's bad"
Story of my life.
"This is what being a programmer is actually like" tutorial
When you’re not programming you’re guarding the train in The Matrix. Top man!
Never knew about yield - yet I use Haskell and Python. Very helpful!
"Laziness is a virtue - well, in programming anyway!"
No need to specify that it's in programming. Without laziness we wouldn't have had a reason to domesticate animals, build societies, develop science and technologies, create machines, etc. Basically we'd still be in the stone age if our species wasn't lazy and constantly looking to make life easier for ourselves.
@Joseph Malone I disagree. By using an animal to plow your field (and later a machine) we decrease the work we need to do. Ideally we wouldn't be doing that job at all but at least until we get decent enough ai, not doing that job would mean we wouldn't get to eat. Therefore laziness is expressed as "optimization" rather than as completely not doing the required task because the task is essential for our survival. In the case of tasks that are less essential for our survival such as copying books, the laziness factor is more subtle as the motivation there is more about saving costs than not doing the original job. Although saving costs can be considered as part of laziness since it allows those involved to get the same thing for less effort. Essentially minimizing effort is inherently linked with and usually driven by laziness.
@Joseph Malone As a character's voiceline in R6 goes, "Efficiency is clever laziness".
@@beskamir5977 Explain why people play games? By playing games, we basically seek *more* work that is unnecessary.
@@beskamir5977 Our brain tells fun work apart from un-fun work. Hunting is inherently fun to our brain (or our ancestors would have died from laziness). When our ancestors started ploughing, they avoided the pain of starvation, but doing so also deprived them of the fun of hunting. Most modern work is boring, because our brains did not evolve to do it.
BTW, many women enjoy shopping, because shopping is basically paid gathering.
@@y.z.6517 I think you basically answered your own question. Laziness happens when the thing we need to do is boring. When it's fun laziness doesn't apply as much.
4:26 'nutz'
4:40 'yeet'
Got it
Prof. Thorsten is friggin awesome.
I see a lot of comments on how this implementation of the sieve is poor; that’s because it’s a simple example of a technique that can be useful. It’s not meant to be used exactly as shown, but it can do the job anyway. The sudoku solver shows a better use case.
Agreed. Some people just don't get the meaning of the video. Professor didn't say he is going to do 100% accurate implementation. He wanted to show what you can do with yield and how to be 'lazy'.
Laziness as in "waiting to evaluate an expression as late as possible"? Or laziness as in getting nothing done? xD
The latter is much easier to implement!
pull request let me do nothing all day
IKR! There are 2 types of laziness in programming.
laziness is me writing a for loop to make a list of alphabets or numbers instead of actually typing it out myself.
in a few years I'm gonna start making a neural network that can write half of the code for me lol
The shirts are always top shelf!!!!
Generators are great. Using recursion to generate a sequence of integers is horrible! It is very wasteful of both space and time. It will also blow out your stack. He clearly lives in the world of theory where performance does not matter at all.
Here is a better version.
def count(n):
    while True:
        yield n
        n = n + 1
Simple and efficient.
Recursion is a great technique when needed, but it should be avoided when it is not needed.
I tried to use something like your count with his "sieve" and it still blew out my stack.
I wrote in another comment a better version (in my opinion) of calculating primes.
I absolutely agree with your end statement that you should avoid recursion when possible
@@hodsinay6969 his sieve has the same problem. I didn't feel like rewriting his whole program on a tablet. I will try to find your comment and check it.
Try solving this without recursion: computing all the permutations of a sequence.
def permute(seq):
    "generator which yields successive permutations of the elements of seq."
    if len(seq) == 0:
        yield ()
    else:
        for i in range(0, len(seq)):
            for rest in permute(seq[:i] + seq[i + 1:]):
                yield (seq[i],) + rest
            #end for
        #end for
    #end if
#end permute
>>> list(permute((1, 2, 3, 4)))
[(1, 2, 3, 4), (1, 2, 4, 3), (1, 3, 2, 4), (1, 3, 4, 2), (1, 4, 2, 3), (1, 4, 3, 2), (2, 1, 3, 4), (2, 1, 4, 3), (2, 3, 1, 4), (2, 3, 4, 1), (2, 4, 1, 3), (2, 4, 3, 1), (3, 1, 2, 4), (3, 1, 4, 2), (3, 2, 1, 4), (3, 2, 4, 1), (3, 4, 1, 2), (3, 4, 2, 1), (4, 1, 2, 3), (4, 1, 3, 2), (4, 2, 1, 3), (4, 2, 3, 1), (4, 3, 1, 2), (4, 3, 2, 1)]
Also, don't reinvent the wheel. "itertools.count" already exists and does the same thing.
some languages are smart and flatten the recursion, maybe the professor is accustomed to those
Best language ever
oh no you're gonna start a war (btw c++ best (: )
@Pi Pony I was using smart pointers!
@Pi Pony tbh most of the time I don't even use smart one
@Pi Pony nope, I probably should but I'm currently learning haskell to get a sense of fp
@Pi Pony Functional programming. It's a paradigm; I'm just interested in it because it's so different from what I'm used to with C++ and the bit of Python and C# I know. Edit: I don't think Haskell can do more, I'm just learning it out of interest.
Now I get why developers love Python :)
Minor suggestion. I think he should have tried to show what happens at the end of his yield sudoku solver. Using "return" with "yield" has interesting behavior with a "StopIteration" exception.
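A toy generator showing that behaviour (the return value rides along on the StopIteration):

def solver():
    yield "partial solution"
    return "all done"       # in a generator, return ends the iteration...

g = solver()
print(next(g))              # partial solution
try:
    next(g)
except StopIteration as e:
    print(e.value)          # ...and its value shows up on the exception: all done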
This dude looks like the perfect guy to describe laziness
PEP8: no spaces before colons, and put spaces around your operators.
colons are not operators, no?
9:12
The "Three!" is perfect
If you use the `filter` predicate, it gets even more obvious that the sieve function precisely mimics the idea of layers of sieves (aka filters!).
def sieve(gen):
    i = next(gen)
    yield i
    yield from sieve(filter(lambda k: not is_multiple(k, i), gen))
This evaluates to
filter(lambda k: not is_multiple(k, N), ... filter(lambda k: not is_multiple(k, 5), filter(lambda k: not is_multiple(k, 3), filter(lambda k: not is_multiple(k, 2), gen))) ...)
where N is the nth prime.
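is_multiple isn't defined in the comment above; a complete, runnable version of the sketch, assuming the helper just checks divisibility:

from itertools import count, islice

def is_multiple(k, i):
    # hypothetical helper assumed by the sketch: is k a multiple of i?
    return k % i == 0

def sieve(gen):
    i = next(gen)
    yield i
    yield from sieve(filter(lambda k: not is_multiple(k, i), gen))

print(list(islice(sieve(count(2)), 10)))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]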
(Looks at my cat sleeping) Nature is "efficient", just turns out efficiency is comfortable sometimes. :)
Props for rocking the unicorn mug 🤩
Another way of understanding "yield from" is thinking that it "returns" a function but you didn't call it yet. (Useful concept to also understand function pointers / blocks / closures / lambdas). Kind of.
This video is absolute gold!
Never seen a man who speaks so well without moving their lower jaw
This mug.
I love this mug.
I want this mug.
The problem with this approach is that it hits the recursion limit pretty fast. I was able to sieve primes using a slightly improved version of this method up to about 3500. After that, yield from is just unviable
It's possible with mutable state instead of recursion
e.g.
def nats(n):
    while True:
        yield n
        n += 1
edit: fixed typo
That was the first thing I thought. It can (on my system) go to 130, but with 131 I get a RecursionError.
Do you mind sharing your code?
@@CecilWesterhof I did the same thing as in the comment above for generating natural numbers and the rest is exactly like described in the video. I'm not sure how to use mutable state for the sieve itself.
@@pafnutiytheartist That is strange. With that change it goes from 130 to 497. Not nearly the 3500 you talk about.
Laziness is the driving power of progress
Thorsten is a real CS scientist ... unlike the self-proclaimed, ludicrous CS folks or engineers fresh from a 6-month course... via a lifestyle of math/science.
8:19 This kind of naming always gets me. In Haskell, when doing `filter`, do you filter out the elements that make the predicate true, or do you keep them? (It takes a while to get used to it!) Similarly, do you sieve out the elements, or are you actually keeping them?
Same thing as other search filters
Always run a 2-line test code first to figure stuff like that out. This is even more important, if you dip in and out of several languages and libraries.
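In Python's case the two-line test settles it: filter keeps the elements for which the predicate is True.

evens = filter(lambda x: x % 2 == 0, range(10))
print(list(evens))  # [0, 2, 4, 6, 8] -- the matches are kept, not sieved out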
If computer scientists were Pokemon, this guy is the final evolution.
I don't know if it's for this video specifically, but the professor seems to bring about an air of laziness through his mannerisms as well 😂
Write a prime factoring routine and apply it looking for remainders of n (the number currently being tested).
You can't make a character this great
Just learned about this yesterday in my programming languages course
Wish I had this guy as a teacher.
That coffee cup is glorious.
Why is like he does'nt sleep enough ?
Because too lazy he isto sleep .
I'm sorry! I make grammatical errors too
Lazy to sleep
Lazy sleep
I haven't slept well in years
Laziness is a gateway drug to Haskell
Hah, a few weeks ago I got interested in prime numbers, and after some time I got my code to rival C in speed. The best I got was all primes under 10^11 in 93.6 s, which is blazing fast. I could still have made it even faster, but it gets very complex fast.
The following generator-based version of sieve() doesn't have stack overflow and is more efficient. Plus, IMHO, it explains what's going on in the algorithm more clearly; but it is a bit longer.

def sieve(n=2):
    prev, skip = [], False
    while True:
        for p in prev:
            if n % p == 0:
                skip = True
                break
        if not skip:
            yield n
            prev.append(n)
        n, skip = n + 1, False
You could also add in the fact that any composite number n must have a factor 2 <= f <= sqrt(n), so the inner loop only needs to check primes up to sqrt(n)
When calculating primes the way you said, Python will do it lazily, but it will probably run out of space in a matter of seconds; generators are not something you can just stack an infinite number of without repercussions, and neither are levels of recursion.
I suggest getting the primes lazily with something more like:

def get_primes():
    primes = set()
    i = 2
    while True:
        if all(i % p != 0 for p in primes):
            yield i
            primes.add(i)
        i += 1

This code does basically the same as the code in the video (even though there are probably better ways to calculate prime numbers).
I know that it shows the power of generators less, but it probably is a bad idea to use a generator that uses a generator that uses a generator...
As promised, I came over here to check it out. This is much better than his. It would be nice to not save all the primes you already found, but I can't think of anyway to avoid it.
@@johnbennett1465 there are a lot of ways to improve this algorithm for getting primes, for instance checking not all the primes but only the ones at most the square root of the number (and saving the primes in a list for that), or just doing trial division by all the numbers up to the square root, which in the long run is probably better.
I just wanted to implement the same algorithm as he did without using unneeded recursion.
@@hodsinay6969 you could make it a little more pythonic like this:

import itertools

def get_primes():
    primes = set()
    for i in itertools.count(2):
        if all(i % p != 0 for p in primes):
            yield i
            primes.add(i)

That also brings back the usage of generators: itertools.count is a generator that counts to infinity!
@@slash_me that's a great suggestion, I didn't know this generator before...
@@hodsinay6969 your suggestions do improve the speed. I was more worried about running out of memory, and they don't help with that. Admittedly you have to create a lot of primes before it is a problem. 🙂 I just don't like programs that keep eating up memory and try to avoid writing them when I can. Having to port my spell checker to a microprocessor with only 256 bytes (yes, bytes) of RAM may have scared me. 😅 Note that the code and compressed word list were in ROM.
haha I thought you wrote that wrong!! but I was deferring to your professorial nature!
I love your videos
the coolest thing i got from this is modularity. the rest is fascinating, i did not know yield, and recursion does my head in a bit, but i can use it. but modularity is such a level up for me in my coding. thank you for that my friend!!!
8:04 - 8:15 programming in a nutshell
Never formally taught, I feel like I have huge gaps in my academic knowledge, and I'm trying to become an engineer.
In the video when he goes "3+4 is... nothing", I was like, wait... I could have sworn that was 7, and then he realizes it was 7. Professor Altenkirch has the vibe of a lazy pothead (fitting, given the video is about laziness) who doesn't know anything, yet as you can see he knows programming well; it isn't technically hard, but many think it is hard/challenging, as is his mathematics. Would have loved to have him as my professor when I was getting my CS degree.
This guy exactly looks like someone qualified to say something on this topic.
Laziness is an undewwated notion. He could not have said it any lazier.
the fact this was recommended to me makes me feel personally attacked
It's a wonderful idea to handle large files.
I loved your shirt
Many people find it hard to understand Dynamic Programming intuitively. I tell them, Dynamic Programming is just a breadth-first search, lazy (on-demand), with results cached.
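A tiny sketch of that lazy-with-caching idea, using a hypothetical step-climbing example:

from functools import lru_cache

@lru_cache(maxsize=None)
def ways(n):
    # ways to climb n steps taking 1 or 2 at a time: each subproblem
    # is computed on demand, exactly once, then served from the cache
    if n <= 1:
        return 1
    return ways(n - 1) + ways(n - 2)

print(ways(30))  # 1346269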
when you find out about yield and generators in python in 2020...
Mine was " yield from "
i only listened to the first 40 seconds and i approve.
Did anyone check stackoverflow for a python question titled "Why doesn't my sieve function work?"
The nats generator shown is mathematically more elegant, but it will not render an infinite sequence, since Python will build up a call stack and you'll eventually receive a RecursionError.
itertools.count or a while loop would avoid that
This guy is one with the code
nats() seems very 'functional', but it contains state, and the function returns a different value with the same (no) arguments, which seems the opposite of what functional programming usually tries to do. Still learning...
0:15 is me anytime I get an email from my university admin
🤣🤣😩
I'd smoke weed with this dude
That's a nice compliment :)
For some reason he looks really competent to talk about laziness.
With recursion in generators does the call stack keep growing (especially in the case in the video) or does it optimize and keep only the relevant variable?