The problem with this approach is that it hits the recursion limit pretty fast. I was able to sieve primes to about 3500 using a slightly improved version of this method. Beyond that, yield from is just unviable
@@CecilWesterhof I did the same thing as in the comment above for generating natural numbers, and the rest is exactly as described in the video. I'm not sure how to use mutable state for the sieve itself.
This is honestly some of the most beautiful code I've ever seen. I've made many a sieve, but never a lazy sieve. I can't believe you can do this incredible calculation with only a few lines of code and an understanding of coroutines.
See my comment. This is an elegant but useless piece of code. Even if recursion weren't an issue, this is not using a sieve; it is painfully inefficient and almost laughably incompetent
Using "yield from" this way is a really poor idea in Python because it does not do tail-call optimization. In general, with Python, stick to iteration instead of recursion (I love recursion too, but it has its limits in Python); you can still have your beautiful laziness without the use of recursion.
There is a fundamental difference between iterators and laziness: lazy evaluation memoizes results of subcomputations whereas iterators do not. For example, the Haskell version of the infinite list of Fibonacci numbers can be defined as

fibs = 0:1:zipWith (+) fibs (tail fibs)

If you translate this definition to iterators you'd end up with something like the following:

def fibs():
    yield 0
    yield 1
    yield from (i+j for i,j in zip(fibs(), tail(fibs())))

def tail(it):
    # couldn't find this in the standard library...
    next(it)  # drop the first element
    while True:
        n = next(it)
        yield n

However, the Python code above takes exponential time whereas the Haskell one takes linear time (ignoring the log factor for the increasing sizes of numbers). To get linear time using iterators you have to manually code the memoization (using a loop for simplicity):

def fast_fibs():
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a+b
@@mrdkaaa Lazy evaluation, a.k.a. call by need, requires memoization of results; otherwise it is simply call by name. For reference, check the chapter on graph reduction in SPJ's "The Implementation of Functional Programming Languages" (it's out of print but freely available from his web page).
I don't know why this is recommended to me, but I'll watch it because I do love to learn new things and this is definitely a subject that I have no knowledge on.
Thorsten is a real computer scientist... unlike the self-proclaimed CS people or engineers fresh from a 6-month course... versus a lifetime of math/science.
Rewriting in C# for guys who don't use Python.

using System;
using System.Collections.Generic;

public static class Program
{
    public static void Main(string[] args)
    {
        var natural_numbers = sieve(nats(2).GetEnumerator()).GetEnumerator();
        for (int i = 0; i < 10; i++)
        {
            natural_numbers.MoveNext();
            Console.WriteLine(natural_numbers.Current);
        }
    }

    static IEnumerable<int> nats(int n)
    {
        while (true) yield return n++;
    }

    static IEnumerable<int> sieve(IEnumerator<int> s)
    {
        s.MoveNext();
        int p = s.Current;
        yield return p;
        foreach (int x in sieve(remove_multiples(s, p).GetEnumerator()))
            yield return x;
    }

    static IEnumerable<int> remove_multiples(IEnumerator<int> s, int p)
    {
        while (s.MoveNext())
            if (s.Current % p != 0)
                yield return s.Current;
    }
}
I wanna talk about laziness... you know. And with these intro-words all that could be said about people's laziness is said. :D ... now let's start with laziness in Python.
Hah, a few weeks ago I got interested in prime numbers, and after some time I got my code to rival C in speed. The best I managed was all primes under 10^11 in 93.6 s, which is blazing fast. I could still have made it even faster, but it gets very complex fast.
The nats generator shown is mathematically more elegant, but it will not render an infinite sequence, since Python will grow the call stack and you'll eventually receive a RecursionError. itertools.count or a while loop would avoid that
ebulating A lack of the ability to support threading doesn’t make writing Python more or less easy. It was just an oversight when developing Python that makes it an inferior language forever in the future.
8:19 This kind of naming always gets me. In Haskell, when doing `filter`, do you filter out the elements that make the predicate true, or do you keep them? (It takes a while to get used to it!) Similarly, do you sieve out the elements, or are you actually keeping them?
Always run a 2-line test code first to figure stuff like that out. This is even more important, if you dip in and out of several languages and libraries.
This guy looks like every computer scientist to ever exist, in the same body
the Avatar
I immediately thought of Brent Spiner's character in Independence Day.
@@JackVermicelli same!
🤣🤣🤣🤣🤣🤣🤣🤣🤣
Yesssss and talks like Dr Strangelove
Professor Altenkirch is hysterical. I wish I had him during my CS classes.
Hilarity?
Serdar Kocak hysterical is often used as “even more than hilarious”
@sirati97 he means hilarious
Lol take it easy
@@chrisbreidenstein3831 he meant hilarious
@1:30 he can say the word "really" without moving a single muscle.
This video is all about laziness, I guess
Laziness lol
This cracked me up :D
Agreed. He is the most likable professor ever. And in keeping with the important lesson on laziness, I’m typing this lying down.
I guess I'll be That Guy today:
"Now is better than never.
Although never is often better than *right* now."
The zen of python is never a bad thing to reference
So Zen! :-)
>>> import this
'now' == 'right now'. 'Right' is just a stress with no real meaning.
Your sentence is basically 'Now is better than never. Never is better than now.' This can be reduced to False.
@@y.z.6517 I think you might've been staring at your IDE too long
Trying to show your homies a cool programming trick be like:
''...so now when I press this button... oh it doesn't work...yyym'''
You can tell this guy knows what he's talking about here. He looks like an EXPERT in the art of lazy.
"Laziness is a virtue - well, in programming anyway!"
No need to specify that it's in programming. Without laziness we wouldn't have had a reason to domesticate animals, build societies, develop science and technologies, create machines, etc. Basically we'd still be in the stone age if our species wasn't lazy and constantly looking to make life easier for ourselves.
@Joseph Malone I disagree. By using an animal to plow your field (and later a machine) we decrease the work we need to do. Ideally we wouldn't be doing that job at all but at least until we get decent enough ai, not doing that job would mean we wouldn't get to eat. Therefore laziness is expressed as "optimization" rather than as completely not doing the required task because the task is essential for our survival. In the case of tasks that are less essential for our survival such as copying books, the laziness factor is more subtle as the motivation there is more about saving costs than not doing the original job. Although saving costs can be considered as part of laziness since it allows those involved to get the same thing for less effort. Essentially minimizing effort is inherently linked with and usually driven by laziness.
@Joseph Malone As a character's voiceline in R6 goes, "Efficiency is clever laziness".
@@beskamir5977 Explain why people play games? By playing games, we basically seek *more* work that are unnecessary.
@@beskamir5977 Our brain distinguishes fun work from un-fun work. Hunting is inherently fun to our brain (or our ancestors would have died from laziness). When our ancestors started ploughing, they avoided the pain of starvation, but doing so also deprived them of the fun of hunting. Most modern work is boring, because our brains did not evolve to do it.
BTW, many women enjoy shopping, because shopping is basically paid gathering.
@@y.z.6517 I think you basically answered your own question. Laziness happens when the thing we need to do is boring. When it's fun laziness doesn't apply as much.
For a moment I thought I didn't understand the sieve, but then the professor suddenly corrects it.😅 It's beautiful to show that it's ok to make mistakes
8:04 - 8:15 programming in a nutshell
The shirts are always top shelf!!!!
In the video when he goes "3+4 is... nothing", I was like, wait... I could have sworn that was 7, and then he realizes it was 7. Professor Altenkirch has that vibe of a lazy pothead (fitting, given the video is about laziness) who doesn't know anything, yet as you can see he knows programming well, along with his mathematics; it isn't technically hard, but it's something many think is hard/challenging. Would have loved to have him as my professor when I was getting my CS degree.
If you use the `filter` predicate, it gets even more obvious that the sieve function precisely mimics the idea of layers of sieves (aka filters!).
def sieve(gen):
    i = next(gen)
    yield i
    yield from sieve(filter(lambda k: not is_multiple(k, i), gen))
This evaluates to
filter(lambda k: not is_multiple(k, N), ... filter(lambda k: not is_multiple(k, 5), filter(lambda k: not is_multiple(k, 3), filter(lambda k: not is_multiple(k, 2), gen))) ...)
where N is the nth prime.
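For anyone who wants to run it, here's a self-contained sketch; `is_multiple` and `nats` are my own fill-ins, not from the original comment:

```python
def is_multiple(k, i):
    # assumed helper: True when k is divisible by i
    return k % i == 0

def nats(n):
    # infinite counter, iterative to keep the stack shallow
    while True:
        yield n
        n += 1

def sieve(gen):
    i = next(gen)
    yield i
    yield from sieve(filter(lambda k: not is_multiple(k, i), gen))

primes = sieve(nats(2))
print([next(primes) for _ in range(10)])  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

Each recursive call captures its own `i`, so the usual lambda late-binding pitfall doesn't apply here.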
This dude looks like the perfect guy to describe laziness
the fact this was reccomended to me makes me feel personally attacked
I don't know if it for this video specifically, but professor seems to bring about an air of laziness through his mannerisms as well😂
This guy is the nuclear scientist being bossed around after giving bad news to a movie supervillain.
Nice unicorn mug
haha I thought you wrote that wrong!! but I was deferring to your professorial nature!
I love your videos
the coolest thing i got from this is modularity. the rest is fascinating, i did not know yield, and recursion does my head in a bit, but i can use it. but modularity is such a level up for me in my coding,. thank you for that my friend!!!
I need that mug.
I like how human he is XD
now I vill be talking like this guy prrofèsor for the r-est of the day
Cheers to all the folks that were annoyed by i%n==0 but didn't immediately know why.
I liked the mug.
11:09
I don't think you need global here. Global is only needed if you're changing a reference variable in the outer scope, which you're not doing when simply mutating a list.
so cool!
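Right — a small demo of the distinction (variable names made up):

```python
nums = [1, 2, 3]

def mutate():
    # mutation: the name `nums` is only read, so no `global` needed
    nums.append(4)

def rebind():
    # rebinding: without `global`, this assignment would create a local variable
    global nums
    nums = nums + [5]

mutate()
print(nums)  # [1, 2, 3, 4]
rebind()
print(nums)  # [1, 2, 3, 4, 5]
```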
RecursionError: maximum recursion depth exceeded
that's what I was expecting too, but I tried looping over this on CPython 3.7... and it seems to actually keep going quite a bit past the expected 1000 iterations. I didn't wait to see how much further it would go, but I was surprised. Maybe it discards the outer iterator because the "yield from" is the last instruction before an implied return? I don't get it
edit: nevermind, I think I did something wrong when I tried it before (Maybe I mistakenly called my own nats function inside Altenkirch's rather than recursing?), and this time got RecursionError.
@@Twisted_Code appreciate your effort.
really deep.
The most lazy thing you can do in python is being verbose.
This guy really learned to control his eyebrows 😳11s almost completely gone!
You can't be lazy in Germany or your citizenship will be taken
That's why he moved to the UK.
@@caw25sha I see 😂
I was wondering why they were using i%n == 0, I thought you just had to define the list in terms of what you're removing
I haven't yet, but one day I'll find a lazy use case. maybe.
This professor would make an awesome movie bad guy.
4:10
“3+4 is... nothing. Oh, it’s 7. Okay, that’s good”
I don’t know why this had me laughing so hard
That made me laugh too. Like a smart professor struggling with a gui interface
Bacon u making things boring
My face hurts from laughing at that.
same with me.
and then he messed up trying to call nats, not originally expecting it to return an iterator: "that was not very clever". I got a kick out of that; it reminded me I'm not the only one that occasionally forgets these things. Thanks for leaving that one in @Computerphile
I don't know about coding but I am kind of an expert in Laziness.
brambedkar59 and you’re not alone
Indeed. Not to brag, but I'm one of the foremost experts in the field.
if IFeelLikeIt:
yield work
To be fair a lot of coding is about laziness, so you probably have an aptitude of it.
@@nobodyofconsequence6522 this is... surprisingly true. Especially when you get to the point of trying to make your program more efficient. You're really just trying to get it to do as much with as little time as possible, and sometimes that may be through laziness of not handling something that's out of the scope of the original program design. Basically saying "not my problem; I'm just going to do what I was told to do, nothing more". And, surprisingly, limiting the scope like this actually makes future programming more modular. In other words, it works.
Ok, since I am at it, here is another comment: no, the sieve of Eratosthenes is not an efficient method to compute primes; I just used it as a way to illustrate computing with infinite data structures. It has been known for a while that PRIMES is in RP, that is, randomised polynomial time. Some years ago it was even shown to be in P, i.e. you don't even need a die.
Professor, could you then kindly suggest a more efficient method of code, to find primes?
The sieve is efficient. But you implemented trial division.
@@jens6398 I don't know what you mean by "trial division" but I have implemented the sieve.
@@ThorstenAltenkirch No. There is *never* any divisibility test in the sieve. Instead there is elimination of multiples, as you actually explained in the beginning of the video.
To quote Wikipedia: Primes can also be produced by iteratively sieving out the composites through divisibility testing by sequential primes, one prime at a time. It is not the sieve of Eratosthenes but is often confused with it, even though the sieve of Eratosthenes directly generates the composites instead of testing for them. Trial division has worse theoretical complexity than that of the sieve of Eratosthenes in generating ranges of primes.
I hope TH-cam won't completely mangle my code here. I grabbed a convenient library that provides a `primes` in Haskell and set out to calculate, on the one hand, the count of times that a number is crossed out when finding all the primes less than some given n using the sieve of Eratosthenes:
> sieved n = sum . map (\p -> length . takeWhile ( length . takeWhile (\p -> c `mod` p /= 0) . takeWhile (
No you cannot multiply strings
python: print("haha string goes b" + "r"*10)
>haha string goes brrrrrrrrrr
I'm stealing that
I see, you have been on r/programmerhumor today.
alternatively
print('ha' * 2 + 'string go b' + 'r' * 10)
reddit
this is str*int, not str*str
me: nooo, you can't define a recursive function without an exit condition!!
Altenkirch: haha, stack goes brrrrr.
haha.
We could have done
for i in range(n):
    yield i
@@MubashirullahD That's basically a circular definition. Range() needs to be defined as well
@@ThomasBomb45 I don't know what you mean by circular? :) I'm assuming everyone knows range is lazy in python 3.
@@MubashirullahD because his nats() function's output is exactly the same as range() except it is infinite. Rather than use range internally, he explicitly calls yield to demonstrate how it works
@@ThomasBomb45 # range is not needed
i = 0
while True:
    yield i
    i += 1
TLDR: "yield" in Python turns a function into a generator, which produces its values lazily: when you ask for the next value, execution doesn't start from the beginning but picks up where it left off (often inside a loop)
Ty!
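A tiny demo of "picks up where it left off" (made-up example):

```python
def squares(n):
    for i in range(n):
        yield i * i    # pause here; resume from this point next time

g = squares(4)
print(next(g))  # 0
print(next(g))  # 1 (resumed inside the loop, didn't restart)
print(list(g))  # [4, 9] (whatever was left)
```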
"I wanted to talk about laziness, but I was too lazy to talk about it"
Missed opportunity 🙃
You are not lazy, or you wouldn't have commented.
This is a perfect example of real life coding.
- "WHERE IS THIS THING???"
- "ok ok yeah yeah imma do it"
Best lines ever
Dude, I feel so relaxed just looking at this guy. It’s a true inspiration, thank you. I need a beer now.
This yield statement puzzles me so hard man
It's like "return" but it starts from that line the next time you run the function.
It just "pauses" at that point, waiting for you to call it again.
yield is multiple return.
yield makes your function a generator; calling it gives you a generator object g, which yields a sequence.
When you call next(g), the function continues running until the next yield statement, and next() returns the thing you yielded.
@@raingram I've literally never seen it that way. You're amazing. Thanks for your comment.
I get it now
@@raingram great explanation
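In code, what raingram describes looks like this (names made up):

```python
def g():
    print("running up to the first yield")
    yield "a"
    print("resumed, running to the second yield")
    yield "b"

gen = g()          # nothing runs yet
print(next(gen))   # prints the first message, then "a"
print(next(gen))   # prints the second message, then "b"
# one more next(gen) would raise StopIteration
```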
Doesn't this create a callstack that gets bigger and bigger with every iteration (like all non-tail-recursions) or am I missing something?
Aalex Not quite, actually... generators and iterators in general hold a reference to the particular value they are currently on and move to the "next" one when next() is called
Yes. Also, python doesn't support tail recursion.
Edit: as pointed out, of course python supports tail recursion. :-) What I meant is that it doesn't support _tail call elimination / tail call optimization_.
Tried it with the exact code and for default settings python 3.7 it gave an "RecursionError: maximum recursion depth exceeded" after the prime 733.
@@noobinvestor3180 it will, the yield from is the issue here, it's basically recursion
@@chris8443 not yet :)) with the new PEG parser, we might get something nice
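To see both points concretely, here's a sketch in the spirit of the video's nats (the recursive version is my reconstruction, not the exact code):

```python
def nats_recursive(n):
    # each element adds one more delegating generator frame
    yield n
    yield from nats_recursive(n + 1)

def nats_iterative(n):
    # constant depth, no matter how far you go
    while True:
        yield n
        n += 1

g = nats_recursive(0)
try:
    for _ in range(100_000):
        next(g)
except RecursionError:
    print("recursive version hit the recursion limit")

g = nats_iterative(0)
for _ in range(100_000):
    next(g)
print("iterative version sailed past 100000 elements")
```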
i love his coffee mug
Generators are great. Using recursion to generate a sequence of integers is horrible! It is very wasteful of both space and time. It will also blow out your stack. He clearly lives in the world of theory where performance does not matter at all.
Here is a better version.
def count(n):
    while True:
        yield n
        n = n + 1
Simple and efficient.
Recursion is a great technique when needed, but it should be avoided when it is not needed.
I tried to use something like your count with his "sieve" and it still blew out my stack.
I wrote in another comment a better version (in my opinion) of calculating primes.
I absolutely agree with your closing statement that you should avoid recursion when possible
@@hodsinay6969 his sieve has the same problem. I didn't feel like rewriting his whole program on a tablet. I will try to find your comment and check it.
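FWIW, here's the kind of flat version being discussed — lazy trial division with no recursion at all (my sketch, not the comment referenced above):

```python
from itertools import islice

def primes():
    found = []          # primes seen so far
    n = 2
    while True:
        # trial-divide only by primes up to sqrt(n)
        if all(n % p for p in found if p * p <= n):
            found.append(n)
            yield n
        n += 1

print(list(islice(primes(), 10)))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

Constant stack depth, so it never hits the recursion limit; memory grows only with the list of primes found.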
Try solving this without recursion: computing all the permutations of a sequence.
def permute(seq):
    "generator which yields successive permutations of the elements of seq."
    if len(seq) == 0:
        yield ()
    else:
        for i in range(0, len(seq)):
            for rest in permute(seq[:i] + seq[i + 1:]):
                yield (seq[i],) + rest
>>> list(permute((1, 2, 3, 4)))
[(1, 2, 3, 4), (1, 2, 4, 3), (1, 3, 2, 4), (1, 3, 4, 2), (1, 4, 2, 3), (1, 4, 3, 2), (2, 1, 3, 4), (2, 1, 4, 3), (2, 3, 1, 4), (2, 3, 4, 1), (2, 4, 1, 3), (2, 4, 3, 1), (3, 1, 2, 4), (3, 1, 4, 2), (3, 2, 1, 4), (3, 2, 4, 1), (3, 4, 1, 2), (3, 4, 2, 1), (4, 1, 2, 3), (4, 1, 3, 2), (4, 2, 1, 3), (4, 2, 3, 1), (4, 3, 1, 2), (4, 3, 2, 1)]
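Fair point, though this one is also in the standard library: `itertools.permutations` yields the same tuples in the same order for this input, lazily and without hand-rolled recursion:

```python
from itertools import permutations, islice

# first few permutations, computed on demand
print(list(islice(permutations((1, 2, 3, 4)), 3)))
# [(1, 2, 3, 4), (1, 2, 4, 3), (1, 3, 2, 4)]
```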
Also, don't reinvent the wheel. "itertools.count" already exists and does the same thing.
some languages are smart and flatten the recursion, maybe the professor is accustomed to those
Laziness as in "waiting to evaluate an expression as late as possible"? Or laziness as in getting nothing done? xD
The latter is much easier to implement!
pull request let me do nothing all day
IKR! There are 2 types of laziness in programming.
"oh it's ... That's bad"
Story of my life.
"This is what being a programmer is actually like" tutorial
The content is great and his warm dry humor makes me giggle.
I hadn’t considered Nottingham as a potential university before but just looked at it now and will definitely be considering it as a first/second choice!
(Looks at my cat sleeping) Nature is "efficient", just turns out efficiency is comfortable sometimes. :)
when you find out about yield and generators in python in 2020...
Mine was " yield from "
Now I get why developers love Python :)
This guy looks like the twins from matrix
Which one?
@@Gabriel64468 the other twin
laziness is me writing a for loop to make a list of letters or numbers instead of actually typing it out myself.
in a few years I'm gonna start making a neural network that can write half of the code for me lol
I know i am 39 minutes late but I have never been this early before. Seeing only 62 comments is so weird. And this is nicely timed as I was just researching about yield statements.
Why is like he does'nt sleep enough ?
Because too lazy he isto sleep .
I'm sorry! I make grammatical errors too
Lazy to sleep
Lazy sleep
I haven't slept well in years
Laziness in Python is writing code without getting annoyed by ';' and understanding the beauty of a < x < y < b.
Lex Fridman
@@omri9325 yep, he described this feature of python as the most beautiful thing a programming language can ever have.
I dislike python but love a < b < c
It's genius.
I know I could look it up, but I'd love for you to tell me what that does. Does it just test all those conditions, i.e. returning true only if a is less than x, x is less than y, and y is less than b?
@@ZipplyZane See it this way:
Python reads the expression a < x < y < b as (a < x) and (x < y) and (y < b), except that each operand (a, x, y, b) is evaluated at most once.
It also short-circuits: as soon as one comparison is False, the rest are skipped.
Don't think of it in terms of stacks and ASTs. It reads exactly like the mathematical expression.
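A quick way to convince yourself of the "evaluated at most once, with short-circuiting" behaviour (the `val` helper is just for tracing):

```python
calls = []

def val(name, v):
    # Record each evaluation so we can see what gets skipped.
    calls.append(name)
    return v

# 1 < 5 < 3 < 9: the chain fails at 5 < 3, so b is never evaluated.
result = val("a", 1) < val("x", 5) < val("y", 3) < val("b", 9)
print(result)  # False
print(calls)   # ['a', 'x', 'y'] -- 'b' was short-circuited away
```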
I think it's between Altenkirch and Brailsford as my favorite overall Computerphile presenters in terms of raw depth-of-content. Pound is a close third.
I see a lot of comments on how this implementation of the sieve is poor; that’s because it’s a simple example of a technique that can be useful. It’s not meant to be used exactly as shown, but it can do the job anyway. The sudoku solver shows a better use case.
Agreed. Some people just don't get the point of the video. The professor never said he was going to write a 100% accurate implementation. He wanted to show what you can do with yield and how to be 'lazy'.
I'd smoke weed with this dude
That's a nice compliment :)
I'm always happy to see a new episode with Prof. Altenkirch :>
The Sieve code is interesting, but I really worry about the efficiency!
There are only two ways to generate the next prime here, and that is to either remember all previous primes, and iterate until you find one which is not a multiple of any of them (but only up to sqrt(n)), or you have to use standard sieve thinking and strike out at least a bunch of multiples each time you find a new prime. The second approach has much more useful Big-O properties.
The Sudoku solver otoh will work with constant memory, it should be more or less equal to a standard recursive solver with backtracking?
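For reference, the second approach can be kept lazy too. Here's a sketch of the well-known incremental sieve (the idea popularized by O'Neill's paper "The Genuine Sieve of Eratosthenes"), written as a Python generator; it strikes out multiples via a dict instead of trial-dividing:

```python
from itertools import islice

def primes():
    # composites maps each upcoming composite number to the
    # list of primes that "strike it out".
    composites = {}
    n = 2
    while True:
        if n not in composites:
            yield n                      # n is prime
            composites[n * n] = [n]      # first multiple worth recording
        else:
            # n is composite: push each witness prime forward
            # to the next multiple it will strike out.
            for p in composites.pop(n):
                composites.setdefault(n + p, []).append(p)
        n += 1

print(list(islice(primes(), 10)))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

The dict only ever holds one pending multiple per prime found so far, so memory stays proportional to the number of primes below sqrt of the current candidate's square, not to the candidate itself.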
0:15 is me anytime I get an email from my university admin
🤣🤣😩
If you have to learn some really unusual syntax before you can do this, then does that still count as lazy? 🤔
It's not that unusual... Just generator expressions, "yield from" and recursion. IMO, assignment expressions and the "generator.send" protocol are much weirder.
- Yes, it does. That’s kind of the hidden joke in the definition of ‘laziness’: that you can spend much longer looking for the shortcut than it would have taken to go the long way round.
The payoff is that you can use the same shortcut (or programming trick) multiple times (eg for serving multiple users) or for more complex problems (where the long way round rapidly becomes impractical) in the future.
The problem with this approach is that it hits Python's recursion limit pretty fast. I was able to sieve primes using a slightly improved version of this method up to about 3500. After that, yield from is just unviable.
It's possible with mutable state instead of recursion
e.g.
def nats(n):
    while True:
        yield n
        n += 1
edit: fixed typo
That was the first thing I thought. It can (on my system) go to 130, but with 131 I get a RecursionError.
Do you mind sharing your code?
@@CecilWesterhof I did the same thing as in the comment above for generating natural numbers and the rest is exactly like described in the video. I'm not sure how to use mutable state for the sieve itself.
@@pafnutiytheartist That is strange. With that change it goes from 130 to 497. Not nearly the 3500 you talk about.
This is honestly some of the most beautiful code I've ever seen. I've written many a sieve, but never a lazy sieve. I can't believe you can do this incredible calculation with only a few lines of code and an understanding of coroutines.
As lazy as I am, I put def nats(n=2)
Off by 1 errors are the worst in Boolean expressions.
Or in this case, off by Shift-1 ? 😋
I've seen it hit a recursion error at prime 733. Is there a way to make it efficient? Otherwise it's an elegant but useless piece of code.
See my comment. This is an elegant but useless piece of code. Even if recursion wasn't an issue this is not using a sieve and is painfully inefficient and almost laughably incompetent
Coy McBob can you describe how you would implement the sieve without testing divisibility?
Enjoyed this... He had a great sense of humour... Intellect with humility is the greatest... Thanks
Did anyone check stackoverflow for a python question titled "Why doesn't my sieve function work?"
Using "yield from" this way is a really poor idea in Python because it does not support tail recursion. In general with Python stick to iteration instead of recursion (I love recursion too but it has its limits in Python), you can still have your beautiful laziness without the use of recursion.
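As a sketch of that advice applied to this example (my own loop-based variant, not the code from the video): the same lazy stream of primes, but with a plain loop and a growing list instead of nested `yield from` frames, so no recursion limit applies:

```python
from itertools import islice

def primes():
    # Iterative and lazy: remember the primes found so far and
    # trial-divide each candidate, but only by primes up to sqrt(n).
    found = []
    n = 2
    while True:
        if all(n % p for p in found if p * p <= n):
            found.append(n)
            yield n
        n += 1

print(list(islice(primes(), 10)))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```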
Best language ever
oh no you're gonna start a war (btw c++ best (: )
@@alexrobertson35 c++ segfault incoming
@@pipony8939 I was using smart pointers!
@@alexrobertson35 smart? why not genius pointers instead?
@@pipony8939 tbh most of the time I don't even use smart one
The guy from the video is basically laziness personified
This is a good explanation about yield. With regard to Sudoku, all good Sudoku puzzles only have one solution.
Just wanted to say that I really enjoyed your short paper about the contribution of Martin Hofmann. Thanks!
This is probably the shortest pythoniest-python program I have ever seen!
There is a fundamental difference between iterators and laziness: lazy evaluation memoizes results of subcomputations whereas iterators do not.
For example, the Haskell version of the infinite list of Fibbonacci numbers can be defined as
fibs = 0:1:zipWith (+) fibs (tail fibs)
If you translate this definition to iterators you'd end up with something like the following:
def fibs():
    yield 0
    yield 1
    yield from (i + j for i, j in zip(fibs(), tail(fibs())))

def tail(it):  # couldn't find this in the standard library...
    next(it)  # drop the first element
    while True:
        n = next(it)
        yield n
However, the Python code above takes exponential time whereas the Haskell one takes linear time (ignoring the log factor for the increasing sizes of numbers). To get linear time using iterators you have to manually code the memoization (using a loop for simplicity):
def fast_fibs():
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b
@@mrdkaaa Lazy evaluation, a.k.a. call by need, requires memoization of results; otherwise it is simply call by name. For reference, check the chapter on graph reduction in SPJ's "The Implementation of Functional Programming Languages" (it's out of print but freely available from his web page).
He never told us what came after 35....
I don't know why this is recommended to me, but I'll watch it because I do love to learn new things and this is definitely a subject that I have no knowledge on.
It's a sign Sara. Join the computer science field
m_CompSciStudents++;
Starting with this kind of video will only scare you)))
It's great that my phone at least recommends me something like this. I suggest you watch the lambda calculus videos, they are real fun!
Programming is actually very easy and fun, just start putting your free hours in it and you'll become a lazy programmer in no time!
PEP8: no spaces before colons, and put spaces around your operators.
colons are not operators, no?
When you’re not programming you’re guarding the train in The Matrix. Top man!
Thorsten is a real CS scientist ... unlike the self-proclaimed CS people or engineers fresh from a 6-month course ... vs a lifestyle of math/science.
No! What comes after 35 is in fact a state secret! Don't say it! :P
Rewriting it in C# for those who don't use Python.
using System;
using System.Collections.Generic;
public static class Program
{
public static void Main(string[] args)
{
var natural_numbers = sieve(nats(2).GetEnumerator()).GetEnumerator();
for (int i = 0; i
never, ever taught. I feel like I have huge gaps in my academic knowledge and I'm trying to become an engineer
idk why those spaces before colons make me uncomfortable ¯\_(ツ)_/¯
_this post is made by PEP8 gang_
So I saved this video to a playlist but I won't watch it until I really need to😴😴😴
I wanna talk about laziness... you know.
And with these intro-words all that could be said about people's laziness is said. :D ... now let's start with laziness in Python.
Hah, a few weeks ago I got interested in prime numbers, and after some time I got my code to rival C in speed. The best I managed was all primes under 10^11 in 93.6s, which is blazingly fast. I could still have made it even faster, but it gets very complex fast.
The nats generator shown is mathematically more elegant, but it will not render an infinite sequence, since Python builds up a call stack and you'll eventually get a RecursionError.
itertools.count would avoid that or a while loop
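For the record, `itertools.count` is exactly the non-recursive `nats`:

```python
from itertools import count, islice

nats = count(2)  # 2, 3, 4, ... lazily, no recursion involved
print(list(islice(nats, 5)))  # [2, 3, 4, 5, 6]
```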
screams in pep8
The ultimate laziness in Python: the global lock. Who needs threads, amirite?
wenuriteurite
ebulating A lack of the ability to support threading doesn’t make writing Python more or less easy. It was just an oversight when developing Python that makes it an inferior language forever in the future.
This guys looks like the Hollywood type cast for a computer scientist/ hacker / computer guy
This guy looks like the computer hacker guy employed by the villain in every 80s action movie
8:19 This kind of naming always gets me. In Haskell, when doing `filter`, do you filter out the elements that make the predicate true, or do you keep them? (It takes a while to get used to it!) Similarly, do you sieve out the elements, or are you actually keeping them?
Same thing as other search filters
Always run a 2-line test code first to figure stuff like that out. This is even more important, if you dip in and out of several languages and libraries.
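In both Haskell and Python, `filter` keeps the elements for which the predicate is true; the two-line test is literally:

```python
# filter(pred, xs) KEEPS the elements where pred is true.
print(list(filter(lambda n: n % 2 == 0, range(10))))  # [0, 2, 4, 6, 8]
```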
I could be wrong, but I am pretty sure this will blow the stack if you do this in python