The fact that he walked in and did this demo from scratch while keeping the packed room engaged was incredible.
and without syntax highlighting and completion õ.o
His talk are amazing. I was most impressed with the vault talk. This guy is Tom Hanks on an island, but instead of Wilson, he got Python!
@@VietVuHunzter It actually sort of happened to him, he mentioned how he was consulting for a bank or something, and they gave him a laptop with no internet or IDE or any sort of tools, but there was Python and a basic text editor IIRC.
@@cheaterman49 th-cam.com/video/RZ4Sn-Y7AP8/w-d-xo.html
@@cheaterman49 Actually it's proof discovery in a legal process, patent lawsuit or something
This is a performance by someone who has achieved the highest level of their craft.
Facts
A fantastic, entertaining and highly educational talk. It always bothers me that I can't play the piano and talk at the same time (my wife usually asks me things while I'm playing). But David can even type concurrent Python code in Emacs in Allegro vivace speed and talk about it at the same time. An expert in concurrency in every sense of the word. How enviable!
+Christoph Zwerschke But when you're coding it's easier. He's just saying what he's doing in the code, which is not that hard to do.
+Christoph Zwerschke Practice pair programming and it'll become easier
The Jimi Hendrix of Python
Good one ! :))
Hendirx + Robin Williams.
Actually he's more of a Bob Ross of Python I think
His ability to explain his actions while programming fast and accurately, without any real tool support (not even syntax highlighting), is pretty much amazing.
watching this after 10years...
still relevant knowledge!!!
This guy is a legend.
Probably my all-time favourite conference talk :)
This presentation is legendary. Not only is David extraordinarily savvy on the inner workings of Python, but also I doubt that anybody can write code and present ideas AT THE SAME TIME! Sourcery!
dang.. I keep coming here every now and then ..such a brilliant talk !
Using coroutines in this way is an example of "cooperative multitasking"; if all your code yields the processor when it should, it can work quite well, but a single uncooperative code element can make a mess of things.
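The "one uncooperative task spoils it" point can be sketched in a few lines (illustrative names, not from the talk): each task is a generator that yields control voluntarily, and a round-robin scheduler resumes them in turn. A task that never yields would starve all the others.

```python
from collections import deque

log = []

def countdown(label, n):
    while n > 0:
        log.append((label, n))
        n -= 1
        yield  # give up the "processor" so other tasks can run

def run(tasks):
    ready = deque(tasks)
    while ready:
        task = ready.popleft()
        try:
            next(task)            # resume the task until its next yield
            ready.append(task)    # put it back in the ready queue
        except StopIteration:
            pass                  # task ran to completion

run([countdown('a', 3), countdown('b', 2)])
print(log)  # the two tasks interleave: a, b, a, b, a
```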
"You know, every great talk, is started with Socket Programming."
A lot of wisdom here. For me, it took three viewings to fully understand the concept of concurrency. Nice presentation with useful practice, thanks a lot.
This talk is still relevant even today! The best explanation of async programming in Python I've ever seen.
I would probably have to watch this 20 times over to really understand most (if not all) of his demo. It's like watching a Ninja hack away at the keyboard.
Thank you! This brilliant talk/live-code-session finally answered all my unanswered questions about concurrency and coroutines in Python. This is absolutely the most legendary live coding with concurrent explanation I've ever seen.
This talk made me move from Python to Golang. Thank you David! 🙂
Video quality isn't the best, he's writing everything from scratch... But that doesn't even matter!
His explanations are great and have the perfect balance between humor and education.
One of the best talks I've ever seen for real.
David Beazley's videos always make ideas so much clearer for me. Such a great teacher!
PyCon 2015, where I actually first learned coding in Python and started loving it!
This is so interesting, David is informative and hilarious at the same time. I didn't know python supports threading in this way! Awesome!
the best talk I've seen so far
This is a phenomenal performance. I learned a lot
I think he might actually be one of those mythical 10x developers lol.
I think that's a ridiculous myth -- that the 10x developer is a myth.
Every field has people that are incompetent, people that are average, and people that are exceptional.
The average people do important work. But exceptional folks are capable of generating solutions that are simply 10x better than what an "average person" would ever have come up with.
27:06, I've never seen a list (albeit empty) on the left side of an assignment statement. What does it do? The simpler `[] = 1` throws `TypeError: 'int' object is not iterable`. Oh, OK, at 42:17 it turns out it was a bug/typo, but somehow it still worked.
Btw, awesome talk. He's a great speaker.
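For anyone else puzzled by this: `[] = ...` is just ordinary sequence unpacking with zero targets, so it only checks that the right-hand side is an iterable that yields no items. A quick demonstration (my own snippet, not from the talk):

```python
# An empty list on the left is a sequence-unpacking pattern with zero
# targets: it succeeds iff the right side is an empty iterable.
[] = []        # fine: empty iterable, zero targets
[] = ()        # also fine
[] = iter('')  # any empty iterable works

try:
    [] = [1]   # iterable, but one item too many
except ValueError as e:
    print(e)   # e.g. too many values to unpack (expected 0)

try:
    [] = 1     # not iterable at all
except TypeError as e:
    print(e)   # e.g. cannot unpack non-iterable int object
```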
Amazing, the best and smoothest live coding demo I have ever seen!
David is Mozart of Python world.
Probably should've gained up the audio a bit. I have my volume maxed out to be able to hear him at a nominal level.
Amazing talk. for pythonistas, of course...
Commenting a bit late. Agree that this is a spectacular talk. Has anybody tried the multithreading code - the first example? I can never get 2 clients working at the same time. The server responds to only one client.
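In case it helps: a second client only gets served if accept() keeps running in a loop and every connection is handled in its own thread. Here is a minimal echo sketch of that thread-per-client pattern (my own reconstruction for illustration, not the talk's exact code; that is in the repo linked elsewhere in the comments):

```python
from socket import socket, AF_INET, SOCK_STREAM, SOL_SOCKET, SO_REUSEADDR
from threading import Thread

def handler(client):
    # Each client gets its own thread, so a slow client
    # doesn't block anyone else.
    with client:
        while True:
            data = client.recv(100)
            if not data:
                break
            client.send(data)  # echo the data back

def server(address):
    sock = socket(AF_INET, SOCK_STREAM)
    sock.setsockopt(SOL_SOCKET, SO_REUSEADDR, 1)
    sock.bind(address)
    sock.listen(5)
    while True:                        # keep accepting: this loop is what
        client, addr = sock.accept()   # lets a second client connect
        Thread(target=handler, args=(client,), daemon=True).start()

# server(('', 25000))  # uncomment to run
```

If the server only handles one client, the usual culprit is doing the recv/send loop directly in the accept loop instead of spawning a thread, so accept() never runs again until the first client disconnects.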
Dave is one of the best presenters of all time
Value unpacking works on nested lists as well:
>>> def returns_a_list_of_lists():
...     return 1, [2, 3, 4], [5, 6, 7]
...
>>> a, [b, c, d], [e, f, g] = returns_a_list_of_lists()
>>> a == 1
True
>>> g == 7
True
The third and last argument must have been an empty list.
If you are trying to code along and are getting "Address already in use" when you submit a job to the process pool, put `fib_server(('', 25000))` in server.py inside an `if __name__ == '__main__':` block.
Thanks a lot! I just got this error and deliberately searched in the comments if somebody already published the answer. Thanks!
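For context on why the guard helps (a sketch with a toy fib, not the talk's actual server): under the "spawn" start method, every worker process re-imports the module, so any top-level server start would run again in each worker and try to rebind the port. Everything under the guard runs only in the main process.

```python
from concurrent.futures import ProcessPoolExecutor

def fib(n):
    # CPU-bound toy function, standing in for the talk's fib
    return n if n < 2 else fib(n - 1) + fib(n - 2)

# Module level is fine: the pool starts workers lazily.
pool = ProcessPoolExecutor(2)

if __name__ == '__main__':
    # Only the main process reaches this block; spawned workers
    # re-import the module but skip it, so nothing here (like a
    # server binding port 25000) runs twice.
    print(pool.submit(fib, 10).result())  # 55
```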
Super simple and well explained! Excellent presentation!
Great talk, well that's what I call an experienced python programmer!
Awesome Talk. Impressive.
yield from was introduced in CPython 3.3.
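A tiny illustration of what it does (my own example): `yield from` delegates to a subgenerator and evaluates to its return value, which is what lets one coroutine "call" another in the talk.

```python
def subtask():
    yield 'step 1'
    yield 'step 2'
    return 'done'      # becomes the value of `yield from subtask()`

def task():
    # All of subtask's yields pass through transparently,
    # then its return value lands in `result`.
    result = yield from subtask()
    yield 'subtask said: ' + result

print(list(task()))  # ['step 1', 'step 2', 'subtask said: done']
```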
stackoverflow.com/questions/30147165/why-does-assigning-to-an-empty-list-e-g-raise-no-error if you want to know why 42:30 works
The only thing I learned from the video is that he's a legend.
7 years ago, omg. Now it's mainstream everywhere: NodeJS, Kotlin, Spring Reactor.
At 6:16, I'm wondering what he wrote there, where it is blocked by the inner video frame.
Aaron Hall Its Thread(target=fib_handler, args=(client,), daemon=True).start() :P
github.com/dabeaz/concurrencylive/blob/master/server.py
Ashwini Chaudhary I just realized I never thanked you. Thanks. :D
God bless you, man
Zero experience with sockets and concurrency in Python. I could not even have come up with the threaded performance monitor, let alone all the rest. Incredible.
GODLIKE!!!!
Great Talk! Learned a lot from it!
I had to watch it twice to understand what he is typing and saying at the same time. Lol. How can we become like him :-(
Would it be bad practice (or helpful?) to write fib() as C module which releases the GIL? How much would that add overhead?
No slides at speakerdeck and 404 for github page =(
Turns out that even in Python 2.7, you can do this: `[] = []`.
BUT `{} = {}` will not work... it says it's a SyntaxError: "can't assign to literal". What do you know?
Your future_wait will grow very fast because we don't clear it. I think it's better to do this:

def future_done(future):
    tasks.append(future_wait.pop(future))
    future_notify.send(b'x')
I feel sorry for the 5 people who 👎 this talk.
When I first had to implement a small distributed processing program in Python, I skipped threads and went right to the multiprocessing package, which actually creates a different process for each object you start. This behavior is much more on par with other languages and with what is expected from a concurrency perspective. After all, why would you implement concurrent workers if you can't take advantage of more processing power? You might as well write a script that works sequentially.
Douglas De Rizzo Meneghetti Because sometimes what you're doing doesn't require more processing power, just more processing time. This especially includes I/O work, where the processor gets tied up waiting for a response and is doing nothing in the meantime. Now imagine you had to make multiple I/O connections, each waiting on the other. In cases like these it is better to wait on all of them concurrently, and that's why it's important to understand the nature of your program: is it I/O bound or CPU bound? There are other tradeoffs to consider, and obviously, from watching this talk, it could be that neither multiprocessing nor threads are necessary or desirable in the end.
Even with GIL, multi thread is ok for IO-bound tasks.
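A quick way to see this (simulating the I/O wait with time.sleep, which releases the GIL just like a blocking recv would): ten 0.1-second waits overlap across threads and finish in roughly 0.1 to 0.2 seconds total, not 1 second.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_io(i):
    time.sleep(0.1)   # stands in for a blocking socket read; releases the GIL
    return i

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(fake_io, range(10)))
elapsed = time.perf_counter() - start
print(results, round(elapsed, 2))  # the waits overlap instead of adding up
```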
The De Niro & Pacino of Python.
Does anyone understand why he says "the load is essentially offloaded somewhere else, you're not fighting with the GIL or anything like that..." at 15:48? th-cam.com/video/MCs5OvhV9S4/w-d-xo.html
The GIL is still limiting the tasks that are sent to the pool of threads, right? Where are those tasks submitted otherwise?
I'm not sure I've got the point. It was a great talk and I think I've learned something new about threading and async I/O, but for the two problems being presented (1. the performance impact of the GIL, 2. the performance cliff due to Python prioritizing CPU-intensive jobs), was there a hinted solution to either? Especially for the second problem: all I learned from the video was that by combining async I/O with a process pool, we can achieve a performance cliff similar to the threading approach's. But the performance cliff is still there?
Right, his point was that there are no easy solutions to those problems and that asyncio doesn't make them go away.
I think the takeaway is that when there is a massive number of concurrent tasks, coroutines are the way to go; you can mix in multiprocessing to ensure all cores get used. POSIX threads are not going to scale for such requirements.
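That mix can be sketched like this (my own minimal reconstruction of the idea, not the talk's code): coroutines handle the concurrency, while CPU-heavy calls get shipped to a process pool via run_in_executor so they neither stall the event loop nor fight the GIL.

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor

def fib(n):
    # CPU-bound work: runs in a separate worker process
    return n if n < 2 else fib(n - 1) + fib(n - 2)

async def handle(pool, n):
    loop = asyncio.get_running_loop()
    # Awaiting the executor future keeps the event loop free
    # to service other coroutines while a worker crunches fib(n).
    return await loop.run_in_executor(pool, fib, n)

async def main():
    with ProcessPoolExecutor() as pool:
        # Many "clients" at once; the event loop stays responsive.
        results = await asyncio.gather(*(handle(pool, n) for n in (10, 20, 25)))
        print(results)
        return results

if __name__ == '__main__':
    asyncio.run(main())
```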
Awesome talk!
is there a screencast of this available like his talk "built-in super heroes" ?
David is talking about CPU-bound processes here, but will it work with processes that use the GPU, if we have only one video card and something that takes up almost all of the GPU's memory? Or is it better to wrap such a process with threading.Lock()?
Still not old in 2022.
Code: github.com/dabeaz/concurrencylive/
+Ashwini Chaudhary Thanks
Great talk
Where can his written code be found?
my mind got destroyed way before that
Frigging amazing!
What is the code line written to watch a thread at 6:16?
I have the same question. It works fine with ... args=(client,)).start()
Best talk !!!
Anyone tried this on windows and got some different behavior? (eg: the requests per second didn't drop when I asked the server to compute the fib number of 35 like in: @11:18)
On the other hand the perf1.py test is affected by the computation of the fib number of 35
where I can download code?
Teclado brought me here!!
It bothers me when people don't use syntax highlighting...
this guy pythons!
nice
TL;DR: wouldn't it be good if Python had a concurrency model like Node's?
What he said at 43:46. Thanks!
Why not `yield future`? And where does `yield from` come in? What does it evaluate to for a future?
This is called "intellectually"
Tom Hanks!
Amazing talk, I really enjoy it.
The only problem I have with it is that he and the Python community pretend NodeJS doesn't exist. Isn't non-blocking I/O + callbacks the whole idea of NodeJS?
Node cannot really handle multiple cores and so is not great; Erlang and Go are much better.
that's why some switched to go
Python docs have very poor asyncio documentation #python_improve_asyncio_docs
oh God! I don't know SHIT in python
Thanks david for amazing talk. For steps go to:
github.com/codeAshu/concurrency
This is pretty great, but just the fact that threads are still a "hot topic" in Python gives me pause... It's 2016.
Too complicated for me
goodbye GIL
Can't handle the spit noises arg
The demo setup is frustrating: a barely opened Emacs editor with a huge font, only 10 lines visible on the screen, constantly scrolling up and down.
Spaghetti code
This guy is the John Goodman of Python :D
Watching this in 2024, amazing talk 🫡👏👏