I’m just going to stand here for 45 minutes and live hand-code a proof-of-concept parser generator… Twice.
That’s got to be one of the most ridiculously cool live demos I’ve ever seen. Plus, the guy’s genuinely funny too. 😁
He didn't code two parser generators though, just two parsers. Cool project nonetheless :)
I'm a simple man. If I see David Beazley, I click.
Me too! Love his talks; wish he held courses down under.
@edwinhe2865 Chicago is not too far #sarcasm :) He has a compiler course coming up. It's cold there, but there are a lot of pubs near his studio.
Me too. I've been searching around for all of Beazley's talk videos.
Dabeaz - a steady stream of excellent talks!
What tool is being used to execute python code merged into source code?
Seems like Emacs, something like this: github.com/millejoh/emacs-ipython-notebook/wiki/Screenshots
This hilariously turns the code into Lisp. Quite informative about why that language is the way it is: it barely requires a parser!
The compilers course is a must for anyone interested in computer science.
I'm in a CS PhD program and have never had a class on compilers. It was never offered in my undergrad and I can't take it at my graduate school, because it wouldn't be covered by my tuition waiver.
@jackeown the book to read is Structure and Interpretation of Computer Programs. It teaches you Scheme, and in five chapters you use it to build a complete compiler for a new language. It's also called the Wizard Book, and the class it's used in is widely considered one of the greatest computer science courses.
Great introduction.
Excellent talk.
What editor is he using???
He's an emacs user, generally speaking.
Good talk. I'm not sure if you redeemed yourself from the abhorrent docstring with SLY. This could be more of a lateral move. :D
I like the version of the dragon book 🤣
What if I'm one of those who did not cry when confronted with that book but read it twice? 🤣
What's this f'blah blah {template_thing}' trick? That's bad ass. Where do I get it?
Oh, it's Python 3.6. Cool. I must have missed that memo.
@modusponens1094 Yeah, Python appears to be taking lessons from Perl. There are now three ways to interpret a string.
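For anyone counting, the three string-formatting styles mentioned above can be compared side by side (the f-string requires Python 3.6+; the values here are just made up for illustration):

```python
name, count = 'SLY', 3

# 1. printf-style %-formatting (oldest)
assert '%s has %d tokens' % (name, count) == 'SLY has 3 tokens'

# 2. str.format (Python 2.6+)
assert '{} has {} tokens'.format(name, count) == 'SLY has 3 tokens'

# 3. f-strings (Python 3.6+): expressions are evaluated inline
assert f'{name} has {count} tokens' == 'SLY has 3 tokens'
```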
Btw, you can set the rules of a self-made lexer generator to resolve conflicts by pattern precedence. Mine converts the patterns to a DFA and scans for the longest match; if a DFA state accepts more than one token, it picks the token that was defined first.
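A minimal sketch of that conflict rule (longest match wins, and ties go to the first-defined pattern), using plain `re` matching per pattern rather than a real DFA; the token names are illustrative:

```python
import re

# Token patterns in definition order; earlier entries win ties.
TOKENS = [
    ('IF',     r'if'),
    ('ID',     r'[A-Za-z_][A-Za-z_0-9]*'),
    ('NUMBER', r'\d+'),
    ('WS',     r'\s+'),
]

def tokenize(text):
    pos = 0
    while pos < len(text):
        best = None  # ((length, -definition_index), name, lexeme)
        for index, (name, pattern) in enumerate(TOKENS):
            m = re.match(pattern, text[pos:])
            if m:
                # Prefer the longest match; on equal length, the
                # pattern defined first (smaller index) wins.
                cand = (len(m.group()), -index)
                if best is None or cand > best[0]:
                    best = (cand, name, m.group())
        if best is None:
            raise SyntaxError(f'illegal character {text[pos]!r}')
        _, name, lexeme = best
        if name != 'WS':          # skip whitespace tokens
            yield (name, lexeme)
        pos += len(lexeme)

# 'if' ties between IF and ID at length 2: IF is defined first, so IF wins.
# 'ifx' matches ID at length 3, which beats IF's length-2 match.
print(list(tokenize('if ifx 42')))
# -> [('IF', 'if'), ('ID', 'ifx'), ('NUMBER', '42')]
```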
Good talk. The question I have is why do we still separate tokenization from parsing? So many languages have context sensitivity and this practice makes that very difficult to handle.
The second question I have is, does SLY support stream-based parsing or does it only work on complete documents?
I am not David and your question is 2 years old but anyways...
PLY and SLY use the LALR approach, from my understanding, as they are based on yacc, and LALR parsers usually target context-free grammars. That said, PLY can hold multiple grammars at the same time and you can switch between them on the fly, kind of like nested grammars, which gives you more freedom despite not being formally context-sensitive.
I don't know about SLY, but PLY did not support stream inputs, and I would assume the same for SLY; you can always take user input and convert it into strings. I can't think of a scenario where parsing a live stream of characters would be that critical and necessary.
This is awesome but also cursed python. 35:30
At 32:49, why doesn't the code raise a NameError?
@Daniel Duong whitespace and tabs are ignored.
Actually a guy asked the same question at the end of the talk, but I didn't really understand the answer.
It does raise the error...
If you're using a metaclass, you get the class dictionary before the class body is executed in it. He's probably using a custom mapping that defines a `__missing__` method, which will be triggered whenever a name is looked up that isn't defined in the scope. You can use "magic" mappings in metaclasses to basically arbitrarily redefine the semantics of the language inside of a class body, so long as it's valid Python syntax (NameErrors happen at runtime, not during parsing)
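A minimal sketch of that mechanism (the names `AutoNames`, `Meta`, and `Grammar` are illustrative, not SLY's actual internals): `__prepare__` hands the class body a dict subclass whose `__missing__` intercepts lookups of undefined names, so `NUMBER` below never raises a NameError.

```python
class AutoNames(dict):
    """Class-body namespace that defines unknown names on first use."""
    def __missing__(self, key):
        if key.startswith('__'):      # let dunders (e.g. __name__) fall
            raise KeyError(key)       # through to globals/builtins
        self[key] = f'<token {key}>'  # auto-define the name
        return self[key]

class Meta(type):
    @classmethod
    def __prepare__(mcs, name, bases, **kwargs):
        return AutoNames()            # used as the class-body namespace

    def __new__(mcs, name, bases, ns, **kwargs):
        return super().__new__(mcs, name, bases, dict(ns))

class Grammar(metaclass=Meta):
    rule = NUMBER   # NUMBER is undefined, yet no NameError is raised

print(Grammar.rule)   # -> <token NUMBER>
```

Inside the class body, `LOAD_NAME` consults the `__prepare__` mapping first, so `__missing__` fires before Python ever falls back to globals; that's why the lookup succeeds at class-definition time.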
It is because the ignored characters are " \t": a space and a tab.
-3-4 does equal one in some languages
So it does -(3-4)?
At 10:00, good.
First? Really? Wow!
pamdemonia
Nooooooooo! I missed my chance!