I think Perl was simply displaced by Python and Bash (as most non-compact systems have moved to Bash from POSIX sh since the peak of Perl's growth). Python became preferred for the heavy scripting end because it is easier to read, and thus maintain, for subsequent users, even though Perl may have some advantage for the original authoring. Python (with all the extensions) also has more uses and users; simple popularity.
He says you should teach associative arrays to undergrad students. In C, how do you use associative arrays? Should we implement them from scratch each time? In Python, for instance, we have dictionaries. What is the preferred way to use hashmaps in C?
@35:00 sparse matrices??? Sure, I've had to deal with these all my life in computational physics. But I also do research these days in Clifford algebra, and this talk struck me suddenly: why not use Clifford algebra structures instead of sparse matrix algebra? I'll bet this could simplify a lot of code and produce faster run times. (The mathematical basis is sound: all matrices can be recast into Clifford algebra, but in a Clifford algebra you do not need all the zeroes for a sparse matrix, instead you are using just basis vectors, and combining them as multivectors.)
Why don't you give it a shot? Perhaps you can write the AMPL killer using a translator which converts computational physics or linear programming problems into Clifford algebras and solve them more efficiently. My contact with large sparse matrices is limited to some work (mostly at Uni, but a bit at work) on control systems and dealing with zeros and poles, so that might be another possible target niche for a solver...
I have a BSCS, class of 2000. After OOP, all they ever said was document, document, document. Upon graduation and getting into industry, I determined that the last thing a programmer would EVER do is document their code. It's nothing more than digging your own grave. Once management got the juice, you would be fired with no warning. Then they hoped your "budget" replacement could maintain your code. Ha! The more obscure and abstruse your code, the better off you were!
It is probably not correct to classify C# as a "look-alike language". The basic syntax of expressions and control structures is almost the same as in C, which makes the language easy to learn. But this is only the surface. Under the hood, however, I see more similarities with Delphi than with C++ or C: the same value/reference semantics as Delphi, and GUI event handling similar to Borland's VCL.
Scheme... bad memories for me. I studied CS at the University of Montreal with the creator of Scheme. We had to bootstrap Scheme in Scheme as an assignment. By far the worst course I took, and the only course I ever dropped. And I did Simula in Programming 101 and 201.
I came here just to listen to Brian Kernighan... so it came as a nice surprise to see the video was from that YouTube favourite, Uni. Nottingham, and to see the talk introduced by Prof. Brailsford. I like how he goes over some shortcomings of C and they're all problems with the standard library... not the language itself... "Because I invented it... it's perfect" ;) Now, I must admit, I'm an anti-Java bigot... but "Strongly hyped" particularly tickled me. As for functional languages in "the real world"... you gotta remember that, despite its silly name, JavaScript is LISP dressed up as C... and all the best JavaScript programmers (HELLO!) are functional programmers... well sort-of... not quite... but sort-of... y'know.
I confess I don't quite see how JavaScript uses S-expressions, although modern JavaScript has been gaining a lot of functional-inspired functionality, and it is true that Brendan Eich wanted to use Scheme to script up Mosaic, but was ordered to write a language which looked more like Java (Netscape had signed an agreement with Sun to push Java on the Web). In particular, JavaScript does not share the "original sin" of functional languages of not having any operator precedence at all, which is what really puts me off using SML, Haskell, Mythryl and a bunch of other languages I have had contact with.
@@tiagorodrigues3730 Huh, Haskell has operator precedence; you can confirm this yourself by typing :i (+) into ghci (the REPL):

Prelude> :i (+)
...
infixl 6 +

Even languages like OCaml have a hierarchy of precedence, though with a fixed set of operators. As for the original post: a Lisp with the key ideas and macro power removed isn't really Lisp; you lose a lot of the ability to extend the language nicely. Further, languages like JS stomp over many good language-design ideas that existed in many, many Lisps.
Perl 5 currently has a yearly release cycle (with new features every release, while maintaining backward compatibility), and Perl 6 has a release planned for Christmas this year (though frankly it is awesome already).

$ perl6 -e 'say ([+] 1..10**10_000).chars; say now - INIT now'
20000
0.3792978

That is, it adds all of the numbers in the range from 1 to 10¹⁰⁰⁰⁰ in less than half a second, without truncation to a floating-point number (OK, it actually cheats, but this is the way you would write it if you weren't going to calculate it from the end-points yourself). Plus Perl 6 has features that no other language has, like a built-in grammar engine and auto-threading, and a super-simple way of adding new operators, just as you would add a new subroutine (almost all "normal" operators in the Rakudo implementation are actually created that way). So the crack at 57:32 about Perl missing the boat seems quite misinformed.
+Brad Gilbert But if it had done *some* of that 5 years ago, it's more likely that more programmers would have tried it and evolved along with it. Many Perl programmers have since moved on, and it's uncertain they'll ever return.
+Brad Gilbert Perhaps you weren't paying attention, but he did mention he still uses Perl and likes it. His final comments weren't about whether a language is good or not, but about whether it can thrive and attract new people. Perl, like Java, is not going away, that's for sure. But it did "reinvent" itself too late, and its userbase will gradually die off, bar some legacy systems that depend on it.
+Brad Gilbert Perl's insufficiently C-like syntax has hurt its adoption among the masses of folks whose first language in college had a very C-like syntax and who are not looking to learn new languages, especially if the culture where they work does not include using Perl for anything. Perl had associative arrays before that was a common language feature, but simple operator overloading has been in many languages for a long time (and is commonly railed against for hurting code readability). Several languages I've seen (none of them mainstream, highly popular languages) come with support for grammar engines, so you are being a bit hyperbolic about "no other language" having these features. I'll have to look at "auto-threading" in Perl 6, but Cilk does some very slick things with threading automatically, and so do the async/await keywords in C#. However, I'm still looking for languages that actually supply really good high-level concurrent/parallel programming constructs as part of the syntax of the language rather than as a bolt-on API (e.g., goroutines in the Go programming language are high-level syntax).
...or you could, you know, do the sum of the series in a more appropriate way and get your result in a handful of microseconds, most of which are wasted on IO.
Great talk, but a comment on the editing. As much as I like looking at Professor Kernighan's pretty face, it would be helpful to leave his slides up long enough to be able to read them.
I have immense respect and admiration for Brian Kernighan and I love his work, but.... Comic Sans? Really?
Yes, it looks childish and ugly. But it's also one of the more readable typefaces for dyslexics, who sometimes have difficulty telling letters apart in other typefaces. I'm sure it is a coincidence, but Comic Sans has at least *something* going for it.
Not at all to me. Whenever I write C, I run something like "man strtok" a lot, because the details of those standard functions are hard to remember and easy to forget, especially if you are not purely a C programmer. If you compare that to Python, for instance, you hardly have to remember anything. For example, here is a Python solution to the problem in the talk: cat file | python -c 'import sys; print("".join([y for y in sys.stdin.readlines() if float(y.split()[2])>6]))' - as you can see, all methods take just a single parameter.
The fact that someone added subtitles to this video is incredible
Thank you for making this talk available to the public.
+Rhys Yorke You're welcome, glad you enjoyed it.
Man... this guy is just incredible. He has almost a legendary status in the history of computer science, yet he is so humble and so down to earth. His presentation skills are just awesome: I didn't lose focus even once during the whole video, which happens every 30 seconds when the profs at my uni try to teach. He explained everything in a very clear and concise way, no BS at all. No trying to sound smart, no cryptic explanations, no ego at all. I wish lecturers at my uni were a bit more like him...
He is great because he admits it when he doesn't know.
Almost?
absolutely legendary status
He is a scientist. I've worked with elite scientists and they are almost all like that. What happens with this chamber we are part of (programmers) is that we have way too many people who are not scientifically driven and have gigantic egos. They try to blame and judge all the time, and they keep fighting over useless crap such as:
"No comments on the code"
"Correct indentation style"
"I am better than you"
"Industry standards"
"X is Better than Z even though it is exactly the same thing in runtime"
"X style is better than Z style"
"I am good, you are bad"
"Hurr durr VIM is better than everything else"
"I use a Macbook and VIM thus I am a superior programmer"
etc, etc, etc...
The problem with him is that he is too biased against Pascal.
I literally owe my career to Mr. Kernighan. As a young programmer writing PL/1 and COBOL code on IBM MVS platforms, I discovered "The C Programming Language" in the library in 1984 and checked it out as a curiosity... I'd never heard of the language. The terseness of the language and the elegance of his examples in the book blew me away! I knew I had to learn this language, even though I really liked PL/1, so I bought the only C compiler available for a PC at the time (Whitesmiths C Compiler) and taught myself the language. A year later I was working in Unix and C for a major aerospace company and never looked back. Thank you Mr. Kernighan... my career was a blast because of you and Dennis Ritchie, et al.
His lecture is electrifying, and at 70-something he puts many younger guys to shame. A legend, yet so humble and not condescending to his audience like current conference speakers. Top guy.
13:54 I've noticed something that could be a pattern. Many legendary scientists seem to be humble in general and are not afraid to admit that their knowledge is limited.
On the other hand, there are many people that are very arrogant, in contrast to what they have achieved (i.e. not much).
I wouldn't expect anything less from such a legend...
Thank you U.o.Nottingham for sharing this lecture.
And thank you Prof. Kernighan, for C and Unix.
Totally agree.
The geniuses understand, perhaps more than anything else, how little they know.
The jerks on the other hand believe they own a field when they've made some minor contribution (and very often even before then).
Ya, but how do you deal with the paradox?
I.e. learning requires the humility of acknowledging the "known unknowns", but the culture still requires "known knowns" as a prerequisite for jobs.
How is the person doing the hiring supposed to know the difference without being a guru himself?
Human societies... ugh, much harder than computers.
And then there's Linus Torvalds.
This hypothesis is almost certainly confirmation bias IMO.
I've noticed this in other fields as well, even including such non-hard sciences as music. How do some of the most accomplished get that way?
By learning, which you can't do if you think you know it all already. And by practicing what you preach to find flaws to improve on. So whether it's correlation or causation maybe doesn't even matter that much: humility and expertise are often found together.
I found that I asked the same question in a different place a year later, or similar ID :)
I could listen to Prof Kernighan speak all day long. What a clear thinker
I like to imagine that this humble man uses Comic Sans in his code editor as well. Great talk and thanks for C!
He probably uses vim, so I don't think so 😅... You can use it in vim of course, if you don't mind messing with various dot files 😊
I still picture him reclining in the chair describing how Unix works.
The man who introduced him taught me language design.
Prof Brailsford is another giant in Computing Science.
At 25:07 you have the most concise example of an associative array. Period. Thank you Dr. Kernighan for this precious gem.
Stop it.
Pretty sure BK hates Grovelling.
:-)
B.K.?! Hands down. Couldn't agree more that YouTube is a real treasure island in our time. Thank you
CS == Computer Science? No
CS == C#? No.
CS == Comic Sans? Yes!
*lol* exactly
Should have used true and false instead of yes and no
he's a legend
You all hate Comic Sans just because hating it is trendy
Someday, I would love to possess 1/10th the knowledge and humility that Brian Kernighan has.
Another reason I would add that (at least partly) helps some languages thrive is marketing. Most of the best-known languages of the new millennium (e.g. C#, Go, Swift, Rust) were developed or backed by major companies (Microsoft, Google, Apple and Mozilla respectively). The most successful language is not always the one with the best specs, but the one with the best hype.
All hail Rust.
@@aidanprattewart : I've looked at Rust. It's not as bad as C++, but it has still retained some of the flaws of C (symbols modify types!) and C++ (ugly and, when I last looked, slightly cryptic metaprogramming).
60172 languages
It is always a pleasure to hear Brian Kernighan speak
Amazing talk. Learned so much about the evolution of languages from the master himself. I'm now inspired to read about a bunch of languages that I've not explored 😀
Sincere huge thanks to the transcriber(s) of this video.
This is one of the best presentations I've seen in a while. A simple explanation for a complicated topic. I think I'm going to learn a new programming language... Thanks
That's how you do an intro, people. 5 seconds and get the star on stage.
Comic Sans improves ALL presentations!
Smug Anime Girl ikr normally I’m the first to point out that abomination of a font but this guy makes me forget all about it lol
@@vlc-cosplayer same
Stanford lectures use Chalkboard. I've learned to get over it.
I write a lot of Python scripts in Linux, and 9 times out of 10, if I need to do some kind of data handling for files, I just have the Python script run an awk command. Python is no slouch at file processing, but an awk one-liner can usually do the trick :)
Sure, Brian Kernighan is great, but have you ever seen better camera work and speaker/slide transition timing in a university lecture? Incredible!
This man IS a true legend..
Always a pleasure to hear Brian speak. Thanks for posting!
I would say that today a language is more defined by the existing tools you can use with it than by the language itself.
To be precise: 06:11 - this is not the 1961 CACM cover, because TMG and PL/I had not been invented yet in 1961. This is the tower from Sammet's 1969 book. The January 1961 CACM cover did feature a reference to the Tower of Babel; however, it listed slightly fewer languages.
Thank you for sharing this. That was wonderfully enlightening.
A legend. You gave us so much, sir. I learned C because a lot of the modern languages I use (Java, C#) use its syntax, and I thought learning C and C++ was a way to honour the creation of C, even if I don't use them at work; just to honour the great work. Thank you sir. My job and life today are the result of all that fine work you did with Ritchie :-)
I bought a wallet from the University of Nottingham and it fell apart within a year. This talk was so good that now I feel I owe you money.
seye69 he already has your change in his backpocket. Can you hear it?
Did you *really* need the user instructions to specify that it was not actually intended either for knotting *or* for containing ham?
I once made a web server in C, and everyone seemed to be arguing that there would be a threshold beyond which their interpreted language would run faster. I even tried to hammer the site, and there was NO point at which ASP was a better choice (lol). There is no limit where ASP, or any other backend, will surpass C.
I vaguely recall watching this two years ago, now with some more general programming experience I find it much more informative, very good stuff.
I even used AMPL about 6 months ago; super useful for supply chain optimization, but the documentation is... haphazard. There's plenty of it, but you must dig through hundreds of pages of tutorial-style docs spread across two dozen PDF files. It really has no solid "man page"-style syntax summary, doubly so for the data syntax.
The 64 dislikes are all pure functional programming programmers.
Unix, C and AWK are parts of my life. I love you!
Very nice talk. Arguably, everything a programmer does is design languages and interpreters thereof. We write functions (interpreters) that take data structures (source code) and do something with them. Then you name all that stuff, and there you have a (programming/declarative/configuration) language. Often it is not worthwhile or helpful to define your own parser taking textual source code to obtain your syntax tree, since there are now many standard ways to represent the kind of labeled graphs that all programs are, either textually or as nested data structures in any language (HTML with extra attributes, nested associative arrays [JSON/YAML], S-expressions...). In fact, please don't define your own parser for key-value data like so many Unix programs did in the past, just because it seems so simple.
I have used AMPL in MIT edx course on supply chain management. I used it for modelling linear programming to find optimization values using constraints.
This guy knows stuff... and shows how to age gracefully.
He is old, but gold!
58:12 - "But you wonder whether it [C++] has passed beyond some threshold of complexity that's beyond mortals." 😂
Not sure about mortals, but certainly tolerance.
I miss lectures.
I would have shouted Perl at the first question, because I'd be done writing and running it before most would get their boilerplate done. Though it would take 5 minutes to write it pretty and make it obvious how it worked for any later maintainer. But for problems like the one-shot mentioned above, it's a really simple text split and numeric compare on a field, and you're done. Maybe 3 lines if you were being super explicit.
I love C and use it daily. C++: well, some of it is good, but trying to use ever more complex constructs and bizarre template syntax (the template engine itself is Turing-complete without running any code at all, as Damian Conway demonstrated) is a dead end; adding complexity to cure complexity.
But for a simple one-shot task? In most cases developer time costs more than computer time, and Perl is better for those. Parsing that out in C versus using a language meant for parsing is not a win.
Ah, so (on viewing further) our hero says AWK, and that's a great choice, if you know AWK. I know Perl, so: horses for courses. And make no mistake, this guy is a hero.
Under 5k rows I would just use Excel and turn on filters. 90 seconds at most
I love troff and I'm happy to see it mentioned in this video. I make videos on troff on my channel, for those of you who are interested.
Is it just me, or does anyone else feel his voice seems much younger than his face?
Meera Vinod a sign of health at his age. I love it
I was one of the people who had to do systems programming in Pascal. Back in the 1980s I worked for NCR doing application programming on automated teller machines (ATMs). Even though it was application programming (business logic), it had many systems aspects, and ATM programming was not trivial. Intel did have a suitable language for their real-time iRMX OS (called PL/M, I think) that was used to write device drivers and managers. But it was considered too hard for application developers, and so Intel's Pascal/86 was mandated (I think this was a re-branded version of Pascal/MT).
This was an extended version of Pascal designed to be used as a "real" language. It had support for include files, separate source code modules, overlays, variable-length arrays (only when passing parameters between different modules), and assembly language support, though I never used that.
It worked surprisingly well, but I would not want to do that again...
Little known fact: the original Macintosh OS (pre MacOS X)was written in Pascal with a liberal amount of 68000 assembly.
Apollo workstations, which were market leaders in the mid-80s, wrote their Domain OS mostly in Pascal. Having been involved in a project that began on VAX VMS, I saw real issues with Pascal. Source had to be manipulated between systems, and dialect differences with missing extensions made it unsuitable for large, long-running programs. Each data structure type needed its own allocation/deallocation routines, and unlike C you couldn't repurpose blocks of bytes in various sizes except on the stack. The strong types forced massive code duplication too, further bloating executables.
From what I've seen, Pascal advocates have tended to use a particular extended variant and not had concerns about porting to newer hardware and OSes, while C ports were facilitated by practical standard language features like the pre-processor.
C is a very complicated language: not so much the language itself, but its coupling with the OS. Take POSIX compliant vs. POSIX certified, vs. Windows certified, or even DOS; compliance with C89, C99, C17, etc.
Pascal with a few (small) extensions/modifications worked fine as a systems programming language. So did Algol. Bliss was definitely used as a systems programming language. So was Mesa. So was PL/I (but that one was probably too complicated). Cobol with a few extensions has been used to implement a (fast!) Cobol compiler so it could probably also have been used as a systems programming language.
@John Q. Bebtelovimab This was strange. I assume he was talking about ANSI Pascal, and I assume he wasn't talking about ANSI C?
Pascal's key issue was different sized arrays being different data types in and of themselves, preventing modular coding.
Why a language succeeds: All those reasons, plus the creator needs to have a beard.
Bjarne doesn't have a beard.
@@harmonymoyo4420 And that's why C++ is so incredibly ugly, bloated and inelegant.
@wlod nat That was probably due to the proximity to the Bell Labs boys at the time; code energy was just flowing out of those rooms.
I've never noticed Brian's name on my copy of the AMPL book. Quite a surprise.
I am not a computer scientist, but I worked with computers. I used his UNIX C book.
Wonderful video, thank you so much.
At 28:50 on SWIFT, precisely right. Same with JS frameworks.
I know this will apply to nobody else, but he talks and jokes JUST like my band director, and that makes me like him that much more lmao
Is that comic sans? Legend!
Fantastic talk :)
The Wizard of Computer Science
"strongly-hyped" XD
It was worth a chuckle :)
I think he meant "strongly typed".
Tarek Ali no he actually means hyped 😉
Whatever it is, both are true. And I like to think that he put in the “hyped” on purpose.
@@tarekali7064 wrongly typed
I don't know how I managed to miss this event!
11:26 "People in the 70s discovered that you could program in the command interpreter."
Microsoft in 2009: "I think now I'm starting to get it..."
Fascinating. Thank you.
Good talk, thanks!
Right on the spot, Java: Strongly-hyped
What a great presentation! :)
PopQuiz: What is the dollar amount in precious change that Brian is playing with in his pocketses?
256 cents, handed out by the Knuth?
"good programming langauge shouldn't make you think differently. It should just let you express what you want." -- me
I think Pearl was simply displaced by Python and Bash (most non-compact systems have moved from POSIX sh to Bash since the peak growth of Pearl). Python became preferred for the heavy scripting end because it is easier to read, and thus maintain for subsequent users, even though Pearl may have some advantage for the original authoring. Python (with all the extensions) also has more uses and users: simple popularity.
The fact that you don't know how to spell Perl makes me seriously doubt the factual accuracy of this comment.
PERL!!!!!!!!!!
He says you should teach associative arrays to undergrad. students. In C, how do you use assoc. arrays? Should we implement it each time from scratch? In Python, for instance, we have dictionaries. What is the preferred way to use hashmaps in C?
Amen to his comments about Swift... I got so tired of the radical changes that broke code that they'd make to the language with EVERY new version.
Strongly-hyped: Java
Incredibly interesting.
This guy carries 1.5x speed by himself so I don't have to use it😂
I used bash, sed and awk to handle my references for generation of bibtex for latex documentation when doing my thesis.
Are Brian's slides available?
Hi Max, good suggestion. We've just pasted a Dropbox link into the description of the video to download them.
Awesome! Thank you so much for taking the time :)
@@uniofnottingham 2 years later the link is dead. Could you please make the slides available again?
@@rolandlemmers6462 Thanks for flagging. We've re-uploaded them.
Good call on the swift thing
I wonder just how much the students appreciate that they're being lectured to by an absolute computing legend.
The lecture is great, Kernighan simply awesome, but the sound is difficult.
@35:00 sparse matrices??? Sure, I've had to deal with these all my life in computational physics. But I also do research these days in Clifford algebra, and this talk struck me suddenly: why not use Clifford algebra structures instead of sparse matrix algebra? I'll bet this could simplify a lot of code and produce faster run times. (The mathematical basis is sound: all matrices can be recast into Clifford algebra, but in a Clifford algebra you do not need all the zeroes for a sparse matrix, instead you are using just basis vectors, and combining them as multivectors.)
Why don't you give it a shot? Perhaps you can write the AMPL killer using a translator which converts computational physics or linear programming problems into Clifford algebras and solve them more efficiently.
My contact with large sparse matrices is limited to some work (mostly at Uni, but a bit at work) on control systems and dealing with zeros and poles, so that might be another possible target niche for a solver...
Thankfully the whiteboard notes will only ever be read by machines, so the fact that they're effectively flashcards shouldn't matter too much.
Simon Peyton Jones uses Comic Sans...
Brian Kernighan uses Comic Sans...
If you want to design a language, use Comic Sans.
I wrote this exact program in Java.
Ocaml is multi-paradigm, not only functional
I have a BSCS, class of 2000.
After OOP, all they ever said was document, document, document.
Upon graduation and getting into industry, I determined that the last thing a programmer would EVER do is document their code.
It's nothing more than digging your own grave.
Once management got the juice, you would be fired with no warning.
Then they hoped your "budget" replacement could maintain your code.
Ha!
The more obscure and abstruse your code, the better off you were!
Interesting he didn't mention Smalltalk on that list.
Editing the audio to filter background noise would be great, but hey, it's not that big a deal.
It is probably not correct to classify C# as a "look-alike language". The basic syntax of expressions and control structures is almost the same as in C, which makes the language easy to learn. But that is only the surface. Under the hood, I see more similarities with Delphi than with C++ and C: the same value/reference semantics as Delphi, and handling of GUI events similar to the Borland VCL.
What do you think of the fourth level, OOP language - Java?
Legend 🤞🏾
Professor, How do you think about Forth language and Forth machine?
what a legend
Scheme... bad memories for me. I studied CS at the University of Montreal with the creator of Scheme. We had to bootstrap Scheme in Scheme as an assignment. By far the worst course I took, and my only abandoned course.
And did Simula in Programming 101 and 201.
I came here just to listen to Brian Kernighan... so it came as a nice surprise to see the video was from that TH-cam favourite Uni. Nottingham and to see the talk introduced by Prof. Brailsford.
I like how he goes over some shortcomings of C and they're all problems with the standard library.... not the language itself.... "Because I invented it... it's perfect" ;)
Now, I must admit, I'm an anti-Java bigot... but "Strongly hyped" particularly tickled me.
As for functional languages in "the real world"... you gotta remember that, despite its silly name, JavaScript is LISP dressed up as C... and all the best JavaScript programmers (HELLO!) are functional programmers.... well sort-of... not quite.... but sort-of.... y'know.
I confess I don't quite see how JavaScript uses S-expressions; although modern JavaScript has been gaining a lot of functional-inspired functionality, and it is true that Brendan Eich wanted to use Scheme to script up Mosaic but was ordered to write a language which looked more like Java (Netscape had signed an agreement with Sun to push Java on the Web). In particular, JavaScript does not share the "original sin" of functional languages of not having any operator precedence at all, which is what really puts me off using SML, Haskell, Mythryl and a bunch of other languages I have had contact with.
@@tiagorodrigues3730
Huh, Haskell has operator precedence, you can confirm this yourself by typing :i (+) into ghci (the REPL).
Prelude> :i (+)
...
infixl 6 +
Even languages like OCaml have a hierarchy of precedence, though with a fixed set of operators.
As for the original post, Lisp with the key ideas and macro power removed isn't really Lisp; you lose a lot of the ability to extend the language nicely. Further, languages like JS stomp over many good language design ideas that existed in many, many Lisps.
The original Macintosh system was mostly written in Pascal.
this is a moment
Perl 5 currently has a yearly release cycle (with new features every release, while maintaining backward compatibility), and Perl 6 has a release planned for Christmas this year (though frankly it is awesome already): 「$ perl6 -e 'say ([+] 1..10**10_000).chars; say now - INIT now'」 「200000.3792978」 That is, it adds all of the numbers in the range from 1 to 10¹⁰⁰⁰⁰ in less than half a second, without truncation to a floating-point number (OK, it actually cheats, but this is the way you would write it if you weren't going to calculate it from the end-points yourself). Plus Perl 6 has features that no other language has, like a built-in grammar engine, auto-threading, and a super-simple way of adding new operators, just as you would add a new subroutine (almost all "normal" operators in the Rakudo implementation are actually created that way). So the crack at 57m32s about Perl missing the boat seems quite misinformed.
+Brad Gilbert But if it had done *some* of that 5 years ago, it's more likely that more programmers would have tried it and evolved along with it. Many Perl programmers have since moved on, and it's uncertain they'll ever return.
+Brad Gilbert Perhaps you weren't paying attention, but he did mention he still uses Perl and likes using it. His final comments weren't about whether the language is good or not, but rather whether it can thrive and attract new people. Perl, like Java, is not going away, that's for sure. But it did "reinvent" itself too late, and its userbase will gradually die off, bar some legacy systems that depend on it.
+Brad Gilbert Perl's insufficiently C-like syntax has hurt its adoption rate by the masses of folks for whom their first language in college had a very C-like syntax and who are not looking to learn new languages, especially if the culture where they work does not include using Perl for anything. Perl had associative arrays before that was a common language feature, but simple operator overloading has been in many languages for a long time (and is commonly railed against for hurting code readability). Several languages I've seen (none of the main stream, highly popular languages) come with support for grammar engines, so you are being a bit hyperbolic about "no other language" having these features. I'll have to look at "auto-threading" in Perl 6, but CILK does some very slick things with threading automatically and so does the async/await keywords in C#. However, I'm still looking for languages that actually supply really good high-level concurrent/parallel programming constructs as part of the syntax of the languages rather than a bolt-on API (e.g., goroutines in the Go programming language are high-level syntax).
came here to say this, Perl6 Rocks! :D not missed the boat or anything.
...or you could, you know, do the sum of the series in a more appropriate way and get your result in a handful of microseconds, most of which are wasted on IO.
Great talk but a comment on the editing. As much as I like to look at Professor Kernighan's pretty face it would be helpful to leave his slides up long enough to be able to read them.
I have immense respect and admiration for Brian Kernighan and I love his work, but....
Comic Sans? Really?
Yes, it looks childish and ugly. But it's also one of the more readable typefaces for dyslexics, who sometimes have difficulty telling letters apart in other fonts. I'm sure it is a coincidence, but Comic Sans has at least *something* going for it.
The K in K&R getting those cliched errors and confusions in C is just so unbelievable.
Not at all to me. Every time I write C, I run something like "man strtok" a lot. Because the details about those standard functions are hard to remember and easy to forget. Especially if you are not a purely C programmer. If you compare that to Python, for instance, you hardly have to remember anything. For example, here is a Python solution to the problem in the talk: cat file | python -c 'import sys; print("".join([y for y in sys.stdin.readlines() if float(y.split()[2])>6]))' - as you can see, all methods take just a single parameter.
amazing
55:04 when he talks about why Java didn't work out for the web, isn't that how wasm works?
Kernighan rules so hard
Came for Brian Kernighan, stayed for Comic Sans.