How can 30 people dislike a video of Professor Kernighan, using a device which would not exist without his contributions to the field? Unbelievable...
ahahhaha that is, in fact, so funny to realize
Right
There are many reasons for a dislike; most likely the algorithm has presented this to somebody who has no interest in computing history (quite likely). I always dislike videos the algorithm presents to me that are not relevant to me, because then it is less likely to present me with similar videos. Just sayin'
@@tensevo that's true, but there's also the 'not interested' feature specifically for that.
New to the internet?
Brian Kernighan, a legend for every computer scientist! Thank you for this video Computerphile!
+Russ Prince oh really! The world would be different without Kernighan.
+Russ Prince
Co-developer of the UNIX operating system. Its derivatives and clones now run Apple's computers, phones and tablets, as well as Android devices. Most web servers use Unix or Linux - including Google's - and so do millions of desktops.
Co-creator of the C programming language. Most of today's software is still written in C or its extension, C++.
+Russ Prince Kernighan was one of the developers that created Unix and the C programming language, both still running on nearly every server in existence and many home operating systems
+Russ Prince wat
+Russ Prince Nowadays we all use computers (PCs, tablets, smartphones, embedded devices, etc.) and access the web. None of that would be the same without Kernighan's contributions. The world itself wouldn't be all that different per se, but the way it's run would be: everything runs on computers today, and they wouldn't be what they are without Kernighan and others like Ken Thompson, Dennis Ritchie, Richard Stallman and Linus Torvalds.
Man, Bell Labs would've been heaven on earth for a nerd to work at back then!
They could have had all the glory; unfortunately they got greedy and went for commercialization, and the licenses were astronomically priced. They didn't have the vision for home PCs at the time.
For real, though, "Here, create stuff and we won't have management breathing down your neck. And if what you invent is really cool, we'll pay you even more the next year.
Oh, and we're literally the biggest telecommunications company in the country, so pending a complete market crash, we're not going out of business anytime soon."
Every introverted analytical nerd's wet dream.
You mean Amazon?/s
Ahh, the man who essentially brought us "Hello World"... Legend! Great video Computerphile!
I'm feeling stupid whenever these gentlemen or Steve are featured. It's an honor to learn about their history. Much appreciated.
As I come to the end of my 30+ year career in IT, I stumble upon this series of videos! I recall first being introduced to C after having been exposed to Pascal ... shudder! Being freed from the shackles of Pascal with my introduction to C is an experience I cannot forget. From then on I would use C wherever I could.
I was blessed with having an understanding data structures prof, who being more comfortable with Pascal let me use C to do my DS assignments. For that I am ever grateful. Pointers are a joy to work with! Oh the sins one can commit using pointers :)
I still have my copy of the C Programming Language as well as the dragon book. But I miss the diversity of languages that existed back then. But not COBOL. Though I'll grudgingly acknowledge WATFIV over FORTRAN 77.
To Professor Kernighan, Dennis Ritchie, Ken Thompson and others, I say thank-you for lighting the path ... I followed as best I could.
I like how DB refers to web hyper links as web pointers.
ok?
I have here in front of me the book "Software Tools", by Brian Kernighan and P. J. Plauger, using the language RATFOR (Rational FORTRAN), from 1976. I bought this book 40 years ago, when I was 11, and I was mucking around with assembly on a KIM-1 and PET2001, and C on CP/M (and later Cromix). It was formative for my approach to programming. Thanks a lot, Brian!
ok?
We are honored to have such a living legend in computing history
This is a great interview. Certainly one of the men who have changed the world.
I am honored today, thanks to Computerphile... my first book ever, and hence my introduction to programming, was "The C Programming Language" by Brian Kernighan and Dennis Ritchie.
Brian Kernighan, together with his colleagues Dennis Ritchie and Ken Thompson, did far more for computing than the likes of Steve Jobs or Bill Gates.
+CaptainDangeax Let's not get too carried away by this. Microsoft brought the PC into homes. It is to their credit that PCs are so common and widely used today. Don't get me wrong, I am a C/C++ Linux programmer, so I appreciate what these people have done, but this field would not be what it is today if people (especially those outside our field) had not been so welcoming to computers.
+OoJxShadow
I agree. You can wax lyrical about K&R, but these guys would be the first to extol the contributions of Jobs and Gates, and if you want to talk about CS academia then it's disingenuous to leave out Claude Shannon or George Boole, or even John Bardeen and Walter Brattain, also from Bell Labs. Same as people rave on about von Neumann (the slime) when the real kudos should go to John Mauchly and J. Presper Eckert.
OoJxShadow Sorry, but you're wrong. Many other computers were home computers long before the PC became affordable, like the Atari ST and the Amiga; that last one was really ahead of its time in many of its features, and it took years for M$ to get to that level. One example? Multitasking. Nothing at M$ before NT4 was really multitasking, a full ten years after the first Amiga. M$ is not an innovative company; they have always made their business from other people's ideas, and I can give you a list of borrowed features as long as a day without wine and bread.
This is true, but Jobs and Gates didn't contribute to computing; they contributed to the user experience, and they were both great salesmen for their products.
Gates and Jobs were never into computing, they were into computers, and there is a big difference.
Both play their role, but without the first (K R & T), the second (J & G) would have sold vacuum cleaners.
The "C" book by Brian Kernighan and Dennis Ritchie is the best book on a computer language that I've seen. When I first saw it, I was dedicated to Turbo Pascal but "The C Book" helped me to accept "C'. The book is so clear and easy to follow and I felt like I was making progress on every page.
Same.
Without Brian Kernighan (and Dennis, and Ken), my job wouldn't be what it is today. And I like my job. So, thank you, Professor Kernighan.
I ❤️ Unix & C. Thanks for the power. 👍🏻
Amen to that.
ok?
Fascinating - thanks! So much here. . . .
thanks for this, an utter pleasure to watch
Amazing! We still use sed and awk for some use cases today. And Unix lives on through Linux in almost every enterprise server globally.
The world is grateful for your great feat, sir.
Implemented Awk over a weekend. I'm speechless.
You should see the video with Ken Thompson - he said Unix was built in a few weeks. Really amazing how much they accomplished in so little time.
Well, one thing you need to realize is that all this software was not the full-fledged system of today so it's not like they had a current day Linux system kicking about. The source code is freely available nowadays so you can see what I mean. Also, it's not uncommon for students to write even more powerful yet toy operating systems in less time these days (since there's easier access to information, we know a lot more, and we have more powerful tools). GitHub is full of them.
I think the one thing to take away is that Bell Labs was a place that really fostered innovation and it's important to have things like that. Because the output of ideas and hard work that poured out of it literally changed the world. It wasn't just one or two small things.
@@nobytes2 A few weeks for actually writing the code, and many years of experimenting and thinking before that...
@@CaptainDangeax I don’t know how long you think compiling took back then, but I’d imagine a missing “;” would’ve cost you a few hours. It definitely is an achievement to say the least with those kinda limited computing resources.
@@obinator9065 I think you missed my point. Although KT said he wrote Unix in a few weeks, he had been thinking about the many problems of actually writing an OS for years, and had also been using Multics and running into all of its problems. So when he finally decided to code, everything was already written in his mind and it only took weeks, but the thinking took years. I'm not talking about compilation problems and missing ; or {} or whatever.
your presence is so chill and informative, thanks for the work you put into this open sourced work!
Wow, this was just amazing! I wish there was more to this, a half hour is too short!
What a nice and humble man. That guy invented Unix and C, with his colleagues. He is amazing!
Great stuff! Gee.... what it must have been like to work alongside guys like Kernighan, Ritchie and Thompson! What a great place Bell Labs must have been then!
Outstanding work, Computerphile! Thank you ever so much for this interview of Brian Kernighan, one of the original Unix greybeards. Please continue doing these. I love hearing the legendary masters of computer history speak. Please try and interview Ken Thompson, as well. Would love to listen to something current from him.
Amazing interview
"`one` is a metaphore for `two` or `three`"
+Larry Pete for sufficiently large values of "one" it's actually equal to "two" or even "three" in practice
+Larry Pete one-liners in programming can be extremely messy if you insist on literally keeping them to one line, instead of treating "one-liner" as shorthand for a short snippet of code with a single purpose.
N
Lot to unpack from that statement :)
"Science is rigorous"
I hope you can interview Ken Thompson...
Then Linus Torvalds down the line. (Imagining half of the conversation will be in curse) ..nah
@@IoriTatsuguchi Linus hates Nvidia because they intentionally make their hardware harder to use on Linux
๖ۣۜ♥๖̶tacokitten๖̶ yeah so?
@@tacokoneko And that's not a smart move from Nvidia, because Linux is the system for high-end computation and Nvidia is trying to enter that market with CUDA.
@@IoriTatsuguchi you mean ncurses
Heh, I did a similar thing, writing a Tcl/Tk script to check if external servers were responding and showing them in a table while I was discussing the matter with a colleague. Wonderful language and toolkit.
IBM (of all companies) had an open source operating system, VM/CMS, the first virtual machine/hypervisor. It gave up the source code to customers who wanted to customize it, because marketing was not interested in selling it. The success of the idea eventually led IBM to close the code and make it a commercial product.
I still weep watching these guys, who changed the world for the better, in their crappy offices, and then watching some meaningless media star spouting rubbish from their multimillion dollar mansion. Perhaps pipelines were a bridge too far.
Bell Labs was not a crappy place to work. It was like the Google of their time.
@@Nookerdog777 Better than Google. It was before software was commercialized, after all. Ironically, AT&T was responsible for that.
@@hexa3389 why is it ironic ?
@@hectorcanizales5900 AT&T is partially responsible for commercializing software. Even though the success of Bell Labs was only possible because software was not commercialized.
@@hexa3389 interesting, thanks.
Flex, YACC and Bison were certainly very interesting tools to discover in the "Languages, Compilers and Interpreters" course I took while studying. My experience with them is, however, limited to making a (very) simplified compiler for a Pascal-like language.
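For anyone curious what using them looks like, the classic build incantation was roughly this (file names are made up and the exact library flags vary between systems; with flex you link -lfl instead of -ll):
yacc -d calc.y                        # grammar  -> y.tab.c and y.tab.h
lex calc.l                            # scanner  -> lex.yy.c
cc y.tab.c lex.yy.c -ly -ll -o calc   # link against the yacc and lex support libraries
./calc < some_test_input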
thanks for this, I really enjoy the little background bits of computer history.
5:34 Other OSes had this concept called “overlays”, where you divided up your (large) program into sections such that only some sections were loaded into memory at once. There were complicated mechanisms such that when a procedure in one section made a call into one in another section, the latter would get automatically loaded. But what happened if there wasn’t enough memory? Then some other section had to be thrown out of memory. If no code was currently executing in a suitable section, then fine. But what if there was? Then it would have to be automatically reloaded when control returned to a procedure in there. Or maybe you disallowed unloading sections which held currently-executing procedures.
Yes, it all got very complicated. I never wrote an overlaid program in my life. Luckily the 32-bit era arrived just in time.
The secret, as always, was to write a library that would handle it for you. Still complex, but much less so.
2.11BSD UNIX used extensive and complex overlays to jam essentially 4.3BSD with TCP/IP and all the rest into split I/D PDP-11s.
What a great talk :) Thanks.
Pure and humble genius...
Woohoo what a great interview! Fan boy!
A truly fascinating man.....
Just beautiful, thank you so much for this..
So Unix pipelining was initially a means to run a series of small programs that individually fit in the computer's memory but could be combined to achieve an overall purpose. So not entirely a moment of saying, "hey, let's pipe a series of programs together, because wouldn't that be a clever idea!" (Still using awk regularly in my daily pipelines - thanks, Brian!)
It was actually an idea from the group's manager; people had various small utilities, but I/O through files was clumsy.
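A tiny sketch of what that buys you, assuming a plain text file called notes.txt: each stage is a small program that fits in memory on its own, and the pipe glues them together with no temporary files:
tr -cs 'A-Za-z' '\n' < notes.txt | tr 'A-Z' 'a-z' | sort | uniq -c | sort -rn | head
# split into one word per line, lowercase, sort, count duplicates, show the ten most common words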
Your channel is brilliant - thank you!
Very interesting and informative about how Linux was developed and what it is based on. It helps to know the underlying principles rather than cramming a lot of code. Very inspiring for my own work; I have a lot of respect for these innovators.
Nice to see Obi-Wan Kenobi, explaining UNIX.
Proud to be the 2**100th who liked this series ;-))
There still are monopolies, but instead of investing in R&D, they buy back stock.
+heroineworshipper and pay armies of lawyers
*regulated* monopoly. That is key...
I mean, it's not like major theoretical work isn't getting done at big tech R&D departments, but I agree none come close to Bell Labs. For example, Simon Peyton Jones, the key figure in the development of the state-of-the-art free-software (although BSD-licensed and not GPL) Haskell compiler GHC, surprisingly enough, has been employed at Microsoft Research for quite some time.
Not for long more now, Buddy. The Citizen Kanes of Silicon Valley will be busted down to size.
@@dankierson I don't see how that's relevant
I'd love to see videos on history of the main shell programs, sed, awk, xargs, sort, find, etc.
Well, this conversation got AWKward!
(That's what she SED!)
Get a GREP on yourself!
Ugh, I'm so BASHful
What can I say, I'm a Boune Again *nixer
You either VIM free or VI hard! isn't that right, ED?
They should go over the POSIX Standard. That's neat stuff.
+FEARbraveheart *slow clap*
boune
9/10 groaned out loud
I C what you did there.
Thank you for your:
- timestamp
- pipe
- file system (port)
This was lovely. Very pleasant.
I believe that the invention of the "little languages" is the most fundamental contribution of UNIX to Computing.
The concept (now called Domain Specific Languages, DSL) is one area of computing where I expect a LOT of development. Because once a "little language" has been implemented, it makes life a LOT easier for domain specialists who otherwise need to spend far too much time writing programs in generic programming languages.
In a modern system, the combination of a graphical front-end, a DSL as an intermediate stage, and various back-end programs that consume the DSL, is an extremely powerful combination. Far superior to having it all in a single monolithic program.
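A concrete taste of that combination, assuming a troff document called report.ms that mixes prose with pic diagrams, tbl tables and eqn equations - each little language handles its own domain and a pipeline stitches them together (groff standing in for the original troff):
pic report.ms | tbl | eqn | groff -ms > report.ps
# pic translates the diagram descriptions, tbl lays out the tables, eqn typesets
# the mathematics, and groff formats the surrounding text into PostScript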
1:25 2:10 3:00 So, basically, it was like Sudbury Valley.
16:15 A lot of the best stuff about Unix was never released. E.g., the 9th and 10th editions had Plan 9-style networking. Sure would have been nice if that had been allowed out earlier; maybe sockets wouldn't have become the standard, then.
I remember writing a report generator using PERL, the "Pathologically Eclectic Rubbish Lister", when I needed a count of all the unique web browsers hitting our web site at a library I worked at. Note the real name is "Practical Extraction and Report Language".
this channel rocks!
I chuckled at the TCL/TK bit. Yes, that's an ODD language indeed.
27:05 That is the inevitable fate of every GUI architecture, to get more and more complicated over time. There are no “simple, efficient” GUIs.
Plan 9
false.
Wow... The working condition at IBM was just incredible then. 😍
This will be interesting to watch.
Timings:
0:45 Working at Bell labs.
3:50 Pic little language (5:20 - Eqn language)
12:15 Both pic and eqn are implemented by using YACC
14:17 Awk is great for one-liners
15:25 grep, sed, yacc, lex are pattern-matching programs, but they don't scale
16:16 Cross-subsidization
18:44 What if Linux hadn't happened, because of UNIX openness?
19:50 Were you an Open Source pioneer?
21:50 Tcl/Tk graphics library
26:40 X Window System
4:48 Back in the day we used to compute in millibytes, not centi or deci, but millibytes.
These young men could never understand what that was like.
+ABitOfTheUniverse unless this is a joke, how can you have less than 1 byte, unless you're talking about bits?
+ABitOfTheUniverse A millibyte doesn't even make sense. A hundredth of a byte? That's less than a bit, which does not compute.
lel
+Aurelius R The prefix milli does not represent a hundredth. Also, you might additionally want to read up on "joke", "humor" and "funny".
Benny Löfgren Right, because sarcasm is so easy to pick up on in text. And yes, milli is thousandth. Don't you know what a joke is?
Legendary.
"One is a metaphor for two or three"
19:58 That openness isn’t there today either, according to some companies. Proprietariness is alive and well. But at least Free/Open Source software has a name (OK, two names), and a clear definition you can point to.
Pattern matching is great; after I got a taste of it in Rust, I miss it every time I use a different language.
Unix was the model for Linux, which today totally dominates the computing world.
Once it became commercialized, Unix unfortunately became the field for a tussling match between a whole bunch of different vendors each trying to lock customers into their own proprietary variant. This fragmentation ultimately destroyed Unix.
But oddly enough, at the same time, Unix was also the core nursery for the Free Software/Open Source movement, among other things the GNU project for creating a Free operating system. This is where Linux began. So Unix was very much the launching pad for what became the heart and soul of the computing world today. Its spirit lives on.
Is there a chance of getting Alfred V. Aho - of The Dragon Book (er Principles of Compiler Design) - in a discussion with Kernighan (and/or) Professor Brailsford? I'd like to hear a discussion of the evolution of programming languages and to hear their views on state of things today.
In retrospect, BSD Unix went with an essentially open-source approach while AT&T's USL stuck with a commercialization approach, and the BSD path had the longest legs. The reason the Unix poser, Linux, was able to leapfrog BSD was the legal entanglements BSD was kept tied up in until the June 1995 release - which gave Linux enough time to win the mind-share tussle over BSD. At least FreeBSD is still with us today and still important.
BSD isn’t Unix either in the literal sense; it doesn’t share any code with the original Unix. It may as well be considered a Unix-like operating system, like Linux.
portable bash/csh/ksh floating-point maths:
# awk does the division and rounds to two decimal places
answer=`echo "$numerator $denominator" | awk '{printf "%.2f", $1 / $2}'`
I use it all the time.
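For example, with made-up values (sh/ksh/bash syntax shown):
numerator=22; denominator=7
answer=`echo "$numerator $denominator" | awk '{printf "%.2f", $1 / $2}'`
echo "$answer"   # prints 3.14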
9:00 The discussion of AWK begins here
Whenever there's talk about Bell Labs and Unix, Multics is always left out of the story (even CTSS). There is much in Unix that was copied from Multics and should be credited to it. Either way, these are amazing systems, and there is much to be learned from them. I run a Multics environment, though not on real hardware.
Vector terminals sound cool; they remind me of mechanical computers.
We do microservices today for that same reason, except that the resource shortage is not RAM or ROM but people's brains trying to understand overly complex systems... I need a coffee.
When you have a pipeline that you use regularly, and a package is updated and changes the behaviour of something in your pipeline subtly, but enough to totally break everything... that's one reason for re-integration.
aaaawesome!
I was reading up on GTK when he started talking about the TK library.
When he said TK was for GUIs it suddenly hit me what GTK meant
***** how sad, it made so much sense
+Yuannan Lin Could you be any more stereotypical of an arch user please? =D
+Para199x Well, at least he didn't quip 'somethingsomething Plebs and their graphical installers' So there's that! XD (That being said, I've used Arch myself, I cannot say a word)
I will now reveal my plebeian status. I attempted to install Arch once and got stuck when I needed to get the network working so I could download packages, but there was a missing dependency it wanted to download to do that...
+Yuannan Lin Unfortunately I actually require using windows atm :( (work related stuff, yay proprietary software)
Also at the time I was physically disconnected from my router and had a desktop so ethernet wasn't an immediate possibility ;)
Thank you PhD
You guys should interview Richard Stallman!!
I second the sentiment, but that could get "political", insofar as it might put Computerphile on one end or the other of the holy war that is Free Software.
I doubt there are people who dislike free software. Maybe Stallman is a bit extreme, but I doubt anyone really thinks he does any harm.
@@AexisRai there's no such thing as apolitical. Computerphile is already political; it just sits somewhere in the centre. It's also not a "holy war", it's a serious political disagreement about how best to write software to help or harm human beings.
What is that thing that sounds like "awg" that they're talking about at 9:20?
+Max Coplan
awk - pattern-directed scanning and processing language
Maťo Tondash +Norbury53 oh hey, I just tried it in my terminal and it works! Cool.
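If you want a quick taste of it, here's a classic one-liner (the file name is made up): sum the numbers in the second column of a file and print the total:
awk '{ sum += $2 } END { print sum }' sales.txt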
14:50 AWKward.. Heh. Good one, professor.
Wow, so Bell Labs is like the Google of research facilities. Can you imagine how awesome it would be to do any type of research you wanted with all the money you needed? I wonder if it's still the same way today?
+Varekeh Corlon
HP Labs was very similar.
'let 1,000 flowers bloom'
IMHO the psychopathic gluttonous MBAs, most of whom hate that type of R&D, have an awful lot to answer for!
Peter Walker Is HP still like that? I'm interested in being a researcher in comp sci and engineering. I think the atmosphere at places like these is pretty cool. Do places like this still exist?
I wonder what Brian thinks about Go?
He wrote the book on Go.
For a live demo of correcting mistakes in data with a pipeline, you can see Kernighan himself from 1982: th-cam.com/video/XvDZLjaCJuw/w-d-xo.htmlm15s
Linus Torvalds' indirectly quoted words from the official first Linux release event in Helsinki (it can be found on youtube, though without subtitles) were that Unix cost so much that it was just better to do it yourself. So with that hindsight, if Unix had been released publicly without the wild price tag, Linux probably wouldn't have been written and released. The key element of the comment was the price, and no student being able to buy it, and pretty much nothing else.
Even today the concept of openness escapes some old management, in the era of free use. Record companies are like hounddogs, preventing even fair use of their golden old songs on youtube - even when musicians play them themselves, for educational purposes, as clips only a couple of seconds long. Though it seems that as long as they're allowed to rob people like that, they can just claim anybody's income from youtube as their own. And the worst thing is that the original artist never sees that money, and probably paid the label to publish it in the first place.
7:40 heart attack lol
I miss the pipe mechanism when I work on a windows computer. Microsoft never made the pipeline work in a proper way......
That's because of the DOS limitation of not being multitasking. The result of a DOS stdout pushed into the pipe is actually recorded in the C:\TEMP directory, and then retrieved by the next process. Bummer!
@@CaptainDangeax It could have been binary nonetheless. Being text-based means that binary data is destroyed.
@@CaptainDangeax The text-based nature of the windows pipeline is the largest problem with it. Not the lack of multithreading.
@@oysteinsoreide4323 Base64 encoding is there for that. Never forget Windows was built as a graphical layer above DOS, carrying all the limitations of a segmented, mono-task system. NT was new in many aspects, but it still carries a lot of garbage; Microsoft has neither the time nor the possibility to make clear table of the past, unlike Apple did when moving from OS 9 to Mac OS X. For example, try to create a directory with the name CON, COM1 or LPT1 on Windows. You. Just. Can. Not. No limitation like that exists in any Unix or Unix-like operating system, and it is blessed bread for hackers who use this feature to annoy poor Windows sysadmins. I kick an oak tree and 10 fall to the ground. Sorry, lots of French humour and direct translation in my post: to make clear table, blessed bread, kick an oak tree...
@@CaptainDangeax I agree that Unix or Unix-like OSes are a better OS design. And if it had not been for the limitations of Linux or Mac in other respects, I would probably have Linux at home and at work. But the world makes the choice quite easy: Windows for me, even if the operating system is full of flaws.
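Going back to the text-versus-binary point above: a Unix pipe is just a stream of bytes, so binary data passes through untouched - a sketch (host and directory names made up):
tar cf - myproject | gzip | ssh backuphost 'cat > myproject.tar.gz'
# tar writes a binary archive to stdout, gzip compresses the raw bytes, and
# nothing along the way tries to reinterpret them as text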
Is that stuffed bear behind Kernighan wearing a shirt with the Raspberry Pi logo?
Yes that's Babbage, the Raspberry Pi mascot!
Sean Riley Cool
+Sean Riley +MelBrooksKA And he's named after this guy, if you're curious: en.wikipedia.org/wiki/Charles_Babbage
But pipelining could not have been such a problem to implement. I mean, it is nothing more than combining small subroutines within a bigger one, except that the routines are system ones rather than app ones. Am I missing something?
It's not an implementation innovation - it's a program architecture change, from a monolithic program to chain-linked sub-programs, that was forced on the UNIX builders by the limit on usable RAM space. (A comprehensive program to do, say, text processing would be too big to load into the RAM of that era.) The new architecture then allowed big jobs to be executed as a sequence of sub-programs linked so that the output from one became the input to the next. It also allowed sub-programs written in different languages to be used within the same big program (or, more accurately, program sequence), provided those languages were compilable on UNIX. Since each language is designed around a particular type of problem or data, having each sub-task of a program coded in the language best suited to it produced faster processing of each sub-task. The pipeline mechanism in UNIX also allowed users to compose their own system programs from the shell scripting language, the various utility libraries carried within the system, and any C-coded programs they might provide themselves.
Except it needed both OS and shell support for concurrent operation; initially it was an idea from the group manager to eliminate temporary I/O files.
That it allowed people to build on other programs like typesetters or sorts without relinking the code together was a key advantage.
That meant users could program tools as one-liners or simple scripts.
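Roughly the difference in shape (the program names here are just stand-ins):
# before: each step writes a temporary file for the next one to read
step1 < input > /tmp/out1
step2 < /tmp/out1 > /tmp/out2
step3 < /tmp/out2 > output
# after: the shell connects them directly and all three run concurrently
step1 < input | step2 | step3 > output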
That guy was right, X is overcomplicated and a total mess.
Is that John Sturgis from Young Sheldon?
We only use Microsoft Windows because it was cheaper than UNIX. However, you can now get a Unix with a free magazine for £5. This cheap Unix also gives you more freedom and reliability.
Exactly 1000 likes.
We're still using Xorg today C:
mmhmm
Well, I refer to most programs by their process name.
far out....quite interesting
AWK : Aho, Weinberger, Kernighan
A comparison with today's research environments can only make you sad.
Casio built really good watches :D
PDP-1000?
Calling TCL "odd" is an understatement! I want to kill it with fire every time I have to use it in our simulator.
"This really bright student in Finnland". 😂
He put the K in AWK
Tickle-tee-key - I didn't even know it was pronounced like that :)
Interesting that Unix was the child of a company that built itself based on a monopoly.
Linux supposedly has always been about a 'free market' and anti-monopolistic practices of other software companies, more specifically the GNU tools that make Linux possible including the gcc compiler.
Incredibly ironic.