I think having this testimony of the beginnings of computer science in high quality, on YouTube, is really important and has great historical value. Thanks Brady and Computerphile. That's great work you're doing.
Please see the 'behind the scenes' film as linked in the description to hear Professor Brailsford talk about this >Sean
Fascinating stuff. Could listen to Professor Brailsford all day.
I used to write COBOL programs for our student admin and the library. The system was an ICL1904 running George III and we were privileged in that we had 3 "glass terminals" reserved for programmers. If all the glass terminals were in use we could use one of the old ASR-33 teletypes in another room. However, most of our work was still done using decks of cards and line printer output. That computer was still in use up to the mid-1980s.
I think that disk platter might have been an EDS-80?
5:31 We used those! ADM-3A terminals connected to the ICL. We only had ONE in the programmer's room and you always had to wait to get on it.
Great narrator. I bet this guy's grandkids love it when he reads books to them and tells them stories.
The clarity of this man is remarkable.
Professor Brailsford is a wonderful story teller.
His narration and personal reflection on the past makes this video a fascinating ride through a historic frontier in computing.
That is an excellent interview. It is so nice to hear the memories and the personal view on the history of computing from the person who went through this all himself.
Some great memories in this video. I worked for ICL in the building shown (Bracknell) as a diagnostician, writing patches for mainframe operating systems in assembler. I then moved onto Unix and in the late 1980s I designed & implemented the reliability trials software (written in C) for a new Unix server (DRS6000) we were manufacturing in our factory at Ashton, Manchester.
3:40.... WTF!!! Does anyone else hear him swearing?
"So, I can get the fucking cover off, and put it into my top loading washing machine!"
Love this guy.
"A big dump at the end of every week".
I like Professor Brailsford's videos because they bring back so many memories. The first mini-computer I used was a PDP-8/e, where I programmed in assembler and FOCAL (a BASIC type language). Then I moved to VAX/VMS and UNIX/SunOS machines, and finally to Linux PCs, which I use at work today.
Professor Brailsford is the best. Wish I could have his knowledge FTP'd into my brain... I love listening to history of computing and want to hear everything he has to say!
Watching David Brailsford reminds me of learning maths and science from Johnny Ball or art and painting from Tony Hart. I absolutely love hearing his unique and knowledgeable stories on the history of computing.
Wow, I lived this history and it is amazing to me how things were in the '70s and how much they have changed. I began in California writing FORTRAN in college in 1969, and did an MS in Geology from Stanford in 1973, using a mainframe and a dumb terminal to run a teaching model of 2D gravity models in the earth's crust. When I went to the US Geological Survey in 1976, after a couple of years I began trying to sell UNIX to them and failed. I remember using a PDP-11/34 with RL01 2 MB disks running BSD UNIX in about 1978 and writing my first C program; about that time USGS began adopting UNIX. It wasn't until 1986, though, that I had my first real exposure to UNIX as a full-time user, system admin and programmer, and later with Sun in about 1988. Then in the '90s I did Sun administration on SunOS 4.1 and later Solaris 5 to 9. I saw my first Linux in 1995 and ran it on an IBM PS/2 with 16 MB of RAM. What I remember is that I made it look pretty much like what I had at work under Solaris 5, because both ran FVWM. But to show that UNIX can run on small systems: I remember in 1980 installing a 10 MB UNIX on a friend's PC and, a couple of years ago, running a recent release of Ubuntu live off a DVD in 256 MB of RAM on a system that used to run Windows 2000. I have run recent Linux distros, which are by no means small, in 1/4 and 1/2 GB of RAM within the past two years.
Back in the 70's, college students would play a kind of game where they would load several programs into a mainframe (we used an IBM 360) and whoever took over the entire memory would win. There were some programs that tried to attack other programs, but we soon found out that a tiny program that made copies of itself as fast as possible would fill the memory and crash anything else running. We didn't know it then, but we were essentially writing computer viruses.
I can't even express how valuable these accounts are. I love to hear anything and everything these old-timers have to say! Even though I knew most of this already, it's really neat to have a personal perspective applied to it--and it's probably easier to remember that way, too, being human.
I worked at Bell Fruit Manufacturing in Nottingham as late as 2001. We had a PDP 11 there to build slot machine code on before burning eproms for use in a machine. We used to set the build off, go for lunch and it might just have finished by the time you got back. My god that made you a careful and efficient programmer.
I'm so happy to see these types of videos. It's only been some decades since those days, but we've progressed so much. Without an effort to communicate this kind of historical material, it could become very obscure in the future. I think it's important to preserve the way things used to be done.
I was born in the 90s and I find early computers fascinating.
Interesting. I was working for Singer Business Machines when we were sold to ICL. The machine on which I was programming became the ICL 1500 and was made in Utica, NY, USA. All the programming I did was written in 1500 Assembler. I had originally learned programming on an IBM 360, also using IBM Assembler.
I had Unix envy. My mini-computer experience was VAX/VMS on a VAX 11-750 viewed through a VT-220.
What an epiphany when we moved to SunOS on Sun-3 (68020).
For those missing Richard Stallman, look at the "Behind the scenes" film linked in the description.
As someone born in 1981 and enamored with computers since the C64, Prof Brailsford is a national treasure. Please continue to interview him and let him speak (through however many tangents there may be) about the origins of Computer Science, and publish everything he has to say. Modern programmers have much to learn from people who had to punch cards and edit on paper. Saving 15 minutes in an algorithm in 1965 translates to saving 10 minutes of battery life in a smartphone now.
Brings me back to my 13th birthday present from my father: an ADM 3A screen teletype, a modem, my own phone line and a user account on the local campus HP3000. The biggest mistake I made with that setup was when I once printed out a long program I had made. It came COD to me and my father refused to pay it. He loaned me the money to get it out of the post and made me swear never to print out anything more than one page.
Wow - that's me in the picture, with Neil and John! Also in the classroom I think! How times have changed - I remember toggling in the bootloader for the 11/70 on the piano keys too!
So good ! I love learning about the origins of computing... It's so recent, yet so different to the ubiquitous computers of today.
We need more of this guy. I wanna learn ALL the history from him.
I am happy to see this. It's the first time there's been a popular video about what I am still programming on. I work mainly on Unisys: MASM, COBOL, RDMS, SSG, ECL and LINC.
Nicely explained. Looking back at history, we seem very fortunate to have been born in this advanced age where almost everything is automated.
This is probably my favorite Computerphile video as of yet!
When my dad started as a computer scientist, he started on a PDP-11, so he would tell me about this kind of stuff all the time. I'm really glad to see somebody else talk about it too.
My dad helped develop the OS for the ICL 2900 series mainframe.
I remember advising someone that a 20 GB drive was a waste of money because you would just never use all of that capacity :)
I really like this guy. As a student of CS, it's one thing to learn about CS history from a book, and quite another to learn it from someone who experienced it.
What a beautifully told story!
I was swept by the emotion of it all.
Great job. Loved it!
These videos are just fantastic! I didn't think it could get any better than numberphile and sixty symbols, but I find myself really enjoying the videos on the history of computing.
Brailsford and his peers are my heroes. They did the equivalent of cutting the grass on a soccer pitch with barber's shears so that things could evolve to the point where I'm writing this on a system so rich in resources that the slowest element of the entire system is always me. One to four TB of local disk is commonplace, along with 8-24 GB of RAM. I know someday this post will date me, but I can remember being paid hundreds of dollars to upgrade a laboratory PC from 128 KB to 512 KB. Some nights, upgrading several PCs for professors and labs, I made my entire house payment in profits.
Thank god for the miracle of high level languages! Programming at the low level would take ages...AGES!
Professor Brailsford is pretty awesome. I've been here for I think an hour now just listening to him talk.
I love to hear about "old computers" and "old computations". Beautiful, Professor, beautiful.
The most important feature of Linux (the Unix-like kernel started by Linus Torvalds) is its licence. The second is that it can be, and is, run with GNU. The person responsible for the Linux licence and for GNU is Richard Stallman.
I was born the year the transistor was invented and watched this video on an Ubuntu (Linux) operated computer. Thanks for the stroll down memory lane.
Why isn't there a UNIX holiday?
In school I learned programming with assembler as well as with Delphi (a more sophisticated language, like C++/Java etc.). It helped my understanding of how a PC works and how to find errors in my code. I guess it's good for a software engineer to know how their code works after it has been compiled.
Wow, I wish I had him as my Computer science lecturer.
That was my exact thought. I can only imagine getting to sit down and talk with him for an afternoon about the history and evolution of computing. It would be so fascinating.
I could listen to this guy all day. Very interesting stuff.
Fascinating recounting of the history of mainframes and Unix.
I love history talks like this, great informative video, thanks Brady and Professor Brailsford.
FORTH was a much more efficient and flexible multi-user operating system / programming language that took up less space in RAM. It was ideal for that generation. I'm surprised Prof. Brailsford didn't mention it. :)
I loved learning on the ICL 1900 series.
When I went to work as a computer auditor for a large accounting firm, I bumped into IBM, NCR, Honeywell etc. and discovered that the George III and George IV Operating Systems ("GEneral ORGanisational Environment" I think was the translation of the acronym) were a long way in advance of the US competitors. Some of the big CDC and Burroughs systems came close, though.
I never worked with the new 2900 series, but there are still some standard features of the George systems which are not implemented in modern OSs. George systems defaulted to peripheral device independence, had internal file version management, virtual storage and dynamic memory management.
Ah, yes, nostalgia - not what it used to be...
So Brady, can you give Professor Brailsford a GoPro and just have him walk around with it for a day? The man is such a good speaker and he's so full of great memories, I could listen to him talk all day.
This is one of the best videos in the series so far! And by the way, the chance that some sort of Unix or Linux based systems are directly involved in you watching this from the Internet, is about 100%.
Several reasons: It is extremely modular. The code is open, and by compiling the operating system exactly to your needs you can get a lot more out of it than the one-size-fits-all of consumer OSs. And even if you don't compile it yourself, you can swap out anything and change everything to exactly what you want without anything in your way. The free-as-in-freedom part makes it so amazing.
The downside is that it is much easier to mess things up, but real geeks see that as a challenge ;)
I remember those days. Not the "good old days." Just the "old days." Minix was bliss! I could run something on my PC at home that looked a lot like the computer at work. Got a 1200 bps modem and another phone line and found bulletin boards and the line into the computer at work. Access to the internet in the late 1980's. Lots of work and virtually no one understood. We are living now in a rather golden time...
Professor Brailsford is a true pleasure to listen to!
This channel just keeps getting better!
There are not many 'programmers' nowadays but a lot of 'coders'. I started programming in 1969 on an IBM 360/30 with 32K of memory and two 2311 disk drives. All of the programming was done in Assembler as the COBOL compiler would not run in 32K. I had no degree or much in the way of programming education when I started - it was a pure OJT ( On the Job Training ) situation but I took to it quite well. By the mid-1970s I was working for Singer Business Machines when we were bought by ICL. The group of which I was a part worked on the Singer/ICL 1500 series of systems that we made in Utica, NY. The 1500 had at most 16K of solid-state RAM, a 256 character CRT, two tape cartridge drives and a keyboard. Fun times!
From assembler on an IBM 7044 in 1967 at Melbourne Uni to timesharing on the Cambridge Uni Atlas in 1971 to BSD Unix at AGSM, UNSW, Sydney in 1977 (thanks Iain Johnstone and Andrew Hume), I have been using Unix for 37 years now. Long live.
I love hearing this guy talk :)
Rest in peace Dennis Ritchie. Your work has surely made the biggest impact on computer science.
It's nice to be a computer programmer and watch these videos to learn more about the history of computers/programming
I actually got exposed to UNIX-Like tools before I got to UNIX. At U.S. Geological Survey in 1976 we had a Honeywell Multics mainframe. The line editor was very much like ex, the line editor you can still access from vi. That OS was written in PL/1, gasp, and expensive to use. I was agitating for UNIX, but lost at that time even though after about 1978 it started to be adopted. My boss at the time loved Mac. He would be surprised to see OS X as it is based on BSD Unix. Multics was the predecessor to UNIX.
It's so bizarre to think that people were clamoring for an independent computer, with disconnected hardware which could run software, and today we're all crazy about cloud computing, which seems like the exact opposite of what people needed back then.
Man, the memories. I remember BASIC programming using the local college's PDP 1170 running RSTS/E. That thing was just this massive mystery to a 16 year old kid. By the time I got to college in 1987 they'd changed to a VAX 6210 using VMS. I loved DEC equipment and VMS. I remember railing against having to use UNIX at my first job. UNIX's command line case sensitivity alone drove me mad. There wasn't such a thing in VMS, and typically I would use VMS with the caps lock on all the time. The thing that turned me around was the power of the UNIX shell scripting language. I fell in love with Bourne shell, and then more so Korn shell. The idea of piping command output to another command, parsing text with awk and sed; mind blown. I was hooked. It was like silly putty. You could bend, twist, and shape your scripts as you pleased. That was 1991. Still hooked.
Ritchie, Thompson and Torvalds, three of my biggest heroes praised in one video. That made my day! Big time!
This video actually brought a tear to my eye.
Really really loving all these computerphile videos!
Absolutely fantastic. A true inspiration. This is what I, we, have been waiting for. Congratulations on finding this man.
It feels like such a blessing to have such advanced machines now! They have evolved from the ground up in so little time, compared to, say, chairs and other things we have been using for ages!
Awesome and important video. Thanks for recording this. I lived through this era and worked at Bell.
I started with IBM in 1968 as a programmer. Thanks for the nice trip back over time. Mainframes cost millions then and now the Raspberry Pi costs $35! Amazing how things have changed.
I'd love to hear a continuation of this story (how Linux went on to rule servers), and also how the eventual availability of an OS that was more or less understandable to the average Joe impacted computing, from a computer scientist's perspective. This was a great video, thank you.
Steve Jobs, someone who just so happened to have ideas that were innovative at the time, but ultimately not exceptional dies: Everyone goes bananas.
Dennis Ritchie, father of modern programming languages, THE person that enabled the rise of Bill Gates and Steve Jobs in the first place dies: Not even a whisper.
And people say this world isn't fucked up.
According to the C standard, ++c increments c and returns that incremented value. In order to perform a comparison the compiler must collapse both sides to a value of arithmetic type. The problem is that it is not guaranteed that the side effect of ++c actually happens before it is evaluated - it just has to happen between the same sequence points (those are the boundaries of an expression). For this reason the left side of a comparison like ++c > c may be evaluated against either the old or the new value of c, so the result is not well defined.
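To make the distinction concrete, here is a minimal C sketch (not from the video; the file name and printed wording are purely illustrative) of what the two operators yield. It deliberately keeps ++c and any other read of c in separate statements, for the sequence-point reason described above.

```c
/* A minimal sketch of what pre- and post-increment yield in C.
   Compile with e.g. `cc increment_demo.c` (the file name is made up). */
#include <stdio.h>

int main(void)
{
    int c = 4;

    /* Post-increment: the expression yields the OLD value (4),
       and c becomes 5 afterwards. */
    int post = c++;
    printf("c++ yielded %d, c is now %d\n", post, c);

    c = 4;

    /* Pre-increment: c becomes 5 first, and the expression
       yields the NEW value (5). */
    int pre = ++c;
    printf("++c yielded %d, c is now %d\n", pre, c);

    /* Note: an expression like `++c > c` is deliberately NOT written here.
       The side effect of ++c is unsequenced relative to the read of c on
       the right-hand side, so the C standard leaves it undefined. */
    return 0;
}
```

Whatever result a particular compiler happens to produce for an expression such as ++c > c, the C standard leaves it undefined, so portable code should not rely on it either way.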
I love the use of past tense. Many of the items like RM03s and PDP11s are still around in surprising places.
Indeed this would make an interesting topic for a video. I am far from an expert in OS design.
I can tell you that on Unix, devices are treated as files. This simplifies device access by allowing standard file-handling methods to be applied.
Today's microcomputers are so powerful that they can execute multiple processes simultaneously. The multitasking heritage of Unix has served it well. Microsoft on the other hand has had to bolt such functionality onto Windows, a perilous task at best.
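As a small, hedged illustration of the "devices are files" point, the C sketch below reads a few bytes from the kernel's random-number device with the same open()/read()/close() calls used for ordinary files. It assumes a Unix-like system where /dev/urandom exists (Linux and most BSDs); the byte count and hex output are arbitrary choices.

```c
/* A small sketch of "devices are files" on a Unix-like system:
   the kernel's random-number device is opened and read with the
   same calls used for ordinary files.  Assumes /dev/urandom exists. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    unsigned char buf[8];

    int fd = open("/dev/urandom", O_RDONLY);   /* open the device like a file */
    if (fd < 0) {
        perror("open /dev/urandom");
        return 1;
    }

    ssize_t n = read(fd, buf, sizeof buf);     /* read from it like a file */
    if (n < 0) {
        perror("read");
        close(fd);
        return 1;
    }

    for (ssize_t i = 0; i < n; i++)            /* print the bytes as hex */
        printf("%02x", buf[i]);
    printf("\n");

    close(fd);
    return 0;
}
```

The same pattern extends to terminals, pipes and (once created) sockets, which is a large part of why shell redirection and piping compose so cleanly on Unix descendants.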
This is badass. This is the best condensed history of the most important formative aspect of computer science standardization for a bystander like me, ever. :-) Thank you!
Just come across your video. Thank you so much for it.
It's a question of when the increment happens, before evaluation (as ++c does) or after evaluation. c == c++ is a true statement, for example. That's one reason why for complex data types in C++ where you want to increment by 1, you're far better off using ++c than c++ since pre-increment doesn't require you to maintain the current value.
I can watch these videos over and over again. Fascinating. I would really love some Linux videos!
This might be the best video so far. This channel is awesome!
Thanks for this. I started in computers working for DEC. Times have moved on and now I'm watching this on an HP (who bought Compaq, who bought DEC) running Linux.
Well done Sean Riley, and thank you Brady.
I'm running Linux on my laptop so this was a really interesting video.
Actually, the increment is performed before the comparison, and even before the evaluation of the C++ part. Using the increment (or decrement) operator as a suffix simply yields the old value as opposed to yielding the new value when using the operator as a prefix. Internally it's not so much an order of execution thing but simply a language specification of what both operations should evaluate to.
Not all the world runs UNIX. In '79 I was working on an IBM System/3, which had its own OS. With it, we managed to run interactive jobs with only 96 K of memory. The descendants of that midrange machine are still with us as the IBM iSeries. They still run a proprietary OS and are one of the safest, most reliable business computers in the world, with most of them running continuously 24/7. BTW, in '79 our keypunch cards had a bevelled corner to identify when a card was placed backwards in the stack.
My first job in the I.T. world was managing a DG Eclipse MV9500 which I later upgraded to MV9600U. It too used dumb terminals though at the time we had a TCP/IP stack on it and used Pacer on the Mac platform to access it. But you could do print jobs in batch mode. I still remember the command in AOS/VS II was 'qdbi'
Well, to address Mr. Brailsford's question at the end, iOS comes from MacOS, which is at its core a version of Unix. As well, Android is pretty much a distribution of Linux.
So, even to this day, with our fancy smartphones, we're still running Unix. That's impressive for an OS written in the late 60s - almost by a single guy no less!
Mr. Thompson, we owe you ^_^
This modular ability of the OS means that you can have a whole host of devices all able to understand the same commands. Scripts written for one device work the same way on the others, regardless of whether it is an Android phone, a desktop workstation, or a high-capacity data storage center. Configuration is a breeze, even if you have 300 devices that all need to be set up to work together, all made by different manufacturers.
The first program I wrote was in ALGOL 68 on a PDP-11/VAX timeshare cluster at the University of Sussex in 1979 when I was 9 (it was basically Hello World). I spent most of my time playing Colossal Cave Adventure. I was still writing FORTRAN under VMS DCL on a VT100 greenscreen attached to a VAX 8550 in the early 90s. Amazing to think that my 25 quid Raspberry Pi blows all those systems out of the water.
(++C>C) evaluates to false precisely because the increment happens first. If C=4, then ++C>C performs the pre-increment (C=5), and then compares C>C (5>5), returning false.
He didn't mention Free Software and Richard Stallman. fsf.org
The topic at that point was forking UNIX so it could be used freely, with clear constraints. Torvalds was the one who carried this out. Before the Linux kernel, GNU didn't really pick up steam, and vice versa. The two worked off each other, but the reason Stallman wasn't mentioned is that the topic was the legal mess UNIX was in at the time, and Torvalds made an OS to sidestep that.
One of my first jobs in the mid 80's was data input on a time sharing system. If you ever fancied an afternoon off you just had to type fast enough to crash the system and they would have to call an outside engineer to come in to repair it. Happy days!
I could listen to this guy forever, computers and technology are fucking amazing.
Very interesting how things change so much so quickly.
It's amazing how in 30-40 years our small electronic devices are many times more powerful than the big computers back then. It also blows my mind that hard drives were measured in MB when we are using TB hard drives in computers and GB on phones and tablets.
Thanks for the reply :)
I actually watched that video right after this one and was satisfied with Prof. Brailsford's explanation. (The GNU joke at the end made me laugh a lot, actually.)
I just thought I'd leave the comment up here instead of deleting it in order to help educate others who aren't familiar with GNU/Linux.
Hah, I remember running a 64 KB computer at home, a Tandy RadioShack TRS-80 Color Computer II, starting from 1984. It plugged into your TV and ran a version of Microsoft BASIC. I can't remember the graphics resolution, but I think it might have been something like 160x120 with 9 set colours, which were frankly awful and didn't even include a proper white but instead an awful salmon buff colour. It was awesome xD
My programming experience was a bit stunted, however, for two reasons. First, it didn't have a disk drive. Instead it used audio cassette tapes as mass storage. They were pretty finicky, but worse, we kept losing the cable that connected the tape drive to the computer. I think we went through about 4 of them, and most of the time didn't have access to any sort of non-volatile memory. I think when it did work it could store about 8 full memory dumps or programs, each of which had to be loaded (slowly) into memory with a CLOAD.
Secondly, I didn't understand what PEEK and POKE did. Basically PEEK reads a byte from an address and POKE writes a byte to an address. I didn't understand that all of memory was accessible in this way, or even that the entirety of computer memory _was_ a long array of bytes. What I could have done if I understood this...
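For anyone who never met PEEK and POKE, here is a rough, hypothetical C analogue: an ordinary byte array stands in for the machine's memory, with one function for reading the byte at an address and one for writing it. (On the real CoCo these were actual hardware addresses; this sketch makes no attempt to reproduce that.)

```c
/* A rough, hypothetical C analogue of BASIC's PEEK and POKE.
   An ordinary byte array stands in for the machine's memory. */
#include <stdio.h>

static unsigned char memory[256];                      /* stand-in for RAM */

static unsigned char peek(unsigned addr)               /* like PEEK(addr) */
{
    return memory[addr];
}

static void poke(unsigned addr, unsigned char value)   /* like POKE addr, value */
{
    memory[addr] = value;
}

int main(void)
{
    poke(0x10, 42);                                    /* write 42 at address 0x10 */
    printf("PEEK(0x10) = %d\n", peek(0x10));           /* read it back: prints 42 */
    return 0;
}
```

The point being made above is exactly what this illustrates: memory really is just one long array of bytes, and PEEK and POKE are nothing more than an indexed read and an indexed write into it.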