Just so crazy to me, as a 40-something, watching a very technical bloke who's older than my dad, the most non-technical person you can think of. I don't know anyone technical from that early era of computing personally, but I find it absolutely fascinating to watch and learn about their experiences compared to my own.
He was born before the invention of LISP and Fortran. No PCs. Computers were the size of a room and cost more than an aircraft. Limited access to some kind of printing terminal at a university. That was a really different world.
I grew up with an IT parent in the 80's and 90's... knew of many graybeards from that era. Fast forward to today: my closest friend and my closest coworker, both my age, are non-technical; we're talking "can barely put the mouse pointer here and click" non-technical. "Microsoft Words" non-technical. Naturally I followed the IT parent into IT, then got into firmware development later on. Not that I'm selecting non-technical people as friends; it just so happens to work out that way.
I knew that Denny's well. We were in the (now gone) Overlake business park. I interviewed many prospective employees there :) How many people were hired and fired in that place??? The stories that place could tell. It was such a dive. Duct-taped bench seating and all. It was so great :)
🎯 Key Takeaways for quick navigation:
00:22 🖥️ Dave Cutler discusses early Microsoft operating systems, noting they were non-portable and primarily written in assembly language.
01:57 🍳 Steve Ballmer played a significant role in convincing Dave Cutler and his team to join Microsoft, ultimately leading to their recruitment.
04:25 🖥️ Dave Cutler worked on the Prism project, a RISC architecture project at DEC, but it was eventually canceled due to advancements in single-chip systems.
06:25 🏢 After leaving DEC, Dave Cutler planned to start a server company, but some of his colleagues had moved to Microsoft, and the project's cancellation led to his departure.
In retrospect, it would have been great to meet Dave Cutler back in 1990 to show him the hybrid CISC/RISC bit-slice processor I had designed, which fetched and executed every instruction in a single clock cycle. My computer architecture professor was very impressed, but he was not well-connected in the industry.
@@sdrc92126 Higher scales of integration happened via smaller process technology (i.e. more transistors per chip). This meant that you could put everything on one chip, even for 64-bit CPUs, rather than using separate bit slice chips connected together. The MIPS R4000, released in 1991, was the first 64-bit MPU. Even before that, a multi-chip 64-bit CPU could have been built for less total chips than a bit sliced CPU, using an approach similar to the first POWER CPU from IBM.
Great conversations, giving me flashbacks... As an old VAX/VMS systems admin, DEC had an amazing and stable platform for the time. I took a job in 1993 porting VMS COM files to UNIX scripts and never looked back...
I’ve always been a big fan of Dave Cutler. I knew of him during the nineties. Also, Cairo was my favorite NT release, too. Seeing it run on a friend’s PowerPC machine blew me away.
My first job was as a tech making PCs for a training company, networking them together and troubleshooting any issues thereafter. Windows 95 was their OS. When the blue screen of death appeared, you had to rename user.dat and system.dat every single time, which was very often. An absolute nightmare to support.
Hi Dave--The AMD 2900 series 4-bit ALU that Cutler refers to at about 2:50 is the chip that the processor in the original 'Multiplexer/Demultiplexer' ('MDM') avionics box in the Space Shuttle Orbiter was based on. 4 of them strung together. I got to work on the control store for that box. Neat to hear Cutler mention it!
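That "4 of them strung together" detail is the essence of bit-slice design. As a purely illustrative sketch (the function names and opcode set here are invented, not the 2901's real microinstruction set), here is how four 4-bit ALU slices cascade into one 16-bit ALU by rippling the carry from slice to slice:

```python
def alu_slice(a, b, carry_in, op):
    """One hypothetical 4-bit ALU slice. Returns (4-bit result, carry_out)."""
    if op == "add":
        total = a + b + carry_in
    elif op == "sub":
        total = a + ((~b) & 0xF) + carry_in  # two's-complement subtract
    else:
        raise ValueError(op)
    return total & 0xF, (total >> 4) & 1

def wide_alu(a, b, op, slices=4, carry_in=0):
    """Chain `slices` 4-bit ALUs into a wider ALU by rippling the carry,
    the way four 2901-class chips were cascaded into a 16-bit datapath."""
    width = 4 * slices
    result, carry = 0, carry_in
    for i in range(slices):
        nibble_a = (a >> (4 * i)) & 0xF
        nibble_b = (b >> (4 * i)) & 0xF
        out, carry = alu_slice(nibble_a, nibble_b, carry, op)
        result |= out << (4 * i)
    return result & ((1 << width) - 1), carry

# wide_alu(0x1234, 0x0FCD, "add") returns (0x2201, 0)
# for subtraction, carry_in=1 supplies the +1 of the two's complement
```

Real bit-slice chips also sliced the register file and shift logic the same way; this sketch only shows the carry chain, which is the part that made "stringing them together" work.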
From the bottom of my heart, I would like to thank the previous generation in software development for their passionate, meticulous, hard and smart work. Your work has enriched my life immeasurably. Your work gave me the opportunity to find a deep passion for modern technology, computers and high-level languages. Much of what you were able to do I never learned to understand the way you did. I feel shame and gratitude. Thank you for everything.
I HATE New Outlook Mail. It takes a while to sync older replied/forwarded mails, but when I open it, it often goes BEHIND other programs!!! What the hell is the point of doing that????
I remember all this. And it's fascinating to look back at that time period. (I did a lot of deals at that dennys, and I hated the food - but the DEC guys didn't ) ;)
Speaking of non-portable operating systems, I believe there's still no fully portable version of VMS. Some professor in France was working on an open-source version of it, but last I checked the project hadn't been updated in years.
I had such an extensive library of VMS scripts for productivity enhancement, it was like having my own OS (with VMS under the hood doing the work). I'm thinking it would have served as a nice abstraction layer.
I would say, there doesn’t need to be. The only parts of VMS that users still care about would be DCL command procedures and user-mode code. You can dump all the super/exec/kernel-mode stuff, and implement the remaining needed APIs as an emulation layer on top of a modern Linux kernel. That would be the most cost-effective approach.
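That emulation-layer idea can be sketched as a thin translation shim. This is purely hypothetical (the command table and `translate` helper are invented for illustration, and cover only a toy subset of DCL-style verbs, not any real VMS compatibility product):

```python
import shlex

# Hypothetical mapping from a few DCL-style verbs to POSIX argv prefixes.
DCL_MAP = {
    "DIRECTORY": ["ls", "-l"],
    "TYPE": ["cat"],
    "DELETE": ["rm"],
    "SHOW TIME": ["date"],
}

def translate(dcl_line):
    """Map a DCL-style command line to a POSIX argv list (toy subset).

    Longer verbs are matched first so multi-word verbs like SHOW TIME
    win over shorter prefixes."""
    upper = dcl_line.strip().upper()
    for verb, argv in sorted(DCL_MAP.items(), key=lambda kv: -len(kv[0])):
        if upper.startswith(verb):
            rest = dcl_line.strip()[len(verb):].strip()
            return argv + (shlex.split(rest) if rest else [])
    raise ValueError(f"no translation for: {dcl_line}")

# translate("TYPE login.com") returns ['cat', 'login.com']
```

A real shim would of course also have to emulate the documented system services and RMS file semantics that user-mode code calls, which is where most of the work would be; the command-verb layer is the easy part.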
I am so glad I found your channel. MS wasted so many hours of my life with reboots, blue screens of death, and cryptic error messages. MS did absolute garbage work on Windows; Ballmer and Gates were horrible. I finally had enough about 15 years ago: I got a Mac and never looked back. I have no idea if Windows is any better today; maybe it is, maybe it isn't, but I will never go back. I run all my business stuff on Linux, which is orders of magnitude more reliable than any Windows server.
I love the history and deep dives into the tech that everyone takes for granted now. Another cool topic is the old landline telephony networks. Brilliant man great vid 💯
I worked at DEC, and subsequently my own startup… the thing I took away from this interview is that all startups make critical missteps, but only good ideas and good leadership can pull them through to long-term success.
Memory availability, whether external or on-chip (registers, caches) was a fundamental driver in the evolution of CPU architectures and ISAs. The RISC craze in the early 1990s was a good hint at the pace of the semiconductor industry development. Anyway, Itanium killed the RISC star and then iPhone resurrected it.
RISC never died, and the iPhone isn't what saved it. Sony, Nintendo, and Microsoft shipped several hundred million MIPS, ARM, and PowerPC game consoles starting in 1995, and in that same period ARM became the standard embedded processor in a ton of things. Almost every modern Windows PC has ARM cores inside the mechanical or solid-state hard disks, the WiFi chipset, and if you have an RGB-lighted keyboard it's likely got an ARM in it too. It's funny that Dave Cutler went from having his thunder stolen by RISC and Unix at DEC to having his thunder stolen by RISC and Unix at MS though.
Yeah, there hasn't been a new CISC architecture launched in decades. I do think that RISC is a bit over-hyped in that a lot of the factors that made it a good idea in the 80s are no longer present or have been mitigated by other developments (microcode ROMs are no longer a huge contributor to chip area), and that a much more CISCy design point than your typical RISC architecture is viable, but it's hard to argue that the concept that has dominated all new work in ISA design for 40 years has ever been "dead" during that time. And while it does look like Unices running on RISC architectures will eventually dominate, Windows and x86 probably won't die until the whole Google-driven "make all mobile devices into glorified web terminals for the cloud" trend goes away and mobile computing is no longer deliberately hobbled.
I really miss working with OpenVMS, even if Pathworks as a Domain Controller was a pain in the a..., but eventually I was able to establish Trust relationships between the 33 domains that existed in the company I was working for with a simple DCL script. And another colleague managed to write a DCL virus by writing a script which started with "distribute this script to all machines". Good times.
Got to work with SPARC and MIPS machines in the form of Sun and SGI workstations. Incredibly fast for the times. Incredibly expensive too! Amazing machines! If you have a 5 year old smartphone, you're going way faster, and using a lot of the same architecture. ARM is RISC, and it's coming back in big ways!
I remember the AMD 2901 bit slice processors. They were used in the Data General Eclipses I used to repair, back when I was a computer tech. I'd dig right into the microcode on those. I also worked on VAX 11/780s.
You guys are legends! Thanks for sharing all this history! I got my first computer when I was 12, in 1987... I was a kid but I had a blast navigating the VAX system of the local university, doing BBS... what a great time it was ;-)
“Steve Ballmer took me down to the Denny’s on 148th street right by the Fred Meyer“ - I know where that was! Or at least maybe it was remodeled and then torn down. It’s just like how Douglas Coupland’s Microserfs book mentions the Uwajimaya on 156th and 24th where the Trader Joe’s is now. I’d pay a lot of money to take a time machine back to early-90s Seattle and the Eastside, when it was just Boeing that really made the area. Edit: “Nathan Myhrvold had Bill’s ear…” the molecular gastronomy guy. Ah boy.
Yep, those were the usual haunts for Microsofties back in the day. I drove those places everyday on my way to the campus. I worked in Nathan's Advanced Technology Group.
This is all over my head. All I know is my cell phone has the equivalent power of like 10 supercomputers from the 90s and it's still increasingly slow to the point of being unusable. Code is getting worse and worse.
Too many slow languages, scripting, inefficient algorithms and bloated libraries. Powerful hardware rewards bad programmers and inefficient ideas. When I started programming in the 70s, assembler / machine code was the only way to get usable performance.
It's always interesting to see articles written today saying how productive software development has gotten. I think it's just that more people have been thrown into software development.
@@toby9999 That was the issue, Toby. Writing in assembler or machine code (loved it 😊) took time, not only to write, but to debug. Once memory became cheap, it no longer mattered if they wrote bloatware. I was never really convinced about RISC; customers became obsessed with MIPS rates, so manufacturers went to RISC architectures, which upped the MIPS but as a consequence needed more memory, and ultimately took longer to execute a string of instructions that used to be performed by a few instructions.
In some ways it is possible to return to the old days with the popularity of Arduino-derived small systems for embedded applications. Once again you’ve got to watch how large your code is, conserve memory and optimize program structure for acceptable performance on relatively slow processors. The difference is now we can write in C++-like fashion in an IDE and easily control breakpoints in real time. We probably would have given our eye teeth in 1980 to have an Arduino. On the other hand, the ESP-32 is leading us back to bad habits with its larger memory, faster CPU, spacious flash, etc. The other day I coded a triggered-sweep oscilloscope for a Lilygo dev board with a high-resolution OLED display about the size of an old-school stick of chewing gum in about 300 lines of code in the Arduino IDE. Now Arduino itself has adopted the ESP-32 with the Uno R4. Man, I’d love to time travel back to the 1980s to tease what would be coming 40 years down the road.
Hey Dave, whatever happened to the Microsoft tabletop tablet where you could put your camera on it and it would download the photos? I saw it a long time ago, before the whole touch-screen craze kicked off.
I kinda started on the AS/400 (at work at least; home was 8-bit/16-bit >> PC) and transitioned into NT/2000/PC support. I'd love to have a beer and spend some time with Dave Cutler. Would be amazing to get the inside lines..
@@pf100andahalf At one point IBM had one that was about the size of a PC (it was low-end in AS/400 terms), and there was a time when I wanted one. I was just wrong. It would have been an almost-junk box, and the AS/400 is so tied into a subscription OS/hardware combo. Frankly, I used to look through the Redbooks in a desire to learn more, but it was all a dead end. Anyway... WRKSYSSTS :)
@@AdmV0rl0n My friend's AS/400 was about double the width of a desktop PC and about the same height. This is by memory from 30 years ago so I may be off a bit. He wrote code for OS/400. I don't know exactly what he did, mainly because I'm just a hardware guy, and since it didn't run PC software I was only mildly interested, for historical reasons, but I knew its importance.
@@pf100andahalf It was a fair amount back there // hand waves // - but most of the code from what I remember was RPG, COBOL and some REXX, with OS/400 being the OS. I'm over-generalising, but it was a batch-processing OS with a database for a filesystem. Everything native was terminal/emulation and, by modern standards, clunky. In my later days people were running TCP stacks (early AS/400 networking was pain) and trying to use third-party bolt-on web-based terminals or glorified front ends. There were some x86 boxes you could attach, and later they had virtualisation in the OS. I enjoyed my time with the AS/400, but its end lesson for me was to avoid getting tied into a vendor lock and all that comes with it. The world briefly had personal computers, but we are rapidly shifting back to vendor lock-in, and you'll pay crack-cocaine money for what in the old days was CPU cycles/time; today it will seem larger and grander, but it's very client-server, and it's already hitting a point where only the rich will be able to play. These things work in cycles, so I look forward to the next client/personal wave. Whether that happens... well...
Worked at an ISP for about ... a decade (a specific one, not in general; that's more years) and we ran a DEC Alpha for one of our servers. The head IT guy was so pumped to have been able to buy a RISC machine. This was when MMX and Pentium Pros either had just come out or were coming out soon. I am bad with time.
3:04 - I worked for the company that sued AMD over the demonstrated fact that their 2901 did not actually work as advertised, and thus was born the 2901A, which set up the industry that Mr. Cutler mentions here...
00:00 📅 Dave Cutler recalls his initial encounter with Microsoft and meeting Bill Gates.
01:02 💼 Cutler discusses his reservations about joining Microsoft despite Bill Gates' enthusiasm.
01:43 🥞 Steve Ballmer's persuasive breakfast at Denny's swayed Cutler and his team to join Microsoft.
04:10 🏗️ The Prism Project: Cutler's involvement in developing RISC architectures at DEC West.
06:25 💔 Prism's cancellation leads Cutler to leave DEC, preparing to start his own company.
He said "Ballmer fleeced us," but I didn't get a good sense of why he said that. I see from other comments there's a "full interview" coming. Guess I'll have to wait. :)
I was team "Big Blue"; everything else was a toy. I wrote some microcode for the 11/44 and couldn't wait to move on. I wrote microcode for the IBM 360/30 and it was a pleasure to work on real hardware. Nothing comes close to Big Blue.
I have one question that I could never find an answer to. At the time of the rise of the desktop GUI, for some reason Microsoft did not include concurrent (multiuser) multitasking. That was the direction Unix and even Microsoft XENIX took in the early 1980s. It even existed on modest PC hardware in the form of TurboDOS and MP/M. To this day, only the latest Windows 11 Enterprise edition does this. Why?
I was at Warrex-Centurion from the late 1970s to the mid-1980s. Centurion started shipping its AMD 2900 series 4-bit ALU based systems in 1979-1980. The Centurion multi-user 8-bit minicomputer systems, called the CPU5, CPU6 and later the CPU7, all used the AMD 290x chips.
This whole interview series with Dave Cutler should be sent to the National Archives. The amount of information about how computers, as we now know them, came to life, is unbelievable. Kudos Dave to share these stories with us mortals.
Check out the Computer History Museum's oral history series; they have a few dozen of these.
@@johnsimon8457 I like CHM's series, but I like this Dave Cutler interview better than theirs, probably because it's two technical Microsoft insiders talking together at a similar level
AT LEAST to the National Archives.
Nailed it
Agreed!
My boss told me how he turned down a job in 1980 with Microsoft because the naval shipyard was offering a few dollars per hour more. He said the MS recruiter told him they had profit sharing, but my boss didn't appreciate what that could mean so he stuck to his secure government job. He's hated himself ever since.
I met a guy that said turning down Microsoft cost him 21 million. I was briefly at Microsoft (not in the good times) and if you got into the right team it was great but get into a bad team and it's a nightmare (which is what happened to me).
We want more of Dave Cutler!
and more Dave's P. spicy memories forging solid solutions
Please do please us!
This guy was hired to rewrite the ("pathetic") underpinnings of the computing world and he sat down and did it. Obviously with lots and lots of help--but he directed the show. The fanout of his work is mind boggling.
All he did was reinvent VMS, and carry over the anti-Unix culture from DEC. This is why the entire rest of the computing world is now dominated by Linux.
@@lawrencedoliveiro9104 "entire rest of the computing world" is like 9% lol 😂
@@nneeerrrd More Android devices ship per year than the entire Microsoft Windows installed base.
Maybe he's counting routers @@nneeerrrd
@@lawrencedoliveiro9104 it's funny because android isn't counted as linux in any market share reports I can find. But if you do, linux clearly dominates for end user devices. That and 90% of the server market. Everything but desktop.
This is pure gold Dave. Bring on the full interview!
Dave Cutler: "ALL of Microsoft operating system software was pretty pathetic, it was all assembly, it was non-portable..."
Dave's Garage: "Ayy, I wrote some of that"...
lol! Another thing we love about Dave's Garage: brutal honesty (besides an insider's peek into the belly of the Beast)! 😅
Remember, Cutler had just left DEC in a huff, after they cancelled his project to create a successor to the VAX and VMS. So he went to Microsoft and basically reinvented VMS.
Windows NT was supposed to be portable across multiple architectures, yet in the end, every single non-x86 port failed.
@@lawrencedoliveiro9104 It's not over yet. Qualcomm is making significant investments in the PC platform.
@@lawrencedoliveiro9104 Well, ARM seems to get some traction
@@d0rban More ARM processors ship per year than the entire population of the Earth. But almost none of them run Windows.
@@lawrencedoliveiro9104 ARM devices account for the majority of electronic waste thanks to smartphones, tablets, Chromebooks, and MacBooks, while x86 systems can last way longer than ever before. Unless ARM can be used just like x86, with a replaceable CPU, RAM, and SSD, it shouldn't be mandatory for the PC market.
Ahhhhhhhh!! Can't wait for the full interview to come out!!
I cannot wait for the full thing to be released Dave!
Doing a series with these early-days developers is a fantastic idea!
That remains to be seen. I'm afraid watching the full interview is going to be tedious now, with all the repetition of the parts already shown in these clips.
Love these!! I was just a wee lad back in the 90's but to know where all these things came from that were so foundational to how we use computers today is so cool!
@@brandonsummers6360 I was majoring in physics at the time and instead of doing my quantum homework used to spend hours a day on usenet for any new news of NT. I'm sure I learned more about computers than anything else by graduation time despite never taking a computer class.
After grad school, I was offered a job doing RISC CPU design at Hewlett-Packard in 1979, but took a job at Bell Labs instead. I had done my grad school work on a PDP-11/45, and was incredibly impressed with the orthogonality and flexibility of the PDP-11 architecture. On the periphery of all of this stuff; fascinating to hear Dave Cutler talk about this. Thanks!
RISC design in 1979?
I programmed in assembly on both PDP-11 and VAX-11. The VAX-11 was even more orthogonal than the PDP-11, and even more addressing modes!
That sounds a decade too early. The first papers on RISC didn't come out until the mid-80's, and that was only AFTER the failure of the VAX-11 series to deliver anticipated performance. The primary reason: virtual-memory page faults (due to memory being swapped out to disk) were causing addressing failures in the middle of instructions, forcing those partially-completed instructions to be "unrolled" before executing the interrupt routine for the virtual memory system to load the correct page, so that the instruction could be re-tried (and if more memory references remained, possibly triggering ANOTHER virtual memory page fault...). The VAX-11 has the most beautiful, elegant instruction set I have ever seen, and it was entirely WRONG.
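The mid-instruction fault scenario described above can be modeled with a toy interpreter. This is a loose sketch, not real VAX semantics: the `ToyMachine`, its fault model, and the `swap` instruction are all invented here to show why an instruction with several memory operands must undo partial writes before a retry:

```python
class PageFault(Exception):
    pass

class ToyMachine:
    """Toy model of a CPU whose instructions touch several memory operands.
    Any access can fault; partially completed writes must be rolled back
    ("unrolled") before the OS handles the fault and the instruction retries."""

    def __init__(self, resident, writable):
        self.mem = {}                  # address -> value
        self.resident = set(resident)  # addresses that can be read
        self.writable = set(writable)  # addresses that can be written

    def read(self, addr):
        if addr not in self.resident:
            raise PageFault(addr)
        return self.mem.get(addr, 0)

    def write(self, addr, value):
        if addr not in self.writable:
            raise PageFault(addr)      # e.g. a page that must be paged in first
        old = self.mem.get(addr, 0)
        self.mem[addr] = value
        return (addr, old)             # undo record for unrolling

    def swap(self, a, b):
        """A CISC-style memory-to-memory instruction: if the second write
        faults, the first write must be undone and the whole instruction
        restarted once the OS fixes the page."""
        undo = []
        try:
            va, vb = self.read(a), self.read(b)
            undo.append(self.write(a, vb))
            undo.append(self.write(b, va))    # may fault after a was written
        except PageFault as fault:
            for addr, old in reversed(undo):  # unroll partial effects
                self.mem[addr] = old
            faulted = fault.args[0]           # the OS "pages in" the page
            self.resident.add(faulted)
            self.writable.add(faulted)
            self.swap(a, b)                   # retry the whole instruction
```

A RISC-style load/store ISA sidesteps the bookkeeping: each instruction touches at most one memory operand, so a fault never leaves a half-done instruction to unroll.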
But as far as I know, NOBODY was really aware of that in 1979, let alone aware of it enough to come up with the idea of, and HIRING people for CPUs designed with an instruction set in the exact opposite fashion.
Teasing us with all these clips of this EPIC interview!!! I can’t wait for the full interview to go live!
No, Cutler was from DEC, not HP.
I can relate to most things Dave Cutler is talking about. I was in college in the 80s and we had DOS machines for simple word processing and programming in Basic. For real programming we had a VAX/VMS with most compilers - C, Ada, Pascal, Fortran etc. I see why he says Microsoft OSes were pathetic compared to VMS. Cannot wait for the full interview!
Real computing and the advance of computer science was done on PDP's or Lisp machines. VAX/VMS or IBM's mainframes were bought by companies that were already stuck with prior hardware or the same vendors. VAX for real computing lol
The world of TSRs is a travesty.
@@Traumatree haha. COBOL reel-to-reel sequential storage. Microsoft provided a different business paradigm, a "get out of the closet with your Wang terminal and Token Ring" type of mindset. We all hate proprietary, but evolution is evolution. Microsoft undercut everybody for years, and that's how they won the office. If Jobs hadn't come back, Apple would have hung itself with a SCSI chain attached to a Motorola StarMax clone.
So Microsoft now is not pathetic?
@@dinozaurpickupline4221 correct. Mainframes/centralized computing was pathetic. Microsoft is devolved. The difference is point of no return. They are getting close. What started the computing lifestyle as we know it, is slowly being written into history... unless they choose to re-architect their crappy OS.
These interviews with Dave Cutler are terrific! I've been a huge fan of Cutler since he was at DEC, and these are the first interviews of him I've ever seen. Way Cool! Also, I remember that building up the hill near the 520, 405 intersection. It used to say DIGITAL on it, then later MICROSOFT. Now you can hardly see it because the trees have grown so much, but I think it's a bank now.
Yes, a bank... vested by MS!
😂
These interviews are about the years that I worked at MS. Fun to hear! Thanks Dave!
This is a pleasure to watch. As i have said before, you actually let people talk without interrupting too much. The lighting and sound are perfect. And of course, very interesting topics :)
These interviews are great!
Awesome stuff! Can't wait for the whole thing.
I understand the hesitancy to work for them, but you don't know that until you get there. I was once headhunted to go and run a few software projects for one of the best Middle Eastern airlines - therefore one of the best airlines in the world by far. And when I finally got there, after a five-month recruitment process and moving my family across the world, my first thought was how the fuck do these people even keep planes in the air. It was absolute chaos, and a culture that was basically fuelled by slave-driving and fear. They hired me because I knew what they needed to do to get beyond the sticking points they were facing, and that was the sole basis on which they hired me. Then when I got there, first day, they were trying to lay out exactly how I should do my job. So after six months I fucked them off. Quite often in life you find out that when you look behind the curtain, the Wizard of Oz isn't really a wizard at all.
lol sounds like emirates
URGGGHHHHH waiting till October 21 but wanting it now ! Dave you produce some of the best content.
Right? lmao
Fantastic interview David, you are doing a huge service to the community, for enthusiasts and historians!
Really excited for this one! Having someone like yourself who was there interviewing Dave will be fascinating to listen to. I have heard others interview him, but you - you can ask him things that others would not even know about. Can't wait!
Hope I did you proud in the full interview! But I only got about 1/3 of the way through my questions :-)
@@DavesGarage Dave, if there's anything I have learned from you, it's that you have a very good way with words and breaking down complex things into digestible bits. I've learned a lot from you in the 6 months I have been subbed. I'm betting this interview will be great! 🌝 Also, Cutler is a complex man with a rich history, so I am not surprised lol!
@@DavesGarage If so, then I'll pray for part II and part III in a few weeks or months! ;-)
Would be great to hear Dave's opinion on modern Windows on ARM approach and how his system design is/isn't paying off in it.
Always appreciate your videos, Dave!
Thanks for having Cutler on, he's such an unsung hero of so much in the tech world.
Can't wait to hear the whole thing!
Just so crazy to me, as a 40-something, watching a very technical bloke older than my dad, who is the most non-technical person you can think of. I don't know anyone technical from that early era of computing personally, but I find it absolutely fascinating to watch and learn their experiences compared to my own.
He was born before the invention of LISP and Fortran. No PCs. Computers the size of a room and costing more than an aircraft. Limited access to some kind of printing terminal at a university. That was a really different world.
I grew up with an IT parent in the 80's and 90's... Knew of many graybeards from that era. Fast forward to today, my closest friend and also my closest coworker, both my peer in age, are both non-technical; we're talking "can barely put the mouse pointer here and click" non-technical. "Microsoft Words" non-technical. Naturally I followed the IT parent into IT, then got into firmware development later on.
Not that I'm selecting non-technical people as friends, it just so happens to work out that way.
I could listen to you [and someone] else talk about things I have absolutely zero concept of for hours.
Looking forward to this
awesome! killing me with these teaser clips Dave 😄
I knew that Denny's well. We were in the (now gone) Overlake business park. I interviewed many prospective employees there :) How many people were hired and fired in that place??? The stories that place could tell. It was such a dive. Duct-taped bench seating and all. It was so great :)
These are great Dave! Looking forward to seeing more!
Awesome, thank you!
Ballmer was a brilliant enterprise software salesman/strategist; he was absolutely clueless about the consumer side
🎯 Key Takeaways for quick navigation:
00:22 🖥️ Dave Cutler discusses early Microsoft operating systems, noting they were non-portable and primarily written in assembly language.
01:57 🍳 Steve Ballmer played a significant role in convincing Dave Cutler and his team to join Microsoft, ultimately leading to their recruitment.
04:25 🖥️ Dave Cutler worked on the Prism project, a RISC architecture project at DEC, but it was eventually canceled due to advancements in single-chip systems.
06:25 🏢 After leaving DEC, Dave Cutler planned to start a server company, but some of his colleagues had moved to Microsoft, and the project's cancellation led to his departure.
love the interview, brings back memories
Thank you Dave!! Looking forward to the full video 😀
In retrospect, it would have been great to meet Dave Cutler back in 1990 to show him the hybrid CISC/RISC bit slice processor I had designed which fetched and executed every instruction in a single clock cycle. My computer architecture professor was very impressed, but he was not well-connected in the industry.
I remember this. It is too bad this never took off. Kinda like rambus memory.
I remember reading about bit slicers in grade school in the 80's and wondered whatever happened with them.
IBM POWER was already doing multiple instructions per clock cycle (“superscalar”) in 1990.
@@sdrc92126 Higher scales of integration happened via smaller process technology (i.e. more transistors per chip). This meant that you could put everything on one chip, even for 64-bit CPUs, rather than using separate bit slice chips connected together. The MIPS R4000, released in 1991, was the first 64-bit MPU. Even before that, a multi-chip 64-bit CPU could have been built for less total chips than a bit sliced CPU, using an approach similar to the first POWER CPU from IBM.
Great conversations, giving me flashbacks... As an old VAX/VMS systems admin, I can say DEC had an amazing and stable platform for the time. I took a job porting VMS COM files to UNIX scripts in 1993 and never looked back...
I’ve always been a big fan of Dave Cutler. I knew of him during the nineties. Also, Cairo was my favorite NT release, too. Seeing it run on a friend’s PowerPC machine blew me away.
My first job was as a tech making PCs for a training company, networking them together and troubleshooting any issues thereafter. Windows 95 was their OS. When the blue screen of death appeared, you had to rename user.dat and system.dat every single time, which was very often. An absolute nightmare to support.
I worked in Cupertino for 10 years. The crap Microsoft guys went through in those days, was truly pathetic.
Hi Dave--The AMD 2900 series 4-bit ALU that Cutler refers to at about 2:50 is the chip that the processor in the original 'Multiplexer/Demultiplexer' ('MDM') avionics box in the Space Shuttle Orbiter was based on. 4 of them strung together. I got to work on the control store for that box. Neat to hear Cutler mention it!
From the bottom of my heart, I would like to thank the previous generation in software development for their passionate, meticulous, hard and smart work. Your work has enriched my life immeasurably. Your work gave me the opportunity to find a deep passion for modern technology, computers and high-level languages. Much of what you were able to do I never learned to understand the way you did. I feel shame and gratitude. Thank you for everything.
Cutler is one of my heroes.
Thanks gentlemen.
I HATE New Outlook Mail. It takes a while to sync older replies/forwarded mails, but when I open it, it often goes BEHIND other programs!!!
What the hell is the point of doing that????
Dave Cutler. What a legend.
I remember all this. And it's fascinating to look back at that time period. (I did a lot of deals at that dennys, and I hated the food - but the DEC guys didn't ) ;)
Speaking of non-portable operating systems, I believe there's still no fully portable version of VMS. Some professor in France was working on an open-source version of it, but last I checked the project hadn't been updated in years.
I had such an extensive library of VMS scripts for productivity enhancement, it was like having my own OS (with VMS under the hood doing the work). I'm thinking it would have served as a nice abstraction layer.
I would say, there doesn’t need to be. The only parts of VMS that users still care about would be DCL command procedures and user-mode code. You can dump all the super/exec/kernel-mode stuff, and implement the remaining needed APIs as an emulation layer on top of a modern Linux kernel. That would be the most cost-effective approach.
As a retired cable TV guy. I feel that I played a large role in the Microsoft launch.
Wow, we learned a lot of stuff that was never disclosed before. Can't wait for the rest of the videos!
Thanks for breaking this up into chunks.
I am so glad I found your channel. MS wasted so many hours of my life with reboots and blue screens of death and cryptic error messages. MS did absolute garbage work on Windows. Ballmer and Gates were horrible. I finally had enough about 15 years ago; I got a Mac and never looked back. I have no idea if Windows is any better today - maybe it is, maybe it isn't - I will never ever go back. I run all my business stuff on Linux, which is orders of magnitude more reliable than any Windows server.
My sentiments sir.
This is really interesting! I'm looking forward to watch the whole interview.
I love the history and deep dives into the tech that everyone takes for granted now. Another cool topic is the old landline telephony networks. Brilliant man great vid 💯
I worked at DEC, and subsequently my own startup… The thing I took away from this interview is that all startups make critical missteps, but it's only good ideas and good leadership that can pull them through to long-term success.
Memory availability, whether external or on-chip (registers, caches) was a fundamental driver in the evolution of CPU architectures and ISAs. The RISC craze in the early 1990s was a good hint at the pace of the semiconductor industry development. Anyway, Itanium killed the RISC star and then iPhone resurrected it.
RISC never died, and the iPhone isn't what saved it. Sony, Nintendo, and Microsoft shipped several hundred million MIPS, ARM, and PowerPC game consoles starting in 1995, and in that same period ARM became the standard embedded processor in a ton of things. Almost every modern Windows PC has ARM cores inside the mechanical or solid-state hard disks, the WiFi chipset, and if you have an RGB-lighted keyboard it's likely got an ARM in it too. It's funny that Dave Cutler went from having his thunder stolen by RISC and Unix at DEC to having his thunder stolen by RISC and Unix at MS though.
Yeah, there hasn't been a new CISC architecture launched in decades. I do think that RISC is a bit over-hyped in that a lot of the factors that made it a good idea in the 80s are no longer present or have been mitigated by other developments (microcode ROMs are no longer a huge contributor to chip area), and that a much more CISCy design point than your typical RISC architecture is viable, but it's hard to argue that the concept that has dominated all new work in ISA design for 40 years has ever been "dead" during that time. And while it does look like Unices running on RISC architectures will eventually dominate, Windows and x86 probably won't die until the whole Google-driven "make all mobile devices into glorified web terminals for the cloud" trend goes away and mobile computing is no longer deliberately hobbled.
Great interview. Also really enjoying the PiDP-11 replica in the background
I really miss working with OpenVMS, even if Pathworks as a Domain Controller was a pain in the a..., but eventually I was able to establish Trust relationships between the 33 domains that existed in the company I was working for with a simple DCL script.
And another colleague managed to write a DCL virus by writing a script which started with "distribute this script to all machines".
Good times.
4:55 For those who are interested, there are some DEC internal memos discussing PRISM and MICA among the Bitsavers collection.
Thanks...enjoying browsing the archives.
I wish I knew this was a teaser so I could have just waited for the full thing.
He seems such a nice guy. Nice to hear the man after all these years.
Remember the 'Halloween Documents'?
Got to work with SPARC and MIPS machines in the form of Sun and SGI workstations. Incredibly fast for the times. Incredibly expensive too! Amazing machines! If you have a 5 year old smartphone, you're going way faster, and using a lot of the same architecture. ARM is RISC, and it's coming back in big ways!
Do explain please!
I remember the AMD 2901 bit slice processors. They were used in the Data General Eclipses I used to repair, back when I was a computer tech. I'd dig right into the microcode on those. I also worked on VAX 11/780s.
The full thing will be interesting.
You guys are legend ! Thanks for sharing all this history !
I got my first computer when I was 12, in 1987... I was a kid but I had a blast navigating the VAX system of the local university, doing BBS... what a great time it was ;-)
Another great interview. Here's to many more (hoping)!
I thought he said "prison" project the first few times. I've been on a few projects like that.
Amazing, all the different computers, OSes, compilers, etc. that have been built.
“Steve Ballmer took me down to the Denny’s on 148th street right by the Fred Meyer“ - I know where that was! Or at least maybe it was remodeled and then torn down.
It’s just like how Douglas Coupland’s Microserfs book makes mention of the Uwajimaya on 156th and 24th where the Trader Joe’s is now. I’d pay a lot of money to take a time machine back to early-90s Seattle + Eastside when it was just Boeing that really made the area.
Edit: “Nathan Myhrvold had Bill’s ear…” the molecular gastronomy guy. Ah boy.
Yep, those were the usual haunts for Microsofties back in the day. I drove those places everyday on my way to the campus. I worked in Nathan's Advanced Technology Group.
It's Dave Cutler! The father of the NT kernel!
More Dave Cutler please.
Christ, I used to live on 148th NE for many years on the Bellevue / Redmond border...I remember those locations.
This is all over my head. All I know is my cell phone has the equivalent power of like 10 supercomputers from the 90s and it's still increasingly slow to the point of being unusable. Code is getting worse and worse.
Too many slow languages, scripting, inefficient algorithms and bloated libraries. Powerful hardware rewards bad programmers and inefficient ideas. When I started programming in the 70s, assembler / machine code were the only ways to get usable performance.
It's always interesting to see articles written today saying how productive software has gotten. I think it's just more people thrown into software development.
@@toby9999 That was the issue, Toby. Writing in assembler or machine code (loved it 😊) took time, not only to write, but to debug. Once memory became cheap, it no longer mattered if they wrote bloatware. I was never really convinced about RISC; customers became obsessed with MIPS rates, so manufacturers went to RISC architectures, which upped the MIPS but as a consequence needed more memory, and ultimately took longer to execute a string of instructions that used to be performed by a few instructions.
In some ways it is possible to return to the old days with the popularity of Arduino-derived small systems for embedded applications. Once again you've got to watch how large your code is, conserve memory, and optimize program structure for acceptable performance on relatively slow processors. The difference is now we can write in C++-like fashion in an IDE and easily control breakpoints in real time. We probably would have given our eye teeth in 1980 to have an Arduino.
On the other hand, ESP-32 is leading us back to bad habits with its larger memory, faster CPU, spacious flash, etc. The other day I coded a triggered sweep oscilloscope for a Lilygo dev board with a high resolution OLED display about the size of an old school stick of chewing gum in about 300 lines of code in the Arduino IDE. Now Arduino itself has adopted ESP-32 with the Uno R4.
Man, I’d love to time travel back to the 1980s to tease what would be coming 40 years down the road.
Can an interview get any duller than this? Chat about personalities and egos of executives.
Hey Dave, whatever happened to the Microsoft table tablet where you could put your camera on it and it would download the photos. I saw it a long time ago before the whole touch screen craze kicked off.
That was very interesting. The fact that I sort of understood what he was saying makes me feel really smart lol. Will have to watch the whole thing.
I kinda started in AS/400 (at work at least, home was 8bit/16 bit >> PC) - and transitioned into NT/2000/PC support.
I'd love to have a beer and spend some time with Dave Cutler. Would be amazing to get the inside lines..
A friend of mine that worked at IBM had an AS/400 in his bedroom and every time I came over he was doing something with it.
@@pf100andahalf At one point IBM had one that was about the size of a PC (it was low end in AS/400 terms) - and there was a time where I wanted one.
I was just wrong. It would have been an almost junk box, and AS/400 is so tied into a subscription OS / hardware combo - and frankly I used to look through the Redbooks in a desire to learn more - but it was all a dead end.
Anyway... WRKSYSSTS :)
@@AdmV0rl0n My friend's AS/400 was about double the width of a desktop pc and about the same height. This is by memory from 30 years ago so I may be off a bit. He wrote code for OS/400. I don't knew exactly what he did, mainly because I'm just a hardware guy and since it didn't run pc software I was only mildly interested for historical reasons, but I knew its importance.
@@pf100andahalf It was a fair amount back there // hand waves // - but most of the code from what I remember was RPG, COBOL and some REXX, with OS400 being the OS. I'm over generalising, but it was a batch processing OS, with a database for a filesystem. Everything native was terminal/emulation, and by modern standards - clunky. In my later days people were running TCP stacks (early AS/400 'network' was pain) and trying to use 3rd party bolt on web based terminal or glorified front ends. There was some X86 boxes you could attach and later they had virtualisation in the OS.
I enjoyed my time with AS/400 - but its end lesson for me was to avoid getting tied into vendor lock-in and all that comes with it. The world briefly had personal computers, but we are rapidly shifting back to vendor lock-in, and you'll pay crack-cocaine money for what in the old days was CPU cycles/time. Today it will seem larger and grander, but it's very client/server, and it's already hitting a point where only the rich will be able to play. These things work in cycles, so I look forward to the next client/personal wave.
Whether that happens... well...
Worked at an ISP for about... a decade (a specific one, not in general; that's more years) and we ran a DEC Alpha for one of our servers. The head IT guy was so pumped to have been able to buy a RISC machine. This was when MMX and PPros either just came out or were coming out soon. I am bad with time.
Thank you for the snack sized bytes. Looking forward to the 21st.
3:04 - I worked for the company that sued AMD over the demonstrated fact that their 2901 did not actually work as advertised, and thus was born the 2901A, which set up the industry that Mr Cutler mentions here...
The title says it all: "Microsoft's "Pathetic" Operating Systems"
Any insights into OS/2 Warp?
Steve Ballmer looks so good I didn't even recognize him
Developers, Developers, Developers, Developers.
Dave is still sharp as a knife. Awesome guy.
00:00 📅 Dave Cutler recalls his initial encounter with Microsoft and meeting Bill Gates.
01:02 💼 Cutler discusses his reservations about joining Microsoft despite Bill Gates' enthusiasm.
01:43 🥞 Steve Ballmer's persuasive breakfast at Denny's swayed Cutler and his team to join Microsoft.
04:10 🏗️ The Prism Project: Cutler's involvement in developing RISC architectures at DEC West.
06:25 💔 Prism's cancellation leads Cutler to leave DEC, preparing to start his own company.
He said "Ballmer fleeced us," but I didn't get a good sense of why he said that.
I see from other comments there's a "full interview" coming. Guess I'll have to wait. :)
Have you ever done a Windows vs OS/2 Warp video?
Pathetic, Ballmer, Denny's is the kind of clickbait I'm here for. And not a single word in all caps.
Coulda used some more scare quotes though.
Reminds me of those interviews with Veterans that tell you what really happened during WWII or Vietnam
I've eaten at that Denny's many times before it closed. No job offers, though :P
damn, cut out @ 7:50... LOL GERRRR!!!!
If I could write assembler I’d spend all my time writing software for the SNES and GameBoy Advance- the golden era as far as I’m concerned 😅
I was team "big blue"; everything else was a toy. I wrote some microcode for the 11/44 and couldn't wait to move on.
I wrote microcode for the IBM 360/30 and it was a pleasure to work on real hardware.
Nothing comes close to Big Blue
Is a full uncut interview going to be posted? It cuts off rather abruptly. Interested in hearing it more continuously
I have one question that I could never find an answer to. At the time of the rise of the desktop GUI, for some reason Microsoft did not include concurrent (multi-user) multitasking. That was the direction Unix and even Microsoft XENIX took in the early 1980s. It even existed on poor PC hardware in the form of TurboDOS and MP/M. To this day, it is only the latest Windows 11 Enterprise edition that does this. Why?
Breakfast at Denny's? Seriously. 😂😂
I was at Warrex-Centurion in the late 1970s to mid 1980s. Centurion started shipping its AMD 2900 series 4-bit ALU based systems in 1979-1980. The Centurion multi-user 8-bit minicomputer systems called the CPU5, CPU6 and later the CPU7 all used the AMD 290x chips.
I love the comment thread here,so much to know
This is great, thanks!
I was just cleaning my garage and found a copy of Windows 7 beta
Oh here it is!