Yeah, ARM processors are actually the most common. Your typical PC has something like 15 of them in it, doing everything from decoding keyboard presses to managing the internals of the SSD.
Modern x86 processors (Atom excluded) translate x86 instructions into micro-ops for an internal RISC-like execution engine. Meanwhile, the ARM ISA has expanded quite a lot in recent years, particularly with floating-point extensions.
I love watching these shows and feeling nostalgic, but this episode I find annoying, because at this point a 16-bit RISC architecture had already been available for 11 years. The Texas Instruments TMS9900 series has been recognised as RISC, and the architecture was years ahead of its time. The TI-99/4A was my first computer.
19:18 And then Cyrix (I think it was Cyrix) cleverly turned the x86 CPU into a translator that converts CISC instructions into small "micro-op" RISC instructions. Intel and AMD quickly jumped on that bandwagon.
All CISC machines already used microcode for their complex instructions. But the execution units' pipelines were deepened to allow higher clock frequencies. Speculative branching was introduced, etc, etc.
13:23 Wow... $10K. I just ordered my new iMac; it holds 64 GB of RAM and has a 2 TB SSD. Now I wonder how much this iMac would have cost back in 1986! I bet it would be in the millions!
@Невада большевик And that whole zeal for "compatibility" is keeping the PC side stuck on the overheating Intel x64 architecture. It was time to move to RISC 10+ years ago. Even AMD is going to jump eventually.
@Невада большевик I have and I’m aware AMD essentially built x64. I’m not blasting AMD here at all. In fact, Intel would be more screwed if not for AMD. Doesn’t mean RISC of some form, whether it is ARM or RISC-V, isn’t a better path forward.
13:20 Doesn't even miss a beat: "This machine as it sits here on the table costs $10,000". Ah, those were the days when computer sales professionals wore suits, drank scotch in airport lounges, and buying a niche-market workstation was like buying an imported sports car.
$10,000 in 1986, adjusted for inflation, is $23,000 in 2018
@@MonkeyForNothing In other words, four fully equipped Threadrippers or two top end Xeon systems.
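For anyone curious how that adjustment is computed, here's the arithmetic as a quick Python sketch. The CPI values are approximate US CPI-U annual averages pulled in for illustration, so treat the result as a ballpark rather than an official figure:

```python
# Back-of-envelope inflation adjustment.
# CPI figures are approximate US CPI-U annual averages (assumed for illustration).
CPI_1986 = 109.6
CPI_2018 = 251.1

price_1986 = 10_000
price_2018 = price_1986 * CPI_2018 / CPI_1986
print(f"${price_1986:,} in 1986 is roughly ${price_2018:,.0f} in 2018 dollars")
# -> $10,000 in 1986 is roughly $22,911 in 2018 dollars
```

Which lines up with the ~$23,000 figure above.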
I just find it refreshing that they immediately say the price. It seems nowadays that they are very hesitant to state the price and always kind of grin and go "Well....". It's like, if you're uncomfortable stating the price then maybe you're asking too much for it.
@@nonconsensualopinion Even back then they were hesitant to say the price; this was one of the exceptions. It was just more or less understood that you were gonna shell out 4-5 figures.
@@nonconsensualopinion The reps for consumer computers in other episodes usually hem and haw and try to give prices for down-specced versions of the same machine first to soften the blow. The IBM reps don't really care because they could charge whatever they wanted, and most of their volume probably paid much less anyway as part of a large contract.
Gary Kildall... his name still pops into my head whenever I have to fiddle around in the BIOS. And here he is interviewing people for TV, fascinating!
did you get Risc on your computer?
Meanwhile, back in Cambridge, England, a little known computer company called Acorn were working on their first RISC machine, the Archimedes, incorporating the RISC CPU which they had designed a year prior in 1985.
That CPU?
The ARM, which is now at the core of every smartphone, tablet, and millions of other devices we use today that demand lower power consumption than the Intel x86 architecture can deliver.
I know this is an old post, but Acorn's RISC CPU was actually based around Berkeley's RISC research, conducted by the very same David Patterson in this interview. That research went on to help design the SPARC and RISC-V CPUs as well.
Here's another trivia item: Acorn was actually looking at the new 65816 chip for their next-generation computer (the chip that ended up in the Apple IIGS and the Nintendo SNES). They decided it was too complex yet underperforming, and not really the direction of the future, leading to their decision to make their own simple processor.
I was watching an interview with David Jaggar, who worked at ARM in the 90's. He said there have been 150 billion ARMs sold to date. Not all are application processors; many are very small microcontrollers. Still pretty impressive.
Interesting that the RISC architecture, which started out in high-end workstation/server processors, ended up in the mobile arena for consumer devices, while CISC/x86 was able to overcome its limitations and completely take over the workstation/server market.
Not sure if “overcome” is really the right word. :-) More like “brute force” and of course rely heavily on predictive branching, caching, pipelines... I sometimes wonder whether we would have been better off if RISC had had the software support to become viable.
I loved tuning into computer chronicles back in the 80's and 90's. Love watching these old episodes in 2022.
If only Gary could have seen the technology today, even just in our phones.
They wouldn't be as blown away as you think.
Also phones don't represent super advanced technology.
@@tr1p1ea As a computer science student: our phones would be considered super advanced tech back then... Have you watched ANY of these episodes? Your phone is more than 5000x faster than even supercomputers of the time. Have you at all taken in what even advanced personal computers could do at the time? And how bulky they were? Man, you have no idea of the power inside your own tiny phone compared with their few KB (not GB, not even MB) of RAM. By the mid-80s you could get 10-20 MB hard drives that were huge, and again, not GBs...
Gary was reportedly working on a mobile phone before he died: essentially a handheld computer that had smart-control features, office and home PBX linking, and wireless connectivity while on the go. He might not be as surprised as you think. The man was quite the visionary.
Computer Chronicles was frequently ahead of its time. I always thought of RISC as an early-to-mid 1990s thing. 1986? I love that about Computer Chronicles (:
April 4, 2013
LOL I know right?
It was the British company Acorn that took RISC and ran with the principle, designing their own RISC chips and placing them into their own Acorn Archimedes series starting in 1987. Acorn over time morphed into ARM, which you've probably heard of. This English company still designs the vast majority of RISC CPUs in mobile phone technology to this day. And RISC proved to be a valuable philosophy after all, especially for low powered computing needs.
@@david-spliso1928 Technically MIPS was the first RISC processor, introduced back in 1981, but the theoretical basis for RISC goes back to the 1960s, while there were several multiple-chip IC CPUs and microprocessor CPUs in the 1970s for mainframes and minicomputers that can retroactively be categorized as RISC or RISC-like.
ARM is really the most successful by far, but back in the 1990s many believed IBM's Power architecture would succeed x86, which never happened. Now many believe RISC-V will succeed x86.
Anyway, talking about CISC and RISC nowadays is kinda pointless, because CISC processors are more and more RISC-like, while RISC processors are more and more CISC-like.
I believe the CISC/RISC distinction is largely obsolete these days.
@@elimalinsky7069 It's widely accepted that the IBM 801 was the first RISC processor.
Motorola 6800
Probably one of the most profound episodes of the series. The path of RISC was rocky and buried in marketing slogans meant to confuse people, so it wasn't a direct path to success. Even those who predicted success wouldn't have come close to realizing what these processors would be used for and how hugely they would impact society.
$10,000.00 for a computer with 4 megs of RAM? I'm just like, man, did they see you guys coming with that machine 🤣
Even so, in the long run, it changed how computers and phones are today and definitely accelerated device portability even though RISC was originally in only desktops.
It's impressive for a TV programme to assume their audience has some knowledge. You'd be guaranteed to have your intelligence insulted by a computer show these days.
Have to remember that back then there weren't many other ways of getting specialized current events. This show was often the first time people heard of new hardware, software, or upcoming technologies in a practical, consumer-oriented way (vs. purely scientific). It was exciting to listen to and watch each and every episode. In contrast, today it's hard to sustain such shows on TV, since there are plenty of alternative ways for people to get the same information. TV shows these days need to be flashy to keep people's attention.
Back in those days Americans were the leaders and their populace reflected that. Nowadays Asians are reaching that position: East Asians at first, and eventually South Asians will get there as well. Meanwhile, Western children took their lifestyle for granted and developed a sense of entitlement. They're losing interest in serious education and becoming distracted by narcissism, materialism and hedonism, which has been accelerated by the internet.
Well, for much of the run of the show computers were still fairly pricey, so they could assume that if you could afford the hardware, you were likely intelligent enough to understand and use it. Computers are so cheap and relatively easy to use these days that any braindead slob can use them.
I have to agree. I was born in the mid 80s, around the time Computer Chronicles started to air, and I am impressed with the fact that they had such a great show back then. They need to bring informative shows like this back. Children who are interested in computing science and electrical engineering would gain so much from them at an early age.
I think you're spot on. That's not to say people have become dumber, but let's be honest, I certainly don't believe we've gotten any smarter. The focus on things like intersectionality and political correctness has dramatically shifted our ability to communicate openly on tough subjects without fear of some kind of backlash. I think this has a lot to do with why we don't see good programming like this today. If you watch Linus Tech Tips, you can probably get what I am saying, and yes, sometimes I feel as if my intelligence is insulted by many topics, but I don't mind as long as they get the facts right, and that's often the problem.
Gary's happy face is a treat!
RISC, in the end, won the battle. Things like the new Apple chip aren't even nails in the coffin but flowers over the grave. ARM is everywhere - on cell phones and wifi dongles. The fact that it's entering servers and PCs just now is what gives the impression the Intel x86 family rules the CPU world.
The x86 world has been RISC-based internally since the Pentium Pro.
Open source RISC-V is starting to become a big thing.
It’s much better produced and explained than most YouTubers' explainer videos.
I love watching these shows here in 2019.
2020, better known as "the Corona year"
Discovered this channel in 2020 and can’t get enough.
It reminds me of how much has changed since 1981. Many of these vendors and their products were cutting edge in the '80s and early '90s. But I'm also blown away and reminded of how expensive a desktop and/or its accessories were. Most of us could only dream of owning a new, current desktop computer. But I think a non-connected desktop computer sort of made you a smarter computer user, because you really had to know your stuff to have technical expertise. No Google to look things up or to provide a solution for your issues.
Hey from 2021
Hello from 2023
I was born in the 90's and I'm a web developer now, but I love watching this shit. All the stuff we take for granted now was so new and cutting edge at one point.
Wow, Jen and George knocked it out of the park with their insights! The HP guy spoke quite well too.
I really like that they were so good at making sure the show was approachable to everyone but didn't talk down to the viewer.
do you have Risc on your computer?🤔
@@raven4k998 Every Intel and AMD processor today is a RISC CPU at its core. They use micro-ops to implement the x86 CISC instruction set on top of it.
The first x86-based RISC CPU to use these micro-ops was the Pentium Pro.
And the ARM architecture has been RISC from the beginning. Your smartphone is very likely running on an ARM processor. The same applies to the Raspberry Pi.
@@OpenGL4ever you're confused, RISC is Linux not the CPU, silly goose 🤣🤣🤣
@@raven4k998 Why are you talking BS? RISC is a philosophy for implementing a CPU in a certain way; Linux is an operating system.
Did you even understand what I said?
@@OpenGL4ever shh on you, Linux is a RISC-based OS, it runs so fast compared to Windows. A CPU is a CPU, not RISC at all.
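Since the micro-op point above keeps coming up in this thread, here's a toy decoder in Python that shows the idea. The mnemonics and the micro-op tuple format are invented for illustration; they bear no relation to Intel's actual internal encoding:

```python
# Toy sketch of CISC-to-micro-op cracking (hypothetical formats, not Intel's).
# A memory-destination instruction splits into simple load/compute/store steps;
# a register-register instruction passes through unchanged.

def decode(insn):
    """Split one CISC-style instruction into RISC-like micro-ops."""
    op, dst, src = insn
    if op == "ADD" and dst.startswith("["):  # memory destination
        addr = dst.strip("[]")
        return [
            ("LOAD",  "tmp", addr),   # read memory into a temporary register
            ("ADD",   "tmp", src),    # plain register-register add
            ("STORE", addr,  "tmp"),  # write the result back to memory
        ]
    return [insn]                     # simple ops map 1:1

for uop in decode(("ADD", "[0x1000]", "eax")):
    print(uop)
# ('LOAD', 'tmp', '0x1000')
# ('ADD', 'tmp', 'eax')
# ('STORE', '0x1000', 'tmp')
```

The execution engine underneath only ever sees the simple three-operand steps, which is the sense in which people say a modern x86 is "RISC inside".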
Living in this future is pretty cool. I remember watching these episodes as a kid and dreaming of what would become. If they only knew how far RISC would come.
This is a beautiful and refreshing comment.
It was a risc, but it paid off.
A much younger David Patterson appears in this episode. He didn't tout the power savings of RISC, because it wasn't until roughly a quarter of a century later that this became important for tablets and smartphones.
I wish Gary could see the computers today.
I find it amazing that they had these discussions in 1986.
People were not dumber in the past. Quite the opposite.
These people built the foundations of most of the things you know
It's interesting that they were talking about the development of AI even back then. It makes sense, but it was something I had never thought of before.
Started in the 1950s
Exciting era to be alive and in the computer field. Unbelievable! So glad I was there... Wow.
What is old is new again. A lot of excitement around RISC-V right now. :D
Yup, came here after all the hype around Apple’s M1 chip to get a primer on RISC.
@@274pacific The "hype" is real. M1 and M2 are amazingly fast per Watt.
God bless Gary Kildall. Rest in peace; he was a genius.
Nope, nope, nope: Bill Gates was a genius; Gary was not that bright, which is why Microsoft took off and is still prevalent even today.
Wild watching this knowing what RISC became. For those of you who don’t know, all iPhones, iPads, and M-series Macs are RISC computers
Intel CISC processors are nowadays microcode running on several simultaneous RISC pipelines. RISC never defeated CISC because CISC transformed to use RISC at its core, running CISC faster than a simple RISC pipeline could run the equivalent instructions. CISC in essence became CRISC.
Didn't know that, very interesting!
And RISC has been adding complexity. RISC in essence is becoming CRISC.....hang on :-P
ARM chips, which are found in the majority of mobile devices today, are RISC architecture. We wouldn't have iPhones, Android devices, or single-board computers today without the work being shown here.
we still would have all this, but they would have some different form. the world doesn't wait for one specific person or thing. it moves on.
Almost: the ARM architecture was actually created by Acorn Computers in 1984, and Acorn's baby (ARM Holdings), still based in Cambridge, England, still licenses the architecture.
+80386 - yeah, and that form would likely be larger, run hotter and have worse battery life. power-efficient, small, and relatively cool RISC based embedded CPUs enabled the explosion of mobile, internet-connected devices, they were critical.
Funny: basically RISC lost for years, and now one can say it's kicking Intel's butt! Intel lost the battle in the mobile market.
RISC lost and became CISC with thousands of instructions; RISC is only RISC in name. Mobiles use it because it's cheaper, but RISC-V is coming, and now everybody, even Intel, is shitting themselves because it's free.
That guy George Morrow nails it, and very politely debunks the lady saying RISC is nowhere near as fast. We are about to reach the silicon limit, and the only way to keep up the speed increases is either to add more cores or to switch architecture/instruction set. I'm very excited to have a RISC CPU in my desktop PC in the near future.
Funny to think about it that way, but it might be Apple that makes it happen. They’re certainly brazen enough to switch architectures for Mac (they’ve done it before), and could get the ball rolling for porting PC software over to ARM.
I think if Microsoft had stuck it out with Windows RT that would’ve helped too, but they threw in the towel way too early.
I've been hearing we're reaching the silicon limit for nearly two decades.
Well, we're also nearly 40 years in the future now.
The problems have shifted a bit.
Actually, there was an episode around this very time frame where the guests mentioned silicon limits and said light-based computers would be the future. So they have been talking about this for 40 years.
@@nickwallette6201 Well, it happened: Apple, with all its flaws, released the most potent desktop processor, based on RISC. I think everybody sticking to x86-64 is losing precious time, especially the game developers.
One year later the British company Acorn would manufacture their own RISC chips and RISC based computers, the Acorn Archimedes series, setting the ball rolling properly for RISC architecture into the world. To this day, ARM (which Acorn morphed into) makes the vast majority of the RISC CPUs inside mobile phones and other low powered devices.
Just a clarification: ARM doesn't make any chips. They license out the intellectual property rights and the licensee takes it from there and has great liberty in doing what they want to fit their products. Each chip sold has a percentage due as a royalty fee back to ARM. This business model differs greatly from Intel/AMD and may have contributed to the strong growth of ARM based processors.
@@oldtwinsna8347 That's true. It's ARM's designs that Qualcomm, MediaTek, Apple etc. license and manufacture. Thanks for clarifying.
George Morrow's last comment was absolutely dead on.
"Eventually we're going to get down to point where silcion's important, and there RISC machines win on silicon better than the CISC."
We're down to NM processes now. Silicon is more important now than ever. RISC is showing that it's able to operate more effeciently than CISC at the smaller processes, with less silicon needed to do so successfully.
I was born in 1981. The growth of computers felt like the wild west during my formative years.
Tonight, I'm watching this episode of old school TV on my OLED smartphone.
Well,
the "RISC risk" has certainly paid off!
There are far more computers out there today using RISC than CISC, and as more people use mobile devices than traditional computers, CISC will become even less mainstream.
This show should've kept going. It's like the computer version of Motorweek
Specialized chips are used now very often, in televisions, streaming devices, even cables.
While RISC is huge today, it was fascinating to see just how accurate Jan Lewis's evaluation / prediction of RISC was for 1986 given the realities of the time.
RISC was bigger back then
than today, 'cause computers today have the power to brute-force their way through a non-RISC-like Windows OS, while Linux is RISC-based today and speeds through things faster than Windows, even with its Proton emulation layer to run Windows software 🤣
@@raven4k998 Not really. CISC was big first and existed long before RISC. RISC surged later on.
14:35 The ability to move a wire frame graphic around is insignificant next to that of the power of the 1983 star wars arcade machine.
21:34 Having lots of on-chip registers was the single biggest thing, in my view. But that was already enough to create a big headache for code compatibility.
Isn't Gary Kildall the guy who created the original version of DOS?
What he made, CP/M, is the origin of all DOS-based OSes.
MS-DOS is based on 86-DOS/QDOS, which was a crude copy of CP/M.
What strikes me the most about this video is that even though we are living in a far more technologically advanced society, you will almost never hear talk so filled with technical terms on a mainstream media broadcast today.
Even though computer technology has massively improved, the average user's technical knowledge level has plummeted in parallel.
Gary Kildall was the God of modern day computing. RIP Gary.
I would have to agree with that assessment. He deserves a lot more credit than what's given to him from the general public. Truly a genius.
The Gods wrote Unix. Ken Thompson, Dennis Ritchie, Brian Kernighan, Douglas McIlroy, and Joe Ossanna at Bell Labs
RISC architecture is going to change everything
RISC processors are used today for smartphones, tablets, and controllers for various devices. RISC processors might be used for servers in the future.
And workstations - Apple is already switching. You can already use it for Linux (Raspberry Pi etc.). Only a matter of time until Windows is on offer too.
@@lucius1976 RISC processors are more efficient, and they would be ideal for servers and data centers. They aren't ideal for scientists and engineers who use modeling applications; CISC-based processors are much better at crunching floating-point numbers.
@@Ace1000ks19751982 A complex deeply-pipelined CPU has no problem finding useful work to keep busy in a data center environment when it's under constant load. That changes the equation.
@@frother That is why multi-core Intel Xeon CPUs are used in data centers.
Man, hearing them talk about risc and AI is relevant even today! Crazy!
That last bit of news regarding Molecular Computers was fascinating. Wonder what ever happened to that research. Here we are 30 years later, and no one has even heard of progress in molecular computing.
Presently, research centres around the world are investing huge amounts of money into molecular computing research, particularly the Chinese. Many papers have been published over the last ten years.
I worked at Motorola in the late 90s in the PowerPC division and there was a paper about future technologies that showed a hybrid CPU made of silicon and mice nerve cells. I’ll never forget the accompanying image.
I think part of it was realizing there already is a computing environment within biological systems. Gene editing with CRISPR/Cas9 or now mRNA-based immunotherapies could potentially achieve some of what was envisioned before getting to more complex nanomachines, which are also being worked on with MEMS.
RISC: you could not live with your own failure. Where did that bring you? Back to me.
21:44 Which is what happened: in the latter 80s, the Unix workstation makers moved _en masse_ to RISC architectures, and they found quite a profitable market for about another decade in the scientific/technical/CG area ... until Windows NT came along and wiped them out.
That's only a superficial view of things... NT is only an OS. It supported x86 designs that played a catch-up game with RISC on SPEED (MHz), in fact the only game they could win, as x86 was actually an inferior architecture by any measure. Since x86 was widely spread, there was no problem for Intel to increase clock speed and add a few improvements here and there. Actually, today's x86 internal design looks more like RISC than the original x86 it came from. For RISC it was a totally different story: a narrow, specialized market and ultra-expensive design for each next-generation chip die. They couldn't keep up with that release rate, and eventually every RISC player gave up in the end... except ARM, but that's another story.
Gary Kildall, a genius...
RIP, he seemed like a very good guy always smiling.
@@MalamIbnMalam I wish we had more answers to the cause of death. One simply doesn't hit their head and die in a biker bar.
@@tomservo5007 Search YouTube for Triangulation 114. I believe if you go to minute 40 and watch from there, he'll mention that he got into a fight and died. So I take it he got killed. He definitely mentions it; I'm just not sure of the timestamp.
George Morrow's negative comment about "Keep making new instruction sets. And that's like inventing a new typeface every time you want to say something." In reality they become a new typeface that uses far fewer words to say what you want. Also, regarding figuring out how to run anyone's computer binary without recompiling: IBM did that with something called Object Observability, which included the source language alongside the compiled language, so that when the program was moved to an updated processor, the object code was saved as modified.
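To illustrate the ship-the-source-alongside-the-binary idea that comment describes, here's a minimal Python model. The structure and names are hypothetical, not IBM's actual mechanism; the point is just that a portable intermediate form travels with the program and gets re-translated when it lands on a different processor:

```python
# Toy model of re-translation without recompiling (hypothetical format).
# The portable intermediate form ships with the program; native code is
# regenerated, then cached, whenever the host ISA differs from the cache.

program = {
    "ir": ["LOAD a", "ADD b", "STORE c"],  # portable intermediate form
    "native": None,                        # cached native translation
    "native_isa": None,                    # ISA the cache was built for
}

def run(prog, host_isa):
    if prog["native_isa"] != host_isa:     # moved to a new processor?
        prog["native"] = [f"{host_isa}:{op}" for op in prog["ir"]]
        prog["native_isa"] = host_isa      # object code "saved as modified"
    return prog["native"]

print(run(program, "cisc_v1"))  # translated on first run
print(run(program, "risc_v2"))  # silently re-translated after a CPU swap
```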
Gary: Now Frank just for a reference point, what is the cost of this machine?
Frank:This machine as it sits here on the table is about USD 10,000.
Gary: Oh, but .. can it run Crysis?
I've watched this one but I am doing it again
I am commenting from the future: You were right, RISC architecture was and still is your future. Thanks guys!
The RISC was worth the REWARD.
Gary Kildall was one of the best men of the era for computers, back when it was fun to work on computers and interesting to learn. Nowadays it's just internet surfing, with Google and Apple taking over along with Microsoft. IBM had good computers in the past but didn't spend the money to keep up with the general public's needs and what they liked.
I think modern computing has gotten boring and there's no fun in it anymore.
2:45 The RT/PC was such an abysmal performer, the joke was that “RT” stood for “Reduced Technology”*. Just a few years later, though, IBM came out with its POWER architecture, and then the world really stood up and took notice.
*Echoed also in Microsoft’s more recent cut-down “Windows RT” OS.
Win RT certainly was ... reduced.
As I watch, in my head I am offering advice and opinions, not fully realizing I have the benefit of seeing decades into the future.
But I feel smart until I realize this.
If only the Alpha had survived. Intel wouldn't have that of course.
Eventually, the RISC paid off.
Is that supposed to be insightful or funny? Because it's neither…
@@BlownMacTruck Ask arm about that. Do you have an Apple or Android phone?
@@BlownMacTruck As a matter of fact, did you have anything insightful or funny to say? Because I'd love to hear it.
@@MickeyD2012 Yeah. It's not insightful - RISC was never "risky" in terms of marketshare or mindshare, nor was its influence downplayed (witness all the stuff x86 pulled in from RISC concepts) - with that in mind, your pun isn't even remotely witty. Your reply is even more ridiculous because you think I'm disagreeing with that.
Even a modern x86 chip from Intel or AMD maps the external CISC instruction set onto an internal RISC core. So, RISC won indeed.
The 80s was a true magical and golden time for computers. Today computers are nothing more than a toaster... It was '84 before I ever got my hands on an IBM PC. In Orange Park, FL there was an IBM retail store, and they would let me play with any computer and even open software for me to use. It would be '87 before I got my first computer, an IBM PCjr; in '88 I got a PC clone with CGA.
That time when making a new Instruction Set was as painful as making a new Typeface!
Watching here 2022
Gary Kildall was left handed!? No wonder CP/M has such a sinister feel about it.
Yea... memories. Hope he's still around, yo. I grew up with a Compaq Portable III with the orange plasma display, lol. Great DOS. The C64 was cool. Remember those badass Orchid graphics cards and the ISA Tseng Labs-era cards, hahah.
+TechBaron, Cameras and more! Stewart Cheifet is still around, last I heard. Gary Kildall died in 1994 after an incident in a biker bar. There was actually an episode of Computer Chronicles in 1994 in tribute to the man's life and his impact on computers, along with the rise and fall of his company (Digital Research).
It's part of why I love the fact that these shows are online. I never used CP/M and was unfamiliar with DRI, since it was before my time, but I've learned so much about the past, and about Mr. Kildall's contributions, here on YT.
I was watching a YouTube video recently where Bryan Lunduke was interviewing Stewart Cheifet (from April of 2017, iirc), who was amazed that a new generation of people were watching his show, now that the episodes are being uploaded to YouTube and the Internet Archive, and as such he wanted to cooperate to make as many episodes available to us all as possible (on the Internet Archive, anyhow, for non-commercial use). Truly a stand-up guy, as was Gary Kildall before his unfortunate passing - without CP/M, we wouldn't be where we are now, and he was also a big part of bankrolling the show to begin with, so we owe them both a debt of gratitude. While I don't recall seeing Computer Chronicles on any of the PBS stations we got on Canadian cable TV back when it was still in production, I've watched many episodes online, and still can't get enough of it.
P.S. If you're interested in the video, do a YouTube search for ""The Computer Chronicles w/Stewart Cheifet" - Lunduke Hour - Apr 5, 2017", you won't be disappointed :D
For context, all modern x86 processors are RISC at the core. The instruction set's added complexity is kept for backwards compatibility and to make programming easier; all those "unneeded" instructions are useful for reducing complexity in the code. At one end you can have simplicity in the CPU and complexity in the code, and at the other end complexity in the CPU and simplicity in the code. Intel was already on the CISC path and chose to integrate RISC into the x86, with the CPU handling the translation for you. RISC (and I'm going to specify ARM) started simple but has been adding features ever since. Between the two of them it's almost a race to the middle :-)
That's really cool to know, I no longer have to wonder how x86 managed to stay around so long, and become so much more efficient and powerful!
False. The core isn't risc.
@@DorperSystems Perhaps you would like to elaborate.
@@daishi5571 The only RISC-like x86 uarch was the P6. Microcoded architectures have always split complex ops into smaller ops. There is a very interesting article that explains what I mean, called "The legend of 'x86 CPUs decode instructions into RISC form internally'", which you should read.
10,000 bucks, 70 meg hard drive and 4 megs of RAM. Crazy.
8 years later (1994) I got my first PC, a 486 with 4 MB RAM and 200 MB HD for $1500 Canadian or so - a fair bit cheaper :)
@@CanuckGod In 1995, at the age of 10, my dad got our first family PC, a Gateway 2000 P5-75 (original Pentium 75mhz) with (I don't remember how much RAM) but the hard drive was 1 GB and it had a 3 tray CD-ROM drive, a 3.5" floppy drive, 17" monitor, Soundblaster, and an old school Epson Stylus Color Printer all for $2400 USD.
And 10k in 1986 would be almost 30k in today's money
Back then, going up 33 Mhz was exciting. A whole new computer, etc.
For 4mb of ram!
25:05 That never happened. Aren’t you glad?
@23:14 “But we’ll get down to the point where silicon is important…” and here we are in 2021 at 5 nanometer.
Ahh, that glorious Pan Am 747... HEY!! BE NICE TO THE DC-3!!
The humble ARM-based CPU with a vector extension (the Scalable Vector Extension) now runs the fastest supercomputer in the world (Fujitsu's Fugaku) as of November 2020.
That's RISC!
Why is an ARM CPU humble? It's literally the most used instruction set in the world and comes from a huge history of RISC research and previous designs, some of which were (and are) extremely popular.
@@BlownMacTruck because it's not hyped up like Intel or AMD. And I know ARM outnumbered x86 by 10 to 1 since its everywhere: cell phones, microwaves, watches, servers, game console, engine control unit, radios, tvs, you name it. Yet not a lot of people knows the company ARM. that's why i call it very humble yet powerful
@@Lithiumbattery Uh, they’re literally the most powerful IP in the processor industry. They don’t need hype.
You may as well say Oracle or EMC are humble because they don’t need to hype their products even though they’re giant gorillas in their space. They don’t need to hype anything because they’re so big. They make ruthless design decisions with enormous repercussions. That’s like the very opposite of humble.
@@BlownMacTruck Where can I read about this in depth? I really like CPU design, and I like the efficiency of RISC. I'd like any and all literature I could download; I have searched the Interwebs but there's not much there.
When did motherboards get split into the northbridge/southbridge bus architecture?
More or less in the Pentium processor era. Before that you had a plethora of add-on cards to get printer ports, serial ports, floppy drive ports, and hard disk controllers into a computer; they were not on the motherboard. When Pentium motherboards started to come out, those functions started to get integrated onto the motherboard itself, and to save costs the slower I/O features were consolidated into a single chip (the southbridge) instead of several different controller chips on the board. Handling memory also became harder due to new timing constraints (this is the era when you started seeing different methods to speed up memory, like EDO RAM), so a more complex memory controller was required. That ended up in the northbridge, a separate chip because the speed/bandwidth requirements for memory access are much higher than for the other kinds of I/O the southbridge handled.
No one wanted to say how fast these RISC machines are vs. a 16 MHz 386. It would be nice to hear performance figures, ideally alongside clock speeds so you could assess IPC.
From 1986, but now we see modern phones and computers using RISC processors.
RISC is finally viable, or has been for a decade
Gary Kildall is Tech History that deserves to be remembered.
youtube.com/watch?v=Rqm2R9Xmrh0
About a 20-year lag to this discussion
I would love to go back in time and explain to these people about AI
I hope there was no floppy copying with that 10 grand machine.
To think we went from this level of journalism in the computer world to modern-day grifters like Linus Tech Tips...
It's crazy how x86 (basically) was already bloated by 1986 standards.
Yet here we are nearly 40 years later and are still complaining about it.
Yet we still get the Steam Deck with x86_64.
Lessons learned: "If something is broken you either fix it now, or you're most likely never going to fix it".
Remarkably prescient episode. Practically all of the points made for and against RISC have been borne out since then. The 90s saw an explosion of RISC-based chips from IBM, DEC, HP, ARM, and even Intel, all of them featuring long pipelines and large register sets. The point about the efficient use of silicon really is what eventually sold RISC as the technology that would win out over CISC.
Gotta love KITT making a showing!
11:04 Did Stu, Gary and the guests buy a four-pack of suits, shirts and ties and divvy them out? :)
When taping this, they had a sale at the Dollar Store across the street I believe. Who can resist? 😂
“A traditional machine”.
A look into the future, from the past.
RISC (ARM & MIPS) is really dead now - only in a few million smartphones ;-)
More like 2 billion.
Yeah, ARM processors are actually the most common. Your typical PC has like 15 of them in it, doing everything from decoding keyboard button presses to managing the internals of the SSD.
Modern x86 processors (Atom excluded) translate x86 instructions to an internal RISC execution engine. Meanwhile, the ARM ISA has expanded quite a lot in more recent years, particularly with floating point extensions.
@@ElectricEvan Rightly said... I've learned that hard drives use ARM's Cortex-R (real-time) family of cores.
MIPS is being brought back for phones, in 32-bit and 64-bit versions.
26:19 The good ol' days, when you could buy software developed in Hackensack, New Jersey.
That software looks like an early version of Google Keep.
I love watching these shows and feeling nostalgic, but I find this episode annoying because at this stage a 16-bit RISC architecture had already been available for 11 years. The Texas Instruments TMS9900 series has been recognised as a RISC design, and the architecture was years ahead of its time. The TI-99/4A was my first computer.
19:18 And then NexGen (I think it was NexGen rather than Cyrix, with the Nx586) cleverly converted the x86 front end into a translator which breaks CISC instructions down into small RISC-like "micro-ops". Intel and AMD quickly jumped on that bandwagon.
All CISC machines already used microcode for their complex instructions. But the execution units' pipelines were deepened to allow higher clock frequencies, branch prediction and speculative execution were introduced, etc, etc.
@@patrikfloding7985 You'd think that micro-ops are the same as microcode, but the implementation is radically different.
This video smells like an old library...
Molecular computers fighting tumors? Hearing that's like reading an Asimov book.
This is very cool
*Zero Cool has entered the chat*
2020 - Victory goes to RISC
So, um, when am I going to be able to get those molecular computers to repair my brain!?
20:05 looks like that Jeff Dunham puppet
13:23 Wow... $10K
I just ordered my new iMac. It holds 64 GB of RAM and has a 2 TB SSD.
Now I wonder how much this iMac would cost back in 1986! I bet it would be in the millions!
It simply did not exist.
Wow! The tech of today is better than the tech of almost forty years ago, at a cheaper price! That's amazing!!
How much for the new iMac?
RIP, Gary.
It's almost like they're talking about CPU vs. GPGPU.
2020 now, the God of 1010.
wat
It’s a shame people were so addicted to Intel compatibility.
@Невада большевик And that whole zeal for “compatibility” is keeping the PC side stuck on the overheating Intel x64 architecture. It was time to move to RISC 10+ years ago. Even AMD is going to jump eventually.
@Невада большевик And even AMD realizes that RISC is more of the future than x64. Minor technical point aside.
@Невада большевик I have and I’m aware AMD essentially built x64. I’m not blasting AMD here at all. In fact, Intel would be more screwed if not for AMD. Doesn’t mean RISC of some form, whether it is ARM or RISC-V, isn’t a better path forward.
@Невада большевик Also YouTube is janky as hell right now and it randomly deletes my comments as well, join the club. 😂😂😂
How else are you gonna run SkiFree and Reversi on modern hardware!? You want those classics running on bare metal, not emulation, for the real experience.
thank you youtube algorithm
Whatever happened to MIPS these days?
So much foretold here
62,000 transistors...
ARM processors and Apple M1 are RISC CPUs