I'll never forget my first experience with computers was when my father bought a Tandy 1000 with an 8088. The following Christmas I got a Tandy Color Computer with the Z80. In Middle School I was bringing in homework that I printed out on the daisy wheel printer. The 80s were great :)
Didn't Intel have a Windows 7&8 compatible Broadwell CPU with 10 cores, the i7 6950x? I'd imagine that CPU would beat the Ryzen 2700x on average, though USB 3 is a bit problematic. It's easier for Windows 7 to be on an i7 6950x, and Windows 8.1 on an AM4 CPU with the B450 chipset. 8 supports USB 3 while 7 doesn't and last time I checked, most AM4 boards didn't have a lot of USB 2, only as an extra connector from the inside, which is what I'm gonna have to utilize to get 7 working on this Ryzen 2600E.
29:31 The ironic thing about people avoiding Cyrix for FPU is that I had the pictured chip (6x86-P150) and it lasted me long enough to get replaced by a 512k Athlon XP 2600+ clocked over 10x as fast. Even ran XP on that machine, albeit with significant RAM and storage upgrades. That included gaming, that Cyrix CPU was even able to manage Sims 1.
It was only stuff that leaned heavily on floating point that struggled; plenty of games made little to no use of floating point and were just fine. It's just thanks to Quake's approach to using floating point that Cyrix's reputation took such a kicking (and lots of people wanting to play Quake).
@@MrZics Oddly, at one point Intel did make ARM chips: they bought the part of DEC responsible for the StrongARM. They produced them still under the StrongARM name, then under the name XScale.
@@RetroBytesUK Indeed, I had a "Pocket PC" Windows phone with an xscale chip. It was...OK. What made them unbearable was cramming a Win9x era GUI into a small screen, requiring a mini stylus to do much of anything. Most of them still had a physical slide-out keyboard, a feature I still kinda miss even to this day. Alas, the iPhone's touch UI effectively eclipsed everything in the handheld space that came before.
You kinda brushed over that Cyrix was purchased by VIA and became VIA Cyrix. I have a couple of them, a VIA Cyrix III 500MHz and 600MHz in my collection, along with a Cyrix MII 66MHz.
I work for a government organisation in Ireland, and let's just say it's one you'd want to have decent kit. I'm not in IT but my presence here indicates I have an interest. I got asked by a couple of people why their computers were "a bit slow" and ended up spending a significant amount of time replacing the Sandybridge Pentiums with 4GB of RAM in late 2024!
Minor correction: The 286 never had a paged MMU. That was introduced with the 386. The 286 had segments, just like the 8086, still limited to 64k each, but with a 24-bit physical address space (16 meg max) as opposed to the 8086's 20-bit (1 meg max). The fact that the later 386 was a true 32bit CPU with a paged MMU gave it the - then common - nickname of "VAX on a Chip".
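If it helps to see the arithmetic being contrasted, here's a minimal sketch (my own illustration, simplified, no limit or privilege checks) of real-mode segment:offset addressing versus the 286's protected-mode descriptor base:

```c
#include <stdint.h>
#include <stdio.h>

/* Real mode (8086 and 286 alike): physical = segment * 16 + offset.
   Offsets are 16-bit, so a segment still spans at most 64 KB. */
static uint32_t real_mode_addr(uint16_t seg, uint16_t off)
{
    return ((uint32_t)seg << 4) + off;      /* tops out around 1 MB */
}

/* 286 protected mode: the segment register is a selector into a descriptor
   table; the descriptor carries a 24-bit base, giving 16 MB of physical
   space, but offsets are still 16-bit, so segments remain <= 64 KB. */
static uint32_t pm286_addr(uint32_t base24, uint16_t off)
{
    return (base24 & 0xFFFFFFu) + off;
}

int main(void)
{
    printf("real mode F000:FFFF       -> %06X\n",
           (unsigned)real_mode_addr(0xF000, 0xFFFF));
    printf("286 pmode base E00000+10  -> %06X\n",
           (unsigned)pm286_addr(0xE00000, 0x10));
    return 0;
}
```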
I hope it goes down in the history books what a massive turn AMD has had in the Ryzen era, going from "don't buy their products" to scrappy underdog, to peer level to now being undisputed best processors for most (consumer) use cases, while EPYC has been massive in the enterprise space as well; even if the marketshare hasn't caught up to adopting the new market landscape yet.
amd's chiplet approach really helped them out. on zen 1 and zen 1+ (ryzen 1000 and 2000) desktop had a single die, workstations and servers had two or four active die that were nearly identical to the desktop counterparts providing a really easy method of scaling up to 32 cores. on zen2 with the io die + x chiplet count amd just went hog wild. i wouldn't be surprised if we get 512 core epyc cpus in the somewhat near future. also, zen4c is really cool. by limiting the voltage and clock speeds amd could shove 16 cores into a single ccx die and not have it melt under a full all core load. kinda want to see a budget ryzen cpu with those cores for the budget sff workstation market
That took me back: from my 486 DX2 to my Pentium Overdrive to my Pentium MMX, 2, 3 and 4, before moving to Athlon, the FX Black Bulldozer central heater, and then returning to 8th, 10th and 11th gen i7s before going Ryzen 9 4th gen last year.
I remember building my first PC (I owned one prior, a Pentium 2). This time though, I was going to put it together myself, around the year 2000/2001? I just remember reading all the reviews in magazines and getting my Athlon XP 2200+ (which actually ran at 1.8 GHz, but AMD marketed it as equivalent to a 2.2 GHz Pentium). I put it on the MSI motherboard; it was a great chip, and to this day it still turns on. It just aged out of the game and never broke down. I still have it collecting dust, sadly - that one and my father's Athlon 2400+.
12:30 That seems to be an odd thing to say, given that Intel did not provide AMD the designs of the 386, in breach of their contract. AMD had to reverse-engineer the processor to make their own versions, and wasn't able to sell any until 1991, after a legal battle that Intel lost.
Seymour Cray ..... If you say it out aloud it could be the name of a famous fish spotter or fisherman. Thankyou thankyou i'm here all week! 🤣 I'll get me coat ...
This is a very instructive tale of how a smaller, engineer-driven business can outperform a bigger, commercially driven one kept alive only by the ignorance of customers.
When I worked at Intel, on most of the phones speed dial #1 was that department's boss and #2 was AMD's equivalent department. We talked constantly, and the joke when ending the call was "we hate you", "we hate you too", instead of goodbye, often said in a silly voice.
When was this?
That's _awesome!!_
What dates are we talking about?
Still an issue. I doubt we'll have true competition between these two for a long time. There were some lectures about CPU fuzzing which revealed many opcodes documented nowhere, surprisingly identical on both competing products. They're playing co-op, I guess.
@@icywiener5421 Could also be the ridiculous amount of backwards compatibility baggage x86 is carrying around to this day.
Ah, yes, 1969. A "relative newcomer" compared to Intel's 1968. 😂
That is technically correct; the best kind of correct 😁
Still wet behind the ears lol
@@louis_makesYes, yes, they didn't start competing with Intel using their own chips until later, but still!
Who would be that commenter?
Relative lol such a relative term.. I wonder what they thought by that though..? Whatever this video was.
Itanium was an Explicitly Parallel Instruction Computing processor? An EPIC processor?
Only now do I realize what a gigantic shot at Intel it was to name the Ryzen server line EPYC. That is just pure unadulterated Schadenfreude
Edit: You did bring it up at the end. I was having quite the laugh tho when the coin dropped in my mind
Risen, Ryzen... almost the same: risen from the ashes.
That was news to me too. Now I get why so many techies found the EPYC name worth highlighting when it was announced.
Clean room reverse engineering doesn't mean they didn't look at it. It only means the team writing the code never looked at it. One team reverse engineers it and then documents it, the other implements it.
They were handing paper notes as their medium of communication, apparently.
Yup
One team looks at the code, writes out a specification in a big book.
Then the lawyers very carefully take it to another team that have never looked at the original code to write something that does this.
@@ArifGhostwriter
Excellent trip down memory lane - thank you - that was Epyc!
"intel had made themselves effectively 'a waffle iron that could do maths '' "... Classic! I enjoy this channel for the sense of humour, but as an engineer the depth of technical information. Excellent job!
It's lovely British humour!
Nvidia is eating Intel's lunch in waffle iron production. The sizes 4090 cards come in, you could probably cook 3 waffles with one card. Also, I remember a tech outlet (can't remember which one, gonna say it was the Finnish Muropaketti) cooking an actual egg with an uncovered Athlon chip (they made a tinfoil cup for it!).
Glad to see that I'm not the only one to joke about iAPX 432 being so complex that VAX is like a RISC chip by comparison.
"Basically required the compiler to be sentient " LOL
Sony's CELL architecture had this exact same issue.
Just about to watch and already know from this that Itanium is mentioned :)
It's easy to laugh at afterwards, but compilers are already extremely smart, and when first introduced, people doubted they would ever be as efficient as hand-written assembler, so it's not completely silly to put a lot of the code analysis in software instead of hardware.
@@klafbang That's a good point. I'm not an expert, but I was studying computer science in the late 90s and was paying attention to where R&D was going. Just a few years earlier many people thought RISC was going to take over everything (and it had in minicomputers with Alpha, Sparc, MIPS, etc.) Intel was mildly successful moving x86 up market with Pentium-Pro, and at the same time their ICC compiler was the best at generating fast x86 code, beating out Microsoft, Borland, gcc, etc. Pentium-Pro being a pipelined and superscalar design, they knew to invest in compiler technology.
From that perspective, Intel made a bet that -
1. Next generation processor designs would need simplified instruction sets.
2. To maximize performance a RISC processor needs to maximize pipelining and speculative execution.
3. It would not make sense to make the execution core simplified just to make the instruction decoding and ordering more complicated in silicon.
4. We already have a world class compiler team.
5. We need to make a new architecture that Intel doesn't have to share.
With that backdrop, Itanium didn't seem that far fetched.
Amusingly enough, Transmeta's design for an x86 competitor was innovative not just for Dynamic Voltage & Frequency Scaling, but it was also a VLIW RISC design, and it had a patented code-morphing compiler built in to the firmware. Their tech, as well as AMD's micro-op decoding in Athlon and Opteron, proved that you COULD keep x86 and just implement a RISC-ified processor on the back end.
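To make the "sentient compiler" problem a bit more concrete, here's a tiny illustration (my own sketch, nothing from the video): with EPIC/VLIW the compiler has to find and pack the parallelism itself at build time, which is easy for independent operations and essentially impossible for dependent chains hanging off loads with unknowable latency, the thing an out-of-order x86 core sorts out at run time.

```c
/* Sketch of why EPIC/VLIW leaned so hard on the compiler (illustration only).
   The hardware executes whatever the compiler packs into a bundle; there is
   no out-of-order logic finding parallelism at run time. */

int easy_case(int b, int c, int e, int f, int h, int i)
{
    /* Three independent operations: a VLIW compiler can schedule these
       side by side and keep several execution units busy in one bundle. */
    int a = b + c;
    int d = e * f;
    int g = h - i;
    return a + d + g;
}

int hard_case(const int *p)
{
    /* A dependent chain hanging off a load whose latency the compiler cannot
       know (cache hit or miss?). It has to schedule for a guess; an
       out-of-order core simply resolves it at run time, which is a big part
       of why the "sufficiently smart compiler" never fully materialised. */
    int x = *p;        /* latency unknown at compile time */
    int y = x + 1;     /* cannot start until the load completes */
    return y * 2;      /* cannot start until y is ready */
}
```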
@@Toksyuryel It does indeed, and is exactly why so little code can take proper advantage of it.
One of the big issues that hobbled early Pentium IV adoption was that it required RAMBUS memory, which was incredibly, outrageously expensive at the time.
Wasn't RAMBUS a patent troll that walked out of a JEDEC meeting, claimed all the ideas as theirs, patented all of the stuff that was proposed as an open standard, and went to court against pretty much everybody from that meeting?
I was waiting for him to talk about how RDRAM was ABSURDLY expensive and you had to run the sticks in pairs and also fit dummy modules in the unoccupied RAM slots.
@@joeconti2396 Intel was really banking on that being the future. It didn’t make much sense, I can’t remember any other architecture using it…Apple, Sun, DEC, etc all seemed to avoid it. I don’t even know if Itanium supported it.
You may wonder why Intel didn't use RDRAM with Itanium if they were really betting on both to be a success......I think it's because Intel took so long bringing Itanium to market that they had already killed RDRAM on the P4 by that time.
@@mellusk9194 I still cringe when I see someone selling a 1.6 or so Ghz Pentium 4 system with the "no lowballers I know what I have" mentality only to never realize they are the least desirable because of RAMBUS. God it's awful.
38:00
The AMD Duron was the budget version of the Athlon, not the K6-III
Came to say the same. When Intel had Slot 1 with the PIII, AMD came out with the Slot A Athlon. And the number one reason that Slot CPUs existed was because the L2 cache was relatively large and therefore off die(Probably to reduce costs/increase yields) and they therefore needed a larger PCB for that cache. We only went back to Socket PIIIs and Athlons when they were able to integrate the relatively large L2 cache on the same die as the CPU. When this happened, AMD released the Duron, a cache reduced Athlon.
Indeed. And it was a stellar overclocker. I remember running a Duron 600 at 900+ MHz for years. Same was true for the P3 Celerons of the time, even though they still sucked even when overclocked to the moon.
Thanks. I kept wondering what a Joron was. 😂😂😂
@@prman9984 It's a half decent whisky
@@williamcroson9102 The cache access was a good PR line to spin, and was partially true, but the main reason for Intel was that it messed with motherboard supplies for AMD. With Intel there is always a cold, hard business reason for all their actions; that's why Intel is Intel.
ERRATA:
- Duron was not budget version of K6-III but Athlon
- AVX512 was introduced in Rocket Lake. Alder Lake could use these instructions if E-cores were disabled but Intel blocked that feature with uCode updates
There used to be an mp3 encoder written in ASM called gogo / petite that was 10 times faster than every other encoder, and it used 3DNow!, so it was twice as fast on an AMD K6.
3DNow! was a great idea for its time, it's a shame it never got the software support.
@@RetroBytesUK yeah sse2 is the only one that stuck around long enough to target. Now chips are so darned fast you don't need to target crazy instruction sets.
Actually, C# targets all that stuff in its just in time compiler for you. Every new version gets orders of magnitude faster without you even doing anything.
@@prman9984 This was 20 years ago on Linux/Unix. 3DNow had just come out and there was pretty much nothing else that used those extensions yet. It was a bunch of guys that took LAME and rewrote the encoding parts by hand in NASM. I don't know much about C# though, because I only used windows 95 for a bit back in the 90s, and only on weekends to play LAN games with my friends.
@@prman9984 True of all languages using modern compiler backends like GCC and LLVM. When you target a CPU with those instruction sets, code generation will make use of those instructions. However, a human who knows what they’re doing can do a much better job in assembly or with vector types and intrinsics in C, C++, C#, Rust and many others.
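To make that concrete, here's a minimal sketch (my own example, assuming SSE is available, which any x86-64 compiler guarantees) of the sort of loop a compiler will auto-vectorise when it is allowed to target those instruction sets, written out explicitly with C intrinsics:

```c
#include <stddef.h>
#include <xmmintrin.h>   /* SSE intrinsics */

/* Sum an array of floats four lanes at a time. A compiler targeting SSE/AVX
   will often generate something like this from a plain scalar loop, but
   explicit intrinsics let a human control exactly which instructions run
   and in what order. */
float sum_sse(const float *data, size_t n)
{
    __m128 acc = _mm_setzero_ps();
    size_t i = 0;

    for (; i + 4 <= n; i += 4)
        acc = _mm_add_ps(acc, _mm_loadu_ps(data + i));  /* 4 adds per instruction */

    float lanes[4];
    _mm_storeu_ps(lanes, acc);
    float total = lanes[0] + lanes[1] + lanes[2] + lanes[3];

    for (; i < n; ++i)      /* scalar tail for the leftover elements */
        total += data[i];
    return total;
}
```

The hand-written version mostly wins when the data layout, aliasing or reduction order is something the compiler can't prove safe to vectorise on its own.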
Man, this brings us down memory lane.
I was there for all but the earliest of releases. My first x86 PC was a 486 (family PC) … My first PC I ever purchased for myself was a Celeron.
I've had, K6's, Athlons, (I had the first Gen one that went into a slot as well ... The slot A)...
The craziest thing we had was a dual Celeron with the Abit BP6(?)... Running Linux.
Those were the days...
Sh!t I'm old ...
Your first PC was 486 and you consider yourself old? Step aside young whipper snapper.
First PC's scrapped together for myself were 486's. And the first one I bought brand new was a Cyrix 686MX-200! I loved that piece of shyte chip. Lol
First PC I did development on was a Tandy Color Computer. First IBM compatible was a x286 from a company called Leading Edge. It was an insane amount of money.
@@tripplefives1402 My first PC was actually a Vic 20 ...
This was my first Intel PC ... I should've made that clear 😁
Thanks RetroBytes on this deep shallow dive into the rivalry. I remember working on “old IBM” PCs at Uni in 88. All these company names brings back memories. I too had a set of the Intel Bunny Suit plushies that I’m sure someone’s child at work got. The old days. Now everything is virtualized and only hardware is laptops, consoles and gaming PCs.
The problem with Quake wasn't so much that the non-Pentium CPUs were so much worse at floating point, but that they couldn't run it in parallel with the integer pipeline the way the Pentium could.
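For anyone curious what that overlap looked like in practice, here's a rough sketch (illustrative C, not id's actual code, which was hand-written assembly) of the structure Michael Abrash described for Quake's perspective-correct spans: one FP divide per 16-pixel span, issued before the current span's integer-only inner loop, so the Pentium's FPU grinds through it in the background while a 486 or Cyrix 6x86 just stalls on it.

```c
#define SPAN 16   /* pixels per perspective-correct span */

/* draw_affine_span() stands in for the integer-only inner loop that steps
   texture coordinates linearly across 16 pixels. Structural sketch only:
   the point is that each divide's result is not consumed until the next
   iteration, so it can run alongside the integer work. */
void draw_scanline(float z_left, float z_step, int n_spans,
                   void (*draw_affine_span)(float w_left, float w_right))
{
    float z_right = z_left + SPAN * z_step;
    float w_left  = 1.0f / z_left;    /* one unavoidable divide stall per line */
    float w_right = 1.0f / z_right;

    for (int k = 0; k < n_spans; ++k) {
        z_right += SPAN * z_step;
        float w_next = 1.0f / z_right;       /* divide for the NEXT span...     */
        draw_affine_span(w_left, w_right);   /* ...overlapped with this span's
                                                integer-only pixel loop         */
        w_left  = w_right;
        w_right = w_next;
    }
}
```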
A new RetroBytes video? What a wonderful way to start my Saturday!
Yea
I love the history!
Very nice video. Only major thing I would have added is the inclusion of X3D as a technology, like how you did MMX and SSE. You could have mentioned when AMD finally got SMT to compete with Hyper Threading years later. But as I said, great video. The truth is, this battle has been raging for so long that you'd have to do a series of videos if you ever intended to fully chronicle everything, so for a single video effort, I'm very happy with this.
58:36 Am I dumb or Ryzen 9000 is Zen5 not Zen2 as mentioned in video?
Yep, came to the comments for that. He meant either Ryzen 3000 series, or Ryzen 9s as the R9 3900x, the first Ryzen 9, came out July 7th 2019
He meant Ryzen 9 lineup.
Agreed, I was going to make the same comment. Zen 2 is Ryzen 3000 series in 2019.
He said "9000 series" by accident.
I'm also not keen on the way he pronounces Alder Lake as "ay-da lake"
- as a fellow brit I feel like he should know how to pronounce 'Alder' a word which originated from Europe (specifically Germanic-English)
But I enjoyed the video lol, fun recap of the last 30+ years of PC processors.
And at the end, ARM starts squeezing its customers for more revenue... it's constantly evolving.
ARM is a large company and will try to maximise their profits, that's what companies do. They don't quite have the same market power as the likes of Intel, as they don't produce their own physical products, so they can't use their control of supply to manipulate their big customers in the same way. You're right though, everything is constantly evolving.
Bulldozer architecture did come good in its own way. When the games and other software learned to use the chip, it became a really good '8 core' option. I bought one when it first came out, an FX 8120, which had some problems until I paired it with a 9 series chipset. Then its parallelism improved markedly, and as the software matured, it performed increasingly well. I myself wrote code which used core affinity to make best use of the way Bulldozer/Piledriver worked, and had incredible efficiency compared to the same problems being tackled on comparable Intel machines without these enhancements built in.
By the time I upgraded to an 8350, which comfortably overclocked to 4.8 GHz, paired with a decent GPU, performance was extremely good for the price point. In upgrading cycles I sold each CPU for a decent 75% of the original purchase price, then upgraded from the 8350 to a Ryzen 5 5600, which my PC still runs. The laptop I'm writing this on is a Lenovo with an AMD Ryzen 5 5625U.
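For anyone who wants to try the same trick on Linux, the affinity part is only a few lines. This is a rough sketch with a made-up FP-heavy worker, not the commenter's actual code, and it assumes the usual Bulldozer/Piledriver enumeration where logical CPUs 0/1, 2/3, ... are the paired cores sharing one FPU per module, so FP-heavy threads get pinned to every other core:

```c
#define _GNU_SOURCE           /* for pthread_setaffinity_np */
#include <pthread.h>
#include <sched.h>
#include <stdio.h>

/* Hypothetical FP-heavy worker, standing in for real DSP/audio code. */
static void *fp_worker(void *arg)
{
    long id = (long)arg;
    volatile double x = 1.0;
    for (long i = 0; i < 100000000L; ++i)
        x = x * 1.0000001 + 0.5;
    printf("worker %ld done (%f)\n", id, x);
    return NULL;
}

int main(void)          /* build with: gcc -O2 -pthread affinity.c */
{
    enum { NWORKERS = 4 };          /* FX-8xxx: 8 cores = 4 modules */
    pthread_t t[NWORKERS];

    for (long i = 0; i < NWORKERS; ++i) {
        pthread_create(&t[i], NULL, fp_worker, (void *)i);

        /* Pin worker i to logical core 2*i: one thread per module, so no two
           FP-heavy threads end up sharing a module's single FPU. */
        cpu_set_t set;
        CPU_ZERO(&set);
        CPU_SET(2 * (int)i, &set);
        pthread_setaffinity_np(t[i], sizeof(set), &set);
    }

    for (int i = 0; i < NWORKERS; ++i)
        pthread_join(t[i], NULL);
    return 0;
}
```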
Excellent video! A wonderful trip down memory lane!
One minor quibble that I noticed. At 38:04 you mention the AMD Duron as a budget version of the K6-3. The Duron was the budget version of the Athlon and was released June 2000. The original Athlon CPUs used Slot A from June 1999 to June 2000. The Duron was released when the Athlon was moved to the Socket A format starting in June of 2000 as there was no Slot A Duron.
That was honestly a really good overview of how these companies competed with each other. I was aware of most of it but it was cool to see it all in the one place, and the dry humour? much fun.
Such a comprehensive timeline. Must have taken a lot of research to put all together, thanks for such a great video
Very nice video. One addendum: at the end of the AMD Bulldozer era, AMD not only got a deal with Sony and MS to sell them their APUs, they also remodelled the Bulldozer cores: every core got its own instruction unit. Those Puma, Puma+ and Kaveri cores are the cores which went into the Xbox and PS4, and you could buy those 4-core APUs for low-power PCs, with comparable performance to an i3, without being a waffle iron. AMD never made "big chip" Kaveris. But the Kaveris were the foundation for Zen. There are even (low performance) Kaveris for AM4. I still have an A10-7850K and A10-7870K running as low-power Linux desktops. CPU power of an i3-4330 plus an R7 250 GPU is not great, but everything in one package makes for a nice workstation.
I had a Kaveri as well that I ran 'CrossFire' with a R7 250 dGPU. The combo of it and the IGP did a pretty nice job. I remember having to seek out the 250 I found as being the one with more cores than the standard retail part, in spite of the name being identical. Good times...
Very cool video, shared with friends and put it in my favourites. My 13700K is having massive instability issues and I've tried everything to fix it, but nothing works. I am going to get my X870-I STRIX board pretty soon, and then buy my 9800X3D once my pre-order arrives (they put me in a queue and will hold onto it for me from AMD). Very excited as this is my third personal rig and my 1st AMD system; heard great things about the 5800X3D and 7800X3D, and the reviews were overwhelmingly positive for the 9800X3D at a decent price of $689.99 CAD!
I know you said it was gonna be long. I didn't look at the length of the video and when it was done I didn't realize I had been sitting here for an entire hour watching this. Great video. Long? Yes. Entertaining, informative and worth it? Hell yes.
The Pentium Pro was a fantastic chip for what it was designed as, a workstation CPU for CAD and various other tasks. It even ran dual CPU better than dual Pentium CPUs, and it was the CPU that the Pentium II was based on; in fact, the Pentium II is technically a Pentium Pro with a die shrink, MMX instructions and a re-designed cache system.
You got the K6-III a bit wrong. The K6-III was essentially a K6-II, but instead of relying on the motherboard's slow Level 2 cache, AMD added 256K of Level 2 cache on the CPU die, which relegated the motherboard's L2 cache to a Level 3 victim cache.
As for the Duron, this didn't come out until AFTER the Athlon processor. In fact it didn't come out until the 2nd iteration of the Athlon, the one that was Socket A (Socket 462). 2nd-gen Athlon (using the Thunderbird core) had 256K of L2 cache on die; the Duron used the same Thunderbird core, but the L2 cache was reduced to 64K.
Itanium, what a joke of an architecture that was. But to be honest, the time Intel shot themselves in the foot big time was the i820 platform, RDRAM, and the multi-year licencing agreement they entered into with Rambus. This agreement crippled their ability to produce and support the Pentium 4 processors with affordable and performant memory, and it kept Intel from using DDR SDRAM for many years.
At the time rambus and ddr were equally expensive.
@@tripplefives1402 But DDR SDRAM offered more performance per dollar than Rambus. DDR was an open standard from JEDEC, whereas RDRAM was a closed standard from Rambus. DDR memory dropped in price as soon as it was widely adopted; to make RDRAM, memory manufacturers had to pay royalties and licencing fees to Rambus, which kept the cost of RDRAM high. Along with its poor performance, that led RDRAM to become a failed memory technology.
@@camjohnson2004 this ^^^ you didn't have to pay licensing fees to the RAMBUS Consortium to use DDR so it was always going to be cheaper in the long run.
@@SomeMorganSomewhere No you read wrong. Memory Manufacturers had to pay RAMBUS royalties to make RDRAM and any chipset manufacturer had to pay Rambus royalties if they wanted their chipsets to support it.
@@camjohnson2004 That's exactly what I said...
Because you didn't have to pay licensing fees to RAMBUS it was always going to be cheaper to produce *DDR* in the long run.
Yeah.... Always a good day when a new RetroBytes video is released. I always liked AMD. Support the little guy.
disgruntled Cyrix supporter chiming in. He never mentioned the Cyrix3 which was comparable to the Pentium 3, though released when VIA owned Cyrix.
Absolutely fascinating video for someone who's been playing with computers from the early x86 days. Thanks.
nothing better than waking up on a day off to a new retrobytes video. easily the best tech videos on youtube :)
Amazing writeup. I was hoping that Meltdown or Spectre would get mentioned, but all in all not disappointed. In all fairness, when someone asks me a computer history question my first reaction is to figure out if this channel has made a video on it. Recently had to explain to someone "how the internet works" and guess which link I sent them... Keep up the amazing work!
You could have sent them Al Gore's relatively concise explanation.
17:05 No, they weren't. AMD also had to reverse engineer the 486, just as with the 386. They also had to wait for Intel to lose another legal battle (separate from the one for the 386) before they could sell those 486 processors starting in 1993.
Funny, because a similar situation is playing out now with Qualcomm and MediaTek: Snapdragon has gone full waffle iron, with chips that are copy-pastes of the previous generation, just clocked higher, able to toast your phone to insane temperatures, with useless AI slapped everywhere (which does nothing, as it's just empty talk and software bloatware). Meanwhile MediaTek, mostly known for crappy cheap chips that everyone avoided, now makes the Dimensity series, which can fight in high-end devices for half the price.
Eh, the latest Snapdragon (desktop) chips are *supposedly* clean sheet. It's actually why ARM is losing their minds and pulling Qualcomm's licence. They claim that Qualcomm is using designs from the company they bought (Nuvia), whose licensing is... confusing. Essentially the IP isn't transferable... weird ARM stuff.
The worst part about Mediatek is their drivers/OS division, Mediatek based phones never seem to get more than a year of Android updates.
@@EmyrDerfel Plus they almost never get any Linux drivers, so if you want to install other OS to revive an outdated tablet/phone half of the stuff doesn't work.
Ex-Senior Fairchild Semiconductor employees were known as 'Fairchildren' - random but true anecdote.
Oh l love that.
Man, I remember getting burned by going in on Bulldozer after graduating from an aging Athlon XP. I did a lot of music production work, and this makes heavy use of parallel 32-bit floating point calculations. Imagine my surprise when I didn't get nearly the performance uplift I expected. Still, when the first Ryzen hit the scene, the price was so right that I could not resist. I'm pleased that AMD could bounce back. I'm hopeful that Intel will continue to compete. This history shows us that consumers saw the best improvements in X86 when the competition was fierce.
The best and most entertaining video I watched on TH-cam in a month, probably longer (and I stream this yt s*** non-stop at this point)! I wouldn’t mind if this video was 3 hours long!
Yes, this video is quite long. But the fact that you managed to compile (imo) the definitive history of the desktop processor market is no small feat. I will certainly be referring people to this when discussing the subject.
Thanks for this video. I've lived through part of this timeline, joining the PC race at around the Pentium era. This video is the clearest documentary of the camaraderie and back-and-forth battle between AMD and Intel. I knew AMD had been making chips since the 70s, but I didn't know about the 2nd-source deal and how that related to IBM. Over the years I've owned Intel, Cyrix and AMD. Cyrix was definitely the worst, but I've mostly favoured AMD as they were mostly cheaper and I felt like favouring the underdog (my last CPU purchase was a 1st-gen Ryzen, which I got when it felt like the whole world was running on Intel). Perhaps AMD are not the underdog anymore?
Retrobytes and Auto-Shenanigans joint vid... when? I liked this video, and pushed the button specifically for that.
Transmeta, also known as the company Linus Torvalds worked for :P
Posted this just as I'm looking for something to watch before sleeping. Thanks mate :)
Good night!
My introduction to IT was a highschool summer job replacing Cyrix 6x86 boards and rebuilding the pc's for a neighbor. He went all in on them only to find out that autocad and other 3d modelling systems would burn them out or were so slow as to be expensive paper weights.
Intel introduced the AVX-512 foundation ISA from Skylake-X/SP onwards; on AMD it's from Zen 4 onwards.
This was amazing. Going to share it with my CIS 121 class. :)
Yessss bedtime story uploaded at exactly the right time!
Funny thing: Pat Gelsinger (current CEO of Intel) was a lead architect of the 486, when he was about 27 years old.
Also, the AMD 486 was not just a clone of the Intel 486. It came pretty late (just a little before the Pentium), but had some substantial improvements, including cache and power management, and better clocks.
The 386 was fundamentally the most important in terms of features; to this day we use a similar model of the machine. The 486 was a huge jump in making the CPU smarter (by any standard, previous CPUs were rather primitive). And the next one, the Pentium, was a total rethink on another level, where it started becoming superscalar (up to two concurrent instructions), and the Pentium Pro, as well as the Pentium II, became fully out of order with speculative execution, which is basically what we still do to this day in high-performance CPUs. (There were a few workstation-class CPUs doing it years before, and some mainframes doing it even before that.)
The UMC you probably didn't see in the real world, because they were mostly marketed in other parts of the world. In fact, after their 486 clone they lost a patent lawsuit and were prohibited from selling their CPUs in the USA. I think the UMC Green CPU (a 486 clone) was a from-scratch design, maybe with some reverse engineering of the chip too.
The Cyrix 486 was another clean design that was not based on the Intel core. In fact it had a number of incompatibilities at a low level, which caused some operating systems to not handle it properly, or required special code to handle that CPU. But performance-wise it was similar to the Intel 486, and the general architecture was very similar too.
Loved this. Very nostalgic as I lived through pretty much all of it, avoiding most of the disasters and owning 286, 386DX, some sort of 486, Athlon, Core 2 Duo, and then most of the versions of Zen.
I had an FX-8350 or something like that and it was the first real gaming pc I had, got it up to 4.4ghz with a ridiculously massive heatsink thing with 3 fans. Lasted me a good 9 years, eventually underclocking to get it to even boot. I ran that thing ragged. Now on a Ryzen 7950x3d and loving it. All I'll say is that bulldozer was loved by me, an idiot who knew nothing and at just under the £89 per year of the 9 years I had it (the entire machine cost me £800) I'd say it was worth the money for sure.
Also if the labour government have any brains at all they'll park tanks at ARMs headquarters and say if they try to outsource they'll get nuked lol.
i ran one until ryzen came out. it was not so great for gaming... but virtual machines. it was a beast
The K5 was so interesting. It's basically a RISC AMD 29k with an x86 decoder bolted on; they called it RISC86 at the time. It's a fascinating rabbit hole.
I miss the look and feel of the old chips. a 286 can be used as a literal poker chip, and the 3/486 are so beautiful I kept one framed when I was a kid. I remember having discussion with older nerds about the FPU, I wanted to get one to finish my OC'd 286 build with 2M RAM. They told me if you play games it's useless to you. AND I told them. That will not continue.
RM Nimbus, hell that brings back memories of when I was appointed computer tech at the 6th form college I worked at. No training given, just up to me to read books to find out. Those were the days.😅
So many renditions of these stories descend into a dry list of names, numbers and dates. The joyful humour in your telling makes it a fun adventure into the foibles of men and business. Thanks.
Great video! There were some things I knew and some things I didn't. I am at the 39 min mark. I plan to watch more, but I just want to mention a few things. The Duron was not a cut-down version of the K6-III; it was a budget version of the Athlon. Also, I was a bit sad that you didn't mention the 386DX40, which was a real gem compared to the 486SX. Also, Intel broke the licensing agreement after it came up with the 386, not the 486. At this point Intel really didn't care about losing IBM as a customer, so it didn't care about the cross-licensing agreement it had made in order to do business with IBM.
Fascinating little essay. I was born in 90, so my pc knowledge begins around the pentium 2 and taking apart ones that got thrown out. Growing up i always saw pentium 4 ads everywhere, so the first time i saw an athlon machine i thought it was junk.
2 of my favorite things, computers and long form youtube videos. great video!!
Good history lesson. I was around for much of this history, watching it all. But I had forgotten some of these things, so thanks for reminding me.
I am old enough to remember new machines with 8088 processors. A wonderful boiling down of a long history. I have long regarded Intel as a dirty competitor, and that came back to bite them through a series of bad decisions. Too bad you didn't touch on the RDRAM mess. That was fun.
Exactly! They were infamous for their strong arm tactics forcing OEMs not to buy AMD. Not to mention the Coppermine P3 recall they were forced to make and the current oxidation issues with their 13th and 14th gen.
Interestingly on the ARM front RISC-V is starting to take little bites out of it. And with the spat between Qualcomm and ARM that might speed up...
Hopefully RISC-V does it even more and faster. I would much rather see a more open ISA and architecture like RISC-V than one that is even more closed than x86/AMD64.
I have to wonder if perhaps the way for x86 to survive would be for it to go the OpenGL route and be handled by something akin to the Khronos Group.
It's crazy that Opteron around 2010 had 16 cores compared to Xeon's 6 back then. It still didn't change the fact that they sucked, but I think it's crazy.
Thank you.
AMD has really upped the game when it comes to processors.
Bad business model - the ones I bought 10+ years ago are still running very nicely.
Intel -> intel -> Cyrix -> AMD -> AMD -> AMD -> AMD
-> *The ARM Apocalypse*
@@Toksyuryel ARM is kinda overrated; even Apple Silicon fanboys admit how powerful and power-efficient Lunar Lake is vs the M3 lineup.
It's kinda sad at the same time that Meteor Lake and Arrow Lake are a flop lol.
@@Toksyuryel ARMpocalypse :)
Great video. Missed a couple of things, one, you didn't mention Slot A vs Slot 1 rivalry... As that was around the time DEC were now working with AMD. I remember the DEC Alpha Slot A solution being used by DEC and AMD against the Intel Slot1 competitor. I do remember switching to AMD on the server side when the Alpha was no longer supported on Windows NT4 after Service Pack 3...lol
But I guess that's more of a server focussed product...
i really do like those long, in depth videos. sure, other channels make shorter ones in the same subjects, but they are leaving so much out!
btw my cpu's were made by MOS, Motorola, Intel, AMD, Intel, AMD ... in that order (no i had more than 6 cpu's... i just mentioned the changes in manufacturer)
oh... and did you really say zen 2 started with the ryzen 9000 series?? (at 58:37)
This video makes so much more sense right near the end. I was growing up and understanding how computers work *just* as Intel took the lead definitively, so I was quite concerned about their recent fall from grace, as it were. Turns out AMD's resurgence is more a return to the Athlon status quo, with fewer mistakes this time, which is fascinating lmao
To put into perspective how much brand recognition in the PC market had changed over time, here's a small anecdote. I got my first PC back in 1996, after growing up with an Amiga. At the time I knew what a PC was, I knew Intel and AMD and had a good understanding of how PCs work... but I had never even heard of IBM.
The 1st gen Core i7 had a 6-core variant on the 1366 socket: the i7-990X. The Phenom II also had a 6-core variant, the X6 1100T.
I have a 990X I've still never used. :D
Could we see a video like this but with the GFX card lines? Nowadays we have Team Red and Team Green, but there used to be a few different choices.
The Emergency Edition still lives on as the KS SKUs.
19:16: text on the box "CPU Upgrade - Makes PC's Run Faster" XD
I'll never forget my first experience with computers was when my father bought a Tandy 1000 with an 8088. The following Christmas I got a Tandy Color Computer with the Z80. In Middle School I was bringing in homework that I printed out on the daisy wheel printer. The 80s were great :)
Same experience! GW-Dos and starflight
Didn't Intel have a Windows 7 & 8 compatible Broadwell CPU with 10 cores, the i7-6950X? I'd imagine that CPU would beat the Ryzen 2700X on average, though USB 3 is a bit problematic. It's easier to run Windows 7 on an i7-6950X, and Windows 8.1 on an AM4 CPU with the B450 chipset: 8 supports USB 3 while 7 doesn't, and last time I checked most AM4 boards didn't have a lot of USB 2, only an extra internal header, which is what I'm going to have to use to get 7 working on this Ryzen 2600E.
29:31 The ironic thing about people avoiding Cyrix over the FPU is that I had the pictured chip (6x86-P150) and it lasted me long enough to be replaced by a 512k Athlon XP 2600+ clocked over 10x as fast. I even ran XP on that machine, albeit with significant RAM and storage upgrades. That included gaming; that Cyrix CPU was even able to manage The Sims 1.
It was only stuff that leaned heavily on floating point that struggled; plenty of games made little to no use of floating point and were just fine. It's just thanks to Quake's approach to floating point that Cyrix's reputation took such a kicking (and lots of people wanting to play Quake).
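For anyone wondering how games did fractional maths without leaning on the FPU, here's a minimal 16.16 fixed-point sketch in C. It's my own illustration of the general technique, not code from any particular game: all the arithmetic is ordinary integer work, which is why a chip with a weak FPU could still run most titles fine until Quake made heavy floating-point use the norm.

#include <stdint.h>
#include <stdio.h>

/* 16.16 fixed point: 16 integer bits, 16 fractional bits. */
typedef int32_t fix16;
#define FIX_ONE (1 << 16)

static fix16 fix_from_double(double v) { return (fix16)(v * FIX_ONE); }
static double fix_to_double(fix16 x)   { return (double)x / FIX_ONE; }

static fix16 fix_mul(fix16 a, fix16 b) {
    /* Widen to 64 bits so the intermediate product can't overflow,
       then shift back down into 16.16 format. */
    return (fix16)(((int64_t)a * (int64_t)b) >> 16);
}

int main(void) {
    fix16 x = fix_from_double(1.5);
    fix16 y = fix_from_double(2.25);
    fix16 z = fix_mul(x, y);
    printf("1.5 * 2.25 = %f (expected 3.375)\n", fix_to_double(z));
    return 0;
}

The trade-off is limited range and precision, which was fine for sprite positions and simple 3D but not for something like Quake's perspective-correct texture mapping, where the FPU really did earn its keep.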
Wonderful video. Thanks for posting!
Not mentioned here: AMD has recently worked with ARM to make AI chips, while Intel has largely ignored ARM.
@@MrZics oddly at one point intel did make arm Chips, they bought the part of DEC responsible for the strong arm. The produced them still with the name strong arm, then under the name xscale.
@@RetroBytesUK Indeed, I had a "Pocket PC" Windows phone with an xscale chip. It was...OK. What made them unbearable was cramming a Win9x era GUI into a small screen, requiring a mini stylus to do much of anything. Most of them still had a physical slide-out keyboard, a feature I still kinda miss even to this day. Alas, the iPhone's touch UI effectively eclipsed everything in the handheld space that came before.
Isn't Intel banking on RISC-V instead?
You kinda brushed over that Cyrix was purchased by VIA and became VIA Cyrix. I have a couple of them, a VIA Cyrix III 500MHz and a 600MHz in my collection, along with a Cyrix M II 66MHz.
ohhhh the Bulldozer.
I ran an FX-4100 for a while. It wasn't a quad-core, it was a dual-core with flavor.
This was an excellent recap of the industry and done very well!! Cheers
Nicely done. Filled me in on a lot of interesting gaps in my head canon. Many thanks 🎋
I think Intel Atom deserved a mention in this video. If not a video of its own.
intel Atom is not a desktop processor
@@majkosk8x So? Neither are Itanium, EPYC or Geode. Atom was Intel's attempt to compete with ARM.
+1 for a video on what happened with Atom
@@majkosk8x I've actually got a 16-core Atom microserver
Atom is basically the E-cores, with some extra stuff, that you see on the last 4 generations of Intel CPUs @@mattsword41
I work for a government organisation in Ireland, and let's just say it's one you'd want to have decent kit. I'm not in IT but my presence here indicates I have an interest. I got asked by a couple of people why their computers were "a bit slow" and ended up spending a significant amount of time replacing the Sandybridge Pentiums with 4GB of RAM in late 2024!
Superb timeline video! Never knew the high end waffle iron market was so competitive!
awesome as always. watched every minute. top work fella
Core Solo, Core Duo, Core 2 Duo, Core 2 Quad, Core i7... Athlon64 X4 was a thing. 😅 Names got really tripped up there for a bit. And Alder Lake... 😊
Thanks, very enlightening, keep up the great work!
Minor correction: the 286 never had a paged MMU; that was introduced with the 386. The 286 had segments, just like the 8086, still limited to 64k each, but protected mode gave it a 24-bit physical address space (16MB max) as opposed to the 8086's 20-bit (1MB max). The fact that the later 386 was a true 32-bit CPU with a paged MMU gave it the - then common - nickname of "VAX on a Chip".
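A quick illustration of the segmented-addressing point, as a sketch of my own (nothing from the video): in real mode the 8086 and 286 form a physical address by shifting the 16-bit segment left four bits and adding the 16-bit offset, which is exactly where that roughly 1MB limit comes from.

#include <stdint.h>
#include <stdio.h>

/* Real-mode x86 address formation: physical = (segment << 4) + offset.
   An 8086 only has 20 address lines, so anything past 0xFFFFF wraps around;
   a 286 or later with the A20 line enabled can reach just above 1MB. */
static uint32_t real_mode_phys(uint16_t segment, uint16_t offset) {
    return ((uint32_t)segment << 4) + offset;
}

int main(void) {
    /* Classic example: the colour text-mode video buffer lives at B800:0000. */
    printf("B800:0000 -> 0x%05X\n", real_mode_phys(0xB800, 0x0000));
    /* FFFF:0010 works out to 0x100000, the start of the high memory area on a 286+. */
    printf("FFFF:0010 -> 0x%05X\n", real_mode_phys(0xFFFF, 0x0010));
    return 0;
}

Protected mode on the 286 swapped the shift for a descriptor-table lookup with a 24-bit base, and the 386 then layered paging underneath all of it.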
Fantastic video!
What a delightful walk through my childhood 😂❤ At the start I wasn't that great at floating point operations either.
I hope it goes down in the history books what a massive turnaround AMD has had in the Ryzen era, going from "don't buy their products" to scrappy underdog, to peer level, to now having the undisputed best processors for most (consumer) use cases, while EPYC has been massive in the enterprise space as well; even if market share hasn't yet caught up with the new landscape.
Oh no, the Duron was NOT the budget K6; instead it was a budget Athlon! And before the Duron, the K6 was the cheaper of the two when the Athlon was launched.
01:02:53 Since you've now said Ada Lake twice, I am compelled to point out that it's actually Alder Lake.
The P3 started out on Slot 1 too, and the Duron was the budget version of the Thunderbird on Socket A, not of the K6-III.
AMD's chiplet approach really helped them out. On Zen 1 and Zen+ (Ryzen 1000 and 2000), desktop parts had a single die, while workstations and servers had two or four active dies that were nearly identical to their desktop counterparts, providing a really easy way of scaling up to 32 cores. On Zen 2, with the I/O die plus however many chiplets, AMD just went hog wild. I wouldn't be surprised if we get 512-core EPYC CPUs in the somewhat near future. Also, Zen 4c is really cool: by limiting the voltage and clock speeds AMD could shove 16 cores into a single chiplet and not have it melt under a full all-core load. I kinda want to see a budget Ryzen CPU with those cores for the budget SFF workstation market.
That took me back, from my 486 DX2 to my Pentium Overdrive, to my Pentium MMX, 2, 3 and 4, before moving to Athlon, then the FX Black Bulldozer central heater, and then returning to 8th, 10th and 11th gen i7s before going Ryzen 9 4th gen last year.
Top quality video again, thanks for sharing and for reminding me of what I've lived through over the last 30 years.
I remember building my first PC (I owned one prior, a Pentium 2). This time though, I was going to put it together myself, around the year 2000/2001? I just remember reading all the reviews in magazines and getting my Athlon XP 2200+ (which actually ran at 1.8 GHz, but AMD marketed it as being as fast as a 2.2 GHz Pentium). I put it on an MSI motherboard; it was a great chip, and to this day it still turns on. It just aged out of the game and never broke down. I still have it collecting dust, sadly. That one and my father's Athlon 2.4 version.
The K5 was basically a 29000 with a translation layer added. Worked quite well. After that, AMD bought Nexgen and used their design to build the K6.
12:30 That seems to be an odd thing to say, given that Intel did not provide AMD the designs of the 386, in breach of their contract. AMD had to reverse-engineer the processor to make their own versions, and wasn't able to sell any until 1991, after a legal battle that Intel lost.
Seymour Cray ..... If you say it out aloud it could be the name of a famous fish spotter or fisherman.
Thankyou thankyou i'm here all week! 🤣
I'll get me coat ...
Loved it, even though I knew all of this, I enjoyed your presentation of the info.
This is a very instructive tale of how a smaller, engineer-driven business can outperform a bigger, commercially driven one that is kept alive only by the ignorance of its customers.