I never had one myself, but I remember from my computer shop job in the 90s that the K5 caused divide-by-zero errors in many runtime DOS programs; its cache memory was blazing fast. I had to use a program that slowed things down by saturating the CPU with NOPs.
SiS chipsets did a very bad job on the AMD socket chips, and then Nvidia came in with their nForce chipsets. You mixed it up: the divide-by-zero in coprocessor code was the Pentium 60/66 bug.
@@lucasrem Integer divide-by-zero was a speed bug that hit older DOS programs on modern, fast CPUs. Most developers never even considered the exponential speed increases coming with Socket 7 CPUs.
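For anyone curious how a faster CPU can crash old DOS software with a divide error: a well-known instance of this class of bug was timing-loop calibration, where code counted busy-loop iterations during one 55 ms timer tick and then did a 16-bit division on the result. On fast Socket 7 chips the numbers outgrew 16 bits and the division faulted, which DOS-era runtimes reported as a divide-by-zero style error. A minimal sketch in C, with made-up numbers rather than any real library's code:

```c
#include <stdio.h>

/* Sketch of the classic DOS-era delay-calibration failure.
 * All numbers are illustrative assumptions, not taken from any real library. */
int main(void) {
    /* Iterations of a busy-wait loop completed in one 55 ms timer tick.
     * The count scales roughly with CPU speed. */
    unsigned loops_per_tick_486 = 30000;    /* assumed slow machine */
    unsigned loops_per_tick_k6  = 6000000;  /* assumed fast Socket 7 machine */

    /* 16-bit real-mode code divided this count to derive a delay scale factor.
     * If the quotient no longer fits in 16 bits, the x86 DIV instruction
     * faults -- the same exception DOS-era runtimes reported as a
     * divide-by-zero style runtime error. */
    unsigned scale_slow = loops_per_tick_486 / 55;  /* 545: fits in 16 bits */
    unsigned scale_fast = loops_per_tick_k6 / 55;   /* 109090: does not fit */

    printf("slow machine scale: %u (ok)\n", scale_slow);
    printf("fast machine scale: %u (%s)\n", scale_fast,
           scale_fast > 0xFFFF ? "16-bit DIV would fault here" : "ok");
    return 0;
}
```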
Sure, integer speed was great on the old K5s, but that floating point... my God! Terrible. I paid a shop for a Pentium 133 and they pulled a fast one and gave me an AMD PR-133 (K5)... which only ran at 100 MHz. I called them, and they claimed they'd done me a favour: "Faster than a true Intel P133," they said. Without mentioning it cost just half as much and was terrible at floating point. Given I'd only bought the thing to play Quake, I was seriously angry. 10 months later I went back to the store and stole a Pentium 200 MHz right from under the counter. Revenge!
@@paulmichaelfreedman8334 Jesus, that exchange rate back then! When I stole mine (although I'm loath to use that term - that shop owner had STOLEN my money already with the switcheroo) the price was £200, exactly £1/megahurt. I'd never installed a CPU before, figured it out as I went along. Wasn't even sure if my motherboard would take it. But it did. Jumperless - set the clock speed in BIOS and boom... 200 MHz. Quake timerefresh - from the difficulty select screen - went from 7.7 fps on the AMD PR-133 up to 21.8 fps with the P200. Still can't believe those scumbags at Falcon Computers would steal a child's money and then lie to their face on the phone when they complained! Pricks.
I grew up ten minutes from the ATi building in Markham; I saw that logo every day growing up. I remember when my cousin and I got the new 64MB AGP card from ATi. How far we've come...
4:00 Not sure if this was intentional or not, but note that the "Pentium" you show here on the left is not one of the Pentiums under discussion, but a much later low-end Intel Skylake chip from 2015 that re-uses the Pentium name for marketing purposes (well into the Core i# era)
I’m literally reading how Eagle Pass crossing is one of two border closures at the exact time you mention Ruiz crossing the border into Eagle Pass. Timing
18:04 Just imagine if Nvidia and AMD had merged way back then (and Huang took over). Ruiz's days were ultimately numbered anyway, and Huang is the much stronger CEO. AMD also started slipping come 2006; while the ATI merger was strategic, their lackluster CPUs compared to Intel's Core 2, plus the long time it took to get quad-core chips out to the masses, were really their death knell, and then Intel integrated the memory controller (retiring the FSB) and any remaining AMD advantage was gone (it wasn't until several CEOs later and Ryzen that they finally competed again). Nvidia also really wants an x86 license and this would've solved it. And while Nvidia was blocked from buying ARM these days, I don't think they would've been blocked from merging with AMD given the economy and regulatory environment back then. I think under Huang AMD would've had a much better chance all the way up to at least the Ryzen days - depends on how the products would've performed.
There's a cool story I remember from the late-90's era, but can't Google up the details for. It was about the pricing of Celeron processors. Intel set a baseline price that was above what AMD was charging, and everyone knew Intel had brought out Celeron to compete with AMD, but they insisted on charging more. Then it wasn't selling, so the price came down after all, and the guy who'd said it wouldn't (Andy Grove?) resigned. It was a fun win for the upstart company at the time.
My Cyrix 120 is still working here too, DOS 6.22 and Windows 3.11, no issues at all. A branded IBM build, IBM on the CPU too, no Cyrix name on it. I always thought it was produced in Germany; now I know it was NYC.
Itanium brings back memories. Itanic! It could more or less run HP PA-RISC, but it had really, really bad IA-32 performance due to emulation and abysmal compilers, and IA-64 itself was nearly as bad, again because of the compilers. AMD64, meanwhile, could run IA-32 code natively or near-natively with minimal changes, plus an easy-to-access 64-bit mode with much better compilers and code that was far easier to adjust. No contest.
AMD saved x86 from Intel's own shortsightedness. Intel was focusing on an Itanium-based 64-bit future. The AMD64 extensions were a game changer and are still in use today. Whilst Itanium might have been good for HP and Intel, no one else in the industry wanted it to be the successor to x86.
To be more specific, the Itanium optimization chain simply exposed everything and left micro-optimizations like pipelining and branch prediction to the compiler. The simplified hypothesis was, on one hand, that the compiler knows the bigger picture of an application, and while it is slower than hardware, it only needs to optimize the binary once, as opposed to the CPU making fast hardware optimizations during every run. This lack of hardware optimization also frees up substantial silicon area and power budget for more core logical processing units (which is largely why GPUs can do so much calculation per dollar for specific tasks). This could actually have worked well, but as you stated, the optimizing compilers of that time did not have the needed advances and Intel didn't bother to put that horse before the cart; secondly, it requires compiling for very specific models of hardware (less portable binaries, more prone to vendor lock-in). End users also get attached to their legacy proprietary binary software, so even with a good optimizing compiler they can't just recompile to get the advantages, hence the marketing importance of good backward compatibility.
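A rough C-level picture of what that shift means (purely illustrative, not IA-64 code): with an out-of-order x86, the naive loop below is fine because the hardware finds the independent work at run time; an EPIC/VLIW compiler instead has to discover and lay out that parallelism itself, once, at compile time.

```c
#include <stddef.h>

/* Naive loop: on an out-of-order CPU, the hardware discovers at run time
 * that iterations are independent and overlaps them on its own. */
void scale_naive(float *dst, const float *src, size_t n, float k) {
    for (size_t i = 0; i < n; i++)
        dst[i] = src[i] * k + 1.0f;
}

/* What an EPIC-style compiler must do instead: find the independent work
 * itself and lay it out explicitly. Each "bundle" comment marks operations
 * that could be issued together because they have no dependencies.
 * (Illustrative only -- real IA-64 bundles are fixed instruction slots with
 * stop bits, chosen entirely by the compiler, not by hardware.) */
void scale_scheduled(float *dst, const float *src, size_t n, float k) {
    size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        /* bundle 1: four independent loads */
        float a = src[i], b = src[i + 1], c = src[i + 2], d = src[i + 3];
        /* bundle 2: four independent multiply-adds */
        a = a * k + 1.0f; b = b * k + 1.0f; c = c * k + 1.0f; d = d * k + 1.0f;
        /* bundle 3: four independent stores */
        dst[i] = a; dst[i + 1] = b; dst[i + 2] = c; dst[i + 3] = d;
    }
    for (; i < n; i++)  /* scalar tail */
        dst[i] = src[i] * k + 1.0f;
}

int main(void) {
    float in[8] = {1, 2, 3, 4, 5, 6, 7, 8}, out[8];
    scale_scheduled(out, in, 8, 2.0f);
    return (int)out[7];  /* 17 */
}
```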
@@PainterVierax Itanium is RISC, not "pretty much" (pedantry!). But being RISC wasn't the core market failure; it was just a poorly supported implementation. ARM's various designs are RISC and do just fine. Historically the market is much more complicated, but 80x86 being CISC was not the major factor in its market dominance. It was much more a matter of momentum, contracts, compatibility, and supply competition at a critical time in the microcomputer market.
@@mytech6779 ARM did a bit better because the embedded nature of the products allowed OSes to adapt to it. Performance and compatibility are a reason why Android uses Dalvik and then ART instead of more classic binary packaging. That's also why specific distros like Armbian or Raspbian were formed on top of the Debian work, and why Arch or BSD are quite popular on ARM and RISC-V devices. Add to this the huge library incompatibility with the vast x86 environment and you get why Apple took so long to make an ARM transition in their "desktop" environment (not iOS) despite a very secure market share.
Interestingly, AMD's original Athlon CPUs used the same bus as the Digital Equipment Corp Alpha 21264 EV6. They licenced this from DEC. This is a DDR bus, and it's why they had much greater memory bandwidth in one of your slides on Intel vs AMD performance in the Athlon vs Pentium 3 era.
Great CPUs for mid/high-end gaming and mobile, but their desktop parts are actually lacking in application workloads, not that great in idle/light-task power consumption, and their low-end offering is not competitive at all. Same for high-end workstations: the new Threadripper is not the game changer it was before. Finally, their recent platforms seem very unfinished/unreliable, more so than is typical for AMD.
Looking at a Pentium with a small number of pins as opposed to what they have now with 1700 pins is pretty funny. It shows how little connectivity these earlier CPUs had.
The whole journey of computing up to now has been remarkable: as microscopes could see more, they made circuits ever smaller. But now they've all reached a limit; the last 3 generations of CPUs from Intel were actually the same CPU, 10% apart if that. No wonder upgraders stayed away, as with me - my i7-900k has done me for ages.
Back then, I had the K6-2 at 400 MHz with 3DNow! Very happy with it, and it greatly outperformed more expensive Intels in games which fully supported it.
Wild to think how different the market would be today if AMD hadn't bought ATI or even more so if they had merged with Nvidia. Nvidia is larger and worth more than either Intel or AMD today on their own. If they had combined with AMD and Jensen got his hands on an x86 license then they may be bigger than Apple
mattwhite7421 They needed a GPU partner; what if they had bought Matrox? It would have turned out the same as now. Cheaper to buy your way in than to do all the development at home!
AMD brings back a lot of nostalgia from the late 90s through to the mid 00s. For me it's kind of weird hearing that AMD was struggling back then, as basically no one but ill-informed novices would even consider Intel in the enthusiast space. Like 90% of the gaming PCs that my friends and local gamers built went with an AMD Athlon XP, 64 or FX, plus mostly an Nvidia combo, with ATi being the better and more popular option in the early 00s due to the Nvidia FX generation being an absolute flop. We all knew the P3 was too expensive and slower for gaming than the Athlon, and the 64/FX was just light years ahead of the cluster f**k that was the P4. It wasn't until about 2006 or 2007, when Intel Core was compelling and offered an upgrade, that the AMD preference finally started to wane. I suppose 99% of consumers back then had no idea about the anti-competitive shady stuff Intel was doing, not like the awareness people have now. We all assumed AMD was raking it in during those glory days of the Athlon era.

I also remember the woes of the first Phenom; I bought one very early on and it was good but very much limited by clocks due to that design bug. The Phenom II was fantastic though, one of the best overclocking AMD chips I ever had, but it was too late to the market by that stage. My main desktop is still an FX-8320; it's one of the best PCs I've had in terms of snappiness in general everyday use, even though it really lacks modern gaming and heavy number-crunching power, even when it was new. But it's way snappier than the i7 2500K I tried for a while, much of a muchness in the other areas, and even snappier than the lower-end Ryzen laptops I currently use, which I did not expect. I'm not sure why any of this is the case; I suspect it's something to do with latency through the pipeline being better than early Ryzen and Intel of the same generation? But I'm no expert on those older architectures. Maybe one day I'll upgrade to that mid/high-end Ryzen desktop I have been talking about for the past 5-6 years 😅
OMG! I remember Cyberpro in Norcross, GA, right off Jimmy Carter Blvd! There were a TON of little mom and pop computer stores in an industrial park there! They all pretty much had the same stock, but you could build a whole cheap machine just by going to those shops and price compare to get the parts from the cheapest vendors...
Pentium 4 taking the crown from Athlon Thunderbird? haha. I beg to differ. P4 was hot, big and expensive. Pentium 3 Coppermine and Tualatin were much better as the time showed. Best wishes.
Only for games. If you all work on the same project, all the hardware needs to be compatible! Why is that so hard to understand? Is he not a coder, and not a gamer either?
Watching them slide into irrelevancy into the 2010s was so sad. I was so happy when they released Ryzen, since they hadn't really been worth buying in close to a decade unless you had a really limited budget.
I was in Hsinchu when Intel lost their stranglehold on the Taiwanese motherboard manufacturers. My understanding was that Intel told the motherboard makers that if they made a high-end motherboard for AMD CPUs, Intel would stop selling Intel chipsets to that motherboard maker. Since Taiwan was producing most of the motherboards, this hurt AMD's desktop business. However, while I was in Taiwan on one of the projects I did there, Intel had delayed building a new fab and this left them short on product to deliver. The result was that every motherboard maker immediately brought out high-performance AMD motherboards. This put Intel and AMD on a more equal footing in the desktop market. That is my understanding of the desktop motherboard skirmish between AMD and Intel.
Back then, in 1999, I was writing code for biometrics, and using an Intel Pentium I was able to analyze more than 600 fingerprints per second. It did not work well with AMD or Cyrix. Don't know how that's going now with crazy fast processors.
Slots, muhahahaha. We have a Slot 1 Celeron CPU, we know what it is. AMD sold a cache board too, a Level 2 cache expansion slot, that was that. NOT on-die!!!!
The K5 wasn't as bad as this video suggests; it was the choice for budget systems at the time, as the performance was sound and the price below the Intel parts. One might tell the story as: AMD got to use an outdated fab without going bankrupt.
I would like to add that GlobalFoundries, TSMC, etc. are pure-play foundries. As such, if you were to include others like Samsung, which is an IDM that also takes orders from other players, GF is hardly in 3rd place.
I hated both AMD for their Athlon XP and Intel for their P3. As a former PC shop owner, selling CPUs with exposed silicon dies was probably the worst idea the industry ever had. I lost so much money, time and nerves fighting customers too dumb to install their cooler properly, who then tried to claim warranty on cracked dies. Sure, it made them cheaper to fab, but that just shifted the cost to the last link in the chain.
In developing countries it is still impossible to find AMD-powered laptops in electronics retail chains and local internet shops. The ones available in small numbers are the premium models that cost over 1k, which very few people can afford. That's why 1366x768 models with Celerons are still selling like hotcakes, since there is nothing else available in that segment. Intel is still doing shady business and it's a shame that AMD is not suing them.
Someone, somewhere, much smarter than me, has already combined Moore's law, the capital cost of new fabs, and the limits of the human user to calculate where the diminishing returns on all this capital madness will start. Semiconductors will have a "Concorde/SST" moment sooner than we think.
Interestingly, the person who managed the copper interconnect project at IBM that later gave the Athlon a considerable edge was actually Dr. Lisa Su. Just imagine: the person who helped develop Athlon manufacturing (albeit indirectly) is the same one who later served as AMD's CEO during the time Zen was developed and put into production, crushing Intel in HPC.
Also, before Athlon and AMD64, Intel and Compaq had future DEC Alpha development cancelled (to ensure that Itanium wouldn't have to deal with RISC competition), disbanding the development team and leaving some of its technologies shelved. Some of the DEC engineers then ended up at AMD, developing the AMD64 architecture and the Athlon and Opteron. One of those engineers was Jim Keller, who much later also participated in Zen development. AMD also licensed the EV6 bus from Alpha, using it in the Athlon.
Now that you think about it, the Athlon name may have a deeper meaning after all. It was supposed to come from the word decathlon, but with so much of DEC's work being used for it, it was almost a "DEC-Athlon" chip.
And as I recall, Itanium seemed like a brain fart by Intel. The x86 instruction set getting extended to 64-bit by AMD made that all too obvious; instruction set compatibility was more important than Intel realized.
@@stevebabiak6997 Itanium was an interesting concept but required too much research to get it done, and by the time it came out x86 had caught up in performance and market share.
More importantly, the mere announcement of Itanium made a lot of companies stop developing alternative ISAs (DEC Alpha, PA-RISC, etc.), which drove a large chunk of the market to x86.
So Itanium basically destroyed its own place on the market with its own announcement. Which is really quite an achievement!
AMD64 was just a nail in the coffin; Itanium was already on life support paid for by HP from day one.
@@stevebabiak6997 Itanium inherently was a very "riscy" choice (pun intended), because it's at a very extreme end of architecture choices with its VLIW instruction set. AFAIK the only area where that style has been used for a long time is signal processing (i.e. where you have very specialized, optimized software), where simplifying the chip has huge relative gains in power usage and chip cost. VLIW basically means you have huge instructions that explicitly have fields for each specific ALU of the CPU, so the CPU only needs a very primitive scheduler and no out-of-order handling. In general-purpose computing, the difference just wasn't significant enough to warrant the effort to switch over. Also, the huge instructions complicate compiler programming and increase memory usage significantly (besides the extra memory required for 64-bit addressing), and that's probably also why Intel always segregated Itanium to servers only. And then AMD64 killed off Itanium (and forced Intel to "cannibalize" Itanium with its own 64-bit x86 chips).
Architecture-wise, Itanium probably was great for some types of scientific and supercomputing applications, but it just doesn't make any sense for e.g. a web server.
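A toy illustration of "fields for each specific ALU": the instruction word below has one explicit slot per functional unit, and the compiler has to fill every slot each cycle (with NOPs when a unit has nothing to do), which is also where the code-size cost comes from. This is a made-up format for illustration only; real IA-64 packs instruction slots plus a template field into 128-bit bundles.

```c
#include <stdint.h>
#include <stdio.h>

/* Toy VLIW instruction word: one explicit slot per functional unit.
 * Purely hypothetical encoding, not IA-64's real format. */
enum opcode { OP_NOP, OP_ADD, OP_MUL, OP_LOAD, OP_STORE, OP_BRANCH };

struct slot {
    uint8_t op;    /* enum opcode */
    uint8_t dst;   /* destination register */
    uint8_t src1;
    uint8_t src2;
};

struct vliw_word {
    struct slot int_alu;   /* integer unit */
    struct slot fp_alu;    /* floating-point unit */
    struct slot mem;       /* load/store unit */
    struct slot branch;    /* branch unit */
};

/* One "cycle" of work exactly as the compiler emitted it: the hardware just
 * issues all four slots at once, with no reordering of its own. */
static const struct vliw_word example = {
    .int_alu = { OP_ADD,  1, 2, 3 },
    .fp_alu  = { OP_MUL,  8, 9, 10 },
    .mem     = { OP_LOAD, 4, 5, 0 },
    .branch  = { OP_NOP,  0, 0, 0 },  /* nothing for the branch unit this cycle */
};

int main(void) {
    /* Empty slots still take space, hence the larger code footprint. */
    printf("one issue packet = %zu bytes, even with a NOP slot\n", sizeof example);
    return 0;
}
```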
I read somewhere that Lisa Su was also part of the Cell processor team, which was later used in the PS3 and in servers, touching two of the most important businesses of AMD today: graphics and data center. It's kinda curious how this person accumulated experience in all the key businesses AMD has now.
Can't wait to see the "5 years from now" video in this saga. It's not that I learn anything new, I just like Jon's take on it.
AMD kicking Intel's ass?
@@Punisher9419 AMD putting an alarm clock to Intel's head
Forget the other statements about AMD kicking Intel's ass, because Intel will move to TSMC before that happens and would probably push AMD to the side a little at TSMC, as they already have by buying up the remaining run of TSMC N3 for this year (and I think into next), which keeps AMD from being able to use N3 for most of the product lines they'll launch next year.
So, a future look? How much do you know about CPUs?
AMD will have to redo their current MCM architecture and probably move to something like what Intel is putting out in 2024, at least for laptops (Meteor Lake, a tiled MCM architecture where one chiplet sits right against the next with direct connections between them). Many insiders feel this is going to be better than AMD's Infinity Fabric, where processor cores have to send data to and from another die called the IOD, which has a multiplexer for routing data to where it's supposed to go; that design was fast enough for Zen 2/3 but is giving them problems for Zen 4. So AMD will probably move to a direct-connect MCM architecture just like Intel, and I believe they have the rights to do so, as I think that connection was licensed by a few companies so CPUs could be made with a wide variety of chiplets or tiles.
So, imagine a world where a CPU can be made with lots of different chips from different companies. That's the next 5 years. CPUs will have hardware accelerators, AI cores similar to what AMD is already putting in some laptop products. You could have a graphics processor be a separate die, etc. You could even have a chiplet/tile simply be cache, similar to how AMD is stacking L3 cache on their X3D parts, but with no need to stack it any more.
Yeah, that's the next 5 years and it's going to be exciting and both Intel and AMD are ready for this.
@@johndoh5182 Sorry, but I didn't bother reading that blurb, as you started out with blatant speculation that nobody but TSMC, AMD and Intel knows anything about.
Sure, I read the memoirs as well.
Ah, a stroll down the nightmare memory lane. It was such an adventure watching AMD's struggles against Intel, and now they've finally caught and surpassed them.
And it remains to be seen whether Intel can catch back up, or whether they were only strong from first-mover power.
@@mddunlap03 Intel might be a bit behind these days, but they still have some design strengths, an excellent software team, and quite a solid reputation. That's why, despite very good products, AMD has difficulty pushing through into a lot of markets. This time it's not because of coercion. Also, Intel has deep pockets, and the range of fields they cover is way larger than AMD's, so they can easily manage losing a few battles here and there.
@@mddunlap03 Intel has built a LOT of their recent success on resources that may not produce the huge advance this time that Intel needs.
I wouldn't exactly call 35% market share "surpassed"...
They surpassed them in the 2000s but squandered it with their Piledriver and Bulldozer architectures.
You are the only channel with the cojones to call Intel's illegal actions out!
Bravo!
Slap. On. Wrist.
AdoredTV spent years tearing apart Intel's and Nvidia's nefarious business practices and anti-consumer BS.
@@TheVanillatech true.
ColdFusion did a video about Intel's shenanigans against AMD.
@@TheVanillatech Is Jim still around? It seems like he's taking a smaller role after expanding into a website and podcast.
I remember watching his masterpieces like _Nanomatters_ and _Path Tracing_ years ago.
I actually worked at TI in the 1990s (1995 & 96, I think) on a Pentium-class clone processor. Internally it was called the Amazon project.
TI had effectively stolen Cyrix CPUs by signing a manufacturing deal with Cyrix that allowed TI to sell the chip directly to OEMs and to undercut Cyrix. TI pursued the Amazon project for a few brief years in the hopes of coming up with their own successor chip, but we were several months behind schedule when we began getting silicon and TI's bean counters decided to can the entire project and sell off the IP rather than try to be an also-ran in the x86 CPU business.
The Amazon chip had several innovative design features that would have allowed it to run rings around a typical Pentium processor, but it lacked out-of-order execution, which was the next new thing with the up-and-coming Pentium Pro. Apparently TI management didn't want to spend the resources to go chasing that target.
Awesome video. Brings back lots of memories.
I remember in the late 90s, around 98/99 if I recall correctly, when the Hammer architecture was first unveiled, even before the chips came out. I downloaded the PDF and read all about the 64-bit extensions to the x86 architecture. The Clawhammer, Sledgehammer and Jackhammer names were bandied around, which later became Thunderbird and Athlon a few years later. The proposed changes were shared with the world years before the first chip using them debuted. The 64-bit extensions were embraced by Intel without any accolades that they were invented by AMD.
When AMD reached 1 GHz before Intel, everyone became AMD fanboys overnight. Especially with how easy they were to overclock.
And this is one of the few times Microsoft actually did the right thing: they were ON it like a whiz. They had caught wind that Intel would partner with HP and their HPUX as the server operating system of choice for Itanium. Microsoft didn't like that one bit, and put all hands on deck to make a good x86_64 compiler and port Windows to the new instruction set extension.
Wait, Sledgehammer is the codename for K8, as well as other 'hammers'. Athlon and Thunderbird were K7.
AMD64 was forced by law; Intel, HP and MS had the Itanium project.
@@ronch550 A lot of what he says is not true.
No. The Athlon K7 was the first major effort in 1999 and was still 32-bit. Thunderbird was a later Athlon core reaching over 1 GHz.
Sledgehammer and Clawhammer were Athlon 64 and had no competition, because the Pentium 4 was shit and only clueless idiots got those.
Among Intel's big flops was the new Willamette core of the year 2000 (a HUGE project I worked on). It was an ultra-complicated, super-pipelined (12-stage) generation that required even more stages (18) for the Prescott follow-up, thus dooming the series to cancellation, as they could not tame the Prescott beast as it fell behind schedule. In hindsight it exposed a major design/tech-path screw-up by top management. The 'right hand turn' to multi-cored P6s helped retrieve the situation... for a while.
That was a strange time, yes... the Pentium 4 line hitting a dead end, and the Pentium M / Core Duo taking over.
Do you know anything about the Prescott successor, Tejas and Jayhawk? Beyond what is known on the usual wikipedia etc.
Such a prolific researcher, you've come full circle to an older essay. I actually remember that particular one because I subscribed and looked forward to more of your work.
Thank you for documenting this history for all our benefit. 😊
I love these historical flashbacks of these massive companies. I've learned a lot about AMD, especially its early beginnings, from this current trilogy!
Will you make a fourth one covering where this video left off until present day?
One reason the Abu Dhabi investment looked at AMD was Intel getting Core and other innovations out of the Intel branch in Israel. The two CPU giants are proxy warriors for a very different kind of fight.
Holy shit
That doesn't surprise me. There's always more going on behind the scenes.
fuck intel
Intel Haifa I believe was responsible for Centrino and the Core Series CPUs. @@beeman4266
I really love your video analysis and your newsletter, I am sincerely impressed by your work as it has the right blend of thorough analysis and the right dose of sprinkled in humor! Keep it going, my guy
"Intel was capable of destroying HP", and instead Fiorina largely did it herself.
Ohhh, the K6-2, my first PC. I learned a lot on that PC, for my motherboard had a sketchy L2 cache and gave a lot of BSODs, LOL. And the Thunderbird Athlons... those were good and proud times. So proud of AMD's achievements! Keep going!!!
Same for me. Paid $200 at Best Buy including monitor.
My very first windows PC was powered by a K6-2 and it introduced me to the internet, PC gaming and then I handed it off to my friend and it did the same for him. Fond memories all thanks to AMD bringing the price down for the everyday person.
Me too, $200 at Best Buy. Lan parties with Counter Strike on a GeForce 1.
11:19 - You have to appreciate how much the prices of computer hardware have decreased over the years. Inflation-adjusted, those CPUs cost around 1.5k to 2k in today's money, and that is only the CPU.
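The adjustment is just a CPI ratio. A back-of-the-envelope sketch, where the roughly 1.8x multiplier from 1999 dollars to early-2020s dollars is an approximation and the launch prices are placeholders rather than figures from the video:

```c
#include <stdio.h>

int main(void) {
    /* adjusted = nominal * (CPI_now / CPI_then).
     * US CPI grew roughly 1.8x between 1999 and the early 2020s (approximate). */
    const double cpi_ratio = 1.8;
    const double launch_prices_1999[] = { 849.0, 999.0 };  /* placeholder prices */

    for (int i = 0; i < 2; i++)
        printf("$%.0f in 1999 is about $%.0f today\n",
               launch_prices_1999[i], launch_prices_1999[i] * cpi_ratio);
    return 0;
}
```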
Remember my buddy's dad paid $3k for an HP or Compaq machine with pentium 1 in the 90s.
This is great content. The late 90s and early 00s were a great time for competition in desktop processors, coming along with the discovery of what graphics cards could (and would) do. Thanks for your work.
I built myself computers in 99, 04, and 06 or 07, and used AMD for all three. Those were the last new computers I've had, and finding used/refurbished AMD stuff is not easy. I'll likely build another in a year or two, and it'll be back to AMD.
I bought an Athlon 64 and built lots of Windows apps with it. Now, knowing its history, I'm even more proud that I had that chip. Thanks for the history lesson!
These videos are great. I had a gap in my knowledge for the 2002-2010 period that this filled in nicely!
I will always have a deep love for my Opteron 165. Using 1A-Cooling (a German company) watercooling parts in 2005, 3.2/3.3GHz+ was my daily driver frequency. The waterblock itself was a heavy chunk of metal. The contact point/cold plate was two strips of copper and one of brass in the middle magically joined seamlessly. It mounted with a single thumbscrew dead in the middle to provide pressure directly down on the silicon which would cause the block to be sometimes comically askew with the torque from the tubing with no reduction in performance. The whole block was seemingly one unbreakable monolith, and I always wondered what it looked like inside.
4x 1GB DDR sticks, GeForce 6800GT, NB and GPU were also waterblocked (wildly thin blocks, same company, same single point of pressure mounting) with a super thick 240MM radiator. Those were the days... The Eheim pump was 110V and required the case to have an AC plug routed out the back.
Thanks for the walk down memory lane here :)
Ahhh, this brings back memories of overclocking AMDs as a kid and finding them sooo much better.
Oh I really want that cloudy Vaio desktop background. What a look
the Athlon64 era was truly golden. those were the real good ol days… the PowerMac G5 and Athlon64 were in my house
The A64 was incredible - in terms of price AND performance. A bit confusing with the different socket types; going 754 locked you down to a 3.4 GHz maximum, and they were much harder to find and more expensive than the 939s. I spent a fortune on a DFI LanParty 754 board and regretted it pretty soon after. But hey... Core 2 Duo was around the corner, so it was a no-brainer what to buy next.
@@TheVanillatech I was roughly 9 back then but I remembered 754 being the budget option and it was only single channel DDR?
I remember the time my small amount of AMD stock was worth practically nothing. it was in spitting distance of being de-listed. I.... still have it though. ;-)
I had the first Athlon XP-based computer back in 2001; there were some problems with it. The chipset wasn't very stable. I did have AMD products before this, like an AMD K6-2 350 MHz and an AMD K6-3 450 MHz, but those products didn't offer the performance of a Pentium 2 or 3. I changed to a Pentium 2 and then a Pentium 3 in 1998 and 1999. I tried an Athlon XP, but it wasn't very stable and crashed a lot for some reason. Then I changed to a Pentium 4 1.8 GHz.
The last AMD product I used was in 2009, an AMD Phenom II 945 Deneb, and I switched to an Intel i7-960. After that I just used Intel products ever since. A lot of people are saying AMD stuff is better now, and when I have to get a new computer next time it will be an AMD product.
I had an Athlon XP 1900+ in the early 2000s, and my dad had a 2 GHz P4. My Athlon ran circles around his P4, it wasn't even close. My AMD was more prone to overheating, but it was definitely a superior CPU.
My PC history was a Pentium MMX (fine), a K6-2 upgrade (fantastic), then an AMD K8 / Athlon 64 (great speed and efficiency).
Then an Intel i7 2600K that never made me happy for some reason. Problems with suspend etc.
Last year I bought a not-so-leading-edge Ryzen 7 5800g. Laptop chip in desktop package. Love it again.
15:27 Intel warned her of destroying HP, to which Fiorina replied that she would be the one to destroy HP.
I remember having a K6, and then an Athlon XP Pro. They were pretty solid chips, and I got some pretty good gaming in with them.
Oh, I remember the original Celeron 300A very well. It was the Intel chip I could afford as a teenager after being disenchanted by the Cyrix 6x86's lackluster gaming performance. And of course I made sure to get a motherboard and RAM able to run at a 100 MHz bus speed. After some initial hesitation I switched it from the default 66 MHz bus speed to 100 MHz, and yes, it ran effortlessly at 450 MHz with the stock cooler. This was more or less a public secret: the lack of L2 cache was what made the chip able to run so well at higher clock speeds.
The 300A had 128k L2 cache. It wasn't the lack of cache, but the fact that it was on-die that made it overclockable.
@@michal0g it was a long time ago so my memory was a bit hazy on that, but yes indeed! With the regular P2's you had these L2 chips on the riser board of the slot processors that were the form factor they came in... it's all starting to come back now 😁
One of my old PCs had a K6-2. Even at 500MHz, a Celeron 300 was over twice as fast. I guess the floating point unit in them was just that bad.
Also the Celeron 300A (the 2nd version of the 300MHz chip, based on Mendocino with on-die L2 cache) was extremely popular and could overclock to over twice its clockspeed on air. It was a beast.
Was looking for this comment.
It was almost like magic. Overclocking a 300 MHz / 66 MHz-bus Celeron into a 450 MHz / 100 MHz-bus chip (the arithmetic is sketched after this comment).
As a kid, I felt I had tricked the system. At the time, such a Celeron cost less than €100. A real 450 MHz Intel chip was more than €600.
Add a 3Dfx Voodoo card and hot damn, what a gaming monster that machine was!
Amazing times.
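For anyone wondering how the exact same silicon jumps from 300 to 450 MHz just by changing the bus setting: the Celeron 300A's multiplier was locked at 4.5x, so the core clock is simply FSB times multiplier. A quick sketch:

```c
#include <stdio.h>

int main(void) {
    /* Locked 4.5x multiplier on the Celeron 300A; core clock = FSB x multiplier. */
    const double multiplier = 4.5;
    const double fsb_stock  = 66.67;   /* MHz, official bus speed */
    const double fsb_oc     = 100.0;   /* MHz, the overclocked bus */

    printf("stock:       4.5 x %.2f MHz = %.0f MHz\n", fsb_stock, multiplier * fsb_stock);
    printf("overclocked: 4.5 x %.2f MHz = %.0f MHz\n", fsb_oc, multiplier * fsb_oc);
    /* Everything hanging off the bus (RAM, and PCI/AGP via their dividers)
     * speeds up too, which is why an earlier comment mentions needing a board
     * and RAM able to run at 100 MHz. */
    return 0;
}
```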
HOLY SHIT, are you telling me there was a chance that Jensen Huang would have managed AMD and Nvidia?
OMG, can you imagine that? GPUs would cost 5 times more and CPUs would cost double.
But then maybe Intel would have bought ATI...
@@cv990a4 Arc demonstrates that wouldn't have worked.
@@cv990a4 Why would Intel do that? Only AMD needs to buy itself in and sell Nintendo gear!
Intel needs proprietary development, not China stealing it! Do it yourself, own x86 and Arc!
And Intel would have been happy to double the price of their processors just because they could.
Thank you sir for the amazing videos you've created.
Witch King of Angmar: No man can kill me.
Éowyn: I am no man.
----
AMD founder Jerry Sanders: Real men have fabs.
Lisa Su: I am no man.
6:35 The Slot 1 Celeron was popular because it overclocked very well.
This channel blows my mind
I was with AMD in 1999/2000, and I loved the spirit working in Austin and Dresden.
What a blast from the past. I built PCs with those things back in the day. I definitely owned at least a K6-2 and Athlon
Same
AMD stock is $140 right now. I find it mindboggling that only 8 years ago in 2015 it was worth like $2 oO
Much nostalgia for the chips featured in this video. Still blows my mind that AMD shipped Athlons with exposed die and users could damage it when applying heatsink lol
You missed/misrepresented some things I think...
One of the main reasons the second generation of Celeron CPUs was better was that they had 128k of on-chip L2 cache, which ran at the full CPU frequency, as opposed to none on the first generation and the 512k off-chip L2 cache of the P2, which ran at half the frequency. So equivalently clocked Celerons were actually faster in some workloads (particularly gaming) than P2s (and much cheaper). Celerons also overclocked well, since they were made on the same process, so the 300 MHz Celeron 300A (A was the cached version) was known for being able to run at 450 MHz. The cache was actually a key reason the second Celeron generation was successful, not the clock speed increase. Initially there were only the 300A and 333 (compared to the 266 & 300 of the cacheless parts).
The Pentium 4 NetBurst architecture was generally considered a flop. Intel went for a very deep pipeline to try to get high clock speeds, but sacrificed IPC. This didn't work out so well in practice. This partly opened the door for AMD to take the performance crown with the Athlon. It would probably have been worth mentioning for better context.
At around the same time as the move towards 64-bit computing occurred, Intel created a completely new architecture (IA-64, which actually originated at HP) and its Itanium processor (colloquially known as the Itanic, due to its colossal failure), primarily for servers. These were never consumer CPUs though. IA-64 was a "very long instruction word" style architecture, which relied on compilers to order instructions efficiently, but these were difficult to write and never really materialised, resulting in generally poor performance. In addition, as it was not compatible with x86, x86 code required emulation, further killing performance. The architectural difference is rather important.
On the other hand, AMD extended the original 32-bit x86 architecture (later name IA-32), to the currently used x86-64 (aka. amd64), which maintained backwards compatibility (not requiring emulation). So the first consumer 64-bit CPU was the Athlon64. Intel later followed suit and also stopped producing the Itanic.
After NetBurst, Intel went back to iterating on the P6 architecture (of the Pentium ii & iii), under the "Core" branding, which performed much better.
So missteps from both companies were involved in the switch of performance leadership. The video doesn't really reflect this... (Actually, the situation today feels somewhat akin to the P4/Athlon days, from the consumer product side).
Also, you mentioned that the lower clock speed on Barcelona CPUs was due to the cache bug (I guess specifically the TLB bug. Afaict this wasn't a problem "translating between its caches". Note the Translation Lookaside Buffer (TLB) is a very specific cache used by the MMU, and the bug occurs on page entry modification - perhaps that's where this description came from?). The actual bug is a kind of cache corruption issue, where the same cache line might end up residing in both the L2 & L3 cache, even though the architecture is such that those are exclusive caches (eg. another core might access the data from L3, even though it "also" lives, potentially modified, in a different core's L2).
In the video there's also the following statement: "That software workaround compromised performance so Barcelona failed to hit the clock speeds AMD promised..." This doesn't make any sense. It may well be that those chips had both the bug *and* were not able to hit the expected clock speeds, but the lower clock speeds would not be due to the bug (which may itself lower performance, by effectively lowering IPC).
In years 1995 to 2002, I suggested AMD computers to gamers and Intel computers to engineers.
AMD traditionally had better in-processor networks, and Intel had better raw calculation capabilities.
And I recommended ATI Radeon video cards for both Intel and AMD builds; a Sound Blaster AWE32 or similar made for a great all-rounder PC...
AMD for gaming; engineers need the same system as everyone else on the project, so why use less compatible systems?
EU laws, muhahahahaha, a legal clone of the IBM BIOS, replace the Intel with AMD in Dresden...
For a GAMING PC only; the Sound Blaster was only needed for games! There were far better cards on the market, you only needed compatibility in GAMES!
You understand it now, why engineers need Intel: if one of them does the calculations on AMD, it's not compatible!
@@lucasrem nah, not really
I still have a box with Pentium 100 CPUs, K5s and K6s, plus Slot A and Slot 1 processors and motherboards. I bet they still work, to be honest. I'm holding onto them as a collector/hoarder type of thing, but they are all in one box marked "Old computer parts".
Then of course the Athlon came and performance was way better than Intel's for a while, until the Core 2 Duo arrived; since then and until recently it's been AMD playing catch-up.
8:00 Fun fact: in the upper right corner, the L1 bridges and those dots were the pencil trick. With a #2 pencil you could unlock the multiplier; all 4 bridges needed to be connected. I did this with many Athlons :)
I own 2 identical HP PIII systems, one on Slot and one on Socket, the same PIII chip in both.
Why did they do this? AMD made a second-level cache daughterboard for it, a better solution.
Include the memory controller and both caches!
Yep. I remember the pencil trick.
Having bought an AMD in mid-2008, I love having kept my AMD bae alive to buy another in 2022 ❤️
I never had one, but I remember from my computer shop job in the 90s the K5 causing divide-by-zero errors in runtime DOS programs; its cache memory was blazing fast. I had to use a program that caused a slowdown by saturating the CPU with NOPs.
SIS chipsets did a very bad job on the AMD socket chips
Nvidia bought them, nForce chipsets.
You mixed it up; the coprocessor divide bug was the Pentium 60/66 bug.
@@lucasrem Integer Divide By Zero was a speed bug running older DOS programs on modern fast CPUs. Most developers never even considered the exponential speed increases coming from Socket 7 CPUs.
Sure, integer speed was great on the old K5s, but that floating point... my God! Terrible. I paid a shop for a Pentium 133 and they pulled a fast one and gave me an AMD PR-133 (K5)... which only ran at 100MHz. I called them, and they claimed they'd done me a favour. "Faster than a true Intel P133," they said. Without mentioning it cost just half as much and was terrible at floating point. Given I'd only bought the thing to play Quake, I was seriously angry.
10 months later I went back to the store and stole a Pentium 200Mhz right from under the counter. Revenge!
@@TheVanillatech I managed to get a P200 fresh from the tray that had fallen off a truck... saved me €500,-
@@paulmichaelfreedman8334 Jesus, that exchange rate back then! When I stole mine (although I'm loath to use that term - that shop owner had STOLEN my money already with the switcheroo), the price was £200. Exactly £1/megahurt.
I'd never installed a CPU before, figured it out as I went along. Wasn't even sure if my motherboard would take it. But it did. Jumperless - set the clock speed in BIOS and boom... 200MHz.
Quake timerefresh - from the difficulty select screen - went from 7.7fps on the AMD PR-133 up to 21.8fps with the P200.
Still can't believe those scumbags at Falcon Computers would steal a child's money and then lie to their face on the phone when they complained! Pricks.
I grew up ten minutes from the ATi building in Markham; I saw that logo every day growing up. I remember when my cousin and I got the new 64MB AGP card from ATi. How far we've come...
4:00 Not sure if this was intentional or not, but note that the "Pentium" you show here on the left is not one of the Pentiums under discussion, but a much later low-end Intel Skylake chip from 2015 that re-uses the Pentium name for marketing purposes (well into the Core i# era)
Was thinking the same - the heatsink looked way too modern.
I’m literally reading how Eagle Pass crossing is one of two border closures at the exact time you mention Ruiz crossing the border into Eagle Pass. Timing
Is everyone free to enter eagle pass from Mexico?
18:04 just imagine if nvidia+AMD merged way back then (and Huang took over). Ruiz' days were ultimately numbered anyway and Huang is the much stronger CEO.
AMD also started slipping come 2006. While the ATI merger was strategic, their lackluster CPUs compared to Intel's Core 2, and then the long time it took to get quad-core chips out to the masses, were really their death knell; and then Intel integrated the FSB and any benefit AMD had was gone (it wasn't until several CEOs later, with Ryzen, that they finally competed again).
Nvidia also really wanted an x86 license and this would've solved it. And while Nvidia was blocked from buying ARM these days, I don't think they would've been blocked from merging with AMD given the economy and regulatory environment back then.
I think under Huang AMD would've had a much better chance all the way up to at least the Ryzen days - depends on how products would've performed.
If they had fused back then, Nvidia/AMD would be a $1.7 trillion company at least, and Intel would probably be on the edge of going out of business.
Thank you
There's a cool story I remember from the late-90's era, but can't Google up the details for. It was about the pricing of Celeron processors. Intel set a baseline price that was above what AMD was charging, and everyone knew Intel had brought out Celeron to compete with AMD, but they insisted on charging more. Then it wasn't selling, so the price came down after all, and the guy who'd said it wouldn't (Andy Grove?) resigned. It was a fun win for the upstart company at the time.
I had to play this video at 1.5x speed for it to sound normal.
I still have my K6 for old PC games. I can't believe it still works.
My Cyrix 120 is still working here too - DOS 6.22, Windows 3.11, no issues at all.
It's a branded IBM build, IBM on the CPU too, no Cyrix name on it. I always thought it was produced in Germany; now I know it was NYC.
Wow. My first pc with a GeForce. Dirt cheap. Life long counter strike and half life fan.
Itanium brings back memories. Itanic! It might have been able to run HP PA-RISC, but it had really, really bad IA-32 performance due to emulation and abysmal compilers. Native IA-64 was nearly as bad, because of the compilers.
AMD's x86-64, on the other hand, could run native or nearly native IA-32 code with minimal changes, plus an easy-to-access 64-bit mode with much better compilers, with all code far easier to adjust. No contest.
AMD saved x86 from Intel’s own shortsightedness. They were focusing on an Itanium based 64 bit future. The AMD64 extensions were a game changer and are still in use today.
Whilst Itanium might have been good for HP and Intel, no one else in the industry wanted it to be the successor to x86.
To be more specific, the Itanium optimization chain was to simply expose everything and leave micro-optimizations like pipelining and branch prediction to the compiler. The simplified hypothesis being, on one hand, that the compiler knows the bigger picture of an application, and while slower than hardware it only needs to optimize the binary once, as opposed to the CPU making fast hardware optimizations during every run.
This lack of hardware optimization also frees up substantial silicon area and power budget for more core logical processing units (which is largely why GPUs can do so much calculation per dollar for specific tasks).
This could actually work well, but as you stated, the compiler-optimizers of that time did not have the needed advances and Intel didn't bother to put that horse before the cart; secondly, it requires compiling for very specific models of hardware (less portable binaries, more prone to vendor lock-in).
End users also get attached to their legacy proprietary binary software, so even with a good co-optimizing compiler they can't just recompile to get the advantages - thus the marketing importance of good backward compatibility. (A rough sketch of the bundling idea is below.)
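To make that concrete, here is a tiny sketch in plain C rather than real IA-64 code (the values and the "slot" comments are made up for illustration): on a VLIW/EPIC design the compiler has to find independent operations and bundle them for parallel issue ahead of time, whereas an out-of-order x86 core rediscovers the same parallelism in hardware on every run.

```c
#include <stdio.h>

int main(void) {
    int a = 1, b = 2, c = 3, d = 4, e = 5, f = 6;

    /* Conceptual "bundle": the three operations below read and write
     * disjoint values, so a statically scheduled VLIW machine could issue
     * them in the same cycle -- but only if the compiler proves their
     * independence and packs them together ahead of time. */
    int x = a + b;   /* slot 0: integer ALU */
    int y = c * d;   /* slot 1: multiplier  */
    int z = e - f;   /* slot 2: integer ALU */

    /* A dependent operation has to wait for a later bundle. */
    int w = x + y + z;

    printf("%d %d %d %d\n", x, y, z, w);
    return 0;
}
```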
@@mytech6779 So pretty much the same downsides as RISC-based architectures. Clearly not what users want.
@@PainterVierax Itanium is RISC, not "pretty much". (Pedantry!)
But being RISC wasn't the core market failure. It was just a poorly supported implementation. ARM's various designs are RISC and do just fine.
Historically the market is much more complicated, but 80x86 being CISC was not the major factor in its market dominance. It was much more a matter of momentum, contracts, compatibility, and supply competition at a critical time in the microcomputer market.
@@mytech6779 ARM did a bit better because the embedded nature of the products allowed the OS to adapt to it. Performance and compatibility are a reason why Android uses Dalvik, and later ART, instead of a more classic binary packaging. That's also why specific distros like Armbian or Raspbian were formed on top of the Debian work, and why Arch or BSD are quite popular on ARM and RISC-V devices.
Add to this the huge library incompatibility with the vast x86 environment and you get why Apple took so long to make an ARM transition in their "desktop" environment (not iOS), despite a very secure market share.
Interestingly, AMD's original Athlon CPUs used the same bus as the Digital Equipment Corp Alpha 21264 (EV6). They licensed this from DEC. It's a DDR bus, and it's why they had much greater memory bandwidth in one of your slides on Intel vs AMD performance in the Athlon vs Pentium 3 era.
AMD has some great products. Currently using 5800X3D and 7900XTX GPU!
Currently watching this on a Framework Laptop 13 with AMD 7840U APU (Zen4 + RDNA3).
Their CPUs are great; their GPUs need a little work.
Great CPUs for mid/high-end gaming builds and mobile. But their desktop parts are actually lacking in application workloads, not that great in idle/light-task power consumption, and their low-end offering is not competitive at all. Same for high-end workstations: the new Threadripper is not the game changer it was before. Finally, their recent platforms seem very unfinished/unreliable, more so than typical AMD style.
@@PainterVierax is that you userbenchmark?
@@PainterVierax what a bunch of nothing
Imagine a Jim Keller interview on Asianometry
AMD and Nvidia merge, and are run by Jenson Huang, is a very interesting thought experiment
9:12 Always nice to make a wintertime stop in 🌵 Scottsdale, "The West's Most Western Town!" 🙄Yeehaw! 😉✌️😎
I have fond memories of a AMD K6-2 system from back in the 90s.
If you only play games, it's compatible !
Excellent content as always 👌
Wait... In a different timeline Mr Huang is a CEO of AMD as well?!?
Looking at a Pentium with a small number of pins, as opposed to the 1700 pins they have now, is pretty funny. It shows how little connectivity these earlier CPUs had.
Great video. Please do some more bio tech videos, they are my favorite.
The whole computer journey to now has been remarkable: as microscopes could see more, they made ever smaller circuits.
But now they've all reached a limit. With Intel, the last 3 generations of CPUs were actually the same CPU, give or take 10%, so no wonder upgraders stayed away - as with me, my i7-900k has done for ages.
"And so it goes" just after mentioning Dresden... Slaughterhouse 5 reference??
0:13 We're so back
Back then, I had the K6-2 at 400MHz with 3DNow!
Very happy with it, and it greatly outperformed more expensive Intels in games which fully supported it.
I remember having a good time with a K6-2 350 and 550 until RTCW; it was dog slow with a Kyro 2 on that one.
Wild to think how different the market would be today if AMD hadn't bought ATI or even more so if they had merged with Nvidia. Nvidia is larger and worth more than either Intel or AMD today on their own. If they had combined with AMD and Jensen got his hands on an x86 license then they may be bigger than Apple
Maybe nForce would have been better than it was.
They needed a GPU partner; what if they had bought Matrox? It would have been the same as now.
Cheaper to buy your way in than to do all the dev at home!
Are you also the youtuber History for GRANITE?
AMD brings back a lot of nostalgia from the late 90s through to the mid 00s. For me it's kind of weird hearing about AMD struggling back then, as basically no one but ill-informed novices would even consider Intel in the enthusiast space. Like 90% of gaming PCs - all my friends and local gamers in general - went the AMD Athlon XP, 64 and FX series route, plus mostly an Nvidia combo, with ATi being the better and more popular option in the early 00s due to the Nvidia FX gen being an absolute flop.
We all knew that P3 was too expensive and slower for gaming than the Athlon, and the 64/FX was just light years ahead of the cluster f**k that was the P4.
It wasn't until I think about 2006 or 2007, when Intel Core was compelling and offered an upgrade, that the AMD preference finally started to wane.
I suppose 99% of consumers back then had no idea about the anti-competitive shady stuff Intel was doing, not like the awareness people have now. We all assumed AMD was raking it in back in those glory days of the Athlon era.
I also remember the woes of the first Phenom, I bought one very early on and it was good but very much totally limited by clocks due to that design bug. The Phenom II was fantastic though, one of the best OCing AMD chips I ever had, but was too late to the market by that stage.
My main desktop is still an FX-8320; it's one of the best PCs I've had in terms of snappiness in general everyday use, even though it really lacks modern gaming and heavy number-crunching power, even when it was new. But it's way faster than the i7 2500K I tried for a while in that respect, much of a muchness in the other areas, and it's even snappier than the lower-end Ryzen laptops I currently use, which I did not expect. I am not sure why any of this is the case; I suspect something to do with latency through the pipeline being better than early Ryzen and Intel of the same gen? But I am no expert on those older architectures.
Maybe, one day I'll upgrade to that mid/high end Ryzen desktop I have been talking about for the past 5-6 years 😅
Great history lesson, I've always had a soft spot for AMD.
OMG! I remember Cyberpro in Norcross, GA, right off Jimmy Carter Blvd! There were a TON of little mom and pop computer stores in an industrial park there! They all pretty much had the same stock, but you could build a whole cheap machine just by going to those shops and price compare to get the parts from the cheapest vendors...
Oh, damn. I was hoping that guy staring at you was gonna pay off, like they were going to warn you the airbnb was haunted or something.
Pentium 4 taking the crown from Athlon Thunderbird? haha.
I beg to differ.
P4 was hot, big and expensive.
Pentium 3 Coppermine and Tualatin were much better as the time showed.
Best wishes.
I remember we were so broke during the financial crisis we had to sell the lawn at the Sunnyvale campus.
did _not_ expect Return of the King to get dragged in this video 😅
Ohhhh I had a K5. Wait, it was some form of IBM-badged thing. Maybe my memory is gone lol.
I kinda switched to AMD now... Loved Intel... But not anymore rly. Zen 4 is awesome
Only games.
If you all work on the same project, all the hardware needs to be compatible!
Why is that so hard to understand? He is not a coder, just a gamer?
AMD became the only viable option this year. Let's hope Intel will come back with Arrow Lake.
Watching them slide into irrelevancy into the 2010s was so sad. I was so happy when they released Ryzen, since they hadn't really been worth buying in close to a decade unless you had a really limited budget.
I was in Hsinchu when Intel lost their stranglehold on the Taiwanese motherboard manufacturers. My understanding was that Intel told the motherboard makers that if they made a high-end motherboard for AMD CPUs, Intel would stop selling Intel chipsets to that motherboard maker. Since Taiwan was producing most of the motherboards, this hurt AMD's desktop business. However, while I was in Taiwan on one of the projects I did there, Intel had delayed building a new fab and this left them short on product to deliver.
The result was that every motherboard maker immediately brought out high-performance AMD motherboards. This put Intel and AMD on a more equal basis in the desktop market.
That is my understanding of the desktop motherboard skirmish between AMD and Intel.
Back in 1999, I was writing code for biometrics, and using an Intel Pentium I was able to analyze more than 600 fingerprints per second. It did not work well with AMD or Cyrix. Don't know how that's going now with crazy fast processors.
Back in the Pentium II days, the L2 was not on-die; it was on the package, on those giant slotted cartridges. I forget what that slot was called.
Slots, muhahahaha.
We have a Celeron Slot 1 CPU, we know what it is.
AMD sold cache boards too, a level 2 cache expansion slot, that was that. NOT on die!!!!
21:15
Barselona?
Okeeh.
First PC I built was a K6-2.
Me too. Didn't build though. Full machine cost me $200 at Best Buy including monitor.
Can you do a video on Rapid?
You should make this a daily, any topic really. :D
The K5 wasn't as bad as this video suggests; it was the choice for budget systems at the time, as the performance was sound and the price was below the Intel parts.
One might tell the story as: AMD got to use an outdated fab without going bankrupt.
Hard to believe that a latino had a part in this story and was not a janitor 😊.
I would like to add that GlobalFoundries, TSMC, etc. are pure-play foundries. As such, if you were to include others like Samsung, which is an IDM and also takes orders from other players, GF is hardly in 3rd place.
I hated both AMD for their Athlon XP and Intel for their P3. As a former PC shop owner, selling CPUs with exposed silicon dies was probably the worst idea the industry could ever have. I've lost so much money, time and nerves fighting customers too dumb to install their cooler properly and then trying to claim warranty on cracked dies. Sure, it made them cheaper to fab, but that just shifted the cost to the last link in the chain.
Were you joking when you showed the gentleman dressed in his clean room suit with his pony tail hanging out?
Another great episode! Open, free and fair competition must be the bedrock of the industry. Truth, Justice, and the American Way 👍
You have to be kidding? STILL believing in that dream? :D
Man, you really just glided over the whole 64bit transition. Barely a mention.
To be fair, that subject probably deserves its own video.
The guy in the thumbnail looks surprisingly close to Stephen Wolfram.
One of the odd things about the K5, K6 and K6-2 is that they were RISC internally. They used microcode translation to emulate an x86 CPU.
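A rough sketch of that translation idea, assuming nothing about AMD's real decoder (the micro-op names and the decomposition below are purely illustrative): a single complex x86 instruction is broken into a few simple load/compute/store micro-ops that a RISC-style back end can schedule and execute.

```c
#include <stdio.h>

/* Illustrative only -- not AMD's actual decoder; all names are made up.
 * It just shows the idea: one complex x86 instruction such as
 * "ADD [mem], EAX" becomes a short sequence of simple RISC-like micro-ops. */

typedef enum { UOP_LOAD, UOP_ADD, UOP_STORE } uop_kind;

typedef struct {
    uop_kind    kind;
    const char *desc;
} micro_op;

/* Hypothetical decomposition of "ADD [mem], EAX". */
static const micro_op add_mem_eax[] = {
    { UOP_LOAD,  "tmp   <- load [mem]"  },
    { UOP_ADD,   "tmp   <- tmp + eax"   },
    { UOP_STORE, "[mem] <- store tmp"   },
};

int main(void) {
    size_t n = sizeof add_mem_eax / sizeof add_mem_eax[0];
    for (size_t i = 0; i < n; i++)
        printf("uop %zu: %s\n", i, add_mem_eax[i].desc);
    return 0;
}
```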
In developing countries it is still impossible to find AMD-powered laptops in retail electronics chains and local internet shops.
The ones available in small numbers are the premium models that cost over 1k, which very few people can afford.
That's why 1366x768 models with Celerons are still selling like hot cakes, since there is nothing else available in that segment.
Intel is still doing shady business and it's a shame that AMD is not suing them.
And that's why we refuse to give Intel and Ngreedia a penny.
Go AMD!!
Ngreedia 😂😂
Someone, somewhere, much smarter than me, has already combined Moore's law, the capital cost of new fabs, and the limits of the human user to calculate where the diminishing returns on all this capital madness will start.
Semiconductors will have a "Concorde/SST" moment sooner than we think.
When will you cover Itanium? It was a tremendous flop that took SGI and others down with it.