AMD: The Incredible Adventure Continues

  • Published 25 Dec 2024

Comments • 314

  • @jindrichlnenicka7214
    @jindrichlnenicka7214 ปีที่แล้ว +94

    Interestingly, the person who managed the copper interconnect project at IBM, which later gave the Athlon a considerable edge, was actually Dr. Lisa Su. Just imagine: the person who helped develop Athlon manufacturing (albeit indirectly) is the same one who later served as AMD's CEO during the time Zen was developed and put into production, crushing Intel in HPC.
    Also, before the Athlon and AMD64, Intel and Compaq had further development of the DEC Alpha cancelled (to ensure that Itanium wouldn't have to deal with RISC competition), disbanding the development team and leaving some of its technologies shelved. Some of the DEC engineers then ended up at AMD, developing the AMD64 architecture, the Athlon and the Opteron. One of those engineers was Jim Keller, who much later also participated in Zen development. AMD also licensed the EV6 bus from Alpha, using it in the Athlon.

    • @zyrobs
      @zyrobs ปีที่แล้ว +12

      Now that you think about it, the Athlon name may have a deeper meaning after all. It was supposed to come from the word decathlon, but with so much of DEC's work being used for it, it was almost a "DEC-Athlon" chip.

    • @stevebabiak6997
      @stevebabiak6997 ปีที่แล้ว +6

      And as I recall, Itanium seemed like a brain fart by Intel. The X86 instruction set getting extended to 64 bit by AMD made that all too obvious; instruction set compatibility was more important than Intel realized.

    • @zyrobs
      @zyrobs ปีที่แล้ว

      @@stevebabiak6997 Itanium was an interesting concept but required too much research to get it done, and by the time it came out x86 caught up in performance and marketshare.
      More importantly the mere announcement of Itanium made a lot of companies stop developing alternative ISAs (DEC Alpha, PA-RISC, etc), which drove a large chunk of the market to x86.
      So Itanium basically destroyed its own place on the market with its own announcement. Which is really quite an achievement!
      AMD64 was just a nail in the coffin; Itanium was already on life support paid for by HP from day one.

    • @imrevadasz1086
      @imrevadasz1086 10 หลายเดือนก่อน

      @@stevebabiak6997 Itanium inherently was a very "riscy" choice (pun intended), because it sits at a very extreme end of architecture choices, with its VLIW instruction set. AFAIK the only area where that style has been used for a long time is signal processing (i.e. where you have very specialized, optimized software), where simplifying the chip has huge relative gains in power usage and chip cost. VLIW basically means you have huge instructions that explicitly have fields for each specific ALU of the CPU, so the CPU only needs a very primitive scheduler and no out-of-order handling. In general-purpose computing, the difference just wasn't significant enough to warrant the effort to switch over. The huge instructions also complicate compiler development and increase memory usage significantly (besides the extra memory required for 64-bit addressing), and that's probably also why Intel always segregated Itanium to servers only. And then AMD64 killed off Itanium (and forced Intel to "cannibalize" Itanium with its own 64-bit x86 chips).
      Architecture-wise, Itanium probably was great for some types of scientific and supercomputing applications, but it just doesn't make any sense for e.g. a webserver.
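
      A minimal sketch of the bundle idea described above (a hypothetical encoding, not the real IA-64 format): each very wide instruction carries one slot per functional unit, so the compiler rather than the hardware decides what every ALU does each cycle.

          /* Hypothetical VLIW bundle layout, for illustration only. */
          #include <stdint.h>

          typedef struct {
              uint32_t int_alu0;  /* operation for the first integer ALU   */
              uint32_t int_alu1;  /* operation for the second integer ALU  */
              uint32_t fp_unit;   /* operation for the floating-point unit */
              uint32_t mem_unit;  /* load/store operation                  */
          } vliw_bundle;          /* one very wide instruction word */

          int main(void)
          {
              /* A cycle with nothing for the FP unit still encodes a NOP,
                 one reason VLIW code tends to be large. */
              vliw_bundle b = { .int_alu0 = 1, .int_alu1 = 2,
                                .fp_unit = 0 /* NOP */, .mem_unit = 3 };
              return (int)b.fp_unit;
          }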

    • @reiniermoreno1653
      @reiniermoreno1653 3 หลายเดือนก่อน +1

      I read somewhere that Lisa Su was also part of the Cell processor team, the chip that was later used in the PS3 and in servers, which map onto two of AMD's most important businesses today: graphics and data center. It's kind of curious how this person accumulated experience in all the key businesses AMD has now.

  • @andersjjensen
    @andersjjensen ปีที่แล้ว +127

    Can't wait to see the "5 years from now" video in this saga. It's not that I learn anything new, I just like Jon's take on it.

    • @Punisher9419
      @Punisher9419 ปีที่แล้ว +1

      AMD kicking Intel's ass?

    • @GewelReal
      @GewelReal ปีที่แล้ว +2

      ​@@Punisher9419AMD putting an alarm clock to Intel's head

    • @johndoh5182
      @johndoh5182 ปีที่แล้ว +2

      Forget the other statements about AMD kicking Intel's ass, because Intel will move to TSMC before that happens and would probably push AMD aside a little at TSMC, as they already have by buying up the remaining run of TSMC N3 for this year and, I think, into next, which keeps AMD from being able to use N3 for most of the product lines coming out next year.
      So, a future look? How much do you know about CPUs?
      AMD will have to redo their current MCM architecture and probably move to something like what Intel is bringing to laptops in 2024 (Meteor Lake, a tiled MCM architecture where one chiplet butts up against the next with direct connections between them). Many insiders feel this will be better than AMD's Infinity Fabric, where processor cores have to send data to and from a separate die, the IOD, which has a multiplexer for routing data to where it's supposed to go; that design was fast enough for Zen 2/3 but is giving them problems with Zen 4. So AMD will probably move to a direct-connect MCM architecture just like Intel, and I believe they have the rights to do so, as I think that interconnect was licensed by several companies so CPUs could be built from a wide variety of chiplets or tiles.
      So, imagine a world where a CPU can be made with lots of different chips from different companies. That's the next 5 years. CPUs will have hardware accelerators and AI cores similar to what AMD is already putting in some laptop products. You could have the graphics processor be a separate die, etc. You could even have a chiplet/tile that is simply cache, similar to how AMD is stacking L3 cache on their X3D parts, but with no need to stack it any more.
      Yeah, that's the next 5 years, it's going to be exciting, and both Intel and AMD are ready for it.

    • @andersjjensen
      @andersjjensen ปีที่แล้ว +1

      @@johndoh5182 Sorry, but I didn't bother reading that blurb, as you started out with blatant speculation that nobody but TSMC, AMD and Intel knows anything about.

    • @234dB
      @234dB ปีที่แล้ว

      Sure i read the memoirs as well

  • @bbbl67
    @bbbl67 ปีที่แล้ว +252

    Ah, a stroll down the nightmare memory lane. It was such an adventure watching AMD's struggles against Intel, and now they've finally caught and surpassed them.

    • @mddunlap03
      @mddunlap03 ปีที่แล้ว +8

      And we will see if Intel can catch back up, or whether they were only strong from first-mover power.

    • @PainterVierax
      @PainterVierax ปีที่แล้ว +1

      @@mddunlap03 Intel might be a bit behind these days, but they still have some design strengths, an excellent software team, and quite a solid reputation they have built up. That's why, despite very good products, AMD has difficulty pushing into a lot of markets; this time it is not because of coercion. Also, Intel has deep pockets and covers a far wider range of fields than AMD, so they can easily manage losing a few battles here and there.

    • @LatitudeSky
      @LatitudeSky ปีที่แล้ว

      @@mddunlap03 Intel has built a LOT of their recent success on resources that may not produce the huge advance Intel needs this time.

    • @WooShell
      @WooShell ปีที่แล้ว +14

      I wouldn't exactly call 35% market share "surpassed"..

    • @mitchjames9350
      @mitchjames9350 ปีที่แล้ว +15

      They surpassed them in the 2000s but squandered it with their Piledriver and Bulldozer architectures.

  • @grtitann7425
    @grtitann7425 ปีที่แล้ว +79

    You are the only channel with the cojones to call out Intel's illegal actions!
    Bravo!

    • @CMSonYT
      @CMSonYT ปีที่แล้ว +1

      Slap. On. Wrist.

    • @TheVanillatech
      @TheVanillatech ปีที่แล้ว +16

      AdoredTV spent years destroying Intel and Nvidias nefarious business practices and anti-consumer BS.

    • @grtitann7425
      @grtitann7425 ปีที่แล้ว

      @@TheVanillatech true.

    • @theHerathrig
      @theHerathrig ปีที่แล้ว +3

      Cold Fusion did a video about Intels shenanigans against AMD.

    • @deus_ex_machina_
      @deus_ex_machina_ 5 หลายเดือนก่อน +1

      @@TheVanillatech Is Jim still around? It seems like he's taking a smaller role after expanding into a website and podcast.
      I remember watching his masterpieces like _Nanomatters_ and _Path Tracing_ years ago.

  • @joelcorley3478
    @joelcorley3478 ปีที่แล้ว +6

    I actually worked at TI in the 1990s (1995 & 96, I think) on a Pentium-class clone processor. Internally it was called the Amazon project.
    TI had effectively stolen Cyrix's CPUs by signing a manufacturing deal with Cyrix that allowed TI to sell the chip directly to OEMs and to undercut Cyrix. TI pursued the Amazon project for a few brief years in the hopes of coming up with their own successor chip, but we were several months behind schedule when we began getting silicon, and TI's bean counters decided to can the entire project and sell off the IP rather than try to be an also-ran in the x86 CPU business.
    The Amazon chip had several innovative design features that would have allowed it to run rings around a typical Pentium processor, but it lacked out-of-order execution, which was the next new thing with the up-and-coming Pentium Pro. Apparently TI management didn't want to spend the resources to go chasing that target.
    Awesome video. Brings back lots of memories.

  • @mattbland2380
    @mattbland2380 ปีที่แล้ว +94

    I remember in the late '90s, around 98/99 if I recall correctly, when the Hammer architecture was first unveiled, even before the chips came out. I downloaded the PDF and read all about the 64-bit extensions to the x86 architecture. The Clawhammer, Sledgehammer and Jackhammer names were bandied around, for what became Thunderbird and Athlon a few years later. The proposed changes were shared with the world years before the first chip using them debuted. The 64-bit extensions were embraced by Intel without any acknowledgement that they were invented by AMD.
    When AMD reached 1 GHz before Intel, everyone became AMD fanboys overnight, especially with how easy they were to overclock.

    • @andersjjensen
      @andersjjensen ปีที่แล้ว +26

      And this is one of the few times Microsoft actually did the right thing: they were ON it like a whiz. They had caught wind that Intel would partner with HP and their HPUX as the server operating system of choice for Itanium. Microsoft didn't like that one bit, and put all hands on deck to make a good x86_64 compiler and port Windows to the new instruction set extension.

    • @ronch550
      @ronch550 ปีที่แล้ว +10

      Wait, Sledgehammer is the codename for K8, as well as other 'hammers'. Athlon and Thunderbird were K7.

    • @lucasrem
      @lucasrem ปีที่แล้ว

      AMD64 was forced by law, intel HP, MS itanium project.

    • @lucasrem
      @lucasrem ปีที่แล้ว +3

      @@ronch550 A lot of what he says is not true.

    • @theexplosionist2019
      @theexplosionist2019 ปีที่แล้ว +4

      No. The Athlon K7 was the first major effort in 1999 and was still 32-bit. Thunderbird was the name for later Athlon models reaching over 1 GHz.
      Sledgehammer and Clawhammer were the Athlon 64 and had no competition, because the Pentium 4 was shit and only clueless idiots got those.

  • @coraltown1
    @coraltown1 ปีที่แล้ว +33

    Among Intel's big flops was the new Willamette core of 2000 (a HUGE project I worked on). It was an ultra-complicated, super-pipelined (12-stage) generation that required even more stages (18) for the Prescott follow-up, thus dooming the series to cancellation, as they could not tame the Prescott beast as it fell behind schedule. In hindsight it exposed a major design/tech-path screw-up by top management. The 'right hand turn' to multi-core P6s helped retrieve the situation... for a while.

    • @pizzablender
      @pizzablender ปีที่แล้ว +3

      That was a strange time yes.. the Pentium 4 line running dead. And the Pentium M / Core Duo taking over.

    • @zyrobs
      @zyrobs ปีที่แล้ว

      Do you know anything about the Prescott successor, Tejas and Jayhawk? Beyond what is known on the usual wikipedia etc.

  • @lordr1800
    @lordr1800 ปีที่แล้ว +6

    such a prolific researcher you've come full circle to an older essay. i actually remember that particular one because i subscribed and looked forward to more of your work.
    Thank you for documenting this history for all our benefit. 😊

  • @-TheLynx-
    @-TheLynx- ปีที่แล้ว +32

    I love these historical flashbacks about these massive companies. I've learned a lot about AMD, especially its early beginnings, from this current trilogy!
    Will you make a fourth one covering where this video left off until the present day?

  • @LatitudeSky
    @LatitudeSky ปีที่แล้ว +55

    One reason the Abu Dhabi investment looked at AMD was Intel getting Core and other innovations out of the Intel branch in Israel. The two CPU giants are proxy warriors in a very different kind of fight.

    • @freddy4603
      @freddy4603 ปีที่แล้ว +9

      Holy shit

    • @beeman4266
      @beeman4266 ปีที่แล้ว +15

      That doesn't surprise me. There's always more going on behind the scenes.

    • @HeroDai2448
      @HeroDai2448 11 หลายเดือนก่อน

      fuck intel

    • @francishallare204
      @francishallare204 10 หลายเดือนก่อน

      Intel Haifa I believe was responsible for Centrino and the Core Series CPUs. ​@@beeman4266

  • @peterheynmoller2581
    @peterheynmoller2581 ปีที่แล้ว +1

    I really love your video analysis and your newsletter, I am sincerely impressed by your work as it has the right blend of thorough analysis and the right dose of sprinkled in humor! Keep it going, my guy

  • @samiraperi467
    @samiraperi467 ปีที่แล้ว +14

    "Intel was capable of destroying HP", and instead Fiorina largely did it herself.

  • @vicv9503
    @vicv9503 ปีที่แล้ว +3

    Ohhh, the K6-2, my first PC. I learned a lot on that PC because my motherboard had a sketchy L2 cache and gave a lot of BSODs, LOL... and the Thunderbird Athlons... those were good and proud times. So proud of AMD's achievements! Keep going!!!

    • @capoman1
      @capoman1 6 หลายเดือนก่อน

      Same for me. Paid $200 at Best Buy including monitor.

  • @Reachforitify
    @Reachforitify ปีที่แล้ว +6

    My very first windows PC was powered by a K6-2 and it introduced me to the internet, PC gaming and then I handed it off to my friend and it did the same for him. Fond memories all thanks to AMD bringing the price down for the everyday person.

    • @capoman1
      @capoman1 6 หลายเดือนก่อน +1

      Me too, $200 at Best Buy. Lan parties with Counter Strike on a GeForce 1.

  • @JoseLopez-hp5oo
    @JoseLopez-hp5oo ปีที่แล้ว +3

    11:19 - You have to appreciate how much the prices of computer hardware have decreased over the years. Inflation-adjusted, those CPUs cost around 1.5k to 2k in today's money, and that is only the CPU.
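
    A rough sketch of the inflation adjustment implied above, using purely hypothetical figures (a ~$900 launch price and ~2.5% average annual inflation, not numbers from the video):

        /* Compound-inflation adjustment: price_today = price_then * (1 + r)^years. */
        #include <math.h>
        #include <stdio.h>

        int main(void)
        {
            double price_then  = 900.0;   /* hypothetical late-'90s launch price */
            double annual_rate = 0.025;   /* assumed average inflation            */
            int years = 25;

            printf("~$%.0f in today's money\n",
                   price_then * pow(1.0 + annual_rate, years));
            return 0;
        }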

    • @capoman1
      @capoman1 6 หลายเดือนก่อน

      Remember my buddy's dad paid $3k for an HP or Compaq machine with pentium 1 in the 90s.

  • @wbjulio
    @wbjulio ปีที่แล้ว +3

    This is great content. The late '90s and early '00s were a great time for competition in desktop processors, coming along with the discovery of what graphics cards could (and would) do. Thanks for your work.

  • @jaredkennedy6576
    @jaredkennedy6576 ปีที่แล้ว +4

    I built myself computers in 99, 04, and 06 or 07, and used AMD for all three. Those were the last new computers I've had, and finding used/refurbished AMD stuff is not easy. I'll likely build another in a year or two, and it'll be back to AMD.

  • @lee6741
    @lee6741 ปีที่แล้ว +5

    I bought an Athlon 64 and built lots of Windows apps with it. Now, knowing its history, I'm even more proud that I had that chip. Thanks for the history lesson!

  • @davidgunther8428
    @davidgunther8428 ปีที่แล้ว +8

    These videos are great. I had a gap in my knowledge for the 2002-2010 period that this filled in nicely!

  • @alexlefevre3555
    @alexlefevre3555 ปีที่แล้ว +1

    I will always have a deep love for my Opteron 165. Using 1A-Cooling (a German company) watercooling parts in 2005, 3.2/3.3GHz+ was my daily driver frequency. The waterblock itself was a heavy chunk of metal. The contact point/cold plate was two strips of copper and one of brass in the middle magically joined seamlessly. It mounted with a single thumbscrew dead in the middle to provide pressure directly down on the silicon which would cause the block to be sometimes comically askew with the torque from the tubing with no reduction in performance. The whole block was seemingly one unbreakable monolith, and I always wondered what it looked like inside.
    4x 1GB DDR sticks, GeForce 6800GT, NB and GPU were also waterblocked (wildly thin blocks, same company, same single point of pressure mounting) with a super thick 240MM radiator. Those were the days... The Eheim pump was 110V and required the case to have an AC plug routed out the back.
    Thanks for the walk down memory lane here :)

  • @jasonosunkoya
    @jasonosunkoya 10 หลายเดือนก่อน +1

    Ahhh this brings back memories as a kid overclocking AMDs finding them sooo much better

  • @RingoBuns
    @RingoBuns ปีที่แล้ว +1

    Oh I really want that cloudy Vaio desktop background. What a look

  • @covert0overt_810
    @covert0overt_810 ปีที่แล้ว +7

    the Athlon64 era was truly golden. those were the real good ol days… the PowerMac G5 and Athlon64 were in my house

    • @TheVanillatech
      @TheVanillatech ปีที่แล้ว

      The A64 was incredible, in terms of price AND performance. It was a bit confusing with the different socket types; going 754 locked you down to a 3.4 GHz maximum, but those were much harder to find and more expensive than the 939s. I spent a fortune on a DFI LanParty 754 board and regretted it pretty soon after. But hey... Core 2 Duo was around the corner, so it was a no-brainer what to buy next.

    • @DLTX1007
      @DLTX1007 17 วันที่ผ่านมา

      @@TheVanillatech I was roughly 9 back then but I remembered 754 being the budget option and it was only single channel DDR?

  • @JonahTsai
    @JonahTsai ปีที่แล้ว +7

    I remember the time my small amount of AMD stock was worth practically nothing. it was in spitting distance of being de-listed. I.... still have it though. ;-)

  • @Ace1000ks
    @Ace1000ks ปีที่แล้ว +1

    I had the first Athlon XP based computer back in 2001; there were some problems with it, as the chipset wasn't very stable. I did have AMD products before this, like an AMD K6-2 350 MHz and an AMD K6-III 450 MHz, but those didn't offer the performance of a Pentium 2 or 3, so I changed to a Pentium 2 and a Pentium 3 in 1998 and 1999. I tried an Athlon XP, but it wasn't very stable and crashed a lot for some reason. Then I changed to a Pentium 4 1.8 GHz.
    The last AMD product I used was in 2009, an AMD Phenom II 945 Deneb, and I then switched to an Intel i7-960. After that I just used Intel products. A lot of people are saying AMD stuff is better now, and when I have to get a new computer next time it will be an AMD product.

  • @georgehilty3561
    @georgehilty3561 ปีที่แล้ว +4

    I had an Athlon XP 1900+ in the early 2000s, and my dad had a 2 GHz P4. My Athlon ran circles around his P4; it wasn't even close. My AMD was more prone to overheating, but it was definitely a superior CPU.

  • @pizzablender
    @pizzablender ปีที่แล้ว +1

    My PC history was a Pentium MMX (fine), a K6-2 upgrade (fantastic), and an AMD K8 / Athlon 64 (great speed and efficiency).
    Then an Intel i7 2600K that never made me happy for some reason. Problems with suspend etc.
    Last year I bought a not-so-leading-edge Ryzen 7 5800g. Laptop chip in desktop package. Love it again.

  • @arthurbrax6561
    @arthurbrax6561 ปีที่แล้ว +2

    15:27 Intel warned her of destroying HP to which Fiorina replied that she would be the one to destroy HP

  • @darkfoxfurre
    @darkfoxfurre ปีที่แล้ว +3

    I remember having a K6, and then an Athlon XP Pro. They were pretty solid chips, and I got some pretty good gaming in with them.

  • @RC534
    @RC534 ปีที่แล้ว +4

    Oh, I remember the original Celeron 300A very well. It was the Intel chip I could afford as a teenager after being disenchanted by the Cyrix 6x86's lackluster gaming performance. And of course I made sure to get a motherboard and RAM able to run at a 100 MHz bus speed. After some initial hesitation I switched it from the default 66 MHz bus speed to 100 MHz, and yes, it ran effortlessly at 450 MHz with the stock cooler. This was more or less a public secret: the lack of L2 cache was what made the chip able to run so well at higher clock speeds.

    • @michal0g
      @michal0g ปีที่แล้ว +2

      The 300A had 128k L2 cache. It wasn't the lack of cache, but the fact that it was on-die that made it overclockable.

    • @RC534
      @RC534 ปีที่แล้ว +1

      @@michal0g it was a long time ago so my memory was a bit hazy on that, but yes indeed! With the regular P2's you had these L2 chips on the riser board of the slot processors that were the form factor they came in... it's all starting to come back now 😁

  • @zyrobs
    @zyrobs ปีที่แล้ว +2

    One of my old PCs had a K6-2. Even at 500MHz, a Celeron 300 was over twice as fast. I guess the floating point unit in them was just that bad.
    Also the Celeron 300A (the 2nd version of the 300MHz chip, based on Mendocino with on-die L2 cache) was extremely popular and could overclock to over twice its clockspeed on air. It was a beast.

    • @MayzMatthiasYzebaert
      @MayzMatthiasYzebaert ปีที่แล้ว +1

      Was looking for this comment.
      It was almost like magic, overclocking a 300 MHz / 66 MHz-bus Celeron into a 450 MHz / 100 MHz-bus chip.
      As a kid, I felt I had tricked the system. At the time, such a Celeron cost less than €100; a real 450 MHz Intel chip was more than €600.
      Add a 3dfx Voodoo card and hot damn, what a gaming monster of a machine it was!
      Amazing times.
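
      The arithmetic behind that overclock, as a minimal sketch (it assumes the commonly cited locked 4.5x multiplier on the 300A): the core clock is just the front-side bus times the multiplier, so raising the bus from 66 MHz to 100 MHz takes the chip from roughly 300 MHz to 450 MHz.

          /* Core clock = FSB * locked multiplier (illustration only). */
          #include <stdio.h>

          int main(void)
          {
              const double multiplier = 4.5;             /* locked on the Celeron 300A */
              const double fsb_mhz[]  = { 66.0, 100.0 }; /* stock and overclocked bus  */

              for (int i = 0; i < 2; i++)
                  printf("FSB %.0f MHz -> core ~%.0f MHz\n",
                         fsb_mhz[i], fsb_mhz[i] * multiplier);
              return 0;
          }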

  • @OAlexisSamaO
    @OAlexisSamaO ปีที่แล้ว +9

    HOLY SHIT, are you telling me there was a chance that Jensen Huang would have ended up managing both AMD and Nvidia?
    OMG, can you imagine that? GPUs would cost 5 times more and CPUs would cost double.

    • @cv990a4
      @cv990a4 ปีที่แล้ว +1

      But then maybe Intel would have bought ATI...

    • @brodriguez11000
      @brodriguez11000 ปีที่แล้ว +1

      @@cv990a4 Arc demonstrates that wouldn't have worked.

    • @lucasrem
      @lucasrem ปีที่แล้ว

      @@cv990a4 Why would Intel do that? Only AMD needs to buy themselves in, selling Nintendo gear!
      Intel needs proprietary development, not China stealing it! Do it yourself, own x86 and Arc!

    • @nitehawk86
      @nitehawk86 ปีที่แล้ว

      And Intel would have been happy to double the price of their processors just because they could.

  • @Ramirez83786
    @Ramirez83786 ปีที่แล้ว +1

    Thank you sir for the amazing videos you've created.

  • @tnfshbest
    @tnfshbest ปีที่แล้ว +2

    Witch King of Angmar: No man can kill me.
    Éowyn: I am no man.
    ----
    AMD founder Jerry Sanders: Real men have fabs.
    Lisa Su: I am no man.

  • @0MoTheG
    @0MoTheG ปีที่แล้ว +1

    6:35 The Slot 1 Celeron was popular because it overclocked very well.

  • @landrec2
    @landrec2 ปีที่แล้ว +1

    This channel blows my mind

  • @calvinhobbes1617
    @calvinhobbes1617 ปีที่แล้ว

    I was with AMD in 1999/2000, and I loved the spirit working in Austin and Dresden.

  • @davec8921
    @davec8921 ปีที่แล้ว +3

    What a blast from the past. I built PCs with those things back in the day. I definitely owned at least a K6-2 and Athlon

    • @capoman1
      @capoman1 6 หลายเดือนก่อน

      Same

  • @whuzzzup
    @whuzzzup ปีที่แล้ว +2

    AMD stock is $140 right now. I find it mindboggling that only 8 years ago in 2015 it was worth like $2 oO

  • @floodo1
    @floodo1 ปีที่แล้ว +2

    Much nostalgia for the chips featured in this video. Still blows my mind that AMD shipped Athlons with exposed die and users could damage it when applying heatsink lol

  • @michal0g
    @michal0g ปีที่แล้ว +2

    You missed/misrepresented some things, I think...
    One of the main reasons the second generation of Celeron CPUs was better was that they had 128K of on-chip L2 cache running at the full CPU frequency, as opposed to none on the first generation and the 512K of off-chip L2 cache on the P2, which ran at half the frequency. So equivalently clocked Celerons were actually faster in some workloads (particularly gaming) than P2s, and much cheaper. Celerons also overclocked well, since they were made on the same process, so the 300 MHz Celeron 300A (A was the cached version) was known for being able to run at 450 MHz. The cache was actually a key reason the second Celeron generation was successful, not the clock speed increase. Initially there were only the 300A and the 333 (compared to the 266 & 300 of the cacheless parts).
    The Pentium 4 NetBurst architecture was generally considered a flop. Intel went for a very deep pipeline to chase high clock speeds, sacrificing IPC, and this didn't work out so well in practice. That partly opened the door for AMD to take the performance crown with the Athlon. It would probably have been worth mentioning for better context.
    At around the same time as the move towards 64-bit computing occurred, Intel created a completely new architecture (IA-64, which actually originated at HP) and its Itanium processor (colloquially known as the Itanic, due to its colossal failure), primarily for servers. These were never consumer CPUs, though. IA-64 was a "very long instruction word" architecture, which relied on compilers to order instructions efficiently, but such compilers were difficult to write and never really materialised, resulting in generally poor performance. In addition, as it was not compatible with x86, x86 code required emulation, further killing performance. The architectural difference is rather important.
    On the other hand, AMD extended the original 32-bit x86 architecture (later named IA-32) to the currently used x86-64 (aka amd64), which maintained backwards compatibility and required no emulation. So the first consumer 64-bit CPU was the Athlon 64. Intel later followed suit and also stopped producing the Itanic.
    After NetBurst, Intel went back to iterating on the P6 architecture (of the Pentium II & III), under the "Core" branding, which performed much better.
    So missteps from both companies were involved in the switch of performance leadership. The video doesn't really reflect this... (Actually, the situation today feels somewhat akin to the P4/Athlon days, on the consumer product side.)
    Also, you mentioned that the lower clock speed on Barcelona CPUs was due to the cache bug (I guess specifically the TLB bug). As far as I can tell this wasn't a problem "translating between its caches": the Translation Lookaside Buffer (TLB) is a very specific cache used by the MMU, and the bug occurs on page entry modification; perhaps that's where this description came from. The actual bug is a kind of cache corruption issue, where the same cache line might end up residing in both the L2 and L3 caches, even though the architecture is such that those are exclusive caches (e.g. another core might access the data from L3, even though it "also" lives, potentially modified, in a different core's L2).
    In the video there's also the following statement: "That software workaround compromised performance so Barcelona failed to hit the clock speeds AMD promised..." This doesn't make any sense. It may well be that those chips had both the bug *and* were unable to hit the expected clock speeds, but the lower clock speeds would not be due to the bug (which may itself lower performance, by effectively lowering IPC).
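
    A minimal sketch of the exclusivity invariant described above (purely illustrative, not AMD's actual implementation): in an exclusive L2/L3 hierarchy a line should live in at most one level, and the erratum amounts to a case where it can be found in both at once.

        /* Toy model: moving a line from L2 to L3 must evict it from L2. */
        #include <stdbool.h>
        #include <stdio.h>

        #define LINES 8

        static bool in_l2[LINES];
        static bool in_l3[LINES];

        static void evict_l2_to_l3(int line, bool buggy)
        {
            in_l3[line] = true;
            if (!buggy)
                in_l2[line] = false;  /* correct exclusive behaviour */
            /* buggy path: the line now resides in both L2 and L3 */
        }

        int main(void)
        {
            in_l2[3] = true;
            evict_l2_to_l3(3, true);  /* model the erratum */
            if (in_l2[3] && in_l3[3])
                printf("line 3 duplicated across L2 and L3: exclusivity broken\n");
            return 0;
        }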

  • @rayoflight62
    @rayoflight62 ปีที่แล้ว +5

    In the years 1995 to 2002, I recommended AMD computers to gamers and Intel computers to engineers.
    AMD traditionally had better in-processor interconnects, and Intel had better raw calculation capabilities.
    And I recommended ATI Radeon video cards for both Intel and AMD builds; a Sound Blaster AWE32 or similar made for a great all-rounder PC...

    • @lucasrem
      @lucasrem ปีที่แล้ว

      rayoflight62
      AMD for gaming; engineers need the same system, so why use less compatible systems!
      EU laws, muhahahahahaha, a legal clone of the IBM BIOS, replace the Intel with AMD in Dresden...
      GAMING PC only; the SoundBlaster was only needed for games! There were way better cards on the market; you only needed ones compatible in GAMES!
      You understand it now, why engineers need Intel: if one of them does the calculations on AMD, it's not compatible!

    • @jayyydizzzle
      @jayyydizzzle ปีที่แล้ว

      @@lucasrem Nah, not really.

  • @kindnuguz
    @kindnuguz ปีที่แล้ว +2

    I still have a box with Pentium 100 CPUs, K5s and K6s, and also Slot A and Slot 1 processors and motherboards; I bet they still work, to be honest. I'm holding onto them as a collector/hoarder type of thing, but they are all in one box marked "Old computer parts".
    Then of course the Athlon came, and performance was way better than Intel for a while, until Core 2 Duo arrived; since then and until recently it's been AMD playing catch-up.
    8:00 Fun fact: in the upper right corner, the L1 bridges and those dots were the pencil trick. With a #2 pencil you could unlock the multiplier; all 4 bridges needed to be connected... I did this with many Athlons :)

    • @lucasrem
      @lucasrem ปีที่แล้ว

      I own two identical HP PIII systems, one on a slot and one on a socket, with the same PIII chip on both.
      As for why they did this: AMD did a second-level cache daughterboard for it, a better solution.
      Include the memory controller and both caches!

    • @capoman1
      @capoman1 6 หลายเดือนก่อน

      Yep. I remember the pencil trick.

  • @t1t0s89
    @t1t0s89 ปีที่แล้ว

    Having bought an AMD in mid-2008, I love having kept my AMD bae alive until buying another in 2022 ❤️

  • @paulmichaelfreedman8334
    @paulmichaelfreedman8334 ปีที่แล้ว +3

    I never had one, but I remember from my computer shop job in the '90s the K5 causing divide-by-zero errors in many runtime DOS programs because its cache memory was blazing fast. I had to use a program that caused a slow-down by saturating the CPU with NOPs.

    • @lucasrem
      @lucasrem ปีที่แล้ว +4

      SiS chipsets did a very bad job on the AMD socket chips.
      Nvidia bought them; nForce chipsets.
      You mixed it up: the coprocessor divide-by-zero was the Pentium 60/66 bug.

    • @TheVanillatech
      @TheVanillatech ปีที่แล้ว

      @@lucasrem Integer Divide By Zero was a speed bug running older DOS programs on modern fast CPU's. Most developers never even considered the exponential speed increases coming from Socket 7 CPU's.

    • @TheVanillatech
      @TheVanillatech ปีที่แล้ว

      Sure, integer speed was great on the old K5s, but that floating point... my God! Terrible. I paid a shop for a Pentium 133 and they pulled a fast one and gave me an AMD PR-133 (K5)... which only ran at 100 MHz. I called them, and they claimed they'd done me a favour: "Faster than a true Intel P133," they said. Without mentioning it cost just half as much and was terrible at floating point. Given I'd only bought the thing to play Quake, I was seriously angry.
      10 months later I went back to the store and stole a Pentium 200 MHz right from under the counter. Revenge!

    • @paulmichaelfreedman8334
      @paulmichaelfreedman8334 ปีที่แล้ว +1

      @@TheVanillatech I managed to get a P200 fresh from the tray that had fallen off a truck... saved me €500,-

    • @TheVanillatech
      @TheVanillatech ปีที่แล้ว

      @@paulmichaelfreedman8334 Jesus, that exchange rate back then! When I stole mine (although I'm loath to use that term; that shop owner had STOLEN my money already with the switcheroo) the price was £200, exactly £1 per megahurt.
      I'd never installed a CPU before and figured it out as I went along. I wasn't even sure if my motherboard would take it, but it did. Jumperless: set the clock speed in the BIOS and boom... 200 MHz.
      Quake's timerefresh, from the difficulty select screen, went from 7.7 fps on the AMD PR-133 up to 21.8 fps with the P200.
      Still can't believe those scumbags at Falcon Computers would steal a child's money and then lie to their face on the phone when they complained! Pricks.

  • @soapbar88
    @soapbar88 ปีที่แล้ว

    I grew up ten minutes from the ATi building in Markham, i saw that logo every day growing up. I remember when my cousin and I got the new 64MB AGP card from ATi. How far we've come..

  • @siberx4
    @siberx4 ปีที่แล้ว

    4:00 Not sure if this was intentional or not, but note that the "Pentium" you show here on the left is not one of the Pentiums under discussion, but a much later low-end Intel Skylake chip from 2015 that re-uses the Pentium name for marketing purposes (well into the Core i# era)

    • @hanspeter24
      @hanspeter24 ปีที่แล้ว

      was thinking the same the heatsink looked way too modern

  • @ThatswhatsupTWU
    @ThatswhatsupTWU ปีที่แล้ว +2

    I’m literally reading how Eagle Pass crossing is one of two border closures at the exact time you mention Ruiz crossing the border into Eagle Pass. Timing

    • @francishallare204
      @francishallare204 10 หลายเดือนก่อน +1

      Is everyone free to enter eagle pass from Mexico?

  • @colinstu
    @colinstu ปีที่แล้ว +1

    18:04 Just imagine if Nvidia and AMD had merged way back then (and Huang had taken over). Ruiz's days were ultimately numbered anyway, and Huang is the much stronger CEO.
    AMD also started slipping come 2006. While the ATI merger was strategic, their lackluster CPUs compared to Intel's Core 2, and then the long time required to get quad-core chips out to the masses, were really their death knell; then Intel did away with the FSB and any advantage AMD had was gone (it wasn't until several CEOs later, with Ryzen, that they finally competed again).
    Nvidia also really wants an x86 license, and this would've solved it. And while Nvidia was blocked from buying Arm these days, I don't think they would've been blocked from merging with AMD given the economy and regulatory environment back then.
    I think under Huang AMD would've had a much better chance all the way up to at least the Ryzen days; it depends on how the products would've performed.

    • @HeroDai2448
      @HeroDai2448 11 หลายเดือนก่อน +1

      if they fused back then nvidia/AMD would be a 1.7 trillion company at least and intel would be probably on the edge of going out of business

  • @Techaktien
    @Techaktien ปีที่แล้ว +1

    Thank you

  • @MrStevemur
    @MrStevemur ปีที่แล้ว

    There's a cool story I remember from the late-'90s era, but I can't Google up the details. It was about the pricing of Celeron processors. Intel set a baseline price that was above what AMD was charging; everyone knew Intel had brought out the Celeron to compete with AMD, but they insisted on charging more. Then it wasn't selling, so the price came down after all, and the guy who'd said it wouldn't (Andy Grove?) resigned. It was a fun win for the upstart company at the time.

  • @gary-williams
    @gary-williams ปีที่แล้ว +1

    I had to play this video at 1.5x speed for it to sound normal.

  • @otakujhp
    @otakujhp ปีที่แล้ว +3

    I still have my K6 for old PC games. I can't believe it still works.

    • @lucasrem
      @lucasrem ปีที่แล้ว +1

      A Cyrix 120 is still working here too, on DOS 6.22 and Windows 3.11, without any issues.
      It's a branded IBM build, with IBM on the CPU too, no Cyrix name on it at all; I always thought it was produced in Germany, but now I know it was NYC.

    • @capoman1
      @capoman1 6 หลายเดือนก่อน

      Wow. My first pc with a GeForce. Dirt cheap. Life long counter strike and half life fan.

  • @Michael_Brock
    @Michael_Brock ปีที่แล้ว +3

    Itanium brings back memories. Itanic! It might have been able to run HP PA-RISC, but it had really, really bad IA-32 performance due to emulation and abysmal compilers. Native IA-64 was nearly as bad, also because of the compilers.
    AMD's AMD64 could run native or nearly native IA-32 code with minimal changes, plus an easy-to-access 64-bit mode with much better compilers, with all code far easier to adjust. No contest.

    • @mattbland2380
      @mattbland2380 ปีที่แล้ว +2

      AMD saved x86 from Intel’s own shortsightedness. They were focusing on an Itanium based 64 bit future. The AMD64 extensions were a game changer and are still in use today.
      Whilst Itanium might have been good for HP and Intel no one else in the industry wanted it to be the successor to x86.

    • @mytech6779
      @mytech6779 ปีที่แล้ว +2

      To be more specific, the Itanium optimization chain was to simply expose everything and leave micro-optimization, like pipelining and branch prediction, to the compiler. The simplified hypothesis was, on one hand, that the compiler knows the bigger picture of an application and, while slower than hardware, only needs to optimize the binary once, as opposed to the CPU making fast hardware optimizations during every run.
      This lack of hardware optimization also frees up substantial silicon area and power budget for more logical processing units (which is largely why GPUs can do so much calculation per dollar for specific tasks).
      This could actually work well, but as you stated, the compiler-optimizers of that time did not have the needed advances and Intel didn't bother to put that horse before the cart; secondly, it requires compiling for very specific models of hardware (less portable binaries, more prone to vendor lock-in).
      End users also get attached to their legacy proprietary binary software, so even with a good co-optimizing compiler they can't just recompile to get the advantages; hence the marketing importance of good backward compatibility.
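
      A minimal sketch of the trade-off described above (illustrative only, not Itanium code): a compiler can statically pack the independent adds below into wide issue slots, but it cannot statically schedule the pointer chase, because each load depends on the previous one and on data known only at run time; out-of-order hardware resolves that dynamically, whereas Itanium left it to the compiler.

          #include <stddef.h>
          #include <stdio.h>

          struct node { struct node *next; int value; };

          /* Four independent adds: easy for a static scheduler to bundle. */
          static int independent_sum(const int a[4], const int b[4])
          {
              int s0 = a[0] + b[0], s1 = a[1] + b[1];
              int s2 = a[2] + b[2], s3 = a[3] + b[3];
              return s0 + s1 + s2 + s3;
          }

          /* A dependent chain: each iteration waits on the previous load,
             so a compile-time scheduler has little to overlap. */
          static int chained_sum(const struct node *n)
          {
              int total = 0;
              for (; n != NULL; n = n->next)
                  total += n->value;
              return total;
          }

          int main(void)
          {
              int a[4] = { 1, 2, 3, 4 }, b[4] = { 5, 6, 7, 8 };
              struct node n2 = { NULL, 2 }, n1 = { &n2, 1 };
              printf("%d %d\n", independent_sum(a, b), chained_sum(&n1));
              return 0;
          }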

    • @PainterVierax
      @PainterVierax ปีที่แล้ว

      @@mytech6779 so pretty much the same downside of Risc based architectures. Clearly not what users want.

    • @mytech6779
      @mytech6779 ปีที่แล้ว

      @@PainterVierax Itanium is RISC, not "pretty much". (Pedantry!)
      But being RISC wasn't the core market failure. It was just a poorly supported implementation. ARM's various designs are RISC and do just fine.
      Historically the market is much more complicated, but 80x86 being CISC was not the major factor in its market dominance. It was much more a matter of momentum, contracts, compatibility, and supply competition at a critical time in the microcomputer market.

    • @PainterVierax
      @PainterVierax ปีที่แล้ว +1

      @@mytech6779 ARM did a bit better because the embedded nature of the products allowed OSes to adapt to it. Performance and compatibility are a reason why Android uses Dalvik and then ART instead of more classic binary packaging. That's also why specific distros like Armbian or Raspbian were formed on top of the Debian work, and why Arch or BSD are quite popular on ARM and RISC-V devices.
      Add to this the huge library incompatibility with the vast x86 environment and you get why Apple took so long to make an ARM transition in their "desktop" environment (not iOS) despite a very secure market share.

  • @brandonfriesen9820
    @brandonfriesen9820 9 หลายเดือนก่อน

    Interestingly, AMD's original Athlon CPUs used the same bus as the Digital Equipment Corp Alpha 21264 EV6. They licenced this from DEC. This is a DDR bus, and it's why they had much greater memory bandwidth in one of your slides on Intel vs AMD performance in the Athlon vs Pentium 3 era.

  • @MegaChickenPunch
    @MegaChickenPunch ปีที่แล้ว +16

    AMD has some great products. Currently using 5800X3D and 7900XTX GPU!

    • @catsspat
      @catsspat ปีที่แล้ว +7

      Currently watching this on a Framework Laptop 13 with AMD 7840U APU (Zen4 + RDNA3).

    • @Re-InCarNation
      @Re-InCarNation ปีที่แล้ว +5

      Their cpus are great, their GPUs need a little work.

    • @PainterVierax
      @PainterVierax ปีที่แล้ว +1

      Great CPUs for mid/high-end gaming builds and mobile. But their desktop parts are actually lacking in application workloads, not that great in idle/light-task power consumption, and their low-end offering is not competitive at all. The same goes for high-end workstations; the new Threadripper is not the game changer it was before. Finally, their recent platforms seem very unfinished/unreliable, more so than is typical for AMD.

    • @rampage_sl
      @rampage_sl ปีที่แล้ว +8

      @@PainterVierax is that you userbenchmark?

    • @MegaChickenPunch
      @MegaChickenPunch ปีที่แล้ว +6

      @@PainterVierax what a bunch of nothing

  • @RohitSharma-mi8gt
    @RohitSharma-mi8gt 2 หลายเดือนก่อน

    Imagine a Jim Keller interview on Asianometry

  • @spoot
    @spoot ปีที่แล้ว +7

    AMD and Nvidia merging and being run by Jensen Huang is a very interesting thought experiment.

  • @gus473
    @gus473 ปีที่แล้ว

    9:12 Always nice to make a wintertime stop in 🌵 Scottsdale, "The West's Most Western Town!" 🙄Yeehaw! 😉✌️😎

  • @seylaw
    @seylaw ปีที่แล้ว +1

    I have fond memories of a AMD K6-2 system from back in the 90s.

    • @lucasrem
      @lucasrem ปีที่แล้ว

      If you only play games, it's compatible !

  • @scunnerdarkly4929
    @scunnerdarkly4929 ปีที่แล้ว

    Excellent content as always 👌

  • @GewelReal
    @GewelReal ปีที่แล้ว +2

    Wait... In a different timeline Mr Huang is a CEO of AMD as well?!?

  • @johndoh5182
    @johndoh5182 ปีที่แล้ว +1

    Looking at a Pentium with a small number of pins, as opposed to what they have now with 1700 pins, is pretty funny. It shows how little connectivity these earlier CPUs had.

  • @samgeorge4798
    @samgeorge4798 ปีที่แล้ว +1

    Great video. Please do some more bio tech videos, they are my favorite.

  • @markissboi3583
    @markissboi3583 ปีที่แล้ว +1

    The whole computing journey up to now has been remarkable: as microscopes could see more, they made tiny circuits ever smaller.
    But now they've all reached a limit. As with Intel, the last 3 generations of CPUs from Intel were actually the same CPU, maybe 10% apart if that; no wonder upgraders stayed away, as with me, my i7-900k has done for ages.

  • @Aubstract
    @Aubstract ปีที่แล้ว

    "And so it goes" just after mentioning Dresden... Slaughterhouse 5 reference??

  • @AKK5I
    @AKK5I ปีที่แล้ว

    0:13 We're so back

  • @Incoming1983
    @Incoming1983 ปีที่แล้ว

    Back then, I had the K6-2 at 400 MHz with 3DNow!
    I was very happy with it, and it greatly outperformed more expensive Intels in games that fully supported it.

    • @fabiosemino2214
      @fabiosemino2214 ปีที่แล้ว

      I remember having a good time with the K6-2 350 and 550 until RTCW; it was dog slow with a Kyro 2 on that one.

  • @thereddaikon
    @thereddaikon ปีที่แล้ว +2

    Wild to think how different the market would be today if AMD hadn't bought ATI or even more so if they had merged with Nvidia. Nvidia is larger and worth more than either Intel or AMD today on their own. If they had combined with AMD and Jensen got his hands on an x86 license then they may be bigger than Apple

    • @brodriguez11000
      @brodriguez11000 ปีที่แล้ว

      Maybe nForce would have been better than it was.

    • @lucasrem
      @lucasrem ปีที่แล้ว

      mattwhite7421
      They needed a GPU partner; what if they had bought Matrox? It would have been the same as it is now.
      Cheaper to buy your way in than to do all the dev at home!

  • @gpmelendez
    @gpmelendez ปีที่แล้ว

    Are you also the youtuber History for GRANITE?

  • @3800S1
    @3800S1 ปีที่แล้ว

    AMD brings a lot of nostalgia from the late '90s through to the mid '00s. For me it's kind of weird hearing that AMD was struggling back then, as basically no one but ill-informed novices would even consider Intel in the enthusiast space. Something like 90% of the gaming PCs that my friends and local gamers in general built went with an AMD Athlon XP, 64 or FX plus a mostly Nvidia combo, with ATi being the better and more popular option in the early '00s due to the Nvidia FX generation being an absolute flop.
    We all knew the P3 was too expensive and slower for gaming than the Athlon, and the 64/FX was just light years ahead of the cluster f**k that was the P4.
    It wasn't until about 2006 or 2007, I think, when Intel Core was compelling and offered an upgrade, that the AMD preference finally started to wane.
    I suppose 99% of consumers back then had no idea about the anti-competitive shady stuff Intel was doing, unlike the awareness people have now. We all assumed AMD was raking it in during those glory days of the Athlon era.
    I also remember the woes of the first Phenom; I bought one very early on and it was good, but very much limited by clocks due to that design bug. The Phenom II was fantastic though, one of the best-overclocking AMD chips I ever had, but it was too late to market by that stage.
    My main desktop is still an FX-8320; it's one of the best PCs I've had in terms of snappiness in general everyday use, even though it lacked modern gaming and heavy number-crunching power even when it was new. But it's way faster in that respect than the i7 2500K I tried for a while, much of a muchness in the other areas, and it's even snappier than the lower-end Ryzen laptops I currently use, which I did not expect. I'm not sure why any of this is the case; I suspect something to do with latency through the pipeline being better than early Ryzen and Intel of the same generation? But I'm no expert on those older architectures.
    Maybe one day I'll upgrade to that mid/high-end Ryzen desktop I've been talking about for the past 5-6 years 😅

  • @tomschmidt381
    @tomschmidt381 ปีที่แล้ว +2

    Great history lesson; I've always had a soft spot for AMD.

  • @mikekopack6441
    @mikekopack6441 ปีที่แล้ว +3

    OMG! I remember Cyberpro in Norcross, GA, right off Jimmy Carter Blvd! There were a TON of little mom and pop computer stores in an industrial park there! They all pretty much had the same stock, but you could build a whole cheap machine just by going to those shops and price compare to get the parts from the cheapest vendors...

  • @LavaHotMan
    @LavaHotMan ปีที่แล้ว

    Oh, damn. I was hoping that guy staring at you was gonna pay off, like they were going to warn you the airbnb was haunted or something.

  • @ajax700
    @ajax700 11 หลายเดือนก่อน +1

    Pentium 4 taking the crown from Athlon Thunderbird? haha.
    I beg to differ.
    P4 was hot, big and expensive.
    Pentium 3 Coppermine and Tualatin were much better as the time showed.
    Best wishes.

  • @redpriest26
    @redpriest26 ปีที่แล้ว

    I remember we were so broke during the financial crisis we had to sell the lawn at the Sunnyvale campus.

  • @henrahmagix
    @henrahmagix ปีที่แล้ว

    did _not_ expect Return of the King to get dragged in this video 😅

  • @Alex.The.Lionnnnn
    @Alex.The.Lionnnnn ปีที่แล้ว

    Ohhhh I had. K5. Wait it was dome form of IBM badged thing. Maybe my memory is gone lol.

  • @lazymass
    @lazymass ปีที่แล้ว +4

    I kinda switched to AMD now... Loved Intel... But not anymore rly. Zen 4 is awesome

    • @lucasrem
      @lucasrem ปีที่แล้ว

      Only games
      If you all work on the same project, all hardware needs to be compatible !
      why is that so hard to understand, he is not a coder, a gamer too ?

    • @nipa5961
      @nipa5961 ปีที่แล้ว

      AMD became the only viable option this year. Let's hope Intel will come back with Arrow Lake.

  • @jplayer073
    @jplayer073 ปีที่แล้ว

    Watching them slide into irrelevancy into the 2010s was so sad. I was so happy when they released Ryzen, since they hadn't really been worth buying in close to a decade unless you had a really limited budget.

  • @Flameboar
    @Flameboar ปีที่แล้ว

    I was in Hsinchu when Intel lost their stranglehold on the Taiwanese motherboard manufacturers. My understanding was that Intel told the motherboard makers that if they made a high-end motherboard for AMD CPUs, Intel would stop selling Intel chipsets to that motherboard maker. Since Taiwan was producing most of the motherboards, this hurt AMD's desktop business. However, while I was in Taiwan on one of the projects I did there, Intel had delayed building a new fab, and this left them short on product to deliver.
    The result was that every motherboard maker immediately brought out high-performance AMD motherboards. This put Intel and AMD on a more equal footing in the desktop market.
    That is my understanding of the desktop motherboard skirmish between AMD and Intel.

  • @drtracking
    @drtracking ปีที่แล้ว

    Back in 1999, I was writing code for biometrics, and using an Intel Pentium I was able to analyze more than 600 fingerprints per second. It did not work well with AMD or Cyrix. I don't know how that's going now with crazy-fast processors.

  • @Neeboopsh
    @Neeboopsh ปีที่แล้ว

    back in the pentium II days, the L2 was not on die, and was on the packaging, on those giant slotted ones. i forget what that slot was called.

    • @lucasrem
      @lucasrem ปีที่แล้ว

      Slots, muhahahaha.
      We have a Celeron Slot 1 CPU; we know what it is.
      AMD sold a cache board too, a level 2 cache expansion slot, that was that. NOT on-die!!!!

  • @tomthroffle
    @tomthroffle ปีที่แล้ว

    21:15
    Barselona?
    Okeeh.

  • @Justin_black_leviathan
    @Justin_black_leviathan ปีที่แล้ว +1

    The first PC I built was a K6-2.

    • @capoman1
      @capoman1 6 หลายเดือนก่อน

      Me too. Didn't build though. Full machine cost me $200 at Best Buy including monitor.

  • @usnoozeyuloosey
    @usnoozeyuloosey ปีที่แล้ว

    Can you do a video on Rapid?

  • @misewixe2777
    @misewixe2777 ปีที่แล้ว

    You should make this a daily, any topic really. :D

  • @0MoTheG
    @0MoTheG ปีที่แล้ว +4

    The K5 wasn't as bad as this video suggests; it was the choice for budget systems at the time, as the performance was sound and the price was below the Intel parts.
    One might tell the story as: AMD got to use an outdated fab without going bankrupt.

  • @ravipeiris4388
    @ravipeiris4388 ปีที่แล้ว +1

    Hard to believe that a latino had a part in this story and was not a janitor 😊.

  • @ongwy66
    @ongwy66 ปีที่แล้ว

    I would like to add that GlobalFoundries, TSMC, etc. are pure-play foundries. As such, if you were to include others like Samsung, which is an IDM that also takes orders from other players, GF is hardly placed 3rd.

  • @WooShell
    @WooShell ปีที่แล้ว

    I hated both AMD for their Athlon XP and Intel for their P3... as a former PC shop owner, selling CPUs with open silicon dies was probably the worst idea the industry ever had. I lost so much money, time and nerves fighting customers too dumb to install their cooler properly and trying to claim warranty on cracked dies. Sure, it made them cheaper to fab, but that just shifted the cost to the last link in the chain.

  • @mapp0v0
    @mapp0v0 ปีที่แล้ว

    Were you joking when you showed the gentleman dressed in his clean room suit with his pony tail hanging out?

  • @JonMasters
    @JonMasters ปีที่แล้ว

    Another great episode! Open, free and fair competition must be the bedrock of the industry. Truth, Justice, and the American Way 👍

    • @TheVanillatech
      @TheVanillatech ปีที่แล้ว

      You have to be kidding? STILL believing in that dream? :D

  • @McGurble
    @McGurble ปีที่แล้ว +1

    Man, you really just glided over the whole 64bit transition. Barely a mention.

    • @benjaminlynch9958
      @benjaminlynch9958 ปีที่แล้ว +1

      To be fair, that subject probably deserves its own video.

  • @brandonzhang5808
    @brandonzhang5808 ปีที่แล้ว

    The guy in the thumbnail looks surprisingly similar to Stephen Wolfram.

  • @douro20
    @douro20 ปีที่แล้ว +2

    One of the odd things about the K5, K6 and K6-2 is that they were RISC internally. They used microcode translation to emulate an x86 CPU.

  • @alphadog6970
    @alphadog6970 ปีที่แล้ว

    In developing countries it is still impossible to find AMD-powered laptops in electronics retail chains and local internet shops.
    The ones available in small numbers are the premium models that cost over 1k, which very few people can afford.
    That's why 1366x768 models with Celerons are still selling like hotcakes, since there is nothing else available in that segment.
    Intel is still doing shady business, and it's a shame that AMD is not suing them.

  • @grtitann7425
    @grtitann7425 ปีที่แล้ว +13

    And that's why we refuse to give Intel and Ngreedia a penny.
    Go AMD!!

    • @MrHav1k
      @MrHav1k ปีที่แล้ว +3

      Ngreedia 😂😂

  • @ricardokowalski1579
    @ricardokowalski1579 ปีที่แล้ว

    Someone, somewhere, much smarter than me, already combined moore's law, the capital cost of new fabs, and the limits the human user has to calculate where the diminishing returns to all this capital madness will start.
    Semiconductors will have a "Concorde/SST" moment sooner than we think.

  • @percival477
    @percival477 ปีที่แล้ว

    When will you cover Itanium? It was a tremendous flop that took SGI and others down with it.