AMD bought the OG Team red ATI. Nvidia and even 3D FX's Voodoo graphics were using green color schemes back in the day. They couldn't keep using green so they just went with ATI's red branding.
IIRC, early K6's used to have a special CPUID easter egg where it would return the string "NexGenAMD" (or something similar that fit) instead of "AuthenticAMD" if you put a special value in one of the registers. Let's see, from memory there's several CPUID strings for intel compatibles... "GenuineIntel", "AuthenticAMD", "CyrixInstead", "CentaurHauls", and then the no-fun bland: "UMC UMC UMC ". Here's a more complete list: en.wikipedia.org/wiki/CPUID
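For anyone curious how those strings come out of the chip: CPUID leaf 0 returns the 12-byte vendor ID split across EBX, EDX, and ECX, four little-endian ASCII bytes per register. A minimal sketch of reassembling it in Python, using the register values documented for Intel and AMD parts (no actual CPUID call, just the byte layout):

```python
import struct

def vendor_string(ebx, edx, ecx):
    # CPUID leaf 0 splits the 12-byte vendor ID across EBX, EDX, ECX,
    # in that order, four little-endian ASCII bytes per register.
    return struct.pack("<III", ebx, edx, ecx).decode("ascii")

# Register values as documented for Intel and AMD processors:
print(vendor_string(0x756E6547, 0x49656E69, 0x6C65746E))  # GenuineIntel
print(vendor_string(0x68747541, 0x69746E65, 0x444D4163))  # AuthenticAMD
```

The odd EBX/EDX/ECX ordering (not EBX/ECX/EDX) is why the raw register dump looks scrambled until you know the trick.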
Came looking for this comment. The K7 was similarly designed by a new team acquired from DEC that brought over technology from the Alpha architecture, which introduced DDR RAM to x86.
Once upon a time, there were many CPU makers. Now there are only a few left, and even fewer that make high-end chips. AMD is one of the few giants who managed to stand the test of time.
Standing the test of time really sells their struggle short. AMD was on the brink of going under multiple times over the decades. They made Global Foundries to compete with Intel's own fabs but were left with no choice but to spin it off because it nearly bankrupted them. A missed opportunity given the current semiconductor situation.
When I was much younger, I remember the x86 camp had many players: Intel, AMD, Cyrix, IDT WinChip. Then came the Transmeta Crusoe on the mobile platform many years later, still x86. ARM was puny back then. The x86-64 license came from AMD much, much later.
My AMD Phenom II X6 that came out around 2010 had a black and green box and a green AMD logo on it. The first AMD CPU I saw that had a red box and a white or red AMD logo was the FX lineup.
In high school in the mid 90s, I finally convinced my dad to buy us a PC. It was the AST PC sold by Walmart (I didn't know any better; I was a teen in a podunk town with no real tech-savvy friends). It was labeled as a K6 586/75. Upon startup, the BIOS listed the chip as a 486/66... and so began my distrust of Walmart.
It was not uncommon for the Am5x86 to show as a 486-66. I have three of them (I think), and all do that. They ran at an external multiplier of 2x and then internally ran at 4x, so the motherboard saw it as a 486DX2-66. Some later boards could tell it was a 5x86 and showed the proper CPU.
@@amirpourghoureiyan1637 oh it was most certainly my justification. I always said "when I grow up" I would build an Intel system for about $4k-$5k. Luckily I don't have to do that; I can run AMD and finally be winning for once lol
My very first PC was based on a Duron @750Mhz and since then all my computers were built around AMD CPUs, usually the entry level models. This brings back so many memories!
Reminds me how 480p/576p(pal) was pretty much maximum main resolution for anything TV and consumer media related like DVDs until 2006, and then in the space of just 10 years we went from that to talk about 8K everywhere! Even on PC it wasn't much better than that, with 800 × 600 and 1024 × 768 being the most common resolutions until the mid 2000s!!
That happened when CPU makers discovered they couldn't make huge steps in clock speed anymore past about 3 to 4GHz. So they had to come up with something. New, more efficient CPU architectures don't come overnight, so they started adding cores and threads to increase performance. Stuff like Hyper-Threading and the Pentium D (yuck!) all came after Intel hit a silicon speed limit with the Pentium 4.
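The shift described here, scaling out across cores instead of up in clock speed, is exactly what standard libraries now bake in. A tiny illustrative sketch (the workload is made up) of spreading a CPU-bound job over however many cores the machine has:

```python
from concurrent.futures import ProcessPoolExecutor

def work(n):
    # Stand-in for any CPU-bound task.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [50_000, 60_000, 70_000, 80_000]
    # One worker process per core by default: throughput grows with
    # core count, not clock speed -- the post-Pentium-4 bargain.
    with ProcessPoolExecutor() as pool:
        parallel = list(pool.map(work, inputs))
    assert parallel == [work(n) for n in inputs]
```

The catch the comment below alludes to is real: software has to be written this way to benefit, which is why the transition was so slow.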
That's mostly because AMD went into "hibernation" for a couple of years and intel did not feel the need nor pressure to up the core count since the competition was behind anyways. In other words, intel ripped us all off.
I remember using the Athlon chip in 2005 and then a Sempron chip when they first came out in 2006. The funny thing is that although the Sempron was a budget CPU, the changes in it made it better for rendering 3D graphics than the much better Athlon chip.
One of the most interesting videos ever, though a lot of the info about the Athlon and Duron wasn't so precise. And thank you for activating subtitles. Hugs from Italy.
I remember building a K6-2 machine. Back then it was great: if you were in the tech world you would get CPUs given out in care packages when you went to events.
1997, K6-2 233mhz (which instead of 3.5 x 66mhz I got working as 3 x 83mhz at 250mhz) and 3DFX voodoo card and original Tomb Raider. The jump from previous 486DX2 66mhz was insane and then some. From Doom like games to 3DFX accelerated Tomb Raider. I had half a dozen friends every day watching me play Tomb Raider. I remember that like it was yesterday.
Man, I remember hearing about the FX years ago when I was running a Phenom 2 and being psyched about it. Those things were impressive for like a year and ran hot as balls. Glad Ryzen didn't keep the trend going.
Yeah, but I run the Ryzen 7 5800X on water and it runs very toasty: in the 60s under normal gaming loads and in the high 80s under full load, because I also work inside Blender, Unity and Substance Painter. (Yes, I am also aware this one is the toastiest of the Ryzen 5000 lineup.)
@@Rena_Skuxy Did you account for the temperature sensors being miscalibrated? Some motherboards adjust for that, others don't. Have you tuned voltages in the BIOS? I was able to drop my 5600x by 0.2v and it was still stable at stock everything. Again, some motherboards just slam power through the chips that they don't need to. Edit: Load temps went down around 20-25C from a 0.2v drop.
Been on the AMD kick since the K6-2 with 3DNow!. I have been with them throughout all their ups and downs, but it's nice to see how they've been doing since the introduction of Ryzen. I was amazingly impressed with its first generation, and its latest iteration has been great too.
I remember the old "Barton" chips, I think that's what they were, that you could hard-OC with a piece of tape and a pencil: put the tape on the pin side of the CPU, draw a line of a specific length on it with the pencil, then remove the tape. Was interesting...
Phenom II wasn't the start of the decline, it was the peak in terms of their performance. My old 1090T I bought for less than half the price of the equivalent i7, was a hell of a powerhouse for computing tasks, like encoding and rendering, and it lasted me for 9 years of continued use without performance ever being an issue.
Yup, it was a monster for its price. Had the same CPU; passed it to my younger brother in 2016, he used it until the beginning of this year, and now it is in my dad's office being a fairly decent office desktop with 8GB of RAM and an SSD.
I'm still using the Phenom II X4 955T I bought in 2010. I got it with 16 GB RAM for just over $500 and sure... the HD 5700 card is not running Crysis, but I'm really too old for that anyway. The only upgrade is an SSD. The Phenom II was, in my opinion, an absolute success, whereas the FX series was an absolute conundrum...
Same here, bought my 1090T in 2011 and it's still in my pc. The GPU broke and due to the current market I'm waiting. But the 1090T is still going strong after 10 years. Especially with encoding (1080p).
Dude, when I saw this video's cover with the K6-2 image, it just brought back many memories; the most important was that I met my wife on the internet using a PC with this chip back in 1998. That was 23 years ago. Time flies. Now I'm a happy Ryzen owner :).
Cool. Feels like it was important to mention that Opteron, waaaay back in 2003, was the first iteration of what we now think of as a 64-bit PC processor. Well... unless tomorrow's video is all about Itanium... :-)
@Techquickie A bit of information is missing about the AMD K6 at the beginning: it's based on the Nx686 from NexGen, after AMD purchased them; it was "just" made Socket 7 compatible. And also the AMD K6-III with L2 cache in the CPU, making it the first consumer CPU with L3 cache (on the motherboard).
I've owned a K6-2, Duron, Thunderbird, Athlon XP, Sempron, Opteron, Phenom II and now Ryzen CPUs over the last 20 years. Except for the recent Ryzen purchase, it's been all the gaming value for me. And with the 5600X, it's still a pretty good value.
@@mohamadyusoff9091 sadly it was so bad for gaming, the Phenom II was almost as good. Went with a used Intel Haswell during that time for the first time since rocking a Pentium MMX.
1:40 That's my printer there, down on the right left corner with the missing plastic cover... An HP LaserJet 6P/MP from 2007... It was a great printer back in the days, and expensive as hell.
I know the feeling well. I was with AMD for a long time, then went Intel in 2014 with Haswell, and it did me proud; it's now my secondary PC. Just last month I went back to team red, and it put a big smile on my face building it.
4:40 I actually own an Opteron-based system, it's a dual-core AMD Opteron 165 with an nVidia 8600 GT graphics card, the machine is from 2005 and I currently have Windows XP and Windows Vista SP2 on it
It's a wonderful video. But I would've loved to see a mention of how AMD's 64-bit architecture ended up beating Intel's Itanium on the market, to the point that Intel adopted AMD's x86_64 instead.
My first PC was a 486. I delivered papers for about two years to buy that thing. 33MHz/100MHz with the turbo button on, 4 MB of RAM, running Norton Commander :) Later in life I also had an Athlon II and a Phenom II 1100T Black that I rocked all the way up to when Ryzen came out, and I've been on my second Ryzen chip for a couple of years now. I have had a good time with AMD through the years.
6:28 The "A" series was the Fusion concept: a fair GPU (great for on-chip), though they did update the CPU cores to Bulldozer. The top A10 and A12 are STILL some of the best video on a CPU chip, easily handing anything Intel EVER made its head and killing the bottom-end discrete GPU market.
Ah, nostalgia. You know, I ran a Phenom II and an FX chip for almost ten years and they worked great, even in modern applications with newer graphics cards. Finally refurbed and sold them because of the hardware supply shortage; damn good value even if they weren't market leaders at the time. I'd never seen a PC hold any value after a decade and change until now.
@@prateekkarn9277 AMD CEO Lisa Su said the CPU shortage has caused the company to prioritize higher-end commercial CPUs and gaming processors over lower-end PC chips.
I had almost every one of these from the 486 up to the AMD Phenom, including the original slot Athlon processor, which shortly after exploded due to a power supply issue. I probably bought a cheap one. 😂 Right now I'm running team blue though.
You should switch back to team red. More affordable, more consumer friendly because you don't always have to buy a new motherboard. And let's not forget that current AMD CPUs either walk the Intel ones or are equivalent.
@@ChronicleDrew at the time when I built it for the cost the Intel was the best performance per dollar. When I go to build another one I will compare as I usually do
@@ChronicleDrew Yeah, but if you go 5000 series, you are probably going to have to buy a new motherboard afterwards. So please stop using this as a valid argument, because you can give people a false impression about upgradability with this statement. Also, if you buy a cheap motherboard, you're not even sure it will support the upcoming chip even if it is on the same socket; in that case your best bet is a custom firmware/BIOS upgrade if the manufacturer didn't make one (and I would recommend not underspending on a motherboard for this reason alone).

This really doesn't hold up anymore in this day and age (not like it did some 15+ years ago). If you are running a good Intel-based PC right now, it's probably not even worth upgrading to the 5000 series because of the upcoming Zen 4 (which most likely sports a new socket). If you already have a beast of an Intel machine, you're definitely not going to notice anything in most games except an inflated FPS number. Maybe in multithreaded workloads, but most people only claim they need that so they can justify overspending on a purchase they didn't truly need.
My very first custom rig had a 1.1 Ghz Duron processor. Like the rest of that build, it wasn't the best, not by a long-shot, but it was mine, and it provided me with quite a bit of fun, as well as many lessons on what not to do for future builds. Good times.
I think I've owned at least one of nearly every AMD desktop generation since we crossed the 1GHz line (maybe not Duron). The first few I built myself were XP (I might have skipped a few Bulldozers, but my 1100T was still faster than them in single-threaded).
I remember my first PC, thought I had the world with a AMD K6 with MMX processor, especially when I got the 3DFX Voodoo card for it too, my mind was blown! lol seemed a whole world away from the Amiga 1200 (with a blizzard 030/50 accelerator board) that I had before the pc.
Their worst mistake was Bulldozer, a design based on the same principles as the highly clocked Pentium 4 Netburst architecture which they had just destroyed with the Athlon 64. They literally made the worst possible architecture choice they could have made at the time.
@@Mr.Morden I heard that the Pentium eventually overtook the Athlon before Intel launched their Core series. Part of it was due to aggressive marketing to OEMs, but I also think the Pentium scaled better with higher clock speeds later in its product cycle or something.
3:21 - Now that is one hell of a flexible motherboard! AGP, PCI, ISA, 2x IDE, floppy, 2x serial, parallel, AT and ATX power, and TWO different types of RAM! Everything you could want in a retro machine except USB (probably because it predates it).
My childhood PC was an Athlon 64 X2. It was pretty slow in my memory because it was the "family PC", which means it was full of viruses. But I began my journey in computer hardware there and performed my first Windows reinstall on that PC. It ran great afterwards. I spent many hours in GTA SA on that machine, and it's always gonna be a part of me even though it died a few years ago.
Wonderful recap. This video will outlive me, as it provides future enthusiasts the proper foundation on team red vs. blue. Most likely it will be mandatory viewing a few decades from now. Cheers!
I remember salvaging older laptops and replacing the Intel P2/P3 CPUs with AMD variants; it was a great way to get more use out of your 3-4 year old laptop. Neither company would be where it is today without the other. I hope they continue to innovate and challenge each other for another few decades.
@@kingeling lol I wouldn't say strong but it still works as a basic computer or as a light gaming or emulator PC. Definitely no brand new tomb raider gonna be played on it or nothing.
@@kingeling Bless you, that's a good cough 😷 lol. Yeah, I have it water cooled, and an RX 580. I'll keep the GPU but definitely build a new system. (If I have no use for it, I'll just donate the system to a family that needs it.)
Had the Pentium 500, Athlon 64, Athlon X2, Core Duo, Core 2 Duo, Phenom X4 & X6... Still think AMD is way better in price, performance, and innovation pushing the limits: first to 1GHz, 64 bits, multi-core, etc. They weren't wrong betting on multi-core; everything made sense until they found out people are too lazy to rewrite everything to run on more cores. Likewise, the 32-bit transition to 64-bit was slow as hell, which is why we still have two "Program Files" folders in Windows. They're just ahead of their time, at affordable prices.
We wouldn't have Intel CPUs if not for AMD either. Intel literally couldn't figure out how to make x64 work, while AMD had already figured it out and was already selling products based on it. So Intel had to license x64 from AMD. Basically, now Intel holds the x86 license while AMD holds the x64 license.
And I've been there almost all the way. Starting with the AM386SX, briefly passing by a Cyrix MII, then an Athlon Thunderbird 1300, then a Sempron 2800+, then an FX 8350 and now a Ryzen 5 3600. It's been a good run :)
@@theigpugamer They did that because they didn't have competition. Both AMD and Intel are crap since 2008 and have been milking us, either both at the same time or not. That's why ARM CPUs caught up to x86-64 CPUs.
@@rattlehead999 you're right But idc about billion dollar companies I was just making a pun But this battle between intel and amd makes me happy as we consumers get more
Started with a triple-core AMD Athlon II X3, upgraded that PC to an Athlon II X4 later. Next PC had an 8-core FX-8350. Current PC had a Ryzen 2700X, currently has a 3900X.
Something you could have covered slightly better was the "deal". AMD licenses the x86 instruction set from Intel, which is what allowed them to keep making x86-based CPUs. But on the flip side, Intel licenses the x64 instruction set from AMD. I don't know the numbers, but I'm absolutely sure the license fees nullify each other. It's somewhat hilarious that either side could pull their contract and absolutely cripple the other.
They were never friends; Intel was forced to give AMD the x86 license. IIRC, IBM wanted backup CPUs that weren't just from Intel, so Intel had to give AMD the license. And after AMD got the x86 license, up to 2006 they beat Intel or were just as good for cheaper, then started losing more and more from 2006 to 2017, and AMD has been winning ever since.
I might be late to post on this, but you overlooked the Am486 DX4-100. It was one of the fastest 486 processors at the time, and I was one of the lucky few to have it. It actually blew all Pentium processors out of the water until the Pentium 133.
Love your sponsor description. "No markup, meaning that you pay what you would at a normal store." 3080 starting retail price: $699. Redux: "$1099 seems like normal store price."
I understand that, but that isn't the point. He said "same as what you would pay in store", and if someplace like Best Buy had them in stock they wouldn't be charging $400 over MSRP for anything.
Badly scripted video. How can you tell a channel is in bed with Intel? Emphasize the negatives of AMD's old CPUs vs. Intel's a lot, many times, with a specific vibrant tone and expression; then pass quickly over all the AMD CPUs that kicked Intel in the balls for years, like the Athlon 64; don't mention that AMD did x64 years before Intel, or multicore CPUs again ahead of Intel, until the Core 2 Duo when Intel finally had competition; downplay all those years; then go back to emphasizing the bad FX Bulldozer series while mentioning the "superiority" of greedy Intel and their quad-core CPUs... and finally talk about Zen once, without mentioning Zen 2 and Zen 3, which have been kicking Intel to the ground for years now in performance, power consumption and sales. But for the bad FX we even get a sinking ship... yeah, nice script, guys!

LTT fanboys, you can attack as much as you like and tell me the BS excuses about giving team blue a few hard videos, but they get paid every time they mention the word "Intel". They promote Intel even by talking "a bit harshly" to them, or by not emphasizing AMD's superiority. His real character and enthusiasm only show in some unscripted videos where they build AMD servers. There you see an honest Linus, not a "business" in bed with Intel.
I had a K6-III+ back in the day. It’s hard to overstate just how amazing that chip was. To go from a 133MHz Pentium MMX to a chip that could do 600MHz+ was amazing… all without touching any other components. Well besides moving jumpers around on the motherboard to set bus speed, voltage, and clock multiplier.
I'm glad to be back on AMD with my Ryzen9 5900x. I started my PC journey with the AMD 386 DX40, moved to a K6-2 and finally K7, been Intel since then, but then Ryzen happened. What a comeback!
My journey is a bit shorter, but I have had quite a few AMD based systems, and after a brief history with Intel, I am back on AMD.
1. HP Pavilion laptop (circa 2005), with the Sempron 3000+. 1.8 GHz single-core.
2. Acer Aspire laptop (2008) with the AMD Turion 64 X2 processor. 1.9 GHz dual core.
3. First desktop build (2012) - AMD FX-8120
4. Desktop upgrade (2014) - Intel Core i7 4790K
5. Second desktop build (2019) - AMD Ryzen 9 3900X
I'm actually watching this on one of my lower end, older PCs, which has an old Phenom X4 835 in it. The computer was built by an OEM with an Athlon X2 215, but I decided to see what I could do with it for like $30 in eBay parts. I'm pleased that such an old platform still has the ability to do current computational jobs without much of a hitch.
If memory serves, the last AMD CPU that an Intel chipset supported was the AMD DX4-100, which kept a lot of those pre-Pentium motherboards in the game for a while. In those days there was a third option, the Cyrix 5x86, which was actually manufactured by IBM and sported better performance numbers but was hampered by lack of support, needing patches downloaded to be compatible with most games.
Ahhhh, my first PC used that K6-2. 166MHz, 👌 mwah! Fell out of love with PCs during 'the dark years', came back for the Ryzen 2600. The circle of life, so beautiful.
♫ Ryzen up! Straight to the top... ♫

My favorite was the K6-III+. These were very fast, low power, low heat, and very inexpensive. The original K6 & K6-2 chips ran very hot; one time, a K6-2 200MHz melted the cooling fan to the processor. It still continued to run, long after the fan was replaced. The K6-IIIs ran so cool that they were barely warm to the touch. I ran them with fans, but they really didn't need them.

Also, in those days, many motherboards came with DIP switches instead of jumpers, so configuring & overclocking was pretty simple. 400s could run at 500 with no apparent problems. That was the last time I even attempted overclocking as a matter of policy: after dealing with instability after a couple of years of service, I decided it was more worth it to me for the board to last as long as possible. When the Athlon era came around, overclocking seemed to take an eventual toll on system components. It was more worth it to me to save up for better components. The same seemed to apply to Pentium systems of that era.
Nice trip down memory lane, thanks! I have owned or used a significant portion of the chips mentioned in this video. My first AMD chip was the 8088 clone in my IBM XT. Then a few years later I scored an Am386DX-33 that impressed the heck out of my 1993 self. Later I heard rumors about the awesome K6, and eventually got my hands on one. Very not bad, and it left me liking AMD enough to stick with them, for the most part. Heck, my 6-core Athlon lasted me 10 years until I replaced it with my current Ryzen 2700. Never cared much for their graphics cards, though.
My first PC build started with an FX-4100; then my YouTube career took off with an 8320E because my motherboard couldn't go beyond 95W. But that board and the two CPUs got me through nearly 7 years of gaming, school, and video and music production before my rebuild with a Ryzen 2600X and a stronger GPU and RAM.
"AMD got their start by making Intel CPUs."
"Linus Tech Tips got their start by making NCIX tech tips."
Soon this comment is gonna blow up; it just jumped from 15 likes to 25 likes in a minute…
@@NitheshVG734 I'm here to witness it
Commenting before it blows up
hey
true...
00:35 Quick note regarding this. The reason Intel went along with this was because the military usually required a redundant manufacturer, in case the primary one either went out of business, or hit some roadblock and were unable to make any more.
I've always wondered why this was. Thank you!
@@kantdiego You're welcome
@@MN-jw7mm He didn't thank you...
@@hashhacker2130 you're welcome 😎
Thanks
If you're going to refer to them by color, then AMD was team green in the early days, before acquiring ATi (team red in the graphics market).
There's Nvidia for that one
I have an old XP gaming rig with an Athlon CPU and Nvidia graphics, and naturally... it has green lighting, because team green x2
And they were a lot better that way.
Seriously. I still consider AMD "Team Green". To hear them "Team Red" is still weird to me, even a decade later.
@Gonçalo Amaro how is this a woosh? The original comment is clearly not a joke
The first high end PC I built from scratch was a dual Athlon MP system with a Tyan motherboard. I loved that computer.
I had to check, Tyan is still around in the server/workstation categories.
I happen to have an Athlon MP system with a Tyan motherboard. And I switch between a GeForce FX 5900 Ultra, and a Voodoo 5 5500 in it.
First Computer I built was a K6-2 450 with a 3DFX Voodoo something. I can’t remember.
My first was a 6700K and a GTX 970. I'm also super young; most of those parts are older than me! But I really want to get into old tech.
Mine was a 1GHz T'bird, was a ripper in its day.
Thank you, Lisa Su. Otherwise we would still be buying $450 4-core 14nm Intel CPUs.
14nm++++++++++++++++++++++++xe v2 rev2.3 **
Imagine the wattage consumption...
This is why the US government and others do not allow monopolies. Once a company gets on top and kills off or buys the competition, it can relax on innovation and overcharge the consumer. I feel there should be a third CPU maker; technically there is ARM, but they pretty much allow anyone to license their designs. Elon should start a CPU and GPU company; he would benefit from it. He is already using custom chips for Tesla Autopilot from the current suppliers that make them. He pretty much killed or slowed every satellite provider because companies refuse to innovate anymore, just like he killed ULA with SpaceX. The real problem is everyone is content on Earth, happy with what they have or are trying to get. Could you imagine being born in the early 1900s, watching stuff evolve at a fast pace, and then in the 2000s everything hits a brick wall?
AMD almost went bankrupt back then
@@Rainbow__cookie Consoles kept the lights on for them. They also had to spin off Global Foundries, and I remember they had to sell their corporate office building and then lease it back for cash. I wonder how close they came to being acquired by Samsung?
AMD was also first to break 1ghz. I feel like that was an important milestone that you glossed over mate.
Yep, it was the Athlon Thunderbird, or right before it, if I remember correctly?
@@Halarue Athlon Orion actually. Thunderbird came along a few months later.
@@LS3ftw15 right! It was a great time to be an AMD fan!
I remember Slot A Athlon CPUs, and then the Socket A (462-pin) T-bird AXIA-Y 1000, which would usually clock to 1333MHz just by changing the FSB from 100 to 133MHz.
The Morgan-core and Spitfire-core Durons ran at 1GHz too.
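For anyone who never touched these boards: the core clock was just the FSB times a (usually locked) multiplier, so bumping the bus overclocked the chip. A quick sketch of the arithmetic, assuming the Thunderbird 1000's stock 10x multiplier:

```python
def core_clock_mhz(fsb_mhz, multiplier):
    # On Slot A / Socket A boards the core clock is simply FSB x multiplier.
    return fsb_mhz * multiplier

print(core_clock_mhz(100, 10))  # 1000 -- stock Thunderbird 1000
print(core_clock_mhz(133, 10))  # 1330; with the true 133.3 MHz bus this is the "1333 MHz" figure
```

The same arithmetic explains why FSB overclocks also stressed RAM and PCI devices: everything hung off that one bus clock.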
They were also the first to make 64-bit x86 CPUs
I really thought this was called "best amd cpu ever" and learned all this history expecting a new CPU at the end XD
I thought this as well!
Did they change the title...?
Bruh lol
They 100% changed the title.
@@Razi98 bruh I saw the same thing
@@EdyDev same
Until a few years ago I was running an AMD Athlon II X3 tri-core, which was actually a quad-core sold with one core disabled, usually due to a defect. Some motherboards at the time came with a core unlocker which would try to activate that extra core. If you were lucky, you could get a quad-core for the price of a tri-core (I wasn't!).
Same here, but an Athlon II X4 620. Built a new machine based upon the 2nd gen Ryzen and slightly upgraded the old machine with a used, $30 AMD Phenom II X4 840.
Also, some of the Phenom II X3s were X4s, and they could be unlocked on certain motherboards. Similarly, some of the Phenom II X4s could become X6s.
My family computer was an Acer running an Athlon 64 X2 4450E
I was using it for the Vista preinstalled games
Performance left on the table
I got lucky but my phenom x3 had stability issues when I did that
@@Fender178 Couldn't you also unlock it to be a 5-core if one of the two extra cores didn't work, or am I mistaken?
I remember back in the 90s when people called AMD “another micro division” and said they were just Intel copycats!
Look who's laughing now
Well, at the time they quite literally were haha
So underrated.
They weren't wrong in that era.
In my country (especially among those who work at Intel), they call them "American McDonald's", and they'd laugh whenever "American McDonald's" released a new chip back then.
It seems being a ripoff is not always a bad thing… the important thing is innovating after you find your feet.
AMD, Huawei, Xiaomi, Nvidia… the list goes on and on…
Think of the oreo!
Almost as if government patents didn't help with innovation...
"Building out your company by shamelessly ripping off someone else" is such a common thing in manufacturing that both America and China got into industrial production by doing it.
The Huawei modem provided by my ISP is the bane of my existence! The thing keeps dropping out in the middle of games/downloads! (And it takes half an hour to reset!)
Compaq
Great overview, Linus. I love those kind of "tech history" videos a lot, having participated in computer technology upgrades for over 25 years now. Thank you!
I always loved AMD, because of its pricing methods, it was significantly cheaper than Intel. It was the platform of the adventurers (with empty pockets :D ).
They are more affordable when they can't compete otherwise :) For example, in the early 2000s AMD released the Athlon XP 3200, which was much more expensive than Intel's offerings at the time because it was the best of the best.
@@Your_username_ I'm not sure about the pricing around that time, but I remember I bought my Athlon XP 1600 dirt cheap from a local vendor :) . I loved that CPU, I only had GPU performance issues around that time :)
It's true, but there were exceptions where some AMD processors were more expensive than contemporary Intel processors.
But I think right now Intel chips are cheaper than AMD with better performance, right?
@@henrypowell3496 Depends on where you live. AMD is usually better price/performance
Why am I not surprised that he left out the fact that after AMD came out with the Athlon 64, Intel had to license it? So now both companies can only function because they have licensing deals with each other: one from Intel for x86, and one from AMD for x64.
😁
I don’t know. Why would you be surprised? He probably just forgot to add it, no?
Are you suggesting some ulterior motive for leaving that point out?
I believe it was mentioned in an earlier Intel CPU history recap video.
As an AMD fanboy I can accurately identify OP as an AMD fanboy
Probably because he's said it in multiple videos before
Screw the CPU history, why isn't anyone talking about the fact that AMD used to be green??
AMD bought the OG Team Red, ATI. Nvidia and even 3dfx's Voodoo graphics were using green color schemes back in the day. They couldn't keep using green, so they just went with ATI's red branding.
@@alexisrivera200xable good choice ^^
This whole video was like intel vs amd having an anime showdown.
Minus the power of friendship and talk no Jutsu...
AMD to Intel: We are basically the same, bro.
Intel: You're weak.
AMD: I'm you.
Could've mentioned that the K6 was based on a design by NexGen, which AMD had acquired prior and which made for a strong basis.
IIRC, early K6s had a special CPUID easter egg where they would return the string "NexGenAMD" (or something similar that fit) instead of "AuthenticAMD" if you put a special value in one of the registers. Let's see, from memory there are several CPUID strings for Intel compatibles: "GenuineIntel", "AuthenticAMD", "CyrixInstead", "CentaurHauls", and then the no-fun, bland "UMC UMC UMC ". Here's a more complete list: en.wikipedia.org/wiki/CPUID
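For the curious, those 12-character vendor strings come out of CPUID leaf 0 packed as four little-endian bytes in each of EBX, EDX, ECX, in that order. A small sketch decoding the well-known published register values for Intel's string, just to show the byte order:

```python
import struct

def vendor_string(ebx: int, edx: int, ecx: int) -> str:
    """CPUID leaf 0 packs the 12-byte vendor ID little-endian
    across EBX, EDX, ECX (note: EDX comes before ECX)."""
    return struct.pack("<III", ebx, edx, ecx).decode("ascii")

# The well-known register values that spell out Intel's vendor ID
print(vendor_string(0x756E6547, 0x49656E69, 0x6C65746E))  # GenuineIntel
```

The odd EBX/EDX/ECX ordering is exactly why the string looks scrambled if you dump the registers in the usual EAX/EBX/ECX/EDX order.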
Came looking for this comment. The K7 was similarly done by a new team acquired from DEC, which brought over technology from the Alpha architecture and introduced DDR RAM to x86.
Once upon a time, there were many CPU makers.
Now there are only a few left, and even fewer that make high-end chips.
AMD is one of the few giants who managed to stand the test of time.
Isn't it literally just Intel, AMD and VIA that are still around with proper x86-64 licenses?
Standing the test of time really sells their struggle short. AMD was on the brink of going under multiple times over the decades. They created Global Foundries to compete with Intel's own fabs but were left with no choice but to spin it off, because it nearly bankrupted them. A missed opportunity given the current semiconductor situation.
When I was much younger, I remember the x86 camp had many players: Intel, AMD, Cyrix, IDT WinChip. Then came the Transmeta Crusoe on the mobile platform many years later, still x86. ARM was puny back then. The x86-64 license came from AMD much, much later.
@@fleurdewin7958 What DID happen to Cyrix?
@@yuminago Now owned by VIA
Beginning to doubt AMD was known as Team Red before 2005... all their boxes seem green. But that was before they took over ATI, which was red, IIRC.
My AMD Phenom II X6, which came out around 2010, had a black and green box and a green AMD logo on it. The first AMD CPUs I saw with a red box and a white or red AMD logo were the FX lineup.
@@wolfemancola And Ryzen is orange and white. It's only their GPUs that are marketed as red.
In high school in the mid-90s, I finally convinced my dad to buy us a PC. It was the AST PC sold by Walmart (I didn't know any better; I was a teen in a podunk town with no real tech-savvy friends). It was labeled as a K6 586/75. Upon startup, the BIOS listed the chip as a 486/66... and so began my distrust of Walmart.
You can trusssst Mart-law 🐍
Hence why I only do business with Omega Mart.
@@dakoderii4221 I found the funny
It was not uncommon for the Am5x86 to show up as a 486-66. I have 3(?) of them and all do that. They ran at an external multiplier of 2x and then internally ran at 4x, so the motherboard saw them as a 486DX2-66. Some later boards could tell it was a 5x86 and showed the proper CPU.
The Am486 was my first processor and made me an AMD fan ever since… never had an Intel machine until my first laptop in 2017.
That's crazy.
@@DaDARKPass yeah man took a lot of L’s building only AMD (and at the time ATI) computers.
Your wallet probably thanked you in the long run
@@amirpourghoureiyan1637 Oh, it was most certainly my justification. I always said "when I grow up" I would build an Intel system for about $4k-$5k. Luckily I don't have to do that; I can run AMD and finally be winning for once lol
@@JustCallMePCra lol, your patience paid off; seeing the old price tags definitely makes you wonder where the time's gone!
“There’s no law that says a fake can’t surpass the original.”
- Shirou Emiya
Hey, you just forgot AMD's little venture into the ARM world for their server lineup (and it was bad).
AMD plans to make ARM CPUs again, if it is feasible.
If I remember correctly, they sold the ARM division to Qualcomm. AMD's ARM CPUs were the thing that today we know as Snapdragon.
@@SDogo No, you're wrong. It was AMD's embedded GPU business, called Imageon, which was sold to Qualcomm and is now known as the Adreno GPUs.
@@atisbasak damn new facts everyday
@@atisbasak Damn... this will make the comparison between the Adreno in Snapdragon and the AMD GPU in the upcoming Exynos even more interesting.
5:19 "It was the start of a long decline for Team Red." Look over your left shoulder Linus. It's Team Green.
My very first PC was based on a Duron @750Mhz and since then all my computers were built around AMD CPUs, usually the entry level models. This brings back so many memories!
It still baffles me how slowly we went from 1 thread to 4 and how fast we went from 4 to 12 or 16.
Reminds me how 480p/576p (PAL) was pretty much the maximum resolution for anything TV and consumer-media related, like DVDs, until 2006, and then in the space of just 10 years we went from that to talk of 8K everywhere!
Even on PC it wasn't much better, with 800×600 and 1024×768 being the most common resolutions until the mid-2000s!!
@@you2be839 thats crazy 🤯
That happened when CPU makers discovered they couldn't make huge steps in clock speed anymore past about 3 to 4GHz, so they had to come up with something. New, more efficient CPU architectures don't come overnight, so they started adding cores and threads to increase performance. Stuff like Hyper-Threading and the Pentium D (yuck!) all came after Intel hit a silicon speed limit with the Pentium 4.
That's mostly because AMD went into "hibernation" for a couple of years and Intel did not feel the need nor pressure to up the core count, since the competition was behind anyway. In other words, Intel ripped us all off.
I remember using an Athlon chip in 2005 and then a Sempron chip when they first came out in 2006. The funny thing is that although the Sempron was a budget CPU, the changes in it made it better for rendering 3D graphics than the much better Athlon chip.
AMD Athlon was in the computer I built to take to college. Still rocking AMD today.
very stupid :D
you probably didn't even finish that college
@@Daniel-zr4pk Pretty sure you’ve never even been to college since you’re such a childish lad
One of the most interesting videos EVER, though much of the info about the Athlon and Duron wasn't so precise. And thank you for activating subtitles. Hugs from Italy.
I remember building a K6-2 machine. Back then it was great: if you were in the tech world, you would get CPUs given out in care packages when you went to events.
1997: a K6-2 233MHz (which instead of 3.5 × 66MHz I got working at 3 × 83MHz, i.e. 250MHz), a 3dfx Voodoo card, and the original Tomb Raider. The jump from the previous 486DX2 66MHz was insane and then some: from Doom-like games to 3dfx-accelerated Tomb Raider. I had half a dozen friends watching me play Tomb Raider every day. I remember it like it was yesterday.
I loved my Athlon Thunderbird. I even cracked a corner of the core and it never quit working.
I have had cracked corners as well. I only learned last year that the active portion of the silicon is in the very bottom 1%; the top is just extra.
I took one diode off when I delidded my 64 X2 to try to overclock it more, and that baby worked fine. Guess I got lucky. Good times.
Man, I remember hearing about the FX years ago when I was running a Phenom 2 and being psyched about it. Those things were impressive for like a year and ran hot as balls. Glad Ryzen didn't keep the trend going.
but Ryzen runs hotter
LoL, I'm using the FX-9590 with water cooling, and that thing hits 58°C while playing games, so pretty much the same as Intel nowadays.
Yeah, but I run the Ryzen 7 5800X on water and it runs very toasty, in the 60s under normal gaming loads and in the high 80s under full load, because I also do work inside Blender, Unity and Substance Painter. (Yes, I'm also aware this one is the toastiest of the Ryzen 5000 lineup.)
@@Rena_Skuxy Did you account for the temperature sensors being miscalibrated? Some motherboards adjust for that, others don't. Have you tuned voltages in the BIOS? I was able to drop my 5600x by 0.2v and it was still stable at stock everything. Again, some motherboards just slam power through the chips that they don't need to. Edit: Load temps went down around 20-25C from a 0.2v drop.
@@MJ-uk6lu No they don’t?
I'm from Malaysia 🇲🇾 and currently work at an AMD factory. Glad to be part of the family.
The K5 was actually an Am29000 (from that RISC family that ended up in laser printers) with an x86 instruction decoder.
Been on the AMD kick since the K6-2 with 3DNow!. I've been with them throughout all their ups and downs, and it's nice to see how they've been doing since the introduction of Ryzen. I was amazingly impressed with its first generation, and its latest iteration has been great too.
I was running a Phenom II x6 up until 2 years ago. It was an amazing chip and it served me well.
still got my Phenom II x4 to this day. Serves me for over 10 years now.
I remember the old "Barton" chips, I think that's what they were, that you could hard-OC with a piece of tape and a pencil: put the tape on the pin side of the CPU, draw a line of a specific length on it with the pencil, then remove the tape. Was interesting...
I had a Barton 2500+, what was yours?
Phenom II wasn't the start of the decline; it was the peak in terms of their performance. My old 1090T, which I bought for less than half the price of the equivalent i7, was a hell of a powerhouse for computing tasks like encoding and rendering, and it lasted me 9 years of continued use without performance ever being an issue.
Yup, it was a monster for its price. Had the same CPU, passed it to my younger brother in 2016, he used it until the beginning of this year, and now it's in my dad's office being a fairly decent office desktop with 8GB of RAM and an SSD.
Shit, my 64 X2 lasted until 2014, and I built it when it came out.
I am still running my 1090T for gaming and work, got it in 2011.
I'm still using my Phenom II X4 955T, bought in 2010. I got it with 16GB of RAM for just over $500, and sure, the HD 5700 card isn't running Crysis, but I'm really too old for that anyway. The only upgrade is an SSD.
Phenom II was, in my opinion, an absolute success, whereas the FX series was an absolute conundrum...
Same here, bought my 1090T in 2011 and it's still in my pc. The GPU broke and due to the current market I'm waiting. But the 1090T is still going strong after 10 years. Especially with encoding (1080p).
Dude, when I saw this video with the K6-2 image, it just brought back many memories; the most important is that I met my wife on the internet using a PC with this chip back in 1998. That was 23 years ago. Time flies. Now I'm a happy Ryzen owner :)
Cool. Feels like it was important to mention that the Opteron, waaaay back in 2003, was the first iteration of what we now think of as a 64-bit PC processor. Well... unless tomorrow's video is all about Itanium... :-)
A bit of information is missing about the AMD K6 at the beginning: it's based on the Nx686 from NexGen, after AMD purchased them. It was "just" made Socket 7 compatible.
And also the AMD K6-III, with L2 cache in the CPU, making it the first consumer CPU with L3 cache (on the motherboard).
I've owned a K6-2, Duron, Thunderbird, Athlon XP, Sempron, Opteron, Phenom II and now Ryzen CPUs over the last 20 years. Except for the recent Ryzen purchase, it's been all the gaming value for me. And with the 5600X, it's still a pretty good value.
no bulldozer?
@@mohamadyusoff9091 Sadly it was so bad for gaming that the Phenom II was almost as good. Went with a used Intel Haswell during that time, for the first time since rocking a Pentium MMX.
1:40 That's my printer there, down in the corner with the missing plastic cover... an HP LaserJet 6P/MP from 2007... It was a great printer back in the day, and expensive as hell.
"Athlon, Duron, Opteron, Sempron, Turion"
They forgot one, the AMD Onion
It has layers
It's the next generation: a CPU powerful enough to overshadow the best by orders of magnitude. More cores than ever. Superior.
@@awesomefacepalm which can make you cry if you peel
That's because they haven't had 3D stacking yet.
Soon... soon.
Onion goes to Intel
Watching this 2 years later when the first Zen 5 products just launched.
I had quite a few AMDs until I got an X3 one with three cores, then went Intel for 2 generations, but now I'm back to Zen...
I know the feeling well. Been with AMD for a long time, then went Intel in 2014 with Haswell, and it did me proud; that's now my secondary PC. Just last month I went back to Team Red, and it put a big smile on my face building it.
4:40 I actually own an Opteron-based system: a dual-core AMD Opteron 165 with an Nvidia 8600 GT graphics card. The machine is from 2005 and I currently have Windows XP and Windows Vista SP2 on it.
It's a wonderful video. But I would've loved to see a mention of how AMD's 64-bit architecture ended up beating Intel's Itanium on the market, to the point that Intel adopted AMD's x86_64 instead.
My first PC was a 486. I delivered papers for about 2 years to buy that thing: 33MHz/100MHz with the turbo button on, 4MB of RAM, running Norton Commander :) Later in life I also had an Athlon II and a Phenom II 1100T Black that I rocked all the way up to when Ryzen came out, and I've been on my second Ryzen chip for a couple of years now. I have had a good time with AMD through the years.
I like how AMD went from being green to red as they integrated ATI. Also RIP ATI :(
6:28
The "A" series was the Fusion concept: a fair GPU (great for an on-CPU chip), though they did update the CPU cores to Bulldozer.
The top A10 and A12 are STILL some of the best video on a CPU chip, easily handing anything Intel ever offered its head and killing the bottom-end discrete GPU market.
The Athlon 64 X2 was the first CPU I purchased by itself. Dual core was revolutionary.
Ah, nostalgia. You know, I ran a Phenom II and an FX chip for almost ten years and they worked great, even in modern applications with newer graphics cards. I finally refurbed and sold them because of the hardware supply shortage; damn good value, even if they weren't the market leaders at the time. I'd never seen a PC hold any value after a decade and change until now.
AMD makes a crazy number of CPUs. My first CPU was a Bulldozer, then I upgraded to Ryzen.
I've never had an Intel PC. Go Team Red!
BlueulB
I wanted to wait for AM5 to get AMD, but it looks like AMD got greedy and became so expensive. So I will stay with Intel.
Had only a short dip into the blue, back in the fire now. Team Red is so much better; still working with my 1st-gen Ryzen, and it still goes strong.
@@raychii7361 greedy?
Mate, there's a silicon shortage going on right now, y'know.
People like us need to get paid too.
@@prateekkarn9277 AMD CEO Lisa Su said the CPU shortage has caused the company to prioritize higher-end commercial CPUs and gaming processors over lower-end PC chips.
Nicely done, and oh how many of these AMD chip versions have I lived through! Nice days now, with more power to the user, and RAM that can keep up.
I had almost every one of these, from the 486 up to the AMD Phenom, including the original Athlon slot processor, which shortly after exploded due to a power supply issue. I probably bought a cheap one. 😂 Right now I'm running Team Blue though.
You should switch back to Team Red. More affordable, more consumer friendly, because you don't always have to buy a new motherboard. And let's not forget that current AMD CPUs either walk the Intel ones or are equivalent.
@@ChronicleDrew At the time I built it, for the cost, Intel was the best performance per dollar. When I go to build another one I will compare, as I usually do.
@@ChronicleDrew Yeah, but if you go 5000 series, you're probably going to have to buy a new motherboard afterwards. So please stop using this as a valid argument, because you can give people a false impression about upgradability with this statement. Also, if you buy a cheap motherboard, you're not even sure it will support the upcoming chip even if it is the same socket. In that case your best bet is a custom firmware/BIOS upgrade if the manufacturer didn't make one (and I would just recommend not underspending on a motherboard for this reason alone). Really, stop using this as a valid argument; it doesn't hold up anymore in this day and age (not like it did some 15+ years ago).
If you are running a good Intel-based PC right now, it's probably not even worth upgrading to the 5000 series because of the upcoming Zen 4 (which most likely sports a new socket). If you already have a beast of an Intel machine, you're definitely not going to notice anything in most games except an inflated FPS number. Maybe in multithreaded workloads, but most people only claim they need this so they can justify overspending on a purchase they didn't truly need.
My very first custom rig had a 1.1GHz Duron processor. Like the rest of that build, it wasn't the best, not by a long shot, but it was mine, and it provided me with quite a bit of fun, as well as many lessons on what not to do in future builds. Good times.
I think I've owned at least one of nearly every AMD desktop generation since we crossed the 1GHz line (maybe not the Duron).
The first few I built myself were Athlon XPs (I might have skipped a few Bulldozers, but my 1100T was still faster than them in single-threaded).
I remember my first PC; thought I had the world with an AMD K6 with MMX, especially when I got the 3dfx Voodoo card for it too. My mind was blown! lol. It seemed a whole world away from the Amiga 1200 (with a Blizzard 030/50 accelerator board) that I had before the PC.
I love how after 15 years of AMD failing they just decided to make CPUs on a new architecture and annihilate Intel.
"New architecture"? Zen is in fact much the same as Core, or AMD's version of Core. Compare Zen 1 with Sandy Bridge and they're very similar in the diagrams.
It was a drawn out "Wait for it..."
Their worst mistake was Bulldozer, a design based on the same principles as the highly clocked Pentium 4 NetBurst architecture, which they had just destroyed with the Athlon 64. They literally made the worst possible architecture choice they could have made at the time.
@sbcontt YT Nvidia has not acquired ARM, and it might not be able to. All countries except the US want to stop the deal.
@@Mr.Morden I heard that the Pentium eventually overtook the Athlon before Intel launched their Core series. Part of it was due to aggressive marketing to OEMs, but I also think the Pentium scaled better with higher clock speeds later in its product cycle, or something.
3:21 - Now that is one hell of a flexible motherboard! AGP, PCI, ISA, 2x IDE, floppy, 2x serial, parallel, AT and ATX power, and TWO different types of RAM! Everything you could want in a retro machine except USB (probably because it predates it).
*sigh* Still rocking the old FX 8350... hot and still somehow not dead.
I am too, but as a secondary PC.
Same here
Cries in FX 5600
My childhood PC was an Athlon 64 X2. It was pretty slow in my memory, because it was the "family PC", which means it was full of viruses. But I began my journey into computer hardware there and performed my first Windows reinstall on that PC. It ran great afterwards. I spent many hours in GTA: SA on that machine, and it's always gonna be a part of me, even though it died a few years ago.
I feel so old after watching this
My first computer was a Tandy PC-1000, running an AMD-manufactured Intel 8086-2.
From thieves to the morally superior business; how times change. It's nice to see a corporation get better rather than sink lower for once.
Wonderful recap. This video will outlive me, as it provides future enthusiasts the proper foundation on Team Red vs. Blue. It will most likely be mandatory viewing a few decades from now. Cheers!
I feel that Linus should have mentioned that Intel playing dirty with OEMs highly contributed to AMD's decline in the mid-2000s.
The sponsor's website's cursor gimmick is so cool!!
I remember salvaging older laptops and replacing the Intel P2/P3 CPUs with AMD variants; it was a great way to get more use out of a 3-4 year old laptop. Neither company would be where it is today without the other. I hope they continue to innovate and challenge each other for another few decades.
My A10-7850K is still strong in 2021.
"It might be time to retire 😅"
New models are very shiny.
i have one aswell in my tv room, it is paired with an r7 240 in dual graphics mode, 2400mhz ram
Less than 2000 points in Cinebench... nearly 70°C at idle at stock bclk... "Still strong", ahem.
@@kingeling lol, I wouldn't say strong, but it still works as a basic computer or as a light gaming/emulator PC. Definitely no brand-new Tomb Raider being played on it or anything.
@@kingeling
Bless you. That a good cough 😷 . LoL
Yeah, I have it water-cooled, with an RX 580.
I'll keep the GPU but definitely build a new system. (If I have no use for the old one, I'll just donate it to a family that needs it.)
I've had the Pentium 500, Athlon 64, Athlon X2, Core Duo, Core 2 Duo, Phenom X4 & X6... Still think AMD is way better in price, performance, and innovation, pushing the limits.
First to 1GHz, 64-bit, multi-core... etc.
They weren't wrong betting on multi-core; everything made sense until they found out people are too lazy to rewrite everything to run on more cores. Like how the 32-bit transition to 64-bit was slow af, which is why we still have two "Program Files" folders in Windows.
They're just ahead of their time.
At affordable prices.
Luckily Intel licensed the x86 architecture for an unlimited time...
We wouldn't have gotten Ryzen if not for that...
In turn, Intel can use the x86-64 architecture from AMD.
We wouldn't have Intel CPUs if not for AMD either.
Intel literally couldn't figure out how to make x64 work, while AMD had already figured it out and was already selling products based on it. So Intel had to license x64 from AMD.
Basically, Intel holds the x86 license now, while AMD holds the x64 license.
Intel : give me back my x86 license
Amd : ok, but no more x64
They had Itanium, purebred 64-bit, but it was not for normal consumers and is basically a dead ecosystem.
And I've been there almost all the way: starting with the Am386SX, briefly passing by a Cyrix MII, then an Athlon Thunderbird 1300, then a Sempron 2800+, then an FX-8350, and now a Ryzen 5 3600. It's been a good run :)
Ah, the good old days when AMD CPUs could actually explode ^^
the good old days when we could cook pasta on a CPU... I think you can still do that today on a top Intel CPU.
Still have a Phenom II X6 in my gaming computer. It's doing a good job even on recent games and workloads.
When you're so early you can't think of something funny to comment
Bruh
True
I even forgot that I still have an FX-8320 that I still use regularly...
I’m joke, let me early a think
the linus pulseway commercial before a ltt video is always priceless!!!!
Amd's new performance uplift per zeneration is impressive
nice pun
Not really; it's more or less half of what it should be per year/generation.
@@rattlehead999 Ohh, so what are you gonna do?
Go to Intel, haha 😂, with their 2-5% performance uplift per decade?
@@theigpugamer They did that because they didn't have competition.
Both AMD and Intel have been crap since 2008 and have been milking us, either both at the same time or not.
That's why ARM CPUs caught up to x86-64 CPUs.
@@rattlehead999 You're right, but I don't care about billion-dollar companies. I was just making a pun. Still, this battle between Intel and AMD makes me happy, as we consumers get more.
2:37, ah, the K5-PR100, the CPU in my first ever PC back in 1997... Good times!
Started with a triple-core AMD Athlon II X3, upgraded that PC to an Athlon II X4 later. The next PC had an 8-core FX-8350. My current PC had a Ryzen 2700X, and currently has a 3900X.
I am still using an Athlon X2 lmao. It's a potato.
Like the 10 years of GPUs video you guys did, I would like to see a video on all the Intel CPUs too. Should be interesting.
When you consider the fundamentals of processing capacity over speed, it makes sense that AMD is leading today.
Something you could have covered slightly better was the "deal". AMD licenses the x86 instruction set from Intel, which is what allows them to keep making x86-based CPUs. But, on the flip side, Intel licenses the x64 instruction set from AMD. I don't know the numbers, but I'm absolutely sure the license fees nullify each other. It's somewhat hilarious that either side could pull their contract and absolutely cripple the other.
Neither side can do that, actually.
Intel’s own friend betrayed them and still won
They were never friends; Intel was forced to give AMD the x86 license. IIRC, IBM wanted backup CPUs that weren't just from Intel, so Intel had to give AMD the license.
And after AMD got the x86 license, up to 2006 they beat Intel or were as good as them for cheaper; then they started losing more and more from 2006 to 2017, and they've been winning ever since.
I might be late to post on this, but you overlooked the Am486 DX4-100. It was one of the fastest 486 processors at the time, and I was one of the lucky few to have it. It actually blew all the Pentium processors out of the water until the Pentium 133.
Remember when Linus told us to buy Ryzen Gen 1 then laughed calling us "Poor Suckers" a year later?
Yup
Really? in which video he call us that?
Love your sponsor description: "No markup, meaning that you pay what you would at a normal store." 3080 starting retail price: $699. Redux: "$1099 seems like a normal store price."
You’d be very lucky to pay that for a 3070.
I understand that, but that isn't the point. He said it's the same as what you would pay in store, and if someplace like Best Buy had them in stock, they wouldn't be charging $400 over MSRP for anything.
Badly scripted video:
How can you tell if a channel is in bed with Intel?
They emphasize the negatives of AMD's old CPUs vs. Intel's MANY times, with a specific vibrant tone and expression, then rush past the AMD CPUs that kicked Intel in the balls for years, like the Athlon 64. They don't mention that AMD did x64 years before Intel, or multi-core CPUs, again ahead of Intel, until the Core 2 Duo when Intel finally had competition. They downplay all those years, then go back to emphasize the bad FX Bulldozer series while mentioning the "superiority" of greedy Intel and their quad-core CPUs... and finally they talk about Zen once, without mentioning Zen 2 and Zen 3, which have been kicking Intel to the ground for years now in performance, power consumption and sales. But for the bad FX we even get a sinking ship... yeah.
Nice script, guys!
LTT fanboys, you can attack me as much as you like and give me the BS excuses about them giving the blue team a few hard videos, but you don't know that they get paid every time they mention the word "Intel"! They promote Intel even by talking "a bit harshly" about them, or by not emphasizing AMD's superiority. His real character and enthusiasm only show in some unscripted videos where they build AMD servers. There you see an honest Linus, not a "business" in bed with Intel.
In general, people are ignorant AF about any sort of media.
Amazing video ! So much history explained.
I had a K6-III+ back in the day. It's hard to overstate just how amazing that chip was. To go from a 133MHz Pentium MMX to a chip that could do 600MHz+ was amazing… all without touching any other components. Well, besides moving jumpers around on the motherboard to set the bus speed, voltage, and clock multiplier.
I'm glad to be back on AMD with my Ryzen 9 5900X. I started my PC journey with the AMD 386 DX40, moved to a K6-2 and finally a K7, and was on Intel from then on, but then Ryzen happened. What a comeback!
My journey is a bit shorter, but I have had quite a few AMD based systems, and after a brief history with Intel, I am back on AMD.
1. HP Pavilion laptop (circa 2005), with the Sempron 3000+. 1.8 GHz single-core.
2. Acer Aspire laptop (2008) with the AMD Turion 64x2 processor. 1.9 GHz dual core.
3. First desktop build (2012) - AMD FX-8120
4. Desktop upgrade (2014) - Intel Core i7 4790K
5. Second desktop build (2019) - AMD Ryzen 9 3900X
I'm actually watching this on one of my lower end, older PCs, which has an old Phenom X4 835 in it. The computer was built by an OEM with an Athlon X2 215, but I decided to see what I could do with it for like $30 in eBay parts. I'm pleased that such an old platform still has the ability to do current computational jobs without much of a hitch.
If memory serves, the last AMD CPU that an Intel chipset supported was the AMD DX4-100, which kept a lot of those pre-Pentium motherboards in the game for a while. In those days there was a third option, the Cyrix 5x86, which was actually manufactured by IBM and sported better performance numbers, but was hampered by lack of support, needing downloaded patches to be compatible with most games.
Ahhhh, my first PC used that K6-2. 166MHz, 👌 mwah! Fell out of love with PCs during 'the dark years'; came back for the Ryzen 2600. The circle of life, so beautiful.
♫ Ryzen up! Straight to the top... ♫
My favorite was the K6-III+. These were very fast, low power, low heat, and very inexpensive. The original K6 & K6-2 chips ran very hot; one time, a K6-2 200MHz melted the cooling fan to the processor. It still continued to run, long after the fan was replaced. The K6-IIIs ran so cool that they were barely warm to the touch. I ran them with fans, but they really didn't need them. Also, in those days, many motherboards came with DIP switches instead of jumpers, so configuring & overclocking was pretty simple. 400s could run at 500 with no apparent problems.
That was the last time I even attempted overclocking, as a matter of policy. After dealing with instability after a couple of years of service, I decided it was more worth it to me for the board to last as long as possible. When the Athlon era came around, overclocking seemed to take an eventual toll on system components. It was more worth it to me to save up for better components. The same seemed to apply to Pentium systems of that era.
Nice trip down memory lane, thanks! I have owned or used a significant portion of the chips mentioned in this video. My first AMD chip was the 8088 clone in my IBM XT. Then a few years later I scored an Am386DX-33 that impressed the heck out of my 1993 self. Later I heard rumors about the awesome K6, and eventually got my hands on one. Very not bad, and it left me liking AMD enough to stick with them, for the most part. Heck, my 6-core Athlon lasted me 10 years until I replaced it with my current Ryzen 2700. Never cared much for their graphics cards, though.
My first PC build started with an FX-4100; then my YouTube career took off with an 8320E, because my motherboard couldn't go beyond 95W. That board and the two CPUs got me through nearly 7 years of gaming, school, and video and music production before my rebuild with a Ryzen 2600X and a stronger GPU and RAM.
I still have my Phenom II X6 1090T Black Edition, but with a dead motherboard :( I really want to find an AM3+ socket motherboard to run it again.
AMD's K5 was actually the 29000 CPU discussed earlier in the video, with an x86 decoder put on top.
The first PC I built myself, when I was 8 or 9, had an AMD Athlon 64 X2.