Thanks Jon, I really can't get enough of your content; it's a great presentation of the material as well. That pic of the old ATI driver disc brought back some memories.
Shortcomings in drivers are exactly why I stopped using ATI when the GeForce 2 came out. Nvidia was just way more stable and reliable. From the GeForce 2 until now, I have only been Nvidia, usually with an AMD processor. But I now own a Ryzen system with a Radeon card, my first one. And I love it. It's stable, barely uses power, and the card was affordable to buy without needing a bank loan. NOW in hindsight, I appreciate what ATI was doing and how difficult it was, and I am glad they're still making Nvidia sweat.
The 9700 Pro All-in-Wonder was my first high-end GPU, and it was awesome aside from driver issues. I remember at the time, in order to play Call of Duty you had to use a certain set of drivers, but those drivers broke other games, and so on. I should add, R300 really kicked Nvidia in the balls. The GeForce 4 series was eclipsed by most of the 9x00 lineup, and the FX 5x00 series was no good.
I think Nvidia's recent dominance is influencing people's memories of the past. ATI/AMD wasn't always an underdog. The Radeon 9000 series was much better than GeForce at the time. They were also typically first to support new DirectX versions, first with unified shaders, and many other things. From 2002 to 2012 I mostly used ATI/AMD, and apart from the classic 8800GT, Nvidia didn't have anything that really stood out as amazing. FX was a joke, and GeForce 6 and 7 were decent but nothing amazing. 8 was amazing, but 9 was just 8 again rebranded. Then the 200 to 700 series were ok I guess, but I stuck to the Radeon 7000 series. Then with the GTX 900 and especially 1000 series Nvidia really took off, but pre-2015 I mostly only used ATI/AMD.
This is definitely my favourite channel right now. I walked by AMD's corporate office near Toronto not too long ago and I was thinking about how I didn't really know their history with ATI, so I wanted to investigate further. But I forgot until I saw this video on my feed, and it has everything 🤝
I have an ATI Radeon HD 3850 that is still operating to this day; it was my first GPU, fantastic card. Wish I had waited for the HD 4850, but who doesn't wish they had waited a few months for the next generation haha.
The HD 3xxx and 4xxx were great hardware, but the drivers, especially for OpenGL, were not that great. At one point the 4xxx had OpenGL 3.3 support, only for the next driver version to cut some features again and downgrade to 3.2. 🤦♂
Another fascinating and intelligently presented mini documentary. A+++ Any chance of looking at doing one on the personal computer movement of the 80's in Japan? MSX, Sharp and NEC?
What makes it mini? Syndicated television dictated that documentaries should fit in neat 30-60-90 minute slots. It wasn't because those times accommodated telling a full story or even an in-depth look. This channel is a shining example of the contrary.
@@housemana :) 16 minutes long makes it mini. I was not referring to the amount of content and information, just the length. :) So yeah, another fantastic mini documentary from the channel that consistently delivers.
I remember having so much hope for the HD 2900 XT, only for it to be a complete dud that was incredibly hot and loud. It was a sad, sad time, but thankfully they turned things around, and the HD 3850 and HD 4850 were absolute steals back in the day.
I was running an HD 4850; it was a good card. I replaced that with a GeForce 6700 and haven't upgraded since then. I am looking to upgrade to Intel. What a turn of events.
I was introduced to ATI with the 9800 Pro. I was using a mid-range Nvidia 5000 series card (don't remember which) and thought it was great. Then my friend showed me his 9800 Pro, and I quickly switched. From then on I would switch back and forth between the brands.
Couple of inaccuracies here. The ATI Rage was the first consumer 3D display card that supported 32-bit output (required for pro-level animation software such as Maya). The Nvidia Riva TNT was only 24-bit at the time, and 3dfx were only 16-bit. It's kinda wrong to call the GeForce 2 the first GPU (I'd argue it was the GeForce 3). The original GeForce featured register combiners, which basically added a dot product as a multi-texture blending mode, which could provide per-pixel Phong shading (if you used one combination of logic gates). The GeForce 2 upped the ante by doubling the combiners and texture units available IIRC, but programmable in the modern sense it was not. The GF3 added basic ASM vertex and fragment shaders (the same chip I started developing for on the original Xbox). The ATI drivers being crap is a load of crap. They strictly adhered to the OpenGL and D3D specs to the letter; it was Nvidia that had tonnes of bugs (e.g. setting the shader source would also incorrectly compile the shader). It didn't help that the Nvidia drivers would allow Cg shaders to compile and execute just fine in their GLSL compiler, even though they were two distinct languages. A single slip-up by the developer would pass through Nvidia drivers unnoticed. Sigh. At the time when GeForce 3/4 was king, developers used those cards to develop their code (and so bugs in their code would go unnoticed). That wasn't much helped by Nvidia flooding developers with a tonne of non-compliant source code examples that were copy-pasted throughout games of the era. Games developed on ATI hardware would work everywhere (because the correct errors would be thrown at the correct time). Games developed on Nvidia would often only run on Nvidia (which is where this myth developed). Just the opinion of an old-school graphics dev who's worked on every platform from SGI machines up to the current-gen devices….
Not a bad video at all. You missed some important chapters, such as "Adreno", where ATI sold its mobile division to Qualcomm back in 2009, but I think you cannot do justice to AMD/ATI without mentioning the complete saviour of the company after Hector Ruiz, and that is Lisa Su.
Nvidia didn't kill ATI's success, CUDA did. If Nvidia hadn't locked it down, there's a very good chance they wouldn't have sought to be acquired by AMD (though it would probably have happened eventually.) OpenCL came too late and without sufficient performance to keep up with proprietary CUDA, and now that nvidia is creating proprietary hardware that relies on CUDA to function, it's gonna be a rough road back to competition for AMD in that regard. Glad to see a piece on their origins from you finally! ATI was always my go to for bang for your buck back in the 2000s, I didn't buy a new GPU from nvidia until 2020 in fact
Reminds me of the good old days. Please make one for SiS, this little known budget brand was popular here in south Asia as both ATI and NVidia were super expensive.
Imagine, if at any time during ATI's years, the large, visible tech company founded by three HKers could have actually established a significant presence... in HK.
I remember the Radeon HD 5000 series, and how they were pretty much superior to all of Nvidia's offerings. The first DX11 GPUs, and not only were they faster, they ran cooler and quieter too. I'd love to use Radeon graphics again, but unfortunately, due to CUDA's chokehold on the industry, I can't use it for professional work.
I spent the majority of my gaming years with ATI Radeon cards. Back in the day, as my first truly powerful 3D graphics card I had a Radeon 8500; it was a huge improvement over my previous S3 Savage 4 card. Later I upgraded to a Radeon 9800 Pro, which was quite a strong card in its time. Years later I had an HD 4870, and it was also great. I was a big fan of ATI, and I was proud to put the ATI sticker on my PC case. They produced great gaming cards, I had a lot of fun with them, great memories.
You forgot to mention the almighty 9700 PRO and 9800 PRO and XT; those cards really did belt the shit out of Nvidia at the time, by a significant amount. They were incredible cards. I remember playing Battlefield 1942 at 1920x1440 on my CRT with everything turned right up, and it ran like a train.
EDIT: haha wow there it is at 3:32 minute mark!! Half life 2 & ati card promotion poster! Wow, memories I remember half life 2 being marketed as having superior pc graphics shaders with ATI cards they even had a bundle at the game launch with a 2003-2004 era card. Back in the day I remember shopping for a motherboard with an AGP slot, only to have to return to PCI a few years later 😂 anyone remember that one card that only ran physics calculations?
ATI products are fantastic. They have had their ups and their downs, but they have always been solid performers. Over the last 30 years, I have only ever purchased a single nvidia product, the GTX 1080. Rest have always been ATI/AMD. Currently watching this on an AMD Radeon RX580 while gaming on my newer system with an AMD Radeon RX 6900XT.
Thanks to ATI not hiring me as a new grad in the 90s, I went to Silicon Valley and never looked back at them. Interestingly, tonight I'm back in Toronto for Thanksgiving dinner and staying a very short distance from where they interviewed me almost 30 years ago. Since they never bothered to tell me the result of my interview, I never bought any of their video cards afterwards.
Tessellation tech from the early 2000s was NOT abandoned, the hardware and technique remained in use, only it is more commonly known as bump mapping. It was supposed to work with nurbs surfaces, and some engines supported it (like unreal engine 2, if I remember right), but the graphics APIs didn't properly expose the functionality. And nvidia were very cozy with the direct3d devs at microsoft. For nearly a decade direct-x specs followed nvidia hardware, leaving ati with "bad drivers" after they pioneered programmable shaders. Arguably, unreal's nanite is closer to ati's tessellation examples from 20 years ago, than nvidia's newer tessellation implementation.
So in the end AMD and ATi got a second chance in 2010 when an Abu Dhabi investment firm bought in. AMD's fabs were separated and became GloFo: the fabs I mentioned in New York and in Dresden, together with the fab in Singapore, formerly Chartered Semiconductor.
AMD acquired ATi, then sold the mobile graphics division to Qualcomm. This is how AMD ended up with RADEON and Qualcomm with ADRENO, which is a scaled-down version of the Xenos chip from the X360. The scaling down was done in Finland by the ex-Bitboys team, which ATi had acquired a few years prior. The Finnish team basically rewrote all the functional blocks to make them lower power, so that they could run off a tiny battery in a handheld device like a cellphone. The last Adreno designed in Europe was something between the 205 and 300-something; Qualcomm moved the R&D to the States in 2012. The point of using Xenos as the basis for the "new" mobile IP was to leverage the existing compiler. The compiler is the single most complicated piece of the whole stack, so it was a risk-reducing strategy to make the new chip backward compatible.
I've always enjoyed my ATI/AMD products; sure, the GPUs aren't the fastest one can get, but they are competitive enough and age like fine wine, and their open source support is fantastic.
ATI and their drivers were such a huge issue. At work, I got a laptop refresh from Dell at the time and it had an ATI graphics card. I don't remember the specific issue, but it was something like an external monitor simply wouldn't work properly. I searched online and it was a known issue. I ended up getting a new laptop with Nvidia and it just worked with no problem. This was after AMD had purchased ATI, and I had my hopes up that they had recognized the history of driver performance and stability and had put some effort into it. Great video.
There used to be a lot of cases where laptops simply didn't use the GPU but instead the iGPU. Customer support was unable to help, and in every instance of it I found on online forums, no one had been able to fix it.
@@shadmansudipto7287 it was a while ago and was definitely a known issue. I don't recall the specifics anymore. Other GPU features worked fine that wouldn't have been the case if it was built in. At the time, I did play games on my work laptop and they were fine.
Nvidia has had lots of its own issues over the years, and AMD has often done some things better than Nvidia. Having used many different AMD GPUs for the past 20 years, as well as a few Nvidia ones, and also often helping friends with their PC purchases or PC troubleshooting, I've never had or seen a serious issue with either one. The most serious issues with GPUs I've ever had has been with (integrated) Intel graphics, though that was on a very old laptop. I should also note that Nvidia gives worse support for its GPUs after they get old (about 4+ years), as they seem to deliberately try to force you to upgrade at that point, whereas AMD's drivers have tended to work better after that 4 year mark.
@@syncmonism You get it. Many people who are discussing their experiences/issues have literally only had a few cards/rigs. Once one has built or helped build ~20 or so rigs, one gets a good grasp on just how delicate yet standardized the whole computer as a system really is.
Nothing, I mean nothing about the 9x00 Radeon? Why do you think AMD picked them up? ATI had a lot cooking in their kitchen, at the time. Way more than Nvidia did. And they were not ripping people off.
AMD: Wow, you guys have been really killing it, Nvidia is a name you can be proud of! We'd like to buy your company and expand it further! JENSEN: Sounds fantastic actually, I would still remain as CEO in some capacity, correct? AMD: No deal. Door slams in the background, followed by the screeching wheels of a car peeling out...
I think AMD's acquisition of ATI was a net negative for the industry. Nvidia seemed to thrive as AMD just hasn't focused on GPUs. This really slowed the Radeon brand. Yes, it's still a solid product, and AMD GPUs are very good, but it's obvious it's not a priority for AMD the way it is for Nvidia. AMD seemed to put more value in integrating ATI tech into APUs versus being on the bleeding edge of dedicated GPUs.
ATI were not the only tech in Toronto, Alias|Wavefront was pretty big in the late 90's and up the road, over the 'border' in French Montreal, there was Softimage 3D as well as Matrox. Toronto was a very attractive place for waves of refugees from Asia because they extended the city out and built out whole new districts for whomever was fresh off the boat. There wasn't this multicultural way of just slotting people wherever - such as in the UK - and these new communities, be it from Vietnam, Korea (or wherever else America bombed) were able to be welcomed into safe areas where culture could be kept and integrated into the vibrant, lovely city that Toronto is. Silicon Graphics played a bigger part other than just brains. Often share prices were tied to expectations of SGI product launches and future roadmap. There were some financial scandals when promises made were not delivered. Some markets for 3D were bigger in the imaginations of the shareholder brochure writers than reality. Toronto also had the benefit of CBC - a better funded version of the BBC for Canadians. Due to CBC being there with their ridiculous budgets, a veritable cluster of film, VFX and other post production companies existed for film and TV. There was a huge talent pool and life was good. ATI were just a small player in all of this, most of it having morphed into a different landscape to what we have today. Please revisit this Toronto scene from 'back in the day' because it is a story that deserves to be told.
I've been working in Canada for a few years now, and some of the things these Toronto and Vancouver peeps say are just bewilderingly divorced from anything resembling reality.
@Garrus Vakarian All I am saying is that Toronto is a very nice place and that their attitude to housing people is very different to what I expected. They welcomed people and gave them an opportunity to thrive. Sure, some chip in an office PC has little to do with making documentaries about seal clubbing, but I see CBC as like an 'anchor store' in a shopping mall, where plenty of other businesses form a cluster. In pre-internet times you needed to be in the right place. Back then VFX, broadcasting and related tech was always in a cluster, whether in Europe or North America. I personally thought that OpenGL was a bit of a game changer, and you weren't going to find people that knew what that was outside of a few tech hubs.
As an American who worked for a Canadian company, the company LOVED those people fresh off the boat not because they welcomed diverse cultures or people. Oh no. They sought them out and hired them because they would work for cheap and could be easily intimidated and replaced. They had so many new immigrants working there, they didn't have enough workers who could speak English well enough to support American customers. They got a big client in Mexico and had zero employees who could speak Spanish. The immigrant employees were all from EMEA and Asia. It was a very strange place to work. But it was done this way because it was cheap labor. No other reason. No special love for immigrants or cultures. Cheap.
@@LatitudeSky I think they got the pick of the refugees though! In the UK we get the dregs after the Germans and Canadians get the first dibs! I have to say that I am extremely naive and very glad to receive your comment!
The Commodore PC10/PC1 graphics chip "AGA" (Tandy-like hi-res graphics, like CGA with 16 colors and/or double res) was an ATI design, if I get the dates right.
ATI is a legend, and their MASSIVE issues with drivers are legendary as well 😁 Nvidia capitalized on ATI's mistakes and ongoing issues, even though there were situations where ATI chips were more powerful than Nvidia's. What I find really surprising, as I wasn't aware of it, is that AMD wanted to buy Nvidia but "leather boss" Jensen wanted to run both companies as CEO. Looking at it several years later, Jensen tried to buy ARM the softer way and it still didn't work out the way he would have liked... the guy clearly has bad luck trying to combine Nvidia GPUs with a relevant CPU company. Who knows, maybe it's going to work the third time 😉
I recall having the multimedia version of the 3D Rage II card. It frequently produced some weird artifacts in Interstate '76 when 3D acceleration was turned on, and in a few other titles as well. It was expensive and it didn't work that well.
Technology company competition is like watching the greatest real sport of all time. But the sport isn't just idle, repetitive, artificial games; it's the very advancement of humanity into possibly a real sci-fi-like future one day.
I'll never forget the ATI 9700/9800 Pro vs the Nvidia GeForce FX 5800 Ultra wars. The ATI cards slaughtered Nvidia's hot and loud cards. th-cam.com/video/qN4O2IqlS84/w-d-xo.html
I used to work for ATI in the early 2000s, but I left just as the AMD acquisition was finalized.
I feel like this video was a bit negative, as ATI did hold the performance crown a few times against NVidia. Also, their drivers were terrible around the turn of the century, but after a big blow-up with Microsoft (which is a whole story unto itself) they totally overhauled their software process and managed to make very stable and high-performance drivers by around 2003.
I worked in the Radeon group, though I specialized in multimedia, including the TV Wonder and All-in-Wonder product lines. I'm pretty disappointed that these weren't mentioned, because these were absolute game-changers, especially before Windows had DirectPlay. The story of the end of All-in-Wonder (which I was actually involved in) is really interesting in itself, especially the story of the All-in-Wonder X1800. Not sure how much I could tell though. Am I still under NDA? 😂
Working in the Radeon department was intense. The competition between NVidia and ATI was extreme, and they were both developing at full-speed to optimize for a very small number of game-related metrics (mostly shader efficiency). I actually think their feud, focused almost exclusively on the relatively small niche of high-end gaming, was really detrimental to the PC industry, especially with respect to power consumption and battery life. For example, the ridiculous "Aero" interface of Windows Vista (and later versions) only existed because ATI had spent such an insane amount of money developing 3D shaders that they had to find a use for it outside of games (ATI basically wrote the spec for Aero and Microsoft put their name on it).
Anyway, there were a lot of incredible stories that came out of ATI even in the few short years I worked there. This video is a nice very high-level introduction, but wow, there's so much that happened there that most people will never know about. It really was an interesting company.
Surprising to see you here, that's so awesome! I'd kill to hear those stories, that era of consumer graphics was so interesting and it's always nice to learn more about ATI's history. Love your content by the way!
The retro PC gaming community needs this!!
I agree with you... I feel this video is a little more negative than it should be. I love AMD's drivers, and this is why I continue to buy video cards based on AMD GPUs. It's almost like this video favors NVIDIA over ATI/AMD, which doesn't make any sense to be honest.
@@dsalvis Agreed. There were many instances throughout history where ATI took the performance crown or had the superior technology compared to NVIDIA (the Radeon 9700 PRO, Radeon X800, Radeon X1900, and HD 5870 are what mainly come to mind). I feel this video doesn't really take them into consideration; they are definitely important parts of ATI's history and show that they have outdone NVIDIA in the past.
@@dsalvis You can argue whose drivers were the best all day long, but the proof was in the product itself. In my 25 years of computer gaming, I never had an Nvidia product fail on me. By the same token, I can count on one hand how many times AMD drivers were to blame for a GPU failure. ATI/Radeon seemed to think there was an expiration date for the products they produced, so they refused to support them, much like Apple, Horizon Hobby, and Best Buy branded products. As for the All-In-Wonder product line, it left its customers WONDERING why they broke so frequently or flat out failed to perform as described. I am a heavy PC bencher from the KingPin days, passionately building my own very high-end gaming workstations. Unfortunately, Radeon never made the cut.
Another point worth mentioning is that a fragment of ATI lives on inside Qualcomm as their Adreno GPUs; ATI sold its mobile division to Qualcomm back in 2009. Its designers worked on the GameCube and Xbox 360 ATI GPU designs.
Which explains why Adreno is an anagram of Radeon.
@@JackBandicootsBunker never noticed that 😮
It is likely no coincidence the current Radeon drivers are called Adrenalin. But I did not know Adreno was from ATI. Wow. What a heritage.
@@ReallyJTT What’s even more amusing is that prior to the sale to Qualcomm, some of ATi’s mobile products weren’t known as “Radeon”, but as Imageon. One of those GPUs even ended up inside the Zeebo console.
Another funny thing, unrelated to ATI, is that Apple's GPU architecture in the M1 is basically a heavily modified PowerVR mobile GPU (which explains why the memory for the tile table is way too small to actually fit all the tiles if 16x16 tiles are used, thus having to switch to direct mode).
ATI used to sell a variant of their cards with a TV tuner. I had an 8500 All-in-Wonder and it was amazing.
All-in-Wonder, loved that thing.
I miss those cards with TV out. Good old days.
All of their cards had some level of video capture, like composite in.
@@StephenGillie Right, but having the ability to change channels specific to the TV tuner is a big difference.
@@dlmac I used a $99 VCR *shrug*
A good historical video... albeit marked by a few notable omissions, like ATI's massive comeback, taking the performance and efficiency crown, during the 9000 series era.
Also occasional mistakes, such as claiming that Intel integrated a graphics unit into the CPU right after the i740 failure. In reality they integrated it into the chipset; CPU-integrated graphics only arrived in 2011 with Sandy Bridge.
Hells ya, I wish I still had my 9600 128MB around.
The 9000 series doing so well was partially because the GeForce 5000/FX cards they were competing against were absolutely awful. I do remember my 9800 Pro continuing to be competitive with the Nvidia 6000 series in some games though (I also remember it dropping a fairly big capacitor onto the bottom of my case but continuing to work like nothing happened, which apparently was not uncommon).
ATi had another massive comeback with the HD 4000 and 5000 series as well, but buyers were buying Nvidia out of habit, so it didn't cash in that big.
It's hard to believe ATI has been gone for over a decade now. I built so many systems with those cards. I still have an ATI card in one of my PCs to this day working great. Makes me feel old. 😆
Technically they aren't gone. Many of them still work for AMD after the buyout long ago. Basically AMD is ATI.
Yep. Got one here that's given a decade's service:
$ lspci | grep VGA
01:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Turks PRO [Radeon HD 6570/7570/8550]
Which ATI GPU is your PC using?
@@goblinphreak2132 Some of them even comment here..
I was an early Netscape employee in 1994. ATI was literally next door on Middlefield Rd. in Mountain View, CA, and we were growing so quickly we'd just take a PO next door to pick up new network hubs (back in the 10Mb days) and then 100Mb switches to meet the exponential demand the builds placed on the infrastructure.
Those were the days.
I lived in Sunnyvale in 1998 and I lived right around the corner from Netscape.
I learned to rollerblade by practising in the Netscape, Yahoo, and Pointcast parking lots on weekends. 😂
The Kubota 3d group was out in Marlborough, MA iirc. It was an annoying commute as I recall.
@@NotJustBikes Haha, faded glory. All these companies are mostly gone, for a long time. Who remembers Pointcast?
Man, just trying to imagine the Sliders universe where AMD successfully acquired Nvidia instead of ATI.
We'd have even more Cromags...
@@Komeuppance I refuse to acknowledge the Cromag years. Maybe it's for the best that AMD stuck with ATI in all possible universes 😄
I remember the ATI All-in-Wonder video cards. Cool product.
Lol can't believe it took me more than 15 years to find a Sliders reference on TH-cam.
@@goldman77700 There are dozens of us! Dozens!
This was largely correct, but I do remember three distinct times where ATI/AMD had the undisputed performance crown. The ATI 9700 PRO was the first of these that I owned.
9700 Pro was awesome, completely destroyed what Nvidia had to offer.
@@xeridea there was also Terascale.
Not to mention in the less-objective rankings - such as value or approachability, ATI/AMD truly gave Nv a run for their money. HD 4850 anyone? :)
Ati had 4 generations where they were the king, the Radeon 9000 series, the Radeon X series, the Radeon X1000 series and the Radeon HD4000 series.
I played Crysis for the 1st time with a Radeon X1900XTX, not an Nvidia one.
Yeah, not only did the 9700 Pro completely wreck the GeForce 4, it also wrecked the later GeForce FX series as well. And this was also when ATI fixed their drivers, meaning that the 8500 became not only faster than the GeForce 3 but also faster than the GeForce 4!!
I always admired the underdogs of computing technologies. I used to buy AMD chips, ATI graphics, LG monitors, and Toshiba laptops when they went south (today I took the SSD made by Kioxia).
Toshiba by any other name.
@Zakir Siddiqui Their heart is in the right place, but when buying equipment, especially computer equipment, it's best to stay as objective as possible and view the purchase in terms of quality, power/feature set, consistency/reliability in service life, and price, rather than whether or not the equipment's manufacturer is the "underdog".
Kioxia, ex-Toshiba, was one of the largest flash memory makers though.
They are definitely not the underdog.
They spun the flash memory part off due to other problems within Toshiba.
Kioxia (ex Toshiba) is perhaps the biggest flash memory maker in the world if you account for the OEM stuff. Hardly an underdog.
Now buy an Intel graphics card.
I had worked for ATI before. I was working on the front line, actually loading the bare circuit boards and using the machine to print the paste onto the boards. Then I loaded circuit chips into the machine for the robot arms to pick and place on specific areas of the bare boards.
Finally, the circuit boards were carried by special conveyor tracks into the industrial oven. The finished video cards came out the other end of the oven for cooling and inspection.
Ha! I worked as a contract sysadmin for a short spell in 1994/1995. I worked with that 3d graphics group. Kubota was going to lay them all off apparently and the team got wind of it. The manager was approached by ATI and told they wanted to build a team. He had one. They all resigned together.
Kubota would not let them have the premises that the team was in - it was rented and they had no other use for it. So my company was hired to set up their new office and then to move them back like six months later when Kubota relinquished the lease. They were a pretty intense group - very cool though. And they were excited to see their efforts more widely used.
ATI delivered some amazing cards throughout their existence, and I've become a bit of a collector. I started with the 9700/R300; that's the time I remember them really coming into the limelight as a true competitor, but they just went crazy with DirectX 9. The engineering that went into R500 was nuts, and honestly it was mostly because games weren't pushing the tech hard enough that the R500-based cards didn't look better than they did.
I got an X1950 XTX in a very cheap computer at a flea market. At first I didn't think it was anything special; it was a generation I kinda skipped, I had a GeForce 6800 iirc. But the more I researched it, the more it was clear to me: ATI put their very best into R500, and I'd love to see it dissected and pushed to its limits in various ways. ATI clearly scooped up some incredible talent early on; it's a shame they weren't bigger before they got bought.
Anyway great video! I always enjoy getting a more personal perspective of some of these people.
I had no idea ATI was Canadian! That early 2000's competition with Nvidia was great for consumers.
Then came CUDA cores and changed the market completely.
I bought the amazingly fast ATI Radeon 9700 Pro at the end of 2002 along with a copy of Unreal Tournament 2003 and I loved the experience. 2 years later I bought Half Life 2 and the 9700 Pro was still a beast. It played the game smoothly, maxed out with high frame rates. That was one of the greatest cards I have ever owned and it provided years of entertainment. I hope my current rtx 3060 ti is as useful as long as the 9700 Pro was because I love that card too.
Just want to mention many inaccuracies, which is not common in your videos. 3dfx was still launching products in Oct 2000! It went belly up later.
ATi was still far from top in 1998, though the early part of the video makes it look like it was close. The part on the 8500 is entirely correct.
Other inaccuracies I noted:
- Intel i740 was launched when the Voodoo 2 was top dog, 3dfx was very much the king then. Nvidia were starting to be a serious competitor.
- Intel continued to rely on 2D/3D add-in cards, which the Nvidia TNT variants dominated. S3 was also a player, but much smaller and Nvidia became top dog.
- Intel licensed chipsets/BUS during P3/P4 cycle, so SiS and VIA supplied chipsets with integrated graphics.
- By the time the P4 was dominating, Intel had more mature integrated graphics (passable). Nvidia and ATi launched IGPs and soon became leaders on integrated graphics.
- Intel was never very competitive on integrated graphics but cut out chipset licenses since Nehalem architecture, to have a monopoly.
ATi was also quite late to integrated graphics, only launching first products in 2002, after dominating the add-on market with Radeon 9000. Before then I don't really think they had the money to design a chipset. Even the first tries were using south bridges by VIA and other existing suppliers.
This video doesn't even mention 3dfx, despite them being a larger player in the late 90s than Nvidia. 3dfx's Glide protocol was superior to Nvidia's OpenGL, and made games like Starsiege: Tribes run like butter. The 3dfx Voodoo3 was a game changer, being that company's first 2d AND 3d video card. (The Voodoo2 would sit next to a 2d card, and have a cable running the video out from the 2d into a video in on the Voodoo2.)
Also no mention of the ATI All-In-Wonder cards that combined great 2d and 3d acceleration with great video input, capture, and playback. All of ATI's cards included some level of video input, making it easy to record live TV in the era before Tivo became common. This made their cards fun to own - you might be watching TV next to your PC and think "how can I watch TV on my bigger monitor?" and you'd usually just have to plug in a couple cables and install the capture software.
It did mention 3dfx in one sentence and then continued to mention that they were acquired by nvidia.
WAT? Glide was not superior to OpenGL; it was literally a cut-down version of OpenGL (the API developed by Silicon Graphics - the same company that made the graphics workstations used to animate Toy Story and Jurassic Park in the early 90s - featured memorably in the latter as "hey, I know Unix").
Glide on the Voodoo 2 supported 16-bit colour depth at an 800x600 resolution; if you had two cards in an SLI config, you could have 1024x768. SGIs supported 32-bit colour depth, hw-accelerated NURBS, hw overlays, and multi-texturing, all at resolutions of 1600x1200, years before Glide.
Nvidia ended up hiring a load of SGI's engineers AFTER SGI went bankrupt in the early-to-mid 2000s, and those engineers championed OpenGL pretty hard (thank god, otherwise there would literally be only Direct3D on Windows, and sod all 3D on macOS or Linux).
To call OpenGL an Nvidia API is misinformed at best, insulting at worst :(
An honourable mention must also go to 3Dlabs (who finally forced through the GL3 spec before going bankrupt), and to ATI for open sourcing Mantle (the basis of Vulkan, D3D12, and Apple's Metal).
Nvidia have not got a great track record wrt open standards (pushing CgFX instead of GLSL, or CUDA instead of OpenCL).
The Voodoo3 was the beginning of the end for 3dfx: 16MB, 16-bit colour, and it was horribly outclassed by the ATI Rage Fury at launch (32MB of memory, 32-bit colour depth, full Direct3D and OpenGL support). The Voodoo 4 and 5 were simply terrible and killed off the company (2 power connectors required, for half the performance of the competition).
@@robthebloke Glide was a completely custom job, nothing to do with OpenGL and not very similar at all. It was highly customized to the target hardware: when they added on-chip perspective division, the API had to be changed, because the input for Glide 1.x was done in screen coordinates, so they had to switch to clip coordinates. They never had a version of Glide that would have had a HW T&L engine; nVidia bought them before that happened (their money was running out anyway, so it wouldn't have made any difference). The separate memory for each TMU was also a bit of a fail from a programmer's point of view. I get why they did it, each TMU had exclusive access to its memory for more bandwidth, but a change to UMA or something else would again need API changes, because the abstraction was at so low a level that any hardware change required a rewrite of client applications. I liked the API, though.. it was very easy to work with despite all of the above.
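(Editor's note: for readers who never touched Glide, below is a minimal sketch, written from memory, of what that low-level screen-space abstraction looked like in a Glide 2.x-era C program. The GrVertex fields and entry points are as recalled from the 3dfx SDK, and the grGlideInit/grSstWinOpen and colour-combine setup is omitted, so treat it as illustrative rather than exact.)

#include <glide.h>  /* 3dfx Glide SDK header (2.x era assumed) */

/* Draw one triangle. The vertices are handed to the rasterizer already in
   SCREEN space (window pixels) with 1/W supplied by the application; there
   is no transform stage behind this API at all, which is the point the
   comment above is making. Init and colour-combine calls are omitted. */
void draw_demo_triangle(void)
{
    GrVertex a = {0}, b = {0}, c = {0};

    a.x = 320.0f; a.y = 100.0f;   /* positions in window pixels, not clip space */
    b.x = 160.0f; b.y = 380.0f;
    c.x = 480.0f; c.y = 380.0f;

    a.oow = b.oow = c.oow = 1.0f; /* 1/W, used for perspective-correct interpolation */

    a.r = a.g = a.b = 255.0f;     /* Glide colours are 0..255 floats */
    b.r = b.g = b.b = 255.0f;
    c.r = c.g = c.b = 255.0f;

    grDrawTriangle(&a, &b, &c);   /* hand the screen-space triangle to the hardware */
    grBufferSwap(1);              /* swap on the next vertical retrace */
}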
Growing up the 9800 XT was the hottest card money could buy. Thanks for the memories.
I started my Windows career on a Rage 3D, went to a Radeon X850 Pro (AGP x8), nowadays using an aging A6-5200 APU with Linux, funny how ATI always had my back no matter where I went in the computersphere hah.
I remember it took me years to stop referring to them as ATI, though this was compounded by the fact that I already mixed up the two companies... In my defence, both were three-letter acronyms that started with the letter 'A' and both used the colour red heavily in their branding.
Thank god for AMD/ATI; as a mostly competent competitor and occasional performance dominator, they really set back Nvidia's and Intel's monopolistic, anti-consumer designs on the market... even if it was out of necessity rather than charity. Not that they are angels, just that those who have it the worst in market share tend to try the hardest to come off as pro-consumer, as we are seeing now with Intel's most recent and most-likely-to-succeed entry into the GPU/compute market with Arc. I can tell those GPUs are far, far more potent than their current drivers allow, kinda like the Radeon 8500 that ended up kerb-stomping the GeForce 3 Ti (and 4 Ti!) after a few years.. just not right now, so we get Arc very cheap.
Makes me sad that they don't do any of that anymore
I had the original All-in-Wonder card. It had the Rage chip in it; later versions would introduce the Radeon. It was the first card to combine both a 2D card and a 3D accelerator. It also had a TV tuner. I was using my PC as a DVR before DVRs were really a thing. It was one of my favorite cards ever.
I'm so fortunate to find this video a mere day after it was uploaded. ATI was my favorite GPU company.. an underdog like AMD that could somehow occasionally manage to outperform the larger competitor. It's sort of fitting that AMD bought them; I just wish they could have kept their identity.
The current AMD Graphics offering still carries the ATI logo on the products.
ATI is still very much alive.
The ATI logo last appeared on the 5000 series GPUs in 2009; after that it's only been AMD. But of course, you could say AMD incorporated the "red" theme of ATI; before that they were green.
@@medallish There is an ATI logo on the back of my AMD Radeon RX 6800XT.
Please fact check before you text....
@@peterromano1911 I don't know what version of the 6800 XT you have, but the stock version from AMD only has an R pressed into it, presumable for Radeon. You can literally check it yourself, look at the 5870's backplate, and then compare it to the next generation 6970's backplate, 2009 was the last we saw the ATI logo.
@@medallish I bought my GPU directly from AMD.
@@peterromano1911 Then you would see there's literally no ATI logo to be spotted. The closest I could see was a label that says "ATI holdings LLC".
Asianometry was mentioned in the Dutch newspaper NRC yesterday.
The evolution of PC display is interesting. I was studying in the US at the time when IBM launched the PC in 1981 with two display cards: monochrome with higher resolution, and color with lower resolution. Businesses preferred the monochrome display to run their apps, but when Lotus came up with the 1-2-3 spreadsheet and its capability to display graphs, they were in a conundrum. If they wanted to show graphs they needed the color card and a color monitor, but that was bad for working with numbers. A company came up with a solution, inventing a monochrome card that could display graphics, and called it the Hercules card, costing around $300. It sold like hot cakes, and since Hercules did not patent its invention (if I remember correctly), soon imitators from Taiwan came up with Hercules-compatible cards costing only $100; they sold even better, wiping out the Hercules company in no time. As for color display cards, I still remember the Voodoo brand.
Yes! I had a Hercules as well!
He has done a video on 3DFx the maker of the Voodoo brand.
@@MeasureOnce I did not have the genuine one, but a compatible!
I worked there for almost nine years and left just before the merger/acquisition. It was always like a game of tag with Nvidia: ATI held the performance title, followed by Nvidia, and then back to ATI. It was an awesome experience and journey to work with so many brilliant people in such a setting. In the end though... if someone wants to buy a $1.x B company for $5.6B, who would say no? It was sad to see such an iconic Canadian company go. A loss for Canada, in my opinion.
The 9000 and HD series were legendary. I had one of the low-end 9000 cards, but it was actually a rebadged 7000 series so it sucked. I got burned buying the GeForce 9 series, which shortly after got its ass handed to it by the HD series, and then to add salt to the wounds my cards were the bump-gate ones and all died before 12 months :(
I had a 1650XT that suddenly just quit on me. One moment it was displaying game graphics, the next the screen was black. It was an ICE cool version that was slightly overclocked.
Great material.
Two addenda:
-Before Intel moved to integrating graphics inside CPUs, they first did so in the motherboard northbridge
-TruForm would come back years later as hardware tessellation became commonplace, ironically pushed hardest by Nvidia.
Thanks Jon, I really can't get enough of your content, it's a great presentation of the material as well, that pic of that old ATI driver disc brought back some memories.
Shortcomings in drivers are exactly why I stopped using ATI when the GeForce 2 came out. Nvidia was just way more stable and reliable. From the GeForce 2 until now, I have only used Nvidia, usually with an AMD processor. But I now own a Ryzen system with a Radeon card, my first one. And I love it. It's stable, barely uses power, and the card was affordable to buy without needing a bank loan. NOW, in hindsight, I appreciate what ATI was doing and how difficult it was, and I am glad they are still making Nvidia sweat.
9700Pro All-in-wonder was my first high end GPU, and it was awesome aside from driver issues. I remember at the time in order to play Call of Duty you had to use a certain set of drivers but those drivers broke other games and so on.
I should add, R300 really kicked nvidia in the balls. The Geforce 4 series was eclipsed by most of the 9x00 lineup and the FX 5x00 series was no good.
The integration of multimedia was the big thing of that time with video capture and TV tuners.
@@brodriguez11000 Indeed I was playing Xbox Live on my 2002 PC in a window, while chatting to my gaming friends in another.
The 9700 pro's dx9 capabilities were game changing. It crushed the 4600 ti. Both had the same MSRP of $399.
I think Nvidia's recent dominance is influencing people's memories of the past. ATI/AMD wasn't always an underdog. The Radeon 9000 series was much better than GeForce at the time. They were also typically first to support new DirectX versions, first with unified shaders, and many other things. From 2002 to 2012 I mostly used ATI/AMD, and apart from the classic 8800GT Nvidia didn't have anything that really stood out as amazing. FX was a joke, and GeForce 6 and 7 were decent but nothing amazing. 8 was amazing, but 9 was just 8 again rebranded. Then the 200 to 700 series were OK I guess, but I stuck to the Radeon 7000 series. Then with the GTX 900 and especially 1000 series Nvidia really took off, but pre-2015 I mostly only used ATI/AMD.
The Radeon 9700 dominated that season, until Nvidia responded of course. That cat-and-mouse game continued for years.
This is definitely my favourite channel right now. I walked by AMD's corporate office near Toronto not too long ago and was thinking about how I didn't really know their history with ATI, so I wanted to investigate further. But I forgot until I saw this video on my feed, and it has everything 🤝
I love the details of the video, and the fact that you included Carmack's letter is just fantastic; had fun reading it.
Thank you.
I forgot how Canadian ATI was. I remember a lot of those products!
I have an ATI Radeon HD3850 that is still operating to this day, was my first GPU, fantastic card. Wish I waited for the HD4850 but who doesn't wish they had waited a few months for the next generation haha.
HD 3xxx and 4xxx were great HW, but the drivers, especially OpenGL, were not that great. At one point the 4xxx had OpenGL 3.3 support, only to have the next driver version cut some features again and downgrade to 3.2. 🤦‍♂️
I lived in Toronto in the late 1990s. ATI was a big name in the IT scene in Toronto, and my first computer came with an ATI card.
As a geezer that was involved in the PC industry for years I love these history lessons.
Another fascinating and intelligently presented mini documentary. A+++
Any chance of looking at doing one on the personal computer movement of the 80's in Japan? MSX, Sharp and NEC?
what makes it mini?
Syndicated television dictated that documentaries should fit in neat 30/60/90-minute slots. It wasn't because those times accommodated telling a full story, or even an in-depth look. This channel is a shining example of the contrary.
@@housemana :) 16 minutes long makes it mini. I was not referring to the amount of content and information, just the length. :)
So yeah, another fantastic mini documentary from the channel that consistently delivers.
That was a great rundown on a company and its products, with which I had some familiarity but now understand a lot more thoroughly.
I remember having so much hope for the HD 2900 XT, only for it to be a complete dud that was incredibly hot and loud. It was a sad, sad time, but thankfully they turned things around and the HD 3850 and HD 4850 were absolute steals back in the day.
I was running a hd4850, it was a good card. I replaced that with a gforce 6700 and haven't upgraded since then. I am looking to upgrade to Intel. What a turn of events
I am still rocking an HD 7850
Not bad for its age
I had that 8500 card. I was blown away by it. It crushed RTCW and most other games of the time. It also had decent overclocking ability too.
I had an 8500. Really loved that card
Me too.
I was introduced to ATI with the 9800 Pro. I was using a mid-range Nvidia 5000 series card (don't remember which) and thought it was great. Then my friend showed me his 9800 Pro, and I quickly switched. From then on I would switch back and forth between the brands.
Couple of inaccuracies here. The ATI Rage was the first consumer 3D display card that supported 32-bit output (required for pro-level animation software such as Maya). The Nvidia Riva TNT was only 24-bit at the time, and 3dfx was only 16-bit.
It's kinda wrong to call the GeForce 2 the first GPU (I'd argue it was the GeForce 3). The original GeForce featured register combiners, which basically added a dot product as a multi-texture blending mode, which could provide per-pixel Phong shading (if you used one particular combination of logic gates). The GeForce 2 upped the ante by doubling the combiners and texture units available, IIRC, but programmable in the modern sense it was not. The GF3 added basic ASM vertex and fragment shaders (same chip I started developing for on the original Xbox).
The ATI drivers being crap is a load of crap. They strictly adhered to the OpenGL and D3D specs to the letter; it was Nvidia that had tonnes of bugs (e.g. setting a shader's source would also incorrectly compile the shader as well). It didn't help that the Nvidia drivers would allow Cg shaders to compile and execute just fine in their GLSL compiler, even though they were two distinct languages. A single slip-up by the developer would pass through Nvidia drivers unnoticed. Sigh.
At the time when the GeForce 3/4 was king, developers used those cards to develop their code (and so bugs in their code would go unnoticed). That wasn't much helped by Nvidia flooding developers with a tonne of non-compliant source code examples that were copy-pasted throughout games of the era. Games developed on ATI hardware would work everywhere (because the correct errors would be thrown at the correct time). Games developed on Nvidia would often only run on Nvidia (which is where this myth developed).
Just the opinion of an old-school graphics dev who's worked on every platform from SGI machines up to current-gen devices…
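For anyone who never touched that era of hardware, here is a rough sketch of the per-pixel dot-product lighting the register combiners mentioned above made possible ("dot3"-style shading). It expresses only the math, in plain C, under the assumption that the normal comes from a normal-map texel and the light vector is interpolated per pixel; it is not Nvidia's actual register-combiner API.

```c
typedef struct { float x, y, z; } Vec3;

/* The core operation the dot-product blend mode provided in fixed function:
   a three-component dot product evaluated for every pixel. */
static float dot3(Vec3 a, Vec3 b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

/* normal:    per-pixel normal decoded from a normal-map texel (roughly unit length)
   light_dir: light direction interpolated across the triangle (roughly unit length)
   Returns the diffuse term, clamped at zero much like the combiner output range. */
float per_pixel_diffuse(Vec3 normal, Vec3 light_dir) {
    float d = dot3(normal, light_dir);
    return d > 0.0f ? d : 0.0f;
}
```

Evaluating that expression once per pixel in fixed-function hardware is what made the original GeForce feel "programmable-ish"; the GeForce 3's ASM shaders then let developers write the whole expression themselves.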
That was the Matrox and 3dfx time.
You forgot S3.
Not a bad video at all. You missed some important chapters, such as "Adreno", where ATI sold its mobile division to Qualcomm back in 2009, but I think you cannot do justice to AMD/ATI without mentioning the complete saviour of the company after Hector Ruiz, and that is Lisa Su.
Nvidia didn't kill ATI's success, CUDA did. If Nvidia hadn't locked it down, there's a very good chance they wouldn't have sought to be acquired by AMD (though it would probably have happened eventually.) OpenCL came too late and without sufficient performance to keep up with proprietary CUDA, and now that nvidia is creating proprietary hardware that relies on CUDA to function, it's gonna be a rough road back to competition for AMD in that regard. Glad to see a piece on their origins from you finally! ATI was always my go to for bang for your buck back in the 2000s, I didn't buy a new GPU from nvidia until 2020 in fact
I think a video on S3 Graphics could add more to the history of the Looking back series
Reminds me of the good old days. Please make one for SiS, this little known budget brand was popular here in south Asia as both ATI and NVidia were super expensive.
Another very cool insight into the chip and tech industry dude. Thanks 👍
I love ATI. They made the graphics chips for the Nintendo Wii console.
ATI Also Did The Gamecube
Awesome vid. I had completely forgotten about ATI.
Imagine, if at any time during ATI's years, the large, visible tech company founded by three HKers could have actually established a significant presence... in HK.
Thankfully it didn’t, otherwise Communist China would have had a weapon against the West
I remember the Radeon HD 5000 series, and how they were pretty much superior to all of Nvidia's offerings. The first DX11 GPUs, and not only were they faster, they ran cooler and quieter too.
I'd love to use Radeon graphics again, but unfortunately, due to CUDA's chokehold on the industry, I can't use it for professional work.
I spent the majority of my gaming years with ATI Radeon cards. Back in the day, as my first truly powerful 3D graphics card I had a Radeon 8500, it was a huge improvement from my previous S3 Savage 4 card. Later I upgraded to a Radeon 9800 Pro, which was quite strong card in its time. Years later I had a HD 4870, it was also great. I was a big fan of ATI, I was proud to put the ATI sticker on my PC case. They produced great gaming cards, I had a lot of fun with them, great memories.
Matrox was another Canadian graphics card maker...even before ATI IIRC! (started with S-100 cards)
Welcome to asianometry oddware, where you discuss lithography that is odd, forgotten or obsolete...
Been to the Markham location many times.
I will always have a place in my heart for ATi.
You forgot to mention the almighty 9700PRO and 9800PRO and XT, those cards really did belt the shit out of nVidia at the time, by a significant amount.
They were incredible cards, i remember playing Battle Field 1942 at 1920x1440 on my CRT with everything turned right up and it ran like a train
Would have loved if you had talked about the HD series. A pinnacle in GPU design under the AMD brand. Still, nice video. Keep up the good work
EDIT: haha wow there it is at 3:32 minute mark!! Half life 2 & ati card promotion poster! Wow, memories
I remember half life 2 being marketed as having superior pc graphics shaders with ATI cards they even had a bundle at the game launch with a 2003-2004 era card. Back in the day I remember shopping for a motherboard with an AGP slot, only to have to return to PCI a few years later 😂 anyone remember that one card that only ran physics calculations?
Yeah, AGEIA PhysX cards.
Great in-depth report, thanks. What does it mean for AMD having acquired Xilinx? Would be great to have your views on this - thanks again
Interesting look back at ATI, great job!
I loved the ATI All-in-Wonder. I really miss it.
ATI products are fantastic. They have had their ups and their downs, but they have always been solid performers. Over the last 30 years, I have only ever purchased a single nvidia product, the GTX 1080. Rest have always been ATI/AMD. Currently watching this on an AMD Radeon RX580 while gaming on my newer system with an AMD Radeon RX 6900XT.
ATI also released a number of SoundBlaster compatible sound cards. I had an ATI Stereo F/X in my 386.
Buying creative would have been nice.
I think the ATI card was like half the price of a true SoundBlaster. Worked just the same as I remember. 🤷♂️
Thanks to ATI not hiring me as a new grad in the 90s, I went to Silicon Valley and never looked back at them. Interestingly, tonight I'm back in Toronto for Thanksgiving dinner, staying a very short distance from where they interviewed me almost 30 years ago. Since they never bothered to tell me the result of my interview, I never bought any of their video cards afterwards.
One of the founders is still in the area.
@@MetaView7 I hope he is not that fat short *Asian* guy who asked me business questions instead of electronics.
Tessellation tech from the early 2000s was NOT abandoned; the hardware and technique remained in use, only it is more commonly known as bump mapping. It was supposed to work with NURBS surfaces, and some engines supported it (like Unreal Engine 2, if I remember right), but the graphics APIs didn't properly expose the functionality.
And Nvidia was very cozy with the Direct3D devs at Microsoft. For nearly a decade DirectX specs followed Nvidia hardware, leaving ATI with "bad drivers" after they pioneered programmable shaders.
Arguably, Unreal's Nanite is closer to ATI's tessellation examples from 20 years ago than to Nvidia's newer tessellation implementation.
Sliders and Cromags you guys kill me. Great timing on update as usual. Go AMD!
7:10 - the logo looks cool for its time
My first GPU was an ATI X1650. Ran like a champ.
what a throw back! thanks for this
So in the end AMD and ATI got a second chance in 2010, when the Abu Dhabi investment firm bought in. AMD's fabs were separated and became GloFo: the fabs I mentioned in New York and in Dresden, together with the fab in Singapore, formerly Chartered Semiconductor.
Nice story again !
Did you cover the Matrox saga already ?
You missed that ATI sold off their mobile division to Qualcomm
AMD acquired ATI, then they sold the mobile graphics division to Qualcomm. This is how AMD ended up with RADEON and Qualcomm with ADRENO, which is a scaled-down version of the Xenos chip from the X360. The scaling down was done in Finland by the ex-Bitboys team, which had been acquired by ATI a few years prior. The Finnish team basically rewrote all the functional blocks to make them lower power, so that they could feed off a tiny battery in a handheld device like a cellphone. The last Adreno designed in Europe was something between the 205 and 300-something; Qualcomm moved the R&D to the States in 2012. The point of using Xenos as the basis for the "new" mobile IP was to leverage the existing compiler. The compiler is the single most complicated piece of the whole stack, so it was a risk-reducing strategy to make the new chip backward compatible.
Ohhh giving me fond memories of my 3DFX card back in the early 90s. Hanging out with my nerd friends playing quake.
I've always enjoyed my ATI/AMD products, sure the GPU's aren't the fastest one can get, but they are competitive enough, and age like fine wine, also their open source support is fantastic.
Ignored the connection with Adreno too..
This disappointed me.
ATI and their drivers were such a huge issue. At work, I got a laptop refresh from Dell at the time and it had an ATI graphics card. I don't remember the specific issue, but it was something like an external monitor simply wouldn't work properly. Searched online and it was a known issue. Ended up getting a new laptop with Nvidia and it just worked with no problem. This was after AMD had purchased ATI, and I had my hopes up that they had recognized the history of driver performance and stability problems and had put some effort into it.
Great video.
and yet... they are still not as good as nvidia's drivers...
There used to be a lot of cases where laptops simply didn't use the GPU but instead the iGPU. Customer support was unable to help, and in every instance of it I found on online forums, no one had been able to fix it.
@@shadmansudipto7287 it was a while ago and was definitely a known issue. I don't recall the specifics anymore. Other GPU features worked fine that wouldn't have been the case if it was built in.
At the time, I did play games on my work laptop and they were fine.
Nvidia has had lots of its own issues over the years, and AMD has often done some things better than Nvidia.
Having used many different AMD GPUs for the past 20 years, as well as a few Nvidia ones, and also often helping friends with their PC purchases or PC troubleshooting, I've never had or seen a serious issue with either one. The most serious issues with GPUs I've ever had has been with (integrated) Intel graphics, though that was on a very old laptop. I should also note that Nvidia gives worse support for its GPUs after they get old (about 4+ years), as they seem to deliberately try to force you to upgrade at that point, whereas AMD's drivers have tended to work better after that 4 year mark.
@@syncmonism You get it. Many people who are discussing their experiences/issues have literally only had a few cards/rigs. Once one has built or helped build ~20 or so rigs, one gets a good grasp on just how delicate yet standardized the whole computer as a system really is.
I had an ATI Rage 3D AGP card back when...
Had the ATI Radeon 5970; it was a freakin' beast at the time
Nothing, I mean nothing about the 9x00 Radeon? Why do you think AMD picked them up? ATI had a lot cooking in their kitchen, at the time. Way more than Nvidia did. And they were not ripping people off.
I remember ATI on the opening credits to Roller Coaster Tycoon 3.
AMD: wow you guys have been really killing it, Nvidia is a name you can be proud of! We'd like to buy your company and expand it further!
JENSEN: Sounds fantastic actually, I would still remain as CEO in some capacity, correct?
AMD: No deal.
Door slams in the background, followed by the screeching wheels of a car peeling out...
I have to listen to these stories before sleep, like a baby keen for bedtime stories ~
I think AMD's acquisition of ATI was a net negative for the industry. Nvidia seemed to thrive as AMD just hasn't focused on GPUs, and this really slowed the Radeon brand. Yes, it's still a solid product, and AMD GPUs are very good, but it is obvious it's not a priority for AMD the way it is for Nvidia. AMD seemed to put more value in integrating ATI tech into APUs versus being on the bleeding edge of dedicated GPUs.
ATI was not the only tech company in Toronto; Alias|Wavefront was pretty big in the late '90s, and up the road, over the 'border' in French Montreal, there were Softimage 3D as well as Matrox.
Toronto was a very attractive place for waves of refugees from Asia because they extended the city out and built out whole new districts for whomever was fresh off the boat. There wasn't this multicultural way of just slotting people wherever - such as in the UK - and these new communities, be it from Vietnam, Korea (or wherever else America bombed) were able to be welcomed into safe areas where culture could be kept and integrated into the vibrant, lovely city that Toronto is.
Silicon Graphics played a bigger part than just supplying brains. Often share prices were tied to expectations of SGI product launches and future roadmaps. There were some financial scandals when promises made were not delivered. Some markets for 3D were bigger in the imaginations of the shareholder brochure writers than in reality.
Toronto also had the benefit of CBC - a better funded version of the BBC for Canadians. Due to CBC being there with their ridiculous budgets, a veritable cluster of film, VFX and other post production companies existed for film and TV. There was a huge talent pool and life was good. ATI were just a small player in all of this, most of it having morphed into a different landscape to what we have today.
Please revisit this Toronto scene from 'back in the day' because it is a story that deserves to be told.
I've been working in Canada for a few years now, and some of the things these Toronto and Vancouver peeps say are just bewilderingly divorced from anything resembling reality
@Garrus Vakarian yeah I mean there's plenty of stuff to like, I don't know why some feel the need to ruin a good thing
@Garrus Vakarian All I am saying is that Toronto is a very nice place and that their attitude to housing people is very different to what I expected.
They welcomed people and gave them an opportunity to thrive.
Sure, some chip in an office PC has little to do with making documentaries about seal clubbing, but I see CBC as like an 'anchor store' in a shopping mall, where plenty of other businesses form a cluster. In pre-internet times you needed to be in the right place. Back then VFX, broadcasting and related tech was always in a cluster, whether in Europe or North America.
I personally thought that OpenGL was a bit of a game changer, and you weren't going to find people that knew what that was outside of a few tech hubs.
As an American who worked for a Canadian company, the company LOVED those people fresh off the boat not because they welcomed diverse cultures or people. Oh no. They sought them out and hired them because they would work for cheap and could be easily intimidated and replaced. They had so many new immigrants working there, they didn't have enough workers who could speak English well enough to support American customers. They got a big client in Mexico and had zero employees who could speak Spanish. The immigrant employees were all from EMEA and Asia. It was a very strange place to work. But it was done this way because it was cheap labor. No other reason. No special love for immigrants or cultures. Cheap.
@@LatitudeSky I think they got the pick of the refugees though!
In the UK we get the dregs after the Germans and Canadians get the first dibs!
I have to say that I am extremely naive and very glad to receive your comment!
The Commodore PC10/PC1 graphics chip "AGA" (Tandy-like hi-res graphics, like CGA with 16 colors and/or double res) was an ATI design, if I get the dates right.
My first "high End" video card was ATI Radeon 9800 Pro 128MB, it was AMAZING
I had the 9700 Pro and it was amazing so I can only imagine what the 9800 Pro was like.
16:33 "Stay cool, it's hot out there!" Presumably, this video was made a few months ago.
Grew up three kilometres from where their headquarters was in Richmond Hill.
ATI is a legend, and their MASSIVE issues with drivers are legendary as well 😁 Nvidia capitalized on ATI's mistakes and ongoing issues, even when there were situations where ATI chips were more powerful than Nvidia's.
What I find really surprising, as I wasn't aware of it, is that AMD wanted to buy Nvidia, but "leather boss" Jensen wanted to run both companies as CEO. Several years later, Jensen tried to buy ARM in a softer way and it still didn't work out as he would have liked... the guy clearly has bad luck combining Nvidia GPUs with a relevant CPU company. Who knows, maybe it's going to work for the 3rd time 😉
I recall having a 3D Rage II multimedia version of the card. It frequently produced some weird artifacts in Interstate '76 when 3D acceleration was turned on, and in a few other titles as well. It was expensive and it didn't work that well.
I miss the old ATI logo for my GPU sitting on the right side of my taskbar.
:') good old times
Some friends of my half-brother's father's family here in Toronto worked at ATI early on; I know they did something on Intel chip design.
Thanks! 💚💜🇺🇸 God bless America, I say.
Any chance you can hit the networking side and discuss Cabletron/Bay Networks/Cisco etc?
Technology company competition is like watching the greatest real sport of all time. But the sport isn't just idle, repetitive, artificial games; it's the very advancement of humanity toward possibly a real sci-fi-like future one day
I've been using ATI/AMD graphics since I started building PCs in the late 1990s.
I remember the ATi vs. Nvidia wars of the early 2000s :)
I'll never forget the ATI 9700/9800 Pro vs the Nvidia GeForce FX 5800 Ultra wars. The ATI cards slaughtered Nvidia's hot and loud cards.