I had a 4830 paired with an Athlon X2 6200+ and 4 GB of RAM - that was an amazing budget combination back in the day. A 400 € build that supported me for many years - great times
I bought a second HD 7950 for Crossfire use back in the day. I quickly discovered that few games actually benefitted from it (with many games actually becoming unstable while trying to use it) - and having two GPUs turned my PC into an oven and a rocket engine at the same time.
I had a pair of HD 7970s, and for a short while, tri-fire 7970s. The only games that worked well were Battlefield 3 and 4, which, at the time, were all I played. The HD 7000 series was a fantastic value for a lot of years though. I really liked my 7970, I still have it somewhere.
The absolute worst buy of my life was the ATi All-In-Wonder 128. It was basically an ATi Rage 128 from 1999 with 16 MB of VRAM and a TV tuner attached. The problem was that my dad purchased this card in 2001 only because of that dumb feature. Graphics cards in those days were evolving quite quickly and already in 2003 this GPU had trouble launching a lot of games. And since my family was quite poor I was stuck with this hardware for ages and had to watch my classmates playing the then-new GTA San Andreas and NFS Most Wanted while I had to play older stuff. If only my father had bought a GeForce 2 or 3 instead
I actually had that card when it was new at the time. The Rage 128 was worlds better than the Rage 2 and was actually competitive. The tuner even took a hit from lightning, blowing out that portion of the card. Surprisingly the 2D/3D still worked well inside Linux if you didn't install or set up the tuner. I eventually replaced it with an All-in-Wonder Radeon for my main Windows machine due to the fried tuner.
I had this experience with my Riva TNT2. It was a top of the line card for its time, but was given to me in 2001 for my first PC, by which time it was starting to be obsolescent. It ran games just fine up until Homeworld 2 launched, and I didn't have support for DX8 or 9, which was a limiting factor for me at the time (I forget which one it was). I ended up getting the Radeon 9250 SE not knowing it was basically an entry level display output not worth half a damn, and while it WAS faster than that TNT2, it ended up being unable to run Oblivion at launch just a year or two later. I was lucky my step-dad had bought a 9800 PRO All in Wonder that he was no longer using, because I was able to adopt it as my own for a massive performance upgrade, and my first proper experience with a "high end" card, even though it had already been replaced with the X800 and X850 by then. I still regret getting that 9250 to this day, it was my shortest-lived GPU EVER, with even my 9 month old 5700xt outliving it before being replaced with a 2080ti and sold during the GPU shortage to someone building a new rig to survive the pandemic with. I think it only lasted me 6 months before being unable to run a new game on it and getting swapped for that 9800, which it's important to note WE ALREADY HAD, and my step-dad had just left it sitting out near his desk. I didn't see it until after we had wasted $90 on that shitty low end GPU, and asked him if I could have it since mine was garbage. He knew we were going to go get one and literally could have solved it ahead of time lol. I miss those days now, you could wait a year while saving up lunch money change for a parts upgrade, and get double to triple the performance sometimes just by being patient and sticking with the hardware and games you were running decently together at the time. It's not like I couldn't play StarCraft just fine on a potato by that point in a pinch xD. Still, even having money sometimes my family would be dumb about stuff and make me wait because I was a kid, while we had hardware sitting around idle gathering dust. Now my step-dad is better about it, he asks me the second he decommissions anything if I want any of it before it gets sent to a refurbisher/recycler to be salvaged and probably resold for cheap to those who can't afford new hardware.
The Fury X was more of a test bed for AMD and an experiment with HBM. It was lackluster but was a major stepping stone for them coming off the 500 series. Hell, I still use a 590 myself. Nothing I play needs more graphics power right now.
@@dark666king they used it again in the Vega cards and Radeon VII. This was at a time when GDDR5 was very long in the tooth, and VERY power hungry. nVidia had sophisticated memory compression techniques AMD didn't, so they had to use brute force to feed their GPUs (290/390 had a massive 512 bit memory bus which I don't think has been seen again). They hit a memory bottleneck, so HBM was their only way to get more bandwidth in, and they hoped having more cache and WAY higher bandwidth would make up for the lack of VRAM. It kind of did. With newer memory, and their own newer memory compression algorithms, they don't need that super high, raw bandwidth to feed their consumer cards anymore. It wasn't so much a failure as that other, more cost-effective technologies came along. They still use HBM on some enterprise/compute cards, but games just don't need the bandwidth HBM offered.
I snagged a GTX-480 really cheap @ Newegg right after the 580 was released... performance roughly matched my other GPU, a GTX-570, but it was a lot louder. When the Kepler cards arrived I traded the 480 in for a GTX-680 using EVGA's "Step-Up" program.
You should do a history of Markham, Ontario's ATI. My first "real" graphics card was a Radeon (although I'm old enough to have used a CGA video card). Interesting Canadian tech history.
The card I regret buying the most: I regret buying the 4090 for scalper prices (2700$). I bought it because we used gpu acceleration at work and I was often working from home and preferred my home rig which was faster. The 3080 was plenty fast enough for the stuff we did, but I was running against the limitations of its 10gb vram. 3 weeks after I bought it, I got fired. I have a new job now but no use for the 4090 other than gaming, for which I don't have time because well, new job. It's a great card otherwise...
For those wondering, we ran small NN models for a bunch of internal tasks + some client apps. Most models were rather small and the biggest model we worked with was 1GB in size and used 8GB while running (but utilising almost 96GB when training on the in-house rig). A few days before I got fired we launched a new upgraded model that used 12-13GB while running, so for a short while the purchase was worth it
Oh wow, that's such bad luck. At least you picked up a new job pretty quick. Hopefully you were able to make up some money by selling the 3080. Either way, sorry to hear about the shitty situation
@@CreativityNull Oh yeah I was so happy I got to sell the 3080 for the price I did. I don't remember the exact numbers but I sold it for roughly 50$ more than the price I'd seen it go for around that time. Homey who came over to check it and buy it even gave me a little more extra cash. But my 3080 had a backplate thermal pad mod which was worth way more than the 50$ and had above-average silicon and memory dies - when OC'd it was within 1% of the avg stock 3090 in synthetic benchmarks, but I didn't even run it OC'd because I preferred how quiet it was. Also literally no one whom I talked to at the time, except the final buyer, believed that I never mined on it lol. People think it's a given unfortunately, even though I wrote in the ad text that I don't sell to miners, for obvious reasons
I remember buying a GeForce 4600TI in mid-2002. The GPU was excellent and gave me about 3 years without performance problems. But I'll never forget the fact that a week later the Radeon 9700 PRO came out with superior performance and, at least here in Brazil, at a very close price. A strategic mistake for sure.
@@CaptainVKanth Seriously? In fact, the 9700 brought some innovations that later became standard in GPUs. I got a 9600 when the 4600 had a cooler problem that damaged the GPU, and the only problem was the ATI driver at the time, which was very unstable (Catalyst, if I remember the name). But at that time the problems with GPUs were diverse, such as not correctly recognizing the AGP speed (here a great advantage of the 9700, which was 8x and the 4600, only 4x), fluctuating voltage, memory recognition, faulty drivers in Windows XP and many others. Wild times huahuaah.
@@marcoskatsuragi Yeah, my friend had one back in the day saying it's much better than the GeForce 4. But he mentioned texture bugs when playing Need For Speed Hot Pursuit 2. He had to wait for a driver update to fix it. I think some AIB manufacturers also had to release BIOS updates as well.
When I first got into PCs I went through several low end cards. I started with a GTX 750 Ti. I wasn't happy with the performance so I bought a GTX 1050 Ti about two months later when it first released. Turns out I wasn't happy with that one either and I ended up buying a used EVGA 1060 6GB card off of eBay about five months later. I stuck with that GPU for about two years then upgraded to a standard GTX 1660. I had that card for about a year and ended up buying an RX 5700 XT. I really liked that GPU and kept it until about November of 2020. I had to get rid of my gaming rig because I moved over to Japan for nearly a year and a half. While overseas, I built a new ITX rig with an RX 6600 XT. I was able to ship it back to the States and I still have the same GPU. If I had to say which card I regretted buying, I'd say it was the 1050 Ti. Looking back I should have just bought the 1060 6GB as it was only fifty bucks more and had much better performance.
My regret was the GTX 960. I had a GTX 750 Ti 2GB and decided to use my savings to buy the 960 4GB because I was starting with Blender at the time... The card was expensive and the performance difference in Blender was minimal. I was so pissed that I sold it at a loss and got a Radeon R9 380 4GB. In spite of all we can say about AMD cards, that card worked so well and my system never crashed or got a BSOD. I kept the card for 5 years until the "new minimum VRAM" became 6GB and the card started to have trouble keeping up with the new games.
If you still have the 750 Ti, keep it. It will run on any OS (XP and up) in any system configuration as long as it has a PCI Express slot of some kind. It can also natively drive CRTs.
3dfx also sunk a lot of cash into a collaboration with Sega to build a console that in a parallel universe would have been what we call the Dreamcast. If they hadn't chosen the Japanese-designed Dreamcast and had gone with the Sega America/3dfx design, 3dfx would have had more runway and could have survived as a 3rd competitor. That prototype was a BEAST, too. The Dreamcast itself was powerful, but that thing put it to shame.
The one I regret most is probably the Diamond Stealth III S540 I got in high school, which used the S3 Savage4 Pro+ chipset. Performance was okay-ish in most games, and the S3 Texture Compression (which was later implemented by DirectX and OpenGL on all cards) was pretty nifty in the few games that supported S3's MeTaL renderer, but in hindsight I feel like I should've asked for an Nvidia TNT2 card of some sort instead.
@@MrGencyExit64 That's actually what the Stealth III S540 was replacing, a ViRGE GX of some variety. Given my poor experience with the ViRGE, I don't know why the hell I wanted another S3 card. 🤣🤣
When I was in engineering school, I ended up getting a Voodoo5 5500 PCI card for my FreeBSD workstation - only had PCI slots at the time. That card was amazing for the things I was throwing at it. The only difference between the PCI and AGP version was the inclusion of the PCI-to-AGP bridge chipset on the AGP card.
And that PCI variant is now the most expensive one because of the system compatibility it has. People who want those cards to relive that era of GLide gaming don't want to get the AGP version that's stuck using a handful of AGP 3.3v motherboards, when the PCI version is ~90% of the performance in D3D/OGL (and has no penalty in GLide) and works on any board with standard PCI. Those cards also have the quirk of using PCI signaling internally and using AGP's PCI fallback mode at 66MHz to get it working as AGP. The PCI Voodoos have no translator bridge, but they do have the usual PCI bus transaction chips.
Radeon VII gets a lot of hell but I'm glad I have one in my collection. I think it will be pretty rare in the coming years because of its short shelf life. And I really liked the way the card looked, but when the 5700 XT came out with the same performance at a much cheaper price it was the death of the VII. But the VII is different and a bit quirky and that's why I like it. I don't regret buying it, it was about as fast as they said it would be (2080) or maybe just a little slower. Even have a reference 5700 XT - while it was a loud card, it could be tuned and undervolted to fix it. Rocking a 4090 Gaming OC and FTW3 3080 Ti in my main rigs today and gonna pick up a Sapphire Nitro XTX early next year.
Biggest regret was trying SLI with 2 GTX 680's. The micro stuttering was horrible, the noise was horrible, and the performance increase 100% did not justify the downsides. I never touched SLI again and now only buy the highest single GPU card I can afford.
I can echo these statements. Bought 2 GTX 670's, water blocked them, and OC'd the snot out of them. It seemed pretty slick at the time but the micro stuttering and SLI compatibility bugs ruined years of gaming for me.
Speaking of strange cards that make you go: "What the..." The original 3DFX cards were something very strange that would be utterly unthinkable nowadays. They weren't full GPUs. They were add-ons for existing systems. 3D graphics were expensive, very expensive in fact. So they produced these cut down GPUs that were barely capable of running this reduced version of OpenGL they called Glide. The 3DFX cards only provided this new 3D hardware, so they were designed to leverage the capabilities of the VGA cards that most systems already had present. They came with a little pass-through cable that allowed you to plug your old 2D card into your new 3D card, which you would then plug into your screen. The whole idea would make zero sense for modern PCs, but back in 1996 it was a great way to get an existing system running Quake in a smooth fashion. And running Quake smoothly at a resolution that wasn't 640x480 was essentially that era's version of running Crysis.
@@BigFatCone Oh you are certainly correct there. GL Quake made the software renderer look like a half-baked prototype. The early days of 3D acceleration were a lot of fun. Stuff evolved so quickly.
@@mirage809 It's been crazy being along for the ride since the 80's. I don't think I got my Voodoo 1 until Quake 2 was a thing but the point still stands.
Actually there wasn't a GPU I bought and regretted, as I make my plans rather well thought through. I got the GTX 460 at the end of 2010. While it didn't age that well it was working fine for me until I upgraded it with a GTX 1070 in 2016 - and it's still working! The 1070 aged much better and it's still my main GPU to this day. While the 460 struggled six years after I bought it (like the GT 1030 D5 is actually stronger than it) the 1070 is still very capable and only slightly beaten by the 3050 as long as I can live without RTX and DLSS (thanks AMD for keeping my 1070 competitive with FSR!). I'm still waiting for a reasonable replacement that will serve me well for another 5+ years. Guess the problem is that while the performance per generation is increasing nicely, it takes much longer than the former "new generation every year" cadence to get a new generation now, so the overall development speed has slowed down.
There used to be a guy from Poland who grilled some ham and cheese on the GTX 480. The video was called "gotuj z GeForce" ("cook with GeForce") if I remember correctly. Edit: Here's the video th-cam.com/video/y2OKoEIbGr4/w-d-xo.html
hey remember when it was discovered (yeah, discovered, not announced) that the gtx 970 had only 3.5gb of vram with the other 0.5gb physically disabled on the chips?
I had a 480, thing was still beastly and was able to play GTA V just fine. As for other video cards that sucked ass, I nominate the GeForce 295. A card that was literally 2 cards sandwiched together, sucked in dust like no one's business and the marketed ram was cut in half due to the 2 cards.
I direct-ordered the very first GPU (Orchid Voodoo 1) back in 96. From 96-2000 3DFX really ruled the market. It was truly a revolutionary product. Prior to 3DFX, you pretty much had to constantly upgrade your CPU/mb on a yearly basis to get decent fps from your software rendering. I still have that Voodoo 1 btw in a classic gaming rig, running Win98 lol. Fun to compare it to my 4090 to see how far we’ve come.
Dell XPS 19 laptop with 2 8800GTX cards... probably the biggest laptop ever made...I really should send it over to you guys to take apart and toy with. I mentioned it in a comment before that got tons of engagement.
I was debating between getting two GTX 970s in SLI or a single 980 for my PC at the time. I looked at stuff and found the dual 970s to perform better, albeit at a higher total price too. Two weeks after buying them, the 980 Ti came out; it cost the same, and since it didn't rely on SLI it would have been by far the better purchase. I still regret trying to build an SLI system - it was my first one, and it would be my last one. I am kinda happy that trend has gone the way of the dodo, so no one else will waste their money like I did back then. The 970 is still a good card, but since I had paid the functional price of a 980 Ti while getting 970 performance 95% of the time, it was a really, really bad purchase.
Lol I am still using a single GTX 970 to play all games. Not a single game I played was a problem for me. And seeing these new GPU prices, I think I'll never be able to buy a "next gen GPU" at its launch date anymore. But still, it was a fun journey.
I had my bout with Crossfire and even multi-brand mGPU. There were probably only ever 3 games with flawless Crossfire/SLI scaling, and they were running the Mantle API, which I don't think Nvidia supported. The absolute king of the crop was BF4 running Mantle, where 2 Fury Xs scaled better than 100%, and the main thing is that it was a perfectly buttery smooth experience with absolutely ZERO stuttering even on an old 3rd gen i7. If games could've continued to scale like that then I'd be wanting liquid cooled (Vega 64 LC anyone?) GPUs in Crossfire in my rig to this day. It was actually pretty amazing for gaming/productivity if all you played was BF4 (that was me), and then when it was time to render something in Blender you had a load of powerful GPUs at your disposal (back then AMD actually rendered faster than Nvidia in Blender, a trend that's no longer the case). To this day BF4 has offered the smoothest stutter-free experience I have ever seen, and that was running 2 liquid cooled Fury Xs that ran silently. So the promised land of Crossfire did exist, it's just that shortly after, Mantle was abandoned and DX12 never lived up to its mGPU promises.
@@YashPatel97335 I would still be using one of my 970s if they didn't start giving me weird graphical glitches, I suspect it was due to overheating from dried up thermal paste, but I never did verify it. Got a 3070 just before the scalpathon of 2020/2021/2022 took place
@@QueenSaffryn cool. The point is that 970 and 980 are still very good cards if you want to hold out on buying new cards. And you can always get one generation behind cards at very reasonable price just like you got 3070.
980 Ti, pretty powerful card at the time, but with OC and OV the PCB got a little bit black in the VRM area. The waterblock and a fan on the back didn't help with that 😅
I remember going out and buying a 980 (upgrading from a 660 Ti) because I had just bought Arkham Knight and wanted to play it at the best settings with the "PhysX" stuff enabled. I don't regret it for a minute. Arkham Knight looked amazing with long overdue use of the Batmobile.
The Titan X (which is just a 980 Ti with 6 more GB of VRAM) is probably the most "definitive" legacy Nvidia card available in my arrogant opinion. Needs external power, but beyond that, can work in any OS from Windows XP up and can also drive any monitor including CRTs. And even today, the card is surprisingly powerful.
I remember when they were telling GTX 480 owners to let their cards cool for a few minutes after shutting down their PC before touching the card because it ran so hot.
None, in 2015 I built my current system sporting a GTX 970, a great card for the money at the time. I upgraded that same system to a GTX 1070 FTW, also a great card, which was only replaced by an RTX 3070 FTW this year. Now I'm a bit bottlenecked by my aging i7, but I intend to move this GPU to an i9 system early next year. The only time I've been disappointed by performance was on a prebuilt back when I was in high school - lesson learned, every system since then I have built from scratch after much research. Although I do remember having to upgrade from 2MB of RAM to 4MB of RAM ($800 at the time) on my first computer, a Mac, so that I could run Marathon at a frame rate above a slideshow presentation, oh how times have changed.
The card I think I regret buying the most would probably be the GeForce 4 Ti. That thing cost an absolute crapload of money at the time, and when I got home with my shiny new graphics card, I discovered my PC (which was a Dell Dimension 2400) did not in fact have an AGP slot! And by the time I got a new computer, PCIe was already mainstream. And I was pretty much forced to sell it second hand later down the line for chump change. (The store I bought it from did not take opened boxes as returns. Once it was open, it was yours.)
I briefly owned an NVIDIA 7950GX2 - the two-GPU sandwich card. I actually wanted to get an X1950XTX for the combined HDRi + AA goodness, but it was out of stock everywhere. No regrets about the 7950GX2 though - I managed to sell it off a few months before 8800GTX got released and get that one with minimal additional investment.
I am a fan of 3Dfx, I had the Voodoo 1, Voodoo 2, Banshee, V3000, and ended with the 5500 AGP. Loved how that card had Halo Warthogs and Spartans on the back of the box. What people seem to NOT talk about was that 3Dfx cards used an API called Glide that gave the cards a "performance" advantage over most at the time. However, it was limited to 16-bit color textures whereas OpenGL and MS Direct3D supported 32-bit, and games started going that way. The newer Voodoos never truly supported OpenGL or Direct3D correctly while Nvidia and ATI did. 3Dfx's unwillingness to support the new features and the "cash grab" you mentioned destroyed the company. It's funny that many of the features we currently use evolved from 3Dfx's infamous T-Buffer technology. I sure hated buying that 5500 from Best Buy and then losing ALL support when switching to Windows XP. Painful.
My worst purchase was my still-current GTX 1070. That puppy ended up being SO much bang for its buck that, thanks to all the goings-on in the GFX market in recent years, as well as me being more into less graphics-intensive genres over the years, I still couldn't justify upgrading it to a point where I'd be satisfied for anywhere close to a sensible price - even considering MSRP.
Waaay back in the day of GeForce 4, I bought the literal cheapest GeForce in nVidia's lineup. It was soooooo bad, it could barely run anything new at the time on low settings! I vowed to never buy another budget component ever again.
Diamond Stealth III S540 Savage 4 Pro - Back in the late 90's it was supposed to be a budget video card that, thanks to some neat tricks like a hardware texture compression decoder, was able to provide better looking graphics than NVIDIA's offerings. The better imagery was true, in games that supported S3 Texture Compression (basically unreal engine) but the card was slower than the competition because of dumb headed design decisions like not including a fan on the GPU.
a 1070 from Gigabyte ... I RMA'd it because of a bad fan bearing, and after 8+ months of them claiming they never received it despite USPS proof of delivery to their warehouse, on what was supposed to be a 2 week fix, it showed back up at my house. I had already given up and bought a new card by then, so I didn't test it before selling it, but it had documentation saying it was cleared by their RMA department ... despite my emails with them still claiming they hadn't received it
I think I’ve done well so far. 750ti back in 2015 that got the job done with no fuss, later upgraded to a 960GTX that soldiered on way longer than I expected. It’s currently on hiatus until I build my wife’s PC (she doesn’t game like I do so doesn’t need anything too heavy in the GPU dept.). Currently running a 3070ti FE, which I knew was a bit of a red headed step child in reviews but it’s the best I could afford at the time. I’m still on 1080p anyway so it’s easily coping.
I've really not had a long history with GPUs that I've owned personally. I only built my first PC back in 2013. Before that I'd game on a laptop with a GT 555M and prior to that on an iMac in Boot Camp. All in, I've only owned 3 graphics cards. My original build had a GTX 650 (currently in my server for encoding purposes), then an R9 280X, and now a GTX 1060 which I've had for the past 6 years. Of all of them the worst was probably the 280X. It was just a rebranded 7970 at the time and while it was performant, I had a lot of headaches with drivers and missed features such as a dedicated encoder. But the big issue for me with the card in recent years is the power draw. It sits in a Fractal Node 202 which I use as a living room rig, and it really struggles to dissipate the heat. 225W of power draw ain't great in a case that small.
I loved my 480. It kept the heating bills low, and it could almost play A Hat in Time! ...I used it until 2020, when Premiere stopped supporting the card. Funny since I got it when Adobe was liquidating their stock of the card from their workstations.
Biggest loss for me was my first real PC. I purchased a computer from CyberPower with a decent CPU (Intel E8400), but got an Nvidia 8600 GT. This thing sucked, but I was brand new to custom PCs. Best thing is that I learned a ton from that purchase and it led me to a life of PC tech support (both professional and now the guy that fixes everybody's PCs for free).
There was also the Voodoo 5 series, like the Voodoo 5 5500 and Voodoo 5 6000. The 6000 had 4 Voodoo chips on it. Then there was the leaked Rampage graphics card that they were working on before they were bought by Nvidia. It would have been a DirectX 8.1 card in hardware with apparently some early DirectX 9 functionality, if I remember right.
I recall playing games on my GTX 480. I could hear the fans through the headphones, and my mic easily picked it up and almost drowned my voice out under load; good times. I remember how I tried to run VR on it and calculated I might be able to on a 480 SLI setup with the NVIDIA VRWorks... a shame it was discontinued. might have gotten 10-20 fps on VR... Not good for playing, but to say I could run VR on such old hardware and above 5 fps would be a cool thing to say IMO.
One card I regretted buying was an 8800GT from XFX. While the card itself worked, the fan on it was appallingly loud, even at idle. I can't fathom why I didn't return it at the time, but I managed to live with a 24/7 hairdryer inside my PC for the better part of 2 years. Thankfully it died before the 3-year mark, as 8800GTs were never long-lived cards. Both of my brothers' 8800GTs also died before reaching 3 years. It was an excellent performer for the money, to its credit.
I still regret getting an R9 380 from Gigabyte. One of the first purchases I made as a PC builder and by far the worst. Performance was good but it didn't last as the card died after just a few weeks. Newegg wanted nothing to do with it and I was forced to go through the manufacturer to get a new one. After 4 months of being forgotten about repeatedly by Gigabyte, they offered to send me a new card as they couldn't repair the old one. After a few more weeks I finally got it back, only to find out I had been duped. Immediately started having issues with the card black screening and subsequently underclocking itself. Perusing the internet pointed to a bad refurb, after checking the serial number and comparing it to what I had on file I learned that they did in-fact "repair" my card and sent it back to me without saying a thing. Tried getting them to send me the new card they had promised me but after a bit of back and forth they just stopped talking to me. To this day I refuse to buy anything Gigabyte makes, I can't bring myself to support a company that treats its customers that poorly.
First PC I ever owned was a dinosaur I bought with money I scraped together as a kid that came with an onboard GPU that was nothing but trouble. It was a Trident-Cyberblade, which was a collaboration between two awful budget manufacturers that used to exist. It didn't work well due to driver issues but there was zero chance of them getting fixed because, get this, not long after it came out (well before I got one) Trident and Cyberblade decided they were going to sue one another and disavow the stupid thing. Ultimately even the hilariously dated, dog-slow Nvidia PCI TNT2 64MB I bought for pocket change to replace it was a huge improvement in both performance and stability.
There was a card from Sapphire way back in the 00's (I can't remember the specific model. I'll edit this post if I find it.) It was supposed to be a competitive "budget" card. Not only did it suck wind, but there was a "knob" on the card's fan that stuck out nearly a third of the width of the card itself. This meant that you could not put any card in the adjoining slot at all. It was the worst physical design flaw I had ever seen. Thankfully, I believe I was operating on a 240 from Nvidia at the time, so I wasn't even tempted. Nothing stellar about the 240, but it was decent for the time.
I am still using my Fury X. It plays games like Bannerlord, Warhammer 40k Chaos Gate and Pathfinder. I agree that it was dumb to say that it was overclockable. But HBM RAM is still something that will probably show its face again in the near future. Also, this card is literally 4.5" tall and 8" long (plus the added radiator). New high end GPUs are the size of entire ITX builds... let's talk about that for a moment. WTF are 90% of computer cases supposed to do with a GPU the size of a twin bed?
My biggest graphics card mistake wasn't a card itself, but the connector. I was upgrading and decided to stick with AGP for the new motherboard, instead of that new upstart PCIe standard which probably wouldn't go anywhere...
The weirdest thing about the Fury X is that it CAN overclock quite well by modern standards, often achieving 12-16% over stock with little effort. AMD had been validating Fiji (the codename of the GPU core inside Fury X) at up to a 20% overclock, at their default target of 1.2v. Clearly some marketing whiz heard this and started hyping the overclocking capabilities, ignoring that the engineers achieving this level of overclock were using binned GPU cores from their own lab, not cores that were being mass produced and which were targeting a more broad range of stability. What Fiji excelled at though was undervolting, and AMD themselves showed that off with the R9 Nano; a Fury X through and through but with the power limit chopped in half. You can take a retail Fury X and drop the power draw by over 150W without sacrificing a single MHz in games or demanding tasks, and sometimes you can even get 40-60MHz extra BACK with a slight overclock. Fiji started the modern age of undervolt + overclock.
My dad got an AIO 2080, his first ever liquid cooled anything, and then he nicked one of the coolant tubes when he built his new rig. They actually repaired it for free when he sent it back, but it took like 3 months, so he ended up buying an air cooled 2080 instead, and I got the AIO one for free 😀 It doesn't perform amazingly anymore compared to more modern cards, but I still love it.
The worst 3D accelerator I know of is the Alliance AT3D. The rendering quality of that card is just cursed due to a combination of using dithering for everything, not supporting many vertex formats, and the mip mapping being basically random. To be fair, it is slightly faster than the S3 ViRGE, but this only means you're getting the absolutely cursed visuals faster. And it's even funnier that many Voodoo Rush cards had it as their 2D chip, creating a monstrosity of a card that had both the worst and best 3D chips available at the time on the same board.
That one time with that XFX Radeon GPU with no cooling that had 1GB VRAM and used "HyperMemory", which meant it chewed up a good chunk of the system RAM in demanding games.
Ah, the only one I regret is a 3070 Ti where I listened to some online reviewers and in-store salespeople instead of my gut. On the flip side, my RX 480 ROG STRIX was an awesome purchase at the time.
RX 6600 Mech to replace my old 1060 which was very silent... thought it wouldn't be loud. It is adjustable in the fan curve editor, but the minimum is at 21%, which is still pretty audible, and then it's even capped at 100W which makes it slower than other 6600 cards out there if you don't undervolt the crap out of it, which doesn't really work that well because it's a budget card and the silicon doesn't really leave much headroom. If you don't bother with fan tuning, you either have zero noise or a jet engine once it goes above 45.
Man now I feel old. My first gaming rig had a dual GPU 3dfx. Pops bought it as the company was going under. It gave me a few good years. I was able to play this NASCAR game and a flight sim. I got his Pentium 3 and 3dfx card rig because he bought it just before the Pentium 4 came out.
I was thirteen and had $200 in birthday money. I saw GTA IV for PC at Target for $20 and convinced the parentals to buy it for me. Later I went to the local PC shop with my remaining money and asked the guy behind the counter for a graphics card that would run GTA IV. I walked out of there with $5 left but now also the proud owner of an Nvidia 8600 GS. I ran back home and spent the next hour installing my newfound treasure into the family PC. Let us say that I was very disappointed when that card could not even manage 20fps on the lowest settings.
My GTX 1080 Ti with AIO water cooler. The radiator fan was always really noisy and a few months ago the pump died; I was hitting ~105C when I realized there was a problem. Now I need to find a new heatsink somewhere.
The GPU I regret buying the most was actually the AMD RX580. I was building the best computer I could afford with the money I had, which was by all means a budget PC. I walked out of Best Buy with a GTX 1060 and installed it, snapped a pic and sent it to a discord server where my friend said "Dude! Return that right now and get an RX580!! It's cheaper and has WAY better performance!!" I'd never had an AMD card before so I decided this was gonna be the shot I gave them. I went back to Best Buy and came out again with the XFX version of the RX580. Spent a little while putting it in and it was smooth sailing, I was getting great performance and it didn't leave me wanting. For a month. The day after the warranty expired my computer wouldn't turn on. I spent the entire day diagnosing until I found that switching to my integrated graphics worked. I took the GPU to a friend's house and confirmed it was dead. Best Buy wouldn't take it back, and after a bit of back and forth with AMD I was SoL. I ended up playing without a dGPU for a while before saving up just enough for the then-new 2060. I haven't gone back to AMD since. To be completely clear, this isn't to rag on AMD or to say people who buy their GPUs (or even the rx580) are somehow stupid for it. The year to year back and forth is getting really strong on team red and on paper they've got impressive GPUs for extremely competitive prices. But that experience soured me to the brand, and now I just hope Intel comes out with decent GPUs sometime soon because Nvidia annoys the hell out of me.
I am very disappointed the 2060 wasn't on there... even NVIDIA agreed it shouldn't have existed by re-releasing it as a 1650 or whatever. The entire reason for the 20xx series' existence was ray tracing & the 2060 performed worse than 5 generations below it with that enabled. Its existence was a waste.
@@mrbobgamingmemes9558 I think I'm slightly misunderstanding what you're saying... the 2060 sucked at ray tracing, which was literally the entire reason for its existence as a 2060. turning on ray tracing made its performance worse than a card 3 generations prior. therefore, the 2060 was pure waste. the 1650 was a 2060 (architecture, etc) minus the ray tracing. that was fine. it wasn't amazing performance or anything, but it was a new generation card that wasn't overpromised/under delivered & still slightly better than the 1060. I have no issue with it. my issue is purely with the 2060. if the major selling feature of ALL 20xx cards, including the 2060, couldn't even be used without it su*king A$$, then it shouldn't have existed. they should've simply made the 1650 and omitted the 2060 entirely.
I remember being really disappointed by an S3 Graphics card (not to be confused with Amazon S3.) Very few games supported it and those that did... well, it usually made no difference in terms of graphics quality.
before i knew anything about gpu's i upgraded from my rtx 570 to a gtx 1050.... needless to say i did not notice much of an improvement and am now very happy with my 3060ti
Way back in 1996 I bought a Matrox Mystique; the card ended up being nicknamed the "Matrox Mystake". A glutton for punishment, I followed this card with the Matrox G400, another "winner" of a graphics card.
I kind of have to disagree on the Fury X. It was far from a great card, but it wasn't useless or a mistake. In very niche use cases it was ideal. At that point I had built a small ITX system with an i7 6700K (which I wanted to OC, air cooled) and at first an R9 Fury. Due to the small volume of the case and the limited airflow, temps would rise quickly and things started to throttle. Switched the GPU for a reference GTX 980, but that didn't work out either. After that I switched to an R9 Fury X, put a small 140mm AIO on the CPU, and all the issues were solved: CPU maxing around 72C and the GPU would never go above 52-56C even when overclocked. A very niche use case, but it was perfect for my use.
Heck, it was a high end GPU in a small form factor that ran perfectly silently. It was also basically the last high end GPU that could've been used in BF4 under the Mantle API to experience Crossfire at its absolute peak (back then BF4 scaled 2 Fury Xs to over a 100% performance gain and it was 100% stutter free, unheard of in Crossfire/SLI and idk if we'll ever see that again), and it did so silently. Nvidia never gave me that experience and I'm so glad AMD did because that was something to see. True mGPU is 100% possible in games because I've seen it done, devs just abandoned the concept.
I once found an Nvidia GT 630 in a parts bin back in 2012, the same year it came out. Was allowed to just take it, and the damned thing still lives in a cupboard somewhere, somehow still functional. Back then the lack of graphical horsepower was rather traumatic. But it's long since found its use as a backup glorified HDMI / DVI / VGA output for various troubleshooting/debugging issues over the years.
Have a GT 730 like that. Came with a prebuilt system, the rest of which was actually good except for it. Always found it strange.
I have a 630 that still runs all day every day in my Plex server. It spent a decade before that in a crappy Lenovo workstation and just won't die.
I too have a 630, works fine for me as I don't need GPU power.
My XPS 8700 has a GT 630 and has not shown any problems. I didn't even know it has a bad reputation and I use it every day.
@@SuperGon6 they put one in there so they can market that it has a dedicated gpu and make it more expensive
The dual-GPU cards, with two PCBs stacked on top of one another, were something funny too
What was one called? GeForce 7950GX2? Was that thing any good?
@@cfb36 here are two more, the 9800GX2 (decent card) and the GTX295
@@cfb36 gtx 690
@@lbochtler Also the GTX 690, which at the time was actually very good, and not that expensive compared to buying two separate 680s.
GTX 690?
I recall when the R9 295X2 was one of the most powerful cards on the market.
Two GPUs on one PCB and with a built-in water cooler
Wacky times. Kind of sad, but I'm also happy that GPUs are so heavily standardised now.
Yes, it also kinda sucked because you had to use Crossfire, which wasn't supported too well, and it used 700 watts of power just on its own
@@MrTurbo_ 700?! dang... and here we are looking at today's which use less than that thinking it's highly excessive lol
I bought two of the R9 295x2 cards during early Covid for 200 bucks to mess around with and could barely power them with a 1500 watt power supply. OCing would usually trip the overcurrent protection lmao
The 295x2 was utterly stupid and pointless and wasteful, but I am glad such a ridiculous feat of engineering was made.
I think I regretted buying the Matrox Millennium 2 at the time. It was not cheap, did not support most 3D stuff, and I was hoping that 2D games would run really well on it. They did not.
I also regret the Matrox Millennium II. I think I got it for CAD reasons, but it was not a great gaming card. If I recall it only supported Direct3D, while 3DFX and OpenGL were killing it on the gaming front.
Old memories from my CompUSA days are flooding back to me. Was the Matrox the one that had a "warning" label promoting how other cards might only be good at 3D games but that their card was good at both 2D and 3D games?
Millennium 1 was great... 2 and Mystique... not
Same.
Yep I had one of those. I had to buy it because I made the mistake of buying a Compaq Deskpro (NLX form factor) and basically at the time it was the only card that would fit. It could run Homeworld... just. I supplemented it with a Voodoo II (? I think) 8MB sister card eventually. When I built my first PC by hand it was like the gods had sung in my direction! Suddenly I was in a land of AMD CPU goodness with "proper" graphics - I know it was an ATI GFX card, but can't remember which one. A Radeon of some sort I think. But for the time it was powerful. The jump from my Compaq to my home build was massive. The Matrox Mill 2 was just a little underpowered for the day. Only 4MB RAM, 8 with an upgrade. The Voodoo sister card added "proper" 3D abilities (let's face it - Quake). My timeline is messed up 'cos I'm getting games and tech out of sequence... it was a long time ago!!!!
I had a Sapphire Fury for a while and yeah, that HBM did NOT like ANY sort of tweaking to the clock speeds. I remember adjusting it by even ONE MHz would cause the system to freeze.
Meanwhile, my Vega56 OCs the HBM2 memory from 800 to 1000mhz flawlessly. Cha-ching!
@@blunderingfool i wished my Vega 56 could do that 😩
Then you didn't read up on it back then. I had both a Fury and Fury X and although the overclocking was very limited, you could overclock the HBM. However, it could only be done in fixed increments (55MHz or something weird) and if your card couldn't handle that next jump, you couldn't OC it.
I was lucky enough that both my Fury and Fury X would allow 1 or 2 steps of memory OCing.
@@blunderingfool Vega 56 with 64 bios is just plain amazing. Not particularly efficient however, but damn good for the price
I rocked a R9 fury until I scored a 3060 last year
Fury X was actually an awesome card, it had strengths even against the 980ti and was either super close in performance or beat it in some games. In addition we got the fury nano which was the absolute king of ITX builds. The fury X is pretty much silent even at full load which is pretty incredible.
I know, I freaking LOVE the FuryX. We'd be lucky to see GPUs like it again, they were high end GPUs in a slim package that were just silent at all times.
It was badass in every way... except the VRAM capacity.
@@sergeantsarge7081 Made up for it in speed for sure, but with a couple more GBs of VRAM it would have been a 1070 Ti killer
This guy is the only LTG "Host" that I cannot stand. He's always letting his opinion get in the way of facts (not to mention his jokes suck). The Fury X was a great card.
What other manufacturer/card comes with water cooling? Underrated card
I had a Voodoo 5 5500, so the Voodoo 4 4500 wasn't the top card from 3DFX. The 5500 was a pretty solid card for the time; it performed really well, and games that supported Glide looked and played better than the same game running on something like a Riva TNT2 with OpenGL. It was also the first GPU that I'm aware of that had 2 GPU dies on a single board, something Nvidia repeated with later cards like the 295.
Yeah, I knew there was a Voodoo 5, so I was a bit surprised they mentioned the 4 as the last one. The Voodoo was a beast but it was overkill for the time with those dual GPUs. I had a Riva TNT 2 and while it was good, it couldn't match those, but hey, it was cheaper
@@smokeduv The 4 technically released at the same time as the Voodoo 5, so it's just as new but definitely not their best card. I have a Voodoo 5, it's a cool card.
I miss my 5500
I remember Voodoo drivers offering *tons* of settings that could be adjusted. I was mind-blown trying to customize its performance.
Voodoo 5 5500 was released in 2000, but ATI had their Fury MAXX dual GPU card in 1999, so the 3dfx card was not the first in that regard. Voodoo 4 4500 was in fact the last card from 3dfx as it was released a few months after the Voodoo 5 5500. Also, both used the same chips, the VSA-100, only the 5500 had two of them. Finally, the fact that a 5500 would beat a TNT2 (and even the first gen GeForce) did not help it much when it released right after GeForce 2 and ATI's first Radeon, both of which were faster and more advanced (they were DX7 cards with hardware Transform and Lighting, while the Voodoo was DX6). In other words, too little too late and I don't even think that it was cheaper than its competitors.
I know it was trash, but I loved my Voodoo5 5500. It was the first GPU I owned that needed an extra power connector (you had to plug a molex into it) and it was actually two GPUs tied together.
It may not have been the fastest card, but it did look cool and had great support up until around 2002-3 which was relatively decent for the time. Would've been better off with a GeForce 2 though for sure.
Yeah, the V4 4500 is practically just a single-chip version of it.
@@classic_jam Oh yeah, I would have gotten way more life out of a GeForce 2. I think the card I ended up replacing the V5 with was a Ti 4600.
Still have one and it still works
Yeah, I still have 2 of them, but it was SLI on the card
I remember thinking at the time that I would have bought the Titan Z if I won the lottery to use in an ITX system. If I were building a full size system, then like you said I would have bought dual Titan Blacks.
Remember when Titans allowed you to choose whether you wanted good FP32 or FP64 performance, up to the Titan Black? And then Nvidia was just like, "lol Not anymore. :)"
I have been daily driving my MSI GTX 1080 for 6 years now; I think this thing was way more than a solid purchase. Even though it's not top of the line anymore, it can run many games in 4K. Not the newest or most demanding ones, but to me it's mainly about productivity anyway.
And here I am gaming on a Quadro K5000 and Tesla K80 (though the Tesla only helps in PhysX games)
(And no, I'm not a dumbass who bought a workstation card thinking it's better than a gaming card because of its price. I use them for scientific computation, 3D modeling and CAD; gaming is just something they get to do once in a while.)
Same here, and you know what the real kicker is? At the time people were saying that the 1080 was like the 4090 is now: too hot, too expensive, and way too much card to justify. Yet years down the road that investment has paid off in spades, because you bought a great part and now it's aged gracefully into a good or OK-level part. Those naysayers bought "just enough" or only OK-level parts that now have little useful life in newer titles without dramatic compromises.
@@lbochtler Isn't the K6000 the last Quadro card with Windows XP support, and also (mostly) the last Quadro card to have not-shit FP64 performance?
@@Hybris51129 Too hot? I got a Titan Xp which is basically just a 1080 Ti Special Edition, and it runs amazingly even in modern Blender workloads. lol
My 7yo has my old gtx 1080 in his PC
Honestly a bit overkill for a 7 year old, but I wanted to build him a PC for his birthday and I didn't want to sell it and contribute to the environment of people taking advantage of others while the shortage was going full tilt.
AMD has had a few interesting cards in recent years. There was a Radeon Pro card (clearly not aimed at gamers) that had an NVMe interface on it so users could attach an SSD. On their GPU. I think the idea was that the SSD could be used as pseudo-RAM for productivity tasks that used way more than 10-20 GB of actual RAM. But that interface was clearly a bottleneck, and AMD didn’t make that many of these cards.
Wait... an NVMe SSD on a GPU?!!?
@@drakethedragon457 Yep. Had to Google for the model number. It was the Radeon Pro SSG. They advertised it as the first GPU to break the 1TB memory barrier. LOL.
I believe they did a video dedicated to that gpu, it sounds so good! Especially now that vram is at a premium for games
It was the most direct Direct Storage Access feature so far lol.
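Not the SSG's actual API (which exposed the onboard SSD to the GPU directly), but to make the "SSD as pseudo-RAM" idea concrete, here is a minimal out-of-core sketch in Python: a dataset bigger than you'd want resident in RAM/VRAM lives in a memory-mapped file and gets processed in chunks, which is roughly the access pattern that card was built to speed up. The file name and sizes below are made up for illustration.

```python
import numpy as np

# Hypothetical dataset backed by an SSD file; scale N_ROWS up to get a
# genuinely out-of-core case (far larger than system RAM).
N_ROWS, N_COLS = 1_000_000, 64
data = np.memmap("huge_dataset.bin", dtype=np.float32,
                 mode="w+", shape=(N_ROWS, N_COLS))

# Process in chunks so only a small window is resident in memory at once --
# the same idea as streaming working-set data from a GPU-attached SSD.
CHUNK = 100_000
col_sums = np.zeros(N_COLS, dtype=np.float64)
for start in range(0, N_ROWS, CHUNK):
    chunk = data[start:start + CHUNK]    # paged in from the SSD on demand
    col_sums += chunk.sum(axis=0, dtype=np.float64)

print(col_sums[:4])
```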
The ATI All-In-Wonder 128 was my biggest video card purchase regret. At launch they promised eventual driver support for OpenGL. I believe they did finally add it years later but by then there were much better graphic chip options.
I remember being intrigued by the All-In-Wonder cards cause of their video inputs, but glad I avoided these headaches.
I was really sad when my Vega 56 (Sapphire Pulse) died. It had so much oc potential and can run some games in 4K (you'd be surprised how well it scales resolution)
I am still rocking a vega 64 nitro and it still kicks ass at 1440p
I mean, it WAS a 4K card when it was released, so yeah, it can handle older games in 4K. But I was using mine for 1080p by the time I replaced it with the 7900 XTX.
This news script appears to discuss a few graphics cards that were failures in terms of marketing and performance. The first card mentioned is the Nvidia Titan Z, which was released in 2014 and was marketed at gamers but was too expensive and underwhelming in terms of performance. The second card mentioned is the AMD R9 Fury X, which was released in 2015 and was marketed as a rival to Nvidia's top-tier Titan X but was outperformed by the GTX 980 Ti. The final card discussed is the 3dfx Voodoo 4 4500, which was the last graphics card produced by the company and performed poorly compared to its rivals.
Before I learned about CPU bottlenecks, I bought an HIS 4850 to use with an Athlon 64 3700+ (single core). It helped greatly with the few single-threaded games that I had but did basically nothing for the rest.
Pretty much all games were single thread back then so you weren't missing much. Core 2 just had insane single thread performance to begin with.
I had a 4830 paired with an Athlon X2 6200+ and 4 GB of RAM - that was an amazing budget combination back in the day. A 400 € build that served me for many years - great times
I bought a second HD 7950 for crossfire use back in the day. I quickly discovered that few games actually benefitted from it (with many games actually becoming unstable while trying to use it)- and having two gpus turned my PC into an oven, and a rocket engine, at the same time.
I had a pair of HD 7970s, and for a short while, tri-fire 7970s. The only games that worked well were Battlefield 3 and 4, which, at the time, was all I played. The HD 7000 series was a fantastic value for a lot of years though. I really liked my 7970; I still have it somewhere.
The Fury X were some of the coolest cards ever made.
The 6990 was also a very effective room heater with deafening dB's.
As someone who has begun collecting GPUs old and new, this video was cool as heck. Good stuff.
The absolute worst buy of my life was the ATi All-In-Wonder 128.
It was basically an ATi Rage 128 from 1999 with 16 MB of VRAM with TV tuner attached.
The problem was that my dad purchased this card in 2001 only because of that dumb feature.
Graphics cards in those days were evolving quite quickly, and already by 2003 this GPU had trouble launching a lot of games. And since my family was quite poor, I was stuck with this hardware for ages and had to watch my classmates playing the then-new GTA San Andreas and NFS Most Wanted while I had to play older stuff.
If only my father had bought a GeForce 2 or 3 instead
I actually had that card when it was new at the time. The Rage 128 was worlds better than the Rage II and was actually competitive. The tuner even took a hit from lightning that blew out that portion of the card. Surprisingly, the 2D/3D still worked well under Linux if you didn't install or set up the tuner.
I eventually replaced it with an All-in-Wonder Radeon due to the fried tuner for my main Windows machine.
I had this experience with my Riva TNT2. It was a top-of-the-line card for its time, but it was given to me in 2001 for my first PC, by which point it was starting to be obsolescent. It ran games just fine up until Homeworld 2 launched, and I didn't have support for DX8 or 9, which was a limiting factor for me at the time (I forget which one it was).

I ended up getting the Radeon 9250 SE, not knowing it was basically an entry-level display output not worth half a damn, and while it WAS faster than that TNT2, it ended up being unable to run Oblivion at launch just a year or two later. I was lucky my step-dad had bought a 9800 PRO All-in-Wonder that he was no longer using, because I was able to adopt it as my own for a massive performance upgrade, and my first proper experience with a "high end" card, even though it had already been superseded by the X800 and X850 by then.

I still regret getting that 9250 to this day; it was my shortest-lived GPU EVER, with even my 9-month-old 5700 XT outliving it before being replaced with a 2080 Ti and sold during the GPU shortage to someone building a new rig to survive the pandemic with. I think it only lasted me 6 months before being unable to run a new game and getting swapped for that 9800, which, it's important to note, WE ALREADY HAD - my step-dad had just left it sitting out near his desk. I didn't see it until after we had wasted $90 on that shitty low-end GPU, and asked him if I could have it since mine was garbage. He knew we were going to go get one and literally could have solved it ahead of time lol.

I miss those days now. You could wait a year while saving up lunch money change for a parts upgrade, and sometimes get double to triple the performance just by being patient and sticking with the hardware and games you were running decently together at the time. It's not like I couldn't play StarCraft just fine on a potato in a pinch xD. Still, even when we had money, sometimes my family would be dumb about stuff and make me wait because I was a kid, while we had hardware sitting around idle gathering dust. Now my step-dad is better about it; he asks me the second he decommissions anything if I want any of it before it gets sent to a refurbisher/recycler to be salvaged and probably resold for cheap to those who can't afford new hardware.
The Fury X was more of a test bed for AMD and an experiment with HBM. It was lackluster but was a major stepping stone for them coming off the 500 series. Hell, I still use a 590 myself. Nothing I play needs more graphics power right now.
Polaris came AFTER Fiji. In other words, the Fury X was the 300 series. The 400 and 500 series came after.
How was it a stepping stone if they literally decided to ditch the HBM and never use it again in consumer segment products?
@@dark666king stepping stone can also be a learning experience for them
@@dark666king they used it again in the Vega cards and Radeon VII. This was at a time when GDDR5 was very long in the tooth, and VERY power hungry. nVidia had sophisticated memory compression techniques AMD didn't, so they had to use brute force to feed their GPUs (290/390 had a massive 512 bit memory bus which I don't think has been seen again).
They hit a memory bottleneck, so HBM was their only way to get more bandwidth in, and they hoped having more cache and WAY higher bandwidth would make up for the lack of VRAM. It kind of did.
With newer memory, and their own newer memory compression algorithms, they don't need that super high, raw bandwidth to feed their consumer cards anymore.
It wasn't so much a failure, as other technologies more cost effective came along. They still use HBM on some enterprise/compute cards, but games just don't need the bandwidth HBM offered.
@@dark666king Vega 56 and 64, Radeon VII all used HBM
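For a rough sense of the bandwidth argument above: peak memory bandwidth is just bus width times effective per-pin data rate divided by eight. A quick sketch using the commonly quoted stock specs (treat the figures as ballpark, not gospel):

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s = bus width (bits) * per-pin rate (Gb/s) / 8."""
    return bus_width_bits * data_rate_gbps / 8

# Commonly quoted stock figures (approximate):
cards = {
    "R9 290X (512-bit GDDR5 @ 5 Gbps)":    (512, 5.0),   # ~320 GB/s, brute-force wide bus
    "GTX 980 Ti (384-bit GDDR5 @ 7 Gbps)":  (384, 7.0),   # ~336 GB/s, plus better compression
    "Fury X (4096-bit HBM1 @ 1 Gbps)":      (4096, 1.0),  # ~512 GB/s from a slow but huge bus
}
for name, (width, rate) in cards.items():
    print(f"{name}: {peak_bandwidth_gbs(width, rate):.0f} GB/s")
```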
@4:17 you gotta love the extra grilling feature "comes included with every card" for free!
I snagged a GTX 480 really cheap at Newegg right after the 580 was released. Performance roughly matched my other GPU, a GTX 570, but it was a lot louder.
When the Kepler cards arrived I traded the 480 in for a GTX 680 using EVGA's "Step-Up" program.
You should do a history of Markham Ontario's ATI. My first "real" graphics card was a Radeon (although I'm old enough to have used a CGA video card). Interesting Canadian tech history.
Other youtubers already did that.
I have buyers regret occasionally. Spent 900 on a 3070 Ti last winter. It was the replacement to a GT 1030.
The 3070ti is still a great card. Just be happy you have such a great pc.
Really? i have the RTX 3060 Ti and this card is really good!
Same here. I spent $800 AUD on an RTX 2060, it was overpriced, but I'm still glad I have it.
"To top it off, performance was... Underwhelming."
*video pauses for buffering*
I still have my 4500 agp. I mowed so many lawns for that card. I'm being buried with it!
The FuryX was awesome for small form factor builds, I put together two of them in a compact MATX case and it was overkill for even VR back in the day
The card I regret buying the most: I regret buying the 4090 at scalper prices ($2,700). I bought it because we used GPU acceleration at work and I was often working from home and preferred my home rig, which was faster. The 3080 was plenty fast enough for the stuff we did, but I was running into the limitations of its 10GB of VRAM. Three weeks after I bought it, I got fired. I have a new job now but no use for the 4090 other than gaming, for which I don't have time because, well, new job. It's a great card otherwise...
For those wondering, we ran small NN models for a bunch of internal tasks + some client apps. Most models were rather small; the biggest model we worked with was 1GB in size and used 8GB while running (but utilized almost 96GB when training on the in-house rig). A few days before I got fired we launched a new upgraded model that used 12-13GB while running, so for a short while the purchase was worth it
Oh wow, that's such bad luck. At least you picked up a new job pretty quick. Hopefully you were able to make up some money by selling the 3080. Either way, sorry to hear about the shitty situation
@@CreativityNull Oh yeah, I was so happy I got to sell the 3080 for the price I did. I don't remember the exact numbers, but I sold it for roughly $50 more than the price I'd seen it go for around that time. Homey who came over to check it out and buy it even gave me a little extra cash. But my 3080 had a backplate thermal pad mod which was worth way more than the $50, and it had above-average silicon and memory dies - when OC'd it was within 1% of the average stock 3090 in synthetic benchmarks, but I didn't even run it OC'd because I preferred how quiet it was. Also, literally no one I talked to at the time, except the final buyer, believed that I never mined on it lol. People think it's a given unfortunately, even though I wrote in the ad text that I don't sell to miners, for obvious reasons
@Transistor Jump Europe, and technically it was $2,360 back then
So sorry to hear that man. Glad that you're OK and I hope you'll put your GPU to good use! Maybe some VR someday?
1:42 I remember hearing "Nvidia is gonna get shit on when the new R9 comes out. BET!"
HAHAHAHAHAHAHAHAHAHHAHAHAHAHA
I remember buying a GeForce4 Ti 4600 in mid-2002. The GPU was excellent and gave me about 3 years without performance problems. But I'll never forget the fact that a week later the Radeon 9700 PRO came out with superior performance and, at least here in Brazil, a very close price. A strategic mistake for sure.
Wasn't the 9700 Pro plagued with texture issues and buggy colour palettes?
@@CaptainVKanth Seriously? In fact, the 9700 brought some innovations that later became standard in GPUs. I got a 9600 after the 4600's cooler problem damaged the GPU, and the only issue was the ATI driver at the time, which was very unstable (Catalyst, if I remember the name). But at that time the problems with GPUs were diverse, such as not correctly recognizing the AGP speed (here a great advantage of the 9700, which was 8x while the 4600 was only 4x), fluctuating voltage, memory recognition, faulty drivers in Windows XP and many others. Wild times huahuaah.
@@marcoskatsuragi Yeah, my friend had one back in the day and said it was much better than the GeForce 4. But he mentioned texture bugs when playing Need For Speed Hot Pursuit 2. He had to wait for a driver update to fix it. I think some AIB manufacturers also had to release BIOS updates as well.
When I first got into PCs I went through several low-end cards. I started with a GTX 750 Ti. I wasn't happy with the performance, so I bought a GTX 1050 Ti about two months later when it first released. Turns out I wasn't happy with that one either, and I ended up buying a used EVGA 1060 6GB card off of eBay about five months later. I stuck with that GPU for about two years, then upgraded to a standard GTX 1660. I had that card for about a year and ended up buying an RX 5700 XT. I really liked that GPU and kept it until about November of 2020, when I had to get rid of my gaming rig because I moved to Japan for nearly a year and a half. While overseas, I built a new ITX rig with an RX 6600 XT. I was able to ship it back to the States and I still have the same GPU. If I had to say which card I regretted buying, I'd say it was the 1050 Ti. Looking back, I should have just bought the 1060 6GB, as it was only fifty bucks more and had much better performance.
My regret was the GTX 960. I had a GTX 750 Ti 2GB and decided to use my savings to buy the 960 4GB because I was starting with Blender at the time... The card was expensive and the performance difference in Blender was minimal. I was so pissed that I sold it at a loss and got a Radeon R9 380 4GB. In spite of all we can say about AMD cards, that card worked so well and my system never crashed or got a BSOD. I kept the card for 5 years, until the "new minimum VRAM" became 6GB and it started to have trouble keeping up with new games.
If you still have the 750 Ti, keep it. It will run on any OS (XP and up) in any system configuration as long as it has a PCI Express slot of some kind. It can also natively drive CRTs.
3dfx also sunk a lot of cash into a collaboration with Sega to build a console that in a parallel universe would have been what we call the Dreamcast. If they hadn't chosen the Japanese-designed Dreamcast and had gone with the Sega America/3dfx design, 3dfx would have had more runway and could have survived as a 3rd competitor. That prototype was a BEAST, too. The Dreamcast itself was powerful, that thing put it to shame.
The one I regret most is probably the Diamond Stealth III S540 I got in high school, which used the S3 Savage4 Pro+ chipset. Performance was okay-ish in most games, and the S3 Texture Compression (which was later implemented by DirectX and OpenGL on all cards) was pretty nifty in the few games that supported S3's MeTaL renderer, but in hindsight I feel like I should've asked for an Nvidia TNT2 card of some sort instead.
I had S3's first 3D card (the VIRGE). I don't recommend anything S3 makes :)
@@MrGencyExit64 That's actually what the Stealth III S540 was replacing, a ViRGE GX of some variety. Given my poor experience with the ViRGE, I don't know why the hell I wanted another S3 card. 🤣🤣
When I was in engineering school, I ended up getting a Voodoo5 5500 PCI card for my FreeBSD workstation - only had PCI slots at the time. That card was amazing for the things I was throwing at it. The only difference between the PCI and AGP version was the inclusion of the PCI-to-AGP bridge chipset on the AGP card.
And that PCI variant is now the most expensive one because of the system compatibility it has. People who want those cards to relive that era of GLide gaming don't want to get the AGP version that's stuck using a handful of AGP 3.3v motherboards, when the PCI version is ~90% of the performance in D3D/OGL (and has no penalty in GLide) and works on any board with standard PCI. Those cards also have the quirk of using PCI signaling internally and using AGP's PCI fallback mode at 66MHz to get it working as AGP. The PCI Voodoos have no translator bridge, but they do have the usual PCI bus transaction chips.
The Radeon VII gets a lot of hell, but I'm glad I have one in my collection; I think it will be pretty rare in the coming years because of its short shelf life. I really liked the way the card looked, but when the 5700 XT came out with the same performance at a much cheaper price, it was the death of the VII. Still, the VII is different and a bit quirky, and that's why I like it. I don't regret buying it; it was about as fast as they said it would be (2080 level), or maybe just a little slower. I even have a reference 5700 XT - while it was a loud card, it could be tuned and undervolted to fix that. Rocking a 4090 Gaming OC and an FTW3 3080 Ti in my main rigs today, and I'm gonna pick up a Sapphire Nitro XTX early next year.
Biggest regret was trying SLI with two GTX 680s. The micro-stuttering was horrible, the noise was horrible, and the performance increase 100% did not justify the downsides. I never touched SLI again and now only buy the highest single-GPU card I can afford.
I can echo these statements. Bought two GTX 670s, water-blocked them, and OC'd the snot out of them. It seemed pretty slick at the time, but the micro-stuttering and SLI compatibility bugs ruined years of gaming for me.
Speaking of strange cards that make you go: "What the..."
The original 3DFX cards were something very strange that would be utterly unthinkable nowadays. They weren't full GPUs; they were add-ons for existing systems. 3D graphics were expensive, very expensive in fact. So they produced these cut-down, 3D-only chips that ran a slimmed-down, OpenGL-like API of their own called Glide. The 3DFX cards only provided the new 3D hardware, so they were designed to leverage the capabilities of the VGA cards that most systems already had. They came with a little pass-through cable that let you plug your old 2D card into your new 3D card, which you would then plug into your screen. The whole idea would make zero sense for modern PCs, but back in 1996 it was a great way to get an existing system running Quake smoothly. And running Quake smoothly at 640x480 was essentially that era's version of running Crysis.
I remember getting my first Voodoo card. Orchid Righteous 3D. That and Tomb Raider and I was completely blown away!
Who cares about resolution? The quality in graphic fidelity compared to software render blew me away.
@@BigFatCone Oh, you are certainly correct there. GLQuake made the software renderer look like a half-baked prototype. The early days of 3D acceleration were a lot of fun. Stuff evolved so quickly.
@@mirage809 It's been crazy being along for the ride since the 80's. I don't think I got my Voodoo 1 until Quake 2 was a thing but the point still stands.
Actually, there isn't a GPU I bought and regretted, as I plan my purchases rather carefully. I got the GTX 460 at the end of 2010. While it didn't age that well, it worked fine for me until I upgraded to a GTX 1070 in 2016 - and it's still working! The 1070 aged much better and is still my main GPU to this day. While the 460 struggled six years after I bought it (the GT 1030 D5 is actually stronger than it), the 1070 is still very capable and only slightly beaten by the 3050, as long as I can live without RTX and DLSS (thanks AMD for keeping my 1070 competitive with FSR!). I'm still waiting for a reasonable replacement that will serve me well for another 5+ years.
I guess the problem is that while performance per generation is still increasing nicely, it now takes much longer than the old "new generation every year" cadence, so overall development has slowed down.
There used to be a guy from Poland who grilled some ham and cheese on the GTX 480. The video was called "gotuj z GeForce" ("cook with GeForce") if I remember correctly.
Edit: Here's the video th-cam.com/video/y2OKoEIbGr4/w-d-xo.html
You legend!
Hey, remember when it was discovered (yeah, discovered, not announced) that the GTX 970 had only 3.5GB of full-speed VRAM, with the last 0.5GB segment hanging off a much slower path?
I had a 480, thing was still beastly and was able to play GTA V just fine.
As for other video cards that sucked ass, I nominate the GeForce GTX 295. A card that was literally two cards sandwiched together, sucked in dust like nobody's business, and the marketed RAM was effectively halved because it was duplicated across the two GPUs.
Without the GTX Titan Z priced at $3K how would videocard buyers let Nvidia know they were willing to spend over $1000 for consumer graphics cards?
I direct-ordered my very first GPU (an Orchid Voodoo 1) back in '96. From '96 to 2000, 3DFX really ruled the market. It was truly a revolutionary product. Prior to 3DFX, you pretty much had to upgrade your CPU/motherboard on a yearly basis to get decent FPS from software rendering. I still have that Voodoo 1, btw, in a classic gaming rig running Win98 lol. Fun to compare it to my 4090 to see how far we've come.
Dell XPS 19 laptop with 2 8800GTX cards... probably the biggest laptop ever made...I really should send it over to you guys to take apart and toy with. I mentioned it in a comment before that got tons of engagement.
Write them an email, and say you will trade it for a shirt with the autograph of linus
I was debating between getting two GTX 970s in SLI or a single 980 for my PC at the time. I looked at benchmarks and found the dual 970s performed better, albeit at a higher total price. Two weeks after buying them, the 980 Ti came out; it cost the same, and since it didn't rely on SLI it would have been by far the better purchase. I still regret trying to build an SLI system - it was my first, and it will be my last. I'm kinda happy that trend has gone the way of the dodo, so no one else will waste their money like I did back then.
The 970 is still a good card, but since 95% of the time I was getting 970 performance despite effectively paying 980 Ti money, it was a really, really bad purchase.
Lol, I am still using a single GTX 970 to play all my games. Not a single game I've played has been a problem for me. And seeing these new GPU prices, I think I'll never be able to buy a "next gen GPU" at launch anymore. But still, it's been a fun journey.
I had my bout with Crossfire and even multi-brand mGPU. There were probably only ever three games with flawless Crossfire/SLI scaling, and they were running the Mantle API, which I don't think Nvidia supported.
The absolute king of the crop was BF4 running Mantle, where two Fury Xs scaled better than 100%, and the main thing is that it was a perfectly buttery-smooth experience with absolutely ZERO stuttering, even on an old 3rd-gen i7. If games could've continued to scale like that, then I'd want liquid-cooled (Vega 64 LC, anyone?) GPUs in Crossfire in my rig to this day.
It was actually pretty amazing for gaming/productivity if all you played was BF4 (that was me), and then when it was time to render something in Blender you had a load of powerful GPUs at your disposal (back then AMD actually rendered faster than Nvidia in Blender, a trend that's no longer the case). To this day BF4 has given me the smoothest, most stutter-free experience I have ever seen, and that was running two liquid-cooled Fury Xs that ran silently.
So the promised land of Crossfire did exist; it's just that shortly after, Mantle was abandoned and DX12 never lived up to its mGPU promises.
I wish multiple gpu scaling was a thing still. I know it's stupid in terms of power and price, but it looks so damn cool
@@YashPatel97335 I would still be using one of my 970s if they didn't start giving me weird graphical glitches, I suspect it was due to overheating from dried up thermal paste, but I never did verify it. Got a 3070 just before the scalpathon of 2020/2021/2022 took place
@@QueenSaffryn cool. The point is that 970 and 980 are still very good cards if you want to hold out on buying new cards. And you can always get one generation behind cards at very reasonable price just like you got 3070.
I regret buying my RTX 3090 at the very peak of scalper pricing, because I paid, funnily enough, $3,090 for it.
980 Ti, a pretty powerful card at the time, but with OC and overvolting the PCB got a little bit black in the VRM area. The waterblock and a fan on the back didn't help with that 😅
I remember going out and buying a 980 (upgrading from a 660 Ti) because I had just bought Arkham Knight and wanted to play it at the best settings with the "PhysX" stuff enabled. I don't regret it for a minute. Arkham Knight looked amazing, with long-overdue use of the Batmobile.
The Titan X (which is just a 980 Ti with 6 more GB of VRAM) is probably the most "definitive" legacy Nvidia card available in my arrogant opinion. Needs external power, but beyond that, can work in any OS from Windows XP up and can also drive any monitor including CRTs. And even today, the card is surprisingly powerful.
I remember when they were telling GTX 480 owners to let their cards cool for a few minutes after shutting down their PC before touching the card because it ran so hot.
None, in 2015 I built my current system sporting a GTX970, great card for the money at the time. I upgraded that same system to a GTX 1070 FTW, also a great card, which was only replaced by an RTX 3070 FTW this year. Now I'm a bit bottle-necked by my aging i7, but I intend to move this GPU to an i9 system early next year. The only time I've been disappointed by performance was on a prebuilt back when I was in high-school, lesson earned, every system since then I have built from scratch after much research. Although I do remember having to upgrade from 2mb of ram to 4mb of ram ($800 at the time) on my first computer, a Mac, so that I could run Marathon at a frame-rate above a slideshow presentation, oh how times have changed.
Those AMD GPUs with built-in water coolers, and the later AMD FX series... the amount of power they required and the heat they generated sure was something.
Well, I was on a budget so I picked the integrated RTX 620 HD
Works awesome 😁
The card I think I regret buying most would probably be the GeForce 4 Ti. That thing cost an absolute crapload of money at the time, and when I got home with my shiny new graphics card, I discovered my PC (a Dell Dimension 2400) did not in fact have an AGP slot! And by the time I got a new computer, PCIe was already mainstream. I was pretty much forced to sell it second hand later down the line for chump change (the store I bought it from did not take opened boxes as returns - once it was open, it was yours).
I briefly owned an NVIDIA 7950 GX2 - the two-GPU sandwich card. I actually wanted to get an X1950 XTX for the combined HDR + AA goodness, but it was out of stock everywhere. No regrets about the 7950 GX2 though - I managed to sell it off a few months before the 8800 GTX got released and got that one with minimal additional investment.
I am a fan of 3Dfx. I had the Voodoo 1, Voodoo 2, Banshee, Voodoo 3 3000, and ended with the 5500 AGP. Loved how that card had Halo Warthogs and Spartans on the back of the box. What people seem NOT to talk about is that 3Dfx cards used an API called Glide that gave them a "performance" advantage over most at the time. However, it was limited to 16-bit color and textures, whereas OpenGL and MS Direct3D supported 32-bit, and games started going that way. The newer Voodoos never truly supported OpenGL or Direct3D correctly, while Nvidia and ATI did. 3Dfx's unwillingness to support the new features, plus the "cash grab" you mentioned, destroyed the company. It's funny that many of the features we currently use evolved from 3Dfx's infamous T-Buffer technology. I sure hated buying that 5500 from Best Buy and then losing ALL support when switching to Windows XP. Painful.
My worst purchase was my still-current GTX 1070. That puppy ended up being SO much bang for its buck that, thanks to all the goings-on in the GFX market in recent years, as well as me getting more into less graphics-intensive genres, I still can't justify upgrading it to a point where I'd be satisfied for anywhere close to a sensible price - even at MSRP.
5:32 I was expecting some Police Squad shenanigans at the end. Missed opportunity!
Waaay back in the day of the GeForce 4, I bought the literal cheapest GeForce in Nvidia's lineup. It was soooooo bad, it could barely run anything new at the time on low settings! I vowed never to buy another budget component ever again.
Diamond Stealth III S540 (Savage4 Pro) - back in the late 90's it was supposed to be a budget video card that, thanks to some neat tricks like a hardware texture compression decoder, was able to provide better-looking graphics than NVIDIA's offerings. The better imagery was true in games that supported S3 Texture Compression (basically Unreal Engine), but the card was slower than the competition because of boneheaded design decisions like not including a fan on the GPU.
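For anyone curious what that texture compression actually did: S3TC was later standardized as DXT1/BC1, which packs every 4x4 block of pixels into 8 bytes (two RGB565 endpoint colours plus 2-bit indices), a fixed 6:1 ratio versus 24-bit RGB. A rough decoder sketch for a single block, written from the published format rather than S3's own code:

```python
import struct

def expand_565(c):
    """Expand a 16-bit RGB565 value to an (r, g, b) tuple of 8-bit channels."""
    r = (c >> 11) & 0x1F
    g = (c >> 5) & 0x3F
    b = c & 0x1F
    return (r * 255 // 31, g * 255 // 63, b * 255 // 31)

def decode_dxt1_block(block8):
    """Decode one 8-byte DXT1/BC1 block into a 4x4 grid of RGB tuples."""
    c0_raw, c1_raw, indices = struct.unpack("<HHI", block8)
    c0, c1 = expand_565(c0_raw), expand_565(c1_raw)
    if c0_raw > c1_raw:  # 4-colour mode: two interpolated colours
        c2 = tuple((2 * a + b) // 3 for a, b in zip(c0, c1))
        c3 = tuple((a + 2 * b) // 3 for a, b in zip(c0, c1))
    else:                # 3-colour mode: midpoint plus "transparent" black
        c2 = tuple((a + b) // 2 for a, b in zip(c0, c1))
        c3 = (0, 0, 0)
    palette = (c0, c1, c2, c3)
    # 32 bits of indices, 2 bits per pixel, lowest bits = first pixel
    return [[palette[(indices >> (2 * (row * 4 + col))) & 0b11]
             for col in range(4)] for row in range(4)]

# Example: a block with a red and a blue endpoint
block = struct.pack("<HHI", 0xF800, 0x001F, 0x1B1B1B1B)
for row in decode_dxt1_block(block):
    print(row)
```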
A 1070 from Gigabyte... I RMA'd it because of a bad fan bearing, and after 8+ months of them claiming they never received it - despite USPS proof of delivery to their warehouse, on what was supposed to be a 2-week fix - it showed back up at my house. I had already given up and bought a new card by then, so I didn't test it before selling it, but it came with documentation saying it was cleared by their RMA department... despite my emails with them still claiming they hadn't received it.
I think I’ve done well so far. 750ti back in 2015 that got the job done with no fuss, later upgraded to a 960GTX that soldiered on way longer than I expected. It’s currently on hiatus until I build my wife’s PC (she doesn’t game like I do so doesn’t need anything too heavy in the GPU dept.).
Currently running a 3070ti FE, which I knew was a bit of a red headed step child in reviews but it’s the best I could afford at the time. I’m still on 1080p anyway so it’s easily coping.
I once bought an EN 8400GS. That genuinely seemed no better than the on-board graphics, and was possibly worse.
I've really not had a long history with GPUs that I've owned personally. I only built my first PC back in 2013. Before that I'd game on a laptop with a GT 555M, and prior to that on an iMac in Boot Camp.
All in, I've only owned three graphics cards: my original build had a GTX 650 (currently in my server for encoding purposes), then an R9 280X, and now a GTX 1060 which I've had for the past 6 years.
Of all of them the worst was probably the 280x. It was just a rebranded 7970 at the time and while it was performant, I had a lot of headaches with drivers and missed features such as a dedicated encoder. But the big issue for me with the card in recent years is the power draw. It sits in a fractal node 202 which I use as a living room rig, and it really struggles to dissipate the heat. 225W of power draw ain't great in a case that small.
i loved my 480. It kept the heating bills low, and it could almost play A Hat in Time!
....I used it until 2020, when premiere stopped supporting the card. Funny since I got it when adobe was liquidating their stock of the card from their workstations.
As the old saying in economics goes, the fool isn't the one who sells but the one who buys.
youtube commits blasphemy against US law, freedom of speech
Biggest loss for me was my first real PC. I purchased a computer from CyberPower with a decent CPU (Intel E8400) but an Nvidia 8600 GT. That thing sucked, but I was brand new to custom PCs. Best thing is that I learned a ton from that purchase, and it led me to a life of PC tech support (both professionally and now as the guy who fixes everybody's PCs for free).
That was around the time I switched to ATI even though I had been a big Nvidia fan.
what a time to be alive, the channel is still named tesla
There was also the Voodoo 5 series - the Voodoo 5 5500 and Voodoo 5 6000; the 6000 had four Voodoo chips on it. Then there was the leaked Rampage graphics card they were working on before they were bought by Nvidia. It would have been a DirectX 8.1 card in hardware, with apparently some early DirectX 9 functionality, if I remember right.
Allow Riley to keep making these vids funnier, this is gold lol
The my ex joke and the awkward silence at the end made me lmfao
I recall playing games on my GTX 480. I could hear the fans through the headphones, and my mic easily picked it up and almost drowned my voice out under load; good times. I remember how I tried to run VR on it and calculated I might be able to on a 480 SLI setup with the NVIDIA VRWorks... a shame it was discontinued. might have gotten 10-20 fps on VR... Not good for playing, but to say I could run VR on such old hardware and above 5 fps would be a cool thing to say IMO.
One card I regretted buying was an 8800 GT from XFX. While the card itself worked, the fan on it was appallingly loud, even at idle. I can't fathom why I didn't return it at the time, but I managed to live with a 24/7 hairdryer inside my PC for the better part of 2 years. Thankfully it died before the 3-year mark, as 8800 GTs were never long-lived cards; both of my brothers' 8800 GTs also died before reaching 3 years. It was an excellent performer for the money, to its credit.
I remember saving my money and buying a Voodoo 4500 back in the day. I also still have my dual EVGA GTX 260 Core 216s.
I still regret getting an R9 380 from Gigabyte. It was one of the first purchases I made as a PC builder and by far the worst. Performance was good, but it didn't last: the card died after just a few weeks. Newegg wanted nothing to do with it and I was forced to go through the manufacturer to get a new one.

After 4 months of being forgotten about repeatedly by Gigabyte, they offered to send me a new card as they couldn't repair the old one. After a few more weeks I finally got it back, only to find out I had been duped. I immediately started having issues with the card black-screening and subsequently underclocking itself. Perusing the internet pointed to a bad refurb, and after checking the serial number and comparing it to what I had on file I learned that they did in fact "repair" my card and sent it back to me without saying a thing. I tried getting them to send me the new card they had promised, but after a bit of back and forth they just stopped talking to me.

To this day I refuse to buy anything Gigabyte makes; I can't bring myself to support a company that treats its customers that poorly.
First PC I ever owned was a dinosaur I bought with money I scraped together as a kid, and it came with an onboard GPU that was nothing but trouble. It was a Trident CyberBlade, a collaboration between two awful budget manufacturers that used to exist. It didn't work well due to driver issues, but there was zero chance of them getting fixed because, get this, not long after it came out (well before I got one) Trident and CyberBlade decided they were going to sue one another and disavow the stupid thing. Ultimately, even the hilariously dated, dog-slow Nvidia PCI TNT2 64MB I bought for pocket change to replace it was a huge improvement in both performance and stability.
Wow, the Voodoo4. I remember a really funny and scathing review of that card on Sharky Extreme back in the day. Good times.
I had a GTX480. It ran hot yes but it was the king at the time.
I wonder if drivers ever made it better?
Matrox: the only card built for gaming that didn't support a single game.
There was a card from Sapphire way back in the 00's (I can't remember the specific model. I'll edit this post if I find it.) It was supposed to be a competitive "budget" card. Not only did it suck wind, but there was a "knob" on the card's fan that stuck out nearly a third of the width of the card itself. This meant that you could not put any card in the adjoining slot at all. It was the worst physical design flaw I had ever seen. Thankfully, I believe I was operating on a 240 from Nvidia at the time, so I wasn't even tempted. Nothing stellar about the 240, but it was decent for the time.
I am still using my Fury X. It plays games like Bannerlord, Warhammer 40K: Chaos Gate and Pathfinder. I agree that it was dumb to claim it was overclockable. But HBM is still something that will probably show its face again in the near future. Also, this card is literally 4.5" tall and 8" long (plus the added radiator). New high-end GPUs are the size of entire ITX builds... let's talk about that for a moment. WTF are 90% of computer cases supposed to do with a GPU the size of a twin bed?
My biggest graphics card mistake wasn't a card itself, but the connector. I was upgrading and decided to stick with AGP for the new motherboard, instead of that new upstart PCIe standard which probably wouldn't go anywhere...
Let's completely forget about the GTX 1630
The weirdest thing about the Fury X is that it CAN overclock quite well by modern standards, often achieving 12-16% over stock with little effort. AMD had been validating Fiji (the codename of the GPU core inside Fury X) at up to a 20% overclock, at their default target of 1.2v. Clearly some marketing whiz heard this and started hyping the overclocking capabilities, ignoring that the engineers achieving this level of overclock were using binned GPU cores from their own lab, not cores that were being mass produced and which were targeting a more broad range of stability.
What Fiji excelled at though was undervolting, and AMD themselves showed that off with the R9 Nano; a Fury X through and through but with the power limit chopped in half. You can take a retail Fury X and drop the power draw by over 150W without sacrificing a single MHz in games or demanding tasks, and sometimes you can even get 40-60MHz extra BACK with a slight overclock. Fiji started the modern age of undervolt + overclock.
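The reason dropping voltage without dropping clocks pays off so handsomely: to first order, dynamic power in CMOS scales with frequency times voltage squared, so shaving voltage at a fixed clock saves power quadratically (and real silicon usually does better still, since leakage also falls with voltage). A back-of-the-envelope sketch, using the 1.2 V figure mentioned above; the rest of the numbers are purely illustrative:

```python
def dynamic_power_ratio(v_new: float, v_old: float,
                        f_new: float = 1.0, f_old: float = 1.0) -> float:
    """First-order CMOS dynamic power scaling: P ~ f * V^2 (leakage ignored)."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Illustrative numbers: stock 1.20 V vs. an undervolt to 1.00 V at the same clock.
ratio = dynamic_power_ratio(v_new=1.00, v_old=1.20)
print(f"Dynamic power at the same clock: {ratio:.0%} of stock")   # ~69%

# Using a 275 W-class board power as a reference point -- real savings vary,
# and reduced leakage usually pushes them further than this estimate.
print(f"Rough savings on a 275 W card: ~{275 * (1 - ratio):.0f} W")
```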
My dad got an AIO 2080, his first ever liquid cooled anything, and then he nicked one of the coolant tubes when he built his new rig.
They actually repaired it for free when he sent it back, but it took like 3 months, so he ended up buying an air cooled 2080 instead, and I got the AIO one for free 😀
It doesn't perform amazingly anymore compared to more modern cards, but I still love it.
The R9 Fury dropped to $400 over night and was a great card at that price.
The worst 3D accelerator I know of is the Alliance AT3D. The rendering quality of that card is just cursed, due to a combination of using dithering for everything, not supporting many vertex formats, and the mip mapping being basically random. To be fair, it is slightly faster than the S3 ViRGE, but that only means you're getting the absolutely cursed visuals faster.
And it's even funnier that many Voodoo Rush cards had it as their 2D chip, creating a monstrosity of a card that had both the worst and best 3D chips available at the time on the same board.
Then there was that XFX Radeon GPU with no cooling that had 1GB of VRAM and used "HyperMemory", which meant it chewed up a good chunk of the system RAM in demanding games.
Ah the only one I regret is a 3070Ti where I listened to some online reviewers and in store sales people instead of my gut. On the flip my RX480 ROG STRIX was an awesome purchase at the time.
I've never regretted a GPU Purchase, only which Reference/AIB Model & Cooler it came with
An RX 6600 MECH to replace my old 1060, which was very silent... I thought it wouldn't be loud. The fan curve is adjustable in the editor, but the minimum is 21%, which is still pretty audible, and then it's even capped at 100W, which makes it slower than other 6600 cards out there unless you undervolt the crap out of it - which doesn't really work that well, because it's a budget card and the silicon doesn't leave much headroom. If you don't bother with fan tuning, you either get zero noise or a jet engine once it goes above 45°C.
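For what it's worth, the behaviour described (zero RPM below a temperature threshold, then a hard floor of 21% duty) is just a clamped curve; here is a toy sketch with made-up temperature points, not MSI's actual firmware logic:

```python
def fan_duty(temp_c: float, zero_rpm_below: float = 45.0,
             min_duty: float = 21.0, max_duty: float = 100.0,
             ramp_start: float = 45.0, ramp_end: float = 80.0) -> float:
    """Toy fan curve: silent below the zero-RPM threshold, then a clamped linear ramp."""
    if temp_c < zero_rpm_below:
        return 0.0
    span = (temp_c - ramp_start) / (ramp_end - ramp_start)
    return min(max_duty, max(min_duty, min_duty + span * (max_duty - min_duty)))

for t in (40, 45, 55, 70, 85):
    print(f"{t} C -> {fan_duty(t):.0f}% duty")
```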
Man now I feel old. My first gaming rig had a dual gpu 3dfx. Pops bought it as the company was going under. It gave me a few good years. I was able to play this nascar game and flight sim. I got his pentium 3 and the 3dfx card rig because he bought just before the pentium 4 came out.
I was thirteen and had $200 in birthday money. I saw GTA IV for PC at Target for $20 and convinced the parentals to buy it for me. Later I went to the local PC shop with my remaining money and asked the guy behind the counter for a graphics card that would run GTA IV. I walked out of there with $5 left, but now also the proud owner of an Nvidia 8600 GS. I ran back home and spent the next hour installing my newfound treasure in the family PC.
Let us say that I was very disappointed when that card could not even manage 20fps on the lowest settings.
My GTX 1080 Ti with an AIO water cooler.
The radiator fan was always real noisy and a few months ago the pump died, I was hitting ~105C when I realized there was a problem.
Now I need to find a new heatsink somewhere.
The GPU I regret buying the most was actually the AMD RX580.
I was building the best computer I could afford with the money I had, which was by all means a budget PC. I walked out of Best Buy with a GTX 1060 and installed it, snapped a pic and sent it to a discord server where my friend said
"Dude! Return that right now and get an RX580!! It's cheaper and has WAY better performance!!"
I'd never had an AMD card before so I decided this was gonna be the shot I gave them. I went back to Best Buy and came out again with the XFX version of the RX580.
Spent a little while putting it in and it was smooth sailing, I was getting great performance and it didn't leave me wanting. For a month.
The day after the warranty expired my computer wouldn't turn on. I spent the entire day diagnosing until I found that switching to my integrated graphics worked. I took the GPU to a friend's house and confirmed it was dead. Best Buy wouldn't take it back, and after a bit of back and forth with AMD I was SoL.
I ended up playing without a dGPU for a while before saving up just enough for the then-new 2060. I haven't gone back to AMD since.
To be completely clear, this isn't to rag on AMD or to say people who buy their GPUs (or even the rx580) are somehow stupid for it. The year to year back and forth is getting really strong on team red and on paper they've got impressive GPUs for extremely competitive prices. But that experience soured me to the brand, and now I just hope Intel comes out with decent GPUs sometime soon because Nvidia annoys the hell out of me.
I am very disappointed the 2060 wasn't on there... even NVIDIA agreed it shouldn't have existed by re-releasing it as a 1650 or whatever. The entire reason for the 20xx series' existence was ray tracing, and with that enabled the 2060 performed worse than cards from generations below it. Its existence was a waste.
The card itself is not terrible; even on Nvidia Ampere, ray tracing is still an FPS killer.
@@mrbobgamingmemes9558 I stand 100% by what I said 😄
@@DragoNate the card is not e waste but the ray tracing ones, nvidia should not release ray tracing if their own gpu stuck at slideshow fps
@@mrbobgamingmemes9558 I think I'm slightly misunderstanding what you're saying...
the 2060 sucked at ray tracing, which was literally the entire reason for its existence as a 2060. turning on ray tracing made its performance worse than a card 3 generations prior. therefore, the 2060 was pure waste.
the 1650 was a 2060 (architecture, etc) minus the ray tracing. that was fine. it wasn't amazing performance or anything, but it was a new generation card that wasn't overpromised/under delivered & still slightly better than the 1060. I have no issue with it.
my issue is purely with the 2060. if the major selling feature of ALL 20xx cards, including the 2060, couldn't even be used without it su*king A$$, then it shouldn't have existed. they should've simply made the 1650 and omitted the 2060 entirely.
I remember being really disappointed by an S3 Graphics card (not to be confused with Amazon S3.) Very few games supported it and those that did... well, it usually made no difference in terms of graphics quality.
An HBM "What happened?/where did it go?" Video would be fun.
Before I knew anything about GPUs I upgraded from my RX 570 to a GTX 1050... needless to say I did not notice much of an improvement, and I'm now very happy with my 3060 Ti.
Way back in 1996 I bought a Matrox Mystique; the card ended up being nicknamed the "Matrox Mystake". A glutton for punishment, I followed this card with the Matrox G400, another "winner" of a graphics card.
I kind of have to disagree on the Fury X. It was far from a great card, but it wasn't useless or a mistake. In very niche usecases it was ideal.
At that point I had built a small ITX system with an i7 6700K (which I wanted to OC, air-cooled) and, at first, an R9 Fury. Due to the small volume of the case and the limited airflow, temps would rise quickly and things started to throttle. I switched the GPU for a reference GTX 980, but that didn't work out either.
After that I switched to an R9 Fury X, put a small 140mm AIO on the CPU, and all the issues were solved: the CPU maxed out around 72C and the GPU would never go above 52-56C, even when overclocked.
A very niche usecase, but it was perfect for my use.
Heck, it was a high-end GPU in a small form factor that ran perfectly silently. It was also basically the last high-end GPU that could be used in BF4 under the Mantle API to experience Crossfire at its absolute peak (back then BF4 scaled two Fury Xs to over a 100% performance gain and it was 100% stutter-free - unheard of in Crossfire/SLI, and idk if we'll ever see that again), and it did so silently.
Nvidia never gave me that experience and I'm so glad AMD did because that was something to see.
True mGPU is 100% possible in games because I've seen it done, devs just abandoned the concept.