ERRATA:
The 980Ti has 96 ROPs, not 92.
Your videos are amazing, mate, you are a legend, keep up the good work. Love from Japan! You are underappreciated on YouTube!
GYATTA:
Thiccums
Hi mate, kindly do the Nvidia 1050 Ti, I'm still holding onto it even now. Thanks, keep doing good things, friend. Love from Kenya 🇰🇪
Gamers Nexus is coming for you for this mistake!
Unfair comparison; AMD named it differently in order to compete against the Titan, not the normal series of cards. The R9 Not-So-Fury series competes against the Titan X, Xp, Xpp.
I used my 980 Ti for about 8 years and retired it on February 9, 2024. Now it's a shelf piece; I pray to it every day in the hope GPU prices will one day return to what they once were.
amen
So you can sell your 980ti for $250... lols :]
Ah, a man of culture. I was using them in SLI and one died in 2022 while the other died this July. They will be missed.
1080 ti shall be exciting
It’s going to be a slaughter
1080ti is the GOAT for sure. Price, performance, efficiency, reliability, and has enough VRAM to continue kicking ass in 2024 and beyond.
I’m still using one lol
Just upgraded to a 1080ti, don't plan on upgrading till the newer generations match the price.
@@supergamer-sy4sj Well obviously it's the RX 580 vs the 1080ti, we already know what the outcome is going to be.
The R9 Fury series was the beginning of AMD's marketing team getting a large amount of flak for being trash, by quote-unquote calling it an "overclocker's dream". The 980Ti shit all over the Fury series at stock, and the 980Ti's insane overclocking headroom made it a bloodbath.
yeah it was a strange but cool card. Big drawback was the 4GB VRAM, while the 390x (which was way cheaper) already had 8GB
i remember filling up over 4GB with a decently modded Fallout 4 at the time. And that's not even talking about playing in WQHD or even 4K..
and with my modern modlist it would see no daylight, except if only using 1k texture packs for mods. But even then it could have VRAM spikes and frame drops.
the fury x should have been cheaper than the 980ti and had at least 6GB. Then it would have been a great card.
it was over 30% more expensive than the 390x and as expensive as the 980ti.
But its place is in between them.
I think it's worth mentioning at the very least that the 980Ti came a little later; the 980 was the card out at the time, and it only had 4GB of VRAM as well. The 980Ti is still the better card overall at the end of the day.
I don't understand how the AMD graphics team didn't realise that a GCN 3.0 R9 390x (2014) was what people really wanted, then:
2016: 14nm 1800x/1700x/1600x + RX 480/RX 470/ RX 460
2017: 12nm 2990wx/2970wx/2950wx + V64/V56/V48
2018: 14nm 3950x/3900x/3800x/3700x/3600x + 5700xt/5600xt
2019 CES: 7nm 3990x + Rad vii // RDNA2 lineup
2020: RDNA3 + zen3D AT LAUNCH
2022: RDNA4 + zen4D AT LAUNCH
2023/2024: RDNA5 + zen5D AT LAUNCH
The Fury cards should have been for the INSTINCT lineup ONLY (especially given how tiny it was), and only had a limited run to test the waters, instead ramping up production of the Rad VII, because that is 7nm and has the improved version of HBM.
Also, sell it at $999 because it has 16GB of HBM2, dammit; why price it so low?
Miners would have been all over super cheap GCN cards in 2012-2015; then go full production with Polaris (2016), and after 1 year go Vega (2017), where Threadripper / Vega is the ace to catch Intel / Nvidia off guard.
You couldn't even overclock the HBM on it.
My 970 overclocks to infinity and beyond... never crashed it, it kept going until I started to get wide-eyed with caution. I had to double and triple check that it was even taking the OC; definitely got a golden piece of silly string.
I think Heaven crashed once... I run it at stock clocks because I just don't care for the noisy heat..
Anyway, +1 for Maxwell, it will OC pretty freaking far.
The GTX 900 series has another trait that most people don't care about: they're the very last GPU generation with a native analog VGA output (via the DVI port) that allows high-res, high-refresh output to drive CRTs at ridiculously high FPS. And no, converters from HDMI or DisplayPort to VGA aren't a substitute, as 99% of them can't exceed either 1080p or 60Hz/FPS
in other words: they're very desirable for a specific kind of nerd, basically lol
edit: coincidentally it's also the very last generation to get XP drivers
@@Knaeckebrotsaege They kind of suck for Windows XP though since the drivers are half-baked. Better off in Win7. Fermi is a good overkill XP option. I have used my GTX 980 for CRT shenanigans before :)
Pretty sure my old 1050 has a dvi port though
@@shadyb834 Nvidia 10 series cards don't have analog DVI output (DVI-I); they have DVI-D, which is digital, not analog.
Yet mine is worth like 40 bucks if I can find a sucker... Cool piece of interest tho.
@@FacialVomitTurtleFights Low-end garbage like 950s, 960s and 970s is basically worthless, yes. I was talking more about the 980Ti and Titan X.
4GB was an HBM1 limitation, not a cost one.
AMD and Nvidia should Collab and make the "Ryzen 4070"
Man of culture 😂😂
I'm more looking forward to the ryzen 4090
Ztt fandom, lessgoo
intel i7 8809g flashback
😐
I would consider the 980ti the oldest card that is still powerful enough to play modern games at playable frame rates. Pretty incredible out of a 9 year old GPU. A nice bonus that it’s the card to get if you’re rocking a CRT with the native analog support.
Yes, but also no. I'm currently working on an FX-8350-based retro gaming PC to play on a big 1080p LCD or a 29" CRT. I've got older GPUs, but I'm going to set this up so I can quickly switch system drives and GPUs. One will be a 980Ti, the other an R9-290. Thus the need for swappable system drives (Windows 7 Pro). I could use VMs, but nah.
These videos always make me realise how little the last few years' graphics improvements really did to the overall look of games, yet how much more performance they require.
Yeah. When games like AC Syndicate and Odyssey look better than Cyberpunk, you gotta wonder what's going on under the hood.
@@XiaOmegaX They definitely do not look better than Cyberpunk with ray tracing.
@@PCZONE1 I'd say Cyberpunk without RT looks better.
@@PCZONE1 I played it, I have a really good setup with a 4090, and yes, it looks better. But consider the fact that I only had 60 FPS at the best of times, with half of them being interpolated. The performance cost is so insane and the fidelity increase so small, it doesn't feel worth it anymore.
@@MeowMeowMeowMeowMeowMeowMeowww I play at 1440p on a 3080ti with DLSS Q and it still looks fantastic, and I get 60 fps. With FSR frame gen it's 110 :)
An uncommon case of the "fine wine" not working for AMD. Maxwell was just that good, and GCN 3.0 wasn't that much better than GCN 2.0. Not to say the Fury was bad; it performs well enough. It's just that the GTX 980 Ti was stronger and more power efficient while debuting at the same price.
Yeah, it's why I didn't go for the Fury cards back then and just stuck with my 270X Bumblebee until I upgraded to an RX 570. I'd kinda describe the Fury cards as AMD's effort at competing against Maxwell in the same vein as Intel's 13th/14th gen CPU fiascos: throwing everything and the kitchen sink at trying to trade blows with their Ryzen counterparts. Except the Fury cards weren't half as disastrous in the long run compared to what Intel's going through now.
AMD also lied about fixing the VRAM, instead supporting HBCC on Vega where it wasn't needed, and AMD dropped driver support for Fury years before Vega.
The 980 Ti still has driver support. I have a regular 980 and it's still humming along in my old 9900k
Driver support for the 980 ti ended about 5/6 months ago; I have one and got it a month after support ended.
@@Muddy.Teabagger The latest driver still supports the 900 series cards. I just checked the NVIDIA site. However, I don't think that will last much longer.
🔥
It has "support". Nothing changes whether you use the latest or, let's say, the 470 version, even in the latest games.
@@Muddy.Teabagger Driver version 560.70, released on the 16th of July 2024, still supports the 900 series, including the 980ti. Why are you under the impression that this card isn't supported anymore?
mate, thanks for this test!
But I really wished for you to include the framerates of the r9 390x, cause the fury x placed directly in between the 390x and the 980ti,
while costing as much as the 980ti.
i remember it was extremely odd to see the over 30% cheaper 390x having double the VRAM of AMD's flagship card.
safe to say i bought the 390x nitro for that reason. Got it a while after release for 300 bucks brand new, while the fury x and 980ti were still really expensive.
Because 4GB is the maximum you can get with HBM1.
Really looking forward to seeing how the Fury X performs with the RID drivers!
Nvidia still makes drivers for the 980Ti for modern games. Polaris, Fury, and Vega have been ended, and modern games are getting glitchy on them as a result. It helps that the 980Ti actually kind of holds up in modern games though.
I still sometimes daily drive my Zotac GTX 980 Ti Amp! Extreme instead of my newer card just because it's such an awesome piece of tech, it still looks cool and it performs surprisingly well. It overclocks to over 1525MHz with stock core voltage (and can handle +700MHz on memory) and can handle many games even at 1440p. Its performance is slightly better than a stock GTX 1070 Ti, but it of course draws a whole lot more power with the massive overclock. FSR 3.1 and Lossless Scaling have also given it some new life.
I wish more youtubers would actually run their 980 Tis with proper clocks, since no one in their right mind runs them stock; they have massive amounts of OC potential and can usually hit 1500MHz.
this man can make some numbers interesting for over 20 minutes on stuff i already know
that takes some pure skill, ty for the video and good job
Holy cow. The White Dwarf Mag/Warhammer 40K Grey Wolves reference. Thanks for the great review, as I'm debating whether swapping out my GTX 980 to a 980 Ti that I had sitting around in an old system is worth it. It certainly looks like it!
Crazy how the GTX 980 ti is still one of the oldest graphics cards that can render most modern games at stable frame rates while still being supported. It's 9 years old;
it just shows how expensive gaming graphics have been for the past 9 years.
Ahhh.. Fury era. Back then when AMD decided that they wanted to complicate their naming even more and confuse the customer base.. 😅
Great video, but I have one suggestion: you should use 30 for the increments on the X axis of the graphs, it makes more sense in the gaming performance world.
Nice. Been ages since I've powered up my rigs with these cards in them. Just not enough time in the day to play with my hoard/collection. Thank you.
Oh man, do I remember my 980ti, that card was an absolute beast! I got the Gigabyte Extreme version & that bad boy ran 1550/2000 all day long. I miss the days when a 50% OC was something that could be possible with the right silicon...
Actually... good point about the Fury X failure leaving AMD unable to compete at the very top ever again. It actually worked out for us consumers, if you ask me. For years now I have never been able to recommend an Nvidia card to my friends and co-workers building PCs because the green cards' prices keep getting less and less reasonable. I'd hate for both options to be WILDLY overpriced.
@IcebergTech Another great video. 👍
For your next video, bear in mind the Vega 64 Liquid Cooled model was the top tier Vega and would maintain higher clocks thanks to better cooling and a higher power limit. The Air cooled Vega 64 cards with a UV OC would sometimes beat the 1080 (non ti), but they were still a bit limited by power and thermals.
The problem with the "Fine Wine" term with AMD is that in most cases it was all AMD's fault. It took up to years after an architecture released for their driver and utilization optimization to reach a healthy point, prior to the shift to DX12/Vulkan hardware and APIs. AMD's driver team was nowhere near the size or ability of Nvidia's.
So around 2012 AMD were the ones who pushed for a new low level API called Mantle, which pushed most of the optimization work onto the developers. In theory this freed up a lot of AMD's resources spent on driver optimization on a game by game basis, while removing quite a bit of API overhead, increasing efficiency and performance at the same time. Mantle ended up being axed because it was proprietary to AMD hardware, and the work was donated to Khronos (the group behind OpenGL) to create the Vulkan API.
In fairness, it's not like you got a bad deal with a 290(X) in 2013. It was cheaper than a 780Ti, had more memory on a ludicrous 512-bit bus, and performance was already reasonably close. It would've been nice to have all that performance to start with, obviously, but it really did end up feeling more like a nice bonus than anything negative.
Part of it was also the underlying architecture. AMD's performance on low level APIs wasn't just a matter of optimization; GCN was simply better suited to the task than Kepler, with key advantages such as async compute support.
Not that I'm trying to say AMD didn't have plenty of driver issues. Just adding a bit more nuance.
Big fan of your videos!! Good job.
I would enjoy seeing videos (not too far from an LGR) where we find modern uses for older GPUs. For example, the retro community got so robust that you can take an R9 290X (technically you could go as far back as the HD 4000 series), flash the BIOS and put it in a machine utilizing CRT Emudriver for CRT displays. I don't know if this list would go past 5 or 10 ideas for breathing new life into these older GPUs, but I'd love to see what ideas are out there. Update: Another idea would be using pre-RTX-2080 GPUs in a media server for encoding 4K videos.
It's funny to see how the fury x needs a gigantic water cooler to even cool down something that's slower than, and costs as much as, the 980 ti.
That water cooling was there to increase GCN efficiency. Some reviewer did more digging on this matter before; I think it was Tech Report or AnandTech. About the cost: it was supposed to be sold at a much higher price than the 980Ti, but AMD had no choice but to price the card similarly, because that's the price Nvidia set for their 980Ti. Back then I heard some partners ended up very furious at AMD because they decided to sell the card at $650.
@@donaldnemesis393 It definitely doesn't need it. The R9 Nano is the same thing and is clocked only 50MHz lower with a lower power limit. If you tweak the Nano... it's a Fury X. It doesn't run that hot; if they'd sold any with decent air coolers they'd be fine.
AMD did compete again with Nvidia with the Radeon 6900xt and 6950xt, the 6950xt outperforming the 3090 in pretty much all games.
Love this series my man. Next week is going to be a bloodbath though. 😢
very good video and game selection. nice work man. i purchased a 980ti for my CRT monitors this year. very happy with it. its an ASUS reference model 980ti. very overpriced in my country by the way (South America); i paid 230usd. Happy nonetheless. thumbs up
I just went from my old 2080 ti to a Ryzen 4070 and I must say it was an insane leap in performance 👍
The year that got me into PC gaming!! I'll never forget my Sapphire Fury Nitro.
loved my 980ti, these cards bloody slapped for the time and still do tbh; even on modern titles, turned down to medium, it's perfectly playable
Just as a point of reference, I have an EVGA exclusive 120mm water cooled GTX 980-Ti in the same rig as my RTX 3080 FE.
I have it as a dedicated CRT-only gaming GPU. Not only is it exquisite for classic titles, and those times you want to see those nice, fat scan lines on, say, the original Deus Ex.
But it is genuinely fkg amazing for many modern titles, such as twitch-shooters like the modern Doom, where actual zero-lag input feels like gaming from the future.
Not only do I have a high quality CRT monitor, but I also have a 27 inch Trinitron behemoth. And using tricks like Super Resolution looks utterly amazing. Particularly at night. With the lights off...
I've considered getting a Titan X - the last nVidia card ever with native analog out via the DVI-I output. I could even transfer the EVGA liquid cooler to the Titan X, as a stealth sleeper GPU.
But I don't think that logically there is any benefit? Practically, I would only be doubling the VRAM. But I can't imagine any benefit at 240p or 480i output. Higher resolution textures would be useless.
Sorry, I'm still arguing with myself. Anyway, for pure analog gaming with instantaneous response (Quake, Doom, Wolfenstein 3D, etc., plus all of the classics), the 980-Ti is still an incredible card.
Anyone who truly appreciates the phosphor glow of an electron gun firing directly at your head, the unparalleled atmosphere of Silent Hill on a CRT monitor: grab a 980-Ti while you still can.
I love my 3080 FE. I love that the VRAM doesn't go one degree above 80c, and because my eyesight sucks, 1440p is all I'll ever need. Maybe I'll try an OLED, but lower res and higher frames.
But for genuine atmosphere, and for instant twitch reflexes, for me the 980-Ti is the GOAT. If the 1080-Ti had analog out, it would clearly be the winner. But digital only? No. My 980-Ti has pride of place.
I had a pair of EVGA SC 980 TIs. I ended up splitting the two between a couple of computers, ultimately upgrading both. I traded both in late 2023 for a GeForce 3060 Ti, which I sold for $300. Those were very good cards.
You're flat out wrong saying AMD never again competed at the top end. The 6900xt matched the 3090, and the 6950XT the 3090Ti... Also the 7900xtx beats the 4080 Super. I don't count the 4090, and classify that as a Titan, but you don't have to do that if you don't agree.
With the 30 series, Nvidia was using inferior Samsung nodes. And funny thing that you did not consider the 3090/3090Ti as Titans like the 4090.
3080 and 6800 XT are fairly evenly matched, trading blows but with the 6800 XT slightly slower on average across a testing suite of a lot of games skewed a little by inherent ray tracing in some titles; RT performance is A LOT slower in 6000 series. Those results are reflected by 3090 vs 6900 XT and the 3090 Ti was straight up faster than the 6950 XT in most games even in raster. By no means was it a significant margin but it was a consistent one. 7900 XTX beats the 4080 Super, yes, but AMD aren't competing against the 4090.
4090 I consider a "Titan" card too; that line of GPUs is essentially just renamed as XX90 in general, including the 3090 and 3090 Ti which were 1100 - 1500 dollar cards.
@@64-96 What's the difference between a Titan and the regular cards?
@@zareefkabir705 They used to have extra compute performance for 'professionals', but they nerfed that for ones such as the Titan X, which were no different from a gaming card with a higher price tag.
True, RDNA 2 competed, but AMD released the 6950xt to beat the 3090 because the 6900xt was behind it... then Nvidia released the 3090 ti and won the crown again. But you are right, they competed; since then it hasn't happened, because the 4090, classify it the way you want, is still an RTX 40 series card meant for the mainstream market despite its price.
Ah, I miss healthy competition in the GPU market. By Pascal it had already stopped, and in 2020, when the RTX 3000 and AMD 6000 cards came out, competition seemed to be back on track; but if rumors are true, AMD isn't going to be competing on the high end anymore.
The cool thing with the 980ti is that you can toy around with the BIOS freely (except that voltage adjustment needs some hex editing to be unlocked).
The R9 Fury... a meme of a GPU.
From memory, manufacturing was one of the limiting factors for AMD at this time; the second was team size. GCN 1.2 was a minor upgrade over GCN 1.1 from the last gen, bringing very little in the way of improvements, and by this point the architecture and chip were getting very large, resulting in poor yields. I remember reading somewhere that this was the largest chip TSMC could produce for AMD.
The HBM was where a lot of the effort went on this card, and it's actually what allowed for the smaller size, as it saved a lot of space compared to GDDR5. It was rumoured that the Fury X would be followed up with another release once yields improved and they worked out how to produce larger chips, as the 4GB of memory was very much a technical limitation of the HBM implementation, meaning you couldn't simply slap more per stack to increase capacity, not that cost or yields would have allowed it anyway. The 4GB was called out as a limit that, while fine for current games at the time, was expected to be problematic in the next 2+ years.
Originally AMD was looking to compete with the 980, but the 980ti was a preemptive launch to steal the headlines.
Like you said, AMD's Fury X really competed in the mini PC space, where you had to move down Nvidia's product line to get something competitive.
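Side note on that 4GB ceiling, since it comes up all over this thread: it falls straight out of HBM1's layout. A minimal sketch of the arithmetic, assuming the commonly cited figures (4 stacks on Fiji's interposer, each stack 4 dies of 2 Gbit), so treat the constants as assumptions rather than gospel:

```python
# HBM1 capacity ceiling on Fiji (assumed figures: 4 stacks, 4 dies/stack, 2 Gbit/die)
stacks = 4
dies_per_stack = 4
gbit_per_die = 2
total_gb = stacks * dies_per_stack * gbit_per_die / 8  # gigabits -> gigabytes
print(f"Max VRAM: {total_gb:.0f} GB")  # -> 4 GB, regardless of pricing or yields
```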
Would be interesting to see the Fury X vs the 8GB 390X. Maybe the lower end card is now better thanks to more VRAM.
Great video as always! Are you planning to film a new Q&A video? If yes, where can I send my question for you?)))
Recently got myself an MSI Lightning GTX 980 Ti. Nice looking beast of a GPU.
16:30 your spelling makes my British heart flutter - thank you
re the GTA5 engine limit ... easiest way to "bump up the settings" is to use frame scaling in the advanced graphics settings, while leaving everything else as-is. This setting can even force current GPUs to their knees, especially if everything else is already maxed out
I put everything to the max and it pretty much put my 3060 on life support. Poor thing couldn’t handle everything at the maximum settings at 1440p. It probably doesn’t help that I have a weak cpu though.
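For intuition on why that slider is so brutal: frame scaling multiplies the internal render resolution in both dimensions, so the pixel load grows with the square of the multiplier. A rough sketch, using pixel count as a crude stand-in for GPU load (the multiplier values are illustrative):

```python
# Relative pixel load for GTA V-style frame scaling at a 1080p output
base_w, base_h = 1920, 1080
for scale in (1.0, 1.5, 2.0, 2.5):
    w, h = int(base_w * scale), int(base_h * scale)
    load = (w * h) / (base_w * base_h)
    print(f"x{scale}: renders {w}x{h} -> {load:.2f}x the pixels")
```

At x2.5 you're pushing over six times the pixels of native 1080p, which is why even current GPUs buckle.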
You're killing it with this series, dude!
Cheers Iceberg🤘🥶
I saw a 980ti on fb marketplace for $50 not long ago. Still have my old one so I didn't jump on it but it was very tempting.
Seems like it's still solid for me; I play mostly old games anyway. If it's under 70 I will buy one.
Yeah, that Fury X... I mean, I love the use of HBM, trying new stuff like that, but I was using mine up until about a year ago or so and it was kinda driving me crazy. The GCN strategy didn't work for this long; it showed kinda early that it had a bit of a scaling issue.
Great work
The R9 Nano is capable of running Fury X speeds if you remove the power limits. Yes, this does take away literally every efficiency advantage it would have; however, the little air cooler can handle it.
Good one thanks. Build your own
I recall the Fury non-X being a decent value alternative to the GTX 980, in particular when it went on sale at the end of the generation. The Fury X, on the other hand, feels like it was quickly forgotten: a card that underperformed at a much too high price. I suppose that Vega 64 was AMD admitting that pushing their chip to the limit and slapping an expensive cooler on it isn't going to save it.
Last year for Xmas I built my 9 year old a gaming PC with a 980ti and an 8700k that I delidded and overclocked to 5GHz at 1.3v, paired with 2x16GB 4400 DDR4, and for the games he plays the 980ti still performs. And I only paid £50 for a Gigabyte Windforce triple fan.
The 980ti is the real GOAT for me; after 9 years it can still deliver good fps at 1080p with full driver support.
Price/performance in the used market, it's even better than the 1080 ti.
the 980ti holds a sweet spot in my heart, my first built PC, and I got a 4K monitor for it, but in later years I decided 1440p high refresh rate was the spot. it was such a great GPU, the last of its kind at $600 for a highest-end card, with the 1080ti being a great buy after that. then Nvidia fell off honestly, becoming greedy, with little improvement, focusing on ray tracing and DLSS to make up for it, with a huge price increase
HBM is superior, but its Achilles' heel is its ~800MHz speed vs 3+GHz on Nvidia. AMD always goes for wider buses and more bandwidth, which holds up at higher resolutions, but if the memory speed isn't up to par then performance won't be as good as advertised. AMD could beat them, but it was very costly back then. Today, since some processes have come down significantly in cost, HBM3/4 with 16GB of VRAM and ~3GHz memory speeds would decimate. AMD just doesn't have the engineering stones to do it. If they did, they could take over the entire market.
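To put rough numbers on the wide-vs-fast trade-off: bandwidth is just bus width times per-pin data rate. A back-of-the-envelope sketch using the commonly quoted figures for these two cards; treat the per-pin rates as approximations:

```python
# GB/s = bus width (bits) * per-pin rate (Gbit/s) / 8
def bandwidth_gb_s(bus_bits: int, rate_gbps: float) -> float:
    return bus_bits * rate_gbps / 8

print(f"Fury X (4096-bit HBM1, ~1 Gbps/pin): {bandwidth_gb_s(4096, 1.0):.0f} GB/s")  # ~512
print(f"980 Ti (384-bit GDDR5, 7 Gbps/pin):  {bandwidth_gb_s(384, 7.0):.0f} GB/s")   # ~336
```

So slow-but-wide HBM still came out well ahead on total bandwidth; the Fury X's problems were elsewhere.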
I'm confused. I remember the 980ti and Fury x trading blows. I went with the 980ti but was torn at the time.
That was in the past. But the Fury X architecture itself was also a bit problematic, especially at low resolutions.
The Fury, then Vega and the VII hurt AMD badly. It gave them a nasty black eye. I remember switching to Nvidia because the Fury X was bad.
I may buy what gets called the GOAT
Now we only need videos comparing these results with the most current drivers, to see which team does better driver optimization. We already know AMD will win, but by how much?
19:10 OMG Slaughter inbound...
Using DXVK on DX11 titles makes AMD cards perform better.
Running it under Linux + Proton gives old AMD hardware a new life.
The Fury X was so fun to overclock; I had one and was happy with it. Vega 64 wasn't supposed to go against the 1080 Ti, it was against the 1080.
The 980ti was a beast; I still have one lying around somewhere.
Wake up babe new Iceberg video dropped!
This man called it "que da cores"
The Fury line could've been a decent card if it had more than 4GB of VRAM. Of course, AMD's 600mm2 GCN chip was very inefficient, and it couldn't have been as fast if its memory system wasn't HBM, which allowed more power to be directed towards the core without sacrificing memory buffer size or speed.
But to be fair, the 980ti was probably the best thing that ever happened to Nvidia. It was a good card from the get-go, and it had so much overclocking potential that it made the next gen GTX 1080 not even an upgrade.
Though most of AMD's struggles weren't necessarily from a lack of effort; it used a bad node from a manufacturer that could never match TSMC, which Nvidia used for the 900 series.
AMD had the raw compute power of GCN, but the times were such that it wasn't needed and GPUs were mainly for gaming purposes.
But it is what it is; the end result is all that matters.
I think it would be beneficial to add the average power consumption too. These are the flagships of the time, I’d assume they consume a ton of power.
To look at real power consumption you need specific equipment for that, which is very expensive. Sometimes not even big tech channels have it.
Easy win for team green on this episode and the next I would think.
So weird, just watched a 2 yr old video of you mentioning this card (8 vs 16), looked on eBay (lowest $75) for it, then bam.. this!
If ya gonna compare the 1080ti and the Vega 64, it's quite important to use the liquid cooled Sapphire Vega. I have two of them and they clock higher than any other Vegas I've had.
Still rocking a GTX 980 ti from evga in my main system
Still rocking a 1080 here :D
not gonna lie, i just think that going from 4096 cores on the fury x and nano to 6144 on the 7900xtx is amazing when it's over 400% the performance; it just goes to show what higher clock speed, IPC and faster VRAM can do. also, would you prefer 4KB of cores or 6KB, or do you want 16KB with the 4090?
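Spelling out the arithmetic in that comparison: if overall performance is ~4x while the shader count only grew 1.5x, clocks, IPC and memory have to account for the rest. A quick sketch, taking the ~400% figure above at face value rather than as a measured number:

```python
cores_fury_x, cores_7900xtx = 4096, 6144
overall_gain = 4.0                          # the ~400% claim above, an assumption
core_gain = cores_7900xtx / cores_fury_x    # 1.5x from extra shaders
per_core_gain = overall_gain / core_gain    # ~2.67x left for clocks + IPC + memory
print(f"cores: {core_gain:.2f}x, everything else: {per_core_gain:.2f}x")
```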
What would have happened if AMD hadn't decided to go with HBM for the R9 Fury X? If it had come with 8GB of GDDR5 and a 384-bit bus... maybe AMD would not have stopped competing for the top spot in consumer GPUs. HBM was an odd choice: HBM1 could only have a max of 4GB, and it was expensive. AMD was losing market share rapidly after being ahead in discrete GPU sales in 2013.
AMD bet HBM would go mainstream and get cheap as time went by. Nvidia, on the other hand, ended up investing more in GDDR, leading to GDDR5X and eventually GDDR6. Funny thing is, GDDR5 was originally co-developed by AMD with a memory maker. Also, the only time AMD was really ahead in terms of sales against Nvidia was during the Fermi generation, when Nvidia had issues with TSMC 40nm because of their bigger die size. Worse, when Rory Read ended up being CEO, AMD rejected many OEM offers. This is what made Nvidia end up dominating the laptop GPU market.
You should add overclocking results as well.
Beautiful cards. I have my 980M 8GB in my Dell Precision and I made it run Forza Horizon 5 at 4K, and Forbidden West .... it ran ... sort of ;)
Clearly the greatest gaming GPU of all time was the Voodoo 2, you didn't even need a pass through cable with that bad boy for 2D acceleration :)
The 980ti was supposed to be my 4k card when i built a new pc for the first time in over ten years. Then, final fantasy 15 came along and said no.
I then bought the 1080ti to be that. Then, Yakuza Kiwami 2 told it to eat kerb.
Given nvidia's pricing for rtx, i finally rescinded my ban on Radeon, with an rx 6800 since that was all i could get. And now Robocop Rogue City is telling me no.
Robocop Rogue City runs like complete shit on everything tbh. Not seen that talked about much; the performance is awful even on benchmark videos, streams and gameplay, regardless of setup or GPU vendor choice. Stutter galore and low internal resolutions to get playable framerates. Typical UE5 shenanigans. Yakuza Kiwami 2 has stuttering issues in cutscenes that happen in the exact same spots and scenes, like a video with a stutter in the encoding, except it's real-time cutscenes. I remember the beginning disco scene giving my RX 5700 XT an aneurysm too. It runs flawlessly at max 4K on my 6800 XT now though, apart from the minute stutters that occur in cutscenes, which seem like an engine issue.
I also had a 980 Ti before my 5700 XT and yeah FF 15 shat all over it.
When you have a result that averages around 30 FPS, just as a bit of shorthand, we call that "Starfield Quality", so we all can understand each other...
Playing TW warhammer 3 on haswell-e + 980ti right now and it works fine. No stuttering. Could look better tho. Somehow even on ultra preset it doesn't look great. I wonder if that is driver related.
I'm not going to lie, the newer years like 2017, 2019 etc. are not worth making, since most people already know the answer between the RX XXXX series vs the RTX series. Maybe you should go backwards; it would be cool to see the 2011 best cards or the 2009 best cards.
I miss the old days of voltage modding these cards with a bios hack.
When the next set of consoles come out is it going to be called Super Next Generation or something?
Do a video of you testing the Ryzen 4090 vs the Radeon RTX 1080 Super.
Again, it's nice to see how well the cards did back then, but come on, we're interested in seeing how well these GPUs hold up in modern titles.
Are you planning on buying a 980Ti for your CyberPunk playthrough?
@@zCaptainz
Wanna try that again?
th-cam.com/video/5PUFTG8_dP4/w-d-xo.html
Typically, I'm talking about modern games that don't demand much, but since you wanted to go for the far end, it's good to see that the 980Ti is still a viable option.
They’ll probably perform alright at 1080p in modern games nowadays, but I doubt they can do anything above that.
You should avoid them. Drivers are no longer supported. Use an Xbox in dev mode for emulating Xbox 360 and older consoles (not PS3). Or build a cheaper PC using RX 5000 or GTX 1000 series cards (though even those are showing their age).
@@thegamerkhan Avoid them when possible, sure, but some people just come across these cards for free and would like to see how well they'd work
the r9 is one of the best looking FE cards of all time
MBA*
@@MisterKrakens mba?
@@mitsuhh Amd "Founder's" cards aren't called FE but "MBA" (Made by AMD)
The Radeon VII should be up against the 1080ti. Also, the resale value of the Radeon VII is insane; I bought mine for 400 euro and sold it for 1600 euro.
Yes, old hardware review.
stupid question - is it possible to buy that R9 Fury X off you?
I fail to understand the reasoning behind using 'era appropriate' drivers.
After all, no one's going to install 7 year old drivers, especially when newer ones are not only stable but also offer noticeable performance improvements, especially in the case of AMD.
Oh my, how the mighty have fallen. The previous generation, AMD had the edge in some titles with the R9 290X due to the extra gig of VRAM; now, in the 2015 era, the 980 Ti spanked the ever-loving bejesus out of the R9 Fury X by (1) having 2GB more VRAM, (2) having the Maxwell magic behind it, and (3) being an overclocking beast. Also, it took AMD until HBM2 and the Vega 56/64 era to reach 8GB of HBM memory. There are workstation cards out there that use HBM chips.
Should have included 2013 flagship scores
Kinda feels like the fury x was a r9 290x with a water cooler
I had a Fury Nano back then, as I got a good deal on a used one.
What do you think of the I5-10400F?
Eh, PCIE 3.0. Potentially an issue if you want to pair it with an x4 or x8 GPU, but probably fine otherwise.
It's a shame the 4GB VRAM really hurt the Fury's longevity
"An overclockers DREAM" 😂