Almost bought one of these back in the day, but got a Strix 970 instead. Both were great cards at the time, considering the 390X and GTX 980 were way more money for hardly any more performance. Back when you could get a 1116MHz card and overclock it to 1520MHz, lol. Good times.
@@twiistedpanda4781 In 2014 at release, almost nothing made use of more than 3GB, and even modded Skyrim used ~3.5GB, so the issue wasn't noticed until about half a year later. Then the updated Nvidia drivers tried very hard not to utilize more than 3.45GiB, so in realistic use cases the 970 performed mostly normally, ~5% faster than a stock 290, in games up to ~2017, and at 1080p still up to ~2019. Most users upgraded to a 1080 (Ti), a 2080, or most often a 3070 before the GPU shortage, often without ever noticing the 0.5GB issue. In modern games the stock 290 is now 10-15% faster, mostly because of its full 4GiB of VRAM.
@@twiistedpanda4781 Eh, not really; if you benchmarked both cards across titles they usually ended up about the same. You might be able to turn textures a little higher on the 390. I'm sure you can find a modern-day benchmark suite with both cards in recent titles. Maybe in some new games that use tons of VRAM it'd tank the 970, but the 390 most likely doesn't perform that well in those titles anyway because of other architecture/driver bottlenecks.
I've been messing with the GTX 980 Ti. Surprisingly still potent for a dinosaur. You seem the type to have already done a video on it in 2024, so I'll look out. If you haven't: it's impressive, comparable to the RTX 3050 (not just the 6GB version), but it GUZZLES power in comparison. A 6500 XT would be the closest performance comparison to the R9 390, is my guess. Great video.
Thank you for using the R9 390 the way I’ve always used it. Mixed low/medium settings and textures cranked. Between the massive bus width and 8 gigs of vram it can handle a ton of textures even today.
In Europe the 980's price dropped into reasonable territory after the 980 Ti release. Nvidia has overpriced every card at launch from the GTX 700 series until the 4070 Super.
I loved my r9 390 so much, but it cooked itself to death right in the middle of the gpu shortage. That card always ran incredibly hot, and I had to repaste it more than once
In 2016, I was finally doing a new build for the first time in over a decade. When it came to the choice of graphics, it was a choice of _two_ R9 390 cards for Crossfire action, or a single 980 Ti. That there are still _new_ Nvidia drivers supporting the old Maxwell behemoth does suggest I chose correctly, after all.
A friend of mine used his i5-6600K and R9 390X until last week. He bought it back in the day and used this computer for gaming a lot; for example, he has over 900 hours in Total War: WARHAMMER III. Now he can enjoy his new PC: an i7-14700KF with an RTX 4070 Ti Super.
I remember 2015; nobody was impressed by the 390. Everyone thought the 8GB of VRAM was cool but pretty pointless, and the 980 Ti was stealing all the thunder.
As someone with a 4GB GPU, a GTX 860M from 2014, I find the lack of more VRAM the only reason I can't play many modern games... I kid, I kid. The fact that there are games which (after updates) block you from running them on the same GPU they ran on before is one more reason to stop buying games from any store that isn't GOG or GOG-like. DRM-free FTW. Good video, and IMO the GPU is still holding up well for its age if you have one and can't afford to change it. Well done, AMD.
9:45 XeSS won't run properly on AMD cards before RDNA2 (and nvidia cards before Pascal) due to a lack of DP4a instructions, so you end up using a fallback for a fallback...
You will be amazed how well this card works under Linux with Proton. In most cases you get better performance than on Windows, because the Linux drivers are still in active development.
I'll never understand why Alan Wake remains a benchmark for old GPUs. It's not even that great a game. Just play stuff more relevant to the old hardware you have; it's more than likely a better-made game.
Love the R9 390!! It was singlehandedly the card that ended up getting me into AMD graphics when building my first computer. It was unavailable in store, so I ended up waiting and using integrated graphics. Getting a used air-cooled Sapphire Nitro Fury changed my gaming experience dramatically. I ended up switching to a Gigabyte GTX 1080, then back to the Fury when I was offered more than I paid for the 1080 during the GPU shortage. The Fury was such a beast in emulation: 512GB/s of memory bandwidth meant that even up to 8x MSAA was practically free at 1080p. Emulation performance was actually fairly comparable between the two, even up to 1440p.
The earlier R9 290 was a card that did me well for years and was the first card I had in my first gaming PC, but I never came across the R9 390. I did buy an R9 280X in the pandemic for a cheapo build; the comparative performance dip was much bigger than I expected. The custom drivers were a hassle (different from the ones used in the video here), as each release needed a whole bunch of fiddling and a different install process every time, so I stopped using them once official support was pulled. Interesting vid.
For the games that do run on it, I'm pleasantly surprised by the benchmarks. The folks still holding on to these cards still have a somewhat okay 1080p medium-settings card. I wonder whether the high-end cards of today will age this well 10 years from now, or whether PC hardware requirements are going to go up starkly by the next console generation.
10 years from now every AAA game will likely have Cyberpunk levels of ray tracing, so we'll probably end up with our current cards technically having enough raster, but anything short of like the 4090 not being able to keep up. The VRAM situation all over again
@@roguetwice469 The 4090 is just the Titan of the DX12 Ultimate (half-)decade; the Titan was the five-year reference point of the DX11 decade. I bought my 290 as a similar-raw-performance card for less money two years later, in 2014. The DX12 support was lucky and the real point of extra value, rather than 8 vs 4GiB of VRAM.
@@PhilippJanusch I'm having trouble imagining a cheap 4090 alternative with DX13 support in two years, but that would be incredible. It'll probably be $600-700, because that's "cheap" for a GPU nowadays.
You've completely missed the real reason the R9 390 aged so well: its _massive_ 512-bit memory bus, and the very high VRAM bandwidth that comes with it. The 390 has the GPU die surrounded by a super-crowded 16 (!!) memory chips. It was the last non-HBM card with such a wide memory bus _ever_. Even the RTX 3090/4090 only have 384-bit. And now both Nvidia and AMD are trying to sell people 128-bit e-waste that will age like milk.
That would make the GPU a lot worse though, since upgradeable RAM has to have extra connections, and thus more resistance and slower VRAM. Who knows though, maybe that new CAMM2 standard for RAM can improve speeds and make upgradeable VRAM possible?
9:42 XeSS generally decreases performance on cards without dp4a support, which on AMD is only supported on the Radeon VII, some RDNA 1 cards and all RDNA 2+ cards. Even the 5700XT doesn't work well with XeSS because of this, so I'd say you're probably stuck with either native res, FSR or TAAU on older AMD cards.
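To illustrate why that fallback is so slow: DP4a is a single hardware instruction that does a four-lane 8-bit dot product with a 32-bit accumulate. A GPU without it has to do the equivalent of this toy emulation for every such operation (illustrative Python, not Intel's actual kernel code):

```python
def dp4a(a: int, b: int, c: int) -> int:
    """Software emulation of the DP4a operation: dot product of four
    signed 8-bit lanes packed into the 32-bit words a and b, accumulated
    into c. Hardware does this in one instruction; GPUs without it
    (pre-RDNA2 on AMD, pre-Pascal on Nvidia) need many scalar ops."""
    acc = c
    for lane in range(4):
        # pull out each 8-bit lane and sign-extend it
        ai = (a >> (8 * lane)) & 0xFF
        bi = (b >> (8 * lane)) & 0xFF
        ai -= 256 if ai >= 128 else 0
        bi -= 256 if bi >= 128 else 0
        acc += ai * bi
    return acc

# pack the lanes [1, 2, 3, 4] and [5, 6, 7, 8] into 32-bit words
a = 1 | (2 << 8) | (3 << 16) | (4 << 24)
b = 5 | (6 << 8) | (7 << 16) | (8 << 24)
print(dp4a(a, b, 0))  # 1*5 + 2*6 + 3*7 + 4*8 = 70
```

Multiply that per-element cost across an entire upscaling network per frame and it's clear why the emulated path often erases the performance gain.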
Surprised they even bothered to make a fork of XeSS that tries to work on older GPUs, as it usually doesn't give a performance boost and prob looks worse (the XMX version on Intel GPUs looks better than the DP4a version on non-Intel).
@@219SilverChoc True. I guess Intel may have wanted something that worked on all their somewhat recent iGPUs, though I'm not sure when they got DP4a support. From what I've seen the emulated DP4a version does at least look identical to the native DP4a version, but there's just not much point in using it.
Man, I love your channel. Even though I game on laptops, it's very refreshing seeing a gaming-tech enthusiast channel that focuses on old hardware. I love the smell of that nostalgic tech era, and not just in the form of the old retro 90s CRT-ish kind of era.
Sad, isn't it, just sad! I've got an RTX 3060 12GB right now, hoping to ride it out on 12GB for a few years... I had a GTX 1060 with 6GB before, and for the games around that time it was usually more than fine. Yet... the RX 480 with 8GB kinda made sense.
0:41 I can tell you the feeling. Everyone thought it was power hungry with bad drivers, and were recommending the GTX 970 instead, in spite of its botched 3.5GB of RAM. I always knew the R9 390 was superior because of its RAM and feature set, but then you immediately got called an AMD fanboy.
Man AMD GPUs were always good at high VRAM amounts. I used my RX 580 quite far into 2022 which is newer than this but also had a shockingly high 8gb for its age and price point. This and the 390 can be found quite cheap on ebay. I have to agree with the older ryzens being usable still, my 3800X is still running strong.
loved doing builds with the R9 cards (280, 380, 290, 390, etc) but honestly have noticed their age is definitely getting to them like you said. Still good cards for esports games but the power consumption is a big turn off when you can just get one of the RX 400 or 500 series cards or their Nvidia equivalents for very similar prices
When you remember that GCN launched against the Fermi GTX 580 at only 800MHz (and would do a 25% OC easily). AMD added FP16 support and delta color compression, but not much really changed in the shaders because AMD was too broke after buying ATI in the first place. AMD was also still supporting the very different TeraScale HD 6000 series, which used a VLIW architecture versus the RISC MIMD that GCN used, so the driver team was split for years. That changed in 2016 when they finally started to focus just on GCN, hence "fine wine" became a thing. The last GCN cores are in Cezanne, like the 5600G and the refreshed 5600GT from January 2024, just making it past GCN's 12th birthday. To my knowledge only Itanium is longer-lived.
The only way it works is LINUX, because the drivers are open source and still work. Considering Windows 11 Recall takes screenshots of your desktop every 3 seconds, is unencrypted, and has already been hacked, you REALLY SHOULDN'T BE USING WINDOWS 11 ANYWAY, and Linux will solve all your problems using this card.
I don't get your crash with Starfield. My R9 390 with the RID drivers runs it just fine at 1080p low at about 36fps. My CPU/mobo is a Machinist X79 Xeon combo, if that makes a difference. Also I'm using a 1TB Samsung SATA SSD. It does, however, draw a crap ton of power. LOL.
Would have been nice to see a 4GB (R9 290) variant included. Over the years some outlets have done comparisons between the R9 290 4/8GB (which is basically the same as the 390) and even the GTX 970. I think the last comprehensive test I saw is now nearly 3 years old, and there the R9 290X 8GB and GTX 970 still had roughly the same relative performance they had at launch. If the extra memory only pays off after 5+ years, then you really didn't benefit much from it. Still good cards; it took AMD some time after the 200 series till they got good high-end GPUs again, but Vega 56 also had great value (if you could get one of those rare paper-launch models).
NVIDIA: still supports the GTX 900 series (released 2014) with drivers. AMD: no more drivers for the R9 300 series (released 2015) after June 2022. I will never buy an AMD GPU again! Only Intel or Nvidia.
Was thinking this would perform better. Compared with the 970 this is quite a bit slower, and prices should be about the same for the 4GB cards (I bought one for a friend for $25). I still have an R9 290 and performance is still OK for casual gaming. They can also be undervolted a lot, using around 100W instead of 250. Instead of upgrading, I got a FreeSync Premium screen and have been very happy with the result. It cost me much less than a new GPU at $150 and keeps everything nice when it drops below 60fps.
I bought it to play Doom 2016 with Vulkan. It ran great at the time. It really sucked at older games; AMD drivers had terrible support. It cost me almost nothing after I sold my GTX 780, and I had it for about a year. Then I bought a GTX 1070 and never looked back. Way better.
I bought an R9 390 when it was new. My take on it was that it basically traded blows with the GTX 970. The GTX 970 had a lot better efficiency and, when overclocked, would pull ahead in performance. The 390 had a lot more VRAM: 8GB vs basically 3.5 (who remembers that controversy?). I also think it had better performance in modern APIs like Vulkan and DX12.
Honestly, even though I have a 3070 with only 8GB of VRAM, it still runs games like Cyberpunk at 1440p high with some ray tracing at well over 70-80 fps. Although I would never buy a brand-new 8GB card, I think the VRAM panic is largely out of proportion, since 98% of games still work well with it.
I had the 290x and then later the 390 as a temporary card after my 290x lit itself on fire and died. It was worth it to me since the 290x was when I started playing games at 4k.
Back when AMD GPUs were competent. I was about to get a 390, but the 980 Ti dropped, which had insane price-to-performance. But yeah, had Nvidia not released the 980 Ti, I'd have gone for the 390 because I didn't like the 970's lack of VRAM.
Those were great cards, but I wouldn't recommend anyone run them as daily cards, even if only casually gaming. Simply put, power usage will eat up the card's value in a very short time. Until recently I had the previous model, the so-called 7970 GHz Edition; I replaced it with the cheapest 6600 and my power bill dropped $10-15 per month, so the card will pay for itself in electricity savings in under a year. Not to mention the 6600 is silent, while the old card was very, very loud and hot. The whole PC got 10°C cooler despite many fans and a generous case.
It's called FARTnite, first of all!!! I have been wondering about these old cards, whether you can take cards with lower RAM and solder on bigger chips to add a few GB. That, combined with modern FSR implementations, could give you a good extra bump in performance, I think!
Again, failing to understand new technologies in game engines and how they work is the source of a lot of the misinformation on the "tech channels". Seeing how a 4090 can use up ~17GB of VRAM in Diablo IV at 1080p tells you everything you need to know, but people will insist that 8GB is not enough because they look at VRAM usage and draw their conclusion solely from that. To all of those people, I present to you: caching! Using excess VRAM to store data closer to the GPU has been a thing for the past 3 or so years. You can A/B test this using the same brand of GPU with different VRAM sizes: the GPU with more VRAM will use more VRAM for the same workload while producing almost the same performance, and the one with less VRAM will not stutter or exhibit any of the problems that come with a real lack of VRAM. Of course there are 3 or maybe 4 console ports that do exhibit stutters, but that is due to lazy porting.
Didn't realise how much VRAM new games could use until I upgraded my RX 470 to an RX 7800 XT. Already seeing games using around 15 gigs of VRAM at maxed-out settings. I bought the 7800 XT with the intention of the VRAM adding a couple more years to its lifespan over the Nvidia options. My 470 eventually caught up to the 1060 after a few years as well, so maybe my 7800 XT will be on par with the 4070 Ti Super a few years down the line.
I had the 390X Nitro. The plastic cooler was more fragile than I expected, and the quality was rather disappointing, even though the card performed well. The memory was locked too, unfortunately.
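The caching argument above can be sketched as a toy analogy (plain Python, nothing to do with any real driver): two caches of different sizes over the same access stream both work correctly; the bigger one simply holds more and misses less, which is what a bigger-VRAM card reporting higher "usage" often reflects.

```python
from collections import OrderedDict

def run_with_cache(capacity, accesses):
    """Simple LRU cache: returns (entries held at the end, miss count).
    A miss is the analogue of re-streaming an asset over the PCIe bus;
    a smaller cache still works, it just misses (re-fetches) more often."""
    cache = OrderedDict()
    misses = 0
    for asset in accesses:
        if asset in cache:
            cache.move_to_end(asset)       # hit: mark as recently used
        else:
            misses += 1                    # miss: "fetch" and insert
            cache[asset] = True
            if len(cache) > capacity:
                cache.popitem(last=False)  # evict least recently used
    return len(cache), misses

# same workload (12 assets accessed cyclically, 120 times), two "VRAM" sizes
accesses = [i % 12 for i in range(120)]
print(run_with_cache(16, accesses))  # big cache holds all 12 assets: (12, 12)
print(run_with_cache(8, accesses))   # small cache thrashes: (8, 120)
```

The cyclic access pattern is the worst case for LRU, which is why the small cache misses on every access here; real engines prefetch and prioritize, but the direction of the effect is the same.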
A shame to abandon support for GCN 2; it's still a pretty capable GPU with 8GB of memory. Custom drivers could help a bit. For Helldivers, I'm pretty sure the game would run much better with DX11; you can force it in the launch options.
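For anyone who hasn't done this before, forcing DX11 is just a Steam launch option. The exact flag below is an assumption for Helldivers 2 (double-check it against the game's own documentation for your version):

```shell
# Steam → right-click the game → Properties → General → Launch Options,
# then paste the flag so the game starts with its DX11 renderer instead of DX12:
--use-d3d11
```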
I was active in the PC enthusiast space back then, and believe it or not, at that time the extra 4GB of VRAM on essentially the same GPU die was considered a marketing gimmick, a bigger number to entice unsavvy consumers into falling for the bigger-number-is-better gag. My, how things have changed. Developers are using up VRAM nowadays the way a hillbilly uses the back of a pickup truck: shoveling in junk that has no business being there.
Alan Wake 2 requires mesh shaders, which aren't supported until RTX 2000/RX 6000. GPUs that don't support mesh shaders can still run the game, but a few older ones will just limp along.
"8GB", yeah, except you can't actually use it quickly enough in gaming. The difference between a modern 8GB buffer and a 10-year-old one is massive. The only way to actually use that 8GB would be in a professional graphics workload/render, nothing time-sensitive by any stretch.
The R9 390 was my very first GPU when I returned to PC building and gaming. Could not have made a better choice. The GTX 900 series' 4GB of VRAM was an absolute no-go for me.
I bought an R9 390 in 2017. Aside from killing a PSU or two (one of mine and one in a friend's PC after I gave it to him), it was a phenomenal card. It certainly kicked my 1060 3G's butt.
I was 25 when I built my first PC in 2015. Good old R9 390, that card served me well. Unfortunately I paired it with an FX 8370. I still have the check from that class action lawsuit!
Things have changed; I don't think you actually *need* new hardware like before to play new games. Conceptually, no game requires more than a 10-year-old PC, and none will in the future. But lackluster optimization and little care for users make for the current scenario, where requirements skyrocket with marginal improvements in graphics quality. We probably need tools to hack games internally to downgrade graphics (besides upscaling techniques, of course), but doing so will make them look worse than if these games were actually developed with older hardware as a target. Games should be made with scalability in mind, so any modern AAA title will still look great no matter the hardware.
Have a Sapphire Nitro R9 Fury I haven't even tried yet, wasn't sure what it was equivalent to. Picked it up years ago for a Retro Build, ( possibly a 2500K / Z77 ) only took it out of the box to have a look at it. ( I do like the looks of it )
Decade later, we're still getting 8 GB in "high end premium 1080p cards" for the same if not higher price 🤣 Really shows how manufacturers (especially the green one) are purposefully stagnating the progress...
I have this, still rocking. I actually play a lot of sim-racing games, Doom 2016, and GTA 4, and my son plays a lot of Minecraft and other indie games like Slime Rancher. So far so good; that card has been with me for 8 years, but I'm going to build a new PC and will have to leave the 390 on a shelf.
The 390 was definitely worth it for those who bought it new, but no one should be searching for it now. This GPU was, overall, just as much of an impractical purchase as the Radeon VII was. Both cards had far more VRAM than they could make use of during their useful lifespan, no matter how much some internet contrarian tries to convince you otherwise.
If people want an upgrade to this card, I'd say an RX 6800 makes a lot more sense right now than an RX 6600. The 6800 can be found for around $360 right now and, keeping in the spirit of the R9 390, has 16 gigs of VRAM.
This GPU deserved longer driver support. I am sure that some games would've run better on it. It was around RX 570 - RX 580 performance, in the games that were released before the end of driver support for this GPU.
I had the Strix R9 390 in my first ever gaming PC back in 2016. That GPU definitely could've lasted longer than I had it for, and I've missed it for a while
If you don't want to play modern titles and are happy with 1080p high settings, you'd be better off with an RX 580 or 590, or a 1660 Super; FSR 1.0 is better than no FSR support.
The most interesting part of looking back at these older cards, to me, is gauging how well modern cards will age. No doubt my 7800 XT will last until next generation. It honestly makes me feel like modern prices are worth it, at least for how long these cards should last. VRAM tends to be the pain point of older hardware every generation, and the 3060 had 12GB of it, meaning it should last plenty long.
I'd love to see its comparison with the RX 580 8GB. A friend of mine has one of them and still plays games on it. AFAIR, back then these were considered really similar in performance; I'd love to see how they aged compared to each other.
Beautiful GPU this was. I was genuinely torn about having to eventually part ways with it, but 1080ti was just that supermodel you couldn't say no to if she asked you to bed.
Had a Sapphire Nitro R9 390. Great GPU, but mine kicked the bucket about 3 years in, just bad luck. But man, 8GB was unheard of back then. Pretty sure the norm was 4GB, with the occasional 6GB at the time.
oh god young GN & LTT jumpscare
Hey I know you!
What a coincidence to find your comment, on the 3rd spot no less.
Lmao 😂
Oh god, a furry.
Oh god, no father figure detected
Youngster steve and baby linus does scare me a bit
Loved my R9 390. Was a champion until the day it died just recently.
RIP you legend.
Mine also blew up just before the semiconductor shortage during covid so it forced me to upgrade at the best possible time looking back on it!
Mine died early last year, finally upgraded my PC after that as I had been rocking my old rig for 7 years without an upgrade.
@@219SilverChoc My PSU went at the same time. I don't know which one blew up and took the other out, but both needed replacing. What did you upgrade to?
@@DannyRice01 I went from an i5 6600k with 8gb to a RX 6800, Ryzen 7 5700x with 32GB of RAM.
Didn't want to go underkill with RAM after having issues with it on my old rig. It's hard to judge with CPUs though, as games with CPU performance issues usually come down to underutilization or poor optimization rather than being genuinely CPU-demanding. Hopefully going overkill on VRAM helps me out again; DLSS is less needed at 1440p, but I'll prob have to rely on FSR/TSR in newer stuff, already do when trying to hit high framerates.
RIP :/
I did consider an R9 390 while building my newest workstation, though luckily I did find an RX5700 at a much lower price
Rx 5700 is a far superior GPU. Fortunate
Always risky buying a GPU that's more than say 6-7 years old, you made the right choice.
Ironically, if you somehow managed to pick up a GTX 1080 ti for really cheap, that card is still playing a majority of games at a decent level even today, but cards like that are an anomaly.
And many GPUs were abused for crypto mining and need a thorough cleaning and a BIOS replacement.
That's a crazy deal actually, you would have been ripped off with that 390
@@ziokalco I got it for 90€ lol, the seller was moving out. It had been repasted and the thermal pads were new. Temps are wonderful, and it overclocks pretty well in all honesty
I have it with my R5 7600X I got for 100€ and my B650 mobo that I got for 150.
That was my very first graphics card in my first gaming PC:
an R9 390X Nitro from Sapphire. This card cost me 300 bucks brand new and was considered high end.
Only the GTX 980/980 Ti and R9 Fury stood above it.
That card was literally a spaceship for my friends and me. I crammed it into my super cheap Sharkoon case from Amazon and paired
it with an AMD FX-6300, which I overclocked to 4.7GHz with a 240mm water cooler (I had to cut something out of the case to fit it in).
And I also OC'd the R9 390X. This PC was a room heater. :D
For me this is one of the best-looking designs ever made. The card looked so good.
Lol, my 290 (sometimes 290 Crossfire) is still in my main Sharkoon rig :D The crappy included fans aren't enough to bring fresh air to the cards ^^
This is why having more VRAM is so consumer-friendly. I have no doubt that all RDNA2/RDNA3 cards with 16GB+ will last for at least a decade. Yes, they won't be able to play at max settings or above 1080p, but the games will still look good enough. In contrast, most of the Nvidia cards competing with those RDNA2/3 cards will only last a few years more; hell, the 3070 and 3070 Ti are already obsolete in some games due to their VRAM capacity, when they are more than capable of running them at decent settings and fps otherwise.
"This is why having more VRAM is so consumer friendly."
And you can claim that without seeing how the 4GB variant is holding up?
@@ABaumstumpf I mean, we already see this in the current gen. Slower cards can beat more powerful cards when the former has more VRAM and they're in a VRAM-limited scenario. The best examples so far are the RX 7600/RX 7600 XT and the 4060 Ti/4060 Ti 16GB. The extra VRAM does help a lot in some scenarios. Some games, even at 1080p, do require more than 8GB of VRAM, and at higher resolutions it's even more necessary since the assets are bigger. Obviously this varies from game to game, but 8GB of VRAM is barely the minimum required for most new titles at 1080p. Once developers stop supporting the old console generation, it's probable that 1080p PC gaming will require at least 10GB of VRAM to play at high/medium settings.
@@LeGoooze "we already see this on current gen. Slower cards can beat the more powerful cards when the former has more VRAM and they are in a VRAM limited scenario."
Yeah, we have seen a couple of games that had bugged memory management or used the highest LOD for all objects no matter the distance.
And for most games we see that - oh right - the R9 290 8GB still has roughly the same relative performance as the GTX 970.
Once studios stop supporting the older generations, what we will see is (yet again, same as with the last few generations) a large DROP in performance and quality, with early games requiring ungodly amounts of storage, storage bandwidth, CPU power, memory, etc.
Not because the games would actually require that, but just because the early titles push the limit and ignore any semblance of efficiency.
And with cards like the 3060 8GB we also see nicely that the biggest difference compared to the 12GB variant is simply memory bandwidth (the 8GB variant was still a bad deal).
@@ABaumstumpf Doubt it; most people are on old hardware (both PC and consoles), and games as-is are already unsustainable and expensive. Now that UE5 is finally maturing, things will not change much for some years.
Witcher 3 next-gen at 4K medium without reflections on my overclocked 290 (a 4GB 390) still hits an average of 38 fps with FSR Quality (at 4K, and at 1440p too).
As long as you don't fast travel into a city, which results in about 2 minutes of VRAM-overflow lag. Entering them the regular way is fine, by the way.
I had the 290x 8GB Sapphire from launch in 2013 until 2018. It was great. When it died, I got a 1070ti 8gb. These two cards lifted me up for quite some time.
I am a happy boi. I'm waiting for the prices of this generation to drop so I can get a 7900 XT or something. If AMD launches at humane prices next generation, it might be a good choice. Their stuff still works. My ATI Radeon HD 6700 lasted from launch until I got my 290X. Gotta say, it was a huge jump for me. I was playing Dead Space and Deus Ex on that 6700 and I was amazed at that 1GB of VRAM. Then I shot up to 8 and was like WHOA. There wasn't anything I couldn't run. Doom 2016 and Mass Effect Andromeda were stuff I could play without issues.
My 390x lasted YEARS and I water-cooled it when the fans died during the height of the 2018 GPU price spikes. It was handed down to my wife and lasted years, all the way to 2077 running at 1080p/60 boiling the water!
7900xtx/3090/4090 will get the same vid 10 years from now, 24 gb is a lotttt
As an owner of 3 290s: I would go all-in on this bet :D
😮
And the verdict will be the same: unfit for playing the newest titles while consuming an obscene amount of energy (I don't even want to imagine what a kWh will cost in 10 years 😵💫)
@@drumsmoker731 Zero cents, with your own solar power and a (car) battery pack as backup.
The over glorified UE5 is already capping that sht
Wow, that GN Steve from 2014 was a blast from the past alright.
This makes it even worse that new mid-range GPUs are still coming with 8GB, and that there's still no low-end 8GB card, with only 6GB and 4GB cards at the low end.
What an amazing GPU. If you bought this in 2015 you'd definitely have had your money's worth by now.
Still running it. It's been great, but it's time to upgrade now as it's hit its end for games.
Almost bought one of these back in the day; got a Strix 970 instead. Both were great cards at the time, considering the 390X and GTX 980 were way more money for hardly any more performance. Back when you could get a 1116MHz card and overclock it to 1520MHz lol, good times.
Got my 290 a week before the 970 release. Back then the 970 was ~6% faster and €20 cheaper; now my 290 is 15% or more faster, over 30% with my overclock.
But didn't the 970 have some issues because the last 0.5GB of VRAM is significantly slower than the rest, and using it tanks performance?
@@twiistedpanda4781 In 2014 at release almost nothing used more than 3GB, and even modded Skyrim used ~3.5GB, so the issue wasn't noticed until around half a year later. Then the updated Nvidia drivers tried very hard not to utilize more than 3.45GiB, so the 970 performed mostly normally in realistic use cases: ~5% faster than a stock 290 until ~2017 games, and at 1080p still up to ~2019. Most users upgraded to a 1080 (Ti), a 2080, or mostly a 3070 before the GPU shortage, often without ever noticing the 0.5GB issue.
In modern games the stock 290 is now 10-15% faster, mostly because of the full 4GiB of VRAM.
@@twiistedpanda4781 Eh, not really. If you benchmarked both cards across titles they usually ended up about the same. You might be able to turn textures a little higher on the 390. I'm sure you can find a modern benchmark suite covering both cards in recent titles. Maybe in some new games that use tons of VRAM it'd tank the 970, but the 390 most likely doesn't perform well in those titles anyway because of other architecture/driver bottlenecks.
You should also try custom drivers for this gpu
custom drivers work like a charm
12:50
And the default drivers from 2022 also mostly run fine (except, of course, DX12 Ultimate titles like AW2).
I've been messing with the GTX 980 Ti. Surprisingly still potent for a dinosaur. You seem the type to have already done a video on it in 2024, so I'll look out. If you haven't: it's impressive, comparable to the RTX 3050 (and not just the 6GB version), but it GUZZLES power in comparison.
A 6500 XT would be my guess for the closest performance comparison to the R9 390.
Great video.
I have a 6500 XT and it is faster than my friend's 580, unless there's a game that requires more than 4GB of VRAM lol.
oh hell yeah it's my once-upon-a-time dream GPU being reviewed
Thank you for using the R9 390 the way I’ve always used it. Mixed low/medium settings and textures cranked. Between the massive bus width and 8 gigs of vram it can handle a ton of textures even today.
FINALLY! I've been waiting for a video on the r9 390 (The love of my life) Since I got my hands on it myself! Thank you very much
we went from nothing can run cyberpunk to anything can run cyberpunk
I still play lots of older games, which are still as good as they ever were, where this GPU would probably shine.
Me too but ive got a 4070ti what a waste of money that was considering 90% of my games are like a decade old so they all run at hundreds of fps.
WHAT?! i was looking for r9 290/390 videos just yesterday!
Sorry I’m late 😁
The only thing this thing did was expose the 980 non-ti launch price as crap value
In Europe the 980 price dropped into reasonable territory after the 980 Ti release. Nvidia has overpriced every card at launch since the GTX 700 series, up until the 4070 Super.
Hawaii was a legendary chip
I used to OC mine to a 1200MHz core with a cheap 120mm AIO; it outperformed GTX 980s at the time.
I loved my r9 390 so much, but it cooked itself to death right in the middle of the gpu shortage. That card always ran incredibly hot, and I had to repaste it more than once
In 2016, i was finally doing a new build for the first time in over a decade. When it came to the choice of graphics, it was a choice of _two_ R9 390 cards for Crossfire action. Or a single 980ti.
That there are still _new_ Nvidia drivers supporting the old Maxwell behemoth does suggest I chose correctly, after all.
Crazy how long a GPU can last if the company is willing to maintain its drivers.
Just proves you can't always go back with certain things, as the card is nearly at end of life, at least for gaming. Nice video, thanks!
an iceberg gpu video on a friday afternoon
*hell yeah*
A friend of mine used his i5-6600K and R9 390X until last week. He bought it back in the day.
He used this computer for a lot of gaming; for example, he has over 900 hours in Total War: WARHAMMER III.
Now he can enjoy his new PC: an i7-14700KF with an RTX 4070 Ti Super.
I remember 2015, nobody was impressed by the 390. Everyone thought the 8gb of Vram was cool but pretty pointless and the 980 ti was stealing all the thunder.
woo yea oh yea iceberg upload
Ahead of schedule!
As someone with a 4GB GPU, a GTX 860M from 2014, I find the lack of more VRAM to be the only reason I can't play many modern games... I kid, I kid.
The fact that there are games which (after updates) block you from running the game on the same GPU it was running on before is one more reason to stop buying games from any store which is not GOG or like GOG. DRM-free FTW.
Good video, and IMO the GPU is still holding up well for its age if you have it and can't afford to change it. Well done AMD.
I work at a computer recycler/refurbisher. For some reason, we're drowning in these bloody things, so it's nice to know that they aren't just scrap.
9:45 XeSS won't run properly on AMD cards before RDNA2 (and nvidia cards before Pascal) due to a lack of DP4a instructions, so you end up using a fallback for a fallback...
It's barely faster than an RX6400 and depending on drivers like you covered, the 6400 is going to be better. That's a 4GB low profile GPU lol
my R9 290 was a good heat source for my room in the winters back then
You'd be amazed how well this card works under Linux with Proton.
In most cases you get better performance than on Windows, because the Linux drivers are still in active development.
I'll never understand why Alan Wake remains a benchmark for old GPUs. It's not even that great a game. Just play stuff more relevant to the old hardware you have; it's more than likely a better-made game.
Love the R9 390!!
Singlehandedly was the card that ended up getting me into AMD graphics when building my first computer. It was unavailable in store, so I ended up waiting and using integrated graphics. Getting a used aircooled Sapphire Nitro Fury changed my gaming experience dramatically.
I ended up switching to a Gigabyte GTX 1080, then back to the Fury when I was offered more than I paid for the 1080 during the GPU shortage. The Fury was such a beast in emulation: 512GB/s of memory bandwidth meant that even up to 8xMSAA was practically free at 1080p. Emulation performance was actually fairly comparable between the two even up to 1440p.
It started for me with the ATi All In Wonder Radeon 9700 Pro, I have not looked back at nv.
This is what I want not mini pc reviews
The R9 290 before it was a card that did me well for years and was the first card in my first gaming PC, but I never came across the 390. I did buy an R9 280X in the pandemic for a cheapo build; the comparative performance dip was much bigger than I expected. The custom drivers were a hassle (a different set from the ones used in the video here), as each release needed a whole bunch of fiddling and a different install procedure every time, so I stopped using them once official support was pulled. Interesting vid.
Wow, my first graphics card. For such an old GPU it went way above my expectations in terms of performance.
Oh yeah... the R9 290X/390X were awesome. Built a 290X system (8GB) 2 years ago. What a nice card.
For the games that do run on it, I'm pleasantly surprised at the benchmarks. For the folks still holding on to these cards they still have a somewhat okay 1080p med settings card.
I wonder if the high-end cards of today will age this well 10 years from now, or if PC hardware requirements will go up starkly by the next console generation.
Witcher 3 at 4K medium without reflections, FSR Quality: 30-39 fps.
10 years from now every AAA game will likely have Cyberpunk levels of ray tracing, so we'll probably end up with our current cards technically having enough raster, but anything short of like the 4090 not being able to keep up. The VRAM situation all over again
@@roguetwice469 The 4090 is just the Titan of the DX12 Ultimate (half-)decade; the Titan was the five-year reference point of the DX11 decade.
I bought my 290 two years later in 2014 as a card with similar raw performance for less money. The DX12 support was a lucky bonus and the bigger source of value, rather than the Titan's 6GiB vs the 290's 4GiB of VRAM.
@@PhilippJanusch I'm having trouble imagining a cheap 4090 alternative with DX13 support in two years, but that would be incredible.
It'll probably be $6-700 because that's "cheap" for a GPU nowadays.
You've completely missed the real reason the R9 390 aged so well: its _massive_ 512-bit memory bus, and the very high VRAM bandwidth that comes with it. The 390 has its GPU die surrounded by a super-crowded 16 (!!) memory chips. It was the last non-HBM card with such a wide memory bus _ever_; even the RTX 3090/4090 only have 384-bit.
And now both Nvidia and AMD are trying to sell people 128-bit e-waste that will age like milk.
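For a rough sense of the numbers behind that comment, here's a back-of-the-envelope sketch. The data rates below are assumed typical spec figures (6 Gbps GDDR5 for the R9 390, 17 Gbps GDDR6 for a modern 128-bit card), not taken from the video:

```python
# Back-of-the-envelope VRAM bandwidth: bus width (bits) / 8 * effective data rate (Gbps)
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

# R9 390: 512-bit bus with GDDR5 at an effective 6 Gbps (assumed stock spec)
print(bandwidth_gb_s(512, 6.0))   # 384.0 GB/s

# A modern 128-bit card with GDDR6 at 17 Gbps (assumed)
print(bandwidth_gb_s(128, 17.0))  # 272.0 GB/s
```

So even at much slower GDDR5 data rates, the 512-bit bus gives the 390 more raw bandwidth than a typical 128-bit card today, though modern cards partly compensate with large on-die caches.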
I love that Anjin meme
Is Forza Motorsport 8 Playable with new drivers?
Graphics cards should be like motherboards: then you could just upgrade the VRAM or the GPU chip as you wanted, for cheaper.
That would make the GPU a lot worse though, since upgradeable RAM has to have extra connections, and thus more resistance and slower VRAM. Who knows though, maybe the new CAMM2 standard for RAM can improve speeds and make upgradeable VRAM possible?
9:42 XeSS generally decreases performance on cards without dp4a support, which on AMD is only supported on the Radeon VII, some RDNA 1 cards and all RDNA 2+ cards. Even the 5700XT doesn't work well with XeSS because of this, so I'd say you're probably stuck with either native res, FSR or TAAU on older AMD cards.
Surprised they even bothered to make a fork of XeSS that tries to work on older GPUs, as it usually doesn't give a performance boost and probably looks worse (the XMX version on Intel GPUs looks better than the DP4a version on non-Intel).
@@219SilverChoc True. I guess Intel may have wanted something that worked on all their somewhat recent iGPUs, though I'm not sure when they got DP4a support. From what I've seen the emulated DP4a version does at least look identical to the native DP4a version, but there's just not much point in using it.
0:36 Woah, a younger Tech Jesus!
Man, I love your channel. Even though I game on laptops, it's very refreshing seeing a gaming tech enthusiast channel that focuses on old hardware. I love the smell of the nostalgic tech era, and not just in the form of the old retro 90s CRT-ish kind of era.
My first graphics card I bought. Still a beast all these years later.
It's cool and a bit insane that the 290 OC/390 still beats new $130/€130 cards like the A380/1650(S).
4060ti with crippled 128 bit crying in corner
Sad, isn't it? Just sad!
I've got an RTX 3060 12GB right now. Hoping to ride it out on 12GB for a few years...
But I had a GTX 1060 with 6GB, and for the games around that time it was usually more than fine. Yet the RX 480 with 8GB kinda made sense.
0:41 I can tell you the feeling. Everyone thought it was power hungry with bad drivers, and were recommending the GTX 970 instead, in spite of its botched 3.5GB of RAM. I always knew the R9 390 was superior because of its RAM and feature set, but then you immediately got called an AMD fanboy.
Man AMD GPUs were always good at high VRAM amounts. I used my RX 580 quite far into 2022 which is newer than this but also had a shockingly high 8gb for its age and price point. This and the 390 can be found quite cheap on ebay. I have to agree with the older ryzens being usable still, my 3800X is still running strong.
Rather unfortunate that the Hawaii GPU didn't implement frame-buffer color compression, wasting that 512-bit memory bus.
loved doing builds with the R9 cards (280, 380, 290, 390, etc) but honestly have noticed their age is definitely getting to them like you said. Still good cards for esports games but the power consumption is a big turn off when you can just get one of the RX 400 or 500 series cards or their Nvidia equivalents for very similar prices
Thankfully 8GB VRAM pools have been relegated to cards like the 3050.
........
Got an R7 M260 in a laptop and it is remarkably fresh and feisty compared to Ryzens with Vega 8 graphics. The only bad thing is that it's in an HP laptop.
The 390 I had was fine; the problem was the drivers. Nvidia was just kicking AMD's butt with drivers during that time, so the 8GB meant nothing.
Remember, GCN launched against the Fermi GTX 580 at only 800MHz (and would do a 25% OC easily). AMD later added FP16 support and delta color compression, but not much really changed in the shaders, because AMD was too broke after buying ATI in the first place.
AMD was also still supporting the very different TeraScale HD 6000 series, which used a VLIW architecture versus the RISC MIMD design GCN used, so the driver team was split for years. That changed in 2016 when they finally started to focus just on GCN, hence "fine wine" became a thing.
The last GCN cores are in Cezanne, like the 5600G and the refreshed 5600GT from January 2024, just making it past GCN's 12th birthday. To my knowledge only Itanium lived longer.
The only way it keeps working is Linux, because the drivers are open source and still maintained. Considering Windows 11 Recall takes screenshots of your desktop every 3 seconds, stores them unencrypted, and has already been hacked, you REALLY SHOULDN'T BE USING WINDOWS 11 ANYWAY, and Linux will solve all your problems with this card.
Just got a used R9 390 on eBay for 10 bucks, so hopefully when it ships I don't get a rock, and when I plug it in it isn't a paperweight 😂
I don't get your crash with Starfield. My R9 390 with the RID drivers runs it just fine at 1080p low at about 36fps. My CPU/mobo is a Machinist X79 Xeon combo, if that makes a difference. Also I'm using a 1TB Samsung SATA SSD. It does, however, draw a crap ton of power. LOL.
Would have been nice to see a 4GB (R9 290) variant included.
Over the years some outlets have done comparisons between the R9 290 4GB/8GB (which is basically the same as the 390) and even the GTX 970. The last comprehensive test I saw is now nearly 3 years old, and there the R9 290X 8GB and GTX 970 still had roughly the same relative performance they had at launch.
If the extra memory only pays off after 5+ years, then you really didn't benefit much from it. Still good cards. It took AMD some time after the 200 series until they got good high-end GPUs again, but Vega 56 also had great value (if you could get one of those rare paper-launch models).
Nvidia: still supports the GTX 900 series (2014 release date) with drivers.
AMD: no more drivers for the R9 300 series (2015 release date) after June 2022.
I will never buy an AMD GPU again! Only Intel or Nvidia.
I was thinking this would perform better. Compared with the 970 this is quite a bit slower, and prices should be about the same for 4GB cards. (I bought one for a friend for $25.)
I still have an R9 290 and performance is still OK for casual gaming. They can also be undervolted a lot and draw around 100W instead of 250.
Instead of upgrading, I got a Freesync Premium screen and been very happy with the result. Cost me much less than a new GPU at $150 and keeps everything nice when it drops below 60fps.
I bought it to play doom 2016 with vulkan. It ran great at the time.
It really sucked at older games; AMD drivers had terrible support.
It cost me almost nothing after I sold my GTX 780, and I had it for about a year. Then I bought a GTX 1070 and never looked back. Way better.
I bought an R9 390 when it was new. My take on it was that it basically traded blows with the GTX 970. The GTX 970 had a lot better efficiency, and when overclocked it would pull ahead in performance. The 390 had a lot more VRAM: 8GB vs basically 3.5 (who remembers that controversy?). I also think it had better performance in modern APIs like Vulkan and DX12.
Honestly, even though I have a 3070 with only 8GB of VRAM, it still runs games like Cyberpunk at 1440p high with some ray tracing at well over 70-80 fps. Although I would never buy a brand-new 8GB card, I think the VRAM panic is largely blown out of proportion, since 98% of games still work well with it.
I had the 290x and then later the 390 as a temporary card after my 290x lit itself on fire and died. It was worth it to me since the 290x was when I started playing games at 4k.
Back when AMD GPUs were competent. I was about to get a 390, but the 980 Ti dropped, which had insane price-to-performance.
But yeah, had Nvidia not released the 980 Ti, I'd have gone for the 390 because I didn't like the 970's lack of VRAM.
Those were great cards, but I wouldn't recommend anyone run one as a daily card, even for casual gaming. Simply put, power usage will eat up the card's value in a very short time. Until recently I had the previous model, the so-called 7970 GHz Edition; I replaced it with the cheapest 6600 and my power bill dropped $10-15 per month, so the card will pay for itself in electricity savings in under a year. Not to mention the 6600 is silent, while the old card was very, very loud and hot. The whole PC got 10°C cooler despite many fans and a generous case.
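A quick sanity check on that kind of savings claim. Every number below is an illustrative assumption (wattage difference, daily hours, electricity price), not the commenter's actual figures:

```python
# Illustrative electricity-cost arithmetic; all inputs here are assumptions.
watts_saved = 180        # e.g. a ~250W older card vs a ~70W card under load
hours_per_day = 5        # heavy-ish daily gaming
price_per_kwh = 0.30     # $/kWh; varies widely by region

# Energy saved per month, then its cost at the assumed price
kwh_per_month = watts_saved / 1000 * hours_per_day * 30
monthly_saving = kwh_per_month * price_per_kwh

print(round(kwh_per_month, 1), round(monthly_saving, 2))  # roughly 27 kWh, ~$8/month
```

At these assumed rates the $10-15/month figure is plausible with longer daily use or pricier electricity, so a cheap efficient card really can pay for itself within a year or two.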
It's called FARTnite, first of all!!!
I have been wondering about these old cards: whether you could take cards with lower RAM and solder on bigger chips to add a few GB.
That, with modern FSR implementations, could give you a good extra bump in performance, I think!
Again, failing to understand new technologies in game engines and how they work is the source of a lot of misinformation on the "tech channels". Seeing a 4090 use up ~17GB of VRAM in Diablo IV at 1080p tells you everything you need to know, but people will insist that 8GB is not enough because they look at VRAM usage and draw their conclusions solely from that. To all of those people I present: caching! Using excess VRAM to store data closer to the GPU has been a thing for the past 3 or so years. You can A/B test this with the same model of GPU in different VRAM sizes: the GPU with more VRAM will use more of it for the same workload while producing almost the same performance, and the one with less VRAM will not stutter or exhibit any of the problems that come with a genuine lack of VRAM. Of course there are 3 or maybe 4 console ports that will exhibit stutters, but that is due to lazy porting.
Didn't realise how much VRAM new games could use until I upgraded my RX 470 to an RX 7800 XT. I'm already seeing games use around 15GB of VRAM at maxed-out settings. I bought the 7800 XT with the intention of the VRAM adding a couple more years to its lifespan over the Nvidia options.
My 470 eventually caught up to the 1060 after a few years as well, so maybe my 7800 XT will be on par with the 4070 Ti Super a few years down the line.
I had the 390X Nitro. The plastic cooler was more fragile than I expected, and the quality was rather disappointing, even though the card performed well. The memory was locked too, unfortunately.
A shame to abandon support for GCN 2; it's still a pretty capable GPU with 8GB of memory.
Custom drivers could help a bit.
For Helldivers, I'm pretty sure the game would run much better with DX11. You can force it in the launch options.
I was active in the PC enthusiast space back then, and believe it or not, at the time the extra 4GB of VRAM on essentially the same GPU die was considered a marketing gimmick, a bigger number to entice unsavvy consumers into falling for the bigger-number-better gag. My, how things have changed. Developers nowadays use up VRAM the way a hillbilly uses the back of a pickup truck to shovel in junk that has no business being there.
Alan Wake 2 requires mesh shaders, which aren't supported until RTX 2000/RX 6000. GPUs that don't support mesh shaders can sometimes still run the game, but older ones will just crawl.
"8gb", yeah except you can't actually use it quickly enough in gaming.
The difference between a modern 8gb buffer and a 10 year old buffer is massive.
The only way to actually use that 8gb would be in a professional graphics workload/render, not time sensitive by any stretch.
The R9 390 was my very first GPU when I returned to PC building and gaming. Could not have made a better choice. The GTX 900 series' 4GB of VRAM was an absolute no-go for me.
I bought an R9 390 in 2017. Aside from killing a PSU or two (one of mine and one in a friend's PC after I gave it to him), it was a phenomenal card. It certainly kicked my 1060 3G's butt.
I was 25 when I built my first PC in 2015. Good old R9 390; that card served me well. Unfortunately I paired it with an FX-8370. I still have the check from that class-action lawsuit!
Things have changed; I don't think you actually *need* new hardware like before to play new games. Conceptually, no game requires more than a 10-year-old PC, and none will in the future. But lackluster optimization and little care for users make for the current scenario, where requirements skyrocket with marginal improvements in graphics quality. We probably need tools to hack games internally to downgrade graphics (besides upscaling techniques, of course), but doing so will make them look worse than if these games had actually been developed with older hardware as a target. Games should be made with scalability in mind, so any modern AAA title still looks great no matter the hardware.
I have a Sapphire Nitro R9 Fury I haven't even tried yet; I wasn't sure what it was equivalent to. Picked it up years ago for a retro build (possibly a 2500K/Z77) and only took it out of the box to have a look at it. (I do like the looks of it.)
A decade later, we're still getting 8GB in "high-end premium 1080p cards" for the same if not a higher price 🤣 Really shows how manufacturers (especially the green one) are purposefully stagnating progress...
My Sapphire R9 290 Vapor-X OC (basically a +15% OC, so faster than a 290X/390 when there's no VRAM shortage) still runs fine today.
Hawaii best architecture ever
I have this, still rocking it. I actually play a lot of sim racing games, Doom 2016, and GTA 4, and my son plays a lot of Minecraft and other indie games like Slime Rancher.
So far so good. That card has been with me for 8 years, but I'm going to build a new PC and will have to leave the 390 on a shelf.
The 390 was definitely worth it for those who bought it new, but no one should be searching for it now. This GPU was, overall, just as much of an impractical purchase as the Radeon VII was. Both cards had far more VRAM than they could make use of during their useful lifespan, no matter how much some internet contrarian tries to convince you otherwise.
If people want an upgrade to this card, I'd say an rx 6800 makes a lot more sense right now than an rx 6600. The 6800 can be found for around 360$ right now, and keeping in the spirit of the R9 390, has 16 gigs of vram.
This GPU deserved longer driver support. I am sure some games would've run better on it. It was around RX 570 - RX 580 performance in the games released before the end of its driver support.
I had the Strix R9 390 in my first ever gaming PC back in 2016. That GPU definitely could've lasted longer than I had it for, and I've missed it for a while
If you don't want to play modern titles and are happy with 1080p high settings, you'd be better off with an RX 580 or 590, or a 1660 Super. FSR 1.0 is better than no FSR support.
Ah yes, my MSI R9 390X. Its 8GB carried me well from 2015 to 2020. My 5700 XT is looking like it will go down that same road later this year. c:
Hey, I had an R9 390 Nitro! Great card until it blew up on me for no reason. I think I overheated it a few times lol
The most interesting part of looking back at these older cards, to me, is gauging how well modern cards will age. No doubt my 7800 XT will last until next generation; it honestly makes me feel like modern prices are worth it, at least for how long these cards should last. VRAM tends to be the pain point of older hardware every generation, and the 3060 had 12GB of it, meaning it should last plenty long.
I'd love to see a comparison with the RX 580 8GB. A friend of mine has one and still plays games on it. AFAIR these were considered really similar in performance back then; I'd love to see how they've aged relative to each other.
Beautiful GPU, this was.
I was genuinely torn about having to eventually part ways with it, but the 1080 Ti was just that supermodel you couldn't say no to if she asked you to bed.
Had a Sapphire Nitro R9 390.
Great GPU, but mine kicked the bucket about 3 years in. Just bad luck, but man, 8GB was unheard of back then.
Pretty sure the norm was 4GB, with the occasional 6GB, at the time.