Radeon VII is an expensive data-center GPU (Instinct MI50) in disguise. 3.36 TFlops FP64 (actually 6.72 in hardware, but AMD locked it down in drivers); the 1080 Ti only does 0.34 TFlops FP64. It also has much faster VRAM at 1 TB/s, although it only manages ~750 GB/s in practice. It was not competition for the 1080 Ti, but for the $7k Nvidia Tesla V100 16GB.
It was a no-brainer to buy 10x Radeon VII instead of 1x V100 back in the day for computational physics research.
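For anyone curious where those TFLOPS figures come from, here's a rough sanity check of the math (a sketch in Python; the clocks are approximate boost clocks, and `tflops` is just a made-up helper, not any real API):

```python
# Theoretical throughput = shaders * clock * 2 ops/cycle (fused multiply-add) * precision rate
def tflops(shaders, clock_ghz, rate=1.0):
    return shaders * clock_ghz * 2 * rate / 1000

# Radeon VII: 3840 shaders @ ~1.75 GHz, FP64 capped at 1:4 in the driver (1:2 in silicon)
print(tflops(3840, 1.75, rate=1/4))   # ~3.36 TFLOPS FP64 (driver-limited)
print(tflops(3840, 1.75, rate=1/2))   # ~6.72 TFLOPS FP64 (full hardware rate)

# GTX 1080 Ti: 3584 CUDA cores @ ~1.58 GHz, FP64 at 1:32 of FP32
print(tflops(3584, 1.58, rate=1/32))  # ~0.35 TFLOPS FP64, close to the 0.34 quoted
```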
thats cool
I wonder if they did that to cash in on the crypto boom.
@@GoldenGrenadier They did.
They won't admit it, but it's also the reason Vega was such a good mining card.
I don’t think we’ll ever see pascal level price to performance ever again. I like to think that was the greatest gpu generation we’ve ever seen. None of the options had big weaknesses and they each performed very well for their price.
l think second generation of Maxwell was pretty good too. Ten years old 980 Ti is still a beast of a GPU. And even GTX 950 is really good performer as well.
@@SharpShoot3r_14 980 Ti is just over 8yrs old, and most are around 7yrs
@@hambotech9954
No, most are almost 10 years old now, and the 980 Ti is 9 years old now.
I still have a Radeon vii in my gpu collection it’s an interesting card and I’ve always loved the look of it. But the 5700xt really made the vii look bad when it launched.
I mean, to be fair, the Radeon VII was never meant to be a gaming card, it was a cut down server GPU thrown together last minute so AMD would have SOMETHING because Navi was 6 months delayed
While efficiency was worse on the R7, it was always faster in games, at stock and especially when both are overclocked.
@@raresmacovei8382 Well, in some it was, in others it wasn't; it was hit or miss. I have a Radeon VII, a reference 5700 XT, and a Nitro 5700 XT SE. The VII and the reference XT have to be undervolted and tuned to get the best out of them. I like collecting GPUs, but my main systems have a 4090, a Nitro 7900 XTX, and an FTW3 3080 Ti. I never sell anything and really missed out on selling some when GPUs were crazy expensive a couple of years back. The Radeon VII was going for astronomical amounts on eBay.
@@raresmacovei8382as per usual, it's a power hungry beast with a pretty face.
Undervolting this thing drops avg power by like 70W
Man, the Radeon VII looks cool to me, idk why.
Nice video as always
It is indeed a very nice looking gpu
Always wanted one but not paying the stupid prices for it
It's the very minimalistic boxy look to it. It just looks like a metal plate with 3 fans in it.
because it's rare and unique
The Radeon VII did one thing VERY well: mine Ethereum. It was near an RTX 3090 in that regard.
It's really weird to think that the RX6600 is about the same performance as the non-Ti 1080. That's actually really good performance for an entry level card.
A huge thing to note is that FSR and FG are going to absolutely make these two old GPUs stay usable for another solid year at least. In Avatar, the 1080 Ti at 1440p high with FSR + FG can hit 55-70 in most areas, which I think is very impressive.
Looking forward to my 5090 to replace the 1080ti though
I pray for your wallet when that 5090 comes out 💀
delicious 10 years of input lag
You can keep your weird AI hallucination nonsense thanks.
@@GewelReal Everyone keeps blowing the input lag thing out of proportion; not even with AFMF was it that noticeable. Everyone will be fine with this for single-player games, and for most multiplayer games this isn't even needed, since those already run on a potato.
@@POVwithRC Don't get me wrong, it's not ideal and I'd rather leave it off, but being a student I can't get a new GPU until I begin working. I'm plenty happy to have my GPU hallucinate up some frames for me in the meantime, while I use a controller.
Plus, fast-paced games don't need FG on this GPU to hit good frames, and I don't care a lot about quality in competitive games.
By the way, in Steam launch options you can make lots of games run Vulkan instead of DX12; you should try that in future tests.
or dxvk
I like the hypothetical approach presented. One thing I'm very curious about... how the framerates would be affected if you disable tessellation on the games tested (where available) on the Radeon VII and the 1080 TI. I remember tessellation being one of those settings that had to be disabled to keep consistent higher framerates achievable on the GCN cards I've owned. It might make an interesting video, investigating the performance differences with tessellation enabled/disabled comparisons on the GCN line. 🙂Love watching the older tech being experimented on. Keep up the awesome work Iceberg.
That was way back in the day, and AMD, instead of fixing the issue, made a change in the driver that lessens the tessellation load by literally just reducing the amount of tessellation being done. In my experience it doesn't make much of a difference.
@@sandrocarletto9386
If there's no difference with less triangle spam, I'd say they fixed the issue.
@@sandrocarletto9386 I set my AMD card to override application settings and limit tessellation to x16. Still doing it with my 6900 XT.
@@smith7602 It doesn't make much of a difference disabling tessellation completely, because AMD already minimises its use in the driver.
@@sandrocarletto9386 GameWorks games used to push tessellation to an absolutely bullshit x64 factor. AMD just made it more sensible; even on GCN you can force the driver to honor the game's values if you have some reason to do so.
I still daily drive a Radeon VII on an ultrawide monitor. Probably the only reason it's still alive is because my system is heavily CPU and RAM speed bottlenecked. Most games will run at 100 FPS (the FreeSync refresh rate of my monitor) with some graphics settings tuning.
It's also worth noting that the thermal limit on the card is 110C, and unless you set an aggressive cooling curve in the Adrenalin client, it will very quickly jump to that and sit there the entire time. I often find myself power and clock limiting it even with all the fans in my case (Corsair 4000X) at max speed. The three fans on the Radeon VII max out at just above 3800 RPM and are super loud when they resonate... If I had less of a bottleneck, I'd imagine I would have to keep a power limit on it all the time to keep it from boiling off the thermal paste.
What CPU are you using that’s CPU bottlenecking a Radeon VII?
A Ryzen 3 1300x!?!?
I have one too. I undervolted mine; it runs at 1020 millivolts. It only needs to spin at 2800 RPM to keep the hotspot under 90C while retaining 100% boost speed. If I remember correctly, didn't they have thermal pads instead of paste?
@@bornonthebattlefront4883 intel i5 9600kf
@0wenage They did. I replaced mine with paste and new pads; if it survives for another year, I'll see about getting one of those graphite thermal pads.
@josephgrooms2977 I plan on keeping mine as long as possible. I will upgrade when I can't use it anymore or I buy a game that I can't run. Most demanding game I have is halo infinite at the moment.
1080ti seems about as powerful as a 4060. Pretty impressive something from 7 years ago holds up that well
There is almost no chance of finding a new 1080 Ti, or one that didn't mine crypto.
@@existens6889 Yeah, but still tho, it's a beast considering the fact that it's 7 years old.
@@existens6889 My EVGA 1080 Ti FTW 3 Hybrid has never been used to mine crypto and is still working great. It loses to my 4060 gaming laptop though.
@@existens6889 So what? Don't care, and don't expect it for a GPU that's 4 generations old. The GPU industry has fooled people into thinking that computer hardware and GPUs only last like 3-4 years. They don't. They often easily chug along for 4x+ that length of time. And often, if they do slow down, some quick cleaning, new thermal paste, etc. will solve those issues.
@@skylitemedia862 Don't expect your half-a-decade-old hardware to run newer triple-A games. Don't complain when games like Alan Wake 2 or Frontiers of Pandora don't run, just because you want to save some pennies.
I'm not sure how you feel about making content centered around CPU scaling, but I'd love to see a video comparing the 8700K with your 5600X using the 1080 Ti and your game suite. Thanks for the great content.
With overclocking. There's too much performance left on the table without it.
I'm still using a GTX 1080 Ti with an E5-1680 v2 @ 4.5GHz + 32GB of RAM @ 2133 CL12 + Asus RIVF + Samsung 980 NVMe 512GB (BIOS mod). It's still a very capable machine.
a dinosaur of a CPU
Ancient
I still use a 1080 Ti with a 7700k. It plays all of the games I own on high settings at 1440p so I haven't felt the need to upgrade yet. But 2024 might be the year to finally make the change.
@@GewelReal Oh yeah. But a powerful one.
Oh yeah. I'm not switching it up anytime soon. You don't need to spend a lot of money; it serves me very well. I don't need to fill Intel/AMD/Nvidia pockets with money every time they do a small update. Of course it's not as performant as some new systems, but for the price it's very good.
I refuse to sell my 1080 Ti because I find it so awesome and it reminds me of better days, even if those aren't that long ago.
There's a mod for AW2 that disables mesh shaders and makes it quite playable on the 1080 Ti, so I don't feel like it's much of a problem for these older cards... so long as performance with them disabled makes the mod worth the effort.
The 1080 Ti is a card I always wished I had, and I'm surprised, when I look up cards' relative speed on TechPowerUp, how often the 1080 Ti shows up.
Don't let your dreams be dreams man, I upgraded from my beloved 970 to a 1080 in April 2020 and then to a 1080ti in 2022 and it's been a blast!
I am 2 minutes late... I sold my only GPU, my 6600. Come 2024, I want to have 3 GPUs, and the 1080 Ti will be the weakest.
And just what are you gonna do with 3 GPUs?
The 1080 Ti being the weakest is the biggest flex I've heard 💀. I just bought a GTX 1660 yesterday as my first GPU.
@@34unreal Some people just have different priorities 🤷. I really want to sell my 1080 Ti for when Intel Battlemage comes out.
I was lucky, buying a used 1080 Ti in 2019 for 500 bucks. It's still going strong. I hope GPU prices come down in the next few years.
The spiritual successor for me is the 6800 XT from AMD. It's to this day a very competitive card, and the price for a 16GB VRAM card is insanely low compared to Nvidia options now, and it came out 3 years ago... how fast time goes, oof.
The 1080ti is the best mistake Nvidia ever made. There's no way we get a card like this again, ever.
The GTX 1080 Ti is seven years old next March, what a monster.
the 1080ti is truly the old man in a job where men die young
The 2080ti was always the better GPU. It has access to all modern features today except frame generation and AV1 encode.
@@ZackSNetwork For double the price, basically.
@@ZackSNetwork But not in price.
@@ZackSNetwork Isn't the RT performance of the RTX 2000 series garbage compared to even a 3060/3070?
Cool video. I had an FE 1080 Ti for a while before it started to die a horrible death. For the longest time I was running on my backup, which was an RX 570 8GB.
I just upgraded to an RX 5600XT a few days ago and regret nothing. With at least some of my games I can crank up the settings again.
And since I run Linux AMD support is just all around better. Although I have heard that RTX cards are getting better driver support so maybe someday that will change.
I have been a Radeon fan though since they were an ATI brand out of Markham, Ontario. A bit of a national pride thing.
I even have one of my old AGP cards with the old ATI mascot Ruby printed on the shroud.
I've been collecting old AMD cards for my winXP retro PC. I always had Radeon cards from 2005-2015 until I got a gtx 970 and never looked back. I have a 1080ti now. But I went on ebay and scored a 9800pro, x850xt and radeon 3850 all in AGP and I put the old copper zalman LED coolers on all of them and they ROCK in these old computers!
I just picked up a GTX 1080 for my old rig. I suppose a 1660TI could be comparable performance but I could not pass it up for the price. Now the old 6800K has an era accurate GPU lol.
I still love my Radeon VII, even if I no longer use it. A shame my model is doomed to die (Samsung HBM) spontaneously.
Mine has the Hynix and doesn't clock anywhere near what a Samsung HBM VII does, so I guess there's a trade-off lol.
@@pr0j3ktEv0luti0n The difference in clocks is irrelevant thanks to the fact that the VRAM sits right next to the GPU die on the same package.
I really liked when Thanos was cut in half in the "What If" series you put in the thumbnail.
14:39 - I protest!! JK... But you should have kept the 1080 Ti at maximum texture settings; if it falls to 1030 levels of performance, then so be it. Otherwise it's not a comparison anymore.
7:38 Makes my day to see Seb's helmet.
I'm actually surprised that the VII did as well as it did, because the Radeon VII was primarily designed to be a workstation/prosumer card. The 16GB of HBM2 in that era is proof of this, as no game could possibly have benefited from that much VRAM at the time. The Radeon VII was more like AMD trying to make a Titan-like card, and the GTX 1080 Ti was better at gaming than its Titan sibling at the time.
I love my 1080 Ti, it's a beast. Some undervolting with MSI Afterburner, and the card runs great at 70C while still pumping out 144 fps.
Man, I sold my 1080 Ti a month ago. What a beast it was… played Jedi Survivor on it at normal settings at 1080p with 40-50 fps.
1080Ti is the GOAT and with FSR3 becoming more available in the future it'll manage to stay relevant for even longer
I remember the VII was excellent for workstation work, back when Nvidia didn't have the dominance they have now in that category, thanks to GCN's great compute performance and the 16GB of HBM2.
Still using my 1080ti that I bought used in 2020. Mainly because I'm cheap and can't afford one of the new shiny ones, but it still runs everything I want it to!
stop romanticizing dinosaurs
4:15 i see what you did right there lol
If only. That would have meant Nvidia had something to compete against. Here we are at the end of 2023, and Nvidia continues to set prices.
I love what you did in the video, and you were bang on. The only issue is, the real "what might have been" is today's pricing.
Damn, even with a card that's supposedly better/on par, the 1080 Ti still wins quite a few games. It would take AMD quite a few years to finally eclipse it fully.
My 1080ti is still powering my daily and it will be until it dies. It’s still a powerhouse
I wish AMD would finally be able to compete with Nvidia the way they compete with Intel in CPUs. AMD unfortunately still hasn't found their way in the high-end GPU market.
They won't. Not because they can't. After the fiasco release of the 4080, AMD released the 7900 XTX a bit cheaper: same performance, no DLSS, more power draw.
Naturally a competitor would undercut the market and gain market share. AMD didn't; what they did was match the pricing, so the 4080 and 7900 XTX could sit at the same shit value at different price points.
@@whitygoose The 4080 is incredible! Frame generation and DLSS Quality destroy the 7900 XTX so hard that it will be obsolete soon!
Just a question for you, with nothing further: why test the 1080 Ti in era-appropriate games at 1440p when it was marketed at, and primarily used for, 4K?
The 1080Ti was the last "best available" GPU (I don't count Titans) that could be had at a price-to-performance ratio that reasonably mirrored the rest of Nvidia's range. We'll never see that again.
that little graphic being a nod to althistory hub is cute
Thanks for comparing these 2 cards. I'm just thinking of swapping the 1080 Ti for the RVII, as I have both: the 1080 Ti with a 1950X, and the RVII in use as an eGPU.
These are fairly niche videos. Really good quality and enjoyable to watch, though.
I was just bidding on one of these🤘🏼🏁
Another great video! Merry Christmas good sir! 🎄🎄
Thanks! You too!
This is a very interesting benchmark. For one, the 1080 Ti is a beast even today at 6 years old, had mine not died from overheating like my first did. Second, the Radeon VII was 2 years later than the 1080 Ti, so the 7900 XTX has caught up tremendously in the current gen; it still has a bit to catch up, but you can say it competes well considering the price. The biggest question mark, although much less relevant to games, is how the Radeon VII is the last consumer GPU so far with HBM memory, and it will likely stay that way for a very long time.
The L1 cache of this Vega-based GPU is only 16 KB versus the 48 KB of the 1080 Ti, which is why it really struggles with the 1% lows and averages. The RX 5700 XT has fewer shaders, fewer ROPs, and fewer TMUs, but has a unified shader cache allowing direct 64-bit access as its L1. This is why RDNA is such a massive leap forward versus its Vega counterparts. Interesting seeing this play out in a what-if scenario.
It's highly doubtful a single hardware feature is responsible for the performance difference. AMD only increased the L1 size in RDNA3 to 32KB, so I'm not sure where you came up with the "unified shader cache" thing for RDNA1. Pascal was the last Nvidia architecture to use a dedicated L1 texture cache (48KB, read-only), and it was not latency-optimized, since all read requests had to go through the texture units. GCN had other weak points besides the software driver behavior that, left unattended, brought down the performance in many cases, as a result of poor VALU occupancy caused by a myriad of factors.
@@Ivan-pr7ku I read it in their white papers, the "RDNA 1.0" Instruction Set Architecture and RDNA ARCHITECTURE; just google these. Sure, there are other factors, but these are the main nail in the coffin. Not being able to move larger texture files or swap data across because of a lack of cache is like having too small a diameter on your injector: not enough fuel, not enough bang, means a shitty-running engine.
Interesting results in CP77. Back when Cyberpunk launched I had a Radeon VII; however, my card did around 45fps at 1440p. It might come from my undervolt to, I think it was, 975mV; I didn't OC, however.
that intro animation went hard
Still rocking a GTX 1080 non-Ti, and it runs pretty much everything at ultra at 1080p :D
No it doesn't, but it's still good ;)
@@michubern1444 1080p?
@@michubern1444 I don't see why it wouldn't. It's not that much slower than the 1080 Ti, and my 1080 Ti was running almost all my games maxed out at 1440p at decent fps (except for a few games, hence why I upgraded). So the 1080 would be doing a lot better than my 1080 Ti was, seeing as the difference between 1440p and 1080p is huge in terms of GPU power needed.
(Just looked it up on Google because I couldn't remember the exact number, but here it is.)
According to Google: 1440p monitors offer 78% more pixels than the Full HD option.
So yeah, it would definitely run almost everything at 1080p just fine, except for the latest unoptimised messes we've gotten in the last year or so, given that my 1080 Ti was running almost everything maxed out decently well at a resolution that takes almost 80% more power to run lol.
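That 78% figure is easy to verify; a quick sanity check of the pixel math (just arithmetic in Python):

```python
fhd = 1920 * 1080     # 1080p: 2,073,600 pixels
qhd = 2560 * 1440     # 1440p: 3,686,400 pixels
print(qhd / fhd - 1)  # ~0.778 -> 1440p pushes about 78% more pixels than 1080p
```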
I loved my Radeon VII. Picked it up for $550 brand new right before the mining craze. Sold it before the mining boom crashed, and it more than paid for my current 6900 XT, which was well overvalued at the time. Wish I could get the card back. Overclocked, it was quite a beast. The stock cooler sucked, though, and was so obnoxiously loud.
I cheer for no billion-dollar company. Especially these days, when they're both extremely overpriced.
Fun fact: the radeon 7 still holds the crown for highest memory bandwidth on a consumer GPU. Nothing has matched it yet! Even a 4090 only has 86% of the bandwidth (after accounting for the GDDR6X MTA)
Says more about HBM vs GDDR than AMD vs Nvidia, but at the end of the day the R7's bandwidth meant little, since the card itself had shit compression.
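The paper specs are easy to sanity-check: peak bandwidth is just bus width times per-pin data rate. A quick sketch (Python; the 86% figure above appears to come from an efficiency adjustment on top of these peak numbers):

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbit/s
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(4096, 2.0))  # Radeon VII: 4096-bit HBM2 @ 2.0 Gbps -> 1024 GB/s
print(bandwidth_gbs(384, 21.0))  # RTX 4090: 384-bit GDDR6X @ 21 Gbps  -> 1008 GB/s
```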
Oh heck, this was my idea, comparing the 1080 Ti. I feel like the 5700 XT would be a good addition as well.
Thanks for that. Do you have any idea what the fastest gaming card with a blower fan was? Was it a 1080 Ti?
Vega and the VII looked pretty cool, and it's the last swan song of GCN, but sadly it's not widely accessible.
I found the underdog Radeon VII for cheap. Can't wait to try it out 😊😊
I just retired my Radeon VII 2 months ago for a 6950 XT. I ended up getting it from someone who actually had two 1080s in SLI; he tried it, didn't like it, and sold it to me for 350 bucks a few weeks after its release. If you do the washer mod to the VII, thermals drop, and my clocks would sit quite a bit higher. Also, was SAM on when testing the VII?!
Is that a "alternate history hub" reference?
Now about the card, the worst problem is the assembly of the VII it self, the card have lots of interesting technologies, but they rush things soo much that neither the hardware or the software are mature for market, in the end amd even stop to try make a functional driver
I loved my 1080 Ti; unfortunately I bought an EVGA version, which I had to RMA 3 times. Like clockwork it would die every 6 months. I ended up eventually just selling it after my last RMA and bought a 2080 Ti from MSI that ran great for nearly 2 years until I sold it.
3:57 4 sticks of single rank is dual rank as well!
The RX Vega Frontier Edition competed well against the 1080 TI. It was very powerful, had great raw horsepower and lots of OpenCL processing juice. The Radeon VII was quite a bit better, but it was really just a proof of concept that didn't pay off.
Still got both of my GTX ROG Strix 1080's. Been wanting to upgrade so badly, but the truth is... I'm still fine even with lower graphics. I haven't even upgraded my monitors, still 1080p 144Hz IPS. So I realistically have no hurry yet, but I am eager to see what is up on the line next with AMD. I'm hoping for power cuts to the 8000 series cards, as I'm running an ITX build. The performance advantage is obviously still in Nvidia's court; however, AMD has got me much more curious as to whether or not they'll keep the new dual-chiplet architecture.
Overall, just honestly grateful that the GTX 1080 has brought me in this far.
Speaking of Pascal... Today my father sold my old PC with an i5 7400 and GTX 1050 for 7500 UAH (157 GBP rn). The CEX machine indeed.
RX 5700 XT is the Radeon card of 2019 that answers the GTX 1080 Ti. Specifically the RX 5700 XT 50th Anniversary Edition.
Radeon VII always shined best at 4K. Yes, yes, I know, but 4K with med-high settings often looks better than 1440p at ultra, and the bandwidth of the VII is what makes it do better. Just like at 1080p a 5700 XT is typically faster than the VII, but at 4K it walks away.
With current prices on the Radeon 7, you can buy a RTX 3080 for the same money...😂😂
Prey and Control both stellar games.
Trust me, even if AMD can beat Nvidia, they won't. They will never ever try to win against Nvidia. Nvidia raises the price, AMD follows.
The Radeon VII's strong point was its looks xD
Good video.
I had the option of a Vega Frontier Edition that fell flat a couple of weeks back, and I was wondering if the Radeon VII would be worth it.
Not that I'd need either of them other than as collector's items, given that my RX 6650 XT seems to have them both beat, apart from VRAM amount.
Edit: when I talk about a card's class, I am talking about its manufacturing spec, such as die size and memory bus; these things do not change between generations. There is always a minimum viable size of around 100mm², and always a maximum viable (consumer) size of around 600mm², with the 2080 Ti and Titan RTX being outliers at 750mm², like the enterprise Tesla series of cards, often around 800mm² (now called A100 and H100, dropping the Tesla name).
The competition for the GTX 1080 didn't come until the 5700 XT:
both 250-350mm² 60-class dies with a 256-bit 70-class bus.
It's just too bad the 5700 XT was a generation and a half late.
AMD's only direct competitor to the ~300mm² GTX 1080 was so bad that they didn't even launch it; instead they only launched the ~500mm² 80-class die with an HBM2 Titan-class memory bus as the Vega 64. AMD didn't release a ~300mm² 60-class card because this 500mm²+ card and its cut-down partner were so bad they couldn't properly compete with the ~300mm² 60-class die of the GTX 1080.
Oh, and AMD never launched a competitor to the 1080 Ti in the RX 5000 series, with 60-class being their highest tier, not releasing a ~400mm² 70-class or ~500mm² 80-class (like the 471mm² 1080 Ti).
However, if we had received a ~500mm² RX 5900 XT, the ~500mm² 6900 would have been less compelling, because the 6900 uses the same node but gives up some of that ~500mm² for RT; and to function properly, an 80-100 CU 5900 XT would likely need a 384-bit memory bus, whereas the 6900 XT has only a 256-bit one (remember, the 5900 XT wouldn't be giving up die area to cache and RT, so it has more room for compute units). The 6900 XT may still excel at lower resolutions thanks to the cache, but a theoretical 5900 XT might have beaten the 3090 in traditional raster at 4K.
Yo! I was wondering if you could add The Witcher 3 to your test suite! It would definitely be quite insightful! Thank you.
I never got the dual/single rank thing, not that I researched it too hard. From memory I came to the conclusion that single rank was better for the 5700X; any reason why you mentioned/use dual rank? I have 2x 16GB single-rank Kingston Fury 3200 C16, OC'd to 3600 C16 with 1800 IF ✌
I recently for shits and giggles picked up a TITAN Xp and Radeon VII. I collect cards (albeit mostly pre-2013 stuff). The pair cost me around $400US. A bit of a spendy pickup, but a fair deal for it. Not bad as backup cards either compared to my Radeon RX6800.
I wonder if the 5600X made a big difference to the numbers. It would be interesting to see how different the FPS numbers would be on a 2017-2019 CPU, just for shits and gigs.
Excellent work as always 🤘💯
The Radeon group had no clear leadership and went orthodox on the VII: big chip, lots of power, and lots of memory bandwidth using the latest hardware technologies. Nvidia did what Nvidia does and raised the bar with their latest software technologies. AMD was always behind CUDA in performance per watt, while Nvidia stuck to older process technologies, eking out FPS through driver implementation of all DX12 features, including DX12 Ultimate with the 20+ series. AMD has always suffered from poor driver implementation, but they are getting much better at it now. If you plug in a decent RX 580 8GB now, with fast DDR4 DRAM, you will be astonished compared to how it was in 2018. I have also tried CrossFireX (2x16x) in the games that still support it, and it's a hair slower than an overclocked 2060 Super.
Do you think the RTX 4080 is good? Or should I wait for the Super version?
I got an RTX 4070. While its price to performance isn't great, its overall performance is pretty good. So in terms of raw performance, I think an RTX 4080 is pretty good. That said, I heard rumors that the RTX 4080 Super would see a price drop, so I'd just wait, personally.
@@JMPStart aight, I need a PC for running UE5, so I might need the Super Version. Is it bottlenecked with i7 14700K?
@@JehanPrasetyo.p I'd definitely wait for the Super if you don't need a GPU right now. If you DO need one ASAP, I would spring for either the 4090 for ultimate performance or the 7900 XTX, which is usually better and cheaper.
@@JehanPrasetyo.p That might depend on your resolution: if you're using it at 4K, no; at 1440p, maybe a little, not too sure, although I'm fairly certain the bottleneck would be minimal if it did exist. I have heard of builds that pair the i7 13700K with the RTX 4090, so I think pairing an RTX 4080 Super with an i7 14700K would be good.
It had competition. Not real competition, but Nvidia only made it like this because AMD promised much more performance than Vega delivered.
I would still buy a GTX 1080TI 🔥🔥🔥
I've got an Asus Strix 980 Ti (overclocks very well), and it gets more FPS in most games than my 1070 Ti and my 3050... But due to support it can't play some games. There should be some sort of law against this... Also, I'm wondering how the 7900 XTX is going to age..?
If every 7900 XTX owner donated $10 to NimeZ for his exceptionally hard work, I think FineWine would pop its head up once again.
It'll age very nicely. You're talking about a card that, with an undervolt and slight overclock, reaches 4090 performance in 90% of games, with more tuning potential from what some YouTube channels have shown. Driver updates and open-source FSR3 with frame gen will make for a long-running graphics card.
The 7900 XTX is out of my price range, as is the 7900 XT, but with a little luck, after the 40 Super series comes out, maybe AMD will drop a 7850 XT or 7800 XTX and price it at $575-$625 with performance very close to the 7900 XT.
Maxwell lacks FP16 hardware, so there's a penalty in some games; Pascal added it, but Turing has it at 1:1 vs FP32. Today it's Intel Alchemist that lacks FP64 hardware and needs to emulate it with tensor cores.
What do you mean by "There should be some sort of law against this"??? Technology moves on, and yet you dare say that? Hilarious.
I wanna go back in time and tell myself: "Boy, just buy a 1080 Ti, enjoy the last flagship with a price that won't shame your ancestors."
Thought this video released 3 years ago. It was released 3 hours ago turns out.
Memory is the ultimate predictor of a card's longevity; that's why all Vega cards are still decent. Heck, the 1060 would still be a decent used card like the RX 570/580 8GB, had Nvidia not made it 3/6GB and started their trend of selling already-obsolete cards.
You're completely biased towards AMD and it shows. When the GTX beats it with an enormous lead it's "a couple of frames better", but when AMD is 6 frames better it's "a ginormous difference"? Wtf. And since when are 1.9 frames "a couple of frames"? In favor of AMD, of course. If it were the other way around, you would say they're almost identical, of course. AND to correct you once again, you're the one with "brand preferences".
1.9 frames are almost “2” frames. 2 is a couple. A couple of socks is 2 socks, a couple like your parents is a couple of two people. He’s not wrong when he says that 1.9 (when rounding up to 2) frames is a couple of frames. I think you are being nitpicky.
@@Djuncle No. Because the way he talks up AMD clearly makes the average user listening think worse of the nvidia card.
How do you get MSI AB to show the 0.1% lows? When I add it to the OSD nothing happens; I can see everything else but not the 0.1% lows.
Now let's go to 2023 releases, which gobble up VRAM like there's no tomorrow, and 16GB has now become the standard.
That Radeon VII is hitting 100C frequently. That's insane lol.
I always wished the Radeon VII was a better card than it turned out to be. It's a fantastic-looking card and a wonderful card for Hackintoshing.
Did he use official AMD drivers? I think for a legacy card such as the Radeon VII, the NimeZ drivers are a no-brainer.
The main observable difference between these cards is survivability. The 1080Ti can usually be revived from any serious failure. Once a Radeon VII dies, it's DEAD dead.
I wish I could still enjoy playing games at either that low a res, those quality settings, or low fps.
One of my favorite channels making a video on one of my favorite cards! Can't wait to watch it!
radeon vii ultra ti agp
radeon vii ultra ti agp
radeon vii ultra ti agp
@@JoseLucasd what is wrong with you guys?
this man makes banger content
“The multi-billion dollar underdogs” 😂
The first year of my Radeon VII I made $6000 in profit... the second year $7500. I still have it today for all my editing needs, but I stopped mining with it. It was and still is the best mining card for the $$$.
This was a very interesting video, thanks.
Still... on paper it looks like the Radeon VII should be much more potent than the 1080 Ti with its 16GB of HBM2. Why is it only neck and neck with the Nvidia card, when speed and quantity of VRAM have become a major performance aspect?
All else aside, I love the look of that AMD card. Well, I should say: I'm strongly averse to making components "cool", giving them RGB lights and putting windows on computer cases. So somehow this design of just a BLOCK kind of intrigues me :D
I like the Loki imagery
Even now I'm still trying to find a 1080 Ti Founders Edition, but those marketplace noobs tell me it's already peaced out.