Grab a GN Modmat for PC building, toolkit, mouse mat, and more while supporting our testing! store.gamersnexus.net/ This month has been crazy non-stop testing and reviews, and we're still not done yet! Plus, more coming in November.
@@Rov-Nihil true. They are worthless unless you weigh less than 150 pounds and you are very small. I got one for testing purposes and it’s 12in wide, I fit and it’s cool but not realistic for regular Americans and not safe for the road lol
Love love love the human element here--showing us during the review how your process works, and showing some of the other folks behind the scenes. Awesome way to have taken the "show don't tell" feedback from fans to heart!
I love so many things about this video, 1. that you looked at the thermals, 2. that you investigated the transients, and 3. the "important reminder" that you don't necessarily need this. I appreciate the benchmarks as well, but everyone has benchmarks. Thanks for all you guys do! See you in the next one.
Guys, on behalf of many gamers out there, I just wanna say thank you. I appreciate you taking the time to make all these tests. They are really helpful & I rarely buy core hardware components without first checking your reviews.
I would love to see some kind of VR test since games that are CPU limited can become GPU limited when playing in VR, especially when supersampling. RTX 40 series might have more or less value to VR users playing CPU limited games such as MFS2020 which Nvidia used in their promotion of DLSS 3.0
It is the little things in GN videos that make them among the best. Steve pushing the gaming chair away and sitting in the folding metal chair was great. Such a brutal take down of a chair.
I completely agree with this sentiment those types of chairs are so friggen narrow that it's just impossible to sit in them comfortably. Sitting in them feels like you're being squeezed into a sardine can, they're god awful.
They don't, but when Steve said near the end that we'll have to see the 4080 to know if it's good value, I thought that was kind of a silly comment. Everyone knows the 4080 16GB is a massive price hike over the 3080 FE, 70% (or 95% in Europe) for maybe a 50% performance increase. That's letting Nvidia slide on something ludicrous.
I love the transparency and amount of detail and care that these guys put into their videos. Amazing standards for current times.
Check out Der8auer's channel too, he has some interesting observations to limit power draw by a third(!) and still lose only a couple of percent of performance.
See, that's my thing... I don't want a damn space heater. Summer gets to 110°F, sometimes more, here in Oklahoma. My 1080 draws 180w and it puts off a negligible amount of heat.
I think EVGA was spot on. With the 3090 Ti and 4090 being released this close together, the 3090 Tis effectively became lemon products that they were losing money on heavily. Rather than staying in the abusive relationship and potentially recouping their losses with the 4090 series, they cut their losses. Very sad to lose EVGA in the market though; they set the standard in quality and service for video cards.
Considering how well these cards did on thermals it wouldn't surprise me if they poached EVGA's best engineers for it effectively sabotaging the company.
3dfx bought STB in '98 or '99 to manufacture and sell its own cards, and look what happened. I'm not suggesting anything, but CEO hubris can really fuck up a company. Intel is worse than Mr. Magoo, AMD almost went bankrupt just a few years ago, and small biz/startups just get phagocytized by big corps.
Love the effort you guys put into your reviews, probably the most detailed 4090 review out there! Also love the shade thrown at the “gaming chair” 5:35
Glad you guys are putting so much attention to detail in the makeup of the thermals. I was pretty pissed when i learned that my 3090 mem was hitting 104c and it wasn't shown on the nvidia overlay. Luckily replacing the pads dropped temps to 80c, but after this lesson i vowed to not make any more gpu purchases without a better understanding of the power and thermals. Consumers shouldn't have to jump through hoops to fix poor engineering (or $ corner cutting) on products especially expensive premium products.
@@jermasus I know you were joking, but that WAS the case with some AIBs. The 3090 has VRAM on the back of the board as well as the front. Every manufacturer thought they could get away with the same bottom-of-the-barrel thermal pads they used with GDDR6, but GDDR6X gets way hotter. Some AIBs straight up didn't put thermal pads on the VRAM on the back of the board. I did a thermal pad swap on my Zotac 3080 AMP Holo (the only card I could get "retail" during the pandemic), and my VRAM temps went from 106C down to 72C. There is absolutely no reason they should've cheaped out that much on a card that cost $1,000 directly from the manufacturer.
Same with my 3080: had to buy thermal pads and slap them on to drop like 20 degrees, and repaste the GPU. Considering how cheap thermal pads are for manufacturers to buy en masse, cheaping out on them for a few $$ was pitiful.
The context on the FPS charts in terms of the exact model of GPU used _on this date_ is actually really helpful in these types of reviews, I feel like a lot of people forget how much perf can change per driver update and vendor to vendor
I'm enjoying the format of this review. The choice of sets is excellent. The tests looping in the background are visually interesting and it shows the crew is definitely putting extra thought into the visuals with each new video.
Not finished with the video yet but you guys are doing an absolute insane amount of work in order to give us some info about how these cards behave. True respect.
@@temperedglass1130 Other people get money for it and do a piss poor job comparatively too, genius. That and they are getting money and not having their stock handed to them with the caveat they kiss their ass.
@@temperedglass1130 dude if that’s your reaction to someone saying they’re putting out a good project you must be the most miserable sun deprived basement dwelling girl repelling troll of a human. Like and subscribe
Went with a 4090 Founders for MSRP in the UK for £1599. This card will last me a LONG time. Coming from a 3070 the performance uplift at 4k 120hz is insane.
I got one too, but I'm kind of worried about even keeping it. I thought the melting issues were put to rest but it looks like people are dealing with it. How are things holding up for you?
I think it will be interesting to see how the 4080s and especially the new AMD cards compete against that beast. We just have to wait a bit, and we have to hope that the GN team will withstand those long, long testing runs (especially with Steve and his unlimited rage in the background :D). Thank you guys for your work!
The 4080 16 GB has 59.375% of the shading units that a 4090 has, and the 4080 12GB has a mere 46.875%. Extrapolating those figures onto TechPowerUp's relative performance 4K resolution graph, that puts the 4080 16GB a bit below the 3080 Ti and the 4080 12GB a bit above the 3070 Ti. That's disconcerting given the actual 4080 12GB will almost certainly not scale as well, due to the large difference in memory bus between the 4080 12GB and the 4090. So in fact we might see a 4080 12GB that performs on par with a 3070 (or potentially worse, assuming the dip to a 192-bit memory bus only drops performance a few percentage points). I'm not sure I see a universe where the 4080 12GB makes any sense anywhere near $900 USD. Even if you assume that the 4090 is being held back somewhat at 4K by current CPUs, it's not being held back to the extent where the extremely cut-down 4080 12GB is going to find 50% extra performance to justify its insane price tag.
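For what it's worth, the back-of-the-envelope math in that comment can be sketched out. The shader counts match public spec listings; the linear-scaling assumption is the commenter's, and as they note it's really an upper bound:

```python
# Shader-count extrapolation from the comment above. Assumes performance
# scales linearly with shading units, which ignores clocks, memory
# bandwidth, and cache differences -- treat the results as upper bounds.
SHADERS_4090 = 16384

cards = {
    "RTX 4080 16GB": 9728,
    "RTX 4080 12GB": 7680,
}

for name, shaders in cards.items():
    frac = shaders / SHADERS_4090
    print(f"{name}: {frac:.3%} of the 4090's shading units")
# RTX 4080 16GB: 59.375% of the 4090's shading units
# RTX 4080 12GB: 46.875% of the 4090's shading units
```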
@@giglioflex Until we see the actual released figures, I'm going to take that with a heavy grain of salt. There were a lot of comments with a similar comparative outlook on the 30 series, and they all turned out to be rumor and incorrect, mainly because the raw numbers of a particular segment weren't a proper way to suss out the methodology of the new series. Unfortunately, availability was the bottleneck then, and because of that, price. As someone who is on the fence, both on AMD and Nvidia, I have to see how AMD compares in performance, and then again with the 4080. That will determine whether I go ahead and buy the 4090 because I'm unhappy with the perf of the -80 variant or the upcoming AMD cards.
@Lurch I'm not seeing it. Not sure what you're overreacting to with regard to the GPUs. Yeah, they use a lot of power, but if you're in the US, that's not much of an issue at the national average of $0.13-0.14 per kWh.
Thanks Steve. I love watching your channel grow over the years and it is to a point where I honestly have no other questions beyond what you are testing. Extremely informative. Long live GN
Can you do power spike testing at a 70% power target? It would also be great if you could confirm Roman's findings about the huge increase in efficiency at these power levels; he saw anything between -2 to -5% performance at -20 to -40% power draw. Maybe the power spikes go down, maybe they go up, or maybe they stay proportionally similar, just 20-40% lower. Would be really interesting.
I think VR performance should also be tested now. Even though there is a fairly small audience for it, this card can allow VR gamers to push the resolution scale.
In general it's also just a weird display, so I never know how to compare the benchmarks of other games against it. Would love a few VR game charts so I can draw a direct conclusion.
@@astraxgt You can't compare benchmarks of other games against it in VR; it's much more crucial to maintain the framerate that matches your headset, otherwise you get reprojection, and it's very, very ugly. All that matters in VR is: what framerate can I reliably hit, at what settings, and at what resolution? Resolution is *always* scaled up in VR beyond the native headset resolution, due to the way the lenses work... of course past a certain point it becomes supersampling, which is still good, if you can power it.
@@astraxgt On the Quest 2, the only way to get 1-to-1 pixels is a resolution scale of 1.7x, which roughly translates to 8K rendering. The 4090 can meet that requirement, as seen in these benchmarks.
Yes a VR section would be great. Even if it's just VR benchmarks and not actual game benchmarks. But if they do games I would really like an rFactor 2 VR benchmark. Same for CPU reviews
Your shop looks great. I especially enjoyed how Steve shoved that gaming chair out of the way, preferring to sit in a metal folding chair, which really reinforces how much he hates gaming chairs. Love your channel, thanks for the early 4090 coverage.
Jesus Christ haha, had to scroll over 30 comments just to find one on the card itself and not sucking GN off. Like I appreciate their hard work too, but is no one else seeing the legendary generational performance leap here?
@@theholypopechodeii4367 You can be unable to afford a product and still appreciate its achievements and breakthroughs. No one looks at a Ferrari and gets mad that it's 4 million...
I'd really like to see VR tests for these. Assetto Corsa Competizione needs all the power it can get, and VR really makes use of everything these cards have to offer. The resolution often nears 4K, if not exceeds it, with supersampling.
Yep I figured these would be the first cards that make serious VR games possible. I mean you can do some insane shit with these cards even at high resolutions.
4090 aside... thanks for the behind-the-scenes. I really value your thorough testing methodology and no-BS approach. GN is easily my go-to for looking up reviews before an upcoming purchase decision.
I really hope RX7000 series aren't a disappointment... I mean, the performance of the 4090 is truly impressive and they are turning those watts into FPS, but it's too high in power and price for me to upgrade just yet. Cool to see the reference card cooling properly though, that's a welcome change.
Coming from someone who far far prefers Nvidia.... I agree. The performance is impressive, but honestly, at that price, power consumption, and size it BETTER be. I am going to be skipping these unless they manage to slim them down, they are just too expensive, power hungry, and huge.
Anyone buying AMD GPUs is stupid at this point. They don't have a good product, the support is shit, and you would need to be unaware that nvidia exists to get one.
@@Cyber_Akuma Do you live in a third world country where just a tiny bit of power like that is actually costly? Most of the world that can afford these GPUs hardly pay a few cents for the monthly use at full power 24/7 for these systems. Either you are not aware that energy for this sort of thing doesn't actually cost you anything, or you may live in a place where buying luxury toys is stupid.
@@sqlevolicious If 1 kWh costs about $0.20, and your system draws 1000W (about 1150W with 85+% efficiency), you're looking at ~$169 a month with 24/7 usage. Run it for a year, and it adds up to more than the cost of the card.... Similarly, a 700W system (probably a higher-end 4080 12GB), would cost $24 a month running just 5 hours a day. It's a significant amount when it accumulates...
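That commenter's arithmetic checks out; here's a quick sketch of it, using the same assumptions they state ($0.20/kWh, ~85% PSU efficiency, and a 30-day month):

```python
# Back-of-envelope monthly electricity cost, per the comment above.
# Wall draw = DC draw / PSU efficiency; cost = kWh consumed * price.
def monthly_cost_usd(dc_watts, hours_per_day, usd_per_kwh=0.20, psu_eff=0.85):
    wall_kw = dc_watts / psu_eff / 1000
    return wall_kw * hours_per_day * 30 * usd_per_kwh

print(round(monthly_cost_usd(1000, 24), 2))  # 169.41 -- 24/7 at 1000 W, matches the "~$169"
print(round(monthly_cost_usd(700, 5), 2))    # 24.71  -- 5 h/day at 700 W, matches the "~$24"
```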
Steve, your content keeps getting better and better. You are a reference in this field, and I want to thank you for being always on our side. Sending a virtual hug!
I'm surprised we got a performance jump this big, but the price followed as well. With that said, I'm waiting to see what AMD has done with RDNA 3, as it's had a ~50% uplift as well. This may also indicate we'll get a new console mid-gen upgrade again.
@Lurch You have power and frame rate tests; you can make an estimation of efficiency on your own. It's easy to see that it has massively improved over the 30 series, at stock anyway.
The thermals are surprisingly good. However, it would be interesting to see the thermals on this card when it is inside a case with a 13900k or 7900x on Steve's desk.
AIR. I built my first PC 22 years ago. Back then it was all about getting as much air as possible on the GPU. Then it took a backseat to aesthetics. Now we are back to air. I ripped the foam filter out of my Meshify C, ditched the AIO with the radiator in front, and bought a good air cooler. I feel like I've come full circle. It's so much cooler and surprisingly the dust isn't much of an issue. I'm guessing I could handle that 4090 if it fits. But honestly, my 2070 super is still just fine. I'm too old to be caring about chasing crazy FPS. I spent way too much time caring about everything other than enjoying games.
@@RJT80 - You're bringing back memories. I built my first PC 30 years ago, and liquid cooling wasn't even a thing. It's not like you needed anything but a system fan to cool your graphics card, because SVGA didn't exactly draw a lot of power. Air is actually the best option, as long as you have proper ventilation. As a network engineer, I've spec'd many a system, and everything is air cooled for a reason. I only know of one liquid cooling solution, and that's actually a new approach where you immerse the components in the liquid. It's known as liquid immersion cooling. It looks interesting, but it would take years to implement, as we'd have to re-engineer all our server farms. I hear you about chasing fps. Like you, I'm too old to worry about competitive shooters. I like my solo games, and they are fine at 60fps.
I agree, however, I would imagine that they still run cooler than the 3080 or 3090. I think Nvidia realised that they pushed it to the limit with the 3090, and if they wanted to actually increase performance significantly, they were going to have to cool the cards better. It's 100% why the cards are so extremely thick now, if your previous cards are already running at 110 degrees and you want them to pull another 100 watts, you're going to have to upgrade the cooling.
@@cummerou1 - I definitely think the cards are cooled better, as this video shows. I'm just curious how that translates when used in a non-testbench environment. We all know that some cases are absolute garbage for removing heat, so it would be interesting to see what the minimum case and cooling you'd need.
The 4090 is a beast. I'm surprised how much better it is than the 3090 Ti. I personally game on a 1440p monitor, and most likely won't be needing something this power hungry. Very impressive 4K results though; I honestly thought the card wouldn't perform this well at 4K.
Had a 3080 Ti build for 21:9 1440p and was disappointed with the output... This card, however, seems meant for 21:9 1440p, since in a lot of cases it's maxing games, and in a lot of cases it's not. If you go any lower than this card, every new AAA that comes out will make your PC suffer.
Thanks for the best review work in the world! Pretty much exactly double the performance of a 3080 in 4k, with and without RT. Considering the 4090 costs much more than double than the 3080 at msrp, it's looking to be bad value. This said, the 4080 will not be double the performance of a 3080, and is still double the msrp (here in Finland, 750€ vs 1500€). The 4080-4070 bs edition is even closer to 3080, I'm sure, maybe 20% better, and still costs much more. Nah, I'm good with my 3080 for now. Gonna skip this gen, or get an AMD RDNA3 card if they offer something more competitive with better pricing.
I just bought a 2nd-hand EVGA 3080 with 2 years of warranty remaining for 500 EUR. So the 4090 is 4x more expensive in Europe while being only 89% faster at 4K. 3090s are going for 700 EUR, so that's a 68% perf uplift for almost 3x the price. At 2K EUR, the RTX 4090 is a downright scam in Europe. The 4080s are even worse, just laughable; you have to be crazy to buy one of those.
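Tabulating that comment's own numbers makes the value argument concrete. Prices are the commenter's used-market figures; the 3090's relative performance is back-computed from their stated 68% uplift, so it's an assumption, not measured data:

```python
# Perf-per-euro from the comment's figures. Perf is relative to a used
# 3080 = 1.00; the 4090's 1.89 comes from the quoted "89% faster at 4K".
cards = {
    "used RTX 3080": (500, 1.00),
    "used RTX 3090": (700, 1.89 / 1.68),  # implied by the 68%-uplift claim
    "RTX 4090":      (2000, 1.89),
}

base_price = cards["used RTX 3080"][0]
for name, (price_eur, perf) in cards.items():
    value = perf / (price_eur / base_price)
    print(f"{name}: {value:.2f}x perf-per-euro vs a used 3080")
```

By this napkin math the 4090 delivers roughly half the performance per euro of either used card, which is the comment's point.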
Price completely makes it non impressive. It's inflated on top of the previous, already inflated, prices. It could be even faster and still not justify a 1600 USD tag...
Dude, it would take 30 seconds to add figures for creatives, Blender rendering benchmarks, etc. You guys go so in depth, way more in depth than just gamers appreciate.
Wish that we still had 1080ti benchmark data on this graph. I know a lot of people that are holding on to them until they see a big enough performance to price delta to buy and it'd be very nice to have that still as a point of comparison.
I'm guessing that's why he made the off-hand comment about "1080ti notwithstanding" when talking about normal vs ti cards and how it's "a mistake they'll never make again?" I've been out of the loop on GPUs for a few years now, mostly because I've seen no reason to upgrade from my 1080ti.
@@JesseBayne I finally upgraded my 1080ti to a 4090 after 4 years (I got it around September 2018). Absolutely worth it for my 1440p Ultrawide and livestreaming
It would be nice to have h264/h265 video editing performance included in the standard benchmarks for GPU's. 3d compute/rendering benchmarks are also good but having a video editing(playback/export) benchmark would be a good addition to standard GPU reviews.
I'm still bitter about them branding a 4070 as a 4080 to try to force people to buy up the remaining 3000 series stock. If RDNA3 is even remotely competitive with these, I'm going to make the switch. It shouldn't cost this much for a graphics card.
Thanks for all the hard work Steve and crew! Love knowing us consumers have got you in our corner and your no-BS attitude is a welcome change to other media outlets
Mad respect for the time and depth of the reviews on GN. Basically an only fps benchmark video would do just as well in views, so the extra effort isn't needed for views but much appreciated by me and, most likely, the rest of the viewers 👍👍
I recently (half a month ago) was lucky enough to have bought a RTX 3090Ti Suprim X for $979.99 when I was browsing around Amazon, glad I was able to snag it then as the prices have hiked back up. I’ll be fine for another several years, but man that performance uplift is insane!
Nice. I was lucky enough to buy an open-box 3090 Ti Suprim X for $900 on eBay the day before the 4090 announcement. Put a waterblock on it and it's been great. Really do not need anything more. I just hope new games will not be too demanding and force me to upgrade.
There was a lot of talk about the 4090 doing poorly at just rasterization, since everyone’s early tests were restricted to DLSS on. But seeing these benchmarks… this is wild!
The only thing that IMO is missing from this review is lack of VR gaming performance charts. RTX4090 might be one of the first GPUs that are powerful enough to allow for comfortable usage of those with most of the available titles :)
An overclocked 4090 at 666.6 watts is eye-watering! I can't imagine the heat that's going to come out of the poor PCs that are going to be running this.
@@JohnDoeC78 Until summer hits and you air-fry. Might be useful for exploring Antarctica, with the downside of accelerating sea level rise as you melt the ancient ice around you and potentially unleash prehistoric viruses to ravage mankind for its endless selfishness.
Have you tested undervolting performance? The 3080/3090 were very good undervolters. I was able to save about 100W of power draw on my 3090 at only a 3% performance hit.
Thanks to this release, today I was able to snatch a 3090 Ti for $900 with taxes. Good enough for me. I still have a DDR4 build and don't plan on going DDR5 anytime soon. Good luck all, and absolutely killer video. I'd stop by and congratulate the whole team myself if I could. Bravo.
??? Why? The Radeon Technology Group does not benefit from Nvidia's R&D. I know, "competition is good" etc but just because one company does well doesn't mean another is going to start doing well, especially when it comes to AMD GPUs.
@@BreadDestroyer Because leaks were more focused on AMD's improvement than Nvidia's. So if the leakers are at all trustworthy, the situation will be very interesting. Why? Look at the energy consumption on the chart.
@@BreadDestroyer Because NVIDIA rarely makes decisions in a vacuum. Last time AMD made large gains with their architecture NVIDIA released the 1080Ti. One could argue they have a better grip on what AMD is doing than anyone else and what needs to be done to out perform them.
@@BreadDestroyer They make things relative to what they think the competition is doing. So the 7900XT will be made with what they think the 4090 will be performing at and whether it can exceed it or match it.
@@ladrok97 As a rule I don't trust leakers, because you have like a 40% chance of any given leak being reliable, but even setting that aside, I still don't see why you'd be hyped for AMD's GPUs when 1) They've been pretty uniformly terrible for the past decade or so, with even the "good" releases falling short of Nvidia and/or being plagued by driver issues and 2) By your own logic Nvidia have pre-emptively responded and pre-emptively checkmated them. They can go back to competing on price, I guess, but "get a worse GPU for cheaper" doesn't excite me. That's pretty much always an option by getting last-gen hardware anyway.
I think a better way to measure power spikes is the area given by the height and duration of the spike. The briefest peak of the spike is not that relevant, because even the wires have enough inductance to absorb it. Similar to how Hardware Unboxed calculates cumulative deviation in monitor pixel transitions: it's not just the overshoot, but also the duration.
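The idea above is easy to sketch: integrate the excess power above a baseline over the spike's duration, rather than reporting only the peak. The sample numbers below are made up for illustration; real data would come from an oscilloscope capture like GN's:

```python
# Area-under-the-spike metric: total energy delivered above baseline,
# in microjoules (W * us = uJ), for power samples taken every dt_us.
def spike_energy_uj(samples_w, baseline_w, dt_us=1.0):
    return sum(max(0.0, p - baseline_w) for p in samples_w) * dt_us

BASELINE = 450.0
narrow = [450.0] * 10 + [900.0] * 2 + [450.0] * 10   # tall spike, 2 us wide
wide   = [450.0] * 10 + [600.0] * 20 + [450.0] * 10  # lower spike, 20 us wide

print(spike_energy_uj(narrow, BASELINE))  # 900.0 uJ -- higher peak...
print(spike_energy_uj(wide, BASELINE))    # 3000.0 uJ -- ...but less total energy
```

A peak-only metric would flag the 900 W transient as the worse one, while the area metric shows the longer 600 W excursion actually dumps more energy into the PSU, which is the comment's point about duration mattering.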
If you can get a hold of the Galax 4090 with 4 fans, I'd be interested in seeing how much adding a fan to the back of the card actually helps with GPU cooling (if at all) over more traditional 2 and 3 fan designs.
I understand the need for a controlled environment, but maybe do some additional testing in the future. Testing in a hotter environment with a good airflow case would represent a real-world scenario for a lot of gamers. I live in the barracks; the barracks are temperature controlled, but my room is not. I'll often find my room hitting 80-90 degrees Fahrenheit simply because of my 1080 Ti and 5950X in a Meshify 2 XL with every fan slot occupied. Edit: This will slowly degrade the effectiveness of my cooling, and I'd like to see a video done on that.
Really appreciate the hard work guys and thermal testing you did, giving us a bit more info than just some game charts .. looking forward for more 4090 videos :)
The sheer amount of transparency with the testing methodology is nothing short of bloody brilliant, Steve and team. Y'all have knocked it out of the park. As much as I'd love to pull the trigger on the 4090, I'm going to keep my artefacting MSI GTX 1080 FE limping along until the RDNA3 announcement and reviews before I make my decision. I'm lucky that I can justify the money on a 4090, but patience is a virtue. From Australia, kudos to you and the team, mate. Y'all deserve a good rest after all this.
Yep, same. My buddy and I were actually just talking about doing the same. My 1070 Ti has started showing signs of wear. I'm holding out for RDNA3 as well before pulling the trigger on a new card.
@@TimothyStovall108 I'm rocking a regular 1070 & recently picked up CoD: Vanguard again (hadn't played it since launch). Only getting about 30-35 fps, dropping to 20-25 when I start shooting :(
I'm curious, why not save a bit and buy a 3090/Ti? Sure the performance is great, but a whole build can be had for $1600 with a 3080 Ti/3090. To each their own (I'll buy a 4090/RDNA3 eventually).
@@PDXCustomPCS Eh, I bought a 3080 12GB back in June, but it wasn't the uplift I was hoping for in the games I play, as I was still GPU bound, so I sold it within the week, for what I paid, to a friend. I thought about snagging a 3090 Ti, but decided to wait a few months for the new cards.
16:57 that is amazing editing, scripting and delivery 🤣 I found the 3090 FE have great thermals. Better than partners considering it ejected a lot of the heat out of the case and did so in a smaller package than partners. 3090 partner cards were insanely big but nothing like these 4090s.
I ended up replacing the paste and pads in my 3090 FE and ended up getting even better thermals, I would be interested to see what happens in the 4090 doing the same.
10:15 This was always going to be the way that it was going to go. The 30 series FE cards were very well built and had a great design and they've iterated it better. EVGA knew the score here. Eventually Nvidia is just going to eat more and more of the card market share.
I think this pretty much secured that I'm going to buy the FE card once again this time around. Looks good, well cooled, and a +33% power limit from the looks of it makes partner boards obsolete.
I was thinking of getting a Strix OC, but now that I've seen it at £2.4k it might not be worth the significant price premium over the FE, as I'll be water cooling anyway.
The insane raster performance is gonna make this a VR beast, with the huge resolutions and supersampling required/desired for every increasingly bigger (resolution-wise) vr displays. Exciting :D
@@alexoelkers2292 This isn't remotely true. It might be in the sense that there are a lot of VR games that intentionally have potato graphics, but the most interesting ones are often those that don't: Flight Simulator, racing sims, Elite Dangerous, VRChat, etc. I have almost never played a PCVR title where I didn't wish I had more horsepower to crank up graphics, resolution, supersampling, frame rate, etc. And that's before you figure in all the mods that are coming as people port Unreal 4 games...
After this review, the prices went through the roof in the Netherlands. The official stores are selling this card for €2,500-€2,900... stores are the new scalpers.
Thank you for mentioning the jump over the 3090, because a generational uplift is measured against the SAME product in the stack. To me it's looking like 75-100% in general, which is mind-blowing for a single gen, but it's what was predicted with Nvidia moving off a slower node to the best node in the world, TSMC N4. And this also drives cost. I know this comment will get hate, but this is a much better deal than last gen's 3090 Ti, which for most of its life was selling around this price range or even higher.

To me this is the first REAL 4K gaming GPU, where you don't have to turn a lot of settings down and it can still pump out the fps. I'm waiting for the following generation though, because I want this performance at lower power consumption, which is what should happen with the 5000 series and RDNA 4.

So, a point of contention with your buying advice: this is a HELL of a 4K gaming GPU, which is the only reason people should be buying it anyway, since the 6900 XT, 3090, and 3090 Ti are already premiere 1440p gaming GPUs. This is something you pair with a big shiny 4K OLED TV with a PC input made for gaming, to enjoy the splendor of what modern hardware can give. In which case you DO want RT most of the time. Also, if I'm buying this I'm not pairing it with anything except Zen 4 V-Cache, where I think some of those CPU limitations will disappear.
No hate needed dude, good comment. I agree with you, this thing is a 4K beast and is made specifically for it. I have a 3080 and I'm commonly seeing double the performance which is insane. I was hoping that the 4090 would provide 144fps at 4k and my eyes were literally watering watching this review haha, the uplift is amazing.
One question - how is the coil whine on the RTX4090? The RTX 3090 and 3080Ti had quite a bit of coil whine and it wasn't just limited to certain brands - it was happening on all brands and was significantly louder than 2000 series cards.
@@chltmdwp 30 series cards i've tested and found to have coil whine: RTX 3090 FE, Asus Strix RTX 3090 OC, Gigabyte RTX 3080Ti Eagle OC, MSI RTX 3080Ti Suprim X, Asus Strix RTX 3080Ti, Asus TUF RTX 3080Ti OC, RTX 3080 FE, MSI RTX 3080Ti Ventus 3X, Asus Strix 3070, RTX 3070 FE. I build a lot of rigs for clients.
Hey Steve and team! I know it probably won't get seen, but I would absolutely *kill* for VR benchmarking from you all. I know you have plenty else in the works so it's unlikely to happen, but I have seen such drastic differences both on software choice as well as hardware, that it's a daunting use case to purchase for without any in-depth reliable information from common media outlets. Thanks for all your work!
Well worth is subjective so no one will be able to answer that question for you, only for them. I’d say none, since a 3080 can run any game I please at satisfying settings at 1440p to me. Now if I were to move to 4K, I would want something more akin to this for really any demanding AAA game. The 4090, in my opinion, is the first real 4k card as the 3080/3090 were for 1440p.
there's nobody I would trust more with this information than Steve Gamer Nexus, thanks for the hard work and dedication to integrity you guys rock! I would also love to see a sorta dumbed down 'how'd they pull it off" video where Steve explains how they managed such a performance leap to me like I'm an idiot (I am)
@@EmpanadaDeCaca I apologise; it's just that the phrase "there is nobody I would trust more with [X]" usually has the context of the speaker entrusting [X] to a person associated with it.
@@sigy4ever You entrust something to someone, but trust something someone has or gives, in common cases. In the case of information, it usually inverts the direction the information goes in.
Very thorough review guys. Thanks for the cable clear-up. Can we get some octanebench/redshift/vray testing in the future? GPU path-tracing performance usually jumps generationally by a lot more than gaming performance.
I've only just started trying to get one, since like last Friday. I feel the pain, I can't imagine the frustration for people trying to get one since launch.
@Hollowpoint762 I had one bought from b&h since launch but still hasn't shipped and says backorder. This week is the only time I've been able to add to cart so hopefully it's getting easier.
@@Notnownev from what I can tell, this has been one of the more active drop weeks since launch. Guess I picked a good time to try, now I just need some luck.
@@HollowPoint_762 Oh man, there are so many cards available near where I live (the Gigabyte Gaming OC version). It was months of them selling out immediately when cards came in... now some stores have 10+ in stock. Got mine while I could. Doubt pricing will drop unless sales slow down drastically, but I've seen people scalping the 7900 XTX at prices that will push people up to the 4090, since it's not that much more than scalper pricing on AMD.
I logged in to YouTube today and my feed was inundated with RTX 4090 reviews. I filtered through them all until I found the GN review, and just watched that one... best choice I made today. Not that others don't do a good job, some places do, but I won't get the full no-B.S. information that I get with GN. Simply put, you guys are the best in the business, keep it up! Oh, and no, I am not buying one of these... my 3080 is still overkill for most games at 1440p; this would be literally flushing money down the toilet for me.
Grab a GN Modmat for PC building, toolkit, mouse mat, and more while supporting our testing! store.gamersnexus.net/
This month has been crazy non-stop testing and reviews, and we're still not done yet! Plus, more coming in November.
Thank you and all of the staff there at Gamers Nexus for all of your hard work you put in for us.
I waited 361 days to get my 3080 at MSRP. These jokers aren't gonna get my money until the 4090Ti comes out.
Thanks to you and your team for the hard work; you guys are extremely appreciated in the community.
Thank you Steve!
Would a 7950X have made a difference in CPU bound scenarios?
Dude I love how Steve shoves the gamer chair out of the way to deploy the trusty metal folding chair. 5:35
He has no time for tom foolery
Came here to say this
Glad I wasn't the only one to appreciate that 😂
You mean the disposed Chinese boxer seats that were mass-produced for car enthusiasts, who soon enough realized how worthless those seats are 😹
@@Rov-Nihil true. They are worthless unless you weigh less than 150 pounds and you are very small. I got one for testing purposes and it’s 12in wide, I fit and it’s cool but not realistic for regular Americans and not safe for the road lol
I'm really happy that pure rasterization is such a big improvement, but I still don't think we should just accept $1,600 for a GPU and $900 for a 4070.
I agree so don't buy it...
Here in Germany the fake 4080 12GB MSRP is €1,249... it's getting ridiculous.
This is now the norm for GPU prices, sadly I'd get used to it. It's not going to get lower in price.
Its €2250 for a 4090 in Holland
there will be enough people who buy it anyway.
Love love love the human element here--showing us during the review how your process works, and showing some of the other folks behind the scenes. Awesome way to have taken the "show don't tell" feedback from fans to heart!
I love so many things about this video, 1. that you looked at the thermals, 2. that you investigated the transients, and 3. the "important reminder" that you don't necessarily need this. I appreciate the benchmarks as well, but everyone has benchmarks. Thanks for all you guys do! See you in the next one.
Weirdo
u win the internet sir, gold star and u can squeezy my bum
@@AlphaSnowLion Yeah. That was funny.
@@blokin5039 How is he a weirdo?
Guys, on behalf of many gamers out there, I just wanna say thank you. I appreciate you taking the time to make all these tests. They are really helpful & I rarely buy core hardware components without first checking you reviews.
Bruh it's their job
@@v5k456jh3 Exactly. Also, there are people thanking Nvidia; it makes me laugh. They are not your friend, they don't care about customers.
@@DC3Refom I imagine they do care about their customers. No customers = no business.
@@v5k456jh3 you could have saved yourself the trouble (and the embarrassment) of typing that comment. seriously.
@@asteria9963 Why would I be embarrassed telling the truth? Lol..
Liking that open-plan, seemingly lighter view of the studio.
Me too.
It's the test bench room rather than the A-roll set. Does feel easier on the eyes tbh.
In a different life I think I’d have a lot of fun working there. The testing area reminds me of a startup I worked for in the late 90s. 😃
Much better than the old background imo
They did it so you can see the lights dim when he turns the bench on 😬
This is an IMPRESSIVE benchmarking setup. Wow. Thank you for all you do, Gamers Nexus!
I would love to see some kind of VR test since games that are CPU limited can become GPU limited when playing in VR, especially when supersampling.
RTX 40 series might have more or less value to VR users playing CPU limited games such as MFS2020 which Nvidia used in their promotion of DLSS 3.0
@@Nala_1230 so you don't clap when a professional musician plays well? oh my god, the amount of salt
@13:40 "we only ran this test for 8 hours for transients..." One of the most GN things I've ever heard.
Thanks for the extremely detailed review!!
This channel has such a great production. I'm glad they moved into this new space.
jesus starting to get grey hair. i see a little.
Steve has lately been going Adam Five on Nvidia.
Intel: Thanks Steve! 🤣
@@Joker-no1fz hahaha, a commenter called me "tech gandalf" the other day and I'm here for it
@@GamersNexus Hum, Steve the gray? (still time to transform to "the White" later on if Sauron....I mean NVidia gets mad.)
@@GamersNexus Hey, it was me! I mean, I said: "It made me feel so old to see gray hair on Steve, in a few years he will become Gandalf"
My favourite thing about this is how Steve and the team are so proud of their methodology :)
They have every right to be proud, it's a really sophisticated setup.
@@H3LLGHA5T Agreed 100% I'm glad they're showing it off
Linus gonna be like: "write that down, write that down!!!"
THE SHADE THAT WAS THROWN ALL OVER THAT "GAMING" CHAIR HOLY SHIT THAT WAS PRIME GN COMEDY
They are very thorough :)
It is the little things in GN videos that make them among the best. Steve pushing the gaming chair away and sitting in the folding metal chair was great. Such a brutal take down of a chair.
Hehe aye!
Someday, there will be a ChairsNexus, and we will raise our voices in unison and say, "thank you papa"
I completely agree with this sentiment those types of chairs are so friggen narrow that it's just impossible to sit in them comfortably. Sitting in them feels like you're being squeezed into a sardine can, they're god awful.
I didn't catch that at first. After rewatching that bit, I completely agree. Great not so subtle burn.
This is literally The Ultimate Nvidia 4090 Guide. Such high-quality testing. Other YouTubers should learn from this.
truly an undisputed title for this video indeed
I have a Palit Omniblack 4090 and I have it set at 2650MHz @ 0.95V. It pulls under 300W. The 4090 is so versatile, and not power-hungry at all.
I love your reviews. You don't drink the marketing kool-aid. You don't go into crazy hyperbole. Just the data and evaluation of the product. Thanks.
I see that you are new here. Welcome. Behold, the truth and only truth, and sarcasm, and puns, and nerd stuff.
@@BIOSHOCKFOXX and tech Jesus.
They don't, but when Steve said near the end that we'll have to see about the 4080 to know if it's good value, I thought that was kind of a silly comment. Everyone knows the 4080 16GB is a massive price hike over the 3080 FE (70%, or 95% in Europe) for a 50% performance increase, and letting Nvidia slide on that is ludicrous.
I love the transparency and amount of detail and care that these guys put into their videos. Amazing standards for current times.
Facts...they love what they do and it shows
Not just current times.
The best review of the 4090 I've seen today by far: zero noise, just useful info. Once again you have raised the bar higher, congrats!!
LTT was great too, new labs did great testing
Check out Der8auer's channel too, he has some interesting observations to limit power draw by a third(!) and still lose only a couple of percent of performance.
The killer app is that the 4090 can double as a space heater for this winter
😂🤣😂
I plan to use mine as a convection oven for cooking.
Just close the room and leave the benchmark running. lol
@@alanwatts8239 😂😂🤣🤣🤣
See, that's my thing... I don't want a damn space heater. Summer gets to 110°F, sometimes more, here in Oklahoma. My 1080 draws 180w and it puts off a negligible amount of heat.
The walk around was sick. Loved to see the studio and on going processes
I think EVGA was spot on with not bothering. The 3090 Ti and 4090 being released this close together effectively made the 3090 Tis lemon products that they ended up losing money on heavily; rather than staying in the abusive relationship and potentially recouping their losses with the 4090 series, they cut their losses. Very sad to lose EVGA in the market though, they have set the standard in quality and service for video cards.
Considering how well these cards did on thermals, it wouldn't surprise me if Nvidia poached EVGA's best engineers for it, effectively sabotaging the company.
3dfx bought STB in 98 or 99 to manufacture and sell it's own cards and look what happened. I'm not suggesting anything but CEO hubris can really fuck up a company. Intel is worse than Mr. Magoo, AMD almost went bankrupt just a few years ago and small biz/startups just get phagocytized by big corps.
@@Skelath I guess time will tell, we'll see who switches team 👀👀
@@tankerock I doubt it, AMD burned that bridge in the past.
@@Skelath You doubt your own comment? I was agreeing with you lol 🤣
I really like the increase in quality of your production! Definitely nice to have also comments from your team!
Good job Steve!
Would you slurp him raw?
The amount of work put into the reviews here is something out of the ordinary. Keep it up, guys!
Bot
@@l4k3 I'm sorry I'm not that familiar with the English language. What does "BOT" mean in the context here?
@@rfariavitor Christer is referring to you as if you are a robot.
@@ItIsRecoil Ohhh....I wish I were one. I wouldn't be this tired and it's not even Friday yet. Thanks for the explanation though!
Love the effort you guys put into your reviews, probably the most detailed 4090 review out there!
Also love the shade thrown at the “gaming chair” 5:35
Best part lmao
Metal fold out chairs have better thermal control anyways. Lol
Graphs are really boring, DF at least explains it in laymens terms and has game footage.
@Flare The GPU was still the bottleneck for FF14 at 4K max settings, which showed that the 4090 was 75% faster than the 3090 Ti.
@Flare missing the point. play the game at 4k max settings on that and get back to us
Glad you guys are putting so much attention to detail into the makeup of the thermals. I was pretty pissed when I learned that my 3090's memory was hitting 104°C and it wasn't shown on the Nvidia overlay. Luckily replacing the pads dropped temps to 80°C, but after this lesson I vowed not to make any more GPU purchases without a better understanding of the power and thermals. Consumers shouldn't have to jump through hoops to fix poor engineering (or cost-driven corner cutting) on products, especially expensive premium products.
Yikes 20 degree drop? What did they use to make the pads? Air?
@@jermasus I know you were joking, but that WAS the case with some AIBs. The 3090 has VRAM on the back of the board as well as the front. Every manufacturer thought they could get away with the same bottom-of-the-barrel thermal pads that they used with GDDR6, but GDDR6X gets way hotter. Some AIBs straight up didn't put thermal pads on the VRAM on the back of the board. I did a thermal pad swap on my Zotac 3080 AMP Holo (the only card I could get "retail" during the pandemic), and my VRAM temps went from 106°C down to 72°C. There is absolutely no reason they should've cheaped out that much on a card that cost $1,000 directly from the manufacturer.
Same with my 3080: I had to buy thermal pads and slap them on to drop like 20 degrees, and repaste the GPU. Considering how cheap thermal pads are for manufacturers to buy en masse, cheaping out on them for a few dollars was pitiful.
Was this a problem with the EVGA models? I picked up a 3080Ti from them and this is the first I’ve heard of this issue
Mine was an FE. Not sure if any board partners had these issues, but it is notorious on 3080 and 3090 founder editions...
The context on the FPS charts in terms of the exact model of GPU used _on this date_ is actually really helpful in these types of reviews, I feel like a lot of people forget how much perf can change per driver update and vendor to vendor
I'm enjoying the format of this review. The choice of sets is excellent. The tests looping in the background are visually interesting and it shows the crew is definitely putting extra thought into the visuals with each new video.
Love your dismissal of the gaming chair for the steel chair! A good throwback to an old video!
Not finished with the video yet but you guys are doing an absolute insane amount of work in order to give us some info about how these cards behave. True respect.
It is almost as if they are getting money for it 🤫🤫🤫 clown.
@@temperedglass1130
Other people get money for it and do a piss poor job comparatively too, genius. That and they are getting money and not having their stock handed to them with the caveat they kiss their ass.
@@InvadeNormandy 🤫🤫🤫 silence simp. What I spoke is factually true. Your crying will not change reality.
@@temperedglass1130 dude if that’s your reaction to someone saying they’re putting out a good project you must be the most miserable sun deprived basement dwelling girl repelling troll of a human.
Like and subscribe
Went with a 4090 Founders for MSRP in the UK for £1599. This card will last me a LONG time. Coming from a 3070 the performance uplift at 4k 120hz is insane.
Where did you pick this up from, dude?
@@ishmaelscarratt8164 Nvidia UK website has them listed but out of stock. Fulfilled by Scan.
It’s back in stock now
@@wedeservedit Yeh, slightly cheaper as well
I got one too, but I'm kind of worried about even keeping it. I thought the melting issues were put to rest but it looks like people are dealing with it. How are things holding up for you?
I think it will be interesting to see how the 4080s and especially the new AMD cards compete against that beast. We just have to wait a bit, and we have to hope that the GN team will withstand those long, long testing runs (especially with Steve and his unlimited rage in the background :D). Thank you guys for your work!
The 4080 16 GB has 59.375% of the shading units that a 4090 has and the 4080 12GB has a mere 46.875%.
Extrapolating those figures onto TechPowerUp's relative performance 4K resolution graph, that puts the 4080 16GB a bit below the 3080 Ti and the 4080 12GB a bit above the 3070 Ti.
That's disconcerting given the actual 4080 12GB will almost certainly not scale as well due to the large difference in memory bus between the 4080 12GB and the 4090. So in fact we might see a 4080 12GB that performs on par with a 3070 (or potentially worse assuming the dip to a 192-bit memory bus only drops performance a few percentage points).
I'm not sure I see a universe where the 4080 12GB makes any sense anywhere near $900 USD. Even if you assume that the 4090 is being held back somewhat at 4K by current CPUs, it's not being held back to the extent where the extremely cut down 4080 12GB is going to find 50% extra performance to justify its insane price tag.
@@giglioflex Until we see the actual released figures, I'm going to take that with a heavy grain of salt. There were a lot of comments with a similar comparative outlook on the 30-series, and they all turned out to be rumor and incorrect, mainly because the raw numbers of a particular segment weren't a proper way to suss out the methodology of the new series. Unfortunately, availability was the bottleneck, and because of that, so was price in that instance.
As someone who is on the fence, both on AMD and Nvidia, I have to see how AMD compares in performance and then again with the 4080. This will determine if I go ahead and buy the 4090 because I'm unhappy with the perf of the -80 variant or the upcoming AMD.
@Lurch I'm not seeing it. Not sure what you're overreacting to with regard to the GPUs. Yeah, they use a lot of power, but if you're in the US that's not much of an issue at $0.13-0.14 per kWh nationally.
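For anyone wanting to check the shader-unit extrapolation a few comments up, here's a quick sketch. The shader counts are the spec-sheet figures that comment assumes, not numbers from GN's testing:

```python
# Sanity-check of the shader-unit percentages quoted above.
# Counts below are the commenter's assumed spec-sheet values.
shaders = {
    "RTX 4090": 16384,
    "RTX 4080 16GB": 9728,
    "RTX 4080 12GB": 7680,
}

baseline = shaders["RTX 4090"]
for name, units in shaders.items():
    pct = units / baseline * 100
    print(f"{name}: {pct:.3f}% of the 4090's shading units")
# RTX 4080 16GB -> 59.375%, RTX 4080 12GB -> 46.875%
```

The percentages match the comment exactly; whether performance scales linearly with shader count (it usually doesn't, given memory bus and clock differences) is the part that stays speculative.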
Thanks, Steve
Thank you papa
Thanks, Steve
Thanks Steve. I love watching your channel grow over the years and it is to a point where I honestly have no other questions beyond what you are testing. Extremely informative. Long live GN
Now back to you, Steve
@Lurch you sound like a goober
@Lurch if the future includes people like you participating in procreation then yes, I'll buy a 4090 and happily "watch it all burn" lmfao clown
Can you do power spikes at a 70% power target? It would also be great if you could confirm Roman's findings about the huge increase in efficiency at those power levels. He saw anywhere between 2-5% less performance at 20-40% less power draw.
Maybe this also means the power spikes go down, maybe it means they go up, or maybe the spikes stay similar, just also 20-40% lower. Would be really interesting.
Also interested in this
Great production quality folks, everyone's excellent work is appreciated.
I think VR performance should also be tested now. even though there is a very small audience for it but this card can allow VR gamers to push the resolution scale.
PC VR is so dead that there is no point. There’s plenty of resolution scaling in the current process.
In general it's also just a weird display, so I never know how to compare the benchmarks of other games against it. Would love a few VR game charts, so I can draw a direct conclusion.
@@astraxgt you can't compare benchmarks of other games against it
in VR, it's much more crucial to maintain the framerate that matches your headset
otherwise you get reprojection, and it's very, very ugly
all that matters in VR is, what framerate can i reliably hit, at what settings, and at what resolution
because resolution is *always* scaled up in VR beyond the native headset resolution, due to the way the lenses work... of course past a certain point it becomes supersampling, which is still good, if you can power it.
@@astraxgt On the Quest 2, the only way to get 1-to-1 pixels is with the resolution scale at 1.7x, which roughly translates to 8K rendering. The 4090 can meet that requirement, as seen from these benchmarks.
Yes a VR section would be great. Even if it's just VR benchmarks and not actual game benchmarks. But if they do games I would really like an rFactor 2 VR benchmark. Same for CPU reviews
Your shop looks great. I especially enjoyed how Steve shoved that gaming chair out of the way, preferring to sit in a metal folding chair, which really reinforces how much he hates gaming chairs. Love your channel, thanks for the early 4090 coverage.
Got a ton of 4090 info and found out at 5:34 that Steve really hates that chair. He shoved it aside and sat on a metal folding chair, classic.
Holy cow I wasn't expecting that much of an improvement in raster performance and also raytracing
Jesus Christ haha, had to scroll over 30 comments just to find one on the card itself and not sucking GN off. Like I appreciate their hard work too, but is no one else seeing the legendary generational performance leap here?
Bro, do you even rasterize?
@@theholypopechodeii4367 You can be unable to afford a product and still appreciate its achievements and breakthroughs. No one looks at a Ferrari and gets mad that it's 4 million...
Side note, that A roll talking head shot looks incredible!! You guys are delivering such content-rich and beautiful looking videos!
I'd really like to see VR tests for these. Assetto Corsa Competizione needs all the power it can get, and VR really makes use of all the power these cards have to offer. The resolution often nears 4K, if not exceeds it, with supersampling.
Yes, good call. I love VR gaming , it's spoiled screen gaming for me 😆
Check out BabelTechReviews; they did several VR benchmarks including Assetto Corsa.
Of course, that is what this card is made for... the rest are for playing games.
Yep I figured these would be the first cards that make serious VR games possible. I mean you can do some insane shit with these cards even at high resolutions.
Would like to see benchmarks with VRChat, and know what components can be doing bottleneck
never seen testing like this, this is why you guys are the best.
Would love to see you add Flight Sim 2020 to the benchmarks... its optimization has come along, and it would be nice to see where GPU/CPU limits kick in.
I love this review!
Watch Optimum Tech video
4090 aside... thanks for the behind-the-scenes. I really value your thorough testing methodology and no-BS approach. GN is easily my go-to for reviews before an upcoming purchase decision.
Thanks for all the hard work GN team.
I really hope RX7000 series aren't a disappointment... I mean, the performance of the 4090 is truly impressive and they are turning those watts into FPS, but it's too high in power and price for me to upgrade just yet. Cool to see the reference card cooling properly though, that's a welcome change.
Coming from someone who far far prefers Nvidia.... I agree. The performance is impressive, but honestly, at that price, power consumption, and size it BETTER be. I am going to be skipping these unless they manage to slim them down, they are just too expensive, power hungry, and huge.
Anyone buying AMD GPUs is stupid at this point. They don't have a good product, the support is shit, and you would need to be unaware that nvidia exists to get one.
@@Cyber_Akuma Do you live in a third world country where just a tiny bit of power like that is actually costly? Most of the world that can afford these GPUs hardly pay a few cents for the monthly use at full power 24/7 for these systems.
Either you are not aware that energy for this sort of thing doesn't actually cost you anything, or you may live in a place where buying luxury toys is stupid.
@@sqlevolicious If 1 kWh costs about $0.20, and your system draws 1000W (about 1150W with 85+% efficiency), you're looking at ~$169 a month with 24/7 usage. Run it for a year, and it adds up to more than the cost of the card....
Similarly, a 700W system (probably a higher-end 4080 12GB), would cost $24 a month running just 5 hours a day. It's a significant amount when it accumulates...
@@xletr Rates are much higher in Europe at the moment too. I'm at £0.33 per kWh in the UK (~45% more expensive).
The other guy is peak US defaultism.
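The running-cost arithmetic in this thread can be sketched quickly. The 85% PSU efficiency, wattages, and per-kWh rates below are the thread's own assumptions, not measured figures:

```python
# Rough monthly electricity cost for a PC, following the thread's arithmetic.
def monthly_cost(system_watts, hours_per_day, price_per_kwh, psu_efficiency=0.85):
    wall_watts = system_watts / psu_efficiency           # draw at the wall
    kwh_per_month = wall_watts / 1000 * hours_per_day * 30
    return kwh_per_month * price_per_kwh

# 1000 W system running 24/7 at $0.20/kWh
print(round(monthly_cost(1000, 24, 0.20)))   # 169
# 700 W system, 5 h/day at $0.20/kWh
print(round(monthly_cost(700, 5, 0.20)))     # 25
```

At the UK rate mentioned above (£0.33/kWh), the same calls scale up by roughly 65%, which is why the "power is basically free" argument doesn't travel well.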
Steve, your content keeps getting better and better.
You are a reference in this field, and I want to thank you for being always on our side.
Sending a virtual hug!
I'm surprised we got a performance jump this big, but the price followed as well.
With that said, I'm waiting to see what AMD has done with RDNA 3, as it's had a 50% uplift as well. So this may indicate we'll get a new console mid-gen upgrade again.
So did the power consumption. Twice as fast for twice the power consumption.
@@Knowbody42 what are you talking about, 60% faster at the same power and price as 3090ti
@@Knowbody42 you should stay in school and take an extra math class
Im hoping amd will come out punching this time around. Instead of just making yet another card only second best.
666 watts is "same power consumption" apparently. According to the "new math" people are being taught in school or something.
GN is really taking the testing to a whole new level. thanks
@Lurch You have power and frame rate tests; you can make an estimation of efficiency on your own. It's easy to see that it has massively improved over the 30 series, at stock anyway.
Anyone noticed the subtle diss at "gaming" chairs on 5:34 ?
nice!
Just absolutely incredible coverage! Great job GN team!
Would love to see premiere , resolve, after effects, and 3D rendering benchmarks also, thanks for the great content !
The thermals are surprisingly good. However, it would be interesting to see the thermals on this card when it is inside a case with a 13900k or 7900x on Steve's desk.
AIR. I built my first PC 22 years ago. Back then it was all about getting as much air as possible on the GPU. Then it took a backseat to aesthetics. Now we are back to air. I ripped the foam filter out of my Meshify C, ditched the AIO with the radiator in front, and bought a good air cooler. I feel like I've come full circle. It's so much cooler and surprisingly the dust isn't much of an issue. I'm guessing I could handle that 4090 if it fits. But honestly, my 2070 super is still just fine. I'm too old to be caring about chasing crazy FPS. I spent way too much time caring about everything other than enjoying games.
@@RJT80 - You're bringing back memories. I built my first PC 30 years ago, and liquid cooling wasn't even a thing. It's not like you needed anything, but a system fan to cool your graphics card because sVGA didn't exactly draw a lot of power.
Air is actually the best option, as long as you have proper ventilation. As a network engineer, I've spec'd many a system, and everything is air cooled for a reason. I only know of one liquid cooling solution, and that's actually a new system where you actually immerse the components in the liquid. It's even known as Liquid Immersion Cooling. It looks interesting, but it would take years to implement as we'd have to reengineer all our server farms.
I hear you about chasing fps. Like you, I'm too old to worry about just playing competitive shooters. I like my solo games, and they are fine at 60fps.
I agree, however, I would imagine that they still run cooler than the 3080 or 3090. I think Nvidia realised that they pushed it to the limit with the 3090, and if they wanted to actually increase performance significantly, they were going to have to cool the cards better.
It's 100% why the cards are so extremely thick now, if your previous cards are already running at 110 degrees and you want them to pull another 100 watts, you're going to have to upgrade the cooling.
@@cummerou1 - I definitely think the cards are cooled better, as this video shows. I'm just curious how that translates when used in a non-testbench environment. We all know that some cases are absolute garbage for removing heat, so it would be interesting to see what the minimum case and cooling you'd need.
@@ColdRunnerGWN Agreed, a lot more metal to conduct heat away doesn't matter as much if there's no airflow in the case
The 4090 is a beast. I'm surprised how much better it is than the 3090Ti. I personally game on a 1440p monitor, and most likely won't be needing something this power hungry. Very impressive 4k results though, I honestly thought the card wouldn't perform this good at 4K.
Had a 3080 Ti build for 21:9 1440p and was disappointed with the output... This card, however, seems to be meant for 21:9 1440p, because in a lot of cases it's maxing games and in a lot of cases it's not. If you go any lower than this card, every new AAA that comes out will make your PC suffer.
Thanks for the best review work in the world!
Pretty much exactly double the performance of a 3080 at 4K, with and without RT. Considering the 4090 costs much more than double the 3080 at MSRP, it's looking to be bad value. That said, the 4080 will not be double the performance of a 3080, and is still double the MSRP (here in Finland, 750€ vs 1500€). The 4080 "4070 BS edition" is even closer to the 3080, I'm sure, maybe 20% better, and still costs much more.
Nah, I'm good with my 3080 for now. Gonna skip this gen, or get an AMD RDNA3 card if they offer something more competitive with better pricing.
If you upgrade every generation you're an actual degen.
I just bought a 2nd-hand EVGA 3080 with 2 years remaining warranty for €500. So the 4090 is 4x more expensive in Europe, while being only 89% faster at 4K. 3090s are going for €700, so you get a 68% perf uplift for almost 3x the price. At €2,000, the RTX 4090 is a downright scam in Europe. The 4080s are even worse, just laughable; you have to be crazy to buy one of those.
The gold standard of hardware testing and journalism, no one else comes even close.
Thanks a lot GN, from all gamers and enthusiasts.
It's impressive how much performance has improved with this card. I don't think there are any raster only games which challenge the hardware at 4k.
Bet Jensen opened the door inside his jacket closet to reveal an architecture vault with stuff that would make DARPA cream their pants!!
The price completely makes it unimpressive.
It's inflated on top of the previous, already inflated, prices.
It could be even faster and still not justify a 1600 USD tag...
@@kiwd-dynamic I agree, the price to performance is nowhere near where it needs to be.
@@kiwd-dynamic and 450 watts
@@kiwd-dynamic 3080 Ti/3090 is a Winner for builds now.
Dude, it would take 30 seconds to add figures for creatives, Blender rendering benchmarks, etc. You guys go so in depth, way more in depth than just gamers appreciate.
These 4k 200fps charts are driving me crazy. Time to add 8k testing.
@@hdvrNG More power
Really wish you'd do some VR comparisons as this an area that may benefit from this kind of performance improvement.
Steve, please never stop adding the snippets from the Intel ARC keynote to your videos. They are golden and I bust out laughing every time lmfao
Wish we still had 1080 Ti benchmark data on this graph. I know a lot of people who are holding on to them until they see a big enough performance-to-price delta to buy, and it'd be very nice to still have that as a point of comparison.
I'm guessing that's why he made the off-hand comment about "1080ti notwithstanding" when talking about normal vs ti cards and how it's "a mistake they'll never make again?" I've been out of the loop on GPUs for a few years now, mostly because I've seen no reason to upgrade from my 1080ti.
@@JesseBayne I finally upgraded my 1080ti to a 4090 after 4 years (I got it around September 2018). Absolutely worth it for my 1440p Ultrawide and livestreaming
It would be nice to have h264/h265 video editing performance included in the standard benchmarks for GPU's. 3d compute/rendering benchmarks are also good but having a video editing(playback/export) benchmark would be a good addition to standard GPU reviews.
th-cam.com/video/K8_QPx-IN-o/w-d-xo.html if you're interested in those h264/h265 benchmarks
I'm still bitter about them branding a 4070 as a 4080 to try to force people to buy up the remaining 3000 series stock. If RDNA3 is even remotely competitive with these, I'm going to make the switch. It shouldn't cost this much for a graphics card.
Steve and the whole team are the real mvp's. 2 weeks and counting with continuous updates on hot topics, good job!!!
The videos you make are absolutely incredible.
Well done.
Bot
Thanks for all the hard work Steve and crew! Love knowing us consumers have got you in our corner and your no-BS attitude is a welcome change to other media outlets
Mad respect for the time and depth of the reviews on GN. Basically an only fps benchmark video would do just as well in views, so the extra effort isn't needed for views but much appreciated by me and, most likely, the rest of the viewers 👍👍
I recently (half a month ago) was lucky enough to have bought a RTX 3090Ti Suprim X for $979.99 when I was browsing around Amazon, glad I was able to snag it then as the prices have hiked back up. I’ll be fine for another several years, but man that performance uplift is insane!
Nice! I got a 3090 aorus master for a bit around $800. I'm also pretty pleased with my purchase :)
Nice. I was lucky enough to buy an open-box 3090 Ti Suprim X for $900 on eBay the day before the 4090 announcement. Put a waterblock on it and it's been great. Really do not need anything more. I just hope new games won't be too demanding and force me to upgrade.
There was a lot of talk about the 4090 doing poorly at just rasterization, since everyone’s early tests were restricted to DLSS on. But seeing these benchmarks… this is wild!
The only thing that IMO is missing from this review is lack of VR gaming performance charts. RTX4090 might be one of the first GPUs that are powerful enough to allow for comfortable usage of those with most of the available titles :)
Totally agree. The flat screen benchmarks are so over the top now so they are almost irrelevant. Let's bring VR into the spotlight!
An overclocked 4090 at 666.6 watts is eye-watering! I can't imagine the heat that's going to come out of the poor PCs running this.
This is why I'm getting the Suprim X, the radiator will dump all that nice heat into my room, meaning I'm saving money by not heating my room!
Saving money by buying a $2,000 GPU? Yeah, sure
It will clearly be exactly as hot as hell.
@@DTheVigne It's $1,600, and heating costs are insane
@@JohnDoeC78 Until summer hits and you air-fry. Might be useful for exploring Antarctica, with the downside of accelerating sea-level rise as you melt the ancient ice around you and potentially unleash prehistoric viruses to ravage mankind for its endless selfishness.
Have you tested undervolting performance? The 3080/3090 were very good undervolters. I was able to save about 100 W of power draw on my 3090 at only a 3% performance hit.
That's impressive
Thanks to this release, today I was able to snatch a 3090 Ti for $900 with taxes. Good enough for me. I've still got a DDR4 build and don't plan on going DDR5 anytime soon. Good luck all, and absolutely killer video. I'd stop by and congratulate the whole team myself if I could. Bravo.
The fact that there is so much uplift compared to previous generation makes me really excited to see what AMD has to offer with RDNA3.
???
Why? The Radeon Technology Group does not benefit from Nvidia's R&D. I know, "competition is good" etc but just because one company does well doesn't mean another is going to start doing well, especially when it comes to AMD GPUs.
@@BreadDestroyer Because the leaks were more focused on AMD's improvement than Nvidia's. So if the leakers are at all trustworthy, the situation will be very interesting. Why? Look at the energy-consumption chart.
@@BreadDestroyer Because NVIDIA rarely makes decisions in a vacuum. The last time AMD made large gains with their architecture, NVIDIA released the 1080 Ti. One could argue they have a better grip than anyone else on what AMD is doing and what needs to be done to outperform them.
@@BreadDestroyer They make things relative to what they think the competition is doing. So the 7900 XT will have been built against where they think the 4090 will perform, and whether it can exceed or match it.
@@ladrok97 As a rule I don't trust leakers, because you have like a 40% chance of any given leak being reliable, but even setting that aside, I still don't see why you'd be hyped for AMD's GPUs when
1) They've been pretty uniformly terrible for the past decade or so, with even the "good" releases falling short of Nvidia and/or being plagued by driver issues and
2) By your own logic Nvidia have pre-emptively responded and pre-emptively checkmated them. They can go back to competing on price, I guess, but "get a worse GPU for cheaper" doesn't excite me. That's pretty much always an option by getting last-gen hardware anyway.
Impressive fps numbers. I'd be interested to see what you get from a 4080 now and what that means for a price/performance comparison
I think a better way to measure power spikes is the area of the spike (height times duration). The peak value alone is not that relevant, because even the wires have enough inductance to absorb a very brief spike. Similar to how Hardware Unboxed calculates cumulative deviation in monitor pixel transitions: it's not just the overshoot, but also the duration.
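An editor's sketch of the "area, not peak" idea this comment proposes: integrate the power that exceeds a nominal limit over time, so a tall-but-brief excursion scores low while a sustained one scores high. The trace, 1 ms sample interval, and 450 W limit below are all made-up illustration values, not measurements from the review.

```python
def spike_energy(samples_w, dt_s, nominal_w):
    """Excess energy (joules) above `nominal_w`: a simple rectangle sum
    of (power - limit) * dt over all samples that exceed the limit."""
    return sum(max(p - nominal_w, 0.0) * dt_s for p in samples_w)

# A hypothetical 1 ms-resolution trace with a brief excursion above 450 W:
trace = [430, 445, 520, 610, 580, 470, 440]  # watts, invented for illustration
excess_j = spike_energy(trace, dt_s=0.001, nominal_w=450.0)
print(f"{excess_j * 1000:.0f} mJ above limit")  # -> 380 mJ for this trace
```

The same trace with a peak-only metric would report "610 W!", while the area metric shows the excursion carries well under half a joule of excess energy, which a PSU's bulk capacitance can plausibly ride through.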
17:02 "that's unfortunate" 😂😂😂 best part of the review
The best benchmarks around! Thanks as always Steve and crew!
If you can get a hold of the Galax 4090 with 4 fans, I'd be interested in seeing how much adding a fan to the back of the card actually helps with GPU cooling (if at all) over more traditional 2 and 3 fan designs.
It doesn't. At all.
More fans doesn't necessarily mean better cooling. Look at Gigabyte's Eagle three-fan models; they're kinda trash.
@@nicane-9966 Thus why I want to see how the gimmick stacks up.
The only reason to replace a 1080 ti at 1440p gaming is when it finally decides to rest. Great review!
Same with my 1070 at 1080p gaming :)
I understand the need for a controlled environment, but maybe do some hot-room testing in the future. Testing cards in a hotter environment with a good airflow case would represent a real-world scenario for a lot of gamers. I live in the barracks, and while the barracks are temperature controlled, my room is not. I'll often find my room hitting 80-90 degrees Fahrenheit simply because of my 1080 Ti and 5950X in a Meshify 2 XL with every fan slot occupied. Edit: This slowly degrades the effectiveness of my cooling, and I'd like to see a video done on that.
Really appreciate the hard work, guys, and the thermal testing you did, giving us a bit more info than just some game charts. Looking forward to more 4090 videos :)
The sheer amount of transparency with the testing methodology is nothing short of bloody brilliant, Steve and team. Y'all have knocked it out of the park.
As much as I'd love to pull the trigger on the 4090, I'm going to keep my artefacting MSI GTX 1080 FE limping along until the RDNA3 announcement and reviews before I make my decision. I'm lucky that I can justify the money on a 4090, but patience is a virtue.
From Australia, kudos to you and the team mate. Y'all deserve a good rest after all this.
Yep, same. My buddy and I were actually just talking about doing the same. My 1070 Ti has started showing signs of wear. I'm holding out for RDNA3 as well before pulling the trigger on a new card.
@@TimothyStovall108 I'm rocking a regular 1070 and recently picked up CoD: Vanguard again (hadn't played it since launch).
Only getting about 30-35 fps on there, dropping down to 20-25 when I start shooting :(
I'm curious: why not save a bit and buy a 3090/Ti? Sure, the performance is great, but a whole build can be had for $1,600 with a 3080 Ti/3090. To each their own (I'll buy a 4090/RDNA3 eventually).
@@PDXCustomPCS Me?
@@PDXCustomPCS Eh, I bought a 3080 12GB back in June, but it wasn't the uplift I was hoping for in the games I play, as I was still GPU bound, so I sold it within the week to a friend for what I paid for it. I thought about snagging a 3090 Ti, but decided to wait a few months for the new cards.
16:57 that is amazing editing, scripting and delivery 🤣
I found the 3090 FE had great thermals. Better than partner cards, considering it ejected a lot of the heat out of the case and did so in a smaller package. 3090 partner cards were insanely big, but nothing like these 4090s.
I replaced the paste and pads in my 3090 FE and got even better thermals. I'd be interested to see what happens doing the same on the 4090.
@@Scragg- you ain't getting one
@@behindthen0thing How would you know? You seeing my bank account or something?
@@Scragg- he saw his and thought yours would be the same
I think their testing is more elaborate than most manufacturers at this point.
10:15 This was always the way it was going to go. The 30-series FE cards were very well built with a great design, and they've iterated on it even better. EVGA knew the score here. Eventually Nvidia is just going to eat more and more of the card market share.
I think this pretty much secured that I'm going to buy the FE card once again this time around. Looks good, well cooled, and a +33% power limit from the looks of it makes partner boards obsolete.
I was thinking of getting a Strix OC, but now that I've seen it at £2.4k, it might not be worth the significant price premium over the FE, as I'll be water cooling anyway
The insane raster performance is gonna make this a VR beast, with the huge resolutions and supersampling required/desired for ever-bigger (resolution-wise) VR displays. Exciting :D
@@MadWatcher because VR is power hungry. this is the best performance you're going to get for VR right now
The majority of VR games really don't need a card anywhere near that powerful.
@@alexoelkers2292 This isn't remotely true. It might be in the sense that a lot of VR games intentionally have potato graphics, but the most interesting ones are often those that don't: Flight Simulator, racing sims, Elite Dangerous, VRChat, etc.
I have almost never played a PCVR title that I didn't wish I had more horsepower to crank up graphics, resolution, supersampling, frame rate, etc.
and when you figure in all the mods that are coming as people port unreal 4 games...
@@alexoelkers2292 lol.
You've clearly never played Assetto Corsa Competizione.
After this review, the prices went through the roof in the Netherlands. Official stores are selling this card for €2,500-€2,900
Stores are now the new scalpers.
Thank you for mentioning the jump over the 3090, because a generational uplift is measured over the SAME product in a stack. To me it's looking like 75-100% in general, which is mind-blowing for a single gen, but it's what was predicted with Nvidia moving from a slower node to the best node in the world, TSMC N4. And this also drives cost. I know this comment will get hate, but this is a much better deal than last gen's 3090 Ti, which for most of its life was selling around this price range or even higher.
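As an editor's illustration of the uplift arithmetic this comment relies on: comparing the same tier across generations (3090 → 4090), the percentage uplift is new over old, minus one. The fps numbers below are invented for illustration, not figures from the review.

```python
def uplift_pct(new_fps, old_fps):
    """Generational uplift as a percentage: (new / old - 1) * 100."""
    return (new_fps / old_fps - 1.0) * 100.0

# Hypothetical same-title, same-settings averages: 3090 at 60 fps, 4090 at 108 fps
print(f"{uplift_pct(108, 60):.0f}% uplift")  # -> 80% uplift, inside the 75-100% range cited
```

Note the asymmetry this comment's thread elsewhere trips over: a 4090 being 80% faster than a 3090 means the 3090 delivers about 44% fewer frames, not 80% fewer.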
To me this is the first REAL 4K gaming GPU, where you don't have to turn a lot of settings down and it can pump out the fps. I'm waiting for the following generation though because I want that performance at lower power consumption which is what will happen with the 5000 series and RDNA 4.
So, a point of contention with your commentary about buying advice: this is a HELL of a 4K gaming GPU, which is the only reason people should be buying it anyway, since the 6900 XT, 3090, and 3090 Ti are already premier 2K gaming GPUs. This is something that gets paired with a big shiny 4K OLED TV with a PC input made for gaming, to enjoy the splendor of what modern hardware can deliver. In which case you DO want RT most of the time.
Also, if I'm buying this I'm not pairing it with anything except Zen 4 Vcache, where I think some of those CPU limitations will disappear.
No hate needed dude, good comment. I agree with you, this thing is a 4K beast and is made specifically for it. I have a 3080 and I'm commonly seeing double the performance which is insane. I was hoping that the 4090 would provide 144fps at 4k and my eyes were literally watering watching this review haha, the uplift is amazing.
Everyone who said to wait for benchmarks and that the pricing was about right were absolutely correct.
Waiting to see what AMD has to offer..
One question: how is the coil whine on the RTX 4090? The RTX 3090 and 3080 Ti had quite a bit of coil whine, and it wasn't limited to certain brands; it happened across all brands and was significantly louder than on 2000-series cards.
My 3070 never had issues, and I am very sensitive to electrical noises.
@@chltmdwp 30 series cards i've tested and found to have coil whine: RTX 3090 FE, Asus Strix RTX 3090 OC, Gigabyte RTX 3080Ti Eagle OC, MSI RTX 3080Ti Suprim X, Asus Strix RTX 3080Ti, Asus TUF RTX 3080Ti OC, RTX 3080 FE, MSI RTX 3080Ti Ventus 3X, Asus Strix 3070, RTX 3070 FE. I build a lot of rigs for clients.
@@0perativeX I have the Asus KO OC 3070. Maybe I got lucky:)
Back to you, Steve
Hey Steve and team! I know it probably won't get seen, but I would absolutely *kill* for VR benchmarking from you all. I know you have plenty else in the works so it's unlikely to happen, but I have seen such drastic differences both on software choice as well as hardware, that it's a daunting use case to purchase for without any in-depth reliable information from common media outlets.
Thanks for all your work!
I'd honestly love to know what games are worth spending this sort of money on a graphics card in 2022.
4K at minimum 60fps guaranteed with RT on. Not so long ago, we were just trying for plain 1080p. I'm blown away.
@@Paul_Sleeping - That didn't answer the question Paul.
Well, worth is subjective, so no one can answer that question for you, only for themselves. I'd say none, since a 3080 can run any game I please at settings that satisfy me at 1440p. Now, if I were to move to 4K, I would want something more akin to this for really any demanding AAA game. The 4090, in my opinion, is the first real 4K card, as the 3080/3090 were for 1440p.
Much better review than LTT's video. Keep up the great work.
LTT has been more about entertainment (these recent past years) than about doing proper reviews and research and explaining things in accessible terms.
@@Koeras16 LTT is a spastic
There's nobody I would trust more with this information than Steve Gamers Nexus. Thanks for the hard work and dedication to integrity; you guys rock! I would also love to see a sorta dumbed-down "how'd they pull it off" video where Steve explains how they managed such a performance leap to me like I'm an idiot (I am)
but you didn't give them any information?
@@sigy4ever ?????
@@EmpanadaDeCaca I apologise its just that the phrase "there is nobody I would trust more with [X]" usually has the context of the speaker entrusting [X] with/to a person associated with it.
@@sigy4ever welcome to the internet
@@sigy4ever In common usage, you entrust something *to* someone, but you trust information someone *has or gives*; with information, the direction usually reverses.
My comment isn't related to the card, but about how happy you look showing us how much you've improved. I'm happy for you bud! The sky is the limit.
Very thorough review guys. Thanks for the cable clear-up. Can we get some octanebench/redshift/vray testing in the future? GPU path-tracing performance usually jumps generationally by a lot more than gaming performance.
Was lucky enough to finally grab one of these from best buy!!
I've only just started trying to get one, since like last Friday. I feel the pain, I can't imagine the frustration for people trying to get one since launch.
@Hollowpoint762 I've had one ordered from B&H since launch, but it still hasn't shipped and shows as backordered. This week is the only time I've been able to add one to cart, so hopefully it's getting easier.
@@Notnownev from what I can tell, this has been one of the more active drop weeks since launch. Guess I picked a good time to try, now I just need some luck.
@@HollowPoint_762 Oh man, there are so many cards available near where I live (the Gigabyte Gaming OC version). For months they sold out immediately when cards came in; now some stores have 10+ in stock.
Got mine while I could. I doubt pricing will drop unless sales slow down drastically, but I've seen people scalping the 7900 XTX at prices that will push people up to the 4090, since it's not that much more versus scalper pricing on AMD.
@@notastone4832 I still haven't been able to get a founders edition
I like how Gamers Nexus gets to the point with no BS.
I logged in to YouTube today and my feed was inundated with RTX 4090 reviews. I filtered through them all until I found the GN review and just watched that one. Best choice I made today. Not that others don't do a good job, some do, but I won't get the full no-BS information that I get with GN. Simply put, you guys are the best in the business. Keep it up!
Oh, and no, I am not buying one of these. My 3080 is still overkill for most games at 1440p; this would literally be flushing money down the toilet for me.