Haha these prices are insane. Considering the state of AAA gaming these days, these hugely expensive graphics cards don’t seem like a good proposition to me.
Exactly. First such comment I ran into. Who spends 3,000 euros on just a GPU, just one part of a gaming PC, so probably more like 10,000 euros on a high-end gaming PC, in order to play buggy, poorly optimized, woke, copy-pasted and boring AAA games at 200 generated frames per second? But yet people like that clearly exist. I guess some are just so rich that money loses meaning for them; I can't think of another real reason.
@Jojooooooo it's almost like Jensen wanted to give us a laugh. For a $2000 gpu.. the packaging is probably 0.1% of the overall impact on the planet of that GPU. The fabs. Can we manufacture the silicon in an eco friendly machine? No. 😑
11:13 Absolute truth. Frame gen tech is supposed to be the “cherry on top”, so to speak. Once you blow away the audience with the rasterization power, you _then_ bring up how much more _supplementary_ power the feature provides. They’re doing it in reverse here.
Pushing for more raster is useless when the CPU cannot keep up. 4080 vs 4090 already shows this issue: the performance uplift of the 4090 over the 4080 did not reflect the CUDA core increase, even at 4K. Frame gen happened because they want to bypass the CPU bottleneck.
@ so you bypass the cpu bottleneck with AI generated frames…? That doesn’t make sense to me at all. Because at the end of the day the GPU can’t “replace” what the engine outputs. It can “guess” which will inevitably lead to things like input lag or ghosting. In the end this just seems to be a marketing front to push their agenda. If you can’t make any noticeable gains in rasterization at this particular time, don’t release a new product. It’s that simple.
@@anitaremenarova6662 I am not interested in the 5090 but in the 5080 where the improvements are only around 10-15% and the difference in price is even more insane!
@@anitaremenarova6662 In what universe? It's always been around or above 2000. And still is, sadly. After the 5090 benchmarks are released the 4090s are gonna go up even further.
Idea for an alternate channel or series: Hardware Reboxed, where you do a retrospective on the previous generation and how you felt the various cards/chips were overall. Then you put them back in their boxes, where they remain until you need one for benchmarks or something.
Go with the Ryzen 5 7500F, very budget friendly, and you can hop on AM5 for later upgrading; or the R7 7700, they've been quite affordable lately; or the Ryzen 7 9800X3D for that gimmick highest-fps 1080p.
Regardless of gaming performance, 575 W is a big no for me. Even if, let's say, 6 years from now I could get a used one for $300, I just wouldn't want to have it in my room; it's just too much heat.
One of the perks of living in northern Europe is that you don't have to care about this heat. I need electric heating 98% of the time, and during the other 2% I don't want to game, I want to spend time outside. Though it's slightly mind-boggling that the GPU generates 1/10 of the heat of my sauna unit, which I use to heat one room to 110°C (230°F).
It's a great improvement for companies using it for machine learning. Extra vram really helps there. They will gladly buy 5090's, for consumers who want to game, probably not so much.
Hopefully not too long after the release we get performance figures in local runnable AI applications. I'm really curious how it stacks up against the 4090.
Seems like a product mainly aimed at professional use i.e. AI and 3d rendering that's throwing a bone to rich gamers than the other way round. I know the 90 series has always been a bit like that but this one seems more so (so a titan card essentially)
Yeah 100% right. I am even thinking of selling the 4090 to get a 5080 later on, due to lack of professional use at the moment and only gaming features being used. I guess I can get more money "back" in the process and get more frames, though generated.
the 4080 is only ten percent behind the 4090 at half the price, much better gaming card if you don't need tons of VRAM for productivity (since DLSS knocks down the VRAM usage massively), 5090 is basically just a cheaper (and probably a bit faster in some scenarios) RTX 6000.
@@PiotrBarcz yeah, it will definitely be interesting to see the reviews of all cards, including AMD in march and 5070 and ti soon. I had both the 4070 ti and 4090 and honestly, the 4070 ti is enough, all else was "nice to have". So I guess people with the 4080 made a great choice. And probably the 5080 will be much better for most people due to double the price of the halo card.
@ yeah I've been thinking of getting a 4090 build in the future but frankly the 4080 makes so much more sense I would be an idiot to get the 90 with my financial situation anyway XD
This is where there is much value to getting a dryer vent and pipe the heat out the window. Will save you a few bucks a month in running the AC harder. Reminds me of when I used a dryer vent to cool my PC with 20 degree air from outside. Just change it from the exhaust in summer to intake in winter.
keep in mind the US dollar has lost a big chunk of its value in the last 5 years.... this is not a joke or exaggeration..... if $100 from 5 years ago is worth $50 now, that's a 50% loss, about 13% of its value each year compounded..... so 2k today is really more like 1k..... ppl are ignoring inflation with these prices.
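Whatever the exact rate, percentage losses compound, so claims like this are easy to sanity-check with a couple of lines (the rates below are the thread's illustrative figures, not official CPI data):

```python
def remaining_value(annual_loss: float, years: int) -> float:
    """Fraction of purchasing power left after compounding an annual loss."""
    return (1 - annual_loss) ** years

# Losing 20% per year for 5 years leaves about a third, not half:
left = remaining_value(0.20, 5)
print(f"{left:.1%} of the original value remains")  # 32.8%

# For $100 to fall to $50 over 5 years, the implied annual loss is:
implied = 1 - 0.5 ** (1 / 5)
print(f"implied annual loss: {implied:.1%}")  # 12.9%
```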
Probably will do much better considering the airflow is totally unobstructed. Just get good exhaust fans and maybe make a shroud inside of your case to move the air more directly to the exhaust and you'll be good.
I like the green themed studio lights for Nvidia products: I am guessing you will do blue for Intel and Red for AMD. Maybe the whole studio should be a bit brighter though to compensate or just have some parts with the Green/Red/Blue theme?
I mean at this point I really do believe Nvidia engineers when they say they struggle to get the sort of raw performance uplift we've seen previously. We are approaching the point of diminishing returns, and that's down to straight-up physics and material science.
This. Everyone rolled their eyes when Jensen said Moore's Law was Dead, but even AMD and Intel are failing to roll out significant generational improvement. I feel it will take new innovation to see the jumps we saw previously, but in the meantime, I'll always be entertained by keyboard warriors who feel that the best engineers in their field are somehow inept at finding the improvements they feel entitled to.
Of course, but people seem to ignore this fact. Nvidia knows this, that's why they push the AI stuff so hard. You can't shrink the process node infinitely and it becomes increasingly difficult to do so. They went from 5nm to 4nm. Next generation they'll probably go to 3nm. If you reach 1nm, what then? AI is basically the "backup plan".
I think the good counter-example to this is Intel 14nm++++++ era. It's because nVidia is a monopoly and is making almost all of their money on AI, so they're dedicating their engineering time and building their GPU architecture to optimise for that. Look what AMD did in the CPU space with a brand new architecture - performance gains that Intel essentially said for 5 years were impossible. This is economics, not physics. We wait for the monopoly to break.
I think it's called Moore's Law, we are reaching the limits of silicon. It would have to be a total shift to something new, like optical computing, or dare I say quantum computing (imagine the cooling needed for that?)...
@@asianck I mean, no reviewer is only going to do tests with frame gen on; I don't think that's a concern really. Judging by how people are talking between the lines in their pre-coverage, we're in for a real disappointment when the 5080-and-below numbers come out.
I am looking at buying a 5090, but I am coming from 30 series. Upgrading from a 4090 to a 5090 seems like insanity to me. Also good to hear that you won't be looking at frame gen initially. I'm much more interested in RT performance which is one of the major reasons I'm looking at upgrading at all.
Well, the RTX 5090 looks more like an architectural enhancement than a leap forward in gaming. GDDR7, bigger bandwidth and more memory give the 25-30% uplift, BUT at 30% more money... Seems that Nvidia replaced Moore's Law with "You want more transistors, pay for them"... For many of us gamers, a used RTX 4090 is the way to go...
@@nahbro3240 As far as gaming goes, VR enthusiasts will be ready to jump on these 5090s. Hopefully it's closer to 40-50% in raster, but even a 30% uplift will be a difference maker for the high-resolution VR headsets available nowadays. Especially since DLSS and frame gen are not utilized in VR, and we need that pure raster to power headsets that push 3-4x the pixels of 4K.
We've gone through the miniaturization phase of computing, now we are going back to room sized super computers. Now we just need personal Fusion Batteries to power everything.
You bought a *_Corvette_* for less than 2 grand? Did you get some kind of 24 hours of Lemons super deal or did you buy it way back in the 50s / 60s when I think 2 grand was the starting price?
@@levygaming3133 You can get a used car that is still perfectly usable for, say, about 3,000 euros. A Golf or something like that. If I could choose which one to get for free, I would pick the 5090. And would sell it immediately.
@@manefin I know full well that you can get _some_ kind of used car for 2-3 thousand dollars, but he specifically called out a Corvette, which is an expensive American sports car (typically ~$60k MSRP), and I was extremely doubtful that _any_ Corvettes have depreciated to the point of being that cheap. The only way it could have been that cheap is, like I pointed out in my comment, way back in the past, before _half a century_ of inflation, when the MSRP for the base model _might_ have been 2k. Even then I think it was probably closer to 3-3.5k; at least Shelbys of that vintage were about that much (with an engine package scrapped because it would have raised the price by a full $500, which was a lot of money in the 60s).
A lot of the 40 series improvement was from the TSMC node improvement. Without that size of leap in the manufacturing tech, we're seeing that the chip design itself isn't advancing that much.
Blackwell doesn't improve rasterization per CU. It technically is better per clock, but 50 series has lower clocks than 40 series so it ends up being a wash. RT is a 30% improvement per CU, and tensor is vastly beefed up. But no one seems to care about that. Rasterization isn't a focus anymore nor will it be going forward.
You get more performance anyway though; you're not paying MORE for the performance, you're paying the same per unit of performance. Still a win in my eyes, better than going to worse.
@@PiotrBarcz lol, that's not a "win", that's the definition of stagnation. "More for less" is progress (see 3070 vs. 2080 Ti, 4070 vs. 3080); you could even make a case for "same for less" (the 7800 XT was basically a 6800 XT at a discount). But "more for more" is just bad value, even at this ridiculous price point. It's not even more efficient, for crying out loud.
@@gvd-l3o Still more performance, and that's a win no matter how you look at it; more is more, and in the end it doesn't matter how much it costs. Get a good-paying job and then buy the card instead of complaining about the price, since the world is already backstepping.
FPS initially: Frames Per Second. FPS now: Fakes Per Second 😂 P.S. Seriously though, "fake" is incorrect; they should rather be known as generated frames. GFPS, maybe. 😊
Now imagine, guys, that the card's price weren't inflated and its MSRP had been announced at $1,000, with the 5080 announced as a high-end GPU for $600. What a time that would have been to live in.
My thoughts, it's another semi-stagnant gen at least for what we've been shown. 5070 will probably only slightly outperform a 4070 super. Multi frame gen seems okay if you have a 240+hz monitor not so much if you have lower. DLSS transformer model looks good so that's something nice for all RTX users. Let's hope FSR4 and 9070 exceed expectations and give people real value.
100%. This is a stop-gap generation for Nvidia, similar to what the 2000 series was: marginal generational improvements with a few extra features. It'll probably age very badly. Nvidia's 6000 series and AMD's UDNA is where they'll bring out the big guns and get back to 40-50% gen-over-gen rasterization gains.
This still looks better than the 2000 series did at launch but Nvidia actually managed to launch raytracing with that series despite it being terrible at the time. Time will tell if dlss4 is also something people disliked at first but became popular
Well, people expect that the RX 9070 XT will be a cheaper 7900 XT, and that FSR 4 will be better than FSR 3. And that's about it. Though it's not much different with Nvidia. I just always find it interesting that when people talk about gaming GPUs they never talk about video games.
@ I meant that generally there's a huge focus on the stats/data of GPUs and such, rather than on the video games themselves, you know, the thing gaming GPUs are made for. For me at least it's a bit funny. Maybe to others it makes perfect sense.
IMO, most tech tubers have already completed benchmarks and are just waiting for the embargo to lift. The 24th will have hundreds of videos with results.
It makes me sad that I've already seen tons of memes making fun of 4090 users, because people genuinely believed that the 5070 will match its performance. And I freaking hate this kind of marketing as well. AI-generated frames are not performance. Nvidia thinks we only play brand-new AAA games; all that DLSS 4 "magic" is completely useless in games from before 2018.
@LukewarmEnthusiast I don't know... Maybe because we have 4K 240hz and even 7680 x 2160 240hz monitors. A 5070 will be way too underpowered for those even with older games. And I don't even mean very old games. There are tons of 4-5 year old games without any kind of DLSS support.
@valentinvas6454 That's the reason they emphasize frame gen and MFG so much. Usually DLSS and FSR are reserved for newer games; most devs aren't going to go back to an old game to implement an upscaler, but you can usually find a mod that adds frame gen to older games, improving performance. With the 5070 supporting MFG you may reach that 4K 240 Hz target you aim for. You may not like frame gen, but it does improve performance.
Which means games have to be optimized so they can actually crank out that much FPS on top of the line hardware. The devs are counting on Nvidia's optimizations to fix their stupid game's performance.
Looking forward to the reviews. I would expect at least a minimal DLSS 4 multi-frame-generation review after trashing it for weeks; it deserves a quick peek. The 50 series is going to trounce AMD, so there's no need to be disappointed unless you're an AMD fanboy who hates being bested.
27% uplift for 25% more cost (if you're extremely lucky!), yet 125 W more. So essentially we're paying for every percentile of uplift, which in turn is only enabled by roughly commensurate extra CUDA cores AND wattage, disguised under some fake-frame tech. Poor Nvidia. Poor... even LAME!
Same cost as rent around here... and both those thoughts give me anxiety 2025 is wild, wild times, man, lol Def not a product with this consumer in mind, so biased perspective for certain. Wonder how PC enthusiast level spenders feel 'boot the 5090 pricing to perf. Pretty card, sure it will run beautifully. 2k USD.... just... damn.... ~a random canadian subscriber dude
Don't say! Nvidia saves on memory… so it's ecological! 😂 Saving is good! And of course… the more you buy, the more you save! So the whole Nvidia ecosystem is eco-friendly! … Could you put that gun down now… I was a good boy like you said…?
That IS YouTube compression actually; I see it on every video on YouTube these days shot at 4K 60. Any movement like that is going to look like trash in the fine detail. YouTube caps its bitrate at around 40 Mbps for most videos, and it takes at least 50 to 60 to get clear visuals on detail like that during fast movement.
I think that if one can claim the 5090 with DLSS 4 achieves over 2x performance while being ~30% faster, then someone else can test the 4090 with alternatives like Lossless Scaling and use that to its full extent... I won't say 20x frame gen, but 10x would be hilarious. The Nvidia team would be like "wait, that's illegal".
Personally I anticipate it won’t be worth the upgrade for those with a 4090 but for those who still own the 3000 series maybe you could do some mental gymnastics to make that price work
I can't be the only one thinking: Show me it all at once. Don't give Nvidia the extended free marketing with small pieces here and there to keep the cycle of news focused around Nvidia for an extended period of time.
Nope; if anything a higher price, and it will come if the 5090 stock leaks are true. People will spend even 5k if needed for a 5090 or 6090. The cost there is just limited by availability.
I am happy I decided on my RX7900 XTX and I am enthused by FSR 4.0. I have no issues with using a Nvidia GPU, but can't justify changing back this gen, at least so far. The last time I had one was my trusty GTX 1080, what a great card. Looking forward to your reviews Steve, thanks to you and Tim for all the coverage. You guys are the best!
I hate almost everything about this card: the enormous price, the LOW extra 30% performance, the HIGH power usage BUT I LOVE how compact it is, god darnz it!
Upscaling? OK, maybe in single-player games. Frame generation? Absolutely NO, NO and NEVER! Especially if I'm paying that amount of money, I just want true performance via rendering; I don't want any "trick" that boosts numbers instead of real performance. We want high FPS to get lower in-game latency, so we can respond to anything in the game fast enough. But FG FPS is the exact opposite of that: it holds the next frame in order to create the in-between frames, so it creates more latency.
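That latency cost can be put in rough numbers. Because interpolation has to hold a rendered frame until the next one arrives, it adds about one base frame time of delay; this is a back-of-envelope sketch under that simple model, not Nvidia's actual pipeline:

```python
def base_frame_time_ms(base_fps: float) -> float:
    """Time between two *rendered* frames, in milliseconds."""
    return 1000.0 / base_fps

def framegen_output_fps(base_fps: float, generated_per_real: int) -> float:
    """Displayed FPS when N generated frames are inserted per real frame."""
    return base_fps * (1 + generated_per_real)

base = 60.0
# 4x multi frame gen = 3 generated frames per rendered frame:
print(f"displayed: {framegen_output_fps(base, 3):.0f} fps")  # 240 fps
# Interpolation must wait for the NEXT real frame before it can show
# anything between the two, adding roughly one base frame time of
# latency no matter how many frames get inserted:
print(f"added latency: ~{base_frame_time_ms(base):.1f} ms")  # ~16.7 ms
```

So the counter says 240 while inputs are still sampled at 60, plus an extra held frame of delay; that is the commenter's complaint in numbers.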
I mean the "30% more performance for 25% more price" statement doesn't really hold, though, because do we really expect the 4090 to hold its value at 1600?
Put a slow-mo camera side by side with a card that doesn't guess 3 frames in 4, and see if the GPU knows you've moved or fired during the guessed frames. Latency?
To be fair to Steve the channel is hardware unboxed, not hardware unboxing
That's an excellent point.
wish I had known that sooner in my life... 🤯
You are absolutely correct, this video should have started at 2:56
@@sanderbosThis never would have happened without AI
He works for a living, do you understand the concept?
Hearing "eco-friendly packaging" followed by "575 watt card" made me laugh.
Not to defend the 5090, but cards don't permanently draw their max TDP as a constant when turned on. And it's not even notably higher than cards a few generations ago.
@@Mondainai The 3090 of two generations ago has a TDP of 350 W, which means the 5090's 575 W TDP is about 64% higher than the same-tier GPU two generations ago. That seems notably higher to me.
Not to mention the ecological impact of A.I. But cool cardboard box, Jensen.
To be completely fair, energy efficiency is much more important than power draw. If a card takes 50% more power but completes the task twice as fast, it's a net positive for the environment
@@Mondainai Indeed, GPUs at idle pull very little power. It's only under 100% full load that the TDP is reached.
Steve is standing oh no
HE’S STANDING😱
All performance and analysis has basically been leaked.
This is all we needed to know
Nvidia crowbar, u think?
ping me when Steve is levitating
I’m terrified. Sit down, Steve! You’re scaring us!
here we go again!
Frame smoothing is by far the best description of frame generation I've heard so far.
It’s better than motion blur but not quite new frames.
One could use that to "bully" Nvidia fans with "Only smooth brains need frame smoothing".
Not brain smoothing?
Nothing new, it's very similar to the interpolation TVs have used for ages. Quite unsuitable for gaming if you're as picky as me.
I was thinking: if frame generation could be used for rendering in Blender, that would be great. Then you could render every 2nd frame and in theory get twice-as-fast renders, and you don't have to worry about input lag.
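As a toy illustration of that idea: render every other frame, then synthesize the in-between ones, and since it's offline there is no input to lag. A naive 50/50 pixel blend of neighbouring frames is sketched below; real interpolators (optical-flow based) are far smarter, and the tiny frame shapes here are just stand-ins:

```python
import numpy as np

def interpolate_midframe(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Naive in-between frame: a 50/50 blend of its two rendered neighbours."""
    mid = (frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2
    return mid.astype(np.uint8)

# Stand-ins for two frames rendered by Blender (H x W x RGB):
a = np.zeros((4, 4, 3), dtype=np.uint8)      # dark frame
b = np.full((4, 4, 3), 200, dtype=np.uint8)  # bright frame
mid = interpolate_midframe(a, b)
print(mid[0, 0])  # [100 100 100], halfway between the neighbours
```

A plain blend ghosts badly on motion, which is exactly why real frame generation needs motion vectors, but the no-input-lag observation holds for any offline interpolator.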
They basically fed a 4090 with ALL the growth hormone and said job done.
Frames per Watt/$/core are the real measures of improvement.
Looking forward to the 24 GB 5080 Super Ti review in 6-9 months
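The frames-per-watt/$ idea is easy to make concrete. Using the rough figures floating around this thread (~30% more performance, 575 W vs 450 W TDP, $1,999 vs $1,599 MSRP; assumptions, not benchmark results), the generational gain in efficiency is basically flat:

```python
def perf_per_watt(fps: float, watts: float) -> float:
    return fps / watts

def perf_per_dollar(fps: float, price: float) -> float:
    return fps / price

# Normalized: 4090 = 100 "fps", 5090 assumed ~30% faster
fps_4090, fps_5090 = 100.0, 130.0
w_4090, w_5090 = 450.0, 575.0        # TDPs in watts
p_4090, p_5090 = 1599.0, 1999.0      # MSRPs in dollars

ppw = perf_per_watt(fps_5090, w_5090) / perf_per_watt(fps_4090, w_4090)
ppd = perf_per_dollar(fps_5090, p_5090) / perf_per_dollar(fps_4090, p_4090)
print(f"perf/W vs 4090: {ppw:.2f}x")  # 1.02x, basically flat
print(f"perf/$ vs 4090: {ppd:.2f}x")  # 1.04x, basically flat
```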
Don't see that happening, for that Nvidia needs to feel market pressure. RTX 5080 will remain uncontested. AMD is a dud, scared shitless to even release the already widely stocked RX 9070s sitting on store shelves in some backroom. People waiting on RX 9070s are going to be enraged when they know the MSRP. Ain't no way Lisa Su will be aggressive on pricing, that was a pipe dream from AMD loyalists, also believing liars like Frank Azor. The only reason why Nvidia went aggressive is because Jensen is ultra competitive, he doesn't like to cede ANY ground if he can help it.
or next year
at an affordable $1500
@@mclarenbutton6342 100% speculation, no real basis to assume things before we have benchmarks.
@@Tomazack that's 100% copium
Steve standing should be considered a benchmark leak
Yeah, he obviously knows how it performs, so everything he implies pending testing is likely true: the revised 4.0 Frame Gen is not black-magic free performance, raster gains per $ are minimal, etc. Overall it sounds like we should be prepared for a disappointing generation.
Foreshadowing
Agreed
Agreed plus ultra
Ha that took me a minute to get 😂
Steve has risen... it's framegenover.
That said: _roughly_ 30% more performance with 30% bigger die, 30% more cores and 30% more power used.
One of the graphics cards of all time.
And you get to pay a bare minimum of 25% more for the privilege. Isn't that great!? Hail Jensen!
@@Kryptic1046 I'm waiting for the release so I can have a good laugh at the scalper pricing. Will we see 3K+? I think its possible.
So basically 2 years and all they have managed to do is the same chip as 4090, but bigger. What are the architectural designers & engineers doing at Nvidia?
It's the integrated SLI era of giant double cards all over again.
@@Techcensorshipbot Oh it's gonna be somethin'! I'm expecting an absolute sh*t show. I'm thinking 3 grand (or more) is a safe bet for scalper pricing as well.
It’s Hardware Unboxed, not Unboxing, I’ve been lied to
They don't do unboxing, as usually the box is already unboxed.
Truly an unforgivable moment.
Gotta unsubscribe this moment.
Fake Unboxing frames
Actually they didn't lie at all; it is "Hardware Unboxed", so they already have everything unboxed and ready for review haha. Them unboxing it is just complimentary fan service.
3 years and 30% more performance for 25% more money. So after 3 years we barely get better performance for roughly the same value.
Amazing.
The more you buy, the more you Nvidia
25%? lol only if you can get the FE from Nvidia. The AIB cards will be 30%+, not to mention the scalpers will do their thing to make things even worse
@@TheEchelon People who support nvidia buying these things deserve to be scammed.
It’s actually 33% more expensive since 2000/1500 =1.333! So apart from getting more VRAM there is ZERO generational improvement in three years. NVIDIA are laughing all the way to the bank.
@@jacklowe7479 4090 is $1600 and 5090 is $2000. 5090 is exactly 25% more expensive than 4090....
2k is a hard nut to crack for the quality of games we're getting now.
Thumbs up for calling out 5070 = 4090 nonsense. I have the same stance. I have nothing against frame smoothing, but that is what it should be called, and the marketing is extremely dishonest. But it's nothing new, it's Deja Vu from 40 series launch.
Crazy that they can get away with giving misleading marketing just because there's a small ass disclaimer (which 90% of consumers dont see because the GIGANTIC GREEN HEADLINE says 5070 = 4090)
@@toad7395 They flooded YT with DLSS 3 comparisons after 40 series release. I don't even remember a disclaimer, or any indicator that FG was used. Just a massive FPS boost for turning on DLSS 3... It worked so well, of course they are just going to repeat that...
Shameless Nvidia will literally do anything for money. There is zero value in being honest. I hate Nvidia so much. (I have a 4070 Ti Super 😑)
Considering it's literally _generating_ frames via AI the name is fitting. Smoothing makes it sound like it's blurring everything together.
@@toad7395 Misleading? They literally said it's thanks to AI that it can achieve that performance.
Nothing better than thinking of the ecosystem while unpacking a product that pulls 600 watts...
Exactly
It's funny to see it become a small space heater. Rip for those with high electricity bills
What does that have to do with it? If it pulls a lot of power, it needs to have a lot of plastic waste in its packaging?
@lukastemberger Because reducing the packaging is "eco-friendly", yet constantly pulling 600 W has nothing eco-friendly about it. It's like phone manufacturers taking chargers out of the box just to sell them to you separately for more money.
@@OneDollaBill lmao, 600 W is nothing given the "eco-friendly" context. You're easily spending double that on AC in countries with hot climates, or on heating.
And nothing on a consumer level comes even close to making a dent on corporate/office/industrial power consumption.
30 pct more performance and 30 pct more power consumption. Back in my day, we called that an overclock.
You're not even exaggerating, my q6600 overclocked to 3.6ghz, that was 30% 😂
well akshualllyyy, overclocking was never a linear scale. 10% extra performance for 30% more power consumption was usually the deal. Because VF curves etc.
and on top of that NVIDIA is blessing you with 25% more value too!
Ermh... Usually overclocking doesn't scale linearly.
@@Bellonii *Less value
Here's what scares me the most: reading around some threads, a lot of people are of the mindset of "Ignorance is bliss, I don't care if the frames are generated". Some people disagree because of the introduced latency, or the low quality of the generated frames, but pretty much nobody talks about the biggest issue, which is that the game still only samples the player's inputs at the real frames. If people are so easily misled by frame gen, and Nvidia learns that this tactic works, it may have grim consequences for all of us.
Concern trolling troll with a troll avatar.
We are firmly in "car downpayment" or "decent vacation" category for the prices of these cards. The 80 series isn't far off either.
The vacation will create much better memories than a GPU. 👍🏻
30% faster 30% more power 30% more expensive, no thx
Rumors are that the 5090 models you will actually be able to buy, will be much more than 30% more expensive...
With that said, you wouldn't ever have bought one anyway; some are just poor.
well the 1440p cards are better at least
@@aggies11 30% more expensive than the 5090 Founders? Or just 30% more than the 4090?
@@aggies11 That implies you can get a new 4090 at the MSRP.. good luck with that.
Nvidia: ‘we are nearing the limits of raster improvement’
Everyone with a brain: ‘well then maybe you shouldn’t have the xx80 be half the cores of the flagship, that sounds like leaving a lot of performance on the table right off the bat’
yup
They should have just had a Ti version right off the bat
lol exactly
Yeah, but they need all the good chips to send to AI vendors before that bubble explodes. Gaming is an afterthought for Nvidia right now. With the exception of the 5090, everything else is just leftovers.
The more you buy…
I find 5090 to be a very impressive piece of engineering and visual design. I'd buy one for $1000 maybe.
Bro you won’t be able to buy it at 2k it’s gonna sell for 2.5k$+ at a minimum
@@youssefmohammed5456 You're off by 50%. The only way to get a 5090 will be from eBay, and the price will be $5000
2030 it will be down to 1k
@@youssefmohammed5456 You really don't get a joke
@@Greenalex89 and it will be in the minimum specs😂
Nvidia doesn't want you to understand that they're stagnating very hard
And AMD is still behind.
@@jbrou123 The difference being that AMD is not "valued" at 3+ trillion USD
They focused on AI performance.. it makes a lot of sense to focus on that when 95% of their income now comes from AI datacenter lol
Actually, this is the expected behavior of a monopoly. This is the whole reason people say monopolies are bad.
Yeah the guys constantly researching and developing stuff are stagnating just because they didn't switch to 3N and increased the prices of all their GPUs...
Haha these prices are insane. Considering the state of AAA gaming these days, these hugely expensive graphics cards don’t seem like a good proposition to me.
Agreed.
Heard that about 4090 "It's not meant for gaming" they said.
Exactly. First such comment I've run into. Who spends 3000 euros on just a GPU, just one part of a gaming PC? So probably more like 10,000 euros for a high-end gaming PC, in order to play buggy, poorly optimized, woke, copy-pasted, boring AAA games with 200 generated frames per second. But people like that clearly exist. I guess some are just so rich that money loses meaning for them; I can't think of another real reason.
Agreed. Playing only about a couple times/week these days
100%!
Finally a Hardware unboxing on the Hardware Unboxed YT channel!
Oh no, what's next?! A gaming video on the Gamer's Nexus channel? Don't give me hope!
It's normally referring to pulling down the boxers of hardware.
there goes my fraud case
Unboxed. So it should already be out when he shows stuff. Technically.
The 5090 packaging looks like something you'd buy your dog for his 10th birthday
Yeah I was thinking cat scratch thing.
Tbh tho I actually like that it is recyclable
+2
Yep tbh the 4090 box was way better.
💀
eco friendly packaging for a very unecofriendly card 💀
@Jojooooooo it's almost like Jensen wanted to give us a laugh. For a $2000 gpu.. the packaging is probably 0.1% of the overall impact on the planet of that GPU. The fabs. Can we manufacture the silicon in an eco friendly machine? No. 😑
It's cringe how they pretend to be all eco with the reviewer packaging, but normal users will receive a box full of plastic bags
11:13 Absolute truth.
Frame gen tech is supposed to be the “cherry on top”, so to speak. Once you blow away the audience with the rasterization power, you _then_ bring up how much more _supplementary_ power the feature provides. They’re doing it in reverse here.
More like the game developers are giving them no choice.
Pushing for more raster is useless when the CPU cannot keep up. 4080 vs 4090 already shows this issue: the performance uplift of the 4090 over the 4080 did not reflect the CUDA core increase, even at 4K. Frame gen happened because they want to bypass the CPU bottleneck.
@@arenzricodexd4409 Well said.
@ so you bypass the cpu bottleneck with AI generated frames…?
That doesn’t make sense to me at all. Because at the end of the day the GPU can’t “replace” what the engine outputs. It can “guess” which will inevitably lead to things like input lag or ghosting.
In the end this just seems to be a marketing front to push their agenda. If you can’t make any noticeable gains in rasterization at this particular time, don’t release a new product. It’s that simple.
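The CPU-bottleneck point in this thread reduces to simple arithmetic (the fps caps below are made-up illustrative numbers, not benchmarks of any real card):

```python
# Toy model: the real rendered fps is capped by the slower of CPU and GPU,
# so extra raster power past the CPU cap is wasted. Frame generation
# multiplies the *displayed* rate past that cap without more real frames.

def displayed_fps(cpu_fps_cap, gpu_fps_cap, gen_multiplier=1):
    real_fps = min(cpu_fps_cap, gpu_fps_cap)  # CPU bottleneck limits real frames
    return real_fps * gen_multiplier

print(displayed_fps(120, 200))     # 120: GPU raster past the CPU cap is wasted
print(displayed_fps(120, 260))     # still 120: a beefier GPU changes nothing
print(displayed_fps(120, 200, 4))  # 480 displayed: the cap is bypassed visually
```

Which is why both sides of the argument are partly right: frame gen genuinely routes around the CPU cap on the display side, but as the reply notes, it can only guess, not replace, what the engine would have output on those frames.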
No frame generation on day 1 reviews.... THANK YOU! Definitely a secondary, perhaps tertiary feature!
Steve's standing position is clearly AI-generated
Here in Europe the card is 30%+ more expensive, so we don't get any improvement in price/performance. Unacceptable.
Wait until initial hype dies down, 4090 sold close to MSRP for a long while but it took a bit after launch.
@@anitaremenarova6662 I am not interested in the 5090 but in the 5080 where the improvements are only around 10-15% and the difference in price is even more insane!
Actually, I never saw a 4090 for less than 2000€
@@anitaremenarova6662 In what universe? It's always been around or above 2000, and sadly still is. After the 5090 benchmarks are released, 4090s are going to go up even further.
@@mondodimotori Really? I got the 4090 Strix for 1026€, no VAT.
Idea for an alternate channel or series: Hardware Reboxed, where you do a moratorium on the previous generation and how you felt the various cards/chips were overall. Then you put them back in their boxes, where they remain until you need one for benchmarks or something.
I love this idea! Make it happen Steve!
Ban the previous generation? Or do you mean post-mortem?
He's standing there ... Menacingly!
30% mo performance for 30% mo money. Classic Nvidia.
we dont even get a single 1% for free
You're not getting less per dollar, be happy about that, could be worse.
This is going to be my 9th year running the gtx 1080. I will upgrade the cpu this year though, the 6600k is showing its age.
Legendary combo
Your CPU is showing its age? And how is the new Indiana Jones game running on your "not showing its age" GPU?
Need money for gpu?
Go with the Ryzen 5 7500F, very budget friendly, and you can hop on AM5 for later upgrades; or the R7 7700, they've been quite affordable lately; or the Ryzen 7 9800X3D for that gimmicky highest-fps 1080p.
Hah. Just last week i upgraded my 6600k to 9800x3d. Still rocking with gtx 1080 😁Upgrading to 5070ti i guess
steve is standing I think thats a credible leak for the 5090 performance
Regardless of gaming performance, 575 W is a big no for me. Even if, say, 6 years from now I could get a used one for $300, I just wouldn't want it in my room; it's just too much heat.
It's no different from running a 700-watt microwave.
@@kravenfoxbodies2479 And people don't have 700w microwaves running in their room for hours
@@kravenfoxbodies2479 You run a microwave for 1-2 min, not hours.
But the package is eco-friendly!
One of the perks of living in northern Europe is that you don't have to care about this heat. I need electric heating 98% of the time. During those 2%.. I don't want to game.. but spend time outside.
Though it's slightly mind-boggling that the GPU generates 1/10 of the heat of my sauna unit.. which I use to heat one room to 110°C (230°F).
It's a great improvement for companies using it for machine learning. Extra vram really helps there.
They will gladly buy 5090's, for consumers who want to game, probably not so much.
Hopefully not too long after the release we get performance figures in local runnable AI applications. I'm really curious how it stacks up against the 4090.
with so many more CUDA cores it'll make a killing no matter the clock speed 😂
Seems like a product mainly aimed at professional use i.e. AI and 3d rendering that's throwing a bone to rich gamers than the other way round. I know the 90 series has always been a bit like that but this one seems more so (so a titan card essentially)
Yeah 100% right. I am even thinking of selling the 4090 to get a 5080 later on, due to lack of professional use at the moment and only gaming features being used. I guess I can get more money "back" in the process and get more frames, though generated.
The 4080 is only ten percent behind the 4090 at half the price; a much better gaming card if you don't need tons of VRAM for productivity (since DLSS knocks down VRAM usage massively). The 5090 is basically just a cheaper (and probably a bit faster in some scenarios) RTX 6000.
@@PiotrBarcz yeah, it will definitely be interesting to see the reviews of all cards, including AMD in march and 5070 and ti soon.
I had both the 4070 Ti and the 4090, and honestly the 4070 Ti is enough; all else was "nice to have". So I guess people with the 4080 made a great choice. And the 5080 will probably be much better for most people, given the halo card costs double.
@ yeah I've been thinking of getting a 4090 build in the future but frankly the 4080 makes so much more sense I would be an idiot to get the 90 with my financial situation anyway XD
This is going to be hell in summer with 600 watts
Luckily in my country one needs heating 98% of the time, plus I could probably route this heat to my sauna
The amount of heat it will dump inside a case may require people to rethink their overall cooling setup.
People who spend $2000 on a graphics card can likely afford air conditioners.
This is where there is much value to getting a dryer vent and pipe the heat out the window. Will save you a few bucks a month in running the AC harder. Reminds me of when I used a dryer vent to cool my PC with 20 degree air from outside. Just change it from the exhaust in summer to intake in winter.
WAT
Steve is standing again...let me grab some popcorn 😂
cliché like me post
@ScotsmanGamer indeed, I'm a talentless unoriginal twat 😂👍🏻
@@IlMemetor72 fair play in your honesty!
2g for a card is too much.
Even if they had priced it at last generation's price with only a 30% uplift, it would be too much for me to upgrade from my 4090.
keep in mind the value of the US dollar has dropped 100% in the last 5 years.... losing 20% of its value each year..... this is not a joke or exaggeration..... $100 5 years ago is now worth $50..... so 2k today is really 1k..... people are ignoring inflation with these prices.
for 100% more performance...i would.
for 35% performance, i pass.
@@user-ev7vh2is6b "dropped 100%"... What can you say "dropped 100%" about? It's about your education. When you dropped 100%, you always get 0.
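The reply above is right that the math doesn't work; a quick compounding check (illustrative arithmetic only, not actual US inflation data) shows the two claims in the original comment contradict each other:

```python
# Compounding check on "losing 20% of its value each year for 5 years".
# (Illustrative math only; real cumulative US inflation 2020-2025 was far lower.)
value = 100.0
for year in range(5):
    value *= 0.80           # lose 20% of the *remaining* value each year
print(round(value, 2))      # ~32.77, i.e. about a 67% drop, not 100%
# And "$100 is now worth $50" would itself be only a 50% drop:
print(100 * (1 - 50 / 100)) # 50.0 remaining = a 50% loss
```

Either way, "dropped 100%" would mean the dollar is worth exactly zero.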
You usually end up spending the same amount of money over the years changing GPUs anyway
They shouldn't get upset. It's called Hardware Unboxed, which is past tense; it should only show things that have already been unboxed, my man.
Frame smoothing, I wish they would call it that instead of frame generation. Hits the nail on the head perfectly.
The design of the 5090 FE is really cool, can't wait to see how well that 2 slot cooler dissipates 575 watts.
Probably not very well
CPU is going to be in hell
It will do it with room to spare. Watch Gamers Nexus interview covering the design and function of the cooler.
Where's a hairdryer emoji when needed. Or space heater. Or blast furnace. 🥵
Probably will do much better considering the airflow is totally unobstructed. Just get good exhaust fans and maybe make a shroud inside of your case to move the air more directly to the exhaust and you'll be good.
My feelings are that I’m still happy with the 3090 I bought for $700USD.
I mean, that's as good as the 1080 Ti's price-to-performance ratio
@@ct2651 He did better - plus 24 GB of VRAM. That's huge. I have a 3090 and a 4090. Love my 3090. I won't be giving Nvidia any money this time around.
24 GB of VRAM. Good performance. No need to upgrade unless you are going to the 5090.
I know, right? My brother got an RTX 3090 after the crash for $580 used. I also got an RX 6950 XT for the same price.
3090 is great at those used prices. The 24gb vram makes it a great gaming and AI enthusiast card.
We got Steve unboxing on Hardware Unboxed before GTA 6
Happy to see the level headed take on frame gen just being a smoothing post process, this kind of honest reporting is the reason I'm subscribed.
I like the green themed studio lights for Nvidia products: I am guessing you will do blue for Intel and Red for AMD. Maybe the whole studio should be a bit brighter though to compensate or just have some parts with the Green/Red/Blue theme?
I mean at this point i really do believe Nvidia engineers when they say they struggle to get the sort of raw performance uplift we've seen previously. We are encroaching on the point of diminishing returns, and thats down to straight up physics and material science.
This. Everyone rolled their eyes when Jensen said Moore's Law was Dead, but even AMD and Intel are failing to roll out significant generational improvement. I feel it will take new innovation to see the jumps we saw previously, but in the meantime, I'll always be entertained by keyboard warriors who feel that the best engineers in their field are somehow inept at finding the improvements they feel entitled to.
Of course, but people seem to ignore this fact. Nvidia knows this, that's why they push the AI stuff so hard. You can't shrink the process node infinitely and it becomes increasingly difficult to do so. They went from 5nm to 4nm. Next generation they'll probably go to 3nm. If you reach 1nm, what then? AI is basically the "backup plan".
I think the good counter-example to this is Intel 14nm++++++ era. It's because nVidia is a monopoly and is making almost all of their money on AI, so they're dedicating their engineering time and building their GPU architecture to optimise for that. Look what AMD did in the CPU space with a brand new architecture - performance gains that Intel essentially said for 5 years were impossible. This is economics, not physics. We wait for the monopoly to break.
@ you contradicted your own argument with that.
@@MinishMan A company having the best product doesn't make them a monopoly. You could buy Intel ARC or AMD GPU.
Please just test with native 4k to force them to give us an actual upgrade next generation
I think it's called Moore's Law, we are reaching the limits of silicon. It would have to be a total shift to something new, like optical computing, or dare I say quantum computing (imagine the cooling needed for that?)...
Just gotta test with the exact same settings to get a good comparison. Else u comparing apples with oranges, but I trust these guys to do this.
They usually do both so I am pretty sure we will see no DLSS/no MFG, DLSS/no MFG and DLSS/MFG results.
👍
@@asianck I mean, no reviewer is only going to do tests with frame gen on. I don't think that's a concern, really. Judging by how people are talking between the lines in their pre-coverage, we're in for a real disappointment when the 5080-and-below numbers come out.
I am looking at buying a 5090, but I am coming from 30 series. Upgrading from a 4090 to a 5090 seems like insanity to me. Also good to hear that you won't be looking at frame gen initially. I'm much more interested in RT performance which is one of the major reasons I'm looking at upgrading at all.
I’ll wait for the 5080ti. There is a huge gap between the 80 and 90. I’d like a card at 1200 or 1300 in that spot
@chrishewitt5826 There was room for a 4080 Ti too; huge gap, it never came
@@chrishewitt5826 It's not a bad shout. I wouldn't be surprised to see a couple of cards slotting into that gap. Maybe a Ti and a Ti Super.
Well, the RTX 5090 looks more like an architectural enhancement than a leap forward in gaming. GDDR7, bigger bandwidth, and more memory give the 25-30% uplift, BUT at 30% more money... Seems Nvidia replaced Moore's Law with "You want more transistors? Pay for them"... For many of us gamers, a used RTX 4090 is the way to go...
Only reason going for a 5090 is because you have a 4k set up for gaming. Otherwise a 4090 in 1440p is overkill.
Jensen's Law: the price of GPUs doubles every two years.
A 4090 is required for ultra 1440p at a stable 60+ in EVERY game
@@nahbro3240 I have a 7900 XTX and can't even run every game at native ultra in 1440p
@@nahbro3240 As far as gaming goes, VR enthusiasts will be ready to jump on these 5090s. Hopefully it's closer to 40-50% in raster, but even a 30% uplift will be a difference maker for the high-resolution VR headsets available nowadays. Especially since DLSS and frame gen are not utilized in VR, and we need that pure raster to power headsets that are 3-4x the pixels of 4K.
We've gone through the miniaturization phase of computing, now we are going back to room sized super computers. Now we just need personal Fusion Batteries to power everything.
2 slots isn't THAT big
@@aerithgrowsflowers 575 Watts is. Mental.
Yay! Finally a GPU priced more than my first Corvette.
You bought a *_Corvette_* for less than 2 grand? Did you get some kind of 24 hours of Lemons super deal or did you buy it way back in the 50s / 60s when I think 2 grand was the starting price?
@@levygaming3133 You can get a used car that's still perfectly usable for about 3000 euros, like a Golf or such. If I could choose which one to get for free, I would pick the 5090. And would sell it immediately.
@@manefin I know full well that you can get _some_ kind of used car for 2-3 thousand dollars, but he specifically called out a Corvette, which is an expensive American sports car (typically ~60k MSRP), and I was extremely doubtful that _any_ corvettes have depreciated to the point of being that cheap. The only way I think it could’ve been that cheap is, like I pointed out in my comment, way back in the past before _* a half century *_ of inflation, when the MSRP for the base model _might_ have been 2k, but even then I think it was probably closer to 3-3.5k, at least Shelby’s of that vintage were about that much (with an engine package being scrapped because it would’ve raised the price by a full $500, which was a lot of money in the 60s).
@ I'm sure there are old used Corvettes for sale for about 3000 euros in the USA.
Do you and Tim rotate who gets to have the gold play button?
You can order multiple ones. No need to rotate.
I love how compact the card is this gen.
That's just the FE. Go watch Linus's video "Every RTX 5090 at CES 2025" on the board partner cards. Some of them take nearly 4 slots.
@@jbrou123 Skill issue on the other brands' part xD
At this point, with the amount of money these GPUs cost, they should be the whole PC and all you do is plug it into the wall.
8:50 BUT BUT IT has DLSS 4 so it can be 4x faster, Jensen's Jacket told me so!
A lot of the 40 series improvement was from the TSMC node improvement. Without that size of leap in the manufacturing tech, we're seeing that the chip design itself isn't advancing that much.
Blackwell doesn't improve rasterization per CU. It technically is better per clock, but 50 series has lower clocks than 40 series so it ends up being a wash. RT is a 30% improvement per CU, and tensor is vastly beefed up. But no one seems to care about that. Rasterization isn't a focus anymore nor will it be going forward.
Standing Steve is never a good sign
Steve is standing, light the beacons request aid from Rohan!
So +25% performance, +25% price, +25% larger die = 0% progression. Sweet... Let's hope the 9070 XT will at least be value for money.
You get more performance anyway though; you're not paying MORE for the performance, you're paying the same per unit of performance. Still a win in my eyes, better than going to worse.
@@PiotrBarcz lol, that's not a "win", that's the definition of stagnation. "More for less" is progress (see 3070 vs. 2080 Ti, 4070 vs. 3080); you could even make a case for "same for less" (the 7800 XT was basically a 6800 XT at a discount). But "more for more" is just bad value, even at this ridiculous price point. It's not even more efficient, for crying out loud.
@@PiotrBarcz Nope it isn't, it's like walking without moving forward.
@@jakacresnar5855 Welp sorry to say but the world ain't what it used to be. And yes it is more efficient, at cooling.
@@gvd-l3o Still more performance; that's a win no matter how you look at it. More is more, and it doesn't matter in the end how much it costs. Get a good-paying job and then buy the card instead of complaining about the price, since the world is already backstepping.
FPS initially - Frames Per Second
FPS now - Fakes Per Second
😂
P.S. Seriously though, "fake" is incorrect; they should rather be known as generated frames. GFPS, maybe. 😊
Now imagine, guys, that the card's price wasn't inflated: its MSRP announced at $1000, with the 5080 announced as the high-end GPU at $600. What a time that would have been to live in.
Now imagine a world where prices are determined by what dorks on the internet say they should be based on “reasons”.
My thoughts, it's another semi-stagnant gen at least for what we've been shown. 5070 will probably only slightly outperform a 4070 super. Multi frame gen seems okay if you have a 240+hz monitor not so much if you have lower. DLSS transformer model looks good so that's something nice for all RTX users. Let's hope FSR4 and 9070 exceed expectations and give people real value.
100%. This is a stop-gap generation for Nvidia, similar to what the 2000 series was: marginal generational improvements with a few extra features; it'll probably age very badly. The Nvidia 6000 series and AMD's UDNA is where they'll bring out the big guns and get back to 40-50% gen-over-gen rasterization gains.
This still looks better than the 2000 series did at launch, but Nvidia actually managed to launch ray tracing with that series despite it being terrible at the time. Time will tell if DLSS 4 is also something people dislike at first but that becomes popular.
Well, people expect the RX 9070 XT to be a cheaper 7900 XT, and FSR 4 to be better than FSR 3. And that's about it. Though it's not much different with Nvidia. I just always find it interesting that when people talk about gaming GPUs, they never talk about video games.
@ This is a hardware review channel, it's not that confusing why people are talking about GPU's.
@ I meant that generally there's a huge focus on the stats/data of GPUs rather than on the video games themselves, you know, the thing gaming GPUs are made for. To me at least it's a bit funny. Maybe to others it makes perfect sense.
2:56 The timestamp at which the GPU is finally unboxed and thus lives up to the channel name. You're welcome.
Thank you. Monumental moment for the channel
They need a melody and a graphic for when the item is completely unboxed.
Finally, somebody said it out loud: "Hate how it's being marketed, and how it claims to be an outright performance-enhancing feature". 👍👍
Every tech channel: "Oooh, aahhhh... packaging!"
Every fanboy: "Take my kidneys!"
Me: "Let's take another look at AMD".
Steve will now spend the next 3 days in his man cave benchmarking this thing, good luck Steve, don't forget to sleep mate.
sleep is for people without hobbies
IMO, most tech tubers have already completed benchmarks and are just waiting for the embargo to lift. The 24th will have hundreds of videos with results.
He don't need sleep, he needs answers :)
It makes me sad that I've already seen tons of memes where they make fun of 4090 users because they genuinely believed that the 5070 will match the performance. And I freaking hate this kind of marketing as well. AI generated frames are not performance.
Nvidia thinks we only play brand new AAA games. All that DLSS 4 "magic" is completely useless in games before 2018.
Why would you buy a new GPU to run an old game? That literally makes no sense. I think Nvidia knows their target audience.
@LukewarmEnthusiast I don't know... Maybe because we have 4K 240 Hz and even 7680x2160 240 Hz monitors. A 5070 will be way too underpowered for those even with older games. And I don't even mean very old games; there are tons of 4-5 year old games without any kind of DLSS support.
@valentinvas6454 That's the reason they emphasize frame gen and MFG so much. DLSS and FSR are usually reserved for newer games; most devs aren't going to go back and implement an upscaler in an old game, but you can usually find a mod that adds frame gen to older games. With the 5070 supporting MFG, you may reach that 4K 240 Hz target you're aiming for. You may not like frame gen, but it does improve performance.
@@LukewarmEnthusiast So do I have to attach my old GPU to play my old games? What are you talking about?
For AI frame gen, as you said, we need a higher raster fps so the frame gen can work as it's supposed to. So raster is still king.
Exactly. Lower fps = larger difference between two frames = less accurate motion interpolation, causing more artifacts
Which means games have to be optimized so they can actually crank out that much FPS on top of the line hardware. The devs are counting on Nvidia's optimizations to fix their stupid game's performance.
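The "lower base fps = worse interpolation" point above can be shown with a toy 1-D example (the acceleration value and "pixel" units are made up purely for illustration):

```python
# Toy illustration: linear interpolation of the midpoint between two real
# frames of an accelerating object. The interpolation error grows with the
# square of the real-frame interval, so low base fps means worse artifacts.

def midpoint_error(base_fps, accel=2000.0):
    dt = 1.0 / base_fps                       # time between real frames
    # True position under constant acceleration: x(t) = 0.5 * a * t^2
    x0, x1 = 0.0, 0.5 * accel * dt ** 2       # the two real frames
    true_mid = 0.5 * accel * (dt / 2) ** 2    # where the object actually is
    interp_mid = (x0 + x1) / 2                # what linear interpolation assumes
    return abs(interp_mid - true_mid)         # positional error, in "pixels"

print(round(midpoint_error(120), 4))  # small error at a high base fps
print(round(midpoint_error(30), 4))   # 16x larger error at 1/4 the base fps
```

Halving the base frame rate quadruples the error of the guessed frames, which matches the intuition in this thread that frame gen only looks acceptable on top of an already-high raster fps.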
Looking forward to the reviews. I would expect at least a minimal DLSS 4 multi frame generation review after trashing it for weeks; it deserves a quick peek. The 50 series is going to trounce AMD, so there's no need to be disappointed unless you're an AMD fanboy who hates being bested.
27% uplift for 25% more cost (if you're extremely lucky!), yet 125 W more, so essentially we're paying for every percentile of uplift, which in turn is only enabled by roughly commensurate extra CUDA cores AND wattage, disguised under some fake-frame tech. Poor Nvidia, poor.... even LAME!
Thank you for having a genuine, no nonsense stance on frame generation
It's going to be around 3000€ in the eu
Same cost as rent around here... and both those thoughts give me anxiety
2025 is wild, wild times, man, lol
Def not a product with this consumer in mind, so a biased perspective for certain. Wonder how PC-enthusiast-level spenders feel about the 5090's price-to-perf.
Pretty card, sure it will run beautifully. 2k USD.... just... damn....
~a random canadian subscriber dude
This "eco friendly" packaging doesn't make up for Nvidia pushing 8GB GPUs that will be e-waste in a few years.
Don't say! Nvidia saves the memory… so it is ecological!
😂
Saving is good!
And ofcourse… The more you buy, the more you save!
So whole Nvidia ecosystem is eco friendly!
… could you now put that gun down… I was a good boy like you said… ?
10:30 wow that Wukong artifacting is SUPER noticeable, that's way beyond youtube compression.
Looks terrible.
That IS YouTube compression, actually. I see it on every video shot at 4K 60 these days; any movement like that is going to look like trash in the fine detail. YouTube caps its bitrate at around 40 Mbps for most videos, and it takes at least 50-60 to get clear visuals on detail like that during fast movement.
Unbox the box on unboxed.
I think that if one can claim the 5090 with DLSS 4 achieves over 2x performance while being ~30% faster, then someone else can test the 4090 with alternatives like Lossless Scaling and push that to its extent... I won't say 20x frame gen, but 10x would be hilarious. The Nvidia team would be like "wait, that's illegal".
Personally I anticipate it won’t be worth the upgrade for those with a 4090 but for those who still own the 3000 series maybe you could do some mental gymnastics to make that price work
World class gymnastics required :)
The engineering of the cooler to be 2 slots by itself is incredibly impressive.
I can't be the only one thinking: Show me it all at once. Don't give Nvidia the extended free marketing with small pieces here and there to keep the cycle of news focused around Nvidia for an extended period of time.
Steve Standing? Oh lord we about to witness something great
The ONE time i decide to buy a flagship GPU and the box looks like I bought a refurbished toaster 😪
The RTX 5090 needs its price dropped to $1300
Just watch it go out of stock in a few weeks while selling for 2500 dollars
Nope, if anything a higher price, and it will come if the 5090 stock leaks are true. People will spend even 5k if needed for a 5090 or 6090; the cost is really just limited by availability.
U can drop it to MSRP $1000 and it will still be scalped and low in stock, it will still cost $2000😂
@adityaverma6754 Weeks? God, I'd expect hours
@@adityaverma6754 Joke's on you, it's already that price in Europe at release. So we will get 3000
I am happy I decided on my RX7900 XTX and I am enthused by FSR 4.0. I have no issues with using a Nvidia GPU, but can't justify changing back this gen, at least so far. The last time I had one was my trusty GTX 1080, what a great card.
Looking forward to your reviews Steve, thanks to you and Tim for all the coverage. You guys are the best!
I hate almost everything about this card: the enormous price, the LOW extra 30% performance, the HIGH power usage BUT
I LOVE how compact it is, god darnz it!
2:20 Wouldn't be surprised if the connectors still burn out
Unboxing in the Unboxing channel?
HERESY!
That's not what the channel name says, read it again
Finally. Hardware Unboxed, unboxing hardware.
Very honest and to the point review. Love it brother!
It's very misleading of Nvidia to say the 5070 = 4090 performance using DLSS4 AI FG, when the 5080 won't even match the 4090.
Upscaling? Ok, maybe on single player games.
Frame generation? Absolutely NO! NO and NEVER!
Especially if I'm paying that amount of money, I just want true performance via rendering. I don't want any "trick" that boosts "numbers" instead of real performance.
We want high fps for lower in-game latency, so we can respond to anything in the game fast enough. But FG fps is the absolute opposite of that: it "holds" the next frame to create "in-between" frames, so it creates more latency.
I mean the "30% more performance for 25% more price" statement doesn't really hold, though, because do we really expect the 4090 to hold its value at 1600?
It should drop unless people are silly and pay unreasonable used prices.
This product is dogshit. They took their foot off the gas because AMD exited the high end.
Put a slow-mo camera on it, side by side with a card that doesn't guess 3 frames in 4. See if the GPU knows you've moved or fired during the guessed frames. Latency?
I think this is the first time I see an actual unboxing at Hardware Unboxed!
So happy these new cards came out, so i will eventually buy a RX 7800 XT for 300 euros in 6 months or so. 🙂
i'm so happy that these unboxing videos exist, so that i don't have to unbox my own 5090 and can stick to my old 970 gtx!
you took 9 minutes to repeat "30% faster for 25% more money, again 30% faster for 25% more money"
"Turn your struggles into stepping stones for greatness."
I would like a comparison to the 3070, or at least the 3000 series, as I think many skipped the 4000 series