When the 4090 came out I was among the first to buy it. But the 5090 seems different... guess I'm gonna skip this gen. I don't need more "exclusive" fake frames with artifacts plus more latency.
4090 owner here. Forget about getting mine for $800 lol. We high-end GPU buyers are like crack addicts; we're the ones following channels like this to see what's up! Lol
25% more expensive, 50% rasterization uplift. The 5090 is worth $1,499, but I feel like Nvidia just gave it 32GB instead of 24 as an excuse to price it this high. A 5080 Ti with 24GB or 20GB of VRAM and the same performance as the 5090, at CES 2026, like how the 3080 Ti was the 3090 with 12GB of VRAM, is the real halo product for consumers. This is a Titan card.
It is not a 50% raster bump over the 4090. And Titan cards were useless; they had like a 3-5% perf bump over the model under them, they just had more VRAM. The 4090/5090 actually have a crazy perf bump over the model under them.
@@Teletha Whatchu talking about Willis? Titan cards absolutely demolished their regular RTX brethren in compute, especially FP32 and FP64, which is what they were designed for. They were a halfway point between the consumer and Quadro cards and priced accordingly. Granted, the only reason they demolished was because NVIDIA software-locked that support, but still.
@@RobloxianX The 5090 is worth whatever people are willing to pay for it. Just because it's worth $1,499 to you doesn't mean $2,000 isn't a steal for someone, or for a small business looking to do their own tinkering with AI for profit. Just remember the same die can also be put in the professional RTX Blackwell 6000 with probably 64GB of slower RAM for $8,000, or a compute-only B100, where two of them will live with 192GB of HBM3 for the low price of 35k. $2k in this market, where everything is about AI, is actually a good price if you plan on using it for anything other than gaming.
Didn't we get a 30% perf bump going from the 80 SM 3080 Ti to the 76 SM 4080? You don't think the new architecture will give the 5080 a similar 25-30% bump over the 4080? (Yes, I am coping, being too poor for a 5090.)
Not with 30% more power draw. The question is how they manage to keep the power-to-performance ratio the same when the GPU has faster VRAM, a larger die and more bandwidth.
The FE is a balanced, idiot-proof card, nothing aggressive. A serious overclocker will not care about balance or power draw (or even money). They will buy board partner cards, throw the fans and vapour chamber in the bin, slap on a waterblock and tune the f out of the GPU.
To be fair, I think DLSS 4 is almost native if it's at 4K resolution. There are still some issues for sure, but it's very, very close to native now. I could just turn on DLSS 4 and enjoy my 4K 240Hz OLED in Wukong, Cyberpunk and Witcher 4 (when it comes out), and it would just hit that 240fps for me to get the max performance out of my hardware. My 4090's not able to do that without major artifacts, which makes this an instant buy on day 1. Especially that super sexy 5090 FE.
Except it's not. Upscaled 4K isn't native 4K. You could say it looks like it, or close to it, but it isn't it. Also, upscaler tech has to be an implemented feature of a game in order to work, and it encourages game developers to rely on this tech instead of proper optimisation. Basically, Nvidia, AMD and Intel push fake frames to push their own products, and by implementing it in their games, developers save massive money on the labour that would be required to optimise them. And the result is often abysmal artifacting/ghosting, flickering textures, and massive input lag.
Yeah, only multi-frame generation is exclusive to the 50 series. Idk about you, but for me the new RTX 50 is a disappointment, one that we waited 2.5 years for, might I remind y'all. Gonna keep my 4090 until RTX 60 comes out.
@Rubed0_o Yeah, I don't care about frame gen. What's the point of getting more FPS when it's already smooth enough, at the cost of extra latency anyway? We're getting like a 30% performance increase at the cost of a 30% increase in price and power consumption. This feels like a lame refresh.
Great time to be a 4090 owner... You can either sit and do nothing, knowing you basically still own a top-tier card for two more years, or you can sell it off in a month for good money and upgrade for a few hundred bucks. It's crazy how performance improvements are hard-gated behind manufacturing process node improvements... Blackwell on the same node as Ada = near-zero perf/watt increase in raster.
This. I'm still seeing used 4090s go for $1700+ on eBay (recently completed sales btw), we're eating good either way! The only way I don't get a 5090 is if MFG is artifact-filled or thermals suck on the Founders (my 4090 never goes past 80C, it's a miraculous piece of engineering).
Good info. I’ll take all that extra raw performance from the 5090 I can get for VR. The extra video memory and NVENC encoders will be handy for video editing too. Also, I’ll sell off my 4090 after the benchmarks are out 😂
If you're a dedicated, limit-pushing VR gamer, 33% is an absolute dream. Sign me up 😁. Honestly though, unless you game with triple 4K monitors or VR on PC, it's crazy.
@@justinhogue9861 33% is not even close to a true generational leap. For dedicated VR, 4x (double the pixels and double the frame rate) is the standard. I will wait for TSMC 2nm.
@@natedisrud That is the problem, unless someone figures out a software solution. I have played Silent Hill 2 with frame gen in UEVR and it almost works, but there are too many artifacts. Not as much flickering, but not better than no frame gen.
You showed Wukong hitting 30 fps native on the 5090!!! It makes the new cards more confusing. In the same or more demanding scenarios my 4090 gets 30-35 fps at 4K native... Same thing with what they showed in Cyberpunk: native 4K at 25-26 fps, and the 4090 does the same, maybe better.
Getting a TUF 5090; it has a great PCB and is perfect for putting a waterblock on. Really liked what I got with my 4090 TUF over the Strix, the price cut was well worth it, and I had no issues OCing to 3GHz.
The 5090 Founders Edition looks fantastic as a creator card. 32GB VRAM and a two slot design is insane. It seems too good to be true compared to premium priced workstation cards, so I'm curious if there will even be enough stock produced to handle the demand. Hoping that long term we'll see those gains trickle down to lower end cards, and that competition will step up with VRAM.
Since Nvidia killed off the Quadros, they have been slowly turning the high-end RTX cards into the new Quadros, since they don't care about making money off creators anymore.
The truth! Finally someone who actually plays games and doesn't just look at bar charts 📊 That two-slot design is mega, well done Nvidia. Law of the cheapest with a decent warranty.
I’m in a bit of a predicament. I sold my MSI 4090 Suprim X for $1,700 three weeks ago. I definitely got plenty of use out of it and even swapped in a water block a few times. While I’m aiming to upgrade to a 5090 eventually, I just came across a like-new 4090 FE for $1,400. It’s a hard deal to pass up, and now I’m leaning toward going back to a 4090.
It's back to the future, to 2018: it's the RTX 2080 series all over again... AI tech solutions (locked to the newest line-up) have the priority instead of raw power (just small linear progress there). And a price to match.
@@TRS-Eric ........exactly my point. The meager rasterization performance increase over Pascal for the price increase was notoriously bad, the cost of prioritizing the new RT and upscaling tech (which took years to become relevant). "New AI features" was the point of the RTX 2000 series, not the performance increase.
It's like £1950, and the 4090 was £1600, so it's another substantial price hike. This means the AIB cards are all going to be £2000+ :/ I just cannot justify the price, but I'm thinking of selling my 4090 and going AMD anyway, for the Linux compatibility now that Windows Recall is out. I'm not interested in frame generation or AA as it ruins image quality; what was wrong with just brute-forcing straight lines?
Unreal Engine 4/5 becoming the standard is what happened. Yes we could get games that run at native 4k with ray tracing quality at 60+ fps on mid level hardware if they actually cared about optimizations. They don’t want you to know it’s possible because it accelerates the speed they can push out new garbage.
I don't think anyone was expecting the 5090 to be a good value, but at least it's not _worse_ value than the 4090. The one I'm more interested in is the 5080 and a lot of people the 5070. Even without the frame gen BS it looks like they're seeing a respectable 25% uplift without a price increase. That's better than anything we got with the 4060ti, 4070, or 4080. Not to mention if Reflex 2 works half as well as they claim you might finally _want_ to enable frame gen in your AAA single player games.
Funny watching 240+ useless fps when my 955DF CRT at 75Hz still feels better to play on, with smoother motion than current 165Hz+ monitors. This bubble must burst.
Do you reckon the 2 slot design is purely because AIBs complained about the 4090FE being too good? It runs so cool mine never has the fans come on unless I'm playing a demanding game. So will the 5090FE now have the fans running all the time and you'll have to buy an AIB version for it to be silent?
I just want to see a review that focuses on native resolution performance. DLSS, frame generation and Reflex are just cool demos; 99% of the time they will be off.
Bro, can you please ask an nvidia dev if 4xMFG will work in VR? VR users really could use the extra frames! I'll buy 50 series just for that alone if they can get it working in vr 😅
I sold 1 of my 2 4090s 2 weeks ago for $1800. Sold same day I listed it. I think I’m gonna keep my other 4090 though. Grab a 5090 to replace the 4090 I sold.
You know, all the fps in the world makes no difference when your screen can only display so much. Older screens are 60Hz, newer ones were 90, and modern ones are 144, 165 and 240; now you even get 360. Without a modern screen, the frames wasted on frame gen are absolutely useless and not even seen.
Before the CES announcement everyone was predicting the 5080 would be 10% faster (1.10x) than the 4090. It seems now that's not the case. But didn't the leaks already know the specs of the 5080? So what changed?
DLSS Off is also a deception, because it uses TAA, the blurriest picture to compare to. I will keep gaming at 4K without ray tracing and without AA. Ray tracing often gives a worse picture, like fewer details and a darker image, because the devs don't have to check how the picture and lighting change and whether it looks good; they can't even change it if they want to, they have to enable ray tracing and have no control over it.
DLSS 4 and Reflex 2 look like they are made for one another. Basically, the Reflex 2 description on their site seems to imply that it is mainly manipulating the data each DLSS 4 generated frame gets, so the frame positions itself properly in the world based on mouse movements; it just isn't rolled out enough yet to be the main show of the marketing. It's only implied, but in a way that I, as a programmer, assume this is obviously what they are doing based on the description of how it works. So you could have 30fps real frames while your mouse movements update at 120fps through generated frames, with reduced latency. If it operates like that, then DLSS 4 multi-frame generation makes a lot more sense than it does right now.
+33% SMs and increased memory bandwidth. The 4090 was limited by memory speed, so the raw performance jump should be more than 30% on average. I wonder if the 5090 will be able to do a 3GHz core clock with ease?
When I set the slider all the way up to the max memory overclock, my 4090 gains about 1-2 fps :D I don't think it's limited by it. Maybe only in specific games and benchmarks.
@@sebilsever8961 Ofc it will be highly dependent on the workload or game and resolution/settings used. But most of the time I got the most FPS boost from the mem oc and not core clock increase (at least not on its own). Especially when playing at higher resolutions or using DLAA or DSR/DLDSR. From my experience the stock GDDR6X was clearly holding back the 4090. It'll be interesting to see how much GDDR7 and the wide memory bus will help this time around.
There are going to be so many RTX 4090 scams. Usually people walk away from someone selling an RTX 4090 for $700+, but now people will think it's a dumpster-fire sale because of the 5070 and fall for it.
It's the era of upscaling and frame generation. We only pay the money to access the latest frame generation software update.
*latest hardware
@@OneTomato Wrong. Look at the fkn 80 class.
Nobody needs that crap. They need to push for better architectures and make studios optimize their games.
Well, you get slightly better tech too, with better power efficiency, but don't expect much. 10-30% gains over the last generation; Moore's Law has long been dead.
We are certainly buying the new tech, but we are still absolutely getting much better silicon. The new Blackwell architecture will deliver better performance; how much will always depend. I tell people: don't upgrade every time. Give it 1-2 generations to really get the value.
Graphics Processing Unit ❌
Frames Generating Unit✅
I wonder what performance an actually increased count of processing units would deliver. That is something I would pay for, knowing that then I'd never need to buy a new card again.
Frames Hallucinating Unit
@@Rafael-rn6hn FHU 🤣
Its both 🤡
@@Teletha ya.. actually this should be the new name.. frame generating unit or frame processing unit.. 🤣🤣🤣
$2000 gpu to run a game with 29 fps max settings lol huge improvement
It was 24 fps before on the 4090... an absolutely pathetic upgrade for 2000++++ USD.
@@HybOj *20fps
@@gozutheDJ thanks for a reminder what 2000 USD was able to do, I stand corrected
I thought the same way at first. Like most others, I'll reserve my final verdict until we see the full potential of the new series. That said, the 4090 with frame generation has been a game-changer for me in AAA games; it's like night and day. The smoothness is incomparable. The new version will not only be smoother (thanks to the FPS boost) but will also look better than previous versions.
What many seem to overlook is that the kind of performance gains we would have seen from a "traditional" upgrade, without the AI-driven enhancements, might have been a 50% improvement over the 4090 at best. Nvidia’s decision to focus on AI is incredibly smart. This approach is far better than a conventional upgrade (which usually just means adding more CUDA cores). Instead, we’re getting at least a 100% performance improvement over the 4090.
This shift to AI is the only viable path to keep up with higher resolutions in the future. The increase in transistors from a node shrink (say from 4nm to 3nm) won’t be nearly enough to handle the demands of 8K. The jump in pixel count is simply too massive: 4K has 8.3 million pixels, while 8K has 33 million. The traditional approach just can’t deliver the necessary gains.
Nvidia’s strategy is brilliant (and, of course, they have the resources to pull it off). Everyone else is simply trying to follow their lead.
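The pixel-count figures in the comment above do check out; a quick sketch of the arithmetic (resolution names and values are the standard 16:9 ones, nothing here comes from a benchmark):

```python
# Pixel counts for the resolutions discussed above, and the scaling gap.
resolutions = {
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
print(pixels["4K"])                 # 8294400  (~8.3 million)
print(pixels["8K"])                 # 33177600 (~33 million)
print(pixels["8K"] / pixels["4K"])  # 4.0 -- 8K is 4x the shading work of 4K
```

So a node shrink that buys, say, tens of percent more transistors is indeed far short of the 4x raw work 8K demands, which is the comment's point.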
It’s actually a deal, 36% faster for only 25% more money. What did you guys want? Jensen to give it to you for free? Everyone was saying the same crap 6 years ago, ohhhh RT is bs, NVIDIA are liars and it’s a scam - pure raster is the only way - long live AMD…Then same thing with DLSS etc etc. Yet today most ppl can’t tell the difference between DLSS quality and native unless they sit there just looking at it side by side. As a gamer I don’t care how the frames get to the screen. It’s like asking what’s the fastest way to get from NY to LV? Someone replies, flying. And you reply back no, I meant by foot…
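The "36% faster for 25% more money" framing above works out to a modest perf-per-dollar gain; a quick sketch using the comment's own figures (both are claims from the announcement, not verified benchmarks):

```python
# Perf per dollar relative to the 4090 (baseline 1.0), using the quoted figures.
perf_gain  = 1.36   # claimed: 36% faster than the 4090
price_gain = 1.25   # claimed: 25% more expensive

perf_per_dollar = perf_gain / price_gain
print(round(perf_per_dollar, 3))   # 1.088 -> ~9% more performance per dollar
```

A small improvement, which is the commenter's "it's actually a deal" argument in one number.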
Better and more informative CES report than all the mainstream TechTube channels, kudos.
Agreed, this was exceptional
Agree, very helpful with real examples. The bit about haggling with a 4090 owner, before benchmarks come out, that it performs like a 5070 is not gonna work though... lol
@@arthur78 Yes, I really get the feeling he's not some company shill but wants to inform us.
@@helloken Yes I really like his style. Don't get the feeling he was paid to say certain things to make a company look good.
The video absolutely picks up the differences between DLSS 3.5 and 4 on the ceiling fan.
That actually looks fantastic, even on youtube.
That's an improvement that is going to be on the 40xx series too; the only exclusive thing is multi frame generation.
@@Healthy_Toki even 20 and 30 series
My 3090 is ready for another 2 years
Wow! I’ll certainly notice that I kick ass in a game….
@@Healthy_Toki Take a look at the fans again... not the same fans, and running at different speeds.
One thing I noticed is that they clearly used older DLSS DLL versions that had the most ghosting issues. I always use the latest versions of all the DLSS DLL files, and there's a huge difference even between the 3.5 and 3.8.10 versions. My bet is that this uses the games' original native 3.5.x versions. RR had a massive change in quality even before the DLSS 4 release. So did FG and the DLSS upscalers.
You came in clutch with this one, FC. At first, I was very skeptical of how you portray the latest technologies, but after maturing and experiencing it firsthand, I came back to support your ideas. To be honest, you are the most unbiased YouTuber that I have ever seen, trying to show us the truth rather than winning the sponsorships other YouTubers seem to chase.
Thanks Jufes, so for 4090 owners... relax and don't sell LOL.
The ones I've seen in multiple places are $4800-$7700, so no one's buying anytime soon. Once that white-rock high wears off, maybe we'll get 'em at bottom-barrel prices.
People who have the flagship from last generation and are thinking of selling it to buy the new generation's flagship... that's only for people with more money than sense. So it doesn't really matter; you could have a 3090 and still be fine today.
@xyr3s Yeah, you're just locked out of the features. I was really hoping to do some work in RTX Remix, but you have to have a 4000 series. Frame generation is 4000 series only too. But if none of that matters and it's strictly gaming, a 3090 FE is still a potent GPU.
Bought a 4090 in October knowing that this would be nothing but a money grab. I'm good for years and don't have to think about that price tag and scalpers. It'd be nice if pre-orders existed though... Do you think Nvidia doesn't do pre-orders because they have their own "scalping dept" that buys their own stock and sells it on eBay for 3x MSRP? Most likely that's what's going on.
@@xyr3s Not at all. People who can afford it can simply afford it; some people do want to play Cyberpunk at 240fps. Nothing wrong with that, nothing wrong with having fun and enjoying your hobby, bruv.
I want 9090 that produces 10 fake frames and runs on 1 horsepower (~746 W)
Exactly
33% perf increase with 33% power consumption increase = 0 generational improvement
A weird comment; I'm sure you're aware that power scaling isn't linear, so why treat it as if it were?
Exactly
😂😂😂
Welcome to the end of Moore's Law. Better fake frames is what we have to focus on now.
You're interpreting it wrong... it's a 33% perf increase with the same power consumption because of the increased SMs. That's assuming there is no generational improvement.
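The disagreement in this thread reduces to a simple ratio; a quick sketch of perf-per-watt under the two readings of the leaked figures (both scenarios are hypothetical until reviews land):

```python
def perf_per_watt_change(perf_ratio, power_ratio):
    """Relative perf/watt vs the previous generation (1.0 = no change)."""
    return perf_ratio / power_ratio

# Reading 1: +33% perf arrives with +33% power -> efficiency is flat.
print(round(perf_per_watt_change(1.33, 1.33), 2))  # 1.0

# Reading 2: +33% perf at the same total power -> a real 33% efficiency gain.
print(round(perf_per_watt_change(1.33, 1.00), 2))  # 1.33
```

Which reading is right depends on measured power draw under load, not the TDP on the box.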
Reflex 2 is what VR games (where input latency matters) have had almost since day 1: asynchronous reprojection. It's why they feel so "responsive" even at lowish framerates. I personally really like it, but it's a pain to implement for game engines like Unity and Unreal, so it's not used: when you whip your mouse around, you get the rendered frame surrounded by a black screen, so you need to do a lot of legwork to fill that in. LTT did a good video showing it off, and AI is the perfect solution for filling that in. Pretty neat how fast it became a reality for more than just niche applications. It doesn't need to be perfect, just "good enough" that it doesn't distract you constantly. The latency reduction is, generally speaking, almost always worth it unless you're running at some comically low framerate to begin with.
3kliksphillips also made a really good video about it back then and also a few hours ago
vr is dead
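A heavily simplified sketch of the asynchronous reprojection idea described above (the function and numbers are purely illustrative, not any engine's actual API): the last rendered frame is shifted to match the newest camera rotation, which uncovers a strip at the edge with no data, the "black screen" border that engines, or AI infill, have to paper over:

```python
def reproject(frame, yaw_delta_px, fill=0):
    """Shift a rendered frame (list of pixel rows) horizontally to match the
    latest camera yaw. The strip uncovered at the edge has no pixel data --
    this is the black border the comment above describes."""
    shifted = []
    for row in frame:
        if yaw_delta_px >= 0:   # camera turned right: image slides left
            shifted.append(row[yaw_delta_px:] + [fill] * yaw_delta_px)
        else:                   # camera turned left: image slides right
            shifted.append([fill] * -yaw_delta_px + row[:yaw_delta_px])
    return shifted

frame = [[1, 2, 3, 4], [1, 2, 3, 4]]   # tiny 4x2 "rendered frame"
print(reproject(frame, 1))             # [[2, 3, 4, 0], [2, 3, 4, 0]]
```

Real implementations warp in 3D using depth and fill the hole with stretched or generated pixels, but the core trick is exactly this shift-to-latest-input step.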
Why are people even buying 4K gaming monitors if all they're gonna see on them is upscaled AI generated horse shit?
You know there are many games people can play at native 4K with high or ultra settings, with or without ray tracing.
As someone who games on a 4K monitor (48" OLED) with a 4090, I can tell you that DLSS is not "AI generated horse shit".
@@TootsMcGoots-j8w How many imaginary frames per second are you willing to take, and at what cost?
For high resolution text besides gaming
@TootsMcGoots-j8w Exactly. People can't seem to think outside of their own use case for their PC, especially if their PC is ass.
This generation was basically a software update
both software and hardware. Stop whining already
*hardware and software
they going to sell a lot of 5070
@@Aleksey-vd9oc just software
@@bigturkey1 It's the only one that kind of makes sense if someone wants a new RTX card. That 12GB of VRAM just means that in 2 years those people will be looking to upgrade, just like they want them to! Personally I think that midrange cards every few years are the way to go, but everyone is free to do what works for them, of course.
4:45 More memory bandwidth due to 40% faster GDDR7 is gonna do something; it's not just a matter of SM count.
Yeah, I can tell there is a lot of resentment from gamers, but the 5090 will be a stronger productivity card. And I can't wait for my 5090 Astral.
If you look at the ray tracing settings at 13:40, it looks like the GDDR7 is not doing anything, and the actual performance gains are worse than 33%.
Wukong gets 24 to 25 fps with max ray tracing on the 4090. Here the 5090 gets 29, so 29/24 = a ~21% performance increase.
Ray tracing is the Achilles heel for gaming; this is why AI had to step in.
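For what it's worth, the arithmetic above is easy to check (the fps figures are the commenter's readings from preview footage, not verified benchmarks), and 29 over 24 lands at about a 21% gain, well short of the +33% SM count:

```python
fps_4090 = 24   # commenter's figure: max ray tracing in Wukong on a 4090
fps_5090 = 29   # commenter's figure from the 5090 preview footage

gain_pct = (fps_5090 / fps_4090 - 1) * 100
print(round(gain_pct, 1))   # 20.8 -> ~21%, vs the +33% SM increase
```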
@@KiNg0fShReD SM count or core & memory clock almost never scale linearly, and vary wildly between engines and games. RT improvements aren't crazy this Gen, so I would personally guess the RTX 5090 is around 35-40% faster in *raster* on average
@@KiNg0fShReD Naaah, the grass is greener and the shadows look more natural on the right. And the terrain looks fine as well on the right.
@@KiNg0fShReD Driver optimisations will take time.
So cool seeing you at CES!!
edit: just finished watching the video, and this content is so unique to this channel... No one else in "tech media" is highlighting the insights you share here. Outstanding work, mate!
Using 4x frame generation even for 4K single-player games doesn't make much sense to me.
The most sensible situation would be if you already had 60fps, then you'd get 240fps after frame generation, if you also have a 4K 240Hz monitor.
For >60fps, you won't notice an improvement unless the 5090 supports DP 2.1 and you're using the likes of a 360Hz 4K monitor.
On the other hand, for
As an owner of a 240 Hz (1440p) monitor, you are correct. Frame generation looks and feels like crap unless the baseline FPS is at least 60 to 120 FPS.
At low baseline framerates, issues become extremely apparent; such as latency, frame pacing, motion blur, ghosting, and stuttering (because some visual effects are not properly interpolated).
@@KillFrenzy96 Yeah, when they show stuff like Cyberpunk with 27fps magically turning into 100fps+ it doesn't sit well with me. And that seems to be what MFG is made for. Normalizing base framerates under 30fps sounds like a complete disaster.
@@Rafael-rn6hn tbf, that 100fps+ is not using raw 4K raster as the base, it's using Quality/Performance mode DLSS upscaling to get a better base frame rate, then doing the frame generation from there.
That being said, I agree that this is only useful in certain circumstances. Can't wait for reviews!
To be honest, 575W is insane!
It is bro.
Boost will be 1000w min
PSU manufacturers need some revenue too
- Uncle Jensen probs
So Blackwell is Turing 2.0. Damn that’s disappointing.
You know what's more disappointing? AMD GPUs.
@@Aethelbeorn we haven't seen them yet lol.
@xyr3s
We haven't seen them yet because AMD was too ashamed to show them at CES; Nvidia blew everything they have to offer out of the water 😂😂😂
@@504Trey i dunno man :D The 9070 is supposed to have up to 4070 Ti performance. If they price that at like 500 buckaroos, things could get interesting. And if the 9070 XT ends up at 4080 performance for like 650? Very very interesting : p
@xyr3s Yeah, but if you can get a 5070 at $500 with 4090 performance, that makes the 9070 XT a third of the performance at the same cost. AMD GPUs are dead this year.
Seems like just yesterday the top range X80 card was around $600. And the Nvidia TITAN seemed crazy expensive at a thousand+ dollars.
The good old days.
I remember Turning on ARK OG when I got my First Titan (Pascal) thinking holy 💩
i 'member
I remember when gasoline was 87 cents a gallon in 1996. What's your point?
I remember when eggs were $1.79 per dozen, too bad the world was locked down “for two weeks, to flatten the curve”…
@@bobby0081 Inflation is theft.
When 4090 came out I was among the first to buy it. But 5090 seems different...guess gonna skip this gen. I do not need more "exclusive" fake frames with artifacts + more latency.
It'll be harder and harder to squeeze raw performance from here on since shrinking die nodes are yielding diminishing returns.
@@Krenisphia Time to move to chiplets.
If Apple can do a big GPU that way, Nvidia probably can do it better.
@@Rubed0_o they tried lol, this takes a lot of time
@@tomorpedreiro3032
Ik, but without a paradigm change it's just about the only solution that doesn't result in sky high prices.
*less latency
4090 owner here. Forget about getting mine for $800 lol. We high-end GPU buyers are like crack addicts; we are the ones following channels like this to see what's up! Lol
The only game-changing hardware thing = DP 2.1, which will give bandwidth well beyond 4K 120Hz without DSC.
25% more expensive, 50% Rasterization uplift. The $5090 is worth $1499, but I feel like Nvidia just gave it 32GB instead of 24 as an excuse to price it this high. A 5080 Ti with 24GB or 20GB of VRAM for the same performance as the 5090 in CES2026 like how the 3080 Ti was the 3090 with 12GB of VRAM is the real halo product for consumers. This is a Titan card.
I like how you accidentally put a dollar sign in front of 5090. Freudian.
I think if they'd just kept calling them Titans no one would have bitched about XX90 pricing.
It is not a 50% raster bump over the 4090. And Titan cards were useless; they had like a 3-5% perf bump over the model under them, they just had more VRAM. The 4090/5090 actually have a crazy perf bump over the model under them.
@@Teletha Whatchu talking about Willis? Titan cards absolutely demolished their regular RTX brethren in compute, especially FP32 and FP64, which is what they were designed for. They were a halfway point between consumer and the Quadro cards and priced accordingly. Granted, the only reason they demolished was because NVIDIA software-locked that support, but still.
@@RobloxianX The 5090 is worth whatever people are willing to pay for it. Just because it's worth $1,499 to you doesn't mean $2,000 isn't a steal for someone, or for a small business looking to do their own tinkering with AI for profits. Just remember that same die can also be put in the professional RTX Blackwell 6000 with probably 64GB of slower RAM for $8,000, or a compute-only B100, where two of them will live with 192GB of HBM3 for the low price of $35k. $2k in this market, where everything is about AI, is actually a good price if you plan on using it for anything other than gaming.
Didn't we get a 30% perf bump going from the 80 SM 3080 Ti to the 76 SM 4080?
You don't think the new architecture will give the 5080 a similar 25-30% bump over the 4080? (Yes, I am coping, being too poor for a 5090.)
30% perf bump yes but it was honestly underwhelming. The 4080 can run at 3 ghz constantly, while a 3080ti struggles past 2.2 ghz.
Just wait for reviews dude. I’m a 5090 FE buyer and I’m doing the same.
Yes, but you went from 8nm to 4nm... this time you're still on TSMC 4nm.
Not with 30% more power draw. The question is how they manage to keep power to performance ratio the same when the GPU has better VRAM, larger die and more bandwidth.
That sounds wrong. Probably because of DLSS
I think the FE can pull this off because of the liquid metal. We're not sure yet whether the 3rd party cards will use crappy paste, I assume?
Well 3rd party models can be flashed easier for more performance
FE is balanced and idiot proof card, nothing aggressive. A serious overclocker will not care about balance or power draw (or even money). They will buy board partner cards, throw the fans and vapour chamber in the bin, slap on a waterblock and tune the f out of the GPU.
To be fair, I think DLSS 4 is almost native if it's at 4K resolution. There's still some issues for sure, but it's very, very close to native now. I could just turn on DLSS 4 and just enjoy my 4K 240Hz OLED on Wukong, Cyberpunk and Witcher 4 (when it comes out) and it would just hit that 240fps for me to get the max performance of my hardware. My 4090's not able to do that without major artifacts, which makes this an instant buy on day 1. Especially that super sexy 5090 FE
Except it's not. Upscaled 4K isn't native 4K. You could say it looks like it, or close to it, but it isn't it. Also, upscaler tech has to be implemented as a feature of each game in order to work, and it encourages game developers to rely on this tech instead of proper optimisation. Basically, Nvidia, AMD and Intel push fake frames to push their own products, and by implementing it in their games developers save massive money on the labour that would be required to optimise them. And the result is often abysmal artifacting/ghosting, flickering textures, and massive input lag.
Will it be enough to hit 4k240hz on Warzone ?
DLSS4 and Reflex 2 coming to all RTX series?
Might keep my 3090 two more years or until a 5080ti 24gb comes out
Yep, RTX3090 user here as well, and an RTX5080Ti 24GB would be what I'd aim for. And maybe wait longer, for RTX 6000 series.
@T595955i Had the 5080 been 24GB I would go for it, but Nvidia know what they are doing.
Yeah, only multi-frame generation is exclusive to 50 series.
Idk about you, but for me the new RTX 50 is a disappointment, one that we waited 2.5 years for, might I remind y'all.
Gonna keep my 4090 until RTX 60 comes out.
@Rubed0_o Yeah, I don't care about frame gen. What's the point of getting more FPS when it's already smooth enough, at the cost of extra latency? We're getting like a 30% performance increase at the cost of a 30% increase in price and power consumption.
This feels like a lame refresh
Nope. Only on 50 series
Glad you were there to cover this. No one better to be on the floor!
Major props to your hard work man! Keep it up
Is it just me, or in the 5080 DLSS 3.5 display at the end does the fan seem to be moving twice as fast as in the DLSS 4 display?
Great time to be a 4090 owner... Can either sit and do nothing and know you basically still own a top tier card for two more years, or you can sell it off in a month for good money and upgrade for a few hundred bucks.
It's crazy how performance improvements are hard gated behind manufacturing process node improvements... Blackwell on same node as Ada = near zero perf/watt increase in raster.
This. I'm still seeing used 4090's go for $1700+ (recently completed sales btw) on Ebay, we're eating good either way!
Only way I don't get a 5090 is if MFG is artifact-filled or thermals suck on the Founders (my 4090 never goes past 80C, it's a miraculous piece of engineering).
Good info. I’ll take all that extra raw performance from the 5090 I can get for VR. The extra video memory and NVENC encoders will be handy for video editing too. Also, I’ll sell off my 4090 after the benchmarks are out 😂
Hi, I have a 2070 Super. Should I upgrade to a 4070 Super or a 5070? (My budget isn't enough for a 5090.) Or should I wait for the 6000 series?
5090 max settings 29 fps???? This is f.... joke..
You are a joke.
@ maybe you should check??
@@Creationseed9🤦♂️
No. Wait for the raster benchmarks
Yip... compared to a 4090, it's only a 6 fps difference.
The rest is AI fake frames.
How will you move the power slider at all? Aren't those new 12VHPWR connectors only 600W max?
You can get another 75w from the pci slot for overclocks if there is any headroom
I can't wait for the 5090 raster benchmarks, and can't wait for all of the white 5090s to release.
If you're a dedicated, pushing-the-limits VR gamer, 33% is an absolute dream. Sign me up 😁. Honestly though, unless you game with triple 4K monitors or VR on PC, it's crazy.
Ima get it, but I'll get a lot of the cash back from selling my 4090.
@@justinhogue9861 33% is not even close to a true generational leap. For dedicated VR, 4x (double the pixels and double the frame rate) is the standard. I will wait for TSMC 2nm.
But the 4x frame Gen is doing the heavy lifting and doesn't work with vr... I wish it did somehow though lol
@@natedisrud That is the problem, unless someone figures out a software solution. I have played Silent Hill 2 with frame gen via UEVR, and it almost works, but there are too many artifacts. Not as much flickering, but not better than no frame gen.
Don't need to disassemble; it's just cables connecting the PCBs.
You showed Wukong at native 30 fps on the 5090!!! It makes the new cards more confusing. In the same or more demanding scenarios my 4090 gets 30-35 fps at 4K native... Same thing with Cyberpunk: they showed native 4K at 25-26 fps, and the 4090 does the same, maybe better.
aesthetics wise
4090 FE > 5090 FE
It's a shame they trashed SLI...
Getting a TUF 5090. It has a great PCB and is perfect for putting a waterblock on. Really liked what I got with my 4090 TUF over the Strix; the price cut was well worth it, and I had no issues OCing to 3GHz.
I gotta say Palit looks the best. Hope they are good again like they were in the 30 series.
33% real frame improvement for only 25% increased cost. Nvidia apparently doesn't want there to be competition. Lol
1500 to 2000 is a 33% increase.
The 5090 Founders Edition looks fantastic as a creator card. 32GB VRAM and a two slot design is insane. It seems too good to be true compared to premium priced workstation cards, so I'm curious if there will even be enough stock produced to handle the demand. Hoping that long term we'll see those gains trickle down to lower end cards, and that competition will step up with VRAM.
Since Nvidia killed off Quadros, they have been slowly turning the high-end RTX cards into the new Quadros, since they don't care about making money off creators anymore.
Good thing I sold my 4090 a few months back; basically got all my money back.
So far only the 5090 Founders Edition, the dual slot card.
Everything else is 3 to 4 slot.
The truth finally someone who actually plays games and doesn’t just look at bar charts 📊
That two-slot design is mega. Well done Nvidia.
Law of the cheapest with decent warranty
This tech and gpus are looking incredible imo
Why dont they compare DLSS 4 on the 4090 and 5090?
I’m in a bit of a predicament. I sold my MSI 4090 Suprim X for $1,700 three weeks ago. I definitely got plenty of use out of it and even swapped in a water block a few times. While I’m aiming to upgrade to a 5090 eventually, I just came across a like-new 4090 FE for $1,400. It’s a hard deal to pass up, and now I’m leaning toward going back to a 4090.
It's back to the future, to 2018 and it's the RTX2080 series all over again....
AI tech solutions (locked to the newest line-up) have the priority, instead of raw power (just a small linear progress there). And a price to match.
@@T595955i yep, I skipped 20 series and kept my 1080ti until 30’s.
I mean we got a very good bump in raw power
The 20 series was one of the biggest jumps we've had in years due to raytracing. Bad example.
@@TRS-Eric ........exactly my point.
The meager rasterization performance increase over Pascal, for the price increase, was notoriously bad, at the cost of prioritizing the new RT and upscaling tech (which took years to be relevant). "New AI features" was the point of the RTX 2000 series, not the performance increase.
@ from what I recall, 20 series could barely do raytracing. The performance hit was not worth it to me.
What is a good price for second hand 4090 on Irish market to consider buying it over 5080 ?
The 5090 FE using liquid metal 😮
Does it have thermal paste on next Gen??
Wait a minute, an actual informative CES post? Great work, keep em coming!
Sounds like you need a faster camera..
It's like £1950 and the 4090 was £1600, so it's another substantial price hike. This means the AIBs are all going to be £2000+ :/ I just cannot justify the price, but I'm thinking of selling my 4090 and going AMD anyways for the Linux compatibility now Windows Recall is out. I'm not interested in frame generation or AA as it ruins image quality, what was wrong with just brute forcing straight lines?
Amd 😮 careful saying that on this channel , hasn’t he told you about amd dip yet … 😂
Wait at least until the 5090 sells out and people want the 4090 for msrp again. 9070XT will only reach 4070ti speeds but should be a bargain vs 5070
nvidia works fine on CachyOS, been using it for a year now.
Unreal Engine 4/5 becoming the standard is what happened.
Yes we could get games that run at native 4k with ray tracing quality at 60+ fps on mid level hardware if they actually cared about optimizations.
They don’t want you to know it’s possible because it accelerates the speed they can push out new garbage.
4 times the framegen, said the man in snake hide jacket, selling snakeoil for 1000s of usd a piece
Good coverage. Blackwell is mostly DLSS4 for increases.
I'm not sure if I want one yet.
As an RTX 4090 user, I've been wondering for a while: WHO ON EARTH BUYS A 90-CLASS MODEL TO USE DLSS?!
Jensen's just selling you A.I tech instead of raster for $2000 damn
I don't think anyone was expecting the 5090 to be a good value, but at least it's not _worse_ value than the 4090. The one I'm more interested in is the 5080 and a lot of people the 5070. Even without the frame gen BS it looks like they're seeing a respectable 25% uplift without a price increase. That's better than anything we got with the 4060ti, 4070, or 4080. Not to mention if Reflex 2 works half as well as they claim you might finally _want_ to enable frame gen in your AAA single player games.
The 5080 is the worse value along with the 5070. Only the 5090 and 5070ti are good.
It's a 30% increase in raster with 30% more power consumption and a 30% increase in price. I'd say it's exactly like the 4090 was in terms of value.
Now I dont feel so bad buying my 4090 in August
So true!
Welcome to the new game era, where you are watching a prerendered video of a game played by AI, with better quality and more FPS than YouTube/Twitch.
😂 ikr
I always play with more fps than YT/Twitch. :D Wtf are you talking about?
I'll stick with my 4070 at 1080p...
Turn everything up to ultra at native = great graphics and no weirdness...
1080 🤢
Funny watching +240 useless fps when my 955DF CRT at 75Hz still feels better to play with smoother motion than +165Hz current monitors.
This bubble must burst.
Reflex 2 is probably the most interesting software feature
Do you reckon the 2 slot design is purely because AIBs complained about the 4090FE being too good? It runs so cool mine never has the fans come on unless I'm playing a demanding game. So will the 5090FE now have the fans running all the time and you'll have to buy an AIB version for it to be silent?
Good thing it isn't constrained by a power connector with a 600 watt limit...
What will it look like comparing a 4090 and 5070 both using DLSS 4 ?
If you don't use frame gen on 5070, they look the same but 4090 is faster. Frame gen 4x is the only dlss 4 feature exclusive to 50 series.
Will DLSS 4 come to RTX 40 series?
Nope
Yes, DLSS 4 is available for all RTX owners.
Only frame gen is exclusive to the 40 and 50 series.
Everything except "Multi Frame Generation"
I have a 4080 Super. Sell and upgrade?
Do any of the 5090 AIB cards use 2 connectors?
No
I just want to see a review that will focus on native resolution performance, DLSS, Frame generation and Reflex are just cool demos 99% of the time it will be off.
Bro, can you please ask an nvidia dev if 4xMFG will work in VR?
VR users really could use the extra frames!
I'll buy 50 series just for that alone if they can get it working in vr 😅
DLSS 3.5 injects 1 frame in between; DLSS 4 injects 3 frames, therefore twice the frame rate of DLSS 3's frame gen (4x vs 2x the base).
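That multiplier math can be sketched in a few lines (the 60 fps base is illustrative, not a benchmark): inserting one generated frame per rendered frame doubles the displayed rate, inserting three quadruples it.

```python
def output_fps(base_fps: float, generated_per_rendered: int) -> float:
    """Displayed frame rate when N generated frames follow each rendered frame."""
    return base_fps * (1 + generated_per_rendered)

base = 60
fg_2x = output_fps(base, 1)   # 2x frame gen  -> 120 fps
mfg_4x = output_fps(base, 3)  # 4x multi frame gen -> 240 fps
print(mfg_4x / fg_2x)         # 2.0: twice the output at the same base rate
```

Note the input latency is still tied to the base frame rate, which is why the generated frames don't feel like "real" fps.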
Keep up the great work! 👍🏻
I sold 1 of my 2 4090s 2 weeks ago for $1800. Sold same day I listed it. I think I’m gonna keep my other 4090 though. Grab a 5090 to replace the 4090 I sold.
I swear the 3rd to last card on the bottom was a boombox
Underrated comment.
You know, all the fps in the world makes no difference when your screen can only display so much.
Older ones are 60Hz, newer were 90, and modern are 144, 165 and 240.
Now you even get 360.
Without a modern screen, the extra frame-gen frames are wasted and never even seen.
What will Nvidia do in the future when generating more frames and upscaling does not work anymore to sell your GPU?
seems like another ampere.. I'll wait for 6090. I like the efficient gpus
If what he says is true this is the 20 series all over again.
Before the CES announcement, everyone was predicting the 5080 would be 10% faster (1.10x) than the 4090. It seems now that's not the case. But didn't the leaks already know the specs of the 5080? So what changed?
"DLSS Off" is also a deception because it uses TAA, the blurriest picture, as the comparison. I will keep gaming at 4K without ray tracing and without AA. Ray tracing often gives a worse picture, with less detail and darker, because the devs don't have to check how the picture and lighting change and whether they look good; they can't even change it if they want to, since they have to enable ray tracing and lose control over it.
Great content man! In 20 days the real benchmarks will reveal how "fake" Jensen's presentation was 😂
Remember everyone, what is the goal? Fomo is the goal, not that any of us actually need one.
My HP 4090 makes these 5090's look tiny lol.
My Gigabyte gaming OC 3 slot monster 4090 also says hello lol
DLSS 4 and Reflex 2 look like they are made for one another. Basically, the Reflex 2 description on their site seems to imply that it is mainly manipulating the data each DLSS 4 fake frame gets, so the frame positions itself properly in the world based on mouse movements, and that it isn't yet rolled out enough to be the main show of the marketing. It's only implied, but in a way that I, as a programmer, assume this is obviously what they are doing, based on the description of how it works.
So you could have 30fps real frame and your mouse movements would be 120fps fake frame, with reduced latency. If it operates like that then DLSS4 mult generation makes a lot more sense than it does right now.
The year is 3030: we’ll be gaming at 500 FPS at 16k but 90% of those frames will be AI generated and feel as laggy as gaming on dialup
Thank you for covering this.
The only channel i come to for this shit 💪💪💪
gonna insta buy the asus astral 5090 and put it in the lian li sup01
+33% SM and increased memory bandwidth. 4090 was limited by memory speed so the raw performance jump on average should be more than 30%. I wonder if 5090 will be able to do 3GHz core clock with ease?
When i set slider all the way up to max memory overclock my 4090 gains about 1-2 fps :D I don't think it's limited by it. Maybe only for specific games&benchmark
@@sebilsever8961 It definitely is limited in 4K and 8K by its memory speed and bandwidth.
@@sebilsever8961 Ofc it will be highly dependent on the workload or game and resolution/settings used. But most of the time I got the most FPS boost from the mem oc and not core clock increase (at least not on its own). Especially when playing at higher resolutions or using DLAA or DSR/DLDSR. From my experience the stock GDDR6X was clearly holding back the 4090. It'll be interesting to see how much GDDR7 and the wide memory bus will help this time around.
I’ll be pairing my 4090 with the 9800x3D and the 5090 FE with the 14900K.
What about the architecture change from Ada to Blackwell? It's not just an SM count increase?
Can't wait to see the review. Curious if the 5080 or 5090 is the buy.
5080 looks like a 4080 Super refresh with 4x frame gen since it’s still slower at raster than a 4090
Best report on this on TH-cam so far..
There are going to be so many RTX 4090 scams. Usually people walk away from someone selling an RTX 4090 for $700+, but now people will think it's a fire-sale price because of the 5070 and fall for it.