APM and Will,
I really really liked this format.
And I can tell and appreciate that you guys thought of how to tackle this new landscape of looking at the cards holistically and incorporating experience metrics/discussion.
Well done.
Agreed. I love a good set of benchmarks, but it's refreshing to see a conversation about the more subjective aspects of how performance is improved. It reminds me of some of the testing that [H]ard|OCP used to do, where both raw performance as well as visual differences were weighed in separate comparisons. Time really is a flat circle.
@@AK-Brian Are those guys still active? ... I used to read their stuff 7-8 years back.
@@CaptainScorpio24 Kyle Bennett is still semi-active on various social media sites, but hasn't been doing any formal reviews. The [H] forums are also still very active, as they were never taken offline, and there are rumblings that Kyle might be reviving the main site in some form or another. At least two of the ex-staffers, Brent Justice and Dan Dobrowolski, spun up a website that's been doing good work for a while - The FPS Review.
It's like a Full Nerd discussion segment but just about the new card and the new features... good background discussion.
Something about not seeing Gordon in the thumbnail made me sad af
The first testing I did without him to help me, and I definitely felt it...
-Adam
The spirit of Gordon lives in Will's voice.
Thanks guys for doing the work for this
Thanks for spending time on MFG and being honest; as an average gamer, in non-first-person shooters it's not noticeable. I only use my PC for MSFS and am looking forward to MFG. I have seen a few benchmarks showing not much of an uplift in MSFS yet, but honestly I think this is a programming issue from ASOBO, as MSFS 2024 has so many issues as it is at the moment. Good vid.
You guys have great on screen chemistry. Very comprehensive and genuine
DLSS 4 is a 'whole lot faster' than DLSS 3.5, but given that the description 'playing games in jello' comes to mind, it's just an RTX 4090 Ti with 30% more heat generation and >30% more power draw.
Did nVidia hire Intel Engineers recently?
And 30% more expensive.
@LeonardTavast The Asus Astral is $2,800. The 30% is based on the paper-launch FE model.
Good info! Thanks!
As a 4090 owner this seems like an easy skip.
As a previous 4090 owner who sold it after the announcement event, I can't wait to (hopefully) get one lol
@@stephenpatterson8056 As a 4090 owner who has a 3080 Ti second system, I can't wait to get one so I can sell the 3080 Ti and shift the 4090 down the pipe. LOL
This,👍 I don't need to play the latest and greatest at the highest settings, with all the bells and whistles. I'll be running my 4090 as long as it runs and I can get drivers for it.
The Intel 13th Gen CPU otoh...
Eww. That leaves me feeling bad for the smaller 5000-series cards.
Is there any kind of bottleneck with the 7950x3d or is it fast enough for the rtx 5090?
Nothing major that I noticed. Nvidia's reviewers guide had numbers with a 9800X3D and the 7950X3D wasn't far off from the numbers I saw.
-Adam
Thank you for your honest and solid conclusions. Great video. I know that I will be happy to swap my 1070 Mobile for a 5090 in my new PC build, my first refresh in 6 years. Thank you Nvidia too.
With DLSS and FG my 4090 will be fine for another generation, with its 24GB/384-bit bus and a 7950X3D on an LG 4K 24Hz monitor. Plus I know the 4090 will at least get some trickle-down of some of the new tech.
The RTX 5090 is overpriced for the performance it delivers. Thank you for your honesty!
First time since 980 i will not be updating to new gen. Finally :)
You should upgrade, Jensen appreciates your yearly donations.
@@SMGJohn lol
I love how the Founders Edition's size is used to sum up all versions 🤔 As if it's the only version that will be available or the one most will buy. They made the Founders version smaller, but it's not at all cool-running out of the box. Hence why the vast majority of AIB partner cards are HUGE!
All I want to know is how it does in MSFS 2020/MSFS 2024... Not one youtuber even bothered to test it.
Yes, there is one that I saw with both MSFS 2020 and 2024.
Thanks for the review guys!
I've noticed that none of the day 1 reviewers have done any VR results - any chance you want to be that guy? Pimax Crystal Light is 2880x2880 each eye. How does this hold up?
Adam, if you lock your game to 30 fps, and you enable single FG, the actual render will run at 15 fps, if I am not mistaken.
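If it helps, here's a minimal sketch of that relationship; it's only an illustration of the assumption above (a frame-rate cap limits displayed frames, and frame generation multiplies the render rate by a fixed factor), not anything Nvidia documents:

```python
# Hypothetical illustration: how a displayed-fps cap interacts with frame generation.
# Assumes the cap applies to displayed frames and FG multiplies the render rate by fg_factor.

def rendered_fps(displayed_cap: float, fg_factor: int) -> float:
    """Engine render rate when the displayed frame rate is capped."""
    return displayed_cap / fg_factor

print(rendered_fps(30, 2))   # single (2x) FG under a 30 fps cap -> 15.0 rendered fps
print(rendered_fps(120, 4))  # 4x MFG under a 120 fps cap -> 30.0 rendered fps
```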
I like the format and style you do on here, even if it is a little long winded as a result.
Unfortunately, some issues with the review make me question its reliability, due to the conclusions you jump to and some other weirdness going on:
1) How would frame pacing be affected by generating extra frames? The MFG calculation is not magic; it doesn't affect the hardware-level frame calculation, hence the claim that it makes for a smoother frame-pacing experience seems to be based on feels alone. Feels is fine when you're talking about an opinion, but I thought this was a review?
2) It really bothers me that you would put a full blow-through design card in a case like the Terra, which has no exhaust fans of its own. Hot air, meet wall. I hope you do a full post-mortem review of how the card functioned and how long it took to heat-soak itself to death in there. I'm betting you lost a fair few bins of GPU core clock from the get-go.
3) The power draw: most of the discussion is focused on idle; while it is high, why does it matter? The graph(s) you're using are lacking critical detail, such as what both axes are actually indicating. Yes, you can tell one is time and one is watts, but what does the time axis indicate exactly? It seems pointless to use graphs like this if you don't make it clear what you're showing. A simple bar graph would've shown the same thing, just with less noise, and would have been much harder to misinterpret.
Also, with the amount of white space you have on your graphs, you could easily include the system details for clarity, to save anyone looking at a specific part of the review the annoyance of needing to go looking for what the system details are. I'm aware they're in the notes, which is great, but it's not the same, because you're just a screenshot away from crucial context going missing.
Thank you for the discussion and review, it's good to have multiple points of view and takes on these. With a little more attention to detail this'd be a great review.
Sounds like it will heat your room.
yeah? just like every computer?
700+ watts is what my space heater draws on low 😬
Now you can have a space heater that also does computer stuff!
And just think, all your space heater does is make heat. It can't even run Doom
@@stephenpatterson8056 No, but it can run Superhot
@@mrzozelowh Great for those Texas summers!
Now you get the same functionality, but also path tracing. Simply better
So, what we have is an RTX 4090 Ti?
This is exactly some of the info I was looking for. I too am still rocking an 850W PSU and was wondering if it would be enough for a 5090 paired with a 13700K at stock clocks. It worked fine with my 4090, so I guess the 5090 will likely be OK? And yeah, I'm looking to get a 5090, but that doesn't mean I'm just cool with burning another $200+ on a new PSU if I don't have to lol
Set the PL (power limit) at 70%.
You will need at least a 1000W PSU for this card, regardless of what you run it with, though personally I would go for 1300W straight up, as you will need it in the future anyway.
However, for games, with a 13700K you will be CPU bound in everything that isn't CPU-frequency favoring. So most new games, afaik?
Based on the five different reviews I've seen, the 5090 seems to be CPU bound to some extent even with newer CPUs, so I would consider it a waste of money to spend this much on a card unless you're running it in a system that can match it.
Thank God we have an SFFPC enthusiast here 🙏 😂
Let's hope the RTX 5080 delivers, because otherwise gamers building high-end machines were in a better situation when the RTX 4090 was still in production than they are with this 2,000 USD, 575W behemoth.
Leaks are already starting to come, Opendata leak shows it's on average 8.22% faster than the 4080 Super.
It's basically a 4080 Super Super 😂
47:51 “Some people like the best” and only in Photoshop 😂
Welcome to Morocco ❤ We want an agent in Morocco
A tiki with a 5090 would be impressive to see
Putting that much power in 2 slots 😮
The power draw is a bigger factor than most understand. You need a case that can dump 700-800 watts of heat out of the case (if you combine the CPU, GPU, and heat generated by the PSU, fans, etc.), you probably want a 1200W PSU to be on the efficient side, and you want Gen 5 PCIe to leverage the technology. And if your environment isn't air conditioned properly, you're going to be sweating. Luckily the 5090 is going to be as rare as hen's teeth, so few gamers are going to make the mistake. It's bad enough with a 350-400W GPU already.
That's why I upgraded from a 3070 Founders Edition to a Gigabyte RTX 4070 Ti Super Eagle OC at launch in January 2024 😊
@@CaptainScorpio24 Same, I bought a 4070 ti Super last spring and in the future I refuse to ever buy anything that draws more power. I had a 7900XTX before that but had to sell it because my total system power draw was 700-750 watts while gaming and my AC couldn't keep up. Now my system pulls 450 Watts gaming and AC isn't crying in the summer and neither am I.
@@FredSaidIt our ti super will easily last 5 6 yrs and still it won't become unusable.
@@CaptainScorpio24 yikes
I'll be getting an MSI 5090, I think. Suprim, Vanguard, or something.
How come the Nvidia Founders Edition is the only review released, and not the partner board cards? Obviously an embargo, but weird.
Nvidia partners will probably charge even more than the founders edition
Will, look at the camera more.
It's not overkill. Never enough power.
Interesting that others who actually measure frametimes with MFG conclude that it doesn't solve the problem of bad frame pacing, but you claim it does. Placebo effect? It sounds like that to me, because there is no explanation of how it could solve it at all: if the engine delivers 10 frames over, let's say, 100ms at a pace of one every 10ms (and MFG converts that to 40), and then suddenly delivers just 7 over the next 100ms, MFG can only convert that to 28 instead of the previous 40, and that must be noticeable. No frame generation tech can solve that problem.
There is some change at the hardware level that affects frame pacing compared to the 40-series cards. It may still be imperfect at times, but it should be better. I can't remember the specific details; I either saw it in an article somewhere or maybe in one of DF's videos where they covered it.
@@stephenpatterson8056 DF are what most people call shills, but I personally see them more as fanboys. Frametime graphs with or without MFG are totally identical: there are more frames, so the time each frame is shown on screen is shorter, but there is nothing in hardware that can work miracles and remove spikes (reason explained above).
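To make the arithmetic above concrete, here is a minimal sketch; it simply assumes 4x MFG multiplies whatever the engine delivers in each window and cannot borrow frames across windows:

```python
# Hypothetical illustration of the frame-pacing argument above.
# Assumes 4x MFG: each rendered frame yields 4 displayed frames, so a dip in the
# underlying render rate is still visible after generation.

def displayed_per_window(rendered_per_window: list[int], mfg_factor: int = 4) -> list[int]:
    """Displayed frames per 100 ms window after multi frame generation."""
    return [n * mfg_factor for n in rendered_per_window]

engine = [10, 7, 10]                    # rendered frames in successive 100 ms windows
print(displayed_per_window(engine))     # [40, 28, 40] -> the 28-frame window is still a hitch
```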
I want to know about Adam's tablet with pen!
I love reMarkable tablets, and this one is the Paper Pro: remarkable.com/store/remarkable-paper/pro
-Adam
Is Nvidia Reflex 2 out yet?
No and no news on availability, but Nvidia did announce the first 2 games are Valorant and The Finals: www.nvidia.com/en-us/geforce/news/reflex-2-even-lower-latency-gameplay-with-frame-warp/
-Adam
I just bought a Corsair power supply at the end of last year that said it was ATX 3.1, and it does not have a dedicated 12VHPWR port, so I'm going to have to use the adapter, because I'm not buying another one.
If you want a cleaner look, you can also get a cable from CableMod that goes from the 16-pin on the GPU and breaks out into either 3 or 4 connectors that go straight to your PSU.
@stephenpatterson8056 thanks! I'll look into that
So you can use an 850W PSU?
Nvidia officially recommends a 1000W PSU. Considering they got Cyberpunk to push it to 700W on the GPU alone (not counting CPU, RAM, etc.), I don't think I'd risk an 850W.
@@ElladanKenet That 700W is the total system power (if you are talking about this video). They didn't use any device to isolate the GPU.
@ Okay. In that case, do the math. You need 20% overhead for degradation and power fluctuations, which puts the total power allotment at 840W. That doesn't leave you much wriggle room, and I'd be very nervous about using that. And again, Nvidia is recommending a 1000W PSU. The ONLY way I might consider it is if you undervolted, and I don't know how much undervolting you could do to get to 80% utilization. And really, you're gonna buy a 5090 and then undervolt it and lose some of that power you paid for?
@ Just wondering, what degradation are you talking about? An 850W 80+ Titanium unit can pull around 1000W from the wall while delivering its rated output.
The 20% is about efficiency (power draw from the wall).
An 850W 80+ unit can output 850W (and a bit over) continuously, 24/7.
@@PetterNorthpole Is that what you have, sir? Because it sounds like you're not really asking a question. It sounds like you're determined to use the 850W PSU no matter what anyone says.
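For what it's worth, here is a minimal sketch of the headroom math being argued in this thread. The ~700W peak system draw comes from the discussion above; the 20% margin and the 90% full-load efficiency figure are assumptions, not measured values:

```python
# Hypothetical PSU headroom check using the numbers from this thread.

def headroom_w(psu_rating_w: float, peak_draw_w: float, margin: float = 0.20) -> float:
    """Rated capacity left over after adding a safety margin on top of peak draw."""
    return psu_rating_w - peak_draw_w * (1 + margin)

def wall_draw_w(dc_output_w: float, efficiency: float = 0.90) -> float:
    """Approximate power pulled from the wall for a given DC output."""
    return dc_output_w / efficiency

print(headroom_w(850, 700))    # ~10 W to spare with a 20% margin -> very tight
print(headroom_w(1000, 700))   # ~160 W to spare -> Nvidia's recommended size
print(wall_draw_w(850))        # ~944 W from the wall at full load (the rating is DC output)
```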
You guys are praising this card too much for being 2-slot when everyone these days has gotten used to monster cards. I'd rather see a 4-slot card with better performance than a 2-slot with a minor upgrade.
Blender, or at least Cinebench.
It's not "a whole lot faster"...
I don't trust anyone who uses motion blur in their games xD
Costs 25% more than a 4090 and is only 27% better, even at 4K. Meh.
It's not a whole lot faster, it's 'Ti' faster.
No. All the frames are not fake frames.
Yes they are. A GPU is a semiconductor crunching numbers and spitting out a synthetic image to a digital display. With frame gen, it's a different part of the semiconductor crunching numbers in a different manner and spitting out a synthetic image to a digital display. It's all math resulting in imagery. Saying one frame is more "real" than the other is like asking me what 2+2 is and only accepting my response of "4" as real if I pull out a chalk board and write it out for you rather than just doing it in my head and saying "4."
To quote Morpheus from The Matrix: "What IS 'real?' How do you DEFINE 'real?'"
It's the PS Pro version of the 4090.
He said 'a whole lot faster.' 😅 Yeah, no!
Don't call it overkill. It can't do 8K 240Hz at max settings. DON'T PLAY
600 watts of power draw is not in any way good. The memory also runs in the 90s (°C), so how long do you think it will be before we start seeing 5090s on channels like Northwest Repair?
Depends. What are the manufacturer temperature tolerances for GDDR7? I can find it for GDDR6x and previous generations.
@@Aethelbeorn 95°C is still 95°C; imagine it in a real case with restricted airflow. A test bench is not a typical case.
600 is fine.
If the memory is rated above 95°C, then yeah. 95°C is literally just 95°C.
Normally you are one to agree with, but you are out of line here tbh, fam.
@@POVwithRC All depends on what you pay for power.
Gotta undervolt that bastard for sure. 600W is crazy.
I saw another video with 620w spikes.
It's the 4090 Ti with max fake frames!
@8:48 134°F (56°C) metal will definitely burn you; what are you smoking?
How you figure? Hot water out of your faucet is hotter than that.
@@cortneywebb1677 Kinda; at 134°F it will burn you, but it'd take roughly 15 seconds of sustained exposure to begin to cause burn damage.
I really don't care about the 5090. 5080, 5070 - maybe.
At $1999 I wouldn’t either
Ok, but what does your personal preference add to the discourse? Like, hold my hand here. Tell me why I care about what your preference in GPU will be.
@@POVwithRC Maybe he is just leaving the comment so PCWorld has an idea of what he wants to see. No reason to be mean.
Sounded more like a declaration. It's like those people who on every phone review loudly comment "well I'm not upgrading". Oh wow cool really? Changes my life dawg.
@@cortneywebb1677
@POVwithRC Would you say something like that to anyone in real life? 😅😂😂 That would be unbelievable!
Give the worst review possible, so that we can actually get one. 😀😀😀 Run F@H on it and let us know the PPD.
No Will, they aren't all fake frames. Stop being weird. The fake frames are the AI slop vomited out by the software stack stood up to cover for lack of gen over gen perf per watt improvements.
So I was hoping to see the 5090 FE in the Terra.
Nvm, 7:00
It's a bit disappointing; I was expecting 175 fps at 4K on the 5090.
Why do people reviewing the 5090 INSIST on using MSRP, when finding a 4090 at MSRP outside a tiny window right after it came out is about as likely as finding Bigfoot and Elvis? The 4090's price has been well over 2,000 USD for much of its life and is closer to 3,000 USD now. If you take a 25 percent uplift, with a lot of new extras (encoders/decoders, better cores, more and faster memory, DLSS, upscaling, ray tracing, AI, to name a few), at LESS than what a 4090 sells for, that's a HUGE bargain!
Because that is the only way they can mention price. Otherwise the price will be too volatile and they will always be inaccurate.
You will also not get a 5090 for msrp lol
@@cortneywebb1677 They can mention the actual price at the time of writing, or the average price since launch. Tesla can offer a car for an MSRP of 1 USD; if they never build one, you cannot say the Tesla Model S sucks because another Tesla is only a dollar. The cold hard truth is the 4090 has probably averaged around 2,300 USD, so by the more accurate figure the 5090 is a real bargain, at least until we see its actual selling price.
@@ronaldhunt7617 It hasn't launched, so there is no average price.
Why bother with the review of the RTX 5090? I would just focus on the RTX 5080 and RTX 5070; no one other than a few gamers can afford the RTX 5090, so why bother with it, and a good number of the folks who can afford it won't be able to buy it anyway. Just a waste of time, like reviewing a $10,000 TV. Who's buying that, right?
1-hour video with no timestamps/chapters
Not cool, guys
There are timestamps. If YouTube doesn't load them in automatically, you can look in the description and they should be clickable there.
-Adam
@pcworld thanks Adam
You know what's really not cool? Making your only input complaining about timestamps and otherwise being a complete ingrate.
I would appreciate timestamps in a video like this. I can’t fathom any reason not to include them in a review… other than to inflate watch time metrics.
Timestamps are there in the description and showing up for me, not sure why you aren't able to see them. Sorry about that.
-Adam
The video has chapters and timestamps for me....????