It's kind of funny that more than 4 years and 3 gpu generations later and this game is still the game to benchmark lmao. I wonder if anyone actually bought the 2080ti cyberpunk edition.
imagine paying 2000usd+ for 64fps @1440p in wukong when 1440p 100hz monitors cost 100usd, 1440p 180hz 150usd. mining is over, GPUs don't print money anymore, prices must come back to Earth from 10x overprice that makes 0 sense. dlss + fake frames should be additional features on top of decent performance for an AFFORDABLE price that corresponds to gaming performance you get in 2025, not a performance replacement for 10x overprice like mining isn't over and GPUs still can print money. they can't. all gamers lose when fake 'gamers' defend overpriced 10x low-VRAM crap, true gamers should demand better GAMING GPUs and pre-mining prices. because thats in TRUE GAMERS INTERESTS. and then gamers can have glitchy fake frames ON TOP of ACTUAL PERFORMANCE, not INSTEAD of actual performance.
the problem is 10x overprice and 0 progress @affordable prices. gamers and gamedev desperately need some actual progress with AFFORDABLE GPUs since 2016 when mining started. so gamers can finally buy themselves 16gb+ VRAM 200-300usd affordable FAST GPUs and forget about crap low-VRAM graphics in games because gamedev can just ignore low-VRAM slow crap GPUs. the 9070xt with 1440p 52fps in CP2077 or 73fps NO RT in wukong doesn't actually sound like 'fast' in 2025. 46fps @1080p, its like 30-35usd worth of 2025 gaming performance. the 4080/s its the same slow performance for 2025. RTX is 2018 6yo OLD tech. all GPUs must run it by now perfectly fine BUT NOPE. the 9070 should be a 250usd card to destroy both nv and intel. the 9070xt @300-350usd will be 'fine' - that'd be at least some progress with GPUs for a change. e.g. 9600xt 2008 vs 1060 2016 x9 times faster, 1060 vs 4060 2023 is just x2 faster. the 5090 64fps @1440p in wukong. basically the 5090 has like 70usd worth of ACTUAL 2025 gaming performance in it. "modern GPUs" aren't even able to run modern games on ULTRA BUDGET MONITORS.
What artifacting? He showed artifacts with 30 FPS base / no DLSS upscaling benchmark, and then DLSS performance with 250-280FPS, even with 50% speed you wouldn't notice artifacts.
@@danielowentech Hey Daniel, when you do take a look at the Upscalers, could you perhaps compare I.e. CNN Quality vs Transformer Balanced/Performance on 1440p and 1080p?
Would love this as well. The improvement in image quality brought by the new transformer model to me is the highlight of this 5000 launch window and should not be understated as it’s basically free fps for the same (and sometimes even better) image quality
@@danielowentech Compare something like 1440p native image quality vs 4k dlss performance. Which one is faster, which one looks better, and then add these permutations to the transformer model. :D
One thing I would love to see is a comparison between the latest version of lossless scaling. It recently got an update with better quality frame generation, less performance hit and lower latency.
@@icy1007 Huh? I've used it on Avatar frontiers of Pandora locked at 82fps max graphics, and it did extremely well upscaling x2 to 164fps, also upscaled bloodborne on shadps4 to 120fps no problem and it looks amazing, I think it does pretty damn well in Elden Ring as well, seeing as it's locked to 60fps, x2 mode makes you able to play elden ring at 120fps and it also looks incredible, and there is no frame gen for that game and probably will never be, lossless scaling is sweet, I have a 4070ti super so i love this new dlss 4 upgrade but it wont give us frame gen on any game we want, and we wont be getting mult gen, but lossless scaling can give us that, although at x3 and x4 i start to see visual shit i dont like, x2 mode is pretty amazing.
Frame Gen is amazing in single player games like Harry Potter, where there is no need for inputs on every frame. Anything with pvp or that requires the fastest reaction time possible, it sucks. It would be great in an MMORPG if all you do is run around town. If you pvp in your mmo, it will be very bad.
There this thing called motion blur, it makes less frames look smoother and looks better than fake frames. It won't feel off like with fake frames and it even hides the artifacts of upscaling/TAA.
It's very good in single player third person games e.g. the witcher 3, assassin's creed. Even better when you're using a gamepad because the latency is whatever if the base framerate is good.
@@dontsupportrats4089 No it won't, it will help you if you can only actually get 60FPS. The numbers are like this. 60FPS No FG Reflex off avg PC latency 52ms 60FPS No FG Reflex on avg PC latency 33ms 240FPS 4xFG Reflex on avg PC latency 38ms Which one are you going to use? the visual feedback and motion clarity are totally different. If you use MFG like this, you're basically cheating. I'm playing Street Fighter 6 with this setting and my anti-air success rate noticeably higher thanks to the better 240FPS motion clarity. and I'm still get better input latency than people who doesn't know how to apply Reflex to your games that doesn't support it. Ultimately it's just a tool for you to use, and you should use it wisely.
Frame Gen is for very high fps to match the high refresh rate monitor, so the GPU puts out 180-250fps and FG would get you to 250-500fps on your 240/500hz monitor because it doesn't look like raster is going to get 500fps and up
Frame gen is for at least 60 base so not too high. For me, a base constant 60 with frame gen can feel better than an inconsitent 80-100 doubled. Of course.
@@tixx7492 The added latency of FG is very low, and since it mandates Reflex, you end up with lower latency than fully native (native without any added nvidia technologies like reflex). Native + reflex will still be faster, but FG mandates reflex, so either way its addition to a game is a boon.
@@frankiepollock7380But 75fps “naked” after apply tech should be doing that baseline 60fps and x2 will make very good and solid 120fps or very acceptable x3 180fps (better than 60 native, and 80/100 ya mention about it (in x2)). People must to realize 70/75fps is the minimum, after apply tech the gpu must be STILL Less than 90% usage.
yes and no. Do you really want to enable fake frames with big latency for online multiplayer games? That's where you NEED a lot of fps. You don't need more than 120 or so in single player games where the MFG can be turned on despite big latency and fake frames.
Only in Cyberpunk. Tried it in HZD Remastered and I would not say it looks great and much better than the older model (3.8.10). It definitely needs time to improve.
None of it's gonna be worth for me as long as there's that smearing effect that I notice easily and then can't unsee.. 4:40 / 5:00 look at the gun's scope and that massive trail that's following it
The game was not built with the huge reliance of motion vectors in mind. The HUD and the translucent elements show this the best. It would be pretty hard reinventing the game’s rendering pipeline from the current deferred mode so we are stuck with this I think until devs start implementing solutions for this.
True, but i tried the already implemented dlss on my 4080S, and the experience was spotty at best. I don't believe generating even more frames will help with the visuals@@omc951
Yup. At this point I'm convinced the people that like this are either blind or leave their FPS counters turned on and enjoy seeing bigger numbers more than they enjoy playing the actual game.
@@paulc5389 Humans can't see ''clear'' outside of central area where they are looking at, rest of the image on a monitor is a blur, there is also a blind spot where our eyes can't see anything, and you see something that might not even be there. Moving the eyes also blurs the entire image, but our brain just shows a clear image based on what was there before, and what it expects to see. Combine that with playing the game, shooting at a target, and most people will miss most of the artifacts.
@paulc5389 it's one of those things you learn to ignore. It's a trade. Learn to ignore sub 60 performance, or learn to ignore artifacts. One I enjoy depending on the hardware and the game. Indiana jones is a very taxing game with stability issues so I personally wouldn't use it with that. Because the instability can break the render layer. If you have ever been a poor pc gamer then you could see the application. If you have only ever had a mid-high end system you won't get it.
@@paulc5389 i personally cant see anything wrong most of the time, most of the reviews says its very good with baseline of 50-60 fps also at 2x mode the artifacts much more less.
Something I wondered about is what was covered briefly, where you end up going over the monitor's max refresh rate with 4X MFG mode. I would guess this is objectively worse than staying within the G-sync range.
I was curious about this as well, in my mind if you are playing a game and you keep getting hit with low 1% lows could you use a combo of MFG and Gsync to get higher 1% lows but stay at your monitors refresh rate
@@danielowentechWhy not just cap your framerate at your refresh rate or 3 frames less? If you over shoot your refresh rate, your 1% lows will be higher, and you can cap a at the refresh rate and maintain the higher 1% lows which will give you a very smooth playing experience. Thats what i do on all games, but i haven’t tested it with frame gen yet, only dlss.
@@KiNg0fShReDI do this with my games using frame gen and it works great. V-sync on in nvidia control panel, limit max fps 3 below the monitor max and everything is smooth
It would be nice to see the RTX 5090's potential fully realized by using DLDSR at resolutions like 5760x3240 while leveraging Multi Frame Generation to match the monitor's maximum refresh rate. This could provide an incredible balance of quality and performance.
I can't play games without DLDSR considering how trash native TAA looks. And now with DLSS 4 Performance looking close to Quality modes of older DLSS, it's just chef's kiss at this point
Honestly, with the how TAA is abused to add new graphics effect, It doesn't change much. I know I'm beating on a dead horse, but look at Stalker 2. If this is what the average game studio think is "current gen fidelity", artifacting is the least of my worries
I don't think people are complaining that it doesn't LOOK better than native. People are frustrated that we're getting barely any BASE performance benefit over the last generation, but we're being sold that it's orders of magnitude faster. Sure, maxing our your monitor's refresh rate will look great, but we NEED a better raster performance more than we need frame generation.
5090 is the only card that brings more or less generation uplift (+25-30%). Lower end models are way smaller and will bring 10-15% uplift, which is indeed very frustrating.
@@stangamer1151 But the 5090 is more or less just a straight up upgraded 4090, proven by the rumors of a 4090ti or titan card with the same cooling solution (early prototype, so no liquid metal yet). The pricing even confirms this. I believe CUDA is a dead end technology and nvidia is going full AI because they don't have a backup plan. The 50 series is just a beefcake version of the 40 series
But it wont make much money with your lame marketing, sounds like something 200$ tv would use... We need AI ti super nano frames multiplicator device to ask 2500$.
@@RobiBrock Which is basically exactly that, frame interpolation and everyone hates it on their TVs, and I hate it on both the TV and graphics card, even though it's a bit more sophisticated on the GPUs, but not by much.
@@RobiBrock which is an accurate association as that is exactly what TV smoothing does, it interpolates frames to increase fps and smooth the experience. the reason TV's could have done this way before gaming is because smoothing a video is different that smoothing a game with unknown data ahead of the existing frame. TV's have all the frame data way ahead of time.
Hey Daniel! You can cap frame generation (at least prior to DLSS4) via NVCP. I use this to set targets just like you would without frame gen. I think it will be interesting to mess with different caps and change between 2x / 3x / 4x in games to get higher base framerates while staying under the reflex cap for 4k240hz (225hz), I believe this should help lower latency in some titles!
I was looking for a comment like this , so just to confirm I can use frame gen to get above my monitor refresh rate but then add a framerate cap so gsync always stays on?
It actually lowers the base frame rates though. The frame rate will always be multiplied by that number so if you have too many frames for the cap, your base frame rate will be capped.
@@saiyaman9000 You need to force VSynch through Nvidia App or control panel. It automatically caps yor frame rate a few frames below your maximum refresh rate. @120hz it caps it at 116fps for example. It is part of Nvidia Reflex which is always forced on when using Nvidia frame gen.
Great video and interesting stuff with new RTX 5000 series technologies! It would be great if you make a video comparison of the artifacts with different DLSS modes and multipliers. I know that youtube can barely show difference due to the image quality, but you can still upload footages at google disk or something like that so we can see it on our high-refresh monitors too. I am thinking about upgrading my 6950XT to something like 5070ti or 5080 to play on 4k 120hz TV of mine, but i am afraid that DLSS will give me too much noticeable artifacts, especially on 65" TV.
I'd expect Digital Foundry and Hardware Unboxed to do some extremely detailed side by sides in the future. I had limited time to work on this video on top of benchmarking for my main 5090 review, so I just tried to highlight the important details on use case and general subjective experience so that people have some idea what's going on with multi frame gen.
So happy that I'm in that 1% of people who enjoys games at 60fps, no more, no less! I can't think how stressed other people are buying 400hz monitors to use mfg x4 and still be disappointed that they don't reach 400fps. I'm even fine with the 40fps mode created by the geniuses at Insomniac for the ps5 games, those 40 frames fell twice as better than the 30 fps. I would always take 60 real fps over any fake frames and enjoy those 4k ray traced graphics.
@@SC87Returns for me, 90 fps is the minimum I will accept. I can't do 60 fps anymore, it feels as laggy as 30 fps did back when I got used to gaming at 60 fps.
Until you play at 144-240 fps. Don’t try it, you will never go back. The consoles will be using this tech soon. Some games already have frame gen. It will be FSR for them and not this.
Losslessly scaling has artifacts and increases the smoothness and people crap on it, Nvidia has artifacts and increses the smoothness and people praise it. Shills all over the place these days.
I would love to see a comparison between this and lossless scaling 3. I’ve been super impressed with lsfg, but it would be interesting to see how much benefit multi frame gen gets out of being integrated directly into games.
I started using Lossless Scaling 3 with my 7900 XTX. It's near flawless. I can now play both Fortnite hardware RT and Witcher 3 RT Ultra at native 2560x1440 res and get around 120-130fps. I just make sure I have anti-lag turned on.
It has the same "issues" as Nvidia but when a tech youtuber uses it they do nothing but crap on it and praise Nvidia even though they both share the same issues. Lossless scaling kind of proves that Nvidia is just making up crap and limiting what older GPUs can do for no reason. Even AMD has fluid motion but it's more open to the GPUs it can be used on.
In conclusion buy lossless scaling for 5 dollars and forget 😂 I don't think this generation it's worth it but the next generation will be really worth to buy, amd has an opportunity here
Tbh I try not to use it under 80fps to max my tv out at 120. It's very dependent on the game. Cyberpunk is always used because the game is basically made to advertise Nvidia tech. It work's extremely well with both Alan Wake 2 and Cyberpunk. DLSS does as well.
@@JoonKimDMD Low base framerate = high base latency. High upscaled framerate + high base latency = strange motion effects, as if you have nausea. If the AI data is trained well (as in CP 2077, Nvidia probably spent a ton of time prepping their model for this and some other top games), then there are almost no tearing or other effects. In other games or with other technologies (Lossless Scaling i.e.), there will be more tearing, artifacts and nausea effects from "fluid" motion, which is not as fluid as if the game was rendered natively at a high framerate.
@@samgragas8467 in CPU limited games it will be a shame to see the latency cut off be lower, but for a lot of the games that support it I'll be getting 120-200 so I'll be very happy with the experience. End of the day even worst case scenario titles like Stalker 2 should do a good job, if not looks like I need a 4K or better display to join my 480Hz and pick and choose what games I play on both. Looks like another CPU upgrade will be needed next year as well.... perhaps
@@SD-vp5vo It won't be useless, you'd just suffer a latency penalty if you less all settings on ultra with PT, if you scale down some settings you get the latency back. 5090 will easily last you the next 4 years, especially with how poorly some games are optimised. MFG will be needed.
is there really that much ghosting/glitching behind/under the car while driving or am i just seeing things? the shadow or something is glitching looks like 0:58
Yeah I am pretty sure it's the framegen (it's even worse with FSR framegen), although if you use the default TAA in cyberpunk instead of DLSS/DLAA etc. there is also A LOT of ghosting. As you can see at 10:28 with default TAA and framegen, the ghosting is very obvious.
Great video. I was testing the new transformer model for about an hour and a half in several areas and using ICAT to compare images: what's more impressive is the new transformer model. DLSS Quality mode looked like trash at 1440p but now, even performance mode is usable. DLSS is now more viable at resolutions lower than 4K. Ray reconstruction used to look BAD on the CNN model, game looked like an oil painting with ghosting. It's all resolved now. The game is so sharp even at DLSS Performance at 3440x1440. Edit: Sorry Ole, I tried to reply twice but for some reason my comments never showed up. Yes, it works on RTX 4K series. Glad you were able to confirm.
@@ole7736 Yes! For my coment above I was testing the new model on my 4090. All RTX 4K owners can use the new model in CP2077 (the game was updated yesterday, check the patch notes to confirm); after the new driver drops next week, for the RTX 5K launch, you will be able to enable the transformer on all games that haven't been updated to DLSS4 through a driver override. I stopped testing and just played the game and CP2077 with path tracing is actually enjoyable now, game looks amazing.
MFG4x is uselles on lower end cards. They didint increased raster performance bellow 5090 that much so this feature is useless for everyone except 5090 owners. Maybe with 5080 in some tittles, but bellow that its uselles because cards wont have raster power for 4k/60. Btw that trail behind car looks so bad. How can you play like that?
I am glad that you actually spoke to the feel/latency and showed it off and compared to how you'd use it in a game like Cyberpunk vs. a competitive shooter. I absolutely agree. I picked up a 5080 and I tested out frame generation due to all the talk about it being latent as I am EXTREMELY picky about input latency. I thought the mouse would feel extremely floaty and in any mode, and really bad in 4x mode. What I found was that the latency was virtually non-existent during my tests. 180's were perfect (not that latency really affects those...but I know how they feel when spinning) and my crosshair did what I wanted. I still wouldn't have it on for a competitive shooter, but for virtually anything else, turn it on IMO. It's glorious.
Don't confuse frametimes with latency. A games latency is not the same thing as the time between frames. Latency is the time between an input and it being sent to the display.
@@conika92 frame times are the times between frames being presented. But let's say a frame is buffered for a bit before reaching the display, and an input takes time to process in the game engine. Then the time between an input occuring and that input being shown on your display is greater than the time between frames presented on your display. I'm oversimplifying all of this, but that's the general idea.
@ I didn't mean the what frametimes means, but thanks. I was asking what are the actual frametimes when using frame gen? I understand that frame makes everything smoother but input doesn't coresponds correctly. It's like having a lag in a offline game if I am correct. My question was if FG is off obvisously for 60 fps you will get 16.6ms what will you get if FG is on, would it remain 16.6ms and show 120 fps or something completely different?
"It almost looks good"- Me seeing the most jank artifacts I have ever seen on screen while he is saying this lol Seriously, I would think my GPU was dying if I saw that happening to me 😬
Same thought, man. It doesn't look close to good, but almost anytime I use frame gen, no matter the base, I always end up turning it off because it just looks weird and my brain rejects it. This though is just straight up in your face janky garbage.
The fact that you can have sub-300 fps in Cyberpunk w/ Path Tracing and all bells and whistles, without your CPU begging for life, is honestly pretty cool. I bet most AAA games will support this, so people can keep their 2-3 Gen old CPUs, upgrade to 240hz-ish monitors and have a better experience. Not always though, but most of the time. Kinda wild!
if that performance would be priced @200-300usd not 2000 that'd be cool. just a reminder that 1440p 180hz monitors cost 150usd in 2024. 1440p 100hz is just 100usd. less resolution you have to start with the worse upscaling is. dlss + fake frames should be additional features on top of decent performance for an AFFORDABLE price that corresponds to gaming performance you get in 2025, not a performance replacement for 10x overprice like mining isn't over and GPUs still can print money. they can't
Just put a limit on the fps and it will be reduced by a large amount. And it depend on the games so the game you play is probably one of the worst since most of the time you can't see it.
Smoothness is nice but the actual game feel is paramount. Can instantly notice frame gen in multiplayer games and even most single player games. Just feels worse, even if only marginally. And I’m not paying 1-2k on a gpu to compromise.
I'm not sure what the compromise is that you're referring to. Without MFG, these cards will be faster, not by much in some cases, but faster. the MFG is an option, not a compromise, and most reviewers I've seen actually seem to prefer it on in games where MFG makes sense, like CP2077 with RT. A competitive game should be better optimized for lower end hardware and this card will eat through those at native or DLSS Quality better than any other card in existence. It's the least compromising card being made. It's just expensive, and if that's the issue, that's fine, but blaming MFG doesn't really hold weight. If you can get a 4090 at MSRP, then do that, the compromise is that it's 20-40% slower, and 25% cheaper. If you don't want to pay Nvidia, buy a 7900xtx or a 9070, the compromise is that while they are cheaper, and offer probably better FPS per dollar than a top end card, they are significantly slower in every regard, and whether people want to acknowledge it or not, have more driver issues than green cards.
@@CarlosElPeruacho its Popular to crap on FG so thats why they do it..thats all. I use it in Cyberpunk so i can play with 90fps Path-tracing and to be honest i really dont feel any difference...i would really need to focus on that. There are some games where it is huge and i dont use it not even if i get bad fps...so thats what is really important, how well it is made into the game etc. (Ratchet and clank has for example terrible laag with fg i would never use it there)
@@Shadowclaw25 Yeah, there's some definite cognitive dissonance around MFG as an optional, powerful, and well implemented tech in these GPUs. Herds gonna herd I guess, fake frames are bad and all that. I have used LSFG, I have a 3080, so it's that or FSR. I find DLSS is usually enough without FG, but on some games it helps saturate my monitors refresh a little better.
Completely agree, maybe I'm just more sensitive to that kind of stuff, but anything above 40 starts to get really noticeable, 60+ is not "floaty" it just unplayable. 70+? It's like playing game with 15fps, that wasn't acceptable even 20 years ago. Maybe if youre playing something really slow paced, with crappy controller, on the tv... But with modern keyboard/mouse (heck, even my current controller is pushing 1000hz polling and 3-5ms latency now), it's just sad to see what is sold to gamers as a next big thing.
Yeah, the x2 FG present in the game right now does not affect the UI in any significant way. The new drivers release in a week - hopefully that will fix…
@@kalle5548that only works if the hud can update at little no CPU cost. Otherwise you'd lose the CPU independence of frame gen. Most games update de hud by updating the game world with physics, to then grab the new stats and draw them on the screen. Running all the physics ain't exactly cheap
I mean, every 3 out of 4 frames you see on a screen are generated. No wonder HUD looks so bad. I do not think there is an easy way to fix this right now. Current AI model is just not good enough to create clean frames w/o any major issues yet. It needs time. By the time RTX 6000 releases, we might get more or less decent quality MFG 4x. But right now MFG 4x is more of a test feature, than a real thing that helps to enhance visuals.
Question: Will this help with games that break if you go over 60 fps? Could you lock the game at 60 so the engine and physical play nice, and turn on frame gen to get extra frames?
I imagine this would work, but the game needs to support frame generation i the first place. This is not driver level like AFMF or a post-effect like lossless scaling FG. You can force the new multipliers through the nvidia app in DLSS 3 titles though, which is nice (although I think the titles need to be whitelisted by Nvidia in their app first).
@@baroncalamityplus most of the games that have engine limits like that also do not support any form of frame gen, but if you can somehow inject it into the engine it might work (though UI detection will still be imperfect, leading to artifacts like the one Daniel shows in his video, but likely worse)
I've used driver based frame gen in Skyrim, which famously hates going over 60fps, and frankly I don't think it's worth it at all. Introduces artifacting without feeling or looking any better
Not DLSS4 MFG but I used the Lossless scaling program (about 6 pounds on steam) to make Elden ring run at 120 FPS no issues at all, looked gorgeous too. So there's a solution.
The HUD-elements are a great example showing just how broken Game-studios and their engines are: Nvidia made it very clear and easy to understand: ANY AND ALL upscalers MUST be done before HUD-elements and fullscreen post-processing or they will result in glitched graphics.... and what do those studios do? Jup - ignore that requirement and intentionally make the game look worse.
The upscaling runs before the HUD is rendered... The Frame Generation runs on the entire frame, if you don't there's the issue that FSR 3 FG has where everything behind a transparent UI element will not be updating at the generated frame rate (which looks equally jarring). Both have their challenges & neither solution offers superior results. However with Nvidias approach & a theoretically perfect AI model capable of reconstructing the HUD elements the output would be considered "perfect," whereas FSR 3 FG would ALWAYS show the same problem no matter what.
@@MLWJ1993 "The Frame Generation runs on the entire frame," Well - that is the thing: Even the documentation tells you that this is the WRONG thing to do. But if you inject something like FSR FG than of course you get degraded quality. And the game shown here is NOT using FSR but DLSS - so it is 100% the games fault and they just made the image-quality worse for no gains.
@ABaumstumpf Not at all "wrong". If you're not doing that you'll get the exact same problem any NATIVE FSR 3 FG IMPLEMENTATION gets, anything behind a transparent UI element will update at the base framerate, not the generated one. If you want to know how that looks, look up Avatar Frontiers of Pandora. It's not at all a better solution to the problem. 😅
@@MLWJ1993 "anything behind a transparent UI element will update at the base framerate, not the generated one." No, not at all, not even close. The stated best approach is quite simple: You provide the framegen-library (whichever you are using) with the full rendered frame WITHOUT Overlays or extra effects, and then you render the HUD etc on top of the provided generated frame. It is literally impossible for any part of the generated image to be updated at a different framerate if you just follow instructions. And not following specs is exactly the BS that Avatar or Flightsim or CB77 did.
@@ABaumstumpf and that's EXACTLY what FSR 3 FG does & guess what. It causes that EXACT ISSUE with transparent UI elements... but you're too dense to accept that as the answer. So I guess the onus is on you to provide proof that what you're spouting here is factually correct, prove it! 😉
im using it for helldivers 2 and the other more graphically demanding games to max out 180 hz on my 1440p monitor. as long as you dont use it with fsr, picture quality really doesnt suffer at all.
@@__-fi6xg True I have rx 7900GRE and i also use AFMF2 in helldievers 2 cuz game run worse and worse each new update and with this i can have around 200fps in 1440p native with max settings on my 240HZ monitor so its fine. AND yes AFMF2 works really well but you need to have anty lag on and also have at least 60fps if you have bellow 60fps game gonna feels worse.
Like everything Nvidia makes, MFG is great for the most expensive setups. If you dont have at least a 180Hz monitor for the MFGx3 you won't really benefit from it. Same as DLSS in a way, where DLSS from 1080p does not have much wiggle room to look good and give performances boost, we are stuck at Quality mode. So the sweet spot to use Nvidia feature is a 1440p 240hz with a GPU capable of doing 60 base FPS (plus the cost of MFG) at 1440p (835p internal with DLSS Balanced). So yeah Nvidia is again fucking over ""budget"" gamers by releasing ultra high end GPU either capable of going up to 4k 240fps with good visual quality at 1600€ (the rumored 5080 cost in Europe) or anything below the ultra high end, struggling to get a 60fps because it can't cheat as much with DLSS and FG without giving up too much on quality.
What? If you don't have a high refresh monitor why would you need a 5060 or any of these cards to be more powerful? Why are you even PC gaming? Consoles are $250-750 lol.
A budget gamer isn't even looking at these. A budget gamer should be looking at older cards on the used market with LSFG. Seriously get a Titan Xp 12gb and LSFG, upscale any game. Build with a 5700x3d, you can play any game you want 1080p to 1440p, get respectable frame rates, and spend less than a 5080, hell, get a good deal and you could spend less than a 5070 Ti.
@@CarlosElPeruacho I put "budget" in a lot of quotation marks, because anything that's not ultra higth end with Nvidia is low end. And you are wrong, even real budget gamers will look at the 5060, because Nvidia dominate the mind share they won't even think about AMD or Intel. And I have migh doubt about future 5060 owners having a +1440p +180hz monitor, and even if they have, I don't think the 5060 will have the raw power to make the baseline fps enough to give an experience comparable to using DLSS and MFG with a 5080 or higher.
Great vid! Thank you for sharing your experience and thoughts on this. I reckon this is a great vid for people to also learn what the frame gen actually does.
If you play only esports games... it's better to go always native and even 100% AMD. But for single player games in 4K, I think NVIDIA is getting the edge
that is 30 fps frame generated to 120. of course its worse. that being said, this is miles better than any open source frame generation implementation right now. And the fact that x2 and x3 is that stable, means its pretty good. if you have 60 fps you can easily make it 120 or 180 and not notice anything with this.
Thanks for showing 4x on performance mode and high base FPS - every other video is only showing it with everything turned on. That's what I was looking for.
having more frames at the cost of a fuckton more latency is SHIT. regular framegen is a good sweetspot in performance offering though depending on the game.
His latency was that bad because he was playing at 30 fps. Frame Gen or not, it would be floaty. As if you didn't watch the video. If you are stuck there for whatever reason, why not enable frame gen and at least get to look at a much more fluid image? Motion blur was used a lot in console games for this exact reason too. It doesn't reduce latency but it smooths our the image and the motion.
This is a problem... TH-camrs seem to be singing the praises of this Frame Generation nonsense on a GPU where one doesnt actually need to turn it on at all. I think ALL TH-cam reviewers should form a "gentleman's agreement" to not talk about this frame generation tech until they have tested it on the 5070 at minimum (because apparently this makes a 5070 comparable to a 4090 in "performance". This will ensure that there are rational and truthful viewpoints for the actual mid range mainstream card and not "facts" based on a £2000 GPU that effectively no one will be able to either afford, or purchase even if they could due to deliberate shortages.
Yeah, this is somewhat my problem with it. It's great..... if you spend $2k+ (about $5k here lol). It's worthless on lower end cards - ie. what most people play on. If they even bother upgrading to this overpriced generation at all. These features are largely useless for most people.
@@matdan2 Why's that, you looking to buy a 5090? Because none of the other cards are gonna be worth it.. Put your money where your mouth and put down that £3000 for the new 90 card..
Now, the question is... How is the transformer model going to translate to the 40 Series? Are lower resolutions finally viable using with DLSS? We remain to see.
DLSS performance mode or balanced is already good, If you have a 1080p monitor use DLDSR 1440p with DLSS, with 1440p monitor you can use DLDSR 4k with performance mode, or just use quality mode on 1440p.
So long as you stay around 1080p internal resolution it should be fine according to reviewers. So 4k monitor + DLSS Performance (1080p internal resolution) or 1440p monitor + DLSS Quality (960p internal resolution). Personally I'm wondering if it's worth to go for 4k DLSS Balanced (1253p internal resolution, right in-between 1080p and 1440p) and using MFGx4 to get near 240 FPS instead of doing MFGx3 with 4k DLSS performance. This will have more input lag since the base framerate will be lower, but might look significantly better due to less upscaling guesswork. Assuming 4k 240 HZ monitor which is the new standard for 5090 owners, of course.
Good morning/Good afternoon, I am Brazilian, and my English is not very good (but I'm improving) to keep up with your content. It would be great if your videos had automatic dubbing on TH-cam.
Telling people that dlss performance + 4x frame gen is normal on a 2 grad gpu is like telling people to run games at the lowest settings on the best gpu on the planet both look equally as bad.
please compare mfg with lossless upscaler experience. the way you describe it and the way it looks in the video seems basically identical to what you get with lossless upscaler.
LSFG won't have Reflex by default. The game needs to have it already. DLSSFG is only usable because Reflex is integrated into it to reduce latency significantly over base native latency before frame generation.
Digital Foundry already did that. You get more artifacting, more haloing, but it's usable, as long as your base frame rate is higher. DLSS FG simply looks and performs better than LSFG. I use Lossless scaling with older games that don't support DLSS so I can get my raster performance up to 240hz on my 4k monitor with my 3080, not so much the FG I tried it in 7 Days, cause FSR sucks in that game. I am planning on getting the 5090, and giving my wife my current PC, and taking hers as a workstation to offload fusion 360 and UE Engine from my gaming rig.
@@CarlosElPeruacho DLSS FG x2 was shown to perform same as last version of LSFG x2 when it comes to latency, don't know about 5000 series frame gen, we will see.
(copying my reply to similar comment above): Not only does lossless scaling not use motion-vector information from game engine (which makes it extremely bad at interpolating), it also uses very crude interpolation algorithms, while Nvidia is using the best AI technology available. Nvidia also has a huge advantage over lossless because it leverages the dedicated FP4 hardware on the GPU, and Nvidia has millions of dollars to throw on super-computers to train their transformers on mind-boggling amount of data. I'm sorry, but thinking lossless can come anywhere close with these 3 disadvantages: No motion-vectors, no FP4 AI hardware, no obscene budget to train the neural-net.
I feel like the main driving force is that they want you to pay for software. The 5090 in their eyes is probably worth 2000 because you get 4x frame green with around a 20-30% performance up lift.
@@doc7000The $550 card also has 4x frame gen, so that doesn't make any sense. Besides that, cute for you to think that they made the 5090 with gamers in mind at all. Maybe as a second thought.
Honestly, I just bought rtx 4070 ti super after AMD stunt and.. I love the frame gen on it. With reflex even on like 50 fps I don't really feel any input lag. I was very hesitant what to feel about it before trying it but honestly now I love it and I believe that 4x on 5000 may actually be nice to use. Not gonna lie, didn't expect to like it at all.
The fact that you spent so much money on such a high end card, and are still subject to needing frame gen should leave you severely depressed, not happy. I was thinking of buying one of those, but thanks to your comment I think I will hold off lol
@brando3342 I've tested it in cyberpunk with path tracing and other stuff I would not turn on just playing it so I had ~40-50fps in 2k, decided to give frame gen a go to decide what do I think about it from experience, not what I've heard and seen. Didn't see any additional input lag, and had more frames, so naturally smoother looking image. Then I decided to test forza horizon 5, kinda old title but I remember barely making to the 60 on lowest on my r9 270x. In 2k I had 160fps on max settings, 220 with frame gen and didn't see any noticable input lag while driving 400km/h. I bought this to play in 2k native and 4k upscaled on high/ultra settings in high framerate. If I can get free frames without input lag and no noticable difference in the image for literally nothing.. why not? I'm sad about how the market looks like now, I've been looking for an upgrade since rtx 2000. I've been waiting for loooong years to get my hands on anything better than my 10+ years old card. That's why AMD got me angry and I unfortunately decided that there's no hope in GPU market so I can go for the best option for me, which was 4070 ti super. I'm tired of waiting and that tweet few days ago was like a kick to the stomach when you're already on the ground after punch. I wanted to go for 7900 xt, but not after that. In my opinion the prices won't get better and unfortunately this is our reality now. The only hope is intel but I don't trust them yet.
@@brando3342 You understand that's w/ settings like Path tracing right? You don't need to use FG but if you want to use settings that are taxing even for the strongest cards on the market (and pretty much impossible on AMD cards), then ya, you're gonna need a little help from FG. I don't think you understand what things like Full RT or Path Tracing actually do or what they are? Maybe you should just stick to raster only at 1080P and entry level budget cards. BTW 4070ti is not such a high end card. It's upper mid-range card.
You sound like a tryhard. If you think you need 360hz in order to be able to compete, you gotta wake up. Fortnite already proved that even console scrubs could compete with their 60 fps cap and their controllers. Wake up, 360hz is you thinking you have an advantage while in reality it doesn't even matter. Maybe 1 moment out of hundreds of hours it did help you.
@@DBTHEPLUG ofc you dont need 360hz but when playing comp games at a high level you should atleast have 240hz, you literally see people earlier on faster monitors.
@ in a tac shooter that makes a huge difference, you mentioned Fortnite as a comp game too, but the skill in Fortnite is the building not the shooting, so a high refresh rate monitor doesnt really do all that much, atleast compared to val and cs. but aiming in all shooters just becomes significantly easier on high refresh rate
@@zephr147 The same advantage a 240hz monitor gives you in Fortnite, is the same advantage you'll have in any other shooter. You're just making up excuses to rationalize how it's possible for people at 60hz to compete with people at 240hz. Rationalize how you have people playing on touchscreens that are able to compete at high levels? The competitive advantage you think a high refresh rate monitor gives you is all mostly in your head. It's just like you thinking you have a chance when opening up those lootboxes in apex legends. When displaying the chance or advantage as a percentage, it would be miniscule. That's why I'm never going beyond 144hz (I could overlock my monitor to 165hz if I wanted to.)
@@huggywuggy594 not necessarily. I’m fine with the responsiveness of 60fps, but not lower than that. I won’t ever use frame gen below 60fps base frame rate.
Slightly yes, because frame gen takes away GPU compute resources. The real issue is that doesn't help you with latency, if you turn 30fps into 120 it will (hoplefully) look decent but feel very wierd, if you turn 120 into 480 you just won't see that much change.
@@lol-bg4wh Who said one should go down 4times the pixels, do have a look up on the different resolution you can have below 4K UHD. Lets not forget the ingame resolution scaling where you can lower it to 80% in most games and not loose noticable quality. This is why i am so against upscaling. No one knows how to tweak their systems anymore and just go after buzzwords like 4K, 5K, 6K. The standard of structure has just been forgotten. In the end you awnsered my point. If you are able to play with the artifacts you can lower you res and not see the artifact without going slowmo.
I come to realize the biggest problem with pc gaming is the never ending FPS chase! All the while games don't look much better than 10+ years ago in many aspects.
People keep getting faster higher resolution monitors and then complain when GPUs can’t keep up, it’s kinda funny. It feels like everyone was just fine with 1080p 60fps for a while (any GPU now will crush that) but now everyone is trying to run crazy stuff like 4K 240hz and if it’s not running at that speed it’s unplayable. People are trying to push 16x the pixels per second as before and freaking out that their card can’t keep up while looking even better. Wonder when we’ll see people mad that things don’t run well in 8k.
@@-Burb Hi end GPUs were always about very high fps until Nvidia started to push for ray tracing, let alone path tracing even Nvidia cards are not fully ready yet, especially with amount of VRAM some of them have, so even Nvidia owners call ray tracing fps killer and often disable it. You can't get stable 1080p 60 fps with ray tracing enabled on low range cards in some games, and ray tracing will become mandatory soon.
I agree. It's why I'm not big on frame gen. Frame Gen with nvidia reflex has latency almost as high a no frame gen and no Nvidia reflex. So it negates nvidia reflex chopping your total system latency in half.
@@Sp3cialk304 frame gen 4x with the new transformer model and reflex doesn't add latency. Go look at ltt benchmarks, absolutely no noticeable increase in latency.
They should really make illegal to say "performance" instead of "frame rate" at this point. They are NOT the same, despite what Nvidia wants you to believe
Daniel, I'd be curious if you see a further improvement in quality using DLSS4 with 4x Frame Gen with DLAA enabled? Maybe it will help with the weird graphical glitches?
No. DLAA only runs on the engine rendered frames, interpolated frames use pixel color and motion information from DLAA-smoothed frames but doesn’t have any additional post processing attached to it. The only thing that can eliminate ghosting artifacts in frame gen is high base frame rate and lack of any persistent UI that could be mistakenly interpreted by the interpolation algorithm as part of the 3d scene.
Initial impressions without detailed comparisons: DLSS 4 is much better. Which makes sense because it is directly integrated into the game and so has access to much more information to work with when doing the interpolation of frames.
@@danielowentech i'd argue that 4x LSFG is nothing compared to native ingame implementation but comparing LSFG x2 used on top of DLSS x2 frame generation could be interesting, in my experience with a 4070ti SUPER this combination isn't perfect but usable, the real bottleneck is the minor artefecting of LSFG x2 just don't start from too low fps or i bet it's going to break much faster than DLSS 4 in x4 mode
I just tried using X2 Nvidia FG and X2 Lossless FG at the same time with a 4090 in Indiana Jones and the experience was too awful for me to keep playing. I can just about handle FSR and DLSS framegen X2 but I can't use Lossless at all. The amount of artifacting is unnaceptable to me and I really don't understand how anyone can put up with it.
@@Veganarchy-Zetetic i don't think you are supposed to mix Lossless with FG, turn FG off in game and use 2x/3x/4x or w/e in lossless, you can use DLSS in game and disable it on Lossless.
I only tried it because someone just mentioned it. I can't use Lossless. It looks terrible no matter what settings or version. Mixing the 2 did infact look slightly better but the latency was the same.
I just knew Dan would open the video with the DOWNSIDE of DLSS 4 because he can't help himself In his video{ FSR 4 Looks AMAZING} " he couldn't contain his jubilation & express how amazed he was as his first impression With DLSS 4 he posts a negative video title like It helps you win MORE, but it doesn't help you win, First impression " Surreal , Almost looks good " Then if the base rate is higher, presto, he finds good things to say Not before he lets you down Its long standing pattern
Remember Daniel, this is still does not have Reflex 2, which will implement Async reprojection that would take user Mouse and only Mouse input into consideration when making those Generated frames.
I am under the impression that Reflex 2 is not compatible with multi frame gen, and is exclusive to a few specific esports titles (at least for now). Keep in mind that reflex 2 also does some "extrapolation" to in-fill guess work about what is happening after the space warp, which means it would have a complicated interaction with frame gen.
@@sufian-ben while this could be implemented in the future, Nvidia hasn’t announced any games using both MFG and Reflex 2 at the same time. I do agree it’s a perfectly logical step to try to integrate them at some point.
2kliksphilip does said Reflex 2 CAN work with Framegen in his 5090 review. But until NVIDIA confirms anything I'll remain skeptical about it (Especially since Reflex 2 isn't even out yet)
@@danielowentech it is said that its designed to work alongside the Multi-Frame generation and its the silver bullet to the x2 performance over 4090. same as Reflex 1 is automatically injected in the older model of Frame Generation. but I might be wrong
No this has to be some kind if confusion. Reflex 2 is not the same as async reprojection. That is the next possible level of latency feeling lowering tech. But it is not what reflex 2 is
Yes, you would probably appreciate seeing "100" fps instead of 30, even though it will feel floaty regardless. However, the idea situation is to, of course, NOT PUT YOURSELF IN THAT TERRIBLE RASTERIZATION POSITION IN THE FIRST PLACE, which is when people are like "well DLSS is better than FSR so I'll go Nvidia over AMD even though it's a worse card for more money" I'm like :| JUST GET THE BETTER CARD
If rasters gonna suck either way, why not make it better? I don’t see how it’s a downside The XTX raster is worse as well, so I don’t see a reason to go with amd
@randomroIIs But it doesn't have to suck, so why buy the worse card because you want more fake/better frames, just get the card that gets the frames you want Having to interpolate at 4K is a first world problem, I'm not focused on that. And how the 7700XT is like a third of the 5090 in performance at 1440p, yet costs a sixth. That shit's tragic.
@@ocha-timeNo card can do path traced 4k ultra..so wym get the card that can do. I'd rather get the one which will offer "muh fake frames" fluidity with minimal graphics loss Also with games having shitty taa these days I'd take dlss4 or fsr4 anyday..Dlss is better now so i will choose that over Amd's nvidia price - $50 with buns fsr3. I will come back to Amd when fsr4 is out and on par with dlss4(fingercrossed)
@@ocha-time Also to adjust settings to get 200fps native, the game would be look dogshit compared to 4k ultra path traced with 200fps fake frames. Watch optimum tech's review
@foxx8x Why are we turning on space tracing in the first place? Do we LIKE to tank our framerate to 25% normal so we can justify throwing AI frames in? People are so stupid with their resources these days lol, they'll front they NEED a 4080 Super to get playable framerates and YES I want to spend $1400 to get it!
As a "1920p" player (using DLDSR 1.78x on a 27" 1440p 360hz QD-OLED monitor) - I can definitely see Multi Frame Gen helping to achieve around 360fps to take advantage of the full 360hz. So I should aim to have a base rendering fps target of 90 first?
It's kind of funny that more than 4 years and 3 gpu generations later and this game is still the game to benchmark lmao. I wonder if anyone actually bought the 2080ti cyberpunk edition.
shows the big lack of innovation from game devs when a 5 year old game is still the benchmark/showcase for new gpus
@@Spaceman69420 Also because Nvidia is using this game as their testing grounds lmao
Only the top-notch Nvidia gpus can run Cyberpunk with 4k and max path tracing
@@Spaceman69420most Devs are moving to ue5 to save money on talent, and ue5 just sucks
basically crysis 3
reject frames per second, embrace hallucinations per second
-Leather Jacket
The more you buy, the more you save 🤣
@@dennythescar80s8 The more you buy, the more you hallucinate.
Now with 200% more Electric Sheep.
We still can't prove that reality is real according to scientists so we can all be hallucinating everything. We will never know.
Epic comment xD
The schizophrenia-based rendering
Lmao
Lol
People saying this act like they don't have a monkey brain..
4090 copium.
@@xvnbm Why does your post have a translation option with the translation being supply?.
All the artifacting from Frame Gen goes well with the relic in your head malfunctioning in Cyberpunk lol
its a feature XXD
Schizo generation
imagine paying 2000usd+ for 64fps @1440p in wukong when 1440p 100hz monitors cost 100usd, 1440p 180hz 150usd.
mining is over, GPUs don't print money anymore, prices must come back to Earth from 10x overprice that makes 0 sense.
dlss + fake frames should be additional features on top of decent performance for an AFFORDABLE price that corresponds to gaming performance you get in 2025, not a performance replacement for 10x overprice like mining isn't over and GPUs still can print money. they can't.
all gamers lose when fake 'gamers' defend overpriced 10x low-VRAM crap, true gamers should demand better GAMING GPUs and pre-mining prices. because thats in TRUE GAMERS INTERESTS. and then gamers can have glitchy fake frames ON TOP of ACTUAL PERFORMANCE, not INSTEAD of actual performance.
the problem is 10x overprice and 0 progress @affordable prices.
gamers and gamedev desperately need some actual progress with AFFORDABLE GPUs since 2016 when mining started. so gamers can finally buy themselves 16gb+ VRAM 200-300usd affordable FAST GPUs and forget about crap low-VRAM graphics in games because gamedev can just ignore low-VRAM slow crap GPUs. the 9070xt with 1440p 52fps in CP2077 or 73fps NO RT in wukong doesn't actually sound like 'fast' in 2025. 46fps @1080p, its like 30-35usd worth of 2025 gaming performance. the 4080/s its the same slow performance for 2025. RTX is 2018 6yo OLD tech. all GPUs must run it by now perfectly fine BUT NOPE.
the 9070 should be a 250usd card to destroy both nv and intel. the 9070xt @300-350usd will be 'fine' - that'd be at least some progress with GPUs for a change. e.g. 9600xt 2008 vs 1060 2016 x9 times faster, 1060 vs 4060 2023 is just x2 faster.
the 5090 64fps @1440p in wukong. basically the 5090 has like 70usd worth of ACTUAL 2025 gaming performance in it.
"modern GPUs" aren't even able to run modern games on ULTRA BUDGET MONITORS.
dude, you just made me wanna buy it. sounds so lit doesnt it 🤣
cool, glitching effect with those artifacting, makes Cyberpunk feel more immersive with all the glitches and stuff lol
RTX 5000 series aren't even out yet, we don't even have drivers for them and the results are crazy good.
@@SweetFlexZ numbers on the screen are good, the result is the input lag and actually screen smoothness and lack of artefacting, wich is missing
What artifacting? He showed artifacts with 30 FPS base / no DLSS upscaling benchmark, and then DLSS performance with 250-280FPS, even with 50% speed you wouldn't notice artifacts.
You should have been there on day 1 CP2077, super immersive being in cyber-psychosis all the time lmao
@@SweetFlexZ lmao what you huffing?
Great video w/ great context.
Would love an updated video comparing DLSS quality to performance! Make this happen, Daniel!
This video was focused on the frame generation. The new upscaling looks great, but would require a separate video for detailed analysis.
@ correct. Meant to say separate video vs updated. Thank you for what you do!
@@danielowentech Hey Daniel, when you do take a look at the Upscalers, could you perhaps compare I.e. CNN Quality vs Transformer Balanced/Performance on 1440p and 1080p?
Would love this as well. The improvement in image quality brought by the new transformer model to me is the highlight of this 5000 launch window and should not be understated as it’s basically free fps for the same (and sometimes even better) image quality
@@danielowentech Compare something like 1440p native image quality vs 4k dlss performance. Which one is faster, which one looks better, and then add these permutations to the transformer model. :D
One thing I would love to see is a comparison between the latest version of lossless scaling. It recently got an update with better quality frame generation, less performance hit and lower latency.
Lossless Scaling is junk. It doesn't have game information and looks horrible.
@ that’s why I asked for a comparison
@@icy1007 Huh? I've used it on Avatar frontiers of Pandora locked at 82fps max graphics, and it did extremely well upscaling x2 to 164fps, also upscaled bloodborne on shadps4 to 120fps no problem and it looks amazing, I think it does pretty damn well in Elden Ring as well, seeing as it's locked to 60fps, x2 mode makes you able to play elden ring at 120fps and it also looks incredible, and there is no frame gen for that game and probably will never be, lossless scaling is sweet, I have a 4070ti super so i love this new dlss 4 upgrade but it wont give us frame gen on any game we want, and we wont be getting mult gen, but lossless scaling can give us that, although at x3 and x4 i start to see visual shit i dont like, x2 mode is pretty amazing.
I bet frame gen would work well in MMORPGs where you are CPU limited in high population towns and latency doesn't matter.
Frame Gen is amazing in single player games like Harry Potter, where there is no need for inputs on every frame. Anything with pvp or that requires the fastest reaction time possible, it sucks. It would be great in an MMORPG if all you do is run around town. If you pvp in your mmo, it will be very bad.
There this thing called motion blur, it makes less frames look smoother and looks better than fake frames. It won't feel off like with fake frames and it even hides the artifacts of upscaling/TAA.
It's very good in single player third person games e.g. the witcher 3, assassin's creed. Even better when you're using a gamepad because the latency is whatever if the base framerate is good.
Cities Skyline 2 would be awesome with fsr/dlss fg.
@@dontsupportrats4089 No it won't, it will help you if you can only actually get 60FPS. The numbers are like this.
60FPS No FG Reflex off avg PC latency 52ms
60FPS No FG Reflex on avg PC latency 33ms
240FPS 4xFG Reflex on avg PC latency 38ms
Which one are you going to use? the visual feedback and motion clarity are totally different. If you use MFG like this, you're basically cheating.
I'm playing Street Fighter 6 with this setting and my anti-air success rate noticeably higher thanks to the better 240FPS motion clarity. and I'm still get better input latency than people who doesn't know how to apply Reflex to your games that doesn't support it. Ultimately it's just a tool for you to use, and you should use it wisely.
Frame Gen is for very high fps to match the high refresh rate monitor, so the GPU puts out 180-250fps and FG would get you to 250-500fps on your 240/500hz monitor because it doesn't look like raster is going to get 500fps and up
Frame gen is for at least 60 base so not too high. For me, a base constant 60 with frame gen can feel better than an inconsitent 80-100 doubled. Of course.
thing is for most games where you get 180-250 frames, u dont want the increased latency of FG/MFG.
@@tixx7492 The added latency of FG is very low, and since it mandates Reflex, you end up with lower latency than fully native (native without any added nvidia technologies like reflex). Native + reflex will still be faster, but FG mandates reflex, so either way its addition to a game is a boon.
@@frankiepollock7380But 75fps “naked” after apply tech should be doing that baseline 60fps and x2 will make very good and solid 120fps or very acceptable x3 180fps (better than 60 native, and 80/100 ya mention about it (in x2)). People must to realize 70/75fps is the minimum, after apply tech the gpu must be STILL Less than 90% usage.
yes and no. Do you really want to enable fake frames with big latency for online multiplayer games? That's where you NEED a lot of fps. You don't need more than 120 or so in single player games where the MFG can be turned on despite big latency and fake frames.
Really good video showing pros and cons of frame gen. Surprised no one else has done this yet.
Daniel is in a league of his own tbh
Hardware Unboxed said they would soon so keep your eye out. They always have great videos too
2kliksphilip
This topic has been covered 100x lol
There isnt pros. There is one pro. Singular.
i think the transformer dlss is also a huge improvment. ultra performance dlss looks so good now. its insane
i think that's the best thing that happened so far, better DLSS.
Only in Cyberpunk. Tried it in HZD Remastered and I would not say it looks great and much better than the older model (3.8.10). It definitely needs time to improve.
None of it's gonna be worth for me as long as there's that smearing effect that I notice easily and then can't unsee.. 4:40 / 5:00 look at the gun's scope and that massive trail that's following it
Yeah defeats the whole reason why I want a high framerate in the first place, MFG is a crap gimmick to paper over the cracks of poor optimisation.
I think he said 5 times that shouldn’t base your opinion on what you see since TH-cam is locked to 60fps.
The game was not built with the huge reliance of motion vectors in mind. The HUD and the translucent elements show this the best. It would be pretty hard reinventing the game’s rendering pipeline from the current deferred mode so we are stuck with this I think until devs start implementing solutions for this.
@@omc951he was implying that artifacts will be LESS visible on the TH-cam video. The red streaking is horrible and not a video compression artifact
True, but i tried the already implemented dlss on my 4080S, and the experience was spotty at best. I don't believe generating even more frames will help with the visuals@@omc951
Jesus I see so many artifacts and glitches everywhere... not just the hud.
Yup. At this point I'm convinced the people that like this are either blind or leave their FPS counters turned on and enjoy seeing bigger numbers more than they enjoy playing the actual game.
@@paulc5389 Humans can't see ''clear'' outside of central area where they are looking at, rest of the image on a monitor is a blur, there is also a blind spot where our eyes can't see anything, and you see something that might not even be there. Moving the eyes also blurs the entire image, but our brain just shows a clear image based on what was there before, and what it expects to see. Combine that with playing the game, shooting at a target, and most people will miss most of the artifacts.
@paulc5389 it's one of those things you learn to ignore. It's a trade. Learn to ignore sub 60 performance, or learn to ignore artifacts. One I enjoy depending on the hardware and the game. Indiana jones is a very taxing game with stability issues so I personally wouldn't use it with that. Because the instability can break the render layer. If you have ever been a poor pc gamer then you could see the application. If you have only ever had a mid-high end system you won't get it.
@@paulc5389 people like it because there i almost no visual artefact with all the update, wait 6 month and mfg will be as good as dlss3...
@@paulc5389 i personally cant see anything wrong most of the time, most of the reviews says its very good with baseline of 50-60 fps also at 2x mode the artifacts much more less.
Something I wondered about is what was covered briefly, where you end up going over the monitor's max refresh rate with 4X MFG mode. I would guess this is objectively worse than staying within the G-sync range.
I think it looks better staying within the range. That's why I preferred X3 to x4
I was curious about this as well, in my mind if you are playing a game and you keep getting hit with low 1% lows could you use a combo of MFG and Gsync to get higher 1% lows but stay at your monitors refresh rate
@@danielowentechWhy not just cap your framerate at your refresh rate or 3 frames less? If you over shoot your refresh rate, your 1% lows will be higher, and you can cap a at the refresh rate and maintain the higher 1% lows which will give you a very smooth playing experience. Thats what i do on all games, but i haven’t tested it with frame gen yet, only dlss.
@@KiNg0fShReDI do this with my games using frame gen and it works great. V-sync on in nvidia control panel, limit max fps 3 below the monitor max and everything is smooth
"Over 240fps on my 4k monitor are too many frames."
What a time to be alive!!
Well, I dont know about his monitor but, in my case, its true since its 4k 144hz
It would be nice to see the RTX 5090's potential fully realized by using DLDSR at resolutions like 5760x3240 while leveraging Multi Frame Generation to match the monitor's maximum refresh rate. This could provide an incredible balance of quality and performance.
Yes
I can't play games without DLDSR considering how trash native TAA looks. And now with DLSS 4 Performance looking close to Quality modes of older DLSS, it's just chef's kiss at this point
@@dvornikovalexei Just wish DLSS wasn't completely broken in the games I play
"the artifacts are looking a lot better!" I think that giant ghosting trail from the sights on that weapon beg to differ a little 4:50
that’s just goes to show how bad it was
Yeah, it looks terrible, everywhere on the screen is all messed up. It would be messing with my head big time
Guys! Guys! Games look better then ever. Games always had the Oil Filter on, you are just rasics. Buy 5090 now
You should see the issues in Cities skylines 2 where DLSS straight up doesn't work and will duplicate the building holograms all over your screen 🤣
Honestly, with the how TAA is abused to add new graphics effect, It doesn't change much.
I know I'm beating on a dead horse, but look at Stalker 2. If this is what the average game studio think is "current gen fidelity", artifacting is the least of my worries
I don't think people are complaining that it doesn't LOOK better than native. People are frustrated that we're getting barely any BASE performance benefit over the last generation, but we're being sold that it's orders of magnitude faster.
Sure, maxing our your monitor's refresh rate will look great, but we NEED a better raster performance more than we need frame generation.
5090 is the only card that brings more or less generation uplift (+25-30%). Lower end models are way smaller and will bring 10-15% uplift, which is indeed very frustrating.
Get with the times caveman, you get 200+ FPS
@@stangamer1151 But the 5090 is more or less just a straight up upgraded 4090, proven by the rumors of a 4090ti or titan card with the same cooling solution (early prototype, so no liquid metal yet). The pricing even confirms this. I believe CUDA is a dead end technology and nvidia is going full AI because they don't have a backup plan. The 50 series is just a beefcake version of the 40 series
This gen is more like a refresh than a generational upgrade.
They should call it frame smoothing and people would fuzz less about it
No they would associate it with the "smoothing" setting that TVs have.
But it wont make much money with your lame marketing, sounds like something 200$ tv would use... We need AI ti super nano frames multiplicator device to ask 2500$.
@@RobiBrockbut it's exactly like what the TVs have, but works better.
@@RobiBrock Which is basically exactly that, frame interpolation and everyone hates it on their TVs, and I hate it on both the TV and graphics card, even though it's a bit more sophisticated on the GPUs, but not by much.
@@RobiBrock which is an accurate association as that is exactly what TV smoothing does, it interpolates frames to increase fps and smooth the experience. the reason TV's could have done this way before gaming is because smoothing a video is different that smoothing a game with unknown data ahead of the existing frame. TV's have all the frame data way ahead of time.
Hey Daniel! You can cap frame generation (at least prior to DLSS4) via NVCP. I use this to set targets just like you would without frame gen. I think it will be interesting to mess with different caps and change between 2x / 3x / 4x in games to get higher base framerates while staying under the reflex cap for 4k240hz (225hz), I believe this should help lower latency in some titles!
What is NVCP and can it be used with lossless scaling? EDIT: Nvm you mean Nvidia Control Panel and yea it does not work with lossless scaling sadly.
I was looking for a comment like this , so just to confirm I can use frame gen to get above my monitor refresh rate but then add a framerate cap so gsync always stays on?
It actually lowers the base frame rates though. The frame rate will always be multiplied by that number so if you have too many frames for the cap, your base frame rate will be capped.
@@saiyaman9000 You need to force VSynch through Nvidia App or control panel. It automatically caps yor frame rate a few frames below your maximum refresh rate. @120hz it caps it at 116fps for example. It is part of Nvidia Reflex which is always forced on when using Nvidia frame gen.
@@ICaImI Lossless Scaling is junk.
Great video and interesting stuff with new RTX 5000 series technologies! It would be great if you make a video comparison of the artifacts with different DLSS modes and multipliers. I know that youtube can barely show difference due to the image quality, but you can still upload footages at google disk or something like that so we can see it on our high-refresh monitors too. I am thinking about upgrading my 6950XT to something like 5070ti or 5080 to play on 4k 120hz TV of mine, but i am afraid that DLSS will give me too much noticeable artifacts, especially on 65" TV.
I'd expect Digital Foundry and Hardware Unboxed to do some extremely detailed side by sides in the future. I had limited time to work on this video on top of benchmarking for my main 5090 review, so I just tried to highlight the important details on use case and general subjective experience so that people have some idea what's going on with multi frame gen.
So happy that I'm in that 1% of people who enjoys games at 60fps, no more, no less! I can't think how stressed other people are buying 400hz monitors to use mfg x4 and still be disappointed that they don't reach 400fps. I'm even fine with the 40fps mode created by the geniuses at Insomniac for the ps5 games, those 40 frames fell twice as better than the 30 fps. I would always take 60 real fps over any fake frames and enjoy those 4k ray traced graphics.
I love real frames as well and 60fps would be fine for me, anything more is just extras for me.
@@SC87Returns for me, 90 fps is the minimum I will accept. I can't do 60 fps anymore, it feels as laggy as 30 fps did back when I got used to gaming at 60 fps.
Until you play at 144-240 fps. Don’t try it, you will never go back. The consoles will be using this tech soon. Some games already have frame gen. It will be FSR for them and not this.
Would be cool if you could test 2x FG and see if the new AI optical flow model is better than DLSS 3.
Image quality wise that is.
I think hardware unboxed had dat in their review and it was few percent better
@shayanhosseini8429 yes yes, but I meant if the image looks better vs FPS is a few percent better 😉
Its 5-7% slower the new model dlss as it uses tensor cores.Thats the improvement on new 5000 series.Rtx 2000/3000 series will struggle.
Losslessly scaling has artifacts and increases the smoothness and people crap on it, Nvidia has artifacts and increses the smoothness and people praise it. Shills all over the place these days.
I would love to see a comparison between this and lossless scaling 3. I’ve been super impressed with lsfg, but it would be interesting to see how much benefit multi frame gen gets out of being integrated directly into games.
I started using Lossless Scaling 3 with my 7900 XTX. It's near flawless. I can now play both Fortnite hardware RT and Witcher 3 RT Ultra at native 2560x1440 res and get around 120-130fps. I just make sure I have anti-lag turned on.
@@chs_ambs8356No way lossless is remotely close to FSR3..
It has the same "issues" as Nvidia but when a tech youtuber uses it they do nothing but crap on it and praise Nvidia even though they both share the same issues. Lossless scaling kind of proves that Nvidia is just making up crap and limiting what older GPUs can do for no reason. Even AMD has fluid motion but it's more open to the GPUs it can be used on.
@@chs_ambs8356this is good to know!
@@drewzle837 so much cope here. This is incredible to watch. Talking to all 3 of you.
In conclusion buy lossless scaling for 5 dollars and forget 😂
I don't think this generation it's worth it but the next generation will be really worth to buy, amd has an opportunity here
Breaking news: frame gen which notoriously requires at least 50 base fps, doesn't work perfectly with below 50 base fps 🤯
You saying we cant test new gen stuff because old gen was bad?
Could you explain what you mean?? I play 30fps classic games and have problems with lossless scaling.
@@JoonKimDMD Frame gen is just not supposed to be used under 60 fps, AMD and Nvidia have both said this
Tbh I try not to use it under 80fps to max my tv out at 120. It's very dependent on the game. Cyberpunk is always used because the game is basically made to advertise Nvidia tech.
It work's extremely well with both Alan Wake 2 and Cyberpunk. DLSS does as well.
@@JoonKimDMD Low base framerate = high base latency. High upscaled framerate + high base latency = strange motion effects, as if you have nausea.
If the AI data is trained well (as in CP 2077, Nvidia probably spent a ton of time prepping their model for this and some other top games), then there are almost no tearing or other effects. In other games or with other technologies (Lossless Scaling i.e.), there will be more tearing, artifacts and nausea effects from "fluid" motion, which is not as fluid as if the game was rendered natively at a high framerate.
Thanks for this video, if anyone is using a high refresh rate display (480Hz in my case) sounds like it's going to be insane.
Yes, but your CPU needs to deliver 130 FPS to reach 480 FPS. It is overkill.
@@samgragas8467 in CPU limited games it will be a shame to see the latency cut off be lower, but for a lot of the games that support it I'll be getting 120-200 so I'll be very happy with the experience. End of the day even worst case scenario titles like Stalker 2 should do a good job, if not looks like I need a 4K or better display to join my 480Hz and pick and choose what games I play on both. Looks like another CPU upgrade will be needed next year as well.... perhaps
new games will be even more demanding and it will all be useless
@@SD-vp5vo It won't be useless, you'd just suffer a latency penalty if you less all settings on ultra with PT, if you scale down some settings you get the latency back. 5090 will easily last you the next 4 years, especially with how poorly some games are optimised. MFG will be needed.
is there really that much ghosting/glitching behind/under the car while driving or am i just seeing things? the shadow or something is glitching looks like 0:58
I say yes it look something is going to me as well
if you're not using fg it's still there on my screen.
Yeah I am pretty sure it's the framegen (it's even worse with FSR framegen), although if you use the default TAA in cyberpunk instead of DLSS/DLAA etc. there is also A LOT of ghosting. As you can see at 10:28 with default TAA and framegen, the ghosting is very obvious.
it's an artifact of super resolution when running at low frame rates
Yes, that's actually TAA that does that.
Great video. I was testing the new transformer model for about an hour and a half in several areas and using ICAT to compare images: what's more impressive is the new transformer model. DLSS Quality mode looked like trash at 1440p but now, even performance mode is usable. DLSS is now more viable at resolutions lower than 4K. Ray reconstruction used to look BAD on the CNN model, game looked like an oil painting with ghosting. It's all resolved now. The game is so sharp even at DLSS Performance at 3440x1440.
Edit: Sorry Ole, I tried to reply twice but for some reason my comments never showed up. Yes, it works on RTX 4K series. Glad you were able to confirm.
Can you run the transformer model on 40 series cards?
@@ole7736 Yes, look at the latest patch notes for CP2077. I was testing it on my 4090.
@@ole7736 Yes! For my coment above I was testing the new model on my 4090. All RTX 4K owners can use the new model in CP2077 (the game was updated yesterday, check the patch notes to confirm); after the new driver drops next week, for the RTX 5K launch, you will be able to enable the transformer on all games that haven't been updated to DLSS4 through a driver override. I stopped testing and just played the game and CP2077 with path tracing is actually enjoyable now, game looks amazing.
I just found out on Reddit that it is possible, from someone using a 4070ti.
MFG4x is uselles on lower end cards. They didint increased raster performance bellow 5090 that much so this feature is useless for everyone except 5090 owners. Maybe with 5080 in some tittles, but bellow that its uselles because cards wont have raster power for 4k/60.
Btw that trail behind car looks so bad. How can you play like that?
wow it really isn't much better than lossless scalings LSFG multi frame gen that's crazy
yeah, and with NVIDIA Reflex 2.0 for older GPUs combined with lossless scaling x3 or x4 it'll solve the latency problem too
@@aradparviz-ry5hv never, losless full software latency, nvidia x4 frame gen is not.
I am glad that you actually spoke to the feel/latency and showed it off and compared to how you'd use it in a game like Cyberpunk vs. a competitive shooter. I absolutely agree. I picked up a 5080 and I tested out frame generation due to all the talk about it being latent as I am EXTREMELY picky about input latency. I thought the mouse would feel extremely floaty and in any mode, and really bad in 4x mode. What I found was that the latency was virtually non-existent during my tests. 180's were perfect (not that latency really affects those...but I know how they feel when spinning) and my crosshair did what I wanted. I still wouldn't have it on for a competitive shooter, but for virtually anything else, turn it on IMO. It's glorious.
Is this a joke? 280 fps with 33.33ms which is equivalent to 30 fps response time? Frame gen sounds like giant scam to me.
Don't confuse frametimes with latency. A games latency is not the same thing as the time between frames. Latency is the time between an input and it being sent to the display.
@@danielowentech Thanks for the correction Dani, really appreciate it. Then what would be the actual frame-times then?
@@danielowentech Thx for sharing knowledge.
@@conika92 frame times are the times between frames being presented. But let's say a frame is buffered for a bit before reaching the display, and an input takes time to process in the game engine. Then the time between an input occuring and that input being shown on your display is greater than the time between frames presented on your display. I'm oversimplifying all of this, but that's the general idea.
@ I didn't mean the what frametimes means, but thanks. I was asking what are the actual frametimes when using frame gen? I understand that frame makes everything smoother but input doesn't coresponds correctly. It's like having a lag in a offline game if I am correct. My question was if FG is off obvisously for 60 fps you will get 16.6ms what will you get if FG is on, would it remain 16.6ms and show 120 fps or something completely different?
"It almost looks good"- Me seeing the most jank artifacts I have ever seen on screen while he is saying this lol Seriously, I would think my GPU was dying if I saw that happening to me 😬
Same thought, man. It doesn't look close to good, but almost anytime I use frame gen, no matter the base, I always end up turning it off because it just looks weird and my brain rejects it. This though is just straight up in your face janky garbage.
The fact that you can have sub-300 fps in Cyberpunk w/ Path Tracing and all bells and whistles, without your CPU begging for life, is honestly pretty cool. I bet most AAA games will support this, so people can keep their 2-3 Gen old CPUs, upgrade to 240hz-ish monitors and have a better experience. Not always though, but most of the time. Kinda wild!
if that performance would be priced @200-300usd not 2000 that'd be cool. just a reminder that 1440p 180hz monitors cost 150usd in 2024. 1440p 100hz is just 100usd.
less resolution you have to start with the worse upscaling is.
dlss + fake frames should be additional features on top of decent performance for an AFFORDABLE price that corresponds to gaming performance you get in 2025, not a performance replacement for 10x overprice like mining isn't over and GPUs still can print money. they can't
Adaxum’s focus on real-world applications makes it a standout project. Joined the presale today!
I got a 240 hz QD OLED monitor and the VRR flicker becomes apparent and makes 120fps look like 60, I dont see the hype and I have a 40 series card
Just put a limit on the fps and it will be reduced by a large amount. And it depend on the games so the game you play is probably one of the worst since most of the time you can't see it.
Smoothness is nice but the actual game feel is paramount. Can instantly notice frame gen in multiplayer games and even most single player games.
Just feels worse, even if only marginally. And I’m not paying 1-2k on a gpu to compromise.
Ok so you can play at 80fps instead of 240... Oh wait.. you said no compromise.......
I'm not sure what the compromise is that you're referring to. Without MFG, these cards will be faster, not by much in some cases, but faster. the MFG is an option, not a compromise, and most reviewers I've seen actually seem to prefer it on in games where MFG makes sense, like CP2077 with RT. A competitive game should be better optimized for lower end hardware and this card will eat through those at native or DLSS Quality better than any other card in existence. It's the least compromising card being made. It's just expensive, and if that's the issue, that's fine, but blaming MFG doesn't really hold weight. If you can get a 4090 at MSRP, then do that, the compromise is that it's 20-40% slower, and 25% cheaper. If you don't want to pay Nvidia, buy a 7900xtx or a 9070, the compromise is that while they are cheaper, and offer probably better FPS per dollar than a top end card, they are significantly slower in every regard, and whether people want to acknowledge it or not, have more driver issues than green cards.
@@CarlosElPeruacho its Popular to crap on FG so thats why they do it..thats all.
I use it in Cyberpunk so i can play with 90fps Path-tracing and to be honest i really dont feel any difference...i would really need to focus on that.
There are some games where it is huge and i dont use it not even if i get bad fps...so thats what is really important, how well it is made into the game etc.
(Ratchet and clank has for example terrible laag with fg i would never use it there)
@@Shadowclaw25 Yeah, there's some definite cognitive dissonance around MFG as an optional, powerful, and well implemented tech in these GPUs. Herds gonna herd I guess, fake frames are bad and all that. I have used LSFG, I have a 3080, so it's that or FSR. I find DLSS is usually enough without FG, but on some games it helps saturate my monitors refresh a little better.
@@CarlosElPeruachoGood god, you guys are like the extreme end of frame smoothing fanatics.
It's interesting to see how 50-series things work in real world usage. Thumbs up!
70 milliseconds ? I know that feels horrible I'm good on that
Completely agree, maybe I'm just more sensitive to that kind of stuff, but anything above 40 starts to get really noticeable, 60+ is not "floaty" it just unplayable. 70+? It's like playing game with 15fps, that wasn't acceptable even 20 years ago. Maybe if youre playing something really slow paced, with crappy controller, on the tv... But with modern keyboard/mouse (heck, even my current controller is pushing 1000hz polling and 3-5ms latency now), it's just sad to see what is sold to gamers as a next big thing.
this channel delivers great vids, thank you daniel
The artificing on the Hud elements is likely a massive bug should be patched soon I think, frame gen shouldn't be effecting the HUD
Yeah, the x2 FG present in the game right now does not affect the UI in any significant way. The new drivers release in a week - hopefully that will fix…
@fakejazznut
100% I remember when frame generation came out this was a problem with spiderman, got patched out.
I was thinking that the hud should be applied after framegen. Hopefully, devs won't abuse this more than they already have with DLSS
@@kalle5548that only works if the hud can update at little no CPU cost. Otherwise you'd lose the CPU independence of frame gen.
Most games update de hud by updating the game world with physics, to then grab the new stats and draw them on the screen. Running all the physics ain't exactly cheap
I mean, every 3 out of 4 frames you see on a screen are generated. No wonder HUD looks so bad. I do not think there is an easy way to fix this right now. Current AI model is just not good enough to create clean frames w/o any major issues yet. It needs time. By the time RTX 6000 releases, we might get more or less decent quality MFG 4x. But right now MFG 4x is more of a test feature, than a real thing that helps to enhance visuals.
Question: Will this help with games that break if you go over 60 fps? Could you lock the game at 60 so the engine and physical play nice, and turn on frame gen to get extra frames?
I imagine this would work, but the game needs to support frame generation i the first place. This is not driver level like AFMF or a post-effect like lossless scaling FG. You can force the new multipliers through the nvidia app in DLSS 3 titles though, which is nice (although I think the titles need to be whitelisted by Nvidia in their app first).
@@baroncalamityplus most of the games that have engine limits like that also do not support any form of frame gen, but if you can somehow inject it into the engine it might work (though UI detection will still be imperfect, leading to artifacts like the one Daniel shows in his video, but likely worse)
I've used driver based frame gen in Skyrim, which famously hates going over 60fps, and frankly I don't think it's worth it at all. Introduces artifacting without feeling or looking any better
@@silverhost9782 yep, same results in Fallout 4
Not DLSS4 MFG but I used the Lossless scaling program (about 6 pounds on steam) to make Elden ring run at 120 FPS no issues at all, looked gorgeous too. So there's a solution.
Great video. Well done!
The HUD-elements are a great example showing just how broken Game-studios and their engines are:
Nvidia made it very clear and easy to understand: ANY AND ALL upscalers MUST be done before HUD-elements and fullscreen post-processing or they will result in glitched graphics.... and what do those studios do? Jup - ignore that requirement and intentionally make the game look worse.
The upscaling runs before the HUD is rendered... The Frame Generation runs on the entire frame, if you don't there's the issue that FSR 3 FG has where everything behind a transparent UI element will not be updating at the generated frame rate (which looks equally jarring). Both have their challenges & neither solution offers superior results.
However with Nvidias approach & a theoretically perfect AI model capable of reconstructing the HUD elements the output would be considered "perfect," whereas FSR 3 FG would ALWAYS show the same problem no matter what.
@@MLWJ1993 "The Frame Generation runs on the entire frame,"
Well - that is the thing: Even the documentation tells you that this is the WRONG thing to do. But if you inject something like FSR FG than of course you get degraded quality.
And the game shown here is NOT using FSR but DLSS - so it is 100% the games fault and they just made the image-quality worse for no gains.
@ABaumstumpf Not at all "wrong". If you're not doing that you'll get the exact same problem any NATIVE FSR 3 FG IMPLEMENTATION gets, anything behind a transparent UI element will update at the base framerate, not the generated one.
If you want to know how that looks, look up Avatar Frontiers of Pandora. It's not at all a better solution to the problem. 😅
@@MLWJ1993 "anything behind a transparent UI element will update at the base framerate, not the generated one."
No, not at all, not even close.
The stated best approach is quite simple: You provide the framegen-library (whichever you are using) with the full rendered frame WITHOUT Overlays or extra effects, and then you render the HUD etc on top of the provided generated frame.
It is literally impossible for any part of the generated image to be updated at a different framerate if you just follow instructions.
And not following specs is exactly the BS that Avatar or Flightsim or CB77 did.
@@ABaumstumpf and that's EXACTLY what FSR 3 FG does & guess what. It causes that EXACT ISSUE with transparent UI elements... but you're too dense to accept that as the answer.
So I guess the onus is on you to provide proof that what you're spouting here is factually correct, prove it! 😉
You can use vsync + reflex to cap the framerate below the monitor refresh rate automatically
You should always force vsync with frame Gen
The same with AFMF2 on my 7800xt its good for a going from 120 fps on Marvel rivals to 240
It's only good because you have already 120fps basic, and the same is true for DLSS FG. FG is very bad when you start with low FPS, so AFMF2.
im using it for helldivers 2 and the other more graphically demanding games to max out 180 hz on my 1440p monitor. as long as you dont use it with fsr, picture quality really doesnt suffer at all.
@@__-fi6xg True I have rx 7900GRE and i also use AFMF2 in helldievers 2 cuz game run worse and worse each new update and with this i can have around 200fps in 1440p native with max settings on my 240HZ monitor so its fine. AND yes AFMF2 works really well but you need to have anty lag on and also have at least 60fps if you have bellow 60fps game gonna feels worse.
@@__-fi6xg I'm nvidia user but I had the luck to test AFMF2 on a 7900xtx and it was amazing good.
Like everything Nvidia makes, MFG is great for the most expensive setups. If you dont have at least a 180Hz monitor for the MFGx3 you won't really benefit from it. Same as DLSS in a way, where DLSS from 1080p does not have much wiggle room to look good and give performances boost, we are stuck at Quality mode. So the sweet spot to use Nvidia feature is a 1440p 240hz with a GPU capable of doing 60 base FPS (plus the cost of MFG) at 1440p (835p internal with DLSS Balanced).
So yeah Nvidia is again fucking over ""budget"" gamers by releasing ultra high end GPU either capable of going up to 4k 240fps with good visual quality at 1600€ (the rumored 5080 cost in Europe) or anything below the ultra high end, struggling to get a 60fps because it can't cheat as much with DLSS and FG without giving up too much on quality.
What? If you don't have a high refresh monitor why would you need a 5060 or any of these cards to be more powerful? Why are you even PC gaming? Consoles are $250-750 lol.
@tyoung319 à 144hz monitor is a high refresh rate monitor. But using MFG x4 with it would imply a FPS baseline of 36, and 48fps for the x3 mode.
A budget gamer isn't even looking at these. A budget gamer should be looking at older cards on the used market with LSFG. Seriously get a Titan Xp 12gb and LSFG, upscale any game. Build with a 5700x3d, you can play any game you want 1080p to 1440p, get respectable frame rates, and spend less than a 5080, hell, get a good deal and you could spend less than a 5070 Ti.
You want the 4060 to be a 4090. Sure.
@@CarlosElPeruacho I put "budget" in a lot of quotation marks, because anything that's not ultra higth end with Nvidia is low end. And you are wrong, even real budget gamers will look at the 5060, because Nvidia dominate the mind share they won't even think about AMD or Intel. And I have migh doubt about future 5060 owners having a +1440p +180hz monitor, and even if they have, I don't think the 5060 will have the raw power to make the baseline fps enough to give an experience comparable to using DLSS and MFG with a 5080 or higher.
Great vid! Thank you for sharing your experience and thoughts on this. I reckon this is a great vid for people to also learn what the frame gen actually does.
If you play only esports games... it's better to go always native and even 100% AMD. But for single player games in 4K, I think NVIDIA is getting the edge
Yeah, AMD's Anti-Lag+ was game changing in the esport games 🤭
Just wanted to say I really enjoy your content! Your reviews are great, and very measured and intelligently presented. Thanks!
1:13 - Watch closely at the palm leaves in the upper left corner, there are a lot of artifacts during movement.
yeah, the trees were stuttering across the screen lol
that is 30 fps frame generated to 120. of course its worse. that being said, this is miles better than any open source frame generation implementation right now. And the fact that x2 and x3 is that stable, means its pretty good. if you have 60 fps you can easily make it 120 or 180 and not notice anything with this.
Thanks for showing 4x on performance mode and high base FPS - every other video is only showing it with everything turned on. That's what I was looking for.
70ms is LAGGY AF, dont support this nvidia gen pls people. over 20 ms and u feel it alot
having more frames at the cost of a fuckton more latency is SHIT. regular framegen is a good sweetspot in performance offering though depending on the game.
His latency was that bad because he was playing at 30 fps. Frame Gen or not, it would be floaty. As if you didn't watch the video. If you are stuck there for whatever reason, why not enable frame gen and at least get to look at a much more fluid image?
Motion blur was used a lot in console games for this exact reason too. It doesn't reduce latency but it smooths our the image and the motion.
Maybe watch the full video and pay attention before commenting.
Nice objective review. Frame gen isn't performance, it's a feature.
It is performance. With caveats.
This project has everything-strong team, great concept, and early adopter perks. Secured my ADX today!
Great job at discribing your experience, thanks Daniel.
This is a problem... TH-camrs seem to be singing the praises of this Frame Generation nonsense on a GPU where one doesnt actually need to turn it on at all.
I think ALL TH-cam reviewers should form a "gentleman's agreement" to not talk about this frame generation tech until they have tested it on the 5070 at minimum (because apparently this makes a 5070 comparable to a 4090 in "performance".
This will ensure that there are rational and truthful viewpoints for the actual mid range mainstream card and not "facts" based on a £2000 GPU that effectively no one will be able to either afford, or purchase even if they could due to deliberate shortages.
Hey TH-camrs, don’t listen to this guy
Yeah, this is somewhat my problem with it. It's great..... if you spend $2k+ (about $5k here lol). It's worthless on lower end cards - ie. what most people play on. If they even bother upgrading to this overpriced generation at all. These features are largely useless for most people.
@@matdan2 Why's that, you looking to buy a 5090? Because none of the other cards are gonna be worth it.. Put your money where your mouth and put down that £3000 for the new 90 card..
Now, the question is... How is the transformer model going to translate to the 40 Series? Are lower resolutions finally viable using with DLSS? We remain to see.
DLSS performance mode or balanced is already good, If you have a 1080p monitor use DLDSR 1440p with DLSS, with 1440p monitor you can use DLDSR 4k with performance mode, or just use quality mode on 1440p.
4k looks insane on transformer. I bet ultra performance dlss at 4k is like performance with previous model.
@@1GTX1 just use dlaa it looks great
@erenyaeger7662 new performance looks better in movement than old quality DLSS its insane
So long as you stay around 1080p internal resolution it should be fine according to reviewers.
So 4k monitor + DLSS Performance (1080p internal resolution) or 1440p monitor + DLSS Quality (960p internal resolution).
Personally I'm wondering if it's worth to go for 4k DLSS Balanced (1253p internal resolution, right in-between 1080p and 1440p) and using MFGx4 to get near 240 FPS instead of doing MFGx3 with 4k DLSS performance. This will have more input lag since the base framerate will be lower, but might look significantly better due to less upscaling guesswork.
Assuming 4k 240 HZ monitor which is the new standard for 5090 owners, of course.
Thanks for this Daniel, I was looking forward to the review of this feature most. Updating the game so I'll try out first-hand as well.
Adaxum’s presale is gaining momentum, and I’m thrilled to be part of it.
This is sick..Seeing optimum tech compare real 70fps to 200 fps "fake frames" fluidity already convinced me on this tech
Good morning/Good afternoon,
I am Brazilian, and my English is not very good (but I'm improving) to keep up with your content. It would be great if your videos had automatic dubbing on TH-cam.
No
It would be nice, but in the meantime keep improving and you'll feel less of a need for auto dubbing
Learning english is a very important skill that can take you further in life. Using subtitles helps you learn it
Telling people that dlss performance + 4x frame gen is normal on a 2 grad gpu is like telling people to run games at the lowest settings on the best gpu on the planet both look equally as bad.
EXACTLY. It’s really annoying when people get new stuff and automatically slobber all over it’s nuts like it owes them money 🤦♂️
i hv nvr gone below dlss quality in 3070 Founders edition and rtx 4070 ti super gigabyte eagle oc since launch jan 2024
Adaxum is one of those tokens that could surprise people down the line. Happy to have gotten in early.
Loseless Scaling X3 exists gamers! Everyone can have a 5090 for 8 bucks 😂
3$ in some countries 😂 best thing ever no jokes
Except MFG isn't far off from the base image, whereas "lossless" scaling isn't lossless at all.
much worse quality though, but the smoothness is insanely good with it
@@SydGhosh You've must've seen old videos. X3 with the latest update is working fantastically I was using it yesterday.
@ Have you used the most recent version?
please compare mfg with lossless upscaler experience. the way you describe it and the way it looks in the video seems basically identical to what you get with lossless upscaler.
shhh ;) >.< LOL
LSFG won't have Reflex by default. The game needs to have it already. DLSSFG is only usable because Reflex is integrated into it to reduce latency significantly over base native latency before frame generation.
Digital Foundry already did that. You get more artifacting, more haloing, but it's usable, as long as your base frame rate is higher. DLSS FG simply looks and performs better than LSFG. I use Lossless scaling with older games that don't support DLSS so I can get my raster performance up to 240hz on my 4k monitor with my 3080, not so much the FG I tried it in 7 Days, cause FSR sucks in that game. I am planning on getting the 5090, and giving my wife my current PC, and taking hers as a workstation to offload fusion 360 and UE Engine from my gaming rig.
@@CarlosElPeruacho DLSS FG x2 was shown to perform same as last version of LSFG x2 when it comes to latency, don't know about 5000 series frame gen, we will see.
(copying my reply to similar comment above):
Not only does lossless scaling not use motion-vector information from game engine (which makes it extremely bad at interpolating), it also uses very crude interpolation algorithms, while Nvidia is using the best AI technology available.
Nvidia also has a huge advantage over lossless because it leverages the dedicated FP4 hardware on the GPU, and Nvidia has millions of dollars to throw on super-computers to train their transformers on mind-boggling amount of data.
I'm sorry, but thinking lossless can come anywhere close with these 3 disadvantages: No motion-vectors, no FP4 AI hardware, no obscene budget to train the neural-net.
Tip, turn on G sync and V sync in the control panel. This way no matter what frame gen won’t push pass your screen refresh rate
Yep! activating an arbitrary FPS cap with frame gen on causes further problems with latency and game feel.
Interesting video! Would love to see a comparison with FSR3 (frame gen), fsr 2.0, and lossless scaling!
3 years for this? this is a 4090 ti.i know nvidia gets all their money with other things and this proves it. they dont care about gpus anymore
I feel like the main driving force is that they want you to pay for software. The 5090 in their eyes is probably worth 2000 because you get 4x frame green with around a 20-30% performance up lift.
i am loving frame gen. I can get high graphics 180 fps on my monitor which is something i could not do before even with low graphics.
3 years...?
@@doc7000The $550 card also has 4x frame gen, so that doesn't make any sense. Besides that, cute for you to think that they made the 5090 with gamers in mind at all. Maybe as a second thought.
@@DBTHEPLUG interesting. Who do you think they made it for then?
As someone who predominantly plays openworld rpg's this is very helpful thanks. Awaiting 5080 benchmarks before I make my mind up...
Honestly, I just bought rtx 4070 ti super after AMD stunt and.. I love the frame gen on it. With reflex even on like 50 fps I don't really feel any input lag. I was very hesitant what to feel about it before trying it but honestly now I love it and I believe that 4x on 5000 may actually be nice to use. Not gonna lie, didn't expect to like it at all.
The fact that you spent so much money on such a high end card, and are still subject to needing frame gen should leave you severely depressed, not happy.
I was thinking of buying one of those, but thanks to your comment I think I will hold off lol
@brando3342 I've tested it in cyberpunk with path tracing and other stuff I would not turn on just playing it so I had ~40-50fps in 2k, decided to give frame gen a go to decide what do I think about it from experience, not what I've heard and seen. Didn't see any additional input lag, and had more frames, so naturally smoother looking image. Then I decided to test forza horizon 5, kinda old title but I remember barely making to the 60 on lowest on my r9 270x. In 2k I had 160fps on max settings, 220 with frame gen and didn't see any noticable input lag while driving 400km/h. I bought this to play in 2k native and 4k upscaled on high/ultra settings in high framerate. If I can get free frames without input lag and no noticable difference in the image for literally nothing.. why not? I'm sad about how the market looks like now, I've been looking for an upgrade since rtx 2000. I've been waiting for loooong years to get my hands on anything better than my 10+ years old card. That's why AMD got me angry and I unfortunately decided that there's no hope in GPU market so I can go for the best option for me, which was 4070 ti super. I'm tired of waiting and that tweet few days ago was like a kick to the stomach when you're already on the ground after punch. I wanted to go for 7900 xt, but not after that. In my opinion the prices won't get better and unfortunately this is our reality now. The only hope is intel but I don't trust them yet.
@@brando3342 You understand that's w/ settings like Path tracing right? You don't need to use FG but if you want to use settings that are taxing even for the strongest cards on the market (and pretty much impossible on AMD cards), then ya, you're gonna need a little help from FG.
I don't think you understand what things like Full RT or Path Tracing actually do or what they are? Maybe you should just stick to raster only at 1080P and entry level budget cards.
BTW 4070ti is not such a high end card. It's upper mid-range card.
1080p competitive games is the future.
You sound like a tryhard.
If you think you need 360hz in order to be able to compete, you gotta wake up.
Fortnite already proved that even console scrubs could compete with their 60 fps cap and their controllers.
Wake up, 360hz is you thinking you have an advantage while in reality it doesn't even matter. Maybe 1 moment out of hundreds of hours it did help you.
@@DBTHEPLUG ofc you dont need 360hz but when playing comp games at a high level you should atleast have 240hz, you literally see people earlier on faster monitors.
@zephr147 Earlier when playing on ultra slow motion. That doesn't matter at full speed.
@ in a tac shooter that makes a huge difference, you mentioned Fortnite as a comp game too, but the skill in Fortnite is the building not the shooting, so a high refresh rate monitor doesnt really do all that much, atleast compared to val and cs. but aiming in all shooters just becomes significantly easier on high refresh rate
@@zephr147 The same advantage a 240hz monitor gives you in Fortnite, is the same advantage you'll have in any other shooter.
You're just making up excuses to rationalize how it's possible for people at 60hz to compete with people at 240hz.
Rationalize how you have people playing on touchscreens that are able to compete at high levels?
The competitive advantage you think a high refresh rate monitor gives you is all mostly in your head. It's just like you thinking you have a chance when opening up those lootboxes in apex legends.
When displaying the chance or advantage as a percentage, it would be miniscule.
That's why I'm never going beyond 144hz (I could overlock my monitor to 165hz if I wanted to.)
Adaxum’s innovative approach to E-commerce integration is a game-changer. Joined the presale today!
Just locked in my ADX tokens during the presale-excited to see how Adaxum evolves.
Is the latency better when you just leave frame gen off?
The general answer is yes
yes but most people would rather get higher fps with same or higher latency i think.
@@huggywuggy594 not necessarily. I’m fine with the responsiveness of 60fps, but not lower than that. I won’t ever use frame gen below 60fps base frame rate.
Slightly yes, because frame gen takes away GPU compute resources. The real issue is that doesn't help you with latency, if you turn 30fps into 120 it will (hoplefully) look decent but feel very wierd, if you turn 120 into 480 you just won't see that much change.
@ you're right, upscaling better
If you can stand the artifact related to the frame gen, then you sure as hell could manage to play at a lower res from the start
This
That's why I play at 1080p for cheap!
I’m sorry but unless you looking for artifacts in slow motion or really hard its very tough to find them, playing on 1080p is much worse
@@lol-bg4wh Who said one should go down 4times the pixels, do have a look up on the different resolution you can have below 4K UHD. Lets not forget the ingame resolution scaling where you can lower it to 80% in most games and not loose noticable quality. This is why i am so against upscaling. No one knows how to tweak their systems anymore and just go after buzzwords like 4K, 5K, 6K. The standard of structure has just been forgotten. In the end you awnsered my point. If you are able to play with the artifacts you can lower you res and not see the artifact without going slowmo.
I come to realize the biggest problem with pc gaming is the never ending FPS chase! All the while games don't look much better than 10+ years ago in many aspects.
Blame Unreal Engine reliance for that.
Since physically based rendering was adopted 10 years ago there were no major breakthroughs in rendering besides ray tracing,
People keep getting faster higher resolution monitors and then complain when GPUs can’t keep up, it’s kinda funny. It feels like everyone was just fine with 1080p 60fps for a while (any GPU now will crush that) but now everyone is trying to run crazy stuff like 4K 240hz and if it’s not running at that speed it’s unplayable.
People are trying to push 16x the pixels per second as before and freaking out that their card can’t keep up while looking even better.
Wonder when we’ll see people mad that things don’t run well in 8k.
@@-Burb Hi end GPUs were always about very high fps until Nvidia started to push for ray tracing, let alone path tracing even Nvidia cards are not fully ready yet, especially with amount of VRAM some of them have, so even Nvidia owners call ray tracing fps killer and often disable it. You can't get stable 1080p 60 fps with ray tracing enabled on low range cards in some games, and ray tracing will become mandatory soon.
latency is king for gameplay and gameplay is all that matters
I agree. It's why I'm not big on frame gen. Frame Gen with nvidia reflex has latency almost as high a no frame gen and no Nvidia reflex. So it negates nvidia reflex chopping your total system latency in half.
yup! graphics wont make a bad game good
@ i heard reflex can also increase latency, after trying it out loads honestly couldnt tell the difference but that was years ago on verdansk
@@Sp3cialk304 frame gen 4x with the new transformer model and reflex doesn't add latency. Go look at ltt benchmarks, absolutely no noticeable increase in latency.
how does latency impact chess?
The red circle example was perfect to show it.
Certainly it was not well trained in this specific scenario.
They should really make illegal to say "performance" instead of "frame rate" at this point. They are NOT the same, despite what Nvidia wants you to believe
Daniel, I'd be curious if you see a further improvement in quality using DLSS4 with 4x Frame Gen with DLAA enabled? Maybe it will help with the weird graphical glitches?
No. DLAA only runs on the engine rendered frames, interpolated frames use pixel color and motion information from DLAA-smoothed frames but doesn’t have any additional post processing attached to it. The only thing that can eliminate ghosting artifacts in frame gen is high base frame rate and lack of any persistent UI that could be mistakenly interpreted by the interpolation algorithm as part of the 3d scene.
I'm sorry but for $2000 there should not be a single video on TH-cam with the spoken words of "low fps". This is terrible. Don't buy it!
That's a dumb thing to say, you can get low fps by just asking an absurd amount of pixels to be rendered by your GPU...
@@MLWJ1993 Stupid comment.
Hey Daniel, would he great to now see how this compares to the new Lossless scaling updating, they massively improved their latency and x3+ modes.
Great tool. Small extra latency and good quality from 80 to 144 FPS.
do you even game?
@@YoungGirlz8463 it seems some ppl just want to see a bloated FPS number on their screens and nothing more. sad times :(
can you compare it to lossless scaling?
Initial impressions without detailed comparisons: DLSS 4 is much better. Which makes sense because it is directly integrated into the game and so has access to much more information to work with when doing the interpolation of frames.
@@danielowentech i'd argue that 4x LSFG is nothing compared to native ingame implementation but comparing LSFG x2 used on top of DLSS x2 frame generation could be interesting, in my experience with a 4070ti SUPER this combination isn't perfect but usable, the real bottleneck is the minor artefecting of LSFG x2 just don't start from too low fps or i bet it's going to break much faster than DLSS 4 in x4 mode
I just tried using X2 Nvidia FG and X2 Lossless FG at the same time with a 4090 in Indiana Jones and the experience was too awful for me to keep playing. I can just about handle FSR and DLSS framegen X2 but I can't use Lossless at all. The amount of artifacting is unnaceptable to me and I really don't understand how anyone can put up with it.
@@Veganarchy-Zetetic i don't think you are supposed to mix Lossless with FG, turn FG off in game and use 2x/3x/4x or w/e in lossless, you can use DLSS in game and disable it on Lossless.
I only tried it because someone just mentioned it. I can't use Lossless. It looks terrible no matter what settings or version.
Mixing the 2 did infact look slightly better but the latency was the same.
AMD fan when FSR flickering: lol, nobody will notice that, unless you zoom 10x
AMD fan when DLSS flickering: ew, no, disgusting
So poor image quality previously associated with cheaper AMD cards is now proudly presented by Nvidia's $2000 card? Great job, Nvidia!
Adaxum’s presale bonuses made it an easy decision for me. Locked in my position early!
I just knew Dan would open the video with the DOWNSIDE of DLSS 4 because he can't help himself In his video{ FSR 4 Looks AMAZING} " he couldn't contain his jubilation & express how amazed he was as his first impression With DLSS 4 he posts a negative video title like It helps you win MORE, but it doesn't help you win, First impression " Surreal , Almost looks good " Then if the base rate is higher, presto, he finds good things to say Not before he lets you down Its long standing pattern
Yep, hes tripling down on / embracing the negative aspects first Which is what one does when they want others to get a bad first impression as well
obvious rage bait is obvious.
or you’re a 12 year old. Probably that one.
@ Bias is obvious as well
I'm sorry but I thought he must be half blind to say it was anything other than terrible, It looks so jank and glitchy
This video convinced me that I will not get Nvidia, I do not need to play with 140+ FPS with more input lag
Remember Daniel, this is still does not have Reflex 2, which will implement Async reprojection that would take user Mouse and only Mouse input into consideration when making those Generated frames.
I am under the impression that Reflex 2 is not compatible with multi frame gen, and is exclusive to a few specific esports titles (at least for now). Keep in mind that reflex 2 also does some "extrapolation" to in-fill guess work about what is happening after the space warp, which means it would have a complicated interaction with frame gen.
@@sufian-ben while this could be implemented in the future, Nvidia hasn’t announced any games using both MFG and Reflex 2 at the same time. I do agree it’s a perfectly logical step to try to integrate them at some point.
2kliksphilip does said Reflex 2 CAN work with Framegen in his 5090 review. But until NVIDIA confirms anything I'll remain skeptical about it (Especially since Reflex 2 isn't even out yet)
@@danielowentech it is said that its designed to work alongside the Multi-Frame generation and its the silver bullet to the x2 performance over 4090. same as Reflex 1 is automatically injected in the older model of Frame Generation. but I might be wrong
No this has to be some kind if confusion.
Reflex 2 is not the same as async reprojection. That is the next possible level of latency feeling lowering tech.
But it is not what reflex 2 is
This is very helpful thank you for show casing the new features on this card. I wonder how this card will perform in the FF7 Rebirth pc version.
11:37 WOW. What happened with the ghosting here? is Transformer disabled?! This looks VERY bad and distracting.
It must be because it looks extremely bad
@@VelvetThunderB99 yeah but before and after there weren't issues like that no?
Got my position locked in during the Adaxum presale. This project is a no-brainer!
The E-commerce and DeFi angle of Adaxum caught my attention. Might be worth considering.
Yes, you would probably appreciate seeing "100" fps instead of 30, even though it will feel floaty regardless.
However, the idea situation is to, of course, NOT PUT YOURSELF IN THAT TERRIBLE RASTERIZATION POSITION IN THE FIRST PLACE, which is when people are like "well DLSS is better than FSR so I'll go Nvidia over AMD even though it's a worse card for more money" I'm like :| JUST GET THE BETTER CARD
If rasters gonna suck either way, why not make it better? I don’t see how it’s a downside
The XTX raster is worse as well, so I don’t see a reason to go with amd
@randomroIIs But it doesn't have to suck, so why buy the worse card because you want more fake/better frames, just get the card that gets the frames you want
Having to interpolate at 4K is a first world problem, I'm not focused on that. And how the 7700XT is like a third of the 5090 in performance at 1440p, yet costs a sixth. That shit's tragic.
@@ocha-timeNo card can do path traced 4k ultra..so wym get the card that can do. I'd rather get the one which will offer "muh fake frames" fluidity with minimal graphics loss
Also with games having shitty taa these days I'd take dlss4 or fsr4 anyday..Dlss is better now so i will choose that over Amd's nvidia price - $50 with buns fsr3. I will come back to Amd when fsr4 is out and on par with dlss4(fingercrossed)
@@ocha-time Also to adjust settings to get 200fps native, the game would be look dogshit compared to 4k ultra path traced with 200fps fake frames. Watch optimum tech's review
@foxx8x Why are we turning on space tracing in the first place? Do we LIKE to tank our framerate to 25% normal so we can justify throwing AI frames in? People are so stupid with their resources these days lol, they'll front they NEED a 4080 Super to get playable framerates and YES I want to spend $1400 to get it!
Adaxum is starting to get some attention. Glad I managed to get in during the presale.
Daniel try to say something positive about frame gen so hard. If he would say the truth ,might get banned by Nvidia like Hardware unboxed before.
As a "1920p" player (using DLDSR 1.78x on a 27" 1440p 360hz QD-OLED monitor) - I can definitely see Multi Frame Gen helping to achieve around 360fps to take advantage of the full 360hz.
So I should aim to have a base rendering fps target of 90 first?
Target 95 FPS unless you are heavily CPU-limited, then it's 90 FPS. Then limit FPS via NVCP and use ×4.