@@anttikangasvieri1361 yea but depends on your CPU generation and vram capacity. Most people can't afford a high end build so they'll probably lag and driver crashes
@JithuChilukuri warp reflex is locked to 50 supposedly.. that technology with 2x framegen could be incredible. Just with artifacts sprinkled in but it'll feel nice!
Frame Warp is atrocious. Look at the right side, the text. It's predicting garbage based on prior frames, and it flickers heavily when it snaps back to the actual rendered frame. Additionaly, they essentialy take a 3D camera, but move it in 2D, meaning you no longer get an accurate representation where you're actually standing. And to hide that, they MUST cut out your weapon, and enemies, moving them back to where they are supposed to be, only based on depth data. While in reality every object in 3D space would need to be cut out and moved to accomodate the new camera position, not just the characters and weapons. And then use prediction inpainting to fill out the empty space. The actual frame you get is completly fake, in both viewing angle and lots of rendered pixels. Don't forget, you will move your camera WAY faster in actual gameplay, meaning that you could end up with 30% of pixels being inpainted with TONS of gibberish and flickering. What about a rocket flying towards your face? Will it be moved as well? What about pillars, walls and so on? They will most likely just be ignored. Also don't forget that your GPU just wasted tons of computing on rendering pixels that are cut out and moved out of frame, and then waste additional computing power to generate way worse visuals.
Frame warp is very cheap and proven to perform. It doesn’t use AI, and it has existed as a feature for VR games under the name “asynchronous spacewarp” for over seven years at this point. It does wonders to improve latency in VR, where latency is much more significant. The artifacting is really not that significant, even in VR, and it can make VR so much less nauseating. If you need to minimize latency as much as possible, the consequence of these artifacts is worth it, but you can always disable it if you don’t need the reduction of latency. All this will do is inpaint live while waiting for the next frame to be created, so it wouldn’t make your frames you would usually get have lower quality. The alternative to inpainting is a still frame. Inpainting paints between frames, not the frames themselves, so I don’t think you should be too worried about image quality reduction. It can only give a user more information.
@@hyl You're right, it's not really "AI". It's a predictive rendering algorithm based prior frames. In short. It has no idea what's supposed to be there, so it predicts it, and fills in garbage. Just open the official video, and open your eyes. It's flickering already on subtle movements. If you pause it at the right moments, you see how scrambled some sections become. The text on the right side is scrambled inbetween certain frames during movement. And this showcase was a best case scenario, it will be way worse during actual gameplay. I'm sure there are tons of players with bad vision and may not be able to notice it. For me i'm being hella destracted by subtle movements in the frame, because i've trained my brain to aim at moving things for the past decades. But sure, we can deactivate it, but then you will have to compete against players with an advantage. Which isn't optimal either.
@ I get what you’re saying. For me though, 60 FPS through some spacewarp solution would be preferable over 30 FPS native. Though warping does result in artifacts like you say, 30 FPS to me looks like the entire frame is artifacting because of how slow the frame updates. If you want to see a bit more about this topic I think Linus Tech Tips made a video looking into the feature under the title “Asynchronous Reprojection”
Yeah, and I dont know if it's just me, but the actual lighting of the frames seems always a bit off (or in the showcase rather the ceiling lights), as in a very subtle flickering overall which is very annoying and can be super distracting
Do you realise DLSS is still done by the GPU hardware? You have no clue what you are talking about smh No car in the world runs with an engine that is not tuned. No program in the world is operated without optimization. The same goes for GPUs. Graphics cards need utilization methods to use as much gpu hardware as possible, and DLSS is the method that does that. People are discarding the fact that DLSS is still done by the GPU hardware, not by the server GPU connected via the internet.
Not only that, everything costs more, this GPU generation is beta testing AI technologies with false advertising on performance. Why would you utilize frame generation to re introduce visual errors and input lag which all previous generations were trying to get rid of? As long as people keep paying for this, it's gonna keep happening. The problem is people who can afford these cards keep paying for worse value and the developers have less incentive to change their model. There is also lack of competition since AMD is literally owned by the same family. I guess we will have to wait for China to catch up before we see real competition.
soo.... gpu companies sell us fake fake frames and fake resolution meanwhile games studios tell us to be happy not owning games? id rather leave this hobby tbh
dude there are so many super games you can play right now on avarage PC. Most of the best games ever are like 5-20 years old, all the new ones are just clones barely nothing new. And the graphics arent that better, im playing stalker 2 currently on top settings but i can tell you now that 5 year old Metro Exodus had better graphics, climate, gamplay and everything.
@@RADkate If you have to make a BS argument like this then you already know you're BSing yourself. It's fake frames because those frames are AI generated.
I like how games are becoming more and more unoptimized that we first needed more powerful hardware to now needing fake frames. All to make a game that doesn't look any better than one built 10 years ago....minus the lighting.
yeah imagine if games actually optimized and all the AI was to increase graphics instead of make them lose the least amount of quality while adding performance
0:30 I agree that it is fake, sounds very scummy, but I do not agree that they should push and FG a bad direction. Performance per transistor is decreasing as fabrication size decreases. This is a hardware problem due to the current limitation of research and laws of physics. While fake frames are still fake, DLSS is still a legitimate technology going forward to tackle the challenges that are being faced.
There is no hardware limitation that necessitates the existence of frame generation, what's happening is companies aren't optimizing their games to save a couple of bucks. Games from almost a decade ago look as good if not better than games made now and run far better, even on last gen consoles. While a physical limit does theoretically exist for transistor size, we have not hit it, especially in consumer electronics. The problem is with the game, not the card, in every industries push to enshittify everything with AI the gaming industry fabricated the narrative that your graphics card is just shit and you should give them thousands more of your money so they can give you a new GPU that "fixes" it with AI.
@@sdg131 nope, shareholders won't invest time and money to "optimise" games when they can sell crap instantly and cash in on first day sales. They get their bonus and move on to ruin another dev.
@@owdoogames whats wrong with the pricing? 5080 is 1000, thats close to what i paid for my 1080ti SIX YEARS AGO, the 5090 is absolutely a pleb tax for streamers etc but who cares lmao, 99% of the people complaining about 5090 price dont need a 5090
Why do you care if then end result is good and makes games look and play better? I don’t care if the game company, Microsoft, or nvidia is the one who improves games
@@samgoff5289 based on current trends, this is not making games better mate. The fact that we are now getting recommended spec sheets that include frame gen and DLSS to reach 60fps is a tragedy. If those targets were 120hz then sure, I have no issues but 60???
8:00 The reason why digital Foundry can get early Hands-On with everything, and release a video is because. They have essentially sold their platform. Anybody that has watched their channel for any long period of time should know this you need only to look at Spider-Man 2 review for a very clear example of this. When that game launched it was very much broken a lot of people were clipping through the map very bad bugs an artifacts. A great game but not without its issues. Digital Foundry neglected to mention any issues whatsoever with the game despite tons of comments talking about how many issues it had solely because they got an interview with the developer(insomniac). Do the right thing because it's the right thing to do. I know not to trust anything they say now because they can be bought.
You do realise that they don't review games right, that they only talk about frame rates and aspects of the tech being used. They showed us a graph of the framerate, they explained how they use ray tracing to fake internal shots through the windows of buildings, they never gave it a score based on the quality of the gameplay. Now if they were faking the frame rate graphs, and telling you about the tech used in the fake trailer after the game was released and pretending it was in the full release, you might have a point. But they don't do that. Same as they don't do reviews. Do you ever think them creaming their pants over FSR or global illumination instead of telling you a game plays like shit, even when the game plays like shit, might be the reason they get access, instead of marking up imaginary reviews that only exist in your head?
Game dev in the next 3-8 years : our game run natively on 15fps with unnoticable visual improvement. but thank to ai 4x multi frame gen now we can run it on 60fps :D Wait.. optimization? What's that? We too lazy to do whatever that thing anyway
Amd is doing that right now with a custom version of fsr 4 for handhelds. It's specifically to 3x+ the battery life and in that context I'm all for it.
@pieterlindeque7798 5090 can max out every game if we talking raster and normal ray tracing on max but if you add path tracing then it cannot achieve 60fps natively without those features especially at 4k resolution
at this point let's just start reading books as entertainment again , since movies , shows , anime , games , social media and everything you can think of has become trash in the past 6 years
so the real question will the 5070 even be worth buying if your not going to use dlss but what if your upgrading a 3070 to a 5070 just in raw power would it be worth the upgrade at 2k gaming?
One of the more subtle issues introduced by the focus on Frame Generation + Upscaling is that the downsides (input lag, artifacts) don't impact the entire GPU stack equally. Rather, these downsides are much more pronounced for entry level or mid-range GPUs than on the top-end cards. On a 4090/5090, the base framerate (pre-framegen) is high enough that input latency is at an acceptable level and introducing 5-10ms of additional input latency isn't the end of the world. That same tradeoff doesn't make nearly as much sense when the base framerate is 30fps. Visual artifacting tells a similar story. On a 4090/5090 where the base framerate is 80-144 FPS, individual generated frames won't stay on the screen for very long, so any artifact is extremely short-lived and likely not to be noticed. When you go further down the stack and the base framerate is 30-50 FPS, visual artifacts would be far more noticeable. Even DLSS upscaling shares a similar problem. Relying on upscaling from 1440p --> 4k or 1080p --> 4k is a much better experience than upscaling from 720p to 1440p. It's easy to recommend DLSS Balanced for 4k users, but harder to recommend DLSS balanced for a 1080p monitor user. These technologies tend to put their best foot forward on the highest-end hardware, but the experience of using these techniques on lower-end hardware is a lot more mixed.
Btw same with taa. The more fps and more resolution you have it directly correlates with all these technologies. So if you're poor wompwomp too bad you're not good enough for jensens jacket gang
Yeah it's really a paradox. All this tech could be beneficial if it could "fake" performance on cards that don't really have it but it doesn't really work there. It is usable really on high end stuff which would probably be powerful enough to give a good experience without all this tech in most games and it is basically just an extra creme on top. I really hope somebody will come up with a better budget offering, be it AMD or even Intel or something
Isn’t this true with all previous versions of DLSS as well though? I suppose it’s probably more pronounced with multi-frame gen since more of the frames aren’t really rendered… But it sounds to me like NVIDIA has tried their best to allow you to choose exactly which settings work best to find the sweet spot for your hardware/workload. It’ll come down to individual preference whether you’d like to play with no artifacts at 20-30FPS or with a bit more latency or artifacts at 100+ fps.
is still early to say if its good or not, like how dlss 1.0 or 2.0 where not great in many games, but dlss 3.7 is absolute great comparted to fsr and in some games even native if you play in 1440p or 4k resolution
Honestly I've been kind of disappointed in Digital Foundry for a while. I don't think they're being paid for glowing reviews or anything. But I think their love of new technology bias them a lot.
Graphics haven't improved much since 2019, but we get now unoptimised games, RE2 remake(2019) and SWBF2(2017) looked great and didn't even have ray tracing, and could run in many gpus.
I prefer good stable graphics than ai generated ultra realistic graphics that look strange, but Nvidia wants games to be artificially more demanding and require exclusive features avaible only to their latest gpus to sell 2000$ gpus, and they even pay developers to force games to have their features.
from personal upscaling experience and not in realtime, even heavy AI rendering with enough time to render, anything below 720p upscaled to 4k is useless. Will look 100% fake and smeary and somehow "painting" like. Even 720p to 4K is already "hard", thats why i would upscale 720p content rather to 1440p (2x), for real time upscaling its much less resource eating regarding to CPU and "AI" stuff, for lets say video upscaling, you save A LOT OF SPACE by making it 1440p instead of 4K, it can look in 1440p even better than the (too extremely) upscaled 4K. I am playing for example Cyberpunk, Stalker 2 and a few GPU hungry online survival games with DLSS.... i have "just" a RTX 2080 but i want to enjoy often 4k60 or 1440p120 on OLED TV. To make it worse, the fps are not allowed to drop below 40 fps or frame times above 12 ms, otherwise Gsync doesnt work on the TV (40-120 Hz Gsync range), so in some games which drop below 40 fps or have some framedrops/stutter due too high resolution, DLSS is a gamechanger to make the "old" 2080 still "nice" to use. But if possible, i avoid ANY "fake" enhancements, including DLSS, even at the cost of reducing effects, texture details, rendering distance etc down to a certain point. In my opinion, AI doenst have any reason to be involved in enhancing every aspect of graphics, fps etc. For me it looks as if DLSS, now this frame generation to 2-4x the fps, its just a BAD TRY to compensate for the resources hungry, mainly UE5 games trying to look hyperrealistic. Good think im playing nowadays often older titles from my childhood, those classic games are so much more entertaining and better, despite much less capabilities of the PCs those days.
Dude, that would only happen with the most demanding games at highest ray tracing/path tracing settings and if you dont utilize DLSS or frame gen (like for example Indiana Jones and Great Circle wont run for more then 30fps if you have everything maxed out including path tracing but dont enable DLSS or frame generation) If you run path tracing/ray tracing settings at low or medium, you get playable framerates even without DLSS or frame generation and if you disable raytracing entirely, then even more so you dont have to worry about framerate whatsover at highest settings possible. I mean ye, you can argue with that kind of money for GPU, it should run a game with full ray tracing settings without DLSS/frame generation higher, but when Nvidia's competitors dont have GPUs that would be able to do that either (at all in fact), then I think lot of this whining is ridicilous. Its literally only in those highest possible ray/path tracing settings in most system heavy games you get such poor framerates if you dont enable DLSS or path tracing and from my personal experience using DLSS3 with 4070 Super I can say that as "average joe gamer" I didnt see or feel any drawbacks using them. Sure maybe if/when you know what to look for or are more sensitive to latency, you can notice the difference, but even then depending on the game you are playing, you can ask yourself does it really matter?
2:20 - i'd like to see someone take this challenge and crack it on older cards, and can officially call out Jensen's BS excuse for exclusive features on new gen cards.
Amd will do it. They have us frame gen foe non 40xx gpus. They will give us multi FG as well. Or buy 5usd prog LossLess scaling and get 4x fg. I used it on old games year ago (fg 3x) to get 25fps game to 75fps.. I knew nvidia will do this, year ago. And they did it.. im still faacinated by ther deception levels.
Playing with frame gen on my current 4080 feels like playing on console on a very bad TV with no game mode. It's doable but man, it's so unpleasant. I hate that this is the future that NVidia is pushing.
This is why instead of hoping AMD competes to lower Nvidia prices, the masses need to support the competition to send a message. If AMD 9070 can hit 4080 levels on raster, I am switching. I hate when companies are entitled to our money due to blind loyalty! There is still hope as NOBODY knows the true performance of the 9070, only AMD and board partners
Is it me or is Nvidias GPUs straight changing our games now.. If they're changing textures,faces,lighting they're changing the developers vision.. That's very different to just adding fake frames.
Exactly. The games made by devs that do pay serious attention to every detail, texture, model and lighting. But that all gets ruined by NVIDIA using AI to literally recreate their whole game.
@@marcogenovesi8570 So you're saying that just because a developer doesn't do what I previously said, they don't have a game? They still made a whole game that people can play, albeit only the rich people, it's not like they used AI to generate the game. Even Minesweeper or Tetris was made by a developer, it's not like the game spawned in out of nowhere. Now NVIDIA is generating 200+ frames with AI which still ruins a lot of efforts that the developer made, which is unfair because the developer didn't use AI but NVIDIA is forcing people to use DLSS because of the 30 FPS you'd get otherwise. Even in the AI-enhanced face that was shown in this video, it is pretty clear that between the original face and the AI one, a lot of shadows and lighting effects were altered. That's my whole point.
Your argument makes no sense. The developers can choose if they want to implement these features. The bad part is if they're used as a crutch against poor optimization.
@deejnutz2068 I think they will especially if they switch from native resolution. Im not saying it's so bad that they won't use it. A lot of people use the current DLSS. All im saying is that PC gamers are not as oblivious as people make them out to be.
I agree with this because any gamer who’s even somewhat into competitive games is already trained to spot pixels that look different (because that means an enemy) so they’re gonna notice “wrong” pixels in AI generated frames
@@patsk8872 Where are you getting this from. The 5070 seems like it will be very close in performance to a 4070 TI super. Maybe slightly weaker. Not "a lot more", as your stating.
yeah, I'm most tired of them saying how DLSS looks better now when every time I turn it on I can see the motion blur and feel the input lag. It looks like shit, stop telling me it doesnt.
It'll be really interesting to see hands-on with all of the frame-gen and the Reflex 2 turned on, to see if the work done to add frame-warping to reduce the latency "feel" reduces the lag feel.
Hi Vex, It took me last year more or less a month to find accurate infos for a Nice upgrade of my system, I watched many of your videos and many other content creators and I came to the point than I had to jump off the green and blue train !!! I went for full red team with 7900 XTX Tuf and a 7800x3d and I 've gotta say than my gaming experience went much better AND Adrenaline is working just great ! Thanks for all the infos you already gave us and the ones arriving now ! Fantastic content, accurate and useful ! Keep up the great job !!!
Faces at 2:06 - the AI face looks it has a plastic sheen and feels not right, but the standard one to me looks more actually human. How is that better?
Well, not everyone is born like a globlin. You have genuinely natural beauties. And are you saying I am plastic? lol I have a natural glowing skin, because I take care of my body and skin.
I think the standard one looked better because it looks more game-like, while the AI face just looks like a deepfake, you get the live action face and you try to plop it into a 3d rendered scene and it will look weird. Keeps reminding me of fully 3D rendered movies and how acting will be a relic of the past, and all movies will look just like real life but rendered on Unreal Engine 9.
@@UltraVegito-GodKiller-1995It's not a 4k card that's gonna eat up more than 12 gb of vram. I have a 4070 ti super and ive hardly ever utilized more than 12 gbs of vram, that too with ray tracing which honestly doesn't make a difference in some titles. It's just a bullshit argument for anyone using 1440p or 1080p cards. For 4k, yes 12 isn't enough.
17:03 Frame Warp is not actually that new. John Carmack was talking about what he called Time Warp that they used in VR in 2017. If anyone is interested it is from the UMKC talk at around the 30 minute mark.
I switched from 3070 ti to 3080 ti 12gb and difference is huge.Sounds stupid but it is,those 4gb really makes the difference in games that are close to using 7gb or more.But the biggest,like huuuge performance boost I got switching from 10700k to 9800x3d. For example in Witcher 3 just switching cards I got 4 fps,from 73 to 77 but less fps drops due to memory.After buying new CPU that jumped to 110 fps
Because there was also a example of a game running running at 124 fps with DLSS4 which means 30fps base… and regardless if it’s 60 fps base it sill adds latency if you actually watched the entire video. It’s simply a bad gimmick to get ppl to buy more.
@@JonBslime tbh I find frame gen even more pointless because of that.. If you can already get 60 fps in single player games, what's the point of using frame gen and doubling latency to get more? Low end cards do not have enough VRAM for it (FG eats it up like a mf), so that is out of the question. The more beneficial use case would be for multiplayer games, but the latency penalty is too much for it to be viable. Maybe one day if they can get the latency penalty down to within a few ms.
@@JonBslime It's easy to do the calculation 30x4 = 120 60x4 = 240 What I don't understand is the thumbnail of the video, which says 214 frame fakes, when doing calculations it's easy to see what the base fps are with dlss performance etc, and then add the x4 of MFG, If I have to use dlss for better performance I use it, also have to say that I'm not a big fan of frame generation.
Massive way to save me money... I will boot game and not use DLSS if I see anything below 60 fps with it off, I will refund the game lmao. Most of the AAA stuff has been terrible as of late anyway.
I hope nobody buys them and I can get a few!! I see way more man childs hating on the 50 series in comments, 80% atleast, good stay on your past gen and shut up!!
In order to sell a product you need consumers. Nvidia is pushing fake frames so hard because there are so many people out here willing to buy this bullcrap. It fucking sucks.
they still have them. Motion interpoaltion is very common in tvs. But they have really high input lag penalty and looks very bad with gaming cuz they dont have access to the game values.
@15:12 Why both? Because its 4x the compute, which means its a free ad for Nvidia on the older gen cards. You can see both DLSS versions, compare image quality, see how much better 4 is and also see how much worse your card performs with it vs a 50 series card and then get the incentive to upgrade. Nvidia will never throw you a bone without gaining something or gaslighting you into upgrading.
It works fine, lossless scaling. but as shown it uses pc resources to generate those frames. Performance/optimilisation wise MFG is better, because it runs in an optimized workflow, and can acces software, driver and hardware level stuff(its all nvidia). it has more potential. But, does it look good, does it perform good, and is it worth spending cash. Esp, because all the dlss /reflex optimisations also reach 4000 series, combined with lossless scaling it's up there too. it leaves a 20-30% improvement. for how much? I dont think its a bad call to pick up a 4090 second hand now, while all 4090 sellers are rush selling those cards because 5070 is going to perform like a 4090.
@@ArdaU Same. Tried it on factorio, and the artifacts of the intermediate frames are way worse then the desire to have more fps. Gladly I found another way to run factory 120 fps native via a mod. I guess the tech only really works if the images are already extremely busy with millions of polygons and textures. For 2d games or anime, every glitch sticks out like a sore thumb.
@@Raynhardx FSR 3 Frame generation is great btw it really feels like you are actually getting high fps but when I move if I watch closer it feels a bit weird for some reason
When you play VR games, you might feel discomfort or pain in your eyes. Why does this happen? It's often due to flickering lights. Sometimes you notice the flickering, and sometimes you don’t but brains still process it. Obvious DLSS random lighting artifacts will strain eyes, even when simply watching monitor pixels.
The slow release of DLSS features makes it harder for people to realise that those tiny artifacts aren't what games normally look like. People just assume the blur and jitter is normal.
Yep! I was called out for "Oh its just how the game is" yet no.. its not. And a lot of people like my self NOTICE these issues when using DLSS or FSR & FG. Its NOT normal at all. The fact its being shrugged off as "Meh, its what ever" is not good.
*Happy new year 🎊 You work for 40yrs to have $1M in your retirement, meanwhile some people are putting just $10K into trading from just few months ago and now they are multimillionaires*
After I raised up to 325k trading with her I bought a new House and a car here in the states 🇺🇸🇺🇸 also paid for my son's surgery (Oscar). Glory to God.shalom.
Good day all👍🏻 from Australia 🇦🇺. I have read a lot of posts that people are very happy with the financial guidance she is giving them ! What way can I get to her exactly ?
when the RTX 90-series comes out there won't be any pixels or game at all, your GPU will just have an AI that will convince you that you've just played the most awesome game ever.
I think they want to limit reflex 2 to 5000 series first where it will be a "best case scenario" on 1 architecture. This will give the best impression and the best initial view in the eye of the press of the tech before ironing out any issues then moving it to older architectures where the tech might not be able to showcase the same results.
You do have a point there. I like 1440p myself but I'll go to 1080p for framerate purposes as I did in Dead Space Remake. The secret is a high-end audio receiver with an upscaler and game mode with low latency. 1080p games actually look incredible on a 4k screen and you can run with REAL framerates with little to no input lag.
What some don't seem to understand is that no matter how much Nvidia is gonna reduce the input latency, you will still have the base framerate one. So let's say a base framerate of 30 fps, but MFG gets it up to 120 fps, you will have 33.33ms and NOT 8.33ms you would get from native 120 fps.
Keep in mind that DLSS can easily double your base frame rate if you're GPU bound due to AI upscaling, which positively affects input lag. If your base frame rate without DLSS is 30 fps, then with DLSS 4 it may jump all the way to 240 fps with your base fps doubling. It's honestly crazy technology and doesn't look as bad as people make it out to be. Only for companies to throw it all under a bus and lower your base frame rate to 15 fps. Say hello to good old TAA smearing, because you WILL experience it if you didn't before.
@Vex it's probably an option in the game settings because the "feature" is going to be hardware-bound. You need the RTX 5000 to be able to use it. For folks without one, tough luck.
convolutional neural networks tend to be smaller in size and more computationally effective than transformer models, meaning that the new model will probably use more VRAM and cores
Yeah I am stoked to see if the lower res upscaling looks better. Performance at 4k is as far as I can go but I prefer balanced. Maybe there is some magic that makes performance look great so I can actually run pathtracing at 60fps 4k with framegen
Bro fr trust me guys, dlss has always increased fps by alot and seeing that GPU frame ratio to ai frame as 1:15 is crazy like that increases fps by a ton. No doubt why gensen said 5070=4090 Dlss 4 is gonna be so freaking good that I bet 5060ti will be the best budget GPU if it's price is around 300$ Edit: I also think 5090 users will switch to 8k for triple a games instead of 4k now, we might literally get 8k benchmarks soon
Just stop talking, man. You're endorsing the decline in graphics card development. Instead of focusing on true hardware raw power like previous generations, these cards rely entirely on gimmicky features. You're part of the problem because you fail to see how you're falling for NVIDIA's trick, paying for software-based features that have little to do with the real raw performance of a proper graphics card, like those from earlier generations. Paying for fake generated frames instead of true raw frames per second.. why are you lowering your needs and are willing to just accept anything for the sake of it?
Unlike 'brute force' rasterisation, most of these 'features' have to be incorporated into the game to be utilised by the GPU. That's what sucks about this stuff. Getting a newer GPU doesn't necessarily mean the performance of older games will be improved - support has to be enabled by the game developer.
In the future years we wont even need a GPU... we will pay for " cloud " service that allow our GPUless pc to access AI and get 400fps ... I dont like it x)
@@johnc8327 yeah, it's similar, but I'm talking actually about paying for a non existed GPU.. Cloud gaming you still need the console right? ( I really just care about pc, never looked into it tbh)
Nah they have cloud streaming games for PC already. Geforce Now was actually pretty good, gave fast high end graphics with no noticable latency as long as I had a good internet connection. Problem was licensing. Geforce Now only lets you play games you own on Stean or Epic (or GOG) but only if the publishers lease the game on the platform. That means tons of games aren't on there. Was fun while I couldn't get a graphics card during Covid, but I prefer being able to play on my own PC.
Also they are advertizing their card as x4 faster. But this only works in the games that support DLSS framegen. At the moment only game I encountered that support it is Stalker 2. What about older games ? Or all other that do not support latest bells and whistles ?
@@Dempig I did not mean old games like 2004 games. But Games that came year or two ago, that do not have support for DLSS or Framegen, and never will. And by the way, not all games have DLSS
@@PixPunxel What games released int he past 2 years are you playing that do not have DLSS? Far more games have DLSS than FSR. Devs dont even bother adding FSR because of how bad it is. Upscaling is required for new games or they would be unplayable, 99% will support dlss.
@enzoguimaraescost you cant make an argument for the 0.02 percent. Pc gaming has never relied on optimization as a whole. Cuz u can't optimize a game for every pc hardware configuration.
@@dante19890 Tf does "brute force" mean? Every game used to be optimized like crazy to even get it to run. Quake, doom, crisis, etc. Pretty much every game from that era was running with 1/50 the processing power and look about as good as games from way later than them.
Any inferences of how these cards will work in the *_non-gaming_* world? You know, where actual *_work_* needs to be done? Like in wrangling big Photoshop files, video editing, CAD, Blender, etc?
"you will own no real frames and be happy" 3090 was around 2Ghz 4090 is around 3Ghz I feared the 5090 would not see the same 70% jump in performance because he wasn't going to repeat that clock increase and sadly I was correct
Optimization is a thing of the past for AAA. frame generation and DLSS will make Nvidia more money, save money for the studios, all while the gamer will be paying more for less Everyone wins except gamers
prior to DLSS, I came across this Depth Aware Video Frame Interpolation (DAIN) alpha which amazed me at first. But upon close speculation similar artifacts were noticed. We had a debate in a group if this could be implemented live in games where the next frame was a guess. 8 years later improving on this approach, AI is still being taught on how to guess better, instead of cards working out to push higher...
Frame generation tech has horrible latency at lower frame rate, its great to increase performance if it can achieve stable 60 ish fps to say 120 or 180 bit otherwise the input lag is terrible
With DLSS super sampling, you would need a super weak GPU to not get 60fps. Even a 5060 will likely be able to DLSS balanced + frame gen to put out a better experience than a 3090 using no DLSS at all. Brute force rendering will be a worse experience unless you have a 5090.
I was thinking.. if it went from like 120 to 135 or whatever... That means the base frame rate is 68 instead of the 60 fps of the old fg which is a few ms better latency. 🤷 Better is better I guess.
@@johnc8327no offense but everyone with a brain would know you'd get a better experience with a 3090 in any game in any settings. Not even talking about the vram amount of the 5060. Framegen is useless for most people
12:16 - The driver doesn't hook into the software. Instead the software talks to the driver and the driver can "mishear" the software. 16:37 - Frame warp only reduces camera rotation latency, nothing else. Also, afaik, it only works in first person games.
Not excited about the frame gen, and at the 9070 XT vs. 5070 price point, I think the former might make up for DLSS Super Res. with it's superior rasterization + inherent quality diff. between upscaled and native
Didn’t they already say that dlss 4 will have fixed the artifacting issues at a very small increase to latency? Any chance the current issues in the video are going to be worked out prior to launch?
Yeah Nvidia Reflex 3 will go to the future and grab the future frames for you and put them on screen so you can have -15ms latency, players on RTX6090 have an advantage on older cards and you will rank up in the future using AI TOPS
If only 1 of every 4 frames are real, then Nvidia better accept only 1 in every 4 of the $$$ that I pay for their GPUs are real. Give me my $500 5090 now, Jensen.
@@endureuntiltheend86 the greatest magic trick of all time. Fooled everybody by being the one to write the books, make the movies and pick our politics. Then pretend it’s all been us. Wild it worked without questions for decades. The internet is what will end up waking ppl up and saving us. They all want it regulated
What is the functional purpose of framegen? Like i thought the whole reason ppl wanted high frames was because it made games run smoother. So frame gen gives you a higher framerate in exchange for smoothness?
Reflex 2 technology with in-painting missing spaces looks identical to z-buffer 3d issue reshade and vorpx use. For VR for each eye pixels on a rendered frame are moved left or right depending on their depth buffer info, which means close objects are moved a lot and it leaves sort of a glass/water halo artifact around objects. I wonder if nvidia tech could somehow be applied to fix that too
Crazy now, FPS is no longer the metric of performance. It is now latency. Because with frame gen we can have all the FPS we want but the response could be terrible.
@@ratchetjoker1317 I think it IS the future. It's just not ready. Far more concerning is they are selling these cards based on their AI performance and not their raster.
I believe the main reason CCN is available in CP2077 is that it essentially serves as a testbed for new Nvidia technology, making it easier to showcase improvements.
11:04 Im having a hard time figuring out if im dum or im missing something. Like how can DLSS 4 generate 240 fps if the base fps is 30 ? DLSS generates frames in between each other so generates 4 fps from 1 fps shouldn't it at best be like 120-150 fps ?
The base 30FPS are without any DLSS, no upscaling and no frame generation. The 240FPS are achieved by first upscaling with DLSS to 60FPS, then they push the three fake frames to hit 240FPS.
People will 100% notice it. They may not know exactly why it is, but they will notice if they have eyes. Especially a problem for PC gamers and lower end people, as the closer you are to the screen and lower fps you are. monitors have lower res on avg compared to tvs will make it very obvious. Not even mentioning that none of this 50-60ms input lag is even remotely close to playable on a mouse. Even going from 100fps as my base it's very clear when I enabled framegen, the lag is unbearable.
0:03 It is rtx 4090 not rtx 5090
u right, my b. didnt even catch it in the edit
@@vextakesit's ok.. you're still the goat
@@vextakes take some rest brah
@@vextakes you need it
@@vextakes you need it
In the RTX 6000 generation, you receive an LSD pill instead of a GPU, allowing you to visualize all those frames directly in your mind.
lower latency as frames are in your mind man
@@anttikangasvieri1361 yea but depends on your CPU generation and vram capacity. Most people can't afford a high end build so they'll probably lag and driver crashes
id pay for that lol
meanwhile AMD just releases another 6800xt
nice stolen comment lil bro
Nvidia incoming drivers:
dlss4 = false;
if ( series >= 5000 ) { dlss4 = true; }
Too verbose.
dlss4 = series >= 5000;
@@tibui-c4k this guy codes
@@tibui-c4k not sure c++ can compile that 🤔
Only multi frame generation is exclusively to 50 series all remaining features are coming to 40,30,20 series
@JithuChilukuri warp reflex is locked to 50 supposedly.. that technology with 2x framegen could be incredible. Just with artifacts sprinkled in but it'll feel nice!
Frame Warp is atrocious. Look at the right side, the text. It's predicting garbage based on prior frames, and it flickers heavily when it snaps back to the actual rendered frame.
Additionaly, they essentialy take a 3D camera, but move it in 2D, meaning you no longer get an accurate representation where you're actually standing. And to hide that, they MUST cut out your weapon, and enemies, moving them back to where they are supposed to be, only based on depth data. While in reality every object in 3D space would need to be cut out and moved to accomodate the new camera position, not just the characters and weapons. And then use prediction inpainting to fill out the empty space. The actual frame you get is completly fake, in both viewing angle and lots of rendered pixels.
Don't forget, you will move your camera WAY faster in actual gameplay, meaning that you could end up with 30% of pixels being inpainted with TONS of gibberish and flickering. What about a rocket flying towards your face? Will it be moved as well? What about pillars, walls and so on? They will most likely just be ignored.
Also don't forget that your GPU just wasted tons of computing on rendering pixels that are cut out and moved out of frame, and then waste additional computing power to generate way worse visuals.
Frame warp is very cheap and proven to perform. It doesn’t use AI, and it has existed as a feature for VR games under the name “asynchronous spacewarp” for over seven years at this point. It does wonders to improve latency in VR, where latency is much more significant. The artifacting is really not that significant, even in VR, and it can make VR so much less nauseating. If you need to minimize latency as much as possible, the consequence of these artifacts is worth it, but you can always disable it if you don’t need the reduction of latency.
All this will do is inpaint live while waiting for the next frame to be created, so it wouldn’t make your frames you would usually get have lower quality. The alternative to inpainting is a still frame. Inpainting paints between frames, not the frames themselves, so I don’t think you should be too worried about image quality reduction. It can only give a user more information.
@@hyl You're right, it's not really "AI". It's a predictive rendering algorithm based prior frames. In short. It has no idea what's supposed to be there, so it predicts it, and fills in garbage. Just open the official video, and open your eyes. It's flickering already on subtle movements. If you pause it at the right moments, you see how scrambled some sections become. The text on the right side is scrambled inbetween certain frames during movement. And this showcase was a best case scenario, it will be way worse during actual gameplay. I'm sure there are tons of players with bad vision and may not be able to notice it. For me i'm being hella destracted by subtle movements in the frame, because i've trained my brain to aim at moving things for the past decades.
But sure, we can deactivate it, but then you will have to compete against players with an advantage. Which isn't optimal either.
@ I get what you’re saying. For me though, 60 FPS through some spacewarp solution would be preferable over 30 FPS native. Though warping does result in artifacts like you say, 30 FPS to me looks like the entire frame is artifacting because of how slow the frame updates. If you want to see a bit more about this topic I think Linus Tech Tips made a video looking into the feature under the title “Asynchronous Reprojection”
@@hyl LTT is the last channel you want for any in-depth technical analysis. I'd be waiting on a Digital Foundry video for stuff like this!
Yeah, and I dont know if it's just me, but the actual lighting of the frames seems always a bit off (or in the showcase rather the ceiling lights), as in a very subtle flickering overall which is very annoying and can be super distracting
We came from not owning games, to not owning frames.
COPE
Nvidia is the best
@@FO0TMinecraftPVP Projecting troll buddy boy boy.
@@seaneckhart9914 What lol
@@seaneckhart9914 Jews? You mean you? Autocorrect going crazy.
Do you realise DLSS is still done by the GPU hardware? You have no clue what you are talking about smh
No car in the world runs with an engine that is not tuned. No program in the world is operated without optimization. The same goes for GPUs. Graphics cards need utilization methods to use as much gpu hardware as possible, and DLSS is the method that does that. People are discarding the fact that DLSS is still done by the GPU hardware, not by the server GPU connected via the internet.
I don't like where we are going with fake everything
Started with fake boobs and toupes.
The Matrix
Not only that, everything costs more, this GPU generation is beta testing AI technologies with false advertising on performance. Why would you utilize frame generation to re introduce visual errors and input lag which all previous generations were trying to get rid of? As long as people keep paying for this, it's gonna keep happening. The problem is people who can afford these cards keep paying for worse value and the developers have less incentive to change their model. There is also lack of competition since AMD is literally owned by the same family. I guess we will have to wait for China to catch up before we see real competition.
What is real? right?
there next GPU will be designed, named, and manufactured by AI
Imagine playing Hogwarts Legacy and everyone is a Harry Potter by Balenciaga character
to be honest, that would be kinda funny in its own way^^
soo.... gpu companies sell us fake fake frames and fake resolution meanwhile games studios tell us to be happy not owning games? id rather leave this hobby tbh
...SOON ENOUGH,IT'LL BE ABOUT MAYBE EVEN *_food_* ,THO
........INDEED A NIGHTMARE IRL
Just buy old games or wait for these new overpriced games to go on major sales before sweeping them up. Or consider sailing the high seas
dude there are so many super games you can play right now on avarage PC.
Most of the best games ever are like 5-20 years old, all the new ones are just clones barely nothing new.
And the graphics arent that better, im playing stalker 2 currently on top settings but i can tell you now that 5 year old Metro Exodus had better graphics, climate, gamplay and everything.
define fake because they are still being output rasterization is Just as inaccurate as vector reconstruction lol
@@RADkate If you have to make a BS argument like this then you already know you're BSing yourself. It's fake frames because those frames are AI generated.
I like how games are becoming more and more unoptimized that we first needed more powerful hardware to now needing fake frames. All to make a game that doesn't look any better than one built 10 years ago....minus the lighting.
yeah imagine if games actually optimized and all the AI was to increase graphics instead of make them lose the least amount of quality while adding performance
Ngl
Cyberpunk, black myth wukong and hellblade actually look great
And beat games from like 2015
@@blackface-b1v definitely not Battlefront 2...
They should be using AI to optimize the game code rather than generate fake frames and fake resolutions.
@pcwalter7567 battlefront 2 looks incredible
I would consider it an outlier
It still looks worse than black myth tho... even if by a little bit
0:30 I agree that it is fake, sounds very scummy, but I do not agree that they should push and FG a bad direction. Performance per transistor is decreasing as fabrication size decreases. This is a hardware problem due to the current limitation of research and laws of physics. While fake frames are still fake, DLSS is still a legitimate technology going forward to tackle the challenges that are being faced.
Glad to see someone else understands it's a limitation of physics. Though that doesn't excuse Nvidia's pricing and marketing strategies.
There is no hardware limitation that necessitates the existence of frame generation, what's happening is companies aren't optimizing their games to save a couple of bucks. Games from almost a decade ago look as good if not better than games made now and run far better, even on last gen consoles. While a physical limit does theoretically exist for transistor size, we have not hit it, especially in consumer electronics. The problem is with the game, not the card, in every industries push to enshittify everything with AI the gaming industry fabricated the narrative that your graphics card is just shit and you should give them thousands more of your money so they can give you a new GPU that "fixes" it with AI.
@@sdg131 nope, shareholders won't invest time and money to "optimise" games when they can sell crap instantly and cash in on first day sales. They get their bonus and move on to ruin another dev.
@@owdoogames whats wrong with the pricing? 5080 is 1000, thats close to what i paid for my 1080ti SIX YEARS AGO, the 5090 is absolutely a pleb tax for streamers etc but who cares lmao, 99% of the people complaining about 5090 price dont need a 5090
@@laxminarayananks1520Once again, we're conflating things like Nvidia both makes graphics cards and games. 😅
DLSS4 will make devs even lazier at optimizing their games then they already are
They will probably just download movie assets with millions of polygons and triangles
@@poka26ev2they already are
Why do you care if then end result is good and makes games look and play better? I don’t care if the game company, Microsoft, or nvidia is the one who improves games
cutting even more shortcuts
@@samgoff5289 based on current trends, this is not making games better mate.
The fact that we are now getting recommended spec sheets that include frame gen and DLSS to reach 60fps is a tragedy.
If those targets were 120hz then sure, I have no issues but 60???
That CEO will say anything to get his stocks up lol. AI AI AI A AI
AI event 2025 😂
CES? Nah CAI
you just raised nvidia's shares by 2% bu saying AI a couple of times.
He said AI over 200 times in that event lol
That’s kinda his job lol
8:00 The reason why digital Foundry can get early Hands-On with everything, and release a video is because. They have essentially sold their platform. Anybody that has watched their channel for any long period of time should know this you need only to look at Spider-Man 2 review for a very clear example of this. When that game launched it was very much broken a lot of people were clipping through the map very bad bugs an artifacts. A great game but not without its issues. Digital Foundry neglected to mention any issues whatsoever with the game despite tons of comments talking about how many issues it had solely because they got an interview with the developer(insomniac). Do the right thing because it's the right thing to do. I know not to trust anything they say now because they can be bought.
df is pretty boring nowadays i can't stand alex just constantly laughing at every word coming out of his mouth
Df sux
They did the same thing with Horizon Forbidden West back in 2022 when it launched for PS5.
You do realise that they don't review games right, that they only talk about frame rates and aspects of the tech being used.
They showed us a graph of the framerate, they explained how they use ray tracing to fake internal shots through the windows of buildings, they never gave it a score based on the quality of the gameplay.
Now if they were faking the frame rate graphs, and telling you about the tech used in the fake trailer after the game was released and pretending it was in the full release, you might have a point. But they don't do that.
Same as they don't do reviews.
Do you ever think them creaming their pants over FSR or global illumination instead of telling you a game plays like shit, even when the game plays like shit, might be the reason they get access, instead of marking up imaginary reviews that only exist in your head?
Cool conspiracy theory bro.
Game dev in the next 3-8 years : our game run natively on 15fps with unnoticable visual improvement. but thank to ai 4x multi frame gen now we can run it on 60fps :D
Wait.. optimization? What's that? We too lazy to do whatever that thing anyway
...WELL,THERES STILL SOME good THING,HEHE
....ADVANTAGE AT any COMPETITIVE GAME....HEHE....
You do know that path tracing was enabled right? If it was disabled the 5090 is pushing 100+fps at 4k raster
Amd is doing that right now with a custom version of fsr 4 for handhelds. It's specifically to 3x+ the battery life and in that context I'm all for it.
@@xxelitewarriorxxso you're telling me a damn 5090 that costs more than an entire PC can't max out a game? That's amazing.
@pieterlindeque7798 5090 can max out every game if we talking raster and normal ray tracing on max but if you add path tracing then it cannot achieve 60fps natively without those features especially at 4k resolution
The era of unoptimized games that can muster 10fps is here, big L for the gamers
at this point let's just start reading books as entertainment again , since movies , shows , anime , games , social media and everything you can think of has become trash in the past 6 years
Optimizing games is a thing of the past. They're at the "optimizing gamers" stage now.
I don't care because i dont care about any of these games anyway. Time to grow up.
And watch, DF will champion this crap like it's a win for gamers everywhere.
Unoptimized AAA games. Not sure about Indie Games, i think Indie Games are okay.
so the real question will the 5070 even be worth buying if your not going to use dlss but what if your upgrading a 3070 to a 5070 just in raw power would it be worth the upgrade at 2k gaming?
Better 5070Ti for 16GB Vram, it really must have in such awful times of no optimization
One of the more subtle issues introduced by the focus on Frame Generation + Upscaling is that the downsides (input lag, artifacts) don't impact the entire GPU stack equally. Rather, these downsides are much more pronounced for entry level or mid-range GPUs than on the top-end cards. On a 4090/5090, the base framerate (pre-framegen) is high enough that input latency is at an acceptable level and introducing 5-10ms of additional input latency isn't the end of the world. That same tradeoff doesn't make nearly as much sense when the base framerate is 30fps. Visual artifacting tells a similar story. On a 4090/5090 where the base framerate is 80-144 FPS, individual generated frames won't stay on the screen for very long, so any artifact is extremely short-lived and likely not to be noticed. When you go further down the stack and the base framerate is 30-50 FPS, visual artifacts would be far more noticeable.
Even DLSS upscaling shares a similar problem. Relying on upscaling from 1440p --> 4k or 1080p --> 4k is a much better experience than upscaling from 720p to 1440p. It's easy to recommend DLSS Balanced for 4k users, but harder to recommend DLSS balanced for a 1080p monitor user.
These technologies tend to put their best foot forward on the highest-end hardware, but the experience of using these techniques on lower-end hardware is a lot more mixed.
Btw same with taa. The more fps and more resolution you have it directly correlates with all these technologies. So if you're poor wompwomp too bad you're not good enough for jensens jacket gang
Yeah it's really a paradox. All this tech could be beneficial if it could "fake" performance on cards that don't really have it but it doesn't really work there. It is usable really on high end stuff which would probably be powerful enough to give a good experience without all this tech in most games and it is basically just an extra creme on top. I really hope somebody will come up with a better budget offering, be it AMD or even Intel or something
Isn’t this true with all previous versions of DLSS as well though? I suppose it’s probably more pronounced with multi-frame gen since more of the frames aren’t really rendered…
But it sounds to me like NVIDIA has tried their best to allow you to choose exactly which settings work best to find the sweet spot for your hardware/workload. It’ll come down to individual preference whether you’d like to play with no artifacts at 20-30FPS or with a bit more latency or artifacts at 100+ fps.
"The more you buy the more you save"
80-144 fps in what? 1080p solitaire? Cyberthang it's only 20-27
Jensen is pretty good at marketing, saying my igpu "Brute force renders" games at 40fps sounds like heavy lifting.
Your igpu is such a brute 😅 NVIDIA gpus are civilized.
@@alandiegovillalobos man, imagine what my igpu could do if it had ""AI"".
My ps5 renders at 40 no problem
Its pc it always sucks@paincake2595
@ hahaha 😂 oh I can’t. Will have to ask my buddy with the flashing jacket. I’m sure it’ll perform at 5090 levels though.
Yeah it makes it sound like raw performance is some uncultured savage was of doing things.
15:50 or option 3: it's just in that preview version for comparison and wil be removed before releasing the game/update.
Vex is pointing out issues with DLSS 4 in the Digital Foundry video that Digital Foundry didn't point out themselves.
DF are Nvidia bootlickers
is still early to say if its good or not, like how dlss 1.0 or 2.0 where not great in many games, but dlss 3.7 is absolute great comparted to fsr and in some games even native if you play in 1440p or 4k resolution
That's because DF are industry shills
Honestly I've been kind of disappointed in Digital Foundry for a while. I don't think they're being paid for glowing reviews or anything. But I think their love of new technology bias them a lot.
@@aweirdwombat
They seem like tech shills to me.
Not shills to tech companies, but shills to whatever new tech comes out.
Overexcitement maybe.
Graphics haven't improved much since 2019, but we get now unoptimised games, RE2 remake(2019) and SWBF2(2017) looked great and didn't even have ray tracing, and could run in many gpus.
I prefer good stable graphics than ai generated ultra realistic graphics that look strange, but Nvidia wants games to be artificially more demanding and require exclusive features avaible only to their latest gpus to sell 2000$ gpus, and they even pay developers to force games to have their features.
Good times
Graphics have not improved much since 2010. I have been playing older games and been blown away how good some of them still look.
You have bad eyes, bro.
Graphics peaked in 2016, and have actually been degrading since 2019 since this AI trash really got going.
Great videos on the 50 series! Subbed and looking forward to future content
Nvidia is making a huge claim they are upscaling from 480p to 4k!
3klikphillip did that experiment
from personal upscaling experience and not in realtime, even heavy AI rendering with enough time to render, anything below 720p upscaled to 4k is useless. Will look 100% fake and smeary and somehow "painting" like.
Even 720p to 4K is already "hard", thats why i would upscale 720p content rather to 1440p (2x), for real time upscaling its much less resource eating regarding to CPU and "AI" stuff, for lets say video upscaling, you save A LOT OF SPACE by making it 1440p instead of 4K, it can look in 1440p even better than the (too extremely) upscaled 4K.
I am playing for example Cyberpunk, Stalker 2 and a few GPU hungry online survival games with DLSS.... i have "just" a RTX 2080 but i want to enjoy often 4k60 or 1440p120 on OLED TV. To make it worse, the fps are not allowed to drop below 40 fps or frame times above 12 ms, otherwise Gsync doesnt work on the TV (40-120 Hz Gsync range), so in some games which drop below 40 fps or have some framedrops/stutter due too high resolution, DLSS is a gamechanger to make the "old" 2080 still "nice" to use.
But if possible, i avoid ANY "fake" enhancements, including DLSS, even at the cost of reducing effects, texture details, rendering distance etc down to a certain point.
In my opinion, AI doenst have any reason to be involved in enhancing every aspect of graphics, fps etc. For me it looks as if DLSS, now this frame generation to 2-4x the fps, its just a BAD TRY to compensate for the resources hungry, mainly UE5 games trying to look hyperrealistic.
Good think im playing nowadays often older titles from my childhood, those classic games are so much more entertaining and better, despite much less capabilities of the PCs those days.
No. From 1080p to 4k. Then optical flow for the rest.
you know you can choose from what it upscale from it looks really good upscaling from 1440p to 4k
you are not supposed to upscale from anything lower then 1080p and that you will only need to do if you got a really bad gpu.
Imagine paying $2k for a gpu raw power which gives u less than 30fps in 4k is a crime
Would you buy an overpriced 4090 instead ?
Lol they say it's 2k but you goddamn well know that you won't get any for less than 2500
You can't possibly think it will provide 30fps...
RT is included in this figure and Its a tech that simply cant be scaled without fancy mockery. When its off its 70+ on last gen.
Dude, that would only happen with the most demanding games at highest ray tracing/path tracing settings and if you dont utilize DLSS or frame gen (like for example Indiana Jones and Great Circle wont run for more then 30fps if you have everything maxed out including path tracing but dont enable DLSS or frame generation)
If you run path tracing/ray tracing settings at low or medium, you get playable framerates even without DLSS or frame generation and if you disable raytracing entirely, then even more so you dont have to worry about framerate whatsover at highest settings possible.
I mean ye, you can argue with that kind of money for GPU, it should run a game with full ray tracing settings without DLSS/frame generation higher, but when Nvidia's competitors dont have GPUs that would be able to do that either (at all in fact), then I think lot of this whining is ridicilous. Its literally only in those highest possible ray/path tracing settings in most system heavy games you get such poor framerates if you dont enable DLSS or path tracing and from my personal experience using DLSS3 with 4070 Super I can say that as "average joe gamer" I didnt see or feel any drawbacks using them. Sure maybe if/when you know what to look for or are more sensitive to latency, you can notice the difference, but even then depending on the game you are playing, you can ask yourself does it really matter?
2:20 - i'd like to see someone take this challenge and crack it on older cards, and can officially call out Jensen's BS excuse for exclusive features on new gen cards.
Amd will do it. They have us frame gen foe non 40xx gpus. They will give us multi FG as well. Or buy 5usd prog LossLess scaling and get 4x fg. I used it on old games year ago (fg 3x) to get 25fps game to 75fps.. I knew nvidia will do this, year ago. And they did it.. im still faacinated by ther deception levels.
NVIDIA has evolved from a hardware company into a software company
its cuz they reached max level in hardware and trying to get AI leveld up
Yeah rtx 5060 8gb@@dante19890
@@dante19890 lmfao
@@dante19890 Fucking hell my man is clueless jeez
It's called Moore's Law bro. We've reached the physical limits of what hardware can do.
Playing with frame gen on my current 4080 feels like playing on console on a very bad TV with no game mode. It's doable but man, it's so unpleasant. I hate that this is the future that NVidia is pushing.
profound
And people are literally praising frame gen 🤣 number higher must mean better 🤣
i hate fake frames, I dunno it just feels wrong
This is why instead of hoping AMD competes to lower Nvidia prices, the masses need to support the competition to send a message. If AMD 9070 can hit 4080 levels on raster, I am switching. I hate when companies are entitled to our money due to blind loyalty! There is still hope as NOBODY knows the true performance of the 9070, only AMD and board partners
You're just picky frame gen is good even from 40 fps to 80~
Is it me or is Nvidias GPUs straight changing our games now..
If they're changing textures,faces,lighting they're changing the developers vision..
That's very different to just adding fake frames.
there is no such thing as a developers vision or talented developers anymore
Exactly. The games made by devs that do pay serious attention to every detail, texture, model and lighting. But that all gets ruined by NVIDIA using AI to literally recreate their whole game.
@@random_person618 what are those devs? I have not seen many of those in AAA
@@marcogenovesi8570 So you're saying that just because a developer doesn't do what I previously said, they don't have a game? They still made a whole game that people can play, albeit only the rich people, it's not like they used AI to generate the game. Even Minesweeper or Tetris was made by a developer, it's not like the game spawned in out of nowhere. Now NVIDIA is generating 200+ frames with AI which still ruins a lot of efforts that the developer made, which is unfair because the developer didn't use AI but NVIDIA is forcing people to use DLSS because of the 30 FPS you'd get otherwise. Even in the AI-enhanced face that was shown in this video, it is pretty clear that between the original face and the AI one, a lot of shadows and lighting effects were altered. That's my whole point.
Your argument makes no sense. The developers can choose if they want to implement these features. The bad part is if they're used as a crutch against poor optimization.
9:39 The average PC gamer is not the average person. They will notice all these flaws.
No they won't.
specially those paying 1K + into a card...
@deejnutz2068 I think they will especially if they switch from native resolution. Im not saying it's so bad that they won't use it. A lot of people use the current DLSS. All im saying is that PC gamers are not as oblivious as people make them out to be.
I agree with this because any gamer who’s even somewhat into competitive games is already trained to spot pixels that look different (because that means an enemy) so they’re gonna notice “wrong” pixels in AI generated frames
lol
The cheapest 4070Ti super is over 1000$ in canada. The 5070 is listed about 800$
4070 Ti Super has a lot more raw, actual rendering performance. But if you're fine with AI generated frames then...
@@patsk8872most people would still take the 5070
@@patsk8872 No?
@@patsk8872 Where are you getting this from. The 5070 seems like it will be very close in performance to a 4070 TI super. Maybe slightly weaker. Not "a lot more", as your stating.
We should change the testing to a pixel quality test where we analyze the screen pixels to kind of hold them more accountable for this
yeah, I'm most tired of them saying how DLSS looks better now when every time I turn it on I can see the motion blur and feel the input lag. It looks like shit, stop telling me it doesnt.
If the game already feels like shit DLSS isn’t going to help
COPE
Amd is for poor people
DLSS makes tons of shit feeling games way better already...🤡
actually i think this is a net positive for everyone, there's not that many demanding games unless you turn on ray tracing
@@googleplaynow9608DLSS helps but it becomes a problem if the true framerate is 20 fps and the rest are fake frames.
@@fs5866 Uhhh... Unless you have a 4070 class card or better, I think a lot of people would disagree with you.
It'll be really interesting to see hands-on with all of the frame-gen and the Reflex 2 turned on, to see if the work done to add frame-warping to reduce the latency "feel" reduces the lag feel.
Artifacting will be huge. U vet artifacts from FG and than fresh new artifacts from Reflex 2.0...
this is the time for AMD to shine by giving us actual performance and frames instead of fake frames.
I'm waiting for an intel B770
ye since when has AMD came and save the day
true because fsr4 upscaling looks much better now
@@peanut93able We don't know how good FSR4 will be yet.
Amd always behind in gpu
Hi Vex, It took me last year more or less a month to find accurate infos for a Nice upgrade of my system, I watched many of your videos and many other content creators and I came to the point than I had to jump off the green and blue train !!!
I went for full red team with 7900 XTX Tuf and a 7800x3d and I 've gotta say than my gaming experience went much better AND Adrenaline is working just great !
Thanks for all the infos you already gave us and the ones arriving now !
Fantastic content, accurate and useful ! Keep up the great job !!!
Faces at 2:06 - the AI face looks it has a plastic sheen and feels not right, but the standard one to me looks more actually human. How is that better?
luckily we've been trained a bit to spot AI faked images, it's all smoothing from now folks!
Well, not everyone is born like a globlin. You have genuinely natural beauties.
And are you saying I am plastic? lol I have a natural glowing skin, because I take care of my body and skin.
@HomelessShoe lol username does not check out.
@@HomelessShoe why are you getting flared up over a comment?
I think the standard one looked better because it looks more game-like, while the AI face just looks like a deepfake, you get the live action face and you try to plop it into a 3d rendered scene and it will look weird.
Keeps reminding me of fully 3D rendered movies and how acting will be a relic of the past, and all movies will look just like real life but rendered on Unreal Engine 9.
RTX 5070 with 12GB VRAM is a joke.
For real, im building my first pc right now with a 4070 Ti Super and that’s got 16 GB of VRAM
If the 5070 is better than the RX 7800 XT (at raw performance) I'm 100% getting the 5070
@@BloodLetterGyour raw performance isn't getting there once that GPU is deprived of VRAM
@@UltraVegito-GodKiller-1995 well, the Dlss 4 might make up for a Vram
@@UltraVegito-GodKiller-1995It's not a 4k card that's gonna eat up more than 12 gb of vram. I have a 4070 ti super and ive hardly ever utilized more than 12 gbs of vram, that too with ray tracing which honestly doesn't make a difference in some titles. It's just a bullshit argument for anyone using 1440p or 1080p cards. For 4k, yes 12 isn't enough.
17:03 Frame Warp is not actually that new. John Carmack was talking about what he called Time Warp that they used in VR in 2017.
If anyone is interested it is from the UMKC talk at around the 30 minute mark.
My issue is that it is still 1000$ for 16GB of vram...being stuck with a 3070 i cannot financially justify upgrading yet
technically it's $750 , 5070ti has 16gb
@@TheMissingDislikeButton will be 1000/1200 euros in europe, as what happened with the 4070ti
@@dragonmares59110 yes, it's a shame that after 4 years you can't get a card that's twice as fast for 400 bucks.
I switched from 3070 ti to 3080 ti 12gb and difference is huge.Sounds stupid but it is,those 4gb really makes the difference in games that are close to using 7gb or more.But the biggest,like huuuge performance boost I got switching from 10700k to 9800x3d.
For example in Witcher 3 just switching cards I got 4 fps,from 73 to 77 but less fps drops due to memory.After buying new CPU that jumped to 110 fps
Nvidia could easily give more, but they know they can milk people for AI by forcing high VRAM onto more expensive cards.
Here's what I don't see anyone mention. How does this affect performance in VR?
Where generating different content to each eye can be an issue
Yep .. AFAIK VR cant utilize the frame gen without messing up render
chameleon eyes vision
Subed, thanks buddy. Keep pushing it!
Love your videos Thank you for the Content
It does not add 200+ frames from 24 or whatever. It's first boosted with dlss to get a decent base frame rate of 60 then 60x4=240
Yea true, i dont understand how vex doesn't realize something so simple to see
Because there was also a example of a game running running at 124 fps with DLSS4 which means 30fps base… and regardless if it’s 60 fps base it sill adds latency if you actually watched the entire video. It’s simply a bad gimmick to get ppl to buy more.
@@JonBslime tbh I find frame gen even more pointless because of that.. If you can already get 60 fps in single player games, what's the point of using frame gen and doubling latency to get more? Low end cards do not have enough VRAM for it (FG eats it up like a mf), so that is out of the question.
The more beneficial use case would be for multiplayer games, but the latency penalty is too much for it to be viable. Maybe one day if they can get the latency penalty down to within a few ms.
@@JonBslime It's easy to do the calculation
30x4 = 120
60x4 = 240
What I don't understand is the thumbnail of the video, which says 214 frame fakes, when doing calculations it's easy to see what the base fps are with dlss performance etc, and then add the x4 of MFG, If I have to use dlss for better performance I use it, also have to say that I'm not a big fan of frame generation.
120 hertz already looks smooth as butter why would i go even higher and add more latency and artifacts?
The Censored Guide to Wealth is like a hidden treasure I can’t believe it’s not more popular
AI generated comment to match the fake frames, eh?
Massive way to save me money... I will boot game and not use DLSS if I see anything below 60 fps with it off, I will refund the game lmao. Most of the AAA stuff has been terrible as of late anyway.
thanks for making this man. its horrifying how many people are just lapping this shit up. its so disappointing to me.
I hope nobody buys them and I can get a few!! I see way more man childs hating on the 50 series in comments, 80% atleast, good stay on your past gen and shut up!!
In order to sell a product you need consumers.
Nvidia is pushing fake frames so hard because there are so many people out here willing to buy this bullcrap.
It fucking sucks.
Remember when TVs had this super smooth motion feature and how well those sold?
they still have them. Motion interpoaltion is very common in tvs. But they have really high input lag penalty and looks very bad with gaming cuz they dont have access to the game values.
Literally every modern smart tv sells with this lmao terrible example
I really hate those feature on tv. It turn 24fps cinematic movie into tv series with weird jelly like movement and noticable ghosting
tv still got those and it's good, the power demand is nowhere with games but with movies its great
@@ryr2277 agreed I hate it. But it's not affected TV sales at all as this comment suggests.
@15:12 Why both? Because its 4x the compute, which means its a free ad for Nvidia on the older gen cards.
You can see both DLSS versions, compare image quality, see how much better 4 is and also see how much worse your card performs with it vs a 50 series card and then get the incentive to upgrade.
Nvidia will never throw you a bone without gaining something or gaslighting you into upgrading.
Just use "lossless scaling" software for multi frame generation if you have an older card. Works fine for me.
It works fine, lossless scaling. but as shown it uses pc resources to generate those frames.
Performance/optimilisation wise MFG is better, because it runs in an optimized workflow, and can acces software, driver and hardware level stuff(its all nvidia).
it has more potential.
But, does it look good, does it perform good, and is it worth spending cash. Esp, because all the dlss /reflex optimisations also reach 4000 series, combined with lossless scaling it's up there too. it leaves a 20-30% improvement. for how much?
I dont think its a bad call to pick up a 4090 second hand now, while all 4090 sellers are rush selling those cards because 5070 is going to perform like a 4090.
It's so bad that I can't even use that shit for watching anime, it creates artifacts like crazy I would rather get 60 fps
@@ArdaU Same. Tried it on factorio, and the artifacts of the intermediate frames are way worse then the desire to have more fps. Gladly I found another way to run factory 120 fps native via a mod. I guess the tech only really works if the images are already extremely busy with millions of polygons and textures. For 2d games or anime, every glitch sticks out like a sore thumb.
@@Raynhardx FSR 3 Frame generation is great btw it really feels like you are actually getting high fps but when I move if I watch closer it feels a bit weird for some reason
sorry but its just not usable, i have characters vanish from turning the camera, very visible artifacts.
When you play VR games, you might feel discomfort or pain in your eyes. Why does this happen? It's often due to flickering lights. Sometimes you notice the flickering, and sometimes you don’t but brains still process it.
Obvious DLSS random lighting artifacts will strain eyes, even when simply watching monitor pixels.
DLSS, THE GUARANTEE YOU WILL BE EXPERIENCING YOUR GAME IN RHE FUCKING PAST
The slow release of DLSS features makes it harder for people to realise that those tiny artifacts aren't what games normally look like. People just assume the blur and jitter is normal.
Yep! I was called out for "Oh its just how the game is" yet no.. its not. And a lot of people like my self NOTICE these issues when using DLSS or FSR & FG.
Its NOT normal at all. The fact its being shrugged off as "Meh, its what ever" is not good.
*Happy new year 🎊 You work for 40yrs to have $1M in your retirement, meanwhile some people are putting just $10K into trading from just few months ago and now they are multimillionaires*
wow this awesome 👏 I'm 47 and have been looking for ways to be successful, please how??
It's Esther A Berg doing, she's changed my life.
I do know Ms Esther A Berg ., I also have even become successful....
After I raised up to 325k trading with her I bought a new House and a car here in the states 🇺🇸🇺🇸 also paid for my son's surgery (Oscar). Glory to God.shalom.
Good day all👍🏻 from Australia 🇦🇺. I have read a lot of posts that people are very happy with the financial guidance she is giving them ! What way can I get to her exactly ?
when the RTX 90-series comes out there won't be any pixels or game at all, your GPU will just have an AI that will convince you that you've just played the most awesome game ever.
You will just be livestreaming your videocard :)
Thats one way to clear my backlog XD
I think they want to limit reflex 2 to 5000 series first where it will be a "best case scenario" on 1 architecture. This will give the best impression and the best initial view in the eye of the press of the tech before ironing out any issues then moving it to older architectures where the tech might not be able to showcase the same results.
Meanwhile us 1080p literal GODS playing any game at 200 FPS with these barely sufficient 4k focused cards
You do have a point there. I like 1440p myself but I'll go to 1080p for framerate purposes as I did in Dead Space Remake. The secret is a high-end audio receiver with an upscaler and game mode with low latency. 1080p games actually look incredible on a 4k screen and you can run with REAL framerates with little to no input lag.
I like 4k sorry 😢
Staying at 1440p for the foreseeable future. My 4090 and 7800x3d are more than capable for at least this entire gen.
must suck to be able to see pixels on your screen
Not on the latest poorly optimized UE5 games at least.
Vote with your wallets.
Entirely, sticking with my 980ti for yet another gen. Eyeing up the Arcs.
4090 is doing just fine, not worth upgrading yet and it will be cheaper now. scalpers will now have to sell it at a loss🥳🥳🎉
I'm definitely getting a 5080
AMD IGPU for me, Nvidia is just gambling for frames.
I will when i buy a 5080. More raw performance better RT and FG than anything AMD has to offer
21:18 those are temporal anti aliasing artifacts. fsr and dlss rely on multiple frames to smooth things out.
What some don't seem to understand is that no matter how much Nvidia is gonna reduce the input latency, you will still have the base framerate one. So let's say a base framerate of 30 fps, but MFG gets it up to 120 fps, you will have 33.33ms and NOT 8.33ms you would get from native 120 fps.
Keep in mind that DLSS can easily double your base frame rate if you're GPU bound due to AI upscaling, which positively affects input lag. If your base frame rate without DLSS is 30 fps, then with DLSS 4 it may jump all the way to 240 fps with your base fps doubling. It's honestly crazy technology and doesn't look as bad as people make it out to be.
Only for companies to throw it all under a bus and lower your base frame rate to 15 fps. Say hello to good old TAA smearing, because you WILL experience it if you didn't before.
@@rikuleinonen When i use frame gen my input lag always increases.
I refuse to fall into this trend. Please AMD, focus on raster!!! Thank you Vex for calling out BS! Respect!!
@Vex it's probably an option in the game settings because the "feature" is going to be hardware-bound. You need the RTX 5000 to be able to use it. For folks without one, tough luck.
convolutional neural networks tend to be smaller in size and more computationally effective than transformer models, meaning that the new model will probably use more VRAM and cores
Yup, so the FPS uplift should be way lower.
i'm more interested in the upscaler upgrade.
maybe dlaa will be less blurry now.
Dlaa blurry ? Wtf are you talking with 3.7 update it is so sharp plus you can manually adjust sharpness using dlss tweaks
@@DragonOfTheMortalKombathow to know what sharpness is correct tho
@@DragonOfTheMortalKombat the shits blurry, this is hard cope icl
Yeah I am stoked to see if the lower res upscaling looks better. Performance at 4k is as far as I can go but I prefer balanced. Maybe there is some magic that makes performance look great so I can actually run pathtracing at 60fps 4k with framegen
dlaa is better than native what are you on 💀
Bro fr trust me guys, dlss has always increased fps by alot and seeing that GPU frame ratio to ai frame as 1:15 is crazy like that increases fps by a ton. No doubt why gensen said 5070=4090
Dlss 4 is gonna be so freaking good that I bet 5060ti will be the best budget GPU if it's price is around 300$
Edit: I also think 5090 users will switch to 8k for triple a games instead of 4k now, we might literally get 8k benchmarks soon
Just stop talking, man. You're endorsing the decline in graphics card development. Instead of focusing on true hardware raw power like previous generations, these cards rely entirely on gimmicky features. You're part of the problem because you fail to see how you're falling for NVIDIA's trick, paying for software-based features that have little to do with the real raw performance of a proper graphics card, like those from earlier generations. Paying for fake generated frames instead of true raw frames per second.. why are you lowering your needs and are willing to just accept anything for the sake of it?
Classic manufactured generational improvement
Why has blackwell then more AI TOPS than ADA? No improved tensor cores?
Could you please remake this video when the new model from DLSS4 is released on 40-series? Curious how transform is going to do against current CNN.
Im curious if you can actually turn off the so called "AI features" if not i'll stick with the GPU's that don't have any "AI features"
Unlike 'brute force' rasterisation, most of these 'features' have to be incorporated into the game to be utilised by the GPU. That's what sucks about this stuff. Getting a newer GPU doesn't necessarily mean the performance of older games will be improved - support has to be enabled by the game developer.
From not owning games to not owning frames
In the future years we wont even need a GPU... we will pay for " cloud " service that allow our GPUless pc to access AI and get 400fps ... I dont like it x)
Agreed. Look at this online only gaming bs we have already.
Where you been homie? They have that now with Xbox cloud and GeForce now.
@@johnc8327 yeah, it's similar, but I'm talking actually about paying for a non existed GPU.. Cloud gaming you still need the console right? ( I really just care about pc, never looked into it tbh)
Because more input lag is what everyone wants
Nah they have cloud streaming games for PC already. Geforce Now was actually pretty good, gave fast high end graphics with no noticable latency as long as I had a good internet connection. Problem was licensing. Geforce Now only lets you play games you own on Stean or Epic (or GOG) but only if the publishers lease the game on the platform. That means tons of games aren't on there.
Was fun while I couldn't get a graphics card during Covid, but I prefer being able to play on my own PC.
Future games will be entirely AI. Theyre using ai compute as a benchmark. Software is more important than hardware.
Fake FPS must be paid with fake money . 😀
My question is will this be some kind of gateway to poor gaming optimization. Seems like a lot of shortcuts that may affect the base game 🤷♂️
Also they are advertizing their card as x4 faster. But this only works in the games that support DLSS framegen. At the moment only game I encountered that support it is Stalker 2.
What about older games ? Or all other that do not support latest bells and whistles ?
Older games will easily run at high framerates even without upscaling and frame gen, why would you need it for old games?
@@Dempig I did not mean old games like 2004 games. But Games that came year or two ago, that do not have support for DLSS or Framegen, and never will. And by the way, not all games have DLSS
@@PixPunxel What games released int he past 2 years are you playing that do not have DLSS? Far more games have DLSS than FSR. Devs dont even bother adding FSR because of how bad it is. Upscaling is required for new games or they would be unplayable, 99% will support dlss.
@@Dempig No Rest for The Wicked did not have it, Wayfinder did not have it. Heck Starfield did not have it
This is worse than the old triple buffering input lag. It has to create a buffer between real frames to interpolate between with fake frames.
RIP game optimisation
PC gaming has never been optimized. Always relied on bruteforce
@@dante19890of course it was,rainbow six, valve games, etc
@enzoguimaraescost you cant make an argument for the 0.02 percent. Pc gaming has never relied on optimization as a whole. Cuz u can't optimize a game for every pc hardware configuration.
@@dante19890 half life 2 says otherwise.
@@dante19890 Tf does "brute force" mean? Every game used to be optimized like crazy to even get it to run. Quake, doom, crisis, etc. Pretty much every game from that era was running with 1/50 the processing power and look about as good as games from way later than them.
Any inferences of how these cards will work in the *_non-gaming_* world? You know, where actual *_work_* needs to be done? Like in wrangling big Photoshop files, video editing, CAD, Blender, etc?
"you will own no real frames and be happy"
3090 was around 2Ghz 4090 is around 3Ghz I feared the 5090 would not see the same 70% jump in performance because he wasn't going to repeat that clock increase and sadly I was correct
Can't gauge performance from clock frequency alone these days.
We want clear images with good native frame rates. The next wave of AAA games will potentially have neither. I guess we’ll see.
Optimization is a thing of the past for AAA. frame generation and DLSS will make Nvidia more money, save money for the studios, all while the gamer will be paying more for less
Everyone wins except gamers
prior to DLSS, I came across this Depth Aware Video Frame Interpolation (DAIN) alpha which amazed me at first. But upon close speculation similar artifacts were noticed.
We had a debate in a group if this could be implemented live in games where the next frame was a guess. 8 years later improving on this approach, AI is still being taught on how to guess better, instead of cards working out to push higher...
You really don't have a logical reason to why the reflex 2 technology comes to 50 series cards first? Really? It's money...
Frame generation tech has horrible latency at lower frame rate, its great to increase performance if it can achieve stable 60 ish fps to say 120 or 180 bit otherwise the input lag is terrible
With DLSS super sampling, you would need a super weak GPU to not get 60fps. Even a 5060 will likely be able to DLSS balanced + frame gen to put out a better experience than a 3090 using no DLSS at all. Brute force rendering will be a worse experience unless you have a 5090.
I was thinking.. if it went from like 120 to 135 or whatever... That means the base frame rate is 68 instead of the 60 fps of the old fg which is a few ms better latency. 🤷 Better is better I guess.
@@johnc8327no offense but everyone with a brain would know you'd get a better experience with a 3090 in any game in any settings. Not even talking about the vram amount of the 5060. Framegen is useless for most people
@@johnc8327 Frame time is still bad, even with reflex + boost
@@ArtorioVideojogos lol no it isn't. Latency is better and so is smoothness.
12:16 - The driver doesn't hook into the software. Instead the software talks to the driver and the driver can "mishear" the software.
16:37 - Frame warp only reduces camera rotation latency, nothing else. Also, afaik, it only works in first person games.
Not excited about the frame gen, and at the 9070 XT vs. 5070 price point, I think the former might make up for DLSS Super Res. with it's superior rasterization + inherent quality diff. between upscaled and native
The more you buy...the more frame you save thats right new quote
Didn’t they already say that dlss 4 will have fixed the artifacting issues at a very small increase to latency? Any chance the current issues in the video are going to be worked out prior to launch?
AI is going to add to latency. Next gen we will probably see something to tackle it
The AI will generate frames and play the game. You'll just watch.
Nvidia Reflex 2 says hello
Yeah Nvidia Reflex 3 will go to the future and grab the future frames for you and put them on screen so you can have -15ms latency, players on RTX6090 have an advantage on older cards and you will rank up in the future using AI TOPS
I completely agree, 1 of every 4 frames are real........ that's madness =(
When you think about it, it's like half of that 1 frame that is real given upscaling is happening too
@@cloudycolacorp Yeah you are right, even worse hahah
And that HALF real frame got upscaled textures and is trying to be sharper by Ai@@tech1m502
If only 1 of every 4 frames are real, then Nvidia better accept only 1 in every 4 of the $$$ that I pay for their GPUs are real. Give me my $500 5090 now, Jensen.
100%
9:43 I don't know about his siblings, but this this an issue that after being catched by an eye once won't go anywhere anymore.
literally nothing is real anymore... anything, anywhere
Thank the ghews
@@endureuntiltheend86 the greatest magic trick of all time. Fooled everybody by being the one to write the books, make the movies and pick our politics. Then pretend it’s all been us. Wild it worked without questions for decades. The internet is what will end up waking ppl up and saving us. They all want it regulated
lossless upscale has had 4x framegen already for a while.
Xd
With horrendous artifacting... Makes it completely unusable.
@@Velocifero and you think dlss 4 wont?
its on the same level as a 10 yr old TVs motion interpolation. pretty much unusable
@@LordVader1887You want us to believe a solo developer has made a frame generation algorithm as robust as a multi billion dollar company?
What is the functional purpose of framegen? Like i thought the whole reason ppl wanted high frames was because it made games run smoother. So frame gen gives you a higher framerate in exchange for smoothness?
are we gonna see video game characters with 3 rows of teeth and 8 fingers on each hand?
Oh well thanks for making Reflex 2 available on previous series.
Good to know that I can finally use advertised frame gen after 3 years.
Reflex 2 technology with in-painting missing spaces looks identical to z-buffer 3d issue reshade and vorpx use. For VR for each eye pixels on a rendered frame are moved left or right depending on their depth buffer info, which means close objects are moved a lot and it leaves sort of a glass/water halo artifact around objects. I wonder if nvidia tech could somehow be applied to fix that too
Crazy now, FPS is no longer the metric of performance. It is now latency. Because with frame gen we can have all the FPS we want but the response could be terrible.
i dont know man. Doubting AI seems like a really silly position to take.
Luckily you are just a random pc user. There are more technical people that can assure you that AI isnt the way with this.
@Phininx sure. But what will you do when the tech might prove you wrong? (Purely Hypothetical)
AI is fucking garbage
@@ratchetjoker1317 I think it IS the future. It's just not ready. Far more concerning is they are selling these cards based on their AI performance and not their raster.
I believe the main reason CCN is available in CP2077 is that it essentially serves as a testbed for new Nvidia technology, making it easier to showcase improvements.
2:25 You are sound like Trump here 😂
Better than sounding like kamala and her fake ass accents and hideous laugh
Exciting times ahead with Adaxum! Just joined the presale, can't wait to see where this project goes
I cant wait for the raster benchmarks of the 5090 vs the 4090
11:04 Im having a hard time figuring out if im dum or im missing something. Like how can DLSS 4 generate 240 fps if the base fps is 30 ? DLSS generates frames in between each other so generates 4 fps from 1 fps shouldn't it at best be like 120-150 fps ?
The base 30FPS are without any DLSS, no upscaling and no frame generation. The 240FPS are achieved by first upscaling with DLSS to 60FPS, then they push the three fake frames to hit 240FPS.
If it looks amazing who cares how the frames get made
People will 100% notice it. They may not know exactly why it is, but they will notice if they have eyes. Especially a problem for PC gamers and lower end people, as the closer you are to the screen and lower fps you are. monitors have lower res on avg compared to tvs will make it very obvious. Not even mentioning that none of this 50-60ms input lag is even remotely close to playable on a mouse. Even going from 100fps as my base it's very clear when I enabled framegen, the lag is unbearable.
Because it DOESN'T look amazing, it looks like shit and FEELS even worse to PLAY.
AI is dogshit.
@@K.S-w5s gotcha. I didnt plan on buying 1 anyway so no sweat off my back. Happy with my 7900