We've gone from frame tearing to reality tearing
I agree, I really think that 1. DLSS 3.0 should’ve been called DLFG (Deep Learning Frame Generation) because calling it DLSS is kinda misleading, and 2. We should really be considering Asynchronous Reprojection over techniques like frame generation, as with the former you don’t need to fake any frames but the gameplay still manages to be way smoother. It’s kind of baffling to me that anything outside of VR hasn’t adopted it yet because it’s basically free responsiveness.
Yes, I notice reprojection in VR quite often and it's waaaaaay less intrusive and ugly than the horrible stutter I feel when frame generation is on.
@@Swallowfire There's a huge conceptual problem with RTX in VR as well. Most of NVIDIA's state-of-the-art path tracing demos rely on heavy use of biasing and denoising hacks to run at real-time or pseudo real-time framerates. They've just barely gotten it to where you don't notice too many artifacts from all this heavy filtering if both eyes are seeing the same image. But with stereoscopic vision, you notice it so much! I'm a CG dev and got so excited at techniques like ReSTIR and OptiX until I tested them in a VR context. It completely falls apart in VR, and if we can't use those techniques, we're still miles away from true real-time path tracing.
I haven't tested DLSS in VR but I imagine it's similar for VR because all these techniques rely on spatiotemporal filtering (trying to insert missing information by looking at data from frames nearby or looking at data from previous frames across time). These techniques all start to show their artifacts 10x more with stereoscopic vision.
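For readers wondering what "spatiotemporal filtering" actually does, here is a minimal, hypothetical sketch of the temporal half of it: blend the current noisy frame with a motion-reprojected history buffer. The blend factor and the offset-based reprojection are illustrative simplifications, not NVIDIA's actual denoiser.

```python
import numpy as np

def temporal_accumulate(noisy_frame, history, motion, alpha=0.1):
    """Blend the current noisy frame with a reprojected history buffer.

    noisy_frame, history: float arrays of shape (H, W, 3).
    motion: integer (dy, dx) offsets of shape (H, W, 2) pointing to where each
    pixel came from in the previous frame (a big simplification of real
    motion-vector reprojection).
    """
    h, w, _ = noisy_frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    prev_y = np.clip(ys + motion[..., 0], 0, h - 1)
    prev_x = np.clip(xs + motion[..., 1], 0, w - 1)
    reprojected = history[prev_y, prev_x]  # fetch last frame's result
    # Exponential moving average: a low alpha hides noise, but most of each
    # pixel now comes from older frames, which is where lag and ghosting
    # creep in whenever the reprojection is wrong (fast motion, disocclusion).
    return alpha * noisy_frame + (1.0 - alpha) * reprojected
```

With alpha around 0.1, roughly 90% of every pixel is recycled history, which is exactly the kind of reuse that becomes visible when the two eyes receive slightly different reconstructions.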
🤓☝️
Indeed, I just found out about this. I thought frame generation was based on the old frame, not making a new one. Why did they do this? It makes no sense to me.
Eh, except for that one frame on a camera cut, none of the artefacts are worse than YouTube compression. Almost definitely awful in person, not really noticeable on YouTube. And I went through it frame by frame, because I initially got it wrong and mistook other, more noticeable artefacts on her hair for DLSS artefacts.
It's been working really well for me in Cyberpunk. Switching out the .dll to 1.0.7 already shows pretty big progress. The biggest issues I have with it are artifacts in the menus and on some UI elements. In game I actually prefer its image quality to DLSS 2's.
The Cyberpunk update wasn't out while I was making the video. The continuous first person view really helps prevent the issues that annoy me. I'm sure I wouldn't mind it in God of War either with a single continuous shot and no cuts.
What card are you using?
@@hypetelevision I have a 4090. I think I agree with the above comment: the artifacts in first person shooters are minimized with DLSS 3, but then input latency becomes more noticeable. I used DLSS 3 in Hogwarts Legacy, and the character artifacting right in front of the screen was super annoying. The technology's pretty cool though, and I still find I'll use it in certain games.
Everybody knows that DLSS stands for Dirty Little Secrets Scam !!!!
Their fanboys don't, and defend it. I've seen people saying the 4060 beats the RX 7900 XTX lol
@@soulbreaker1467 Yes, the 4060 beats the 7900 XTX in terms of power consumption, features and future proofing because it has DLSS and FG, so you make no sense here, mr "lol"
well, dlss 3 is
@@soulbreaker1467 rtx 4060 beats the rx 7900 xtx
but only in blender
I noticed some random black points (artifacts) in the right-hand Witcher 3 video as well
Frame interpolation has always been a nightmare at high speeds, where objects move a lot from frame to frame. It has been ever since the mid-00s with the Twixtor plugin for After Effects, which could slow down your footage to simulate super slow motion. If the object moved too much from frame to frame, it would create a lot of artifacts.
Especially when introducing motion blur, high speeds, or very complex objects like fluids (waves crashing), it becomes nearly impossible to make it look good without any artifacts unless you already have a high frame rate. The lower the frame rate, the more artifacts, because objects move further from frame to frame. I would say you need at least 100 fps before turning frame gen on would even start to make sense. And by that point it almost doesn't matter any more.
I highly doubt that this feature will ever look as good as the native frame rate below 80fps. At least not with traditional interpolation. Maybe if you interpolated the raw data before rendering, instead of interpolating the finished rendered images, it could work without artifacts. But that would probably tank performance instead of boosting it.
I hope that I am wrong though and that we see some insane AI magic in the future.
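To make the "objects move too far between frames" point concrete, here is a toy sketch (not how Twixtor or DLSS 3 actually work) of the crudest possible interpolation, a plain blend of the two surrounding frames, applied to a single bright dot. With small motion the blend looks like a plausible in-between frame; with large motion you just get two half-bright copies, i.e. ghosting.

```python
import numpy as np

def make_frame(width, dot_x):
    """A 1D 'frame': one bright dot on a black background."""
    frame = np.zeros(width)
    frame[dot_x] = 1.0
    return frame

def blend_interpolate(frame_a, frame_b):
    """Naive interpolated frame: just average the two real frames."""
    return 0.5 * (frame_a + frame_b)

# Small motion (1 px per frame): the blend reads as a dot between the two positions.
print(blend_interpolate(make_frame(8, 3), make_frame(8, 4)))

# Large motion (5 px per frame): two faint ghost copies instead of one moving dot.
print(blend_interpolate(make_frame(8, 1), make_frame(8, 6)))
```

Motion-compensated interpolation (optical flow) does far better than a plain blend, but the underlying trade-off is the same: the larger the per-frame displacement relative to what the flow estimate can track, the worse the artifacts, which is why interpolating from an already-high frame rate is so much easier.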
2:35 Hello! I am not a native English speaker and I am very curious to know: you pronounce the words "far" and "simulator" in the British manner (British pronunciation), although the country on your channel is the USA. Why is this so?
People always look for something to complain about. I love it so far. Also, I'm not playing my games at super slowed-down speeds just so I can catch artifacts or issues. If you are constantly looking for errors, you will find them in everything; instead of taking the time to enjoy it, they use the time to find errors.
Did you notice it in the Spider-Man cutscene I showed?
I've played at 4K since 2014. Trust me, once you see and know these issues, you will see them. It's just an acquired skill.
amen!
0:23 If I may, the "magic" you speak of is just a form of FSR actually (lol): it is rendering the game at a lower resolution, then stretching it up ("upscaling") and then 'enhancing' it with sharpening, contrast adjustments, etc. etc. You end up with a better framerate but slightly less fidelity.
(Interestingly, it is similar to the "T-Buffer" gimmick 3dfx put out before NVIDIA acquired them; it was a 'blending of frames' to try to alleviate aliasing at the time.)
I don't see how some people can't just see it.... the ghosting, blurring, etc..
Maybe it has something to do with their monitors, the response time, or something like that. If a monitor has a really slow response time, like 10ms or something, they'll see 'ghosting' all the time anyway, and totally won't notice the 'ghosting' artifacts in the game when DLSS 3 is on lol
going AMD for now, yes... great vid examples
See, DLSS 3 frame generation will get better over time, and it needs to be tested with each game; all AI needs some work, testing, and fine tuning.
It may not look good at first with fast moving scenes but will get better and be very important in the future
Even with DLSS super resolution, I run at native and get an upscaled output. If I run at a low resolution, there are artifacts and prediction errors.
I just spent hours trying to fix an issue that seems to be totally related to DLSS, unless my 3070 is dying. I normally don't trust these features, so I didn't have much info about them. The Nvidia app's optimization for NFS Unbound decided to enable it and put almost everything on ultra. Yesterday it seemed to be working fine, getting over 100fps, but sometimes I was having major freezes, and occasionally the game crashed with an error related to DirectX telling me to check for new drivers. I decided to reinstall the drivers and things started getting worse, 20 fps or less at times, then back up to 80 or 90, until I decided to simply disable it. It seems the 30 series doesn't support this feature. If so, why does the stupid optimization enable it? Go and eat s..., Nvidia.
It's getting better. Frame generation is a first-gen technology, so expect it to get better and better over time. Currently it's at a state that is very usable in a lot of games.
Right for sure. Tons of artifacts when jumping between the logs.
this ^ I don't see how some people can't just see it.... the ghosting, blurring, etc.
maybe it has something to do with their monitors, the response time, or something like that - if a monitor has a really slow response time like 8ms or something, they'll see 'ghosting' anyway, and totally won't notice the 'ghosting' artifacts in the game lol
@@Munenushi Because it isn't there. You're imagining it. Go through the video frame by frame. Any issues DLSS 3 causes are entirely hidden by YouTube compression, except for on camera cuts and for like 2 frames when her boot is just entering the frame.
I'm not disputing that it's obvious in person, but you cannot tell on YouTube.
Is it causing motion sickness?
Maybe if devs even bothered to optimize their games we wouldn't have to use these horrible technologies like FSR and DLSS. Way too reliant on them.
Ya. If every game ran like Doom Eternal we wouldn't be in this mess.
I got it right. So obvious FG was the one on the right. I could literally feel the FG compared to native.
The blogger doesn't understand that these artefacts can be fixed, and frame generation can be improved. Ten months later, in December 2023, I can't see any problems in many DLSS 3 FG games.
honestly, it looks like fantastic technology.
DLSS looks bad, especially when you watch the videos slowed down to 25%. This CLEARLY shows you 2 important things:
1) DLSS still has tons of blurriness and temporal smearing / ghosting! You can even notice it at normal playback speeds, but slowing it down makes it even more obvious.
2) Frame Generation 'augments' your frames by adding GARBAGE frames in between QUALITY frames for 'fake' smoothness. Essentially, it just adds NOISE at a fast pace.
So much for the groupthink about 'superior image quality'. DLSS + frame gen looks like blurry noise. But hey, people gotta cope with wasting all that $$$ on nVidia hype, somehow. 🤣
I'm in the market for a new build, and I appreciate the information that you presented here. Thanks for backing up your argument with evidence too!
lol, still rocking a 1080.
Love the 1080. Had a founders edition in my PC for like 4 years.
To be honest, these ghosting artifacts are much better than intermittent stutters and delays; anyone who plays games like Battlefield or COD would not mind these. They want smooth, flawless gaming. So both DLSS and FSR are fine to me.
DLSS frame generation looks like stutters to me. DLSS and FSR upscaling don't cause that. Leave them on but frame generation off.
I would buy three steam decks if I could link them all up and have one machine with three times the performance of a single steam deck. SLI anyone?
Hahaha and it would still be cheaper than a 4090
if we can only see it in FS when you slow down things drastically, then the issue doesn't exist.
I perceive it as a stutter, like shader compilation. It's impossible to show that in a 60fps video from a 100+fps source file, and even if I could, I don't know how many other people would notice it if they're not the one pressing the buttons, so I used individual corrupt frames to demonstrate what's actually happening that makes me think the game is stuttering.
@@Swallowfire ok cheers
You can actually see it in game, in his example video here. It's like old-school 'ghosting' issues... some perceive it as stuttering, some as blurring, etc. It is totally noticeable while playing.
DLSS is like that fable "The Emperor's New Clothes". It's pretending but it isn't; it cannot look as good as native by its very implementation; it is not rendering at full native resolution; they can say supersampling all they want, but it is actually rendering with less fidelity than native; hence the blurry awfulness in games.
I am really impressed with FG on my 4090 FE, and DLSS is great. Sure, it can introduce artifacts in certain situations, but for new tech it's impressive. As far as pricing, yeah, it's getting ridiculous, but AMD is actually worse value at the minute, as it doesn't have the performance or the features, and the coolers are garbage in comparison.
Oh really? I didn't know the coolers on the FE models weren't up to scratch.
@@Swallowfire AMD reference cards sell in low numbers. They are perfectly fine, just not the quietest. Way better than anything previously.
You might have enough VRAM spending an insane 1600 dollars on a graphics card, but the 4060, 4060 Ti, 4070, and 4070 Ti already don't have enough VRAM, and frame gen requires a lot of VRAM. NVIDIA cards age like milk. Also no, AMD has equal or better performance with substantially more VRAM (except the horrid 7600, and except at the very highest tier, where their RDNA 3 architecture didn't meet performance targets). The only difference is tech features. Prices will only ever go up. Have fun spending 2500 dollars on a 5090, but hey, it will likely have 32GB of VRAM with a 512-bit bus, so it must be worth it.
My dude what was that first game you were playing?
The side scroller? That's a game called Fist Forged in Shadow Torch. Metroidvania with a rabbit and a big metal fist. It's great.
for laptop gamers like me, we have it even worse, our 4070 mobile has only 8 gigs of memory and almost 20-30% less performance.
Maybe DLSS needs its own processor chip on the graphics card.
Oh my god, it really "sucks" because when I'm playing a game I slow the game down to 18% speed and see a shadow, holyyyyy, experience ruined... lmao. DLSS 3 is the best thing ever. I have an RTX 4060 Ti that gives the same performance as a 3090 Ti with DLSS 3, not exaggerating at all, go check it out yourself. The difference is so minimal that it's negligible, and there's no need to say it sucks; pretty sure DLSS 4 or something will provide better AI. It's still in its early stages and looks amazing.
I remember having a GT 1030 and shitty graphics compared to this. Now when I play games I don't look at slowed-down footage to point out how horrible it is, LOL!
I don't exactly agree with the bit about DLSS 3.0 only working with the RTX 40 series being a bad thing. To be fair, it's not a bad or a good thing; it's merely because the RTX 40 series contains specialized optical flow accelerators which aren't present in any lower-series GPUs. These accelerators are the reason why DLSS 3 only works on the RTX 40 series and not on the 30 series or below. If you want frame generation for a lower series, you can always use AMD's FSR, which is really good in my opinion, considering that it can run on any GPU released in the last decade at the cost of some CPU usage.
I personally started playing at native 1440P at 125FPS. DLSS will give you a better picture, but it can also make it worse and add ghosting.
DLSS frame generation is a total abomination.
I find DLSS is at its best when it's making up for bad anti-aliasing. Forza Horizon 5 has horrible AA and DLSS smooths it out perfectly.
OK, but at least they bring something new; it's not perfect, but it's a big leap.
Creating frames that the game does not make increases the input lag even more than DLSS 2.0, and even more so on lower-end CPUs.
It also lowers the visual fidelity of the game.
But yeah, you have more FPS appearing on the screen...
Most people won't notice the difference, but personally I prefer less input lag and more visual fidelity over more FPS.
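A rough back-of-the-envelope on that extra input lag, under the simplifying assumption that interpolation-style frame generation must hold each real frame back until the next one exists before it can build the in-between frame (the 3 ms generation cost is an illustrative guess, not a measured number):

```python
def added_latency_ms(native_fps, generation_cost_ms=3.0):
    """Crude estimate of extra presentation delay from interpolating between
    two real frames: waiting for the next real frame costs about one native
    frame time, plus the cost of generating the in-between frame."""
    native_frame_time_ms = 1000.0 / native_fps
    return native_frame_time_ms + generation_cost_ms

for fps in (30, 45, 60, 100):
    print(f"{fps} fps base -> roughly +{added_latency_ms(fps):.0f} ms")
# prints roughly +36, +25, +20 and +13 ms respectively
```

Which lines up with the usual advice that frame generation feels fine from a high base frame rate and awful from a low one, before even accounting for whatever Reflex claws back.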
"Magic" read: "mass amounts of money and electricity"
In games such as Hogwarts Legacy where the optimization is shit, FG is actually extremely useful in straight up doubling my fps
The problem with this is that some game companies are gonna be like: hey, let's make more money by not paying our devs to optimize our game, we'll just slap some DLSS on it to get a higher frame rate instead.
This doesn't apply to every game though.
the true issue with dlss 3 frame generation is that, when you absolutely need more frames, you will get more visual artifacts and feel less responsiveness
The extra input delay drove me nuts in Cyberpunk Overdrive with frame gen on. Couldn't stand it with a controller or a mouse.
@@Swallowfire Well, you need a base framerate of at least 50-60 for the latency to not make the game unplayable. RT Overdrive would make that impossible, as even a 4090 can't run it properly. It is a great technology though, but as I am on an AMD GPU, I am waiting for Fluid Motion Frames.
I thought I was watching digital foundry for a second…. lol right on brother…
Thank you for informing me that you can't use vsync with DLSS. That is very important information. I do have one question, though. Can you cap your FPS still?
It depends on the game. I was able to cap it in Witcher, but not Spider-Man.
@@Swallowfire doesn't that just cause screen tearing?
In Spider-Man and Need for Speed, it does. Even on a freesync display. Very ugly and annoying.
I'm kinda late but thank you for making this video. I find it pretty weird when people (and Nvidia) are trying to argue that DLSS3 makes weaker RTX 4000 cards a better choice than more powerful Radeon or RTX 3000 cards because frame generation... As if this thing worked perfectly. I think many people don't understand that the extra DLSS3 frames are not even generated by the game engine so you can't do anything in these generated frames. That's why the game doesn't feel more responsive even though you see more frames.
Frame gen works extremely well in some games, Alan Wake 2 being a good example. Same for Cyberpunk 2077.
As long as your game isn't extremely fast paced, if you hit 45fps without FG, FG will feel good.
If you haven't bought a card yet basically any flavor of 3080 used is what you want. It does all the important tricks and raytracing is perfectly doable at 1440p max settings. Avoid mined cards by buying an LHR card. 4080 is a great card being much more efficient than 3080 for what it gives you but that pricing...
As a dev, I really dislike it just conceptually even though a lot of users seem to have taken a liking to it. What I worry about regardless of how much people like it is that I think this tech will very likely slow down progress in the real-time computer graphics industry in actually being able to rasterize (or even raytrace in the future) at full resolution. It alleviates pressure from us developers as well as hardware manufacturers to *actually* build software and hardware fast enough to rasterize or raytrace 1SPP (or higher with true AA) at real-time framerates on high-resolution displays.
It also defeats the functional purpose of playing at a higher screen resolution. It might not entirely defeat the aesthetic purpose if people don't mind the artifacts from spatiotemporal sampling, but say someone wants to play a game at higher resolution because they want to be able to spot enemies a kilometer away to snipe. DLSS would actually not help here at all and generally make things worse, since a raytracer rendering at 1/4th to 1/9th screen resolution is typically going to be unable to raytrace such distant objects that start to become smaller than a pixel in size (there aren't enough rays being cast at 1RPP so the fewer rays per pixel will mostly miss those distant objects). A rasterizer has a similar issue because most of those distant triangles will want to rasterize to the same pixel with fewer pixels and won't be drawn or will be overwritten with the depth test. Super sampling can't insert things the rasterizer or raytracer never rendered in the first place.
So if someone has the option between playing a game with DLSS at 2160p vs. playing without at 1280p at roughly the same FPS, they will actually see more information clearly, like objects at a distance, fine textural details, strands of hair, things instantly appearing in-frustum, at 1280p better than 2160p with DLSS since again the rasterizer/raytracer has more pixels to work with. So depending on our tastes, maybe 2160p with DLSS might look better to some people than 1280p without, but 1280p without still offers much more visual information. If the main point of increased screen resolution is not just to make things look pretty but to see more data, then DLSS doesn't help at all and will likely slow progress in this area. If it's just to make things look pretty to people at the cost of visual information, then depending on the person, it may or may not succeed for them depending on their personal idea of what pretty means.
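To put a rough number on the "spot an enemy a kilometer away" case, here is a quick, hypothetical pixel-footprint calculation; the target size, FOV, and resolutions are made-up but typical, and the point is only how fast a distant target drops below one pixel as the internal render resolution falls:

```python
import math

def pixels_covered(target_size_m, distance_m, vertical_fov_deg, vertical_res):
    """How many pixel rows a distant target spans on screen."""
    target_angle_deg = math.degrees(math.atan2(target_size_m, distance_m))
    degrees_per_pixel = vertical_fov_deg / vertical_res
    return target_angle_deg / degrees_per_pixel

# A ~0.5 m wide target at 1 km, 60-degree vertical FOV (illustrative numbers):
for res in (2160, 1440, 1080, 720):
    print(f"{res}p internal: target spans ~{pixels_covered(0.5, 1000, 60, res):.2f} px")
# ~1.0 px at 2160p, ~0.7 px at 1440p, ~0.5 px at 1080p, ~0.3 px at 720p
```

At native 4K the target is about a pixel tall, so one primary ray per pixel has a reasonable chance of hitting it; at the 1080p or 720p internal resolutions the performance-oriented DLSS modes render at, it is half a pixel or less, and no upscaler can reconstruct geometry that no ray ever hit.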
As for what looks pretty, personally I prefer fewer visual artifacts at the cost of resolution. Realize not everyone is the same but I seem to be like you. I come from a visual arts background (VA/CompSci dual major) and I might be a bit visually obsessed. There's lots of visual things most people don't notice that I tend to notice, so I generally prefer DLSS off and just dial down the screen resolution to get the frame rates.
I totally see your point, other comments are also saying it's absurd that PC players are content using upscalers like a console. I wouldn't go that hard. My main reason for using DLSS is that the anti aliasing it provides on Quality mode (66.6%) is often better than the AA built into the game. Forza Horizon 5's AA is very ugly and DLSS cleans it up nicely. Then again, I've always had top of the line hardware. If you're running a 2060 then DLSS might be absolutely necessary to get 60fps.
@@Swallowfire Makes sense! Are you comparing DLSS to TAA or MSAA or both if I may ask? MSAA should hypothetically yield the best quality (or at least the closest to ground truth), since it's actually rendering at the sub-pixel level and akin to rendering at 4+ times the frame buffer resolution, but that's very expensive. TAA actually seems worse than FXAA IMO since the temporal sampling across time can lead to bigger artifacts smeared across frames.
DLSS at least seems superior to me for antialiasing to TAA. I'm really not a big fan of trying to sample across frames or across pixels and look for data to reuse that matches since it biases the results, but at least DLSS does it in a really smart way to try to (not always succeeding) reduce the artifacts as much as possible. I'm hoping in the not-too-distant future that we can get rid of spatial and temporal samplers and just take multiple samples per pixel as with the case of MSAA.
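For reference, a minimal sketch of the brute-force version of that idea, ordered-grid supersampling: shade an N×N grid of sub-pixel positions and box-filter them down to one pixel. Real MSAA is cheaper because it only supersamples coverage/depth at geometry edges and shades once per pixel, but the principle of "multiple real samples per pixel instead of borrowing from neighbours or history" is the same; the `shade` callback here is just a placeholder.

```python
import numpy as np

def supersample(shade, width, height, samples_per_axis=2):
    """Ordered-grid SSAA: evaluate shade(u, v) on an N x N sub-pixel grid and
    average the results down to one value per pixel. shade takes normalized
    screen coordinates in [0, 1) and returns a brightness."""
    n = samples_per_axis
    image = np.zeros((height, width))
    for y in range(height):
        for x in range(width):
            acc = 0.0
            for sy in range(n):
                for sx in range(n):
                    u = (x + (sx + 0.5) / n) / width
                    v = (y + (sy + 0.5) / n) / height
                    acc += shade(u, v)
            image[y, x] = acc / (n * n)  # box filter over the sub-samples
    return image

# A hard diagonal edge, which aliases badly at one sample per pixel:
edge = lambda u, v: 1.0 if u > v else 0.0
print(supersample(edge, 8, 8, samples_per_axis=4))
```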
Usually I'm comparing DLSS to TAA. Unfortunately a lot of games force TAA on when an upscaler is off. Hogwarts and Forza both do this. MSAA is definitely best but like you said, crazy on the GPU.
DLSS is the best but it does break down. Play HiFi Rush if you want to see DLSS absolutely fall to bits lol
That's exactly what I fear as well: games that can't be played without DLSS 3 even on 70-80 class models. I tested out Remnant 2 yesterday, and on an RTX 3070 with settings on high at 1440p it wouldn't hit 60 fps if I turned DLSS 2 off, and that game is NOT some next-gen graphics beast; I don't think it even has ray tracing and such.
Now imagine future games like that: the game without DLSS 3 runs at like 40 fps, and to get it to 70 you need DLSS 3, but since it has fewer and fewer real frames to work with, the whole image will just be terrible.
Doesn't DLSS 4K quality mode render at around 1440p internally and balanced mode at around 1250p? Meaning it still has as much or more pixel information to work with than native 1280p. I think you were referring to DLSS 4K performance mode, which is around 1080p internal resolution. I think in many cases even DLSS 4K performance mode still looks a bit better than native 1440p, let alone 1280p.
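For anyone who wants the arithmetic: using the per-axis scale factors commonly documented for the DLSS 2 modes, the internal resolutions at a 3840×2160 output work out roughly as below (megapixel counts included so they can be compared against a native ~1280p frame at about 2.9 MP).

```python
# Internal render resolution for DLSS 2 modes at a 3840x2160 output, using the
# commonly documented per-axis scale factors (approximate, per mode).
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 1 / 2, "Ultra Performance": 1 / 3}

for name, scale in modes.items():
    w, h = round(3840 * scale), round(2160 * scale)
    print(f"{name:>17}: {w}x{h} (~{w * h / 1e6:.1f} MP)")
# Quality ~2560x1440 (3.7 MP), Balanced ~2227x1253 (2.8 MP),
# Performance 1920x1080 (2.1 MP), Ultra Performance 1280x720 (0.9 MP)
```

So Quality mode feeds the reconstruction more raw pixels than a native 1280p frame, Balanced is about on par, and only Performance and below drop underneath it, which is roughly where the trade-off described above starts to bite.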
This is all good and well, but the only other company that sells GPUs good enough to play the latest games on is AMD, and their solutions are even worse. In fact, they are so bad that in order to possibly gain some market share they have been paying game companies not to support DLSS.
I agree. The whole thing with Starfield and FSR is super shady. FSR itself is fine. I did a direct comparison a while ago and DLSS always took the performance crown, but FSR and XeSS looked better in some very specific circumstances.
Your eyes look sexy when upset :) I love my 4090 and flight sim, is all I gotta say. And no, I don't slow anything down to observe something to complain about; I enjoy the flying, and since getting a 4090 I'm getting better first-try landing scores too, at 120 fps and up with all settings ultra, terrain detail all the way up to 400, all options completely maxed out, never below 100, and I've never noticed jitter or stutter.
Why are you not recommending XeSS? For 4 GB GPU (GTX 1650 Super) users like me, it is the only "DLSS" available. Intel is a new GPU manufacturer, but it is rising faster than any newcomer, if I am not wrong. I believe the Arc B380 will be the most-used GPU in the Steam survey in the future.
When I made this video, I didn't own an Arc. I have since made a video comparing DLSS, FSR, and XESS and I was very impressed by it. Check it out!
quit complaining you have a 4080 dude😂 you get high frame rates anyways lol
thanks for the video anyways!
HAHA I love the card! I just don't love Nvidia's shady marketing.
@@Swallowfire agreed, i’m still stuck on a 1650 super so i like to try to keep up
Hey that's valid. My media PC had a 1650 until January this year. Very solid card considering what they cost before the GPU shortage haha
@@Swallowfire That's when I built my system; should have waited.
Maybe Intel's graphics cards? I do not like AMD. I know, nobody cares.
I was thinking about buying an Arc now that the price has dropped.
not kinda sucks
it sucks
Or just buy a used 3090 and fix the thermal pads
Friend of mine did that and loves it. Even though it CPU limits everything cos he's only got a 3700X lol
the amount of people who got a 40 series coping in these comments is insane, dlss 3 is just a scam to sell overpriced gpus
I have a 4090, but I almost never use DLSS, Frame Generation, Ray Tracing, or any RTX technologies.
I don't even notice those 0.1 milliseconds glitches, people "hate" on things too easily.
On a 144hz display, I see them as stutters. Feels like playing a game with the dreaded shader compilation stutter.
"I don't have this problem, so no one else does."
Some people have more sensitive eyes than others.
Pixel peeping
It's pretty noticeable; you don't need to pixel peep if you are used to 120+ Hz.
Beautiful video
underrated video
YouTube doesn't show it, but this is my most disliked video ever, 19% dislikes lmao
Lots of salty 40 series owners.
@@Swallowfire Some people can't handle the truth. Also, this only being available on the most recent high-end GPUs enables game developers to poorly optimize their games.
Keep up the good work man. Subscribed!
I can't believe that DLSS 3.0 is so bad that it is actually a turn off. I am good with my 1080 Ti. Nice video bud.
1080ti is a beast. One of the best cards they ever made.
My 1080Ti was a stopgap from 1070 to 3080. Just not enough horsepower to do justice to some newer games like Metro and others. Glad I skipped RTX2000 because the RT performance sucked on those cards. But yeah, 1080ti didn't do the job for me. Even if some games are just terribly optimized throwing more power at them is still the solution. Also, raytracing when done right is KICKASS.
Ah yes Nvidia making a new technology that requires new hardware that supports it sooooo evil uhuh
It'd be better if older tech could use it, that way we create less e-waste. It's why I support S.L.I/crossfire for DX11, & mGPU for DX12. We need more mGPU support on more games.
@@kevinerbs2778 Crossfire likely won't ever really work well. Also, did you not see what I said about it needing hardware that actually supports the new stuff? Older stuff simply can't take advantage of it.
AMD made their tech :
- support old cards
- support any games
- support competitor (ngreedia) cards
up to you tho
@@Munenushi Nvidia has the same thing; it's called NIS, and it was on par with FSR 1, but it hasn't been worked on since FSR 2.