This technology would actually be really good if you could just use it with any game.
Or with any card. Or even have it train on your own games so it gets faster the more you play: the more you play, the more you shave...
Not sure about DLSS or XeSS, but FSR can be enabled in the driver to work with any game. It's just that performance and image quality are better in games that support it directly. Driver-based FSR requires an AMD GPU.
It was possible day one: Lossless Scaling, ShaderGlass, IntegerScaler, Magpie. You're welcome. I'm sure there are many more apps out there.
@@JasonB808 XeSS also works on everything
magpie be like prrrrr
I've seen a few cases where the in-game scaling is relatively good, but that especially bears out what you said: 4K output, where you're working with more pixels, works best. This kind of in-game scaling is very prevalent in console titles, and only now are we starting to see some console games use FSR (both 1.0 and 2.0), with mixed results.
This explained what all these terms mean. I always found them vague, but I have a better understanding now. Thank you!
Unreal Engine 5 Devs will slap this on and forget about optimizing other aspects
I've noticed just about every newer game offers upscaling now in lieu of lowering the native resolution. That's actually a smart new standard. I decided to test upscaling against native when I was heavily playing Diablo 4. On my 1080 Ti and 4K monitor, it was pretty easy to overload with native 4K ultra settings. Heck, it had trouble with 4K medium settings. The Diablo franchise has come a long way in graphics, so that's nice: it can be played on super high-end gear and look amazing, or it can be played on grandma's HP all-in-one and still be fun.

Anyway, I found that FSR 2.0 'Balanced' with all the graphics settings on ultra is the optimal setup. The native 4K textures don't look any better than the ultra textures upscaled, and the FPS nearly doubles, with far fewer hiccups during heavy gameplay. Thanks for compiling this info to help explain all these gimmicky upscaling technologies. If I had an RTX card I'd try DLSS, but AMD's FSR 2 has been my go-to algorithm for many hours of Diablo 4, and it's darn good, with low input latency even with Nvidia's latency-reduction boost shut off.
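For context, here's a rough sketch of what the FSR 2 presets render internally at 4K output. The per-axis scale factors below are the commonly published ones; treat them as assumptions, not values verified in Diablo 4 itself:

```python
# Hypothetical illustration: FSR 2 internal render resolutions at 4K.
# Per-axis factors are the commonly published ones (assumption).
presets = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}
out_w, out_h = 3840, 2160  # 4K output

for name, factor in presets.items():
    # FSR 2 divides each axis of the output resolution by the preset factor
    w, h = round(out_w / factor), round(out_h / factor)
    share = (w * h) / (out_w * out_h)
    print(f"{name:17s} renders at {w}x{h} (~{share:.0%} of the output pixels)")
```

So 'Balanced' at 4K renders roughly 2259x1271, about a third of the output pixels, which is why the FPS gain is so large.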
This guide is so good. Thank you.
Thank you for these videos.
Blocky video due to the compression algorithm, in a video addressing rendering image quality... could someone turn the quality of videos on YouTube ON, please...
That'll never happen unless they start charging people $30+/month. The bandwidth they'd need would go up astronomically.
But what if I have an Nvidia card, but not RTX, so no DLSS? Then what should I use: FSR, TSR, or XeSS?
I guess FSR
What game is that at 8:23? Looks fun.
Forspoken
7:05 this is what you're looking for
Great content, it helped me a lot. Great work, man, I really appreciate your channel and content. Love from India ❤
Dear Mr. PCWorld, you made a mistake in this. A 90% scale of 1080p, meaning 90% of the pixels, requires a resolution of 1821x1025. A resolution of 1728x972 has only 81% of the pixels. You can't just multiply each dimension by the percentage; you have to multiply the number of pixels by the percentage (or just use an online calculator to do the math for you).
You're right in terms of pixel count. At the same time, game menus tend to label the scaling options as X% (with the assumption that it applies to each axis), so this gets confusing quickly.
@@retrosean199 Well if that's true then I withdraw my objection.
Yeah, Keith was referring to what is visualized in the game UI.
-Adam
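For anyone who wants to check the arithmetic on both readings, a quick plain-Python sketch (nothing game-specific here):

```python
import math

w, h = 1920, 1080
total = w * h

# Reading 1 (what game menus usually mean): 90% on each axis -> 81% of the pixels
axis_w, axis_h = round(w * 0.9), round(h * 0.9)   # 1728 x 972
print(axis_w * axis_h / total)                     # 0.81

# Reading 2 (90% of the pixel count): scale each axis by sqrt(0.9)
s = math.sqrt(0.9)
pix_w, pix_h = round(w * s), round(h * s)          # 1821 x 1025
print(pix_w * pix_h / total)                       # ~0.90
```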
Does DLSS work better on a 2070 Super compared to the other techs? Does it work at all?
If you are playing at 1080p, have an RTX card, and want your game to look even better, you can use DLDSR in conjunction with DLSS. It basically takes a lower resolution, upscales it above 1080p, and then downscales it back to 1080p. You end up getting a more detailed and sharper image.
I've used it in Horizon Zero Dawn and it looks absolutely amazing IMO
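To make the combo concrete, here's a sketch of the resolution chain using the commonly cited factors (DLDSR's 2.25x pixel-count multiplier and DLSS Quality's 2/3 per-axis scale; treat both numbers as assumptions):

```python
import math

native = (1920, 1080)
dldsr_factor = 2.25        # DLDSR pixel-count multiplier (assumed)
dlss_quality_axis = 2 / 3  # DLSS Quality per-axis scale (assumed)

# DLDSR target: multiply the pixel count, i.e. scale each axis by sqrt(factor)
axis = math.sqrt(dldsr_factor)
target = (round(native[0] * axis), round(native[1] * axis))   # (2880, 1620)

# DLSS Quality's internal render resolution feeding that target
internal = (round(target[0] * dlss_quality_axis),
            round(target[1] * dlss_quality_axis))             # (1920, 1080)

print(f"render {internal} -> DLSS to {target} -> DLDSR downscale to {native}")
```

Neat side effect: with those factors the internal render comes out at native 1080p, so you pay roughly native rendering cost but get the extra detail from the upscale-then-downscale round trip.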
No way, everything is so blurry even compared to the original lower settings. Those two GPU companies are testing something new, but customers paid the cash.
I dislike tech videos that pretend not to be biased, when we're all biased.
I'd rather hear from someone who owns up to what he likes.
So I have an RTX 2050 playing Forza Horizon 5 at high settings in Full HD, getting around 80 FPS. Should I turn on DLSS Performance or Quality? I prefer a higher-quality image. Or maybe I should just use resolution scaling?
If you prefer quality, why did you even ask?
Games like Ghost of Tsushima, which has no RT but is well optimized on PC and consoles and has great support for all of XeSS, FSR, and DLSS, are an example that native res is better even when the game heavily supports these graphics utilities.
I disagree. DLSS is absolutely in a league of its own, and I've noticed that it produces better image quality than native TAA. I'm gaming at 1440p and I've compared DLSS to FSR multiple times: FSR is just worse than native resolution, while DLSS fixes the flickering and rough edges that native resolution comes with too. If you have enough performance headroom, DLAA is the GOAT.
All these upscaling methods create side effects. I would not use any of them; I play games native. To me it was very visible coming from consoles. The biggest crap, IMO, because you can clearly see the artifacts if you are used to a native, original look. Not for me. And now Sony starts with this crap PSSR, which makes developers skip optimization more and more. PC gaming's turn to upscaling was a bad choice.
@@mipha5262 I hope this is a joke.
@ilcarlino752 Not a joke. You get side effects from all these AI upscaling technologies. All these people without any sense of what a good picture is may be okay with overly sharp edges, fake contrast, fake colors, and shimmering artifacts (especially with stuff like frame generation on top). You can't beat native resolution, because that's how it should look. We get worse and worse native resolutions, which are then upscaled. We are in 2024; we should get native resolution and FPS. Look at Monster Hunter Wilds... there you see where this is heading: worse and worse performance even on very good hardware... and then they recommend (!) using upscaling to reach higher FPS. It is a joke. Good that DF shows the side effects more and more.
@mipha5262 Quite wrong, as DLSS is literally just aided by AI to be faster, not to make things up; frame gen, for example, uses our inputs and everything the devs trained it on to give us actual frames.
@mipha5262 1080p upscaled with DLSS will always look better than 1080p native, period.
Thank you for this explanation. I have a few older cards from both AMD and Nvidia that I am trying out (first time PC gaming) and wanted to know what those resolution-modifying methods do.
Seems like these upscaling methods can benefit older cards.
What about Lossless Scaling?
It's a spatial upscaler like FSR 1, but it looks better because it's updated and regularly tuned with AI, so it has better detail reconstruction. It still doesn't quite match FSR 2.0, though.
I'd recommend modding in temporal upscalers before trying out LS.
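For anyone unclear on the spatial vs. temporal distinction: a spatial upscaler computes every output pixel from the single current frame, with no motion vectors or frame history. A minimal bilinear sketch of that idea (illustrative only; Lossless Scaling's actual filters are fancier):

```python
import numpy as np

def bilinear_upscale(img, out_h, out_w):
    """Upscale one grayscale frame using only that frame (spatial)."""
    in_h, in_w = img.shape
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]   # vertical blend weights
    wx = (xs - x0)[None, :]   # horizontal blend weights
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

frame = np.random.rand(540, 960)                  # one low-res frame
print(bilinear_upscale(frame, 1080, 1920).shape)  # (1080, 1920)
```

Temporal upscalers like FSR 2 and DLSS additionally feed in previous frames plus motion vectors, which is where the extra detail reconstruction comes from.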
MAY DAY is back!
Native won't necessarily be better with TAA applied, or that other AA that softens the image.
I did this for the last 7 years. I still have a cheap 4K Acer monitor I bought at Best Buy on sale about 7 years ago. My old PC rig only had an RX 480 in it, so most of the time I would set my games to 1080p on the 4K monitor and not notice it in games. But live content, movies and things like that, would show just how bad the monitor is. Now I have a 6800 XT and it pushes 4K all day, no problem, but I am still stuck with the Acer monitor. I want HDR badly but don't want a 27" monitor or a 42" LG C2... it's so sad that there are so very few great HDR 32" monitors at or under 500 bucks, while you can get a 42" TV with good HDR600 for $400-$500. It makes zero sense!! And as soon as you try to find a 120Hz 32" OLED TV, you can't: they are always 60Hz at 32" for some reason. The moment you go one size up, they jump from 60Hz to 120Hz. SMFH... why??
And you need a solid understanding that users can't just pick which technique to use, since XeSS and DLSS can only be used on their respective hardware. So if you have an AMD card, FSR is the only one you can use; you have zero choice. With that out of the way...
XeSS is open source like FSR
It's just not true. XeSS also works with Nvidia and AMD cards; on those it just uses DP4a.
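For the curious: DP4a is a GPU instruction that computes a four-element 8-bit integer dot product accumulated into a 32-bit integer, which is why XeSS's fallback path runs on most modern cards. A plain-Python sketch of what the instruction computes (not how XeSS actually invokes it):

```python
def dp4a(a, b, acc):
    """Four signed 8-bit ints per operand; dot product added to an int32 accumulator."""
    assert len(a) == len(b) == 4
    return acc + sum(x * y for x, y in zip(a, b))

print(dp4a([1, -2, 3, 4], [5, 6, -7, 8], 10))  # 10 + (5 - 12 - 21 + 32) = 14
```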
And then there is Alan Wake 2, which does not even let you play at native resolution 🥲😄 Upscaling from 540p while owning an RTX 4080 is something abhorrent, in my opinion. Could you maybe make a video about what technology Remedy used, since the game is beyond insanely demanding? Upscaling from such low resolutions is a huge loss for AW2 in my opinion; maybe Remedy should focus on creating a better storyline rather than breathtaking graphics (saying that as a fan of Control)... but I digress. Thanks for the video.
He explained native res. I guess this video isn't for me....
4:21 Represented - not representeded.
Upscaling is good for APUs!!!!
You should probably add some clarification on which technologies can be used on which brands of cards, if this is geared toward "newbies".
Try Stalker 2 without any upscaler... a shame.
You do state that native is always better, but that is not true in some cases. DLSS Quality (2 or 3) can sometimes produce sharper images with less noise in the background.
Yes, with DLSS supplanting the game engine's temporal anti-aliasing with its own solution, distant objects sometimes come out clearer. It's pretty cool.
Don't kid yourself, upscaling isn't a feature. Graphics should be drawn at a higher resolution and then downscaled to the display, just like in video production, where you shoot/scan at 8K and output at 2K or 4K.
What a dumb take lmao
Sir, where should I send it to you?
DLSS always makes my game feel janky
What strikes me is how GPUs have stagnated and therefore need crutches. But they call it technology.
What are you even talking about rn?
It's RTX, bro... it's very taxing. Now that path tracing has been introduced, it's even more of a resource hog than ray tracing. Even a 4090 isn't enough.
It's getting harder and more expensive to shrink the die size. Once everyone's on 1-4 nanometer chips, there are going to be fewer and fewer gains. And Nvidia trying to leave the dGPU market for AI isn't a good sign either.
Super-resolution R+D+i has been around since the beginning of discrete GPUs, and in optics too... what's interesting is that with integer operations you can now run approximated floating-point algorithms much faster and with much less power. Silicon Graphics used this decades ago because there was no possibility of making better hardware... now it's a mix of marketing, real technology, overpriced access, not much adoption, and no free or driver-level way to implement it, even for developers... the same as ray tracing: real-time ray tracing is a good illumination technology, pursued for decades to show realism in simulation or scenery... important for gaming? A lot to discuss. The launched products have stagnated in performance, and not all are built using the same up-to-date technology, efficiency, or custom-optimized software, so... I agree that as a customer it's awful to choose, or to see how difficult it is to mix and match meaningful technologies at a reasonable price... but the technology, innovation, and possibilities are out there... why don't they come to market in a sensibly priced way, and in good time?
You're ignorant.
Just do it the old school way, let your monitor upscale. LOL
Representeded? That's not even a word, man. You mean 'represented'.
Where’s Gordon
can't see any difference with these techniques lmfao
If you turn on a really low quality/high performance preset you will really see it 🤣
DLSS actually looks better than native with anti-aliasing in most games
Me watching in 720p phone 😅
These upscaling techniques are a mess for development. PC gaming made it worse, to the point where developers just skip true optimization, and now system requirements are very high even for 1080p (or lower)... lol. Only to use this stuff, which brings in artifacts and other side effects. Especially if you're coming from consoles, where you usually get a high native resolution. Now Sony is starting with this crap PSSR, which you can't turn off. I hope it won't be the standard in the future. Games are looking worse and worse at native.
Skip to 6:56; the rest is bloat to drive watch time. Nvidia = DLSS, AMD = FSR, Intel = XeSS. But this is a basic guide and doesn't touch on the DLSS, FSR, and XeSS variants or driver-level options, so overall the video is borderline pointless.
To be honest, I didn't expect PCWorld to have such poor content. With a title like that, the video should be more than what we should or shouldn't use. There should have been tests conducted on actual games comparing each of these technologies, but there are none, which means all the information given in this video is based on estimates and logic, which is far from perfect. If you test these upscaling technologies and compare them, you would find that in some instances DLSS can even output better image quality than native resolution, but you assumed that native is always better, which is not the case. One more thing: using XeSS instead of FSR 2 isn't always the best thing to do, even on Arc GPUs. Real, actual tests would have given viewers genuinely useful information rather than just telling them what they already know...
Tried the same game, Starfield, at two settings on an old Nvidia GPU: high graphics with 'scaled-down' textures, and plain low graphics. DLSS, FSR or whatever it is, the 'scaled' setting looks way, way, way worse than the native low setting.
Meanwhile, in the Cyberpunk 4K benchmark with RT, FPS dropped from 80+ native to 20+. I'm not seeing any better 'steam dust effect' or better lighting, but I do notice the tree models are pretty bad. Ray tracing is a scam, along with 'scaled' graphics.
Those weren't all the scaling options. You neglected AMD VSR and Nvidia DLDSR.
The title was about "upscaling" technologies, not scaling technologies.
@@IslamGhonaym Yet native is somehow the "best" upscaling
Didn’t know Mr. Beast is also a tech guy 😂
Mr beast
Native is not always best.
99.9999% it's the best :P
-Adam
@@pcworld Not if the game uses crappy AA.
About this topic, wanna talk about AMD "partnerships"?
as long as you also want to talk about NVidia "partnerships"
@@RedLine0069 i'm not a big AMD fanboy, so no thanks
Partnerships? More like bribery.
@@RedLine0069 Ooooooo sure, why not. You know about the GeForce Partner Program?
@@riccardocattozzo2579 Of course you're not; you're an Nvidia fanboy.