The editing in this video is so subtly incredible. The bent pins on the CPU and never ending GPU killed me. Well done Philip. Really a masterclass in sneaking jokes and fun details into the video.
Thank you for secretly releasing probably the most easily understandable guide to AA in graphics and how to parse them for actual tangible results and not data on a spreadsheet.
I never get a chance to post a comment so early. So I'm going to use this moment to say I really appreciate all your work Phil. I don't even play CS:GO anymore, but I'll still watch your videos for their informative and well presented design 😌
I second this. I follow for Phil, almost anything he makes. His voice is kinda comforting. Not to mention his Helluvalot and sleek yermom jokes :) Cheers Philip! Wish you all the best!
Having read about AA over the last 20 years, this is the best overview video of the current tech. The only thing missing is talking about the effect on moving objects (image stability)
More people need to see that bottleneck graph section. My god it's irritating how many people treat bottlenecking like it's literally Satan himself, when the reality is your system is pretty much bottlenecking most of the time, and it just depends on what you're doing/playing.
Every chain is as strong as it's weakest link - if that weakness is 4x the strength you will ever need, it's still very much a weakness, but it's down to personality how much that bothers you 👌
Most of the time the system is not bottlenecked but all resources are underutilized. The graph shown is not a very good example of bottlenecking, because there is literally no perceivable difference between let's say steady 144fps and 500fps (for many the lower bound is even 60fps). The system is doing pointless extra work just because it can. When people talk about bottlenecking, it refers to a perceivable effect caused by the disparity in performance levels between different components.
@@romaliop strongly disagree. Bottlenecking is completely separate from what you personally deem to be perceiveable. That's a ridiculous statement and has never been true.
easy to understand and factually incorrect on several counts. you cant do MSAA with 2 samples per pixel. its only 4, 16, 64, etc. youtube looks better in 2160p than on 1080p on a 1080p screen because the bandwidth youtube uses for 1080p is too low to generate decent image quality on that resolution. streaming in 2160p and then downscaling gives better image quality because the bitrate for 2160p streams is higher, not because downsampling magically makes the quality better. this is made by someone who barely understand how to use a computer, for people who barely understand how to use a computer.
@@mopilok okay but then why make this whole video instead of simply saying that 4x MSAA gives better image quality than no anti aliasing, which is common knowledge
OMG! you just changed my life! So, I just recently bought the bendable $2,000 Corsair OLED gaming monitor and was considering sending it back because the text in windows was very pixelated. It's amazing in games and for HDR video, but in windows the text just doesn't look good. After watching this video I enabled VSR and then changed my resolution from 1440p to 2160p and the text looks so much better! This is amazing. Thank you so much for making these types of videos.
Yep it's cool that you can get nicer looking yt videos on a 1080p screen by selecting 1440p. I already knew that but great that Phil noted it in the beginning of the video! Ps why the fuck have you gotten a 3090 but not a better monitor for it... 1080p 144hz ain't bad but paired with a 3090...
I knew this trick from a while. I've been playing an handful of games at 1.75x resolution from my 1440p screen and the difference is astonishing. Goodbye TAA, DLSS + DSR is awesome and doesn't blur anything.
@@jesse291 Good question, real-time live-rendered ray tracing in a beta animation program is the answer (also because it was the only 30 series card being sold locally) you can pull about 10fps at 480p which is adequate, it takes a second when you stop moving or make a change for the scene to get a clear image. The monitor can wait till I can save up again since I already have one.
Been running above native res for a long time. I do wish LCDs could act like CRTs where display resolution didn't matter, though. Fancy upscaling wasn't ever necessary on CRTs because most resolutions looked equally as "good" in terms of how the display handled it.
once you have a 8k screen the pixel canvas will be so dense that you will be effectively be able to use any resolution with very little loss in quality. won't be perfect as CRTs are, but good enough that you won't notice the bluriness.
BOAH ... this talk about CRT is wrong on so many levels! Please work on your attention span and watch more than the first 3 minutes of the video. 1. NONE of the issues super-resolution/downsampling/supersampling are fixing can be addressed by the CRT! 2. If your CRT is blurring the picture it simply is either a very crappy model or run at the ragged edge of its pixel clock ... if not even overclocked (been there, done that, killed a 17" CRT doing so ... refresh rate was more important than sharpness for me back in the day ... everything below 100Hz while gaming gave me dizziness ... everything below 85Hz gave me downright vomit reflexes ... like a bad VR setup) Back in the day at the end of the CRT era there were very good 19" CRT monitors from Samsung (and Iiyama and probably others which were even more expensive ... outside my budget at the time) which were laser sharp at 1280x1024 at 85Hz. Running the same monitor at 1280x1024 at 100Hz by contrast produced a blurry mess which couldn't even hold its geometry over the whole screen and had to be degaussed every couple of hours running that edgy. I ran that CRT monitor at 1024x768 at 100Hz and used the freed GPU power for 2x or 4xSuperSampling AA (which in effect IS downsampling ... done since 1999 ... MSAA wasn't even a thing back then ) ... no dizziness and WAY better image quality than 1280x1024 ... On the CRT there are the same jaggered edges, shimmering/interrupted small geometries and blurred out alpha-details on textures far away or on steep viewing agles! If you want the effect of a CRT which was overclocked until it almost shits it self, use FXAA ...
also the fact that 4K is a higher bitrate than 1080p on youtube, so that's probably the main factor it in looking much better than youtube 1080p on a 1080p display.
One thing to note about the "sweet spot" is GPU latency. When approaching 100% GPU usage, the GPU gets a bit clumsy and input latency goes up. So even though the sweet spot has the same FPS, it's actually a less responsive experience that can have a negative impact on your skills in a competitive game.
Yes, i experienced the same. Not too good for competitive shooters, but fine for pretty much anything else. I’m actually using it for pretty much any other game my system can handle at 4K using a 1440p screen. For me, this was most visible playing Mafia 3. The game looks super 💩 at 1080p and 1440p but stunningly good at 4K even on a 1440p screen. It’s worth trying on many engines if the system can handle. Mostly it’s also worth lowering other graphical settings as the better edges due to Downsampling are actually worth more than higher texture quality e.g.. and you can almost always lower or even get rid of any AA setting.
Cs go do not have a sweet spot on higher end pc's right now. I dont have fps cap and i cant get more than 50% gpu and 20% cpu usage. Source engine just cant utilize a full potential of modern hardware
I clicked on the 1080p quality option and it turned your video into a slide show. Such an amazing contrast! I suddenly had time to really appreciate every single one of frames, it really improved my enjoyment of this content :D
@@3kliksphilip Incredibly, I'm also having this issue, but I think it's an issue with the Revision mod rather than the engine. The same frame drops occur in the final area of Area 51, when you're looking at Bob Page from the opposite side of where you entered the room from, on a balcony. It's only in that specific area, looking in that specific direction where I get nearly 10 FPS, which makes me think there's some specific game object that's being rendered from that point which is causing the frame drops, and is probably true in the MJ12 ocean lab level as well.
@@3kliksphilip sounds like a shader is doing heavy per pixel calculations over the surface (water shader maybe?) and as the game is old, it isn't using modern optimizations to render fast enough, thus the performance hit is exponentially higher when you bump the resolution (and thus the amount of pixels to render)
@@3kliksphilip Well, I'd also say that old game engines use old graphics API such as DirectX 8 or even 7, and older OpenGL versions. Not counting that most of the old games takes advantage to only one core of a CPU (CPU frequency matters more than cores and threads quantity), reason why people want source ports of old games, so that they can take advantage of new graphics technologies and run those games better. Plus, probably loads of other code stuff that have not been made in the past because they were just not meant to be used or thought that they would've made a difference in the future. Not to mention how CS:GO's performance would be if it would be using DirectX 11 API instead of 9. Then there are certain games and engines that are mostly CPU dependant (CS:GO is a good example) unless you set it at a resolution that's sufficiently high to force it use your GPU power, which is pretty much explained in the graph you showed in this video.
One of my favourite things in Titanfall 2 was the addition of dynamic temporal super sampling based on gpu load and frametimes and I still wonder why more games dont add such an amazing feature
@@shakal_ it was the dynamic resolution setting games already have had but it just downsampled when you had frame rate overhead, literally the normal process but the slider goes past 100%
I often return to this video, and I don't really play cs:go; this video explain AA and downscaling so well that anything else sounds confusing and I need THIS video to remind me how they work like
The amount of work you put in your videos is as amazing as always. Glad to hear the old MVP anthem for the sponsor though, can't wait for the next case opening millionaire episode and valve adding your music kit.
Man, this reminds me of the PS3/360 era of gaming where PC hardware was so advanced that you could run modern games with supersampling/downsampling. I played the original dishonored at basically 4k (Using Sparse Grid SuperSampling) and it looked absolutely incredible. Honestly it probably looked better on that 1200p panel than it does playing at native 4k on my TV lol.
I wish I would've had this information and my current pc 10 years ago, better late than never! Stopped playing CS a long time ago, but I still watch your content for entertainment purpose, your videos are so well thought, you never fail to disappoint.
probably buried, but just wanna say I appreciate this video even though I dont play csgo. I have been using your method to play competitive rocket league downscaled from 4k to 1080p for almost a year now. As rocket league is light to run it is very easy to still push 300 fps with this method. The game looks so much better and I feel I have significant better understanding of downfield opponent's orientations and movements. your the goat ty!
I really appreciate these long, well explained, well edited, and well written videos. This one in particular I think is such a quality video. I never really knew what bottlenecking was too, so your explanation really helped me finally understand it. Thanks for the great lunchtime video!
11:08 a few people have done some fairly scientific latency testing in a number of games and generally found the best performance when GPU load is kept to around 80% or below; above 80%, latency tends to go up faster than FPS goes down. ensuring that the GPU does not spend a lot of time at or above 80% will also greatly improve noise and thermals.
Nvidia Reflex and AMD Anti-Lag aims to reduce/remove the increase in latency when you're closing in on max GPU usage. Have used and tested NV Reflex and can say it works in the supported games. Don't know much about how well AMD's solution works (who also popularised it first).
no such thing. there's a ton of variables. the specific gpu, cpu and rest of pc used, the specific game (and version, software) used. Case used, fans, room temperature. etc even then, what do you mean with "best performance"? yeah, no such thing
They meant best performance as in highest FPS/lowest latency as there are certain games where the latency increases when you're GPU bound. Maybe their choice of wording might have not been the correct thing but the overall context makes sense.
The thing is that what you seem to be looking for is simply texture multisampling. Supersampling is pretty much considered a sin for real-time renderers. Instead, anisotropic filtering does exactly what you are looking for and really really well. But devs usually implement the alpha part the cheap way: using thresholding no need to mess with blending fragments in the correct order. This obviously results in aliasing near these texture holes. Now by ray-tracing the primitives containing transparent parts on top of the opaque (rasterized) ones, we obtain the same level of quality of supersampling (texture-wise) for much, much cheaper. Ray-tracing is quite a fancy & easy solution to alpha blending. There is some rewrite potential of these parts of legacy renderers, but gamers need to actively ask for it. The superior stage is shader multisampling altogether which, well, is pretty much indistinguishable from supersampling. There is some work done on locally increasing the sample count but at the end of the day, this is really similar to running the same shader code multiple times per pixel.
I don't do a lot of gaming but I make digital drawings; when I make an illustration that is meant to be in 1080p, I still make the canvas in 8K and then export it in 1080p because I thought it looked nicer. Glad to see I am not crazy
19:20 at this point, I'll adding a bit of informations here: The tree becomes more solid at longer distance because that's how the shader in game works with mipmap. Lower mipmap (lower alpha mask for the leaves is the same way as lowering resolution of the texture map) in result the alpha mask will become more solid and can't see through. This works on every games, even now we always have to use mipmap to save resource from afar. This plus the LoD of the mesh and the resolution give this kind of blocky mess. Good video btw
So much research and effort in one video. It's great to see people being excited about such niche things. My favorite video of yours so far man, keep up the good work!
Every time I watch one of your videos, I just think of the old days on CS:CZ and CS:S. 10 odd years now. Great to see you're still around and hey! 1m+ subs now! Congratulations!
CSAA was actually a more optimized format of MSAA where it stored a reduced amount of color samples, compared to coverage (coverage being which color sample is affecting the pixel how much), so you could have much smoother edges, and even though it could only have like 4 color samples max on 16x CSAA for example (I dont remember the exact number unfortunately), it yielded performance comparable to MSAA 8x. Unfortunately support was dropped when the Pascal architecture on NVIDIA came around (GTX 1080 Ti included). I remember playing around with CSAA in half life 2 then being confused at why the option was gone when I got my 1080 Ti :(
It doesn't play nice with the architecture. You *can* still enable it through Nvidia Inspector, but it'll crash without a compatible GPU. It also doesn't offer tangible benefits over what we see as 8x MSAA on a current gen GPU which is actually not the same algorithm as what 8x MSAA used to be (and this goes for all GPU vendors).
WOW! How did I not know this? The image quality looked much better. Like running a game at native. Thank you! YT vids look better now. I suppose I should have known though. It's the same principle as recording and mixing audio at a higher bit depth and rate only to down sample the master. I also want to add that one reason for going with lower quality graphics in FPSs is to reduce the amount of distractions.
This was interesting and educational. That said, it does seem only really relevant/useful for people still on 1080p monitors, or maybe 1440p. I have a 4K monitor and can't see myself ever using this since downscaling from 8K just seems impractical for current GPU's even if you wanted to - 4K is already hard enough. Might be relevant in the future though.
even for you the Deep Learning DSR (DLDSR) of Nvidia could help since it can use your 4k or slightly below 4k resolution and pump it up to 6k to downscale from there. At quite little performance impact. :)
Mr kliks man, at this point u make a lot of us realize what we can do with our PCs, but I think you’re at the point where you should develop a software or partner with a software that tells us what quality settings we should play each game with. You can start with easy computing settings but when u can define more important qualities you can add those qualities to the software. I can help to a certain extent, but most of ur fans trust you way more than most creators, and I think you have a future in this technology
I make use of the downsampling effect for JPEG images too. I realized that when targeting the same image file size it is better to a) use a higher image resolution with lower quality than a b) smaller resolution with higher quality, because the scaling algorithms when presenting the image on a viewport smaller then the image, usually yields better results for B. Only recently I discovered that A can have better results when using a sharpening filter prior saving.
I am using a 3200G Vega 8 machine and a 1366x768 monitor output but I render the desktop at 1600x900 resolution with AMD's VSR (Virtual Super Resolution). It enables me to render the UI of my decade old games at a more comfortable resolution than my monitor's native resolution. I can also push it to 1440p but it is heavier to render on my vega 8 and makes text less readable on my monitor.
i have the same rig with same monitor and ive always struggled with sharp edges since anti-aliasing is so demanding i will try the VSR tho i hope it won't lag too much
I have also noticed that DLDSR also helps get rid of shimmering effects as well. I have been sort of using it as an alternative to TAA in some games where it is super blurry and has ghosting.
ive been playing dead space 2 on pc again and the AA method used back in 2011 is terrible and forcing it in the control panel just hits the frames way too hard, I ended up just enabling super resolution and im rendering the game at 4k and downscaling to 1080p, it doesn't look as crisp as forcing 8x MSAA but it looks much better than native and still holds 165fps.
@@myztklk3v I'm on the opposite side of that. It's been about a decade since I played with a friend (I don't remember if we did 1 together, or if it's just single player until 2) so I'm holding off until I see the remake. Just like Bloodborne's Remake (or could be Bloodborne 2, but considering it 's being made by Bluepoint and most likely coming to PC I think it's remake) I don't want to complete the game right before the new one comes out so I don't immediately know what to do and have a bit of time to remember/wander around.
I had an entire blog dedicated to high quality screenshots from games I've taken and they were all done with downsampling :) There used to be a great pc screenshot thread on neogaf with tons of amazing quality screenshots doing the same. Some people even mod old ass games to force support for like 8k resolutions on games from the 90s and get beautiful screenshots that nobody else can get unless they do they same. I actually love it so much. Very highly suggest taking a lil sightseeing tour through Karnaca in Dishonored 2 while downsampling to the maximum of your computers abilities.
The thing that got me to appreciate just how good downscaling can be is vr. I upgraded the hell out of my pc so it could handle vr, I went overboard and can now play vr games at 1.9x the internal resolution of the headset. If you've got the extra frames above your refresh rate to spare then definitely try squeezing some more quality out
Very good video. I've always known that watching a TH-cam video, for instance, in 4k on a 1080p screen will look much better than watching it at 1080p, but I hadn't realised quite how much of a difference it could make until I watched this video. The same goes for running CS:GO at a higher resolution than your native screen too, although with that being said I play on 1280x1024 stretched lol.
What splits everyone's views on whether or not upscaling or downscaling matters is due to how people care about their gameplay experience. for example, if someone cares more about the story, they wouldnt care as much about how fast the screen goes if they care about the gameplay, they would want a good image either way, everyone has their own way of wanting to experience a game. No need to argue with someone that they should experience what you experience
I think the future is in combining: downscaling in geometry and textures, upscaling for raytracing, ssao and so on. Also 4k and 5k displays will solve this problem, running in native resolution.
Valve could have made the tree leaves look correct with anti-aliasing - they have a variable in materials named $allowalphatocoverage that they don't seem to ever use, but it makes $alphatest materials able to be anti-aliased.
testing my settings was hell of a journey as my game crashed every single time I changed the resolution, aswell as sometimes i changes MSAA settings. I don't have a potato pc tho
I finally get how this works. It's not anti-aliasing so much as it's saying "see that palm tree? It's only rendering at 320x240", but your display is capable of showing more pixels of detail at that location, so downsampling allows you to get more information out of a lower resolution base. Running a game at 1080p isn't actually 1080p for every object on screen. But running at 4K downsampled means all the smaller elements on screen that lost detail in the renderer get that detail back.
I'm not sure why but this video is endlessly rewatchable and I love it. Would be very interested in seeing you SS more games to reveal "hidden details".
Wow, this explains why I always had the impression that watching videos in 4k DOES look better on my 1440p monitor than watching them in 1440p. I always thought this has to be a placebo effect, though, so it's nice to know it's real.
A good 1440P video will be created from the downscaled 4K version though, and hopefully with a downscaling algorithm that is at least as high quality as what a realtime downscaler would use. If you are talking about TH-cam videos though...maybe TH-cam doesn't use a very high quality downscaler. They have to process massive amounts of videos, so who knows.
@@keithwilliams2353 the video is not actually being downsampled when you play it above your monitor's resolution. it just squashes the video into whatever window size it's given, and then it's usually not filtered at all unless the video driver has something that's specifically designed for that. if you play a video above 1080p on a 720p display, thin lines with harsh contrast are usually going to be massively aliased. Especially noticeable when somebody records a CRT monitor in a video. In most cases it'll look great and perfectly crisp, but then it's given a black line that's 1-3 pixels thin and it suddenly looks like you're playing shadow of chernobyl with AA turned off.
@@Shpoovy I'm pretty sure my hardware accelerated video playback is using something more complicated than a point filter to downscale a video that is higher res than my monitor. My GPU offers realtime color correction and denoising, like they wouldn't be bothered to at least implement a billinear filter?
@@Shpoovy Sure, but I'm pretty sure almost all modern PC discrete and integrated graphic solutions have support for hardware accelerated output scaling. Maybe you won't get it on a chromebook or and old tablet, or a device that is misconfigured?
Tried to explain this to my friend and he called me a looney conspiracy theorist. He was an IT specialist with experience in game development, goes to show a degree doesn't give you intelligence. (CPU/GPU limited part)
An "IT specialist with experience in game development" should definitely understand what a bottleneck is, maybe you two had some misunderstanding going on..
GREAT VIDEO-AA & MSAA & FSAA makes me remember in the beginning when enabling it any game was struggling to run......forward to 2022 you feel nothing if you enable it.....same story repeats now with DLSS & Raytracing.....in 20 years from now it will be just a feature by default :)
Meh, as no matter how hard the original image is to render, the aa isn't more, so with modern games that litteraly can't run on hardware from that time ( even forgetting driver support) it's quite painless especially at lower resolutions
I imagine what DLDSR is doing is rendering at the resolution they claim it's rendering at, upscaling using their AI magic to the resolution they say it "looks like", and then downsampling back to your monitor's resolution.
I loved this video. Fwiw the big problem with aliasing for me is when the camera moves and the aliasing artifacts make it look like something is moving when it's not, which is really bad in competitive games. Or worse, leaves in the distance moving causing a flashing artifact.
Phenomenal video as always, Philip! Just wanted to chime in regarding the clarity of alpha textures because I literally just faced this issue today while learning Unreal Engine 4 - game engines heavily rely on mip-maps (a chain of progressively lower-resolution copies of a texture) that they switch to as the texture gets further away. Their primary purpose is to avoid shimmering on distant textures that can't be stably resolved at your current resolution, meaning at higher resolutions you use higher-res mips. While generally imperceptible by design, it's impossible to miss on alpha textures as once-clear leaves morph into a blobby mess. This can be mitigated by adjusting the LOD bias setting for your 3D application via Nvidia Control Panel or AMD's equivalent, forcing it to use higher res mips at the cost of potentially increased VRAM usage. Definitely something to keep in mind when employing upscaling and especially downsampling in the future!
Okay, here's a quick history of AntiAliasing. In the beginning, there was AA. This was produced by rendering to a buffer that was several (2x or 4x) times greater in size then the native display resolution, then downscaling it to native. It was horribly inefficient, and took twice (or four times) as much work as rendering natively, but it looked fantastic. Then came "Multisampling AA". This worked by rendering at native resolution, then mixing in small parts of surrounding pixels into each pixel to smooth jaggies. It worked, and was highly efficient as it was only rendering at native resolution, but it made everything blurry, and couldn't add back in pixels that were lost when they were rendered. As companies played with it, different sampling patterns came about (including Quincunx - 5 pixels in a diamond pattern, the most simple sampling you could get), with different numbers of pixels averaged. Original AA was still around at this time, and was renamed "Super-Sampling AA". Then Matrox came out with a fantastic system called "Edge AA". This rendered at native for the whole scene, then used the depth buffer to detect edges of objects and re-render them with SSAA (up to 16x!). Because it was only working on the edges, it wasn't as intensive as full screen SSAA, and it didn't make everything blurry like MSAA. It didn't work on objects with alpha channels, like chainlink fences and leaves for example, but where it did work, the effect was astounding. All the 3D companies copied this and put it into their MSAA engines for improved speed and visual fidelity (leaving flat textures largely unblurred). Next, NVidia worked out that if you ran an edge detect photoshop filter followed by a blur filter on those edges, you could get cheap AA using just texture shaders. This was quickly adopted as it made things look slightly better for "free", and was called FXAA (And/or TXAA when they started using previous frames to help with the edge detection, T standing for "Temporal"). This was also adopted in games with their own implementations and in AMD's drivers too. All these systems still used native resolution, and so couldn't reconstruct data that was missing between the pixels, like original SSAA did. NVidia introduced DSR, which was basically a way of dynamically rendering at higher-than-native resolution for quality (or lower-than-native for performance, a trick borrowed from consoles). This allowed Super-ish sampling (as it wasn't a whole multiple of the native resolution like SSAA) again at the cost of having to render higher resolution screens. Now NVidia has taken their DLSS system, which uses AI/Machine Learning to try and fill in data where there wasn't any, to create a version of Super-Sampling. In the same way it (for performance) renders at a lower resolution then upscales with AI, this new system renders at (or above) native, upscales it with AI, then downsamples with smoothing like SSAA used to to construct an anti-aliased image, and called it DLRSS. If the AI upscaler recognizes objects like leaves and upscales them accurately, this will work well. Unfortunately, it seems that it isn't, so it's not much better than MSAA. In short, for best visual quality use SSAA or DSR (which will incur a heavy performance hit), followed by DLRSS if your system supports it and can't do SSAA or DSR, followed by MSAA, followed by FXAA if you have no other option.
2 questions, 1. how do I find the framerate that my cpu and gpu can run seperatly. 2.Should downscaling be used with SSAA/DSR to provide better visual quality/detail or should it be used by itself.
DSR is not working correctly without smoothing on uneven resolutions. 33% smoothing therefore should be a good compromise for all resolutions. DLDSR on the other hand works correctly with all levels of smoothing. Smoothing in DLDSR is actually deep learning sharpening, the lower the slider the more aggressive the sharpening. Therefore, I would argue, 66% is about the optimum with DLDSR. You want the neural network to add some details but not to get into ,"painterly" territory.
you are missing Supersample, it will apply that 8x MSAA to textures with alpha too, it can be force in the nvidia control panel, some games will be compatible like Fallout 3 (and in general older APIs than DX10)
This documentary deserve to be a huge part of screen resolution on official wikipedia page. People need to know that this is the actualy power combined cpu+gpu and not the "value" that some rgb-led lights add on the pc as a device.
"Better" is a very ominous statement. MSAA is exponentially faster than downscaling, not to mention the computational pressure to render sub-pixel geometries that just look nicer without being crucial to the game itself. This statement also advocates for upscaling+AA when gaming on a 8K monitor.
Few things, MSAA can affect foliage if it the alpha cutouts are forwarded to its coverage and also theres a reason textures also look better on higher res and that is because the Mipmaps are biased against the higher res.
Finally someone who actually understands bottlenecks. I've seen so many people play competitive games with the lowest settings at the lowest possible resolution, with the mindset that worse graphics = higher framerate = lower latency. But as you've very clearly described here, there will be a cap for how high your framerate can actually go. Plus, some settings like anti-aliasing can actually improve visual fidelity and improve your gameplay as a result. Even if you're ultra-competitive, you should be playing a game as old and easy to run as CS:GO at at least 1080p with some level of AA. Otherwise you're just wasting your PC's hardware for no real benefit.
This was the best downsampling (or downscaling) video I've seen yet. I especially liked the Nice Tree 1 bit; you really showed that DLDSR is NOT getting more detail than the resolution it's based off. I didn't know if DLDSR got more detail and I didn't think of testing it like this. Super cool! What do you think of downsampling (eg. 1.78x DLDSR) then downscaling (eg. 80% resolution scale) to get better image quality and not as significant of a performance impact as 1.78x DLDSR? I tried this exact thing in Hunt Showdown but found that the image was actually MORE JAGGY with the aforementioned scaling soup than native 1080p. I wonder why this is so 🤔... I'd expect the opposite as the net resolution is still higher than 1080p.
My guess( I never played it nor am I some sort of expert) is that possibly the downs downscaling wasn't perfectly even or something with the engine itself.
I long stopped playing CSGO but still watch your videos from time to time. Playing other games, some older with weird artifacts when antialias is turned on, the discovery of downscaling is huge for me, especially if those games are stuck at 60 or 144 fps. Thank you for this video, you improved my gaming experience.
I get called a freak of nature by my friends for preferring playing FPS games with low resolutions and very bad aliasing. The reason is that I always find it extremely easy to spot moving targets or distant enemies when visually they're so obtrusive. I honestly do believe that it is significantly easier to see opponents. The only times that I'm held back by my resolution are in cases with things like bars or bushes, and in my opinion, competitive FPS games shouldn't be bloating their maps with unnecessary details like that anyway, or at least they shouldn't be doing it in places where enemies can be.
The editing in this video is so subtly incredible. The bent pins on the CPU and never ending GPU killed me. Well done Philip. Really a masterclass in sneaking jokes and fun details into the video.
even the subtitles, 12:36 for example
Bent pins ?
The overlong vodoo GPU made me remember this classic th-cam.com/video/_3iHV0NvLPI/w-d-xo.html
phil is the aphex twin of youtube
@@johnathanmcdoe Ain't it just Bitchin' Fast card? Some meme card from 90s.
15:45 Damn, DLDSR added an entire gun to the image. Amazing to see, how far technology has come.
i know right?
This puts a whole new level on video game violence.
This got me laughing so hard
Dude, Can't stop laughing
r/beatmetoit
"And if you see SS and think it sounds familiar for some reason..."
As a German I was sweating for a second there
haha yes it is
shit got a little bit nazi
I'm from Poland and I had the same reaction
Somehow thought of wolfenstein 3d after that one, after all that game could use some upscaling.
100%
Thank you for secretly releasing probably the most easily understandable guide to AA in graphics and how to parse them for actual tangible results and not data on a spreadsheet.
Nah this video honestly taught me quite a bit.
This video alone made me super interested in this kind of thing
I never get a chance to post a comment so early.
So I'm going to use this moment to say I really appreciate all your work Phil. I don't even play CS:GO anymore, but I'll still watch your videos for their informative and well presented design 😌
I second this.
I follow for Phil, almost anything he makes. His voice is kinda comforting. Not to mention his Helluvalot and sleek yermom jokes :)
Cheers Philip! Wish you all the best!
Same here!!
I also don't play the game anymore, just here for the technical stuff
I don't think more than a quarter of philip's audience has booted up CSGO in the last year.
I haven't played CSGO in years, but these tips will help with the games I am playing!
Why the hell am I watching this? I own a laptop with integrated graphics.
So relatable
at this point in my life 90% of the information i know Ai upscaling and graphics cards is from philip
how about a 100%
Fr I wouldn't know about ai upscaling otherwise
Yeee
🗿
Your knowledge has been upscaled.
Having read about AA over the last 20 years, this is the best overview video of the current tech. The only thing missing is talking about the effect on moving objects (image stability)
Tested DLDSR and didn't notice any kind of temporal artifacts like DLSS has
More people need to see that bottleneck graph section. My god it's irritating how many people treat bottlenecking like it's literally Satan himself, when the reality is your system is pretty much bottlenecking most of the time, and it just depends on what you're doing/playing.
By the nature of any system, performance is always bottlenecked. Something can always be improved, even if improvements are negligible
Sort of like crap coming out of my as
Every chain is as strong as it's weakest link - if that weakness is 4x the strength you will ever need, it's still very much a weakness, but it's down to personality how much that bothers you 👌
Most of the time the system is not bottlenecked but all resources are underutilized. The graph shown is not a very good example of bottlenecking, because there is literally no perceivable difference between let's say steady 144fps and 500fps (for many the lower bound is even 60fps). The system is doing pointless extra work just because it can. When people talk about bottlenecking, it refers to a perceivable effect caused by the disparity in performance levels between different components.
@@romaliop strongly disagree. Bottlenecking is completely separate from what you personally deem to be perceiveable. That's a ridiculous statement and has never been true.
Me watching this video on my phone in portrait mode at 480p : 🧐 "yes, I see what you mean"
I can tell you've put a lot of thought in making your explanations easy to understand for everyone, seriously well done
hey deer ;)
easy to understand and factually incorrect on several counts. you cant do MSAA with 2 samples per pixel. its only 4, 16, 64, etc. youtube looks better in 2160p than on 1080p on a 1080p screen because the bandwidth youtube uses for 1080p is too low to generate decent image quality on that resolution. streaming in 2160p and then downscaling gives better image quality because the bitrate for 2160p streams is higher, not because downsampling magically makes the quality better. this is made by someone who barely understand how to use a computer, for people who barely understand how to use a computer.
@@TheSuperappelflap he already mentioned bitrate when talking about youtubers quality tho lol
@@mopilok okay but then why make this whole video instead of simply saying that 4x MSAA gives better image quality than no anti aliasing, which is common knowledge
@@TheSuperappelflap This has nothing to do with the point of the video? To talk about downsampling that's the point of the video....
OMG! you just changed my life! So, I just recently bought the bendable $2,000 Corsair OLED gaming monitor and was considering sending it back because the text in windows was very pixelated. It's amazing in games and for HDR video, but in windows the text just doesn't look good.
After watching this video I enabled VSR and then changed my resolution from 1440p to 2160p and the text looks so much better! This is amazing. Thank you so much for making these types of videos.
I just enabled it and WEW it's like iv'e upgraded my 1080p 144hz monitor to 1440p without spending £400
you're missing out on downsampling 1440p though
@@fengrr It'll probably be better but I'm broke after getting a 3090 so this will do in the mean time.
Yep it's cool that you can get nicer looking yt videos on a 1080p screen by selecting 1440p. I already knew that but great that Phil noted it in the beginning of the video!
Ps why the fuck have you gotten a 3090 but not a better monitor for it... 1080p 144hz ain't bad but paired with a 3090...
I knew this trick from a while. I've been playing an handful of games at 1.75x resolution from my 1440p screen and the difference is astonishing. Goodbye TAA, DLSS + DSR is awesome and doesn't blur anything.
@@jesse291 Good question, real-time live-rendered ray tracing in a beta animation program is the answer (also because it was the only 30 series card being sold locally) you can pull about 10fps at 480p which is adequate, it takes a second when you stop moving or make a change for the scene to get a clear image. The monitor can wait till I can save up again since I already have one.
Not just educational but very humorous. Well done
Been running above native res for a long time. I do wish LCDs could act like CRTs where display resolution didn't matter, though. Fancy upscaling wasn't ever necessary on CRTs because most resolutions looked equally as "good" in terms of how the display handled it.
once you have a 8k screen the pixel canvas will be so dense that you will be effectively be able to use any resolution with very little loss in quality. won't be perfect as CRTs are, but good enough that you won't notice the bluriness.
recently I've tested my old 17" CTR, connected to the laptop VGA port.
To my amazement, it mirrored the 1080P from the laptop pretty well :D
I do it for old games ever since the W7 days, playing games like FEAR, Condemned and SWAt 4 in 4k looks great to this day!
BOAH ... this talk about CRT is wrong on so many levels! Please work on your attention span and watch more than the first 3 minutes of the video.
1. NONE of the issues super-resolution/downsampling/supersampling are fixing can be addressed by the CRT!
2. If your CRT is blurring the picture it simply is either a very crappy model or run at the ragged edge of its pixel clock ... if not even overclocked (been there, done that, killed a 17" CRT doing so ... refresh rate was more important than sharpness for me back in the day ... everything below 100Hz while gaming gave me dizziness ... everything below 85Hz gave me downright vomit reflexes ... like a bad VR setup)
Back in the day at the end of the CRT era there were very good 19" CRT monitors from Samsung (and Iiyama and probably others which were even more expensive ... outside my budget at the time) which were laser sharp at 1280x1024 at 85Hz. Running the same monitor at 1280x1024 at 100Hz by contrast produced a blurry mess which couldn't even hold its geometry over the whole screen and had to be degaussed every couple of hours running that edgy. I ran that CRT monitor at 1024x768 at 100Hz and used the freed GPU power for 2x or 4xSuperSampling AA (which in effect IS downsampling ... done since 1999 ... MSAA wasn't even a thing back then ) ... no dizziness and WAY better image quality than 1280x1024 ...
On the CRT there are the same jaggered edges, shimmering/interrupted small geometries and blurred out alpha-details on textures far away or on steep viewing agles!
If you want the effect of a CRT which was overclocked until it almost shits it self, use FXAA ...
CRTs are the real magic
thanks now i'll watch all the videos in 4k on my 1080p screen.. it does actually look better and i dind't expect it to
also the fact that 4K is a higher bitrate than 1080p on youtube, so that's probably the main factor it in looking much better than youtube 1080p on a 1080p display.
3:52 You scared the shit out of me with that stuck pixel. Job well done. Congratulations.
Huh?
@@celthefox There are multiple red pixels in the video. Best seen at 3:57. They are easiest to spot on a 4k display with the video playing at 4k.
@@BromTeque OHHHHH, i was watching w/ my phone at 480p lol, thats why, thanks bro!
@@BromTeque r u serious? i watch 4k on 1080 monitor and i cant see also im colorblind :"
lol i noticed that too
One thing to note about the "sweet spot" is GPU latency. When approaching 100% GPU usage, the GPU gets a bit clumsy and input latency goes up. So even though the sweet spot has the same FPS, it's actually a less responsive experience that can have a negative impact on your skills in a competitive game.
Yes, i experienced the same. Not too good for competitive shooters, but fine for pretty much anything else. I’m actually using it for pretty much any other game my system can handle at 4K using a 1440p screen. For me, this was most visible playing Mafia 3. The game looks super 💩 at 1080p and 1440p but stunningly good at 4K even on a 1440p screen. It’s worth trying on many engines if the system can handle. Mostly it’s also worth lowering other graphical settings as the better edges due to Downsampling are actually worth more than higher texture quality e.g.. and you can almost always lower or even get rid of any AA setting.
@@MrPicklock Texture quality has virtually no impact on performance unless you have low VRAM. Other than that yeah.
Cs go do not have a sweet spot on higher end pc's right now. I dont have fps cap and i cant get more than 50% gpu and 20% cpu usage. Source engine just cant utilize a full potential of modern hardware
@@puff2998 CPU usage does not represent single threaded bottlenecks, which is what CS GO is running into.
@@edragyz8596 indeed. Imagine what if cs could be able to squeze 100% of 8c/16t and even more core cpu's out there
3:52 Philip, I need you to NOT make me think I suddenly have a slew of stuck pixels on a brand new monitor.
Down sampling is also used in many high end video cameras to create beautiful video, even at lower resolutions. It works really well
I clicked on the 1080p quality option and it turned your video into a slide show. Such an amazing contrast! I suddenly had time to really appreciate every single one of frames, it really improved my enjoyment of this content :D
6:20 so many germans have sweat during this video
They had us in the first half, not gonna lie
ahah yes
They tried to protect europe from banks
I always tought that downsampling was great, but this video really opened my eyes to all the possibilities.
My old games will look absolutely stunning
@@3kliksphilip Clearly the original Deus Ex is beyond the RTX 3080s capabilities lol
@@3kliksphilipHi kliks
@@3kliksphilip Incredibly, I'm also having this issue, but I think it's an issue with the Revision mod rather than the engine. The same frame drops occur in the final area of Area 51, when you're looking at Bob Page from the opposite side of where you entered the room from, on a balcony. It's only in that specific area, looking in that specific direction where I get nearly 10 FPS, which makes me think there's some specific game object that's being rendered from that point which is causing the frame drops, and is probably true in the MJ12 ocean lab level as well.
@@3kliksphilip sounds like a shader is doing heavy per pixel calculations over the surface (water shader maybe?) and as the game is old, it isn't using modern optimizations to render fast enough, thus the performance hit is exponentially higher when you bump the resolution (and thus the amount of pixels to render)
@@3kliksphilip Well, I'd also say that old game engines use old graphics API such as DirectX 8 or even 7, and older OpenGL versions. Not counting that most of the old games takes advantage to only one core of a CPU (CPU frequency matters more than cores and threads quantity), reason why people want source ports of old games, so that they can take advantage of new graphics technologies and run those games better. Plus, probably loads of other code stuff that have not been made in the past because they were just not meant to be used or thought that they would've made a difference in the future.
Not to mention how CS:GO's performance would be if it would be using DirectX 11 API instead of 9.
Then there are certain games and engines that are mostly CPU dependant (CS:GO is a good example) unless you set it at a resolution that's sufficiently high to force it use your GPU power, which is pretty much explained in the graph you showed in this video.
One of my favourite things in Titanfall 2 was the addition of dynamic temporal super sampling based on gpu load and frametimes and I still wonder why more games dont add such an amazing feature
@@shakal_ it was the dynamic resolution setting games already have had but it just downsampled when you had frame rate overhead, literally the normal process but the slider goes past 100%
4k isn't showing up so I'm assuming the video is still processing
I often return to this video, and I don't really play cs:go; this video explain AA and downscaling so well that anything else sounds confusing and I need THIS video to remind me how they work like
The amount of work you put in your videos is as amazing as always.
Glad to hear the old MVP anthem for the sponsor though, can't wait for the next case opening millionaire episode and valve adding your music kit.
Man, this reminds me of the PS3/360 era of gaming where PC hardware was so advanced that you could run modern games with supersampling/downsampling. I played the original dishonored at basically 4k (Using Sparse Grid SuperSampling) and it looked absolutely incredible. Honestly it probably looked better on that 1200p panel than it does playing at native 4k on my TV lol.
I wish I would've had this information and my current pc 10 years ago, better late than never! Stopped playing CS a long time ago, but I still watch your content for entertainment purpose, your videos are so well thought, you never fail to disappoint.
"You never fail to dissapoint" just might be the opposite of what you tryn'a say hehe 😅
@@idobecker7501 yes, you are right, woops 😅 English is my second language
probably buried, but just wanna say I appreciate this video even though I dont play csgo. I have been using your method to play competitive rocket league downscaled from 4k to 1080p for almost a year now. As rocket league is light to run it is very easy to still push 300 fps with this method. The game looks so much better and I feel I have significant better understanding of downfield opponent's orientations and movements. your the goat ty!
I really appreciate these long, well explained, well edited, and well written videos. This one in particular I think is such a quality video. I never really knew what bottlenecking was too, so your explanation really helped me finally understand it. Thanks for the great lunchtime video!
The red stars that disappear at lower resolutions at 3:57 is probably one of the coolest things ive seen on youtube. Nice one
11:08 a few people have done some fairly scientific latency testing in a number of games and generally found the best performance when
GPU load is kept to around 80% or below; above 80%, latency tends to go up faster than FPS goes down.
ensuring that the GPU does not spend a lot of time at or above 80% will also greatly improve noise and thermals.
Nvidia Reflex and AMD Anti-Lag aims to reduce/remove the increase in latency when you're closing in on max GPU usage. Have used and tested NV Reflex and can say it works in the supported games. Don't know much about how well AMD's solution works (who also popularised it first).
no such thing. there's a ton of variables. the specific gpu, cpu and rest of pc used, the specific game (and version, software) used. Case used, fans, room temperature. etc
even then, what do you mean with "best performance"? yeah, no such thing
@@user-wq9mw2xz3j "no such thing" as what? getting the most from any given hardware?
They meant best performance as in highest FPS/lowest latency as there are certain games where the latency increases when you're GPU bound. Maybe their choice of wording might have not been the correct thing but the overall context makes sense.
@@henrym5908 getting the most out of hardware is using it to 100%.
The thing is that what you seem to be looking for is simply texture multisampling. Supersampling is pretty much considered a sin for real-time renderers. Instead, anisotropic filtering does exactly what you are looking for and really really well. But devs usually implement the alpha part the cheap way: using thresholding no need to mess with blending fragments in the correct order. This obviously results in aliasing near these texture holes. Now by ray-tracing the primitives containing transparent parts on top of the opaque (rasterized) ones, we obtain the same level of quality of supersampling (texture-wise) for much, much cheaper. Ray-tracing is quite a fancy & easy solution to alpha blending. There is some rewrite potential of these parts of legacy renderers, but gamers need to actively ask for it.
The superior stage is shader multisampling altogether which, well, is pretty much indistinguishable from supersampling. There is some work done on locally increasing the sample count but at the end of the day, this is really similar to running the same shader code multiple times per pixel.
I don't do a lot of gaming but I make digital drawings; when I make an illustration that is meant to be in 1080p, I still make the canvas in 8K and then export it in 1080p because I thought it looked nicer. Glad to see I am not crazy
I do my drawings on SVG to simplify this
19:20 at this point, I'll adding a bit of informations here: The tree becomes more solid at longer distance because that's how the shader in game works with mipmap. Lower mipmap (lower alpha mask for the leaves is the same way as lowering resolution of the texture map) in result the alpha mask will become more solid and can't see through. This works on every games, even now we always have to use mipmap to save resource from afar. This plus the LoD of the mesh and the resolution give this kind of blocky mess. Good video btw
So much research and effort in one video. It's great to see people being excited about such niche things. My favorite video of yours so far man, keep up the good work!
Every time I watch one of your videos, I just think of the old days on CS:CZ and CS:S. 10 odd years now. Great to see you're still around and hey! 1m+ subs now! Congratulations!
CSAA was actually a more optimized format of MSAA where it stored a reduced amount of color samples, compared to coverage (coverage being which color sample is affecting the pixel how much), so you could have much smoother edges, and even though it could only have like 4 color samples max on 16x CSAA for example (I dont remember the exact number unfortunately), it yielded performance comparable to MSAA 8x.
Unfortunately support was dropped when the Pascal architecture on NVIDIA came around (GTX 1080 Ti included). I remember playing around with CSAA in half life 2 then being confused at why the option was gone when I got my 1080 Ti :(
It doesn't play nice with the architecture. You *can* still enable it through Nvidia Inspector, but it'll crash without a compatible GPU. It also doesn't offer tangible benefits over what we see as 8x MSAA on a current gen GPU which is actually not the same algorithm as what 8x MSAA used to be (and this goes for all GPU vendors).
WOW! How did I not know this? The image quality looked much better. Like running a game at native. Thank you!
YT vids look better now. I suppose I should have known though. It's the same principle as recording and mixing audio at a higher bit depth and rate only to down sample the master. I also want to add that one reason for going with lower quality graphics in FPSs is to reduce the amount of distractions.
This was interesting and educational. That said, it does seem only really relevant/useful for people still on 1080p monitors, or maybe 1440p. I have a 4K monitor and can't see myself ever using this since downscaling from 8K just seems impractical for current GPU's even if you wanted to - 4K is already hard enough. Might be relevant in the future though.
even for you the Deep Learning DSR (DLDSR) of Nvidia could help since it can use your 4k or slightly below 4k resolution and pump it up to 6k to downscale from there. At quite little performance impact. :)
Mr kliks man, at this point u make a lot of us realize what we can do with our PCs, but I think you’re at the point where you should develop a software or partner with a software that tells us what quality settings we should play each game with. You can start with easy computing settings but when u can define more important qualities you can add those qualities to the software. I can help to a certain extent, but most of ur fans trust you way more than most creators, and I think you have a future in this technology
I like how the CPU at 9:48 has a ton of bent pins
Thank you for this using DLDSR now 1.78x 25% smoothness
I make use of the downsampling effect for JPEG images too. I realized that when targeting the same image file size it is better to a) use a higher image resolution with lower quality than a b) smaller resolution with higher quality, because the scaling algorithms when presenting the image on a viewport smaller then the image, usually yields better results for B. Only recently I discovered that A can have better results when using a sharpening filter prior saving.
I am using a 3200G Vega 8 machine and a 1366x768 monitor output but I render the desktop at 1600x900 resolution with AMD's VSR (Virtual Super Resolution).
It enables me to render the UI of my decade old games at a more comfortable resolution than my monitor's native resolution.
I can also push it to 1440p but it is heavier to render on my vega 8 and makes text less readable on my monitor.
i have the same rig with same monitor and ive always struggled with sharp edges since anti-aliasing is so demanding
i will try the VSR tho i hope it won't lag too much
I have also noticed that DLDSR also helps get rid of shimmering effects as well. I have been sort of using it as an alternative to TAA in some games where it is super blurry and has ghosting.
Dang Max Payne really looking good with this downsampling @ 15:02
ive been playing dead space 2 on pc again and the AA method used back in 2011 is terrible and forcing it in the control panel just hits the frames way too hard, I ended up just enabling super resolution and im rendering the game at 4k and downscaling to 1080p, it doesn't look as crisp as forcing 8x MSAA but it looks much better than native and still holds 165fps.
Wait until the remake(s) come out. They already showed off some new systems that they're doing and it looks good on top of that.
@@blindfire3167 yeah the remake looks dope, part of the reason why im replaying the originals.
@@myztklk3v I'm on the opposite side of that. It's been about a decade since I played with a friend (I don't remember if we did 1 together, or if it's just single player until 2) so I'm holding off until I see the remake.
Just like Bloodborne's Remake (or could be Bloodborne 2, but considering it 's being made by Bluepoint and most likely coming to PC I think it's remake) I don't want to complete the game right before the new one comes out so I don't immediately know what to do and have a bit of time to remember/wander around.
Dead space 2 is so fun and looks still so good for it being released in 2011 or 2012 (forgot which)
Me too! I've been playing both Dead Space 1 and 2 in 4k, and they look so much better on the screen especially in Dead Space 2
I had an entire blog dedicated to high quality screenshots from games I've taken and they were all done with downsampling :) There used to be a great pc screenshot thread on neogaf with tons of amazing quality screenshots doing the same. Some people even mod old ass games to force support for like 8k resolutions on games from the 90s and get beautiful screenshots that nobody else can get unless they do they same. I actually love it so much.
Very highly suggest taking a lil sightseeing tour through Karnaca in Dishonored 2 while downsampling to the maximum of your computers abilities.
I've been playing old games regularly at 8k on my 4k monitor just for the absolute lack of aliasing of any kind , it looks phenomenal.
The thing that got me to appreciate just how good downscaling can be is vr. I upgraded the hell out of my pc so it could handle vr, I went overboard and can now play vr games at 1.9x the internal resolution of the headset. If you've got the extra frames above your refresh rate to spare then definitely try squeezing some more quality out
7:21 I didn't expect this joke so soon, as a result I am on the floor.
Very good video. I've always known that watching a TH-cam video, for instance, in 4k on a 1080p screen will look much better than watching it at 1080p, but I hadn't realised quite how much of a difference it could make until I watched this video. The same goes for running CS:GO at a higher resolution than your native screen too, although with that being said I play on 1280x1024 stretched lol.
the section at 8:00 got me all bricked up
edit: nah this whole video is sick as fuck. way more interesting than the upscaling IMO
What splits everyone's views on whether or not upscaling or downscaling matters is due to how people care about their gameplay experience.
for example, if someone cares more about the story, they wouldnt care as much about how fast the screen goes
if they care about the gameplay, they would want a good image
either way, everyone has their own way of wanting to experience a game. No need to argue with someone that they should experience what you experience
I think the future is in combining: downscaling in geometry and textures, upscaling for raytracing, ssao and so on.
Also 4k and 5k displays will solve this problem, running in native resolution.
Valve could have made the tree leaves look correct with anti-aliasing - they have a variable in materials named $allowalphatocoverage that they don't seem to ever use, but it makes $alphatest materials able to be anti-aliased.
testing my settings was hell of a journey as my game crashed every single time I changed the resolution, aswell as sometimes i changes MSAA settings. I don't have a potato pc tho
I finally get how this works. It's not anti-aliasing so much as it's saying "see that palm tree? It's only rendering at 320x240", but your display is capable of showing more pixels of detail at that location, so downsampling allows you to get more information out of a lower resolution base. Running a game at 1080p isn't actually 1080p for every object on screen. But running at 4K downsampled means all the smaller elements on screen that lost detail in the renderer get that detail back.
there isnt even 1440p quality yet
10:49 is a great visual example of bottlenecking, great work!
I'm not sure why but this video is endlessly rewatchable and I love it. Would be very interested in seeing you SS more games to reveal "hidden details".
11:20 that's a really concise explanation as to how bottlenecks work, nice job
the 4k version isn't out yet :(
"If you see 'SS' and think it sounds familiar for some reason..."
Yeah I was totally thinking about DLSS...
Wow, this explains why I always had the impression that watching videos in 4k DOES look better on my 1440p monitor than watching them in 1440p. I always thought this has to be a placebo effect, though, so it's nice to know it's real.
A good 1440P video will be created from the downscaled 4K version though, and hopefully with a downscaling algorithm that is at least as high quality as what a realtime downscaler would use. If you are talking about TH-cam videos though...maybe TH-cam doesn't use a very high quality downscaler. They have to process massive amounts of videos, so who knows.
@@keithwilliams2353
the video is not actually being downsampled when you play it above your monitor's resolution.
it just squashes the video into whatever window size it's given, and then it's usually not filtered at all unless the video driver has something that's specifically designed for that.
if you play a video above 1080p on a 720p display, thin lines with harsh contrast are usually going to be massively aliased. Especially noticeable when somebody records a CRT monitor in a video.
In most cases it'll look great and perfectly crisp, but then it's given a black line that's 1-3 pixels thin and it suddenly looks like you're playing shadow of chernobyl with AA turned off.
@@Shpoovy I'm pretty sure my hardware accelerated video playback is using something more complicated than a point filter to downscale a video that is higher res than my monitor. My GPU offers realtime color correction and denoising, like they wouldn't be bothered to at least implement a billinear filter?
@@keithwilliams2353
"...unless the video driver has something that's specifically designed for that."
-me, two paragraphs ago
@@Shpoovy Sure, but I'm pretty sure almost all modern PC discrete and integrated graphic solutions have support for hardware accelerated output scaling. Maybe you won't get it on a chromebook or and old tablet, or a device that is misconfigured?
Tried to explain this to my friend and he called me a looney conspiracy theorist. He was an IT specialist with experience in game development, goes to show a degree doesn't give you intelligence. (CPU/GPU limited part)
An "IT specialist with experience in game development" should definitely understand what a bottleneck is, maybe you two had some misunderstanding going on..
6:19 Those two letters don't bring good correlations to a Polish person like me 😅
3:08 the graphics are so good, I didn't even noticed that this was Arma
GREAT VIDEO-AA & MSAA & FSAA makes me remember in the beginning when enabling it any game was struggling to run......forward to 2022 you feel nothing if you enable it.....same story repeats now with DLSS & Raytracing.....in 20 years from now it will be just a feature by default :)
You can definitely still feel it lol.
Meh, as no matter how hard the original image is to render, the aa isn't more, so with modern games that litteraly can't run on hardware from that time ( even forgetting driver support) it's quite painless especially at lower resolutions
6:18 Me remembering my history lessons after I find my grandpa's electrician helmet
I imagine what DLDSR is doing is rendering at the resolution they claim it's rendering at, upscaling using their AI magic to the resolution they say it "looks like", and then downsampling back to your monitor's resolution.
It’s insane how all these terms like anti aliasing that see every day and had no clue what meant were explained so easily
5:31 wow, that's like 16 times the detail!
wow... nice reference
I'm now running at 1440p downscaled to 1080. Thanks :)
I loved this video. Fwiw the big problem with aliasing for me is when the camera moves and the aliasing artifacts make it look like something is moving when it's not, which is really bad in competitive games. Or worse, leaves in the distance moving causing a flashing artifact.
this answered alot of background questions in my mind. thanks!
Phenomenal video as always, Philip! Just wanted to chime in regarding the clarity of alpha textures because I literally just faced this issue today while learning Unreal Engine 4 - game engines heavily rely on mip-maps (a chain of progressively lower-resolution copies of a texture) that they switch to as the texture gets further away. Their primary purpose is to avoid shimmering on distant textures that can't be stably resolved at your current resolution, meaning at higher resolutions you use higher-res mips.
While generally imperceptible by design, it's impossible to miss on alpha textures as once-clear leaves morph into a blobby mess. This can be mitigated by adjusting the LOD bias setting for your 3D application via Nvidia Control Panel or AMD's equivalent, forcing it to use higher res mips at the cost of potentially increased VRAM usage. Definitely something to keep in mind when employing upscaling and especially downsampling in the future!
With downscaling you have to be able to be pushin P (pushing pixels obviously)
Okay, here's a quick history of AntiAliasing.
In the beginning, there was AA. This was produced by rendering to a buffer that was several (2x or 4x) times greater in size then the native display resolution, then downscaling it to native. It was horribly inefficient, and took twice (or four times) as much work as rendering natively, but it looked fantastic.
Then came "Multisampling AA". This worked by rendering at native resolution, then mixing in small parts of surrounding pixels into each pixel to smooth jaggies. It worked, and was highly efficient as it was only rendering at native resolution, but it made everything blurry, and couldn't add back in pixels that were lost when they were rendered. As companies played with it, different sampling patterns came about (including Quincunx - 5 pixels in a diamond pattern, the most simple sampling you could get), with different numbers of pixels averaged. Original AA was still around at this time, and was renamed "Super-Sampling AA".
Then Matrox came out with a fantastic system called "Edge AA". This rendered at native for the whole scene, then used the depth buffer to detect edges of objects and re-render them with SSAA (up to 16x!). Because it was only working on the edges, it wasn't as intensive as full screen SSAA, and it didn't make everything blurry like MSAA. It didn't work on objects with alpha channels, like chainlink fences and leaves for example, but where it did work, the effect was astounding. All the 3D companies copied this and put it into their MSAA engines for improved speed and visual fidelity (leaving flat textures largely unblurred).
Next, NVidia worked out that if you ran an edge detect photoshop filter followed by a blur filter on those edges, you could get cheap AA using just texture shaders. This was quickly adopted as it made things look slightly better for "free", and was called FXAA (And/or TXAA when they started using previous frames to help with the edge detection, T standing for "Temporal"). This was also adopted in games with their own implementations and in AMD's drivers too.
All these systems still used native resolution, and so couldn't reconstruct data that was missing between the pixels, like original SSAA did.
NVidia introduced DSR, which was basically a way of dynamically rendering at higher-than-native resolution for quality (or lower-than-native for performance, a trick borrowed from consoles). This allowed Super-ish sampling (as it wasn't a whole multiple of the native resolution like SSAA) again at the cost of having to render higher resolution screens.
Now NVidia has taken their DLSS system, which uses AI/Machine Learning to try and fill in data where there wasn't any, to create a version of Super-Sampling. In the same way it (for performance) renders at a lower resolution then upscales with AI, this new system renders at (or above) native, upscales it with AI, then downsamples with smoothing like SSAA used to to construct an anti-aliased image, and called it DLRSS. If the AI upscaler recognizes objects like leaves and upscales them accurately, this will work well. Unfortunately, it seems that it isn't, so it's not much better than MSAA.
In short, for best visual quality use SSAA or DSR (which will incur a heavy performance hit), followed by DLRSS if your system supports it and can't do SSAA or DSR, followed by MSAA, followed by FXAA if you have no other option.
And lastly comes TXAA/TSAA because 99% that have it smooth everything into a smoothie.
@@steinkoloss7320 TXAA/TSAA is the same as FXAA, just with more "Temporal" added. And yes, it's blur-o-vision.
2 questions,
1. how do I find the framerate that my cpu and gpu can run seperatly.
2.Should downscaling be used with SSAA/DSR to provide better visual quality/detail or should it be used by itself.
DSR is not working correctly without smoothing on uneven resolutions. 33% smoothing therefore should be a good compromise for all resolutions. DLDSR on the other hand works correctly with all levels of smoothing. Smoothing in DLDSR is actually deep learning sharpening, the lower the slider the more aggressive the sharpening. Therefore, I would argue, 66% is about the optimum with DLDSR. You want the neural network to add some details but not to get into ,"painterly" territory.
How long did this take to make? I'm lost for words with the quality of this video
never stop talking about upscaling, it's fascinating I love it !
you are missing Supersample, it will apply that 8x MSAA to textures with alpha too, it can be force in the nvidia control panel, some games will be compatible like Fallout 3 (and in general older APIs than DX10)
This documentary deserve to be a huge part of screen resolution on official wikipedia page.
People need to know that this is the actualy power combined cpu+gpu
and not the "value" that some rgb-led lights add on the pc as a device.
I used 4K DSR for years when playing on a 1080p monitor. Downsampling is a far better method of AA than even MSAA. More pixels = better.
"Better" is a very ominous statement. MSAA is exponentially faster than downscaling, not to mention the computational pressure to render sub-pixel geometries that just look nicer without being crucial to the game itself. This statement also advocates for upscaling+AA when gaming on a 8K monitor.
Whenever I watch one of your videos I'm smiling but the moment at 6:19 suddently made me look serious as a german LOL
daddy upscaler is back in town
10:35 this graph is absolutely amazing, I can't wait to use this to explain to my friends what is performance bottlenecking
I use downscaling a lot. Just makes the image look sharper and I have a ton of performace unused bc I have a 3060ti and a 1080p monitor.
All i can think of is the video you did on settings years ago and it's making me feel old.
Great video with good depth on all you're talking about!
5:42 the detail!
I love how downsampling is spelled differently every time it's said in the captions
Few things, MSAA can affect foliage if it the alpha cutouts are forwarded to its coverage and also theres a reason textures also look better on higher res and that is because the Mipmaps are biased against the higher res.
"there's a reason textures look better on higher res, and that's because the texture mips are higher res" damn that's crazy 🤯
Finally someone who actually understands bottlenecks. I've seen so many people play competitive games with the lowest settings at the lowest possible resolution, with the mindset that worse graphics = higher framerate = lower latency. But as you've very clearly described here, there will be a cap for how high your framerate can actually go. Plus, some settings like anti-aliasing can actually improve visual fidelity and improve your gameplay as a result. Even if you're ultra-competitive, you should be playing a game as old and easy to run as CS:GO at at least 1080p with some level of AA. Otherwise you're just wasting your PC's hardware for no real benefit.
This was the best downsampling (or downscaling) video I've seen yet. I especially liked the Nice Tree 1 bit; you really showed that DLDSR is NOT getting more detail than the resolution it's based off. I didn't know if DLDSR got more detail and I didn't think of testing it like this. Super cool!
What do you think of downsampling (eg. 1.78x DLDSR) then downscaling (eg. 80% resolution scale) to get better image quality and not as significant of a performance impact as 1.78x DLDSR? I tried this exact thing in Hunt Showdown but found that the image was actually MORE JAGGY with the aforementioned scaling soup than native 1080p. I wonder why this is so 🤔... I'd expect the opposite as the net resolution is still higher than 1080p.
Bro I have like this problem in huntshown and i been f kn with it for days now and I was like this will solve it
My guess( I never played it nor am I some sort of expert) is that possibly the downs downscaling wasn't perfectly even or something with the engine itself.
I didnt think nvidia allowed dldsdr and dlss to be turned on at the same time
I long stopped playing CSGO but still watch your videos from time to time. Playing other games, some older with weird artifacts when antialias is turned on, the discovery of downscaling is huge for me, especially if those games are stuck at 60 or 144 fps. Thank you for this video, you improved my gaming experience.
I get called a freak of nature by my friends for preferring playing FPS games with low resolutions and very bad aliasing. The reason is that I always find it extremely easy to spot moving targets or distant enemies when visually they're so obtrusive. I honestly do believe that it is significantly easier to see opponents. The only times that I'm held back by my resolution are in cases with things like bars or bushes, and in my opinion, competitive FPS games shouldn't be bloating their maps with unnecessary details like that anyway, or at least they shouldn't be doing it in places where enemies can be.