Pixel quality > pixel count; furthermore, art direction > fidelity. Chasing such high technical standards has game developers and artists overworked for diminishing visual returns, install sizes that occupy 1/4 of a console's hard drive, and products that often need to be smothered in layers of AI and TAA vaseline in order to run at a stable frame rate at all. Meanwhile, Nintendo still makes billions shipping games at 720-1080p, while still earning praise for visuals.
I don't find 4K that much better than 1440p personally, but I can still understand that others enjoy higher resolutions. It's just not that interesting to me. Of course higher resolution is better, but framerates and smooth visuals are also important.
Still, in the future 4K and even 8K will become that "smooth framerates and clarity" sweet spot. I also have a 1440p screen and am very happy with it, yet I can't wait for 4K to become as easy to run as 1080p is nowadays. I love replaying favourite games whenever I get a higher resolution screen, just so I can experience detail I wouldn't have otherwise.
Same here. I first went from full HD to 4K, which was a huge difference. Much later I got a 1440p high refresh rate screen and the difference in desktop/text sharpness is unnoticeable. Maybe for something like Escape from Tarkov it would be nice for spotting stuff in the distance, but 4K high refresh rate is prohibitively expensive both in processing power and screen price. No way I'm paying NVIDIA the "we price for the elite" cost that they have imposed on newer RTX cards. High end AMD is already expensive enough.
I definitely noticed 1440p to 4k. I don't notice 1800p to 4k. It's like how 144hz vs 240 is noticeable, but 240 to 360 is harder. It's just past the barrier. Though 500hz is approximately the frame rate an untrained eye will notice.
While 8K may eventually be a realistic option for desktop monitors, I'm personally more interested in the possibilities of 8k VR. With double 8k strapped to your eyeballs, it would make for an unprecedentedly clear viewing experience.
It is pretty wild to me that it isn't supported directly more often. It isn't a very well-known idea because you need to set it up in such roundabout ways.
I used to think that 4k was pointless compared to 1440p, but now I realize my real beef is with TV manufacturers, game consoles, and to a lesser extent monitor manufacturers who force us into exponential upgrades without offering adequate options in the middle. I always thought that consoles and TV manufacturers targeting 4k without any love for 1440p was stupid. I think it's stupid that there's not a lot of 1800p monitor options. I think that it's incredibly stupid that TV manufacturers are going right to 8k, and it's especially stupid how the consoles will fall right in line, and likely won't support monitors that will inevitably be created in-between.
It's a shame 6k doesn't come to TVs. IMAX films are shown in 6k at the high end theaters. It could easily be transferable to TVs. It's the reason why 4k exists in the first place.
The biggest use case for 8K is headsets, VR or otherwise. That said I still think gaming at 8K (outside of a headset) is dumb and likely will be for a very long time. The diminishing returns are pretty extreme and there are lots of other metrics worth boosting before that. I also don't think downscaling from 8K counts as gaming at 8K. Ultimately what you see is still at 4K and only requires a 4K screen. And 5K and 6K aren't 8K. The returns diminish exponentially and it's rare to see people arguing 5K is too much for example.
Thanks for the nuanced take. Most of the time people claim that there is no advantage to 8K, while, as you said, there is one, it's just not even remotely worth it for now. If you are talking native 8K I fully agree that it will be a bad idea for a long time. The thing is though, upscaling is getting better and better, so gaming on an 8K display could become reasonable quite a bit faster. Currently even just the upscaling cost to 8K isn't worth it, but two GPU gens in the future that will probably not be a noticeable performance impact anymore. For completely matching peak human vision, you'd need 2400ppd. I tried to measure that for myself and ended up with >1700ppd instead, which is >86K on a 32" at 80cm distance. In realistic use cases it will be indistinguishable from a 16K screen and even that is very far into diminishing returns. Personally, I would prefer 6K, as the benefit of 8K+ is too little and I would rather have the resources spent on refresh rate and color depth. But I expect the industry to push for 8K instead.
I'm pretty sure the average ppd of the best eyes is lower than 500ppd. Maybe there is a study that says a focal point can see that high. So in that case just focus 1440p into that area. But the rest of the screen would realistically never need to cross 16k, or 15360p, regardless of size. Apple "retina" is 120ppd, and double that is probably good enough @@davidruppelt
It's true. Downscaling isn't the same as native at all. It's the same as watching a video in 720p that was made at 4k. It's a huge upgrade in image quality versus a video made at 720p, but at the end of the day the clarity isn't anything more than 720p. Theoretically even if the bitrate was the same, it would still apply that you can not zoom in (or bring your head closer) to actually see detail. It's the same story with upscalers, at least for the current ones. You can test by turning off forced taa/upscaling. You'll see that you don't actually lose any detail. With upscaling a blade of grass can be ultra sharp. Without it you'll see the 16 raw pixels. But if you turn up the real resolution of that to 64 pixels, you would actually be able to see the veins and stuff on a piece of grass. Yet you would still see squares, so you have people who still call it inferior to the ultra sharp perfect circle.
@@jwhi419 I have the 2400ppd from the "NVIDIA Automotive Screen Density Calculator" where they claim that to be "approx. limit of high contrast feature detection". I don't think there is a practical benefit of a screen that capable over a 300ppd screen with AA, but if you truly want a resolution that under no circumstances needs AA, then 2400ppd it is. 300ppd would be "20/4 vision; approx. limit of alignment detection (hyperacuity)" and is equivalent to a 32" 16K screen at 80cm distance. That should in my opinion be the end goal. Apple's 120ppd are "20/10 vision; practical upper limit of visual acuity". That is a more realistic goal for now and would probably be good enough for me, but would need AA to avoid the edge cases where it isn't. I did a test with blur busters' "Worst-Case Aliasing Visibility Test" and there I could still see the artifacts from a distance equivalent to 1700ppd. There may very well be some error in my test design, but I at least believe the conclusion to be valid, that 120ppd is not enough for worst case scenarios.
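As a rough sketch of the pixels-per-degree arithmetic behind these figures (assuming a flat 16:9 panel viewed on-axis and averaging density across the width):

```python
import math

def ppd(horizontal_px, diagonal_in, distance_cm):
    # Physical width of a 16:9 panel, then total horizontal field of view.
    width_cm = diagonal_in * 2.54 * 16 / math.hypot(16, 9)
    hfov_deg = 2 * math.degrees(math.atan(width_cm / (2 * distance_cm)))
    return horizontal_px / hfov_deg

# 32" panel at 80 cm, the setup quoted above:
print(round(ppd(3840, 32, 80)))   # 4K:  ~80 ppd
print(round(ppd(7680, 32, 80)))   # 8K:  ~161 ppd, past Apple's 120 ppd figure
print(round(ppd(15360, 32, 80)))  # 16K: ~322 ppd, around the quoted ~300 ppd hyperacuity limit
```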
@@davidruppelt I see. However I do think that if the tech to create a 32 inch 16k display exists, then the tech to do the same in VR with a headset weighing 200 grams would probably exist too. Well, the chief of Blur Busters has talked about an end goal display technology on his forum before: the Holodeck from Star Trek. I do not recall the details but I think he ignores pixels at such a high resolution. Of course his deal is more about motion clarity. Just like you noted that you would need more than 1000ppd for the line test, you would need more than 10,000hz to move each of those pixels visibly to a human.
And what about VR too? We are already getting close to 8k with the Apple Vision Pro. Regardless of the success of the device, the screen resolution is something that is praised when compared to other VR headsets.
1440p on a phone is a gimmick. I've had a phone with 1440p for a while, but I lowered the resolution to 1080p and noticed no difference. At 400ppi already, what's the point of even higher density? It just drains your battery faster and the only way to notice a difference is with a microscope.
Yeah, that's why I still game with a 1060 @ capped 30fps. No need for the extra frames - I don't notice any difference. 30 frames is enough. The only way to tell the difference is by scouring a video frame-by-frame.
I've always appreciated your videos on tech and specs (including resolutions) because it's always felt like you were hopping on the next big thing before anyone else even acknowledged its future widespread use. I remember a world a bit more than 10 years ago when the majority view on the internet was that 4K was useless because we had hit diminishing returns at 1440p and wouldn't benefit from more pixels on monitors that sit so close to our eyes. Now the same is being said about 8K while quality 4K monitors can be bought on Amazon for £250 and have thousands of glowing reviews. It's a uniquely human thing to feel so pedantically correct about "tech advances aren't worth it anymore! we need other transcendental innovations!"... and to be so harshly proven wrong when the inevitable happens. Great video!
I mean, I still think so. 4k is still a gimmick: almost no content is consumed at that resolution or gains any real benefit from it. Objectively, in most cases you can't tell the difference on a TV at those distances, and a computer monitor gains so little benefit that a 4x performance penalty cannot be justified in any way.
One of the issues I still see in "gaming" monitor reviews is that they point out that the increased pixel count is useless while gaming; if so, maybe use it while doing other stuff. Personally I prefer 4k gaming even at 27 inches and I look forward to the time when I get my first 8k screen. I hope it will have better colours than my 4k smartphone.
I think the main polarizing point of upscaling is that instead of being a free bonus performance or clarity boost, developers have started using it as a replacement for optimization to ship games out faster and cheaper. At that point you still need high end hardware to run it well, which kind of defeats half of the point of upscaling.
"developers have started using it as a replacement for optimization" I've seen this statement in some form or another for probably 3 years now and I still have not seen any proof or evidence to back it up. People have even come up with elaborate conspiracy theories where developers have conspired with Nvidia to make their games run worse to make DLSS seem more valuable. So lucio-ohs8828, do you have any proof whatsoever or are you making shit up and talking out your ass like everyone else?
@@kiri101 Please don't excuse the generic slop that is gaming nowadays, it's clear that devs don't have any talent anymore. It's all just talentless UE or Unity devs making bad games and using AI so that they are at least running lmao. Star Wars Battlefront 2 came out 6 years ago, ran fine on a 1070 or 1080 and looks much better than just about every game nowadays. Perhaps hiring actually talented people that care about their product or having a custom engine isn't such a bad idea? Memory is very cheap, GPUs are not.
Obviously when people say 8k is a gimmick, they are talking about the here and now. The hardware requirements and costs are just not worth it. Things might change in 10 to 20 years, as they always do, but that's not a revolutionary idea.
That's not obvious, many people say resolutions higher than 4k are _completely_ worthless due to the limits of human vision. This is already visible on smaller devices, 4k laptops aren't much sharper than 1440p ones and I'll never need a 4k phone even in 20 years. I agree with philip that the limit for monitors/TVs is a bit higher, I wish there were more 5-6k options, but there's a limit there too and 8k will likely be the last increase. VR headsets will go even higher but once we have better foveation and reprojection it might be a bit meaningless to talk about native resolution/refresh rate
@@speedstyle. Laptop displays above 1080p have demonstrated how useless they are for more than 10 years now, but still they keep being produced despite the fact that you have to sit 2 inches away to even tell. Intel didn't even allow unfiltered scaling from 4k to 1080 until 2019, and every laptop before that is arbitrarily not allowed to use the objectively simplest method of rescaling in existence. And then they pretended it is an actual feature that took effort to implement when they so graciously decided that the consumer worms were allowed to use it.
Sadly, although I think it does have a purpose (and thank you for pointing this out), the LOD changing on models will appear a lot more obvious at 8k. At 4k, playing Skyrim with max object fade, it's still noticeable at ~40 inches.
@@2kliksphilip Just googled Unreal 5 Nanite cloud and it looks very cool, hopefully newer games hold up to the future! I wonder if older games can handle 8k with your RTX 4090 and how they handle the higher res, if there's noticeable pop-in/LOD switching? Could be an interesting test :)
@@bebobo1 There seems to be a hard limit before it switches LOD models. I think you can actually regenerate distant LOD for Skyrim using external tools. I remember doing something similar for getting the map to work with roads+seasons mods, something like xlodgen iirc?
Imagine working hard on making a decent video, being eager to hear your viewer's feedback only to be bombarded with a reddit tier comment chain spamming "hmm yes, maybe it does have purpose, thank you for pointing this out"
The reason so many people hate 8K is because people want an upgrade in resolution, graphics, and frame rate, not JUST resolution. We already have 4K60 which is a 4 times upgrade to resolution compared to 1080p60, the next step should be 4K240, not 8K60. That way we have a 4 times upgrade to resolution AND frame rate compared to 1080p60, not just 16 times in resolution. Combine that with running old games at high settings, and the triangle is complete. We are so close to finishing the triangle, why would anyone care about 8K at this point in time when we are THIS close to having a TRUE next gen gaming experience standard? 1080p60 was the standard for so long, it's time to make 4k240 the next one.
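For what it's worth, the raw pixel throughput works out identical either way, which is why it really is a question of where you'd rather spend it; a quick check:

```python
# A quick sanity check on the "triangle": 4K240 and 8K60 push exactly the same
# number of pixels per second, both 16x the 1080p60 baseline.
def pixels_per_second(w, h, hz):
    return w * h * hz

base = pixels_per_second(1920, 1080, 60)
print(pixels_per_second(3840, 2160, 240) / base)  # 16.0
print(pixels_per_second(7680, 4320, 60) / base)   # 16.0
```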
Right there with you. Once I managed to get 1080p60hz I upped my game to 1080p144hz. My next jump will be 1440p240hz, and once I achieve steady performance at that, maybe we will be aiming for 4k560 or whatever. I used to be ahead of the curve with 1080p and 144hz but I've since slowed down massively. Everyone's been talking about how 1080p isn't good enough and I've yet to upgrade to 1440.
1:45 Is it odd though? Because medium graphics take away a LOT of detail. Sure, most agree that ULTRA settings are overkill and don't do much, but HIGH graphics are the sweetspot. Medium is definitely a downgrade, even at 4K. It's easy to see in the GTA V gameplay you provided.
Yeah, many games have a bigger fidelity step down from high to medium than from ultra to high. High settings + high resolution is the best combo for visuals. I'd take 4k high over 1440p ultra any day.
I did find it interesting that he used GTAV as the example, because in most modern games I would agree that you can drop the graphic settings quite a bit before it becomes very noticeable, but GTAV has a very obvious drop from high to "medium" (which is actually the lowest setting available). Medium is basically the xbox360/ps3 graphic settings.
I'll just leave a little shout out for tiny little 8K screens. We need them! VR with 8K per eye and super bright HDR goated OLED would be amazing. Foveated rendering takes care of not having the GPU power to drive all those pixels and 8K gives you amazing clarity that has to be getting close to what our eyes can even deal with. I'm no vision-to-pixel expert but I do notice that in VR, the same resolution looks so much worse than it does on a monitor a few feet away from me. So let's get on with it and get to 8K! It can only help VR heads wanting that to trickle down to little screens for their faces.
It's all about the detail and aliasing, not the literal pixel size. Supersampling to 8k could provide perfect image quality, but an 8k display just pushes those issues down the road.
@@DraxilSpada-vc2wg Oh yea ok, that makes a lot of sense. I was thinking like an old 19" monitor or something, which would be terrible to look at🤣 On a 7" that res would be perfect and I'm sure it makes games and other programs run a lot easier on your PC.
@@Sk0die The hilarious part is I use it with an RTX 4090, it allows me to crank things up to stupid amounts, or just run my PC without burning a figurative hole in the motherboard.
@@DraxilSpada-vc2wg you are experiencing a level of bottleneck I didn't even know was possible. please tell me you have another display you use as well? I'm fairly certain my ryzen APU could run any game in existence on high settings with that display, I think the 4090 could be a tad overkill😂
i play all my games at 4K on a 1080p monitor because it just looks that good. i imagine playing at 8K on a 4K monitor would be just as amazing of a jump in image quality. as more 8K monitors start to appear, i'll be happy to get a 4K monitor for cheap
In my experience, using DLSS or other AI upscalers beyond native to essentially do super resolution, but with AI upscaling, usually just means you are importing the upscaling artifacts these produce into your native resolution rendering, which was previously pristine. This is of course also just a tradeoff you can make depending on your preference. Just wanted to mention that this is not simply better than native most of the time. There are visible drawbacks that I notice without even actively paying attention. To be a bit more concrete, the visual artifact I always find annoying is how distant objects that have a high-contrast transition to a background will leave obvious and dark ghosting artifacts in motion, usually caused by camera motion. Great video btw. Thanks Philip!
8K is objectively better than 4K and you can see the difference under specific circumstances. However due to diminishing returns and extreme hardware requirements I foresee its adoption outside TVs being very slow. According to Steam Hardware Survey over 50% of participating Steam users have a primary display resolution of 1080p or lower and 4K adoption is at less than 10%. To me this says that we're years away from 8K gaming being anything more than a curiosity for most people especially as any resolution is a moving target when it comes to performance (the price to performance stagnation in the GPU market doesn't help either). Another issue to consider is that monitors last a long time. It is not uncommon for people to keep using the same monitor even if they completely upgraded their PC in the meantime. This likely contributes to the slow adoption of higher resolutions.
I feel like you meant this as a joke but yeah, in 5 years with a decent card it will be a normal option... I started playing games in 4k in 2017 with a GTX 1070. I could play games in 8k with my current GPU if I wanted to.
@@OttoLP personally in 5 years I fully expect I will only just have adopted 4k, but I'm also not buying anything that could be construed as high end lol. I just can't bear to spend that much. But yeah, just as 4k seemed very silly and wasteful, but is now seeming more and more viable... So too will 8k. I expect we are more than 5 years away from 8k reaching mainstream, but I don't doubt that we will start seeing it again by 2030.
@@pancakerizer Hahahaha, yeah that would be crazy playing games at 4k 240fps like it is nothing. Though I doubt they will improve on path tracing performance as much in 2 generations, it might get 2-4 times faster at most I guess. Fun thought though :D
@@OttoLP Honestly, going off of 4K (which never took market dominance), it would be about 8 years from now for people to start adopting it en masse, because high frame rate monitors will lag behind by about that much.
I actually think we will never get there. Just like phone batteries get bigger (denser) and CPUs more efficient but every phone has "a day and a bit" of battery life, our graphics cards will get more powerful, but games will have more polygons or shaders or textures, or in case they can't add anything else, more and more rays being traced and noise being compensated. So 8k high quality will unfortunately not happen, I think. But still, if we get to 4.5k scaled down, 5k, 6k... people will hopefully see the difference. Seven years ago I had one of the first GPUs marketed as a "true 4k card"; even back then, without a 4k monitor, the anti-aliasing of running at a higher res and downscaling was just something that I couldn't get over. And now I always have to spend much more than I'd "need" on my PCs, just because of this.
This is a pretty good answer. I don't think that we'll "never get there", but like you're pointing out, everything is going to scale the same to the point that we'll hardly see a benefit. Batteries get better and CPUs get faster and more efficient, and developers are like "It's free real estate," and never bother to optimize. Sure, someone might want an 8k gaming monitor, but do they want to pay for the video card to run that monitor?
I remember Monster Hunter Rise on PC having broken TAA (literally didn't work at all, dunno if it got fixed) and I used my extra GPU headroom to run at 6K. It downsampled to 4K using DLDSR and the image quality was superb ❤
@@albertfanmingo Yes, with your nose glued to the display and having to move your head to see the edges of the screen you can see it. And that's exactly the most st*pid and inefficient way to use a display.
For 7 years I gamed at 1360x768 on a 19 inch TV. It wasn't until 2024 that I finally upgraded. I got a used 24 inch 1080p monitor from Craigslist and it's wonderful. I honestly don't think I'd ever want 4k gaming or above; 1080p at this size and distance looks perfectly fine for me.
4:20 Yes, but the "native" here is still using TAA, which is why I don't like upscaling at all; I personally don't like the blur TAA gives, and would rather use no AA at all, even with all the crunchy pixels. Hell, at 1440p and higher AA isn't "really" needed (especially at 8k, wink wink). And if performance is needed I would rather drop resolution and use something like SMAA than use upscaling. I wish modern games kept clear of TAA. r/fucktaa has some good info.
AA is definitely still needed at 4k for _most_ people. You might not mind that unstable raw look, but there’s a reason TAA became so ubiquitous: most people do. Besides the fact that modern and future games look very different to old games without AA, as the more complex your geometry and shading are, the more unstable the image is going to look without AA. In order for geometry and shading to advance, some sort of AA that can solve temporal instability is necessary.
Always Disabling AA gang here. Yes, I would much rather have jaggies (which are not visible on 1440p unless you stay still and do one of those "zoom ins" on the screen, which I'm not doing because I'm playing the game) and a higher framerate than a lower framerate and a blurry mess of an image full of ghosting.
@@Mankepanke I wonder how much of this is a generational thing. I'm in my mid thirties and among people in that age bracket it's not uncommon to not care about aliasing, even if on the more glaring side. It's like my brain doesn't register it as a problem.
SMAA sucks for vegetation, so as a method of AA it's pretty much useless in games with lots of vegetation. DLAA is clearer than TAA so try that. DLAA/TAA being temporal methods allows games to run certain effects at lower samples to optimise performance. Volumetric lighting can be run at 1/4 res and TAA/DLAA can be used to make this a complete image with little artifacting. Being able to save 1.1ms on the render of each frame can mean the difference between 55 and 60fps. Similar methods are used in screen space reflections.
People back in the day kept saying there is no point of HD resolution and Blu Ray. Now nobody wants to touch a 480p video with a stick, or even 720p. We'll get to 8K, but perhaps we have to increase the refresh rate first, 4K 120 > 8K 60 imo.
Ah, so it's one of those cycles. I didn't know that, but it doesn't surprise me. Eventually people will probably say 1080p is unusable and 1440p turns into the new 1080p (as in the bare minimum, like it is now), and then 2k and so on and so on.
There are always going to be people who are fine with 720p and 30 FPS. Look at the new Zelda game that came out last week. To me the jaggies and smudged shadows are very distracting though.
@@winterhell2002 True, but that's on a smaller screen as well so it's not as noticeable. If it was 1080p on a 21 inch or 45 inch screen it would be way more noticeable. The Switch upscales though when docked and connected to an external screen.
For anyone who would rather buy a more sensible card than a 4090 though, 1440p is still the way to go. I'm not sure 8K ever will be. It just seems unnecessary.
1440p is simply the point where, for most monitor sizes + distances from your eyes, it's completely useless to go any higher, because your eye isn't able to tell the difference unless you lean way too close to the screen, same as 1080p is more than enough on pretty much every single mobile phone. Right now I'm typing on my phone and even if I look from up close the letters all look literally perfect, and I am confident that no normal person could tell the difference when shown a 1440p smartphone side by side with it, god forbid the 4K ones that apparently exist...
True. 4k is still a gimmick for most people and 8k just sounds like a meme. Why waste hardware and very expensive electrical power for a complete placebo?
Remember when people said you need the best to game at 4k? Maybe a GTX 980 really was not the minimum 4K spec; likewise for 8K the minimum spec is not an RTX 4090.
Buying a 27 inch 4K monitor was the best gaming decision I ever made, despite the internet saying 4K "isn't worth it at that size". HA! That showed them!
Definitely not too much resolution. It's just a matter of preference whether you focus on spatial or temporal resolution, as you currently can't reasonably have both. For productivity I would prefer even more. 27" 5K or 32" 6K. Sadly there are not a lot of options, and they are all non gaming displays and very expensive.
Finally someone said this... I went as far as to buy a Pro Display XDR specifically for the 6k res at 32" (tho the rest of the features are also very welcome - like bye, bye IPS glow) and the difference is SO CLEAR compared to 4k - I have a freaking astigmatism btw. People pretending anything above 4k is useless was always a mystery to me - it has to be just repeating tech reviewers who in turn have an agenda - no sane person would dismiss 8k the way the majority of people dismiss it if they actually saw the difference. You might not care about it, accept what you have, but to state e.g. that 8k won't come and 4k is the "last resolution" and we'll go with fidelity only from now on is just stupid.
You're doing an apples to oranges comparison here. The common complaint, that a 4k display (at its intended viewing distance) is indistinguishable from an 8k display, is not addressed here. Rendering games at higher resolutions, such as with some anti-aliasing techniques, will produce a better image no doubt, but buying a higher resolution display won't help. Now, there are exceptions, such as split-screen gaming, or displays which have abnormally close viewing distances (like those found in VR headsets), where 8k is desirable, but those are still exceptions. I have to stress that this all hinges upon you using your display at a reasonable viewing distance - too close and you'll get eye fatigue, too far and you won't even make out the difference between 1080p and 4k. Perhaps the biggest takeaway I find in your video is the fact that you didn't even compare a real 8K display to a 4K one.
The point you say I missed actually plays a prominent part in this video you've just watched. Comparing a 16" 4K to a 32" 4K screen is pretty damn close to comparing the clarity of 4K to 8K at the same viewing distance.
@@2kliksphilip Yes, but you're viewing them from the same distance. In actual use, you should keep the 32" display farther from your head than the 16" one.
@@2kliksphilip I'm not sure even this gets to the core point though - the suggestion is that 4k content on the 16" 4k display will probably look identical to 8k content downsampled to 4k on the 32" 4k display at the same viewing distance. Downsampling from higher resolutions is literally the most primitive kind of anti-aliasing, but it's the most accurate and produces the least artifacts. If it's the only benefit of rendering at 8k then 8k monitors probably are just a fad, and we're really saying we just want better AA at 4k...
Both of you have missed the point. If a 16" 4k screen is noticeably sharper than a 32" 4k screen then a 32" 8k screen would have increased sharpness over a 4k one. I am already viewing a 32" monitor from this distance so my viewing distance won't change.
...no. I am viewing my 32" at the distance I would use my 32" at, and noticing increased clarity from the same resolution on a smaller screen at equal distance. Saying I'd double the distance I'd view my 4k screen at simply by making it 8k is being disingenuous and is what your misunderstanding hinges on.
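The numbers back this up; a small check of the pixel densities involved (16:9 panels assumed):

```python
import math

def ppi(w_px, h_px, diagonal_in):
    return math.hypot(w_px, h_px) / diagonal_in

print(round(ppi(3840, 2160, 16)))  # 16" 4K: ~275 PPI
print(round(ppi(3840, 2160, 32)))  # 32" 4K: ~138 PPI
print(round(ppi(7680, 4320, 32)))  # 32" 8K: ~275 PPI, the same density as the 16" 4K
```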
Instead of higher fidelity, I wish more games would focus on physics. Destructible buildings and stuff, and just being able to interact with everything. Hell Let Loose is perfect in many ways, except performance, and absolutely everything is static in that game… Imagine a Hell Let Loose with fully destructible buildings.
I was looking at The Office on a 27 inch 1440p screen 1.5 meters away and was blown away that I could read the text on the book shelves. Idk why but it added a lot... FYI: normally I watch from 2.5 meters away on a 65 inch 4K OLED. I tried moving from 3.5m up to 0.5m, but I just can't read it. Maybe it was created in 1080p and I am gaslighting myself, but I think 8K might actually make a difference for me (if movies were 8k :p). I have had a 4k OLED TV for 8 years now, so maybe I just got very used to it.
The first 4K displays were often made of two panels bunched together, but now we are seeing more 4K displays with high refresh rates. 8K displays will become a thing, and who knows, maybe DLSS and many other technological improvements will make 8K a preferred choice in the coming years.
I plan to get a 4K monitor this Black Friday, and my PC only has a 6700K and a GTX 1070! I can wait a few years to play the new demanding titles like Alan Wake 2, wait until components are cheaper. Monitors are a surprisingly good price right now! I've been catching your videos on 4K since that GTA V one all those years ago. I just want to say: people who solely recommend 1440p don't seem to remember that PCs can be used for more than just gaming. I write a lot, and I am really excited for text clarity when I get that new monitor and upgrade from my 27" 1080p one. Text is going to look so good. Same thing with YouTube videos and movies, so much better looking and they're not even hard to run compared to games. I'll use my GTX 1070, play some older games for a year, and then get something like an AMD RX 8700 XT on Black Friday 2025. No issue. People need to become less polarized on this. Love the video as always Philip!
@@LucasCunhaRocha I recently played Control and absolutely adored it! I remember playing Quantum Break back when it was new and really enjoying it, too. Remedy seems like a pretty great developer with a lot of bangers, and I want to try all of their games and see how they are. I hear Alan Wake 1 is the best 6/10 game out there, and I'm excited to see. I don't intend to buy AW2 on Epic, though. Dislike using anything other than Steam. Just gonna wait until it comes to Steam years from now
Also a good thing about 4K is that it can integer scale 1080p, so there's no downside in clarity compared to a native 1080p monitor. Though 1080p will still look worse to you, as you'll get used to 4K Windows and older games running at 4K, and then a game dropping down to 1080p won't be so pretty, but it still works. 8K would be even better for integer scaling as it could do 4K, 1440p and 1080p, the current three common resolutions. But high refresh 8K monitors for a decent price are far off, we're still stuck on 8K 60hz non gaming monitors lol.
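Spelled out, the scale factors look like this (1440p only divides evenly into 8K, not 4K):

```python
# 1080p and 1440p both divide evenly into an 8K panel, while a 4K panel can
# only integer-scale 1080p (1440p lands on 1.5x and needs interpolation).
for src_w, name in [(3840, "4K"), (2560, "1440p"), (1920, "1080p")]:
    print(f"{name}: x{3840 / src_w:g} on a 4K panel, x{7680 / src_w:g} on an 8K panel")
```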
As upscaling tech gets better and better I can see it being worthwhile. Right now the cost for both the GPU and for the monitor/tv is a bit steep. 4K finally got vaguely affordable and I'm very happy with it at the moment.
Jumping to 4k after being on 1440p for so long actually helped me realise how average graphics can really look. The added clarity revealed the flaws in a lot of the game elements and areas that were hidden due to the low resolution but also the poorer contrast of my older screen. It was immediately noticeable in text, and a lot of game elements just looked so much smoother and cleaner. I think 8k would have a similar benefit providing the texture resolution and overall game settings don't have to be lowered too much: medium 8k being better than say high 4k and much better than ultra 1440p for example, especially with upscaling rendering the game from 2k or 4k resolution anyway. I'm also looking forward to texture upscaling, so games can have like 1080p textures auto upscaled to 4k. This would save on space as well as improve loading times, and massively improve visuals. Imagine a game with only 4k quality textures, or low end systems having like 480/640p textures upscaled to like 1080p or 1440p as well.
8 years ago (2016), LG released a 27" 5k monitor called UltraFine, primarily for Apple users. Mind you: 5k has 80% more pixels than 4k. Shortly after, Apple discounted all USB-C accessories by 30% because of some drama. The discount also applied to the LG UltraFine, therefore I bought one as an impulse purchase. Of all possible imaginable games, I played WoW on it, and Pandaria never looked this good again. To this day, I am waiting for a 27" 5k / 32" 6k OLED 120Hz+ to repeat this experience. I sold it shortly after to recoup the money and also because the monitor (not the display/panel itself!) was trash, constantly turning off randomly or switching to 30Hz.
Philip digs deep into things I have no access to or interest in. I like Philip for that (and more). Keeps me more open-minded and helps me embrace the future rather than shut down with only the things I'm familiar and content with, while automatically rejecting everything new. Thanks for the reality check, Philip. I'm still rocking my GTX 1080 on a 1080p screen anyway because I don't care enough to invest into an upgrade, but thanks nonetheless.
5k at 32 inches is my dream. Windows was originally designed around 96 PPI and 5k at 31.5 inches with 200% scaling gives you an effective PPI for scaling purposes of 93.24. 5k is also nearly double the pixel count of 4k, which would be a massive improvement for older games while also allowing for perfect pixel scaling at 1440p for games that are harder to run. 200% scaling makes everything on the desktop side of things so much better to work with as well.
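The commenter's numbers check out; a quick verification assuming a 5120x2880 panel at 31.5 inches:

```python
import math

w, h, diag = 5120, 2880, 31.5
native_ppi = math.hypot(w, h) / diag
print(round(native_ppi, 2))             # ~186.48 PPI physical
print(round(native_ppi / 2, 2))         # ~93.24 effective PPI at 200% scaling, near Windows' classic 96
print(round(w * h / (3840 * 2160), 2))  # ~1.78x the pixels of 4K
```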
I think more importantly than explaining why 8k might be worth it, you said that you look forward to the future, without it letting you detract from what you enjoy in the present. I think people are motivated to upgrade not just because "it's new", but also because it means that what they currently have "is old".
I hate this intro. Not because it's not well done, it's great and gets the point across. But because it is necessary. I noticed with my own online discussions lately that you can't have nice things anymore. No one tries to hear each other out, have civil discussions and reach mutually satisfactory conclusions. Everything is simply on the spectrum between 'um acktschually' and 'you're wrong'. I come here to hear an opinion other than my own, that I sometimes agree and sometimes disagree with, but mainly would have never thought about. And it pains me to see that the discussion has to start with what is basically a disclaimer that is more than a minute long just so people are not butthurt.
I personally still find the chase for higher fidelity to be frustrating, because playing games like Cyberpunk which is touted as one of the best looking games ever was an absolute pain. Stuff looked blurry, there was loads of ghosting, and the animations powering stuff still had a lot of jank. I had to tinker with settings in and out of game for hours, trying to prevent the game from looking too pixelated and crunchy, but also looking like the world was full of ghosts. All of this hassle from my brand new rig which should handle things no problem.
The first thing I did when I bought my 55" 4k TV was crank my games to 4k (I don't remember what those were sadly, but GTA V was def on the list) and I was extremely disappointed when I saw absolutely zero difference between 2k and 4k from 2 meters away. So yeah, I watch movies and play games at 2k to this day and I see no reason to move on. Tbh I don't really care about flat screen stuff. If the text is crisp and it's bright enough - I can work and game on that. Now VR and AR - that's where extra pixels really matter. Can't wait for higher resolution glasses to become affordable!
We've been so lost in maxed out graphics that I didn't even consider high resolution gaming on graphics fidelity lower than high settings, but it makes good sense. A lot of VR games are purposefully non-realistic, but they're immersive all the same; you could very easily overrule graphics by just pumping up the resolution. However, many graphics settings don't only drop in quality but become more pixelated as they drop, especially stuff like shadow edges and textures. It seems mostly contradictory to have so many pixels on the screen, smoothing everything out, just to see very smoothly rendered jagged, pixelated shadows and textures. If games treated their graphics settings proportionately instead of making everything look pixelated at lower settings, it would be more visually pleasing to run lower fidelity at higher resolutions. Due to this, I'd suggest 1440p just replace 1080p for the time being; a lot of people are misled by e-sports standards and are forced to drown under convoluted AA and DLSS settings to get a modern look to their games when 1440p can very easily just have the lightest touch of either to completely get rid of any pixelated look.
The optimal distance chart at 2:45 is much more relevant for media consumption than for gaming. As an ex-eyefinity user, it's okay to not have the whole screen in the focused view and have it fill up my peripheral vision.
I think that the two primary purposes of 8K gaming would be: 1. Extra-large screens. In Apple marketing terms, not even 4K would count as "Retina" with an LCD/OLED screen in a size you'd normally buy a projector for, like offensively large, maybe 160-280 inches. In those cases, 8K would be actually noticeably sharp. 2. Screens that you'd look at from extremely close. How do you do that? VR! Or any kind of headset like that. The sharpness difference from having 4 times the pixels would actually be noticeable here, too.
What I'm gonna look at in 10 or however many years time is what games I play right now will even recognize that 8K exists (in the game's settings menu, not a config file). I know devs are a lot more aware of higher resolution gaming but it's gonna be interesting, especially with the increased prevalence of indie games.
Another reason why upscaling is so good for 8k is ofc that the resolution of 8k is so high the upscaling quality is going to be very good regardless, and the performance gains will be amazing on top. DLSS and FSR Performance mode use 1/2 or 0.5x res for each axis (0.25x total pixels), so at 4k it upscales from 1920x1080; at 8k it's 3840x2160 because it has 4x the pixels of 4k - aka Performance at 8k upscales from 4k. Ultra Performance is 1/3rd res per axis, at 8k that's 2560x1440. 4k Quality mode's internal resolution is 1440p - the exact same as 8k Ultra Performance. RT scales per pixel, so saving as much performance as possible by using upscaling is only going to get more and more important, and with that increase in quality from the resolution increase it's going to be great. It definitely won't start going mainstream this decade but it will hit eventually. Maybe frame gen advances leaps and bounds too and latencies are decoupled so inputs aren't as affected, then a high framerate 8k experience might be more possible sooner than people think. Just need to wait for 8k HRR micro LED monitors to not cost the entire GDP of a small nation and you're in the end game.
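For reference, these are the internal render resolutions those per-axis factors imply, using the usual DLSS/FSR ratios (exact factors can vary by game and version):

```python
# Per-axis divisors: Quality 1.5, Balanced ~1.72, Performance 2, Ultra Perf 3.
modes = {"Quality": 1.5, "Balanced": 1.72, "Performance": 2.0, "Ultra Perf": 3.0}
for out_w, out_h, name in [(3840, 2160, "4K"), (7680, 4320, "8K")]:
    for mode, div in modes.items():
        print(f"{name} {mode}: renders at {round(out_w / div)}x{round(out_h / div)}")
# Note how 8K Ultra Perf (2560x1440) matches 4K Quality, and 8K Performance
# renders at a full 3840x2160.
```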
Back in the 2000's when 1080p monitors started to become affordable, I read some comments from people claiming that anti aliasing would no longer be needed for such a high resolution and it would just waste performance. Years later people claimed the same thing for 4k displays. But even now AA being disabled on games rendered at 4k is still noticeable, just to a lesser extent compared to 1080p or 1440p. But then there is the issue of TAA which can destroy image clarity in motion even at 4k. 8k can finally be the sweet spot to truly no longer utilize AA. Or older methods of AA like FXAA might actually look pretty good, even in motion, at 8k.
The thing that drives me batshit crazy is the limited draw distance on objects, vegetation and LODs even when it's cranked all the way up. So down the road when your computer can crush it you're still limited to seeing pop-in.
I could not agree more. A 5800X3D and a 7900 XTX should be enough to see the horizon, but all I see is a line of LODs.
hmm yes, maybe it does have purpose, thank you for pointing this out
i absolutely hate LOD in games (at least their current implementation). There should be a way to calibrate it so the game checks your resolution and FOV, and using that it goes through the assets you can see and calculates how many pixels it takes up on your screen, and calculates LOD from there. Because currently, you can still see sizable objects that are just blurs or grey blocks until you get extremely close to it which ruins the viewing experience for me.
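For what it's worth, some engines already offer screen-size-based LOD rather than raw distance cutoffs; here's a hypothetical sketch of the idea the comment describes, with made-up function names and thresholds:

```python
import math

def projected_size_px(radius_m, distance_m, screen_height_px, vfov_deg):
    """Approximate on-screen height, in pixels, of a sphere of given radius."""
    focal_px = screen_height_px / (2 * math.tan(math.radians(vfov_deg) / 2))
    return 2 * radius_m / distance_m * focal_px

def pick_lod(size_px, thresholds=(256, 64, 16)):
    """LOD 0 = full detail; objects covering fewer pixels get cheaper models."""
    for lod, limit in enumerate(thresholds):
        if size_px >= limit:
            return lod
    return len(thresholds)  # smallest on-screen size: imposter/billboard

# The same tree at the same distance earns a better LOD at 8K than at 1080p,
# simply because it covers more pixels.
for height in (1080, 2160, 4320):
    px = projected_size_px(radius_m=3, distance_m=400, screen_height_px=height, vfov_deg=60)
    print(f"{height}p vertical: {px:.0f} px -> LOD {pick_lod(px)}")
```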
This is already a problem with old 3d games right now. It likely won't stop being a problem either.
There are game devs right now who STILL use framerate to calculate physics and whatnot. The industry hates future-proofing. It's why live-service games can get away with becoming literally unplayable once the central servers get shut down.
Exactly. It's like yeah, I get it, when San Andreas was being made at the time, Rockstar couldn't even fathom the idea of a 4090 graphics card. But with just how powerful our modern hardware is, I should easily be able to have the entire State loaded in at once with all the high LOD models loaded in and have ZERO performance hit
8K does have a purpose. Driving down the price for 4K.
It makes a lot of sense for recording footage.
Being able to zoom in post production is crazy useful. so from a gaming perspective a content creator may get a lot out of it.
@@RusticRonnie 4k already does the trick, usually. You can zoom in on a quarter of the screen and still have a 1080p image.
@@RusticRonnie, not really, 8k is slower to work with on anything but the very, very high end stuff
When I got my 1080ti I started running all my games at 4K (DSR) on a 1080p display because in older games, render distance and LOD were tied to resolution. Perfect supersampling AA and dramatically less geometry pop in.
This is the way!
I've been playing at 4k for over a decade; I switched to 8k on a 55" Samsung TV 3 years ago and I can't go back.
supersampling is criminally underrated, the crispness it can bring to even a simple 1080p screen is insane
@@drunkchey9739 yup, if my 1080p display had better color I would use that over my 1440p display. Especially because 4k DSR at 1440p doesn't look right and 5K is out of my performance tolerance.
DSR is just amazing on 1080p
Looks good watching on my phone
It looks good on my smart watch.
Even though they are generally just a bit higher than 1080p or sometimes 1440p, Phones process the Pixel density of the Gods.
hmm yes, maybe it does have purpose, thank you for pointing this out
Hmm, yes...
hmm, yes...
hmm, monke...
@@Nyllsor maybe it does have purpose
combo breaker
Just give me some new form of anti aliasing that doesn't turn the image into a ghost hunt.
at 8k you don't need anti aliasing
@@322-Dota2 What about crap like Cyberpunk 2077, which has built-in TAA, and if you disable it through the config the game just breaks because it was built around it?
@@322-Dota2 While you might not feel a need for AA at 8K, there will still be artifacts. To truly not need AA anymore, you need insanely high resolutions. If you don't believe me, go to the Blur Busters UFO test page, select the aliasing visibility test and measure how far from the screen you need to be to not see any artifacts any more. Then calculate the needed resolution for a normal viewing distance from that.
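For anyone who wants to try that, the conversion is just a distance ratio (assuming angular pixel density scales roughly linearly with distance; the numbers below are placeholders, not measurements):

```python
# If aliasing on your current screen only disappears once you step back far
# enough, the distance ratio tells you how much more resolution you'd need at
# your normal seating position for the same artifact-free look.
def required_horizontal_pixels(current_px, artifact_free_distance_cm, normal_distance_cm):
    return current_px * artifact_free_distance_cm / normal_distance_cm

# e.g. artifacts on a 3840-wide screen only vanish at 4x the normal distance:
print(required_horizontal_pixels(3840, artifact_free_distance_cm=320, normal_distance_cm=80))
# -> 15360.0, i.e. roughly a 16K-wide panel viewed from the normal distance
```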
CMAA is hands-down the best cheap AA, unfortunately very few games use it: Yakuza 3-5 remasters, FFXVI, CS2 and that's about it
@@B.E.3.R I've never even heard of it until now, and I've even booted up CS2 more than once (to test whether my potato can run it comfortably, or at all in the case of the Linux install).
4:27 The temporal issues that come with it. TAA is a plague to modern games that implement it badly: oversharpening to compensate, ghosting, blur, dependency for certain effects, etc.
Image ghosting in Satisfactory was quite distracting the last time I played it, with the effect being super smeary, which reminded me of Doom 2016, specifically the scene in Samuel Hayden's Office when he waves his hands around, trails are left behind on his carpet. Out of FSR, DLSS and TSR and plain old TAA, I remember DLSS being the least image disrupting method, though it's still noticeable on the conveyor belts. It made me consider turning off anti-aliasing entirely, though that'll cause issues too since stuff like screen-space contact shadows relies on TAA to smooth it out! So it's become an issue that TAA is relied upon to make some effects work, which is an unfortunate circumstance to be in. I kinda just try to ignore it, since nothing is perfect, so it's a matter of compromises...
Edit, since somehow I couldn't grammar check correctly, and also an elaboration on the scene in Doom 2016.
I'd really love to know Phillip's thoughts on TAA and all the ghosting it produces. I recently bought a 2k screen with a 4070ti super to accompany it. After trying some modern games I felt something was wrong, but couldn't quite put my finger on it. The biggest offender for me was The Finals, I get that it uses RT to update light in destroyed buildings and it would shimmer a lot without some temporal solution, so it forces TAA or TAA-based upscalers as it's only options, leaving out traditional AA or even no AA at all. But I really hate what it does to image clarity especially on longer distances, players and other moving objects become mushy blobs with long trails after them. No matter the settings, I always see those ugly afterimages and it makes my blood boil since that obviously flawed technology is pushed on me with no alternatives present.
I think you're right, but I do very much see the importance of having the option.
TXAA makes some of the best hair rendering I've ever seen, most likely only replicable by supersampling.
It's the copper of antialiasing solutions, you know? Like how, sure, platinum, gold and silver ARE more conductive, but they're nowhere near as abundant as copper. Hence why they use it in everything.
And when combined with the sharpening filters of FSR and the like, it can make for a solid stopgap.
I actually prefer cranking the sharpness in everything I use on my Steam Deck, since blurriness bothers me a lot more than sheer pixelation.
But that's the key difference, there I have a choice of how my games are scaled, console players don't.
And whereas game devs who wanted to make a game run well on console used to have to jump through all kinds of hoops and truly work for their frame rates, show some technical wizardry, now they're gonna be tempted (and sometimes even forced by management) to just slap some FSR on that b and call it a day.
Yep, it would be a better comparison if the "Native" image were without TAA. I prefer SMAA even if it still has some pixelated parts.
@@qualitycontent1021 I agree, The Finals has some of the worst TAA imo
Personally, while I see the value in 8K, I'm definitely more pleased with the refresh rate arms race currently going on in the monitor space. The performance cost of 8K relative to its benefit is definitely very high, and while it is not without value, I personally think that chasing higher refresh rates (240Hz/360Hz/480Hz) is the more beneficial path. We are still very far away from hitting truly diminishing returns when it comes to responsiveness (okay, maybe this one not so much), motion smoothness, and *especially* motion clarity.
Low MPRT is absolutely critical to a truly sharp gaming experience IMO and goes woefully underdiscussed compared to cranking the resolution. What is the point of rendering such an extreme amount of video information when the vast majority of it ends up smeared and indecipherable, simply because our eyes don't mesh well with how sample-and-hold displays present the image? It takes a performance outlay of already potentially *questionable* value and makes it even more situationally beneficial, further diminishing the appeal of running such high resolutions.
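For a rough sense of why MPRT matters so much, here is a minimal sketch (my own back-of-the-envelope assumptions, not anything from the video) of how pixel persistence turns into smear when your eyes track motion on a sample-and-hold panel:

```python
# Back-of-the-envelope sketch of sample-and-hold motion blur:
# perceived smear (pixels) ~= eye-tracking speed (px/s) * pixel persistence (s).
# On a plain sample-and-hold panel, persistence is roughly one full refresh period.

def smear_px(speed_px_per_s: float, refresh_hz: float, persistence_fraction: float = 1.0) -> float:
    """Approximate smear width in pixels for an eye-tracked object."""
    return speed_px_per_s * persistence_fraction / refresh_hz

speed = 3840  # assume a panning object crossing a 4K screen in one second
for hz in (60, 120, 240, 480, 1000):
    print(f"{hz:>4} Hz: ~{smear_px(speed, hz):.1f} px of smear")
```

By this estimate even 480Hz sample-and-hold still leaves several pixels of smear on a fast pan, which is why strobing/BFI or much higher refresh rates matter so much for motion clarity.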
I sacrificed colour accuracy and brightness for a different IPS monitor of the same resolution, but upgraded from 60Hz to 165Hz. Oh my god it was so worth it.
TBH frame gen is the way to go for motion clarity: at 300+ fps you already have enough responsiveness, and since frame gen has less latency and fewer artifacts at higher fps, the downsides would be very small. The only limitation is that the optical flow accelerators on current cards weren't made for 300-1000fps. Frame gen can be seen as an enhanced BFI, where instead of flashing a black frame to increase motion clarity, you flash an interpolated frame.
Why not both? I have a 4K 240Hz display. Motion clarity equivalent to 1000fps seems to be the end game, although BFI could give that at much lower refresh rates. Still, I don't play games with extremely high motion. Anything in three-figure fps is sufficient for my needs, yet my resolution itch isn't satisfied and I'd like more options beyond 4K. Options exist going both ways for whatever is preferred, and given long enough I'm sure they'll converge.
@@snowwsquire Interpolation is definitely the way forward but, yeah, you do need a notably high starting FPS to overcome the shortcomings of modern interpolation. The optical flow processor stuff is mostly marketing however, applications like Lossless Scaling can already do 4x interpolation even on GPUs without specialized hardware and it isn't reliant on any actual game data, it just captures the game window and interpolates it like a video and it does it very well, with about as little latency as is possible for that style of interpolation and pretty decent quality as well. Though obviously, first party solutions from companies like Nvidia where the game feeds behind the scene data does yield higher quality output. It's also quite a bit better than black frame insertion on account of not only having no brightness hit (making HDR feasible) it also provides increases in motion smoothness that BFI fails to achieve.
@@TeeBeeDee you say it’s mostly marketing and then detail how it’s higher quality, and the bigger part is that it’s not run on the shaders.
"You don't have to be polarized about literally everything ever"
Woah there buddy, this is the internet we're talking about
no thank you i can only see 40 pixels across
-burger40
you forgot the 6
Dude lying about his identity thinking we won't know about his 6 smh
@@roborogue_ interestingly in his icon it faintly says "burger40", no 6.
@@korakys looks like it says "burger10"
@@korakys also his username is in fact burger40, it's just the unique tag that shows in comments that's burger406
Pixel quality > pixel count; furthermore, art direction > fidelity
Chasing such high technical standards has game developers and artists overworked for diminishing visual returns, install sizes that occupy a quarter of a console's hard drive, and products that often need to be smothered in layers of AI and TAA vaseline in order to run at a stable frame rate at all. Meanwhile, Nintendo still makes billions shipping games at 720-1080p while earning praise for their visuals.
Personally I don't find 4K that much better than 1440p, but I can still understand that others enjoy higher resolutions. It's just not that interesting to me. Of course higher resolution is better, but framerates and smooth visuals are also important.
Still, in the future 4K and even 8K will become that "smooth framerates and clarity" sweet spot. I also have a 1440p screen and am very happy with it, yet I can't wait for 4K to become as easy to run as 1080p is nowadays. I love replaying favourite games whenever I get a higher resolution screen, just so I can experience detail I wouldn't have otherwise.
Same here. I first went from full HD to 4K, which was a huge difference. Much later I got a 1440p high refresh rate screen and the difference in desktop/text sharpness is unnoticeable. Maybe for something like Escape from Tarkov it would be nice for spotting stuff in the distance, but 4K at high refresh rates is prohibitively expensive both in processing power and screen price. No way I'm paying NVIDIA the "we price for the elite" cost that they have imposed on newer RTX cards. High-end AMD is already expensive enough.
It starts to look better on TVs IMO (55 inch and higher). On computer monitors it's just meh.
I definitely noticed 1440p to 4k.
I don’t notice 1800p to 4k.
It's like how 144Hz vs 240Hz is noticeable.
240 to 360 is harder.
It's just past the barrier.
Though 500Hz is approximately the limit an untrained eye will notice.
"You don't have to be polarized about literally everything ever"
This quote applies to so much rn, thank you
While it is possible that 8K will eventually be a realistic option on regular monitors, one is personally more interested in the possibilities of 8K VR. With double 8K strapped to your eyeballs, it would make for an unprecedentedly clear viewing experience.
Downscaling is amazing. It makes the image so clear I can't go back to 2K after witnessing its beauty.
It is pretty wild to me that it isn't supported directly more often. It isn't a very well-known idea because you need to set it up in such roundabout ways.
I used to think that 4k was pointless compared to 1440p, but now I realize my real beef is with TV manufacturers, game consoles, and to a lesser extent monitor manufacturers who force us into exponential upgrades without offering adequate options in the middle. I always thought that consoles and TV manufacturers targeting 4k without any love for 1440p was stupid. I think it's stupid that there's not a lot of 1800p monitor options. I think that it's incredibly stupid that TV manufacturers are going right to 8k, and it's especially stupid how the consoles will fall right in line, and likely won't support monitors that will inevitably be created in-between.
It's a shame 6K hasn't come to TVs. IMAX films are shown in 6K at the high-end theaters; it could easily be transferred to TVs. It's the reason why 4K exists in the first place.
The biggest use case for 8K is headsets, VR or otherwise.
That said I still think gaming at 8K (outside of a headset) is dumb and likely will be for a very long time. The diminishing returns are pretty extreme and there are lots of other metrics worth boosting before that.
I also don't think downscaling from 8K counts as gaming at 8K. Ultimately what you see is still at 4K and only requires a 4K screen.
And 5K and 6K aren't 8K. The returns diminish exponentially and it's rare to see people arguing 5K is too much for example.
Thanks for the nuanced take. Most of the time people claim that there is no advantage to 8K, while, as you said, there is one; it's just not even remotely worth it for now. If you are talking native 8K, I fully agree that it will be a bad idea for a long time. The thing is, though, upscaling is getting better and better, so gaming on an 8K display could become reasonable quite a bit faster. Currently even just the upscaling cost to 8K isn't worth it, but two GPU gens in the future that will probably not be a noticeable performance impact anymore.
For completely matching peak human vision, you'd need 2400ppd. I tried to measure that for myself and ended up with >1700ppd instead, which is >86K on a 32" screen at 80cm distance. In realistic use cases it will be indistinguishable from a 16K screen, and even that is very far into diminishing returns.
Personally, I would prefer 6K, as the benefit of 8K+ is too little and I would rather have the resources spent on refresh rate and color depth. But I expect the industry to push for 8K instead.
I'm pretty sure average ppd for even the best eyes is lower than 500ppd. Maybe there is a study that says a focal point can see that high; in that case just focus 1440p into that area. But the rest of the screen would realistically never need to cross 16K, or 15360p, regardless of size.
Apple "retina" is 120ppd. And double that is probably goodenough@@davidruppelt
It's true. Downscaling isn't the same as native at all. It's the same as watching a video in 720p that was made at 4K: it's a huge upgrade in image quality versus a video made at 720p, but at the end of the day the clarity isn't anything more than 720p. Theoretically, even if the bitrate were the same, you still couldn't zoom in (or bring your head closer) to actually see more detail.
It's the same story with upscalers, at least the current ones. You can test by turning off forced TAA/upscaling; you'll see that you don't actually lose any detail. With upscaling, a blade of grass can be ultra sharp. Without it you'll see the 16 raw pixels. But if you turn up the real resolution of that to 64 pixels, you would actually be able to see the veins and such on the piece of grass. Yet you would still see squares, so you have people who still call it inferior to the ultra-sharp perfect circle.
@@jwhi419 I have the 2400ppd figure from the "NVIDIA Automotive Screen Density Calculator", where they claim that to be the "approx. limit of high contrast feature detection". I don't think there is a practical benefit of a screen that capable over a 300ppd screen with AA, but if you truly want a resolution that under no circumstances needs AA, then 2400ppd it is.
300ppd would be "20/4 vision; approx. limit of alignment detection (hyperacuity)" and is equivalent to a 32" 16K screen at 80cm distance. That should, in my opinion, be the end goal. Apple's 120ppd is "20/10 vision; practical upper limit of visual acuity". That is a more realistic goal for now and would probably be good enough for me, but would need AA to avoid the edge cases where it isn't. I did a test with Blur Busters' "Worst-Case Aliasing Visibility Test" and there I could still see the artifacts from a distance equivalent to 1700ppd. There may very well be some error in my test design, but at least I believe the conclusion to be valid: 120ppd is not enough for worst-case scenarios.
@@davidruppelt I see. However, I do think that if the tech to create a 32 inch 16K display existed, then the tech to do the same in VR with a headset weighing 200 grams would probably exist too.
Well, the chief of Blur Busters has talked about an end-goal display technology on his forum before: the Holodeck from Star Trek. I don't recall the details, but I think he ignores pixels at such a high resolution. Of course, his deal is more about motion clarity. Just like you noted that you would need more than 1000ppd for the line test, you would need more than 10,000Hz to move each of those pixels visibly to a human.
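For anyone who wants to sanity-check the ppd figures in this thread, here is a small sketch (assuming a 16:9 panel and the simple approximation ppd = horizontal pixels / horizontal field of view in degrees):

```python
# Rough pixels-per-degree (ppd) check for the figures discussed above.
import math

def ppd(h_pixels: int, diagonal_in: float, distance_cm: float, aspect=(16, 9)) -> float:
    aw, ah = aspect
    width_cm = diagonal_in * 2.54 * aw / math.hypot(aw, ah)   # physical width of the panel
    fov_deg = math.degrees(2 * math.atan(width_cm / (2 * distance_cm)))
    return h_pixels / fov_deg

for name, h in [("4K", 3840), ("8K", 7680), ("16K", 15360)]:
    print(f'{name} at 32" / 80 cm: ~{ppd(h, 32, 80):.0f} ppd')
```

With those assumptions, 4K at 32"/80cm comes out around 80ppd, 8K around 160ppd and 16K around 320ppd, which lines up with the ~300ppd figure quoted above.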
In the year 2027 I shall look forward to viewing your nostril hair in 8K at max settings
And what about VR too? We are already almost approaching 8K with the Apple Vision Pro. Regardless of the success of the device, the screen resolution is something that is praised when compared to other VR headsets.
1440p on a phone is a gimmick. I've had a phone with 1440p for a while, but I lowered the resolution to 1080p and noticed no difference. At 400ppi already, what's the point of even higher density? It just drains your battery faster and the only way to notice a difference is with a microscope.
Yeah, that's why I still game with a 1060 @ capped 30fps. No need for the extra frames - I don't notice any difference. 30 frames is enough. The only way to tell the difference is by scouring a video frame-by-frame.
@@Dancyspartan this is just bait now
I've always appreciated your videos on tech and specs (including resolutions) because it's always felt like you were hopping on the next big thing before anyone else even acknowledged its future widespread use. I remember a world a bit more than 10 years ago when the majority view on the internet was that 4K was useless because we had hit diminishing returns at 1440p and wouldn't benefit from more pixels on monitors that sit so close to our eyes. Now the same is being said about 8K while quality 4K monitors can be bought on Amazon for £250 and have thousands of glowing reviews. It's a uniquely human thing to feel so pedantically correct about "tech advances aren't worth it anymore! we need other transcendental innovations!"... and to be so harshly proven wrong when the inevitable happens. Great video!
I mean, I still think so. 4K is still a gimmick: almost no content is consumed at that resolution, and what is gains no real benefit from it.
Objectively, in most cases you can't tell the difference on a TV at those distances, and a computer monitor gains so little benefit that a 4x performance penalty cannot be justified in any way.
One of the issues I still see in "gaming" monitor reviews is that they point out that the increased pixel count is useless while gaming; if so, maybe use it while doing other stuff. Personally I prefer 4K gaming even at 27 inches, and I look forward to the time when I get my first 8K screen. I hope it will have better colours than my 4K smartphone.
I think the main polarizing point of upscaling is that, instead of it being a free bonus performance or clarity boost, developers have started using it as a replacement for optimization to ship games out faster and cheaper. At that point you still need high-end hardware to run it well, which kind of defeats half of the point of upscaling.
Work expands to fill available resources. Just like how applications use crazy amounts of memory nowadays.
hmm yes, maybe it does have purpose, thank you for pointing this out
Sometimes it's a lack of optimization, sometimes it's just a harder problem to solve
"developers have started using it as a replacement for optimization"
I've seen this statement in some form or another for probably 3 years now and I still have not seen any proof or evidence to back it up. People have even come up with elaborate conspiracy theories where developers have conspired with Nvidia to make their games run worse to make DLSS seem more valuable.
So lucio-ohs8828, do you have any proof whatsoever or are you making shit up and talking out your ass like everyone else?
@@kiri101 Please don't excuse the generic slop that is gaming nowadays; it's clear that devs don't have any talent anymore. It's all just talentless UE or Unity devs making bad games and using AI so that they at least run lmao
Star Wars Battlefront 2 came out 6 years ago and ran fine on a 1070 or 1080 and looks much better than just about every game nowadays. Perhaps hiring actually talented people that care about their product or having a custom engine isn't such a bad idea?
Memory is very cheap, GPUs are not.
Obviously when people say 8K is a gimmick, they are talking about the here and now. The hardware requirements and costs are just not worth it.
Things might change in 10 to 20 years, as they always do, but that's not a revolutionary idea.
That's not obvious, many people say resolutions higher than 4k are _completely_ worthless due to the limits of human vision. This is already visible on smaller devices, 4k laptops aren't much sharper than 1440p ones and I'll never need a 4k phone even in 20 years. I agree with philip that the limit for monitors/TVs is a bit higher, I wish there were more 5-6k options, but there's a limit there too and 8k will likely be the last increase. VR headsets will go even higher but once we have better foveation and reprojection it might be a bit meaningless to talk about native resolution/refresh rate
@@speedstyle. Laptop displays above 1080p have demonstrated how useless they are for more than 10 years now.
But still they keep being produced, despite the fact that you have to sit 2 inches away to even tell.
Intel didn't even allow unfiltered scaling from 4K to 1080p until 2019, so every laptop before that is arbitrarily not allowed to use the objectively simplest method of rescaling in existence.
And then they pretended it was an actual feature that took effort to implement when they so graciously decided that the consumer worms were allowed to use it.
Sadly, although I think it does have a purpose, and thank you for pointing this out, the LOD changing on models will appear a lot more obvious at 8K. At 4K, playing Skyrim with max object fade, it's still noticeable at ~40 inches.
Fortunately draw distance is always improving in newer games, and stuff like Unreal 5's Nanite could potentially do away with it forever
Although I would love 2x pixel density on my primary big monitor, even just for productivity
@@2kliksphilip Just googled Unreal 5 Nanite cloud and it looks very cool, hopefully newer games hold up to the future! I wonder if older games can handle 8K with your RTX 4090 and how they handle the higher res, whether there's noticeable pop-in/LOD switching? Could be an interesting test :)
I'm pretty sure there are lines in the .ini files that control draw distance, but it's been a while so I don't remember exactly which ones.
@@bebobo1 There seems to be a hard limit before it switches LOD models. I think you can actually regenerate distant LOD for Skyrim using external tools. I remember doing something similar to get the map to work with roads+seasons mods, something like xlodgen iirc?
Imagine working hard on making a decent video, being eager to hear your viewers' feedback, only to be bombarded with a Reddit-tier comment chain spamming "hmm yes, maybe it does have purpose, thank you for pointing this out"
hmm yes, maybe it does have purpose, thank you for pointing this out
hmm yes, maybe it does have purpose, thank you for pointing this out
hmm yes, maybe it does have purpose, thank you for pointing this out
hmm yes, maybe it does have purpose, thank you for pointing this out.
hmm yes, maybe it does have purpose, thank you for pointing this out
The reason so many people hate 8K is because people want an upgrade in resolution, graphics, and frame rate, not JUST resolution. We already have 4K60 which is a 4 times upgrade to resolution compared to 1080p60, the next step should be 4K240, not 8K60. That way we have a 4 times upgrade to resolution AND frame rate compared to 1080p60, not just 16 times in resolution. Combine that with running old games at high settings, and the triangle is complete. We are so close to finishing the triangle, why would anyone care about 8K at this point in time when we are THIS close to having a TRUE next gen gaming experience standard? 1080p60 was the standard for so long, it's time to make 4k240 the next one.
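A throwaway sketch of the arithmetic behind that triangle (just multiplying pixels by frames, nothing more):

```python
# Quick sanity check of the "upgrade triangle" arithmetic above:
# raw pixels per second pushed to the display, relative to 1080p60.
baseline = 1920 * 1080 * 60          # 1080p60
options = {
    "4K60":  3840 * 2160 * 60,
    "4K240": 3840 * 2160 * 240,
    "8K60":  7680 * 4320 * 60,
}
for name, rate in options.items():
    print(f"{name}: {rate / baseline:.0f}x the pixel throughput of 1080p60")
```

Both 4K240 and 8K60 cost roughly 16x the pixel throughput of 1080p60; they just spend it on different things.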
Agreed!
Right there with you. Once I managed to get 1080p60 I upped my game to 1080p144. My next jump will be 1440p240, and once I achieve steady performance at that, maybe we will be aiming for 4K560 or whatever.
I used to be ahead of the curve with 1080p and 144Hz but I've since slowed down massively. Everyone's been talking about how 1080p isn't good enough and I've yet to upgrade to 1440p.
I still think it doesn't have a purpose, but I appreciate you arguing the opposite view
"Added clarity and realism to the distant buildings..."
-> Shows clip of GTA V.
1:45 Is it odd though? Because medium graphics take away a LOT of detail. Sure, most agree that ULTRA settings are overkill and don't do much, but HIGH graphics are the sweet spot. Medium is definitely a downgrade, even at 4K. It's easy to see in the GTA V gameplay you provided.
Yeah, many games have a bigger fidelity step down from high to medium than from ultra to high. High settings + high resolution is the best combo for visuals. I'd take 4K high over 1440p ultra any day.
he's coping.
I did find it interesting that he used GTAV as the example, because in most modern games I would agree that you can drop the graphic settings quite a bit before it becomes very noticeable, but GTAV has a very obvious drop from high to "medium" (which is actually the lowest setting available). Medium is basically the xbox360/ps3 graphic settings.
@@NimbleSnek I wonder how fat the Nvidia check is...
@@TerrorByte69420 If he was paid by Nvidia, wouldn't he be telling you to only play at ultra so you're forced to spend 10000 dollars on a flagship GPU?
I'll just leave a little shout out for tiny little 8K screens. We need them! VR with 8K per eye and super bright HDR goated OLED would be amazing. Foveated rendering takes care of not having the GPU power to drive all those pixels, and 8K gives you amazing clarity that has to be getting close to what our eyes can even deal with. I'm no vision-to-pixel expert, but I do notice that in VR the same resolution looks so much worse than it does on a monitor a few feet away from me. So let's get on with it and get to 8K! It can only help VR heads wanting that to trickle down to little screens for their faces.
8k and even higher is relevant for VR as optics continue to improve.
It's all about the detail and aliasing, not the literal pixel size. Supersampling to 8k could provide perfect image quality, but an 8k display just pushes those issues down the road.
me watching this video at 240p on my 640x480 monitor:
ah yes, quality
I don't think I've ever used a monitor of such low resolution in my entire life. 720p ones for sure, but 480p is next level. How big is it?
@@Sk0die It's a 7 inch monitor. :3
USB powered, so my PC powers the monitor as well as displaying to it, uses much less power.
@@DraxilSpada-vc2wg oh yeah ok that makes a lot of sense, I was thinking like an old 19" monitor or something, which would be terrible to look at🤣
On a 7" screen that res would be perfect, and I'm sure it makes games and other programs run a lot easier on your PC.
@@Sk0die The hilarious part is I use it with an RTX 4090, it allows me to crank things up to stupid amounts, or just run my PC without burning a figurative hole in the motherboard.
@@DraxilSpada-vc2wg You are experiencing a level of bottleneck I didn't even know was possible. Please tell me you have another display you use as well?
I'm fairly certain my Ryzen APU could run any game in existence on high settings with that display; I think the 4090 could be a tad overkill😂
I play all my games at 4K on a 1080p monitor because it just looks that good. I imagine playing at 8K on a 4K monitor would be just as amazing of a jump in image quality. As more 8K monitors start to appear, I'll be happy to get a 4K monitor for cheap
I have personally found that using DLSS or other AI upscalers beyond native, essentially doing super-resolution but with AI upscaling, usually just means you are importing the upscaling artifacts these produce into your native-resolution rendering, which was previously pristine. This is of course just a tradeoff you can make depending on your preference. I just wanted to mention that this is not simply better than native most of the time; there are visible drawbacks that I notice without actively trying to pay attention to them. To be a bit more concrete, the visual artifact I always find annoying is how distant objects with a high-contrast transition to the background will leave obvious dark ghosting artifacts in motion, usually caused by camera motion.
Great video btw. Thanks Philip!
8K is objectively better than 4K and you can see the difference under specific circumstances. However due to diminishing returns and extreme hardware requirements I foresee its adoption outside TVs being very slow. According to Steam Hardware Survey over 50% of participating Steam users have a primary display resolution of 1080p or lower and 4K adoption is at less than 10%. To me this says that we're years away from 8K gaming being anything more than a curiosity for most people especially as any resolution is a moving target when it comes to performance (the price to performance stagnation in the GPU market doesn't help either).
Another issue to consider is that monitors last a long time. It is not uncommon for people to keep using the same monitor even if they completely upgraded their PC in the meantime. This likely contributes to the slow adoption of higher resolutions.
It's me!! I'm in the 50%!! I don't have much choice though since it's my laptop's screen, but the high refresh rate does feel really good.
Objectively higher, not better.
@@spectrobit5554 I meant that it's objectively better in terms of quality. Sorry for the confusion.
hmm yes maybe it does have purpose, thank you for pointing this out
...in 2030 with a RTX 8090
2030 is just 5 years and 3 months away. If I'm lucky I'll own a 4K 240 Hz monitor by then
I feel like you meant this as a joke but yeah, in 5 years with a decent card it will be a normal option....
I started playing games in 4k in 2017 with a gtx 1070. I could play games in 8k with my current GPU if i wanted to.
@@OttoLP personally in 5 years I fully expect I will only just have adopted 4k, but I'm also not buying anything that could be construed as high end lol. I just can't bear to spend that much.
But yeah, just as 4K seemed very silly and wasteful but now seems more and more viable, so too will 8K.
I expect we are more than 5 years away from 8k reaching mainstream, but I don't doubt that we will start seeing it again by 2030.
@@pancakerizer Hahahaha, yeah that would be crazy playing games at 4k 240fps like it is nothing. Though I doubt they will improve on path tracing performance as much in 2 generations, it might get 2-4 times faster at most I guess. Fun thought though :D
@@OttoLP Honestly, going off of 4K (which never took market dominance), it would be about 8 years from now for people to start adopting it en masse, because high refresh rate monitors will lag behind by about that much.
I actually think we will never get there. Just like phone batteries get bigger (denser) and CPUs more efficient, yet every phone has "a day and a bit" of battery life, our graphics cards will get more powerful, but games will have more polygons or shaders or textures, or, in case they can't add anything else, more and more rays being traced and noise being compensated for.
So 8K high quality will unfortunately not happen, I think. But still, if we get to 4.5K scaled down, 5K, 6K... people will hopefully see the difference. Seven years ago I had one of the first GPUs marketed as a "true 4K card"; even back then, without a 4K monitor, the anti-aliasing from running at a higher res and downscaling was just something that I couldn't get over.
And now I always have to spend much more than I'd "need" on my PCs, just because of this.
This is a pretty good answer. I don't think that we'll "never get there", but like you're pointing out, everything is going to scale the same to the point that we'll hardly see a benefit. Batteries get better and CPUs get faster and more efficient, and developers are like "It's free real estate," and never bother to optimize. Sure, someone might want an 8k gaming monitor, but do they want to pay for the video card to run that monitor?
the thing 8k brings to the table is cream to an already frosted cake
I remember Monster Hunter Rise on PC having broken TAA (literally didn't work at all, dunno if it got fixed) and I used my extra GPU headroom to run at 6K. It downsampled to 4K using DLDSR and the image quality was superb ❤
8K makes sense in VR
hmm yes maybe it does have purpose, thank you for pointing this out - but have you considered 16K?
Not even 4K has a point.
This person has never seen a monitor above 1080p
@@albertfanmingo Yeah, with your nose glued to the display and having to move your head to see the edges of the screen you can see it. And that's exactly the most st*pid and inefficient way to use a display.
@@elmariachi5133 My brother in Christ, you have not seen a 4K monitor. Why are you trying to advocate against something you have never seen?
@@albertfanmingo Because I have seen one. And because I am against scams. And nearly everything the industry has 'invented' in the last ten years is a scam.
The more pixels the better, but 8K simply isn't topical yet.
hmm yes, maybe it does have purpose, thank you for pointing this out
For 7 years I gamed at 1360x768 on a 19 inch TV. It wasn't until 2024 that I finally upgraded. I got a used 24 inch 1080p monitor from Craigslist and it's wonderful. I honestly don't think I'd ever want 4K gaming or above; 1080p at this size and distance looks perfectly fine for me.
hmm no, definitely it does not have purpose, curse you for pointing this out
4:20 Yes, but the "native" here is still using TAA, which is why I don't like upscaling at all; I personally don't like the blur TAA gives. I would rather use no AA at all, even with all the crunchy pixels. Hell, at 1440p and higher AA isn't "really" needed (especially at 8K, wink wink).
And if performance is needed, I would rather drop resolution and use something like SMAA than use upscaling. I wish modern games steered clear of TAA. r/fucktaa has some good info.
AA is definitely still needed at 4k for _most_ people. You might not mind that unstable raw look, but there’s a reason TAA became so ubiquitous: most people do. Besides the fact that modern and future games look very different to old games without AA, as the more complex your geometry and shading are, the more unstable the image is going to look without AA. In order for geometry and shading to advance, some sort of AA that can solve temporal instability is necessary.
Always Disabling AA gang here. Yes, I would much rather have jaggies (which are not visible on 1440p unless you stay still and do one of those "zoom ins" on the screen, which I'm not doing because I'm playing the game) and a higher framerate than a lower framerate and a blurry mess of an image full of ghosting.
@@Mankepanke I wonder how much of this is a generational thing. I'm in my mid thirties and among people in that age bracket it's not uncommon to not care about aliasing, even if on the more glaring side. It's like my brain doesn't register it as a problem.
SMAA sucks for vegetation, so as a method of AA it's pretty much useless in games with lots of vegetation. DLAA is clearer than TAA, so try that. DLAA/TAA being temporal methods allows games to run certain effects at lower sample counts to optimise performance. Volumetric lighting can be run at 1/4 res and TAA/DLAA can be used to make this a complete image with little artifacting. Being able to save 1.1ms on the render of each frame can mean the difference between 55 and 60fps. Similar methods are used in screen space reflections.
TAA is horrendous and I wish more games used straight upscaling instead.
Clearly, the only ones bold enough to advance the realm of resolutions are the capable men and women that create NSFW games and animations.
I too remember when I was excited for newer, better technology. Thanks for making me remember that. It's OK to be excited about things!
I'm still running some newer games at 720p on my 970 :)
Based and budget pilled.
cool to see poor brothers like me 💪
My main display is 1080P but my secondary is an old 720P HDTV. We will make it brother!
watching this without eyes
Everyone already knows the human eye can’t see past 8gb of ram.
People back in the day kept saying there was no point to HD resolution and Blu-ray. Now nobody wants to touch a 480p video with a stick, or even 720p. We'll get to 8K, but perhaps we have to increase the refresh rate first; 4K 120 > 8K 60 imo.
Ah, so it's one of those cycles. I didn't know that, but it doesn't surprise me. Eventually people will probably say 1080p is unusable and 1440p turns into the new 1080p (as in the bare minimum, like it is now), and then 2K and so on and so on.
There are always going to be people who are fine with 720p and 30 FPS. Look at the new Zelda game that came out last week. To me the jaggies and smudged shadows are very distracting though.
@@winterhell2002 True, but that's on a smaller screen as well so it's not as noticeable. If it was 1080p on a 21 inch or 45 inch screen it would be way more noticeable. The Switch upscales though when docked and connected to an external screen.
For anyone who would rather buy a more sensible card than a 4090 though, 1440p is still the way to go. I'm not sure 8K ever will be. It just seems unnecessary.
1440p is simply the point where, for most monitor sizes and distances from your eyes, it's completely useless to go any higher, because your eye isn't able to tell the difference unless you lean way too close to the screen, the same way 1080p is more than enough on pretty much every single mobile phone. Right now I'm typing on my phone, and even if I'm looking from up close the letters all look literally perfect. I am confident that no normal person would be able to tell the difference when shown a 1440p smartphone side by side with it, god forbid a 4K one, which apparently exist...
True. 4k is still a gimmick for most people and 8k just sounds like a meme. Why waste hardware and very expensive electrical power for a complete placebo?
Remember when people said you need the best to game at 4K? Maybe a GTX 980 really was not the minimum 4K spec; likewise for 8K, the minimum spec is not an RTX 4090.
A 7900 XTX or 4080 Super can easily run 4K too
As mentioned in the video, ultra settings also has diminishing returns
none of these braindead chatters have seen 4K 27 inch clarity lul
Buying a 27 inch 4K monitor was the best gaming decision I ever made, despite the internet saying 4K "isn't worth it at that size". HA! That showed them!
Definitely not too much resolution. It's just a matter of preference whether you focus on spatial or temporal resolution, as you currently can't reasonably have both. For productivity I would prefer even more. 27" 5K or 32" 6K. Sadly there are not a lot of options, and they are all non gaming displays and very expensive.
i love u 2kfilip im sorry everyone is commenting on ur vid with a copypaste
Finally someone that said this...
I went as far as to buy a Pro Display XDR specifically for the 6K res at 32" (though the rest of the features are also very welcome, like bye-bye IPS glow) and the difference is SO CLEAR compared to 4K. I have a freaking astigmatism btw. People pretending anything above 4K is useless was always a mystery to me; it has to be just repeating tech reviewers, who in turn have an agenda. No sane person would dismiss 8K the way the majority of people dismiss it if they actually saw the difference. You might not care about it and accept what you have, but to state, e.g., that 8K won't come, that 4K is the "last resolution" and we'll go with fidelity only from now on, is just stupid.
You're doing an apples-to-oranges comparison here. The common complaint, that a 4K display (at its intended viewing distance) is indistinguishable from an 8K display, is not addressed here. Rendering games at higher resolutions, such as with some anti-aliasing techniques, will produce a better image no doubt, but buying a higher resolution display won't help. Now, there are exceptions, such as split-screen gaming, or displays which have abnormally close viewing distances (like those found in VR headsets), where 8K is desirable, but those are still exceptions. I have to stress that this all hinges upon you using your display at a reasonable viewing distance: too close and you'll get eye fatigue, too far and you won't even make out the difference between 1080p and 4K.
Perhaps the biggest takeaway I find in your video is the fact that you didn't even compare a real 8K display to a 4K one.
The point you say I missed actually plays a prominent part in this video you've just watched. Comparing a 16" 4K to a 32" 4K screen is pretty damn close to comparing the clarity of 4K to 8K at the same viewing distance.
@@2kliksphilip Yes, but you're viewing them from the same distance. In actual use, you should keep the 32" display farther from your head than the 16" one.
@@2kliksphilip I'm not sure even this gets to the core point though - the suggestion is that 4k content on the 16" 4k display will probably look identical to 8k content downsampled to 4k on the 32" 4k display at the same viewing distance.
Downsampling from higher resolutions is literally the most primitive kind of anti-aliasing, but it's the most accurate and produces the fewest artifacts. If that's the only benefit of rendering at 8K then 8K monitors probably are just a fad, and we're really saying we just want better AA at 4K...
Both of you have missed the point. If a 16" 4k screen is noticeably sharper than a 32" 4k screen then a 32" 8k screen would have increased sharpness over a 4k one. I am already viewing a 32" monitor from this distance so my viewing distance won't change.
...no. I am viewing my 32" at the distance I would use my 32" at, and noticing increased clarity from the same resolution on a smaller screen at equal distance. Saying I'd double the distance I'd view my 4k screen at simply by making it 8k is being disingenuous and is what your misunderstanding hinges on.
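To make the "downsampling is just primitive anti-aliasing" point in this thread concrete, here is a tiny sketch of a 2x box-filter downsample (a toy greyscale example of my own; real downscalers use better filters):

```python
# Minimal sketch of rendering at 2x per axis and averaging each 2x2 block
# down to one output pixel (a simple box filter / ordered supersampling).

def box_downsample_2x(img):
    h, w = len(img), len(img[0])
    return [
        [
            (img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# A hard diagonal black/white edge rendered at the higher resolution
# becomes a softened (anti-aliased) edge at the lower one:
hi_res = [
    [0,   0,   0,   255],
    [0,   0,   255, 255],
    [0,   255, 255, 255],
    [255, 255, 255, 255],
]
print(box_downsample_2x(hi_res))  # [[0.0, 191.25], [191.25, 255.0]]
```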
Instead of higher fidelity, I wish more games would focus on physics.
Destructible buildings and stuff. And just be able to interact with everything.
Hell Let Loose is perfect in many ways, except performance, and absolutely everything is static in that game…
Imagine a Hell Let Loose with fully destructible buildings
Would love for VR and all those things you mentioned to become more focused upon.
Hmm yes maybe it does have a purpose, thank you for pointing that out
mm yes maybe it does have purpose, thank you for pointing this out
I was looking at The Office on a 27 inch 1440p screen 1.5 meters away and was blown away that I could read the text on the bookshelves. Idk why but it added a lot...
FYI: normally I watch from 2.5 meters away on a 65 inch 4K OLED.
I tried moving from 3.5m up to 0.5m, but I just can't read it. Maybe it was created in 1080p and I am gaslighting myself, but I think 8K might actually make a difference for me. (If movies were 8K :p)
I have had a 4K OLED TV for 8 years now, so maybe I just got very used to it.
The first 4K displays were often made of two panels bunched together, but now we are seeing more 4K displays with high refresh rates. 8K displays will become a thing, and who knows, maybe DLSS and many other technological improvements will make 8K a preferred choice in the upcoming years.
I plan to get a 4K monitor this Black Friday, and my PC only has a 6700K and a GTX 1070! I can wait a few years to play the new demanding titles like Alan Wake 2, wait until components are cheaper. Monitors are a surprisingly good price right now!
I've been catching your videos on 4K since that GTA V one all those years ago. I just want to say: people who solely recommend 1440p don't seem to remember that PCs can be used for more than just gaming. I write a lot, and I am really excited for the text clarity when I get that new monitor and upgrade from my 27" 1080p one. Text is going to look so good. Same thing with YouTube videos and movies: so much higher quality, and they're not even hard to run compared to games.
I'll use my GTX 1070, play some older games for a year, and then get something like an AMD RX 8700 XT on Black Friday 2025. No issue. People need to become less polarized on this.
Love the video as always Philip!
Why would you ever want to "play" Alan Wake? The game is awful.
@@LucasCunhaRocha I recently played Control and absolutely adored it! I remember playing Quantum Break back when it was new and really enjoying it, too. Remedy seems like a pretty great developer with a lot of bangers, and I want to try all of their games and see how they are. I hear Alan Wake 1 is the best 6/10 game out there, and I'm excited to see.
I don't intend to buy AW2 on Epic, though. Dislike using anything other than Steam. Just gonna wait until it comes to Steam years from now
Also, a good thing about 4K is that it can integer scale 1080p, so there's no downside in clarity compared to a native 1080p monitor. 1080p will still look worse to you, though, as you'll get used to 4K Windows and old games that can run at 4K, and then a game dropping down to 1080p won't be so pretty, but it still works. 8K would be even better for integer scaling, as it could do 4K, 1440p and 1080p, the three current common resolutions. But high refresh 8K monitors for a decent price are far off; we're still stuck on 8K 60Hz non-gaming monitors lol.
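A quick sketch of that integer-scaling point, checking which common 16:9 resolutions divide evenly into a 4K or 8K panel (my own toy helper, nothing official):

```python
# Which source resolutions integer-scale cleanly onto each panel, and by what factor?
panels = {"4K": (3840, 2160), "8K": (7680, 4320)}
sources = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

for pname, (pw, ph) in panels.items():
    fits = [
        f"{sname} ({pw // sw}x)"
        for sname, (sw, sh) in sources.items()
        if pw % sw == 0 and ph % sh == 0 and pw // sw == ph // sh
    ]
    print(f"{pname}: {', '.join(fits)}")
```

As expected, 4K only integer-scales 1080p (2x), while 8K handles 1080p (4x), 1440p (3x) and 4K (2x).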
As upscaling tech gets better and better I can see it being worthwhile. Right now the cost for both the GPU and for the monitor/tv is a bit steep. 4K finally got vaguely affordable and I'm very happy with it at the moment.
Jumping to 4K after being on 1440p for so long actually helped me realise how average graphics can really look. The added clarity revealed flaws in a lot of game elements and areas that were hidden by the low resolution, but also by the poorer contrast of my older screen.
It was immediately noticeable in text and a lot of game elements just how much smoother and cleaner everything looked. I think 8K would have a similar benefit provided the texture resolution and overall game settings don't have to be lowered too much: medium 8K being better than, say, high 4K, and much better than ultra 1440p, especially with upscaling rendering the game from 2K or 4K resolution anyway.
I feel like once we have texture upscaling as well, so games can have 1080p textures auto-upscaled to 4K, this would save on space as well as improve loading times, and massively improve visuals.
Imagine a game shipping with only 4K-quality textures, or low-end systems having 480/640p textures upscaled to 1080p or 1440p as well.
8 years ago (2016), LG released a 27" 5K monitor called the UltraFine, primarily for Apple users. Mind you: 5K has 80% more pixels than 4K. Shortly after, Apple discounted all USB-C accessories by 30% because of some drama. The discount also applied to the LG UltraFine, so I bought one as an impulse purchase. Of all possible imaginable games, I played WoW on it, and Pandaria never looked this good again. To this day, I am waiting for a 27" 5K / 32" 6K OLED 120Hz+ to repeat this experience. I sold it shortly after to recoup the money, and also because the monitor (not the display/panel itself!) was trash, constantly turning off randomly or switching to 30Hz.
This did not make me be all like 'hmm yes maybe it does have purpose, thank you for pointing this out'
man I cant wait for 16K to be a thing so I can finally see what everyone is so hyped about with 4K
"i grew up in a time where i looked forward to the future" - that one hurt
Me watching on 1080p screen : hmm yes maybe it does have purpose, thank you for pointing this out
Philip digs deep into things I have no access to or interest in. I like Philip for that (and more). Keeps me more open-minded and helps me embrace the future rather than shut down with only the things I'm familiar and content with, while automatically rejecting everything new.
Thanks for the reality check, Philip. I'm still rocking my GTX 1080 on a 1080p screen anyway because I don't care enough to invest into an upgrade, but thanks nonetheless.
Guy like me still chilling on 1080p
I'm still in a 32 inch 1080p club.
Every game must look like Minecraft I bet
5k at 32 inches is my dream. Windows was originally designed around 96 PPI and 5k at 31.5 inches with 200% scaling gives you an effective PPI for scaling purposes of 93.24. 5k is also nearly double the pixel count of 4k, which would be a massive improvement for older games while also allowing for perfect pixel scaling at 1440p for games that are harder to run. 200% scaling makes everything on the desktop side of things so much better to work with as well.
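That PPI arithmetic checks out; here's a minimal sketch of it (assuming a flat 16:9 panel and simple diagonal PPI):

```python
# Pixels per inch from resolution and diagonal, plus the effective value at 200% scaling.
import math

def ppi(w_px: int, h_px: int, diagonal_in: float) -> float:
    return math.hypot(w_px, h_px) / diagonal_in

native = ppi(5120, 2880, 31.5)
print(f'5K at 31.5": {native:.1f} PPI native, {native / 2:.2f} effective at 200% scaling')
# -> roughly 186.5 PPI native, ~93.24 effective, right next to the classic 96 PPI target
```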
I think more importantly than explaining why 8k might be worth it, you said that you look forward to the future, without it letting you detract from what you enjoy in the present. I think people are motivated to upgrade not just because "it's new", but also because it means that what they currently have "is old".
8K and beyond makes the most sense for VR-type applications.
I could listen to philip talk about rendering techniques and resolutions all day honestly
I think 1080p is gonna be the hill I die on.
There's value to 8K gaming...if you can afford it.
Your point on downscaling really did seal the deal, you do definitely have a point.
8K won't be a gimmick; in 10 years' time most gamers will be playing games at 8K 60fps on PC
I hate this intro. Not because it's not well done, it's great and gets the point across.
But because it is necessary.
I've noticed in my own online discussions lately that you can't have nice things anymore. No one tries to hear each other out, have civil discussions and reach mutually satisfactory conclusions. Everything is simply on the spectrum between 'um acktschually' and 'you're wrong'.
I come here to hear an opinion other than my own, one that I sometimes agree and sometimes disagree with but mainly would never have thought about.
And it pains me to see that the discussion has to start with what is basically a disclaimer more than a minute long, just so people are not butthurt.
It's miserable, isn't it? People are looking for the next COVID to be outraged about, but in the meantime any topic will do
I personally still find the chase for higher fidelity frustrating, because playing games like Cyberpunk, which is touted as one of the best-looking games ever, was an absolute pain. Stuff looked blurry, there was loads of ghosting, and the animations powering stuff still had a lot of jank. I had to tinker with settings in and out of game for hours, trying to prevent the game from looking too pixelated and crunchy, but also from looking like the world was full of ghosts. All of this hassle on my brand new rig, which should handle things no problem.
Cyberpunk 2077 at 6K is something I wish more people could experience, it's so immersive
hmm yes, maybe it does have a purpose, thank you for pointing this out
The first thing I did when I bought my 55" 4K TV was crank my games to 4K (I don't remember which ones sadly, but GTA V was def on the list), and I was extremely disappointed when I saw absolutely zero difference between 2K and 4K from 2 meters away. So yeah, I watch movies and play games at 2K to this day and I see no reason to move on. Tbh I don't really care about flat screen stuff. If the text is crisp and it's bright enough, I can work and game on that. Now VR and AR, that's where extra pixels really matter. Can't wait for higher resolution glasses to become affordable!
We've been so lost in maxxed out graphics that I didn't even consider high resolution gaming on graphics fidelity lower than high settings, but it makes good sense.
A lot of VR games are purposefully non-realistic, but they're immersive all the same; you could very easily overrule graphics by just pumping up the resolution.
However, many graphics settings don't only drop in quality but become more pixelated as they drop, especially stuff like shadow edges and textures.
It seems mostly contradictory to have so many pixels on the screen, smoothing everything out, just to see jagged, pixelated shadows and textures on it.
If games treated their graphics settings proportionately instead of making everything look pixelated at lower settings, it would be more visually pleasing to run lower fidelity at higher resolutions.
Due to this, I'd suggest 1440p just replace 1080p for the time being. A lot of people are misled by e-sports standards and are forced to drown under convoluted AA and DLSS settings to get a modern look to their games, when 1440p can very easily take just the lightest touch of either to completely get rid of any pixelated look.
I don’t even own a computer or anything and i still look forward to all your videos.
The optimal distance chart at 2:45 is much more relevant for media consumption than for gaming. As an ex-Eyefinity user, it's okay to not have the whole screen in the focused view and to have it fill up my peripheral vision.
I just want to thank you for not making this a 53 min long documentary, starting from the Big Bang till today.
Me watching this at 480p on a 1080p screen: "Interesting."
I think that the two primary purposes of 8K gaming would be:
1. Extra-large screens. In Apple marketing terms, not even 4K would count as "Retina" with an LCD/OLED screen in a size you'd normally buy a projector for, like offensively large, maybe 160-280 inches. In those cases, 8K would be actually noticeably sharp.
2. Screens that you'd look at from extremely close. How do you do that? VR! Or any kind of headset like that. The sharpness difference from having 4 times the pixels would actually be noticeable here, too.
What I'm gonna look at in 10 or however many years' time is which games I play right now will even recognize that 8K exists (in the game's settings menu, not a config file). I know devs are a lot more aware of higher resolution gaming, but it's gonna be interesting, especially with the increased prevalence of indie games.
hmm yes, maybe it does have purpose, thank you for pointing this out
Another reason why upscaling is so good for 8K is of course that the resolution of 8K is so high that the upscaling quality is going to be very good regardless, and the performance gains will be amazing on top. DLSS and FSR Performance mode use 1/2 (0.5x) res for each axis (0.25x total pixels), so at 4K it upscales from 1920x1080, while at 8K it's 3840x2160 because 8K has 4x the pixels of 4K; in other words, Performance at 8K upscales from 4K. Ultra Performance is 1/3rd res per axis, which at 8K is 2560x1440. 4K Quality mode's internal resolution is 1440p, the exact same as 8K Ultra Performance.
RT cost scales per pixel, so saving as much performance as possible by using upscaling is only going to get more and more important, and with the increase in quality from the resolution increase it's going to be great. It definitely won't start going mainstream this decade, but it will hit eventually. Maybe frame gen advances leaps and bounds too and latencies are decoupled so inputs aren't as affected; then a high framerate 8K experience might be possible sooner than people think. Just need to wait for 8K HRR micro LED monitors to not cost the entire GDP of a small nation and you're in the end game.
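A small sketch of the internal render resolutions those ratios imply (the per-axis divisors are the commonly cited ones for Quality, Performance and Ultra Performance; treat them as assumptions rather than vendor documentation):

```python
# Internal render resolution for each upscaling mode, assuming the usual per-axis divisors.
modes = {"Quality": 1.5, "Performance": 2.0, "Ultra Performance": 3.0}
outputs = {"4K": (3840, 2160), "8K": (7680, 4320)}

for out_name, (w, h) in outputs.items():
    for mode, divisor in modes.items():
        print(f"{out_name} {mode}: renders at {round(w / divisor)}x{round(h / divisor)}")
```

Which reproduces the point above: 8K Performance renders internally at 4K, and 8K Ultra Performance at 1440p, the same internal resolution as 4K Quality.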
Back in the 2000s, when 1080p monitors started to become affordable, I read comments from people claiming that anti-aliasing would no longer be needed at such a high resolution and would just waste performance.
Years later people claimed the same thing for 4K displays. But even now, AA being disabled in games rendered at 4K is still noticeable, just to a lesser extent than at 1080p or 1440p. And then there is the issue of TAA, which can destroy image clarity in motion even at 4K.
8K can finally be the sweet spot where AA is truly no longer needed. Or older methods of AA like FXAA might actually look pretty good, even in motion, at 8K.
1:16 Seems to me from this new (at least to me lol ) profile pic that you like being on the receiving end of stuff in general lol
Me, clicks on this video at 3 AM. Looks at the "hmm" comments.
It's time to go to sleep!
goodnight
@@fowlizm9065 gn!