Oh wow. That "part of the same video and bitrate" blue sky vs grass example was perfect. Should almost become one of those reoccuring gifs with a caption that are posted in gaming threads on twitter & reddit.
Ironically enough, the smile would be proof that he's not AI since you can see all the definition of the muscles on his face shifting, which AI video can't do (or at least not well).
DLSS does allow developers to get away with less thoughtful optimization, but so does literally everything that increases performance without developer input, like faster processors, larger hard drives, faster internet, etc.
Im normally very anti dlss but doom eternal was a great exemple of an amazingly optimised game with great dlss implimentation. Though ironically the game was so well optimised i rarely ever used dlss and just turned the resolution up
@@hughjanes4883 DOOM eternal is the best case scenario for DLSS. It lets me play at a perfect 90-100 fps with raytracing enabled at 1080p with my 3060ti. Every other game I try with DLSS typically just has a small performance boost with a lot of smearing. I think DLSS is perfect for edging out a little more performance to bump you up a graphics setting, but past that it's implementation can be lackluster.
Exactly. In the 8 and 16 bit era, developers would optimize down to the individual pixel. That ended when the next gen released. People also don't realize that some optimizations become infeasible as complexity increases or that they have tradeoffs.
@@aolson1111 dont forget old timey optimisations on 8 or 16 bit consoles heavily depended on hardware that the devs could leverage, as every game was gonna be played on same same console. Now devs have no gaurentee every computer will have the same hardware optimisations available, even on console where it can changemid generation
These days i work on freelance 3D render projects for a client that has absolutely no idea of these things, they want me to deliver 3 minute 4K 30fps renders under 300mb. usually when i deliver the video their reaction is like... it looks a bit blurry in fullscreen mode... i tell them i will need to increase bitrate for good quality but it would be bigger file size for which they tell me to do any magic and not to go above 400mb so then i go into video editor and just put on sharpness filters and add some LUTs and they are happy lol. It makes my work look bad but what can i do ...
A similar experience in my texture art for 3D products. With PBR workflow and 2k-4k res, total size limit of 3.5mb is nuts though understandable for web optimization.
They might just think 4K means more gooder. Perhaps you could show them a FullHD version at the same size and see if they are happier with it then discuss from there.
@@tteqhu Probably because it takes up less space than things at a 1GB or multiple GB, his file(s) assuming he never really goes over around 690MB per file can fit each and every file on a 700MB CD so it's probably a cost per dollar of data storage (and possibly backup) thing. I have Dumb and Dumber torrent rip that I dowloaded on one of my computers many years ago that was clearly a 480i/576i 24 Dvd RIP I don't remeber that exact file size but in in the 100's of megabytes and it portrays the 480 resolution video content perfectly, I'd say it could even pass for HQ good bitrate SD quality to 720p HD quality borderline but it's pretty great for below HD resolution interlaced content stored at a pretty conservative and frugal amount of 100's of MB, So all in all it's pretty impressive and very watchable even today. I find it pretty much BlackMagic that it was possible at all, bit it is ,or was rather.
Yo bossman, you were my most listened to artist on Spotify this year, which was very surprising! Very glad you have your song on there and keep up the amazing work!
This reminds me, recently there was a siggraph paper where Chinese researchers presented a new upscaling method for games: Unlike DLSS, the new method doesn't use a fixed input render resolution. Instead, the final (properly shaded) pixels a rendered at very low input resolution (much lower than for DLSS), while the pure texture mapped polygons (without any shader effects) are used as input in higher or even full resolution. These input data streams are then used by the ML algorithm to produce the final upscaled image. The results look pretty good because the model knows what the high frequency textures and polygon edges are supposed to look like, and it only has to upscale all the shading / material effects from rather low resolution, which have often mainly low frequency detail anyways. So I expect that DLSS (and XeSS, PSSR etc) upscaling will get significantly better in the future. Which is interesting as there wasn't much progress in upscaling since DLSS 2 came out several years ago. (Frame generation and frame denoising are a different beast.)
I dont think Upscaling is the culprit but rather titans like Unreal Engine that muddy the water. The aggressive push for things like lumen or nanite and removal of legacy optimizations like hardware tesselation resulted in devs needing to drastically lower resolution for effects like ambient occlusion that need temporal smearing to cover up the flaws, which leads to ugly visuals, not upscaling
Yeah 8k textures,raytracing and Nanite all look garbage when you have to run the game at 480p upscaled to hit 60fps. Idk how nobody noticed this but it's kinda the state of things right now. The new shiny isn't pretty when you have to compromise everything to run it.
unreal 5 is more built for fortnite then anything else bru and even then fortnite runs like garbage compared to when it used to use unreal 4 and doesnt look much better
Why remove legacy optimizations when you can reprogram them into your engine, thus allowing newer and older computers to run the games at high fps while also giving more details.
You can turn off lumen, and nanite, taa, and everything really. hell, you can switch back to forward rendering if you want and make games render like they used to.
There's nothing inherently wrong with upscaling, but when a game requires upscaling in order to reach 60fps at 2k, then I think something has gone wrong somewhere. Until we get better upscaling which doesn't make the image quality blurry, I'll always prefer native resolution over upscaling.
What I like about philip is that he's a Counter Strike gamer who embraces that he's a nerd like most of us, and he enjoys taking us on deep journeys through niche interests that leave us all a bit more interested in his niches
While I don't think it's entirely the fault of upscaling. The AAA gaming industry DEFINITELY has gotten lazy (I would know, I'm a game developer). Lack of optimization (both in filesize and in performance) is a huge issue in the modern gaming industry, and you can absolutely do a lot better than what large companies currently are
@@sd19delta16 At least four years. There are rumors that there won't be a new generation before 2030. What's more likely is this generation getting an extended life through portable consoles. Both Microsoft and Sony are working on Switch (2) competitors that are supposed to be as capable as their base consoles.
I shall now repeat for those hard of hearing: "it is still RELATIVELY EARLY in this console generation's lifespan DUE TO THE INCREASED COOMPLEXITY AND DEVELOPMENT TIME of modern games" alright thanks four years is a lot when a game and its sequel can be produced and polished within 26 months, but it's not much when a single game takes 72 months to ship
dlss is great but i wish it was used as a tool to squeeze performance out of hardware that might otherwise be incapable as opposed to being used as a crutch for unoptimised games from major studios. i have a 6700xt and im already finding alot of modern games that turn on upscaling technology as part of the medium or even sometimes HIGH settings presets, which seems like a complete misuse of the tech
What exactly is the difference between having to squeeze out more performance because the hardware would otherwise be incapable vs making unoptimized games from mayor studios run on hardware that would otherwise be incapable? How much you dislike a certain studio?
@NeovanGoth my rule of thumb is that on a midrange pc of the era, you should be able to get 60-100fps depending on genre with the medium settings preset, BEFORE any upscaling technique is used theres a difference between a kid using fsr to get the newest cod to run on his 10 year old laptop and a AAA release requiring upscaling tech to be playable on anything below a 4090
@@jalipeno97 but... cod doesnt need a 4090 to run 120fps... i get you're trying to make a point but you can't just go around saying things that are wrong, bo6 literally runs on a damn ps4 (which is literally 10 year old tech but obviously not as a weak as a 10 year old laptop) around like 40-60fps, i know thats not a pc equivalent but still. You don't need to exaggerate just to make a narrative that every game needs dlss...
can they tho.. with all the limited time and super tight release dates and other goals from the publisher etc, I think its most of the time not possible to fine tune the optimization..its the last thing in developement anyways.. it is a real shame
@@emperorfaiz If they don't have the skill, they shouldn't be developers on the first place. But I guess the time constraint isn't their fault, it's the publisher's
@@Chaoscelus The problem is that most of them where hired for political reason instead of skills. The skilled devs have mostly moved to the indie side because they didnt have enough brownie points to compete politically.
It's not that I think devs are becoming lazier, I think their higher-ups are more concerned with it being cheaper when relying on upscaling as a crutch instead of having to pay for bespoke graphics fine tuning done by people who need to.earn a salary.
That's one thing that I think a lot of consumers seem to forget. Unless it's a small indie team, or smaller, there's usually executives and publishers, and those executives and publishers are the ones making decisions that affect things like deadlines and crunch time without ever consulting with the teams actually doing the work designing the game, programming the logic, or creating the assets. The actual developers aren't the ones choosing to take shortcuts or skimping out on optimizing the game, they're simply not given the opportunity to do things the right way because big old publishing executive says the game needs to be on shelves by next month.
@@KyleDavis328 The only thing consumers complain about is this dynamic between companies and execs. I don't think it's very forgotten. I think what people like you often forget is that games are insanely complex projects, that require an incredible amount of work and problem solving from the best people in the industry, yet, here you are acting as if you could fix any of the issues they face.
@@TunaIRL Frankly, I am shaking my heads a lot when every time it is "It's the execs!" while in recent years we've actually seen the projects being mismanaged by their dev team, like Concord, that was actually going TERRIBLE far more terrible than the release, because they were lacking someone above checking in on them, and toxic positivity. Optimizations, especially today, is becoming far harder, and the more complex the project, the longer it can take to optimize, and while you are, you can brick the entire project temporarily. Optimizations are hard.
@@SioxerNikita I just wouldnt blame either side. Both, and every other party involved have a lot of things to manage in these projects. It's impossible to blame a specific thing unless you were a part of the project and can share on it.
@@TunaIRL That's also true. There are cases where we have direct info, like Concord. 18 months from release and the project was essentially non-existent. My point was more your point frankly, that it can be anything, and it is frankly tiring to see everyone blame just execs, studios, etc. They might have something to do. It's like Westwood, people blame EA for closing them down, while Westwood mismanaged their projects heavily, like SUPER heavily, and went over budget constantly.
It still bothers me that if you pirate content for free from websites hosted by the public you will get better video quality than if you paid for the content digitally.
Dlss is part of bigger problem where industry just shifts to how much less they can get away with , theres no reason games looking not better than rdr2 perform 4times worse on same hardware , but all the hardware +software upgrade since then allows industry to not worry about it
While I share your frustration, RDR2 is one of the highest budget games of all time. There's your reason. It was made on Rockstar's proprietary engine, which they invested in for GTA V, which was an even more expensive game to develop. The rest of the industry can't simply access that tech and benefit from that investment.
If we had tech to make video 100 times better, streaming companies would use it to deliver 100 times crappier bitrate. At least TH-cam is honest and just tell you that there's a better HD you have to pay for, while Netflix just gives premium users 4K that looks worse than DVDs.
Are devs lazy? Idk. You tell me what to call it when devs spend their whole ms budget on two checkboxes in unreal engine. We are 4 years into 9th gen with 10+ tflop consoles. 4 years into 8th gen with sub 2 tflop consoles, we had DOOM 2016, AC Origins, The witcher 3, Battlefield 1... We had a ton of games that without a doubt looked next gen and weren't just some linear tech demos. When will 9th gen get its next gen games?
"Are devs lazy? Idk. You tell me what to call it when devs spend their whole ms budget on two checkboxes in unreal engine." That is... stupid. You are 4 years in - games take 4-5 years to make, and there was a massive supply chain issue for the first two years of these consoles. Even getting dev consoles was a problem.
Yeah. You’re totally right. I think if there was depth of field and maybe a bit of shadows here and there, all to make the characters and environments easier to read, then I think the static noise aesthetic would work well in a video game. It looks so cool!
I know the deets of those tech, as I work with video myself. I still appreciate you covering those topics - sometimes you bring up stuff I could overlook or even didn't know about. I especially appreciate your demonstration of what you're talking about right now and used it as illustrations to explain stuff to people numerous time. Please don't stop. I also used this knowledge in my professional job with streaming, as well as I use it top optimize my streams so they look better than everyone else's around :-D
@@MultiZymethSKI see, when I click 6:34 it jump just after he say AVI (at 6:35), so i missed it. But he's not that wrong, AVI can only use old codecs up to HEVC... Ok it's too vague but it's not that wrong either.
We all know for sure over the years we've lost efficiency, this is provable if you look at how much more common bad broken pc ports are , and if you look at the fidelity gained it games for the cost in performance increase , games quite literally look like they are from 2016 but run 800p on the latest consoles. This doesn't apply to some games , but honestly most games.
@@Die-Coughman thats just not how dlls works dlls only downscales your resolution and upscales it back to your original resolution so your not upscaling anything only downscaling
The ghosting is TAA (Temporal Anti Aliasing) that blends old and new frames together into a blurry mess if anything moves to fast (aka any faster then the slow panning tech demos). It is used to hide noise, didering and lots of other bad render techniques that would look horrible without TAA. Something like MSAA or SGSSAA are much better, but doesn't work with RTX and loads of other lazy techniques used in modern games
1:12 Northern Journey is a good example of this, since a lot of the textures are photographed from real Norwegian nature, and there's a bunch of grass and foliage everywhere, most gameplay videos online see the quality drop a lot during normal moving scenes if you were to look at the ground. Oh, and on many occasions, the game is just dark (lighting) in general.
The issue with DLSS is not that it makes devs lazy, its that it makes studio execs see a way to cut corners. Why would they spend millions of dollars making their devs do real optimization work when they can simply rely on DLSS, a tool that is advertised as being able to multiply a game's frame count for no cost to them! Its a bit weird to be blaming the performance issue on people wanting higher framerates and resolutions when most people still use 1080p. I still play at 1080p and love the experience, but have definitely noticed a huge drop in performance on newer titles that more or less look the same as titles I was playing a decade ago on an older system. Despite never having "pushed for a higher resolution," im getting a fraction of the frames on a newer computer. And even if you still want to make the wild claim about games performing progressively worse falls to a push for higher resolutions then youre still wrong about it: the 1080ti came out seven years ago and was a beast at 4k high FPS gaming in its era (and honestly still is). The "push" for higher resolutions more or less capped out at 4k a while ago, and most people still dont care about more than 1080p. Would I honestly say that its all the fault of executives pushing to get games out faster and cheaper while not giving a single thought to the end user experience? Yes, yes i would. In every case. Would I say it's because the execs are lazy? Not necessarily, I'd say it's because they're greedy. Being lazy is simply a byproduct. Great video man, weird way to end it.
By his own logic "It will save someone somewhere some money" it only makes sense that upscalers are being blamed because that's exactly what they do by helping these studios to not spend resources on optimization therefore i was also surprised to see the video end like that. I mean upscalers are not the only reason for bad optimization like some people make it out to be but it certainly takes a part in it and that's undeniable at this point.
Case in point: 1080p still holds a majority on the Steam hardware survey at 55% (1440p is a distant second with 19%) and the most common GPUs are 60-series cards
I usually remaster videos by using Lossless scaling interpolating frames. It's impressive how good the videos are when doing that. It's sad to know here about upscaling tools doesn't work on it.
Video broadcast engineer here. We just have one comparison: On versus Off. Try it with either upscaler method. There’s multiple “correct” methods to upscale any video (and many more incorrect ones, used to be more common, thankfully less so since 4KTVs became normal).
@@2kliksphilip But you get better performance at said lower resolution than using DLSS. It's impressive, but the performance is worse than natively running the lower resolution and the visuals are worse than running the native resolution of your display. I wouldn't call that "free".
@@TechnoBabble The overhead is small, it can upscale eg 1080p in the time it would take to render 1200p natively. The upscale (at 1440p+) looks a lot better than 1200p, and in many cases it _does_ look better than native thanks to better antialiasing.
@@speedstyle. I've literally never seen a single game where maximum quality DLSS looks better than native. Every game is softer/slightly blurry or has very visible ghosting even at high framerates.
@@TechnoBabble yeah I agree, dlss looks like crap if there's a lot of movement, especially with frame generation on. upscaling also looks too sharp for me and ruins the immersive smooth edges that blends a bit better than seeing clear edges.
2:40 Just gonna add onto this; upscalers like DLSS and FSR have some level of game integration. They're able to physically detect ingame geometry, tell what the lighting is doing for that one pixel, able to detect motion and such. It's a much more 3D process, which is why upscalers struggle a little more with 2D games (not that most 2D games need it). Game upscalers couldn't work with video fundamentally unless there was another AI detecting depth, motion, lighting and more, which we have but it's slow and you're better off getting an AI which can detect and smooth out bitrate errors then upscale the cleaned up image.
this is only partially true. AMD offers RSR for global use, which uses FSR 1-based Upscaling. Now, this obviously isnt as good as a properly "TH-cam-integrated" FSR 2/3 or DLSS2/3, but it gets the job done. Additionally, Both AMD and Nvidia offer Upscaling for Videos via "Video Upscale" and "RTX Video Super Resolution".
Not sure if you're aware, but both AMD and Nvidia offer Video Upscaling, via "Video Upscale" and "RTX Video Super Resolution". In AMD's case, FSR 1 is being utilized. As you have already mentioned Time Travel aint quite possible, hence the reliance on spatial upscaling compared to temporal one. Not sure about the Nvidia implementation though.
When Philip uploads a video, I wonder where my childhood friend is right now. Maybe he's an astronaut now, fighting evil alien robots and commanding a fleet of Leonard Nimoys.
The final point sounds like a direct parallel of some channel I came across that ADAMANTLY HATES any sort of upscaling in video games. I'm here to say both you and that other channel have points- upscaling and TAA allows for higher framerates and more accessible gaming experiences, but they have regularly been proven to make everything very blurry, ghosty and less sharp in general. Unreal engine is taking a lot of that criticism due to Lumen and its general optimization methods relying on upscaling and denoising, causing big blobs of pixel dust remaining in a scene after something passes through it.
TAA is fine until you go back to an older game, then you get better performance and realize you don't need glasses (Performance isn't TAAs fault). I wish SGSSAA would become the standard AA method and that games was renderd like they used to be, on a frame by frame basis :/
Nope, it's moving the goal post, not making gaming more accessible to anyone. If game cannot run even at native 2k on the latest and fastest GPU (4090), you know the optimization is simply dogshit.
@zeykis7369 Nanite is just a denser type of LOD, that actively increases and decreases the tri count of high detail models. It has nothing to do with visual clarity.
Unreal Engine is the poster child of devs relying on these smart upscaling technologies instead of properly optimizing their games. But this is largely because developers have begun to rely on pathtracing tech. These chipsets weren't really ready for raytracing when they first came out and they still basically aren't.
When doing proper AI video upscaling, you treat every shot as a one-off. You don't interpret over keyframes, because AI may decide that the clip before is a continuation of the clip after, which will pollute data. The longer you have consistent data on screen, the more AI can accumulate and then reproject back onto footage to achieve higher resolution. A slow moving shot with consistent lighting and a long duration will upres well. Something where lights are flashing for a few seconds with tons of motion blur won't resolve much, if AI has nothing to grab onto. More input data means better result and it varies shot to shot. You spend hours doing it the right way and for something like this to work in real time, there would have to be a lot of pre-processing. A lot of watts wasted to get something that may not even look right half the time. It can work on autopilot, but not guaranteed. Anyways... good video. Covers all the important points.
That "video thumbnail" is the most disturbing thing I've seen since my grandmother left the bathroom door open I don't think I will be able to watch this video "until" my meds show up. Please stand by.
but unreal engine 5 and upscaling have made performance worse, while as you said the gpu performance uplifts are not keeping up because they come at an increased price. There are videos breaking down the gpu pipeline in ue5 games and pretty much all of them use expensive features like lumen and nanite to make the scene look marginally better at a huge performance decrease while at the same time being way cheaper money wise, and those cost savings are neither used to expand the scope/quality of the game nor are they passed down to consumers or game developers as their wages fall behind inflation and the prices of games are vastly outpacing inflation. Meanwhile the minimum spec hardware has been raised with every game faster than gpus are gaining performance despite upscaling being used aggressively (upscaling at 1080p should not be a thing at all) in the default settings since the days of dlss 2, which looked good, but still considerably worse than native
no upscaling doesnt make performance worse. also upscaling is really clever way to optimize, how would you even reach 4k details without upscaling? and if you add old tech msaa then your already low framerate would be 2x lower or worse
@@HenrichAchberger "how would you even reach 4k details without upscaling" optimizing so you can run 4k native (like doom and others)? the point isn't that upscaling by itself lowers performance but that the corners that it cuts leads to decreased performance also msaa isn't the only AA and TAA doesn't make the image better as opposed to other AA (even FXAA) it just smears pixels round edges and makes the image a blurry shit show. The use of lumen and more importantly nanite absolutely destroys performance due to overdraw and unnecessary computations instead of just optimizing a single goddamn mesh. That's why UE5 games have lower quality graphics and performance than previous gen games AND need upscaling (sometimes you don't even have the gpu overhead for it to have a performance uplift because of nanite and lumen). UE5 is not the only offender though has could be seen with mhwilds that needs frame generation in 1080p to run at MINIMUM 60fps on a 30-series card (even on a 3080ti which i'm sorry but it's not old by any means)
There are multiple vidoes going into fine detail showing how poor modern optimization has been. Upscaling isn't soley to blame, nor is the familiarity with 4 year old consoles, and if hardware prices are an issue then optimization being sacked for cheap & automatic checkboxes isn't helping.
The main issue with "too low bitrate" is that videos are not properly encoded with a quality setting on youtube, but with a hard bandwidth cap, which is forcing the quality lower. Meaning that normally the bandwidth would spike for complex scenes as necessary, to maintain an equal quality, but TH-cam encoder limits the maximum amount a second of video can use, artificially limiting the available bandwidth. This together with the lack of motion blur in many games (and action shot style cameras) and the codec has to drop details, which looks very unnatural, blurry and blocky. The simple solution is usually to artificially add motion blur to the videos, even if your camera can't shoot with a slow enough shutter speed due to a lack of a variable aperture with software. This drops the "harsh" details on moving objects and thus the codec can encode this content better. The same is true for video game shots, which is one of the reasons why many games use motion blur (which then streamers turn off, because who the fuck cares about viewing quality, right?) :) And there's no reason why FSR/DLSS shouldn't work on videos, but the main issue is that it needs to be trained differently - on bad video encodings instead of low resolution renderings - so it can properly mask the artifacts of bad encodings.. It also needs to consume the encoded video stream, to extract the motion vectors and the images at the same time, to lower the workload.
Most people turn off motion blur because its just bad (me included), and i just say that in the sense that no one wants it most of the time, aside from some cinematic scenes or racing games
It’s most likely a quality setting (CRF-ish?) COMBINED with a bitrate cap rather than either/or. Not using quality based encoding means stationary videos (like those Music uploads that consist of just the album art) will use the same bandwidth as a live action video, which is bad. Not using a bitrate cap means that when people override the quality settings to 1080p and it plays most of the video fine until it hits a complex scene that overwhelms their internet connection, they will complain and blame TH-cam.
Even BluRay's are compressed a lot. An actual film could be around 20TB in size in raw images. For cinema they compress them to DCP format which for a 2k film is around 250GB in size. You can try and upscale video which all do anyway if you're resolution doesn't match the file resolution or downscale, some are better then others but there are deminishing returns that just make the image look worse. Subtle upscalers and downscalers are best I find. MPC Video renderer is very good. But really bitrate is kind, resolution isn't all that important at all. You can only upscale so much, if th einformation isn't there to start with then you can never get it back so having a higher bitrate file to start with is way more impaortant even more so when upscaling. TH-cam bitrate has noticeably reduced over the past couple of years. I'm pretty sure it's when they introduced their premium bitrate setting you have to pay for.
Man, I haven't read "2k" in the actual cinema context for so long, thanks. But I think you are little harsh to say Blurays are compressed hard. They are compressed reasonably, I think. Streaming platforms cut information even harder, because most people don't pay attention.
@@tteqhu Oh yea BluRays are perfectly fine. I understand you need to compress a 20TB film otherwise we would need a lot of BluRay discs just to watch a film. Some BluRays are better then others though, some 4k BluRays are compressed way too much or they use way too much DNR or other techniques. Other BluRays are great.
Oh, so I see you are familiar with the videos by Threat Interactive. Should we expect a video from you about TAA, De-noising, rendering "tricks" game studios used to do and etc? Seems to be right up your alley
@@2kliksphilip Yea, I think I'm on the same camp as you. I don't mind. Btw, I think you read my comment the other way around - they used to do trickery, now they try and "model reality" (raytracing, pbr, RTreflections) and so they need to denoise the image for it to be "acceptable", I do think some things were lost from a different time of game development (crowbcat's left 4 dead video to me is the greatest example of how a "smaller" team of devs intimately familiar with the engine and game could do much better than blockbuster AAAA modern productions). But wrt graphics, idk. Could games look as good as they do with lesser GPU requirements? Sure. Is it worth it to spend resources on it, given that AMD and Nvidia (and sure, intel) provide solutions to some problems? idk. Probably not. Is any of all of this actual "laziness"? Absolutely not, just good old cost benefit analysis. Threat interactive's videos about game optmisation are still very interesting to watch, and remind me of that video of yours where some dev (a great dev, I don't mean as an insult, only as to explain "not a huge game studio") implemented that frame permanence thing to keep rendering an image and distorting it based on movement, so as not to freeze the frame and have people in VR get sick when the GPU couldn't keep up - or something like that, memory isn't perfect.
@@2kliksphilip Sort of, iirc they clickbait other things and say TAA is good, the other things are evil, but I mignt be misremembering, ofc. But it is also clickbait, etc.
@@prgnify I tried to watch a few Threat Interactive videos, and it seemed far more like a marketing ploy, that or some people that haven't really worked in the industry, having their own little pet peeve with certain techniques. Personally, I couldn't take the smugness, their holier than thou attitude.
@@SioxerNikita Yeah, it is marketing, for sure. But I disagree it is a huge ploy or something. By looking at it, its some kid who had a pet peeve with certain techniques and got too emotional. Some of what they show I can "see their vision" so to speak, with some dithering and stuff like that having to be smoothed out. But completely agree with the smugness and saviour complex. I guess it is to be expected from teenagers who think they know everything there is to know. But it being expected doesn't mean we have to take it, lol. Very difficult to watch in anything but 3x speed just to get to see the images and think for myself - everything is very cringy.
Halfway through the video I suddenly realised how identical to me you look, with a giant fleece thing on and a cup with a bombilla sticking out of it right next to you. I erupted in laughter
I'd just like to bring a little extra detail to the explanations here. Upscaling in games _does_ use temporal accumulation (the spreading computation over several frames trick explained in the video) because better data in = better data out. However, the core of the technology is the ability to just straight-up guess what should have been in the new pixels without actually rendering them at all. Additionally, there _is_ actually quite advanced and capable video super-resolution tech out there. It just isn't yet real-time (even with a beefy GPU) and can only be run by grabbing some research team's github repository and stumbling through getting it to run.
You can upscale videos and streamed media with Lossless Frame Scaling, though. You can even use Frame Generation on them, up to 2X. The 3X and 4X options aren't stable enough without introducing noticeable vibrating artifacts due to the lack of motion vectors.
@@keppycsThere is a paid tool for this on steam, dont remember name sadly. It was developed by some guys to enable dlss on any game, but ultimately ended up working even on other visual media, like youtube
@@SioxerNikita "Lossless scaling" is a name of the program. It is distributed on steam, made for games but also works with media. Most probably, they meant it figuratively
@@ArchiXBelidercene Yeah, I looked it up, and saw that, and lossless scaling is pretty much intended to scale up in powers of 2, to keep the scaling perfect, if I read correctly. But yeah, Lossless Frame Scaling for video and streamed media is something that doesn't exist XD
Hey man i hope u r reading this comment. I always had two questions and both of them have almost no resources and answers on internet. First is what u somewhat answered in this video. Like how regular screens or tv can upscale videos so better Like when u play 1080p video on 4k display in fullscreen then upscaling is definetly happening and it's so good while dlss and fsr struggle to upscale games. Then I want to know the difference between MEMC and Frame generation in games. because MEMC adds those extra frames in realtime in video on tv which looks very good and here also in games FG struggles.
The upscaling that TVs use isn't anything special and certainly isn't close to DLSS in quality, so I disagree with your sentiment that games struggle with upscaling while TVs don't. And the same with the frame generation aspect- it's more challenging for games to do because added latency is an issue with games in a way that a delay with smoothing motion on TV screens isn't. When only watching content, a delayed display doesn't matter so much. Games typically have more overlaid HUD elements which makes it harder to smooth the motion without inside knowledge of the game engine to know which bits to ignore
Another thing to mention, is that even before the compression algorithm there's chroma subsampling for almost all videos that aren't intermediary files. So, videos literally have half of the data cut off and reconstructed based on 1 sample of color per 4 samples of luminance.
Hey Philip! Do you still make your videos subtitles by feeding your script to that program that puts the words at the correct intervals? I'm amazed that after all this time you still have such good captions on your videos. However some of the funny bits you used to do like at 6:47 with "a helluva" are not here anymore, or random typing. Thank you for still providing us with nice and high quality videos! Also, at 2:08 makes me remember that video about MSA, where you were saying that MSAA x8 actually takes 9 samples, but no, not really, it is 8 :D
The interesting thing is, physical film might be possible to upscale with similar to dlss. Might is the key word there. But gate weave (every frame is slightly differently aligned due to the mechanics of the camera) as well as the fact that the grain/crystal structure of film is irregular, it means every fame is slightly different even if nothing in front of the camera changes. If the differences could be accounted for (no doubt an astronomically difficult task), it might be possible to temporally upscale. Though in the case of well preserved high quality film stock, the benefits would most likely be undetectable under normal viewing conditions. Additionally, film grain artifacts would probably be difficult if not impossible to work around and high quality film is already so sharp that scanning it in its true quality is debatably impossible due to the density and the irregular shape of the crystals. For home movies shot on physically smaller film, like 8mm, those might be possible to enhance. But aside from high quality scans and restoration, I think "enhancing" physical film is a pseudoscience. Stuff like trying to entirely remove film grain, or otherwise molest the quality of the film. Look up recent James Cameron "remasters," they look like trash. Removing the charm of the medium of film undermines the artistry, technology and culture the film emerged from. Not to discourage the idea though, a technology like that could be beneficial for film restoration, but I don't have much faith based on recent official attempts at AI remastering of film.
I still avoid upscalers wherever possible, the artifacts, the blurriness, etc. depending on which upscaler used in games... they all come with obvious downsides and rather have less FPS, but those are at least not bad.
For something similar with video, if you can track and stabilize a moving sequence of frames with sub-pixel precision, you can then sample these together on a larger canvas to upscale it. This is the principle behind super-resolution/pixel shift technology in some image editors and cameras respectively, as well as (as I understand it, at least) NASA’s drizzle stacking in astrophotography. Your explanation of how DLSS works sounds very similar in that it sounds like it samples the same area across a subdivided pixel.
In general, one can do super-resolution on video with pretty nice results. Please do note that your sub-pixel sample measurement doesn't have to be from near frame. One can do match similar parts of image using ORB/SURF/SIFT (affine-transform-oblivious descriptor). Then you can do weighted average of such measurements. And indeed doing super-resolution on video requires some filtering for edge artifacts. But doing filtering with wavelet transform does work pretty nice for removing such edge artifacts. Using something like FSR or DLSS. No AI needed. I speak that from experience since I worked on optimizing upscaler like that more than half decade ago. Problems you've pointed out are pretty much solvable without AI. If you don't have to do it with real time and smallish delay, ofc.
If you go to the Nvidia control panel on Windows, you can enable video upscaling. You might have to change some settings depending on the browser you're using, but I've found it can work pretty well on 1080p videos to 4k, and even works on stuff like Zoom and Discord video calls.
Fun fact, the "Auto" resolution may not always be displaying the resolution it is saying. This is because if it struggles to get the video it will downscale it until it can handle it better, though it may not always go back to the resolution you want. When you set the resolution manually it gets a lot closer to the one you actually want (it may loose some detail due to the lossy compession), and in this case it will prefer pausing the video (buffering) over lowering the resolution. Sadly on the web version of TH-cam you can't set a default resolution and you always have to switch over to not get the "Auto" one. On mobile you could select a specific default resolution, but now you can only select based on how much data you are willing to spend on viewing the video rather than the actual quality, so this is more like a restricting version of "Auto" where it tries to stay within some range rather than going all the way down if it feels like it.
Oh wow. That "part of the same video and bitrate" blue sky vs grass example was perfect. Should almost become one of those reoccuring gifs with a caption that are posted in gaming threads on twitter & reddit.
I thought the same, "this guy it's a genius! " I try to explain bitrate to people soooo many times
This should be a short.
@@montyoso fuck shorts
@@montyoso Unfortunately, shorts only support ultra-narrow view aspect ratio so for optimal short, you would need to create a custom video.
I was honestly expecting Philip to get datamoshed at the end with his smile, it would have been a very funny way to end the video
you sure would've
Yeah He should add that!
I tried to eat 4 boiled eggs within the time it took to watch this vid. Barely finished the second. I will not fail next time.
youll get it next time, i believe in you
It's a lot easier if you don't boil them. Just crack the eggs into a bowl and drink it next time, it will more than double your egg consumption speed.
An egg every 2 mins, pathethic.
Almost hard boiled, deshell then swallow without chewing.
Aren't you cooling them off? Takes me 2 bites for 1 egg
Mission failed. We'll get em next time.
you're never beating the AI allegations with that smile at the end
Thats what Christine McVie meant in her song "Show Me A Smile".
Pretty much instantly reminded me of Richard D James as soon as i saw it hahaha
Ironically enough, the smile would be proof that he's not AI since you can see all the definition of the muscles on his face shifting, which AI video can't do (or at least not well).
That audio delay isn't helping either xD
The funny thing is as a fellow brit, I've used that exact smarmy kind of 'told you so' smile. Must be a brit thing.
Philips clothing is my thermometer for the UK.
it is very chilly atm
3:00 Didn't know FSR has a built in chicken remover
It was supposed to censor genitalia but for some reason it's removing all the male chickens now
FidelityFX Schicken Remover
It needs a sacrifice
@@Astheniumn a ritual to get the upscaler working
Fowl Super Remover
DLSS does allow developers to get away with less thoughtful optimization, but so does literally everything that increases performance without developer input, like faster processors, larger hard drives, faster internet, etc.
Im normally very anti dlss but doom eternal was a great exemple of an amazingly optimised game with great dlss implimentation.
Though ironically the game was so well optimised i rarely ever used dlss and just turned the resolution up
@@hughjanes4883 DOOM eternal is the best case scenario for DLSS. It lets me play at a perfect 90-100 fps with raytracing enabled at 1080p with my 3060ti. Every other game I try with DLSS typically just has a small performance boost with a lot of smearing. I think DLSS is perfect for edging out a little more performance to bump you up a graphics setting, but past that it's implementation can be lackluster.
Exactly. In the 8 and 16 bit era, developers would optimize down to the individual pixel. That ended when the next gen released. People also don't realize that some optimizations become infeasible as complexity increases or that they have tradeoffs.
@@aolson1111 dont forget old timey optimisations on 8 or 16 bit consoles heavily depended on hardware that the devs could leverage, as every game was gonna be played on same same console. Now devs have no gaurentee every computer will have the same hardware optimisations available, even on console where it can changemid generation
@@hughjanes4883 I don't know how to tell you this, but your game shouldn't be running like ass 1080p. If it does, it means you're a shitty dev.
I don't know
I do i do fein fein fien fien fiein fein
Me too either
Alright I guess
me neither
Same here, man, i just dont fkn know :C
These days i work on freelance 3D render projects for a client that has absolutely no idea of these things, they want me to deliver 3 minute 4K 30fps renders under 300mb. usually when i deliver the video their reaction is like... it looks a bit blurry in fullscreen mode... i tell them i will need to increase bitrate for good quality but it would be bigger file size for which they tell me to do any magic and not to go above 400mb so then i go into video editor and just put on sharpness filters and add some LUTs and they are happy lol. It makes my work look bad but what can i do ...
A similar experience in my texture art for 3D products. With PBR workflow and 2k-4k res, total size limit of 3.5mb is nuts though understandable for web optimization.
They might just think 4K means more gooder. Perhaps you could show them a FullHD version at the same size and see if they are happier with it then discuss from there.
Henry’s come to see us!
Why is he so fixated about specific number in hundreds of MB?
@@tteqhu Probably because it takes up less space than things at a 1GB or multiple GB, his file(s) assuming he never really goes over around 690MB per file can fit each and every file on a 700MB CD so it's probably a cost per dollar of data storage (and possibly backup) thing.
I have Dumb and Dumber torrent rip that I dowloaded on one of my computers many years ago that was clearly a 480i/576i 24 Dvd RIP I don't remeber that exact file size but in in the 100's of megabytes and it portrays the 480 resolution video content perfectly, I'd say it could even pass for HQ good bitrate SD quality to 720p HD quality borderline but it's pretty great for below HD resolution interlaced content stored at a pretty conservative and frugal amount of 100's of MB, So all in all it's pretty impressive and very watchable even today. I find it pretty much BlackMagic that it was possible at all, bit it is ,or was rather.
Yo bossman, you were my most listened to artist on Spotify this year, which was very surprising! Very glad you have your song on there and keep up the amazing work!
because videos arent made with 3d graphics and motion vectors
Yes, and normal and unshaded color and depth
Yes, DLSS, FSR and XESS have access to in-engine information, so its really impossible to make it work for videos
This reminds me, recently there was a siggraph paper where Chinese researchers presented a new upscaling method for games: Unlike DLSS, the new method doesn't use a fixed input render resolution. Instead, the final (properly shaded) pixels a rendered at very low input resolution (much lower than for DLSS), while the pure texture mapped polygons (without any shader effects) are used as input in higher or even full resolution. These input data streams are then used by the ML algorithm to produce the final upscaled image. The results look pretty good because the model knows what the high frequency textures and polygon edges are supposed to look like, and it only has to upscale all the shading / material effects from rather low resolution, which have often mainly low frequency detail anyways.
So I expect that DLSS (and XeSS, PSSR etc) upscaling will get significantly better in the future. Which is interesting as there wasn't much progress in upscaling since DLSS 2 came out several years ago. (Frame generation and frame denoising are a different beast.)
@@cube2foxWhat's the paper called?
@@keppycs "Deep Fourier-based Arbitrary-scale Super-resolution for Real-time Rendering"
I dont think Upscaling is the culprit but rather titans like Unreal Engine that muddy the water. The aggressive push for things like lumen or nanite and removal of legacy optimizations like hardware tesselation resulted in devs needing to drastically lower resolution for effects like ambient occlusion that need temporal smearing to cover up the flaws, which leads to ugly visuals, not upscaling
Yeah 8k textures,raytracing and Nanite all look garbage when you have to run the game at 480p upscaled to hit 60fps. Idk how nobody noticed this but it's kinda the state of things right now. The new shiny isn't pretty when you have to compromise everything to run it.
unreal 5 is more built for fortnite then anything else bru
and even then fortnite runs like garbage compared to when it used to use unreal 4 and doesnt look much better
Why remove legacy optimizations when you can reprogram them into your engine, thus allowing newer and older computers to run the games at high fps while also giving more details.
You can turn off lumen, and nanite, taa, and everything really. hell, you can switch back to forward rendering if you want and make games render like they used to.
I like your funny words magic man
20 secs of vídeo in 450kb
WITCHCRAFT!
4chan: first time?
Imagine that @25 fps is literally 0.9 kb/frame* - brutal.
*) yes, I know frames depend upon each other to a huge degree which saves tons of space
There's nothing inherently wrong with upscaling, but when a game requires upscaling in order to reach 60fps at 2k, then I think something has gone wrong somewhere. Until we get better upscaling which doesn't make the image quality blurry, I'll always prefer native resolution over upscaling.
What I like about philip is that he's a Counter Strike gamer who embraces that he's a nerd like most of us, and he enjoys taking us on deep journeys through niche interests that leave us all a bit more interested in his niches
and he's got a hellalalotta b-roll.
One of the few youtubers I used to watch for a specific game I liked that I still have a good reason to watch now
Eventually you run out of things to talk about a specific thing. Philip made the transition to other topics perfectly imo
Thanks for subtitles philip
Finally, I now know how you look when doing the "helluluawa lot of..." bit :D. I'll never forget this day
7:56 that smile killed me 😂
While I don't think it's entirely the fault of upscaling. The AAA gaming industry DEFINITELY has gotten lazy (I would know, I'm a game developer). Lack of optimization (both in filesize and in performance) is a huge issue in the modern gaming industry, and you can absolutely do a lot better than what large companies currently are
7:34 4 years isn't exactly what I would call "early in this console generation". Fantastic video though.
That's fair but we do still have another 4 years to go.
@@sd19delta16 At least four years. There are rumors that there won't be a new generation before 2030. What's more likely is this generation getting an extended life through portable consoles. Both Microsoft and Sony are working on Switch (2) competitors that are supposed to be as capable as their base consoles.
key word you left out there was "relatively"
I shall now repeat for those hard of hearing:
"it is still RELATIVELY EARLY in this console generation's lifespan DUE TO THE INCREASED COOMPLEXITY AND DEVELOPMENT TIME of modern games"
alright thanks
four years is a lot when a game and its sequel can be produced and polished within 26 months, but it's not much when a single game takes 72 months to ship
dlss is great but i wish it was used as a tool to squeeze performance out of hardware that might otherwise be incapable as opposed to being used as a crutch for unoptimised games from major studios. i have a 6700xt and im already finding alot of modern games that turn on upscaling technology as part of the medium or even sometimes HIGH settings presets, which seems like a complete misuse of the tech
What exactly is the difference between having to squeeze out more performance because the hardware would otherwise be incapable vs making unoptimized games from mayor studios run on hardware that would otherwise be incapable? How much you dislike a certain studio?
@NeovanGoth my rule of thumb is that on a midrange pc of the era, you should be able to get 60-100fps depending on genre with the medium settings preset, BEFORE any upscaling technique is used
theres a difference between a kid using fsr to get the newest cod to run on his 10 year old laptop and a AAA release requiring upscaling tech to be playable on anything below a 4090
@@jalipeno97 but... cod doesnt need a 4090 to run 120fps... i get you're trying to make a point but you can't just go around saying things that are wrong, bo6 literally runs on a damn ps4 (which is literally 10 year old tech but obviously not as a weak as a 10 year old laptop) around like 40-60fps, i know thats not a pc equivalent but still. You don't need to exaggerate just to make a narrative that every game needs dlss...
@@cool_boxy did i say cod? im not speaking generally
its not as common as some people say but im tired of having to smear my screen in vaseline to play any new games
These videos are so well presented and you have a good on screen presence, good work Phillip!!!
7:44 Because its true. Devs CAN optimize their games... yet they don't.
can they tho.. with all the limited time and super tight release dates and other goals from the publisher etc, I think its most of the time not possible to fine tune the optimization..its the last thing in developement anyways.. it is a real shame
Either they don't have skill or time to do it.
@@emperorfaiz If they don't have the skill, they shouldn't be developers on the first place. But I guess the time constraint isn't their fault, it's the publisher's
@@Chaoscelus The problem is that most of them where hired for political reason instead of skills. The skilled devs have mostly moved to the indie side because they didnt have enough brownie points to compete politically.
@Erowens98 Exactly, is the publisher's (and more's) fault
It's not that I think devs are becoming lazier, I think their higher-ups are more concerned with it being cheaper when relying on upscaling as a crutch instead of having to pay for bespoke graphics fine tuning done by people who need to.earn a salary.
That's one thing that I think a lot of consumers seem to forget. Unless it's a small indie team, or smaller, there's usually executives and publishers, and those executives and publishers are the ones making decisions that affect things like deadlines and crunch time without ever consulting with the teams actually doing the work designing the game, programming the logic, or creating the assets. The actual developers aren't the ones choosing to take shortcuts or skimping out on optimizing the game, they're simply not given the opportunity to do things the right way because big old publishing executive says the game needs to be on shelves by next month.
@@KyleDavis328 The only thing consumers complain about is this dynamic between companies and execs. I don't think it's very forgotten.
I think what people like you often forget is that games are insanely complex projects, that require an incredible amount of work and problem solving from the best people in the industry, yet, here you are acting as if you could fix any of the issues they face.
@@TunaIRL Frankly, I am shaking my heads a lot when every time it is "It's the execs!" while in recent years we've actually seen the projects being mismanaged by their dev team, like Concord, that was actually going TERRIBLE far more terrible than the release, because they were lacking someone above checking in on them, and toxic positivity.
Optimizations, especially today, is becoming far harder, and the more complex the project, the longer it can take to optimize, and while you are, you can brick the entire project temporarily.
Optimizations are hard.
@@SioxerNikita I just wouldnt blame either side. Both, and every other party involved have a lot of things to manage in these projects. It's impossible to blame a specific thing unless you were a part of the project and can share on it.
@@TunaIRL That's also true. There are cases where we have direct info, like Concord. 18 months from release and the project was essentially non-existent.
My point was more your point frankly, that it can be anything, and it is frankly tiring to see everyone blame just execs, studios, etc. They might have something to do.
It's like Westwood, people blame EA for closing them down, while Westwood mismanaged their projects heavily, like SUPER heavily, and went over budget constantly.
It still bothers me that if you pirate content for free from websites hosted by the public you will get better video quality than if you paid for the content digitally.
loved this format of content, keep up the amazing work man :D
Dlss is part of bigger problem where industry just shifts to how much less they can get away with , theres no reason games looking not better than rdr2 perform 4times worse on same hardware , but all the hardware +software upgrade since then allows industry to not worry about it
While I share your frustration, RDR2 is one of the highest budget games of all time. There's your reason. It was made on Rockstar's proprietary engine, which they invested in for GTA V, which was an even more expensive game to develop. The rest of the industry can't simply access that tech and benefit from that investment.
i think you did a very nice job with the lighting in this video! looks super crisp 👌
If we had tech to make video 100 times better, streaming companies would use it to deliver 100 times crappier bitrate.
At least TH-cam is honest and just tell you that there's a better HD you have to pay for, while Netflix just gives premium users 4K that looks worse than DVDs.
Are devs lazy? Idk. You tell me what to call it when devs spend their whole ms budget on two checkboxes in unreal engine.
We are 4 years into 9th gen with 10+ tflop consoles.
4 years into 8th gen with sub 2 tflop consoles, we had DOOM 2016, AC Origins, The witcher 3, Battlefield 1...
We had a ton of games that without a doubt looked next gen and weren't just some linear tech demos.
When will 9th gen get its next gen games?
"Are devs lazy? Idk. You tell me what to call it when devs spend their whole ms budget on two checkboxes in unreal engine."
That is... stupid. You are 4 years in - games take 4-5 years to make, and there was a massive supply chain issue for the first two years of these consoles. Even getting dev consoles was a problem.
@@zoeherriot Me want game now😡 Dev make game in 2 week😡 Where is my perfect game😡😡
4:22 you can pause that video at any time and the gameplay becomes a static noise painting with no info about depth
Yeah. You’re totally right.
I think if there was depth of field and maybe a bit of shadows here and there, all to make the characters and environments easier to read, then I think the static noise aesthetic would work well in a video game. It looks so cool!
That video with the sky/grass ratios is a FANTASTIC demo
Lossless scaling app on steam enables you to use frame interpolation and upscaling for every game and content. Not an ad I just love this program.
I know the deets of those tech, as I work with video myself. I still appreciate you covering those topics - sometimes you bring up stuff I could overlook or even didn't know about. I especially appreciate your demonstration of what you're talking about right now and used it as illustrations to explain stuff to people numerous time. Please don't stop.
I also used this knowledge in my professional job with streaming, as well as I use it top optimize my streams so they look better than everyone else's around :-D
6:34 - I think AVI is a video container, not a codec
xvid is the future
AVI is a video container
AV1 is a video codec
Just open your eyes man
@@annihil4nth Um, I know? That's why I'm saying that comparing codecs like H264, HEVC and AV1 to a container like AVI does not make sense.
@@MultiZymethSKI see, when I click 6:34 it jump just after he say AVI (at 6:35), so i missed it.
But he's not that wrong, AVI can only use old codecs up to HEVC... Ok it's too vague but it's not that wrong either.
avi is a video-specific extension of riff container, and it can store any audio and video codec that has fourcc code
much love phillip, wish you a merry christmas and a happy new year
You talk, I listen
The comparison around 6:00 was really nice
We all know for sure over the years we've lost efficiency, this is provable if you look at how much more common bad broken pc ports are , and if you look at the fidelity gained it games for the cost in performance increase , games quite literally look like they are from 2016 but run 800p on the latest consoles. This doesn't apply to some games , but honestly most games.
You merged that preview image so good with the start of the video, my applause
Maybe I've only played games with bad dlss but my experience has always been slightly ghosting. So I don't use it.
What resolution are you upscaling to?
@@Die-Coughman thats just not how dlls works dlls only downscales your resolution and upscales it back to your original resolution so your not upscaling anything only downscaling
The ghosting is TAA (Temporal Anti Aliasing) that blends old and new frames together into a blurry mess if anything moves to fast (aka any faster then the slow panning tech demos). It is used to hide noise, didering and lots of other bad render techniques that would look horrible without TAA. Something like MSAA or SGSSAA are much better, but doesn't work with RTX and loads of other lazy techniques used in modern games
That Nicolas Cage cup is definitely the bulk of the bitrate on this video.
I already knew the answer, but I didn't know how to optimize bitrate, and I never figured out that it works as if it's a limited resource. Good video!
1:12
Northern Journey is a good example of this, since a lot of the textures are photographed from real Norwegian nature, and there's a bunch of grass and foliage everywhere, most gameplay videos online see the quality drop a lot during normal moving scenes if you were to look at the ground.
Oh, and on many occasions, the game is just dark (lighting) in general.
from this comment i found this game, and really enjoying it so far, thank you
@@MJ-ev4nw I want to shill it everywhere
excellent vid, one of the better ones in recent memory. Thanks phillip!
The issue with DLSS is not that it makes devs lazy, its that it makes studio execs see a way to cut corners. Why would they spend millions of dollars making their devs do real optimization work when they can simply rely on DLSS, a tool that is advertised as being able to multiply a game's frame count for no cost to them!
Its a bit weird to be blaming the performance issue on people wanting higher framerates and resolutions when most people still use 1080p. I still play at 1080p and love the experience, but have definitely noticed a huge drop in performance on newer titles that more or less look the same as titles I was playing a decade ago on an older system. Despite never having "pushed for a higher resolution," im getting a fraction of the frames on a newer computer. And even if you still want to make the wild claim about games performing progressively worse falls to a push for higher resolutions then youre still wrong about it: the 1080ti came out seven years ago and was a beast at 4k high FPS gaming in its era (and honestly still is). The "push" for higher resolutions more or less capped out at 4k a while ago, and most people still dont care about more than 1080p.
Would I honestly say that its all the fault of executives pushing to get games out faster and cheaper while not giving a single thought to the end user experience? Yes, yes i would. In every case. Would I say it's because the execs are lazy? Not necessarily, I'd say it's because they're greedy. Being lazy is simply a byproduct.
Great video man, weird way to end it.
By his own logic "It will save someone somewhere some money" it only makes sense that upscalers are being blamed because that's exactly what they do by helping these studios to not spend resources on optimization therefore i was also surprised to see the video end like that. I mean upscalers are not the only reason for bad optimization like some people make it out to be but it certainly takes a part in it and that's undeniable at this point.
I wonder if I will ever hear any other argument made about any business related topic other than greed.
@@TunaIRL Yeah, there are a lot more stuff to talk about, budget allocation and focus.
Case in point: 1080p still holds a majority on the Steam hardware survey at 55% (1440p is a distant second with 19%) and the most common GPUs are 60-series cards
@@TunaIRL Free Luigi!
honestly enjoy your videos more with your beautiful mug in it. thank you AI allegators
I usually remaster videos by using Lossless scaling interpolating frames. It's impressive how good the videos are when doing that. It's sad to know here about upscaling tools doesn't work on it.
Try SVP motion interpolation
Video broadcast engineer here. We just have one comparison: On versus Off. Try it with either upscaler method. There’s multiple “correct” methods to upscale any video (and many more incorrect ones, used to be more common, thankfully less so since 4KTVs became normal).
I cant think of single game where i thought upscaler looked better than native
@@2kliksphilip But you get better performance at said lower resolution than using DLSS.
It's impressive, but the performance is worse than natively running the lower resolution and the visuals are worse than running the native resolution of your display. I wouldn't call that "free".
@@TechnoBabble The overhead is small, it can upscale eg 1080p in the time it would take to render 1200p natively. The upscale (at 1440p+) looks a lot better than 1200p, and in many cases it _does_ look better than native thanks to better antialiasing.
@KonradGM Why do I get the feeling that you've only seen FSR in action, but not DLSS?
@@speedstyle. I've literally never seen a single game where maximum quality DLSS looks better than native.
Every game is softer/slightly blurry or has very visible ghosting even at high framerates.
@@TechnoBabble yeah I agree, dlss looks like crap if there's a lot of movement, especially with frame generation on. upscaling also looks too sharp for me and ruins the immersive smooth edges that blends a bit better than seeing clear edges.
2:40 Just gonna add onto this; upscalers like DLSS and FSR have some level of game integration. They're able to physically detect ingame geometry, tell what the lighting is doing for that one pixel, able to detect motion and such. It's a much more 3D process, which is why upscalers struggle a little more with 2D games (not that most 2D games need it).
Game upscalers couldn't work with video fundamentally unless there was another AI detecting depth, motion, lighting and more, which we have but it's slow and you're better off getting an AI which can detect and smooth out bitrate errors then upscale the cleaned up image.
this is only partially true. AMD offers RSR for global use, which uses FSR 1-based Upscaling. Now, this obviously isnt as good as a properly "TH-cam-integrated" FSR 2/3 or DLSS2/3, but it gets the job done. Additionally, Both AMD and Nvidia offer Upscaling for Videos via "Video Upscale" and "RTX Video Super Resolution".
Not sure if you're aware, but both AMD and Nvidia offer Video Upscaling, via "Video Upscale" and "RTX Video Super Resolution". In AMD's case, FSR 1 is being utilized. As you have already mentioned Time Travel aint quite possible, hence the reliance on spatial upscaling compared to temporal one. Not sure about the Nvidia implementation though.
What I wanna know is when are we gonna get native resolution without blurry TAA covering everything
i didnt know sebastian vettel was this interested in graphics upscalers. awesome vid
When Philip uploads a video, I wonder where my childhood friend is right now. Maybe he's an astronaut now, fighting evil alien robots and commanding a fleet of Leonard Nimoys.
The final point sounds like a direct parallel of some channel I came across that ADAMANTLY HATES any sort of upscaling in video games. I'm here to say both you and that other channel have points- upscaling and TAA allows for higher framerates and more accessible gaming experiences, but they have regularly been proven to make everything very blurry, ghosty and less sharp in general. Unreal engine is taking a lot of that criticism due to Lumen and its general optimization methods relying on upscaling and denoising, causing big blobs of pixel dust remaining in a scene after something passes through it.
TAA is fine until you go back to an older game, then you get better performance and realize you don't need glasses (Performance isn't TAAs fault). I wish SGSSAA would become the standard AA method and that games was renderd like they used to be, on a frame by frame basis :/
Nope, it's moving the goal post, not making gaming more accessible to anyone. If game cannot run even at native 2k on the latest and fastest GPU (4090), you know the optimization is simply dogshit.
Nanite is also a huge huge part of the problem
@zeykis7369
Nanite is just a denser type of LOD, that actively increases and decreases the tri count of high detail models. It has nothing to do with visual clarity.
@@benjaminzarkhin1293 mb I was talking about the performance side of things and non optimized meshes
Unreal Engine is the poster child of devs relying on these smart upscaling technologies instead of properly optimizing their games. But this is largely because developers have begun to rely on pathtracing tech. These chipsets weren't really ready for raytracing when they first came out and they still basically aren't.
And that is why all my videos online are just film reels I mail to people, absolutely no compression.
"Better looking visuals" nah
When doing proper AI video upscaling, you treat every shot as a one-off. You don't interpret over keyframes, because AI may decide that the clip before is a continuation of the clip after, which will pollute data. The longer you have consistent data on screen, the more AI can accumulate and then reproject back onto footage to achieve higher resolution. A slow moving shot with consistent lighting and a long duration will upres well. Something where lights are flashing for a few seconds with tons of motion blur won't resolve much, if AI has nothing to grab onto. More input data means better result and it varies shot to shot. You spend hours doing it the right way and for something like this to work in real time, there would have to be a lot of pre-processing. A lot of watts wasted to get something that may not even look right half the time. It can work on autopilot, but not guaranteed.
Anyways... good video. Covers all the important points.
Yeah, it probably would need manual review to work right now. I don't know how useful AI upscaling would be right now though.
That "video thumbnail" is the most disturbing thing I've seen since my grandmother left the bathroom door open I don't think I will be able to watch this video "until" my meds show up. Please stand by.
but unreal engine 5 and upscaling have made performance worse, while as you said the gpu performance uplifts are not keeping up because they come at an increased price.
There are videos breaking down the gpu pipeline in ue5 games and pretty much all of them use expensive features like lumen and nanite to make the scene look marginally better at a huge performance decrease while at the same time being way cheaper money wise, and those cost savings are neither used to expand the scope/quality of the game nor are they passed down to consumers or game developers as their wages fall behind inflation and the prices of games are vastly outpacing inflation.
Meanwhile the minimum spec hardware has been raised with every game faster than gpus are gaining performance despite upscaling being used aggressively (upscaling at 1080p should not be a thing at all) in the default settings since the days of dlss 2, which looked good, but still considerably worse than native
Hit the nail on the head
no upscaling doesnt make performance worse. also upscaling is really clever way to optimize, how would you even reach 4k details without upscaling? and if you add old tech msaa then your already low framerate would be 2x lower or worse
@@HenrichAchberger "how would you even reach 4k details without upscaling" optimizing so you can run 4k native (like doom and others)? the point isn't that upscaling by itself lowers performance but that the corners that it cuts leads to decreased performance also msaa isn't the only AA and TAA doesn't make the image better as opposed to other AA (even FXAA) it just smears pixels round edges and makes the image a blurry shit show. The use of lumen and more importantly nanite absolutely destroys performance due to overdraw and unnecessary computations instead of just optimizing a single goddamn mesh. That's why UE5 games have lower quality graphics and performance than previous gen games AND need upscaling (sometimes you don't even have the gpu overhead for it to have a performance uplift because of nanite and lumen). UE5 is not the only offender though has could be seen with mhwilds that needs frame generation in 1080p to run at MINIMUM 60fps on a 30-series card (even on a 3080ti which i'm sorry but it's not old by any means)
There are multiple vidoes going into fine detail showing how poor modern optimization has been.
Upscaling isn't soley to blame, nor is the familiarity with 4 year old consoles, and if hardware prices are an issue then optimization being sacked for cheap & automatic checkboxes isn't helping.
The main issue with "too low bitrate" is that videos are not properly encoded with a quality setting on youtube, but with a hard bandwidth cap, which is forcing the quality lower. Meaning that normally the bandwidth would spike for complex scenes as necessary, to maintain an equal quality, but TH-cam encoder limits the maximum amount a second of video can use, artificially limiting the available bandwidth.
This together with the lack of motion blur in many games (and action shot style cameras) and the codec has to drop details, which looks very unnatural, blurry and blocky.
The simple solution is usually to artificially add motion blur to the videos, even if your camera can't shoot with a slow enough shutter speed due to a lack of a variable aperture with software. This drops the "harsh" details on moving objects and thus the codec can encode this content better.
The same is true for video game shots, which is one of the reasons why many games use motion blur (which then streamers turn off, because who the fuck cares about viewing quality, right?) :)
And there's no reason why FSR/DLSS shouldn't work on videos, but the main issue is that it needs to be trained differently - on bad video encodings instead of low resolution renderings - so it can properly mask the artifacts of bad encodings.. It also needs to consume the encoded video stream, to extract the motion vectors and the images at the same time, to lower the workload.
Why would the codec have an easier time compressing videos with motion blur?
Most people turn off motion blur because its just bad (me included), and i just say that in the sense that no one wants it most of the time, aside from some cinematic scenes or racing games
@@cube2foxSame reason as the blue sky example. Motion blur makes things less detailed, so it's easier for the encoder to process
@@blanaobla Or to make heavy 30 FPS games smoother, less framey looking.
It’s most likely a quality setting (CRF-ish?) COMBINED with a bitrate cap rather than either/or.
Not using quality based encoding means stationary videos (like those Music uploads that consist of just the album art) will use the same bandwidth as a live action video, which is bad.
Not using a bitrate cap means that when people override the quality settings to 1080p and it plays most of the video fine until it hits a complex scene that overwhelms their internet connection, they will complain and blame TH-cam.
great video, but what about frame generation?
Bruh what do you mean outrage... Devs themselves tell people to "just use DLSS" when people are mad about terrible performance and UE5
Even BluRay's are compressed a lot. An actual film could be around 20TB in size in raw images. For cinema they compress them to DCP format which for a 2k film is around 250GB in size. You can try and upscale video which all do anyway if you're resolution doesn't match the file resolution or downscale, some are better then others but there are deminishing returns that just make the image look worse. Subtle upscalers and downscalers are best I find. MPC Video renderer is very good. But really bitrate is kind, resolution isn't all that important at all. You can only upscale so much, if th einformation isn't there to start with then you can never get it back so having a higher bitrate file to start with is way more impaortant even more so when upscaling.
TH-cam bitrate has noticeably reduced over the past couple of years. I'm pretty sure it's when they introduced their premium bitrate setting you have to pay for.
Man, I haven't read "2k" in the actual cinema context for so long, thanks.
But I think you are little harsh to say Blurays are compressed hard. They are compressed reasonably, I think.
Streaming platforms cut information even harder, because most people don't pay attention.
@@tteqhu Oh yea BluRays are perfectly fine. I understand you need to compress a 20TB film otherwise we would need a lot of BluRay discs just to watch a film. Some BluRays are better then others though, some 4k BluRays are compressed way too much or they use way too much DNR or other techniques. Other BluRays are great.
hey this is the first time seeing your face man love your vids
Oh, so I see you are familiar with the videos by Threat Interactive. Should we expect a video from you about TAA, De-noising, rendering "tricks" game studios used to do and etc? Seems to be right up your alley
@@2kliksphilip Yea, I think I'm on the same camp as you. I don't mind.
Btw, I think you read my comment the other way around - they used to do trickery, now they try and "model reality" (raytracing, pbr, RTreflections) and so they need to denoise the image for it to be "acceptable",
I do think some things were lost from a different time of game development (crowbcat's left 4 dead video to me is the greatest example of how a "smaller" team of devs intimately familiar with the engine and game could do much better than blockbuster AAAA modern productions). But wrt graphics, idk. Could games look as good as they do with lesser GPU requirements? Sure. Is it worth it to spend resources on it, given that AMD and Nvidia (and sure, intel) provide solutions to some problems? idk. Probably not.
Is any of all of this actual "laziness"? Absolutely not, just good old cost benefit analysis.
Threat interactive's videos about game optmisation are still very interesting to watch, and remind me of that video of yours where some dev (a great dev, I don't mean as an insult, only as to explain "not a huge game studio") implemented that frame permanence thing to keep rendering an image and distorting it based on movement, so as not to freeze the frame and have people in VR get sick when the GPU couldn't keep up - or something like that, memory isn't perfect.
@@2kliksphilip Sort of, iirc they clickbait other things and say TAA is good, the other things are evil, but I mignt be misremembering, ofc. But it is also clickbait, etc.
@@2kliksphilip Man I thought of this too. So many mass viewpoints you can just guess nowadays from just a few reddit comments around any topic.
@@prgnify I tried to watch a few Threat Interactive videos, and it seemed far more like a marketing ploy, that or some people that haven't really worked in the industry, having their own little pet peeve with certain techniques.
Personally, I couldn't take the smugness, their holier than thou attitude.
@@SioxerNikita Yeah, it is marketing, for sure. But I disagree it is a huge ploy or something. By looking at it, its some kid who had a pet peeve with certain techniques and got too emotional.
Some of what they show I can "see their vision" so to speak, with some dithering and stuff like that having to be smoothed out.
But completely agree with the smugness and saviour complex. I guess it is to be expected from teenagers who think they know everything there is to know. But it being expected doesn't mean we have to take it, lol. Very difficult to watch in anything but 3x speed just to get to see the images and think for myself - everything is very cringy.
great video. love the comfy hobo look
Halfway through the video I suddenly realised how identical to me you look, with a giant fleece thing on and a cup with a bombilla sticking out of it right next to you. I erupted in laughter
Caboosing into higher quality videos is what the future holds
No way, I was literally configuring my Upscaler settings in cyberpunk, well played Philip.
That video of Philip dancing around... I feel as though I had just seen it yesterday. But yesterday, I was mad...
Another great video as always, but I have one question. WHAT'S IN THE CUP?!
I'd just like to bring a little extra detail to the explanations here. Upscaling in games _does_ use temporal accumulation (the spreading computation over several frames trick explained in the video) because better data in = better data out. However, the core of the technology is the ability to just straight-up guess what should have been in the new pixels without actually rendering them at all.
Additionally, there _is_ actually quite advanced and capable video super-resolution tech out there. It just isn't yet real-time (even with a beefy GPU) and can only be run by grabbing some research team's github repository and stumbling through getting it to run.
You can upscale videos and streamed media with Lossless Frame Scaling, though. You can even use Frame Generation on them, up to 2X. The 3X and 4X options aren't stable enough without introducing noticeable vibrating artifacts due to the lack of motion vectors.
How would one go about achieving this?
@@keppycsThere is a paid tool for this on steam, dont remember name sadly.
It was developed by some guys to enable dlss on any game, but ultimately ended up working even on other visual media, like youtube
Erm, I don't think it means what you think it means. You cannot losslessly scale videos. Kind of a thing with video.
@@SioxerNikita "Lossless scaling" is a name of the program. It is distributed on steam, made for games but also works with media. Most probably, they meant it figuratively
@@ArchiXBelidercene Yeah, I looked it up, and saw that, and lossless scaling is pretty much intended to scale up in powers of 2, to keep the scaling perfect, if I read correctly.
But yeah, Lossless Frame Scaling for video and streamed media is something that doesn't exist XD
transition from thumbnail to video was clean
The video bitrate explanation is brilliant.
the blue sky vs grass part was super duper cool!
Hey man i hope u r reading this comment. I always had two questions and both of them have almost no resources and answers on internet. First is what u somewhat answered in this video. Like how regular screens or tv can upscale videos so better Like when u play 1080p video on 4k display in fullscreen then upscaling is definetly happening and it's so good while dlss and fsr struggle to upscale games. Then I want to know the difference between MEMC and Frame generation in games. because MEMC adds those extra frames in realtime in video on tv which looks very good and here also in games FG struggles.
The upscaling that TVs use isn't anything special and certainly isn't close to DLSS in quality, so I disagree with your sentiment that games struggle with upscaling while TVs don't. And the same with the frame generation aspect- it's more challenging for games to do because added latency is an issue with games in a way that a delay with smoothing motion on TV screens isn't. When only watching content, a delayed display doesn't matter so much. Games typically have more overlaid HUD elements which makes it harder to smooth the motion without inside knowledge of the game engine to know which bits to ignore
Another thing to mention, is that even before the compression algorithm there's chroma subsampling for almost all videos that aren't intermediary files.
So, videos literally have half of the data cut off and reconstructed based on 1 sample of color per 4 samples of luminance.
Maybe it would actually be first thing to upscale? At least when video features text there's one artifact that can probably be reasonably fixed.
Hey Philip! Do you still make your videos subtitles by feeding your script to that program that puts the words at the correct intervals? I'm amazed that after all this time you still have such good captions on your videos. However some of the funny bits you used to do like at 6:47 with "a helluva" are not here anymore, or random typing. Thank you for still providing us with nice and high quality videos!
Also, at 2:08 makes me remember that video about MSA, where you were saying that MSAA x8 actually takes 9 samples, but no, not really, it is 8 :D
@@2kliksphilip He is talking about your video "The Finest Pixels for CS:GO - Antialiasing"
The interesting thing is, physical film might be possible to upscale with similar to dlss. Might is the key word there. But gate weave (every frame is slightly differently aligned due to the mechanics of the camera) as well as the fact that the grain/crystal structure of film is irregular, it means every fame is slightly different even if nothing in front of the camera changes. If the differences could be accounted for (no doubt an astronomically difficult task), it might be possible to temporally upscale. Though in the case of well preserved high quality film stock, the benefits would most likely be undetectable under normal viewing conditions. Additionally, film grain artifacts would probably be difficult if not impossible to work around and high quality film is already so sharp that scanning it in its true quality is debatably impossible due to the density and the irregular shape of the crystals. For home movies shot on physically smaller film, like 8mm, those might be possible to enhance. But aside from high quality scans and restoration, I think "enhancing" physical film is a pseudoscience. Stuff like trying to entirely remove film grain, or otherwise molest the quality of the film. Look up recent James Cameron "remasters," they look like trash. Removing the charm of the medium of film undermines the artistry, technology and culture the film emerged from. Not to discourage the idea though, a technology like that could be beneficial for film restoration, but I don't have much faith based on recent official attempts at AI remastering of film.
thats an amazing video! Thanks for making :)
I still avoid upscalers wherever possible, the artifacts, the blurriness, etc. depending on which upscaler used in games... they all come with obvious downsides and rather have less FPS, but those are at least not bad.
Seeing Philip laugh at himself saying "a helelevalot" brings me joy
Really loved that blue-sky-same-video example.
Maybe TH-cam should drop having 9+ quality levels and just stick to 480p, 1080p and 4k at higher bitrates.
Frame 1 is as gorgeous as ever :D Love it.
For something similar with video, if you can track and stabilize a moving sequence of frames with sub-pixel precision, you can then sample these together on a larger canvas to upscale it. This is the principle behind super-resolution/pixel shift technology in some image editors and cameras respectively, as well as (as I understand it, at least) NASA’s drizzle stacking in astrophotography.
Your explanation of how DLSS works sounds very similar in that it sounds like it samples the same area across a subdivided pixel.
You should check out the Threat Initiative vids about game performance. Showing all the issues with DLSS and its seeming reliance on TAA.
In general, one can do super-resolution on video with pretty nice results. Please do note that your sub-pixel sample measurement doesn't have to be from near frame. One can do match similar parts of image using ORB/SURF/SIFT (affine-transform-oblivious descriptor). Then you can do weighted average of such measurements. And indeed doing super-resolution on video requires some filtering for edge artifacts. But doing filtering with wavelet transform does work pretty nice for removing such edge artifacts. Using something like FSR or DLSS. No AI needed. I speak that from experience since I worked on optimizing upscaler like that more than half decade ago. Problems you've pointed out are pretty much solvable without AI. If you don't have to do it with real time and smallish delay, ofc.
Another excellent one, Philip!
Hey phil, i found questions nobody is asking you atm.
Would like another shot at that interview. You know a wild amount of video information.
If you go to the Nvidia control panel on Windows, you can enable video upscaling. You might have to change some settings depending on the browser you're using, but I've found it can work pretty well on 1080p videos to 4k, and even works on stuff like Zoom and Discord video calls.
there AI upscale for video for mpv but nvidia dont even care to import the filter
AVI is not a video compression standard; it is a container format like mp4 or mkv.
Fun fact, the "Auto" resolution may not always be displaying the resolution it is saying. This is because if it struggles to get the video it will downscale it until it can handle it better, though it may not always go back to the resolution you want. When you set the resolution manually it gets a lot closer to the one you actually want (it may loose some detail due to the lossy compession), and in this case it will prefer pausing the video (buffering) over lowering the resolution. Sadly on the web version of TH-cam you can't set a default resolution and you always have to switch over to not get the "Auto" one. On mobile you could select a specific default resolution, but now you can only select based on how much data you are willing to spend on viewing the video rather than the actual quality, so this is more like a restricting version of "Auto" where it tries to stay within some range rather than going all the way down if it feels like it.
Frame generation is possible using an app called lossless scaling you can watch a TH-cam video at 120 FPS (max)
Its different kind of experience .