Is DLSS Ruining Games?
- Published Jun 5, 2024
- A growing trend with games these days is poor optimization. So many fall victim to bad performance scaling, bugs, or steep VRAM requirements. It feels really bad when a game like "Remnant II" releases and the performance scales so poorly that the developers themselves said they INTEND players to use upscalers like DLSS, FSR, and XeSS to play the game properly. Something must be going wrong right now. IS DLSS RUINING GAMES?
==JOIN THE DISCORD!==
/ discord
My Spotify:
open.spotify.com/artist/3Xulq...
Daniel Owen: • Your PC isn't ready fo...
Jay: • I'm going to show you ...
Nvidia: • What’s the Latest? DLS...
Nvidia Game Developer: • Viewport UI and Bluepr...
wccftech.com/remnant-ii-devs-...
store.steampowered.com/app/12...
store.steampowered.com/news/a...
www.techpowerup.com/gpu-specs...
==WINDOWS AT A HUGE DISCOUNT!==
Windows 10 pro ($15): biitt.ly/TFR1G
Windows 11 pro($21): biitt.ly/U9jCZ
Code: "vex" for 25% off!
0:00- Remnant II falls victim
1:01- Cyberpunk & Last of Us
2:12- The good side of using Upscalers
3:18- Using Upscalers to overcome optimization
4:30- What's wrong with relying on Upscalers
5:20- Game tech outpacing GPUs
7:18- Do Devs care?
9:02- Hope this trend doesn't continue
FSR/DLSS should be a nice little bonus to boost your already solid framerate; it shouldn't be required to get the game to work right
But FSR 2.1 (I don't know about DLSS) additionally provides temporal anti-aliasing. And considering that TAA will always be a bit blurry, FSR holds up incredibly well (in quality mode), while also providing a modest performance boost. It also quite effectively removes aliasing from Hairworks in TW3, so you don't have to rely on MSAA, which makes it actually worth using.
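For context on what "quality mode" actually renders: AMD publishes per-axis scale factors for FSR 2 (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x), which make the internal resolution easy to compute. A quick illustrative sketch — the function name and mode strings here are made up, not from any SDK:

```python
# Per-axis render-scale factors AMD documents for FSR 2 quality modes.
SCALE = {
    "quality": 1.5,
    "balanced": 1.7,
    "performance": 2.0,
    "ultra_performance": 3.0,
}

def internal_resolution(output_w, output_h, mode="quality"):
    """Return the (width, height) the game actually renders at
    before the upscaler reconstructs the output image."""
    factor = SCALE[mode]
    return int(output_w / factor), int(output_h / factor)
```

So 4K output in Quality mode renders internally at 2560x1440, and Performance mode drops that to 1920x1080 — which is why the reconstruction quality matters so much.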
Not right now, but when every computer has the capability to do AI upscaling, it will definitely be the next step in making really good-looking games by developing with it as the base.
I agree with number 1. With games becoming too powerful graphically.
@@Elite7555 another weirdo that likes TAA
@@bravish I think so too, and by that point, native resolution's only purpose will be to take close-up screenshots within the game.
Main issue here is that studios are putting too much emphasis on graphics and not worrying about overall quality. Graphics look real nice in your trailers, but they don't replace the fun factor (which is definitely affected if you need to buy a powerful card to play and still need tricks to fight FPS drops)
Sadly better graphics and in-game bells and whistles sell.
True except dlss doesn't even make the game look much better for the hardware requirements. It will get scrapped
And then when a big developer dares to make a game that doesn't conform to the graphical fidelity goose chase, people still bitch about it. Hopefully Fromsoft never listens to these dumbasses.
Literally been an issue for 25+ years (yes I'm old). Obviously upscalers etc didn't exist in the days of BUILD, Quake (1-3) engines and everything from the PC "golden age" c1994 to 2004, but we've always _always_ had issues with studios putting graphics above gameplay. For that matter, there's also _always_ been issues with graphical advances outpacing hardware -- at least for the majority of people. When Quake came out back in the '90s, the vast majority of gamers didn't have graphical acceleration at all, and the ones who did were forced to run the game at low settings and in a reduced screen window. The 2000s to mid 2010s were the happy medium of cheap graphics tech and engines that ran great and looked great... But now we seem back to the late '90s-early '00s era of lazy devs and engines outpacing affordable tech once again.
Just saying, from someone who's been gaming since the goddam 1980s, this is nothing new.
proven by how the publishers and what not have collectively said "don't expect all games to be this good" when baldur's gate III came out
I've been saying this for a while now. Dying Light 2 was the first game that really made me notice how bad the issue was getting with games seeming to require DLSS to play at a decent framerate. Upscaling is an awesome technology to squeeze more performance out of older hardware, or to make something like Ray Tracing viable in games for a more modern card. But it seems like it's being used as a crutch to just not bother optimizing games anymore.
Yep same here. DL2 has really subpar performance and not much going on to justify it.
Except it does not automatically make devs extremely lazy. They cannot get away with just FSR/DLSS alone and zero game optimization.
Took you a while huh? I've noticed this issue post-Crysis.
DL2 was my first experience with this too. I had a 1080ti (which even now is still quite powerful), and it couldn't hold 60 FPS @ minimum settings on a resolution I had no problems running other demanding games on (I should note my peak was like 43 fps while I stood still and stared at the ground in the intro sequence lmao).
Modern software developers have managed, through sheer incompetence and inertia, to render irrelevant multiple magnitudes of hardware improvements over the past 30 years. I wonder how much electricity is wasted each year because of bad code.
I have a feeling we’ll see a lot more games like this
60fps at 4K DLSS Performance with a 3080 and 120+fps at 4K DLSS Performance with a 4090 does seem to be what new games are being optimized for. Makes sense with how many TVs support 4K 120Hz now; there have got to be way more people buying 4K 120Hz TVs than 1440p/1080p monitors in 2023.
Except FSR and DLSS are not the problem. Just because they exist does not mean it's a free pass to 100% skip optimization, and it does not automatically make devs extremely lazy. They cannot get away with just FSR/DLSS alone and zero game optimization.
@@Rairosu they are literally skipping optimization, bro, as we have seen in recent games
Just look at cyberpunks phantom liberty update and starfield.
This is BS.
The way the gaming industry has gone the less and less I want to play games anymore.
@@NoiceBOB Starfield isn't even going to have DLSS at all on release, please do research first
Yes. This is exactly what we said would happen when DLSS launched. Devs using it as an excuse to not optimize their games.
It probably doesn't help that DLSS likely requires its own optimization pipeline to look halfway decent.
Where's the proof?
With anything Nvidia, expect deceptive tactics; the FPS obsession was the start of the industry messing with your mind.
I always played at 60-70fps and am one of the best players in LoL, and in Battlefield back when I played a while ago.
exactly what we knew would happen
@@Bustermachine You're confusing DLSS with TAA. TAA needs to be run within the pipeline to look decent. DLSS and FSR just need TAA to work and are easily implemented.
relying on DLSS to optimize your game is like using glue to hold your submarine together at depth.
have you heard of a little submarine called "Titan"?
PCs are now doing what consoles have been doing since the Xbox360/PS3 era: running in low resolutions and trying to act like they're not. Most of that old console era was 720p, even though all our tvs and systems were 1080p - and with xbox one/ps4, it was the same thing, a lot of "adaptive resolution". It's going to suck if this starts to be more commonplace in PC games. I remember being happy that my little 2060 had DLSS, as I figured it would give me a bit more oomph, but it's been a mess in most games that bring it on board, who only bring it to cover problems and not to help broke ass PC gamers like me lol
@@PuncherOfWomenAndMinorities that's a new one. you got the joke but you didn't get it
or reuse condoms
And we all know how that turned out
i always feared DLSS and Nanite because despite being pretty cool tech wise, it was an extremely easy excuse big companies could use to completely ignore game optimisation more than they already do.
You don't know what Nanite is
Nanite IS optimization
What Nanite has to do with it? The thing is a real optimization for geometrical objects
Welcome to the next generation.
I remember when everyone was "forced" to upgrade their machines when everything was suddenly created for 3d video cards
And people cried just as much about being "forced" to upgrade.
Optimization is one of the last things you do in game development, so if it's not getting the attention it needs, that could be because of time pressure. Devs might *want* to have time to optimize, but when looking at potential steps that could be condensed, it would be an obvious candidate given how much it can be "cheated" using upscaling. I suspect this may be what happened to Jedi: Survivor
I've actually heard from people who code, and a few experienced developers note they learned at some point that you should optimize early.
If you don't, you have to go back through the whole project optimizing everything, when you could have built it from the ground up on a framework of proper optimization.
It can make sense that while learning a new engine or gaming device that the bloat comes in as mistakes during the learning process.
But eventually, general experience along especially with experience with the engines and platforms should lead to teams knowing what they are doing at project inception. You would see this progression with older versions of Windows and game consoles; advanced games that ran better than the previous ones on those systems.
UE5 just came out last year, PlayStation 5 in late 2020.
(Maybe Xbox series x is a big technical change, I just wouldn't be surprised if it wasn't that much of a difference compared to xbox one and windows 10..and now 11. And the switch was built on older tech even when released around 6 years ago now. But the ps5 is the popular thing folks have to work around now)
I would say it hasn't been enough time, but that would be excusing all the bad launches throughout the 2010s.
It appears to be that greater graphical fidelity is demanded even when that same level of fidelity takes twice or four times the effort and performance, while giving diminishing returns in terms of actual satisfaction when....so many popular games played online with others are years and years old now and run on potatoes and toasters.
Do they just have to keep hiring more and more new people to make it all happen on a tight schedule? Well we know the answer to that already, yes, it's yes.
So it seems that AAA devs are in a perpetual cycle of learning how to do things and by the time they've built up experience, something new comes along that is even more expensive both computationally and effort/logistics wise, along with studios closing down and people constantly shuffled and....
I suspect they just never get to that point of competence. And if they were to spend the money to somehow do that on time.... maybe they wouldn't profit?
We could just use 10 years with no new additions to the whole process....just, take a break and work things through, straighten the kinks out, and see the quality rise back up before moving forward.
After all, it's not like anyone is demanding 8K, and surely we've finally hit the top mark and into the real area of diminishing Graphical returns?
Right....right?!?!
You are wrong; we must develop with optimization in mind from the beginning. If you schedule a three-day development requirement that "just works" and nothing else, you will have a lot of work rewriting your code later. I am a software developer, and that is how it goes: think first about the performance of your code.
Consumers buying a game in alpha state: "Wtf this game is poorly optimized..."
Devs: "..."
@@chillinchum For sure, people always expect too much from the beginning of a new generation. There is a sweet spot, though. 2009-2011 is my favorite example when it felt like devs really had a solid handle on the current tech *just* before new hardware rendered their progress meaningless. And then there's MGSV and the venerable Fox Engine, which seemingly blows a massive hole into this theory
I refuse to believe that new games are "out pacing" the power of gpu's. They do not look much better than they used to look. Devs are just becoming more and more lazy about optimization.
Yeah, the graphics are the same, and not even that much more realistic.
Yeah I mean, older games that look incredible came out during the RTX 2000 series, or hell, even the GTX 1000 series. Even the best-looking games these days don't really hold a candle to Red Dead Redemption 2, which came out on the PS4/Xbone — hardware significantly less powerful than the best PC hardware at the time — and it still ran at a very solid framerate.
Meanwhile, if you tried playing RDR2 on PC when it came out (and even now, really), it runs like dogshit on PC hardware that's multiple times more powerful than those old consoles...
No matter how much power you have, if the devs didn't properly optimize their game, it won't matter. Look at the original Dark Souls 1, for example: that thing brought even RTX 2080s to their knees in Blighttown, despite looking like an early PS3/Xbox 360 title.
It's true that this generation of graphics cards has been absolute dogwater though. The RTX 3060 Ti outperforming its successor, the 4060 Ti, is fucking pathetic and laughable, and Nvidia should be ridiculed for it at every opportunity. Literally the only card that offers significant gains on its predecessor is the $1000+ 4090, which is bullshit.
One big culprit is ultra HD textures, which players will not even notice until they're hugging an object. You see this problem with Hogwarts Legacy and the RE4 remake: VRAM is gobbled up storing textures to an insane degree. Turning textures down actually fixes performance while barely changing visuals. In Hogwarts in particular, you will not notice a difference between ultra-high and arguably even medium texture settings.
@@WelcomeToDERPLAND I played dark souls remastered on 1660 TI laptop and an RTX2080 and I can’t ever remember frame dips or stuttering on either system. 1660ti was 1080p max settings and 2080 was 1440p max settings. I definitely had to turn down settings with my 970m laptop to get 60fps at 1080p though, ps3 and 360 ran that game at 30fps 720p, with frequent dips into slideshow territory.
@@MechAdv Key word: Remastered
The Remaster fixes most of the performance issues.
The Prepare to Die edition is what kills fps, specifically in Blighttown.
Don't buy a game at launch; buy it when it works properly, i.e. when all the issues are ironed out. Don't be afraid to skip a game if it launches in a poor state
My rule of thumb is 1.5 years after a single-player title releases. That's usually enough time for all the DLC to come out and the issues to be worked out. MP? Well, that's a shitshow, because you've got to get in early on MP before the population drops, which it almost inevitably does.
I was so close to pre-ordering Remnant 2 because of how much I liked the first game and how well it ran. But I'm glad I backed out after the early-access benchmarks came out; then I went to the subreddit, and after seeing that post from the devs my fears were confirmed.
Tell that to all the dumb masses still pre-ordering xD
That's what i did with Elden Ring, it was running poorly for people at launch but right now after multiple patches it's running damn nice at 1440p/high/60fps with very small rare drops.
@@Dregomz02 Ye Elden Ring was the first game that I ever purchased and played Day 1 and I got burned. Waited a couple months and played it when it ran well. Never again though lol
The main problem is devs not optimizing for DX12. In some cases, devs are just taking their game, running it in DX12, but then converting those calls to DX11, which hits your performance even more.
It's sad that the entire software industry in recent years has been more concerned with taking shortcuts and pumping out content than with optimizing its software to run better.
This!
this is not how graphics API work. they are too different.
@@DiogoManteu I'd advise you to do your research. They aren't.
Hell, Intel's GPUs do this all the time by converting older DX calls to 12. That's why recent Intel GPUs don't perform as well on older DX versions.
@@silvy7394 that's not on the game or engine programmer.
@@DiogoManteu It is if they're so fucking lazy they won't convert their game to 12 and instead just translate 11 calls to 12. Or they don't develop for 12 and kill performance on cards like Intel's.
Get out of here 🤡🤡. You dont know shit.
I think the biggest problem in a lot of these latest releases is Unreal Engine 5 itself. For games that are available in 4 and 5, there is a hugely noticeable increase in performance hitching, memory requirements, and required drive speeds. Unreal Engine 5 has actively discouraged optimization in its marketing, and even with it, games seem to run worse.
Mordhau went from UE4 to UE4.5 and the game became a fair bit smoother fps-wise when it happened, at least on my power gamer AMD rig from 2+ years ago, big battles with loads of real players still puts a big burden on my cpu however due to the insane numbers of computations but still.
Doesn't UE5 have better performance at the same quality level?
@@arcticfox037 its all about how its used and who's working on it, until the engine automates literally everything and auto optimises it, which is still a decade or more away, maybe ue7 or 8, competent devs will still need to work on polish and optimisation.
@@arcticfox037 UE5 without Nanite or Lumen performs almost exactly the same as UE4; it's when you turn those more expensive features on that the cost is felt. They are more expensive, but there's no limit to what they can do when used, so you get more capability from Lumen and Nanite in exchange.
The GPU race has turned into the best "crutches" race.
not the gpu manufacturers fault.
the game devs are using cool features like dlss & fsr as an excuse to not properly optimize their games.
Like a lot of people said they would when it first got announced.
@@hypnotiq Nvidia is also guilty of slacking off because of DLSS; the 4060 is a garbage graphics card because it literally depends on DLSS to get good fps
@@hypnotiq I disagree, the pipeline for GPUs, graphics engines, and game development, is too interlinked to say that the problem sits solely in the lap of any one of the three. They're all to blame.
@@Bustermachine also very true
@@hypnotiq its a mix of both. nvidia and game devs are both at fault
Devs need to optimize their shit better than let gpu upscaling manage it
Facts
Amazing how we’re blaming hardware for poor quality games
@@Ober1kenobi Are you trying to say the hardware that's been put out recently is good and up to standard? In my opinion it is very much a case of both groups being in the wrong.
I've been saying this forever
People with 8GB videocards, and PC gamers in general, need to lower their expectations of console ports.
Making a game is extremely hard work, and to accuse game developers of being lazy is just honestly completely ignorant.
The optimization argument is getting blown out of proportion. Would you rather wait three more months before it releases? If you want it to be better optimized, just wait three more months, and it will typically be much better optimized by then, if it wasn't well optimized to begin with.
It's much more important that the game is actually a good game than to have it super well optimized, and this looks to be a VERY good game.
Making a game really well optimized for PC can be very difficult for a game that's first developed for the PS5. Making sure that the game will run really well on the PC will give the game designers more restrictions, and will make it harder for them to optimize the game for the PS5.
@@thee-sportspantheon330 Hardware can only advance so much per generation. Nothing can be done about that. Game devs are choosing to add technology that GPUs are having trouble keeping up with, along with just making their games unoptimized. An RTX 4090 shouldn't have any trouble running any game 4K maxed out, but sadly, it does on a few "AAA" games. When you pay THAT much for a GPU, you shouldn't have to use upscaling technology just to get good FPS.
I personally think this is also an advertising issue - fancy graphics and tricks look and sound real good in trailers (and some gamers also get real obsessive about having big numbers,) but there they can just run it on a supercomputer and ignore the optimization. Then, actually making it run well on normal devices takes a lot more dev time than the studio will usually sign off on when they can just upscale and call it a day with relatively minimal impact on their profits.
Kind of a funny parallel with apples (red delicious in particular) - in the short term, people buy fruits in the store based on how they look and maybe feel, so stores select for fruit that looks pretty and stays fresh as long as possible. Problem is, with apples, that means they end up waxy, grainy, and bland when you actually eat them.
Except FSR and DLSS are not some dark-magic wizardry that makes optimization just magically work right out of the gate.
Game developer here - we are actually delighted to spend absurd amounts of time optimizing for performance; it's incredibly complex, deep, and satisfying. When you're talking about 'developers not caring', what you're really talking about is how deadlines are imposed by budgets and publishers. If you have 12 months to finish a milestone, optimization is done after the game is content-complete and a fun, fulfilling experience, and due to these external financial pressures optimization can sometimes be cut or delayed, particularly at a small studio. Delays can happen to make this work, but only if there's cash in the bank to pay for everyone's healthcare, food, and rent long enough to delay, which is rarely the case. Most deadlines set us up to fail from the start and are never enough time, because management and the higher-ups start from a date they would like to release and work backwards on a spreadsheet to assign the work. Management is also the group who would see something like DLSS and say to the team, 'well, why don't you just turn that on by default?'
Technologies like nanite and lumen aren't things you just turn on and see if they work on different hardware, they are complex codebases that apply differently to every single object, scene, and asset in the game, and typically are intertwined in a 100-step optimization process that affects every line of code and every type of hardware that could possibly be released on.
I've been working as a backend dev for more than a year now, and let me tell you, most of the time it's not about caring. It's more like: you can fix it, but your boss has other priorities, and even when those priorities are done, he won't pay you to optimize the game if it isn't too bad. Money is power nowadays (and has been for a very long time)
Yeah, I can definitely see that. I’m sure some game devs enjoy their jobs but at the end of an 8 hour shift or longer you just wanna get tf home, especially if the needed work is unpaid.
Yep. Optimization is not easy and it takes a lot of time. People want games quickly and those at the top want them released quickly. FSR/DLSS is being used as a way to get it playable sooner and out the door. If it sells well, it gets optimized and patched. If not, maybe it gets optimized down the road or maybe it doesn't.
Sadly, they don't get optimized in a patch. Often (for me anyway) patches end up making it even worse than launch.
@@michaeldeford1335 if it sells well without controversy over performance, why would a company spend money on optimization? (They won't.) It isn't used to get games out the door quicker so they can be patched later; it's just used as a crutch to get games out the door for the sake of an early release.
Same here and I've had similar experiences being a backend dev. Normally it isn't the lead dev/scrum master who wants to leave it out but someone up in middle/upper management who decides that the costs & effort outweigh the benefits even if the devs disagree. I've had countless arguments with these people on how much effort things take and they seem to think I'm purposefully putting way too much time down. They literally have no clue
The second upscaling technology went from an optional feature to increase FPS for users willing to compromise on certain visual features to mandatory to achieve even just 60 FPS, the industry started to lose its way. This is a horrible trend starting, and I'm worried this will be the standard going forward.
It's a total contradiction of what raytracing was supposed to add to the games.
Raytracing = better accuracy of shadows & reflections for "high picture quality", but lower fps,
DLSS = "worse picture quality" by reducing the accuracy of things like AA, AF, shadows, and draw distance, but higher fps.
@@kevinerbs2778 Modern AA was already removing details, with methods like TAA creating ghosting and blurring the whole picture. I honestly think DLSS and FSR are the best kind of anti aliasing at high resolutions, since MSAA is so heavy on the framerate and TAA doesn't look good imo.
@@meuhtalgear You don't even need AA at higher resolutions like 4K. Downscaling just makes your image look like dogshit; it's something you don't really want if you play at higher res
Cringe take. This happens because of console culture that is based on locked 30 or 60 fps. DLSS is not responsible for the developers picking one over the other, and in most cases still being unable to deliver.
@@evaone4286 the need for AA isn't bound to resolution alone but to screen size and resolution together (pixel density). A 7-inch 1080p screen doesn't need AA because your eyes can't make out the jagged edges.
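The pixel-density point can be made concrete: what matters for visible aliasing is pixels per inch, not resolution alone. A tiny illustrative calculation (not from the video):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch of a display: diagonal pixel count
    divided by the physical diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches
```

A 7-inch 1080p phone screen works out to roughly 315 PPI, while a 27-inch 1080p monitor is only about 82 PPI — nearly 4x coarser, which is why the monitor shows jaggies and the phone doesn't.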
Random question - what GPU performance monitoring software is Vex using in this video?
Late reply but thats MSI Afterburner
I have always stood against upscaling and frame generation because I knew this was where it was going to go. It mirrors exactly how overclocking has gone. Overclocking was once the best way to wring more performance out of cheap lower-end parts; today it is locked down and only available on high-end expensive parts with an extra surcharge tacked on. Same thing here: first upscaling was a nice boost, now it's the baseline, and next it will only be available on the high end for a surcharge. As soon as Nvidia is happy that their technology has cemented itself as the leader, it's getting stripped off the low-end parts. They made it clear this was the plan when the RTX 20 series came out and they made the "peasant grade" GTX 16 series to go with it: 20-series GPUs with the RT hardware stripped out.
This is what a comment looks like from someone who got his education from a kellogs box. And they're still looking for the toy at the bottom.
@@ravenpearce9849dudes completely right what are you on about
@@ANGER2077
I don't disagree with everything, however it's important to note that if we want raw smooth performance people will need to purchase more expensive hardware that often most either cannot afford or are not comfortable with the cost. DLSS and FSR isn't just a "fake frame" generator, but when you look close to the details they actually will also fill in the gaps where some animations (let's say character movement) without would have slightly choppy movement due to the time it takes for your hardware to output an image before the next frame. But with DLSS or FSR enabled, these "fake frames" actually end up filling in those gaps which 'can' provide a smoother user experience (depending how you have your game setup).
I was stating that with the scale of which games are growing, you will often either need one of two things to compensate. Either more power, so a better card that can handle the raw performance and provide a smooth user experience, or technology such as DLSS and FSR which can offer a significant improvement without the need to dish out extra funds.
You have the choice and can pick whatever poison you'd like. I'm not sure if you've noticed but there's an "Off" option too for these games that suggest the use of DLSS or FSR. If we want devs to magically wrangle up performance in a title that may require more without these features, sorry but your experience won't be too great. But if you believe so you are more than welcome to go out there, become a game developer, and I encourage you to prove me wrong.
DLSS was originally meant to exist as a crutch to make ray-traced games performant enough to be playable. The fact that we have to use it for rasterized games nowadays is... annoying.
to be fair. unreal engine 5 nanite is incredible technology (zero pop in)
@@legendp2011 to be fair, maybe nanite isn't worth it then if it makes the game unplayable without upscaling technologies.
@@thelastgalvanizer up to the developers. I personally find pop in is more distracting than modern DLSS, but I can understand it's a personal preference.
using nanite also probably reduced the budget (helpful for a small team with limited resources).
Without Nanite, a team would have to create 5x more objects to manage LOD transitions (and every time they make a new object, they then have to create all the different LOD models, which slows down development).
Nanite doesn't just fix pop-in; it also saves a huge amount of time and resources for developers. I can understand why they used it.
@@legendp2011 For the record, LOD creation is practically automatic, and the number of people who would be able to tell the difference between Nanite and good LOD levels is... literally zero. You can't, since LOD levels are determined by distance from the camera.
Nanite is kind of an enigma. It can be more costly than traditional LODs when it comes to less complex scenes. But when geometric density, and especially object density reaches a certain level, it can end up being cheaper and more performant. It really comes down to use cases and optimization. And surely there could be implemented a kind of quality scaling setting for nanite that allows for more or fewer polygons overall, or for fewer polygons in the distance. It could be like standard quality options we have in traditional games. Why does it have to be so locked down?
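That kind of distance-based quality knob is easy to picture. A hypothetical sketch of traditional LOD selection with a user-facing quality multiplier — all names and thresholds here are invented for illustration, not from any engine:

```python
def select_lod(distance, lod_distances=(10, 30, 80, 200), quality=1.0):
    """Pick an LOD index from camera distance; a higher index means a
    coarser mesh with fewer polygons. `quality` > 1 pushes each
    transition farther away, keeping detailed meshes on screen longer."""
    for lod, threshold in enumerate(lod_distances):
        if distance < threshold * quality:
            return lod
    return len(lod_distances)  # coarsest mesh beyond the last threshold
```

A "quality" slider like the one the comment proposes would just scale these thresholds, trading polygon count in the distance for framerate, the same way traditional view-distance settings already do.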
1) I think many, but not all, devs are relying on upscalers (DLSS / XeSS / FSR) to get a new game into some sort of playable state
2) Investors / shareholders / management pressure teams to have a title out by a certain timeframe regardless of the state the game is in, pushing devs not to optimise titles properly.
3) Games are now more about DLC / loot boxes / cash grabs to onsell features etc., and no longer just a game. It's a business with repeat turnover, and that's it.
4) The graphics hardly look different/better from a game released 10 years ago, yet we are getting worse and worse performance. Feels like collusion with GPU manufacturers, or terrible optimisation, to me.
short timeframes are the result of idiots still buying half-made games
You are one of the very few who acknowledges that the devs basically have to do what the greedy higher-ups tell them to. Most people and content creators only mention the devs, when in most cases they are doing what the publisher orders them to.
a lot of devs, I find, for some reason think that DLSS on Balanced looks just as good as native, so they probably assume there's no problem using it.
@@mryellow6918 those devs are dumb; they have no clue what game engines can handle and use way too many polys, effects, etc
WTF. I cannot believe I haven't subscribed yet. Subscribed now, yours is one of the best upcoming fresh channels on gaming that I want to keep up to date on!
hey just found ur videos, ur style is so seamless and smooth. if you had some chill lofi or sum goin in the background this would be a 10/10 vibe, would watch all day
Wait till the next gen when AMD won't make any high-end cards. You'll pay more for less, and be happy for DLSS 4 while Jensen Huang gets another leather jacket.
Paying more for less is better than "the more you spend, the more you save."
DLSS 3 is actually locked via software on the non-40-series GPUs; Nvidia is truly doing this only for pure gain.
Intel Battlemage had better be competitive. They are our last hope.
Either that or use a console lol. I'm so pissed that the Xbox Series S runs games better than my 3090 desktop.
The console literally only has 4 teraflops of performance!
@@vogonp4287 Intel is essentially a year behind Nvidia and AMD. Battlemage is going to be launching when RDNA 4 and Blackwell are coming out and it's aiming to compete with current gen not Nvidia's and AMD's next gen.
Intel won't be competing against Nvidia's high end either so Nvidia has even more of an incentive to do what they did with the 40 series outside of the 4090 and move all products down one tier while charging more because AMD isn't competing either. It's the same reason why Nvidia just cancelled the 4090 Ti. They have no reason to ever release it because their main competitors are nowhere near close to their level of performance.
Optimizing models, effects, and textures costs money and A LOT of time. I feel a lot of developers are pretty much skipping a huge portion of this process today, and also relying on different technologies to either automate it or at least make it easier, which sometimes makes for a suboptimal result. Hopefully technologies like Nanite for Unreal Engine will remedy some of the core issues that we have with game performance.
Unless we have the ability to delete the people that rush the devs into releasing a game, not gonna happen anytime soon lol.
Bro, I'm a software developer, and when I read about the various tactics game developers were using way back in the 90s/2000s I am literally blown away by the sheer ingenuity and complexity with which they ended up solving their problems.
I have NO QUESTION in my mind that modern day developers across the board…aren’t anywhere near as good in regards to writing Efficient Software 😂❤
We were spoiled lol
The problem with Nanite (and UE5 in general) is how baseline-heavy it is. Make a blank UE5 scene and look at how poor the framerate is. That said, it's like a freight train - you can load it up with stuff and it won't slow down.
Yeah, because the management side is only interested in something to show the investors for a given quarter, so they rush things to release. And of course the developers' priority, when being forced to release a game months or sometimes years ahead of schedule, is to get the game to an actually functional state. Unfortunately, stuff like optimization is usually one of the last steps of the whole process, and they don't even have time to finish everything else.
Upscaling technology isn't the problem here. The problem is capital getting in the way of art, a tale as old as time. Blaming upscaling tech is a weird take, because you know that we'd probably be in the same situation regardless, just with even less playable games.
UE5 has easy presets for a lot of things, so they never bother to optimize anything.
When you were talking about technologies like nanite and lumen being too much for the hardware, I've seen people benchmark nanite and it's supposed to *increase* performance. The comparisons I've seen have shown that even in sub-ideal cases nanite outperforms regular geometry handling. I don't know about lumen though.
Not just that. Years ago we used to render games at a higher resolution than our monitor and then downscale the image to gain clarity while still playing at good framerates. Now we are doing the opposite: we render games at lower resolutions and accept poorer clarity just to reach playable framerates.
Remnant 2 uses UE5's Nanite feature for the polygon detail, and it works based on the native resolution, meaning the more real resolution you throw at it, the harder Nanite has to work, which destroys performance, because it was always meant for a low-resolution output that's upscaled using one of the big 3/4 res upscalers.
So no, DLSS didn't ruin the game, UE5 did. If DLSS never existed, I think UE5 would still have been engineered this way, with its own upscaler called TSR.
But Nvidia did ruin the prices, using DLSS as an excuse.
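The resolution point in this thread is easy to quantify. The scale factors below match the commonly documented DLSS 2.x presets; exact values can vary per title and per upscaler, so treat them as illustrative:

```python
# Internal render resolution per upscaler quality preset. Since per-pixel work
# (and resolution-driven systems like Nanite) tracks the internal resolution,
# "4K Performance" shades only a quarter of the output pixels.
PRESET_SCALE = {
    "quality": 0.667,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Resolution the game actually renders before upscaling to out_w x out_h."""
    s = PRESET_SCALE[preset]
    return round(out_w * s), round(out_h * s)

def pixel_fraction(preset: str) -> float:
    """Fraction of native pixels shaded each frame under this preset."""
    return PRESET_SCALE[preset] ** 2

print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
print(pixel_fraction("performance"))                   # 0.25
```

That quarter-of-the-pixels workload is exactly why a resolution-scaled system like Nanite behaves so differently at true native output.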
I see zero difference, tbh, between the game with and without Nanite. If there's zero difference in other games too, we can pretty much consider this thing a useless gimmick, probably made only to make devs' work easier, at the cost of performance.
Nanite is just too power hungry for now. I think it is a great technology for the future, but right now it is too much. Unless the Remnant 2 devs didn't implement it efficiently, since Epic's demos looked quite good.
Lumen seems to be a better technology for current gen gaming.
@@Extreme96PL Yeah, devs should focus more on Lumen.
@@Extreme96PL Nope, there are definitely merits to using Nanite. Even though this was my first Nanite experience I could tell right away it looked better. There was little to no shimmering in distant objects, even at 1080p. Distant object detail was maintained so well I couldn't tell when an object was gaining or losing polygons, which I could notice easily in any other game because of their varying levels of LoD implementation. Like how when you walk far enough an object will suddenly be replaced with a lower-poly model - that doesn't happen with Nanite. So I legitimately couldn't tell when an object (and its shadows!) faded in or out of existence, and most importantly object pop-in was non-existent. If I looked past the horrible character models and animation, the graphics really did feel like a next-gen thing.
Do you not like unreal engine 5? ( I hate it, I think it's trash, & it makes developers lazy.)
I would imagine the devs do care; the real question is whether the suits they answer to care. Think of the bottom line and how much money you can save if you can just use the magic slider to fix the problems that would otherwise take more development, and therefore more time and money.
I like to think that the actual developers generally care about what they produce and that problems typically stem from higher up. Like the guy at a store who genuinely wants to provide proper service but the bossman thinks of it as a reduction in productivity because that energy could be used to stock more items.
As someone who's an upcoming game dev, I can confirm most people [devs] do care, we want our games to be played how we want them, sadly I agree that the people cracking the whip simply don't care, they don't see games as art they see it as a "product" as something to be done as quickly, cheaply and as profitably as possible
If it means there's a way to almost cheat performance into a game, I can guarantee they'll do it, because if there are corners to be cut, they'll cut them alright. It's the sad state of some modern games.
We need more flat organizations with less management, let the professionals be professional
@@Marsk1tty Flat organizations aren't the best; they mostly work in small groups. Once the groups get big enough, power dynamics and an invisible hierarchy get created.
It's good for indie projects, but the bigger the game gets, the more people are needed than a flat hierarchy can handle.
They don't. Modern devs only know copy and paste, and cry.
Imagine thinking DLSS or frame generation are "sliders"... DLSS/FSR upscaling reduces the resolution of parts of a frame that are barely noticeable (which almost doesn't affect the experience) to boost performance, which gains you a TON when gaming.
Your channel needs more visibility, your work is awesome
I think optimization is an ongoing process, but when the game is close to being finished, meaning all its gameplay and tech features are online, then there are only a few coders optimizing things, depending on team size and so on. The workload is a linear thing and not much can be done in parallel, not to mention identifying issues, bugs and such. Plus, the huge variability of hardware also makes things take long. In Remnant's case, there is the issue of working with an engine whose performance updates you waited and planned for, but it took them longer to make those. This is why games release the way they do nowadays.
DLSS was a great idea for VR. Using it to make the other frames needed for the second eye.
I don't think you'd want it in VR, as the artifacts would be more visible.
I knew this would come. The first games using DLSS actually had a good performance boost.
I hoped that we would jump a graphics generation and DLSS would make those games just playable, but I forgot that it's also possible for developers to use it to skip work.
It is actually embarrassing that games are so poorly optimized that I can't get a constant 144fps on my 4090. In Remnant, without DLSS, it's around 80-90, and most people are not enthusiast enough to invest in a 700€+ GPU. I have no clue which systems those devs actually test their stuff on.
Nice video, I was waiting for someone to bring this to light.
So overly positive reviews ignoring how this game simply CAN'T be played without upscaling.
I think it's a combination of meeting the higher-ups' demands and deadlines, and the card manufacturers drip-feeding barely better consumer hardware each year. It seems like the development hardware these games are made on is a decade ahead of our cards, so when it's time to get the games running on user machines, it just isn't happening. So in the meantime we get this.
All of the post processing effects in deferred rendering games add tremendous amounts of blur to the experience. From TSAA, to Upscaling algorithms, to whatever else they come up with to cheat poor performance, you get a sloppy picture. I already have blurry vision, i don't want my games to be blurry too! It's supersampling the Native resolution and no AA for me.
I think DLSS and FSR are the best antialiasing performance wise. At 4k the blur is negligible but ghosting is what annoys me the most.
FSR is the worst.
Not really. For DS, DLSS has better clarity than native res at 4K, and same for CP.
As a former developer I'd like to say there is a difference between the developers themselves and the higher-up managers. It takes time, and thus money, to optimize performance. Often the quick and easy route is taken,
and technologies like upscaling are just luring as a quick fix that saves money :(
Unfortunately the majority of people (and content creators) don't realize how this works behind the scenes, and they only mention the devs, when in reality the vast majority of the horrible anti consumer decisions are made by the publisher and the greedy higher ups.
As a former dev, you should know the problem is also object and pixel density being too high.
@@NeoShameMan True.
The need to compete with better graphics also causes pressure.
as a consumer that knows
my condolences
corporate greed f's us both
lol there are plenty of incompetent developers too, easy to use the higher ups as scapegoats...@@J.A.Z-TheMortal
I remember when Nvidia introduced Temporal Filtering into Rainbow Six Siege, and it was really helpful for getting my aging 550 Ti able to play the game at 1080p. Now here we are, FSR and DLSS being used to supplant proper optimization. Realistically though, I'd imagine it's being used as a stop gap between release jitters and post launch stabilization to get the release window tighter in a schedule. It's not ideal, but hopefully it's a better choice than game devs simply not caring.
What's the background music starting at 0:15 ?
IMO, if this is as bad as it seems, it's going to be a temporary problem with gaming. In the next few years, as hardware improvements slow down significantly and game developers start to realize it, they will eventually come back around and focus more on optimising their games. This could also suggest higher resolutions might drop in popularity a bit? Possibly?
on another topic about steamcharts, I hate that if you use upscaling on say 1440p it probably records it as running at 1440p when it really isn't.
Steam charts are from the hardware survey, which just records the resolution of the primary monitor.
The thing is, most of the time games get harsher requirements because consoles are getting stronger. ATM consoles are about as strong as an i3-12100 and 6700 XT (I know they use a 6700, but consoles have optimizations on the board itself compared to computers). So in 5 years they will probably be on a way higher level, since we have rapid improvements in the SoC and other SFF component markets. So it wouldn't be false to think the next console generation will be as strong as a current Ryzen 5 7600X + 4070 Ti config.
@@FenrirAlter I can see what you mean, and I feel like consoles would be a little more enticing to get if that's the case, compared to gaming PCs.
Yeah of course reactionary behaviour will probably take place.
The issue isn't that it will stay the same. It's that it doesn't have to get worse before it gets better.
Hopefully games similar to BattleBit Remastered come out as a replacement for these higher-budget games.
They look much worse, and usually this comes with a lot better performance, and they may still do other things better since they aren't tied to all the red tape, demands, and the larger budget needed to make any changes. That's not to say lower-tier graphics automatically give you better performance; there can still be stuff that slows it down significantly.
In Teardown, for example, the game is very low-res, but stuff like smoke can lag hard. I haven't played that game, but have seen some play it on YouTube.
It's funny how these technologies started with the aim of giving us a native-4K-quality image by upscaling from a lower resolution. Now we are upscaling games to 1080p from even lower resolutions. I like these technologies, but like you say, I fear they are becoming a necessity to maintain playable framerates, not a luxury for obtaining excellent image quality.
The best part is that when I got Remnant 2 and jumped in, it looked terrible at 1080p, so I turned upscaling off. It still looked terrible, so I checked the config files: resolution scaling is automatically set to on at 50%, with no in-game slider for it. So you're already running at 540p, then upscaling that with Performance mode. So the resolution is fucked.
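The arithmetic behind the comment above, as a quick sketch. Whether a given game multiplies a config-file render scale with the upscaler's own factor like this is an assumption, but it matches the 540p the commenter measured:

```python
# Effective render resolution when a hidden config-file resolution scale
# stacks with an upscaler's internal scale factor.
def render_resolution(display_w, display_h, config_scale, upscaler_scale=1.0):
    scale = config_scale * upscaler_scale
    return round(display_w * scale), round(display_h * scale)

# 1080p display with the 50% config scale alone:
print(render_resolution(1920, 1080, 0.5))        # (960, 540) -> "already running at 540p"
# Stacking Performance-mode upscaling (50%) on top of that:
print(render_resolution(1920, 1080, 0.5, 0.5))   # (480, 270)
```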
I agree with all your points here. I want optimized performance, because I see no reason to not be getting consistent 3 digit frame rates without frame generation when I set everything to low and DLSS to performance on my 3090
Completely agree with you. Subscribed. You provided a lot of really good points.
Game developers don't really care about performance considering their current working conditions.
The deadlines are unrealistic to the point where they cannot ship a working product; us expecting not only something big and working, including all the new graphical bells and whistles, but also well optimized, is insanity.
Heck, it's impressive they manage to even ship something that kinda works on release date.
The real problem is how big and mainstream the gaming industry has become;
the devs are doing their best considering the conditions they are in.
The problem is the management, CEOs, and investors.
I don't see any unoptimized indie games; heck, look at the insane optimizations in Factorio.
It's all good, I lost interest in the triple-A game industry. All they do is pop out soulless cash grabs; let them die out instead.
It's the same story as always. It's sad.
I've been hearing this garbage since the PS3 era. I'm tired of it. Fuck them, hold 'em accountable. I don't buy a TV and forgive them for the shit not working.
>Indie games aren't usually held back by greedy CEOs and investors because they work as they please. Something not right? Delay, fix, tada. No issues.
>Big company games just want money, so hype up something, set a public deadline, fuck over your employees, absolutely no delays, and haha! You already pre-ordered it! NO REFUNDS!
Time to unionize i guess. Or start a platform that incentivizes gamers to mass boycott something and it works 100% of the time.
I'd love to see someone delve into the laptop gpu arena around this topic. So much weird (and disappointing) stuff going on there already.
Upscaling tools save time/money in the development process. What do you think a company that wants to make money will choose?
what mic are you using
Awesome video! I feel like player expectations played a role in this, when 4k hit the tv market. Many console players anticipated the new consoles to support 4k resolution. This generation of consoles also brought older games to the new systems, giving us familiar titles with smoother 60fps performance. Excitingly, new technologies like ray tracing have emerged as well. However, expecting a leap of four times the pixels and doubling the framerate while enhancing lighting and detail is a big challenge. Upscaling is probably trying to fill that gap somewhere.
Pretty much this.
Raytracing is one of the big use cases for upscaling because initial raytracing cards just had no chance to keep up with actual raytracing at full resolutions, so for that it was a worthwhile trade-off. But with ever-increasing resolutions it instead became a crutch to run even basic graphics on intended resolutions.
Finally, someone not dumb. Been saying this since before launch.
@@ilovehotdogs125790 I think it could be catching up, but the hardware that actually IS better is overpriced as fuck.
Just look at the entire 40-series from Nvidia. It doesn't have the same generational uplift that any previous generation had over its predecessor; it stagnated, sure. But strangely it all falls in line if you call the 4060 a 4050, the 4070 a 4060, and just shift everything one bracket down. They just shifted the numbering scheme and realized they can actually make more money that way.
Yep. 4k killed graphics innovations and performances.
And why did we expect 4k? Because they put a big 4k/8K GRAPHIC ON THE FRONT OF OUR PS5 box. How is this player expectation?
I love Remnant 2, but the optimization really disappointed me. At 4K with a 4090, I get about the same performance I would in Cyberpunk with RT on High, meaning sub-60fps. While I get that I'm playing at a stupidly high resolution, no other game gives me issues like this, especially considering the performance gains I get by lowering the resolution or using DLSS/FSR are quite a bit less than in other games. It's not til about 1080p or lower that I start to see strong performance gains. They really need to do a lot more optimization. I honestly feel like this is an area where Gunfire was weak even before Remnant 2; Remnant 1 didn't have great optimization either and didn't even have the ability to do 4K.
I'm not sure how you are getting those framerates, using an i7 with a 4090 I'm getting about 120fps at 4k
@@gokushivum with no upscaling and framegen off with max settings?
@@darktrexcz Make sure your 4090 is really a 4090 not a 4070 TI undercover
@@parlor3115 I've been building PCs for 20 years, I know what a 4090 is. There's benchmarks on youtube that confirm my framerate. 120fps is with dlss quality and framegen on. Double check your settings, I guarantee those are on.
@@darktrexcz And you got goofed like that. Man you suck
Maybe devs should focus on using the Vulkan API more often when making games instead of DirectX. I think we'd find FSR and DLSS are needed less when using Vulkan more in games, as it's a great API even for old GPUs.
Frame generation (DLSS 3 / FSR 3) is my biggest fear. From a good addition, I really expect it to become mandatory in the near future, just like the upscaling techniques (DLSS 2, FSR 2) became.
But the worst thing about frame generation is that, unlike an upscaling technique, you need a minimum of horsepower to make it usable (without too many artifacts). Imagine a lazy, greedy publisher like EA making it mandatory to reach 60 fps even on high-end stuff. It could be a real disaster, making CPU optimization a joke and creating a threshold you need to overcome just to make the game playable; if you can't reach it, it's unplayable until you buy a new GPU.
I find it really ironic to make us pay for technologies that can eventually become software obsolescence.
I've had this thought for a long time: upscaling technologies are discouraging devs from optimizing their games for native resolution. DLSS and FSR are literally ruining games. I remember when upscalers weren't a thing and devs actually had to optimize their games. Honestly, before DLSS came out, the last best-looking and most optimized games I played were Ghost Recon Wildlands and Far Cry 5; they ran so well at native resolution on my old GTX 1060 6GB.
They also have less complex graphics; good optimization then means lower texture resolution and variety, less interactive foliage, sand and snow, non-dynamic lighting, objects that shine in the shadows, less complex geometry, etc... i.e. BattleBit 😊
I would've preferred if AC Odyssey came with FSR instead of me having to run a blurred-out 720p.
How are they ruining games? Which part of your gaming experience specifically has been ruined by upscalers? Your game looks a little bit worse? That probably saved the devs months of development time in optimization that they put to make other areas of the game better.
@@Ansalion if the game is made for being played at half resolution even on the latest hardware, gamers who don't yet own an rtx will literally not be able to run the game at all lol. And don't respond by telling people to upgrade, gpus are expensive as shit and if you're able to upgrade your pc, congrats, you're privileged as shit.
"Optimized Wildlands" sounds like a joke, but that's okay, at least it scales very well on lower settings for older hardware and had a really massive seamless open world with breathtaking detailed landscapes.
This is one of your best videos. I've had to criticise you on a lot of videos, but this is a very real problem and you did a great job pointing them out and backing them up with the "whys" and "whens". Thank you. 👍
This is why I love Dead Island 2. When it first launched, the game ran at ~100FPS at native 1440p, but enabling FSR actually made it drop to about 50 frames lmao.
FINALLY! Someone is actually talking about it! This is what I've thought about DLSS since video games started being unplayable without it. While it is cool to get more frames thanks to DLSS, games don't really look that good when it's on. I just want to enjoy a game on my decent rig without any upscaler on.
Game devs want their game to run well (even though optimizing is boring from a game dev POV), but publishers want money coming in as early as possible, so optimization ends up last on the priority list. That's why it's common these days to see broken games at launch that get better a few patches later. The rant should be aimed at publishers, not at game devs. This happens in almost all software industries.
I feel devs always get the short end of the stick for problems they should not be held responsible for. Management at first, and now even hardware manufacturers. It's not Steve the environment artist's fault that Jensen thinks a 10% perf gain on half the bus width is an "upgrade". The 1060 was on par with the 980, the 2060 was on par with the 1080, and even the 3060 was chewing on the heels of a 2080 non-Super. The 4060 can't even beat the 3060 Ti; heck, it even falls behind the 3060 when VRAM overflows.
The devs are just as responsible, I'm seeing examples of devs themselves making excuses that we KNOW are full of shit.
@@jaronmarles941 See, software development in general is hard. Try to coordinate 10 people into making a small game and have it done by the end of this week. Impossible. By the end of this week what happend is with a bit of luck everyone has their repo setup and is ready to sta- oh wait, it should have been done by now??
Yeah, people who don't know like to shit on the devs for not making good games. In reality, it's the publishers and the management, they are the root of all evil.
It's not because it's boring, but because it costs money.
Optimization is absolutely not boring, it's super fun. I think one of the big issues is the games industry has a lack of engineering talent. Not only is there a lack of engineering, but the bar has been progressively raised due to not only continuously rising expectations of graphics, moore's law dying, introduction of 8/16 core consumer cpus, but also new and harder to use technologies like Vulkan/DX12.
It's not uncommon to have a 50 to 1 artist to graphics programmer ratio. Team of 400 people, 250 artists, 5 graphics programmers. Artists start just PILING work on the heap, graphics programmers are like way underwater in work just getting the game to not DEVICE REMOVED because Vulkan was just such a good idea of a technology.
OK another story for you, due to lack of graphics programming talent company decides to use unreal, general purpose engine. General purpose means not specialized, ie not optimized. To make this crystal clear, if a game uses unreal, it is unoptimized. Full stop. If the game looks like it has advanced graphics and it uses unreal, prepare to buy a supercomputer. The grand irony is we moved from opengl/dx11 to low level apis, but that raised the barrier to entry high enough to push three quarters the industry to use unreal, which is going to perform worse than a well designed custom opengl/dx11 engine.
Company demands graphics that can compete with e.g. red dead redemption. So what do you get? A general purpose engine pushed way past its limits. Not only that but Unreal has the horrific idea to let artists write shader code. They glue together overcomplicated garbage with their lego duplo blueprints despite having never heard of the words 'register' or 'occupancy'. No graphics programming talent to fix it. Might get a cleanup crew contracted near end of project to get it to pass console certification.
DLSS is a double-edged sword. On one hand, I've been using it on Witcher 3 and I can't really tell any visual anomalies, so I get why games don't really care about optimization. But that is really killing older GPUs; cards like the 1080 Ti might have a shorter life expectancy because developers don't care to optimize anymore.
Damn good point I didn’t consider this.
Which is a shame; the 1080 Ti is still a beast, and with its 11GB of VRAM it could last a lot longer.
Did you use the Pokémon Black/White piano soundtrack?
Edit: I actually spent 1 hour searching for that theme hahahah, it is the Dragonspiral Tower theme. You're a man of culture.
Is there a reason to have baked Comic Sans subtitles the size of a screen?
Very good point. It really defeats the purpose of buying an expensive GPU expecting better results, only to have to use software after installing it just to make it perform the way you paid for.
But the expensive GPU still does give better results compared to less expensive GPUs? Less expensive GPUs are even more reliant on upscalers compared to your expensive GPU. You’re comparing it against some imagined standard that doesn't actually exist.
@@Ansalion The more expensive cards may manage to run native for now, but for how long? It's not like developers never abused a crutch in the past, right?
And I can see how optimizing graphics across different levels of hardware isn't an easy task, but the majority of gamers still have cheaper hardware, so that should be the reference point for running native.
You may build the best looking game of its time but, if people can't run it, you will certainly hear about it (cough, cough, Cyberpunk 2077, cough, cough),
Upscaling is a great extra option, but it will 100% make developers be more careless with optimization, as it has been happening already
Working on a game dev/tech team, I think the problem is a mix of the two possible answers. The tech is being tested on high-end hardware with a focus on maximum fidelity, better images, and so on. This means more information and factors to deal with when thinking about optimization. But that takes time, and the executives force the game to be delivered in a just-playable state.
And so the dev team needs to rely on existing tech to make faster deliveries, sacrificing a great part of the optimizing, which is time-consuming.
Great video :)
As a game dev in training: optimization is an art that seems to be somewhat lost with improvements to upscalers and the accursed input lag of frame generators. I hope the recommended settings of games get standardized without all these input-lagging, quality-reducing pseudo-features that deceptively make you think you can run the game without them enabled. It should be stated in the recommended settings on Steam whether the performance they quote is WITH or WITHOUT these features. IMO it is false advertising to push out a much more unoptimized game and call it a day.
Don't get me wrong, I do not believe that programmers (or game designers) experimenting with these features is a bad thing in itself, as it can breathe new life into an older 10-series card. It is bad when it is used incorrectly, like lots of games now do while ignoring blatant base performance issues.
Excellent video. Excellent topic. This is the way of marketing no matter the item: they introduce something as good for one thing, and then manufacturers start using it for other nefarious purposes. Pretty soon it becomes just another way to get ripped off for your money.
Well, if the information online about poor sales numbers are to be believed, it does not seem to matter what nVidia and AMD wants.
What matters is what the consumers want.
And they don't seem to want what is on offer.
lol, kind of reminds me of the lottery system in my area. it was originally introduced with the promise that the revenue from it would be added onto the budget for school funding each year. years later and the state legislature has cut school funding from the budget and its funding comes only from the lotto...
Yeah, I was super shocked seeing the Remnant 2 performance with my specs (7800X3D, 4080). I HAVE to have DLSS on to get an enjoyable experience. It's honestly so baffling how game developers are not optimizing their games and just expect upscaling to make up for their laziness.
With Nvidia and AMD shifting their focus to AI, putting minimal effort into their consumer GPUs and giving them only marginally better performance each new gen, this trend might continue.
I think it's wrong to put all the blame on game devs. Hardware is fundamental.
@@thecamlayton I'm not trying to blame the devs. I'm sure the devs are trying to make the best games with the newest tech available to them, but if the average person's hardware can't keep up, it'll continue like that. In this case I'm blaming the hardware companies.
@@thecamlayton Bruh, stop d*ckriding. These devs nowadays are there for a paycheck; they agree to switch to UE5, just slap on their assets, and move on. Modern devs in the triple-A scene are mindless sheep who do as told, and you all buy it up.
@@thecamlayton Back then, devs used to work hard to get their game to work. Modern times allow releasing unfinished games and patching them over time.
@@diddykong7354 That's literally been the case for at least the past 20 years.
Sadly, this was probably inevitable. This is why we can't have nice things.
Fully agree. The reasons to avoid DLSS are manifold. Firstly, the artistic intent of the graphics/world designers is best represented by native resolution. Then there is the ghosting/artifacts, which even in the newest iteration of DLSS are not fully gone. If you own an RTX card, you can test this easily yourself with contrast-rich movement scenes, because on YouTube it is often really hard to see, but in uncompressed video or live on your monitor you will see it.
This comment just screams "I've seen a YouTube video." Finding artifacts from DLSS is genuinely impossible in natural motion unless you record and slow down the footage.
@@Cluelessss_ I've seen it on my system with Baldur's Gate 3 in real time, no need to slow anything down. The ghosting is distracting, and it's not like any other algorithm does it better; FSR and XeSS have the same issues. The thing is, with DLSS 3 it's just not AS obvious, but it's still there. Try Baldur's Gate 3, FS2020, NFS, Spider-Man: Miles Morales... no need to slow down any footage. Just play the game and "turn off your Nvidia bias" for a minute.
Also, when you quote "I've seen a YouTube video" in your comment, you just prove that you didn't read my comment at all, since I specifically SAID in my first comment that it isn't really visible on YouTube, but tested on my system IRL. So please read before you make false assumptions.
I literally can't see a difference between native and dlss
So THIS is what I've been experiencing with newer games. I've been complaining about games not being optimized at all for a solid while now, this would explain it.
As someone in school for game development: optimization is one of the worst and most boring parts of a project. Not only that, but waiting months to optimize a game costs money, and management would much rather patch performance after people have paid than go another month without any returns.
Vis blocking, tri culling, and the like are really easy to implement at the level design phase of a project, if folks just take some time to understand them and the engine is written well. Unfortunately it seems studios treat it as an afterthought, which is lazy and dumb at best.
Game optimization is not that hard. Guess your school is fooling you into becoming the next generation of lazy devs.
Sad.
@@levijosephcreatesExactly!
@@tertozer3543 I know it's not hard; it's time consuming and boring. It's not a big problem for me, but other people absolutely despise it.
@@cxngo8124 Agreed, it's time consuming, but it's an integral part of being a level designer, again assuming the engine is good. In my opinion, a lot of game studios are run by suits who don't fully understand design; they think it's possible to just patch bad design at a later date. Facepalm.
It also seems a lot of studios are more bothered about diversity hiring than quality employees these days; although that is a totally separate issue, it also plays a part in creating a quality product.
For a while I have been torn between the 4060 and the 6650 XT or 6700 XT. Nvidia cards are expensive and don't provide good performance except via DLSS. Is it really worth the price? Are AMD cards good? I've heard from some people that they have a lot of problems. (This is my first build.)
Get an RX 6700 XT or 6750 XT; the additional 4GB of VRAM and faster baseline performance far outweigh DLSS, especially because the AMD cards are less expensive and you can still use FSR 2.2.
I agree, if DLSS doesn't interest you and all you do is game, you're going to be very happy with a 6700 XT or 6750 XT!
Amd cards do not have a lot of problems.
0:48 What program is being used at the top left of the game?
MSI afterburner and RTSS
It seems like the issue here is that game development is demanding, and as the shortcuts come out, they're not being implemented very well. It's like with ray tracing: it's a lot easier and faster than hand-tuning rasterized lighting, but it's also a lot more demanding. I might be a little out of the loop, but it seems like tools like upscaling have really exploded these last few years, and we're just having to suffer through growing pains like 3D animation did in the 2000s.
Pretty much. The advanced features of the engine cost a lot more as a baseline, but become more performant with the complexity they allow. Devs will still need to optimize for the CPU, but the graphics side isn't as big a deal as long as the scene isn't filled with transparent textures.
Mostly I don't like to use DLSS unless the frame rate is very low. Whatever Nvidia says, it's still an upscaling method and makes games a little blurry.
I feel like this is the gateway to what people fear: an AI server room becomes the lifeline that GPUs are to us right now, locked behind a subscription service doing stuff like DLSS to make games playable without a GPU. We wouldn't own the hardware, and I dread that day.
Isn't that cloud gaming?
@@RexGuard yes
You will own nothing and be happy
NVIDIA will push GeForce Experience down people’s throats in a few years and it’s gonna be HILARIOUS
I miss the engine from Remnant 1; it looked so smooth and ran insanely well on my 1660 at ultra.
I was an early adopter of 4K monitors. Back then it was an insane improvement in image clarity over 1080p, and playing at 1080p looked horrible and blurry. I see the same effect when using DLSS now: it feels like I smeared vaseline on my monitor. There's no clarity anymore, just a uniform blurry mess everywhere.
Name one game where DLSS looks blurry.
This has been a great fear of mine as well. To compound it further, I feel like it puts AMD in a bad spot, since DLSS tends to be the better upscaler, and I want AMD to succeed. Let's hope we get massive performance gains in the next generation, or just better-optimized games in general.
AMD doesn’t even want itself to succeed. They could have EASILY smashed the entire 40 series line up except the 4090, but no, they went for the bare minimum “good enough to get sales” approach and completely uncompetitive pricing model.
At least in the case of Remnant 2, the issue seems to be that, because the levels aren't static set pieces, the game uses tiles, and they used a lot of advanced graphics APIs to generate shadow maps and dynamic lighting on top of them. So rather than devote a lot of time and energy to a low-resource-cost solution, they slapped on a bunch of off-the-shelf solutions, and it bogs the game down like crazy. You can get mod packs that let you shut off some of the advanced graphics features, and the game runs way, way smoother when you do.
And the big kicker is that Remnant doesn't even look all that good. Gears of War 5 looks better and is multiple years old. Outriders looks better and is a couple years old. Hell, Warframe looks as good and can run on super low-end hardware. It sure seems like they sacrificed a ton of performance for very modest visual gains.
This!
When reviewers say stuff like "it runs great!" when they're running a game at 120fps with DLSS on a 4090, but 60fps natively, that is hella not good optimization!
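The gap between "120fps with DLSS" and 60fps native comes from the internal render resolution: DLSS renders fewer pixels and upscales. A minimal sketch of the internal resolutions at 4K output, assuming the commonly cited per-axis scale factors (the exact factors are Nvidia's and can vary per title, so treat these values as assumptions):

```python
# Internal render resolution for each DLSS preset at a given output
# resolution. Scale factors are the commonly cited per-axis values
# (assumed here; actual values can vary per title and DLSS version).
SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    s = SCALE[preset]
    return round(out_w * s), round(out_h * s)

for preset in SCALE:
    w, h = internal_res(3840, 2160, preset)  # 4K output
    print(f"{preset}: renders {w}x{h}, upscaled to 3840x2160")
```

So a 4090 hitting 120fps in Performance mode at "4K" is really rendering a 1080p image; the native 60fps number is the honest measure of how heavy the game is.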
I'm with you all the way on this, but at the same time I don't think this is gonna stop. It's probably gonna lean on that approach even more, and I think a lot of it is out of necessity, tbh. Hardware is getting to a point where it's really hard to make it smaller or more powerful without just... scaling it up: tacking on more cores, gluing components together, etc. If we want performance to keep moving forward at anything like its historical rate, I think we're gonna need a lot of help from software.
But again, I agree, and I find myself turning down graphics in a lot of cases so I don't have to run frame gen and upscalers. They create so many weird graphical glitches and make games feel sluggish and unresponsive compared to native resolution and traditional rendering.
Edit: 4:12 - 540p is not half of 1080p; 720p is, approximately. You have to multiply the two numbers in a resolution to find the actual pixel count: 1920x1080 is about 2 million pixels, while 1280x720 is about 920k pixels. 540p lands around 500k pixels, or a quarter of 1080p. This matters because it's the total number of pixels your system has to push that actually counts. Likewise, 4K is about 8M pixels, or four times 1080p.
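The pixel-count arithmetic above can be checked in a few lines (standard 16:9 resolutions assumed):

```python
# Total pixel counts for common 16:9 resolutions.
def pixels(w: int, h: int) -> int:
    return w * h

p540 = pixels(960, 540)
p720 = pixels(1280, 720)
p1080 = pixels(1920, 1080)
p4k = pixels(3840, 2160)

print(p1080)         # 2073600 -> the "2 million (ish)" figure
print(p720 / p1080)  # ~0.44   -> roughly half of 1080p
print(p540 / p1080)  # 0.25    -> exactly a quarter of 1080p
print(p4k / p1080)   # 4.0     -> exactly four times 1080p
```

Halving both dimensions quarters the pixel count, which is why 540p is a quarter of 1080p, not half.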
Until the NEED for upscaling is removed from Remnant, it's not in a good state, and I worry it will never get to that point after hearing the devs say "we designed the game with upscalers in mind."
I was also bummed out by the game's "poor" performance. Yes, it kinda works now that I have everything set to low, but with a 2070 Super I feel like I should be able to accomplish more?
If they're using Unreal Engine 5 with all the new tech like Lumen, Nanite, virtual shadow maps, TSR, etc., then no, it will not run well on a 2070 Ti without upscaling. These features run at around 30-60 fps, depending on the use case, on the current console generation, which is fixed hardware, so developers can optimize for it way better.
Your system is probably running way worse than a PS5, so yeah, it will not run well.
@@Shiv0r Doesn't even look that good.
There's another factor you might have missed: hiring practices. It's not a secret that the gaming industry likes to treat its workers as disposable as possible, including the programmers who do the coding and the optimizing. They don't have a team of experienced coders who know what they're doing, with lots of experience on that project, working with other teams that can eke out performance. Instead they have a lot of contractors on crunch schedules, working on someone else's code (which might itself have been handed down from someone else), just grinding out their contract.
What is the piano song playing at 0:38?
What do we expect? When GPU makers push upscaling software as the main feature, and not as a bonus, of course game developers take notes...
I think upscaling techniques are amazing and a good thing to have been developed, but I've had this worry ever since these features were released, and it seems I was right to worry. I feared developers would lean on upscaling as a crutch instead of optimizing, and even though they can, I don't think they EVER should. Upscaling should be there for when you want to play at high resolution or your GPU is starting to show its age. Unfortunately, I don't think we'll go back now, and devs are going to keep leaning on upscaling.
100% agree. I'm just not gonna buy a game where the devs expect you to use DLSS to compensate for awful optimization.
This reminds me of the Mods vs. Vanilla argument.
Mods to make the game run smoother, as opposed to never wanting/needing mods because the game should've been fine on its own; mods should be a bonus, not a necessity.
good video though!
It's a shame that decent optimization is no longer a priority, on two fronts: the DLSS/FSR reliance, and the amount of space games now require on our hard drives.
If you want prettier games, they're going to take up more space. There's no getting around that. If you want longer games, more content rich games, they're going to take up more space. That's a "physical" happenstance.
Yes, when given the opportunity a corporation will always choose the lowest effort and cheapest methods. It is why we are in shock when they don't.
Totally agree with most of the points. I'm getting fed up with every game these days REQUIRING you to use an upscaler to be playable. Native resolution looks so much better than any upscaling tech. To me it just comes off as lazy development. Games from 2017-2019 that don't use any of this by default still look great today. Say what you want about the game, but Anthem, for example, still looks excellent (you can use DLSS in it, but you don't need to). What happened to optimising a game to run well? It needs to stop.
To be fair, optimization is painful sometimes, but it's still no excuse for requiring an upscaler to even play the game properly.
I'm assuming someone's already mentioned this, but half of 1080p isn't 540p (that's a quarter of the resolution). 720p is approximately half of 1080p.
I still agree, however. I also think developers are using upscaling to skip optimising their code.
Just imagine how the world would look if every business application were in the state of your average videogame. The world would collapse. Great, now your PayPal transaction takes several days, your money can vanish instantly, your browser crashes constantly, database roundtrips take hours, you need a high-end gaming PC to start the simplest office software, and the list goes on. I will never understand how you can be proud of the state of such a product. At this point, videogame development feels like the choice for junior developers; once you actually understand how to develop software, you move on to a different industry.
This was always going to happen. The path of least resistance meant that developers would lean on the crutch of upscaling rather than put in the work themselves.
Based opinion.
Ergo, these useless devs are fucking lazy and would rather not put in the effort to ensure their product works on as many different hardware configs as possible.
Surprise, surprise.
What happened to graphics cards that could just play games properly without these little gimmicks? As others have said, it's nice to have these features to enhance an already solid experience, but you shouldn't have to rely on them to get a normal experience.
The graphics cards should be powerful enough to run games without the need for DLSS and similar features.
nvidia fanboys happened.
I used the Nvidia upscaler to improve performance on my old GTX 970. But if you need it to improve performance on a cutting-edge GPU in a game I played on PS3, there is a problem.
Also a problem with Lords of the Fallen: every time I turn off DLSS and use 100% scale native resolution, my frame rate drops significantly to 30 and even dips below that.