I feel like those people saw some presentation about open world games at some point where dynamic light sources are the only option, and that made them think that's the best way @@SomeRandomPiggo
@@TheJohn_Highway R6 used to actually look good though. They've reduced the graphics over the years and over-sharpened it for E-sports. That's a bad example anyways, Half-Life Alyx and CS2 both use baked lighting and look photoreal sometimes.
In the future games will be made in a way that doesn't require insane hardware, and there'll be a universal standard for specs whilst maintaining realistic graphics
Can't tell if you're completely delusional or colossally optimistic. I wouldn't trust modern devs to make a 1:1 copy of Minecraft that would run as well as the OG one.
@@TheJohn_Highway I dunno about that, Minecraft is a pretty poorly optimised game; community mods have more than doubled the performance of the base game. Granted, Mojang has been improving it, especially with the recent lighting engine rewrite that fixed one of the worst bottlenecks.
@jeff_7274 Bedrock is CRAZY optimized. That's why I'm always tweaking the settings to run shaders without RTX LOL. But Java... well, I know it's a little limited by the language it's using, but still, we're talking about the second or third biggest company in the world. I honestly hope they port the Java edition to C++ as well and make them both equal.
@@jeff_7274 Compared to vanilla Java, perhaps. With my setup I can get ~60 fps in a jungle at 64 chunks on Bedrock, and 30 in the same seed with 32 chunks in Java, so you're right there. But with the Sodium mod and a few others I can get around 100-300 fps, so there is still a long way to go.
I needed to empty out some space on my drive. From what I gathered, indie devs do so much better at optimization. Just look at the most recent poor optimization in Cities: Skylines 2. Most of the time the studios argue that home computers are underperforming, to deflect the criticism.
Yeah, but indie games are generally not that large in scope either. Look at the ones which are, like Valheim. Its optimization is alright. Now compare it to NMS, a AAA game. Look at Choo-Choo Charles, an indie game. Optimization is not that great on it. Granted, it was made relatively quickly and by just one guy, but still.
Moore’s Law is only dying if you’re talking about transistor density not compute power density, which is what we should really be measuring and isn’t slowing down any time soon.
The blinking light is from World of Warships. Don't play it, the game has jumped off a cliff somewhere around 2021. Arguably even earlier, when devs reworked a class to a point where it's simply permabanned from any high level competitive play.
Devs relying on pure hardware power instead of skill at squeezing as much as possible out of that hardware is the bane of most modern games (especially the bigger ones). For example, remember how Crash Bandicoot did more than even Sony believed was possible? Meanwhile we have bland-looking games that look maybe 5% better than games from a few years ago but take dozens of times more disk space and struggle to run even on the best hardware consumers could theoretically access. Old ARK takes over 430GB on disk, looks like crap, plays even worse, and is plagued with more bugs than some actual Early Access titles I've played (combined). Fitgirl's repack is about 43GB. Starfield is almost unplayable without Nvidia's DLSS, and Todd has the nerve to say it's because their game pushes the technology to its limits and PC players should just upgrade (ignoring the fact that consoles run the game better in general even though their hardware is at most medium-grade compared to gaming PCs). Modern devs are just lazy. They self-learned from poor-quality tutorials on YT or some shit and think themselves great developers.
If it's so hard to estimate and so hard to do, why do game companies keep hyping up the community with announcements and release dates? Get the job done first, then announce!
Meanwhile Alan Wake II: Triangles? What triangles? There are only a trillion triangles in the player's view. Surely your entry-level 4090 can handle it, right?
I'm a bit of a conspiracy guy, and I would say the big hardware players have some kind of deal with the AAA companies not to optimize their games, given the overreliance on DLSS and FSR, and because no one on the entire planet can tell me Starfield is optimized when Todd Howard's response is "it's optimized, go and upgrade your PC lol". C'mon, it's really sus.
- As a game dev obsessed with optimization (and kinda having it as my only responsibility at my workplace), I'm indescribably upset with the current technical state of modern games. Back in the day, with limited hardware specs and a harsh development reality, people had to do their best just to make the game playable. I've been excited about that ever since I saw it working as a kid. Nowadays games are slapped with the most generalized optimization techniques, not even polished enough to fit the game well, lol.
- Today, with so many high-level languages carrying large overhead, game engines aimed at designers who freak out at the thought of using their keyboard instead of their mouse, the industry sinking to its lowest and becoming yet another business area, publishers' ridiculous tendency to churn out generic, over-casualized games in a foolish bid to conquer new users, and overall poor educational quality, games end up carrying so much overhead and technical debt that you could easily fit two or even three old games in there, both memory- and performance-wise.
- I have always been driven into the industry by this, and I've remained committed to optimization all along. I treat it like an art, which it really is. It's a shame seeing the industry bloated with people so far from understanding how games work that Betelgeuse seems within arm's reach by comparison, but this is the reality. Less competition as well, which, oh man, do I abuse.
- Thanks for making the effort to read, whoever you are. Much appreciated :D
What's even more infuriating is seeing gamers and even some benchmark channels say things like "Oh, this 2023 game runs at 60fps on medium settings on a $900 PC? Then it's optimized!" meanwhile that game may have 2012 graphics at absolute best and absolutely no physics or interactive environments to speak of.
There's this braindead notion that a game's optimization is judged by when it was released rather than by how it looks and how much physics or interactive environment it has. As a result devs use modern hardware not to push graphics and technology forward, but to become lazier.
Imagine if we had told devs 12 years ago that in the future video games would look 15% better visually but run quite literally 500% worse.
People don't often consider a game's optimization in their reviews, or praise it, all that much.
Take NMS. Procedurally generated, yet it runs much better than most titles. Valheim is less optimized, but it's an indie title and has made large strides in improving its optimization. RDR2 runs pretty well too, same with GTA V. Doom, obviously. Distance is another well optimized game. Same with "The Entropy Center". Forza Horizon 4 and 5 are very well optimized.
But people will say FC5 is ''well optimized'' or stuff like that.
While a few studios no doubt demonstrate a degree of technical incompetence, I firmly believe the vast majority of these cases simply boil down to poor management. Tight deadlines, feature creep, loose vision, quick turnaround to avoid investor pullout, higher ups that are disconnected from the pipeline and complexities that arise out of short-sighted decisions, the list goes on.
With "industry" now being the operative word for the gaming industry, it's no surprise we see the same practices being utilised in larger projects to corner cut so long as people keep buying; and more so than any other sector of products I've seen, the consumer-base for games is far and away the most willing to put up with this capitalistic downward spiral.
@qma2275 Watching the video and hearing so much from you got me really interested in diving deep into these topics. I really want to learn more about optimization and graphics, what really works, and everything that happens behind a game. So I'd like to try out any recommendations you've got for a beginner like me!
that's why AAA makes billions while you work for minimum wage as an indie dev lmao
The pinnacle of modern game optimization has got to go to Doom Eternal. That game put Doom 2016 and all other modern games that rely on bloated file sizes to shaaaaaaaame
I'd argue games developed by Nintendo themselves on the Switch take the cake, but I guess that depends on what requirements you're going for here.
What about Factorio?
@@what42pizza it's not a AAA game. If we were talking about indie games, then you'd see most of them being well optimised. Like Factorio
@@reglan_dev Okay, I have two big problems with that. First off, why do indie games not qualify for the pinnacle of modern game optimization? Also, do you know how much better optimized Factorio is than any other indie game?
@@what42pizza 1. The video and the commenter both mentioned AAA games. As we all know, the modern AAA scene is famous for its massive amount of unoptimised games.
Indie games, on the other hand, are known for how well made and optimised they are. Therefore they are a pinnacle of game optimisation; however, the video wasn't talking about them.
2. Yes, I know how well optimised Factorio is. However, it's not the only very well optimised game out there.
"Our game is running just fine, maybe it's time to upgrade."
- Todd Howard, to PC gamers with i9s and 4090 Tis
What pissed me off the most regarding Starfield and Todd is when he said they optimized the game pretty well but a few weeks later they released a patch that increased performance by quite a lot. His own team proved that he was talking BS.
Not that it needed much proving anyway because we have Cyberpunk where I could achieve literally double the FPS. And Night City is not even remotely comparable to Neon or Akila.
Okay you can't fill a room with cheese in Cyberpunk but who even really cares.
@@valentinvas6454 Fr, unpaid modders released optimization patches like that the same night it released 😂 they also added DLSS and other much-needed stuff
@@valentinvas6454 Dude you're comparing apples to oranges. This is so ignorant on so many levels I don't even know what to say.
@@ged-4138 Care to elaborate?
@@ged-4138 You don't know what to say because you don't have anything to say. So next time just keep it to yourself.
Most companies' idea of "optimization" is to just raise the system requirements.
I've always joked that if a Windows programmer today were asked to write an exact copy of Space Invaders, it would require a 2.4GHz i5 CPU, 8GB of RAM, a graphics card supporting Pixel Shader 3.0, DirectX 11, and a minimum of 20GB of hard drive space.
And you wouldn't be wrong 😂
Now programmers just don't even want to do the hard work. I just hope everything goes back to normal in the near future, or else we'll have to keep upgrading every year to keep up with the trend even if we don't want to. I mean... devs could build games for the lowest hardware, and by lowest I don't mean an Intel Atom, but what most people use on average and are capable of using, and target those combinations of hardware when building games or applications.
@@chry003 One of the problems is the reliance on programming "environments" that aim to make programming easier at the cost of increased requirements. I get it, you want to use whatever will make your life easier, but often using these gives a more inefficient result.
There's a small utility for older versions of Windows called TaskArrange that lets you change the order of the buttons on the Task Bar, since older versions of Windows lacked this ability. The whole thing is about 50K, programmed in pure assembly. I once downloaded a program that did the same thing, with maybe an extra feature or two, but it required the .Net Framework. The program was 500K. I know in today's world, 500K is nothing, but the point is that it was ten times larger than the one made in assembly. Plus the extra size of the .Net Framework. All to do the work of a 50K program.
For another example, take the PS2 versions of GTA3, and GTA: Liberty City Stories. Both games use essentially the same map, so they should run about the same in an emulator. However, Liberty City Stories requires a faster system to run smoothly, than GTA3 does.
Another thing is that often when someone says that something can't be done, what they mean is that there's no pre-made function to do that. Within the limits of the system's hardware, software can do ANYTHING the programmer wants it to. Windows can't run Mac software, right? Well, if you write a Mac emulator, it can.
DX11? It'd need DX12, Windows 10 and an RTX card, cuz we don't use the old-fashioned shading style anymore.
At BlizzCon many years ago, Blizzard told a story. They created a new city for World of Warcraft, called Dalaran, but discovered that rendering this city was ridiculously slow.
On further investigation they discovered the problem was a toy shop: it had a model of the city as one of the toys. To save time, the programmer had merely referenced the normal city model, scaled it down, and rendered it.
This led to a huge problem: if you looked at the toy store, the game would resize the city and render it as a model. This of course included the toy shop and the model city, which caused it to again resize the city, effectively causing infinite recursion.
that's crazy. thanks for sharing.
Wouldn't the game just not work then? It has to render an infinite number of the cities.
Perhaps not because they get smaller?
But somehow this makes me think about how some big mobile games are optimised to run on comparatively weaker CPUs
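A minimal C# sketch of how that kind of self-referencing scene is usually kept finite (the depth and scale cutoffs, and all the names here, are illustrative guesses; the story above doesn't say how Blizzard actually fixed it):

using System;
using System.Collections.Generic;

class SceneNode
{
    public string Name;
    public float RelativeScale = 1f;
    public List<SceneNode> Children = new List<SceneNode>();
}

class Renderer
{
    const int MaxDepth = 2;          // stop re-rendering nested copies past this point
    const float MinScale = 0.0001f;  // or stop once a copy is too small to matter

    public void RenderNode(SceneNode node, float scale, int depth)
    {
        if (depth > MaxDepth || scale < MinScale) return;

        Console.WriteLine($"draw {node.Name} at scale {scale}");

        foreach (SceneNode child in node.Children)
        {
            // The toy model references the whole city again, so without the guard above
            // this call would re-enter the same subtree forever.
            RenderNode(child, scale * child.RelativeScale, depth + 1);
        }
    }
}

class Program
{
    static void Main()
    {
        var city    = new SceneNode { Name = "Dalaran" };
        var toyShop = new SceneNode { Name = "ToyShop" };
        var toyCity = new SceneNode { Name = "ToyCity", RelativeScale = 0.001f };
        city.Children.Add(toyShop);
        toyShop.Children.Add(toyCity);
        toyCity.Children.Add(city);  // the self-reference from the story

        new Renderer().RenderNode(city, 1f, 0);
    }
}

With a cutoff like this, the toy city gets drawn once and the copy nested inside it is simply skipped; without one, the recursion never terminates.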
Imagine if hardware was developed just like modern games:
“- Whoa! It POSTs!
- Ship it!
- But there’s issues and clock is unstable.
- We will fix it via driver updates!”
Shh! Don't give them ideas!
It's consumers that enabled this behavior. If people stopped buying broken games due to their FOMO, we'd be doing much better
If they could, they would. Fortunately, casual gamers aren't the only ones buying motherboards and computer parts, I guess? Professionals and big corporations do too, and they probably want a stable and well made part.
you described every new GPU launch since 2015! The RX 480 blowing out the motherboard's PCIe slot by pulling more than 75W is one of the recent examples of something being fixed by a driver update :P
developers often don't even optimize... they wait to see if idiots will buy overpowered hardware first.
code is written by idiots.. sub-optimally... they try to get the product done first.
I can't wait until the Silicon Limit is reached and we finally start actually optimising more rather than demanding more and more RAM, higher and higher clock speeds, and yet more storage.
Don't get me wrong, there are issues with over-optimisation or pre-optimisation from a maintenance standpoint, but *some* optimisation would be nice.
there is no issue with over-optimization; an optimized game will run perfectly for years. Skyrim is the peak example
@@fireloop69 I mean Skyrim is a BAD example, it's not a well optimised game, and it doesn't run perfectly.
But also my issue was that there *aren't* over optimisations, instead games and such just demand higher spec machines.
@@cptnraptor Understandable. Well, take RDR2 as an example then; one may say it's over-optimised, but it still looks better than most modern games while running smoother as well.
@@fireloop69 RDR2 is definitely the perfect example
Tried to run Fortnite on a GTX 1050 recently; it looks like shit. Meanwhile I can run COD WW2 on medium-low with incredible performance! And don't get me started on other well optimized games like BioShock, Black Mesa, etc.
For those who learned OpenGL: derive your objects from the same base class, so that when you apply a shader you can do it for all objects at once through an overridden function (a common problem).
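A minimal C# sketch of that pattern, with hypothetical names like Renderable, ApplyShader and Draw (nothing here comes from the video or any particular engine): every drawable type derives from one base class, so a single loop can apply the shader and draw everything.

using System.Collections.Generic;

// Hypothetical base class: every drawable type shares this interface.
abstract class Renderable
{
    public abstract void ApplyShader(int shaderProgram);
    public abstract void Draw();
}

class Crate : Renderable
{
    public override void ApplyShader(int shaderProgram) { /* bind crate uniforms/textures */ }
    public override void Draw() { /* issue the crate draw call */ }
}

class Terrain : Renderable
{
    public override void ApplyShader(int shaderProgram) { /* bind terrain uniforms/textures */ }
    public override void Draw() { /* issue the terrain draw call */ }
}

class Scene
{
    readonly List<Renderable> objects = new List<Renderable>();

    public void Add(Renderable obj) => objects.Add(obj);

    // One loop covers every object type, instead of a separate code path per type.
    public void Render(int shaderProgram)
    {
        foreach (Renderable obj in objects)
        {
            obj.ApplyShader(shaderProgram);
            obj.Draw();
        }
    }
}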
This comment was approved by real American patriots!!!
I would credit every single Japanese video game developer in this department. They release some of the most optimised games ever. Special shoutout to Metal Gear Solid V. I played that game on 2 GB of RAM.
Oh yes I especially like how even cutting enemies into 100 plus pieces only lags slightly
There is a very important thing that is often forgotten about big O notation: it ignores constants, and those constants might be huge. And that matters a lot if your code doesn't work on big datasets.
explain
@@MehmetMehmet-y8c Big-O is a quick-and-dirty notation meant to eyeball how an algorithm scales with the number of inputs. Imagine I have some hypothetical algorithm, and after careful analysis I determine that the number of operations required to complete is n^3 + 10n^2 + 100n. With big-O notation you're making the rather dirty observation that eventually n^3 dominates and everything else doesn't matter. But when is "eventually"? For small n the 100n term can't be ignored at all: at n = 5 it contributes 500 operations while n^3 contributes only 125. Big-O is flawed in that sense.
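To put rough numbers on that hypothetical n^3 + 10n^2 + 100n cost (a throwaway C# illustration, nothing more):

using System;

class BigOTerms
{
    static void Main()
    {
        foreach (long n in new long[] { 5, 10, 100, 1000 })
        {
            long cubic  = n * n * n;
            long square = 10 * n * n;
            long linear = 100 * n;
            // At n = 5 the "ignored" 100n term (500) is four times larger than n^3 (125);
            // only for large n does the cubic term truly dominate the total.
            Console.WriteLine($"n={n,5}: n^3={cubic,12}  10n^2={square,10}  100n={linear,8}");
        }
    }
}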
@@ef3675 i understand what you mean. good example
I think graphics card companies also have something to do with games not being optimized, so they can sell stronger GPUs to people who want to run a terribly optimized game
Actually, GPU vendors tend to work with game studios to optimize their games to run on their hardware so their hardware looks better in reviews. Your comment reminds me of Nvidia Gameworks though. It was this black box Nvidia gave to devs to do certain effects, such as hair. But, it was pretty much Nvidia abusing brute force tessellation to make subpar effects. Nvidia had better tessellation than AMD, so games ran faster on Nvidia cards. AMD at the same time made open source ways to do the same thing, though looking better, and running faster... even on Nvidia hardware.
This entire bullshit of a theory is actually being taken seriously by some people. It blows my mind how unreasonably moronic people have become. Releasing unoptimized games really hurts the reputation of the devs, and this in turn could hurt their sales enough to shut them down. It's quite literally devs releasing the game in a broken state so they can get money real quick from all the preorders. At the end of the day, devs release broken games to make a quick buck, not to save some GPU vendor's ass.
@@xeridea Game sponsorship only tells part of the story. Almost all, if not all Ubisoft games are AMD sponsored, yet if we're talking CPUs, Intel CPUs handle the Assassins Creed and Far Cry games a lot better, an i9 9900k significantly outperforms a 5950X which came out a year later and usually trades blows with the 10900k. GPUs however, I believe Nvidia's usually do just as good a job, and in some cases even better, like in Far Cry 6 which has Ray Tracing.
Games that don't even look as good as the original Crysis (2007) run like crap on today's hardware... just look at Starfield. An empty planet and a few pebbles... no terrain, no vegetation and yet it can drop to 50 FPS on my high end PC. It's ridiculous.
@@xeridea I remember this.
Simply quality content, straight to the point, and engaging. Luckily I was already subscribed to this channel.
This video was super cool. Why is this channel so underrated. Hope it blows up!
The demonstration about "what if light was slow" was also amazing
Wrong title, it should have been: The LOST Art of Game Optimization
It’s not lost though. AAA games aren’t the only games on the market and even then there are well optimized AAA games
And for most people, AAA games are the only ones that matter. @@crestofhonor2349
@@crestofhonor2349
Name 1 optimized AAA game made after 2016
Look, this is why I say "More Hardware doesn't make a better game" The argument should be "Better Optimization makes a better game"
You deserve much more subs and views! Amazing video!!
100%, I was confused by how this can only have 300 views, great work
Game devs have it easy with ample RAM and CPU and they still fuck it up. Try embedded development where resources are extremely limited and crashes can potentially cause catastrophic failures.
True. Unoptimized games are usually not the fault of the software engineers but of management.
Speaking from experience as a dev myself. Boy, if you saw our codebase. Lol. Tight deadlines, so we're cutting corners. I don't want to, but I need to xD
I just about cried when I saw you put the subdivision modifier on the door handle. I think I felt genuine pain.
The big O notation describes the growth of a function, not the actual execution speed. For example, hash maps have a lookup time complexity of O(1), whereas linear arrays have a lookup time complexity of O(n). However, if the hashing function is slow, array lookups will most likely outperform map lookups for lower values of n: a constant-time lookup that costs 10,000 steps (still O(1)) is slower than an O(n) linear scan whenever n < 10,000.
That is to say, it would be incorrect to state that the big O notation accurately represents the real-world performance of an algorithm outside of big data. The correct way to locate slow functions is by profiling.
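A rough C# sketch of that kind of measurement, comparing a linear array scan against a dictionary lookup on a deliberately tiny collection (the sizes and keys are made up; actual timings depend entirely on the runtime and data, which is exactly why you profile instead of guessing):

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;

class LookupTiming
{
    static void Main()
    {
        const int n = 16;  // deliberately small: at this size constants matter more than growth rates
        string[] keys = Enumerable.Range(0, n).Select(i => "key" + i).ToArray();
        Dictionary<string, int> map = keys.ToDictionary(k => k, k => k.Length);

        int found = 0;
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < 1_000_000; i++)
        {
            // O(n) linear scan, but over only 16 short strings
            if (Array.IndexOf(keys, "key7") >= 0) found++;
        }
        sw.Stop();
        Console.WriteLine($"array scan: {sw.ElapsedMilliseconds} ms");

        sw.Restart();
        for (int i = 0; i < 1_000_000; i++)
        {
            // O(1) lookup, but it still has to hash the string every single time
            if (map.ContainsKey("key7")) found++;
        }
        sw.Stop();
        Console.WriteLine($"dictionary: {sw.ElapsedMilliseconds} ms (found={found})");
    }
}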
Never understood this O shit, tbh.
Asphalt 8/9 is crazy optimized if you’ve ever played it. Glorious graphics, solid 60 fps all the time even on pretty old computers and mobiles, very responsive.
Nintendo is great with compression techniques, it's crazy. They truly are the best at making compressed video game file formats.
They had to; the Nintendo Switch is comparable to a 2023 midrange smartphone in terms of power
that's intentional. Nintendo's law is to always make games that are fun regardless of graphics; that's why the Wii games have held up so well compared to PS2 and PS3 games
I jokingly made the remark that AAA studio leads just say "Our game isn't unoptimised, just buy a 4090" behind the scenes.
And then Todd Fuckwad said it in an actual interview. Fuck everything that guy stands for. And then he dares to act disappointed at the Game Awards every time the game doesn't win. Dude thinks the sun shines out of his ass.
Dude, I heard the accent, went straight to the channel description and saw exactly what I was hoping for. A HUNGARIAN youtuber making English content? I'm going to browse this channel very thoroughly 😂 keep it up, you're doing something great 💯
KSP2 devs: Let's make a ridiculously accurate rocket-building game! And the map is the whole Solar System, just for good measure!
Also KSP2 devs: Recommended spec for 1080p60 is an RTX 3080
what
what
what
what
what
What a great work dude!
I give it a like just for the effort that you put in the video
This video basically explained why I'll never learn programming. The numbers immediately made my head hurt.
6:07 I got curious what game this is and wrote some code to analyze all existing games -- "a game that isn't dying or anything, or at least that's what the devs like to think" -- is it World of Warships?
This was surprisingly interesting… Thank you!
Nowadays games:
- 98 GB for just a fighting game (yes Tekken, I am talking about you);
- DirectX 12 that wrecks the graphics on cards that are actually strong but old (like the GTX 970), making them run slowly or creating really annoying visual artifacts due to the low resolution applied (another example: Tekken 5 looks way prettier than Tekken 8 on low quality, and SF6 doesn't need DirectX 12, it could have had DirectX 11 and Vulkan performance versions too).
Seems like the industry only cares about path tracing in real-time rendering.
it's not DX12's fault, it's the fault of lazy developers used to DX11's abstractions
Tbf with Tekken 8 you can just delete the story files (30+ GB)
Vulkan > DX12, btw you can use DXVK or VKD3D in some cases to fix some stutter and memory leaks in older titles.
There are mid-2000s games that look and play amazingly on what we today consider low-end hardware
I think we should focus on software more than hardware now; hardware is near its peak and it will never improve to the point of actually... well, improving. I have a 3050 laptop; it's pretty sluggish in certain games but runs others like a dream. Is there a huge graphical difference? No. So why does it do that? Fucking optimization. The 3050 mobile GPU is definitely not the peak I was talking about, but it's pretty damn close: just add some extra VRAM, a few more CUDA cores, a little faster bandwidth, and boom, you've made yourself something that can run literally everything (spoiler: that exists, it's called the 3060). And if the 3050 on its own can run everything, some games at lower settings, a lot more at higher, then I don't see a single reason we should "improve" past the 3090, much less the 4090. Realistically, the 4090 could last people decades! What more do we need from a graphics card? Time travel? It's a graphics card, it runs games at 4K ultra with RT on; I seriously do NOT see anything left to improve. If game companies put care into optimization, I wouldn't have to upgrade my laptop for another decade, and those with much more powerful GPUs, like a 3070 or a 4070, shouldn't have to upgrade until those GPUs simply break.
A youtube video once said the reason DX12 is so “bad” is because the burden of optimisation is now on the game devs, who don’t have much experience with drivers..
What about vulkan....?
@@BOT-fq9bu also a low level API and roughly speaking, you would have to do the same amount of work
If your game sucks it's not the API's fault, it's yours.
Hey! What's the tool you are using at 3:50? This seems way better than having to boot up photoshop every time I want to use smoothness textures (damn Unity smoothness on Albedo Alpha :v) Great video!
WOW! That's a highly concentrated quality video. So much meaningful content in so little time.
To give some feedback:
I think the "Code" section was a bit too fast especially you explaining what we are actually trying to accomplish. I didn't get it before rewatching that part a few times
wow, someone who knows what they're talking about, very refreshing
This is underrated. Great video, sir!
Thanks, this was a pretty cool rundown.
The number one thing that works for optimization is frame generation. Anybody can now do it with any game and any graphics card using a program on Steam called Lossless Scaling. That's what happens when AMD makes FSR frame gen open source.
9:26 "pretty good optimization"? no, that's insane optimization. what a simple solution as well. now, whenever i think a program can't be more optimized, i'll slap myself and remember this example and try to optimize it more.
very informative video. some parts of the video are really hard to hear due to the heavy accent; the autogenerated subs can only do so much.
Raytracing can be more efficient than typical rasterization when dealing with a huge quantity of triangles: with an acceleration structure like a BVH, the cost per ray grows roughly logarithmically with triangle count, while rasterization has to process every submitted triangle.
Yeah, but you still need special hardware to do so; there's surely a better way to do it, just like we all did before raytracing, and it can still look good.
Thought it was the subdivision that broke shit but 3 MILLION?!!? HOW DO YOU DO THAT BY ACCIDENT
I'm not really from a rich place, and as a student I don't have much money, so my PC is kinda meh... and I admire this a lot... as a programmer myself I've really dived deep into this lately
I just can't play a game slower than 40-50 fps... or with drops
Quality content right here! Thank you for sharing this!
New games almost entirely lack the proper optimization to make them playable and acceptable for players to enjoy; those who still do it well are passionate artists...
In the 90s due to the limited resources on computers, devs had no choice but to optimize the software(not just games) before release, internet for updates was pretty much non existent.
Remember how little RAM some programs used like Adobe PDF, heck look for PDF reader alternatives and you will see some use less RAM overall.
Heck, Windows did not show seconds on the taskbar clock because the CPU would have needed to render a new number every second, and that would have caused a performance hit. While this was more true for Win9x, it still shows that devs had to make sure the code they wrote was good from the start.
But these days, because we can have 64GB of RAM and 2TB SSDs, devs don't really bother with optimizing software. Why waste time optimizing when you have so many resources?
7:43 all that work for "dunkirk"
GPU: I fear no Pixel, but that thing *Shows Door Handle*.... it scares me.
Thank you very much, now I know why I am getting 3 fps👌🏻
People always say "stop optimizing games, it's stupid", and well, they need to be optimized
Cyberpunk runs on the Steam Deck... but people want to tell me it's impossible to run it on the PS4 with the add-on and the 2.0 update? NO WAY... This game was a crime
The Steam Deck is more powerful than the PS4. So yes... it runs on a Steam Deck and not a PS4.
@@jairit1606 and the Legion Go is more powerful than the Steam Deck - I played every quest and the add-on. Really freaking good game. (and handheld PC)
Someone needs to send this to the team working on ready or not 😂
yes!! 2 years ago the performance of that game started to go downhill from always over 100fps ultra to almost never going above 80fps, too many fps drops to 60fps or below and lots of stuttering and increased input lag.
Removing the old maps and game modes in the 1.0 update and worse performance are the worst issues of Ready or Not to me, it's just so unfortunate.
9:22
You aren't using the full performance benefits of dictionaries. By accessing the value by key instead of iterating over the values, you get an execution time of 0.25 ms:
// Assumes lettersDictionary (char -> Morse code string) is defined elsewhere, as in the video.
Dictionary<string, List<string>> wordsDictionary = new Dictionary<string, List<string>>();

void LoadWords()
{
    string[] lines = File.ReadAllLines("words_alpha.txt");
    foreach (string line in lines)
    {
        // Key every dictionary word by its Morse representation with no dividers.
        string morse = Translate(line, "");
        if (!wordsDictionary.ContainsKey(morse))
        {
            wordsDictionary.Add(morse, new List<string>());
        }
        wordsDictionary[morse].Add(line);
    }
}

string Translate(string input, string divider = " ")
{
    string result = "";
    foreach (char c in input)
    {
        lettersDictionary.TryGetValue(c, out string morse);
        result += morse + divider;
    }
    return result;
}

List<string> TranslateInvalidCode(string input)
{
    // Strip the spaces and look the words up directly by key: one hash lookup instead of scanning every entry.
    input = input.Replace(" ", "");
    return wordsDictionary[input];
}
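For context, a hypothetical call site (the Morse string here is made up, and the dictionary indexer will throw if the code isn't in the word list):

// Build the table once, then each query is a single dictionary lookup.
LoadWords();
List<string> candidates = TranslateInvalidCode("...---...");
foreach (string word in candidates)
    Console.WriteLine(word);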
@1:50 C. You are using Unity
Dude, imagine being the best-selling game of all time and having optimization so bad that a huge chunk of your community is dedicated to fixing it.
Minecraft is truly not a heavy game; if only the devs addressed it, it would be properly playable.
They have been working on it. I think they also focus on good practice more than performance.
This title feels like it should be the lost art of game optimization
At this point in time, graphics have reached their peak. In a lot of games you would never be able to tell the difference between high settings and ultra settings except for a drop in FPS; hell, in some games medium settings look just as good as high settings unless you compare them side by side. It's time game studios realized that individually rendered fish scales aren't something players want; the fish could literally be a collection of oval objects with a gray texture and players would be happy. Optimization is needed more than ever given that most people will never buy a graphics card more powerful than the RTX 20 series, and people like me who still run the GTX 16 series probably won't upgrade to a 20 series card for years to come.
Dunkirk is a small city in France, on the English Channel, in which the British troops were trapped during the invasion of France. There is also a movie about this with the same name. @worldsinmotion
I did not understand a word from 5:46 to 10:17. 10/10
get your ears checked then lol
Try understanding some bitches, we can understand him fine
Very awesome video. Such an accessible introduction to optimization
Ah yes, game optimization, the thing every developer ever forgot about
What's going on doe, right? What happened to optimizing games properly, bro? (Somebody fill me in.)
1. Overdependence on high-end cards, processors, systems, and consoles?
2. Game projects rushed because of money and time pressure from financial backers?
3. Half-assed development and early releases for fast money intake?
Most games exited the FUN area and went into fucking "BUSINESS MODE ONLY". This is why I have absolute love for indie game developers who actually make really well-optimized games with great content, plus interact with their community.
The games you show at the beginning are doing all the things you mention in the rest of the video. That's not "why" they are "unoptimized". And "optimized" is often an ambiguous term. Games may be optimized to run their workload the best (or as well as possible given the time constraints/skills of the whole team) but still won't run on a potato PC, because that is not part of their design goal. If you include a path tracer in your engine you had better be "optimized". But if you have to run on Switch, then it makes better sense to drop the path tracer and maybe have a simple renderer with a few light sources.
Unless you have access to the source code and assets of those games, your claim is just speculation.
@@swh77 Because in his video he is just talking about basic stuff. Not what goes into optimizing a raytracer, shader passes and so on.
@@swh77 It's not speculation when there's a plethora of research papers, presentations, articles, blog posts and discussions regarding all of this out in the public domain, a significant portion of which has come from developers working on AAA games. You don't need access to the source code that game X uses to implement screen-space reflections when the studio who made game X literally held a presentation discussing their approach in GDC one year, you can just go watch their presentation.
Really good video dude! Noice
All of this in 10 minutes, insane video
Comment for the algorithm, awesome content!
We can't even run games at 4K 165 fps without upscaling unless it's a 4080 or 4090. Sucks that GPU prices are getting higher when we're not even getting good performance at native resolution
4K is a lot of pixels to cover; one needs those high-priced GPUs to render it.
Research floating point operations.
buy a 1920x1080 monitor.
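(Back-of-the-envelope numbers behind "a lot of pixels", for anyone curious - the per-frame counts are exact, the throughput figure ignores overdraw, upscaling and post-processing:)

using System;

class PixelMath
{
    static void Main()
    {
        long pixels4k    = 3840L * 2160; // 8,294,400 pixels per frame
        long pixels1080p = 1920L * 1080; // 2,073,600 pixels per frame

        // 4K pushes exactly 4x the pixels of 1080p...
        Console.WriteLine($"4K / 1080p = {pixels4k / (double)pixels1080p}x the pixels");

        // ...and at 165 fps that's roughly 1.37 billion pixels shaded per second.
        Console.WriteLine($"4K @ 165 fps = {pixels4k * 165 / 1e9:0.##} billion pixels per second");
    }
}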
Good video, man! CFBR
Hi! Good video, but you used the wrong Godot logo at about 2:00. That's all
You mean those extra teeth?
@@llllllXllllll Yes, that logo is very old and not used anymore
Give this man a world of warships sponsor
"Optimisation is easy just use the 20 new unrealengine5 meme effects that require a 4090 to run!"
This video is why modern videogames are the way they are lmao
He should definitely have talked about baked lighting in the video. It makes games run so much better and it looks awesome if done right. Many beginner devs just use Lumen since it's now enabled by default in UE5, and those devs don't know any better than to use this extremely slow and performance-hungry technique.
Also a fun fact: I once saw a tutorial on "how to remove the 'lighting needs to be rebuilt' error" and the dude literally just told the viewers to select all the lights and make them dynamic. Talk about good performance...
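(For the record, the opposite of that tutorial's "fix" is tiny - here's a hypothetical editor-only sketch in Unity terms, since that's the engine the video jokes about; in Unreal the equivalent is setting the lights' mobility to Static/Stationary and rebuilding lighting:)

#if UNITY_EDITOR
using UnityEngine;
using UnityEditor;

// Hypothetical editor utility: mark every light in the open scene as Baked so its
// contribution goes into lightmaps instead of being recomputed every frame.
public static class BakeSceneLights
{
    [MenuItem("Tools/Mark Scene Lights As Baked")]
    static void MarkLightsAsBaked()
    {
        foreach (Light light in Object.FindObjectsOfType<Light>())
        {
            Undo.RecordObject(light, "Mark light as baked"); // keeps the change undoable
            light.lightmapBakeType = LightmapBakeType.Baked;  // editor-only property
        }
    }
}
#endif

You'd still hit Generate Lighting afterwards; the point is just that static scenery doesn't need its lights evaluated in real time.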
@@paper_shreds I see some people calling baked lighting "faking it", or acting as if it's somehow inferior to real-time lighting even when the results are better. I can't wait until they realize that all 3D graphics are built on trickery
I feel like those people saw some presentation about open-world games at some point, where dynamic light sources are the only option, and that made them think that's the best way @@SomeRandomPiggo
@@SomeRandomPiggo I believe that baked lighting gets a bad rep because nearly every modern game with baked lighting looks terrible (see: R6 Siege)
@@TheJohn_Highway R6 used to actually look good though. They've reduced the graphics over the years and over-sharpened it for E-sports. That's a bad example anyways, Half-Life Alyx and CS2 both use baked lighting and look photoreal sometimes.
In the future, games will be made in a way that doesn't require insane hardware, and there'll be a universal standard for specs whilst maintaining realistic graphics
Can't tell if you're completely delusional or colossally optimistic. I wouldn't trust modern devs to make a 1:1 copy of Minecraft that would run as well as the OG one.
@@TheJohn_Highway I dunno about that, Minecraft is a pretty poorly optimised game; community mods have more than doubled the performance of the base game. Granted, Mojang has been improving it, especially with the recent lighting engine rewrite that fixed one of the worst bottlenecks.
@@MrMoon-hy6pn Bedrock was pretty well optimized. You can get 60 fps at a render distance of 84 chunks on the right specs.
@jeff_7274 Bedrock is CRAZY optimized. That's why I'm always tweaking the settings to run shaders without RTX LOL
But Java... well, I know it's a little limited by the language it's using, but still, we're talking about the second or third biggest company in the world. I honestly hope they switch the Java edition to C++ as well and make them both equal.
@@jeff_7274 Compared to vanilla Java, perhaps; with my setup I can get ~60 fps in a jungle at 64 chunks on Bedrock and 30 in the same seed at 32 chunks on Java, so you are right there. But with the Sodium mod and a few others I can get around 100-300 fps. So there is still a long way to go.
Thanks man I loved it
Fascinating video!
Peak content ur underrated frfr
On a low budget, developers export the whole game engine along with the game they make. Making Pong now requires exporting a custom Unreal Engine 3 with it.
What is the game at the 9:30 mark?
I needed to empty out some space on my drive. From what I gathered, indie devs do so much better at optimization. Just look at the recent poor optimization of Cities: Skylines 2. Most of the time they argue that home computers are underperforming to deflect the criticism.
Yeah, but indie games are generally not that large in scope either. Look at the ones which are, like Valheim: its optimization is alright. Now compare it to NMS, a AAA game. Look at Choo-Choo Charles, an indie game; optimization is not that great in it. Granted, it was made relatively quickly and by just one guy, but still.
Moore's Law is only dying if you're talking about transistor density, not compute power density, which is what we should really be measuring and which isn't slowing down any time soon.
nice video
Topic is good but I can't understand a single word, depending on subtitles only 😢
What game is it that you mention at 6:12?
The blinking light is from World of Warships.
Don't play it, the game has jumped off a cliff somewhere around 2021. Arguably even earlier, when devs reworked a class to a point where it's simply permabanned from any high level competitive play.
1:51 It can be c. actually ;) Try the same in UE5 with Nanite :D
Should have watched till the end xD Great video! ^_^
Devs relying on pure hardware power instead of their skill at squeezing as much as possible out of that hardware is the bane of most modern games (especially bigger ones).
For example, remember how Crash Bandicoot was more than even Sony believed was possible?
Meanwhile we have bland-looking games that look maybe 5% better than games from a few years ago but take dozens of times more disk space and struggle to run even on the best hardware consumers could theoretically access.
Old ARK takes over 430GB on disk, looks like crap, plays even worse and is plagued with more bugs than some actual Early Access games I've played (combined). FitGirl's repack is about 43GB.
Starfield is almost unplayable without Nvidia's DLSS, and Todd has the nerve to say it's because their game pushes the technology to its limits and PC players should just upgrade (ignoring the fact that consoles run the game better in general even though their hardware is at most medium-grade compared to gaming PCs).
Modern devs are just lazy. They self-learned from poor-quality tutorials on YT or some shit and think themselves great developers.
Apparently Microsoft just says "fuck this" and shits out a crudely written port for €70
really good video, but you need to find some way to reduce peaks whenever you pronounce S, as it did hurt my ears a bit...
If it is so hard to estimate and so hard to do, why do game companies keep hyping up the gaming community with such announcements and release dates? Get the job done and then announce!
A sacred art lost to time..
No way wreckfest made it on a thumbnail
Ur gonna blow up
If EVERYTHING were rewritten in Rust there wouldn't be world wars or any need for optimisation anymore (contains irony)
C. You are using Unity.
Meanwhile "Alan Wake II"
Triangles? What triangles? There's only 1 trillion triangles in the player's view tho. Surely your entry-level 4090 can handle it, right?
Having an RTX 3060 be the MINIMUM GPU requirement for a game to run screams shoddy optimization
Nowadays game studios just throw DLSS at everything and call it a day.
I'm a bit of a conspiracy guy, and I would say the big hardware players have some kind of deal with the AAA companies to not optimize their games, because of the overreliance on DLSS and FSR, and because no one on the entire planet can tell me Starfield is optimized, and Todd Howard's response to that is "it's optimized, go and upgrade your PC lol". C'mon, it's really sus.
Modern game devs: I will pretend I did not see that
Nanite is actually a really bad approach to LOD scaling, as it manipulates existing meshes in real time, requiring extreme amounts of CPU usage.
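(For contrast, a minimal sketch of old-school discrete LODs in Unity, using the standard LODGroup API - the three renderer fields are assumed to be wired up in the inspector and the thresholds are made up:)

using UnityEngine;

// Classic hand-authored LOD: the detail levels are built offline and the only
// per-frame work is picking one of them based on screen coverage.
public class ManualLodSetup : MonoBehaviour
{
    public Renderer lodHigh;   // highest-poly version
    public Renderer lodMedium;
    public Renderer lodLow;

    void Start()
    {
        LODGroup group = gameObject.AddComponent<LODGroup>();

        LOD[] lods =
        {
            new LOD(0.60f, new[] { lodHigh }),   // object covers > 60% of screen height
            new LOD(0.25f, new[] { lodMedium }), // 25%..60%
            new LOD(0.05f, new[] { lodLow }),    // 5%..25%; culled below 5%
        };

        group.SetLODs(lods);
        group.RecalculateBounds();
    }
}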