@@DragonOfTheMortalKombat That's how they "advertised" it. Lumen and Nanite, revolutionary low cost methods that would let you push graphical boundaries at good performance. That did not work out at all huh.
I hate how games have become blurrier and blurrier over the years. All these detailed textures and models with hundreds of thousands of polys, just to slap TAA and DLSS on top of it, not to mention all the temporal ghosting. We're at the point where older games rendered at native res with 4x MSAA look better than a lot of modern games on image clarity alone.
MSAA now looks like an anomaly in history. It had already achieved the best visuals, and for a few years now devs have tried different techniques that in one way or another can't match what MSAA did a long time ago (MSAA was in the OpenGL specification in 2002). So why not use MSAA? It's incompatible with other visual effects, not even very sophisticated ones. It's edge-based supersampling, and while interpolating color samples is fine, interpolating depth/normal/stencil is not. MSAA will not come back. The blurriness, however, that's TAA and DLSS and how they're used to hide the deliberate dithering of effects that would otherwise hurt performance even more. Everything comes at a price. Not sure if I'm willing to pay this one.
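To make the depth/normal point concrete, here's a toy Python sketch (made-up values, not any engine's code) of why resolving MSAA samples by averaging works for color but breaks for depth:

```python
# Toy illustration: an MSAA resolve that averages samples is fine for
# color but wrong for depth. Two surfaces meet at a pixel edge.
near, far = 0.1, 0.9

# 4x MSAA: two samples hit a red surface, two hit a blue one.
color_samples = [(1.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.0, 0.0, 1.0)]
resolved_color = tuple(sum(c[i] for c in color_samples) / 4 for i in range(3))
# A 50/50 red-blue blend is exactly the anti-aliased edge we want.

depth_samples = [near, near, far, far]
resolved_depth = sum(depth_samples) / 4  # ~0.5, a depth neither surface has
# Any later pass reading this depth (deferred lighting, SSAO, fog...)
# reconstructs a position floating in empty space between both surfaces.
assert resolved_depth not in depth_samples
print(resolved_color)
```

That mismatch is why deferred-style effects that read the depth/normal buffers can't just consume an MSAA-resolved target.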
The main reason Witcher 4 switched to UE5: the original devs who understood how to fix and maintain RedEngine all left, so there's no one to maintain or upgrade it. Most of those devs are now at Rebel Wolves.
The reason we have this problem is that the development studios got rid of all or most of the staff who knew how the proprietary engines worked, so they thought it easier and more cost-effective to just switch to UE instead of training new people on their own engine.
@@goncalolopes5818 yup, X-Ray did a great job gameplay-wise but is an absolute clusterfuck behind the scenes. Anomaly and GAMMA are the result of thousands of people squeezing every inch of performance out of the engine, yet these mods barely hold together. No one would bother to make a 2024 game on X-Ray
With a bespoke engine, you can optimize exactly for the task you want to do. But the cost to reach modern standards will kill you. Same as with operating systems: a specific one may be best, but in the end you grab some Linux off the shelf, because... well, "code is a liability". Don't write code.
Source 2 has produced some of the clearest most crisp graphics I've ever seen come from an engine in the modern era of games, and it runs better than everything else!
@@mryellow6918 That’s true, but imo makes sense for vr, where native resolution is often higher than 4k. Internal resolution below 1080p is where I tend to draw a line for my own preferences (on a flat screen, probably too low for vr).
Exactly. If they used, for example, the engine from Witcher 3 or Cyberpunk, that would be perfect, or some CryEngine... I don't understand that obsession with Unreal.
I would take Kenshi graphics or something if it meant the game would be fun and run well. I just don't care about graphics anymore. Give me a fugly-looking game that runs well and is fun over a "beautiful" "photorealistic" blur fest of 30 fps AI upscaling.
I was so disappointed when I read that CDPR were ditching Red Engine in favour of UE5. Cyberpunk 2077 looked pretty damn good even on lower settings imo.
Yeah, at lowest settings it still looked as good as the best PS4 games. It looks like a game, not a movie, that's all. And the writing and gameplay of Cyberpunk are so good it just doesn't matter. It's like so many people want a photorealistic game just so they can show people how real it looks 😂. I'm over 40, so it's all mindblowing to me. The complexity and amount of player agency is what impresses me about modern gaming. Graphics be damned. We played Q*bert and Snake and loved it. 😂
Red Engine was a shitty engine compared with Unreal Engine. When you license Unreal Engine, you have at your disposal a whole suite of software to create a full game from scratch
@@miguelcondadoolivar5149 Then don't complain about games taking too long to develop, or underdeveloped stories or gameplay because the devs are busy trying to get the engine to work.
In my opinion, when DLSS and FSR got introduced into the gaming world, every developer felt like "let's optimize the game a little bit, throw that in, and hope it does the rest". I hate it
I don't think it even works that well either. I just keep resolution scaling off in Cyberpunk cuz I can't even tell a difference: it's either no input lag with stutter, or slightly less stutter and a ton of input lag
That's not true, DLSS and FSR are just really good for PC gamers; it's literally free performance for a slight downgrade in image quality. Unoptimized games existed before DLSS and FSR. The problem right now is that GPUs have been getting way more expensive, and it's become hard for devs to optimize for mid-range GPUs while having graphics that are considered next-gen (when a game doesn't have next-gen graphics, it gets attacked by gamers). With FSR 3 at least you can compensate for the fact that GPUs are more expensive. The technology also has a lot of potential; DLSS is already capable of making 1080p look almost exactly like 1440p.
@ni9274 "Slight" meaning sometimes completely blurring the image in motion, blurring textures, and being prone to visual artifacting. Battlefield 1 and Battlefield V, as shooter examples, didn't need upscaling to achieve visually impressive scenes or performance.
@@ni9274 What free performance? It does nothing in Escape from Tarkov except blur the people I am trying to shoot at. It does not increase performance in any game I have played, outside of the basic lowering of the internal framebuffer. It's a joke to consider rendering the game at 720p and upscaling it to 1080p native as "free FPS"; it is not.
DLSS is horrid. The fact we're being forced into it makes me wonder if there's a conspiracy between Unreal Engine devs and GPU manufacturers. I feel like NVIDIA felt they messed up with the GTX 1080 (they made the card too good and it lasted too long), so now they're making cards that last less long, with "new" tech each time, for a more constant cash flow.
@@michaelsexton924 prove it, smartarse. Profits are always chosen over a good product nowadays, nvidia gets way more money from stuff outside gaming anyway.
I think devs just don't CARE to optimize it because nvidia gives them the crutch, and nvidia is happy that devs are becoming dependent on their AI framegen or upscaling slop.
DLSS and FSR were designed so lower-end and older cards can get more life out of them, not for flagship cards to run games at passable performance, but somehow here we are.
I don't want photorealistic games if hardware can't run anything. Graphics need to stop going "further"; we need to optimise and let hardware catch up before we go further
How do developers then explain why their modern games look worse than their previous titles to the average consumer? Production cost and time will drastically increase as well in an industry where development costs are already out of control.
@@penumbrum3135 the "average consumer" wants a fun game.. they really don't care about graphics. Take Nintendo's games and sales.. they beat Sony's games and sales on lesser hardware and they don't get PC ports.
@@penumbrum3135 You don't chase after graphics.. Nintendo does quite fine without having to make their games "photo realistic" Nintendo's games typically outsell "photo realistic" big budget games as well or make more money using less resources
Maybe I'm looking too deep but I feel like everything is becoming more superficial. It's all about what's on the tin anymore. I am hopeful that this game in particular will get better but games these days are hype fest disappointments.
the only way we are going to get better hardware is if there is a reason for companies to create it, if there aren't demanding games like stalker 2 pushing hardware limits then hardware companies are just gonna get lazy
At this point they need general optimization within the engine; it's bullshit across the board how few FPS per dollar you're getting on these games @@xbm41
@xbm41 Just wrong. So many use AMD, better price/performance. Nvidia is like a 300 mph bike: it's nice but expensive, yet you can get a 280 mph bike for half the price
@@GameBoyyearsago That is the power of good developers. When you only hire cheap junior devs with little to no experience, UE games end up badly optimized and all look the same.
@@GameBoyyearsago LOL :) Crytek had many troubles and many forks to other studios. It went wrong when they stopped focusing on high-end PC and went console-compatible like all the other engines. Then corporate management went crazy and bankrupted the company. I guess Crytek has recovered somewhat, engine-wise. "Hunt: Showdown" is a cool game showing off Crytek's CryEngine, but it feels just like Unreal in looks. All the Far Crys were forked off an older CryEngine. Same with "New World" and "Star Citizen", as Amazon Game Studios forked CryEngine into Lumberyard.
Did the devs port the whole game to UE5, or is it some hybrid like the GTA remasters? In those failed remasters, RenderWare is the main engine and takes care of the physics, AI, game logic, et cetera, while Unreal is just for rendering.
And boy was it awful. It got close to emulating Yakuza 0, but the bugs at the beginning... god, just the split second of characters not being in the scene on EVERY CAMERA TRANSITION was enough to destroy any immersion possible.
@ i also didn’t like it, it felt much more clunky, and small things like your character not hitting enemies when lying down, big downgrade from the dragon engine
Because the games look significantly worse without it. I mean, look at Metaphor: ReFantazio. The final rendered image is abysmal, with tons of aliasing everywhere, and it's very distracting. A TAA mod exists and it makes things leaps and bounds better despite not even being a proper native implementation.
@@wanderingwobb6300 TAA is an improvement when looking at static images, but all that ghosting is a TAA artifact: it literally takes data from the previous frame and "cleans up the edges" of an object that isn't even there anymore. And when rustling leaves in a tree or grass right in front of you look like static, that's TAA too. TAA is "fast" and "easy"; that's why it's used.
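The ghosting mechanism described here can be sketched in a few lines of Python (the blend weight is made up for illustration; real TAA also does reprojection and history clamping):

```python
# Toy sketch of why TAA ghosts: each frame is blended into an
# accumulated history buffer, so a pixel keeps a fading copy of
# whatever used to cover it. Weights are illustrative only.
alpha = 0.1  # 10% current frame, 90% history per frame

history = 1.0  # a bright object covered this pixel last frame
frames = []
for _ in range(10):
    current = 0.0  # the object has moved away; the pixel is now dark
    history = alpha * current + (1 - alpha) * history
    frames.append(history)

# The "ghost" decays slowly instead of vanishing in one frame:
# after 10 frames the old object still contributes ~35% brightness.
print(round(frames[-1], 2))
```

That slow exponential decay is exactly the smeared trail you see behind moving objects when the history isn't rejected properly.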
It's just mindblowing how much more processing power we need for modern games while they don't look much better. Best example: look at Batman: Arkham Knight. That game still used Unreal Engine 3, with some features and rendering tech ported over from UE4 (because they found UE4 buggy and too resource-intensive). Or look at Alien: Isolation; that game practically runs on a toaster these days and looks great, aside from the bad AA implementation. And Alien: Isolation didn't use any global illumination to save resources; EVERY light source was dynamic in that game. How is this possible?!
As a small dev using UE every day, I can confirm one thing: if you listen to Epic, your game is gonna be a beautiful, shiny... slug.

You can't use hardware raytracing, especially for reflections and shadows, or it will DEVOUR your budget. I still use screen-space reflections because even in software mode Lumen eats a lot; letting it take care of lighting is already taxing. You have to limit your texture resolutions a lot (everything else is already taking all the VRAM). Don't use Nanite except in very dense areas; most of the time Nanite will give you big quad overdraw, and you'll prefer well-adjusted LODs (which take much more time). The scalability profiles you select (you know, those "shadows", "textures"... settings profiles that set a bunch of core variables) must be tweaked to fit your needs and not exceed them, something many people don't do.

Unreal Engine is not as "out of the box" as they want everyone to think (except for the assets; you were right, everyone uses Quixel scans lol). But even with those assets... you'd better optimise.

My game Lovely Traps Dungeon will soon (a week or two) have an updated demo on Steam if you want to try it (the one available now is quite outdated, the project has evolved): a humble indie game with half-stylized graphics (meaning custom shaders and a lot of home-made assets) made with UE 5.5 (in the coming update).
@@MGrey-qb5xz They just decided to protect you from abusive game passes. And they don't force DRMs. Quite the opposite since they recommend we give more value instead of adding anti-piracy technical crap. But yes, I'm trying to make it so it can be sold anywhere.
@@FinalDefect I love Steam and that's my go to, but I think @MGrey-qb5xz might be talking about how Steam doesn't sell you the game, but a license to use. This has been a topic in recent lawsuits where Steam doesn't want to move away from the verbiage that's associated with buying, because saying you don't own it could hurt sales.
@FinalDefect I'd rather not get into this argument here, but Steam has been degrading many features and making the user experience worse for publishers' sake and out of greed
Last-gen games running at 150+ fps at native res look better to me than current-gen upscaled trash. Everything is blurry and full of artifacting and ghosting now, and running slow on top of that. I don't really understand what is happening to graphical priorities at the big studios. It seems like framerate is barely part of the discussion, when it's the biggest contributor to a good image while you're actually in motion playing. I don't get it.
@@nontoxic9960 when the average user gets better hardware, games just get more inefficient. It's happened every year since I was a child, and I don't believe it will change anytime soon. People don't change, not when their current approach is profitable anyway... :(
This. I obviously lack some insights, but I don't really get why a game that doesn't look any better than Doom 2016 or Doom: Eternal has to be that much more demanding and run so much worse, falling apart in some visual regards as well.
Simply lack of skill. I am in the games industry and it’s just a gigantic clown show compared to 1990-2012. Too many imposters got into the industry in every department. It’s kind of like TV and film. This is why we see more indies succeeding.
@@michaelduan5592 So just like everywhere else, organisations are shedding huge amounts of specialised knowledge and making a worse product because to do so is more profitable.
We've gotten to the point that games in real time rendering barely look better than 6 years ago (or even worse in some cases), and they can't even be run baseline on modern hardware (having over 10x the performance from 6 years ago), and instead resort to running at 20 fps and making up fake frames in between to get you to the bare minimum of 60.
It's crazy to me that my 2060, or even my 1060 before it, can run Battlefield 1 at max graphics with no issues but LITERALLY can't run games like this, when arguably they're on par graphics-wise
I play enough games to know when programmers are lazy: the temp on my graphics card runs much higher than in other games for no reason on my end. Hogwarts Legacy is a great example. I can cook eggs on it, while running other games at very high settings doesn't do much to keep me warm in winter.
@@MGrey-qb5xz yeah, most big gaming studios keep people around for a project or two; once they get better they'd need to be paid more, so they just replace them with newbies and the cycle continues. That's why FromSoft and the Yakuza studio can release great games fast compared to the rest of the industry: they retain talent that knows how to work with the tools they use
UE5 is such a joke. It's the hangs that really make things so bad. Nothing prepared me for turning up all the fancy UE5 settings in Fortnite to see what the engine was all about and watching my 4090 laptop get brought to its knees while still not looking all that great in a lot of cases. The main reason I installed the game was to see a showcase of UE5, and I was unimpressed to say the least.
@@Tw33zD I never compared the laptop 4090 to a desktop 4090. I'm well aware that the laptop 4090 uses a variant of AD103 (same die as the desktop 4080) with a bit less memory bandwidth due to power limitations on laptops. It still has 16 GB of VRAM and can consistently pull close to 200 W in games. That's extremely good for a laptop GPU. Oh, and it cost 2k and GPU temps stay well below 80°C. Pretty good deal for a complete system with AD103. UE5 is just a dogshit engine that doesn't work well. Everyone knows this. Doesn't matter how godly your PC is. UE5 lags behind many of its modern peers.
People called this out two years ago when UE5 was first announced: all studios will shift to it eventually and stop optimising their games, relying solely on upscaling methods. Ever since Control was first released, games have been leaning on upscaling, and companies never looked back. The developer of RollerCoaster Tycoon, famous for near-perfect optimisation and performance from coding in pure assembly language, even remarked that games will eventually all become blends of the same product instead of art, due to heavy reliance on the same game engines like UE and Unity.
Unity thankfully still has a great forward and forward+ renderer that can use MSAA and look crisp, and they're making it more and more optimized as the years pass. Unity is living its redemption arc now. Godot is pretty decent too; with forward rendering MSAA is safe, but it's a lot less optimized in some areas than Unity. To give it credit though, it has some really fast fog and GI rendering.
@@Hahdesu I love walking into a house only to turn around, look out the door/window, and get blinded by god, with everything outside the house washed out into white light. I love it even more when there are enemies that can still see me and shoot at me from outside, and the only way I know where they are is by using my compass to see the red dot. UE5 photorealism/lighting = once you step inside, it's like your character has never seen the sun before in their life and gets blinded
Such a huge world, a unique genre of game in its kind, anomalies, mutants, NPCs, atmosphere... and you really want 200 fps? Oh, those 20-year-old "mom experts". Cyberpunk lagged at launch; remember Witcher 3? And Starfield... oh, those 20-year-old "mom experts" looking for graphics. STALKER is atmosphere
@@onestopgamechannel7846 I hate to break it to you, buddy, but this has nothing to do with UE5; it's their incompetence in using it correctly. I agree with your general sentiment, though.
I remember when the "Made with Unity" splash screen made people say the same thing haha. It's probably more of a skill issue than a matter of which tool the developer chooses. But making games is incredibly hard, so I'm not trying to demean anyone :D
@@VambraceMusic It's the tool, given that even Fortnite devs, people who actually work on and actively develop the engine itself in the very studio that created it, have trouble optimizing for it.
@@Mr.Genesis Then why did Days Gone optimize it so well? Or the Mortal Kombat games? Or Batman? It's the developers' fault; they are known to ship unpolished games.
@@Mr.Genesis The tool certainly matters a lot. It's a difficult decision to make: do you spend the time and effort creating your own (like Jonathan Blow creating his own programming language), or use what's available at the cost of performance due to bloat in the code or whatever else it might be? Or in music: do you use samples, or farm your own goats for the drums you build yourself? At the end you're more of a goat farmer than a musician :D
Especially compared to the original three games that came before this one. A-Life was amazing. Being able to hear of a gunfight in a different region on the radio, go to that location in real-time and find the stragglers still duking it out was fun as hell. Roaming mutant hordes, the squad based tactics... all of it was great. Not perfect but great.
Yeah, enemies just spawn out of thin air. You just ran through a big open field? Enemies will spawn right behind you, even though that's literally impossible realistically
I can't tell if you're talking about enemy AI in general, or just in the STALKER series. I'm used to enemies either running straight into my line of sight for free kills, or just hiding in a single spot for an entire gunfight. As someone who's new to the STALKER series, the enemy AI feels great in comparison to a lot of other games
Hearing 343 Industries (now Halo Studios) say that anything is going to save the Halo franchise has the same vibe as hearing your gambling-addict father say he's going to make it all back this time if you just lend him some more money. They've been "saving" the Halo franchise for over 12 years.
That's the issue with new games. For some reason, 2015 games like Dying Light or Witcher 3 look much sharper than these new Unreal Engine ones, which only look good in trailers; in gameplay they're blurry and really nauseating to play idk..
@@mar1999ify they look blurry because all of them use TAA and upscaling, because it's easier to slap FSR and DLSS on a game than to optimize it decently
The lack of any real A-Life makes scoped guns completely pointless. No reason to get that SVD at all when a pimped-out SMG handles the game's exclusively close-range combat.
@@gooddoggo3547 A-Life is currently being looked into, there's no timeframe on when it'll be implemented properly. It is not coming in the patch later this week.
No excuse for games to run this poorly on the level of hardware we have today. Don't blame the graphics cards, don't blame it on the VRAM. The game drops below 60 fps sometimes on a 7800X3D; shame on the developers.
Yes, particularly in terms of CPU utilization it feels like devs have become lazy simply because there is powerful hardware available to brute-force stuff. I feel pretty confident that if the most powerful gaming CPU available was an R5 3600, it would be possible, with work, for devs to get all these games running at 120 fps on it. Not just making this up; the fact is that there isn't any special workload or complex simulation that modern games require of the CPU which wasn't already present in games 5 years ago. It's just less efficient. With graphics, at least there is the diminishing-returns argument: a graphics load that is 3x harder to run only looks 5% better subjectively, so that's very hard on GPUs. But for CPUs we are mostly doing the same thing as before with way more powerful processors.
@@SolidChrisGR LMFAO stop simping for devs, they are clearly at fault. "Every UE5 game runs like shit"? Black Myth: Wukong doesn't, Hellblade 2 doesn't, The Finals doesn't. None of them drop below 60 on a 7800X3D, probably not even below 100 lol.
The thing about these UE5 titles that use Nanite: Nanite tanks VRAM. Just by playing on DX11 in the Silent Hill 2 remake, I stopped having drops to 5 fps. I'm still rocking an RTX 2060 that I bought in 2019; running on DX11 I played the game comfortably and streamed it on Discord with no problems
Having worked with UE4 and UE5 for the better part of 4 years professionally now, I can confidently say that it's not the engine. It's a hodgepodge of previs/vertical-slice promises, knowledge deltas and poor optimization.

UE5 is a great engine, but not well documented. You've got to put in work (and a lot of it) to understand how its systems are interconnected, what it does and doesn't do well, and the best practices to circumvent common issues. The engine is extremely dependent on manual performance optimization. That includes best practices, as well as hacks (and I can't overstate the importance of hacks). On top of that, when starting production, the engine has a habit of giving you false impressions about how well you're doing on performance. You'll see decent profiling stats all the time, but there's a tipping point you can't easily come back from once you've broken through that threshold, because scaling back is way harder than scaling up.

In the case of Stalker not everything is lost though, as its main performance issues seem to be happening on the main thread and in how the game is handling its AI systems. Depending on what exactly they are doing, there's still tons of potential for optimization. But yes, that should have happened before launch, not after.
A-Life 2.0 isn't even in the game and it runs like shit. I bought the game cause I've played so much Anomaly and I felt I owed them a purchase, but I legit played 2 hours and uninstalled, going back to Anomaly lmao.
@@HansLollo What do you mean by hacks? I think it's definitely the engine's fault if you can only get acceptable performance out of it by using hacks that aren't exposed normally. That's some BS UX Epic did here.
tl;dr: if you know how to use the engine and spend a long time optimising the game, you can get an Unreal 5 game to run decently. The reality? Even if people do know how, they won't do it, because it costs time and money when these scumbags can just say "just use DLSS and frame gen to get 60 fps at medium settings, that's enough for us hue hue hue".

Unreal is not a complicated engine compared to most others. In fact, because it's the most used engine, the odds of having no one familiar with it in the entire team are practically zero. It's not about lack of knowledge. It's about lack of care. Games years ago had soul and passion put into them. They were made to sell, but they were made to be fun first. If that meant they took a long time to come out, they would take a long time to come out. Now it's "release this game in 1 year or it's scrapped". The bar for quality simply does not exist anymore for these people.
Too many games today look worse and perform worse than similar games from years ago, all on hardware that is significantly more powerful than from years ago, and at a much higher price. Something needs to give. Yesterday.
But they've added 8K detail to the character's individual eyelashes! How can we live without ultra detailed textures that get collapsed into a single pixel more than a meter away?
Heavily stylized games like Overwatch 2 look 100x better for the simple fact that they make the world look interesting instead of drab, washed-out mud. The overreliance on upscalers is why most of these games look like shit, not to mention the focus on hyper-realism takes away any flair the game could have had.
"Stalker 2 has an artistic style and doesn't feel quite real"? As any person born in Russia, Ukraine or Belarus will tell you, it is more than real. Any game has some sort of artistic style, but if you google how abandoned homes or towns in the CIS region look, you would also say that it is way too realistic)
I learned this early with Remnant 2, one of the first UE5 projects. The game runs terribly to this day even after a year and a half of patches. I'm so scared for Monster Hunter too. It might not be UE5 but RE Engine seems to have the same problems. What is with these fucking engines man? Beg Valve for Source 2 or SOMETHING. Anything but UE5...
@@AuthorityCat Yeah, RE Engine is fine on linear and small scale environments but by god, does it run like shit on the MH Wilds demo. I remember playing DMC5 at high settings on a freaking integrated GPU at 30fps. Now, the MHWilds demo runs like shit on my 3060 med settings, and that's with DLSS setting on.
The graphics are there, when there's a storm in this game it's absolutely terrifying and beautiful at the same time. But yeah, UE5 is a bitch and runs like ass.
I agree, the graphics don't seem that impressive. I mean, sure, it looks good when it's set to Epic settings, but at low settings it's like a game from the 2000s and blurry as hell. On the other hand, low settings in Alan Wake 2 don't look that bad; even if we ignore the ugly terrain, the rest is phenomenal.
That means modding will probably be killed in their games. Enjoy buggy games that can't be modded to fix them. I was hoping they would kinda carry the torch if/when Bethesda fails with ES6, but it looks like Bethesda will be the last ones making very moddable AAA games
@@GoalOrientedLifting Maybe you should think for a second before writing nonsense. STALKER 2 was released four days ago, and there are already 7 pages of mods for it on Nexus
@@Finnishguy777 Did I say there wouldn't be mods? No. But the level of modding you can do on a game MADE with modding in mind, on their OWN engine that ships with a modding kit, versus Unreal 5, are two completely different worlds. Look up what you need to know and the tools required for UE modding vs how easy it is to mod with a dedicated modding kit for a studio's own engine. Maybe you should think for a second
why study computer science when you can chatgpt it? why code the game when you can github copilot it? why optimise the game when you can AI upscale it? its all the same at this point, we are all cooked 💀💀
Funnily enough, in cybersec we are really benefiting from this, because a lot of glaring mistakes are being exploited thanks to people who are not very good at programming building important functionality.
Worth noting that as a senior/principal dev, LLM tools are actually very helpful at speeding up development. The trick is, as with any code you get from any other source than yourself, you need to understand the code before utilizing it. Don't copy/paste and use in production blindly... that is scary, haha. I've had code generated that has security issues, but since I took the time to understand it, I was able to prompt the LLM to fix them, or fix them myself. Yet, even in those cases, it still saved me time.
Game dev here. Let me be honest: most UE5 games (I think 90% of them) don't really need UE5's signature features like Lumen and Nanite. The main benefit of those two features is cutting down development time, NOT making the game more optimized. For example, if the game doesn't have very difficult lighting scenarios (dynamic day-night cycles, many moving lights, etc.) you don't need Lumen and could just use baked lightmaps. For Nanite, pre-processing LODs in the 3D authoring tools can get the job done just fine. The result might have some visible LOD pop, but you save tons of computing time. I haven't played this game yet, so I can't say whether their decision on Lumen or Nanite is justified, but for most UE5 games it isn't.
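For anyone unfamiliar with the pre-processed LOD approach mentioned above, here's a minimal Python sketch (the distances and triangle counts are entirely made up for illustration):

```python
# Pre-authored LODs picked by camera distance: the traditional
# alternative to Nanite. Distance thresholds and triangle budgets
# are hypothetical values, not from any real engine or game.
LODS = [
    (10.0, 100_000),      # LOD0: full-detail mesh, close up
    (50.0, 10_000),       # LOD1: decimated in the authoring tool
    (200.0, 1_000),       # LOD2: heavily decimated
    (float("inf"), 100),  # LOD3: far-away impostor
]

def pick_lod(distance_to_camera):
    """Return the triangle budget of the first LOD whose range covers us."""
    for max_distance, triangle_count in LODS:
        if distance_to_camera <= max_distance:
            return triangle_count

print(pick_lod(5.0), pick_lod(120.0))
```

The "pop" the comment mentions happens exactly at those distance thresholds, where the renderer swaps one pre-baked mesh for another in a single frame.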
I think the game definitely needs lumen because there are a ton of different weather cycles and events that are dynamic, anomalies create their own light sometimes, it's necessary imo. But nanite I'm not sure, I think a lot of the textures would be pretty easy to pre process.
@@kyero8724 the game was announced in 2018. They had like 4 years of dev time before Russia invaded. Yes its fucked up they had to work through it, but no one would have cared if they stuck with Unreal Engine 4. Instead in 2022 they completely reworked the game for Unreal Engine 5 pushing the development into the war.
I agree with everything you said but just wanted to play devil's advocate for a sec: 1) upscaling really isn't THAT big of a deal at 4K. When I use "quality" DLSS at 4K I can barely tell the difference, though certain games look better than others. I think developers may be looking forward and betting that over the next 5+ years 4K will become the norm and upscaling won't matter to the masses. 2) Upscaling itself will continue to improve; it might be that eventually we can't even tell the difference unless we're looking for it. 3) Having stock engines allows developers to focus on what's important: making good gameplay with good stories. I'm a musician; it would be a drag to have to build an amplifier and a guitar before I recorded a new album. 4) With Intel jumping into the GPU space and beating the 4060 at a cheaper price, maybe we're at the very beginning of GPU prices becoming reasonable again? I'm not optimistic about any of this, but I'm just trying to stay positive when it's pretty crappy times to be a PC gamer. I'm just happy I can afford this overpriced PC I have, but I'm a 37-year-old man. It sucks for the younger folks that spending 1000+ on a GPU is just outta the question.
The actual truth is that UE5 gives developers a lot of subpar tools that were never supposed to be replacements for the actual thing: for example, Nanite is not a replacement for LODs, yet everyone treats it as one. That's like 50% of the performance hit right there.
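For anyone unfamiliar with what a traditional LOD chain actually is: it's precomputed offline, each level carrying a fraction of the previous level's triangles, and the renderer picks a level from the object's on-screen size. A toy sketch of the idea (all numbers and thresholds illustrative, not from UE or any real pipeline):

```python
def build_lod_chain(base_tris, levels=4, reduction=0.5):
    """Precompute triangle budgets for each LOD level (the offline step)."""
    return [int(base_tris * reduction**i) for i in range(levels)]

def pick_lod(chain, screen_fraction, thresholds=(0.5, 0.25, 0.1)):
    """Pick a LOD index from how much of the screen the object covers."""
    for i, t in enumerate(thresholds):
        if screen_fraction >= t:
            return i
    return len(chain) - 1  # far away: cheapest level

chain = build_lod_chain(100_000)
print(chain)                  # [100000, 50000, 25000, 12500]
print(pick_lod(chain, 0.6))   # close-up  -> 0 (full detail)
print(pick_lod(chain, 0.05))  # distant   -> 3 (1/8 the triangles)
```

The runtime cost of this scheme is essentially one comparison per object, which is why pre-baked LODs remain so hard to beat when the per-pixel-detail of Nanite isn't actually needed.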
This is exactly true. I am a UE game dev; on my channel I did a video testing Nanite performance. It is terrible with anything less than 20 million triangles in the frustum. And projects that large with Nanite enabled won't even start on the most common systems from the Steam Hardware Survey; UE5 will just crash. I have eight computers of differing hardware, from low end to high end, for testing my game releases on to make sure they are performant. I will never use Nanite, or design my games around needing it.
When something is so good that it becomes the *_standard_* across the board, then it by default becomes "generic" because everyone is using it since it's so good. So you want shitty looking games or what? If you want interesting art directions go play Minecraft. 😂
@PotatoeJoe69 I'm not saying it's necessarily bad. But different game engines can help make a game feel more unique. If every game feels like a reskin of another, what's the fun? I think that can still be accomplished with UE5, but recently quite a few titles give off that "it's UE5" feeling.
It feels like developers are no longer bothering to optimize their games when they can just tell gamers to use DLSS/FSR. People forget frame generation was originally for 4K gaming to help give playable frame rates.
And the generated frames don't even match your motion, which gives a sensation of delay or odd responses. Generating frames is not the way to go.
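To put rough numbers on that delay: interpolation-style frame generation can't show a generated frame until the next real frame exists, so responsiveness stays tied to the real frame rate, and the hold-back adds roughly one real frame of latency on top. A back-of-the-envelope sketch (a deliberate simplification that ignores render queues, driver pacing, and latency-reduction features like Reflex):

```python
def frame_gen_numbers(real_fps, gen_factor=2):
    """Rough model of interpolation-based frame generation."""
    real_frame_ms = 1000.0 / real_fps   # input is still sampled at this rate
    shown_fps = real_fps * gen_factor   # what the fps counter reports
    # The interpolator must hold frame N until frame N+1 arrives before it
    # can blend between them: roughly one real frame of extra latency.
    added_latency_ms = real_frame_ms
    return shown_fps, real_frame_ms, added_latency_ms

shown, real_ms, extra_ms = frame_gen_numbers(30)
print(shown)     # counter says 60 "fps"
print(real_ms)   # but input is sampled every ~33.3 ms
print(extra_ms)  # plus ~33.3 ms of hold-back delay
```

Which is why 30 fps doubled to "60" can feel worse than a real 40: the counter goes up while the input-to-photon delay goes up too.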
Quick note on Halo at the beginning: the move to UE was something that was a long time coming. Halo's engine was fine, if not a bit esoteric as in-house engines tend to be, but the suite of tools used to build the games (Faber) were mired in unbearably oppressive technical debt. According to Jason Schreier, 343i leadership was seriously considering moving to UE4 during preproduction, but ultimately decided against it in favor of developing Slipspace (an updated version of the engine they'd been using since Halo CE). This massively backfired, as the abysmal state of Faber meant that routine tasks such as recompiling the engine or iterating on content took orders of magnitude more effort to get done than it would have with Unreal. Fixing Faber's countless shortcomings would have basically necessitated rebuilding the engine again, so why not just go with a known proven engine that already had halfway decent dev tools?
I think source 2 is such an underrated engine. New counter strike maps look so good and run so well (I can't even imagine running something that looks similar on my 1050ti laptop in UE5). Not photo realistic like UE5, though they still look amazing, without compromising so much on performance.
Yeah source 2 is amazing. But they have the benefits of pre baking almost all the lighting. Conceptually you could probably tune a UE5 map to look as good and run as well if you spent enough time
yea, idk if Valve allows others to use it to make games like with Source 1, but I wanna see how far Source 2 can be pushed graphically, cause it's only used in CS2, which needs to be optimized as much as possible since most people still play CS2 on low-end PCs; in s&box, which is aiming to be cartoonish; Valve's newest game, which is also in a cartoonish art style; and HL: Alyx, which looked really good, but I feel like it's still not maxed out due to it being a VR game.
And it's also (somewhat) well optimized: my shit-ass laptop (no graphics card, just a plain potato laptop) can run CS2 at 55-60 fps, ofc at lowest settings, but still looking beautiful.
Who's underrating Source 2? That's the silliest thing I've read today. Source 2 is just updated Source 1, so you're comparing a 20-year-old, fully matured game engine to UE5, a completely different, completely new engine from its predecessor UE4 that's also only been released for a little over 2 years now...
Back then games made you wonder how they could look that good with average hardware. Now they make you wonder how they run like crap on high end hardware. GTA V ran on a PS3. We’re not gonna see the same story with GTA VI and PS5.
@@bahshas Style choices being baked lighting. If more games were just content with having baked lighting you wouldn't need a thousand dollars in GPU hardware to get a respectable framerate. Of course the ultimate irony here is that the first Mirror's Edge was an Unreal Engine 3 title.
@@zelcion I'm playing the Metro trilogy. Metro 2033 looks as good as Metro Exodus, but it's less blurry and has better performance. I just can't understand all this tech BS thrown in and bloating everything: we are using TAA, then upscaling, then frame gen, and it makes all the engine power and graphical improvements go directly to the trash can.
@@Wkaelx I agree. Played the Metro series recently too. The jump from 2033 getting 350 fps, to Exodus getting between 50-120 fps, doesn’t appear to be reflected in the graphics. Yeah, they are better, but not that much better. I’m playing the Mass Effect Legendary edition right now. Mass Effect 2 looks absolutely brilliant with the new textures. Incredibly sharp, and high FPS. I would much prefer a game made to run well. 5 percent better graphics is just not worth 50 percent less fps.
No. What kills gaming is rushed developers and developers overusing tech for what it isn't made for. Lumen isn't a one-click button to replace all your lighting and make any game magically look good. It's a tool like anything else. It needs proper knowledge to use and optimize properly. Same with Nanite. These are all helpful tools, but not made for you to just slap onto any game and call it a day. Satisfactory runs on Unreal Engine 5 and uses Lumen. It can render GIANT factories with thousands of objects and still stay over 60 fps. It's not about the tools. It's about the one using them.
I've hated this engine since UE3. Most devs don't know how to use it; other than Epic themselves I never saw anyone do physics properly with it. It doesn't look that good, the lighting is always weird and fake, and it struggles with bigger maps. Picking it for STALKER was an amateurish decision.
@@BostilCensurado very few UE3 games are nice to play. Killing Floor 2 looks amazing. Sadly the game was mismanaged to the point it now weighs 90 gigs with all the cosmetics, but hey, for 2016 it was incredible.
Not on my PC. I really wanna know what everyone that's complaining is running. Then they wanna cry that they shouldn't have to upgrade their PC, lol. The entitlement of people nowadays is off the charts. Don't get me wrong, I deal with the memory leaks occasionally as well, but I believe that's fixable in time.
@@Terp-daze I have an RTX 3050, i5-10400 and 16GB (2x8GB) 3600MHz DDR4 RAM. It's not the best, but as a 3D artist it meets all my needs. It also plays most games well at 1080p high. The main issue is that developers don't care about optimization at this point, and UE5 is an unstable engine. I can run an empty scene at 120fps on the Epic preset, then I add a cube and boom, 90fps. Like, what? The engine itself is tanking fps because of its lighting model and TAA/TSR implementation.
They should've used Exodus engine for Stalker it would've been perfect. UE doesn't even get rid of its plastic look for how much performance it chugs to look realistic.
My thoughts exactly. Imagine if Metro was built on UE: how would it look AND how would it perform? So glad that studio stuck with the engine they made the previous 2 games with almost a decade ago, yet still managed to change the look and crank the graphics up to an amazing level. You can notice how much more technically advanced Metro Exodus is than the previous 2. I certainly hope they stick with the 4A Engine for their next title as well.
As someone who studies Game Design and Development at university: we are being taught how to use and develop for UE5, and the consensus is that most devs/students like using it because of how friendly the visual scripting is and how easy it is to add high-detail models and textures. We also learn Unity 6000 and some small pieces of Godot. I personally don't like developing for UE5, just because I find programming in C# for Unity easier, and I don't need a high-end GPU compared to UE5. Plus, when developing we use on-demand texture streaming in the engine, which slows the process down so much.
I think this is being pushed by asset managers like BR and VG to push people to upgrade and make money on their other computer-hardware investments. This kind of hidden monopoly is not good for consumers.
@@ПавелВолков-э9и Don't blame the drug dealer, blame the addict! Enabling stupid behaviour is equally responsible and ultimately, without the enabler there is no situation to begin with.
@@ПавелВолков-э9и Well, when there's no example of anyone using the tool properly, then maybe the tool is at fault. Every single UE5 game runs badly, every UE5 game is very CPU-heavy, and in every UE5 game upscaling is almost always a requirement. Hell, there exists no CPU on the market strong enough for the engine not to have problems.
Make low graphics great again. Imagine if the same mentality behind photorealistic games were applied to any sport, requiring players in said sport to wear Gucci kits at minimum to attend.
Graphics are irrelevant; the aesthetic is what leads the direction. When someone thinks in terms of "bad" or "good" graphics: graphics without aesthetic intention are irrelevant, and worse than crude experiments is a soulless image.
@@cezarstefanseghjucan Except graphics are getting worse and performance is getting worse, all while playing Stalker 2 I kept thinking "I've seen better games that performed better what is this shit..."
Back in the PS1 days devs were really crafty at squeezing every bit of power from the PS1 console. Imagine taking those guys and giving them access to today's resources and tech.
I believe the major thing is that a huge number of developers today don't understand how computers work. I'm from the generation of developers who wrote their own interrupt handlers and multitasking schedulers in assembler on 8-bit machines, and I can tell you that modern programming frameworks are more complex than understanding how a computer works at the CPU/GPU level. It feels like junior developers simply lack an understanding of what causes performance losses. Besides, it is not only knowledge of how computers work; it is also doing as little as possible, saving resources for the stuff that matters. Of course some things can be added when they take practically no resources. However, what counts as "enough" is not clear to many developers, or gamers. Optimization is not targeting game content at 4K resolution, and not all fancy features should be on all the time. I think we could have better metrics for that. One really universal metric seems to be polygon size.
@@St4rdog In my opinion, it takes at least 20 years to learn both the high-level and low-level stuff well enough. So learning should start in childhood, and at 30+ it is possible to be a good enough developer. That means people who start learning in their twenties are in a bad position if they want to be hired, because AI will make junior-level coding useless. That means a 10-year career that starts at 40. Or option 2: be an entrepreneur.
Two things: it's not really proper to blame the engine for performance; optimization is up to the devs, and UE5 will run insanely well if you have an actual performance specialist on staff to optimize the game. The real reason AAA likes UE is that they don't need to hire actual software engineers, thanks to the blueprint system, which can be taught to a high-school graduate in a couple weeks of training. And the real reason these AAA UE5 games suck is exactly because they hired cheaper labor in order to put more money into marketing for day-1 sales. The unfortunate reality is that more marketing gets more sales than a better game.
You got it. UE5 allows for cheaper and more interchangeable labor. The more games using UE5, the more devs will know UE5, which makes it easy to upscale and downscale the workforce on a whim. UE5 removes the skill needed to create a videogame.
100%. People are blaming the engine for developer error. The engine is capable of beautiful, performant games at a scale no previous engine has ever come close to.
No doubt the engine has huge optimization and tuning capabilities. However, it's up to the developers to actually use and fine-tune them. And optimization and QA are among the first things that get cut when the deadlines start hitting. Remember: Minimum Viable Product.
Blame the devs not the engine. Games are optimized and perform well depending on the work and effort that goes into them. I do not want to watch another video of video essayists on youtube pretending they understand everything they talk about.
@@mikec2845 No, it is not. The technology that makes its "beautiful games" is also heavily taxing on any hardware. I do not want ultra-realistic games that slowly crawl at 540p upscaled to full HD at 30fps; I want fast games that run at 60 without upscaling. Fr, I had a joke back in UE4 times and sadly it is still accurate: if Titanfall 2 had been made with UE instead of Source, it would be 2-3x more demanding in requirements but have the same graphics.
As an Unreal Engine user of more than 5 years, I think more projects utilizing Unreal 5 also pushes Epic harder to polish their engine to be more adaptive to all kinds of scenarios. I can understand people worrying about it taking over. But to be honest, there is still Unity, with a larger market share.
I still use a 1050 Ti and I can run 80% of games that come out nowadays, and the top 20% are almost always dumbed-down tech demos that are worthless to play. Wtf am I going to play Stalker 2 for? They did nothing new; it's just "game you have nostalgia about 2". Just like the hardware has improved almost nothing in a decade, so have AAA games.
Hot? Take: Not everything needs a day-night cycle. Having several static ToD settings with high-quality baked lighting is usually fine. We're getting so lost in "muh graphics" and deadlines that we're forgetting about great optimization. And gameplay, ofc. GTA V had mirrors you could see yourself in. 10 years ago. Without RT.
Yeah, this take would be valid if only the video weren't about Stalker. Also, I think I can count on my fingers the relatively popular games I've played that have a day-night cycle.
Source 2 has been out for years and is incredibly outdated. You're excited about an engine that literally came out 10 years ago and looks like it. Also, ask any CS player how Source 2 is going. Lots of performance issues.
@@asmilingguppy2212 Still, the performance is way better on Source 2, and besides, Source 2 was and is being refined over the years. Do you think they made Unreal the day it was announced?
@@ksavx2119 Unreal has been around for years, but there have been major shifts and updates which have not occurred in the Valve ecosystem. Either way, my point is that no, Source 2's performance can be awful. There are tons of issues with CS2 and shaders. It isn't the engine, it's how you use it. And I suggest you look more into Source 2 before making claims about the engine.
All the companies switching to UE5 will probably actually hurt devs. It'll be so much easier to fire people if everyone in the industry just runs the same set of tools.
You hit the nail on the head. This is exactly what's going on. The big gaming industry has already been desperately trying to find ways to get rid of as much staff as possible; it's no conspiracy theory at this point, it's a commonly known fact, unless you've been living under a rock or are delusional.
Yeah, back then if you worked for CDPR, for example, you most likely wouldn't ever be fired, because teaching a new dev to use the RED Engine was expensive. Now, if you need a better developer, you just hire another one, and he already knows how to use your generic game engine.
Well, thanks for another fine piece of work here, Vex. Special thanks for the small details you pay attention to. And, of course, for bringing up the S.T.A.L.K.E.R. 2 topic so in-depth; a great piece of labor you've put into it. I do appreciate it; it's a special thing today for me, for us here, really. GSC have their faults throughout their history, yet they are our boys, as 4A Games are, itself founded by descendants of GSC (Grigorovich Sergiy Constantinovich) Games in 2009-2010 (just after Call of Pripyat, the last of the original S.T.A.L.K.E.R. trilogy, was released). Greetings from a fellow gamer, myself, since 1996, from Kyiv. I absolutely agree with what you've said about UE5. I'd like to believe games using it are going to improve over the engine's life cycle, as was the case with the previous Unreal Engine iterations. It's some kind of tradition already: the UE3, UE4, UE5 demos blow minds entirely when compared to contemporary games. It's like with consoles, like the PlayStation since the very first one: devs usually only get a real hold of an engine and squeeze the most juice out of the system on the brink of the next-gen launch. I mean, UE5 is still a young engine, with immense potential, but still. I'm a factory worker and my rig is quite limited in its budget: I've got an i7 4790K at 4.6 GHz, 32GB DDR3 RAM, and I've just swapped a 1060 6GB for a 1080 Ti, just for this very game. I see now why they've changed the official sys. req. to a 3070 Ti. At 1680x1050 (my display's max resolution in 2k24, not a joke, still much better than my EPS/XPS cutter machine at work), all settings on Epic (max), with no upscaling, the MSI Gaming X 1080 Ti gets about 35-45 fps; the ASUS Dual 1060 6GB OC got like 17-20 fps, quite predictably. Actually, the S.T.A.L.K.E.R. series was never famous for fine optimization. Especially the second one, Clear Sky. That was the time (2008) when I first met an option described as ray tracing.
Sun lighting rays back then were just full dynamic shadows from a moving sun, but they took such a heavy toll on those days' systems that no GPU was capable of it, not even the dual-GPU GTX 295. Ridiculous, but the 2008 game at max settings gets about 50 fps on a 2013 GTX 780, about 60 fps on a 2016 GTX 1060 6GB, and about 100-110 on a 2017 1080 Ti. At 1680x1050, still. You know, I remember well my first experience with the launch-day Cyberpunk 2077, almost 4 years ago now, with my GTX 780 on board (the then-minimum sys. req.), and the experience wasn't smooth, for sure. Now it was the very same feeling. It will get better, if GSC cares for its child long enough. BTW, if you guys remember, there were serious problems with the performance of Cyberpunk on PS4. Well, take a look at Steam Deck gameplay of S.T.A.L.K.E.R. 2... =)
I'm more concerned about the lack of alternatives to Unreal Engine; there are far too many game studios using it nowadays. I'd rather CDPR had stuck with REDengine after doing so much work on it for Cyberpunk and kept evolving it. Not to mention that UE5 seems very immature, and although it has some good features, it also has some serious real-time rendering performance issues, which upscaling is being used to mask in games. I get the feeling that UE5 is being targeted at archviz and other offline uses, so the performance issues are being ignored.
Or maybe, just maybe, the engine has a ton of features, devs don't know how to budget well and want to use all of those features when we're not there yet in terms of hardware, and then they can't optimize it either. And that's just because using those features markets better. The best example is the hair in Stalker, imo. Completely unnecessary; they could at least have used fallback hair cards, but now the game tanks in towns due to the hair settings. That's a frame-budget issue, not the engine's fault. We have a ton of engines on the market already, just not free ones. Unity shot themselves in the foot, and Godot still isn't there feature-wise. Another thing to consider is the nature of UE5: that it has everything by default attracts every lazy dev out there. That's why there's this misconception that Godot and Unity have more creative games: people who work in those engines cannot afford to be lazy with out-of-the-box solutions.
Capcom will sadly never license out the RE Engine despite it being quite the powerhouse, especially now that DD2 was used as a guinea pig for its open-world capabilities. It shows slight aging, mainly with questionable hair physics, but they have been chipping away at the uncanny valley it has.
The heavy reliance on TAA and DLSS in games nowadays is unacceptable. And even then, unless you use global illumination (a big hit on performance), the shadows in these games are worse than in The Witcher 2 (Enhanced), for example, which is over 12 years old by now. It's pathetic. Every time I hear about a game moving to Unreal 5, or whatever the new one is, I become concerned; it has become a red flag. We have some rough years ahead of us.
What do you mean, "some rough years ahead of us"? It's looking like devs are getting into UE5 more and more. Expect innumerable rough years ahead of us.
@@alphawolf60 Got me in the first half, ngl. And yeah, probably. I'll just stick to indies and 10+ year old games. Not that the AAA industry has put out anything of value lately.
I've been telling my friends that games just don't run well without upscaling anymore. I mean, what the hell happened? Since RDR2, nothing has run as well at native resolution (or DLAA, as they call it now). Upscaling and frame generation have let devs get away with unoptimized crap.
Assassin's Creed Valhalla gets 90ish average fps on my 2019 PC (Ryzen 2700X and RX 5700) without an upscaler or frame generation, so don't say that after RDR2 nothing runs as well at native resolution, cause it's objectively false. It started happening when upscalers and frame generation were introduced. Now frame generation has become mandatory cause devs got lazy with optimization. Monster Hunter Wilds literally says to enable an upscaler and frame generation on the recommended specs page. This is the future.
@@SapiaNt0mata Assassin's Creed is just the same game every 2-3 years, and they have their own game engine. So it would be a shame if Ubisoft games had bad performance. And all Ubisoft games feel the same.
Well, they did outsource the project to a garbage company full of incompetent hacks. Still doesn't excuse the lack of respect, not only for their own franchise that made them this big to begin with, but also for the customers and fans!
The AI of the monsters is so janky. Literally all monsters follow the same attack pattern: run straight towards you, hit, run back, turn, run straight towards you, hit, etc. The pigs, the pseudodogs, the flesh, even the bloodsuckers and poltergeists do exactly the same. Hit and run, repeat.
That's the definition of well-designed AI. In fact, they even know how to avoid your line of sight effectively, so you can't abuse the surroundings all that much. Mutants will hide behind objects when they can't reach you.
@@MrZlocktar Yes, of course, just like a real boar encounter: they're known to hit people, run back, run sideways, and run at you again multiple times. Truly genius. Of course, when I see a bloodsucker I think "this looks like the kind of creature that hits you and then starts running away in circles like Sonic".
@@MrZlocktar But you know animals don't do this, right? If it's a fake attack to make you back down, they will run at you and then retreat. But if it's serious, a boar will knock you down and rip into you with its tusks, probably impaling you. Bloodsuckers doing hit-and-runs makes even less sense. It's lazy AI scripting, which hopefully will improve in the future.
@metalema6 In real life you don't have a health bar, smartass. You know what that means? Play it on Veteran and try to fuck around with boars. Go on, champ. You'll learn really fast how the IRL scenario would go. And Stalker somehow depicts it rather accurately. Spoiler: they won't hit and run; you'll be crippled by the first hit, and then you die of bleeding.
I think the problem is that these large game studios test their games on state of the art gaming PCs instead of using PCs with the lower levels of performance that the average consumer would have.
I don't know if game developers have the same problem as software developers, but at the company I work at, the software devs do not seem to grasp that companies will not put our software on a good new machine. They're instead waaaay more likely to put it on the dusty computer in the corner with a processor released back when I was still in college. Telling them that they need to make sure it works on old equipment really doesn't go anywhere.
I care about performance much more than graphics especially in FPS games. Doom Eternal looked good and pushed almost 200 fps on my system which enhanced the gameplay. This game feels like a laggy mess in comparison.
Personally, I don't believe that game graphics need to be better than they were in about 2005. At that point, there was plenty of detail available to make everything easily identifiable. But it wasn't covered in so much unnecessary detail and special effects that you couldn't see anything. Devs focused on the big details that players would actually notice while playing, instead of the tiny details that only show up in perfectly staged screenshots.

There's a reason the retro look is so common in indie games. It's easier to make, easier to run, and with a bit of skill it can still look really nice. And it's why I find games like Ultrakill or Animal Well more appealing than any new "AAA" game like CoD or Horizon.

Sure, there are some cases where having tons of detail is good, like if making your game perfectly photorealistic is an integral part of the storytelling. But the vast majority of games don't need to be as detailed as they are these days. No one cares if each individual eyelash on every single NPC is modeled in so much detail that you can count the dust mites on them. Because unless you're going to spend all your time staring at a wall counting the individual pixels, you're never, EVER, going to see that detail. Instead, you're going to be sprinting past all those hundreds of gigabytes of 4K textures and hyper-detailed GPU-choking shaders, wishing you could be fighting monsters at more than 30 FPS.

Because of that, most players are just going to end up disabling most of that detail anyway by turning down the graphics options. Which usually makes the game look WORSE than it would have 20 years ago, while still running worse, too. Which is just plain stupid.
Not everyone wants to play an ugly-ass-looking game; amazing how that works in this day and age. And I can agree with it to an extent, depending on the style. I prioritize gameplay more, but a good-looking game is just as important along with it.
@@JDumbz Different strokes, I guess. Gameplay matters much more to me; like, fr, I'd unironically play a game that's pixelated to all hell if the gameplay elements are fire.
Remember when developers used to invest time, energy, resources, money and great talent into developing their own engine? Because I do. And gaming was way better then.
And lmao, there come the lies. "And gaming was way better then": I loved it when GTA 4 stuttered like hell. "Remember when developers used to invest time, energy, resources, money and great talent into developing their own engine": yeah, like the Creation engine used in Starfield, which has no CPU performance issues, right? "Because I do": no, you clearly don't, lol.
@@DragonOfTheMortalKombat Before you get all emotional, try to take a deep breath, then comment. There's no denying that the industry has been going downhill since the big cough. When I was a kid, you bought a game and it worked when you brought it home to play. No patches, for better or worse; the games were more soulful and complete.
The problem is that people are demanding more out of games. Back when games didn't need realistic physics, advanced lighting, particle systems, skeletal animation with inverse kinematic support, landscape generation capabilities, material editing tools, dynamic global illumination, animation and sequence editing tools, and countless other modern features, it WAS possible for studios to make their own engines - although they often just licensed another engine and modified it, but that's another story. Now, though, people expect so much more from games than a ten to twenty hour single-player experience with minimal variation and customization. All of those features take time to develop, and frankly it's a waste of time to reinvent the wheel. Why bother rebuilding an entity handling system when someone else has already done it? Or a material painting system, or a physics engine, or any of the other features we take for granted in modern titles. So developers have to pick: focus on the parts of the game that make it a unique creative endeavor, or spend time and energy redoing two years of what Epic, id, or Valve have already done and polished. Unfortunately, there's a bit of a trap there: because Epic works hard to make a low barrier to entry (via blueprinting, mostly), it's very easy to sit down with UE5 and have a world you can run around in, in only a few hours. However, it's not very scalable, and the engine tools aren't good enough to do optimization work without the user understanding how things work under the hood. If you make a complex set of AI rules for the NPCs in your game and manage them all via blueprints in the render thread, your game performance is going to blow chunks. This is an obvious gotcha, but there are a *lot* of nonobvious ones - including the entire blueprint system itself - and as such it's very easy to make a very poorly optimized title in UE5.
The right way to do a UE5 game is to use blueprints for prototyping, then get people who actually know the engine to build the systems you've prototyped in blueprints again using C++, making appropriate optimizations along the way. This takes money, time, and experience, though, so it's not always done... and as a result you have a ton of poorly optimized UE5 games on the market.
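As a language-agnostic illustration of that prototype-then-port workflow (a toy Python model, NOT Unreal API; all names here are made up): the "blueprint" version routes every node through a generic dispatcher each tick, while the "ported" version is one direct function. The behavior is identical; what disappears in the port is the per-node dynamic dispatch overhead:

```python
import operator

def eval_graph(graph, inputs):
    """Blueprint-style: walk a node graph, one dynamic dispatch per node."""
    values = dict(inputs)
    for name, op, a, b in graph:
        values[name] = op(values[a], values[b])
    return values[graph[-1][0]]  # value of the final node

# speed = min(base + boost, max) modeled as a tiny two-node graph
graph = [
    ("sum",   operator.add, "base", "boost"),
    ("speed", min,          "sum",  "max"),
]

def eval_native(base, boost, max_speed):
    """The hand-ported equivalent: same logic, no interpreter overhead."""
    return min(base + boost, max_speed)

i = {"base": 600.0, "boost": 150.0, "max": 700.0}
print(eval_graph(graph, i))              # 700.0
print(eval_native(600.0, 150.0, 700.0))  # 700.0 -- identical behavior
```

The graph form is great for iterating on the design; the ported form is what you want running per-frame for thousands of actors, which is exactly the division of labor described above.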
@@eagleone5456 There's truth in this, but when triple-A games come out and they don't look photorealistic enough, or the graphics aren't good enough (for example the facial animations in Starfield), the games get destroyed online. The fact is that people demand better-looking graphics, animations, lighting, whatever, which ends up taking much more resources to make and puts much more load on the CPU and GPU.
For me, UE5 graphics are so so good in still images, but are COMPLETELY destroyed when moving. Shimmering in finer details such as in the grass and shadows, ghosting, blurring caused by TAA creating a motion blur effect basically, reflections that are super weird when you move, etc. This kills image clarity and does not justify the performance drop for me AT ALL. That's the biggest problem I have with UE5. I just don't agree with people thinking that it "looks better" than old games so "it's normal that it runs like shit". I only agree with the fact that UE5 is making it super easy for developers to create games, which is a good thing, but IMHO we just are not there yet in terms of engine optimization and hardware. I'm tired of every UE5 game being the new "Crysis 3".
But the problem isn't exactly the engine; it's the devs' fault. They think that if they use an already established engine, it won't need optimization. That's wrong. The engine is a tool; you still need to do your part. When you make your own engine, you do the optimization yourself because you're making the engine. You can change a lot of things in UE5 too, and of course build things yourself. I've lost count of the number of games I've tweaked to get better optimization and image quality via their ".ini" files. If we gamers can do it, they can too. UE5 is amazing; they just need to use it properly. I think they use UE5 to cut corners, like "I won't be wasting that much time because I already have this much done".
"I've lost count of the amount of games that I tweaked to get better optimization and image quality via the ".ini" files of these games." Same. I've even used hex editors to alter the code of certain games to allow for things like proper ultrawide support. And don't get me started on how many .ini edits I've had to make in games like Fallout 4. Which begs the question: Why tf do we as players have to do so much of the optimization work on behalf of the devs? It's ridiculous.
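For anyone curious what those ".ini" tweaks typically look like in UE games: the usual approach is adding console variables under a [SystemSettings] section in the game's Engine.ini. A sketch of some commonly tweaked cvars (the exact config file path varies per game, and the values here are illustrative starting points, not universal recommendations):

```ini
[SystemSettings]
; 0=none, 1=FXAA, 2=TAA, 3=MSAA (forward renderer only), 4=TSR
r.AntiAliasingMethod=1
r.ScreenPercentage=100      ; render at full native resolution, no upscaling
r.MotionBlurQuality=0       ; disable motion blur
r.DepthOfFieldQuality=0     ; disable depth of field
r.Tonemapper.Sharpen=0.8    ; mild sharpening to counter TAA/TSR softness
```

Individual games can ignore or override any of these, so it's trial and error per title.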
"But the problem isn't exactly the engine. It's the devs fault. They think that if they use an already established engine, it won't need optimization." This probably does apply to quite a few developers out there, but with regards to GSC and STALKER 2 it's blatantly false. The game essentially uses a custom fork of UE 5.1 because the engine, as available from Epic, didn't entirely fit the developers' requirements.
This is true, HOWEVER, we should expect to see top tier optimization and amazing performance in games that Epic makes themselves. This is not the case with Fortnite. Occam's razor says UE5 is AIDS.
The real PROBLEM is that there aren't enough programmers, and companies do NOT want to pay for them. They want to lower development costs, so all they need to do is hire artists who can be outsourced.
I've been saying it's corporatism that's ruining gaming. The big suits come in and ruin the original vision of the game, make cuts that take away from it, or just rush it out unfinished because of money and deadlines.
Yeah. OpenGL and DirectX / Direct3D were made by a bunch of geniuses decades ago, who do we have making rendering pipelines today? I mean the GPU companies can make some impressive visuals, but the realtime optimization & the mountain of artifacting problems needs some work.
The funniest thing is that developers from the original Stalker went to 4A Games and created the 4A Engine. Metro Exodus, released in 2019, looks maybe a little worse than Stalker 2, but it's also about a third the size of Stalker 2 on Unreal Engine 5. 😅😅
That's a negative; Metro Exodus has, to me, the best graphics-to-performance ratio. Even on low settings it still looks so good. Good luck finding UE5 games that look good without basically turning them into a polygon soup on low settings.
I have said this before and I will say it again, upcoming games made using Rockstar's Rage and Naughty Dog's game engines will embarrass UE once again like they have done over the last decade.
Let's add Capcom RE engine, the Id tech engine and the 4A engine (which does path tracing on consoles at 60fps and looks fantastic) to the list as well. There are probably more.
Call me old-fashioned but I refuse to use any kind of upscaling / frame generation because I think it's fraud. If engines rely so heavily on this tech why do I need to buy a GPU that doesn't have better performance than one that's ten years old?
@@Legalizeasbestos That's the strangest thing about it; I highly doubt it's due to performance. In the original trilogy, the majority of A-Life's brute force was handled by comparing the player's coordinates against those of certain spawns defined in a .txt. It really didn't put any extra stress on the hardware. Quite strange that it's disabled.
Since 2007, when Unreal Tournament III came out, I got that feeling, like some games just look wrong - grey and blurry. I much preferred the cartoonish visuals of UT2004.
With how awful Unreal Engine 4 and 5 are, it makes me almost feel spoiled by Unreal Engine 3. It had awful lighting problems and blurry, melty texture issues, but at least the actual anti-aliasing worked. Now we don't even get that.
I think this is a topic that needs more attention. There are so many reasons why one engine having a near-monopoly is bad for games. One that I don't think many realize is how easy it makes it for big companies to lay off full-time employees. They used to have to train people on their own engine, but now they can just outsource work or hire contract workers during crunch periods and then scale down to a small team when there's less work. Unreal is also one of the biggest reasons dev time has increased: every dev is in an arms race trying to make their game look better than the last Unreal Engine release to stand out. It used to be that every engine had different strengths and games could stand out in more creative ways, but that's not the case when they're all using the same tool.
I remember when they showcased UE5 saying it wasn't demanding and that we wouldn't need expensive hardware.
LMAO who said that ?
@@DragonOfTheMortalKombat The UE5 showcase.
@@DragonOfTheMortalKombat
That's how they "advertised" it.
Lumen and Nanite, revolutionary low cost methods that would let you push graphical boundaries at good performance.
That did not work out at all huh.
seems like they were aiming for 2030's cheap hardware, not 2020's cheap hardware
imagine if devs apply this lighting that Unreal Engine has made. fps will tank by another 10-20%.
say it with me folks,
DLSS is not a replacement for game optimization.
DLSS is not a replacement for game optimisation!
TAA is not a replacement for anti aliasing
@@TRLuckyNZ DLSS is not a replacement for hyperrealism mania.
Black cats are the nicest, friendliest cats you'll meet.
@@jmal orange cats would like to introduce themselves (probably by running into something).
I hate how games have became blurrier and blurrier over the years. All these detailed textures and models with hundreds of thousands of polys just to slap TAA and DLSS on top of it, not to mention all the temporal ghosting. We're at the point where older games rendered at native res with 4xMSAA look better than a lot of modern games on image clarity alone.
Exactly. It's all sooo blurry even at 4k it's disgusting
The foliage looks like tv static
Maybe you should get your eyes checked.
MSAA now looks like an anomaly in history. It had already achieved the best visuals, and for a few years now devs have tried different techniques that, in one way or another, couldn't match what MSAA did a long time ago (MSAA was in the OpenGL specification in 2002).
So why not use MSAA? It's incompatible with other visual effects, not even very sophisticated ones. It's edge-based supersampling, and while interpolating color samples is fine, interpolating depth/normal/stencil is not. MSAA will not come back.
The blurriness however, that's TAA and DLSS and how they use it to hide purposeful dithering of effects that would otherwise destroy the performance even more. Everything comes at a price. Not sure if i'm willing to pay this one.
This
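To make the depth/normal point concrete, here's a toy Python sketch (purely illustrative, not engine code) of why an MSAA-style resolve, averaging subsamples, works for color but produces garbage for the depth and normal data a deferred renderer keeps in its G-buffer:

```python
# Toy illustration: MSAA's resolve step averages subsamples. Fine for color,
# meaningless for G-buffer data. All numbers are made up for illustration.

def resolve(a, b):
    """MSAA-style resolve: average two subsamples component-wise."""
    return tuple((x + y) / 2 for x, y in zip(a, b))

# Two subsamples straddling a silhouette edge:
near_color, far_color = (1.0, 0.0, 0.0), (0.0, 0.0, 1.0)   # red fg, blue bg
near_depth, far_depth = (0.1,), (0.9,)
near_normal, far_normal = (0.0, 0.0, 1.0), (1.0, 0.0, 0.0)  # unit vectors

print(resolve(near_color, far_color))   # (0.5, 0.0, 0.5): a sensible blend
print(resolve(near_depth, far_depth))   # (0.5,): a depth on NEITHER surface
n = resolve(near_normal, far_normal)    # (0.5, 0.0, 0.5): not unit length!
length = sum(c * c for c in n) ** 0.5   # ~0.707, which breaks lighting math
print(length)
```

The averaged depth lies in empty space between the two surfaces and the averaged normal isn't even unit length, so any deferred lighting pass reading those values computes nonsense at edges. That's the structural reason MSAA and deferred shading don't mix.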
The main reason Witcher 4 switched to UE5 is: The original devs that understand and know how to fix and maintain RedEngine all left, they have no one to maintain it or upgrade it. Majority of those devs are now with Rebel Wolves.
Welp. Witcher 4 is so fucked
@@WaitstressMe when I'm fucking stupid
DEI is the name of the destroyer. witcher four gays
@@TheBatCat101 gamers when woman
@@heythere6552 I mean, almost every market women entered started going down hill, just happened with the game industry in the last 15 years.
The reason why we have this problem, is that the development studios got rid of all or most of the staff who knew how the proprietary engines worked, so they thought it easier and cost effective to just switch to UE instead of training new people to work with their engine.
Well, as it turns out, there are even fewer people experienced with UE5 who will work for big companies.
Well, that’s fine. But the problem is they don’t seem to know how UE5 works either.
Trust me, X-ray engine, the one from previous stalker games was a pain to work with and a buggy mess
@@goncalolopes5818 yup, X-Ray did a great job gameplay-wise but is an absolute clusterfuck behind the scenes. Anomaly and GAMMA are the result of thousands of people squeezing every inch of performance out of the engine, yet those mods barely hold together. No one would bother to make a 2024 game on X-Ray.
With a bespoke engine, you can optimize exactly for the task at hand, but the cost to reach modern standards will kill you. Same as with operating systems: a purpose-built one may be best, but in the end you grab some Linux off the shelf, because... well, "code is a liability". Don't write code.
The fact that Half Life Alyx was able to run on my Nvidia 1060 on good settings is mind boggling. Valve is really good at optimizing their stuff
Idk if you knew, but that game had a dynamic resolution you couldn't turn off. Even my 2080 couldn't lock the full res.
Source 2 has produced some of the clearest most crisp graphics I've ever seen come from an engine in the modern era of games, and it runs better than everything else!
It's crazy that I can fire up HL-2 and still admire the graphics from a 20+ year old game.
@@mryellow6918 That’s true, but imo makes sense for vr, where native resolution is often higher than 4k. Internal resolution below 1080p is where I tend to draw a line for my own preferences (on a flat screen, probably too low for vr).
I really wish Valve releases Source 2 for other devs to use sometime soon.
At this point, it would cost me LESS to just create a radioactive chernobyl hell-scape to live in.
imploding your own nuclear reactor > buying current day hardware and/or software
underrated comment
Do it let’s get some co op irl stalker
Though you could get the same experience in Ohio
lolololololololol
tfw you spend 1000 dollars to run a game at 40 fps and it looks about as good as a unity asset flip from 2017
Stalker 2 really didn't need to push the boundaries of current hardware. I would have enjoyed it even with 2020 graphics if you know what I mean.
Funny thing is - it looks like a game from 2016-2018
Agreed
Exactly. If they'd used, for example, the engine from Witcher 3 or Cyberpunk, that would be perfect, or some CryEngine... I don't understand the obsession with Unreal.
I would take Kenshi graphics or something if it meant the game would be fun and run well. I just don't care about graphics anymore. Give me a fugly-looking game that runs well and is fun over a "beautiful", "photorealistic" blur fest of 30 fps AI upscaling.
@@Pidalin It's well documented and babby's first engine.
I was so disappointed when I read that CDPR were ditching Red Engine in favour of UE5. Cyberpunk 2077 looked pretty damn good even on lower settings imo.
Yeah at lowest settings it was still looking as good as the best PS4 games. It looks like a game, not a movie, thats all.
And the writing and gameplay of Cyberpunk is so good it just doesn't matter.
It's like so many people want this photorealistic game just so they can show people how real it looks 😂.
I am over 40, so its all mindblowing to me. The complexity and amount of player agency in games is what impresses me about modern gaming. Graphics be damned.
We played Qbert and Snake and loved it. 😂
Red Engine was a shitty engine compared with the unreal engine. When you license the Unreal engine you have at your disposal a bunch of software to create a full game from scratch
@@ormvianaMaybe, but having such large chunks of the industry depend on Unreal doesn't feel right.
@@miguelcondadoolivar5149 Then don't complain about games taking too long to develop or underdeveloped stories or gameplay because the dev's are trying to get the engine to work.
@@ormviana red engine was much better than unreal but CDPR hired half brained programmers who never worked with anything else
In my opinion, when DLSS and FSR got introduced to the gaming world, every developer felt like: let's optimize the game a little bit, throw that in, and hope it does the rest. I hate it.
I don't think it even works that good either, I just keep the resolution scaling off in cyberpunk cuz I can't even tell a difference. it's either no impact lag with stutter, or slightly less stutter and a ton of input lag
That's not true, DLSS and FSR are just really good for PC gamers, it's literally free performance for a slight downgrade in image quality.
Unoptimized games existed before DLSS and FSR, the problem right now is the fact that GPU have been getting way more expensive and it's become hard for devs to optimize for mid-range GPU while having graphics that are considered next-gen (when a game doesn't have next-gen graphics it get attacked by gamers).
With FSR 3, at least, you can compensate for the fact that GPUs are more expensive. The technology also has a lot of potential; DLSS is already capable of making 1080p look almost perfectly like 1440p.
@ni9274 "slight" meaning sometimes completely blurring images in motion, blurring textures, prone to visual artifacting. Battlefield 1 and battlefield 5 for shooters as an example, didn't need upscaling to achieve visually impressive scenes or performance.
@@ni9274 "free performance" is not a thing.
@@ni9274 What free performance? It does nothing in Escape from Tarkov except blur the people I am trying to shoot at. It does not increase performance in any game I have played, outside of the basic lowering of the input framebuffer.
It's a joke to consider rendering the game at 720p and upscaling it to 1080p native as 'free FPS' it is not.
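For scale, the arithmetic behind that 720p-to-1080p claim (plain math, nothing engine-specific):

```python
# Pixel counts at common resolutions.
def pixels(width, height):
    return width * height

native = pixels(1920, 1080)    # 2,073,600 pixels at 1080p
internal = pixels(1280, 720)   #   921,600 pixels at 720p

# The GPU only shades ~44% of the output pixels; the upscaler invents the rest.
ratio = internal / native
print(round(ratio, 3))  # 0.444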
DLSS is horrid, the fact we're being forced into it makes me wonder if theres a conspiracy with unreal engine devs and GPU manufacturers.
I feel like NVIDIA felt like they messed up with the gtx 1080, they made the card too good and it lasted too long, so now they're making cards that last less long with "new" tech each time for a more constant cash flow.
The only conspiracy here is how unknowledgeable you are about PC hardware and software. Same goes for the person that made this video.
@@michaelsexton924 prove it, smartarse. Profits are always chosen over a good product nowadays, nvidia gets way more money from stuff outside gaming anyway.
@@michaelsexton924if you were smart you'd know that these people are always trying to make money, not please buyers. Nvidia keeps making money.
I think devs just don't CARE to optimize it because nvidia gives them the crutch, and nvidia is happy that devs are becoming dependent on their AI framegen or upscaling slop.
dlss and fsr were designed so lower end cards and older cards can get more life out of them not for the flagship cards to run games at passable performance, but somehow here we are.
i dont want photo realistic games if hardware cant run anything. Graphics needs to stop going "futher" it needs to optimise and let hardware catch up before we go futher
How do developers then explain why their modern games look worse than their previous titles to the average consumer?
Production cost and time will drastically increase as well in an industry where development costs are already out of control.
@@penumbrum3135 the "average consumer" wants a fun game.. they really don't care about graphics.
Take Nintendo's games and sales.. they beat Sony's games and sales on lesser hardware and they don't get PC ports.
@@penumbrum3135 You don't chase after graphics.. Nintendo does quite fine without having to make their games "photo realistic"
Nintendo's games typically outsell "photo realistic" big budget games as well or make more money using less resources
Maybe I'm looking too deep but I feel like everything is becoming more superficial. It's all about what's on the tin anymore. I am hopeful that this game in particular will get better but games these days are hype fest disappointments.
the only way we are going to get better hardware is if there is a reason for companies to create it, if there aren't demanding games like stalker 2 pushing hardware limits then hardware companies are just gonna get lazy
Devs have got to stop relying on upscaling cheat codes like DLSS and FSR and instead optimize their games better.
if a game is optimized for intel or nvidia, it will always be a shitshow if you got any AMD hardware
At this point they need general optimization within the engine, this is bullshit across the board how low FPS youre getting per dollar on these games @@xbm41
They won't, because YouTubers and pundits won't shut up about "how great DLSS is" rather than talk about how terrible it is that we even need it.
@@xbm41 We're well past that. It has nothing to do with your hardware brand.
@xbm41
Just wrong. Soo many use amd. Better price performance.
Nvidia is like 300mph bike. Its nice but expensive, yet you can get a 280mph bike for half the price
Crysis 3 doesn't look bad at all compared to this game and it runs much better. This graphical innovation is going in the wrong places.
That is the power of Cry Engine : )
@@GameBoyyearsago That is the power of good developers. When you only hire cheap junior devs with little to no experience, then UE games are bad optimized and look the same.
Dei hires.
And rage 2, and metro exodus, and doom eternal. All 3 look better than Stalker 2 tbh
@@GameBoyyearsago LOL :) Crytek had many troubles, and many forks went to other studios. It went wrong when it stopped focusing on high-end PC and went console-compatible like all the other engines; then corporate management went crazy and nearly bankrupted the company. I guess Crytek recovered somewhat, engine-wise. "Hunt: Showdown" is a cool game showing off Crytek's CryEngine, but it feels just like Unreal in looks. All the Far Crys were forked off an older CryEngine. Same with "New World" and "Star Citizen", as Amazon Game Studios forked CryEngine into Lumberyard.
Thanks for being real.
UE5 is the stutter engine. So gross.
Did the devs port the whole game to UE5, or is it some hybrid like the GTA remasters? In those failed remasters, RenderWare is the main engine handling physics, AI, game logic, et cetera, while Unreal is only used for rendering.
The engine is not the problem, developers who don’t know how to optimize are the problem
@@PuxiIt's actually UE4
@@SerialDesignatioN UE4 renderer and RenderWare for all the rest. Do your homework next time.
Original Stalker engine is crash engine
I prefer having more stutter than more crash
Just an FYI, Yakuza is not switching to Unreal Engine, they only did it for their remake of Ishin(2014)
And boy was it awful. It got close to emulating Yakuza 0, but the bugs at the beginning... god, just characters not being in the scene at EVERY CAMERA TRANSITION was enough to destroy any immersion.
@ i also didn’t like it, it felt much more clunky, and small things like your character not hitting enemies when lying down, big downgrade from the dragon engine
oh thank god
@@cuppedcup the Dragon Engine is the definition of clunky though; Ishin was a bit better
Damn 😂 I just bought that game yesterday cuz it was on sale fuuuuuukkk meeeee
Paying over $1000 for a card and not getting 60 on 1080p is so fucking demoralizing
Unless we get framegen that entirely gets rid of input delay, then it's just not worth it
A 1000 dollar card should be able to CRUSH 1080p lmao, people gotta stop excusing these studios fr
In 5 years you'll remember this as the good old days when gpus didn't cost 6000$ 😂😂
lmao which GPU costs 1K dude? but anyways if anyone should get the blame its you for paying that much in first place
That's why you pay 400 (cad) and get similar results.
It is crazy how barely anybody is talking about how bad TAA looks.
Because games look significantly worse without it. I mean, look at Metaphor: ReFantazio. The final rendered image is abysmal, with tons of aliasing everywhere, and it's very distracting. A TAA mod exists and it makes things leaps and bounds better despite not even being a proper native implementation.
@@wanderingwobb6300 TAA looks significantly worse than aliasing artefacts, in my opinion.
@@wanderingwobb6300 TAA looks like an improvement in static images, but all that ghosting is a TAA artifact: it's literally taking data from the previous frame and "cleaning up the edges" of an object that isn't even there anymore. The rustling of leaves or grass right in front of you looking like static? That's TAA too. TAA is "fast" and "easy", and that's why it's used.
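The "taking the data of the previous frame" mechanism can be caricatured in a few lines of Python. This is a deliberately simplified 1-D model of TAA's history blend; real TAA also reprojects history with motion vectors and clamps it, which is exactly where it fails and ghosts:

```python
# 1-D caricature of TAA: out = lerp(history, current, alpha). A small alpha
# smooths noise, but it also means a moved object's OLD position takes several
# frames to fade out of the history buffer -> ghosting trails.

ALPHA = 0.1  # typical-ish blend weight: 10% new frame, 90% accumulated history

def taa(history, current, alpha=ALPHA):
    return [h * (1 - alpha) + c * alpha for h, c in zip(history, current)]

# A bright object sits at pixel 2, then jumps to pixel 7.
frame_a = [0, 0, 1.0, 0, 0, 0, 0, 0]
frame_b = [0, 0, 0, 0, 0, 0, 0, 1.0]

history = frame_a[:]          # converged history while the object was static
for _ in range(3):            # three frames after the object moves
    history = taa(history, frame_b)

print(round(history[2], 3))   # 0.729 -> a ghost lingers at the OLD position
print(round(history[7], 3))   # 0.271 -> the new position is still fading in
```

After three whole frames, the old position is still brighter than the new one. Production TAA fights this with motion-vector reprojection and neighborhood clamping, but any failure of those heuristics shows up as exactly the ghosting and smearing described above.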
Threat Interactive makes some good videos breaking down issues with UE and its shitty TAA
@@wanderingwobb6300MSAA looks way better and it still does antialiasing just fine
It's just mindblowing how much more processing power modern games need while not looking much better. Best example: Batman Arkham Knight. That game still used Unreal Engine 3, with some features and rendering code ported over from UE4 (because they found UE4 buggy and too resource-intensive). Or look at Alien Isolation: that game practically runs on a toaster these days and looks great, aside from the bad AA implementation. And Alien Isolation didn't use any global illumination, to save resources; EVERY light source in that game was dynamic.
How is this possible?!
As a small dev using UE every day, I can confirm one thing : If you listen to Epic, your game is gonna be a beautiful, shiny .... slug.
You can't use hardware raytracing, especially if you want to use it for reflections and shadow or it will DEVOUR your budget. I still use screen space reflections cause even in software mode, Lumen eats a lot. Letting it take care of light is already taxing.
You have to limit your texture resolution a lot (everything else is already taking all the VRAM), and don't use Nanite unless you have very dense areas. Most of the time Nanite will give you big quad overdraw, and you'll prefer well-adjusted LODs (which take much more time). The scalability profiles you select (you know, those "shadows", "textures"... settings profiles that set a bunch of core variables) must be tweaked to fit your needs and not exceed them. Something many people don't do.
Unreal Engine is not as "out of the box" as they want everyone to think (except for the assets; you were right, everyone uses Quixel scans lol). But even with those assets... you'd better optimise.
My game Lovely Traps Dungeon will soon (a week or two) have an updated demo on Steam if you want to try it (the one available now is quite outdated; the project evolved): a humble indie game with half-stylized graphics (meaning custom shaders and a lot of home-made assets) made with UE5.5 (in the coming update).
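The "well adjusted LODs" workflow described above boils down to picking a mesh LOD from projected screen size. A hypothetical Python sketch of that selection logic (the thresholds, triangle budgets, and the simplified screen-size formula are all invented for illustration; engines expose this as per-mesh screen-size settings):

```python
# Pick a level of detail from how large the object appears on screen, so
# distant objects don't pay full triangle cost. Numbers are illustrative only.

LODS = [  # (min screen-height fraction, triangle budget)
    (0.50, 100_000),  # LOD0: object covers more than half the screen height
    (0.20, 25_000),   # LOD1
    (0.05, 5_000),    # LOD2
    (0.00, 500),      # LOD3: tiny on screen
]

def pick_lod(object_height_m, distance_m, fov_tan=1.0):
    # Crude pinhole projection: apparent size falls off linearly with distance.
    screen_frac = object_height_m / (distance_m * fov_tan)
    for i, (threshold, _tris) in enumerate(LODS):
        if screen_frac >= threshold:
            return i
    return len(LODS) - 1

print(pick_lod(2.0, 3.0))    # a 2 m object seen close up  -> LOD 0
print(pick_lod(2.0, 100.0))  # the same object 100 m away  -> LOD 3
```

Hand-tuning those thresholds per asset is the "much more time" part, and it's the work Nanite promises to remove, at the cost of the overdraw behavior described above.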
@@MGrey-qb5xz They just decided to protect you from abusive game passes. And they don't force DRMs. Quite the opposite since they recommend we give more value instead of adding anti-piracy technical crap.
But yes, I'm trying to make it so it can be sold anywhere.
@@MGrey-qb5xzsteam ? Anti consumer ?
@@FinalDefect I love Steam and that's my go to, but I think @MGrey-qb5xz might be talking about how Steam doesn't sell you the game, but a license to use.
This has been a topic in recent lawsuits where Steam doesn't want to move away from the verbiage that's associated with buying, because saying you don't own it could hurt sales.
@FinalDefect rather not got in to this argument here but steam has been degrading many features and making the user experience worse for publisher and greed sake
As a Unreal Dev that works with photorealistic assets a lot, I disagree with all of this comment.
Last gen games running at 150+ fps native res look better to me than current gen upscaled trash. Everything is blurry and full of artifacting and ghosting now, and running slow on top of that
I don't really understand what is happening to graphical priorities at the big studios. It seems like framerate is barely part of the discussion, when its the biggest contributor for a good image when you are actually in motion playing. I dont get it.
Wait for better hardware to become cheaper. Technology has slowed down drastically in leaps.
@@nontoxic9960 when the average user gets better hardware, games just get more inefficient. It's happened every year since I was a child, and I don't believe it will change anytime soon. People don't change, not when their current approach is profitable anyway... :(
This. I obviously lack some insights, but I don't really get why a game that doesn't look any better than Doom 2016 or Doom: Eternal has to be that much more demanding and run so much worse, falling apart in some visual regards as well.
Simply lack of skill. I am in the games industry and it’s just a gigantic clown show compared to 1990-2012. Too many imposters got into the industry in every department. It’s kind of like TV and film.
This is why we see more indies succeeding.
@@michaelduan5592 So just like everywhere else, organisations are shedding huge amounts of specialised knowledge and making a worse product because to do so is more profitable.
We've gotten to the point that games in real time rendering barely look better than 6 years ago (or even worse in some cases), and they can't even be run baseline on modern hardware (having over 10x the performance from 6 years ago), and instead resort to running at 20 fps and making up fake frames in between to get you to the bare minimum of 60.
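On the "fake frames" point: with interpolation-based frame generation, the in-between frame can only be built once the next real frame exists, so the presented framerate doubles while responsiveness gets worse, not better. A minimal sketch of that timing, assuming the simplest halfway-interpolation model (real implementations differ in the details):

```python
# Illustrative timing model for interpolated frame generation.
real_fps = 30
real_dt_ms = 1000 / real_fps        # ~33.3 ms between real frames

# To show the frame between N and N+1, the GPU must already have N+1,
# so each real frame is presented roughly half an interval late.
presented_fps = real_fps * 2        # the "60 fps" shown on the counter
added_latency_ms = real_dt_ms / 2   # ~16.7 ms of extra display latency

print(presented_fps, round(added_latency_ms, 1))  # 60 16.7
```

So the frame counter says 60, but the game still samples input at 30 Hz and every image arrives later than it would without frame generation, which is why it can't substitute for a real 60 fps baseline.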
It's crazy to me that My 2060 or even my 1060 before it can run battlefield 1 max graphics no issues but LITERALLY can't run games like this when arguably they're on par graphics wise
@@lukemus401 Developers have been growing lazy year by year, and lately they are also incompetent.
@@zolikoffwell lot of them get pushed out and fired
I play enough games to know when programmers are lazy: the temps on my graphics card run much higher than in other games for no reason on my end. Hogwarts Legacy is a great example. I can cook eggs on it, while running other games at very high settings doesn't do much to keep me warm in winter.
@@MGrey-qb5xzyeah, most big gaming studios keep people around for a project or two, after they get better they need to be paid more so they just replace them with newbies and the cycle continues. That's why fromsoft and the Yakuza studio can release great games fast compared to the rest of the industry, they retain talent that knows how to work with the tools they use
UE5 is such a joke. It's the hangs that really make things so bad. Nothing prepared me for turning up all the fancy UE5 settings in Fortnite to see what the engine was all about and watching my 4090 laptop get brought to its knees while still not looking all that great in a lot of cases. The main reason I installed the game was to see a showcase of UE5, and I was unimpressed, to say the least.
You can not compare a trash laptop "4090" to a real 4090. That trash is more like a 4070. Get a real PC.
@@Tw33zD I never compared the laptop 4090 to a desktop 4090. I’m well aware that the laptop 4090 uses a variant of ad103 (same die in the desktop 4080) with a bit less memory bandwidth due to power limitations on laptop. It’s still has 16 gb of vram and can pull close to 200w in games consistently. Thats extremely good for a laptop gpu. Oh, and it cost 2k and gpu temps are well below 80c. Pretty good deal for a complete system with ad103.
UE5 is just a dog shit engine that doesn’t work well. Everyone knows this. Doesn’t matter how godly your pc is. UE5 lags behind many of its modern peers.
@@Tw33zD a laptop 4090 is still a 4090, stop shilling up UE5
@@BrandonMeyer1641 lol no
@@xeon39688 trashing on both bro
People called this out two years ago when UE5 was first announced: all studios would shift to it eventually and stop optimising their games, relying solely on upscaling methods. Ever since Control was first released, games have leaned on upscaling and companies never looked back. The developer of typhoon coaster, famous for perfect optimisation and performance from coding in pure assembly, even remarked that games will eventually all blend into the same product instead of art due to heavy reliance on the same engines, like UE and Unity.
Precisely. Upscaling tech, at this point, has *exclusively* harmed gaming in general.
Unity thankfully still has a great forward and forward+ renderer that can use MSAA and look crisp. They are also making it more and more optimized as the years pass. Unity is living it's redemption arc now. Godot is pretty decent too with forward rendering so MSAA is safe, but it's a lot less optimized in some areas than Unity. To give it credit tho it has some really fast fog and GI rendering.
" by coding with only assembly language " That is hardcore bullshit
@@synpathy6729 *you're
@@Navhkrin Rollercoaster tycoon was written entirely in assembly, by one person. OP got the name wrong
Those low-quality Lumen reflections in Stalker 2 look so bad that screen-space reflections would've looked and performed better
they look so bad it distracts me often, you are right
@@Hahdesu I love walking into a house only to turn around, look out the door/window, and get blinded by god: everything outside the house gets washed out and becomes white light. I love it even more when there are enemies outside who can still see me and shoot at me, and the only way I know where they are is the red dot on my compass. UE5 photorealism/lighting = once you step inside, it's like your character has never seen the sun before in their life and gets blinded.
Such a huge world, a unique genre of game, anomalies, mutants, NPCs, atmosphere... and you really want 200 fps? Oh, those 20-year-old experts looking for graphics. Remember Cyberpunk lagging at launch? Witcher 3? Starfield? Stalker is atmosphere.
@@onestopgamechannel7846 I hate to break it for you, buddy, but this has nothing to do with UE5 but their incompetence of doing it correctly. I agree with your general sentiment, though.
@@user-cm7yd1yo8g The great atmosphere in stalker 2 gets washed out by these idiotic things that breaks immersion. Stop eating slop
When I hear Unreal Engine 5, I tell myself: oh no, another stutter fest with bad performance...
I remember when the splash screen "Made in Unity" was making people say the same thing haha. Its probably a more of a skill issue than which tool the developer chooses. But making games is incredibly hard so im not trying to demean anyone :D
@@VambraceMusic It's the tool. Even the Fortnite devs, people who actually work on and actively develop the engine itself in the very studio that created it, have trouble optimizing for it.
@@Mr.Genesis Then why did Days Gone optimize it so well? Or the Mortal Kombat games? Or Batman? It's the developers' fault; they are known to ship unpolished games.
@@VambraceMusic Completely agreed. Non-UE games can be just as bad, if not worse.
@@Mr.Genesis The tool certainly matters a lot. Its a difficult decision to make, do you spend the time and effort creating your own (like jonathan blow creating his own programming language) or use whats available at the cost of performance due to bloat in the code or whatever else it might be.
Or in music, do you use samples or farm your own goats to use on the drums you build yourself. At the end youre more of a goat farmer than musician :D
The AI is some of the worst I've ever seen; idk why you're sugar-coating it
Especially compared to the original three games that came before this one.
A-Life was amazing. Being able to hear of a gunfight in a different region on the radio, go to that location in real-time and find the stragglers still duking it out was fun as hell.
Roaming mutant hordes, the squad based tactics... all of it was great. Not perfect but great.
@@Rexhunterj A life is disabled because UE5 is shit
yeah enemies just spawn out of thin air
you just ran through a big open field? enemies will spawn right behind you, even though it's literally impossible realistically
They are fixing it
There was just a patch, and A-Life is already back; as I understand it, it will improve even more
I can't tell if you're talking about enemy Ai in general, or just in the stalker series. I'm used to enemies either running straight into my line of sight for free kills, or just hiding in a single spot for an entire gunfight. As someone who's new to the stalker series, the enemy Ai feels great in comparison to a lot of other games
Wait till he finds out how good Stalker Anomaly looks on an engine that's almost 15 years old, with a bit of modding.
wait till he tries skyrim with a bit of modding
@@asmilingguppy2212 I hate modding. I think it ruins games. Developers make half-baked game and expect modders to "fix" it for free.
Exactly, and the graphics level there is perfect, why would more be needed
@@ScienceDiscoverer Modding can fix a game, or make a good game better. There are two sides of the coin
@@ScienceDiscoverer lmao you're dumb.
all games should have mod tools.
Hearing 343 Industries Halo Studios say that anything is going to save the Halo franchise has the same vibe as hearing your gambling addict father say that he's going to make it all back this time if you just lend him some more money. They've been "saving" the Halo franchise for over 12 years.
UE is the RPG Maker of AAA devs.
Damn! So accurate!
My rib cage exited my sides and went out of orbit from laughter, because this is so horrendously true.
Hey now, don't be mean. RPG Maker can run on a potato, so at least there's no insult added to injury.
and don't forget Ren'Py
at least most popular RPG Maker games look unique from each other
I'll always refer to it as the Unreal Stutter Engine.
Unreal Stutter 5fps
I recently returned to Dying Light 1, and honestly, when I saw how crisp the game looks thanks to the lack of AA, I can't go back to playing these new games
That's the issue with new games. For some reason 2015 games like Dying Light or The Witcher 3 look much sharper than these new Unreal Engine ones, which only look good in trailers but in gameplay are blurry and really nauseating to play, idk..
@@mar1999ify They look blurry because all of them use TAA and upscaling, because it's easier to slap FSR and DLSS on a game rather than optimize it decently
Istg dude, Dying Light 1 has the most crisp image ever. It's crazy how much better it looks than modern games
@mar1999ify Even Arma 3 looks amazing compared to some UE5 games
PBR materials don't work too well with FXAA; that's why you need crazy high resolutions (higher than 4K) or temporal supersampling.
The lack of any real A-Life makes scoped guns completely pointless. No reason to get that SVD at all when a pimped-out SMG handles the game's exclusively close-range combat.
Sad
@@SixTough A life is coming in a patch, seems it was bugged.
@@gooddoggo3547 good news! Can the game handle long range though?
@@gooddoggo3547 A-Life is currently being looked into, there's no timeframe on when it'll be implemented properly. It is not coming in the patch later this week.
true
no excuse for games to run this poorly on the level of hardware we have today. dont blame the graphics cards, dont blame it on the vram. the game drops below 60fps sometimes on a 7800x3d, shame on the developers.
I don't believe the developers didn't optimize the game. This stupid engine is the problem; every UE5 game runs like shit, EVEN 2D ONES !!
Yes, particularly in terms of CPU utilization it feels like devs have become lazy simply because there is powerful hardware available to brute-force stuff. I feel pretty confident that if the most powerful gaming CPU available was an R5 3600, it would be possible, with work, for devs to get all these games running at 120 fps on it. I'm not just making this up; the fact is that there isn't any special workload or complex simulation that modern games require of the CPU which wasn't already present in games 5 years ago. It's just less efficient.
With graphics, at least, there is the diminishing-returns argument. A graphics load that is 3x harder to run only looks 5% better subjectively, so that's very hard on GPUs. But for CPUs we are mostly doing the same thing as before with way more powerful processors.
@@SolidChrisGR LMFAO stop simping for devs, they are clearly at fault. "Every UE5 game runs like shit"? Black Myth: Wukong doesn't, Hellblade 2 doesn't, The Finals doesn't. None of them drop below 60 on a 7800X3D, probably not even below 100 lol.
It was a mistake to pick Unreal 5 as an engine. Not really the devs' fault.
The thing about these UE5 titles that use Nanite is that Nanite tanks VRAM. Just by playing on DX11 in the Silent Hill 2 remake I stopped having drops to 5 fps. I'm still rocking an RTX 2060 that I bought in 2019; running on DX11 I played the game comfortably and streamed it on Discord with no problems
Wasn't UE5's main marketing strategy about allowing amazing graphics with little to no performance impact? So what good is it for now?
Having worked with UE4 and UE5 for the better part of 4 years professionally now, I can confidently say that it's not the engine. It's a hodgepodge of previs/vertical-slice promises, knowledge deltas and poor optimization. UE5 is a great engine, but not well documented. You've got to put in work (and a lot of it) to understand how its systems are interconnected, what it does and doesn't do well, and best practices to circumvent common issues. The engine is extremely dependent on manual performance optimization. That includes best practices, as well as hacks (and I can't overstate the importance of hacks). On top of that, when starting production, the engine has a habit of giving you false impressions about how well you're doing on performance. You'll see decent profiling stats all the time, but there's a tipping point you can't easily come back from once you've broken through that threshold, because scaling back is way harder than scaling up. In the case of Stalker not everything is lost though, as its main performance issues seem to be happening on the main thread and in how the game is handling its AI systems. Depending on what exactly they are doing, there's still tons of potential for optimization. But yes, that should have happened before launch, not after.
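On the "main thread" point above: UE ships stat commands that split frame time by thread, so anyone can check this themselves. Listed from memory, so treat this as a sketch and verify against your engine version's documentation:

```
stat unit    ; Frame / Game (main thread) / Draw / GPU times in ms
stat game    ; breakdown of game-thread tick costs (AI, actors, etc.)
stat gpu     ; per-pass GPU timings
ProfileGPU   ; one-shot detailed GPU capture dumped to the log
```

If the "Game" time consistently exceeds the "GPU" time, the bottleneck is main-thread logic (like AI) rather than rendering.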
A-Life 2.0 isn't even in the game and it runs like shit. I bought the game 'cause I've played so much Anomaly and I felt I owed them a purchase, but I legit played 2 hours and uninstalled. Going back to Anomaly lmao.
@@HansLollo What do you mean by hacks? I think it's definitely the engine's fault if you can only get specific performance out of it by using hacks that aren't exposed normally. That's some BS UX Epic did here.
@@dd805100 What does this have to do with anything? Do you expect A-Life to make it run like shit?
@@dd805100 You sound like a loser. You don't like STALKER, you like Anomaly.
tl;dr: if you know how to use the engine and spend a long time optimising the game, you can get an Unreal 5 game to run decently.
The reality? Even if people do know how, they won't do it, because it costs time and money when these scumbags can just say "just use DLSS and frame gen to get 60 fps at medium settings, that's enough for us hue hue hue". Unreal is not a complicated engine compared to most others. In fact, because it's the most used engine, the odds of having no one familiar with it on the entire team are practically zero. It's not about lack of knowledge. It's about lack of care. Games years ago had soul and passion put into them. They were made to sell, but they were made to be fun first. If that meant they took a long time to come out, they took a long time to come out. Now it's "release this game in 1 year or it's scrapped".
The bar for quality simply does not exist anymore for these people.
The hardware these days is absolutely insane, but the devs have become so lazy that even a 5-grand PC feels like you're using mid-tier parts
Again, my 3070 Ti is running great 😂
@@kyero8724 nobody asked
Which is why I'm not upgrading yet. Performance per dollar is wack.
@@robfox6103door
@@kyero8724 Ignorant or lying; either way you're wrong.
Too many games today look worse and perform worse than similar games from years ago, all on hardware that is significantly more powerful than from years ago, and at a much higher price.
Something needs to give. Yesterday.
But they've added 8K detail to the character's individual eyelashes! How can we live without ultra detailed textures that get collapsed into a single pixel more than a meter away?
It makes me miss the PS3 era, where you had reasonably detailed 720p or 1080p; now we're upscaling 540p, struggling to get above 30fps.
They earn more, so it won't give.
Heavily stylized games like Overwatch 2 look 100x better for the simple fact that they make the world look interesting instead of drab, washed-out mud. The over-reliance on upscalers is why most of these games look like shit, not to mention the focus on hyper-realism takes away any flair the game could have had.
"Stalker 2 has an artistic style and doesn't feel quite real": as any person born in Russia, Ukraine or Belarus would tell you, it is more than real. Any game has some sort of artistic style, but if you google what abandoned homes or towns in the CIS region look like, you would also say that it is way too realistic)
Everytime a new game is announced to be made in UE5 I just know it's going to run like shit, no matter the developer.
My UE5 game I'm making is running at 100fps
I learned this early with Remnant 2, one of the first UE5 projects. The game runs terribly to this day even after a year and a half of patches. I'm so scared for Monster Hunter too. It might not be UE5 but RE Engine seems to have the same problems. What is with these fucking engines man? Beg Valve for Source 2 or SOMETHING. Anything but UE5...
Same with UE4. I know the game will look plastic and blurry.
@@AuthorityCat Yeah, RE Engine is fine in linear and small-scale environments, but by god does it run like shit in the MH Wilds demo. I remember playing DMC5 at high settings on a freaking integrated GPU at 30fps. Now the MH Wilds demo runs like shit on my 3060 at medium settings, and that's with DLSS on.
@@mitsuhh Sigh.
I expected to see insane graphics on UE5, and WHERE ARE THEY? All I see is stutter and poor optimization
The graphics are there, when there's a storm in this game it's absolutely terrifying and beautiful at the same time.
But yeah, UE5 is a bitch and runs like ass.
I agree, the graphics don't seem that impressive. I mean, sure, it looks good on the Epic preset, but at low settings it's like a game from the 2000s and blurry as hell. On the other hand, low settings in Alan Wake 2 don't look that bad; even if we ignore the ugly terrain, the rest is phenomenal.
@@hquan1 stalker 2 has better graphics than alan wake 2. Plus alan wake 2 isn't open world.
@hquan1 Yeah, the low settings look horrible. I was expecting better graphics, seeing how bad the performance is. I hope they heavily fix it
@@Johnson-r7g2o No it doesn't. Lol RDR2 and Cyberpunk look better than Stalker 2
The upcoming Witcher and Cyberpunk games will use UE5. Lol, we are so cooked.
That means modding will probably be killed in their games. Enjoy buggy games that can't be modded to fix them. I was hoping they would kinda carry the torch if/when Bethesda fails ES6, but it looks like they will be the last ones with very moddable AAA games in the future
It is gonna be another disaster
@@GoalOrientedLifting Maybe you should think for a second before writing nonsense. STALKER 2 was released four days ago, and there are already 7 pages of mods for it on Nexus
@@Finnishguy777 Did I say there wouldn't be mods? No. But the level of modding you can do on a game MADE with modding in mind, with their OWN engine that comes with a modding kit, versus Unreal 5, are two completely different worlds.
Look up what you need to know and the tools required for UE modding vs how easy it is to mod with a dedicated modding kit for their own engine.
Maybe you should think for a second
Metro exodus looks 1000x better than this
Every single Unreal Engine game stutters when a new area loads in, and Epic Games doesn't care about fixing it.
why study computer science when you can chatgpt it?
why code the game when you can github copilot it?
why optimise the game when you can AI upscale it?
its all the same at this point, we are all cooked 💀💀
Convenient methods for them that harm us in the end…
Funnily enough, in cybersec we are really benefiting from this, because a lot of glaring mistakes are being exploited thanks to people who are not very good at programming building important functionality.
Realistically speaking, it's not even the fault of the people programming it. It's whoever hired them
Worth noting that as a senior/principal dev, LLM tools are actually very helpful at speeding up development. The trick is, as with any code you get from any other source than yourself, you need to understand the code before utilizing it. Don't copy/paste and use in production blindly... that is scary, haha. I've had code generated that has security issues, but since I took the time to understand it, I was able to prompt the LLM to fix them, or fix them myself. Yet, even in those cases, it still saved me time.
In other words, the issue isn't the tools, it is the incorrect usage of the tools.
Game dev here. Let me be honest: most UE5 games (I think 90% of them) don't really need UE5's specific features like Lumen and Nanite. The main benefit of those two features is cutting down development time, NOT making the game more optimized. For example, if the game doesn't have very difficult lighting scenarios (dynamic day-night cycles, many moving lights, etc.), you don't need Lumen and could just use baked lightmaps. For Nanite, pre-processing LODs in the 3D authoring tools can get the job done just fine. The result might have some visible LOD pop-in, but you save tons of computing time.
I haven't played this game yet, so I can't say whether their decision on Lumen or Nanite is justified, but for most UE5 games it isn't.
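For anyone curious what "just use baked lightmaps" looks like in practice, these are roughly the UE5 renderer settings involved. The cvar names below are written from memory and the values are a sketch of one possible setup, so verify them against your engine version's documentation before relying on them:

```ini
; DefaultEngine.ini -- sketch of a "no Lumen, baked lighting" configuration
[/Script/Engine.RendererSettings]
r.AllowStaticLighting=True            ; allow lightmap baking
r.DynamicGlobalIlluminationMethod=0   ; 0 = none (use baked GI) instead of Lumen
r.ReflectionMethod=2                  ; screen-space reflections instead of Lumen
```

The trade-off is exactly the one described above: bake times and static lights during development, in exchange for much cheaper lighting at runtime.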
Dev huh? What's worse, crunch or a war?
@@kyero8724 crunch probably
I think the game definitely needs Lumen, because there are a ton of different weather cycles and events that are dynamic, and anomalies sometimes create their own light; it's necessary imo. But Nanite I'm not sure about; I think a lot of the assets would be pretty easy to pre-process.
@@theycallmeklutchkid No, it doesn't need Lumen, it needs skilled developers
@@kyero8724 The game was announced in 2018. They had about 4 years of dev time before Russia invaded. Yes, it's fucked up they had to work through it, but no one would have cared if they had stuck with Unreal Engine 4. Instead, in 2022 they completely reworked the game for Unreal Engine 5, pushing the development into the war.
23:24 this is the sole reason I don’t play UE5 games. I just won’t tolerate it. It’s the absolute worst thing a game can have technical wise.
I agree with everything you said but just wanted to play devil's advocate for a sec:
1) Upscaling really isn't THAT big of a deal at 4K. When I use "quality" DLSS at 4K I can barely tell the difference, though certain games look better than others... I think developers may be looking forward and deciding that over the next 5+ years 4K will become the norm and upscaling won't matter to the masses.
2) Upscaling itself will continue to improve. It might be that eventually we can't even tell the difference unless we're looking for it.
3) Having stock engines allows the developers to focus on what's important: making good gameplay with good stories. I'm a musician; it would be a drag to have to build an amplifier and a guitar before recording a new album.
4) With Intel jumping into the GPU space and beating the 4060 at a cheaper price, maybe we're at the very beginning of GPU prices becoming reasonable again?
I'm not optimistic about any of this, but I'm just trying to stay positive when it's a pretty crappy time to be a PC gamer. I'm just happy I can afford this overpriced PC I have, but I'm a 37-year-old man. It sucks for the younger folks that spending $1000+ on a GPU is just out of the question.
"Upscaling Ain't that bad" -after spending thousands of dollars on a GPU.
The actual truth is that UE5 gives developers a lot of subpar tools that were never supposed to be replacements for the actual thing. For example, Nanite is not a replacement for LODs, yet everyone treats it as one. That's like 50% of the performance hit right there.
This is exactly true. I am a UE game dev; on my channel I did a video testing Nanite performance. It is terrible with anything less than 20 million triangles in the frustum. And projects that large with Nanite enabled won't even start on the most common systems in the Steam hardware survey; UE5 will just crash. I have eight computers of differing hardware, from low end to high end, for testing my game releases on, to make sure they are performant. I will never use Nanite, or design my games around needing it.
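To illustrate what "pre-processing LODs" replaces at runtime: classic discrete LOD selection just picks a pre-built mesh variant based on how big the object appears on screen. A minimal sketch of the idea follows; this is not UE's actual API, and the function and parameter names are made up for illustration:

```cpp
#include <cmath>
#include <vector>

// Classic discrete-LOD selection: choose a pre-built mesh variant from
// projected screen size. minScreenFractions[i] is the minimum fraction of
// screen height the object must cover to use LOD i (LOD 0 = full detail);
// anything smaller falls through to the coarsest LOD.
int PickLod(const std::vector<double>& minScreenFractions,
            double boundingRadius, double distance,
            double fovY, double screenHeightPx) {
    // Approximate projected height (in pixels) of the bounding sphere.
    double projectedPx =
        (boundingRadius / (distance * std::tan(fovY * 0.5))) * screenHeightPx;
    double fraction = projectedPx / screenHeightPx;
    for (int i = 0; i < static_cast<int>(minScreenFractions.size()); ++i) {
        if (fraction >= minScreenFractions[i]) return i;
    }
    return static_cast<int>(minScreenFractions.size());  // coarsest LOD
}
```

The point of the comment above is that this per-object lookup costs almost nothing, whereas Nanite streams and culls clusters of the full-detail mesh every frame, which only pays off at very high triangle counts.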
"Generic Engine 5" is the real name
I've been saying that since the UE3 days; for me, every AAA studio using the engine back in the early 2010s had a samey feeling.
that's kind of the point though, no?
Generic Stutter Engine
When something is so good that it becomes the *_standard_* across the board, then it by default becomes "generic" because everyone is using it since it's so good.
So you want shitty looking games or what? If you want interesting art directions go play Minecraft. 😂
@PotatoeJoe69 I'm not saying it's necessarily bad. But different game engines can help make a game feel more unique. If every game feels like a reskin of the others, what's the fun? I think that can still be accomplished with UE5, but recently quite a few titles give off that "it's UE5" feeling
It feels like developers are no longer bothering to optimize their games when they can just tell gamers to use DLSS/FSR. People forget frame generation was originally for 4K gaming to help give playable frame rates.
And the generated frames don't even match your motion, which gives a sensation of delay or odd responses. Generating frames is not the way to go.
Quick note on Halo at the beginning: the move to UE was something that was a long time coming. Halo's engine was fine, if not a bit esoteric as in-house engines tend to be, but the suite of tools used to build the games (Faber) were mired in unbearably oppressive technical debt. According to Jason Schreier, 343i leadership was seriously considering moving to UE4 during preproduction, but ultimately decided against it in favor of developing Slipspace (an updated version of the engine they'd been using since Halo CE). This massively backfired, as the abysmal state of Faber meant that routine tasks such as recompiling the engine or iterating on content took orders of magnitude more effort to get done than it would have with Unreal. Fixing Faber's countless shortcomings would have basically necessitated rebuilding the engine again, so why not just go with a known proven engine that already had halfway decent dev tools?
I think source 2 is such an underrated engine. New counter strike maps look so good and run so well (I can't even imagine running something that looks similar on my 1050ti laptop in UE5). Not photo realistic like UE5, though they still look amazing, without compromising so much on performance.
Yeah, Source 2 is amazing. But they have the benefit of pre-baking almost all the lighting. Conceptually you could probably tune a UE5 map to look as good and run as well if you spent enough time
Yeah, idk if Valve allows others to use it to make games like they did with Source 1, but I wanna see how far Source 2 can be pushed graphically. So far it's only used in CS2, which needs to be optimized as much as possible because most people still play CS2 on low-end PCs; in s&box, which is aiming to be cartoonish; in Valve's newest game, which is also in a cartoonish art style; and in HL: Alyx, which looked really good, but I feel like it's still not maxed out due to being a VR game.
@@nowaynowaynottoday Ehm... Unreal does prebaking too; most games do that unless they have dynamic weather and time of day
And it's also well optimized (a bit): my shit-ass laptop (no graphics card, just a plain potato laptop) can run CS2 at 55-60 fps, ofc at lowest settings, but still looking beautiful.
Who's underrating Source 2? That's the silliest thing I've read today.
Source 2 is just an updated Source 1. So you're comparing a 20-year-old, fully matured game engine to a completely new engine, quite different from its predecessor UE4, that's also only been out for a little over 2 years now...
Mirrors Edge Catalyst looks better than 80% of new games coming out today. Runs 10x better too. And that game is 9 years old.
and the craziest thing is that mirrors edge 1 looks better than catalyst due to the stylistic choices
Frostbite still got it. Remember BF3, when you could set graphics to low and it would still look decent and run at 100+ framerate?
Alien Isolation and MGSV
both look phenomenal
Back then games made you wonder how they could look that good with average hardware. Now they make you wonder how they run like crap on high end hardware.
GTA V ran on a PS3. We’re not gonna see the same story with GTA VI and PS5.
@@bahshas The style choice being baked lighting. If more games were just content with baked lighting, you wouldn't need a thousand dollars of GPU hardware to get a respectable framerate. Of course the ultimate irony here is that the first Mirror's Edge was an Unreal Engine 3 title.
One thing I don't understand: they don't even look that great
IKR??? They often look like a blurry mess
@@zelcion I'm playing the Metro trilogy. Metro 2033 looks as good as Metro Exodus, but it's less blurry and has better performance. I just can't understand all this tech BS thrown in and bloating everything: we are using TAA, then upscaling, then frame gen, and it makes all the engine power and graphical improvements go directly into the trash can.
@@Wkaelx You can disable all that and run it native with today's GPUs
@@Wkaelx throwed in lmao
@@Wkaelx I agree. Played the Metro series recently too. The jump from 2033 getting 350 fps to Exodus getting between 50-120 fps doesn't appear to be reflected in the graphics. Yeah, they are better, but not that much better.
I’m playing the Mass Effect Legendary edition right now. Mass Effect 2 looks absolutely brilliant with the new textures. Incredibly sharp, and high FPS.
I would much prefer a game made to run well. 5 percent better graphics is just not worth 50 percent less fps.
Optimization is a lost art
No. What kills gaming is rushed developers and developers overusing tech for what it isn't made for.
Lumen isn't a one click button to replace all your lighting and make any game magically look good. It's a tool like anything else. It needs proper knowledge to use properly and optimize.
Same with nanite. These are all helpful tools but not made for you just to slap onto any game and call it a day.
Satisfactory runs on Unreal Engine 5 and uses Lumen. It can render GIANT factories with thousands of objects and still stay over 60 fps.
It's not about the tools. It's about the one using them.
Except the calculations for the factory are actually done CPU-side, not on the GPU, and the game is much less complex when it comes to graphics.
Been saying this since UE4 with darksiders 3
Any game developed on this engine feels like a fanmade project
I've hated this engine since UE3. Most devs don't know how to use it; other than Epic themselves, I've never seen anyone do physics properly with it. It doesn't look that good, the lighting is always weird and fake, and it struggles with bigger maps. Picking it for STALKER was an amateurish decision
@@BostilCensurado I've hated it since UE2 Quake arena (😂 just so we can continue)
@@BostilCensurado Very few UE3 games are nice to play. Killing Floor 2 looks amazing. Sadly the game was mismanaged to the point it now weighs 90 gigs with all the cosmetics, but hey, for 2016 it was incredible.
Guilty Gear:
Bro, I've always felt like it was made by an indie team, especially with everything rendering right in front of the player.
Metro Exodus came out in 2019; it still looks amazing and runs amazing. Sometimes STALKER 2 looks worse. It's just sad
Not on my PC. I really wanna know what everyone that's complaining is running. Then they wanna cry that they shouldn't have to upgrade their PC, lol. The entitlement of people nowadays is off the charts. Don't get me wrong, I deal with the memory leaks occasionally as well, but I believe that's fixable in time.
@@Terp-daze I have an RTX 3050, i5-10400 and 16 (8x2) GB 3600 MHz DDR4 RAM.
It's not the best, but as a 3D artist it meets all my needs. It also plays most games well at 1080p high.
The main issue is the developers don't care about optimization at this point, and UE5 is an unstable engine. I can run an empty scene at 120fps on the Epic preset, and then I add a cube and boom, 90fps. Like, what? The engine itself is tanking fps because of its lighting model and TAA/TSR implementation.
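To see how much of that baseline cost comes from the anti-aliasing/upscaling path mentioned above, UE5 exposes it through console variables. The names and enum values below are written from memory as a rough sketch, so verify them against your engine version before quoting numbers:

```
r.AntiAliasingMethod 0   ; 0=None, 1=FXAA, 2=TAA, 3=MSAA, 4=TSR
r.ScreenPercentage 100   ; render at native resolution, no upscaling
stat unit                ; watch frame/game/GPU times as you toggle
```

Toggling TSR off in an otherwise empty scene makes it easy to separate the engine's fixed per-frame cost from the cost of your actual content.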
They should've used the Exodus engine for Stalker; it would've been perfect. UE doesn't even get rid of its plastic look for how much performance it chugs to look realistic.
My thoughts exactly. Imagine if Metro had been built on UE: how would it look, AND how would it perform? So glad that studio stuck with the engine they made the previous 2 games with almost a decade ago, and still managed to change the look and crank up the graphics to an amazing level. You can see how much more technically advanced Metro Exodus is than the previous 2. I certainly hope they stick with the 4A Engine for their next title as well
Metro 2033 OG from 2010 looks better. I made a video on my channel showcasing the lighting differences.
As someone who studies Game Design and Development at university: we are being taught how to use and develop for UE5, and the consensus is that most devs/students like using it because of how friendly the visual scripting is and how easy it is to add high-detail models and textures. We also learn Unity 6000 and some small pieces of Godot. I personally don't like developing for UE5, just because I find programming in C# for Unity easier, and I don't need a high-end GPU like I do for UE5. Plus, when developing we use on-demand texture streaming in the engine, which slows the process down so much
I'm getting to the point where I pre-select "ignore" on Unreal 5 games.
unreal is just an insult to the amazing hardware we have nowadays
I think this is being pushed by asset managers like BR and VG to push people to upgrade and make money on their other computer-hardware investments. This kind of hidden monopoly is not good for consumers.
@@oren2234 true
@@ПавелВолков-э9и Don't blame the drug dealer, blame the addict! Enabling stupid behaviour is equally responsible and ultimately, without the enabler there is no situation to begin with.
@@ПавелВолков-э9и Well, when there's no example of anyone using the tool properly, then maybe the tool is at fault. Every single UE5 game runs badly, every UE5 game is very CPU-heavy, and in every UE5 game upscaling is almost always a requirement. Hell, there exists no CPU on the market strong enough for the engine to not have problems.
Make low graphics great again.
Imagine if the same mentality behind photorealistic games was applied to any sport, requiring players to wear Gucci kits at a minimum just to attend.
Zero interest for bad graphics.
Might as well quit gaming if anything I play looks half-indie, half-experimental and full vomit.
just play indie games then
Chasing graphics sounds exhausting to me. I like nice looking games too but I just did a Half Life run recently and had a good time.
Graphics alone are irrelevant; the aesthetic is the lead direction when someone thinks in terms of "bad" or "good" graphics. Graphics without aesthetic intention are irrelevant, and worse than puke experiments is a soulless image
@@cezarstefanseghjucan Except graphics are getting worse and performance is getting worse. All while playing Stalker 2 I kept thinking, "I've seen better games that performed better, what is this shit..."
21:24 Alan Wake 2 uses Remedy's Northlight engine, not Unreal Engine 5
He was using it to make a point about photorealism being a selling point I think. Not what engine it uses.
I respect the shit out of developers who make their own engines still.
@@MelGibs-v7j$$$$$$$$$
@@MelGibs-v7j Now if they would respect their IP and not race swap people for DEI money. They still haven't broke even on sales.
DLSS should be a backup not a focused tool.
The Red Engine isn't gonna be used anymore, sadly.
The studio just got sick of band-aiding it together.
@@gerarderloper BS. They have lost their best developers, so they can only rely on the current generation of so-called "devs".
Back in the PS1 days, devs were really crafty at squeezing every bit of power from the PS1 console. Imagine taking those guys and giving them access to today's resources and tech.
The best-looking PS1 games were probably at the end of its lifespan; WipEout 3 is an example of a beautiful-looking PS1 game imo
Yeah, those are the guys who work on unreal and unity and all the other proprietary game engines.
I believe the major thing is that a huge number of developers today don't understand how computers work.
I'm of the generation of developers who wrote their own interrupt handlers and multitasking schedulers in assembler on 8-bit machines, and I can tell you that modern programming frameworks are more complex than understanding how a computer works at the CPU/GPU level. It feels like junior developers just don't understand what causes performance losses.
Besides, it's not only about knowing how computers work. It's also about doing as little as possible, saving resources for the stuff that matters. Of course some things can be added when they take practically no resources.
However, it's not clear to many developers, or gamers, what is "enough". Optimization is not targeting game content at 4K resolution, and not all fancy features should be on all the time. I think we could have better metrics for that. One really universal metric seems to be polygon size.
They are over 50 years old so won't get hired. Instead we get Josh who's 20 and learnt shit coding from Brackeys.
@@St4rdog
In my opinion, it takes at least 20 years to learn both the high-level and low-level stuff well enough. So learning should start in childhood, and at 30+ it's possible to be a good enough developer.
That means people who start learning in their twenties are in a bad position if they want to be hired, because AI will make junior-level coding useless. That means a 10-year career that starts at 40. Or, option 2: be an entrepreneur.
Two things:
1) It's not really proper to blame the engine for performance. Optimization is up to the devs, and UE5 will run insanely well if you have an actual performance specialist on staff to optimize the game.
2) The real reason AAA likes UE is that they don't need to hire actual software engineers, thanks to the Blueprint system, which can be taught to a high-school graduate in a couple weeks of training. And the real reason these AAA UE5 games suck is exactly because they hired cheaper labor in order to put more money into marketing for day-1 sales. The unfortunate reality is that more marketing gets more sales than a better game
You got it. UE5 allows for cheaper and more interchangeable labor. The more games use UE5, the more devs will know UE5, which makes it easy to scale the workforce up and down on a whim.
UE5 removes the skill needed to create a videogame.
100% people are blaming the engine for a developer error. the engine is capable of beautiful, performant games at a scale no previous engine has ever come close to.
No doubt, the engine has huge optimization and tuning capabilities. However, it's up to the developers to actually use and fine-tune them. And optimization and QA are among the first things that get cut when the deadlines start hitting. Remember: Minimum Viable Product
Blame the devs not the engine. Games are optimized and perform well depending on the work and effort that goes into them.
I do not want to watch another video of video essayists on youtube pretending they understand everything they talk about.
@@mikec2845 No it is not. The technology that makes its "beautiful games" is also heavily taxing on any hardware. I do not want ultra-realistic games that slowly crawl at 540p scaled to full HD at 30fps; I want fast games that run at 60 without scaling.
Fr, I had a joke back in the UE4 days, and sadly it's still accurate:
if Titanfall 2 had been made with UE instead of Source, it would be 2-3x more demanding on requirements but have the same graphics.
As an Unreal Engine user of more than 5 years, I think more projects utilizing Unreal 5 will also push Epic harder to polish their engine so it adapts to all kinds of scenarios. I can understand people worrying about it taking over. But to be honest, Unity still holds a larger market share.
To be fair, the 4060 8GB has less performance than the nearly 8-year-old 1080 Ti.
i still use a 1050 Ti and i can run 80% of games that come out nowadays, and the top 20% are almost always dumbed-down tech demos that are worthless to play. wtf am i going to play Stalker 2 for? they did nothing new, it's just "game you have nostalgia about 2". just as the hardware has improved almost nothing in a decade, so have the AAA games.
@@MrRafagigapr That's hard cope to justify sticking with a 4GB VRAM card
@@MrRafagigapr I agree. I can go play modded STALKER 1 and enjoy it instead of this mess!
The 1080 Ti was an aberration of a card, way stronger than most xx80 Tis.
@@anoshiki11231 that's a hard cope from you bud, if their pc works for them, let them use it.
Amazing graphics don't count for much when every other game runs like a PowerPoint presentation even on the highest-end GPUs.
Hot? Take: Not everything needs a day night cycle. Having several static ToD with high quality baked lighting is usually fine.
We're getting so lost in "muh graphics" and deadlines that we're forgetting about great optimization. And gameplay, of course.
GTA V had mirrors you can see yourself in. 10 years ago. Without RT.
eh, Stalker has always been about things such as the day-night cycle.
Duke Nukem 3d had mirrors like that, in 1997!
Yeah, this take would be valid if only the video weren't about Stalker. Also, i think i can count on my fingers the relatively popular games i've played that have a day and night cycle.
Even Ocarina of Time had a day night cycle. That’s not the problem.
Watch Crowbcat's new video; Dead Rising could do it on the Wii.
Once you notice the ghosting and blurriness, it is maddening.
The Unreal Engine and its consequences have been a disaster for the human race.
Humanity was already fucked before unreal engine dude
You don't get it @@hipjoeroflmto4764
Watching this makes me even more excited for Source 2 and how well Valve handles tons of physics stuff, not just superficial graphics like Unreal.
UE5 has chaos physics, your comment is irrelevant
@@brushrunner barely any game uses it and it runs horribly, comment irrelevant
Source 2 has been out for years and is incredibly outdated. You're excited for an engine that literally came out 10 years ago and looks like it. Also, ask any CS player how Source 2 is going. Lots of performance issues.
@@asmilingguppy2212 Still, the performance is way better on Source 2, and besides, Source 2 was and is being refined over the years. Do you think they made Unreal the day it was announced?
@@ksavx2119 Unreal has been around for years, but there have been major shifts and updates which have not occurred in the Valve ecosystem. Either way, my point is that no, Source 2 performance can be awful. There are tons of issues with CS2 and shaders. It isn't the engine, it's how you use it. And I suggest you look more into Source 2 before making claims about the engine.
All the companies switching to UE5 will probably actually hurt devs. It'll be so much easier to fire people if everyone in the industry just runs the same set of tools.
You hit the nail on the head. This is exactly what's going on. The big gaming industry has already been desperately trying to find ways to get rid of as much staff as possible. It's no conspiracy theory at this point, it's a commonly known fact, unless you've been living under a rock or are delusional.
yeah, back then if you worked for CDPR, for example, you most likely wouldn't ever be fired, because teaching a new dev to use the RED Engine was expensive. Now, if you need a better developer, you just hire another one and he already knows how to use your generic game engine.
Well, thanks for another fine piece of work here, Vex. Special thanks for the small details you pay attention to.
And, of course, for bringing up the S.T.A.L.K.E.R. 2 topic so in-depth; a great piece of labor you've put into it. I do appreciate it. It's a special thing today for me, for us here, really. GSC have their faults throughout their history, yet they are our boys, as are 4A Games, founded by people who left Serhiy Grygorovych's GSC in 2009-2010 (just after Call of Pripyat, the last of the original S.T.A.L.K.E.R. trilogy, was released).
Greetings from a fellow gamer, myself, since 1996, from Kyiv.
Absolutely agree with what you've said about UE5. I'd like to believe the games using it are going to improve over the engine's life cycle, as was the case with previous Unreal Engine iterations. It's some kind of tradition already: the UE3, UE4, and UE5 demos blow minds entirely when compared to contemporary games. It's like with consoles, like every PlayStation since the very first: devs usually only get a real hold of an engine and squeeze the most juice out of the system on the brink of the next-gen launch. I mean, UE5 is still a young engine, with immense potential.
I'm a factory worker and my rig is quite limited by its budget. I've got an i7 4790K at 4.6 GHz, 32GB DDR3 RAM, and I've just swapped a 1060 6GB for a 1080 Ti, just for this very game. I see now why they've changed the official system requirements to a 3070 Ti.
At 1680x1050 (my display's max resolution in 2k24, not a joke, still much better than my EPS/XPS cutter machine at work), all settings on Epic (max), with no upscaling, the MSI Gaming X 1080 Ti gets about 35-45 fps; the ASUS Dual 1060 6GB OC got like 17-20 fps, quite predictably. Actually, the S.T.A.L.K.E.R. series was never famous for fine optimization. Especially the second one, Clear Sky. That was the first time (2008) I met an option described as ray tracing: sun light rays then, just fully dynamic shadows from a moving sun, but it took such a heavy toll on those days' systems that no GPU was capable of it, not even the dual-GPU GTX 295. Ridiculous, but that 2008 game at max settings gets about 50 fps on a 2013 GTX 780, about 60 fps on a 2016 GTX 1060 6GB, and about 100-110 on a 2017 1080 Ti. 1680x1050, still.
You know, I remember well my first experience with Cyberpunk 2077 in its first days, almost 4 years ago now, with my GTX 780 on board (the then-minimal system requirement), and the experience wasn't smooth, for sure. Now it's the very same feeling. It will get better, if GSC cares for its child long enough.
BTW, if you guys remember, there were serious problems with the performance of Cyberpunk on PS4. Well, take a look at Steam Deck gameplay of S.T.A.L.K.E.R. 2... =)
I'm more concerned about the lack of alternatives to Unreal Engine; far too many game studios use it nowadays. I'd rather CDPR had stuck with REDengine after doing so much work on it for Cyberpunk and kept evolving it.
Not to mention that UE5 seems very immature, and although it has some good features, it also has some serious performance issues in real-time rendering, which upscaling is being used to mask in games.
I get the feeling that UE5 is being targeted at arch-vis and other offline uses, so the real-time performance issues are being ignored.
Or maybe, just maybe, the engine has a ton of features, devs don't know how to budget well, and they want to use all of the features when we're not there yet in terms of hardware, and then they can't optimize either. And all because using those features markets better. The best example is the hair in Stalker, imo. Completely unnecessary; they could at least have used fallback hair cards, but now the game tanks in towns due to the hair settings. That's a frame-budget issue, not the engine's fault.
We have a ton of engines on the market already, just not free ones. Unity shot themselves in the foot, and Godot still isn't there feature-wise.
Another thing to consider is the nature of UE5: having everything by default attracts every lazy dev out there. That's why there's this misconception that Godot and Unity have more creative games: people who work in those engines can't afford to be lazy with out-of-the-box solutions.
Capcom will sadly never license out the RE Engine despite it being quite the powerhouse, especially now that DD2 has served as a guinea pig for its open-world capabilities. It shows slight aging, mainly with questionable hair physics, but they have been chipping away at its uncanny valley.
The heavy reliance on TAA and DLSS in games nowadays is unacceptable. And even then, unless you use global illumination (a big hit on performance), the shadows in these games are worse than in The Witcher 2 (Enhanced), for example, which is over 12 years old by now. It's pathetic. Every time i hear a game is moving to Unreal 5, or whatever the new one is, i become concerned; it has become a red flag. We have some rough years ahead of us.
NGL, the lighting is even worse than in F.E.A.R., which released in 2005.
what do you mean, "some rough years ahead of us"? It's looking like devs are getting into UE5 more and more. Expect innumerable rough years ahead of us.
@@alphawolf60 Got me in the first half, ngl. And yeah, probably, i'll just stick to indies and 10+ year old games. Not that the AAA industry has put out anything of value lately.
@@alphawolf60 only thing I can say is that if more people use it and master it, hopefully that means better optimization.
I've been telling my friends that games don't just run well without upscaling anymore. I mean, what the hell happened? Since some time after RDR2, nothing has run as well at native resolution (or DLAA, as they call it now). Upscaling and frame generation have let devs get away with unoptimized crap.
Assassin's Creed Valhalla gets 90ish average fps on my 2019 PC (Ryzen 2700X and RX 5700) without upscaling or frame generation, so don't say that after RDR2 nothing runs as well at native resolution, cause it's objectively false. It started happening when upscalers and frame generation were introduced. Now frame generation has become mandatory cause devs got lazy with optimization. Monster Hunter Wilds literally says to enable upscaling and frame generation on the recommended specs page. This is the future.
@@SapiaNt0mata The future is me not spending money on unoptimized crap that's no better than products from 10 years ago.
I have an i9 14900K and an RTX 4070 Super. 180-190 FPS on high graphics settings.
@@SapiaNt0mata Assassin's Creed is just the same game every 2-3 years, and they have their own game engine, so it would be a shame if Ubisoft games had bad performance. And all Ubisoft games feel the same.
What people forget is that games like Arkham Knight achieved near-photorealistic graphics, crazy optimization, and a seamless open world IN UE3.
Even Rockstar did it and FAILED!
With the GTA Definitive Edition.
Well, they did outsource the project to a garbage company full of incompetent hacks. Still, that doesn't excuse the lack of respect, not only for their own franchise, the one that made them this big to begin with, but also for the customers and fans!
@@daktaklakpak5059But God has helped me to this very day; so I stand here and testify to small and great alike. I am saying nothing beyond what the prophets and Moses said would happen- that the Messiah would suffer and, as the first to rise from the dead, would bring the message of light to his own people and to the Gentiles.
Acts 26:22-23
@@konrad1916ok
it's actually a miracle. They made an engine that emulates another engine (UE emulating RenderWare). That's something, even if it has bugs.
The AI of the monsters is so janky. Literally all monsters follow the same attack pattern: run straight towards you, hit, run back, turn, run straight towards you, hit, etc. The pigs, the pseudodogs, the flesh, even the bloodsuckers and poltergeists do exactly the same. Hit and run, repeat.
That's the definition of well-designed AI. In fact, they even know how to avoid your line of sight effectively, so you can't abuse the surroundings all that much. Mutants will hide behind objects when they can't reach you.
@@MrZlocktar Yes of course, just like a real boar encounter, they're known to hit people, run back, run sideways and run again towards you multiple times, truly genius. Of course when I see a bloodsucker I think "this looks like the kind of creature that hits you and then starts running away in circles like sonic"
@@MrZlocktar But you know animals don't do this, right? If it's a fake attack to make you back down, they will run at you and then retreat. But if it is serious, a boar will knock you down and rip into you with its tusks, probably impaling you. Bloodsuckers are even more retarded to do hit and runs. It's lazy AI scripting, which hopefully will improve in the future.
@metalema6 In real life you don't have a health bar, smartass. You know what that means? Play it on Veteran and try to fuck around with boars. Go on, champ. You'll learn really fast how the IRL scenario would go. And Stalker somehow depicts it rather accurately. Spoiler: they won't hit and run; you'll be crippled by the first hit and then die of bleeding.
Probably copy/pasted the old "AI" script. They did the same back then if I remember correctly.
I think the problem is that these large game studios test their games on state of the art gaming PCs instead of using PCs with the lower levels of performance that the average consumer would have.
I don't know if game developers have the same problem as software developers, but at the company i work at, the software devs do not seem to grasp that companies will not put our software on a good new machine.
They're instead waaaay more likely to put it on the dusty computer in the corner, with a processor released back when i was still in college.
Telling them that they need to make sure it works on old equipment really doesn't go anywhere.
@@tweda4damn I’m sorry to hear that bro. Shit sucks when the company overlords just can’t be wrong
" wait the game is unoptimized ? just turn on scaling ! " - game developers
I still don't understand why people give so much of a shit about graphics in games.
I care about performance much more than graphics especially in FPS games. Doom Eternal looked good and pushed almost 200 fps on my system which enhanced the gameplay. This game feels like a laggy mess in comparison.
Exactly. Original 2007 Stalker holds up with its old graphics, just fine. Story and gameplay above all.
Personally, I don't believe that game graphics need to be better than they were in about 2005. At that point, there was plenty of detail available to make everything easily identifiable. But it wasn't covered in so much unnecessary detail and special effects that you couldn't see anything. They focused on the big details that players would actually notice while playing, instead of the tiny details that only show up in perfectly staged screenshots.
There's a reason that the retro look is so common in indie games. Its easier to make, easier to run, and with a bit of skill, can still look really nice. And its why I find games like Ultrakill or Animal Well more appealing looking than any new "AAA" game like CoD or Horizon.
Sure, there are some cases where having tons of detail is good. Like if making your game perfectly photo realistic is an integral part of the story telling. But the vast majority of games don't need to be as detailed as they are these days. No one cares if each individual eyelash on every single NPC is modeled in so much detail that you can count the dust mites on them. Because unless you're going to spend all your time staring at a wall counting the individual pixels, you're never, EVER, going to see that detail. Instead, you're going to be sprinting past all those hundreds of gigabytes of 4k textures and hyper detailed GPU choking shaders, wishing you could be fighting monsters at more than 30 FPS.
Because of that, most players are just going to end up disabling most of that detail anyway, by turning down the graphics options. Which usually makes the game look WORSE than it would have 20 years ago, while still running worse, too. Which is just plain stupid.
Not everyone wants to play an ugly-ass-looking game; amazing how that works in this day and age. I can agree with it to an extent, depending on the style. I prioritize gameplay more, but a good-looking game is just as important alongside it.
@@JDumbz Different strokes, I guess. Gameplay matters much more to me; like, fr, I'd unironically play a game that's pixelated to all hell if the gameplay elements are fire.
Remember when developers used to invest time, energy, resources, money and great talent into developing their own engine? Because I do. And gaming was way better then.
And lmao, here come the lies. "And gaming was way better then": I loved it when GTA 4 stuttered like hell.
"Remember when developers used to invest time, energy, resources, money and great talent into developing their own engine": yeah, like Creation, used in Starfield, which has no CPU performance issues, right?
"Because I do": no, you clearly don't, lol.
@@DragonOfTheMortalKombat Before you get all emotional, try to take a deep breath, then comment. There's no denying the industry has been going downhill since the big cough. When I was a kid, you bought a game and it worked when you brought it home to play. No patches, for better or worse; the games were more soulful and complete.
The problem is that people are demanding more out of games. Back when games didn't need realistic physics, advanced lighting, particle systems, skeletal animation with inverse kinematic support, landscape generation capabilities, material editing tools, dynamic global illumination, animation and sequence editing tools, and countless other modern features, it WAS possible for studios to make their own engines - although they often just licensed another engine and modified it, but that's another story.
Now, though, people expect so much more from games than a ten to twenty hour single player experience with minimal variation and customization. All of those features take time to develop, and frankly it's a waste of time to reinvent the wheel. Why bother rebuilding an entity handling system when someone else has already done it? Or a material painting system, or a physics engine, or any of the other features we take for granted in modern titles.
So developers have to pick: Focus on the parts of the game that make it a unique creative endeavor, or spend time and energy redoing two years of what Epic, iD, or Valve have already done and polished.
Unfortunately, there's a bit of a trap there: Because Epic works hard to make a low barrier to entry (via blueprinting, mostly), it's very easy to sit down with UE5 and have a world you can run around in in only a few hours. However, it's not very scalable, and the engine tools aren't good enough to do optimization work without the user understanding how things work under the hood. If you make a complex set of AI rules for the NPCs in your game, and manage them all via blueprints on the render thread, your game performance is going to blow chunks. This is an obvious gotcha, but there are a *lot* of nonobvious ones - including the entire blueprint system itself - and as such it's very easy to make a very poorly optimized title in UE5.
The right way to do a UE5 game is to use blueprints for prototyping, then get people who actually know the engine to build the systems you've prototyped in blueprints again using C++, making appropriate optimizations along the way. This takes money, time, and experience, though, so it's not always done... and as a result you have a ton of poorly optimized UE5 games on the market.
@@DragonOfTheMortalKombat what a braindead corporation shill!
@@eagleone5456 There's truth in this but when triple-A games come out and they don't look photorealistic enough, the graphics aren't good enough (for example the facial animations of Starfield), the games get destroyed online. The fact is that people demand better looking graphics, animations, lighting whatever which ends up taking much more resources to make and which puts much more load on the CPU and GPU.
For me, UE5 graphics are so so good in still images, but are COMPLETELY destroyed when moving. Shimmering in finer details such as in the grass and shadows, ghosting, blurring caused by TAA creating a motion blur effect basically, reflections that are super weird when you move, etc. This kills image clarity and does not justify the performance drop for me AT ALL. That's the biggest problem I have with UE5. I just don't agree with people thinking that it "looks better" than old games so "it's normal that it runs like shit". I only agree with the fact that UE5 is making it super easy for developers to create games, which is a good thing, but IMHO we just are not there yet in terms of engine optimization and hardware. I'm tired of every UE5 game being the new "Crysis 3".
Funnily enough there's an actual subreddit dedicated to hating on TAA
Tbf, i excuse Stalker 2's jank because they had a literal war going on. One of the devs was KIA. No other company has any excuse.
But the problem isn't exactly the engine. It's the devs fault. They think that if they use an already established engine, it won't need optimization. It's wrong. The engine is a "tool", you still need to do your stuff.
When you make your own engine you do the optimization yourself because you're making the engine. You can change a lot of things in UE5, and of course, create it yourself. I've lost count of the amount of games that I tweaked to get better optimization and image quality via the ".ini" files of these games. If we, gamers, can do it, they also can.
UE5 is amazing they just need to use it properly, I think they use UE5 to cut corners. Like "I won't be wasting that much time because I already have this much done".
"I've lost count of the amount of games that I tweaked to get better optimization and image quality via the ".ini" files of these games."
Same. I've even used hex editors to alter the code of certain games to allow for things like proper ultrawide support. And don't get me started on how many .ini edits I've had to make in games like Fallout 4. Which begs the question: Why tf do we as players have to do so much of the optimization work on behalf of the devs? It's ridiculous.
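For reference, player-side tweaks like the ones described above usually go into a game's `Engine.ini` override. A hedged sketch: these are real UE console variables, but the exact file path differs per game, and which cvars a given title actually honors varies by engine version and developer configuration:

```ini
; Typical location (varies per game):
; %LOCALAPPDATA%\<GameName>\Saved\Config\Windows\Engine.ini
[SystemSettings]
r.AntiAliasingMethod=0        ; UE5: 0=none, 1=FXAA, 2=TAA, 4=TSR
r.Tonemapper.Sharpen=0.8      ; counteract TAA/TSR softness
r.MotionBlurQuality=0         ; disable motion blur
r.SceneColorFringeQuality=0   ; disable chromatic aberration
```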
You must be a Putin simp
"But the problem isn't exactly the engine. It's the devs fault. They think that if they use an already established engine, it won't need optimization."
This probably does apply to quite a few developers out there, but with regards to GSC and STALKER 2 this is blatantly false. The game essentially uses a custom fork of UE 5.1, because the engine, available as it is from Epic, didn't entirely fit the developers' requirements.
This is true, HOWEVER, we should expect to see top tier optimization and amazing performance in games that Epic makes themselves. This is not the case with Fortnite. Occam's razor says UE5 is AIDS.
Release now fix later is the new trend, maybe in 3 years this game will run better lol.
The real PROBLEM is that there are not enough programmers, and companies do NOT want to pay for them. They want to lower development costs, so all they need to do is hire artists, who can be outsourced.
Ive been saying its corporatism thats ruining gaming. The big suits come in and ruin the original vision of the game or make cuts to it that take away from the game or just want them to rush it out unfinished bc of money and deadlines.
@@Terp-daze Also too much woke nonsense in western gaming studios.
Yeah. OpenGL and DirectX / Direct3D were made by a bunch of geniuses decades ago.
Who do we have making rendering pipelines today? I mean, the GPU companies can produce some impressive visuals, but the real-time optimization and the mountain of artifacting problems need some work.
The funniest thing is that developers of the original Stalker went to 4A Games and created the 4A Engine. And Metro Exodus, which was released in 2019, looks maybe a little worse than Stalker 2, but it also weighs a third of what Stalker 2 does on Unreal Engine 5. 😅
Metro Exodus looks worse?
That's a negative. Metro Exodus has, to me, the best graphics-to-performance ratio. Even on low settings it still looks so good. Good luck finding UE5 games that look good without basically turning them into polygon games on low settings.
@liquidsunshine697 no
@@liquidsunshine697 that's what the guy said...
Even the Enhanced Edition, which is RT only, runs 60fps on a 2060!
1080p 60fps
A $1000+ GPU and a $600 CPU to run a game with worse graphics than we had almost a decade ago is robbery and a scam.
I've got a 3060Ti (Rog Strix) and a Ryzen 7 5800x3D. I have to run this game on low to get it to consistently playable. It is really annoying.
I have a 1060 and an i5 and it's consistent for me, i think your problem might be your definition of playable, not your hardware and not the game
until you run out of vram, then it's just stutters until you restart the game
@@nothingbutlove4886 any time it has fits just pausing until the menu is smooth again solves it
@@wyattrobertson7760 "solves it". Nice experience, money well spent.
@@wyattrobertson7760 lol the true andrei tarkovsky sub-24fps cinematic experience
I have said this before and I will say it again, upcoming games made using Rockstar's Rage and Naughty Dog's game engines will embarrass UE once again like they have done over the last decade.
Damn, I've had no news of upcoming games using Decima or CryEngine. Are they still being updated?
Let's add Capcom's RE Engine, the id Tech engine, and the 4A Engine (which does path tracing on consoles at 60fps and looks fantastic) to the list as well. There are probably more.
this is a shit take. I can build you a duke nukem 3d level that runs like shit
@@SL4PSH0CK KCD 2 has Cryengine 5.0 and it looks better than anything else on the market right now.
Sure, but ND and Rockstar would excel graphically even if they had to use UE5.
What is the point of making such photorealistic graphics when most hardware is just struggling to run it?
Call me old-fashioned, but I refuse to use any kind of upscaling / frame generation because I think it's fraud. If engines rely so heavily on this tech, why should I buy a new GPU that effectively performs no better than one from ten years ago?
You know what's funny? The actual system that powered their previous games' life simulation, A-Life, is not even in Stalker 2.
Further investigation was done. It's actually in the code; it's just not enabled. People assume it's due to performance impacts.
Actually, A-Life 2.0 is in the game, but it's really buggy and some things are disabled. Devs of the game and of the GAMMA mod confirm it.
@@Legalizeasbestos That's the strangest thing about it; I highly doubt it's due to performance. In the original trilogy, the majority of A-Life's brute force was handled through .txt configs comparing the player's coordinates with those of certain spawns. It really didn't put any extra stress on the hardware. Quite strange that it's disabled.
@ I also wonder why this is a “complex” thing to simulate. Seems super easy
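The coordinate-comparison idea described above really is simple at its core. A toy sketch (the names, positions, and radius value are made up for illustration, not GSC's actual implementation): squads near the player get full "online" simulation, everyone else is simulated abstractly.

```python
import math

# Assumed cutoff: squads within this distance of the player are fully simulated.
ONLINE_RADIUS = 150.0

def classify_squads(player_pos, squads, radius=ONLINE_RADIUS):
    """Split squads into 'online' (fully simulated in-world) and
    'offline' (cheap abstract simulation) by distance to the player."""
    online, offline = [], []
    for name, pos in squads.items():
        dist = math.dist(player_pos, pos)  # Euclidean distance
        (online if dist <= radius else offline).append(name)
    return online, offline

# Hypothetical squad positions (x, y) in world coordinates.
squads = {
    "bandit_patrol": (100.0, 50.0),
    "duty_squad":    (900.0, 400.0),
    "boar_pack":     (120.0, 80.0),
}
online, offline = classify_squads((90.0, 60.0), squads)
print(online)   # close enough to spawn in-world
print(offline)  # simulated abstractly
```

The per-frame cost is one distance check per squad, which is negligible; the expensive part in a real game is everything that happens once a squad goes online (pathfinding, combat AI, animation), not the classification itself.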
Maybe in 2 years they will add it lol.
Since 2007, when Unreal Tournament III came out, I got that feeling, like some games just look wrong - grey and blurry. I much preferred the cartoonish visuals of UT2004.
With how awful Unreal Engine 4 and 5 are, it makes me almost feel spoiled by Unreal Engine 3. It had awful lighting problems and blurry, melty texture issues, but at least the actual anti-aliasing worked; now we don't even get that.
I think this is a topic that needs more attention. There are so many reasons why one engine basically having a monopoly is bad for games. One that I don't think many realize is how easy it makes it for big companies to lay off full-time employees. They used to have to train people on their own engine, but now they can just outsource work or hire contract workers during crunch periods and then scale down to a small team when there's less work. Unreal is also one of the biggest reasons dev time has increased. Every dev is in an arms race trying to make their game look better than the last Unreal Engine release to stand out. It used to be that every engine had different strengths and studios could make their games stand out in more creative ways, but that's not the case when they're all using the same tool.
Using a puddle with rain drops falling on it as the example of water appearing to move when it shouldn’t was a pretty silly decision