they're really very good games, i recommend at least playing Just Cause 2 from the whole list, it's a game that can't be missed and shows off Square Enix's graphical power in 2011. once again it has 2 more graphics options for nvidia cards, which are bokeh lighting and water physics/simulation
The shader comp is insane in games like Fortnite, which on a fresh install means a few games that are literally a slideshow going from 144 FPS to 20 constantly until you play a few matches.
Omg is this why my Fortnite stutters? I never play but when I do get on I get constant stutters despite having a good pc and no stutters in any other game
@@wanshurst2416 If you've been playing for a while it's not an issue. If you do a cold boot it will stutter while it builds shaders. I tested how bad it can be with a 10900K/4090 PC vs a 7800X3D/4090, and while the AMD CPU was noticeably better in frametime and avg fps, it's still bad for the first 5+ games on a fresh install.
I will take frames and no stutter over a window reflection or a better looking brick wall every single time. It's really WILD that they are just now working on multi-core performance improvements considering how long multi-core CPUs have been around. It's also still wild to me that we are so obsessed with pushing for technologies that can't even run well at the standard resolutions we use now (4K is the TV standard, 1440p the PC standard) without upscaling, and even then you need top tier equipment for an experience plagued by stuttering, all to look at a window reflection or stare at a brick wall to see how the sun reflects. I'm not saying we shouldn't be pushing for these new plateaus, but c'mon guys, let's build on solid foundations here and not sand.
Not to mention you need to be well versed in overclocking. This was well said tho. I’m one of the few that will build a god rig just to run it at 1080p since I care about input lag
In my opinion, the reason they want raytracing to work so badly is because of how much time it saves in development. Without raytracing, lighting needs to be baked and requires lots of trial and error to get scene lighting to look right/good. With raytracing, the engine and GPU do almost all of it for you in real-time. It reduces dev time and saves money.
4:05 good performance gains, but what seems like massive downgrade in indirect lighting. There's flickering everywhere under that bridge, where it was fine in 5.0
I'm an audio enthusiast much more than a graphics enthusiast, and I'm always afraid technology will forget us. Especially with so many people today using smart phones and portable consoles with headsets (or heaven forbid without them), and even home theaters moving towards "sound bars" and all sorts of other magic gimmicks. Which, on the other hand, I guess makes it easy to understand why it might be frustrating to develop audio for games. You'll spend your time carefully capturing all the foley sounds, recording crickets at night, composing and conducting symphony orchestras, mastering everything to a T, figuring out how to place everything correctly in a surround or atmos system etc etc., meanwhile knowing a large portion of your players will be playing those carefully crafted soundscapes through a "surround" bar made from soap box plastic in a giant, barren concrete room.
Since there are still several years until the release of these games (with further improvements to the game engine coming out in the meantime), and also because I trust in CDPR's abilities, I remain quite confident.
@@Z3t487 There's no guarantee that those developers are updating the engine in line with their game code; they usually pick one version of the engine and build from there. If they try to update the engine during development, many of the game's source files will break and it will be exceedingly hard to pinpoint bugs. This is why Unreal Engine 4 games are still being released: it took those projects that long to complete.
I saw a video recently -- "Optimizing my Game so it Runs on a Potato" by @Blargis3d. He's making an indie game and was having the same compilation stutter problem. He solved it in kind of a genius way. When the game starts, before every level, he has a black loading screen. Thing is --- it's a trick. What he's *actually* doing behind that black screen, is playing the game at 10 times speed, walking in every room, loading every texture and killing every enemy. That way, all of the hitching and loading that has to happen, happens during that period. Then, when the 'loading' completes and the player plays the level, it's actually the *second* time that all of those assets are loaded. Thus -- complete elimination of compilation stutter. This was the first time I ever heard of this (I didn't even know such a thing was possible) and thought it was really cool. Thus, sharing it here. :)
The game is Bloodthief by @Blargis3d. The shader compilation trick is in the video "Optimizing my Game so it Runs on a Potato" and is indeed very cool!
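For anyone curious, the trick can be sketched in miniature (Python stand-in, every name here is hypothetical): one invisible pass touches every room so each shader compile happens behind the loading screen, and the player's real run is then all cache hits.

```python
# Sketch of the "warm-up behind a loading screen" trick (all names hypothetical).
# The warm-up pass touches every room at high speed so every shader gets
# compiled before the player sees anything; the real run then hits the cache.

class ShaderCache:
    def __init__(self):
        self.compiled = set()
        self.compile_calls = 0

    def use(self, shader):
        if shader not in self.compiled:  # cache miss: expensive compile (a hitch)
            self.compile_calls += 1
            self.compiled.add(shader)

def render_room(cache, room):
    for shader in room["shaders"]:
        cache.use(shader)

def warm_up(cache, level):
    # "Play" the whole level invisibly behind a black loading screen.
    for room in level:
        render_room(cache, room)

level = [
    {"name": "hall",  "shaders": {"brick", "torch", "enemy_a"}},
    {"name": "crypt", "shaders": {"brick", "fog", "enemy_b"}},
]

cache = ShaderCache()
warm_up(cache, level)
misses_before_play = cache.compile_calls  # every unique shader compiled once

for room in level:  # the player's actual run: zero new compiles, zero hitches
    render_room(cache, room)

print(misses_before_play, cache.compile_calls)
```

The point of the sketch: the compile count does not change during the real run, which is exactly why the hitching disappears.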
Because it's not that easy to do it properly and a lot of studios who are using a ready made engine do this because they lack the technical know-how and/or financial resources to either develop an own engine, or to dive deep into the quirks of a complex monster like UE. They import their art, design their levels, write their game logic in Lua or UnrealScript, click the "build project" button, and hope for the best. Many devs aren't even aware of the typical issues, which is why DF does such an important job in explaining it over and over again.
The work involved isn't just a checkbox and a screen or something: the way shaders work in Unreal (and every other engine, really) is that you dynamically stitch them together and with potentially dynamic external parameters: think weather systems, characters getting wet or dirty, blending between different ground materials, adding effects for power ups, that sort of thing (and those are just the most obvious ones) Unreal has a big list of these snippets, but not how they will be combined or with what values until you actually tell it to use them. Doing a shader compile is basically the developer finding all the final shader combinations they use in a game (easily thousands nowadays), often through just running through it and trying everything, and telling the engine to use them all in order. Unreal can't easily fix this without breaking the entire workflow that pretty much every artist has used for about 20 years now. Developers need to perform a huge amount of work for what they might not see as being high value, since "it's only the first time"
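The permutation explosion described above can be sketched like this (hypothetical feature names; a tiny Python stand-in for the actual shader system): each toggleable material feature multiplies the number of distinct shaders the engine may request.

```python
from itertools import product

# Sketch of shader permutation growth (feature names are made up).
# Every on/off combination of material features is a distinct shader
# the engine may need at runtime, so counts grow as 2**n.
features = ["wet", "dirty", "emissive", "powerup_glow"]

def all_permutations(features):
    # Enumerate every on/off combination as a set of enabled features.
    return [frozenset(f for f, on in zip(features, combo) if on)
            for combo in product([0, 1], repeat=len(features))]

perms = all_permutations(features)
print(len(perms))  # 2**4 = 16 permutations from just 4 boolean features
```

With a few dozen features the count is astronomically larger than anything you could compile blindly, which is why developers instead record the combinations actually hit during play.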
have u played last of us part 1 remake? i had to wait 40-45 mins just loading for the shader cache. imagine waiting that long haha, no one would wait that long; they'd either play with stutter or refund and delete the game
@@ChillieGaming I would genuinely rather wait 48 hours for a game to load than have it shipped unfinished. this is why they get away with it, coz y'all buy it regardless
they've changed how lumen works but I doubt they went through the city project and tweaked things for 5.4. So things that were not problems in 5.0 will become issues in 5.4. zero surprise there really, it's just what happens
that being said, it's been known for a while now not to use strong emissive values on small sources with Lumen; this would never be an issue in a real game (if the devs are competent). It's doing real time GI god damn it, it's insane it even works at all at reasonable performance. I feel like people quickly forget how impressive and truly next gen UE5 still is. It's not without problems for sure, but holy hell, which other free engine has a tool suite this impressive? spoilers: there are none that compare at all
@@quackcharge Unity compares. In fact, their Screen Space Global Illumination has very stable emissives. Maybe Epic can take a look at it. And runtime animation rigging since 2019. And before you mention Lumen, Unity has had realtime GI since 2017, except the geometry positions are baked, so light won't spill through an open door. Gray Zone Warfare is a disaster so far. They are realising nobody can run these games.
the music at the very start of the video is my favorite track in Unreal Tournament. Brings back the good memories from 2001 when I first played the game.
UE5 is really giving off those classic cryengine 2/3 vibes. Great to look at, with performance issues. For them to not have parallelization right out of the box is baffling.
@@coffin7904 It's extremely wrong and you are very mistaken. It's like calling Skrillex "house music." Or Aphex Twin "techno". Or Lynyrd Skynyrd "country". "Breakcore" and "drum and bass" are distinct genres and saying the difference is minute shows that you really don't know much about this type of music. You could say that both are part of the same super-genre of Jungle, but that's not what we're talking about. If you're going to claim that breakcore and D&B have a "minute" difference, what's the point of even using distinct genre terms at all? Let's just call it all "electronic music" and skip the specificity altogether! Christ.
@@azazelleblack It turns me into the Hulk whenever I hear kids put the word 'core' at the end of any arbitrary word. Like calling frutiger aero 'cleancore' or something.
@@coffin7904 Absolutely! Before we get started, it's important to understand that these genre terms are sort of poorly-defined and used pretty loosely. With that said, there ARE definitions to these terms, and in particular "Breakcore" is a distinct subgenre under the heading of "Drum & Bass" or "Jungle" music. People disagree over whether Jungle or D&B came first (and thus deserves to be the super-genre), but both evolved from earlier Breakbeat and Rave Hardcore music. Jungle music had heavy influence from Dancehall and usually was lower tempo, with an MC and party vibe, while Drum & Bass was focused on literally just rumbling basslines and breakbeat drum loops. Drum & Bass as a genre typically refers to the early works of artists like Technical Itch, Q Project, Grooverider, and many, many others. This style originated in the early-to-mid 1990s and was still very much party music. It's danceable, and while it has a much darker vibe than something like Rave Hardcore (or especially that genre's successor, Happy Hardcore), it's still chill enough that you can zone out and relax to it. Breakcore, meanwhile, is an evolution at least three stages removed from the original Drum & Bass sound. Around the same time people were coming up with D&B, other artists were experimenting with new sounds and creating what was then called the very stupid name "Intelligent Dance Music," or "IDM". IDM is often harsh, atonal, and challenging to listen to, and despite the name, it's almost entirely undanceable. The early crossover efforts between D&B's successors (known commonly as "Darkcore", see: Dieselboy) and IDM were mockingly called "Drill & Bass", but this style became somewhat popular within its niche and has numerous artists. You get crossover from both D&B guys and IDM guys in this genre. 
Drill & Bass eventually gave way to Breakcore, the successor genre that takes the relatively technical and stripped-down Drill & Bass and turns it up to 11 with influences from Hardcore (both EDM and Punk), industrial music, and even avant-garde noise music. True Breakcore is a brutal and harsh genre that's hard to enjoy for most people. It features gruesomely mutilated breakbeats with little rhythmic coherence and sharp, distorted sounds that can be like audio jumpscares or just constant stressors. Even if you say "Drum & Bass" is the super-genre, nobody is thinking about Breakcore when they say "Drum & Bass". They're thinking about Aphrodite, about LTJ Bukem, about Dieselboy, about Evol Intent, and so on. Breakcore artists are guys like Venetian Snares, Bogdan Raczynski, Sh!tmat, Rotator, and so on. Foregone Destruction is absolutely not Breakcore, lol.
Dude, thank you! i hope you cover additional versions as they release. this was very informative on whether UE5 was ready for the project i'm starting! It is not there yet, sadly!
Great content, Alex. Beyond the technical talk (which I've grown to understand over the years thanks to Digital Foundry), you did a great job illustrating the points you made. Bravo.
Problem with the Frankenstein PC is it is missing the co-processing from the I/O chip in the console. The PS5 I/O complex has the equivalent of 11 PS5 CPU cores dedicated to decompression and Direct Memory Access, an additional processor dedicated to I/O for the SSD, and another dedicated to memory mapping. On the Frankenstein PC, all of these tasks have to be handled on the CPU or GPU depending on how they decided to tackle it. As we move into full-fledged PS5/XBSX titles, the gap between the consoles and this PC will significantly increase as the I/O block in the console gets more and more use.
The comparison at 4:00 does have more fps in 5.4, but global illumination is way more noisy under the bridge. So Lumen is doing a worse job at calculating those emissive materials. (oh, it's mentioned at the end... hehehe)
the historical trend says that by UE 7.4 the stutter will be a constant up and down between 16ms and 100+ ms and the entire industry will pretend it isn't happening or if it is happening it's not a big deal or if it is a big deal it's impossible to solve and we just have to accept it any way. no version of Unreal has ever stuttered LESS than its predecessor.
I think they should put all of their resources to fixing the performance before improving rendering. Makes the most sense even from a marketing perspective
How on earth did it take 3 years to add multi-core rendering support? Hey, we're going to build a house without a roof, release it, and 3 years later we'll add the roof. Didn't they run this on a PS5? The demo of Matrix on a 7800X3D with a 4090 is unacceptable even today with 5.4 given that this is the best you can buy.
It has just been mind-blowing to me that these companies can put sooo much time, money, energy, human resources into making these amazing pieces of art, just to get to the very end finished product and accepting shader compilation stutter and traversal stutter. It's unfortunate, frustrating and just upsetting. Millions of dollars go into these projects and us consumers spend thousands on PCs just to have all these little hitches. Hope this will come to an end
I've been refusing to play UE titles since Jedi Fallen Order, and this is more confirmation to continue to avoid UE games. Unless DF/Alex confirms a UE title doesn't have stutter, then I refuse to play/purchase.
I doubt they could even if they wanted to, and for all we know the de-listing might have something to do with Tencent. And never forget that the original series of Unreal games were co-developed and directed by Digital Extremes. The only Unreal games made entirely by Epic Games were UT3 and UC2.
@@AlexanTheMan 99% of large companies like Epic are after 1 thing: money. don't know what else you would expect. An Unreal game would only get the boomers interested, and they're less likely to spend money on fomo and battlepasses and dump 100s of hours into a game, unlike the kids that play fortnite.
Why isn't it possible for the engine to download the shaders from a cdn/steam/nvidia/amd, matched to the video card? Steam does this for some games (apparently not for UE5 games). And consoles do it too.
They depend not only on the card, but also the driver. It might be possible, but it would be a lot weaker of a system than it is on, let's say, a Steam Deck
@@Wobbothe3rd the processor of a 4060 is the same no matter if it's from asus or msi. I'm a software dev, and i'm pretty sure that it's even simpler. I mean, it's just a simple compiler. It's like compiling C code for i386, i486, i586, i686. There is no need to compile it differently for every single SKU, only for the architecture: Turing, Ada Lovelace, etc. So that's about five/six different architectures. UNLESS shaders contain precompiler statements which check for VRAM size or number of shader cores, which would be IMO very weird.
@@iurigrang you raised a good point with the driver. But the fun fact is, that's because one essential part of the driver is the shader compiler. It's likely that not every small driver increment changes the output of the compiler. So nvidia/amd would need to keep a lookup table: source shader hash + architecture + driver version => compiled shader hash. That would be a finite number of files to deliver via cdn. I mean, they could say: we offer that service only for the latest WHQL driver and the latest upstream driver. That would be even fewer files to host, with a fallback to self-compiling. I guess the problem is, nobody wants to pay for the CDN, not the game developer and not Nvidia/AMD. ^^
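The lookup-table idea can be sketched like this (Python; the key scheme and all shader/architecture/driver values here are hypothetical, only the hash + architecture + driver keying comes from the comment above):

```python
import hashlib

# Sketch of a CDN-hosted shader cache keyed by (source hash, GPU arch,
# driver version). All names and values are made up for illustration.

def cache_key(shader_source: str, arch: str, driver: str) -> str:
    src_hash = hashlib.sha256(shader_source.encode()).hexdigest()
    return f"{src_hash}:{arch}:{driver}"

cdn = {}  # stands in for the hosted lookup table

def publish(source, arch, driver, binary):
    cdn[cache_key(source, arch, driver)] = binary

def fetch_or_compile(source, arch, driver):
    key = cache_key(source, arch, driver)
    if key in cdn:
        return cdn[key], "downloaded"          # hit: skip local compilation
    return compile_locally(source), "compiled" # miss: fall back to self-compile

def compile_locally(source):
    return b"binary:" + source.encode()  # stand-in for the driver's compiler

publish("float4 main() {...}", "ada", "552.22", b"precompiled-blob")
print(fetch_or_compile("float4 main() {...}", "ada", "552.22")[1])
print(fetch_or_compile("float4 main() {...}", "ada", "551.86")[1])
```

Note how a driver bump changes the key and misses the cache, which is the hosting-cost problem the comment points out: someone has to keep republishing binaries per driver release.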
The Matrix tech demo settings were similar but seemingly something changed under the hood in UE 5.4. Most likely this artifacting is somehow connected to the render parallelization improvement. Kinda seems like something is out of order/not synched in the pipeline.
@@Waffle4569 I kind of agree. We're comparing performance between UE5 iterations though so if the way lighting is processed helps performance while also making things prettier/uglier, that's what we're seeing. The *Matrix* game/demo settings are the same, just the engine changed. Apparently those lighting artifacts/fizzles are from MORE rays being processed into the scene and weren't there before due to those lights casting nearly no raytraced light before.
This stuttering issue has me quite worried about CDPR’s next games on it. It would be so sad if the new Witcher released with terrible shader compilation or traversal stuttering considering how great CDPR’s games ran on high end PC’s with RED Engine.
11:41 The denoiser seems to struggle substantially more in 5.4 as visible in the lights under the bridge/overpass. So some of that extra performance may have been gained by reducing overall image quality
UE stutter became most apparent to me with Lords of the Fallen... even after 40+ patches, the game STILL suffers from hitching, micro-stutters, and the like. It begs the question: Which engine can rise up and offer a robust set of "future-proof" features, provide affordable licensing, etc.?
I'm holding out hope that we can have some kind of machine learning model to handle shader compilation and that can run on the GPU. A model trained specifically to do shader compilation could be infinitely more efficient than the dumb "on demand" way it's done now.
Me with my 5800X3D and 4070 Super: *Plays Fortnite in 1080p, performance mode and low settings* I cannot suffer through a few games just to cache my shaders in dx12. It is a miserable experience.
Hearing the Unreal Tournament music makes my heart melt ♥️. Truly brings me back, and gets reminded that Epic Games is a juggernaut when it comes to their graphics from the Unreal Engine.
Overall it's great to see improvements, but it still needs a lot of work. No wonder all new UE5 games look and run the way they do. However, I still prefer path tracing with DLSS RR over UE5's global illumination, reflections, shadows... They have that unstable flickering "boiling" look that I find distracting. 🤔
Yeah, it's noticeable in ark. The software lumen is a big difference to path tracing. Software raytracing, hardware raytracing, then path tracing. Epic has said they want to optimize the engine to the point where hardware lumen is as expensive as software lumen is today. So if they can achieve this, we'll get better performance or resolution for UE5 games. Also we'll get better rt quality for 30fps modes. Something tells me they want software lumen working on phones, mobile gaming PCs and the switch 2.
Did you test if the shader stuttering issue is worse or better depending on the graphics API? At least in Unity it is much harder to remove shader stuttering on modern APIs like DX12 and Vulkan, because they expect you to compile a shader variant for each mesh attribute combination that you might use. Let's say you have one mesh that has position, normals, tangents and texture coordinate 0, and then another mesh that also has texture coordinate 1 for lightmapping. You would need to make sure to prewarm shaders for both of these mesh attribute configurations instead of just once for the shader variant. That's why i still stick to DX11/GLES3, because i didn't find time yet to build a system that would collect all the possible combinations used in a project and prewarm them properly.
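A rough sketch of the combination problem described above (Python stand-in; the mesh and variant names are made up): the number of pipelines to prewarm is shader variants times distinct vertex layouts, not just the variant count.

```python
# Toy model of per-layout pipeline prewarming (all names are made up).
# Under DX12/Vulkan-style APIs a pipeline exists per (shader variant,
# vertex attribute layout) pair, so each mesh layout needs its own prewarm.

meshes = [
    {"name": "prop",        "attrs": ("position", "normal", "tangent", "uv0")},
    {"name": "lightmapped", "attrs": ("position", "normal", "tangent", "uv0", "uv1")},
]
shader_variants = ["lit", "lit_masked"]

def pipelines_to_prewarm(meshes, variants):
    layouts = {m["attrs"] for m in meshes}  # distinct attribute layouts in use
    return {(v, layout) for v in variants for layout in layouts}

pipelines = pipelines_to_prewarm(meshes, shader_variants)
print(len(pipelines))  # 2 variants x 2 layouts = 4 pipelines, not just 2 shaders
```

The multiplicative blow-up is why collecting the actually-used combinations from a real playthrough beats trying to enumerate them by hand.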
4:00 this says it's the same settings, but is it? I notice a lot more denoising issues around the lights hanging under the highway. It's much worse in the distance.
11:57 also the lights hanging below the bridge have extreme artifacting in 5.4, which is not the case in Unreal 5.0. But the stutter should be fixed above everything else. A game that stutters is not fun to play at all, and for me at least it makes me quit before finishing even the 1st level
It's pretty shameful that a company like Epic, with all its resources, still has problems with shader compilation and traversal stutter on PC. If I were Jensen at Nvidia, I would be calling Epic every day pressuring them to put in the work. So many studios use UE, so to the average person who has never heard of shader stutter, booting up a UE game would just put a black mark on PC gaming. This would have a negative effect on PC hardware sales. I bought a PS5 for its small number of exclusives, but mainly to play UE5 games that interest me, just to avoid them on my PC (7800X3D/4090). And this hurts smaller studios that can't afford making their own engine. UE5 has so many great features, which attracts smaller studios who want to make something more than a side scroller.
It's a difficult problem because of how flexibly artists can author new shaders in the engine, with potentially tons of different permutations. There is effectively no way of knowing ahead of time which permutations of those shaders will be used at runtime, and the engine was structured for APIs like DirectX 11 where the driver automatically recompiled/restructured pipelines for different state and shader combinations with less overhead. There is even a new Vulkan extension that reintroduces part of this dynamic driver recompilation to subvert the issue, but DirectX 12 still requires you to compile your pipelines ahead of time for all possible shader/state combinations you could encounter during runtime if you want to avoid this problem.
@@TheSandvichTrials I assume you were talking about the dynamic rendering extension for Vulkan? If so DX12 doesn't yet have something like that that I'm aware of, but it does have an equivalent of Vulkan's new shader objects where you can compile individual shaders ahead of time and mix-and-match them at runtime, opting out of certain optimisations. The new work graphs feature of DX12 makes use of its equivalent of shader objects.
@@TheSandvichTrials Yeah, it's real fucking neat and could open up a ton of possibilities for GPU computation. Apparently Epic was playing around with porting Nanite to work graphs and held a presentation at GDC with their findings, so I'm hoping the footage of that is released, if it isn't already.
@@jcm2606 I watched that presentation, they showed Unreal running in that megascans canyon/desert environment, though they refused to share any details. But it seemed to work!
This is why it saddens me to see so many developers dropping their in-house engines for UE5. I always had at least 1-2 consoles for every generation, but always played big titles on PC, but current generation makes me rethink this bc it seems like the only way to dodge stutters is to switch to consoles...
It needs a 2D shader animation for 3D buildings at a distance, and an option to lock frames at 40fps and 45fps. it would also be nice to integrate frame generation, so having something like an internal 52fps lock upscaled to 60fps with frame generation.
I know engines get patches and updates throughout their life, but I'm surprised by how poorly UE5 can work with multi-threading. Like, it's nothing new, it's been around for years
Ok, maybe I'm a troglodyte here, but can someone explain to me WHY all PC games now seem to have shader compilation stutters? I don't get it. Is it a DX12 or Vulkan thing? I mean, none--let me emphasize that point, NONE--of the games I used to play in the 2000s and first half of the 2010s ever needed a shader precompilation step. They never stuttered. They just played. Why is this a thing now? How can we get rid of it? It's fucking ridiculous. I want games to JUST PLAY, like they used to.
Yes, it's a DX12/Vulkan thing. Put simply, prior to DX12 and Vulkan, the driver was responsible for compiling shaders and would do a bunch of optimisations in the background to ensure that shader compilation didn't affect performance much. This meant that how games handled shader compilation was more consistent, but it and the various other abstractions of older APIs led to performance being unpredictable at the developer level. As a result DX12 and Vulkan were created, which gave developers more direct control of the hardware at the cost of requiring them to do things that the driver would previously do. One of those things is that the developer is now responsible for deciding when, where and how to compile shaders, which they're currently failing at doing properly.
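A toy sketch of the difference the reply above describes (Python with invented timings, not real D3D calls): under explicit APIs, a pipeline compiled lazily at first draw is exactly the hitch players feel, while a second draw with the same state is a cheap cache hit.

```python
import time

# Toy model of explicit-API pipeline compilation (all numbers invented).
# Compiling a pipeline lazily at first draw produces a visible hitch;
# drawing with the same state again is a cheap cache hit.

COMPILE_COST_MS = 80  # stand-in for a real shader/pipeline compile

class ExplicitApiApp:
    def __init__(self):
        self.pipelines = {}  # state combo -> compiled pipeline

    def compile_pipeline(self, state):
        time.sleep(COMPILE_COST_MS / 1000)  # simulate the expensive compile
        self.pipelines[state] = f"pso:{state}"

    def draw(self, state):
        start = time.perf_counter()
        if state not in self.pipelines:  # lazy compile = hitch on first use
            self.compile_pipeline(state)
        return (time.perf_counter() - start) * 1000  # ms spent in this draw

app = ExplicitApiApp()
first_draw_ms = app.draw("opaque+fog")   # pays the compile cost
second_draw_ms = app.draw("opaque+fog")  # cache hit, no hitch
```

Moving that `compile_pipeline` call to a loading screen (or shipping precompiled caches) is the whole fix; the hard part is knowing every `state` in advance.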
It's nice performance is better, but I can see graphics are worse as well. Check out the lights under the overpass from 3:58. Lots of flicker and lighting weirdness.
We'll land on mars before shader compilation stutters are fixed in UE.
the stutter is fixed once your game devs can recognize the limits and work around them. there is no magic involved in 3d engines...everything still has limits
@@krz9000 Yeah it’s silly to rely on game engines to do all the work for you, game devs still need to be competent programmers and find solutions to any type of stuttering that the engine can’t deal with on its own
@@krz9000 Not even Epic itself can get rid of the stutters in their own damn game, as it's shown in this very video.
Maybe... I bet we don't land on Mars for at least 15 years.
@@Z3uS2 when i see a game with unreal engine i already know that it's gonna stutter like crazy. If i see cryengine i know I'm in for a good time with great visuals. Sadly not many cryengine games
Yeesh the lighting under the bridge in 5.4 is fireworks
Been seeing that in Gray Zone Warfare, too. It's incredibly distracting.
@@SHABAD0O i'm watching this hoping they update to 5.4 so we get a few more fps in GZW 🤣
It’s either the upscaler being wonky (TSR isn’t very good atm), or Lumen using an extremely low sample count, or both. Could also be changes to temporal values in the Lumen denoiser. You can see traditional upscalers destroy RTX lighting quality and reflections in basically every game in this way, especially at 50%. This is why Ray Reconstruction exists, allowing the denoiser to run on a full resolution image rather than the lower, pre-upscale resolution. One would have to compare the Matrix demo using the NvRTX branch in order to test this, as it uses its own systems.
I was noticing it before he even got to the subject. It's worse at far render distance, it seems.
What did they change to make it look so much worse? Now better performance but worse visuals. I guess we just won't get both
One note to the Reflection issue showing at 15:38:
Lumen has a CVAR, r.Lumen.Reflections.MaxRoughnessToTrace, which is a cutoff that determines if raytraced or screen space reflections (you called it probe based) should be used by Lumen. This is set globally, so materials above a certain roughness value do not use raytraced reflections. In UE 5.4 this is now a setting in the scene's post-processing. It is possible that UE 5.0 used a different value for this setting than the default of 0.4 and it is now overwritten. If e.g. 0.3 or lower had been used, noise gets reduced significantly, which in turn reduces accuracy but improves performance.
/\ This guy lumens.
|
DF to Epic...what about shader stuttering?
Epic to DF...Ye,Ye,Yes.
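The MaxRoughnessToTrace cutoff described a few comments up can be sketched like this (Python stand-in for engine code; the material names and roughness values are made up, only the 0.4 default comes from that comment):

```python
# Sketch of the roughness cutoff for Lumen reflections (illustrative only).
# Materials at or below the cutoff get ray-traced reflections; rougher
# materials fall back to the cheaper screen-space/probe path.

def reflection_method(roughness, max_roughness_to_trace=0.4):
    # Mirrors the r.Lumen.Reflections.MaxRoughnessToTrace cutoff idea.
    return "raytraced" if roughness <= max_roughness_to_trace else "fallback"

materials = {"glass": 0.05, "painted_metal": 0.35, "brick": 0.8}

for name, rough in materials.items():
    print(name, reflection_method(rough))             # default 0.4 cutoff
print("painted_metal", reflection_method(0.35, 0.3))  # stricter 0.3 cutoff
```

Which shows the trade-off: lowering the cutoff moves borderline materials like painted_metal off the ray-traced path, reducing noise and cost at the price of accuracy.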
💀RIP to the performance for all next gen games
I remember an Epic engineer years ago saying on a DF video that they were working hard on the Fortnite shader stuttering.. a few years later and it's just as bad. Wtf are they doing
@@sven957 nothing, nothing at all, naturally.
@@sven957making it more noticeable.
the epic store on pc struggles when you search. we're talking about the store page frontend here, no 3d graphics.
That demo was 3 years ago... god damn whered the time go
better question, where did the frames go? oh wait, they were never really there.
And still no games lol
@@yc_030 We have tekken 8 with that crappy graphics 😂
damn, really? wtf 😅 feels like a year
3 years and we are getting there, almost usable! Closing in! The performance optimization is certainly good, but the shader compilation issue is taking too damn long to get fixed.
UE 5.4: "Ddddid I stutter? Well yes I did..."
Sigma
Fortnite could be shifting to Unreal Engine 5.4 soon
Such a shame they haven't fixed this stuttering; with DDR4, DDR5, and high-bandwidth cards with DirectStorage support, stutters should be a thing of the past!
@@Multimeter1 Seriously, at this point there’s no excuse.
Say what again!
It's also worth pointing out, that every driver update will bring back that stuttering too. Which is a real pain. It's not just a "first play experience", it's a "first play experience every few weeks".
I like the Steam Deck solution to this, which is to crowdsource the shader cache. Steam keeps a constantly-updating shared blob of compiled shaders that get automatically uploaded and downloaded to and from Steam Deck users for each game. I'll bet this functionality could one day come to GFE if people were actually excited to use it.
@@GallileoPaballa That's a great solution.
Would it take a cached shader set for every game/driver/card combination?
12:29 that guy is really interested in the parking rules
I thought he was reading the bus stop time table lol
I'm dying.
Epic devs trying to find the instructions to fix shader stutter.
He likes those +500% zoom views.
I hope he finds what he's looking for
All I hope when watching DF videos is that the developers and publishers are listening and taking notes. You guys do such a good job of offering advice through your videos that would benefit everyone.
Me too.
5:21 the frame-time graph looks like the city its trying to render
'a frametime graph that looks like post-modern art'
never change Alex... never change
They're really very good games. I recommend at least playing Just Cause 2 from the whole list; it's a game that can't be missed and shows Square Enix's graphics power in 2011
Once again it has two more graphics options for Nvidia cards: bokeh lighting and water physics/simulation
The shader comp is insane in games like Fortnite, which on a fresh install means a few games that are literally a slideshow going from 144 FPS to 20 constantly until you play a few matches.
I always had rock solid 240fps? 🤔
Omg is this why my Fortnite stutters? I never play but when I do get on I get constant stutters despite having a good pc and no stutters in any other game
@@wanshurst2416 If you've been playing for a while it's not an issue. If you do a cold boot it will stutter while it builds shaders. I tested how bad it can be with a 10900K/4090 PC vs a 7800X3D/4090, and while the AMD CPU was noticeably better in frametime and average FPS, it's still bad for the first 5+ games on a fresh install.
@@wanshurst2416 update your drivers and you're back to the stutterfest
intro track ID for those wondering: Michiel van den Bos - Foregone Destruction
The fact that stuttering still isn't fixed is surreal, unreal you might even say!
*ba dum tsss*
That Unreal Tournament music slaps
loved that game series
It's Foregone Destruction I believe !Such a good track!
You gotta remember your roots
Every time I see Foregone Destruction’s map image in Fortnite I smile. I’m sad I keep missing the track in the shop. It’s an all-time classic
I love Unreal Tournament and the fact that EPIC removed the series from all storefronts still keeps me salty AF
With that music it's impossible not to imagine a UT capture the flag match on Facing Worlds with these graphics.
I'm still salty at Epic for abandoning the next Unreal Tournament game. (and Paragon, for that matter)
I will take frames and no stutter over a window reflection or a better-looking brick wall every single time. It's really WILD that they are just now working on multi-core performance improvements, considering how long multi-core CPUs have been around. It's also still wild to me that we are so obsessed with pushing for technologies that can't even run well at the standard resolutions we use now (4K is the TV standard, 1440p the PC standard) without upscaling, and even then you need top-tier equipment for an experience plagued by stuttering, just to look at a window reflection or stare at a brick wall to see how the sun reflects. I'm not saying we shouldn't be pushing for these new plateaus, but c'mon guys, let's build on solid foundations here and not sand.
Not to mention you need to be well versed in overclocking. This was well said tho. I’m one of the few that will build a god rig just to run it at 1080p since I care about input lag
In my opinion, the reason they want raytracing to work so badly is because of how much time it saves in development. Without raytracing, lighting needs to be baked and requires lots of trial and error to get scene lighting to look right/good. With raytracing, the engine and GPU does it almost all of it for you in real-time. It reduces dev time and saves money.
@@silfrido1768 1080p looks good on an actual 1080p monitor. 1080p on a 1440p or 4K display is horribly blurry.
Adore the Young Frankenstein bit with the thunder at each mention.
15:00 ish: Phew I'm glad you addressed the ceiling lights. They were driving me mad in the performance comparison :D
I looove these tech spotlight videos from Alex. Absolutely makes my day!
I am happy you covered the emmisive lighting noise. I noticed that and did not know what was causing that.
I wish they’d update the demo on consoles like kind of a UE5.4 tech demo for users.
Just for info: for every update, the devs need to pay Sony/MS.
@@Swisshost that sucks
@@Swisshost That was in the 7th gen, but it hasn't been a thing for at least a decade.
4:05 good performance gains, but what seems like massive downgrade in indirect lighting. There's flickering everywhere under that bridge, where it was fine in 5.0
Super Castlevania IV Simon’s theme kicks ass in the background!
Castlevania goated
Castlevania has stupidly good music. Symphony of the Night and IV are something truly amazing.
I think Alex is mixing up Frankenstein and Dracula :D
@@WH250398 Agreed, SC4 had without a doubt the best music on the Super NES. Even today it sounds amazing coming from 8 sound channels.
Trying to figure out UE5 as an audio person has been fun 😂
Jesus Christ, I was trying to get Atmos working on 5.3 and in the end I just gave up
I'm an audio enthusiast much more than a graphics enthusiast, and I'm always afraid technology will forget us. Especially with so many people today using smart phones and portable consoles with headsets (or heaven forbid without them), and even home theaters moving towards "sound bars" and all sorts of other magic gimmicks. Which, on the other hand, I guess makes it easy to understand why it might be frustrating to develop audio for games. You'll spend your time carefully capturing all the foley sounds, recording crickets at night, composing and conducting symphony orchestras, mastering everything to a T, figuring out how to place everything correctly in a surround or atmos system etc etc., meanwhile knowing a large portion of your players will be playing those carefully crafted soundscapes through a "surround" bar made from soap box plastic in a giant, barren concrete room.
Can't wait to see all the stutter in Witcher 4 and Cyberpunk 2 :/
Since there are still several years until the release of these games (with further improvements to the game engine coming out in the meantime) and also because I trust in CPDR's abilities, I remain quite confident.
@@Z3t487 There's no guarantee that those developers are updating the engine in line with their game code; they usually pick one version of the engine and build from there.
If they try to update the engine during development, many of the game's source files will break and it will be exceedingly hard to pinpoint bugs. This is why Unreal Engine 4 games are still being released; those projects took that long to complete.
@@AlexanTheMan According to CDPR, it's implied that the engine will be improved throughout development of the game
@@aquaneon8012 Source?
@@Z3t487 shader stutter has been a thing in UE since like 2010, I doubt it's going to be fixed in the next 10 years
I saw a video recently -- "Optimizing my Game so it Runs on a Potato" by @Blargis3d. He's making an indie game and was having the same compilation stutter problem. He solved it in kind of a genius way. When the game starts, before every level, he has a black loading screen. Thing is --- it's a trick. What he's *actually* doing behind that black screen, is playing the game at 10 times speed, walking in every room, loading every texture and killing every enemy. That way, all of the hitching and loading that has to happen, happens during that period. Then, when the 'loading' completes and the player plays the level, it's actually the *second* time that all of those assets are loaded. Thus -- complete elimination of compilation stutter.
This was the first time I ever heard of this (I didn't even know such a thing was possible) and thought it was really cool. Thus, sharing it here. :)
The game is Bloodthief by @Blargis3d. The shader compilation trick is in the video "Optimizing my Game so it Runs on a Potato" and is indeed very cool!
@@HildingL Thank you so much for the video! I'll update my comment to reflect this info. Thanks again! :)
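For anyone curious, the trick described above boils down to paying every first-use compile cost while the player can't see it. A minimal Python sketch of the idea (all names, costs, and timings here are invented for illustration; a real engine compiles GPU pipeline state objects instead):

```python
# Simulated shader cache: the first use of a material pays a compile hitch.
shader_cache: set[str] = set()

def draw(material: str) -> float:
    """Return the simulated cost of one draw call, in seconds."""
    cost = 0.016  # ~16 ms, a normal 60 fps frame
    if material not in shader_cache:
        cost += 0.200  # simulated compilation hitch on first use
        shader_cache.add(material)
    return cost

def warmup_pass(materials: list[str]) -> None:
    """Behind the black 'loading' screen: touch every material once."""
    for m in materials:
        draw(m)

level_materials = ["brick", "glass", "water", "enemy_skin"]
warmup_pass(level_materials)

# Real gameplay afterwards: everything is cached, so no frame pays the hitch.
worst_frame = max(draw(m) for m in level_materials)
print(worst_frame)  # 0.016
```

The clever part of the approach in that video is that the "warm-up pass" is literally just playing the level at high speed, so it naturally covers whatever a real player would trigger.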
So why don't all developers add a shader-cache loading step before you even start playing?
Because they don't need to: people will still buy the game, and paying devs to do something that does not increase profit makes no sense to them.
Because it's not that easy to do it properly and a lot of studios who are using a ready made engine do this because they lack the technical know-how and/or financial resources to either develop an own engine, or to dive deep into the quirks of a complex monster like UE. They import their art, design their levels, write their game logic in Lua or UnrealScript, click the "build project" button, and hope for the best. Many devs aren't even aware of the typical issues, which is why DF does such an important job in explaining it over and over again.
The work involved isn't just a checkbox and a screen or something: the way shaders work in Unreal (and every other engine, really) is that you dynamically stitch them together and with potentially dynamic external parameters: think weather systems, characters getting wet or dirty, blending between different ground materials, adding effects for power ups, that sort of thing (and those are just the most obvious ones)
Unreal has a big list of these snippets, but not how they will be combined or with what values until you actually tell it to use them. Doing a shader compile is basically the developer finding all the final shader combinations they use in a game (easily thousands nowadays), often through just running through it and trying everything, and telling the engine to use them all in order.
Unreal can't easily fix this without breaking the entire workflow that pretty much every artist has used for about 20 years now. Developers need to perform a huge amount of work for what they might not see as being high value, since "it's only the first time"
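The combinatorics described above can be sketched in a few lines of Python (the feature axes are invented for illustration; real Unreal permutations come from material/shader define combinations):

```python
from itertools import product

# Hypothetical feature axes that each multiply the number of shader variants.
features = {
    "weather": ["dry", "wet", "snow"],
    "skinning": ["static", "skeletal"],
    "lightmap": ["off", "on"],
    "quality": ["low", "high"],
}

# Every combination is potentially a distinct compiled pipeline that the
# developer must discover and prewarm ahead of time.
permutations = list(product(*features.values()))
print(len(permutations))  # 3 * 2 * 2 * 2 = 24
```

Four small axes already give 24 pipelines; real games with dozens of axes easily reach thousands, which is why "run through the whole game and record what gets used" is the usual discovery method.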
have u played last of us part 1 remake?
I had to wait 40-45 mins just on the loading screen for the shader cache
imagine waiting that long haha
no one would wait that long and either play with stutter or refund and delete the game
@@ChillieGaming I would genuinely rather wait 48 hours for a game to load than have it shipped unfinished. This is why they get away with it: y'all buy it regardless
in the 5.4 shots there is a TON more GI sparkle . .
they've changed how lumen works but I doubt they went through the city project and tweaked things for 5.4. So things that were not problems in 5.0 will become issues in 5.4. zero surprise there really, it's just what happens
That being said, it's been known for a while not to use strong emissive values with small sources with Lumen; this would never be an issue in a real game (if the devs are competent). It's doing real-time GI, god damn it; it's insane it even works at all at reasonable performance. I feel like people quickly forget how impressive and truly next-gen UE5 still is. It's not without problems for sure, but holy hell, which other free engine has a tool suite this impressive? Spoilers: there are none that compare at all
Yeah it looks blotchy
@@quackcharge Unity compares. In fact, their Screen Space Global Illumination has very stable emissives. Maybe Epic can take a look at it. And runtime animation rigging since 2019.
And before you mention Lumen, Unity has had realtime GI since 2017, except the geometry positions are baked, so light won't spill through an open door.
Gray Zone Warfare is a disaster so far. They are realising nobody can run these games.
Thank you for using Unreal Tournament soundtrack for this video.
Starting the video with the facing worlds track. Nice!
that car deformation is actually the thing that impresses me most of the Matrix Demo
0:03 Unreal Tournament - Forgone Destruction ❤️
Alex's pronunciation of Frankenstein is peak DF content 🤌
Frrrankenstein. The German is strong in this. :D
Actually the correct pronunciation
That emoji doesn't mean what you think it means. :-)
@@colaboytje It means different things to people from different cultures.
@@andreasheinze9685 That is the Italian dead duck emoji, no?
Using classic UT music at the beginning is amazing!
The entire video used UT music except the Frankenstein part
the music at the very start of the video is my favorite track in Unreal Tournament. Brings back the good memories from 2001 when I first played the game.
UE5 is really giving off those classic cryengine 2/3 vibes. Great to look at, with performance issues. For them to not have parallelization right out of the box is baffling.
"Frankenstein PC" *thunder souds* - lol love you guys at DF
making me think of ctf-face while at work... thanks guys I needed that.
Alex, you have no idea how much I appreciate your choice of music for this video!
I saw some kid describe Foregone Destruction as "breakcore" a while back. I almost turned into The Hulk.
I mean it is drum and bass but the difference between breakcore and dnb is minute. It's not exactly wrong to call it breakcore.
@@coffin7904 It's extremely wrong and you are very mitsaken. It's like calling Skrillex "house music." Or Aphex Twin "techno". Or Lynyrd Skynyrd "country". "Breakcore" and "drum and bass" are distinct genres and saying the difference is minute shows that you really don't know much about this type of music. You could say that both are part of the same super-genre of Jungle, but that's not what we're talking about. If you're going to claim that breakcore and D&D have a "minute" difference, what's the point of even using distinct genre terms at all? Let's just call it all "electronic music" and skip the specificity altogether! Christ.
@@azazelleblack can you explain what the difference is?
@@azazelleblack It turns me into the Hulk whenever I hear kids put the word 'core' at the end of any arbitrary word. Like calling frutiger aero 'cleancore' or something.
@@coffin7904 Absolutely!
Before we get started, it's important to understand that these genre terms are sort of poorly-defined and used pretty loosely. With that said, there ARE definitions to these terms, and in particular "Breakcore" is a distinct subgenre under the heading of "Drum & Bass" or "Jungle" music.
People disagree over whether Jungle or D&B came first (and thus deserves to be the super-genre), but both evolved from earlier Breakbeat and Rave Hardcore music. Jungle music had heavy influence from Dancehall and usually was lower tempo, with an MC and party vibe, while Drum & Bass was focused on literally just rumbling basslines and breakbeat drum loops.
Drum & Bass as a genre typically refers to the early works of artists like Technical Itch, Q Project, Grooverider, and many, many others. This style originated in the early-to-mid 1990s and was still very much party music. It's danceable, and while it has a much darker vibe than something like Rave Hardcore (or especially that genre's successor, Happy Hardcore), it's still chill enough that you can zone out and relax to it.
Breakcore, meanwhile, is an evolution at least three stages removed from the original Drum & Bass sound. Around the same time people were coming up with D&B, other artists were experimenting with new sounds and creating what was then called the very stupid name "Intelligent Dance Music," or "IDM". IDM is often harsh, atonal, and challenging to listen to, and despite the name, it's almost entirely undanceable.
The early crossover efforts between D&B's successors (known commonly as "Darkcore", see: Dieselboy) and IDM were mockingly called "Drill & Bass", but this style became somewhat popular within its niche and has numerous artists. You get crossover from both D&B guys and IDM guys in this genre.
Drill & Bass eventually gave way to Breakcore, the successor genre that takes the relatively technical and stripped-down Drill & Bass and turns it up to 11 with influences from Hardcore (both EDM and Punk), industrial music, and even avant-garde noise music. True Breakcore is a brutal and harsh genre that's hard to enjoy for most people. It features gruesomely mutilated breakbeats with little rhythmic coherence and sharp, distorted sounds that can be like audio jumpscares or just constant stressors.
Even if you say "Drum & Bass" is the super-genre, nobody is thinking about Breakcore when they say "Drum & Bass". They're thinking about Aphrodite, about LTJ Bukem, about Dieselboy, about Evol Intent, and so on. Breakcore artists are guys like Venetian Snares, Bogdan Raczynski, Sh!tmat, Rotator, and so on. Foregone Destruction is absolutely not Breakcore, lol.
Dude, thank you! I hope you cover additional versions as they release. This was very informative for judging whether UE5 was ready for the project I'm starting! It is not there yet, sadly!
That Unreal Tournament track brought back some great memories! I wish Epic would revive the series.
The music from the original Unreal Tournament is epic.
Great content, Alex. Instead of the technical talk (which I’ve grown in understanding over the years due to Digital Foundry) you did a great job illustrating the points you made. Bravo.
Problem with the Frankenstein PC is it is missing the co-processing from the I/O chip in the console. The PS5 I/O has the equivalent of 11 PS5 CPU cores dedicated to decompression and Direct Memory Access; it has an additional processor dedicated to I/O for the SSD, and another dedicated to memory mapping.
On the Frankenstein PC, all of these tasks have to be handled on the CPU or GPU depending on how they decided to tackle it.
As we move into full-fledged PS5/XBSX titles, the gap between the consoles and this PC will significantly increase as the I/O hardware in the console gets more and more use.
Careful, the PC fanboys will attack u
The Frankenstein PC had me laughing, well done!
The comparison at 4:00 does show more FPS in 5.4, but global illumination is way more noisy under the bridge. So Lumen is doing a worse job at calculating those emissive materials. (oh, it's mentioned at the end... hehehe)
This could be an issue with the setup. He literally just ported the demo to the new version. Some config tweaks could be necessary to fix this.
What a trainwreck of an engine when it comes to the shader stuff. Like holy shit
Man, that Unreal Tournament song brings back memories. I wish there were a new Unreal Tournament
Epic should bring back Unreal Tournament but inside Fortnite to advertise the capabilities of UEFN.
12:31 Alex in the background looking for high frequency detail textures
Opening with Foregone Destruction - instant throwback to the good ol' UT99 days.
Well boys, maybe by Unreal Engine 7.4 we can finally get a stutter-free experience.
The historical trend says that by UE 7.4 the stutter will be a constant up and down between 16 ms and 100+ ms, and the entire industry will pretend it isn't happening, or if it is happening it's not a big deal, or if it is a big deal it's impossible to solve and we just have to accept it anyway. No version of Unreal has ever stuttered less than its predecessor.
I think they should put all of their resources to fixing the performance before improving rendering. Makes the most sense even from a marketing perspective
How on earth did it take 3 years to add multi-core rendering support? Hey, we're going to build a house without a roof, release it, and 3 years later we'll add the roof.
Didn't they run this on a PS5? The demo of Matrix on a 7800X3D with a 4090 is unacceptable even today with 5.4 given that this is the best you can buy.
If it was the easiest thing it wouldn't have been an issue to begin with.
Whoahhhh digital foundry with the jungle/ambient/intelligent DnB music at the start !!! Cool!
It has just been mind-blowing to me that these companies can put sooo much time, money, energy, human resources into making these amazing pieces of art, just to get to the very end finished product and accepting shader compilation stutter and traversal stutter. It's unfortunate, frustrating and just upsetting. Millions of dollars go into these projects and us consumers spend thousands on PCs just to have all these little hitches. Hope this will come to an end
Play on consoles.
@@борисрябушкин-з9н Consoles aren't exempt from these issues.
you need to overclock lol, no other way around it
@@silfrido1768 lol. Nope. That is not the issue at all.
Always fun to see what you guys come up with
I've been refusing to play UE titles since Jedi Fallen Order, and this is more confirmation to continue to avoid UE games. Unless DF/Alex confirms a UE title doesn't have stutter, then I refuse to play/purchase.
You play games because they're fun. If this is your concern, then you're robbing yourself of good experiences.
Great video. Loved the UT and Castlevania IV music!
Bring back the matrix demo for consoles!
Can Epic do an Unreal remake now? Is it still too soon?
That would be incredible.
I doubt they could even if they wanted to, and for all we know the de-listing might have something to do with Tencent. And never forget that the original series of Unreal games were co-developed and directed by Digital Extremes. The only Unreal games made entirely by Epic Games were UT3 and UC2.
Yeah, Epic wants nothing to do with Unreal anymore. Shows the kind of company they are.
@@sulphurous2656 This is why I hope Blizzard never follows up on D2R.
@@AlexanTheMan 99% of large companies like Epic are after one thing: money. I don't know what else you would expect. An Unreal game would only get the boomers interested, and they're less likely to spend money on FOMO and battle passes and dump hundreds of hours into a game, unlike the kids that play Fortnite.
The shader compilation issue is the reason why I still play games on consoles. Looking forward to the Nintendo Switch 2.
04:11, going under the underpass: there looks to be noise flicker near the lights which isn't there in the old version.
Why isn't it possible for the engine to download the shaders from a CDN (Steam/Nvidia/AMD), matched to the video card?
Steam does this for some games (apparently not for UE5 games).
And consoles do it too.
There are literally hundreds of SKUs of PC GPUs.
They depend not only on the card, but also the driver.
It might be possible, but it would be a lot weaker of a system than it is on, let's say, a Steam Deck
@@Wobbothe3rd the processor of a 4060 is the same, no matter if it’s from asus or msi.
I’m a software dev, and I'm pretty sure that it's even simpler. I mean, it's just a compiler. It's like compiling C code for i386, i486, i586, i686. There is no need to compile it differently for every single SKU, only for the architecture: Turing, Ada Lovelace, etc. So that's about five/six different architectures.
UNLESS, shaders contain precompiler statements which check for VRAM size or number of shader cores, which would be IMO very weird.
@@iurigrang you raised a good point with the driver. But the fun fact is, that’s because one essential part of the driver is the shader compiler.
It’s likely, that not every small driver increment, changes the output of the compiler.
So nvidia/amd needs to keep a lookup table:
source shader hash + architecture + driver version => compiled shader hash.
That would be a finite number of files to deliver via cdn.
I mean, they could say they offer that service only for the latest WHQL driver and the latest upstream driver. That would be even fewer files to host.
With a fallback to self compiling.
I guess the problem is, nobody wants to pay for the CDN, not the game developer and not Nvidia/AMD. ^^
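The lookup table proposed above can be sketched like this (the hashes, architecture names, and driver versions are invented; the point is that the key must include the driver version, since the driver's compiler determines the output):

```python
import hashlib

def cache_key(shader_source: bytes, architecture: str, driver_version: str) -> str:
    """Key for one compiled blob: source + GPU architecture + driver version."""
    h = hashlib.sha256()
    h.update(shader_source)
    h.update(architecture.encode())
    h.update(driver_version.encode())
    return h.hexdigest()

# CDN index: cache key -> identifier of the precompiled blob to download.
cdn_index: dict[str, str] = {}

key = cache_key(b"float4 main() { ... }", "ada_lovelace", "552.22")
cdn_index[key] = "blob-0001"  # uploaded once, served to everyone on that combo

# A newer driver misses the cache and must fall back to local compilation.
miss = cache_key(b"float4 main() { ... }", "ada_lovelace", "552.44")
print(miss in cdn_index)  # False
```

This is essentially the shape of the crowdsourced cache mentioned earlier in the thread: finite keys, finite blobs, with local compilation as the fallback on a miss.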
Just compute the shaders on-device before starting to play; that's way easier.
Love how clear and concise this video is Alex! The Matrix demo still looks so good. Accurate lighting is so powerful.
4:04 "Exact same content and settings"
> Extremely noticeable artifacting on the ceiling
The Matrix tech demo settings were similar but seemingly something changed under the hood in UE 5.4. Most likely this artifacting is somehow connected to the render parallelization improvement. Kinda seems like something is out of order/not synched in the pipeline.
I thought the same but he explains why this is the case at the end of the video.
@@Cheynanigans__ It is the same, but it kind of invalidates performance comparisons when there's such a difference.
@@Waffle4569 I kind of agree. We're comparing performance between UE5 iterations though so if the way lighting is processed helps performance while also making things prettier/uglier, that's what we're seeing. The *Matrix* game/demo settings are the same, just the engine changed. Apparently those lighting artifacts/fizzles are from MORE rays being processed into the scene and weren't there before due to those lights casting nearly no raytraced light before.
It's because of lumen using variable rate shading i would say
4:20 why does 5.4 have so many more artifacts in its lighting compared to 5.0?
Watch the video
This stuttering issue has me quite worried about CDPR’s next games on it. It would be so sad if the new Witcher released with terrible shader compilation or traversal stuttering considering how great CDPR’s games ran on high end PC’s with RED Engine.
You can manually fix it, as Alex says. You need to force pre-compilation of all the shaders.
It won't be fixed; it will probably be minimized if well developed, but still present
gear up, the performance is gonna be worse than you can possibly imagine.
The Hitcher
We're going back to Witcher 1 levels of stutter baby lets go
11:41 The denoiser seems to struggle substantially more in 5.4 as visible in the lights under the bridge/overpass. So some of that extra performance may have been gained by reducing overall image quality
We are getting close; just 4 to 5 years and UE5 will be playable
Too bad games are being made using ue5 as we speak.
On the next gen after this maybe
UE stutter became most apparent to me with Lords of the Fallen... even after 40+ patches, the game STILL suffers from hitching, micro-stutters, and the like.
It begs the question: Which engine can rise up and offer a robust set of "future-proof" features, provide affordable licensing, etc.?
The day the GPU handles almost everything is gonna be nice
Doubt it will ever happen. Been saying this for years now
@@SPG8989 Alan Wake 2 did a pretty good job with it though, eh? Unless you mean Unreal Engine specifically.
I'm holding out hope that we can have some kind of machine learning model to handle shader compilation and that can run on the GPU. A model trained specifically to do shader compilation could be infinitely more efficient than the dumb "on demand" way it's done now.
4:17 hearing Castlevania music on a DF video is a surprise to be sure, but a welcome one
Me with my 5800X3D and 4070 Super: *Plays Fortnite in 1080p, performance mode and low settings*
I cannot suffer through a few games just to cache my shaders in dx12. It is a miserable experience.
You can download the shaders for fortnite
Last time i played the game it worked just fine?
@@timodeurbroeck9957 Even downloading the shaders doesn't always help; you need to compile them for your specific PC configuration.
4:00 what's up with the lights on the right? They seem to be artefacting.
Epic really needs to get stuttering fixed and ASAP. It utterly ruins games on PC. Makes me not want to touch the engine.
I dread every UE 5 release as a PC gamer. I always wonder how bad the stutter will be. Not if it will stutter.
Hearing the Unreal Tournament music makes my heart melt ♥️. Truly brings me back, and gets reminded that Epic Games is a juggernaut when it comes to their graphics from the Unreal Engine.
Always great stuff Alex 👍
Traversal stutter has been in UE for, what… 20 years now maybe?
Overall it's great to see improvements, but it still needs a lot of work. No wonder all new UE5 games look and run the way they do. However, I still prefer path tracing with DLSS RR over UE5's global illumination, reflections, shadows... They have that unstable flickering "boiling" look that I find distracting. 🤔
Yeah, it's noticeable in Ark. Software Lumen is a big step down from path tracing: software ray tracing, then hardware ray tracing, then path tracing. Epic has said they want to optimize the engine to the point where hardware Lumen is as cheap as software Lumen is today. If they can achieve this, we'll get better performance or resolution in UE5 games, and better RT quality in 30 fps modes. Something tells me they want software Lumen working on phones, mobile gaming PCs, and the Switch 2.
0:03 ah, the Facing Worlds OST from OG UT. Good to know that Epic tries to erase UT from its history.
Not bad. If they now can make the Epic Game Store run 60% faster too...
Did you test whether the shader stuttering issue is worse or better depending on the graphics API? At least in Unity it is much harder to remove shader stuttering for modern APIs like DX12 and Vulkan, because they expect you to compile a shader variant for each mesh attribute combination that you might use. Say you have one mesh with Position, Normals, Tangents, and Texture Coordinate 0, and then another mesh that also has Texture Coordinate 1 for lightmapping: you would need to prewarm shaders for both of those mesh attribute configurations, instead of just once per shader variant. That's why I still stick to DX11/GLES3, because I haven't found time yet to build a system that collects all the possible combinations used in a project and prewarms them properly.
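The prewarming bookkeeping described above amounts to collecting the distinct vertex-attribute layouts a shader is actually used with. A small sketch of that collection step (pure Python for illustration, not actual Unity API):

```python
# Each mesh declares its vertex attributes; identical frozensets = same layout.
meshes = [
    frozenset({"position", "normal", "tangent", "uv0"}),
    frozenset({"position", "normal", "tangent", "uv0", "uv1"}),  # lightmapped
    frozenset({"position", "normal", "tangent", "uv0"}),  # duplicate layout
]

# On DX12/Vulkan, each (shader variant, layout) pair needs its own prewarm,
# so first collect the distinct layouts actually used in the project.
layouts_to_prewarm = set(meshes)
print(len(layouts_to_prewarm))  # 2
```

A build-time pass that walks every mesh like this gives you the finite list of combinations to prewarm, which is exactly the system the comment says is still missing.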
They really need to finally fix this god-awful stuttering. It ruins the gaming experience big time
4:00 this says it's the same settings, but is it? I notice a lot more denoising issues around the lights hanging under the highway. It's much worse in the distance.
Games still look a blurry mess, seems like realistic effects are all based on vaseline vision for devs.
11:57 Also, the light from the lamps hanging below the bridge has extreme artifacting in 5.4, which is not the case in Unreal 5.0. But the stutter should be fixed above everything else; a game that stutters is not fun to play at all, and at the least it makes me quit before finishing even the first level
It's pretty shameful that a company like Epic, with all its resources still has problems with shader compilation and traversal stutter on PC
If I were Jensen at Nvidia, I would be calling Epic every day pressuring them to put in the work. So many studios use UE. So to the average person, who has never heard of shader stutter, booting up a UE game would just put a black mark on PC gaming. This would have a negative effect on PC hardware sales.
I bought a PS5 for its small number of exclusives, but mainly to play the UE5 games that interest me there, just to avoid them on my PC (7800X3D/4090)
And this hurts smaller studios that can't afford making their own engine. UE5 has so many great features, which attracts smaller studios who want to make something more than a side scroller.
It's a difficult problem because of how flexibly artists can author new shaders in the engine, with potentially tons of different permutations. There is effectively no way of knowing ahead of time which permutations of those shaders will be used at runtime, and the engine was structured for APIs like DirectX 11 where the driver automatically recompiled/restructured pipelines for different state and shader combinations with less overhead. There is even a new Vulkan extension that reintroduces part of this dynamic driver recompilation to subvert the issue, but DirectX 12 still requires you to compile your pipelines ahead of time for all possible shader/state combinations you could encounter during runtime if you want to avoid this problem.
@@TheSandvichTrials I assume you were talking about the dynamic rendering extension for Vulkan? If so, DX12 doesn't yet have something like that as far as I'm aware, but it does have an equivalent of Vulkan's new shader objects, where you can compile individual shaders ahead of time and mix and match them at runtime, opting out of certain optimisations. The new work graphs feature of DX12 makes use of its equivalent of shader objects.
@@jcm2606 Interesting, I didn't know about that yet. The work graph stuff in general looks pretty nutso...
@@TheSandvichTrials Yeah, it's real fucking neat and could open up a ton of possibilities for GPU computation. Apparently Epic was playing around with porting Nanite to work graphs and held a presentation at GDC with their findings, so I'm hoping the footage of that is released, if it isn't already.
@@jcm2606 I watched that presentation, they showed Unreal running in that megascans canyon/desert environment, though they refused to share any details. But it seemed to work!
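The permutation problem described in this thread can be sketched in a few lines. This is a conceptual toy, not engine or API code; the feature names and compile costs are made up for illustration. The point is that every independent on/off switch an artist can toggle in a material doubles the number of shader variants, so "just precompile everything" blows up fast, and engines end up compiling on first use instead:

```python
# Conceptual sketch (hypothetical feature names, not real UE5 material flags):
# why ahead-of-time pipeline compilation is hard when artists can author
# shaders freely. Each boolean switch doubles the variant count.

from itertools import product

# Hypothetical feature switches a single material graph might expose.
FEATURES = ["skinned", "instanced", "lit", "fog", "alpha_test",
            "parallax", "decal", "virtual_texturing"]

def permutation_count(features):
    """Every independent on/off switch doubles the shader variants."""
    return 2 ** len(features)

def enumerate_variants(features):
    """All concrete permutations a pipeline cache would have to cover."""
    return [dict(zip(features, bits))
            for bits in product([0, 1], repeat=len(features))]

print(permutation_count(FEATURES))        # 256 variants from just 8 switches
print(len(enumerate_variants(FEATURES)))  # 256, enumerated explicitly
```

And this only counts shader permutations; under DX12 each one must also be baked into pipeline state objects per render-state combination, multiplying the count further.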
Awesome music selection!!! That whole game's soundtrack is incredible!! The Castlevania bit in the video.
lmao its 2024 and we are still talking about Unreal Engine Stuttering? 😂😂😂😂 UNREAL!
This is why it saddens me to see so many developers dropping their in-house engines for UE5. I always had at least 1-2 consoles every generation but always played big titles on PC, and the current generation is making me rethink that, because it seems like the only way to dodge stutters is to switch to consoles...
Blessed Forgone Destruction music
It needs a 2D shader animation for distant 3D buildings, plus options to lock the frame rate at 40 and 45 FPS. It would also be nice to integrate frame generation, e.g. an internal 52 FPS lock upscaled to 60 FPS with frame generation.
I know engines get patches and updates throughout their life, but I'm surprised by how poorly UE5 handles multi-threading. It's nothing new; it's been around for years.
The frametime graph on the Frankenstein build looks like a side-scroller version of the demo map
Ok, maybe I'm a troglodyte here, but can someone explain to me WHY all PC games now seem to have shader compilation stutters? I don't get it. Is it a DX12 or Vulkan thing? I mean, none--let me emphasize that point, NONE--of the games I used to play in the 2000s and first half of the 2010s ever needed a shader precompilation step. They never stuttered. They just played. Why is this a thing now? How can we get rid of it? It's fucking ridiculous. I want games to JUST PLAY, like they used to.
Yes, it's a DX12/Vulkan thing. Put simply, prior to DX12 and Vulkan, the driver was responsible for compiling shaders and would do a bunch of optimisations in the background to ensure that shader compilation didn't affect performance much. This meant that how games handled shader compilation was more consistent, but it and the various other abstractions of older APIs led to performance being unpredictable at the developer level. As a result DX12 and Vulkan were created, which gave developers more direct control of the hardware at the cost of requiring them to do things that the driver would previously do. One of those things is that the developer is now responsible for deciding when, where and how to compile shaders, which they're currently failing at doing properly.
With the increased Lumen noise, might that be caused by changed default settings in the new version rather than an actual engine difference?
It's nice that performance is better, but I can see the graphics are worse as well.
Check out the lights under the overpass from 3:58. Lots of flicker and lighting weirdness.
Another snack of a video from Alex! Thanks man!