Lol that forest scene was not so good I think. The lighting was always shifting and popping in, like the blend of screen-space & accumulated info and the low-res path trace all come together in an unstable image. It's not plug & play just yet. Very temporally unstable, smeary, and noisy. Bleh.
Path tracing is nice if you can run it. Half your user base cannot, and would rather you fix stuttering. It's been here since Unreal 4, and it's still an issue in most games using Unreal 5.
This demo has nothing to do with Path Tracing. It may use some elements of it, but it is far from path tracing. I would call this technology a hybrid. They didn't show any refraction, nor any SSS material. Perhaps not by accident. Calculating a refraction material under path tracing will require quite a bit of computation. The same is true for Subsurface materials. In any case, this "MegaLights" technology is a good thing, but RTPT is a long way off. Don't get on the hype train!
I don't think this should be called path tracing in its current state either, and I'm glad that Epic is using the term MegaLights. It would be good if they showed a more fully featured version, even if it ran terribly, rather than something that is missing so many features that people expect from path tracing.
I love all of this, and I love Richard's presentations (been watching them for a long while). The only barrier to entry here is needing to have the separate branch. In the past I'd downloaded the branches and compiled, but it's a bit limiting compared to other aspects of Unreal that are easily injected into the vanilla code from Epic Games. I wish Nvidia and Epic could work together so everything could be a plugin, similar to how DLSS is now available as a directly-injected plugin. Obviously that would be vastly more complex than I myself could understand, but it would be the ideal situation. I HORRIBLY miss all of Nvidia's GameWorks library not being supported anymore (for a long, long time). :( Love the tech, Nvidia!!
The new tech is impressive and all, but can you PLEASE help devs eliminate the horrendous traversal stuttering which seems to be ubiquitous in UE5 games. Having all these new super expensive features that reduce performance even more is pretty pointless when the frametime graphs look like a mountain range. It's giving the engine a bad reputation.
I remember there was a 3dmark test displaying a rotating carousel with light sources. When it turned on 8 light sources the frame rate dropped to something like 5 fps.
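That old benchmark collapsing at 8 lights is the classic many-light problem: forward lighting cost scales with pixels × lights, while stochastic many-light sampling (the family that ReSTIR-style techniques belong to) pays a fixed sample count per pixel. A back-of-the-envelope sketch; the 640x480 resolution and the per-light "op" model are illustrative assumptions, not numbers from that benchmark:

```python
def naive_shading_ops(pixels, lights):
    # Classic forward lighting: every light is evaluated at every pixel,
    # so cost grows linearly with the light count.
    return pixels * lights

def sampled_shading_ops(pixels, samples_per_pixel=1):
    # Stochastic many-light sampling: a fixed number of light samples per
    # pixel, so cost stays flat no matter how many lights the scene has.
    return pixels * samples_per_pixel

pixels = 640 * 480  # a plausible 90s-era benchmark resolution

print(naive_shading_ops(pixels, 1))   # 307200 ops with one light
print(naive_shading_ops(pixels, 8))   # 2457600 ops - 8x the cost with 8 lights
print(sampled_shading_ops(pixels))    # 307200 ops regardless of light count
```

Under this toy model, doubling the light count doubles the naive cost but leaves the sampled cost untouched, which is why thousands of lights are suddenly feasible.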
Now... it would be interesting what exactly is considered high-end or low-end hardware. But it looks very promising! What is a bit concerning though is the fact that companies might rely too much on things like DLSS for performance, instead of optimizing their games properly. Actually, that's already happening. On the other hand, the more streamlined and optimized the engine is out of the box, and the less you can do to break performance (like with the light sources), the better.
The talk doesn't seem to be taking the new MegaLights feature into account, but it should mean the legacy fallback to Lumen could look even closer to the fully pathtraced version when it comes to emissives. The reduced disparity will be great for devs, but terrible for hardware companies trying to convince the average gamer of the difference 😅
I've been using RTXDI in my work for over a year, and when MegaLights was announced, I was fully convinced that it was finally RTXDI integrated directly into Unreal. Do you know what the difference is, in brief?
@СергейШавлюга-з2ч I'm not really sure about the underlying implementation of MegaLights other than what Epic mentioned in their presentation a few weeks back. Although I think they did say it might require hardware RT. Not sure if it's the same as ReSTIR or not.
It is probably what's under the hood; MegaLights is maybe just a fancy word for ReSTIR with ergonomic, user-friendly packaging. I would be very surprised if it's otherwise.
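For context on what ReSTIR-style sampling actually does at its core: its building block is weighted reservoir sampling, where you stream through candidate lights and keep exactly one, selected with probability proportional to its weight. A minimal sketch of just that primitive (not Epic's or NVIDIA's actual code; the light names and weights are made up):

```python
import random

class Reservoir:
    # Single-sample weighted reservoir: the streaming primitive at the heart
    # of ReSTIR-style direct light sampling. It sees candidates one at a time,
    # keeps exactly one, and tracks the running weight sum.
    def __init__(self):
        self.sample = None
        self.w_sum = 0.0

    def update(self, candidate, weight, rng):
        self.w_sum += weight
        # Replace the kept sample with probability weight / w_sum; this leaves
        # each candidate selected with probability proportional to its weight.
        if self.w_sum > 0 and rng.random() < weight / self.w_sum:
            self.sample = candidate

rng = random.Random(42)
# Hypothetical per-pixel candidate lights with unnormalized importance weights.
lights = [("lamp", 0.1), ("sun", 5.0), ("neon", 0.5)]

counts = {name: 0 for name, _ in lights}
for _ in range(10_000):
    r = Reservoir()
    for name, weight in lights:
        r.update(name, weight, rng)
    counts[r.sample] += 1

# Pick frequency tracks the weights: "sun" should win ~5.0/5.6 of the time.
print(counts)
```

The real techniques then reuse these reservoirs across neighboring pixels and previous frames, which is where the big efficiency win (and the temporal artifacts people argue about) comes from.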
I can't wait till the default engine officially adopts proper emissive material lighting support as shown in this. Another thing I'm anxious to receive in the default engine is proper world position offset support with ray-traced shadows. I'm tired of having black areas appear, shift, and disappear on my foliage.
It's really amazing, but to run this you need upscaling, frame generation, and a $2500 GPU that consumes 600+ watts of power. Rasterization is king for the next 10 years as games fly in native 4K. Hail to the king.
@@nareshkumar3526 Many, many years. Nanite, for example, was released years ago, and to this day the Blender Foundation has made no mention of this tech being implemented in some form in the new Blender roadmap, despite it being arguably the biggest tech breakthrough to happen to 3D art. The same can be said for Eevee Next, which, despite being brand new, is still inferior to Lumen in many ways.
Cool stuff! Tell me when it'll work without temporal denoising and upscaling, so the image stops looking like a hot pile of shmoo in motion. You know, games aren't still images. Games are about moving stuff!
I gave the talk, if you have any questions please just ask them. Or feel free to hit me with your thoughts. I read every post, even though I probably shouldn't. =)
great talk
Thanks for the presentation, very interesting stuff
Great presentation, and thank you very much for the information!!!
Awesome presentation, thanks. Possibly this will go down in history as the milestone presentation for real-time path tracing for the masses.
@@Vemrah This is the start of making it more mainstream, everyone can expect more advancements from here.
We appreciate this presentation by this man who is not too comfortable doing it, thanks for the effort ;)
Right? I'd much rather get the info directly from the source, not some spokesperson. Respect 🔥
This guy is a genius. Straight to the point, showing us the best of what's coming, without reciting his whole résumé.
❤ He's good, congrats man, he just didn't have the time or the mood to do it perfectly.
you can barely notice the discomfort at the start
@@oosmosmoo I think he is the source. Most coders/engineers are introverts; working with computers makes them less sociable, but they're otherwise good people.
Real-time path tracing has been a dream for years. I can't wait for this to be out in Unreal.
We will wait quite a while yet. He said it's 50% DLSS upscaling, so his "60" fps is actually like 15 fps, and I'm betting it's running on a 4090 or something.
@@googleslocik yes he said its on a 4090
I think the problems Unreal has with poor efficiency and stuttering should be taken into account as well, before deciding to throw on more ray tracing.
@@hulejul9748 This is just an option for developers, same as with ray/path tracing, and Nvidia's software and research is nearly flawless every year. Unreal Engine is not their problem.
@@googleslocik When the result is indistinguishable from native res, what's the difference? Image reconstruction isn't going anywhere, and it will only improve across all platforms, so waiting for this to be viable without it doesn't make sense.
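For what it's worth, the "60 fps is really 15 fps" arithmetic earlier in this thread assumes 50% resolution scale per axis (so a quarter of the pixels) and shading cost perfectly linear in pixel count; neither holds exactly in practice, but the math behind the claim is:

```python
def shaded_pixels(width, height, scale):
    # Pixels actually rendered before DLSS-style upscaling,
    # at a given per-axis resolution scale.
    return int(width * scale) * int(height * scale)

native = shaded_pixels(3840, 2160, 1.0)  # 4K: 8,294,400 pixels
half = shaded_pixels(3840, 2160, 0.5)    # 50% scale: 2,073,600 pixels (1/4)

# If shading cost were perfectly linear in pixels shaded, 60 fps at 50% scale
# would correspond to about 15 fps at native resolution - the comment's figure.
equivalent_native_fps = 60 * half / native
print(half / native, equivalent_native_fps)  # 0.25 15.0
```

In reality per-frame fixed costs and the upscaler itself eat into that ratio, so the native-resolution equivalent is usually somewhat higher than the naive 15 fps.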
Not just games but virtual production. Having path tracing for independent filmmakers. A 2-minute scene that would take 3 days to render on a reasonably powerful machine (say a 4070 Super with an i71400) could render in minutes. The time saved would then go into enhancing the creative process and allow more time for just... thinking and working as a human, not getting demotivated, etc. That time is precious.
Yeah, this is a whole subject I didn't get to touch on, and it deserves a separate talk by itself. There are cases where you can get very close to the offline render in terms of quality, 98%+ similar. And maybe you use this technology in a pseudo-realtime fashion to reduce render times. So instead of 5 minutes per frame, maybe it's 1-3 seconds per frame. Obviously a huge time savings. It's just important to note the things that aren't path traced yet, but even then this could still be useful for rapid prototyping.
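To put rough numbers on that speed-up, assuming a 24 fps sequence (the frame rate is an assumption; the per-frame times are the ones mentioned above):

```python
def render_hours(frames, seconds_per_frame):
    # Total wall-clock render time in hours for a fixed per-frame cost.
    return frames * seconds_per_frame / 3600.0

frames = 2 * 60 * 24  # a 2-minute sequence at 24 fps = 2880 frames

offline = render_hours(frames, 5 * 60)  # 5 min/frame, the offline figure
fast = render_hours(frames, 3)          # 3 s/frame, the slow end of "1-3 s"

print(offline)               # 240.0 hours (10 solid days of rendering)
print(fast)                  # 2.4 hours
print(round(offline / fast)) # 100
```

A two-orders-of-magnitude reduction is what turns "send it to the farm overnight" into "iterate on it this afternoon", which is the whole virtual-production argument.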
Always exciting to see better light-tracing at lower cost. Light is THE factor that determines the realism of a game more than anything. And it helps artists to properly capture a mood in the scene.
This isn't just better light at lower cost. This is the holy grail of computer graphics.
Fix the UE stutter, it should be highest priority.
Realtime Stutter Elimination should be a priority, and I think I speak for many PC enthusiasts.
No, it's called Unreal Engine 5.4
lol
Already exists. Plenty of UE5 games that don't stutter.
It's a developer issue at this point.
@@whatistruth_1 Name some please.
@@DanteBellin The one that comes directly to my mind is The Finals
Lighting an entire scene only with emissive materials is mind-blowing. It changes almost everything in the workflow. If it can work with animated materials...
You can already do this with Lumen afaik
@@drinkwwwaterrr Yeah. But it looks garbage and noisy.
@@vid828 25:00 6 rays per pixel, 720p resolution (upscaled), and 60fps on an RTX 4090. Not sure if it can actually be useful at this cost in performance.
@@Morimea Well, like he said, this part isn't quite ready. The goal is 1 ray per pixel, but 2 is probably as low as it'll work for now. This is still improving tech.
It shows an example of an animated stained-glass material in the vid.
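Taking the "6 rays per pixel, 720p, 60 fps" figures from this thread at face value, the implied ray budget works out to roughly a third of a billion rays per second:

```python
width, height = 1280, 720  # internal render resolution before upscaling
rays_per_pixel = 6
fps = 60

rays_per_frame = width * height * rays_per_pixel
rays_per_second = rays_per_frame * fps

print(rays_per_frame)   # 5529600 rays each frame
print(rays_per_second)  # 331776000 - about 0.33 billion rays per second
```

That budget has to cover every bounce, shadow test, and light sample in the scene, which is why the sample counts stay so low and the denoiser has to do so much heavy lifting.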
This is great. More efficient use of resources is what we like to see.
To demonstrate how things have moved on... in 1992, during my CGI MA, it would take c. 15 mins to render one 320x256px image on an Apollo workstation. And they were considered cutting edge. No raytracing, no GI. Reflections and shadows were faked. I'm now using Unreal and every day it blows me away. I can't wait for the 5090 cards mixed with this tech...
I used to work in Imagine on the Amiga back in the late 80s and early 90s. What we did was amazing given the technology at the time. Used everything from Amiga 500 to 4000 with 68060 processors. It was a world of hurt compared to Blender on a couple of 4090s today.
In 92 at Autodesk we were happy to get our hands on Weitek coprocessors for 3D Studio. They helped and were better than xxx87s, but still glacial. Great memories of those early days.
Most of folks here probably have no idea how huge this is lol !
Yeah, I got my start at MicroProse, so I had access to SGI O2 hardware & software... This is MAGIC compared to those $20K "state of the art" dedicated graphics powerhouses of past years.
@@Oakbeast Yeah, I remember SGI. We had access to a couple of Octane IIs and Softimage that were used in a production studio. I fondly remember sitting there on the weekends, learning and working with Softimage in the hope of making it as a 3D modeler and render artist.
In retrospect it was a better career option to become a programmer, but it would have been so nice to work with PIXAR in those days.
This seems like an absolute game changer! Congrats to the devs for this, amazing!
Already super impressed by the demo being presented
Devs: *This is an area where we can make improvements on*
Path tracing _is_ a "real-life" element I cannot wait to see more of in games.
I saw your comment in my déjà vu 😅, yep, it's a dream that's come true.
Same for me! Love Path Tracing❤
Performance wise, which of these 5 run better? Real-time Pathtracing, offline Pathtracing, software Lumen, hardware Lumen, or just dynamic lighting only?
@alyasVictorio "Runs better" doesn't mean "it's better". To me it's better to have path tracing near 60fps and realistic graphics instead of low quality with bad illumination at very high FPS.
That is amazing! 60 FPS is an awesome target at the moment, but even if I get 10 FPS, it's a game changer for my product visualization animations. Cheers to an amazing team.
I love this, it makes the game feel so much more real, grounded and tactile, even if it's highly stylized. It's truly the future.
Finally, the light in the Unreal Engine viewport doesn't lag like when Lumen was first introduced!
My man showing the future of real-time rendering and the crowd is like "what he says?"
Great, but how about fixing shader compilation/traversal stutter that plagues pretty much every UE5 game, especially on PC?
Neither of these are inherent UE problems. Shader compilation stuttering happens in any game that doesn't pre-cache shaders; traversal stutter happens in any game that handles its level streaming poorly. Both are developer issues; it's not the engine's fault that the devs are incompetent.
@@LewdSCP1471A While stutter can indeed happen in other engines, it has been near universal in Unreal Engine games since UE3, it's definitely inherent to the engine.
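A toy model of why pre-caching matters so much: if a shader/PSO compiles the first time it's used, the whole compile cost lands on a single frame as a hitch; compiled up front during a load screen, frametimes stay flat. The millisecond costs below are made-up illustrative numbers, not engine measurements:

```python
COMPILE_MS = 80.0  # hypothetical one-off shader/PSO compile cost
DRAW_MS = 0.2      # hypothetical cost of a draw with a ready shader

def frame_ms(draws, cache):
    # One frame's cost: any shader missing from the cache compiles mid-frame,
    # dumping the whole compile cost onto this single frame (the hitch).
    ms = 0.0
    for shader in draws:
        if shader not in cache:
            ms += COMPILE_MS
            cache.add(shader)
        ms += DRAW_MS
    return round(ms, 1)

frames = (["a", "b"], ["a", "b", "c"], ["c"])

# On-demand: shaders compile the first time the player sees them.
cache = set()
spiky = [frame_ms(draws, cache) for draws in frames]

# Pre-cached: everything compiled up front, e.g. during a load screen.
cache = {"a", "b", "c"}
flat = [frame_ms(draws, cache) for draws in frames]

print(spiky)  # [160.4, 80.6, 0.2] - two visible hitches
print(flat)   # [0.4, 0.6, 0.2] - smooth
```

The average cost is nearly identical either way; the difference is entirely in where the compile time lands, which is exactly why the same engine can ship both smooth and stuttery games.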
Beautiful. Every month there's exciting news for 3D.
I wish there was more for stereoscopic 3D 😢 RIP nvidia 3D vision
Ikr
@@kelownatechkid it still works, I use it everyday on a 110 inch 3d projector. Just need to do a few things to make it work.
WHOOP WHOOP MY ZPD Cop made it in the presentation ❤
Very cool model. Would play the game it's in.
😂 i subbed to you just for that dude
Does Megalights use the same approach?
I would love to see the comparison of this technology with Megalights.
Thanks for your work!
No, megalights is nothing compared to this.
@@gn2727 What's the difference between them?
@@gn2727 in a good way or not
Unbelievable, can't wait to test it
Forget this for games, i mean.
Imagine how frigging fast you can render out an animation now for a client.
Exactly, maybe not for a full-length industry-standard movie, but for shorts and tech demos.
@@nemureru_tanuki don't forget shows like the Mandalorian that use "the volume" for actual production. That already runs real-time on UE. Those shows will directly benefit from this technology. So I see this as completely viable for full feature movies. Will probably be a while before the technology is adopted but still.
It could be possible in real time using nDisplay, especially for virtual production.
34:48 Thank you, nvidia, for the path-tracing. And thank you, Epic, for the textures failing to load on time. Soon this bug will celebrate its 20th birthday.
🤣😂🤣
It looks pretty good but I'm really just not a fan of screenspace temporal solutions for anything because you will invariably get all kinds of temporal artifacting - the light shifting as the camera moves, light/dark trails as surfaces are unoccluded, etc...
Yeah, just because it's realtime doesn't mean it's accurate. There is no way to make millions of calculations per pixel in 0 seconds with our current technology. That's what quantum computers are for.
All of which are fully on display here too. I appreciate the work on this tech, but with the smearing and disocclusion I feel it actually looks worse in scenes with high motion. Too many comments talking about how beautiful the flashy demo is and too few talking about the disocclusion artifacts on the wires on the swing ride and the ghosting on the string lights.
I may be in the minority here, but I feel the trade-off for path-traced direct and indirect lighting is very worth it. "Temporal" simply means looking at previous frames to inform the current one. It's the most logical way of achieving such technologies, and it doesn't necessarily have to mean artifacts just because it's temporal. These artifacts may very well be coming from the denoiser that's being used and not necessarily from ReSTIR.
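The "temporal" idea in one toy formula: blend the current frame into an exponential moving average of past frames. The same knob that smooths out noise also produces lag (ghosting) when the scene suddenly changes, which is exactly the trade-off being debated here:

```python
def accumulate(frames, alpha):
    # Exponential moving average over frame history - the basic mechanism
    # behind "temporal" accumulation. Smaller alpha reuses more history:
    # smoother results, but more lag (ghosting) when the signal changes.
    history = frames[0]
    out = [history]
    for f in frames[1:]:
        history = alpha * f + (1.0 - alpha) * history
        out.append(history)
    return out

# A pixel that jumps from dark (0.0) to bright (1.0) mid-sequence,
# like a surface being suddenly disoccluded.
signal = [0.0] * 5 + [1.0] * 5

heavy = accumulate(signal, alpha=0.1)  # heavy history reuse
light = accumulate(signal, alpha=0.9)  # light history reuse

# One frame after the jump, the heavy-history pixel is still mostly dark:
print(round(heavy[5], 2), round(light[5], 2))  # 0.1 0.9
```

Real denoisers use validation heuristics (motion vectors, disocclusion detection) to reset or shorten the history instead of blending blindly, which is why artifact severity depends so much on the denoiser rather than on the sampler feeding it.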
Cool.
But what about fixing the well-documented issues with stuttering, which have been a thing since UE4?
They won't get marketing buzz from that, so they don't mention it ;) they're also incapable of fixing it.
@@willianjohnam I'm afraid it's a lost cause at this point.
The fact you think UE4 invented needing to compile shaders shows how little you know. I've used UE3 game builds that have shader stutter because they don't precompile them. It's a developer issue, not an engine issue.
Since it's not path tracing everything, Nvidia should call this hybrid path tracing. It defaults to ray tracing and raster in quite a lot of cases. In general, when someone mentions path tracing, we expect a modern offline path tracer that can handle refraction, translucency, high-quality SSS, layered materials, hair, VDBs, and everything else at multiple bounces.
The point is that MOST things are path traced: direct and indirect lighting, which makes up most use cases. Saying it defaults to ray tracing in a lot of cases is a disservice to the 80% of the scene that is being path traced. The main things that don't get path traced are low roughness, translucency, fog, and DOF, which generally only make up 30% of a scene. At the end of the day, almost every scene you have is going to look SIGNIFICANTLY better with this technology. Furthermore, I really don't think anyone should click on this video expecting everything to be path traced and fully featured, coming from nowhere. The fact that they made this compatible with unsupported use cases is a big deal on its own. Again, please let's not do a disservice to such an incredible leap in graphical fidelity.
Who is 'we" nerd?
don't worry they can come up with new words later
Bro, how are they doing this 🔥🤯🤯, these dudes are wizards
The fact they have a way to make path tracing, an inherently more intensive process than ray tracing, perform better is insane. This could really push games to insane new levels. Now hopefully UE5 devs can find a nice balance, as a lot of games are very GPU demanding and don't have the best performance.
At last. Can't wait to make my 4090 scream with real-time path tracing. Amazing. 😍
Very excited to see this shown off even if it's still not technically done or fully production ready! I imagine in another 3-4 years how much more polished this tech will finally be for production with the next-gen consoles releasing!! Very exciting times!!! :DDD
I'm glad that we are in the final stretch of light simulation development for games, because we really need to move on to the thing that most kills the immersion: interaction between characters and with the world. That is way more important and has been waiting decades to be nailed.
can't wait for more stutter
Uhmm
these demos look absolutely incredible!
Senua's Saga: Hellblade, to me, demonstrated the best use of lighting and Unreal graphics that felt the closest to realism of anything I've ever played. I stopped the game with photo mode so many times thinking they'd actually mixed in real footage, just to see it was still the same scene. This is how I always dreamed games would be, where you don't go from pre-rendered to in-game graphics after cutscenes.
this is awesome because Lumen can certainly struggle in some cases , and offline Path Tracing just takes too long (RTX 3060).
Super Exciting! I am so grateful :) this will help me so much
looks way better than lumen, no ghosting at all.
Probably has more to do with ray reconstruction than anything else.
Very exciting and impressive! 👏👏👏
Looks amazing. I wonder if NVIDIA will ever pursue bringing GameWorks back in that branch.
I might be the only one, but I can't say I'm impressed, especially with the Meerkat demo. If we compare to the original here: th-cam.com/video/SB4nnhJv3IU/w-d-xo.html (compare 34:55 here with 0:08 in the link), there is far, far less GI on the rocks in general, making the render more video-game-like and unpleasant. The other thing I don't like is that I see a lot of the problems Lumen currently has in the RT path-traced render, like GI popping, specular glitching, and defocus being a little bit unstable.
I am of course very interested in the tech and hope it will get better with time, but I don't feel any hype from what I see. I see no gain in being real-time path traced at 60fps if you have to sacrifice everything that makes an image good.
Honestly (working in animation right now), I see more value in fixing Lumen with MRQ & Temporal than in developing real-time path tracing.
You're not the only one who noticed the issues. I hope someone asked a question about that at the talk, because compared to the original meerkat demo the shadows in the nvidia demo are going to almost black. In all fairness, gpus are probably not fast enough to handle path tracing in real time, because here they're only path tracing certain things. They're also using a lower resolution, incredibly low samples, strong denoising etc.
@computron5824 the simple fact that you can see LODs popping at the beginning says that Nanite doesn't work with this yet
It's a shame that most demos are too high-contrast, which makes the final result not very appealing! An artist should supervise the videos before they go online!
It was done in an auditorium with a huge screen and lots of diffuse lighting, so it's standard to bump the contrast for presentation since everything gets washed out anyway. They should've kept the original for the TH-cam upload, but I guess it's always undecided whether it'll be put on the web or not.
@@me-ry9ee Well, I can't even decide which pancake I should order, so...
Most important one so far
This is it !!!! and great presentation as well!
You guys are freakin' geniuses! I'm wondering, are ReSTIR and the MegaLights feature related? Or are they different technologies?
can't wait for real time path tracing to be at the level of current day animation
This guy is pretty casual for someone showing the newest gaming tech in the world, tech that actually breaks the limits.
who would have thought the day would come. this might be the reason i get a high-end gpu: not ray tracing or dlss 3.5 or whatever, but actual path tracing in real time. what a time to be alive. edit: now that i remember, Two Minute Papers showcased this a month or so ago, and it was running on a gtx 680 or something quite old, so i hope nvidia doesn't come up with the stupid idea of locking it exclusively to the 5000 series
Download the NVRTX branch and try it. Real time path tracing is not the same as offline path tracing.
@@computron5824 Dude, this is REAL TIME PATH TRACING ! And guys like you will still find reasons to whine that it's not as good as offline path tracing.. Like really ?
@@gn2727 Dude, calm down, it's marketing. I've been testing the NVRTX branches for years. Marketing this as path tracing right now is a stretch.
How about some Real-Time-No-Stutter? Or maybe Real-Time-Stable-Framerate? Now that would be game changing!
Good showcase, but please also talk about the caveats. We can see how this looks in motion. Show some bad cases, show the ghosting, the smeariness. It comes off as a bit shill-y and dishonest not to mention the negatives of these techniques. It's not only a performance hit: image stability and motion clarity take a huge nosedive in many cases. This is like 2015-era TAA all over again 😂
6090... xd
Imagine what a 5090 is going to accomplish with this
unity is just a name now
Unless your hardware can't handle Unreal Engine, of course.
Wow! Amazing!
That forest scene is awesome!
Lol, that forest scene was not so good, I think. The lighting was always shifting and popping in, like the blend of screen-space and accumulated info plus the low-res path trace all come together into an unstable image. It's not plug & play just yet. Very temporally unstable, smeary, and noisy. Bleh.
31 years... it only took 31 years since Doom in 1993 for the dream to be real!
Path tracing is nice if you can run it. Half your user base cannot, and would rather you fix stuttering. That's been here since Unreal 4, and it's still an issue in most games using Unreal 5.
More like 95% of their user base can't run it, and won't for years.
😍 I can't wait.
Great talk, cheers!
This is UNREAL!
This demo has nothing to do with Path Tracing. It may use some elements of it, but it is far from path tracing. I would call this technology a hybrid.
They didn't show any refraction, nor any SSS material. Perhaps not by accident. Calculating a refraction material under path tracing will require quite a bit of computation. The same is true for Subsurface materials.
In any case, this "MegaLights" technology is a good thing, but RTPT is a long way off. Don't get on the hype train!
I don't think this should be called path tracing in its current state either, and I'm glad that Epic is using the term MegaLights. It would be good if they showed a more fully featured version, even if it ran terribly, rather than something that is missing so many features that people expect from path tracing.
I love all of this, and I love Richard's presentations (been watching them for a long while). The only barrier to entry here is needing the separate branch. In the past I've downloaded the branches and compiled them, but it's limiting compared to the other aspects of Unreal that are easily injected into the vanilla code from Epic Games. I wish Nvidia and Epic could work together so everything could be a plugin, similar to how DLSS is now available as a directly injected plugin. Obviously that would be vastly more complex than I myself could understand, but it would be an ideal situation. I horribly miss Nvidia's GameWorks library, which hasn't been supported for a long, long time. :( Love the tech, Nvidia!!
We need adaptive samplers for shadows, because there's a lot of noise around movable objects.
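For what that request means in practice, here's a toy Python sketch (my own illustration, not how any engine actually does it): an adaptive sampler redistributes a fixed ray budget toward high-variance pixels, like the penumbrae around moving objects, instead of sampling uniformly.

```python
def allocate_samples(pixel_variances, budget):
    """Toy adaptive sampler: split a fixed ray budget across pixels
    proportionally to their estimated variance, so noisy regions
    (e.g. penumbrae around moving objects) get more samples."""
    total = sum(pixel_variances)
    if total == 0:
        # Uniform image: fall back to an even split.
        return [budget // len(pixel_variances)] * len(pixel_variances)
    return [max(1, round(budget * v / total)) for v in pixel_variances]

# Third pixel sits in a noisy penumbra; it should soak up most of the budget.
variances = [0.01, 0.01, 0.5, 0.01]
samples = allocate_samples(variances, 100)
```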
The new tech is impressive and all, but can you PLEASE help devs eliminate the horrendous traversal stuttering which seems to be ubiquitous in UE5 games. Having all these new super expensive features that reduce performance even more is pretty pointless when the frametime graphs look like a mountain range. It's giving the engine a bad reputation.
excellent, I am waiting for this feature.
9:36 "Cop lights, flashlights, spotlights
Strobe lights, street lights (All of the lights, all of the lights)" 🎵
😂 I had the same thought... UE5.5 is going extra hard and wants us to see everything... All of the lights... Shout out to Kid Cudi for that song btw
Very exciting technology. I just rendered my 12-second scene in 18 hours with path tracing.
Pretty cool to see RTGI running at >1 "uhps"
Thank you for providing the link to the branch in the video description. Oh, wait....... 😒
Great stuff 🎉
I remember there was a 3dmark test displaying a rotating carousel with light sources. When it turned on 8 light sources the frame rate dropped to something like 5 fps.
daam can't wait to make some scenes with that
cyberpunk orion gonna be crazy!
Illuminating :D
omg---- I REALLY need to learn Unreal...
Holy! Loving it.
Any example scenes with this? Looks amazing.
Now... it would be interesting, what exactly is considered high-end or low-end hardware. But looks very promising!
What is a bit concerning, though, is that companies might rely too much on things like DLSS for performance instead of optimizing their games properly. Actually, that's already happening. On the other hand, the more streamlined and optimized the engine is out of the box, and the less you can do to break performance (like with the light sources), the better.
The talk doesn't seem to be taking the new MegaLights feature into account, but it should mean the legacy fallback to Lumen could look even closer to the fully pathtraced version when it comes to emissives.
The reduced disparity will be great for devs, but terrible for hardware companies trying to convince the average gamer of the difference 😅
I've been using RTXDI in my work for over a year, and when MegaLights was announced, I was fully convinced that it was finally RTXDI integrated directly into Unreal. Do you know what the difference is, in brief?
@СергейШавлюга-з2ч I'm not really sure about the underlying implementation of MegaLights beyond what Epic mentioned in their presentation a few weeks back, although I think they did say it might require hardware RT. Not sure if it's the same as ReSTIR or not.
@@xephyrxero Thank you
It's probably what's under the hood; MegaLights is maybe just a fancy name for ReSTIR with ergonomic, user-friendly packaging. I'd be very surprised if it were otherwise.
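To make the ReSTIR guess above concrete (Epic hasn't confirmed MegaLights' internals, so treat this as an illustration of the ReSTIR family's core trick only): weighted reservoir sampling lets you pick one light from thousands, proportionally to its estimated contribution, in O(1) memory per pixel.

```python
import random

def reservoir_sample_light(light_weights, rng):
    """Streaming weighted reservoir sampling: scan all lights once,
    keeping a single candidate, so that each light ends up chosen with
    probability proportional to its weight. This resampling step is the
    core idea behind ReSTIR-style many-light sampling."""
    chosen, total = None, 0.0
    for idx, w in enumerate(light_weights):
        if w <= 0:
            continue  # dark or disabled lights can never be picked
        total += w
        if rng.random() < w / total:
            chosen = idx
    return chosen

rng = random.Random(1)
weights = [0.0, 5.0, 0.1, 0.1]  # light 1 dominates the contribution
picks = [reservoir_sample_light(weights, rng) for _ in range(1000)]
# Light 1 should be chosen the vast majority of the time (~96%),
# and the zero-weight light 0 never.
```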
is the tech demo scene at the end available as a download somewhere?
so impressive
i can't wait till the default engine officially adopts proper emissive material lighting support, as shown in this.
another thing i'm anxious for in the default engine is proper World Position Offset support with ray-traced shadows. i'm tired of black areas appearing, shifting, and disappearing on my foliage.
Amazing
Really cool stuff, can't wait to try it out
it's really amazing, but to run this you need upscaling, frame generation, and a $2500 GPU that consumes 600+ watts of power. rasterization is king for the next 10 years, as those games fly in native 4K. hail to the king
This thing should come to Blender too
It will definitely happen, but will take many years.
@@nareshkumar3526 Many, many years. Nanite, for example, was released years ago, and to this day the Blender Foundation has made no mention of this tech being implemented in some form in the new Blender roadmap, despite it being arguably the biggest tech breakthrough to happen to 3D art. The same can be said for Eevee Next, which, despite being brand new, is still inferior to Lumen in many ways.
You can say it again!
@@mrlightwriter This thing should come to Blender too
@@GenesisSoon :D
my PC started catching fire from watching this
I could see real time path tracing becoming viable at full resolution in about 2 to 3 more generations. RTX 70 series most likely.
Otherwise, performance and quality are awesome ❤.
Finally full explanation of the difference between real time and offline PT
I really hope we can eventually get rid of the ghosting
I thought shader execution order was hardcoded into the hardware?
Insane
Cool stuff! Tell me when it works without temporal denoising and upscaling, so the image stops looking like a hot pile of shmoo in motion. You know, games aren't still images. Games are about moving stuff!
Were the Q&As at the end recorded?
yeah whatever graphics... where is the Ray Tracing of Sound?
I'm gonna need a two minute papers on this.
Welcome to the comment section, my friends.
awesome, motivates me to get off my butt and find more resources to get a nice pc to run this lol
TRANSLUCENT COLOURED SHADOWS!!!
Thank you CDPR 🎉
All I can think is... HOW!?!??!? 😮🤯
when will this be available in UE5 final releases on the Epic Games launcher?