@@bravoMike97people just forget bad works of art from the past, while seeing a lot of uninspired works in the present. Good games of today will similarly be remembered 20 years later, but do you think all that shovelware we see release on Steam every day will stay in anyone's memory?
@@Zecuto Just not true. Are you seriously saying that, let's say, movies are just as good today as they were in the 90s and early 2000s? Sure, some good indie stuff is made, but bigger movies are trash. Games are not nearly as bad, but in the past they were better gameplay- and story-wise.
So we don't only have unoptimized games, we also have hardware/technology that isn't able to keep up with the demands these games have. Especially with more and more titles requiring mandatory ray tracing...
It costs so much to develop a die that keeps getting better at just rasterization that it isn't viable long term. AMD is a clear example. MCM or chiplet architectures cost less than monolithic ones, but even the 7900 XT / 7900 XTX didn't bring AMD any profit because they focused on rasterization over technology. Build the 7900 XTX on a monolithic architecture like RDNA4 and AMD would lose money selling it at $999. And behind them, TSMC, Hynix and the other companies that develop alongside AMD/Nvidia are asking to get paid more and more.
@@Kony_BE Yet we have UE5 games gradually forcing RT onto their games even though the market isnt ready for it. It doesnt make sense for me and it looks like this could lead to a bigger percentage of NVIDIA leading the GPU market especially with many game developers switching to UE5. Thats unless Intel and AMD are able to catch up and compete with the market.
@@sgxsaint3130 lol yeah, the Q3 2024 GPU market share figures were released a few days ago, and they show Nvidia gaining 8% market share year over year while Radeon dropped massively to just 10%, with the worst being Intel at 0%.
There is literally one title that demands hardware RT, and that is The Great Circle... If you are talking about software Lumen requirements, there are still no more than a few that don't have raster as an option. I do believe your statement will be true in about a year though, since it is absolutely the only way to go if we want progression. We are gonna suffer some early problems and struggle with performance for a while longer, but when the kinks have been fixed and the GPUs catch up to what is/would be possible in UE5, for example, games are gonna look 🔥🔥🔥
@@ItsCith does anyone liek really care about time cycles and weather. like people arel ike wow that sunset amazing. the afternoon lighthing is cool. the night lighthing is amazing. but does anyone care abt the lighthing inbetween. Like it isn't even noticeable most of the time. Weather is cool too but ive never seen it done well at all. if they are not doing it well why even put it there just to make it harder to play?
@@OtherwiseUknownMonkey time cycles and weather cycles "can" matter, but it's really dependent on the game and your approach. like for racing games it can be amazing (if done correctly); for survival games it often doesn't get done right because they go for a more "balanced" approach to weather/day/night cycles so that it doesn't hamper the player's experience. I personally don't care for it, but it can help elevate some games IF done correctly, with most not doing it correctly.
@@OzzyBoganTech Because people forget that ray tracing in pre-rendered scenes asks for hours and hours of rendering for just one frame. So imagine what it takes to handle ray tracing in real time perfectly. Even Nvidia doesn't have the solution.
TAA is not an option in most games anymore, and there's also a good reason for that. People who think TAA is bad have no room to complain about 'noisy' images, cuz without TAA/reconstruction, modern games with high detail, complex shaders would be an utter MESS of image quality, especially in motion.
@@marksvirsky9103 SMAA and FXAA are terrible. Anyone who thinks otherwise hasn't made a game or game engine. Including that so-called expert who tells game devs they are wrong but has not made an engine to demonstrate his case.
Some implementations are also more unrealistic. The scenes from Hogwarts Legacy at 2:25 are perfect examples. Stone floors that have been walked over for hundreds of years and chalkboards are not that shiny!
I've thought the same thing for years now; I've always called it the Blender effect, because moving the viewport in 3D rendering programs results in hugely garbled images that slowly come to cohesion after the viewport is stationary for a bit. That and the massive performance hit is why I still haven't enabled any RT in games yet, or felt the need to upgrade my hardware for RT.
@@fuzzjunky the only "positive" I've found is RTX Remix, so we start to see older abandoned games get a refresh. But even then the tool still needs a lot of work, demands at least a 3060 Ti (tried with a 3070 and the experience was... bad...), and gatekeeps AMD users from using it because of CUDA.
@jagildown RT had a lot to do with me upgrading to a 3070 (FTW) from a 1070 (FTW). I usually turn it on to see how it looks, then turn it back off because I'd like to play at more than 30-40 fps.
All I could think when you said the path tracing/ray tracing was “making a more realistic image” was… “really?” Honestly it kind of looks exaggerated, and unrealistic. The sad part is, devs could make the other lighting solutions look “more realistic” if they just tweaked them a bit.
@@maynardburger "That's massive hyperbole at higher resolutions" Oh really? Lol Tell me more. Go play Forza Motorsport with TAA on. It looks like garbage in that game regardless of resolution, and it's a common complaint on their forums, so much so that they had to implement a toggle to give you the option to turn it off. I have several 4k panels and TAA in many games looks like garbage on every single one of them, or are you really going to sit here and tell me not to trust my lying eyes?
The RT dream has been chased for 4 generations, yet it is rarely done well and still axes the framerate. I can only imagine where we would be if all that money and R&D had been invested into other improvements.
Game physics haven't gone anywhere since the 7th console generation, with the exception of Rockstar's titles. So we really have been playing glorified tech demos for $70.
I'm starting to have issues with modern gaming. The fact that games are often using sharpening effects, TAA, etc. results in blurry and noisy games. Some of these games also don't allow you to turn these filters off, such as Black Myth: Wukong. Many developers are also relying on upscaling to achieve acceptable performance, furthering the problem, and now we have noisy ray tracing on top. I have a 7900 XTX so I never run ray tracing anyway; I prefer high framerates, and I play at 1440p 360 Hz.
Reconstruction is not making the problem worse, ffs. You guys simply don't have any clue what's going on. Reconstruction actually helps games be sharper, not blurrier, at a given performance level. The alternative is to lower the native resolution much further, which will be much blurrier.
@@hamzakhalil-gs7oc The problem with pushing so much "Ray tracing BRO!!!!"... is that all these devs think applying reflective materials to literally every surface somehow makes your game look "more cool".
As annoying as noise can be sometimes, I'd rather have that over the disgusting mess that is SSR we've had for the past 7ish years. Having details just disappear in SSR based on the camera takes me out and I find disabling SSR much more immersive even if the reflections aren't accurate. Consistency and art direction are key.
Developers might want to start toning down reflections in general. I find modern games make far too much use of them, and they often aren't particularly realistic.
@@nightmarepotato5000 If people stop buying stupid photorealistic games that are bland and trash, there will be better games again, because game studios will invest in art direction teams instead of reducing them to the bare minimum like they do now.
I hate how Digital Foundry would cry about how "transformative" ray tracing is and see all the minuscule differences it brings, while downplaying the negatives (noise, big performance hit). They have been an nVidia PR machine for the feature since it started. Good job to Hardware Unboxed for this excellent piece of tech journalism!
"you mostly see no difference to good rasterization" Applies only to titles that have trash implementation like the AMD sponsored RE4 remake. "a noise problem" Ray reconstruction exists, and by now AMD, Intel should've made and alternative for it. " cost problem" Battlemage and RDNA will create enough competition solve this.
Honestly, I hate to admit it, but the answer here is AI. Temporal noise is the most difficult problem to conquer with raytracing. I make offline, path-traced animations, and the number of samples you need to eliminate the noise is often staggering: 256 samples might produce a _pretty good_ image, yet to actually _eliminate_ the noise, you'd need to spend an extra twenty seconds doing something like 4096 samples. That's completely infeasible in real time raytracing where you get a handful of crappy samples to work with. Stuff like Nvidia's AI ray reconstruction is honestly probably the solution here. It isn't there yet, but it's the most promising path forward right now. Everyone rendering path-traced animations has been using adaptive sampling + AI denoisers for years, and even then, a 4090 is still far from being able to path trace in real time. Real time ray traced games theoretically have to achieve accurate reflections, lighting, and shadows based on substantially incomplete information, which leads to flickering from frame to frame. Hardware is _not_ going to be able to do 200x more samples per frame any time soon, so I honestly do think AI techniques are the solution to this. It's annoying though - as we've moved from rasterized to raytraced games, they're definitely becoming a blurry, flickery mess. Part of the problem is that games get sold based on highly-compressed YouTube videos that people watch on tiny phone screens where temporal noise isn't very noticeable. It's only when you're staring at the big high res screen and _actually playing the game_ that you start to notice.
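To put rough numbers on the sample-count wall described above, here is a minimal Monte Carlo sketch (a toy, not a renderer - the 1%-chance bright path and the brightness values are invented for illustration). It shows the frame-to-frame noise of a pixel estimate falling as 1/sqrt(N), which is why 256 samples looks "pretty good" while truly clean output needs thousands:

```python
import random

def pixel_estimate(num_samples: int) -> float:
    """Estimate one pixel's light the way a path tracer does: average N random samples.
    Toy integrand: rare bright paths (fireflies) on top of a dim base."""
    total = 0.0
    for _ in range(num_samples):
        total += 10.0 if random.random() < 0.01 else 0.1
    return total / num_samples

# Measure frame-to-frame noise (std dev across repeated estimates) at several sample counts.
for spp in (8, 256, 4096):
    estimates = [pixel_estimate(spp) for _ in range(1000)]
    mean = sum(estimates) / len(estimates)
    std = (sum((e - mean) ** 2 for e in estimates) / len(estimates)) ** 0.5
    print(f"{spp:5d} spp: mean = {mean:.3f}, frame-to-frame noise = {std:.4f}")
```

Each 16x jump in samples only cuts the noise by roughly 4x - the square-root scaling that makes brute-forcing a clean image so expensive.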
3:51 - the asphalt looks nothing like the real-world one. It's nowhere near this reflective (both RT on/off). Notice the bus (and car) doesn't even have headlights on for whatever reason (fewer light sources). The issue with ray-traced reflections is still the same as it was initially: it's overdone. Both Star Wars Outlaws at 9:50 and Cyberpunk at 10:39 ignore scattered light - the walls don't get illuminated by the extra-polished floors. The technology (ray reconstruction, etc.) can't support that many bounces, so to me they appear just less realistic - still overdone when it comes to reflective materials.
Wonderful that at least someone in the industry dares to speak about this! The industry is lobbying HARD to only allow overly positive things to be said about ray tracing, calling sceptical people haters, when in reality no one in their right mind has any critique of ray tracing itself, only of the manufacturers not producing hardware currently capable of delivering quality ray tracing! And we are soon 5 years into having RT hardware too! So it is definitely not the gamers' fault!
Ray tracing was introduced to tank performance, to force planned obsolescence, and to make development easier and cheaper, flat out. Kind of like how the only reason 2.5G and 5G Ethernet were introduced was that they were cheap to produce using existing materials.
Games have had an image clarity problem for a long time now. The over-reliance on TAA-based anti-aliasing solutions has contributed heavily to that. And that includes TAA-based upscalers too.
You are blaming the fix for the problems of the culprit. TAA is NOT the culprit. TAA is the aid. The culprit is the switch from forward shading to deferred rendering around 15 years ago, which renders the game into G-buffers and combines them in a quite fake way, like layers in Photoshop, into the final image, so classic MSAA is not feasible anymore. TAA is a necessity to fix the new problem the industry introduced. Deferred became popular because it can MUCH more efficiently use complex lighting and shade very dense geometry, so screenshots suddenly became much prettier, and many studios still using forward rendering were mocked by gamers for using archaic, outdated engines - except maybe Valve.
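For readers unfamiliar with the term, here is a toy sketch of the deferred structure this comment describes (numpy arrays stand in for G-buffers; the buffer layout, light count and shading model are arbitrary simplifications). The point is that shading happens per pixel from screen-space buffers, decoupled from geometry:

```python
import numpy as np

H, W, NUM_LIGHTS = 4, 4, 3

# Geometry pass: rasterize once, writing per-pixel attributes into G-buffers.
albedo = np.random.rand(H, W, 3)                          # G-buffer: material color
normal = np.random.randn(H, W, 3)
normal /= np.linalg.norm(normal, axis=-1, keepdims=True)  # G-buffer: surface normal

# Lighting pass: shade purely from the G-buffers, looping over lights.
# Cost is pixels x lights, independent of scene geometry complexity.
image = np.zeros((H, W, 3))
for _ in range(NUM_LIGHTS):
    light_dir = np.random.randn(3)
    light_dir /= np.linalg.norm(light_dir)
    ndotl = np.clip(normal @ light_dir, 0.0, None)  # Lambert term per pixel
    image += albedo * ndotl[..., None]              # "combined like Photoshop layers"

print(image.shape)  # the final frame, assembled entirely from screen-space data
```

Because the G-buffer stores one shading sample per pixel, multi-sample AA would have to store and light every sub-sample, which is why the industry reached for temporal solutions instead.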
I find shimmering from temporal AA way more annoying since it's prevalent in every game, raytraced or not. IMO one of the main reasons that people wax nostalgic about graphics from the early 2000s is that pretty much all games were forward rendered, where everything looked crisp with MSAA (or SGSSAA if you were a baller).
One note: running 4K native or capping to 60 fps will make temporal artifacts more apparent. The reason is obvious; temporal algorithms accumulate data over multiple frames, thus they will require more time at lower frame rates to converge, which means you have more time to see the artifacts. Something that is obvious at 60 FPS may be a lot less obvious at or above 100 FPS. This is a case where lower resolutions or more aggressive DLSS/FSR/XESS settings lowering the overall number of pixels can actually increase the stability of the image, because the higher frame rates also mean a higher sampling rate per unit of time for the temporal accumulators to do their jobs. This ultimately leads to the same conclusion: ray/path tracing needs more samples to get rid of the noise.
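A small sketch of why that happens, using the exponential blend most temporal accumulators are built on (the blend weight is an assumed, typical value): convergence takes a fixed number of frames, so a lower frame rate stretches the same artifact over more wall-clock time:

```python
# History buffer update used by TAA-style accumulation: each frame folds a
# fraction ALPHA of the new (noisy) sample into the running history.
ALPHA = 0.1                # per-frame blend weight; assumed typical value
target, start = 1.0, 0.0   # a pixel whose true value just changed

h, frames = start, 0
while abs(target - h) > 0.01:   # run until 99% converged
    h += ALPHA * (target - h)   # history = lerp(history, new_sample, ALPHA)
    frames += 1

for fps in (60, 120):
    print(f"{fps:3d} fps: {frames} frames to converge = {1000 * frames / fps:.0f} ms of visible artifact")
```

With these assumed numbers it takes ~44 frames either way, which is roughly 730 ms of visible convergence at 60 fps but only ~370 ms at 120 fps.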
The TAA era of graphics goes like this: any effect that's too difficult to run in real time on most hardware (consoles & xx60-series cards)? Reduce the resolution/samples/rays of said effect, then use temporal accumulation to clean it up. The result is byproducts like motion smearing, ghosting, the oil painting/vaseline aesthetic, boiling, grain, etc. Temporal accumulation is a bad form of optimisation since it comes with severe drawbacks; it's not "free" performance like other tricks. Developers & especially engine makers need to quit building their products around it.
Temporal accumulation is cool on paper. Getting information from past frame data instead of by increasing the pixel count is a neat concept. The problem though is that while it's significantly less expensive than super-sampling, reusing old frames degrades the image quality in many ways, so you're not really enhancing the image - you're trading one flaw for another. And a lot of these flaws (like the motion smearing) have legitimate accessibility concerns, since some gamers experience motion sickness from them. Motion sickness is the #1 accessibility issue in gaming, so it's not a great industry standard for that reason alone, on top of its controversy. We need to make more efficient lighting systems that can run at or very close to full res, instead of relying on TAA to make it possible.
Not going to happen with the number of dynamic lights in current games. Current games use deferred renderers not because they are more intuitive but because deferred renderers allow the use of sub-native-resolution lighting and shadows, and also don't linearly lose performance from additional lights in the scene. A string of Christmas lights swaying in the wind (i.e. a massive number of dynamic point lights) would be impossible without deferred rendering. A realistic flashlight would massively affect frame rate without deferred rendering. Destructible environments would pop out of the image.
@henryzhang7873 I think the disconnect we're having here is you're conflating DR = TAA and FR = no TAA. There are a ton of games that use DR without relying on TAA, where the image doesn't look broken when it's disabled, or where the developer doesn't force it on. These aren't limitations of DR, they're design choices, mostly with the goal of speeding up development. It's like games only shipping with RT and no raster mode: the reason a developer would force TAA and build the game around it is so that they don't have to spend more time mitigating aliasing & reliance on TAA, resulting in people having to use it. If the new Indiana Jones game had a raster mode, it would've taken more time to make, and time is money.
I don't know why more people dont talk about this. RT in stills can look pretty good. In motion RT looks AWFUL, and I feel like everyone else is blind to it. Fizzle, blur, and lag in reflections is horribly distracting.
Long story short: we're still 10+ years away from having enough RT-power in mainstream GPUs for it to look right. I personally don't care much about higher fidelity reflections and other such improvements. Not anywhere near worth the massive (>10X) increase in compute and electrical power requirements vs raster simulations. I'm the kind of person that turns off shadows if that makes a 1fps difference because I don't care about them either.
Which is why the current push for RT is bad, as games are starting to not allow players to disable effects like motion blur, TAA and even RT, courtesy of that new Indiana Jones game. We're not ready, but game developers act like everyone is on a 4090.
games look uglier than they ever have now, with all the RT noise, upscaling blur, and ghosting and smearing from framegen. horrible time to play at 1080p: expensive cards that run and look like shit
It's insane how 1440p has become the new 1080p, not because hardware is so powerful that it's easier to run higher res, but because software has gotten so shit that 1080p now looks like 720p.
I first experienced this odd artifacting when playing "The Callisto Protocol", as it would make spiky walls look like a grainy CRT TV showing static. Great video!
My issue with RT is that it assumes everything is shiny and clean to the point the lighting and reflections become very unnatural in all but a very small handful of titles compared to traditional rasterization, most of which have Nvidia heavily involved and are little more than Nvidia marketing projects.
That's not an RT problem, it's a material problem. Also, it's an old and stupid argument. Go outside sometime and look at a calm puddle - is it not mirror-like?
I always noticed that rt looked a bit blurry at times, but I always thought it was the upscaler's fault. Turns out I just don't like rt or upscaling.... I'm sure rt will get better, but our hardware isn't ready yet.
Anybody who has had to deal with Blender's Cycles rendering engine knows the struggle with noise and something called "fireflies" (random bright pixels). I personally like a little bit of noise in my images, so to me it's actually a plus, but I can understand how it can be annoying when trying to enjoy games.
I recently got a beautiful 4K OLED display, and one of the biggest things I've noticed now that I can see games in all of their glory is how nasty image quality has gotten in modern games. Going back 10+ years and playing older games at native 4K, you get these beautiful, crisp vistas with minimal distracting nonsense, and despite the complexity of models and lighting being technically lower, I often find these older games subjectively MUCH better looking. I think RT has been pushed a little too early, before consumer hardware was ready for it, and it has caused some pretty ugly results for lower and mid-range GPUs especially. I have a 3070 Ti for example, and in the 4 years I've had this card, I've turned RT effects completely off in just about every single game I've played, and that has consistently led to a better-looking game, with higher resolution and WAY fewer distracting artifacts. Ray-traced effects are the future, I'm sure, but as of now I'd suggest just about everyone turn them completely off and enjoy the massively increased image quality and frame rates, which 95% of the time matter a LOT more in terms of your general gaming experience.
It does look amazing when done well though, hardly a scam. I say that as someone who doesn't think the performance cost is worth it a lot of the time, then again my 3070 can't handle it anyway most of the time.
Yeah, I was heavily saving for a while for a 5090 after path-tracing games started coming out. But honestly I think I'll wait for image quality and performance improvements in ray-traced games before I upgrade my aging RX 580 gaming PC.
It isn’t overrated, it’s still too difficult for now to get a good real-time implementation. Real-time path tracing has been the ultimate goal for decades.
Games have gotten more blurry over the years, which IMO is the most distracting trend out there. It has gotten to the point where some Gen 7 games look better than the current-year blurry mess.
Ray tracing is mainly a useless technology for which there will never be enough hardware, constantly making people accept DLSS (reduced resolution) and DLSS 3 (imaginary FPS + input lag) just so games work at all. Take the lineup from 2019 - a 2080 (Ti) + 9900K - vs a 4090 + 9800X3D. The difference between these two lineups is about 50-70% in performance, but games look worse today, optimization has mostly disappeared, and everything is left to DLSS. Ray tracing and DLSS are evil for PC gaming. Previously, DLSS was presented as bonus FPS. Today it is just an optimization crutch for developers to reduce game development time. And ray tracing can swallow 50-60 FPS, and path tracing even 100 FPS. While the 4090 can handle The Division 2 on ultra in 4K at 120 FPS, it can only handle newer games, especially on Unreal Engine 5, at similar FPS with DLSS 2+3... What's the point of constantly upgrading your PC?
It is possible to do RT with the right art direction and optimisations... You had a video about an indie game where they managed to run some RT even on a GTX 1060.
Hardware just isn't ready for raytracing yet. We need multiple "cheats" to get it to run reasonably and then it still looks like this. Can we just accept that we need another decade before raytracing becomes actually feasible instead of trying to cram something in the games that most players cannot even experience at all and those that can get mediocre mush?
So that's where that grainy effect is coming from; I'm quite tolerant of visual artifacts, but those grainy/gritty shadows are a hard stop for me. It's really agitating, as it creates false movement that draws the eye toward it when there's nothing good to see.
No amount of denoising would help the lack of proper material diffusion and light scatter. In pretty much all of the examples in the video, materials with reflections don't appear realistic at any rate. A glass mirror is one thing; diffused light in wood/plastic/sub-5000-grit polished metals is entirely different.
I would argue that RT rarely improves the picture. In the vast majority of games, the materials are not adjusted for RT, which degrades not just the style but also the "realism". In the same example - Hogwarts - the granite on the floor should not be so shiny.
RT has been utter dogs**t for gaming and GPUs. It's made them unnecessarily expensive, it offers largely garbage results for most people, and games contort themselves around this "feature" - which most people turn off anyway. I absolutely hate it.
Great to see that someone finally brings up this issue. I'm a person who always turns off things like motion blur, film grain and similar effects because I want a clear picture. This effect is what kills RT for me, apart from the insane performance loss.
"We'll fund your game's (and game engine's) development if you force it on. It makes everything shimmery and blurry, makes games run slower despite consuming twice the power, and 80% of the things it does can be done more efficiently with other techniques, but the performance hit is even _bigger_ on our competitor's GPUs, so it makes us win benchmarks, and we'd rather see the game run at 40 fps on our GPUs and 30 fps on the competition's GPUs than have it run at 100 FPS on ours and 101 FPS on theirs."
@@maynardburger That's true but because they would never see a cent of it either way, publishers are the ones making the deals and keeping the profit.
This is frankly the real reason why RTX adoption has been lacking. It's not that it doesn't make a difference, but that it looks outright unappealing at times. I will always prefer a clean video-gamey look over a messy photorealistic look. UE5 seems to be taking the former away for good, though.
The denoiser has lag, which causes reflections to become blurry or misplaced. It's all because current GPUs are too weak for full-resolution RT effects. It's actually the developers' fault: Nvidia said at first to use only one RT effect, but developers want everything for some reason.
@@maynardburger Yeah, on first gen RTX cards. Current cards can run full res, but they don't because they've added all these extra effects. It's even worse on consoles, PC RT problems are nothing compared to PS5/Pro.
You guys have no idea what you're talking about. Without TAA/reconstruction, these games would be trashy messes of image quality. Sharper, but horrible looking.
@@maynardburger Man, if only we had dozens of other AA methods that all look better. Oh, we do. TAA is required to hide these effects the way they are done - or rather, no one would do these effects this way if TAA didn't exist. One wonders why 8th-gen multiplat games on PC look better than 9th-gen multiplat games on PC, while performing better.
@@maynardburger You sound like TAA is literally the only form of AA that can exist in this solar system, and if we dare to attempt any other implementation, Cthulhu will awaken and eat us all.
I can count on my fingers the number of games where I think it actually makes a worthwhile enough difference to be worth burning half my fps and turning on upscaling or frame gen.
Hogwarts Legacy is the example for me. Everything becomes so shiny and oily just for the sake of having more reflective surfaces to show the effects off.
@@102728 That game is the perfect example of what's wrong with RT. I get that the developers aren't out much, but a floor isn't supposed to act like a mirror (even if it was just cleaned). I hate that they add shiny things just to show off in some game preview and say "we have RT (so this game is instantly better, obviously)".
Thanks, Hardware Unboxed. You just helped me understand why ray tracing in the games I play rarely feels like a visual upgrade to me, and why most of the time I really dislike it. If resolution is to blame, then DLSS as a solution for the RT performance drop just makes everything worse.
This is honestly another great gripe of mine against RT... using RT reflections in a game like Cyberpunk, especially with aggressive upscaling like "Performance" mode, rough reflective surfaces get almost MPEG-level artifacts due to noise and denoising algorithms. I find this MUCH more distracting than high-quality SSR. Especially since the weapon takes up a large part of the screen and often has rough metallic surfaces.
@@maynardburger at the very least people should know what they're picking and allow us to choose. I'm glad you like TAA, but don't you think choice is a good thing?
Effects are never used in a realistic, subtle fashion if they eat up a lot of performance. Games will smack you over the head with them to make sure you notice they have the fancy new thing. Like with colored lighting, lens flares, bloom, motion blur.
The timing of this video is absolutely wonderful, since the new Indiana Jones game requires hardware-based ray tracing. I hope this thoroughly halts that trend in development. These issues, plus the fact that even the cheapest cards take a more than significant performance loss with RT enabled? It's just not beneficial to the customer, to put it in a very nuanced manner.
For years everyone has talked about "hardware ray tracing" units, but they're only doing a few ray bounces, which makes a really noisy image. It's the tensor cores that are doing the real legwork, denoising what they can to make it look... passable. And that's before they have to do even more denoising when you're inevitably using DLSS to upscale from a lower resolution so you're not getting 35 fps. The RT filter we apply to rasterised games was a gimmick in 2018 and 2020, and it still is today.
What a weird situation. Cutting edge games with "the best graphics" genuinely look worse than last gen games running 4K native with conventional baked lighting. The massively superior frame rates and perfect image clarity found in older titles is so much nicer to look at in motion than this modern stuff full of upscaling artifacts, TAA ghosting, and raytracing noise. What is going on? Why has the industry gone in such a bizarre direction? Surely developers and graphics artists can see how ugly these features are?
Dynamic objects. If you look at the original Crysis, lighting inside of a destructible building is very different than one in a static building that can't be destroyed. You can interact with almost any object in modern games, when in older games you couldn't even move a cup that was placed on a table. Now they are lit the exact same way as the rest of the world, while in older games you could only rely on ambient occlusion making objects "pop out"
PBR alone makes games today look way better than before. But PBR also comes with the problem of complex shaders and shader aliasing, which demands games use some kind of temporal AA solution to keep the image stable. You guys are still living with the old mindset of 'sharper = better', but we can't have super sharp visuals along with PBR without it being a complete mess of pixel shimmering everywhere. It turns out professional graphics developers aren't all incompetent morons, and you guys might simply not understand this stuff at all.
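A toy sketch of the shader aliasing being described (the lobe width, jitter amounts and sample counts are all invented for illustration): a tight specular highlight varies far faster than the pixel grid, so a single sample per pixel flips wildly under sub-pixel camera motion, while a heavily averaged value - which is roughly what temporal AA approximates over time - stays stable:

```python
import math

def specular(x: float) -> float:
    # Narrow PBR-style highlight: a high-frequency lobe much finer than a pixel.
    return max(0.0, math.cos(40.0 * x)) ** 64

PIXEL = 0.4712  # pixel center placed near the highlight peak (about 6*pi / 40)

for jitter in (0.0, 0.003, 0.006):  # tiny sub-pixel camera motion per frame
    one_tap = specular(PIXEL + jitter)                     # 1 sample per pixel
    avg = sum(specular(PIXEL + jitter + (k / 64 - 0.5) / 20)
              for k in range(64)) / 64                     # 64x averaged reference
    print(f"jitter {jitter:.3f}: 1 tap = {one_tap:.3f}, averaged = {avg:.3f}")
```

The single-tap value swings by roughly a factor of 6 between consecutive frames while the averaged one barely moves; that swing is the shimmer TAA is being asked to absorb.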
Will you also make video about all the artifacts that rasterized techniques cause? Like SSR disocclusion, shadows not rendering at the edges of the screen, SSAO/SSR disappearing when the occluded object is off screen, etc..
I remember when it was being speculated that newer generations of RT cores would make turning on ray tracing cost zero performance, but it never happened. Even in the 4000 series, you turn on ray tracing and still lose a massive amount of fps, just a bit less than on previous gens. Let's see what the RTX 5000 series brings to the table.
Just a reminder that all games that support Ray Reconstruction had the feature turned ON for footage in this comparison (unless otherwise specified). Those games are Cyberpunk 2077, Alan Wake II and Star Wars Outlaws. Ray Reconstruction addresses noise differently to other forms of denoising but falls well short of eliminating the issue.
👍
Yeah exactly, Ray Reconstruction is great... if you want the game to look like you put an oil painting filter over it (Cyberpunk specifically)
IT'S A GIMMICK
Hardware is weak for real RTX and devs can't do anything about it, and games where RTX makes any sense are few and far between, other than as ADVERTISEMENT FOR NVIDIA. YOU JUST SHILL FOR NVIDIA TO SELL US THE STUPID entry-level 4060 AT mid-tier prices - e.g. what GTX 780s used to cost - every time you compare or test ray tracing performance!!!
I'd rather play with TAA with some tweaking. Look at Cyberpunk: I tested TAA and DLSS and I get better quality with TAA, so much more detailed, specifically with full path tracing. And games these days get bad TAA implementations, specifically on UE4 and 5, without fixing the major stuttering issues.
The issue is deeper! It doesn't excuse RT, but games have a lot of noise in general; it comes down to TAA (temporal anti-aliasing), especially on Unreal Engine.
The main noise problem was Nvidia yelling about how great it was on the RTX 2060
Looking at the latest RT performance data from their Intel B580 review, ray-tracing isn't great for the RTX 4060 either, lol
💀
Or even for any card worse than a 3090...
The noise issue is people making noise for Nvidia graphics cards in the first place. They have become the most unrecommendable cards in existence, as Nvidia gets away with charging more and more to make over twice what the card should cost in profit.
"[...] You can sit around twiddling your thumbs and hoping that an RTX 2080 gets cheaper, or you can enter the world of ray-tracing and high-speed, 4K gaming today and never look back. When you die and your whole life flashes before your eyes, how much of it do you want to not have ray tracing?"
The blurry era of Unreal Engine 5
Yes! UE is so bad at RT
yes but I think RT is the last of Unreal Engine 5's problems.
The terrible CPU and GPU performance even for raster needs to be optimized first.
UE5 is bad overall visually, AND it has massive issues on the Ryzen 5000 series causing screens to bounce around; you have to turn off cores to stop it from happening. Happens in Stalker 2, Gray Zone Warfare and even bloody Fortnite.
Unreal Stutter Engine 5
@@Arjun-eb1yc Just watch anything from Threat Interactive and you'll soon find out that unreal has way more problems.
Finally someone spoke about this. It looks good in screenshots and photos but blurry, noisy and low res in motion.
Exactly - that's what I could never explain to people until now. I tried RT; it looks like trash, and I'm never going to buy a GPU based on it. Only if something drastically changes with this technology in the future.
As if rasterization doesn't have its own list of quirks. I much prefer RT accuracy than SSR disocclusion, shadows not rendering at the edges of the screen, SSAO/SSR disappearing when the occluded object is off screen, light bleed, SSAO over-darkening and all kinds of funny looking inaccuracies. Screen-space effects also had many issues and it took time and research to get them in a pleasing state - compare Crysis 2007 to RDR2.
@@ShankayLoveLadyL yeah, I have the same opinion of upscaling technologies too. They make games look bad in motion, and what's worse is that devs are hiding behind DLSS to not optimise their games properly. So people don't get good frame rates in native rendering.
People have spoken about this in the past as well. Your comment should technically be that finally some mainstream tech people have talked about this.
I think the opposite. Not great in screenshots, but beautiful in motion. (I think of the light games seen and played in Alan Wake 2, Wukong, Cyberpunk, etc.)
I like how every wet surface becomes a mirror.
I live in the U.K., pavements do not become mirrors when wet.
Your pavements are not made of polished marble? Third world country.
Right? It's so tasteless. Even the surfaces that are reflective without ray tracing become way more reflective with it on, and it looks dumb (the Ratchet & Clank example).
I thought I was the only one thinking about this. Ray traced reflections are shoehorned into every Nvidia sponsored title purely to sell their GPUs, it's so obvious and I can't figure out why nobody mentions this. The reflections in Cyberpunk are absurd, there's no urban surface in the world that looks like this when wet. SSR is actually more convincing. RT reflections are certainly not "more realistic" - look at the Great Hall in Hogwarts and tell me why that stone floor shines like a mirror, it's daft. The one in the movie does not look like that. And the school even has reflective blackboards! 9 times out of 10 I'm selecting screen space reflections instead, because they're more diffuse - like actual material reflections.
Nah, I'm much more interested and impressed by ray traced global illumination and bounce lighting than I am bright purple neon lights that one would be hard pushed to find even at night in Blackpool during the pissing down rain.
That's because UK water is dirt
@@_Quint_ dude! just put "Tokyo raining" in the YouTube search box. You will see high-res videos, and yes, surfaces can look extremely reflective depending on the material.
Blur and ghosting from TAA, and noise from RT. Gaming is great
just one big blur
A lot of the RT byproducts are from TAA, whether directly or indirectly, such as the swimming/boiling effect. It comes from excessive accumulation (holding onto too many frames). We also wouldn't be rendering the effects at absurdly low resolutions if TAA couldn't smooth it out, in which case more efficient RT algorithms like Voxel GI would crop up.
Always remember: while developers can be passionate, development cost is the #1 priority for publishers, and time is money. TAA makes things quicker, and using RT and just lowering its resolution is also a quicker way to make a game and save performance. That's also why the mere existence of technologies like DLSS means games are being built around upscaling being expected - because it saves them time, which then dumps the cost onto you (in the form of lower image quality and performance).
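For what it's worth, the standard mitigation for the over-accumulation this comment blames is neighborhood clamping: history that no longer resembles the current frame gets rejected instead of lingering as ghosts or boiling. A 1D grayscale sketch (the blend weight and pixel values are made up):

```python
def taa_resolve(history: list[float], current: list[float], alpha: float = 0.1) -> list[float]:
    """Blend reprojected history with the current frame, clamping history to the
    current frame's local min/max first so stale data can't survive many frames."""
    out = []
    for i, h in enumerate(history):
        neighborhood = current[max(i - 1, 0): i + 2]            # pixel plus its neighbors
        h = min(max(h, min(neighborhood)), max(neighborhood))   # reject out-of-range history
        out.append(h + alpha * (current[i] - h))                # then accumulate as usual
    return out

stale_history = [1.0] * 5   # a bright object that moved away last frame
current_frame = [0.1] * 5   # it is gone in the current frame
print(taa_resolve(stale_history, current_frame))  # -> all 0.1: no lingering ghost trail
```

The catch is that aggressive clamping reintroduces the very noise the accumulation was hiding, which is exactly the tension described above.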
@@Hybred In short = It's over (for now)
Weird how the 2014-2018 era of games looks clearer and less blurry
@@ThunderingRoar Yep even World of Warcraft looks better than most newer games because the image is sharp and clean
Ray Tracing = Noisy
Resolution = Blurry
Framerate = Fake
1:05 = Perfection
This is NVIDIA fanboys' dream right there!
@@cyclonous6240 Womp womp
1:05 too noice
This seems to be what everyone is pushing, from both AMD and Nvidia. It's garbage and needs to go away
Down bad
Around 2010-2015 I remember getting annoyed by how many games had obnoxious levels of motion blur as a visual effect. Not only did I hate how it looked, I remember finding it beyond silly that it was also slowing down the framerate to make it look worse. This modern situation with ray tracing is at least making things look better in many ways, but it's almost like the motion blur monster is back and this time we can't disable it.
I know we can't put the genie back in the bottle, but it really feels like ray tracing has created more issues than it's solved in its 6 or so years since introduction. Consumers get medium to negligible visual improvements at the cost of massive performance hits and huge hikes in hardware cost. On the developer side, people always tout that ray tracing will make it easier for artists to work since they don't have to micromanage lighting as much, but we haven't and won't see that for a long time, as we don't have the hardware to fully ray/path trace everything. We still have to rely on rasterization for so much, meaning developers are spending even more time shoving these effects into their largely rasterized games. Really, the ones who benefit the most are the GPU makers, as they can now sell us hardware with "huge generational leaps" in performance - but only on the ray tracing side, and with only medium improvements at best in the visual quality to show for it. I'm hoping we have some breakthrough that makes all this headache worth it, but man, it's been too long in the making.
Whether that's grain, sizzling, boiling, grilling, baking, microwaving, it's not worth the performance drop
To you. But you ignored the horrible raster lighting. Disconnected objects, glowing parts of the room. It's a trade off.
@@SlyNine What horrible raster lighting? There are plenty of games that look gorgeous with it without needing to eat 30 fps in the process.
In all of his examples so far, except for maybe one or two, I would much prefer the RTX ON to the raster lighting. It also very much is worth a performance drop. At least for anyone with working eyes.
Personally I prefer steaming, it makes vegetables taste much better and performs quite well at only about 6 minutes per vegetable.
For you. For others, baked lighting is a scourge that needs to get put to pasture. Fake and plain wrong lighting/shadows, no matter how stable, is a fkn eyesore if anything. Looks horrible. To me.
Upvote if you're gonna buy 5060.
😂
Said to my wife no amount of anti aliasing can hide her wrinkles. Now I am wifeless.
So, has the performance improved?
Good, noise is gone
It's like telling your wife in her 40s that she looks like she's in her 20s, but with ultra performance upscaling
I've never seen such a glossy chalkboard (2:30) - never knew they could be made of glass
it's a magic chalkboard. don't worry about it.
Also games like Cyberpunk 2077:
RT off: dry floor
RT on: wet floor 😂
Ray tracing was supposed to be subtle, to enhance the environment. Now everything is unrealistically shiny to show off ray tracing. Less is more... besides rainy environments, of course
The problem here is that even the 4090 can only sustain 6-8 rays per pixel per frame in complex scenes. There have been many UE5 demos at GDC demonstrating this. Therefore, denoising, TAA, and ray reconstruction are required. When we were implementing path-traced renderers in college, we demonstrated that you need ~1024 rays per pixel to produce a good image, and that is not even accounting for water/dirty glass or things with lensing effects. We would need a GPU about 100x more powerful than a 4090 for a noise-less path-traced renderer.
Once we reach that point, however, we will be able to have the ultimate realism of having a scene that has no global illumination, illuminated only with lights that are in the scene, no "sun" or "moon." RT global illumination is still a hack.
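A quick back-of-envelope check of those numbers, taking the comment's 6-8 rays/pixel and ~1024-ray figures at face value:

```python
# Gap between real-time ray budgets and a visually converged path trace,
# using the figures quoted above (6-8 rays/pixel/frame today, ~1024 needed).
rays_needed = 1024
for rays_today in (6, 8):
    print(f"{rays_today} rays/pixel today -> ~{rays_needed / rays_today:.0f}x more ray throughput needed")
```

That lands in the 128-170x range, which is where the "~100x more powerful than a 4090" estimate comes from.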
We will never have GPUs 100x more powerful than the ones we have now. The room we have for improving processing power is entering a practical decline, and it won't take long before it's straight-up financially unfeasible, even before we get to a hard stop in transistor shrinking.
I think the transparency and refraction effects may be solved with machine-learned upscaling, but I'm nowhere near educated enough to really talk about this.
I just think that since it's blurred and out of focus anyway, it would be less obvious if it is noisy.
Direct reflections so far have made it very obvious that the ray tracing nVidia offers is subpar and not realistic at all
To be fair, AI will probably solve these problems and approximate denoising and reconstruction. That's where we're already heading anyway.
@@maynardburger A hybrid quantum + transistor based architecture maybe able to get us there in the future.
@@maynardburger We already have "GPUs" 100x as powerful as the ones we have now. They are just not a "single" GPU. The reason why this is not as useful for the standard projection based rasterization model is that it is very expensive to share data between cards since that has to go over the bus. Deferred rendering is a multi-step process that requires data to be shared between cards constantly. A fully path-traced renderer only requires the scene data and material properties to be sent to all GPUs once at the beginning of the frame and each can work completely independently after. If you look at Cinebench this is what happens. Cinema4D uses ray traced renderers, and those renderers can use farms of GPUs for movies already. The problem is power and cost.
If we keep improving GPUs at 30% per generation for the next 18 generations we will have the 100x needed to do path tracing without ANY tricks...which will happen in like 2050. Before that we will be able to eliminate a ton of tricks already.
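The compounding math in that last paragraph roughly checks out, for what it's worth (the 30%-per-generation rate is the comment's assumption, not a given):

```python
# 30% per generation compounds multiplicatively across generations.
gain_per_gen, generations = 1.3, 18
print(f"{gain_per_gen}**{generations} = {gain_per_gen ** generations:.0f}x cumulative uplift")
```

1.3^18 is about 112x; note that at a typical two-year release cadence, 18 generations is ~36 years out, so the "like 2050" figure assumes a faster rhythm.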
This is the problem with a lot of the tech sector: it's run by technofetishists who love the latest piece of tech they've come up with because of how it works, and who then try to make it viable in the real world - even if it's not there yet.
Well... most customers seem to agree by buying RTX GPUs.
I don't care for RT... not worth the cost for the 1-2 games I own that have RT.
TBF, have you tried buying a GTX brand new in the last 4 years?
All modern GPUs supporting it (whatever that may entail for performance) doesn't mean ppl actually care. You cannot choose a brand new card that doesn't support RT, as they don't exist...
@@Vizeroy9 RTX GPUs are pretty much the only nVidia GPUs available, and people just buy nVidia (or buy a laptop/prebuilt, which are mostly nVidia)
@@Vizeroy9 The customers mostly don't know the problems of ray tracing; they just believe Nvidia's marketing and the general tech hype.
We're talking about people who buy 4060s for ray tracing.
Digital Foundry (nVidia PR) does not approve of your comment
Honestly, I've never felt the urge to use RT in any game; the performance-to-visual-reward ratio isn't worth it, imo.
In some games (such as Dying Light 2) I found the overall scene composition to be off with RT enabled. It's worse than useless if it's not tuned properly.
I think some devs just use it as a "set and forget" alternative to taking the time to inspect and manually tweak the shadows and lighting in each scene.
A game needs to be designed for RT from the ground up with a very specific world design in mind, as opposed to tacking it on just because (e.g. Elden Ring and most UE5 titles at this point).
CP77 with RT remains by far the most beautiful and immersive visual experience I've had in gaming so far, especially on OLED w/ HDR.
I wouldn't want to play CP2077 without at least RT reflections these days. Noise is a problem, but I'd pick noisy-but-stationary reflections over ones that disappear as soon as you tilt the camera a little. The performance hit is acceptable on RTX 30-series.
I did... in Minecraft!!
I'll only turn on RT when I get a 5090. That way I won't have to use DLSS.
Stalker 2 has this exact same problem, being a UE5 title and all that. It's especially noticeable in the main building of the SIRCAA, where the floor is grainy and muddy regardless of what settings you use.
Stalker uses UE's Lumen, which is a sort of wonky software-based implementation of RT that's even noisier than the regular kind
@@EvilTim1911 Yes, compared to most implementations of hardware accelerated ray tracing, lumen is very low res (very few computed rays), and as a result, very noisy, so it uses heavy noise reduction.
@@EvilTim1911 Lumen also generally performs better than most other RT implementations, though.
Not that I disagree, UE5 visuals in general are super blurry/noisy.
yeah I can't get Stalker 2 to not look milky and frizzle-fried
super disappointing
Non-RT visuals look better to me, simple as that - clearer and not, as you say, "noisy." For now I'm turning RT off to see if my games actually appear CLEARER. Thanks for addressing the elephant in the room, Tim.
Unfortunately everything UE5 has a noise problem... but don't worry, we can just slather your eyes with vaseline and now you can't even tell!
I swear you need to run at least 4K or 8K to get a somewhat clear image in UE5. It's ridiculous.
it's not the engine's fault, it's the developer's fault
@@draand2878 It is UE's fault.
@@draand2878 Since basically everyone is switching to UE5, it is now the engine's fault
With little to no rasterization gains from new GPUs, sigh
The "boiling" artifacts you describe at around the 7:00 minute mark are my main gripe with the current state of raytracing. This constant wobbling is really distracting to me and is the reason I disabled PathTracing in games like Cyberpunk with lots of reflective metal surfaces even though the lighting looks noticeably better with it on.
I started gaming around 2001, and I would say we have the worst-looking games now: blurry, noisy, ghosting garbage with TAA, upscaling and framegen, yet the HW requirements are crazy.
Not to mention that the quality of the games themselves in terms of writing and gameplay is also worse. Fortunately that only makes the classics that much better in comparison.
@@bravoMike97 People just forget bad works of art from the past while seeing a lot of uninspired works in the present. Good games of today will similarly be remembered 20 years later, but do you think all that shovelware we see released on Steam every day will stay in anyone's memory?
TAA isn't garbage. SMAA and FXAA have serious flaws that are obvious on any moving image.
@@Zecuto Just not true. Are you seriously saying that, let's say, movies are just as good today as they were in the 90s and early 2000s? Sure, some good indie stuff is made, but bigger movies are trash. Games are not nearly as bad, but in the past they were better gameplay- and story-wise.
THIS. Side by side the RT image looks superior, but when playing I've noticed the shimmering and grain is super distracting
💯
I don’t understand this claim that the RT image looks “superior”. It looks extremely exaggerated, I don’t find it necessarily more realistic at all.
@@brando3342 💯
So we don't only have unoptimized games, we also have hardware that isn't able to keep up with the demands these games have. Especially with more and more titles requiring mandatory ray tracing...
It's "the way it's meant to be played"™, y'know?
It costs so much to develop a die that keeps getting better at just rasterization that it's not viable long term. AMD is a clear example. MCM/chiplet architecture costs less than monolithic, but even the 7900 XT / 7900 XTX didn't bring AMD any profit, because they focused on rasterization over technology. Put the 7900 XTX in a monolithic architecture like RDNA4 and AMD would be losing money selling it at $999. Behind the scenes, TSMC, Hynix and the other companies that develop alongside AMD/Nvidia are asking to be paid more and more.
@@Kony_BE Yet we have UE5 games gradually forcing RT even though the market isn't ready for it. It doesn't make sense to me, and it looks like this could lead to Nvidia taking an even bigger share of the GPU market, especially with many game developers switching to UE5. That's unless Intel and AMD are able to catch up and compete.
@@sgxsaint3130 lol yeah, the Q3 2024 GPU market share numbers were released a few days ago, and they show Nvidia gaining 8% market share year over year while Radeon dropped massively to only 10%, with Intel worst of all at 0%
There is literally one title that demands hardware RT, and that is The Great Circle... If you're talking about software Lumen requirements, there are still no more than a few that don't have raster as an option. I do believe your statement will be true in about a year though, since it's absolutely the only way to go if we want progression. We're gonna suffer some early problems and struggle with performance for a while longer, but when the kinks have been fixed and the GPUs catch up to what is/would be possible in UE5, for example, games are gonna look 🔥🔥🔥
Real isn’t always better. Style beats realism. Baked lighting and trickery can look amazing.
Can't bake lighting if the game has dynamic time cycles and weather.
It can, but you lose flexibility and lots of dev time. Dynamic objects will always have a disconnected appearance.
I think dying light 1 proves that wrong @@ItsCith
@@ItsCith Does anyone really care about time cycles and weather, though? People are like, "wow, that sunset is amazing, the afternoon lighting is cool, the night lighting is amazing," but does anyone care about the lighting in between? It isn't even noticeable most of the time. Weather is cool too, but I've never seen it done well at all. If they're not doing it well, why even put it there just to make the game harder to play?
@@OtherwiseUknownMonkey Time cycles and weather cycles "can" matter, but it's really dependent on the game and your approach. For racing games they can be amazing (if done correctly); survival games often don't get them right, because they go for a more "balanced" approach to weather/day/night cycles so it doesn't hamper the player's experience.
I personally don't care for it, but it can help elevate some games IF done correctly, and most don't do it correctly.
Noise is not the only problem with ray tracing...
Exactly, it's shiny shiny, bling bling for dummies!
😮
From day one I thought ray tracing looked like shyt
@@OzzyBoganTech Lol it's not like you can even use it. Sour grapes
@@OzzyBoganTech Because people forget that ray tracing in pre-rendered scenes asks for hours and hours of rendering for just one frame. So imagine what it takes to handle ray tracing in real time perfectly. Even Nvidia doesn't have the solution.
That's why I don't use RT and TAA
TAA is no longer optional in most games, and there's a good reason for that. People who think TAA is bad have no room to complain about "noisy" images, cuz without TAA/reconstruction, modern games with high detail and complex shaders would be an utter MESS of image quality, especially in motion.
@maynardburger TAA isn't "optional" because it's forced, like in Dying Light 2. SMAA and FXAA look great both in motion and at rest; TAA sucks
@@marksvirsky9103 "SMAA and FXAA look great in motion". No they don't and I'm tired of hearing this bs.
@@marksvirsky9103 SMAA and FXAA are terrible. Anyone who thinks otherwise hasn't made a game or game engine. Including that so-called expert who tells game devs they're wrong but has never built an engine to demonstrate his case.
@@H3LLGHA5T Why not though? I tried it in multiple games and motion clarity has been perfect. Did you try SMAA or FXAA without upscalers?
This is my issue with the original version of CONTROL. There was an “official” mod that reduced the noise but it’s still kinda noticeable
Yeah, CONTROL has it really bad especially in an elevator.
Some implementations are also more unrealistic. 2:25 These scenes from Hogwarts Legacy are perfect examples. Stone floors that have been walked over for hundreds of years, and chalkboards, are not that shiny!
I've thought the same thing for years now, I've always called it the blender effect, typically because moving the viewport in 3d rendering programs results in hugely garbled images that slowly come to cohesion after the viewport is stationary for a bit. That and the massive performance hit is why I still haven't enabled any RT in games yet, or felt the need to upgrade my hardware for RT.
We give you shiny surfaces and unrealistically shiny water, for only the cost of another GPU on top of your existing one; plus, you get the noise.
I'm still not feeling RT tech at all.
I don't think the tradeoffs are worth it.
It's been like 5 years. You'd think we'd have found something positive in that amount of time, but nah
it only looks 'good' in games where not enough work was done on the rasterised graphics
@@fuzzjunky The only "positive" I've found is RTX Remix, so we're starting to see older abandoned games get a refresh. But even then the tool still needs a lot of work, demands at least a 3060 Ti (I tried with a 3070 and the experience was... bad...), and gatekeeps AMD users because of CUDA
@jagildown RT had a lot to do with me upgrading to a 3070 (FTW) from a 1070 (FTW). I usually turn it on to see how it looks, then turn it back off, because I'd like to play at more than 30-40 fps.
It gets worse when we think of lower end cards.
We need a different method to fix this as it’s too demanding for lighting. Such a shame.
0:45 Glossed over? You mean completely ignored, like it doesn't exist. Ray tracing is absolute garbage.
All I could think when you said path tracing/ray tracing was "making a more realistic image" was... "really?" Honestly, it kind of looks exaggerated and unrealistic. The sad part is, devs could make the other lighting solutions look "more realistic" if they just tweaked them a bit.
Games with reliance on TAA in general have ghosty/blurry/"dirty" images. Especially in 1080p, which is still the most popular screen resolution.
I hate TAA even at higher resolutions. It makes the whole image look like an oil painting, though it's definitely worse in some games than others.
@@Kryptic1046 That's massive hyperbole at higher resolutions. Y'all are still living in like 2016 when TAA was still in its early days.
@@maynardburger "That's massive hyperbole at higher resolutions"
Oh really? Lol, tell me more. Go play Forza Motorsport with TAA on. It looks like garbage in that game regardless of resolution, and it's such a common complaint on their forums that they had to implement a toggle to give you the option to turn it off. I have several 4K panels, and TAA in many games looks like garbage on every single one of them. Or are you really going to sit here and tell me not to trust my lying eyes?
@Kryptic1046 Don't trust your eyes, buy 2 4090s and SLI. Wait, they killed that for consumer cards.
@@Kryptic1046 That's a UE/nvidia shill replying with that same toxicity to every comment criticizing RT and TAA
We still have stuttering, noise, shimmering, and texture pop-in in 2024. Crazy
The RT dream has been chased for 4 generations, yet it's rarely done well and still axes the framerate. I can only imagine where we'd be if all that money and R&D had been invested in other improvements.
Game physics haven't gone anywhere since the 7th-gen console generation, with the exception of Rockstar's titles. So we really have been playing glorified tech demos for $70
I'm starting to have issues with modern gaming. The fact games are often using sharpening effects, TAA, etc results in blurry and noisy games. Some of these games also don't allow you to turn these filters off, such as black myth wukong. Many developers are also relying on upscaling to achieve acceptable performance, furthering the problem, and now noisy ray tracing.
I have a 7900XTX so I never run ray tracing anyway, I prefer high framerates, I play at 1440p 360hz.
Reconstruction is not making the problem worse, ffs. You guys simply don't have a clue what's going on. Reconstruction actually helps games be sharper, not blurrier, at a given performance level. The alternative is to lower the native resolution much further, which would be much blurrier.
The problem with ray tracing is that it thinks everything in the world is clean and spotless. Not even remotely true. I don't even think it's that accurate.
A lot of the time with RT games it seems to make some surfaces overly reflective when in reality they probably shouldn't be.
That's not a problem with RT; that's up to the devs making the normal and roughness maps for the textures
@@hamzakhalil-gs7oc The problem with pushing "Ray tracing BRO!!!!" so hard... is that all these devs think applying reflective materials to literally every surface somehow makes your game look "more cool."
I’m from 2050 and Nvidia changed their GPUs’ names back to GTX
As annoying as noise can be sometimes, I'd rather have that over the disgusting mess that is SSR we've had for the past 7ish years. Having details just disappear in SSR based on the camera takes me out and I find disabling SSR much more immersive even if the reflections aren't accurate. Consistency and art direction are key.
Art direction? There can be no such thing as art direction when the current push in graphics is photorealism; that's all we've been trying to do.
Developers might want to start toning down reflections in general. I find modern games make far too much use of them, and they often aren't particularly realistic.
@@nightmarepotato5000 If people stop buying stupid photorealistic games that are bland and trash, there will be better games again, because game studios will invest in art direction teams instead of reducing them to the bare minimum like they do now
"Adds to realism" - talking about game were you play a fox with a ray gun.
Haaha
I hate how Digital Foundry cries about how "transformative" ray tracing is, gushing over every minuscule difference it brings while downplaying the negatives (noise, big performance hit). They've been an nVidia PR machine for the feature since it started. Good job to Hardware Unboxed for this excellent piece of tech journalism!
Especially Alex; that guy's annoying as hell. RT this, RT that. He never shuts the ef up about it
@@E-NIS690 Looks like his RT paycheck is in overdrive settings 😆
Thanks, now I finally know how to recognize where there is ray tracing and where there is not.
Ray tracing has
-a noise problem
-a performance problem
-a cost problem
-a "you mostly see no difference to good rasterization" problem
Well, RTGI looks amazing, far better than good rasterization, but that's mainly just the lighting. Still costs WAAAY too much though.
RT can save the studio time. Of course those savings will be passed on to the parent company, not to the consumer.
"you mostly see no difference to good rasterization" Applies only to titles that have trash implementation like the AMD sponsored RE4 remake.
"a noise problem" Ray reconstruction exists, and by now AMD, Intel should've made and alternative for it.
" cost problem" Battlemage and RDNA will create enough competition solve this.
🥹
"-a ""you mostly see no difference to good rasterization" problem" is more of a "you're blind" problem though.
Honestly, I hate to admit it, but the answer here is AI. Temporal noise is the most difficult problem to conquer with raytracing. I make offline, path-traced animations, and the amount of samples you need to eliminate the noise is often staggering: 256 samples might produce a _pretty good_ image, yet to actually _eliminate_ the noise, you'd need to spend an extra twenty seconds doing something like 4096 samples. That's completely infeasible in real time raytracing where you get a handful of crappy samples to work with.
Stuff like Nvidia's AI ray reconstruction is honestly probably the solution here. It isn't there yet, but it's the most promising path forward right now. Everyone rendering path-traced animations has been using adaptive sampling + AI denoisers for years, and even then, a 4090 is still far from being able to path trace in real time.
Real time ray traced games theoretically have to achieve accurate reflections, lighting, and shadows based on substantially incomplete information, which leads to flickering from frame to frame. Hardware is _not_ going to be able to do 200x more samples per frame any time soon, so I honestly do think AI techniques are the solution to this.
It's annoying though - as we've moved from rasterized to raytraced games, they're definitely becoming a blurry, flickery mess. Part of the problem is that games get sold based on highly-compressed YouTube videos that people watch on tiny phone screens where temporal noise isn't very noticeable. It's only when you're staring at the big high res screen and _actually playing the game_ that you start to notice.
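One way to see why denoisers are both necessary and why they smear: even the crudest spatial denoiser trades noise for blur. A deliberately naive sketch (illustrative only - real-time and AI denoisers are edge-aware and temporal, not a dumb box filter):

```python
def box_denoise(img, radius=1):
    # Naive spatial denoiser: average each pixel with its neighbours.
    # Noise drops roughly with the window size, but edges and fine
    # detail blur with it -- the trade-off AI denoisers try to escape.
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc, n = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += img[yy][xx]
                        n += 1
            out[y][x] = acc / n
    return out

# A hard edge (0 -> 10) gets softened right along with the noise:
print(box_denoise([[0.0, 0.0, 10.0, 10.0]] * 3)[1])  # [0.0, 3.33, 6.67, 10.0]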
3:51 - The asphalt looks nothing like real-world asphalt. It's nowhere near this reflective (both RT on and off). Notice the bus (and car) don't even have headlights on, for whatever reason (fewer light sources). The issue with ray-traced reflections is the same as it was initially: it's overdone.
Both Star Wars Outlaws (9:50) and Cyberpunk (10:39) ignore scattered light: the walls don't get illuminated by the extra-polished floors. The technology (ray reconstruction, etc.) can't support that many bounces, so to me they just appear less realistic - still overdone when it comes to reflective materials.
Wonderful that at least someone in the industry dares to speak about this! The industry is lobbying HARD to allow only overly positive things to be said about ray tracing, calling sceptical people haters, when in reality no one in their right mind has any critique of ray tracing itself, only of the manufacturers not producing hardware currently capable of delivering quality ray tracing! And we're soon 5 years into having RT hardware, too! So it is definitely not the gamers' fault!
Now let's go with Oled flickering problem :D
Ray tracing was introduced to tank performance, to force planned obsolescence, and to make development easier and cheaper, flat out. Kind of like how the only reason 2.5G and 5G Ethernet were introduced is that they were cheap to produce using current materials.
Games have had an image clarity problem for a long time now. The over-reliance on TAA-based anti-aliasing solutions has contributed heavily to that. And that includes TAA-based upscalers.
You are blaming the fix for the problems of the culprit. TAA is NOT the culprit; TAA is the aid. The culprit is the switch from forward shading to deferred rendering around 15 years ago, which renders the game into G-buffers and combines them in a quite fake way, like layers in Photoshop, into the final image, so classic MSAA is not feasible anymore. TAA is a necessity to fix the new problem the industry introduced. Deferred became popular because it can MUCH more efficiently use complex lighting and shade very dense geometry, so screenshots suddenly became much prettier, and studios still using forward rendering were mocked by gamers for using archaic, outdated engines - except maybe Valve.
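A back-of-envelope cost model of that switch (made-up numbers, Python as pseudocode) shows why deferred won despite the MSAA casualty:

```python
# Illustrative numbers only -- real renderers are far more complicated.
fragments = 4_000_000     # shaded fragments per frame, overdraw included
lights = 200              # small dynamic point lights in view
light_coverage = 20_000   # screen pixels each small light actually touches

forward  = fragments * lights                   # naive forward: every fragment shades every light
deferred = fragments + lights * light_coverage  # G-buffer pass + per-light pass over covered pixels

print(f"forward : {forward:,} light evaluations")    # 800,000,000
print(f"deferred: {deferred:,} light evaluations")   # 8,000,000 (~100x less)
```

With these assumed numbers, naive forward shading does about 100x the lighting work, which is the pull toward G-buffers - and the G-buffer "layered" composite is exactly what made plain MSAA stop fitting.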
I find shimmering from temporal AA way more annoying, since it's prevalent in every game, ray-traced or not. Imo one of the main reasons people wax nostalgic about graphics from the early 2000s is that pretty much all games were forward-rendered, where everything looked crisp with MSAA (or SGSSAA if you were a baller).
One note, running 4k native or capping to 60 fps will make temporal artifacts more apparent. The reason is obvious; temporal algorithms accumulate data over multiple frames, thus they will require more time at lower frame rates to converge which means you have more time to see the artifacts. Something that is obvious at 60 FPS may be a lot less obvious at or above 100 FPS. This is a case where lower resolutions or more aggressive DLSS/FSR/XESS settings lowering the overall number of pixels can actually increase the stability of the image because the higher frame rates also means a higher sampling rate per unit of time for the temporal accumulators to do their jobs.
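The frame-rate dependence is easy to see in the exponential-moving-average accumulation that most temporal techniques boil down to (the alpha and residual threshold here are illustrative guesses, not any particular engine's values):

```python
def frames_to_converge(alpha=0.1, residual=0.05):
    # Each frame: output = lerp(history, current, alpha).
    # Count frames until only `residual` of the stale history remains.
    history_weight, frames = 1.0, 0
    while history_weight > residual:
        history_weight *= (1.0 - alpha)
        frames += 1
    return frames

n = frames_to_converge()  # 29 frames with these numbers
print(f"{n} frames: {n/60:.2f}s at 60 FPS vs {n/120:.2f}s at 120 FPS")
# Same frame count either way -- but half the wall-clock time at 120 FPS,
# so higher frame rates hide temporal artifacts sooner.
```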
This ultimately leads to the same conclusion: ray/path tracing needs more samples to get rid of the noise.
The TAA era of graphics is simple: any effect that's too difficult to run in real time on most hardware (consoles & xx60-series cards) gets its resolution/samples/rays reduced, then temporal accumulation is used to clean it up.
The result is byproducts like motion smearing, ghosting, the oil-painting/vaseline aesthetic, boiling, grain, etc. Temporal accumulation is a bad kind of optimisation since it comes with severe drawbacks; it's not "free" performance like other tricks. Developers, and especially engine makers, need to quit building their products around it.
Temporal accumulation is cool on paper. Getting information via past frame data instead of by increasing the pixel count is a neat concept.
The problem, though, is that while it's significantly less expensive than super-sampling, reusing old frames degrades the image quality in many ways, so you're not really enhancing the image; you're trading one flaw for another. And some of these flaws (like the motion smearing) have legitimate accessibility concerns, since some gamers get motion sickness from it.
Motion sickness is the #1 accessibility issue in gaming, so it's not a great industry standard for that reason alone, on top of its controversy. We need to make more efficient lighting systems that can run at or very close to full res, instead of relying on TAA to make it possible.
Well put. I agree
Not going to happen with the number of dynamic lights in current games. Current games use deferred renders not because they are more intuitive but because deferred renders allow the use of sub-native resolution lighting and shadows, and also don't linearly lose performance from additional lights in the scene. A string of Christmas lights swaying in the wind (e.g. a massive amount of dynamic point lights) would be impossible without the use of deferred rendering. A realistic flashlight would massively affect frame rate without deferred rendering. Destructible environments would pop out of the image.
@henryzhang7873 I think the disconnect we're having here is that you're conflating DR = TAA and FR = no TAA. There are a ton of games that use DR without relying on TAA, where the image doesn't look broken when it's disabled, or where the developer doesn't force it on. These aren't limitations of DR; they're design choices, mostly with the goal of speeding up development.
It's like games shipping with only RT and no raster mode; it's the same line of reasoning. A developer forces TAA and builds the game around it so they don't have to spend more time mitigating aliasing without it, resulting in people having to use it.
If the new Indiana Jones game had a raster mode, it would've taken more time to make, and time is money.
I don't know why more people don't talk about this. RT in stills can look pretty good. In motion, RT looks AWFUL, and I feel like everyone else is blind to it. Fizzle, blur, and lag in reflections are horribly distracting.
1:05 Damn, are we talking about ray tracing here?
Coom Tracing
Gyatt tracing
This video will hurt a significant amount of Ngreedia fans
Image quality has taken a nosedive this generation. Games are becoming increasingly blurry and noisy.
Have been for a long time
All the way back to the good old VHS quality :D.
Don't forget the incredible depth of field.
Long story short: we're still 10+ years away from having enough RT-power in mainstream GPUs for it to look right.
I personally don't care much about higher fidelity reflections and other such improvements. Not anywhere near worth the massive (>10X) increase in compute and electrical power requirements vs raster simulations. I'm the kind of person that turns off shadows if that makes a 1fps difference because I don't care about them either.
Which is why the current push for RT is bad, as games are starting to not let players disable effects like motion blur, TAA, and even RT, courtesy of that new Indiana Jones game. We're not ready, but game developers act like everyone is on a 4090.
Black Myth: Wukong is one of the worst offenders when it comes to grainy presentation.
Don't worry, Nvidia will launch a new "ray trace de-noisation" feature exclusive to the 5000 series
Games look uglier than they ever have now, with all the RT noise, upscaling blur, and the ghosting and smearing from framegen.
Horrible time to play at 1080p: expensive cards that run and look like shit
Totally agree with everything you say here. It's driving me crazy
It's insane how 1440p has become the new 1080p, not because hardware is so powerful that it's easier to run higher res, but because software has gotten so shit that 1080p now looks like 720p
This this this! Why do so many sheep tolerate and even defend the blurriness and lousy performance of modern games?
Disagree. There's always been trade offs in games. Calling people names over this stuff? Just stop.
I first experienced this odd artifacting when playing The Callisto Protocol, where it would make spiky walls look like a grainy CRT TV showing static. Great video!
My issue with RT is that it assumes everything is shiny and clean to the point the lighting and reflections become very unnatural in all but a very small handful of titles compared to traditional rasterization, most of which have Nvidia heavily involved and are little more than Nvidia marketing projects.
It totally depends on the surface and material work of the developers, not the RT or PT itself
That's not an RT problem, it's a material problem. Also, it's an old and stupid argument. Go outside sometimes and look at a calm puddle; is it not mirror-like?
I always noticed that rt looked a bit blurry at times, but I always thought it was the upscaler's fault. Turns out I just don't like rt or upscaling.... I'm sure rt will get better, but our hardware isn't ready yet.
Anybody who has had to deal with Blender's Cycles rendering engine knows the struggle with noise and something called "fireflies" (random bright pixels). I personally like a little bit of noise in my images, so to me it's actually a plus, but I can understand how it can be annoying when trying to enjoy games.
In static imagery it can be a boon, because it imitates high ISO in traditional photography. In motion is where it falls apart with noise.
I recently got a beautiful 4K OLED display, and one of the biggest things I've noticed, now that I can see games in all of their glory, is how nasty image quality has gotten in modern games. Going back 10+ years and playing older games at native 4K, you get these beautiful, crisp vistas with minimal distracting nonsense, and despite the complexity of models and lighting being technically lower, I often find these older games subjectively MUCH better looking. I think RT has been pushed a little too early, before consumer hardware was ready for it, and it has caused some pretty ugly results for lower- and mid-range GPUs especially. I have a 3070 Ti, for example, and in the 4 years I've had this card, I've turned RT effects completely off in just about every single game I've played, and that has consistently led to a better-looking game, with higher resolution and WAY fewer distracting artifacts. Ray-traced effects are the future, I'm sure, but as of now I'd suggest just about everyone turn them completely off and enjoy your massively increased image quality and frame rates, which 95% of the time matter a LOT more to your general gaming experience.
2025: playing on a 4K screen with 320x240 rendered frames, then upscaled, with ray tracing... and all that at 30 fps... thanks Jensen
Ray Tracing has to be one of the most expensive scams in GPU history and gullible consumers are falling for it.
It does look amazing when done well though, hardly a scam.
I say that as someone who doesn't think the performance cost is worth it a lot of the time; then again, my 3070 can't handle it most of the time anyway.
Wait 5 years and see.
Raytracing is overrated af. Keep up the great work you guys and stay safe.
Yeah, I was heavily saving up for a 5090 after path-traced games started coming out. But honestly, I think I'll wait for image quality and performance improvements in ray-traced games before I upgrade my aging RX 580 gaming PC.
It isn’t overrated, it’s still too difficult for now to get a good real-time implementation. Real-time path tracing has been the ultimate goal for decades.
your eyes are overrated
Games have gotten more blurry over the years, which imo is the most distracting trend out there. It got to a point where some Gen 7 games look better than current year, blurry mess.
Ray tracing is mainly a useless technology for which there will never be enough hardware.
It constantly makes people accept DLSS (reduced resolution) and DLSS 3 (imaginary FPS + input lag) just so games work at all.
Take the lineup from 2019, 2080 (Ti) + 9900K, vs 4090 + 9800X3D. The difference between these two lineups is about 50-70% in performance, but games look worse today; optimization has mostly disappeared and everything is left to DLSS.
Ray tracing and DLSS are evil for PC gaming. Previously, DLSS was presented as bonus FPS. Today it is just an optimization crutch for developers to reduce game development time. And ray tracing can swallow 50-60 FPS, and path tracing even 100 FPS.
While the 4090 can handle The Division 2 on ultra in 4K at 120 FPS, it only handles newer games, especially Unreal Engine 5 games, at similar FPS with DLSS 2+3... What's the point of constantly upgrading your PC?
It is possible to do RT with the right art direction and optimisations... You had a video about an indie game that managed to run some RT even on a GTX 1060.
Hardware just isn't ready for raytracing yet. We need multiple "cheats" to get it to run reasonably and then it still looks like this.
Can we just accept that we need another decade before raytracing becomes actually feasible instead of trying to cram something in the games that most players cannot even experience at all and those that can get mediocre mush?
Basically, ray tracing is like high ISO in photography
So that's where that grainy effect comes from; I'm quite tolerant of visual artifacts, but those grainy/gritty shadows are a hard stop for me. It's really agitating, as it creates false movement that draws the eye toward it when there's nothing good to see.
Nvidia is getting ready to launch DLSS 4 on the 5090 at $3k, with a real-time denoiser that's 4x the 4090's.
No amount of denoising will help the lack of proper material diffusion and light scatter. In pretty much all of the examples in the video, materials with reflections don't appear realistic at any rate. A glass mirror is one thing; diffused light in wood, plastic, or sub-5000-grit polished metal is entirely different.
A generation easily skipped
I would argue that RT rarely improves the picture. In the vast majority of games, the materials are not adjusted for RT, which degrades not just the style but also the "realism." In the same example - Hogwarts - the granite on the floor should not be so shiny.
RT has been utter dogs**t for gaming and GPUs. It's made them unnecessarily expensive, it offers largely garbage results for most people, and games contort themselves around this "feature" that most people turn off anyway. I absolutely hate it.
Which is exactly what NVidia wanted and many braindead gamers, spurred on by some unapologetic cheerleaders like Alex Battaglia, just ate it up 🤷♀
Great to see that someone finally brings up this issue. I'm a person who always turns off things like motion blur, film grain and such effects, because I want a clear picture. This effect is what kills RT for me, apart from the insane performance loss.
"We'll fund your game's (and game engine's) development if you force it on. It makes everything shimmery and blurry, makes games run slower despite consuming twice the power, and 80% of the things it does can be done more efficiently with other techniques, but the performance hit is even _bigger_ on our competitor's GPUs, so it makes us win benchmarks, and we'd rather see the game run at 40 fps on our GPUs and 30 fps on the competition's GPUs than have it run at 100 FPS on ours and 101 FPS on theirs."
I assure you no developer is relying on Nvidia funds to make games. ffs
@@maynardburger That's true but because they would never see a cent of it either way, publishers are the ones making the deals and keeping the profit.
@@maynardburger lmao. Watch the intro for GTA San Andreas from 2004: big-ass Nvidia logo. Do you think Rockstar put that in there for lols?
@@maynardburger You keep posting comments and catching L's, you're not even getting paid for it lmao
This is frankly the real reason why RTX adoption has been lacking. It's not that it doesn't make a difference, but that it looks outright unappealing at times.
I will always prefer a clean video-gamey look over a messy photorealistic look. UE5 seems to be taking the former away for good, though.
Man... So often it looks like pebble glass.
No one will notice anyway through all the blur artifacting and ghosting modern games all seem to share lol .
The denoiser has lag, which causes reflections to become blurry or misplaced. It's all because current GPUs are too weak for full-resolution RT effects.
It's actually the developers' fault. Nvidia said at first to use only one RT effect, but developers want everything for some reason.
This isn't at all true, as these problems exist even when there's not any combination of different RT techniques being used.
@@maynardburger Yeah, on first-gen RTX cards. Current cards can run full res, but they don't, because they've added all these extra effects. It's even worse on consoles; PC RT problems are nothing compared to the PS5/Pro.
TAA is the worst thing this generation. In the attempt to hide all this noise, it just trashes the image.
You guys have no idea what you're talking about. Without TAA/reconstruction, these games would be trashy messes of image quality. Sharper, but horrible looking.
@@maynardburger Man, if only we had dozens of other AA methods that all look better.
Oh we do.
TAA is required to hide these effects the way they're done. Or rather, no one would do these effects this way if TAA didn't exist. One wonders why 8th-gen multiplat games on PC look better than 9th-gen multiplat games on PC, while performing better.
@@maynardburger You sound like TAA is literally the only form of AA that can exist in this solar system, and if we dare attempt any other implementation, Cthulhu will awaken and eat us all.
I can't be the only person who thinks games look better with RT turned off in the first place.
I can count on my fingers the games where I think it actually makes a worthwhile enough difference to be worth burning half my fps and turning on upscaling or frame gen for.
In most games it does look better with RTX off; only in Java Minecraft or Cyberpunk does it look better on
Hogwarts Legacy is the example for me. Everything becomes so shiny and oily just for the sake of having more reflective surfaces to show the effects off.
@@102728 That game is the perfect example of what's wrong with RT. I get that the developers aren't out much, but a floor isn't supposed to act like a mirror (even if it was just cleaned). I hate that they add shiny things just to show off in some game preview and say "we have RT (so this game is instantly better, obviously)"
Thanks, Hardware Unboxed.
You just helped me understand why ray tracing in the games I play never feels like a visual upgrade to me, and why most of the time I really dislike it.
If resolution is to blame, then dlss as a solution for rt performance drop just makes everything worse.
I don't have this problem...(cries in 1070ti)
This is honestly another great gripe of mine against RT... when using RT reflections in a game like Cyberpunk, especially with aggressive upscaling like "Performance" mode, rough reflective surfaces get almost MPEG levels of artifacts from the noise and denoising algorithms. I find this MUCH more distracting than high-quality SSR. Especially since weapons take up a large part of the screen and often have rough metallic surfaces.
Not just ray tracing. I wish you guys took a look at TAA, as we need high profile channels to shine the light on the TAA disadvantages.
HUB doesn't hate TAA cuz they're not idiots and know that the alternative is unacceptably messy image quality.
@@maynardburger As opposed to the unacceptably blurry image quality we have right now; really good choice.
@@maynardburger Except TAA is the messiest pile of garbage...
@@maynardburger at the very least people should know what they're picking and allow us to choose. I'm glad you like TAA, but don't you think choice is a good thing?
Effects are never used in a realistic, subtle fashion if they eat up a lot of performance. Games will smack you over the head with them to make sure you notice they have the fancy new thing. Same as with colored lighting, lens flares, bloom, and motion blur.
Ray tracing needs to go away for 5-10 years. Then come back in full force.
The timing of this video is absolutely wonderful, since the new Indiana Jones game requires hardware-based ray tracing. I hope this thoroughly halts that trend in development. These issues, plus the fact that even the cheapest cards take a more than significant performance loss with RT enabled? It's just not beneficial to the customer, to put it in a very nuanced manner.
1:05 I think we all know the real reason this was put in
For years everyone has talked about "hardware ray tracing" units, but they're only doing a few ray bounces, which makes a really noisy image. It's the tensor cores that are doing the real legwork, denoising what they can to make it look... passable. And that's before they have to do even more denoising when you're inevitably using DLSS to upscale from a lower resolution so you're not getting 35 fps. The RT filter we apply to rasterized games was a gimmick in 2018 and 2020, and it still is today.
They could double the samples per pixel, but you'd get double the performance drop lol
An RTX 5090 is gonna be necessary for that performance drop to be worth it
@@yancgc5098 You can increase Samples X2 - X4 - X8 - X16 it will just need more and more juice. Current RT/PT is optimized for current hardware.
@@club4ghz I can’t even imagine a GPU that’s 16x more powerful than a 4090, I feel like we’re gonna hit a wall way before that happens
What a weird situation. Cutting edge games with "the best graphics" genuinely look worse than last gen games running 4K native with conventional baked lighting. The massively superior frame rates and perfect image clarity found in older titles is so much nicer to look at in motion than this modern stuff full of upscaling artifacts, TAA ghosting, and raytracing noise. What is going on? Why has the industry gone in such a bizarre direction? Surely developers and graphics artists can see how ugly these features are?
Dynamic objects. If you look at the original Crysis, lighting inside a destructible building is very different from that in a static building that can't be destroyed. You can interact with almost any object in modern games, whereas in older games you couldn't even move a cup that was placed on a table. Now those objects are lit the exact same way as the rest of the world, while in older games you could only rely on ambient occlusion to make objects "pop out."
PBR alone makes games today look way better than before. But PBR also comes with the problem of complex shaders and shader aliasing, which demands games use some kind of temporal AA solution to keep the image stable. You guys are still living with the old mindset of "sharper = better," but we can't have both super-sharp visuals and PBR without a complete mess of pixel shimmering everywhere. It turns out professional graphics developers aren't all incompetent morons, and you guys might simply not understand this stuff at all.
Will you also make video about all the artifacts that rasterized techniques cause? Like SSR disocclusion, shadows not rendering at the edges of the screen, SSAO/SSR disappearing when the occluded object is off screen, etc..
I remember when it was being speculated that newer generations of RT cores would make turning on Raytracing a zero performance loss, but it never happened, even in the 4000 series, you turn on raytracing and still lose a massive amount of fps, just a bit less than previous gens. Let's see what the RTX 5000 series brings to the table.
0:10 But ray tracing does not improve visual quality at all. It makes it so much worse.
Rage bait used to be more believable, troll
That's how great marketing works... selling a feature 10 years before it's of any use to most gamers...