I would argue texture quality matters much more than ray tracing at this point, and has much less performance impact as long as the VRAM is sufficient. And that's exactly why Nvidia deliberately releases cards with good compute performance but abysmally small VRAM: to force upselling.
and artistic direction. A game with good artistic direction is more compelling to me than a game acting as a mere technical demo, and it will age better.
Texture quality does matter, but people end up being silly about it. Most often you can drop it a quality setting and it still looks the same; heck, the differences aren't noticeable even in side-by-side still-picture comparisons. And yet many people subjectively feel they're losing a lot if they don't max out textures.
Since the 4090 you can play RT max settings games at 1440p or 4K Performance without issues; especially with frame gen it's smooth enough for the massive visual gains. Once you go RT, you'll have a hard time dealing with all the issues and visual bugs of the hacks traditional rendering introduces.
@@msnehamukherjee Even most nvidia users don't use ray tracing, even though they pretend they do in online arguments. Ray tracing is useless, get over it.
Tim, those thumbnails are works of art! "On a 4090, all 3 path traced games run at below 50 FPS at 4k using DLSS quality upscaling" - so that's a definitive NO.
@@HyperScorpio8688 4K DLSS Performance is 1080p upscaled and looks significantly better than 1440p native (which is nowadays considered the sweet spot of gaming). Ultra Performance is 720p.
@@HyperScorpio8688 With 4K output it's 1080p on DLSS Performance I think, which is somehow still OK. Ultra Performance gets massively blurry areas though.
30% drop _might_ sound acceptable _sometimes_ but this is a 4090. I suspect weaker cards will not only have a lower absolute FPS but also higher performance hits.
I enable it on medium since I have a lower-end card (3070 Ti), but I think ray tracing is still the future, because we actually see by light bouncing around a scene, not by tessellated geometry that fakes emissive light or hand-placed artwork.
@@ansonang7810 Ray tracing will always be "the future", because ray/path tracing is essentially an infinitely scalable technology. nVidia releases a new $4,000 RTX 7090 GPU that can run 3 light bounces per BVH node at 4k120 in the latest game? Game studio can simply change to 6 light bounces per BVH node and that shiny new $4,000 GPU is back at 1080p60. Then nVidia can sell the $5,000 RTX 8090 that will do 1440p60, then the $6,000 RTX 9090 that once again does 4k120. Then the game studio simply increases to 10 light bounces per BVH node, rinse and repeat until the heat death of the universe and the RTX 31090 costs $42,000. If you think nVidia won't abuse this to con their loyal fanbase into buying new overpriced GPUs, you weren't paying attention when they did this EXACT same thing with tessellation that they're doing with ray/path tracing when they ran their GameWorks scam.
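To put rough numbers on that scaling argument, here's a back-of-the-envelope sketch (all figures hypothetical; real per-pixel ray and bounce budgets vary per game):

```python
# Rays cast per frame grow linearly with the bounce budget, so a studio
# can always soak up whatever headroom a new GPU adds. Figures below are
# made up for illustration, not taken from any real game.
width, height = 3840, 2160       # 4K output
samples_per_pixel = 2            # typical real-time budgets are tiny
for bounces in (3, 6, 10):
    rays = width * height * samples_per_pixel * (1 + bounces)  # primary + bounce rays
    print(f"{bounces:2d} bounces -> ~{rays / 1e6:.0f}M ray casts per frame")
```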
@@ansonang7810 Who cares if it's 30% when you're still close to your max refresh rate? Which is not the case for cheaper cards, I know, but I always use it with my 4090.
I always use it with my 2060 Super if the RT is actually worth enabling and isn't just some shadows or some reflections with a poor ray count. Even halving the FPS would be worth it as long as it doesn't go under 30.
This is why I run a 7900xtx. I don't use ray tracing and the 4090 is currently $1900+ on Newegg. The 4080 is $1000+ and only has 16gb of VRAM compared to the $880+ 7900xtx with 24gb of VRAM. I like the settings cranked up with a high frame rate and this does the job as long as I don't use ray tracing, which is still basically a joke in most games. A huge performance hit for not much of a difference visually for most games.
I think sometimes RT is worth it, but it looks like RDNA 4 will have more ray accelerator units per core, so it should be faster at RT on a hardware level. By the time AMD and, why not, Intel GPUs are as good as Nvidia's in RT and it's no longer an RTX selling point, games will probably have evolved to use RT technology better.
@@lucazani2730 On almost anything below a 4080/7900 XT, RT at 1440p requires upscaling. IMO the visual loss from ghosting plus the loss of performance isn't compensated by the benefits of RT.
@@be0wulfmarshallz Differences worth playing at significantly lower framerates? Because whether you've spent less than $1000 (7900 xtx) or over $2000 (rtx 4090) on just one single part of your computer, that is the trade off made in games with any appreciable improvement in visual quality. Smooth is a quality too, don't forget.
What are you using the extra VRAM for tho? It's not like the 7900 XTX will be relevant for new games even 2 years from now, it can't deal with AI tasks...
in Doom "XTX is not hopeless in RT [...] but 4090 is quite a bit better" i sure hope so for card 60% more expensive at MSRP (which doesn't correlate acurately with 49% of performace), but at this point it's 889 to 1799 on best buy, so double the price now
@@aviatedviewssound4798 it's not a buying guide so price is irrelevant here. The point was to compare how well the architectures of either company support RT and if it's worth turning on. Makes sense to use the best RT cards from either company and then look at the % impact on performance when you turn on RT. It doesn't matter that the Radeon card is half the price because the video isn't addressing price. It's isolating whether or not the experience is good.
@LucidStrike And made them need to do 50% more testing for the same video with only marginally more useful information in it. And then someone would have complained about not having the 7900xt, and so on. The point is the 4080 wasn't very relevant to the content because the video was not considering price as a variable. They were looking at whether or not RT is viable/useful/recommended, and given that the best GPUs from either vendor still aren't an easy "yes" you're supposed to extrapolate what that means for your GPU of interest based on the myriad other benchmark videos available.
From all the data in these two videos my conclusion is: only in a handful of games is ray tracing worth activating, and only if you have a really fast and expensive video card. But if you do want to use ray tracing, go all in with Overdrive!
My 2060 Super has used RT in many, many, many games. This list is bad and contains a lot of "RT" titles from the early days. Where is Silent Hill 2? Where is Casting of Frank Stone? Until Dawn? ANY recent UE5 title? Why are we doing this for "RT" from 2018 games, ffs. I just finished replaying Casting of Frank Stone on Ultra, 1440p DLDSR+DLSS Quality with hardware RT, on my 2060 Super. If a game only does reflections or only does shadows with RT, I don't think it deserves to be on this list over so many games that aren't. You also don't need an expensive card. You can play on low DLSS at 30 fps; it's more than worth it.
I find ray tracing to look best on cyberpunk style games, like The Ascent and Cyberpunk 2077. Other games it is much less noticeable to my eyes. I think all the neon signs and colors complement ray tracing well.
Those are obvious though. Smaller details, like all the shadows and reflections on various surfaces and the bounce lighting, are all done in real time, and to a particular eye it's hard not to geek out over it.
I am a visuals snob and always crank things too high. In games where it is well implemented, it is absolutely worth it. There are some games where I turn it down because I can’t tell the difference. That I can’t tell the difference usually means it was bolted on at the end and not a part of how they built the game. Cyberpunk / Alan Wake / Metro Exodus are probably the best ray traced games. Some of the examples in this video are examples of games where there’s almost no point to using ray tracing because the devs never really intended for it; they just added it to check a box.
The biggest problems with RT are noise and artefacts. Because current generation hardware isn't powerful enough to run games at a high ray count, denoisers have to be used to fill in the missing pixels where rays weren't able to properly get to. Denoisers blend the sampled rays together, leading to low-quality and inaccurate RT effects. Current ray-traced games use a low ray count, which causes a noisy image with artefacts. More rays and bounces mean higher image quality, more accurate RT effects, and less noise and artefacts. Increasing the number of rays and bounces is the most effective way to make RT/PT look substantially better. The problem is that the higher the ray count, the more demanding it becomes, which is why it's not a viable solution yet. Video games still haven't unlocked the full potential of RT graphics. Once hardware is powerful enough to run games at a high ray count, ray tracing will look absolutely phenomenal without all the horrible noise and artefacts. RT is the future of gaming graphics, at least for single-player games.
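Since the ray count vs. noise relationship is the crux here: a minimal Monte Carlo sketch, assuming each ray just returns a random brightness sample (a real renderer samples actual light paths), showing noise falling roughly as 1/sqrt(N):

```python
import random, statistics

def pixel_estimate(n_rays):
    # Average n random light samples; the true mean here is 0.5.
    return sum(random.random() for _ in range(n_rays)) / n_rays

for n in (4, 16, 64, 256):
    estimates = [pixel_estimate(n) for _ in range(1000)]
    print(f"{n:4d} rays/pixel -> noise ~ {statistics.stdev(estimates):.4f}")
# Noise falls roughly as 1/sqrt(N): 4x the rays only halves the noise,
# which is why real-time ray counts stay too low to skip the denoiser.
```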
Sometimes there is a 'swimming pool' water effect on some walls because of ray-tracing. It's like there is a swimming pool off screen and the water is reflecting on the wall, you can see a pattern moving as if it was water. I find that very distracting. There are many games that have this.
@@iurigrang If the ray count and bounces are high enough, denoisers become irrelevant. A higher ray count is way better at improving the quality of ray tracing than denoisers are. Denoisers have visual quality drawbacks. The only drawback of a higher ray count is that it's more demanding. However, it massively increases the accuracy and image quality of RT/PT while significantly reducing the noise and artefacts. It's by far the best and most effective solution.
@@iurigrang I hope it can get to a point where denoisers will become irrelevant. I am not saying they will become totally useless. I just hope GPUs get massive performance increases in ray-traced games so they will be able to handle running video games at a higher ray count.
@@iurigrang I disagree; human perception is also quite noisy at lower light levels, which is something denoisers erase. It's a surprisingly similar effect, though, and in the more immersive genres, such as first-person horror games, it's more beneficial not to butcher it with a denoiser. Also note that it's not just spatial denoisers at work; it also requires temporal accumulation. You can't really have a fully ray-traced game with flickering lights, for example. This is quite a big deal, because that's the part we can't precompute. If we were given something like a 10-fold increase in overall storage and data throughput, you'd see a much bigger graphical improvement just from accurate baked lighting, or from massive RT caches that would essentially turn RT into JIT baking.
It would have been a more interesting comparison to see how the 4080 compares to the XTX, being more similarly priced. Edit: interesting though that after 6 years of "raytracing" sales, not even the fastest card available (4090) is really capable of running truly beneficial raytracing settings at good fps in a single game :D But OK, that's at 4K; at 1440p it's maybe just doable. So basically... the Nvidia 50 and AMD 8000 series will be the first gen where raytracing is actually a meaningful feature to look for when comparing GPU performance.
Yeah, the video is silly in general. They should have done every card broken down by price per performance, and taken the actual games themselves into account (visually speaking, like Cyberpunk 2077). Once you play that game with RT on, it's just not the same without it. Nobody is looking to turn RT on in a competitive FPS, for example; they want ultra-high frame rates and clear visibility, not WOWWIE graphics.
Rather than waiting forever: current-gen Radeon and Nvidia cards are already quite capable of RT, as long as you're OK with upscaling. Because by the time the Nvidia 50 series or Radeon 8000 series launches, there might be a newer, heavier, more realistic game engine with even harder ray tracing.
It still won't be a meaningful feature for most players since there's no indication that you'll no longer have a huge performance impact, and it'll take another 1-2 generations before 4090 level performance is available at a true mid-range price point.
@@KanakaBhaswaraPrabhata HUB already tested with upscaling in the more demanding games in this video. This is really the max you can get out of them. OK, at about 50 fps you could argue that it might be enough to use frame gen as well, but you'd probably want at least a 60-70 fps base for that.
As I've been saying since the first Nvidia presentations, when they had to show zoomed-in stills and directly explain where to look and "how awesome and mandatory this should be to anyone": RT, at least for graphics (with negligible exceptions where things may look different, and even more rarely actually better), is not for actual gaming. It is for making A/B comparison videos where everyone plays the game of guessing whether it's on or not. That may be fun for some people, but I guess not the thing it should have been. And we all pay for this more and more, whether we want to or not; the RT tax is real. Nvidia's marketing genius (though cynical as hell) is undeniable. The problem we will likely see more often with broader adoption of the hardware is devs neglecting raster so RT can look better by comparison. Little by little the frog will be cooked, with more cost-cutting. The only thing is, people first need to broadly adopt it, and of course pay through the roof for it, so devs and publishers can pocket the savings. All under the pretense of "giving the gamers what they want"...
It depends on if you're staring at screenshots or video in a side by side comparison. I think the majority of people would be hard-pressed to notice most raytracing benefits when in motion and actually playing and enjoying their video game.
Not enough games implement Raytracing in a good way. In Cyberpunk and Alan Wake II even in motion you notice it, since it changes the whole vibe of the game.
Poor title choices. When RT is done right the difference is huge in how your brain perceives the lighting as real or fake game lighting. A lot of titles from like 2018-2020 had very limited use of RT. Just horrible choices to include them here over the many UE5 titles we even had this year that look fantastic.
@@haukionkannel How many of those actually get used for gaming? I doubt it's more than 5%; most of them will be doing either professional work or AI. The 4080 would have been a much better comparison. No one is debating whether to go 7900 XTX or 4090 for gaming; the 4090 costs more than the rest of the PC combined.
@@haukionkannel Because for people who don't care about price a 4090 will do. Everyone else will buy a cheaper Nvidia card and fools will go to AMD to get scammed by FSR.
What did we get from this video? RT is not worth it. The 7900 XTX is great value, being half the price of the 4090 while delivering 69% of its performance (without RT).
Is it really only 69% of the 4090? I could have swore there used to be times where it was almost on par or 10% below the 4090 in some reviews. Definitely not in ray tracing, but in rasterization.
RT is more than worth it. I will easily take RT at 30 fps over non-RT at 60 fps in any game that does RT indirect lighting. If you do just reflections, yeah, you're a terrible developer and should've done a proper RT implementation. The 7900 XTX is a mega scam: $900 to not have DLSS + DLDSR is crazy.
@@albert2006xp You are crazy, literally defending paying twice as much for two features that aren't really that different. AMD has Super Resolution, which is no different from DLDSR, and neither has frame gen unless the game supports it. And when the new cards come out, you'll end up using FSR after Nvidia cuts you and your $2k GPU off.
@@albert2006xp Because of people like you, Nvidia can afford to lie on their back and watch people queue for 2k+ euro GPUs. Hope you're excited to pay another 600 euros on top of current prices for another 30 fps in RT with the next-gen GPUs, just to be able to play at an average of 60 fps, lol. I've personally switched this year to 240Hz monitors and I simply cannot go back to anything below 100-120 FPS, unless I'm playing on a TV or some screen that only runs at 60Hz. Anyway, the point is that there are people who value real frames and low-latency response over pretty graphics; this cohort will always choose performance over visual fidelity. With that being said, there are plenty of games that do not need RT enabled, run buttery smooth, and still take (my) breath away.
Ray tracing is an Nvidia gimmick to make Radeon look worse. Don't believe me? Look back to when Nvidia pushed heavy tessellation in video games: minimal visual impact for a heavy framerate drop, BUT the framerate drop was not as bad on Nvidia as on Radeon, which is exactly what they wanted people to see.
Happy to have gone with AMD last generation. I get it, RT in certain titles is really cool if you have a 1000-2000EUR+ GPU budget. However, for mortals like myself with a budget of 300-500EUR, AMD has really just been the better option. On sale, the RX 6800-RX 6900 XT had about the same market price as the RTX 3060 Ti here in Sweden. So basically, if I went with Nvidia I would get less VRAM, less RT performance, and far less rasterized performance. And yes, Nvidia cards are seriously overpriced here, but people still buy them in abundance due to brand recognition and the media indoctrination about being "far superior" at RT. But when reality strikes, Nvidia has been on par with AMD in terms of RT performance in the mid-range, at least in this part of the world. Personally I mostly play VR, so I'm happily not caring about RT at all. :)
Not to brag, but I do have that budget and I still don't get it. I could buy three 4090s, one for each of my three gaming PCs, and not even blink... one to use, one as a backup for when the first has its plug melt, and the third as a shelf queen. But I'm never gonna, because FFS, unlike so many twits online desperate to brag about getting ripped off, I feel embarrassed when I get suckered into stupid purchases. Then the melting plug issue takes things to another level. Seriously, I wouldn't install a 4090 in any of my PCs if Jensen personally handed me a few of them for free. So I only take a 20% performance hit with a 7900 XTX and never have to worry about melting plugs? AND I keep $1000 more to throw at whatever else I may feel like? Yeah, this isn't a hard decision.
@@kenshirogenjuro873 I agree with you. Being 42 years old myself, I have enough income to afford several 4090's if I wanted to. My budget of 300-500EUR is solely based on how much I value to spend on gaming as a hobby... Considering that a 6800XT was 1/5 the price of a 4090 here in Sweden, and still takes me 90% of the way (considering most games I play don't use RT-, or are still highly enjoyable and good looking without it) I see no point at getting ripped off. I would rather invest that money long-term or spend it elsewhere. It's all about priority.
The much hyped AI-powered rendering will be here sooner than properly implemented realtime RT, considering the huge performance cost for a single ray traversal alone. Until then, a skillful game dev and artist can achieve good-enough visuals with standard pre-baked raster effects.
The performance cost usually isn't worth it, unless you have performance to spare, but what about the money cost? The 4090 is around 80-100% more expensive, and is rarely more than 60% faster in ray traced scenarios. When ray tracing isn't enabled, the 4090 is about 23% faster. That's it. The visual quality improvement is still very minor relative to the performance cost, and is often extremely minor.
Why do I get the feeling that RT cores and the RTX features were always intended as just a sales pitch to justify the future AI-happy hardware ahead of time?
There is an easy rule for me: if you sat a random player in front of a game and they cannot tell you whether RT is on or off, while the performance hit is more than 10%, it's not worth it. Period. Because significantly lower FPS in a AAA title IS in fact very noticeable in most cases, especially when you have spikes that drop you under 60 FPS.
There's an even easier rule, you have a PC, you have eyes, you can turn it on and off in your own game. To me, the answer is on 80-90% of the time. I don't mind lower fps as much as I mind having worse lighting.
I have an RX 7800 XT. I only turn on ray-traced reflections in Cyberpunk 2077, with Intel XeSS and AMD Fluid Motion Frames 2. On ultra settings (except shadows) I get around 60-70 base FPS and 110-120 with Fluid Motion Frames 2. Those reflections are the only ray tracing feature worth enabling. Edit: This is all at 1440p.
After you've had RT on for 10 minutes, you'll forget it's there and not notice it anymore, so save yourself a grand by buying AMD, and use that money to go to a foreign country for a week. Besides, you'll turn it off to minimize lag in anything competitive.
The game I play (X4: Foundations) can't even spell raytracing, much less display it. So, speaking for myself: so far I have never lost a single thought about the lack of raytracing performance on my 6800.
AMD lets us work at a low level; think GPUOpen. I prefer working with Radeon hardware mostly for that reason. Nvidia gives me whatever their particular flavor of this or that API is, while AMD gives me the hardware.
@@killerful I understand, but if the best-case scenario covers 4% of the market, it's an elitist analysis. Wouldn't it be better to focus on mainstream cards and resolutions to get a snapshot of a bigger and more relevant audience?
Whenever you have an On vs Off feature that takes effort to develop, you can never expect On to be truly game-changing. Devs will always spend the most effort on Off to suit the majority of players, and relegate On to whatever they can get for "free" plus a little extra they can devote the time to. This leads to things like how UE5 uses signed distance fields for fast RT approximation, which AFAIK don't benefit from RT cores. You *can* GPU-accelerate SDFs, but my limited knowledge of the subject says it will use the regular cores, in opposition to the rasterisation, while leaving the RT silicon idle. Supporting the vast majority of gamers is more important than fully loading Nvidia's speciality hardware, and the worst case scenario is that "RTX"-style raytracing never really gets more entrenched than it is now, with alternate techniques becoming the new standard for wider compatibility and better performance.
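For context, here's roughly what an SDF trace ("sphere tracing") looks like; a toy sketch with one made-up sphere, not UE5's actual code. Note that it's just distance queries and a marching loop on regular compute, with no BVH or RT cores involved:

```python
import math

def scene_sdf(p):
    # Signed distance to a fictional scene: one sphere of radius 1 at z = -3.
    return math.sqrt(p[0] ** 2 + p[1] ** 2 + (p[2] + 3.0) ** 2) - 1.0

def sphere_trace(origin, dirn, max_steps=64, eps=1e-3, max_dist=100.0):
    # March along the ray; the SDF value is always a safe step size,
    # so no triangle intersection (and no RT core) is ever needed.
    t = 0.0
    for _ in range(max_steps):
        p = [origin[i] + t * dirn[i] for i in range(3)]
        d = scene_sdf(p)
        if d < eps:
            return t          # hit
        t += d
        if t > max_dist:
            break
    return None               # miss

print(sphere_trace((0, 0, 0), (0, 0, -1)))  # ~2.0: hits the sphere's front face
```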
Yeah, they can develop for AMD hardware and target both PC and consoles with the same effort; Nvidia requires extra work (which I am sure they sponsor wherever they can benefit from it). Maybe when Unreal Engine 6 arrives the landscape will be different, but not as it is now. With a looming deadline and budget constraints, I would rather devs spend their time optimizing broadly and improving gameplay. Kinda like PhysX back in the day :D
I have a 4070 Ti and I am currently playing Dragon Age: The Veilguard. I have all settings on Ultra and ray tracing maxed out. It looks amazing. The issue is that in some forested areas the frame rate drops to 20 fps even with frame insertion, so basically stutter heaven. It really varies from 20 to about 140 fps. Would love this game to be looked at sometime to see why the variance is so high.
Good video, thank you. Just doesn't feel like RT is worth it, even after so many years. Question, why are you using the 4090 against the 7900XTX when the 4080 or 4080 Super is price/performance wise a better comparison? The 4090 is almost 2.5X the price of a 7900 XTX, right now.
I thought it was cool in minecraft but then that just died off so there isn't any more reason for me to care for it. Every other game looks pretty much the same to me with or without it.
@@valderon3692 Yeah, but the normal Minecraft version has almost no graphical quality when you look at the lighting, so that's the reason it's so different.
When path tracing is available (Cyberpunk, Metro Exodus, Black Myth: Wukong) it looks absolutely amazing. But I agree that in most games it isn't worth it.
@@_r4x4 In vanilla yes, but I have primarily played minecraft with shaders and other visual improvement mods for many years now. I still prefer the way RTX looked compared to any of them. Even though some of them use a form of ray tracing, it still never seems to work quite the same way. I don't think the tech will ever really catch on though. As soon as Nvidia stopped paying Microsoft to develop RTX, they stopped working on it. It's just not something most people care about or want to pay extra for, so most developers don't have much of a reason to include it in their games. Again, this is aside from the fact that it doesn't make much of a difference in most games.
Yeah, the 7900 XTX is 1000-1100 euros and the 4090 is 1600 euros; not comparable. Of course the 4090 is no longer in production, so its price is jumping higher, but the 4080 Super is the price equivalent of the 7900 XTX, not the 4090.
It all depends on your perspective. On my old GTX 1080, turning on raytracing destroyed performance, so I played without it. For example, in The Witcher 3 at 1440p I had 7 fps with raytracing on. Now, with the RTX 4080S, I get a perfectly smooth frame rate even with RT, so my perspective has changed. I get 82 fps at native 1440p TAA and 179 fps at 1440p DLSSQ + FG + full RT (the updated DLSS 3.7.2 looks sharper than native TAA in this game). The game looks incredible with RT and runs extremely well. Cyberpunk is another great example. On the GTX 1080 I could not even run RT in this game, but I had 30 fps at 1440p high settings without RT in the base game. Now on the RTX 4080S I get much higher fps with RT on the more demanding DLC map:
-67fps at 1440p native + ultra RT
-88fps + medium RT (RT shadows + RT reflections)
-172fps DLSS Quality + FG + ultra RT
This game also supports path tracing:
-37fps at 1440p native
-68fps at 1440p DLSS Quality
-120fps at 1440p DLSS Quality + FG
Thanks to DLSS technology I can play Cyberpunk even with the most demanding path tracing and still have a good experience. From my perspective the RT feature is absolutely usable, and I don't want to play without it unless there are stuttering issues (like in Star Wars Jedi: Survivor).
Great content, love the follow-up to the previous RT video; hope to see more videos like this when the next generation of cards releases. I have two thoughts/requests I'd like to share:
1) Is the image output quality equivalent between the cards? (I assume so, but I wonder whether the graphics drivers trade quality for performance; it has happened in the past with a few titles.)
2) I would love to see a HUB "Cost per Frame" chart analysis when using ray tracing for each of the cards.
Regarding cost per frame: eyeballing it from your charts and prices here where I live, the Nvidia 4090 is around 25% better than the 7900 XTX in cost per frame in the "good, non-path-traced" titles, while Nvidia is 50% better in cost per frame when also including the path-traced titles. Generally AMD has better cost per frame in rasterized rendering, but now that we are starting (emphasis HEAVILY on *starting*) to see some value in RT rendering over raster (in some titles at least), this "RT cost per frame" chart would make sense to include in benchmarks down the line, or at least be worth considering.
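For anyone wanting to run that eyeball math themselves, a trivial sketch (prices and frame rates below are placeholders, not HUB's measured data):

```python
# Hypothetical local prices (EUR) and RT-on average FPS; substitute
# HUB's measured numbers and your own local prices.
cards = {
    "RTX 4090":    {"price": 1900, "fps_rt": 90},
    "RX 7900 XTX": {"price": 950,  "fps_rt": 55},
}
for name, c in cards.items():
    print(f"{name}: {c['price'] / c['fps_rt']:.2f} EUR per RT frame")
```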
I have to question your use of the RTX 4090 because it skews results two ways:
#1 - The percentage of gamers that can buy this card is minuscule. Hell, even the XTX is WAY out of most gamers' budgets at its current price of $870 USD, let alone the RTX 4090, which is MORE THAN DOUBLE the XTX's price at $1900 USD. The use of the RTX 4090 is especially unrealistic.
#2 - It makes RT look more reasonable than it really is. All you've done is shown people that if they have $1900+tax to spend, they can use RT. Showing RT on the only card that can actually use it ALL THE TIME, without mentioning "Oh, BTW, this card is almost $2000 USD, so good luck with that!", doesn't help anyone.
You also speak of performance differences between the two cards, but you have to remember to give context. Not everyone knows that the RTX 4090 is more expensive than the RX 7900 XTX. Hell, even fewer know that they can buy two RX 7900 XTX cards and still get money back from the cost of a single RTX 4090. I really think you should have used the RTX 4080 Super, because at least it would show RT performance at a (somewhat) realistic price point while being priced (somewhat) close to parity with the XTX. Really, really though, I think using the ~$500 price point would have been better, like an RTX 4070 Super vs. an RX 7900 GRE. Yes, the numbers are going to suck, but that's the point. People should know that even at ~$500, RT will either be bad or terrible regardless of which card you buy. And yes, I don't know where you got the $1600 USD price from, but PC Part Picker shows the least expensive RTX 4090 at $1900 USD, a Gigabyte Gaming OC model. Maybe they're jacking prices up to make $1600 look good on Black Friday, but Radeon sure isn't doing that, because $870 for an ASRock Phantom Gaming model is a phenomenal price.
I played cyberpunk with pathtracing at 1080p 30-40fps with a 3070 and still had fun. And with my 4070ti I play with pathtracing fine at 1440p high. I don’t need 100+ fps. It’s a single player game
@@JoaoMXN The only thing I don't like is that they only tested everything at 4K. Couldn't they have also shown whether the performance reduction is as significant at 1440p? If you google it, the share of people playing at 4K is just 2%. I know 1440p isn't much better at 12%, but at least it's a better representation, a middle ground for people to determine whether the information is relevant to their rig.
I do take issue with the claim that some instances of RT "clearly improve the visuals". It was more that some don't "clearly improve the visuals", but that very much implies that some do, and that is an entirely subjective take. I remember you guys loved the RT in Wukong, but I found it jarring, and having grown up in the country, nowhere near as "realistic" as y'all said it was.

Visuals are the kind of thing where there is no "objective best"; it's all purely subjective. Something you might find outdated and ugly, someone else might find visually stunning or impressive. Think of it this way: the Mona Lisa is a great painting, very well done obviously. Put it next to a framed version of, say, the Grand Festival promo art for Splatoon 3's final Splatfest, and I would imagine that if you took 500 people who had never seen either of them before and had no prior knowledge of either, the percentage who liked each piece would be fairly evenly split. You would need people going in blind, because if they knew the Mona Lisa was an old painting by a famous artist, they would feel inclined to say they like it more purely because of its history and not because they actually liked it more. Let's not forget as well that the painting "Onement VI", a canvas painted blue with a white line down the middle, not only sold for nearly $44 million but is almost certainly someone's favourite piece of artwork ever made. I look at it and see something a toddler could do; someone else looks at it and sees meaning. That's the beauty of art, and neither take is wrong. No art is better than any other art.

You could argue texture quality, but here is my counter: I personally think Star Ocean: The Second Story on the PSX is one of the best-looking video games ever made. The towns are iconic; they stand out vividly in my mind. Arlia, the small town you start in, with its fountain just outside the mayor's house; just south of that, the town's builder who left his kids to go work on a project in the next town over, next to a small farm enclosure, and right next to that the general store. You carry on to the entrance of town where you see the gate, weathered and showing the age and rural nature of the town. Then you get a beautiful bridge crossing a small river, where you can just about see the sun, but what you can clearly see is a very realistic reflection of the house on the north side of the river. As you take in the shimmering water you cross the bridge, passing a very aged-looking church of some kind, and find yourself at the entrance to the Sacred Forest, where you see the town's water wheel, the fences of a larger farm, and a small river boat. This is a game from 1998 that I STILL feel has some of the most visually appealing locations in any game I've ever seen.

Sorry for this wall of text; it's just something that has always bothered me. You can say "I think this looks better than that", but you can't really say "this clearly looks better than that", because it's all subjective. Everyone views art differently.
@@Mathster_live I agree with the OP on this. When RTX is built into the model names, I think it's an important investigation to show whether or not the lower end of the stack can actually use the feature they're sold on. The 4090 is a prosumer product; it's not relevant when trying to help consumers make informed choices, which I would assume is what this channel is about.
@@SansAppellation Bro, how can you not infer, using basic logic and reasoning, that if RT isn't really worth it, or the graphical improvement is negligible, even on the best of the best from both brands, then the lower-end cards are objectively worse with RTX on? You don't need to test every card lower in the stack; just use BASIC logic and estimate how they perform relative to the information at hand.
I'd like to see price-point RT testing. My friend is spending around 350 euros on a GPU. What's best to get? A 6750 XT? A 7600 XT? Or a 4060 Ti, or a 4060/3060 12GB? Same with price points of:
£/$200 - low end
£/$350 - mid range
£/$500 - high end
£/$750 - very high end
£/$1000 - enthusiast extreme
It's a great card, even with the lower raytracing performance. Raytracing is so worthless 99% of the time. Baked in, traditional lighting effects, still look gorgeous when done correctly.
@@TooBokoo RT is not worthless lol; just because the 7900 XTX can't run RT well doesn't mean it's worthless. RT looks incredible in most games. Also, FSR is very, very bad, and AMD has no such features at all. No software support either.
You missed the train. I bought it 8 months ago. No regrets. I invested the excess money people spend on a 4090 into a 49" OLED monitor and HDR capability. A much more enjoyable gaming experience than the RT feature. The XTX is a beast, but the future is dark if Nvidia holds a monopoly on the high-end segment in the near future.
Thanks Tim. It would be interesting to have a video where you check what's the minimum tier of hardware, for both brands, needed to enjoy Ray Tracing together with an acceptable level of image quality (eg: DLSS Quality at 1080p being not acceptable, or barely passable). In the video "We want cheap AMD graphic cards, so we can buy GeForce" you were making an excellent point: "how well does the tech scale to a 4060 type of product?". And so, a video where we literally check "how much dolla' do you need to enjoy RT?" or "can we enjoy RT on a 400 bucks GPU?" That would be very interesting.
As a 14700K + 4090 owner I can definitively say that ray tracing is NOT worth it. I didn't buy my 4090 for ray tracing but for the ability to play games at 4K @ 120fps on my LG OLED. Furthermore, something Tim neglected to mention is that you oftentimes need a beefy CPU for ray tracing as well. Imo Tim also should have shown two comparisons: Ultra Settings and Optimized Settings. For example, his depiction of CP2077 at Ultra w/ PT runs at 43 fps, BUT with optimized settings that number is at minimum doubled to 85-90 fps. Personally, if I'm running CP2077 w/ PT optimized, I'm running Balanced mode, which yields 95 fps in the city and 120 fps everywhere else. I think if Nvidia really wanted to make RT more desirable for high-end GPU users, who imo are mostly who it's for, they should try to implement an in-game toggle (hotkey). Obviously that won't work for every implementation of RT, but there is a mod for CP2077 that lets you toggle RT/PT on/off with a hotkey, which is great because imo the only times RT/PT benefits are at night and in slower gameplay, which doesn't really require higher frame rates anyway. So in short, don't let Nvidia bait you with ray tracing as a marketing ploy; it's mostly just a gimmick and still NOT worth it.
"There are no good examples of AMD sponsored games with strong ray tracing", yeah because you didn't include Avatar and you forgot that Callisto is AMD sponsored. And your subjective grading of Callisto's RT is absurd, just check digital foundry's tech analysis of that game's innovative RT.
Avatar fits right into the average difference between 4090 and 7900 XTX with "Good RT" titles, excluding path tracing, with the 4090 outperforming the 7900 XTX by about 55%. But also, you cannot turn off Ray Tracing in AFOP, so there is no baseline for non-RT performance, that is why it was excluded, I believe. Calisto Protocol is marked as AMD-sponsored at 29:45, and it is also in-line with the ~50% performance difference characterizing the "Good RT" - without PT scenario described in the video. I agree that RT in Calisto Protocol is better than HUB denoted, but even if included in the mix, it doesn't change the overall conclusion.
@@cpt.tombstone yes I was not commenting on performance or the exclusion of Avatar, just on the false narrative that AMD sponsoring always means holding back on RT. And I disagree about Avatar, it's one of the absolute top RT games below path tracing, once again I will take DF's analysis of RT over HUB.
Those few games are outliers and don't represent the conversation as a whole. Kinda like John not reviewing Hogwarts Legacy because of a small minority of outcries over lmnopqrstuvwxyz people. No amount of representation or analysis will ever be perfect. Lol
@@opinali Avatar is a good counterpoint, but it's still a trend that AMD-sponsored games are not really making RT shine. There are exceptions, just as there are exceptionally bad RT implementations on Nvidia sponsored titles.
I'd like a performance comparison between the 7900xtx, 3090 (both second gen raytracing hardware), and 4080 super (similar price and performance to 7900xtx)
But according to DF, once you zoom in 400%, you WILL see that extra RT shadow on a tree in the top-right quadrant. That is surely worth the FPS hit. NV and their shills are desperately trying to convince people to part with $700+ for these overpriced jokes of "mid-tier" cards.
Excellent video as always! My 2 takeaways: 1 - path tracing is pointless, as not even a £1500 GPU can run it above 60 fps, even with DLSS; 2 - Metro Exodus proves that RT can be both well done and fantastic-looking without crippling performance on all GPUs. I own the game myself, and using Very High rather than Extreme, with VRS at 2x, medium RT and hybrid reflections, it runs really well on my 7900 XTX.
Do keep in mind this is a 4K test. Lower resolutions + DLSS (most of us have 1440p displays and deem them adequate) paper over a lot of the performance impact. But I also think path tracing should ideally not be fully RT-based: it could use RT for near hits and SDFs or SH for far hits, a limited trace depth, or some sort of advanced ray reuse.
@@SianaGearz Path Tracing by definition is fully RT-based. The difference between "Path Trace" and "Ray Trace" in games is really a matter of either doing each pass individually (Classic Raytrace modes) or doing everything all at once in a singular pass (Path Tracing modes). Although in a technical sense, all Raytracing is Path Tracing. In computer imaging / rendering like in Blender, Path Tracing is an optimization of Ray Tracing that spreads rays outwards from the camera itself to capture the world (Games do this regardless of RT or PT) instead of the alternate and far less optimized method of spreading rays from every light source first and hoping the rays hit the camera.
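To make the camera-first loop concrete, a toy path-trace sketch: one made-up diffuse sphere lit by the sky, with rays starting at the camera and bouncing until they escape or run out of bounces (an illustration only, not any engine's actual renderer):

```python
import math, random

CENTER, RADIUS = (0.0, 0.0, -3.0), 1.0  # one grey sphere, fictional scene

def hit_sphere(orig, dirn):
    # Ray-sphere intersection; returns hit distance t, or None on a miss.
    oc = [orig[i] - CENTER[i] for i in range(3)]
    b = 2.0 * sum(oc[i] * dirn[i] for i in range(3))
    c = sum(x * x for x in oc) - RADIUS * RADIUS
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def trace(orig, dirn, bounces_left):
    # Follow one path outward from the camera, bouncing until it
    # escapes to the sky light or runs out of bounces.
    if bounces_left == 0:
        return 0.0
    t = hit_sphere(orig, dirn)
    if t is None:
        return 1.0  # escaped: the sky is the light source
    hit = [orig[i] + t * dirn[i] for i in range(3)]
    nrm = [(hit[i] - CENTER[i]) / RADIUS for i in range(3)]
    # Diffuse bounce: random direction, flipped into the hemisphere
    # around the surface normal (cosine weighting omitted for brevity).
    d = [random.gauss(0.0, 1.0) for _ in range(3)]
    mag = math.sqrt(sum(x * x for x in d))
    d = [x / mag for x in d]
    if sum(d[i] * nrm[i] for i in range(3)) < 0.0:
        d = [-x for x in d]
    return 0.7 * trace(hit, d, bounces_left - 1)  # 0.7 = surface albedo

# 64 samples for the single pixel at the screen centre:
samples = [trace((0, 0, 0), (0, 0, -1.0), 4) for _ in range(64)]
print("radiance estimate:", sum(samples) / len(samples))  # noisy, as expected
```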
Lmao. You either call it pointless because you can't tell the difference or pointless because it's a massive difference that looks amazing but it runs too slow. Gpus will continue to get better at ray tracing. Path tracing is extremely cutting edge currently.
@@SianaGearz But with a 4090 you're either playing at 4K or you want high FPS; if anyone buys a 4090 to game at 1440p60 they have more money than sense. And once you go down the GPU stack, the performance is going to be much worse.
@@joeykeilholz925 pointless in terms of performance, it looks incredible, in some ways it's generations ahead of mainstream tech, but the issue is the performance on a £1500 GPU is not what most PC gamers would accept (60FPS minimum)
@@albert2006xp DLDSR and DLSS is amazing but I wouldnt recommend to set DLSS to peformance because its drops quality too much. DLDSR and DLSS quality is best while balanced is as low as I would go because after that quality hit is too much.
@@Extreme96PL DLDSR+DLSS Performance is same render resolution as regular DLSS Quality and looks better. Of course you'd turn it up if you can but if you can consistently do that, time to go up a monitor tier imo. If you can turn DLDSR 1.78x + DLSS Quality on 1080p monitor consistently in games, you're better off going 1440p monitor and doing DLDSR+DLSS Performance from there to 1920p. Low end common cards (like the 3060 he mentioned) will probably be running DLDSR+DLSS performance from 1080p though and go from there if possible. The difference isn't that big as the difference between not using DLDSR and using it.
@@albert2006xp DLDSR 2.25 and DLSS quality render at 1080p I think. I use DLDSR with DLSS quality on my 3060Ti and it almost always hold 60fps. Difference in som games compared to native 1080p is colosal its completly fix blur caused by TAA and provide best anti-aliasing.
@@Extreme96PL If you use 2.25x, yes. I use 1.78x most of the time, since the difference is small enough and the performance is needed. In no way is my 2060 Super qualified for native 1080p; it's a 1080p DLSS Quality kind of card.
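The factor juggling in this thread is easier to check with arithmetic. A quick sketch using the published scale factors (DLDSR 1.78x/2.25x multiply total pixels, so each axis scales by the square root; DLSS Quality/Performance render at 66.7%/50% per axis):

```python
import math

def dldsr_dlss_render_res(monitor_h, dldsr_factor, dlss_axis_scale):
    # DLDSR's factor multiplies total pixels, so each axis scales by
    # sqrt(factor); DLSS then renders a fraction of each output axis.
    output_h = monitor_h * math.sqrt(dldsr_factor)
    return round(output_h * dlss_axis_scale)

# 1080p monitor examples from this thread:
print(dldsr_dlss_render_res(1080, 2.25, 2 / 3))  # DLDSR 2.25x + Quality -> 1080
print(dldsr_dlss_render_res(1080, 1.78, 0.5))    # DLDSR 1.78x + Performance -> 720
print(round(1080 * 2 / 3))                       # plain DLSS Quality -> 720
```

So DLDSR 2.25x + DLSS Quality does render at native 1080p, and DLDSR 1.78x + DLSS Performance lands on the same 720p input as plain DLSS Quality.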
The other thing to consider is the SSR implementation. Very few games feature edge-to-edge SSR which can appear very distracting at the edges of the screen for the keen viewer. Some games mask this screen space rendering drawback, but they are very few. In most cases it matters little to the vast majority, especially during faster paced games. I never mind it too much, save for Ark: Survival Evolved - where it was literally a full inch away from the screen border. In other games, SSR can also cast reflections of near objects into far off reflective surfaces. The less noticeable issue is the absence of under-object reflections, where the detail under cars and such is simply gone as it is not captured by SSR in the viewport space. In spite of these inconsistencies, I still use SSR for almost everything (except Metro Exodus: Enhanced Edition, where it is not an option but runs far better than the original raster release). I will switch to Ray Tracing further down the line, when RTX hardware isn't so taxed and can match or exceed where we currently are with traditional rasterized performance.
I have a 4080 and the thing is a beast, so I try to throw on ray tracing whenever I can... And to be honest the only game I have ever noticed it making a difference in is Cyberpunk 2077. Every other game I just go back, turn it off and get the performance boost.
The input lag alone makes me not want to turn on RT in many cases, even in story-based games, UNLESS I am playing with a controller. Anything that requires aiming with a mouse is almost always not a good idea.
@@geofrancis2001 Lag and low frame rate are completely different. Lag is the time it takes for an input signal, such as clicking a button, to be reflected on screen.
What’s crazy to me is that Nvidia tried to blacklist you guys for not covering ray tracing as much as they wanted, but I think regularly showing that it chops performance between 20-50% would be much more detrimental to its public perception than not showing it at all
Listen guys, right now RTX may not seem necessary to us, but in the future devs will remove basic features from the graphics settings and fold them into the RT option so that we have to turn it on. If we don't turn it on, the graphics will look like old games, yet still consume many times more resources.
Imagine buying a 4090 or 5090, and yet there are games where a heavy RT implementation drops your fps by close to 50% :D I mean, it just works: you buy pricey cards because Jensen said they have the best RT and you need it, rofl.
"No" -That's why i am not on TH-cam. Most questions don't take 3 or more minutes to answer. Anyone recall when 3D movies were all the rage for the immersion and "experience"? Yeah, RT is a solution to the question, "how do we separate more money from the consumer?"
@@_r4x4 The "explanation" is soft data that is subjective based on your personal perspective and wants. I don't need to justify my *opinion* , as that is all the answer is.
@@phm04 Absolutely yes. Until every game ships with a clean RTGI mode, there's no point in turning on ray tracing. I loved playing Metro Exodus Enhanced on the 3080, but near enough every game studio working today has botched their ray tracing implementation; almost no one does RTGI, or if they do, you're forced to enable highly taxing RT shadows or RT reflections along with it. To make ray tracing worthwhile you need full RTGI lighting with good optimization; again, just look at Metro Exodus Enhanced. 19:34 And even in ME:E the 7900 XTX performance is "good enough". Do note the 1% lows are much more consistent on the Radeon card in ME:E, so your minimum FPS is at worst 40% lower than the 4090, and I'll repeat, the 7900 XTX is half the price.
Great video; I think this serves as the basis for future ray tracing benchmarks. By that I mean: which games should be used for the benchmarks? The ones where RT transforms the image significantly, plus a couple of popular titles from the "better overall" tier, as these are the ones people looking to try ray tracing will be most interested in. With a smaller game set to work with, it will be easier to do videos like "which is the minimum card for ray tracing, in the games that matter?" So can a 4070 Super run the games at 1440p (with/without frame gen), or do you need a 4070 Ti/Super to do it? "Can the 8900 XT achieve playable fps in the RT games that matter?" You could even include 3440x1440 🫣🙏 (or at least extrapolated results), as 1080p is probably not a resolution used by people buying these cards.
I think a comparison between a 4080 Super and the 7900 XTX would have been the better one. Yeah, sure, I get more ray tracing when I spend like 1k more...
For someone planning to build a new PC in a few months, this kind of video is extremely helpful. Due to budget I'm really leaning toward the 7900 XTX, so having a video like this but for 1440p would be absolutely amazing, as that's far more manageable for this hardware. My current PC has an FX-8350 CPU and a 1070 Ti GPU. I'm so out of date that no modern titles are manageable at all.
I'm on a 2060 Super and raytracing is very common for me. I just finished Casting of Frank Stone, Until Dawn and Silent Hill 2, all with hardware RT enabled. Also the Alan Wake 2 DLC with RT Low.
A revolutionary idea, how about instead of wasting silicon for raytracing, use it for more raster/shader cores so you could play games at sharper resolution, higher level of detail, higher resolution shadows and volumetric lighting and other cool visual effects? This could upgrade a low/medium setup to a high/ultra setup, significantly improving game visuals and/or performance for the same cost. Especially with the stagnating GPU market, this could give a huge edge in the competition. How come they haven't thought about this before?
Yeah but the problem is how much you have to pay for those 90fps... Depends on the game of course, with Cyberpunk in Overdrive mode being spectacularly bad for this but also being the best showcase for the kind of visuals you can achieve.
Tip for the HUB team: when comparing data with two metrics, use a scatter graph, performance loss vs visual improvement, with a trend line. Bar graphs were basically useless here.
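Something like this, presumably; a matplotlib sketch with made-up data points, just to show the shape of such a chart:

```python
import matplotlib.pyplot as plt
import numpy as np

# Made-up data: each point is one game, x = % FPS lost with RT on,
# y = subjective visual improvement score (e.g. a 0-10 grading).
perf_loss = np.array([12, 25, 30, 38, 45, 55, 70])
visual_gain = np.array([1, 2, 5, 3, 7, 8, 9])
labels = ["Game A", "Game B", "Game C", "Game D", "Game E", "Game F", "Game G"]

fig, ax = plt.subplots()
ax.scatter(perf_loss, visual_gain)
for x, y, name in zip(perf_loss, visual_gain, labels):
    ax.annotate(name, (x, y), textcoords="offset points", xytext=(5, 5))
# Linear trend line across all titles:
m, b = np.polyfit(perf_loss, visual_gain, 1)
xs = np.linspace(perf_loss.min(), perf_loss.max(), 50)
ax.plot(xs, m * xs + b)
ax.set_xlabel("Performance loss with RT on (%)")
ax.set_ylabel("Visual improvement (subjective score)")
plt.show()
```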
Did you find this consistent at 1440p? Do the drops in performance align with your 4K ultra RT results? This doesn't say much if you only use 4K, which the majority don't even bother with.
The fact that you have to spend more than $1000, six years in, to use this feature, and still you will never know if it's on or off... what a fucking shitty market state.
Alan Wake 2, Cyberpunk, Metro Exodus, Ratchet and Clank, and Wukong all smile at this comment, and these are all AAA titles. Is it worth it? No. Is it impressive sometimes? Yes.
Jensen & Co deliberately trickling out RT as slowly as possible for maximum profit. Scummy as fk, but they're all into it...including the devs with their pathetic RT tickbox exercise.
100% agree, it's fkn pathetic for nearly 2025..... Absolutely PATHETIC
@@JohnnyBg2905 I'd expect it to be more than "sometimes" when you fork out that much money.
It may depend on the monitor you use. There ARE people using a $1000 card with a $100-200 monitor...
Short answer is no
Long Answer: Nooooo
@@Avarent01 LMAO
People also miss the fact that high framerate improves visual clarity in motion, which is a much more realistic scenario than still comparison shots.
Longer answer: if it's Cyberpunk and you don't notice the drops in quality when using DLSS Balanced, then it's worth it. I'm using path tracing on my RTX 3070 at 1440p with DLSS Balanced and get 40-60 fps; worth it for me.
It's actually pretty cool though. If I had the hardware for it (which I don't), I would definitely use it.
In Germany the 4090 costs around 2250 euros; the XTX costs 950 euros. And according to this video the 4090 is about 35% faster, but almost 2.4 times more expensive. If logic worked, the 4090 could stay in the shops forever and Nvidia workers could buy and use it themselves... GPU prices are so crazy.
Bro, Nvidia workers are starting to quit their jobs. They are now multi-millionaires. Someone who worked there for 5 years and received Nvidia stock back then, those stocks are now worth 3.5 mil dollars. They don't have to work for the rest of their lives. Nvidia is actually having trouble keeping those people working there. 😐 I swear, this AI bubble can't burst soon enough.
The Steam survey shows more 4090s than any single desktop Radeon on the list, by the way, lol.
@@EFXVoila Nvidia has been the primary choice for gamers for a long time, for many the switch feels difficult and for some it's impossible (CUDA devs and similar). Wish more people would see that Radeon is not some shitty alternative to GeForce
@@EFXVoila It's the fastest card after all; Nvidia is the default choice. And quite honestly, if I were to spend a thousand bucks on a gayming card, I might just spend 2k for a better experience, because at these price points the concept of a compromise between performance and cost is laughable anyway.
@@dulouser1751 that mindset is exactly what they want to cultivate by keeping the gigantic performance difference between 80 and 90 series.
7900XTX = $1,399 AUD
RTX 4090 = $3,599 AUD
The price difference is so insane you should be comparing the 7900XTX to a 4070ti Super...
We're talking over 2.5x the money for 30% more performance?
It's a bad comparison just because they wanted to do the best card from each manufacturer. IMO AMD should not get a say at all and they should've done 4070 Super, 4070 Ti Super, 4080 Super and 4090 to get a good range. And all at 4k DLSS performance.
@@albert2006xp Why would you think that? Half the point of this video is to explore the impact of enabling RT on NV vs AMD cards?
@@ThunderingRoar Because of how many times throughout the video he directly compares that 2 cards as though they are battling head to head, and infers the 4090 is a better choice for RT constantly...
Except.
It would want to be.
Because it's more than twice the price...
Which is not mentioned at all.
Constant (paraphrased) quotes such as "The 4090 is a far better option in this title"
"The 7900XTX just can't keep up here and it's 1% lows are much worse than the 4090"
Etc etc the entire way through.
These cards are not comparable in performance, nor are they trying to be.
You can have this exact video, with an old RTX 3060 vs the 7900XTX and say the Radeon card dominates here.
Yeah no shit, it's not in the same market.
4070ti Super = $1,319 AUD
So from the 7900XTX there's an $80 AUD gap down to the 4070tiS and a $2,200 AUD gap up to the 4090.
Cool.
@@ThunderingRoar I guess it's worth showing but I thought it was more about RT's cost and AMD's current cards are just not proper cards so showing how they go to 3 fps in path tracing feels pointless.
It is interesting to note that the 7900 XTX is half the price of the RTX 4090, which makes it a very cost-effective card. I do not regret buying it
7900 XTX in RT is unusable, paying less is irrelevant
@@silvio3d And he is not using it in RT most likely, so your point is useless. Half the cost is more useful if you don't care about RT. You just go and consume your little heart away.
Cost effective but FSR for anti-aliasing? I'd rather have a 4070 Ti Super and use DLDSR+DLSS in every game.
@@silvio3d , lol, we get that you are a troll . i watched the benchmark did you? so typical hardcore nvidia fanboy claim that amd is unusable , ok troll .
@@albert2006xp RSR is better than FSR is, but it wouldn't be fair to use an AMD-exclusive driver-based solution against DLSS, for some reason?
I understand the idea of comparing best to best, but I would have been more interested to see 7900 XTX vs 4080, since they're actually the competing products, instead of 7900 XTX vs a GPU selling for over $1,000 more.
The 7900xtx beats the 4080s handily in raster.
Exactly this. It's an unfair comparison when in the UK the 4090 is vastly higher priced, around 3x. Comparing the RT hit etc is fine, but the way the video flows it sure doesn't sound like that. Redo the video with the closer, more actual match of a 4080 and I expect the results would look far more realistic in a fair comparison.
@@ddrmegamixer it isnt a gpu comparison
I was going to comment the same thing
@@lemmingsftw2480 it's a comparison of performance on 2 different GPUs
That sure sounds like it to me
I would love to see a Ray-Tracing video about "mid-range" cards like 4070 super / 7900 gre on 1440p
Edit: What happened here while I was gone lol. But the fact that there's so many contradictory opinions kinda proves that we need a video on this subject.
And of course I initially meant RT WITH DLSS/FSR.
You can extrapolate the answer from this. His conclusion on the best cards is that it's only worth turning on with the best implementations, ultra or path. The 4070 and the GRE aren't nearly strong enough to run path tracing.
Don't be ridiculous. How dare you ask for an analysis that doesn't target 0.00001% of gamers?
@@offroadskater The 4070 is absolutely capable of playable PT at 1440p.
Why? It’s only worse and it’s pointless
@@LilMissMurder3409 Maybe with turned down settings and Balanced or worse Upscaling + frame gen.
Short answer: No.
Long answer: No its not.
Tldr in German: 9
It's not worth it by any means if you have an AMD GPU, but on the Nvidia side it's worth a lot
Nvidia is still better than amd
COPE
@@JBIGroup3D Yes because you literally pay double the money
You undercomplicated it.
so even in RT the 4090 doesn't provide 2x the performance of the 7900xtx. Why is it so expensive then?
lack of competition
If you want the best of the best you have to pay more than its value, that's how it works
People are paying for it so Nvidia keeps the price high.
datacenter market, CUDA and all of that.
I would argue texture quality matters much more than ray tracing at this point, and has much less performance impact as long as the VRAM is sufficient.
And that's exactly why Nvidia deliberately releases cards with good compute performance but abysmally small VRAM, to force upselling.
Textures are free graphics, and nvidia doesn't want you to have anything for free😆😆
Nailed it. I fell victim with 4070 TiSuper. At least I actually got the ram needed for good experience.
and artistic direction. A game with a good artistic direction is more compelling to me than a game acting as mere technical demo, and it will age better.
Starfield's textures are very high resolution by default, yet it still looks like crap even on the highest settings
Texture quality does matter, but people end up being silly with it. Most often you can drop it a quality setting and it still looks the same. Heck, the difference isn't noticeable even in side-by-side still picture comparisons. And yet many people subjectively feel they're losing a lot if they don't max out textures.
Not for me. Gimme that buttery smooth framerate first of all please.
What gpu do you have?
@@SweetFlexZ probably below 4070
since the 4090 u can play RT max settings games at 1440p or 4K Performance without issues, especially with framegen it's smooth enough for the massive visual gains. once you go RT u will have a hard time dealing with all the issues and visual bugs the hacks of traditional rendering introduce
@@liquidsunshine697 Exactly. Dude probably own an AMD GPU lol.
@@msnehamukherjee Even most nvidia users don't use ray tracing, even though they pretend they do in online arguments. Ray tracing is useless, get over it.
Tim, those thumbnails are works of art!
"On a 4090, all 3 path traced games run at below 50 FPS at 4k using DLSS quality upscaling" - so that's a definitive NO.
you could easily play on dlss performance and not notice anything ever
@@plasmahvh Except it's very much noticeable??? Like that is 720p upscaled to 4k
You WILL know that
@@HyperScorpio8688 4k DLSS Performance is 1080p upscaled and looks significantly better than 1440p native (which is nowadays considered the sweet spot of gaming). 720p is Ultra Performance
@@HyperScorpio8688 With 4k output it's 1080p on dlss performance I think, which is somehow still ok. ultra-performance gets massive blurry areas though.
@@plasmahvh 🤡
30% drop _might_ sound acceptable _sometimes_ but this is a 4090. I suspect weaker cards will not only have a lower absolute FPS but also higher performance hits.
I enable it on medium since I've got a lower card, a 3070ti, but I think ray tracing is still the future, because in reality we see by light bouncing around, not by tessellation that emits light or hand-placed artwork.
@@ansonang7810 Ray tracing will always be "the future", because ray/path tracing is essentially an infinitely scalable technology. nVidia releases a new $4,000 RTX 7090 GPU that can run 3 light bounces per BVH node at 4k120 in the latest game? Game studio can simply change to 6 light bounces per BVH node and that shiny new $4,000 GPU is back at 1080p60. Then nVidia can sell the $5,000 RTX 8090 that will do 1440p60, then the $6,000 RTX 9090 that once again does 4k120. Then the game studio simply increases to 10 light bounces per BVH node, rinse and repeat until the heat death of the universe and the RTX 31090 costs $42,000.
If you think nVidia won't abuse this to con their loyal fanbase into buying new overpriced GPUs, you weren't paying attention when they did this EXACT same thing with tessellation that they're doing with ray/path tracing when they ran their GameWorks scam.
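To put rough numbers on the scaling argument above, here is a back-of-the-envelope sketch. The ray throughput figure is invented purely for illustration; the point is only that raising the bounce budget alone can push any fixed-throughput card back below 60fps:

```python
# Toy model: path-tracing cost grows with rays per pixel and bounce count,
# so a fixed-throughput GPU can always be pushed back below 60 fps.
# The throughput figure below is invented purely for illustration.
EFFECTIVE_RAYS_PER_SECOND = 2e9  # hypothetical sustained RT throughput

def frame_time_ms(width, height, rays_per_pixel, bounces):
    # Each bounce adds roughly one more traced ray per path.
    total_rays = width * height * rays_per_pixel * (bounces + 1)
    return total_rays / EFFECTIVE_RAYS_PER_SECOND * 1000

for bounces in (3, 6, 10):
    t = frame_time_ms(3840, 2160, rays_per_pixel=2, bounces=bounces)
    print(f"{bounces:2d} bounces: {t:5.1f} ms/frame (~{1000 / t:.0f} fps)")
```

With these made-up numbers, 3 bounces lands around 30fps and 10 bounces around 11fps, which is the "rinse and repeat" treadmill described above.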
@@ansonang7810 Who cares if it's 30% when you're still close to your max refresh rate? Which is not the case for cheaper cards, I know, but I always use it with my 4090.
I always use it with my 2060 Super if the game's RT is actually worth enabling and isn't just some shadows or some reflections with poor ray counts. Even halving the FPS is worth it as long as it doesn't go under 30.
This is why I run a 7900xtx. I don't use ray tracing and the 4090 is currently $1900+ on Newegg. The 4080 is $1000+ and only has 16gb of VRAM compared to the $880+ 7900xtx with 24gb. I like the settings cranked up with a high frame rate, and this card does the job as long as I don't use ray tracing, which is still basically a joke in most games: a huge performance hit for not much of a visual difference.
I think sometimes RT is worth it, but it looks like RDNA 4 will have more ray accelerator units per core, so it should be faster for RT at a hardware level. By the time AMD and, why not, Intel GPUs are as good as Nvidia's in RT and it's no longer an RTX selling point, games will probably have evolved to use RT technology better
@@lucazani2730 On almost anything below a 4080 / 7900 XT, RT at 1440p requires upscaling. IMO the visual loss from ghosting plus the loss of performance is not compensated by the benefits of RT.
HUB literally said that 40% of their games show differences and 30% overall had major differences.
@@be0wulfmarshallz Differences worth playing at significantly lower framerates? Because whether you've spent less than $1000 (7900 xtx) or over $2000 (rtx 4090) on just one single part of your computer, that is the trade off made in games with any appreciable improvement in visual quality. Smooth is a quality too, don't forget.
What are you using the extra VRAM for tho? It's not like the 7900 XTX will be relevant for new games even 2 years from now, it can't deal with AI tasks...
in Doom "XTX is not hopeless in RT [...] but 4090 is quite a bit better" i sure hope so for card 60% more expensive at MSRP (which doesn't correlate acurately with 49% of performace), but at this point it's 889 to 1799 on best buy, so double the price now
He should've compared 4080 vs 7900xtx but he's probably paid by Nvidia anyways.
@@aviatedviewssound4798 it's not a buying guide so price is irrelevant here.
The point was to compare how well the architectures of either company support RT and if it's worth turning on. Makes sense to use the best RT cards from either company and then look at the % impact on performance when you turn on RT.
It doesn't matter that the Radeon card is half the price because the video isn't addressing price. It's isolating whether or not the experience is good.
@@TheDarksideFNothing Compromise: including the 4080 as well would've covered everything.
@LucidStrike And made them need to do 50% more testing for the same video with only marginally more useful information in it. And then someone would have complained about not having the 7900xt, and so on.
The point is the 4080 wasn't very relevant to the content because the video was not considering price as a variable. They were looking at whether or not RT is viable/useful/recommended, and given that the best GPUs from either vendor still aren't an easy "yes" you're supposed to extrapolate what that means for your GPU of interest based on the myriad other benchmark videos available.
@@TheDarksideFNothing I am not speaking about price, I am talking about core count or transistor count.
From all the data of these two videos my conclusion is:
Only in a handful of games is ray tracing worth activating, and only if you have a really fast and expensive video card.
But if you want to use ray tracing, go all in with Overdrive!
My 2060 Super has used RT in many, many, many games. This list is bad and contains a lot of "RT" titles from the early days. Where is Silent Hill 2? Where is Casting of Frank Stone? Until Dawn? ANY recent UE5 title? Why are we doing it for "RT" from 2018 games ffs. I just finished replaying Casting of Frank Stone on Ultra, 1440p DLDSR+DLSS Quality with hardware RT, on my 2060 Super.
If the game only does reflections or only does shadows with RT I don't think that game deserves to be on this list compared to so many games that aren't. You also don't need an expensive card. You can play on low DLSS, on 30 fps, it's more than worth.
and you will still lose from 10 to 50% of your performance, just a terrible technology
I find ray tracing to look best on cyberpunk style games, like The Ascent and Cyberpunk 2077. Other games it is much less noticeable to my eyes. I think all the neon signs and colors complement ray tracing well.
those are obvious though. Smaller details, like all the shadows and reflections on various surfaces and the bounce lighting, are all done in real time, and to a particular eye it's hard not to geek out over it.
And the fools discarded that engine!
It works in cyberpunk because it does not have baked lighting. This game requires it to be real time and it is the reason why RT works so great
@@liquidsunshine697 In the Silent Hill 2 remake I noticed it very well, probably because of how dark that game was.
I am a visuals snob and always crank things too high. In games where it is well implemented, it is absolutely worth it. There are some games where I turn it down because I can’t tell the difference. That I can’t tell the difference usually means it was bolted on at the end and not a part of how they built the game. Cyberpunk / Alan Wake / Metro Exodus are probably the best ray traced games. Some of the examples in this video are examples of games where there’s almost no point to using ray tracing because the devs never really intended for it; they just added it to check a box.
The biggest problems with RT are noise and artefacts. Because current generation hardware isn't powerful enough to run games at a high ray count, denoisers have to be used to fill in the missing pixels where rays weren't able to properly get to. Denoisers blend the sampled rays together, leading to low-quality and inaccurate RT effects. Current ray-traced games use a low ray count, which causes a noisy image with artefacts. More rays and bounces mean higher image quality, more accurate RT effects, and less noise and artefacts. Increasing the number of rays and bounces is the most effective way to make RT/PT look substantially better. The problem is that the higher the ray count, the more demanding it becomes, which is why it's not a viable solution yet. Video games still haven't unlocked the full potential of RT graphics. Once hardware is powerful enough to run games at a high ray count, ray tracing will look absolutely phenomenal without all the horrible noise and artefacts. RT is the future of gaming graphics, at least for single-player games.
Denoisers will always have to be used, they are used in offline rendering for a reason. But I agree with your main point.
Sometimes there is a 'swimming pool' water effect on some walls because of ray-tracing. It's like there is a swimming pool off screen and the water is reflecting on the wall, you can see a pattern moving as if it was water. I find that very distracting. There are many games that have this.
@@iurigrang If the ray count and bounces are high enough, denoisers become irrelevant. A higher ray count is way better at improving the quality of ray tracing than denoisers are. Denoisers have visual quality drawbacks. The only drawback of a higher ray count is that it's more demanding. However, it massively increases the accuracy and image quality of RT/PT while significantly reducing the noise and artefacts. It's by far the best and most effective solution.
@@iurigrang I hope it can get to a point where denoisers will become irrelevant. I am not saying they will become totally useless. I just hope GPUs get massive performance increases in ray-traced games so they will be able to handle running video games at a higher ray count.
@@iurigrang I disagree; human perception is also quite noisy at lower light levels, which is something denoisers erase. It's a surprisingly similar effect, and in the more immersive genres, such as first-person horror games, it's more beneficial not to butcher it with a denoiser.
Also note that it's not just spatial denoisers at work; temporal accumulation is also required. You can't really have a fully ray-traced game with flickering lights, for example. This is quite a big deal, because that's the part we can't precompute. If we were given something like a 10-fold increase in overall storage and data throughput, you'd see much better graphical improvement just from accurate baked lighting, or massive RT caches that would essentially turn RT into JIT baking.
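For the curious, the ray-count point in this thread is just Monte Carlo variance: noise in the per-pixel estimate falls with the square root of the sample count, so quadrupling the rays only halves the noise. A minimal sketch, with an invented stand-in for the scene's brightness distribution:

```python
import random

# Estimate one pixel's brightness by averaging random light samples.
# The standard error falls as 1/sqrt(samples): 4x the rays only
# halves the noise, which is why denoisers are still needed today.
random.seed(0)

def sample_radiance():
    # Stand-in for tracing one ray; real values come from the scene.
    return random.expovariate(1.0)  # invented distribution, mean 1.0

for n in (4, 16, 64, 256):
    estimates = [sum(sample_radiance() for _ in range(n)) / n
                 for _ in range(2000)]
    mean = sum(estimates) / len(estimates)
    noise = (sum((e - mean) ** 2 for e in estimates) / len(estimates)) ** 0.5
    print(f"{n:4d} rays/pixel: noise (std dev of estimate) = {noise:.3f}")
```

The printed noise drops roughly 0.50 → 0.25 → 0.125 → 0.0625, which is exactly why brute-forcing ray count is so expensive and denoisers fill the gap.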
Would have been a more interesting comparison to see how the 4080 compares to the XTX. Being more similarly priced.
edit: interesting though that after 6 years of "raytracing" sales, not even the fastest card available (4090) is really capable of running truly beneficial raytracing settings at good fps in a single game :D But ok, that's at 4k. At 1440p it's maybe just doable.
So basically... the nvidia 5 and amd 8 series will be the first gen where raytracing is actually a meaningful feature to look for when comparing gpu performance.
Yeah, the video is silly in general. They should have broken every card down by price per performance, and taken the actual games themselves into account (visually speaking, like Cyberpunk 2077). Once you play that game with RT on, it's just not the same without it.
Nobody is looking to turn RT on in a competitive FPS for example. They want ultra high frame rates and clear visibility, not WOWWIE graphics.
it's not a gpu comparison in that way though, its a test of RT in general using the highest end product from each
Rather than forever waiting, any current gen radeon and nvidia cards are already quite capable for RT, as long as you okay with upscaling.
Because when the Nvidia 50 series or Radeon 8000 series launches, there might be a newer, heavier, more realistic game engine with even harder ray tracing.
It still won't be a meaningful feature for most players since there's no indication that you'll no longer have a huge performance impact, and it'll take another 1-2 generations before 4090 level performance is available at a true mid-range price point.
@@KanakaBhaswaraPrabhata HUB already tested with upscaling in the more demanding games in this video. This is really the max you can get out of them.
Ok, with about 50fps you could argue that it might be enough to use frame gen as well. But you probably want at least 60-70fps base for that.
As I've been saying from the first Nvidia presentations, when they had to show zoomed-in stills and directly explain where to look and "how awesome and mandatory this should be to anyone": RT, at least for graphics (with negligible exceptions where things may look different, even more rarely actually better), is not for actual gaming. It is for making A/B comparison videos where everyone plays the game of guessing if it's on or not. May be fun for some people, but I guess not the thing it should have been. And we all pay for this more and more, whether we want to or not. The RT tax is real. Nvidia's marketing genius (though cynical as hell) is undeniable.
The problem we will likely see more often with broader adoption of the hardware is devs neglecting raster so RT will look better by comparison. Little by little the frog will be cooked, with more cost-cutting. The only thing is, people first need to broadly adopt it and of course pay through the roof for it, so that devs and publishers may pocket the savings. All under the pretense of "giving the gamers what they want"...
Is it me or are there very minimal differences in these examples between RT off and High? The performance hit isn't worth it imo.
I think the difference is big. Some people notice visuals a lot more than others.
Depends on the person.
It depends on if you're staring at screenshots or video in a side by side comparison. I think the majority of people would be hard-pressed to notice most raytracing benefits when in motion and actually playing and enjoying their video game.
Not enough games implement Raytracing in a good way. In Cyberpunk and Alan Wake II even in motion you notice it, since it changes the whole vibe of the game.
Poor title choices. When RT is done right the difference is huge in how your brain perceives the lighting as real or fake game lighting. A lot of titles from like 2018-2020 had very limited use of RT. Just horrible choices to include them here over the many UE5 titles we even had this year that look fantastic.
£750 7900XTX vs a £2200 4090 card... you could almost buy 3x 7900xtx cards for the price of a 4090.....
And the 4090 has sold much more than the 7900XTX… so the price is not an issue!
@@haukionkannel how many of those actually get used for gaming? I doubt it's more than 5%; most of these will be doing professional work or AI. the 4080 would have been a much better comparison. no one is debating whether to go 7900xtx or 4090 for gaming. the 4090 costs more than the rest of the pc combined.
@@haukionkannel Because for people who don't care about price a 4090 will do. Everyone else will buy a cheaper Nvidia card and fools will go to AMD to get scammed by FSR.
What we got from this video?
RT is not worth it.
7900xtx is great value, being half the price of the 4090 and delivering 69% of its performance (without rt).
Is it really only 69% of the 4090? I could have swore there used to be times where it was almost on par or 10% below the 4090 in some reviews. Definitely not in ray tracing, but in rasterization.
If the 4090 is 31% faster, then the 7900 XTX is delivering ~76.3% of 4090 performance:
100 / 1.31 ≈ 76.3%
RT is more than worth it. I will easily take RT in 30 fps than non-RT at 60 fps in any game that does indirect lighting RT. If you do just reflections yeah you're a terrible developer and should've done a proper RT implementation.
7900XTX is a mega scam. $900 to not have DLSS + DLDSR is crazy.
@@albert2006xp You are crazy, literally defending paying twice as much for 2 features that aren't really much different. AMD has its own super resolution, which is no different from DLDSR, and frame gen doesn't matter unless the game supports it. And when the new cards come out and Nvidia cuts you and your 2k GPU off, you will end up using FSR anyway.
@@albert2006xp because of people like you, Nvidia can afford to lie back and watch people queuing for 2k+ euro GPUs
Hope you're excited to pay another 600 euros on top of current prices for another 30 fps in RT with the next-gen GPUs, just to be able to play at an average of 60fps, lol.
I've personally switched to 240hz monitors this year and I simply cannot go back to anything below 100-120 FPS, unless I'm playing on a TV or whatever screen that only runs 60hz
Anyway, point being that there are people who value real frames and low-latency responses over pretty graphics; this cohort will always choose performance over visual fidelity
With that being said, there are plenty of games that do not need RT enabled, run buttery smooth and they still take (my) breath away.
Ray Tracing is an Nvidia gimmick to make Radeon look worse. Don't believe me? Look back to when Nvidia pushed heavy tessellation in video games. Minimal visual impact for a heavy framerate drop. BUT the framerate drop was not as bad on Nvidia as on Radeon, which is exactly what they wanted people to see.
This is what I’ve started thinking, an overhyped marketing tactic for a small improvement to justify spending thousands. Absolutely ridiculous.
Unless ray tracing can be achieved at at least high settings, 60fps, 1080p on 300$ cards, it is not becoming mainstream at all
Agreed, although i can play games at 30 fps, but only if there's no awful input lag or physics problems
Which is quite rare honestly, so i target 40 fps in most (singleplayer) games unless there are major input lag or physics issues
Ray tracing is best for older games with DX8 type lighting, where it can completely transform them.
Happy to have gone with AMD last generation. I get it, RT in certain titles is really cool if you have a 1000-2000EUR+ GPU budget. However, for mortals like myself with a budget of 300-500EUR, AMD has really just been the better option. On sale, the RX6800-RX6900XT had about the same market price as the RTX3060Ti here in Sweden. So basically, if I went with Nvidia I would get less VRAM, less RT performance, and far less rasterized performance. And yes, Nvidia cards are seriously overpriced here, but people still buy them in abundance due to brand recognition and the media indoctrination about them being "far superior" at RT. But when reality strikes, Nvidia has been on par with AMD in terms of RT performance in the mid-range, at least in this part of the world. Personally I mostly play VR, so I'm happily not caring about RT at all. :)
Not to brag but I do have that budget and I still don’t get it. I could buy three 4090s for each of my three gaming PCs and not even blink…one to use, one as back-up for when the first has its plugs melt, and the third as a shelf queen on top of each…but I’m never gonna. Because FFS unlike so many twits online desperate to brag about getting ripped off, I feel embarrassed when I get suckered into stupid purchases.
Then the melting plug issue takes things to another level. Seriously I wouldn’t install a 4090 in any of my PCs if Jensen personally handed me a few of them for free.
So I only take a 20% performance hit with a 7900XTX and never have to worry about melting plugs? AND I keep $1000 more to throw at whatever else I may feel like? Yeah, this isn’t a hard decision.
@@kenshirogenjuro873 I agree with you. Being 42 years old myself, I have enough income to afford several 4090's if I wanted to. My budget of 300-500EUR is solely based on how much I value to spend on gaming as a hobby... Considering that a 6800XT was 1/5 the price of a 4090 here in Sweden, and still takes me 90% of the way (considering most games I play don't use RT-, or are still highly enjoyable and good looking without it) I see no point at getting ripped off. I would rather invest that money long-term or spend it elsewhere. It's all about priority.
The much hyped AI-powered rendering will be here sooner than properly implemented realtime RT, considering the huge performance cost for a single ray traversal alone. Until then, a skillful game dev and artist can achieve good-enough visuals with standard pre-baked raster effects.
The performance cost usually isn't worth it, unless you have performance to spare, but what about the money cost?
The 4090 is around 80-100% more expensive, and is rarely more than 60% faster in ray traced scenarios. When ray tracing isn't enabled, the 4090 is about 23% faster. That's it.
The visual quality improvement is still very minor relative to the performance cost, and is often extremely minor.
Overpriced GPU market, lazy game developer implementation, slow progress in the technology, pure pain for consumers.
honestly no
I have an RTX 4080 and game at 1440p. The only games I use ray tracing on are Doom Eternal and Dying Light 2. Everything else I turn it off.
Why do I get the feeling that RT cores and the RTX features were always intended as just a sales pitch to justify the future AI-happy hardware ahead of time?
Could grab a 7900 XTX and a high refresh OLED for the same as a 4090 and I know which would look better with no ray tracing.
Enjoy that flicker display.
Yes, a 4070 Ti Super and 1440p OLED. 100/100 times. With RT on.
@@DrakonR just cap the fps to your 1% low and there won't be any flickering
@@ThunderingRoar literally not the type of flicker I'm talking about.
@@DrakonR -10 points for the pointless and unnecessary use of "literally".
Are you talking about VRR flicker?
There is an easy rule for me... if you sat a random player in front of a game and they cannot tell you if RT is on or off, while the performance hit is more than 10%, it's not worth it. Period.
Because having significantly lower FPS in an AAA title IS in fact very noticeable in most cases, especially when you have spikes that drop you under 60FPS.
There's an even easier rule, you have a PC, you have eyes, you can turn it on and off in your own game. To me, the answer is on 80-90% of the time. I don't mind lower fps as much as I mind having worse lighting.
I have an RX 7800 XT. I only turn on ray-traced reflections in Cyberpunk 2077, with Intel XeSS and AMD Fluid Motion Frames 2. On ultra settings (except shadows) I get around 60 to 70 base FPS and 110 to 120 with Fluid Motion Frames 2. Those reflections are the only ray tracing feature worth enabling.
Edit: This is all on 1440p.
I second that, considering the card cost me ~€400 on a flash sale I'm quite satisfied with it
After you've had RT on for 10 minutes, you'll forget it's there and not notice it anymore, so save yourself a grand by buying AMD, and use that money to go to a foreign country for a week.
Besides, you'll turn it off to minimize lag in anything competitive.
The game I play (X4 Foundation) isn't even able to spell raytracing, much less display it. So, speaking for myself: so far I never lost a single thought about the lack of raytracing performance of my 6800.
I'd like to see a Generation comparison.
Gen1: RTX 2080(/ti) vs 6900XT
Gen2: RTX 3080/ti/3090 vs 7900XT/XTX
Spider-Man game is proof that good developers know how to program ray-tracing well for both Geforce and Radeon.
AMD lets us work at a low level, think GPUOpen. I prefer working with Radeon hardware mostly for that reason. Nvidia gives me whatever their particular flavor of this or that API, while AMD gives me the hardware.
There's no way I could justify paying double the cost for a 4090 over a 7900XTX, for raytracing or not, it's just not worth it.
I remember playing GTA SAN ANDREAS on PS2 and I enjoyed more than anything on PS5 today.
why are we comparing a 7900xtx to a 4090? i get it's the latest from each brand but u can't compare a 7900xtx to a 4090
They literally say why in the beginning of the video.
Okay then let's compare the 4090 to nothing because AMD can't produce a competitive card
"we" that's funny
I almost never use ray tracing with my 4090 and 7800x3d. I hate playing under about 90-100 fps and I almost entirely play single player games.
Same here, but still on 5800X3D, maybe considering an upgrade to the 9800X3D. Ray tracing can even sometimes introduce a CPU bottleneck.
I would have liked to see a video like this between the 4080 and XTX at 1440p. Great video, though.
Is RT worth it?
Aka, do you own a 4090 or next Gen 5080Ti/5090 to actually have useable framerates? XD
“I’ve decided to go with realistic gaming configurations”. Buddy there is nothing realistic about using a nearly $2000 gpu..
Yeah, it seems that sometimes reviewers lose touch with reality when they have free 4090s laying around.
It's literally about analyzing the performance hit in the best case scenario.
It's realistic for me, and I appreciate the testing.
@@killerful I understand, but if the best-case scenario covers 4% of the market it's an elitist analysis. Wouldn't it be better to focus on mainstream cards and resolutions to get a snapshot of a bigger and more relevant audience?
@@ViolaGhio not being funny but if it's not even really worth it on these cards then what is the point testing it on lower end ones
Whenever you have a On vs Off feature that takes effort to develop, you can never expect On to be truly game changing. Devs will always spend the most effort on the Off to suit the majority of players and relegate the On to whatever they can get for "free" plus a little extra that they can devote the time to. This leads to things like how UE5 uses signed distance fields for fast RT approximation which AFAIK don't benefit from RT cores. You *can* GPU accelerate SDFs but my limited knowledge on the subject says it's going to use the regular cores in opposition to the rasterisation while leaving the RT silicon idle. Supporting the vast majority of gamers is more important than fully loading nVidia's speciality hardware and worst case scenario is that "RTX" style raytracing never really gets more entrenched than it is now, with alternate techniques becoming the new standard for wider compatibility and better performance.
Yeah they can develop for AMD hardware and target both PC and consoles with same effort, Nvidia requires extra work (which I am sure they sponsor wherever they can benefit from it). Maybe when Unreal Engine 6 arrives the landscape will be different, but not as it is now. With a looming deadline and budget constraints I would rather devs spend their time optimizing broadly and improving gameplay. Kinda like PhysX back in the day :D
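For anyone wondering how the signed-distance-field approximation mentioned above avoids RT cores: instead of intersecting triangles via a BVH, you march each ray forward by the distance to the nearest surface, which runs on ordinary shader cores. A minimal sphere-tracing sketch against a single sphere SDF (the scene and constants here are illustrative, not UE5's actual implementation):

```python
import math

# Sphere tracing: march a ray forward by the SDF value (the distance to
# the nearest surface) until we are close enough to call it a hit.
def sphere_sdf(p, center=(0.0, 0.0, 5.0), radius=1.0):
    # Signed distance from point p to the sphere's surface.
    return math.dist(p, center) - radius

def sphere_trace(origin, direction, max_steps=64, eps=1e-4, max_dist=100.0):
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sphere_sdf(p)
        if d < eps:
            return t          # hit: distance travelled along the ray
        t += d                # safe step: nothing is closer than d
        if t > max_dist:
            break
    return None               # miss

print(sphere_trace((0, 0, 0), (0, 0, 1)))  # ~4.0, hits the sphere's front
print(sphere_trace((0, 0, 0), (0, 1, 0)))  # None, ray misses entirely
```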
After 6 years we shouldn't have to buy a 4090 to experience good ray tracing at 60fps... This only shows how Nvidia's monopoly hurts us as consumers
I have a 4070TI and I am currently playing Dragon Age Veilguard.
I have all settings Ultra and ray tracing maxed out. Looks amazing.
The issue is that in some forested areas the frame rate drops down to 20fps even with frame generation. So basically stutter heaven.
It really varies from 20 to about 140fps.
Would love this game to be looked at sometime to see why the variance is so high.
Good video, thank you. Just doesn't feel like RT is worth it, even after so many years. Question, why are you using the 4090 against the 7900XTX when the 4080 or 4080 Super is price/performance wise a better comparison? The 4090 is almost 2.5X the price of a 7900 XTX, right now.
It was explained in the beginning of the video; the test was done using the best cards from both ngreedia and AMD
The point of the video isn't comparing amd vs nvidia, it's seeing the performance loss you get from either companies best offerings
@@dainiusvysniauskas2049 Ahh missed that bit, that explains it, thanks.
RT = 90% MARKETING + 10% VISUAL QUALITY(IF YOU STAND STILL)
I thought it was cool in minecraft but then that just died off so there isn't any more reason for me to care for it. Every other game looks pretty much the same to me with or without it.
@@valderon3692 Yeah, but normal minecraft has almost no graphical quality when you look at the lighting, so that's the reason why it is so different.
When path tracing is available (Cyberpunk, Metro Exodus, Black Myth Wukong) it looks absolutely amazing.
But I agree that in most games it isn't worth it.
@@_r4x4 In vanilla yes, but I have primarily played minecraft with shaders and other visual improvement mods for many years now. I still prefer the way RTX looked compared to any of them. Even though some of them use a form of ray tracing, it still never seems to work quite the same way. I don't think the tech will ever really catch on though. As soon as Nvidia stopped paying Microsoft to develop RTX, they stopped working on it. It's just not something most people care about or want to pay extra for, so most developers don't have much of a reason to include it in their games. Again, this is aside from the fact that it doesn't make much of a difference in most games.
ill render the glorious thumbnail in 4k RT
That is genuinely one of the best videos about ray tracing on YouTube.
Medium Ray Tracing looked nice in Control, but that was only because there was so much glass in the game that reflections made a big difference.
Would love the same comparison between 7900xtx and 4080 super!
Yeah, the 7900 xtx is 1000-1100 euro and the 4090 is 1600 euro, not comparable. Of course the 4090 is no longer in production, so its price is jumping higher, but the 4080 super is the price equivalent of the 7900 xtx, not the 4090.
yes me too. comparing a 4090 to a 7900xtx is basically useless since most people own GPUs of 1k bucks or less...
Funny thing the 7900XTX is currently closer in price to the 4070ti Super than it is the 4080 Super
Don't see differences. It's like comparing 144hz and 185hz 😂
If you have to ask this question, you already know the answer
asking the question and making a video about it is click bait mate.
It all depends on your perspective. On my old GTX1080 turning on raytracing destroyed performance, so I played without it. For example in the witcher 3 at 1440p I had 7fps with raytracing on. Now, with the RTX4080S, I get perfectly smooth frame rate even with RT, so my perspective has changed. I get 82fps at native 1440p TAA and 179fps at 1440p DLSSQ + FG + fullRT (updated DLSS 3.7.2 looks sharper than native TAA in this game). The game looks incredible with RT and runs extremely well.
Cyberpunk is another great example. On the GTX1080 I could not even run RT in this game, but I had 30fps at 1440p high settings without RT in the base game. Now on the RTX4080S I get much higher fps with RT and more demanding DLC map:
-67fps at 1440p native + ultra RT
-88fps + medium RT (RT shadows + RT reflections)
-172fps DLSS Quality + FG + Ultra RT
This game also support Path Tracing:
-37fps at 1440p native
-68fps at 1440p DLSS Quality
-120fps at 1440p DLSS Quality + FG
Thanks to the DLSS technology I can play cyberpunk even with the most demanding path tracing and still have a good experience. From my perspective RT feature is absolutely usable and I dont want to play without unless there are some stuttering issues (like in Star Wars Jedi Survivor).
Great content, love the follow up from the previous RT video, hope to see more of these kind of videos again when next generation cards releases.
I have 2 thoughts and/or requests that I would like to share:
1) Is the image output quality equivalent between the cards? (I assume so, but I wonder if the graphics drivers would trade quality for performance; it has happened in the past with a few titles)
2) Would love to see a HUB "Cost per Frame" Chart analysis when using Ray Tracing for each of the cards
Regarding cost per frame: eyeballing your charts against prices where I live, the Nvidia 4090 is around 25% better than the 7900XTX at cost per frame in the good non-path-tracing titles, while Nvidia is 50% better at cost per frame when also including the path tracing titles.
Generally AMD has a better cost per frame in rasterized rendering. However, now that we are starting (emphasis HEAVILY on *starting*) to see some value in RT rendering over raster for once (in some titles at least), this "RT cost per frame" chart would make more sense to include in benchmarks down the line, or could at least be reconsidered.
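The eyeballed cost-per-frame figures above are easy to make concrete. A small sketch; the prices and frame rates are placeholders for illustration, not HUB's measured data:

```python
# Cost per frame = card price / average fps; lower is better value.
# Prices and frame rates are illustrative placeholders, not HUB data.
cards = {
    "RTX 4090":    {"price": 1900, "fps_rt": 90, "fps_raster": 140},
    "RX 7900 XTX": {"price": 870,  "fps_rt": 55, "fps_raster": 115},
}

for name, c in cards.items():
    print(f"{name:12s} RT: ${c['price'] / c['fps_rt']:.2f}/frame, "
          f"raster: ${c['price'] / c['fps_raster']:.2f}/frame")
```

With placeholder numbers like these, the XTX wins cost per frame in raster while the gap narrows or flips once RT-heavy titles are averaged in, which matches the eyeballing above.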
I have to question your use of the RTX 4090 because it skews results two ways:
#1 - The percentage of gamers that can buy this card is miniscule. Hell, even the XTX is WAY out of most gamers' budgets at its current price of $870USD, let alone the RTX 4090 which is MORE THAN DOUBLE the XTX's price at $1900USD. The use of the RTX 4090 is especially unrealistic.
#2 - It makes RT look more reasonable than it really is. All you've done is shown people that if they have $1900+tax to spend, they can use RT. Showing RT on the only card that can actually use it ALL THE TIME without mentioning "Oh, BTW, this card is almost $2000USD, so good luck with that!" doesn't help anyone.
You also speak of performance differences between the two cards but you have to remember to give context. Not everyone knows that the RTX 4090 is more expensive than the RX 7900 XTX. Hell, even fewer know that they can buy two RX 7900 XTX cards and still get money back from the cost of a single RTX 4090. I really think that you should have used the RTX 4080 Super because at least it would show RT performance at a (somewhat) realistic price-point while being priced (somewhat) close to parity with the XTX.
Really, really though, I think that using the ~$500 price point would have been better, like an RTX 4070 Super vs. an RX 7900 GRE. Yes, the numbers are going to suck, but that's the point. People should know that even at ~$500, RT will either be bad or terrible regardless of what card you buy.
And yes, I don't know where you got the $1600USD price from, but PC Part Picker shows the least expensive RTX 4090 at $1900USD, a Gigabyte Gaming OC model. Maybe they're jacking the prices to make $1600 look good on Black Friday, but Radeon sure isn't doing that because $870 for an ASRock Phantom Gaming model is a phenomenal price.
I played cyberpunk with pathtracing at 1080p 30-40fps with a 3070 and still had fun. And with my 4070ti I play with pathtracing fine at 1440p high. I don’t need 100+ fps. It’s a single player game
Fair comparison considering it’s almost holiday season with a looming product launch not long after that.
The idea is to compare the best GPUs that can run RT from AMD and Nvidia.
@@JoaoMXN the only thing i didn't like was they only tested everything at 4k. couldn't they have also shown whether the reduction in performance is as significant at 1440p? if you google the amount of ppl playing at 4k it's just 2%. i know 1440p isn't much better at 12%, but at least it's a better representation or middle ground for people to determine if the information is relevant to their rig.
@@wihukeon that I agree. 1440p nowadays is set to overtake 1080p.
I do take issue with saying that some instances of RT "clearly improve the visuals". It was more saying that some don't "clearly improve the visuals", but that does very much imply that some do, and that is an entirely subjective take. I remember you guys loved the RT in Wukong, but I found it jarring, and having grown up in the country, nowhere near as "realistic" as y'all said it was. Visuals are the kind of thing where there is no "objective best"; it's all purely subjective. Something you might find outdated and ugly someone else might find visually stunning or impressive.
Think of it this way: the Mona Lisa is a great painting, very well done obviously. Now take that and put it next to a framed version of, say, the Grand Festival promo art for Splatoon 3's Final Splatfest, and I would imagine if you took 500 people who had never seen either of them before and had no prior knowledge of either, the percentage of people who liked either piece of art would be fairly evenly split. You would need people going in blind, because if they knew the Mona Lisa was an old painting by a famous artist they would feel inclined to say they like it more purely because of its history and not because they actually liked it more.
Let's not forget as well, the painting "Onement VI", a canvas painted blue with a white line down the middle, not only sold for nearly $44 million, but is almost certainly someone's favourite piece of artwork ever made. Where I look at it and see something a toddler could do, someone else looks at it and sees meaning, and that's the beauty of art, and neither take is wrong. There is no objective "right" or objective "wrong"; no art is better than any other art.
You could argue texture quality, but here is my counter: I personally think Star Ocean: The Second Story on the PSX is one of the best looking video games ever made. The towns are iconic; they stand out vividly in my mind. Arlia, the small town you start in, with its fountain just outside the mayor's house, and just south of that the town's builder who left his kids to go work on a project in the next town over, next to a small farm enclosure, and right next to that the general store. You carry on to the entrance of town where you see the gate, weathered and showing the age and rural nature of the town. Then you get a beautiful bridge crossing a small river, in which you can just about see the sun, but what you can clearly see is a very realistic reflection of the house on the north side of the river. As you take in the shimmering water you cross the bridge, passing a very aged-looking church of some kind, and you then find yourself at the entrance to the Sacred Forest, where you see the town's water wheel and the fences of a larger farm, as well as a small river boat. This is a game from 1998 that I STILL feel has some of the most visually appealing locations in any game I've ever seen.
Sorry for this wall of text; it's just something that has always bothered me. You can say "I think this looks better than that", but you can't really say "this clearly looks better than that", because it's all subjective and everyone views art differently.
Bad comparison; the 7900XTX sits between the 4070 Ti Super and the 4080 Super, so it should be compared with those 2
A video without a 4090 running at 1080p? Blasphemy!
These are not comparing counterparts for similarly priced cards, it's a comparison of the best case scenario for both brands.
@@Mathster_live I agree with the OP on this.
When RTX is built into the model names, I think it would be an important investigation to show whether or not the lower end of the stack can actually use the feature they are sold on.
The 4090 is a prosumer product. It's not relevant when trying to help consumers make informed choices. Which I would assume this channel is about
@@SansAppellation Bro, how can you not infer, using basic logic and reasoning, that if ray tracing is not really worth it or is negligible in terms of graphical improvement even on the best of the best from both brands, then the lower-end cards below them are objectively worse with RTX ON.
You don't need to test every card lower in the stack, just use BASIC logic and assume how they perform relative to the information at hand.
I'd like to see RT compared at price points. My friend is spending around 350 euro on a GPU. What's best to get? A 6750XT? A 7600XT? Or a 4060ti or 4060/3060 12gb?
same with price points of
£/$200 - low end
£/$350 - Mid end
£/$500 - High end
£/$750 - Very High end
£/$1000 - Enthusiast Extreme
7900xtx holds up pretty well to the 4090. Given the price comparison one has to ask, is it really worth paying all that much more. 🤔
This has been a highly informative series, I had honestly expected there to be a bigger improvement in RT by now.
No
The 7900 XTX looks so tempting to buy at this stage!
do it
It's a great card, even with the lower raytracing performance. Raytracing is so worthless 99% of the time. Baked in, traditional lighting effects, still look gorgeous when done correctly.
@@TooBokoo RT is not worthless lol. Just because the 7900XTX can't run RT well doesn't mean it's worthless; RT looks incredible in most games. Also, FSR is very, very bad and AMD has no comparable features at all. No software support either.
You missed the train. I bought it 8 months ago. No regrets. I invested the excess money people spend on a 4090 into a 49" OLED monitor and HDR capability. A much more enjoyable gaming experience than the RT feature. The XTX is a beast, but the future is dark if Nvidia keeps a monopoly on the high-end segment in the near future.
@kraithaywire this is just flat out wrong. The 7900xtx has tons of software features and they're far better than Nvidias offerings imo
I'd hazard a guess that most Nvidia users enable dlss with heavy RT settings and still retain decent fps.
Thanks Tim. It would be interesting to have a video where you check what's the minimum tier of hardware, for both brands, needed to enjoy Ray Tracing together with an acceptable level of image quality (eg: DLSS Quality at 1080p being not acceptable, or barely passable). In the video "We want cheap AMD graphic cards, so we can buy GeForce" you were making an excellent point: "how well does the tech scale to a 4060 type of product?". And so, a video where we literally check "how much dolla' do you need to enjoy RT?" or "can we enjoy RT on a 400 bucks GPU?"
That would be very interesting.
As a 14700K + 4090 owner I can definitively say that Ray Tracing is NOT worth it. I didn't buy my 4090 for ray tracing but more for the ability to play games at 4K @ 120fps on my LG OLED. Furthermore, something Tim neglected to mention is that you often times need a beefy CPU as it is also a requirement for ray tracing.
Imo Tim also should have shown two comparisons: Ultra Settings & Optimized Settings. For example, his depiction of CP2077 at Ultra w/ PT runs at 43fps, BUT w/ optimized settings that number at minimum doubles to 85-90fps. Personally, if I'm running CP2077 w/ PT optimized, I'm running Balanced mode, which yields 95fps in the city and 120fps everywhere else.
I think if Nvidia really wanted to make RT more desirable for the High End GPU users as that imo is mostly who it's for; they need to try and implement an in game toggle (hotkey). Obviously that won't work for every implementation of RT but there is a mod for CP2077 that allows you to toggle RT/PT on/off with a hotkey which is great bc imo the only time RT/PT benefits is night time and slower game play experiences which don't really require higher frame rates anyways.
So in short don't let Nvidia bait you into Ray Tracing as a marketing ploy as it's mostly just a gimmick and still NOT worth it.
tldw - buy 7900 xtx and with the money you saved buy a qd oled monitor, better experience overall than any game with RT
"There are no good examples of AMD sponsored games with strong ray tracing", yeah because you didn't include Avatar and you forgot that Callisto is AMD sponsored. And your subjective grading of Callisto's RT is absurd, just check digital foundry's tech analysis of that game's innovative RT.
HWU down bad on this.
Avatar fits right into the average difference between 4090 and 7900 XTX with "Good RT" titles, excluding path tracing, with the 4090 outperforming the 7900 XTX by about 55%. But also, you cannot turn off Ray Tracing in AFOP, so there is no baseline for non-RT performance, that is why it was excluded, I believe.
Callisto Protocol is marked as AMD-sponsored at 29:45, and it is also in line with the ~50% performance difference characterizing the "Good RT" without-PT scenario described in the video. I agree that RT in Callisto Protocol is better than HUB rated it, but even if included in the mix, it doesn't change the overall conclusion.
@@cpt.tombstone yes I was not commenting on performance or the exclusion of Avatar, just on the false narrative that AMD sponsoring always means holding back on RT. And I disagree about Avatar, it's one of the absolute top RT games below path tracing, once again I will take DF's analysis of RT over HUB.
Those few games are outliers and dont represent the conversation as a whole. Kinda like john not reviewing Hogwarts legacy because of a small small minority of outcries, over lmnopqrstuvwxyz people. No amount of representation or analysis will ever be perfect. Lol
@@opinali Avatar is a good counterpoint, but it's still a trend that AMD-sponsored games are not really making RT shine. There are exceptions, just as there are exceptionally bad RT implementations on Nvidia sponsored titles.
I'd like a performance comparison between the 7900xtx, 3090 (both second gen raytracing hardware), and 4080 super (similar price and performance to 7900xtx)
Turn on RT and you get a visible fps hit, but the visuals are almost the same. We call that a gimmick
But according to DF, once you zoom in 400%, you WILL see that extra RT shadow on a tree in the top right quadrant.
That is surely worth the fps hit.
NV and their shills desperately trying to convince people to part with 700+$ for these overpriced joke of their "mid tier" cards.
Short answer: No.
Long answer: Not really.
Excellent video as always! My 2 takeaways are
1 - pathtracing is pointless as not even a £1500 GPU can run it above 60fps even with DLSS
2 - Metro Exodus proves that RT can be both well done and look fantastic, and doesn't need to cripple performance on all GPUs. I own the game myself, and using very high rather than extreme, with VRS set to 2x, medium RT and hybrid reflections, it runs really well on my 7900XTX
Do keep in mind this is a 4K test. Lower resolutions + DLSS (most of us have 1440p displays and deem them adequate) paper over a lot of the performance impact.
But i also think path tracing should ideally not be fully RT-based; it could use RT for near hits and SDF/SH for far hits, limited trace depth, or some sort of advanced ray reuse.
@@SianaGearz Path Tracing by definition is fully RT-based. The difference between "Path Trace" and "Ray Trace" in games is really a matter of either doing each pass individually (Classic Raytrace modes) or doing everything all at once in a singular pass (Path Tracing modes). Although in a technical sense, all Raytracing is Path Tracing.
In computer imaging / rendering like in Blender, Path Tracing is an optimization of Ray Tracing that spreads rays outwards from the camera itself to capture the world (Games do this regardless of RT or PT) instead of the alternate and far less optimized method of spreading rays from every light source first and hoping the rays hit the camera.
Lmao. You either call it pointless because you can't tell the difference or pointless because it's a massive difference that looks amazing but it runs too slow. Gpus will continue to get better at ray tracing. Path tracing is extremely cutting edge currently.
@@SianaGearz but with a 4090 you're either playing at 4K or you want high FPS; if anyone buys a 4090 to game at 1440p60 they have more money than sense. And once you go down the GPU stack the performance is going to be much worse.
@@joeykeilholz925 pointless in terms of performance, it looks incredible, in some ways it's generations ahead of mainstream tech, but the issue is the performance on a £1500 GPU is not what most PC gamers would accept (60FPS minimum)
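To make the "rays spread outwards from the camera" definition a few comments up concrete, the core loop of a path tracer looks roughly like this. This is a heavily simplified sketch: the scene intersection is stubbed out with a random stand-in so the control flow actually runs, where a real tracer would query a BVH and sample real bounce directions:

```python
import random

# Skeletal path-tracer loop: paths start at the camera and bounce until
# they hit a light or the bounce budget runs out. The scene is stubbed
# with randomness so this runs; a real tracer intersects a BVH here.
random.seed(0)
MAX_BOUNCES = 4

def trace(ray):
    # Stub: 20% chance the path hits a light, else a grey diffuse wall.
    if random.random() < 0.2:
        return {"emission": 10.0, "albedo": 0.0}
    return {"emission": 0.0, "albedo": 0.7}

def radiance(camera_ray):
    throughput, total, ray = 1.0, 0.0, camera_ray
    for _ in range(MAX_BOUNCES):
        hit = trace(ray)
        total += throughput * hit["emission"]  # light carried to the camera
        if hit["emission"] > 0:
            break
        throughput *= hit["albedo"]            # energy lost at each bounce
        ray = "scattered"  # a real tracer samples a new bounce direction
    return total

# Average many noisy path samples to estimate one pixel's brightness.
pixel = sum(radiance("camera_ray") for _ in range(256)) / 256
print(f"estimated pixel brightness: {pixel:.2f}")
```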
3060 is the most popular gpu based on steam hardware survey.. I would've probably looked at that vs a $2000 halo product.
And on a 1080p monitor with either DLSS Quality or DLDSR+DLSS P.
@@albert2006xp DLDSR and DLSS is amazing but I wouldnt recommend to set DLSS to peformance because its drops quality too much. DLDSR and DLSS quality is best while balanced is as low as I would go because after that quality hit is too much.
@@Extreme96PL DLDSR+DLSS Performance is same render resolution as regular DLSS Quality and looks better. Of course you'd turn it up if you can but if you can consistently do that, time to go up a monitor tier imo. If you can turn DLDSR 1.78x + DLSS Quality on 1080p monitor consistently in games, you're better off going 1440p monitor and doing DLDSR+DLSS Performance from there to 1920p.
Low end common cards (like the 3060 he mentioned) will probably be running DLDSR+DLSS performance from 1080p though and go from there if possible. The difference isn't that big as the difference between not using DLDSR and using it.
@@albert2006xp DLDSR 2.25 and DLSS quality render at 1080p I think. I use DLDSR with DLSS quality on my 3060Ti and it almost always hold 60fps. Difference in som games compared to native 1080p is colosal its completly fix blur caused by TAA and provide best anti-aliasing.
@@Extreme96PL If you use 2.25 yes, I use 1.78 most of the time since the difference is small enough and the performance is needed. In no way my 2060 Super is qualified for native 1080p. It's a 1080p DLSS Quality kind of card.
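The render-resolution claims in this thread check out arithmetically, assuming the commonly cited per-axis DLSS scale factors (Quality ≈ 0.667, Balanced ≈ 0.58, Performance = 0.5) and treating the DLDSR factors (1.78x, 2.25x) as total-pixel multipliers:

```python
# DLDSR factors multiply total pixels, so the per-axis scale is
# sqrt(factor). DLSS modes scale each axis directly (common figures).
DLSS = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}

def render_res(width, height, dldsr_factor=1.0, dlss_mode="Quality"):
    axis = dldsr_factor ** 0.5 * DLSS[dlss_mode]
    return round(width * axis), round(height * axis)

# 1080p monitor: plain DLSS Quality vs DLDSR 1.78x + DLSS Performance
print(render_res(1920, 1080, dlss_mode="Quality"))                         # ~(1281, 720)
print(render_res(1920, 1080, dldsr_factor=1.78, dlss_mode="Performance"))  # ~(1281, 720)
# DLDSR 2.25x + DLSS Quality lands back at roughly native 1080p
print(render_res(1920, 1080, dldsr_factor=2.25, dlss_mode="Quality"))      # ~(1921, 1081)
```

So DLDSR 1.78x + Performance really does render the same ~720p as plain DLSS Quality at 1080p, and 2.25x + Quality renders at roughly native resolution, as both commenters say.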
The other thing to consider is the SSR implementation. Very few games feature edge-to-edge SSR which can appear very distracting at the edges of the screen for the keen viewer.
Some games mask this screen space rendering drawback, but they are very few. In most cases it matters little to the vast majority, especially during faster paced games.
I never mind it too much, save for Ark: Survival Evolved - where it was literally a full inch away from the screen border. In other games, SSR can also cast reflections of near objects into far off reflective surfaces.
The less noticeable issue is the absence of under-object reflections, where the detail under cars and such is simply gone as it is not captured by SSR in the viewport space.
In spite of these inconsistencies, I still use SSR for almost everything (except Metro Exodus: Enhanced Edition, where it is not an option but runs far better than the original raster release).
I will switch to Ray Tracing further down the line, when RTX hardware isn't so taxed and can match or exceed where we currently are with traditional rasterized performance.
I was hoping you'd do the nvidia testing on the 4080S. So we would have had a better "raster vs rt" comparison.
I have a 4080 and the thing is a beast, so I try to throw on ray tracing whenever I can...
And to be honest the only game I have ever noticed it making a difference in is Cyberpunk 2077. Every other game I just go back, turn it off and get the performance boost.
The input lag alone makes me not want to turn on RT in many cases, even in story based games UNLESS i am playing with a controller. Anything that requires aiming with a mouse is almost always not a good idea.
What? Is this an AMD issue with RT? I haven't seen any more than 2ms input lag with RT on using a 3090 and I'm pretty sure that is due to DLSS.
@@TheZoenGaming low frame rate = lag.
Bruh 80 fps is 12.5 ms lag, 120 fps is 8.33ms lag. U think 4 milliseconds will make the difference? Lmao
@@geofrancis2001 Lag and low frame rate are completely different. Lag is the time it takes for an input signal, such as clicking a button, to be reflected on screen.
@@kerkertrandov459 m8, you're confusing frame time and lag.
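For the numbers in this exchange, a quick sketch of the arithmetic: frame time really is just 1000/fps, while end-to-end input lag stacks extra pipeline stages on top (the non-frame-time figure below is purely illustrative, not a measurement):

```python
# Frame time is 1000/fps; end-to-end input lag adds polling, CPU/GPU
# pipeline, and display latency on top of it.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

ILLUSTRATIVE_OVERHEAD_MS = 15.0  # polling + pipeline + display (made up)

for fps in (80, 120):
    ft = frame_time_ms(fps)
    print(f"{fps} fps: {ft:.2f} ms frame time, "
          f"~{ft + ILLUSTRATIVE_OVERHEAD_MS:.2f} ms end-to-end (illustrative)")
# 80 fps: 12.50 ms frame time, ~27.50 ms end-to-end (illustrative)
# 120 fps: 8.33 ms frame time, ~23.33 ms end-to-end (illustrative)
```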
What's crazy to me is that Nvidia tried to blacklist you guys for not covering ray tracing as much as they wanted, yet regularly showing that it chops performance by 20-50% is surely much more detrimental to its public perception than not showing it at all.
I don't go into a game looking for my reflection in mirrors or puddles of water. I think spatial sound does more for gameplay.
Listen guys, right now RTX may not seem necessary to us, but in the future devs will remove basic features from the graphics settings and move them into the RT option to force us to turn it on. If we don't, the graphics will look the same as old games while consuming many times more resources.
Imagine buying a 4090 or 5090 and there still being games where a heavy RT implementation cuts your fps by close to 50% :D
I mean, it just works: you buy pricey cards because Jensen said they have the best RT and you need it, rofl
50% of insane performance is still insane performance.
"No"
That's why I'm not on YouTube. Most questions don't take 3 or more minutes to answer.
Anyone recall when 3D movies were all the rage for the immersion and "experience"? Yeah, RT is a solution to the question, "how do we separate more money from the consumer?"
Abraham Lincoln: Vampire Hunter is pretty cool in 3D.
Yeah, but your answer doesn't explain anything, while this video gives you everything needed to be fully convinced why.
@@_r4x4 The "explanation" is soft data that is subjective, based on your personal perspective and wants. I don't need to justify my *opinion*, as that is all the answer is.
27:53 When the 7900XTX is fully half the price of the 4090, people would have to be crazy to buy Nvidia right now.
28:32 you sure?
@@phm04 Raster performance is infinitely more valuable than RT performance, especially at half the price.
@@phm04 Absolutely yes.
Until every game ships with a clean RTGI mode, there's no point in turning on ray tracing.
I loved playing Metro Exodus Enhanced on the 3080, but nearly every game studio working today has botched their ray tracing implementation. Almost no one does RTGI, and if they do, you're forced to enable highly taxing RT shadows or RT reflections along with it.
To make ray tracing worthwhile you need full RTGI lighting with good optimization; again, just look at Metro Exodus Enhanced.
19:34 And even in ME:E the 7900XTX performance is "good enough".
Do note the 1% lows are much more consistent on the Radeon card in ME:E, so your minimum FPS is at worst 40% lower than the 4090's, and I'll repeat: the 7900XTX is half the price.
But the 7900 XTX already costs half the 4090's price... the cheapest 4090 is $1800, the cheapest 7900 XTX is $870.
The 4080S is just better than the XTX in every area except rasterization. You'd have to be crazy to buy the AMDip.
Great video, I think this serves as the basis for future ray-traced benchmarks. By that I mean it answers which games should be used for them: the ones where RT transforms the image significantly, plus a couple of popular titles where it's better overall, as these are the games people looking to try ray tracing will be most interested in. With a smaller game set to work with, it will be easier to do videos like "which is the minimum card for ray tracing, in the games that matter?": can a 4070 Super run them at 1440p (with/without frame gen), or do you need a 4070 Ti Super to do it? "Can the 8900 XT achieve playable fps in the RT games that matter?" You could even include 3440x1440 🫣🙏 (or at least extrapolated results), as 1080p is probably not a resolution used by people buying these cards.
Could you review the difference between native 4K resolution and stretching 1080p across a 4K monitor?
1080p 60hz bros stay winning. ✨
Exactly. If I don't have an expensive card, why would I go for higher render resolution or fps instead of improving the game's image itself?
Black myth Timkong
I think a comparison between a 4080 Super and the 7900 XTX would have been the better one.
Yeah, sure, I get more ray tracing when I spend like $1k more...
Whole video is silly
Nvidia is still better than AMD
COPE COPE COPE
Yes you are spot on.
The 7900xtx beats the 4080s in raster.
@@FO0TMinecraftPVP Oh, so you get your GPUs from Nvidia for free, right? Right?
For someone planning to build a new PC in a few months, this kind of video is extremely helpful. Due to budget I'm really leaning toward the 7900 XTX, so having a video like this but for 1440p would be absolutely amazing as that's far more manageable for this hardware.
My current PC has an FX-8350 CPU and a 1070 Ti GPU. I'm so out of date that no modern titles are manageable at all.
I would love to see what your findings would be at 1440p. Especially in the games that you found benefit most.
Thanks for your hard work!
We have not moved on from the days of the RTX 2080. I'm on a 7900XT and ray tracing is basically nonexistent for me.
I'm on a 2060 Super and ray tracing is very common for me. I just finished The Casting of Frank Stone, Until Dawn and Silent Hill 2, all with hardware RT enabled. Also the Alan Wake 2 DLC with RT Low.
A revolutionary idea: how about instead of wasting silicon on ray tracing, we use it for more raster/shader cores, so you could play games at a sharper resolution, with a higher level of detail, higher-resolution shadows and volumetric lighting, and other cool visual effects? This could turn a low/medium setup into a high/ultra one, significantly improving game visuals and/or performance for the same cost. Especially in a stagnating GPU market, this could give a huge edge over the competition. How come they haven't thought of this before?
If you can get more than 90 FPS with RT, then that is fine, as RT is more geared toward adventure games than games that need 500 FPS.
Yeah, but the problem is how much you have to pay for those 90 fps...
Depends on the game, of course; Cyberpunk's Overdrive mode is spectacularly bad for this, while also being the best showcase of the kind of visuals you can achieve.
Tip for the HUB team: when comparing data on two metrics, use a scatter graph of performance loss vs. visual improvement with a trend line. Bar graphs were basically useless here. A rough sketch of what that could look like follows.
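Something like the sketch below; the game list and every number here are made up purely to illustrate the chart, not taken from the video:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical per-game data: fps cost of enabling RT vs. a subjective
# visual-improvement score. Every value is invented for illustration.
games       = ["Game A", "Game B", "Game C", "Game D", "Game E"]
perf_loss   = np.array([48, 22, 41, 35, 28])   # % fps lost with RT on
visual_gain = np.array([9, 8, 8, 6, 5])        # reviewer score, 1-10

fig, ax = plt.subplots()
ax.scatter(perf_loss, visual_gain)
for name, x, y in zip(games, perf_loss, visual_gain):
    ax.annotate(name, (x, y))

# Least-squares trend line so "good value" games stand out above it.
slope, intercept = np.polyfit(perf_loss, visual_gain, 1)
xs = np.linspace(perf_loss.min(), perf_loss.max(), 100)
ax.plot(xs, slope * xs + intercept)

ax.set_xlabel("Performance loss with RT enabled (%)")
ax.set_ylabel("Visual improvement (subjective score)")
plt.show()
```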
Did you find this consistent at 1440p? Do the drops in performance align with your 4K ultra RT results?
This doesn't say much if you only test 4K, which the majority don't even bother with.