If you watch Daniel Owen's video, he showed that a good amount of FPS is gained when switching to the High setting with barely any visual difference, while going from Very High to Cinematic doesn't do much in terms of performance. I've noticed with Unreal Engine games in particular that the Epic settings usually cut performance by a ton without a noticeable visual difference, so I almost always play on High because the performance gain is a lot greater than any visual cost.
There is no doubt bloated game engines hit performance hard.
The ray tracing comparison was very interesting. I've run this bench so much on both sides and I felt the ray tracing looked different on AMD, and it wasn't just noise, which would be expected, but something a bit beyond that, which is quite unusual.
Nvidia is probably running DLSS ray reconstruction while AMD is running traditional RT denoisers, hence the different look.
@@ZeroZingo Sure, that is part of it, but it's a lot more than that. You can tell the GI and shadows are a lot more accurate on Nvidia. This isn't the case with other path traced/RT GI games. I've a feeling a driver update could address this, not that it would be worth turning it on anyway since the performance hit is crippling.
He didn't even turn on ray tracing in the Nvidia test at the end. You need to restart for that option to change. We watched him not do that and go straight into the benchmark. He hasn't a clue what he's doing.
@Rocko130185 All of the tests were done beforehand; I actually recorded my settings separately. If you have concerns, the thing to do is ask questions.
@Bang4BuckPCGamer Your average was 112fps when everyone else with a similar setup is between 97fps and 100fps, myself included.
I guess this is the first RTX 5090 game.
Nah this is the first RTX 4080 1440p game.
@@club4ghz No one promised that a 4080 or 4090 can hit the max settings; the devs only promised that it can be played on a 1060.
@@ToniaHan Getting a 63 fps base frame rate without frame gen and 104 fps with frame gen on a 4080 at 1440p, so they delivered right in the sweet spot for me. It should be perfectly playable even without the game ready drivers, which are still yet to come.
@user-zn5xu1nj4h For the fastest gaming graphics card in the world, less than 2 years old, I'd say that's a womp womp. The XTX got destroyed, but the 4090 is still $1,600, and it can't hit 60fps at 4K native after a year and 10 months?
Nah this is the first RTX 3080 1080p game.
The reason they look different might be Nvidia's "ray reconstruction", which I think came with DLSS 3.5 and Cyberpunk 2077, if I remember correctly.
This game looks stunning. I have two 6700 XTs and two 7800 XTs. Can't wait till the updates and drivers come out after release. The game is beautiful even on medium settings.
31w AMD - 100w Intel ))) That Smart 3D Cache
The amd processor isn't having to pump out the same amount of frames
@@CriminalGameplay It's also not killing itself through oxidation.
3D cache has nothing to do with the power usage. You just heard the buzzword online and spam it everywhere now lol
@@kalle8960 The 7800X3D is super power efficient; yes, it's pulling fewer frames than the 14900K, but it's not exactly pulling 1/3 of the frames for 1/3 of the power cost now, is it?
@@Endermanv-ot2if Yes, it is power efficient. That efficiency has nothing to do with 3D cache though.
This game uses ray tracing by default, regardless of the settings. The 'Ray Tracing' option actually controls path tracing, while other forms of ray tracing remain active. This can negatively impact AMD GPUs. The developers should have included an option to use alternative lighting and shadow technologies instead of forcing ray tracing on by default.
With UE5 Lumen as default no dev is going to waste time on a non-RT solution. Non-RT also makes your environments really static and is highly limiting to game design, let's not go back.
It's all business these days. The game running ray traced by default is a part of UE5. 🤷♀
The devs just use Lumen in UE5, which is RT; if they had to add rasterized lighting back to the game, it would be a massive time and financial cost for them.
RT is the way to go anyway; it is not the devs' fault that AMD makes bad GPUs.🤷
@@Jakiyyyyy This is an Nvidia sponsored game. I'm not knocking AMD GPUs, but why would they optimize this for AMD? Think about it. There are plenty of ways to not have ray tracing running in the background just because this is UE5.
It's always been this way: new technology arrives, they stop using the old approaches, and they don't bother making versions for old GPUs. It's normal.
RT is not off. RT can't be turned off. PT is off, Lumen is always on which is UE5's software RT method. Ray Reconstruction also makes a nice difference to image quality and details vs it being off.
First of all, this game doesn't currently support ray reconstruction, and he's referring to hardware RT. Lumen is low-cost software ray tracing, yes, but it's also much, much worse.
@@Jasontvnd9 Yes, we know that now, but it does use an alternative denoiser as is evident, so whatever they implemented is doing a much better job than the traditional hand-tuned denoisers found in all current games using RTGI and path tracing.
Yes, I'm aware what is being referred to. Lumen is lower fidelity and temporally more unstable than traditional RT in other engines, as evidenced by all of those games that don't use UE5.
Lumen is perfectly capable of being hardware accelerated, and it actually is.
@@niebuhr6197 Not with RT cores it isn't... no point in doing the BVH calculations, or else you could just do HW RT.
Insane cards
The trees and grass in this game look very good, like they are real geometry instead of the usual alpha-tested textures.
That is probably Nanite, not sure though.
@@oropher1234 Nanite is not used for everything there, I can clearly see the grass patches swapping to higher LOD when they get closer.
@@damara2268 Yes, I can see that, but it might be in use for ground geometry and maybe for the trees as well.
I think Nanite on everything may be a bit too demanding for this gen of hardware. Ah well, if it's just Nanite and there is no Lumen whatsoever, it could be very playable.
This game uses ray tracing all the time. There is a software Lumen fallback solution, which is a type of software ray tracing; that's why it's so demanding.
The "Full ray tracing" setting means it uses hardware Lumen, which is somewhat comparable to path tracing.
Also, this crying that the game is demanding... Yes, finally a game that pushes the envelope. Finally a studio that is putting out some state-of-the-art visuals. The 2080 Ti ran Quake 2 RTX at 20FPS in native 4K in 2019. It ran Control at 1080p@60. It's the same story here. It's good that we have games like this.
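For context, that software/hardware split is an engine-level toggle in stock UE5. Whether Wukong honours user overrides of these is just my assumption, but as a rough sketch, the relevant console variables in a vanilla UE5 project's Engine.ini look like this:
[SystemSettings]
; 1 = use Lumen for global illumination and reflections
r.DynamicGlobalIlluminationMethod=1
r.ReflectionMethod=1
; 0 = software Lumen (screen traces + distance fields), 1 = hardware Lumen (uses the GPU's RT hardware)
r.Lumen.HardwareRayTracing=1
The in-game "Full ray tracing" option presumably flips a lot more than that last cvar (the comments below point at the NVIDIA RTX branch adding its own path-traced effects), so treat this as an illustration of the Lumen fallback, not the game's actual config.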
Not really; with full ray tracing on an AMD GPU it is indeed hardware Lumen, but with an Nvidia GPU it is path tracing. Don't say I'm wrong, because Nvidia itself has advertised it as path tracing on its YouTube channel.
@@chien.nguyen. I don't think there's a whole new path tracer in there 🤔 This is a UE5 game that uses hardware (accelerated) Lumen. I believe Digital Foundry confirmed that, but now you've confused me and I'll need to double check this 🤔
What state of the art?? Where in the world are these visuals better than Hellblade 2? Even Hellblade 2 runs fine on the 7000 series maxed out. This is a typical Nvidia move to push ray tracing in order to promote their cards, but in reality there's nothing major in the visuals, only performance loss for people with cards below the 4090 and 4080.
@@yeahitsme4427 You may be right about NVIDIA trying to sell GPUs... that's how it works, I wouldn't argue with that. But comparing this game to Hellblade 2 is unfortunate because they are both beautiful games. Both are state-of-the-art, and I never said it's more advanced than HB2 (even though I think it is; the water caustics alone are technically on a whole other level compared to the water rendering in Hellblade). Also, this game seems to be pretty scalable. It runs on a 1060... poorly, but it runs. So you can imagine it will run pretty well on everything above that, if you're not daft enough to whack everything to max.
@@yeahitsme4427 Bruh, that's so fucking true. It's not that AMD users are mad because their cards are bad, this is just rigged. Somebody needs to code some shit so your PC recognizes AMD as Nvidia and run the test. 🤣
Apparently there are RT-heavy water caustics (light bouncing underwater) in the Full RT mix. This benchmark doesn't really show it, but it's likely there and enabled.
The RTX 4090 is a monster card, although at that price it should be. As a console gamer moving to PC I've been totally blown away by how good it is; now my Xbox looks like ass in everything I play on it!
I have noticed that pretty much every GPU apart from NVIDIA's 4000 series is performing really badly in this benchmark. For example, my RTX 3090 Ti gets about 24fps where, with the exact same settings, an RTX 4070 Super (which is slower on paper) averages 35fps, roughly 46% faster. The latter IS NOT ~46% faster in any other game, even when comparing RT scenarios. Let's just say I'm REALLY waiting on a driver that'd fix this issue...
@@Hateroz you're saying there will be a performance boost on 30 series?
@@fvallo They better optimize the shit out of the 20 & 30 series for this game, otherwise they're gonna have a lot of people pissed. No one bought RTX 4000; most people skipped the generation, holding onto their 3060 Ti/3070/3080s because these are still really capable GPUs at 1080p & 1440p!
@@Hateroz Worst of all, even on the best settings the image looks like a shimmering mess. UE5 is an unoptimized mess; compare this demo to the Avatar full-RT jungle demo, which runs 4 times better and just looks cleaner and sharper... UE5 should have been released in 2027.
@@Hateroz Bro, it's ALREADY optimized, why are you trying to go all Cinematic + native on a 3070 or 2070? Go with High and reduce the shadows and global illumination, it will drastically improve fps.
When you turn on "Full RT" in this game it enables path tracing. The 4070 Super is faster at path tracing than the 3090 Ti despite being about the same in regular RT; Nvidia added path-tracing acceleration hardware to RTX 4000, which makes a big improvement.
There seems to be way less shadow flickering on AMD. Hopefully it's something Nvidia can fix with a driver update because it's very distracting.
It's exactly the same on both...
My 4070 Ti Super struggled with ray tracing when using anything higher than DLSS Performance. Unreal Engine 5 at 4K is crazy demanding.
Even the 4090 isn't pushing crazy numbers under those conditions... the hardware just isn't keeping up with what's being pushed at full settings.
But it's really not a problem, except that at peak settings the impact really isn't much more impressive than the High settings...
Maybe it comes down to the monitor at that point?
@@imAgentR Well, the benchmark had an update and I found a sweet spot of DLSS Quality at High settings with frame gen on. Averaged 96 fps at 4K, so I'm happy about that.
@@stevegomez232 That's awesome! I'm testing now and didn't realize there was an update. I have a 1650, and a 4060 Ti 16GB I'm looking at.
@@stevegomez232 It improved performance after update?
The Nvidia card runs ray reconstruction in this game, so yes it should look better compared to an AMD card when RT is enabled.
5090 is going to be 3X faster than 4090 in this benchmark at 4K with Full RT on. My prediction.
5080 more than enough
Yeah this is path tracing, I think the 5090 will be called PTX and not RTX anymore.
What sources do you have to say it will be 3x faster? The leaks give a MAXIMUM of 50% more... especially now that they have no reason to push, given the non-existent competition.
@@MarcoGiacon 50% maybe in raster, but with Full RT / path tracing it's going to be more. The RTX 4080 is 3x faster than the RTX 3080 in this benchmark.
@@club4ghz Let's hope so... keep in mind that the 40 series, at least at the high end, was made to counter AMD, since it was believed that their high end was a hugely complex MCM design that would have crushed any card NVIDIA had designed... in the end it all turned out to be fluff... look also at the 4090 Ti that everyone was praising... if there is no competition, NVIDIA will not use the best chips for gamers when data centers pay 4x/5x more for them... nothing against NVIDIA though... if I had technology that was practically YEARS ahead of the others I would make people pay for it too... we'll see.
I tried it on my 6800 XT, and even on high settings, no RT, 1440p, the image looks really noisy and granular (especially in water and leaves). After seeing your video it's clear to me that this game needs the official AMD drivers and then a re-test, because there are too many optimization problems on the AMD side (and that's fine as of 14/08/24: the game isn't released yet and it was developed with strong sponsorship from NVIDIA). We'll see when it is officially released.
The grain comes from FSR and TSR; something is wrong with those two. If you switch to DLSS it goes away fully; with XeSS it goes away partially.
No, this is not an AMD problem, the same thing happens on my 3060 Ti: 1440p and 1080p native, and 1440p + DLSS or any upscaler, look just like your description. But once I run the benchmark at 4K, even with TSR Performance or DLSS Performance (not FSR: that always looks awful), all of a sudden the image is perfect, almost as good as 4K native (but noticeably less sharp), or equivalent to 4K + DLSS Quality. It seems like the engine really likes 4K, and that's okay with me because 4K + DLSS Ultra Performance gives the same FPS as 1080p native while looking significantly better, and there is also 4K + DLSS Performance which is a bit more demanding and a bit better looking. It's clear the engine is built around upscalers. Or maybe the game is only optimized for the RTX 40 series, but I don't think so.
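That tracks with the internal render resolutions, assuming the standard DLSS input scales (Quality ~67%, Performance 50%, Ultra Performance 33% per axis); a quick back-of-the-envelope check:
# rough pixel-count comparison, assuming the standard DLSS scale factors
def internal_pixels(w, h, scale):
    return int(w * scale) * int(h * scale)

native_1080p   = 1920 * 1080                       # 2,073,600 px
uhd_ultra_perf = internal_pixels(3840, 2160, 1/3)  # 1280 x 720  =   921,600 px
uhd_perf       = internal_pixels(3840, 2160, 1/2)  # 1920 x 1080 = 2,073,600 px
So 4K Ultra Performance actually shades fewer pixels than 1080p native, and 4K Performance shades the same amount, but both get reconstructed to a 4K output, which would explain matching the 1080p frame rate while looking noticeably better.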
@@andreasquadrani7159 turn super resolution up. The lower you go, the worse it looks
@@stixktv__9999 I'll try; maybe it is just a problem due to the fact that this benchmark comes from an early build of the game.
10900K+6800XT 1080p 100% res Cine Quality 39fps. This benchmark is brutal.
8:00 I know this video is 4 months old, but I can somewhat answer this. This game is using the RTX branch of UE5 (a ray traced global illumination solution/library), and what you see is the devs optimising heavily for Nvidia, because, well... it's using the RTX branch of UE (literally the latest patch has a note about further optimising RT for Nvidia specifically). So the way the engine reacts with AMD isn't strange, and going by GN's Battlemage review, a 7900 XTX can only match a 3060 Ti at 720p with medium RT. That's... shocking, and not because AMD can't do RT. Intel suffers too, and they lean harder into RT than AMD does.
AMD even mentioned in their driver notes that ray tracing has an issue.
Known Issues
Overly dark shadows or desaturated colors may be observed while playing Black Myth: Wukong when Global Illumination is set to Medium or higher. Users experiencing this issue may set Global Illumination to Low as a temporary workaround. [Resolution targeted for 24.9.1]
@@Bang4BuckPCGamer The comparison literally shows the AMD shadows are lighter, not darker. Also, the overly dark shadows were... really, really dark, akin to having crushed blacks in HDR. (ReSTIR GI could be a factor, but that again goes against the symptoms reported lol). The only thing that leaves is desaturated colours, which could be an answer, but the colours look comparable; it was specifically the shadows, as you point out. Would this be worth a revisit? :o
EDIT: I just tested it with my 7900 XTX and the shadows aren't as dark and defined as the 4090, but (slightly) darker than the 7900 XTX on the left. Interesting 🤔🤔
As you increase the resolution to 4K you should reduce tessellation to minimum on AMD; this is a technique NVIDIA uses, hence they locked their tessellation setting away from the user.
8:31 Check that statue head on the left, clearly bugged on the AMD side. Guess the devs didn't spend too much time on the AMD driver side for this game ^^
The AMD driver has not released yet, so we will see.
Well, it is a Nvidia sponsored title after all
@@Bang4BuckPCGamer The RT setting is literally called "Nvidia full RT"; the name already tells you it's not gonna run well on AMD. It actually enables path tracing, for which RTX 4000 has special HW units while RX 7000 doesn't. Also, Nvidia has a highly improved denoiser for path tracing inside DLSS; that's what makes the differences in the shadows etc. The same thing goes for Alan Wake 2 and Cyberpunk too: path tracing looks better on Nvidia because of the denoiser inside DLSS. Nvidia calls the denoiser "ray reconstruction".
@@damara2268 I'm aware of all this, but Path Tracing looking bad on the 7900 XTX is not true.
Cyberpunk 2077 th-cam.com/video/QQfSzPjeihY/w-d-xo.htmlsi=CSPHGD9gm6Q5XP5g
Alan Wake 2
th-cam.com/video/sLQ7cLcAXsg/w-d-xo.htmlsi=-3JPQdxRDwVuQua_
Bro, Hunt: Showdown got an engine upgrade; it's now called Hunt: Showdown 1896. Any chance you could compare it on both the 4090 and 7900 XTX? Tried it on my AMD card and it's beautiful but quite brutal on performance, another "can it run Crysis" kind of vibe :)
Can't help but notice the much lower CPU power draw on 7800x3d vs 14900ks
Is the latest AMD driver game ready for black myth wukong?
I mean this isn't just a price thing; if the RTX 4080 Super was there... it would still have provided way superior rendering quality compared to AMD... FSR just isn't great.
Tested on my 4090 and the conclusion is: go DLSS Quality/Performance with FG and everything maxed out, and forget it.
A good alternative for a solid 60fps+ is no RT, no FG, native 4K, and just switching a few Cinematic settings to Medium or High; barely any difference in some cases.
I think it’s possible that AMD is using only Lumen for RT. That could explain the shadow intensity difference with NVidia at the max RT settings. NVidia probably switches to its proprietary RTX implementation in the game replacing Lumen.
Edit: both cards fall back to Lumen when no “extra” RT is enabled.
Full ray tracing with an AMD GPU is indeed hardware Lumen, but with an Nvidia GPU it is path tracing; don't say I'm wrong, because Nvidia itself has advertised it as path tracing on its YouTube channel.
@@chien.nguyen. 👍🏻
To be fair, the clarity difference in ray tracing comes, if I'm correct, from the low fps on the AMD card resulting in more noise and blurry frames in between, which makes for a worse visual experience in the finer details of the scene.
RT performance on 7900xtx doesn't make sense
perhaps after a driver update we going to see some improvements?
I imagine so.
Hardware issue
Why does the RT present differently?
More proof that the 7900 XTX gives the best price-to-performance ratio in my opinion... Hard to beat a card that can give similar results to the best card on the market at almost half the price... Unless you're a die-hard ray tracing fan it's an obvious choice.
I own both (4090 and 7900 XTX). The RTX is a total monster, though it's often bottlenecked by the CPU and that can limit the fps. But the RX 7900 XTX is good and fast enough for gaming today. And there isn't an $800 difference in performance between the two GPUs.
Maybe AMD is using some kind of software version of path tracing whereas Nvidia is using hardware accelerated path tracing. This sort of thing happens when using Lumen in UE5 and it would explain both the visual and performance differences between the two.
Hardware accelerated looks better but performs slightly worse I believe
Nvidia's 40 series supports opacity micromaps, which makes tracing against opacity-masked materials like foliage way faster. It also has shader execution reordering and HW-accelerated BVH traversal and BVH build. Can't wait for Digital Foundry's analysis when the game comes out.
Nvidia is releasing an optimized driver for Black Myth: Wukong in a few days; I wonder whether AMD will release one too?
It's always worth it, and yes, like some of the comments say, I can't wait to get a new PC with the 5090 from Origin.
Puke...
@@Jasontvnd9 Might need to get that looked at. :)
I usually run my 7900 XTX overclocked. I got an average of 94FPS on Very High, no rays, 2K.
I then set the GPU to default clock speed and got an average of 150fps. LOL
My specs:
CPU: 7800X3D
Motherboard: MSI MAG Tomahawk
RAM: 64 gig of 6400MHz
GPU: 7900 XTX Red Devil
PSU: 1000 watt 80 Plus Gold
SSD: 7200 clock speed, or whatever that means.
Nvidia has a certain smoothness to it.
Frame gen will feel a bit laggy when playing but look so much smoother when recorded.
did you oc/uv the xtx?
Naturally
hey thanks!
Damn, guess this game is optimized better for Nvidia? 45% difference at native 4K. Or has AMD not released the driver update for this game yet?
Well, it is an Nvidia sponsored game, and I hear this game is being bundled with their GPUs.
@@Bang4BuckPCGamer i see makes sense then ^^
@@darkfire3691 And? Other UE5 games used the same technique and didn't have these problems at all on AMD GPUs! It's clearly a lack of optimization for AMD here, I think.
@@darkfire3691 It doesn't use lumen RT, it uses path tracing
I own a 14900k. Is it worth going from that to a 14900ks?
No
Your CPU is already high end one. I guess GPU is what needs to be considered more.
@@CrocdileLee RTX 4080 FE
I'm not sure what's happening. I know this is the 4K benchmark with a 4090, but my mind was blown that I could run this at 1080p on a GTX 1660 Ti in Cinematic and get 45 fps... it shouldn't even run like that at that setting.
Totally unfair test, the 7900 XTX graphics card held the 7800X3D back by far, it was the bottleneck. Test again with both the 14900KS and the 7800X3D, both with RTX 4090s, for a real comparison.
@iLegionaire3755 why are you talking about the CPU, in a GPU bound scenario? What a rookie comment.
Interesting to see how efficient the game is with memory usage,
and how much more efficient NV is when it comes to memory usage.
Oh yes, 50% more performance for the green team... ouch.
@@gametime4316 100% more cost
Bad RT performance in RT mode on AMD is completely understandable, but with RT off it's a joke (even if Lumen is used, it still doesn't use HW RT, so the result should be at RTX 4080 level); the game is completely unoptimized for AMD GPUs.
Stalker 2’s only real competition in terms of visuals.
I want to see a 4K 100% full ray traced, no frame gen benchmark xD I think the 4090 suffers but the AMD GPU explodes.
Uhh no it's unplayable on the 4090 also.
That 4090 carried that Intel CPU hard LOL. Imagine if you just used it on the 7800X3D, though it probably wouldn't have given much more. When you turned on the RT I kinda liked how AMD did the shadows, they look closer to actual shadows since shadows aren't really that dark IRL; however there's fuzziness on the AMD side. The FPS wasn't too bad but the image was lol.
Thousands of dollars per GPU and they still struggle 😮
Weird that the 7900 XTX always uses more VRAM than the 4090.
That's not weird. Nvidia uses colour compression, AMD does not.
Nvidia sponsored the game. But even despite that, the difference is monstrous!
In 4K the RX 7900 XTX is similar to the 4090 with Fake Frames (DLSS 3) turned off. The 3080 & 4090 are both at the same 50FPS average with Fake Frames (DLSS 3) turned off.
This feels like the real next-generation Crysis; the current generation of graphics cards is officially obsolete.
Only with UE5. Everything else runs fine, Frostbite, Unity, etc.
If AMD doesn't get their act together on RT support I'll have to go to Nvidia, which I really don't want to.
In the game files there is a ray reconstruction DLL.
@@Livebasura69 It doesn't surprise me. This is an Nvidia game.
@@Bang4BuckPCGamer Since you were talking about the difference in how the screens handled ray tracing, perhaps it was active on the 4090
Replay the RX 7900 XTX with driver 24.8.1 and the new Microsoft patch, and with SAM on in the motherboard. I have a 5800X3D with this GPU and get 110/120 fps at ultra settings + ultra RT in 4K.
I don't need to replay anything. 1. SAM is already on; I am not sure why you assumed it was off, as if I am some rookie. 2. Driver 24.8.1 gives less performance by at least 3 to 4 fps.
The game is optimized for Nvidia cards obviously.
this game is so heavy 😭
Yes, it's a disaster when AMD fanboys say RT is not necessary, when now RT is no longer an option but mandatory. Don't ask me why, but this game uses Lumen even when RT is turned off, which proves that RT is very important; it's just that GPUs cannot handle it easily. In 2 years, when the new generation of GPUs is released with the RTX 5000, RT will definitely be improved. If AMD doesn't improve RT on the RX 8000 series, that's the end for them. Don't argue with me, because something like the RX 7900 XTX has RT performance equal to RTX 3000, and the problem is not only RT performance but also RT technologies like denoising... a lot of things AMD doesn't have right now.
Nvidia pricing will go even more insane when this goes mainstream.
Why would you compare 7900 to 4090? 7900xtx=4080
The graphics quality difference is negligible but the RT performance is unacceptable.
With optimized settings on my RTX 4070 I was getting 80-90 fps with Very High RT.
At what resolution? This is important.
@@Bang4BuckPCGamer 1080p
@@denzylRTX Makes sense, you should have led with that.
Waiting for 5090 then.
Wukong is just proving that Lumen IS enough for good lighting without the huge performance impact; path tracing is overrated.
Coming from a 4070 Ti Super user.
Path tracing is not overrated
Coming from a 4080 super user
Absolutly agree
Path Tracing is love!
Coming from a 4090 user.
Path tracing vs Lumen is a day and night difference, coming from a 4090 user.
7:44 wtf AMD 36fps to 114 fps NVIDIA ????????
DLSS with FG is better than FSR FG. Omfg
5090 job :)
The RTX 4090 uses less power than the 7900 XTX and wins, and with RT on it wins easily xd RTX 4090 the king ;)
What is your estimate for the RTX 5090 with the following settings: Display Resolution: 4K (3840 x 2160), Super Resolution: 100% (DLAA), Frame Generation: OFF, Full Ray Tracing: ON, Full Ray Tracing Level: Very High, and every other setting on "Cinematic"?! Will the 5090 achieve a stable/average 60 FPS or more (without FG; as a comparison, the 4090 does not achieve an average of 30 FPS)?! What would you think if even the 5090 ended up below 60 fps?
Other than that, I am somewhat underwhelmed with UE5: there is pop-in, shimmering and anti-aliasing flickering in various titles (RoboCop, Immortals of Aveum etc.). While UE5 is very heavy with all features and very taxing on your system, it does not produce results as "clean" as one would imagine.
The 5090 will be around 30-40% faster than the 4090, so there won't be native 4K anymore; all they focus on now is improving DLSS and FG to get more fps and hide the true potential of the card...
@@cronos1675 I guess it depends on the Tensor Core improvements. Maybe the 5090 is way more capable than the 4090 when it comes to RT workloads. Like the comparison shown here, where the rasterization performance is not that much different (especially if you compare prices) but with ray tracing the 7900 XTX can't keep up anymore. Maybe the 5090 has a huge advantage over the 4090 there as well, and maybe that could push it over 60 fps using RT at native res. We'll see. It will be interesting.
Could also be that DLSS 4.0 or 5.0 has 50-Series exclusive features due to the different chip design.
If the 5090 will not be able to push more than 60 FPS in this with native res, I will be disappointed.
@@holgerbahr8961 I think the 5090 will be 30% better in raster and 100% better in RT performance; RT will sell that card.
@@fvallo With those capabilities it should be able to run this benchmark at 60 FPS in 4K native with RT.
4090 more shadows? From my POV, the 4090 has less or missing global illumination here with RT on, hence the darker shadows... That probably explains the 3x more FPS compared to the AMD.
I said the RTX 4090 has more accurate shadows. This is path tracing, that is why the 7900 XTX is 3X behind in performance.
Solution: RT Medium.
39fps for 7900 XTX medium RT.
@@Bang4BuckPCGamer Sorry, this was the solution for the RTX 4090 in order to get more performance 🙂 Thank you for your videos.
Dude, AMD is using 64x tessellation while Nvidia hides its setting and uses 32x, no wonder AMD is struggling. Change your setting. Lol
AMD Radeon RX 8000 (RDNA4) is Ray Tracing.
This is a 4090 and it's still only giving 56 fps 😆
Going to disagree on the visual quality.
dude...
AMD 7900 XTX vs RTX 4090, really ???!!!
Does AMD have a more powerful graphics card I don't know about?
@@Bang4BuckPCGamer No. But the most powerful AMD card can't stand against the RTX 4090. Maybe comparing it with the RTX 4080 Super would be fairer.
I don't own an RTX 4080 Super. I am comparing the best from AMD and the best from NVIDIA.
@@Bang4BuckPCGamer thank you very much for your content. you are one of the best in this field 🌺
Dead on arrival... if they don't fix performance within 3 months after release I won't buy this :)
welcome to next gen, each game using UE5 will be like this starting from now on.. so you won't be playing much, it's time to upgrade ;)
@@sesionlive2494 I have a 7800X3D and a 7900 XTX and the benchmark test can barely pull 50-65 FPS in 4K...
One card will have random crashes, driver timeouts, and frame time issues. Whereas the Nvidia one will be smooth as butter :)
Black Myth: Wukong laughs at your AMD 7900 XTX.
AMD Radeon is so bad at RT, it's truly disgusting how bad it is. And FSR is also very bad. AMD needs to fix their drivers and cards to compete with Nvidia.
@@dennythescar80s8 it's not that bad. Better than an RTX 3080. It's more RTX 3090 level.
@@Bang4BuckPCGamer are you on something ? RTX 3080 is over all Radeon cards in RT, all tech media test it and got the same result.
@@dennythescar80s8 th-cam.com/video/rL8-aA0vDCg/w-d-xo.htmlsi=_CG0j_Ib_Oxdx2if
You went and put ray tracing on without restarting, meaning you didn't switch it on at all. You should actually read what the graphics options say. You're about the 5th clown to do that. And don't say you didn't, because we watched you switch it on and go straight to the benchmark using DLSS.
All that stuff was done beforehand. It's called editing. You should try it.
Looks mediocre, runs like utter garbage ❤ I really don't like UE.
Yeah, this game looks like trash. A 90 fps hit for some darker shadows??? Lmao, so dumb. At least in Cyberpunk ray tracing actually looked good. This game looks like poop with RT on or off. Oh, you only need a $2,000 GPU to run it at 23 fps native. Woohoo.
@@SemperValor Path tracing gives you true shadows, not "some darker shadows". If you really compare Lumen vs path tracing in the benchmark you will see a day and night difference... with full RT enabled it looks "alive", while with only Lumen it's closer to "just another game".
FSR looks terrible. My 4070 beats the 7900 XTX with RT on in 4K.
And the game still looks mid on Cinematic.
Thx man, the RX 7900 XT sucks, I'm gonna buy a 4090.
And the water is so bad on FSR, omg.
The AMD card took a dirt nap as soon as RT was turned on lmaoo.
Hunk of junk.
Ugly graphics and bad optimization