I've had a 7900 XTX Nitro for two years and I think there is still only one game I play that has ray tracing: Chivalry 2, Ready or Not, Age of Empires 4, PUBG, Battlefield 2042. When I bought the XTX my reasoning was that ray tracing was still too early, so I chose 24GB of VRAM and more raster for less money than Nvidia. Very happy so far
@@TheSometimeAfter Probably Battlefield, but I am too lazy to verify. It may have been the very first game. I believe Lord Emperor Jensen used an RTX 2060 6GB to demonstrate it instead of the very mighty 2080 Ti 11GB.
Dude, at the moment there are more than 100 games with RT support and we are seeing new RT games almost every week. I am sure you have more than one RT game in your library. As for your other argument about raster performance vs. Nvidia, I assume you mean the RTX 4080S, because that's the closest competition. The 4080S and the 7900 XTX offer comparable raster performance: according to TechPowerUp the RX 7900 XTX gets 96.1fps averaged over 25 games tested at 4K, while the RTX 4080S gets 95.1fps. My model is OC'd to 2925MHz and 820GB/s memory bandwidth, so I would probably get over 100fps in their test. A stock 4080S may lose in some raster games, but only if you aren't willing to play at much superior image quality, because the combination of DLSS Performance + DLDSR absolutely destroys TAA native image quality while still offering a performance boost. As for the initial $100 price difference, the RX 7900 XTX is certainly more expensive in the long run due to its higher TDP. I saw bank4buckPCgamer's YT gameplay videos and his 7900 XTX at 99% GPU usage ALWAYS draws 465W. My RTX 4080S draws between 260-315W depending on the game (at 99% GPU usage), so that's about a 150W difference. My entire PC draws 430W at max. 150W in use for 8 hours a day is 1.20 kWh, or 438 kWh per year. I have to pay 452zł for that ($109 USD at today's exchange rate). In my country, the 7900 XTX would definitely end up costing more money in the long run; maybe not the first year, because I'm not playing for 8 hours daily, but after 3 years that would be the case.
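As a sanity check on the cost figures above, here's the arithmetic as a small Python sketch. All inputs are the commenter's quoted numbers (150W delta, 8 hours/day); the zł/kWh rate is back-calculated from the quoted 452zł for 438 kWh, not an official tariff.

```python
# Annual running-cost delta between two GPUs, using the figures
# quoted in the comment above (not measured values).

def annual_energy_cost(extra_watts: float, hours_per_day: float,
                       price_per_kwh: float, days: int = 365) -> float:
    """Cost of an extra power draw sustained for a year."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * days
    return kwh_per_year * price_per_kwh

extra_w = 465 - 315           # worst-case draw difference in watts
rate = 452 / 438              # ~1.03 zl/kWh, back-calculated from the comment
cost = annual_energy_cost(extra_w, 8, rate)
print(f"{extra_w} W for 8 h/day = {extra_w * 8 / 1000:.2f} kWh/day, "
      f"{cost:.0f} zl/year")
```

Running it reproduces the comment's figures: 1.20 kWh per day, 438 kWh per year, 452 zł at the inferred rate.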
The anger is that the majority of the AMD community doesn't like seeing game options disabled... it's especially an eyesore to see your card unable to run a setting when it's turned on... I hate seeing RT disabled when every Nvidia user has it on.
- Dying Light 2 is an Nvidia sponsored game. - It has heavy ray tracing. - No FSR 3 even though it's been out for a year at this point. - Got DLSS 3 more than a year ago. Sounds about right lol.
But god forbid when Starfield came out and got no DLSS, people freaked out; it even made headlines, with an AMD spokesperson being confronted lol. It's a sponsored title, so why are they acting so surprised when Nvidia has done the same 😂
@tomthomas3499 Check the DLSS vs FSR comparison from Hardware Unboxed. Frame generation is on par, but the upscaler is inferior, quite noticeably at QHD and FHD.
With RDNA 4 around the corner I really wanted to look at how the 7900 XTX holds up in some of the more complex RT games. There are 12 games in total I wanted to look at, and I was going to include all of them in this video, but I decided against it because it would be way too long, so I'll be splitting it into 2 or maybe 3 parts. Are you guys looking forward to CES tomorrow? I am.
Always look forward to seeing how far the new tech can be pushed (5090). I'm likely to skip this generation though, unless there's something that's too good and well priced enough to ignore. The XTX should be fast enough for 4K and 1440p high refresh gaming for another couple of years.
I'm not considering upgrading my 4090, but I really want to see if AMD can close the gap in "heavier" RT games with RDNA4 and if FSR4 looks any good. Regarding Nvidia, I'm actually less pumped to see what they have in Jensen's oven, simply because the GPUs presented will likely be more expensive than Ada and the new features will probably take 2-3 years to matter much in games if they need work from the devs to implement.
@@MrMeanh Yeah, curious to see AMD's architectural RT improvements. Know what you mean about the new features; depending on what they are, it can take time for them to become commonplace in new games.
I am particularly interested in FSR4 and its potential compatibility with RDNA3 GPUs, such as my recently purchased 7900XTX, which was very expensive for me.
@@TerraWare I am very bearish on RDNA 4's ray tracing. I hope they give 4070-level RT in the $250 GPU class like Intel did with Battlemage.
Had this flavor of 7900xtx for about a year now. No plans of upgrading any time soon, this thing is a beast. Haven't converted to the 4K display religion yet, either. 2 x 1440p displays working well for me.
Just want to say that I really like small(er) channels like yours because I feel like they have such a cozy atmosphere. Just a bunch of people talking to each other, because they share the same hobby and are interested in tech, analyses, GPUs, games, graphics, new rendering tech, etc. It's like being with a bunch of friends geeking about graphics, while drinking tea. 😃
Running a 7900 XT with a 12400F (not upgraded it yet), playing on an ultrawide 1440p monitor, and Indiana Jones is hitting between 80-130 fps at ultra. Pretty good for an Nvidia-sponsored game that forces RT.
@@TheGlobuleReturns Yes, and the restriction is not just that it has to be Nvidia... it has to be a 12gb vram Nvidia card... xDD But hey, as you say, the game looks great anyway and most importantly, it's a very good game.
I don't know about failed but that was insane man. I was expecting a reveal like with RX 6000/7000 but it was like they forgot about the DGPU guys watching. The media brief had more to say. I'm surprised.
A further friendly correction: who the fk plays Dragon Age Veilguard? Why is it even considered? If I go a bit extreme, both of these games. There are better games out there. So many of them.
Honestly, my thoughts on RT are that it's a mixed bag at this point. Some devs like Insomniac, Capcom, and Ubisoft Massive are doing great work and transforming the game with RT reflections and RTGI while still maintaining very high FPS. At the same time, most of the other RT effects feel like they are a plug-in with bare-minimum effort that simply exists for the sake of a sponsorship. Take RT shadows and RT AO in Ratchet and Clank Rift Apart, or RTXDI in Star Wars Outlaws, where they barely do anything but cost a metric ton of performance. As for path tracing with UE5 and Black Myth Wukong, that is a straight-up dumpster fire. All they are doing is making Epic look like a god-tier optimizer, where the game looks the exact same but runs at 1/4 the frame rate. It also feels like they are manufacturing problems like flickery shadows in Black Myth Wukong to sell a solution. Like, seriously? How are shadows flickery in Alan Wake 2 and Black Myth Wukong but not in a game like Breath of the Wild that runs on a 10-year-old tablet SoC?
I've recently seen a slightly more detailed path tracing vs ray tracing vs no-RT comparison. Well, maybe not much more detailed, but better locations in my opinion. PT had ZERO of the problems you mentioned, aside from lower performance of course.
I actually sold my XTX, though I don't know if I sold it too early due to potential tariffs happening. But yeah, time for the 5080/5090 this gen after going AMD high end for 2 gens (6900 XT, 7900 XTX). When AMD targets high end again I'll give it a try :)
Well that was a strange presentation. They showed nothing about RDNA 4. Their media brief had more to say. Weird. Strix Halo, on the other hand, was the coolest product they showed: 16 Zen 5 cores with 40 CUs of RDNA 3.5. That's a heck of an APU.
@TerraWare Yeah, what a weird presentation. Apparently it was just a preview? Also apparently FSR is 9070 series specific?? I was left more confused watching that, seriously lmao. AMD is cooking with APUs, we won't see it on AM5 probably but someone like minisforum will make a slim mini PC with it. It should be more than enough for a lot of people.
@@Accuaro Check out Hardware Unboxed's latest video. They asked AMD questions, he touches on them, and it leaves me with more questions than answers btw. Apparently they weren't given enough time by the CES event organizers so they couldn't show RDNA 4. It's BS, or their priorities are backwards. You'd think that'd be one of their main priorities considering it's what most people watching wanted to see. They also gave Matt Booty a bunch of time phoning it in from Seattle saying practically nothing, and the Dell guy talking about nothing.
The 3080Ti is doing really well. I haven't had many issues with 4K RT on my 7900XTX, but I don't own many path tracing games, so normal RT games run well, especially with FSR FG taking you into high-refresh territory with some upscaling. In this video I think Wukong and SH2 are both on really old versions of UE5, so if they ever get ported to newer versions they should run better on both AMD and Nvidia cards. There is also software RT these days, but with Nvidia's marketing we are moving into it being 'Full RT' or nothing now, I guess.
I think having Full RT/PT as additional options like in Indiana Jones is perfectly fine; the XTX does really well in that game, even though there's RTGI by default. I think regular RT will become more and more common though, which is fine. Even the 6800 XT does pretty well in Indiana Jones, for example, even at 4K, and that's without upscaling, just some tweaks. No biggie.
@@TerraWare I meant as in games having RT by default like Indiana Jones, but because it's not an option in the settings, a lot of people think that game doesn't have RT unless you turn on 'Full RT', so that is another masterclass by Nvidia marketing. Full RT is not even an option on AMD so people will think AMD cards can't do ray tracing, and I expect more games will come out with 'Full RT' supporting only certain nvidia cards. Full RT is a lot easier to understand than Path Tracing, and when other non Nvidia cards can do normal ray tracing, it's an easier way to differentiate what Nvidia cards can do and what other cards can't do.
"full ray tracing" is the term I'm looking for at AMD CES. If they don't use that term, then most likely path tracing is not a priority and they might just wait for udna era. Which I don't think most people even care about. However, moving to hardware for their RT should help and going full ai should also help. Fsr3 FG already produces a significant amount of FPS compared to the competition, full ai should reduce any artifacts whilst still maintaining a significant FPS boost, maybe produces even more FPS. Even though AMD is just targeting the low and midrange, they have to produce a compelling showcase or I think a lot of their consumers will jump ship. No point in waiting for UDNA to show any ambition.
@@XDDX_NZ Ah I see what you mean. It makes sense. Nvidia are masters when it comes to marketing. Unless you're in the know you probably don't know Indiana Jones uses RTGI regardless.
Black Myth Wukong is such a badly optimized game on Radeon. I can't help but think it's intentional. One thing that really bothered me: the game ran relatively well on my XTX, running at around 90 average all the way up to the "hand face" boss in chapter 6, after which all 24 GB of VRAM fills up and the game chugs to a stuttery mess, eventually crashing. The whole area after that, flying around on the cloud, is a whole different game with regard to performance and after a short time, I end up with the same issues. Does anyone else have this issue? Is it the fact I'm on Linux, or do other XTX users on Windows also have this issue?
AMD just doesn't have the necessary hardware to do ray tracing at the level of Nvidia. Nothing to do with optimization in this case. You can always play with just software lumen.
This isn't intentional. RDNA3 cards, and even RTX 30 cards, simply do not have the technology to run PT at reasonable speeds. My 4080S is over 2x faster in this game even compared to an RTX 3090 Ti, and usually both cards are close. With medium PT at 1440p I saw only a 3% relative difference compared to Lumen (123fps with medium PT vs 127fps with Lumen in the built-in benchmark at 1440p DLSS Q + FG, cinematic settings), so IMO PT in this game is very well optimized.
@@AlexFoxRO Oh, I'm not using RT or PT, other than what's already there by default. It's just bad optimization to have the game run fine for the first 80% and then require a different card for the last 20%. Like, did the devs just give up?
I still can't see this upcoming gen competing even with the 40 series in ray tracing, definitely not path tracing, let alone ray reconstruction, so it's not just the raw numbers. Unless AMD pulls a rabbit out of the hat with some brand-new dedicated ray tracing cores, which would always be welcome from them. But they are just so far behind Nvidia and their tensor cores at the moment. Let's see how these 2025 cards play out.
As long as they keep up on RT it should be fine. I don't think they'll be able to catch up on PT anytime soon but we'll see what's up soon enough anyway.
Great video. I feel like too many people just write off the 7900 XTX's RT performance. It's not high-end Nvidia level, but it's still as good as a 4070 Ti in most games. I've had a ton of luck with AFMF in ray tracing titles for 120+ fps with full RT. Path tracing always seems to break AFMF though. Path tracing is in a weird state right now anyway: it looks completely different from RT, so it's going to produce a naturally much darker image. Developers don't waste a lot of time on a graphics setting 98% of players won't touch, so they don't really add additional lighting to make the game more playable. It has a ton of potential, but devs need to tailor the game around it a bit.
@@PabloB888 AI. And in case of games - it's always like that: Nvidia releases cards with a limited amount of memory (without reserve, now it's 8GB), and fanboys say: _"Why do you need more? You don't need more! Cock-a-doodle-doo!"_
I mean, to be honest, at 1440p I never exceeded 16GB of VRAM... wait, I did; the game was Avatar, but that's the game that pushes VRAM so high, and Nvidia's limit on VRAM is the reason most games use just 16GB nowadays. But even at 1440p the VRAM usage is 14-16GB, so no one can tell me that 12GB is more than enough in 2025 for 1440p, and who knows what surprises the future will have (wait, Indiana Jones is very demanding if you use max textures, where even the RTX 4080 starts to cry).
@allxtend4005 At 1440p my 4080S has enough VRAM to run Indiana Jones natively with maxed-out settings and PT. At 4K, however, my card is VRAM limited, but even if I had 24GB of VRAM the game would still run like crap, because the RTX 4080S is too slow to run PT games at 4K native with playable fps. With DLSS and lower texture streaming settings the game runs smooth though, and I haven't noticed any degradation in texture quality, so I can't say 16GB VRAM limits my experience. I would be worried if I had 12GB, but 16GB is still plenty. The RX 7900 XTX has 24GB, but you will not even run PT anyway. IMO only 8GB-10GB VRAM cards can limit the gaming experience at the moment; anything above that is still fine and will be for the next few years, because we will still be playing PS5 ports.
Given that I am a fan of RT (only reflections really, for the time being) and unlike some other channels you tested RT in ALL games just as the title said (no clickbait), you earned yourself a so-called subscription (in reality it's just a free-of-charge following lol).
RT and sales prices in Australia at the time were big reasons I went 4080 over this card, 2 years ago. I have a build coming up for my son though, and I'm revisiting it all again to get ideas on what's best for his build. Looking forward to RDNA4, but I'm wondering if this is actually the card to get if I wanna go AMD for this build. Not sure I'll go near Nvidia this upcoming gen yet. I'll wait and see though.
Yeah probably wait if you can. AMD seems to have forgotten to show their GPU's in their presentation. That's one of the strangest things I've seen in some time when it comes to presentations.
The 7000 series handles RT in the majority of games better than many think, but the problem is: the more complex the RT is (so the bigger the visual difference it makes), the more fps Radeons lose compared to Nvidia. Combine that with the lack of DLSS, when FSR imo looks bad to the point I would prefer not using it and lowering settings instead, and it obviously leads to going native with RT off. Btw, that's how some "preferences" are forced by reality.
Before I start watching the video: I play Cyberpunk 2077 and Dragon Age Veilguard on an RX 6600 with ray tracing. Reflections in CP2077 and the Selective RT mode in Veilguard. CP2077 at 1200p with FSR 3 Quality and Frame Generation, with AMD Fluid Motion Frames 2 activated in the AMD Adrenalin software to mitigate possible FrameGen latency issues as much as possible, and AMD Super Resolution set to upscale 1200p to 2160p. Playing at a mixture of Medium, High and Ultra settings. Not the greatest picture quality, but it improved once I turned Super Resolution ON. Framerates? I have no clue. I play without onscreen stats so I can focus on the game. If something bothers me too much I modify settings a bit. Perhaps I should turn off RT reflections during the day in CP2077 and increase overall settings and/or resolution. RT reflections are most noticeable during the night on reflective, usually wet surfaces. It greatly increases the immersion in my opinion. So I am certain I would be quite pleased with 7900 XTX performance.
You're doing it right. Many people feel they must max out every setting to be able to enjoy a game. I've been trying to break that mindset with some videos I make.
7900 XTX owners always say they have no interest in RT. I can understand why people weren't interested in RT 7 years ago (in 2018, we only had like 3-5 games with RT support), but in 2025 we already have over 100 games with RT support, and almost every week we see new games with it. You just can't ignore RT anymore unless you really don't care about graphics at all. Games often look much better with RT, and on RTX 40 series cards you can run RT games quite well. I have never played a single RT or even PT game where my performance dropped to unplayable levels. Quite a few RT games from my library run at over 60fps even at 4K native (RE3 Remake in particular runs at 130-200fps), and with DLSS + FG on top of that almost every RT game runs at 120fps, with the exception of PT. PT games require lowering DLSS to Performance (but that's fine, because with the latest 3.8.1 DLSS and my ReShade settings I still get a 4K-like image) to get a smooth 80-100fps. PT is extremely demanding, but in Black Myth Wukong I saw a situation where PT was quite cheap: I get 123fps with medium PT and 127fps with Lumen (1440p DLSS Quality + FG, cinematic settings), which is just a 3% relative difference.
RT is becoming something that's in newer games whether you care about it or not. Radeon does pretty well in those games, like Indiana Jones at maxed-out settings or UE5 games. Multiple RT effects or more complex RT can be a bit more taxing on Radeon, especially PT, which takes a big hit to performance relative to the Nvidia counterpart.
Pleased you did this video, because as I've been saying, the XTX I tried Dying Light 2 on also did very well. Veilguard sometimes takes a nosedive with reflections, but nothing major in the end, I found with more testing. The 4080 Super did a bit better, but not by much in Veilguard, so that was good. Also, if you use Fade-Touched textures, that definitely eats up a lot of VRAM; my 4080 Super at times got to the brink of its 16GB max, but I found that as long as I don't use FG it's mostly okay. Anyway, lots of people just assume the 7900 XTX is bad with RT full stop, but that's not the case in my experience. Lastly, yeah, can't wait for CES, still deciding what CPU to get. Oh yeah, forgot to mention: motion blur with FSR causes that ghosting around the head of the character, found that out last week.
@@TerraWare And tbh the price for the 9800X3D has gone up where I live in the UK, to £529.99, so I may as well just get the 9950X3D, which is what I mainly wanted in the first place, as I prefer more cores, and I think the 9950X3D will be priced at no more than £599.99. The 9800X3D was £449.99 at first, but I waited lol. The other thing is I did see that the 9950X3D might be 170W TDP, but I'm waiting for confirmation; the 7950X3D uses 120W max like the 7800X3D, so idk if it will be true.
@@TerraWare Oh yeah, the 7950X3D certainly does the job. But I want even better 1% lows and better FPS in CPU-limited games and the upcoming ones, since devs are getting worse lol.
I've had a 3080Ti since it came out, and for the last year I was always looking for a new card, but the Nvidia 4080 and 4090 are way too expensive and the AMD 79xx cards are not good enough to upgrade to... so I totally skipped this Nvidia and AMD generation. I have no idea if the AMD 9070 XT or the Nvidia 5070 Ti will be good enough to make the upgrade worth the money... if I do not get at least 30% more performance, I will not upgrade. I can still get by with DLSS and tweaking the graphics settings from ultra to high.
3080Ti is still a very capable GPU. 30% minimum uplift for upgrade is pretty good. I don't think the 9070XT will be that unfortunately. Maybe the 5070Ti. I think that's supposed to be 16GB vram as well.
Running out of VRAM sometimes manifests as depressed performance without stuttering. It depends on the game; that's why Dragon Age runs much slower without stuttering.
@@Robspassion It's a pretty good card. A bit power hungry but so is the 3080Ti lol and especially the variants of both that I own are factory overclocked.
TV is way too big for a desktop setup imo. 40" would probably be the highest I'd go but TV's these days are perfect for gaming so it comes down to preference.
They will be in part 2 of this video and others. Had to cut it short because the video got way too long after saying what I wanted to about these 4 games so far.
Not gonna lie, near the start when you said "Oh look at that, loads of difference. Totally different" between RT on and off, I thought you were being sarcastic. It looks the damn same.
I'm waiting for CES. I wanna swap my RTX 3060 Ti G6X for a 9070 XT / (non-XT), or eventually a 7900 XT within my budget. The RTX 4070S has one problem for me: 12GB VRAM, and yeah, it's also the slower card. A Ryzen 5600X shouldn't be such a big problem at 1440p.
RT in Veilguard is such a tiny, tiny visual improvement for a wild performance hit. I took a screenshot in one instance where with RT OFF I was getting 160fps and with RT ON I was getting 57. 103fps for an unnoticeable difference. AND I LOVE RT... it's just really bad in DAV.
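For context on comparisons like the one above: fps deltas can be misleading, because fps is the reciprocal of the real per-frame cost. A quick sketch converting this comment's numbers (160fps RT off, 57fps RT on, taken as stated) into frame times:

```python
# Convert fps into frame time (milliseconds per frame) so the
# cost of a setting can be expressed as an additive number.

def frame_time_ms(fps: float) -> float:
    """Milliseconds the GPU spends on each frame at a given fps."""
    return 1000.0 / fps

rt_off, rt_on = 160, 57       # fps figures quoted in the comment above
cost_ms = frame_time_ms(rt_on) - frame_time_ms(rt_off)
print(f"RT off: {frame_time_ms(rt_off):.2f} ms/frame, "
      f"RT on: {frame_time_ms(rt_on):.2f} ms/frame, "
      f"RT costs {cost_ms:.1f} ms per frame")
```

In frame-time terms the hit is roughly 11 ms per frame; the same RT cost would turn a 60fps baseline into about 35fps, which is why fps drops look so different depending on the starting point.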
RT in Veilguard is kind of unusual because it's extremely CPU heavy. Just a guess but you could actually be CPU bound with a massive drop like that. I did a thorough video looking at the CPU cost in this game. Here's a link if interested with timestamp to CPU bottleneck section. th-cam.com/video/_8rtUlTQdRw/w-d-xo.htmlsi=m-Go4cwjt-mMBjao&t=181
@TerraWare Maybe.. I found the only really CPU-intensive thing to be the strand hair. At least for me, my CPU usage didn't change a whole lot when using RT or not. My point is there's such a minimal difference visually with RT in that particular game for such a wild performance hit.. in some of the areas where I took screenshots there is literally no difference in the AO, shadows or diffusion at all, but it still resulted in nearly an 80fps difference! The only place RT really makes any bit of a difference is where it's adding to the SSR.. from what I could tell anyway.
My guess is that they used baked lighting properly, which is why the difference is so small when RT is enabled. In AW2, for example, they needed to go out and deliberately make the conventional shadows and reflections bad to make RT stand out even more.
I'm sorry, but that flashlight still doesn't look anything close to accurate, so I'm really not sure where the value is. Yeah, the light bounces around, but it still looks like a video game, just a different-looking video game.
And why not use AFMF? Nearly double the FPS, so basically way more fps, and yes, latency gets reduced and it does not feel worse in responsiveness. But okay... we do the comparison thing where we use DLSS+FG vs FSR without AFMF and call it a day?

Dying Light 2 is very demanding with ray tracing, but from what I can tell the RX 7900 XTX can even handle Cyberpunk 2077 at 1440p with path tracing and maxed-out settings, at around 60 fps with XeSS (or whatever Intel's upscaler is called), or 80-100 FPS with FSR 3.0 (why not 3.1? but okay, CD Projekt doing their thing), with AFMF enabled of course. And there you go, good FPS compared to Nvidia somehow. Driver issues? None at all.

When we take a look at Avatar or any other game that implemented FSR 3.1 in a good way, we see that FSR 3.1 is not that far from DLSS 3.5. DLSS is still a little bit better, but the main reason is that DLSS is implemented way better than FSR in most games; meanwhile, older RTX cards use FSR nowadays instead of DLSS to achieve their goal of better FPS, because they are locked out of the newest DLSS features.

I tested Black Myth Wukong at 1440p cinematic settings and it runs at 100-120 fps avg, with AFMF 180-240; the RTX 4070 Ti Super achieves around 80 fps with the same settings using DLSS and FG. Now I wonder why the RTX 4070 Ti Super costs as much as or even a little more than the RX 7900 XTX but is so far behind, even in ray tracing. The only thing that card wins is when games have very bad FSR support. And let's note that Nvidia's FG adds so much latency that it really feels bad; this is what I noticed myself, and the RTX 4070 Ti Super was tested with the exact same specs as the RX 7900 XTX.
Nvidia is massively invested in RT. I don't know about catching up to the 50 series, especially when you take into account RR and dedicated cores that are in their 4th gen now, but as long as they make good generational improvements they can deliver a pretty solid product.
RDNA 4 has a dead-end architecture feeling... just like Vega stopped in its tracks while AMD went off-road with another architecture, Polaris, then renamed things to RDNA (everything since has been the Navi architecture), while Arcturus was supposed to be the successor. Instead AMD threw a wrench into the project; Lisa Su doesn't want that architecture in any gamer peasants' hands. It's too expensive for budget users. That's what got Raja Koduri angry about it. AMD isn't doing what Nvidia has been doing, giving the expensive architecture to everyone; with Lisa Su minding the bottom line it's not gonna happen, since she wants more than $2k per card from the Radeon Instinct brand. That's why gamers never get nice things from AMD.
Quick reply: it sucks, but the card is GREAT. The best raster right after the 4090 but much cheaper. I have it and enjoy it, knowing that most gaming ray tracing is currently done with ReShade RTGI. Yes, MOST of RT is currently NOT hardware but software. And there, only raster power matters!
I'm still using an XFX RX 6800 XT and sometimes there's a game or two that perform well with RT and they're usually the ones that are built with it in mind. Metro Exodus Definitive Edition, Avatar Frontiers of Pandora, Indiana Jones, etc. Regardless, the RX 7000 series really disappointed me. It's why I just skipped it and stuck with my 6800 XT, but regardless I was still really underwhelmed that the RT performance is still a generation behind Nvidia. Turning it on suddenly makes your new more expensive GPU unable to keep pace with last-gen products, it's not a good look. I thought it would change with the RX 7000 series but it was exactly the same.
I hear ya. If you got a 6800XT which I have owned one since launch, there's no reason to upgrade to the 7800XT, they are very similar. Only card that made sense to upgrade to from a 6800XT with AMD was 7900XTX imo. 7900XT launch price was a joke. Later dropped to $700, was much better.
I think it's the most overrated game in the world. All the Chinese propaganda, I would say, worked great. It's nothing special, and it's frustratingly difficult, has no difficulty setting, and has a horrible checkpoint system. It has no autosave feature, so you have to, for example, travel through the same area and fight the same enemies over and over again, even when you fight bosses, which is all the time... there are like 80+ bosses. Hitboxes are wrong, combat also feels wrong, sound design is average, the story is nonsensical, and the writing assumes you understand Journey to the West by default. It's just not as great as presented; that's why I think it lost to Astro Bot, which is just pure accessible fun 🤷‍♂ Check out proper critiques of the game and not botted Steam reviews... there is no freaking way the 97% score is representative of this game. The Chinese were told to give it a positive rating, or they just felt like doing so because it's the first AAA Chinese release in ages, but the rating has no bearing in reality.
Aren't we already going for PATH tracing then??? Ray tracing is at its END and has been for a long time now. What's the point anymore in giving newer cards ancient ray tracing INSTEAD of the newer, way BETTER AND FASTER path tracing??
It's around a 4070 Super to Ti, especially mine, which has a 450W BIOS in it and overclocks to 2.1GHz, and the XTX can also outperform those cards in some cases with RT. It's not bad actually. Now in PT or Full RT, yeah, Radeon will take a bigger hit than those cards, as I showed here in BMW.
Mehhh... ray tracing is still a very niche option that doesn't factor into my GPU buying decisions. Games run JUST FINE without RT, plus what percentage of games have RT? 1%? 2%? 5%?? ... just not enough for me to worry about it.
Too much stigma around raytracing, it's obvious by now the best looking games have a lot of RT and new cards should be able to handle it. Perhaps not on max settings, but still take advantage of it. Currently playing Space Marine 2 and while the game looks alright it also looks quite dated, even with the 90GB texture pack requiring a 20GB card. Textures can no longer make up for realtime lighting, reflections and shadows.
Agreed, games look really dated without RT; shadow and especially light appear where they shouldn't, like lips, nostrils and eyelids, where you know someone's face should be in full shadow. For those having problems with the performance loss, there's DLSS to boost FPS and Frame Gen to smooth out any stuttering; it's not perfect yet, but Nvidia is the best in the business so far. After experiencing path tracing in Cyberpunk, I can't go back to playing with no RT.
@@mmadevgame Usable RT tech will definitely trickle down to lower-end hardware; it might not be as fast as we want, but it will eventually. If I remember correctly it was the same deal with shadow settings in games: around a decade ago only the most powerful GPUs could render soft shadows well, but now it's almost a non-issue even for low-tier modern cards.
Part 2 of this video is live.
th-cam.com/video/qQauOmGWghU/w-d-xo.html
So which is the one game? You named 5.
@@TheSometimeAfter
Probably Battlefield, but I am too lazy to verify .
It may have been the very first game.
I believe Lord Emperor Jensen used RTX2060 6GB to demonstrate it instead of the very mighty 2080Ti 11GB.
Dude, at the moment there are more than 100 games with RT support and we are seeing new RT games almost every week. I am sure you have more than one RT game in your library. As for your other argument, you said raster performance vs. nvidia, I assume you mean the RTX4080S, because that's the closest competition. Both the 4080S and the 7900XTX offers comparable raster performance. According to techpoweup the RX7900XTX has 96.1fps based on 25 games tested at 4K, while the RTX 4080S has 95.1fps. My model is OC'ed to 2925MHz and 820GB/s memory bandwidth, so I would probably get over 100fps in their test. Stock 4080S may loose in some raster games, but only if you arnt willing to play at much supperior image quality, because the combination of DLSS performance + DLDSR absolutely destroy TAA native image quality while still offering performance boost. As for the initial $100 price difference, the RTX7900XTX is certainly more expensive in the long run due to higher TDP. I saw bank4buckPCgamer YT gameplay videos and his 7900XTX at 99% GPU usage ALWAYS draws 465W. My RTX4080S draws between 260-315W depending on the game (at 99% GPU usage), so that's about 150W difference. My entire PC draws 430W at max. 150W in use for 8 hours (480 minutes) is 1.20 kWh, or 438.00 kWh per year. I have to pay 452zł for that (109 $USD at today's exchange rate). In my country, the 7900XTX would definitely end up costing more money in the long run, maybe not the first year becasue I'm not playing for 8 hours daily, but after 3 years that would be the case.
The anger in much of the AMD community comes from not liking to see game options disabled... especially, it's an eyesore to see your card unable to run it when it's on... I hate seeing it disabled... RT... when every Nvidia user has it on
@@PabloB888
Using DLSS doesn't mean real 4K gaming...
- Dying Light 2 is an Nvidia sponsored game.
- It has heavy ray tracing.
- No FSR 3 even though it's been out for a year at this point.
- Got DLSS 3 more than a year ago.
Sounds about right lol.
But God forbid when Starfield came out with no DLSS: people freaked out, it even made headlines with an AMD spokesperson being confronted lol. It's a sponsored title, so why act so surprised when Nvidia has done the same 😂
@tomthomas3499 check the DLSS vs FSR comparison from Hardware Unboxed. Frame generation is on par, but the upscaler is inferior, quite noticeably at QHD and FHD
@@Godmode_ON24 I think FSR looked good in this game, maybe the best.
@@tomthomas3499
Yes, but Starfield DOES have DLSS now, does it not?
Hence why AMD has upscaling at the driver level now. Works pretty well too.
With RDNA 4 around the corner I really wanted to look at how the 7900XTX holds up in some of the more complex RT games. There are 12 games in total I wanted to look at, and I was going to include all of them in this video, but I decided against it because it would be way too long, so I'll be splitting it into 2 or maybe 3 parts. Are you guys looking forward to CES tomorrow? I am.
Always look forward to seeing how far the new tech can be pushed (5090). I'm likely to skip this generation though, unless there's something that's too good and well priced enough to ignore. XTX should be fast enough for 4k and 1440 high refresh gaming for another couple of years.
I'm not considering upgrading my 4090, but I really want to see if AMD can close the gap in "heavier" RT games with RDNA4 and if FSR4 looks any good. Regarding Nvidia, I'm actually less pumped to see what they have in Jensen's oven, simply because the GPUs presented will likely be more expensive than Ada, and the new features will probably take 2-3 years to matter much in games if they need work from the devs to implement.
@@dougquaid570 Oh totally. Even the 6800XT and 3080Ti that are 4 years old now still do great.
@@MrMeanh Yeah curious to see AMD's architectural RT improvements. Know what you mean about the new features, depending on what they are it can take time for them to become common place in new games.
I am particularly interested in FSR4 and its potential compatibility with RDNA3 GPUs, such as my recently purchased 7900XTX, which was very expensive for me.
We need more comprehensive RT videos. Keep making these exact comparisons.
There will be more in the next part. This is only 4 out of 12. We have Cyberpunk, Control and The Ascent to name a few
@@TerraWare I am very bearish on RDNA 4's ray tracing. I hope they give 4070-level RT in the $250 GPU class, like Intel did with Battlemage.
@@bmqww223 Way too ambitious, not happening at that price range.
Had this flavor of 7900xtx for about a year now. No plans of upgrading any time soon, this thing is a beast. Haven't converted to the 4K display religion yet, either. 2 x 1440p displays working well for me.
Great video idea to prep for CES!
I wish you would show a quick chart after each comparison to help us keep track though! Would be very helpful!
Pretty good idea. I need to look at charts and how to make them in a way that makes sense.
this channel is so underrated. your testing is really useful
Thanks, I appreciate it. Always looking to make it better
Just want to say that I really like small(er) channels like yours because I feel like they have such a cozy atmosphere. Just a bunch of people talking to each other, because they share the same hobby and are interested in tech, analyses, GPUs, games, graphics, new rendering tech, etc. It's like being with a bunch of friends geeking about graphics, while drinking tea. 😃
That's exactly my aim. Share my enthusiasm with other like minded people, people new to the hobby. It's a good way to learn more too.
Can't wait for FSR4.
THX for the great video! Subscribed! 🤗
Thanks for the sub!
You can use AFMF2 from Adrenaline driver on any game.
I have the exact same AMD GPU. Other than the RGB dying on it, the card still performs excellently.
Oh that sucks. The RGB looks real nice on this card but not that big of a deal.
@@sabermajora408 Same problem: one day I went to the bathroom and when I came back only half of the LED strip was on, then it died after a reboot.
@@N7-Ikari yep exactly what happened to me
@TerraWare ya its a shame. I ended up turning all the rgb off in my system lol
Amd sucks
Great job as usual brother 👍
That avatar lol.
We need more ray traced flashlights! (or muzzle flashes for that matter!) 😄 Great video as usual 😌
Yes. Would be awesome if they'd done an RT flashlight in Silent Hill 2. The shadows from the flashlight can look really out of place in that game.
Excellent comparison and perfect timing! Can’t wait to see what’s revealed at CES
Running a 7900 XT with a 12400f (not upgraded it yet), playing on an ultrawide 1440p monitor and Indiana Jones is hitting between 80-130 fps in ultra. Pretty good for an Nvidia sponsored game that forces RT
Well... only RT illumination; you can't activate full RT with an AMD card in Indiana Jones.
@ wasn’t aware nvidia came with extra settings! The game looks pretty incredible with amd and ultra settings I must say
@@TheGlobuleReturns Yes, and the restriction is not just that it has to be Nvidia... it has to be a 12gb vram Nvidia card... xDD
But hey, as you say, the game looks great anyway and most importantly, it's a very good game.
Too bad AMD FAILED and didn't even show a GPU. (7900XTX owner here)
I don't know about failed but that was insane man. I was expecting a reveal like with RX 6000/7000 but it was like they forgot about the DGPU guys watching. The media brief had more to say. I'm surprised.
7:02 a friendly correction. Dragon Age Veilguard and Jedi Survivor are not on the same engine. Veilguard is on Frostbite and Survivor is UE5
Was just about to comment that. Also, Jedi Survivor is UE4 AFAIK.
LOL. You're totally right and I knew that, have made so many videos on Jedi Survivor and it will be on Part 2 of this. Must've had a brain stutter.
A further friendly correction: who the fk plays Dragon Age Veilguard? Why is it even considered? If I go a bit extreme, both of these games. There are better games out there. So many of them.
Well, they need the equivalent of DLSS and Frame Gen for RT to be playable. One does not simply do RT and expect playable FPS.
The 7900 XTX Sapphire Nitro is such an awesome GPU, I've had no problems with it.
It's an obsolete hot garbage 💀
@@OmnianMIU What GPU do you have?
Honestly, my thoughts on RT are that it's a mixed bag at this point. Some devs like Insomniac, Capcom, and Ubisoft Massive are doing great work, transforming their games with RT reflections and RTGI while still maintaining very high FPS. At the same time, most of the other RT effects feel like bare-minimum plug-ins that simply exist for the sake of a sponsorship. Take RT shadows and RT AO in Ratchet and Clank: Rift Apart, or RTXDI in Star Wars Outlaws, where they barely do anything but cost a metric ton of performance.
As for path tracing with UE5 and Black Myth Wukong, that is a straight-up dumpster fire. All they are doing is making Epic look like a god-tier optimizer, where the game looks the exact same but runs at 1/4 the frame rate. It also feels like they are manufacturing problems, like flickery shadows in Black Myth Wukong, to sell a solution. Like, seriously? How are shadows flickery in Alan Wake 2 and Black Myth Wukong but not in a game like Breath of the Wild, which runs on a 10-year-old tablet SoC?
I've recently seen a slightly more detailed path tracing vs ray tracing vs no-RT comparison.
Well, maybe not much more detailed, but better locations in my opinion.
PT had ZERO of the problems you mentioned, aside from lower performance of course.
I actually sold my XTX, though I don't know if I sold it too early due to potential tariffs happening. But yeah, time for the 5080/5090 this gen after going AMD high end for 2 gens (6900 XT, 7900 XTX). When AMD targets high end again I'll give it a try :)
Well that was a strange presentation. They showed nothing about RDNA 4. Their media brief had more to say. Weird. Strix Halo on the other hand was the coolest product they showed. 16 Zen 5 cores with 40CU's of RDNA 3.5. That's a heck of an APU
@TerraWare Yeah, what a weird presentation. Apparently it was just a preview? Also apparently FSR is 9070 series specific?? I was left more confused watching that, seriously lmao.
AMD is cooking with APUs, we won't see it on AM5 probably but someone like minisforum will make a slim mini PC with it. It should be more than enough for a lot of people.
@@Accuaro Check out Hardware Unboxed's latest video. They asked AMD questions he touches on them and it leaves me with more questions than answers btw.
Apparently they weren't given enough time by the CES event organizers, so they couldn't show RDNA 4. It's BS, or their priorities are backwards. You'd think that'd be one of their main priorities, considering it's what most people watching wanted to see.
They also gave Matt Booty a bunch of time phoning it in from Seattle saying practically nothing and the Dell guy talking about nothing.
@@TerraWare Wow.. that's ridiculous. Honestly wtf..
The 3080Ti is doing really well. I haven't had many issues with 4K RT on my 7900XTX, but I don't own many path tracing games, so normal RT games run well, especially with FSR FG taking you into high-refresh territory with some upscaling. In this video I think Wukong and SH2 are both on really old versions of UE5, so if they ever get ported to newer versions they should run better on both AMD and Nvidia cards. There is also software RT these days, but with Nvidia's marketing we are moving into it being 'Full RT' or nothing now, I guess.
I think having Full RT/PT as an additional option like in Indiana Jones is perfectly fine; the XTX does really well in that game, even though there's RTGI by default. I think regular RT will become more and more common, though, which is fine. Even the 6800XT does pretty well in Indiana Jones, for example, even at 4K, and that's without upscaling, just some tweaks. No biggie.
@@TerraWare I meant as in games having RT by default like Indiana Jones, but because it's not an option in the settings, a lot of people think that game doesn't have RT unless you turn on 'Full RT', so that is another masterclass by Nvidia marketing. Full RT is not even an option on AMD so people will think AMD cards can't do ray tracing, and I expect more games will come out with 'Full RT' supporting only certain nvidia cards. Full RT is a lot easier to understand than Path Tracing, and when other non Nvidia cards can do normal ray tracing, it's an easier way to differentiate what Nvidia cards can do and what other cards can't do.
@@TerraWare can you give me optimized settings for Indiana Jones with path tracing? I have a 4090, and with PT I'm getting sluggish performance.
"full ray tracing" is the term I'm looking for at AMD CES. If they don't use that term, then most likely path tracing is not a priority and they might just wait for udna era. Which I don't think most people even care about.
However, moving to hardware for their RT should help and going full ai should also help. Fsr3 FG already produces a significant amount of FPS compared to the competition, full ai should reduce any artifacts whilst still maintaining a significant FPS boost, maybe produces even more FPS.
Even though AMD is just targeting the low and midrange, they have to produce a compelling showcase or I think a lot of their consumers will jump ship. No point in waiting for UDNA to show any ambition.
@@XDDX_NZ Ah I see what you mean. It makes sense. Nvidia are masters when it comes to marketing. Unless you're in the know you probably don't know Indiana Jones uses RTGI regardless.
Black Myth Wukong is such a badly optimized game on Radeon. I can't help but think it's intentional. One thing that really bothered me: the game ran relatively well on my XTX, running at around 90 average all the way up to the "hand face" boss in chapter 6, after which all 24 GB of VRAM fills up and the game chugs to a stuttery mess, eventually crashing. The whole area after that, flying around on the cloud, is a whole different game with regard to performance and after a short time, I end up with the same issues. Does anyone else have this issue? Is it the fact I'm on Linux, or do other XTX users on Windows also have this issue?
Kinda reminds me of HairWorks, when they maxed out tessellation and crippled AMD cards.
AMD just doesn't have the necessary hardware to do ray tracing at the level of Nvidia. Nothing to do with optimization in this case. You can always play with just software lumen.
This isn't intentional. RDNA3 cards, and even RTX 30 cards, simply do not have the technology to run PT at reasonable speeds. My 4080S is over 2x faster in this game even compared to an RTX 3090Ti, and usually both cards are close. With medium PT at 1440p I saw only a 3% relative difference compared to Lumen (123fps with medium PT vs 127fps with Lumen in the built-in benchmark, at 1440p DLSS Q + FG, cinematic settings), so IMO PT in this game is very well optimized.
@@AlexFoxRO Oh, I'm not using RT or PT, other than what's already there by default. It's just bad optimization to have the game run fine for the first 80% and then require a different card for the last 20%. Like, did the devs just give up?
I still can't see this upcoming gen competing even with the 40 series in ray tracing, definitely not path tracing, let alone ray reconstruction, so it's not just the raw numbers. Unless AMD pulls a rabbit out of the hat with some brand-new dedicated ray tracing cores, which is always welcome from them. But they are just so far behind Nvidia and their tensor cores at the moment. Let's see how these 2025 cards play out.
As long as they keep up on RT it should be fine. I don't think they'll be able to catch up on PT anytime soon but we'll see what's up soon enough anyway.
Great video. I feel like too many people just write off the 7900xtx's RT performance. It's not high end Nvidia level, but it's still as good as a 4070 ti in most games. I've had a ton of luck with AFMF in ray tracing titles for 120+ fps with full RT. Path tracing always seems to break afmf though
Path tracing is in a weird state right now anyways. Path tracing just looks completely different from RT, so it's going to produce a naturally much darker image. Developers don't waste a lot of time on a graphics setting 98% of players won't touch so they don't really add additional lighting to make the game more playable. It has a ton of potential but devs need to tailor the game around it a bit.
Fmf sucks
Please, whenever RT is off, keep the metrics on so we can see how big the performance hit is. Thank you.
7900XTX is a monster.
Just look at that juicy 24GB of VRAM; touch it and feel it. So pleasant to have.
How many games will benefit from the 24GB of VRAM? PT games with FG can use more than 16GB, but the RX7900XTX cannot run PT well anyway.
@@PabloB888 AI. And in case of games - it's always like that:
Nvidia releases cards with a limited amount of memory (without reserve, now it's 8GB), and fanboys say:
_"Why do you need more? You don't need more! Cock-a-doodle-doo!"_
@@-mrws- 8GB VRAM is certainly not good enough anymore, because most games use 9-12GB VRAM based on what I have tested.
I mean, to be honest, at 1440p I never exceeded 16GB of VRAM. Wait, I did once: the game was Avatar. But that's the game that pushes VRAM so high, and Nvidia's limit on VRAM is the reason most games use just 16GB nowadays.
But even at 1440p the VRAM usage is 14-16GB, so no one can tell me that 12GB is more than enough for 1440p in 2025, and who knows what surprises the future holds (Indiana Jones is very demanding if you use max textures; even the RTX 4080 starts to cry).
@allxtend4005 At 1440p my 4080S has enough VRAM to run Indiana Jones at native with maxed-out settings and PT. At 4K, however, my card is VRAM limited, but even if I had 24GB VRAM the game would still run like crap, because the RTX 4080S is too slow to run PT games at 4K native with playable fps. With DLSS and lower texture streaming settings the game runs smooth, though, and I haven't noticed any degradation in texture quality, so I can't say 16GB VRAM limits my experience. I would be worried if I had 12GB, but 16GB is still plenty. The RX 7900XTX has 24GB, but you will not even run PT anyway. IMO only 8-10GB VRAM cards can limit the gaming experience at the moment; anything above that is still fine and will be for the next few years, because we will still be playing PS5 ports.
good video😘
Wukong has a sharpness slider, and PT medium is about reflections in bodies of water
Have you tried Frame Generation on extreme settings?
epic thumbnail
This is kind of weird; the 7800XT is close to the 4070 in RT, and this isn't far off that performance.
8:05 definitely lower textures on the 3080 Ti to High for smoother overall gameplay
really rooting for team red this generation. nvidia needs a competitor badly.
Given that I am a fan of RT (only reflections, really, for the time being), and that unlike some other channels you tested RT in ALL games just as the title said (no clickbait), you earned yourself a so-called subscription (in reality it's just free-of-charge following lol).
Lol. Thanks I appreciate it.
RT and sales prices in Australia at the time were big reasons I went 4080 over this card, 2 years ago.
I have a build coming up for my son, though, and I'm revisiting it all again to get ideas on what's best for his build.
Looking forward to RDNA4, but I'm wondering if this is actually the card to get if I wanna go AMD for this build.
Not sure I'll go near Nvidia this upcoming gen yet. I'll wait and see though
Yeah probably wait if you can. AMD seems to have forgotten to show their GPU's in their presentation. That's one of the strangest things I've seen in some time when it comes to presentations.
The 7000 series handles RT in the majority of games better than many think, but the problem is that the more complex the RT is, and so the bigger the visual difference it makes, the more fps Radeons lose compared to Nvidia. Combine that with the lack of DLSS, when FSR imo looks bad to the point I'd rather not use it and lower settings instead. It obviously leads to going native with RT off. Btw, that's how some "preferences" are forced by reality.
Before I start watching the video.
I play Cyberpunk 2077 and Dragon Age Veilguard with RX6600 with Raytracing.
Reflections in CP2077 and Selective RT mode in Veilguard.
CP2077 at 1200p with FSR 3 Quality and Frame Generation, plus AMD Fluid Motion Frames 2 activated in the AMD software called Adrenalin to mitigate possible FrameGen latency issues as much as possible, and AMD Super Resolution set to upscale 1200p to 2160p.
Playing at a mixture of Medium,High and Ultra settings.
Not the greatest picture quality, but it improved once I turned Super Resolution ON.
Framerates?
I have no clue.
I play without onscreen stats so I can focus on the game.
IF something bothers me too much I modify settings a bit.
Perhaps I should turn off RT reflections during the day in CP2077 and increase overall settings and/or resolution.
RT reflections are most noticeable at night, on reflective, usually wet, surfaces.
They greatly increase the immersion, in my opinion.
So I am certain I would be quite pleased with 7900XTX performance.
You're doing it right. Many people feel they must max out every setting to be able to enjoy a game. I've been trying to break that mindset with some videos I make.
7900XTX owners always say they have no interest in RT. I can understand why people weren't interested in RT 7 years ago (in 2018 we only had like 3-5 games with RT support), but in 2025 we already have over 100 games with RT support, and almost every week we see new games with it. You just can't ignore RT anymore unless you really don't care about graphics at all. Games often look much better with RT, and on RTX 40 series cards you can run RT games quite well. I have never played a single RT or even PT game where my performance dropped to unplayable levels. Quite a few RT games from my library run at over 60fps even at 4K native (RE3 Remake in particular runs at 130-200fps), and with DLSS + FG on top of that, almost every RT game runs at 120fps, with the exception of PT. PT games require lowering DLSS to Performance (but that's fine, because with the latest 3.8.1 DLSS and my ReShade settings I still get a 4K-like image) to get a smooth 80-100fps. PT is extremely demanding, but in Black Myth Wukong I saw a situation where PT was quite cheap: I get 123fps with medium PT and 127fps with Lumen (1440p DLSS Quality + FG, cinematic settings), which is just a 3% relative difference.
RT is becoming something that's in newer games whether you care about it or not. Radeon does pretty good in those games, like Indiana Jones maxed out settings or UE5 games. Multiple RT effects or more complex RT can be a bit more taxing on Radeon, especially PT which takes a big hit to performance in relative to Nvidia counterpart.
Pleased you did this video because, as I've been saying, the XTX I tried did very well in Dying Light 2 too.
In Veilguard it sometimes takes a nosedive with reflections, but nothing major in the end, I found with more testing.
The 4080 Super did a bit better in Veilguard, but not by much, so that was good.
Also, if you use Fade-Touched textures, that definitely eats up a lot of VRAM; my 4080 Super at times got to the brink of its max 16GB usage, but I found as long as I don't use FG it's mostly ok.
Anyway, lots of ppl just assume the 7900XTX is bad with RT full stop, but that's not the case in my experience.
Lastly yeah can't wait for CES, still deciding what CPU to get
Oh yeah forgot to mention, motion blur with FSR causes that ghosting around the head of the character, found that out last week
Yeah it's very capable. What CPU do you have now? I think you told me before but I forget.
@@TerraWare 7950X3D
Once I get the new one however i'll still get decent money for that
@@TerraWare & tbh the price for the 9800X3D has gone up where I live in the UK, to £529.99.
So I may as well just get the 9950X3D, which is what I mainly wanted in the first place as I prefer more cores, and I think the 9950X3D will be priced at no more than £599.99.
The price was £449.99 at first for the 9800X3D, but I waited lol.
The other thing is I did see that the 9950X3D might be 170W TDP, but I'm waiting for confirmation.
The 7950X3D uses 120W max, like the 7800X3D, so idk if that will be true.
@@RJTHEGAME I think you're fine with the 7950X3D. I imagine the 9950X3D will probably be quite pricey.
@@TerraWare Oh yeah certainly does the job the 7950X3D
But I want even better 1% lows and better FPS in the CPU limited games and the upcoming ones since Devs are getting worse lol
I use RT shadows in MSFS 2024 with no impact, it seems to me.
Yes I have been playing FS24 too. It runs pretty good.
I've had a 3080Ti since it came out, and for the last year I was always looking for a new card, but the Nvidia 4080 and 4090 are way too expensive and the AMD 79xx cards are not good enough to upgrade to, so I totally skipped this Nvidia and AMD generation. I have no idea if the AMD 9070 XT or the Nvidia 5070 Ti will be good enough to make the upgrade worth the money. If I don't get at least 30% more performance, I will not upgrade. I can still get by with DLSS and tweaking graphics settings from Ultra to High.
3080Ti is still a very capable GPU. 30% minimum uplift for upgrade is pretty good. I don't think the 9070XT will be that unfortunately.
Maybe the 5070Ti. I think that's supposed to be 16GB vram as well.
Running out of VRAM sometimes manifests as depressed performance without stuttering. Depends on the game; that's why Dragon Age runs much slower without stuttering.
You could be right. I wish I'd spent more time on it out of curiosity.
@@TerraWare Great video btw! Much appreciated! Didn't expect the 7900XTX to do this well.
@@Robspassion It's a pretty good card. A bit power hungry but so is the 3080Ti lol and especially the variants of both that I own are factory overclocked.
What are the benefits for you of playing games on a monitor and not a TV?
TV is way too big for a desktop setup imo. 40" would probably be the highest I'd go but TV's these days are perfect for gaming so it comes down to preference.
Why not test with Cyberpunk and Indiana Jones? These 2 games are the best for testing proper RT (i.e. path tracing)
They will be in part 2 of this video and others. Had to cut it short because the video got way too long after saying what I wanted to about these 4 games so far.
Not gonna lie, near the start when you said "Oh look at that, loads of difference. Totally different" between RT on and off, I thought you were being sarcastic. It looks the damn same.
I'm waiting for CES. I want to swap my RTX 3060 Ti G6X for a 9070 XT (or non-XT), or possibly a 7900 XT within my budget. The RTX 4070S has one problem for me: 12GB VRAM, and yeah, it's also a slower card. A Ryzen 5600X shouldn't be that big a problem at 1440p.
We need the new cards 😢
RT in Veilguard is such a tiny, tiny visual improvement for a wild performance hit. I took a screenshot in one instance where with RT OFF I was getting 160fps and with RT ON I was getting 57. 103fps for an unnoticeable difference. AND I LOVE RT. It's just really bad in DAV.
RT in Veilguard is kind of unusual because it's extremely CPU heavy. Just a guess but you could actually be CPU bound with a massive drop like that. I did a thorough video looking at the CPU cost in this game. Here's a link if interested with timestamp to CPU bottleneck section. th-cam.com/video/_8rtUlTQdRw/w-d-xo.htmlsi=m-Go4cwjt-mMBjao&t=181
@TerraWare Maybe. I found the only really CPU-intensive thing to be the strand hair. At least for me, my CPU usage didn't change a whole lot whether using RT or not. My point is there's such a minimal difference visually with RT in that particular game for such a wild performance hit. In some of the areas where I took screenshots there is literally no difference in the AO, shadows, or diffusion at all, but it still resulted in nearly an 80fps difference! The only place RT really makes any bit of a difference is where it's adding to the SSR, from what I could tell anyway.
My guess is that they used baked lighting properly, so when RT is enabled the difference is small. In AW2, for example, they needed to go out and deliberately make conventional shadows and reflections bad to make RT stand out even more.
1:29 where did all the shadows go? This looks like a 2007 game 💀💀
I'm sorry, but that flashlight still doesn't look anything close to accurate, so I'm really not sure where the value is. Yeah, the light bounces around, but it still looks like a video game, just a different-looking video game.
And why not use AFMF? Nearly double the FPS, so basically way more fps, and yes, ms gets reduced and it doesn't feel worse in responsiveness. But okay... we do the comparison thing where we use DLSS+FG and FSR without AFMF and call it a day?
Dying Light 2 is very demanding with ray tracing, but as I can say, the RX 7900XTX can even handle Cyberpunk 2077 at 1440p with path tracing and maxed-out settings at around 60 fps with XeSS (or whatever Intel's upscaler is called), or 80-100 FPS with FSR 3.0 (why not 3.1? But okay, CD Projekt doing their thing).
With AFMF enabled of course, and look at that: good FPS compared to Nvidia somehow. Driver issues? None at all.
When we take a look at Avatar or any other game that implemented FSR 3.1 well, we see that FSR 3.1 is not that far from DLSS 3.5.
DLSS is still a little better, but the main reason is that DLSS is implemented much better than FSR in most games. Meanwhile, older RTX cards use FSR nowadays and not DLSS to achieve their goal of better FPS, because they are locked out of the newest DLSS features.
I tested Black Myth Wukong at 1440p cinematic settings and it runs at 100-120 fps avg, or 180-240 with AFMF. The RTX 4070 Ti Super achieves around 80 fps with the same settings using DLSS and FG.
Now I wonder why the RTX 4070 Ti Super costs as much as, or even a little more than, the RX 7900XTX but is so far behind, even in ray tracing. The only thing that card wins at is when games have very bad FSR support. And let's note that Nvidia's FG adds so much ms that it really feels bad; that's what I noticed myself, and the RTX 4070 Ti Super was tested with the exact same specs as the RX 7900XTX.
I will believe AMD has finally caught up when I see it. I don't believe they will.
Nvidia are massively invested in RT. I don't know about catching up to 50 series, especially when you take into account RR, dedicated cores that are in their 4th gen now but as long as they make good generational improvements they can deliver a pretty solid product.
RDNA 4 has a dead-end architecture feeling, just like Vega stopped in its tracks while they went off-road with another architecture, Polaris, then renamed it to RDNA (all RDNA has been Navi architecture since). Arcturus was supposed to be the successor, but instead AMD threw a wrench at the project because Lisa Su didn't want that architecture in any gamer peasant's hands; it's too expensive for budget users. That's what got Raja Koduri angry about it. They're not doing what Nvidia has been doing, giving the expensive architecture to everyone, because it comes down to the bottom line for Lisa Su. It's not gonna happen, since she wants more than $2k per card on the Radeon Instinct brand; that's why gamers never get nice things from AMD.
Nvidia uses 1-2 gb less VRAM.
Quick reply = it sucks, but the card is GREAT. The best raster right after the 4090, but much cheaper. I have it and enjoy it, knowing that most gaming ray tracing is currently done with ReShade RTGI. Yes, MOST of RT is currently NOT hardware but software. And there, only raster power matters!
I'm still using an XFX RX 6800 XT and sometimes there's a game or two that perform well with RT and they're usually the ones that are built with it in mind. Metro Exodus Definitive Edition, Avatar Frontiers of Pandora, Indiana Jones, etc.
Regardless, the RX 7000 series really disappointed me. It's why I just skipped it and stuck with my 6800 XT, but I was still really underwhelmed that the RT performance is still a generation behind Nvidia. Turning it on suddenly makes your new, more expensive GPU unable to keep pace with last-gen products; it's not a good look. I thought that would change with the RX 7000 series, but it was exactly the same.
I hear ya. If you got a 6800XT which I have owned one since launch, there's no reason to upgrade to the 7800XT, they are very similar. Only card that made sense to upgrade to from a 6800XT with AMD was 7900XTX imo.
7900XT launch price was a joke. Later dropped to $700, was much better.
That's why Black Myth lost at the game awards... it didn't pass the AMD experience first... and the majority of console gamers hate the game
I think it's the most overrated game in the world. All the Chinese propaganda, I would say, worked great. It's nothing special: it's frustratingly difficult, has no difficulty setting, and has a horrible checkpoint system. It has no autosave feature, so you have to, for example, travel through the same area and fight the same enemies over and over again, even when you fight bosses, which is all the time... there are like 80+ bosses. Hitboxes are wrong, combat also feels wrong, the sound design is average, the story is nonsensical, and the writing assumes you understand Journey to the West by default. It's just not as great as presented; that's why I think it lost to Astro Bot, which is just pure accessible fun 🤷♂
Check out proper critiques of the game and not botted Steam reviews... there is no freaking way the 97% score is representative of this game. Chinese players were told to give it a positive rating, or they just felt like doing so because it's the first AAA Chinese release in ages, but the rating has no bearing in reality.
Aren't we already going for PATH tracing then???
Ray tracing is at its END and has been for a long time now.
What's the point anymore in giving newer cards ancient ray tracing INSTEAD of the newer, way BETTER AND FASTER path tracing??
Supposedly the new RDNA still loses to this.
According to rumors, yeah, and in rasterization it's probably true. I'm hoping RT will be more capable. Time will tell.
But wait!!! Nvidia fanboys said AMD is terrible, can't do RT at all!
This video must be a fake scam ^^ not real, made up ^^
🤣😂
Fanboys on both sides overly exaggerate things. It's how that mentality works.
I beg to differ. Veilguard is a great Dragon Age and a great game overall. It also runs flawlessly for the way it looks.
Bad. OK done.
(own one)
A 3080ti matches a 4070 Super in performance. The fact that a 7900xtx only matches a 3080ti tells you how horrible AMD GPUs are at ray tracing.
We knew that 2 years ago...
It's around a 4070 Super to Ti, especially mine, which has a 450W BIOS and overclocks to 2.1GHz, and the XTX can also outperform them in some cases with RT. It's not bad actually. Now in PT or full RT, yeah, Radeon will take a bigger hit than those cards, as I showed here in BMW.
Better deal than the 5000 cards. I also don't hear one person who really cares about ray tracing.
Really? Many people care for RT. RT comparisons are some of my most popular vids
The majority cares; it's just that the minority speaks louder than your side.
Mehhh... Ray tracing is still a very niche option that doesn't factor into my GPU buying practices. Games run JUST FINE without RT, plus what percentage of games have RT? 1%? 2%? 5%?? ...just not enough for me to worry about it.
It's a feature all GPUs support and new engines like UE5 use it by default anyway.
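To illustrate the "on by default" point: in UE5 the default dynamic GI and reflection methods are Lumen, which is ray-tracing based (software RT, optionally hardware RT). A minimal sketch of the relevant project settings, assuming a stock UE5 project's `Config/DefaultEngine.ini`:

```ini
; Sketch of UE5 renderer settings (values are the engine defaults for new projects)
[/Script/Engine.RendererSettings]
r.DynamicGlobalIlluminationMethod=1   ; 1 = Lumen GI (RT-based)
r.ReflectionMethod=1                  ; 1 = Lumen reflections
r.Lumen.HardwareRayTracing=0          ; 0 = software RT; 1 opts into HW RT where supported
```

So on a new UE5 project a developer has to actively opt out of RT-based lighting rather than opt in, which is why it increasingly shows up regardless of whether a game advertises "ray tracing" on the box.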
You have 69 likes at the moment
Giggity
Not good I assume? Aaaand I'm out.
Thanks for the comment anyway lol
Nvidia shill video, it's obvious
No, he's an AMD user speaking the facts to us, the reality that AMD has been lying to us since the RDNA3 release.
Too much stigma around raytracing, it's obvious by now the best looking games have a lot of RT and new cards should be able to handle it. Perhaps not on max settings, but still take advantage of it. Currently playing Space Marine 2 and while the game looks alright it also looks quite dated, even with the 90GB texture pack requiring a 20GB card. Textures can no longer make up for realtime lighting, reflections and shadows.
RT can be quite transformative, especially a good RTGI
Yes, but it's still very expensive to buy a GPU that can run games with ray tracing. The performance hit is still too extreme.
agreed 👍
Agreed, games look really dated without RT; shadow & especially light appear where they shouldn't, like lips, nostrils & eyelids, where you know someone's face should be in full shadow. For those having problems with the performance loss, there's DLSS to boost FPS & frame gen to smooth out any stuttering. It's not perfect yet, but Nvidia is the best in the business so far.
After experiencing path tracing in Cyberpunk, I can't go back to playing without RT.
@@mmadevgame Usable RT tech will definitely trickle down to lower-end hardware; it might not be as fast as we want, but it will eventually. If I remember correctly, it was the same deal with shadow settings in games: around a decade ago only the most powerful GPUs could render soft shadows well, but now it's almost a non-issue even for low-tier modern cards.
Ray tracing is more of a gimmick than something usable!! Polished floors in a rainy game aren't realistic either!! Where in reality can you see that?