I went from an Asus ROG Strix 3080 12GB (faster than a 3080 Ti Founders) to an Asus TUF 6950XT and gained 100fps in the built-in MW2 benchmark! Similar gains were noticed in multiplayer and Warzone (though WZ is severely CPU bottlenecked). It was like a totally different class of GPU while being 200 euros cheaper! After years of being fully tuned on Intel and Nvidia, I decided to go AMD on both the CPU and GPU side since I only care about COD and Warzone. From what I saw, the 7900XTX beats the 4090 in Warzone while costing half the price! I know it's a hard ask, but it would be very interesting if you could compare the 7900XTX vs the 4090! Even matching the performance at half the price makes it a great value option!
Yup. I essentially already have results from people.
The fastest 7900xtx I've ever seen scored 422 avg fps in wz2 benchmark. An average 7900xtx can do 390-400.
My 4090 does 375 best case in the benchmark. The fastest 4090 I've ever seen is like 400FPS in the benchmark (full custom loop with liquid metal on that 4090).
So yes, in theory the 7900xtx will be 5-10% faster than a 4090, assuming both are overclocked.
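For anyone who wants to sanity-check that 5-10% claim, here is the simple arithmetic behind it, a minimal sketch using only the anecdotal benchmark averages quoted in the comment above (they are community numbers, not controlled measurements):

```python
# Rough relative-performance math from the benchmark numbers quoted above.
# These are anecdotal community results, not controlled measurements.

def pct_faster(a: float, b: float) -> float:
    """How much faster 'a' is than 'b', in percent."""
    return (a / b - 1.0) * 100.0

fastest_7900xtx = 422   # avg fps, best 7900 XTX seen
typical_7900xtx = 395   # midpoint of the quoted 390-400 range
this_4090       = 375   # best case quoted for the author's 4090
fastest_4090    = 400   # best 4090 seen (custom loop, liquid metal)

print(f"Typical 7900 XTX vs this 4090:    {pct_faster(typical_7900xtx, this_4090):.1f}% faster")
print(f"Fastest 7900 XTX vs fastest 4090: {pct_faster(fastest_7900xtx, fastest_4090):.1f}% faster")
# Both land at roughly 5%, the low end of the 5-10% band mentioned above.
```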
Ah, but MW2 heavily favours Radeon. It is the only game where the 7900XTX beats the 4090. You will not get so lucky in other shooters. The real battle will be next gen RDNA4 vs Blackwell.
@@louisfriend9323 yup, that's true.
@@louisfriend9323 Like I said, I only care about COD! That is why I went with AMD after years on Intel.
7900xtx
13900k
6900mhz ram
1080p low settings
Cod benchmark 390fps
CPU BOTTLENECK 58%
GPU BOTTLENECK 42%
Just snagged a 6950xt for $470 from an Amazon sale
I think what was impressive about Nvidia this generation was how they closed the seemingly insurmountable gap the 6950 XT had in Warzone. The dip was fixed around the launch of MW II. Still crazy good though, especially at $600.
Yup, to be fair the 4090 closes the gap in Warzone 2 against AMD with literally just raw power. XD It is insanely overspecced, allowing it to keep up despite not having Infinity Cache like AMD GPUs.
Warzone 1 actually still stutters for me on the 6950xt, but nobody plays that game full time anymore, so it's not particularly important. The price tag is what makes the 6950xt/6900xt so good... 16GB of VRAM and a top Warzone performer for the price of a 3080 12GB is absolutely awesome.
@@Beefy-Tech The last 2 AMD cards I've owned that have had zero issues were the 6950 XT PC Red Devil and 6800 XT ASRock Phantom G. I've had nothing but driver and encoding issues with the RX 7000 series cards. My performance is still way lower than yours on the 4090 though, might have to fresh install Windows 11 just to see if that's a factor.
I remember you mentioned Windows 10 being a bit faster but I’ve been on 11 since ‘21
@@FIVESTRZ Windows 10 runs faster specifically with AMD 7000 CPUs. I don't know exactly why that is, but pretty much everyone that had weird performance issues with their 7000X3D CPUs switched to Windows 10 and everything was resolved for them.
Try also using the Microsoft Uninstaller Tool to remove all of your AMD chipset drivers. I tested FPS with them installed vs uninstalled, and having the chipset drivers fully uninstalled consistently gave me about 20 more FPS in benchmarks, using just Prefer Cache in the BIOS + Process Lasso.
(THIS IS FOR COD ONLY, haven't tested other games)
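If anyone wants to replicate that installed-vs-uninstalled chipset driver comparison, the honest way is to average several benchmark runs per configuration instead of trusting one run. A minimal sketch of that bookkeeping, where the run numbers are placeholders you would replace with your own benchmark results:

```python
# Minimal A/B helper: average several in-game benchmark runs per config
# and report the delta. The fps values below are placeholders, not data.
from statistics import mean, stdev

runs_with_chipset_drivers    = [471, 468, 474, 470, 469]  # avg fps per run
runs_without_chipset_drivers = [489, 492, 488, 491, 490]

def summarize(name, runs):
    print(f"{name}: mean {mean(runs):.1f} fps, stdev {stdev(runs):.1f}")
    return mean(runs)

a = summarize("With chipset drivers   ", runs_with_chipset_drivers)
b = summarize("Without chipset drivers", runs_without_chipset_drivers)
print(f"Delta: {b - a:+.1f} fps ({(b / a - 1) * 100:+.1f}%)")
```

Five or so runs per config is usually enough to tell a real ~20 fps shift apart from normal run-to-run noise.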
@@Beefy-Tech good call out at the end there. Might give it a go then go back to old school settings without all the game mode jazz. Gonna stay on 11 and eat the performance loss though. OS is too damn pretty lol
i literally just ordered a 6950xt over the 7800xt
me too i was confused between the two, then decided to get xfx amd radeon rx 6950 xt merc black
@@mhocine47 The 6950xt is faster than the 7800xt; it should compare more similarly to the 7900 GRE.
@@imeanyeah1652
I'm about to buy one of the two, either the 6950xt or the 7800xt; they are both 500€. Everyone says the 6950xt is better but consumes more. I don't care about ray tracing, power consumption or new AI technologies that I probably won't use, and I will probably upgrade to a 750W PSU. Which one should I get in your opinion?
@@brucefly3612 While I wouldn't recommend it, I run the RX 6950xt on a 650 watt power supply and it is completely fine. It really depends on the power consumption of the rest of your PC whether you need to upgrade or not. I would personally recommend the 6950xt at the same price, as it is simply faster than the 7800xt and still has access to all current AMD driver/game features such as frame gen and AFMF 2. If power consumption matters a lot to you then the 7800xt is more efficient, but keep in mind that both can be undervolted as well.
@@brucefly3612 You can run the 6950xt on a 600W power supply as long as you don't have a power-hungry CPU and undervolt it.
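On the PSU question above, the usual back-of-the-envelope check is to add up worst-case component draw and leave headroom for transient spikes. A rough sketch with ballpark wattages (these are assumptions for illustration, not measured values for any specific card or CPU; check your own card's rated board power):

```python
# Rough PSU headroom check. Wattages are ballpark assumptions; check your
# own GPU's rated board power and your CPU's actual package power.
components_w = {
    "RX 6950 XT (board power)": 335,
    "CPU (e.g. 5800X3D-class)": 120,
    "Motherboard + RAM + SSDs": 60,
    "Fans / pump / USB":        30,
}
psu_rating_w   = 650
headroom_ratio = 0.80   # aim to stay under ~80% of the PSU's rating

total = sum(components_w.values())
print(f"Estimated sustained draw: {total} W")
print(f"Budget at {headroom_ratio:.0%} of a {psu_rating_w} W PSU: {psu_rating_w * headroom_ratio:.0f} W")
print("OK" if total <= psu_rating_w * headroom_ratio
      else "Cutting it close - consider undervolting or a bigger PSU")
```

With these example numbers a 650W unit sits right at the edge, which matches the "it works but I wouldn't recommend it, and undervolt" advice above.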
I just got a 6950 XT to give my 6800xt to my kid. I'm only getting 224 fps, up from 203 before. The 6800 XT OC'd to 2550 MHz was previously able to get 2630/2650 before the recent updates.
-But with this 6950xt I can't get it past 2650 before an "unrecoverable DirectX error", and I don't get a higher score.
-However, with the AMD stress test I can run 2800 MHz fine for 10 minutes, and Heaven benchmark too. Card is an XFX Merc. CPU is a 5800X3D with PBO2 -30.
-I've tried DDU clean drivers, scan and repair on the game, old drivers and new drivers. The game also crashed on default settings.
-I'm thinking of returning it and reordering, thinking I got a bad one. In Farm 18 I'm getting 10-15% less than you.
Remember that a lot of this can still be down to the CPU too when you are directly comparing. My 6950xt isn't particularly great for OC because I essentially can't undervolt it much at all, and it needs 20+% power draw to reach the peak boost I'm reaching in the video. But if you have the ability to return it and maybe get a better overclocking card, definitely do it, you'd have nothing to lose after all👍
@@Beefy-Tech Well, it seems to hold the OC fine for Overwatch, Heaven benchmark and other things, just not Warzone. I have to run it at default.
@@DrTopLiftDPT That's weird🤔 tbh that could be Warzone 2 related more than anything else.
@@Beefy-Tech With any OC I can't get a higher clock speed running the in-game benchmark vs default settings, when monitoring with OCCT.
Even the auto undervolt or auto OC causes MW2 to crash with a DirectX error or overclock error. It's odd this game won't take any overclock when the card will hold a sustained 2800 MHz OC for 10 min.
@@DrTopLiftDPT Def extremely weird, most 6950xt's are 2700-2800 capable, so it should also hold that in COD... it may indeed be COD being the problem.
I'm thinking of upgrading to a 6950XT from my RTX 3070, cause the prices where I live are still pretty fucked up. Used 3070s are ~400 USD, my model specifically is around ~500 USD; meanwhile, you can get a used 6950XT for ~150 USD more, at around ~650 USD, and that's the high-end models. The one I'm looking at rn is a PowerColor Red Devil RX 6950XT for 630 USD.
The Red Devil is a solid choice. It'll definitely be a noticeable upgrade as long as you have a CPU to back it up for Warzone 2.
Dude, I'm running a 4090 with a 5900x and 32GB of 3200MHz RAM, and I'm not getting but around 120 in Warzone Resurgence, 100 in Al Mazrah, and maybe like 140 in multiplayer no matter what I do or change. I can run at 4K or down to 1080p and the fps stays pretty much the same.
Sadly my friend, that's a result of the 5900x.
The CPU bottlenecks in this game are insane, and anything short of a 7800x3d/7900x3d/7950x3d won't be nearly enough for a 4090 in Warzone 2.
Every season they hit CPU performance even harder, making a stronger CPU more and more necessary as the seasons pass.
@Beefy-Tech Dang, I knew it. It seemed to be getting worse and worse; every update I get less and less performance. I was seeing over 200 fps across the game when I first got the 4090. I have a new 3080 just sitting now that I paid a stupid amount for but upgraded to the 4090 soon after, and I bet it will perform just as well as this 4090 now in MW2 and Warzone. What a waste. So I need to upgrade platforms?
@@casey22857 Yup, this game's state just keeps getting worse and worse for performance.
You could make a future-proof play and go AM5 with a 7000X3D CPU, and that'll give you 2-3 generations of CPUs to choose from.
If you buy a 7000X3D CPU today, you'll gain about 60-70% FPS in Warzone 2 over a 5000 series non-3D... so an absurd upgrade in terms of gains.
But the real value comes when you realize that on the same motherboard, and possibly even the same RAM, you can get an 8000X3D CPU, or even a 9000X3D CPU.
The beauty of X3D is that they don't need fast RAM AT ALL to run well.
Just the other day I ran stock 4800MHz DDR5 vs fully tuned 6200MHz DDR5: less than 1-2% performance difference between them overall.
It means if you buy AM5 today, you'll not need a mobo or RAM upgrade for years to come, and you'd hit that 200 FPS mark on Al Mazrah again no problem.
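To put that claimed uplift in concrete terms, here is the simple projection, a sketch that applies the 60-70% figure from the comment above to an example baseline (the baseline fps is just an illustration, not a measurement):

```python
# What a 60-70% CPU-side uplift would mean for an Al Mazrah average.
# The uplift range is the claim from the comment above; the baseline
# fps is only an example figure.
baseline_fps = 120           # example: a 4090 held back by an older CPU
uplift_range = (0.60, 0.70)  # claimed gain from a 7000X3D over non-3D 5000

low  = baseline_fps * (1 + uplift_range[0])
high = baseline_fps * (1 + uplift_range[1])
print(f"Projected average: {low:.0f}-{high:.0f} fps")  # ~192-204 fps
```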
@Beefy-Tech ok I understand guess I'll try and sell this platform and help take some of the sting out of the upgrade lol
60fps more in 1% lows is a strong argument for the 4090 (leaving value for money on the side). are u sure it was just server related? do u have a 4080 by any chance to do the same test? love ur stuff...keep it up
No guarantee, but the reason I said it's server related and not just the 6950xt is because I've seen those 1% lows on the 4090 in previous testing, but was unable to replicate it because it all depends which servers you get put in, and it's completely random.
It could just be the 6950XT at fault, but look at the Ashika Island results; the 1% lows are perfectly solid there, indicating the server on Al Mazrah was probably responsible.
@@Beefy-Tech I can confirm that yesterday on Marshlands I was getting less FPS on my 6900 XT and then the next game the FPS got up to its normal values on the same spot. If it's not the servers, it's also the strongholds and the bots that can affect your FPS.
@@reizanaliu yup. I've unironically played a full game of Al Mazrah on 4090 and I averaged 280 FPS with 140 in the lows ☠️ fast forward 2 days later, I ran around al mazrah city with a 268 average and 180 in the lows.
@@Beefy-Tech so you never know what you gonna get :D fantastic experience hahahaha
@@reizanaliu oh you can expect something for sure, and that's the fact that there'll be less performance next season😂
Just wondering, did you use SAM with the 6950xt?
Yes fam. SAM enabled, Resizable BAR enabled in BIOS, along with a GPU OC for the 6950xt.
Been searching for this vid for months
Wow. Makes me want to sell my 4090 for a 6950xt, especially if I can find one for $599. That's $1k saved for similar performance in BR. I'm also only running 240hz, so those extra frames are moot.
To be fair, the 4090 will dominate virtually anything else out there by a wide margin the second you are outside of Warzone😂
Rendering is 230% faster on the 4090 vs the 6950xt, so for content creation the 4090 is simply way better, as an example.
And in other games the 4090 is 50-90% faster (depending on whether ray tracing is on). Meaning if you ever take a break from Warzone and play single player games, the 4090 experience is much better.
The cost of that? Money, a lot of money... xD That's the one big downside of the 4090 after all.
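Since the whole thread is about price to performance, the frames-per-dollar framing makes the trade-off explicit. A sketch using rough figures mentioned in this thread (the ~$599 sale price, the roughly $1k gap, and similar BR averages); all of these are anecdotal assumptions, not benchmarks:

```python
# Frames-per-dollar comparison using rough figures from this thread:
# ~$599 for a 6950 XT on sale vs roughly $1600 for a 4090, with similar
# Warzone BR averages because of CPU limits. All numbers are anecdotal.
cards = {
    "RX 6950 XT": {"price_usd": 599,  "warzone_avg_fps": 270},
    "RTX 4090":   {"price_usd": 1600, "warzone_avg_fps": 280},
}

for name, c in cards.items():
    fpd = c["warzone_avg_fps"] / c["price_usd"]
    print(f"{name}: {fpd:.2f} fps per dollar in Warzone BR")
# Outside Warzone the 4090 pulls far ahead, so this only tells part of the story.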
Nah bruh 😂 tempting for THIS GAME ALONE, when these new broken PC ports rollout the only brute force option is the 4090
@@FIVESTRZ yup xD
Lol. I love my 6950xt. And the 4090 is silly af. But unless you need the money, I wouldn't downgrade now you have it. It's the best card, at the moment. And you'll be future-proofed for a lot longer.
If you had enough money to shell out for NGreedia's most idiotic purchase, then I'd just stick with it now you have it. Free heating and all, lol.
I'm sure I could find a game in which my 6950xt beats your 4090. Just saying. That doesn't mean mine's the more powerful card. Unless you aim for a very specific card and you want every possible frame out of it for competition or something, you should feel lucky, dude. You're at the absolute top with that power hogging fire hazard.
Phenomenal video & thanks for helping me make my decision. I currently have a 3080 + i7-13700k and mainly only play COD, so this helps save some money.
It's a better deal. But unless you intend to sell yours or something, I'd just keep that 3080. It's a great card.
@@anomonyous we’ve definitely been chillin bro I have the FE it is indeed a solid card
@@anomonyous lowkey thinking of grabbing a 7800xt that price to performance is decent
@@anomonyous went with the 7800xt so happy I let that 3080 go thank you Jesus plus I got scalped for the 3080 2yrs ago so yea I’m relieved 😌
@@dphillip19 hows the boost and what cpu u running
I have a 6950 XT Toxic and I'm not able to reach that average FPS, I am at 180 more or less... Any tips? I have overclocked a little bit more than in your video and I have PBO on my 5800x3d at -25.
This was tested on the side of the map during season 3.5. Those parts of the map are wayyy easier on the CPU, meaning performance will be higher than a lot of other parts of the map. Not only that, but in season 3.5 performance in Al Mazrah was better than in season 4, so it wouldn't be directly comparable. Not to mention the CPU I used in this test was a fully optimized 7950X3D, which means the least CPU bottleneck you can possibly get in WZ2.
@@Beefy-Tech The strange thing is that I saw a 5800x3d + 6950 Toxic reach 400 FPS in Warzone 1, but here in Warzone 2, whether I play max settings or minimum it doesn't change much in terms of FPS; it's always around 180-200, and playing on minimum makes no sense because it looks worse than PS1. It also looks like this FPS can be reached by lower cards like the 3080 and 3070 🤷🏻♂️ (I also tried max settings at 4K and got around 90-120 FPS, which looks good for 4K, but for full HD I'm a little disappointed 😞)
@@emanueleguida3401 To be fair, you are simply CPU bound, so you won't see more than 180-200 regardless of your in-game settings. Al Mazrah has suffered from this CPU-bound aspect since launch, and it's only been getting worse. But yeah, while your 6950xt definitely can push more frames, the 5800x3d wouldn't be able to keep up.
I really wanna see the new 14th gen Intels, whether they can utilize the full potential of the 4090... some sources are talking about 6.3 GHz minimum on P cores...
Same, I'm getting that 14900k on release 👍 really excited to overclock it
Picked up a 6950xt last April because of a sale on Newegg. The Sapphire Nitro+ version. It did not belong on my rig lol, super overkill. Didn't matter though, because as I upgraded to AM5 it continues to impress. Best decision I made.
I was thinking about buying a 4070, but after doing some research I'm leaning toward the 6950xt.
Solid decision to go 6950xt tbh, great gpu and offers mad good value
me too
I bought mine a week ago and I get 230-240fps with only 75% of the GPU used. I don't regret it at all.
@@pancakesdoctor I bought a 7900xtx 💀💀
Oh yeah. Absolutely 4070 not even worth it. Ngreedia prices are absolutely brain damaged.
Thanks for video.
What is your ram model and specs?
In this video I used the same Kingston fury renegade 32GB kit (2×16) which is XMP 6400mhz CL32 rated.
I generally only run 6000-6200mhz on this kit, but with custom tuned timings.
You have Windows 10. Do you have GPU scheduling activated on your Nvidia system??
No, gpu scheduling hurts performance consistently within MW2, so I keep it off.
@@Beefy-Tech ahahah wtf lol, do you know what GPU scheduling is????
@@OmnianMIU yes, I do know what it is.
damn my 4090 only gets 130 fps on warzone 2 at 1440p lmaoooo
7900xtx is even better, although it is an AMD favoured title we’re talking about
So in the benchmark, my 4090 loses by about 10% to a 7900xtx.
In BR they ofc perform the same due to cpu bottlenecks xD
I've seen 4090s overclocked far more than mine that were within 3% of a well overclocked 7900xtx, but that's only when compared to the average 7900xtx.
The fastest 7900xtx I've seen is still 5% faster in the benchmark than the fastest 4090.
My 7900xtx and 7800x3d did like 180 at 1440p low, I guess I didn't optimise well.
@@Beefy-Tech Yeah, it's brilliant value for money. I'd still obviously rather have a 4090, but I'm very happy I also saved myself 600 and put it elsewhere in my build.
How do the 1% lows stack up for you between the cards?
@@jamie56k Honestly, it's hard to tell due to the servers being so atrocious.
But if we only take into account benchmark results, the 4090 and 7900xtx have pretty much identical 1% lows.
In this specific test the 6950xt had lower 1% lows by quite a bit... but I'm pretty sure it was just the server, because the Ashika Island results for 1% lows were solid. Not only that, I've seen the 4090 have similar 1% lows in a different game of Al Mazrah, yet again demonstrating why WZ2 Al Mazrah testing is so damn annoying xD.
@@Beefy-Tech Thanks for your hard work man, been subbed since about 500 :)
Hey Beef! I currently have a 5800x3D/RTX 3090. Do you think I'd see an improvement in fps in Cornzone if I swapped the 3090 for a 6950xt?🤔 Currently get around 170-205 fps, and 240 fps on Ashika. Or am I doomed because of the CPU bottleneck in that unoptimized game?
It all depends on whether you see CPU bottlenecks already or not. If not, then the 6950xt can definitely do better in Warzone specifically, but outside of Warzone 2 your 3090 would be equally quick to a 6950xt, and it has more VRAM than the 6950xt.
I'd say you are far better off enjoying the performance of your system and just doing a system upgrade later on, as you'll see far better improvements game-wide rather than just improving Warzone.
@@Beefy-Tech thanks for the advice I appreciate that. 👊🏻
I have the 5800x3D + RX 6900 XT and I get around 10 fps more than you. Don't think it's worth the hassle.
The 6950XT is way faster than a 3090 lol, especially with SAM on. UV/OC'd, my 6800XT is already reaching 3090 Ti territory; I have a 21500 Time Spy score. It's a waste of equipment pairing an AMD CPU with an Nvidia GPU, you don't get to benefit from all the goodies.
You can literally sell that 3090, buy a 6950XT, get better performance all around and put money in your pocket. Get rid of it before it's too late.
Hey Beefy, you are the man! Question: how do I stop losing fps while streaming on OBS? Losing like 30-40 fps, is that normal?
capture card?
I'm just impressed that people actually play call of duty still.
@@AntiFurry927 weird way to spell overpriced rehashed trash sold on a yearly subscription to people with more money than sense
Lol I test bench the game and I barely play it anymore😂😂😂
What software do you use to get the info in the upper right? About to jump to a 7900xtx from a 4070ti.
MSI Afterburner
Wow, I have a 6950xt and a 12700k @ 4.8GHz and only see around 140-160 fps on Ashika. Is it even possible for me to get near the fps you were getting on your 6950xt, even with the config file, or is it just the processor difference?
7800x3d is a beast in warzone
I have a 13600k and a 6950xt, and while playing warzone I get “Packet Bursts” and my frame rate is only around 100. Due to the packet bursts the game is almost unplayable and I don’t know what the problem is. Do you think I’m CPU bottlenecked??
Packet burst has nothing to do with your cpu or FPS. Also what settings are you playing on?? Try 1440p Medium basic preset. That's usually the best way to go for comp settings.
Also no, an i5 13600k is not a bottleneck. For one, Warzone is horribly optimized. The only way you're bottlenecking is if you're playing at 1080p low settings. Try 1440p medium to high, because higher graphical presets put more of the load on your GPU rather than your CPU.
Cod has terrible servers 😂💀 or it's your WiFi
Your tests are up to date; other channels use very old benchmarks. Keep up the good work.
And again, after the mid-season update my benchmark went down a little... from 498 fps CPU-side to 480 fps in the Warzone Reloaded benchmark?
Yup. Same here man, about a 10-15 FPS loss in benchmark, meaning BR al mazrah will again lose a bit
@@Beefy-Tech Pff, why is there fps loss with every mid-season or season update...
@@coolbvwes1 I wish I knew man... it's so annoying that no matter how good our tech is, cod will find a way to ruin the performance xD
The benchmark has changed. When you enter that building halfway through, the bot now throws a Molotov cocktail at you. The flames will hit fps hard. Also the water changed. With Deferred Physics Quality, originally you could turn it off and you wouldn't see ripples on the surface, and it was opaque and non-reflective. Now there are always ripples, and the surface is clear and reflective even when it's turned off.
What resolution was this on, 1080 or 1440??
Can you do the same test on a 6900xt? I'm curious how different the 6950xt and 6900xt are this season in Warzone, given the game is completely CPU bottlenecked.
They are extremely close together. Essentially the 6950xt is 5-7% faster on average.
Price to performance I agree with you (but the 4090 is in a league of its own), but a 147 fps 1% low vs. 214 fps is not close... Maybe the average is, but who cares about the average? Percentiles and frametimes are far more important, and all this is not fully utilizing the 4090's performance, not counting things like DLSS for 4K gaming, AV1 support, NVENC, CUDA, efficiency, etc. Sorry for the bad English, or maybe you are just clickbaiting; in that case, it's working :p
PS: No, it's not server related. I have the 6950xt and can confirm those 1% lows. I have a YouTube channel too, but in Spanish, also about overclocking!!
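For anyone unsure what a "1% low" actually is: it comes from the frametime log, not the fps counter, by isolating the slowest frames and converting their time back to fps. A minimal sketch assuming you have a CSV of frametimes in milliseconds exported from a capture tool (the file name and column name here are hypothetical, and note that different tools define the 1% low slightly differently):

```python
# Compute average fps, "1% low" and "0.1% low" from a frametime log (ms).
# Conventions differ between tools; here the 1% low is the average fps of
# the slowest 1% of frames, which is one common definition.
import csv

def one_percent_low(frametimes_ms, fraction=0.01):
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, int(len(worst) * fraction))
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms                  # convert ms back to fps

def load_frametimes(path):
    # Assumes a CSV with a "frametime_ms" column (hypothetical layout).
    with open(path, newline="") as f:
        return [float(row["frametime_ms"]) for row in csv.DictReader(f)]

if __name__ == "__main__":
    ft = load_frametimes("frametimes.csv")        # hypothetical export file
    avg_fps = 1000.0 * len(ft) / sum(ft)
    print(f"Average:  {avg_fps:.0f} fps")
    print(f"1% low:   {one_percent_low(ft, 0.01):.0f} fps")
    print(f"0.1% low: {one_percent_low(ft, 0.001):.0f} fps")
```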
Alright, that's good to know, fam. I was wondering why the 1% lows are the way they are in Al Mazrah, while the Ashika Island results were solid on the 6950xt in terms of 1% lows. That's what made me speculate the servers were at fault. Not only that, but my 4090 has actually had a server where it also had terrible 1% lows before, though to be fair that can also be tied to the CPU.
What cpu are you using?
If one were to stream (1080p) AND play (1440p) WZ2 with a 7800X3D, would an RX 6950 XT perform better than a 4070 at doing both tasks at the same time?
Ya im planning on switching to AMD
Enjoy fam! Make sure to do proper research on the GPU you are getting.
Just beware that the AMD driver experience still isn't as good as Nvidia's. But their GPUs are definitely a great bang for the buck.
How do you use your shared files on disk?
My 7700X / 6950XT gets 110-120FPS in Warzone 2 1440p and avg 230FPS in MP 1440p FSR 2.1 quality
That’s bad
Is the 4090 bottlenecked? I don't think there's a CPU that can keep up with it, but I'm definitely purchasing a 6950xt rather than a 7900xtx.
What is your build....
So I built a PC with a 13600KF and a 6950xt, 32GB RAM at 6000MHz, and fps in Warzone is 50-120. What's the problem....
crazy
Can't use your game file, the link is broken.
Yes, your game config file link is broken; can you fix it and reply back when it's fixed? Curious to see if it's anything different from mine.
Yup, I'll fix it now. It's happened before. I wonder why it keeps breaking though
Fixed it.
Link wasn't copied over fully lmao.
HOW?!? I have an XFX 6950xt and a Ryzen 7 7800X3D. At 1440p I'm only getting around 165fps in Warzone! If I drop everything to low/very low I still cannot surpass 200fps?!? How are you getting 300fps in Al Mazrah?!?
214fps 1% in plunder? you gotta be kidding me bro lol
Didn't u mention last video that next video will be Plunder full run?
This isn't it. Haven't done the run yet, this was simply a gpu to gpu comparison.
Also I fully 110% believe the 1% lows of that 6950XT were due to the server. Look at the Ashika Island results and how solid those 1% lows were on the 6950xt.
It means the 4090 got lucky with the 214fps 1% lows... it had a good server while 6950xt had a bad server.
Perfect example of how bad the servers are now!
I have seen the 4090 with the same 1% lows the 6950xt had in this video👀 so server consistency is ass yet again
@@Beefy-Tech Yeah, I was like no way you're gonna get Ashika 1% lows in Al Mazrah 😅 especially in Plunder mode
People need to stop benchmarking this game. It’s not stable enough. The whole point of benchmarking is to use something that will give you a baseline and then work from there. This game is broken!
Yeah, I stopped worrying about frames in WZ. Like Verdansk, every update loses frames, so it became pointless. The menu benchmark on the other hand works well for testing overclocks and undervolts. Over ten runs I'll get +/- one, maybe two frames.
The entire point of my video is thanks to cod being so busted. I wouldn't have a video idea if this game just worked, but it doesn't, and it still has a huge fan base despite all the hate the game has garnered, so I'm sure there's people out there that would appreciate knowing a 6950xt is such a monster in Warzone, and that they don't actually need a 4090 to dominate 👍
@@Beefy-Tech The point is that tomorrow or next week, these numbers probably will be totally different. Not because of the hardware, but because of the updates etc. The performance is all over the place.
@@harshhell4185 Yup. But the 6950xt will still be performing above its weight class. The beauty of COD is that while the performance is being destroyed every season, it gets destroyed equally for every system xD.
The 6950xt will not be doing as well in 2 months, but if the 4090 is losing performance along with it, it should still be fine to compare.
@@Beefy-Tech yeah but they need a 7950x3D though :P
I'm set on the 6950XT, it's gonna be nice for Harry Potter: Hogwarts Legacy too.
why my 6950xt aint like this
I have a 7950x3d cpu which means little to no bottlenecks on 6950xt
Yeah AMD chips love this game.
6950XT from XFX so great the best GPU iv ever got
The 6950 XT performs better with a driver-only install of Adrenalin. Use Afterburner, NOT Adrenalin tuning😂😂😂
0:50 .... you state no CPU bottleneck.... yes, it's at 15-20%, but what's going on with the 1-2 cores being hit by the main thread? I can guarantee they will be 100% pegged....
Remember you can have 16 cores, 1 busy core and 15 idle; this will show like 8% util, when in fact it's 100% because the app is only using a single thread on a single core.
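The point about the overall utilization figure hiding a single-core bottleneck is easy to demonstrate with arithmetic. A sketch with made-up per-core numbers (not readings from the video):

```python
# Why overall CPU utilization can hide a single-thread bottleneck.
# Per-core figures below are made up for illustration.
cores = [100] + [3] * 15   # one core pegged by the main thread, 15 near idle

overall = sum(cores) / len(cores)
print(f"Overall utilization: {overall:.1f}%")   # ~9% - looks "idle"
print(f"Busiest core:        {max(cores)}%")    # 100% - the real limiter
# If the busiest core is pinned at 100% while the GPU isn't fully loaded,
# the CPU can still be the limiter despite a low overall utilization figure.
```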
In a gaming situation you will never use all the cores, and if the GPU sits at 100% that means the GPU is the bottleneck and not the CPU.
@@Beefy-Tech That just isn't a true statement... generally a game will have 1 main thread/process that does most of the crunching, and it can farm out some tasks to other threads, but anything that needs to happen for each rendered frame will be done by the main process/thread... so if you have a 48-core 2.4GHz CPU and a 4090 pegged at 100%, the CPU is still the bottleneck, but it will show as 1% utilisation.
This is because the cores processing the game's threads are only clocking at 2.4GHz, and while the game runs over many threads, 2.4GHz isn't enough to process the data in time for the GPU, so the GPU will downclock and run in a lower state. It does this on the fly; when it does, it will run cooler and still state 100% utilisation.... this is because it's using 100% of its current clock/frequency, but it will be way under its peak performance.
If that's not the case here, share a shot of the cores in Task Manager while you run the game.
I had to check your video.... Your 4090 is running in a lower state, that's for sure... 250W at 100% util.... never...
I suspect you're monitoring the frequency limit or max boost, not the current or actual frequency.
Actually, jump to 2:45: GPU usage on the 4090 is 70%.
Your system is probably core hunting and using the non-3D V-Cache cores.
Nvidia drivers have an overhead issue that could explain why the 4090 is acting like that; it's easy to find, just google it.
that is not saying much for a game that is broken.....
You aren't wrong. The reason the 6950xt keeps up after all is cause the game is super unoptimized and the Al Mazrah Servers are ass.
I'm sure someone out there that mains Al Mazrah is thankful to know a 6950XT is enough to satisfy their FPS Needs though xD
@@Beefy-Tech lol
4090 has much better 1% low fps which is the most important thing for competitive gaming
RDNA 2 and RDNA 3 differences
THE SIZE COMPARISON!!!!
SAM > ReBAR
One major difference Nvidia cannot match: look at the high-to-lows, it's drastic for Nvidia, 433 down to a 1% low of 279, under that of AMD at a lower clock. That means more stability at a higher clock. I have had three systems I built run a full 3600 under full stress loads and stay stable in Geekbench at 3800. So your comparison here tells ME that you don't quite know how to overclock that AMD GPU; all I see you at is under 2600-2800, so you didn't even overclock the GPU and just used the overclock function/tab in the software. Come on dude, at least try. I bump right against the max 300 fps the game allows!!! With a 6700, yeah that's right, with a 6700 running 2800/2600, and a 6950 XT I set at 3600/3400, power tapped to the max as well as VRAM max frequency all the way to the right. Above that, the max voltage I run is 1080; I only drop it when I see instability, down to 1040 first, then 1035, then 1030, and I know a fine-tuned undervolt is around 1027 to 1035 on the 6950xt. This 6700 is running at 1080 as it's my daily. People say 1440p is too much for the 6700 series cards, but I max out this monitor and hit most fps caps in Apex and whatnot. The one major fault I see is around trying to stream from the same rig; you can do it, but quality suffers.
LMAO, I saw you are running full Spot Cache and Ultra ray tracing, so you are further limiting the AMD card. Wow dude, and I haven't even set mine up like that to see how you came to that conclusion.
got one new for $520
If you play fps games, stop falling for this BS. What matters is your 1% lows; it doesn't matter which one has higher fps in a test at low graphics. Look how the GPU usage is 60% vs the 6950 at 95%.
Would a ryzen 9 5900 or 5950x be better?