*Small correction:* 1080p data for RTX 3070 Ti should be 97.1 FPS. Also note that our testing is at "Ultra" which is one below "Experimental" since that setting did have some warnings related to it.
RTX 3070 is better than RTX 3070 Ti @ 1080p?
Good call out. Retesting those! *edit* 3070 Ti should read 97.1 FPS.
Very impressive optimization for 2025
Even though KCD I was highly unoptimized on highest settings in 2018.
They did their homework after the first one @@Strongholdex
Oof at the 6700xt losing out so badly to the 3060ti.
Hoping kicking it down to high helps
The 6700 XT is originally 3060 Ti level in performance
The 6800 is originally 3070 level
Looking at my boy, the 4070 Super being a true unit even at 4K with DLSS. It's a great card!
The best decision I made was to get a 4070 Super instead of the 7900 GRE. It's faster in new games and the price was the same where I live
@@xtremefest7901 amd 💀
how much did u pay for it?
@@ashiqurrahman8830 converted to USD, I paid $700 for the Super and the GRE was $680. These are actually good prices where I live lmao.
If games are optimized the 4070 super destroys the 4070
What scene are you getting these numbers in? I haven’t seen any other benchmarks including my own get this high.
What CPU are you using? You can check if you are CPU bottlenecked by checking your GPU usage.
7800x3D with 4070ti Super. My GPU usage is pretty much always 100%
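(Not from the video, just a quick way to sanity-check this yourself: a minimal Python sketch using the pynvml bindings to poll GPU utilization while you play. Sustained readings well below ~95% usually point to a CPU limit; a pinned 100% like the one described above points to the GPU.)

```python
# Minimal sketch (illustration only, assumes an NVIDIA GPU and the
# nvidia-ml-py / pynvml package): sample GPU utilization once per second.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    for _ in range(30):  # watch for ~30 seconds while the game is running
        util = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu
        print(f"GPU utilization: {util:3d}%")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```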
Really well optimized game. Played for a few hours on my RX 6900 XT (4K/High settings) and it's a nice game: bug free, good story and gameplay. All in all, my recommendation ....
what about CPU performance? Can I run it with a Ryzen 3600X and RX 6750 XT at 1080p max settings? (without RT obviously XD)
You may be limited by your CPU in some scenarios but should still be very playable.
Hi brother, long-time follower of your site. On your site the RTX 3070 is green at 1080p, 1440p, and 4K. I love that card. But I have an RX 6800: green at 1080p and 1440p, but yellow at 4K 😅 that hurts me 😅 I just came here to find out why 😊 (because the RX 6800 is 10 to 20% faster than the RTX 3070)
Is it crashing for anyone else?
damn amd really sucks in this game apparently
Amazing, so good programmers still exist in game development?
Best video series ever
That's insane
7900XT oh yeah!
these benchmarks are telling me to throw away my RX 6600
GTX 1050 4GB, 8GB RAM, 4-core i5, and the game runs on Medium at around 70 FPS
very informative as always thank you!! But why does nobody care about my 6950xt? Not just this channel either. It's the forgotten evil cousin that lives in the attic 😮💨
The performance is very similar to the 6900 XT so we don't test both. Same with the 3080 Ti. Just add 1-3% to the 6900 XT 👍
Hello benchmarks❤
4070 super is a little 220w monster
Best purchase of my life. I’m only upgrading when they release another GPU with the same bang for the buck.
@DoutorGonzo yeah same haha, especially with the DLSS upgrade too.
It is a good card but the 12gb VRAM is already limiting it
@@vulcan4d in broken unoptimized games I'll never play, yeah big deal.
@@vulcan4d Can you name one game that’s actually limited by VRAM? I’m asking because I’ve played pretty much all the major releases from the past year in 1440p with DLSS Quality without any issues. VRAM has never been a problem for me so far.
gonna be playing locked to 72fps with DLSS @ Ultra @ 3440x1440, frame gen'd to 144 fps (using Lossless Scaling)
Nvidia, please take note of how in this benchmark the abominations known as AI and frame generation were not mentioned once.
At 1080i I can run Ultra settings with a 3060 OC 12G. Not optimal though, it does not look that great.
Considering 4k30 with dlssQ on 3050 is plausible
Just use LSFG and u can do fake 60😂
Bro no one needs digital foundry to deduce how blurry fsr is..remember dlss 1 was a laughing stock? U think we needed DF to tell us that?
And this is why I prefer 1080p, stuff just runs better even on modest GPUs
And I'm still rocking RX 5700XT... I'm sorry for being poor 😂
30 FPS at 4K with a €300 Arc B580 versus €1000 cards at 50 FPS ha ha ha
Haha in your review it says "Overall, the game isn't a Witcher 3, but it's not that far behind."
By all accounts KCD II is an improvement over the first game in a lot of ways, and the first game was already better than Witcher 3 in a lot of ways. W3 isn't bad but it's very, very overrated.
5080 is better than 7900xtx
Big if true
@TechPowerUp meant 4080
Another nvidia title...
RTX is shit, it needs DLSS; without DLSS it's the same as a GTX
Rtx 4090 still legendary !
again 7900xtx trash like always
You have a colorful life...
@@Roberty98 nope, I see only one color: green
@@Jutepest nobody cares
Yeah, AMD GPUs are aging like milk
Nah, it's fine.
That VRAM usage is cap, at High settings an RTX 4060 at 1440p uses 6.2 GB max
source: zworms
The VRAM usage will vary by area though. As the 4K results show, there isn't a performance delta between the 4060 Ti 8 GB and 16 GB, so VRAM isn't a real issue even with an 8 GB card that may dip into system memory.
@@TechPowerUp I can confirm that; I saw swings of 2.4 GB up and down on 8 GB GPUs while playing. Forests and detail-dense areas tend to use more VRAM.
What I noticed is that the game tends to "restrain" its VRAM usage while in game, but releases it when loading the game again after quitting.
This was on High settings, so no short LOD.
Curious, yet a very good game for low-spec gaming
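(If anyone wants to measure those swings themselves, here's a minimal sketch, again assuming an NVIDIA card and the pynvml package; it logs VRAM in use once per second and reports the spread between the lowest and highest readings. Note it counts everything allocated on the GPU, not just the game.)

```python
# Minimal sketch (illustration only): log VRAM usage during a play session
# and report the min/max swing, like the ~2.4 GB variation described above.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

samples = []
try:
    for _ in range(600):  # one sample per second for ~10 minutes of gameplay
        used_gib = pynvml.nvmlDeviceGetMemoryInfo(handle).used / 2**30
        samples.append(used_gib)
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()

print(f"min {min(samples):.1f} GiB, max {max(samples):.1f} GiB, "
      f"swing {max(samples) - min(samples):.1f} GiB")
```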