Great benchmark, dude. Congrats!
Thank you 👍
I have a 6800 and yes, it can do 4K.
Well, justice served: tomorrow I'm getting a new LG 4K monitor. Thank you so much!
Same
Nah dude, a QHD monitor plus a 4K TV is better.
Thank you for the amazing test
You're welcome!!!
Please do more games with FSR too.
Thanks for testing with realistic settings. It's puzzling to me why anyone would buy an entry/mid card like this and crank everything to max and then whine about low fps, but they do it all the time in testing.
I saw some tearing problems; you should use FreeSync. Btw, good job! With a Ryzen 7800X3D and DDR5 RAM it would be even better!
Thanks! I purposely disabled FreeSync because I had stuttering issues when recording on a capture card!
You think with a 7800 XT, a 7800X3D, and 32 GB of DDR5 I will be able to play at 4K ultra settings?
@@BabyChakks1320 that's my hardware and I play at 4K
@@fighterguy0 I recently ordered this kind of config, and I hope I won't be disappointed. I'm still hesitating between getting a 2K or a 4K screen, but there's a 13% bottleneck at 2K, which makes me lean towards the 4K screen. But when I see Cyberpunk at 30 fps... it scares me. lol
@@YouMe_Clay Simple, just don't play Cyberpunk :p (even though they "fixed" the game, it's still badly optimized in terms of FPS)
Thank you for the video. I'm looking for an RX 7800 XT Nitro to replace my RX 5700 XT. I'm playing at 3440x1440 with a Ryzen 7 7800X3D and 32 GB of DDR5, so I guess it will be good if it performs well at 4K.
Yep, this card will perform absolutely fine on ultrawide
Hi bro, I really liked your video. Please tell me, should I buy the RX 7800 XT or the upcoming RTX 4070 Super for 4K 60 Hz gaming? There's the small difference of 16 GB vs 12 GB, so will it affect games while playing at 4K or not?
Well, upping your resolution to 4K will increase VRAM usage a lot, but most games should still fit within 12 GB. I do think that starting with 16 GB is the better approach, but it is definitely up to you. So far, according to TechPowerUp, the 4070 Super is at least 2% faster than the 7800 XT: www.techpowerup.com/gpu-specs/geforce-rtx-4070-super.c4186
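For a rough sense of why 4K pushes VRAM usage up, here's a small back-of-envelope Python sketch of framebuffer memory alone; the buffer count and bytes per pixel are assumptions for illustration, and real games allocate far more on top for textures and caches, so treat the numbers as a floor rather than a prediction.

```python
# Back-of-envelope framebuffer math only; buffer count and bytes/pixel
# are illustrative assumptions. Real games add textures, geometry, and
# caches on top, so treat these numbers as a floor, not a prediction.

def framebuffer_mb(width: int, height: int,
                   bytes_per_pixel: int = 4, buffers: int = 3) -> float:
    """VRAM for a triple-buffered swap chain at the given resolution."""
    return width * height * bytes_per_pixel * buffers / 1024**2

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name:>5}: {framebuffer_mb(w, h):.0f} MB")

# 4K has 4x the pixels of 1080p and 2.25x those of 1440p, and every
# resolution-sized buffer (color, depth, G-buffer) scales the same way.
```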
I bought a 4070 at release and sold it to get a 7800 XT a few months later lol.
For 1440p they are about the same, but at 4K the 12 GB really becomes a bottleneck.
(My monitor is an LG 1440p 165 Hz, my TV is an LG C3 4K 120 Hz.)
My 4070 was not a Super though; it was a normal 4070.
@@TheTryingDutchman can you play at ultra settings, and what are your PC specs?
@@TheTryingDutchman debating if I should spend $250 extra for a 7900 XT 🤔
@@BabyChakks1320 the more VRAM for higher resolutions, the better (if the GPU core can handle that resolution, of course).
You can even try 8K with VSR + FSR (AMD) or DSR + DLSS (NVIDIA), but you need a lot of VRAM for that anyway. On NVIDIA there's no choice but the 3090 in the RTX 3000 series, while AMD provides generous amounts of VRAM across the RX 6000 and RX 7000 series.
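To put the 8K suggestion in numbers, here's a quick, hypothetical pixel-count sketch in Python; the 67% per-axis internal render scale for a "Quality" upscaler preset is an assumption, since actual FSR/DLSS ratios vary by preset.

```python
# Pixel counts behind the "8K needs lots of VRAM" point. The 67% internal
# render scale is an assumed "Quality" preset; actual FSR/DLSS ratios vary,
# and output-sized buffers stay full size regardless of the render scale.

RES = {"1440p": (2560, 1440), "4K": (3840, 2160), "8K": (7680, 4320)}

def megapixels(w: int, h: int) -> float:
    return w * h / 1e6

base = megapixels(*RES["4K"])
for name, (w, h) in RES.items():
    mp = megapixels(w, h)
    print(f"{name:>5}: {mp:5.1f} MP ({mp / base:.2f}x of 4K)")

# Even at a 67%-per-axis internal resolution, "8K Quality" still shades
# roughly 1.8x the pixels of native 4K:
scale = 0.67
internal = megapixels(int(7680 * scale), int(4320 * scale))
print(f"8K Quality internal: {internal:.1f} MP ({internal / base:.2f}x of native 4K)")
```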
GameFrostYT: using any kind of antialiasing technique at 4K is stupid, because the resolution itself is already so high that there are no jagged edges. At the same time it's heavily taxing on performance and therefore doesn't show the realistic power of any GPU.
Nope.
You can clearly see jagged edges at 4K without antialiasing, especially on bigger monitors. Games look just horrible without it.
@@kndlshop then use at most a 31" LCD for 4K and sit 120 cm away instead of 30 cm. Then 4K properly fits your FOV and you won't see jagged edges.
@@kndlshop again buttercup, if 4K is used on a 28" or 31" monitor and one sits 1 meter away, as one should, there are absolutely no jagged edges.
Antialiasing was designed for 1080p and lower resolutions, where individual pixels are large enough that edges visibly stair-step.
You are blabbing illiterate things here, buttercup.
No thank you @@lflyr6287, I prefer to use antialiasing and sit closer to my 49".
@@lflyr6287 because I work with design a lot, I'm using a 49-inch 4K 16:9 monitor and I sit at arm's length (a bit more) lol
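Both sides of this argument can be sanity-checked with a pixels-per-degree (PPD) estimate. Here's a minimal Python sketch, assuming 16:9 panels and taking ~60 PPD as the usual 20/20-acuity figure: it suggests a 28" 4K screen at 1 m sits past that threshold, while a 49" 4K screen at arm's length is well under it, which is consistent with both comments above.

```python
# Pixels per degree of visual angle: above ~60 PPD (roughly 20/20 acuity),
# individual pixels and hard jaggies become difficult to resolve.
# Panel sizes, distances, and the 60 PPD figure are illustrative assumptions.
import math

def ppd(diagonal_in: float, horizontal_px: int, distance_m: float,
        aspect=(16, 9)) -> float:
    """Pixels per degree of visual angle at the screen center."""
    diag_units = math.hypot(*aspect)                    # 18.36 for 16:9
    width_m = diagonal_in * 0.0254 * aspect[0] / diag_units
    pixel_pitch_m = width_m / horizontal_px
    deg_per_px = math.degrees(2 * math.atan(pixel_pitch_m / (2 * distance_m)))
    return 1 / deg_per_px

print(f'28" 4K at 1.00 m: {ppd(28, 3840, 1.00):.0f} PPD')  # ~108, past the threshold
print(f'49" 4K at 0.75 m: {ppd(49, 3840, 0.75):.0f} PPD')  # ~46, jaggies still visible
```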
yes
Yes
Answer: yes, it can!
Nooo, it's only meant to run 2K, stop having fun!