I like your videos but this is complete miss without DLSS results to compare. You just make an educated assumption. But DLSS is not free performance. It still impacts GPU's performance. I am not saying you are wrong, I just can't say you are right. And I can't test it myself as I don't have 2 CPUs. It would be nice for you, if you have 2 CPUs to make a follow-up video where you get same PC, different GPU and test one scene for native/upscalled. Than we could clearly see proportional impact of using DLSS.
@@ingamgoduka57 Late April Fools comment? UE and Unity are literally the two most CPU-limited engines out there, with tons of other issues like #stutterstruggle. 😄
But GPUs are miles better in running Physics just use Nvidia PhysX which is awesome & cross platform. But modern game developers are not using PhysX which is a missed opportunity. Rockstar games have their own GPU based Physic Engine called Euphoria its been used from GTA4 to their latest games. You can also use compute shaders for some physics but modern game developers are lazy. The only thing that kills CPUs is highpoly animated characters, too many different materials & shaders. But modern good game developers use shader vertex animation for NPCs which only uses GPU & change into skinned mesh when you start interacting with the NPCs GTA to Spiderman NPC. Cyberpunk implement same thing in their updates which drastically improved performance. Capcom must fix or ditch their game engine.
CPU handle your 1. Render targets 2. Physic related 3. NPC AI related 4. Date streaming related CPU definitely matters even at 4K gaming. you will get stutter or worse 1% low if CPU can't handle those things well.
But GPUs are miles better in running Physics just use Nvidia PhysX which is awesome & cross platform. But modern game developers are not using PhysX which is a missed opportunity. Rockstar games have their own GPU based Physic Engine called Euphoria its been used from GTA4 to their latest games. You can also use compute shaders for some physics but modern game developers are lazy. The only thing that kills CPUs is highpoly animated characters, too many different materials & shaders. But modern good game developers use shader vertex animation for NPCs which only uses GPU & change into skinned mesh when you start interacting with the NPCs GTA to Spiderman NPC. Cyberpunk implement same thing in their updates which drastically improved performance.
@@ingamgoduka57 Nvidia PhysX, even on Nvidia, do not run everything on the GPU. I know this is hard to believe for many but is the truth, I know this for two reasons, one I was a old Ageia PhysX user, before Nvidia bought it (meaning I add a Ageia PPU...) and I'm a game developer (a modder to be exact) my self for more than 20 years now and I have worked with PhysX (and other physics engines). Nvidia passed this wrong idea about what PhysX can really do, on purpose, just to sell you more GPU's and they won, because the majority of people/gamers, are totally mistaken about what GPU PhysX acceleration really does in games. In reality the only physics a GPU can accelerate, in real games, is the non gameplay impacting physics. Meaning soft body physics like, cloth movement and destruction, liquids, dynamic fog effects, particle based physics, ex. glass breaking peace's or similar effects and rigid body objects, that can't affect the AI or the player, in any meaningful way. Physics that can affect the AI or the player, meaning those that can kill/block the AI or the Player, is always run on the CPU, no matter what. For example, rigid body objects that you can pickup, with a gravity gun like in HL2 and shoot at NPC's to kill them, that is always run on CPU. Bullet detection/collision for shooters, is always run on the CPU. Collision detection for AI navigation, is run on the CPU. Collision detection for audio, is run on CPU. Ray tracing for gameplay, is run on CPU, etc. There's plenty of physics, in real games that exclusively run on CPU's, even on Nvidia, because there's no other way. Yes Nvidia has demos, where they have GPU's, running a bunch of rigid body objects around that look just like those moving around in games, but those demos, have zero AI, is all visual flare, the minute you put AI in the scene and the physics objects need to hurt the AI the physics have to switch to the CPU. And just for those that may wonder, this was true with the Ageia PPU as well, in reality the PPU was just a low power ASIC, with a custom processor with a bunch of tiny cores, very similar to a GPU.
Analogy to understand the relation between CPU and GPU: - GPU = orange - CPU = orange juicer - Objective: get quickly the highest amount of juice from the orange (highest possible FPS) = get the biggest possible orange (higher GPU) but don't forget that the best orange juicers (better CPUs) are the ones that get more juice from the same orange and avoid the worst orange juicers that can't even penetrate the orange (very low CPUs).
@@atirta777 I don't think those two games are very good examples. Neither of them is GPU-intensive enough relative to their CPU usage, and I expect an RX 7900 XTX would still be significantly bottlenecked by a Ryzen 5 1600X. I'd pick a different example like Cyberpunk 2077 with ray tracing enabled, which is so GPU-intensive at 4K (especially for AMD GPUs, which aren't great at ray tracing) that an RX 7900 XTX actually _wouldn't_ be bottlenecked by a Ryzen 5 1600X.
@@nathangamble125 Doom and RDR2 are good examples don't you think? The R5 1600X drives nearly 200fps on Eternal and I doubt 7900XTX does better than that with maxed RT on. Same with RDR2, about 90fps and 7900XTX doesn't reliably hit that at 4K Max. Also, we're talking about playable experiences here, which isn't the case with Cyberpunk 4K Native + RT on the 7900XTX.
Good 3 steps to use when building a *gaming* PC: 1. Pick a GPU according to the fps and resolution you like to play at in your games of choice. 2. Pick a CPU that achieves *at least* that same framerate. 3. Scale both to your budget while also considering longevity.
Can you even explain what "cpu according to the framerate you like to play at" means? What would you recommend to a person who "likes" 60 fps? Or 144 fps?
Respectfully disagree, the frames per second you might see on a chart will almost certainly not be your real world user experience. By the time you have a bunch of programs running in the system tray, a browser with 20 tabs open, discord and anything else you might have running in the background this will not be your experience, especially if you're like me and watching a 1080p or even 4k movie on a second screen while gaming. Typically I would recommend buying a CPU that is more powerful than you think you need, at the moment I'd say 8 cores minimum and scaling the GPU based upon your budget after that, temper your expectations and if you're not happy with your parts choice for your budget then wait and save more. It's always better to have a CPU that's more powerful than the GPU for so many reasons, the smoothness of operation is what you will notice day to day, gaming or not.
@@mertince2310basically he just means you have to balance your system. The gpu should almost always be your limiting factor in gaming. Hence why you spend 50% of your budget on the gpu. you choose the cpu that will not be the limiting factor. Obviously a 12400f and a 4090 is crazy. The cpu is the limit. Something like the 14700k is much more in line with the power of the 4090 and is a good balance.
Very old comment to reply to but I think he means that the variation in frames is smaller think like 40-120 fps on the 5800x3d but 80-120 on the 7800x3d.
@@InfernusdomniAZ Thats what he meant. Some games became unplayable when i kept my 5950X with a 7900 XTX at 4K. The .1% and 1% FPS spikes were CRAZY. This i the only time in my Hardware career over 16 years that buying better hardware actually made games run worse for the first time.
Very well explained. Thank you very much for helping me understand this issue better. It's why I'm a member of your channel. I'm happy and proud to support such stellar work. I appreciate you🙂.
@@zzavatski Well.... yeah, but this is the fiction *everybody* is supporting, because (apparently) it is impossible to make hardware that will display games with any kind of effects in 4k at a decent speed for a reasonable price. If a 4090 -- agreed by all to be the most powerful hardware currently available, the "best of the best" -- can barely hit 60 fps at 4k Ultra in Alan Wake 2; can't even make 30 fps with RT on; if the only way to get decent framerates is through software 'hacks', *yet* Nvidia _still_ charges €1800 - €2100 for the hardware.... then we have to recognize that for whatever reason, the hardware is beyond its limit and everyone is lying and saying it isn't to get as much money as possible out of the users. Because it's ridiculous to think that I would spend €2000 on hardware that literally cannot do what I'm paying for unless you lied to me *like a **_boss_* .
Again they do it on purpose because in reality you don't need a super high end GPU to play 4k with high frames actually because it's about game optimization so if you optimize a game well enough you don't need a latest and greatest but they do it on purpose to make you buy the latest GPU so in a sense they work together because if games was so good optimize to run 4k 144 on a 1080ti for example people wouldn't have a reason to upgrade for does that mainly game it's sad but it's the truth@@Holliswoud
If you wanna play any RPG games with open world NPCs like animals, townspeople, or strategy games with million units and calculations, CPU is almost always the bottleneck
True. Yesterday i tried Witcher 3 on both dx11 and dx12 on My i5 12600K, RTX 4070 different setting, resolutions and still have large frame drops on Novigrad.
@@grynadevilgearxd2442Witcher 3 next gen runs really badly because they hacked together a dx12 layer instead of making it run natively. If you try the Classic version you will have 0 framedrops in busy areas with your system.
@@grynadevilgearxd2442 nah witcher3 npc barely have any logic,they legit just block of texture that have 1 line and move around or never,witcher get high fps even on a 4core cpu xd,now rdr2 is a better test,days gone is also heavy cpu limited in town my 4core can barely output 30-40fps
@@saricubra2867 Well, whenever you make such a jump in CPU power and then add 'in certain titles', the response is automatically: no shit Sherlock. The 9400 doesn't even have multithreading. It's 6 cores and the same 6 threads. Plugging a Ryzen 1600 into those Starfield charts would probably show the 7800X3D to be miles ahead as well.
? Is not that amazing, the 12600k is way better in that regard since it almost matches the 12700k in games for way less, in fact he should've gone with that one instead since he wasted money. @@saricubra2867
True I was struggling with 30 fps and bad stutters with my 4090. I swapped all my fans and my ram with rgb stuff and now I'm getting a bare minimum of 200 fps, rgb is life 😂
If only you can feel the shiver that went down my spine when you mentioned DLSS. That realisation that kicked my brain to high gear and I was like...Holy timbers, he's right! If you're playing at 4k with DLSS quality, you're actually playing at 1440p.... You're more CPU bound than you thought.
True, but youre good with a 4090 with even a 12600K or 5800X3D if you were gonna use upscaling. Its why with a 12700K im not really caring what new CPUs come out and havent cared about anything newer. Im more than fine with my 4070 Ti at 1440p, and id be fine with anything rivaling a 4090 in the future cause id realistically be moving to 4K in that case. Its complicating. If i were gonna buy a 4090 level GPU for 1440p in the future cause thats just how much harder it is to run 1440p max settings, id honestly just move to a used 14700K or 13700K, overclock and time the ram and be fine probably. Not planning on buying a new CPU+motherboard combo realistically till AM6 and whatever Intel has to over at the time.
Been pointing this out for a while especially to channels like Hardware Unboxed who have been at the forefront of 'CPU doesn't matter at 4K' nonsense because that reviewer personally doesn't like to use upscaling and only plays Fortnite.
Yup, just upgraded from a 2070 Super to a 4070 Super with the intention of playing at 4K and after booting up Cyberpunk I realized that my Ryzen 3600 was clearly holding back my GPU. I decided to do a full upgrade to a 7600 system and now everything's alright. Turns out ray tracing takes a big hit on the CPU, even if you're playing at 4K
I agree. I had 3600 as well and it's not a very good CPU to play with ray tracing effects. I'm still using 2070super and even it was getting bottlenecked by 3600 for raytracing. Now I'm using 13900K and it's buttery smooth.
In short: - The CPU can achieve a mostly fixed framerate across all resolutions - The framerate the GPU can achieve scales inversely with the resolution - This is not set in stone, but dependent on the game and the settings - The lowest between the two is the framerate you'll end up with, what causes generally a GPU bottleneck in higher resolutions and a CPU bottleneck in lower resolutions - This can easily be seen in GPU benchmarks, where you'll see the GPU achieve the same framerates between 1080p and 1440p in some games, or not scale proportionally to the drop in resolution - A third limiter is the game engine, which in some cases, mostly older engines, can't achieve higher framerates than a certain cap - Upscalers like DLSS, FSR and XeSS render at lower resolutions than the target resolution, providing close to the framerate of a lower resolution - This makes it more likely you'll hit a CPU bottleneck when using upscaling even at high monitor resolutions - BONUS: Geforce cards drop the driver overhead on the CPU, while Radeon cards keep them on the GPU. This means that with lower end CPU's, there is a significant difference in performance between Geforce GPU's and Radeon GPU's since the CPU bottleneck is more severe when paired with a Geforce card due to that driver overhead
I heard that the NVIDIA GPU driver offloads all the work to only a single CPU core, while the AMD GPU driver can distribute the driver workload to multiple CPU cores. Thus, you especially need a high single core performance CPU when you use a NVIDIA GPU. Someone please confirm, what I said.
Just think of it as two different pipelines: CPU instructions and GPU instructions. Depending on the rendering engine and how the game was programmed, the balance of instructions sent to the CPU and GPU varies. For most games, changing resolution will mostly impact GPU load.
@@christophermullins7163 are you talking about draw distance? never heard of cpu render resolution before draw distance helps a lot for cpu and gpu but you will still get lag spikes a lot because of the pop in from draw distance
I got to about 10:00 and you were going on and on, and maybe you made this point later? I tuned out. The bottom like for ANY resolution is what a person wants for fps. If I'm playing Starfield with a 4080, well, I have a 2K rig so it's great, or good enough. I prefer an avg. above 90 and a 1% low to be above 60 fps. I find that gives me a stutter free experience in open world games. But looking at a TH-camr benchmark it doesn't tell me much, because they will stick to certain settings just for consistency, but there's no way I'm playing Starfield at 4K with a 4080 UNLESS I scale down the quality to where once again my avg. fps is above 90 and my 1% is above 60, or I'm using DLSS, and using RT, DLSS is almost always going to be enabled. YES, because I'm pushing the fps back up, the CPU is going to matter. I'm not gaming at the settings HDWU had for 4K in Starfield. I'm not going to let a GPU struggle like that and give stutters from time to time. AND THIS is the point most people get wrong when they look at TH-cam benchmarks and listen to a person say "at 4K they're the same". And frankly, part of this held belief has been SPREAD by different TH-camrs. And I don't need to get into what the CPU does and what the GPU does. What I can tell you is when you play a game at high fps at 4K the CPU is CERTAINLY going to matter. I do know the CPU has to pass data to the GPU before the GPU can render a different frame, otherwise it's painting the same frame, because the GPU doesn't know movement, so there is change data that the CPU has to send to the GPU to get a different frame. The CPU tracks the game. You also have to keep updating RAM. And if YOU personally like playing at 4K with 1% lows at 40 fps, well have fun with it.
When a game is very CPU intensive, that applies to any resolution. Higher resolutions have close to no impact on CPU performance. I said ''close to no impact'' because as the resolution increases, the framebuffer size increases, and that requires more memory bandwidth for transferring data between the CPU and VRAM. That increased data transfer can ''technically'' lead to slightly higher CPU usage. However, that increase is generally not significant compared to the overall workload of the GPU, which handles the bulk of the rendering process.
@@christophermullins7163not at all. There is no difference in resolution. If you are playing at 60 fps, vou requirement will be the same regardless of resolution. People just say stupid things like this because they think "lower res = more fps = more CPU demand", but they fail to realize the CPU requirement increases due to higher fps, not lower resolution. They are not the same thing
@@saricubra2867 I said that "CPU matters less at 4k than lower resolutions" and after watching this video, Daniel confirmed that my statement is accurate. The title is clickbaity AF. The point of the video is to say that when we say "play at 4k" we are more likely upscaling but that does not invalidate the original statement at all. "Slower CPUs have less of a performance hit at 4k" "CPU performance matters less at higher resolution" are irrefutably factual statements and Daniel himself knowd that without any doubt in my mind. It is all about the way we word the question and assuming we are using upscaling and lower graphics settings etc. it cannot be argued that anything I said is wrong. You'd be greatly benefited from trying to understand why that is the case if you still haven't realized that it is true. Go ahead and argue something irrelevant like Daniels video.. no one said anything about upscaling guys.. it's called clickbait.
@@christophermullins7163 But you are making a completely different statement. He didn't confirm your statement. This entire video is digging into the nuance of why cpu performance *looks* (emphasis mine) like it doesn't matter at 4k but it actually does. The issue isn't that slower cpus doesn't matter at 4k but more so that you're gpu limited if you just slap on max setting and called it a day. It will matter if you adjust down the settings and use up-scaling to reach your potential max framerate which is determined by what cpu you have. Most people aren't playing at max setting on 4k. It's like you heard the phrase "you're kind of right" and just stopped watching after that. You can argue all you want about how you're right but you actually aren't because you're making a statement based on a faulty conclusion. The data from the video above us literally tells us that you can be cpu limited at 4k and it does matter what cpu you have especially if you're reducing your gpu bottleneck by paring down the settings and resolution. No one in their right mind would recommend pairing a 4080 with something like a 2600x "because cpu doesn't matter at 4k" because we all know that it does bottleneck.
thank you for making this video, so annoying seeing so many people assume you can pair a 4090 with a 5600 and expect the same results as a 7800x3d at 1440p+ 🤣
@@ZackSNetworkwow that is insane. I can understand a 5800x3d or something but a 1600 isn't even going to have a PCIE 4 unless you go get a new motherboard that costs 3x the CPU 💀
I was trying to explain this to someone in another video, but I didn't even think of the whole upscaling aspect of it when I was trying to explain why upgrading a CPU isn't going to give you gains in the same way a GPU improvement will. Not only can I reference this video now if needed, but I also think your explanation here will set me up for my next conversation on the topic. Thanks!
I never only play games. I need benchmarks where they game at 4k, have a wnd monitor with 3 Chrome tabs that are streaming video, voice chat over discord, spotify etc all running. (No i dont listen to people and music and video at the same time, but the program is running) And THEN tell me i only need 4 cores and 16gb ram in 2024. Cause thats bs
Thanks for doing this I've seen way to many "CPU doesnt matter at 4K" comments over the past few years, even at 1080P I noticed a big improvement in 1% lows alone at 1080P when upgrading from an i7 8700K to 13600K (Carried over existing RTX 3070).
i'm still on a i7 8700k running at 4.8ghz with a 3060ti and i have people telling me all the time my cpu is still fine for gaming i play battlefield games all the time and helldivers 2 and it's always my cpu even when i play metro exodus this cpu is just to old now i cannot stay at 60 fps
You would be surprised how many people are bottlenecked by their wrong decisions and not researching when they are buying PCs or parts and wasting their money.
One more aspect where the CPU is relevant for 4K / increasing frame-rates that Digital Foundry found (mentioned in passing in their Dragons Dogma 2 Tech Review - 19:00 - 19:27) : When CPU limited, the frame-rate in the same scene is still taking a (small) hit (~7FPS difference between 1080p and 4k as well as ~2 FPS between 1080p and 1440p) at higher resolutions. Though they didn't really test this in detail (as this was not relevant for this review) and just mentioned that they suspect this is due to this / some game(s) drawing more things in the distance while at higher resolution.
Great video, as a hardware enthusiast I've known about this for many years but its really hard to articulate it as well as you did here. Going to bookmark this one for future use so I don't have to try and explain it.
My favorite case of this is the original Crysis (and remastered) where on top of being super GPU heavy, it also increases more CPU-driven rendering budgets with an increase in resolution. On top of that it's also not very well multi-threaded so you basically need a 7800x3D to run a 2007 (right?) game at 4k60fps maxed out settings. I think it ends up pushing out the LOD more or something super weird the higher your resolution, so you get more CPU calls. Digital Foundry did a pretty good video on it and in the clip where they got Alex (their PC focused member) a 7800x3D to replace his 12900k they discuss Crysis specifically. But yeah, basically at 4k your CPU is less likely to be a bottleneck, but it still very much can be, and you want something at least pretty good to make sure you don't get slammed with crap frametime spikes, as CPU limited gameplay will almost always be a stuttery mess where GPU limited will be better.
an outlier to the 4k argument is MMOs.... in MMOs, where a bunch of players are around each other, cpu definitely matters.... 3d chips really shine in those scenarios. my minimum fps in wow in valdrakken in 4k doubled when i switched from a 3700x to a 5800x3d. i get 80fps in 4k native max settings in city.
U should fine tune that even more with 14600k and a 7900xtx i get 150 to 180 fps in vald at 4k but the thing with vald is it artificialy limits gpu and cpu usage im always sitting at like 60% gpu and like 20% cpu usage in the emerald dream i get 90% + gpu usage and like 400+ fps at 4k
@@Billyshatner88Just get a 7800X3D which would blow both of them out of the water. Lol. In all seriousness, if the hardware you’re playing on is serving you well then no need to change it. The PC community STRUGGLES with this notion and continues to complain why they’re in debt.
@@Billyshatner88 psst... You're CPU bound. That's what being CPU bound looks like. If your framerate isn't hitting max and your GPU utilisation isn't at 100% that means you're CPU bound.
Yes CPU performance is all about raw data. Especially in multiplayer where you have to calculate 15-20 players like in a WoW raid that tanks the CPU hard while in a singleplayer game everything is predetermined the CPU doesn't have to do anything there.
I sent this to a friend today, because I cannot be bothered trying to explain this to them anymore. This discussion really needs to be settled -- once, and for all. But, I realize there will always be some degree of confusion or misinterpretations when it comes to information like this.
Mainly game at 4k besides esports titles and just upgraded from the 5600x to the 5800x3d and it's been a big uplift. Average is only 10% to 20% higher but the 1%low is so much higher its great. Even if GPU is maxed out it still helps a lot!
Did the same change and got the same results as you, those better 1% lows make such a difference. Really wanted a 7800x3d, but factor in the price of the new ram and MB and it was 3-4 times the cost of just dropping in the 5800x3d. Shall stick with this for a while.
It's great to see a selection of videos explaining the connection between CPU/GPU and framerate at different resolutions come out in the past few days. It works to test CPUs with nothing but a 4090 for comparison and likewise GPUs with a 7800X3D for consistent benchmarks, it doesn't help people trying to match components in an upgrade or on a strict budget when the midrange CPU wastes GPU power and vice versa.
9:41 Here's the key point. I used to think CPU didn't matter at 4K too. Then I actually got a 4K monitor and started using DLSS on every single modern game. You just need to look at the CPU usage rising after activating DLSS to figure out that your CPU actually *does* matter. If you play at 4K native it still matters, just a lot less since you're extremely GPU bound.
I’m glad that this isn’t in response to my post on one of these videos! 😅 I did say that CPU doesn’t matter at 4K resolution but what I really meant was the difference between CPU’s of a particular segment. I was considering a 14900K at one point and there’s the 7800x3D, but I went with a 7950x3D. The graphs for those can show different frame rates but as you mentioned it’s highly dependent on the area that you’re in in the game. Also, I doubt that I would actually be able to appreciate any kind of uplift that’s less than 5 or 10%.
@theinsfrijonds Why not upgrade to a CPU that has more than dual channel memory? When I made the switch I noticed the difference in frame pacing, but I was also able to truly multi task. It amazes me that streamers don't use workstations to game with. Yes there's a limit of what a cpu can do at 4K, but no one has put the Quad, hex, and Oct memory channel systems to the test in a public video. The cool thing is you can work and GAME at the same time with a workstation. You have enough pci-e lanes that you can run two VM at the same time and use more than one GPU.. Gaming is limited to single GPU since SLI and Xfire bit the dust. if you have another GPU you can use VT-d and pass it through a VM to run rendering., professional workloads, or a streaming for youtube/twitch/whatever box on the same system. The use of unlocked threadripper and unlocked Xeon is not well know by those that focus only on gaming.
@@michaelcarson8375You save a lot more money buying an X3D cheap instead of buying an overpriced platform with quad channel support. The extra cache is equivalent to a giant memory OC.
@@michaelcarson8375 Those processors were out of my budget. I mostly bought AMD because it's more power efficient. I do hate the fact that the memory controller in their processors is definitely worse than Intel (limited to 3000 mega transfers per channel.) Also I'm not sure that there is a motherboard where I could access the second PCI Express slot with my 4090 graphics card. It's just that huge (four physical slots taken up in space but three on the back of the case.) Good to know though that there are uses for quad channel and up. I can only imagine that channels haven't covered that due to the limited number of buyers plus the fact that it would be very expensive for them to review
@@michaelcarson8375 Because workstation cpu are focused more on doing a bunch of things well rather than having high clock speed which some games really need. You could potentially limit yourself on performance if you go for something like a threadripper or a xeon cpu. Also the price difference doesn't really make sense for a streamer when they could put that money into second pc that they use for capture and rendering, keeping a eye on chat, and so on. They cut out the overhead on streaming without sacrificing performance this way. *edit* also using a vm while gaming could potentially get you banned if you play online if you can even launch the game at all due to anti cheat detection. A lot of cheaters used to use vms to run their cheats.
@@Shadowninja1200 Excuses excuses. There are unlocked CPUs from both Intel and AMD Those systems are the true high end desktops. Dual channel cpus no matter the number of cores are not high end desktops.
Really helpful for tuning a setup at 4k, you can initially max everything at 1080p to find out what the CPU FPS limit is and then go to 4k to target that frame rate, may as well crank up all the quality settings to max until you see the FPS dip below the CPU limit
@@paul2609 I disagree, if the budget is that tight that you can't afford the extra £100 or so to go from 6 to 8-12, then you should seriously reconsider whether you can afford a pc at all.
@@teddyholiday8038It depends, for example, the 5900X has 12 big cores and the 12700K is hybrid. The only reason why i choose the 12700K besides the better IPC is the lack of CPU overhead (Intel thread director cleans interruptions for the main threads for any program). I wanted a X800X3D chip but they are 8 core only and they would feel like a slog when the scheduling isn't as good. Ryzen 9 5900X gets lower averages than these newer chips but 1% lows and frametimes are very, very smooth when the game is well optimized (like Cyberpunk 2077).
I'd also bet people running 4k don't really need AA turned on to 4x 8x or msaa or txaa.. that's pretty taxing on gpu st 4k.. I bet that would show a uplift of some fps.. where a good cpu matters.
Lol.. title: "lets see how cpu fare in 4k" Video: "heres how different cpu fares in 1080p" Idk how you guys can take this video seriously. Go open any cpu review videos on yt. You can see the results between cpu @4k are pretty negligible.
Its kinda infurating that there is no comprehensive comparison of cpu performance on 4x/simulation games. Its always AAA stuff, id like to know how much a 7800x3d would improve my ticks per second on vic 3.
Im in this scenario. Currently have a ryzen 3700x paired with RTX2080 and planning to upgrade to Rx7900XT. Im playing sim games at 4k, 60hz monitor. Wondering if the CPU bottleneck will be a problem and if a 5800X3D upgrade is a smart move. What do you guys think ?
1:37 why don't you use something like Presentmon to actually show the relationship between components during the frame cycles? Wouldn't that give a lot more detail and thorough explanation that could be more easily understood?
@@InnerFury1886 TLDR: I stopped as soon as I saw "upcoming release" which informed me that you are uninformed. PresentMon is on version 2 and it's been available for download for a significant amount of time already...
A smaller content creator made a video recently where he suggested pairing a 4080 with a Ryzen 5 7600 for a 4K build. I commented and said that was a bad pairing, and a bunch of people all replied to my comment giving me so much crap and roasting me while saying you didn't need a stronger CPU for 4K because games will pretty much always be GPU bound at 4K. I then made the mistake of bringing it over to reddit, where I then got roasted by like a million people who all agreed that you don't need more than a 7600 for something like a 4080. Truly a "be right even if you're alone" moment for me. I mean seriously, if you're pairing a $1,000 GPU with a $200 CPU in 2024, you've done it wrong, even if you're playing at 4K.
@@pengu6335 You would. But at such a high price class I would go with AM5. There is a high possibility that you can upgrade the CPU in future without changing anything.
i built a compter this year out of other computeres, but ever since android 15 came out, now every time i boot, my windows freezes as soon the loading linux thing prints out. i bet i got hacked. i feel like that proabably a cpu bottlenet but i dont have 4k display so im kinda left with no recourse except more overvolting.. i got room though. its crazy how high that thing actually goes. also if i pull out my graphics card the whole computer just shuts off immediately. does that mean i didnt integrate graphics for the cpu correctly last time i kept flashing all the gpt patricias. intels just uunoptimized traSH. it always wont stop starting but now it wont even start stopping either
I barely can reach 300fps or slightly more in Unreal Tournament 2004 (f*ck Epic for killing Unreal for the sake of Fortnite) on my i7-12700K, the game runs in one thread and the bottleneck is so hard that anything above the Intel UHD 770 is pointless. On task manager, one thread is 100% at all times and the iGPU is 100% as well, the rest of the CPU does nothing. Alder Lake has godlike IPC.
very informative and practical. I generally recommend to my friends that CPU doesn't have to be the best for their build, but I always ask them what their resolution is, their target "smoothness" (most people are not familiar with fps) and the games they wanna play. So most of the time I would recommend a mid-range CPU like a ryzen 5 or i5 + a decent midrange GPU for 1080/1440p. Sometimes I'd even sell them to the upscaling tech available for each side as well as show them that some games have beautiful raytracing BUT I always make sure to let them know that it's not always a good idea to crank up the settings and make them consider optimizing settings.
People say stuff like this and it’s very dumb. Even if you are less CPU bound it still takes a lot of power for raytracing and other aspects. Overall it just doesn’t make sense to pair a i3 or Ryzen 5 with a 4080 and especially a 4090.
It does, look at the 4K results. Especially if you're on a budget, you tend to get more fps and eye candy out of a better GPU+worse CPU than a worse GPU+better CPU at 4K. Especially when it comes to upgrades! Shall I change to a new platform, or just stick a beefy GPU in, and wait another CPU gen to come out... Nobody says you gonna get the most out of your Top Tier GPU paired with a mid tier CPU, but you gonna get a better result than pairing up a mid tier GPU + mid tier CPU. This is clearly visible at the 4K chart. In some cases you will, in some cases you won't get the most out of your Beefy GPU, but you're better off than with a mid tier at 4K. I'm only talking about 4K.
I have a R5 7600 and an xtx, I only get cpu limited in a handful of specific areas in only a couple of games, and I don't think i've seen gpu go under 90% in those areas anyways.
BS. 4090 doubled FPS over a 3080 using a 5600X at 4K. It enables 120+fps with max settings in DCS. Went to a 5800X3D, I can't notice a difference in game. Also, the 4090 is very quiet compared to a 3080 Strix going balls out at 4K.
@@V4zz33 You're not going to play at native 2160p, because that's a waste of GPU power. And a RTX 4080 costs like 3 times as much as a Ryzen 7 7800X3D, so going for a strong (or even the strongest) CPU doesn't have as much of an impact on the budget compared to getting a stronger GPU.
This is a fair point when using dlss, you are taking the load off the gpu an scaling the resolution. What about with DLAA wouldn’t this lean more towards the GPU or is just as dependent on the CPU?
CPU does matter, but not nearly as much as people think. In 4K you'll need to spend a lot more on the GPU so it's perfectly fine to go with a 5600 if your only interest is casual gaming.
@@wertyuiopasd6281 Zen4 is still more expensive all around without providing too much of a boost. Budget builds are fine with Zen3 but if you really want to get the current gen then 7500f is the go-to if available.
Great video...I actually changed my opinion on this a while back. In CP77 with a 5950x and a 4090 with rt max, dlss off in 4k I was getting about 40 fps, when I upgraded to a 7800x3d it uped to about 48 fps so there was a 20% increase. I was actually pretty shocked.
@@od1sseas663 make an argument then numb nuts. It shows ignorance to think or say that higher resolution does not remove CPU bottleneck. It's irrefutable that it does.
At 1080p? Yes. Probably in 90% of the games you play this system is cpu bottlenecked. But 120 fps is super game specific, can't talk about that in general. At 1440p, meh, probably not.
You're assuming that people that buy 1000$ GPU-s to play in 4K, are going to turn down resolution 2 years after they bought it instead of just selling the old one and getting a new GPU that can play in 4K. Also, you can tweak some Settings on your GPU to achieve your desired FPS. Shadows/Reflections/Lighting/Effects ... all these impact performance. I would rather play with Medium Shadows than to upscale from a lower resolution. Also. CPU demands also increase from year to year. R5 5600 was able to play anything in 100FPS when it launched. Lo and behold, in 2024 it can only do 60FPS in some demanding poorly optimised titles. A 3090 was able to hit close to 100 FPS in any game in 4K in 2020. So. If I paired a 5600 with a 3090 to play in 4K in 2020, guess what? Both the CPU and GPU demands have increased. So the CPU can only pull 60 FPS, but also the GPU can only pull 60 FPS in 4K. If the time comes and the 5600 will only be able to pull 30 FPS (in a decently optimized game, not this Denuvobroken DragonsDogma2), the 3090 will also pull only 30 FPS in 4K. Sure. You could turn down settings, but tell me. Who bought a 3090 to play in FHD? :)) If it can't handle it, you just sell the whole PC, or the components, and get something better. If on the other hand, you would've spent more and got the 13900K with a 3090, you would'nt have used all of the CPU potential. That CPU can push 150FPS in pretty much any game, but the 3090 can't push 150FPS in any game in 4K. You basically bought a system that is designed for 2K or worse. You "future-proofed" your system. Meaning that in 2024 when your 3090 can't hit 60 FPS in 4K, and you decide to drop resolution to 2K, they finally match up. So basically ... you paid some extra money, for extra processing power, which you haven't used for 4 years. All this, to just be able to play in a lower resolution :))) Ridiculous ...
I own a Samsung Syncmaster 955DF pseudo 1440p CRT and i can confirm that the framerate wars are just a bunch load of BS. Take any CRT and it blows any modern game screen for gaming, seriously, it's not even close. You get built-in anti-aliasing, free frame generation tech and upscaling (CRTs don't have a native spatial resolution and refresh rate, it's an infinite range following their specs). Playing at 1856x1344 is just crazy for such an old monitor, it looks way, WAY sharper and better than a 1080p LCD i have right now.
@@saricubra2867 At some point CRT's got surpassed by OLED You can put any 4K OLED besides a CRT and notice the differences Linus did something like that CRT is only better than the low cost 1080p LED displays 1080p displays got left behind a while back, nobody cares anymore about that resolution
So after certain level of power of CPU (cores and speed to handle info) at 4k does not matter yf you put a better CPU the performance won’t go up but if you put a lower CPU you will get impacted due to bottleneck?
Going from a 5500 to a 5700x3d has smoothed out my games so much at 3440x1440. In some games the average fps is pretty much the same but i get no more stuttering in games where i constantly had issues with it.
Fly at low altitude in Microsoft Flight Simulator or DCS with a i9-12900K and an RTX 4090 and you will see your frames sink below 30fps in 4K. This also happens in Racing Sims when you have a full grid of cars. I find it hard enough to maintain just 72fps in a Pimax Crystal VR headset in DCS and MSFS2020.
Ok bro cut the crap. There was no difference between these CPUs at 4k. You tried later to prove your point with games in 1080p. That's not 4k. All I saw is that Ryzen 5 5600 pulled 1fps less then ryzen 7 7800X3D. Nothing else matters for the subject. Ofc there is gonna be bigger difference if some idiot is using ryzen 1700 for example and pairs it with 4080. That's just dumb. But my point is for modern rigs you should save on the CPU and invest more money in GPU. Gamers have been doing that for ages now. Only competitive gamers need top notch CPU for those CS2 tournaments when they game on 165hz monitors and what not. Normal, average gamer that's focused on 4k and has 60hz or 75hz screen should invest more in GPU for sure.
this is exactly my case now, i have r5 3600 and rx 6800 combo, i play at 4k 60fps with fsr on quality most games, in very crowded places in many games like cities or forests i get cpu bottleneck, my gpu can run it but cpu cant hold on since i play at 1440p cause of fsr usage even at 4k, and its kinda sad, cause gpu cant handle native 4k in many games so its 40 50 fps, and with fsr cpu is too slow again getting 50 fps still below my playable 60fps. so im stuck now, need to upgrade to 5800x3d probably, but my psu cant handle that and i need to upgrade more and more parts to make it work ;\ so im balancing settings to achieve 60 fps at 4k + fsr
do you run ultra or medium settings? do you use dlss/fsr? many factors for 4k gaming. if you run native 4k at ultra settings of the mid range cpu of the last few gens will perform about the same. but who the fuck has a 4090 and a 5700x
He showed you but you did not understand it. AVG fps doesnt matter its the 0.1 and 1% lows that create huge stutter on high end systems. Hes 100% right, i had the problem myself in 4K with a 7900 XTX and 5950X which did not come along with the gpu.
Nice video.. again. :) I can absolutely recommend the videos from Gamers Nexus regarding "GPU Busy", "Animation Error", "input lag" (Latency pipeline), "Optimization" and how GPU drivers work.. discussed with an engineers from Intel and Nvidia.
I recently updated from Ryzen 5 5600x to Ryzen 7 7800XD paired with 3080 because I also play 4K with dlss thinking it will improve the frame rates a lot. But actually it didn’t change anything and it feels like a waste of money.
The point in (my own) short words: for 4K gaming CPU performance is often not important, just because the framerates are lower, just because the gpu is maxed out. that's all. when the framerates raise, the cpu has to work again a lot more. to be futureproof, an unlimited fast CPU and GPU is needed %-). Nobody really knows, which new game will "waste" more power on the gpu or cpu. Too much just does not exist, but saving the money for a top level cpu but inserting a 4090 is just wrong. Especially with UE5engine using most of the time Raytracing on the CPU we all can find us in a state, where a faster cpu could be easily be needed. That all said with the need for intense AI npcs, just because ... why not?
So, TLDR: if you plan on optimizing your games for HIGH FPS, your cpu is more important. You open the game and put everything on max like 4k/ultra but then you realize you only get 40-50fps or you might get 60-70 but want 90fps+ etc. Then what you do is turn down graphics settings and/or use upscaling (duh!), and the more you do that, the closer you get to your CPU limit (=max fps your cpu can deliver before the gpu starts rendering). And once you hit it, you can turn down your settings as much as you want and upscale from 240p you won't get more fps. It has to be said though, if you have a semi modern CPU and want to play with "moderate" fps ~60-90fps you really have to look hard for games where this becomes an issue.
This is why I change GPU once every 2.5 years but CPU every 5-6 years. My monitor and the game I'm playing are actually the key reasons that cause me to upgrade my PC components. You have to find out what your CPU/GPU are actually doing before you upgrade. There are plenty of tools like MSI Afterburner, AMD Adrenaline Software monitoring, Intel PresentMon, Task Manager, etc.
As someone with an 8700k and 4080 Super combo , CPU performance does absolutely matter. I'm even CPU bound in Time Spy of all things 😂 . Bit imbalanced but my old 2080ti GPU died recently , so put this GPU into my old build. Planning to upgrade to Zen 5 though soon enough ;)
Basically, both your CPU and GPU need to have the computing power to output at your chosen resolution and framerate target. With the added variable that those computing needs will vary from game to game. I honestly get the confusion.
I play at 4k, my old system was a i7 6700k (4 cores/8 threads) paired with a RTX3080 and I upgraded to a i7 12700 (12 cores/20 threads) and the gain in frames is minimal... I'm talking like 1-3%
@@mrshinobigaming8447hey if you’re using a bottleneck calculator I would advise against it they’re usually really shitty Play a round with a few and test different CPU’s and hopefully you’ll get what I mean
@@deucedeucerims im not using bottleneck calculator i have watch alot of reviews max gpu for r5 5600x is 7700xt at 1080p anything more powerful will cuz bottleneck
would love to see someone do a video of CPU comparisons but at different DLSS/FSR presets to see the performance difference between internal resolutions upscaled by the GPU VS internal resolutions matched to displayed resolution. Most might think it's 1:1 but the extra processing done by the GPU may affect the overall performance in-game.
i have 4070 and running forbidden west on 107 fps with dlss on. there is no stuttering or any other problems. when it comes out i will buy 5070 and I just cant tell the difference in all this performance when naked eye can hardly see 240 fps or refresh rate on your monitor.
It all depends on the game and if you are cpu bound or gpu bound. In some games i play at 3200x1800 even tho i have a 4k monitor, i'm still gpu bound in the games i play like that tho. Got a 7700 none x :D And i feel my fps and latency is a little bit better when gpu bound than when i'm cpu bound, but that might be my monitor's scaling that makes that difference, idk tbh.
Not to mention that the CPU has a major impact on system snappiness and load times even if you're 100% GPU bound while gaming. I have a TitanXp (1080 TI equivalent) paired with an R7 5800X and I'm itching for a CPU upgrade even though my GPU is 3 years older than my CPU. Additionally, the CPU can get interrupted by background tasks or if the game has to load in assets and that can come through as stuttering even in GPU bound scenarios.
I have upgraded from FHD 60Hz to 4k 120Hz screen and havent touched my R5 5600 + RX6700XT combo. In 7+ years old titles performance is OK at 4k in newer ones not so much. What bothers me the most is upgrading just my screen created some ugly lags for every day use of Win11 aka complete opposite of snappiness. Even though it is not gaming related it is enough annoying for me to ask if anyone else had similar experience and what kind of solution can fix the lags.
the point is its not worth spending a lot of money to upgrade to the next gen if your running 4k e.g a 5900x to 7800 x3d... If you have a few stutters drop the settings
That's the thing tho, can't really drop settings for CPU limits. The only thing affecting it is (maybe) level of detail and, if the game has it, any other general setting that reduces spawn rate of NPCs and other objects that have collision and stuff. If your CPU can't hit 60 fps it just won't hit 60 and you either need to live with it or you need to upgrade. Even turning on dlss/fsr won't help (apart from frame gen).
So I have a core i5 7th Gen with gtx 1060 and the only game I play is dota 2. Now I am seeing my CPU is the bottleneck when my gpu is at 40 50 percent utilization even on 4K.
Or to put it another way: when do you start to be GPU limited? One of the best tools for visualizing this is the Forza Horizon 5 benchmark because it shows what percentage of the time it’s limited by one vs. the other and how many FPS the CPU is managing in a given run vs. the GPU.
Ok... Im using a 4080 with a ryzen 9 5900x and im gaming at 4k with dlss quality... correct me if im wrong here, but as long im hitting 99% gpu usage im fine correct? the only thing i should looking for is the gpu usage right? if it hits 99% im good right?
Check out Jawa, the place where gamers buy, build, and sell! bit.ly/JawaOwenApril24 Use code OWEN10 for $10 off your first purchase!
As a Jawa Verifed Seller, I will say there is some incredible value in to be had in purchasing a prebuilt from a good seller on the platform.
never sekk in java, prices is joke, like you give it for free.
I like your videos but this is complete miss without DLSS results to compare. You just make an educated assumption. But DLSS is not free performance. It still impacts GPU's performance. I am not saying you are wrong, I just can't say you are right. And I can't test it myself as I don't have 2 CPUs. It would be nice for you, if you have 2 CPUs to make a follow-up video where you get same PC, different GPU and test one scene for native/upscalled. Than we could clearly see proportional impact of using DLSS.
@@ingamgoduka57 Late April Fools comment? UE and Unity are literally the two most CPU-limited engines out there, with tons of other issues like #stutterstruggle. 😄
But GPUs are miles better in running Physics just use Nvidia PhysX which is awesome & cross platform. But modern game developers are not using PhysX which is a missed opportunity. Rockstar games have their own GPU based Physic Engine called Euphoria its been used from GTA4 to their latest games. You can also use compute shaders for some physics but modern game developers are lazy. The only thing that kills CPUs is highpoly animated characters, too many different materials & shaders. But modern good game developers use shader vertex animation for NPCs which only uses GPU & change into skinned mesh when you start interacting with the NPCs GTA to Spiderman NPC. Cyberpunk implement same thing in their updates which drastically improved performance. Capcom must fix or ditch their game engine.
CPU handle your
1. Render targets
2. Physic related
3. NPC AI related
4. Date streaming related
CPU definitely matters even at 4K gaming. you will get stutter or worse 1% low if CPU can't handle those things well.
CPU = Handles the gameplay or game feel.
GPU = How pretty your game looks.
I take gameplay over graphics any day, so i always overspend on CPUs.
But GPUs are miles better in running Physics just use Nvidia PhysX which is awesome & cross platform. But modern game developers are not using PhysX which is a missed opportunity. Rockstar games have their own GPU based Physic Engine called Euphoria its been used from GTA4 to their latest games. You can also use compute shaders for some physics but modern game developers are lazy. The only thing that kills CPUs is highpoly animated characters, too many different materials & shaders. But modern good game developers use shader vertex animation for NPCs which only uses GPU & change into skinned mesh when you start interacting with the NPCs GTA to Spiderman NPC. Cyberpunk implement same thing in their updates which drastically improved performance.
@@ingamgoduka57 Nvidia PhysX, even on Nvidia, do not run everything on the GPU.
I know this is hard to believe for many but is the truth, I know this for two reasons, one I was a old Ageia PhysX user, before Nvidia bought it (meaning I add a Ageia PPU...) and I'm a game developer (a modder to be exact) my self for more than 20 years now and I have worked with PhysX (and other physics engines).
Nvidia passed this wrong idea about what PhysX can really do, on purpose, just to sell you more GPU's and they won, because the majority of people/gamers, are totally mistaken about what GPU PhysX acceleration really does in games.
In reality the only physics a GPU can accelerate, in real games, is the non gameplay impacting physics.
Meaning soft body physics like, cloth movement and destruction, liquids, dynamic fog effects, particle based physics, ex. glass breaking peace's or similar effects and rigid body objects, that can't affect the AI or the player, in any meaningful way. Physics that can affect the AI or the player, meaning those that can kill/block the AI or the Player, is always run on the CPU, no matter what.
For example, rigid body objects that you can pickup, with a gravity gun like in HL2 and shoot at NPC's to kill them, that is always run on CPU.
Bullet detection/collision for shooters, is always run on the CPU.
Collision detection for AI navigation, is run on the CPU.
Collision detection for audio, is run on CPU.
Ray tracing for gameplay, is run on CPU, etc.
There's plenty of physics, in real games that exclusively run on CPU's, even on Nvidia, because there's no other way.
Yes Nvidia has demos, where they have GPU's, running a bunch of rigid body objects around that look just like those moving around in games, but those demos, have zero AI, is all visual flare, the minute you put AI in the scene and the physics objects need to hurt the AI the physics have to switch to the CPU.
And just for those that may wonder, this was true with the Ageia PPU as well, in reality the PPU was just a low power ASIC, with a custom processor with a bunch of tiny cores, very similar to a GPU.
Also the more amount of unique NPCs in a game, the higher the CPU load
Analogy to understand the relation between CPU and GPU:
- GPU = orange
- CPU = orange juicer
- Objective: get quickly the highest amount of juice from the orange (highest possible FPS) = get the biggest possible orange (higher GPU) but don't forget that the best orange juicers (better CPUs) are the ones that get more juice from the same orange and avoid the worst orange juicers that can't even penetrate the orange (very low CPUs).
Let me just pair a Pentium with a 4090. It will be awesome.
I just checked up the prices of them... oh boy, they arent even cheaper than a 12400F and have lower clockspeeds to boot xD
Amen brother
As one wise Al Yankovich once said, "it's all about the Pentiums!"
glorious 4K at 1 FPS
😂
I saw a person pair a ryzen 5 1600x with a RX 7900xtx and argue that there was no CPU bottleneck.
Oof, Zen 1 was really bad for gaming. Zen 2 was better, but I think Zen 3 is when AMD actually started competing with Intel in gaming CPUs.
I mean it's very possible in a few games at 4K Native like Doom Eternal or even RDR2, but the combo isn't very reliable😅.
@@atirta777 I don't think those two games are very good examples. Neither of them is GPU-intensive enough relative to their CPU usage, and I expect an RX 7900 XTX would still be significantly bottlenecked by a Ryzen 5 1600X. I'd pick a different example like Cyberpunk 2077 with ray tracing enabled, which is so GPU-intensive at 4K (especially for AMD GPUs, which aren't great at ray tracing) that an RX 7900 XTX actually _wouldn't_ be bottlenecked by a Ryzen 5 1600X.
@@nathangamble125 Doom and RDR2 are good examples don't you think? The R5 1600X drives nearly 200fps on Eternal and I doubt 7900XTX does better than that with maxed RT on. Same with RDR2, about 90fps and 7900XTX doesn't reliably hit that at 4K Max. Also, we're talking about playable experiences here, which isn't the case with Cyberpunk 4K Native + RT on the 7900XTX.
Tell that person that he can upgrade to a 5700X3D for the cheap.
Good 3 steps to use when building a *gaming* PC:
1. Pick a GPU according to the fps and resolution you like to play at in your games of choice.
2. Pick a CPU that achieves *at least* that same framerate.
3. Scale both to your budget while also considering longevity.
Can you even explain what "cpu according to the framerate you like to play at" means? What would you recommend to a person who "likes" 60 fps? Or 144 fps?
2 steps when building a PC
1. Buy the fastest CPU in the market
2. Buy the fastest GPU in the market
@@aohjiiYou forgot the first step
Step 1: Get a job
Respectfully disagree, the frames per second you might see on a chart will almost certainly not be your real world user experience.
By the time you have a bunch of programs running in the system tray, a browser with 20 tabs open, discord and anything else you might have running in the background this will not be your experience, especially if you're like me and watching a 1080p or even 4k movie on a second screen while gaming.
Typically I would recommend buying a CPU that is more powerful than you think you need, at the moment I'd say 8 cores minimum and scaling the GPU based upon your budget after that, temper your expectations and if you're not happy with your parts choice for your budget then wait and save more.
It's always better to have a CPU that's more powerful than the GPU for so many reasons, the smoothness of operation is what you will notice day to day, gaming or not.
@@mertince2310basically he just means you have to balance your system. The gpu should almost always be your limiting factor in gaming. Hence why you spend 50% of your budget on the gpu. you choose the cpu that will not be the limiting factor. Obviously a 12400f and a 4090 is crazy. The cpu is the limit. Something like the 14700k is much more in line with the power of the 4090 and is a good balance.
Well i have a 4090 and went from a 5800x3D to a 7800x3D, my average fps stayed the same at 4k pretty much, but it was MUCH more stable.
Can you please elaborate on what you mean by that, or what the insinuation is. Thanks.
Very old comment to reply to but I think he means that the variation in frames is smaller think like 40-120 fps on the 5800x3d but 80-120 on the 7800x3d.
@@InfernusdomniAZ Thats what he meant. Some games became unplayable when i kept my 5950X with a 7900 XTX at 4K. The .1% and 1% FPS spikes were CRAZY. This i the only time in my Hardware career over 16 years that buying better hardware actually made games run worse for the first time.
makes me excited about my 5700X3D to 9800X3D upgrade. Soon
@@Gamez4eveR Dont forget to report back. Also list the GPU you use and whiat resolution you use.
Very well explained. Thank you very much for helping me understand this issue better. It's why I'm a member of your channel. I'm happy and proud to support such stellar work. I appreciate you🙂.
Especially the false narrative from NVIDIA that when you set resolution to 4K and enable DLSS you still play at 4K.
@@zzavatski Well.... yeah, but this is the fiction *everybody* is supporting, because (apparently) it is impossible to make hardware that will display games with any kind of effects in 4k at a decent speed for a reasonable price. If a 4090 -- agreed by all to be the most powerful hardware currently available, the "best of the best" -- can barely hit 60 fps at 4k Ultra in Alan Wake 2; can't even make 30 fps with RT on; if the only way to get decent framerates is through software 'hacks', *yet* Nvidia _still_ charges €1800 - €2100 for the hardware.... then we have to recognize that for whatever reason, the hardware is beyond its limit and everyone is lying and saying it isn't to get as much money as possible out of the users. Because it's ridiculous to think that I would spend €2000 on hardware that literally cannot do what I'm paying for unless you lied to me *like a **_boss_* .
Again they do it on purpose because in reality you don't need a super high end GPU to play 4k with high frames actually because it's about game optimization so if you optimize a game well enough you don't need a latest and greatest but they do it on purpose to make you buy the latest GPU so in a sense they work together because if games was so good optimize to run 4k 144 on a 1080ti for example people wouldn't have a reason to upgrade for does that mainly game it's sad but it's the truth@@Holliswoud
If you wanna play any RPG games with open world NPCs like animals, townspeople, or strategy games with million units and calculations, CPU is almost always the bottleneck
True. Yesterday i tried Witcher 3 on both dx11 and dx12 on My i5 12600K, RTX 4070 different setting, resolutions and still have large frame drops on Novigrad.
@@grynadevilgearxd2442Witcher 3 next gen runs really badly because they hacked together a dx12 layer instead of making it run natively. If you try the Classic version you will have 0 framedrops in busy areas with your system.
@@Ravix0fFourHornWhat if we use Vulkan instead?
Yeah I know😀 because i'm playing on dx11. I just mean that Novigrad is more CPU bound with that many NPC.
@@grynadevilgearxd2442 nah witcher3 npc barely have any logic,they legit just block of texture that have 1 line and move around or never,witcher get high fps even on a 4core cpu xd,now rdr2 is a better test,days gone is also heavy cpu limited in town my 4core can barely output 30-40fps
Going from a 9400 to a 12700k doubled my fps in certain titles at 4k.
Is not a fair comparison, the 12700K kinda is a mistake by Intel, it is way too good.
@@saricubra2867 Well, whenever you make such a jump in CPU power and then add 'in certain titles', the response is automatically: no shit Sherlock. The 9400 doesn't even have multithreading. It's 6 cores and the same 6 threads. Plugging a Ryzen 1600 into those Starfield charts would probably show the 7800X3D to be miles ahead as well.
? Is not that amazing, the 12600k is way better in that regard since it almost matches the 12700k in games for way less, in fact he should've gone with that one instead since he wasted money. @@saricubra2867
it all depends on how much RGB you have
@brunoutechkaheeros1182 put RGB on your cooler.
True I was struggling with 30 fps and bad stutters with my 4090. I swapped all my fans and my ram with rgb stuff and now I'm getting a bare minimum of 200 fps, rgb is life 😂
If only you can feel the shiver that went down my spine when you mentioned DLSS. That realisation that kicked my brain to high gear and I was like...Holy timbers, he's right!
If you're playing at 4k with DLSS quality, you're actually playing at 1440p.... You're more CPU bound than you thought.
What is an obvious thing to many to others it’s not 🎉
I've been saying this for years 😂.
Not to mention there is almost no reason to not run DLSS quality at 4K. It's where it shines the most
True, but youre good with a 4090 with even a 12600K or 5800X3D if you were gonna use upscaling. Its why with a 12700K im not really caring what new CPUs come out and havent cared about anything newer. Im more than fine with my 4070 Ti at 1440p, and id be fine with anything rivaling a 4090 in the future cause id realistically be moving to 4K in that case. Its complicating. If i were gonna buy a 4090 level GPU for 1440p in the future cause thats just how much harder it is to run 1440p max settings, id honestly just move to a used 14700K or 13700K, overclock and time the ram and be fine probably. Not planning on buying a new CPU+motherboard combo realistically till AM6 and whatever Intel has to over at the time.
Been pointing this out for a while especially to channels like Hardware Unboxed who have been at the forefront of 'CPU doesn't matter at 4K' nonsense because that reviewer personally doesn't like to use upscaling and only plays Fortnite.
Yup, just upgraded from a 2070 Super to a 4070 Super with the intention of playing at 4K and after booting up Cyberpunk I realized that my Ryzen 3600 was clearly holding back my GPU. I decided to do a full upgrade to a 7600 system and now everything's alright. Turns out ray tracing takes a big hit on the CPU, even if you're playing at 4K
I agree. I had 3600 as well and it's not a very good CPU to play with ray tracing effects. I'm still using 2070super and even it was getting bottlenecked by 3600 for raytracing. Now I'm using 13900K and it's buttery smooth.
How is 4070 super holding in 1440p
IAM planing to buy one but kind of nervous
@@EliteGamingG786 You can just... look up benchmarks online...
LOL even a 7600 isn't good enough
Yes but the good news is that a 7600 can max out GPU at 4k. While at 1440p or 1080p, a 7800x3d will do a better job.
In short:
- The CPU can achieve a mostly fixed framerate across all resolutions
- The framerate the GPU can achieve scales inversely with the resolution
- This is not set in stone, but dependent on the game and the settings
- The lowest between the two is the framerate you'll end up with, what causes generally a GPU bottleneck in higher resolutions and a CPU bottleneck in lower resolutions
- This can easily be seen in GPU benchmarks, where you'll see the GPU achieve the same framerates between 1080p and 1440p in some games, or not scale proportionally to the drop in resolution
- A third limiter is the game engine, which in some cases, mostly older engines, can't achieve higher framerates than a certain cap
- Upscalers like DLSS, FSR and XeSS render at lower resolutions than the target resolution, providing close to the framerate of a lower resolution
- This makes it more likely you'll hit a CPU bottleneck when using upscaling even at high monitor resolutions
- BONUS: Geforce cards drop the driver overhead on the CPU, while Radeon cards keep them on the GPU. This means that with lower end CPU's, there is a significant difference in performance between Geforce GPU's and Radeon GPU's since the CPU bottleneck is more severe when paired with a Geforce card due to that driver overhead
TY bro
I heard that the NVIDIA GPU driver offloads all the work to only a single CPU core, while the AMD GPU driver can distribute the driver workload to multiple CPU cores.
Thus, you especially need a high single core performance CPU when you use a NVIDIA GPU.
Someone please confirm, what I said.
- Remember, if you have 60 hz DISPLAY, then DISPLAY is your 60 fps bottleneck, no matter how high fps your CPU and GPU can produce.
excellent post! 🎉
Very short 😂😂😂😂😂😂😂
Just think of it as two different pipelines: CPU instructions and GPU instructions.
Depending on the rendering engine and how the game was programmed, the balance of instructions sent to the CPU and GPU varies.
For most games, changing resolution will mostly impact GPU load.
i honestly love the way you do your videos. the moving dan to point at things is cute and humorous
real
Remember if you are gpu bottlenecked you can always just use upscalers and lower graphic settings.
The same doesn't apply to a cpu bottleneck
I decrease the CPU render resolution to put more of the resolution on the GPU instead.
@@christophermullins7163 are you talking about draw distance? never heard of cpu render resolution before draw distance helps a lot for cpu and gpu but you will still get lag spikes a lot because of the pop in from draw distance
I got to about 10:00 and you were going on and on, and maybe you made this point later? I tuned out.
The bottom like for ANY resolution is what a person wants for fps. If I'm playing Starfield with a 4080, well, I have a 2K rig so it's great, or good enough. I prefer an avg. above 90 and a 1% low to be above 60 fps. I find that gives me a stutter free experience in open world games.
But looking at a TH-camr benchmark it doesn't tell me much, because they will stick to certain settings just for consistency, but there's no way I'm playing Starfield at 4K with a 4080 UNLESS I scale down the quality to where once again my avg. fps is above 90 and my 1% is above 60, or I'm using DLSS, and using RT, DLSS is almost always going to be enabled.
YES, because I'm pushing the fps back up, the CPU is going to matter. I'm not gaming at the settings HDWU had for 4K in Starfield. I'm not going to let a GPU struggle like that and give stutters from time to time.
AND THIS is the point most people get wrong when they look at TH-cam benchmarks and listen to a person say "at 4K they're the same". And frankly, part of this held belief has been SPREAD by different TH-camrs.
And I don't need to get into what the CPU does and what the GPU does. What I can tell you is when you play a game at high fps at 4K the CPU is CERTAINLY going to matter. I do know the CPU has to pass data to the GPU before the GPU can render a different frame, otherwise it's painting the same frame, because the GPU doesn't know movement, so there is change data that the CPU has to send to the GPU to get a different frame. The CPU tracks the game. You also have to keep updating RAM.
And if YOU personally like playing at 4K with 1% lows at 40 fps, well have fun with it.
When a game is very CPU intensive, that applies to any resolution. Higher resolutions have close to no impact on CPU performance. I said ''close to no impact'' because as the resolution increases, the framebuffer size increases, and that requires more memory bandwidth for transferring data between the CPU and VRAM. That increased data transfer can ''technically'' lead to slightly higher CPU usage. However, that increase is generally not significant compared to the overall workload of the GPU, which handles the bulk of the rendering process.
Applies more so to lower resolutions***
@@christophermullins7163not at all. There is no difference in resolution. If you are playing at 60 fps, vou requirement will be the same regardless of resolution. People just say stupid things like this because they think "lower res = more fps = more CPU demand", but they fail to realize the CPU requirement increases due to higher fps, not lower resolution. They are not the same thing
@@meurer13danielHigher FPS means that the CPU has to handle a shorter game cycle therefore processing more inputs and outputs more quickly.
@@saricubra2867 I said that "CPU matters less at 4k than lower resolutions" and after watching this video, Daniel confirmed that my statement is accurate. The title is clickbaity AF. The point of the video is to say that when we say "play at 4k" we are more likely upscaling but that does not invalidate the original statement at all. "Slower CPUs have less of a performance hit at 4k" "CPU performance matters less at higher resolution" are irrefutably factual statements and Daniel himself knowd that without any doubt in my mind. It is all about the way we word the question and assuming we are using upscaling and lower graphics settings etc. it cannot be argued that anything I said is wrong. You'd be greatly benefited from trying to understand why that is the case if you still haven't realized that it is true.
Go ahead and argue something irrelevant like Daniels video.. no one said anything about upscaling guys.. it's called clickbait.
@@christophermullins7163 But you are making a completely different statement. He didn't confirm your statement. This entire video is digging into the nuance of why cpu performance *looks* (emphasis mine) like it doesn't matter at 4k but it actually does. The issue isn't that slower cpus doesn't matter at 4k but more so that you're gpu limited if you just slap on max setting and called it a day. It will matter if you adjust down the settings and use up-scaling to reach your potential max framerate which is determined by what cpu you have. Most people aren't playing at max setting on 4k.
It's like you heard the phrase "you're kind of right" and just stopped watching after that. You can argue all you want about how you're right but you actually aren't because you're making a statement based on a faulty conclusion.
The data from the video above us literally tells us that you can be cpu limited at 4k and it does matter what cpu you have especially if you're reducing your gpu bottleneck by paring down the settings and resolution. No one in their right mind would recommend pairing a 4080 with something like a 2600x "because cpu doesn't matter at 4k" because we all know that it does bottleneck.
Is DD2 really cpu limited? Or just badly optimized. The industry dropped the ball when the highest of the highest end cpus dip to 20-30s fps
thank you for making this video, so annoying seeing so many people assume you can pair a 4090 with a 5600 and expect the same results as a 7800x3d at 1440p+ 🤣
I saw a person pair a zen 1 ryzen 5 1600x with a RX 7900xtx. They argued there is no bottleneck.
@@ZackSNetworkwow that is insane. I can understand a 5800x3d or something but a 1600 isn't even going to have a PCIE 4 unless you go get a new motherboard that costs 3x the CPU 💀
Why would you drop $1,000 on a gpu and not at least get a 5600 or something for $130 more lol. Even if there is "no bottleneck"
They are the people crying why their framerates are the same on both low and max settings in certain games, and then blame it on bad optimisation lmao
People think that?
I was trying to explain this to someone in another video, but I didn't even think of the whole upscaling aspect of it when I was trying to explain why upgrading a CPU isn't going to give you gains in the same way a GPU improvement will. Not only can I reference this video now if needed, but I also think your explanation here will set me up for my next conversation on the topic. Thanks!
So basically as nearly everyone plays games at upscaled 4k instead of native, the CPU is just as important as playing at a native lower resolution.
I never only play games. I need benchmarks where they game at 4k, have a wnd monitor with 3 Chrome tabs that are streaming video, voice chat over discord, spotify etc all running. (No i dont listen to people and music and video at the same time, but the program is running)
And THEN tell me i only need 4 cores and 16gb ram in 2024. Cause thats bs
Thanks for doing this I've seen way to many "CPU doesnt matter at 4K" comments over the past few years, even at 1080P I noticed a big improvement in 1% lows alone at 1080P when upgrading from an i7 8700K to 13600K (Carried over existing RTX 3070).
i'm still on a i7 8700k running at 4.8ghz with a 3060ti and i have people telling me all the time my cpu is still fine for gaming i play battlefield games all the time and helldivers 2 and it's always my cpu even when i play metro exodus this cpu is just to old now i cannot stay at 60 fps
This is by far the best video on explaining this specific topic cheers bro! Love all your vids
The only bottleneck everyone has is how much is in their bank account.
You would be surprised how many people are bottlenecked by their wrong decisions and not researching when they are buying PCs or parts and wasting their money.
One more aspect where the CPU is relevant for 4K / increasing frame-rates that Digital Foundry found (mentioned in passing in their Dragons Dogma 2 Tech Review - 19:00 - 19:27) :
When CPU limited, the frame-rate in the same scene is still taking a (small) hit (~7FPS difference between 1080p and 4k as well as ~2 FPS between 1080p and 1440p) at higher resolutions.
Though they didn't really test this in detail (as this was not relevant for this review) and just mentioned that they suspect this is due to this / some game(s) drawing more things in the distance while at higher resolution.
Great video, as a hardware enthusiast I've known about this for many years but its really hard to articulate it as well as you did here. Going to bookmark this one for future use so I don't have to try and explain it.
My favorite case of this is the original Crysis (and remastered) where on top of being super GPU heavy, it also increases more CPU-driven rendering budgets with an increase in resolution. On top of that it's also not very well multi-threaded so you basically need a 7800x3D to run a 2007 (right?) game at 4k60fps maxed out settings. I think it ends up pushing out the LOD more or something super weird the higher your resolution, so you get more CPU calls. Digital Foundry did a pretty good video on it and in the clip where they got Alex (their PC focused member) a 7800x3D to replace his 12900k they discuss Crysis specifically.
But yeah, basically at 4k your CPU is less likely to be a bottleneck, but it still very much can be, and you want something at least pretty good to make sure you don't get slammed with crap frametime spikes, as CPU limited gameplay will almost always be a stuttery mess where GPU limited will be better.
an outlier to the 4k argument is MMOs.... in MMOs, where a bunch of players are around each other, cpu definitely matters.... 3d chips really shine in those scenarios. my minimum fps in wow in valdrakken in 4k doubled when i switched from a 3700x to a 5800x3d. i get 80fps in 4k native max settings in city.
U should fine tune that even more with 14600k and a 7900xtx i get 150 to 180 fps in vald at 4k but the thing with vald is it artificialy limits gpu and cpu usage im always sitting at like 60% gpu and like 20% cpu usage in the emerald dream i get 90% + gpu usage and like 400+ fps at 4k
@@Billyshatner88Just get a 7800X3D which would blow both of them out of the water. Lol. In all seriousness, if the hardware you’re playing on is serving you well then no need to change it. The PC community STRUGGLES with this notion and continues to complain why they’re in debt.
Yeah, had a similar experience from the exact same upgrade in flight sim titles.
@@Billyshatner88 psst... You're CPU bound. That's what being CPU bound looks like. If your framerate isn't hitting max and your GPU utilisation isn't at 100% that means you're CPU bound.
Yes CPU performance is all about raw data. Especially in multiplayer where you have to calculate 15-20 players like in a WoW raid that tanks the CPU hard while in a singleplayer game everything is predetermined the CPU doesn't have to do anything there.
I sent this to a friend today, because I cannot be bothered trying to explain this to them anymore. This discussion really needs to be settled -- once, and for all. But, I realize there will always be some degree of confusion or misinterpretations when it comes to information like this.
Mainly game at 4k besides esports titles and just upgraded from the 5600x to the 5800x3d and it's been a big uplift. Average is only 10% to 20% higher but the 1%low is so much higher its great. Even if GPU is maxed out it still helps a lot!
Did the same change and got the same results as you, those better 1% lows make such a difference. Really wanted a 7800x3d, but factor in the price of the new ram and MB and it was 3-4 times the cost of just dropping in the 5800x3d. Shall stick with this for a while.
@@ChoppyChofYeah, the 1% lows matter just as much. Most people think average fps is all that matters.
It's great to see a selection of videos explaining the connection between CPU/GPU and framerate at different resolutions come out in the past few days. It works to test CPUs with nothing but a 4090 for comparison and likewise GPUs with a 7800X3D for consistent benchmarks, it doesn't help people trying to match components in an upgrade or on a strict budget when the midrange CPU wastes GPU power and vice versa.
9:41 Here's the key point. I used to think CPU didn't matter at 4K too. Then I actually got a 4K monitor and started using DLSS on every single modern game. You just need to look at the CPU usage rising after activating DLSS to figure out that your CPU actually *does* matter. If you play at 4K native it still matters, just a lot less since you're extremely GPU bound.
I’m glad that this isn’t in response to my post on one of these videos! 😅
I did say that CPU doesn’t matter at 4K resolution but what I really meant was the difference between CPU’s of a particular segment. I was considering a 14900K at one point and there’s the 7800x3D, but I went with a 7950x3D.
The graphs for those can show different frame rates but as you mentioned it’s highly dependent on the area that you’re in in the game. Also, I doubt that I would actually be able to appreciate any kind of uplift that’s less than 5 or 10%.
@theinsfrijonds Why not upgrade to a CPU that has more than dual channel memory? When I made the switch I noticed the difference in frame pacing, but I was also able to truly multi task. It amazes me that streamers don't use workstations to game with. Yes there's a limit of what a cpu can do at 4K, but no one has put the Quad, hex, and Oct memory channel systems to the test in a public video. The cool thing is you can work and GAME at the same time with a workstation. You have enough pci-e lanes that you can run two VM at the same time and use more than one GPU.. Gaming is limited to single GPU since SLI and Xfire bit the dust. if you have another GPU you can use VT-d and pass it through a VM to run rendering., professional workloads, or a streaming for youtube/twitch/whatever box on the same system. The use of unlocked threadripper and unlocked Xeon is not well know by those that focus only on gaming.
@@michaelcarson8375You save a lot more money buying an X3D cheap instead of buying an overpriced platform with quad channel support. The extra cache is equivalent to a giant memory OC.
@@michaelcarson8375 Those processors were out of my budget. I mostly bought AMD because it's more power efficient. I do hate the fact that the memory controller in their processors is definitely worse than Intel (limited to 3000 mega transfers per channel.)
Also I'm not sure that there is a motherboard where I could access the second PCI Express slot with my 4090 graphics card. It's just that huge (four physical slots taken up in space but three on the back of the case.)
Good to know though that there are uses for quad channel and up. I can only imagine that channels haven't covered that due to the limited number of buyers plus the fact that it would be very expensive for them to review
@@michaelcarson8375 Because workstation cpu are focused more on doing a bunch of things well rather than having high clock speed which some games really need. You could potentially limit yourself on performance if you go for something like a threadripper or a xeon cpu. Also the price difference doesn't really make sense for a streamer when they could put that money into second pc that they use for capture and rendering, keeping a eye on chat, and so on. They cut out the overhead on streaming without sacrificing performance this way.
*edit* also using a vm while gaming could potentially get you banned if you play online if you can even launch the game at all due to anti cheat detection. A lot of cheaters used to use vms to run their cheats.
@@Shadowninja1200 Excuses excuses. There are unlocked CPUs from both Intel and AMD Those systems are the true high end desktops. Dual channel cpus no matter the number of cores are not high end desktops.
Really helpful for tuning a setup at 4k, you can initially max everything at 1080p to find out what the CPU FPS limit is and then go to 4k to target that frame rate, may as well crank up all the quality settings to max until you see the FPS dip below the CPU limit
Of course it matters. Better cpu can provide smoother frame times
Will also handle background tasks better, plus more and more games can now take advantage of 12 or more cores, why hamstring yourself with 8?
@@mickieg1994 Because budget?
@@mickieg1994 only reason why I would go with 8 is if I’m hellbent on a x800x3D chip, but otherwise yeah the more cores the better
@@paul2609 I disagree, if the budget is that tight that you can't afford the extra £100 or so to go from 6 to 8-12, then you should seriously reconsider whether you can afford a pc at all.
@@teddyholiday8038It depends, for example, the 5900X has 12 big cores and the 12700K is hybrid. The only reason why i choose the 12700K besides the better IPC is the lack of CPU overhead (Intel thread director cleans interruptions for the main threads for any program).
I wanted a X800X3D chip but they are 8 core only and they would feel like a slog when the scheduling isn't as good. Ryzen 9 5900X gets lower averages than these newer chips but 1% lows and frametimes are very, very smooth when the game is well optimized (like Cyberpunk 2077).
spider man remastered with ray tracing enabled is the perfect example of this. it's still very cpu influenced even at 4k.
@danielowentech breaks out the teacher side of him! Always great to hear your explanations! 👍🏻
I'd also bet people running 4k don't really need AA turned on to 4x 8x or msaa or txaa.. that's pretty taxing on gpu st 4k.. I bet that would show a uplift of some fps.. where a good cpu matters.
Whats Daniel is trying to say
If your cpu cant give u more then 60fps even with a high-end gpu its mean your cpu isn't powerful enough for it
Unless you are capping the framerate.
Lol.. title: "lets see how cpu fare in 4k"
Video: "heres how different cpu fares in 1080p"
Idk how you guys can take this video seriously.
Go open any cpu review videos on yt. You can see the results between cpu @4k are pretty negligible.
0:01 the face you are making while pointing to what looking like a red dog rocket covering a name is priceless.
+1000! Lets play games like Cities Skyline or Microsoft Flight Simulator at any resolution : we willl see that CPU does matter.
With a 7900xtx and 7800x3d i was gpu limited at ultrawide 1440p, i cant speak to city skylines, but at least for mfs2020 youll be gpu limited at 4k
Its kinda infurating that there is no comprehensive comparison of cpu performance on 4x/simulation games. Its always AAA stuff, id like to know how much a 7800x3d would improve my ticks per second on vic 3.
I bought my i7-12700K for PS3 emulation (besides for work), those games that you mentioned aren't as demanding.
Im in this scenario. Currently have a ryzen 3700x paired with RTX2080 and planning to upgrade to Rx7900XT. Im playing sim games at 4k, 60hz monitor. Wondering if the CPU bottleneck will be a problem and if a 5800X3D upgrade is a smart move. What do you guys think ?
@@BlueSatt A nice match i guess but that gear lifespan will be a bit shorter
1:37 why don't you use something like Presentmon to actually show the relationship between components during the frame cycles? Wouldn't that give a lot more detail and thorough explanation that could be more easily understood?
@@InnerFury1886 TLDR: I stopped as soon as I saw "upcoming release" which informed me that you are uninformed.
PresentMon is on version 2 and it's been available for download for a significant amount of time already...
A smaller content creator made a video recently where he suggested pairing a 4080 with a Ryzen 5 7600 for a 4K build. I commented and said that was a bad pairing, and a bunch of people all replied to my comment giving me so much crap and roasting me while saying you didn't need a stronger CPU for 4K because games will pretty much always be GPU bound at 4K. I then made the mistake of bringing it over to reddit, where I then got roasted by like a million people who all agreed that you don't need more than a 7600 for something like a 4080. Truly a "be right even if you're alone" moment for me. I mean seriously, if you're pairing a $1,000 GPU with a $200 CPU in 2024, you've done it wrong, even if you're playing at 4K.
this is so true. At this point you should be aiming for a 7800x3d and nothing less.
@@g.eberling8700Eh, I'd argue you can get away with a 5800X3D.
They are right, ryzen 5 7600 is very very good, it can handle 7900xtx in balanced games but not highly cpu demanding
@@pengu6335 You would. But at such a high price class I would go with AM5. There is a high possibility that you can upgrade the CPU in future without changing anything.
@@g.eberling8700 That's only assuming you already have an AM4 motherboard.
i built a compter this year out of other computeres, but ever since android 15 came out, now every time i boot, my windows freezes as soon the loading linux thing prints out. i bet i got hacked. i feel like that proabably a cpu bottlenet but i dont have 4k display so im kinda left with no recourse except more overvolting.. i got room though. its crazy how high that thing actually goes. also if i pull out my graphics card the whole computer just shuts off immediately. does that mean i didnt integrate graphics for the cpu correctly last time i kept flashing all the gpt patricias. intels just uunoptimized traSH. it always wont stop starting but now it wont even start stopping either
riiiiight. lol my 10900k holds my 3090 back in cp2077. An i9 12900k alleviates this. NPCs kill a cpu.
I barely can reach 300fps or slightly more in Unreal Tournament 2004 (f*ck Epic for killing Unreal for the sake of Fortnite) on my i7-12700K, the game runs in one thread and the bottleneck is so hard that anything above the Intel UHD 770 is pointless. On task manager, one thread is 100% at all times and the iGPU is 100% as well, the rest of the CPU does nothing.
Alder Lake has godlike IPC.
very informative and practical. I generally recommend to my friends that CPU doesn't have to be the best for their build, but I always ask them what their resolution is, their target "smoothness" (most people are not familiar with fps) and the games they wanna play.
So most of the time I would recommend a mid-range CPU like a ryzen 5 or i5 + a decent midrange GPU for 1080/1440p. Sometimes I'd even sell them to the upscaling tech available for each side as well as show them that some games have beautiful raytracing BUT I always make sure to let them know that it's not always a good idea to crank up the settings and make them consider optimizing settings.
People say stuff like this and it’s very dumb. Even if you are less CPU bound it still takes a lot of power for raytracing and other aspects. Overall it just doesn’t make sense to pair a i3 or Ryzen 5 with a 4080 and especially a 4090.
It does, look at the 4K results. Especially if you're on a budget, you tend to get more fps and eye candy out of a better GPU+worse CPU than a worse GPU+better CPU at 4K. Especially when it comes to upgrades! Shall I change to a new platform, or just stick a beefy GPU in, and wait another CPU gen to come out...
Nobody says you gonna get the most out of your Top Tier GPU paired with a mid tier CPU, but you gonna get a better result than pairing up a mid tier GPU + mid tier CPU. This is clearly visible at the 4K chart.
In some cases you will, in some cases you won't get the most out of your Beefy GPU, but you're better off than with a mid tier at 4K.
I'm only talking about 4K.
Ryzen 5 should be okay as long as it's the 7600. I mean it's not gonna give you 7800x3d levels of performance but it's not as bad as i3.
I have a R5 7600 and an xtx, I only get cpu limited in a handful of specific areas in only a couple of games, and I don't think i've seen gpu go under 90% in those areas anyways.
BS.
4090 doubled FPS over a 3080 using a 5600X at 4K.
It enables 120+fps with max settings in DCS.
Went to a 5800X3D, I can't notice a difference in game.
Also, the 4090 is very quiet compared to a 3080 Strix going balls out at 4K.
@@V4zz33 You're not going to play at native 2160p, because that's a waste of GPU power. And a RTX 4080 costs like 3 times as much as a Ryzen 7 7800X3D, so going for a strong (or even the strongest) CPU doesn't have as much of an impact on the budget compared to getting a stronger GPU.
This is a fair point when using dlss, you are taking the load off the gpu an scaling the resolution. What about with DLAA wouldn’t this lean more towards the GPU or is just as dependent on the CPU?
CPU does matter, but not nearly as much as people think. In 4K you'll need to spend a lot more on the GPU so it's perfectly fine to go with a 5600 if your only interest is casual gaming.
7600*
@@wertyuiopasd6281 Zen4 is still more expensive all around without providing too much of a boost. Budget builds are fine with Zen3 but if you really want to get the current gen then 7500f is the go-to if available.
Great video...I actually changed my opinion on this a while back. In CP77 with a 5950x and a 4090 with rt max, dlss off in 4k I was getting about 40 fps, when I upgraded to a 7800x3d it uped to about 48 fps so there was a 20% increase. I was actually pretty shocked.
This feels clickbaity. The statement "CPU matters less at 4k" is just as valid as it was before this video.
It’s not
@@od1sseas663 make an argument numb nuts..
@@od1sseas663 make an argument then numb nuts. It shows ignorance to think or say that higher resolution does not remove CPU bottleneck. It's irrefutable that it does.
I agree. He intentionally avoid putting an actual benchmark @4k cos the negligible difference wouldnt suit his narrative
@@Centrioless That’s the point. It’s not a cpu benchmark anymore when you’re testing 4K, it’s a GPU benchmark. 🤦🤦
I have a Ryzen 7 5700G paired with an RX 7800xt will my gpu be help back trying to play games at 120fps ?
At 1080p? Yes. Probably in 90% of the games you play this system is cpu bottlenecked. But 120 fps is super game specific, can't talk about that in general.
At 1440p, meh, probably not.
@@Chrissy717 so I’m just bottlenecked at 1080p ? I’ll be fine at 1440p and higher ?
Id be shocked too if 4080 buyers actually used their 4k-priced GPU to play in actual 4k..
I have a 4090 and I don’t. It’s a waste of electricity. Also, DLSS quality is extremely good.
@theinsfrijonds hey since you have money to just piss away, I'll take any extra you want to flush down the toilet. I'll send you my paypal.
so if i reach 100fps in native 4K resolution with best cpu and gpu, i can expect my fps to be lower with cheaper cpu
You're assuming that people that buy 1000$ GPU-s to play in 4K, are going to turn down resolution 2 years after they bought it instead of just selling the old one and getting a new GPU that can play in 4K.
Also, you can tweak some Settings on your GPU to achieve your desired FPS. Shadows/Reflections/Lighting/Effects ... all these impact performance. I would rather play with Medium Shadows than to upscale from a lower resolution.
Also. CPU demands also increase from year to year.
R5 5600 was able to play anything in 100FPS when it launched. Lo and behold, in 2024 it can only do 60FPS in some demanding poorly optimised titles.
A 3090 was able to hit close to 100 FPS in any game in 4K in 2020.
So. If I paired a 5600 with a 3090 to play in 4K in 2020, guess what? Both the CPU and GPU demands have increased. So the CPU can only pull 60 FPS, but also the GPU can only pull 60 FPS in 4K. If the time comes and the 5600 will only be able to pull 30 FPS (in a decently optimized game, not this Denuvobroken DragonsDogma2), the 3090 will also pull only 30 FPS in 4K.
Sure. You could turn down settings, but tell me. Who bought a 3090 to play in FHD? :))
If it can't handle it, you just sell the whole PC, or the components, and get something better.
If on the other hand, you would've spent more and got the 13900K with a 3090, you would'nt have used all of the CPU potential. That CPU can push 150FPS in pretty much any game, but the 3090 can't push 150FPS in any game in 4K. You basically bought a system that is designed for 2K or worse. You "future-proofed" your system. Meaning that in 2024 when your 3090 can't hit 60 FPS in 4K, and you decide to drop resolution to 2K, they finally match up. So basically ... you paid some extra money, for extra processing power, which you haven't used for 4 years. All this, to just be able to play in a lower resolution :))) Ridiculous ...
I own a Samsung Syncmaster 955DF pseudo 1440p CRT and i can confirm that the framerate wars are just a bunch load of BS.
Take any CRT and it blows any modern game screen for gaming, seriously, it's not even close. You get built-in anti-aliasing, free frame generation tech and upscaling (CRTs don't have a native spatial resolution and refresh rate, it's an infinite range following their specs).
Playing at 1856x1344 is just crazy for such an old monitor, it looks way, WAY sharper and better than a 1080p LCD i have right now.
@@saricubra2867 At some point CRT's got surpassed by OLED
You can put any 4K OLED besides a CRT and notice the differences
Linus did something like that
CRT is only better than the low cost 1080p LED displays
1080p displays got left behind a while back, nobody cares anymore about that resolution
DLSS literally looks like native. Better use upscaling than dropping settings. So you’re the one who’s ridiculous here.
Other than 4090 that is exactly whats happening, people need to use upscaling within a year they bought it on latest AAA games at 4k
So after certain level of power of CPU (cores and speed to handle info) at 4k does not matter yf you put a better CPU the performance won’t go up but if you put a lower CPU you will get impacted due to bottleneck?
If I changed my i5-12600k to a i7-14700k how many more frames would I get playing Baldurs Gate 3 at 4K ultra preset running a 4080 super?
Going from a 5500 to a 5700x3d has smoothed out my games so much at 3440x1440. In some games the average fps is pretty much the same but i get no more stuttering in games where i constantly had issues with it.
Fly at low altitude in Microsoft Flight Simulator or DCS with a i9-12900K and an RTX 4090 and you will see your frames sink below 30fps in 4K.
This also happens in Racing Sims when you have a full grid of cars.
I find it hard enough to maintain just 72fps in a Pimax Crystal VR headset in DCS and MSFS2020.
Ok bro cut the crap. There was no difference between these CPUs at 4k. You tried later to prove your point with games in 1080p. That's not 4k. All I saw is that Ryzen 5 5600 pulled 1fps less then ryzen 7 7800X3D. Nothing else matters for the subject. Ofc there is gonna be bigger difference if some idiot is using ryzen 1700 for example and pairs it with 4080. That's just dumb. But my point is for modern rigs you should save on the CPU and invest more money in GPU. Gamers have been doing that for ages now. Only competitive gamers need top notch CPU for those CS2 tournaments when they game on 165hz monitors and what not. Normal, average gamer that's focused on 4k and has 60hz or 75hz screen should invest more in GPU for sure.
Would you mind covering the impact that core isolation has on FPS? 1080 vs 1440p vs 4k? Thank!
this is exactly my case now, i have r5 3600 and rx 6800 combo, i play at 4k 60fps with fsr on quality most games, in very crowded places in many games like cities or forests i get cpu bottleneck, my gpu can run it but cpu cant hold on since i play at 1440p cause of fsr usage even at 4k, and its kinda sad, cause gpu cant handle native 4k in many games so its 40 50 fps, and with fsr cpu is too slow again getting 50 fps still below my playable 60fps. so im stuck now, need to upgrade to 5800x3d probably, but my psu cant handle that and i need to upgrade more and more parts to make it work ;\ so im balancing settings to achieve 60 fps at 4k + fsr
R5 5600 does not bottleneck 6800 on 1440p a very lil bit at 1080p
@@mrshinobigaming8447 they said r5 3600
@@mrshinobigaming8447 i have 3600 not 5600
Thank you for this explanation- I am one of many who struggle with settings and tweaks - more education is needed on topics like this.
do you run ultra or medium settings? do you use dlss/fsr? many factors for 4k gaming. if you run native 4k at ultra settings of the mid range cpu of the last few gens will perform about the same. but who the fuck has a 4090 and a 5700x
Question. 1080p native vs 4k dlss ultra performace has the same effect on cpu/gpu bottleneck ?
Awesome content as always @Daniel Owen. I build alot of PCs and haven't considered this perspective, so thank you for bringing it to my attention!!
So, 5600 + 4080 show trivial difference from 78x3d + 4080 at 4k, what you want to prove?
He showed you but you did not understand it. AVG fps doesnt matter its the 0.1 and 1% lows that create huge stutter on high end systems. Hes 100% right, i had the problem myself in 4K with a 7900 XTX and 5950X which did not come along with the gpu.
Nice video.. again. :) I can absolutely recommend the videos from Gamers Nexus regarding "GPU Busy", "Animation Error", "input lag" (Latency pipeline), "Optimization" and how GPU drivers work.. discussed with an engineers from Intel and Nvidia.
Is a 12900k scarifying much with a 4090 if I’m typically gaming at 4k ultra with DLSS Quality?
Driver overhead is an issue with Nvidia. In the 7600 vs 5600 video, the 4090 was barely ahead of a 6950xt on the 7600
I'm an i7-12700K owner, what is driver overhead? (a joke).
I recently updated from Ryzen 5 5600x to Ryzen 7 7800XD paired with 3080 because I also play 4K with dlss thinking it will improve the frame rates a lot. But actually it didn’t change anything and it feels like a waste of money.
The point in (my own) short words: for 4K gaming CPU performance is often not important, just because the framerates are lower, just because the gpu is maxed out. that's all. when the framerates raise, the cpu has to work again a lot more.
to be futureproof, an unlimited fast CPU and GPU is needed %-). Nobody really knows, which new game will "waste" more power on the gpu or cpu. Too much just does not exist, but saving the money for a top level cpu but inserting a 4090 is just wrong.
Especially with UE5engine using most of the time Raytracing on the CPU we all can find us in a state, where a faster cpu could be easily be needed. That all said with the need for intense AI npcs, just because ... why not?
So, TLDR: if you plan on optimizing your games for HIGH FPS, your cpu is more important.
You open the game and put everything on max like 4k/ultra but then you realize you only get 40-50fps or you might get 60-70 but want 90fps+ etc. Then what you do is turn down graphics settings and/or use upscaling (duh!), and the more you do that, the closer you get to your CPU limit (=max fps your cpu can deliver before the gpu starts rendering). And once you hit it, you can turn down your settings as much as you want and upscale from 240p you won't get more fps. It has to be said though, if you have a semi modern CPU and want to play with "moderate" fps ~60-90fps you really have to look hard for games where this becomes an issue.
This is why I change GPU once every 2.5 years but CPU every 5-6 years. My monitor and the game I'm playing are actually the key reasons that cause me to upgrade my PC components.
You have to find out what your CPU/GPU are actually doing before you upgrade. There are plenty of tools like MSI Afterburner, AMD Adrenaline Software monitoring, Intel PresentMon, Task Manager, etc.
As someone with an 8700k and 4080 Super combo , CPU performance does absolutely matter. I'm even CPU bound in Time Spy of all things 😂 . Bit imbalanced but my old 2080ti GPU died recently , so put this GPU into my old build. Planning to upgrade to Zen 5 though soon enough ;)
Will Dragons Dogma 2 play better on a Ryzen 9 5900X or a Ryzen 7 5700X3D ?
Basically, both your CPU and GPU need to have the computing power to output at your chosen resolution and framerate target. With the added variable that those computing needs will vary from game to game. I honestly get the confusion.
I play at 4k, my old system was a i7 6700k (4 cores/8 threads) paired with a RTX3080 and I upgraded to a i7 12700 (12 cores/20 threads) and the gain in frames is minimal... I'm talking like 1-3%
So my question is this: is 5700x3d better when upscaled from 1080p up Vs newer non-3D chips such as 7600x ?
How does a 5600X and 7900GRE sound in terms of bottleneck?
Max 20% bottleneck at 1440p
Max 35% bottleneck at 1080p
5600x can provide 60+fps at every game with 1080p
What the comment above said. Also depends on games you play. If you play fps games at lowered settings, the amount of bottleneck will be worse.
@@mrshinobigaming8447hey if you’re using a bottleneck calculator I would advise against it they’re usually really shitty
Play a round with a few and test different CPU’s and hopefully you’ll get what I mean
@@deucedeucerims im not using bottleneck calculator i have watch alot of reviews max gpu for r5 5600x is 7700xt at 1080p anything more powerful will cuz bottleneck
Another great video my man. You remain one of the best TH-camrs in this space.
i noticed a massive difference when i went from a 5700x to 5700x3d it felt Way more smoother and i got absolutely no stutters now at 4k
would love to see someone do a video of CPU comparisons but at different DLSS/FSR presets to see the performance difference between internal resolutions upscaled by the GPU VS internal resolutions matched to displayed resolution. Most might think it's 1:1 but the extra processing done by the GPU may affect the overall performance in-game.
i have 4070 and running forbidden west on 107 fps with dlss on. there is no stuttering or any other problems. when it comes out i will buy 5070 and I just cant tell the difference in all this performance when naked eye can hardly see 240 fps or refresh rate on your monitor.
Does cpu impact dlss/fsr and frame generation speeds?
How do you know i have an RTX 3080 with a ryzen 7 1700x in my system?😅
It all depends on the game and if you are cpu bound or gpu bound. In some games i play at 3200x1800 even tho i have a 4k monitor, i'm still gpu bound in the games i play like that tho. Got a 7700 none x :D And i feel my fps and latency is a little bit better when gpu bound than when i'm cpu bound, but that might be my monitor's scaling that makes that difference, idk tbh.
I have a 11 year old laptop with a pseudo 4:3 1440p CRT monitor, what is latency?
Thank you for addressing this! I would see so many comments just like the ones you were talking about. Hard to argue with misinformation
Not to mention that the CPU has a major impact on system snappiness and load times even if you're 100% GPU bound while gaming.
I have a TitanXp (1080 TI equivalent) paired with an R7 5800X and I'm itching for a CPU upgrade even though my GPU is 3 years older than my CPU.
Additionally, the CPU can get interrupted by background tasks or if the game has to load in assets and that can come through as stuttering even in GPU bound scenarios.
I have upgraded from FHD 60Hz to 4k 120Hz screen and havent touched my R5 5600 + RX6700XT combo. In 7+ years old titles performance is OK at 4k in newer ones not so much. What bothers me the most is upgrading just my screen created some ugly lags for every day use of Win11 aka complete opposite of snappiness. Even though it is not gaming related it is enough annoying for me to ask if anyone else had similar experience and what kind of solution can fix the lags.
the point is its not worth spending a lot of money to upgrade to the next gen if your running 4k e.g a 5900x to 7800 x3d... If you have a few stutters drop the settings
That's the thing tho, can't really drop settings for CPU limits. The only thing affecting it is (maybe) level of detail and, if the game has it, any other general setting that reduces spawn rate of NPCs and other objects that have collision and stuff.
If your CPU can't hit 60 fps it just won't hit 60 and you either need to live with it or you need to upgrade. Even turning on dlss/fsr won't help (apart from frame gen).
could say, it doesn't scale linearly. some things will be the same load on cpu regardless of resolution.
So I have a core i5 7th Gen with gtx 1060 and the only game I play is dota 2. Now I am seeing my CPU is the bottleneck when my gpu is at 40 50 percent utilization even on 4K.
If you were gpu limited in an 11 year old game that would be a problem
dota can run on a calculator from the 1950s im not suprised lol
Or to put it another way: when do you start to be GPU limited? One of the best tools for visualizing this is the Forza Horizon 5 benchmark because it shows what percentage of the time it’s limited by one vs. the other and how many FPS the CPU is managing in a given run vs. the GPU.
very will put out and a new way to look at CPU - GPU requirements
Ok... Im using a 4080 with a ryzen 9 5900x and im gaming at 4k with dlss quality... correct me if im wrong here, but as long im hitting 99% gpu usage im fine correct? the only thing i should looking for is the gpu usage right? if it hits 99% im good right?
Correct, you are rendering internally at 1440p, again this is always game dependent
i think its partially due to not benchmark CPUS at 4k. so as a user the min CPU i need to run games at 4k is such a difficult question to answer
Good job Daniel! I brute forced Dogma so no issues. I recorded it on my channel with 4 pawns in the city.