I will be honest. I am kind of stupid and need this in short form, explained to me three times. I mean, I hope the YouTube Short is next. I will really understand it then. 😅
Amazing video, thank u very much. I loved that u made a distinction between 60 fps 4K and high-fps 4K, and that in that situation the CPU is very important. Waiting for the 9950X3D.
Thanks Steve! Simply put, you need both CPU and GPU benchmarks to decide what would be the best combo for your use case. CPU-bound benchmarks tell you how many FPS a certain CPU can push at best in some games, whereas GPU-bound benchmarks tell you how many FPS a GPU can push in the same games. Based on that, you can work out, for any CPU/GPU combo, whether the GPU will be the bottleneck.
And typical-use-case, higher-resolution CPU testing results are an important part of a proper CPU review, as they answer the question prospective buyers have: is the CPU worth getting if they game at those typical, higher resolutions? When the only component changed between tests is the CPU, then it's the difference the CPU makes in that test environment that's being measured, even when GPU-limiting is happening.
Great job, Balin! I love the addition of the 4K native line here. It provides an important piece of data that was missing when the assumption was that next to nobody plays at 4K native with all the settings except ray tracing cranked up. It helps make the point, a lot. This from someone who fully understands why CPU testing is done the way it is.
It's not just today's '4K native' -- what also matters is 1440p, next-gen game engine fps, and whether you've got the headroom to run that if you upgrade just the GPU.
@@andytroo True. For me, a top tier GPU that can run games at 4K native at maxed settings should be paired with a top-tier CPU, even if that CPU's oomph won't always be needed to play at 4K today. I might also want to run an easier title at 1080p or do some CPU-heavy productivity tasks.
It's not native though, it's 2227 x 1253. All the other review sites acknowledge this and show very little benefit at higher resolutions in most games. If you consider productivity then it shifts to Intel. Check out The "Perfect" Gaming CPU isn't what they say it is... -- 9800x3D META-ANALYSIS! from Vex
@@unspoken7704 When you say, "It's not native," to what exactly are you referring? I think you mean that DLSS balanced 4K is actually 2227 x 1253, and that is true. But I'm not talking about the lower set of blue bar graphs. I'm talking about the superimposed vertical red line labeled "4K NATIVE." Seems you have missed that entirely.
@@rangersmith4652 Correct, I mean the 4K Balanced performance metric. The results should also be in the graph, but I understand they are using older material to superimpose on top of. I'm highlighting the fact that the DLSS results are a bit wasteful, and building the 4K native results into the graphing system would be more valuable. Capisce?
What everybody else must be thinking: why aren't you part of the presentation team? You're really great!! And this summary format is a really good idea and should happen again. Congratulations on a nicely thought-through event. Looking forward to your next one! And please, guys, can we have him/you do the next one!
WE NEED MORE BALIN - His cool and collected manner is the perfect contrast to Steve being pissed and Tim being adorable. What a trio, we are truly blessed!
CPU-limited testing is essentially like testing with the GPUs of the future that are not here yet. It shows arguably the most important thing - how futureproof the piece of tech you're buying is.
It's honestly just comparing the maximal capabilities of the CPU. This inherently does take into account future GPU performance, if CPU performance is being maxed out.
@@robertt9342 Well said. If there are suddenly some "synthetic" conditions under which one CPU performs even better than what has been found in "real" tests, it should be clear that it has not been pushed to its limits before. And having some headroom is kinda synonymous with being future-proof.
I've been playing at 4K 60 since 2018, and the CPU has never been my problem: the 8400f in 2018 ran 4K 60 with a GTX 1080 Ti, the 9900K in 2019 with a GTX 1080 Ti, the 9900K and 2080 Ti in 2021, and in 2024 a 5700X3D with an RTX 4070 Super. I've never needed to invest heavily in the CPU; all my money was always focused on the GPU because the focus is 4K 60. And I can play competitive games at 2K 144 and 4K 144 smoothly on the Ryzen 5700X3D - the difference is that everything goes on low settings, which helps with maximum vision in competitive mode.
CPU bound - 1080p: pushes CPU to the max, GPU is waiting on the CPU. GPU bound - 4K: pushes GPU to the max, CPU is waiting on the GPU. To date, there are no GPUs capable of forcing a 4K CPU-bound scenario for a modern/premium CPU.
Basically, a CPU that gets the most frames at 1080p will have much longer longevity - a lot more headroom to handle future games at 1080p or 4K. Sure, that 285K probably is within 5-10% of the 9800X3D at 4K, but as time goes by, the 9800X3D will remain competitive while the 285K will fall off sooner.
Yep, as years go by you have to reduce game settings to maintain FPS, but with the CPU that does not improve much - you can reduce draw distance/calls and simulation/shadow quality, but that's about it, so the faster CPU wins out. For the GPU you can reduce/disable tons of settings and win back some FPS, and it's easier to swap in a new GPU than a new CPU/board/RAM.
@@mikfhan Yup. And in most games those settings still won't help you all that much with CPU performance. It's unusual to see a game where settings can change CPU performance by more than 20%, but in most games it's pretty easy to get 100%+ performance scaling by changing settings and resolution.
In Tarkov the 9800X3D is 20% faster than the 7800X3D at native 4K with a 4090, with 30% better lows. And the 7800X3D is a lot better than any Intel. The 9800X3D is the first CPU not dipping under 60fps in the worst scenario (MP, heaviest map and so on). The point is, if you know your particular game is drastically CPU heavy, you can make good guesses based on low resolution testing in other games, even if they are GPU heavy. There are some exceptions, but mostly the information is out there if you dig for it. High resolution CPU testing at best serves only people who play the exact same games used in benchmarks.
Yes, and also not everyone prefers the same frames per second. I see so many comments saying "This CPU is overkill for that GPU and resolution!!!" No it's not, I prefer to game at high fps; your "good enough to get 60 fps" CPU is not good enough for me.
Another point with CPU usage: I have seen a comment stating, "I am told I am CPU bound, but my 4 core CPU is only using 25% in the game I play." In that case the reason is that the game is only using a single core (at 100%) while the other 3 are idle, so the total CPU usage is 25% but the single core is maxed out and holding the system back.
You can also get weird numbers above 30% if one core is boosting more than the rest, or if there is some limited multi-threading but the main load remains single-threaded.
@@omnisemantic1 no, I'm talking about cpu limited situations where none of the other components are the issue. It's often because of the way some games (and engines) are coded for multi-threading. *Edit* oh, I might've misunderstood you, maybe you weren't exactly disagreeing with me? English isn't my first language, so sometimes I miss some nuance.
As someone in the comments said, some of the viewers of review videos are new to the PC gaming hobby. I have been watching review videos for almost 2 decades; the idea of benchmarking CPUs at lower resolutions has always instinctively made sense to me. But someone that just got into PC building or upgrading in the last few years would not be able to understand it until it's explained. That said, this video really illustrates it well, good job Balin!
Some people just want data to back up their claims, which, after 3 extra videos, they now have. The irony is that they made all these videos to show people are wrong, but in the process made the videos that people wanted to see, so they could see that different games will have different improvements or no improvement at all. Which is the entire point of why they asked, because HUB typically will test 20+ games to find some kind of average. The bottom line is that HUB can say "it's not meant as a buying guide" when in reality everything they make is being used as a buying guide, whether they like it or not.
I watched the long version, and even though I already knew the GPU bottleneck needed to be removed to test accurately, I still learned new info. It was very thorough.
Steve's Elixir of Rejuvenation works BOSS!! Unfortunately the audio I heard was dropping too low in parts to hear all the words, might just be YT glitching on me. The visual presentation and structure was excellent.
I never understood the problem people have with it. When your GPU sounds like a vacuum due to being under 99% load every time you open a game, a better CPU will do nothing for you. Upgrade your GPU until it's the other way around.
I think that people don't generally have a problem with understanding why 1080p is done - to measure a CPU's potential. HU just mischaracterizes the argument as being either low-res or high-res testing, instead of acknowledging that they both play an important part in informing people about a CPU's potential, both unbound, and in typical use environments.
@@Thunderhawk51 You saw it when the 4090 was launched, is the thing. CPU reviews for CPUs. GPU reviews for GPUs. Upscaling tech reviews primarily for GPUs, CPU working as support. Not that hard.
@@Thunderhawk51 People come to a CPU review expecting to get a whole "what PC should I build?' guide. That's not how this works. A CPU review is a comparison, under ideal circumstances, of one CPU against another. Doing such a review with a mid range GPU, at 4k, in ultra settings or whatever inane shit some people want to see added to these reviews would make it worthless as a CPU review since you'd be reviewing the performance of the whole system. And considering there are hundreds of thousands of different possible systems, multiplied by however many games and settings, you're obviously not going to get full system reviews, that's insane. So instead you get reviews telling you how each component will perform under ideal circumstances. It's then up to you to figure out what component will work best with another.
What all these people want to hear is: "Intel is not much slower than AMD". But the hurtful truth they find is: buy a 9800X3D if you want a high refresh rate experience, or a 5700X3D if you don't want to spend much.
For pure gaming, sure. However, the main argument of the Intel fanboys (just to make sure - I'm typing this on a Ryzen laptop) was that if people are willing to spend 500 USD on a CPU, it is very unlikely that they will use it on a low-res monitor, and at real 4K the difference in gaming is mostly within the margin of error. They should have shown this in the main video, and they should've included 1440p and quality upscaling (where of course the difference would be significantly smaller vs the 7700X and 285K), but instead they showed 1080p and also "pretty much 1080p", which made the video look like an ad for the 9800X3D.
@@Aquaquake 1440p DLSS/FSR Quality mode upscales from 960p, and even with the upscaling overhead it still typically performs better than 1080p native, so the FPS results would be the same or greater than at 1080p native. 4K Balanced upscales from around 1260p, and with the hit of upscaling will be quite a bit slower than 1080p native. If you're happy with 1440p DLSS Quality, the 1080p native results are highly relevant.
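(Editor's aside: a minimal sketch of where the internal resolutions in that comment come from, assuming the commonly cited per-axis DLSS scale factors of roughly 0.667 for Quality, 0.58 for Balanced and 0.5 for Performance; the exact ratios can vary by game and upscaler version.)

```python
# Approximate internal render resolutions for common DLSS modes.
# Scale factors below are assumed per-axis ratios, not official constants.
SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_res(width: int, height: int, mode: str) -> tuple[int, int]:
    """Return the approximate internal render resolution for an output resolution + mode."""
    s = SCALE[mode]
    return round(width * s), round(height * s)

print(internal_res(2560, 1440, "Quality"))   # (1707, 960)  -> 1440p Quality renders at ~960p
print(internal_res(3840, 2160, "Balanced"))  # (2227, 1253) -> 4K Balanced renders at ~1250p
```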
4k native data helped me a lot. Upgrading from a 7 year old CPU, if all the current CPUs give the same gpu limited fps in the games I play then any of them will meet my current needs. The 1080p data shows me which one will probably serve me the longest.
To make the data more accurate, you should take the results of a 4K GPU benchmark of your specific GPU. From there, any CPU that hits that frame rate will meet your needs. No 4K CPU benchmark needed, and you will get better results.
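(Editor's aside: a minimal sketch of that cross-referencing approach; all numbers are made-up placeholders standing in for real GPU-bound and CPU-bound review data.)

```python
# What your GPU manages in a given game at 4K when paired with a top CPU (GPU-bound review).
gpu_fps_at_4k = 95

# What each CPU manages in the same game when the GPU is not the limit (CPU-bound review).
cpu_bound_fps = {"CPU A": 220, "CPU B": 160, "CPU C": 90}

# Any CPU whose CPU-bound ceiling meets or exceeds the GPU's 4K frame rate
# will not be the bottleneck for this GPU at 4K.
adequate = [cpu for cpu, fps in cpu_bound_fps.items() if fps >= gpu_fps_at_4k]
print(adequate)  # ['CPU A', 'CPU B'] -- CPU C would hold this GPU back
```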
@@ЯремаМихайлюк-ш1у I'm not the best explainer and I'm not a native English speaker, but I'd like to try. Which part did you not understand? Do you not understand what this video is about or why Steve thinks this way?
@ Well, for starters, why did they show that all CPUs will have the same fps count at 4K native with the same GPU? I mean, I don't understand how that fps count doesn't change across the different CPUs. It should add some advantage even with the same GPU, no?
@@ЯремаМихайлюк-ш1у Well, there are cases where performance differs in these setups, but most of the time it is the same across CPUs. Experts would tell you that this is because in most of these setups the bottleneck is the GPU. To understand this, you first need to know that the GPU and the CPU can work at the same time on different frames: the CPU prepares draw calls and other game logic for future frames, while the GPU handles rendering tasks for the current frame. However, each frame still has to go through both parts. If your CPU is capable of 200 fps and your GPU is only capable of 100 fps, your effective frame rate will be limited to 100 fps. Therefore, a CPU capable of even 500 fps will have no advantage in this scenario, because either CPU has already finished its work for the next frame while the GPU is still busy delivering its 100 fps. Having a better CPU in this scenario will only make the CPU wait longer for the GPU. Most of the time you will have a GPU bottleneck at 4K, because increasing the resolution only puts more work on the GPU, while the CPU workload remains the same.
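(Editor's aside: the relationship described above can be boiled down to a one-line model; this is a simplification that ignores frame pacing and partial overlap, but it captures why the slower side sets the delivered frame rate.)

```python
# Simplified model: CPU and GPU work on different frames in parallel, so the
# delivered frame rate is set by whichever side has the lower fps ceiling.
def effective_fps(cpu_fps_ceiling: float, gpu_fps_ceiling: float) -> float:
    return min(cpu_fps_ceiling, gpu_fps_ceiling)

print(effective_fps(200, 100))  # 100 -> GPU-bound: the CPU waits on the GPU
print(effective_fps(500, 100))  # 100 -> a faster CPU changes nothing here
print(effective_fps(120, 300))  # 120 -> CPU-bound: now the CPU ceiling is what you feel
```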
I just got my 9800X3D, coming from a 14900K, and playing WoW cranked to max at 4K yielded me around 150-200 fps on the 14900K; with the 9800X3D it's 250-300 fps. So yeah, processors DO matter even at 4K, depending on the games that take advantage of CPU / caching etc.
@@UMADl3RO5 Dear god, you *still* don't understand. If you benchmark WoW at 1080p, it'd likely show the differences between the CPUs even more starkly, because at that point the CPU is absolutely the bottleneck, not the GPU. That WoW is seemingly not *completely* GPU bottlenecked at 4K is not the point - that the testing needs to have the CPU as the bottleneck to get valid results is. A test of WoW at 1080p would probably show a similar, if not greater uplift, so testing at 1080p is still the correct methodology for the vast majority of games. Please, this really isn't rocket surgery.
@@Beany2007FTW Lmao, I understand that. I just feel like it's lazy to not include the resolutions people are playing at. It's easier to see the graphs and compare, rather than predicting what the results might be. I understand you are just concerned with seeing the speed difference and you cannot do that at 4K or 1440p. I just want to see the graphs with all the resolutions regardless. If that's asking too much, then I guess I'm asking too much. But buddy, I understand why they test at 1080p; why is it hard to include other resolutions? Will it affect the 1080p results to include 1440p or 4K? No. If you have a problem with that, I don't know what to tell you.
I can't believe the tenacity you guys have for educating the "unwashed" masses. I'm an old gamer; my first 3D accelerator was a Hercules Riva TNT. This was before the term GPU, because back then the CPU did most of the heavy lifting. It wasn't until the GeForce era that we really started to discuss bottlenecks, as the GeForce started to take the burden off the CPU. Hardware T&L became a thing, along with other technologies. At that time very few people used LCDs (they were garbage), it was all CRT, and in reality it was awesome. There were no "native" resolutions, just a maximum resolution, so no interpolation. We got higher refresh rates the lower the resolution we used. I'm biased, but this was the golden age of PC gaming.

When we started discussing CPU bottlenecks, it became obvious that CPU reviews showing gaming benchmarks needed to focus on lower resolutions. So if most people were gaming at 1024 x 768, then benchmarking was done at 640 x 480, maybe 800 x 600. Generally speaking, reviewers and gamers alike knew that to highlight the differences between CPUs, you needed to take the burden off the GPU. There were some dissenters though, but it surprises me that they still exist after this long.

I've always thought of the CPU as setting the ceiling of performance. It doesn't matter what GPU you have, X CPU will only ever be able to put out Y FPS. So once you know the absolute maximum that CPU is capable of delivering, you should expect that using a higher resolution and detail settings will result in lower FPS. It is not useful data to run your benchmarks at 4K where the GPU is fully utilized (90% - 100%) if the purpose is to see how different CPUs scale. You'd be lucky if you saw much difference in the 0.1% and 1% lows in that scenario.

I'd love to know what GPU these people are gaming on. If they are gaming with a 4060 Ti or less, then "b_tch you GPU bound at 1080p anyway, what do you care about 4K?". @Hardware Unboxed: if you really want to trigger them, start benchmarking at 720p. 🤣
CPU benchmarks could focus on lower resolutions and nobody would complain. HU have repeatedly misrepresented the argument they're trying to shut down, because if they didn't, then they'd inescapably end up giving legitimacy to it, because there is genuine legitimacy to it. Most people taking issue with HU's lack of higher-resolution data fully understand why CPU-unbound testing reflects the CPU's maximum potential vis-a-vis other CPUs.

What people have complained about are CPU reviews that completely omit any mention of what difference the CPU makes at typical-use resolutions, as the review then fails to inform prospective buyers of whether they should upgrade or not. And a primary reason, if not the reason, people are watching a CPU review is to know whether they should upgrade or not; they're watching to inform themselves on what the CPU means to them. When a CPU review omits crucial information about what the CPU means to its target audience, then it's falling short as a CPU review. And when a CPU review hypes up performance spreads that almost nobody will experience, due to them existing only at resolutions lower than what 95%+ of people buying the top-end hardware will actually game at, then the review is misleading when it hypes up that performance difference without communicating the caveat that it won't be experienced at that hardware's typical-use resolutions by 90%+ of its purchasers, either today or even in the next several years and after multiple GPU upgrades.

The only people who will see 1080p-style FPS spreads from their 9800X3D within the next several years will be those purchasing a GPU with RTX 5090 levels of performance or more. And two years after the RTX 4090's release, those who've bought one represent less than 1% of Steam's userbase. Expect the number buying an RTX 5090 to be even less, considering its significant price increase.
Man, I am very new here. Sir, can you explain pls why lower resolution matters in CPU benchmarks? (I read every word and sentence people use here, but my knowledge is limited, so it's not understood by me most of the time.) Thank you
@@ЯремаМихайлюк-ш1у If your GPU isn't powerful enough, it will limit the performance of the CPU in gaming. And if your CPU isn't powerful enough, it will limit the performance of your GPU in gaming. The higher the resolution you run a game at, the more work your GPU has to do to render the image, whereas the CPU generally has the same workload regardless of which resolution you run a game at. So, to see the full potential of what a CPU can do when it isn't being limited by your GPU, you need to test it at a low enough resolution that the GPU isn't limiting the performance of the CPU. That's why low resolution benchmarks are used to compare the raw performance of CPUs.

That's good for establishing the relative performance of just the CPUs, but it serves as synthetic and merely academic data if the tested resolution isn't the resolution you're actually going to use the CPU at. That's why it's also important to test CPUs at the resolutions people will normally be using them at. And for the highest-end gaming CPU there is, the normal resolutions it's used for will be 1440p and 4K. Since currently-available GPUs aren't powerful enough to avoid imposing performance restrictions on a high-end CPU at those higher resolutions, the performance differences seen between CPUs at 1080p will gradually disappear the higher the resolution goes.

For people thinking about buying a 9800X3D, seeing where the performance starts to disappear at higher resolutions, and how much of it disappears, is important information, because it doesn't make sense to spend hundreds of dollars on a CPU upgrade for gaming if it won't make any difference to the gaming performance you already have. So it's very relevant for a review of the CPU to communicate that, so that people aren't upsold on a product that will basically take money out of their pocket and give it to AMD in exchange for nothing of value to them.

HU have argued that a CPU review is only about the academic (low resolution) test data, and not the practical (higher, normal resolution) test data. While they're correct about the reason for doing testing at 1080p (to see the comparative, raw potential of CPUs), they aren't correct about anything else they've claimed on this topic, and have made many strawman, gaslighting, and outright silly arguments in defence of their position. Most of HU's arguments in defence of their position are flawed, feature selective acknowledgement, or completely misrepresent what they're addressing, and so don't stand up to scrutiny. But they've doubled down on those arguments for whatever reason.
@ BIG thanks man. I just wanna be more knowledgeable in this sphere, and without a PC it's pretty funny, because my theory gets mixed up very often if I don't watch any news on the topic. By the way, is there a point in buying X3D CPUs if someone plays only at 4K? I mean, a normal 9950X will do almost the same job in games as I understand it, and it will be better for workstation use, no? (That seems right to me, but I somehow found some comparison benchmarks and sometimes - and that's why I don't believe it - some youtubers show a 10-20% increase in some games at 4K with X3D. Is that even possible??)
@@ЯремаМихайлюк-ш1у No prob. I think buying an X3D CPU is very worth it for high-res gaming and other things - though in 4K gaming there likely won't be much, or any, difference between a bunch of high-end CPUs unless you have at least an RTX 4090 level of GPU power. I'm pretty sure I know which video you're referring to when you say it shows 10-20% more FPS with the 9800X3D at 4K, and it's a fake benchmark video. It was uploaded before the 9800X3D released, and it shows the 9800X3D running hotter than the 7800X3D in its tests, which is the opposite of reality. The figures in it are all just made up. I and others commented on that video pointing out that it's fake, but I think the uploader hid a bunch of our comments.

Even aside from gaming, the X3D CPUs run at much lower temperatures than their non-X3D counterparts, and use much less power (at least the 7800X3D does). I had a 7700X before getting my 7800X3D. As I like a quiet, cool, and powerful PC all at once, the X3D suits me much better than the non-X3D CPUs. The 9800X3D has significantly better productivity performance than the 7800X3D, so that benefit will exist outside of gaming. Even though I already have a 7800X3D and game at 1440p, I'd probably buy a 9800X3D for its increased productivity performance and cooler temps if it weren't for it using significantly more power than the 7800X3D. The 9800X3D still runs around 7C cooler than the 7800X3D while under load, which is great.
Very Nice video HUB team!!! Steve, thanks for letting Balin have a shot at explaining this subject!!! Maybe it was the visuals, but this did help me get a better understanding of the CPU testing process.
@@Peter.H.A.Petersen Even if you have a lower-end GPU you can still change settings to get there. But a lot of CPUs cannot deliver high FPS. We are in 2024; under 90 FPS is very, very bad. What if you want 144? 180?
This video made me stop for a moment to look at what kind of Steve we got today... :-D Still a nice one, it does a great job at explaining why testing at 1080p matters instead of 4K. And yeah, while I definitely would love some kind of upgrade-guide type of video, there are just way too many factors with CPU performance, from the graphics card used, to settings, to the specific games you play and FPS targets. I think way too often people are thinking about how to get max FPS without considering where the good-enough point is... it really can save you some money. And not everyone is equally sensitive to FPS. Hence why I am still happily gaming with an i5 12400, since I am not really that badly sensitive to FPS above 60. Not to mention I also got it quite cheap.
Yes thank you for distilling it down to what the point of it is. I know what Steve was trying to say in the last one but I think he got a little too into the weeds and needed to simplify it to a tweet level length. CPUs hardcap your fps to whatever their maximum is, so finding that maximum with lower resolution testing should be the goal. However, GPUs can be dynamically adjusted to output more or less fps, so finding the relative scaling between resolutions, RT on/off, etc. is actually useful data.
The main thing I take issue with is them saying "A CPU review is not a buying guide". I mean I kinda find new hardware interesting, but I think the reason people get so bent-out-of-shape over the "4k" performance is that they want a CPU review to answer the fundamental question "what is the cheapest CPU I can buy in a GPU limited scenario"? Amazon will eventually ban my account if I order and return all the CPUs to figure out which one is best for my needs -- that's why as a consumer I rely on these channels. So spelling out where the GPU limited line is actually serves a valuable purpose because it lets me know if I need to upgrade or not.
That's true. HU are in the wrong in lots of their comments on this topic, and that's definitely one of the comments they're wrong about. A CPU review is definitely a buying guide. A primary reason, if not the reason, people are watching a CPU review is to know whether they should upgrade or not; they're watching to inform themselves on what the CPU means to them. When a CPU review omits crucial information about what the CPU means to its target audience, then it's falling short as a CPU review.
HUB is basically digging a hole trying to defend themselves when all they needed to say is "some games aren't worth testing at 4K, like sim games" or "sometimes we don't have time to test 4K for new CPUs". Or they could have said "the 1080p results here show the potential in the future when you are CPU limited, while this specific game shows that if you play at 4K you won't get any benefit from an upgrade today". And that's actually useful and important. Instead they keep insisting they don't need to do any testing... and that's up to them, to test what they want. People are asking for stuff to be tested because they are wondering if games in general will benefit right away.

HUB always does this whenever people point out that they are being narrow-minded. Just say you don't want to test it and be done with it, rather than "everyone is wrong, we are right". It's not a black and white ask. Nobody would be testing 1440p/4K if that were the case. But a ton of youtubers out there do test new CPUs at 1440p/4K. Is it really that hard to simply show a few examples of 4K not scaling and 1080p scaling? Wasn't that the entire point of why some youtubers tested it? Isn't the entire point of this channel to back up claims with testing? After 3 extra videos, you've proved that point. If you had done that right away at the start, nobody would be complaining... So in the end they were bullied into doing something people wanted and they didn't want to do, just so they can be like "see, we were right". Talk about groundhog day.
@@be0wulfmarshallz Definitely. That's also my exact thought: just say that high-res benchmarking isn't something you personally want to do with your videos, and that's fine. All the BS they say to try to rationalize their choice is the problem - it's misinforming and misleading people, and making the tech space dumber. Their arguments are completely harebrained, rife with falsehoods, and I'm sure include some deliberate deceptions. What they're saying with their videos is that they aren't equipped with good analytical and comprehension skills, but are charlatans and Dunning-Krugers in the tech space.

I wrote this comment for this video, but this is only a small part of it. There's so much more that's wrong with what they're saying on this topic that I can't even see where the end of it would be if I tried to address everything:

As with HU's previous videos on this topic, there are a bunch of things wrong with what's claimed in this video. Aside from the persistent strawman and false dichotomy that people are demanding either low-res testing (academic) or high-res testing (practical), rather than both to give the full picture (both should be in a proper CPU review), one such claim is that GPU-limited CPU benchmarks are useless (which is literally the same fallacy as those claiming low-res testing is useless), later reframed more specifically as high-res testing being useless for determining a CPU's unbound potential. But those two statements aren't the same statements, and presenting the latter as confirmation of the former is dishonest and disinforming. I think that when people make these kinds of logical fallacies it's usually an intentional sleight-of-hand deception.

A CPU's unbound potential is only one part of the whole picture of a CPU review and buying decision. There are many other things to know, including what a CPU's potential is in typical-use resolutions that will end up being GPU-limited with all existing GPU hardware. And CPU reviews are quite literally "upgrade" / buying guides; it's the central point of why they exist in the first place. They're communicating technical and other information about CPUs so that people watching can become informed and figure out whether it's in their interest to buy one. That's why HU puts price-to-performance charts in their own CPU review videos - otherwise, remove those charts from your reviews. Saying they're not buying guides as an excuse for not including other contextual information that's important to the buying decision is both false and a weak excuse. If data informing how a CPU performs in its market's typical-use resolutions is omitted, then it's failing as a CPU review. Part of objectively measuring what a CPU can do is measuring what it can do when placed in its typical-use environments (resolutions) - that data shows what difference the CPU, as the isolated factor, can actually make in that environment, thus informing prospective buyers whether it's worth buying.

Here's another correction for Balin: providing 4K benchmark data gives actual information about actual CPU performance... when it's in a 4K gaming environment, which it will be for many purchasers. And 4K data with RTX 4090-level GPU performance isn't only informative for systems using that exact hardware and settings, but gives very useful information to people who have any of various levels of GPU performance.
The idea that the result doesn't inform other hardware configurations is baffling, because I can easily translate such data to my system with an RTX 3080 (soon to be an RTX 5080). The people at HU seem to be infected with the same disease where they don't understand the meanings of the words that they use, and they try to impose redefinitions of words that are already defined in the English language. But things don't work that way.

Another problematic assertion he makes is this: "But it's useless information if you're willing to downgrade visuals in search of higher framerates". That definitely needs an explanation to back it up, though a good one can't be given because it's entirely false. Someone with a properly-working brain should be aware that turning down settings will have the effect of increasing the overall FPS. And so native 4K benchmark results also inform people who are willing to turn down their settings to get higher FPS: they can expect a minimum of the shown FPS, and that number will go higher as they turn settings down. A person with a bunch of gaming experience probably has a feel for how much additional performance they might get from certain settings, and there exist videos for popular games showing how much of a performance impact individual settings make.

Saying that any of a selection of CPUs will work to deliver strong 4K performance makes the point of why it's important information for a CPU review, which is inevitably a buying guide for the CPU. Benchmarks informing people that there isn't a gaming advantage from one high-end CPU to another at higher resolutions is called useful information. In truth, HU's videos on this topic, including this one, are absolutely rife with what can only be described as Dunning-Kruger rationale, suggesting HU occupy the space of tech reviewers without having technical understanding that matches the role. And by presenting half-truths and falsehoods as authoritative, they're making viewers less informed and worsening the tech information space.
@@thischannel1071 I can't help but agree. I'm getting sick of the hostility toward running benchmarks outside of 1080p. Those benchmarks are only really relevant if you are considering the CPU or GPU alone, in isolation. Personally, I'm running 1440p on a 7800X3D and a 7900XTX, and benchmarks at that point are helpful for me to make up my mind about the components together. I don't care about the CPU or GPU separately, and I was able to find enough benchmarking of the config to know I was on the right track with my preferred combo, even with my VR rig.
Did you watch the whole video? They quite literally tell you how to figure out where the GPU-limited line is for you. What would be the point of turning a CPU review into a 4090-at-4K-native benchmark? Not to mention such data would not be as useful for you if you don't have the benchmarked combo of CPU and GPU.
Yeah I think even this dumbed down video is too much for those people. They want a 30 second video that conclusively tells them to buy it or not, not a 10-40 minute video that carefully lays out the facts and tells them, "Study these charts and make up your own mind."
But it can be, though. If your GPU is capable of running your favourite game at 150fps with the best CPU, and you are currently running it at 100fps, you want to be looking at a CPU capable of 200fps+ ideally, or at least 180fps, in CPU-bound tests.
@@dotjaz And unless you're rocking a 4090, it becomes irrelevant. Meaning you need to do more evaluation of YOUR situation. Once again, if you're happy at 100 FPS, it's a pointless upgrade. If you aren't happy with that, then upgrade with the most accurate info that's useful to you. JFC, this is why you follow more than one channel for tech reviews, because not every single one is going to run the full gamut. May as well do it yourself at that point instead of grasping at straws and whinging. Whether you can afford it or not is a whole other thing. Not financial advice.
@@SelecaoOfMidas Stop bringing 4090s into everything; those without 4090s can still achieve high frame rates with lower quality settings and upscaling. GPUs are very scalable, unlike CPUs. So there's no point in saying "only people with 4090s can see the difference". It is simply not irrelevant.
@@dotjaz Also, people forget that games are going to become more CPU heavy as time goes on. Horizon Zero Dawn in a completely CPU-limited scenario gets 250+ FPS, but its sequel Forbidden West gets only 180. If your CPU can do 150 FPS now, it won't do the same in the future. If I want 80-90 FPS for single player gaming, I'll get a CPU that can do 200 so that it lasts me a really long time, even if I upgrade my GPU.
Can you test every single game ever made with a 3800X and 6750XT at 1080p and 1440p so that I don't have to understand anything you just said and can see what the performance of my system will be if I upgrade to a 1440p monitor some time in the future? Please?
I like that you clarify and emphasize the "upscaling" - aka lower internal rendering resolution - part. I never really thought about that in terms of CPU bound performance. 👍
Like, I understand showing the differences in CPU performance at 1080p, but I wish I could know how much of a difference remains after upping the resolution.
@@mikekostecki2569 If upping the resolution makes you GPU bound, the difference gets smaller or even disappears. If not, the difference is the same as it was at 1080p.
@ I ended up buying a 7600x3d rather than a 7800x3d for this exact reason. I haven’t bought a gpu yet for it, but from what I’ve gathered it seems I’ll lose maybe 10 frames at 1440p with a high end card.
More of this guy. So the 4k native for 9800x3d is shown, but what was the 4k native for the other CPUs? That is the difference, or lack of, that I would like to see. Are you saying all 4 CPUs had the same framerate at 4k native?
Yes. All those CPUs had exactly the same frame rate at native 4K resolution. Basically, if you are running 4K native, there is no difference. But, if you do native 1080p, there is a huge difference.
Now do the same benchmarks at native 1440p so people know what fps those CPUs can achieve and can finally figure out whether they need to upgrade their CPU or not, Steve doppelganger.
Just give your viewers what they want instead of the BS reasons. "Look at old reviews" (likely out of date). "These reviews aren't for upgrades or comparisons" (despite literally showing comparisons). You show GPUs in CPU-limited scenarios but don't want to show CPUs in GPU-constrained scenarios, upscaling etc. Most people that aren't enthusiasts are just going to expect that a 7800X3D is quicker than all other CPUs at gaming by like 40 percent and go buy it even for a 4K monitor. The average consumer likely has no idea about the relationship between CPU, GPU and resolution. A chart showing this is absolutely helpful, as we all started as novices in this hobby, and the way the channel is coming off so snobbish about it is really disappointing. The hypocrisy and mental gymnastics going on instead of providing your audience with what they want is insane, guys. I've been subscribed since the channel's infancy and watch 9 out of 10 videos you produce. I won't be watching your CPU reviews going forward.
I don't even care all that much whether they give high-res benchmark results or not. What gets to me is how false, manipulative, and utterly harebrained their arguments are. If they actually believe the things they say, then they are, to the nth degree, exemplary of the Dunning-Kruger effect. I wrote this comment in response to this latest video:

As with HU's previous videos on this topic, there are a bunch of things wrong with what's claimed in this video. Aside from the persistent strawman and false dichotomy that people are demanding either low-res testing (academic) or high-res testing (practical), rather than both to give the full picture (both should be in a proper CPU review), one such claim is that GPU-limited CPU benchmarks are useless (which is literally the same fallacy as those claiming low-res testing is useless), later reframed more specifically as high-res testing being useless for determining a CPU's unbound potential. But those two statements aren't the same statements, and presenting the latter as confirmation of the former is dishonest and disinforming. I think that when people make these kinds of logical fallacies it's usually an intentional sleight-of-hand deception.

A CPU's unbound potential is only one part of the whole picture of a CPU review and buying decision. There are many other things to know, including what a CPU's potential is in typical-use resolutions that will end up being GPU-limited with all existing GPU hardware. And CPU reviews are quite literally "upgrade" / buying guides; it's the central point of why they exist in the first place. They're communicating technical and other information about CPUs so that people watching can become informed and figure out whether it's in their interest to buy one. That's why HU puts price-to-performance charts in their own CPU review videos - otherwise, remove those charts from your reviews. Saying they're not buying guides as an excuse for not including other contextual information that's important to the buying decision is both false and a weak excuse. If data informing how a CPU performs in its market's typical-use resolutions is omitted, then it's failing as a CPU review. Part of objectively measuring what a CPU can do is measuring what it can do when placed in its typical-use environments (resolutions) - that data shows what difference the CPU, as the isolated factor, can actually make in that environment, thus informing prospective buyers whether it's worth buying.

Here's another correction for Balin: providing 4K benchmark data gives actual information about actual CPU performance... when it's in a 4K gaming environment, which it will be for many purchasers. And 4K data with RTX 4090-level GPU performance isn't only informative for systems using that exact hardware and settings, but gives very useful information to people who have any of various levels of GPU performance. The idea that the result doesn't inform other hardware configurations is baffling, because I can easily translate such data to my system with an RTX 3080 (soon to be an RTX 5080). The people at HU seem to be infected with the same disease where they don't understand the meanings of the words that they use, and they try to impose redefinitions of words that are already defined in the English language. But things don't work that way. Another problematic assertion he makes is this: "But it's useless information if you're willing to downgrade visuals in search of higher framerates".
That definitely needs an explanation to back it up, though a good one can't be given because it's entirely false. Someone with a properly-working brain should be aware that turning down settings will have the effect of increasing the overall FPS. And so native 4K benchmark results also inform people who are willing to turn down their settings to get higher FPS: they can expect a minimum of the shown FPS, and that number will go higher as they turn settings down. A person with a bunch of gaming experience probably has a feel for how much additional performance they might get from certain settings, and there exist videos for popular games showing how much of a performance impact individual settings make. Saying that any of a selection of CPUs will work to deliver strong 4K performance makes the point of why it's important information for a CPU review, which is inevitably a buying guide for the CPU. Benchmarks informing people that there isn't a gaming advantage from one high-end CPU to another at higher resolutions is called useful information. In truth, HU's videos on this topic, including this one, are absolutely rife with what can only be described as Dunning-Kruger rationale, suggesting HU occupy the space of tech reviewers without having technical understanding that matches the role. And by presenting half-truths and falsehoods as authoritative, they're making viewers less informed and worsening the tech information space.
@thischannel1071 absolutely man. It's been boiling my blood with the approach they've been taking. Problem is they're too deep in now to change their position as they'll come across as incompetent. Super disappointing honestly
Exactly, just do both 1080p and 4K testing. TechPowerUp and other review channels have been doing this for years! Which is why I only look at those results.
You completely missed the criticism of the previous video. Nobody is disputing that the 9800X3D is the fastest gaming CPU or that a CPU's strength is best shown at low resolution. The problem is that: a) You made a dedicated video titled "Ryzen 7 9800X3D, Really Faster For Real-World 4K Gaming" and provided ZERO real 4K benchmarks in it (or even the quality mode), only "balanced" upscaling (which is significantly more CPU-bound than even QHD, let alone 4K); b) you selected a comparison with the 7700X and 285K, both of which are productivity-oriented CPUs (Intel has also already announced that Arrow Lake's gaming performance is scheduled for fixing in December, so a little odd timing, but I'll give you the benefit of the doubt on this one), as opposed to the obvious choice of the 7800X3D. Combining both resulted in a _technically correct_ , but practically misleading video, which made the 9800X3D look like it provides some gains at 4K resolution (which, again, it doesn't; "balanced" mode is not even 1440p).
Indeed, I'm puzzled how 1080p is tested NATIVE but 4K is tested with Balanced upscaling. It's as you said: showing one CPU-bound scenario against another CPU-bound scenario. There are people who want CPU-only data, and there are people who want to see what it's like when CPU and GPU work together (e.g. 1440p native), and neither is wrong. So I think the big issue is that people are blanket-assuming that those asking for 1440p / 4K are not asking the right thing.
a) "real world 4k" - surveys show that people gaming in the real world at 4k are upsampling, becuase very, very few people own a 4090. b) 1) the intel CPU's are newley released, and if you're making a purchasing decision right now, what you would compare to the released AMD cpu's are compared to the released Intel CPU's) 2) Next month the new Arrow lake will do the same - and compare the released Intel CPU's to the available - and will probably have a better showing. neither of that changes that if in October, November or December, if you buy a cpu, and (like almost everyone) run an up-sampler at 4k, the 9800x3d gives you a major FPS boost over the best Intel CPU in most titles.
@@andytroo The poll is irrelevant because it samples people's current usage, *not* people who will be upgrading their system. You simply cannot claim that people who have a budget for a 500 USD CPU are going to use it alongside a low-res monitor, or that they will settle for poor quality upscaled 4K. And even then - 25% of people voted for "Quality" settings (as opposed to only 11% for Balanced and Performance combined) - what's the explanation for not using a single comparison at Quality settings? "People were asking for 4K testing to be included, let's make a 40 minute video of testing at 1080p and _also pretty much 1080p_ ". As for the 285K, simply put, it and the 7700X are very sus choices for non-marketing purposes, considering their gaming performance is weak.
@@andytroo The "real-world 4k" survey actually showed that more than double the percentage of 4k gamers game using 4k DLSS Quality than those who use DLSS Balanced or Performance. But HU showed 4k Balanced benchmarks. The surveys also showed that native 1440p is the largest demographic, but didn't show any benchmarks for that. So, they didn't reflect what people actually said they use the most in the surveys. Also, those surveys reflect all of HU's viewership, and not specifically people buying a 9800X3D. Among purchasers of a 9800X3D, the number of people gaming at native resolution and DLSS Quality setting will be far higher than the average HU viewer, and 9800X3D purchasers will have a much higher average GPU performance level than the average HU viewer. HU's arguments on this matter are mostly nonsense, and supported by half-truths, falsehoods, and misrepresented data.
I was one of the ones that complained about not having 4K native results in the last video. I gotta say it's amazing that you guys hear the community and deliver, or explain why us gamers might or might not be wrong. Thanks for all of the effort!
Except they didn't even listen to their own polls which shows more people game at 4k Quality DLSS than Balanced but they still chose to show Balanced 😂😂
I'm a simple man: I see the 5800X3D in a graph, I press thumbs up. Well done Balin! Also thank you for providing material for educating people on testing methodology; it's easy to forget that people might not fully understand the content.
@@mikeramos91 There's also a point of diminishing returns as well. Steve explained why in the last podcast episode. I wasn't paying enough attention to remember the details, but another technical reason comes into play that makes going as far down as 720p a bad idea for today's CPUs.
@@Lebon19 1080p is really the spot where most ppl play & where it makes the most difference. In some cases 1440p is becoming the new spot, especially with higher RAM speeds coming out.
@@mikeramos91 Agreed. 720p used to be the default and 1080p was the intermediate hi-res setting (like 1440p is now). Now, 1080p is the bottom and 1440p is in the middle. So it's only a matter of time before 1440p replaces 1080p, but it's still a few years off at least. So, even though it's a matter of time, I still don't see 1080p CPU testing going anywhere anytime soon.
@@mikeramos91 There shouldn't be wider differences between CPUs when benchmarking at 720p, otherwise HU would be guilty of everything they argued against in defending their 1080p benchmarks. Their whole point is that 1080p benchmarking should be focused on because it shows the unrestricted performance of the CPUs. But 1080p benchmarks also aren't very relevant for 9800X3D gamers, as 98%+ of people who buy the top-end CPU won't be using it to play at 1080p. While 1080p might still be the most played-at resolution, that's across all CPUs. When specifically talking about the highest-end CPU and GPU hardware there is, it's a very niche gaming resolution that only very few people (a small portion of competitive gamers) use.
Your own data shows there's a big difference in real world usage (4K upscaled). You claim that we shouldn't upgrade if we don't need to, so why not include real world data in all reviews, like you have done here, to help us make that decision? No one is asking you to abandon 1080p benchmarks, just to include 1440p and 4K ones (upscaled ideally, and native as an extra). How would a 5800X3D or 7800X3D owner playing at 1440p or 4K decide whether to upgrade without you guys providing this data? We can just use Cinebench if we want to see CPU capabilities for single and multi core. If the 1440p and 4K benchmarks in reviews were truly useless, then they would show no improvement - but they do - so it's your own data which disproves your statement.
"How would a 5800x3d or 7800x3d owner playing at 1440p and 4k decide whether to upgrade without you guys providing this data?" - You would ask yourself if you are getting enough fps for you. If you are, then you wouldn't upgrade. If you're not, you would use a monitor to show you how utilized your GPU is in your games. If it is below 95% utilization most of the time, while playing, then you know your CPU is not fast enough. Therefore, any CPU review here, will show you how much faster the CPU's perform relative to each other. You then get the fastest one you can afford and you will get higher fps numbers. If you really need the numbers, use the percentages to get a ballpark. (Don't forget you need to know the maximum your GPU can go, too. Use the internet to determine that. There is no point in upgrading for 30% and the GPU can only go up another 10% or something.) If your numbers are 76 fps and the CPU review says a CPU is 30% faster, then you can add about a 3rd of your fps number to itself and get a ballpark uplift. Then ask, "Is 30% enough to pay for a whole new CPU for?". I prefer 50-70% uplift, when I upgrade. You only need to upgrade, if your CPU is holding back your GPU and your experience is bad right now.
@@boirfanman Or they could just add the 1440p and 4k benchmarks - who the hell has got time for all that? The fact that his own data showed there was differences (some more or some less) with the 4k benchmarks disproves their points that 4k benchmarks don't say anything.
@@konvictz0007 Wow. Then I agree with Steve. It's not worth his time to cater to people who are this entitled. Use your time to figure out something that benefits you. Not someone else's time.
You fail to see the point Balin is making: 1440p and 4K would only make sense if you also happened to have a 4090. If you don't, and most don't, you would need to look up the latest benchmark of your GPU to see what kind of FPS it could give with the best CPU available anyway. So it doesn't really save time for most consumers.
"decide whether to upgrade without you guys providing this data?" They even said CPU reviews are not an upgrade guide. You are exactly who they are talking about at 12:02 lol
11:35 Steve's got some strange logic here… you set everything to highest at 4K with DLSS and frame-gen, and if you don't get the desired frame rate you go buy better hardware. It's a bit bad when you have a 4090 with nothing better to buy until the 5090 comes, but it is what it is…
Nah, the logic is sound, it's just not for the everyday gamer with his/her xx60/x600 class GPU, but for the 0.001% of people that has a 4090, willing to sacrifice graphic quality(surely including RT), but not willing to sacrifice 4K resolution or framerate... lucky that the 4090 is one of the very few NV cards that has sufficient VRAM so you won't have to compromise on texture quality...
@andraskovacs8959 I strongly disagree with the VRAM statement; it's easy to go over 24GB in Cyberpunk 2077 with 4K texture mods, and then I get stutters. That's the main reason I can't wait to get the 5090 and its 32GB of VRAM.
@@medovk Well, for some people, nothing's enough. Let me not shed a tear for you. If you run out of 24GB, it's of your own volition; the game does not force you to. I'm fine with it, the 16GB on my RX 6800 will do just fine for me for a long time. You do you - if you are able and willing to spend the big bucks on the "never enough" VGAs, I don't mind, have your fun with them... just don't come to me for validation or empathy with problems that even 99% of 4090 owners (themselves less than 1% of the gamer community) do not have...
If these videos are the only way you're going to show 4k or 1440p results, then please keep making them. As you're talking about how useless the results are, I'm looking at your data and am very interested in this "useless" data. Of course 1080p benchmarks are *more* useful than higher resolution benchmarks, but knowing where the 'ceiling' is is still very useful information. The 9800X3D is measurably better than the 7700X on Hogwarts Legacy (on your test bench at high settings) at 4K with DLSS, but not any better without DLSS. Great! That's useful information I wouldn't have known otherwise!
Still, you shouldn't use Steve's CPU bottleneck data to see if you want to upgrade your CPU. You should really only use Steve's data to see how CPUs compare to each other.
You can figure out how much better the 9800X3D is than the 7700X at 4K WITHOUT testing; I don't know how, but I'm being told you can! If someone can tell me, let me know, I want ACCURATE numbers, not guesswork, BTW.
@@sykusyku Yes, I'm trying to decide if I want to go to a 4K monitor, so I need those results. If a game is getting 75fps at 4K (with said CPU) then I'm not doing it, but if we are getting 150fps then I'm making the switch. Why are we trying to change this 30 years in? Blows my mind.
@@sykusyku Oh, that's easy. The 9800x3d is as much better than the 7700x at 4k as it is at 1080p. The main reason you get fewer frames in most benchmarks and in real life when you run at 4k is because your GPU is the bottleneck.
@@pedropierre9594 Thanks * ^^ * Tbh my thought process was "a GPU cost me way more than a motherboard, RAM and CPU", and my last comp ran for 13 years with a really low end GPU (15 years ago at this point, a Palit GTX 660 OC), so I wanted to avoid that. I mostly play single-player entertainment games, not PvP stuff, so graphics was always my main concern, and I assumed the 4080 would serve me well for a long time ^^
I feel like the people complaining about cpu testing methodology have not finished high school which could be the case for a lot of your viewers. Appreciate the effort you guys take to educate your viewers!
Honestly, when i watched Steve's video i thought no one is going to watch till the end on this... and the end was the part that really made his points the clearest imo... This video is good and i hope it helps people understand how to use these charts better to suit individual needs instead of nit picking about your very clearly robust and effective means of testing, as well as presenting the data in a very useful manner. You have vids for individual GPU's using the best CPU's in the bench and CPU tests with the best GPU's which gives us useable data for how each individual part can perform in a best case scenario which will show us exactly where to find a bottleneck and how it can vary across a multitude of games. What you guys do is no small feat and i feel for Steve and the whole crew when people seem to just get upset that you don't showcase their exact set up or scenario. But many of us appreciate all the little details more than you know so keep up the good work you guys!
I fell for that trap when I built my PC. At the time, a 5800X and 5800X3D showed little difference at 4K, so I bought the 5800X. Fast forward a few years, and I ended up regretting it. Newer games were more CPU demanding, and the 5800X struggled to maintain 60fps in games like Jedi Survivor and Starfield. The frustrating aspect is that there was barely any improvement in frames from turning down game settings. Even other games struggled with poor 1% lows (Forza Motorsport). Lesson learned: if there's a 20% gap between CPUs at 1080p, the gap will also exist at 4K. The gap might not be noticeable at 4K in today's games, but it's there, and will show up in the future as GPUs improve and games get more CPU demanding.
I kept expecting Steve to do a drive-by... slowly walking behind Balin while looking menacingly at the camera, grabbing a random thing off the shelves without looking, and slowly walking back out without ever breaking eye contact.
This is getting out of hand, now there are three of them.
Nice Phantom Menace reference 🙌
I will be honest. I am kind of stupid and need this short-form and it explained to me three times. I mean I hope the TH-cam Short is next. I will really understand it then. 😅
We should not have made this bargain.
is that legal?
Inflation by 50% 😂
Hello Steve... Wait a minute
impostor
@@LunarList DO NOT SAY THE "A" WORD
@@tomaszzalewski4541 amogus
Wait a minute... WHO ARE U??? 🧑🏼💼
steve got younger
“You didn’t believe us when Steve said it, so we’ve hired actor Luke Evans to put in an Aussie accent and explain it again”
😂
Good one
Hello
That's actually funny! 😂
No because why did I think he looked so familiar 😂
Amazing video , thank u very much , i loved the fact that u made a distinction between 60fps 4k and high fps 4k and in that situation the cpu is very important. Waiting for the 9950x3d
When its so serious Steve is standing on the roof above and the editor has to make the video himself.
9800x3d was so good, that it made Steve younger, truly a great CPU
that's why people like V8's so much 🤭
Plot twist, he's actually older but uses moisturizer.
except most people can't get it right now.
@@1sonyzz No replacement for displacement
@@Treestorm Use a stock drop alert service.
Steve looking young today, must have got a new power tool.
Don't be afraid of a little moisturizer fellas, it works wonders.
@@JeredtheShy seems like works better drinking it
The 9800x3d is so legendary, it has magical power
Steve looking different today. not standing, not lounging. maxxing.
Not steveing either
Lowkey make me want to ropemaxxx ngl
Stevemaxxing
Hes lounging outside the camera
that's what FPSmaxxing does to you
Thanks Steve!
Simply put, you need both CPU and GPU benchmarks to decide what would be the best combo for your use case.
CPU bound benchmarks tell you how many FPS a certain CPU can push at best in some games. Whereas GPU bound benchmarks tell you how many FPS a GPU can push in the same games. Based on that, you can conclude any CPU/GPU combo where the GPU will be the bottleneck.
And typical use-case, higher-resolution CPU testing results are an important part of a proper CPU review, as they answer the question people interested in potentially buying the CPU have as to whether it's worth getting it if they game at those typical use-case, higher resolutions. When the only component to change between tests is the CPU, then it's the difference the CPU makes in that test environment that's being tested - even when there's GPU-limiting happening.
This guy is really good. Nice voice, excellent pace. Simply a perfect host.
Thanks Steve!
Yucks Intel.
Thanks, over to you Steve!
You can literally see it!
Thx Steve for this video. Wait
Steve is so fed up explaining this that he's standing outside of the video intro.
While murmuring to himself quietly and smoking a cigarette
Some say, he edited a video faster than the 14900k and that he undervolted his refrigerator. He's not the Steve, but he's the Steve's gym Bro cousin.
Wow, that reference is kind of a deep cut these days lol
Top Gear, what a show !
Brother from a different mother
I have an i9 14900k with 700+ hours on it and no issues 🐑
@@madclone84 Someone touched the nerve. 🤣🤣🤣
Steve got upgraded to steveX3D
I agree, bro's stacked
Great job, Balin! I love the addition of the 4K native line here. It provides an important piece of data that was missing when the assumption was that next to nobody plays at 4K native with all the settings except ray tracing cranked up. It helps make the point, a lot. This from someone who fully understands why CPU testing is done the way it is.
its not just todays '4k native' -- what also matters is the 1440, next game engine fps, and if you've got the headroom to run that if you upgrade just the GPU
@@andytroo True. For me, a top tier GPU that can run games at 4K native at maxed settings should be paired with a top-tier CPU, even if that CPU's oomph won't always be needed to play at 4K today. I might also want to run an easier title at 1080p or do some CPU-heavy productivity tasks.
its not native though. its 2227 x 1253. All the other review sites acknowledge this and show very little benefit at higher resolutions in most games. If you consider productivity then it shifts to Intel. Check out The "Perfect" Gaming CPU isn't what they say it is... -- 9800x3D META-ANALYSIS! from Vex
@@unspoken7704 When you say, "It's not native," to what exactly are you referring? I think you mean that DLSS balanced 4K is actually 2227 x 1253, and that is true. But I'm not talking about the lower set of blue bar graphs. I'm talking about the superimposed vertical red line labeled "4K NATIVE." Seems you have missed that entirely.
@@rangersmith4652 correct I mean the 4k balanced performance metric. The results should also be in the graph but I understand they are using older material to superimpose on top of. I'm high lighting the fact the DLSS results are are a bit wasteful and building in the 4k native into the graphing system would be more valuable. capeesh?
Fantastic job Balin, love your hosting.
What everybody else must be thinking: why aren't you part of the presentation team?
Your really great!!
And this summary format is a really good idea and should happen again.
Congratulations on a nicely thought through event.
Looking forward to your next one! And please guys can we have him/you do the next one!
Guy has been editing videos for so long he's already mastered their intonations and speech patterns :D
We had Steve standing, sitting, lounging on the couch; now we have Steve "I can't even anymore..." :)
"I can't even" Steven
@@ivanbrasla How did you miss the most obvious wordplay?
"I can't [St]even"
You know it's trouble when Steve is standing. Well, that's nothing, wait til he's so fed up he throws a Balin at you.
WE NEED MORE BALIN - His cool and collected manner is the perfect contrast to Steve being pissed and Tim being adorable. What a trio, we are truly blessed!
I think what we need are some real life friends.
This is what Steve looked like before testing 40+ games for the new Intel CPUs.
best comment
Steve got a hair cut, beard shave and a nose job :)
Made him look 15 years younger!
💀
And tats too.
@@Hardwareunboxed and 246% more likeable!
Steve lied down for so long he fell asleep. RIP Steve
🤣
CPU-limited testing is essentially like testing with the GPUs of the future, that are not here yet. It shows arguably the most important thing - how futureproof that piece of tech that you're buying.
A new CPU with a forever-unreleased top tier GPU. Every gamer's wet dream.
Exactly. Put it perfectly
It’s honestly just comparing the maximal capabilities of the cpu. This inherently does take into account future gpu performance if cpu performance is being maxed out.
@@robertt9342 Well said. If there are suddenly some "synthetic" conditions under which one CPU performs even better than what has been found in "real" tests, it should be clear that it had not been pushed to its limits before. And having some headroom is kinda synonymous with being future-proof.
It's also testing for people who don't play at ultra settings but at medium or high for extra fps.
Pretty good, Balin, nice presentation, easy to follow. And good to be able to put a face to the name!
Damn, the 9800x3d is so good, Steve got younger and went to the gym.
When we thought “Standing Steve” meant things were SERIOUS, “Seated Balin” is like a triple exclamation mark!!!
Congratulations on the lecture Beilein! Show up more often, it's good for people to know the whole team behind Hardware Unboxed!
I've been playing at 4K 60 since 2018 and the CPU has never been my problem. The 8400F in 2018 ran 4K 60 with a GTX 1080 Ti, then the 9900K in 2019 with a GTX 1080 Ti, the 9900K and 2080 Ti in 2021, and in 2024 a 5700X3D with an RTX 4070 Super. I've never needed to invest heavily in the CPU; all my money was always focused on the GPU because the focus is 4K 60, and I can play competitive games at 2K 144 and 4K 144 smoothly on the Ryzen 5700X3D. The difference is that it puts everything on low settings, which helps with maximum vision in competitive mode.
Yes because you are always GPU Limited
As if the 9900K and the 5700X3D are not both really really good CPUs from their respective eras :p
60 is low fps in 2024; however, even 60 will be a problem for poorly optimized games with weaker CPUs, even at 4K.
CPU bound - 1080p: pushes CPU to the max, GPU is waiting on the CPU.
GPU bound - 4K: pushes GPU to the max, CPU is waiting on the GPU.
To date, there are no GPUs capable of forcing a 4K CPU-bound scenario for a modern/premium CPU.
Thanks 🙏
Basically, a CPU that gets the most frames at 1080p will have much longer longevity: a lot more headroom to handle future games at 1080p or 4K. Sure, that 285K is probably within 5-10% of the 9800X3D at 4K, but as time goes by, the 9800X3D will remain competitive while the 285K will fall off sooner.
Yep, as years go by you have to reduce game settings to maintain FPS, but on the CPU side that does not help much - you can reduce draw distance/calls and simulation/shadow quality, but that's about it - so the faster CPU wins out. For the GPU you can reduce/disable tons of settings and win back some FPS, and it's easier to swap in a new GPU than CPU/board/RAM.
@@mikfhan Yup. And in most games those settings still won't help you all that much with CPU performance.
It's unusual to see a game where settings can change CPU performance by more than 20%, but in most games it's pretty easy to get 100%+ performance scaling by changing settings and resolution.
In Tarkov the 9800X3D is 20% faster than the 7800X3D at native 4K with a 4090, with 30% better lows. And the 7800X3D is a lot better than any Intel. The 9800X3D is the first CPU not dipping under 60fps in the worst scenario (MP, heaviest map and so on). The point is, if you know your particular game is drastically CPU heavy, you can make good guesses based on low resolution testing in other games, even if they are GPU heavy. There are some exceptions, but mostly the information is out there if you dig for it. High resolution CPU testing at best serves only people who play the exact same games used in benchmarks.
@@kognak6640 On top of that, these tests should be done with Discord and 2-3 tabs open on the second monitor to see the real-world usage bottleneck 😅
Yes, and also not everyone prefers the same frames per second. I see so many comments saying "This CPU is overkill for that GPU and resolution!!!" No it's not, I prefer to game at high fps; your good-enough-for-60-fps CPU is not good enough for me.
Another point with CPU usage: I have seen a comment stating, "I am told I am CPU bound, but my 4 core CPU is only using 25% in the game I play." In that case the reason is that the game is only using a single core (at 100%) while the other 3 are idle, so the total CPU usage is 25%, but the single core is maxed out and holding the system back.
And in some cases, a game can still be CPU limited even though none of the CPU's threads are hitting exactly 100%. Confusing, really.
You can also get weird numbers above 30% if one core is boosting more than the rest, or if there is some limited multi-threading but the main load remains single-threaded.
@@JakeRayTM Yes, this is why you determine whether you're bottlenecked by your CPU first and foremost by the GPU load (when using unlocked fps ofc)
@@omnisemantic1 There is one instance where you can see the CPU bottleneck by looking at usage: when your cores are all at 99%.
@@omnisemantic1 no, I'm talking about cpu limited situations where none of the other components are the issue. It's often because of the way some games (and engines) are coded for multi-threading.
*Edit* oh, I might've misunderstood you, maybe you weren't exactly disagreeing with me? English isn't my first language, so sometimes I miss some nuance.
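To put a number on the point made a few comments up (a minimal sketch with made-up figures; the core count and per-core loads are hypothetical), a game whose main thread saturates a single core can still report a low average CPU usage, so the headline percentage alone can hide a CPU bottleneck:

```python
# Hypothetical per-core loads: the game's main thread is pinned to core 0 at 100%,
# the other three cores of a 4-core CPU are idle.
loads = [100.0, 0.0, 0.0, 0.0]

average = sum(loads) / len(loads)   # what most overlays report as "CPU usage"
busiest = max(loads)                # the core the game is actually waiting on

print(f"Average CPU usage: {average:.0f}%")   # 25% - looks harmless
print(f"Busiest core:      {busiest:.0f}%")   # 100% - this is the bottleneck
```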
As someone in the comments said, some of the viewers of review videos are new to the PC gaming hobby.
I have been watching review videos for almost 2 decades, the idea of benchmarking cpus at the lower resolutions has always instinctively made sense to me.
But for someone that just got into PC building or upgrading in the last few years, they would not be able to understand it until it's explained.
That said this video really illustrates it well, good job Balin!
Some people just want data to back up their claims, which, after 3 extra videos, they now have. The irony is that they made all these videos to show people are wrong, but in the process made the videos people wanted to see, so they could see that different games will have different improvements or no improvements at all. Which is the entire point of why they asked, because HUB typically will test 20+ games to find some kind of average.
The bottom line is that HUB can say "its not meant as a buying guide" when in reality, everything they make is being used as a buying guide whether they like it or not.
@@be0wulfmarshallz Reviews are not meant to be a buyer's guide. They are meant to be PART of what goes into a buyer's decision process.
So we're finally getting more B-roll! Nice :)
Poor Steve also deserves a break from those mission critical 4k CPU benchmarks
Totally worth the watch just to see Balin host a video. Great work!
Lmao i never expected this, good luck being part of the videos sometimes!!
I watched the long version, and even though I already knew the GPU bottle neck needed to be removed to test accurately, I still learned new info. It was very thorough.
Steve's Elixir of Rejuvenation works BOSS!! Unfortunately the audio I heard was dropping too low in parts to hear all the words, might just be YT glitching on me. The visual presentation and structure was excellent.
Yay Balin! Excellent video, I don’t mind him hosting videos one bit if you need a third face for the show :)
I never understood the problem people have with it. When your GPU sounds like a vacuum due to being under 99% load every time you open a game, a better CPU will do nothing for you. Upgrade your GPU until it's the other way around.
I think that people don't generally have a problem with understanding why 1080p is done - to measure a CPU's potential. HU just mischaracterizes the argument as being either low-res or high-res testing, instead of acknowledging that they both play an important part in informing people about a CPU's potential, both unbound, and in typical use environments.
my laptop when i launch roblox
Steve : "Not this old chestnut again, Balen you can tackle this time"
Yeah, anything but showing what people want to see...
@@Thunderhawk51 You saw it when the 4090 was launched, is the thing. CPU reviews for CPUs. GPU reviews for GPUs. Upscaling tech reviews primarily for GPUs, CPU working as support. Not that hard.
@@Thunderhawk51 People come to a CPU review expecting to get a whole "what PC should I build?' guide. That's not how this works. A CPU review is a comparison, under ideal circumstances, of one CPU against another. Doing such a review with a mid range GPU, at 4k, in ultra settings or whatever inane shit some people want to see added to these reviews would make it worthless as a CPU review since you'd be reviewing the performance of the whole system. And considering there are hundreds of thousands of different possible systems, multiplied by however many games and settings, you're obviously not going to get full system reviews, that's insane. So instead you get reviews telling you how each component will perform under ideal circumstances. It's then up to you to figure out what component will work best with another.
What all these people want to hear is: "Intel is not much slower than amd".
But the hurtful truth they find is: Buy 9800x3d if you want high refresh rate experience or 5700x3d if you don't want to spend much
For pure gaming, sure. However, the main argument of the Intel fanboys (just to make sure - I'm typing this on a Ryzen laptop) was that if people are willing to spend 500 USD on a CPU, it is very unlikely that they will use it on a low-res monitor, and at real 4K the difference in gaming is mostly within the margin of error. They should have shown this in the main video, and they should've included 1440p and quality upscaling (where of course the difference would be significantly smaller vs the 7700X and 285K), but instead they showed 1080p and also "pretty much 1080p", which made the video look like an ad for the 9800X3D.
@@Aquaquake Well, someone will always believe the planet is flat.
@@Aquaquake 1440p DLSS/FSR Quality mode upscales from 960p, and with upscaling overhead, still typically performs better than 1080p native, so the FPS results would be the same or greater than 1080p native. 4K Balanced upscales from around 1260p, and with the hit of upscaling, will be quite a bit slower than 1080p native. If you're happy with 1440p DLSS quality, the 1080p native results are highly relevant.
Or a 13600KF for less money, more fps on average than the 5700X3D and no dips. And way better at everything other than gaming.
@@rluker5344 It is not cheaper in my country, not sure about yours. Multi core performance is sure better
Steve bout to put his editor on BLAST! Btw he should be a regular on the channel, on camera with Tim and Steve.
If you agree Bailin (Idk how to spell his name I apologize if I spelled it wrong) Please leave a like on this comment
Great job! Enjoyed the watch and meeting the third team member!
The production quality of these videos is superb, excellent audio, excellent image 👏👏👏👏👏
4k native data helped me a lot. Upgrading from a 7 year old CPU, if all the current CPUs give the same gpu limited fps in the games I play then any of them will meet my current needs. The 1080p data shows me which one will probably serve me the longest.
To make the data more accurate, you should take the result of a 4K GPU benchmark with your specific GPU. From there, any CPU whose CPU-bound result hits that frame rate will meet your needs. No 4K CPU benchmark needed, and you will get better results.
@@enderfox2667 Can you explain it to me please? I am just bewildered here and I didn't understand the video at all :(
Thx
@@ЯремаМихайлюк-ш1у I'm not the best explainer and I'm not a native English speaker, but I'd like to try. Which part did you not understand? Do you not understand what this video is about or why Steve thinks this way?
@ Well, for starters, why did they show that all CPUs will have the same fps count at 4K native with the same GPU? I mean, I don't understand how that fps count doesn't change across the different CPUs. It should add some advantage even with the same GPU, no?
@@ЯремаМихайлюк-ш1у Well, there are cases where performance differs in these setups, but most of the time it is the same across CPUs. Experts would tell you that this is because in most of these setups the bottleneck is the GPU.
To understand this, you first need to know that both the GPU and the CPU can work at the same time to calculate frames, in the sense that the CPU prepares draw calls and other game logic for future frames, while the GPU handles rendering tasks for the current frame.
However, each frame still has to go through both parts. If your CPU is capable of 200 fps and your GPU is only capable of 100 fps, your effective frame rate will be limited to 100 fps.
Therefore, using a CPU that is capable of even 500fps will have no advantage in this scenario, because both CPUs have already calculated the frames while the GPU was busy with the 100fps.
Having a better CPU in this scenario will only make the CPU wait longer for the GPU.
Most of the time, you will have a GPU bottleneck at 4k, because increasing the resolution only puts more work on the GPU, while the CPU workload remains the same.
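To make the explanation above concrete (a minimal sketch with made-up numbers; the CPU names and frame-rate caps are hypothetical): the frame rate you see is roughly the lower of the two caps, which is why very different CPUs can look identical at GPU-bound 4K and only separate once a faster GPU arrives.

```python
def delivered_fps(cpu_fps_cap, gpu_fps_cap):
    """The slower of the two sides sets the pace; the other one waits."""
    return min(cpu_fps_cap, gpu_fps_cap)

cpu_caps = {"CPU A": 140, "CPU B": 220}   # CPU-bound results, e.g. from 1080p testing
gpu_today_4k = 100                        # what today's GPU manages at 4K native
gpu_future = 250                          # a hypothetical much faster future GPU

for name, cap in cpu_caps.items():
    print(f"{name}: {delivered_fps(cap, gpu_today_4k)} fps at 4K today, "
          f"{delivered_fps(cap, gpu_future)} fps with the future GPU")
# Both CPUs show 100 fps at 4K today; only the faster one benefits from a GPU upgrade.
```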
I just got my 9800X3D, came from a 14900K, and playing WoW cranked to max at 4K yielded around 150-200fps on my 14900K; with the 9800X3D it's 250-300fps. So yeah, processors DO matter even at 4K, depending on the games that take advantage of CPU / caching etc.
Exactly, why did they make a video saying nothing but 1080p results are valid when comparing them to other cpus?
@@UMADl3RO5 Dear god, you *still* don't understand.
If you benchmark WoW at 1080p, it'd likely show the differences between the CPU even more starkly, because at that point, the CPU is absolutely the bottleneck, not the GPU. That WoW is seemingly not *completely* GPU bottlenecked at 4K is not the point - that the testing needs to have the CPU as the bottleneck to get valid results, is. A test of WoW at 1080p would probably show a similar, if not greater uplift, so testing at 1080p is still the correct methodology for the vast majority of games.
Please, this really isn't rocket surgery.
@@Beany2007FTW lmao I understand that. I just feel like it's lazy to not include the resolutions people are playing at. It's easier to see the graphs and compare rather than predicting what the results might be. I understand you are just concerned about seeing the speed difference and you cannot do that at 4K or 1440p. I just want to see the graphs regardless, with all the resolutions. If that's asking too much then I guess I'm asking too much. But buddy, I understand why they test at 1080p; why is it hard to include other resolutions? Will it affect the 1080p results to include 1440p or 4K? No. If you have a problem with that I don't know what to tell you.
I can't believe the tenacity you guys have for educating the "unwashed" masses. I'm an old gamer, my first 3D accelerator was a Hercules Riva TNT. This was before the term GPU, because back then the CPU did most of the heavy lifting. It wasn't until the GeForce era where we really started to discuss bottlenecks, as the GeForce started to take the burden off the CPU. Hardware T&L became a thing, along with other technologies. At that time very few people used LCDs (they were garbage), it was all CRT, and in reality it was awesome. There were no "native" resolutions, just a maximum resolution, so no interpolation. We got higher refresh rates the lower resolutions we used. I'm biased, but this was the golden age of PC gaming.
When we started discussing CPU bottlenecks, it became obvious that CPU reviews showing gaming benchmarks needed to focus on lower resolutions. So if most people were gaming at 1024 X 768, then benchmarking was done at 640 X 480, maybe 800 X 600. So generally speaking, reviewers and gamers alike knew that to highlight the differences between CPUs, you needed to take the burden off the GPU. There were some dissenters though, but it surprises me that they still exist after this long.
I've always thought of the CPU as setting the ceiling of performance. It doesn't matter what GPU you have, X CPU will only ever be able to put out Y FPS. So once you know the absolute maximum that CPU is capable of delivering, you should expect that using a higher resolution and detail settings will result in lower FPS. It is not useful data to run your benchmarks at 4K where the GPU is fully utilized (90% - 100%) if the purpose is to see how different CPUs scale. You'd be lucky if you saw much difference in the .1% and 1% lows in that scenario.
I'd love to know what GPU these people are gaming on. If they are gaming with a 4060Ti or less, then "b_tch you GPU bound at 1080p anyway, what do you care about 4K?".
@Harware Unboxed: if you really want to trigger them, start benchmarking at 720p.🤣
CPU benchmarks could focus on lower resolutions and nobody would complain. HU have repeatedly misrepresented the argument they're trying to shut-down, because if they didn't then they'd inescapably end-up giving legitimacy to it, because there is genuine legitimacy to it.
Most people taking issue with HU's lack of higher-resolution data fully understand why CPU-unbound testing reflects the CPU's maximum potential vis a vis other CPUs. What people have complained about are CPU reviews that completely omit any mention of what difference the CPU makes at typical-use resolutions, as the review then fails to inform prospective buyers of whether they should upgrade or not. And a primary reason, if not the reason people are watching a CPU review is to know whether they should upgrade or not, and they're watching to inform themselves on what the CPU means to them.
When a CPU review omits crucial information about what the CPU means to its target audience, then it's falling short as a CPU review. And when a CPU review hypes up performance spreads that almost nobody will experience, due to them existing only at resolutions lower than what 95%+ of people buying the top-end hardware will actually game at, then the review is misleading when it hypes up that performance difference without communicating the caveat that it won't be experienced at that hardware's typical-use resolutions by 90%+ of its purchasers, either today, or even in the next several years and after multiple GPU upgrades. The only people who will see 1080p-style FPS spreads from their 9800X3D within the next several years will be those purchasing a GPU with RTX 5090 level of performance, or more. And two years after the RTX 4090's release, those who've bought one represent less than 1% of Steam's userbase. Expect the number buying an RTX 5090 to be even less, considering its significant price increase.
Man, I am very new here. Sir, can you explain please why lower resolution matters in CPU benchmarks? (I read every word and sentence people use here, but my knowledge is limited, so most of the time I don't understand it.)
Thank you
@@ЯремаМихайлюк-ш1у If your GPU isn't powerful enough, it will limit the performance of the CPU in gaming. And if your CPU isn't powerful enough, it will limit the performance of your GPU in gaming. And the higher the resolution you run a game at, the more work your GPU has to do to render the image, whereas the CPU generally has the same workload regardless of which resolution you run a game at. So, to see the full potential of what a CPU can do when it isn't being limited by your GPU, you need to test it at a low enough resolution that the GPU isn't limiting the performance of the CPU. That's why low resolution benchmarks are used to compare the raw performance of CPUs.
That's good to establish the relative performance of just the CPUs, but it serves as synthetic and merely academic data if the tested resolution isn't the resolution you're actually going to use the CPU in. That's why it's also important to test CPUs in the resolutions people will normally be using them in. And for the highest-end gaming CPU there is, the normal resolutions it's used for will be 1440p and 4k. Since currently-available GPUs aren't powerful enough to avoid imposing performance restrictions on a high-end CPU at those higher resolutions, the performance differences seen between CPUs at 1080p will gradually disappear, the higher the resolution goes.
For people thinking about buying a 9800X3D, seeing where the performance starts to disappear at higher resolutions, and how much of it disappears, is important information, because it doesn't make sense to spend hundreds of dollars on a CPU upgrade for gaming if it won't make any difference to the gaming performance you already have. So, it's very relevant to a review of the CPU to communicate that, so that people aren't upsold on a product that will basically take money out of their pocket and give it to AMD, in exchange for nothing of value for them.
HU have argued that a CPU review is only about the academic (low resolution) test data, and not the practical (high, normal resolution) test data. While they're correct about the reason for doing testing at 1080p (to see the comparative, raw potential of CPUs), they aren't correct about anything else they've claimed on this topic, and have made many strawman, gaslighting, and outright silly arguments in defence of their position. Most of HU's arguments in defence of their position are flawed, feature selective acknowledgement, or completely misrepresent what they're addressing, and so don't stand-up to scrutiny. But they've doubled-down on those arguments for whatever reason.
@ BIG thanks man. I just wanna be more knowledgeable in this sphere, and without a PC it's pretty funny because my theory gets mixed up very often if I don't watch any news on the topic. By the way, is there a point in buying X3D CPUs if someone plays only at 4K? I mean, a normal 9950X will do almost the same job in games as I understand it, and it will be better for a workstation, no? (That seems right to me, but I somehow found some comparison benchmarks and sometimes (and that's why I don't believe it) some youtubers show a 10-20% increase in some games at 4K (with X3D) (is it even possible??)
@@ЯремаМихайлюк-ш1у No prob. I think buying an X3D CPU is very worth it for high-res gaming and other things - though, in 4k gaming, there likely won't be much, or any difference between a bunch of high-end CPUs unless you have at least an RTX 4090 level of GPU power. I'm pretty sure I know which video you're referring to when you say it shows 10 - 20% more FPS with the 9800X3D at 4k, and it's a fake benchmark video. It was uploaded before the 9800X3D released, and it shows the 9800X3D running hotter than the 7800X3D in its tests, which is the opposite of reality. The figures in it are all just made up. I and others commented at that video, pointing-out that it's fake, but I think the uploader hid a bunch of our comments.
Other than for gaming, the X3D CPUs run at much lower temperatures than the non-X3D counterparts, and use much less power (at least the 7800X3D does). I had a 7700X before getting my 7800X3D. But as I like a quiet, cool, and powerful PC all at once, the X3D suits me much better than the non-X3D CPUs.
The 9800X3D has significantly better productivity performance than the 7800X3D, and so that benefit will exist outside of gaming. Even though I already have a 7800X3D and game at 1440p, I'd probably buy a 9800X3D for its increased productivity performance and cooler temps if it weren't for it using significantly more power than the 7800X3D. The 9800X3D still runs around 7C cooler than the 7800X3D while under load, which is great.
Very Nice video HUB team!!!
Steve, thanks for letting Balin have a shot at explaining this subject!!!
Maybe it was the visuals, but this did help me get a better understanding of the CPU testing process.
Excellent video man, told me everything I needed to know to understand this subject in 14 mins!
2:08 Balin channels his inner Steve.
6:33 I would argue that the 9800X3D would still deliver much higher 1% lows at 4K, which would make it a way smoother experience tho.
Sure will. That's the whole point.
I agree, but only if you own a 4090 ...
@@Peter.H.A.Petersen Lets get a 5090 instead :D
@@Peter.H.A.Petersen Even if you have a lower GPU you can still change settings to get there. But a lot of CPUs cannot deliver high FPS. We are in 2024; under 90 FPS is very, very bad. What if you want 144? 180?
@@Peter.H.A.Petersen No, not only if you have a 4090.......
Bro laid down so hard, he gave birth to a son
Bro what. XD
This video made me stop for a moment to look at what kind of Steve we got today... :-D Still a nice one; it does a great job of explaining why testing at 1080p matters instead of 4K. And yeah, while I definitely would love some kind of upgrade-guide type of video, there are just way too many factors with CPU performance, from graphics card used, settings,... to the specific games you play and FPS targets. I think way too often people are thinking about how to get max FPS without considering where the good-enough point is... it really can save you some money. And not everyone is equally sensitive to FPS. Hence why I am still happily gaming on an i5 12400, since I am not really that sensitive to FPS above 60. Not to mention I also got it quite cheap.
Yes thank you for distilling it down to what the point of it is. I know what Steve was trying to say in the last one but I think he got a little too into the weeds and needed to simplify it to a tweet level length.
CPUs hardcap your fps to whatever their maximum is, so finding that maximum with lower resolution testing should be the goal.
However, GPUs can be dynamically adjusted to output more or less fps, so finding the relative scaling between resolutions, RT on/off, etc. is actually useful data.
Well done video ❤! I don't get why this is so hard to understand for some people.
The main thing I take issue with is them saying "A CPU review is not a buying guide". I mean I kinda find new hardware interesting, but I think the reason people get so bent-out-of-shape over the "4k" performance is that they want a CPU review to answer the fundamental question "what is the cheapest CPU I can buy in a GPU limited scenario"? Amazon will eventually ban my account if I order and return all the CPUs to figure out which one is best for my needs -- that's why as a consumer I rely on these channels. So spelling out where the GPU limited line is actually serves a valuable purpose because it lets me know if I need to upgrade or not.
That's true. HU are in the wrong on lots of their comments on this topic, and that's definitely one of the comments they're wrong on. A CPU review is definitely a buying guide. A primary reason, if not the reason people are watching a CPU review is to know whether they should upgrade or not, and they're watching to inform themselves on what the CPU means to them. When a CPU review omits crucial information about what the CPU means to its target audience, then it's falling short as a CPU review.
HUB is basically digging a hole trying to defend themselves when all they needed to say is "some games aren't worth testing at 4K, like sim games" or "sometimes we don't have time to test 4K for new CPUs". Or they could have said "the 1080p results here show potential for the future when you are CPU limited, while this specific game shows that if you play at 4K you won't get any benefits from an upgrade today". And that's actually useful and important. Instead they keep insisting they don't need to do any testing... and that's up to them, to test what they want. People are asking for stuff to be tested because they are wondering if games in general will benefit right away. HUB always does this whenever people point out that they are being narrow-minded. Just say you don't want to test it and be done with it, rather than "everyone is wrong, we are right". It's not a black and white ask. Nobody would be testing 1440p/4K if that was the case. But a ton of youtubers out there do test new CPUs at 1440p/4K. Is it really that hard to simply show a few examples of 4K not scaling, and 1080p scaling? Wasn't that the entire point of why some youtubers tested it? Isn't the entire point of this channel to back up claims with testing? Like, after 3 extra videos, you've proved that point. If you had done that right away at the start, nobody would be complaining.... So in the end they were bullied into doing something people wanted and they didn't want to do, just so they can be like "see we were right". Talk about groundhog day.
@@be0wulfmarshallz Definitely. That's also my exact thought: just say that high-res benchmarking isn't something you personally want to do with your videos, and that's fine. All the BS they say to try to rationalize their choice is the problem - it's misinforming and misleading people, and making the tech space dumber. Their arguments are completely harebrained, rife with falsehoods, and I'm sure include some deliberate deceptions. What they're saying with their videos is that they aren't equipped with good analytical and comprehension skills, but are charlatans and Dunning-Krugers in the tech space.
I wrote this comment for this video. But this is only a small part of it. There's so much more that's wrong with what they're saying on this topic that I can't even see where the end of it would be if I tried to address everything:
As with HU's previous videos on this topic, there are a bunch of things wrong with what's claimed in this video. Aside from the persistent strawman and false dichotomy that people are demanding, or that there must be either low-res testing (academic) or high-res testing (practical), rather than both to give the full picture (both should be in a proper CPU review), one such claim being that GPU-limited CPU benchmarks are useless (which is literally the same fallacy as those claiming low-res testing is useless), and then later reframing it more specifically as high-res testing being useless to determine a CPU's unbound potential. But those two statements aren't the same statements, and presenting the latter as confirmation bias for the former is dishonest, and disinforming. I think that when people make these kinds of logical fallacies it's usually an intentional sleight-of-hand deception. A CPU's unbound potential is only one part of the whole picture of a CPU review and buying decision. There are many other things to know, including what a CPU's potential is in typical-use resolutions that will end up being GPU-limited with all existing GPU hardware.
And CPU reviews are quite literally "upgrade" / buying guides, it's the central point of why they exist in the first place. They're communicating technical and other information about CPUs so that people watching can become informed and figure-out whether it's in their interest to buy one. That's why HU puts price-to-performance charts in their own CPU review videos - otherwise, remove those charts from your reviews. Saying they're not buying guides as an excuse for not including other contextual information that's important to the buying decision-making is both false and a weak excuse. If data informing how a CPU performs in its market's typical-use resolutions is omitted, then it's failing as a CPU review. Part of objectively measuring what a CPU can do is measuring what it can do when placed in its typical-use environments (resolutions) - that data shows what difference the CPU as the isolated factor can actually make in that environment, thus informing prospective buyers whether it's worth buying.
Here's another correction to Balin: providing 4k benchmark data gives actual information about actual CPU performance... when it's in a 4k gaming environment, which it will be for many purchasers. And 4k data with RTX 4090-level GPU performance isn't only informing of systems using that exact hardware and settings, but gives very-useful information to people who have any of various levels of GPU performance. The idea that the result doesn't inform other hardware configurations is baffling, because I can easily translate such data to my system with an RTX 3080 (soon to be RTX 5080). The people at HU seem to be infected with the same disease where they don't understand the meanings of the words that they use, and they try to impose redefinitions of the meanings of words that are already defined in the English language. But things don't work that way.
Another problematic assertion he makes is this: "But it's useless information if you're willing to downgrade visuals in search of higher framerates". That's definitely needing an explanation to back it up, though a good one can't be given because it's entirely false. Someone with a properly-working brain should be aware that turning-down settings will have the effect of increasing the overall FPS. And so native 4k benchmark results also inform people who are willing to turn-down their settings to get higher FPS that they can expect a minimum of the shown FPS, and that the number will go higher as they turn settings down. A person with a bunch of gaming experience probably has a feel for how much additional performance they might get from certain settings, and there exist videos for popular games showing how much a performance impact individual settings make.
Saying that any of a selection of CPUs will work to deliver strong 4k performance makes the point of why it's important information for a CPU review, which is inevitably a buying guide for the CPU. Benchmarks informing people that there isn't a gaming advantage from one high-end CPU to another at higher resolutions is called useful information.
In truth, HU's videos on this topic, including this one, are absolutely rife with what can only be described as Dunning-Kruger rationale, suggesting HU occupy the space of tech reviewers without having technical understanding that matches the role. And by presenting half-truths and falsehoods as authoritative, it's making viewers less informed and worsening the tech information space.
@@thischannel1071 I can't help but agree. I'm getting sick of the hostility toward running benchmarks outside of 1080p. Those benchmarks are only really relevant if you are considering the CPU or GPU alone, in isolation.
Personally I'm running 1440p on a 7800X3D and a 7900XTX, and benchmarks at that point are helpful for making up my mind about the components together. I don't care about the CPU or GPU separately, and I was able to find enough benchmarking of the config to know I was on the right track with my preferred combo, even with my VR rig.
Did you watch the whole video? They quite literally tell you how to figure out where the GPU limited line is for you. What would be the point of turning a CPU review into a 4090-at-4K-native benchmark? Not to mention such data would not be as useful for you if you don't have the benchmarked combo of CPU and GPU.
For some people, it doesn't matter. They'll never understand this concept, no matter how good the video explaining it is (like this one).
Yeah I think even this dumbed down video is too much for those people. They want a 30 second video that conclusively tells them to buy it or not, not a 10-40 minute video that carefully lays out the facts and tells them, "Study these charts and make up your own mind."
Thanks!
Love the 10-15 min length!
TLDW; CPU reviews compare CPUs objectively, they're not personalized upgrade guides.
But it can be, though. If your GPU is capable of running at 150fps in your favourite game with the best CPU, but you are currently running it at 100fps, you want to be looking at a CPU capable of 200fps+ ideally, or at least 180fps in CPU-bound tests.
@@dotjaz And unless you're rocking a 4090, it becomes irrelevant. Meaning you need to do more evaluation of YOUR situation. Once again, if you're happy at 100 FPS, pointless upgrade. If you aren't happy with that, then upgrade with the most accurate info that's useful to you. JFC, this is why you follow more than one channel for tech reviews, because not every single one is going to run the full gamut. May as well do it yourself at that point instead of grasping at straws and whinging.
Whether you can afford it or not is a whole other thing. Not financial advice.
@@SelecaoOfMidas Stop bringing 4090s into everything; those without 4090s can still achieve high frame rates with lower quality settings and upscaling. GPUs are very scalable, unlike CPUs. So there's no point in saying "only people with 4090s can see the difference". It is simply not irrelevant.
@@dotjaz Also, people forget that games are going to become more CPU heavy as time goes on. Horizon Zero Dawn in a completely CPU limited scenario gets 250+ FPS, but its sequel Forbidden West gets only 180. If your CPU can do 150 FPS now, it won't do the same in the future. If I want 80-90 FPS for single player gaming, I'll get a CPU that can do 200 so that it lasts me a really long time, even if I upgrade my GPU.
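A rough sketch of the rule of thumb from a few comments up (the numbers and the 30% margin are assumptions for illustration, not recommendations): compare your GPU's frame rate from a GPU review against each candidate CPU's CPU-bound result, and favour a CPU that clears it with headroom, since CPU demands tend to grow over the life of the chip.

```python
HEADROOM = 1.3   # assumed ~30% margin for future, more CPU-heavy games

def has_headroom(cpu_bound_fps, your_gpu_fps):
    """True if the CPU's cap comfortably exceeds what your GPU can deliver."""
    return cpu_bound_fps >= your_gpu_fps * HEADROOM

your_gpu_fps = 150                          # from a GPU review of your card in your game
candidates = {"CPU A": 160, "CPU B": 210}   # CPU-bound (1080p) review results

for name, fps in candidates.items():
    verdict = "plenty of headroom" if has_headroom(fps, your_gpu_fps) else "little headroom"
    print(f"{name}: {verdict}")
```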
I love how y'all basically regurgitate what's said in this and the longer video it's based on
This channel has been taken over by some rogue element!
Can you test every single game ever made with a 3800X and 6750XT at 1080p and 1440p so that I don't have to understand anything you just said and can see what the performance of my system will be if I upgrade to a 1440p monitor some time in the future? Please?
I like that you clarify and emphasize the "upscaling" - aka lower internal rendering resolution - part. I never really thought about that in terms of CPU bound performance. 👍
Thank you, Orlando Bloom! Great info
how about gamers that don't get it wrong because they DON'T get a 4090?
Impostor? Steve looks kinda sus
I just wanna know how a certain cpu performs at native 1440p with a 4080 super 😢
Like, I understand showing the differences in CPU performance at 1080p, but I wish I could know how much of a difference remains after upping the resolution.
@@mikekostecki2569 It will perform about the same as at 1080p.
@@mikekostecki2569 If upping the resolution makes you GPU bound, the difference gets smaller or even disappears. If not, the difference is the same as it was at 1080p.
@ I ended up buying a 7600x3d rather than a 7800x3d for this exact reason. I haven’t bought a gpu yet for it, but from what I’ve gathered it seems I’ll lose maybe 10 frames at 1440p with a high end card.
More of this guy.
So the 4K native result for the 9800X3D is shown, but what was the 4K native result for the other CPUs? That is the difference, or lack thereof, that I would like to see. Are you saying all 4 CPUs had the same framerate at 4K native?
Yes. All those CPUs had exactly the same frame rate at native 4K resolution. Basically, if you are running 4K native, there is no difference. But, if you do native 1080p, there is a huge difference.
Congratulations on the new tattoo Steve!
A great video; it was really needed after the previous one with the 4K native data.
Now do the same benchmarks at native 1440p so people know what fps to expect with those CPUs and can finally figure out whether they need to upgrade their CPU or not, Steve doppelganger.
Yeah, 1440p is the perfect balance for testing.
Just give your viewers what they want instead of the BS reasons.
Look at old reviews (likely out of date)
These reviews aren't for upgrades or comparisons (despite literally showing comparisons).
You show GPUs in CPU limited scenarios but don't want to show CPUs in GPU constrained scenarios.
Upscaling etc.
Most people that aren't enthusiasts are just going to expect that a 7800X3D is quicker than all other CPUs at gaming by like 40 percent and go buy that, even for a 4K monitor. The average consumer likely has no idea about the relationship between CPU, GPU and resolution. A chart showing this is absolutely helpful, as we all started as novices in this hobby, and the way the channel is coming off so snobbish about it is really disappointing.
The hypocrisy and mental gymnastics that are going on instead of providing your audience with what they want is insane guys.
I've been subscribed since the channel's infancy and watch 9 out of 10 videos you produce. I won't be watching your CPU reviews going forward.
I don't even care all that much if they give high-res benchmark results or not. What gets to me is how false, manipulative, and utterly harebrained their arguments are. If they actually believe the things they say, then they are, to the nth degree, exemplary of the Dunning-Kruger effect. I laid all this out in my long comment in the thread above, so I won't paste it again here.
@thischannel1071 absolutely man. It's been boiling my blood with the approach they've been taking.
Problem is they're too deep in now to change their position as they'll come across as incompetent.
Super disappointing honestly
This also means that if you ARE playing in 4k native/no framegen without a 4090, you can easily stick with a 12600k/7700 :)
Unless you want 10x chrome tabs streaming all the Steve CPU reviews without 4k included because Steve is stubborn.
Every Valorant player: "???"
Exactly, just do both 1080p and 4K testing; TechPowerUp and other review channels have been doing this for years! Which is why I only look at those results.
@@CallMeRabbitzUSVI and what do you learn from, say, a cpu benchmark in The Last of Us at 4K? What information is there, that isn't in the 1080p data?
@@CallMeRabbitzUSVI YOU STILL DONT GET IT OMEGALUL
This is so cool! Love seeing Baylin (sorry don’t know how to spell) in a video, AMAZING! And great job bro 😊
Nice cpu benchmark reading guide👍
You completely missed the criticism of the previous video. Nobody is disputing that the 9800X3D is the fastest gaming CPU, or that a CPU's strength is best shown at low resolution. The problem is that:
a) You've made a dedicated video titled "Ryzen 7 9800X3D, Really Faster For Real-World 4K Gaming" and provided ZERO real 4K benchmarks in it (or even the quality mode), only using "balanced" upscaling (which is significantly more CPU-bound than even QHD, let alone 4K)
b) you selected a comparison with the 7700X and 285K, both of which are productivity-oriented CPUs (Intel also already announced that Arrow Lake's gaming performance is scheduled for fixing in December, so a little odd timing, but I'll give you the benefit of the doubt on this one), as opposed to the obvious choice of the 7800X3D
Combining both resulted in a _technically correct_ but practically misleading video, which made the 9800X3D look like it provides some gains at 4K resolution (which, again, it doesn't - "balanced" mode is not even 1440p)
Indeed, I'm puzzled why 1080p is tested NATIVE but 4K is tested with balanced upscaling. It's as you said: showing one CPU-bound scenario against another CPU-bound scenario.
There are people who want CPU-only data, and there are people who want to see what it's like when CPU and GPU work together (e.g. 1440p native), and neither is wrong. So I think the bigger issue is that people are lumping everyone asking for 1440p / 4K together as not asking the right thing.
a) "real world 4k" - surveys show that people gaming in the real world at 4k are upsampling, becuase very, very few people own a 4090.
b) 1) the Intel CPUs are newly released, and if you're making a purchasing decision right now, you compare the released AMD CPUs against the released Intel CPUs; 2) next month the new Arrow Lake update will do the same - compare the released Intel CPUs against what's available - and will probably have a better showing.
Neither of which changes the fact that if you buy a CPU in October, November or December and (like almost everyone) run an upscaler at 4K, the 9800X3D gives you a major FPS boost over the best Intel CPU in most titles.
@@andytroo The poll is irrelevant because it samples people's current usage, *not* the people who will be upgrading their system. You simply cannot claim that people who have a budget for a 500 USD CPU are going to use it alongside a low-res monitor, or that they will settle for poor-quality upscaled 4K. And even then - 25% of people voted for "quality" settings (as opposed to only 11% for both balanced and performance) - what's the explanation for not using a single comparison at quality settings? "People were asking for 4K testing to be included, let's make a 40 minute video of testing in 1080p and _also pretty much 1080p_ ". As for the 285K, simply put, it and the 7700X are a very sus choice for non-marketing purposes, considering their gaming performance is weak.
@@andytroo The "real-world 4k" survey actually showed that more than double the percentage of 4k gamers game using 4k DLSS Quality than those who use DLSS Balanced or Performance. But HU showed 4k Balanced benchmarks. The surveys also showed that native 1440p is the largest demographic, but didn't show any benchmarks for that. So, they didn't reflect what people actually said they use the most in the surveys.
Also, those surveys reflect all of HU's viewership, and not specifically people buying a 9800X3D. Among purchasers of a 9800X3D, the number of people gaming at native resolution and DLSS Quality setting will be far higher than the average HU viewer, and 9800X3D purchasers will have a much higher average GPU performance level than the average HU viewer. HU's arguments on this matter are mostly nonsense, and supported by half-truths, falsehoods, and misrepresented data.
I was one of the ones that complained about not having 4K native results in the last video. I gotta say it's amazing that you guys hear the community and deliver, or explain why us gamers might or might not be wrong. Thanks for all of the effort!
Except they didn't even listen to their own polls, which show more people game at 4K Quality DLSS than Balanced, but they still chose to show Balanced 😂😂
I'm a simple man: I see 5800X3D in a graph, I press thumbs up.
Well done Baelan!
Also thank you for providing material for educating people on testing methodology, it's easy to forget that people might not fully understand the content.
Great to see you in front of the camera, you're a natural! Hope to see more of you.
We got Steve running in Unreal 5 using a 5090 before GTA 6.
go back to 720p CPU benchmarks. The beatings will continue until morale improves.
Even tho there would be wider differences between CPUs, it's just not relevant enough for today's gamers 🤷🏻♂️
@@mikeramos91 there's also a point of diminishing returns as well. Steve explained why in the last podcast episode. I wasn't paying enough attention to remember the details, but another technical reason comes into play that makes going as far down as 720p a bad idea for today's CPUs.
@@mikeramos91 1080p is really the spot where most ppl play & where it makes the most difference. In some cases 1440p is becoming the new spot, especially with higher RAM speeds coming out
@@mikeramos91 Agreed. 720p used to be the default and 1080p was the intermediate hi-res setting (like 1440p is now). Now, 1080p is the bottom and 1440p is in the middle. So it's only a matter of time before 1440p replaces 1080p, but it's still a few years off at least. Even so, I don't see 1080p CPU testing going anywhere anytime soon.
@@mikeramos91 There shouldn't be wider differences between CPUs when benchmarking at 720p, otherwise HU would be guilty of contradicting everything they argued in defense of their 1080p benchmarks. Their whole point is that 1080p benchmarking should be focused on because it shows the unrestricted performance of the CPUs.
But 1080p benchmarks also aren't very relevant for 9800X3D gamers, as 98%+ of people who buy the top-end CPU won't be using it to play at 1080p. While 1080p might still be the most played-at resolution, that's across all CPUs. When specifically talking about the highest-end CPU and GPU hardware there is, it's a very niche gaming resolution that only very few people (a small portion of competitive gamers) use.
Your own data shows there's a big difference in real world usage (4k upscaled). You claim that we shouldn't upgrade if we don't need to, so why not include real world data in all reviews like you have done to help us make that decision? No one is asking to abandon 1080p benchmarks, just include 1440p and 4k ones (upscaled ideally and native as an extra).
How would a 5800X3D or 7800X3D owner playing at 1440p or 4K decide whether to upgrade without you guys providing this data? We could just use Cinebench if all we wanted was single- and multi-core CPU capability. If the 1440p and 4K benchmarks were truly useless then they would show no improvement - but they do - so it's your own data that disproves your statement.
"How would a 5800x3d or 7800x3d owner playing at 1440p and 4k decide whether to upgrade without you guys providing this data?" - You would ask yourself if you are getting enough fps for you. If you are, then you wouldn't upgrade. If you're not, you would use a monitor to show you how utilized your GPU is in your games.
If it is below 95% utilization most of the time while playing, then you know your CPU is not fast enough. Any CPU review here will then show you how much faster the CPUs perform relative to each other. You get the fastest one you can afford and you will get higher fps numbers. If you really need the numbers, use the percentages to get a ballpark.
(Don't forget you also need to know the maximum your GPU can do. Use the internet to determine that. There is no point in upgrading for 30% if the GPU can only go up another 10% or so.)
If your number is 76 fps and the CPU review says a CPU is 30% faster, then you can add about a third of your fps number to itself and get a ballpark uplift. Then ask, "Is 30% worth paying for a whole new CPU?". I prefer a 50-70% uplift when I upgrade.
You only need to upgrade if your CPU is holding back your GPU and your experience is bad right now.
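To put that ballpark method into something concrete, here is a minimal sketch of the math the comment above describes, using made-up fps and percentage figures rather than anything from HUB's charts; the function name and the min() cap are just the commenter's rule of thumb written out in code.

```python
# Rough sketch of the back-of-the-envelope upgrade check described above.
# All numbers are hypothetical; nothing here comes from HUB's benchmark data.

def estimated_fps_after_cpu_upgrade(current_fps: float,
                                    cpu_uplift_pct: float,
                                    gpu_headroom_pct: float) -> float:
    """current_fps      -- what you measure in your own game right now
    cpu_uplift_pct   -- how much faster the new CPU is in a CPU-bound review (e.g. 30)
    gpu_headroom_pct -- how much further your GPU could go before maxing out (e.g. 10)"""
    # The CPU review raises the CPU ceiling by cpu_uplift_pct,
    # but your GPU caps the real-world gain at gpu_headroom_pct.
    effective_gain_pct = min(cpu_uplift_pct, gpu_headroom_pct)
    return current_fps * (1 + effective_gain_pct / 100)

print(estimated_fps_after_cpu_upgrade(76, 30, 10))   # ~83.6 fps: GPU-limited, upgrade not worth it
print(estimated_fps_after_cpu_upgrade(76, 30, 100))  # ~98.8 fps: the CPU really was the bottleneck
```

The point matches the comment: the review tells you the relative CPU speed, while your own utilization numbers tell you whether that extra speed would actually reach your screen.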
@@boirfanman Or they could just add the 1440p and 4K benchmarks - who the hell has got time for all that? The fact that their own data showed there were differences (some bigger, some smaller) in the 4K benchmarks disproves their point that 4K benchmarks don't say anything.
@@konvictz0007 Wow. Then I agree with Steve. It's not worth his time to cater to people who are this entitled. Use your time to figure out something that benefits you. Not someone else's time.
You fail to see the point Balen is making: 1440p and 4K would only make sense if you also happened to have a 4090. If you don't, and most don't, you would need to look up the latest benchmark of your GPU to see what kind of FPS it could give with the best CPU available anyway. So it doesn't really save time for most consumers.
"decide whether to upgrade without you guys providing this data?" They even said CPU reviews are not an upgrade guide. You are exactly who they are talking about at 12:02 lol
When is the 10800X3D coming out?
Nobody knows
December.
@@SlyNine 2027
5800X3D -> 7800X3D -> 9800X3D -> 11800X3D
Something tells me they are going to move away from that naming and go with Ryzen AI like their laptops...
GJ Balin! Delivered pretty smoothly 👍
Nailed it Balin, I didn't think you would but nailed it completely as I watched the entire video. 😆Well done.
Great job Balin, Steve and Tim would be proud of your work mate 🤭👌 Shh even if they are not, I am totally supporting you 💪💯
11:35 Steve's got some strange logic here… you set everything to highest at 4K with DLSS and frame gen, and if you don't get the desired frame rate you go buy better hardware. It's a bit bad when you have a 4090 with nothing better to buy until the 5090 comes, but it is what it is…
Nah, the logic is sound, it's just not for the everyday gamer with his/her xx60/x600 class GPU, but for the 0.001% of people that have a 4090, willing to sacrifice graphics quality (surely including RT), but not willing to sacrifice 4K resolution or framerate... luckily the 4090 is one of the very few NV cards that has sufficient VRAM, so you won't have to compromise on texture quality...
@andraskovacs8959 I strongly disagree with the VRAM statement; it's easy to go over 24GB in Cyberpunk 2077 with 4K texture mods, and then I get stutters. That's the main reason I can't wait to get the 5090 and its 32GB of VRAM.
@@medovk Well, for some people, nothing's enough. Let me not shed a tear for you. If you run out of 24GB, it's of your own volition; the game does not force you to. I'm fine with it, the 16GB on my RX 6800 will do just fine for me for a long time - you do you. If you're able and willing to spend the big bucks on the "never enough" VGAs, I don't mind, have your fun with them... just don't come to me for validation or empathy over problems that even 99% of 4090 owners (themselves less than 1% of the gamer community) do not have...
@ yeah, things can always be better. 5090 is gonna fix this soon.
*insert Asian Jim meme*
Nice one! Good job on the video, hope it'll help those in need of an explanation.
We could pass through a CPU video synopsis. My cousin Balin would give us a royal welcome!
- Threatens to get fired
- Makes a great video
- Fans' reaction: *surprised pikachu face*
Get him to be a host as well!
If these videos are the only way you're going to show 4k or 1440p results, then please keep making them. As you're talking about how useless the results are, I'm looking at your data and am very interested in this "useless" data.
Of course 1080p benchmarks are *more* useful than higher resolution benchmarks, but knowing where the 'ceiling' is is still very useful information.
The 9800X3D is measurably better than the 7700X on Hogwarts Legacy (on your test bench at high settings) at 4K with DLSS, but not any better without DLSS. Great! That's useful information I wouldn't have known otherwise!
Yes, I want to compare apples to apples, and it's very important info
Still, you shouldn't use Steve's CPU bottleneck data to see if you want to upgrade your CPU. You should really only use Steve's data to see how CPUs compare to each other.
You can figure out how much better the 9800X3D is than the 7700X at 4K WITHOUT testing. I don't know how, but I'm being told you can! If someone can tell me, let me know; I want ACCURATE numbers, not guesswork BTW.
@@sykusyku Yes, I'm trying to decide if I want to go to a 4K monitor, so I need those results. If a game is getting 75 fps at 4K (with said CPU) then I'm not doing it, but if we're getting 150 fps then I'm making the switch. Why are we trying to change this 30 years in? Blows my mind
@@sykusyku Oh, that's easy. The 9800X3D is as much better than the 7700X at 4K as it is at 1080p. The main reason you get fewer frames in most benchmarks, and in real life, when you run at 4K is that your GPU is the bottleneck.
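For anyone who finds this easier to follow spelled out, here is a tiny sketch of the bottleneck model that reply is describing, with made-up numbers rather than figures from any HUB chart: whichever limit is lower, CPU or GPU, sets your frame rate.

```python
# Minimal sketch of the CPU/GPU bottleneck model from the reply above.
# cpu_fps: what a CPU can push with the GPU out of the way (a 1080p-style CPU benchmark).
# gpu_fps: what the GPU can push at your resolution with a fast enough CPU (a GPU review).

def expected_fps(cpu_fps: float, gpu_fps: float) -> float:
    # The slower component sets the frame rate you actually see.
    return min(cpu_fps, gpu_fps)

# Hypothetical figures, not HUB data:
print(expected_fps(cpu_fps=160, gpu_fps=90))   # 90  -> GPU-bound at 4K, both CPUs look "equal"
print(expected_fps(cpu_fps=120, gpu_fps=90))   # 90  -> still 90, the CPU gap is hidden
print(expected_fps(cpu_fps=120, gpu_fps=150))  # 120 -> a faster future GPU exposes the CPU gap
```

That is also why the 1080p numbers carry over: the CPU-side limit doesn't change with resolution, it just stops being the thing you bump into once the GPU becomes the slower part.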
Doesn't matter; with DLSS being a factor, 4K benchmarks should be included.
Hope to see more Balin. It’s nice having another host to bring some freshness to the channel.
meanwhile I laugh in 1440p with a 5700X / 4080 and hope my system lives long enough to reuse parts of it when the time for an upgrade comes 😅
Because you are smart and not a consoomer
@@pedropierre9594 thanks * ^^ * tbh my thought process was "a GPU cost me way more than a motherboard, RAM and CPU", and my last comp ran for 13 years with a really low-end GPU (a Palit GTX 660 OC, 15 years ago at this point), so I wanted to avoid that. I mostly play singleplayer entertainment games, not PvP stuff, so graphics was always my main concern, and I assumed the 4080 would serve me well for a long time ^^
Thank you for sharing this, and thank you for admitting that if you play at 4K, spending 500 bucks is not worth it
I feel like the people complaining about CPU testing methodology have not finished high school, which could be the case for a lot of your viewers. Appreciate the effort you guys take to educate your viewers!
They've clearly never heard of "controlled variables" lol. HUB would never make their videos 5x longer just for inconclusive results on CPU performance.
Honestly, when I watched Steve's video I thought no one was going to watch this one till the end... and the end was the part that really made his points the clearest imo...
This video is good and I hope it helps people understand how to use these charts better to suit individual needs, instead of nitpicking about your very clearly robust and effective means of testing and your very useful presentation of the data. You have vids for individual GPUs using the best CPUs on the bench, and CPU tests with the best GPUs, which gives us usable data on how each individual part can perform in a best-case scenario; that shows us exactly where to find a bottleneck and how it can vary across a multitude of games. What you guys do is no small feat, and I feel for Steve and the whole crew when people seem to just get upset that you don't showcase their exact setup or scenario. But many of us appreciate all the little details more than you know, so keep up the good work you guys!
I fell for that trap when I built my PC. At the time, the 5800X and 5800X3D showed little difference at 4K, so I bought the 5800X. Fast forward a few years, and I ended up regretting it. Newer games were more CPU-demanding, and the 5800X struggled to maintain 60fps in games like Jedi Survivor and Starfield. The frustrating part is that there was barely any improvement in frames from turning down game settings. Other games struggled with poor 1% lows too (Forza Motorsport). Lesson learned: if there's a 20% gap between CPUs at 1080p, the gap also exists at 4K. It might not be noticeable at 4K in today's games, but it's there, and it will show up in the future as GPUs improve and games get more CPU-demanding.
Yeah; we want to know the absolute maximum number of frames a CPU can kick out.
I kept expecting Steve to do a drive-by... slowly walking behind Balin while looking menacingly at the camera, grabbing a random thing off the shelves without looking, and slowly walking back out without ever breaking eye contact