Ryzen 7 9800X3D, Really Faster For Real-World 4K Gaming?

  • Published 21 Dec 2024

Comments • 4.6K

  • @Hardwareunboxed · months ago · +825

    I'd also like to thank Gamers Nexus Steve who wrote this for me, but we didn't end up adding it to the video. So please watch their review to support them if you haven't already: th-cam.com/video/s-lFgbzU3LY/w-d-xo.html
    From Gamers Nexus:
    We currently run tests at 1080p and 1440p for CPU gaming benchmarks, though we mostly rely on 1080p results for comparison. Although we didn't bother for the 9800X3D review, we typically publish 1-3 1440p charts in games that are still mostly CPU-bound for perspective.
    There are a lot of ways to approach reviews. We view bottleneck testing as a separate content piece or follow-up, as it also starts getting into territory of functionally producing a GPU benchmark.
    What matters is a consistent philosophy: Our primary philosophy is to isolate components as much as possible, then as standalone or separate feature pieces, we run 'combined' tests that mix variables in ways we wouldn't for a standardized review. For us, reviews are standardized, meaning all parts (more or less) follow the same test practices. Introducing more judgment calls introduces more room for inconsistency in human decision making, so we try to avoid these wherever possible to keep comparisons fair. Choosing those practices is based upon ensuring we can show the biggest differences in components with reasonably likely workloads.
    A few things to remember with benchmarks that are borderline GPU-bound:
    - You can no longer fully isolate how much of the performance behavior is due to the CPU, which can obscure or completely hide issues. These issues include: poor frametime pacing, inconsistent frametime delivery, in-game simulation time error due to a low-end CPU dropping animation consistency despite good frame pacing, and overall quality of the experience. This is not only because it becomes more difficult to isolate if issues such as micro stutters are caused by the CPU or GPU, but also because the limitation may completely sidestep major issues with a CPU. One example would be Total War: Warhammer 3, which has a known and specific issue with scheduling on high thread count Intel CPUs in particular. This issue can be hidden or minimized by a heavy GPU bind, and so 4K / Ultra testing would potentially mean we miss a major problem that would directly impact user experience.
    - Drawing upon this: We don't test for the experience in only that game, but we use it as a representative of potentially dozens of games that could have that behavior. In the same example, we want that indicator of performance for these reasons: (1) If a user actually does just play in a CPU bind for that game, they need to know that even a high-end part can perform poorly if CPU-bound; (2) if, in the future, a new GPU launches that shifts the bind back to the CPU, which is likely, we need to be aware of that in the original review so that consumers can plan for their build 2-3 years in the future and not feel burned by a purchase; (3) if the game may represent behavior in other games, it is important to surface a behavior to begin the conversation and search for more or deeper problems. It's not possible to test every single game -- although HUB certainly tries -- and so using fully CPU-bound results as an analog to a wider gaming subset means we know what to investigate, whereas a GPU bind may totally hide that (or may surface GPU issues, which are erroneously attributed to the CPU).
    One thing to also remember with modern 1080p testing is that it also represents some situations for DLSS, FSR, or XeSS usage at "4K" (upscaled).
    A great example of all of this is to look at common parts from 4-5 years ago, then see how they have diverged with time. If we had been GPU-bound, we'd have never known what that divergence might be.
    Finally: One of the major challenges with GPU-bound benchmarks in a CPU review is that the more variable ceiling caused by intermittent GPU 'overload' means CPU results will rarely stack up in the hierarchy most people expect. This requires additional explanation to ensure responsible use of the data, as it wouldn't be odd to have a "better" CPU (by hierarchy) below a "worse" CPU if both are externally bound.
    We still think that high resolution testing is useful for separate deep dives or in GPU bottleneck or GPU review content.
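    To make that last point concrete, here's a toy sketch of a GPU-bound ceiling (all numbers are made up for illustration, not from our testing):
    ```python
    # Observed FPS is roughly the lower of what the CPU and the GPU can deliver.
    # Near the ceiling, intermittent GPU 'overload' adds run-to-run noise, so a
    # "better" CPU can land below a "worse" one when both are externally bound.
    import random

    random.seed(1)

    cpu_caps = {"CPU A (faster)": 220, "CPU B (slower)": 180}  # hypothetical 1080p caps
    gpu_ceiling = 150  # hypothetical GPU limit at 4K/Ultra

    for name, cpu_cap in cpu_caps.items():
        runs = [min(cpu_cap, gpu_ceiling + random.uniform(-8, 8)) for _ in range(5)]
        print(f"{name}: {sum(runs) / len(runs):.0f} fps average at 4K/Ultra")
    ```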

    • @krazyfrog · months ago · +30

      I don't know why this even needs to be explained in 2024

    • @SomeoneThatDoesntCare · months ago · +15

      I think most people "should" understand that 1080p benchmarking allows you to compare CPUs. But people also need a way to figure out the best way to spend their money, which is why a quick average-PC (aka RTX 4060, 1440p-ish) benchmark at the end would be useful. Because the reality for most people, especially if you need a new mobo/RAM etc. as well, is that you're almost always going to spend your cash on a GPU upgrade vs a CPU.

    • @jouniosmala9921 · months ago · +8

      But I complained about not using 720p. :D

    • @Y0Uanonymous · months ago · +3

      On the contrary, you should do 1080p testing with lower graphical settings and some upscaling. That's where most people end up before buying a new PC.

    • @A2409S · months ago · +10

      I've ordered a 9800X3D, I game at 4k with a 3090. So, I would most certainly be GPU bound in the benchmarks above. However, I don't play FPS but I've found the biggest benefit of the X3D on my current 5800X3D to be late game performance. Many games with really high unit counts or lots of different things going on late game slow down significantly, sometimes to the point of it becoming frustrating to play. This is where I've found the X3D to be a huge improvement. I would love to see this late game performance being tested somehow, if only to confirm my theory or prove me wrong.

  • @FatetalityXI · months ago · +1524

    Surely the 5090 is going to show how much more headroom the 9800X3D has over other CPUs, even at 4K.

    • @Hardwareunboxed · months ago · +574

      Yep it will and I'd imagine you'd keep the 9800X3D for another generation as well.

    • @Shini1171 · months ago · +44

      @@Hardwareunboxed Isn't it the same case with the 7800X3D? Is it worth upgrading from it?

    • @darreno1450 · months ago · +57

      I was lucky to pick up a 9800X3D, but it's just a steppingstone to the 9950X3D.

    • @aapee565 · months ago · +150

      @@Shini1171 I would say there are better uses for money than a 10% increase in CPU performance. Such as wiping your ass with dollar bills etc :D

    • @thewhiteknight9923 · months ago · +26

      @@Shini1171 No.
      The 9000 series might be the last new generation on AM5. 10000 might happen but the chances seem small imo. The 7800X3D is plenty good for the next few years.

  • @HoodHussler · months ago · +1726

    Where r my 360p and 720p benchmarks at?

    • @YothaIndi · months ago · +157

      140p or bust

    • @HoodHussler · months ago · +67

      @ absolutely! Some of us still use CRTs, so where are our invitations to the party?

    • @themarketgardener · months ago · +83

      I’d honestly take a 720p benchmark for CPU testing ngl 😂

    • @kingplunger1 · months ago · +24

      @@YothaIndi not 144p?

    • @YothaIndi · months ago · +33

      @@kingplunger1
      Nah, that extra 4p will reduce the score dramatically 🤣

  • @SgtRock4445 · months ago · +871

    I watched this video in 480p so that I get the most Steves per Second.

    • @YothaIndi · months ago · +46

      If you watch HU & GN at 144p simultaneously you obtain the highest SPS possible

    • @DrMicoco · months ago · +20

      watch in 72p and userbenchmark will just pop out of your screen XD

    • @kosmosyche · months ago · +16

      I find Steve to be very CPU-bottlenecked.

    • @nipa5961 · months ago · +2

      @@YothaIndi That's cheating!

    • @BladeCrew · months ago · +3

      I am watching Steve in 1440p 60fps on a 1080p 144Hz display.

  • @zodwraith5745 · months ago · +57

    This discussion will never end because both sides have a point.
    2:40. This is a good point that I think gets missed a lot. When you focus on competitive gaming that's tuned to be lighter on GPUs then CPU performance at 1080p is going to be FAR more important. But if you _don't_ play just competitive games what people are asking for is knowing _where_ those diminishing CPU returns are BEFORE we buy new hardware. Of course I can test my own system to see where a bottleneck occurs but how am I supposed to magically know that if I _DON'T_ own the hardware yet?
    Anyone that's asking for 1440 and 4k and thinks 1080 is useless is a f'ing idiot so don't lump all of us in with them. What the reasonable ones of us that want more than just 1080p are asking for is _exactly_ what GN does in your pinned message, just throw in ONE "don't overspend on your CPU" bench to let people see _where_ a lot of games slam into a really low GPU bottleneck. Even better if you throw in a 4070 or 6750xt because if you hit that bottleneck with a _4090_ at 1440p? That's a minefield for anyone with a midrange GPU, and you _still_ only used 4090 testing which completely ruins your claim this is "real world" testing when the vast majority of people DON'T own a 4090. The ones that do already have a 9800X3D on order so they aren't watching this anyways.
    We aren't stupid. We KNOW 1080p is the best option for CPU testing and expect it to be prevalent in CPU reviews. The issue is it's ONLY good for CPU vs CPU testing and some of us WANT to know where those GPU bottlenecks kick in. I think Daniel Owen touched on this best. Reviewers are laser focused on the testing of CPUs only, but some of the viewers are looking for real world buying advice when they're watching reviews.
    We're not challenging the scientific method, we're asking for a convenience instead of having to collect the data of GPUs ranked with only $500 CPUs at max settings, CPUs ranked with only $2000 GPUs at minimal settings, then trying to guess where a midrange GPU and midrange CPU will land with midrange settings. There are almost NO reviewers out there that give you this content, except of course Daniel Owen, who will sit for an hour and bounce through all kinds of settings and different CPUs and GPUs. But that's only helpful if he features the game you're interested in.

    • @mikenelson263 · months ago · +9

      I really hope these channels understand this. The testing is important at base, but it does not do a very good job translating to buying decisions. With the amount of product review cycles and the number of channels involved, it would have been helpful to have built a predictive model by now. Rather than dump all these data points on the review screen, give the review presentation ample time to query the model for best advice for real world decision-making scenarios.

    • @timcox-rivers9351 · months ago · +6

      I was nodding along with what you were trying to say until you got to Daniel Owen: he does try multiple settings, and it's still not enough information because it's not the specific game you want him to test. It sounds like no review is going to give everyone what they want, b/c what people really want is for the reviewer to pick their favorite game, their current CPU and their current GPU, and then show a direct comparison of the exact same game with the exact hardware they have on pcpicker. That's ridiculous.

    • @Mathster_live · months ago · +6

      Both sides have a point? The side where they're asking for irrelevant and redundant testing at higher resolutions when there's a GPU bottleneck?
      Even a 5600X is enough for most users; anything above that at higher resolutions and you're GPU bottlenecked. I don't get why it's so difficult to understand: a better CPU will stay relevant and scale well across multiple GPU generations. The 1080p results tell you that if there were no GPU bottleneck you would get that FPS at ALL resolutions, so they clearly tell you how long your CPU will stay relevant throughout the years, if you remember what the FPS cap was without a GPU bottleneck.
      It's surprising how many people like you will find a middle ground between reviewers and people who are clearly misinformed; they don't have a point, they're just wrong and are asking for more tests that waste time.

    • @zodwraith5745 · months ago · +1

      @@timcox-rivers9351 Well obviously that's not what I'm saying, but even if it's _not_ your specific game what other reviewer do you know that goes through as many settings, resolutions, and hardware configurations as him? Even half?
      I literally can't name anyone else that will show you a benchmark with a midrange GPU, midrange CPU, 1440p at 5-6 different settings and upscaling. And that's still a LOT more info of where bottlenecks can pop up than anywhere else.

    • @zodwraith5745 · months ago · +3

      @@Mathster_live Because it shows you _where_ the GPU bottlenecks occur. If you can only see CPUs tested with a $2000 GPU at low settings, and GPUs with a $500 CPU at max settings, WTF does that tell you about a mid range CPU and midrange GPU at midrange settings? You're left guessing between 2 completely different numbers.
      If we get just *_ONE_* "don't overspend" benchmark at 1440p with a GPU that ISN'T the price of a used car if you see a bottleneck you know, "ok, you need a stronger GPU for this CPU." If you don't see a bottleneck you know "OK, this CPU I'm looking at right now is totally worth it!" SO much shit is hidden in the middle.
      Besides this video has heavily skewed results to begin with because he only showed heavily upscaled 4k "balanced" meaning really only 1254p, and he only used a 4090. He did everything to _STILL_ make the GPU not the bottleneck. How is that "real world?" Does everyone you know own a 4090? Cause I only know a couple.
      Yeah this video _was_ a waste of time but only because he purposely didn't show what people were asking for. Not to mention if you looked at his pinned comment Steve from GN *_DOES_* mix in a few 1440p slides to let you know where bottlenecks can kick in fast, and he _doesn't_ upscale the shit out of it either. So if GN does it, why doesn't benchmark Steve do it?

  • @trackgg586 · months ago · +442

    1. This shows what kind of beast the 5800X3D was, and still is.
    2. It proves your point, obviously.
    3. It may be anecdotal, but I moved from a 3600X to a 5800X3D on a 3080; while my FPS in CP77 was not significantly affected at 1440p UW, the 1% lows jumped by roughly 50%, greatly increasing stability and getting rid of any stutters. That's also a thing to consider, outside of raw FPS performance.

    • @christophermullins7163 · months ago · +35

      This is especially relevant for 40 series and raytracing. It increases CPU load and creates MUCH worse percentile lows. Faster CPUs always help. Basically almost always. Some games will see no difference but that is rare.

    • @renereiche · months ago · +19

      I don't think your third point is anecdotal; spikes are just far more pronounced when CPU-limited, it's widely known. And beyond that, I think Steve disproved his own point in this video with many of the included 4K DLSS gaming tests, where there was a significant difference. Imagine upgrading from a 3080 to a 4090 to get a 50% higher framerate, which costs you $1,600, while not upgrading the CPU for ~$480 to get an additional 30% performance in quite a few titles at 4K DLSS (~1440p native). That doesn't make sense financially, and with only 1080p tests, being told that there is no uplift at 4K (native), people wouldn't know that they are leaving so much on the table by not upgrading their 4-year-old non-X3D CPUs.
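      Back-of-the-envelope on those numbers (a rough sketch; the ~$480 CPU price is an assumption, roughly the 9800X3D's MSRP, and the uplifts are the ones quoted above):
      ```python
      # Dollars per percent of framerate gained, using the figures from the comment.
      gpu_cost, gpu_uplift = 1600, 50  # 3080 -> 4090, ~50% higher framerate at 4K
      cpu_cost, cpu_uplift = 480, 30   # non-X3D -> X3D, ~30% in CPU-heavy 4K DLSS titles

      print(f"GPU path: ${gpu_cost / gpu_uplift:.0f} per 1% gained")  # ~$32
      print(f"CPU path: ${cpu_cost / cpu_uplift:.0f} per 1% gained")  # ~$16
      ```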

    • @scottbutler5 · months ago · +11

      Jeff at Craft Computing tested higher resolutions for his 9800x3d review, and found the same thing - average FPS was pretty close to other CPUs at higher resolutions but the 1% lows were significantly higher.

    • @RobBCactive · months ago · +5

      @@trackgg586 Once again this video showed the value in the 5800x3D bought near its floor price in extending the useful life of the extremely solid AM4 Zen3 platform.

    • @lczp · months ago

      Hello sir, I'm thinking about a similar upgrade, from the R5 3600 to the 5700X3D. Outside of CP77, would you say the upgrade was worth it? Did you gain more avg fps overall, disregarding the more stable 1% lows?

  • @orangejuche · months ago · +775

    I'm glad that Steve is getting some rest for his poor feet.

    • @michahojwa8132 · months ago

      You know where this is going: do tests with OC/UV, medium-high details, FSR/XeSS/DLSS Performance and path tracing, because this is realistic 4K and the 4090 is a 1080p card.

    • @DragonOfTheMortalKombat · months ago · +5

      Thank 9800x3D

    • @lightofknowledge713 · months ago · +8

      It's really sad that he took an Arrow Lake to the knee 😢

    • @jacobgames3412 · months ago · +2

      @@lightofknowledge713 very sad 😢

    • @Beardyabc88 · months ago

      Surprised he wasn't standing up haha

  • @matthewhodgson2008 · months ago · +6

    Would've liked to see the 5800X3D in the charts, that's the chip a lot of people are waiting to upgrade from.

  • @SAFFY7411 · months ago · +488

    Thanks Steve.

    • @VGJunky · months ago · +16

      Thanks Steve.

    • @Haro64 · months ago · +14

      Thanks Steve.

    • @MyBestBuddiesForever · months ago · +11

      Thanks Steve.

    • @lilvegee · months ago · +11

      Thanks Steve.

    • @TheHighborn · months ago · +10

      Thanks Steve

  • @RafitoOoO · months ago · +449

    He's not standing so this might be good.

    • @Hardwareunboxed · months ago · +163

      It's neither good nor bad :D

    • @xRaptorScreamx · months ago · +98

      @@Hardwareunboxed then you should've leaned on the desk :D

    • @1Grainer1 · months ago · +72

      @@xRaptorScreamx True Neutral Steve, floating in the middle of the screen

    • @GalloPhilips · months ago · +6

      @@Hardwareunboxed 😂

    • @ReSpAwNkILeR · months ago · +7

      @@Hardwareunboxed it's the in-between version

  • @keithduthie · months ago · +451

    Thanks Steve. I have two hobbies - bitching _and_ moaning.

    • @worthywizard · months ago · +90

      You should try whining, it’s often overlooked

    • @__-fi6xg · months ago · +33

      i feel like crying like a little girl is the hardest one to master

    • @thepadonthepondbythescum · months ago · +20

      Don't forget being a victim of Corporate Marketing!

    • @marktackman2886 · months ago · +3

      Steve is waiting for these people like a bald eagle perched on a branch, waiting to strike. I would know... I've happily been a victim

    • @n8spL8 · months ago · +2

      i go for ranting and raving personally

  • @BigPeter93 · months ago · +12

    1080p IS and SHOULD be the standard for CPU testing. HOWEVER, I really enjoy the 4K benchmarks as well. You said it yourself, none of the mainline tech reviewers are showing 4K benchmarks. This forces the part of your audience who want 4K CPU benchmarks to go to less than reliable sources. Wouldn't it benefit everyone involved to offer it on the slide during your reviews?

    • @thorbear · months ago · +2

      You have been given misleading information. It is not true that none of the mainline tech reviewers are showing 4K benchmarks, and that's not what he said, although it is clearly what he wanted to say.
      LTT tested at 4K, and 1440p, and 1080p Ultra, probably the most comprehensive review available.

    • @CallMeRabbitzUSVI · months ago · +2

      @@thorbear Yep, LTT is the only review of the 9800X3D that made sense

    • @7xPlayer · months ago · +1

      "4K CPU benchmark review" This whole video is about how such a thing doesn't exist, cause of the GPU bottleneck in GPU bottlenecked scenario, or useless, the game is not GPU bottlenecked, so now you're on a wild goose chase, is it the cpu? if so, the 1080p test would show that anyway, or is it the memory? or the game itself? or or etc. The variables are mixed together and now you have mixed the CPU variable in, which is a big variable that hides the other variables, so no meaningful conclusion can be drawn cause the details are unclear.

    • @vitas75 · 28 days ago · +1

      ​@7xPlayer so you're saying I should just buy a ryzen 2600x for my 4090 4k rig?

    • @morpheus_9 · 9 days ago

      @@vitas75 Obviously not. You can easily run a 5600X though.

  • @RadialSeeker113 · months ago · +228

    @Hardware Unboxed Sim games and city builders seem to benefit the most. Games like Anno 1800 and MSFS are vital for determining just how far a CPU can push, and they're missing here. On average a 9800X3D is about 50-60% faster than a 5800X3D without GPU restrictions, which is completely insane.

    • @rasteek · months ago · +6

      THIS!

    • @themarketgardener · months ago · +15

      Tarkov benefits from this too because of poor optimization🫠

    • @Hardwareunboxed · months ago · +87

      They are just other games that can be used to measure this. We saw plenty of examples in our 9800X3D review where the 9800X3D was 50-60% faster than the 5800X3D.

    • @ValenceFlux · months ago · +5

      I get about 60-90 fps in Starfield at 4K on a 5900X / 4070 Ti.
      I get a little less in Cities: Skylines 2 and a lot less in the first game.

    • @kognak6640 · months ago · +21

      I would be very interested to see Cities: Skylines 2 tested properly. The CPU scaling is massive with it; 64 cores is the technical ceiling, the game can't use more. This is absolutely unique in the gaming world, nothing comes even close. However, framerate is not directly tied to CPU performance; it's the simulation side that slows down if the CPU can't handle it (when the city grows larger). First you lose the faster speed settings, then everything just moves slower and slower. I downloaded a 500k-size city and tested it on my 5800X3D. Framerates stayed the same as always but the simulation ran at only 1/8th speed. It's like watching a video at 1/8th speed, simply not playable. Because a 100k city runs very well, I'd say 150k is the max city size for a 5800X3D. Basically you could find out how big a city a particular CPU can handle, at least on the slowest speed setting (1x). No one has done it.
      Btw, if any Cities: Skylines 2 player is reading this and wondering what CPU to get, just buy one with the most cores you can afford. But because no one has made these tests, it's really difficult to say if the AMD or Intel approach is better. The 9950X is probably the safest bet for the best CPU in the consumer space.

  • @mihaighita8553 · months ago · +51

    I think one of the best use cases for the 9800 x3d should be MS Flight Sim with all the streaming/decompressing going on. Maybe you could add a comparison for that one too.

    • @andreiavasi7600 · months ago · +1

      My 9800X3D arrived today for that exact game (MSFS) :). It's keeping my 4070 Super at a 40% bottleneck on airliners, with ground and buildings turned way down since those kill the CPU. Plus I can't use frame gen or DLSS because then I get stutters.
      So I can't wait to plug this baby in. Plus MSFS 2024 will optimize for multicore even more, which is great for the next GPU upgrade, so I can avoid splurging on a CPU again.

    • @Hussar-fm8iy · months ago · +5

      9800x3d destroys 285k in MS flight sim even in 4k.

  • @aiziril · months ago · +28

    Saying that it's not about raw performance, but more about having headroom when and where you need it (which depends on the game played) is a really smart way to explain this.

    • @imo098765 · months ago · +3

      When you have enough CPU it's amazing.
      The moment you hit that CPU wall it's lower fps and horrible 1% lows. It's a stuttery mess.

    • @FurBurger151 · months ago

      @@aiziril Car salesman talk if you ask me.

    • @costafilh0 · 20 days ago

      No it's not. A really smart way would be to do more testing so we can see where the bottleneck actually is. But that's too much work, I guess.

  • @bool2max · months ago · +31

    Why test with upscaling @ 4K when we know that the internal resolution is closer to 1080p, for which we already know the results, i.e. that the newer CPU is faster?

    • @IK47-d2l · months ago · +6

      Yeah I don't understand the point of choosing 4K Balanced Upscale

    • @kyre4189 · months ago · +10

      @@IK47-d2l because Hardware Unboxed once again failed as a review channel and this video is just a copout to shift the blame to the viewers. If your review needs a separate video because it confuses the viewers, and if the separate video once again confuses the viewers and creates more questions, it's bad.

    • @IK47-d2l · months ago · +6

      @@kyre4189 yeah very disappointing from Steve at the beginning to speak to viewers as if they're some small kids who don't get it.

    • @lour3548 · 24 days ago · +2

      With all of the useless, upfront tech-splainin', you'd think he could figure out how to test high-res scenarios. I suspect he just wanted to point at "high-res" graphs that looked similar to his 1080p graphs.

    • @CosmicApe · 9 days ago · +3

      Because if they did this test on an actual native 4K monitor it would show that CPUs matter so little for 4K gaming. There'd be next to no difference at 4K between the 5800/7800/9800X3D. But do it with 4K upscaling and you render at 1080p and keep it CPU bound so you can magnify false differences.

  • @giantnanomachine · months ago · +6

    Thank you, this is a video I have been really hoping for from one of my trusted review channels. Seeing how CPUs hold up (or don't) over HW generations is extremely helpful for me, since when I build my systems I usually upgrade the GPU once or twice while sticking with the same mainboard and CPU. Seeing what one given CPU or another can achieve with a 4090 (which I wouldn't buy due to price) at 4k with high quality settings today is a valuable indicator for me since in a couple of years that will be what it can achieve with a 7060Ti or 7070.

  • @MrHC1983 · months ago · +22

    Upscaling is not NATIVE 4K brother....... sorry but this whole video is a wash.

    • @iliasguenou4930 · months ago · +5

      It increases cpu load which is why he used it. Testing native 4k is a gpu test

    • @Blaze72sH · months ago

      Why would you care about a gpu bound test in a cpu test video?

  • @SchmakerSchmoo · months ago · +40

    One minor anecdote from the "vocal minority" that I think may have been missed is how low resolution benchmarks are used to justify "over buying" the CPU on a mid range build. Someone will be doing a build for ~$800 and you'll see tons of comments about "7800X3D is 50% faster - you must get it!" but these comments ignore the budget of the build and the fact that opting for such a CPU means significantly downgrading the GPU.
    A 7800X3D + 4060 is going to be a lot worse than a 7600X + 4070S in most titles.
    It is misinterpreting the graphs on both sides but only one side seemed to have gotten called out here.

    • @memitim171 · months ago · +6

      If someone is using 1080P benchmarks to justify over buying a CPU they are doing it wrong and that isn't Steve's fault. All they can do is provide the data, all data can be misused or misrepresented and I think most of us would rather they just continued giving us the data than attempting to tackle blatant stupidity, which is always a race to the bottom.

    • @Coolmouse777 · months ago · +3

      Just look at GPU benchmarks and compare 2 numbers, it's not hard to do.

    • @Skeames1214 · months ago · +5

      @@Coolmouse777 The number of people who are behaving like children is a little disturbing. “Why aren’t you a repository for every benchmark I could ever want???? Where is my favorite graph????”

    • @Stars-Mine · months ago

      gpus you just swap out in 3 years.

    • @_Leouch · 25 days ago

      You must understand that not everyone is playing games like Wukong or Call of Duty. Some play games like Cities: Skylines or Paradox strategy games, where a strong CPU is far more important.

  • @KimBoKastekniv47 · months ago · +54

    I fully agree that CPU testing should be done at 1080p, but I can't help but wonder why 4K balanced was chosen to prove the point. The input resolution is closer to 1080p than it is to 4K, why not quality mode?

    • @Skeames1214 · months ago · +4

      Quality mode would still be closer to 1080p than 4k, and it was the middle option between the two more popular options (Performance and Quality) in the poll. People should be able to work out that scaling will increase if you choose Performance, and decrease if you choose Quality.
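      For reference, the internal render resolution is just the output resolution times the mode's scale factor (approximately 0.667 for Quality, 0.58 for Balanced, 0.5 for Performance), e.g.:
      ```python
      # Internal render resolution per DLSS mode at a 4K output.
      modes = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}
      out_w, out_h = 3840, 2160

      for mode, scale in modes.items():
          print(f"{mode}: {round(out_w * scale)} x {round(out_h * scale)}")
      # Quality -> 2561 x 1441, Balanced -> 2227 x 1253 (the "~1253p" figure
      # mentioned elsewhere in this thread), Performance -> 1920 x 1080.
      ```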

    • @esportsfan2118 · months ago · +28

      Because they don't actually want to show real 4K or 1440p results and only want to stick to 1080p. They pretty much made this video out of spite for the comments and are trying to justify not showing 1440p/4K results in their CPU reviews.

    • @toddsimone7182 · months ago · +12

      Should have done 4k native

    • @enmanuel1950 · months ago · +4

      @@esportsfan2118 There's a very clear example in this video showing that if you test at 4K (yes, even 4K Balanced) with 3 CPUs (5800X3D, 5800X and 3950X), you get the same result for all 3 of them. Even though we know the 5800X3D is significantly faster than the other 2 and will achieve considerably more performance not only at 1080p but also at 4K, if you pair it with a GPU capable of keeping up with it.

    • @esportsfan2118 · months ago · +6

      @@enmanuel1950 yes, and it would be just like that with the 5800X3D, 7800X3D and 9800X3D if they showed NATIVE 4K results. But they don't want to show people that there would be no point upgrading if u play at 4K native...

  • @evanractivand · months ago · +74

    Got the 9800X3D and no regrets so far. I play on 4K usually with DLSS quality or balanced to boost frames on my 4080, and it has significantly improved things. MMO's and games like Baldur's Gate 3 now don't have any CPU bottlenecks, it's made a pretty large difference to my average framerates and 1% lows in a number of CPU heavy games I play. I came from a 12700K that I thought I wouldn't get much benefit upgrading from, but boy was I wrong.
    At the end of the day you need to look at the games you play and figure out how much money you want to throw at your PC to get a level of performance you're happy with.

    • @worldkat1393 · months ago · +1

      Been wondering if I should go from a 7700X to it, for MMOs in particular.

    • @evanractivand · months ago · +8

      @@worldkat1393 Yeah I tested it in New World the other day, previously on my 12700K it would drop to 50-60 FPS in towns with high population with a fair bit of stutter. On the 9800X3D the stutter is all but gone and it averages 90-100 FPS. So for CPU bound MMO's, it's a big diff.

    • @5ean5ean22 · months ago · +9

      ​@worldkat1393 once you go 3d, you don't go back.

    • @laxnatullou · months ago

      Hi! What board do you use?

    • @BrawndoQC · months ago

      Lol, like 2 FPS.

  • @atariplayer3686 · months ago · +9

    Thank you Steve for the awesome benchmark & all the hard work you have put into this video 😊👌

  • @Zemla · months ago · +50

    I haven't seen the video yet; I have an RTX 4080 playing at 4K.
    I switched to a 9800X3D from a 5700X and it EXTREMELY helped me.
    Almost no stutter where before it was nearly unplayable, higher FPS, minimized fps drops etc.
    Worth it, money well invested!!!!
    I should have done the upgrade much much sooner, to a 7800X3D.

    • @JohnxYiu · months ago · +4

      good to hear that! I'm using an 11700K and having some stuttering issues playing RDR2 and Witcher 3, so now I'm planning to upgrade to a 9800X3D in the hope of getting this fixed.

    • @anthcesana · months ago · +2

      Exactly why the 1080p-only testing is almost useless for the actual resolutions people play at. I found the same with my 5800X3D upgrade.

    • @AndrewB23 · months ago · +7

      Uhh doubtful 😂

    • @lycanthoss · months ago

      Maybe you changed something else?
      I'm running a 12600K + 4080 myself. I recently reinstalled Windows 11 when moving to a new NVMe so I had to take out the GPU. Somehow, and I genuinely don't know how, XMP got disabled so I was getting massive stuttering because I was running at the baseline 2133 MT/s of my DDR4 kit. I didn't understand why I was stuttering so hard until I noticed in task manager that the RAM was running at 2133 MT/s.
      I was going to blame Windows 11 24H2 or the new Nvidia drivers with the security fixes, because I don't think removing the GPU or replacing the first M.2 slot should disable XMP?

    • @Nissehult23 · months ago · +2

      Same. My main game, World of Warcraft, is doing insanely well; it more than doubled my avg FPS.
      Not only that, but in Space Marine 2 the fps (with the high-res DLC) went from ~60 to 100-120.

  • @klevzor · months ago · +3

    Been waiting for a 4k test of this processor! Thanks. Considering moving on from my 5900x now

    • @CookieManCookies · months ago · +1

      These 4k graphs are a work of art, good job steve! I'll be back for your 5090 reviews, hopefully in 4K with 4K benchmarks!

  • @frankguy6843 · months ago · +38

    As a 5800X owner I appreciate the comparison at the end a lot, I got the CPU right before the 5800X3D arrived and couldn't justify the purchase, been waiting for my time to switch off AM4 and the 9800X3D is the answer. Should pair fine with my 3080 until I upgrade and then see even more performance. AMD really doubling down on giving gamers what we want and I appreciate it.

    • @rustyshackleford4117 · months ago · +3

      Should be a big upgrade, I went from the 5800x to 7800x3d last year, and got decent FPS bumps even in many 4k games on ultra settings. Several CPU-heavy titles had massive uplifts, as did emulation of things like Nintendo Switch, up to 30-40% performance increase in a few cases like Tears of the Kingdom in cities with lots of NPCS.

    • @Spectre1157 · months ago · +3

      Hey same here! If I was humming and hawing about it before, after reading this comment I am now decided. Thanks!

    • @cschmall94 · months ago · +2

      I just built a new system, coming from a 5800x, same boat as you, bought the 5800x before the x3d variant launched, and while the 4080 super does a lot of the heavy lifting over my 3070 before, the performance boost is incredible. Haven't done many tests, but the one that I did, Arma 3 koth, at 3440x1440, my 5800x/3070 would struggle to have a steady even 50fps, the 9800x3d/4080, rarely dipped below 130fps. Granted, it wasn't a full server, which kills fps most times, but still, insane boost.

    • @keirbourne5323 · months ago

      Same here, gonna get a 9800x3d to replace my 5600x.

    • @justinthematrix · months ago

      Same man, been waiting to upgrade from my 5800X and 3080 as well.

  • @LA-MJ · months ago · +3

    You can doublespeak all you want, but you will not change the opinion that 1440p gaming with a cheaper-than-a-car GPU is what matters to many people. GPU-bound is not always GPU-bound either: look at 1% frametimes to evaluate the experience.

  • @dloc2907 · months ago · +5

    Great info and I completely agree. However, I think we are all wondering if it's worth an upgrade. So maybe show tests at 1440p or 4K against a bunch of older and newer CPUs.

  • @fmatax · months ago

    Thank you! That "future proofing" seccion was awesome. I knew that in theory, but haven't actually see an example. It was quite the eye opener.

  • @msolace580 · months ago · +26

    Not putting the 7800X3D on the chart is a miss... but we can assume there is no reason to upgrade to the 9800X3D lol

    • @codymonster7481 · months ago

      or an intel

    • @andersjjensen · months ago

      Day one review said 11% faster on average. The 40 game benchmark marathon said 8% (like AMD's marketing), so unless you happen to play one of the games where the gains were 20% all the time, AND that game is not performing adequately for you, then yeah.... going from the 7800X3D to the 9800X3D is pretty much "e wanking". Which is fine if your kids don't need new shoes or something.

    • @emiel255 · months ago · +2

      He doesn't really need to, as he made a dedicated video comparing the 9800X3D with the 7800X3D, plus in his day-1 review he compared the 9800X3D with many other CPUs.

  • @kr00tman · months ago · +41

    To be fair, I totally understand where you are coming from, but as an avid 4K gamer (at least when I'm gaming on my personal rig), understanding what kind of performance uplift I'd get from upgrading from my 7800X3D to the 9800X3D at 4K, even if it's only a few fps, is helpful.

    • @cl4ster17 · months ago · +9

      Doesn't seem like you understood.
      All you need to know is how much faster one CPU is over another. Then extrapolate that knowledge to your system by simply checking if you're GPU-limited or not.
      Resolution is irrelevant to the CPU. Either it's fast enough for your use case or it isn't.
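      A minimal sketch of that extrapolation (the FPS numbers are hypothetical):
      ```python
      # Expected FPS is the lower of the CPU's cap (from 1080p CPU reviews) and
      # your GPU's cap at your resolution and settings (from GPU reviews).
      def expected_fps(cpu_cap_1080p: float, gpu_cap_at_your_res: float) -> float:
          return min(cpu_cap_1080p, gpu_cap_at_your_res)

      print(expected_fps(200, 120))  # -> 120: GPU-limited, a faster CPU won't help
      print(expected_fps(100, 120))  # -> 100: CPU-limited, a CPU upgrade pays off
      ```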

    • @kr00tman · months ago · +7

      @cl4ster17 that's a good way of looking at it, but I don't think a lot of people fully understand it. My testing basically confirms that, but people still like to see whether there's a worthwhile improvement or not.

    • @mehedianik · months ago · +9

      @@cl4ster17 Your example works when you are comparing a CPU with your own CPU. That way, you have your own data to compare with to decide if it's a worthy upgrade or not. But when you are deciding between multiple CPUs, the resolution also matters. For example, if I want to decide between the 9700x and 9800x3d, a lower resolution will give me an idea regarding the actual difference between the performance of both CPUs. At higher resolutions, say 1440p, one might become CPU bottlenecked while the other doesn't. The performance gap will be closer than the lower resolution result. But how much closer? That’s what people want to know.
      Also, when comparing high-end CPUs, high-end GPUs make sense. But when you are comparing midrange CPUs, people mostly pair them with midrange GPUs like the 4070 or 7800 XT. Their GPUs become the bottleneck much earlier. If they lack understanding and rely only on the review data, they might think upgrading CPUs will give them a similar performance uplift. They upgrade their CPUs and get the exact same performance they had earlier.
      That's why real-world data is also necessary, in my opinion, to assess different scenarios. This should be a part of the reviews. I understand the amount of work it takes, and I greatly appreciate reviewers' efforts to present the data to us. It won't be possible for them to put it in the day-one reviews, but eventually they should include this data as well.
      Not because it's the right method of testing performance, but rather to give non-techy people a better understanding so they don't waste their hard-earned money for no to barely any performance gain.

    • @cl4ster17 · months ago

      @@mehedianik People don't always run the highest settings, and at this point the supposed real-world data goes out the window, because GPU-limited "real-world" tests don't show how much headroom (if any) the CPU has, which could be turned into more FPS by reducing settings.

    • @joaovictorbotelho · months ago · +1

      Exactly.
      The biggest part of the audience wants to have an idea of how the hardware would perform in their own situation.
      Beyond that, most users change their systems within a 3-year period (not me, I've got a 12-year-old one; any change would be a benefit). So it doesn't really make that much sense to prove performance in a given scenario which isn't even a practical one.
      Certainly the audience appreciates the effort of making these kinds of videos, but I guess there's no need to bash the voices that ask for 1440p, since it's the actual standard for everyone except pro or fast-paced shooter players.

  • @kentaronagame7529 · months ago · +29

    Ever since the 9800X3D dropped Steve has been smiling a lot in his thumbnails 😂

    • @codymonster7481 · months ago

      These channels are selling out one by one. It's like they are all falling in line, echoing those pre-launch benchmarks from Nvidia/AMD to make things look better than they are. If you don't show 1440p/4K tests, then the data is totally useless and any claim of "best CPU evah" is totally fraudulent.

    • @DrMicoco · months ago · +6

      @@codymonster7481 so you're telling me that a 7600 is as fast as 14900k because they're pushing the same FPS in 4k?

    • @PC_Ringo · months ago · +8

      ​@@codymonster7481 tell me you didn't watch the video without telling me that you didn't watch the video. Kek.

    • @jompkins · months ago

      @@PC_Ringo hahahaha yes

  • @abhijitroutray2752 · months ago

    Great video. The 'Future Proof' bit was eye opening. Thanks and keep up the good work.

  • @moritzs3207 · months ago · +6

    Great video, thanks a lot for the refresher. Maybe in future reviews stress the fact that 1080p performance to an extent shows the CPU limit in terms of FPS, which can be extrapolated just fine to higher resolutions IF you have the graphics card allowing this. I think a lot of people are missing that.

    • @leemarks1153 · months ago

      We should upvote this to the top because so many people still don't seem to get it

  • @Phil-q7h · months ago · +17

    Good video, thank you. I am one of those that FULLY understands why testing of CPUs is done at low resolution; however, I still want the 1440p/4K data. It's useful to me as a single-player gamer. It lets me know where we are at, where the market is at. You contradict yourself here Steve: you pointed out in more than one chart that the CPU CAN make a difference and IS relevant at the higher resolutions, especially when new GPUs are released.

    • @Hardwareunboxed · months ago · +13

      I did not contradict myself at all. You're somehow missing the point. The 1080p data tells you everything you need to know. You then take that data and apply it to whatever a given GPU can achieve at a given resolution/quality setting. The 1440p/4K data is misleading, and I clearly showed that here:
      9:05 - CPU Longevity Test [3950X, 5800X, 5800X3D]
      29:14 - Watch Dogs Legion [2022]
      30:14 - Shadow of the Tomb Raider [2022]
      30:44 - Horizon Zero Dawn [2022]
      31:04 - 3 Game Average [2022 Titles]
      32:04 - Starfield [2023]
      32:43 - Warhammer 40,000 Space Marine 2 [2024]
      33:04 - Star Wars Jedi Survivor [2024]
      33:34 - 3 Game Average [2024 Titles]

    • @Phil-q7h · months ago · +3

      I know what you mean, I do honestly, but I'm someone who never upgrades and swaps out the PC in full every 2 years. I want to know what difference it's going to make to me today, in today's games. I appreciate the caveats that can bring; as you say, you don't know what I am playing or at what resolution. I rarely buy the high-end CPU and have always gone for the midrange, lumping the money into the GPU. BUT, if I see a clear difference in CPU performance at the higher resolutions, I want in. And I want you to tell me that, even if it's on a small sample size. I know I'm not gonna win you over and I know why, but still……

    • @VON_WA · months ago · +8

      @@Phil-q7h you know, you're essentially saying "I FULLY understand the reason you're doing it this way and it is the correct reason, but I still want you to show what I want to see even though it's unreasonable."
      Just do what Steve said: check if you're CPU bottlenecked, then upgrade. If you're still happy with the performance, then you don't need to upgrade, simple as that. You said you're a single-player gamer anyway, so most of the time it will be your GPU that's the bottleneck if you're playing at 2K/4K.

  • @vasheroo · months ago · +4

    The Stellaris uplift from the 7800x3d to 9800x3d has got me thinking about the upgrade. You can never have enough performance to fight the true late game crisis 😂

  • @biscuitthebody · 29 days ago

    Great explanation of 1080p testing!!! The last comparison, with a couple-of-years-old CPUs running then vs now, really cinches the idea that low-res CPU testing is a synthetic that shows how CPUs will likely perform with next-gen games. FANTASTIC!

  • @donnys9259 · months ago · +24

    Hats off to you Steve for pushing out this video pretty fast after user comments from your previous video. It was good to get a refresher. Very useful. Thanks. 🙏

  • @Crankshaft_NL · months ago · +3

    For me the underlying question behind 1440p testing is what the best step for upgrading is (most fps for the money): CPU or GPU first. For sure the 1080p testing is the best, no comment on that. But content like you did with min-maxing, the 7600 and 4070 Ti vs the 7800X3D with a 4070, is welcome from time to time, to help consumers answer their own questions concerning upgrade paths and where to put their X amount of money.

  • @santeenl · months ago · +10

    It's a lot about future proofing. I always bought a system for a longer time and would upgrade the GPU. You could buy a 9800X3D and just keep it for like 6-8 years and upgrade your GPU over the years.

    • @robertopadrinopelado · months ago

      It's probably possible to get that much time out of it by buying a motherboard with the best connectivity available today, both for the GPU and for the M.2 slots and USB 4 ports.

    • @TheGreatBenjie · months ago · +1

      There is not a CPU review in existence that can accurately say how "future-proof" a cpu will be, and that metric should be largely disregarded.

    • @Flaimbot · months ago

      Not just future proofing: if you lock your fps, you also enjoy a much more stable framerate, due to MUCH higher 1% lows compared to weaker CPUs that would seem to fit the bill just going by their avg fps.

    • @santeenl · months ago · +1

      @@robertopadrinopelado Maybe respond in English, thanks.

    • @santeenl · months ago

      @@TheGreatBenjie It can accurately say it's WAY faster than a 9700X, for example. Even if you don't notice a lot of difference today at 1440p, you might in, let's say, 3 years.

  • @TT-ix5yr · months ago · +1

    thank you, it's nice to see benches at 4K, just to see the same settings I want to use, which feels much more real

  • @jacobb-c7946 · months ago · +3

    Thank you for making this; I got into an argument with a guy over this on your 9800X3D benchmarks. He said it's 100% worth upgrading to even if you have a 7800X3D, because it's 20% faster and you're stupid not to, but to me that simply isn't the case. It really heavily depends on what you are playing, the resolution you are at, and how many generations you can keep using the CPU, since a faster one will be less likely to bottleneck later GPUs. It's weird to me that people think anything other than this CPU is obsolete, when someone with a 7800X3D will probably only need to upgrade their CPU a GPU generation or so earlier than people using the 9800X3D, which for people who bought it when it came out is completely logical.
    And lastly, who really needs an extra 15-30 fps when you are already at 144 fps?

    • @memitim171 · months ago · +2

      My 7800X3D replaced a 4690K😆, I upgrade the CPU when I can't play a game I want to play, and not before.

  • @arrken33 · months ago · +3

    Yeah, but adding 1440p and 4K graphs gives us a reference for how bottlenecked you can get with the RTX 4090 in this case. Maybe add a segment just saying whether the CPU gets bottlenecked by the GPU at 1440p and 4K.

  • @dotxyn · months ago · +57

    Fantastic 1080p vs 1253p comparison Steve😁

    • @johnhughes9766 · months ago · +12

      And look how much the results reduced at less than 1440p lol 😂

    • @coffeehouse119 · months ago · +13

      Yea I don't see the point of the video: why change settings and not just resolution, why test vs the 7700X instead of the 7800X3D... strange video from Steve

    • @STATICandCO · months ago · +10

      I know it's missing the point of the video but DLSS quality definitely would have been more 'realistic'. DLSS balanced doesn't really represent 4k or a realistic use case for 4090 users

    • @Donbino · months ago

      @@STATICandCO I actually disagree. Digital Foundry, along with many many 4090 users including myself, use DLSS Performance; I use it on my 32-inch QD-OLED. I actually can't believe how much performance I'm leaving on the table lol

    • @kerkertrandov459 · months ago · +5

      @@Donbino DLSS Performance on a 32-inch. U must be blind if u can't see how bad that looks, considering on a 32-inch it's much more noticeable than a 27-inch.

  • @5etz3r · months ago

    Good video, I appreciate the one-off unique testing just to show examples and prove points about variables etc

  • @Dayanto · months ago · +14

    13:45 You accidentally showed the 1440p poll instead of the 4k one.

  • @TeodorMechev · months ago · +22

    I am pretty much aware of why reviews of CPUs are done at 1080p, and I appreciate your hard work and rigorous testing, but in this video I would have liked to see some comparison between the 7800X3D and 9800X3D in 1440p gaming, and how viable it would be to upgrade or not. Maybe in an upcoming one?!

    • @Your_Paramour · months ago · +10

      If you watched the video and are asking this question, you haven't understood the video. Testing at lower resolutions tells you what the maximum throughput is of your cpu, as in the vast majority of cases, cpu performance is independent of rendering resolution. Any higher resolution that does not have the same performance means it is now the gpu side that is the performance limiter. You cannot expect a reviewer to give you a complete performance profile of every game as that is totally unreasonable, and not their job.

    • @discrep · months ago · +1

      He literally explained why higher resolutions provide LESS accurate information -- because even the best GPU on the market will hit its limit of how many fps it can render before the CPU does. This is a CPU review, which compares the difference between CPUs, which cannot be determined if all of the benchmarks are the same because the GPU is the bottleneck. At lower resolution, when the GPU is not close to its limit, the CPUs will determine the max fps and you can more clearly see the true difference between the CPUs.
      Moreover, the actual fps numbers are irrelevant because everyone has different setups. You won't replicate their numbers unless the rest of your hardware and all of your software settings are identical to theirs. What you want to know is the relative performance gain.

  • @DrunkenMonk-OG · months ago · +3

    I don't think people are arguing that the low-res stuff is useless; they just don't care what it does at low res and low settings. They watch reviews and want to see only what they are looking for: what gains they will get at the res they play at and the settings they use.

    • @samgragas8467 · months ago · +1

      CPU performance is only affected by ray tracing and settings like FOV or NPC density. At 4K it is the same: the CPU is good for X FPS.

  • @tyraelhermosa · months ago

    Great job. You nailed it. That example at the end with the CPUs running on the 3090 vs the 4090 makes it so clear.

  • @10Sambo01 · months ago · +10

    While I completely understand why CPU benchmarks are done at low resolution, I think what people are really getting at is; what PRACTICAL performance difference will this CPU make to my gaming at 1440 / 4k. While you can't predict what settings people use or the framerate they prefer, you CAN test a worst-case scenario in a suite of games to give a comparative result across multiple CPUs.

    • @memitim171 · months ago · +1

      The CPU itself doesn't run any slower at 4K than it does at 1080p, with a powerful enough GPU these 1080p numbers will be your 4K numbers. So what you are really asking for is for him to do a shed load of testing to save you the hassle of looking at 2 numbers. Surely you can see why that's silly?

    • @10Sambo01 · months ago · +2

      @@memitim171 The point is that people want to know what the difference will be if they upgrade their CPU TODAY at the resolution they currently use.
      It's all very well saying that a 9800X3D runs a game at double the FPS of their current CPU at 1080p if there is zero performance increase at 1440p, which is what they use.
      Some games have a CPU bottleneck, some don't; if there's still a CPU bottleneck at 1440p with a 4090, then it's going to make a difference.

    • @memitim171 · months ago

      @@10Sambo01 So look at the GPU number, what is so hard about this?

    • @10Sambo01 · months ago

      @@memitim171What do you mean?

    • @memitim171 · months ago

      @@10Sambo01 If the CPU you are considering does 200fps in the test @ 1080P but the GPU you have does 150fps @ 1440p then your framerate with the new CPU will still be 150@1440P. Conversely, if your GPU does 200 fps @ 1440P but your CPU is doing 100 @ 1080P then your fps @ 1440P will be 100. All you need to know are these 2 numbers and if you are currently CPU or GPU limited, and Steve can't tell you the last bit because it's your computer, your game, and your setup.
      I don't play any of the games used to bench here or by any other serious TH-camr, but their data still tells me everything I need to know to make an informed purchase, because all I need is those 2 numbers (the average from a 45 game bench is more than sufficient) and to know where I'm currently at. You declaring the data is useless because it isn't at 1440P is the same as me declaring it useless because my games aren't included, it doesn't matter, because the relative power of the parts is still the same and you will always be limited by the bottleneck.
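      In code form, the two-number rule above is just a min() over the CPU and GPU caps. A minimal Python sketch using the figures from this comment:
      ```python
      def delivered_fps(cpu_cap_fps, gpu_cap_fps):
          # You always run at the slower component's rate.
          return min(cpu_cap_fps, gpu_cap_fps)

      print(delivered_fps(cpu_cap_fps=200, gpu_cap_fps=150))  # 150 -> GPU-limited at 1440p
      print(delivered_fps(cpu_cap_fps=100, gpu_cap_fps=200))  # 100 -> CPU-limited despite GPU headroom
      ```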

  • @Aleaf_Inwind
    @Aleaf_Inwind หลายเดือนก่อน +3

    I think your message from Gamer's Nexus says it all: "we typically publish 1-3 1440p charts in games that are still mostly CPU-bound for perspective." It just gives another perspective. Sure, we can cross-reference and do the work ourselves, but we could also CPU test ourselves, or cross-reference other reviews ourselves. It's nice to see the work done by a channel we trust to do it right, giving an alternative perspective on CPU performance in realistic situations. It doesn't mean we need a 45-game test with every possible config combination; just pick a few games -- maybe one that sits at the average for GPU-demanding games, one at the average for overall CPU performance, and one at the average for CPU-demanding games -- and do a few tests like the ones you did here. You kept saying that the data was pointless, but even though I already understood your testing method and agree with it as the correct method for testing CPUs, I still found this data very fascinating to look at, and am glad you said that you might do another one in two years with the next CPU generation release.
    On a side note, I'm still sitting with a 5600x and want to upgrade, but I'm struggling to decide between a 5700x3D and a total platform upgrade to a 7700x. The 7700x would cost way more while not giving much benefit, but then I could upgrade again further down the line to a 9800x3D or potentially a 10800x3D if it stays on AM5, but there's always a chance that it could end up on AM6, and then it would probably be a better idea to skip AM5 altogether... Decisions...

    • @Coolmouse777
      @Coolmouse777 หลายเดือนก่อน

      I hope they don't listen to you. You agreed that this data is pointless but still want it? Why?

    • @Aleaf_Inwind
      @Aleaf_Inwind หลายเดือนก่อน

      @@Coolmouse777 No, I didn't agree that it was pointless, I completely disagreed. I agreed that their testing methodology is the best practice for showing CPU raw power. But it's also nice to see how that power scales in different real-world scenarios. I said that it was fascinating and gives another perspective, which Gamer's Nexus also seems to agree with, and they are one of the most technically minded channels around.

    • @Coolmouse777
      @Coolmouse777 หลายเดือนก่อน

      @@Aleaf_Inwind There is no other perspective because there is no additional data. If you want real-world results, just look at a GPU review too; it's simple to compare 2 numbers )

    • @Coolmouse777
      @Coolmouse777 หลายเดือนก่อน

      @@Aleaf_Inwind But it doesn't scale; that is the point of the video. If a CPU gets 120 fps at 1080p, then 1440p and 4K will be the same 120 fps.

    • @Aleaf_Inwind
      @Aleaf_Inwind หลายเดือนก่อน

      @@Coolmouse777 There were plenty of cases like Cyberpunk 2077, where the 9800X3D got 219 fps at 1080p and only 166 fps at 4K Balanced, a reduction of almost 25%, while the 7700X only gets 158, a reduction of just about 5% from the 166 it gets at 1080p. So no, you don't get the same fps; you get less, and it's nice to know how much less, because it's not a single fixed percentage. Sure, you can go cross-reference GPU reviews to get an idea, but as you can see, it's not just a straight number: the 4090 still gets more frames with the 9800X3D than it does with the 7700X at 4K Balanced.

  • @Hjominbonrun
    @Hjominbonrun หลายเดือนก่อน +38

    Wait, he is not lounging on his couch.
    I don't know what to make of this.

    • @YothaIndi
      @YothaIndi หลายเดือนก่อน +5

      It's ok, he's still sitting 👍

    • @christophermullins7163
      @christophermullins7163 หลายเดือนก่อน +1

      He has work to do so don't stress it too much.

  • @mr.waffles2555
    @mr.waffles2555 หลายเดือนก่อน

    This test was amazing. I have a few friends who haven’t quite grasped this concept and sharing this video with them finally bridged the gap. Thank you for all that you do.

  • @Vincent-v9q
    @Vincent-v9q หลายเดือนก่อน +8

    Steve, you are a GENIUS! (no)
    4K + DLSS Balanced = 2160p * 0.58 ≈ 1253p = 2227x1253 ... It's not even 1440p!
    You are comparing 1920x1080 vs 2227x1253. You really need to compare 4K native!
    We are waiting for a normal test. For example, will there be a difference at 4K between a 7500F/12400F and a 9800X3D? Without upscalers! You can add "DLSS Quality" as a separate line, but it is not necessary.
    Without upscalers, it will be clear where to invest money. The difference in real prices between the 9800X3D and the 7500F is about 400 dollars. You could invest those $400 in a good 4K monitor and not see the difference between the 7500F and 9800X3D even on the 4090.

    • @Hardwareunboxed
      @Hardwareunboxed  หลายเดือนก่อน

      I'm not sure what you're talking about, balanced DLSS at 4K still has 31% more pixels than native 1440p.

    • @Vincent-v9q
      @Vincent-v9q หลายเดือนก่อน

      @@Hardwareunboxed I can't give you a link, just Google "dlss quality resolution". The Quality preset is 66.6% of native, Balanced is 58%, Performance is 50%, and Ultra Performance is 33% of native.
      Setting the Balanced preset at 4K gives you a real render of 3840*0.58 = 2227 by 2160*0.58 ≈ 1253, i.e. 2227x1253.
      Scaling is not applied to the total number of pixels! Each side of the image is reduced by that percentage!
      Same in Hogwarts: if you go to the settings, set 4K and DLSS Quality, for example, you will see a render resolution of 67% (2562x1441), which is essentially the same pixel count as native 2560x1440. 4K + DLSS Balanced is MUCH LOWER than native 2560x1440.
      I'm ashamed that I have to explain this to a channel with over a million subscribers that thinks the DLSS percentage reduces the total number of pixels across the entire image area, rather than each side.
      Steve, remember! 4K + DLSS Quality ~= QHD (2560x1440) native. 4K + DLSS Balanced = something in between QHD and FHD. 4K + DLSS Performance = FHD.
      Waiting for a normal test in native 4K!

    • @Vincent-v9q
      @Vincent-v9q หลายเดือนก่อน +5

      @@Hardwareunboxed I understand how you calculated it.
      3840*2160 = 8,294,400 * 0.58 = 4,810,752
      2560*1440 = 3,686,400
      4,810,752 / 3,686,400 = 1.305 = ~30.5% more
      DLSS doesn't work THAT way!
      You're wrong... I described how DLSS works above.

    • @Vincent-v9q
      @Vincent-v9q หลายเดือนก่อน +5

      @@Hardwareunboxed It works like this: 4K DLSS Balanced =
      (3840*0.58) * (2160*0.58) = 2227 x 1253 = 2,790,431
      2560*1440 = 3,686,400
      3,686,400 / 2,790,431 = 1.32 = QHD (2560x1440) is 32% more than 4K DLSS Balanced (2227 x 1253 real resolution), as you tested in this video!
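      The per-axis arithmetic in this thread as a small Python sketch (the preset fractions are the commonly documented ones and can vary per game and DLSS version, so treat them as assumptions):
      ```python
      # Commonly cited per-axis DLSS render scales (assumed values).
      DLSS_SCALE = {
          "quality": 0.667,
          "balanced": 0.58,
          "performance": 0.50,
          "ultra_performance": 0.333,
      }

      def render_resolution(out_w, out_h, preset):
          # DLSS scales each axis by the preset fraction, not the total pixel count.
          s = DLSS_SCALE[preset]
          return round(out_w * s), round(out_h * s)

      w, h = render_resolution(3840, 2160, "balanced")
      print(w, h)                   # 2227 1253
      print(2560 * 1440 / (w * h))  # ~1.32: native 1440p has ~32% more pixels
      ```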

  • @wolfgangvogel5407
    @wolfgangvogel5407 หลายเดือนก่อน +38

    This is a weird test tbh. Why is upscaling on Balanced mode? It should be on Quality or off. The claim was that in a GPU limit, CPU differences are far less important. You test that, which is cool, but then you turn upscaling to Balanced, which reduces the GPU limit again. Am I missing the point of this test?

    • @Hardwareunboxed
      @Hardwareunboxed  หลายเดือนก่อน +9

      Yeah missing the point for sure. Maybe just skip to and watch this entire section of the video.
      29:05 - CPU Longevity Test [3950X, 5800X, 5800X3D]
      29:14 - Watch Dogs Legion [2022]
      30:14 - Shadow of the Tomb Raider [2022]
      30:44 - Horizon Zero Dawn [2022]
      31:04 - 3 Game Average [2022 Titles]
      32:04 - Starfield [2023]
      32:43 - Warhammer 40,000 Space Marine 2 [2024]
      33:04 - Star Wars Jedi Survivor [2024]
      33:34 - 3 Game Average [2024 Titles]

    • @wolfgangvogel5407
      @wolfgangvogel5407 หลายเดือนก่อน +15

      @@Hardwareunboxed Still don't see it. I agree with the initial testing at lower res to run into a CPU limit; really the best way to do it. Some media outlets even still test at 720p; that would drive some people mad for sure. But I really don't see the point of Balanced upscaling at 4K with an RTX 4090. It's like testing somewhere between 1080p and 1440p; it would give you the same results.

    • @Hardwareunboxed
      @Hardwareunboxed  หลายเดือนก่อน +9

      That is without question the most obvious way I can illustrate why GPU limited CPU testing is stupid and misleading. So if that didn't work I will have to tap out. You're welcome to get your GPU limited CPU testing elsewhere though.

    • @wolfgangvogel5407
      @wolfgangvogel5407 หลายเดือนก่อน +23

      @@Hardwareunboxed I am aware of GPU and CPU limits, thanks. I am not questioning any statements either. I'm simply saying 4K with Balanced upscaling is a weird choice; in the first part of the video with the 4090, the whole test is pointless and misleading. You are not testing at 4K; it's not even 1440p. The second part of the video proves your point much better. I think there are simply better ways to show why you test at low res than what you did in half of the video.

    • @NGHutchin
      @NGHutchin หลายเดือนก่อน +3

      @@wolfgangvogel5407 I think he had already been hit by the more aggressive comments by the time he got to yours. It is definitely true that a 4K test without upscaling represents a greater GPU bind than one with it. I think "the better CPU ages better" is not the question that the audience this video is attempting to address is actually asking. I do still appreciate the point, though.

  • @jerghal
    @jerghal หลายเดือนก่อน +25

    At 13:40, about the upscaling poll: Steve says 80% use upscaling. But then it must be the wrong poll (the 1440p poll instead of the 4K poll), because what I see on screen is 43% using upscaling (8% Balanced and 35% Quality mode). 39% (the biggest group) DOES NOT use upscaling at 1440p. So why test with Balanced if that is the smallest (8%) group, unless it's the wrong poll 😅.

    • @CookieManCookies
      @CookieManCookies หลายเดือนก่อน +2

      Such a tiny audience; how many TH-cam viewers are scoping out these polls? I didn't even know there was one.

    • @jerghal
      @jerghal หลายเดือนก่อน +1

      @CookieManCookies I happened to answer that poll 😁. Well both of them.

  • @TheMasterEast
    @TheMasterEast หลายเดือนก่อน +1

    Hey Steve, thank you for doing this. I know why CPUs are tested the way they are, but I highly value your work here on 4K testing. Keep up the good work.

  • @ChristopherYeeMon
    @ChristopherYeeMon หลายเดือนก่อน +38

    This should be seen as an opportunity instead of a debate. Do the normal benchmark review that is CPU limited, and do a buying-guide review where you put the CPU in 'normal' scenarios and see if it makes a difference, so it shows users how far behind your CPU needs to be before the upgrade is worth considering.

    • @mikenelson263
      @mikenelson263 หลายเดือนก่อน +6

      This is the way.

    • @teekanne15
      @teekanne15 หลายเดือนก่อน

      In theory the benchmarks themselves should be enough info for a user to deduce that. But I guess people need handholding with purchasing decisions these days.

    • @mikenelson263
      @mikenelson263 หลายเดือนก่อน +4

      @@teekanne15 No, in theory it absolutely does not work that way.

    • @Skeames1214
      @Skeames1214 หลายเดือนก่อน +1

      This is not a debate, this is a literacy issue. You're asking them to double their testing, their shoots, and their editing time for the sake of people who can't figure out that playing games at 4k Ultra is going to make the CPU less important than the graphics card. You have to see how ridiculous that is. What they're doing right now is the right answer. Make a video every couple years explaining the methodology.

    • @mikenelson263
      @mikenelson263 หลายเดือนก่อน +6

      @@Skeames1214 who's asking them to double their testing? Stop treating other people like they're dumb when you're not diligent enough to examine the validity of your own thinking.
      People who want to play at higher resolutions understand that the relevance shifts towards the GPU. Where are the thresholds? That information is actually helpful in making purchasing decisions. It would be better to structure your testing approach within the constraints that apply in order to better inform viewers.

  • @knuttella
    @knuttella หลายเดือนก่อน +9

    isn't 4k balanced just 2227 x 1253?

  • @edmon-pf9fp
    @edmon-pf9fp หลายเดือนก่อน +20

    This is crazy. Why are you not showing 4K native? You're showing 1080p native and 4K Balanced upscaling. Come on, mate. I don't see what this video is showing.

  • @t0x1cde6t
    @t0x1cde6t 15 วันที่ผ่านมา

    Thank you for including a sim racing title. Those of us who are hardcore sim racers run some pretty insane resolutions (triple 1440p, double 2160p or triple 2160p), and it's good to know the 9800X3D can help boost frames in titles like ACC.

  • @mikewilliams6522
    @mikewilliams6522 28 วันที่ผ่านมา +5

    I’m replacing my i9-9900k with the ryzen 7 9800x3d so I feel like that’s going to be a massive difference and improvement 😂

    • @AyazDaoud
      @AyazDaoud 20 วันที่ผ่านมา

      Congrats. I hope your GPU is good enough to use with a CPU that good.

    • @mikewilliams6522
      @mikewilliams6522 20 วันที่ผ่านมา

      @@AyazDaoud i'm using a 3090 ftw3 24gb

  • @mption293
    @mption293 หลายเดือนก่อน +3

    The most important thing is the big question: "Do I need to upgrade?" Telling me how many FPS it gets at 1080p doesn't tell me that. It's very useful for overall CPU performance when it is time to upgrade, but if I'm running a 12th-gen Intel and the 9800X3D is getting broadly the same results at 4K, or at 1440p if that's what I'm playing at, I don't need to upgrade.

    • @Skeames1214
      @Skeames1214 หลายเดือนก่อน +1

      "The most important thing is the big question, do I need to upgrade?"
      There is no way that question can be answered by a CPU review. It's completely subjective; only the individual can decide. They don't know exactly what you want out of your system, what games you play, etc. If you: 1. already have a high-end CPU, 2. play demanding games at high resolutions and quality settings, and 3. are happy with your performance, then don't upgrade. If you aren't happy with the performance and you're CPU limited, 1080p benchmarks give you the best idea of what CPU *can* reach the number you're targeting, and you can adjust quality settings or upgrade your GPU from there.

    • @mption293
      @mption293 หลายเดือนก่อน +1

      @Skeames1214 yeah, data sets at high resolutions are totally useless at helping someone make that decision /s
      This answer tells me there is no point in consuming this content until I make that decision. A product review should show whether something will benefit the consumer, not just that it's the fastest!

    • @Skeames1214
      @Skeames1214 หลายเดือนก่อน

      @@mption293 But whether or not it will benefit the consumer is subjective. You can’t offer a review that answers that question for everyone. You don’t know what they want, what their other hardware is, how often they upgrade, etc. Tiering the CPUs by their raw performance is a useful data point for all consumers, and more importantly the point of a *CPU* review.

    • @mption293
      @mption293 หลายเดือนก่อน

      @@Skeames1214 Whether the 1080p data is beneficial is subjective too. Reviews are supposed to guide you to what's best for your needs, not just to say "here are all the possibilities; this is the fastest in a specific use case." More data might not help everyone, but this data doesn't help everyone either.

    • @Skeames1214
      @Skeames1214 หลายเดือนก่อน

      @@mption293 "Whether the 1080p data is beneficial is subjective too." No, it's not. It is objectively more useful in the context of a CPU review. It shows differences in CPUs instead of masking them behind the limitations of other parts. Would you want them to test GPUs at 1080p with a 10600k? Because that's the same logic. "Show me what this can do when it's artificially handicapped by another component"
      "Reviews are supposed to guide you to what's best for your needs." No, they aren't. You're talking in circles. Nobody knows what *your* needs are other than you. You can make that determination. This isn't guesswork, knowing what CPU and GPU are capable of independently will give you everything you need to know about what parts are right for you. Reviews are to evaluate the quality of the product in question. Not to evaluate artificially limited versions of it. That's how literally every single review from every reputable channel works. They tell you what the product is capable of now, which in turn tells you more about how it will age. If you actually only care about how specific games will run at specific resolutions with specific graphics settings and a specific GPU on a specific CPU, look up benchmarks for it.

  • @galinhadopai
    @galinhadopai หลายเดือนก่อน +7

    DLSS QUALITY 🤨 NOT BALANCED

  • @Cuthalu
    @Cuthalu หลายเดือนก่อน +1

    Fantastic video, and it's something the community really does seem to need.

  • @rgstroud
    @rgstroud หลายเดือนก่อน +11

    We understand the best-case processor performance comparison at 1080p, but we also want the graph you gave for 4K, only with Quality rather than Balanced upscaling. That would definitely tell those of us who have a 4090 and use Epic settings whether it is worth upgrading from the 7800X3D, since we never use Balanced settings with this configuration; we bought the 4090 to use Quality settings worst case, and can add frame generation if the base frame rate is at 60 fps at native 4K resolution. This will get even more useful with the 5090, when no GPU limits will exist for most games. We would also be greatly interested in a 4K Quality comparison between the 9800X3D/7950X3D and the new 9950X3D if it has the 3D cache on both CCDs. The end of the video was very useful: I did not upgrade my 5950X OC to the 5800X3D for my 4K gaming rig, as the boost showed no help for many games, and even though the 5800X3D was better in some, it was not worth the upgrade since I didn't have the 4090 yet. Now with the 4090, and soon the 5090, these 4K Quality setting comparisons between CPUs ARE VERY important to represent real-world use cases.

    • @anitaremenarova6662
      @anitaremenarova6662 หลายเดือนก่อน

      I can tell you right now: it's not. AM4 owners should upgrade if they're buying a flagship GPU, but the rest will easily be fine with Ryzen 7000 until the next platform drops.

    • @rgstroud
      @rgstroud หลายเดือนก่อน

      @@anitaremenarova6662 Sorry, but real-world use cases are what people want to see. You can deny it all you want; it doesn't make it untrue.

  • @Krarilotus
    @Krarilotus หลายเดือนก่อน +10

    So the conclusion: with only 172 fps in Star Wars Jedi: Survivor for the 9800X3D vs 155 fps for the 5800X3D, the 4K testing just shows that these CPUs are very close, while at 1080p, 234 vs 163 fps paints a completely different picture and shows the actual IPC uplift of the new Zen cores and cache vs the well-aged 5800X3D that we all got to know and love, and will probably not upgrade from until the AM6 platform releases, because why would we?
    The 5800X3D is still king of price-to-performance, and the cheaper AM4 platform really does all we need for now!

    • @anitaremenarova6662
      @anitaremenarova6662 หลายเดือนก่อน +3

      @@contactluke80 Why are you copy-pasting the same comment everywhere? Yes, 80% of people at 4K use upscaling; there literally isn't a card that can play modern games at 4K native, lmfao. Also no, a better processor will improve your stability (1% lows, 0.1% lows), especially when RT is used, at any resolution.

  • @asranssrg
    @asranssrg หลายเดือนก่อน +21

    The 4K reviews are worthless as they are using Balanced upscaling; who decided this was needed? Sorry you wasted time on this. Could you have provided 4K results without changing the variables? An apples-to-apples comparison. The 4K results already show less of a performance uplift than the bs 1080p results. I'd rather have raw data and make my own interpretations. And with no 7800X3D or 5800X3D to compare against, it's not a good comparison.

    • @nicolastremblay7364
      @nicolastremblay7364 หลายเดือนก่อน +9

      Balanced DLSS means 1253p (internal resolution), so not a good comparison either, I think... And by the way, I fully understand that CPU tests must be done at 1080p to avoid a GPU bottleneck... but I'm more interested in seeing CPUs tested at 4K (no upscaling), because I never use any resolution other than that!

    • @FurBurger151
      @FurBurger151 หลายเดือนก่อน +5

      Exactly, this is pointless. They just don't want to show the real gains at native 4K.

    • @yoked391
      @yoked391 หลายเดือนก่อน +1

      @@FurBurger151 CPU gains are useless at native 4K, that's why xD

    • @FurBurger151
      @FurBurger151 หลายเดือนก่อน +1

      @@yoked391 I know. I game at 4k so this CPU is pointless.

    • @thischannel1071
      @thischannel1071 หลายเดือนก่อน +2

      @@yoked391 That's what makes benchmarks at 1440p or 4k useful to potential purchasers of this, or other high-end CPUs. Seeing that there's no performance gain with the top-end gaming CPU lets people know to not spend money buying the top-end gaming CPU when it won't do anything for them. That's part of the context of what the 9800X3D means to gamers thinking about buying an upgrade, and that should be conveyed in a review of it that intends to inform consumers.

  • @SidorovichJr
    @SidorovichJr หลายเดือนก่อน

    Thanks for the 4K benchmarks; those are really practical tests for us to make a decision.

  • @Superior85
    @Superior85 หลายเดือนก่อน +8

    If the most popular option on the poll at 13:41 was native 1440p, why not at least include that result in reviews? Could be useful for both 1440p gamers and those who use 4K Quality, since 1440p is the 4K Quality DLSS rendering resolution....

    • @Dark-qx8rk
      @Dark-qx8rk หลายเดือนก่อน +2

      1440P with upscaling would in effect give the same fps as 1080P which would make the 9800X3D a great choice.

  • @FastDistance1
    @FastDistance1 หลายเดือนก่อน +3

    Great info, but just to give you an example of a user who wants 4K data included in CPU benchmarks, even after watching the whole video and understanding all your points: I have a 7600 and a 4090 (for example). I am thinking about upgrading my CPU after the new part is released, and I primarily game at 4K (playing casual games). After this video, I'll go buy a new CPU because the 9800X3D is 17% faster on average than the 7700X. If that margin were under 10%, I wouldn't upgrade, and the 1080p data is not helping with my decision.

    • @mgk878
      @mgk878 หลายเดือนก่อน

      If you're still asking this then maybe Steve has not explained it well enough.
      CPU performance is nearly independent of resolution. There's little or no extra work for 4K, because that work is done mainly by the GPU. So a "% uplift at 4K" is just the same number as 1080p, except that no GPU exists right now that can demonstrate that. Including those numbers in a CPU review would only help people with a 4090 and most people don't have one. Meanwhile the "CPU bound" numbers are for everyone. Most recent GPUs can achieve similar numbers just by turning down resolution and quality settings, until you get a CPU bind.
      The question of "will I get 17% more FPS" depends on your GPU and games, so the answer is really in the GPU benchmarks. I'd guess that most people asking this play at 4K/high settings and so are usually GPU bound, so the uplift would be about 0%. If I were you I'd save my money.

  • @xshadowinxbc
    @xshadowinxbc หลายเดือนก่อน +13

    *Tries to prove a point*
    *Uses DLSS balanced upscaling rather than native at 4k, even when it's absolutely not necessary*
    *Doesn't even show the 5800X3D or 7800X3D vs the 9800X3D at those resolutions*
    I don't think anyone is trying to pry the "1080p testing is useful to show CPU performance" argument out of your sweaty palms. The point is that most people are trying to decide whether an upgrade is worth it for what they're actually going to be playing, at the resolutions they will actually be using. This is becoming more and more relevant as the CPU starts costing as much as, or more than, many GPUs. That's why we want to see how well it fares in both GPU- and CPU-limited scenarios (and then in the middle, which is basically what you did here), rather than just one of the two. It's not like it makes the 9800X3D look much worse; even in GPU-limited scenarios it still provides benefits. This video is at best disingenuous, and I'm going to kind of be sitting there laughing when people plunge down nearly $500 on a 9800X3D and then later realize that it isn't doing jack shit with their budget GPU... rather than saving a lot of money by, say, grabbing the 7600X3D (while still being on a modern platform) and putting those savings towards a better GPU. Furthermore, "in the future" is a moot point considering "the future" is going to have its own new CPUs which might provide much better performance per dollar, or hell, you could probably get the 9800X3D on sale (perhaps secondhand) for a fraction of its current price anyway. What is the future, and when is the future that actually matters?

    • @Nick-05
      @Nick-05 หลายเดือนก่อน +1

      Nailed it

  • @CZID
    @CZID หลายเดือนก่อน

    This is what I'm looking for! Good job on doing these comparisons. Please make more videos like this. Thank you!

  • @NoirTenshin
    @NoirTenshin หลายเดือนก่อน +28

    I might not be representative of both sides, but please hear me out.
    I understand that CPU testing is there to show what the CPU can do; in a way, its uncapped potential.
    I also understand that higher resolutions put a cap on that testing and it masks what the CPU can do (gpu bottleneck).
    So, to (stress) TEST a CPU, you don't want it bottlenecked, i don't think anyone can argue with that.
    The issue is that people aren't just interested in the CPU's potential. They are also interested in how that potential transfers to their scenario (at least ballpark values). It is really hard to "imagine" how the CPU will hold up at higher resolutions, because on average (especially with DLSS and frame gen) we don't know where the bottleneck starts shifting from CPU to GPU and vice versa (it depends on the game and its settings, but nobody is doing that much testing, much less an aggregate of it).
    People that complain about 1080p are complaining that the potential (1080p test) doesn't give them enough information to make an informed decision (without a large amount of speculation).
    So the expectation is (and I think here lies the issue): a new CPU comes out, people want a CPU review (not just a test of its potential), and the CPU test gives them data they can't relate to (even though it's the most proper way to test a CPU).
    So it's more about the communication (and, to a degree, the expectation) of what the test represents.
    To make an analogy, testing a car going 0-60 mph / 0-100 km/h is great data, but for people in the city with a city car, it's more relevant to test 0-30 mph / 0-50 km/h, while the 0-60/0-100 is still useful data. Both data points show the two "extremes" and make it much easier to "imagine" where a given scenario lies on that spectrum (as it's almost impossible to find the GPU bottleneck per game).
    To make the whole thing more complex, DLSS and frame gen can technically be multipliers (they don't just offset the GPU bottleneck point, especially frame gen). That leaves the end user's needs uncovered, as they have an additional layer of complexity added to their "calculation" (whether the CPU is worth it for them, meaning the delta is big enough for the money).
    As we have seen in the data shown in the video, there are some games that aren't GPU bottlenecked even at 4K (DLSS Balanced). That doesn't bring any useful data about the CPU, but it does bring new information to the customer (that the CPU upgrade is still relevant even at 4K, and how relevant).
    When you aggregate that data on an average, it allows the (potential) customer to see the CPU in a new light, which may not be the point of a CPU test, but it should be a point of a CPU review (how it behaves in the conditions that are relevant to customers).
    So in a way, a cpu (stress) test, paints borders and lines of a picture, but doesn't add color (and shading).

    • @Flaimbot
      @Flaimbot หลายเดือนก่อน +1

      I'll boil it down (see the sketch below).
      * If a CPU gets 120 fps at 1080p, the same CPU gets 120 fps at 4K (minus a small portion, because object/geometry culling that happens at lower resolutions doesn't happen at higher ones).
      * DLSS reduces the rendered base resolution: look at the GPU benchmarks at the respective BASE resolution for the numbers at each DLSS setting (minus a bit of overhead from the upscaling pass).
      * Frame gen only leverages free GPU resources in a CPU bottleneck, and only up to 2x the non-FG numbers, because it interweaves a real frame with a generated frame. Look at the GPU load: e.g. 59% GPU load at 120 fps without DLSS or FG (so, a CPU bottleneck) -> a bit of math magic turns that into x = 120 fps * 100% / 59% = 203 fps, minus a bit of overhead. (You're free to insert the numbers from your own games.)
      All of the above stacks: turn on DLSS and GPU load drops, which frees resources for FG, but you're still limited by 2x the maximum your CPU can handle.
      0 magic.
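      A minimal sketch of those rules (all numbers are illustrative assumptions):
      ```python
      def cpu_bound_headroom(fps, gpu_load):
          # If the GPU sits below 100% load, estimate what it could deliver uncapped.
          return fps / gpu_load

      def frame_gen_estimate(fps, gpu_load):
          # FG interleaves real and generated frames, so it tops out at 2x the
          # CPU-bound rate or the GPU's headroom, whichever is lower (minus overhead).
          return min(2 * fps, cpu_bound_headroom(fps, gpu_load))

      print(round(cpu_bound_headroom(120, 0.59)))  # ~203 fps of GPU potential
      print(round(frame_gen_estimate(120, 0.59)))  # ~203 fps estimated with FG on
      ```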

    • @mikenelson263
      @mikenelson263 หลายเดือนก่อน +8

      "So in a way, a cpu (stress) test, paints borders and lines of a picture, but doesn't add color (and shading)."
      This is the equivalent of why differentials or finite differences (and their higher orders) are critical in engineering. You don't want to know how much something is, you want to know by how much it changes relative to a number of parameters, by how much that change changes, by how much the change in change changes, etc.
      I find it disappointing that this point is completely missed in these discussions. Yes, the best CPU you can buy will without question give you the best future potential in keeping up with performance. But if two CPUs can hold up to the performance you need today, what is the value of the better CPU's upsell relative to performance unlocks? You need the data points to find the differences and extrapolate.

    • @jammaschan
      @jammaschan หลายเดือนก่อน

      @@Flaimbot I mean, I understand that, but do you really expect the average person wanting to buy a CPU for their system to understand it? They want to know how well a CPU performs in a system closest to theirs, hence the comments asking for 4K benchmarks. Everyone in the comment section most likely understands why 1080p benchmarks are a thing. Little Tommy wanting to build a gaming PC for Christmas this year, or his parents wanting to help him with it, won't get it (probably even if they watch this video), so having a small explanation saying "hey, 4K tests don't really tell us much about the CPU's performance" in every video would definitely help.

    • @DrMicoco
      @DrMicoco หลายเดือนก่อน +3

      The only thing you need to know is that the CPU doesn't care about RESOLUTION; that's the GPU's job. So if a CPU hits a framerate/frametime at specific settings, that's the ballpark you need to know, because at higher resolutions the question becomes how powerful your GPU is.

    • @tomongchangco4345
      @tomongchangco4345 หลายเดือนก่อน

      @@jammaschan If your definition of the average user is little Tommy's parents, who I assume are not technically inclined with computer hardware, then a small explanation or even a CPU review won't help them in the slightest. It is much better to tell them to consult and ask assistance from their local computer technician and let him choose the proper hardware configuration and build the system for them.

  • @Aquaquake
    @Aquaquake หลายเดือนก่อน +14

    1) Title the video implying the testing will be done in 4K
    2) Do zero tests in actual 4K, only use upscaling
    3) Compare it with 7700X and 285K, both of which are *heavily* productivity-oriented CPUs, as opposed to obvious choice of 7800x3D
    4) ???
    5) Profit!
    Wonderful ad for 9800x3D. Thanks Steve!

    • @Hardwareunboxed
      @Hardwareunboxed  หลายเดือนก่อน +2

      If that is indeed your actual take-away, especially after watching this entire segment, then I'm sorry to say, I can't help you :(
      29:05 - CPU Longevity Test [3950X, 5800X, 5800X3D]
      29:14 - Watch Dogs Legion [2022]
      30:14 - Shadow of the Tomb Raider [2022]
      30:44 - Horizon Zero Dawn [2022]
      31:04 - 3 Game Average [2022 Titles]
      32:04 - Starfield [2023]
      32:43 - Warhammer 40,000 Space Marine 2 [2024]
      33:04 - Star Wars Jedi Survivor [2024]
      33:34 - 3 Game Average [2024 Titles]

    • @Aquaquake
      @Aquaquake หลายเดือนก่อน +12

      @Hardwareunboxed Exactly zero of these timestamps disprove my point: you spent half an hour doing a completely useless (and arguably misleading) comparison against productivity-oriented CPUs at fake 4K (which puts more load on the CPU than native 4K) in an obvious attempt to show the 9800X3D in a better light. Either show a real 4K comparison vs the 7800X3D (adding performance per watt would be a nice eye-opener too) or don't bother trying to gaslight people.

    • @avatarion
      @avatarion หลายเดือนก่อน +3

      @@Aquaquake What's the point of a CPU review that's not focused on CPUs at all? Native 4K is a pure GPU review. It's not the CPU's fault that we have a GPU bottleneck.

    • @Aquaquake
      @Aquaquake หลายเดือนก่อน +4

      @@avatarion Then either a) make it abundantly clear that gains in *real* 4K gaming will be non-existent (which is precisely what a real 4K comparison would show), or b) don't call the video _"really faster for real-world 4K gaming"_ in the first place, if all you will test is realistically less than even 1440p. The problem is that if you rig the tests like this, it may mislead some less tech-knowledgeable people into wasting money on a CPU for gaming when in reality they should be spending that money on a more powerful GPU.

    • @avatarion
      @avatarion หลายเดือนก่อน +5

      @@Aquaquake I thought it made it abundantly clear that it's real-world performance, as in what most people would run it at. It's really a waste of time to do anything else. On TechPowerUp, whenever there is a CPU test I may glance at the 4K performance list, but there's never anything interesting going on there. Why would it be any different in video format?

  • @joker927
    @joker927 หลายเดือนก่อน +3

    1% lows. This is what I am interested in as a 4K gamer with a 4090. Will a new CPU help make games smoother? Will this CPU make frame times more consistent? This is valuable info for a halo-product buyer like me.

  • @frecheddy3712
    @frecheddy3712 หลายเดือนก่อน

    Easily one of the most eye opening and helpful videos I have seen in a while. Thank you so much for this! Cheers!

  • @Xavras_Wyzryn98
    @Xavras_Wyzryn98 หลายเดือนก่อน +3

    About CS2 data:
    Surely, even when over 500 FPS is plenty for virtually everyone, the difference between ~250 and ~340 1% lows must be noticeable on, for example, a 360 Hz monitor.
    Aren't 1% lows more important for a competitive shooter than average FPS?

    • @Weissrolf
      @Weissrolf หลายเดือนก่อน

      Can you detect 5 out of 500 frames per second coming less than 1.1 ms later (4 vs. 2.94 ms)? Very likely not.

    • @Xavras_Wyzryn98
      @Xavras_Wyzryn98 หลายเดือนก่อน +1

      @Weissrolf
      I don't think that's how 1% lows work.
      The lows are not evenly spaced across time. A 1% low of 250 vs 340 means that someone will have roughly 3 seconds at that frame rate, or lower, per 5 minutes of playing time, likely bunched together.
      0.1% lows would tell how low these drops can go.
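      For reference, a minimal sketch of how a "1% low" figure is typically derived from captured frame times (synthetic data, and note that definitions vary between outlets):
      ```python
      import random

      # Synthetic frame-time capture in ms; real data would come from a capture
      # tool (an assumed workflow, not tied to this video's exact tooling).
      random.seed(1)
      frametimes_ms = [random.gauss(2.94, 0.4) for _ in range(30_000)]  # ~340 fps average

      # One common definition: the fps equivalent of the 99th-percentile frame time.
      p99_ms = sorted(frametimes_ms)[int(len(frametimes_ms) * 0.99)]
      print(f"1% low: {1000 / p99_ms:.0f} fps")
      # This says nothing about *when* the slow frames occur: as the reply above
      # points out, they often bunch together, which is why they are felt as stutter.
      ```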

  • @Krenisphia
    @Krenisphia หลายเดือนก่อน +7

    There's nothing wrong with 1080p benchmarks. People just want to see their own resolutions ALONGSIDE it.

  • @BenBording
    @BenBording หลายเดือนก่อน +8

    I completely get why you test CPUs specifically at 1080p. Makes a ton of sense when you test CPU vs CPU.
    BUT as one of the lucky people who game at 5160x1440 ultra settings with a 4090 on my old 5900X, this review still hit the bullseye. Almost. These benchmarks at higher resolutions are very helpful, for me personally, to judge whether or not an upgrade is even worth it in my specific use case. So thanks for the video; this was exactly what I was missing as a bonus!

    • @pepijngeorge9959
      @pepijngeorge9959 หลายเดือนก่อน +3

      The point of the video is that high-res testing doesn't tell you anything about the CPU that 1080p testing doesn't already tell you.

    • @Dempig
      @Dempig หลายเดือนก่อน +1

      I wouldn't call 5160x1440 lucky; ultrawide looks really bad lol, just get a big 65"+ OLED or something, it would look so much better. What's the point of an extra-wide display with a teeny tiny vertical view? Most games look very bad on ultrawide, especially first person.

    • @BenBording
      @BenBording หลายเดือนก่อน +3

      @@Dempig lol you've obviously never tried it then, but perhaps that's not your fault. 65" is just neckbreaking and looks trash up close. Most games look amazing on super ultrawide. ;)

    • @BenBording
      @BenBording หลายเดือนก่อน

      @@pepijngeorge9959 No but it does tell you something more detailed in very specific use cases, as Steve rightly points out multiple times in the video

    • @Dempig
      @Dempig หลายเดือนก่อน +1

      @@BenBording Nah man, ultrawide is terrible for literally every game lol, like I said it completely ruins first-person games, and having such a tiny vertical viewing range with a large horizontal viewing range makes no sense.

  • @kusumayogi7956
    @kusumayogi7956 8 วันที่ผ่านมา +2

    We need native 4K benchmarks, not upscaling like this.

  • @BlazeBullet
    @BlazeBullet หลายเดือนก่อน +5

    Thank you Steve! Glad you got to sit down. I now know that my 5900x is fine since I game only at 4k at 60 fps. 👍🏼

  • @x4it3n
    @x4it3n หลายเดือนก่อน +13

    4K Performance/Balanced/Quality are far from native 4K since they use much lower render resolutions, but yeah. Imo the real selling point of a 9800X3D at 4K would be the 1% and 0.1% lows, which can probably give you a smoother experience due to fewer FPS drops. I have a 5900X and most games still run great on my 4090 at 4K. But when the 5090 releases in a few months, a 9800X3D might be needed...

    • @CallMeRabbitzUSVI
      @CallMeRabbitzUSVI หลายเดือนก่อน +6

      Yep! Was happy to see him actually doing 4K testing, but then very upset that he is using upscaled "4K". Steve is a joke and I will continue to not take him seriously 😂

    • @RADkate
      @RADkate หลายเดือนก่อน

      there is zero reason not to use dlss at 4k

  • @adamchambers1393
    @adamchambers1393 หลายเดือนก่อน +3

    My view is that I am gaming at 4K native now and am waiting to purchase the LG 45" 5120x2160, and I just want to see whether it's worth upgrading from the 5950X when I purchase an RTX 5090 next year, or whether I can save the £1k on CPU, mobo, RAM etc. if the average increase remains at only ~5%, and wait for the Zen 6 10800X3D or whatever it will be. I can't see that without some basic 4K numbers. If I looked at the 1080p results without the 4K shown, I would have to assume I'd get the same uplift, as there would be no data to tell me there isn't that uplift (yes, I'm aware an educated guess means it won't be exactly those numbers, but an uplift % still helps the decision, because at some point a GPU will allow that overhead).

  • @martheunen
    @martheunen หลายเดือนก่อน

    Excellent video! Thank you! Possibly the best one on this topic yet! After the last video I was wondering if you'd do one of these explanatory videos about high res vs CPU etc.; my wording in the comments of said previous video was somewhat lackluster, but here now is exactly the type of video I was wondering if you'd make.
    I guess these kinds of videos are not your favorite to make, but personally I enjoy them and hope to learn something new about current-gen hardware, and I also think they are very good for newcomers.
    Unfortunately for you, I guess this won't be the last one of its kind, but again, every now and then (once every 2 years?) a refresher on this topic is useful for the community, I think.

  • @Rafyelzz
    @Rafyelzz หลายเดือนก่อน +4

    I don't think most people are criticising the methodology, but rather the fact that it doesn't help them make a decision, because most of those considering this purchase don't use 1080p anymore. That's all. For me, understanding the gains at 1080p is useful for understanding the technical innovation, but I go to the 4K tests to see if it's worth it for me. Dramatically different point of view.
    This video is great because at the end it shows how GPU bottlenecks impact fps with different CPUs.

  • @DaKrawnik
    @DaKrawnik หลายเดือนก่อน +8

    Just do 1080p and 1440p and be done with it.

    • @Coolmouse777
      @Coolmouse777 หลายเดือนก่อน +4

      no reason to do 1440p at all

    • @LucasHolt
      @LucasHolt หลายเดือนก่อน +2

      @@Coolmouse777 it's becoming the default res for new monitors so it will matter

    • @TAG_Underground
      @TAG_Underground หลายเดือนก่อน +2

      @@Coolmouse777 More people game on 1440p than 4K.

    • @Coolmouse777
      @Coolmouse777 หลายเดือนก่อน

      @@LucasHolt So what? Performance at 1440p and 1080p is the same when CPU bottlenecked.

    • @LucasHolt
      @LucasHolt หลายเดือนก่อน +1

      @@Coolmouse777 In the real world, people want to know how the game will work on their computer. As 1440p becomes the default, that is what people will want to see. This happened with previous resolution bumps too. Otherwise, they'd still be benchmarking at 640x480.

  • @B1u35ky
    @B1u35ky หลายเดือนก่อน +30

    Balanced upscaling is not even close to 4K though

    • @kjaesp
      @kjaesp หลายเดือนก่อน +5

      What do you mean? I love spending north of 2k on a gpu not to play 4k native, that's why I play with RT on! (jk of course)

    • @kennethwilson8236
      @kennethwilson8236 หลายเดือนก่อน +6

      Yeah, thought that was an odd way to test, and which upscaling method was used? His own survey only had 8% of polled users using balanced mode

    • @kerkertrandov459
      @kerkertrandov459 หลายเดือนก่อน +6

      @@B1u35ky Nobody plays 4K without DLSS. I do agree DLSS Quality was better to test than Balanced tho

    • @B1u35ky
      @B1u35ky หลายเดือนก่อน +3

      @@kerkertrandov459 I play 4k either native or dlss quality. So yeah people don’t need upscaling all the time

    • @crisium3945
      @crisium3945 หลายเดือนก่อน +1

      1252p with an upscaling performance penalty makes it roughly comparable to native 1440p. So he actually tested 1440p.

  • @Shini1984
    @Shini1984 หลายเดือนก่อน

    Thanks for detailed results and explanations! Much appreciated!

  • @BrutalSavageYT
    @BrutalSavageYT หลายเดือนก่อน +3

    I actually hate the fact DLSS has become such a crutch; just finding a benchmark without it at high resolutions is a nightmare.

  • @PopTart-j8u
    @PopTart-j8u หลายเดือนก่อน +5

    This is exactly the video I was looking for, thank you!

  • @tical2399
    @tical2399 หลายเดือนก่อน +7

    All the snark is fine and good, but there are people like me who only play at 4K. When I'm looking at a CPU I want to see what it does at 4K, as in exact numbers, not the "take the 1080p results and divide by pi" nonsense some reviewers do. You can do your whole speech about 1080p results and then go "oh, btw, here are the 1440p and 4K results in another graph." Why is that so hard to do? Show the results and let people make whatever (right or wrong) assumptions they want about future performance.

    • @randomexcalmain4512
      @randomexcalmain4512 หลายเดือนก่อน

      Why not check the already existing GPU benchmarks?

    • @Lemurion287
      @Lemurion287 หลายเดือนก่อน +1

      Because Steve only has so much time and so many resources. Benching everything at 1080p gives him the most data for the amount of time and effort invested. Adding additional high resolution benchmarks won't change his conclusions but will increase his workload--and potentially take up the time needed to investigate real outliers that might be important.

    • @thischannel1071
      @thischannel1071 หลายเดือนก่อน

      @@Lemurion287 Didn't Steve say in this or HU's previous discussion video that they always test at 1440p and 4k as well, and just didn't include the results in their 9800X3D review video? So, HU already spends the time. They just choose to omit the information in their reviews.

    • @avatarion
      @avatarion หลายเดือนก่อน +1

      At native 4K you only have to see one semi-modern CPU running a game and you pretty much know the rest.

  • @VicAndRoll
    @VicAndRoll หลายเดือนก่อน

    I consider myself a novice at building PCs (only built one so far) and I totally understand the CPU testing method; I still thoroughly enjoyed the explanation video, and it gives me confidence in any upgrade decisions.

  • @jakefromearth5604
    @jakefromearth5604 หลายเดือนก่อน +3

    It's a pity you're not comparing the 9800X3D with the 7800X3D.

  • @MaggotCZ
    @MaggotCZ หลายเดือนก่อน +4

    There is a difference between using the best quality mode for upscaling and using Balanced or below, because at that point even 1440p native is more taxing on the GPU and looks better. You practically tested 1253p with a 4090; lmao, no shit that's still CPU-tied.

    • @samgragas8467
      @samgragas8467 หลายเดือนก่อน

      Balanced looks better and is a bit more taxing. In UE 5 it is less taxing btw.

  • @mananabanana
    @mananabanana หลายเดือนก่อน +4

    You could have made this admittedly great video without being salty throughout and it would have been a great watching experience for us and you'd be more relaxed as well. Don't get caught up with the bottom of the barrel commenters Steve! For our sake and yours.

    • @Hardwareunboxed
      @Hardwareunboxed  หลายเดือนก่อน +1

      Salty? WTF I made a few jokes while trying to explain the testing.

  • @etmargallo
    @etmargallo หลายเดือนก่อน

    This is the video I've been looking for - great job!

  • @AdrusFTS
    @AdrusFTS หลายเดือนก่อน +14

    I don't think low-res CPU testing is bad; it's THE way of testing CPUs with no bottleneck. But using 4K with Balanced upscaling as an argument is really dumb. 4K Balanced looks awful; it's not realistic, nobody playing at 4K would use Balanced upscaling. At that point a 1440p monitor would look better; 4K Balanced is basically 1080p with a filter.

    • @prussell890
      @prussell890 หลายเดือนก่อน +4

      You are totally correct. 4K DLSS Balanced renders at roughly 1253p upscaled to 4K and looks poor. A very misleading video, convincing people to believe that the 9800X3D is significantly better at 4K. 4K DLSS Quality, which is 1440p upscaled to 4K, is the minimum I use for 4K upscaling; that would have a completely different result. I feel he is being sponsored by AMD, to be honest.

  • @TheSeptu
    @TheSeptu หลายเดือนก่อน +21

    When everyone asked you to test at 4K and you instead used Balanced upscaling, which is barely above 1080p, you absolutely are trying to push a false narrative that it's irrelevant testing by not even doing it. When people asked for 4K testing they meant 4K native, and you know it. In fact, Linus already DID do the actual testing that you refuse to do, and showed exactly what people were saying: that there is little to no uplift at 4K native, quickly proving your narrative false. It's pretty dishonest of you, and weird that you're so defensive and aggressively insulting to your viewers for just asking you to do your job and simply do full testing.
    Dude, just admit when you're wrong.

    • @jacks3735
      @jacks3735 หลายเดือนก่อน +6

      No idea why he's being so arrogant. People don't have access to hardware and were just asking for 4k tests...

  • @CookieManCookies
    @CookieManCookies หลายเดือนก่อน +4

    If you are spending $800 on a new motherboard and CPU, you're crazy to think that these people can't afford a 4K gaming display, or that they only care about dollars per frame. You don't need to choose the exact same settings people choose in real life; "High" is pretty representative. For me, 4K benchmarking is a valid concern because I'm coming from a 3950X and my big choice is between the 285K and the 9800X3D. I haven't gamed at 1080p in ~20 years, so yes, it's completely pointless benchmarking unless you're one of the 100 people in Counter-Strike fighting for 300+ fps.

    • @tdrm
      @tdrm หลายเดือนก่อน +1

      You seem clueless about Counter-Strike popularity. That game has millions of active players, not 100.

  • @troik
    @troik หลายเดือนก่อน +2

    thank you for this video, here is why I'd like to see more 4k testing and what I would actually need from that:
    I totally understand why low-resolution testing is done: to isolate the CPU performance and keep the GPU performance mostly out of it, to make CPUs comparable. My counter-argument would be that if it's not actually about the fps, but about the difference between the CPUs, then a Cinebench score or looking up the gflops of the CPU would get me 90% there (I admit that X3D makes this more complicated).
    My thinking is this, I'm gaming at 4k/120fps (at least as a target) so I'm GPU bound most of the time. With my aging 9900k in some games I sometimes reach 50-60% CPU utilization. That suggests that I might still have headroom, but 100% would mean perfect utilization on all 16 threads and no game scales that perfectly across all threads, so it might be that I'm CPU bound on a frame here and there without realizing it.
    Switching to a new CPU won't double my performance; I might gain maybe 10-20% depending on the hardware choices I make (board, RAM etc).
    So most likely I'd just see my CPU usage go down to 20-30% ingame.
    Now I come to my question: is it possible, due to how modern CPUs work, that at 30% utilization a CPU behaves differently than at 100%?
    Would it be possible, if not everything scales perfectly, that at 30% one CPU might be more efficient than another CPU, and that this might differ compared to 100%?