The question is who buys a 3080 Ti/3090 to play at 1080p; you can go with a much cheaper GPU at that point. At 4K the bottleneck is unnoticeable, to the point that upgrading your CPU for gaming is irrelevant. If you have a solid system with a 4770K and you play at 4K, you don't need to upgrade anything other than your GPU to keep up with games. I've been running a 3080 Ti with a 2600K OC'd to 4.5Ghz with no issues at 4K@60, no matter the game.
I run an 8700k OCed to 5 GHz with a 3090 that I got 2 years ago cause I was early on the EVGA list. I also nabbed a 28" G7 4k 144hz display and most games I can run at native 4k resolution with max graphics settings and get 90+ fps. That amount of eye candy at that framerate is pretty good imo. If I am in a game where I really feel that I need the higher fps, I bump it to 1440 and the fps pegs at 144 with no stutters or dropped frames. Also, the only reason I haven't upgraded my CPU yet is that I am waiting for the Ryzen 7000 chips so that I will actually have a chip-based upgrade path with socket AM5 rather than having to get a new CPU+mobo EVERY generation on the Intel platform. Intel is nice and makes good chips, but with AMD actually being competitive again, Intel needs to step up their game in regards to consumer friendliness since their chips are trading blows with AMD. Give us a socket that lasts for more than a generation or 2 and Intel will be a good choice, especially with mobo prices rising like they have been recently. Adding a $300+ mobo tax for going with an Intel chip isn't going to fly for a large part of the consumer base.
@@christophervanzetta No, I figured that it would be DDR5 only. It only makes sense if they are going to support the platform for 3 generations again. Imagine people buying a DDR4-based AM5 mobo and then complaining because they can't upgrade the chip to a Ryzen 11k series chip that people who got a DDR5 version of the same mobo can. And you know that if people could do that, they would, along with making the overall experience worse/more limited for everyone else.
Great video! It would be awesome to see this done with higher resolutions as well. I'm still rocking the 8700k, but I'm playing at 3440x1440 where the GPU does more of the heavy lifting. I wonder how much I would gain at those resolutions by switching CPU.
How has it been working? I have the same CPU, was gonna build an entirely new PC for a 4090, but after some reading I think I'm going to delid and OC it, and see how well it works first
Very interesting. I have a 9700k & a 3080ti. Been considering upgrading the CPU but want to wait another year or so until DDR5 is a bit more established.
@Defective Degenerate It's definitely still a decent setup, but I have a 9900k with a 3080ti and it's definitely bottlenecked quite a bit in a bunch of the games I play...I would probably wait until the next round of CPUs to come out though, that's what I'm going to do...
Most likely not much of a jump since it has enough full cores for any game on the market right now and in the next years. If you are already at 1440p or even consider going to 4k in 1-2 years there are only a dozen or so percent to gain in the majority of games. So for the couple hundred €/$, a better GPU would also achieve this gain.
I went from an i9 9900k manually tuned (with manually tuned samsung B-die memory) to an i7 12700KF, tuned it manually and using the same memory. Differences are huge on a 3070Ti overclocked. I play games like COD and Wow.
@@Manakuski Yea, I just set up my bro's new system; he had a 9900k with 4000 cl15 mem and a 3080ti, and when I put in the 12900k with the same ram it was a very noticeable difference...I'm running a 9900k with a 3080ti myself right now but I think I will wait just a little longer until the new chips come out before I buy anything...My bro has the exact same setup that I do except for the 12900k vs 9900k and there have been a couple games with as much as a 20-25% difference...It's really amazing how good these new chips are getting
In this “fly by the numbers” comparison there are big differences everywhere. There are also huge bottlenecks in performance that are not in the numbers; this is perfectly and well said in this video. This is the truth. I experienced the same.
Running a 4790k at 4.8Ghz delidded + gtx 1070 works great even on my 1440p monitor. I think I'll have to change CPU and GPU in the next year, but I'm waiting for the 14700k or an AMD CPU if it performs better.
Still running my 4790k on air at 4.8 for years, no de-lid though but it runs like a banshee! Picked up a 3080 for the price of a small car last year and think I am CPU bottlenecked...
i7 7700K with a 1080ti at 1440p aiming for 144hz is a pretty balanced setup (also I don't have any new games queued up I'd love to play). Planning a full system upgrade in fall.
I'm going to go out on a limb here and say that most AAA titles are pretty poorly optimized. CPU bottlenecks would likely be way less common if people wrote efficient engines instead of relicensing others and strapping their even more poorly optimized code to them. The modern Dooms (2016, Eternal) can run on a potato. An Ivy Bridge CPU is _more_ than enough to get 120fps with a good GPU. Granted, some of that is their choice of using Vulkan, so they don't have to worry about a GPU vendor's inefficient OpenGL implementation, but a well optimized engine goes a long way. All of the modern amenities of newer Intel and AMD microarchitectures are more about supplementing poor software design than actually being absolutely essential for the target framerate. Branchy, crappy code puts more strain on the branch predictor. Purely scalar code that isn't vectorized is sometimes an order of magnitude worse than code that's explicitly SIMD vectorized. It doesn't help that object oriented code effectively forces data copies, and even when being absolutely careful to use move constructors, references, and everything else to prevent wholesale copies, the data layout is far from optimal for an autovectorizing compiler or a reasonable prefetch algorithm.
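A rough, purely illustrative sketch of that last point (not from any real engine, and Python/numpy standing in for what a C++ engine would do with SIMD intrinsics): the same per-entity update written as an object-per-entity loop versus a contiguous, vectorizable layout. All names and numbers here are made up for the example.

    # Illustrative only: object-per-entity ("array of structs") update vs. a
    # contiguous "struct of arrays" layout that bulk math / SIMD can chew through.
    import time
    import numpy as np

    N = 1_000_000
    dt = 0.016  # one 60 fps frame

    # Object-per-entity: poor locality, scalar work per element.
    entities = [{"x": 0.0, "vx": 1.0} for _ in range(N)]
    t0 = time.perf_counter()
    for e in entities:
        e["x"] += e["vx"] * dt
    t_scalar = time.perf_counter() - t0

    # Contiguous arrays: one vectorized operation over all entities.
    x, vx = np.zeros(N), np.ones(N)
    t0 = time.perf_counter()
    x += vx * dt
    t_vector = time.perf_counter() - t0

    print(f"scalar loop: {t_scalar*1000:.1f} ms, vectorized: {t_vector*1000:.1f} ms")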
4930k @ 4.5GHz and DDR3 2400 8x4GB is decent but even with a RTX 2080 you lose fps. I still could get 4k@60fps with my game of choice at the time with the RTX 2080. I am now rocking a 10900k @ 5.1GHz with the DDR4 RAM at 4000 CL15 with 4x8GB and a RTX 3080 ti. Performance is still decent on the 10900k and I can max out the RTX 3080 ti.
@@GodKitty677 I should preface that I play at 2560x1600, so somewhere between 1440p and 4k. I've upgraded since, but a sandy bridge e was giving some very acceptable framerates on that engine
I agree with some of this, but I think you're simplifying the situation a bit too much and getting hung up on engines as the key performance factor. A well-written engine doesn't automatically provide efficient and fast gameplay, and at the level of AAA titles it's probably one of the least critical parts of the performance puzzle. CPU-bound performance bottlenecks usually stem from downstream consuming code (e.g. iterating entity lists too frequently, accidental O(n^2) or worse from nesting loops, doing heap allocs and deallocs instead of pooling, etc.) rather than the engine itself. GPU-bound performance bottlenecks often come from poor asset optimisation, especially excessive triangle counts on meshes and improper use of quads. Manual micro-optimisations that delve into things like vtable layout rarely deliver significant gains unless you're really starved for memory bandwidth (L2/L3 will handle the rest), and such efforts usually detract from engine usability, which can have a much bigger impact on the overall quality of the game - remember, your devs are users too and usability is important! Most studios would be far better off investing perf optimisation time into tooling and process management than they would tinkering with vectorisation and allocators. An asset and testing pipeline that can automatically catch checkins of unoptimised meshes or performance regressions between builds provides a lot of benefit for a relatively small amount of investment, and that work is transferrable between games. Better still, it improves usability and shortens iteration times on catching perf regressions, so your devs/artists can use their time more efficiently. The one case where the engine design really does matter is multithreading, and unfortunately it's one of those cases where there's no one-size-fits-all solution for all games. Distributing NPC AI and entity ticks over a threadpool makes sense for an RTS, but adds unnecessary complexity to many other games, and there are complex cost/benefit analyses to be made for multithreading (especially considering issues like code complexity, scheduler determinism, atomicity, synchronisation latency, and NUMA) that may mean it doesn't make sense to utilise threading to the greatest degree possible across the whole game. Often you just end up pushing some async jobs to a threadpool and leaving the core loop synchronous because otherwise it becomes a nightmare to write code for it.
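As a concrete, purely illustrative example of the "accidental O(n^2)" case mentioned above (not taken from any shipping engine): a naive every-entity-against-every-entity proximity check versus bucketing entities into a coarse grid first. The entity count, radius, and layout are invented for the sketch.

    # Illustrative sketch: naive O(n^2) proximity test vs. a coarse grid.
    import random
    from collections import defaultdict

    random.seed(0)
    entities = [(random.uniform(0, 1000), random.uniform(0, 1000)) for _ in range(2000)]
    RADIUS = 10.0

    def near(a, b):
        dx, dy = a[0] - b[0], a[1] - b[1]
        return dx * dx + dy * dy <= RADIUS * RADIUS

    # Naive: every entity against every other, O(n^2) per frame.
    naive = sum(1 for i, a in enumerate(entities)
                  for b in entities[i + 1:] if near(a, b))

    # Bucketed: hash into RADIUS-sized cells, only compare against neighbouring cells.
    grid = defaultdict(list)
    for idx, (px, py) in enumerate(entities):
        grid[(int(px // RADIUS), int(py // RADIUS))].append(idx)

    bucketed = 0
    for (cx, cy), members in grid.items():
        cand = [j for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                  for j in grid.get((cx + dx, cy + dy), [])]
        for i in members:
            bucketed += sum(1 for j in cand if j > i and near(entities[i], entities[j]))

    assert naive == bucketed  # same answer, a fraction of the comparisons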
@@gsuberland ah yes, the "coroutine" pattern of multithreading. Often it finds itself unintentionally locking on something else or not really operating with any concurrency at all. I get that tight game loops are tough, and the design of such things is highly dependent on the type of the title. I will say that there are often serial loops in most code that could very much benefit from SIMD, but the OOP data layout is too cumbersome to make use of it. Even idTech4's SIMD code was just limited to a handful of 4x4 linear algebra operations for the GL math (there were some limited optimizations with regard to lighting calculations and sound, too). The branchy nature of a game can make the code rather SIMD unfriendly. The whole entity design pattern puts the data in an awkward place. Additionally a lot of game engines are designed to cull specific assets for efficiency in the renderer, so you end up having holes in your SIMD operands for the invisible triangles (or you end up needing to copy them). Of course some of this can be supplanted by the "graphics compute" capabilities in shaders, but then it's a latency hit to get it back to host memory (assuming said calculations are important for the game logic).
@@noreng9333 Not particularly, as the 9900K has 2 more cores and 4 more threads, and the 10900K has 4 more cores and 8 more threads compared to the 8700K. Those extra cores, along with the slight uplift in IPC, and the higher clock speeds matter a lot.
I personally run a 4/2 cycle on my hardware. New CPU and mobo every 4 years and a new GPU every 2 years. I've gone 4790K>8700K>12900K. That has been a pretty solid upgrade each time. For the 14900K I should be able to keep my current DDR5 kit. For GPU I've gone from 780 (+1 year after release)>980 Ti (+1 year after release)>1080 Ti (+1.5 years after release)>3080 (launch day). Monitors went 1080p/60hz>3440*1440/100hz>3440*1440/175hz
I remember distinctly, the setup before I switched to my 7950x system: I had a Broadwell Xeon 6 core with a 3080 XC3 Hybrid, running Linux. I was blown away playing Icarus when putting settings lower made my performance worse. Before that I had a 3800x, but I helped out my nephew because I knew I was gonna upgrade soon, and he had a Bulldozer machine he was trying to run Apex Legends on, and failing miserably.
Just recently I was able to upgrade from 1st gen ryzen, to the latest (5000) and noticed I was actually pretty bottlenecked with a 1080Ti. I can't imagine how bad it would've been with a 3000 series. (Could make for an interesting video if you still had some older hardware laying around, not sure with Intel. However with AMD you can easily upgrade with older AM4 boards now if you're on an older processor, and gain some hefty performance depending on your video card.)
For me Ryzen only started to get real with the 3000 series and up. With the first ones, comparing their quad core to my 2011 quad core CPU at 4.9ghz, there wasn't really much of a big difference in performance; it was mostly just to stay on par with Intel, since for a long time AMD was mostly doing nothing before Ryzen.
The two things that used to make the biggest difference from a usability standpoint were the hard drive speed and the screen you're looking at. A bigger monitor always = better experience, and the faster computer, which USED to mean faster storage, meant running 10k RPM Raptors, but now SSDs have crushed that and NVMe has demolished it. So now... the single usability factor is the size of your monitor. Get a larger, but not too large, monitor. OR get 2 more of your existing monitor. 3x and a larger desk maybe. Then your user experience will be altered like a state of consciousness. And you can keep the same old system and just upgrade the GPU when needed for the games you're playing. The platform will be fine.
@@guily6669 Yeah, I think the original Ryzen was about the same as a 4.4 or 4.5ghz OC'd Sandy Bridge performance-wise (not stock 3.4ghz). Which speaks volumes about old Intel, but new Intel is nowhere near what they used to be. Still, anything new Intel introduces is behind AMD, so there's that, and Ryzen 4 here we come, but the V-cache one is the one to wait for.
@@Cinnabuns2009 I'm waiting to see the AM5 socket in action. Intel did such a bad job with their latest socket locking mechanism, which applies uneven pressure and bends the CPU, which only increases temps; hope AMD doesn't go so cheap on the CPU locking mechanism for AM5. I still have my i7 2600K but sadly one core recently burned out and it's now 3 cores, 6 threads. I got a really nice deal on an i7 3770K and I'm waiting for it to arrive; it's just so I can keep playing well while I wait for next gen hardware and better prices. Anyway, I can't deny there's really that huge a difference on this CPU from 4 to 3 cores for a Strix RX580, as I'm basically GPU capped all the time in the games I play. It mostly only increased CPU usage, but it never goes much above 80% and maybe a couple FPS are lost, though surely in some games the stuttering got slightly worse, but it's almost the same as 4 cores. I thought I'd be much more screwed...
It'd be interesting to see comparisons against old flagship platforms like X79. A lot of people are still holding onto those. One of my rigs with the 8-core Xeon E5-1680 v2 @ 4.6 GHz is pushing 10 years old and still holds its own. Overall I'd consider that the best DDR3 setup you can get. Now that cores are more and more important, 16 unlocked threads with quad channel 2400 C9 shows what Intel was actually capable of back then. Certainly not a practical comparison these days, so it's perfect for the channel.
There are a few BIOS mods out there that patch/modify a Z170 chipset based board to accept up to an i9-9900K (but you'll have to bridge a pin or three, and block others on the CPU itself)
I liked this video, but being Team Red I'd have liked to see this done using AMD Ryzen comparisons. Thanks, Roman, happy to see you recovered from Covid-19.
Mate, you are using OC'd DDR5 RAM in that 12900K; you need to compare apples to apples if you are talking about CPU comparisons. Stick that 12900 on 3600 DDR4 and see what frame drop you are going to get. Stick to your regular vids and leave the GPU/CPU comparisons to the Steves
8700k, upgraded from 1050ti to a 3070. 2k 60hz display. I don't try to keep any fps counters on as I play on vsync and it's always smooth. It was worth it.
Ok great test idea, here's a question for you, should the 7700k owners be looking at the 8700k or the 6700k? *Watches* Answer at 7:38 very nice! I also really like how you introduced the CPUs one by one that worked very well instead of an overload of information immediately, nice! You even covered GPU vs CPU limited, especially at the resolution extremes.
They can buy a QTJ0 or QTJ1 (8 cores / 16 threads / can be unlocked to 5.0 GHz on 100/200 series chipsets like B150/H170/Z170/Z270, and also on B365/Z370 in the 300 series after modifying the BIOS).
@@budgetking2591 I have a QTJ1, overclocked on a B365M (also on a B150M) to 4.6 GHz on all cores (I also tried 5.0 - it works, but eats a lot), with memory at 3200 MHz. If you have an ASUS motherboard, ask the seller to send a chip with a modified BIOS along with the CPU (only on ASUS 100-series boards can the chip be removed); if Gigabyte or MSI, ask only for the modified BIOS file (those boards can be flashed from the BIOS with the flash utility). Otherwise it can only be flashed with a programmer!
Loved your 9900k OC video when it came up. I'm still rocking this CPU with a 3090, both water cooled. Could you please make an updated video on this CPU, as I'm not convinced to upgrade yet. Thank you and keep up the good work
Very good test. I have an i7 8700K at 4.9 GHz all cores and a 1080p monitor; what GPU would not be a bottleneck at 1080p with it? Nvidia GeForce: 4070 Ti/4070/4060? 3090 Ti/3090/3080/3070 Ti/3060 Ti? And/or AMD Radeon 7970/7950/7800/7700? 6900XT/6800XT/6700XT?
Love these comparison videos 🥰👍😇. Recently my brother was talking about another upgrade to his PC… He's got an Intel i7-10700 (non-K), 16 GB (2x 8gb) RAM and an RTX 3070. He plays a few single-player story games and some online e-sport titles. I told him to save his cash for a few years and keep what he has for the time being. I recently gave him a 🥰🤩🤯 1440p 144hz monitor as he was still gaming on a 24” 75hz monitor 🤯 and that was a wake-up call for him with his gaming experience online, which made him question his PC's performance. The monitor made the difference for him, not a PC upgrade 👍😱
I really felt the lack of cores when Cyberpunk 2077 launched. At first I used to "play the game" (struggling at low fps) at 3440x1440 running a 1080ti @ 2100 / 6000 + 8086k @ 5.2ghz + 4300 18-18-28-2t RAM. As soon as I got a 3080 I encountered very bad stuttering due to the lack of cores on the 8086k... so when I was able to afford a new platform I went with a 5900x from AMD and everything was solved. It's sad that 6 core CPUs aged that badly in some scenarios.
@Salt Maker Ye? Say that to my old rig that used to stutter like hell, braindead. As soon as I put a higher core count CPU in my PC the stutter "magically" disappeared.
I run the 5820K and have done for the last 7 years, overclocked at 4.3ghz. It has done me proud running 2 monitors, a 32" 1440p @ 165hz and a 27" 1080p @ 144hz. I've never had a problem running any game until I started playing Star Citizen lol. Not looking for a new upgrade. Oh, running an RTX2080. I've never bothered about FPS as long as the game runs smooth
Exactly the video I needed!! I have a 4770(non-k), 16gb 1600mhz ddr3, rx 5500 XT 8gb.. considering upgrading to a 12100f platform, but wasn't sure if it'd be worth it
I routinely hit 144 fps at 1440p with my 6700k and 3080 ti unless it's something like Cyberpunk, but even then I'm hitting 70-80 fps, which is more than playable. And that's with maxed graphics settings
Instead of buying the 12900K for $550, just get the AMD Ryzen 5 5600 OEM CPU with the ASUS B450M-Pro S TUF Gaming motherboard combo kit for $179.99 at Microcenter and save $370. Then use that $370 to upgrade from a 3060 to a Radeon RX 6900 XT for $699, and now you have 4k gaming where there's no difference between the 12900k and the 5600 (non X)👌
When I bought my 1080ti about 4 years ago, the 3770k was bottlenecking so much that I didn't notice the card had to be RMA'd. When I got an 8700k I kept having blue screens when pushing it, and it turned out it was the card.
The 8700k really needs an overclock to stretch its legs. I understand the comparison is for stock vs stock improvements, but those older CPUs still have a lot to give! 5ghz was a common speed for 8700ks and the 4770k/6700k can easily achieve 4.5+. While the difference wouldn't be huge, it would certainly help frame times and playability while you source a CPU/platform upgrade.
I liked the video, very informative, but I do have a question: why did you skip the 9700K? I have that overclocked to 5ghz with a 3060ti and don't really know how to look for bottlenecks
I know my i9 7900X is a bit of a bottleneck for my 3080 Ti, but I don't wanna switch to Windows 11 just to get a 12th gen i9. But I'll have to bite the bullet sooner or later
You could start even 2 generations lower - 2600k. Amazing that so many years and CPU generations later, in some scenarios the difference wouldn't be noticeable. Also too much GPU power is never a problem. 1080p? It can be high refreshrate monitor, add SSAA or downsampling for extra quality - the GPU load may be higher than in raw 4k with lower Hz.
My i7 6700k is now 9 years old and runs at 4.6ghz with a 2080 ti from Gigabyte Gaming OC. But it's time to build a new setup. However, I think I'll stick with the 2080ti for now and upgrade to the 4090 next year.
Holy crud. I thought this video was gonna be boring, but after seeing the GPU differences, a flagship GPU, paired with the wrong CPU, is at a huge disadvantage. How much of this is PCIe Gen3? Also what about if you're only running games at 4K? At what point is your CPU the limiter? What about lower end CPUs from this generation or even Ryzen (which have many more cores at the high end)? I run a Ryzen 5600X on my TV which is 4K 120Hz, but I have a 5950X on my main rig which I use for work. Am I leaving performance on the table for my 3090 with that mid-range processor? Is the motherboard (being ITX) a limiting factor in some way? I have so many questions now.
All CPUs in these tests were fine for casual gaming. Most people won't pay extra to upgrade to a better CPU for a few frames per second. A CPU bottleneck is most obvious at very low resolutions, such as 720p and also 1080p. I personally do not care if I have 210 or 288 FPS in PUBG. It is more about not having too powerful a GPU in combination with a slow CPU. I have an i9-9900KS with a 360mm AIO and I had to undervolt it and set the clock to 4.8 GHz on all cores. It is perfectly fine with an RTX 4080 and high refresh rate gaming at 4K, and thermals are perfectly under control. No CPU throttling occurs, and when playing CPU-intensive games such as BFV and NFS Heat, temperatures are never higher than 85 degrees Celsius even when gaming for hours
Honestly, the question that simply doesn't get asked enough with respect to gaming upgrades is "what display are you driving?" It eliminates so many variables and basically crops your viable options to something actually worth talking about.
It can certainly be narrowed down to the game, or class of games if the buyer/builder wishes to do so. Personally I think that's a little too rigid and would rather err on the side of overhead knowing I can play a frame hungry release at least reasonably well.
Would like to see 4k comparison
@@kopkila 12900k and 8700k would have the same fps, the 6700k a few fps behind. All GPU limited
@@sdrubaa Thanks, the one I was really interested in though was the 4770k. Currently running it with a 2080ti and was wondering how much the CPU is holding back the GPU in 4k
@@kopkila Watch the CPU and GPU usage during gaming. If your CPU is at or near 100% you are bottlenecked. The GPU usage will indicate by how much.
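If you want numbers rather than eyeballing an overlay, here's a rough logging sketch along those lines (purely illustrative; assumes an NVIDIA card with nvidia-smi on the PATH and the psutil package installed):

    # Log per-core CPU load and GPU utilisation once a second while a game runs.
    import subprocess
    import time
    import psutil

    def gpu_util():
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True)
        return int(out.stdout.strip().splitlines()[0])

    while True:
        cores = psutil.cpu_percent(interval=1.0, percpu=True)
        # One core pinned near 100% while the GPU sits well under 100% is the
        # classic CPU-bottleneck signature.
        print(f"busiest core {max(cores):5.1f}%  avg CPU {sum(cores)/len(cores):5.1f}%  GPU {gpu_util()}%")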
“Back then you were probably running like a 970”
I still am 😢
Me too me too
Gt 730 2gb gddr5😅
I've been waiting for someone to do this kind of comparison video since the Nvidia 3000 series launched, thanks for the great content.
Got that exact same mobo with my 4790k running at 4.6 GHz! Had an r9 fury that crapped out 4 months ago and am looking to upgrade GPU. This is extremely interesting. Thanks!
I run a 4790k at 4.8Ghz paired with a 3080ti. I'm currently getting parts and working on a wall mount with custom cooling. I know it's bottlenecked like hell but soon it will be glorious!
@@martinh2783 My PC is a 4790k (at stock speeds), Maximus Hero VII, 16gb DDR3 RAM with 1080ti and I was thinking about upgrading. But you have my admiration and respect for a glorious PC with that build.
@@jodiepalmer2404 a 3070ti isn't too bad with the 4790k, depends on resolution of course. I run 1440p at mid to high fps with my 4790k at 4.6 with the same mobo and DDR3 amount
@@ShawnStrickland at the moment I'm only running at 1080p (12 year old TV). Do you use Windows 10 or Windows 11? Because I've read some games have problems running.
You could even opt for a 12600KF, which is more than enough for gaming. GN themselves tested how a slightly tuned 12600K is capable of doing even a tiny bit better than the 12900K in gaming. With that, you could go for a 3080 instead of a 3070.
For sure. Not to mention, if you're playing 1080p anyway then even a 3060 ti is more than plenty, leaving room to upgrade to a 2K monitor if you're inclined. 3060 ti can certainly fill those shoes as well if you don't expect triple digit frame rates in all cases.
And you can slot in a 3080 in the future without touching anything else.
And you wouldn't need to fiddle with getting decent thermals from the CPU
Exactly. You can slap a NH-U12A Chromax and have a solid cooler for years to come and have incredible thermals with that CPU.
12400 or a 5600 (non X), whichever is cheaper, then use the rest of the money on a 6900 XT 16GB for $699. I would go all AMD to enable Smart Access Memory (SAM) for more fps.
I went from a 4790K to a 5800X3D and some games jumped more than 100%. And that was just with a 1080ti. I think a lot of it is memory bottlenecks in the games I play.
Edit: That's also why I went with the X3D; it's amazing the kinds of spaghetti code bullshit you can cover up with more cache and faster IO!
I’m looking into upgrading my 5600x to a 5800x3D myself. That way I should be fine on the CPU side for a while, and then just upgrade to a 40 series GPU when they release. The only hard thing is finding stock in Australia but that should improve
@@MarioGoatse the 5900x is cheap and has 70MB of cache, a better choice if you ask me
@@MarioGoatse yeah I agree. Unless you play old, shitty RTS games the X3D isn't worth the upgrade. With a modern GPU I would def go with a 5900x over the X3D, especially if you plan on keeping DDR4/AM4 for a while.
@@MarioGoatse Or you can wait for the rumored 5600X3D or 5900X3D of which the latter has a gigantic cache.
@@G.J.Uytdewilligen Oh wow, I had no idea they were rumoured to be on the horizon already. I knew AMD was going to make more X3D, but wasn’t sure when that was going to happen. Is there a rumoured date yet?
Nobody’s gonna pair a 12900k with a 3060 or play in 1080p with a 3080 Ti
The point of this video is to demonstrate the bottleneck
Jokes on you I paired my 12700k with a titan xp
Hey i know him, he's me.
That's not true at all, it depends on the game. Generally speaking, in multiplayer, WQHD is more worthwhile than 4K. However, there are some players for whom Full HD makes more sense.
@@AcidGubba 4K downsampled to 1080p. Because of chroma subsampling it would look way sharper than 4k on a 4k screen.
This would be interesting to repeat showing at 1440p and 4k resolutions at Ultra with these CPUs, maybe they still stand up with not much difference to new 12th gen or maybe they don't have enough cores still.
This exactly. I see almost every bench score above 60 fps, and where the difference is greater it's about insane fps. PUBG at 289fps vs 437fps is not a real world scenario; just run it at 4K. I sometimes play Rocket League on my 6700K with a 1080TI; it runs at 170fps in 4K so I don't need new hardware for that game.
It's crazy how much of a difference this makes, a 8700 is a huge bottleneck for a 3080ti at 1080p but is not a bottleneck at all for 4k gaming
@@gabrielbrown1027 true but you ve to look at 1% low too, those are your killing experience.
@@gabrielbrown1027 even my 6700k with a 4090 is still ok at 4K
@@ChrisDaytrader Bull lol, I have a 4090 and an 8700k at 5.3ghz; upgrading to a 13900ks increased my FPS massively in most new games at 4k. Even my 8700k was hitting 100% in some games.
Nice video! These are the kinds of videos that REALLY benefit home builders! We thank you - and Intel for suggesting it!
Very impressed with how the i7 4770k showed itself. Of course 1080p is an unrealistic resolution for those who use a 3080 Ti; 1440p or even 3440x1440 is a more realistic scenario. Recently, just for fun, I got a Chinese Huanan X99 BD4 motherboard for 100eur and paired it with an E5 2696v3 (a 2699v3 with DDR3 and DDR4 support and higher clocks), an 18 core / 36 thread 3.8ghz CPU, for 119eur. It runs great with my RTX 3080 and ultrawide monitor; however, to get 3.8ghz on all cores after the turbo boost mod you need to lock the CPU to 12 cores. The Haswell Xeon is a beast; I wish I had a better motherboard to try BCLK overclocking on top of it. Haswell was so ahead of its time
I've been following almost the same upgrade path: 2600k/RX 590 to a 6700k/RX 590, added a 1080TI, then an RTX 3060. Next step is upgrading to a 9700k/RTX 3060; I'm getting the 9700k after upgrading a family member to a 13900k. Thanks for the video. Also looking into delidding the 6700k and building it into an InWin Chopin case.
I think the L2 & L3 cache sizes of the CPUs make a bigger impact than just the 4 CPU cores. More CPU cores do not necessarily correlate with higher FPS; the increase in cache sizes affects this as well.
Agreed, along with base and boost clock frequency increase with the newer CPUs.
Cache doesn't mean shit if your 4 core CPU is hitting 100% load.
@@4gbmeans4gb61 the load is related to cache as well
I would have liked to also see benchmarks at 1440p. Indeed, the 3060 would not push as many frames, but the differences would have been way smaller between the CPUs. It would also make sense because when upgrading the GPU, many people would also change the monitor.
yeah 1080p benchmarks are kinda weird nowadays given how easily most hardware run them
@@floodo1 1080p stresses the CPU instead of the GPU, making CPU differences more evident. There's no point in testing how a CPU affects gaming performance if your tests are all GPU limited.
That's what I'm saying. I gave my sister a 5820K and a 3060 Ti and she's happily running at 1440p.
@@BobBobson are all those games GPU limited at 1440p? Fair point though. In any case it's exactly like Der Bauer said, we have to consider our expected use case, individually.
I'm using a 4790k with a 6700XT and at 1440p i feel it works very well. Don't think I've had a CPU bottleneck yet
Thank you for the video; my upgrade from a 3570k to the 5600x was so noticeable. I ended up getting a 5900x because I plan to keep it for years, even though I don't notice the improvement at this time.
Glad you've made this video. The amount of arguments I've had with numpties who still think their i5 3570k/4670k is good enough when paired with a latest gen GPU is ridiculous.
Currently running a post price drop 3080 with my 4790 non K intel. This is precisely what I wanted to confirm. The 3080 replaced a gtx 1080 that I had for 5 years. I'm waiting for the new AMD or Intel chips to see which way I go. Running games at or near 4k helps. But really looking forward to upgrading the rest of my system. Edit: oh and I did need to update my PSU and case (for better thermals) 😂
Hey
Fellow 4th gen friend, I upgraded my 4770K with a 5800X3D.
Should last me another decade 😅 😸, unless we see some revolution.
At UW1440p it made massive difference for simracing, 150% increase in min. fps.
Your devil should be worried for being replaced soon 😂
if you really needed to have confirmation that a 4790 non k bottlenecks a 3080 to oblivion and beyond, you need to get a reality check lol.
Best Buy US has been fully stocked on FE cards, minus 3080, for the whole week. I had no issues making orders for the missing cards for several of my family and friends needing to finally upgrade. So glad we had patience and just waited because paying actual msrp is such a huge win.
Im a little curious to see how a 3770k stacks in this, its what I have my youngest kid using with a 3060 ti.
you might want to delid that 3770k then OC it :)
also make sure RAM is minimum 1866MHz. Higher is better.
Before upgrading to a 5800X a few months ago, the higher RAM frequency massively helped with my minimum FPS for my OCed 2500k
Hey I'm secretly your long lost brother, pls buy me gpu lol 🤣
@@ayuchanayuko hell yeah! right now the kid is doing summer work to save money for a new board ram n cpu. After he gets his parts going I want to tinker with the 3570k and 3770k to introduce myself to this more hardcore OC'ing.
Thanks Roman! Hello from Melbourne Australia mate!
Thank you for this video. I am using an 8700K OC'd @ 4.8 with a 3090, playing at 4K, and I don't notice any bottleneck; well, maybe a slight bottleneck, but it is doing great.
This is exactly what I was looking for. I'm still using a 2080 Super with the 8700k @ 5.0Ghz for 1080p, but I want to buy a 3440x1440 or maybe a 4K monitor, so I suppose increasing the resolution will at least shift some load from the CPU to the GPU... just considering a 7900 XTX instead of a 4090 because it's half the price and I don't wanna be homeless
I did run my i7 4790k with a 3080Ti but upgraded to a 12400F. Only the lows improved a lot; average fps gained maybe 20 max in any game. If you are casual and want 90+ fps in any game it does it no problem. Still, the new gen is better for me because I only play at the lowest settings for high refresh rates.
So in 10+ years CPUs have managed to improve gaming performance by 50-100% while GPUs improved by about 900%. What is the reason for this?
The 8700k was actually really close to the 12700k setup, and remember that the 8700k's turbo clock was way behind these new chips, so an OC to 5ghz might also shrink the gap.
It would be cool if you could have added an i7-5960X; it would be interesting to see how that holds up with 8 cores
This is exactly what happened to me last year. I was running a 1080ti with my 8700k before getting the chance to grab a 3080ti. Even at 1440p, my poor boy was struggling while the GPU was waiting for work. After a week I caved and upgraded the rest of my rig, but it was an interesting experiment.
How much did you gain on performance upgrading the CPU?
@Zoran from what I have read lately, at 4k in a majority of games an 8700k, delidded and OC'd, performs very similarly to 12th gen+ when both are running 4000 series GPUs. Gonna find out this week
Running a 6700k and a 1060. Currently on the hunt for a used 3070 (Ti). I know I am kinda bottlenecked here, but I don't like the power consumption of a new system. Waiting to upgrade when they start releasing the 12V systems with DDR5 to make the switch. Mostly upgrading because of DLSS.
QTJ0 or QTJ1 (mobile i9-9900k) can be the solution for you.
I7-6700k with 1080Ti pushing an LG 3440X1440 34" ultrawide works well for me. Box in the corner has next system: i7 12700k with 3080Ti. Best to all.
Excellent content as always, Roman. Hope you could repeat these with 1440p and 4k resolutions. Realistically speaking, someone with a 3080 Ti would not be running a 1080p screen? 😜
Or they went for a 1080p high refresh rate monitor like 240hz haha
Agree, it makes more sense to run same comparison at 4K, that would be really interesting to see, shame Roman didn’t throw in higher resolutions as well
My main rig is still a 4790k@4.5; it can play anything at 1080p low but it's reaching its limit. Unfortunately back then I bought 1600mhz RAM, and I tried a few years ago to get faster RAM but couldn't find any in my region. Looking forward to upgrading when AM5 arrives.
I really love that you still do PUBG benchmarks :) I think what was surprising from my experience was how much of a CPU bottleneck there is even at 1440p ultrawide, depending on the settings you choose. Love my 12900k.
Great comparison and well done. I only miss how it compares to 10th gen CPUs. With all the effort to make this video, you could make one more too, so we could have a picture of how it scales with every second generation of CPU.
it's Intel sponsored, so the comparison stayed with old->current CPU
10 series only misses out on the efficiency core stuff and the all unlocked all of the time nature of the 12 series.
The i5-10600k is equal to the i7-8700k at the same frequency.
Running an i7-3820 with a 3060 now; I was already thinking an MB/RAM/CPU/PSU upgrade was definitely needed. Thanks for confirming what I suspected
The 3rd Gen is ancient now, it’s certainly your bottleneck, even at 1440p high.
Hey dude, love the vid. I've got an 8700k @ 5ghz on an AIO, and a 3080ti. I'm playing games like Hunt: Showdown on a 4k 144hz 43" display and achieving around 90-100 fps. Wondering if going to a 5800x3d or waiting for the new 7800X3D would get me any noticeable performance upgrades (20+ FPS)
Interesting, would like to see a similar content with AMD with FX, Ryzen, etc.
oh god i shudder at the thought of thinking how much a bulldozer cpu would bottleneck a 3080
Ryzen 1000 will still bottleneck the gpu less than even the 6700k just because it has more cores and the games are more multithreaded. FX and A series will run awfully tho
Ok, I just figured out my i7-9700k is bottlenecking my 3080ti at 3440x1440 by 14%! Good stuff. Always enjoy your shows.
Great video! good food for thought and like the way you test things
That takes me back. Got my old 4790k with exactly the same ram you had sitting downstairs. Still a very capable bit of kit :) Poor thing is retired now though, wondering what I should do with it
Might be interested in buying it if you're willing to sell, I mean mobo+CPU+RAM.
I'm still using a 4790K / GTX 970 / 16GB as my main machine, lol. No problems with 4K video and 2K gaming, albeit now, after 7 years of daily use, I need to change the GPU thermal paste.
There are a few other factors to keep in mind:
1. Are you planning on upgrading other system parts within the next 6-12 months? If so, perhaps actually getting that faster, more expensive GPU could be worth it for the extra gains later.
2. Are you planning on getting a higher resolution or higher refresh rate display? You might not see big immediate gains, due to being CPU limited, but once you increase the resolution, the load will be moved more towards the GPU rather than the CPU and you'll see a bigger gain.
3. (And this is possibly the most important one) Can you wait? Depending on when you're looking to upgrade, and how long it is until new hardware comes out, it may well be better to simply hold on a little while longer and get either the price reduction on older models due to new hardware being released, or getting a lower (and consequently cheaper) end model of the new generation. If the next release is more than a few months away, then by all means, upgrade. But when it's about 3 months away? Then it's probably wiser to hang on a bit.
The problem with waiting for new hardware (your no. 3) and getting a good deal on last generation hardware a bit cheaper is that this isn't happening any more. As an example, the 12600K is just 100 SEK cheaper than the 13600KF [and the 13600K is around 500 SEK more] in one reputable store where I live, and it's more or less the same for all computer components.
Motherboards are kept in the same price range (like the Z690) with just a small step down compared to the newer chipset (Z790). One part of that is that demand increased for Z690 as the Z790 starting cost (MSRP) made it less interesting, since you can just update a Z690 to use Raptor Lake (the main difference is that Z790 should support Raptor Lake out of the box without a UEFI update).
The margins are rather low for computer hardware stores, and on top of that competition has lessened as most computer stores have closed over time. There isn't much reason to have sales (outside of Black Friday and big events) that bring lower prices. Most computer hardware stores don't hold much stock that they need to sell off, as they make use of just-in-time ordering from the main distributors' warehouses.
This is also reflected in the second-hand or used market, which can now keep prices high as there isn't much of an alternative, and people reselling hardware to recoup what they once paid for it keeps older hardware expensive.
What about a 5.0 GHz OC on that 8700K? I don't have an 8700K, but 4.3 to 5.0 is quite a big jump and would've been interesting to see
Exactly. Even the standard boost for 8700K is 4.7GHz - 4.8GHz.
I was one of the unlucky ones that got a 7700K prior to the 8700K release. I haven't been unhappy with it though, especially after the delid. It has run at 1.32V, 5 GHz all-core since I got it, with max temps around ~65°C.
I guess it will still be as limited as the 6700k OC results shown though as they are pretty similar cpus. Gaming at 4k 60fps, my CPU load is around 50-60%.
Mine at 5.3ghz still bottlenecked the shit out of my 4090.
@@Amksed All core is 4.3ghz boost.
These benchmarks are disingenuous, because of course 1080p is going to be CPU limited; that's basically the maximum frame rate the CPU can handle. But you're not getting those frames at 1440p anyway, because there you're GPU limited. That's why at 9:00 and 9:14 you can see the frame rates are almost the same on all CPUs: it's GPU bottlenecked. Technically CPUs like the 8700K wouldn't even be this bad, but they were all crippled because of that infamous hardware exploit, so a huge overhead had to be added to patch it, and these CPUs basically lost 30% of their rated performance overnight. That's more than enough to explain the difference against the 12900K. But since the 12900K isn't vulnerable, it's free to run at its full speed. So it's not necessarily better at gaming, it just has the advantage of not being held back. Intel should have been class-action sued into oblivion for that exploit, which resulted in basically all older CPUs being crippled with massive performance losses just to patch it. Imagine you bought a 4090 and an exploit came along that required it to basically be turned into a 4080, and Nvidia just said "sorry". Heck no, you'd want your money back. But that's exactly what Intel did, and they knew they could be sued into oblivion, but they lobbied to keep it from happening.
I love your cats. they deserve their own show!
I wish there was an i7-5960X in there too. I'm still running that old beast, with a 6700 XT.
Thank you! I have been waiting for someone to do this. Looks like I need to look at upgrading my 6700K
It depends on what you play and the resolution you use. And of course also the GPU
Soon we will learn that the real bottleneck with CPUs is the amount of cache available to feed tasks to those billions of transistors.
A good comparison would be 1440p vs 4K benchmarks; since upcoming CPUs will have 200MB of L3 cache, that will make for a good bottleneck comparison.
My ASUS monitor is 1080p 75Hz and actually gives a quality image with no sign of ghosting. People talk about 150 fps and I'm fine at 75 fps with vsync, just to eliminate all the tearing. I'm playing Metro 2033 now and even at 60 fps it's a fluid, perfect gaming experience. Just like when I started riding bikes, I just wanted the most powerful and fastest thing out there and couldn't use the power on the streets....
It's amusing to wonder what would happen when upgrading the CPU on Intel, when on that platform it also means upgrading the motherboard, because usually each socket is only supported for two years at most before switching to a new one (and sometimes to more than one socket at a time). On the AMD platform the AM4 socket is in its sixth year, and it's possible to see the real increase in performance going from the first generation of CPUs for that platform up to the latest 5000 series
I went from a 2600K to a 5950X keeping my 980Ti with a 3440 x 1440 100Hz display. The jump in FPS is amazing.
I have been waiting for a video like this for a long time. Thank you very much for finally giving all those ppl on reddit an answer when they ask if X gpu will bottleneck their old system
Question is, who buys a 3080 Ti/3090 to play at 1080p? You can go with a much cheaper GPU at that point.
At 4K the bottleneck is unnoticeable, to the point that upgrading your CPU for gaming is irrelevant.
If you have a solid system with a 4770K and you play at 4K, you don't need to upgrade anything other than your GPU to keep up with games.
I've been running a 3080 Ti with a 2600K OC'd to 4.5 GHz at 4K@60 with no issues, no matter the game.
Well, there are a lot of people; haven't you heard of high refresh rate 1080p monitors?
I run an 8700K OC'd to 5 GHz with a 3090 that I got 2 years ago because I was early on the EVGA list. I also nabbed a 28" G7 4K 144Hz display, and in most games I can run native 4K resolution with max graphics settings and get 90+ fps. That amount of eye candy at that framerate is pretty good imo. If I'm in a game where I really feel I need higher fps, I bump it down to 1440p and the fps pegs at 144 with no stutters or dropped frames. Also, the only reason I haven't upgraded my CPU yet is that I'm waiting for the Ryzen 7000 chips so that I'll actually have a chip-based upgrade path with socket AM5 rather than having to get a new CPU+mobo EVERY generation on the Intel platform. Intel is nice and makes good chips, but with AMD actually being competitive again, Intel needs to step up its game in regards to consumer friendliness since its chips are trading blows with AMD's.
Give us a socket that lasts for more than a generation or 2 and Intel will be a good choice, especially with mobo prices rising like they have been recently. Adding a $300+ mobo tax for going with an Intel chip isn't going to fly for a large part of the consumer base.
AM5 is DDR5 only, so you'll need to spend more money than expected
@@christophervanzetta No, I figured it would be DDR5 only. It only makes sense if they're going to support the platform for 3 generations again. Imagine people buying a DDR4-based AM5 mobo and then complaining because they can't upgrade to a Ryzen 11k series chip that people who got the DDR5 version of the same mobo can. And you know that if people could do that, they would, while making the overall experience worse/more limited for everyone else.
Great video! It would be awesome to see this done at higher resolutions as well. I'm still rocking the 8700K, but I'm playing at 3440x1440 where the GPU does more of the heavy lifting. I wonder how much I would gain at those resolutions by switching CPU.
How has it been working? I have the same CPU and was gonna build an entirely new PC for a 4090, but after some reading I think I'm going to delid and OC it, and see how well it works first
@@dipf7705 this is what I'm gonna be doing myself
Very interesting. I have a 9700k & a 3080ti. Been considering upgrading the CPU but want to wait another year or so until DDR5 is a bit more established.
@Defective Degenerate It's definitely still a decent setup, but I have a 9900K with a 3080 Ti and it's bottlenecked quite a bit in a bunch of the games I play... I would probably wait until the next round of CPUs comes out though, that's what I'm going to do...
Most likely not much of a jump, since it has enough full cores for any game on the market right now and in the next few years. If you're already at 1440p, or even considering going to 4K in a year or two, there's only a dozen or so percent to gain in the majority of games. So for the couple hundred €/$, a better GPU would also achieve this gain.
Almost the same here, 9900KF and 3090, but I'll be upgrading to a 4090 and 13th gen this year so I'm not too bothered
I went from an i9 9900k manually tuned (with manually tuned samsung B-die memory) to an i7 12700KF, tuned it manually and using the same memory. Differences are huge on a 3070Ti overclocked.
I play games like COD and Wow.
@@Manakuski Yea, I just set up my bro's new system; he had a 9900K with 4000 CL15 memory and a 3080 Ti, and when I put in the 12900K with the same RAM it was a very noticeable difference... I'm running a 9900K with a 3080 Ti myself right now, but I think I'll wait just a little longer until the new chips come out before I buy anything... My bro now has the exact same setup that I do except for the 12900K vs 9900K, and there have been a couple of games with as much as a 20-25% difference... It's really amazing how good these new chips are getting
Even in this "fly by the numbers" comparison there are big differences everywhere. There are also huge performance bottlenecks that don't show up in the numbers, which is well explained in this video. This is the truth; I experienced the same.
Running a 4790K at 4.8 GHz, delidded, + a GTX 1070 works great even on my 1440p monitor. I think I'll have to change CPU and GPU in the next year, but I'm waiting for the 14700K, or an AMD CPU if it performs better.
Still running my 4790k on air at 4.8 for years, no de-lid though but it runs like a banshee! Picked up a 3080 for the price of a small car last year and think I am CPU bottlenecked...
i7 7700K with a 1080 Ti at 1440p aiming for 144Hz is a pretty balanced setup (also I don't have any new games queued up I'd love to play). Planning a full system upgrade in fall.
Had a 1060 ROG, got a 3060 Ti now
I'm going to go out on a limb here and say that most AAA titles are pretty poorly optimized. CPU bottlenecks would likely be way less common if people wrote efficient engines instead of licensing someone else's and strapping their even more poorly optimized code on top of it.
The modern Dooms (2016, Eternal) can run on a potato. An ivybridge CPU is _more_ than enough to get 120fps with a good GPU. Granted, some of that is their choice of using Vulkan, so they don't have to worry about a GPU vendor's inefficient openGL implementation, but a well optimized engine goes a long way. All of the modern amenities of newer Intel and AMD microarchitectures are more supplementing poor software design than actually being absolutely essential for the target framerate. Branchy crappy code puts more strain on the branch predictor. Purely scalar code that isn't vectorized sometimes is an order of magnitude worse than code that's explicitly SIMD vectorized. It doesn't help that object oriented code effectively forces data copies and even when being absolutely careful to use move constructors, references, and everything else to prevent wholesale copies, the data layout is far from optimal for an autovectorizing compiler or a reasonable prefetch algorithm.
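To make the data-layout point concrete, here is a minimal, hypothetical C++ sketch (not from the video or any particular engine) contrasting an OOP-style array-of-structs with a struct-of-arrays layout that a compiler can actually auto-vectorize; all type and field names are invented for illustration.

```cpp
#include <cstddef>
#include <vector>

// Array-of-Structs: hot and cold fields interleaved, so each iteration drags
// unrelated data through the cache and the stride hinders auto-vectorization.
struct ParticleAoS {
    float x, y, z;
    float vx, vy, vz;
    int   material;   // cold "OOP-style" payload sitting between hot fields
};

void update_aos(std::vector<ParticleAoS>& ps, float dt) {
    for (auto& p : ps) {
        p.x += p.vx * dt;
        p.y += p.vy * dt;
        p.z += p.vz * dt;
    }
}

// Struct-of-Arrays: each hot field is contiguous, making the loop a clean
// candidate for SIMD (auto-vectorized or with explicit intrinsics).
struct ParticlesSoA {
    std::vector<float> x, y, z;
    std::vector<float> vx, vy, vz;
};

void update_soa(ParticlesSoA& ps, float dt) {
    const std::size_t n = ps.x.size();
    for (std::size_t i = 0; i < n; ++i) {
        ps.x[i] += ps.vx[i] * dt;
        ps.y[i] += ps.vy[i] * dt;
        ps.z[i] += ps.vz[i] * dt;
    }
}
```

With the SoA version, every field is contiguous in memory, so the compiler's vectorizer and the hardware prefetcher both have a much easier job; the AoS loop tends to compile to scalar code.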
4930k @ 4.5GHz and DDR3 2400 8x4GB is decent but even with a RTX 2080 you lose fps. I still could get 4k@60fps with my game of choice at the time with the RTX 2080. I am now rocking a 10900k @ 5.1GHz with the DDR4 RAM at 4000 CL15 with 4x8GB and a RTX 3080 ti. Performance is still decent on the 10900k and I can max out the RTX 3080 ti.
@@GodKitty677 I should preface that I play at 2560x1600, so somewhere between 1440p and 4k. I've upgraded since, but a sandy bridge e was giving some very acceptable framerates on that engine
@@kungfujesus06 I had to upgrade; I was losing too many fps and could no longer hit 4K60.
I agree with some of this, but I think you're simplifying the situation a bit too much and getting hung up on engines as the key performance factor.
A well-written engine doesn't automatically provide efficient and fast gameplay, and at the level of AAA titles it's probably one of the least critical parts of the performance puzzle. CPU-bound performance bottlenecks usually stem from downstream consuming code (e.g. iterating entity lists too frequently, accidental O(n^2) or worse from nesting loops, doing heap allocs and deallocs instead of pooling, etc.) rather than the engine itself. GPU-bound performance bottlenecks often come from poor asset optimisation, especially excessive triangle counts on meshes and improper use of quads. Manual micro-optimisations that delve into things like vtable layout rarely deliver significant gains unless you're really starved for memory bandwidth (L2/L3 will handle the rest), and such efforts usually detract from engine usability, which can have a much bigger impact on the overall quality of the game - remember, your devs are users too and usability is important!
Most studios would be far better off investing perf optimisation time into tooling and process management than they would tinkering with vectorisation and allocators. An asset and testing pipeline that can automatically catch checkins of unoptimised meshes or performance regressions between builds provides a lot of benefit for a relatively small amount of investment, and that work is transferrable between games. Better still, it improves usability and shortens iteration times on catching perf regressions, so your devs/artists can use their time more efficiently.
The one case where the engine design really does matter is multithreading, and unfortunately it's one of those cases where there's no one-size-fits-all solution for all games. Distributing NPC AI and entity ticks over a threadpool makes sense for an RTS, but adds unnecessary complexity to many other games, and there are complex cost/benefit analyses to be made for multithreading (especially considering issues like code complexity, scheduler determinism, atomicity, synchronisation latency, and NUMA) that may mean it doesn't make sense to utilise threading to the greatest degree possible across the whole game. Often you just end up pushing some async jobs to a threadpool and leaving the core loop synchronous because otherwise it becomes a nightmare to write code for it.
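To sketch that last point, here is a rough, hypothetical C++ illustration of the "push some async jobs out, keep the core loop synchronous" shape; std::async stands in for a real job system, and the task names (update_pathfinding, update_gameplay) are made up.

```cpp
#include <functional>
#include <future>
#include <vector>

struct Npc { /* ... */ };
struct World { std::vector<Npc> npcs; };

void update_pathfinding(Npc&) { /* expensive, independent per NPC */ }
void update_gameplay(World&)  { /* order-dependent logic stays single-threaded */ }

void tick(World& world) {
    // Fan out the embarrassingly parallel work (a real engine would reuse a
    // fixed thread pool instead of launching futures per task).
    std::vector<std::future<void>> jobs;
    jobs.reserve(world.npcs.size());
    for (Npc& npc : world.npcs)
        jobs.push_back(std::async(std::launch::async, update_pathfinding, std::ref(npc)));

    // Keep the core loop synchronous and deterministic...
    update_gameplay(world);

    // ...and join before anything downstream reads the pathfinding results.
    for (auto& j : jobs) j.get();
}
```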
@@gsuberland ah yes, the "coroutine" pattern of multithreading. Often it finds itself unintentionally locking on something else, or not really operating with any concurrency at all. I get that tight game loops are tough, and the design of such things is highly dependent on the type of title.
I will say that there are often serial loops in most code that could very much benefit from SIMD but the OOP data layout is too cumbersome to make use of it. Even idtech4's SIMD code was just limited to a handful of 4x4 linear algebra operations for the GL math (there were some limited optimizations with regard to lighting calculations and sound, too).
The branchy nature of a game can make the code rather SIMD unfriendly. The whole entity design pattern puts the data in an awkward place. Additionally, a lot of game engines are designed to cull specific assets for efficiency in the renderer, so you end up having holes in your SIMD operands for the invisible triangles (or you end up needing to copy them). Of course some of this can be supplanted by the "graphics compute" capabilities in shaders, but then it's a latency hit to get the results back to host memory (assuming said calculations are important for the game logic).
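As a small, hypothetical illustration of those "holes in the SIMD operands", one common workaround is to gather only the visible lanes into a dense scratch buffer before the SIMD pass; all names below are invented for the sketch.

```cpp
#include <cstddef>
#include <vector>

struct Transforms {
    std::vector<float> x, y, z;   // SoA positions for all entities
    std::vector<bool>  visible;   // result of the renderer's culling pass
};

// Gather only the visible entries into a dense buffer so the hot SIMD loop
// sees contiguous data with no holes; the remap table lets you scatter the
// results back to the original entities afterwards.
std::vector<std::size_t> compact_visible(const Transforms& t, std::vector<float>& dense_x) {
    std::vector<std::size_t> remap;   // dense index -> original entity index
    dense_x.clear();
    for (std::size_t i = 0; i < t.x.size(); ++i) {
        if (t.visible[i]) {
            remap.push_back(i);
            dense_x.push_back(t.x[i]);
        }
    }
    return remap;
}
```

The copy isn't free, which is exactly the trade-off being described: either you eat the gather/scatter cost or you process the holes.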
I would have loved to see the 9900K and 10900K included in this comparison.
Me as well.
This was sponsored by Intel, and that comparison doesn't work out with the math, since you would more likely spend the money on the graphics card
It would be the same story as the 8700K
@@noreng9333 Not particularly, as the 9900K has 2 more cores and 4 more threads, and the 10900K has 4 more cores and 8 more threads compared to the 8700K. Those extra cores, along with the slight uplift in IPC, and the higher clock speeds matter a lot.
@@noreng9333 no it wouldn't
I personally run a 4/2 cycle on my hardware. New CPU and mobo every 4 years and a new GPU every 2 years.
I've gone 4790K > 8700K > 12900K. That has been a pretty solid upgrade each time. For the 14900K I should be able to keep my current DDR5 kit. For GPUs I've gone from 780 (+1 year after release) > 980 Ti (+1 year after release) > 1080 Ti (+1.5 years after release) > 3080 (launch day).
Monitors went 1080p/60Hz > 3440x1440/100Hz > 3440x1440/175Hz
Imagine skipping the entire Skylake era, that's kinda crazy considering how long it went.
I remember distinctly, the setup before I switched to my 7950X system was a Broadwell Xeon 6-core with a 3080 XC3 Hybrid, running Linux. I was blown away playing Icarus when putting settings lower made my performance worse. Before that I had a 3800X, but I helped out my nephew because I knew I was gonna upgrade soon, and he had a Bulldozer machine he was trying to run Apex Legends on, and failing miserably.
Just recently I was able to upgrade from 1st gen Ryzen to the latest (5000) and noticed I was actually pretty bottlenecked with a 1080 Ti. I can't imagine how bad it would've been with a 3000 series card. (Could make for an interesting video if you still have some older hardware lying around; not sure with Intel, but with AMD you can easily upgrade on older AM4 boards now if you're on an older processor, and gain some hefty performance depending on your video card.)
For me, Ryzen only started to get real with the 3000 series and up. With the first ones, comparing their quad core to my quad core 2011 CPU at 4.9 GHz, there wasn't really much of a difference in performance; it was mostly just to stay on par with Intel, since for a long time AMD was mostly doing nothing before Ryzen.
The two things that used to make the biggest difference from a usability standpoint were the hard drive speed and the screen you're looking at. A bigger monitor always = a better experience, and "faster computer" used to mean faster storage, i.e. running 10k RPM Raptors, but SSDs have crushed that and NVMe has demolished it. So now the single biggest usability factor is the size of your monitor. Get a larger, but not too large, monitor. Or get 2 more of your existing monitor: 3x and a larger desk maybe. Then your user experience will be altered like a state of consciousness. And you can keep the same old system and just upgrade the GPU when needed for the games you're playing. The platform will be fine.
@@guily6669 Yeah, I think the original Ryzen was about the same as a 4.4 or 4.5 GHz OC'd Sandy Bridge performance-wise (not the stock 3.4 GHz), which speaks volumes about old Intel, but new Intel is nowhere near what they used to be. Still, any new Intel release is still behind AMD, so there's that. Ryzen 4 here we come, but the V-cache one is the one to wait for.
@@Cinnabuns2009 I'm waiting to see the AM5 socket in action. Intel did such a bad job with their latest socket locking mechanism, which applies uneven pressure and bends the CPU, which only increases temps. I hope AMD doesn't go so cheap on the CPU locking mechanism for AM5.
I still have my i7 2600K, but sadly one core recently died, so it's now 3 cores / 6 threads. I got a damn nice deal on an i7 3770K though and I'm waiting for it to arrive; it's just so I can keep playing well while I wait for next-gen hardware and better prices.
Anyway, I can't say there's really that huge a difference on this CPU going from 4 to 3 cores for a Strix RX 580, as I'm basically GPU capped all the time in the games I play. It mostly just increased CPU usage, which never goes much above 80%, and maybe a couple of FPS were lost. Though in some games the stuttering got slightly worse, it's almost the same as with 4 cores. I thought I'd be much more screwed...
Amazing video man! thank you for your effort. This taught me quite a bit about bottlenecking and hopefully will help me soon :-)
This is incredibly interesting; it would be amazing if you could look further into this, taking PSUs and AMD CPUs into account as well
It'd be interesting to see comparisons against old flagship platforms like X79. A lot of people are still holding onto those. One of my rigs with the 8-core Xeon E5-1680 v2 @ 4.6 GHz is pushing 10 years old and still holds its own. Overall I'd consider that the best DDR3 setup you can get. Now that cores are more and more important, 16 unlocked threads with quad-channel 2400 C9 shows what Intel was actually capable of back then.
Certainly not a practical comparison these days so it's perfect for the channel.
There are a few BIOS mods out there that patch/modify a Z170 chipset based board to accept up to an i9-9900K (but you'll have to bridge a pin or three, and block others, on the CPU itself)
I try to stay within a 3-generation difference in hardware; that way I just need to upgrade every 5-6 years.
good stuff. more please.
I liked this video, but being Team Red I'd like to see this done with AMD Ryzen comparisons. Thanks, Roman, happy to see you recovered from Covid-19.
Interesting, you can even gain FPS with a lower-grade GPU but a better CPU compared to a big GPU with an old CPU. It makes a big difference. Good video.
Mate, you are using OC'd DDR5 RAM in that 12900K; you need to compare apples to apples if you are talking about CPU comparisons. Stick that 12900K on 3600 DDR4 and see what frame drop you're going to get. Stick to your regular vids and leave the GPU/CPU comparisons to Steve
8700k, upgraded from 1050ti to a 3070. 2k 60hz display. I don't try to keep any fps counters on as I play on vsync and it's always smooth. It was worth it.
Ok great test idea, here's a question for you, should the 7700k owners be looking at the 8700k or the 6700k? *Watches* Answer at 7:38 very nice! I also really like how you introduced the CPUs one by one that worked very well instead of an overload of information immediately, nice! You even covered GPU vs CPU limited, especially at the resolution extremes.
They can buy a QTJ0 or QTJ1 (8 cores / 16 threads, can be unlocked to 5.0 GHz on 100/200 series chipsets like B150/H170/Z170/Z270, and also on B365/Z370 among the 300 series chipsets after modifying the BIOS).
@@СергейКусков-л9р what's a QTJ0? Or QTJ1? What CPU is that?
@@СергейКусков-л9р ok thanks, I found them on AliExpress, but they show a low clock speed; are you sure the multiplier is unlocked?
@@budgetking2591 I have a QTJ1 and overclock it on a B365M (also on a B150M) to 4.6 GHz on all cores (I also tried 5.0, it works, but draws a lot of power), with memory at 3200 MHz.
If you have an ASUS motherboard, ask the seller to send a chip with a modified BIOS along with the CPU (only on ASUS boards with 100-series chipsets can the BIOS chip be removed); if Gigabyte or MSI, ask only for the modified BIOS file (those boards can be flashed from the BIOS with the flash utility). Otherwise it can only be flashed with a programmer!
7700K owners should be looking at Zen 4
Nice comparison. I still have a 9900K at 5.0 GHz all day, paired with an EVGA 3080 FTW3. It's still rock solid and butter smooth.
That's why I didn't add the 9900K or 10900K to the test. In my eyes still very capable CPUs for everything
Loved your 9900K OC video when it came up. I'm still rocking this CPU with a 3090, both water cooled; could you please make an updated video on this CPU, as I'm not convinced to upgrade yet
Thank you and keep up the good work
No point in upgrading.
@@iamghostwitagun3800 yeah, I don't want to upgrade, I just need a updated OC for it
Very good test. I have an i7 8700K at 4.9 GHz on all cores and a 1080p monitor; which GPU wouldn't it bottleneck at 1080p?
nVidia GeForce:
4070 TI/4070/4060 ?
3090TI/3090/3080/3070TI/3060 TI ?
and/or
AMD Radeon
7970/7950/7800/7700 ?
6900XT/6800XT/6700XT ?
Love these comparison videos 🥰👍😇. Recently my brother was talking about another upgrade to his PC... He's got an Intel i7-10700 (non-K), 16 GB (2x 8GB) RAM and an RTX 3070. He plays a few single-player story games and some online e-sport titles. I told him to save his cash for a few years and keep what he has for the time being. I recently gave him a 🥰🤩🤯 1440p 144Hz monitor as he was still gaming on a 24" 75Hz monitor 🤯, and that was a wake-up call for his online gaming experience, which made him question his PC's performance. The monitor made the difference for him, not a PC upgrade 👍😱
I really felt the lack of cores when Cyberpunk 2077 launched. At first I used to "play the game" (struggling at low fps) at 3440x1440 running a 1080 Ti @ 2100/6000 + an 8086K at 5.2 GHz + 4300 18-18-28-2T RAM.
As soon as I got a 3080 I encountered very bad stuttering due to the 8086K's lack of cores... so when I could afford a new platform I went with a 5900X from AMD and everything was solved.
It's sad that 6-core CPUs aged that badly in some scenarios.
@Salt Maker Ye? Say that to my old rig that used to stutter like hell, braindead.
As soon as I put a higher core count CPU in my PC the stutter "magically" disappeared.
Loving that old school impact board, very nice.
That's me, running a 6900 XT with a 5820K, but at the end of the day I can play all games at 4K at pretty decent frame rates
Thank you for making this video!
Henlo Roman, no surprise ASUS expect a drop in sales after so many complaints on their Z690's :DD
haha :D Apex 2022 would sell well. But there are none :(
Should have tried a 5820k/5930k…6 core 12 thread. This gen always seemed to get skipped over in comparisons.
Prolly an i7 4930k too, with an RTX 2060 it's pretty good with 32gb of ram
I run the 5820K and have done for the last 7 years, overclocked at 4.3 GHz. It has done me proud running 2 monitors, a 32" 1440p @ 165Hz and a 27" 1080p @ 144Hz
I've never had a problem running any game until I started playing Star Citizen lol. Not looking for an upgrade; oh, and I'm running an RTX 2080. I've never bothered about FPS as long as the game runs smooth
We need a follow up with X79 (3970X/4960X or 1680 v2) and X99 (5960X/6900K/6950X), not sure if X299 or Threadripper stuff should qualify...
X299 and especially Threadripper are garbage for gaming.
Exactly the video I needed!! I have a 4770(non-k), 16gb 1600mhz ddr3, rx 5500 XT 8gb.. considering upgrading to a 12100f platform, but wasn't sure if it'd be worth it
It won't be; the 5500 XT is far from a 3080, it's not even 50% of its performance. Your 4770 will do fine with your 5500 XT.
@@budgetking2591 but the possibility of future upgrades on a 12th gen platform, surely makes up for it?
I routinely hit 144 fps at 1440p with my 6700K and 3080 Ti unless it's something like Cyberpunk, but even then I'm hitting 70-80 fps, which is more than playable. And that's with maxed graphics settings
Hey der8auer, how does the 5960X hold up, even though it's still Haswell based but has 8 cores?
Instead of buying the 12900K for $550, just get the AMD Ryzen 5 5600 OEM CPU with the ASUS B450M-Pro S TUF Gaming motherboard combo kit for $179.99 at Microcenter and save $370, then use that $370 to upgrade from a 3060 to a Radeon RX 6900 XT for $699, and now you have 4K gaming where there's no difference between the 12900K and the 5600 (non X) 👌
The 5600 was already that cheap back then? Dang
When I bought my 1080 Ti about 4 years ago, the 3770K was bottlenecking it so much that I didn't notice the card had to be RMA'd. When I got an 8700K I kept having blue screens when pushing it, and it turned out it was the card.
The 8700K really needs an overclock to stretch its legs. I understand the comparison is for stock vs stock improvements, but those older CPUs still have a lot to give!
5ghz was a common speed for 8700ks and the 4770/6700k can easily achieve 4.5+. While the difference wouldn't be huge it would certainly help frame times and playability while you source a cpu/platform upgrade.
A video about CPUs that have the same core count across these years might be a cool follow up video.
Or perhaps a video with a couple of mid-range and high-end AMD GPUs first.
EXACTLY the video I needed, when I needed it!
It would be a more interesting test with the 4770 vs the new i3-12100, but a good test as it was done
I liked the video, very informative, but I do have a question: why did you skip the 9700K? I have that overclocked to 5 GHz with a 3060 Ti and don't really know how to look for bottlenecks
I know my i9 7900X is a bit of a bottleneck for my 3080 Ti, but I don't wanna switch to Windows 11 just to get a 12th gen i9. But I'll have to bite the bullet sooner or later
I'm running a 9700k at 5.1ghz and a 3080ti, maxes any game on 21:9. 1080p will always cpu bottleneck. But 1440p + is gpu limited.
You could start even 2 generations lower - 2600k. Amazing that so many years and CPU generations later, in some scenarios the difference wouldn't be noticeable.
Also, too much GPU power is never a problem. 1080p? It can be a high refresh rate monitor; add SSAA or downsampling for extra quality and the GPU load may be higher than at raw 4K with a lower refresh rate.
Takes me back to when I had my 3080 FTW3 paired with a 3770k and 24gb of DDR3.
what do you have now?
@@BA-oy9uo 12600k , DDR4, 🤷♂️
4790k with 1070, so right now my plan to upgrade the whole rig feels like a good one
Does memory bandwidth (DDR3 vs DDR4 vs DDR5) have a negligible influence on these tests?
My i7 6700k is now 9 years old and runs at 4.6ghz with a 2080 ti from Gigabyte Gaming OC. But it's time to build a new setup. However, I think I'll stick with the 2080ti for now and upgrade to the 4090 next year.
Holy crud. I thought this video was gonna be boring, but after seeing the GPU differences, a flagship GPU, paired with the wrong CPU, is at a huge disadvantage.
How much of this is PCIe Gen3? Also what about if you're only running games at 4K? At what point is your CPU the limiter? What about lower end CPUs from this generation or even Ryzen (which have many more cores at the high end)?
I run a Ryzen 5600X on my TV which is 4K 120Hz, but I have a 5950X on my main rig which I use for work. Am I leaving performance on the table for my 3090 with that mid-range processor? Is the motherboard (being ITX) a limiting factor in some way? I have so many questions now.
All CPUs in these tests were fine for casual gaming. Most people won't pay extra to upgrade to a better CPU just to get a few frames per second. A CPU bottleneck is most obvious at very low resolutions, such as 720p and also 1080p. I personally don't care if I have 210 or 288 FPS in PUBG. It's more about not pairing too powerful a GPU with a slow CPU. I have an i9-9900KS with a 360mm AIO, and I had to undervolt it and set the clock to 4.8 GHz on all cores. It is perfectly fine with an RTX 4080 and high refresh rate gaming at 4K, and thermals are perfectly under control. No CPU throttling occurs, and in CPU-intensive games such as BFV and NFS Heat, temperatures never go above 85 degrees Celsius even when gaming for hours