This. We know that games love cache because of how insane AMD's X3D CPUs are in gaming workloads while showing no difference in productivity compared to the non-3D versions of their respective architectures. And that's the same story we see here with Intel too, no change in productivity from 12th to 13th gen, while we get a small but measurable boost in games.
Yeah, along with untying the ring frequency from the e-cores (which essentially controls your L3 cache speed and latency), and also better cache prefetching. Still more changes than literally the entirety of 6th-10th gen combined. All they did there was make the same die with extra cores, and in fact the first time they went to 6 cores they had to increase the gate pitch of 14nm (70→84nm), making it ~20% LESS dense from then on 😂. What a joke, 14nm++ was actually 14nm- ☠️
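For the curious, the ~20% figure in the comment above is simple pitch arithmetic. A rough sketch (gate pitch is only one of several dimensions that set real transistor density, so treat this as a back-of-envelope illustration, not a full density model):

```python
# Intel 14nm gate pitch per the comment above: 70 nm originally,
# relaxed to 84 nm when the 6-core dies arrived.
old_pitch = 70  # nm
new_pitch = 84  # nm

# Pitch grew by 84/70 - 1 = 20%...
pitch_increase = new_pitch / old_pitch - 1

# ...which means ~17% fewer gates fit per unit length (1 - 70/84).
density_loss = 1 - old_pitch / new_pitch

print(f"pitch grew {pitch_increase:.0%}, linear gate density fell {density_loss:.0%}")
```

So "~20% less dense" is the loose reading of a 20% pitch increase; strictly it's about a 17% drop in gates per unit length along that axis.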
Yup, which only affects IPC in certain kinds of workloads. 🤷 And bumping L2 by itself is notably less beneficial for gaming in particular than an equally significant (aka die-sized) L3 bump a la AMD's V-Cache, as the former has no impact whatsoever on core-to-core communication/latency since L2 is entirely exclusive per core, whereas increasing the shared L3's capacity DOES improve core-to-core communication & latency! You can see this crystal clear in the fact that X3D brought JUST as big of a gaming performance leap as the entirety of Zen 4! Aka X3D (doubled L3 cache) at MINIMUM matched the perf improvement not just from Zen 4's L2 size doubling (from 512KB to 1MB per core), but from the entire architectural change! If that doesn't make it clear that an L2 bump ≠ an L3 bump for gaming, I dunno what will. 🤷
The clocks went up, so Raptor Lake is actually OK - not every generation should bring IPC changes. 14th gen is weird - even for a refresh it doesn't bring enough.
@@DarioCastellarin I think many of Intel's current problems stem from the known issues and troubled development of its -Intel 7- 10 nm process, especially the power consumption.
Well, you could say the Zen+ architecture was a minor improvement over Zen. AMD are rushing (whilst extremely late) to get out the new generation, and that's spelt disaster for RDNA 3.
@@stevenwest1494 Zen+ had better IPC, clocks AND an entirely different design. They went from a monolithic die to chiplets. How is any of that similar to this? Oh, it's not. Fair point about RDNA 3, as they shouldn't have even released it, but who's talking about GPUs? Don't wanna get started on Intel's bust GPUs that they have to sell at a loss because they perform about half as well as they should (at best) given their die size and transistor budget.
This BS website should be sued for misinformation or fake news... can't you guys in the US do something about it? Coz from the EU I don't think it's possible.
The issue is that AMD and Intel CPUs are good for different things. You aren't really going to be able to give one score that encapsulates those different things. I.e. AMD is better for gaming because it does more work per core. Intel tends to be more suited to productivity tasks, as those CPUs tend to have more cores that make up for the lower amount of work that can be done per core. Userbenchmark scores are based on measuring productivity rather than the work done per core, so it isn't great if you care about gaming more than productivity tasks.
The sad part is that you'd probably be surprised by how many of that exact config will end up being sold regardless. We here watching HWUB's coverage (and probably several other channels') might realize that it'd be the most stupid configuration to get - but the retail market works differently, both on the OEM and the consumer side.
I'd be willing to pay a little extra on the motherboard side if they lasted more than 2 true CPU generations. I'd rather pay for longevity than for flashy motherboard RGB.
Tbf that really won't be happening anyway. Absolute max is 3 gens. PCIe 6 and DDR6 are not that far away, and those will need new motherboards. Besides, you usually should NOT get the first gen of a new AMD socket - Zen 4 motherboards have been issue after issue.
@grzegorzszewczak2808 Doesn't mean that other people don't do it. I went from an R5 3600 to an R9 5900X and it changed my experience a lot. Just because you never did it doesn't mean others never have.
I feel like this is actually _worse_ than that -error- - excuse me - _era_ 😂 I suspect that if we look back, we'll find _higher_ IPC jumps in those notoriously small-increment generations than we're currently seeing out of Intel.
@@h1tzzYT Not competitive in the sense that it was still firmly behind in gaming performance compared to Intel's higher-end offerings at the time. Very competitive in the sense that Zen 1 offered 8 cores when Intel was still shoving 4 cores (sometimes even with no HT, mind you) down consumers' throats - and at a much lower price.
I like how the IPC debate only gets brought up when something comes out that literally doesn't move the needle or when it does A LOT. Intel needs to step it up on every level.
I'd be surprised if it wasn't like that. Things are only newsworthy when they're outside the norm, and that goes for just about everything. I might be interested in the *details* of IPC improvements when they're incremental (not huge or absent), but the fact that they exist isn't worth mentioning.
That's true. If Intel wanted to cause a positive stir, then they should have released an LGA 1700 series that has really good integrated graphics. A Ryzen 5600G or 5700G style Intel CPU line could turn some heads, and it would help Intel leverage the amazing work they've done with their GPUs.
Intel is marketing numbers cleverly, but their fab is still 10nm, and supposedly MTL will be their working 7nm - but look at the POWER. Look how far behind TSMC it still is.
@@JoeL-xk6bo Are you comparing nm to nm? It doesn't work that way. Intel 10nm is closer to TSMC 7nm DUV (if I recall, it is DUV) than to TSMC's similarly named 10nm. Nm is also a worthless metric nowadays.
This is exactly the video I have been waiting for as a 12900k owner! I limit my power to 140w as I use a passive case and I wanted to know if the 14th Gen chip would offer better performance at the same power limit through efficiency improvements. You have shown that there is very little point in changing. Yes the newer chips are more power efficient but like when I assessed this with 13th Gen, the improvement isn’t worth the effort. Thank you very much for making this video and congratulations on making 1,000,000 subs!
My 13700K is limited to 1.28V through an adaptive vcore. It runs at an all-core 5.5GHz with a max package power of 228W and 80 degrees C in Cinebench 2024. So I do feel this is partly Intel's fault for shipping with the power limit unlocked stock, when it should have been vcore-limited and unlocked only by those who want to, as the performance is stellar at 1.28V (some can do it with an even lower vcore) with the power draw at half what HUB got. I do feel HUB and others could have pointed this out, as I do not see many running the K CPUs at the absurd vcores that motherboards allow, hitting 1.4V and above, which is just stupid...
That is not correct. There is a big difference when you limit both to 140w. Reason is, 12900k at 140w cannot get anywhere near 14900k's clocks at 140w. Comparing just the Pcores alone, a 12900k at 140w might drop to 4ghz and the 14900k might easily be hitting 5ghz.
You're completely missing the point that Raptor Lake is far superior to Alder Lake due to much larger cache sizes. "Speed" is irrelevant. Your power-wasting 12900K is a boat anchor which is easily outperformed by a 13600k in almost every benchmark.
I'm sorry, but even with both limited to 200W... the 14900K or 13900K pisses all over the 12900K. Hell, even limited to less than that... there is a HUGE difference...
Clock for clock is the type of testing I ALWAYS love to see when new gens launch - and leave it to HUB to make it happen. Would also love to see steady state performance at specific power thresholds, as well as separate P-core and E-core performance metrics.
The E-cores have pretty bad IPC. I have a Haswell i7-4700MQ laptop chip from 2013, and its IPC is 25% higher than the E-cores on my current i7-12700K. If Intel had used Haswell for the E-cores, imagine how much better the performance per watt would be, or the E-core memory latency penalty. I benchmarked the E-core cluster on my 12700K, and it is faster than my 10-year-old laptop only by virtue of clock speed, and not by much.
Love the in-depth testing as always, especially always calling out stuff that might not be obvious to new watchers (like how important power consumption, temps, platform support/cost and driver support (ARC) can be). Looking forward to new reviews and podcasts. Btw, have you considered inviting guests/specialists for certain topics, or will it remain a more chill conversation between yourselves?
1000%. I was stuck on LGA1150 and basically needed to build a whole new PC to update. I switched to AMD for my current PC built in 2022. Excited to see what's possible on the AM5 platform.
@@DeathCoreGuitar Not exactly how it works. The 12900K is significantly slower than a 13900K. What's being shown here is the performance of the chips themselves at a certain clock, as well as the efficiency at said clock.
@@rustler08 Yeah yeah, I phrased it poorly, sorry. I meant that some people spend a lot of money on a new CPU (and potentially new motherboard) just to get the same architecture chips but overclocked to the sky leading to more spending on a cooling system because they are hot as hell itself and electrical bills because of a high power draw. Also 13900K is "faster" because it has 32 threads and 12900K has 24
I agree that generational platform support has me looking at AM5 for a new build since it's the only current platform with an upgrade path into the future.
I totally agree with your final thoughts. I would love to update my 10900k but I'm not willing to buy another dead end motherboard with no further upgrade path.
AMD's commitment to AM4 is what made me decide to switch from Intel in my latest system build. I bought into AM5 in hopes that AMD will support that platform as long as they did AM4. I like the ability to drop in a new CPU when I want to upgrade in a few years without having to hassle with a motherboard upgrade. You just don't get that experience with Intel.
Same. I originally bought a 4670k to kind of wait out for a 4770k or 4790k to drop in price and it didn't until the R5 3600 launched which was cheaper, much more efficient, had the SMT I was looking for. Hung onto that 3600 then dropped in a 5800X3D for the win. Try to beat that upgrade path with Intel. Intel would need a very compelling reason to go with them again, not some bragging right at the expense of power usage and heat output.
why would Intel fix this? OEMs will still buy it. They sell product no matter how good it is. They are totally insulated from the effects of competition. They. Do. Not. Care.
Hey guys, just wanted to congratulate you on 1 mil subs, absolutely deserved. I think you're the most unbiased (thus most reliable) hardware reviewers. Keep up the great work! :)
You're right. I bought an i5 11400F when I was low on budget, but now that I have more budget and some struggles with the CPU on my 144Hz monitor, I could just upgrade to an i7 13700 - except I can't. I could have done that on a B450, so it was my huge mistake. Now I need to replace half the computer instead of just one component.
@@tilapiadave3234 Man, it's not about selling 2 instead of 1. Maybe you don't want to get it. My mobo is connected with 9234208734 cables from 354 sides; when I'm thinking about the change, it's just problematic. I'm tired of thinking about it. Much more problematic than a drop-in CPU swap. I would like to switch the CPU and forget, not unmount most of the computer. It can be done, you're right, but it's highly demotivating.
When I was making a PC I wanted to use 12600k, but went for 12700k to, sort of, max out the platform, knowing full well I won't ever change that cpu. I don't have scenario where 13900k will make a noticable difference, so this is my pc until it's time to make a new one in probably 7-8 years. Now that I think about it, GPUs are following the same pattern. It's become more cost effective to get higher end card and use it for 7-8 years, than to update every other generation.
I was an early adopter of AM4. Got a Ryzen 5 1600X on launch, later bought a Ryzen 7 1800X. At the end of 2019 I got a 2700X, midway through 2020 I got a Ryzen 7 3700X, and just 7 months ago I got a 5700X - all on the same exact MSI X370 Gaming Titanium motherboard that was "over built" back with the release of the 1800X. Now, if I really want, I can still upgrade further, to a 5800X3D or a Ryzen 9 5950X. Such insane upgradability for a platform: a nearly 80% single-threaded uplift from the 1800X and a 170% multi-threaded uplift to the 5950X, or a 100% single-threaded and a 130% multi-threaded uplift with the 5800X3D. The AM4 platform is legendary.
9:42 That's amazing, you're actually describing the very opposite of how it's been for me - when I was younger and had very little money to spend on gaming PCs, I scraped every bit of cash I could to constantly upgrade or flip my older PC for newer mid-ranged hardware (So I never had any PC configuration for more than 3 years), but now I have a top-end PC with an RTX 4090 that I bought after keeping my previous PC unchanged for over 4 years, and I have absolutely ZERO plans of even thinking about a new PC for the next few years, let alone a partial upgrade. How would it even make sense?
He's describing exactly how it is for most people on midrange/enthusiast hardware. The upgrades are more spaced in time as you have less money to spend on luxuries. Whereas people who buy the top end often have a "money is no object" case.
Yeah, was similar for me. Back in the early 2000s I was upgrading at least every 2 years or so, always on a budget with low end parts. Nowadays I tend to buy more high end for my main desktop and keep it for a long time. But I also remember that I upgraded based on kind of the rule of thumb: When there's something available at about 3x the performance, I upgrade. And that was indeed within like 2 years! So my change of upgrade cycle is not mainly because I actually can afford higher end stuff now. It's kind of the other way around. Because things haven't been moving that fast it makes more sense to go more higher end and keep it longer, rather than constantly upgrading. I kept a Haswell system for like 8-9 years or so, all throughout the dark ages of Intel's quasi monopoly, until finally Zen came along and gave me a reason to upgrade (to Zen2). Just upgraded again to Zen4, but even the impressive looking gains we finally get again ever since Zen came out are still not close to the pace we had for a couple of years back then. Anyhow where was I going with this? Right, I don't think this contradicts Steve, because it's not about budget but also about how fast hardware becomes obsolete. People who are on a budget today and buy low end hardware have no reason to upgrade every gen either. Back in those days when I upgraded constantly any 5 year old system, high end or low end, would've been utterly useless. Today most 5 year old systems are still perfectly reasonable. And I can totally see on the other end of the spectrum people buying stuff like 3090Tis because they can and just want the best (some small fraction of whom may actually need it), and might therefore do it every gen.
I really liked your short giggle at the beginning saying "new generation". Starting from an 8086 at 5 MHz I always went through all the various generations of Intel with two rules: 1) Second hand upgrade to the best or second best from the previous generation. (I'm cheap) 2) Upgrade only if the performance is 1.5 to 2 times that of the processor I already have. With these premises I will have to wait forever for Intel to make a huge move to carry out any upgrade with more than a few percentage points of advantage... or switch to AMD.
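The two-rule upgrade policy above boils down to a trivial ratio check. A minimal sketch (the function name and example scores are mine, purely illustrative):

```python
def should_upgrade(candidate_score, current_score, threshold=1.5):
    """Upgrade only when the candidate CPU scores at least
    `threshold` times the current one (rule 2 in the comment above)."""
    return candidate_score / current_score >= threshold

# A candidate only ~1.2x faster does not clear the 1.5x bar:
print(should_upgrade(30_000, 25_000))  # → False
# One 1.6x faster does:
print(should_upgrade(40_000, 25_000))  # → True
```

With single-digit percentage gains per Intel generation, that 1.5x bar takes several generations to clear, which is exactly the commenter's point.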
It wasn't a bad platform overall. I think the low-end i3 and i5 were pretty interesting, decently competitive, and didn't suffer as much from the power consumption issues. But yeah, AMD is still confidently in the lead...
This is hardly an Intel issue. This is peak capitalism and planned obsolescence. No significant technological advancements have been made in the last decade, and yet shareholders in all tech companies have increased their wealth on a yearly basis.
Most people have too short an attention span to recognize these patterns. AMD would've done the same had they been the market leader for decades. The PC landscape has been doomed to begin with ever since the universal adoption of proprietary x86 architecture gave rise to the duopoly of AMD and Intel. Some people are so preoccupied with these menial dilemmas they don't even realize 99% of games never needed to be created with high end rigs in the first place. AAA publishers and silicon giants have entered a symbiotic relation completely unbeknownst to the cows they're milking.
Yeah, that's why I bought the 12700K; the 7700X is 4 fewer cores and 700MHz more for basically the same gaming perf. All the memory-tuning nonsense on AMD, meanwhile on Intel you can use low-speed RAM and it doesn't hurt as much, and you don't pay a premium for X3D.
Applying some actual context to a video like this would be useful. Raptor Lake was never supposed to exist in the first place (which should speak volumes about the refresh), but they figured out Meteor Lake was never going to be ready in time. Part of the cache redesign was what allowed the clockspeeds to go much higher on top of the increased capacity. The memory controller for RPL is also a pretty big improvement over ADL. While I don't disagree at all regarding platform compatibility you're ignoring reality if you think it's just an easy decision. AMD proved that you cannot keep things going forever as they dumped CPU support along the way due to limited BIOS sizes during AM4's lifetime and Intel releases significantly more SKUs per generation. You'd need to convince OEMs and motherboard makers to change up BIOS support (or go back to text only, which I'd be fine with) to mandate much higher minimum capacity before anything could conceivably change. Potentially shifting the way the BIOS works entirely would also do the trick, or convincing Intel to just pick and choose CPU support. Unfortunately I don't see any of this happening because at the end of the day it doesn't provide them with monetary benefit.
Looking back, not only do these rebadges barely have any performance gains in them, but they also started having stability issues due to the silicon either being pushed too hard or having a flaw in the architecture. These Raptor Lake CPUs are not a good look for Intel.
In 2016-17 I had two motherboards. The first was an upper-tier ASUS ROG Maximus VIII, along with a Skylake 6600K. I loved this board! (around $200). My second was a budget remanufactured board, an ASUS Prime Pro X370, for my Ryzen R5 1600 (around $90). Guess which motherboard is still being used?
Amazing - hopefully you've been able to hand off that old 6600k system to someone else... I have my old 6700k still going as an occasional bedroom HTPC...
@@alistairwillock7266 Unfortunately it has an issue that I couldn't solve. The issue is complicated (it's been a long time and I honestly forget the details; 2020 was the last time I attempted to solve it), but basically it would crash/freeze after being on for some time. I think it is a 'solvable issue' because it does not freeze if Windows is running in safe mode. I know that, in itself, sounds like it would be easy-peasy to fix. I also remember thinking that the last time I tinkered with it (2020).
Man, you i7-4770/3770 folks had it good... I went from an i5-3550 (multiplier + BCLK overclocked) to an i7-12700K just this summer. $350 for CPU, RAM, and mobo. I'm cheap.
@@haydn-db8z The i5s had similar gaming performance back in the day compared to the i7s, so they were the wiser choice. I actually had upgraded that MB from an i5 4650 to an i7 4770.
There's certainly exceptions with 4090 buyers, as getting the absolute best often makes it last much longer (see 1080 Ti). I certainly can't afford to upgrade every generation, but I am willing to invest into my PC as it's the most important thing in my life. So, I went from a 1080 TI to a 4090 and a 6850K to a 5950x. I expect to keep this for 3 more generations, especially as upscaling/framegen tech is now a thing. The 4090 is still CPU bottlenecked by all but the latest, fastest CPU's (depending on title). CPUs can last even longer.. though things aren't like the old days anymore where it was just IPC performance that mattered. It still matters, but now cores and cache do too, so an X3D CPU is worlds above a normal one. I will have to upgrade next gen to stop bad bottlenecking, so I'm holding out hope they figure out the multi-CCD 3D cache, without dropping clocks too. Moore's law is practically dead. Intel must stop this platform change trend; the cost is not justified.. nor is the e-waste.
Hi Steve and Tim - The ad-spot is for the ASUS ROG Swift OLED PG27AQDM but the link in your description is for the ASUS ROG Strix XG27AQMR. I'm not looking for a monitor but I wanted to check it out so thought I'd let you know.
Very happy with the upgrade from an i7 8700K to a 13600K purchased a year ago, but yeah, later down the road I probably won't bother with upgrading again on the same motherboard and will just see what's coming next from Intel / AMD.
Back in 2021 I picked up a 11600k for my first ever build. Despite being fine with my decision even until now, the AMD longevity looks very enticing for the future.
I suspect the 14 and 13 series are just refinements of 12th gen in terms of yields and steppings. The additional cache may be a side effect of other improvements providing more power for additional circuits. The main criticism is the lack of interesting or useful features that demand a platform upgrade. NVMe tech and USB tech are far beyond the use case of most people, and I’m struggling to find any features that the LGA1700 platform needs but doesn’t have.
I've purchased *twelve* AM4 systems. 10 motherboards and 2 prebuilt. Gave away two, and two at parent's place, so eight currently at home. Zero AM5 so far, but I did buy a Phoenix laptop. It's mind-blowing that this 28W ultra portable is neck-and-neck against desktop i9-11900k for CPU performance, with way faster iGPU.
Intel completely devalued the 'gen' branding with this. Plus it means any benefit seen in the 15th 'gen' will be compared against two 'gens' before it. Nothing more than labelling it as a 'gen' to entice purchasing and to sell the new motherboards, which are also pointless. Nothing more than a cash grab.
“Intel does it again with an outstanding product release, just look at that outstanding performance. All while AMD has no product release at all and can’t compete even with their marketing lies and manipulations.”
With Tick-Tock Intel paired motherboards with one processor architecture. The second generation was a shrink of the same architecture: sandy bridge & ivy bridge, Haswell & Broadwell. Then Skylake was crazy: 4 generations with very similar architectures and 2 or 3 different sockets. The IPC difference was small for 6th to 10th Gen. Core processors.
I have an i3 12100 which I got for £90 at the start of the year. I got lucky here as it fits my purpose, has a potential upgrade path, and my CPU allows for enabling AVX512 which helps significantly for RPCS3 emulation.
RPCS3 runs well on the 12100, really? It runs pretty good on my 13600k even without avx512 but I was thinking the two extra pcores and extra clocks of the 14700kf might help, with rpcs3 and yuzu for the more demanding scenarios.
I almost wasted my time commenting on a random bar graph video. Thank you guys for actually answering my question of is it worth upgrading if you’re tech literate and already overclocking manually
That's why I went with the 1700X over the 7700K when it was released. After some years, I replaced the 1700X with a 5600 a couple of months ago. I gave the PC to someone else who's rocking it daily and bought an AM5 system with a 7800X3D. I wonder when I'll replace that one :)
Oh hella yes, I had the same journey (although with a 5900X instead of a 5600) and it's been so awesome. Back in the day I did some CPU-intensive tasks too (mostly Blender, which has since been taken over by the GPU, especially since I got my first RTX card), so I got some good use out of that 1700X, and it was a friggin beast by 2017 standards. And Zen 4 is absolutely crazy - going down from 12 to 8 cores literally hasn't been a downgrade. AMD's platform support counts a lot, even when they mess it up. I did switch motherboards, from my OG X370 board to a B550 when upgrading the CPU, because X370 didn't have Zen 3 support yet - but I gave that X370 mobo and the 1700X in it to my cousin, and since then the platform did gain that support and he was able to upgrade to a 5600. There are actually CPUs from all four of the AM4 generations in the family, and it's hella nice to be able to just mix and match them as needed.
Time Stamp: 13:00 I totally agree.. If you just play the game you will never tell the difference between the 12900K and the 14900K... Unless you play the FPS counter in the corner game, then yeah you will see a number difference, but not a game play difference..
I can't believe my 12700K, which is $273 on PCPartPicker, beats a 14600K that's going for $300. Don't buy into false advertising. A bigger number is not always better.
Some people say that 14th gen only advantage is the enabling of "DLVR", which is supposed to reduce power consumption at same clock speed and performance. But apparently, for this advantage to be really perceptible, the socket must be power-constrained, way more than that K-series does by default. For example, it would have to be limited to something like 65W (non-K variants) in order to observe better performance at same power budget. This is a "claim" though, I haven't checked it myself, nor seen a review trying to analyze 14th gen power efficiency advantage under this angle. It would be interesting for a reviewer to have a look.
The biggest appeal of this refresh "generation" I could see before it launched was that they were finally going to get DLVR working, lowering the voltage (and therefore watts) the chip sucks down. Discussion online seems to suggest that it is working; the x900 chips are just pushed *so* hard out of the box that it makes no difference in that tier, while the lower tiers see a benefit. I had hoped that the downclock to 5 GHz would help matters, but it looks pretty similar. Disappointing.
General question: What are justified reasons for a new socket? DRAM generation is (apparently) one but PCIe generation for example isn't. Please enlighten me 💡
The difference between 14th gen and 13th gen is that 14th gen uses PER core throttling, and 13th gen uses ALL core throttling; allowing 14th gen to maintain slightly higher clocks.
2:50 This is only true if you consider the R7 5800X3D/R5 5600X3D to be an "entire extra year/2 of CPU support" for AM4. If you only consider the major architecture releases otoh (Zen 1, Zen +, Zen 2, Zen 3), then AM5's guaranteed support roadmap is already JUUUUST as long as AM4's was. 🤷 (AM4 = 2017 through end of 2020; vs AM5 = 2022 through end of 2025)
LGA1700 was always going to be stuck in the middle. A new socket was needed to support the move to DDR5, and the next gen FOVEROS based chips needing a new socket for a radically different design shouldn't be a surprise.
This is the main reason I went with AMD: being able to get "actual" multiple generational CPU upgrades on one socket is huge, especially when I'm buying a $600-$800 motherboard.
I wonder how many Intel chip designers or product managers watch stuff like this? The chip designers would probably say "yep, this is true, we really haven't innovated lately" and the product folks would say "we have to sell more chipsets, so they have to change to maintain revenue." Who knows.
🚨Guys, a heads-up. The link in the description to the sponsored monitor is wrong. The link goes to an IPS monitor (XG27AQMR) instead of the advertised OLED one. 🚨
Hey Steve, you should try and reach out to some AU hardware resellers/system integrators and see if they're willing to part with some CPU sales data at all. From my perspective, Intel is outselling AMD 10-1 in open-loop system sales.
I have specifically recommended Ryzen mostly because of platform upgradability. They had old equipment and had to spend big $$$ vs years ago. I told them they could drop in a big upgrade in 2-4 years, kind of like upgrading a GPU. Won't be as good as buying a new system ofc, but you'll save on RAM and mobo. Best bang-for-buck reasoning. I did mention the power is higher w/Intel, but they didn't care as much. The decision maker was the drop-in upgrade years down the line w/o replacing everything.
I’d love to see Intel change but I’m not holding my breath. Not that it is a big issue for me as I have tended to build a top or near top end machine and keep it for around 4 years. So guess I will be stuck with my i7-13700k/Z790 machine for another few years.
Intel is definitely never changing. They still outsell AMD by far, despite really not being that competitive. Not enough incentive to improve when you have so much guaranteed income.
What I would like to have seen Intel do - and something I think would benefit users FAR MORE than simply more than 2 generations of support - is to unify the enthusiast families onto one socket. To have consumer and HEDT share a socket and have interchangeability would be HUGE! Imagine, K SKU and Z board with the only change being the socket change so that the Z board could also accept a 16 'P' core HEDT X SKU. I'd suggest also having a 350W limit on the 'Z' boards to ensure end users aren't blowing them up with X SKU overclocks. Then you could go with extra cores on the cheap board or get the more expensive board because you also need the I/O. The 'X' boards would also be one single platform for extreme overclockers to play with all of the core count CPUs. They did something similar with W790, but it's far worse and more confusing IMO since both platforms have the same name, just a 90° rotated socket to denote what is the pro workstation and what is the lite workstation board.
In short: All AMD needs to do is support AM5 from Zen4 all the way to Zen6 and they've succeeded in making Intel look silly, again. Shouldn't be that hard, actually. AM5 has everything it needs to be adequately equipped for that timeframe. Nobody on a consumer platform will need PCIe6 or CXL, anyway.
Technically they did this from the 6th to 10th Gen on desktop. All Skylake derivatives with only core and clockspeed bumps, in some cases an improved IGP
Most hate 14th gen due to its lack of progress - it's just Intel and partners wanting a new generation every year. The thing is, hating it doesn't make much sense from a consumer standpoint, because from this point of view it's nice to be here: 14th gen makes the same-performing 13th gen cheaper, and the 14700K is a great upgrade option for current owners of lower-end chips - you get almost the ultimate LGA1700 performance for i7 money. That's how it is for now, and there's also the rest of the generation coming...
@@TheCountess666 Agreed, they should have just named the 14700 as a 13750 or even 13800 and then lowered the prices of current line. It was known around the time of release for "Matisse" that Intel were in the mud and would be behind for a good few years. They changed that understanding with the introduction of e-cores but as we can all see, it was just putting a plaster on a deep wound. They'll catch up eventually but needing all that power to sometimes be ahead but also losing a lot too is actually embarrassing for Intel who have had more than enough time to become as efficient as AMD. Good for the consumer though.
@@TheCountess666 Not really, because with a whole new generation come new pluses: Dell and similar have new CPUs to offer in their computers to e.g. justify their higher prices; Intel can claim longer support of 1700, and only a minority is interested in hardware enough to know how it is with the 14th "gen"; consumers get 13th gen cheaper, and with the 14700K get an unusual gift of being able to upgrade to almost the full potential of the platform without buying an i9, so for 2/3 of the price, which definitely works out price/performance-wise. And there are lower-end CPUs yet to come, when the more budget-oriented 13th gen offerings already beat Ryzen 5s and 7s. So what stands behind your "yes, really"?
Hey Intel, support overclocking on your non-Z chipsets, at least the B if not also the H. If you did them all, you'd actually one-up AMD, since they don't on their A series. I just bought an AM5 system coming from 1151, and this video explains why: I may not upgrade my CPU, but if I want to, I'll be able to. Intel, you're fresh out of luck.
Going from 12th to 13th/14th gen isn't a terrible upgrade though. I agree overall it's "meh" at best, but honestly the reason I upgraded from 12th to 13th was the vastly superior IMC and the ability to run much higher/better memory. My 12900K couldn't even hit 5800 speeds, while the 13th is cruising at 7200 with no messing around, just XMP. The fact you were able to run these tests on the 12th gen at 7200 is impressive; you must have some really well-binned 12th gen chips, as that is not typical. So overall, I don't disagree with you, but if someone had a 12700K and went to a 14900K in the same mobo, that isn't a terrible jump when considering overall performance, if power draw is ignored.
It seems pretty much the only difference between Alder Lake and Raptor Lake is the increased L2 cache, going from 1.25MB to 2MB per core.
This. We know that games love cache because of how insane AMD's X3D CPUs are in gaming workloads while showing no difference in productivity compared to the non-3D versions of their respective architectures. And that's the same story we see here with Intel too, no change in productivity from 12th to 13th gen, while we get a small but measurable boost in games.
Yeah, along with untying the ring frequency from the E-cores (essentially what controls your L3 cache speed and latency) and also better cache prefetching. Still more changes than literally the entirety of 6th-10th gen combined. All they did there was make the same die with extra cores, and in fact the first time they went to 6 cores they had to increase the gate pitch of 14nm (70nm -> 84nm), making it ~20% LESS dense from then on 😂. What a joke
14nm++ was actually 14nm− ☠️
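The pitch/density claim above can be sanity-checked with back-of-envelope arithmetic (the 70nm/84nm figures are from the comment; assuming density scales inversely with pitch is a simplification that ignores cell height and other real-layout effects):

```python
# Gate-pitch growth vs. (one-dimensional) transistor-density loss.
# Density here is assumed to scale as 1/pitch -- a rough simplification.
old_pitch, new_pitch = 70, 84  # nm, per the comment above

pitch_increase = (new_pitch - old_pitch) / old_pitch   # 14/70 = 20%
density_change = old_pitch / new_pitch - 1             # 70/84 - 1, about -16.7%

print(f"pitch up {pitch_increase:.0%}, density down {abs(density_change):.0%}")
```

So a 20% larger gate pitch works out to roughly 17% fewer transistors per unit length, close to the "~20% less dense" figure quoted above.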
Yup, which only affects IPC in certain kinds of workloads. 🤷 And bumping L2 by itself is notably less beneficial for gaming in particular than an equally significant (i.e., die-sized) L3 bump a la AMD's V-Cache, as the former has no impact whatsoever on core-to-core communication/latency (L2 is entirely exclusive per core), whereas increasing the shared L3's capacity DOES improve core-to-core communication and latency!
You can see this crystal clear in the fact that X3D brought JUST as big a gaming performance leap as the entirety of Zen 4! I.e., X3D (doubled L3 cache) at MINIMUM matched the perf improvement not just from Zen 4's L2 doubling (from 512KB to 1MB per core), but from the entire architectural change! If that doesn't make it clear that an L2 bump ≠ an L3 bump for gaming, I dunno what will. 🤷
Which is rather genius to save design costs :D
The clocks went up, so Raptor Lake is actually OK - not every generation should bring IPC changes. 14th gen is weird - even for a refresh it doesn't bring enough.
It seems like Intel has milked as much performance as they can out of this current architecture. They can't really push power higher either.
That sounds like a challenge 🔥
Which means that this architecture was badly engineered, with no headroom for improvement and no platform reusability.
@@DarioCastellarin I think many of Intel's current problems stem from the known issues and troubled development of its -Intel 7- 10 nm process, especially the power consumption.
Surely at some point they've got to stop putting 300+W monsters on boards
Intel presented a 1000-watt cooler, so I wouldn't be surprised to see a 600-watt 15900K
This is Intel's way of giving us two CPU generations per socket, as compared to AMD
Intel: "It's 3 generations! Reeeeeeehhh!"
@@catsspat Yes, and 14th gen really goes brrrrrrrr... -.-
Well you can say the Zen + architecture was a minor improvement over Zen. AMD are rushing (whilst extremely late) to get out the new generation, and that's spelt disaster for RDNA 3.
@@stevenwest1494 Yeah, but Zen+ had decent IPC and clock gains, whereas this doesn't do anything except add a number lol
@@stevenwest1494 Zen+ had better IPC, clocks AND an entirely different design: they went from monolithic dies to chiplets. How is any of that similar to this? Oh, it's not. Fair point about RDNA3, as they shouldn't have even released it, but who's talking about GPUs? Don't wanna get started on Intel's bust GPUs that they have to sell at a loss because they perform about half as well as they should (at best) given their die size and transistor budget.
Man, I cannot wait to see how UserBenchmark will skew this to make AMD look bad.
Divide the performance by the cache… it makes 5800x3d and 7800x3d the worst gaming CPUs ever!
😂😂😂
This BS website should be sued for spreading misinformation and fake news... can't you guys in the US do something about it? Because from the EU I don't think it's possible.
The issue is that AMD and Intel CPUs are good for different things. You aren't really going to be able to give one score that encapsulates those different things. I.e., AMD is better for gaming because it does more work per core, while Intel tends to be more suited to productivity tasks, as those CPUs tend to have more cores that make up for the lower amount of work that can be done per core. UserBenchmark scores are based on measuring productivity rather than the work done per core, so it isn't great if you care about gaming more than productivity tasks.
I just read their i5 13600K review, it was pure comedy gold.
@@haukionkannel The i7-12700K does beat a R7 5800X3D.
The 5800X3D was pretty overrated, the 7800X3D solved all the flaws with the first X3D chip.
This new Intel 14th generation is worthy of the RTX 4060 Ti
At least 14th gen is the same as the previous one, while the 4060 Ti is a downgrade
Mediocre Build 2023
@@DarioCastellarin GN already has the disappointment build every year
The sad part is that you'd probably be surprised by how many of that exact config will end up being sold regardless.
We here watching HWUB's coverage (and probably several other channels') might realize that that'd be the most stupid configuration to get - but the retail market works differently, both on the OEM and the consumer side.
And there should be some masterpieces like Gollum installed on that system. Full house
I'd be willing to pay a little extra on the motherboard side if they lasted more than 2 true CPU generations. I'd rather pay for longevity than for more flashy motherboard RGB.
Get a Tomahawk board and you won't get any flashy RGB. Problem solved.
And PSU too.
So buy AMD? Their platforms last WAY longer than Intel's.
Tbf that really won't be happening anyway. The absolute max is 3 gens. PCIe 6 and DDR6 are not that far away, and you'll need new motherboards for those. Besides, you usually should NOT get the first gen of a new AMD socket; Zen 4 motherboards have been issue after issue.
Doesn't mean that other people don't do it. I went from an R5 3600 to an R9 5900X and it changed my experience a lot. Just because you never did it doesn't mean others never did. @@grzegorzszewczak2808
This is like Skylake to Kaby Lake, or Comet Lake to Rocket Lake, all over again. Basically a refresh with minimal IPC improvements.
I feel like this is actually _worse_ than that -error- - excuse me - _era_ 😂
I suspect that if we look back, we'll find _higher_ IPC jumps in those notoriously small-increment generations than we're currently seeing out of Intel.
Rocket Lake did a 19% increase (integer).
It would be nice to have the same test for AMD CPUs to see the difference.
1800X vs 5800X3D -> double the FPS and less power consumption, basically
They did already
@@SweatyFeetGirl But it gives the wrong idea that the 1800X had competitive gaming performance at the time of its release, which was certainly not the case.
It certainly did compared to Bulldozer before it @@h1tzzYT
@@h1tzzYT Not competitive in the sense that it was still firmly behind Intel's higher-end offerings in gaming performance at the time.
Very competitive in the sense that Zen 1 offered 8 cores when Intel was still shoving 4 cores (sometimes even with no HT, mind you) down consumers' throats - and at a much lower price.
Welcome to the +++++++ era.
Welcome back* to…
I like how the IPC debate only gets brought up when something comes out that literally doesn't move the needle or when it does A LOT.
Intel needs to step it up on every level.
I'd be surprised if it wasn't like that. Things are only newsworthy when they're outside the norm, and that goes for just about everything. I might be interested in the *details* of IPC improvements when they're incremental (not huge or absent), but the fact that they exist isn't worth mentioning.
That's true.
If Intel wanted to cause a positive stir, then they should have released an LGA 1700 series that has really good integrated graphics.
A Ryzen 5600G- or 5700G-style Intel CPU line could turn some heads. And it would help Intel leverage the amazing work they've done with their GPUs.
Intel is marketing numbers cleverly, but their fab is still 10nm, and supposedly MTL will be their working 7nm, but look at the POWER. Look how far behind TSMC it still is.
@@JoeL-xk6bo Are you comparing nm to nm?
It doesn't work that way.
Intel's 10nm is closer to TSMC's 7nm (DUV, if I recall) than to TSMC's similarly named 10nm.
Nm is also a worthless metric nowadays.
@@789know It's kind of misleading when Intel calls their newer 10nm process "Intel 7" while their 7nm process is called "Intel 4".
This is exactly the video I have been waiting for as a 12900k owner!
I limit my power to 140w as I use a passive case and I wanted to know if the 14th Gen chip would offer better performance at the same power limit through efficiency improvements. You have shown that there is very little point in changing. Yes the newer chips are more power efficient but like when I assessed this with 13th Gen, the improvement isn’t worth the effort.
Thank you very much for making this video and congratulations on making 1,000,000 subs!
My 13700K is limited to 1.28v through an adaptive vcore. It runs at an all-core 5.5GHz with a max package power of 228w and 80 degrees C in Cinebench 2024. So I do feel this is partly Intel's fault for unlocking the power limit at stock, when it should have been vcore-limited and unlocked only by those who want to, as the performance is stellar at 1.28v (some chips can do it with an even lower vcore) with the power draw at half what HUB got. I do feel HUB and others could have pointed this out, as I do not see many running the K CPUs at the absurd vcores that motherboards allow, hitting 1.4v and above, which is just stupid...
That is not correct. There is a big difference when you limit both to 140w. The reason is that the 12900K at 140w cannot get anywhere near the 14900K's clocks at 140w. Comparing just the P-cores alone, a 12900K at 140w might drop to 4GHz while the 14900K might easily be hitting 5GHz.
You're completely missing the point that Raptor Lake is far superior to Alder Lake due to much larger cache sizes. "Speed" is irrelevant. Your power-wasting 12900K is a boat anchor which is easily outperformed by a 13600k in almost every benchmark.
@@awebuser5914 That is also not true. The 12900K is not being outperformed by a 13600K.
I'm sorry, but even with both limited to 200w, the 14900K or 13900K pisses all over the 12900K... hell, even limited to less than that. There is a HUGE difference...
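For what it's worth, the undervolting argument a few comments up follows from the standard dynamic-power approximation P ≈ k·V²·f. A minimal sketch (the 1.28V/1.40V figures come from the comment; the constant-clock assumption and the model itself are simplifications that ignore static/leakage power):

```python
# Dynamic CPU power scales roughly with voltage squared times frequency:
# P ~ k * V^2 * f (switching capacitance folded into k; leakage ignored).

def dynamic_power_ratio(v_new: float, v_old: float,
                        f_new: float, f_old: float) -> float:
    """Relative dynamic power after a voltage/frequency change."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# Dropping vcore from 1.40 V to 1.28 V at an unchanged 5.5 GHz all-core clock:
ratio = dynamic_power_ratio(1.28, 1.40, 5.5, 5.5)
print(f"{ratio:.2f}")  # ~0.84, i.e. roughly 16% less dynamic power from voltage alone
```

Halving total power, as the comment reports, additionally involves reduced boost residency and leakage, which this simple model doesn't capture.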
Clock for clock is the type of testing I ALWAYS love to see when new gens launch - and leave it to HUB to make it happen. Would also love to see steady state performance at specific power thresholds, as well as separate P-core and E-core performance metrics.
The E-cores have pretty bad IPC. I have a Haswell i7-4700MQ laptop chip from 2013, the IPC is 25% higher than the E-cores on my current i7-12700K.
If Intel had used Haswell for the E-cores, imagine how much better the performance per watt or the E-core memory latency penalty would be.
I did a benchmark of the E-core cluster on my 12700K and it is faster than my 10-year-old laptop, but mostly thanks to clock speed alone, and not by much.
Love the in-depth testing as always, especially always calling out stuff that might not be obvious to new watchers (like how important power consumption, temps, platform support/cost, and driver support (ARC) can be). Looking forward to new reviews and podcasts. Btw, have you considered inviting guests/specialists for certain topics, or will it remain a more chill conversation between yourselves?
He is the expert. If he calls someone in, it's just to converse.
1000%. I was stuck on LGA1150 and basically needed to build a whole new PC to upgrade. I switched to AMD for my current PC, built in 2022. Excited to see what's possible on the AM5 platform.
I am still on my R5 1600, perfectly fine for most games with my recently bought RTX 3060 (the 1060 6GB was fine too, but a bit limiting)
@@Chris3s get a 5600 when you got 150 pounds spare, good solid upgrade.
It's even cheaper than 150 pounds lol, it's 135€/120GBP @@misterpinkandyellow74
From the tests I saw, the FPS increase at 1440p is minimal (in my case even ultrawide), or did I miss something? @@misterpinkandyellow74
@@Chris3s I upgraded from a 2600 to a 5600 with the same 1060 6GB GPU; the Crysis benchmark at 1080p medium settings went from 120fps to 230fps...
Absolutely insane how the difference between 12th gen and 14th gen is so minimal that it might as well not exist
Yeah, and some people upgrade their rig to the "newest", spending thousands of dollars and not knowing that they will only get a few % of uplift
@@DeathCoreGuitar Not exactly how it works. The 12900k is significantly slower than a 13900k.
What's being shown here is the performance of the chips themselves at a certain clock, as well as the efficiency at said clock
reminds me of 6th-9th gen and 10th-11th gen
The IPC difference from the 6700K to the 11900K was almost nonexistent (HUB covered it)
@@rustler08 Yeah yeah, I phrased it poorly, sorry. I meant that some people spend a lot of money on a new CPU (and potentially a new motherboard) just to get the same architecture chips but overclocked to the sky, leading to more spending on a cooling system (because the chips are hot as hell itself) and on electricity bills because of the high power draw.
Also, the 13900K is "faster" because it has 32 threads while the 12900K has 24
@@rustler08 It means that the 14900K is a 12900K with higher clocks via more power consumption
I agree that generational platform support has me looking at AM5 for a new build since it's the only current platform with an upgrade path into the future.
I totally agree with your final thoughts. I would love to update my 10900k but I'm not willing to buy another dead end motherboard with no further upgrade path.
I guess 15th gen is your go to then, or amd if they bring out something decent for their 8000s cpus
Totally agree with your assessment. My 10900K is more than enough for what I do, so I have no desire to upgrade.
AMD's commitment to AM4 is what made me decide to switch from Intel in my latest system build. I bought into AM5 in hopes that AMD will support that platform as long as they did AM4. I like the ability to drop in a new CPU when I want to upgrade in a few years without having to hassle with a motherboard upgrade. You just don't get that experience with Intel.
Same. I originally bought a 4670K, kind of waiting for a 4770K or 4790K to drop in price, and it didn't until the R5 3600 launched, which was cheaper, much more efficient, and had the SMT I was looking for. I hung onto that 3600, then dropped in a 5800X3D for the win. Try to beat that upgrade path with Intel. Intel would need a very compelling reason for me to go with them again, not some bragging right at the expense of power usage and heat output.
why would Intel fix this? OEMs will still buy it. They sell product no matter how good it is. They are totally insulated from the effects of competition.
They. Do. Not. Care.
Hey guys, just wanted to congrats you on 1 mil subs, absolutely deserved.. I think you're the most unbias (thus most reliable) hardware reviewers.. Keep up the great work! :)
You're right. I bought an i5-11400F when I was low on budget, but now that I have more budget and some CPU struggles on my 144Hz monitor, I could just upgrade to an i7-13700, but I can't. I could have done that with a B450, so it was my huge mistake. Now I need to replace half the computer instead of just one component.
That’s just Intel for ya. I’m still using the X470 I bought in 2018 and just upgraded the CPU to a 5800X3D when my 2700X was starting to struggle.
@@tilapiadave3234 Man, it's not about selling 2 instead of 1. Maybe you don't want to get it. My mobo is connected with 9234208734 cables from 354 sides; when I'm thinking about the change, it's just problematic. I'm tired of thinking about it. Much more problematic than a CPU swap would be. I would like to switch the CPU and forget about it, not unmount most of the computer. It can be done, you're right, but it's highly demotivating.
@@tilapiadave3234 tilapiadave is shocked to learn that consumers have preferences and they can express them in TH-cam comments
Let's not forget that some of the 13th Gen processors were also a refresh of Alder Lake.
When I was making a PC I wanted to use a 12600K, but went for a 12700K to, sort of, max out the platform, knowing full well I won't ever change that CPU. I don't have a scenario where a 13900K will make a noticeable difference, so this is my PC until it's time to make a new one in probably 7-8 years.
Now that I think about it, GPUs are following the same pattern. It's become more cost effective to get higher end card and use it for 7-8 years, than to update every other generation.
I was an early adopter of AM4
Got a Ryzen 5 1600x on launch
Later bought a Ryzen 7 1800x
At the end of 2019 I got a 2700x
Mid way through 2020 I got a Ryzen 7 3700x
And just 7 months ago I got a 5700x
All on the same exact MSI X370 Gaming titanium motherboard
That was “over built” back with the release of the 1800x
Now, if I really want, I can still upgrade further, into a 5800x3d or into a Ryzen 9 5950x
Such insane upgradability for a platform
A nearly 80% single threaded uplift from the 1800x
And a 170% multi threaded uplift
To the 5950x
Or a 100% single threaded and a 130% multi threaded uplift with the 5800x3d
The AM4 platform is legendary
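The uplift percentages in the comment above are straightforward ratio arithmetic; a tiny sketch (the scores below are invented placeholders chosen to reproduce the quoted figures, not actual 1800X/5950X benchmark results):

```python
# Percentage uplift from an old benchmark score to a new one.
def uplift_pct(old_score: float, new_score: float) -> float:
    return (new_score / old_score - 1) * 100

# Hypothetical scores picked so the output matches the quoted figures:
print(f"{uplift_pct(1000, 1800):.0f}%")  # 80%  single-threaded uplift
print(f"{uplift_pct(1000, 2700):.0f}%")  # 170% multi-threaded uplift
```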
9:42 That's amazing, you're actually describing the very opposite of how it's been for me - when I was younger and had very little money to spend on gaming PCs, I scraped every bit of cash I could to constantly upgrade or flip my older PC for newer mid-ranged hardware (So I never had any PC configuration for more than 3 years), but now I have a top-end PC with an RTX 4090 that I bought after keeping my previous PC unchanged for over 4 years, and I have absolutely ZERO plans of even thinking about a new PC for the next few years, let alone a partial upgrade. How would it even make sense?
He's describing exactly how it is for most people on midrange/enthusiast hardware.
The upgrades are more spaced in time as you have less money to spend on luxuries.
Whereas people who buy the top end often have a "money is no object" case.
Yeah, was similar for me. Back in the early 2000s I was upgrading at least every 2 years or so, always on a budget with low end parts. Nowadays I tend to buy more high end for my main desktop and keep it for a long time.
But I also remember that I upgraded based on kind of the rule of thumb: When there's something available at about 3x the performance, I upgrade. And that was indeed within like 2 years! So my change of upgrade cycle is not mainly because I actually can afford higher end stuff now. It's kind of the other way around. Because things haven't been moving that fast it makes more sense to go more higher end and keep it longer, rather than constantly upgrading. I kept a Haswell system for like 8-9 years or so, all throughout the dark ages of Intel's quasi monopoly, until finally Zen came along and gave me a reason to upgrade (to Zen2). Just upgraded again to Zen4, but even the impressive looking gains we finally get again ever since Zen came out are still not close to the pace we had for a couple of years back then.
Anyhow where was I going with this? Right, I don't think this contradicts Steve, because it's not about budget but also about how fast hardware becomes obsolete. People who are on a budget today and buy low end hardware have no reason to upgrade every gen either. Back in those days when I upgraded constantly any 5 year old system, high end or low end, would've been utterly useless. Today most 5 year old systems are still perfectly reasonable. And I can totally see on the other end of the spectrum people buying stuff like 3090Tis because they can and just want the best (some small fraction of whom may actually need it), and might therefore do it every gen.
I really liked your short giggle at the beginning when saying "new generation".
Starting from an 8086 at 5 MHz I always went through all the various generations of Intel with two rules:
1) Second hand upgrade to the best or second best from the previous generation. (I'm cheap)
2) Upgrade only if the performance is 1.5 to 2 times that of the processor I already have.
With these premises, I will have to wait forever for Intel to make a huge move before carrying out any upgrade with more than a few percentage points of advantage... or switch to AMD.
Intel made Pentium D and the OG Pentium 4 (pretty awful), AMD had Bulldozer and Piledriver.
Pick a poison.
Not like AMD is doing much better, 10-15% single thread performance increase every 2 years will keep you waiting for a while
It wasn't a bad platform overall, I think the low end i3 and i5 were pretty interesting, decently competetive and didn't suffer as much from the power consumption issues. But yeah, AMD is still confidently in the lead...
The i3s are heavily flawed; I don't know why they are the only ones without big.LITTLE. The i7-12700K is the only good i7 made since the 8700K.
This is hardly an Intel issue. This is peak capitalism and planned obsolescence. No significant technological advancement has been made in the last decade, and yet shareholders in all tech companies have increased their wealth on a yearly basis.
Most people have too short an attention span to recognize these patterns. AMD would've done the same had they been the market leader for decades. The PC landscape has been doomed to begin with ever since the universal adoption of proprietary x86 architecture gave rise to the duopoly of AMD and Intel.
Some people are so preoccupied with these menial dilemmas they don't even realize 99% of games never needed to be created with high end rigs in the first place. AAA publishers and silicon giants have entered a symbiotic relation completely unbeknownst to the cows they're milking.
No dude, Moore's Law is dying. Watch the die size of the flagship NVIDIA/AMD cards each generation.
I love the very creative ways of showing off the cpus in your B-roll. :D
So AMD achieves 10-20% each gen while Intel doesn't even get to 5% in 2 gens.
Sad.
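Those per-generation figures compound, which is what makes the gap grow; a quick sketch (the 15% and 2.5% rates are rough readings of the comment's claim, not measured data):

```python
# Compound gain after n generations at a fixed per-generation improvement rate.
def compound_gain(rate_per_gen: float, gens: int) -> float:
    return (1 + rate_per_gen) ** gens - 1

print(f"{compound_gain(0.15, 2):.0%}")   # ~32% over two ~15% generations
print(f"{compound_gain(0.025, 2):.0%}")  # ~5% over two ~2.5% generations
```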
The peanuts in the background at 3:51 are brutal 🤣
I didn't even notice them xD
So, Alder Lake IPC > Zen 4 IPC, and was released 1 year earlier... 😅
Yeah, that's why I bought the 12700K; the 7700X has 4 fewer cores and 700MHz higher clocks for basically the same gaming perf.
All that memory-tuning nonsense on AMD; meanwhile on Intel you can use low-speed RAM and it doesn't hurt as much, and you don't pay a premium for X3D.
Wow! It's so important !❤❤❤
Thank you for doing this testing!
Applying some actual context to a video like this would be useful. Raptor Lake was never supposed to exist in the first place (which should speak volumes about the refresh), but they figured out Meteor Lake was never going to be ready in time. Part of the cache redesign was what allowed the clockspeeds to go much higher on top of the increased capacity. The memory controller for RPL is also a pretty big improvement over ADL.
While I don't disagree at all regarding platform compatibility you're ignoring reality if you think it's just an easy decision. AMD proved that you cannot keep things going forever as they dumped CPU support along the way due to limited BIOS sizes during AM4's lifetime and Intel releases significantly more SKUs per generation. You'd need to convince OEMs and motherboard makers to change up BIOS support (or go back to text only, which I'd be fine with) to mandate much higher minimum capacity before anything could conceivably change. Potentially shifting the way the BIOS works entirely would also do the trick, or convincing Intel to just pick and choose CPU support. Unfortunately I don't see any of this happening because at the end of the day it doesn't provide them with monetary benefit.
Looking back, not only do these rebadges barely have any performance gains in them, but they also started having stability issues due to the silicon either being pushed too hard or having a flaw in the architecture. These Raptor Lake CPUs are not a good look for Intel.
The level of detail is greatly appreciated. Thank you!
This summer, I needed a new CPU/MB. After some research, I bought an i7-12700k. This video confirmed that I made the right choice. Thanks guys!
I did the same. The only thing potentially holding my PC back is the DDR4 RAM, but it was a very nice combo deal, so I went for it.
As someone with a 12900k and thought about upgrading to a 14900k, thanks! You just saved me quite a bit of money!
2016-17 I had two Motherboards.
The first was an upper tier, ASUS ROG Maximus VIII- along with a Skylake 6600K. I loved this board! (around $200)
My second was a budget remanufactured board, an ASUS Prime Pro X370, for my Ryzen R5 1600 (around $90).
Guess which Motherboard is still being used?
Amazing - hopefully you've been able to hand off that old 6600k system to someone else... I have my old 6700k still going as an occasional bedroom HTPC...
@@alistairwillock7266 Unfortunately it has an issue that I couldn't solve. The issue is complicated (it's been a long time, and I honestly forget the details; 2020 was the last time I attempted to solve it), but basically it would crash/freeze after being on for some time.
I think it is a 'solvable issue' because it does not freeze if Windows is running in safe mode.
I know that, in itself, sounds like it would be easy-peasy to fix. I also remember thinking that the last time I tinkered with it(2020).
Also I am not knocking the 6600K, It worked great up until it had issues.
It is a shame, that I couldn't upgrade to even an 8th or 9th gen CPU.
LGA1700 supporting "three" generations of CPUs that are totally not the same rehashed part.
Alder Lake++ memes go brrrrr
I was on an i7-4770 and upgraded to an i3-12100F. What an upgrade for such a tiny price!
I upgraded from an i7-4700MQ laptop chip (basically it's an i7-3770) to an i7-12700K (330USD).
Man, you i7-4770/3770 folks had it good... I went from an i5-3550 (multiplier + BCLK overclocked) to an i7-12700K just this summer. $350 for CPU, RAM, and mobo. I'm cheap.
@@haydn-db8z The i5s had similar gaming performance to the i7s back in the day, so it was the wiser choice. I actually had already upgraded that board from an i5 4650 to the i7 4770.
There are certainly exceptions with 4090 buyers, as getting the absolute best often makes it last much longer (see 1080 Ti). I certainly can't afford to upgrade every generation, but I am willing to invest in my PC, as it's the most important thing in my life. So, I went from a 1080 Ti to a 4090 and a 6850K to a 5950X. I expect to keep this for 3 more generations, especially as upscaling/framegen tech is now a thing. The 4090 is still CPU-bottlenecked by all but the latest, fastest CPUs (depending on title).
CPUs can last even longer.. though things aren't like the old days anymore where it was just IPC performance that mattered. It still matters, but now cores and cache do too, so an X3D CPU is worlds above a normal one. I will have to upgrade next gen to stop bad bottlenecking, so I'm holding out hope they figure out the multi-CCD 3D cache, without dropping clocks too.
Moore's law is practically dead. Intel must stop this platform change trend; the cost is not justified.. nor is the e-waste.
Hi Steve and Tim - The ad-spot is for the ASUS ROG Swift OLED PG27AQDM but the link in your description is for the ASUS ROG Strix XG27AQMR. I'm not looking for a monitor but I wanted to check it out so thought I'd let you know.
Very happy with the upgrade from an i7-8700K to a 13600K purchased a year ago, but yeah, later down the road I probably won't bother with upgrading again on the same motherboard and will just see what's coming next from Intel / AMD.
Awesome, Thanks Steve. Do you have any data for the 13900KS?
I'd love to see an AMD IPC video too.
Back in 2021 I picked up a 11600k for my first ever build. Despite being fine with my decision even until now, the AMD longevity looks very enticing for the future.
Get b650 with 7500f, and then upgrade at zen5 or maybe zen6 with a x3d part
@@noticing33 I love this community
ew 7500f@@noticing33
I suspect the 14 and 13 series are just refinements of 12th gen in terms of yields and steppings. The additional cache may be a side effect of other improvements providing more power for additional circuits. The main criticism is the lack of interesting or useful features that demand a platform upgrade. NVMe tech and USB tech are far beyond the use case of most people, and I’m struggling to find any features that the LGA1700 platform needs but doesn’t have.
The platform is irrelevant if the GPU market still sucks.
I've purchased *twelve* AM4 systems. 10 motherboards and 2 prebuilt. Gave away two, and two at parent's place, so eight currently at home.
Zero AM5 so far, but I did buy a Phoenix laptop. It's mind-blowing that this 28W ultra portable is neck-and-neck against desktop i9-11900k for CPU performance, with way faster iGPU.
Intel completely devalued the 'gen' branding with this. Plus it means any benefit seen in the 15th 'gen' will be compared against the two 'gens' before it.
Nothing more than labelling it a 'gen' to entice purchasing and to sell new motherboards, which are also pointless. Nothing more than a cash grab.
I can’t wait to see how userbenchmark spins this one.
“Intel does it again with an outstanding product release, just look at that outstanding performance. All while AMD has no product release at all and can’t compete even with their marketing lies and manipulations.”
I don't think Intel will ever match how great AM4 is
With Tick-Tock, Intel paired motherboards with one processor architecture, and the second generation was a shrink of the same architecture: Sandy Bridge & Ivy Bridge, Haswell & Broadwell. Then Skylake was crazy: 4 generations with very similar architectures and 2 or 3 different sockets. The IPC difference was small from 6th to 10th Gen Core processors.
I have an i3 12100 which I got for £90 at the start of the year. I got lucky here as it fits my purpose, has a potential upgrade path, and my CPU allows for enabling AVX512 which helps significantly for RPCS3 emulation.
RPCS3 runs well on the 12100, really? It runs pretty well on my 13600K even without AVX-512, but I was thinking the two extra P-cores and extra clocks of the 14700KF might help with RPCS3 and Yuzu in the more demanding scenarios.
I almost wasted my time commenting on a random bar graph video. Thank you guys for actually answering my question of is it worth upgrading if you’re tech literate and already overclocking manually
"intel's 15th gen needs to offer a nice performance uplift" about that...
That's why I went with the 1700X over the 7700K when it was released. After some years I replaced the 1700X with a 5600 a couple of months ago. I gave the PC to someone else who's rocking it daily and bought an AM5 system with a 7800X3D. I wonder when I will replace that one :)
oh hella yes, i had the same journey (although with a 5900X instead of a 5600) and it's been so awesome. back in the day i did some CPU-intensive tasks too (mostly Blender, which has since been taken over by the GPU, especially since i got my first RTX card) so i got some good use out of that 1700X, and it was a friggin beast by 2017 standards. and Zen 4 is absolutely crazy, going down from 12 to 8 cores literally hasn't been a downgrade
amd's platform support counts a lot. even when they mess it up -- i did switch motherboards, from my og X370 board to a B550 when upgrading the CPU, because X370 didn't have zen 3 support yet, but i gave that X370 mobo and the 1700X in it to my cousin and since then the platform did gain that support and he was able to upgrade to a 5600. there are actually CPUs from all four of the AM4 generations in the family and it's hella nice to be able to just mix and match them as needed
I went from 3800X to 5800X3D. A single, but massive upgrade with the same motherboard and RAM.
I missed the era of the 2700k to the 7700k being the same cpu. I’m glad Intel is blessing me with this nostalgia.
Really good content here. As usual, a must watch. I really agree with your analysis. Thanks a lot Steve!
Intel: we rename it, raise the prices, and here we go, new product
14th Generation should have been called "Older Lake"
Timestamp 13:00 - I totally agree. If you just play the game you will never tell the difference between the 12900K and the 14900K. Unless you play the FPS-counter-in-the-corner game, then yeah, you will see a number difference, but not a gameplay difference.
I can't believe my 12700K, which is $273 on PCPartPicker, beats a 14600K which is going for $300. Don't buy into false advertising. A bigger number is not always better.
Excellent video! I totally agree with the conclusion.
could be interesting to see this test with e-cores enabled, at the same frequency, to show how they impact performance in modern games...
It depends, in Cyberpunk 2077, they do A LOT.
They also do wonders on shader compilation.
Did you guys change LUT and add more sharpness? The image feels so much more crisper and deeper now. Very good change
Some people say that 14th gen only advantage is the enabling of "DLVR", which is supposed to reduce power consumption at same clock speed and performance.
But apparently, for this advantage to be really perceptible, the socket must be power-constrained, way more than the K-series is by default.
For example, it would have to be limited to something like 65W (non-K variants) in order to observe better performance at same power budget.
This is a "claim" though, I haven't checked it myself, nor seen a review trying to analyze 14th gen power efficiency advantage under this angle.
It would be interesting for a reviewer to have a look.
I've got to get the “New Benchmarks” hoodie
Would love to see a Ryzen comparison for the previous gen, see how far AMD has come!
The biggest appeal of this refresh "generation" I could see before it launched was that they were finally going to get DLVR working, lowering the voltage (and therefore watts) the chip sucks down.
Discussion online seems to suggest that it is working, the **900 chips are just pushed *so* hard out-of-the-box that it makes no difference in that tier, while the lower tiers see a benefit.
I had hoped that the downclock to 5 GHz would help matters, but it looks pretty similar. Disappointing.
General question: What are justified reasons for a new socket? DRAM generation is (apparently) one but PCIe generation for example isn't.
Please enlighten me 💡
to be brutally honest, i never upgrade just the CPU on its own; by the time a CPU change is justified, it's usually time to change everything
Solid vid, exactly the info we need.
The difference between 14th gen and 13th gen is that 14th gen uses PER core throttling, and 13th gen uses ALL core throttling; allowing 14th gen to maintain slightly higher clocks.
2:50 This is only true if you consider the R7 5800X3D/R5 5600X3D to be an "entire extra year/2 of CPU support" for AM4. If you only consider the major architecture releases otoh (Zen 1, Zen +, Zen 2, Zen 3), then AM5's guaranteed support roadmap is already JUUUUST as long as AM4's was. 🤷
(AM4 = 2017 through end of 2020; vs AM5 = 2022 through end of 2025)
LGA1700 was always going to be stuck in the middle. A new socket was needed to support the move to DDR5, and the next gen FOVEROS based chips needing a new socket for a radically different design shouldn't be a surprise.
This is the main reason i went with amd, to be able to get "actual" multiple generational cpu upgrades on one socket is huge, especially when im buying a $600 - $800 motherboard.
Wait, did this video ignore the temps and wattage? 😅 Der8auer showed that there are latent improvements. Latent, that is, until his delid video revealed them.
Forget about increasing scores
They had to increase the price!
Was heading to the gym but when I saw the massive gainz from 13th to 14th gen I just didn't fancy it.
Intel: were always moving...
Laterally.
I wonder how many Intel chip designers or product managers watch stuff like this? The chip designers would probably say "yep, this is true, we really haven't innovated lately" and the product folks would say "we have to sell more chipsets, so they have to change to maintain revenue." Who knows.
Both uArch and process node are not great at intel, they’re hiding behind marketing and desktop part power draw.
Shows how incredible the i7-12700K was. A perfect replacement for long time i7-8700K users
You guys single handedly made 12700K prices go up lol
🚨Guys, a head's up. The link in the description to the sponsored monitor is wrong. Link goes to an IPS monitor (XG27AQMR) instead of the advertised OLED one. 🚨
Hey Steve, you should try to reach out to some AU hardware resellers/system integrators and see if they're willing to part with some CPU sales data. From my perspective Intel is out-selling AMD 10-to-1 in open-loop system sales.
I have specifically recommended Ryzen mostly because of platform upgradability. They had old equipment and had to spend big $$$ compared to years ago. I told them they could drop in a big upgrade in 2-4 years, kind of like upgrading a GPU. Won't be as good as buying a new system ofc, but you'll save on RAM and mobo. Best bang-for-buck reasoning.
I did mention the higher power draw with Intel, but they didn't care as much. The decision maker was the drop-in upgrade years down the line without replacing everything.
Should have been called the 13950K, 13750K and 13650K in my opinion
You know, every lake has its own boat. The guy rowing the boat hasn't changed. But he's got a new t-shirt every year.
That was a bit brutal to watch. It's one thing to know it, but another thing to see it in the bar graphs.
I'm on a 12900KF I got recently on sale. It's such a small difference here, truly. I'm happy with it; I needed to upgrade off 9th gen.
It's very interesting to see the (non) development. Just shows again why competition is good for the market.
I’d love to see Intel change but I’m not holding my breath. Not that it is a big issue for me as I have tended to build a top or near top end machine and keep it for around 4 years. So guess I will be stuck with my i7-13700k/Z790 machine for another few years.
Still a really good CPU, for a while.
Intel is definitely never changing. They still outsell AMD by far, despite really not being that competitive. Not enough incentive to improve when you have so much guaranteed income.
What I would like to have seen Intel do - and something I think would benefit users FAR MORE than simply more than 2 generations of support - is to unify the enthusiast families onto one socket. To have consumer and HEDT share a socket and have interchangeability would be HUGE! Imagine, K SKU and Z board with the only change being the socket change so that the Z board could also accept a 16 'P' core HEDT X SKU. I'd suggest also having a 350W limit on the 'Z' boards to ensure end users aren't blowing them up with X SKU overclocks. Then you could go with extra cores on the cheap board or get the more expensive board because you also need the I/O. The 'X' boards would also be one single platform for extreme overclockers to play with all of the core count CPUs.
They did something similar with W790, but it's far worse and more confusing IMO since both platforms have the same name, just a 90° rotated socket to denote what is the pro workstation and what is the lite workstation board.
Then you'd also no longer see OEMs/SIs putting 'K' SKU CPUs on 'H' series boards and crap like that.
In short: All AMD needs to do is support AM5 from Zen4 all the way to Zen6 and they've succeeded in making Intel look silly, again.
Shouldn't be that hard, actually. AM5 has everything it needs to be adequately equipped for that timeframe. Nobody on a consumer platform will need PCIe6 or CXL, anyway.
Technically they did this from the 6th to 10th Gen on desktop. All Skylake derivatives with only core and clockspeed bumps, in some cases an improved IGP
Most hate 14th gen due to the lack of progress - it's just Intel and partners wanting a new generation every year. The thing is, hating it doesn't make much sense from a consumer standpoint, because from this point of view it's nice to be here: 14th gen makes the same-performing 13th gen cheaper, and the 14700K is a great upgrade option for current owners of lower-end chips - you get almost the ultimate LGA1700 performance for i7 money. That's how it is for now, and there's also the rest of the generation coming...
They could have just done that by lowering 13th gen price, instead of performing this clown show.
@@TheCountess666 Not really
@@stanisawkowalski7440 yes, really.
@@TheCountess666 Agreed, they should have just named the 14700 as a 13750 or even 13800 and then lowered the prices of current line. It was known around the time of release for "Matisse" that Intel were in the mud and would be behind for a good few years. They changed that understanding with the introduction of e-cores but as we can all see, it was just putting a plaster on a deep wound.
They'll catch up eventually but needing all that power to sometimes be ahead but also losing a lot too is actually embarrassing for Intel who have had more than enough time to become as efficient as AMD. Good for the consumer though.
@@TheCountess666 Not really, because a whole new generation comes with its own pluses: Dell and similar have new CPUs to offer in their computers to e.g. justify their higher prices; Intel can claim longer support of LGA1700, and only a minority is into hardware enough to know how it is with the 14th "gen"; consumers get 13th gen cheaper, and with the 14700K get the unusual gift of being able to upgrade to almost the full potential of the platform without buying an i9, so for 2/3 of the price, which definitely works out price/performance-wise. And there are lower-end CPUs yet to come, when the already more budget-oriented 13th gen lineup beats Ryzen 5 and 7. So what stands behind your "yes, really"?
Hey Intel, support overclocking on your non-Z chipsets, at least the B if not also the H; if you did them all you'd actually one-up AMD, since they don't on their A series. Just bought an AM5 system coming from LGA1151, and this video explains why. I may not upgrade my CPU, but if I want to I'll be able to. Intel, you're fresh out of luck.
My 3080 with the 12900K runs great in any game I own. Even Star Citizen works just fine. Nice to see that it's a good CPU.
Going from 12th to 13th/14th gen isn't a terrible upgrade though. I agree overall it's "meh" at best, but honestly the reason I upgraded 12th to 13th was for the vastly superior IMC and the ability to run much higher/better memory. My 12900K couldn't even hit 5800 speeds while the 13th is cruising at 7200 with no messing around, just XMP. The fact you were able to run these tests on the 12th gen with 7200 is impressive; you must have some really well-binned 12th gen chips, as that is not typical. So overall, I don't disagree with you, but if someone had a 12700K and went to a 14900K in the same mobo, that isn't a terrible jump when considering overall performance, if power draw is ignored.