The fact that Steve went so far as to delete his old video, which was still getting a lot of views, because it's now outdated is sheer proof of this guy's dedication. As always, a great comparison video. Great work Steve, keep up the good work
@@Deinorius well we did lose a week's worth of work, so that in itself does sting a bit :D We also lost a high-view video that was still generating over 5,000 views per day for the channel and would no doubt have made us good money over the next 12 months. Anyway, we felt the right thing to do was delete that video, and hopefully the traffic gets served to this new one, but it's YT, anything could happen there :D
@@walkir2662 people forget that they have to remove that heat from their space. I have 4 gaming PCs and 3 servers in my house. If they were all Intel, I would be baking. The heat output of my 7800X3D is crazy low compared to its gaming performance.
Well, don't forget... those are uncapped FPS, and they are absolutely useless in single-player games in 95% of cases. With variable refresh rate, you only need around 80 FPS capped for games to be enjoyable, or you cap it at 60 FPS while also disabling variable refresh rate and setting your monitor down to 60Hz. Then the 4080 Super turns out to be more power efficient than my older 3070 Ti, for example, while delivering noticeably better lows.
Get a cheap box fan to exhaust the heat out the window. I have 5 modern AM5/AM4 builds in the office and even with only 2 or 3 of them gaming it heats up quick. Great for the winter. If you have horizontal sliding windows you can likely stack 2 cheap $20 box fans in the window. Walmart has these cheap ones where the handle has a groove that notches with the window frame to hold it, and then you slide the window as closed as possible. Set them on low, crack your door, and enjoy comfy gaming.
@@SamBlips assuming you're gaming for about 4 hours every day, 365 days a year, and your 14900K guzzles on average 150W more than the 7800X3D, you'll be spending approximately 36 dollars more per year (going by the US average of 16.68 cents per kWh). Obviously there are a lot of variables (the games you play, for how long you play them, where you live), so adjust accordingly
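A rough back-of-the-envelope check of that estimate (the 150W delta, 4 hours/day, and 16.68 cents/kWh are the figures from the comment above, not measured values):

```python
# Back-of-the-envelope sketch of the extra yearly electricity cost.
# Assumed inputs from the comment: 150 W extra draw while gaming,
# 4 hours/day, 365 days/year, US average of 16.68 cents per kWh.
extra_watts = 150
hours_per_day = 4
price_per_kwh = 0.1668  # USD

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365  # ~219 kWh
annual_cost = extra_kwh_per_year * price_per_kwh

print(f"{extra_kwh_per_year:.0f} kWh/year -> ${annual_cost:.2f}/year extra")
```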
@@johngrimm1103 Not just that; for me it's mostly about longevity, stability, etc. With less heat output, for example, cooling fans won't be pushing as hard, and there might be less overall chance for components to wear out within the system's lifetime. Plus slower dust build-up, less cleaning required, etc...
So if I want to upgrade it’s about the same price to get a 7800 X3D and AM5 mobo as it is to get JUST the 14900k. Even though I’m already on LGA 1700… the incentive to switch to AM5 is massive.
7800X3D + B650M-H/M.2 + 32GB DDR5 is around $500. For the same price you can buy a 13700K kit, or if you want a better board for the 7800X3D, like the B650 Aorus Elite AX, the price of the kit is close to a 14700KF kit. But I would consider buying the 7500F, a CPU which can be paired with a 4090 at 4K or a 4070 Super at 1440p, and a 7500F kit on the B650M-H/M.2+ costs around $320 (and when hexacore CPUs are no longer enough for gaming, you can switch it to a 7700/9700/7800X3D)
At this point, I wouldn't consider a 14900k for anything. 7950x will be the same at productivity workloads and at higher resolutions get the same FPS in games while consuming less power and being more stable. 7800x3d will beat it in pure gaming.
I'm waiting to build a PC with the 9800X3D or 9950X3D and a 5090, depending on how much stronger they are than current gen. If it's only a 5% difference, idk, I might just build a current-gen system once the parts dip in cost
I'd like to see a 14900k restricted to the power draw of a 7800x3d. Calculating Performance/W is one thing, but actually seeing it would be really interesting.
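For anyone wanting to eyeball the performance-per-watt math themselves, a minimal sketch follows. The FPS and wattage numbers are hypothetical placeholders, not figures from the video:

```python
# Minimal performance-per-watt sketch. The inputs below are hypothetical
# placeholder numbers, not measurements from the video.
def perf_per_watt(avg_fps: float, package_watts: float) -> float:
    """Frames per second delivered per watt of CPU package power."""
    return avg_fps / package_watts

# Hypothetical example: similar FPS, very different power draw.
print(perf_per_watt(200, 60))   # low-draw chip -> higher fps/W
print(perf_per_watt(195, 180))  # high-draw chip -> lower fps/W
```

Capping the 14900K's PL1/PL2 to match the 7800X3D's draw and re-measuring FPS would show how much of that ratio survives the power cut, which is exactly what the comment asks to see.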
I just got a 7800X3D from Micro Center for $204 😮 with their grand opening sale for the Charlotte, NC store when bundled with mobo and RAM. And holy shiz it's fast. My i9-10850K is now a living room PC with a new Arc A770!
That's a fckn steal! Also: Welcome to meaningful socket compatibility! The only socket I didn't upgrade on was AM3 because Bulldozer and Excavator sucked (and my Phenom X6 1100T Black Edition just kept trucking), but my AM4 board saw two CPUs (2700X and 5950X) and I suspect my AM5 board will most likely see three (currently on 7950X3D).
@@maanmahmoud4537 it's not exclusive to Micro Center grand openings, it's at every Micro Center, and it's a gimmick. You can look at it like you pay whatever their sale price is for the chip by itself (like $350) and then you get a shitty overstocked mobo and 32GB of DDR5 D-die dogshit sticks. What they do is write on the receipt that the 7800X3D is $200 and the shitty mobo and sticks are $350 lmao
Yeah, I jumped to the AMD AM4 platform with the Ryzen 1700X, then 3700X, and now 5700X; I'll look at the 7600/7700 on the new AM5 platform. I've built some Intel-based machines for others and never been super happy with them.
This could all have been prevented if Intel hadn't bothered releasing a 14th gen i9. Let's be real here, it is the same thing as the 13900K. But no, Intel had to make a fake generation lol
I'd like to see userbenchmark's take on all of this. "Intel continues to improve by providing irresponsible customers who've run their CPUs at ridiculous power levels a more stable solution"
@@RobBCactive If AMD is able to resolve their problems with parking and addressing the cores, the x950X3D could be interesting, but right now the x950X non-3D is overall the better gaming CPU.
@@jojobetzler3732 It's rumored the 9950x3d will be both chiplets with v-cache instead of just one on the 7950x3d. Being all 16 cores with v-cache should eliminate scheduling issues.
@@Angel7black Intel has 8 P-cores and 16 E-cores. So AMD using 1/3 the power while Intel gets worse results is massive. It's still an equal comparison, because gaming is going to use the 8 P-cores vs AMD's 8 cores.
@@Angel7black We're discussing games, thread count doesn't matter. But, even if you disable E-cores and turn the 14900KS into an 8c/16t chip, it's still significantly less efficient than a 7800X3D. But if you want to have an irrelevant discussion about thread counts and efficiency, I'm down. OC'd, a 14900KS draws more power than a last gen 64c/128t Threadripper 5995WX, but achieves only ~60% of the multithread performance. An OC'd 14900KS has similar power draw to a stock 96c/192t Threadripper 7995WX, but the 14900KS only achieves ~20% of the multithread performance. "But those chips cost 10x what a 14900KS does." Yup, but which Intel flagship would you prefer I compare with them? The 14900KS doesn't quite match a mid-tier Threadripper 3970X 32c/64t in multithread performance, and that chip consumes less power and is over 4 years old now. Concerning Intel, you should keep the topic on gaming, as games and single thread performance are the only areas where Intel can still compete.
@@Angel7black Even the 7950X3D is still using way less power, performs the same or occasionally ever so slightly quicker than the 7800X3D depending on the game, and it's on par or faster in the most heavily multithreaded programs compared to the 14900K, with an actual full-fat 16 cores/32 threads and not this efficiency-core crap, because it's certainly not efficient. All while sucking down significantly less power. Intel is doing exactly what it did in the P4 days: just up the clocks, give it as much power as it wants, and to hell with power efficiency and heat.
When you see the power consumption it's wild, considering how often the X3D is running the GPU harder. Intel dropping hyper-threading from the P-cores hasn't been explained as being done for power consumption reasons either. Hopefully the E-cores have improved their efficiency a lot.
Intel vs Ryzen at gaming, a short history: • Ryzen 1000 and 2000 series: Intel clearly better, no sweat. AMD sells very cheaply to compensate. • Ryzen 3000 series: AMD making gains but Intel still comfortable, chugging away at 14nm+++. • Ryzen 5000 series: AMD reaches gaming parity; Intel pushes a lot more power through their chips to keep up or slightly edge ahead. • Ryzen 5000 series 3D V-Cache and 7000 series: AMD now significantly ahead. Intel has to push obscene amounts of power through their chips to keep up. • Ryzen 7000 series 3D V-Cache: AMD so far ahead that Intel decides to risk long-term silicon degradation, pushing hundreds more watts through chips that cost 80% more money just to lose by a few % in benchmarks. • Ryzen 9000 series: I hope Intel has been working on something special....
I wouldn't say Intel pushed more watts through the CPUs; that's the MB manufacturers doing that. However, Intel surely had no problem with manufacturers shipping motherboards with open power limits by default, since most hardware YouTubers test "as delivered". So Intel surely wasn't against MB manufacturers ignoring their power limits; on the contrary.
@@johnscaramis2515 No, it's both. Mobos went into overkill territory, but even with Intel's more efficient profile they're still consuming 100W more while still losing.
@@blkspade23 I would assume that Intel spec'ed their CPUs to maximize performance by running right at their limits, but not beyond. So even with the high power draw, they would not have a technical issue. And I'm fully in line with you: raising power draw to catch up with AMD without enforcing guard rails (i.e. enforcing their power limits as defaults) was a big mistake. And I don't think Intel will be able to close the gap with 15th gen, although with their Foveros technology they could introduce a cache tile
For the Ryzen 3000 series, Intel's competing CPU flagship was the 9900K, which is on 14nm; only 12th gen Intel onwards has 10nm. AMD was eating away at Intel's HEDT market because of the R9 3900X and 3950X, while Intel's non-HEDT platform maxed out at 8 cores at the time.
Ryzen 9000 series - Intel fabs throw in the towel and Intel switches to the same TSMC production as AMD. From this point on there won't be any big differences between the two, no matter how they try.
I just did a new build (due to my power supply killing my MB & GPU). My last 3 builds (over 10 years) have all been Intel because I had stability issues with AMD. Now, for the very reason I left AMD, I've left Intel and gone to a 7800X3D build. I had also been doing water cooling because of the heat generated; now I've gone back to air cooling with the lower power demand and temps of the AMD chip. I saved enough on a MB/CPU combo and a $40 Assassin 120 SE air cooler that I was able to afford a 4080 Super. I have run 3DMark benchmarks, stress tests, and played a few games (3440x1440, 144Hz), and my max CPU temp has been 76C with all stock settings... I'm very happy with the cost to performance.
That's pretty warm. I'm using a tuned 14900K at 5.8GHz and the highest temps I see are 68C. It's all in how you tune it and the hardware you use; Intel doesn't run hot if you don't cheap out on the right parts, and I'm not even using exotic cooling.
Same here on my 14900K, which I undervolted immediately when I saw the motherboard's auto vcore settings, which were absurd out of the box. Have mine running an all-core 5.6GHz P-core and 4.4GHz E-core with an adaptive plus negative offset, hitting 41K in Cinebench R23 and mid-80s C temps at max, though I have delidded! I use it for encoding and rendering, where it really comes into its own, and for gaming it does the job as well as anything...
I have read that some users run Cinebench R23 several times and then their 14900k becomes unstable (on the old unlimited power profiles). Possibly silicon degradation occurring significantly faster than expected.
@@abaj006 Asus boards often overclock the processor out of the box, which it isn't rated for. And if there's a defect, from what I've heard it gets returned under warranty
Full power is questionable because (1) what is full power on your system vs others, and (2) how often do you really load up your CPU such that it pulls the max power allowed?
They're actually designing it a bit differently instead of just dumping it full of V-Cache lol. Hopefully, we can pick out the higher chips like 9900x3Ds if they have it on all 8 cores (or more lol.) The 9800x3D will probably be the next powerhouse for money efficiency but perhaps the flagships might outclass it and have good productivity performance without compromise lol
indeed, if we can get the IPC of the 7700X or better and Tier1 gaming performance with amazing efficiency, the 9800X3D looks like a legend in the making.
@@maze4184 they're gonna switch to tiles next year too. But rumour is they're gonna use a stacked tile design, something like the 3D V-Cache tile but for all the tiles instead. So at worst I'm guessing they'd be on par, with the same power draw too.
I undervolted my 7800X3D and it now uses 78W at max, much less while actually gaming. Imo Intel is not saying out loud that you need to lower PL1 and PL2 because they would lose even more to AMD in the gaming sector.
Truly disruptive performance indeed. Intel can't handle the 7800X3D even with the 7200 CL34 memory. Thank you so much Steve. Thumbs up! Now, Arrow Lake vs 9800X3D, this will be good.
Now I hear that Intel says the CPUs have a microcode error, fixable by a BIOS update. The cynic in me says the code change will simply throttle the CPU.
The funniest thing to me is that they're equal at every resolution people should be playing at these days, especially with a 4090. AMD has so many longevity and efficiency advantages over Intel. I still love my 5950x!
People like different things. Personally I'm running an XTX at 1440p. I have a 7950X3D and I see clear performance differences when I lock a game to the vanilla CCD instead of the V-Cache CCD.
My 7800X3D comes in today! Moving on from a 12400F. Never in my life would I have thought I'd be switching to AMD; my first build was 12 years ago and I've always used Intel. The 7800X3D is just too damn good of a chip to pass on
I switched to AMD when the 5600X came out. After seeing how good the 3600 held up and the prices for their gaming performance AND energy efficiency, it became really obvious that AMD was back to winning (at least for gaming).
Switched to AMD for the first time a few years back with the 5900X and I'm still happy with it and impressed with how much AMD has been improving in the CPU market. Will upgrade again this year and currently leaning towards AMD again since I've enjoyed my 5900X and Intel's mess with instability didn't inspire confidence. Going to be watching closely to see who releases the better products this year, though I will say I'm quite interested in trying out an X3D CPU.
I greatly appreciate the structural framework of your videos and the time and effort that goes into them. But I must confess, in most cases I simply skip to the final thoughts.
When building earlier this year I went the 7800X3D route - it seemed more performant, used way less power, was simple to cool, and was less than half the price of the Intel chip where I live. It was a no-brainer! Watching this makes me happy I went the way I did 😊
Thank you HWUB for using the 7800X3D instead of the 7950X3D, giving an even better showing for AMD, since the 7800X3D is in a totally different price range from the 14900K, for gaming in particular. Such value for gaming hadn't been seen until AMD introduced their X3D lineup back in the day... Even without regard to these updated results with performance profiles for Intel, the 7800X3D is an absolute gaming beast.
@@Anonomobot I use 7800X3D with 7900XTX, even though my GPU uses a bit more power than I would like, the price/performance is still good and I can play anything.
The 7800X3D is $200 cheaper, runs well on cheap boards, and requires a cheaper cooling solution. Those are honestly its biggest advantages; even if it were 5% slower than the 14900K it'd still be worth it. For example, if you want a decent 14900K build, you need: • a ~$250 motherboard at least, with good VRMs and good power delivery; • 6400-7200MHz or better RAM, which is at least $130+ for 32GB, otherwise you're just paying for a better memory controller without utilizing it; • at the very least a $100+ AIO (you could also buy a large dual-tower cooler, but it might be a chore during the summer); • possibly a larger PSU, since an 800W PSU with a 14900K & 4090 might not cut it, whereas you could get away with a 4090 on an 800W PSU when your CPU only draws 80W at most. Also, the 14900K dumps a lot of heat, so you have to make sure you have a larger case with good airflow, meaning ITX cases pretty much go out the window. Even if you could cool a 14900K well enough with a cooler small enough to fit inside an ITX case, the case temps will suck and you'll need to lower the power profile so much you might as well have bought a 13700K.
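Tallying the comment's own ballpark numbers for the Intel-side platform extras (all prices are illustrative figures taken from the comment, not current market data):

```python
# Sketch: summing the ballpark platform-extra prices quoted in the
# comment above (illustrative USD figures, not live prices).
intel_extras = {
    "motherboard (good VRMs)": 250,
    "32GB 6400-7200MHz RAM":   130,
    "AIO cooler":              100,
}
total = sum(intel_extras.values())
print(f"Extra Intel platform cost before PSU/case upsizing: ~${total}")
```

That rough ~$480 in platform extras, on top of the $200 CPU price gap, is the value argument the comment is making.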
@@Traumatree The way he worded it, I believe he's saying that even if it were slower it would still be worth it. I don't think he's saying he believes it is slower; he's just using that to point out how much of a value it is, since it's also faster.
1. The 7800X3D DOES need a good cooling solution. 2. You don't need a $250 motherboard for a 14900K. 3. You don't need 7200MHz RAM. 4. You need a huge PSU anyway because of the GPU. I have the 7800X3D, and yes, the i9 should not exist when we talk about gaming.
I've never felt good about CPUs using 250+ watts to run; it's one reason why I chose to go with the Ryzen 7900, because it's rather great for my needs and sips power. It's nice to have a small air cooler and not worry about it.
AMD coming in STRONG at nearly half the cost and a tier down (Ryzen 7 vs Intel i9 series); how can any "influencer" recommend Intel with such a difference?
Intel is to blame here. In order to win the performance crown it let the board partners run wild with the power limits. It's Intel's fault so it should be providing replacement CPUs for those customers that suffer from CPU degradation.
Even more efficient and nipping at the heels of all this in gaming. Productivity is a somewhat different matter, but it's a gaming CPU, so your choice there is clear. Prices are going up on the older CPU, however, so the 7800X3D just looks better and better, especially as DDR4 becomes more scarce as time goes on.
@@BlackJesus8463 Depends. The 7600 has better single-threaded performance, but if you leverage the cache (which means games), then the 5800X3D runs off with it. The 7600K is a helluva budget chip, though. If it hadn't been for DDR5 prices, this machine I'm typing on now probably would have been a 7600.
@@davefinfrock3324 R5 7600, not the old 7600K. In gaming, on average, the 7600 = 5800X3D, and the latter is less efficient, drawing about 100W. Because of platform costs, a 7600 with mobo and RAM costs as much as a 5800X3D with mobo and RAM, so the 7600 is overall the better option
Steve - thanks for this much appreciated comparison. I noticed that at 4K, performance across all 3 (7800X3D, 14900K Performance & Extreme profiles) is more or less the same in nearly all cases. Unfortunately you only show the power usage stats at 1080p, but I'd be very interested to see this at 4K too. If the 7800X3D maintains the same material power-efficiency advantage over the 14900K at 4K as well, that would be useful to know and would, at least for me, fully confirm your conclusion.
*Almost top performance; an i9 or Ryzen 7700X will be faster with RAM speeds above 7800MHz. If you tune the memory the lead becomes even bigger, but at what cost.
@@insector2093 an i9 14900K/KS (binned memory controller): the only upside of it is DDR5 8000MHz, and all that gives is better 0.1% lows in some games. But if you don't tune your RAM and bin your CPU, there's legit no reason to go Intel...
@@insector2093 + Ryzen is limited by its Infinity Fabric, so running 7600/7800MHz is going to reach the same performance as running 6200MHz with tight timings
Holy shit, up to 200W more while gaming... Over time that adds up. At Central European prices of €0.31 to €0.50 per kWh, that works out to roughly €80-130 extra per year just for having an Intel CPU, if you game three to four hours a day on average.
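Running the numbers under those assumptions (the 200W delta, 3-4 hours/day, and €0.31-0.50 per kWh are the figures from the comment above):

```python
# Sketch of the extra yearly cost at Central European electricity prices.
# Assumed inputs from the comment: 200 W extra draw, 3-4 h/day gaming.
extra_watts = 200
hours_per_day = 3.5  # midpoint of "three to four hours"

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365  # ~255.5 kWh

for eur_per_kwh in (0.31, 0.50):
    yearly = extra_kwh_per_year * eur_per_kwh
    print(f"{eur_per_kwh:.2f} EUR/kWh -> ~{yearly:.0f} EUR/year extra")
```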
But... you already knew the results: identical performance with new-ish titles that saturate the 4090 completely, and the spread opens up the older the game is.
Thank you for still including the 4K test results in your benchmarking. It will be interesting to see with Lunar Lake, Zen 5 and then Zen 5 3D to see if CPUs make more of a difference at 4K in future, along with Nvidia's upcoming RTX 5090 GPU.
The most impressive thing about the 7800X3D isn't even the performance. It's the efficiency. You don't even need a strong cooler! I still find it hard to believe that in the most CPU-demanding games, it's only pulling 30-40 watts, all while outperforming Intel chips pulling 200+ watts. I thought Alder Lake 12th gen Intel was decent at the time, but once DDR5 prices settled down and there was a sale on the 7800X3D and B650 boards, it was too good of a deal to ignore.
I have my 13900K's core/cache current capped right now, originally by limiting ICCMax to 328A-400A; anything under 400A led to a current throttle. When current-throttled, my 13900K wasn't using more than 200W. It wanted to use 253W, and the problem was it sometimes wanted more than that. For me at 253W, a 400A cache current is perfect and leads to 0% throttling (current EDP throttle). The problem comes when the CPU attempts to use more than 253W, which in turn wants more than 400A, and Intel specifically said "core current NOT TO EXCEED 400A". They seem to be linked: if I lower my core/cache current from 400A to, say, 360A, the max wattage my 13900K uses under full load falls from 253W to 200W, telling me I have an EDP current throttle.

While I'm using two current protections right now through my Gigabyte motherboard with 253W/253W/400A manual settings, I don't even get above 160W, which I'm fine with for summer, but still, WTF? I have a Corsair 7000D with a 420mm AIO; I should be able to run the Extreme profile. At 160W none of my cores even go above 75C under heavy load, but at least it's stable.

The problem with these 13900K CPUs is actually trying to use all 24 cores you paid for, when the chip really needs all that power to push each core and goes over 253W/400A (it should never have been allowed to exceed this and damage the silicon). In my case it was fine while gaming, since nowhere near all 24 cores are used. When I did compression/decompression, it hit 100%, I crashed, and now I can't run 400A completely stably even with thermal headroom for it. This means I can't even reach 100C on a core before crashing; crashes can occur between 85C and 100C under all-core workloads, before I even hit a thermal throttle. That sucks.
I want to point out that Gigabyte has an option to show "biscuits", which I've heard from numerous users is basically the quality of your silicon. I started out at 91 biscuits; I looked it up because I'd never heard of biscuits and thought, what the heck is that? Well, I'm currently at 89 biscuits after the crashes over the past year, before finding out it was the 13900K the whole time. I thought it was my 6400 DDR5 RAM, since it wouldn't boot through XMP without changing my saved profile on my Z790 Aorus Elite AX, and it would only boot after resetting everything back to stock.

Given that I have to throttle my own CPU at this point, and Gigabyte still doesn't have a non-beta BIOS to fix this issue (which I'm hoping drops so I can run the Extreme profile I should be able to, because I have the thermal headroom for it), I'm about tired of this mess, and of the fact that they're indicating Intel will just point the finger at me when this whole thing is BS. I didn't overclock mine; I just wanted the performance everyone said my 13900K would have, and now I have a lesser 13900K. I just want 253W/253W/400A without crashing under heavy loads, that's it, nothing more.

To note, yeah, this is my first PC built in over a decade, so I've always liked Intel. I love every other part of my PC; the 4090 is absolutely amazing, everything is good. Looks like I'll be going with AMD CPUs from here on out, though. I regret purchasing the 13900K.
Thank you for another great set of test results that show: • If you want more frames/second (FPS), the 7800X3D is usually a better choice than the 14900K; • The Extreme profile will offer more FPS and use more power than the Performance profile; • It is implied but not shown that the Extreme profile will cause its CPU to age faster. What has not been discussed is the effect of the cooling solution, thermal velocity boost, and short-time boost on FPS and aging. One of the things to identify for a given cooling solution is the acceptable steady-state temperatures for (a) wall power, (b) the CPU, and (c) the GPU at stable CPU and GPU frequencies. It is assumed that boost will take advantage of headroom below steady-state loading to raise frequency. The extent to which boost exceeds acceptable steady-state temperatures will accelerate aging and may impact reliability.
You know that AMD runs hotter than Intel in gaming, right? Intel runs hotter in Cinebench simply because it has 3x the cores, so of course the power consumption will be higher too, but gaming is a different story: X3Ds are extremely hot, while even an i9 stays at 40-50 degrees in gaming
@@KoItai1 Intel simply consumes far more energy and outputs far more heat than AMD. AMD temps are higher due to slower heat dispersion, but it still outputs far less total heat than Intel.
@@KoItai1 "X3Ds run extremely hot" - no they don't lol, 80C isn't hot. Besides, if they were hot, they'd be thermal throttling or downclocking, but they aren't. The 14900K may show a low average temperature, but that's only because while gaming 70-75% of its cores don't do anything. Look at the temperatures of the P-cores; I bet they'll be hotter than the X3D's.
@@KoItai1 also, who cares if it runs hotter as long as it performs better? You're still saving money on the power supply, power bills, motherboard power stages and VRMs, etc.
I wish you guys would test CPU-bound games more often, like RimWorld, Dwarf Fortress, Factorio, Hearts of Iron IV, Civilization VI, etc., when benchmarking CPUs (in the mid-late game, when the power needed is much higher)
I'm really happy that my 5600x is doing everything I need it to do, and went for it back when I did. It's more than enough for my high refresh rate 1080p adventures combined with a RTX 3070. Enough fps with very low power consumption. But that Intel high-end mess is really a..well..mess. I hope Intel comes with something new and impressive soon in the CPU market to keep the competition up. Good competition is healthy for us customers and so that AMD keeps inventing and improving as well.
Now Steve, you should do the productivity tests with the 14900K vs 7950X/X3D. If gaming FPS gets hit by the Performance profile, productivity will be hit too!! People deserve to see these updated benchmarks in order to choose wisely. Another thing: how can Intel recommend the Extreme profile when PL1 = PL2 = 253W degrades the silicon?
Great video as always! AMD guy here; last weekend I built my friend's kid a gaming desktop. He wanted to go Intel, and I wasn't going to try to change his mind (to get a 7800X3D instead of the 14700K). Well, I built the system; everything went smooth and I really had no issues. But on boot-up, the BIOS (MSI Z790 board) asked me what power profile I wanted to use based on my cooler. I installed a Corsair iCUE H150i ELITE (360 AIO), so I chose the water cooling power profile, thinking a 14700K won't draw anything near what a 14900K would, so this should be fine. Well, I got Windows installed and everything set up, ran Cinebench, and sure enough it was drawing 300W and would peak at 320W. I was blown away; it would almost instantly hit 100C and start to throttle. So I went back into the BIOS and limited it down to the Tower Cooler profile, which seemed to lock the max to 280W. I did not expect the 14700K to draw that much. The temps now peak around 90-92C, but man... that was crazy! Other Intel guys here: is that what you see from your 14700K with no limits?
That's completely normal behavior for these Intel CPUs. Productivity loads are what they're best at; keep in mind a 14700K will score 35000 in Cinebench, while a 7800X3D will score only 18000. Of course this is completely useless in gaming, but it shows these Intel CPUs do fill a gap that AMD isn't good at (yet)
Just wondering if you are using the "optimised" -30 offset for the Ryzen? Because you lose no performance; if anything you get better multicore and better sustained single-core performance, along with power consumption reduced by a decent amount on top of what it already is.
You can use that with pretty much any Ryzen since the 5000 series, right? I'm using PBO -28 on all cores on a 5600, which boosts up to 4700MHz on a single core; stock was 4450MHz. Plus, if I run all-core workloads like Cinebench, my boost on all cores is 200MHz higher than stock. You can basically tune it for better performance, or the same performance with a bit better boost and better power consumption overall, any way you like, for free, and it's as simple as it could be really
I did upgrade my TUF Z690 / 12100 with the 14600K, but never thought I was missing out on anything compared to the i7 or i9 cpus, as the most stressful use it gets is only gaming, and not with a 4090 either - yet. And it can still be OC'd to i7 level with an air-cooler, and low power consumption too. Being a "Cheapster" ftw, ha ha.
The 13/14600(K) CPUs were never the problem. Their punch-to-price was never really criticised by anyone. It's the things Intel did to claim "leadership position" that were criticised on launch day, and that is now starting to bite them in the ass for real. Heck, you can make good arguments for the 13/14700Ks as well. I'm habitually an AMD guy going back decades (my last Intel system was a dual Pentium III Tualatin 1266MHz), but dunking on brands like they were an opposing sports team is dumb as rocks. Each product should be evaluated on its own merit. It doesn't matter if it has a bigger or smaller version that is better or worse. It doesn't matter if the brand has had previous glory moments or complete duds. What it comes down to is performance, price, efficiency, stability, service, and RMA policy for each individual component.
Hey Steve, I would argue that Horizon Forbidden West can be CPU-demanding in cities like The Bulwark or Fleet's End. Maybe change the testing location if you want to keep the game in the suite. Pretty please? :)
If you have unlimited budget, get a 7950x3d. If you have mid-high budget, get 7800x3d. If you are on a small budget, try getting an am5 cpu, otherwise am4. Intel makes no sense in any part of the chart
@@AndyViant Go back and watch HUBs or GNs review of the 11900k then. It has two fewer cores and the new cores (on the same old node) don't make up for it in all situations. The 11900k gains a bit in productivity but loses a bit in gaming, and the efficiency is about on par. Both channels were left with "Why?" as their main conclusion.
@@AndyViant Said by an 11900K owner who can't cope with the fact that he bought the WORST processor made in recent history. I'm assuming here that you're not broke enough to actually own that piece of garbage; otherwise I see no reason for a mentally sane person to get so triggered by a comment about 2 processors from the SAME brand. Lmao, the insecurities
Do not use the baseline Performance or Extreme profiles. If you want to do nothing and skip the enthusiast stuff, get the 7800X3D; it just works out of the box. Now, if you want Intel and are willing to do enthusiast stuff: first, put a correction bracket on your 14900K. To make your life easier, get a torque screwdriver that can go down to at least 0.05 Nm; the lowest cost for a good one with a certification is around 70-90 dollars. Get a Thermal Grizzly 13th/14th gen contact frame and set the torque to 0.06 Nm. This ensures you're not also fighting with your RAM, having to tweak it because a correction bracket isn't seated just right, and gets you as close as possible to correcting the flawed Intel LGA1700 mounting pressure, which matters for DDR5 RAM. Make sure you have the right board for your RAM; that's important anywhere from 7200MHz CL34 to 8200MHz CL38+. Next, you can let the BIOS optimize. If your CPU isn't the greatest quality in the silicon lottery, limit cores 4 and 5 to a multiplier of 57 (5.7GHz); 6GHz+ takes a lot of core voltage, and if your chip didn't win the lottery, performance will suffer. Next you'll have to look at your V/F curves in the BIOS. Below 5.6GHz you can leave them on auto; I like to apply a negative offset of .5 on 5.6GHz and every mode of turbo up to 6GHz. This is where no one size fits all, as each board is different.
You will need to adjust each state of turbo and after you do save and exit the bios go into windows then run 3DMARK timespy the cpu test alone and will need to do 6-12 runs to sort of get an average and account for error, then keep going into the bios, weaking the .5X on each level of turbo until you find the sweet spot for the cpu that will greatly increase your performance better than you could ever get out of box and slapping the cpu also getting the performance back those baseline profiles butcher because they both still pump too much VID/Vcore into your cpu heat is not your friend, more voltage is not always better. 7200mhz cl34 you should see a cpu score of around 25k consistently, 7600mhz about 26.5-26.9, 8000mhz depending on how well you are dialed in around 26.9-27.5k. 8200mhz is around 27.Xk. Once you do that you will get the best 1% lows again with the lowest frametime MS, you will notice your system is a lot snappier giving you the best possible gaming experience possible. Kind of shake my head at all of you reviewers not properly educating your audience treating them like they are clueless and not able to self-tune. It's not just you, it's a lot of you I miss the days when tech reviewers actually took the time to educate their audience properly sadly those days seem to be over for the most part and there is like under a 1% group of reviewers that actually does that some will not tell you how because well that is their livelihoods. Does it take extra work? Yup but if you are going to show off enthusiast hardware you may want to do enthusiast stuff, or buy enthusiast grade hardware be willing to do enthusiast level stuff. Also will add the guidelines are because each chip and board are different, there is a no one size fits all, there is a set of numbers it will just work, it may not work the way you want it to nor perform the way you want it to. That is sort of up to the end user, not even the motherboard manufacturer can do that especially on intel. 
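The "do 6-12 runs and average them" step above can be sketched as a tiny script: average each batch of benchmark runs and only treat a tweak as a win if the gain exceeds the run-to-run noise. The score values and the noise rule below are purely illustrative assumptions, not measured data.

```python
# Hypothetical helper for the tuning loop described above: after each BIOS
# offset tweak, run the benchmark several times and compare averaged scores
# so a single noisy run doesn't mislead you. Scores are made-up examples.
from statistics import mean, stdev

def summarize(scores):
    """Return (average, run-to-run spread) for a batch of benchmark runs."""
    return mean(scores), stdev(scores)

def is_improvement(baseline, candidate):
    """Treat the tweak as a win only if the averages differ by more than
    the combined run-to-run noise (a crude but practical rule of thumb)."""
    b_avg, b_sd = summarize(baseline)
    c_avg, c_sd = summarize(candidate)
    return (c_avg - b_avg) > (b_sd + c_sd)

baseline  = [25010, 24950, 25120, 24880, 25060, 24990]   # stock profile
candidate = [26850, 26920, 26780, 26900, 26840, 26870]   # after offset tweak

print(is_improvement(baseline, candidate))  # → True
```

The threshold rule is deliberately conservative: with only half a dozen runs, a difference smaller than the combined spread could easily be noise rather than a real gain from the offset change.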
Intel needs to fix the LGA1700 mounting bracket; that is the first thing that has to happen to fix the lid bow, and no BIOS or microcode will fix it. At this point it is up to the end user, and Intel is hoping this just goes away as they abandon LGA1700.
Look friend, I've been overclocking for over 20 years, even venturing into LN2 a few times, so I get it, and I appreciate you making the effort to inform people on what it takes to make these chips sing. But the reality is that Intel has a bad product right now, even if you make the absurd effort spending the hours upon hours tweaking and stability testing required to squeeze out a negligible win over X3D parts, we both know if that same effort was required to get similar performance out of AMD at the same power consumption as Intel CPUs, Intel enthusiasts would rightly laugh them out of the room. If you are just gaming, grab a 6000MHz DDR5 kit and a 7800X3D, enable EXPO and you're done. You'll get a range of 90-105% of the performance in games of an expertly tuned Intel chip with 2% of the effort required, with much lower entry cost and power consumption figures. Even as someone who loves OC'ing, the reason I OC is to get everything out of my money, and the extra cost for a high binned DDR5 kit and a 14900K/KS required to best the X3D just makes the Intel platform look like a joke. If you are running productivity workloads, especially if that's your livelihood, you care equally about stability and performance, you aren't running bleeding edge OCs, and you should be running Threadripper. At which point Intel doesn't even enter the conversation, as an OC'd 14900KS has roughly 40% lower multithread performance than a midtier Threadripper 3970X released over 4 years ago. I'll still applaud you for your bench scores, and for extracting every last MHz of performance out of a complex chip to tune, but people like us are a dying breed and we make up 1% of PC builders, most people aren't doing deep dives like this, and I've reached a point with Intel where I agree that it's silly.
not a knock on what you wrote, but most people don't tune their PC. I used to repair and salvage PCs, and you'd be shocked at some of the stuff I've seen from people who built themselves a custom PC. (The worst case so far was a guy who bought a 2070 Super to pair with an old AM3+ system that had the 95W TDP FX-8120 on a Gigabyte 990FX-UD3 board. He put the card in the PCIe 2.0 x4 slot and wondered why the card wasn't better than his previous one. He thought the system was broken, so he sold me the remains for $30 USD without the 2070 Super, PSU, and drives.) I probably bought a couple dozen other custom-built PCs that had an issue, or that the owner just wanted to get rid of, and only two of those PCs had been tuned. Most of the time the issue was that the old Intel stock cooler had popped off.
just switched from intel to AMD with a 7800x3d and the chip is an absolute monster. such great efficiency and temps. look online for how to run PBO (Precision Boost Overdrive) to undervolt and it's even more efficient. super easy to do and very worth it. undervolting is the new gospel for me these days.
The 7800X3D is not a monster, it's just an 8 core chip, easy optimization for it. The 14900K is so powerful that it will take time until software can fully use it even outside gaming.
@@saricubra2867 i'm more than happy with this chip's performance and i'm seeing a noticeable performance improvement on my set up so i don't really care if you think the chip is powerful or not. 🤣 it's great for the price (especially the discount it got.) i really don't need a 14900k as i mainly only game. plus it's nearly $200 more and i don't need it to heat up my entire house especially during the summer. 🤷♀
Intel man 🤦🏻 I would classify myself as an enthusiast, but even I don’t know what the hell is going on. So there’s a standard, recommended, performance, and extreme profile? Or is the performance profile “recommended”? What will the boards ship with? Shouldn’t it be “standard” as it is, you know… standard? 🤷🏻♂️
No, no, it's all very simple! The "performance" profile is quite performant you see, and that's why it's the "standard". But we do "recommend" that you use the "extreme" profile for EXTREME gaming performance. You might think that it would make sense that the "recommended" profile is also "standard", but this way, we can tell you a) if you run into issues with "extreme", well, that's not the "standard" profile! You should use the weaker "performance" profile because that's the "standard"! Or b) if you are not happy with your processor's "performance", it's your fault because you did not set it to the "recommended" "extreme" profile that we recommend! Either way, you lose, Intel wins!
The funniest thing is that when they released the 14900K they showed all those benchmarks and scores, but now the recommended settings can't hit those claimed scores… That part doesn't sit right with me. Intel sold those processors with big claims and now they're changing the performance. All those customers who bought the "fastest cpu" got robbed..
@@Ascarion1234 but isn't it false advertising on Intel's part? I mean some people can't hit Intel's claimed scores without the CPU crashing… People bought 100% of the CPU performance and are now recommended to use essentially 75-85% of it… I truly think it's false advertising because Intel themselves used those same profiles/power settings that are now not recommended…
All I needed to see was the 14900K using around 100W or more extra power while the fps is almost the same. Even if people don't care about electricity pricing, they should care about the heat being dumped into the room where the PC is.
well, the 9000 series will be disappointing. a 2 year leap and AMD can't beat the 7800x3d, while in Arrow Lake even the Core 5 will beat the 7800x3d. don't know what AMD is even doing with their 9000 series
@@robertjif6337 even if they don't change much else about the CPUs, the fact that they jump from 10nm to 3nm gives the CPU 20% more performance per watt; Intel always shines when they change platforms and always sucked at creating better CPUs on the same platform, that's why they change it quite often. now considering it's also a 2 year jump, and the 13600KF was just 10% behind the 7800X3D, it's not unreasonable for the Core 5 to be better
@@KoItai1 Arrow Lake is going to be more of an efficiency increase than a performance increase. The X3D parts are basically a generation ahead of the non-X3D parts; remember, we have the 9000X3D parts coming out this fall, probably around when Arrow Lake launches. The 9950X will probably be well ahead of the 15900K for workloads and about on par for gaming, and the 9xx0X3D will be well ahead in gaming. The big focus of Zen 5 is higher sustained clocks in all-core loads, better chiplet communication, dramatic reductions in latency, etc. Word is that Zen 5 will be dramatically better in games that leverage more than 8 cores. There is also talk that the 9000X3D chips are packing some other upgrades: a new 3D V-Cache module, higher clocks and perhaps even overclocking. Most sources agree it's not looking good for Intel.
@@PineyJustice AMD themselves said the 9950X won't be better than the 7800X3D; that means they'll be even worse than Intel 14th gen, or really close, after a 2 year leap, which is really disappointing. Intel gets 20% better performance per watt and a 25-30% IPC increase; getting rid of multithreading will hurt their multicore performance but will significantly increase gaming performance. Until then one thing is sure: the non-X3D 9000 chips will be absolute trash, considering you can get 90-95% of that performance in the 7000 series at a much more discounted price
I have been an Intel user for over 25 years. Until the past 3 months. Currently using a 7800X3D and am blown away by how amazing this CPU is. Might have a fully converted user here
I would just get the 7950x3d and call it a day. 5800x3d gaming performance and more powerful everywhere else. Should test the 7950x3d vs the 14900k, not the 7800x3d vs the i9 14900k. Y'all already showed the 7950x3d is slightly faster in gaming than even the 7800x3d.
@RamonInNZ the 7800x3d was tested and lost to the 7950x3d on this channel lol. It's rated as no.1 because of its price/performance. I also said the 7950x3d is faster everywhere else, so I'm talking about people who want the best AMD CPU and who do a lot more than just gaming.
@@alsayedfakhri4597 the point is compare the best intel and amd gaming cpus. The 14700k will be slower than 14900k, so you can extrapolate with the results on the video.
@@thetheoryguy5544 based on your comments, you are extremely biased in favor of intel. These are companies, man. Not sports teams. You don't pick a side and root for it 🤦♂️
@@shoobadoo123 Exactly, if more people behaved like us the PC hardware market would be way different. Fanboy wars are what destroy markets and hardware evolution.
Hardware Unboxed doesn't even unbox hardware anymore. Kinda sad watching your idols get lazy. Fortunately we've got world class reviews, but come on guys, unbox something for your old fans!
Honestly, these "system power" numbers really don't paint the stark difference in power usage. My 7800X3D uses double digit power numbers, usually around 60-80W. It looks like "only 90W more" for cyberpunk, but the reality is that's over twice as much.
I have a similar bit of information to share. My laptop with a 13980HX has the ability to set PL1 and PL2 to static values. On multicore workloads, limited to 45W, it can clock down to 1.7 GHz. When left uncapped it will get about 4.5 GHz on the P-cores but is pegged at 95C and sounds absurdly loud, requiring noise-cancelling headphones. I've set various power limits to try to play games without it getting noisy, but it's apparent that the laptop's cooling solution can't handle anything above 28-35W without hitting 95C. It's a shame really. Power efficiency is a bigger issue than people think, since it translates to heat, battery life, and noise. It's also disproportionately more significant the smaller the device becomes. So while people may shrug at 200W or even 100W, those numbers have changed the mobile landscape more than they realize. The knock-on effect has made it so that mid-to-high-end mobile Intel CPUs are essentially hot, loud, and battery-killing. AMD and Apple are cornering the mobile and remote work market by offering the best power consumption per dollar per unit of performance. Our company, which manages IT for hundreds of companies, stopped selling Intel mobile devices about 3-4 years ago. People want quiet systems that have long battery life and can do everything without ruining the aforementioned.
The fact that Steve went so far as to delete his old video which he was getting a lot of views from because the video is now outdated is sheer proof of the dedication of this guy.
As always a great comparison video. Great work Steve. Keep up the good work
Came to comment section to type the same but searched for 'delete' and found yours! So true and I really respect it! Keep up indeed!
It's probably unlisted or hidden.
Tbh they won't lose anything noteworthy but it's still nice to see.
@@Deinorius well we did lose a weeks worth of work, so that in itself does sting a bit :D We also lost a high view video that was still generating over 5000 views per day for the channel and would have no doubt made us good money over the next 12 months. Anyway, we felt the right thing to do was delete that video and hopefully the traffic gets served to this new one, but it's YT, anything could happen there :D
@@HUMC5 It's deleted, we have it on our local server, but it's gone from YT and Floatplane.
Just got a 7800x3D. In my country, the 7800x3D I just got was $619, and a 14900K is $1049. Absolute insanity to buy a 14900K if you are a gamer.
Felicitations. Where did you upgrade from?
same bro, upgraded from a 3600, 225+ 1% lows were not achievable on that 3600 is all I can say. I’ve been a happy gamer (:
Great CPU, i just built a new rig too!
wow, what is with those prices? they're crazy. the 14900K is $687 and the 7800x3d $407 here, and that is with Eurotrash taxes. those prices are gnarly
It's insane to use Intel 14th Gen CPUs at all as a gamer. They have nothing that really compares pricewise.
Every time I watch a video like this, the difference in power consumption for gaming always leaves me shocked. Thanks for your effort Steve!
But it gets always dismissed as "makes no monetary difference, LOL"
@@walkir2662 people forget that they have to remove that heat from their space. I have 4 gaming PCs and 3 servers in my house. if they were all intel, I would be baking. The heat output of my 7800x3d is crazy low compared to its gaming performance.
Well, don't forget... those are uncapped FPS, and they are absolutely useless in single-player games in 95% of cases. With variable refresh rate you only need like 80 FPS capped for games to be enjoyable, or... you cap it @ 60 fps while also disabling variable refresh rate & setting your monitor down to 60 Hz. Then the 4080 Super turns out to be more power efficient than my older 3070 Ti, for example, while delivering noticeably better lows.
@@walkir2662 At least it could be partially justified if the higher consuming part offered the better performance, but more power to lose is just sad.
@@Nebujin383 Ah dude no. The higher the FPS the better. This applies to all games. Gaming at 60 FPS is just not acceptable anymore.
I upgraded from a 10850k to a 7800X3D. My room is a lot cooler now after 4 hour gaming sessions and that's what has made me the happiest of all.
So real
Get a cheap box fan to exhaust the heat out the window. I have 5 modern AM5/AM4 builds in the office and even with only 2 or 3 of them gaming it heats up quick. Great for the winter.
If you have horizontal style sliding windows you can likely stack 2 cheap $20 box fans in the window. Walmart has these cheap ones and the handle has a groove that notches with the window frame to hold it and then slide the window as closed as possible. Set them on low and crack your door and comfy gaming.
@@45eno You would be in an oven if there were 2 or 3 Intel Machines gaming.
@@45eno Lol, having to do that is ridiculous.
@@45eno why the fck do you have 5 builds in the same room 💀
50% power for 2% performance? i see no reason TO do that, on the contrary
One might want to melt the ice caps a little faster. Intel offers that.
@@doublecrossedswine112Ah yes, let's bring about our demise earlier to enjoy their products for as short a time as possible 😂
The argument to run Extreme Profile would be: If you have a 14900k, you probably want every last drop of power, regardless of energy cost.
@@logantca Those ice caps MUST be defeated!
I’m confused because my ASUS board keeps PL1 & PL2 at 253w for performance and extreme profile.
when you get older, you will prefer efficiency on everything.
Efficiency? Like saving on the electricity bill? Is it that bad going with the 14900K? I wasn't sure how much money is lost from CPU to CPU =O
Yep, once you pay your own power bills, power efficiency is king. :D
Lol older people usually use capitals... kid
@@SamBlips assuming you're gaming for about 4 hours every day, 365 days a year, and your 14900K guzzles up on average 150W more than the 7800X3D, you'll be spending approximately 30 dollars more per year (going by the US average of 16.68 cents per kWh). Obviously there's a lot of variables (the games you play, for how long you play them, where you live) so adjust accordingly
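The arithmetic above can be checked in a few lines. All inputs are the commenter's assumptions (150 W delta, 4 h/day, 16.68 ¢/kWh); the straight math actually lands a bit above the quoted ~$30.

```python
# Rough annual running-cost difference from the figures in the comment above.
# Adjust these assumptions for your own usage and local electricity rate.
extra_watts = 150       # extra draw of the 14900K vs the 7800X3D (assumed)
hours_per_day = 4       # gaming hours per day (assumed)
rate_per_kwh = 0.1668   # US average, USD per kWh

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost = extra_kwh_per_year * rate_per_kwh
print(round(extra_kwh_per_year), round(extra_cost, 2))  # 219 kWh, $36.53
```

At double the gaming hours or in a high-rate region (e.g. much of Europe at €0.30+/kWh), the same formula pushes the difference well past $100 a year.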
@@johngrimm1103 Not just that, for me it's mostly about longevity & stability, etc. With less heat output for example, cooling fans won't be pushing as hard, & there might be less overall chance for components to wear out within the system's life-time. Plus slower dust build-up & cleaning required, etc...
Just bought 7800x3d. Such a beast!
My 7950x3d is Ordered :) can't wait
👍
@@Magusslettewestberg the 7800x3d will be better at gaming though, unless you disable some cores on the 7950x3d
So if I want to upgrade it’s about the same price to get a 7800 X3D and AM5 mobo as it is to get JUST the 14900k. Even though I’m already on LGA 1700… the incentive to switch to AM5 is massive.
in Germany a 7800x3d + good B650 is 500 and the 14900K alone is 600, even worse here
And considering AM5 looks like it'll support up to Zen 6 too so got plenty upgrade options down the line.
7800x3d + B650M-H/M.2 + 32GB DDR5 is around $500. For the same price you can buy a 13700K kit, or if you want a better mobo for the 7800x3d, like a B650 Aorus Elite AX, the price of the kit gets close to a 14700KF kit. But I would also consider the 7500F, a CPU which can be paired with a 4090 at 4K or a 4070 Super at 1440p; a 7500F kit on the B650M-H/M.2+ costs around $320 (and when hexacore CPUs are no longer enough for gaming, you can swap it for a 7700/9700/7800x3d).
Hold out for Zen5 tbh
@@malcaniscsm5184 Zen 5 X3D to be specific, regular Zen 5 will end up being like regular Zen 4 v Zen 3 X3D, similar-ish performance.
At this point, I wouldn't consider a 14900k for anything. 7950x will be the same at productivity workloads and at higher resolutions get the same FPS in games while consuming less power and being more stable. 7800x3d will beat it in pure gaming.
The 9950X is coming out next month which should destroy the 14900K by a wide margin in productivity benchmarks.
@@PowellCat745 And Arrowlake is coming out to destroy them all so who cares?
@@PowellCat745 15900K coming soon will destroy 9950X😂
and surely intels next generation wont have any issues we are still seeing with this generation. surely
@@thetheoryguy5544 and how do you know? lol
11:52 I think it's worth mentioning that in 2 of these cases the X3D could do more fps, so the GPU pulled more.
the 9800X3D is gonna be a monster
Ryzen executives are saying the V-cache will be improved this coming gen. Excited to see what that means.
I’m waiting to build a pc with the 9800x3d or 9950x3d and a 5090. Depending how much stronger they are from current gen. If only a 5% difference idk might just build out a current gen once the parts dip in costs
@@BROCK_WARHAMMERRumor is the 9950X3D will have double the V-cache as the 7950X3D. It will be a beast!!!
but remember the more amd btfo intel the more expensive they will charge. 9800x3d might be $1000
It's going to be overkill. 😉
I'd like to see a 14900k restricted to the power draw of a 7800x3d.
Calculating Performance/W is one thing, but actually seeing it would be really interesting.
I just got a 7800X3D from Microcenter for $204 😮 with their grand opening sale for the Charlotte NC store when bundled with mobo and ram. And holy shiz its fast my I9-10850k is now a living room PC with a new Arc770!
That's a fckn steal! Also: Welcome to meaningful socket compatibility! The only socket I didn't upgrade on was AM3 because Bulldozer and Excavator sucked (and my Phenom X6 1100T Black Edition just kept trucking), but my AM4 board saw two CPUs (2700X and 5950X) and I suspect my AM5 board will most likely see three (currently on 7950X3D).
$204??? WOW
Mobo model ? And ram Gb?
@@maanmahmoud4537 it's not exclusive to micro center grand openings, it's at every micro center and it's a gimmick. can look at it like you pay whatever their sale price is for the chip by itself (like $350) and then you get a shitty overstocked mobo and 32gb ddr5 d-die dogshit sticks. what they do is on the receipt write that the 7800X3D is $200 and the shitty mobo and sticks are $350 lmao
@@rickyalert but still a steal
This Intel mess is not inspiring confidence.
Yeah jumped to AMD AM4 platform with Ryzen 1700X then 3700X and now 5700X will look at 7600/7700 in the new AM5 platform. Have built some Intel base machines for others and never been super happy with them.
IT IS sad for Intel and worrying for the industry as whole.
I really don't like de facto monopolies in this realm.
This could all be prevented if intel hadn't even bothered releasing a 14th gen i9. Let's be real here, it is the same thing as 13900K. But no, Intel had to make a fake generation lol
I'd like to see userbenchmark's take on all of this. "Intel continues to improve by providing irresponsible customers who've run their CPUs at ridiculous power levels a more stable solution"
It won't be repeated. Meteor Lake wasn't the best architecture ever. Arrow Lake is quite a bit different, and Lunar Lake will potentially end AMD.
Can't wait for the Ryzen 7 9800X3D and Ryzen 5 9600 reviews when they're released :)
It sounds like the 9950x3D might be interesting too, for anyone with workloads with large working sets.
ryzen 9600 with 5.7ghz OC looks insane
@@RobBCactive If AMD is able to resolve their problems with parking and adressing the cores the x950x3D could be interessting, but right now the x950x non 3D is overall the better gaming cpu.
@@jojobetzler3732 It's rumored the 9950x3d will be both chiplets with v-cache instead of just one on the 7950x3d. Being all 16 cores with v-cache should eliminate scheduling issues.
@@jojobetzler3732 there are no issues with latest chipset drivers for like I dunno at least a year already. You've got outdated info.
That 7800X3D is a marvel of performance and energy efficiency, it performs much more than the 14900k in games while consuming only 1/3 of the energy
You realize you're comparing an 8 core 16 thread chip to a 24 core 32 thread one, right? It's way less CPU, it's gonna be way more efficient
@@Angel7black intel with 8p cores and 16e cores. So the 1/3 power playing and getting worse results is massive. It's still an equal comparison. Cause gaming is going to use 8p cores vs amd 8 cores.
@@Angel7black We're discussing games, thread count doesn't matter. But, even if you disable E-cores and turn the 14900KS into an 8c/16t chip, it's still significantly less efficient than a 7800X3D.
But if you want to have an irrelevant discussion about thread counts and efficiency, I'm down. OC'd, a 14900KS draws more power than a last gen 64c/128t Threadripper 5995WX, but achieves only ~60% of the multithread performance. An OC'd 14900KS has similar power draw to a stock 96c/192t Threadripper 7995WX, but the 14900KS only achieves ~20% of the multithread performance.
"But those chips cost 10x what a 14900KS does." Yup, but which Intel flagship would you prefer I compare with them? The 14900KS doesn't quite match a mid-tier Threadripper 3970X 32c/64t in multithread performance, and that chip consumes less power and is over 4 years old now.
Concerning Intel, you should keep the topic on gaming, as games and single thread performance are the only areas where Intel can still compete.
@@Angel7black Even with the 7950X3D its still using way less power, performs the same or occasionally ever so slightly quicker than the 7800X3D depending on the game, and its on par or faster in the most heavily multitasking programs compared to the 14900k with an actual full fat 16 cores/32 threads and not this efficiency crap, because it's certainly not efficient. All while sucking down significantly less power. Intel is doing exactly what it did in the P4 days, just up the clocks, give it as much power as it wants and to hell with the power efficiency and heat.
@@Angel7black isn't the 7800 x3d also like 100$ cheaper or something ? what drugs are you on ?
when you see the power consumption it's wild, considering how often the X3D is running the GPU harder.
Intel dropping multithreading from the P-cores hasn't been explained as being done for power consumption reasons either. hopefully the E-cores have improved their efficiency a lot.
Intel vs Ryzen at gaming, a short history:
Ryzen 1000 and 2000 series - Intel clearly better, no sweat. AMD sells very cheaply to compensate
Ryzen 3000 series - AMD making gains but Intel still comfortable, chugging away at 10nm+++
Ryzen 5000 series - AMD reaches gaming parity, Intel pushes a lot more power through their chips to keep up or slightly edge ahead
Ryzen 5000 series 3D V-Cache and 7000 series - AMD now significantly ahead. Intel has to push obscene amounts of power through their chips to keep up
Ryzen 7000 series 3D V-Cache - AMD so far ahead that Intel decide to risk long term silicon degradation, pushing hundreds more Watts through their chips that cost 80% more money just to lose by a few % in benchmarks
Ryzen 9000 series - I hope Intel has been working on something special....
I wouldn't say Intel pushed more watts through the CPUs, that's the MB manufacturers doing that.
However, Intel surely had no problem with manufacturers shipping motherboards with open power limits by default, since most hardware YouTubers test "as delivered". If anything, Intel welcomed the manufacturers ignoring their power limits.
@@johnscaramis2515 No its both. Mobos went in to the overkill territory, but even with Intel's more efficient profile they're still consuming 100W more while still losing.
@@blkspade23 I would assume that Intel spec'ed their CPUs to maximize performance by running right at their voltage limit, but not beyond. So even with the high power draw, they would not have a technical issue.
And I'm fully in line with you: rising power draw to catch up with AMD without enforcing guard rails (i.e. enforcing their power limits as default) was a big mistake.
And I don't think Intel will be able to close up with 15th gen. Although with their Foveros technology, they could introduce a cache tile
Ryzen 3000 series: Intel's competing flagship was the 9900K, which is on 14nm. Only with 12th gen did Intel get to 10nm. AMD was eating away at Intel's HEDT market because of the R9 3900X and 3950X; Intel's non-HEDT platform maxed out at 8 cores at the time.
Ryzen 9000 series - Intel fabs throw in the towel and Intel switches to the same TSMC production as AMD. From this point on there won't be any big differences between the two, no matter how they try.
I just did a new build(due to my power supply killing my mb & gpu). My last 3 builds(over 10 years) have all been Intel because I had stability issues with AMD. Now, for the very reason I left AMD, I've left Intel and gone to a 7800x3d build. I had also been doing water cooling because of the heat generated. Now I've gone back to air cooling with the lower power demand and temps that the AMD has. I was able to save enough on a MB/CPU combo, and a $40 Assassin 120SE air cooler- that I was able to afford a 4080 super. I have run 3dMark benchmarks, stress test, and played a few games(3440x1440 144mhz) and my max cpu temp has been 76c with all stock settings...I'm very happy with the cost to performance.
oooof, what type of power supply did you dirty like that?
That's pretty warm. I'm using a tuned 14900K at 5.8GHz and the highest temps I see are 68C. It's how you tune it and the hardware you use; Intel doesn't run hot if you don't cheap out on the right parts. I'm not even using exotic cooling.
@@tommypearson9260 mobo VRMs as well
🙌
Just skip the GPU. You need an insanely stable RAM to make AMD stable.
got my 7800x3d a week ago paired with a 4080 super and it's a killer!
Omg same lol
You got OF?
I thought that this would 100 % be a bot comment with this thirst bait profile picture. Quite surprised.
@@Simon_Denmark LOL
noooooooooooo you cant have fun using amd cpu you must use intel cpu
Glad my 13900K has been rock solid at full power since the build we did :)
Same here on my 14900K, which I undervolted immediately when I saw the motherboard's auto vcore settings, which were absurd out of the box. Have mine running an all-core 5.6GHz P-core and 4.4GHz E-core with an adaptive plus negative offset, hitting 41K in Cinebench R23 with mid-80s C temps at max, though I have delidded! Use it for encoding and rendering, where it really comes into its own, and for gaming it does the job as well as anything...
I have read that some users run Cinebench R23 several times and then their 14900k becomes unstable (on the old unlimited power profiles). Possibly silicon degradation occurring significantly faster than expected.
@@abaj006 Asus boards often overclock the processor out of the box, which isn't a given to be stable. And if there is just a defect, from what I've heard it gets returned under warranty.
Full power sounds like a lot
Full power is questionable because 1) what counts as full power on your system vs others, and 2) how often do you really load up your CPU so that it pulls the max power allowed?
The 9800X3D will repeat history again. At worst, the same performance as the i9 at half the price with far better efficiency, and at best 30% faster.
They're actually designing it a bit differently instead of just dumping it full of V-Cache lol. Hopefully, we can pick out the higher chips like 9900x3Ds if they have it on all 8 cores (or more lol.)
The 9800x3D will probably be the next powerhouse for money efficiency but perhaps the flagships might outclass it and have good productivity performance without compromise lol
indeed, if we can get the IPC of the 7700X or better and Tier1 gaming performance with amazing efficiency, the 9800X3D looks like a legend in the making.
Intel will be on a superior node next generation. Do not expect a repeat of this gen.
@@MrHav1k you mean the next i9 wont draw 300 watts? xD
@@maze4184 they're gonna switch to tiles next year too. But rumour is they're gonna use a stacked tile design, something like the 3D V-Cache tile but for all the tiles instead. So at worst I'm guessing they'd be on par, with the same power draw too.
I Undervolted my 7800X3D and it uses at max now 78W, much less while actually gaming.
Imo Intel is not saying out loud that you need to lower the PL1 and PL2 because they would lose even more to amd in the gaming sector.
a 14700KF use only 80w max when gaming, what are you talking about ?
@@AquilaeYT you need to learn to read.
@@Duuh_Eazy I don't see any relation with my comment and yours.
Re-read my comment.
@@AquilaeYT He's clearly saying his 7800X3D is 78W at full load (not gaming), the actual gaming consumption isn't using that much.
@@blkspade23 Where I talk about AMD ?
Like other person, re-read.
Truly disruptive performance indeed. Intel can't handle the 7800X3D even with the 7200 CL34 memory. Thank you so much Steve. Thumbs up!
Now, Arrow Lake vs 9800X3D, this will be good.
Now I hear that Intel says the CPUs have a microcode error, fixable by a BIOS update. The cynic in me says the code change will simply throttle the CPU.
went from 11700k to 7800x3d. got a deal at microcenter for $480 that came with 32gb ddr5 ram, b650 motherboard, and 7800x3d :)
The funniest thing to me is that they're equal at every resolution people should be playing at these days, especially with a 4090. AMD has so many longevity and efficiency advantages over Intel. I still love my 5950x!
People like different things. Personally I'm running an XTX at 1440p. I have a 7950X3D and I see clear performance differences when I lock a game to the vanilla CCD instead of the V-Cache CCD.
I9 14900K= 250W
R7 7800X3D= 75W
Also, the 14900K is 10% slower but costs 220€ more.
My 7800X3D comes in today! Moving on from a 12400F. Never in my life did I think I'd be switching to AMD; my first build was 12 years ago and I've always used Intel. The 7800X3D is just too damn good of a chip to pass on.
I switched to AMD when the 5600X came out. After seeing how good the 3600 held up and the prices for their gaming performance AND energy efficiency, it became really obvious that AMD was back to winning (at least for gaming).
I switched to AMD when K6-2 came out.
👍
Switched to AMD for the first time a few years back with the 5900X and I'm still happy with it and impressed with how much AMD has been improving in the CPU market. Will upgrade again this year and currently leaning towards AMD again since I've enjoyed my 5900X and Intel's mess with instability didn't inspire confidence. Going to be watching closely to see who releases the better products this year, though I will say I'm quite interested in trying out an X3D CPU.
So glad I switched to AMD 7800X3D.
Forget Arrow Lake. Just wait for the next X3D monster!
Rumors are saying it'll be out by the end of the year. 💪
No.
@@benjaminoechsli1941 Nice. Still have to wait for 5090 though.
You guys should totally do a video showing the advantages of undervolts on the 7800x3d!
Amazing work Steve, just like another Steve. And I've come to watch your video fast!
I greatly appreciate the structural framework of your videos and the time and effort that goes into them. But I must confess, in most cases I simply skip to the final thoughts.
Back to you Steve! Excellent comparing video 👍🧐
When building earlier this year I went the 7800X3D route - it seemed more performant, used way less power, was simple to cool and was less than half the price of the Intel chip where I live. It was a no-brainer! Watching this makes me happy I went the way I did 😊
Bought a 7800X3D for 330 USD brand new 🔥
The interesting part is that in 4K, the performance and extreme profiles give exactly the same result, but with 80W more power usage on the extreme profile.
Thank you HWUB for using the 7800X3D instead of the 7950X3D, giving an even better showing for AMD, since the 7800X3D is in a totally different price range than the 14900K, for gaming in particular. Such value for gaming hadn't been seen until AMD introduced their X3D lineup. Even without these updated results for Intel's performance profiles, the 7800X3D is an absolute gaming beast.
I have a 7800X3D + 4070 Super (own build) combo, and it's an absolute gaming beast. I wouldn't touch Intel ATM.
@@Anonomobot I use 7800X3D with 7900XTX, even though my GPU uses a bit more power than I would like, the price/performance is still good and I can play anything.
@@MarioCRO Is everything good with the 7900XTX? Any problems with drivers or games?
@@q1337 Everything works great, the card is top-notch. I have the Sapphire Pulse.
@@MarioCRO Now that is a next level beast
Hardware Unboxed always on point. Excellent work!
The r7 7800x3d is absolutely the best cpu for gaming that humanity ever made
This needs to be done. THANK YOU!
"no reason not use the extreme profile" - you mean no reason TO use the extreme profile?
Why burn an extra 50-100W for no discernible difference?
Why buy a K when the non-k does the same thing?
@@BlackJesus8463 why buy Intel for gaming? Amd is clearly superior for less money
What do you mean no discernible reason? He said the performance dipped up to 15%
Thank you for testing 4k! I know it won't show a huge difference but it's a great real world test for those of us who do game in 4k.
7800x3d is 200$ cheaper, can run well on cheap boards, and requires cheaper cooling solution.
Those are honestly its biggest advantages; even if it were 5% slower than the 14900K it'd still be worth it.
For example, if you want a decent 14900K build, you need:
a ~$250 motherboard at least, with good VRMs and good power delivery;
6400-7200MHz or faster RAM, which is at least $130+ for 32GB, otherwise you're just paying for a better memory controller without utilizing it;
and at the very least a $100+ AIO. You could also buy a large dual-tower cooler, but it might be a chore during the summer.
You might also need a larger PSU, since an 800W PSU with a 14900K and 4090 might not cut it,
whereas you could get away with a 4090 on an 800W PSU when your CPU only draws 80W at most.
Also, the 14900K dumps a lot of heat, so you have to make sure you have a larger case with good airflow. That means ITX cases pretty much go out the window: even if you could cool a 14900K well enough with a cooler small enough to fit inside an ITX case, the case temps would suck and you'd need to lower the power profile so much you might as well have bought a 13700K.
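To make the totals concrete, here's a quick sketch of the build-cost gap described above. Every price is a ballpark assumption drawn from figures mentioned in this thread (e.g. the $330 7800X3D, the ~$200+ CPU price gap), not current market quotes:

```python
# Hedged sketch: rough platform totals. All prices are ballpark USD
# assumptions taken from this comment thread, not current quotes.

intel_build = {
    "14900K": 550,           # roughly $200+ over the 7800X3D, per the thread
    "Z790 board": 250,       # good VRMs / power delivery
    "7200 MT/s 32GB": 130,
    "360mm AIO": 100,
}

amd_build = {
    "7800X3D": 350,
    "B650 board": 150,
    "6000 MT/s 32GB": 100,
    "tower air cooler": 40,
}

print("14900K platform:", sum(intel_build.values()))   # 1030
print("7800X3D platform:", sum(amd_build.values()))    # 640
```

Adjust any line for your own market; the point is that the platform gap is far larger than the CPU price gap alone.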
Wow, the first time I see someone that actually knows what he's talking about. You are right on every point 👑
Slower in what? Not in games as this video just showed it was the fastest.
@@Traumatree The way he worded it, I believe he's saying that even if it were slower, it would still be worth it. I don't think he believes it is slower; he's just using that to point out how much of a value it is, since it's also faster.
14900k can be cooled with air coolers for gaming workloads.
1. The 7800X3D NEEDS a good cooling solution
2. You don't need a $250 motherboard for a 14900K
3. You don't need 7200MHz RAM
4. You need a huge PSU anyway because of the GPU
I have the 7800x3D and yes, I9 should not exist when we talk about gaming.
I've never felt good about CPUs using 250+ watts to run; it's one reason why I chose the Ryzen 7900, because it's great for my needs and sips power. It's nice to have a small air cooler and not worry about it.
My 65w cpu does fine but its old and I play 1440p. I couldn't imagine stacking the heat of a 14900 with a 4090 in an enclosed case.
AMD coming in STRONG at nearly half the cost and a tier down (Ryzen 7 vs Intel i9 series). How can any "influencer" recommend Intel with such a difference?
Please consider adding X4 foundations to your tests, it would be super interesting to see how cpus manage heavy games like that
Intel is to blame here. In order to win the performance crown it let the board partners run wild with the power limits. It's Intel's fault so it should be providing replacement CPUs for those customers that suffer from CPU degradation.
Appreciate the excellent game selection of all modern and new games!!
But where is the 5800x3d in all of this?
Even more efficient and nipping at the heels of all this in gaming. Productivity is a somewhat different matter, but it's a gaming CPU, so your choice there is clear. Prices are going up on the older CPU, however, so the 7800X3D just looks better and better--especially as DDR4 becomes more scarce as time goes on.
yeah i can't believe steve left out the 5800x3d when it's clearly so relevant to this (and every!) discussion!!
It's 7600 performance, not even close to the 7800X3D.
@@BlackJesus8463 Depends. The 7600 has better single-threaded performance, but if you leverage the cache (which means games), then the 5800 runs off with things. The 7600K is a helluva budget chip, though. If it hadn't been for DDR5 prices, this machine I'm typing on now probably would have been a 7600.
@@davefinfrock3324 r5 7600, not the old 7600k
in gaming, on average, 7600 = 5800x3d, and the latter is less efficient, drawing about 100w
because of platform costs, a 7600 with mobo and ram costs as much as a 5800x3d with mobo and ram, so the 7600 is overall the better option
Steve - thanks for this much appreciated comparison. I noticed that at 4K, performance across all 3 (7800X3d, 14900k performance & extreme profiles) is more or less the same in nearly all cases. Unfortunately you only show the power usage stats at 1080p but I'd be very interested to see this at 4K? If the 7800x3d maintains the same material power efficiency advantage over the 14900K at 4K resolution as well then that would be useful to know and at least for me fully confirm your conclusion.
People who will be upgrading to the 9800x 3D 🔥
⬇️
Very good video , thanks mate
TLDW - just buy 7800X3D for top performance.
*Almost top performance; an i9 or Ryzen 7700X will be faster with RAM speeds above 7800MHz.
if you tune the memory the lead will become even bigger but at what cost.
@@insector2093 That's for hardcore tuners though; for plug and play the 7800X3D is champ.
@@insector2093 On an i9 14900K/KS (binned memory controller), the only upside is DDR5 8000MHz, and all that gives you is better 0.1% lows in some games. If you don't tune your RAM and bin your CPU, there is legit no reason to go Intel...
@@insector2093 Plus, Ryzen is limited by its Infinity Fabric, so running 7600/7800MHz reaches the same performance as running 6200MHz with tight timings.
@@insector2093 Just run LIQUID NITROGEN on my i9, it leaves the X3D in the dust. Never been happier running mine at 8GHz 24/7.
I have a 7950x and run it on the 105 w profile, amazing... thermals under control, great speed and silence!
Holy shit, up to 200W more while gaming... Over time that adds up. At Central European prices of 0.31€ to 0.50€ per kWh, an extra 200W for three to four hours on average every day can easily cost an additional 80-150€ per year, just for having an Intel CPU.
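A quick sketch of how this kind of estimate works; the wattage gap, daily hours, and electricity prices here are illustrative assumptions, so plug in your own numbers:

```python
# Hedged sketch: annual electricity-cost difference for a CPU that draws
# extra watts while gaming. All inputs are illustrative assumptions.

def annual_extra_cost(extra_watts, hours_per_day, price_per_kwh):
    """Return the extra yearly cost in the currency of price_per_kwh."""
    extra_kwh = extra_watts / 1000 * hours_per_day * 365
    return extra_kwh * price_per_kwh

# 200 W extra, 3.5 h/day, 0.31-0.50 EUR/kWh (Central European range)
low = annual_extra_cost(200, 3.5, 0.31)    # ~79 EUR
high = annual_extra_cost(200, 3.5, 0.50)   # ~128 EUR
print(f"{low:.0f}-{high:.0f} EUR per year")
```

The same function reproduces the ~$30/year figure mentioned elsewhere in the thread for a 150W gap at US prices (`annual_extra_cost(150, 4, 0.1668)` ≈ $37).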
thanks for retesting!
Thank you for including the 4k results!
They always include 4k.
@@BlackJesus8463 No. In the past, they didn't always include the 4k data when testing CPUs.
But... you already knew the results: identical performance with new-ish titles that saturate the 4090 completely, and the spread opens up the older the game is.
@@andersjjensen I want to thank them in general because I hope they will also do this with newer CPUs once they are being released and tested here.
thank you for the testing across multiple resolutions
Thank you for still including the 4K test results in your benchmarking. It will be interesting to see with Lunar Lake, Zen 5 and then Zen 5 3D to see if CPUs make more of a difference at 4K in future, along with Nvidia's upcoming RTX 5090 GPU.
Intel Fan Boys will defend Intel's ridiculously absurd CPU power usage to the death.
The most impressive thing about the 7800x3D isn't even the performance. It's the efficiency. You don't even need a strong cooler! I still find it hard to believe that in the most CPU demanding games, it's only pulling 30-40 watts. All while outperforming Intels pulling 200+ watts. I thought Alder Lake 12th gen Intels were decent at the time, but once DDR5 prices settled down, and there was a sale on the 7800x3D and b650 boards, it was too good of a deal to ignore.
I have my 13900K's core current capped right now, originally by limiting ICCMax to 328A-400A; anything under 400A led to a current throttle. When current throttled, my 13900K wasn't using more than 200W. It wanted to use 253W; the problem was it sometimes wanted more than that. For me at 253W, a 400A current limit is perfect and led to 0% throttling (current EDP throttle). The problem comes when the CPU tries to use more than 253W, which in turn means more than 400A, and Intel specifically said core current is "NOT TO EXCEED 400A". They seem to be linked: if I lower my core current limit from 400A to, say, 360A, my max wattage under full load falls from 253W to 200W, telling me I have an EDP current throttle. I'm running two current protections right now through my Gigabyte motherboard with manual 253W/253W/400A settings, and I don't even get above 160W, which I'm fine with for summer, but still, WTF? I have a Corsair 7000D with a 420mm AIO; I should be able to run the extreme profile. Luckily at 160W none of my cores go above 75C under heavy load, but at least it's stable.
The problem with these 13900K CPUs is actually trying to use all 24 cores that you paid for: when the chip really needs all that power to push every core, it goes over 253W/400A (it should never have been allowed to exceed this and damage the silicon). In my case it was fine while gaming, since nowhere near all 24 cores are used. When I did compression/decompression it hit 100%, I crashed, and now I can't even run 400A completely stable even with thermal headroom for it. I can't even reach 100C on a core before crashing; crashes occur between 85C and 100C in all-core workloads, before a thermal throttle would even kick in. That sucks.
I want to point out that Gigabyte has an option to show "biscuits", which I've heard from numerous users is basically a silicon quality score. I started out at 91 biscuits; I'd never heard of biscuits and thought, what the heck is that? Well, I'm currently at 89 biscuits after the crashes over the past year, before finding out it was the 13900K the whole time. I thought it was my DDR5-6400 RAM, since the system wouldn't boot with XMP after a crash without wiping my saved profile on my Z790 Aorus Elite AX; it would only boot after resetting everything back to stock.
Given that I have to throttle my own CPU at this point, and Gigabyte still doesn't have a non-beta BIOS to fix this (which I'm hoping drops, so I can run the extreme profile that my thermal headroom should allow), I'm about tired of this mess, and of the fact that Intel will apparently just point the finger at me when this whole thing is BS. I didn't overclock; I just wanted the performance everyone said my 13900K would have, and now I have a lesser 13900K. I just want 253W/253W/400A without crashing under heavy loads; that's it, nothing more. To note, this is my first PC built in over a decade, and I've always liked Intel. I love every other part of this PC, the 4090 is absolutely amazing, everything else is good. Looks like I'll be going with AMD CPUs from here on out, though; I regret purchasing the 13900K.
Thank you for another great set of test result that show that:
• If you want more frames/second (FPS) the 7800X3D is usually a better choice than the 14900K;
• The Extreme profile will offer more FPS and use more power than the Performance profile;
• It is implied but not shown that the Extreme profile will cause its CPU to age faster.
What has not been discussed is the effect of the cooling solution, thermal velocity boost, and short-duration boost on FPS and aging. One thing to identify for a given cooling solution is the acceptable steady-state temperature for (a) wall power, (b) the CPU, and (c) the GPU at stable CPU and GPU frequencies. It is assumed that boost takes advantage of headroom below steady-state loading to raise frequency. To the extent that boost exceeds acceptable steady-state temperatures, it will accelerate aging and may impact reliability.
AMD is so efficient that Intel needs to disguise itself as a room heater to try to match it, and still can't.
You know that AMD runs hotter than Intel in gaming, right? Intel runs hotter in Cinebench simply because it has 3x the cores, so of course the power consumption will be higher too. But gaming is a different story: X3Ds run extremely hot, while even an i9 stays at 40-50 degrees in gaming.
@@KoItai1 Intel simply consumes far more energy and outputs far more heat than AMD.
AMD temps are higher due to slower heat dispersion, but it still outputs far less total heat than Intel.
The room heater is absolutely useful. I don't mind that the exhaust air from the water cooler warms my feet in the winter, but!! it's summer now.
@@KoItai1 "X 3Ds run extremely hot" no they don't lol, 80 C isn't hot. Besides if they were hot, they'd be thermal throttling or downclocking but they aren't.
The 14900K may show a low average temperature, but that's only because while gaming, 70-75% of the cores don't do anything. Look at the temperatures of the P-cores; I bet they'll be hotter than the X3D's.
@@KoItai1 Also, who cares if it runs hotter as long as it performs better? You're still saving money on the power supply, power bills, motherboard power stages and VRMs, etc.
I wish you guys would test CPU bound games more often like rimworld, dwarf fortress, factorio, hearts of iron iv, civilization vi, etc when benchmarking CPUs (on mid-late game when the power needed is much higher)
Considering that the Performance profile can save 100-150W, while losing around 5% on average, this makes it the best choice for most people.
Sadly for Intel it just makes Zen 4 look more attractive.
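One way to frame the trade-off described above is performance per watt. All FPS and wattage figures in this sketch are illustrative assumptions (a ~5% FPS loss for roughly half the power), not measured data from the video:

```python
# Hedged sketch: why losing ~5% FPS to save 100-150 W is usually a win.
# The FPS and wattage numbers below are illustrative assumptions.

def perf_per_watt(fps, watts):
    """Frames per second delivered per watt of CPU package power."""
    return fps / watts

extreme = perf_per_watt(200, 250)       # hypothetical Extreme profile
performance = perf_per_watt(190, 125)   # ~5% slower at ~125 W saved

print(f"Extreme: {extreme:.2f} FPS/W, Performance: {performance:.2f} FPS/W")
```

By this metric the Performance profile nearly doubles efficiency for a single-digit FPS cost, which is the argument the comment is making.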
I'm really happy that my 5600x is doing everything I need it to do, and went for it back when I did.
It's more than enough for my high refresh rate 1080p adventures combined with a RTX 3070. Enough fps with very low power consumption.
But that Intel high-end mess is really a..well..mess.
I hope Intel comes with something new and impressive soon in the CPU market to keep the competition up. Good competition is healthy for us customers and so that AMD keeps inventing and improving as well.
Intel is a mess nowadays
Now Steve, you should do the productivity tests with the 14900K vs the 7950X/X3D. If gaming FPS takes a hit from the Performance profile, productivity will be hit too!!
People deserve to see these updated benchmarks in order to choose wisely. The other thing is, how can Intel recommend the Extreme profile when PL1=PL2=253W degrades the silicon?
I am very content with my 7800X3D, fast and not running hot.
Great Video as always!
AMD guy here, and last weekend I built my friend's kid a gaming desktop. He wanted to go Intel, and I wasn't going to try to change his mind toward a 7800X3D instead of the 14700K. Well, I built the system; everything went smooth and I really had no issues.
But on first boot, the BIOS (MSI Z790 board) asked me what power profile I wanted to use, based on my cooler. I installed a Corsair iCUE H150i ELITE (360 AIO), so I chose the water-cooling power profile, thinking a 14700K won't draw anything near what a 14900K would, so this should be fine.
Well, I got Windows installed and everything set up, ran Cinebench, and sure enough it was drawing 300W and would peak at 320W. I was blown away; it would almost instantly hit 100C and start to throttle. So I went back into the BIOS and limited it down to the Tower Cooler profile, which seemed to cap it at 280W.
I did not expect the 14700K to draw that much. The temps now peak around 90-92C, but man... that was crazy! Other Intel guys here, is that what you see from your 14700K with no limits?
That's completely normal behavior for these Intel CPUs. Productivity loads are what they're best at; keep in mind a 14700K will score 35000 in Cinebench, while a 7800X3D will score only 18000. Of course this is completely useless in gaming, but it shows these Intel CPUs do fill a gap that AMD isn't good at (yet).
Just wondering if you are using the 'optimised' -30 offset for the Ryzen? Because you lose no performance; if anything you get better multicore and better sustained single-core performance, along with a decent reduction in power consumption on top of what it already is.
nerd
@@BlackJesus8463 This is a nerd channel....
You can use that with pretty much any Ryzen since the 5xxx series, right? I'm using PBO -28 on all cores on a 5600, boosting up to 4700MHz on a single core; stock was 4450MHz. And if I run single-core workloads like Cinebench, boosts on all cores are 200MHz higher than stock settings. You can basically tune it for better performance, or the same performance with slightly higher boosts and better power consumption overall, any way you like, for free, and it's as simple as it could be, really.
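For the curious, the reason a Curve Optimizer undervolt saves power at unchanged clocks is that CMOS dynamic power scales roughly with V² x f. A minimal sketch, assuming ~1.20V stock and that a -30 count offset lands around -120mV (counts are commonly cited as roughly 3-5mV each; all figures here are illustrative, not measured):

```python
# Hedged sketch: CMOS dynamic power scales roughly with V^2 * f.
# The voltages and the mV-per-Curve-Optimizer-count figure are assumptions.

def dynamic_power(volts, freq_ghz, k=100.0):
    """Relative dynamic power; k is an arbitrary scaling constant."""
    return k * volts ** 2 * freq_ghz

stock = dynamic_power(1.20, 4.45)         # assumed stock voltage/clock
undervolted = dynamic_power(1.08, 4.45)   # ~-120 mV at the same clock

saving = 1 - undervolted / stock
print(f"~{saving * 100:.0f}% less dynamic power at the same frequency")  # ~19%
```

This is why an undervolt can simultaneously lower consumption and raise boost clocks: the saved power becomes thermal/electrical headroom the boost algorithm can spend.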
I did upgrade my TUF Z690 / 12100 with the 14600K, but never thought I was missing out on anything compared to the i7 or i9 cpus, as the most stressful use it gets is only gaming, and not with a 4090 either - yet. And it can still be OC'd to i7 level with an air-cooler, and low power consumption too. Being a "Cheapster" ftw, ha ha.
The 13/14600(K) CPUs were never the problem. Their punch-to-price was never criticised by anyone, really. It's the things Intel did to claim the "leadership position" that were criticised on launch day, and that's now starting to bite them in the ass for real. Heck, you can make good arguments for the 13/14700Ks as well. I'm habitually an AMD guy going back decades (my last Intel system was a dual Pentium III Tualatin 1266MHz), but dunking on brands like they were an opposing sports team is dumb as rocks. Each product should be evaluated on its own merit. It doesn't matter if it has a bigger or smaller version that is better or worse. It doesn't matter if the brand has had previous glory moments or complete duds. What it comes down to is performance, price, efficiency, stability, service, and RMA policy for each individual component.
Hey Steve, I would argue that Horizon Forbidden West can be CPU demanding. In cities like "The bulwark" or "Fleet's End".
Maybe change the testing location, if you want to keep the game in the suite. Pretty please? :)
Will do. Thanks
@@Hardwareunboxed If you need a savegame, maybe I can send you one somehow. Takes quite a while to get to fleet's end.
Now that's content. Put in some work! GJ 👍
Even a sleeping 7800X3D can beat the 14900K easily.
If you have unlimited budget, get a 7950x3d. If you have mid-high budget, get 7800x3d. If you are on a small budget, try getting an am5 cpu, otherwise am4. Intel makes no sense in any part of the chart
12100F tho
Is it even worth comparing a 14900k with an 11900K at this point?
Probably not, 11900K was much worse.
11900k was overall worse than the 10900k.
@@sirmonkey1985 said by a 10900k owner with copium or a sh*t motherboard.
@@AndyViant Go back and watch HUBs or GNs review of the 11900k then. It has two fewer cores and the new cores (on the same old node) don't make up for it in all situations. The 11900k gains a bit in productivity but loses a bit in gaming, and the efficiency is about on par. Both channels were left with "Why?" as their main conclusion.
@@AndyViant Said by an 11900K owner who can't cope with the fact that he bought the WORST processor made in recent history. I'm assuming here that you're not broke enough to actually own that piece of garbage, otherwise I see no reason for a mentally sane person to get so triggered by a comment about two processors from the SAME brand. Lmao, the insecurities.
I was waiting for this.
Do not use the baseline Performance or Extreme profiles. If you don't want to do enthusiast stuff, get the 7800X3D; it just works out of the box. If you want Intel and are willing to do enthusiast stuff: first, put a contact frame on your 14900K. To make your life easier, get a torque screwdriver that can go down to at least 0.05 Nm; a good certified one costs around 70-90 dollars. Get a Thermal Grizzly 13th/14th-gen contact frame and set the torque to 0.06 Nm. This ensures you're not also fighting with your RAM because a correction bracket isn't seated just right, and it corrects the abysmal, flawed LGA1700 CPU mounting pressure, which matters a lot for DDR5 RAM. Make sure you have the right board for your RAM; that's important from 7200 CL34 up to 8200 CL38+. Next, you can let the BIOS optimize, and if your CPU didn't win the silicon lottery, limit cores 4 and 5 to a 57x multiplier (5.7GHz); 6GHz+ takes a lot of core voltage, and if your chip isn't a lottery winner, performance will suffer. Next, look at your V/F curves in the BIOS. Below 5.6GHz you can leave everything on auto; I like a negative offset of .5 on 5.6GHz and every turbo state up to 6GHz. This is where no one size fits all, as each board is different.
You'll need to adjust each turbo state. After you do, save and exit the BIOS, boot into Windows, and run the 3DMark Time Spy CPU test on its own; you'll need 6-12 runs to get an average and account for error. Then keep going back into the BIOS, tweaking the .5x on each turbo level until you find the sweet spot for your CPU. That will get you far better performance than you could ever get out of the box, and it claws back the performance those baseline profiles butcher, because both still pump too much VID/Vcore into your CPU. Heat is not your friend; more voltage is not always better. At 7200 CL34 you should see a CPU score of around 25k consistently; 7600, about 26.5-26.9k; 8000, depending on how well you're dialed in, around 26.9-27.5k; 8200 is around 27.xk. Once you do that, you'll get the best 1% lows with the lowest frametimes, and your system will feel a lot snappier, giving you the best possible gaming experience.
I kind of shake my head at all you reviewers not properly educating your audience, treating them like they're clueless and unable to self-tune. It's not just you; I miss the days when tech reviewers actually took the time to educate their audience properly. Sadly, those days seem mostly over, and maybe 1% of reviewers still do it; some won't tell you how, because that's their livelihood. Does it take extra work? Yup, but if you're going to show off enthusiast hardware, you may want to do enthusiast stuff; if you buy enthusiast-grade hardware, be willing to do enthusiast-level work.
I'll also add that these are guidelines, because each chip and board are different; there is no one-size-fits-all. There is a set of numbers that will just work, but it may not work or perform the way you want; that's sort of up to the end user, and not even the motherboard manufacturer can do it for you, especially on Intel. Intel needs to fix the LGA1700 mounting bracket; that's the first thing needed to fix the lid bow, and no BIOS or microcode will fix that. At this point it's up to the end user, and Intel is hoping this just goes away as they abandon LGA1700.
Look friend, I've been overclocking for over 20 years, even venturing into LN2 a few times, so I get it, and I appreciate you making the effort to inform people on what it takes to make these chips sing. But the reality is that Intel has a bad product right now, even if you make the absurd effort spending the hours upon hours tweaking and stability testing required to squeeze out a negligible win over X3D parts, we both know if that same effort was required to get similar performance out of AMD at the same power consumption as Intel CPUs, Intel enthusiasts would rightly laugh them out of the room.
If you are just gaming, grab a 6000MHz DDR5 kit and a 7800X3D, enable EXPO and you're done. You'll get a range of 90-105% of the performance in games of an expertly tuned Intel chip with 2% of the effort required, with much lower entry cost and power consumption figures. Even as someone who loves OC'ing, the reason I OC is to get everything out of my money, and the extra cost for a high binned DDR5 kit and a 14900K/KS required to best the X3D just makes the Intel platform look like a joke.
If you are running productivity workloads, especially if that's your livelihood, you care equally about stability and performance, you aren't running bleeding edge OCs, and you should be running Threadripper. At which point Intel doesn't even enter the conversation, as an OC'd 14900KS has roughly 40% lower multithread performance than a midtier Threadripper 3970X released over 4 years ago.
I'll still applaud you for your bench scores, and for extracting every last MHz of performance out of a complex chip to tune, but people like us are a dying breed and we make up 1% of PC builders, most people aren't doing deep dives like this, and I've reached a point with Intel where I agree that it's silly.
A reviewer should show the out-of-box experience, not tweaked results, since then, as you say, the silicon lottery comes into play.
not a knock on what you wrote, but most people don't tune their PC. I used to repair and salvage PCs, and you'd be shocked at some of the stuff I've seen from people that built themselves a custom PC.
(The worst case so far was from a guy who bought a 2070 Super to pair with an old AM3+ system that had the 95wTDP FX-8120/Gigabyte 990fx-UD3 board. He put the card in the PCIe 2.0 x4 slot and wondered why the card wasn't better than his previous card. He thought the system was broken, so he sold me the remains for $30usd w/o the 2070 S, PSU, and drives.)
I probably bought a couple dozen other custom built PC's that had an issue, or the owner just wanted to get rid of it, and two of those PCs had been tuned. Most of the time the issue was the old Intel stock cooler popped off.
Just switched from Intel to AMD with a 7800X3D and the chip is an absolute monster. Such great efficiency and temps. Look online for how to undervolt with PBO (Precision Boost Overdrive) and it's even more efficient. Super easy to do and very worth it. Undervolting is the new gospel for me these days.
The 7800X3D is not a monster, it's just an 8 core chip, easy optimization for it.
The 14900K is so powerful that it will take time until software can fully use it even outside gaming.
@@saricubra2867 i'm coming from an i7 8700k, my guy. this chip is indeed a monster.
@@DIGITALDYST0PIA 8700K is slower than a 5600X.
@@saricubra2867 i'm more than happy with this chip's performance and i'm seeing a noticeable performance improvement on my set up so i don't really care if you think the chip is powerful or not. 🤣 it's great for the price (especially the discount it got.) i really don't need a 14900k as i mainly only game. plus it's nearly $200 more and i don't need it to heat up my entire house especially during the summer. 🤷♀
Intel man 🤦🏻
I would classify myself as an enthusiast, but even I don’t know what the hell is going on.
So there’s a standard, recommended, performance, and extreme profile? Or is the performance profile “recommended”? What will the boards ship with? Shouldn’t it be “standard” as it is, you know… standard? 🤷🏻♂️
Yeah, I'm just as confused as you are.
No, no, it's all very simple! The "performance" profile is quite performant you see, and that's why it's the "standard". But we do "recommend" that you use the "extreme" profile for EXTREME gaming performance. You might think that it would make sense that the "recommended" profile is also "standard", but this way, we can tell you a) if you run into issues with "extreme", well, that's not the "standard" profile! You should use the weaker "performance" profile because that's the "standard"! Or b) if you are not happy with your processor's "performance", it's your fault because you did not set it to the "recommended" "extreme" profile that we recommend! Either way, you lose, Intel wins!
The funniest thing is that when they released the 14900K they showed all those benchmarks and scores, but now the recommended setting can't hit those claimed scores... That part doesn't sit right with me. Intel sold those processors with big claims and now they're changing the performance. All those customers who bought the "fastest CPU" got robbed...
@@Ascarion1234 But isn't it false advertising on Intel's part? I mean, some people can't hit Intel's claimed scores without the CPU crashing... People bought 100% of the CPU's performance and are now told to run essentially 75-85% of it... I truly think it's false advertising, because Intel themselves used those same profiles/power settings that they now recommend against.
All I needed to see was the 14900K drawing around 100W or more extra power for almost the same fps. Even if people don't care about electricity pricing, they should care about the heat being dumped into the room where the PC is.
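To put a rough number on that extra draw, here's a minimal sketch of the running-cost math (the inputs are illustrative figures from the discussion, not measurements: a 150W delta, 4 hours of gaming per day, and the ~16.7 ¢/kWh US average rate):

```python
# Rough estimate of the extra yearly electricity cost of a higher-draw CPU.
# All inputs are hypothetical examples, not measured values.
def extra_yearly_cost(extra_watts, hours_per_day, price_per_kwh):
    """Return the extra cost in dollars for one year of daily gaming."""
    extra_kwh = extra_watts / 1000 * hours_per_day * 365  # kWh over a year
    return extra_kwh * price_per_kwh

# e.g. 150 W more draw, 4 h/day, at roughly the US average of $0.1668/kWh
cost = extra_yearly_cost(150, 4, 0.1668)
print(f"~${cost:.0f} per year extra")
```

That works out to roughly $35-40 a year under these assumptions, and every one of those extra watt-hours also ends up as heat in the room.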
Unless you live in a cold area, then it becomes a plus lol 😂
The Extreme profile is not worth it at all. Neither is the 14900K for gaming.
Nice review, just what I needed! You should throw Star Citizen in there, it uses a lot of CPU.
That poor Intel chip :D
350+W draw xD
So happy to see *my* 7800X3D is performing that well against the latest Intel top dog! 💙
Well, the 9000 series will be disappointing: a two-year leap and AMD can't beat the 7800X3D, while with Arrow Lake even the Core 5 will beat the 7800X3D. I don't know what AMD is even doing with their 9000 series.
@@KoItai1 "Will" being the key word, we'll see if Arrow Lake is really that good.
@@robertjif6337 Even if they don't change much else about the CPUs, the jump from 10nm to 3nm alone gives 20% more performance per watt. Intel always shines when they change platforms; they've always struggled to make better CPUs on the same platform, which is why they change it so often. Now consider it's also a two-year jump, and the 13600KF was just 10% behind the 7800X3D, so it's not unreasonable for the Core 5 to be better.
@@KoItai1 Arrow Lake is going to be more of an efficiency increase than a performance increase. The X3D parts are basically a generation ahead of the non-X3D parts; remember, we have the 9000X3D parts coming out this fall, probably around the same time Arrow Lake launches. The 9950X will probably be well ahead of the 15900K for workloads and about on par for gaming, and the 9xx0X3D will be well ahead in gaming. The big focus of Zen 5 is higher sustained clocks in all-core loads and better chiplet communication, with dramatic reductions in latency, etc. Word is that Zen 5 will be dramatically better in games that leverage more than 8 cores. There is also talk that the 9000X3D chips are packing some other upgrades: a new 3D V-Cache module, higher clocks, and perhaps even overclocking. Most sources agree it's not looking good for Intel.
@@PineyJustice AMD themselves said the 9950X won't be better than the 7800X3D; that means they'll be even worse than Intel 14th gen, or really close, after a two-year leap, which is really disappointing. Intel gets 20% better performance per watt and a 25-30% IPC increase; dropping hyperthreading will hurt their multicore performance but should significantly increase gaming performance. Until then, one thing is sure: the non-X3D 9000 chips will be absolute trash, considering you can get 90-95% of that performance from the 7000 series at a much more discounted price.
I have been an Intel user for over 25 years. Until the past 3 months. Currently using a 7800X3D and am blown away by how amazing this CPU is. Might have a fully converted user here
I would just get the 7950X3D and call it a day: 7800X3D-level gaming performance and more powerful everywhere else. You should test the 7950X3D vs the 14900K, not the 7800X3D vs the i9-14900K. Y'all already showed the 7950X3D is slightly faster in gaming than even the 7800X3D.
Just slightly faster than 7800x3d while being significantly more expensive, not worth it tbh.
The 7800X3D is faster in gaming, which is what is tested on this channel. For productivity, the non-X3D 7900s are better!
@RamonInNZ The 7800X3D was tested and lost to the 7950X3D on this channel lol. It's rated as No. 1 because of its price/performance. I also said the 7950X3D is faster everywhere else, so I'm talking about people who want the best AMD CPU and who do a lot more than just gaming.
@@jeremyg2236 You do understand that people don't just use PCs to play games, right?
@@suparibhau Yes, I know, but this video is mainly testing games, so obviously I'll recommend the 7800X3D instead.
Leak from Igor's Lab suggesting that it's a microcode problem with the fix coming next month...
The i7 13700K/14700K is the real competitor here.
It ain't much of a competition though, AMD still wins across the board.
@@D00m3dHitm4n More so competition in the same price range; the i9 isn't a gaming CPU. The i7 is also much better in other ways.
@@alsayedfakhri4597 The point is to compare the best Intel and AMD gaming CPUs. The 14700K will be slower than the 14900K, so you can extrapolate from the results in the video.
Holy cow. i9 completely annihilated.
The more you Intel, the more you want AMD.
Hell nah, not with them shady-ass benchmarks.
@@thetheoryguy5544 lol, as if Intel has never had shady benchmarks
@@thetheoryguy5544 Based on your comments, you are extremely biased in favor of Intel. These are companies, man. Not sports teams. You don't pick a side and root for it 🤦♂️
@@thetheoryguy5544 Two words. Principled Technologies.
@@shoobadoo123 Exactly, if more people behaved like us the PC hardware market would be way different. Fanboy wars are what destroys markets and hardware evolution.
7800X3D is simply a beast of a processor
Hardware Unboxed doesn't even unbox hardware anymore. Kinda sad watching your idols get lazy. Fortunately we've got world class reviews, but come on guys, unbox something for your old fans!
They really used to unbox things? That's kinda funny
LOL ;)
Well, the hardware is already UNBOXED lol
Kudos to you, Steve, and the rest of the team. 😊
Honestly, these "system power" numbers don't really convey the stark difference in power usage. My 7800X3D draws double-digit power, usually around 60-80W. It looks like "only 90W more" for Cyberpunk, but the reality is that's over twice as much at the CPU.
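A quick sketch of why an absolute delta understates the gap: add the observed gap back onto the smaller CPU's draw and look at the ratio instead (the wattages here are illustrative figures from the comment, not measurements):

```python
# Why "only 90 W more" understates it: compare CPU-level draw as a ratio,
# not just the absolute delta. Figures are illustrative, not measured.
x3d_watts = 80              # hypothetical 7800X3D draw under a gaming load
intel_watts = x3d_watts + 90  # same load plus the observed ~90 W gap

ratio = intel_watts / x3d_watts
print(f"{ratio:.2f}x the CPU power for similar fps")
```

With those example numbers the other chip is pulling over twice the power at the socket, even though the system-level chart shows "only" a 90W difference.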
I have a similar bit of information to share. My laptop's 13980HX lets me set PL1 and PL2 to static values. On multicore workloads limited to 45W, it can clock down to 1.7 GHz. Left uncapped it reaches about 4.5 GHz on the P-cores, but it's pegged at 95°C and sounds absurdly loud, requiring noise-cancelling headphones.
I've tried various power limits to play games without the noise, but it's apparent that the laptop's cooling solution can't handle anything above 28-35W without hitting 95°C. It's a shame, really.
Power efficiency is a bigger issue, since it translates to heat, battery life, and noise, and it matters disproportionately more the smaller the device becomes. So while people may shrug at 200W or even 100W on the desktop, those numbers have changed the mobile landscape more than they realize. The knock-on effect is that mid-to-high-end mobile Intel CPUs are essentially hot, loud, and battery-killing.
AMD and Apple are cornering the mobile and remote-work market by offering the best performance per watt per dollar. Our company, which manages IT for hundreds of companies, stopped selling Intel mobile devices 3-4 years ago. People want quiet systems with long battery life that can do everything without compromising on either.