Ryzen 7 7800X3D vs. Core i9-14900K [Asus: Intel Baseline Profile] Gaming Benchmark
- Published May 30, 2024
- Thermal Grizzly: www.thermal-grizzly.com/en/kr...
Intel CPUs Are Crashing & It's Intel's Fault: Intel Baseline Profile Benchmark: • Intel CPUs Are Crashin...
Support us on Patreon: / hardwareunboxed
Join us on Floatplane: www.floatplane.com/channel/Ha...
Buy relevant products from Amazon, Newegg and others below:
GeForce RTX 4070 Super - geni.us/wSqSO07
GeForce RTX 4070 Ti Super - geni.us/GxWGmYQ
GeForce RTX 4080 Super - geni.us/80D6BBA
GeForce RTX 4090 - geni.us/puJry
GeForce RTX 4080 - geni.us/wpg4zl
GeForce RTX 4070 Ti - geni.us/AVijBg
GeForce RTX 4070 - geni.us/8dn6Bt
GeForce RTX 4060 Ti 16GB - geni.us/o5Q0O
GeForce RTX 4060 Ti 8GB - geni.us/YxYYX
GeForce RTX 4060 - geni.us/7QKyyLM
Radeon RX 7900 XTX - geni.us/OKTo
Radeon RX 7900 XT - geni.us/iMi32
Radeon RX 7800 XT - geni.us/Jagv
Radeon RX 7700 XT - geni.us/vzzndOB
Radeon RX 7600 XT - geni.us/eW2iWo
Radeon RX 7600 - geni.us/j2BgwXv
Radeon RX 6800 XT - geni.us/yxrJUJm
Radeon RX 6800 - geni.us/Ps1fpex
Radeon RX 6750 XT - geni.us/53sUN7
Radeon RX 6650 XT - geni.us/8Awx3
Radeon RX 6600 XT - geni.us/aPMwG
Radeon RX 6600 - geni.us/cCrY
Video Index
00:00 - Welcome to Hardware Unboxed
01:08 - Ad Spot
01:48 - The Test
02:29 - Helldivers 2
02:52 - Cyberpunk 2077: Phantom Liberty
03:09 - Counter-Strike 2
03:35 - Dragon’s Dogma II
03:53 - Assetto Corsa Competizione
04:13 - Baldur’s Gate 3
04:30 - Marvel’s Spider-Man Remastered
04:47 - Hogwarts Legacy
05:04 - Horizon Forbidden West
05:17 - F1 23
05:30 - Assassin’s Creed Mirage
05:47 - Starfield
06:11 - Average 1080p
07:19 - Power Consumption
08:52 - Old Power Data
09:17 - Final Thoughts
14:28 - Last Minute Leaked Info
Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
FOLLOW US IN THESE PLACES FOR UPDATES
Twitter - / hardwareunboxed
Facebook - / hardwareunboxed
Instagram - / hardwareunboxed
Outro music by David Vonk/DaJaVo - Science & Technology
Getting 7800x3D almost a year ago was the best decision ever for me
i have had mine for 10 months i agree they are great
Same here, 7800x3d and 7900XTX and could not be happier. 1440P!
Getting a 13900k one or two years ago was the worst decision ever for me (X3D wasn't out).
I try to chill out because I run 7200 MHz with good cooling and a small OC to keep it up (contact frame and delid soon). Wasted money anyway.
It's the first CPU I've ever pre-ordered.
@@brucendolf My last Intel CPU was an i7 3770. Later I jumped to AM4, and 3 years later I sold it, added $10 and got AM5 with an R5 7600 thanks to a good deal
Got a brand new 7800x3D for $270 here in Japan. Sold my old 5800x3D for $240, best deal ever.
Dang, how did you manage that?! :)
I built my first pc a year ago. Under $3000 with tax.
7800X3D ,Gigabyte 4090 Windforce Turbo 3.2Ghz,DDR5 6400.
I think it’s the most powerful gaming pc.
you still had to buy motherboard and ram
since i live next to a microcenter in USA i got a motherboard + ram + 7800x3d for only 460 dollars
@@aohjii I got that same bundle no problems so far
@@maurice482390 yeah, same. I updated my BIOS to the latest version, no problems, enabled EXPO to unlock the RAM speed potential, and I underclocked the 7800x3d, as in I capped the temperature limit at 85c
7800x3d amazing chip. Fast, efficient, good value, runs cool
I have one and it could run cooler if it had a thinner IHS, but I guess AM4 cooler compatibility comes at a thermal cost...
yeah, it doesn't really run cool because it's got the cache slapped on top. Maybe it just seems cool compared to Intel and its 100c
"good value"
Amazon:
R7 7800X3D = 368 dollars on amazon.
i7-12700KF = 189 dollars.
Imagine paying twice the amount of money for 10% more FPS and 4 fewer cores.
@Desalater2
My deepcool ak500 air cooler maxes it out at 85 degrees with PBO. Not sure what you're talking about
The main issue is that the components are installed upside down in Australia. Happy to help!
We install the 12VHPWR connector for the 4090 upside down here, and now I am homeless... (Joking)
Upside down Australia jokes never fail to make me laugh.
@@Very_Unwise
Why don’t Australians ever get tired of “upside down” jokes?
Because they always find them to be “over the top”!
Yeah, reduces cooler mounting pressure
@@Very_Unwise It is weird to think that every time I do a pushup, they have to do an inverted row
in my region the 7800x3d is 15+% cheaper than the 14700k and 30 to 40% cheaper than the 14900k... so... it's not even a competition
Yeah, Intel at this point is mainly for productivity if they're cheaper than the R9 7900/7950X
@@ismaelsoto9507 Pretty sure with the power limits kneecapped the 7950x is faster LMAO
If it's just about the gaming performance, it never was a competition. Intel makes sense if you are after productivity (multicore) performance. Purely for gaming, the strongest Intel part that really makes sense is the 14600K(F), or the 13th gen equivalent. It has very good gaming performance, it's cheaper than the 7800X3D and it still packs a pretty big punch in productivity applications.
@@ismaelsoto9507 in my region(ukraine) its like this:
7800x3d=360$
7900x=375$
14700k=440$
7950x=500$
14900k=575$ (up to 630-700 in more reputable stores), so it's more or less competitive if the application works better with Intel.
but personally I'm looking more at CPUs like the 7600x for 215$, or the 5700x3d for 250 as an easy AM4 replacement
Similar situation here in Denmark with one of the largest retailers.
An i7-14700K is DKK 3,444
An R7 7800X3D is DKK 2,899
15.8% cheaper for the 7800X3D.
While you probably shouldn't get the cheapest motherboard, the cheapest Z790/X670 motherboards I can find at that retailer are swapped around.
ASRock Z790M PG Lightning/D4 is DKK 1,250
ASUS PRIME X670-P-CSM is DKK 1,587
So 21.2% cheaper for the Z790 board.
Total cost:
Intel: DKK 4,694
AMD: DKK 4,486
4.4% cheaper for the AMD platform - but I've no idea if the boards are any good or even comparable.
Edit:
Just to throw it in there - the i9-14900K is DKK 4,798 at the same retailer, making the 7800X3D 39.5% cheaper, and the total AMD platform cost 25.8% cheaper.
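The percentage claims in this comment are easy to sanity-check. Here is a minimal sketch using the DKK prices quoted above (the helper name is just illustrative; "X% cheaper" is computed relative to the more expensive option):

```python
def pct_cheaper(cheap: float, expensive: float) -> float:
    """How much cheaper `cheap` is than `expensive`, in percent."""
    return (1 - cheap / expensive) * 100

# Prices quoted in the comment above (DKK, at one Danish retailer)
i7_14700k, r7_7800x3d = 3444, 2899
z790_board, x670_board = 1250, 1587

print(f"CPU:      7800X3D is {pct_cheaper(r7_7800x3d, i7_14700k):.1f}% cheaper")  # 15.8%
print(f"Board:    Z790 is {pct_cheaper(z790_board, x670_board):.1f}% cheaper")    # 21.2%

intel_total = i7_14700k + z790_board  # 4694
amd_total = r7_7800x3d + x670_board   # 4486
print(f"Platform: AMD is {pct_cheaper(amd_total, intel_total):.1f}% cheaper")     # 4.4%
```

The comment's figures check out; the same helper fed the quoted i9-14900K price (DKK 4,798) lands near the 39.5% figure in the edit.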
Because of Intel's new "default setting", this video's setting is an overclocked one 😅😅😅
Yep! 😂 Intel Default is 125W/188W for K SKUs and 65W PL1 for non-K SKUs.
@@PowellCat745 now 14900 is even worse than 13600k haha
@@apexHasu That's the problem. Without these massive power increases, I bet they're no better than 12th gen - 12900k
@@WeyermannX No better than 13th gen. The 12900K has half the E-cores of the 13th and 14th gen i9s, so it loses in productivity anyway, and from what I've seen the 13900k/14900k are somewhat faster for gaming than 12th gen (not that much though, to be fair).
@@WeyermannX even 12th gen runs with a higher than Intel Default PL. In fact, it is rumored that this started happening from around 9th gen.
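For readers following the PL1/PL2 back-and-forth in this thread: PL1 is the sustained power limit and PL2 the short-term boost limit. A small sketch of the cut implied by the figures commenters quote (253W as tested vs. a reported 125W/188W Intel Default; these numbers come from the comments, not from Intel documentation):

```python
# Power-limit figures as quoted in this comment thread (watts).
tested = {"PL1": 253, "PL2": 253}   # ASUS "Intel Baseline" profile used in the video
default = {"PL1": 125, "PL2": 188}  # reported Intel Default for K SKUs

for limit in ("PL1", "PL2"):
    cut = (1 - default[limit] / tested[limit]) * 100
    print(f"{limit}: {tested[limit]}W -> {default[limit]}W ({cut:.1f}% lower)")
```

Roughly a 26% lower boost budget and half the sustained budget, which is why commenters expect another round of retesting.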
The new Baseline profile is 188W and everyone will need to use that as default. I guess it's time for a retest at 188W... Poor Steve.
It's like for comedians: so much crap happening = more material for content, which really should make us sad 😅
I can't even begin to imagine how boring retesting must be. Yet you must be very careful while doing it. Big thanks to Steve for doing it for us.
@@anttikangasvieri1361 "Thanks Steve" "back to you Steve"
everyone isn't gonna NEED to use it as default, unless you have instability issues
@@userblame632 yeah, those 10% of chips that can run overclocked to death will last 6 months to a year. Let's all collectively go off of those. Shit, why don't we just go off of LN2 builds, because everyone has some liquid nitrogen around.
I have a bad taste in my mouth regarding this. It's much like getting a new sports car with, say, 350 hp. After it does well in reviews, the manufacturer issues a recall and tunes the engine down to 300 hp in order to fix overheating problems.
maybe VW with diesel exhaust claims on the cards
It's only two FPS on Medium and Ultra at 1080p. PL1=PL2=253W is the right setting; not sure what Gigabyte is doing, because they got it wrong. So it's like losing 3 HP, but with less heat and gas consumption.
@@PaulFCB1899 I am sure the Gigabyte baseline is correct.
@@leonjun9401 No, it's not. It's a simple click to see that Intel's official specs say 253W. Gigabyte lowered the TDP to the specs listed for the 14600k while others followed the 253W, so why would Gigabyte's be correct?
@@PaulFCB1899 Someone didn't watch till the end of the video...
X3D line has been a true game changer and truly innovative in the world of CPUs
Definitely!
Oh how the tides have changed, this is what FX was to intel CPUs 7 years ago.
FX was even worse than this.
They were massively behind in terms of performance.
The only feasible processor for gaming was the FX 6300, which performed roughly on par with an i3 at the time (and that was only for entry level budget gaming systems).
But at least FX was cheap; Intel isn't...
@@darkglobe406I remember when the FX 6300 was extremely popular on Newegg!
Not even close, the performance of FX processors were way behind Intel ones.
1st gen Bulldozer desktop CPUs (Zambezi) could actually be slower than Phenom II CPUs depending on the application. It wasn't until 2nd gen Piledriver (Vishera) and several patches to the Windows thread scheduler that performance increased and surpassed Phenom II CPUs.
A 4-module Piledriver like the FX-8350 was as fast as an Ivy Bridge i3 at best. Even 1st gen Ryzen 1000 was a massive upgrade.
In fact, it was worse. The only good thing about FX CPUs was that they were cheaper. AMD at that time was close to bankruptcy. Intel still has the opportunity to save its ass. Also, right now they still have a huge fan base. It will be a problem for Intel when they can't convince even their biggest diehard fans to buy the latest Intel CPUs.
FX chips still work to this day. These i9s won't make it past warranty
Great video, looking forward to the part 2 with the announced baseline profile at 125/188.
Wow. You can pair an RTX 4090 with the 7800X3D in a system that draws less than 500W on average. X3D is truly a remarkable product for gamers. On a 750W PSU that's under 70% capacity, which is not too far from the optimal load in the power band.
I have exactly this combo and it draws less than 300W while gaming at 4K60FPS in most games. I did power limit both the CPU and the GPU because they’re too fast for my 4K60Hz TV
@@little_fluffy_clouds AH so another 4k 60hz enjoyer out there. Welcome to the club, I got the 7800x3d and a 4080, native 4k at 60fps is the sweet spot. No upscaling for me. my tv is a 65 incher, i use it as my desk monitor 😂
@@turboimport95 I do the same with my 4070ti at 4K 60fps, it's awesome. Hopefully that 12 GB of VRAM doesn't screw me before the 5 series cards come out.
@@turboimport95
I got a similar setup, though I went with the 7900xtx instead of the 4080.
Quite the jump coming from console. I don't like using upscaling or anything when I don't have to.
And luckily this PC almost always maxes out my 120hz TV at a full 4K at maximum settings. Even in many larger games.
It's freaking nuts. I wish I went PC at the beginning of this console generation honestly. Took me too long to make the jump over
@@sawdust8691 Welcome to the master race friend 😄
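The PSU headroom math from the comment that opened this thread (~500W system draw on a 750W unit) checks out; a tiny sketch using those quoted figures:

```python
psu_rating = 750   # W, PSU size from the comment above
system_draw = 500  # W, average gaming draw quoted for 7800X3D + RTX 4090

load = system_draw / psu_rating * 100
print(f"PSU load: {load:.1f}%")  # about 66.7%, under the 70% mentioned
```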
Intel supported 2 generations, where one generation was basically just the same with more power…
That's progress…
13th gen doubled the number of E cores, doubled the L2 cache on P cores and also supports faster DDR5 compared to 12th gen.
A little too much power, it appears!
@@lharsay Yes, that's the 2 generations the OP stated. Now look at 14th gen: Ctrl + C and Ctrl + V.
@@lharsay Which does f all for most people, E-cores are useless in most games.
My old theory is looking even more legit: they introduced E-cores because they couldn't fit or do anything else with their far-behind process node.
So a slower CPU with bigger power consumption...Intel you high?
Lower fps not slower.
the i9 is better suited for productivity
10nm vs 5nm
Yeah, intel loses to amd in gaming
Intel is definitely high(on power consumption), yes.
The disclaimer at the end blew me away 😂. This testing was with the 253W PL1 and PL2, not the 125W/188W default. Jeez, Intel is cooked.
Yeah, that's gonna struggle to beat a 7700x at the new default power levels
@@AndyViant I was thinking the same thing lol. Intel's f---ed bad this time. Releasing 14th gen was a mistake lol
And if the 15th gen rumors are accurate Intel is super screwed.
As seen in the outro, with 125/188 W limit it should be roughly 10% slower than this depending on the game. So, i5-14600K levels... brutal.
@@naamadossantossilva4736 Intel is screwed in enthusiast space no matter what, ARL would have to be AT LEAST 2.5x more power efficient than RPL is now to have slight chance against Zen 5. That would be a miracle.
where is the Ryzen 7 5800X3D?
Can't believe you forgot that one again. 😂
This is a meme at this point.. forgotten just for you grunk😂
The GOAT of CPUs!
What is that CPU? Never heard of it.
th-cam.com/video/Evj1tX8yFUU/w-d-xo.htmlsi=8HGhgAWJK2C-yvdL - already did it 11 months ago! 😂
i5-13600K and 12700K are faster.
If you see the difference in price here in Germany:
14900k around €590
14900KS around €720
7800X3D around €350
All prices including VAT.
If you are a gamer, there are not many reasons to choose a 14900K(S).
If you need more cores you can buy a 79xx X3D.
Obviously if you are a gamer you don't buy a flagship CPU with 24 cores. That's why Intel has the i5 13600k.
@@KiSs0fd3aTh 13600k loses to the 7800X3D by a wide margin in gaming, there's no point comparing them.
The 13900k/14900k/s are the only chips that can match the 7800X3D. That's why they're compared side-by-side.
@@josephhodges718 Yes, a huge margin, 0% at normal resolutions you'd use a 4090 for.
😮
Just bought a 7800x3D and 32gb 6000mhz cl 30 ram that will be here tomorrow. Upgrading from a 10700k and 32gb of 3200mhz cl 16. Super excited to finally get full performance from my 4080s soon.
10th gen was the last time Intel had an actual good product. 11th gen was a turd sandwich. 12th gen did briefly get them the performance crown back, at the cost of mild power disadvantage, but X3D snubbed that pretty fast. 13th and 14th gen was an insult to their customers, and by the looks of it Intel isn't even done screwing up yet.
Dunno if 6000 is a good idea. My Samsung kit did 6000 at CL 30-33-33-66, and it's not even near 8000 CL34 at full HD. I would pay a little more for 7200 CL34, like the Deltas. More possibilities.
@@andersjjensen Intel already hit a limit with their architecture. Did you know that from Core 2 Duo until today, it's just a modified version of the Pentium 3?
Meanwhile AMD Ryzen is a whole new architecture
I did this exact upgrade about 6 months ago and nearly doubled my FPS in most games with a 3070 Ti. Kept the same cooler, and it keeps the X3D colder than the (overclocked) 10700k. It truly is an awesome upgrade.
LOL, that epilogue. At least you already covered the 125/188 W profile in your last video.
Once the actual profile hits... throw a 14900KS (150w PL1) in there and see what happens.
Your electric bill goes up
I think it's still a loss for Intel, because having to throw more power consumption and heat at it to beat AMD is not great.
@@sir1junior It goes down compared to the Unlimited "default" that was implemented until recent problems with degradation.
Makes no sense to buy 14900KS and use it with this limit
... or you could run a 14900KS and game just fine, I'm doing that.
Excellent video as always.
I like how you can HEAR Steve's air quotes.
in my country the 7800x3d costs 290$, and on sale around 258$
and the i9 14900k is around 600-650$
250!? Wow!
@@conyo985 YES, AliExpress is an amazing site. The amazing thing is that I ordered 5
and got 4; for the last one they sent a 5700x instead of a 7800x3d by mistake, so they refunded me and also told me to keep the 5700x xD
I sold 3 of them at a good profit :)
250$ wtf
Where is this? Can you please link us? Do they deliver internationally?
i live in the wrong country then.. :(
Today switching from 5800X3D to 7800X3D because I can switch systems for not more than 200$. Excited 🥰
That's really lucky sales of the old hardware.
How? It's a whole new platform. I'm waiting on the 9800X3D to jump from AM4. Still on zen 2; which is starting to show its age a bit.
you need ddr5 ram too, remember
@@spacechannelfiver Sell old parts, add 200$ = new parts. Simple AF
@@spacechannelfiver I have a 7800x3d, which is on an AM5 motherboard, so the 9800x3d will work with the same motherboard
Really appreciate how thorough you guys are, so quickly re-visiting and then building upon the new findings with this video!
117W more power on average for similar 7800x3d performance 😮. But wait, there's more: a new BIOS with lower power settings and less performance coming up.
Well at least then the power difference will be less 😂😂😂
coming up? when
@@aohjii May 31st
I have done two 7800x3d builds this year. It is crazy how much cooler it keeps my office. I have been amazed by these cpus.
I cooled one with what was supposed to be a temporary cooler, a Silver Soul 135. Even under stress testing, it never went over 76 C.
Why would u buy this for an office pc? Wouldn't u just buy a non x3d?
@@criminalle88 Home office.
@@criminalle88 Efficiency - my office is cooler, less energy at the wall. I work from home 50% of the time designing automation, and the rest I spend on the road programming. I have a half dozen PCs in my home. I typically run 3 VMs on my 7800x3d build for different software platforms. This is very efficient. I replaced my 5900x build with this after doing a power analysis on my daughter's PC. The truth is these PCs are all so fast that most people could not even detect a difference. Also my 7800x3d CAD build is whisper quiet. I have a good audio and mic setup for meetings, so this is huge.
@@nimrodery This is a great home office cpu. I run tons of VMs on it. It is so cool I can power it on in the morning and my office stays cool. I love everything about this cpu.
I use that cooler on my 7700 and 7800. I put a Thermalright AM5 secure frame on both and gained 75 MHz all-core loaded.
Great comparison video. Thanks.
Excellent review! Thank you for the time and effort to rerun old tests again, purely out of your followers' interest.
Great video! I only wish you gave average of 1% lows. They are important as well, and power limits seem to have impacted those more than averages.
What a plot twist in this life. As someone who remembers the AMD FX series, watching this... damn, now we have Intel consuming the power and getting 🔥
Same here, I remember that. It's such a reverse.
As someone who remembers Netbu(r)st (well, semi-remembers, I thought the Intel(r) space heaters were Nehalem), we have come full cycle.
That is what happens when a company falsely believes itself to be a monopoly. AMD has also long passed Nvidia in raster because Nvidia is now an "AI company", i.e. the FTX of 2024. Sadly, people bought the hype about ray tracing (which sucks ass and has no artistic value) and DLSS 3.0 (which has higher visual lag than native and looks worse) because of "new tech, woooooow" and "look at these high numbers I am getting" mentalities. In a few years, with their current disregard for the gaming market and the inevitable crash of the AI bubble, Nvidia will find themselves in the same position as Intel: with a lack of investment in their core business and investors demanding the same returns they got the previous decade.
The first computer I assembled was an AMD K6-2. I remember well, Zen5 will be another AMD Athlon moment for them.
@@wfb.subtraktor311 What "native?" You mean TAA, turn that off and tell me it's better. Not many modern games have a true "native" mode, it's "Native with TAA built in." Personally I think DLSS/DLAA/XeSS look about as good as "native" and you get more performance. A lot of people agree, therefore NVidia outsells AMD.
Love it so much :D The effort you guys are going to is phenomenal. :D
Informative video! Appreciate the effort
A 188W power limit seems overly conservative when it was 253W before. That means it was being fed over 25% more power than it should have been; no wonder they're so hard to keep cool.
And in practice the 14900K/KS drew over 300W in some applications. Fair enough as option for enthusiasts but as default spec that's more than questionable.
Back in the day, enthusiasts would overclock their own CPUs, but today Intel makes it the default...
Great video, although I believe the story hasn't really changed much, 7800X3D was always the far superior option vs the 14900k for gaming-first builds (price, performance, efficiency, and upgrade path). It's a no-brainer.
Also, because it's so efficient you can get away with a cheap cooler and a cheap motherboard with a mediocre VRM, and you can use a cheaper, lower-wattage PSU. All of that combined allows you to buy a better GPU or more storage space.
@@mrbobgamingmemes9558But with Intel it could also double as a room heater for the winter or a way to turn your room into a sauna in the summer. Aren't those just fantastic selling points?
@@XxGorillaGodxXI’m sold.
@@mrbobgamingmemes9558 Oh damn, great point!
@@XxGorillaGodxX You forgot the most important selling point- getting rid of all your cash and making more room in your house
Good video Steve! That Power Consumption by the Intel chips is still crazy, power bills be expenny.
I appreciate your efforts, Steve! It's really good to have independent reviews like yours out, and I look forward to your forthcoming comparison between Intel Default v. 7800X3D.. and ofc against Zen 5, this summer! I know it's a lot of work for you, but it's also content, right!
I've got a 7800X3D with an X670 board, 7900XTX main GPU and 6600XT secondary GPU, and even with running a game on the primary and Firefox with TH-cam playing on the secondary GPU, I still don't even hit 400W of power usage.
It's crazy to think that you can have a high end PC with the same old 550W 80 Plus Bronze power supply.
Perhaps if you play a title from 10 years ago. I have that combination (7800x3d & 7900xtx) and I have a power consumption of about 560W at peak when playing a demanding AAA title.
@@joaogabriels.f.5143 Actually, I have a Corsair HX1000i power supply, dating back to when I had dual GTX470s. The built in full system power monitoring is very useful.
I have a tuned 14900k that pulls 330W in Cinebench, and in games it uses like 80-170W (depending on the game). My video card uses 350-400W, and the rest of the computer components 150W. So I ask all these energy obsessives: if my computer burns 550W, plus 50W for monitor and speakers, how would a different CPU save my bill when comparing worst case Intel at 770W with best case AMD at 680W (if I had a 7800X3D instead of Intel)?
Thanks for testing the Extreme profile! Can't wait to watch the next "baseline" tests, lol
14:49 Honestly, can't wait for the ARL vs ZEN5 comparison later this year! Keep up the good work, Steve! 👍
I love seeing the data, thanks!
Looking forward to the R9 7950X3D vs 14900k video in mixed gaming/application workloads.
Oh boy! This will not be pretty 😮
Edit: ASUS = 253W PL1=PL2
Sweclockers put out a report that there will be a default setting in future BIOS versions (so this will be the stock setting out of the box):
"Intel Default Settings is 188 W"
Edit 2: I have now finished watching the video. HUB should send Intel a thank-you card for creating drama/video opportunities 😂
Having watched the video it just wasn't that much different.
Your work is appreciated. I'm sure there's always one more thing people ask for from the data, but I think it would have been interesting to include the ASUS/etc BIOS default from before the Intel Baseline profile, so any performance loss from cutting the power could be more easily illustrated.
You guys putting out top-notch videos at around 2100 hrs? Noice!
Ta for the amazing video. Such good data. Waiting for your next one! ❤
At this point you can compare it to 5800X3D LOL
Now the 14900K vs 7950x3D please. Because while the 14900K is also 8 (P) cores, it has another 16 minion cores to take care of background tasks, and the 7800x3D only has 8, so 2 of those would always carry Windows and other background tasks. The 7950x3D on the other hand has 8 more cores that can handle background tasks, so it would be the real 8 cores vs 8 cores.
Yes, and those additional 8 cores are actually usable for work, unlike Intel's, which are only good enough for low-priority background tasks. I've had a 13900K since early 2023 and started experiencing instability about 4 months before the news about this problem became public. I limited the Asus power profile manually back then to avoid the issue, and my system became noticeably slower in my workflow. When I saw the issue was widespread I updated the BIOS, revisited that config, applied the Intel profile, and the system became even lower performing than before. So I am now losing about 1.5 man-hours of work per day due to this limitation. My conclusion: Intel is selling snake oil with those processors. They marketed them as faster than AMD, as did all the media, and now the reality is that they are not. Currently waiting for the Ryzen 8000 series to get a real 16 core processor and regain my lost productivity.
Whoa. Major plot twist in the last 30 seconds. Great video. Thanks for this.
Do you have a plan for testing going forward? If the 125/188 does become the default profile is that what you are going to use as that is the OOB experience or are you going to use the Performance / Extreme profile for testing (assuming that is classed as in spec by Intel) as it is not too much different to turning XMP/EXPO on?
I'm so happy I returned my 14700k and jumped to 7800x3d.
Edit to those hating in the comments: I'm building the best GAMING PC I can right now. The only thing that I didn't get top tier is the GPU. I chose a 4080 super because the 4090 is way overpriced and doesn't make sense to get for a 1440p high refresh rate monitor.
The 7800x3d isn't a "budget" cpu. It's simply the best for GAMING right now. The fact that it's cheaper than Intel is just a bonus. CPU pricing wasn't a factor when deciding my new build. Also, the AM5 platform has greater upgrade potential with Zen 5 (maybe even 6?) - I can just upgrade the CPU and leave the rest intact. LGA 1700 is a dead end and would require a new board if I wanted to upgrade. Lastly, all of the current Intel drama steered me away.
For info: I'm upgrading from an i7-6700k. I used to be an Intel fan back when they had higher IPC and better gaming performance than AMD - even with fewer cores. The tides have changed, and AMD performs better than Intel now for gaming (on average). Just look at the 1% lows. Some are significantly higher. 3d v-cache can make a huge difference.
that's a downgrade to inferior product
@@user-mf6wv8ex9pLol what? The 7800X3D destroys the 14700K in gaming
@@TheAtomoh He means it's inferior in that you can't use it as a "heater" for your room.
@user-mf6wv8ex9p A 370 dollar CPU completely beats your 600 dollar CPU in gaming, and you are still salty about it. There is no "inferior" product; it's about the performance they offer at a given price. And it also depends on the use case (gaming in this case). Being a fanboy will only do harm to you and all the other consumers.
A CPU is not only for gaming, brother.
The i9 blows the 7800x3d out of the water in productivity applications and streaming for content creators.
The 7800x3d is really only for gamers on a budget who don't want to do anything else with the CPU.
You need more points of consideration than those extra overkill FPS, which don't matter because all mainstream CPUs are already very capable of dishing out FPS.
300 fps vs 330 fps makes no difference to us.
I bought a 7600x on launch and I’m happy as could be. I’ve got pleeenty of room to grow on this same platform. Zen 5, Zen 6, x3D. I’ll probably double my performance in 3 years all with the same mb, ram, and cooler. Just drop a new cpu in.
Wait and see... we might only get a marginal performance increase with Ryzen 9000. Just bought a Ryzen 7600 that was on sale for 190€ to replace my 3600, on ultrawide with an XTX. I'm happy I took it instead of paying double for the 7800x3d. I can now max out my 165Hz refresh in most games on ultra, and the 1% lows are only 10fps below the current fps.
Best deal ever.
Thank you so much for this
Hi, when you benchmark these CPUs, have you disabled virtualization in the BIOS for both CPUs? I recently watched a video from Ancient Gameplays showing this can make up to a 10% difference in performance.
The difference in wattage is more than my file server pulls, with multiple docker containers running. Ryzen 5600g, rtx 4060 16gb, 64gb ram, multiple drives. It pulls 73W normally.
Just wanna see the 9800X3D
*Sounds of crashed benchmarks* IT'S OVER 9000!!!
@@Bismuth208 Imagine if the 9950X has VCache on both CCXs... That would be incredible
@@suchirghuwalewala Not likely. AMD already tried this; the issue is that it will increase latency, since you now have to communicate across the Infinity Fabric to get to the other CCD. It doesn't increase performance, just the build cost.
Just want see the 99999999999999999999999999999999999999999999999999999x3D
Won't happen. With the 9000 series they have fixed the reason why they needed the cache... in any case, the 9000 will whip Intel
4:08 nice pun hehe. Thanks for benching ACC, on behalf of all interested sim racers!
Mega work, Steve!
I would never have thought that I would go AMD for anything, but my last build was with a 7800X3D after much deliberation. Now it looks like there is no reason to look back at all; the right choice was made.
No matter how I look at it, the 7800x3d is rocking in those tests, yet I'm pretty sure there will always be someone who says Intel is better.
Intel is better
🤣 My "Intel is better" Devil's Canyon i7 4790k is only better because it doesn't have major issues like the 14th gen Intels do. Instead, my issues are 4 cores and 8 threads plus DDR3 1600.
Love you guys... you never prove me wrong 🤣
Intel is better for gaming because uhhhhhh ummm uhhhhhhhhhhhh
Intel is the best heater for winter
Thanks for the investigation. Much appreciated.
With a 14900k about 7 months now. With proper cooling and room temperature, loving this processor. Work/Gaming.
Intel's defaults seem to be 125W/188W.
So the Intel defaults are even more gimped than this.
That's their new defaults. The original default was 125W PL1 and 253W PL2. That's how it's been since the 12900K if I recall correctly. That's going to hurt Intel when it comes to productivity.
@@vigilant_1934 Agreed, but new/old doesn't matter. If Intel makes 125/188 their current defaults, it would mean some CPUs are not even stable at 253 watts.
I hope they learned their lesson. Just because you can shove a bunch of power down a CPU's throat to try and stay competitive, it's not worth it if the chip degrades so fast that it eats itself alive in a matter of months.
Really looking forward to this one!
Just bought the 7800x3D last week during the gaming sale to upgrade from my old 6700k, and was comparing it against the newer Intel lineup. So glad I made that decision. Crazy how it performs better, is more efficient, and is a better deal at nearly $200 less than the Intel CPU.
Why am I even looking at this, since I still have a ryzen 1600af?! 😅
Window shopping.
Get a 5800X3D and call it a day for another generation or two, at least
Because it's interesting?
@@shootinputin6332 1600af -> 5800x3d is the most chad single upgrade without changing sockets I've ever seen
@@shootinputin6332 The 5700X3D is often the better deal
Small suggestion: you could show the names of the parts you used for benchmarking on screen, not just read them out. It can get a bit confusing with all the numbers and such. Great video btw!
Eager to see your reporting on the 7950X3D because I'm a gamer and use various applications that would take advantage of all the cores of that CPU.
Did they resolve the 7800x3d frame stutter issue that some people experience? Was it just the fTPM?
I mean, a 5800X3D and 5700X3D comparison against the 14900K is also a thing we're waiting for 😂
Well, that just brought the Intel 14900K back to 12900K performance.
As you said "What a cluster..."
It's going to be perfect for intel O.o. They can now claim their next gen is beating the 14900k by 15% or something... which will then tie the 7800x3d
@@WeyermannX Hahahah
Are u guys always gonna benchmark with the baseline profile enabled moving forward? Makes sense
Built my new rig last year with the 7800x3d, - this chip absolutely flies!
Lot of people saying not to buy Intel. Even Digital Foundry ditched Intel and went AMD
Some of the cheaper 12th-gen options make for a very compelling budget build choice. But since Intel got stuck on their old, inferior process node while AMD used cutting-edge TSMC processes, none of Intel's current high-end chips can compete.
The only Intel shill channel I see these days is Hardware Canucks.
@@friedzombie4 Why do you watch a channel that you claim is Intel shill ?
@@friedzombie4 Hardware Canucks are reaching UserBenchmark levels of idiocy... Their AIO vs. air cooler test had the noise level set to 38.5dB - that's hair-dryer-blasting-at-your-head noise - then concluded air cooling is superior. On an open bench, btw... jfc, can you get any dumber? So yes, if you like working next to a jet engine, it doesn't matter how you cool your CPU, it's all pretty much the same.
@@DragonOfTheMortalKombat I don't, I watched one of their newest videos and immediately thumbed it down and removed it from my watch history lol
7800X3D owners vindicated for the 2748274826th time
Thank you, that was extremely well done.
wish you guys also included frame-time and frame-pacing graphs, which are very important for measuring the actual smoothness the user experiences while playing, apart from the 1% lows, which are also very important for user experience
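For context, the 1% lows shown in reviews are usually derived from the same frame-time data this commenter wants plotted. A quick sketch (my own illustration, not HUB's actual pipeline) of how average FPS and 1% lows fall out of a frame-time log:

```python
# Illustrative sketch of deriving FPS metrics from a frame-time capture
# (e.g. a PresentMon/CapFrameX log). Times are in milliseconds; the
# sample trace below is made up.

def average_fps(frame_times_ms):
    """Overall average frame rate."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def low_fps(frame_times_ms, pct=1.0):
    """Average FPS of the slowest `pct` percent of frames (the '1% low')."""
    n = max(1, int(len(frame_times_ms) * pct / 100))
    worst = sorted(frame_times_ms, reverse=True)[:n]  # longest frame times
    return 1000.0 * n / sum(worst)

# 99 smooth 10 ms frames plus one 50 ms stutter: the average is still
# about 96 FPS, but the 1% low is only 20 FPS, exactly the kind of
# stutter a bare average hides.
trace = [10.0] * 99 + [50.0]
```

This is why 1% lows (and full frame-time plots) matter: a single hitch barely moves the average but dominates the perceived smoothness.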
let's go! my office heater is an intel
Another video about the x3d beating intel in gaming.....very cool
should there be a CPU temperature comparison while running the games? i.e. profile on vs. off, vs. the 7800X3D, etc. It would make it easier to understand whether the profile is efficient or not
Is PBO enabled on the 7800X3D? Boost limit at +200? Undervolt/curve optimizer?
So, will all Intel CPU reviews going forward be using the baseline profile?
It's a good question. I think they probably should, but then again XMP is also technically overclocking, and almost everybody uses XMP for their CPU benchmarks, so it's not a given to use the baseline profile.
Anyone looking at the CPU reviews will see the old overclocked benchmarks; most reviewers won't do content like this, which is why we are so glad Hardware Unboxed is here. I'm going to assume Gamers Nexus will also retest, but I'd be surprised if anyone else invested so much time in these videos. I think I heard Steve say the reviews have over 80 hours of testing involved.
they 100% should. It's a new XMP-like thing that most people will never change, and people will play at 188W without even knowing, so reviews should use that
On the 14900K/KS, probably. It really depends on whether Intel pressures board manufacturers to force-enable it for CPUs below those. That being said, some reviewers have always enforced Intel's recommended PL1/PL2 limits in their reviews, so those performance numbers are already out there. It's the 188W limit that is the most surprising.
@@exscape Dragon's Dogma 2 is poorly optimised, like in a bad way. Also, you have to be mindful of what games you use to benchmark. Some games will be heavily skewed toward or in favor of Intel, to the point that they're intentionally coded to perform better on Intel. This is in part due to Intel bribery.
Waiting for a certain website to update after this. 😂
_THIS_ will not happen ;)
Is there, or will there be, a performance comparison at different power limits (65W, 95W, and 125W, with PL1=PL2) vs. the different baseline profiles? Just to know which would be the ideal "baseline profile".
Will next-gen Intel CPUs' day-one reviews be tested with the baseline profile?
What a mess for Intel... From 320W to 188W to protect those factory-OCed CPUs from being killed. Many people who know something about current and electronics suspected that those initial settings would kill CPUs within a few months, but Intel needed a few wins in benchmarks to sell them, or else they would be DOA. They preferred to kill them later on.
~40W when playing @5120x1440 and ~70W during shader compilation. 7800X3D is an efficient beast.
Could a polished, overclocked 7700X with a direct-die cooler be added to a test with the 7950X3D? It would be interesting to see the OC results now that the platform has improved stability.
Good review.
Switched from 13900K to 7800x3D...gaming performance is amazing! General performance is great as well for only 8 cores. It will hold me over until 9950X comes out which is an easy upgrade for me now.
I have a Ryzen 7 7840U laptop (8 cores, 16MB L3, 3.3GHz base, 5.1GHz boost) configured for 45W and a 7950X3D workstation (software development and gaming). Unless you have a very specific need for more than 8 cores, don't bother. My laptop is practically instant in everything (Samsung 980 NVMe) you'd throw at a laptop. 200-300MB spreadsheets update as fast as I can gather my thoughts about what to type in next. Large MatLab graphs render bloody near instantly. Large matrix solutions seem to be limited more by storage I/O than CPU power. It's only when I compile large projects that I can tell a difference between 8 laptop cores and 16 desktop cores.
Why did you choose 13900k for gaming in the first place?
Do you stream, edit videos or something?
How is the difference in speed in daily tasks (webbrowser, youtube, mail, booting up) and most importantly, how is the difference in FPS stability (1% lows, fps drops, microstutters) in comparison to the 13900k?
I run a 13900K on a BIOS-updated (for 13th/14th gen) Gigabyte Z690 with DDR4. Nowhere have I seen anyone test the 13900K/14900K on this older platform; is this perhaps relevant? I have had no issues at all, and although I admit to not being a serious gamer, I regularly run Handbrake for up to 2 hours at a time and the CPU never misses a heartbeat.
how are the results when you have multiple monitors plugged in while watching youtube, listening to spotify and streaming too?
love my 7800x3d, Best value gaming chip i've owned since the i5 2500k
And yet, they still put this power hungry CPU in the Corsair One i500.
It boggles the mind, just how cheap are Intel selling these...?
Product development takes years. Corsair probably planned to build the i500 around the 14900k in 2021 and by the time it was clear how bad of a deal it is, they were probably in the final stages of the design process. Also, AMD doesn't have enough fab capacity to supply all of the Prebuilt PCs.
They have kickbacks, which come out of Intel's marketing budget and are how retailers make money: they keep almost zero margin and Intel marketing basically makes up the difference. In exchange, Intel gets most laptop spots and prime spots in stores; even wallpaper selections are controlled by Intel.
@@wfb.subtraktor311 Planning a PC doesn't take 3 years
@@Michael_Schumacher It does when you are a large company. You need to make contracts, ensure supply chain availability, build driver packages, write manuals, design and test components, and many other things. From initial planning to release it absolutely takes at least 30 months for products like this, and you need to know which components go into it and the thermal solution you need to cool them at least a year before product release. You cannot just change the core components a month or even a few months out from release.
@@wfb.subtraktor311 Except the 14900K as a Raptor Lake refresh wasn't even planned in 2021. Meteor Lake was what Intel wanted on desktop for 2023 so Corsair would've been planning for the 13900K or a Meteor Lake equivalent.
I find this video to be in-spec.
I remember when the Pentium D was considered hot and power hungry. Now we are using more than twice the power even when limited. Heck even my 9700K only used 95W.
what do you think about buildzoid's experimental fix?
and this is why Intel doesn't, or didn't, want their CPUs stable. They can't compete.
I really wish you guys would use flight simulator as a benchmark since it’s so demanding on CPUs. None of the big reviewers really bother to 😢
How would you benchmark it?
isn’t it hard on the cpu due to poor multi core support?
@@danzydan2479 Load a scenery flight over a big city; it always starts in the same place, with the same plane, the same weather, etc. Enable autopilot, fly straight, collect data. Pretty obvious.
@@MarginalSC DX12, man
with DLSS and the frame-gen mod I have 90+ fps over New York, for instance, on high and some ultra settings with photogrammetry, at 1080p. Damn good for a 5600X + 2060 Super
@@electrotrashmailbox At what altitude, and what are you flying in?
Great sponsor. Kryosheets are amazing 🤩
If I didn't already have an AM4 board and RAM, I would have gone with the 7800X3D. Instead I upgraded to a 5800X3D; it doesn't take much power, runs quiet, and does what I need it to do. I didn't used to care so much about power draw, but I've really started enjoying having a quiet system.