Core i9 14900K [Baseline Profile] vs Ryzen 7 7800X3D - Test in 10 Games
- Published May 27, 2024
- Ryzen 7 7800X3D vs Core i9 14900K [Intel Baseline Profile - 253W] l 1080p
Buy games at the best prices on gamivo.com - gvo.deals/TestingGames
Use coupon code and get discount - TG3
Ad - 0:00
Games :
CYBERPUNK 2077 - 0:06
Call of Duty: Warzone - 1:05
Hogwarts Legacy - 2:01 - gvo.deals/TG3HogwartsLegacy
The Witcher 3 - 2:58 - gvo.deals/TestingGamesWitcher
Horizon Forbidden West - 3:59
Red Dead Redemption 2 - 5:01 - gvo.deals/TestingGamesRDR2
Avatar Frontiers of Pandora - 6:12
Microsoft Flight Simulator - 7:17
Starfield - 8:19
Forza Horizon 5 - 9:17 - gvo.deals/TestingGamesForza5
System:
Windows 11
Core i9-14900K - bit.ly/3rTFhVy
ASUS ROG Strix Z790-E Gaming - bit.ly/3scEZpc
Ryzen 7 7800X3D - bit.ly/43e3VxW
MSI MPG X670E CARBON
G.SKILL Trident Z5 RGB 32GB DDR5 6000MHz - bit.ly/3XlBGdU
CPU Cooler - MSI MAG CORELIQUID C360 - bit.ly/3mOVgiy
GeForce RTX 4090 24GB - bit.ly/3CSaMCj
SSD - 2xSAMSUNG 970 EVO M.2 2280 1TB - bit.ly/2NmWeQe
Power Supply CORSAIR HX Series HX1200 1200W - bit.ly/3EZWtNj
Ryzen 7 7800X3D GOAT
Bro meatrides AMD products but also hates AMD products. This is what I would call the Schrödinger's consumer
@@xigaxhad3635 Looks to me like he just likes the better products and isn't a fanboy for one or the other.
@@greebuh check his comments, the guy just hates amd gpus
Yea
"That’s why he’s the MVP, that’s why he’s the GOAT!!😊"
AMD gives better performance at almost half the power draw lmao. Intel fanboys seething
also half the price
@@luixkjkk And that's before the more expensive supporting components needed by the 14900k.
there aren't any Intel fanboys here lmao
watch them come soon
Why would people become fanboys of a CPU manufacturer wtf
Intel's "E cores" 😂
Funny - 70W vs 140W, $450 vs $550
AMD with Ryzen made a fatality to Intel ; )
Guys celebrating 50W when it's possible to run the 14900K at 75W with the same performance, but they stay silent while the graphics card draws 350W. So much contradiction.
@@CarFlightGamesCFG Exactly. I use an i9-13900K drawing 80W in Warzone and 44-65W in campaign games. It's all about tuning Vcore and the baseline profile's turbo until you find the ideal voltage
@@warzonehighlights773 Who is going to bother with that? Most people don't want to tweak settings; they don't know how.
@@Angelo0chek That isn't my problem, and yes, some do
Which one? 😅
The 3D V-Cache thing they did was a very smart move when it comes to gaming
it would be crazy if they put this tech in the playstation 6 and the next xbox
idk if a console could cool it well, because 3D V-Cache chips run hotter than non-3D ones
How good is the Playstation 6 you got?
@@Der_Joghurt_Ohne_Ecke Hotter? It's cooler? I'm cooling my 7800X3D with a $40 air cooler and it doesn't even exceed 60C during gaming
There is a lower thermal limit on the X3D chips than on the non-X3D chips because the V-Cache is heat- and voltage-sensitive, not because they get hot
the thermal limit is 10C lower than on the non-X3D chips lol
Don't know where you got that info from, but it's the opposite; the 7800X3D at least runs cool af
@@Der_Joghurt_Ohne_Ecke I have a 30 buck cooler on my 7800x3d and it maxes out at 80 degrees under full load. around 70 in normal gaming. no problem to cool whatsoever
@@Deathscythe91 Cyberpunk: 61C on the 14900K, 63C on the 7800X3D. 75W vs 150W. It gets slightly hotter while drawing less than half the watts. That's all because of the 3D cache, so yes, it does get hotter because of the cache, but it's not so bad.
7800x3d is the best gaming cpu ever made lol
I'm glad that Intel is going through this. It means they will have to work on optimizing the architecture and on new architectures, not just raise clock speeds to sell lies of "new products" to people.
The one thing they really have to improve on, and AMD excels at, is PR
Kinda wish this would happen to Nvidia and even the playing field a bit.
Don't forget that 14900K socket is effectively dead. No future CPU can use it .
Now let's take a look back at the AM4 socket: an X370 motherboard from December 2017, initially launched for first-gen Ryzen, can still be upgraded to a Ryzen 7 5800X3D just by doing a BIOS update.
Can an Intel socket from 2017 do that? Nope, and neither can LGA 1700.
AM5 today can use a 7800X3D all the way up to a 10800X3D in the future
True, I have a 7800X3D and it's the best bang for the buck, and you can upgrade in the future
A few years ago every single Intel CPU generation had its own socket, and the messaging was that upgrading a CPU on the same socket wasn't foreseen: the motherboard and the CPU are one unit, and the socket should be optimized as much as possible for every change in a CPU. AMD borrowed Intel's first socket system, and copied their socket approach again when they rose.
Crazy, I got 2x the performance on RPCS3 and Yuzu after I upgraded to 7800X3D from 10700F and it never went beyond 80W
ryzen 7 7800x3d is the best gaming CPU ever made.
lol
the best gaming CPU ever made
The 7800X3D is GOD at just 70W, nothing more; for me, Intel's 14th generation has been a complete failure.
in Horizon and MFS the difference is simply huge
also rdr
Yes, but in several very demanding games such as Hogwarts Legacy, the i9 pulled ahead in averages, and in some cases in 1% lows (see Starfield and The Witcher 3). A processor is much more than the aggregate average FPS you get across multiple games. You have to consider:
1) Which processor has the better memory controller?
2) Which processor offers more system resources and can multi-task? Think several Chrome tabs plus Discord, etc.
3) Which processor offers better efficiency/performance? Performance in terms of system resources as well as FPS per watt
4) Which processor is right for YOU, and your workload? And more
Also, one doesn't have to use the baseline profile..... unless of course you have zero clue how to configure/build a PC
@@user-tc4tz8ww1z i mean starfield almost fully utilizing the 7800x3d at above 85%. Honestly its truly a gaming cpu and the best one right now
@@user-tc4tz8ww1z What about price of the 14900?
All your points from 1 to 4 are useless.
@@artyrkarpov7463 What about it? It has 24 physical cores, so it is more expensive. Is there an argument somewhere??
Half the cost, half the TDP, minimum hardware requirements and at the same time the best 3D performance!
Like Pentium 4 and Athlon 64 in the past...
AMD can make an ultimate comeback!
Yet Intel can live off AMD x86 royalty............. It's not fair.
AMD made an ultimate comeback with the release of Ryzen 1xxx in 2017
Intel note to self:
Never. *EVER* Mess with an AMD X3D CPU for gaming.
7800x3D same FPS with HALF the power of 14900k 🤣
More in some cases.
With much better 1% lows
And half the price
Makes sense, it's 3x the CPU of the 7800X3D: 8 cores vs 24 cores. If they had managed to make a CPU with triple the physical core count consume less power than a smaller CPU on a smaller node, it would be the biggest slap in the face to AMD: a one-gen-more-advanced node plus a smaller CPU producing more heat than an Intel CPU with triple the cores on a bigger node.
@@D3nsity_ If you disable the 16 E-cores, how much FPS does it gain, and how much less power does it consume? The answer: not much difference. Games usually don't need more than 6 cores, and if the Windows scheduler doesn't fail, the 14900K will always use the power-hungry P-cores.
consumes less to give more at a cheaper price!
You do realize that these two CPUs are in completely different classes?
@@JohnLeOpinion Which is why it's even funnier how bad 14th gen Intel is at games (or efficiency, or stability). There hasn't been a single consumer-grade CPU in recent history that outperformed its upper-tier competitors in gaming.
GAMING KING👑
Damn! Starfield almost fully utilizing the 7800x3d 🙀
The X3D chips tend to show much higher CPU utilization than the non-3D variants, which explains how they are able to perform better despite lower CPU clock speeds.
Are you still playing it?
50% on a 24 core, 32 thread CPU during gaming is also absolutely crazy.
Ryzen 7 vs i9, genius
Interesting video! I would like to know if we should expect a video showing the 4090 and 7800x3d test in the game Black Desert Online in 4K resolution?
half power with that level of performance? no doubt 7800x3d the winner here
Ryzen 9 7950X after hearing 13900K-14900K have issues with gaming and sometimes multitasking: “YES, My turn A$$Holes”
I think the 14900K is now evenly matched with the 7950X while costing less but worse power efficiency. (Was previously better but consumed a LOT more power)
@@auritro3903 so what is the move than? currently have a i9 9900k, should I get the Ryzen 7 7800X3D or i9 14th gen
@@senorbonbon 7800x3d easily
@@senorbonbon Even if you multitask a lot and you'd like to go Intel, do not buy 14th gen; go with the 13900K for that, since that isn't an unstable mess
What is the Intel Baseline Profile? Does that mean running on Intel default settings?
Sounds like a very unimportant feature if you're overclocking and have a K-series chip
The only reason to buy Intel is if you're still a fanboy from when they were on top. Power consumption matters these days, especially in Australia.
Funny how, when it comes to graphics cards, power draw doesn't seem to be a problem for people
@@evan-du3vk Intel's new CPUs are using more power than most GPUs. That's the problem
@@kingiument4627 I have a 13900K and at extreme settings it uses less than 200 watts. So I don't know which good graphics card uses less
@@evan-du3vk RTX 4070, RTX 4060, All other RTX cards below 3060, RX6600XT, RX7600, all GTX cards below the 1080 ti. Thats 90% of the GPU market right there
NASA uses intel so I’m using intel 🎉
JayzTwoCents just put out a video today detailing how Asus's current Intel Baseline bios has an incorrect AMP setting (its too low) and you have to manually increase the amps in BIOS from 280A to 400A in order to comply with Intel's recently published baseline chart, he shows how 280A decreases Cinebench score by over 3K points vs. 400A
Yes, with the success of the Ryzen 5800X3D, you see 3D V-Cache in the Ryzen 7000 series and future CPUs. Users must thank the 5800X3D.😊
I wanna ask: did you use the same cooler for benchmarking?
Nice to see your CPU and GPU are running cool, they should last a long time.
That's what I'm talking about! Thx!
The Ryzen 7 7800X3D is the clear winner, but the 1% lows in many games are not as stable as on the i9-14900K
Intel doesn’t have a baseline profile, they’ve said this themselves. They said it’s 253/253 and 400a in extreme mode and 320/320 400a for KS models in extreme mode. Both of which are considered stock and covered by warranty.
So how much current did you set? Because you’re nowhere near 253w yet your P cores are throttled back heavily.
Have you not been keeping up with the drama?
Several motherboard manufacturers have custom power profiles that are enabled by default, which provide basically unlimited power (4096 watts on the ASUS boards).
ASUS calls it Multicore Enhancement; can't remember what the other guys call it.
Intel has asked motherboard manufacturers to stop doing that and set the default to Intel's spec instead.
This is also a thing on AMD motherboards. In PBO you can set the processor power limits to CPU/AMD or Motherboard.
Setting it to Motherboard allows significantly higher volts and amps when boosting.
@@AluminumHaste Yes and Intel has publicly commented. They have explicitly stated that i9 CPUs DO NOT use Intel baseline. They instead recommend Intel Performance or Extreme modes. They offer a table which states the power limits users can use. Extreme Mode is a default setting, it is 253/253 400a for i9 K/KF and 320/320 400a for i9 KS.
Unfortunately many outlets reported and tested before Intel made their statement against the rumors and "leaks".
I've seen a lot of BS in my years but this is up there with the later Pentium 4 processors, the GTX 970 really having 3.5GB of VRAM, and Bulldozer.
Was expecting Intel's to be a lot less, but the 7800X3D is still the no-brainer for gaming.
If you have zero clue what you are doing, it may be the CPU for you; the 14900K is not idiot-proof
@@JohnLeOpinion The 14900K is a trash CPU. I had a 12700K, it was horrible. I'm so glad I upgraded to AM5
@@JusstyteN Ever had one? I have one and it's great. Overclocked, never a single problem... I can run BF 2042 and upscale movies in the background, while on Discord with my friends, and even stream if I want to.
I still remember the days when it was the other way around. Intel Core i7 CPUs were unmatched. It's interesting how things turned out.
Yeah, FX was terrible because programs never used more than 4 cores, but nowadays those chips perform pretty well against the i7s of their time
You've got to hand it to AMD. They made one helluva gaming CPU with the 7800x3D. Still loving my 5800x3D. These things are GOAT status.
All the tests in Full HD? I'm interested in higher resolutions
Then it would be a GPU test
Very interesting how these processors will behave with next-gen consoles in 2030+, when games will utilize about 28 threads
Apart from the obvious exceptions where the difference is massive, you're obviously going to have fun on either platform. But the issue I see is power consumption and AMD absolutely stomps here
why is vram usage so high when playing warzone?
Does burning problem still occur in AMD processors and motherboards ?
No, they fixed that a long time ago. The cause was people running 6000 MT/s RAM with EXPO, which pushed the EXPO voltage up to an unsafe level.
Also, this was worse on certain motherboards.
You have a problem with LLC or something on the Intel platform; there is no way the P-cores throttle to 4600MHz. I just applied your settings in Forbidden West and I get 232FPS with a 14900K on Intel defaults.
it's not just LLC, that and LLC really kicks in when the CPU is under heavy stress.
Every single tech person commenting on this issue has never set a 24/7 overclock, or if they have, they upgraded to the next gen parts before they degraded the CPU. These are not the guys who had an 8700k running for 5 years at 5.2GHz with 1.325v.
The LLC is feeding the CPU too much voltage on top of the voltage already being set too high.
The i9 suffers most in the 1% lows me thinks, and use much more power oc. Tnx for the comparison, mate! 😇🍑
excuse me. what's the monitoring software used?
MSI afterburner
For Warzone, why is the GPU power draw of the two systems so different at the same FPS?
I have an i5-12400F with a Gigabyte H610M mobo, currently using a GT 730, and I'm thinking of upgrading to an RTX 3060 or Arc A750. Here the RTX 3060 is 323 USD while the Arc is 240 USD. What do I do?
Why would you buy an RTX 3060 lol, go for a 4060 or, even better, a 6700 XT.
A750 is a really good buy if you know what you’re doing. It’ll take more time to set it up correctly but once you do it’ll perform great. If you have a little more cash to spend go for the A770 as it has 16gb. With this amount of vram, it’ll hold up for a while.
of course, take the 3060, ARC is not a gaming graphics card
of course, take the 3060, ARC is not a gaming graphics card
@@dohamakarov2720 why did you send it twice lol
Those are not Intel's baselines! The people at motherboard manufacturers who made those baselines are completely stupid, especially Gigabyte. They used a 1.7 mΩ loadline, which leads to absolutely insane operating voltages. Even Intel has called on motherboard manufacturers (especially Gigabyte) to remove that crap from the BIOS. Gigabyte already pulled the BIOS update that implements the Intel Baseline. ASUS also uses a baseline with some utterly insane settings. The people who programmed those BIOS settings at the motherboard manufacturers are stupid or completely incompetent.
Let's be honest here. Either processor is going to give you a fantastic gaming experience. I like both of them.
The 7800X3D is the best gaming CPU in the world, amazing
Correct me if I'm wrong, but does the 14900K use roughly double the power of the 7800X3D? Is that what it shows?
2-3 )))
lmao yes
Triple the cores and double the power consumption. 8 cores consuming half as much as 24 cores is the more concerning part.
@@D3nsity_ There are usually only about eight cores working in games, not all of them. So it's 8 vs 8.
@@Xawwis Red dead uses more than 8 cores. Lights up all 16cores/24threads of my 13700K. All the games shown here can use more than 8 cores.
Why is there such a big difference in the 7800X3D results between this video and your video from 6 months ago,
"Core i7 14700K vs. Ryzen 7 7800X3D - 10-game test"?
I always had this doubt:
In a real world where the player has discord, an open browser with several tabs, WhatsApp and perhaps a work app like Office, which of these processors would do better?
I would like to see a comparison like that.
Exactly this. The 8c/16t chip has nearly 2x the utilization, so in the real world it will 100% perform worse with tons of background apps open.
Who tf uses discord while playing story games?
OK, pay +50% of the price for what? So you can't use Discord and a browser on AMD, which add 1% more load to the CPU?
It's a good idea tbh
or like 3 screens
I wish AMD GPUs were as good as this, then NVIDIA's high prices and greed would not be seen.
How do you know they would be cheaper? The 7900 XTX draws 100 watts more than the 4080 Super and doesn't have DLSS or comparable RT. And in my country AMD is 10% cheaper, but I get 30-40% less. Is 10% really that big a difference for a card that offers less?
Amd uses LESS power COSTS LESS, and still decently outperforms the intel 😭😭
Their GPUs may be iffy but AMD is king right now when it comes to CPUs and APUs. The 7800x3d is a beautiful little piece of tech.
It's interesting to note that on the AMD side the CPU shows significantly higher utilization than the Intel CPU, which also corresponds to higher utilization of the 4090. Yet the Intel build gets the same or lower framerates with higher CPU clock speeds and higher power consumption. Were any voltage settings lowered in the BIOS on the 14900K to help with the thermal issues 14th gen has been suffering as of late? It just seems like the 7800X3D is doing more with less, while the 14900K is in "tryhard" mode. I'm amazed, since ever since these CPUs came to market the 14900K was the performance king.. clearly that was wrong.. or things have changed since.
AMD supremacy 💪🏻
And the funny thing is that between the cheaper 7700 and 7800x3D there is no difference in 2K and 4K.
How is the 7800x3D for emulation like pcsx2, rpcs3 and all that?
I finished most PS3 exclusives on RPCS3 locked at 30 FPS at 4K or 1440p; sometimes 60 FPS was possible but not always. PCSX2 is easy to emulate.
@@dankuk765 I see, thank you! I'm torn between the i5-14600K, i7-14700K and the 7800X3D. Can't decide which to buy, and I love playing on emulators and want the highest possible FPS in them...
It would be interesting to see Intel processors with 3D cache
and their power consumption )
@@user-ng6cq9qd7d For the extra 70 watts you'd pay a bit over 1000 rubles a year
Agreed, if you could afford a 14900, I think you'll have a spare 1000 rubles a year
And that assumes your 14900 runs no less than 5 hours a day
So the power-consumption issue isn't worth worrying about
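The yearly-cost arithmetic in the thread above is easy to sanity-check. A minimal sketch, assuming a placeholder tariff of about 8 RUB/kWh (the tariff is my assumption, not a figure from the video or the thread):

```python
# Sanity-check the "extra ~1000 rubles per year" estimate for a 70 W
# power-draw difference at 5 hours of gaming per day.
# The 8 RUB/kWh tariff is an assumed placeholder, not a quoted value.

def extra_energy_kwh(delta_watts: float, hours_per_day: float, days: int = 365) -> float:
    """Extra energy consumed per year, in kWh."""
    return delta_watts * hours_per_day * days / 1000.0

def extra_cost(delta_watts: float, hours_per_day: float, tariff_per_kwh: float) -> float:
    """Extra yearly cost in the tariff's currency."""
    return extra_energy_kwh(delta_watts, hours_per_day) * tariff_per_kwh

kwh = extra_energy_kwh(70, 5)      # 70 W difference, 5 h/day
cost = extra_cost(70, 5, 8.0)      # assumed ~8 RUB/kWh tariff
print(f"{kwh:.2f} kWh/year, ~{cost:.0f} RUB/year")  # 127.75 kWh/year, ~1022 RUB/year
```

At those assumptions the commenter's "a bit over 1000 rubles" checks out almost exactly.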
Nobody notice the AMDip at 48secs?
nobody notice the intel Dip at 1:27?
Don't worry guys, the Ultra i9 Super Turbo Mega Next Gen is on the way, and no, the upcoming ones won't be failed refreshes. 😮
10 P-Cores, 48 Trash-cores and 570W powerdraw?
AMD is the king of gaming. And that chip is over 1 year old now.
Only i see is 80W vs 150W
12700K / 13600k / 7700x / 7800x3D is now perfect gaming cpu for good price and power draw.
For gaming AMD is the best, but what about work performance, meaning compilation/shaders/rendering? Still the better choice?
That's why there are AMD Threadrippers.
@@Xawwis Sure, but not that monster 🤣 I mean a home workstation on a 7800X3D vs a 14900K, with gaming as an option
@@Boraboar You do realize that a Threadripper consumes exactly the same amount of power as a 14900KS? That's the true rival of the 14900K & KS for productivity & gaming at the same time
@@niezzayt3809 Yep. But I don't know if I can find a Threadripper for less than 500 bucks plus a cheap mobo. Also, a Threadripper is definitely not the Holy Grail for gaming.
@@Boraboar Don't compare the 14900K with the 7800X3D for production tasks or mixed workloads. You need the 7950X3D or the 7950X vs the 14900K.
The most insane price-to-performance CPU ever made
It's an interesting situation with those 13th and 14th gen i9s. The people buying them are, in my opinion, enthusiasts who know the i9s run hot. Did they seriously not think of undervolting them?
AMD: more efficient, same performance, cheaper, longer-lasting platform. Absolute W of a CPU. I use it daily and it fits all my needs; 8 cores is more than enough even for very heavy multitasking.
Should be a comparison view of DCS…in 4K VR…DCS maxed out will kill any CPU…
7800X3D literally the king right now.
Not quite sure why the Intel one looks that little bit better in HD than the Ryzen with the same graphics card, but you'd be silly not to go with the Ryzen.
A good pickup before winter. Perfect, especially overclocked.
Did anyone else spot in this video that the 14900K with the baseline profile = performance drop? 🙄
And that's the reason why, after many years, I moved to AMD and haven't regretted it. The 7800X3D is 325 euros now!!!!!
I wish my 7800x3d clocked that high. You got yourself a really nice chip. My chip sits around 4.7 Ghz. Looks like i got really unlucky on the silicon lottery.
First game, Cyberpunk: the 7800X3D is faster and uses 76W vs the Core i9-14900K's 146W in the baseline profile. Sic!
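The efficiency gap quoted in this comment (76 W vs 146 W at comparable frame rates) can be expressed as performance per watt. A minimal sketch; the FPS values below are illustrative assumptions for two chips at roughly equal frame rates, not readings from the video:

```python
# Rough performance-per-watt comparison using the wattages quoted in the
# comment above (76 W for the 7800X3D, 146 W for the 14900K).
# The FPS values are illustrative assumptions, not measured numbers.

def fps_per_watt(fps: float, watts: float) -> float:
    """Frames per second delivered per watt of CPU power."""
    return fps / watts

r7_eff = fps_per_watt(fps=150, watts=76)    # hypothetical 7800X3D figures
i9_eff = fps_per_watt(fps=145, watts=146)   # hypothetical 14900K figures

print(f"7800X3D: {r7_eff:.2f} FPS/W, 14900K: {i9_eff:.2f} FPS/W")
print(f"Power ratio: {146 / 76:.2f}x")  # ~1.92x the power draw
```

Under these assumptions the X3D chip delivers roughly twice the frames per watt, which matches the "half the power, same FPS" theme running through the comments.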
$580 was the initial release price for the 14900K, why are you still using that? The cheapest I've seen is like $530
Also worth considering that literally the cheapest motherboard is enough for the 7800X3D. Because of that, the price difference of the whole build is even bigger. And most of the money you save in that case is on the motherboard, not the processor.
Not bad amd! I’ll change my 10900f to 10900x3d in the future
50-68% of the power usage, while in the worst case 1% lower FPS numbers, but mostly a little to 10-15% more FPS. What is the question here? 😲
Im happy with my 7800x3d. To me amd had the best cpus and nvidia the best gpus thats the best combo atm imo
253W is not baseline at all. It's the new Extreme profile... baseline is 125/183W
Why is the Intel CPU's usage always as low as ~30%?
It has more threads and cores, that's why
Avatar is GPU limited.
RIP Intel, making a high-end CPU that can't beat the Ryzen 7: 250W for Intel against 120W, what a huge difference
Until you realize the 14600K, with some tweaks, has the same gaming performance as the 14900K. The problem for Intel is that games don't use more than 8 cores while the 14900K has more than 20, so in theory it's a lot more powerful, but its current limitations don't let it shine. The 14600K thereby proves even more efficient than the 14900K in power consumption while delivering comparable performance. That said, the i9-14900K will be a destroyer of any task that isn't gaming.
@@chien.nguyen. for that i agree with you
Intel New Bulldozer 2011
I was around for Bulldozer, and it was depressing..... but at least Bulldozer worked
Cash is king.
Cache is also king?
AMD 3D V-Cache is king.
You seem to be favoring Intel by testing only GPU-intensive games; test CPU-intensive games at 1080p, then it will be a true comparison.
Lisa Su is a genius.
Why warzone uses 24gb of vram?!
Looks like the R7 7800X3D is still the king, even while costing $200 less than the i9-14900K.
I'm an Intel fanboy and it's disappointing to see Intel falling little by little to AMD...
can you test with 7900 xtx
Test RDR2 in Saint Denis
The power draw should embarrass Intel. I have an i5-11400F and in demanding games it draws more than that Ryzen. The RTX 4090 is an absolute beast; both processors bottleneck it in some games.
I think you need to edit the CPU name; I don't think Intel will come that far with generations.
In fact, the 14900k is better for work because the 14600k, with minor tweaks, offers similar gaming performance to the 14900k while being cooler and consuming less power.
Basically no different.
It wouldn't be realistic to use an RTX 4090 at 1080p.
You know where the problem with Intel is, in my humble opinion? The efficiency cores. They already consume power, and when the processor goes full tilt on the performance cores, that's where you see the power draw. There's no optimization: the efficient cores are supposed to handle apps like Chrome and day-to-day stuff, not try to run a game alongside the performance cores. They're nice processors, but for now I prefer AMD over a technology that still isn't optimized. It would be good if it were well implemented, together with an NPU that manages the processor's optimization in future years.
is intel even patched?
Too bad there is no Counter Strike 2 benchmarked as it's CPU dependent
Ryzen has much lower power draw, almost half as much most of the time. But why are its temps mostly higher than the i9's?
The chiplet with the cores is several times smaller than a monolithic die, which makes it harder to pull heat out of it. AMD processors also have a thick heatspreader (IHS).
Why is AMD running hotter despite drawing way less power?
An unfortunate heatspreader design and the small size of the core chiplet.
A small core chiplet is harder to cool than the monolithic die of Intel CPUs or AMD's G-series processors.
The 3D V-Cache die sits directly on top of the Core Complex Die, and this adds another layer the heat from the CPU must travel through.
Clear W for the 7800X3D