Personally I don't game at all, so it was actually refreshing not to have to see a bunch of gaming benchmarks; instead the focus was mostly on productivity-type applications. I feel 99% of the sites put the main focus on gaming benchmarks... which I understand because many people do game... but it's good to have some sites and YouTube channels actually focus on productivity applications when reviews and overclocking work are done. Thanks especially for the information on the various memory speeds and how that relates to any performance gains.
@@nexttothemoon1 oh look, someone who doesn't spend their entire attention span gaming with a mature, coherent response. Agree that the balance is refreshing. Thank you evolved human.
guys a 12700k or 12900k has never looked so good for the money/performance.
Yeah, especially if you have an older PC and want to upgrade without breaking the bank.
12900k can be aircooled even.
For professional tasks you don't need an insane overclock. At a moderate 175 watts it is wonderful silicon.
Yo guys, try to find a 14600T or 14700T on the used market :) It's a monster at 50-60 W :)
A 5900x/5950x isn't a bad shout either, these days.
@@Alex-wg1mb True, I used to run a custom water cooling loop for my 12900K with an external 360 mm rad, but had to move countries and simplified it a bit, so I gave it a shot with a cheap-ish single tower cooler. I was very surprised. No overclocking headroom at all but no issue with gaming and streaming (CPU encode) at all.
I'm moving away from channels like J2C, GN, HUB and LTT to channels like this. I'm getting really tired of all the narrative-driven content and mudslinging they seem to show lately. The whole over-competitive AMD vs Intel fanboy drama is old and toxic. So I really appreciate the real, honest content without going down that particular sewer hole.
2 thumbs up
GN is such a fraud.
Legend status attained .
So, stick with my 14700K, thanks Ian!
Gz for the tests Ian!
Thanks for the vid m8, that's given me the info and confidence to buy from the 200 range. The heat problems in the last series really made me think twice, but no more.
Big thumbs up and I'll be having a look on the Overclockers site. PS: I'm really just a gamer that likes a bit of overhead clearance.
I've seen in the results that higher memory speed, even though it does little to help the average, makes a HUGE difference to the 0.1% lows.
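Since the 0.1% low figures come up a few times in this thread, here is a minimal sketch of how those numbers are typically derived from frame times; the frame-time values below are made up for illustration, not measurements from the video:

```python
# Minimal sketch of how "1% / 0.1% lows" are typically derived from frame times.
# The frame-time list is made-up illustrative data, not results from the video.
frametimes_ms = [8.3] * 990 + [25.0] * 9 + [60.0]  # mostly smooth, a few spikes

def percentile_low_fps(times_ms, fraction):
    """Average FPS of the slowest `fraction` of frames (0.01 = 1% lows)."""
    worst = sorted(times_ms, reverse=True)          # slowest frames first
    count = max(1, int(len(worst) * fraction))
    avg_ms = sum(worst[:count]) / count
    return 1000.0 / avg_ms

avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))
print(f"average FPS: {avg_fps:.0f}")                                       # ~118
print(f"1% lows:     {percentile_low_fps(frametimes_ms, 0.01):.0f} FPS")   # ~35
print(f"0.1% lows:   {percentile_low_fps(frametimes_ms, 0.001):.0f} FPS")  # ~17
# A handful of slow frames barely moves the average but tanks the 0.1% lows,
# which is where memory tuning tends to show up first.
```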
Just what i was looking for, thank you!
The reductions in power draw and heat are big steps here. This generation may not be as exciting compared to last gen, but after two more iterations of this CPU design I bet Intel becomes very appealing.
Hey Big Gob, great to see you doing what you do best. I was thinking of getting this CPU because I needed an upgrade from my 5800X for Blender. I mainly use GPU where possible, but Blender does a lot of single-core computation. Would you recommend it for this use case? I don't play games anymore, and AI doesn't interest me.
After 285k reviews, the 7800x3d looks even better for gaming.
When the 14600KF is half the price of the 7800X3D, you're braindead.
@@MrFilip121 it does, but not at current prices.
Bruh, even 5700X3D looks better at gaming.
@@michaelbrindley4363 The price is gonna drop back down after the 9800X3D. AMD has reduced the 7800X3D's stock, and literally everyone and their mothers have been buying the 7800X3D over the last 6-12 months, which skyrocketed the prices. Now that the 5800X3D got discontinued, the 7800X3D is gonna take its place after the 9800X3D release.
I have a 7800X3D and a 13900KS. The i9 wins in most everything, especially 1% lows, and stuttering is nonexistent.
I'm building an Arrow Lake system this week... I only play one old game about an hour a day, give or take: Unreal Tournament.
I don't need all those cores, so I'm doing a 245K on a 360 AIO and a MAG Z890 Tomahawk WiFi.
It would be great if you did a video on how to overclock the 285K to the max to get more performance out of it... Thanks for your work and this video :)
Do you happen to have an overclocking guide for the Intel 285K BIOS?
Nice tests, though it would be nice to get gaming benchmarks too. Obviously the 285K isn't one to get just for gaming, but it would be nice to see if OC does some good there!
Will CU-DIMM allow higher speeds with a fully populated board?
18:05 what's the music playing ? Please tell me :)
14900KS here and I'm dreading this, but I really dig 5.8 P / 5.4 E all-core at 50,000+ in Cinebench at 60°C. Insane!
Cinebench is all the rage these days with the gamers!!
You should be dreading your cpu frying itself.
@@Intelwinsbigly I've already been through it twice; I'm done with Raptor Lake.
Every time Intel releases a new generation, I keep thinking I'll upgrade my 12900K, but 13th and 14th gen were just incremental, so I didn't see the point. So I've been waiting on 15th gen, which is obviously called the 200 series now, and there's still no point in upgrading. I guess I could move to a 14900K or KS, as I could keep my Z690 motherboard, but with all the drama around 13th and 14th gen, that doesn't seem such an appealing prospect either. The way things are going, by the next time Intel makes something worth buying, I will have forgotten how to build a PC, lol.
It seems you totally ignore AMD exists
The 13th/14th gen E-cores are much better than Alder Lake's, Raptor Lake's memory controller is much stronger, and the P-cores clock much higher. With a decent AIO, 13th/14th gen easily does a 5.6-5.8 GHz all-core OC with temps nowhere near TjMax. Can't go wrong with 12th gen though; still rocking a 12900KF myself.
It's early; launch bugs and optimization issues are to be expected. Ryzen 9000X had the same problem during its launch, with poor Windows optimizations skewing the performance.
I'm planning on getting the 285K myself, just waiting for some promotions and sales to roll around. It's often the same with early adoption of any new technology, or of vehicles these days: QC isn't much of a thing anymore, so the buyer and their technician are the QC...
The amount of warranty repairs I've had done on my 2024 Freightliner Cascadia is ridiculous... Thankfully it's company equipment, lol, not my bill to deal with.
I'd give Arrow Lake a couple of months, and then check reviews and benchmarks again. Things could improve. Look how quickly Intel managed to improve the drivers for Arc GPUs, and XeSS is among the better AI frame generation technologies out there. Intel just has a bad habit of rushing its launches only to fix the problems later... 😅
I have to wait either way anyway... going from laptop back to desktop. The GPU I plan on getting, the RTX 5090, doesn't release till sometime in Q1/Q2 2025...
@@Demonoid1990 and they haven't even tapped into the AI applications yet. Thing literally just released and all the gamer heads are hating as if there's no use for a GPU otherwise.
@@fantasypolice I'm kinda undecided on my build for next year. I'll just be waiting for the RTX 5090 to hit the market first, and go from there. I'm kinda torn between either the 9950X3D (because of its newfound OCing potential) or the 285K (also because it shows good potential for OCing).
Over the past week or so I've gotten all my new furniture set up in my room, bought the Sony Bravia 9 75" TV for my setup, and am now just waiting for the last of the PC components to launch early next year.
I'm going 4K with this build, obviously on my new TV, so my pick of CPU isn't quite as critical as the GPU choice. But there are some games where Intel or AMD tend to pull away a bit at 4K, though the averages still typically show a 1-2% spread in the charts.
My only real annoyance with AMD is the hotspot location on the IHS being at the bottom edge... I plan on using a large air cooler I already purchased, which is compatible with Intel or AMD. But if I wanted to use an offset mounting bracket for AMD, I could run into clearance issues with the back of the GPU, depending on the PCIe slot location on the motherboard (depending on which motherboard I get).
Air cooling Intel CPUs is a little more straightforward, with the die/hotspot being pretty centered on the IHS.
One way or another I'll be forced to make some sort of compromise... Hopefully Intel will fix/patch some of the issues with Arrow Lake by the end of the year. That should give a better comparison when the 9950X3D launches next year.
Competition is great, until you run into a situation where you have one of two choices to make, and regardless of the choice you'll be mildly compromised one way or the other..
Very similar to the Zen 5 launch, in the sense that the gaming performance is similar to the previous gen and there are gains in multithreading and efficiency, particularly in the top-of-the-line CPUs.
How do I buy the Nexalus waterblock? Like your video btw.
Thank you! Currently we only supply these with our prebuilt 8PACK PC systems.
What about memory timings? You only provide the speeds?
I think in 2024 the 13700K is the safest top CPU option from Intel. The microcode issues have not been encountered on this CPU.
Check the Warframe crash report...
The microcode issues didn't happen much at all, on any SKU, but the 13700K was listed as "affected". Probably less than 1% of CPUs were damaged across the 13700, 13900, 14700, and 14900 SKU range, so it really was a non-issue. Intel replaces them for the tiny number of people affected, done.
The microcode fixes are already out anyway, wouldn't affect anyone anymore building now.
@@CyberneticArgumentCreator yes
I've built a beast with a 14700k and a 4070 super
@@CyberneticArgumentCreator Problem is that those blind amd fanboys are not gonna listen 😂
I would focus on getting the RAM latency as low as possible, the same way as overclocking the 3800XT CPU: IF and RAM timings. Maybe claw back some of that gaming performance.
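On the latency point, a rough back-of-the-envelope sketch of the "true latency" math; the example kits below are arbitrary, not the ones used in the video:

```python
# Rough sketch of the "true latency" math behind chasing low RAM latency.
# The kits below are arbitrary examples, not the ones tested in the video.
def first_word_latency_ns(transfer_rate_mts, cas_latency):
    """CAS latency in cycles converted to nanoseconds.
    The DRAM clock in MHz is half the transfer rate in MT/s."""
    dram_clock_mhz = transfer_rate_mts / 2
    return cas_latency / dram_clock_mhz * 1000.0  # cycles / MHz -> ns

for mts, cl in [(6000, 30), (7200, 34), (8000, 40)]:
    print(f"DDR5-{mts} CL{cl}: {first_word_latency_ns(mts, cl):.1f} ns")
# DDR5-6000 CL30: 10.0 ns
# DDR5-7200 CL34:  9.4 ns
# DDR5-8000 CL40: 10.0 ns
# Raw MT/s only buys latency if the timings scale with it, hence tuning the
# primaries and IF/FCLK rather than just chasing frequency.
```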
Thanks for the content and review. I appreciate you actually including some benchmark scores for the overclocked CPU; most reviews have not bothered to do so yet. As it turns out, the 285K cannot match a 9950X when overclocked in almost any benchmark, though there are a few exceptions.
Unfortunately you got a lot factually incorrect. When memory is pushed over 8,000 MT/s it is absolutely not running synchronously with the memory controller. There is definitely a divider; all you have to do is look in the appropriate software and it will display the actual memory controller speed, which is not one to one.
Additionally, it's cute how you completely gloss over in-game performance for actual games and instead rely solely on synthetics to reach your gaming conclusions. Had you taken the time to benchmark actual games, you would have found that performance is severely degraded in some games, slightly worse in most, and improved over the previous generation in a select few.
You may want to do a reality check against everyone else; you are the only person who's reached these conclusions.
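For anyone following the divider argument, here is a small sketch of the arithmetic in question; the gear ratios are the usual Intel DDR5 convention, and which gear a given BIOS actually selects above 8,000 MT/s is an assumption here, not something verified in this thread:

```python
# Small sketch of the divider ("gear") arithmetic being argued about: the
# memory controller (IMC) clock vs the DRAM clock at a given transfer rate.
# Gear 2 = IMC at half the DRAM clock, Gear 4 = a quarter (the usual Intel
# DDR5 convention). Which gear a board/BIOS picks above 8000 MT/s is an
# assumption here, not something measured in this thread.
def clocks_mhz(transfer_rate_mts, gear):
    dram_clock = transfer_rate_mts / 2  # DDR: two transfers per DRAM clock
    imc_clock = dram_clock / gear       # divider between DRAM and controller
    return dram_clock, imc_clock

for mts, gear in [(8000, 2), (8800, 2), (8800, 4)]:
    dram, imc = clocks_mhz(mts, gear)
    print(f"DDR5-{mts} in Gear {gear}: DRAM {dram:.0f} MHz, IMC {imc:.0f} MHz")
# Comparing the IMC clock reported by monitoring software or the BIOS against
# the DRAM clock is how you confirm whether the ratio really is 1:1.
```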
Afraid to put the 7800X3D in the charts? :)
@@Exilium2090 If it can match the 14900K's performance, then it's already better than the 7800X3D.
@@Exilium2090 The 7800X3D doesn't have a chance in normal applications... X3D is for games only.
Thx Ian!
Wow, I get a 40,002 multi-core score in Cinebench on my 14900K with no OC. Intel best get this stuff sorted ASAP or AMD are gonna start pushing prices.
waiting for the 9800X3D ❤ if the 7800X3D is already King of Gamers ....
You and Jay overclocked to the max and Intel still cannot match 7800X3D from how long ago?
It doesn't seem right to me to let the 14900K consume 350 W; it's obvious that it will perform better that way. The limit imposed by Intel is 253 W, and that should be the one used when you do benchmarks, not the one decided by the motherboard.
8Pack, no hyper threading
To sum it up, this Intel Core Ultra rebranding is meant to be Intel's Ryzen 1000: a step forward in architecture, but teething issues for the first generation compared to their previous one.
But for me this isn't really enough (even excusing the lackluster performance), considering this is 3 nm as well xd. They get a node advantage vs their old 10 nm and do significantly worse.
8PACK FOR THE WIN !
Thanks for benchmarks !
God bless you ! Jesus Christ of Nazareth loves you and your loved ones !
Amen
i'd be more impressed with halving the price of all pc hardware.
F'n outrageous.
Still using my X99 platform.
Are you a hobo? A 12600K costs $150, a 7500F costs $120.
There's no more excuse about bad prices.
@@rulik007 The motherboard prices (for the good ones at least) are crazy though
I want an Intel Arrow Lake system for science. Alas my wallet said a 9800X3D is more than enough for an upgrade for now.
I am more interested in an overclock on the 245... that should be a gaming beast. My 12600K could hit 14900K gaming levels... all for $100 USD.
Lol no 😂
The graphic at 10:18 is about latency, I guess. This stuff probably really needs low latency.
I'll keep my 13900KS, thanks tho!
The i7-12700K was going for $160 on Amazon.
4000 MHz with 4 DIMMs?
Dang Intel and AMD really need to do some work on their IMCs
Same with DDR4; it's down to the motherboard. With 4 DIMMs, two of the DIMMs have longer traces on the motherboard, which means lower frequencies and worse timings for those two. The whole system can only run as fast as the weakest link, so the faster two DIMM slots have to reduce speed to match the slower two.
I'm having 11th gen flashbacks.
@@rozzbourn3653 11th gen was actually a better improvement IPC-wise than you think. The generations before only added more cores while keeping the same IPC.
@@Met1900 Yes, but having the 11900K come with 2 fewer cores than the 10900K wasn't really a good idea. The sheer reason people bought it was that AMD was still building up their reputation.
Intel had a whole lot of 14 nm silicon to sell that gen, but they had to throw something new on it, so they took their new 10 nm architecture and backported it. They weren't committing to that design long term; it was a cash grab.
The question is will it degrade if you overclock???
We'll find out in 6 months 😆
Just disable the garbage Thermal Velocity Boost and lock the cores.
Dear Americans. Sorry.
Where is the real OC?
Dude, it's not the same gaming performance as 14th gen. It's slower. I get it, you want to sell new hardware. But do it ethically by not lying.
5:52 - most do, buddy.
Overclockers UK: SHINTEL isn't using a new core layout design... it's using the TSMC-manufactured TILE BASED design that AMD developed and has been using since the Ryzen 1000 series from April 2017. That TILE BASED design that Shintel ridiculed as "glued together" back in 2018.
SHINTEL is also cheating on PWR consumption by secretly requesting additional PWR through the 24-PIN connector.
SHINTEL ditched Hyper-Threading due to high PWR consumption, too-high TEMPS and numerous SECURITY issues via HT.
Ditching Zen 3 for Alder Lake was the best day of my life.
@@mikhailshioh Yeah, I did the same 🤣
I ditched the 5800X and got a 13700K, and every sh!t problem just disappeared 😂
Who cares, AMD Ryzen™ 9 9950X3D is going to kill intel for good!
@@7HH I really hope not, because it would be bad for us. We need competition between the two companies; we do not need a monopoly like Nvidia.
@@Tech-Tide2387 exactly
Increased latency, worse gaming and overclocking performance are all reasons to avoid this chip like the plague.
One question for AMD fanboys 😂
Why do AMD's non-3D CPUs suck compared to Intel's competing CPUs? 😂😂
Using overclocking to polish a turd doesn't change it from being a turd.....
Biggest launch flop of the decade, what a garbage cpu
Yeah, just about. Close between this and Rocket Lake 😂.
I care about AI. If that NPU doesn't make NPCs in next gen games talk like ChatGPT, I'm going to be super disappointed.
@@dataterminal It will, eventually. It will probably take a few generations to catch on, whether through lack of support from game devs for a few titles or just lack of hardware capability, like ray tracing when it first came out.
Saying it has the same gaming performance as the 14900K is very misleading, and I get that you need to say that to sell your bundles. But there is no reason to buy this platform when 12th gen Intel beats it in gaming.
Intel Boy
What a pathetic CPU
What a pathetic gamer.
Those *1080p* benchmarks on a *4090* with *DLSS Quality* are just pointless. Will you buy a 4090 just to game in CPU-bound scenarios? Obviously not, cuz if I buy an $1,800 GPU I will probably play at 4K res, and a 14600K will be enough for that. And even if you consider a low-clock 8-core CPU (*9800X3D*), the difference will be too small 😂
amd the daddy
Oh dear, these seem pretty lame and DOA. Was interested in buying but going to avoid.
I have the Intel 265K; it's shit, don't buy it.
Not a good CPU
I regret buying 11th gen (12th gen wasn't released yet), because the gap between 11th and 12th gen is huge. After that, the only change Intel made was a slightly better version of the 12900K, adding 100 watts of heat instead of performance, and calling it a new gen. The Core Ultra series should have been the 14th gen. Now it's late.
The lack of competition will make AMD prices go "Nvidia Mode" by 2026
11th gen is really good. IMO the problem is Windows 11. I feel like my 11900K system was less weird than my 12900K is now, but on the other hand the Gen 5 experience is better, downloading is generally faster. But I think it's really W11 that is shit.