Over the course of testing, we were in contact with a number of reviewers who asked us about stability with 24H2. With our setup using the firmware / BIOS listed at 3:12 of the video, we had one of the smoothest experiences with a pre-launch platform we've ever encountered. It seems like other folks (many who were using MSI motherboards) experienced nightmares. Guess that's another reason to recommend you skip Arrow Lake for now. :(
I would say avoid the ARL + MSI mobo combo, instead of ARL entirely. With Asus / Gigabyte mobos everything seems to be rock solid.
Thanks for including 720p low and CB23, it is getting pretty rare so I really appreciate it!
Who runs 720p in 2024? 1080p and 1440p should be the standard in 2024...
@@mgb2012 it is for testing the CPU without being limited by current GPUs.
Basically every CPU is cooked until the 9800X3D, then every CPU will be even more cooked
Watching this video as a 7800X3D owner made me feel great
Well, the 7800X3D could be a better buy from a price / performance standpoint, depending on how $$$ the 9800X3D is
How is every CPU cooked? From a gaming-only aspect, maybe, but most CPUs aren't sold for gaming.
The 285 is pretty boring, but the 245 walks its price segment. The 9700X looks like it's in trouble in the mid-level workstation segment.
Well, if you put too much voltage into these CPUs they will literally be cooked
@HardwareCanucks I'm nearly out of 7800X3Ds at the price I am willing to pay, and I know the 9800X3D is going to be $549 USD. Thankfully, most don't need it for gaming only builds. 12400F, 7500F, and 5700X3D upgrades for now are holding. We need a new budget king for Ryzen 9000 soon, with the other nodes drying up.
Love the charts, very easy to understand, thank you BTW
Thanks for the compliment. We'd love more feedback about them since jamming this amount of information into a single screen becomes pretty risky.
@@HardwareCanucks Your videos are so well edited & charts are fantastic!
I would love to see the 5800X3D in the gaming charts, maybe you can include it in the upcoming reviews for the 9800X3D.
It will be included in the X3D review. Promise.
A video showcasing 7800X3D supremacy in gaming. Greatest gaming CPU ever made, hands down the GOAT.
Nah, the OG 5800X3D should probably still hold that title, remaining super relevant today 😅
@@tyoung319 Agreed, 5800x3d is the goat
PC Builders: I've got 99 problems, using Intel ain't one... 🤣🤣🤣
Thanks for the overview, I'm liking the new style of the low and high charts, I think it's a great way to show a reality check on GPU bottlenecks all at once instead of potential buyers focusing too heavily on pure framerates. I mentioned on another video that I'm using a Gigabyte B350 motherboard that has done well for me over multiple Ryzen generations, from 1700X to 5900X, and I was hoping to give Intel a chance this next round. But surprisingly to me, and proven in your charts, the 9700X and 9900X suddenly look all the more attractive against Arrow Lake for those of us wanting both solid gaming and productivity platforms in one. I'm really curious to see a 5900X vs 9900X vs 285K breakdown. As great as the X3D chips are, I just don't think they're for me with the lackluster productivity results...unless the 9800X3D really changes things.
Great video, fair and on point - ARL wins some and loses some. Let's wait and see what's next - could take two more years till the unified core is ready.
Just shows how good the 7800X3D is for gaming, at insanely low power levels as well. With good cooling, undervolting and PBO it's even better. For simulator games it's a must.
I am excited for future iterations of Intel. This might be their Zen moment, well half of it
Whoever created 3D V-Cache probably gets an instant promotion and paycheck increase at AMD every time Intel releases a new CPU generation
People just don't realize how revolutionary 3D V-Cache is :)
Can you expand on the comment about Resolve not using Quicksync for your files? What type of files? I was under the impression Quicksync covers just about every major video type out there? Is your Resolve set up properly to utilize Quicksync?
Sure. Resolve doesn't use the Quick Sync encode / decode module for rendering outputs when it detects a dGPU. It's EITHER the dGPU or iGPU media engine, not both.
Imagine using N3B and still losing badly to the N5 7800X3D. That's 1.5 nodes of advantage for Intel.
Congrats Intel you won HC's "Dam, OK..." Award. 👏
Intel flopped with the Core Ultra processors; the 7800X3D and 5800X3D are still killing it. Maybe next generation, Intel.
I think this sets them up for something a LOT better in the future but they need to take advantage of their Arrow Lake lessons before moving on to a completely different architecture.
Did you consider the power draw from the 24-pin ATX?
We consider CPU Package Power as reported by HWInfo, which takes its reading directly from the CPU voltage rail(s)
I can only see either hardcore Intel fans / AMD haters, or 100% productivity builders, considering these chips. I admire what Intel did, but these new chips pack about the same excitement levels as a minivan. A good workhorse, but very boring performance-wise.
While we save one fan on a water cooler, we spend much more on a new motherboard. Good job, Intel, for a simple update. What was wrong with LGA1700, I wonder?
Guess I'm just going to keep waiting for the 9800x3d.
Hello, content idea... challenge all your rival toobers to a Fusion 360 learn-and-3D-print challenge, the end goal being a 3D printed mini-ITX case.
Finally... Let's see what Intel's got
As a gamer, if I were to buy this CPU I would overclock until an avg temp of 70-75°C; below 60°C is pointless. Der8auer has better stats to compare against in his video but fewer games compared. Wish HW Canucks could do the same; it's better for users to have results from multiple sources.
Completely understand but overclocking does very little to nothing for in-game framerates since by nature most games don't require the CPU to be running near its voltage / power cap.
I would say that clearly AMD has the better CPUs for gaming, but this does look promising for Intel's future if they can further improve it.
I am curious how the 5800x3d fares. Currently I am rocking the 7800x3d.
@@Old_Ladies Looks good, they just need to fix the gaming performance
Excuse me……..you turned off PBO? Why? Is that not how AMD CPUs operate?
No, it isn't. PBO increases operational power much like Intel Extreme Mode.
So what are my options then, as an uneducated no-tech-savvy person like myself, who wants the latest and greatest? Dip my toe into AMD for the first time and wait till January for the 9950X3D?
Yep
Arrow Lake is day one. Performance will increase.
Yes……..your power consumption is incorrect. It uses power at the 24-pin also (60 to 70 watts additional)
How are you assuming that? HWInfo reports CPU Package Power directly from the CPU rail(s)
@@HardwareCanucks Only on the CPU power plug(s), not off the 24-pin. Der8auer pointed this out.
Intel hits bottom 😂
Could I ask what the timings were on the kits of RAM on each platform? 2:56
The only winner is the Ultra 5: faster than both the 14600K and 9700X in everything other than gaming, without massive power draw
@@MHawk-px7dm 265K comes in strong also, not as strong as the 245K though. Really so far only the 285K is lacking which imo is fine, less than 1% of people end up with those SKUs.
@@tyoung319 still beats the 9900X. If you do anything other than gaming, the 285K is a great chip
Probably the only "positive" review at the moment.
So the review consensus for Zen 5 was that it's "not for gamers", and yet... the cheapest one is better for gaming than the best ARL that's more than 2x the price. Wow.
Well at launch Zen 5 was a dog in gaming. That's changed over the last few months, especially with 24H2 and to a lesser extent the new AGESA.
I would have loved to see how the 1440p performance was. Many 1440p/144 Hz monitors are coming way down in price, to around $200-$250 or less. I was planning on using the Ultra 5 245K for gaming/light Plex usage, but now it's almost November and after seeing two reviews, I figured I might as well see what AMD's response will be. Thank you for the video! :)
Yeah, I understand that, but you have to remember as resolution increases, so too does the potential for GPU bottlenecking. In a lot of games, that results in a flatlining of results since CPU horsepower doesn't really matter as much; the processor tends to "wait around" for information from the graphics card.
If you want any CPU for gaming, the only real choice is the x3D chips. 5700/5800/7600/7800 x3D
@@HardwareCanucks Thanks for taking the time to respond and explain that to me. Everything helps!
That's a GPU benchmark you're after, not a CPU review.
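To picture the "wait around" effect described a few replies up, here's a toy model (all frame times hypothetical, not from the video) where each frame simply takes as long as the slower of the CPU and GPU stages:

```python
# Toy model of CPU/GPU bottlenecking. All frame times are hypothetical.
# Each frame takes as long as the slower of the two stages, so once the
# GPU stage dominates, a faster CPU stops moving the FPS number at all.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 4.0  # hypothetical: the CPU preps a frame in 4 ms (250 FPS ceiling)
for res, gpu_ms in [("720p", 2.5), ("1080p", 5.0), ("1440p", 8.0), ("4K", 16.0)]:
    print(f"{res}: {fps(cpu_ms, gpu_ms):.0f} FPS")
# 720p: 250 FPS (CPU-bound); 1080p: 200; 1440p: 125; 4K: 62 (GPU-bound).
```

In this sketch everything above 720p is GPU-limited, which is exactly why higher-resolution charts tend to compress every CPU into nearly the same bar.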
Now Intel chose to go all-in with more power to stay in the market against AMD
Guess I'm riding with my 14900ks for a while. ✌️🇺🇲
Make sure you use the new microcode though.
@@HardwareCanucks Nah, lowers performance by 2% ✌️🇺🇲
@@bm373 So you won't be riding it for long, eh?
@@riba2233 You don't need the microcode update if you manually tuned the whole system. It's only for the stock-settings people.
@@club4ghz ofc, but in that case he wouldn't even talk about a performance difference.
The 9800X3D has already won
There's no fucking way they're that bad for gaming.
Maybe they need some Windows and BIOS updates
They were never going to be great for gaming. If you use a PC just for gaming, just get any X3D CPU
@@Aquaquake better for productivity, worse for gaming
Way dude!
To be honest, Arrow Lake seems more impressive than Zen 5 when looking at power efficiency in productivity. It makes me wonder how good it would be on a lower power limit. Still, I wouldn't get that and would instead opt for Zen 4 for how cheap it is now. For gaming, the R7 5700X3D is a no-brainer.
It's shit; it loses in most tasks and costs more. The 7950X is a better choice than both, considering it's 20% cheaper with similar performance.
Intel needs to drop the price, otherwise it's another waste of sand like the 11th gen chips.
How efficient is a one-and-done LGA 1851 socket? Really wasteful
Soooo... is it official??? Is Intel officially a second brand in everything?
Well they certainly aren't the Budget Brand. I don't think their ego could handle pricing these chips at the cost they should be...
So I've been using my overclocked first-gen i5 760 till now, paired with a 1660 Super. Should I upgrade to this Core Ultra 5 or to 14th gen? I'm a casual gamer and browser user... can someone help?
Go Ultra 5, but I would wait for prices to drop. Launch prices are always a pain and the holidays are coming.
What about an Intel Core i3 equivalent??
Anything over 90W is just wasteful.
Luckily, neither Ryzen 9000 nor Arrow Lake generally go above that while gaming.
Meanwhile buying a 3090 Ti, 4090 and then 5090 hahaha
@djayjp Only an integrated GPU for you then.
@@raulitrump460 Lol I mean CPU
@@djayjp Higher core counts are always going to draw more power in gaming for anyone who doesn't use all cores, because those extra cores use power even at idle. Comparing the gaming power consumption of an 8-core CPU against something with more cores isn't fair; by that comparison, the 9950X also draws more power than a 3D chip with 8 cores and lower clock speeds.
4:08 These numbers don't make any sense vs the prior power consumption chart using the same workload.
We retested everything with the newest BIOS / firmwares. They were bound to change. ESPECIALLY on the Intel side.
Not for me. Do not want or need NPU!
Why upgrade to a CPU that requires a new motherboard, can't beat the 7800X3D in gaming, and loses hyperthreading 🤔👀. Yeah………..NO!
7800x3D beast gaming cpu
Watch GN on the power draw stuff. Literally ignore everyone else. They didn't isolate/test accurately to show the actual true efficiency (or lack thereof).
We don't need to isolate in order to get CPU package power draw numbers. As a matter of fact, as you saw in the recent GN video, PSU isolation doesn't really work to accurately determine how much power is going to each component if current is split between connectors / components. Even GN has stated that if HWInfo supports a platform, you can get very accurate A/B power numbers due to the way it uses rail voltage measurements on most (but not all) platforms. And the HWInfo crew is VERY good at what they do for platform support; to the point where they even now have initial Panther Lake support.
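For anyone curious what "rail voltage measurements" boils down to: package power is just volts times amps, summed across the CPU rails. A minimal sketch (this is NOT HWInfo's actual implementation, and every reading below is made up):

```python
# Illustrative sketch only - not HWInfo's real internals.
# Per-rail power is P = V * I; package power is the sum across CPU rails.
rails = {
    "Vcore (IA cores)": (1.25, 150.0),  # hypothetical (volts, amps) readings
    "VccSA":            (0.95, 6.0),
    "VccGT":            (0.80, 2.0),
}
package_power = sum(volts * amps for volts, amps in rails.values())
print(f"CPU package power: {package_power:.1f} W")  # -> 194.8 W
```

Because the math runs on the CPU's own rails, nothing drawn by the board, RAM, or GPU leaks into the number, which is why A/B comparisons between runs hold up.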
The power figure is wrong, go check out Der8auer's.
Der8auer was specifically referring to inconsistencies with reporting in DLVR's Power Gate Mode. We test in an out of box (non-OC) config which uses by default the Regulation Mode.
@@HardwareCanucks That's the issue: if you're testing for power efficiency of the chip, power gating has to be taken into account
@@HardwareCanucks Like, what happened to the spirit of tuning the system for the DIY PC market? If I want a straight out-of-the-box experience, I'll just go get Macs.
@@derenbong6060 🤦♂ please don't
@@derenbong6060 In Regulation Mode, as far as we understand it, there aren't any inconsistencies.
Desperately needs a perf/W chart.
Unfortunately, we were down to the wire on this one with all the retesting that was necessary to include Intel results with the 0x12B microcode, 24H2 and the new AMD AGESA.
Watch the Gamer's Nexus review, it's all there both for gaming and productivity.
Spoilers: Intel is still pretty bad.
Nice
@@HardwareCanucks Ah I hear ya
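For reference, a perf/W chart isn't exotic to compute: it's just the benchmark score divided by average package power over the run. A minimal sketch with entirely made-up numbers (not results from this review):

```python
# Perf-per-watt = benchmark score / average package power during the run.
# All scores and wattages below are invented purely for illustration.
results = {
    "CPU A": {"score": 42000, "avg_watts": 250},
    "CPU B": {"score": 43500, "avg_watts": 200},
    "CPU C": {"score": 18500, "avg_watts": 85},
}
for cpu, r in sorted(results.items(),
                     key=lambda kv: kv[1]["score"] / kv[1]["avg_watts"],
                     reverse=True):
    print(f"{cpu}: {r['score'] / r['avg_watts']:.0f} points per watt")
```

The chart itself isn't the time sink; it only means something if every score and wattage comes from the same retested configuration, which is where the deadline bit.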
Apologies, I started watching but lost interest; no reflection on the channel. These might be just the ticket for commercial users, but they're not enthusiast CPUs in any way, shape, or form. Catch you in the next one...!
"Intel needed this" 🤡😀🤡😀
They did.
@@HardwareCanucks They sure did; this is a wake-up call to get their s... together. In gaming the 285K is on par with the 9600X and 9700X (even the 7700X is there in the mix), and in production workloads, in their best-case scenario they get about a 10% win over the 9950X but still with massive power draw in comparison. The only progress in power draw is when compared with the 14900K, which was crazy to start with...
@@MarioCRO I think this IS them getting their poop together.
@@HardwareCanucks Honestly, too little, too late... and not to mention in the end TSMC had to take over the production.
@@HardwareCanucks These ARL CPUs seem to be better suited for a business/professional PC. After a couple of Windows patches and/or BIOS updates, I believe that the 285K will pull ahead of even the 9950X or the upcoming X3D. For a gaming rig, AMD is the undisputed winner though.
You might as well call your channel: 'Shill Canucks' after this "review".
In what way? Our intent was to keep it as balanced and unbiased as possible.
Great video and not worth the money to upgrade. Looking forward to AMD
To think I was on the verge of dropping $1600 between the 285K and the ASUS Z890 Extreme. After watching another content creator's review and then watching this review... both pretty much echoing each other... my future Arrow Lake build was tossed into the trash can, set ablaze and, for safe measure... flushed down the toilet!
Same, I'm skipping Intel this time; maybe I'll use it for an SFF if it's affordable. I'm not a sucker; I remember how Intel used to be smug about their lead, and frankly they needed this humbling.
I hate the new names
U9 285K sounds very cool, a fresh name
@@edybtt I agree, but it's much harder to identify the chip at first glance.
I personally love the condensed lineup. The names? Well they are what they are LOL
@@valco7075 Harder? It's easier lol, so you prefer "i9" followed by tons of digits? And it's better now: the "i" series are the processors without artificial intelligence, and the "U" ones come with artificial intelligence and an NPU, very simple! :)
This is a fair review. These other big YouTube tech channels that I'm subscribed to (I won't name them, you probably already know) have become so anti-Intel that I don't trust them for reviews anymore. You seemed unbiased and professional. Good job.
😂
@@zexedearth89 Nothing funny..
@@Bigtymer781 It's very funny, cus this is the most biased and mellow review of them all, and the least professional one when it comes to actual testing methodology. So yea... it's very funny 🤣
I understand why everybody talks about testing at 720p. But in this day and age with how much processing has to go on with games and stuff, is that really relevant?
It's basically giving a light load for the CPU to do. What matters is when the CPU is doing a lot. I don't think at 720p the CPU is doing a lot. It might be doing it better but it's not doing a lot. Understand?
Was the 4090 maxed out at 1080p Ultra as well? I highly doubt it.
If the Ryzen chips had more horsepower then you would see it at 1080p Ultra unless the game engines hit their limit.
I know people are going to get on me about my comments here. Oh well.
I think that the Intel CPUs are doing fine. It's the first gen. Everybody was hyped up about Ryzen first gen and that wasn't even as good compared to these Arrow Lake CPUs.
I find it hilarious how all the tech tubers are like a big mob with pitchforks and torches. They all together attacked Intel.
Who was thinking that Intel was going to somehow make their best chip first from a new architecture?
It's quite a strange thing to behold the hate mob. But certainly not surprised. It is always Intel bad or Nvidia bad.
It's getting quite old.
Anyway thanks for the testing.
720p might not be relevant NOW, but we added it because it can give a glimpse of how a platform can scale relative to future GPU performance increases. For example, today's 4090 might be tomorrow's 5080 and the next generation's 6070. Someone buying a high end platform now might want to know whether or not their CPU has the legs to keep up with TOMORROW'S GPUs.
I think 720p high is useful, but not 720p low. 720p would in theory remove any GPU bottleneck, and high settings are more representative of what you'd use a flagship chip for. High settings affect the CPU too, after all, especially RT.
Well, the CPUs are a fail for this year. Let's hope the GPUs coming from NVIDIA change the landscape.
Honestly, testing at 720p & 1080p is still necessary
Which is what we did. ;)
Where the fuck is the 265K? Is it really that bad that they had to hide it?
If I wanted efficiency I would not be on desktop buying a high-end CPU. What a dumb thing to focus on for Arrow Lake. Zen 5 X3D will be even more efficient and spit out far higher frames while doing so. Intel is doomed.
They needed to focus on efficiency because they couldn't keep throwing unlimited power at their silicon. At some point that would have led to massively diminishing returns. This allows them to hopefully set up for the future, delivering better generational increases that aren't power ceiling or heat limited.
Amen
@@HardwareCanucks I can't see any win for Intel anywhere; the 7800X3D already has lower power usage and far higher performance, and it's old. Who would buy this?
@@jordan3802 Those who are not dumb and know there's stuff besides gaming
It matters insofar as keeping temperatures under control. Beyond that, it's a wash.
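Those diminishing returns follow from the rough dynamic-power relationship P ≈ C·V²·f: higher clocks usually demand higher voltage, so power climbs far faster than frequency. A back-of-envelope sketch (all voltage/frequency points are hypothetical):

```python
# Rough dynamic-power model: P ~ C * V^2 * f. The voltage/frequency pairs
# are hypothetical, but the shape shows why "just add power" stops scaling.
base_f, base_v, base_w = 5.0, 1.20, 125.0  # GHz, volts, watts (made up)
for f, v in [(5.0, 1.20), (5.5, 1.32), (6.0, 1.45)]:
    watts = base_w * (v / base_v) ** 2 * (f / base_f)
    print(f"{f:.1f} GHz @ {v:.2f} V: ~{watts:.0f} W "
          f"(+{(f / base_f - 1) * 100:.0f}% clock, "
          f"+{(watts / base_w - 1) * 100:.0f}% power)")
# In this model, ~20% more clock costs ~75% more power - the kind of wall
# that makes chasing higher power ceilings a losing game.
```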
Just get an AMD
Haven't seen any reason to upgrade in 7 years. Still don't.