No, You’re WRONG About the New Intel Arrow Lake CPUs. Here’s Why...

  • Published 17 Dec 2024

Comments • 122

  • @TechArc_ · months ago · +6

    Just to clarify: by all means, I'm not saying go out and buy Arrow Lake over some of AMD's parts, considering the drastically worse gaming performance. I'm simply saying that, much like Zen 5 turned out to be "fine wine" (improving performance considerably with microcode and software updates), Intel may well be in a similar situation (though obviously it's too early to say). At the end of the day, both Intel and AMD have very interesting products (AMD is still very much the overall performance leader), with Intel shifting its focus towards efficiency and laying the groundwork for future generations.

    • @01talima · months ago · +2

      so how were we wrong?

    • @NormanPutra · months ago

      Come on man, it is what it is. Just be objective. Even Intel's soft launch in Berlin was just a banner, and unrealistic. 😅

    • @timester3030 · months ago · +4

      uh... you focus on wattage and all... you can still clearly see... AMD is way more efficient...? lol...

    • @michelians1148 · months ago

      😂

    • @ronron_gaming · 23 days ago

      @01talima how are you correct?
      He literally said they are shifting focus to efficiency.
      In the past, people kept saying their CPU power consumption was too high;
      now that they've actually reduced the power, others complain about losing 5% performance.
      What is wrong with the AMD community?

  • @marioinacio9274 · months ago · +32

    But you make it seem as if Intel just started making CPUs last week. They owned the market for well over 10 years. Now it's "move over, let AMD take the wheel from here on." Not an AMD fan or an Intel fan; I just go with what fills my needs.

    • @romanott3149 · months ago · +1

      Don't forget that this time TSMC is making the chips, not Intel themselves. It could be that Intel just hasn't managed to adapt their know-how to the TSMC process.

    • @kemaldemirel1714 · months ago · +2

      AMD has been making GPUs for how many years now?
      People still think of them as fine wine.
      I personally never buy products based on future performance. But I can see the hypocrisy here.

    • @dankmemes3153 · months ago

      It's the first time they're using TSMC?

    • @raisofahri5797 · months ago

      @dankmemes3153 No, a lot of their mobile parts are produced by TSMC, and their Arc GPUs too.

    • @dankmemes3153 · months ago

      Lunar Lake and Arrow Lake, and their Arc series, are the only ones that use TSMC. Everything else, including past mobile CPUs, has always been made by Intel themselves.

  • @Aaron-zl5gq · months ago · +62

    You're right, it's not a failure, it's a fking disaster.

  • @denverbasshead · months ago · +7

    An Arrow Lake CPU costs so much more to make than a Ryzen does.

  • @kerotomas1 · months ago · +4

    But it is a failure. It's not faster than 14th gen, for a lot more money (given the discounts on 14th gen), plus a more expensive motherboard and RAM; and while better, it's still not close to Ryzen in power efficiency. It's in a black hole of uselessness.

  • @kispumel · months ago · +8

    Everything you say about what Intel achieved would be okay IF they had achieved it on the same node (Intel 7, 10nm). What you forget is that they jumped two full nodes ahead, maybe 2.5-3 (that's up for debate), so as it stands they have a node advantage over AMD. That changes the whole story. Sorry, but given the more advanced node it IS a devastation.

  • @robw3655 · months ago · +14

    Pretty sure this is the guy who runs UserBenchmark

  • @maxbirdsey7808 · months ago · +1

    To summarise:
    - Arrow Lake regularly underperforms AMD and even Intel's last-generation Raptor Lake
    - Arrow Lake uses less power than Raptor Lake (not really an achievement), but pretty much always more than the AMD parts that outperform it
    - Arrow Lake is less efficient than AMD despite being on a newer, more efficient node (granted, they are using the shoddy N3B rather than N3E)
    - Arrow Lake presents horrible value for the performance and is priced comparatively high, especially when it's not out-competing prior-gen parts. Granted, this is because Intel went with a design that was very expensive to produce due to the packaging, and supposedly made some remarks that lost them a 40% TSMC discount which was probably needed to make this product remotely cost-viable.
    But we're wrong because it might get better?
    If the issue with Arrow Lake is addressable, Intel should've delayed the launch until it was actually ready. I make the same criticism of Zen 5, which should've launched in October once its issues had been resolved.
    I've been burnt by too many early-access games recently (KSP2, Cities: Skylines 2, and almost Prison Architect 2); I don't need my CPU following that model as well.

  • @DesoloVir · months ago · +2

    Do those power-efficiency numbers reflect the power the CPU draws from the 24-pin header, and not just the CPU header?

  • @iLegionaire3755 · 22 days ago · +1

    Simultaneous multithreading is light-years ahead of Hyper-Threading. It's the same level of performance or greater, but with vastly better power efficiency. Intel just doesn't know how to do power efficiency, and E-cores don't count.

  • @Media-h4p · months ago · +1

    Sure, it's "TSMC Inside" : )
    Intel Arrow Lake: 4 "tightly butted" tiles (one 3nm, one 5nm, and two 6nm, all TSMC) plus a "large" base tile under them.
    AMD Ryzen: 2 or 3 tiles.
    For cost, the simpler, the better!

  • @leonxus2701 · months ago · +3

    Gamers are not buying these inferior products, and the thing that angers me the most is that they went with a tile implementation to save money, and now they are charging the same as or more than their competition. Wow, that's some high-level thinking. For an inferior product like Arrow Lake, which still demands more power to run than its Ryzen competition, they are asking a lot of money. They needed a price cut so people would consider buying, because they changed the socket yet again. This isn't innovation, it's just scamming your loyal buyers.

    • @maxbirdsey7808 · months ago

      Intel's tile implementation does not save money. It's more expensive for them, as the packaging is expensive. Intel made a product that can't compete, and can't drop the price to make it compete because they'd be selling it at cost.
      AMD went chiplet so silicon could be reused throughout the whole stack, from 4-core desktop to 128-core server CPUs. Within a segment, they all share the same IO die. So AMD ends up with 2 CCD die types (e.g. Zen 4 and Zen 4c), a desktop IO die and an EPYC/Threadripper IO die. AMD only needs 4 silicon designs to cover all of that.
      Intel went "tile" notionally to achieve something similar. They split the CPU up so that at any point they can update one part of it: the cores, GPU, IO or SOC. The idea is that, similar to how Zen 3/5 reused Zen 2/4's IO die, they can update part of the chip without needing to redesign everything, and they can make the IO/SOC dies on a cheaper node, since IO doesn't scale well with nodes anyway. However, Intel split it into too many parts, none of which are reused in other segments (at least not yet). Intel's server silicon is entirely separate.
      The irony is that if Intel had just gone monolithic, or at most done a core die plus an IO/SOC die like AMD does, it would probably be cheaper and perform better. What they've done isn't conceptually bad, but it's not something to do when you're being out-competed on cost and performance, since it literally hampers both.

    • @niggamaster9139 · 19 days ago

      Back around 2010 I was on an AMD CPU with 5 cores, and oh boy, I had zero latency and won every battle (people accused me of cheating). I switched to Intel and started losing badly, with insane latency, until I found some hidden options in the BIOS and changed the settings; then it became fast enough, but still nothing like AMD.

  • @zorororonoraroro · months ago · +4

    I bought a Ryzen 3 1200 because for its price it performed best; I could only get a 2-core Pentium from Intel for that money... and now on the same motherboard I have a 5800X3D. So I am happier with AM4 than even the first CPU I bought for it. AMD was, no matter how you look at it, the best buy back then.

  • @arizona_anime_fan · months ago · +11

    You sound like a well-meaning young man, but two of your sources' conclusions are wrong.
    1) They got rid of Hyper-Threading because TSMC's 3nm can't do Hyper-Threading yet, not because of efficiency; expect to see Hyper-Threading, or a new version of it, return to Intel CPUs in the future. Note that Intel badly needs to improve Hyper-Threading, which has remained largely unchanged since the P4. AMD's SMT is a significantly better multithreading technology at the moment, showing that Intel has a lot of room for improvement in its Hyper-Threading.
    2) They lowered boost speeds because the 13th/14th gen debacle scared the hell out of them, and a decision was made about 4 months ago to lower the clock speeds on the 200 series to prevent a repeat ring-bus burnout. There should be some overclocking headroom on these chips, as they intended to run them around 6GHz to make up for the performance shortfall caused by the memory-latency issues of their new chiplet design.
    It was Intel marketing that spun these two design limitations into a move toward efficiency. And I might add, these chips aren't that efficient: they're about 2x more power hungry than Zen 5 is at 4nm, which should be shocking if you think about it. TSMC 3nm should by default be roughly a 33% more efficient node than TSMC 4nm. If you control for the node, Arrow Lake is not significantly more efficient than 13th or 14th gen. Remember, Intel was going from Intel 10nm to TSMC 3nm; there are massive energy-efficiency gains to be had by default just from the node change.
    The multithreaded performance uplift is more a condemnation of Intel's Hyper-Threading technology, and a result of the massive improvement in their E-cores, than it is any other factor. The E-core uplift is where the real magic is in the 200 series: apparently the E-cores are almost identical to Intel 11th-gen cores in performance, at significant power and clock reductions. Frankly those E-cores are magic, and far and away the single most impressive thing about this launch.
    Listen, I am not a fan of either company, but I remember the AMD copium from the early 2010s, and frankly I don't want to live through it in the 2020s from Intel's side.
    Your take that this was a necessary move toward a chiplet design is correct. Intel is about 10 years behind AMD on this, but power efficiency was never a major consideration for Intel.
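The node-scaling arithmetic in the comment above can be made explicit. A hedged sketch in Python, using only the commenter's own claimed figures (the ~2x power draw versus Zen 5 and the ~33% node-efficiency gain are their claims, not measurements):

```python
# The commenter's claimed figures (claims from the comment, not measurements):
node_gain = 0.33           # claimed efficiency gain of TSMC N3 over N4
arl_power_vs_zen5 = 2.0    # claimed: Arrow Lake draws ~2x Zen 5's power

# If Arrow Lake's design were merely as efficient as Zen 5's, a ~33% more
# efficient node should put its power at roughly 0.67x of Zen 5's:
expected_ratio = 1.0 * (1 - node_gain)

print(f"observed ~{arl_power_vs_zen5:.2f}x vs ~{expected_ratio:.2f}x expected if node-limited")
# The ~3x gap between observed and expected ratios is the commenter's point:
# the node change alone cannot explain the efficiency deficit.
```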

    • @thetheoryguy5544 · months ago

      No, this isn't the first time Intel ditched HT; they did it with 9th gen.

  • @impuls60 · months ago · +14

    You're just simply wrong in your assumptions. der8auer tried to overclock the cores to a higher speed and they simply couldn't! The N3 node isn't built for high speeds, or at least the stepping used won't do them. So what do you focus on when you can't have speed? You take what's left, and that's efficiency.
    AMD just released very interesting news about the 9800X3D. They put the cache at the bottom, and voilà, users are now allowed to overclock that chip too. Gamers Nexus have an excellent video on why the heat got trapped inside the core layer, which is why they couldn't clock higher before. So AMD went ahead and marketed it as an efficient chip. No: they solved the problem, and we are allowed to fully OC that one as well.
    Intel bought the N3 capacity years ago and they simply had to use it, since you can't just cancel on TSMC. Back to Intel's fail: why did they put the IMC on a different tile from the compute tile? AMD had to invent X3D cache to get away with an IMC segregated from the compute tile. Intel didn't learn from this, or didn't want to spend money on creating a similar cache system. It's a known problem with the smaller nodes that you get hotspots, since there's so little mass in the cores. I think Intel had an "oh shit" moment early in the design and took the easy way out by removing Hyper-Threading to get somewhat decent frequency out of the P-cores.
    There are so many bad design choices in this new chip that most people are going to flock to AMD now. Putting the cache on the bottom opens the door to a much bigger power budget, so the 12- and 16-core parts can have cache on both CCDs, and most of the weaknesses of that design are mitigated. They will be cheap to make, don't need fast RAM, and draw even less power than a Core Ultra CPU. In Cyberpunk just now AMD showed a 159% lead over the Core Ultra. Intel, on the other hand, used the most expensive node, expensive Foveros packaging, 8000MHz+ RAM and a new socket, and it's slower? Good luck selling that one to your friends.

  • @dantebg100 · months ago · +4

    This CPU sucks for gaming because it's not designed for that. E-cores are useless e-waste for gaming 😁 You need bigger cores at lower frequencies for better efficiency, like X3D. And a bigger cache.

    • @rahuldharmendran2770 · months ago · +1

      This CPU really sucks for gaming, but E-cores are not for gaming. They are for idle use and smaller tasks like browsing, TH-cam, etc. Just disable them for gaming on other Intel processors, like disabling 1 CCD on a 7950X3D. Even with just 36MB of cache, the 14900K was nearly on par with the 7950X3D and its whopping 128MB of cache, if we ignore the instability issues. It's the architecture, design and practical working of the CPU that matter, not frequency, cache or core size.

  • @segbed · months ago · +1

    With the current Intel chip design I would like to see Intel CPUs with increased L3 cache. But maybe they already tested that and it just didn't bring any significant boost, unlike for Ryzen CPUs.

  • @lumiere3809 · months ago · +10

    I think people are forgetting that Intel is not the underdog here. They have so much money, and dominated the 2010s so thoroughly, that they stopped innovating. Now they are trying to overcompensate by shipping tech that is half-baked and not ready. What the other channels are telling people is based on data, not promises. You don't buy promises; you buy the product for what it is and what you're going to do with it. And at its current price, including the price of a new motherboard, it would be a hard sell even to creators, so let's stop kidding ourselves here.

    • @rustknuckleirongut8107 · months ago · +1

      One should keep in mind, when calling Intel an underdog, that just a few years ago Intel was spending 5 times AMD's market capitalization on research and development in a single year. It will be sad if all those years of spending $10 billion on research left them with no ace up their sleeve at all for the future.

    • @jonaswox · months ago · +1

      @rustknuckleirongut8107 Probably an expensive DEI department and a group of men in black making sure nobody drinks coffee without paying. It all adds up :D

  • @TomaszKnopek · months ago · +3

    Sounds like you are trying to defend Intel's failure. Go look at the power-scaling tests from TechPowerUp: if you care that much about efficiency, you can get much lower power consumption by limiting power, for a very small loss in performance. Intel called their new Ultra processors MADE TO GAME, which is just BS.

  • @migoy · months ago · +2

    If Arrow Lake doesn't translate to a rebound in sales, what good is this release? Intel is bleeding investors' money. Intel has to take the crown back from AMD fast with this release. Who would buy this over 9000 series?

  • @HectorLeslie · months ago

    I am not in a hurry to buy either of the two CPUs; I will wait 3 to 6 months before I make my purchasing decision. There is always a problem with new releases, especially when a new architecture is involved.

  • @Abstrahues · months ago · +2

    I feel sad for gamers, but honestly, from what I have seen, the 285K is a creator CPU.

    • @iLegionaire3755 · months ago · +1

      The 9950X, Threadripper PRO and EPYC are all faster creator CPUs. And all of them draw less power.

    • @Abstrahues · months ago · +1

      @iLegionaire3755 Actually that's not true; the 285K now draws less power than the 9950X in creator applications. The rest are either too expensive or just don't have a high enough clock speed for me.

    • @iLegionaire3755 · 21 days ago · +1

      @Abstrahues Maybe contestable for the 9950X, but the other two are definitely not close; that point still stands. The 285K is absolutely no match for EPYC or Threadripper! Xeon fares even worse against them.

    • @Abstrahues · 21 days ago

      @iLegionaire3755 You also have to compare costs, my friend.

  • @kecktus2017 · months ago · +1

    no one cares, carti dressed up as rocky for halloween

  • @johnmcmurphy1007 · 13 days ago

    It's an interesting new platform which will show its potential maybe in about 2 years. 🤔
    The power draw under full-core load is not yet efficient, and of course Intel needs to regain gaming performance. 👀

  • @Skungalunga · months ago · +17

    3 things.
    1. Intel is dominating creator apps.
    2. Even Wendell from Level1Techs referred to gamers as a less-than-margin-of-error share of sales volume. Yet the majority of reviews focused on... gaming.
    3. Intel is in the exact same position AMD was in when AMD launched their chiplet-based design, i.e. plenty of multi-core performance and weak gaming performance. The same people that sang AMD's praises are booing Intel for the exact same reason. I.e., the review industry is full of shit.

    • @georgioszampoukis1966 · months ago · +3

      Exactly. I believe the most fair review was from TechPowerUp. They tested pretty much every scenario possible.

    • @slimjimjimslim5923 · months ago

      @georgioszampoukis1966 Zen 1 was very underwhelming. It really wasn't until Zen 3 that AMD beat Intel. I think 2025 will be a make-or-break year for Intel, plus they are supposed to get CHIPS Act funding in 2025, which will boost their cash flow quite a bit.

    • @sonofavietnamveteran4817 · months ago

      Exactly. Dell, HP and Lenovo are salivating at this chip being in the business desktops they roll out to multi-million- and billion-dollar companies globally. I just laugh.

    • @12100F · months ago · +1

      The problem with that comparison is that AMD was in FAR worse financial shape when Zen launched than Intel is now, and Zen 1 improved from "utter shit" to "actually decent". ARL is going from "good but power hungry" to "mid but more efficient".

    • @parlor3115 · months ago · +1

      Idk where you're getting that Intel is "dominating creator apps". Go check Gamers Nexus for a full breakdown of their new and last-gen chip performance in productivity software. They either lose miserably or have a very slim lead for the price. They very much lose on all fronts: gaming, productivity, power consumption and pricing.

  • @mightymartin6290 · months ago

    I think Intel already has an Ultra 295K in the making with all the SMT enabled.

  • @auxityne · 20 days ago

    When one of your arguments is "well, AMD had problems too!" I really just can't take you seriously.

  • @paulboyce8537 · months ago · +1

    5.1 on the E-cores / 4.1 on the cache, a little push with CUDIMM 8400 or higher, and you are level with the top; that is a very minor change. Arrow Lake launched very underpowered: BIOS, motherboards, OS and Windows are all far from optimal and don't fully support Arrow Lake yet. Also, the 285K is not available; only a very small taste was released. My guess is that's to get the BIOS/motherboards/OS up to date and to keep the motherboard manufacturers in check. In a few weeks or months we get Battlemage, and I think that will be a pairing worth 5080 performance for less coin. Arrow Lake is tailor-made for Battlemage.

  • @JusstyteN · months ago · +7

    Intel changed the socket again; I don't even want to hear the name anymore. Evil company.

    • @thetheoryguy5544 · months ago

      Then don't buy.

    • @JusstyteN · months ago · +2

      @thetheoryguy5544 I won't, it's trash. Ever since I moved to AMD I'm so much happier with my CPU. This will be my 3rd AMD CPU now.

    • @iLegionaire3755 · months ago · +1

      Arrow Lake is the worst product Intel has released since Rocket Lake. It's a clear downgrade from Raptor Lake Refresh, which still holds up today in performance, if you can get it to run stably without constant BSODs, unlike my now dead and sold oxidized Intel Core i9 14900K.

    • @JusstyteN · months ago

      @iLegionaire3755 I've been on AMD for the past year or so and I'm really happy with it. X3D is just a great product.

    • @thetheoryguy5544 · months ago

      @@iLegionaire3755 You must be an absolute dummy if you think that.

  • @Trampoukosss · months ago

    We are wrong because future Intel CPUs will be great? OK, buddy.

  • @demiankeaough4616 · months ago

    Arrow lake is not good for average consumers. Good for tech nerds with nothing better to do than tinker with it.

  • @PunmasterSTP · 11 days ago

    Well, there seems to be a pretty big consensus in the comments...

  • @MikhailRongeur · months ago

    Using Cinebench 2024 to compare CPUs is complete BS, and here is proof as to why. Below I compare the CB 2024 results to the CB R23 results in every category, running my 9950X at stock (and by stock I mean after a Clear CMOS).

    Config                                     CB2024 MC / SC    R23 MC / SC
    RAM at JEDEC (after Clear CMOS)            2196 / 134        44059 / 2294
    EXPO enabled                               2347 / 137        43815 / 2297
    EXPO + manual RAM timing tweaks            2481 / 143        43890 / 2278
    EXPO + timings + RAM 6000→6200, FCLK 2000→2067    2506 / 144        44005 / 2302

    (The CPU configuration is unchanged throughout; only the RAM settings vary.) Do you see now why using 2024 to compare CPUs is not a good idea?
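The argument in the comment above can be made concrete with a little arithmetic. A quick sketch (using only the multi-core scores the commenter reported) showing how far each benchmark moves from the JEDEC baseline:

```python
# Multi-core scores reported in the comment above (9950X):
# (label, Cinebench 2024 MC, Cinebench R23 MC)
runs = [
    ("JEDEC baseline",   2196, 44059),
    ("EXPO",             2347, 43815),
    ("EXPO + timings",   2481, 43890),
    ("EXPO + 6200/2067", 2506, 44005),
]

base24, base23 = runs[0][1], runs[0][2]
for label, mc24, mc23 in runs:
    d24 = 100 * (mc24 - base24) / base24   # % change in Cinebench 2024
    d23 = 100 * (mc23 - base23) / base23   # % change in Cinebench R23
    print(f"{label:18s}  CB2024 {d24:+5.1f}%   R23 {d23:+5.1f}%")

# RAM tuning moves CB2024 by roughly +14% while R23 stays within ~0.6%,
# which is the commenter's point: CB2024 is memory-sensitive, so it
# measures more than just the CPU.
```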

  • @sudd3660 · months ago

    When almost all games are CPU single-core bottlenecked, and for any other workload most CPUs are good enough to get things done, that is why the shit has hit the fan and Intel is in trouble.
    Remember when graphics cards were the bottleneck? Those were the good times. We actually had FPS scaling back then...

  • @mattirealm · months ago · +4

    The E-cores are clearly much better. However, they were FOOLISH to remove HT before their replacement for it ("rentable units") was ready to go. The chips will improve, but they really shot themselves here, badly. AMD messed up with Zen 5 too, but... both kinda stink right now.

    • @thetheoryguy5544 · months ago

      Most games actually benefit from disabling HT, so what are you going on about?

  • @raffitchakmakjian · months ago · +1

    Good take. I feel like all the major players are only looking at the gaming performance and passing off the gains elsewhere as just OK. These are good processors IMO, and I intend to build a 265K rig. I like what Intel has done. On a different note, you point your chin/face upwards when you're driving home a point, more so than anyone else I've watched in recent memory, so much so that you're having to look down at the camera. Some criticism you didn't ask for, sorry. Good take nonetheless.

  • @RCSPARTAN77 · months ago

    I'm an Intel fan, but their latest release is indefensible. Gamers want performance, not efficiency. Forget about the IFs. A massive fail when Intel needed a product that could compete. The company is in trouble; the CEO needs to go and new ideas brought in.

    • @segbed · months ago

      I want performance and efficiency. 7800x3d.

    • @samhui9528 · months ago

      Yes, true gamers want performance over efficiency, but they won't even consider an integrated GPU right now. Don't tell me performance-hungry gamers play games on AMD integrated graphics.

    • @thetheoryguy5544 · months ago · +1

      Yeah, but not everyone uses their PC solely to game.

  • @jonaswox · months ago

    If you want power efficiency, just go for 10th or 11th gen. Or a 12600/13600.
    The 12600 is better in gaming than Arrow Lake and uses a lot less power. Lol.
    There is nothing "efficient" about Arrow Lake :D

  • @DanKxxx · 5 days ago

    Why are gamers buying high-end CPUs? Are you playing games at 1080p low settings? The GPU is usually the bottleneck, so the whole AMD-vs-Intel gaming argument is moot. The AMD X3D parts are super shit at anything other than low-graphics games where a CPU bottleneck happens.

  • @Decki777 · months ago

    I have a 13700KF and I'm not an AMD fanboy, but Arrow Lake is trash and it's not the way to go for gaming performance. 3D V-Cache is the best way to go for gaming; Intel should make their own version of 3D V-Cache CPUs, and if they don't, it's over for them. And AMD and Intel are not safe anymore, because ARM is getting very powerful and ARM chips are good at doing one thing well, so if anyone makes an ARM CPU purely for gaming it'll be insanely fast. ARM is more than powerful enough to run games; the problem is the code, because most games are coded for the x86 architecture.

  • @snipershor6351 · months ago · +1

    I bet the next-gen processor will deliver up to 20-40% more performance compared to the 285K.

    • @rahuldharmendran2770 · months ago · +2

      If that's the case, that would be great. 20% maybe; 40% would be very surprising, but it is possible, because this is a new architecture, and maybe that's the reason the 285K is performing below most people's expectations.

    • @snipershor6351 · months ago · +2

      @rahuldharmendran2770 correct

    • @auritro3903 · 22 days ago

      I hope so, but until we have more information, it seems unlikely.

  • @arfianwismiga5912 · months ago

    It performs worse than the previous generation; what's the point, then?

  • @Just_Mike64 · months ago · +4

    Great analysis, all other channels think that the 1080p score on a 4090 is THE thing. It isn't.

    • @arizona_anime_fan
      @arizona_anime_fan หลายเดือนก่อน +3

      stop. i can't handle this ignorance. i see this comment under every benching conversation about cpus. you don't know what you're saying at all. i'm going to explain it so you get it.
      if i am benching a cpu and i want to compair it against another cpu i need to make sure i am benching the cpus only. there are several "bottlenecks" in a cpus performance that will taint an attempt to get an honest comparison.
      1) gpu
      2) ram
      3) storage
      4) drivers/OS
      for now we'll set aside ram, that's a little complicated. we'll get to gpu at the end.
      storage can be standarized across platforms, just use the same model of storage, hd or ssd. though i suggest the fastest ssd possible in the event something you benchmark requires rapid storage access to bench propperly.
      os can be standardized as well, and drivers shouldn't matter as long as everything works
      that leaves ram, ram is complicated because at least currently neither AMD nor Intel have the same ram compatabilities. furthermore amd cpus are slightly limited in ram speed due to how their infinity fabric works. on the bright side, part of every review for cpus does some ram experimenting to get an idea what different kits affect. currently faster ram 8000 vs amd max ram of 6000 really is only worth about 1% performance uplift for intel, so functionally even though you're not using the same ram in your test systems, currently with the current cpus on the market ram can be ignored for comparison sake even if both systems don't use the same ram.
      Finally, the GPU. To keep the GPU from affecting a CPU benchmark, you need the strongest GPU on the market (a 4090), and you need to test in games where the GPU never hits 100% utilization. If the GPU is maxing out at 100% utilization, you are no longer benching the CPU; you're benching the GPU. So you test at the lowest realistic resolution: at 1080p a 4090 will never be maxed out no matter what graphics settings you use. That removes the GPU from consideration, and you know your results are purely CPU vs CPU.
      This is the scientific method: control the environment to make sure you're testing what you mean to test and nothing else. If you test a CPU in gaming at 4K, all you're doing is testing the GPU; you can easily hit 100% GPU utilization even on a 4090 at 4K with max graphics settings.
      If you've ever seen a CPU benchmark where all the CPUs hit the exact same framerate, then you know there is a bottleneck, and you're benching the bottlenecked part, not the CPUs.
      They test at 1080p because it's guaranteed to be strictly CPU-limited at that resolution. If they tested at 4K, every CPU would hit the same fps because you'd be benching the 4090; it's a meaningless test.
      But wait, you're saying, "I game only in 4K, so 4K results are all that matter. Why should I care which CPU does better at 1080p? All I need to know is whether it performs as well as a more expensive CPU at 4K." That's a good question, but a shortsighted one. Today, a Core Ultra 5 245 might match a 7800X3D at 4K in a certain title with a 4090, because the 4090 is the bottleneck. But what about a game released tomorrow that's not in the bench suite? Let's say Civ 7: that game will be heavily CPU-limited even at 4K, and it's unlikely to strain any graphics card. At that point the strongest CPU you can buy will matter, and if benchers don't bench CPUs in a way that shows the difference in performance, you'll never know which CPU is stronger. And that's not just Civ 7. I assume you plan to keep the CPU for years. What if you get a faster GPU, say a 5090? All of a sudden those 4K results will change: the strongest CPU at 1080p will look like the strongest CPU at 4K once the GPU bottleneck is removed.
      That's why they bench at 1080p. As a sidenote, AMD fanboys used to make your argument all the time back when it was Piledriver vs Sandy Bridge. They'd say things like "well, at 1440p the two CPUs are identical" (of course they were; there were no GPUs strong enough to play at ultra 1440p). But in reality, anyone who got the i5 2500K got the superior gaming CPU for its whole lifespan; there was never a time the 8350 was a better CPU than the 2500K. The video this channel made apes a lot of the AMD fanboy talking points from 2013.
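      The argument above boils down to a min() model: the delivered framerate is capped by whichever part is slower, so a 4K bench with identical fps across CPUs is really measuring the GPU. A toy sketch of that logic (all numbers are invented for illustration, not real benchmark data):

```python
# Toy bottleneck model of a game benchmark.
# fps_if_cpu_only: frames/s the CPU could prepare if the GPU were infinitely fast.
# fps_if_gpu_only: frames/s the GPU could render at a given resolution.
# The delivered framerate is limited by the slower of the two parts.

def delivered_fps(fps_if_cpu_only: float, fps_if_gpu_only: float) -> float:
    return min(fps_if_cpu_only, fps_if_gpu_only)

# Hypothetical numbers: two CPUs, one fast GPU benched at two resolutions.
cpu_a, cpu_b = 220.0, 160.0        # CPU-side fps limits
gpu_1080p, gpu_4k = 400.0, 140.0   # GPU-side fps limits per resolution

# At 1080p the GPU is never the limit, so the CPU gap stays visible:
assert delivered_fps(cpu_a, gpu_1080p) == 220.0
assert delivered_fps(cpu_b, gpu_1080p) == 160.0

# At 4K both CPUs show identical fps -- you're now benching the GPU:
assert delivered_fps(cpu_a, gpu_4k) == delivered_fps(cpu_b, gpu_4k) == 140.0
```

      Swap in a faster hypothetical GPU (raise `gpu_4k`) and the 4K results immediately spread back out in favor of the stronger CPU, which is the "what if you buy a 5090 later" point.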

    • @douglasmurphy3266
      @douglasmurphy3266 months ago +2

      ​@@arizona_anime_fanI don't think he is advocating running benchmarks at 4K RT, he is saying basically anything over a certain threshold is adequate for gaming, and the 1080p 4090 stuff is a nice scientific isolation but not typical use case. A 285K may lose to a 7800x3D, but a 12600k and a 7500F might already get the game engine past 120fps anyway so who the hell cares. Take gaming out of it at that point and look at your use case. You might not need to upgrade at all. You might be interested in platform features, or certain productivity. To me, the cost vs performance is a bigger reason not to get any Core Ultra + Z890 than imperceptible framerates beyond my monitor's refresh rate.

  • @rahuldharmendran2770
    @rahuldharmendran2770 months ago

    No one said the Ultra 9 285K is a bad CPU; it's just only good in a few applications. I know this is a completely new architecture, but they mostly increased efficiency by reducing power consumption, which also reduces temperatures. A future Ultra 9 385K may be better, but not the 285K. Everyone knows that past a certain power limit, adding another 30 watts of power only gains you 2-5% performance; Intel just cut that back. I would have appreciated this if it were a case like 13900K to 14900K, where there were no significant performance gains anyway and they had improved efficiency instead. Here they improved efficiency at the cost of performance. Maybe a few BIOS updates can fix this, but who knows. The point is that we already have a 9950X that performs well in every workload, gaming and applications alike, uses the existing AM5 socket instead of a new one, and has no problem other than price, which will eventually decrease (it's only high because it's a new CPU launch, as everyone knows). The two also have different problems, and AMD's 9950X doesn't have as many as Intel's 285K. See Techtesters' video on RAM speed scaling and core isolation, and JayzTwoCents' video on the missing Intel performance in games (ignore the latter if you don't care about gaming performance and think the 285K is about efficiency, not gaming). Intel literally said turning off core isolation decreases performance, which is clearly not the case, and we see performance gains close to the 7800X3D when using faster 8000 MT/s+ RAM. Just be honest: how many people use a 14900K purely for applications and not for gaming?
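    The "+30 W for only 2-5%" point is just perf-per-watt arithmetic: a small gain for a disproportionate power increase makes efficiency worse. A quick illustrative calculation (the score and wattage figures here are hypothetical, not measured Arrow Lake or Raptor Lake data):

```python
# Illustrative only: what a 3% gain for +30 W does to efficiency.
# Numbers are hypothetical placeholders, not real benchmark results.

def perf_per_watt(score: float, watts: float) -> float:
    return score / watts

base_score, base_watts = 40000.0, 220.0  # hypothetical stock configuration
oc_score = base_score * 1.03             # +3% performance...
oc_watts = base_watts + 30.0             # ...for +30 W of power

print(round(perf_per_watt(base_score, base_watts), 1))  # 181.8 points/W
print(round(perf_per_watt(oc_score, oc_watts), 1))      # 164.8 points/W
```

    Under these made-up numbers, chasing the last few percent costs roughly 9% in efficiency, which is why pulling the power limit back (as Intel did here) barely moves the score.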

  • @mast8536
    @mast8536 months ago +1

    Everything about ARL is ten times more interesting to me than Ryzen 9000: the ground-up tile approach, the better efficiency, the use of TSMC, a better platform/chipset than AMD, CUDIMM support, the list goes on. It is just a more interesting product. So interesting that I was ready to buy it on launch day. Unfortunately the 285K is nowhere to be found; I have yet to hear a single consumer say they got one. And frankly, I am a consumer at the end of the day, I'm going to buy what suits my needs, and a processor I can't buy does not suit my needs. I was already willing to make concessions on having best-in-class gaming performance, but having to wait around some undetermined amount of time because of a paper launch is too much to ask when the 9800X3D is coming next week for 100 bucks less. I've never had an AMD system and begrudgingly ordered an X870 board today. I really feel like Intel forced my hand, and I hope the 9800X3D doesn't disappoint, especially in general PC use cases.

    • @rahuldharmendran2770
      @rahuldharmendran2770 months ago +2

      Every leak I've seen says the 9800X3D is great. The approach was good: they changed the 3D cache placement, used Zen 5 cores instead of Zen 4, and increased clock speeds. People say it will be a 15-25% performance boost for $30-$50 more at launch, and as soon as prices come down the 9800X3D will be a good deal.

    • @iLegionaire3755
      @iLegionaire3755 months ago +1

      Ryzen 9000 >>>>>>>>>>>>>>>>>> Arrow Lake. Watch the videos of the 9800X3D, it's the new champion.

    • @mast8536
      @mast8536 months ago

      @@iLegionaire3755 I got one, so far it is ripping through everything I throw at it and breathing new life into my 3080Ti

    • @auritro3903
      @auritro3903 22 days ago +1

      Arrow Lake is more interesting than Zen 5 definitely. Does that mean that it's good? Eh... no.

  • @thetheoryguy5544
    @thetheoryguy5544 months ago

    Man, all the AMD fanboys from Reddit have spilled into here. Funny how they get offended if you're not against the competitor.

    • @arfianwismiga5912
      @arfianwismiga5912 months ago

      I hoped the new Intel CPUs would have something good to offer, but I've realized Intel is just taking them straight to the grave.

  • @johntipeti4597
    @johntipeti4597 months ago +4

    thank goodness for a channel like yours that doesn't shill. subbed.

  • @torukmahtomahto409
    @torukmahtomahto409 months ago

    A recent test shows that Arrow Lake IS optimized for CUDIMM RAM, especially at 8200 MT/s (about 400 bucks)...

  • @milosmartplug
    @milosmartplug months ago

    Fanboys are a dime a dozen.

  • @nempk1817
    @nempk1817 months ago +1

    Wtf is this video HAHAHAHA

  • @Horseofhope
    @Horseofhope months ago +2

    For someone who doesn't want to get AMD and didn't want the roided-up 13th/14th gen, Arrow Lake is the right CPU to get.

  • @R1L1.
    @R1L1. months ago

    Yeah, I was wrong: I thought they could beat Ryzen in efficiency, but they failed at even that, while being worse than a 12600K in some instances.

    • @jonaswox
      @jonaswox months ago

      Yeah, it is impressively bad. And to act like someone forced their hand... if this is so unfinished, why the hell are they releasing it now?
      The fact that they apparently weren't even aware of all the issues and bugs... lol, what the fuck are they even doing in QC?

  • @Oi-kb5xb
    @Oi-kb5xb months ago

    I think this guy would be pretty good with tech

  • @HexerPsy
    @HexerPsy months ago +1

    Well... that's a load of copium lol.
    Intel canceled the Royal Core project. Removing hyperthreading was part of the strategy leading up to that architecture, but the Intel CEO blocked the design.
    So yes, 14th gen was a rushed refresh, pushing even more power into the chips, which came back to haunt them.
    And Arrow Lake, which they knew had spread-out P cores, wasn't shipped with the Thread Director support needed to properly distinguish P cores from E cores, as evidenced by the 1P+16E benchmarks.
    So what must Intel do for the next generation? Adding more power means more heat, so part of the design improvement has to go to power savings. That means trading in performance, and we can only expect minor IPC improvements.
    AMD is taking the consumer gaming market with the X3D chips. What is left for Intel then? They are likely to focus on the server side to retain market share.
    Even prosumers can get an AMD 32-thread CPU that draws similar power with more performance.
    Intel is the underdog these generations.

    • @thetheoryguy5544
      @thetheoryguy5544 months ago

      Uh, productivity? It amazes me how many people think all you do on a PC is play games. These Ultra chips stomp Ryzen and 14th gen in productivity tasks, especially now with the microcode update, and really when paired with CUDIMM.

    • @HexerPsy
      @HexerPsy months ago

      ​@@thetheoryguy5544 Maybe I am behind on the microcode update.
      But looking at the launch reviews, I don't see why you would buy the more power-hungry 285K over a 9900K, 7950X(3D) and such, unless there is some workload that runs really well on these chips and badly on AMD's chips.

    • @thetheoryguy5544
      @thetheoryguy5544 months ago +1

      @@HexerPsy Well, Blender runs really well on these CPUs, as do Resolve and Premiere Pro. I think these CPUs are among the top performers in content-creation apps. That's why I would get them over the other CPUs you mentioned, and they are a big step in the right direction on power and temps compared to 14th gen: sure, not as good as AMD, but still a huge improvement.
      Tbh, after the Arrow Lake and Ryzen launches I unsubbed from so many reviewers that I thought were unbiased. It just seems like a lot of them are in bed with AMD, or they take advantage of the fact that hating on Intel is trendy now and use it to get views by talking shit. It's so dishonest, when these CPUs actually aren't bad.

    • @HexerPsy
      @HexerPsy months ago

      @@thetheoryguy5544 But Blender is a largely nonsense benchmark for CPUs, since OptiX rendering on Nvidia RTX cards is WAY faster, and any scene you would be working on in Blender will be OptiX-compatible. (I do Blender as a hobby, so take that however you want.)
      It's great for benchmarking and scalability testing, but you would prefer not to render on CPU.
      If you take DaVinci Resolve: are you not on Apple's M silicon? Do AMD/Intel chips even matter?
      For Premiere Pro, if I accept PugetBench as a good benchmark, then the 13700K+, 7950X, 9950X, 14700K+ and the 285K are all in a similar performance tier.
      I think for prosumers it would be more a matter of total system price, and that could go either way.
      In Adobe Photoshop, the AMD 9000 series is the best tier compared to the rest.
      If you are looking for efficiency in your render task, the 9950X (200 W) will use less power than the 285K (240 W), while their performance is very similar.
      If your audience includes a lot of gamers, then reviewers are right to be more negative; these are not gaming-leading CPUs.
      I don't think it's trendy either. Companies make questionable decisions.
      AMD's Bulldozer was terrible in terms of power. So were Intel's 13th and 14th gen.
      AMD's Ryzen 1 was hopeful, but weak in performance.
      Intel's 200 series is hopeless, because Intel's CEO already cancelled the Royal Core project, the main reason they cut out hyperthreading lol... I think this architecture is going nowhere under Gelsinger.
      AMD had the exploding I/O die. Intel degraded its 13th and 14th gen. And Nvidia's 12V connectors would melt... the 4090 can be OCed to use 600 W.
      I think reviewers have always been upset about underperforming / disappointing / unreasonable / bad products. This gen of hardware is nothing new.

  • @flytie3861
    @flytie3861 months ago

    how to spot a fanboy

  • @sean.d7171
    @sean.d7171 months ago

    The more this Intel CPU sucks, the more investment they'll put into the next one.

  • @r25012501
    @r25012501 months ago

    Pretty sure it's simple: the 9000-series AMDs were more power efficient, so Intel released a pack of poo tickets.

  • @realsong-fake
    @realsong-fake months ago +3

    Stupid take