Build a PC while you still can - PCs are changing whether we like it or not.

  • Published May 16, 2024
  • Get a 15-day free trial for unlimited backup at backblaze.com/LTT
    Use code LINUS and get 25% off GlassWire at lmg.gg/glasswire
    Arm CPUs are taking over. Apple Silicon showed us that desktop computers need not be power hogs - Why haven't AMD, Intel, and Nvidia done the same, and would you want it?
    Discuss on the forum: linustechtips.com/topic/14401...
    ► GET MERCH: lttstore.com
    ► SUPPORT US ON FLOATPLANE: www.floatplane.com/
    ► AFFILIATES, SPONSORS & REFERRALS: lmg.gg/sponsors
    ► PODCAST GEAR: lmg.gg/podcastgear
    FOLLOW US
    ---------------------------------------------------
    Twitter: / linustech
    Facebook: / linustech
    Instagram: / linustech
    TikTok: / linustech
    Twitch: / linustech
    MUSIC CREDIT
    ---------------------------------------------------
    Intro: Laszlo - Supernova
    Video Link: • [Electro] - Laszlo - S...
    iTunes Download Link: itunes.apple.com/us/album/sup...
    Artist Link: / laszlomusic
    Outro: Approaching Nirvana - Sugar High
    Video Link: • Sugar High - Approachi...
    Listen on Spotify: spoti.fi/UxWkUw
    Artist Link: / approachingnirvana
    Intro animation by MBarek Abdelwassaa / mbarek_abdel
    Monitor And Keyboard by vadimmihalkevich / CC BY 4.0 geni.us/PgGWp
    Mechanical RGB Keyboard by BigBrotherECE / CC BY 4.0 geni.us/mj6pHk4
    Mouse Gamer free Model By Oscar Creativo / CC BY 4.0 geni.us/Ps3XfE
    CHAPTERS
    ---------------------------------------------------
    0:00 Intro
    0:53 PCs are great, but...
    2:05 They still run on old tech
    2:53 Intel and AMD are in trouble, but that's a good thing
    5:27 Contracts, monopolies, and Qualcomm's failure to compete
    6:32 PCs have already taken the first steps to change
    8:14 Arm's advantages are integrated
    10:10 Arm chips can go bigger
    11:19 Conclusion - This is going to suck for enthusiasts
  • Science & Technology

Comments • 10K

  • @How23497
    @How23497 1 year ago +22421

    This all sounds like absolute hell for consumer’s rights, repairability, upgradability, and overall variety in the PC space.

    • @verios44
      @verios44 1 year ago +4181

      Let’s be totally honest. The end goal is to get rid of end user serviceability and heck even real ownership of products.

    • @bradenrichardson4269
      @bradenrichardson4269 1 year ago +2382

      Yeah forced obsolescence. People be throwing away their PC every two years just like they do with mobile devices now. Utter crap.

    • @anakinlowground5515
      @anakinlowground5515 1 year ago +1366

      Yeah, I completely agree. The right to repair movement is in direct opposition to this. They cannot coexist.

    • @alphagiga4878
      @alphagiga4878 1 year ago +217

      @@anakinlowground5515 Well, at least we have a major player: Framework

    • @pinkchckn
      @pinkchckn 1 year ago +36

      my thoughts exactly

  • @HunterDrone
    @HunterDrone 1 year ago +2902

    My sole complaint about SoC systems is how frequently their makers seem to not give a shit about long-term maintenance, expecting you to just buy a new model rather than maintain or reconfigure your existing machine.

    • @wafflecopter9296
      @wafflecopter9296 1 year ago +420

      Planned obsolescence.

    • @astronichols1900
      @astronichols1900 1 year ago +81

      As a Tegra K1 owner: yes. And I regret ever buying it.

    • @idkanymore561
      @idkanymore561 1 year ago +91

      That's their end goal

    • @the_danksmith134
      @the_danksmith134 1 year ago +280

      Watch them use power efficiency as an environmental-protection excuse, even though in the long run the inability to repair will create even more e-waste and contribute even more emissions, since these machines will be manufactured in higher volume than conventional PCs ever were with all their inefficiency.

    • @xathridtech727
      @xathridtech727 1 year ago +40

      @@the_danksmith134 Sadly, people already replace PCs instead of repairing them.

  • @GamebossUKB
    @GamebossUKB 1 year ago +989

    I believe that regardless of how efficient ARM and its related technologies become, there will always be a demand for individual components.

    • @oakzy3647
      @oakzy3647 1 year ago +34

      Building a PC is sacred and tailors it to your use case. I completely agree.

    • @steveklein9335
      @steveklein9335 1 year ago +58

      I tend to agree, but then again... how many manual transmission cars are there available to buy...

    • @oakzy3647
      @oakzy3647 1 year ago +13

      @@steveklein9335 This comparison is kind of invalid: if we don't build it, they will, and upgradability works out cheaper because it allows for expansion and future technology. Cars still use computer RAM, motherboards, and graphics hardware; it all still has to be built. So I'll partially agree, but the custom PC will be around until any chip can run games at thousands of fps and specs stop mattering.
      That said, there is a huge market for laptops, phones, and prebuilts, but anyone wanting performance will build their own, for now.

    • @anthonygrimaldi9483
      @anthonygrimaldi9483 1 year ago +23

      @@steveklein9335 there will always be manufacturers that still make manuals for car enthusiasts, and there will always be a few manufacturers making PC components.

    • @benedikta.9121
      @benedikta.9121 1 year ago +51

      @@steveklein9335 They're the majority of all cars available in every country except the US and Canada.

  • @ZachGatesHere
    @ZachGatesHere 1 year ago +339

    I feel like the goal for them is to turn computer ownership into phone ownership. You buy a new box every year that you don't own, just lease from that manufacturer.
    Also this is unrelated to the content but man this guy is a good speaker. Great to listen to.

    • @harbitude
      @harbitude 1 year ago +6

      That's exactly what it is.

    • @creeper6530
      @creeper6530 1 year ago +11

      You're more right than I'd like you to be

    • @awwtergirl7040
      @awwtergirl7040 1 year ago +6

      Yeah, it is. It's why I've started hating computing more and more. The only bright spots are the Open stuff. The most compelling feature of any computing platform today is Freedom.

    • @alexspata
      @alexspata 1 year ago +2

      exactly, this is the future I'm afraid

    • @Peglegkickboxer
      @Peglegkickboxer 1 year ago +2

      They're doing this with cars, especially all of the electric ones they are trying to shove down our throats.

  • @Lossy555
    @Lossy555 1 year ago +6013

    Hearing this, I imagine a dark future where PCs are handled like phones today. "Sorry, your 2-year-old PC is now irrelevant because we're not giving it any more updates."

    • @alexeisaular3470
      @alexeisaular3470 1 year ago +272

      Linux 🙌

    • @tippyc2
      @tippyc2 1 year ago +664

      @@alexeisaular3470 Ok so you're going to root the device and install linux. Sounds great. So what do you do when your wi-fi refuses to connect to a rooted device? We're already seeing rooted phones being denied services...

    • @ilyasofficial1617
      @ilyasofficial1617 1 year ago +142

      @@tippyc2 Antitrust law.

    • @AR15ORIGINAL
      @AR15ORIGINAL 1 year ago +293

      @@Zaptosis Unlocking the bootloader still voids your warranty and denies you services.

    • @NepgearGM6.1
      @NepgearGM6.1 1 year ago +5

      That

  • @nabusvco
    @nabusvco 1 year ago +5990

    While I can see this change as an inevitability, I just hope the serviceability and upgradeability will not be impacted as hard as I think they will be

    • @POLARTTYRTM
      @POLARTTYRTM 1 year ago +619

      It definitely will be impacted. Try upgrading anything on an SOC. You can't. Anything goes defective and you have to dump it all and get a new one. It's like a phone, try repairing one, can't upgrade without literally buying a new phone, if something breaks, good luck trying getting it repaired yourself, or pay a hefty price for a mere ATTEMPT at repairing the thing.

    • @wanderingwobb6300
      @wanderingwobb6300 1 year ago +88

      It absolutely will be

    • @hivemind8817
      @hivemind8817 1 year ago +201

      Yeah, no, I think they were stretching a bit far when saying everything will become an SoC. The fact is it's less profitable to make an SoC than the parts separately, and this mostly comes down to yield: the larger the chip, the more likely it will be unusable due to defects in the silicon wafer. This is a major reason why AMD uses chiplets; if they make a bunch of small parts instead of one large one, their yields go up.

    • @Wockes
      @Wockes 1 year ago +115

      In the future when it breaks you buy a new one. It's all e-waste like old Apple hardware

    • @brkr78
      @brkr78 1 year ago

      There is money to be -stolen- made from the -sheep- users. Apple shows the way on how to -exploit- convince their userbase to fork over money on what can only be described as a planned pile of e-waste, timed to be obsolete once the profit margins hit a certain low mark. Yeah, no, the future is going to suck for both enthusiasts and normal users: the former because they have no further influence, and the latter because they will be milked dry.
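
      The yield argument @hivemind8817 makes above can be sketched with the simple Poisson defect model (yield = e^(-D0 x area)). The defect density and die sizes below are illustrative assumptions, not figures from the video:

```python
import math

def die_yield(area_mm2: float, d0: float) -> float:
    """Expected fraction of defect-free dies under a Poisson defect model."""
    return math.exp(-d0 * area_mm2)

D0 = 0.001  # assumed defect density: 0.001 defects per mm^2

y_mono = die_yield(800, D0)  # one large 800 mm^2 monolithic SoC
y_chip = die_yield(100, D0)  # one 100 mm^2 chiplet

# Silicon that must be fabricated per *good* product
# (treating 8 good chiplets as equivalent to one big die):
silicon_mono = 800 / y_mono       # a defect scraps the whole 800 mm^2 die
silicon_chip = 8 * 100 / y_chip   # a defect scraps only one 100 mm^2 chiplet

print(f"monolithic: {y_mono:.1%} yield, {silicon_mono:.0f} mm^2 per good part")
print(f"chiplets:   {y_chip:.1%} yield, {silicon_chip:.0f} mm^2 per good part")
```

      Under this model the chiplet approach roughly halves the silicon fabricated per good product, which is the economic point the comment is making.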

  • @MrDJHarrison3
    @MrDJHarrison3 1 year ago +80

    Has to be one of the best presenters on LTT (apart from Linus, that is).
    Give him more screen time; so clear and well-spoken.

    • @Plunkidunk
      @Plunkidunk 10 months ago +1

      Riley #1

  • @Rasterizing
    @Rasterizing 1 year ago +33

    It sounds a little like going back to the Commodore 64 era, where, for the most part, everything was on the CPU (with some exceptions). It's obviously a big problem for system builders and the general PC market as it will restrict consumer choice - although it doesn't have to be that way. I think the PC market still requires some custom builds and adaptability, so if you could replace the SoC without having to change everything, that would be a big win. Although I really see it as replacing the entire board/system - so your PC would be little more than an SoC board (which you must fully replace) in a custom case.
    From the chip makers' point of view this has nothing to do with power or economy. They want to lock you in to an SoC, keep you there, then just drop support a couple of years later and force you to upgrade and spend more money - this is what it comes down to: restricting choice and forcing upgrades, exactly the same as mobiles and tablets. If you can't afford to upgrade your SoC you just have to suffer or be cut off when support is dropped, that's it. Along with the fact that you can't replace any individual failed components. No, this is nothing more than rampant capitalism and milking consumers for every last penny.
    x86 could be redesigned to be more efficient and have a reduced instruction set too. There's no need to go to an SoC, aside from $$$!

    • @dsrocks6905
      @dsrocks6905 9 months ago

      Redesigning x86 AGAIN to, say, a 128-bit architecture would have its own major challenges. If you recall, the switch from 32-bit to 64-bit x86 wasn't without pain and trouble, and considering ARM has had so much time in the oven, and translation between the two instruction sets has become so much more efficient because of AI, it may be smarter to just move to ARM. Either way there will be stability and compatibility issues for a while as everything slowly shifts over.

  • @BorlandC452
    @BorlandC452 1 year ago +1194

    Sometimes I feel a little old-fashioned for still using a tower for my main computing when I'm not much of a gamer. But I still really like having one because of how I can customize it. I love being able to swap out individual components, or even adding new ones. You really can't mix and match with a laptop, and especially not a phone.

    • @dynodope
      @dynodope 1 year ago +30

      I 100% agree with you on this!

    • @riccardo1796
      @riccardo1796 1 year ago +158

      in the modern hellscape where everything needs to be a nondescript square with no replaceable parts the mighty tower stands tall, whirring menacingly from its over-full drive bays

    • @ignacio6454
      @ignacio6454 1 year ago +28

      @@riccardo1796 Beautiful

    • @robgrune3284
      @robgrune3284 1 year ago +9

      AMEN !

    • @Quebecoisegal
      @Quebecoisegal 1 year ago +6

      Yep, agree with you totally.

  • @Aefweard
    @Aefweard 1 year ago +1380

    My issue with the idea of the everything-chip: say you're two generations down the line and want to upgrade your graphics, but the CPU side is still chugging along fine. Having to replace the whole thing is not only wasteful but more expensive. Same goes for a component dying.

    • @skydivingmoose1671
      @skydivingmoose1671 1 year ago +75

      Wouldn't replacing a small (in comparison) SoC be better than a whole GPU and its cooler? Assuming the new chip fits in the old motherboard.

    • @RJRC_105
      @RJRC_105 1 year ago +182

      @@skydivingmoose1671 Assuming they don't change the motherboard, yes. Or they use a BGA mount so you physically can't. I'm sorry, but this is a regressive, wasteful step.

    • @ts47920535
      @ts47920535 1 year ago +48

      Well, SoCs are less wasteful overall.
      My laptop's entire SoC PCB (motherboard, IO, chip, etc.) is approximately the same size as my GPU. So the mentality of "just upgrading my graphics" goes out the window; you would just replace the logic board.
      Technically you would be replacing RAM, CPU, GPU, controllers, etc., which seems wasteful, and it is on full-size PCs, but it isn't on SoCs.

    • @IR4TE
      @IR4TE 1 year ago +115

      @@skydivingmoose1671 You also have to consider that if PCs really go down the SoC route, you have to replace the whole SoC, which will be the most expensive part in the whole machine. So you'd be putting down, for example, $2000 every 2 years just because you're no longer satisfied with your GPU performance, even though the rest still performs well; you waste so much money because of one little area on your SoC die.

    • @TeranToola
      @TeranToola 1 year ago +38

      @@skydivingmoose1671 You can't really do that, as you'd have to completely change the memory as well, especially when you're talking about upgrading an SoC with a better GPU; unless they put all of the memory onto the silicon, or relied on 3D stacking to fill in any bandwidth issues that arise, it would be extremely costly.
      If this future of PC SoCs comes to fruition there will be a quantum crapton of e-waste. Large SoCs will still use 1000+ watts...
      The reason the M1 Ultra uses less power than the 3090 is simply that it's effectively two die shrinks ahead of the 3090 (Samsung 8nm vs. TSMC's 5nm). A GPU with the power of a 3090 on TSMC's 5nm would likely use around 200 watts, maybe less, especially if Nvidia ditched GDDR6X, which is a very power-hungry memory.
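
      @TeranToola's ~200 W figure can be reality-checked with back-of-envelope arithmetic. The ~30% power reduction per full node at constant performance is a rough rule of thumb (an assumption here, not a measured figure), and counting Samsung 8nm to TSMC 5nm as two shrinks follows the comment:

```python
RTX_3090_BOARD_POWER_W = 350  # stock 3090 board power on Samsung 8nm
POWER_SCALE_PER_NODE = 0.70   # assumed ~30% power cut per full node, same perf
NODE_STEPS = 2                # 8nm -> 5nm counted as two shrinks, per the comment

estimated_w = RTX_3090_BOARD_POWER_W * POWER_SCALE_PER_NODE ** NODE_STEPS
print(f"3090-class GPU on 5nm: ~{estimated_w:.0f} W")
```

      The estimate lands under 200 W, consistent with the comment's "around 200 watts, maybe less."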

  • @ekdavid
    @ekdavid 1 year ago +552

    I am sure there will forever be a huge audience for modular PC builds

    • @KeviPegoraro
      @KeviPegoraro 1 year ago +45

      Sure there will; it's not like all the factories would stop making the parts today

    • @dupajasio4801
      @dupajasio4801 1 year ago +4

      Hopefully

    • @thomgizziz
      @thomgizziz 1 year ago +5

      forever is a long time...

    • @vadimkavecsky3698
      @vadimkavecsky3698 1 year ago +21

      Kinda like classic cars. There are those who want most modern and hassle free stuff, and those who still prefer old school ways.

    • @mokseee
      @mokseee 1 year ago +22

      @@vadimkavecsky3698 and just like old school cars, those will come at a hefty price

  • @CrzBonKerz21
    @CrzBonKerz21 1 year ago +18

    I get so much joy when it comes to building a PC. All of the parts in their own boxes.. it's like Christmas morning. Like I literally feel happy when I'm holding a computer part box. It would be so sad to not have traditional computer parts anymore.

  • @Fatty420
    @Fatty420 1 year ago +556

    And the great thing about a small, integrated system is that when it breaks you get to buy a whole new system!
    Wait...

    • @aetherxsn1591
      @aetherxsn1591 1 year ago +83

      Yeaaaaa, no way SoC boxes will replace PCs.

    • @Wockes
      @Wockes 1 year ago +56

      Or if you want to upgrade your GPU you can't

    • @feeadftth
      @feeadftth 1 year ago +43

      This year was the second time I was glad I went DIY 7 years ago: my PSU failed and I just bought a new one. Just like the GPU 3 years ago.
      DIY, without any doubt.

    • @Desnhauos
      @Desnhauos 1 year ago +10

      @@Wockes the vast majority of consumers don't care about that

    • @joemarais7683
      @joemarais7683 1 year ago +75

      @@Desnhauos a vast majority of people are going to be pissed when they can't upgrade their gaming performance without spending 3k on another entire system with a bunch of stuff they don't need upgraded.

  • @hourglass1988
    @hourglass1988 1 year ago +395

    I'm watching this on a gaming PC I built literally 10 years ago. It was probably low/mid range even at that point, costing me roughly $700 (in 2012 dollars, mind you, lol). I upgraded the GPU about halfway through that time for ~$200. A couple years ago I put an aftermarket CPU cooler in for another ~$50. I've only just now started to run into games that my system *can't* run. I'll confess a lot of newer games I have to run on low-to-minimum settings, but it can run them. Some of the newest games will start to cause heat problems after a couple hours of play even on low settings. But come on: in that same time period I've gone through 5 laptops that I've used for little more than word processing. I'm on my third Roku stick in two years because the first one died and the second just wasn't being supported anymore. I'm personally terrified of the PC market going the way of consoles or all-in-ones.

    • @latortugapicante719
      @latortugapicante719 1 year ago +49

      What are you doing to those poor laptops

    • @Filip_Phreriks
      @Filip_Phreriks 1 year ago +21

      I'm typing this on an Asus netbook from 2011.
      Maybe you have to stop throwing your coffee over your laptops or whatever it is you're doing.

    • @hourglass1988
      @hourglass1988 1 year ago +32

      @@Filip_Phreriks Never water-damaged one. The dog got the strap of my computer bag caught around his neck and tossed one down the stairs. Had another I was letting my wife borrow for WoW, but she kept turning the fans off because they were too loud and ended up cooking it. Another was on Windows 98, just wasn't supported anymore, and the hardware wouldn't accept a new OS. On one the screen just stopped displaying. Another had a head crash in the drive. I'm sure if I left them sitting on a desk like my desktop, some would have lived longer.

    • @svenwempe9208
      @svenwempe9208 1 year ago +43

      @@hourglass1988 Turning the fans off because they were too loud... hahahaha 🤣🤣 that shit is funny as fuck

    • @johna3357
      @johna3357 1 year ago +2

      I'm guessing a Sandy Bridge Intel CPU?

  • @artem.boldariev
    @artem.boldariev 1 year ago +12

    One thing needs to be noted: most modern x86-based computers really have no need to keep backward compatibility with the 16-bit 8086 and the rest of that cruft. I think it is going to be dropped eventually - there are zero reasons to keep it on modern, UEFI-based computers.

    • @LiqqaRoni-cx3tx
      @LiqqaRoni-cx3tx 7 months ago

      So no more ax, al, and ah registers?

    • @artem.boldariev
      @artem.boldariev 7 months ago

      @@LiqqaRoni-cx3tx They are still available as parts of the RAX register.
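
      The register aliasing in that reply can be modeled with bit masks: AL is the low byte of RAX, AH the next byte up, and AX the low 16 bits. A quick sketch using an arbitrary example value:

```python
RAX = 0x1122334455667788  # example 64-bit register value

AX = RAX & 0xFFFF         # low 16 bits of RAX
AL = RAX & 0xFF           # low 8 bits (low byte of AX)
AH = (RAX >> 8) & 0xFF    # bits 8-15 (high byte of AX)

print(hex(AX), hex(AL), hex(AH))  # 0x7788 0x88 0x77
```

      So the legacy register names survive as views into the wider register, independent of whether 16-bit *boot* compatibility is ever dropped.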

  • @caerjones6693
    @caerjones6693 1 year ago +33

    THANK YOU for explaining this in a way someone barely technical could follow without making me feel dumb. I appreciate friendly faces and accessible info in enthusiast spaces!

  • @Hotrob_J
    @Hotrob_J 1 year ago +734

    They've been saying this for like 20 years now. First when laptops went mainstream, then with tablets and smartphones, then when the original NUC came out.

    • @BaldTerry
      @BaldTerry 1 year ago +136

      Agreed. Building desktops isn't going away anytime soon.

    • @katatastrofa6136
      @katatastrofa6136 1 year ago +25

      The difference is that PCs will still be a thing, only different

    • @Thatonefuckinguy
      @Thatonefuckinguy 1 year ago +70

      Also, tons of people such as myself simply can't afford pre-built hardware. It's that simple: the economics of it being cheaper to build a PC yourself outweigh the benefits of getting rid of custom-building PCs. And not everyone wants a piece-of-shit entry-level card like a 1050 Ti that can barely run much of anything. For SoCs to work, they'd have to stop giving out shitty entry-level cards and start giving us enthusiast or, at the bare minimum, mid-range GPUs. CPUs tend to be fine, as most pre-builts and even laptops come with at least an i5 if not an i7 unless you're getting the thing for dirt cheap. But I remember not being able to buy a prebuilt and having to go custom back in 2017 or '18 when I built mine, purely because for a long time not a single one in my price range came with a 1060 6GB model.

    • @graphincer3164
      @graphincer3164 1 year ago +27

      Yeah. Hope this just stays an industry scare, as it always has been. Heck, people were scared Stadia or GeForce Now was gonna take over, but they haven't.

    • @romano5785
      @romano5785 1 year ago +26

      Didn't you hear? PC gaming is DYING! lmao

  • @sonoftherooshooter
    @sonoftherooshooter 1 year ago +2358

    Linus, I've been in the tech field since 2006. Been watching you since 2007 or so. You have a lot of great staff, but Anthony is special: his ability to articulate facts while sounding very concise, his pool of knowledge of the market space, his understanding of competitive analysis. These are all very strong assets. Treat this man well.

    • @Sam-tb9xu
      @Sam-tb9xu 1 year ago +80

      Anthony needs a raise!

    • @loktar00
      @loktar00 1 year ago +18

      In the field since 2006 like that's a long time 🤣

    • @rmoultonrmoulton145
      @rmoultonrmoulton145 1 year ago +153

      @@loktar00 Huh? 16 years isn't a long time? Especially when tech changes exponentially? Do you realize the amount of change he's seen since he started? Hell, I started in the field professionally a little over a decade ago and it's incredible the amount of change I've seen.

    • @Chspas
      @Chspas 1 year ago +91

      @@loktar00 bro, people were still figuring out how to create good websites in 2006. Being in the industry for that long does give you an amazing perspective on the growth of tech

    • @sonoftherooshooter
      @sonoftherooshooter 1 year ago +14

      @@loktar00 It's not necessarily about time in the field; the experience of seeing and working on different things gives a wealth of perspective. Desktop and enthusiast/gaming computing really hasn't changed all that much in the last 25 years if you really break it down, other than silicon advancements and the addition of VR and new form factors. My original comment was really meant to highlight that Anthony couldn't provide the level of expertise he is currently delivering if he had not seen/read/done so many things in his career, as is so evidently clear in the presentations he delivers.

  • @jasonkelley6185
    @jasonkelley6185 1 year ago +72

    I haven't watched much LTT, just the occasional video, but this is the second time I've seen Anthony and I'm going to start watching all his stuff. This guy is the man. When I know people like this personally I buy them drinks and try to get them to talk endlessly. He really knows his stuff. To have the versatility to be the premier Swacket model is obviously a huge bonus.

    • @halko1
      @halko1 1 year ago +5

      Anthony is one of the loved ”characters” and not without a reason. Great persona.

    • @Menleah
      @Menleah 1 year ago +2

      Well said!

  • @PathfinderKat
    @PathfinderKat 1 year ago +15

    I love my PC, it's going to get a giant upgrade soon. I love the customization. I love that I've had it since I was 16 and it has grown with me. Being able to customize it was a huge part of me.

  • @denvera1g1
    @denvera1g1 1 year ago +733

    5:30 Anthony, this is a very well put together video, but Apple fails to disclose one key point when talking about efficiency, and most reviewers miss this VERY key fact.
    Apple is using TSMC 5nm, whereas Nvidia is using Samsung 8nm and AMD is using TSMC 7nm - and TSMC says 5nm offers a -50%- 20% improvement in performance per watt over 7nm - while Intel is using Intel 10nm / "Intel 7" (almost as good as TSMC 7nm).
    To put your comparison of the 3090 and the M1 Ultra into perspective: if the M1 Ultra used the same Samsung 8nm silicon, the die would be over 4x the size of the 5nm M1 Ultra and use as much as 12x the power to get the same performance (edit: most likely it would only use 6-8x more power).
    Samsung 8nm has roughly 44 million transistors/mm², whereas TSMC 5nm has ~186 million/mm².
    To put it another way, the 3090, ported to TSMC 5nm, would be less than 1/4 the size and might use as little as 80W, and it would be only ~50% larger than the base model M1, as it only has ~50% more transistors than the base model M1.
    ARM is only "vastly more efficient" than Intel processors that were stuck on 14nm for 6 years.
    Apple, on paper, is less efficient than AMD. But we'll have to wait for 5nm Zen 4 to get official numbers.
    I fell into this trap with my M1 Mac Mini. I thought I was getting something I could dedicate to transcoding recorded TV shows from MPEG-2 to H.265. But it turns out I'd have been better off getting a cheaper AMD-based system with a 4750U/4800U.
    My work laptop (ThinkPad L15 Gen 1 with a 4750U) is not only slightly faster at transcoding, it is also more efficient than my M1 Mac Mini.
    Here are the numbers for transcoding an hour-long news segment using the same settings on both devices - and yes, I was using M1-native applications:
    M1 Mac Mini: 1 hour 9 minutes, 33Wh from the wall
    4750U ThinkPad: 1 hour 4 minutes, 28Wh from the wall
    What is crazy is that not only is the 4750U more efficient, but it does this while having to power a display and a loud micro-fan, and it doesn't have any of the high-efficiency integrated parts, instead using a removable M.2 SSD and removable RAM.
    Remember, TSMC has stated that the 5nm process used for the Apple M1 has a -50%- 20% performance-per-watt increase over the 7nm node of the 4750U. So Apple should have used less than 2/3 the power - realistically 1/2 the power, because the Mac Mini has fewer peripherals to power - but instead Apple used more power to complete the same task.

    • @nathangebreselassie8515
      @nathangebreselassie8515 1 year ago +27

      You made the TSMC point way too many times. Also: source?

    • @wuokawuoka
      @wuokawuoka 1 year ago +73

      To add insult to injury, Apple integrated computers require you to buy basic stuff like RAM up front. That will keep you from upgrading later at a lower price.
      The non-M.2 SSD goes in the same direction.

    • @NeVErseeNMeLikEDis
      @NeVErseeNMeLikEDis 1 year ago +13

      I read a third of it and yes, you're right!

    • @SidShetty
      @SidShetty 1 year ago +1

      @UCw1-Tc5nUrnksHZkbPuIo5w Nvidia didn't have a choice.
      The 5nm node has limited capacity, so allocation happens long in advance.

    • @oiosh
      @oiosh 1 year ago +1

      @@nathangebreselassie8515 it's true
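
      @denvera1g1's wall-energy measurements above can be reduced to average draw with simple arithmetic (the minutes and Wh figures are copied from that comment; only the conversion is added here):

```python
def avg_watts(minutes: float, wh: float) -> float:
    """Average wall power over the run: energy (Wh) / time (h)."""
    return wh / (minutes / 60)

m1_mini  = avg_watts(69, 33)  # M1 Mac Mini: 1 h 9 min, 33 Wh
thinkpad = avg_watts(64, 28)  # 4750U ThinkPad: 1 h 4 min, 28 Wh

print(f"M1 Mac Mini:    {m1_mini:.1f} W average")   # ~28.7 W
print(f"4750U ThinkPad: {thinkpad:.1f} W average")  # ~26.3 W
```

      On these figures the ThinkPad finishes sooner *and* averages lower power, which is the double win the comment is claiming.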

  • @LongHaulRob
    @LongHaulRob 1 year ago +2005

    Anthony has honestly been my favorite addition to the LTT crew, a pleasure to watch. I'd take a computer class if he taught it.

    • @croay
      @croay 1 year ago +30

      agreed

    • @joshualioi5144
      @joshualioi5144 1 year ago +17

      same

    • @movingstair
      @movingstair 1 year ago +27

      A computer class, you say. Great idea for a vid? He probably knows the basics of what would be taught in university, and lots of people would give it a listen.

    • @TimberwolfCY
      @TimberwolfCY 1 year ago +27

      @Metrion Void Tell us you're jealous without telling us you're jealous, lol

    • @haveacigar5291
      @haveacigar5291 1 year ago +3

      Self-discipline would be a class he could use, I am sure. He should cut down on the cupcakes.

  • @JordanCS13
    @JordanCS13 1 year ago +157

    I don't think the PC Building scene is going away any time soon. When I first started building machines over 20 years ago, it was an extremely niche thing, and since then, it has grown and grown and grown, and the PC building community now is larger than it ever has been. Will it change? Almost certainly. I can see people moving to more power efficient designs, and maybe even SOCs combining CPU and GPU, but I think the desire to separate those in the PC building world will exist for a very long time, as it isn't cost efficient to lock people into those components simultaneously.
    I do see the average person migrating to a smaller form factor like a Mac Studio for their desktop work, but the PC building community isn't really about saving space on a desk...it's about the choice and price efficiency to build what you want with what components you want. Those who don't care about those things have largely already gone to laptops for 99% of their work.
    Also, the statement that no other PC can match a Mac Studio for $4,000? That's a load of crap. First off, the $4,000 Mac Studio has the 48 core chip, and in head to head, it's slower than an i9-13900K. I just priced out a build that equals the base $4,000 Mac Studio in speed, ram, storage, and faster in GPU - and it was under $3,000. And you can upgrade things and expand it later, which you can't do at all with the Mac Studio. Even if the equivalent PC was $4,000 and larger, I'd still rather have the PC because of the upgradability. If I want to swap a GPU in 2 years, I can. If I want to add 64 GB more ram in a year, I can (and at half the cost of the 128GB upgrade for the Mac Studio) - If I want to go to a 4TB SSD, I can, and for again less than half the cost that upgrade costs up front on the Mac.
    Maybe we will see the CPU and GPU get combined, but people will still be building custom systems - just with those parts, because locking yourself into a single computer where NOTHING can be upgraded later, and NOTHING can be re-used in your next computer, is just idiotic.

    • @thomgizziz
      @thomgizziz 1 year ago +18

      This exact argument was made 15 years ago by people just like Anthony. If people keep saying the same thing constantly then eventually somebody will be right about it, but people don't hold others to their predictions about the future.
      Also, Anthony isn't known for being objective about Apple products; he will pick and choose the points where Apple does win, or just flat out misrepresent things, because that is what people do when they are a fanboi.

    • @rockjano
      @rockjano 1 year ago

      No, PC building will disappear, just like Hackintosh (well, I still use it, but I don't know for how long). It was nice, I loved building PCs, but it is just gone... OK, it will take some years, but believe me, it will go. As for the price difference, yeah, true... Macs have never been cheap and still aren't... BUT then you have to use Windows, not macOS, and those two are just still not the same.

    • @CrispBaker
      @CrispBaker 1 year ago +20

      ​@@thomgizziz it's kind of telling that the #1 reason for enthusiast PC-building, gaming, exists nowhere in this video.
      Apple hates PC gaming. Despises it. And any company that follows their lead will end up with devices that are just as useless for gaming as an M1 is.

    • @Rudxain
      @Rudxain 1 year ago +2

      @@CrispBaker That's because Apple is focused on making devices for "average practical use", so they don't care about gamers, who are a minority.

    • @Rudxain
      @Rudxain 1 year ago +4

      A solution to this is modular SoCs: make SoCs replaceable by not soldering them and using a socket instead.
      Or even better, make the individual components within the SoC modular too. Of course, this would require special high-precision tools for replacements, or a robotic arm with sensors that does the job automatically.

  • @SeaMonkey137
    @SeaMonkey137 1 year ago +160

    This is probably the best video I've seen on the industry status. Excellent info. And the Hartford Whalers shirt just said it all.

    • @thomgizziz
      @thomgizziz 1 year ago +5

      It's speculative, and at best it's going to be only sorta wrong. Most likely he's way off on the time frame, yet nobody will call him out on it in the future; they'll keep saying everything is fine. People are weird.

    • @lunakoala5053
      @lunakoala5053 1 year ago +4

      @@thomgizziz Ah okay, but isn't your comment entirely speculative as well? People are weird.

  • @DickWaggles
    @DickWaggles 1 year ago +1319

    I won't ever give up the ability to modify my computer.

    • @Lunar4
      @Lunar4 1 year ago +201

      Don't worry, it'll be decided for you.

    • @DickWaggles
      @DickWaggles 1 year ago +84

      @@Lunar4 no it won't

    • @angrymoths
      @angrymoths 1 year ago +182

      @@Lunar4 I will never lie down. I will not live in the pod. I WILL NOT EAT THE BUGS.

    • @youtubeshadowbannedme
      @youtubeshadowbannedme 1 year ago +32

      @@Lunar4 Don't worry, I've decided to delete your account.
      Thx for 30 likes

    • @gordonneverdies
      @gordonneverdies 1 year ago +21

      @@angrymoths Damn right 💪

  • @8lec_R
    @8lec_R 1 year ago +366

    Yesss. Let's put the entire system on a single PCB so we need to throw everything away when one part goes bad.

    • @fabiospringer6328
      @fabiospringer6328 1 year ago +123

      And you can't upgrade over time. Great idea.

    • @ryanunknown4181
      @ryanunknown4181 1 year ago +19

      🍎

    • @ablueprofilepic9876
      @ablueprofilepic9876 1 year ago +7

      🧠

    • @TetraSky
      @TetraSky 1 year ago +44

      Excellent for manufacturers who want to sell you a shiny new toy every year while actively slowing down your old device to make the new one seem faster.

    • @yaro_sem
      @yaro_sem 1 year ago +16

      Yes! Let's divide our CPU into separate chips with 1 core per chip, so we can replace only 1 core instead of the full CPU. Just 5 times more power consumption, 10 times slower, for double the price!

  • @nitroxide17
    @nitroxide17 1 year ago +6

    In 2010/2011 everyone thought the enthusiast PC market was gonna die. It hasn't died yet. SoCs will grow, but add-in cards aren't gonna disappear. ARM might take over from x86, but socketed components will stay.

  • @seishino
    @seishino 1 year ago +211

    Every time I need to upgrade my graphics card, I pretty much need to upgrade my motherboard and memory and everything else anyway. For those who upgrade yearly, I could see this being catastrophic. But the rest of us who upgrade every 6 years or so pretty much need to replace all the related parts anyway.

    • @tempusnostrumest
      @tempusnostrumest 1 year ago +31

      Pretty true. This is only bad for the enthusiasts who spend too much on hardware every year.

    • @colclumper
      @colclumper 1 year ago +2

      I feel your pain, I just went full X670.

    • @caveake
      @caveake 1 year ago +27

      Yeah, but if something breaks in your PC you can still easily replace/fix it.

    • @The1Music2MyEars
      @The1Music2MyEars 1 year ago +14

      I built a PC in 2019 with a 9400F and a 2070 and wasn't looking to upgrade for a few years. Well, I got addicted to No Man's Sky on PS5 and want to mod it on PC, and apparently it's very CPU dependent. And it micro-stutters. I now have only two options: buy a 9900K at high prices, or upgrade my motherboard just to buy a new CPU. Frankly, I'm getting tired of PC gaming, especially the whole test-driving the PC to see which graphics settings you can run at >60fps before you even begin playing, while on PS5 the game has already been optimized by people. And no, I don't like using GeForce Experience to have it throw some supposedly optimized settings at my games. The other day a Windows update forced a hard reboot. I cannot escape Microsoft's updates, which have caused nothing but stress.

    • @panzrok8701
      @panzrok8701 1 year ago +11

      It's far cheaper to buy more RAM and swap the GPU. The CPU is usually not the bottleneck.

  • @smakfu1375
    @smakfu1375 1 year ago +548

    At 47, I'm used to hearing that the traditional desktop form factor is dead. I don't think so. I'd also be careful about assuming closely coupled system modules (aka MCMs posing as SoCs) are the sole optimization route; that's true for Apple and the ARM universe, as load-store RISC-style ISAs are highly sensitive to memory subsystem latency. CPU core-wise, they achieve great efficiency, but flexibility is highly limited, and scaling exotic architectures gets expensive and difficult. Apple silicon and the other mobile SoC-style producers are stuck in a "when all you have is a hammer" situation.
    Apple's main business is mobile: the Mac business represents ~12% of their revenue, versus mobile devices at 60%, services at 20% and accessories at 8%. The desktop portion of that Mac business is minuscule. Through simple necessity, their desktops are going to follow the patterns established by the rest of the company's hardware designs. That's their business model driving design decisions, but don't assume those same decisions work for Intel, AMD, etc., because they probably don't.
    Also, the Mac as a platform has always been defined as a sealed box, no tinkering allowed, especially when Steve Jobs or his acolytes have been in charge of the platform. The expandable "big box" Macs have been the exception, not the rule. The Mac and the PC (defined by its open, slotted-box roots) are two very different platforms philosophically. I don't think you'll see closely coupled system modules replacing big honking discrete GPUs, sockets dedicated to big discrete CPUs, and slotted RAM and PCIe slots for the desktop and workstation workloads that demand "bigger everything" (gaming, workstation, etc.).
    IMHO, you'll see more chiplets (more closely coupled) in each of the compute buckets (CPU & GPU) and a lot more cache, but the box-with-slots principle isn't going anywhere. Where you will see more SoC-style system-module solutions is the laptop space. However, that's just an extension of a pattern that's existed for a long time; it's just that Intel's iGPUs and interest in closely coupling memory have been limited by Intel being generally lazy. Keep in mind, the vast majority of all x86 "PCs", in both laptop and desktop form, already (poorly) implement a closely coupled GPU (mostly on-die), memory controller, cache hierarchy, etc.
    TL;DR: I doubt the desktop form factor, sockets, slots and all, is going away. This all seemed a bit click-baity.
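
The memory-latency sensitivity described above can be sketched with a pointer-chasing probe, the classic way to expose how dependent loads stall once data falls out of cache. This is an illustrative sketch only, not anything from the video; in CPython the absolute numbers are inflated by interpreter overhead, but the gap between a cache-resident and a DRAM-resident chain still shows.

```python
import random
import time

def make_chain(n):
    """Build a random cyclic pointer chain: nxt[i] is the next index to
    visit, and the chain forms one cycle through all n slots."""
    order = list(range(n))
    random.shuffle(order)
    nxt = [0] * n
    for a, b in zip(order, order[1:] + order[:1]):
        nxt[a] = b
    return nxt

def walk(nxt, steps):
    """Follow the chain. Every load depends on the previous one, so the
    CPU cannot overlap the cache misses."""
    i = 0
    for _ in range(steps):
        i = nxt[i]
    return i

# A small chain stays in cache; a large one forces trips out to DRAM.
for n in (1_000, 1_000_000):
    nxt = make_chain(n)
    start = time.perf_counter()
    walk(nxt, 2_000_000)
    elapsed = time.perf_counter() - start
    print(f"n={n:>9}: {elapsed / 2_000_000 * 1e9:6.1f} ns/step")
```

The per-step gap between the two chain sizes is the memory-subsystem latency that tightly coupled packages try to shrink.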

    • @MrUploader14
      @MrUploader14 1 year ago +34

      I agree that this video is clickbait designed to get a rise out of the PC community. I do believe SoCs will be an important part of business PCs and servers, since power efficiency relative to performance is a big deal. I think the PC gaming industry will start focusing more on power efficiency in the future: GPUs leveraging intelligent upscaling to lower power requirements, SSDs using fewer PCIe lanes or keeping older versions for longer, HDDs adopting NVMe to saturate the data connection and be more efficient, power conversion moving to the mobo, and the CPU most likely moving to ARM, not necessarily for the power efficiency but for the flexibility and chip density.

    • @codyo.4884
      @codyo.4884 1 year ago +16

      As someone who prefers the benefits of the tightly integrated, “walled garden” philosophy that Apple provides, I agree with everything you said.
      There are very distinct pros and cons between tightly/loosely coupled architecture, not just in computer hardware either. It’s like trying to make an F150 compete with a BMW M series, they’re just two different tools for two different jobs, and they’re likely to have their own place in the market for the foreseeable future

    • @riezexeero7392
      @riezexeero7392 1 year ago +27

      Sadly, I used to love all of LTT's content; now it's becoming more clickbait stories. And I agree, smakfu: ever since I've followed the IT industry, there has always been "the PC is gonna die" news. Did vinyl die when they said it would? Even cassettes are still alive. As long as there is a market, it won't die; it will just be less prominent than in the golden days. And also, LTT, the golden days of the PC are over. They were back in the late 80s, 90s and early 2000s, when your main gadget was the PC. When smartphones/tablets became a thing, the PC stopped being the primary gadget. See what happened to Dell: they became Dell Technologies, not just Dell Computers, because PCs are NOT at the forefront anymore. But did they go away? Hell no, there is still a big market, just not what it used to be. Clickbait video!

    • @Valisk
      @Valisk 1 year ago +2

      Just reading a few threads in the comments here, and there are kids that have only ever known ATX absolutely losing their shit.

    • @smakfu1375
      @smakfu1375 1 year ago

      @@codyo.4884 Just to be clear, my main laptop is a MacBook Pro, I've been an iPhone customer for 14 years, every TV in the house is just a display for an Apple TV, and I even have an early review unit of the 1st-gen iPad (I was a contributor for a tech magazine for developers 12 years ago). So I'm a pretty big Apple whore. But when it comes to mainline development work, and gaming, the Mac is a backwater; you're not prototyping and building complex compute kernels for data analysis and intelligence systems (for production cloud deployment) on the Mac. You're doing that work, and many things like it, on PCs running Linux and Windows.
      As for mobile devices, yes, they're the main show for consumptive activities. But productive work is still the bastion of the general-purpose computer, and always will be (IMHO... barring the machines taking over, plugging us into the Matrix, and making us watch TikTok all day on simulated iPhones as a punishment).

  • @clairearan505
    @clairearan505 1 year ago +826

    If I could have great performance in a smaller form factor and still be able to open up and fix problems or change things in the system to suit my needs, I'd be totally for this sort of change. But I don't think that's anything like the goal of the companies who will ultimately make these changes to how PCs are built.
    It seems like every industry is constantly trying to find ways to lock me into their product line and lock me out of handling any problems that might arise without calling some overpaid technician or sending my car/phone/laptop/PC/etc somewhere else at my expense to be repaired.
    My gaming rig is over 5 years old, and still running like a champ. My work rig is nearing the 5 year mark and still chugging along. If I couldn't open these up and service/clean/fix them, and instead had to rely on someone else, I'd be missing out on hours and days of productivity. That sucks.
    If these issues can be addressed, I would love to have the additional efficiency of new architecture and the space savings of a small workstation. Until then, I'm happy to have my feet warmed in the winter and boiled in the summer, thank you very much.

    • @Voshchronos
      @Voshchronos 1 year ago +1

      This. The right to repair is being killed and soon enough actually owning our computers will be purely an illusion.

    • @skeptic_lemon
      @skeptic_lemon 1 year ago +26

      Capitalism

    • @amsunaakage
      @amsunaakage 1 year ago +10

      That's why, in my opinion, PCs will always be robust, and at business scale they're a more productive asset than relying on someone else, even if they cost more to run. The way you put it, it's worth sticking with PCs no matter what. So I guess I can say that PCs are still cost-effective in the long term.

    • @spiderjerusalem
      @spiderjerusalem 1 year ago +52

      "It seems like every industry is constantly trying to find ways to lock me into their product line and lock me out of handling any problems..."
      Well said. Just well said.

    • @Haddedam
      @Haddedam 1 year ago +11

      I don't see how a more efficient processor architecture, with lower power consumption and less waste heat, damages repairability. Apple hardware has almost always been locked down and hard to service, even when the hardware wasn't proprietary. Apple being first to popularize a thing hasn't meant everyone else follows their anti-user practices. Look at phones, laptops and such.

  • @maxcarter5922
    @maxcarter5922 1 year ago +77

    I want more opinion pieces or even video essays like this. We need stronger opinions about this industry.

    • @Arcaryon
      @Arcaryon 1 year ago +1

      I am not a "tech type", so I've got to ask: aren't there any? I thought opinion pieces were pretty much universal, and while some channels may not have them, I feel like educated speculation and predictions are part of the information chain in basically all fields.

  • @tompainter7167
    @tompainter7167 1 year ago +2

    Nicely said. I just sold my 2021 16" M1 MacBook to build a high-end PC for 3D; I hope I was right. It was mostly the software that made me switch, but also the ability to make subtle upgrades for evolving needs, etc.

  • @Johnrich395
    @Johnrich395 1 year ago +489

    I'm confused. Did I miss the part where ARM is required to be a soldered CPU? I figured that ARM or RISC-V would take over the compute space eventually, but that we would see ARM/RISC-V components just like we see x86 components.

    • @CompMeistR
      @CompMeistR 1 year ago +69

      Arm is definitely not required to be soldered (otherwise the Ampere Altra would not exist), but Arm SoCs basically need it, as there is currently no other good way to integrate them into a system.

    • @handlealreadytaken
      @handlealreadytaken 1 year ago +184

      A completely soldered, unmaintainable throwaway machine is Apple's wet dream. It would be nice to see Intel or AMD offer a solution for the desktop market.

    • @TrioLOLGamers
      @TrioLOLGamers 1 year ago

      Yep. Welcome to our world: ARM is just the excuse for Apple to solder everything. It always has been.
      They just waited for the right moment and built the right PC by learning from others' mistakes (Windows on ARM; it's not the first time they've taken someone else's idea and acted like "we have something revolutionary").
      They have literally fought against right to repair, and today, with the excuse that everything is proprietary, they CAN.

    • @callumb4980
      @callumb4980 1 year ago +152

      @@handlealreadytaken It's every company's wet dream. You're deluded if you think Samsung, Google, etc. don't want the same model Apple has. For companies like Acer it's cheaper to buy off-the-shelf components, but as soon as that changes, 100% they will solder the user manual to the board if they can.

    • @MrAnderson5157
      @MrAnderson5157 1 year ago +16

      @@callumb4980 It will always be about the almighty dollar. The only reason power consumption gets any consideration is the world's cry for greener pastures. Forced changes, nothing to do with consumer considerations.

  • @thomaswiley666
    @thomaswiley666 1 year ago +1182

    Actually this is an old story; LTT is just observing one of the cycles. They aren't old enough to remember 'big iron', the massive mainframes that everyone saw in movies from the 50s to the 90s. These are the grandparents of the SoCs we see now. IBM's VTAM came along in the 70s as a comm/app layer within big iron, a way of connecting the different subsystems within a mainframe. It was also a way for IBM to make more money, because you had to pay more to "unlock" features, i.e. to open communication to those subsystems. Base level got you one or two "features", while paying premiums got you many or all of the rest. The add-ons were already there; they just had to be unlocked by software. Beyond that, you paid for however many terminals you bought as physical add-ons.
    The first IBM PCs were mostly a complete system on one board, with perhaps a SCSI controller or add-on RAM board as aftermarket purchases. The only "build" items were the physical parts that wore out, like the keyboard or mouse. (Except for the IBM 101-key Model M keyboard - you could kill a person in the desert with one of those and still have a functioning keyboard. Just sayin'.)
    Also, we've had these SoCs for years now. We've been calling them 'appliances' because, well, the term is apt. People know them as 'thin clients'. Basically, these are the replacements for terminals, the things that connected to mainframes. With an appliance you get localized processing power and lockable systems, which are needed in quality-critical/sensitive areas of businesses.
    It's only because of the advent of smartphones that we have businesses considering marketing appliances to the general population. Phones are technically SoCs, right? And while people bitch about a phone becoming obsolete in four years, they go along with it. So now the circle has come back around: appliances are now set-top boxes or dongles [Roku, Xbox, Fire, etc.], aka 'little iron', with unlockable features (a comm/app layer for games, movies, Internet access, etc.) for a monthly fee. And yes, the only add-ons are the physical parts - controllers, keyboards. Eventually people will tire of disposable computing, some company will offer an appliance with the ability to snap in upgrades (an all-new PCI bus), and the circle will continue.

    • @shannono.5835
      @shannono.5835 1 year ago +94

      Truth - and thanks for the history lesson

    • @raphaelrodriguez1856
      @raphaelrodriguez1856 1 year ago +118

      Wow this is probably one of the best and most valuable comments I have ever seen. Thank you for taking the time to write this up. It’s always super fascinating and useful to learn about world changing market pivots that happened pre internet era. It only makes me wonder what other things will experience this cycle in the future.

    • @dylanlahman5967
      @dylanlahman5967 1 year ago +33

      This should be higher

    • @mintymus
      @mintymus 1 year ago +24

      I certainly hope you're right about the process repeating.

    • @xenotiic8356
      @xenotiic8356 1 year ago +52

      This honestly gives me a lot of hope about the future, that we are just in an awkward transition point in a cycle, and not an anti consumer hellish endgame.

  • @Max-dw9kg
    @Max-dw9kg 1 year ago +17

    Extremely interesting video and a good high-level overview. I understand the reasoning for limiting compatibility in terms of optimization, but the idea of needing to replace a computer every few years is a concern.

    • @anno1602
      @anno1602 1 year ago +5

      Of course it is. Everything about Apple is a concern. These Mac Studios cost several thousand dollars for sub-par performance compared to what you could build for half the price. If money were no object and upgradeability weren't important, then yeah, let's all just buy the same build with the best CPU and GPU built in. But that's not the reality of the computer components industry.

  • @FCT8306onTwoWheels
    @FCT8306onTwoWheels 1 year ago +14

    i been trying to get my game on, and somehow managed to get Assetto Corsa to run on my i3 at like 10fps, and it's not that enjoyable. So I decided to find a way to build my own machine after watching that hour-long build video you guys put together, and also a couple of Game Bench optiplex build videos. So hyped to start my build.

    • @sebode87
      @sebode87 1 year ago +1

      "i been trying to get my game on", who even says that 😒

    • @FCT8306onTwoWheels
      @FCT8306onTwoWheels 1 year ago +5

      @@sebode87 soon it'll be off to the races

    • @raginginferno868
      @raginginferno868 1 year ago

      @@sebode87 fr it doesnt even make sense even after re-reading it like 3 times

    • @John-mf6ky
      @John-mf6ky 1 year ago +2

      @@sebode87 who cares?

    • @herehere3139
      @herehere3139 1 year ago +2

      @@sebode87 your mom says that, to me. Naaam sayyyin

  • @Rac3r4Life
    @Rac3r4Life 1 year ago +239

    Even with a switch to ARM, I believe the socketed chip on motherboard paradigm will stick around. People will still want upgradability and expandability in their desktop computers.

    • @vulbyte
      @vulbyte 1 year ago +16

      I wish this to be the case. TBH, I wouldn't be upset if the CPU and motherboard came integrated (look at AMD and how much of a minefield that's been for people),
      but keep RAM and PCIe slots for expansion/upgradability.

    • @MrCreamster20
      @MrCreamster20 1 year ago +3

      Now, what if the eventual move to ARM-based chip design diverged from current APUs: CPU die sizes stay the same, but everyone, producers and users alike, gets more performance at lower max power than current CPUs. GPUs as they are now would probably also halve in size with this ARM CPU technology. That way we would still have discrete socketable CPUs/GPUs, RAM, capture cards, etc., but the size, power and probably cost benefits that go in tandem with this kind of paradigm shift in design/manufacturing would be the most noticeable.
      Just a hopeful, probing thought.

    • @egg-roll8968
      @egg-roll8968 1 year ago +1

      Us: We like choice...
      Companies: You will buy what we make and you WILL love it!
      Here, I'll prove it: why do most people buy either Samsung, Apple or Google phones (mostly in the NA/EU market) versus the plethora of choices out there, many of which offer equal or better performance for equal or less money?

    • @me-jv8ji
      @me-jv8ji 1 year ago +5

      @@egg-roll8968 Name a phone where you can unlock the bootloader and relock it. As far as I know, only Pixels can do that.

    • @Thomas.5020
      @Thomas.5020 1 year ago +8

      Doesn't matter what people want. We will buy what we're told to buy. It's always been that way.

  • @touzj316
    @touzj316 1 year ago +386

    Backwards compatibility should not be underestimated. People want compatibility. The reason computers are valuable is that they are useful and diverse. I recently built a computer that is like no other on the market for what I use it for, and I keep upgrading it based on my work needs. Right now I'm upgrading my graphics card to fit my needs, and this ability to scale my production is one of the reasons I have a computer. There's always going to be a market for x86, but I think one day we will all need to switch to something more modern somehow.

    • @aristocrag281
      @aristocrag281 1 year ago +7

      I think what is going to happen is that as the market shifts toward prefab boxes, your customization options will become a more niche use case and require more investment and research. It might improve quality if the entire segment is targeted at serious builders and power users, but as a share of the overall market, what percentage is that, and is it sustainable in the long run to have configurations ideal for the few who are capable of their own repair and maintenance, versus the few who actually do it?

    • @hanrinch
      @hanrinch 1 year ago +14

      @@aristocrag281 I see it the opposite way: in reality the prefab/prebuilt box is declining as more light/casual users shift toward smartphones or tablets. The PC market is now divided into two segments, enterprise clients and power users. Enterprise clients include servers/workstations/mainframes/thin clients for the office, where prebuilt is still in demand. Power users include esports/gaming/video & photo editing/live streaming, where hand-building is still popular.

    • @robertbane2163
      @robertbane2163 1 year ago +3

      @@hanrinch Correct! I worked at Microcenter for 10 years and saw the rise and fall of prefabs.
      It's not like '87, when I started building; now kids are watching youtube vids and buying parts!

    • @hanrinch
      @hanrinch 1 year ago

      @@robertbane2163 Prefabs had less flexibility but a good warranty. However, casual users are sticking with phones/tablets these days, making the warranty no longer an advantage, and younger kids tend to look things up online to build their dream machine. It still marks the decline of the industry, though: prebuilts made up 70% of the entire market, and if prefabs decline rapidly it will mark the end of the industry, because the market will become too small to remain competitive.

  • @markteague8889
    @markteague8889 1 year ago +4

    Seems like RISC-V will eventually displace ARM for SoC systems, and also be scaled to become a sort of universal ISA utilized for everything from IoT and microcontrollers to warehouse-scale servers and HPC applications.

  • @memejeff
    @memejeff 1 year ago +5

    Super interesting video. I recently migrated to slightly older enterprise-grade stuff, so hopefully this won't affect the upgradeability of Intel Xeon processors if it does happen.

  • @gwgux
    @gwgux 1 year ago +50

    This is why open standards and open systems, such as ones running Linux distributions, are so important. The more compact and integrated the hardware, the more lockdown can be done by the likes of Apple and Microsoft. For all things in computing there are trade-offs, but at least you can retain control over YOUR systems by adhering to open standards and operating systems.

  • @RandomPerson-vf3ld
    @RandomPerson-vf3ld 1 year ago +24

    An ARM-based buildable system with a motherboard that has a CPU and GPU socket with badass Zen-7 chiplet goodness wouldn't be all that bad. Imagine the bandwidth between GPU and CPU that could be designed into such a board. So we stop shoving the GPU into a PCIe slot? Then PCIe is for ultra-fast storage, VR displays and other peripherals. Builders will have to figure out how to pull all that heat off the CPU and GPU chips then. Sounds like it could be the future. I hope it happens.

  • @bobdouglass8010
    @bobdouglass8010 1 year ago +1

    I can't even begin to say how much I loved this. The perfect mix of technical description and story.

  • @saisibi6708
    @saisibi6708 1 year ago +273

    As far as these changes go, I really, really hope the modular nature of PCs does not go away. I would very much like to still install the GPU of my choice and the RAM of my choice, and build it all on a motherboard of my choosing, all in an enclosure that I like. So I hope it doesn't all become a bunch of boxes doing things monotonously.

    • @mikeloeven
      @mikeloeven 1 year ago +10

      In theory you could end up with more customization: desktop system boards with universal sockets that can integrate multiple different SoCs for whatever purpose you want, so going from x86 to ARM makes no difference, as long as companies want to maintain modular design.

    • @maman89
      @maman89 1 year ago +3

      Doesn't this defeat the purpose of going ARM in the first place? We were promised the same thing with phones, but we all know how that went.

    • @HamidKarzai
      @HamidKarzai 1 year ago +2

      The fact that you can pack the RAM extremely close to the CPU is part of what makes Apple's SoCs so much more performant. I don't know if modularity is necessarily going to go away, but it's going to come at the cost of performance, and there's nobody to blame but physics itself.

    • @jiegao3591
      @jiegao3591 1 year ago

      @@mikeloeven They definitely did that for RAM and the whole DDR system. I can't see why companies can't make really tiny sockets with a pinout and communication standard to do something similar on a really small board.

    • @filleswe91
      @filleswe91 1 year ago +6

      The problem with the PC's modular design is that we lose performance by having the components so far from each other; we get LOTS of latency between them, where each signal's latency is counted in nanoseconds (ns), not milliseconds (ms). That's why Apple put almost every component on one silicon package: everything is super close together.
      Personally, I've been looking forward to a much more environmentally friendly world of computers (RISC/ARM everywhere), to put less strain on the electrical grids and the planet with all the oil, coal and other fuels humanity burns to charge our electronics.
      Way to go, everyone working to bring ARM in to replace x86! ♥
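
To put rough numbers on the "counted in nanoseconds" point above: signals in a PCB trace travel at roughly half the speed of light, so trace length alone adds flight time. A back-of-envelope sketch; the distances are illustrative assumptions, not measurements of any real board or package:

```python
# Signals propagate through a typical PCB dielectric at roughly 0.5c,
# i.e. about 15 cm per nanosecond.
PROPAGATION_CM_PER_NS = 15.0

def round_trip_ns(trace_cm: float) -> float:
    """Request/response flight time over a trace of the given length."""
    return 2 * trace_cm / PROPAGATION_CM_PER_NS

# Illustrative distances: RAM dies on the same package (~2 cm of trace)
# versus a socketed DIMM (~10 cm). Note this is only the wire delay;
# total DRAM latency (tens of ns) is dominated by the DRAM access itself.
for label, cm in [("on-package", 2.0), ("DIMM slot", 10.0)]:
    print(f"{label:>10}: {round_trip_ns(cm):.2f} ns round trip")
```

Flight time alone is a small slice of total memory latency; the bigger practical win of short traces is electrical, since they make very wide, fast buses feasible, which is where on-package RAM gets its bandwidth.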

  • @joerhorton
    @joerhorton 1 year ago +329

    I've been through this a few times during my 52 years of life; CPUs come and go, and backwards compatibility is the key issue here. However, if the architecture can also provide that compatibility through some form of emulation or virtualisation, I can see PCs of the future being smaller and less power-hungry. In the meantime I am waiting for the next gen of GPUs and CPUs to be released.

    • @seasong7655
      @seasong7655 1 year ago

      Windows 11 on ARM is said to have emulation for x86 programs. It looks like it could be done.

    • @joshsmyth130
      @joshsmyth130 1 year ago

      It's just another thing we'll have to learn to work with. Everything in tech changes; who would have predicted the prevalence of mobile computing 20 or 30 years ago? I'm just worried because I'm heading through a CS degree and alternative architectures haven't been covered in nearly as much depth as x86.

    • @TheArsenalgunner28
      @TheArsenalgunner28 1 year ago +1

      It all comes dooooown to profit. See, I was recently talking about this with some people when the rumoured power draws of the Nvidia 40 series were leaked. GPUs don't have to make big jumps anymore; their capabilities and performance are insane, to the point that it's almost unnecessary.
      What really would be in the consumer's interest is optimising GPUs going forward, matching the performance with less power draw and better cooling. But of course, that means risking being second in the GPU race, and of course less justification to sell the next gen at higher prices. I mean, if performance doesn't leap, why would people buy a new card for £200 more than last year's?
      It's like iPhones. They can do anything now and are amazing... almost too amazing. I barely use all the features on my iPhone 11, and this thing is expensive. How it can go beyond what it already does, I don't know, but Apple will find something to stick on it to sell a new one next year. GPUs are heading the same way. When the power draw of computers reaches the point where you have to rewire your entire house, I'd like to think that will surely be the stopping point. I know some people will be dumb enough to do it, but surely not everyone.

    • @pirojfmifhghek566
      @pirojfmifhghek566 1 year ago +2

      Depends on who wins the IPC vs wattage vs performance game--RISC or CISC. The x86 chips are certainly power hogs, but they tend to have greater IPC gains over each generation than M2 has had over M1. There's also the question of how the x86 architecture will fare as the motherboard/RAM/storage ecosystem goes through more generations. PCIe has gone through some major growth spurts lately and it doesn't seem to be slowing down. It feels like most components aren't even taking full advantage of the current bandwidth and we're already moving on to PCIe 6.0. This is the sort of thing that generally favors computers where the ability for expansion is key, and current x86 is still the gold standard for that.
      RISC has obvious benefits for efficiency and some of the x86 benefits over RISC may in fact be moot... it's just a damn shame that Apple is the only mainstream ambassador we have for that architecture. As long as they're the only contender at the RISC table, we'll never know how high it can fly. Apple simply prioritizes aesthetics, low fan noise, and engineered obsolescence over raw power. You can't tweak it. Can't put an AIO on it. Can't get any more juice out of it. There's no use. The bios is all locked down. Somebody else needs to make a decent ARM chip for PC enthusiasts. I'm curious as all hell to see what it might pull off. It'll be a cold day in hell before we get to see somebody playing around with LN2 and an M2 chip, that's for sure. I just wanna see what's possible. I think we all do.
      Another thing that hasn't really been addressed is that we may start seeing a big need for socketable, purpose-built chips that do fully analog compute tasks in the future. Look at the stuff that the guys over at Mythic are making and tell me that socketing a chip like that into a system wouldn't make a MASSIVE difference in their capabilities. Their advances have shown us that we depend far too much on pure software to solve every problem and it's become the biggest bottleneck that we have. AI is rapidly expanding in all corners of the computing landscape and, without a doubt, analog simply does it better. It's scary how much better it is at these tasks. I believe that these components are going to have as big of a presence in our computers as video cards and RAM sticks have today. The ecosystem that can integrate analog chips the most effectively will be the one that ultimately succeeds in the coming decades.
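The "IPC vs wattage vs performance" framing in the comment above comes down to simple performance-per-watt arithmetic; a quick sketch where all chip numbers are hypothetical, chosen only to show how the metric is computed:

```python
# Quick performance-per-watt arithmetic behind the "IPC vs wattage"
# framing. All chip numbers below are hypothetical, chosen only to
# show how the metric is computed, not measured from real silicon.
def perf_per_watt(ipc, clock_ghz, watts):
    # instructions per second = IPC * clock; normalise by power draw
    return ipc * clock_ghz * 1e9 / watts

x86_chip = perf_per_watt(ipc=5.0, clock_ghz=5.0, watts=125)  # hot but fast
arm_chip = perf_per_watt(ipc=6.0, clock_ghz=3.2, watts=30)   # cooler, wider

print(f"{x86_chip:.2e}")  # 2.00e+08 instructions/sec per watt
print(f"{arm_chip:.2e}")  # 6.40e+08
```

The takeaway is that a wider, lower-clocked core can lose on raw throughput yet win decisively on this metric, which is the whole argument for the ARM-style approach.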

    • @lawyerlawyer1215
      @lawyerlawyer1215 1 year ago +2

      @@TheArsenalgunner28 I have a 3080 Ti with a Ryzen 9 5950X, 32 GB of DDR4-4200 RAM, a 4 TB NVMe SSD, liquid CPU cooling and 8 Noctua fans.
      Built the system 19 days ago; all I have on it is a fresh install of Windows and Cyberpunk 2077.
      Even overclocking both the 5950X CPU and the 3080 Ti GPU, I can't get 60 fps at 4K max settings with DLSS on quality.
      I can't get a steady 60 fps even with DLSS on balanced.
      I have to set DLSS all the way down to performance to get 60 fps, which has been a minimum standard for PC gaming for over a decade.
      So how are GPUs overpowered?
      There are a handful of games able to bring even the almighty 3090 Ti to its knees when pushing 4K max settings. So no, graphics cards aren't overpowered. When the most demanding game on the market can't make high-end cards break a sweat, that's when a graphics card is overpowered. In comparison, the 1080 Ti was more overpowered at launch than these new cards, since ray tracing wasn't a thing back then, and 4K PC monitors were a very rare sight, and useless because they were 60 Hz.
      Games were 1440p with no ray tracing, which the 1080 Ti could take like a champion.
      These newer cards might be much more powerful, but they face much harder challenges.

  • @Cars1Gunz1and1Weights
    @Cars1Gunz1and1Weights 1 year ago +32

    So glad this channel gets into the technicals. This guy is their best addition to the lineup

    • @themonsterunderyourbed9408
      @themonsterunderyourbed9408 10 months ago +2

      LMAO!! careful, you'll get thrown in jail for misgendering him.

    • @Bellathor
      @Bellathor 9 months ago +2

      @@themonsterunderyourbed9408 lol, you weren't kidding.

  • @pissfilth
    @pissfilth 1 year ago +2

    Good video.
    I've experienced the PC from the first 8088/8086 processors to the latest CPUs. I always disliked how much more CISC the CPU became over time (when programming in assembler).
    SIMD is cool. Multi-cores are cool. But in the end, I love those RISC processors more because of their ability to do things simply: just execute the instructions I feed you. I can even calculate (in advance) how long my program will take to execute (to ultimate accuracy).
    x86 got cluttered with many things.
    I myself think: make processors more modular. Each module can use its own (type of, and amount of) memory, if needed. And perhaps combine those modules into your own CISC processor, if you want.
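The "calculate in advance how long my program will take" point works on simple in-order cores because each instruction costs a fixed, documented number of cycles, so runtime is just a sum. A sketch with invented cycle counts (not taken from any real core's datasheet):

```python
# Static timing analysis in the style the comment describes: on a simple
# in-order RISC core every instruction costs a known number of cycles,
# so total runtime is a straight sum. The cycle table below is invented
# for illustration.
CYCLES = {"load": 2, "store": 2, "add": 1, "mul": 4, "branch": 3}

def runtime_cycles(program):
    """Total cycles for a straight-line program, known before running it."""
    return sum(CYCLES[op] for op in program)

prog = ["load", "load", "add", "mul", "store", "branch"]
print(runtime_cycles(prog))  # 14
```

Out-of-order execution, caches and branch prediction are exactly the features that destroy this predictability on modern big cores, which is why it is mostly microcontrollers that still offer it.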

  • @dylf14
    @dylf14 1 year ago +301

    Every time I hear about how RISC is a much simpler instruction set and how CISC is simplifying with extensions, remember that RISC is increasing its own complexity as well. Both sets are moving towards a common center. A RISC-based system requires API makers, driver teams, and OS teams to reorient themselves to use more instructions to accomplish the same task. Software designers don't want to spend lots of time optimizing, and would rather move on to another project because that works better from a revenue perspective. Think about how much we ask game developers to optimize games for certain archs, and they just abandon the game and move on to the next title...

    • @kazioo2
      @kazioo2 1 year ago +38

      Jim Keller, the greatest CPU designer currently living on this planet, who did both, laughs at all these online wars over CPU architectures and claims the real, fundamental differences are minimal. The devil is in the actual implementation, not the architecture. If you don't believe him, compare Qualcomm's ARM core to Apple's ARM core, or even within a single company: compare x86 Zen 3 to Bulldozer (FX) - SAME COMPANY, SAME INSTRUCTION SET, drastically different IPC and efficiency (even when you ignore smaller node gains).

    • @n0stalgia1
      @n0stalgia1 1 year ago +3

      Developer ecosystems, especially on that level, are much longer-living. By your logic, the Linux kernel would've been abandoned a year after its release because it wasn't "better at a revenue perspective". And yet here we are.

    • @anon_y_mousse
      @anon_y_mousse 1 year ago +5

      Exactly. It just bugs the crap out of me that so many people praise RISC while you have to {load/do something/store} everywhere instead of just {do something} like with CISC. I don't care if the internal structure is more complex, the designers just need to do better there. It allows for faster code execution and always has, which is why CISC has been king these past 30 years. While RISC may be catching up, it's a missed opportunity for either Intel or AMD to not come up with a complete redesign and push for people to switch. They could absorb the costs, they have the money, and get everyone to switch to a clean and modern CISC design, but they're scared.
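The {load / do something / store} versus {do something} contrast can be sketched as a toy instruction-count comparison; both "ISAs" here are invented for illustration, and real cores complicate the picture with micro-op decoding:

```python
# Toy comparison of the {load / op / store} pattern: a CISC-style
# machine can add straight into memory with one instruction, while a
# load/store (RISC-style) machine needs three. Both "ISAs" here are
# invented for illustration.

def cisc_add_mem(mem, addr, value):
    """One instruction: add-to-memory."""
    mem[addr] += value
    return 1  # instructions executed

def risc_add_mem(mem, addr, value, regs):
    """Three instructions: load, add, store."""
    regs["r1"] = mem[addr]           # load  r1, [addr]
    regs["r1"] = regs["r1"] + value  # add   r1, value
    mem[addr] = regs["r1"]           # store [addr], r1
    return 3

mem = [10, 20, 30]
print(cisc_add_mem(mem, 1, 5), mem)             # 1 [10, 25, 30]
print(risc_add_mem(mem, 1, 5, {"r1": 0}), mem)  # 3 [10, 30, 30]
```

Note the caveat the thread is circling: modern x86 decoders split that single memory-operand add into load/add/store micro-ops internally anyway, which is part of why Keller-style arguments call the ISA difference superficial.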

    • @copperfield3629
      @copperfield3629 1 year ago +12

      The real genius of fully optimising the operation of code on a RISC CPU is down to the compiler writers. While the real low level stuff may be written in assembler (memory copy functionality being a typical example which is tweaked depending on the exact hardware platform in use), for the operating system (drivers etc) a higher level language will be used and the creation of efficient code to run on that CPU is down to the compiler. Most of the OS software engineers don't need to get mired down in the minutiae of the processor's architecture.

    • @eloyreno
      @eloyreno 1 year ago

      A real conversation. Don’t know what’s going on but I’m here for it.

  • @TheDoubleBee
    @TheDoubleBee 1 year ago +280

    To be perfectly honest, I don't particularly like ARM - while not as walled-in as x86, their architecture is still proprietary. Myself, I pray RISC-V takes off instead - it is completely open, and it absolutely demolishes even the M1 in performance-per-watt, as per the Ars Technica article "New RISC-V CPU claims recordbreaking performance per watt".

    • @pazu8728
      @pazu8728 1 year ago +54

      Yes. RISC-V please.

    • @akatsukilevi
      @akatsukilevi 1 year ago +10

      @@pazu8728 I'm fully on board with RISC-V; I really want to have a computer I can daily-drive with it

    • @BlommaBaumbart
      @BlommaBaumbart 1 year ago

      RISC-V is no solution to a walled garden. Hardware is only as free as its physical implementation. RISC-V as a concept is very free, but that will be worth nothing if the RISC-V chips that are actually made all contain spyware or locks implemented by their manufacturer.

    • @torinireland6526
      @torinireland6526 1 year ago +25

      YES, this is the way it needs to go: open architecture, expanding modularity/repairability and promoting MORE consumer control over our devices - NOT LESS.

    • @jfftck
      @jfftck 1 year ago +1

      This is where we need PC chip makers to focus to build better SoCs; the nice thing is that GPUs are also part of the design. Hardware manufacturers need to see that their hardware designs are the real proprietary parts, and the drivers should always be open source to allow continued support for those who can't upgrade all the time. Not many companies make hardware, and that should be enough to keep a company in business; the software shouldn't be allowed to force consumers into buying new hardware. Maybe a law should be passed that requires hardware to be supported with, at minimum, security patches, with all code open-sourced if the company is unable or unwilling to provide the patches, so the community has a chance to provide them.

  • @JeffRobinson-ro1tm
    @JeffRobinson-ro1tm 1 year ago +5

    I could see a modular approach done with SoCs. Kind of like a game cartridge sort of thing. You could just swap out bits, like the main chip, storage and even extra GPUs. You could then implement some sort of quick release water cooling

    • @Koubles
      @Koubles 1 year ago +4

      Any attempt to modularize an SoC will inherently decrease performance due to longer signal paths and weaker signal integrity. So I don't think we'll see something like that.

  • @highplainsdrifter1502
    @highplainsdrifter1502 10 months ago +1

    Computer builds are a piece of art. If computers were going to be replaced, then the laptops that were marketed as desktop replacements would have replaced them. I started building computers when I was 12 years old, and it was a 286 SX. You would think it would have been easy back then, but I still have my motherboard, and it has jumpers to set everything on the board: the bus speed, whether you have a co-processor, with jumper diagrams all over the motherboard.

  • @NavyDood21
    @NavyDood21 1 year ago +668

    I REALLY hope that the PC never actually goes this way. They would either need to come down in price so much that it wouldn't even be worth it for companies to sell them, or they would need to magically figure out how to make them just as modular as PCs currently are. I don't want things to go the way of Apple, overcharging for equipment that is almost completely unrepairable by anyone but a full lab.

    • @lucasrem
      @lucasrem 1 year ago +3

      You can still buy a board with integrated Intel graphics on it, from other parties...

    • @Thep184
      @Thep184 1 year ago +33

      It will not happen. Just wait till the M1 starts to break down... upgradable? Not a given. Oh, you just need more of one thing and the rest is fine? Well, that sucks: buy an entire new chip. Look at current PCs: CPU 400 bucks, RAM 100 bucks, GPU 600-1000 bucks... now imagine what a comparable one-chip part will cost... nah man... big PCs might be power hungry, but that is how it is... a stationary PC system doesn't need super-efficient chips... and in addition, PC power consumption is not the issue that fuels climate change.

    • @cybrunettekitty5197
      @cybrunettekitty5197 1 year ago +19

      Lol, everyone in the comments is acting like a boomer already. Change will happen, and PCs definitely won't be the same, be it in 5 years or 20 years, but just seeing everyone complain about something they'll eventually come to like is hilarious. Of course there will always be those like "back in my day...", "you won't ever see me getting rid of my PC", "I'll never get rid of Windows XP", lmao..

    • @GarryMah85
      @GarryMah85 1 year ago +11

      I don't think the point of the video is to say that Apple will take over the PC industry, but rather that their approach of using the highly efficient ARM architecture will probably be the way of the future. Apple has shown that ARM works for high-performance computing applications, and now it's up to other PC chip makers to figure out how to do the same for the PC platform.
      Apple only makes SoCs for Macs. There's a huge Windows market out there, just waiting for someone to make great ARM-based hardware for it.

    • @Ecselsiour
      @Ecselsiour 1 year ago +2

      Perhaps not in our lifetimes. Computers used to be the size of entire rooms half a century ago. In another half century, the full tower gaming rigs today might be a wrist watch. Take what's said in this video and then add in something like carbon nanotubes, which we might crack in a hundred years.

  • @aaronspeck1644
    @aaronspeck1644 1 year ago +75

    I FINALLY got my RGB to sync properly and you're telling me a silver cube is going to take that from me?!? Nooooooo

    • @vidhyachan6494
      @vidhyachan6494 1 year ago

      Hahah this made me crack up

    • @chadultrapromax1735
      @chadultrapromax1735 1 year ago

      xD Razer synapse doesn't work though

    • @joesterling4299
      @joesterling4299 1 year ago +3

      RGB can't die soon enough for me. I'm not willing to buy into Apple to get there, though. Screw that noise.

    • @hAT81
      @hAT81 1 year ago

      @@joesterling4299 you can say apple's cooling systems are miniaturized jet engines

    • @giornogiovanna734
      @giornogiovanna734 1 year ago

      @@yann2850_ it boosts performance duh

  • @jaredweiman2987
    @jaredweiman2987 1 year ago +3

    I gave my dad my M1 iMac after building a gaming PC. MacOS hardly has any games to play on it. It doesn’t matter that it CAN run current gen titles if Apple and software devs don’t allow it to. Until 80% or more of the steam store is available on MacOS, there’s zero point in having one as a dedicated gaming setup.

  • @dsrocks6905
    @dsrocks6905 9 months ago

    I think a middle ground can be found between fully integrated and modular systems. Possibly one where CPUs are still socketed and RAM slots still exist, but the CPU die is much larger and contains its own integrated memory and GPU, with the slots acting as a second level of memory, similar to how page files work, so that there is some level of upgradability while keeping the main benefits of an SoC. Having high-speed PCIe slots available may also mean the return of tech like CrossFire/SLI, as with a powerful enough SoC I can imagine it being possible to manage secondary devices more effectively, as long as there is software support. Honestly, just using ARM on a socketed chip would likely net a huge performance bump, as long as applications can be properly ported and optimized. ARM is incredibly powerful; it's just been overshadowed by AMD x86-64 because, despite ARM's efficiency, there was still headroom in the older architecture we are currently using. So why go through the extreme pain of switching everything over to a very different architecture if you don't need to?
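The page-file-style two-tier memory idea above can be sketched as a small fast on-package pool backed by larger slotted memory, with least-recently-used pages evicted to the slow tier (the capacities and page abstraction here are invented for illustration):

```python
# Sketch of a two-tier memory: a small fast on-package pool backed by
# a larger, slower slotted tier, with least-recently-used pages spilled
# to the slow tier - the same policy OS page files use. Sizes and the
# page abstraction are invented for illustration.
from collections import OrderedDict

class TieredMemory:
    def __init__(self, fast_capacity):
        self.fast = OrderedDict()   # page -> data, kept in LRU order
        self.slow = {}              # spill tier (the "slotted RAM")
        self.fast_capacity = fast_capacity

    def access(self, page):
        if page in self.fast:                 # fast-tier hit
            self.fast.move_to_end(page)
            return "fast"
        data = self.slow.pop(page, None)      # miss: promote from slow tier
        self.fast[page] = data
        if len(self.fast) > self.fast_capacity:
            victim, vdata = self.fast.popitem(last=False)  # evict LRU page
            self.slow[victim] = vdata
        return "slow"

m = TieredMemory(fast_capacity=2)
print([m.access(p) for p in ["a", "b", "a", "c", "b"]])
# ['slow', 'slow', 'fast', 'slow', 'slow']
```

A working set that fits the fast tier sees only "fast" hits after warm-up, which is the upgradability compromise the comment is describing: on-package memory for speed, slots for capacity.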

  • @enchantereddie
    @enchantereddie 1 year ago +481

    I cannot even recall in which year the saying "the PC is dying" first appeared. And obviously, it still hasn't finished this process. What I believe is that there will always be a need for the highest possible performance out of the desktop form factor, whether for normal enthusiasts or for some businesses. Lower-end PCs have been made into very small boxes for years and are used as family rigs as well as business PCs. But they are not killing the bigger desktop boxes. Computing technology can be built into many sizes, from supercomputers to mobile phones, and desktop is one sweet spot among them. Even if ATX and x86 are replaced by something better (possibly ARM), people will always want/need their computers to be customizable and to allow them to grow bigger for the sake of better performance.

    • @shannono.5835
      @shannono.5835 1 year ago +41

      And consumers - enthusiasts - will continue to demand a customizable PC, just like auto-industry modders find ways to customize and upgrade what otherwise was an automotive SoC. I think the basic functionality of future offerings will be locked, but "for a fee" upgradeability will be possible. Maybe the future looks like the home PC of the '80s, with a Mac Mini-style initial presence and an "expansion box" to permit the customization and upgradeability that enthusiasts demand (remember the upgradable laptop docks, anyone?)

    • @jb888888888
      @jb888888888 1 year ago +29

      It seems to me that "PC is dying" came about when they noticed that the sales weren't growing exponentially any more. It is my semi-considered opinion that the market is saturated: everyone who wants a PC already has a PC. (Those who die off are being replaced by the first timers.) Hence a flattening of new PC sales. People will still want to upgrade their PC to a better one, or upgrade individual components.

    • @enchantereddie
      @enchantereddie 1 year ago +6

      @@jb888888888 Hi, I reckon one possible explanation for those stalled sales was that at that time AMD wasn't doing well and Intel didn't make much progress in their next-gen CPUs, so people waited longer before upgrading their PCs. Much of the market share was lost to mobile phones as well.

    • @harshbarj
      @harshbarj 1 year ago +15

      "PC is dying" - this has been said since at least the early 90s, so 30-ish years. Now when it happens (and it will eventually), people will say "see, we told you." Say something long enough that is possible or likely, and eventually you'll be right. What would impress me is if people would give a year. Otherwise, GTFO.

    • @ArcangelZero7
      @ArcangelZero7 1 year ago +14

      @@harshbarj "Say something long enough and eventually you'll be right." See, why can't we keep repeating CONSTRUCTIVE things like "THIS will be the year of the Linux Desktop!" :)

  • @jameslake7775
    @jameslake7775 1 year ago +254

    Hm. I'd say there's a mountain of "IF"s that need to be cleared before modular PCs and x86 go away.
    Apple Silicon is power-efficient, but those designs have been under criticism for poor repairability and upgradability. Apple has sort of gotten away with it by having low expectations of repairability, but every PC becoming a black-box that requires an authorized technician and specialized tools won't sit well with many people.
    The DIY market isn't large, but does represent billions of dollars per year. There's going to be pushback from both sides to any attempt to eliminate that.
    Qualcomm has failed to compete, and I feel like I've heard them claim their next model is gonna be the one several times. Also, jumping from a duopoly to a near-monopoly known for poor long-term driver support doesn't seem like a move companies will be lining up for, and Apple Silicon/Qualcomm powered devices so far haven't been cheap (although at least the AS ones have been fast).
    Windows has a lot of legacy software that either needs to be emulated or left behind. That's either a difficult technical challenge, or it costs every business that depends on some specialty piece of software significant time and money, while angering every gamer whose favorite older title becomes unplayable.
    So IF people decide power consumption matters more than e-waste, and IF the major players decide and succeed in squeezing out a multi-billion dollar industry, and IF the ARM processor market gets more competitive in terms of both performance and available vendors, and IF Microsoft can get emulation right... then maybe every Windows PC will be an SOC and soldered-down everything.
    Not to say that things aren't going to look different in the future, but following Apple into an all ARM future is easier said than done.

    • @involuntaryoccupant
      @involuntaryoccupant 1 year ago +7

      that's a very good point!

    • @jamesmicklewright2835
      @jamesmicklewright2835 1 year ago +12

      Yep. Repairability and upgradability aside, most of the things I do on my PC that couldn't be done with any old device from the past 10 years are playing games, and I tend to stick with a core library of favourites rather than moving on to the latest and greatest all the time. If they come out with a compatibility/emulation layer that has negligible performance impact AND works flawlessly (no weird glitches or bugs like I often see when emulating different architectures), then fine, but until then, no thank you.

    • @pedrovergara7594
      @pedrovergara7594 1 year ago +20

      Modular PCs are not going anywhere. Enterprise and server markets require modularity, and selling that technology to consumers means manufacturers can double-dip on their R&D investments.
      Will there be a bigger market for Apple-style SoCs? Probably, but unless there is some revolutionary change in SoCs that allows servers to run on them, we will still have some form of modular components available.

    • @jamessavent3636
      @jamessavent3636 1 year ago +7

      With right to repair becoming law in more places beyond the EU, they won't be able to get away with shipping an SoC without the average consumer being able to repair it

    • @funrsguysandmore
      @funrsguysandmore 1 year ago +3

      Can we dumb this down for the casuals and the ones who don't wanna read.

  • @-dimar-
    @-dimar- 1 year ago +5

    I've been building PCs since the late 90s, and built several last month. I don't see custom PC building going away any time soon. I do wish the pre-built PC business would go away, and that users would learn how to build their own computers.

  • @matthewnorman1326
    @matthewnorman1326 1 year ago +2

    I can normally keep up but this went over my head. I would need a primer video for this video.

  • @CarthagoMike
    @CarthagoMike 1 year ago +926

    I have no doubt ARM will play a larger role in the PC market in the future, but for now I don't see it dominating over x86 anytime soon.

    • @Aereto
      @Aereto 1 year ago +141

      It can only cut into the PC market if it is seamlessly compatible with gaming in general. If it fails to run games at the architectural level, it will stay in mobile gaming, which gets the lowest level of gamer respect, with the exception of gambling games.

    • @pumbi69
      @pumbi69 1 year ago +8

      But what about Apple Silicon? They are already dominating over x86

    • @gregory5749
      @gregory5749 1 year ago +13

      I think it will, but only for mobile computers/laptops first. This actually makes me wonder what Apple will put into their next Mac Pro.

    • @shockwaverc1369
      @shockwaverc1369 1 year ago +25

      not until UEFI on ARM is a thing (outside of expensive servers and underpowered SBCs) and manufacturers care more about drivers, i.e. not just releasing a fork of Linux 4.4 that requires you to install shady archived versions of GCC and calling it a day

    • @Dave102693
      @Dave102693 1 year ago +5

      Only if developers start converting their software over to ARM... and OEMs start optimizing for ARM instead.

  • @RowanBird779
    @RowanBird779 1 year ago +186

    I wish more CPU manufacturers existed, like Cyrix, so it's not just red vs blue

    • @billj5645
      @billj5645 1 year ago +1

      I used to have a Cyrix processor- much more performance for much less cost. What happened to them?

    • @RowanBird779
      @RowanBird779 1 year ago +1

      @@mattmccastle9145 I already know quite a lot about Cyrix

    • @khatdubell
      @khatdubell 1 year ago +25

      I hope the irony of you wishing there were more tech competition while flying the logo of one of the biggest crushers of competition in tech as your profile pic isn't lost on you.

    • @RowanBird779
      @RowanBird779 1 year ago +3

      @@khatdubell I'm well aware of what Microsoft did, but hardware and software are two separate things, I guess CPUs are software now?

    • @khatdubell
      @khatdubell 1 year ago +23

      @@RowanBird779
      So, what? You're implying you're ok with it when it comes to software but not with hardware?
      And FWIW, MS _did_ essentially do this with hardware as well by teaming up with IBM

  • @ramahadranhennessy9300
    @ramahadranhennessy9300 1 year ago +3

    Over my dead body. The idea that a PC is a modular unit has meant we, as consumers, have retained a small amount of power and control to purchase when and what is needed, without the RAM or hard drive SOLDERED TO THE MOTHERBOARD. I'm an Apple fan. I'm not a fan of CPU dystopian totalitarianism, and the same goes for having your GPU powered by a company over the net, or for allowing a company like Intel to pay-wall the speed or power of your CPU. And don't get me started on our governments... innovation has to be balanced with consumer power, privacy, and freedom.

  • @computerpwn
    @computerpwn 10 months ago +3

    this is scary and should be talked about more... trading the freedom to choose for unmatched efficiency. how can you service something you can't take apart?

    • @computerpwn
      @computerpwn 10 months ago +1

      and this is exactly what Nvidia will do, the way things have been going

  • @Somtaaw7
    @Somtaaw7 1 year ago +246

    Cool. Super looking forward to the future where everything is one chip and if you want to upgrade or even replace a broken component you need to throw the whole thing out and buy another system from a vendor. Super exciting.

    • @lukasl3440
      @lukasl3440 1 year ago +27

      Didn't we already have this in form of laptops, mobile phones, tablets and game consoles?

    • @iandavid7730
      @iandavid7730 1 year ago +41

      Wonder what colour the household e-waste bin will be.

    • @lukasl3440
      @lukasl3440 1 year ago +5

      @@iandavid7730 In my country it's red.

    • @Kholaslittlespot1
      @Kholaslittlespot1 1 year ago

      @@lukasl3440 yeah, exactly...

    • @PropaneWP
      @PropaneWP 1 year ago +21

      @@lukasl3440 Which is why people who know better don't waste money on expensive laptops, phones, tablets and game consoles. When in need, they buy the cheapest adequate-quality product available, as it will be e-waste in a couple of years anyway.

  • @tobyroberts6571
    @tobyroberts6571 1 year ago +709

    To be honest, I'm surprised that enthusiasts have managed to keep the DIY build PC game going for so long. So many other hobbies and just things in general are taking away input and choice for consumers. I hope it hangs on a bit longer - I love my stupid PC because it was me who put it together.

    • @ShaiyanHossain
      @ShaiyanHossain 1 year ago +111

      if anything the DIY PC crowd has grown due to the emergence of streaming and better ways to play online games

    • @aserta
      @aserta 1 year ago +28

      You're wrong. It's the same BS that's being said in the car industry about electric cars - that they'll kill any kind of custom work... and there are already dozens of firms that do custom performance parts for electric cars, even boards and brains for them.

    • @cyjanek7818
      @cyjanek7818 1 year ago +66

      @@aserta For now.
      Tesla already makes their motors and battery packs non-functional if you don't use their stuff, and reverse engineering that is much more complex than making a new fuel map for an engine, where all you need to control is the timing of the spark plugs and injectors.
      Even people who race Teslas do not replace that stuff; at best they just pack it differently.

    • @8lec_R
      @8lec_R 1 year ago

      @@cyjanek7818 Well, that's why enthusiasts hate Tesla. If they don't stop this bullshit soon, every other company will surpass them and they'll start losing market share. They are just trying to make bank while they are the major electric car manufacturer

    • @TjPhysicist
      @TjPhysicist 1 year ago +34

      Enthusiast communities around otherwise forgotten technology will always be a thing, though. DIY PCs will always be a thing, IMO. And while PCs have basically been displaced in the market by phones, tablets, gaming consoles and laptops, the DIY PC will never actually get fully replaced for those who still want it. Similar to CRT TVs, portable music players, and vinyl players - all old technologies that have been kept alive for years if not decades by enthusiast communities (to the point where there are still companies that make and release new portable music players or vinyl players every year). As opposed to something like Betamax or floppy disks - technologies that have basically been entirely forgotten and completely replaced.

  • @austinmason6755
    @austinmason6755 1 year ago +101

    I love watching Anthony on the channel. I feel like he delivers great information in a way your everyday fella can understand!

    • @halko1
      @halko1 1 year ago

      He delivers it well and isn't just a talking head, but actually a skilled and knowledgeable professional.

    • @miso1995srb
      @miso1995srb 1 year ago

      @@halko1 he probably is, but how do you know

    • @KingBobXVI
      @KingBobXVI 1 year ago

      @@miso1995srb - LTT has been very upfront about the qualifications/skills of the people on their staff in previous videos about the crew.

  • @sggsquadpresents
    @sggsquadpresents 1 year ago +4

    Imagine that, let's say, the integrated eMMC were to fail. You would pretty much be screwed, because you would not be able to service it properly, just like the new MacBooks with Apple Silicon. That is why some people prefer desktops over laptops: desktops are so much more serviceable. Although there are companies like Framework or Kano, they could never compete because those companies use uncommon connectors.

  • @ash36230
    @ash36230 1 year ago +173

    Me reading the title and thinking "Oh no, another chip shortage is coming"
    2:00 "It's wasteful," he says of a machine that Apple wants you to throw away if something minor breaks, because you can't replace or upgrade any of its parts easily.
    If a change comes, it'll be through other ARM chips. The desktop will still be here, hopefully with replaceable components, but it'll change slowly - for as long as Microsoft holds Windows on ARM back, and as long as there's no way on Windows to run x86/64 programmes with at least the same level of efficiency and stability as Rosetta 2

    • @cosmicclan6608
      @cosmicclan6608 1 year ago +7

      I thought GPU shortage lmao

    • @aetherxsn1591
      @aetherxsn1591 1 year ago +1

      @@cosmicclan6608 same

    • @RickyBancroft
      @RickyBancroft 1 year ago +17

      My thoughts exactly - when I upgrade, I don't throw away my parts, I put them on ebay so someone else can enjoy them for a few more years. The opposite of wasteful.

    • @renderedpixels4300
      @renderedpixels4300 1 year ago +1

      Although there's a lot of disingenuous stuff in the video, I think he's referring to the M1 as ARM in general, not as Apple. But like x86, ARM also has intellectual property and is just as closed as x86 in a lot of ways. ARM would be a good thing for the energy consumption of computers in the long run, but like so many things in the tech world, it's a chicken-and-egg situation where (in this case) ARM doesn't gain traction because of the lack of support, and the lack of support comes from nobody using it and devs not wanting to put time into it.
      There are too many benefits to ARM in the portable space to ignore, though, and I hope the M1's power coaxes companies into developing ARM versions of their software.

    • @MmntechCa · 1 year ago

      Apple's dirty little secret: they brag about reducing CO2 emissions but don't mention how much e-waste their disposable tech model generates.

  • @rossjennings4755 · 1 year ago +364

    The future this video speculates about really has two pieces: moving from x86 to a RISC architecture like ARM, and moving away from discrete CPUs and GPUs toward SoCs. It's really only the second one that sucks for enthusiasts (and anyone who values replaceable parts). Apple would like us to think the two pieces are inseparable, but they don't have to be. I can imagine a future where the current PC ecosystem is replaced by a new ARM- or RISC-V-based platform, but that platform still allows devices to be assembled from components made by different manufacturers, using standardized interfaces and an assembly process at least somewhat accessible to people outside of fabs, so that those components can easily be upgraded or replaced if they fail.
    In other words, I don't want a Mac, but I do want something like the Framework laptop, but NUC-shaped and running on an ARM or RISC-V CPU. It's all right if the PC has to die, as long as its spirit can live on.

    • @Hack3900 · 1 year ago +36

      Please let RISC-V be the winner; I don't want more licenses in the way of tech when open alternatives exist ;-;

    • @haysoos123 · 1 year ago +4

      We don't know what the Apple silicon Mac Pro is going to be yet, which presumably should be their modular system. It may be that the parts are "from one manufacturer," at least for the near future, but it's also possible that third parties would be involved if it made sense.

    • @AR15ORIGINAL · 1 year ago +4

      What is a NUC?

    • @RobertWilke · 1 year ago +8

      That would be optimal; that said, these companies would love to just lock you in and never let YOU choose. As much as I like Macs, I do like choosing my parts. Change is inevitable; how and where that change goes is the issue. Let's hope it's for the better.

    • @davidperry4013 · 1 year ago +4

      A socketed 65W TDP ARM CPU and a 105W TDP ARM-based dedicated GPU with HBM2 VRAM would be the future of custom-built gaming rigs.

  • @MrTubeMeToo · 1 year ago

    Wha? I loved this video. I understood so little of it, and THAT is super exciting to me. I'm looking up all the terminology and further investigating what I did not understand [94.33% of it]. Not to worry, I'll watch it over and over until I do understand. :) Great video. Thanks.

  • @MultiHunterOne · several months ago +2

    Hold on: just because we're reaching the limitations of x86 and are probably going to switch to ARM doesn't mean computers have to turn into SoC machines with the chips soldered onto the motherboard. What's stopping us from having ARM chips in LGA or PGA packages that are just processors, not SoCs?

    • @oliverdickens3219 · 13 days ago +1

      Tbh, they can make more money if they make all-in-ones, or at least attach the chips to other things.

  • @MastermindAtWork · 1 year ago +54

    The main thing I still want to have, even though integrated CPU+GPUs are very convenient, is upgradability: having the ability to upgrade storage and RAM myself without spending an extra $1000 on a 16GB RAM and 2TB storage configuration.

    • @garrettrinquest1605 · 1 year ago +2

      Exactly! The best thing about the PC space right now is the modularity. If they could keep that somehow, I'd be super down to move to ARM over time.

    • @storyanaksekolah2 · 1 year ago +1

      Until we see a modular ARM system.

    • @ids1024 · 1 year ago +2

      Upgradability, and the ability to combine different options for memory, storage, cpu, gpu, other PCIe cards (video capture, AI accelerator, etc.). With everything integrated on one board, and a lot of it in one chip, there won't be as many options.
      Perhaps we still see this modularity in enthusiast gaming hardware partly because it's essential in the server and workstation space, where hardware can be configured in vastly different ways depending on its use (GPU compute server loaded with GPUs but minimal storage; storage server with a huge number of SSDs and no GPUs, etc.).

    • @vizender · 1 year ago

      Well, I think storage upgradability is important, but I'm not so sure about RAM. Right now, from what I understand, one of the biggest issues with x86 RAM is the speed at which it communicates with the CPU/GPU. In an ARM SoC, the RAM being so close to the CPU/GPU cores makes it much more efficient. I've seen posts from people saying they went from a 16GB Intel Mac to an 8GB M1 Mac, and the RAM seemed to perform very well compared to the older Intel model.
      They didn't cite any specific data, but I would not be surprised if it's true.
      I really think that for most consumers, who don't use their computer for very specific tasks, upgrading the RAM after the initial purchase (assuming the initial configuration was sufficient at the time) will not be important for at least a few years; and after some years, most PCs have had so many parts replaced that they're basically new PCs anyway.

    • @aravindpallippara1577 · 1 year ago

      @vizender The M1's RAM isn't anything particularly special; it's just LPDDR4X as far as I understand (better than DDR4, worse than DDR5).

  • @shiropanx · 1 year ago +204

    I've been hearing that PCs need to change since I first got the chance to build one for myself, way back in the mid '90s. Always doom and gloom: Apple this, mobile that. Still... conventional PCs are still around.

    • @wakannnai1 · 1 year ago +15

      Yeah I think the doom and gloom is a bit overplayed here. I don't think the market is going to change dramatically in the next few years. For most consumers who would consider a conventional PC, I doubt efficiency is at the top of their list of priorities when purchasing a product. Most consumers in this space are fine with power consumption being higher as long as it's not excessive. Performance is far more important here. The M1 is a good chip that has big implications for Apple, but not the industry as a whole. It will be far more important on low and medium power consumption markets where efficiency matters like smartphones and laptops. We will probably see significantly more appealing and efficient APUs/SoCs from both Intel and AMD as they attempt to keep the laptop PC market competitive with not only each other but also with Apple. I think we're already seeing this with Phoenix Point from AMD. I'm sure Intel has something similarly radical as well.

    • @ElectricityTaster · 1 year ago +12

      That CNC machine running XP making plane parts won't change to ARM anytime soon.

    • @whwhwhhwhhhwhdldkjdsnsjsks6544 · 1 year ago +2

      @wakannnai1 Idk, in some places at least, energy prices have become extremely high, and it just isn't viable to burn it all on slightly higher performance when you don't have to.

    • @xrayl7392 · 1 year ago +4

      Yeah, even this video is of course dead wrong. AMD's and Intel's top designers have explained multiple times that ARM doesn't enjoy the same benefits in a normal computer environment as it has in phones and Macs. So yeah, the "normal" PC isn't dead, and another misleading Tech Tips video is released to the world (just like the 5800X3D review, with performance numbers no other outlet seemed to agree with xD).

    • @beukneeq5766 · 1 year ago

      exactly

  • @jasonkelley6185 · 1 year ago +9

    I feel like one thing you didn't consider here is that mid-range PC components can often use far less power while achieving maybe 85% of the performance. How would that compare?

    • @wonrei · 1 year ago

      Nope, mid-range can't, especially when playing high-end games.

    • @zeronothinghere9334 · 1 year ago

      @wonrei The 3060 performs at about half the level of the 3090, which was the flagship at the time of this video. In gaming it's a bit below an M1 Ultra (50% instead of 66%) and uses a bit more power (50% instead of 30%). It's much cheaper, though.

  • @blumoogle2901 · 1 year ago +1

    The biggest changes I see coming are storage speeds catching up with memory speeds, and memory capacity catching up with storage capacity, so the differentiation will eventually go away, which might simplify the layout and remove the need for caching.
    At the same time, a lot of games work perfectly fine with onboard graphics, so a GPU/CPU merge, just like the absorption of sound cards decades ago, isn't a terrible idea; you'd just upgrade everything together.

    • @StewieG46 · 1 year ago

      Yeah, good point. I'm still on a 150GB HDD in my PC and a 250GB SSD in my laptop. They work more than fine for what I use them for. And since the max supported RAM is 128GB, that already gets close to my needs, at least xD

  • @samgray49 · 1 year ago +625

    I remember when they thought laptops would replace desktops. I don't think they'll ever fully replace desktops; maybe desktops will just get smaller and more compact.

    • @B1u35ky · 1 year ago +142

      To be honest, for most people laptops have replaced desktops. People don't have home PCs anymore unless it's for work or gaming, and even those are laptops a lot of the time.

    • @YN-io6kj · 1 year ago +18

      @B1u35ky That's true. I know some people who have just a laptop on a table, like a desktop. For me personally, I always need a desktop to complete a certain room.

    • @JudeTheYoutubePoopersubscribe · 1 year ago +23

      They have; your average person does not use a desktop-and-monitor setup. Hell, even gaming laptops have gotten good, and you can definitely get one and use it for gaming.

    • @MichaelKitas · 1 year ago +1

      @B1u35ky I agree. I recently replaced my PC as well, with a MacBook Pro at the same price.
      Edit: By "replaced" I mean I use it 99% of the time; I still own the PC. As for gaming, personally I don't game; it's mostly software-development use.

    • @PNWAffliction · 1 year ago +19

      Yeah, laptops will never be able to replace desktops, because people need to run beasts of apps: CAD, video rendering, music, game rendering, etc. That's why we keep bouncing back and forth between terminals, laptops, tablets, and so on; nobody can make up their mind about which works best. It's extremely situational.

  • @MustacheInaBox · 1 year ago +382

    God, the title gave me a heart attack, like when Linus said everyone should buy parts back in 2020.

    • @GPUGambon · 1 year ago +72

      The title is garbage-level clickbait.

    • @Dr_Steal_Computer · 1 year ago +3

      No one should buy parts; Linus is irresponsible.

    • @JAN0L · 1 year ago +16

      And he was correct about the upcoming shortages back then. I still remember the video response from Gamers Nexus "debunking" it.

    • @Zorooooooooooooooooooooooooooo · 1 year ago +8

      @@lunarvvolf9606 ...lol.

    • @aarvlo · 1 year ago

      @lunarvvolf9606 First of all... what? Second of all, what would China even have to gain from people not buying parts? Crypto mining? They made it illegal last year. Chinese competitors to American part makers? The Chinese desktop industry isn't developed enough to compete with Taiwan and the US.

  • @Leanzazzy · 1 year ago +1

    From what I understand, PCs still use x86, which is extremely old and inefficient compared to modern architectures like ARM's.
    It still keeps redundant legacy parts in order to maintain backwards compatibility.
    But as others have said, how will making the switch affect repairability and the ability to upgrade individual components, rather than locking everything down and forcing consumers to stick with a single manufacturer?

  • @patrickjones2843 · 1 year ago

    I like the reviews! It's fun to nerd out and hear about tech. Thanks!

  • @alexmills1329 · 1 year ago +545

    I feel like this leaves out critical information, like the fact that Apple is using a more efficient and denser silicon node, meaning the chip would be both larger and more power-hungry if they built it on 7nm like AMD's Zen 3. Zen 4 is on 5nm, and that will be an interesting comparison later this year.

    • @ikjadoon · 1 year ago +28

      Yeah, this was an oversight, but in the end it didn't matter. Even comparing Zen 3 to the Apple A12 (both on TSMC N7, non-P), Zen 3 drew 11.38W on average for 71.30 SPEC2006 points, while the A12 took 3.91W for 49.85 points. On *identical nodes*, the A12 (three generations old now) was still about 2x more efficient than Zen 3. Source: AnandTech's Zen 3 and A12 reviews.
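A quick sanity check of the "2x more efficient" claim above, using only the figures as cited in the comment (points per watt; the numbers themselves are the commenter's, not independently verified):

```python
# Figures quoted from the comment (AnandTech-sourced, per the commenter):
# Zen 3: 71.30 SPEC2006 points at 11.38 W average
# A12:   49.85 SPEC2006 points at  3.91 W average
zen3_perf_per_watt = 71.30 / 11.38  # ~6.27 points/W
a12_perf_per_watt = 49.85 / 3.91    # ~12.75 points/W

ratio = a12_perf_per_watt / zen3_perf_per_watt
print(f"A12 perf/W advantage over Zen 3: {ratio:.2f}x")
```

The ratio works out to roughly 2.0x, so the comment's arithmetic holds up given its inputs.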

    • @atul1991ful · 1 year ago +40

      @ikjadoon That would only be the case if performance scaled linearly with power. AMD designed their chips to work in that power envelope, so it's expected they won't be amazing at 4W. But even looking at the M1 and its iPhone counterpart, do you see linear performance scaling with power?

    • @NFStopsnuf · 1 year ago

      You then leave out that all the top physical chemists and spectroscopists go to Intel, not Apple, so Apple's chips will never have the same absolute power as Intel's until they hire better.

    • @ikjadoon · 1 year ago +10

      @atul1991ful What made you think AMD did not design its microarchitectures for ~4W cores?
      EPYC 7763: 280W, 64C. Per-core power ≈ 4.38W.
      EPYC 7713P: 225W, 64C. Per-core power ≈ 3.52W.
      The per-core power is even lower than this, because we're including the I/O die here.
      AMD specifically designs its cores to scale to very low power draw, well under 5W. Intel does too, but less successfully. This is the bread and butter of today's microarchitectures.
      Single-core power *does* translate to linear single-core performance in the most critical segment of the perf/W curve. The problem is that the uarch is too narrow in most Intel and AMD CPUs, so they pump up the frequency and suffer non-linear scaling. This is not news to either AMD or Intel; they know it.
      There's a reason mainstream CPUs sat at around 3 GHz for about 10 years.
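The per-core figures in the comment above are just package TDP divided by core count, which overestimates true core power since the package number also covers the I/O die, as the commenter notes. A minimal sketch of that arithmetic:

```python
# Rough per-core power estimate: package TDP / core count.
# This is an upper bound, since package TDP also includes the I/O die.
epyc_parts = {
    "EPYC 7763": (280, 64),   # (TDP in watts, core count)
    "EPYC 7713P": (225, 64),
}

for name, (tdp_w, cores) in epyc_parts.items():
    per_core_w = tdp_w / cores
    print(f"{name}: ~{per_core_w:.2f} W per core")
```

Both parts land well under 5 W per core, which is the point the commenter is making about low-power scaling.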

    • @drsupergood8978 · 1 year ago +8

      @ikjadoon Core power usage is non-linear with performance. Most of the performance is obtained with very little power (4-6W or so), around the base frequency. It's the boost frequency that destroys power efficiency, since it requires a higher voltage and so is less efficient cycle for cycle. At full boost, cores can easily use 12W+. Do these numbers look familiar? For the Zen 3 processor to achieve a higher score than the A12, it was almost certainly running at full boost, which effectively throws efficiency out the window for performance. If the cores were tuned to matching scores, power usage would be very similar between the two, possibly going either way between the more mature process node of the Zen 3 and the less complicated ARM cores of the A12.
      Fundamentally, performance cores all work in very similar ways regardless of their architecture. The main overhead x86-64 suffers from is increased decode complexity, from fewer explicit registers, complicated instruction support, and legacy instruction support. Even so, I suspect more of the die area is taken up by features like AVX than by instruction decode. Neither x86 ("32-bit") nor older instructions need to be particularly performant within the pipeline, so I expect not much die area is allocated to them.

  • @DeckDogs4Life · 1 year ago +13

    I feel like the problem is going to end up being lack of serviceability. Companies like Apple that are currently moving toward ARM processors offer next to no serviceability; they basically want you to throw the machine out and buy a new one.
    That's the biggest concern I think most people have: that all desktop PCs will go that way.

    • @ilenastarbreeze4978 · 1 year ago +3

      That is exactly my biggest concern. I like being able to service my own machine.

    • @BlommaBaumbart · 1 year ago

      People already buy a new phone every 2 years, people don't give a fuck about these issues.

    • @Dave102693 · 1 year ago

      Same

    • @DeckDogs4Life · 1 year ago +2

      @BlommaBaumbart In the gaming desktop community, yes they do.

    • @BorderlinePathetic · 1 year ago

      @BlommaBaumbart Lol, I'm rocking a third-hand 980 and it does fine. I rocked an S6 until a couple of weeks ago; sadly the charge port had died long before, and the battery just became shit. It's bullshit to have planned obsolescence in a PC; don't get baited into the conditioning these anti-consumer corpos try to pull on you.

  • @BoyProdigyX · 1 year ago

    That PC build deep dive WAS great. SO MUCH INFO!!!! Oh, and I spied someone with an LTT bag on the bus yesterday, haha. Nice!

  • @DrPestilence · 9 months ago

    That was fascinating and very well said. Awesome video, dude! Gave me a lot to consider in my Mac-hating life, lol. I suppose if PCs went that route, it could be all about the wacky custom cases instead? :D

  • @Da40kOrks · 1 year ago +313

    AMD went with chiplets because they make silicon yields less impactful. I can't imagine making bigger and bigger SoCs would have high enough yields to be really economical in the long run.

    • @pearce05 · 1 year ago +19

      SoCs can be made with chiplets. Apple's M1 Ultra uses them.

    • @jonathan21022 · 1 year ago +5

      In theory this would allow them to add custom ARM cores to their CPUs, similar to how Intel has efficiency cores. This would let newer software move to the ARM side while maintaining compatibility: they could limit the x86 cores to what's needed for compatibility and lean on ARM for improvements where possible. However, what this video left out is that most software would become unusable on an ARM-only processor, and its suggestion of using efficiency cores rather than performance cores may be backwards, given the performance needed to maintain legacy compatibility.

    • @SlyNine · 1 year ago +15

      @pearce05 And wait until Apple tries to match Intel's performance; all that efficiency is gone. This video is a big nothing burger.

    • @info0 · 1 year ago +4

      @SlyNine Yup, the power draw would be ginormous and would probably melt the M chip.

    • @MoireFly · 1 year ago +11

      @SlyNine Not necessarily; the M2 already comes very close to Intel perf. And they do that not at the cost of power, but at the cost of... cost: they throw silicon at the problem like it's nothing. Nobody else in the industry spends like that, and they probably *can't* afford to even if they wanted to.

  • @cosmicusstardust3300 · 1 year ago +83

    I kind of feel like we're seeing a repeat of 2012, when Microsoft was predicting that mobile devices like tablets and smartphones would soon replace the desktop, hence how Windows 8 became a thing. And here we are 10 years later, and the desktop has gotten bigger and better.

    • @cleon24769 · 1 year ago +7

      You just gave me funny flashbacks to the *months* of consumer demand to bring back the Start button.
      In fact, wasn't there something where the first user-created "shell" interface to replace Windows 8's UI was so popular that the download site kept getting DDoSed?

    • @BlommaBaumbart · 1 year ago +2

      For many people, tablets and smartphones have replaced the desktop. According to StatCounter, these two categories make up 61% of computing devices; in Africa, they're over 73%. Japanese teenagers largely play ONLY games that work on their phones, and this includes Fortnite-like games, Resident Evil-like games, and other genres you associate with desktop PCs. The future fits in a pocket.

    • @cosmicusstardust3300 · 1 year ago +16

      @BlommaBaumbart You only listed two parts of the world; Africa and Japan do not reflect where a trend will end up everywhere. And I still see tons of statistical sources pointing not to a decline but to healthy growth in demand for PC gaming. I highly doubt the desktop will be going anywhere anytime soon; people have been saying this crap for almost two decades now.

    • @4cps777 · 1 year ago +1

      Add to that the fact that SoCs are probably not that great for servers, from what I can tell. (However, I don't have any experience with actual server hosting, so take this with a grain of salt.)

    • @4cps777 · 1 year ago

      @BlommaBaumbart That's great, but as long as there are enough tech enthusiasts out there, as well as enough people who actually go to work (even iPads, which I personally consider peak out-of-the-box productivity for touch-screen devices, don't come close to desktops, even with a bad WM such as Windows'), the desktop ain't going anywhere.

  • @chamcham123 · 1 year ago +1

    I think the nail in the coffin would be an ultra-compact Thunderbolt 5 eGPU enclosure that fits easily in your backpack. Then you could buy a NUC-sized PC and have reasonable (but not optimal) gaming performance. I think it would be enough for most people. As a bonus, it would move a lot of heat out of your PC.

  • @OpenSorce · 1 year ago +3

    Where have I heard this before? Oh yeah, that's right... the entire history of PCs! I could list how many times I've heard it since I built my first PC in the early '90s, but I won't. Seriously, it's a recurring theme. Home-built PCs will never go away.

  • @thegardenofeatin5965 · 1 year ago +313

    I think the main thing keeping x86 relevant is Windows. Windows, and more to the point Windows' ecosystem, really can't abandon x86, because there's so much software that REQUIRES it. Apple has such a tight grip on its platform that it can dictate unilateral and very sudden platform shifts. Linux is open source, so as soon as a new architecture is supported by GCC, someone somewhere can start pressing the compile button. Windows? That would mean the end of a decades-long era.

    • @petrkisselev5085 · 1 year ago +42

      Hence why Valve is experimenting with its Linux-based SteamOS on the Steam Deck.

    • @quinnmichaelson6793 · 1 year ago +16

      I mean, I have a Surface Pro X with a Microsoft-designed ARM chip. It's not perfect, and it's not faster than the Mac M1, but it's a hell of a lot more useful than my old Android tablet, and it works well enough with software emulation.
      I remain extremely skeptical that ARM can completely overtake x86-64 to the extent this video argues, though.
      Apple had been doing SoC-like things (closed unibody, no changing parts) far longer than its ARM chips have been around, so they're just a bad example to point to for wide-reaching changes.
      I think the far more likely approach is some sort of hybrid power-saving architecture integrated into most desktops, the same way dedicated GPUs on laptops slowly developed tech to allocate power based on current usage.
      Then we will eventually see more powerful ARM CPUs integrated into traditional motherboard designs, rather than completely replacing them.

    • @theguythatcoment · 1 year ago +13

      Nah, x86 is alive because server farms buy every single top-line chip before it launches to consumers, and server farms love x86 because of virtualization. A modern x86 chip is, in essence, a RISC-like core that translates CISC instructions into internal micro-ops. Why buy RISC chips, which can't efficiently emulate x86, when you can do it the other way around?

    • @tin-n-tan · 1 year ago +15

      Most Linux runs on x86. It's Linux, so it can run on rainbows and nice thoughts, but it still mostly runs on x86.

    • @arsoul3591 · 1 year ago +19

      The M2 chip literally has billions more transistors than the most recent x86 chips from AMD and Intel, and is twice their size or more. Why are people simply ignoring this? ARM is not more powerful; the only thing it has over x86 is less power per instruction. That's literally it... x86 isn't going anywhere.

  • @MatchTerm · 1 year ago +281

    I am EXTREMELY glad that most of the comments here support what I'm seeing myself: no matter how efficient this can be, in the state Apple wants to sell it to you, it throws the ENTIRE concept of repair out the window. Not just that; even upgradability is out, beyond "buy a new computer" or maaaybe increasing storage size.
    I shall not, in the current state, accept ARM chips and SoCs substituting for desktop PCs in their entirety. The normal consumer might see the benefit of using less energy for the performance, but they are going to pay by being locked into institutionalized planned obsolescence, for EVERYONE.
    Either make ARM SoCs as upgradable and repairable as x86, or do not bother trying to make me switch to this bullshit, plain and simple.

    • @afriendofafriend5766 · 1 year ago +39

      Yup. Just remember what Apple has done: they've artificially locked the SSD ID so that you can't even upgrade capacity. PCs? Nah, I took an old SSD from a random laptop and it works perfectly fine.

    • @MatchTerm · 1 year ago +23

      @afriendofafriend5766 Exactly. That's one of the PC's greatest strengths, and I just do not understand why people want to kill it off.
      Like, if you want a simple solution and don't care (even though people should; it is fun, but I understand time constraints), you can get a laptop. And there are already laptops that are repairable as well.
      Just do NOT try to shove this down the throats of people like us, who still want to repair our own stuff instead of always asking the corpo, and who actually like to tinker and play with our hardware.
      Like I said before, in the long term this will affect the normal consumer as well, who will have to throw away the entire system just because one of the SoC's components failed. But if that's really an inevitability, at least leave us alone, please...

    • @Dave102693 · 1 year ago +5

      Yeah…that pisses me off.

    • @64bitmodels66 · 1 year ago +6

      I'm not THAT worried, since this kind of stuff will always exist for people like us. Also, what's stopping other companies from just copying and improving upon Apple's design?
      What I'm more worried about is how this is going to affect compatibility. x86 and ARM are completely different; they do not work together.

    • @jamesmicklewright2835 · 1 year ago +11

      Exactly. My last PC lasted me about 10 years, thanks to various upgrades along the way. I wonder how long it would have lasted if I'd been stuck with the 2011 configuration: 4GB of RAM, a Radeon HD 6870 GPU, and a 500GB mechanical HDD. The i7-2700K was still decent 10 years later; everything else in the box ran out of steam in half that time.

  • @mattd6931 · 1 year ago +1

    Nothing said in this video precludes an ongoing individual-component retail market. Even if ARM replaced x86, you could still potentially have ARM CPUs and GPUs and build your own system.
    And yeah, there'd be a huge market for prebuilt micro systems for office spaces etc. But given that consoles these days ARE PCs with standardized hardware and custom OSes, and we haven't seen the traditional "PC" market dying off, I don't think we'd see that happen if ARM CPUs became commonplace.

  • @mdcorbett · 1 year ago +101

    With all the existing x86 software available, and given that consumers (including PC gamers) like upgrading hardware, I don't see x86 dying anytime soon. Just because something is more power efficient doesn't mean it's more practical in all cases. Edit to clarify: existing software will need to be recompiled for a new chip, or it will stay on x86 (unless it's emulated, but performance may suffer even then). You can't just move millions of applications and games to a new architecture.

    • @utopian666 · 1 year ago +15

      Yes it does. It means exactly that.

    • @lunchbox1553 · 1 year ago +2

      @@utopian666 But it doesn't mean it will be financially successful.

    • @utopian666 · 1 year ago +14

      @lunchbox1553 Yes, because Apple is not financially successful with its all-ARM strategy. Glad we sorted that out.

    • @chico_zombye · 1 year ago +10

      Europe is already banning TVs that draw ridiculous amounts of energy. I always think about it lately when I see how much energy PCs draw these days, and it's not getting better.

    • @lunchbox1553 · 1 year ago +26

      @@utopian666 They are financially successful because of their brand, not their technology. Don't fool yourself.

  • @drisbain · 1 year ago +158

    A few things: the actual x86 processor core has been a simple RISC-type design internally since the days of the Pentium Pro. They added a reorder buffer (which just about every performance CPU has) to increase IPC. Making a poor design run faster increases power demand; remember when the Pentium 4 was supposed to be the end of x86 because it was too hot and inefficient... then they took the laptop chip (the Pentium M, derived from the Pentium III) and made it the Core 2.
    Also, since you're talking performance per watt, look at undervolted Ryzens rather than full-powered x86. Full-powered x86 is not optimized for per-watt performance, just straight performance.

    • @steinwaldmadchen · 1 year ago +13

      You're basically correct. IIRC, AMD said they could literally build an ARM chip overnight just by replacing Zen's decoder.
      But even Zen is behind the M1 in terms of performance per watt. There are many reasons, and I don't think Apple's engineers being smarter than AMD's (or Intel's) is one of them. Rather, the costs of all that x86 compatibility support could be one reason, or maybe the tight integration in Apple Silicon, which AMD has also been pursuing for years, albeit more slowly; their APU concept, and the HSA technology that supports it, are also about lowering latency between components.

    • @iskierka8399 · 1 year ago +29

      @steinwaldmadchen The M1 has a significant efficiency advantage in the form of its 5nm exclusivity contract, which is being overlooked in these claims of x86 being replaced. In terms of consumer performance this doesn't actually matter (people should buy what's best for them right now), but when *all* other silicon is limited to 7nm or larger, claiming that the M1 demonstrates efficiency (or outright performance) that can't be matched is just false. Yes, a 3090's chip might be half the size of an M1 while the M1's GPU is a third of its die, but a 5nm 3090 would likely be less than a third of the size. None of these comparisons are fair on the technological side.

    • @TeranToola
      @TeranToola 1 year ago +16

      @@steinwaldmadchen The M1 is not better due to architecture; it is better due to a smaller node.
      The M1 is 5 nm, so its "apples-to-apples" (haha) competitor will probably be Ryzen 7000 mobile.
      Apple is TSMC's pipe cleaner. They get first dibs because they give TSMC the most money.

    • @steinwaldmadchen
      @steinwaldmadchen 1 year ago +4

      @@iskierka8399 I've already said there are many reasons behind that. Process node is of course one of them.
      But even the 2019 A13 Bionic on 7 nm has a Geekbench single-core score comparable to a 7 nm Ryzen 3000 from the same year. Granted, Geekbench isn't the best representation of performance, and the A13 is a mobile chip that's inherently efficiency-oriented, but it shows Apple at least has an edge in some areas on the same node technology.
      On the other hand, had AMD been given the chance to design a chip from the ground up, without worrying about software compatibility, would they still choose x86-64, or would they pick a newer architecture like Arm or RISC-V? Would that theoretical RISC Ryzen perform better in the majority of scenarios than the x86 version we have? Possibly, given that some engineers have complained x86 is difficult to work with. If so, then x86 is clearly one of the things holding us back.
      Of course, for the sake of software compatibility it's a compromise many have to make; otherwise the computer they made would be useless, unfortunately. And engineering is all about compromises.

    • @dirg3music
      @dirg3music 1 year ago +3

      Yeah, there are some seriously gross oversimplifications going on in this video.

  • @conedang
    @conedang 1 year ago +6

    Why is it that everything I'm interested in and every single hobby I have seems to continually be destroyed by society as a whole?

  • @s.k.arthur6990
    @s.k.arthur6990 1 year ago

    Kind of the same thing happened with the automobile. You used to be able to do a lot of customization, work, and repairs yourself. But now you need an IT professional to work on or repair your car, and they mostly run proprietary chips and OSes. I won't even get into electric autos. Thank god I still have my '68 Mustang to tinker on. Nice job, Anthony. Cheers.

  • @jetseekers
    @jetseekers 1 year ago

    Sent this in a text to some friends; wanted to share it here as well.
    Possible future of custom PC builds:
    Power board (motherboard plus power supply) + SoC (CPU, APU, basic RAM) + SSD.
    Either upgrade those directly or use a supplemental GPU and RAM expansions.

  • @existentialselkath1264
    @existentialselkath1264 1 year ago +154

    Laptops, tablets, and phones always advance faster than anyone expects, but they never completely replace the PC.
    As tech gets smaller and smaller, you can get more and more power by keeping it the same size.
    Just like laptops ate into PC market share for web browsing and everyday work, SoC computers will probably do the same to an extent for video editing and the like. But they can never fully replace the maximum price/performance form factor of a PC, even if PCs move to a newer architecture to keep up.

    • @jabbany2715
      @jabbany2715 1 year ago +4

      As a form factor sure, but future PCs will look like these Mac Minis. Tight integration means nothing to upgrade and low power draw means more compact dimensions. Back in the day, people could upgrade the cache on processors, but as that got too fast it became a built-in part of the chip. In the future, this will likely be the same case for RAM where you do not have memory modules but it is just part of the chip package.

    • @MrMaddog2004subscribe
      @MrMaddog2004subscribe 1 year ago +6

      @@jabbany2715 I doubt it, since everyone needs a different amount of RAM. People who need 64 GB for specific workloads aren't going to want to buy a more expensive version of a CPU just to get more RAM. I don't see upgradeable RAM going away at all; too many people would be upset, and it wouldn't make much sense to do so.

    • @existentialselkath1264
      @existentialselkath1264 1 year ago +2

      @@jabbany2715 You imagine RAM in the future being small enough to fit on the chip. I imagine RAM in the future being small enough to fit terabytes on a stick.
      The whole point of cache is that it's faster to access than anything else in the system, so integrating it into the CPU is a natural thing to do. I'm not convinced the same will happen for RAM, and especially not GPUs.

    • @TheMyleyD
      @TheMyleyD 1 year ago +4

      I think the issue is Moore's Law. We're getting to a point with the x86 architecture where we're spending ever more power for ever smaller performance gains. Arm or another RISC design is logically the next step. But I'm sure the future will bring different architectures.

    • @afriendofafriend5766
      @afriendofafriend5766 1 year ago +1

      @@jabbany2715 Ironically you now *can* once again upgrade the cache.

  • @alexpappas1573
    @alexpappas1573 1 year ago +42

    I remember in 2011 they said gaming pcs are dead and tablets are coming to replace both laptops and desktops. 😄

  • @captainobvious9188
    @captainobvious9188 1 year ago +6

    I’m not a gamer, but I’ve been building APU based machines for my kids for a decade now. The last machines I built in 2018, Ryzen 2400G, were super clean with M.2 on the MB. I barely upgraded their GPUs with all the used mining GPUs on the market now.

  • @methos-ey9nf
    @methos-ey9nf 1 year ago +6

    It's kind of amazing to stop and think for a moment that for all these different architectures they're all made thanks to basically a handful of companies - namely ASML and TSMC.