Build a PC while you still can - PCs are changing whether we like it or not.

  • Published Nov 26, 2024

Comments • 10K

  • @How23497
    @How23497 2 years ago +22445

    This all sounds like absolute hell for consumer’s rights, repairability, upgradability, and overall variety in the PC space.

    • @verios44
      @verios44 2 years ago +4185

      Let’s be totally honest. The end goal is to get rid of end user serviceability and heck even real ownership of products.

    • @bradenrichardson4269
      @bradenrichardson4269 2 years ago +2386

      Yeah, forced obsolescence. People will be throwing away their PCs every two years, just like they do with mobile devices now. Utter crap.

    • @anakinlowground5515
      @anakinlowground5515 2 years ago +1365

      Yeah, I completely agree. The right to repair movement is in direct opposition to this. They cannot coexist.

    • @alphagiga4878
      @alphagiga4878 2 years ago +220

      @@anakinlowground5515 well at least we have a major player, framework

    • @pinkchckn
      @pinkchckn 2 years ago +37

      my thoughts exactly

  • @Lossy555
    @Lossy555 2 years ago +6053

    Hearing this, I imagine a dark future where PCs are handled like phones today. "Sorry, your 2-year-old PC is now irrelevant because we're not giving it any more updates."

    • @alexeisaular3470
      @alexeisaular3470 2 years ago +278

      Linux 🙌

    • @tippyc2
      @tippyc2 2 years ago +667

      @@alexeisaular3470 OK, so you're going to root the device and install Linux. Sounds great. So what do you do when your Wi-Fi refuses to connect to a rooted device? We're already seeing rooted phones being denied services...

    • @ilyasofficial1617
      @ilyasofficial1617 2 years ago +143

      @@tippyc2 antitrust law

    • @AR15ORIGINAL
      @AR15ORIGINAL 2 years ago +295

      @@Zaptosis Unlocking the bootloader still voids your warranty and denies you services.

    • @NepgearGM6.1
      @NepgearGM6.1 2 years ago +5

      That

  • @HunterDrone
    @HunterDrone 2 years ago +2918

    my sole complaint about SOC systems is how frequently the makers of them seem to not give a shit about long term maintenance, expecting you to just buy a new model rather than maintain or reconfigure your existing machine.

    • @wafflecopter9296
      @wafflecopter9296 2 years ago +417

      Planned obsolescence.

    • @astronichols1900
      @astronichols1900 2 years ago +81

      As a Tegra K1 owner: yes, and I regret ever buying it.

    • @idkanymore561
      @idkanymore561 2 years ago +90

      That's their end goal

    • @the_danksmith134
      @the_danksmith134 2 years ago +279

      watch them use power efficiency as an environmental-protection excuse, even though in the long run the inability to repair will create even more e-waste and contribute even more emissions, since the higher demand for manufacturing replacements outweighs whatever conventional PCs lose to their inefficiency

    • @xathridtech727
      @xathridtech727 2 years ago +39

      @@the_danksmith134 Sadly, people already replace PCs instead of repairing them

  • @HonkeyKongLive
    @HonkeyKongLive a year ago +371

    I feel like the goal for them is to turn computer ownership into phone ownership. You buy a new box every year that you don't own, just lease from that manufacturer.
    Also this is unrelated to the content but man this guy is a good speaker. Great to listen to.

    • @harbitude
      @harbitude a year ago +6

      That's exactly what it is.

    • @creeper6530
      @creeper6530 a year ago +12

      You're more right than I'd like you to be

    • @awwtergirl7040
      @awwtergirl7040 a year ago +7

      Yeah, it is. It's why I have started hating computing more and more. The only bright spots are the Open stuff. The most compelling feature of any computing platform today is Freedom.

    • @alexspata
      @alexspata a year ago +2

      exactly, this is the future I'm afraid

    • @Peglegkickboxer
      @Peglegkickboxer a year ago +2

      They're doing this with cars, especially all of the electric ones they are trying to shove down our throats.

  • @nabusvco
    @nabusvco 2 years ago +5991

    While I can see this change as an inevitability, I just hope the serviceability and upgradeability will not be impacted as hard as I think they will be

    • @POLARTTYRTM
      @POLARTTYRTM 2 years ago +620

      It definitely will be impacted. Try upgrading anything on an SoC. You can't. If anything goes defective, you have to dump it all and get a new one. It's like a phone: try repairing one. You can't upgrade without literally buying a new phone, and if something breaks, good luck repairing it yourself, or pay a hefty price for a mere ATTEMPT at repairing the thing.

    • @wanderingwobb6300
      @wanderingwobb6300 2 years ago +88

      It absolutely will be

    • @Nib_Nob-t7x
      @Nib_Nob-t7x 2 years ago +203

      Yeah, no, I think they were stretching a bit far when saying everything will become an SoC. The fact is it's less profitable to make an SoC than to make the parts separately, and this mostly comes down to yield: the larger the chip, the more likely it is to be unusable due to defects in the silicon wafer. This is a major reason AMD uses chiplets; if they make a bunch of small parts instead of one large one, their yields go up.
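
The yield argument in this comment is the standard one; a minimal sketch using a Poisson defect model makes it concrete. The defect density and die areas below are illustrative assumptions, not real fab figures:

```python
import math

def die_yield(defect_density: float, die_area_cm2: float) -> float:
    """Poisson model: probability that a die lands with zero defects."""
    return math.exp(-defect_density * die_area_cm2)

d0 = 0.1  # assumed defects per cm^2 (illustrative only)

monolithic = die_yield(d0, 8.0)   # one big ~800 mm^2 "everything" SoC
chiplet = die_yield(d0, 1.0)      # one ~100 mm^2 chiplet

print(f"Monolithic die yield: {monolithic:.0%}")  # ~45%
print(f"Chiplet yield:        {chiplet:.0%}")     # ~90%
```

Because bad chiplets are discarded individually while a single defect can kill the whole monolithic die, the small-die approach wastes far less good silicon, which is the yield advantage the comment attributes to AMD.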

    • @Wockes
      @Wockes 2 years ago +115

      In the future when it breaks you buy a new one. It's all e-waste like old Apple hardware

    • @brkr78
      @brkr78 2 years ago

      There is money to be -stolen- made from the -sheep- users. Apple shows the way on how to -exploit- convince their userbase to fork over money on what can only be described as a planned pile of e-waste, timed to be obsolete once the profit margins hit a certain low mark. Yeah, no, the future is going to suck for both enthusiasts and normal users: the former because they have no further influence, and the latter because they will be milked dry.

  • @BorlandC452
    @BorlandC452 2 years ago +1197

    Sometimes I feel a little old-fashioned for still using a tower for my main computing when I'm not much of a gamer. But I still really like having one because of how I can customize it. I love being able to swap out individual components, or even adding new ones. You really can't mix and match with a laptop, and especially not a phone.

    • @dynodope
      @dynodope 2 years ago +30

      I 100% agree with you on this!

    • @riccardo1796
      @riccardo1796 2 years ago +157

      in the modern hellscape where everything needs to be a nondescript square with no replaceable parts the mighty tower stands tall, whirring menacingly from its over-full drive bays

    • @ignacio6454
      @ignacio6454 2 years ago +27

      @@riccardo1796 Beautiful

    • @robgrune3284
      @robgrune3284 2 years ago +9

      AMEN !

    • @Quebecoisegal
      @Quebecoisegal 2 years ago +6

      Yep, agree with you totally.

  • @sonoftherooshooter
    @sonoftherooshooter 2 years ago +2357

    Linus, I've been in the tech field since 2006. Been watching you since 2007 or so. You have a lot of great staff, but Anthony is special. His ability to articulate facts concisely and his pool of knowledge of the market space and its competitive landscape are very strong assets. Treat this man well.

    • @Sam-tb9xu
      @Sam-tb9xu 2 years ago +81

      Anthony needs a raise!

    • @loktar00
      @loktar00 2 years ago +19

      In the field since 2006 like that's a long time 🤣

    • @rmoultonrmoulton145
      @rmoultonrmoulton145 2 years ago +154

      @@loktar00 Huh? 16 years isn't a long time? Especially when tech changes exponentially? Do you realize the amount of change he's seen since he started? Hell, I started in the field professionally a little over a decade ago and it's incredible the amount of change I've seen.

    • @Chspas
      @Chspas 2 years ago +92

      @@loktar00 bro, people were still figuring out how to create good websites in 2006. Being in the industry for that long does give you an amazing perspective on the growth of tech

    • @sonoftherooshooter
      @sonoftherooshooter 2 years ago +14

      @@loktar00 It's not necessarily about time in the field, but the experience of seeing and working on different things gives a wealth of perspective. Desktop and enthusiast/gaming computing really hasn't changed all that much in the last 25 years if you really break it down, other than silicon advancements and the addition of VR and new form factors. My original comment was really meant to highlight that Anthony couldn't provide the level of expertise he is currently delivering if he had not seen/read/done so many things in his career, as is so evidently clear in the presentations he delivers.

  • @GamebossUKB
    @GamebossUKB 2 years ago +1009

    I believe that regardless of how efficient ARM and its related technologies become, there will always be a demand for individual components.

    • @oakzy3647
      @oakzy3647 a year ago +35

      Building a PC is sacred and tailors it to your use case. I completely agree

    • @steveklein9335
      @steveklein9335 a year ago +60

      i tend to agree, but then again... how many manual transmission cars are there available to buy...

    • @oakzy3647
      @oakzy3647 a year ago +13

      @@steveklein9335 This comparison is kind of invalid: if we don't build it, they will, and building works out cheaper, with upgradability allowing for expansion and future technology. Cars still use RAM, motherboards, and graphics cards; it all still has to be built. So I will partially agree, but custom PCs will be around until any chip can run games at thousands of fps and specs stop mattering.
      However, there is a huge market for laptops, phones, and prebuilts; anyone wanting performance will build their own, for now.

    • @anthonygrimaldi9483
      @anthonygrimaldi9483 a year ago +23

      @@steveklein9335 there will always be manufacturers that still make manuals for car enthusiasts, and there will always be a few manufacturers making PC components.

    • @benedikta.9121
      @benedikta.9121 a year ago +53

      @@steveklein9335 They're the majority of all cars available in every country except the US and Canada.

  • @Aefweard
    @Aefweard 2 years ago +1377

    My issue with the idea of the everything chip is say you’re 2 generations down the line and want to upgrade your graphics, but the cpu side is still chugging fine, having to replace the whole thing is not only wasteful, but more expensive. Same goes with a component dying.

    • @skydivingmoose1671
      @skydivingmoose1671 2 years ago +74

      Wouldn't replacing a small (in comparison) SoC be better than a whole GPU and its cooler? Assuming the new chip fits in the old motherboard.

    • @RJRC_105
      @RJRC_105 2 years ago +183

      @@skydivingmoose1671 Assuming they don't change the motherboard, yes. Or they use a BGA mount so you physically can't. I'm sorry, but this is a regressive, wasteful step.

    • @ts47920535
      @ts47920535 2 years ago +48

      Well, SoCs are less wasteful overall.
      My laptop's entire SoC PCB (motherboard, IO, chip, etc.) is approximately the same size as my GPU. So the mentality of 'just upgrading my graphics' goes out the window; you would just replace the logic board.
      Technically you would be replacing RAM, CPU, GPU, controllers, etc., which seems wasteful, and it is on full-size PCs, but it isn't on SoCs.

    • @IR4TE
      @IR4TE 2 years ago +115

      @@skydivingmoose1671 Also consider that if PCs really go down the SoC route, you have to replace the whole SoC, which will be the most expensive part in the whole machine. Putting down, say, $2000 every 2 years just because you're no longer satisfied with your GPU performance, while the rest still performs well, wastes a lot of money over one little area of your SoC die.

    • @TeranToola
      @TeranToola 2 years ago +38

      @@skydivingmoose1671 You can't really do that as you'd have to completely change the memory as well, especially when you're talking about upgrading an SOC with a better GPU, unless they put all of the memory onto the silicon, or relied on 3D stacking to fill in any bandwidth issues that arise, it would be extremely costly.
      If this future of PC SOCs comes to fruition there will be a quantum crapton of E-waste. Large SOCs will still use 1000+ watts...
      The reason why the M1 Ultra uses less power than the 3090 is simply because it's effectively 2 die shrinks ahead of the 3090 (Samsung 8n vs. TSMC's 5n). A GPU with the power of a 3090 on TSMC's 5n would likely use around 200 Watts, maybe less. Especially if Nvidia ditched GDDR6X, which is a very power hungry memory

  • @Fatty420
    @Fatty420 2 years ago +563

    And the great thing about a small, integrated system is that when it breaks you get to buy a whole new system!
    Wait...

    • @aetherxsn1591
      @aetherxsn1591 2 years ago +82

      Yeaaaaa, no way SoC boxes will replace PCs.

    • @Wockes
      @Wockes 2 years ago +56

      Or if you want to upgrade your GPU you can't

    • @feeadftth
      @feeadftth 2 years ago +43

      This year was the second time I was glad I built DIY 7 years ago: my PSU failed and I just bought a new one, just like the GPU 3 years ago.
      DIY, without any doubt

    • @Desnhauos
      @Desnhauos 2 years ago +10

      @@Wockes the vast majority of consumers don't care about that

    • @joemarais7683
      @joemarais7683 2 years ago +75

      @@Desnhauos a vast majority of people are going to be pissed when they can't upgrade their gaming performance without spending 3k on another entire system with a bunch of stuff they don't need upgraded.

  • @hourglass1988
    @hourglass1988 2 years ago +392

    I'm watching this on a gaming PC I built literally 10 years ago. It was probably low/mid range even at that point, costing me roughly $700 (in 2012 dollars, mind you lol). I upgraded the GPU about halfway through that time for ~$200. A couple of years ago I put an aftermarket CPU cooler in for another ~$50. I've only just now started to run into games that my system *can't* run. I'll confess a lot of newer games I have to run on low to minimum settings, but it can. Some of the newest games will start to cause heat problems after a couple hours of play even on low settings. But come on: in that same time period I've gone through 5 laptops that I've used for little more than word processing. I'm on my third Roku stick in two years because the first one died and the second one just wasn't being supported anymore. I'm personally terrified of the PC market going the way of consoles or all-in-ones.

    • @latortugapicante719
      @latortugapicante719 2 years ago +49

      What are you doing to those poor laptops

    • @Filip_Phreriks
      @Filip_Phreriks 2 years ago +20

      I'm typing this on an Asus netbook from 2011.
      Maybe you have to stop throwing your coffee over your laptops or whatever it is you're doing.

    • @hourglass1988
      @hourglass1988 2 years ago +30

      @@Filip_Phreriks Never water-damaged one. The dog got the strap of my computer bag caught around his neck and tossed one down the stairs. Had another I was letting my wife borrow for WoW, but she kept turning the fans off because they were too loud and ended up cooking it. Another was on Windows 98, just wasn't supported anymore, and the hardware wouldn't accept a new OS. On one the screen just stopped displaying. Another had a head crash in the drive. I'm sure if I left them on a desk like my desktop, some would have lived longer.

    • @svenwempe9208
      @svenwempe9208 2 years ago +42

      @@hourglass1988 Turning the fans off because they were too loud... hahahahhahah🤣🤣 that shit is funny as fuck

    • @johna3357
      @johna3357 2 years ago +2

      I'm guessing a Sandy Bridge Intel CPU?

  • @MrDJHarrison3
    @MrDJHarrison3 a year ago +87

    Has to be one of the best presenters on LTT (apart from Linus, that is).
    Give him more screen time; so clear and well spoken

  • @LongHaulRob
    @LongHaulRob 2 years ago +2004

    Anthony has honestly been my favorite addition to the LTT crew, a pleasure to watch. I'd take a computer class if he taught it.

    • @croay
      @croay 2 years ago +30

      agreed

    • @joshualioi5144
      @joshualioi5144 2 years ago +17

      same

    • @movingstair
      @movingstair 2 years ago +27

      A computer class, you say. Great idea for a vid? He probably knows the basics of what would be taught at university, and lots of people would give it a listen.

    • @TimberwolfCY
      @TimberwolfCY 2 years ago +27

      @Metrion Void Tell us you're jealous without telling us you're jealous, lol

    • @haveacigar5291
      @haveacigar5291 2 years ago +3

      Self-discipline would be a class he could use, I am sure. He should cut down on the cupcakes.

  • @Hotrob_J
    @Hotrob_J 2 years ago +735

    They've been saying this for like 20 years now. First when laptops went mainstream, then with tablets and smartphones, then when the original NUC came out.

    • @BaldTerry
      @BaldTerry 2 years ago +138

      Agreed. Building desktops isn't going away anytime soon.

    • @katatastrofa6136
      @katatastrofa6136 2 years ago +26

      The difference is that PCs will still be a thing, only different

    • @Thatonefuckinguy
      @Thatonefuckinguy 2 years ago +70

      Also, tons of people such as myself simply can't afford pre-built hardware. It's that simple: the economics of building a PC yourself outweigh the benefits of getting rid of custom-built PCs. Also, not everyone wants a piece of shit entry-level card like a 1050 Ti that can barely run much of anything. In order for SoCs to work, they'd have to stop giving out shitty entry-level cards and start giving us enthusiast or, at the bare minimum, mid-range GPUs. CPUs tend to be fine, as most pre-builts and even laptops come with at least an i5 if not an i7 unless you're getting the thing for dirt cheap. But I remember not being able to buy a prebuilt and having to go custom back in 2017 or '18 when I built mine, purely because for a long time not a single one in my price range came with a 1060 6GB model.

    • @graphincer3164
      @graphincer3164 2 years ago +28

      Yeah. Hope this just stays an industry scare, as it has always been. Heck, people were scared Stadia or GeForce Now was gonna take over, but they haven't.

    • @romano5785
      @romano5785 2 years ago +26

      Didn't you hear? PC gaming is DYING! lmao

  • @denvera1g1
    @denvera1g1 2 years ago +733

    5:30 Anthony, this is a very well put together video, but Apple fails to disclose one key point when talking about efficiency, and most reviewers miss this VERY key fact.
    Apple is using TSMC 5nm, whereas Nvidia is using Samsung 8nm and AMD is using TSMC 7nm (TSMC says 5nm offers a -50%- 20% improvement in performance per watt over 7nm), and Intel is using Intel 10nm, aka 'Intel 7' (almost as good as TSMC 7nm).
    To put your comparison of the 3090 and the M1 Ultra into perspective, if the M1 Ultra used the same Samsung 8nm silicon, the die would be over 4x the size of the 5nm M1 Ultra and could use as much as 12x the power to get the same performance (edit: most likely it would only use 6-8x more power).
    Samsung 8nm has roughly 44 million transistors/mm², whereas TSMC 5nm has ~186 million/mm².
    To put it another way, the 3090, ported to TSMC 5nm, would be less than 1/4 the size and might use as little as 80W, and it would be only ~50% larger than the base model M1, as it only has ~50% more transistors than the base model M1.
    ARM is only "vastly more efficient" than Intel processors that were stuck on 14nm for 6 years.
    Apple, on paper, is less efficient than AMD, but we'll have to wait for 5nm Zen 4 to get official numbers.
    I fell into this trap with my M1 Mac Mini. I thought I was getting something I could dedicate to transcoding recorded TV shows from MPEG-2 to H.265. But it turns out I'd have been better off getting a cheaper AMD-based system with a 4750U/4800U.
    My work laptop (ThinkPad L15 Gen 1 with a 4750U) is not only slightly faster at transcoding, it is also more efficient than my M1 Mac Mini.
    Here are the numbers for transcoding an hour-long news segment using the same settings on both devices (and yes, I was using M1-native applications):
    M1 Mac Mini: 1 hour 9 minutes, 33Wh from the wall
    4750U ThinkPad: 1 hour 4 minutes, 28Wh from the wall
    What is crazy is that the 4750U is more efficient while having to power a display and a loud micro-fan, and it doesn't have any of the high-efficiency integrated parts, instead using a removable M.2 SSD and removable RAM.
    Remember, TSMC has stated that the 5nm process used for the Apple M1 has a -50%- 20% performance-per-watt increase over the 7nm node of the 4750U. So Apple should have used less than 2/3 the power, and realistically should have used 1/2 the power because the Mac Mini has fewer peripherals to power, but instead Apple used more power to complete the same task.
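
The arithmetic in this comment can be sanity-checked in a few lines. The density figures and the wall-power measurements are the ones quoted in the comment itself (approximate numbers as reported there, not independently verified):

```python
# Transistor density comparison quoted above (approximate public figures).
samsung_8nm_density = 44e6    # transistors per mm^2 (Samsung 8nm)
tsmc_5nm_density = 186e6      # transistors per mm^2 (TSMC 5nm)

density_ratio = tsmc_5nm_density / samsung_8nm_density
print(f"TSMC 5nm packs ~{density_ratio:.1f}x the transistors per mm^2")

# Transcode test from the comment: same job, energy measured at the wall.
m1_wh, m1_min = 33, 69    # M1 Mac Mini: 1h09m, 33 Wh
tp_wh, tp_min = 28, 64    # ThinkPad L15 (4750U): 1h04m, 28 Wh
extra_energy = m1_wh / tp_wh - 1
print(f"The M1 used {extra_energy:.0%} more energy and took {m1_min - tp_min} min longer")
```

By the comment's own numbers, the 4750U finished the identical job on roughly 18% less energy despite sitting on an older node, which is the commenter's point.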

    • @superidol238
      @superidol238 2 years ago +27

      You said the TSMC point way too many times. Also, source?

    • @wuokawuoka
      @wuokawuoka 2 years ago +73

      To add insult to injury, Apple's integrated computers require you to buy basic stuff like RAM up front. That keeps you from upgrading later at a lower price.
      The non-M.2 SSD goes in the same direction.

    • @NeVErseeNMeLikEDis
      @NeVErseeNMeLikEDis 2 years ago +13

      I read 1/3 of it and yes, you're right!

    • @SidShetty
      @SidShetty 2 years ago +1

      @UCw1-Tc5nUrnksHZkbPuIo5w Nvidia didn't have a choice.
      The 5nm node has limited capacity, so allocation happens long in advance.

    • @oiosh
      @oiosh 2 years ago +1

      @@superidol238 It's true

  • @ekdavid
    @ekdavid 2 years ago +560

    I am sure there will forever be a huge audience for modular PC builds

    • @KeviPegoraro
      @KeviPegoraro 2 years ago +46

      Sure there will; it's not like all the factories would stop making the parts today

    • @dupajasio4801
      @dupajasio4801 2 years ago +5

      Hopefully

    • @thomgizziz
      @thomgizziz a year ago +6

      forever is a long time...

    • @vadim.ka96
      @vadim.ka96 a year ago +22

      Kinda like classic cars. There are those who want most modern and hassle free stuff, and those who still prefer old school ways.

    • @mokseee
      @mokseee a year ago +22

      @@vadim.ka96 and just like old school cars, those will come at a hefty price

  • @smakfu1375
    @smakfu1375 2 years ago +550

    At 47, I’m used to hearing that the traditional desktop form-factor is dead. I don’t think so. I’d also be careful with assuming closely coupled system modules (aka MCM’s posing as SOC’s) are the sole optimization route, as that’s true for Apple and the ARM universe as load-store RISC style ISA’s are highly sensitive to memory subsystem latency issues. CPU core-wise, they achieve great efficiency, but flexibility is highly limited, and scaling exotic architectures gets expensive and difficult. But Apple silicon, and the other mobile SOC-style producers are stuck in a “when all you have is a hammer” situation.
    Apple’s main business is mobile, the Mac business represents ~12% of their revenue, versus mobile devices at 60%, services at 20% and accessories at 8%. The desktop portion of that Mac business is minuscule. Through simple necessity, their desktops are going to follow the patterns established by the rest of their company’s hardware designs. That’s their business model driving design decisions, but don’t assume those same decisions work for Intel, AMD, etc., because they probably don’t.
    Also, the Mac as a platform has always been defined as a sealed box, no tinkering allowed, especially when Steve Jobs or his acolytes have been in charge of the platform. The expandable “big box” Macs have been the exception, not the rule. The Mac and the PC (defined by its open, slotted box roots) are two very different platforms philosophically. I don’t think you’ll see closely coupled system modules replacing big honking discrete GPUs, sockets dedicated to big discrete CPUs, and slotted RAM and PCIe slots for the desktop and workstation workloads that demand “bigger everything” (gaming, workstation, etc.).
    IMHO, you’ll see more chiplets (more closely coupled) in each of the compute buckets (CPU & GPU) and you’ll see a lot more cache, but the principle box with slots isn’t going anywhere. Where you will see more SOC-style system module solutions is in the laptop space. However, that’s just an extension of a pattern that’s existed for a long time, it’s just that Intel’s iGPU’s and interest in closely coupling memory has been limited by their being generally lazy. Keep in mind, the vast majority of all x86 “PC’s”, both in laptop and desktop form, already (poorly) implement closely coupled GPU (mostly on-die), memory controller, cache hierarchy, etc..
    TL;DR: I doubt the desktop form factor, sockets, slots and all, is going away. This all seemed a bit click-baity.

    • @MrUploader14
      @MrUploader14 2 years ago +34

      I agree that this video is click bait designed to get a rise out of the PC community. I do believe SOC's will be an important part in business PCs and servers since power efficiency to performance is a big deal. I think the PC gaming industry will start moving towards focusing more on power efficiency in the future. GPUs will start leveraging intelligent upscaling to lower power requirements. SSDs will use fewer PCIE lanes or keep using the older versions for longer. HDDs adopting NVME to saturate data connection and be more efficient. And moving power converting to the MOBO, as well as CPU most likely moving to ARM not necessarily for the power efficiency but for the flexibility and chip density.

    • @codyo.4884
      @codyo.4884 2 years ago +16

      As someone who prefers the benefits of the tightly integrated, “walled garden” philosophy that Apple provides, I agree with everything you said.
      There are very distinct pros and cons between tightly/loosely coupled architecture, not just in computer hardware either. It’s like trying to make an F150 compete with a BMW M series, they’re just two different tools for two different jobs, and they’re likely to have their own place in the market for the foreseeable future

    • @riezexeero7392
      @riezexeero7392 2 years ago +27

      Sadly, I used to love all of LTT's content; now it's becoming more click-bait stories. And I agree, smakfu: ever since I started following the IT industry, there has always been "the PC is gonna die" news. Did vinyl die when they said it would? Even cassettes are still alive. As long as there is a market it won't die; it will just be less prominent than in the golden days. And also, LTT, the golden days of the PC are over; they were back in the late 80s, 90s, and early 2000s, when your main gadget was the PC. When smartphones/tablets became a thing, the PC stopped being the primary gadget. See what happened to Dell: they became Dell Technologies, not just Dell Computers, because PCs are NOT at the forefront anymore. But did they go away? Hell no, there is still a big market; it's just not what it used to be. Click-bait video!

    • @Valisk
      @Valisk 2 years ago +2

      Just reading a few threads in the comments here and there are kids that have only ever known ATX absolutely losing their shit.

    • @smakfu1375
      @smakfu1375 2 years ago

      @@codyo.4884 Just to be clear, my main laptop is a MacBook Pro, I’ve been an iPhone customer for 14 years, every TV in the house is just a display for an AppleTV, I even have an early review unit 1st gen iPad (I was a contributor for a tech magazine for developers 12 years ago). So I’m a pretty big Apple whore. But when it comes to mainline development work, and gaming, the Mac is a backwater; you’re not prototyping, and building complex compute kernels, for data analysis and intelligence systems (for production cloud deployment) on the Mac. You’re doing that work, and many things like it, on PC’s running Linux and Windows.
      As for mobile devices, yes, they’re the main show for consumptive activities. But productive work is still the bastion of the general purpose computer, and always will be (IMHO… barring the machines taking over, plugging us into the matrix and making us watch TikTok all day on simulated iPhones, as a punishment).

  • @8lec_R
    @8lec_R 2 years ago +368

    Yesss. Let's put the entire system on a single PCB so we need to throw everything away when one part goes bad

    • @fabiospringer6328
      @fabiospringer6328 2 years ago +123

      And you can't upgrade over time. Great idea.

    • @ryanunknown4181
      @ryanunknown4181 2 years ago +19

      🍎

    • @ablueprofilepic9876
      @ablueprofilepic9876 2 years ago +7

      🧠

    • @TetraSky
      @TetraSky 2 years ago +44

      Excellent for manufacturers who want to sell you a shiny new toy every year while actively slowing down your old device to make the new one seem faster.

    • @yaro_sem
      @yaro_sem 2 years ago +16

      Yes! Let's divide our CPU into separate chips with 1 core per chip, so we can replace only 1 core instead of the full CPU. Just 5 times more power consumption, 10 times slower, for double the price!

  • @clairearan505
    @clairearan505 2 years ago +824

    If I could have great performance in a smaller form factor and still be able to open up and fix problems or change things in the system to suit my needs, I'd be totally for this sort of change. But I don't think that's anything like the goal of the companies who will ultimately make these changes to how PCs are built.
    It seems like every industry is constantly trying to find ways to lock me into their product line and lock me out of handling any problems that might arise without calling some overpaid technician or sending my car/phone/laptop/PC/etc somewhere else at my expense to be repaired.
    My gaming rig is over 5 years old, and still running like a champ. My work rig is nearing the 5 year mark and still chugging along. If I couldn't open these up and service/clean/fix them, and instead had to rely on someone else, I'd be missing out on hours and days of productivity. That sucks.
    If these issues can be addressed, I would love to have the additional efficiency of new architecture and the space savings of a small workstation. Until then, I'm happy to have my feet warmed in the winter and boiled in the summer, thank you very much.

    • @Voshchronos
      @Voshchronos 2 years ago +1

      This. The right to repair is being killed and soon enough actually owning our computers will be purely an illusion.

    • @skeptic_lemon
      @skeptic_lemon 2 years ago +26

      Capitalism

    • @amsunaakage
      @amsunaakage 2 years ago +10

      That's why, in my opinion, PCs will always be robust; at business scale they are better productive assets than relying on someone else, even if they cost more to run. The way you put it, it's worth sticking with PCs no matter what. So I guess I can say that PCs are still cost-effective in the long term.

    • @spiderjerusalem
      @spiderjerusalem 2 years ago +52

      "It seems like every industry is constantly trying to find ways to lock me into their product line and lock me out of handling any problems..."
      Well said. Just well said.

    • @Haddedam
      @Haddedam 2 years ago +11

      I don't see how a more efficient processor architecture and a lower-power, lower-waste-heat design damages repairability. Apple hardware has almost always been locked down and hard to service, even when the hardware wasn't proprietary, and Apple being first to popularise a thing hasn't meant everyone else follows their anti-user practices. Look at phones, laptops, and such.

  • @JordanCS13
    @JordanCS13 1 year ago +160

    I don't think the PC Building scene is going away any time soon. When I first started building machines over 20 years ago, it was an extremely niche thing, and since then, it has grown and grown and grown, and the PC building community now is larger than it ever has been. Will it change? Almost certainly. I can see people moving to more power efficient designs, and maybe even SOCs combining CPU and GPU, but I think the desire to separate those in the PC building world will exist for a very long time, as it isn't cost efficient to lock people into those components simultaneously.
    I do see the average person migrating to a smaller form factor like a Mac Studio for their desktop work, but the PC building community isn't really about saving space on a desk...it's about the choice and price efficiency to build what you want with what components you want. Those who don't care about those things have largely already gone to laptops for 99% of their work.
    Also, the statement that no other PC can match a Mac Studio for $4,000? That's a load of crap. First off, the $4,000 Mac Studio has the 48 core chip, and in head to head, it's slower than an i9-13900K. I just priced out a build that equals the base $4,000 Mac Studio in speed, ram, storage, and faster in GPU - and it was under $3,000. And you can upgrade things and expand it later, which you can't do at all with the Mac Studio. Even if the equivalent PC was $4,000 and larger, I'd still rather have the PC because of the upgradability. If I want to swap a GPU in 2 years, I can. If I want to add 64 GB more ram in a year, I can (and at half the cost of the 128GB upgrade for the Mac Studio) - If I want to go to a 4TB SSD, I can, and for again less than half the cost that upgrade costs up front on the Mac.
    Maybe we will see the CPU and GPU get combined, but people will still be building custom systems - just with those parts, because locking yourself into a single computer where NOTHING can be upgraded later, and NOTHING can be re-used in your next computer, is just idiotic.

    • @thomgizziz
      @thomgizziz 1 year ago +19

      This exact argument was made 15 years ago by people just like Anthony. If people keep saying the same thing constantly then eventually somebody will be right about it, but people don't hold others to their predictions about the future.
      Also, Anthony isn't known for being objective about Apple products; he will pick and choose the points where Apple does win, or just flat out misrepresent things, because that is what people do when they are a fanboi.

    • @rockjano
      @rockjano 1 year ago

      No, PC building will disappear, just like Hackintosh (well, I still use it, but I don't know for how long). It was nice, I loved building PCs, but it's just gone... OK, it will take some years, but believe me, it will go. As for the price difference, yeah, true... Mac has never been cheap and still isn't... BUT you have to use Windows, not macOS, and those two are still just not the same.

    • @CrispBaker
      @CrispBaker 1 year ago +21

      ​@@thomgizziz it's kind of telling that the #1 reason for enthusiast PC-building, gaming, exists nowhere in this video.
      Apple hates PC gaming. Despises it. And any company that follows their lead will end up with devices that are just as useless for gaming as an M1 is.

    • @Rudxain
      @Rudxain 1 year ago +2

      @@CrispBaker That's because Apple is focused on making devices for "average practical use", so they don't care about gamers, who are a minority.

    • @Rudxain
      @Rudxain 1 year ago +4

      A solution to this is modular SOCs. Make SOCs replaceable by not soldering them, and using a socket instead.
      Or even better, make individual components within the SOC modular too. Of course this would require special high-precision tools to do replacements, or a robotic arm with sensors that does the job automatically

  • @thomaswiley666
    @thomaswiley666 2 years ago +1182

    Actually this is an old story; LTT is just observing one of the cycles. They aren't old enough to remember 'big iron,' the massive mainframes that everyone saw in movies from the 50s to the 90s. These are the grandparents of the SoCs we see now. IBM's VTAM came along in the 70s as a comm/app layer within big iron as a way of connecting different SoCs within a mainframe. It was also a way for IBM to make more money because you had to pay more to "unlock" features, ie open communication to the SoCs within the mainframe. Base level got you one to two "features" while paying premiums got you many or the rest of the system. The add-ons were already there, they just had to be unlocked by software. And how many terminals you bought as physical add-ons.
    The first IBM PCs were mostly SoCs with perhaps a SCSI controller or add-on RAM board as aftermarket purchases. The only "build" items were the physical items that wore out, like the keyboard or mouse. (Except for the IBM 101-key Model M keyboard - you could kill a person in the desert with one of those and still have a functioning keyboard. Just sayin')
    Also, we've had these SoCs for years now. We've been calling them 'appliances' because, well, the term is apt. People know them as 'thin clients.' Basically, these are the replacements to terminals, the things that connected to mainframes. With an appliance you get localized processing power and lockable systems, which are needed in quality/sensitive areas of businesses.
    It's only because of the advent of smart phones do we have businesses considering marketing appliances to the general population. Phones are technically SoCs, right? And while people bitch about a phone becoming obsolete in four years, they go along with it. So now we've come back in the circle where appliances are now set-top boxes or dongles [Roku, XBox, Fire, etc.], AKA 'little iron,' with unlockable features (comm/app layer for games, movies, Internet access, etc.) for a monthly fee. And yes, the only add-ons are the physical parts - controllers, keyboards. Eventually, people will tire of disposable computing and some company will offer an appliance with the ability to snap in upgrades (an all new PCI bus) and the circle will continue.

    • @shannono.5835
      @shannono.5835 2 years ago +94

      Truth - and thanks for the history lesson

    • @raphaelrodriguez1856
      @raphaelrodriguez1856 2 years ago +118

      Wow this is probably one of the best and most valuable comments I have ever seen. Thank you for taking the time to write this up. It’s always super fascinating and useful to learn about world changing market pivots that happened pre internet era. It only makes me wonder what other things will experience this cycle in the future.

    • @dylanlahman5967
      @dylanlahman5967 2 years ago +33

      This should be higher

    • @mintymus
      @mintymus 2 years ago +24

      I certainly hope you're right about the process repeating.

    • @xenotiic8356
      @xenotiic8356 2 years ago +52

      This honestly gives me a lot of hope about the future, that we are just in an awkward transition point in a cycle, and not an anti consumer hellish endgame.

  • @touzj316
    @touzj316 2 years ago +386

    Backwards compatibility should not be underestimated. People want compatibility. The reason computers are valuable is that they are useful and diverse. I recently built a computer that is like no other on the market for what I use it for, and I keep upgrading it based on my work needs. Right now I'm upgrading my graphics card to fit my needs, and this ability to scale my production on my computer is one of the reasons I have a computer. There's always going to be a market for x86, but I think one day we will all need to somehow switch to something more modern.

    • @aristocrag281
      @aristocrag281 2 years ago +7

      I think what is going to happen is that as the market shifts towards prefab boxes, your customization options will become a more niche use case and require more investment and research. It might improve quality if the entire segment is targeted towards serious builders and power users, but as a representative sample of the overall market, what percentage is that, and is it sustainable in the long run to have configurations ideal for the few who are capable of their own repair and maintenance, versus the few who actually do?

    • @hanrinch
      @hanrinch 2 years ago +14

      @@aristocrag281 I view it the opposite way: in reality the prefab/prebuilt box is actually declining as more light/casual users shift toward smartphones or tablets. The PC market is now divided into two segments: enterprise clients and power users. Enterprise clients include servers/workstations/mainframes/thin clients for offices, where prebuilt is still in demand. Power users include esports/gaming/video & photo editing/live streaming, where hand-building is still popular.

    • @robertbane2163
      @robertbane2163 1 year ago +3

      @@hanrinch Correct! I worked at Microcenter for 10 years & saw the rise & fall of prefabs.
      It's not like '87 when I started building; now kids are watching YouTube vids & buying parts!

    • @hanrinch
      @hanrinch 1 year ago

      @@robertbane2163 Prefab has less flexibility but a good warranty; however, casual users are sticking with phones/tablets these days, making the warranty no longer an advantage, and younger kids tend to look online to build their dream machine. But it still marks the decline of the industry: prebuilts took up to 70% of the entire market, and if prefab declines rapidly it will mark the end of the industry, because the market will become too small to remain competitive.

    • @SuperNoticer
      @SuperNoticer 1 year ago

      You chit

  • @tobyroberts6571
    @tobyroberts6571 2 years ago +708

    To be honest, I'm surprised that enthusiasts have managed to keep the DIY build PC game going for so long. So many other hobbies and just things in general are taking away input and choice for consumers. I hope it hangs on a bit longer - I love my stupid PC because it was me who put it together.

    • @ShaiyanHossain
      @ShaiyanHossain 2 years ago +111

      if anything the DIY PC crowd has grown due to the emergence of streaming and better ways to play online games

    • @aserta
      @aserta 2 years ago +28

      You're wrong. It's the same BS that's being said in the car industry about electric cars, that they'll kill any kind of custom work... and there are already dozens of firms that do custom performance parts for electric cars, even boards and brains for them.

    • @cyjanek7818
      @cyjanek7818 2 years ago +66

      @@aserta For now.
      Tesla already makes their motors and battery packs non-functional if you don't use their stuff, and reverse engineering that is much more complex than making a new fuel map for an engine, where all you need to control is the timing of the spark plugs and injectors.
      Even people who race Teslas do not replace that stuff, just pack it differently at best.

    • @8lec_R
      @8lec_R 2 years ago

      @@cyjanek7818 Well, that's why enthusiasts hate Tesla. If they don't stop this bullshit, every other company will soon surpass them and they'll start losing market share. They are just trying to make bank while they are the major electric car manufacturer.

    • @TjPhysicist
      @TjPhysicist 2 years ago +34

      Enthusiast communities around otherwise forgotten technology will always be a thing, though. DIY PCs will always be a thing, IMO. And while PCs have largely been displaced in the market by phones, tablets, gaming consoles and laptops, the DIY PC will never actually get fully replaced for those who still want it. It's similar to CRT TVs, portable music players, and vinyl players: all old technologies that have been kept alive for years if not decades by enthusiast communities (to the point where there are still companies that make and release new portable music or vinyl players every year). As opposed to something like Betamax or floppy disks: technologies that have basically been entirely forgotten and completely replaced.

  • @Rasterizing
    @Rasterizing 2 years ago +39

    It sounds a little like going back to the Commodore 64 era, where, for the most part, everything was on the CPU (with some exceptions). It's obviously a big problem for system builders and the general PC market, as it will restrict consumer choice - although it doesn't have to be that way. I think the PC market still requires some custom builds and adaptability, so if you could replace the SoC without having to change everything else, that would be a big win. Although I really see it as replacing the entire board/system - so your PC would be little more than an SoC board (which you must fully replace) inside a custom case.
    From the chip makers' point of view this has nothing to do with power or economy. They want to lock you into an SoC, keep you there, and then just drop support a couple of years later and force you to upgrade and spend more money - this is what it comes down to: restricting choice and forcing upgrades, exactly the same as mobiles and tablets. If you can't afford to upgrade your SoC you just have to suffer or be cut off when support is dropped, that's it. Along with the fact that you can't replace any individual failed components. No, this is nothing more than rampant capitalism and milking consumers for every last penny.
    x86 could be redesigned to be more efficient and have a RISC core too. There's no need to go to an SoC, aside from $$$!

    • @dsrocks6905
      @dsrocks6905 1 year ago +1

      Redesigning x86 AGAIN, to say a 128-bit architecture, would have its own major challenges. If you recall, the switch from 32- to 64-bit x86 wasn't without pain and trouble, and considering ARM has had so much time in the oven, and translation between the two instruction sets has become so much more efficient because of AI, it may be smarter to just move to ARM. Either way there will be stability and compatibility issues for a while as everything slowly shifts over.

  • @Rac3r4Life
    @Rac3r4Life 2 years ago +240

    Even with a switch to ARM, I believe the socketed chip on motherboard paradigm will stick around. People will still want upgradability and expandability in their desktop computers.

    • @vulbyte
      @vulbyte 2 years ago +16

      I wish this to be the case. TBH, I wouldn't be upset if the CPU and motherboard came integrated (look at AMD and what a minefield that's been for people),
      but keep RAM and PCIe slots for expansion/upgradability.

    • @MrCreamster20
      @MrCreamster20 2 years ago +3

      Now what if the eventual move to ARM-based chip design diverged from the current APUs, with CPU die sizes staying the same - in turn granting everyone, producers and users alike, increased performance at lower max power usage than current CPUs? GPUs as they are now would probably halve in size with this ARM CPU technology. That way we would still have discrete socketable CPUs/GPUs, RAM, capture cards, etc., but the size, power, and probably cost benefits that go in tandem with this kind of paradigm shift in design/manufacturing would be the most noticeable.
      Just a hopeful probing thought, is all.

    • @egg-roll8968
      @egg-roll8968 2 years ago +1

      Us: We like choice...
      Companies: You will buy what we make and you WILL love it!
      Here, I'll prove it: why do most people buy either Samsung, Apple or Google phones (mostly the NA/EU market) versus the plethora of choices out there? Many of which offer equal or better performance for equal or less money.

    • @me-jv8ji
      @me-jv8ji 2 years ago +6

      @@egg-roll8968 Name a phone where you can unlock the bootloader and relock it. As far as I know, only Pixels can do that.

    • @Thomas.5020
      @Thomas.5020 2 years ago +8

      Doesn't matter what people want. We will buy what we're told to buy. It's always been that way.

  • @NavyDood21
    @NavyDood21 2 years ago +666

    I REALLY hope that the PC never actually goes this way. They would either need to come down in price so much that it wouldn't even be worth it for companies to sell them, or they would need to magically figure out how to make them just as modular as PCs currently are. I don't want to go the way of Apple and overcharge for equipment that is almost completely unrepairable by anyone but a full lab.

    • @lucasrem
      @lucasrem 2 years ago +3

      You can still buy boards with integrated Intel graphics on them, from other parties...

    • @Thep184
      @Thep184 2 years ago +33

      It will not happen. Just wait till the M1 starts to break down... upgradable... not a given. Oh, you just need more and the rest is fine? Well, sucks - buy an entire new chip. Look at current PCs: CPU 400 bucks, RAM 100 bucks, GPU 600-1000 bucks... now imagine what a comparable single chip will cost... nah man... big PCs might be power hungry, but that is how it is... a stationary PC system doesn't need super-efficient chips... and besides, PC power consumption is not what fuels climate change.

    • @cybrunettekitty5197
      @cybrunettekitty5197 2 years ago +19

      Lol everyone in the comments acting like a boomer already. Change will happen & PCs definitely won't be the same be it 5 years or 20 years, but just seeing everyone complain about something they'll eventually come to like is hilarious. Of course there will always be those like "back in my day..." "you don't ever see me getting rid of my pc" "i'll never get rid of windows xp" lmao..

    • @GarryMah85
      @GarryMah85 2 years ago +11

      I don't think the point of the video is to say that Apple will take over the PC industry, but rather that their approach of using the highly efficient ARM architecture will probably be the way of the future. Apple has shown that ARM works for high-performance computer applications, and now it's up to other PC chip makers to figure out how to do the same for the PC platform.
      Apple only makes SoCs for Macs. There's a huge Windows market out there, just waiting for someone to make great ARM-based hardware for it.

    • @Ecselsiour
      @Ecselsiour 2 years ago +2

      Perhaps not in our lifetimes. Computers used to be the size of entire rooms half a century ago. In another half century, the full tower gaming rigs today might be a wrist watch. Take what's said in this video and then add in something like carbon nanotubes, which we might crack in a hundred years.

  • @saisibi6708
    @saisibi6708 2 years ago +273

    As far as these changes go, I really, really hope the modular nature of PCs does not go away. I would very much like to keep installing the GPU of my choice and the RAM of my choice, building it all on a motherboard of my choosing, all in an enclosure that I like. So I hope it doesn't all become a bunch of boxes doing things monotonously.

    • @mikeloeven
      @mikeloeven 2 years ago +10

      In theory you could end up with more customization: desktop system boards that support universal sockets could integrate multiple different SoCs for whatever purpose you want, so going from x86 to ARM makes no difference as long as companies want to maintain modular design.

    • @maman89
      @maman89 2 years ago +3

      Doesn't this defeat the purpose of going ARM in the first place? We were promised the same thing with phones, but we all know how that went.

    • @HamidKarzai
      @HamidKarzai 2 years ago +2

      the fact that you can pack the RAM extremely close to the CPU is part of what makes Apple's SOCs so much more performant. I don't know if modularity is necessarily going to go away but it's going to come at the cost of performance, and there's nobody to blame but physics itself

    • @jiegao3591
      @jiegao3591 2 years ago

      @@mikeloeven They definitely did that for RAM and the whole DDR system; I can't see why companies couldn't make really tiny sockets with a pinout and communication standard to do something similar on a really small board.

    • @GoingtoHecq
      @GoingtoHecq 2 years ago

      @filleswe91 I was about to say that computers do not represent a significant portion of power used, but then I remembered server farms. Since those requirements shape the consumer space, and server farms need to perform for the least cost, it's obvious that we will all move to SoCs. Great for laptops, really.
      Unless I can stack chips or have adjacent sockets, computers as we know them will become very obsolete.

  • @CrzBonKerz21
    @CrzBonKerz21 1 year ago +19

    I get so much joy when it comes to building a PC. All of the parts in their own boxes.. it's like Christmas morning. Like I literally feel happy when I'm holding a computer part box. It would be so sad to not have traditional computer parts anymore.

  • @Johnrich395
    @Johnrich395 2 years ago +486

    I’m confused. Did I miss the part where ARM is required to be a soldered cpu? I figured that ARM or RISC-V would take over the compute space eventually but that we would see ARM/RISC-V components just like we see x86 components.

    • @CompMeistR
      @CompMeistR 2 years ago +69

      ARM is definitely not required to be soldered (otherwise the Ampere Altra would not exist), but ARM SoCs basically need it, as there is currently no other good way to integrate them into a system.

    • @handlealreadytaken
      @handlealreadytaken 2 years ago +183

      A completely soldered, unmaintainable throw away machine is Apple's wet dream. Would be nice to see Intel or AMD offer a solution for the desktop market.

    • @TrioLOLGamers
      @TrioLOLGamers 2 years ago

      Yep. Welcome to our world: ARM is just the excuse for Apple to solder everything. It always has been.
      They just waited for the right moment and built the right PC by looking at others' errors (Windows on ARM - it's not the first time they've come out with someone else's idea and acted like "we have something revolutionary").
      They have literally fought against right to repair, and today, with the excuse that everything is proprietary, they CAN.

    • @callumb4980
      @callumb4980 2 years ago +152

      @@handlealreadytaken It's every company's wet dream. You're deluded if you think Samsung, Pixel, etc. don't want to have the same model Apple does. For companies like Acer it's cheaper to buy off-the-shelf components, but as soon as that changes, 100% they will solder the user manual to the board if they can.

    • @MrAnderson5157
      @MrAnderson5157 2 years ago +16

      @@callumb4980 It will always be about the almighty dollar. The only reason power consumption is even a consideration is the world's cry for greener pastures. Forced changes, nothing to do with consumer considerations.

  • @CarthagoMike
    @CarthagoMike 2 years ago +924

    I have no doubt ARM will play a larger role in the PC market eventually, but I have yet to see any sign of it dominating x86 in the near future.

    • @Aereto
      @Aereto 2 years ago +141

      It can only cut into the PC market if it is seamlessly compatible with gaming in general. If it fails to run games at the architectural level, it will stay in mobile gaming, which gets the lowest level of gamer respect, with the exception of gambling games.

    • @pumbi69
      @pumbi69 2 years ago +7

      But what about Apple Silicon? They are already dominating x86.

    • @apa5749
      @apa5749 2 years ago +13

      I think it will, but only for mobile computers/laptops first. This actually makes me wonder what Apple will put into their next Mac Pro.

    • @shockwaverc1369
      @shockwaverc1369 2 years ago +25

      Not until UEFI on ARM is a thing (outside of expensive servers and underpowered SBCs) and manufacturers care more about drivers, i.e. not just releasing a fork of Linux 4.4 that requires you to install shady archived versions of GCC and calling it a day.

    • @Dave102693
      @Dave102693 2 years ago +5

      Only if developers start converting their software over to ARM... and OEMs start optimizing for ARM instead.

  • @TheDoubleBee
    @TheDoubleBee 2 years ago +280

    To be perfectly honest, I don't particularly like ARM; while not as walled-in as x86, its architecture is still proprietary. Myself, I pray RISC-V takes off instead: it is completely open, and it absolutely demolishes even the M1 in performance-per-watt, as per the Ars Technica article "New RISC-V CPU claims recordbreaking performance per watt".

    • @pazu8728
      @pazu8728 2 years ago +54

      Yes. RISC-V please.

    • @akatsukilevi
      @akatsukilevi 2 years ago +10

      @@pazu8728 I'm fully on board with RISC-V; I really want to have a computer I can daily-drive with it.

    • @BlommaBaumbart
      @BlommaBaumbart 2 years ago

      RISC-V is no solution to a walled garden. Hardware is only as free as its physical implementation. RISC-V as a concept is very free, but that will be worth nothing if the RISC-V chips that are actually made all contain spyware or locks implemented by their manufacturer.

    • @torinireland6526
      @torinireland6526 2 years ago +25

      YES, this is the way it needs to go: open architecture, expanding modularity/repairability and promoting MORE consumer control over our devices - NOT LESS.

    • @jfftck
      @jfftck 2 years ago +1

      This is where we need PC chip makers to focus to build better SoCs; the nice thing is that GPUs are also part of the design. Hardware manufacturers need to see that their hardware designs are the real proprietary parts, and the drivers should always be open source to allow continued support for those who can't upgrade all the time. Not many companies make hardware, and that should be enough to keep a company in business; the software shouldn't be allowed to force consumers into buying new hardware. Maybe a law should be passed requiring hardware to be supported with, at minimum, security patches, and all code open-sourced if the company is unable or unwilling to provide the patches, so the community has a chance to provide them.

  • @jasonkelley6185
    @jasonkelley6185 1 year ago +73

    I haven't watched much LTT, just the occasional video, but this is the second time I've seen Anthony and I'm going to start watching all his stuff. This guy is the man. When I know people like this personally I buy them drinks and try to get them to talk endlessly. He really knows his stuff. To have the versatility to be the premier Swacket model is obviously a huge bonus.

    • @Menleah
      @Menleah 1 year ago +2

      Well said!

  • @samgray49
    @samgray49 2 years ago +624

    I remember when they thought laptops would replace desktops. I don't think they'll ever fully replace desktops; maybe desktops will just get smaller and more compact.

    • @YN-io6kj
      @YN-io6kj 2 years ago +18

      @@B1u35ky That's true. I know some people who just have a laptop on a table like a desktop. For me personally, I always need a desktop to complete a certain room.

    • @JudeTheYoutubePoopersubscribe
      @JudeTheYoutubePoopersubscribe 2 years ago +23

      They have; your average person does not use a desktop-and-monitor setup. Hell, even gaming laptops have gotten good, and you can definitely get one and use it for gaming.

    • @MichaelKitas
      @MichaelKitas 2 years ago +1

      @@B1u35ky I agree, I recently replaced my PC as well with the MacBook Pro, both at the same price.
      Edit: By "replaced" I mean I use it 99% of the time; I still own the PC. As for gaming, personally I don't game; my use is mostly software-development related.

    • @PNWAffliction
      @PNWAffliction 2 years ago +19

      Yeah, laptops will never be able to replace desktops, because people need to run beasts of apps for CAD, video rendering, music, game rendering, etc. That's why we keep bouncing back and forth between terminals, laptops, tablets, etc.: nobody can make up their mind about which works best. It's extremely situational.

    • @ghangj
      @ghangj 2 years ago +5

      I sold my desktop for a laptop. Laptops will replace desktops; are you seeing the number of NUCs and micro desktops coming out? At that point you should just get a powerful laptop. All the heat and fan noise can be managed.

  • @joerhorton
    @joerhorton 2 years ago +329

    I've been through this a few times during my 52 years of life. CPUs come and go, and backwards compatibility is the key issue here. However, if the architecture can also provide that compatibility through some form of emulation or virtualisation, I can see PCs of the future being smaller and less power hungry. In the meantime I am waiting for the next gen of GPUs and CPUs to be released.

    • @seasong7655
      @seasong7655 2 years ago

      Windows 11 on ARM is said to have emulation for x86 programs. It looks like it could be done.

    • @joshsmyth130
      @joshsmyth130 2 years ago

      It's just another thing we have to learn to work with. Everything in tech changes; who would have predicted the prevalence of mobile computing 20 or 30 years ago? I'm just worried because I'm partway through a CS degree and alternative architectures haven't been covered in nearly as much depth as x86.

    • @TheArsenalgunner28
      @TheArsenalgunner28 2 years ago +1

      It all comes dooooown to profit. See, I was recently talking about this with some people when the rumoured power draws of the Nvidia 40 series were leaked. GPUs don't have to make big jumps anymore; their capabilities and performance are insane, to the point that it's almost unnecessary.
      What really would be in the consumer's interest is optimising GPUs going forward, matching the performance with less power draw and better cooling. But of course, that means risking being second in the GPU race, and of course less justification to sell the next gen at higher prices. I mean, if performance doesn't leap, why would people buy a new card for £200 more than last year's?
      It's like iPhones. They can do anything now and are amazing... almost too amazing. I barely use all the features on my iPhone 11, and this thing is expensive. How it can go beyond what it can already do, I don't know, but Apple will find something to stick on it to sell a new one the next year. GPUs will soon be heading the same way. When power draw from computers reaches the point where you have to rewire your entire house, I'd like to think that will surely be the stopping point. I know some people will be dumb enough to do it, but surely not everyone.

    • @pirojfmifhghek566
      @pirojfmifhghek566 2 years ago +2

      Depends on who wins the IPC vs wattage vs performance game--RISC or CISC. The x86 chips are certainly power hogs, but they tend to have greater IPC gains over each generation than M2 has had over M1. There's also the question of how the x86 architecture will fare as the motherboard/RAM/storage ecosystem goes through more generations. PCIe has gone through some major growth spurts lately and it doesn't seem to be slowing down. It feels like most components aren't even taking full advantage of the current bandwidth and we're already moving on to PCIe 6.0. This is the sort of thing that generally favors computers where the ability for expansion is key, and current x86 is still the gold standard for that.
      RISC has obvious benefits for efficiency and some of the x86 benefits over RISC may in fact be moot... it's just a damn shame that Apple is the only mainstream ambassador we have for that architecture. As long as they're the only contender at the RISC table, we'll never know how high it can fly. Apple simply prioritizes aesthetics, low fan noise, and engineered obsolescence over raw power. You can't tweak it. Can't put an AIO on it. Can't get any more juice out of it. There's no use. The bios is all locked down. Somebody else needs to make a decent ARM chip for PC enthusiasts. I'm curious as all hell to see what it might pull off. It'll be a cold day in hell before we get to see somebody playing around with LN2 and an M2 chip, that's for sure. I just wanna see what's possible. I think we all do.
      Another thing that hasn't really been addressed is that we may start seeing a big need for socketable, purpose-built chips that do fully analog compute tasks in the future. Look at the stuff that the guys over at Mythic are making and tell me that socketing a chip like that into a system wouldn't make a MASSIVE difference in their capabilities. Their advances have shown us that we depend far too much on pure software to solve every problem and it's become the biggest bottleneck that we have. AI is rapidly expanding in all corners of the computing landscape and, without a doubt, analog simply does it better. It's scary how much better it is at these tasks. I believe that these components are going to have as big of a presence in our computers as video cards and RAM sticks have today. The ecosystem that can integrate analog chips the most effectively will be the one that ultimately succeeds in the coming decades.

    • @lawyerlawyer1215
      @lawyerlawyer1215 2 years ago +2

      @@TheArsenalgunner28 I have a 3080 Ti with a Ryzen 9 5950X, 32 GB of DDR4-4200 RAM, a 4 TB NVMe SSD, liquid CPU cooling, and 8 Noctua fans.
      Built the system 19 days ago; all I have on it is a fresh install of Windows and Cyberpunk 2077.
      Even overclocking both the 5950X CPU and the 3080 Ti GPU, I can’t get 60 fps at 4K max settings with DLSS Quality.
      I can’t get a steady 60 fps even with DLSS Balanced.
      I have to set DLSS all the way down to Performance to get 60 fps, which has been a minimum standard for PC gaming for over a decade.
      So how are GPUs overpowered?
      There are a handful of games able to bring even the almighty 3090 Ti to its knees when pushing 4K max settings. So no, graphics cards aren’t overpowered. When the most demanding game on the market can’t make high-end cards break a sweat, that’s when a graphics card is overpowered. By comparison, the 1080 Ti was more overpowered at launch than these new cards, since ray tracing wasn’t a thing back then and 4K PC monitors were a very rare sight (and useless, because they were 60 Hz).
      Games were 1440p with no ray tracing, which the 1080 Ti could take like a champion.
      These newer cards may be much more powerful, but they face much harder challenges.

  • @jameslake7775
    @jameslake7775 2 years ago +253

    Hm. I'd say there's a mountain of "IF"s that need to be cleared before modular PCs and X86 goes away.
    Apple Silicon is power-efficient, but those designs have been under criticism for poor repairability and upgradability. Apple has sort of gotten away with it by having low expectations of repairability, but every PC becoming a black-box that requires an authorized technician and specialized tools won't sit well with many people.
    The DIY market isn't large, but does represent billions of dollars per year. There's going to be pushback from both sides to any attempt to eliminate that.
    Qualcomm has failed to compete, and I feel like I've heard them claim their next model is gonna be the one several times. Also, jumping from a duopoly to a near-monopoly known for poor long-term driver support doesn't seem like a move companies will be lining up for, and Apple Silicon/Qualcomm powered devices so far haven't been cheap (although at least the AS ones have been fast).
    Windows has a lot of legacy software that either needs to be emulated or left behind. That's either a difficult technical challenge, or it costs every business that depends on some specialty piece of software significant time and money while angering every gamer whose favorite older title becomes unplayable.
    So IF people decide power consumption matters more than e-waste, and IF the major players decide and succeed in squeezing out a multi-billion dollar industry, and IF the ARM processor market gets more competitive in terms of both performance and available vendors, and IF Microsoft can get emulation right... then maybe every Windows PC will be an SOC and soldered-down everything.
    Not to say that things aren't going to look different in the future, but following Apple into an all ARM future is easier said than done.

    • @involuntaryoccupant
      @involuntaryoccupant 2 years ago +7

      that's a very good point!

    • @jamesmicklewright2835
      @jamesmicklewright2835 2 years ago +12

      Yep. Repairability and upgradability aside, most of the things I do on my PC that couldn't be done with any old device from the past 10 years are playing games, and I tend to stick with a core library of favourites rather than moving on to the latest and greatest all the time. If they come out with a compatibility/emulation layer that has negligible performance impact AND works flawlessly (no weird glitches or bugs like I often see when emulating different architectures), then fine, but until then, no thank you.

    • @pedrovergara7594
      @pedrovergara7594 2 years ago +20

      Modular pcs are not going anywhere. Enterprise and server markets require modularity, and selling that technology to consumers means manufacturers can double-dip on their R&D investments.
      Will there be a bigger market for Apple-style SoCs? Probably, but unless there is some revolutionary change in SoCs that allows servers to run on them, we will have some form of modular components available.

    • @jamessavent3636
      @jamessavent3636 2 years ago +7

      With right to repair becoming law in more places beyond the EU, they won't be able to get away with an SoC that the average consumer isn't able to repair

    • @funrsguysandmore
      @funrsguysandmore 2 years ago +3

      Can we dumb this down for the casuals and the ones who don't wanna read?

  • @seishino
    @seishino 1 year ago +211

    Every time I need to upgrade my graphics card, I pretty much need to update my motherboard and memory and everything else anyway. For those who update yearly, I could see this being catastrophic. But for the rest of us who upgrade every 6 years or so, you pretty much need to upgrade all the related parts.

    • @tempusnostrumest
      @tempusnostrumest 1 year ago +31

      Pretty true. This is only bad for the enthusiasts who spend too much on hardware every year.

    • @colclumper
      @colclumper 1 year ago +2

      I feel your pain, I just went full X670E

    • @caveake
      @caveake 1 year ago +27

      Yeah, but if something breaks in your PC you can still easily replace/fix it

    • @The1Music2MyEars
      @The1Music2MyEars 1 year ago +14

      I built a PC in 2019 with a 9400F and a 2070 and wasn't looking to upgrade for a few years. Then I got addicted to No Man's Sky on PS5 and wanted to mod it on PC, and apparently it's very CPU-dependent, with micro-stutters. I now have only two options: buy a 9900K at high prices, or upgrade my motherboard just to buy a new CPU. Frankly I'm getting tired of PC gaming, especially having to test-drive the PC to see which graphics settings hold >60fps before you even begin playing, while on PS5 the game has already been optimized by people. And no, I don't like using GeForce Experience to have it throw supposedly optimized settings at my games. The other day a Windows update forced a hard reboot. I cannot escape Microsoft's updates, which have caused nothing but stress.

    • @panzrok8701
      @panzrok8701 1 year ago +11

      It's far cheaper to buy more RAM and swap the GPU. The CPU is usually not the bottleneck.

  • @MatchTerm
    @MatchTerm 2 years ago +282

    I am EXTREMELY glad that most of the comments here support what I'm seeing myself: no matter how efficient this can be, in the state Apple wants to sell it to you, this throws the ENTIRE concept of repairing out the window. Not just that, upgradability is out too, besides "buy a new computer" or maaaybe increasing storage size.
    I shall not, in the current state, accept ARM chips and SoCs substituting desktop PCs in their entirety. The normal consumer might see the benefit of using less energy for the performance, but they are going to pay by being locked into institutionalized planned obsolescence, for EVERYONE.
    Either make ARM SoCs upgradable and repairable like x86, or do not bother trying to make me move to this bullshit, plain and simple.

    • @afriendofafriend5766
      @afriendofafriend5766 2 years ago +39

      Yup. Just remember what Apple has done. They've artificially locked the SSD ID so that you can't even upgrade capacity. PCs? Nah. I took an old ssd from a random laptop and it works perfectly fine.

    • @MatchTerm
      @MatchTerm 2 years ago +23

      @@afriendofafriend5766 Exactly, that's one of the PC's greatest strengths, and I just do not understand why people want to kill it off.
      Like, if you want a simple solution and don't care (even though people should, it is fun, but I understand time constraints), you can get a laptop. And there are already laptops that are repairable as well.
      Just do NOT try to shove this down the throat of people like us who still want to repair our stuff instead of always asking the corpo, who actually like to tinker and play with our hardware.
      Like I said before, in the long term this will affect the normal consumer as well, who will have to throw away the entire system just because one of the SoC's components failed. But if that's really an inevitability, at least leave us alone please...

    • @Dave102693
      @Dave102693 2 years ago +5

      Yeah…that pisses me off.

    • @64bitmodels66
      @64bitmodels66 2 years ago +6

      I'm not THAT worried, since this kind of stuff will always exist for people like us. Also, what's stopping other companies from copying and improving upon Apple's design?
      What I'm more worried about is how this is going to affect compatibility. x86 and ARM are completely different; they do not work together.

    • @jamesmicklewright2835
      @jamesmicklewright2835 2 years ago +11

      Exactly. My last PC lasted me just about 10 years, thanks to various upgrades along the way. I wonder how long it would have lasted if I was stuck with the configuration from 2011 - 4GB RAM, a Radeon HD6870 GPU, and a 500GB mechanical HDD? The i7-2700K was still decent 10 years later. Everything else in the box ran out of steam in half that time.

  • @shiropanx
    @shiropanx 2 years ago +203

    I've been hearing that PCs need to change since I first got the chance to build one for myself way back in the mid '90s. Always doom and gloom: Apple this, mobile that. Still... conventional PCs are still around.

    • @wakannnai1
      @wakannnai1 2 years ago +15

      Yeah I think the doom and gloom is a bit overplayed here. I don't think the market is going to change dramatically in the next few years. For most consumers who would consider a conventional PC, I doubt efficiency is at the top of their list of priorities when purchasing a product. Most consumers in this space are fine with power consumption being higher as long as it's not excessive. Performance is far more important here. The M1 is a good chip that has big implications for Apple, but not the industry as a whole. It will be far more important on low and medium power consumption markets where efficiency matters like smartphones and laptops. We will probably see significantly more appealing and efficient APUs/SoCs from both Intel and AMD as they attempt to keep the laptop PC market competitive with not only each other but also with Apple. I think we're already seeing this with Phoenix Point from AMD. I'm sure Intel has something similarly radical as well.

    • @whwhwhhwhhhwhdldkjdsnsjsks6544
      @whwhwhhwhhhwhdldkjdsnsjsks6544 2 years ago +2

      @@wakannnai1 idk, in some places at least energy prices have become extremely high and it just isn’t viable to be burning it all on slightly higher performance when you don’t have to

    • @xrayl7392
      @xrayl7392 2 years ago +4

      Yeah, even this video is dead wrong. AMD's and Intel's top designers have explained multiple times that ARM doesn't enjoy the same benefits in a normal computer environment as it has in phones and Macs. So yeah, the "normal" PC isn't dead, and another misleading TechTips video is released to the world (just like the 5800X3D review with performance numbers no other outlet seemed to agree with xD)

    • @beukneeq5766
      @beukneeq5766 2 years ago

      exactly

    • @wakannnai1
      @wakannnai1 2 years ago +1

      @@whwhwhhwhhhwhdldkjdsnsjsks6544 I doubt that 200W vs 500 or even 600W peak power is going to change anyone's mind. Even at 650W peak power, that works out to around an extra $1 per 8-hour session if you run the system at full blast at $0.33/kWh (close to 50% higher than my current electric rate). Even if this were doubled, 650W is still viable for most users who would spend $1000+ on computer hardware. If you're willing to spend $3999, where the Mac Pro starts, I doubt an extra $2-4 on the electricity bill every week will matter.
      For the other crowd, the professional, an extra $10/week is marginal compared to the money they're generating, so for them it's probably an acceptable tradeoff.
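The back-of-the-envelope electricity math in the comment above can be sketched in a few lines. All of the figures (650 W vs. 200 W, 8-hour sessions, $0.33/kWh) are the commenter's assumptions, not measurements:

```python
# Extra electricity cost of a high-draw PC vs. a low-draw one.
# All figures are the commenter's assumptions, not measurements.
def extra_cost_usd(high_w: float, low_w: float, hours: float,
                   usd_per_kwh: float = 0.33) -> float:
    """Cost of the *additional* watts over a session of `hours` hours."""
    extra_kwh = (high_w - low_w) / 1000 * hours
    return extra_kwh * usd_per_kwh

per_session = extra_cost_usd(650, 200, 8)  # ~$1.19 per 8-hour session
```

At these rates the delta is roughly a dollar per full-load 8-hour day, which is the comment's point: trivial next to a $3999 machine.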

  • @aaronspeck1644
    @aaronspeck1644 2 years ago +75

    I FINALLY got my RGB to sync properly and you're telling me a silver cube is going to take that from me?!? Nooooooo

    • @vidhyachan6494
      @vidhyachan6494 2 years ago

      Hahah this made me crack up

    • @chadultrapromax1735
      @chadultrapromax1735 2 years ago

      xD Razer synapse doesn't work though

    • @joesterling4299
      @joesterling4299 2 years ago +3

      RGB can't die soon enough for me. I'm not willing to buy into Apple to get there, though. Screw that noise.

    • @hAT81
      @hAT81 2 years ago

      @@joesterling4299 you could say Apple's cooling systems are miniaturized jet engines

    • @giornogiovanna734
      @giornogiovanna734 2 years ago

      @@yann2850_ it boosts performance duh

  • @jaredweiman2987
    @jaredweiman2987 1 year ago +3

    I gave my dad my M1 iMac after building a gaming PC. MacOS hardly has any games to play on it. It doesn’t matter that it CAN run current gen titles if Apple and software devs don’t allow it to. Until 80% or more of the steam store is available on MacOS, there’s zero point in having one as a dedicated gaming setup.

  • @gwgux
    @gwgux 2 years ago +49

    This is why open standards and open systems, such as ones running Linux distributions, are so important. The more compact and integrated the hardware, the more lockdown can be done by the likes of Apple and Microsoft. In all things in computing there are trade-offs, but at least you can retain control over YOUR systems by adhering to open standards and operating systems.

  • @Somtaaw7
    @Somtaaw7 2 years ago +246

    Cool. Super looking forward to the future where everything is one chip and if you want to upgrade or even replace a broken component you need to throw the whole thing out and buy another system from a vendor. Super exciting.

    • @lukasl3440
      @lukasl3440 2 years ago +27

      Didn't we already have this in the form of laptops, mobile phones, tablets and game consoles?

    • @iandavid7730
      @iandavid7730 2 years ago +41

      Wonder what colour the household e-waste bin will be.

    • @lukasl3440
      @lukasl3440 2 years ago +5

      @@iandavid7730 In my country it's red.

    • @Kholaslittlespot1
      @Kholaslittlespot1 2 years ago

      @@lukasl3440 yeah, exactly...

    • @PropaneWP
      @PropaneWP 2 years ago +21

      @@lukasl3440 Which is why people who know better don't waste money on expensive laptops, phones, tablets and game consoles. When in need, they buy the cheapest adequate-quality product available, as it will be e-waste in a couple of years anyway.

  • @enchantereddie
    @enchantereddie 2 years ago +482

    I can't even recall what year the saying "PC is dying" first came out. And obviously, it still hasn't finished the process. What I believe is that there will always be a need for the highest possible performance out of the desktop form factor, whether for enthusiasts or for some businesses. Lower-end PCs have been made into very small boxes for years and are used as family rigs as well as business PCs, but they are not killing the bigger desktop boxes. Computing technology can be built at many sizes, from supercomputers to mobile phones, and the desktop is one sweet spot among them. Even if ATX and x86 are replaced by something better (possibly ARM), people will always want/need their computers to be customizable and to allow them to grow for the sake of better performance.

    • @shannono.5835
      @shannono.5835 2 years ago +41

      And consumers and enthusiasts will continue to demand a customizable PC, just as auto-industry modders find ways to customize and upgrade what is otherwise an automotive SoC. I think the basic functionality of future offerings will be locked but upgradable "for a fee". Maybe the future looks like the home PCs of the '80s: a Mac Mini-style initial presence plus an "expansion box" to permit the customization and upgradability that enthusiasts demand (remember the upgradable laptop docks, anyone?)

    • @jb888888888
      @jb888888888 2 years ago +30

      It seems to me that "PC is dying" came about when they noticed that the sales weren't growing exponentially any more. It is my semi-considered opinion that the market is saturated: everyone who wants a PC already has a PC. (Those who die off are being replaced by the first timers.) Hence a flattening of new PC sales. People will still want to upgrade their PC to a better one, or upgrade individual components.

    • @enchantereddie
      @enchantereddie 2 years ago +6

      @@jb888888888 Hi, I reckon one possible explanation for those stalled sales was that, at the time, AMD wasn't doing well and Intel didn't make much progress on their next-gen CPUs, so people waited longer before upgrading their PCs. Much of the market share was lost to mobile phones as well.

    • @harshbarj
      @harshbarj 2 years ago +15

      "PC is dying" has been said since at least the early '90s, so 30-ish years. Now when it happens (and it will eventually), people will say: see, we told you. Say something possible or likely for long enough and eventually you'll be right. What would impress me is if people would give a year. Otherwise GTFO.

    • @ArcangelZero7
      @ArcangelZero7 2 years ago +14

      @@harshbarj "Say something long enough and eventually you'll be right." See, why can't we keep repeating CONSTRUCTIVE things like "THIS will be the year of the Linux Desktop!" :)

  • @PathfinderKat
    @PathfinderKat 1 year ago +15

    I love my PC, and it's going to get a giant upgrade soon. I love the customization. I love that I've had it since I was 16 and it has grown with me. Being able to customize it has been a huge part of that.

  • @dylf14
    @dylf14 2 years ago +301

    Every time I hear about how RISC is a much simpler instruction set and how CISC is simplifying with extensions, remember that RISC is increasing its own complexity as well. Both sets are moving toward a common center. A RISC-based system requires API makers, driver teams, and OS teams to reorient themselves to use more instructions to accomplish the same task. Software designers don't want to spend lots of time optimizing and would rather move on to another project, because that works better from a revenue perspective. Think about how much we ask game developers to optimize games for certain architectures, and they just abandon the game and move on to the next title...
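The "more instructions to accomplish the same task" point can be made concrete with a toy model of a memory increment. This is an illustrative sketch, not real ISA semantics; the mnemonics in the comments are the typical textbook x86 vs. AArch64 sequences:

```python
# Toy model: incrementing a counter that lives in memory.
# A CISC ISA (x86) can encode the read-modify-write as ONE instruction,
# while a load/store RISC ISA (AArch64) needs THREE.
mem = {"counter": 41}

def cisc_inc(addr: str) -> int:
    mem[addr] += 1        # x86: add dword ptr [addr], 1
    return 1              # instructions executed

def risc_inc(addr: str) -> int:
    reg = mem[addr]       # AArch64: ldr w0, [x1]
    reg += 1              #          add w0, w0, #1
    mem[addr] = reg       #          str w0, [x1]
    return 3              # instructions executed

cisc_count = cisc_inc("counter")
risc_count = risc_inc("counter")
```

The RISC version issues three times the instructions here, though each is simpler to decode; which approach wins in practice depends on the implementation rather than the instruction count alone.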

    • @kazioo2
      @kazioo2 2 years ago +38

      Jim Keller, the greatest CPU designer currently living on this planet, who has done both, laughs at all these online wars over CPU architectures and claims the real, fundamental differences are minimal. The devil is in the actual implementation, not the architecture. If you don't believe him, compare Qualcomm's ARM cores to Apple's ARM cores, or even within a single company: compare x86 Zen 3 to Bulldozer (FX). SAME COMPANY, SAME INSTRUCTION SET, drastically different IPC and efficiency (even when you ignore smaller node gains).

    • @n0stalgia1
      @n0stalgia1 2 years ago +3

      Developer ecosystems, especially on that level, are much longer-living. By your logic, the Linux kernel would've been abandoned a year after its release because it wasn't "better at a revenue perspective". And yet here we are.

    • @anon_y_mousse
      @anon_y_mousse 2 years ago +5

      Exactly. It just bugs the crap out of me that so many people praise RISC while you have to {load/do something/store} everywhere instead of just {do something} like with CISC. I don't care if the internal structure is more complex, the designers just need to do better there. It allows for faster code execution and always has, which is why CISC has been king these past 30 years. While RISC may be catching up, it's a missed opportunity for either Intel or AMD to not come up with a complete redesign and push for people to switch. They could absorb the costs, they have the money, and get everyone to switch to a clean and modern CISC design, but they're scared.

    • @copperfield3629
      @copperfield3629 2 years ago +12

      The real genius of fully optimising the operation of code on a RISC CPU is down to the compiler writers. While the real low level stuff may be written in assembler (memory copy functionality being a typical example which is tweaked depending on the exact hardware platform in use), for the operating system (drivers etc) a higher level language will be used and the creation of efficient code to run on that CPU is down to the compiler. Most of the OS software engineers don't need to get mired down in the minutiae of the processor's architecture.

    • @eloyreno
      @eloyreno 2 years ago

      A real conversation. Don’t know what’s going on but I’m here for it.

  • @ash36230
    @ash36230 2 years ago +172

    Me reading the title and thinking "Oh no, another chip shortage is coming"
    2:00 "It's wasteful" he says of a machine that Apple wants you to throw away if something minor breaks, because you can't easily replace or upgrade any part of it.
    If a change comes, it'll be through other ARM chips. The desktop will still be here, hopefully with replaceable components, but it'll change slowly: for as long as Microsoft holds Windows on ARM back, and as long as there's no way on Windows to run x86/64 programmes with at least the same efficiency and stability as Rosetta 2.

    • @cosmicclan6608
      @cosmicclan6608 2 years ago +7

      I thought GPU shortage lmao

    • @aetherxsn1591
      @aetherxsn1591 2 years ago +1

      @@cosmicclan6608 same

    • @RickyBancroft
      @RickyBancroft 2 years ago +17

      My thoughts exactly - when I upgrade, I don't throw away my parts, I put them on eBay so someone else can enjoy them for a few more years. The opposite of wasteful.

    • @renderedpixels4300
      @renderedpixels4300 2 years ago +1

      Although there's a lot of disingenuous stuff in the video, I think he's referring to the M1 as in ARM, not Apple. But like x86, ARM is intellectual property too, and is just as closed as x86 in a lot of ways. ARM would be a good thing for the energy consumption of computers in the long run, but like so many things in the tech world, it's a chicken-and-egg situation where ARM doesn't gain traction because of the lack of support, and the lack of support comes from nobody using it and devs not wanting to put time into it.
      There are too many benefits to ARM in the portable space to ignore, though, and I hope the M1's power coaxes companies into developing ARM versions of their software.

    • @MmntechCa
      @MmntechCa 2 years ago

      Apple's dirty little secret: they brag about reducing CO2 emissions, but don't mention how much e-waste their disposable tech model generates.

  • @MustacheInaBox
    @MustacheInaBox 2 years ago +379

    God, the title made me have a heart attack, like when Linus said everyone should buy parts back in 2020

    • @Dr_Steal_Computer
      @Dr_Steal_Computer 2 years ago +3

      no one should buy parts, Linus is irresponsible

    • @JAN0L
      @JAN0L 2 years ago +16

      And he was correct about the upcoming shortages back then. I still remember the video response from Gamers Nexus "debunking" it.

    • @Zorooooooooooooooooooooooooooo
      @Zorooooooooooooooooooooooooooo 2 years ago +8

      @@lunarvvolf9606 ...lol.

    • @aarvlo
      @aarvlo 2 years ago

      @@lunarvvolf9606 First of all... what? Second of all, what would China even have to gain from people not buying parts? Crypto mining? They made it illegal last year. Chinese competitors for American part producers? The Chinese desktop industry isn't developed enough to compete with Taiwan and the US.

    • @adamanggoro7258
      @adamanggoro7258 2 years ago

      @@lunarvvolf9606 what do the Chinese have to do with the global chip shortage?

  • @nitroxide17
    @nitroxide17 1 year ago +9

    In 2010/2011 everyone thought the enthusiast PC market was going to die. It hasn't died yet. SoCs will grow, but add-in cards aren't going to disappear. ARM might take over from x86, but socketed components will stay.

  • @cosmicusstardust3300
    @cosmicusstardust3300 2 years ago +83

    I kind of feel like we're seeing a repeat of 2012, when Microsoft was predicting that mobile devices like tablets and smartphones would soon replace the desktop (hence how Windows 8 became a thing). And here we are 10 years later, and the desktop has gotten bigger and better.

    • @cleon24769
      @cleon24769 2 years ago +7

      You just gave me funny flashbacks to the _months_ of consumer demand to bring back the Start button.
      In fact, wasn't there something where the first user-created "shell" interface to replace Windows 8's UI was so popular that the website hosting the download kept getting knocked offline?

    • @BlommaBaumbart
      @BlommaBaumbart 2 years ago +2

      For many people, tablets and smartphones have replaced the desktop. According to StatCounter, these two categories make up 61% of computing devices. In Africa, the second-largest continent, they're over 73%. Japanese teenagers largely play ONLY games that work on their phones, and this includes Fortnite-like games and Resident Evil-like games and other genres you associate with desktop PCs. The future fits in a pocket.

    • @cosmicusstardust3300
      @cosmicusstardust3300 2 years ago +16

      @@BlommaBaumbart You only listed two parts of the world... Africa and Japan do not reflect where a trend will end up for everyone. And I still see tons of statistical sources pointing not to a decline but to healthy growth in demand for PC gaming. I highly doubt the desktop will be going anywhere anytime soon; people have been saying this crap for almost two decades now.

    • @4cps777
      @4cps777 2 years ago +1

      Add to that the fact that SoCs are probably not that great for servers from what I can tell. (However, I don't have any experience with actual server hosting so take this with a grain of salt)

    • @4cps777
      @4cps777 2 years ago

      @@BlommaBaumbart That's great, but as long as there are enough tech enthusiasts out there, as well as enough people who actually go to work (even iPads, which I personally consider to provide peak out-of-the-box productivity for touchscreen devices, don't even get close to desktops, even with bad WMs such as the Windows one), the desktop ain't going anywhere.

  • @rossjennings4755
    @rossjennings4755 2 years ago +364

    The direction that this video speculates is the future of computers really has two pieces: moving from x86 to a RISC architecture like ARM, and moving away from discrete CPUs and GPUs toward SOCs, and it's really only the second one that sucks for enthusiasts (and anyone who values replaceable parts). Apple would like us to think the two pieces are inseparable, but they don't have to be. I can imagine a future where the current PC ecosystem is replaced by a new ARM or RISC-V-based platform, but that platform still allows devices to be assembled from components made by different manufacturers using standardized interfaces and an assembly process at least somewhat accessible to people outside of fabs, so that those components can easily be upgraded or replaced if they fail.
    In other words, I don't want a Mac, but I do want something like the Framework laptop, but NUC-shaped and running on an ARM or RISC-V CPU. It's all right if the PC has to die, as long as its spirit can live on.

    • @Hack3900
      @Hack3900 2 years ago +36

      Please let RISC-V be the winner, I don't want more licenses in the way of tech when open alternatives exist ;-;

    • @haysoos123
      @haysoos123 2 years ago +4

      We don't know what the Apple silicon Mac Pro is going to be yet, which presumably should be their modular system. Now, it may be that the parts are 'from one manufacturer' at least for the near future, but it's also possible that there would be third parties involved if it made sense.

    • @AR15ORIGINAL
      @AR15ORIGINAL 2 years ago +4

      What is a NUC?

    • @RobertWilke
      @RobertWilke 2 years ago +8

      That would be optimal; that said, these companies would love to just lock you in and never let YOU choose. As much as I like Macs, I do like choosing my parts. Change is inevitable; how and where that change goes is the issue. Let's hope it's for the better.

    • @davidperry4013
      @davidperry4013 2 years ago +4

      A socketed 65W-TDP ARM CPU and a 105W-TDP ARM-based dedicated GPU with HBM2 VRAM would be the future of custom-built gaming rigs.

  • @alexpappas1573
    @alexpappas1573 2 years ago +43

    I remember in 2011 they said gaming pcs are dead and tablets are coming to replace both laptops and desktops. 😄

  • @computerpwn
    @computerpwn 1 year ago +3

    This is scary and should be talked about more… trading away the freedom to choose for unmatched efficiency. How can you service something you can't take apart?

    • @computerpwn
      @computerpwn 1 year ago +1

      and this is exactly what Nvidia will do, the way things have been going

  • @DerekHubbard
    @DerekHubbard 2 ปีที่แล้ว +433

    I think the new Apple chips are amazing, but Apple will hold onto them with a death grip. They'll ensure that the rest of the world can't use them or anything like them. Also, while they may have the neatest thing going in high-efficiency high-performance computers, they will continue being actively hostile toward their customers. They're an objectively awful company.

    • @Knowbody42
      @Knowbody42 2 years ago +52

      I won't give Apple a cent of my money unless they do offer them in a product that is upgradeable and repairable.

    • @psychoclips6817
      @psychoclips6817 2 years ago +27

      Apple cannot stop other companies from making their own SoCs. Companies already do this and did it before Apple ever made the first M1 SoC. Unlike those other companies Apple just did it well for the intention of demanding workloads.

    • @brettski74
      @brettski74 2 years ago

      Apple can't stop anyone else from using "anything like them". They're using technology licensed from ARM. The same technology is licensed to many companies, so they can't lock them out of the technology. They're also manufactured by TSMC, who manufacture chips for numerous companies including other ARM licensees, so Apple also can't lock them out of the manufacturing process. Apple's implementation is better than most and they don't have to sell M1s to anyone else if they don't want to, but there's nothing stopping some other ARM licensee from making something similar other than time and money.

    • @beeldbuijs1003
      @beeldbuijs1003 2 years ago +12

      @@Knowbody42 I'd like to see more upgradability and repairability too, but on the other hand: how many people have ever upgraded their desktop? 99% never bother and just keep using it until it becomes too slow, and then buy a new one. BTW, still using a 2008 Mac Pro. Gave it an SSD, a second-hand video card that supports 4K, and even updated it to macOS 10.14, which Apple didn't support on this 3,1 model. Have been using it daily for 14 years straight as of December.

    • @Knowbody42
      @Knowbody42 2 ปีที่แล้ว +23

      @@beeldbuijs1003 It's not just about upgrading. Most people have had to get something repaired.
      Lots of people have lost their data because they use an Apple product with soldered SSDs which are designed to not be repaired.

  • @RowanBird779
    @RowanBird779 2 ปีที่แล้ว +184

    I wish more CPU manufacturers existed, like Cyrix, so it's not just red vs blue

    • @billj5645
      @billj5645 2 ปีที่แล้ว +1

      I used to have a Cyrix processor - much more performance for much less cost. What happened to them?

    • @RowanBird779
      @RowanBird779 2 ปีที่แล้ว +1

      @@mattmccastle9145 I already know quite a lot about Cyrix

    • @khatdubell
      @khatdubell 2 ปีที่แล้ว +24

      I hope the irony of you wishing there were more tech competition while flying the logo of one of the biggest crushers of competition in tech as your profile pic isn't lost on you.

    • @RowanBird779
      @RowanBird779 2 ปีที่แล้ว +3

      @@khatdubell I'm well aware of what Microsoft did, but hardware and software are two separate things, I guess CPUs are software now?

    • @khatdubell
      @khatdubell 2 ปีที่แล้ว +22

      @@RowanBird779
      So, what? You're implying you're ok with it when it comes to software but not with hardware?
      And FWIW, MS _did_ essentially do this with hardware as well by teaming up with IBM

  • @Da40kOrks
    @Da40kOrks 2 ปีที่แล้ว +313

    AMD went with chiplets because they make silicon yields less impactful. I can't imagine making bigger and bigger SoCs would have high enough yields to be really economical in the long run.

    • @pearce05
      @pearce05 2 ปีที่แล้ว +19

      SOCs can be made with chiplets. Apple's M1 uses them.

    • @jonathan21022
      @jonathan21022 2 ปีที่แล้ว +5

      In theory this would allow them to add custom ARM cores to their CPUs, similar to how Intel has efficiency cores. This would let newer software move to the ARM side while maintaining compatibility. They could limit the x86 cores to what's needed for compatibility and pull from ARM for improvements where that's possible. However, what this video left out is that most software would become unusable on an ARM-only processor, and its suggestion of using efficiency cores rather than performance cores may be backwards given the performance needed to maintain legacy compatibility.

    • @SlyNine
      @SlyNine 2 ปีที่แล้ว +15

      @@pearce05 and wait until Apple tries to match Intel performance, all that efficiency is gone. This video is a big nothing burger.

    • @info0
      @info0 2 ปีที่แล้ว +4

      @@SlyNine yup, power draw would be ginormous and probably would melt the M chip.

    • @MoireFly
      @MoireFly 2 ปีที่แล้ว +11

      @@SlyNine Not necessarily - the M2 already comes very close to intel perf. And they do that not at the cost of power, but at the cost of... cost; they throw silicon at the problem like it's nothing. Nobody else in the industry spends like that; and they probably _can't_ afford to even if they wanted to.

  • @maxcarter5922
    @maxcarter5922 2 ปีที่แล้ว +76

    I want more opinion pieces or even video essays like this. We need stronger opinions about this industry.

    • @Arcaryon
      @Arcaryon ปีที่แล้ว +1

      I am not a "tech type" so I got to ask: aren’t there any? I thought that opinion pieces are pretty much universal and while some channels may not have them, I feel like educated speculations and predictions etc. are a part of the information chain in basically all fields.

  • @existentialselkath1264
    @existentialselkath1264 2 ปีที่แล้ว +153

    Laptops, tablets, and phones always advance faster than anyone expects, but they never completely replace the PC.
    As tech gets smaller and smaller, you can get more and more power by keeping it the same size.
    Just like laptops ate into the PC market share for web browsing and average work, SoC computers will probably do the same to an extent for video editing or whatever. But they can never fully replace the maximum price/performance form factor of a PC, even if PCs move to a newer architecture to keep up.

    • @jabbany2715
      @jabbany2715 2 ปีที่แล้ว +4

      As a form factor sure, but future PCs will look like these Mac Minis. Tight integration means nothing to upgrade and low power draw means more compact dimensions. Back in the day, people could upgrade the cache on processors, but as that got too fast it became a built-in part of the chip. In the future, this will likely be the same case for RAM where you do not have memory modules but it is just part of the chip package.

    • @MrMaddog2004subscribe
      @MrMaddog2004subscribe 2 ปีที่แล้ว +6

      @@jabbany2715 I doubt it since everyone needs different amounts of ram. People who need 64gb of ram for specific work cases aren't going to want to buy a more expensive version of a CPU just to get more ram. I don't see upgrading ram going away at all. Too many people would be upset about it and it wouldn't make much sense to do so

    • @existentialselkath1264
      @existentialselkath1264 2 ปีที่แล้ว +2

      @@jabbany2715 you imagine ram in the future being small enough to fit on the chip. I imagine ram in the future being small enough to fit terabytes on a stick.
      The whole point of cache is that it's faster to access than anything else in the system. Integrating it into the cpu is a natural thing to do. I'm not convinced the same thing will happen for ram and especially gpus.

    • @TheMyleyD
      @TheMyleyD 2 ปีที่แล้ว +4

      I think the issue is Moore's Law. We are getting to a point with the x86 architecture where we are spending increasing power for diminishing performance gains. ARM or RISC* is logically the next step. But I'm sure the future will bring different architectures.

    • @afriendofafriend5766
      @afriendofafriend5766 2 ปีที่แล้ว +1

      @@jabbany2715 Ironically you now *can* once again upgrade the cache.

  • @thegardenofeatin5965
    @thegardenofeatin5965 2 ปีที่แล้ว +311

    I think the main thing keeping x86 relevant is Windows. Windows, and more to the point Windows' ecosystem, really can't abandon x86, because there's so much software that REQUIRES it. Apple has such a tight grip on their platform that they can dictate unilateral and very sudden platform shifts. Linux is source-available so as soon as the new architecture is supported by GCC someone somewhere can start pressing the compile button. Windows? There's gonna have to be an end of a decades-long era.

    • @petrkisselev5085
      @petrkisselev5085 2 ปีที่แล้ว +40

      Hence why Valve are experimenting with their Linux-based Steam OS on the Steam Deck.

    • @quinnmichaelson6793
      @quinnmichaelson6793 2 ปีที่แล้ว +16

      I mean, I have a Surface Pro X with a Microsoft-designed ARM chip. It's not perfect, it's not faster than the Mac M1, but it's a hell of a lot more useful than my old Android tablet, and it works well enough with software emulation.
      I remain extremely skeptical that ARM can completely overtake x86-64 to the extent the video here argues, though.
      Apple has been doing SoC-like things (closed unibody, no changing parts) far longer than its ARM chips have been around, so they are just a bad example to point to for wide-reaching changes.
      I think the far more likely approach is some sort of hybrid power-saving architecture integrated into most desktops, the same way that dedicated GPUs on laptops slowly developed tech to allocate power based on current usage.
      Then we will eventually see more powerful ARM CPUs integrated into traditional motherboard designs, rather than completely replacing them.

    • @theguythatcoment
      @theguythatcoment 2 ปีที่แล้ว +13

      Nah, x86 is alive because server farms buy every single one of the top-line chips before they launch to consumers, and server farms love x86 because of virtualization. x86 chips are in essence a lot of simple RISC-like units emulating a 64-bit CISC computer. Why buy RISC chips if you can't efficiently emulate x86 with them, when you can the other way around?

    • @tin-n-tan
      @tin-n-tan 2 ปีที่แล้ว +15

      Most Linux is run on x86. It's Linux, so it can run on rainbows and nice thoughts, but it still mostly runs on x86.

    • @arsoul3591
      @arsoul3591 2 ปีที่แล้ว +18

      The M2 chip literally has billions more transistors than the most recent x86 stuff from AMD / Intel does and is like twice the size or more. Why are people simply ignoring this? ARM is not more powerful, the only thing it has over x86 is less power per instruction, that's literally it... x86 isn't going anywhere.

  • @LockeTheCole
    @LockeTheCole 2 ปีที่แล้ว +124

    Hilarious to see a company that has been pushing so hard for right to repair releasing a video about how we need to move to more integrated components that are by design impossible/near impossible to repair.

    • @huleyn135
      @huleyn135 2 ปีที่แล้ว +6

      Because the right to repair matters precious little compared to the literal future of chipmaking and performance.

    • @clark85
      @clark85 2 ปีที่แล้ว +24

      Exactly, I don't know what the point of this is. It's something we definitely do not want at all

    • @clark85
      @clark85 2 ปีที่แล้ว

      @@huleyn135 Hardly, it sounds more like the sky is falling

    • @anshikatiwari2785
      @anshikatiwari2785 2 ปีที่แล้ว +13

      Bro, it's not their fault, they're just stating facts, right?

    • @MightyGachiman
      @MightyGachiman 2 ปีที่แล้ว +20

      @@anshikatiwari2785 They are, but advocating for it is straight out of left field for SUPPOSED advocates of right to repair lmao.

  • @artem.boldariev
    @artem.boldariev ปีที่แล้ว +13

    One thing needs to be noted: in most modern x86-based computers there is really no need to keep backward compatibility with the 16-bit 8086 and more of that cruft. I think this is going to be dropped eventually - there is zero reason to keep it on modern, UEFI-based computers.

    • @LiqqaRoni-cx3tx
      @LiqqaRoni-cx3tx ปีที่แล้ว

      So no more ax, al, and ah registers?

    • @artem.boldariev
      @artem.boldariev ปีที่แล้ว

      @@LiqqaRoni-cx3tx They are still available as parts of the RAX register.
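
For readers following along, the aliasing this exchange is describing can be sketched with plain bit masks (a model of how the narrower register names overlay RAX, not actual register access; the example value is arbitrary):

```python
# Model of x86-64 register aliasing: AL, AH, AX, and EAX are
# overlapping views into the low bits of the 64-bit RAX register.
rax = 0x1122334455667788  # arbitrary example value

al = rax & 0xFF             # low byte            -> 0x88
ah = (rax >> 8) & 0xFF      # second-lowest byte  -> 0x77
ax = rax & 0xFFFF           # low 16 bits         -> 0x7788
eax = rax & 0xFFFFFFFF      # low 32 bits         -> 0x55667788

print(hex(al), hex(ah), hex(ax), hex(eax))
```

On a real CPU the narrow names read and write those same low bits (with the wrinkle that 32-bit writes zero the upper half), which is why the old register names survive on modern chips.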

  • @MastermindAtWork
    @MastermindAtWork 2 ปีที่แล้ว +54

    The main thing I want to keep, even though integrated CPUs + GPUs are very convenient, is upgradability: having the ability to upgrade storage and RAM personally, without spending an extra $1000 for a 16GB RAM and 2TB storage setup.

    • @garrettrinquest1605
      @garrettrinquest1605 2 ปีที่แล้ว +2

      Exactly! The thing that is the best about the PC space right now is the modularity. If they could keep that somehow, I'd be super down to move to ARM over time

    • @storyanaksekolah2
      @storyanaksekolah2 2 ปีที่แล้ว +1

      until we can see a modular arm system

    • @ids1024
      @ids1024 2 ปีที่แล้ว +2

      Upgradability, and the ability to combine different options for memory, storage, cpu, gpu, other PCIe cards (video capture, AI accelerator, etc.). With everything integrated on one board, and a lot of it in one chip, there won't be as many options.
      Perhaps we still see this modularity in enthusiast gaming hardware partly because it's essential in the server and workstation space, where hardware can be configured in vastly different ways depending on its use (GPU compute server loaded with GPUs but minimal storage; storage server with a huge number of SSDs and no GPUs, etc.).

    • @vizender
      @vizender 2 ปีที่แล้ว

      Well, I think storage upgradability is important, but I'm not that sure about RAM. Right now, from what I understand, one of the biggest issues with x86 RAM is the speed at which it communicates with the CPU/GPU. In an ARM SoC, the RAM being so close to the CPU/GPU cores makes it so much more efficient. I've seen posts from people saying they went from a 16GB Intel Mac to an 8GB M1 Mac and RAM seemed to perform very well in comparison to the older Intel model.
      They had not specified any specific data, but I would not be surprised if it's true.
      I really think for most consumers, who don't use their computer for some very specific tasks, upgrading the RAM after the initial purchase (if we consider the initial purchase to be sufficient at the time you bought it) will not be important for at least a few years, and currently most PCs after some years have had so many parts replaced that they're basically a new PC.

    • @aravindpallippara1577
      @aravindpallippara1577 2 ปีที่แล้ว

      ​@@vizender m1 ram isn't anything particularly special - just lpddr4 as far as I understand ( better than ddr4 worse than ddr5)

  • @alexmills1329
    @alexmills1329 2 ปีที่แล้ว +541

    I feel like this leaves out critical information, like how Apple is using a more efficient and denser silicon node, meaning the chip would be both larger and more power hungry if they did this on 7nm like AMD Zen 3. Zen 4 is on 5nm, and that will be an interesting comparison later this year.

    • @ikjadoon
      @ikjadoon 2 ปีที่แล้ว +28

      Yeah, this was an oversight, but in the end, it didn't matter. Even comparing Zen3 to Apple A12 (both on TSMC N7 non-P), Zen3 ate 11.38W avg for 71.30 SPEC2006 points, while A12 took 3.91W for 49.85 points. On *identical nodes*, A12 (3 generations old now) was still 2x more efficient than Zen3. Source: AnandTech's Zen3 + A12 reviews.
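
As a quick back-of-the-envelope check of the figures quoted above (taking the AnandTech numbers as given):

```python
# SPEC2006 points per watt, from the figures quoted in the comment above.
zen3_ppw = 71.30 / 11.38   # ~6.27 points/W
a12_ppw = 49.85 / 3.91     # ~12.75 points/W

ratio = a12_ppw / zen3_ppw
print(f"A12 perf/W advantage: {ratio:.2f}x")  # ≈ 2.03x
```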

    • @atul1991ful
      @atul1991ful 2 ปีที่แล้ว +40

      @@ikjadoon That would be the case if performance scaled linearly with power. AMD designed their chips to work in that power envelope, so it's expected they won't be amazing at 4W. But even comparing the M1 and its iPhone counterpart, do you see linear performance scaling relative to power?

    • @NFStopsnuf
      @NFStopsnuf 2 ปีที่แล้ว

      You then subsequently leave out that all the top physical chemists and spectroscopists go to Intel, not Apple, so chips from Apple will never have the same absolute power as Intel chips until they hire better

    • @ikjadoon
      @ikjadoon 2 ปีที่แล้ว +10

      @@atul1991ful What made you think AMD did not design its microarchitectures for 4W power?
      EPYC 7763: 280W, 64C.
      Per-core power = 4.38W
      EPYC 7713P: 225W, 64C.
      Per-core power = 3.52W
      The per-core power is even lower than this because we're including the I/O die here.
      AMD specifically designs its cores to scale to very low power draw, well under 5W. Intel does, too, but less successfully. This is the bread & butter of today's microarchitectures.
      Single-core power *does* translate to linear single-core performance in the most critical segment of the perf/W curve. The problem is the uarch is too narrow in most Intel & AMD CPUs, so they pump up the frequency and suffer non-linear scaling. This is not news to either AMD or Intel. They know that.
      There's a reason mainstream CPUs were at 3 GHz for ~10 years.
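
The per-core figures in this comment are just package TDP divided by core count; worked out directly (I/O die power ignored, as the comment notes):

```python
# Per-core power budget = package TDP / core count.
epyc_7763_per_core = 280 / 64    # W per core
epyc_7713p_per_core = 225 / 64   # W per core

print(f"EPYC 7763:  {epyc_7763_per_core:.2f} W/core")   # 4.38 W/core
print(f"EPYC 7713P: {epyc_7713p_per_core:.2f} W/core")  # 3.52 W/core
```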

    • @drsupergood8978
      @drsupergood8978 2 ปีที่แล้ว +8

      ​@@ikjadoon Core power usage is non-linear with performance. Most of the performance is obtained with very little power (4-6W or so) around the base frequency. It is the boost frequency that destroys power efficiency, since it requires a higher voltage and so is less efficient cycle for cycle. At full boost, cores can easily use 12W+. Do these numbers look familiar? For the Zen3 processor to achieve a higher score than the A12, it was almost certainly running at full boost, which effectively throws efficiency out the window for performance. If the cores were tuned to match scores, the power usage would be very similar between the two, possibly going either way between the more mature processing node of the Zen3 and the less complicated ARM cores of the A12.
      Fundamentally, performance cores all work in very similar ways regardless of their architecture. The main overhead x86-64 suffers from is the increased decode complexity from fewer explicit registers, complicated instruction support, and legacy instruction support. Even still, I suspect more of the die area is taken up by features like AVX than by instruction decode. Neither x86 ("32-bit") nor older instructions need to be particularly performant within the pipeline, so I expect not much die area is allocated to those.
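
The non-linear scaling this reply describes follows from the standard dynamic-power relation P ∝ C·V²·f: reaching boost clocks also requires raising voltage, so power grows much faster than frequency. A toy sketch (the +30% voltage for a +40% clock bump is a made-up illustrative figure, not measured data):

```python
# Simplified dynamic power model: P proportional to V^2 * f
# (switched capacitance C held constant).
def rel_power(voltage, freq, base_voltage=1.0, base_freq=1.0):
    """Power relative to the base operating point."""
    return (voltage / base_voltage) ** 2 * (freq / base_freq)

# Hypothetical boost: +40% clock requiring +30% voltage.
print(f"{rel_power(1.3, 1.4):.2f}x power for 1.4x frequency")  # ≈ 2.37x
```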

  • @TonyNse
    @TonyNse 2 ปีที่แล้ว +173

    In the future, we will all have Apple-style non-upgradable PCs with "efficient" GPUs that perform like a GTX 1050 (because you can't put a 3090-class GPU into a tiny SOC and make it perform the way it should) which will not be backward compatible with past games and software without some sort of emulation (which will of course suck) You know what? If I want a console I'd f*****g go buy one!

    • @0r_1x
      @0r_1x 2 ปีที่แล้ว +21

      Given recent market trends around GPU's I don't see that happening. Shrinking dies have always been the goal. There will be growing pains for sure, but power efficiency is not a bad thing. We can run things from decades ago on modern hardware without a sweat, so no fears of emulation becoming an issue. Really, the fears here are unfounded. New doesn't mean worse, and x86 has been long in the tooth for a while now but Enterprise and legacy business apps still require it.
      Really, the big problem here is having the consumer right to repair along with a large market of choices going by the wayside. But that's a fight that's yet to come.

    • @ag3ntorange164
      @ag3ntorange164 2 ปีที่แล้ว +22

      Totally agree. We won't get a replacement for a powerful gaming rig from Apple. EVER, PERIOD. Just an overhyped, underpowered piece of shit in a pretty aluminium shell as usual.

    • @samuellourenco1050
      @samuellourenco1050 2 ปีที่แล้ว +5

      GTX 1050? Nah, say GT 710 if you are lucky. The GTX 1050 has a huge die, and the amount of transistors it uses won't fit into a SOC.

    • @davidmalkowski7850
      @davidmalkowski7850 2 ปีที่แล้ว +5

      That's a baseless comparison to the GTX 1050. The benchmarks on the M1 Max Macbook Pro alone show way better performance than that, not to mention it can carry triple 5K displays. Listen, I'm no Apple guy, but for an integrated GPU, anything that beats the current king in the Radeon 680M is worth paying attention to.

    • @michaelbacqalen1109
      @michaelbacqalen1109 2 ปีที่แล้ว +6

      @@davidmalkowski7850 But how much will it cost, though - a Max MacBook Pro against a GTX 1050 laptop?

  • @MultiHunterOne
    @MultiHunterOne 7 หลายเดือนก่อน +3

    Hold on - just because we're reaching the limitations of x86 and will probably switch to ARM doesn't mean computers have to turn into SoC machines with the chips soldered onto the motherboard. What's stopping us from having ARM chips in LGA or PGA packages that are just processors, not SoCs?

    • @oliverdickens3219
      @oliverdickens3219 6 หลายเดือนก่อน +2

      Tbh they can make more money if they make all-in-ones, or at least attach the chip to other things

  • @mrsaul3709
    @mrsaul3709 2 ปีที่แล้ว +84

    Apple creating a good SoC doesn't mean the whole hardware market is going to change.
    We will have better GPUs and CPUs, but not SoCs.

    • @Gamer-nc8qp
      @Gamer-nc8qp 2 ปีที่แล้ว +4

      It's gonna change unfortunately

    • @ScaredDonut
      @ScaredDonut 2 ปีที่แล้ว +10

      @@Gamer-nc8qp Maybe in 10 - 20 years. Not anytime soon

    • @lasue7244
      @lasue7244 2 ปีที่แล้ว +1

      @@Gamer-nc8qp If PC components stay the same and don't evolve with time, ARM SoCs will definitely gain market share. But I believe Intel, AMD and Nvidia will figure something out.

    • @cbegefdkih
      @cbegefdkih 2 ปีที่แล้ว

      @@lasue7244 ARM devices already have a higher market share. The enthusiast market is not going anywhere.

  • @cippo1995
    @cippo1995 2 ปีที่แล้ว +83

    I have serious doubts on many points. I don't want to make a lengthy comment, but I will say this:
    1. A business model is hard to change for companies this big.
    2. Performance will always be better on the current PC model.
    3. Apple used 5nm: it's an apples-to-oranges comparison when talking about die sizes and efficiency.

    • @lucasrem
      @lucasrem 2 ปีที่แล้ว

      EUV - we can still go smaller now!
      Only TSMC and Intel can do that!

    • @novarank
      @novarank 2 ปีที่แล้ว +4

      Yeah, I was super annoyed at comparing die sizes when the Apple M1 is 5nm vs the Ryzen 5700G's 7nm.

    • @UmCaraNaNet
      @UmCaraNaNet 2 ปีที่แล้ว +1

      @@novarank And I was using a 45nm CPU until November

    • @PAcifisti
      @PAcifisti 2 ปีที่แล้ว +1

      @@GeneralKenobi69420 Probably referring to the higher power budget that separate components can have. At the expense of efficiency, you can just crank up the power when you have way more space to dissipate it (separate plates and heatsinks). SoCs don't really have this option unless you want to turn them into a furnace with 500-700 watts coming out of a tiny plate. PCIe 4.0 and especially 5.0 shouldn't really bottleneck bandwidth & latency that much.

    • @samuellourenco1050
      @samuellourenco1050 2 ปีที่แล้ว

      @@lucasrem Beyond 5nm? Perhaps 3nm, but not much smaller than that. We are talking about the scale of atoms here. And silicon usually dies because electrons can knock atoms out of position. If that is a big deal on a 180nm transistor, imagine it on a 3nm one.

  • @ALaModePi
    @ALaModePi 2 ปีที่แล้ว +297

    I agree with Anthony that the salient question is "Is it worth it?" My answer for Apple Silicon was "No" (but that had more to do with being allergic to Apple's infrastructure). If Intel or AMD went ARM (assuming a relatively equal capacity to run older applications in emulation - a big assumption, considering that Apple really did an amazing job on Rosetta 2), I'd be more sanguine about making the move.
    The remaining issue is losing incremental upgrading, a hallmark of the current PC. Albeit the upgrade path on a CPU is limited by eventually needing a new motherboard to support the newer CPU, and there are advancements in Thunderbolt and such that also push new CPUs and motherboards, the basic concept of being able to replace your graphics subsystem with an updated, more powerful model is an important consideration when graphics cards are steadily increasing in capability.

    • @x13vil
      @x13vil 2 ปีที่แล้ว +1

      I was wondering if eventually we'll be able to get PCIe cards for newer versions of Thunderbolt, like the USB 3.2 PCIe expansion cards we're getting now

    • @zonaloca
      @zonaloca 2 ปีที่แล้ว +6

      Not just upgrading. Picture a component going bad due to a power surge or something. Just diagnose it, find it, and swap it. None of this is an option on Apple or any SoC nowadays. They are so tightly integrated that you need to replace the whole board even if only your storage went bad.

    • @ALaModePi
      @ALaModePi 2 ปีที่แล้ว +3

      Good point. I've seen this happening for some time and there are a number of issues with it. The one you've raised is actually the biggest in my thinking. You basically junk a system based on the failure of some component that should be relatively easy to replace. (I'm thinking of trackpads, keyboards, webcams, and even batteries. It's not just that they're harder to replace, in some cases, the cost of replacement is high enough that you just buy a new entire unit.)

    • @CalikoTube
      @CalikoTube 2 ปีที่แล้ว +2

      I dropped Windows virusware and tinkering with off-the-shelf parts years ago and never looked back.
      For way less money I get an all-in-one package engineered by the most innovative company in human history that WORKS together and lasts me 6-12 years.
      No brainer.

    • @ALaModePi
      @ALaModePi 2 ปีที่แล้ว +1

      @@CalikoTube What did you get? It sounds great, but it can't possibly be what I think it is.

  • @Cars1Gunz1and1Weights
    @Cars1Gunz1and1Weights ปีที่แล้ว +32

    So glad this channel gets into the technicals. This guy is their best addition to the lineup

    • @themonsterunderyourbed9408
      @themonsterunderyourbed9408 ปีที่แล้ว +2

      LMAO!! careful, you'll get thrown in jail for misgendering him.

    • @Bellathor
      @Bellathor ปีที่แล้ว +2

      @@themonsterunderyourbed9408 lol, you weren't kidding.

  • @LycanWitch
    @LycanWitch 2 ปีที่แล้ว +58

    As long as gaming studios keep pushing graphical limitations, computers with separate GPUs will still be around as I doubt consumers would want to be forced to buy a completely new pc to play the latest and greatest games.

    • @Ambivadox
      @Ambivadox 2 ปีที่แล้ว

      If I wanted to play on a console I wouldn't have built a PC. Fuck apple.

    • @johndododoe1411
      @johndododoe1411 2 ปีที่แล้ว +3

      Gaming studios are dying left, right and center for ginormous moral failures and/or buyout by console makers. Look at EA, Blizzard, Minecraft etc.

    • @Taenlyn
      @Taenlyn 2 ปีที่แล้ว +4

      They already do it with consoles.

    • @hector1234
      @hector1234 2 ปีที่แล้ว +6

      @@johndododoe1411 Gaming studios have become about the dollar signs, and that's what we are to them, so they need us. But on the moral side, yes, they are dying because of their greed (cough cough) - Activision Blizzard and EA, I'm looking at you.

    • @dantesk1836
      @dantesk1836 2 ปีที่แล้ว +1

      @@Taenlyn And most mobile phones, tablets, and laptops

  • @ZaidAnsari-un5pu
    @ZaidAnsari-un5pu 2 ปีที่แล้ว +56

    You know it's serious when you see Anthony

  • @gamer-ot1xf
    @gamer-ot1xf 2 ปีที่แล้ว +229

    I'm just gonna pray that the PC industry does not head this way, I still want to feel the incredible feeling of unboxing an RTX 3090, not unboxing a cube.

    • @michaelrann4143
      @michaelrann4143 2 ปีที่แล้ว +44

      Don't worry, don't read too much into this - just add it to the list of many inaccurate PC predictions... yawn

    • @Lord_Thunderballs
      @Lord_Thunderballs 2 ปีที่แล้ว +27

      @@michaelrann4143 Yup, video was a complete joke. Dude's just saying shit people have known for years. ARM isn't anything special.

    • @Lord_Thunderballs
      @Lord_Thunderballs 2 ปีที่แล้ว +13

      @@gamer-ot1xf I would point to the Raspberry Pi and how they use ARM chips. I don't see this as a bad thing. Building your own computer will never die off - it's the enthusiasts who push Intel, AMD, and Nvidia to create better hardware and push the limits. If you're worried about who owns your computer, switch to Linux. Linux support has gotten a lot better

    • @computerbytes01
      @computerbytes01 2 ปีที่แล้ว +11

      @@michaelrann4143 yeah man 100% - Same as when people said PC gaming was dead.

    • @rwyo83
      @rwyo83 2 ปีที่แล้ว +2

      By 2024 the cubes will take over

  • @caerjones6693
    @caerjones6693 2 ปีที่แล้ว +32

    THANK YOU for explaining this in a way someone barely technical could follow without making me feel dumb. I appreciate friendly faces and accessible info in enthusiast spaces!

  • @harshbarj
    @harshbarj 2 ปีที่แล้ว +131

    People have been saying for DECADES that the PC is doomed. I remember people saying the same when I was a kid in the '90s. ARM systems are great, but they still lack real power - even the new Apple ARM chip. It's a massive step up, but it's still well behind higher-end x64 chips. Plus, just try to upgrade an ARM system in any way. My Pi 4 is great and I love my Samsung Tab S8 Ultra, but at the end of the day, when I need some real power, I use my Ryzen system. ARM will no doubt eat away at the low end and may even eat a little into the midrange. But for the foreseeable future, ARM is no threat to the high end.

    • @Sovjetski-
      @Sovjetski- 2 ปีที่แล้ว +4

      True man

    • @CalikoTube
      @CalikoTube 2 ปีที่แล้ว +11

      No. Apple made a huge leap over Intel. If they hadn't, they would have stayed with Intel. Intel is sweating right now and mocking Apple in ads.
      Apple Silicon saves a ton of energy compared to high-end x86 chips. This is why your knockoff iPad from Samsung doesn't use x86. I wouldn't be surprised if within a decade Apple has a chip as fast as or faster than the highest-end x86 chip while using half the power.
      Apple knew the future and is making it real as we speak. This comment is gonna age as well as:
      “Lol Apple removed the floppy disk!”
      “Lol Apple removed the SCSI drive!”
      “Lol Apple removed the mobile keyboard!“
      “Lol Apple removed the CD-ROM!”
      “Lol Apple removed the headphone jack”
      I know you're not mocking Apple, but defending old tech never ends well.

    • @dhruvpandya4136
      @dhruvpandya4136 2 ปีที่แล้ว +27

      @@CalikoTube The headphone jack is still a feature people look for. Second, they were removing unnecessary tech, not making new tech. ARM is not scalable and versatile. The versatility of traditional hardware is an advantage in itself. Number two, Apple has never been massively successful with non-mobile computers, i.e. desktops. Apart from creators, any other office looks for repairability, and forgoing some power is not a difficult decision. I am not saying ARM will never overtake x86, but it seems unlikely. If there is no ecosystem for repairing a machine, it is not reliable. I have seen companies stick with gasoline cars because they are repairable. They also tend to pick cars that are easily and cheaply repairable by local mechanics.

    • @xbendan
      @xbendan 2 ปีที่แล้ว +6

      ​@@CalikoTube If you are talking about the Intel chips in MacBooks, yes. But not other x86 computers. If you only compare at 5W, Apple Silicon can look 10000% faster than x86, because x86 can barely run at that wattage. It's misleading to say that Apple's chips are 2 times faster than Intel's. The fact is, if you try to run chips at 500W, x86 can provide boosted performance, while Apple Silicon would reboot due to the high temperature.
      Apple Silicon still cannot run much software. Whenever somebody shows the super-strong performance of x86, "Apple" always says "It's unfair, this is not natively supported by Apple Silicon". I don't know whether Apple knows the future, but it's clear Apple understands how to make money by selling an extra 8GB for a hundred dollars.

    • @CalikoTube
      @CalikoTube 2 ปีที่แล้ว +3

      @@xbendan
      Ummm, NO. Apple never said “that's not fair, it doesn't run on Apple Silicon.”
      To think that Intel is panicking for no reason is ignorant. ARM runs a lot cooler than x86, so the temperature argument is out.

  • @drisbain
    @drisbain 2 ปีที่แล้ว +158

    A few things: the actual x86 processor core has been a simple (RISC-type) CPU internally since the days of the Pentium Pro. They added a reorder buffer (which about every performance CPU has) to increase IPC. Making a poor design run faster increases power demand - remember the days of the Pentium 4 being "the end of x86" because it was too hot and inefficient... then they took the laptop chip (Pentium III-M) and made it the Core 2.
    Also, since you are talking performance per watt, look at the undervolted Ryzens rather than full-powered x86. Full-powered x86 is not optimized for per-watt performance, just straight performance.

    • @steinwaldmadchen
      @steinwaldmadchen 2 years ago +13

      You're basically correct. IIRC AMD said they could literally build an ARM chip overnight just by replacing the decoder of Zen.
      But even Zen is behind the M1 in terms of performance per watt. There are many reasons, and I don't think "Apple's engineers are smarter than AMD's (or Intel's)" is one of them. Rather, the costs of all that x86 compatibility support could be a reason, or maybe the tight integration in Apple Silicon that AMD has also been pursuing for years, albeit slower. Their APU concept and the HSA technology that supports it are also about lowering latency between components.
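
One concrete reason a decoder swap is plausible while x86 decode stays expensive: ARM's A64 instructions are a fixed 4 bytes, so every instruction boundary is known up front, whereas x86 instructions are 1-15 bytes and each boundary depends on decoding the previous one. A toy sketch of that difference (not real ISA decoding):

```python
# Toy illustration of why fixed-width ISAs are easier to decode in
# parallel than variable-width ones. Not real ARM/x86 decoding.

def fixed_width_boundaries(code: bytes, width: int = 4) -> list[int]:
    # Every boundary is known up front: decode slots are independent,
    # so a wide decoder can work on all of them at once.
    return list(range(0, len(code), width))

def variable_width_boundaries(lengths: list[int]) -> list[int]:
    # Each boundary depends on the length of the previous instruction,
    # so boundary-finding is inherently serial.
    offsets, pos = [], 0
    for n in lengths:
        offsets.append(pos)
        pos += n
    return offsets

print(fixed_width_boundaries(bytes(16)))        # [0, 4, 8, 12]
print(variable_width_boundaries([1, 3, 7, 2]))  # [0, 1, 4, 11]
```

Real x86 front-ends spend extra hardware on length pre-decode and uop caches to hide this serial dependency, which is part of that compatibility cost.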

    • @iskierka8399
      @iskierka8399 2 years ago +29

      @@steinwaldmadchen M1 has a significant efficiency advantage in the form of its 5 nm exclusivity contract, which is something being overlooked in these claims of x86 being replaced. Yes, in terms of consumer performance, this doesn't actually matter, people should buy what's best for them right now, but when *all* other silicon is limited to 7 nm or larger still, trying to claim that M1 is a demonstration that efficiency (or outright performance) can't be matched is just outright false. Yes, a 3090's chip might be half the size of an M1 while the M1 GPU is a third of the size, but a 5 nm 3090 would likely be less than a third of the size - none of these comparisons are actually fair on the technological side.

    • @TeranToola
      @TeranToola 2 years ago +16

      @@steinwaldmadchen The M1 is not better due to architecture; it is better due to a smaller node.
      The M1 is 5nm; its "apples-to-apples" (haha) competitor will probably be Ryzen 7000 mobile.
      Apple is TSMC's pipe cleaner. They get first dibs because they give TSMC the most money.

    • @steinwaldmadchen
      @steinwaldmadchen 2 years ago +4

      @@iskierka8399 I've already said there are many reasons behind that. Process node is of course one of them.
      But even a 2019 A13 Bionic at 7nm still has a Geekbench single-core score comparable to a 7nm Ryzen 3000 from the same year. Granted, Geekbench is not the best representative of performance, and the A13 is a mobile chip that is inherently efficiency-oriented, but it shows Apple at least has some edge in some areas when using the same node technology.
      On the other hand, had AMD been given the chance to design a chip from the ground up, without worrying about software compatibility, would they still choose x86-64, or would they pick newer architectures like ARM and RISC-V? Would that theoretical RISC Ryzen perform better in the majority of scenarios than the x86 version we have? Possibly, given that some engineers have complained x86 is difficult to work with. If that's the case then clearly x86 is one of the reasons holding us back.
      Of course, for the sake of software compatibility it's a compromise many have to make; otherwise the computer they made would be useless, unfortunately. And engineering is all about compromises.

    • @dirg3music
      @dirg3music 2 years ago +3

      Yeah, there are some serious gross oversimplifications going on in this video

  • @Technicallyaddicted
    @Technicallyaddicted a year ago +3

    The fact that an i9 13700k / RTX 4090 prebuilt is only $700 more expensive than a 7950X / 7900 XTX build is criminal. This is entirely the fault of scalpers. There has to be something we can do about it besides boycotting them. We boycott them, they hold the new GPUs until the 5000 series comes out, and the process repeats for the 4th time. It's not possible to buy a GPU at retail, even for lower-end cards. Man, I'm not sure this hobby is worth this much headache.

  • @conedang
    @conedang 2 years ago +6

    Why is it that everything I’m interested in and every single hobby I have seems to continually be destroyed by society as a whole

  • @lordeisschrank
    @lordeisschrank 2 years ago +147

    2 things:
    a) I would rather go with RISC-V for the openness over ARM
    b) why does SoC automatically mean soldered to the board? That also seems kinda wasteful... why not have a socket where you can fit in various SoCs? That wouldn't change much about building PCs. You would still need everything else: board, RAM, power supply, case... RGB bullshit. Getting rid of a GPU doesn't mean getting rid of building...

    • @MoireFly
      @MoireFly 2 years ago +36

      If you really want to get the maximal benefits of low signalling costs - including between GPU and CPU - then that probably means integrating memory more closely than is possible in a PC. You don't go manually plugging RAM into a graphics card even today, and you probably want to do a lot better than what's possible today to compete tomorrow.
      That means GPU, CPU, chipset, and DRAM would all need to be part of the SoC, or at least work within physically much closer distances and tolerances than today. What's left is bulk storage (SSDs, say), exotic hardware, power supply, case & cooling? But if you have an SoC like that, then you pretty much _know_ the ideal power draw; so the step towards integrating PSU + cooling is very natural, and then you've basically gone as far as the Mac mini - minus the gratuitous Apple-ness of preventing the use of replaceable SSDs for no reason.
      The real question is whether this really matters. Because I think Apple's wins are more due to their usage of exorbitant amounts of TSMC's fanciest silicon rather than all of this. And that is a question of real cost, not merely picking a "better" design.

    • @arnonabuurs7297
      @arnonabuurs7297 2 years ago +6

      why: shareholder value

    • @TasX
      @TasX 2 years ago +5

      @@MoireFly We'll have to see. The new Nvidia RTX 40 series is using the same TSMC process node as Apple.

    • @domsch1302
      @domsch1302 2 years ago +2

      @@MoireFly I'm not even sure if that is still a downside. I overbuilt my PC with 32 gigs of RAM 5 years ago. Back then, 16 was considered standard for the midrange. 5 years later, most people still don't have 32 gigs of RAM, and even fewer actually need it. We have to realize that hardware requirements just don't increase at the rate they used to. That's also seen in phones. A fully integrated SoC with 16 gigs of RAM now could last you years and will probably fail before you need more performance.
      It comes down to how manufacturers are willing to play this game. They can still make this modular and upgradable. Fairphone and Framework both show that a tightly integrated SoC doesn't have to come with Apple's pricing or Apple's lack of repairability.

    • @MoireFly
      @MoireFly 2 years ago +1

      @@TasX I mean, the m1 ultra isn't competitive with the rtx 30-era cards; neither in gaming nor in compute is it anywhere close (disregarding a few very specific unusual benchmarks). So it's not a hard prediction that nvidia's process-node-parity RTX 40 gen will obliterate the m1 ultra, and likely any successor too. But we could compare perf-per-watt; and I wouldn't be surprised that when tuned for perf-per-watt (i.e. undervolted) a RTX 40 gen will outperform an m1 ultra even in that metric.

  • @TheTyphoon365
    @TheTyphoon365 2 years ago +121

    I confidently disagree, for the same reason that with engines "there's no replacement for displacement". With larger components, you get better heat dissipation, and cheaper manufacturing. The future won't be close shell micro builds, at least not completely. There will be more room for performance in large builds

    • @slimdunkin117
      @slimdunkin117 2 years ago +2

      The smaller builds will outperform the larger builds... the "larger" builds will just be obsolete

    • @bifta4323
      @bifta4323 2 years ago +8

      His point is that you could theoretically fit like 4 small PCs in the size of a bigger one, so the big PCs of the future will be really powerful

    • @TheTyphoon365
      @TheTyphoon365 2 years ago +1

      @@bifta4323 right, true

    • @13orrax
      @13orrax 2 years ago +1

      Turbos exist

    • @SpecialistBR
      @SpecialistBR 2 years ago +3

      Electric cars have zero displacement

  • @mdcorbett
    @mdcorbett a year ago +102

    With all of the existing x86 software available, and the fact that consumers like upgrading hardware (including PC gamers), I don't see x86 dying anytime soon. Just because something is more power-efficient doesn't mean it's more practical in all cases. Edit to clarify: existing software will need to be recompiled for the new chip or it will stay on x86 (unless it's emulated, but performance may suffer even then). You can't just move millions of applications and games to a new architecture.

    • @utopian666
      @utopian666 a year ago +15

      Yes it does. It means exactly that.

    • @lunchbox1553
      @lunchbox1553 a year ago +2

      @@utopian666 But it doesn't mean it will be financially successful.

    • @utopian666
      @utopian666 a year ago +14

      @@lunchbox1553 Yes, because Apple are not financially successful with their all Arm strategy. Glad we sorted that out.

    • @chico_zombye
      @chico_zombye a year ago +10

      Europe is already banning TVs that draw ridiculous amounts of energy. I always think about it lately when I see how much energy PCs draw these days, and it's not getting better.
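
For a rough sense of scale, a sketch of annual electricity cost at an assumed 4 hours a day of use and an assumed EU-ish rate of €0.40/kWh (both numbers are placeholders; your draw and tariff will differ):

```python
# Rough annual electricity cost of a gaming desktop vs. an efficient
# SoC box. All figures are assumptions for illustration only.
def annual_cost_eur(watts: float, hours_per_day: float = 4,
                    eur_per_kwh: float = 0.40) -> float:
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * eur_per_kwh

print(f"500 W desktop: €{annual_cost_eur(500):.0f}/year")  # €292/year
print(f" 40 W SoC box: €{annual_cost_eur(40):.0f}/year")   # €23/year
```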

    • @lunchbox1553
      @lunchbox1553 a year ago +26

      @@utopian666 They are financially successful because of their brand, not their technology. Don't fool yourself.

  • @RimaNari
    @RimaNari 2 years ago +80

    ARM is one thing. But having everything tightly integrated with zero upgradeability is another.
    I agree that x86 really is a dead platform, and things need to change there. But that doesn't necessarily mean that I can't upgrade my RAM anymore, does it? Or that my storage controller must reside on the SoC. There is a good middle ground I think that the established brands need to finally move towards.

    • @roji556
      @roji556 2 years ago +1

      Which wouldn't happen. You'd just have more Apples in the market. If SoCs really do become a thing, then I'd rather just keep my current PC for old games and use an Xbox, Switch, and PlayStation for other gaming.

    • @fernandong9576
      @fernandong9576 2 years ago +2

      I think that's a problem with Apple though, not ARM

    • @-indeed8285
      @-indeed8285 2 years ago +5

      ARM is great, but Apple is abusing its power by jamming external components (i.e. storage controller, RAM, etc.) inside the SoC.

    • @kazioo2
      @kazioo2 2 years ago +6

      The biggest mistake of this video is focusing too much on CPU architecture, which matters much less to the power hunger. The real MAGIC SAUCE is not having a real motherboard and using a big SoC instead. This is where the laws of physics do the magic. You can already see how x86 laptops require less power for a given performance, and they still have quite a lot of modularity. Tl;dr - modularity and motherboards are the enemies of power efficiency, and that's an old basic physical fact that is very sad for a tinkerer.

    • @killkiss_son
      @killkiss_son 2 years ago +3

      ARM does not mean soldered, x86 doesn't mean liberty.
      Apple means soldered.

  • @Kochiha
    @Kochiha 2 years ago +28

    This video can be summarized by saying "You will own nothing, and you will be happy." I am not about to go quietly into that winter night, not for as long as I live.

    • @KookoCraft
      @KookoCraft 2 years ago

      You think this sounds so smart lol

  • @staceyfunk9689
    @staceyfunk9689 2 years ago +23

    We will achieve the 'big box PC' death as soon as we achieve the 'paperless office'. I've been waiting 30 years for that to appear.

    • @MrMilkyCoco
      @MrMilkyCoco 2 years ago +1

      To be fair, paper is kept just in case the electronics break.

  • @RandomPerson-vf3ld
    @RandomPerson-vf3ld 2 years ago +24

    An ARM-based buildable system with a motherboard that has a CPU and GPU socket with badass Zen-7 chiplet goodness wouldn't be all that bad. Imagine the bandwidth between GPU and CPU that could be designed into such a board. So we stop shoving the GPU into a PCIe slot? Then PCIe is for ultra-fast storage, VR displays and other peripherals. Builders will have to figure out how to suck all that heat from the CPU and GPU chips then. Sounds like it could be the future. I hope it happens.

  • @CARTUNE.
    @CARTUNE. 2 years ago +19

    As someone who works in the field of R&D and software development for *redacted*, I can tell you they don't have any intention of going the route of Apple. Rest assured there is going to be some big news in the near future for the hobby.

    • @pocketanime
      @pocketanime 2 years ago

      At least give us a hint 🤣

    • @EpicWolverine
      @EpicWolverine 2 years ago +3

      I work for [redacted] too and while I think this is a good analysis and my company is keeping their options open, I don’t think they are going the Apple route either. I also think there’s good stuff in the pipeline.

    • @pricelesssword4559
      @pricelesssword4559 2 years ago +3

      Thank you, both of you, for this relieving news

    • @pricelesssword4559
      @pricelesssword4559 2 years ago +4

      I myself work for *redacted* as an animator, so I'm glad this isn't the route your two *redacted* companies are going

    • @xdjrunner
      @xdjrunner 2 years ago

      I as well am here from **redacted** , ... No not _that_ one; _Freddy Ching?_ Oh, they're brothers. :)

  • @handlealreadytaken
    @handlealreadytaken 2 years ago +47

    I'm excited for ARM, but want nothing to do with Apple when it comes to my desktop and OS. The biggest downside of Apple Silicon is having to deal with their ecosystem, which is closed and not consumer-friendly.

    • @forfox
      @forfox 2 years ago +2

      Well, there is a Linux distro for M1 chips, but it's not perfect.

    • @MathewPanicker1010
      @MathewPanicker1010 2 years ago +4

      @@forfox why would you waste your money on a Mac and then run a Linux distro that has issues?

    • @utubekullanicisi
      @utubekullanicisi 2 years ago

      That's not entirely the truth, as there are real, tangible advantages to being in Apple's ecosystem, and depending on your needs & wants / who you are, it could be to that particular consumer's advantage instead of disadvantage. A closed ecosystem doesn't universally mean a bad thing; it just means a bad thing to you (which is also completely fine).

    • @forfox
      @forfox 2 years ago

      I won't say that there's no alternative to macOS. And in the future, I think there will be a lot more options for M1 chips

    • @clx001
      @clx001 2 years ago

      Profile picture checks out

  • @cpuuk
    @cpuuk 2 years ago +178

    Luckily Apple level the playing field by making Macs so expensive that you might as well build your own ;-)

    • @moxleyproductions
      @moxleyproductions 2 years ago +10

      Mostly, although the M1 Mac mini is a bang-for-your-buck powerhouse for creative professionals.

    • @myrealusername2193
      @myrealusername2193 2 years ago +6

      @@moxleyproductions yeah it’s basically the best desktop Mac so far. Cheap enough to be accessible and still quite powerful

    • @karehaqt
      @karehaqt 2 years ago +8

      @@moxleyproductions Unfortunately it's terrible for games due to Apple only supporting their terrible Metal API.

    • @karehaqt
      @karehaqt 2 years ago +6

      @Bear Spicer Is it any wonder there is a heavy PC gamer slant in the LTT audience, the majority of their PC coverage is centred on gaming.

    • @Trillykins
      @Trillykins 2 years ago +6

      @@myrealusername2193 unfortunately "best desktop Mac" is damning with faint praise. And not sure I would even call it cheap. 16 GB and 1 TB will run you $1300. You could get a significantly better PC for that amount of dough.

  • @austinmason6755
    @austinmason6755 2 years ago +101

    I love watching Anthony on the channel, I feel like he delivers great information in a way your every day fella can understand!

    • @miso1995srb
      @miso1995srb a year ago

      @@halkopop he probably is, but how do you know

    • @KingBobXVI
      @KingBobXVI a year ago

      @@miso1995srb - LTT has been very upfront about the qualifications/skills of the people on their staff in previous videos about the crew.

  • @DeckDogs4Life
    @DeckDogs4Life 2 years ago +13

    I feel like the problem is going to end up being lack of serviceability. Companies like Apple that are currently going towards ARM processors have next to no serviceability. They basically want you to throw it out and buy a new one.
    That's the biggest concern I think most people have is that all Desktop PCs will go that way.

    • @ilenastarbreeze4978
      @ilenastarbreeze4978 2 years ago +3

      That is exactly my biggest concern. I like being able to service my own machine

    • @BlommaBaumbart
      @BlommaBaumbart 2 years ago

      People already buy a new phone every 2 years, people don't give a fuck about these issues.

    • @Dave102693
      @Dave102693 2 years ago

      Same

    • @DeckDogs4Life
      @DeckDogs4Life 2 years ago +2

      @@BlommaBaumbart in the gaming desktop community, yes they do.

    • @BorderlinePathetic
      @BorderlinePathetic 2 years ago

      @@BlommaBaumbart Lol I'm rocking a 3rd-hand 980, it does fine. I rocked an S6 till a couple of weeks ago; sadly that charge port died a long time before and the battery just became shit. It's bullshit to have planned obsolescence in a PC; don't get baited into the conditioning these anti-consumer corpos try to pull on you.

  • @3polygons
    @3polygons 2 years ago +26

    I agree that Apple's power consumption is heading where the whole planet should be aiming. The problem is that the company doing this innovation is notorious for selling its devices at much higher prices than competing companies, while achieving very similar results (and often less performance) in real-life tasks (leaving out the gaming problems). Apple keeps building non-flexible solutions, non-repairable - other than through them, and with outrageous repair costs - and blocks third-party repair as much as it can, maintaining legal fights; _ask Louis Rossmann about that and his fight for the right to repair_ .... BTW, this conflicts hugely with the lower-consumption advantage, as repairing components to avoid e-waste as much as we can is like the first rule of environmental protection, and all this goes in the exact opposite direction. And all of it is in a software-hardware ecosystem only compatible with their own products in almost every sense possible (less compatibility in the end increases e-waste, btw), with actual hardware changes always made complicated or impossible (soldered chips, etc). Upgrades (memory, disk) are incredibly expensive compared to competitors' prices, and certainly compared to the production cost.

  • @therealcrisis8439
    @therealcrisis8439 2 years ago +150

    My thoughts on this: I'll believe it when I see it actually happening! A new thing becoming standard still solely depends on people being willing to buy it. Apple has a very unique position on the market rn, and what "works" for them very much does not have to work for other companies. Furthermore, I don't see the enthusiast space dying out even if this does really take off, since there are always people who will want to build themselves and not be tied to the service of one entity.

    • @0xAAA
      @0xAAA 2 years ago +2

      Arm chips arent new?

    • @jimbernard3289
      @jimbernard3289 2 years ago +2

      I've never wanted an Apple computer more than I do now, and that's coming from someone who's always used Windows and loves to tinker with PC parts. I like the direction Apple is taking with its tech (something I would never have said in the past). My biggest wish is for there to be more competition in the ARM space.

    • @OZEEtube
      @OZEEtube 2 years ago +8

      The thing is, if we start using SoCs in our PCs, the GPU, CPU and RAM are fused onto one chip. You won't be able to service anything on it other than your M.2 storage, power supply and your cooling. It would also kill upgradability. Even if we're lucky and the SoC is modular, upgrading would mean a whole new chip.

    • @jimbernard3289
      @jimbernard3289 2 years ago +2

      @@OZEEtube It reminds me of an old electronics shop where I was an intern. We used to be able to fix most old TVs, VCRs, stereos etc.. Capacitors were the size of a finger lol. Then in came the plasma and LCD displays and the shop went out of business. Repairability of most modern flat panels is close to ZERO. Every component has shrunk down to a size so small that it makes them very expensive to repair. The same is happening to cars etc... We just need more competition in the PC sector asap so Apple isn't the only game in town.

    • @OZEEtube
      @OZEEtube 2 years ago +2

      @@jimbernard3289 I hope the competition does step up. As an adult who pays his own bills, a chip that delivers that much performance while drawing one third of what the competitor's chip does is huge for me. I live in Germany and we have the highest electricity rates in the world. I will always take double or triple the efficiency over 10% more power.
      I would just prefer it if I didn't lose upgradability and repairability in the process. But if SoCs are the future and it's inevitable, I won't oppose it if it's superior.

  • @ramahadranhennessy9300
    @ramahadranhennessy9300 2 years ago +3

    Over my dead body. The idea that a PC is a modular unit has meant we, as consumers, have retained a small amount of power and control to purchase when and what is needed, without the Ram or Hard drive SOLDERED TO THE MOTHERBOARD. I’m an Apple fan. Not a fan of CPU dystopian totalitarianism, and this goes the same with having your GPU powered by a company over the net, or allowing a company like INTEL pay-wall the speed or power of your cpu. And don’t get me started on our governments…innovation has to be balanced with consumer power, privacy, and freedom.

  • @andrewcopple7075
    @andrewcopple7075 2 years ago +57

    "It's wasteful right?" Not compared to Apple's repair solution. "Oh, you have to bring the case in for any issue. Oh there's dust in there? It would cost $2500 to get the dust out and repair the shorts, might as well just buy a new one."
    It's infuriating to hear LTT describe the PC community as wasteful when compared with the disgrace that is Apple's "concern" for the environment.

    • @aaronchung9838
      @aaronchung9838 2 ปีที่แล้ว +3

      I think that they were talking about the making of a PC and just focusing on the topic of the video ignoring right to repair which they have made a video on.

    • @KenMathis1
      @KenMathis1 2 ปีที่แล้ว +1

      Those issues have nothing to do with the design, and everything to do with Apple having complete control over their ecosystem.

    • @andrewcopple7075
      @andrewcopple7075 2 ปีที่แล้ว

      @@KenMathis1 True, but if they lead the way on the innovation of technology, everyone else is just going to copy pasta their pathological ecosystem setup as well. It's not like HP and Dell don't do this already with pre-built machines to a certain degree. :(

    • @KenMathis1
      @KenMathis1 2 ปีที่แล้ว +1

      @@andrewcopple7075 This isn't about entirely prebuilt PCs. It's about simplifying the instruction set and combining the CPU and GPU into an APU. Cases, motherboards, memory, power supplies, and so on would still be made by others. Sure there will be pre-builts, but there are pre-builts now. Nothing changes there.

    • @andrewcopple7075
      @andrewcopple7075 2 ปีที่แล้ว

      @@KenMathis1 Fair. If it's just about processor architecture, I can see that Apple is in the lead and others will have to play catchup.

  • @jacobsecor5015
    @jacobsecor5015 2 ปีที่แล้ว +168

    We actually talked in my Computer Architecture class in college about how the x86 architecture held back progress (partly due to legacy support). The thing is, because of that legacy issue, data centers will likely stay on x86 (unless emulation of that architecture gets really good, which would be possible in that scenario, I suppose)

    • @wayland7150
      @wayland7150 2 ปีที่แล้ว +1

      Swings and roundabouts. The backwards and forwards compatibility is the key to its success. There have been plenty of failed attempts to break free of x86. The PowerPC, for instance, was amazing - found in Apple desktop computers and then in the PlayStation and RAID cards and all sorts of other things. However, ARM is more successful and gets everywhere. The fact that Linux compiles for any CPU provides a great deal more freedom. I think the PC format will continue, but we might be using ARM CPUs in our Mini ATX motherboards.

    • @MichaelButlerC
      @MichaelButlerC 2 ปีที่แล้ว +3

      But I don't understand why Linux-based servers aren't switching to ARM if it's so fast and low-power. Nearly everything that runs the major data centers for Google, Amazon, and YouTube runs Linux-based systems that can be optimized and recompiled any which way. Is it just because of those licensing restrictions (Qualcomm)?

    • @Braindead154
      @Braindead154 2 ปีที่แล้ว +6

      Actually, enterprises are switching to ARM, they just aren't doing it in their own data centers. They are doing it in AWS and other cloud providers' data centers. Processors like Graviton2 are forcing CIOs' hands; the cost-to-performance ratio is too compelling.

    • @raka9220
      @raka9220 2 ปีที่แล้ว +3

      @@MichaelButlerC they're mostly already using arm chips tho.

    • @donaldendsley6199
      @donaldendsley6199 2 ปีที่แล้ว +2

      @@MichaelButlerC Google and Amazon are going with Arm for at least some of their data center processors.

  • @AlexisLK
    @AlexisLK a year ago +2

    Let me tell you some simple facts:
    - I will continue, all my life, to build my own computer like I want to, no matter how hard the parts become to find on the market
    - I am ready to pay 3 times the price, if needed, to have my own computer, made myself
    - No matter how hard they push their shitty products, I will never buy them and they will never have my money
    - The number one priority on my list, for a computer, is the possibility to change EACH component myself. That's not second on my list, that's not third. That's FIRST. So even if I have to choose between an "all in one" premade piece of shit that is super powerful and a "made myself" computer with each separate component, even if the second option is less powerful, I will still buy the second one
    That's how ready I am to stick to a passion for true, beautiful computers. Good luck to them changing that

  • @MrYodaBomb
    @MrYodaBomb 2 years ago +117

    Man, this is something...
    Yeah, we have a problem right now with die space and power consumption. And yeah, doing what Apple did can and will save PCs in the short run. But it will also kill PCs in the long run.
    Hear me out. If the best-case scenario happens and we get exactly what this video says, then we also get ZERO choice as consumers.
    Realistically, no company will offer "custom" dies where you can choose what CPU/GPU you want with any amount of actual choice other than a few pre-configured options. So that means we will no longer have PCs. We will have consoles that also do PC stuff. And THAT will 100% kill the community. And you know that actual consoles are already PCs as far as hardware and performance. So those will just add an option for a desktop experience for 1/2 the cost of a PC and kill the industry.
    All of this and I didn't even get into what a freaking nightmare repairs and e-waste would be. Throw an entire PC away because it is broken or you want an upgrade, instead of just replacing the part you need (GPU, CPU, RAM). You already know, if it goes this way, RAM could easily be packed into the chip as well for more profit and less cost.
    Have we learned nothing from premium smartphones? Even with the "repairable" ones you have to throw away entire parts that work because they are integrated with the part that is broken. Not to mention all-in-one means it becomes e-waste ALL IN ONE!
    I don't like it. Not at all.

    • @SIPEROTH
      @SIPEROTH 2 years ago +13

      I don't get why abandoning x86 has to change our ability to build PCs.
      In the same way you put an x86 CPU on your motherboard, you can do it with an ARM-based CPU etc.
      How does that change anything?
      Apple was a closed thing even when using Intel CPUs.
      If the excuse is "But you don't need a big box", then OK, we will build smaller PCs. Besides, those CPUs look fine now, but when the competition for power begins they will also rise in power and heat and need towers etc.
      Look at Qualcomm. They are already finding it hard to keep the power increases going inside small devices like smartphones, where there is no active cooling and they have to be careful of battery consumption.
      If moving to a new design process for CPUs means they will be soldered to the board and we can't build PCs anymore, then it's not the new process that is the issue, but the CPU corporations finding an excuse to take away your freedom of building.
      It's like electric cars. Fundamentally there is no reason why they should be any less repairable than combustion engines, and yet the carmakers are trying to make them as such so they can hold consumers hostage.

    • @MrYodaBomb
      @MrYodaBomb 2 ปีที่แล้ว +10

      @@SIPEROTH It's not so much taking away having a PC. It's that companies care about profit first, so the way they would do this would be terrible for consumers: they would have limited options, and all-in-one would make an already semi-monopolized industry even more so. And on top of that, if you have an all-in-one PC with no choice, then people will go console, bc it will be cheaper. No reason for the average Joe to pay $1.5k for an all-in-one ARM box when they can pay $600 for a console that is an all-in-one PC. And u can bet Xbox/PlayStation will make a desktop experience if it gets people out of PC.

    • @MattyIceTrades
      @MattyIceTrades 2 ปีที่แล้ว +3

      But like, what's the point of paying more for inferior products? I know it's a hobby, but so is building old cars; while they are fun to use, they are expensive and smell bad.

    • @MrYodaBomb
      @MrYodaBomb 2 ปีที่แล้ว

      @@MattyIceTrades This exactly. You will always have customers for whom what you get for the money doesn't matter. But those are the extreme minority, and will be even more so if PC goes this way. I'm one of those people. I will spend lots of money for very little % real-world difference. But not if it goes like this. Why would I, if a $1k "pro" console with a desktop experience can satisfy my enthusiast self just as much as a $4k PC?

    • @randybrinkman495
      @randybrinkman495 2 ปีที่แล้ว +8

      Well, not zero choice.... it's Apple after all. You will be able to choose Pine Green, Rose Gold, Midnight Blue, or Space Grey.

  • @jamesrobertson504
    @jamesrobertson504 2 years ago +22

    As I sit here with my PC heating up my small home office in summer time, I see your point Anthony. I didn't use to care about power draw, but with all the recent price increases (got a notice the electricity bill is going up), I guess ARM SOCs are the future. Will miss being able to upgrade components regularly.

    • @ofirdavid3728
      @ofirdavid3728 2 years ago +3

      it would cost a lot more to fix your ARM PC when it breaks down though as ARM systems are integrated single-use throwaways

    • @ph4se2
      @ph4se2 2 years ago

      I know this pain, my PC heats up the room pretty quick with a game going.

  • @whyshouldibeats
    @whyshouldibeats 2 years ago +37

    I just finished my first pc build yesterday, thanks to your build guide. Thanks a million!

  • @OpenSorce
    @OpenSorce a year ago +3

    Where have I heard this before? Oh yeah, that's right... the entire history of PCs! I could list how many times I've heard it since the first time I built a PC in the early 90s, but I won't. Seriously, it's a recurring theme. Home built PCs will never go away.

  • @festusyuma1901
    @festusyuma1901 2 years ago +17

    Yes, it's worth it. I need to be able to upgrade my desktop or fix it myself. If they can make the transition with those two things intact, then no problem. Compatibility matters too.