They are gonna have to change that eventually. They’ve had the same upgrade pricing tiers since 2015 which is insane. Their base models are such a great deal but the min spec is so lousy almost everyone should upgrade and by the time you’ve doubled the ram and storage you’re at basically double the price. Sucks lol
@@snazzy I'm honestly surprised that the EU hasn't run riot on that yet. The change is inevitable, as they're going in hard on right to repair to the point that it's a matter of time that the right people notice the problems with the lack of upgradeability.
As someone who has built their own desktop PCs since before a lot of TH-cam viewers were born, my biggest anxiety about this transition is whether or not I'll be able to keep building and customizing my own hardware. I don't particularly mind the idea of SoCs, but I really value the modularity of the existing x86 platform. It has saved me boatload of money over the years and lets me get exactly what I care about without paying for what I don't. The thought of being stuck with only a laptop, with all the horrible tradeoffs and e-waste that comes with that is just appalling.
hoping for the same socketed systems that we are used to from intel/amd to come to arm, and just motherboards with a new chipset, maybe no chipset in the case of an SoC they would have great iGPUs and maybe free included ram, while keeping pcie and ddr5 the same or a new standardized ddr-dimm for arm systems apple themselves have proved that pcie at all is possble although no current gpus work with it making a new socket-its nothing new storage is too easy to implement, and its another opportunity for cpu makers to offer some small on board boot drive leaving modular GPU and RAM as the big main questions
I agree with you but I have my PC at home. For games, 3D-printing, hobby programming but also professional heavy programs. For professional stuff on the go my MacBook m1 has been awsome. It just works, syncs easily with my phone etc. It is way more reliable than any pc laptop (and I have had many). Both systems have their strengths and weaknesses both in software and hardware. PC definitely needs to modernize itself on many fronts. But I hope it stays open and adaptable.
GPU is the hero for PC Desktops. The future is more and more GPU focused for everything creator and AI oriented and having a SoC and a GPU will be super powerful. Old Amiga guy here. Amigas were great because they had multiple of chips. If we can figure out the internal i/o of that in a motherboard layout we will all have custom super computers in our future.
@@MarcTelesha The future will be interposers with logic built into them. They can be designed with topologies that enables far better communication than currently between chips. How the architecture overall will be is hard to forecast. Will it be similar to the newest Nvidia AI gpus with CPU and GPU on the same interposer with shared memory? If the performance advantage is high enough that will make upgradeablility of the individial components sacrificable.
@@lukasmauer230 no it can't. lmfao. the swapping thing is a meme. its virtually unheard of for modern SSDs to fail due to swap. it almost never happens, only under experimental stress testing. its not a thing anybody should ever worry about with a laptop lol.
3:10 there is a fourth point thats also important: Apple has continuously operated on a node/technology advantage. Their lower volume, high average selling price and good relations to TSMC made it possible to always be ahead of AMD or Intel in that regard. This is basically only possible because they sell the complete laptop and can use the higher profit margins to actually buy significant volume from TSMC, while still buying less silicon than AMD or Intel move.
They use their higher profit margins to fill in their bank account. If Apple made slightly less money they could sell a lot more computers, but just like Harley Davidson Apple would rather sell fewer products at much higher profit margins than put an Apple computer into everyone’s house.
Also, I think it was a previous video of his (or some other creator) which looked into the actual transistor count (mostly because of Apple's node advantage as you pointed out) vs performance comparison with AMD/Intel chips, and ofc its not all rosy after all!
@@sihamhamda47That’s the thing about being a key founder, you provide all the funds for TSMC to set up and go forward with a new node. It’s because of bug customers like Apple that the industry is moving forward to these new nodes. Also going by the reports many have opted not to design their chip for TSMC N3B as there’s very high costs associated with the node.
You’re so misinformed on the volume. They are the largest user of silicon in the world and also TSMC’s highest volume customer. There are a few little known products called the iPhone, iPad, AirPods, and watches. At one point, Apple accounted for 1/3 of all silicon used in the world in a given year.
@@thekeeprI would say not really. The point was to work in a Qualcomm ad, they paid him to visit after all, and then make a “call” for bold form factors which was largely pointless, thus the clickbait splash page. I think the Vision Pro is that form factor, which Quinn knows. The really slim Mac is effectively an iPad with key case. The issue there is software not hardware, which Quinn knows. This nets out to a Qualcomm ad and some M chip data that is sorta news and sorta not, depending on if you follow the channel. It’s useful for irregular viewers of Snazzy but meh for the regulars. Wish Quinn well of coursee
I still have a 2019 Intel laptop, but do have a M1 Ultra. After watching the video, I think I would be well served with the 2021 MacBook Pro for travel and couch surfing too.
i just love it that the original M1 chip had a media engine 1.5x faster than the dedicated Afterburner card released a year earlier that apple was still selling for 2000$ at that time.
@@pokiblue5870 you know i agree with that, i really hate that part of apple silicon, i honestly feel like the government should do something against non repairable devices cause it's harming the planet a lot more than we think. Are the new Qualcom based PCs upgradable ? Cause if not that would be a disaster too. I feel okay that my macbook is not upgradable cause i bought it directly with very high end specs (at a very high end price) and it's now been 3 years and i still don't feel like i need any upgrade. My Asus Laptop i bought around the same time for gaming is upgradable and i'm so happy i've been able to add an extra SSD in it for pretty cheap, however i wanted to add ram and i already have to buy it used cause in 3 years the standards have already changed.
@@pokiblue5870 I 100% agree with you, the gouvernement should do something against products that can't be repaired and force apple to allow us to repair whatever might go wrong in a few years. However i don't mind it that much cause i often buy the highest tear model and so it last so long that when i finally feel like i want to upgrade, new parts are not compatible with the pc i have. I'm a mac user and would only use mac if i could, but like most mac users i also have a PC, both a gaming laptop and a custom built tower i made myself and i know how great it is to just add an ssd when the current on is full.
With Thunderbolt, superfast external storage with a small portable drive is easy. I have bought a base for many years and use external drives. Even when my 8-year-old iMac HD died many years ago, I ran the OS off an external Firewire drive which ran faster than the old mechanical HD. This is really not a problem in 2024.
@@andyH_England I really want the ability to upgrade internal storage though. The 14" and 16" pro should absolutely have an extra M.2 slot in it that I can drop an SSD into later. Technically they should have up-gradable RAM too but with the way it's integrated into the CPU, it does provide some benefit there so it's a trade-off. But my general take is the Pro versions should have upgradable RAM/Storage and the Air can be soldered to allow a sleeker design (like the 12" Macbook). It really astounds me that they didn't re-release the super thin Macbook with Apple Silicon as the new Macbook Air, and then the laptops they call the Air now should just be a Macbook. It's like whomever is in charge of naming these products is asleep at the wheel.
@@andyH_England ,maybe for you, but a lot of us don't want portable dirves hanging off of our small, sleek, highly portable, and light laptops. In 2024 having only 256 GB (the base configuration of a macbook air) feels a bit cramped to me, especially if you want to edit large 4K video files.
I mean external SSD is ok but Not giving m.2 slot for SSD is criminal I mean if I have m.2 SSD from my previous laptop or want to upgrade storage there should be 1 extra free m.2 slot in every laptop (And you can save the planet also if you care)
One mistake: In the beginning of the PowerPC era Mac owners could be bragadocious for the first time. They were faster and more efficient than Intels chips. That advantage faded away later until the switch to Intel was a big step up.
This.... but to be honest... even a 68k was zippy if you tweaked the OS right... That's why I moved away from Mac.. even though Classic OS was at times buggy.. if you knew how to arrange your extensions, it was fast and just did its job... none of this bloated OS with a kiddy interface that dumbs it down.
They were never faster then intel that was a marketing lie. I loved the pre intel Macs Computer pre sysyem 10 days. Apple will eventually fix whatever problems they encounter but some things are by design. Apple computer where never built to be customizable since the Apple II lost to the Mac. If the Woz had his way, Apple computers would have been more like custom pc are today.
PowerPC was more efficient (in the beginning). I remember a test were the notebooks were a lot faster than Intel-Notebooks. I don't know about desktops.
Some of us are old enough to remember when Macs CPUs had the speed and efficiency advantage multiple times in the past. When the first Mac was released in Jan 84, it had a 16/32-bit 7.83MHz 68000, when IBM's XT was their most advanced PC, and it still had an 8/16-bit 4.77MHz 8088. And practically all of the other personal computers on the market used either the 8088 or an 8-bit CPU, with a few MS-DOS/PC clones perhaps already using the 8086, which was fully 16-bit. And then, when the PowerMacs were released in Mar 94, based on the PowerPC 601, they were again the fastest and most efficient personal computers on the market. And the same happened a few times during the PowerMac era.
No company can sit at the top forever. But consider that the Snapdragon chip wouldn't be readying for release and mass adoption without Apple's M1. Now the pressure from QualComm and MS will keep price pressure on Apple. Competition is good.
@@JoeStuffzAlt how can you be pro x86? Even intel and AMD would love to switch away from it... there's no good argument for x86 other than that every piece of software out there is basically written for it... an adoption of any other architecture would require either really good emulation/translation... or that every program gets rewritten for that architecture.... the industry decided to go with emulation/translation and if it's at some point good enough to work for the corporate clients there will be mass adoption of it... what the private end user uses is irrelevant and depends entirely on what the manufacturers provide
I agree. It's always great when competition leapfrogs each other. This new chip wouldn't exist without Apple, and the next amazing innovation by Apple will be spurred on by this chip.
Apple ram is not soldered to the board. That's the big difference and everything is moving that way in the industry. Ram is ETCHED in the same silicon die that cpu gpu and other modules are. And believe me in future everything would be that way. That sort of tight integration enables huge performace and power savings, something that is absolutely needed...
@@ahrrr7690 Yeah, this take aged badge, but at the time it did seem like performance was somewhat plateauing. Also I think a lot expected the PC arm rollout to make a bigger splash, but hype died soon after the few products released. The M4 max is outperforming the newly released top gaming desktop CPUs by a long shot and you need the $2k+ enthusiast/server chips to find comparable multicore performance. It basically outperforms on single core performance on all but the most expensive stuff. This this is on a max ~33w power draw vs the max ~240w draw for server chips which is wild. That said, unless your job revolves around doing moderate to heavy performance intensive work on a computer I don't think it matters what the m4 max can deliver. It is a wildly expensive device and it exceptionally performant at stuff that does not matter to most people. Gamers is the only consumers that care about performance and for them CPU performance is not important, and they get much better value for money using a dedicated GPU. Not to mention that the macos game support is still abysmal. The TLDR is that nothing has changed, the video expected ARM non-apple laptops to challenge apples dominance in the professional space, but it seems like apple will get to keep donimating for another generation. For gamers nothing has changed.
Took me ages to stumble on why Apple M series machines had such limited external screen support, hugely reducing the previous number. They are using the iPhone display controller instead of a high end laptop one. As a projection artist multi outputs, 3 are really the minimum for the installations I use and that's evidently common on other hardware. My 5 year old midrange Amd based HP supports 5 external outputs (internal touch screen still on, so 6 screens in total) so for arts work it's a dream. Be great if Apple worked out its design limitation.
Makes sense as the M series chips are just a scaled up version of the A series iphone chips. They just threw in more cores, rebranded to M series and called it a day.
@@jerryw1608 @jerryw1608 that's interesting. Is it the base M2?. Apple's site says just one screen on the base M2 or M3 and you need the highest end M2 chip to do more than just 2. There is an update that lets you run 2 external screens on some MacBook Airs but only if you shut the internal screen off, which is quite 'unique' a solution. The only Apple dealer store in my area also tells me it's just one screen on base models and you have to have bought a higher end M series to get over that limitation.
@@carylittleford8980 DisplayLink is analogous to a 'software defined graphics card'. I've got a M2 Pro MBP, and a few USB C hubs that have display port connectors on them, but have only ever been able to drive 'up to 2' external displays of any resolution at the same time natively. I've got 2x 1080p monitors, and a samsung g9 dqhd screen (5120x1440), but only 2 screens will ever work at the same time, no matter which display connector they're connected to, but if I put both 1080p screens on a dual displayport DisplayLink adapter, and plug that into a USB A socket on one of my dongles, then I can get all 3 monitors (plus the laptop's own screen, so 4 in total) working. This does need the 'DisplayLink Manager' app running to get the DisplayLink adapter to function.
You're discounting x86 architecture too much. AMD's 0504 custom APU made for the Steamdeck is an x86 chip which can run most modern PC games with a 15W TDP. In desktop mode it draws only about 8W in typical usage scenarios (disclaimer: I was unable to isolate the APU power draw in desktop mode, so this is the figure that represents total power draw from the battery, with the screen off). Personally, I'd be hyped to see powerful and energy efficient chips like this one in laptops. It has all the compatibility benefits of the x86 architecture and it's pretty efficient. AMD does offer slightly more powerful chips for laptops, but they have roughly double the power draw, which makes them less desirable for people who are on the go a lot.
I think the reason is that the steamdeck is 720p if I am not wrong. We already have processors like that in laptops, but because of things like higher resolution and windows instead of optimised linux distro they seem less powerfull/impresive.
@@matejjustus4573 yup, I agree. But what I'm interested in is a lightweight Linux laptop that can run for many hours on battery, for hassle-free dev work on the go. It just needs to run an IDE, a web browser and, in a pinch, compile some basic modules for testing. If I want to do anything fancy I can delegate the workload to my PC at home. And yes, I very much prefer an x86 machine to natively test my x86 binaries on.
@@hctiBelttiL i have old Thinkpad t480 for my studies, TH-cam and programming (clion, pycharm, gitlab) and it lasts without problems. Right now I have 83 and 46 % of battery (two batteries and it reports ~7:30h remaining on balanced power setting.
For a desktop, I agree. But put that in a laptop, you'd have to carry around a car battery to get decent battery life. I'd love the arm option for mobile
@@mikekellytech Not actually true a lot of the Ryzen newer AM4 cpu's actually draw less power than their intel equivalents. They just run a little hot. Most of the time.
As lifelong pc guy , I recently started longterm camping in a beautiful rainforest and Apple is the only way to go , I’ve wasted too much money on badly built pc laptops and to get a comparable build quality pc laptop , I need to pay more than Apple and then get less performance
@@cheekoandthemanI play a fair bit of games on my laptop, so while I’d certainly kill for the efficiency of an Apple Silicon Mac, my use case rules them out wholly.
The 12inch MacBook - fondly called the **MacBook Nothing** - was my ATF of Apple laptops. Its thin-ness and light-ness was insane. I used it primarily for business travel where would be in meetings, so the use case was simply MS Office, MS Teams, Visio, Web Browsing, and corporate email. It was great for all of that, particularly brilliant with the Retina Display. The KB was a bit janky, as I was constantly missing characters as I typed them (pretty fast, probably not enuf pressure) but the KB never failed or had HW issues like for many. I would love to see a MacBook built on that same fan-less and thin platform, and even if just M2 would be very successful IMHO.
I owned one and LOVED that thing. I don’t know why Apple doesn’t bring that back. The single port was fine by me. Maybe add a second port and you’re done. It was awesome for business travel. MS office and work emails. More functional than an iPad.
Without wanting to be a snob (Im not, I spent years building PC's). No one who owns a Mac will ever go (back) to a pc for any reason apart from cost or work-based software compatibility. If you can afford the Apple ecosystem and buy into it you just dont stop to contemplate alternatives because you no longer have the myriad issues that other non-apple systems suffer from. The next thing is hard to explain to those who dont care about it, but its aesthetics, and simply put: Apple has an aesthetic that is appealing, and almost completely unchallenged. Some of that Chinese tech like XiaoMi and huawei looks like its getting the idea, but we alll know both those companies arent trustworthy competitors to Apple, and the rest of them, including samsung, just dont seem to get it.
It took Apple more than a decade to produce the m1 and people can’t expect the magic reveal every year let alone even every 5 years. Tech moves much slower than the marketing.
@@petarprokopenko645 I would immediately buy a 12-inch MacBook made out of matte plastic with more rounded edges. Even more so if it was a convertible with pencil-support. I'm genuinely tired of aluminum. It's cold and hard and not comfortable at all. I loved my 2006 MacBook....
Fun fact about the Qualcomm Elite X / Nuvia team. "The founding trio of Bruno, Gulati and Williams were key high-level architects at Apple whose expertise brought fruition to many generations of Apple’s SoCs and CPU microarchitectures. Williams was the chief architect on all of Apple’s CPU designs, including the recent Lightning core in the A13"
so yeah, put better, nothing to do with apple hype, its just that arm is coming into age. i could do all of this on a raspberry pi 5 years ago for $35 woot.
Man. I think I started watching your videos many years ago. I already liked them then. I haven't watched a lot recently (the algorithm doesn't suggest them I suppose) and now completely finished this. Thoroughly impressed with sentence formulation, story structure and overall presentation. Good video!
I do my graphics work on an M1 Macbook Pro and I don’t think I need to upgrade within the next 4 years. We see what’s going on then. I don’t need more compute power, I need good software and efficiency. The plague of modern computing is bad software. Programs that were 1MB in the 90s and ran on a 486 are now 500MB and need a high end cpu.
optimization became absolute garbage in the last 10 years. Almost no application is optimized nowadays as software devs are able to brute force push the apps because of the huge leap in processing power.
@@jani0077 i 100% agree on this, gaming is a sore example of how much devs stopped caring about optimization and just push products out with barely any QA/QC
The biggest issue is that it's not a pc. Its greatest achievement is that it's not a cromebook. The Acorn pc used Risc chips.....in the 90s. Risc chips are great for using half the power and taking 3 times as long to do the same task.
I don't think that's possible unfortunately. ARM chips work differently with how the OS needs to be coded specifically for it, or something along those lines
@@lucasremI do not think it would be even possible, because of the hard software/hardware binding of apple. Imagine why no one managed to run iOS on android phone.
I am continually amazed at how quickly these "long" videos fly by. You are an elite presenter. Dare i say the best tech product communicator on youtube.
Truly dude, your funny, entertaining and have a superb outlook on technology and pick interesting topics too. You should have way more subscribers! @@snazzy
@@snazzy you totally missed the point WHY it isn't much faster. the reason is milking the tech that corporations already have. improving too fast drops older cheaper-to-produce products prices too fast. so corporations don't like fast progress which they do only when they forced by sales dropping down. now whith what they have they only compete with themselves, making faster chips will kill m1-m2-m3 products sales. thats the actual reason why your napking math didn't work, everything else is just marketing faitytales from corporations to justify milking the market.
A thing that should be mentioned is that it wasn't really an x86 vs ARM issue. Intel basically just got stuck at a bad node size for a long time as they fab inhouse. The Steam deck uses x86, but on a TSMC AMD chip so the tdp is just better.
yup. In multicore performance actually Ryzens are almost on par with Apple chips. Its so strange AMD didnt try to add efficiency cores to their processors - it would be game changer.
@heroofjustice3349 AMD gets pretty close to Apple in terms of pref per watt, but only at the high end server stuff, and only if you turn off boosting. 128 cores of pure compute if I recall, that only sips 300W
@@heroofjustice3349They will be. AMD will be releasing a power efficient chip with efficiency cores (regular cores with less cache) by CES next year. So both Qualcomm, AMD and Intel to a lesser extent will offer high performance, high battery life laptops in 2025. Apple's advantage will negligible next year.
@@Demopans5990 Ok I will write again because link was caught by YT and my comment was not posted. So instead I ask you to put 'AMD Ryzen 9 7940HS analysis' in google and check first link from notebookcheck. As you can see your comment is lie. Consumer grade AMD chips for laptops like 7x40u/hs or 6800u are almost as efficient as Apple silicon under load even with worse manufacturing node(6800u). Advantage of Apple Silicon lies in E-Cores. Ryzens have only performance cores and so under low load or idle those chips needs to use more power. Its just wonder of BIG.little architecture - power hungry P-Cores are activated and powered only when there is need for more processing power than E-Cores can handle. However AMD is already planning to introduce their own E-Cores and it should put Ryzen powered laptops very close in all aspects of efficiency(including battery life) to Apple chips.
Your point about fan speed is exactly why I love the M1, especially the 16". This thing is cool and silent no matter how many consecutive hours I work or play games on it.
@@snazzy I use for ME always an M1 MacBook Air, even when my whole team uses better Mx Macs. For Keynote Pages and 1x a month an short video export... I see no reason for an upgrade in years.
Like my Mac mini M1. It’s on 24/7 and nobody ever heard the thing, no matter what I did on it. Going to my son’s room and hearing the noise there of his windows PC always makes me laugh
one thing to note tho for when apple transitioned to m1, in the prev gen intel macbook, the heatsink LITERALLY didn't make contact with the die, this was found when LTT tried to improve the cooling with better paste and found that they had to drill out some of the heatsink in order for it to touch the cpu
And in the Early 2020 MacBook Air models, they added that ridiculously underpowered semi-passive heatsink that could only handle about 5-10 Watts before the CPU started to throttle...
@@ZachariahConnor Considering that they _do_ have some engineers employed, I can't think of any other explanation for this. But that leaves me with a feeling that they were actually caught by M1 surprise themselves--had they known that the Apple Silicon would be _so_ good, I don't think they'd have gone through the trouble.
I met Jeb once after that comment. He's so ashamed of it because it was never meant to sound so desperate. But at least he can laugh at himself about it. Really nice guy, but not my type of politician.
What you said is what AMD realized. "Wait... if we manufacture this part on the new node but these parts that won't get as much of a performance boost on these parts on the older node, we can save money!"
It's not just saving money, shrinking certain parts (like IO) further just didn't scale to the new nodes... With chiplets, they have very small dies and thus the chances of major errors in them is much lower, resulting in way better yield or feasibility to make them to begin with. Meanwhile the huge-ass IO die gets made on a node which is mature enough to reliably make that size of die. Well, I guess it is saving money in the end after all by making the production yield of their chips not prohibitively bad.
@anthonycotter1493 ISA used to make a huge difference in the 80's, by the 90's it was down to a few percent. In the 2000's the difference disappeared completely. If ISA was the end all, we would all be using PowerPC right now. A bigger difference will come from a memory controller difference than the ISA.
@anthonycotter1493 I think it is both and performance per Watt counts, that's the main reason the development of Intel x86 Android phones stopped. But yeah, I am also sure if a brand new modern instruction set would be developed, ARM would be defeated, also on performance per Watt.
AArch64 threw out a lot of legacy features from earlier ARM versions, features that proved a challenge for speculation such as its stateful predicate bits and complicated instruction encodings. It also added some features that are generally useful at scale. Intel and AMD have also been adding some useful features, but they are only massively superscalar through absolute brute force. I don't think anyone doing chip design was actually surprised by the leg up the M1 had, even with the process node accounted for.
@anthonycotter1493It can be just a mistake but based on the writing, you are mixed up Instruction set and Instruction set Architecture. Two different things. One like its name is just the instructions, the other is the actual circuit design to execute such instructions. (Loosely speaking.)
if people are going to say that the nanometer nomenclature doesn't make sense, don't use it. Call it N5/N3/N2 etc which is the actual product name TSMC uses.
The main hope I have from competition isn’t new MacBook form factors (thought it’d be nice); it’s more realistic (read: cheaper) storage and RAM upgrades. The fact Snapdragon Elite uses NVMe over PCI should prove that Apple’s soldered-on SSDs don’t have any advantage; maybe they’ll use standard connectors, at least in Pro machines.
I agree with you! I also hope Snapdragon Elite X chips won't be hamstrung with non-upgradable RAM and storage. Why does that leave me with the sinking feeling...that that's EXACTLY what Qualcomm will do? Heaven forfend! 🤣🤣
Doesn't apple already support pci-e on their desktop mac pro with arm chips. Also thunderbolt is also over pcie if I'm not wrong. They just chose to hinder any upgrades.
@@jimcabezola3051i think RAM will be soldered due to the current speed and trace length limitation of ddr5, let's hope that new dell standard for ram gets better adoption. but no reason storage needs to be soldered.
@@kushagraN Yes, you're right... I'm afraid you're right. The traces to a DIMM would be prohibitively long and slow. But...let's not make a base system contain LESS than 32GB, okay? 🤣🤣
After 30 years as an Apple user, I think my 2021 M1 MacBook Pro Max might be my last Apple computer purchase. I am excited about Framework’s roadmap. My biggest regret: thinking 32 GB of RAM was sufficient for my needs and realizing a swap for the 64 GB would require me to take a huge loss. My servers and PC have 128 GB…I must have been drunk when I configured that MBP. 🤦♂️
Counter suggestion: he's recording enough of them at once that standing is just impractical to do in long bouts. I'm sure it's way more comfortable. Glad he can sit down.
I love that you mention how "3 nm" is mainly marketing -- too many people, including tech journalists, are misinformed about what the node number actually means.
I've used Macs since 1990, but gotta say nowadays it feels like they only have a handful of macOS developers but hundreds on the iOS team... recently had to downgrade from Sonoma back to Ventura (on 2020 intel MacBook Air) because after the upgrade the CPU was running hot and the fan was constantly on. (Activity monitor not showing anything crazy). There's an apple forum support thread with at least 300 other people who had same problem. After the downgrade (not easy- required full wipe and install even with a Time Machine backup), fan was back to normal. The good news about these Qualcomm chips is maybe there'll be ARM hackintosh macs possible.
The thing is apple sold well even when they were using intel chips, which were hitting thermal limits fast, so I doubt people buy apple hardware for performance first. I think nothing much will change with apple, they rarely take any risks.
What Apple does best? Hardware? Seriously? Louis Rossmann would like a word. What Apple does best is User Experience essentially... their design process starts with things people need to do, then the build the hardware and software around that. Thats why people love their Apple gear so much.
I dread these snapdragon chips being locked down and inaccessible to linux users. The transition to arm would spell trouble for people who want literally any freedom over their hardware whatsoever.
As usual, title makes me think it will be boring video, but the I see your beard, and I start watching, and cannot stop till the end. That's magic. I do not even have any apple device 😅
Great video going as deep as I always want these videos to go. Excited to FINALLY get an M1 competitor. I'm a Linux user who has been waiting for 4 years for this!
Linux being supported would be the only reason for me to switch from my M1 Pro MacBook Pro to any other device, that's close in efficiency and performance. No x86 system is. And sadly Asahi Linux still doesn't cut it on Mac.
The problem is that the M1 made intel actually finally "wake up" and resume actually investing R&D on x86 arquitecture, and AMD followed suit. This brought innovation back into x86 that was basically stuck for years, and this innovation quickly outrun anything that Apple was trying to do with Apple Silicon. The new intel Core Ultra cpus and Ryzen are now more performant and with better efficiency from x86 than Apple Silicon with ARM64. And the x86 is still a better arquitecture for a lot of things. And now that intel and AMD are back in the fighting ring, x86 will just continue to get better at higher-end performance points
Apple R&D and launch cycle for the M1 happened during the same time span that Intel and AMD R&D'd theirs, AMD had already planned on a new socket and Zen4 architecture, Intel was catching up on AMD's lead with chiplets with their efficiency cores, Blue and Red didn't wake up following Moneybags' M1, they just happened to have parallel'd it
Well AMD being able to do anything is special in it of itself. I still rather have AMD in a framework laptop, but I'll take a SD Elite in a Framework if they make one and it is easy to run my Linux stuff on it. Meanwhile, my Samsung Galaxy S24 on Dex works great too.
I agree, let Apple keep their ARM and let some windows laptops have it too because why not. But that does not mean that x86 is obsolete and shouldn’t be improved further.
Yes it did. You don't think other oems get internal leaks from their counterparts, do you? Sometimes even some contribute to other's success hardware-wise@@thegeekno72
I have an M2 mini and an M2 Macbook Pro that I don't really use anymore. I still use my Hackintosh 2x E5 2699 v3 with a 5700 XT slapped on and 256 GB RAM for my editing and daily use. I only take out the laptop on the go and the M2 mini ...I should sell that, I don't even know when I used it last. Yes, I use a lot more power, but restoring historical photos on an Apple computer would probably mean I would need a Mac Pro 7,1 to even come close to my Hackintosh. And we both know I can't afford one :)
In a word, Proprietary. Hook line and sinker. I didn't like the fact that it was not easy to get under the hood and tweak things. It was nice "machine" but when things did not go well it was like trying to fix bits in a modern car.
I moved to M1 from Microsoft laptops and it was a new world. It changed my perspective on things. Windows laptops never performed well on battery and it didn’t matter about performance when the thing was dead after a few hours. Just being able to use the computer without battery life anxiety was massive. The M1 still feels new and works great and so if it isn’t broke does Apple really need to push the boundaries just yet? If my M1 broke today I will want to buy another MacBook. I don’t have any outstanding issues forcing me to need anything better. Anything more powerful. It’s all just icing on the cake at this point for me.
The biggest problem Apple faces, aside from the absolutely ludicrous pricing of their upgrades, is that they insist on using monolithic chips. There are tons of reasons why everybody is moving away from them, and rumor has it that Apple is moving to make even larger chips. Absolutely ridiculous.
"The Intel Mac is the best! The Intel Mac is magical! The touchbar makes it worth $1000 extra! The Intel Mac is the best Windows computer!" ...then M1 comes out... "The Intel Mac was awful, it overheated, it was an engineering failure, luckily there was a savior! Apple always saves me from the problems that they create!"
I was an Acorn Risc PC dealer in 1991. The original Acorn Risc Machine StrongARM CPU that ran Microsoft Windows 95 in a RiscOS window. I told you then that ARM and RISC are the future not X86.
@@granttaylor8179 In what way did it take intel 40 years? For longer than I have been alive processors have needed cooling and been hitting temperatures upwards of 80c.
@@definingslawek4731 It has has only been since the 486DX2-66 of 1992 that any sort of passive cooling was needed on processor. Intel has been making processors for over 50 years. Apple has run into similar issues a lot sooner than Intel did. They cannot keep shrinking the die process as that has been done for the last 50 years and it is reaching the current limits.
17:08 "The M1 series will be remembered as some of the greatest computers in history" Maybe so, if only they weren't designed to be history in so few years.
@@definingslawek4731 Last, yes, they just won't be of any use. Landfill. I still have operating and useful PCs from the 90s, my mid 2010 MacBook still works, how many M1s will be bootable in five years? None can be given a second life. Mac resale values for Apple silicone models will hit a cliff, something that has never occurred before with any Apple product.
The real Apple Silicon secret sauce is the big process node advantage and low frequency, high IPC cores. Also the Media Engine and AMX, but not "Unified Memory" or the Neural Engine. It's not even used most of the time. And unified memory has been a thing on x86 for many years.
Finally someone pointed out Apple pay-to-win with TSMC for the whole time, and not pixie dust that makes the Apple M-series significantly faster in comparison to others.
His opinions are just random nonsense. Dude has no idea what he is talking about and neither people who agree with him. I cant tell what is the main point of this video or what his problem is.
Just 5 years ago no one would be doing the intense editing you do on any laptop, this is what desktops were for and they're now becoming obsolete. Sometimes you have to take a step back and look at how quickly all of this has advanced, and it's incredible
9 years ago I was doing HD video mixing on a laptop. Also full multi-track video and audio production. But not a Mac because those would overheat if you ran Handbrake for more than a few minutes or tried playing more than 2 4K videos at a time. PC manufacturers should have used the slogan "Don't worry, your Mac will be able to do this in like 6 years."
They should make mac mini / iphone hybrid. Basically an iphone with m3 or m3-lite chip that is able to connect to a dock station (monitor, keyboard, mouse) via usb-c abd offer a mac mini experience.
They would do this but Apple doesn’t have the same luxury as Microsoft in that they have competition within their ecosystem. Apple would self cannibalize their other products if they made something like this a viable alternative. It’s the same reason why the iPad still can’t run desktop apps despite have the same chips, they just don’t want to risk making one of their own product lines redundant. How do you get away with selling $3,000 dollar laptops when a sub $1,000 iPad can actually do everything you need it to?
@@brandondaprothis is already the case with the MacBook Air… TH-camrs are always suggesting the Air for the “average” user who isn’t doing heavy productivity. The iPad Pro/Air can already fulfill this task for the majority… The MacBook Air is a pretty scammy product when you really break it down.
The problem with a comment about the iPad as a substitute doesn’t work for me. Well, let’s say it would/could work for me if the iPad didn’t have a crippled OS.
Having something like Samsung Dex for the iPhone would be glorious. This is the absolute dream for me. Just let me carry a single device with all my files and apps and let me dock it when I need proper peripherals.
TDLR, there is no competition anywhere to be seen. not even in the rear view mirror. Apple has no competition. i don't know any Apple users that cares about anything Android, windows or anything else. just like Apple doesn't care what others do. the only ones are cuntent creators who are in dare need for anything to talk about. so we have synthethic meaningless benchmark comparisons, at the end of the day, nobody cares about the ssd speed of a base model. the camera on the front of a laptop, the throttling under max load etc. but still, here we are, youtubers & co braught the useless notch to macbooks, cpu manufacturers optimizing their chips for benchmarks instead of customers needs, now a base model has faster ssd, nobody ever actually needs, all this nonsense that doesn'T make anything better but everything worse. because content creaters are wannabe's at best, no real skill behind, everything is entertainment.
Great video! I've been thinking about these topics/questions for a few months now. It seems like a lot of hardware chips are getting bigger recently, so obviously transistors can only get so small and we have been scraping the ceiling of those limits for a while.
@@exzld Beyond OG OSX and perhaps the next couple of major releases, each release takes a retrograde step in terms of usability. Originally: A lovely, clean, intuitive user experience for daily driving. Administrative features were all in sensible places - where you'd expect them to be. More recent iterations have had changes for the sake of change, made worse by nobbling some administrative features, as well as arbitrarily moving other admin features to random places for no good reason. Perversely, the biggest competition ("Windoze") have been progressively making their offering somewhat easier to administer.
Your Number 1 just isn't true, Modern X86 CPU's run off Micro ops and translate everything internally into small instructions, ISA is not an advantage of ARM
Doesn't matter point is x86 chips can never be as efficient as arm chips in battery life and efficiency if they were then we would use them in phones and mobile devices not including laptops
@@Space97. that's not true, there ain't any that are designed for phones. If you stick an M3 in a phone it won't be efficient ether. ISA has nothing to do with how efficient a CPU IS.
@@Rocman76 question why do you think where were able to put no fans on some of their macbooks and it stayed cool and quiet because with intel that would not have been possible I recommend you watching videos on x86 vs arm and they all say one thing and that's efficiency
Apple silicon was never magic. The magic apple had was exclusive 5nm and 3nm production process allowing it to cram more transistors on chip, but others are catching up. If apple was smart it would have never included a large bulky Neural Engine and instead used that space for a 64 bit Intel compatible processor. This would have allowed it to still do Neural kind of stuff, but allowed the machine to be Windows backwards compatible for companies whom need to run software that is only available on Intel Chips such as Solidworks. Like Tesla, Apple does not listen to customers.
Apple silicon never had magic. The amazing performance was because Apple released those benchmarks and no one else could do their own yet. HandBrake testing proved that M1 chips didn't remotely live up to the expectations.
For me the biggest problem for apple silicon macs is the complete lack of upgradability and also the criminal pricing of RAM and SSD upgrades
If I'm not mistaken, the video may have suggested Qualcomm's ASIC may allow for user upgradeable storage.
They are gonna have to change that eventually. They’ve had the same upgrade pricing tiers since 2015 which is insane. Their base models are such a great deal but the min spec is so lousy almost everyone should upgrade and by the time you’ve doubled the ram and storage you’re at basically double the price. Sucks lol
A boring trope. No one cares, except a handful of nerds that think they’re smarter than everyone else. Zzzzzzzz
With these new chips coming out from Qualcomm I think it’s will be time for Apple to reconsider their pricing to compete.
@@snazzy I'm honestly surprised that the EU hasn't run riot on that yet. The change is inevitable, as they're going in hard on right to repair to the point that it's a matter of time that the right people notice the problems with the lack of upgradeability.
As someone who has built their own desktop PCs since before a lot of TH-cam viewers were born, my biggest anxiety about this transition is whether or not I'll be able to keep building and customizing my own hardware. I don't particularly mind the idea of SoCs, but I really value the modularity of the existing x86 platform. It has saved me boatload of money over the years and lets me get exactly what I care about without paying for what I don't. The thought of being stuck with only a laptop, with all the horrible tradeoffs and e-waste that comes with that is just appalling.
hoping for the same socketed systems that we are used to from intel/amd to come to arm, and just motherboards with a new chipset, maybe no chipset in the case of an SoC
they would have great iGPUs and maybe free included ram, while keeping pcie and ddr5 the same or a new standardized ddr-dimm for arm systems
apple themselves have proved that pcie at all is possble although no current gpus work with it
making a new socket-its nothing new
storage is too easy to implement, and its another opportunity for cpu makers to offer some small on board boot drive
leaving modular GPU and RAM as the big main questions
I agree with you but I have my PC at home. For games, 3D-printing, hobby programming but also professional heavy programs. For professional stuff on the go my MacBook m1 has been awsome. It just works, syncs easily with my phone etc. It is way more reliable than any pc laptop (and I have had many). Both systems have their strengths and weaknesses both in software and hardware. PC definitely needs to modernize itself on many fronts. But I hope it stays open and adaptable.
GPU is the hero for PC Desktops. The future is more and more GPU focused for everything creator and AI oriented and having a SoC and a GPU will be super powerful.
Old Amiga guy here. Amigas were great because they had multiple of chips. If we can figure out the internal i/o of that in a motherboard layout we will all have custom super computers in our future.
@@MarcTelesha The future will be interposers with logic built into them. They can be designed with topologies that enables far better communication than currently between chips. How the architecture overall will be is hard to forecast. Will it be similar to the newest Nvidia AI gpus with CPU and GPU on the same interposer with shared memory? If the performance advantage is high enough that will make upgradeablility of the individial components sacrificable.
Unfortunately, in future everything would move towards even larger integration, so everyhing would need to be as close as possible, on the same die.
not only are the ssds unupgradeable, but they’re also unreplaceable, if the ssd chips die, the mac is a brick. i hate that.
link?
Have you ever had a Mac with an SSD that died?
@@miketkong2 dude 😆, it can die easily with swapping.
@@lukasmauer230 what’s swapping? It’s an internal drive.
@@lukasmauer230 no it can't. lmfao. the swapping thing is a meme. its virtually unheard of for modern SSDs to fail due to swap. it almost never happens, only under experimental stress testing. its not a thing anybody should ever worry about with a laptop lol.
3:10 there is a fourth point thats also important:
Apple has continuously operated on a node/technology advantage. Their lower volume, high average selling price and good relations to TSMC made it possible to always be ahead of AMD or Intel in that regard.
This is basically only possible because they sell the complete laptop and can use the higher profit margins to actually buy significant volume from TSMC, while still buying less silicon than AMD or Intel move.
They use their higher profit margins to fill in their bank account. If Apple made slightly less money they could sell a lot more computers, but just like Harley Davidson Apple would rather sell fewer products at much higher profit margins than put an Apple computer into everyone’s house.
They even booked all the early batches of 2nm TSMC silicon already, so other brand needs to wait a bit longer
Also, I think it was a previous video of his (or some other creator) which looked into the actual transistor count (mostly because of Apple's node advantage as you pointed out) vs performance comparison with AMD/Intel chips, and ofc its not all rosy after all!
@@sihamhamda47That’s the thing about being a key founder, you provide all the funds for TSMC to set up and go forward with a new node. It’s because of bug customers like Apple that the industry is moving forward to these new nodes. Also going by the reports many have opted not to design their chip for TSMC N3B as there’s very high costs associated with the node.
You’re so misinformed on the volume. They are the largest user of silicon in the world and also TSMC’s highest volume customer. There are a few little known products called the iPhone, iPad, AirPods, and watches. At one point, Apple accounted for 1/3 of all silicon used in the world in a given year.
The 2021 MacBook Pros are aging beautifully, I have no desire to upgrade or feeling of fomo at all
His point lol
Yep. I’m still using an M1 Pro MBP and M1 iMac as my daily drivers. Zero desire to upgrade.
M1 Mini and M1 Air here and still going strong, the Air will need a new battery soon but otherwise it's good
@@thekeeprI would say not really. The point was to work in a Qualcomm ad, they paid him to visit after all, and then make a “call” for bold form factors which was largely pointless, thus the clickbait splash page.
I think the Vision Pro is that form factor, which Quinn knows.
The really slim Mac is effectively an iPad with key case. The issue there is software not hardware, which Quinn knows.
This nets out to a Qualcomm ad and some M chip data that is sorta news and sorta not, depending on if you follow the channel.
It’s useful for irregular viewers of Snazzy but meh for the regulars. Wish Quinn well of coursee
I still have a 2019 Intel laptop, but do have a M1 Ultra. After watching the video, I think I would be well served with the 2021 MacBook Pro for travel and couch surfing too.
i just love it that the original M1 chip had a media engine 1.5x faster than the dedicated Afterburner card released a year earlier that apple was still selling for 2000$ at that time.
Selling a 2000-4000$ device with no ability of upgrade or replace parts should be illegal. You call that pro model ?
@@pokiblue5870 you know i agree with that, i really hate that part of apple silicon, i honestly feel like the government should do something against non repairable devices cause it's harming the planet a lot more than we think. Are the new Qualcom based PCs upgradable ? Cause if not that would be a disaster too.
I feel okay that my macbook is not upgradable cause i bought it directly with very high end specs (at a very high end price) and it's now been 3 years and i still don't feel like i need any upgrade. My Asus Laptop i bought around the same time for gaming is upgradable and i'm so happy i've been able to add an extra SSD in it for pretty cheap, however i wanted to add ram and i already have to buy it used cause in 3 years the standards have already changed.
@@pokiblue5870 I 100% agree with you, the gouvernement should do something against products that can't be repaired and force apple to allow us to repair whatever might go wrong in a few years. However i don't mind it that much cause i often buy the highest tear model and so it last so long that when i finally feel like i want to upgrade, new parts are not compatible with the pc i have. I'm a mac user and would only use mac if i could, but like most mac users i also have a PC, both a gaming laptop and a custom built tower i made myself and i know how great it is to just add an ssd when the current on is full.
Ok, I understand if you don’t offer upgradable ram, a lot of other brands are doing the same, but no upgradable storage? Insane.
With Thunderbolt, superfast external storage with a small portable drive is easy. I have bought a base for many years and use external drives. Even when my 8-year-old iMac HD died many years ago, I ran the OS off an external Firewire drive which ran faster than the old mechanical HD. This is really not a problem in 2024.
@@andyH_England I really want the ability to upgrade internal storage though. The 14" and 16" pro should absolutely have an extra M.2 slot in it that I can drop an SSD into later. Technically they should have up-gradable RAM too but with the way it's integrated into the CPU, it does provide some benefit there so it's a trade-off. But my general take is the Pro versions should have upgradable RAM/Storage and the Air can be soldered to allow a sleeker design (like the 12" Macbook). It really astounds me that they didn't re-release the super thin Macbook with Apple Silicon as the new Macbook Air, and then the laptops they call the Air now should just be a Macbook. It's like whomever is in charge of naming these products is asleep at the wheel.
@@andyH_England external???? gross!
@@andyH_England ,maybe for you, but a lot of us don't want portable dirves hanging off of our small, sleek, highly portable, and light laptops. In 2024 having only 256 GB (the base configuration of a macbook air) feels a bit cramped to me, especially if you want to edit large 4K video files.
I mean external SSD is ok but
Not giving m.2 slot for SSD is criminal I mean if I have m.2 SSD from my previous laptop or want to upgrade storage there should be 1 extra free m.2 slot in every laptop
(And you can save the planet also if you care)
One mistake: In the beginning of the PowerPC era Mac owners could be bragadocious for the first time. They were faster and more efficient than Intels chips. That advantage faded away later until the switch to Intel was a big step up.
This.... but to be honest... even a 68k was zippy if you tweaked the OS right... That's why I moved away from Mac.. even though Classic OS was at times buggy.. if you knew how to arrange your extensions, it was fast and just did its job... none of this bloated OS with a kiddy interface that dumbs it down.
It's pretty ironic actually. ARM is the spiritual successor to the RISC arch in the PowerPCs.
They were never faster then intel that was a marketing lie. I loved the pre intel Macs Computer pre sysyem 10 days. Apple will eventually fix whatever problems they encounter but some things are by design. Apple computer where never built to be customizable since the Apple II lost to the Mac. If the Woz had his way, Apple computers would have been more like custom pc are today.
@@joelv4495 yes and no as ARM kind of predated PowerPC thanks to Acorns work on the BBC micro and all that.
PowerPC was more efficient (in the beginning). I remember a test were the notebooks were a lot faster than Intel-Notebooks. I don't know about desktops.
Some of us are old enough to remember when Macs CPUs had the speed and efficiency advantage multiple times in the past. When the first Mac was released in Jan 84, it had a 16/32-bit 7.83MHz 68000, when IBM's XT was their most advanced PC, and it still had an 8/16-bit 4.77MHz 8088. And practically all of the other personal computers on the market used either the 8088 or an 8-bit CPU, with a few MS-DOS/PC clones perhaps already using the 8086, which was fully 16-bit. And then, when the PowerMacs were released in Mar 94, based on the PowerPC 601, they were again the fastest and most efficient personal computers on the market. And the same happened a few times during the PowerMac era.
No company can sit at the top forever. But consider that the Snapdragon chip wouldn't be readying for release and mass adoption without Apple's M1. Now the pressure from QualComm and MS will keep price pressure on Apple. Competition is good.
I'm pro x64, but I do like that there's shake0up. I hope Apple silicon gets Intel and AMD to start taming their TDPs (how many Watts they burn)
@@JoeStuffzAlt AMD is pretty good for the processing per watt atm
@@JoeStuffzAltdo you mean x86?
@@JoeStuffzAlt how can you be pro x86? Even intel and AMD would love to switch away from it... there's no good argument for x86 other than that every piece of software out there is basically written for it... an adoption of any other architecture would require either really good emulation/translation... or that every program gets rewritten for that architecture.... the industry decided to go with emulation/translation and if it's at some point good enough to work for the corporate clients there will be mass adoption of it... what the private end user uses is irrelevant and depends entirely on what the manufacturers provide
I agree. It's always great when competition leapfrogs each other. This new chip wouldn't exist without Apple, and the next amazing innovation by Apple will be spurred on by this chip.
As soon as Apple began soldering ram and storage to the boards they lost me as a client. Ram prices are borderline criminal.
Agreed
Nowadays even gaming laptops have soldered-in ram.
Apple ram is not soldered to the board. That's the big difference and everything is moving that way in the industry. Ram is ETCHED in the same silicon die that cpu gpu and other modules are. And believe me in future everything would be that way. That sort of tight integration enables huge performace and power savings, something that is absolutely needed...
@@Useruseruser32121don't know what you have been buying but most of the good ones don't have unless you clueless what to buy.
@@veljko100ableit is not upgradable, period 🤷🏻
Click bait or just aged badly?
Aged like milk
@@jordanstarcher3782could you elaborate more? Is M4 big leap? I didn’t find anything to back up that statement.
@@ahrrr7690this and the sd x elite is bad performance wise, windows barely has support for arm too
@@ahrrr7690the M4 is faster than the M3 Pro and the M4 Pro is faster than the M3 Max lol
@@ahrrr7690 Yeah, this take aged badge, but at the time it did seem like performance was somewhat plateauing. Also I think a lot expected the PC arm rollout to make a bigger splash, but hype died soon after the few products released.
The M4 max is outperforming the newly released top gaming desktop CPUs by a long shot and you need the $2k+ enthusiast/server chips to find comparable multicore performance. It basically outperforms on single core performance on all but the most expensive stuff. This this is on a max ~33w power draw vs the max ~240w draw for server chips which is wild.
That said, unless your job revolves around doing moderate to heavy performance intensive work on a computer I don't think it matters what the m4 max can deliver. It is a wildly expensive device and it exceptionally performant at stuff that does not matter to most people. Gamers is the only consumers that care about performance and for them CPU performance is not important, and they get much better value for money using a dedicated GPU. Not to mention that the macos game support is still abysmal.
The TLDR is that nothing has changed, the video expected ARM non-apple laptops to challenge apples dominance in the professional space, but it seems like apple will get to keep donimating for another generation. For gamers nothing has changed.
Took me ages to stumble on why Apple M series machines had such limited external screen support, hugely reducing the previous number. They are using the iPhone display controller instead of a high end laptop one. As a projection artist multi outputs, 3 are really the minimum for the installations I use and that's evidently common on other hardware. My 5 year old midrange Amd based HP supports 5 external outputs (internal touch screen still on, so 6 screens in total) so for arts work it's a dream. Be great if Apple worked out its design limitation.
Makes sense as the M series chips are just a scaled up version of the A series iphone chips. They just threw in more cores, rebranded to M series and called it a day.
I use a displaylink dock to drive triple 4k displays with a macbook air m2 and I'm pretty satisfied with the results.
@@jerryw1608 I'm using built-in display since 2017, but I'm a weird guy that values portability and being able to work any place on earth.
@@jerryw1608 @jerryw1608 that's interesting. Is it the base M2?. Apple's site says just one screen on the base M2 or M3 and you need the highest end M2 chip to do more than just 2. There is an update that lets you run 2 external screens on some MacBook Airs but only if you shut the internal screen off, which is quite 'unique' a solution. The only Apple dealer store in my area also tells me it's just one screen on base models and you have to have bought a higher end M series to get over that limitation.
@@carylittleford8980 DisplayLink is analogous to a 'software defined graphics card'. I've got a M2 Pro MBP, and a few USB C hubs that have display port connectors on them, but have only ever been able to drive 'up to 2' external displays of any resolution at the same time natively. I've got 2x 1080p monitors, and a samsung g9 dqhd screen (5120x1440), but only 2 screens will ever work at the same time, no matter which display connector they're connected to, but if I put both 1080p screens on a dual displayport DisplayLink adapter, and plug that into a USB A socket on one of my dongles, then I can get all 3 monitors (plus the laptop's own screen, so 4 in total) working. This does need the 'DisplayLink Manager' app running to get the DisplayLink adapter to function.
You're discounting x86 architecture too much. AMD's 0504 custom APU made for the Steamdeck is an x86 chip which can run most modern PC games with a 15W TDP. In desktop mode it draws only about 8W in typical usage scenarios (disclaimer: I was unable to isolate the APU power draw in desktop mode, so this is the figure that represents total power draw from the battery, with the screen off).
Personally, I'd be hyped to see powerful and energy efficient chips like this one in laptops. It has all the compatibility benefits of the x86 architecture and it's pretty efficient. AMD does offer slightly more powerful chips for laptops, but they have roughly double the power draw, which makes them less desirable for people who are on the go a lot.
And Steamdecks APU is pretty old now. There are newer designs APU that are pretty astounding
I think the reason is that the steamdeck is 720p if I am not wrong. We already have processors like that in laptops, but because of things like higher resolution and windows instead of optimised linux distro they seem less powerfull/impresive.
@@matejjustus4573 yup, I agree. But what I'm interested in is a lightweight Linux laptop that can run for many hours on battery, for hassle-free dev work on the go. It just needs to run an IDE, a web browser and, in a pinch, compile some basic modules for testing. If I want to do anything fancy I can delegate the workload to my PC at home.
And yes, I very much prefer an x86 machine to natively test my x86 binaries on.
@@hctiBelttiL i have old Thinkpad t480 for my studies, TH-cam and programming (clion, pycharm, gitlab) and it lasts without problems.
Right now I have 83 and 46 % of battery (two batteries and it reports ~7:30h remaining on balanced power setting.
exactly, x86 is fine, its just apple wants to grab intel/amd profit margins for themselves
This aged like fine milk
I'm sticking to x86 via Ryzen 9 computers with 16 cores/32 threads, support for DDR5 128 GBs RAM, and all the hard drives I could ever want.
For a desktop, I agree. But put that in a laptop, you'd have to carry around a car battery to get decent battery life.
I'd love the arm option for mobile
@@mikekellytech Not actually true a lot of the Ryzen newer AM4 cpu's actually draw less power than their intel equivalents. They just run a little hot. Most of the time.
@@MJSGamingSanctuarycool story bro but nobody even mentioned Intel
As lifelong pc guy , I recently started longterm camping in a beautiful rainforest and Apple is the only way to go , I’ve wasted too much money on badly built pc laptops and to get a comparable build quality pc laptop , I need to pay more than Apple and then get less performance
@@cheekoandthemanI play a fair bit of games on my laptop, so while I’d certainly kill for the efficiency of an Apple Silicon Mac, my use case rules them out wholly.
The 12inch MacBook - fondly called the **MacBook Nothing** - was my ATF of Apple laptops. Its thin-ness and light-ness was insane. I used it primarily for business travel where would be in meetings, so the use case was simply MS Office, MS Teams, Visio, Web Browsing, and corporate email. It was great for all of that, particularly brilliant with the Retina Display. The KB was a bit janky, as I was constantly missing characters as I typed them (pretty fast, probably not enuf pressure) but the KB never failed or had HW issues like for many. I would love to see a MacBook built on that same fan-less and thin platform, and even if just M2 would be very successful IMHO.
That single lonely usb-c port will never not make me wheeze laughing.
*MacBook Adorable*
I owned one and LOVED that thing. I don’t know why Apple doesn’t bring that back. The single port was fine by me. Maybe add a second port and you’re done. It was awesome for business travel. MS office and work emails. More functional than an iPad.
i had one too and i so hoped they would bring it back with one of the M processors. would be an instant buy for me over any of the current airs.
@@CosmicTeapot There's a use-case for everything
"Tablet form computers are useless because they only have one port"... said no-one, ever
Without wanting to be a snob (Im not, I spent years building PC's). No one who owns a Mac will ever go (back) to a pc for any reason apart from cost or work-based software compatibility. If you can afford the Apple ecosystem and buy into it you just dont stop to contemplate alternatives because you no longer have the myriad issues that other non-apple systems suffer from. The next thing is hard to explain to those who dont care about it, but its aesthetics, and simply put: Apple has an aesthetic that is appealing, and almost completely unchallenged. Some of that Chinese tech like XiaoMi and huawei looks like its getting the idea, but we alll know both those companies arent trustworthy competitors to Apple, and the rest of them, including samsung, just dont seem to get it.
« You wouldn’t get it. » Joker Ive
It took Apple more than a decade to produce the m1 and people can’t expect the magic reveal every year let alone even every 5 years. Tech moves much slower than the marketing.
I just wish they had the courage to try different form factors.
@@bombombalu like what? What would be a good idea?
@@petarprokopenko645 I would immediately buy a 12-inch MacBook made out of matte plastic with more rounded edges. Even more so if it was a convertible with pencil-support.
I'm genuinely tired of aluminum. It's cold and hard and not comfortable at all. I loved my 2006 MacBook....
@@petarprokopenko645 An M1, M2 or even M3 12 inch Macbook would be awesome. I would purchase it on the spot.
It took Jim Keller to make the m1 magic, he's long gone and his legacy is now over.
Fun fact about the Qualcomm Elite X / Nuvia team.
"The founding trio of Bruno, Gulati and Williams were key high-level architects at Apple whose expertise brought fruition to many generations of Apple’s SoCs and CPU microarchitectures. Williams was the chief architect on all of Apple’s CPU designs, including the recent Lightning core in the A13"
But before working in ARM & Texas instruments.
so yeah, put better, nothing to do with apple hype, its just that arm is coming into age. i could do all of this on a raspberry pi 5 years ago for $35 woot.
I disregard 'fun fact' comments and skip them unread, just like yours. :)
@@FVBmovies You're clearly replying. I win.
@@snakeplissken8887 Still didn't read tho. Have fun with your facts.
Man. I think I started watching your videos many years ago. I already liked them then. I haven't watched a lot recently (the algorithm doesn't suggest them I suppose) and now completely finished this. Thoroughly impressed with sentence formulation, story structure and overall presentation. Good video!
First time encountering him. He’s highly engaging and very knowledgeable.
agreed! longtime catch up since maybe 2020 when I got my iPadPro!
I do my graphics work on an M1 Macbook Pro and I don’t think I need to upgrade within the next 4 years. We see what’s going on then.
I don’t need more compute power, I need good software and efficiency.
The plague of modern computing is bad software. Programs that were 1MB in the 90s and ran on a 486 are now 500MB and need a high end cpu.
Don't have to, rendering video length is within 10%, no big deal
optimization became absolute garbage in the last 10 years. Almost no application is optimized nowadays as software devs are able to brute force push the apps because of the huge leap in processing power.
@@jani0077 i 100% agree on this, gaming is a sore example of how much devs stopped caring about optimization and just push products out with barely any QA/QC
Yes, we need to stop turning every app into an Electron Chromium-bundled JS bloatware
@@jani0077 Its why apps on Windows (especially games) are bloated as fuck and make laptops last like an hour or 2 on idle alone.
I liked the Quinn beardvolution shot.
Came here to say exactly this ✊🏽
The biggest issue is that it's not a pc.
Its greatest achievement is that it's not a cromebook.
The Acorn pc used Risc chips.....in the 90s. Risc chips are great for using half the power and taking 3 times as long to do the same task.
2:00 clicky clicky
slow decline into becoming a hermit
He wanted simpleton brains to comment about it. You got manipulated.
You were wrong
So... When can we expect a Snapdragon hackintosh ?
I don't think that's possible unfortunately. ARM chips work differently with how the OS needs to be coded specifically for it, or something along those lines
Run Virtualization Framework in VN on Snapdragon, why not ?
@@lucasremI do not think it would be even possible, because of the hard software/hardware binding of apple. Imagine why no one managed to run iOS on android phone.
@@asinglefrenchfry bigger issue would be the SEP secure enclave, I would think as the M-series are ARM chips
Apple Silicon has a ton of custom features not found in any Qualcomm chip. The OS expects to interact with those features.
I am continually amazed at how quickly these "long" videos fly by. You are an elite presenter. Dare i say the best tech product communicator on youtube.
Thanks so much for your kind words!
@@snazzywent by in a blaze 😉
☘️
Truly dude, your funny, entertaining and have a superb outlook on technology and pick interesting topics too. You should have way more subscribers! @@snazzy
@@snazzy you totally missed the point WHY it isn't much faster. the reason is milking the tech that corporations already have. improving too fast drops older cheaper-to-produce products prices too fast. so corporations don't like fast progress which they do only when they forced by sales dropping down. now whith what they have they only compete with themselves, making faster chips will kill m1-m2-m3 products sales.
thats the actual reason why your napking math didn't work, everything else is just marketing faitytales from corporations to justify milking the market.
“Napkin” math needs to be on a cocktail napkin from your local bar 😊
A thing that should be mentioned is that it wasn't really an x86 vs ARM issue. Intel basically just got stuck at a bad node size for a long time as they fab inhouse. The Steam deck uses x86, but on a TSMC AMD chip so the tdp is just better.
Ya intel was stuck on 14nm for over 6 years seems to be stuck on 10nm now.
yup. In multicore performance actually Ryzens are almost on par with Apple chips. Its so strange AMD didnt try to add efficiency cores to their processors - it would be game changer.
@heroofjustice3349
AMD gets pretty close to Apple in terms of pref per watt, but only at the high end server stuff, and only if you turn off boosting. 128 cores of pure compute if I recall, that only sips 300W
@@heroofjustice3349They will be. AMD will be releasing a power efficient chip with efficiency cores (regular cores with less cache) by CES next year. So both Qualcomm, AMD and Intel to a lesser extent will offer high performance, high battery life laptops in 2025. Apple's advantage will negligible next year.
@@Demopans5990 Ok I will write again because link was caught by YT and my comment was not posted. So instead I ask you to put 'AMD Ryzen 9 7940HS analysis' in google and check first link from notebookcheck.
As you can see your comment is lie. Consumer grade AMD chips for laptops like 7x40u/hs or 6800u are almost as efficient as Apple silicon under load even with worse manufacturing node(6800u). Advantage of Apple Silicon lies in E-Cores. Ryzens have only performance cores and so under low load or idle those chips needs to use more power. Its just wonder of BIG.little architecture - power hungry P-Cores are activated and powered only when there is need for more processing power than E-Cores can handle. However AMD is already planning to introduce their own E-Cores and it should put Ryzen powered laptops very close in all aspects of efficiency(including battery life) to Apple chips.
Your point about fan speed is exactly why I love the M1, especially the 16". This thing is cool and silent no matter how many consecutive hours I work or play games on it.
It truly is remarkable.
@@snazzy I use for ME always an M1 MacBook Air, even when my whole team uses better Mx Macs. For Keynote Pages and 1x a month an short video export... I see no reason for an upgrade in years.
Like my Mac mini M1. It’s on 24/7 and nobody ever heard the thing, no matter what I did on it. Going to my son’s room and hearing the noise there of his windows PC always makes me laugh
What’s it’s temperature and fan speed when you’re playing games, if you don’t mind me asking?
@@syntex9673 None when crapple OS can't play any real games lol.
How did this age
one thing to note tho for when apple transitioned to m1, in the prev gen intel macbook, the heatsink LITERALLY didn't make contact with the die, this was found when LTT tried to improve the cooling with better paste and found that they had to drill out some of the heatsink in order for it to touch the cpu
And in the Early 2020 MacBook Air models, they added that ridiculously underpowered semi-passive heatsink that could only handle about 5-10 Watts before the CPU started to throttle...
I honestly wonder if apple made some of the pre-m1 computers worse on purpose to make m1 look good.
@@ZachariahConnor it sounds like an apple thing to do honestly. I wouldnt be surprised
@@ZachariahConnor Considering that they _do_ have some engineers employed, I can't think of any other explanation for this. But that leaves me with a feeling that they were actually caught by M1 surprise themselves--had they known that the Apple Silicon would be _so_ good, I don't think they'd have gone through the trouble.
"Please clap" 😂😂😂
I laughed out loud!
Poor ol’ low energy Jeb lol
I met Jeb once after that comment. He's so ashamed of it because it was never meant to sound so desperate. But at least he can laugh at himself about it. Really nice guy, but not my type of politician.
👏 👏 👏
Jeb
That didn't age well 😂
What you said is what AMD realized. "Wait... if we manufacture this part on the new node but these parts that won't get as much of a performance boost on these parts on the older node, we can save money!"
It's not just saving money, shrinking certain parts (like IO) further just didn't scale to the new nodes...
With chiplets, they have very small dies and thus the chances of major errors in them is much lower, resulting in way better yield or feasibility to make them to begin with. Meanwhile the huge-ass IO die gets made on a node which is mature enough to reliably make that size of die.
Well, I guess it is saving money in the end after all by making the production yield of their chips not prohibitively bad.
Yeah and then those chips alone idle at ~20W...
@@yarost12 There's Ryzen NUCs that idle under 10W.
There are also technical reasons why having parts such as IO on older process nodes is advantageous. It's not all about money but that is a factor.
@@yarost12 mine is at 5-8 W while watching this yt vid
ARM is 39 years old. It started with Acorn computers. So it's not an modern ISA and it have been drastically been added to in a similar way to as x86.
@anthonycotter1493 ISA used to make a huge difference in the 80's, by the 90's it was down to a few percent. In the 2000's the difference disappeared completely. If ISA was the end all, we would all be using PowerPC right now. A bigger difference will come from a memory controller difference than the ISA.
@anthonycotter1493 I think it is both and performance per Watt counts, that's the main reason the development of Intel x86 Android phones stopped. But yeah, I am also sure if a brand new modern instruction set would be developed, ARM would be defeated, also on performance per Watt.
Stop making me feel so old 😂
AArch64 threw out a lot of legacy features from earlier ARM versions, features that proved a challenge for speculation such as its stateful predicate bits and complicated instruction encodings. It also added some features that are generally useful at scale. Intel and AMD have also been adding some useful features, but they are only massively superscalar through absolute brute force. I don't think anyone doing chip design was actually surprised by the leg up the M1 had, even with the process node accounted for.
@anthonycotter1493It can be just a mistake but based on the writing, you are mixed up Instruction set and Instruction set Architecture. Two different things. One like its name is just the instructions, the other is the actual circuit design to execute such instructions. (Loosely speaking.)
With the release of the M4 series this week this video aged really badly.
if people are going to say that the nanometer nomenclature doesn't make sense, don't use it. Call it N5/N3/N2 etc which is the actual product name TSMC uses.
The main hope I have from competition isn’t new MacBook form factors (thought it’d be nice); it’s more realistic (read: cheaper) storage and RAM upgrades. The fact Snapdragon Elite uses NVMe over PCI should prove that Apple’s soldered-on SSDs don’t have any advantage; maybe they’ll use standard connectors, at least in Pro machines.
I agree with you! I also hope Snapdragon Elite X chips won't be hamstrung with non-upgradable RAM and storage.
Why does that leave me with the sinking feeling...that that's EXACTLY what Qualcomm will do?
Heaven forfend! 🤣🤣
Doesn't apple already support pci-e on their desktop mac pro with arm chips. Also thunderbolt is also over pcie if I'm not wrong. They just chose to hinder any upgrades.
@@jimcabezola3051i think RAM will be soldered due to the current speed and trace length limitation of ddr5, let's hope that new dell standard for ram gets better adoption.
but no reason storage needs to be soldered.
@@kushagraN Yes, you're right... I'm afraid you're right. The traces to a DIMM would be prohibitively long and slow. But...let's not make a base system contain LESS than 32GB, okay? 🤣🤣
@@kushagraN The present Mac Pro's PCIe support is only a baby-step in the right direction, yes.
They need to go much further, though.
This didn't age well lol
After 30 years as an Apple user, I think my 2021 M1 MacBook Pro Max might be my last Apple computer purchase. I am excited about Framework’s roadmap.
My biggest regret: thinking 32 GB of RAM was sufficient for my needs and realizing a swap for the 64 GB would require me to take a huge loss. My servers and PC have 128 GB…I must have been drunk when I configured that MBP. 🤦♂️
I did the same thing with the same background in computers.
64 GB... On a laptop? I can't imagine. What do you use all that RAM for? Gotta be some cool work you do
Age is making Quinn to do his videos from his chair 😀
Love your videos keep the work man!!!
When I see their review of apple vision pro I became fan of this channel and this video is great
I am loving this channel videos
Counter suggestion: he's recording enough of them at once that standing is just impractical to do in long bouts. I'm sure it's way more comfortable. Glad he can sit down.
I love that you mention how "3 nm" is mainly marketing -- too many people, including tech journalists, are misinformed about what the node number actually means.
switched to windows laptop with 64gb ram upgrade after using mac m1 8gb
Major fail
@@daytonaukpc9387Why? I'd rather a Windows machine than a Mac.
@@daytonaukpc9387 Yeah, 8GB is a major fail. I can't believe myself using only 8GB except if all I'm doing is web browsing and other light stuff.
16:15 I love Quin’s obsession with the 12” MacBook.
It’s unhealthy
@@snazzy I have a similar relationship with the 2009 13” MacBook Pro…
I miss that thing. I remember when the OG 11" MacBook Air came out and I was in love with it. Still am tbh
@@snazzy I'd argue it is the perfect amount of healthy!
@@nateo200 I remember when I saw it first time. I couldn't belive that something so elegant could be produced as working computer
The fact that the other brands are catching up with Apple is a good thing...
Can`t wait.
My man explaining chips in a Castro style 😂
So we're just going to ignore that JLC Reverso on the wrist 🔥🔥
I’m glad I wasn’t the only one that noticed it! 😅
I guess Apple Watch started make him feel cheap
Good spot! Do those things really start at about $7000 USD? Yikes! Any idea what model he's wearing?
...snazzy!
I've used Macs since 1990, but gotta say nowadays it feels like they only have a handful of macOS developers but hundreds on the iOS team... recently had to downgrade from Sonoma back to Ventura (on 2020 intel MacBook Air) because after the upgrade the CPU was running hot and the fan was constantly on. (Activity monitor not showing anything crazy). There's an apple forum support thread with at least 300 other people who had same problem. After the downgrade (not easy- required full wipe and install even with a Time Machine backup), fan was back to normal. The good news about these Qualcomm chips is maybe there'll be ARM hackintosh macs possible.
Darn, it would be so nice if M1+ machines supported egpus for local ML development workflows.
Yeah, pytorch and tensorflow are still very flaky on Apple silicon.
So the original team that designed the m1 now works in Qualcomm, so I think Tim Apple will keep overclocking chips to compete.
Hilarious if true
The thing is apple sold well even when they were using intel chips, which were hitting thermal limits fast, so I doubt people buy apple hardware for performance first. I think nothing much will change with apple, they rarely take any risks.
What Apple does best? Hardware? Seriously? Louis Rossmann would like a word. What Apple does best is User Experience essentially... their design process starts with things people need to do, then the build the hardware and software around that. Thats why people love their Apple gear so much.
Comparatively speaking it’s true though. Maybe not what Apple does best, but it’s one thing Apple typically does better than the competition.
Apple had good usability, not anymore
I’m offended I’m not a normal person 😅
Welcome to reality buddy. No, 20 years ago isn't the 80's, and yes, snake on a Nokia was the best.
???
@@VeritasVortex ¿¿¿
@@BeardyMike ???
@@magnomliman8114 ¿¿¿
I dread these snapdragon chips being locked down and inaccessible to linux users. The transition to arm would spell trouble for people who want literally any freedom over their hardware whatsoever.
Ashai linux works well, at least it did last time I tried it.
@@ULUnLocoThey could do some verified boot without an option to change that. So maybe Windows only.
That would be very ironic. Moving from proprietory to open source architecture and loosing the option.
@@kushagraN what architecture is open source?
Isn't RISC-V an open source ARM, basically?
This aged well
Ngl that fan audio made me start looking around my room going “wheres that fan sound from, my phone doesn’t have a fan and my computer isnt turned on”
Do you also keep a gun next to your ancient printer
It gave me nightmares of using my Win 11 laptop (Chill Windows Im just using Excel...)
As usual, title makes me think it will be boring video, but the I see your beard, and I start watching, and cannot stop till the end. That's magic. I do not even have any apple device 😅
Quinn, I love you for carrying the torch for the MacBook 12”. Why this hasn’t been resurrected is beyond me.
I just want Apple to release a Mac Studio with the trash can form factor
Trash can mac is life
Great video going as deep as I always want these videos to go.
Excited to FINALLY get an M1 competitor. I'm a Linux user who has been waiting for 4 years for this!
Linux being supported would be the only reason for me to switch from my M1 Pro MacBook Pro to any other device, that's close in efficiency and performance. No x86 system is. And sadly Asahi Linux still doesn't cut it on Mac.
@@igordasunddas3377what kind of issues are you running into? or is it known limitations like no external monitor output on a MBP?
4:47 there is a 4th option. You can improve the algorithms inside the chip for more performance at the cost of more transistors
The problem is that the M1 made intel actually finally "wake up" and resume actually investing R&D on x86 arquitecture, and AMD followed suit. This brought innovation back into x86 that was basically stuck for years, and this innovation quickly outrun anything that Apple was trying to do with Apple Silicon. The new intel Core Ultra cpus and Ryzen are now more performant and with better efficiency from x86 than Apple Silicon with ARM64. And the x86 is still a better arquitecture for a lot of things. And now that intel and AMD are back in the fighting ring, x86 will just continue to get better at higher-end performance points
Apple R&D and launch cycle for the M1 happened during the same time span that Intel and AMD R&D'd theirs, AMD had already planned on a new socket and Zen4 architecture, Intel was catching up on AMD's lead with chiplets with their efficiency cores, Blue and Red didn't wake up following Moneybags' M1, they just happened to have parallel'd it
Well AMD being able to do anything is special in it of itself. I still rather have AMD in a framework laptop, but I'll take a SD Elite in a Framework if they make one and it is easy to run my Linux stuff on it.
Meanwhile, my Samsung Galaxy S24 on Dex works great too.
I agree, let Apple keep their ARM and let some windows laptops have it too because why not. But that does not mean that x86 is obsolete and shouldn’t be improved further.
Yes it did. You don't think other oems get internal leaks from their counterparts, do you? Sometimes even some contribute to other's success hardware-wise@@thegeekno72
My M1 Max Macbook Pro is still kicking ass as my work laptop
One of the best!
I have an M2 mini and an M2 Macbook Pro that I don't really use anymore. I still use my Hackintosh 2x E5 2699 v3 with a 5700 XT slapped on and 256 GB RAM for my editing and daily use. I only take out the laptop on the go and the M2 mini ...I should sell that, I don't even know when I used it last. Yes, I use a lot more power, but restoring historical photos on an Apple computer would probably mean I would need a Mac Pro 7,1 to even come close to my Hackintosh. And we both know I can't afford one :)
In a word, Proprietary. Hook line and sinker. I didn't like the fact that it was not easy to get under the hood and tweak things. It was nice "machine" but when things did not go well it was like trying to fix bits in a modern car.
Is today Fidel Castro's birthday?
Give him a cigar lol
I dunno, let me ask my Humane Ai Pin. I’ll get back to you next week.
@@snazzythey should just connect it to people like the Amazon cashier. 😅
As a Cuban i have to say: is not funny 😢😅
Trudeau celebrates it every year.
¡Papi!
These are the videos that make snazzy labs a unique channel.
Keep going!
I moved to M1 from Microsoft laptops and it was a new world. It changed my perspective on things. Windows laptops never performed well on battery and it didn’t matter about performance when the thing was dead after a few hours. Just being able to use the computer without battery life anxiety was massive. The M1 still feels new and works great and so if it isn’t broke does Apple really need to push the boundaries just yet? If my M1 broke today I will want to buy another MacBook. I don’t have any outstanding issues forcing me to need anything better. Anything more powerful. It’s all just icing on the cake at this point for me.
The biggest problem Apple faces, aside from the absolutely ludicrous pricing of their upgrades, is that they insist on using monolithic chips. There are tons of reasons why everybody is moving away from them, and rumor has it that Apple is moving to make even larger chips. Absolutely ridiculous.
"The Intel Mac is the best! The Intel Mac is magical! The touchbar makes it worth $1000 extra! The Intel Mac is the best Windows computer!"
...then M1 comes out...
"The Intel Mac was awful, it overheated, it was an engineering failure, luckily there was a savior! Apple always saves me from the problems that they create!"
11:00 "everyone's natively gonna move on to ARM anyway"
*cries in stellar games from 2000s and 2010s*
dont worry, ARM evangelists have been making this claim for decades with little to back it up.
Gaming is too big of a hobby and with gaming crash looking more likely people will find a way to run older games.
@@shade221 I can't see FEM/FEA and CAD moving to ARM - not for a long time, at least.
Are you crying because you can't play those games anymore? I'm sure that future ARM chips will be able to emulate those just fine.
@@Mastermind12358 lol the emulation will be garbage and you know it
I guess this didn't age well...
It really didn't, the efficency gains are insane. 😂
Of course it didn’t, dude has no idea what he is talking about.
Just 1 day later - what happened?
I'm out of the loop, why didn't this age well?
I was an Acorn Risc PC dealer in 1991. The original Acorn Risc Machine StrongARM CPU that ran Microsoft Windows 95 in a RiscOS window. I told you then that ARM and RISC are the future not X86.
They hit thermal wall faster than we were able to forget that intel has this problem too.
It took Intel 40 Years to hit a Thermal wall. It has taken Apple 4 years to hit a thermal wall.
@@granttaylor8179 In what way did it take intel 40 years? For longer than I have been alive processors have needed cooling and been hitting temperatures upwards of 80c.
@@definingslawek4731it was only in 1992 that any form of passive cooling was needed
@@definingslawek4731 It has has only been since the 486DX2-66 of 1992 that any sort of passive cooling was needed on processor.
Intel has been making processors for over 50 years. Apple has run into similar issues a lot sooner than Intel did.
They cannot keep shrinking the die process as that has been done for the last 50 years and it is reaching the current limits.
@@definingslawek4731 truth
What about the so called unpatchable security flaw found recently in the M1, M2 and M3 chips?
Tell us more about this please
These things are almost never relevant to general users.
This is the most informative and entertaining tech channel in existence! Awesome!
Apple is still 4 years ahead of everything out there an make the fastest processor out there.
Snazzy Labs is over!
Love this video! damn its so nice. thank you for great explanations and effort!
I own M1 MBA and MBP.
They run beautifully.
cap
17:08 "The M1 series will be remembered as some of the greatest computers in history" Maybe so, if only they weren't designed to be history in so few years.
I doubt they'll last any less time than any other laptop ever made.
@@definingslawek4731 Last, yes, they just won't be of any use. Landfill. I still have operating and useful PCs from the 90s, my mid 2010 MacBook still works, how many M1s will be bootable in five years? None can be given a second life. Mac resale values for Apple silicone models will hit a cliff, something that has never occurred before with any Apple product.
As an M2 max owner, my m1 macbook pro (13 in, not m1 pro) is already garbage
@@octav7438 How come? I dont own any apple laptops, but the cpu power should be really good no? Or is it GPU power u are lacking? just wondering
Huh? M1 laptops are still going strong today and likely will be for a good while.
The real Apple Silicon secret sauce is the big process node advantage and low frequency, high IPC cores. Also the Media Engine and AMX, but not "Unified Memory" or the Neural Engine. It's not even used most of the time. And unified memory has been a thing on x86 for many years.
Finally someone pointed out Apple pay-to-win with TSMC for the whole time, and not pixie dust that makes the Apple M-series significantly faster in comparison to others.
That's not at all what was said, though. It's *an* advantage, but far from the only thing making these processors as good as they are.
Well I dont know if its fair to say the M1 is the last time they took a risk. They just released the Vision Pro a few months ago.
His opinions are just random nonsense. Dude has no idea what he is talking about and neither people who agree with him.
I cant tell what is the main point of this video or what his problem is.
Just 5 years ago no one would be doing the intense editing you do on any laptop, this is what desktops were for and they're now becoming obsolete. Sometimes you have to take a step back and look at how quickly all of this has advanced, and it's incredible
9 years ago I was doing HD video mixing on a laptop. Also full multi-track video and audio production.
But not a Mac because those would overheat if you ran Handbrake for more than a few minutes or tried playing more than 2 4K videos at a time.
PC manufacturers should have used the slogan "Don't worry, your Mac will be able to do this in like 6 years."
They should make mac mini / iphone hybrid. Basically an iphone with m3 or m3-lite chip that is able to connect to a dock station (monitor, keyboard, mouse) via usb-c abd offer a mac mini experience.
They would do this but Apple doesn’t have the same luxury as Microsoft in that they have competition within their ecosystem. Apple would self cannibalize their other products if they made something like this a viable alternative. It’s the same reason why the iPad still can’t run desktop apps despite have the same chips, they just don’t want to risk making one of their own product lines redundant. How do you get away with selling $3,000 dollar laptops when a sub $1,000 iPad can actually do everything you need it to?
@@brandondapro They can make it cost as mac mini + iphone combined (2k+) and it will still appeal to some people.
@@brandondaprothis is already the case with the MacBook Air… TH-camrs are always suggesting the Air for the “average” user who isn’t doing heavy productivity. The iPad Pro/Air can already fulfill this task for the majority… The MacBook Air is a pretty scammy product when you really break it down.
The problem with a comment about the iPad as a substitute doesn’t work for me. Well, let’s say it would/could work for me if the iPad didn’t have a crippled OS.
Having something like Samsung Dex for the iPhone would be glorious. This is the absolute dream for me. Just let me carry a single device with all my files and apps and let me dock it when I need proper peripherals.
Apple M1 magic was the specialized tests they used to benchmark results 😂
First video of yours I am watching. Throughly enjoyed it. Thanks! Great content!
Apple making a gaming Mac. LMAO!
How many thousands of game programmers are in need of work today?
Apple is the biggest player in the world of gaming, only a few know this.
i am talking $ wise
I doubt that 😂@@JohnSmith-pn2vl
Instructions unclear bought qualcom stock
“Send me a g and we’re cool” LOL
TLDR: The competition is getting better.
TDLR, there is no competition anywhere to be seen. not even in the rear view mirror. Apple has no competition.
i don't know any Apple users that cares about anything Android, windows or anything else.
just like Apple doesn't care what others do.
the only ones are cuntent creators who are in dare need for anything to talk about.
so we have synthethic meaningless benchmark comparisons, at the end of the day, nobody cares about the ssd speed of a base model.
the camera on the front of a laptop, the throttling under max load etc.
but still, here we are, youtubers & co braught the useless notch to macbooks, cpu manufacturers optimizing their chips for benchmarks instead of customers needs, now a base model has faster ssd, nobody ever actually needs, all this nonsense that doesn'T make anything better but everything worse.
because content creaters are wannabe's at best, no real skill behind, everything is entertainment.
6:51 N3B, not N3E.
Great video! I've been thinking about these topics/questions for a few months now. It seems like a lot of hardware chips are getting bigger recently, so obviously transistors can only get so small and we have been scraping the ceiling of those limits for a while.
I own an m1 iMac and it sits mostly unused not because of the m1 but because of the awful Mac OS Sonoma.
whats your gripes about the OS?
You can downgrade!
@@exzld Beyond OG OSX and perhaps the next couple of major releases, each release takes a retrograde step in terms of usability.
Originally: A lovely, clean, intuitive user experience for daily driving. Administrative features were all in sensible places - where you'd expect them to be.
More recent iterations have had changes for the sake of change, made worse by nobbling some administrative features, as well as arbitrarily moving other admin features to random places for no good reason.
Perversely, the biggest competition ("Windoze") have been progressively making their offering somewhat easier to administer.
@@baronvonschnellenstein2811 windows easier? no they are making same mistakes for years now
@@baronvonschnellenstein2811agreed. Imo High Sierra was the peak. Everything is downhill from there
Your Number 1 just isn't true, Modern X86 CPU's run off Micro ops and translate everything internally into small instructions, ISA is not an advantage of ARM
This comment here. The ISA doesn’t matter, especially the idea that legacy support is a significant problem.
Doesn't matter point is x86 chips can never be as efficient as arm chips in battery life and efficiency if they were then we would use them in phones and mobile devices not including laptops
@@Space97. that's not true, there ain't any that are designed for phones. If you stick an M3 in a phone it won't be efficient ether. ISA has nothing to do with how efficient a CPU IS.
@@Rocman76 question why do you think where were able to put no fans on some of their macbooks and it stayed cool and quiet because with intel that would not have been possible I recommend you watching videos on x86 vs arm and they all say one thing and that's efficiency
@@Rocman76 also I'm talking about the arm architecture not a certain chip it self
Every time I watch this channel, I remember that I forgot how much I enjoy watching this channel.
Apple silicon was never magic. The magic apple had was exclusive 5nm and 3nm production process allowing it to cram more transistors on chip, but others are catching up. If apple was smart it would have never included a large bulky Neural Engine and instead used that space for a 64 bit Intel compatible processor. This would have allowed it to still do Neural kind of stuff, but allowed the machine to be Windows backwards compatible for companies whom need to run software that is only available on Intel Chips such as Solidworks. Like Tesla, Apple does not listen to customers.
Well that didn't age well. Two weeks and everything is wrong.
My thought exactly. I didn't see a heatsink on the back of the new iPad.
Just typical Click bait for all the anti apple folks. He knows his audience.
Apple silicon never had magic. The amazing performance was because Apple released those benchmarks and no one else could do their own yet. HandBrake testing proved that M1 chips didn't remotely live up to the expectations.
Try compiling code on an m1 VS a latest Gen i9 macbook. Shits crazy.
Uuuuh... the M3 is based on N3B not N3E, N3E is the one with the SRAM scaling problem, N3B is better than N3E on that regard. 😅
Yea I spotted that error too.
Freudian slip. It was written properly in-script. Alas, the point stands. Sorry.
N3E is more efficient, according to my research, so for the ultrabook market, it would be a better chip.