Many of us predicted this years ago. I thought it might happen the moment is saw the iPhone 4S which was maybe 2014? The jump in processor speed was so significant, I knew this day would come.
TV is dead. There are so many world-class documentary-style channels on TH-cam! I just recently discovered a new Channel, History of Earth. The production quality is as good as TV! th-cam.com/channels/_aOteuWIY8ITg7DQQspG1g.html
what is a "TV program" lol? Seriously, the last time, I have watched a tv program was in 2015... I cant even watch it if I want to, excluded it from my cable ISP to save money...
"regular TV" has been dead to me for several years now. If I do need one of the major networks for sports, that is free over the air. I only pay for internet now.
But i kinda have this fear that with the transition for computers from x86 to ARM, that everything would become very locked down and we would have less control of our devices... Especially how Apple is now leading the transition, and they are well known to love locking down hardware and make it difficult for their users to do whatever they want with their hardware.
I love it! They got complacent and put the bean counters in charge. Its time for the engineers to lead the charge again. Any competition is a win for the consumer, I'm hoping they come back swinging, because it just means that AMD's utter domination and Apple's next round of silicone will have to compete even harder, giving all us nerds the wonderful excitement of having nice leaps and innovations again!
Though I retired from the semiconductor industry 20 years ago, I still follow the technology and when I first got my hands on an M1 white paper, I must admit I didn't believe Apple could pull it off. Kudos to the ASIC Engineers who did this...absolutely unbelievable!!!
@@m.design The world's largest pure-play semiconductor manufacturer. It's no exaggeration to say that they are one of the major driving forces behind computation of the modern world. They don't design products themselves, but instead take orders from fabless companies and make it for them. Throughout the process, TSMC's engineers provide their input on the designs of their clients' products. You can imagine it as a back and forth group project.
@@m.design TSMC is the Taiwanese Foundry that manufactures the M1 & A-series chips for Apple using the 5 nm process. They are essentially a contract manufacturer that makes chips, and account for nearly 50% of the global chip-making market share at the moment. Along with Samsung, they are the only other manufacturer capable of producing 5 nm chips commercially today.
@@dwyk321 you only need to pay for the internet service and maybe view some adds every now and then, the amount of value you gain in exchange is mind blowing
Very late to the party. Only found this channel a month ago. Such great work. I’ve almost finished all content and look forward to what’s coming next. Amazing channel, describing amazing people and events.
yup, I have an intel distributor code and they give us "points" if the importer sells more (for years now) their new strategy? they duplicated the points. lmao what are they thinking?
Apple has been making industry-leading chips for quite some time now. The only thing is, they were in iPhones so far where you can only do so much on the small screen limited by iOS. Now that they are on laptops, the full potential is finally being unleashed.
@@well7885 but they can never reach the power of a desktop chip. X86 will continue to rule the roost because of the graphics card rollercoaster it rides upon. Windows PC's will continue to crush Macs for the foreseeable future because they are the cutting edge. Not ARM, which is only good for mobile applications.
@@ag3ntorange164 this comment is gonna age so bad. Go check out the reviews of M1 Chip which are the lowest end laptop chips and already has the best single-core score ever. I request you just one thing, come back to see this comment in 2 years and you will realize how awe-fully wrong you were.
9:28 - "Simplified CPU's, such as ARM, generally will do one single instruction per clock cycle. While desktop chips may use many cycles to complete one complex instruction. This means more power consumption, less efficiency, and more heat produced." ...No. Both types of CPUs mostly use simple instructions that take one clock cycle. However, desktop CPUs offer many extra instructions that the ARM does not. For an ARM (RISC) to perform equivalent complex tasks, it must string together long sequences of instructions. Not only that, but a desktop CPU is loaded with specialized hardware acceleration to make it's complex instructions execute even faster. This means the RISC requires far more clock cycles for complex tasks. RISCs are successful because those complex instructions are rarely needed. Usually, CPUs just shovel data as fast as they can. And in that contest, a RISC wins easily.
Thanks for pointing this out. There’s another benefit, equally important. In order to explore instruction level parallelism, pipelining fixed length instructions (such as the ARM ISA provides) is much easier than the x86 variable length instructions. That’s the reason intel started decomposing those in smaller fixed length ones (microops) and reassembling them back into a normal x86 instruction ever since the Pentium Pro era.
My understanding(limited) each intel core actually is doing between 2 to 4(and in rare cases 8) dflop (double precision floating-point operations) per cycle because each core actually has more than one execution unit. I think the thing that is really telling in this video is the fact that they say they can't even tell a speed difference between different software that is compiled for different Hardware such a thing is not a sign of amazing Hardware it's a sign of really bad compiling and or programming.... Lot of this is just silly I mean my laptop is a 8 years old and still has more than 10h of battery life and its a fanless I7 and is plenty fast enough for me now granted it was like 6k new, so this is really a question of affordability not ability... and affordability and Apple not synonymous neither is reliability
That's not really true for AArch64. It's instructions are similarly complex than x86 instructions, except it lacks some of the legacy parts. The main difference is the lack of memory operands, but that's more than compensated for by the larger register file. Both designs do in fact look very similar internally and both the number of instructions needed to achieve common tasks and the number of instructions available in the instruction set are comparable.
@@newguy3588 look, it's one thing to have a brand that shows your status. It's another to have a good piece of hardware. To have both should impress you. Even I was just dismissing apple as a fashion brand. About time they made me look in their direction without disgust on my face.
Apple played even more fundamental role in the development of ARM. In 1990 a joint venture between Acorn, Apple and VLSI gave birth to Advanced RISC Machine Ltd (later renamed to ARM Holdings). The first Apple product featuring an ARM processor wasn't the iPod as claimed in this video, but Apple Newton in 1993 - more than 8 years before the iPod actually.
I know of one difference in architecture between the early Motorola (used by Apple) and Intel was the Intel chip had to outsource the math function to an external chip, then wait for the answer. This was time consuming and generated extra heat. The Motorola (Mac) chip had this function built-in. Just this one difference made the Mac graphics way better than Intel's. I also heard that originally, IBM wanted to use the Motorola chip in their first PC release but Motorola couldn't manufacture enough chips to satisfy the order, but Intel could. Can you imagine how that would have changed history?
I lift my hat for your professional. You provide a content which is very interesting and the way how you present it is very academic and professional. There should be more content like yours nowadays. Thank you Mr Altraide
Honestly yeah, where's all the graphene tech? It was all the hype 5-6 years ago and then nothing. I know they were having issues producing in mass but thought they'd figure it out by now
i think with batteries its more difficult because by laws of physics, we are reaching an energy density that is just too risky to be portable. so i think batteries will stay on the same-ish levels that they are now, but as we see with the M1, the components still have much optimization potential. once we can build screens, processing units etc which run with minimal heat-generation, a laptop with a present-day battery could run 40-50h easily i think. I hope i am wrong though, and there is a way of making a high energy density battery that just by laws of nature cant discharge an any explosive manner
@@amarug Theres probably a way to make high energy density battery unlikely to fail catastrophically, at least unlikely enough to make it an acceptable risk for the convenience, people still use cars even if they are deadly to use sometimes. Basically you don't need to reach 100% safety, just close enough. Another way to reach a similar amount of convenience would be to have fast charging battery, wouldn't really matter if you only had a few hours of peak performance if you could just stop near a power source for less than a minute. Maybe if you were going off the grid then it would matter. Then again transferring large amount of power in a relatively short amount of time probably has its own safety issue but at least you are unlikely to be holding the device in your hand when it explodes. Or maybe some sort of universal battery combined with an easily swapable battery so you can just switch when depleted. A little less convenient than the fast charge since you probably cant bring spare battery everywhere, but i do have a friend who already does that. But yeah it might be easier to just make stuff more energy efficient.
Compatibility: to be fair, there are _some_ apps that have trouble running on M1. But their number is very limited and it's very specialized software (e.g. some images for docker)
developer issue..... they exist on Windows too, its not just Apple. If anyone played Mahjong on PC, you'll know there are graphic issues if you don't change the resolution,. Experienced that when we upgraded to Windows 7 back in the day. It IS a XP game after all, so probably expected.
In my computer architecture class, a common question was, which is more recent, CISC or RISC, and almost everyone got that one wrong. People naturally assume whatevers more complicated must be more "advanced" or newer, when no, it turns out the innovation isn't making it more complicated, but doing more things with less instructions. Thats what the innovation is. That lesson ought to be applied to lots of things :)
Yes yes and yes. I always say, the true wise man knows how to take complicated and abstract concepts and simplifies them to the point where others can understand them.
It was first when RISC Instructions were simple than cisc.. it's like with few instructions you do same job..but to be done the job..the job. Itself is fragments in many instructions that with few commands it be done...in inverse with cisc instructions...they do job step by step with to much commands to the processor that overload and consum much power.. .... In paradox nowadays RISC Instructions are too.much complicated to fragments a Job to done with fewest commands to processor it's possible
CISC made sense back when compiler optimization was primitive and it was assumed that all complex programs and OSes would be written in assembly,. Back in those days, an advanced CPU was expected to minimize the "semantic gap" between human programmers and assembly. So, the designers of the day weren't dumb or lacking in innovation, they were designing for a different goal.
This is why its important to have competition. Intel has been at the top of the food chain for many years but recently AMD has now caught up and you can even argue that they're more performant/efficient. And then there's the ARM chipset, spanking both chips. I'm pretty sure Intel is shitting their pants right now
@@Yeet42069 lol my first thought exactly. AMD is crushing Intel and it's showing in every test & benchmark. Intel's been great, but AMD cleaned em out this last year.
Fun fact: the number of instructions in the ARMv8 architecture (incl. ASIMD) is similar to that of the Intel 64 architecture including everything up to AVX2. It's not “bloat” from “too many instructions” that makes x86 CPUs harder to scale, but rather factors such as the more complex instruction encoding, intertia from Intel having ridden out the same microarchitecture since the PPro days, differences in the memory model, and better memory latency on tightly integrated SoCs vs. systems with discrete memory and GPU. Being able to design a whole system more or less from the ground up gave Apple the ability to sidestep many of the design constraints that hobble the existing Wintel ecosystem.
To highlight the folly of thinking “ARM is RISC, so obviously it's better than the CISC x86,” consider that the PowerPC architecture previously used by Apple (before they switched to x86) is also a RISC architecture, designed according to very similar principles as ARM. And yet it was ditched in favour of x86. It's not RISC/CISC that makes the difference, but rather how the CPU is designed below the instruction set layer.
This single video is far better than all the other reaction videos combined.It gives more insight and in depth analysis of what actually happened how it happened.
I've been a PC enthusiast all my life till i wanted to know and bought my first Apple Device, expensive, overpriced, greedy, slow, that's what i thought. Well, i was wrong. best decision in my Life, changed everything.
@@JohnSmith-pn2vl Similar story with me. I lived in Seattle, a couple of buddies were Microsoft engineers that worked on Windows. Loyalty required staying on Windows until I was forced at a new job to use a MacBook Pro. After complaining for a week or two learning to use the Mac, I realized that for me it was much better. That was 15 years ago. Since then I only use windows when a specific application requires it.
I work with both a MBP and a Windows PC. For the same price the windows beat out in performance for what I was programming. However, this M1 chip has blown me away, and the new MBA, I believe, can beat out 99% of windows laptops. This new chip is powerful.
Same. Loved Android hated apple, made the shift to iPhone 7 because one plus 6 wasn’t available and I needed the device urgently. It’s a really good phone IMO. I’m also interested to shift from Windows PC/ Linux Laptop to a Mac Mini.
I used to write ARM assembly 20 years ago, making my software about 5 times faster than using regular C++ compilers at the time. Seems like this is a skill I might pick up again for even more screaming performance. The ARM instruction set is one of the most beautiful instruction sets I ever encountered on a CPU. So much nicer than Intel 386+, and even more elegant than Motorola 680x0. it's a great future ahead for computing!
Assembly was always screamingly fast be it ARM or any other. There always has been nothing as good as direct assembly code. It has just been less necessary as the chips got faster and there was more memory for lots of everyday applications. It has also been complicated by needing to have your software allow for different peoples machines with different software and hardware setups especially where there are multiple manufacturers -ie non-apple. But I agree re native power being unleashed by assembly.
I had done a bit C64 assembly back in the days, and a bit AVR 8-bit code during studies. But when I came to my first ARM7 project, I remember that I was also stunned by the elegance of the instructions, e.g. the shifting bits. I felt that this was a completely higher philosophy of thinking, and I liked their approach from the first moment. Now I have just bought my first Apple Computer, and as a Linux guy, this was entirely because it's just great technology.
Well of course no widely-produced Instruction Set Architecture (ISA) is as ugly and outright *klugey* as Intel's x86, but... is even the latest ARM ISA as sweet, and so completely *orthogonal,* as that of the ADSP-2100 family? Admittedly a fixed-point DSP, NOT a general purpose CPU, it can be used as a microcontroller in many cases, and almost makes you want to eschew the C compiler and do it all in assembly.
@@tarunarya1780 this is not the whole truth. C compilers will take the CPU Architektur into account while you have to be extremely knowledgeable and have to spend a shit tone of time to get to that point. Your average programmer has no chance to get close to a good c compiler.
@@domenickeller2564 I think you meant the comment for sci fifaan. I think compilers are a good thing, and lots of projects would be too difficult and big to manage in assembly, never mind cope with vast amount of hardware out there. Even programming in VBA can seem overly tedious. Role on AI. There is a role for selectively doing some parts of programs in pure assembly depending on the speed of operation and bottlenecks faced.
Its the future, I say this with confidence. Especially now we've got the M2 chip. I can't wait to see the day we get native chips from AMD and Intel which are ARM based, perhaps even custom chips direct from Microsoft and other companies who have also licensed from ARM.
Z Rus: At the end of the day, the customer wants to talk about cost, convenience, power per dollar, and getting things done. Intel can play games with words all it wants -- once Windows 10 is no longer supported in 5 years (barring something new from MSFT which I don't expect), why will people want to buy many Intel chips? Essentially every PC, phone, gaming console, etc. will be using some form of Linux. And old guys like me who want the old Windows apps will buy used X86 machines for cheap, as I've already been doing for years. When I can get a GREAT Laptop for about $150 which is easily maintained, (or a few for $4.500ish, and lots of spare parts or redundancy if something fails). I just don't see Intel selling many chips, unless it finds new markets.
Windows and Android user hear and the M1 chip is on a different level but bought my wife a I pad air M1 chip and I was so impressed I bought myself one it's amazing having a tablet that has the performances of a £2k pc for £900 and its the best for cheap video editing... And it was only after watching this video it got me interested and it genuinely works I can't believe how fast the M1 chip is ...
Comparing the 7700K with the 2700K is just hilarious. Both are quad-core processors. The 7700K (2017) has a mere increase of 30% more power compared to the 2700K. (2011) Intel needed almost 6 years and 5 generations of processors for such a low increase! The 7700K is furthermore famous for its heat generation and power consumption which makes the overclocking potential limited even with a strong cooling system. The 2700K on the other hand had a lot of overclocking potential and easily reached the same level like a stock 7700K with a decent cooling. Then Ryzen came out of nowhere and Intel was finally forced to stop its politics of milking their for many years unrivaled processors. And as far as we observed it in the past few years they weren't even prepared for this case.
@@JackoBanon1 yea 2700 was my last intel CPU. i skipped then i upgraded to amd last year. and i have complained that there is no innovation in desktop world for the past decade. i hope servers would use some arm chips soon.
@@JackoBanon1 This is why competition is good. Intel has purposely held back their big changes (such as shrinkage from 14mm) since they've had no real reason to make big advances. People buy their stuff every year even with 5% increases, so they just cruised along. AMD gave a solid push in competition but Apple is gonna be the big change needed to get Intel to actually improve.
Honestly that was the same thing my project group thought when we took the assignment on making our own 16 bit risc cpu, but it was surprisingly simple! And we all got the second highest grade on our project exam. But I assure you Apples chip design is much more complex than ours were 😛
Time here, the M1 is still more than enough for 80% of the market out there. The better the base model chips get the less people need to upgrade real world, thus less recycling and waste. M1 changed everything and we're living in the greatest times as a Mac fan. Can't wait to get my base model m4 with 1tb 16GB ram for......yes $1200 (including Apple Care). Insane.
also means unethical marketing strategies, slave labour, releasing multiple models which are basically the same thing but with ever increasing prices and damaging the environment for greed. Etc
There is no quality in just references to other TH-cam channels, no unknown footage or stories, no new ambient music, no nothing, where is the quality you are talking about?? for me what I can see is that Dagogo is running out of money to pay for writers and content creators just like the most of influencers out there. Welcome to the real world.
@@thisisntsergio1352 no its not.. amd beats it in cinebench benchmarks, kido.... also any new ryzen cpu laptop with an nvidia 3080 completely stomps apple everything
@@dertythegrower Apps right now are running on emulation but M1 beats Intel. After a year or so M2 will be release and it will be faster than AMD. You can tell by the graph it is increasing 300% each year.
It's kinda awesome that this stemmed from Acorn Computers. I used them back in the 90s in school and just kinda thought they disappeared into the void, but their legacy still live on today, which is cool!
Clayton Christensen died earlier this year and didn't get a chance to see one of the largest companies completely disrupted - something unthinkable when he wrote the book and set out exactly how it would happen 20+ years ago.
@Nida Margarette Most of them focus on the same synthetic benchmark and exporting video use case. Does e.g. Zoom use AS specific hardware extensions for video decoding / encoding?
Intel isn't going anywhere. While they still stay on ancient 14nm process while AMD, Apple, Qualcomm and basically everyone else use 6nm to 8nm produced by TSMC and Samsung, Intel is still competitive and manages to squeeze more and more performance per core. Imagine what would happen if Intel just moved what they have to 7nm TSMC node. Well they are actually considering doing just that. At the same time typical Mac workloads don't require that single core performance Intel was chasing. So it's not a new paradigm of PC hardware - it's a new type of PC which was actually pioneered not by Apple but by Surfaces and Chromebooks which used ARM long before Apple. Yes this new type of PC will be better at those types of workloads and will take some of Intel's market share but it's just a better option for that kind of tradeoff sacrificing single core performance for power efficiency. But good luck playing Guild Wars 2 with its heavy world thread on those new M1 Macbooks.
@@AntonMochalin What are you talking about. These chips are beating everything hard in single core. They are the fastest single core cpus you can get. Only the high end intel/amd can beat them in multicore (because ton of cores lol). This type of pc will be better for 99% of users. There may be an application for cisc elsewhere, but it will not be mainstream computing.
Needing a more powerful computer isn’t relevant to everyone. But having more efficiency and less power consumption is. And it’s something everyone can notice. This is deeper than it sounds. The majority don’t need extremely powerful laptops.
Intel is not lazy , it is just technology get to a stage that it is difficult for one company to do it all . Intel design their chip as well as manufacturing them is no longer a sound business model
@@RaskaTheFurry dude if you watch the video you can see apple been doing the whole arm cpu thing for a while and we just didn't pay attention because intel was better. but now that's not the case so it's the thing that is neat.
A lot of the binaries will already be compatible, but there's no reason not to recompile them with more chip-specific optimisations. When you say 'use chip direct' do you mean be able to use the chip as it is? I think there will be code signing issues if you want to boot directly on the hardware, but I don't know the situation with recent Macs.
Depends, some stuff they were the inventor and others Jobs just had a vision that no one else had and knew how to market it. I'd give them credit regardless. Really doesn't matter anymore because if you're not half way knowledgeable about computers then people tend to be sheep and you think anything with the Apple logo on is the absolute best.. even if you not quite sure why.
@@Irfaan16 Because Apple knows how to make money. That doesn't necessarily speak to the quality. Things being copied, for example, is how to excel at planned obsolescence.
@@mauriciosl Microsoft had been playing around with some of their components in Linux like DirectX. Putting DirectX on Linux is just one huge leap to allow portability across platform without having to rewrite just for OpenGL stuff that went stagnant.
Same. Using iPhone + Linux + Windows. Keen to replace my Linux laptop for a Mac but that’s it. I’m not replacing my windows pc because I’m comfortable with it.
I believe in better with sensible price... 1000 dollor wheel is rediculous while this year's iPhone is kinda must go phone because Apple finally gave preference to rugged over fragile for the sake of aesthetics. They raised frame so screen couldn't come direct contact with surface making it long lasting while android OEMs make screen up on frame and call it 2.5 glass. If you drop it glass is done.
I have been 100% on windows/android for the past 10 years. I am impressed with Apple's innovation over the past couple years, that drive forces other companies like Microsoft and Samsung to also innovate and I am all about it.
Don't believe Apple's claim lol. Leaks have shown that M1 performs well compared with Intel-based Macs but only performs half compared to Ryzen 7 4800HS and Ryzen 5 3600X
@Bokang Sepinare In actual real-world head-to-head tests the M1 is a "middle of the road" capable CPU, not underpowered by any meaning of the word, but also nowhere near the fastest or most powerful.
My ryzen 3600x reported package power 34W while watching this video even though CPU is not doing much at only 3 or 4% usage, , while GPU is at only 15W, 16% usage, doing all the work rendering youtube. Although i noticed that GPU downclocks itself to minimum when rendering youtube while CPU jumps to maximum 4.3 Ghz.
Another way to look at it: how much free computation have we gained from heating homes and offices via computers? Your perspective is valid too, but in the end a lot of it balances out, especially in countries with severe winters. In the end, technology like this helps us all by giving us more choice in how we do heating. Eventually household heaters will be replaced by computing devices and cloud computing will be farmed out to whichever countries are going through winter at a given moment.
@@1GTX1 for any given chip there is a baseline power, that no matter what is happening...it requires to function. When a gate on a cpu is switched 'off' there is still power running through it. In reality the cpu is measuring the difference of power say 1w to 3w (example), to say on or off. So at the minimum power requirements a cpu on desktop needs 34W, in your example, just to keep the chip powered on and ready to do anything.
This seems to be a trend from intel's side... no vision, future thinking or a forward mindset. I remember watching the defiant ones and intel told Record Exec Jimmy Iovine something similar, nevertheless Jimmy hooked up with Apple, Job's and Tim Cook and now Apple has got Beats which served as a them very well in setting up Apple Music and the AirPods tech. it's sad really when you think about it.
uh.... we are talking performance only here right? Is that is the case, then its true.. Apple may have the performance, gain but as far as i'm concerned, intel still wins due to app compatibility...(or at least they will do when Rosetta 2 get pulled from M1/M2 etc) Apple is neck and neck now only because Rosetta 2 is filling the gap allowing x86 code to run.. Take that away, and there is a clear winner... (Apple did say 3 year transition, but they can't keep it round forever, there is code hidden in MacOS beta's already sighting removal), so it will happen, its just a matter of time. That's how i think of it... There's not much evidence supporting compatibility on M-chips with Rosetta 2 gone is there ? All we're concerned is the "here" and "now" ... No one looks at to what "will be" I kind of things that's a better way .... Least your never surprised .... (That could be kinda boring...)
why? you hate that they make all-encompassing tech that anyone can pickup and use without needing to optimize or change it? or that they have an iPhone for everyone at just about every price point? or literally changed the PC landscape forever a few weeks ago yet didn’t raise prices by even a DOLLAR on any of the new Macs? I don’t understand.
The fact that we have to make it clear that we hate them before acknowledging or appreciating something about them goes to show WE are afraid of fully acknowledging that particular part we like about Apple otherwise we will be classed a “sheep” speaks volumes
For the first time in my 16 years of computer using life, I’m buying a Mac this year unless AMD comes up with something competitive. Apple Silicon is perfect for laptops. Knowing Apple from long term iPhone and iPad usage, I think they will nail the next gen M series chips.
Good luck finding good software or games, hope u have enough money left for the overpriced software and hardware upgrade and repair costs. And hope you enjoy being limited to apple store and propriety ecosystem. HAHAHAHAHAHAHA
@@mhavock different users have different amounts of disposable income. Different companies target different levels of disposable income. Different users have different needs and values. I don’t know about this person specifically, but a lot of Apple customers value: * low latency connectivity (seemless connection between devices) * on-device data, iCloud private relay, strict app regulation (security and privacy) * low power consumption, high performance and long battery life (optimisation) If you don’t value these things that’s okay. If you can’t afford Apple products that’s okay, but don’t judge other people for what they want to spend their money on (unless it’s a significant risk to them or others).
@@costacoffee4life665 Thats not my point at all, and if you think Apple has low latency etc etc, then you have not looked at the equivalent priced technology from the competitors. Infact, all these technologies are not something new Apple developed. They borrowed and take what they need.You know exactly what I mean, do your research. Support opensource and open platform development instead.
@@mhavock I am literally still able to install cracked softwares on macOS, and if that’s not enough I can also boot windows on my mac and get all the freedom windows offers. There’s also plenty of third-party apple repair shops, with fair price, genuine parts, and actual good service
Because of my work I still can't do without a real PC but I recognize and that's what has changed a lot compared to the beginning of smartphones and tablets is that I use more and more both devices to replace my Windows computers Mobile systems are fast, reliable, simple to use and moreover they are used with the fingers, they are portable and I can take my tablet or my smartphone everywhere whereas for my PC it's a little more complicated and as he says very rightly in the video they are more energy efficient, my tablet and my smartphone can easily last a day on the battery And it seems to evolve very quickly so I can't wait to see what the future holds for mobile systems in general Thank you Dagogo
@Tom Cass I'm sure every company does, but you know they're on 14nm+++++. While AMD is on 7nm with 5nm for 2022. With your statement, I will respectfully say, Intel has held off quite a bit then.
Nonsense do your homework. M1 is hopeless in multi thread and multi core performance - Intel and AMD mobility chips blow the M1 out of the water. See links and benchmarks above.
@@bighands69 I'm personally not a dedicated Apple user. Android and PC user myself. Mainly video editing and rendering. Still not switching to Apple, but the test of TLD rendering his footage being faster than his PC is interesting. A $2000 to $3500 PC rendering video slower than an Apple Air with no fan priced at $999 to $1249, that's impressive. And again, I'm not jumping to Apple. Everyone enjoys their own flavors of tech, but just acknowledging, this m1 chip itself is impressive. At least for my video rendering. =)
Very enlightening. Simply by upgrading my phone a couple of times, I had a subconscious awareness that mobile processors were catching up to desktop performance at a surprising speed. But I had no idea they had come this far.
@@AlexMkd1984 I don’t think apple products are “dumb”, also isn’t apple products literally outperforming every android phone out there, and the fact it last over 5 years vs 1 or even 2 years, i’m not defending apple but just saying the advantages of iphone over android.
Best cost benefit in terms of computers at least for me. 100% Approved. Nice design, fast, smooth use, battery lasts more than 15 hours working. So practical as a phone. You take it from the bag and just start using from where you were, with absolutely no lag. It's really open and use. Nice nice nice
Groundbreaking! Well done Apple...never thought I'd ever say that as I have never been a big fan of Apple, but there's no denying that chip is a game changer.
@@DurzoBlunts if it’s simpler to program for, then it’s easier for devs, who will then have more time in their hands to perhaps optimise the software to its best possible capabilities, there’s deadlines you know?.
@@DurzoBlunts some architecture instruction sets are massively more efficient than others. To do a simple 5+5, one processor may need a dozen cycles for memory access and sheer complexity of the chip, but another might just need half that because it isn’t nearly as complex
You shouldn't get too excited. The ARM is not a fresh new architecture. In fact it drags quite a lot of "compatibility" and "legacy" crap along with it too. Switching from intel to ARM is just like changing the pile of stink you prefer to sit on. It will not be as disruptive as many people think.
I own myself. Have privacy... And life has never been better Welcome to 2030 www.tamperproof.earth/post/welcome-to-2030-i-own-myself-have-privacy-and-life-has-never-been-better
Back them, we did use them, HP Win CE too, back then technology was just not that good jet. Webkit, touch screens needed more development, 20 year later we have achieved it!
RISC-era Macs could beat an intel PC of the same speed. Apple made a mistake to listen to Intel's siren song about the x86 platform. The x86 architecture became too complicated for it's own good. I am glad RISC/ARM stuck around and now with good R&D has proven to beat out x86 in both performance and energy efficiency.
The M1 is amazing, and it's only going to get better from here (and at a faster pace than intel/amd). However, I think it is a sickening trend to have memory and ssd built into the motherboard. Not only can you not upgrade your storage, but you can't customize anything about it. And when the RAM or SSD breaks, not only can you not buy something to replace it, but it can also cause the entire system to become unstable. Apple doesn't want you to have the ability to fix your own computer if it breaks, they want to sell you a new computer. And considering Apple's impact on other tech companies, this should be reason to worry.
We’ve already seen it, with the death of the headphone jack (which I admittedly don’t miss but it would be nice for people who do), and the lack of micro sd slots on some newer phones
On the one hand, I agree with you. On the other hand, this is very "stuck in the present" thinking. Do you stress about not being able to replace the south bridge fan in your laptop? Probably not, because it doesn't have a south bridge. Are you upset that it doesn't have replaceable fuses? Are you upset that you can't hot-swap the display backlight? At a certain point - and we're not quite there - RAM and storage are likely to be plentiful and cheap enough that it simply doesn't matter for the vast majority of users whether they can upgrade them or not. Just like I'm sure you're not bothered by the fact that your TV doesn't have a hundred calibration pots for you to fiddle with. It also neglects that there can be inherent advantages to getting all of those things as close together as possible. All I'd really ask is that they tone down the incremental pricing for some of the options, but c'est la vie.
one of the reasons attributed to the M1 phenomenal performance is because the RAM and SSD are integrated. Contrary to popular believe, the ability to upgrade RAM and SSD in a Laptop is not a major selling point for majority of users.
@@MrLoipurz Why are you calling it the "ability to upgrade"? Yeah, most do not care about it. What they do care about is whether they can fix their machine for $30 and continue using it for the next 5 years or buy a new one for $1500...
I'm solidly a PC/Android fan. But a lot of people misunderstand my distaste for Apple products... I have no problem whatsoever with the technology. In many cases I've admired it for years. What I have a MASSIVE problem with is their product model... None of their stuff is modular and little of it is upgradeable. What they offer you is what you get. And with that Limitations is an unacceptable price tag. In many cases I can build a far better PC than a Mac, and if a part breaks, I can fix it myself. Or more importantly... UPGRADE as newer/better components become available.
By the time Apple started using PowerPC, let alone when they stopped, Intel and AMD CPUs had already switched to translating CISC into RISC µOps, in the case of AMD literally originally based on their 29000 called own RISC ISA. The reason Apple went Intel was simple, because of economics of scale bigger numbers of products sold made it possible for Intel, especially their own Fabs, to invest more money in R&D and sell equivalently capable CPUs for less. And now with smartphones and tablets adding to the already decently sized embedded niche ARM had before, and Intel stuck with their own Fabs at a time of lagging behind, economics of scale add to the advantage of having a custom design for what you want to achieve, in addition to the advantage of not having to translate, but there has never been a CISC chip with more than a low hundrets of MHz, because that doesn't actually work. And all the non-Intel using companies naturally used RISC, all be it different ISAs, because once you do choose that you don't need emulated compatibility to an old ISA, you would naturally choose to natively use the actually sensible design. There is nothing about RISC that is less able to do this or that, or technical reasons, why they were used in higher end or by professional users. The reasons are all about company policies. Intel was chosen, wrongly so, to supply the original IBM machine, and all PCs using Intel or AMD since then, use x86, because they descend from an unbroken line of piece by piece iteratively improved IBM PC clones, that couldn't switch away from x86 without breaking compatibility, or systems that switched to IBM PC hardware, like Apple, because as it had most of the PC market and others were to small to compete.
While it is true that Intel has effectively had a RISC core since the Pentium pro, it still relies on a CISC instruction set. Lots of complicated logic is still required in the decode phase of instructions. It’s also why Intel is only a 4 wide design while Apple is using an 8 wide design.
@@steveseidel9967 Modern x86 designs translate the instructions into the micro-op cache long before they are needed. The entire execution pipeline runs on micro-ops, and the micro-ops are reused many times in typical code. This means you only really need one copy of the translation logic, so it only comes to ~1% of the die area. As you point out, M1's CPU cores can execute more instructions per clock than any x86 design. The reason they can do this is because of a novel cache architecture that can keep those pipes full. And a lot of new ideas to keep the power consumption down, so the wide execution units don't exceed the local thermal limit (melt).
@@pranjalkanel2711 Yes, but Apple doesn't ARM, Apple are not the first to build a desktop computer based on ARM this year. Huawei made one, the raspberry pi desktop exists and some Android phones have desktop modes when you plug them in. Also, Docker works on the raspberry pi. reports indicate this is not true of the Apple's Arm product.
History often gets taken for granted and forgotten which didn't survive. He even jumped from 1970 to 1991, when people talks about aapl, world revolves around it somehow :)
They weren't though. The 50 dollar price hike on Zen 3, made sure of that. What's more, Intel has diversified their funds so they can stomach the losses, while AMD has just gotten back up and is highly relient on the Ryzen's margins. So if Apple keeps this pace up, there's a very big chance that AMD ends up being bought by Apple.
@@hilal_younus Yeah, if Apple keeps this up. And AMD don't find other sources of revenue streams, then there's a big chance that Apple could scoop them up in 10 years time. Samsung nearly did it, back when AMD only had the terrible FX cpu's.
Very cool video. I've been thinking about a Mac for some time now, but held off mainly due to price and not feeling like they were worth it... Well, then Apple dropped the M1 bomb, and I still held off... Then I used my dinosaur laptop the other day and it literally took 10 minutes just to boot up. Doing anything on it was PAINFULLY slow... Got a M1 Mac Air last night and I'm pretty excited to learn this new system and also know that I have one of best computers on market that'll probably last me 10-15 years lol... To witness the screen turn on BEFORE I even fully open the computer and then do Touch ID and I'm literally at my home screen ready to do whatever WITHIN a second is pretty f'ing mind blowing to me!
One of the mistakes Microsoft made with Windows was to overload the start-up sequence with too much unnecessary housekeeping. When Apple adopted BSD UNIX for their OS kernel, they did just the opposite....keep the launch sequence uncluttered -- get the desktop up & running in as few seconds as possible.
@@atiqshahriarshourav2958 you don't understand apple they literally realest the best budget phone iphone se in 2020 for 400 dollars and iphone 12 mini for 700 dollars also reduce prices for iphone 11 to 599 dollars and iphone x to 499 dollars.
This is full of factual errors. Long time ago ( as in about 20 year ) there were 2 platforms RISC and CISC. ARM is a version of RISC and x86 used to be a version of CISC. Only today x86, just like ARM, executes RISC instructions internally and really the only difference is that software sends CISC instructions to a x86 CPU which are then translated/decoded to RISC for internal processing. Before someone goes all preachy proclaiming that the decoder is the problem it really isn't and it uses between 3-5% of extra power and that statement has been independently verified many times. What Apple did with M1 is not at all brining a mobile CPU to a laptop/desktop. M1 is and evolution of current CPU design thinking and the true innovation here is removing the memory controller and using a "shared" memory pool for GPU and CPU. That's really not that uncommon or revolutionary. AMD has CPU/GPU on chip and sharing memory between CPU and GPU is common practice with integrated graphics cards. What Apple did is they used the best of everything and used what would normally be considered a low end technique to build a high end processing unit ( not sure it can be called a CPU anymore ). And yes the results are AMAZING. Also the reason why this is even possible and you can still run existing software is simply because Apple replaced the hardware decoder found in x86 with a software decoder. And the best part is that once software is reworked to natively use RISC and not need the decoder, you can expect at least 3-5% gain in speed and/or battery performance. Also, you totally got the history of why CISC was prevalent wrong. It's all about cost of memory. CISC allowed for better reuse of memory and then RAM was very very expensive and there simply wasn't the technological means of having more, CISC was clearly the better technology. But that hasn't been the case for a long time now and today Apple has access to memory that runs at clock speeds similar to what a CPU runs at. That's also why now is the time to make a switch. It also happens to be the reason why Nvidia bought Arm. If M1 is the future of Macs and Intel is not the future of PCs then you clearly have a gap in the market. Nvidia wants to make sure they are part of the conversation especially if the future of GPUs is not a physical board that plug into another physical board.
I am happy to see someone with brains in the comment section. This video is so damn misleading and full of errors. How on earth he can say that a retarded iPad is more powerful than a Laptop 😆🤦♂️
I've been following the industry for forty odd years and this is dumbed down somewhat, if it wasn't it would have been a series, but it's fundamentally accurate. RISC and CISC kinda, sorta merged, the key point was, an instruction has to earn it keep. So Intel has put a lot of RISC elements in their designs, they had to, to improve performance. Where intel is fucked, is its CISC legacy, all those old instructions are still in silicon. ARM on the other hand, has a relatively clean RISC legacy.
I used to view the Discovery Channel and such a lot. Then no more Dish and such. Due to COVID I rediscovered TH-cam. There are so many channels, programs, shows, and documentaries on TH-cam. I don't know how such great programs can be produced to view without being connected to the big cords. How nice!
@@Aecor Yeah been hearing about how AMD will unseat intel on server business base for decade and still nothing. Like how they said Athalon will take Intel down.. Still waiting and never will.
@@laturista1000 I have 2 PC laptops from 15 years ago. One is a thinkpad, the other a HP. I also have an Amstrad CPC 464 from 1988 They still work flawlessly. What's your point?
I think it's important to explain why at 8:30 he says the chip was still running. It was running on the extremely low voltage introduced by the measuring device, it wasn't just running off of God's will.
I remember you saying like 2 years ago, that Apple should include ARM processor, and now here we are, it happened
we alway know it will happen since x86 used too much power.... and unlike 10 years ago.. more dev are now doing arm design app...
What about RISC V processor??
Is it better than ARM Risc
Many of us predicted this years ago. I thought it might happen the moment is saw the iPhone 4S which was maybe 2014? The jump in processor speed was so significant, I knew this day would come.
@@shrin210 any update on this?
amd and intel are counting their final years. unless they rejoin the arm club. and nvidia is a smart, yet greedy guy.he owns arm now.
You can say that Apple took the RISC.
They ARMed themselves.
🥵🥵🥵
@@jorge091167 Damn you guys. Haha
INTELligent move
AMDone with this
The quality of this and other similar channels makes you wonder about the future of classic TV programs.
same
The production of this blows away a lot of the crap on free to are lately
TV is dead. There are so many world-class documentary-style channels on TH-cam! I just recently discovered a new Channel, History of Earth. The production quality is as good as TV! th-cam.com/channels/_aOteuWIY8ITg7DQQspG1g.html
what is a "TV program" lol?
Seriously, the last time, I have watched a tv program was in 2015... I cant even watch it if I want to, excluded it from my cable ISP to save money...
"regular TV" has been dead to me for several years now. If I do need one of the major networks for sports, that is free over the air. I only pay for internet now.
I feel like every Processor manufacturers is just killing Intel right now
But i kinda have this fear that with the transition for computers from x86 to ARM, that everything would become very locked down and we would have less control of our devices...
Especially how Apple is now leading the transition, and they are well known to love locking down hardware and make it difficult for their users to do whatever they want with their hardware.
I love it! They got complacent and put the bean counters in charge. Its time for the engineers to lead the charge again. Any competition is a win for the consumer, I'm hoping they come back swinging, because it just means that AMD's utter domination and Apple's next round of silicone will have to compete even harder, giving all us nerds the wonderful excitement of having nice leaps and innovations again!
True Intel is getting trampled right now
honestly they kinda had it coming
Lack of adaptation kills even the mightiest of the companies.. It's unfortunate that companies that have risen to the stars drop like flies as quick.
Though I retired from the semiconductor industry 20 years ago, I still follow the technology and when I first got my hands on an M1 white paper, I must admit I didn't believe Apple could pull it off. Kudos to the ASIC Engineers who did this...absolutely unbelievable!!!
What would have been the most challenging hurdle they would have faced in desigining the M1 chip?
Would love to hear more of your thoughts on this!
I love how in the 80s. they're talking mad tech in the news/television.
But now its all politics
Those were clips from Tech News shows. That's all they talked about. Do a search for "Computer Chronicles."
A more advanced society in some ways
Mad tech? What’s that
@@BeachLookingGuy a language millennial wouldn’t understand.
TSMC also deserves some credit.
Without them the ARM and AMD chips couldn’t dominate as much.
And ASML deserves credit in turn, for their EUV lithography tools. Without that, TSMC couldn't manufacture at 5 nm.
What's TSMC?
@@m.design TSMC Is a manufacturer
@@m.design The world's largest pure-play semiconductor manufacturer. It's no exaggeration to say that they are one of the major driving forces behind computation of the modern world. They don't design products themselves, but instead take orders from fabless companies and make it for them. Throughout the process, TSMC's engineers provide their input on the designs of their clients' products. You can imagine it as a back and forth group project.
@@m.design TSMC is the Taiwanese Foundry that manufactures the M1 & A-series chips for Apple using the 5 nm process. They are essentially a contract manufacturer that makes chips, and account for nearly 50% of the global chip-making market share at the moment. Along with Samsung, they are the only other manufacturer capable of producing 5 nm chips commercially today.
Intel CPU would make a great heater during winter seasons
Lol
*Puts hands near my 9900k and rubs hands together*
I'm dying, how the turntables eh
@@qualtrox Yes, vinyl records are making a comeback though... ;)
I would use a AMD Bulldozer cpu for that to be honest. Way more heat.
This channel has been consistently at the top of tech news for such a long time ! Cheers!
I am taking a moment to appreciate the free knowledge that exists on the Internet
It isn’t really free.
@@dwyk321 you pay with your soul?
@@dwyk321 you only need to pay for the internet service and maybe view some adds every now and then, the amount of value you gain in exchange is mind blowing
@@batmaneo your time is free? wanan come over later and bag up my leaves? ;)
@@batmaneo If the product is free, YOU are the product.
Thank you for including some clips from our channel. We are witnessing a computing revolution!
Is this Vadim or Max?(sorry if I spelt any name wrong,)
you’re welcome, no problem. any time. :)
And he didn't give you credit for it
Your channel rock, guys, Vadim and Max !!! ✊🏼
@Coldfusion y u no give credit?
Very late to the party.
Only found this channel a month ago.
Such great work. I’ve almost finished all content and look forward to what’s coming next.
Amazing channel, describing amazing people and events.
Not investing on R&D is investing on self-destruction
If i was Intel i would be in serious panic mode right now.
First AMD, now Apple blowing Intel out of the water in a short amount of time.
They really need to freaking step up. They've gotten lazy af
Their processor manufacturing expertise is something to be positive about. Have to see how investors value Intel over the next few years
And nvidia bought arm but theyre not making any new hw yet lol.
coz intel is a goliath, such a big giant monster but kinda slow and stupid. amd on the other side, is david, tiny yet deadly. and arm is an impostor.
@@mayurgianchandani1184 Just gonna throw this out there but Intel is in their current position for being overambitious LMAO.
The issue with cold fusion is that every video topic is so damn amazing, I waste most of my time deciding which one to watch 1st
And I thought I was the only one like that.
Lmao
Exactly. I try to do 3 videos I like and end up seeing more I want to see before the other 2
😂
This made me subscribe
Apple started an
ARMs race.
🥁
M1 to be a sucker for some tech puns
I can imagine the very tense high level meetings that are happening at Intel right now 😂
Nah they pretty chill for some reason. They still acting like they’re on top.
@@thelonelyowl2654 Fake it until you make it
I think most of them are polishing their resume's right now.
They have MBAs running the show, that's usually a disaster for anyone other than themselves
I bought an AMD CPU for the first time this week 😂
I bet "Intel's inside" is now on fire.
intel is attacked 360 degrees.. 2020 is the worst for them. AMD stabbed them, now AAPL killed them.. dang... 😅
yup, I have an intel distributor code and they give us "points" if the importer sells more (for years now)
their new strategy? they duplicated the points.
lmao what are they thinking?
More like Intel Outside
intel has been AMD's punching bag for the last couple of years. Now it's Apple's turn :D
@Amit Ezuthachan yeah, it seems they can't be that stupid to complete ignore their surroundings
This feels like the first actual big innovation from Apple in a long time. I want one
Apple has been making industry-leading chips for quite some time now. The only thing is, they were in iPhones so far where you can only do so much on the small screen limited by iOS. Now that they are on laptops, the full potential is finally being unleashed.
I agree... I am PURE APPLE.... and I keep watch on it...
Pretty sure airpods and apple watch were also revolutionary and changed the entire market
@@well7885 but they can never reach the power of a desktop chip. X86 will continue to rule the roost because of the graphics card rollercoaster it rides upon. Windows PC's will continue to crush Macs for the foreseeable future because they are the cutting edge. Not ARM, which is only good for mobile applications.
@@ag3ntorange164 this comment is gonna age so bad. Go check out the reviews of M1 Chip which are the lowest end laptop chips and already has the best single-core score ever. I request you just one thing, come back to see this comment in 2 years and you will realize how awe-fully wrong you were.
9:28 - "Simplified CPU's, such as ARM, generally will do one single instruction per clock cycle. While desktop chips may use many cycles to complete one complex instruction. This means more power consumption, less efficiency, and more heat produced."
...No. Both types of CPUs mostly use simple instructions that take one clock cycle. However, desktop CPUs offer many extra instructions that the ARM does not. For an ARM (RISC) to perform equivalent complex tasks, it must string together long sequences of instructions. Not only that, but a desktop CPU is loaded with specialized hardware acceleration to make it's complex instructions execute even faster. This means the RISC requires far more clock cycles for complex tasks.
RISCs are successful because those complex instructions are rarely needed. Usually, CPUs just shovel data as fast as they can. And in that contest, a RISC wins easily.
Thanks for pointing this out. There’s another benefit, equally important. In order to explore instruction level parallelism, pipelining fixed length instructions (such as the ARM ISA provides) is much easier than the x86 variable length instructions. That’s the reason intel started decomposing those in smaller fixed length ones (microops) and reassembling them back into a normal x86 instruction ever since the Pentium Pro era.
My understanding(limited) each intel core actually is doing between 2 to 4(and in rare cases 8) dflop (double precision floating-point operations) per cycle because each core actually has more than one execution unit.
I think the thing that is really telling in this video is the fact that they say they can't even tell a speed difference between different software that is compiled for different Hardware such a thing is not a sign of amazing Hardware it's a sign of really bad compiling and or programming....
Lot of this is just silly I mean my laptop is a 8 years old and still has more than 10h of battery life and its a fanless I7 and is plenty fast enough for me now granted it was like 6k new, so this is really a question of affordability not ability... and affordability and Apple not synonymous neither is reliability
so your saying Intel is "inefficient". (..That works on so many levels 😆)
That's not really true for AArch64. It's instructions are similarly complex than x86 instructions, except it lacks some of the legacy parts. The main difference is the lack of memory operands, but that's more than compensated for by the larger register file. Both designs do in fact look very similar internally and both the number of instructions needed to achieve common tasks and the number of instructions available in the instruction set are comparable.
People, including me, went from passionately hating on Apple to seriously considering buying an Air or Pro M1, that's how much M1 changed the market
I kept thinking that Apple users are snobs but damn... I'm impressed as hell with this.
Yup bought my first macbook a few months ago. Got an iphone too.
@@emp437 The sheep says what?
@@emp437 good choice. going for a same
@@newguy3588 look, it's one thing to have a brand that shows your status.
It's another to have a good piece of hardware.
To have both should impress you.
Even I was just dismissing apple as a fashion brand.
About time they made me look in their direction without disgust on my face.
Apple played even more fundamental role in the development of ARM. In 1990 a joint venture between Acorn, Apple and VLSI gave birth to Advanced RISC Machine Ltd (later renamed to ARM Holdings). The first Apple product featuring an ARM processor wasn't the iPod as claimed in this video, but Apple Newton in 1993 - more than 8 years before the iPod actually.
I’ve removed my intel i7 sticker off my laptop out of shame.
Same...
hahaha nice one
lol noice
Hahahahaha 🤣🤣
**hides in a hole with my i3**
I know of one difference in architecture between the early Motorola (used by Apple) and Intel was the Intel chip had to outsource the math function to an external chip, then wait for the answer. This was time consuming and generated extra heat.
The Motorola (Mac) chip had this function built-in. Just this one difference made the Mac graphics way better than Intel's.
I also heard that originally, IBM wanted to use the Motorola chip in their first PC release but Motorola couldn't manufacture enough chips to satisfy the order, but Intel could. Can you imagine how that would have changed history?
The M1 is colored black because it just came back from Intel’s funeral
God this cracked me up
LMAO😂😂😂
savage
lol. tear downs show silver. IDK why they show black in the promo material
Nice!
I lift my hat for your professional. You provide a content which is very interesting and the way how you present it is very academic and professional. There should be more content like yours nowadays. Thank you Mr Altraide
You can say that Apple took the RISC.
Now we need a battery revolution. Still waiting on graphene
Honestly yeah, where's all the graphene tech? It was all the hype 5-6 years ago and then nothing. I know they were having issues producing in mass but thought they'd figure it out by now
@@MrUltimateX graphene is so good, the companies will start losing profits.
and the next battery is more likely some other metal component, not graphene
i think with batteries its more difficult because by laws of physics, we are reaching an energy density that is just too risky to be portable. so i think batteries will stay on the same-ish levels that they are now, but as we see with the M1, the components still have much optimization potential. once we can build screens, processing units etc which run with minimal heat-generation, a laptop with a present-day battery could run 40-50h easily i think. I hope i am wrong though, and there is a way of making a high energy density battery that just by laws of nature cant discharge an any explosive manner
@@amarug Theres probably a way to make high energy density battery unlikely to fail catastrophically, at least unlikely enough to make it an acceptable risk for the convenience, people still use cars even if they are deadly to use sometimes.
Basically you don't need to reach 100% safety, just close enough.
Another way to reach a similar amount of convenience would be to have fast charging battery, wouldn't really matter if you only had a few hours of peak performance if you could just stop near a power source for less than a minute. Maybe if you were going off the grid then it would matter. Then again transferring large amount of power in a relatively short amount of time probably has its own safety issue but at least you are unlikely to be holding the device in your hand when it explodes.
Or maybe some sort of universal battery combined with an easily swapable battery so you can just switch when depleted. A little less convenient than the fast charge since you probably cant bring spare battery everywhere, but i do have a friend who already does that.
But yeah it might be easier to just make stuff more energy efficient.
Compatibility: to be fair, there are _some_ apps that have trouble running on M1. But their number is very limited and it's very specialized software (e.g. some images for docker)
Not anymore, I have everything I need installed right now
developer issue..... they exist on Windows too, its not just Apple. If anyone played Mahjong on PC, you'll know there are graphic issues if you don't change the resolution,. Experienced that when we upgraded to Windows 7 back in the day. It IS a XP game after all, so probably expected.
In my computer architecture class, a common question was, which is more recent, CISC or RISC, and almost everyone got that one wrong. People naturally assume whatevers more complicated must be more "advanced" or newer, when no, it turns out the innovation isn't making it more complicated, but doing more things with less instructions. Thats what the innovation is. That lesson ought to be applied to lots of things :)
Yes yes and yes. I always say, the true wise man knows how to take complicated and abstract concepts and simplifies them to the point where others can understand them.
It was first when RISC Instructions were simple than cisc.. it's like with few instructions you do same job..but to be done the job..the job. Itself is fragments in many instructions that with few commands it be done...in inverse with cisc instructions...they do job step by step with to much commands to the processor that overload and consum much power.. .... In paradox nowadays RISC Instructions are too.much complicated to fragments a Job to done with fewest commands to processor it's possible
RISC vs CISC is irrelevant now
IOW, “Less Is More”!
CISC made sense back when compiler optimization was primitive and it was assumed that all complex programs and OSes would be written in assembly,. Back in those days, an advanced CPU was expected to minimize the "semantic gap" between human programmers and assembly. So, the designers of the day weren't dumb or lacking in innovation, they were designing for a different goal.
Apple: *Launches M1*
Intel: Dead Inside
I am sure others are about to launch new chips too
apple morons believe this ridiculous video: PC Builders:
@@terrobert u arent some sort of smartass to know how to build a pc its nothing hard it just takes u know time
first amd and now apple
*omg!* 😂
This is why its important to have competition. Intel has been at the top of the food chain for many years but recently AMD has now caught up and you can even argue that they're more performant/efficient. And then there's the ARM chipset, spanking both chips.
I'm pretty sure Intel is shitting their pants right now
Argue? we are past that point by halve a decade. Intel is just sheit.
Well M1 is only spanking 'em on efficiency, but AMD rules the brute force market.
Arm is owned buy Nvidia…….
@@Ryang170 Not yet....
@@Yeet42069 lol my first thought exactly. AMD is crushing Intel and it's showing in every test & benchmark. Intel's been great, but AMD cleaned em out this last year.
Fun fact: the number of instructions in the ARMv8 architecture (incl. ASIMD) is similar to that of the Intel 64 architecture including everything up to AVX2. It's not “bloat” from “too many instructions” that makes x86 CPUs harder to scale, but rather factors such as the more complex instruction encoding, intertia from Intel having ridden out the same microarchitecture since the PPro days, differences in the memory model, and better memory latency on tightly integrated SoCs vs. systems with discrete memory and GPU. Being able to design a whole system more or less from the ground up gave Apple the ability to sidestep many of the design constraints that hobble the existing Wintel ecosystem.
To highlight the folly of thinking “ARM is RISC, so obviously it's better than the CISC x86,” consider that the PowerPC architecture previously used by Apple (before they switched to x86) is also a RISC architecture, designed according to very similar principles as ARM. And yet it was ditched in favour of x86. It's not RISC/CISC that makes the difference, but rather how the CPU is designed below the instruction set layer.
This single video is far better than all the other reaction videos combined.It gives more insight and in depth analysis of what actually happened how it happened.
I've been a PC/Android fan all my life, and I take my hat off to Apple for achieving this.
I've been a PC enthusiast all my life till i wanted to know and bought my first Apple Device, expensive, overpriced, greedy, slow, that's what i thought.
Well, i was wrong. best decision in my Life, changed everything.
@@JohnSmith-pn2vl Similar story with me. I lived in Seattle, a couple of buddies were Microsoft engineers that worked on Windows. Loyalty required staying on Windows until I was forced at a new job to use a MacBook Pro. After complaining for a week or two learning to use the Mac, I realized that for me it was much better. That was 15 years ago. Since then I only use windows when a specific application requires it.
I work with both a MBP and a Windows PC. For the same price the windows beat out in performance for what I was programming. However, this M1 chip has blown me away, and the new MBA, I believe, can beat out 99% of windows laptops. This new chip is powerful.
Same. Loved Android hated apple, made the shift to iPhone 7 because one plus 6 wasn’t available and I needed the device urgently. It’s a really good phone IMO. I’m also interested to shift from Windows PC/ Linux Laptop to a Mac Mini.
until you see the price tag
I used to write ARM assembly 20 years ago, making my software about 5 times faster than using regular C++ compilers at the time. Seems like this is a skill I might pick up again for even more screaming performance. The ARM instruction set is one of the most beautiful instruction sets I ever encountered on a CPU. So much nicer than Intel 386+, and even more elegant than Motorola 680x0. it's a great future ahead for computing!
Assembly was always screamingly fast be it ARM or any other. There always has been nothing as good as direct assembly code. It has just been less necessary as the chips got faster and there was more memory for lots of everyday applications. It has also been complicated by needing to have your software allow for different peoples machines with different software and hardware setups especially where there are multiple manufacturers -ie non-apple. But I agree re native power being unleashed by assembly.
I had done a bit C64 assembly back in the days, and a bit AVR 8-bit code during studies. But when I came to my first ARM7 project, I remember that I was also stunned by the elegance of the instructions, e.g. the shifting bits. I felt that this was a completely higher philosophy of thinking, and I liked their approach from the first moment.
Now I have just bought my first Apple Computer, and as a Linux guy, this was entirely because it's just great technology.
Well of course no widely-produced Instruction Set Architecture (ISA) is as ugly and outright *klugey* as Intel's x86, but... is even the latest ARM ISA as sweet, and so completely *orthogonal,* as that of the ADSP-2100 family? Admittedly a fixed-point DSP, NOT a general purpose CPU, it can be used as a microcontroller in many cases, and almost makes you want to eschew the C compiler and do it all in assembly.
@@tarunarya1780 this is not the whole truth. C compilers will take the CPU Architektur into account while you have to be extremely knowledgeable and have to spend a shit tone of time to get to that point. Your average programmer has no chance to get close to a good c compiler.
@@domenickeller2564 I think you meant the comment for sci fifaan. I think compilers are a good thing, and lots of projects would be too difficult and big to manage in assembly, never mind cope with vast amount of hardware out there. Even programming in VBA can seem overly tedious. Role on AI. There is a role for selectively doing some parts of programs in pure assembly depending on the speed of operation and bottlenecks faced.
Its the future, I say this with confidence. Especially now we've got the M2 chip. I can't wait to see the day we get native chips from AMD and Intel which are ARM based, perhaps even custom chips direct from Microsoft and other companies who have also licensed from ARM.
Intel : We don't think it's worth putting our chips in Iphones
Apple : So, you have chosen death.
Yes because they rather put it in servers and super computers.
There are some Intel based android smartphones.
@@ArunG273 - I’m sure all 600 android users are happy.
Made in China XD
They RISCed it but they ARMed it
18:04
Intel doesn't want to talk about benchmarks anymore
Correction,,
Intel only want to talk about internal benchmarks
Z Rus: At the end of the day, the customer wants to talk about cost, convenience, power per dollar, and getting things done. Intel can play games with words all it wants -- once Windows 10 is no longer supported in 5 years (barring something new from MSFT which I don't expect), why will people want to buy many Intel chips?
Essentially every PC, phone, gaming console, etc. will be using some form of Linux.
And old guys like me who want the old Windows apps will buy used X86 machines for cheap, as I've already been doing for years. When I can get a GREAT Laptop for about $150 which is easily maintained, (or a few for $4.500ish, and lots of spare parts or redundancy if something fails). I just don't see Intel selling many chips, unless it finds new markets.
Not to be confused with “AARM” or “the assistant to the assistant to the regional manager”.
Hilarious bro
HAHAHAHAHA U GOT ME THERE BUD
dwight!
I don't get it, care to explain low earth creature?
@@dealerovski82 The Office tv show quote.
Windows and Android user hear and the M1 chip is on a different level but bought my wife a I pad air M1 chip and I was so impressed I bought myself one it's amazing having a tablet that has the performances of a £2k pc for £900 and its the best for cheap video editing... And it was only after watching this video it got me interested and it genuinely works I can't believe how fast the M1 chip is ...
Thanks for remembering ACORN. I am not the least surprised by the M1's speed - I had an Acorn Archimedes . At last the world is catching up !
Intel got too confident and laid back before the 6th generation.
Comparing the 7700K with the 2700K is just hilarious. Both are quad-core processors.
The 7700K (2017) has a mere increase of 30% more power compared to the 2700K. (2011) Intel needed almost 6 years and 5 generations of processors for such a low increase!
The 7700K is furthermore famous for its heat generation and power consumption which makes the overclocking potential limited even with a strong cooling system.
The 2700K on the other hand had a lot of overclocking potential and easily reached the same level like a stock 7700K with a decent cooling.
Then Ryzen came out of nowhere and Intel was finally forced to stop its politics of milking their for many years unrivaled processors.
And as far as we observed it in the past few years they weren't even prepared for this case.
@@JackoBanon1 yea 2700 was my last intel CPU. i skipped then i upgraded to amd last year. and i have complained that there is no innovation in desktop world for the past decade.
i hope servers would use some arm chips soon.
I have 6700k. For sure my next cpu will be an amd. Intel is just a greedy shit company.
@@JackoBanon1 This is why competition is good. Intel has purposely held back their big changes (such as shrinkage from 14mm) since they've had no real reason to make big advances. People buy their stuff every year even with 5% increases, so they just cruised along. AMD gave a solid push in competition but Apple is gonna be the big change needed to get Intel to actually improve.
@@nikkjcrespo Yeah, Apples latest M1 processors are scarry and could change the processor market drastically in the near future.
“As we set about designing the ARM, we didn’t really expect.. to pull it off.”
🤣 what a legend.
To be honest, every innovator or inventor before 90's are a legend because NO ONE expects them to pull it off.
They don't mention Roger (now Sophie) Wilson who was the real brains behind ARM architecture.
The best moment was when he realized the chip wasn't even connected to the power supply and still working 🤣
Honestly that was the same thing my project group thought when we took the assignment on making our own 16 bit risc cpu, but it was surprisingly simple! And we all got the second highest grade on our project exam. But I assure you Apples chip design is much more complex than ours were 😛
@@N00B283
Was this college computer science project?
Time here, the M1 is still more than enough for 80% of the market out there. The better the base model chips get the less people need to upgrade real world, thus less recycling and waste.
M1 changed everything and we're living in the greatest times as a Mac fan. Can't wait to get my base model m4 with 1tb 16GB ram for......yes $1200 (including Apple Care).
Insane.
competition means innovation. We can finally see pcs and notebooks getting better
Nah maybe for you... lol
And in Apples case it means top notch marketing propaganda, sheeple and fanbois 😄😄
Donn't you love it when someone does something good that there is always that one person that keeps putting them down
also means unethical marketing strategies, slave labour, releasing multiple models which are basically the same thing but with ever increasing prices and damaging the environment for greed. Etc
@@mhtbfecsq1 no, that's capitalism xD
I can see the scams now:
"How to download and install ARM chips for PC FREE Download"
“Is your computer getting slow? CLICK this button to install an extra ARM”
Lol!
If I had an extra ARM for every one of those scam ads I've seen...
@@3xOOOOOOOOOOOOOO You'd have 2 ARMs and 3 legs
Please post a link to the video of "How to download and install ARM chips for PC FREE Download" I need more speed for my laptop....
LOL jk
When it's coldfusion content, only one thing comes in mind. *Quality Content*
There is no quality in just references to other TH-cam channels, no unknown footage or stories, no new ambient music, no nothing, where is the quality you are talking about?? for me what I can see is that Dagogo is running out of money to pay for writers and content creators just like the most of influencers out there. Welcome to the real world.
This video is kind of wrong.. the cpu is not a desktop replacement, and isnt changing anything but battery laptop life for a few people
@@dertythegrower it's... Faster.
@@thisisntsergio1352 no its not.. amd beats it in cinebench benchmarks, kido.... also any new ryzen cpu laptop with an nvidia 3080 completely stomps apple everything
@@dertythegrower Apps right now are running on emulation but M1 beats Intel. After a year or so M2 will be release and it will be faster than AMD. You can tell by the graph it is increasing 300% each year.
It's kinda awesome that this stemmed from Acorn Computers. I used them back in the 90s in school and just kinda thought they disappeared into the void, but their legacy still live on today, which is cool!
For those who say 'that's impossible!' at 8:35, the original interview clarifies that the ARM chip was running off the signal inputs.
LowSpecGamer made a really cool video on the story of ARM where he mentions this part
Clayton Christensen died earlier this year and didn't get a chance to see one of the largest companies completely disrupted - something unthinkable when he wrote the book and set out exactly how it would happen 20+ years ago.
He probably knew
@Nida Margarette Most of them focus on the same synthetic benchmark and exporting video use case. Does e.g. Zoom use AS specific hardware extensions for video decoding / encoding?
**INTEL, thank you so much for the severe f&*k-up on your part otherwise we likely wouldn't be here...**
Intel saved Apple from being the f-upper.
When companies get egos and think they're untouchable, this is what happens historically.
@@bjkina
When Apple tried to return the favor for the iPhone chip, intel said no.
Intel isn't going anywhere. While they still stay on ancient 14nm process while AMD, Apple, Qualcomm and basically everyone else use 6nm to 8nm produced by TSMC and Samsung, Intel is still competitive and manages to squeeze more and more performance per core. Imagine what would happen if Intel just moved what they have to 7nm TSMC node. Well they are actually considering doing just that. At the same time typical Mac workloads don't require that single core performance Intel was chasing. So it's not a new paradigm of PC hardware - it's a new type of PC which was actually pioneered not by Apple but by Surfaces and Chromebooks which used ARM long before Apple. Yes this new type of PC will be better at those types of workloads and will take some of Intel's market share but it's just a better option for that kind of tradeoff sacrificing single core performance for power efficiency. But good luck playing Guild Wars 2 with its heavy world thread on those new M1 Macbooks.
@@AntonMochalin What are you talking about. These chips are beating everything hard in single core. They are the fastest single core cpus you can get. Only the high end intel/amd can beat them in multicore (because ton of cores lol). This type of pc will be better for 99% of users. There may be an application for cisc elsewhere, but it will not be mainstream computing.
Needing a more powerful computer isn’t relevant to everyone. But having more efficiency and less power consumption is. And it’s something everyone can notice.
This is deeper than it sounds.
The majority don’t need extremely powerful laptops.
Intel's been lazy for awhile now. Props to Apple! 👏👏
as same as powerpc before..
Even AMD is whipping them
This is nothing new. RISC was tried before and AMD already has APUs (the M1 is the Apple version of an APU). Why are you cheering?
Intel is not lazy , it is just technology get to a stage that it is difficult for one company to do it all . Intel design their chip as well as manufacturing them is no longer a sound business model
Props to Apple fro selling same crappt product with new name even crappy m1 is made by tscm apple selling premium price nothing more
I think it's safe for Apple to include stickers in their MacBooks with "Intel not inside"
Except their highest end model is equipped with an i7... Doh
@@DurzoBlunts i9
@@DurzoBlunts i7 oh no no no the i9 cooking machine
"Intel: Dead Inside"
There is still a lot of Intel-licensed stuff in each MacBook and any other personal computer, no matter which CPU is in it.
This might be revolutionary. Like Apple ways or not, they're where they are for this kind of reason. I wonder how it will inspire other manufacturers.
*Apple does nothing for 10 years* : I sleep
*Apple does 1 thing in 10 years* : Apple is revolutionary
@@RaskaTheFurry Android in 2013 be like: we don't need 64-bit processors for these small little phones. It makes no sense.
@@RaskaTheFurry dude if you watch the video you can see apple been doing the whole arm cpu thing for a while and we just didn't pay attention because intel was better. but now that's not the case so it's the thing that is neat.
@@BloodSprite-tan then it wasnt Apple, but the manufacturers that made the architecture possible.
@@RaskaTheFurry you need to research the product to perfection before you can release it
you keep popping up in my autoplay. your videos are good for listening to while im at work. thanks for the upload. also, subbed
Open source software: update and recompile and use chip direct. Linux should run very fast in new Mac mini.
I bet they'll end up blocking the BIOS for 'security' reasons...
A lot of the binaries will already be compatible, but there's no reason not to recompile them with more chip-specific optimisations. When you say 'use chip direct' do you mean be able to use the chip as it is? I think there will be code signing issues if you want to boot directly on the hardware, but I don't know the situation with recent Macs.
@@ThisRandomUsername I mean direct: can open source Macintosh software for Intel CPU can recompile for ARM CPU, not need use Rosetta emulator layer.
@@computercomputer8922 like what other ARM SOCs? No one else is making anything close to the chips of Apple in the smartphone (=ARM) world.
You can say that Apple took the RISC.
Hate apple all you want, but you have to accept that if Apple wasn’t in the market we would not have gotten to such innovations
📠
Depends, some stuff they were the inventor and others Jobs just had a vision that no one else had and knew how to market it. I'd give them credit regardless.
Really doesn't matter anymore because if you're not half way knowledgeable about computers then people tend to be sheep and you think anything with the Apple logo on is the absolute best.. even if you not quite sure why.
There's a reason why companies copy Apple and not the other way around.
@@Irfaan16 Because Apple knows how to make money. That doesn't necessarily speak to the quality. Things being copied, for example, is how to excel at planned obsolescence.
@@christianknuchel all tech companies do that.
Imagine if these laptop ARM chips became mainstream and a gaming laptop was build out of them.
I give it 10 years max and we will be running arm linux desktops from microsoft.
@@mauriciosl Microsoft had been playing around with some of their components in Linux like DirectX. Putting DirectX on Linux is just one huge leap to allow portability across platform without having to rewrite just for OpenGL stuff that went stagnant.
@Jaskaran Singh dev notes from who?
u can game on those m1 powered macbooks anyways lol, just not that much GPU power, apart from that they're basically the same.
Give it about 5 years
Thank you for ceating such information and high quality content videos each time.
I use both Windows and Macintosh; I don't believe in "brand loyalty."
Me too 👍🏻👍🏻
Same. Using iPhone + Linux + Windows. Keen to replace my Linux laptop for a Mac but that’s it. I’m not replacing my windows pc because I’m comfortable with it.
I believe in better with sensible price... 1000 dollor wheel is rediculous while this year's iPhone is kinda must go phone because Apple finally gave preference to rugged over fragile for the sake of aesthetics. They raised frame so screen couldn't come direct contact with surface making it long lasting while android OEMs make screen up on frame and call it 2.5 glass. If you drop it glass is done.
I bootcamp windows on my mac. Don't know if that will still be possible in the future though
Me too just got my M1 Mac mini for work and everything and a gaming rig with amd ryzen 7 for gaming
Joke. People still use intel lol
I have been 100% on windows/android for the past 10 years. I am impressed with Apple's innovation over the past couple years, that drive forces other companies like Microsoft and Samsung to also innovate and I am all about it.
They inspire slavish imitation for sure.
@@jasonk9779 Apple also copies a lot of features from Android OS devices.
@@Kpop-eye-f7t Android also copies a lot from Apple
@@Kpop-eye-f7t Android also copies a lot from Apple
@@nekogaming8461 so everyone copies from others, no really an inovation company at all exists
I just bought an ARM 64-bit 4-core computer for $100. The Raspberry Pi 400.
U shouldve bought the model b, overclock like a champ with a puny heatsink
i was thinking of raspberry pi as well.
I have a banana pi, with dual core, and it is good for excel, word, light browsing, etc
@@fwefhwe4232 ARM devices should be like this, a side kick for light loads, not a $1000 machine that's powerful but fully locked by the company
$5 PiZero is good enough for me.
@@yudhobaskoro8033 RPI400 is overclocked and it can go up to 2.2
Thanks!
The tech community : Moore's law is dead
Apple : Hold my M1
@Bokang Sepinare Bullshit venturebeat.com/2020/01/07/a-bright-future-for-moores-law/
Don't believe Apple's claim lol. Leaks have shown that M1 performs well compared with Intel-based Macs but only performs half compared to Ryzen 7 4800HS and Ryzen 5 3600X
@Bokang Sepinare In actual real-world head-to-head tests the M1 is a "middle of the road" capable CPU, not underpowered by any meaning of the word, but also nowhere near the fastest or most powerful.
It will be dead once it hits 1nm. I think 3nm EUV will be its limit
Can you tell me what is Moore's law
I wonder how much electricity has been wasted over the last 40 years running Intel chips and cooling offices / homes from the heat released.
My ryzen 3600x reported package power 34W while watching this video even though CPU is not doing much at only 3 or 4% usage, , while GPU is at only 15W, 16% usage, doing all the work rendering youtube. Although i noticed that GPU downclocks itself to minimum when rendering youtube while CPU jumps to maximum 4.3 Ghz.
Another way to look at it: how much free computation have we gained from heating homes and offices via computers? Your perspective is valid too, but in the end a lot of it balances out, especially in countries with severe winters.
In the end, technology like this helps us all by giving us more choice in how we do heating. Eventually household heaters will be replaced by computing devices and cloud computing will be farmed out to whichever countries are going through winter at a given moment.
Well at least they will be cheaper than Apple products.
@@krisg822 Actually AMD is more efficient in power at this time with ZEN than intel in terms of power.
@@1GTX1 for any given chip there is a baseline power, that no matter what is happening...it requires to function. When a gate on a cpu is switched 'off' there is still power running through it. In reality the cpu is measuring the difference of power say 1w to 3w (example), to say on or off. So at the minimum power requirements a cpu on desktop needs 34W, in your example, just to keep the chip powered on and ready to do anything.
Can you do a video ”Rise and fall of Flash player”
Steve Jobs put the kibosh on them
About Rosetta 2
"If Rosetta 2 works, you won't even notice it ever existed" -Angela
html killed it lol
This seems to be a trend from intel's side... no vision, future thinking or a forward mindset. I remember watching the defiant ones and intel told Record Exec Jimmy Iovine something similar, nevertheless Jimmy hooked up with Apple, Job's and Tim Cook and now Apple has got Beats which served as a them very well in setting up Apple Music and the AirPods tech. it's sad really when you think about it.
Intel is not doing well recently, AMD also has sone better last yeats
uh.... we are talking performance only here right? Is that is the case, then its true..
Apple may have the performance, gain but as far as i'm concerned, intel still wins due to app compatibility...(or at least they will do when Rosetta 2 get pulled from M1/M2 etc)
Apple is neck and neck now only because Rosetta 2 is filling the gap allowing x86 code to run..
Take that away, and there is a clear winner... (Apple did say 3 year transition, but they can't keep it round forever, there is code hidden in MacOS beta's already sighting removal), so it will happen, its just a matter of time. That's how i think of it... There's not much evidence supporting compatibility on M-chips with Rosetta 2 gone is there ? All we're concerned is the "here" and "now" ... No one looks at to what "will be"
I kind of things that's a better way .... Least your never surprised .... (That could be kinda boring...)
bought a m1 pro a week ago, IT IS AMAZING. The last time I have used a Mac was 1987, it was a Macintosh:)
I hate the way apple does business and never bought their products but I love this new chip. Amazing.
why? you hate that they make all-encompassing tech that anyone can pickup and use without needing to optimize or change it? or that they have an iPhone for everyone at just about every price point? or literally changed the PC landscape forever a few weeks ago yet didn’t raise prices by even a DOLLAR on any of the new Macs? I don’t understand.
That's exactly why I hate the fact that it's an _Apple_ chip...
Nevan Hoffman is part of the problem.
I feel you man. I just hate Apple as a business but damn are they good at what they do.
The fact that we have to make it clear that we hate them before acknowledging or appreciating something about them goes to show WE are afraid of fully acknowledging that particular part we like about Apple otherwise we will be classed a “sheep” speaks volumes
For the first time in my 16 years of computer using life, I’m buying a Mac this year unless AMD comes up with something competitive. Apple Silicon is perfect for laptops. Knowing Apple from long term iPhone and iPad usage, I think they will nail the next gen M series chips.
Good luck finding good software or games, hope u have enough money left for the overpriced software and hardware upgrade and repair costs. And hope you enjoy being limited to apple store and propriety ecosystem. HAHAHAHAHAHAHA
@@mhavock different users have different amounts of disposable income. Different companies target different levels of disposable income. Different users have different needs and values. I don’t know about this person specifically, but a lot of Apple customers value:
* low latency connectivity (seemless connection between devices)
* on-device data, iCloud private relay, strict app regulation (security and privacy)
* low power consumption, high performance and long battery life (optimisation)
If you don’t value these things that’s okay. If you can’t afford Apple products that’s okay, but don’t judge other people for what they want to spend their money on (unless it’s a significant risk to them or others).
@@costacoffee4life665 Thats not my point at all, and if you think Apple has low latency etc etc, then you have not looked at the equivalent priced technology from the competitors. Infact, all these technologies are not something new Apple developed. They borrowed and take what they need.You know exactly what I mean, do your research. Support opensource and open platform development instead.
@@mhavock I am literally still able to install cracked softwares on macOS, and if that’s not enough I can also boot windows on my mac and get all the freedom windows offers. There’s also plenty of third-party apple repair shops, with fair price, genuine parts, and actual good service
@@mhavock the silence is so loud
Because of my work I still can't do without a real PC but I recognize and that's what has changed a lot compared to the beginning of smartphones and tablets is that I use more and more both devices to replace my Windows computers
Mobile systems are fast, reliable, simple to use and moreover they are used with the fingers, they are portable and I can take my tablet or my smartphone everywhere whereas for my PC it's a little more complicated and as he says very rightly in the video they are more energy efficient, my tablet and my smartphone can easily last a day on the battery
And it seems to evolve very quickly so I can't wait to see what the future holds for mobile systems in general
Thank you Dagogo
AMD vs Intel vs M1. Go.
stomping both intel and amd, and qualcomm, and samsung.
@Tom Cass I'm sure every company does, but you know they're on 14nm+++++. While AMD is on 7nm with 5nm for 2022. With your statement, I will respectfully say, Intel has held off quite a bit then.
If you are willing to pay the premium for apple then it is great most buyers are not willing.
Nonsense do your homework. M1 is hopeless in multi thread and multi core performance - Intel and AMD mobility chips blow the M1 out of the water. See links and benchmarks above.
@@bighands69 I'm personally not a dedicated Apple user. Android and PC user myself. Mainly video editing and rendering. Still not switching to Apple, but the test of TLD rendering his footage being faster than his PC is interesting. A $2000 to $3500 PC rendering video slower than an Apple Air with no fan priced at $999 to $1249, that's impressive. And again, I'm not jumping to Apple. Everyone enjoys their own flavors of tech, but just acknowledging, this m1 chip itself is impressive. At least for my video rendering. =)
Very enlightening. Simply by upgrading my phone a couple of times, I had a subconscious awareness that mobile processors were catching up to desktop performance at a surprising speed. But I had no idea they had come this far.
So, an Apple product is better than...an Apple product? Yeah, amazing innovation there.
@@williamyoung9401 what
Do you really need that much power on your phone
@@williamyoung9401 its Foxconn products with apple logo and dumb customers will buy it 😂
@@AlexMkd1984 I don’t think apple products are “dumb”, also isn’t apple products literally outperforming every android phone out there, and the fact it last over 5 years vs 1 or even 2 years, i’m not defending apple but just saying the advantages of iphone over android.
Never thought I'd see the day when the word "Grandma" is synonymous with "INTEL".......
Burrrrrnnnnn
Lol facts
Facebook is up next
Best cost benefit in terms of computers at least for me. 100% Approved. Nice design, fast, smooth use, battery lasts more than 15 hours working. So practical as a phone. You take it from the bag and just start using from where you were, with absolutely no lag. It's really open and use. Nice nice nice
Groundbreaking! Well done Apple...never thought I'd ever say that as I have never been a big fan of Apple, but there's no denying that chip is a game changer.
The quality of your content is absolutely amazing! keep on! Greetings from Greece.
Lets not forget that the efficiency of the program coding of the software also determines performance speed and the heat factor as well.
How so? If a cooler for a CPU is subpar and can't handle a chip at full load then that is a design flaw in the hardware department.
@@DurzoBlunts if it’s simpler to program for, then it’s easier for devs, who will then have more time in their hands to perhaps optimise the software to its best possible capabilities, there’s deadlines you know?.
@@DurzoBlunts some architecture instruction sets are massively more efficient than others. To do a simple 5+5, one processor may need a dozen cycles for memory access and sheer complexity of the chip, but another might just need half that because it isn’t nearly as complex
It is really cool to be able to witness this kind of development in the industry, I'm exited to see what comes next
New Apple tech now costs an ARM and a leg.
lmao XD
Sounds a bit RISCy
They made a faster chip and didn’t increase the price of the computer..?
It‘s much cheaper than any Windows Laptop with similar performance. Now there is really no point complaining about Apple‘s prices
ROFL
Definitely excited to see how far Apple goes with its ARM design
Well mate, they put it in a fucking ipad now...
@@epistomolokko they’ve been since 1st gen lol
@@bartomiejkomarnicki7506 haha yeah true
@@epistomolokko Would be logic to say : They put it in the iMac and Macbook now! :)
You shouldn't get too excited. The ARM is not a fresh new architecture. In fact it drags quite a lot of "compatibility" and "legacy" crap along with it too. Switching from intel to ARM is just like changing the pile of stink you prefer to sit on. It will not be as disruptive as many people think.
This dude could make a video about plumbing sound all sinister and secretive
How One Plumber Revolutionized Pipes
I own myself. Have privacy... And life has never been better
Welcome to 2030
www.tamperproof.earth/post/welcome-to-2030-i-own-myself-have-privacy-and-life-has-never-been-better
Where do you source all your research from? You level of detail in your video's is awesome!! Thanks, and keep up the great work!!
Apple's Newton was a huge catalyst for ARM in mobile devices - far predating the iPod.
It predated the iphone too, and was about the same size as an iphone. It's the father of the iphone.
Back them, we did use them, HP Win CE too, back then technology was just not that good jet.
Webkit, touch screens needed more development, 20 year later we have achieved it!
?????? newton was a PDA, ipod is a music box. also newton was a flop. apples vs oranges?
As I remember, ARM was created and owned by Apple and Olivetti to power the Newton and Olivetti devices.
Actually we have been using ARM since the game boy!
4 years ago cold fusion- apple has lost its magic
cold fusion now- apple just changed the industry like Steve jobs era
RISC-era Macs could beat an intel PC of the same speed. Apple made a mistake to listen to Intel's siren song about the x86 platform. The x86 architecture became too complicated for it's own good. I am glad RISC/ARM stuck around and now with good R&D has proven to beat out x86 in both performance and energy efficiency.
@@Nphen can everyone make short comments
@@CompilationCorp My above 4 sentences *is* a short comment.. for me. 😂
@@Nphen I agree
took four years though
The M1 is amazing, and it's only going to get better from here (and at a faster pace than intel/amd). However, I think it is a sickening trend to have memory and ssd built into the motherboard. Not only can you not upgrade your storage, but you can't customize anything about it. And when the RAM or SSD breaks, not only can you not buy something to replace it, but it can also cause the entire system to become unstable. Apple doesn't want you to have the ability to fix your own computer if it breaks, they want to sell you a new computer. And considering Apple's impact on other tech companies, this should be reason to worry.
We’ve already seen it, with the death of the headphone jack (which I admittedly don’t miss but it would be nice for people who do), and the lack of micro sd slots on some newer phones
On the one hand, I agree with you.
On the other hand, this is very "stuck in the present" thinking. Do you stress about not being able to replace the south bridge fan in your laptop? Probably not, because it doesn't have a south bridge. Are you upset that it doesn't have replaceable fuses? Are you upset that you can't hot-swap the display backlight?
At a certain point - and we're not quite there - RAM and storage are likely to be plentiful and cheap enough that it simply doesn't matter for the vast majority of users whether they can upgrade them or not. Just like I'm sure you're not bothered by the fact that your TV doesn't have a hundred calibration pots for you to fiddle with.
It also neglects that there can be inherent advantages to getting all of those things as close together as possible. All I'd really ask is that they tone down the incremental pricing for some of the options, but c'est la vie.
one of the reasons attributed to the M1 phenomenal performance is because the RAM and SSD are integrated. Contrary to popular believe, the ability to upgrade RAM and SSD in a Laptop is not a major selling point for majority of users.
ARM is just more efficient, cheap and better than fancy upgrading dreams. You can have better cheap options it's just not apple.
@@MrLoipurz Why are you calling it the "ability to upgrade"? Yeah, most do not care about it.
What they do care about is whether they can fix their machine for $30 and continue using it for the next 5 years or buy a new one for $1500...
I'm solidly a PC/Android fan. But a lot of people misunderstand my distaste for Apple products... I have no problem whatsoever with the technology. In many cases I've admired it for years. What I have a MASSIVE problem with is their product model... None of their stuff is modular and little of it is upgradeable. What they offer you is what you get. And with that Limitations is an unacceptable price tag. In many cases I can build a far better PC than a Mac, and if a part breaks, I can fix it myself. Or more importantly... UPGRADE as newer/better components become available.
By the time Apple started using PowerPC, let alone when they stopped, Intel and AMD CPUs had already switched to translating CISC into RISC µOps, in the case of AMD literally originally based on their 29000 called own RISC ISA. The reason Apple went Intel was simple, because of economics of scale bigger numbers of products sold made it possible for Intel, especially their own Fabs, to invest more money in R&D and sell equivalently capable CPUs for less. And now with smartphones and tablets adding to the already decently sized embedded niche ARM had before, and Intel stuck with their own Fabs at a time of lagging behind, economics of scale add to the advantage of having a custom design for what you want to achieve, in addition to the advantage of not having to translate, but there has never been a CISC chip with more than a low hundrets of MHz, because that doesn't actually work. And all the non-Intel using companies naturally used RISC, all be it different ISAs, because once you do choose that you don't need emulated compatibility to an old ISA, you would naturally choose to natively use the actually sensible design. There is nothing about RISC that is less able to do this or that, or technical reasons, why they were used in higher end or by professional users. The reasons are all about company policies. Intel was chosen, wrongly so, to supply the original IBM machine, and all PCs using Intel or AMD since then, use x86, because they descend from an unbroken line of piece by piece iteratively improved IBM PC clones, that couldn't switch away from x86 without breaking compatibility, or systems that switched to IBM PC hardware, like Apple, because as it had most of the PC market and others were to small to compete.
While it is true that Intel has effectively had a RISC core since the Pentium pro, it still relies on a CISC instruction set. Lots of complicated logic is still required in the decode phase of instructions. It’s also why Intel is only a 4 wide design while Apple is using an 8 wide design.
@@steveseidel9967 Modern x86 designs translate the instructions into the micro-op cache long before they are needed. The entire execution pipeline runs on micro-ops, and the micro-ops are reused many times in typical code. This means you only really need one copy of the translation logic, so it only comes to ~1% of the die area.
As you point out, M1's CPU cores can execute more instructions per clock than any x86 design. The reason they can do this is because of a novel cache architecture that can keep those pipes full. And a lot of new ideas to keep the power consumption down, so the wide execution units don't exceed the local thermal limit (melt).
ARM is not the first RISC design, that goes to MIPS.
I think he swallowed the Kool Aid - Has he even heard about the raspberry pi?
@@JohnDwyer1983 every Raspberry Pi ever made is powered by a System on a Chip built by Broadcom, with ARM-licensed cores.
@@pranjalkanel2711 Yes, but Apple doesn't ARM, Apple are not the first to build a desktop computer based on ARM this year. Huawei made one, the raspberry pi desktop exists and some Android phones have desktop modes when you plug them in. Also, Docker works on the raspberry pi. reports indicate this is not true of the Apple's Arm product.
Arm was meant to be mips for the masses
History often gets taken for granted and forgotten which didn't survive.
He even jumped from 1970 to 1991, when people talks about aapl, world revolves around it somehow :)
Not only did intel got Hit by Apple , They were also beaten by AMD Ryzen at the same time lmao ..
They weren't though.
The 50 dollar price hike on Zen 3, made sure of that.
What's more, Intel has diversified their funds so they can stomach the losses, while AMD has just gotten back up and is highly relient on the Ryzen's margins.
So if Apple keeps this pace up, there's a very big chance that AMD ends up being bought by Apple.
@@Orcawhale1 you think AMD is going to be bought by Apple,???
@@hilal_younus Yeah, if Apple keeps this up. And AMD don't find other sources of revenue streams, then there's a big chance that Apple could scoop them up in 10 years time. Samsung nearly did it, back when AMD only had the terrible FX cpu's.
@@Orcawhale1 maybe , maybe Not , but If Apple Does it , then Their gonna have more resources in their hand , which would mean better chips 🍟
@@hilal_younus We can hope!
Very cool video. I've been thinking about a Mac for some time now, but held off mainly due to price and not feeling like they were worth it... Well, then Apple dropped the M1 bomb, and I still held off... Then I used my dinosaur laptop the other day and it literally took 10 minutes just to boot up. Doing anything on it was PAINFULLY slow... Got a M1 Mac Air last night and I'm pretty excited to learn this new system and also know that I have one of best computers on market that'll probably last me 10-15 years lol... To witness the screen turn on BEFORE I even fully open the computer and then do Touch ID and I'm literally at my home screen ready to do whatever WITHIN a second is pretty f'ing mind blowing to me!
One of the mistakes Microsoft made with Windows was to overload the start-up sequence with too much unnecessary housekeeping. When Apple adopted BSD UNIX for their OS kernel, they did just the opposite....keep the launch sequence uncluttered -- get the desktop up & running in as few seconds as possible.
Once you bite (the apple), you’re bitten (entranced).
Now imagine that chip in a VR set!
GPUs are actually already using RISC, so in essence, VR sets already are using them since the start.
@@isssma0 But you are like a dog on a leash. We need a VR that can stay on for days on its own.
@@MrHichammohsen1 this
@eboy.matthew on ig for nothing 80kg
Imagine that chip in a Fleshlight 😮😮😮
Apple is confused whether to increase or decrease prices of their product😂
@Anurag LOL ..do you even know apple bro?
@@atiqshahriarshourav2958 do you? You clearly don't understand how apple operates now
@@fanban2926 yes, they make each piece separate so you buy more 😂
@@atiqshahriarshourav2958 you don't understand apple they literally realest the best budget phone iphone se in 2020 for 400 dollars and iphone 12 mini for 700 dollars also reduce prices for iphone 11 to 599 dollars and iphone x to 499 dollars.
@@rafalb692 They overprice their stuff
This is full of factual errors. Long time ago ( as in about 20 year ) there were 2 platforms RISC and CISC. ARM is a version of RISC and x86 used to be a version of CISC. Only today x86, just like ARM, executes RISC instructions internally and really the only difference is that software sends CISC instructions to a x86 CPU which are then translated/decoded to RISC for internal processing. Before someone goes all preachy proclaiming that the decoder is the problem it really isn't and it uses between 3-5% of extra power and that statement has been independently verified many times. What Apple did with M1 is not at all brining a mobile CPU to a laptop/desktop. M1 is and evolution of current CPU design thinking and the true innovation here is removing the memory controller and using a "shared" memory pool for GPU and CPU. That's really not that uncommon or revolutionary. AMD has CPU/GPU on chip and sharing memory between CPU and GPU is common practice with integrated graphics cards. What Apple did is they used the best of everything and used what would normally be considered a low end technique to build a high end processing unit ( not sure it can be called a CPU anymore ). And yes the results are AMAZING. Also the reason why this is even possible and you can still run existing software is simply because Apple replaced the hardware decoder found in x86 with a software decoder. And the best part is that once software is reworked to natively use RISC and not need the decoder, you can expect at least 3-5% gain in speed and/or battery performance.
Also, you totally got the history of why CISC was prevalent wrong. It's all about cost of memory. CISC allowed for better reuse of memory and then RAM was very very expensive and there simply wasn't the technological means of having more, CISC was clearly the better technology. But that hasn't been the case for a long time now and today Apple has access to memory that runs at clock speeds similar to what a CPU runs at. That's also why now is the time to make a switch. It also happens to be the reason why Nvidia bought Arm. If M1 is the future of Macs and Intel is not the future of PCs then you clearly have a gap in the market. Nvidia wants to make sure they are part of the conversation especially if the future of GPUs is not a physical board that plug into another physical board.
And what explains the battery performance on ARMs ?
Nerd
@@Rasecz I The "CPU design", duuuh
I am happy to see someone with brains in the comment section. This video is so damn misleading and full of errors. How on earth he can say that a retarded iPad is more powerful than a Laptop 😆🤦♂️
I've been following the industry for forty odd years and this is dumbed down somewhat, if it wasn't it would have been a series, but it's fundamentally accurate. RISC and CISC kinda, sorta merged, the key point was, an instruction has to earn it keep. So Intel has put a lot of RISC elements in their designs, they had to, to improve performance. Where intel is fucked, is its CISC legacy, all those old instructions are still in silicon. ARM on the other hand, has a relatively clean RISC legacy.
I used to view the Discovery Channel and such a lot. Then no more Dish and such. Due to COVID I rediscovered TH-cam. There are so many channels, programs, shows, and documentaries on TH-cam. I don't know how such great programs can be produced to view without being connected to the big cords. How nice!
AMD's Ryzens are getting superhits lately and Apple ditching the intel. Intel's golden era is surely over!
In desktop and laptop maybe but not in the business server world.
@@havu2236 epyc Rome loves to disagree lol
@@yvtiwari haha
@@Aecor Yeah been hearing about how AMD will unseat intel on server business base for decade and still nothing. Like how they said Athalon will take Intel down.. Still waiting and never will.
@@havu2236 for a decade ?
Nuclear powered reactor power plants and Small modular reactor ( Nuscale ) deserve a video !!!
I think Bill Gates wanted to test something like that in China before Trump became the president.
I agree it’s the only way to get 0 emissions
@@Spengas ya and it can is the best hope to reach the Paris accord and better in the long run
@@Spengas That's not true, but it is interesting tech.
Subject zero science made a video about this recently
26:15 "these machines could be less expensive" I can't see Apple ever dropping their prices.
They've already dropped $100 from the mac mini.
@@clydeng87 Well that's a start I suppose.
@@clydeng87 They increased their prices with the iPhone 12 line-up.
@@laturista1000 I have 2 PC laptops from 15 years ago. One is a thinkpad, the other a HP. I also have an Amstrad CPC 464 from 1988 They still work flawlessly. What's your point?
these machines means the other machines like windows notebook pcs or chromebooks etc running better arm chips
I think it's important to explain why at 8:30 he says the chip was still running. It was running on the extremely low voltage introduced by the measuring device, it wasn't just running off of God's will.