3:02 why tf is this guy holding his phone upside down
cause hes me
That clip is definitely ai generated
Its AI, look at the fingers
AI generated 😂
He thinks different
M4 on iPadOS is like installing thruster engines on a horse with no legs
Our new racing horse can achieve speeds of up to Mach 4!**
**if it had legs and could actually survive running at those speeds
38 TOPS, ray-tracing, but still can't run macOS fucking lol
@@civilroman still doesn't have a file manager with file system access!
What even are you gonna do with all that power? Makeshift oven? AI generate all your art?
That cracked me up real good.
people during cold war: The arms race will be wild
the ARMs race now
expectation: America and the gang vs Russia and the gang
reality: American vs another American, while Taiwan supports everyone
The hot wars
It's like that translator joke with Les armes
Apple has basically had the lead for so many years, i doubt they will lose it
@@Lenfer-hp3ic Taiwan is literally just 80 miles from China, so without America, Taiwan is f'kd in a few hours.
Apple car charging was hilarious.
It's not an ARMs race, it's just hotwheels with a hot charge, USB 4.XXL
it wasnt
@@Moli05 dude, Apple is nothing, ever since Jobs died.
Now it's just another brand, take it easy. 🤡🤡
@@Moli05
Apparently you’ve never seen an apple magic mouse
😂
Idk what to say but I had tacos for dinner and it was a banger. Destroyed my toilet in the process but it was worth it at the end of the day.
nice
no pain no game reversed
Deep , definitely an interesting topic
I had quesadillas + a burrito-style salad and I am currently in the same process you were in
Are you lactose intolerant or something? I don't get people who say that tacos make them poop
The car got me 😂
should've stayed out of the road
@@iverssonkaufert7619 lmao. looks like the car got him again.
timestamp?
@@subhashgottumukkala 0:11
Yeah the car fr 😂💀🤦♂️
Where does this dude get his stock footage? 😂
Tenor, Giphy and more. Probably easy to find if you just search their expression on Google
YouTube
meta-llama
AI is cheaper than stock footage.
And who decided to make stock footage of a couple of guys in suits fighting each other with gardening tools?
Arm out there playing 4D chess selling shovel blueprints
Agreed. Hoping arrow lakes nice.
Brilliant metaphor
singularity blaster turbo pro max, i knew it all along
😢
It's the future
Honestly, I would buy it just because of the name (I'm broke)
I'm waiting for the Super V2
starting from just $9,999 (8GB RAM version)
2:58
I didn't know I needed this piece of stock footage in my life. Thank you
It can run a LLM on the laptop but you still can't replace the battery.
Those two are irrelevant
you can, what you smoke?
..,,::;;!!!????@@gigamoment
@@weiss588 shhhh, he's trying to make the lulz.
I can run an LLM on my S20 FE, that isn't special, people have run LLMs on hacked Switches.
Impressive breakdown of the technology war between major players in the ARM chip market. I find it particularly fascinating that despite the CPU advancements, x86 still holds strong in infrastructure due to its age and versatility.
"x86" CPUs have not been "x86" internally for a long time, it's just a layer on the outside. You should listen to CPU architect Jim Keller who says that x86 is not really a substantial disadvantage.
Case in point, AMD is goint to release Zen5 based StrixPoint and StrixPoint Halo, which to me seem targeted at Apple's "M" SOCs.
@@swdev245 yeah it’s not the architecture, it’s just Intel and AMD's disregard for mobile chips. Now they’re scrambling to deliver much more efficient chips with Strix and Arrow Lake
Was it an AI generated comment?
@@BernardoLeon yeah looks like it, click on his profile and see his comment history
@@BernardoLeon Which comment do you mean? I wrote mine with natural stupidity ;)
I legit thought they made apple version of the M4 rifle
now i really wanna see a rifle made by Apple
@@ea02ca6f featuring the thinnest scope bezels apple has ever made!
its an M4 rifle pointed at the snapdragon
I'm waiting until the Apple M16 comes out.
@@ea02ca6f each bullet is 100 dollars
1:30 the grindr joke was peak
I was hoping someone said something. Genius!
I didn't get it, what was the joke? The app?
@@lucasjritter I don't think there was one. Just the fact that Grindr exists is hilarious apparently 🤔
@@BrysonThill gay people are really funny. That's probably why they're so happy
@@lucasjritter The very fact that Grindr was the featured app was the joke. As a gay man, this joke gave me all the good feels.
Ipad Hardware: 😎
Ipad OS: 🤡
How bad is it?
@@PapiDey-dv4gw it's bad out here man
@@PapiDey-dv4gw iPadOS is like a nerfed macOS. It's better than iOS but worse than macOS.
Apple refuses to put macOS on it, I guess because they would have to kill iPadOS. And the lack of supported apps makes you ask the question: why not buy a MacBook instead?
@@wlockuz4467 My guess is macOS is not on the iPad because macOS allows apps outside of the App Store, so there's less revenue.
@@wlockuz4467 they don't put macOS on it because people wouldn't buy MacBooks anymore...
The Grindr reference and Apple car were wild. 10/10
I hope he knows what grindr is cause if not it would be SO FUNNY
1:13 that's actually a myth. While it was true several years ago that x86 was CISC and ARM was RISC, and that was one of the reasons the latter was more energy efficient, nowadays the latest and most advanced iterations of x86 and ARM have blurred the line between CISC and RISC. The x86 processor now uses a decoder to translate its complex CISC instructions into a RISC-like form internally, and ARM has introduced a lot of CISC-like extensions over the years. The actual reason x86 is more inefficient[citation needed] is that it needs to support a lot of 30+ year old legacy code, though Intel has recently proposed a specification called x86-S which aims to remove a lot of legacy cruft from their processors while still maintaining reasonable backwards compatibility.
Wrong! It’s exactly this CISC decoder unit which consumes a lot of energy.
Furthermore there was the famous IA64 experiment which failed.
Looking even further, you have to distinguish between software that runs for companies versus for private consumers. They have different development life cycles and budgets. There is a reason why COBOL still exists today. I don’t know of any large migration project after Y2K. Usually software stalls in further development if its code is 1) unreadable 2) test coverage is bad or low.
@@javastream5015 the decoder is only about 5% of die area in modern CPU cores. Most transistors and energy are spent on branch prediction, handling out-of-order execution and cache.
@@javastream5015 OK, I get the point about CISC-to-RISC translation adding overhead, but why bring up Itanium?
Yeah, it failed. But so did all the other times Intel tried to "replace" their x86-based cores.
From what I understand, x86S is about removing/deprecating old 16- and 32-bit instructions. In other words, making x86 more focused on x64 and beyond. That is not an attempt to replace the current architecture, just a revision.
@@MichaelDeHaven Back to x86S: Yes, it can be done, because Apple did the same with the M1. Apple removed support for the 32-bit API one OS version earlier. 👉A lot of old games (like Civilization 4) stopped working.
To run old software, emulators are needed. Then you have stupid companies like Parallels who didn’t want to build an x86 emulator into Parallels Desktop so that old x86 (and x64) virtual machines could still be used. ☹️However, as I have read now, Windows on ARM supports emulation of 32-bit x86 software (on 64-bit ARM). 👍
👉Long story short: it could work technically. But you would make a lot of owners of legacy applications angry without proper emulation support. Just look at what Apple did.
@@MichaelDeHaven x86S isn't really about deprecating instructions (that could be devastating for compatibility with older software - Intel's proposal would require changes only from operating systems), but about deprecating unused features. For example, x86 chips first boot in 16-bit mode, then they can be switched to 32-bit and then finally to 64-bit, adding a lot of unnecessary complexity. There's also the entire real and protected mode ordeal, together with the unused execution rings 1 and 2, and finally some things about the segmentation model.
As for the entire RISC vs CISC problem on x86, in my opinion the decoder and all the execution units (ALU, FPU, ...) still have to be prepared to deal with a lot of instructions. x86 contains more than 186 opcodes for just the ADD instruction, with the option to provide an immediate value or an address from memory, each of those for 8-, 16-, 32- and 64-bit values. Adding a MOV instruction just before that and supporting only registers would have caused a huge reduction in the opcode count, but it's a little too late for that. It shows that x86 was designed to be programmed directly in assembly and to offload more complexity to the hardware rather than the software, which is just not how we use computers today, with many abstractions above the hardware.
Intel's official documentation for x86 in its current form has exactly 5060 portrait Letter pages of really well-constructed and well-written text (one of the best documentations I've ever read) and yet it's still incredibly big. Sometimes less is more.
Love these videos. Wish all news was delivered with this short format, high quality, dry and clever humour!
Why is he starting to sound more like a person and not an AI
because AI gets better
Because he's using AI
He did previously mention that his AI sounds more human than him, so...
😅😅😅😅😅😅😅
AI's getting better :)
ARM was introduced by Acorn Computers as a processor for their desktop computers; the chip design was eventually spun off as Advanced RISC Machines.
They found out it was extremely power efficient in its original design. It also happened to outperform products based on the 68000 series at the same clock speeds. And I mean ran circles around them. Prior to RISC architectures, CPUs measured performance in clocks per instruction, not instructions per clock. This was the whole motivation for RISC. RISC architectures introduced the first CPUs that executed 1 instruction per clock. This was a massive improvement over heritage CISC CPUs.
underrated comment
bump
Also, it kept running without the power supply for a while.
@@brunesi the first ARM processor, when showcased to the BBC, was even unplugged from power. The leakage current flowing back from the connected monitor into the motherboard was enough to run it 😂
@@brunesi not without a power supply. But it was able to run from leakage current through the input pin diodes. Still impressive though.
Why is the dude at 3:03 using his phone upside down?
Some old phones were built like that. I have one at home
Remember the time when Samsung phones had the charging port and headphone jack on the top? Because I do.
@@Noredia_Yuki yeah but where is the camera and why are the speakers on top then lol
it's AI-generated stock video
@@Noredia_Yuki i think i would prefer that
This is just moving from one proprietary architecture to another. WHERE MY RISC-V SQUAD AT?
They're not here because they're having issues compiling their web browser for RISC-V
well at least there won't be monopoly in this arms race
@@TheDanielLivingston I laughed entirely too much at your comment
1:45 actually Microsoft was trying to push ARM / equivalent battery-efficient stuff before this, but nobody (Qualcomm) took it seriously
until Apple Silicon
This. Everyone forgot the first Windows tablet (Windows XP Tablet PC Edition, anyone?), and shat bricks when the iPad launched. Same with ARM (Windows RT, anyone?), no one wanted to hop on the ARM train back then. Some tech needs Apple's hype train.
I have a Windows on Snapdragon laptop and it's great. I do work for Qualcomm as a software engineer though, so I may be biased.
Common Microsoft L
i like your profile picture
@@TheDeveloperGuy I heard that Windows RT sucked ass, and maybe that's why nobody really wanted to jump ship. It also had no x86 compatibility layer (and the existing software library is the main selling point of Windows these days anyway). Without Rosetta functioning as well as it did, I don't think the jump would have been such a no-brainer for many people as it was.
why get a measly 38 trillion when you can get 45 TRILLION
I had to check that figure. Amazing! It only takes an electron 3 billionths of a second to do one orbit. How can they do 10,000 operations in the time it takes an electron to do an orbit?
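Quick back-of-the-envelope in Python, taking the ~3 ns orbit figure above at face value and the 38 TOPS headline number (both are assumptions here, not confirmed specs). The trick is that the TOPS figure is the aggregate of many multiply-accumulate units working in parallel, so no single circuit is switching anywhere near that fast.

```python
# Back-of-the-envelope: how many NPU operations fit in one "electron orbit"?
# Assumptions: 38e12 ops/s headline figure, and the ~3 ns orbit time quoted above.

tops = 38e12            # claimed NPU throughput, operations per second
orbit_time = 3e-9       # one electron orbit, in seconds (commenter's figure)

ops_per_orbit = tops * orbit_time
print(f"~{ops_per_orbit:,.0f} operations per orbit")   # ~114,000
```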
@@blucat4 If one human can eat 5 tacos a day, earth can gobble up ~40 billion tacos a day
Can't wait to buy myself the new Macbook Singularity Blaster Turbo Pro Max (with extra storage of course)
It's going to have 4 GB of unified memory, and each additional 4 GB is going to cost $400.
@@segiraldovi Sounds like a deal to meeeee (this message is sponsored by apple)
I already prepreordered mine.
But I already sold my kidneys and liver
Sounds like super soaker water
3:04 I was physically hurt by this
The M4 seems to be the fastest CPU chip on the market for running AI. You'd still want a dedicated GPU if you can get one. While the M4 gets near 40 TOPS, the 4090 gets around 83 TFLOPS (trillions of floating point operations per second) at around 32 bit precision. I think the M4 is rumored to be at 8-bit precision for their current TOPS quote. No confirmation on the 8-bit rumor though, so it's good to be skeptical of that.
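Rough math on that comparison, using only the figures quoted in this thread plus assumed power envelopes (all ballpark marketing numbers, and quoted at different precisions, so it's not apples-to-apples):

```python
# Ballpark comparison from the numbers above - different precisions, so NOT apples-to-apples.
m4_int8_ops  = 38e12    # M4 NPU, assumed INT8 operations per second
gpu_fp32_ops = 83e12    # RTX 4090 FP32 shader FLOPS (ignoring tensor cores / sparsity)

print(f"raw ratio: {gpu_fp32_ops / m4_int8_ops:.1f}x in the 4090's favour")   # ~2.2x

# Perf-per-watt is where the M4 story lives (power figures are rough assumptions):
m4_watts, gpu_watts = 10, 450
print(f"M4  : {m4_int8_ops / m4_watts / 1e12:.1f} TOPS per watt")      # ~3.8
print(f"4090: {gpu_fp32_ops / gpu_watts / 1e12:.2f} TFLOPS per watt")  # ~0.18
```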
I mean it's quite obvious that Nvidia's chips are the go-to for AI-related stuff, but it's still impressive what performance the M chips can achieve with how little power they need. Sadly, they are OS exclusive.
@@mikadofxx9030 They aren't. Apple has specifically made sure that third-party OSs can run, and can do so safely and without sabotaging the system's security model.
The drivers and some other important bits of code being private is of course a big obstacle, but it's one that can be and has been defeated, as the Asahi Linux team has proven.
To add, Apple themselves encouraged Microsoft to port Windows to Apple silicon... why? The answer is that macOS makes them practically no money. Apple is a hardware company.
@@chri-k, sources? Note, links may not work. So, just name the sources.
@@CarlBach-ol9zb Apple's documentation on the boot process as well as documents from the Asahi Linux project
3:07 why is bro holding phone upside down😂😂😂
Well I guess I'll have to throw away my M2 MacBook Pro I bought a year ago /s
Yeah Apple’s gotta stop
O B S O L E T E 😂
You could throw it in my bin though
@@veenmikki27 Nah, this is the fault of the person throwing it out.
From my experience old Macs (and almost every higher-end computer and phone for that matter) work perfectly fine multiple years into the future
@@IDKisReal2401 True. I would have been watching this on my 2017 MBP if it weren't for the fact that I accidentally dropped and broke it a few weeks ago. And that's considering that those models were criticised for keyboard, battery, and fan problems. Still sucks a bit that they announced the M4 when I just bought an M3 MBA as a replacement.
People who spent $6000 on an M2 Max MBP less than a year ago genuinely deserve compensation.
THE DRAGON FROM ULTIMA ONLINE! subscribed
~flashbacks of farming gazers with tamed dragons... or of stealing people's shit from their bags with a thief. Or fighting them, disarming them and then stealing the weapon from their bag. Still one of the funniest things I remember doing in a game.
Vas flam
@@hmthatsniceiguess2828 Or being chased by people wanting to kill you only to provoke nearby dragons and wisps to attack and chase them. It really was a great game
@@johannesandersson9477 "Cor Por"
The standard greeting in UO before 99% of the population moved to Trammel.
im just here for the UO dragons
Surprised you skipped a Llama 3 Code Report. But thank you for this!
I hope to shadow jesus that Qualcomm doesn't screw it up
not qualcomm, but windows with their shit os
@@weiss588 I don't care. If not Windows, what will you do with the CPU? Either they do it with Windows, or at least with Android and Linux.
there is always Linux to save the day, but Windows has the corporate user base, so I think there isn't going to be a real difference.
@@omarjimenezromero3463 True. Corporate users don't care about ARM chips though. All they need is performance, not efficiency.
Congrats on 3M subscribers 🎉
Siri should get a rename, just distance themselves from that entirely.
This guy has a better sense of humor than most professional comedians
Awesome... the idea of 'The Code Report' is actually fantastic... keep up the good work Fireship... ❤❤
1:03 NOT THAT DUDE SPECIFICALLY 😭😭😭
0:09 nah bro didn't have to violate apple this hard 😂
This brings to mind the early days of technological advancements when each leap significantly altered user interaction.
I like how there is literally no information about real performance or benchmarks, only ultra-mega-teraflops of some AI trash.
0:19: Not unironically, having a car that charges from the bottom would not be so bad (EVs with induction charging, for example)
a battery swap system like BYD uses is the best tbh
When can we start using quantum dots for computing
Why does Tim Cook always look like he escaped from an old people's home in Florida?
We early today boys! Imagine the future M420 chip able to mine 21 bitcoins per nanosecond while also generating 1000 LLMs/second
Ah yes, making bitcoin worthless, fun.
Why LLMs per second? Just go with nano like the other one, or go even lower, picoseconds
All that computing power and the first thing you think about is bitcoin
@@ctrl_x1770 the first thing he thinks about is probably milf videos
Haha love this comment 😂
It's incredible how this channel informs, entertains and maxes out on +100 sarcasm at the same time.
this hardware thing is getting wild ngl
I was going to say, “now if only software would evolve so much”, but with AI like these, who needs software?
fr fr
@@TheScottShepard isn't AI itself software?
yea seems like we haven't hit the peak of the curve yet 😀
Project Lavender, just another advancement we can all learn from.
I love that you upload your videos all the way up to 4K resolution but half of the clips are 720p footage that you grab from other videos 🤣
Shout out to the Ultima dragon reference
yea, seeing that was a blast from the past!
At what point is your reduced instruction set called on to do all the instructions anyway???
Apple debuting their new processor on an iPad is like Lamborghini unveiling their newest Supercar in Los Angeles traffic
Buhu dude
The iPad is a great device. Try using it once, you will see
@@Aneliuse while the iPad may be a good device, Apple is the worst and most scummy company on the planet, so no doubt it was used on the iPad because their M4 isn't as good as they say it is, as usual
I'm not sure if I enjoyed the comedic value more or the actual information, but it's a perfect combo👌
Still can't run Counter-Strike at 60 fps.
*starts singing my computer just became self aware*
wake up honey, another upload this week
If it's wild, it's good for us. Competition only benefits the customer.
Please what was that grindr jumpscare i'm dying- 😭
Congrats 3 million subs !!
1:30 boyyy lmaoooo
It's official- WWDC is now called AIAI
Hope they haven't included any unpatchable security vulnerabilities this time round. I just love security flaws baked into the silicon
Ever since ChatGPT was released, it's nothing but AI.
I'm all here for it.
Got to say, there's no flex like running Dolphin Mixtral on my M1 Max locally with better response times than the web interface of ChatGPT. Their chips are absolutely nuts, and if the M4 is focused on AI workloads I dread to think what we'll be able to do on any device in three years.
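For anyone wondering what running it locally actually looks like: a minimal sketch assuming Ollama is serving on its default port and you've already pulled the dolphin-mixtral model (swap in whatever runner and model you actually use - llama.cpp, LM Studio, etc.).

```python
# Minimal local-LLM call. Assumes Ollama is installed, running on its default port
# (11434), and that `ollama pull dolphin-mixtral` has been done beforehand.
import requests

def ask(prompt: str) -> str:
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "dolphin-mixtral", "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]   # Ollama puts the completion under "response"

if __name__ == "__main__":
    print(ask("Summarise the difference between TOPS and TFLOPS in two sentences."))
```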
Saw the channel for the first time and the sarcasm is really on point with his pictures. I love it!
It has “ray tracing” yet it can’t play games
That grindr picture caught me off guard
Bro that ultima online dragon
A real OG 🫡
3:56 what in the actual fuck lmao
The death of x86 has begun.
👍
" Hell, It's 'bout time. "
If the 100+ TOPS figure is true, then I don't think so
We wouldn't need ARM SoCs if Intel made good, energy-efficient SoCs
too bad that ARM sucks as well
Congrats on 3M subs🎉
Since we're on processor topics, you should talk about or do a 100 Seconds video on the RISC-V architecture, an open-source, royalty-free ISA (compared to ARM) that's making decent progress in this race as well.
Your use of Tom Lea's painting of a shell-shocked Marine from the Battle of Peleliu IS UNACCEPTABLE - REMOVE IT.
On-device AI is great. Today I tried some Whisper audio dictation on my MacBook and it worked great, while on a similarly priced Windows laptop it took more than 10 times as long and was unusable. Very soon, having a good AI/neural chip will become essential in almost every device, and Apple are definitely ahead at the moment.
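For reference, that kind of on-device transcription is a few lines with the open-source openai-whisper package (a sketch - the "base" model and the file name are placeholders, and on Apple silicon many people use the faster whisper.cpp port instead):

```python
# Minimal local transcription sketch using the reference openai-whisper package.
# Runs fully on-device, no API calls; needs ffmpeg on the PATH to load the audio.
import whisper

model = whisper.load_model("base")      # downloads the model weights on first run
result = model.transcribe("memo.m4a")   # larger models are slower but more accurate
print(result["text"])
```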
Are the newer ones released already, or are you using a 2023 model?
Seems Google takes the middle ground and submits it to the cloud; while it takes a little longer, it comes back on the first try as well...
Thanks for this channel dude, it’s super valuable to stay up to date on all this stuff which is moving so fast
Bro who gives a FAAK if your gadget has Ai in the thing ?
The Ultima Online Dragon was perfect. Thank you for the throwback :D
ARM does design CPUs and also licenses out the architecture. Most companies use the off-the-shelf ARM designs, whereas Apple and now Qualcomm design their own CPUs using the ARM ISA.
RISC-V is also another x86 killer :p
Glad that someone mentioned risc-v lol
Apple doesn't have its own SoC, it's made by TSMC and sold like an Apple product. Pathetic loser company, people believe in Apple's lies
@@eddiewang9833 Apple doesn't design it, it's made by TSMC and sold like an Apple product. Pathetic loser company, people believe in Apple's lies
Is that the logo for Inter Inc. at 1:00 ?
(As in “bury” rather than “between”)
Ghz?
I sleep
TOPS?
NOW THAT'S WHAT WE ARE WAITING FOR BABY.
Really tops those charts
I’ll see myself out
She ghz on my tops till' i M4
I wish I was topsed
GHz development stopped 15-20 years ago.
If it's only applicable to NPU operations per second, then OK, meh...
3:05 why are u using stock footage like that xD
There are so many factual inaccuracies in this video
Ha! But you are not going to tell us 🤣😂
which ones?
@@chiluco2000 This was my first watch and I caught these errors. (1) The article quoted that claims x86 is dead is 90% wrong; only the last paragraph is even remotely correct. (2) The structure of x86 is not based on transistor count but on the number of instructions the CISC chip can use to perform its micro-operations. (3) ARM was poorly described in the video: he based the argument on instruction sets (RISC) and then went on to describe it by transistor count. That is not really the reason ARM is considered better for mobile devices and devices with simple, specialized functionality.
the Grindr screenshot is crazy
1:05 Apple Macintosh computers / macOS ran on RISC (PowerPC) processors for over a decade before the switch to x86 announced in 2005
3:51 AI cringe maxxing 💀
Plot twist, most people don't even need AI. Instinct works just well enough.
1:23 "RISC requires fewer transistors per instruction..." What?! You should have said that CISC and RISC have not the same instruction set (as the meaning of CISC and RISC implies) so RISC cpus are simpler to design and build thanks to their less complex instructions set. They usually need less transistors because they do not implement complex instructions (such as REP MOVSB wich is used to copy data from a memory location to another while decrementing an operator and checking for its value). To perform the same operations with a RISC cpu you need to code a loop with at least 4 instructions in it.
We need more TOPS!!!
says every bottom ever
That picture at 0:55 is fricken hilarious.
The AI cringe race is fierce.
My only question is what does that mean for games? Or will it be fine as long as the OSes run properly?
The iPad Pro is the best product to watch Netflix and YouTube.
I’m using one right now
0:44 that dragon is from ultima online
I still don't understand why AI has to run natively. Most people will use simple prompts, all they need is a mobile connection.
maybe speed difference
privacy matters, like govts fine tuning LLMs on top secret military data, you get the point
because you don't want to send your private and company data to FAANG and NSA ?
Speed, cost, privacy. For simple tasks a high-performing fine-tuned small model running locally will be sufficient. Apple developing both their own hardware and software can achieve this very effectively.
the ability to train custom extensions for it
Bro that "Intel, DEAD inside" got me. I just can't..
"Fastest AI chip in the PC market" at 30tflops of what we assume is INT8, meanwhile the 4090 is over here doing 660TFLOPS of FP8.
Unless you run your 4090 in the cloud to access its compute power remotely your point is invalid.
also, afaik there's no API to interact with it, so other than Apple no one else can run code on it.
Now compare power consumption.
it also uses 660 watts of power geg
@@2pingu937 why would you need to run the 4090 in the cloud? Just use a VPN tunnel from your smart device to your home network where your computer with the 4090 resides. That's not really hard to facilitate.
when was yesterday?
That jab at the magic mouse 😂😂
Congrats to 3 million subs
I worked on the M4 and I can’t even be mad at that electric car joke. That was gold
But will it have a Turbo button?