I was watching some of your early catalog yesterday wondering when you were going to return, good to see you back.
That's good to hear :) thanks for the comment!
As always, very well researched and presented. Very interesting video! Keep up the great work, your channel is utterly underrated!
Many thanks! :)
Amazing to know that INTEL had the chance to power EVERY iPhone on the planet... and passed on it. X-D OUCH
Really great work here. So much I didn't know! I particularly found it amusing that the people that worked on a power-drawing blackhole like Tejas moved directly to the low-power Atom. And all because the low-power experts were given away. 😂🤦♂
Thanks man! Yeah I was like "they did what?!" when I read that haha
FWIW, Intel sells ARM devices today! Some of the FPGAs that Intel sells have ARM processors in them (called the HPS, or hard processor system). That came from the acquisition of Altera in late 2015. So if you want a chip with an ARM processor in it from Intel, you can get one. Not low power though.
This was such an interesting history. We can see how Intel has learned and is taking slow, calculated steps with RISC-V.
Thanks!
Your videos are so fascinating.
Intel's “Jasper Lake” N-series Pentium Silver and Celeron CPUs run at a power-saving 6W - 10W TDP. Trying them out in Gateway and Lenovo laptops, both the Celeron and Pentium Silver CPUs deliver impressive performance at such low power... But to think of the PAIN Intel felt when it passed on the iPhone... and tossed out all the ARM and XScale development.
The Future is in Low Power, especially considering the 2nm and smaller designs Intel is working on, so TINY they really can't get overloaded on too much power or heat...
Fascinating! I appreciate your research and insight on this topic. I remembered while watching that I found an Intel CPU in one of my HP iPaq PDAs while disassembling it for service. At the time I thought it was odd that the PDA had what I assumed was an x86 CPU, but you've confirmed otherwise.
Your content is consistently a real treat. My thanks for the quality content you produce
Thank you!
I did not know about this. Thank you for telling us.
You're welcome!
Glad to see you back with another piece of computer history.
Thank you!
Fascinating how Intel had a dedicated ARM division back in the early 00s...
Of course, things come full circle and I have zero doubt Intel will be designing ARM chips themselves before long.
I highly doubt this. The inherent advantages of ARM over x86 in the modern era are mostly nonsense/non-existent. Actual core design/layout matters way, WAAAAAAAAAAAAAAY more than ISA. Modern x86 isn't really CISC, and modern ARM is a lot less RISC than it once was.
I’m happy for your upload, please keep it up and don’t make me wait longer for another video!
These kinds of videos are a treasure due to the enormous research that they entail.
Thanks!
Just for the sake of completeness, I would like to hear about AMD's attempts at making ARM devices. They did release the Opteron A11** series of three SoCs, but very little information exists about them. And at the time AMD was working on the Zen x86 architecture, which turned out to be actually good, so it is unlikely they will make any more ARM devices soon.
There's more juicy details to the Opteron A1xxx story 😁
AMD was actually developing Arm server chips for a certain Seattle-based cloud provider (it was an open secret; the codename of the Opteron chips was Seattle). It's not clear why they didn't follow through - it might be that K12 (AMD's high-performance ARMv8 architecture) wasn't meeting expectations, that AMD decided to focus efforts on Zen (the safer bet in the short term), or that the cloud provider simply decided they could do it better alone.
In the time since AMD canned Arm server development, that same cloud provider has formed its own in-house chip design team and delivered several successful and industry-leading products (such as effectively inventing the DPU, and delivering three increasingly impressive generations of Arm server SoCs).
That cloud provider now has Arm server SoCs that are frankly way better than anything Intel or AMD have in terms of perf/watt and density, and Arm server market share is on the increase.
I wonder if AMD might regret messing up that relationship someday 🤔
AMD Soundwave incoming 🤨
Welcome back, it's been a while!
Thanks!
Thanks for an informative and great video!
Thank you!
Finally a new video!
The DEC breakup is pretty crazy. Intel's acquired IP and staff went on to fix Pentium 4 missteps and helped build the Core and Core 2 designs.
AMD also snapped up parts of DEC. The Alpha interconnects inspired HyperTransport, and AMD got DEC engineer Jim Keller, who helped create x86-64 and designed the Opteron cores that smashed Intel in the early 2000s.
A fascinating time for sure - thanks for the comment!
Please don't let us wait another half a year for the next one :D
A minor note ... "DEC" was pronounced as "deck" by DEC employees and the general enterprise computing market back then, not spelled out letter by letter. I was at Compaq when DEC was acquired.
The missed opportunity with the iPhone and the selling of StrongARM/XScale was arguably Intel's worst mistake in its entire corporate history. Worse than the iAPX 432, i860, Itanium, NetBurst, and their 10nm fiasco.
I am glad they are out of this market. They would have only inflated prices & nothing more
Very interesting story, thanks!
great video
Thanks!
Intel management making bad decisions? I'm so surprised 😱 /s.
I was the one that led the reverse-engineering on a number of HTC smartphones around 2003. I had 9 of them, including the BlueAngel and Universal, and they were absolutely fantastic. The story behind the PXA270 is more embarrassing for Intel than anything they've ever done since. There are very few people in the world who know the details.
@FullyBuffered if you want more of the insights here - like the fact that Intel did a deal with ARM to "fix" the ARM11 core (and why, hilariously, they never sent them a single line of HDL) in return for only GBP 100,000 and a royalty-free, in-perpetuity ARM ISA license - let me know, I'm happy to do an interview. This is historical information (as well as very funny, given how ARM and Intel make themselves out to be all-knowing), so it's kinda important to get it recorded.
And now Intel signs an agreement with Arm.
It's a shame they ditched this. Imagine if it had carried on - the number of Atom products that could probably still be of a lot of use today. Back in the day it would have been a cheap Linux laptop dream, plus tablets and handhelds, and we wouldn't have gotten the sub-par IGP and HD Graphics built in, but something with a bit more oomph. My Linx Vision 8 streaming tablet has a quad-core Atom chip at 1.6GHz, crippled by single-channel 2GB DDR3 and some poor HD Graphics that make a GT210 look like a powerhouse. If the mobile market had been more open than the Centrino package, we could've gotten competing chips for graphics. I imagine a lot of "Steam Deck" style devices would've been more popular; it could have thrown the Wii U and Switch into the dirt if they were able to push it with the knowledge the engineers had. I wonder how Nvidia buying ARM would have gone if Intel was the predominant leader in that platform...
Intel's ARMy of 0ne ... :-( Rest in peace, Intel ARMs. We can only dream of what a supercharged 2nm StrongARM could have been...
Fully Buffered, or the real Malcolm in the Middle?
Oh my gosh, what a beautiful pair of eyes 🥰 BTW, this video reminded me of my good old Palm m100 with a 16MHz Motorola CPU and 2MB of RAM... :P
I have an ASUS Z7S WS for sale, would you be interested?
?
Thanks for the offer! For now I will have to pass on it unfortunately
Thank you for the example of excellent British English speech.
hiya
I got a few points of constructive critique here on your video.
#1 Please stop editing in these _extremely annoying_ supercuts (zooming in, synchronized with your points of emphasis, enlarging the picture) that focus on arguably important points - it is not only childish and lowbrow, it's also painstakingly lame and Instagram-esque. It really disrupts the viewer's concentration and makes one want to just move on, no matter the content. It's a silly cry for attention and we all dislike it.
#2 If you add in crucial parts like verbatim texts from interviews, please bear in mind your viewers' ability to actually READ them - _these are overlapped by the player's infamous duration/scroll bar at the bottom of YouTube's video player itself!_ It is plainly impossible to read them when they scroll into view from the bottom and are mercilessly cut off the moment they are fully in the picture. Please leave your viewers a moment to read them, and only then move on to another scene.
#3 In general, slow down the pace of your imagery and let each scene make its impression on the viewer before changing the picture. Leave text on screen longer so it can be read, and mind the video controls overlay covering your actual content (making it impossible to read/watch when paused) - you can easily leave the content on the screen and already start speaking/explaining the next point while the discussed content (websites/screenshots/tables/interview texts) is still there to be read.
I sincerely hope that you don't mind my direct approach and take it as constructive/positive input for your content's betterment.
The information you present is just way too valuable to be tarnished by the silly presentation, which makes it extremely difficult and annoying to extract even the major bits of information from your videos.
Still a great video for its informative value! Keep it up, please. ❤
What a mistake
I love those Pocket PCs.