The way he so stoically and eloquently brushes aside the question of how it feels to be part of the ARM revolution, and then gives most of the credit to the companies, universities and designers, earns him all the respect he needs.
There is a case for modesty. He's one of the poster boys for sure.
Exactly my thoughts; most people would brag about that, great person!
I believe devjock is referring to 11:38
@ ~ 9:45 or so: "...effectively we were powering the chip [(the ARM processor)] from the signal inputs..."
Holy cow!
Yes, about a 10th of a volt I believe. Watch The Micro Men here on YouTube. They replicate the incident.
Volts are a measure of voltage, not current or power. There's no way 0.1V could be enough voltage to enable a CMOS chip to operate, especially in that era. I assume you mean 0.1 Amps or 0.1 Watts.
MIND == BLOWN
I can't think of a modern analogy for what that means. Maybe an audio amplifier running off its sound input, haha. Crazy.
A cell phone being powered by an incoming call, I guess...
When I was studying 25 years ago, a lot of computer science people had "No RISC, no fun" stickers on their doors here in Germany.
Haha how cool 🤣
ok?
Brings back memories. I joined Acorn from college in 1983 as a lowly technical writer, in spite of my BSc in comp. sci., because it was the only company I wanted to work for and I would have mopped the floors if it got me in there. I didn't learn about the ARM until one of the guys working on ISO Pascal for the Beeb (which I was writing the user guide for) leaked it to me. After that I used to go and hang around Roger's desk and he'd show me ARM BASIC running on an emulator that was running on an NS16032 second processor. As Steve remembered, BASIC was running on the actual silicon days after they got the first chips back. I believe there was one minor rev. they had to do to get the chip ready for production, and maybe a move from a 3 micron process to 2 micron.
ok?
he's modest, but he's had quite a remarkable career
Agreed, a very important man in the world of microprocessors. I could listen to him speak all day.
That is the British way!
9:20 Remarkable! In case what he said was missed, as he's quite a humble man about it: he was going to test how much current the first ARM chip drew when running. He connected a multimeter to the power supply, turned it on and began running code on the ARM, yet the meter read zero. He had forgotten to connect the power supply at all. No power was coming from the PSU into the ARM CPU; the chip was so efficient it was running off the tiny leakage current flowing in through its signal inputs, an absolutely minuscule amount of power. And yet the CPU had 2-4x the performance of the VAX 11/780 minicomputer.
Can there be any wonder why ARM is the most popular, widely used CPU architecture in the world? An absolutely genius, brilliant design. Cheers to Dr. Furber and Ms. Wilson. Visionaries.
As an erstwhile owner of Model Bs and A3000/A5000s I revere Prof Furber and Sophie Wilson. BBC Basic, and particularly Basic V, was/is the best structured, easiest to use and most elegant version of Basic I have ever seen. And the same goes for ARM assembler, which I think is a direct reflection of the simplicity of the RISC concept. And embedding an assembler in Basic was pure genius: it made it so easy to develop a program in Basic and then just convert the intensively used bits to assembler. Still the most pleasurable programming experience I've ever had.
The ARM running without a power connection reminds me of when Frank Whittle was testing one of the first jet engines. It started to overspeed so they cut the fuel. But the engine kept going, faster, generating more thrust. It turned out fuel had collected in the burners. Something terribly British (Pythonesque) about machines that mysteriously work but you don't know how.
A bit like the common runaway problem with some turbodiesel engines, where you cut the fuel but a leaky bearing seal means it's actually also burning some of the engine oil... so it keeps on accelerating...
I have so much admiration for these guys. I could listen to this chap's stories all day.
The magnitude of Acorn's achievement coming up with the ARM is something that shouldn't be understated. It makes them one of the few microcomputer companies who actually rolled their own processors - and one of very few who did it without actually owning their own chip factories or having been active in electronics component manufacturing _before_ starting to make computers. Quite a feat.
In fact I'm having trouble thinking of another example - Commodore had ownership of MOS, who were set up first and foremost as a microchip manufacturer in direct competition with Motorola and Intel. Texas Instruments were a long-established IC manufacturer. DEC, who started out building minicomputers from piles of mini PCBs filled with discrete components and evolved to making a lot of their own ICs for their bigger systems, sort of flipped between making their own processors for their smaller machines (terminals, the Rainbow and VAX workstations) and just using Intel or Motorola parts instead (depending on the machine).
Other big names just bought commodity parts on the whole and only contracted out production of specific accessory parts such as glue ASICs. E.g. IBM could actually have made their own parts if they wanted, given their decades in the computing field, but simply found it cheaper and easier to buy in bulk from Intel, and about the only custom devices in the PC were the BIOS ROM and the character ROMs on the video cards - something which can be considered a direct contributor to the clone industry, as it turned out to be an exceptionally easy machine to copy, only needing a bit of BIOS reverse engineering. Apple just used MOS processors, and the most complicated custom piece in their early 80s designs would have been something like the IWM that looked after disc access. Atari did similar, buying from MOS then Motorola, and contracting various, seemingly random generic IC fabricators to make parts like their video chips (8-bits, ST) and then the Glue & MMU ASICs (ST). Sinclair was similar, using Zilog CPUs and commodity memory, only moving to having Ferranti make ULAs for them from the ZX81 onwards. So on and so forth.
Possibly the only direct analogue is Sun, when they started producing the (also RISC, so possibly ARM-inspired) Alphas some years later on...
Sun's RISC CPUs were the SPARC range; Alpha was DEC's.
@@Desmaad Plus, IBM made its Power RISC processor, and HP made PA-RISC.
@@cdl0 And DEC made ARM chips for awhile under the name StrongARM.
@@Desmaad Yes, ARM and DEC did indeed collaborate on StrongARM.
I could listen to this guy all day.
I have my exams in a few days on ARM processors and I am watching this video. Forgive my ignorance, but I didn't know who this gentleman is, so I paused the video and started reading the description. I came to learn that he is Prof Steve Furber, whose book I have been referring to for my course!!!🙏
Cool, hope you did well on your tests bro.
I saw an interview with Wilson where she said they were very worried about the package overheating with high current draws, so they spent a lot of effort making sure it was economical with the juice. Even she was a bit surprised at how low the current draw figures turned out to be when measured.
I have been a huge fan of ARM since doing low-level mobile development. From a developer's perspective RISC is really wonderful to work with.
Keep making these series. I really enjoy them.
Martyj2009 Yup! I've written firmware for ARM Cortex-M microcontrollers, and I like every bit of it (no pun intended :P )
Martyj2009 It actually doesn't matter to most developers b/c very few write in asm anymore.
MrSlowestD16 That'd be true for high level developments, including mobile. If you're programming a microcontroller, then you'd be working directly with registers (GPIO, SysTick, etc).
JFourier Don't know. Any microcontroller I've written code for was abstracted by C, so it's essentially all the same.
As I said, very few people write in asm anymore.
MrSlowestD16 Yea I agree. asm development has become very rare these days.
I write my firmware in C/C++, but I'd still need to know the microcontroller's register map to properly set up GPIO, timers, interrupts, etc.
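For anyone curious what working with the register map actually looks like, here is a minimal C sketch of setting up and toggling a GPIO pin. The base address, register offsets and bit layout are invented for the example (loosely in the style of a Cortex-M vendor header), so on a real microcontroller they would come from the datasheet.

```c
/* Minimal register-level GPIO sketch. All addresses and bit positions
 * are made up for illustration; real values come from the vendor's
 * datasheet or device header. Only meaningful on the target hardware. */
#include <stdint.h>

#define GPIOA_BASE   0x40010800u                                 /* hypothetical peripheral base */
#define GPIOA_MODER  (*(volatile uint32_t *)(GPIOA_BASE + 0x00)) /* pin mode register */
#define GPIOA_ODR    (*(volatile uint32_t *)(GPIOA_BASE + 0x14)) /* output data register */

void led_init(void)
{
    /* Two mode bits per pin; pattern 01 = general-purpose output on pin 5. */
    GPIOA_MODER &= ~(0x3u << (5 * 2));
    GPIOA_MODER |=  (0x1u << (5 * 2));
}

void led_toggle(void)
{
    GPIOA_ODR ^= (1u << 5);    /* flip output pin 5 */
}

int main(void)
{
    led_init();
    for (;;)
        led_toggle();          /* blink forever (no delay, just the idea) */
}
```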
"...effectively we were powering the chip [(the ARM processor)] from the signal inputs..."
This was a well-known problem when working with CMOS chips. As documented in that famous book, "The Art Of Electronics" by Horowitz and Hill, if your (CMOS) circuit suddenly stopped working, it *may* be that one of your chips isn't connected to the power rails and that it's being powered by its inputs.
Consider what happens when all of the inputs go "low" at the same time...
I imagine the engineer who didn't read that book is the one who invented the parity line to ensure that there is always an odd number of high inputs so that it can never be all low.
Interesting interview; I am an Acorn fan and owner since the BBC Model B.
Also new to me... tape out... I laid out several simple PCBs using black tape and Letraset pads. We called it "taping out" onto clear film as the last step after proto board and schematic approval. This was then sent off to the PCB manufacturer.
I was impressed by the design back then (read about it in Byte magazine, probably) and I am still impressed now. Great job.
I worked (not actually programming, sadly) for the company who ported Smalltalk-80 to the ARM, in the form of an Archimedes. We had an actual Archimedes on one desk and on another we had an 80386 machine with an "Archimedes on an Expansion Card" which shared the monitor somehow. We also had a 386 port of Smalltalk-80 for comparison.
What delighted me, for various reasons, was that the ARM running at 8MHz kicked the living bits out of the 386 running a much faster clock (I don't recall the exact speed after all this time, but it was at least 20 MHz, possibly double that).
Our colleague who did all the heavy lifting was loaned various prototypes as new ARM variants came out. He had one which ran so fast it broke the speed-limiters on Flight Simulator making it impossible to play ;-)
Phil Boswell The 386 ran a 33 MHz clock as standard, with an optional floating-point co-processor.
> was that the ARM running at 8MHz kicked the living bits out of the 386 running a much faster
> clock
Yeah, 386 was pure CISC. That changed later on with the Pentium Pro, which has the CISC ISA running on a RISC-like core.
486 was CISC as well, but could execute many instructions in one clock. Stuff like INC AX.
IIRC from 386 to 486 the IPC was doubled.
@@SurmaSampo The first 386s started at 16 MHz. The last ones sold ran at 40 MHz.
@@earx23 Not true. I have seen 100mhz 386 processors.
@@SurmaSampo that's not the original Intel 386 though.
"powering the chip through the inputs"
Brought a smile to my face. I knew immediately what happened. Been there, done that. Just not on a processor.
What on?
the co of arm today, his masters thesis is on my shelf there - wow furber is a legend! did enjoy reading his books on arm programming when i was a kid :)
Great chip, used the Acorn Archimedes series, first machine I learned assembly language on so I got quite in depth with the OS and chip, awesome for its time. Glad to see ARM still going strong and found their niche in this otherwise dominated marketplace.
Steve Furber is working on neural-network computing at the University of Manchester. A legend of Acorn computing (and the lovely Sinclairs).
outstanding, one of my favorite channels on youtube
Great video, Brady! I love these technology history lessons! I think it's important to remember how we got here, and recognize the people who made it possible.
Phenomenal video. Appreciate y’all sharing a brief history of the ARM processor.
They came out with 64-bit multi-core ARM CPUs recently, which are now powering supercomputers and servers that have to be extremely power efficient, run with low thermals, and pack more tightly than any other design. As you can probably figure out they are also being used in smartphones (they are much more powerful than the way they are being used in phones, although some of this power is apparent in that you can leave a large number of applications running at the same time without impacting performance).
Mmmmmm, interesting
The supercomputer is running off a Tegra CPU; while it is an ARM CPU, it's a proprietary design from Nvidia. Nvidia will be launching a multi-core CPU with "Haswell-like" performance... which is fairly impressive, considering that even AMD hasn't caught up to Haswell yet. Intel might want to make a run for its money since ARM is a real threat to its business. It could potentially be replaced in laptops and servers, and possibly desktop PCs, though I wouldn't hold my breath on that, since backward compatibility is still something ARM wouldn't be able to provide. But in things like servers, where power consumption is more important than backward compatibility, I think it will definitely take over, especially considering that the majority of servers run Linux and Linux can run on ARM processors.
+joemann7971 True, in fact Linux on ARM has more shipped units than Linux on x86, I'd have thought, just from the Android devices running Linux kernels. It's amazing really how much all the established and experienced companies in the PC sector managed to miss the boat on smartphones. Processor makers like Intel and AMD, GPU makers like Nvidia and software giants like Microsoft, despite all their experience, managed to sleep through others taking a dominant position in a market that's far larger than the PC and server market put together.
If anything though, ARM is probably better off focusing as they are on embedded devices, smartphones and tablets; they have a bigger market. Just look how many embedded devices a typical household has compared to PCs and laptops - often it's several times as many at least, and there are more places those could expand.
GPU makers like Nvidia and AMD (ATI) - you forgot that part.
Acorn machines were brilliant. The first machine I built to run BASIC on was an Acorn Atom (6502 processor), which came as a self-build kit and worked great first time despite my inept engineering skills...
Fascinating! I really enjoy these videos on the early days of computing.
Who is here after Apple's M1 Macs? Haha, my first computer ever was Acorn's BBC Micro Model B!
This is a great episode. Thank you Brady, and thanks to everyone involved in making this.
Thanks Dr. Furber!
I have just noticed an error in the screen capture at 11:20: the ARM1 had 25,000 transistors, but not the ARM2 - it had around 30,000, because contrary to the ARM1, the ARM2 has the multiplication instructions implemented.
This is one of my favorite channels. Thanks!
Thanks for this Sean, the professor's videos on ARM are very interesting. You should do a Computerphile extra video of him with his guitar!
#1 hit the summer of 85 was Tears For Fears "Everybody Wants to Rule the World". Who knew they were talking about computing chips.
This must be the best example of serendipity
Not connecting the power supply is one of the most extreme cases of making a bug a feature I've heard of in my life.
OMG!! They did it! ARM killed x86! Congratulations!
I had 30 years of arm once (right) and only 29 years of the other arm (the left one). This was the result of me spinning very rapidly with my right arm at the axis point and my left arm traveling at very close to the speed of light (and thus aging more slowly), thus matching the cliche of the left being younger than the right.
That low energy consumption story is SICK!!!!!!! Awesome!!!!!
A well-engineered piece of hardware will work. A brilliantly-engineered piece of hardware will work even if it shouldn't. :)
Great background information! Excellent!
I had a colleague who had an Acorn; it was a great system but unfortunately an island platform. But it booted fast and had a great UI!
It's interesting that he refers to Sophie Wilson as "Roger". That must be confusing for all parties involved.
Such a great interview!
Wow, what a modest man.
"only" 12 billion ARM cpu's at the time of making this video 7 yrs ago.
Today in 2022 it is 230 billion ARM cpu's produced..
Appreciate the video. I would love a complementary video to this to discuss RISC-V.
As an ARM engineer, this is particularly neat to hear :)
AI is the future. Apple's M1 is so fast compared to ARM's designs too. Why is their design so much better than ARM's latest core design?
@@ps3301 You high, bro? The M1 IS an ARM chip (as are the M2 and M3).
Thanks for this interview!
Looking, listening, learning - so much news, so much new. My first touch with ARM was in 1989: an Acorn StrongARM with dual CPUs, one Intel, one ARM. The teacher showing this machine was running hot with it - I was, and still am, amazed. I never heard much about it afterwards, though I found one StrongARM in use at a local bakery. I said, well, it's a very, very good system; unfortunately I think the PC/XT (and up) is going to run over the market. And so it came to be - but was it really? Much later I found a fan page about the StrongARM that listed all the models and versions of that time period, and I thought: I was wrong, and right - but what if I had known then? I didn't. Will ARM be the next host on your PC adventure? I dare to say YES. With the move to online services and fewer local apps, it will do well. OK, static devices are in decline and mobile devices are the new default, so ARM has already arrived, a long time ago.
The whole running on no power thing would make a great plot device in an Apollo 13 type movie.
As an avid Commodore enthusiast... 3:00 Wouldn't that Phoenix, AZ company have been Bill Mensch's Western Design Center? :D
What an amazingly humble and down-to-earth guy. If I had invented ARM, I'd be strutting around like Flavor Flav with a gold medallion the size of a hubcap hanging from my neck.
Any device is not completely off when switched off; rather, it still has around a watt of power flowing through the board or device. Life!
Nice mention to the 65c816! That's the chip the SNES used.
Professor Furber noticing that his ARM chip was running without a power supply connection can only mean that computers have been sentient life forms from the very beginning, and that they have just been waiting for the proper time to become our Silicon Overlords. Humans are doomed.
buddy5335 the day skynet became self-aware
All my AVR chips do that. I connect some I/O pins to 5 V with no VCC connected and they run as normal. Am I in danger? Skynet becoming self-aware? :D
false.
The audio in this video was ridiculously low. You should look into that.
Fascinating to listen to, thanks for sharing.
2:45 "The turning point [in the design of the ARM] was a visit to [The Western Design Center, Inc.]".
4:10 *Sophie
I love computers! See, you helped make that happen in a way.
So who was the CEO of ARM whose Master's thesis was on Steve's shelf?
Great talk. Plus I now know what tape out means. Relieved to hear that, although it's still called tape out, no tape is harmed in the process. Sympathetically dry.
Who came here after seeing the Apple ARM news?
This just shows that even ARM's principal designer made silly mistakes like all of us, by powering up an MCU through its GPIOs.
Fascinating story!
THe ARM design is so Handy
yay, computer scientist john lithgow is back!
Brady, could you get him to talk about the work he's doing with SpiNNaker? While the ARM stuff is interesting, it's history. He's still doing stuff on the bleeding edge that has potential to change everyone's life *completely*.
gasdive A SpiNNaker video is already on the way :) >Sean
Oh Wow, just Wow.
my boy Furber is wicked smaht
A different response to the bandwidth problem can be seen in the design of the BBC Micro itself, as well as the Atari ST and the original Amiga: sharing the memory bandwidth between the processor and the equally hungry video system on an interleaved basis. The memory had about twice the speed the processor could use in both cases, so the video could be given an equal half, reading screen-buffer data out of a particular chunk of the same memory bank as the actual code and data. The available data rate of that side channel then determines the screen resolution and colour depth the computer can provide without slowing down the processor (and the amount of processor time it can borrow from determines the actual maximum, and how slow the computer runs when using it).
I.e., turning the drawback into an advantage, in terms of cost saving and simplification of the architecture, instead of coming up with a whole new processor type and spending money on upping the general performance to take advantage of it. You don't get the sort of raw power (and better graphics) that the Archie offered, but you do get a reasonably performant computer for the lowest possible price.
In the Beeb, that being 320x200/256 in mono or 160x200/256 in 4 colours (or the Teletext mode, which uses character mode to give the appearance of 320x200 in 8 colours) with no performance hit, or the max of 640x200/256 in mono and 320x200/256 in 4 colours. The 16-bits had about 4x the memory bandwidth (twice the clock, twice the bus width), so instead manage 640x200/256 in 4 colours and 320x200/256 in 16 colours with no slowdown, and the Amiga could pull up to 640x200/256 in 16 colours with a similar chronic amount of slowdown as experienced by the Beeb, or 320x200/256 in 64 colours with half as much...
The PC's graphics cards don't show the same kind of coupling between general machine performance and resolution/colour depth as they run a completely separate bus to the rest of the machine, so gave e.g. BBC-like quality (from CGA) without slowing down the CPU despite using the same general memory technology and bus width. Which was rather important given how underpowered and bandwidth-inefficient the rest of the computer already was (struggling to reach the performance of a Beeb in the lower screen modes even so); with shared video, as seen in the PC Jr, it would have damn near come to a halt.
I don't think the BBC suffers slowdown in any modes, even its highest 640x256 mode. The CPU runs at 2MHz regardless, the only difference between the modes is whether the video system also runs at 2MHz.
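To put rough numbers on how the mode choice drives the video system's share of the bandwidth, here is a back-of-the-envelope calculation in C. The 50 Hz refresh and the mode list are illustrative approximations, not exact figures for the BBC Micro, ST or Amiga.

```c
/* Back-of-the-envelope video bandwidth for a shared-memory micro:
 * bytes/second the video system must read =
 *   pixels per frame * bits per pixel / 8 * refresh rate.
 * The modes and 50 Hz refresh below are illustrative only. */
#include <stdio.h>

int main(void)
{
    const double refresh_hz = 50.0;
    const struct { const char *name; int w, h, bpp; } mode[] = {
        { "640x256, 2 colours",  640, 256, 1 },
        { "320x256, 4 colours",  320, 256, 2 },
        { "320x256, 16 colours", 320, 256, 4 },
    };

    for (int i = 0; i < 3; i++) {
        double bytes_per_frame = mode[i].w * mode[i].h * mode[i].bpp / 8.0;
        double mbytes_per_sec  = bytes_per_frame * refresh_hz / 1e6;
        printf("%-20s ~%.2f MB/s of memory bandwidth\n",
               mode[i].name, mbytes_per_sec);
    }
    return 0;
}
```

Doubling either the resolution or the colour depth doubles the demand, which is why the interleaved schemes cap out where they do.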
can someone please elaborate on the plastic vs ceramic packaging part? I've never heard of this before.
Frient The chip needs a package to protect the silicon from damage and to house the leads to connect the IC into a larger circuit.
Plastic has lower material costs and is easier to manufacture making it much cheaper. However, plastic is generally not very good at conducting heat so if the chip produces too much heat it can't escape fast enough and will melt.
Ceramics are much better at conducting heat so you can get away with producing more heat but it becomes more costly to manufacture.
Ah I see. I thought he was referring to the packaging that items on a store shelf get. Never heard of ceramics on there hehe.
Frient I have an ARM-based OMAP4430 which might melt over 100 °C (212 °F).
Frient Example of a CPU in a ceramic DIP (CerDIP) package:
commons.wikimedia.org/wiki/File:KL_National_N8X300I_CerDIP.jpg
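A rough way to see the trade-off is the standard junction-temperature estimate Tj = Ta + P × θJA. The sketch below uses assumed thermal resistances for a plastic versus a ceramic DIP, purely to show the shape of the difference; they are not data for any specific package.

```c
/* Rough junction-temperature estimate: Tj = Ta + P * theta_JA.
 * The theta_JA values are assumed ballpark figures for plastic vs
 * ceramic DIPs, not data for any specific package. */
#include <stdio.h>

int main(void)
{
    const double ambient_c     = 25.0;  /* room temperature, degrees C */
    const double power_w       = 1.0;   /* power dissipated by the die */
    const double theta_plastic = 70.0;  /* degC/W, plastic DIP (assumed) */
    const double theta_ceramic = 40.0;  /* degC/W, ceramic DIP (assumed) */

    printf("Plastic package: Tj ~= %.0f C\n", ambient_c + power_w * theta_plastic);
    printf("Ceramic package: Tj ~= %.0f C\n", ambient_c + power_w * theta_ceramic);
    return 0;
}
```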
Does this support the idea that an electronic device can be unplugged, yet sensors such as the microphone and camera could still be recording/transmitting?
Watching this on my MacBook Pro with its ARM-instruction-set M1.
Would like to know how he feels about the new M1 chip! He must be dazzled. And going forward, I think that ARM will be the way to go for mainstream computing now. Finally x86 will be killed by a better chip architecture. Imagine the possibilities!! Scale a 10 W ARM processor up to 95 or 125 W! That would be incredible - at least 10(!) times faster than an average Intel CPU.
@@MarkarthGuard45 I watched a video on the ARM-powered supercomputer from Fujitsu (the number one supercomputer in the world). One of the commenters said that financially, an x86-64 supercomputer is cheaper and more powerful than the ARM one (at the same power consumption).
Great interview !
is there a chance to get other Manchester professors to give a small talk?
Great video, very interesting.
@Computerphile could you enable automatic subtitles for the video? thanks so much!
+1
@Computerphile, yes please +1
I have an Actron CP9550 that has an ARM Cortex-M3 core in it. Got an iPod nano 2G that I picked out of the trash that has some sort of Samsung SoC that uses ARM. I have never attempted to open it, but the Garmin GPS box that I trash-picked probably has an ARM core or two in it.
Dumpster divin' FTW
Enable subtitles
Designed a microprocessor, couldn't properly connect it to a power supply and multimeter LOL
No, it's probably that while they were getting everything hooked up as a team, they saw that it was working and assumed it was already hooked up.
ok?
Where can I find the page shown on (11:18) that lists all those historical processors and their specifications?
Google a few sentences from it, i.e. the names of a few processors. It'll probably come up.
Could a stackable processor be any good for doubling performance by stacking more CPUs on top of the previous chip? Imagine that for easy upgrades lol
Heat dissipation would be almost impossible to manage.
Tim Tian You'd need some sort of integrated inter-die cooling, I think IBM has been doing research on that for quite some time.
sbjf Sounds, err, _cool_.
Tim Tian plop it in a tank of liquid helium
aakksshhaayy That does bad things to semiconductors.
Simplicity saves lives.
Hate corporate bureaucracy, which kills creativity.
14 dislikes from Intel execs? Quite astonishing that it was powered only by signal inputs, wow!
fairly young voice for a guy that age. (:
Great video! But what is with the random zoom at 4:54?
Can ARM beat Intel x86 or AMD x64 designs? If the main obstacle is memory bandwidth, why not just increase the amount of cache memory or use DDR6 RAM?
Cache changed the equation.
I wish I still had my Archimedes.
Does ARM already have processors that operate at near-threshold voltage? Because that's what I'd be more ready to call magic.
Penny Lane Steve Furber did a no-clock ARM that only ran when needed. Kind of like a lazy-eval chip.
Jonathan Watmough Sounds interesting. A couple of friends of mine did a clockless CPU once just to see how that works but they ran into trouble with needing huge amounts of transistors for synchronization so it turned out not to be practical. Apparently Furber solved that problem somehow ... or maybe he didn't as the processor in my smartphone does have a clock ;)
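For a flavour of what replaces the clock in a self-timed design, here is a toy C sketch of a four-phase request/acknowledge handshake between two stages. It is purely conceptual: real asynchronous logic such as Furber's AMULET processors implements this in circuitry, not in sequential code.

```c
/* Toy illustration of the request/acknowledge handshake that stands in
 * for a global clock in asynchronous (clockless) designs. Conceptual
 * sketch only; not how real self-timed silicon is built. */
#include <stdio.h>
#include <stdbool.h>

struct link { bool req, ack; int data; };

/* Sender raises req when data is valid; receiver latches it and raises
 * ack; both then return to idle: one four-phase handshake, no clock. */
static void transfer(struct link *l, int value)
{
    l->data = value;
    l->req  = true;                  /* "data is ready" */
    printf("sender:   req up, data=%d\n", l->data);

    int latched = l->data;           /* receiver consumes the data */
    l->ack = true;                   /* "data taken" */
    printf("receiver: ack up, latched=%d\n", latched);

    l->req = false;                  /* return to idle */
    l->ack = false;
    printf("link idle again\n\n");
}

int main(void)
{
    struct link l = { false, false, 0 };
    transfer(&l, 42);
    transfer(&l, 7);
    return 0;
}
```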
Cant Beat The British Technology
Well, it's Japanese-owned now.
How about the next step being to simplify chips?
As we know, all logic functions can be implemented with NAND gates. If that were the only gate on the entire chip, the chip could still be programmed to do anything by wiring. Now have the wiring, in turn, be controlled by a computer.
That's been done for decades already: FPGAs.
+George Steele Yes, as Fabian said, it's been done for decades, at the penalty of HUGELY increased power consumption.
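The NAND-universality point is easy to sanity-check in a few lines of C: NOT, AND, OR and XOR below are each built from nothing but a two-input NAND, and the truth table printed by main confirms they behave as expected.

```c
/* Every Boolean function can be built from NAND alone; quick sanity check. */
#include <stdio.h>

static int nand(int a, int b) { return !(a && b); }

static int not_(int a)        { return nand(a, a); }
static int and_(int a, int b) { return not_(nand(a, b)); }
static int or_(int a, int b)  { return nand(not_(a), not_(b)); }
static int xor_(int a, int b) { int n = nand(a, b); return nand(nand(a, n), nand(b, n)); }

int main(void)
{
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            printf("a=%d b=%d  NOT(a)=%d AND=%d OR=%d XOR=%d\n",
                   a, b, not_(a), and_(a, b), or_(a, b), xor_(a, b));
    return 0;
}
```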
What did he mean by "microprocessors is a black art"?
Black/dark arts = techniques or practices that are considered mysterious or dishonourable.
What's the brand of the water bottle to the left on his table?
Fish Kungfu thanks
LastofAvari Puzzled why that was the thing that caught your interest.
great video :)
Accidentally powering a micro only via IO pins, yeah been there done that, caused a bit of head scratching.
This guy is also a guitarist.
"In inserting the ammeter into the power supply, I had failed to connect the power supply, so no current was flowing into the ammeter... But the chip was still running!" WTF.
Genius's am I right?
@@sdprz7893 It's just freaking amazing
Love Sophie Wilson... perhaps the most influential transgender person in modern history.
The way intel is dominant on the desktop today, well that aged well :D
Can't think of too many non-x86 consumer desktop computers.
@@hsdsaunders I was thinking more about how AMD were embarrassing Intel at the time, and even to this day. Intel have indeed improved and moved forwards with their big.LITTLE design, but it still uses a lot of power for the performance you get. ARM is also moving forward day by day in servers, and Apple's M1 and M2 are ARM-based and extremely impressive. RISC is the future; it just took 3 or 4 decades for people to realise that.