ARMv2 only addresses 26 bits of space, so not enough for Android. But ARMv3 (the ARM6/7 CPUs from Acorn's later RiscPC) could probably run it. gcc had ARMv3 support up to gcc 8, so if you took an early version of Android and built it with gcc 8, it'd run. You'd probably need to add far more RAM than those machines came with, you'd need drivers for the hardware, and I don't know if Linux ever supported ARMv3, so you might have needed to port the low-level kernel bits to that architecture. It could be done, if you had a thousand hours to spend on it and had the know-how.
We do, and they are 40-50% less powerful than x86; that's why. You can't make it go faster because of the reduced instruction set, and if you increase the instruction set you also increase the power draw, just like x86. With clock speed limited, if you try to make both processors on the same node you will find that ARM is much slower than x86.
Meh, AMD only exists because Intel hired them to manufacture 8088 CPUs for the IBM PC, and AMD... erm... "borrowed the way in which the chip behaved", shall we say.
I wish that ARM garbage never existed... Acorn RISC Machine by Al Alcorn IS NOT a portable, low power solution, but a processor architecture so poorly made... The ARM CPUs use much more software to run the same thing... Why x86 "mov al, 3; mul 2" has to look like "bl getnum; mul r0, r0, #6985; bl printnum"? What we did to deserve this?? Intel architecture has HARDWARE SUPPORT for many commands (hundreds of instructions only in Pentium 4, think about i9-9900K), and therefore no need for long code that has to be debugged at each and every step...
Yes, it had relatively poor code density, but you could buy a machine with 1 MB stock (and still expandable) whereas the IBM PC was limited to 640 kB of primary memory for a donkey's age. The IBM PC depending on the legendary code density of the x86 instruction set to offset its ludicrous system design limitations was not a chapter to be recalled fondly from computer history. I'm guessing you didn't study the x86 ISA long enough to get to the chapter on segment registers (probably chapter 2). Those segment registers alone cost the computer industry thousands of person-years (ask any compiler writer) in faffing around with a pointlessly baroque addressing kludge. But glad to know that someone out there thinks segment registers were God's gift to French vanilla ice cream. Keeps the world a more interesting place. Interesting, but extremely weird.
@Dos Doktor I have no clue what you’re on about. A register load immediate and a multiplication by an immediate are two instructions on x86, three on ARMv2 (two loads then MUL). I have no idea where you got those subroutine calls (BL) from. Seems like you have no clue.
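For anyone following this exchange, the instruction counts can be sketched like this (illustrative syntax only; early ARM MUL takes register operands and requires Rd to differ from Rm):

```asm
; x86: a register load immediate plus a multiply by immediate (two instructions)
mov  eax, 3
imul eax, eax, 2        ; eax = 3 * 2

; ARMv2: MUL has no immediate form, so both operands are loaded first (three)
MOV  r0, #3
MOV  r1, #2
MUL  r2, r0, r1         ; r2 = r0 * r1 (Rd must differ from Rm on early ARMs)
```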
x86 was trash compared to ARM (still is, with the Apple M1 emulating Intel faster than Intel!). Of course Intel licensed ARM too (the XScale). I will say that programming the 486 after ARM (both in assembly) was a step back, but it was possible to make code run quickly on a 486 by programming it in a RISC way: use only the simple instructions with low cycle counts. The lack of registers on x86 was annoying, but tricks like disabling interrupts during rasterisation helped to a degree.
This woman is my hero. She's made a larger difference to our current lives and the future of mobile technology that most people would care to realise, and she's so chill.
She did give us the bird! Stateside, that's an unmistakable insult... LOL! Brilliant!
Except she didn't. This interviewer got it wrong. The ARM was designed by Steve Furber and Andy Hooper of Acorn Computers, with, probably, some help from ex-Sinclair designer Chris Curry.
Sophie designed RISC OS.
@@petermitchell6348 she a man. Roger Wilson from Leeds uk.
@@bolshevikproductions Yes I know.
@@petermitchell6348 are you Womben too?
Absolutely mind bending knowledge but she makes it accessible and interesting! Definitely one of the most important people in computing history.
@KJER ERRT I think you're an ignorant idiot.
Wow. Scary that Sophie still has all the architecture facts and figures in her head and can recall it all so effortlessly all these years later. A legend!
Old-timers like her never forget. Though nowadays they may not remember their own cell number.
@@niyablake She's still working in tech, now at Broadcom.
Sitting here watching a video of my current devices' great, great, great, great(x10) grand processor. Soph has made a bigger impact on our mobile lives than most would ever realise.
What a legend Sophie Wilson is!
To think people like this changed the world in their own way. I had an Archie A310 and it was the dog's nuts... I remember doing 3D on it in 1990.
Thanks for posting this - it's really interesting for us geeky types!
Agreed. The Archie was out of this world. Sadly I couldn't afford one at the time :( I used to hang out with engineers from Acorn in the early 1990's. Funny to think Sophie was probably in the building - she is my hero!
Sophie's a genius.
I'm no slowcoach at all when it comes to computers, and I found it difficult to keep up after 5 minutes. Fascinating.
The fluency she has over everything from microprocessor architecture to silicon process is incredible: she's just throwing out facts without breaking a sweat!
Yes, she's a complete encyclopedia! It's fantastic and fascinating listening to her.
She went from ARM to running Qualcomm's development team. This is her job, day to day.
Precious time with a true innovator. Very inspiring. Visionary. Such patience, to boot!
So interesting! I was privileged to be able to hang out after work in Sophie's cube at Acorn as a fresh out of Uni technical writer, while she showed me gems like ARM BASIC running on an ARM emulator running on an NS 16032 second processor! That was before the silicon came back, of course.
This was a great interview, very interesting as I grew up with the BBC Micro B and Archimedes A4000 and later the RiscPC600 which got upgraded to StrongARM designed by DEC running at 200MHz. I was coding in assembler on the 6502 within a couple of months of using the BBC micro at school and did same with ARM assembler code when I got the A4000 which used the ARM250 SoC which was a very different beast. Fascinating hearing the details behind the scenes and brought back many memories. Sophie and Steve will be my heroes for ever.
I was raised with Acorn computers. My Dad bought an Electron when I was 3 which I learned to program on, followed by an Archimedes A440/1 and then later a RiscPC. I thought the ARM CPUs were great, and it makes me pretty proud both as an Acorn user and a Brit to see them dominating the market nowadays. Great interview! =)
She is so sweet with the interviewer, it's so funny. I just adore her... "It's a lot of nanometres, like 3000."
I haven't seen an original ARM evaluation second processor for 25 years! I wish I could buy one somewhere as they were incredibly rare back then so I suppose they're nearly impossible to find now. I've completely restored my Master128 with a brand new logic board fresh from its antistatic packaging - found on eBay. I think it was one of the last ever made in about 1996 I think. I remember coding for the ARM1 and then the ARM2 on the A3000's in-line assembler that she wrote. An incredible machine for the times - way, way ahead of anything else. VIDC was a wonderful video co-processor... and I remember being excited about the additional inter-processor instructions using the upper bits of the address bus, above the main 26-bit mode addressing of the original ARMs. Sophie will go down in history as one of the most important people of the 20th Century.
The ARM1 machine instruction set makes Intel's 8086/286/386 instruction set look like some Rube Goldberg contraption with extra points for weirdness. Yuck. I never had the chance to write ARM assembly back when the Archimedes was a new thing - I wrote plenty of Intel, and I kept thinking how contrived it all seemed.
Quite literally one of the most fascinating videos I've seen in a while. If only Apple had gone with an ARM desktop back then, we could have had unified platforms! Oh well, hindsight time machine...
Now Apple has an ARM design
I am a huge fan of the RISC OS 3.11 and Acorn Archimedes machines that I used to use back in the 1990s when I was in primary school here in the UK.
An amazing engineer and someone who should be celebrated more; even if it were just for the amazing work in creating BBC BASIC, never mind the world-changing processor work. An extraordinary person.
Loved those times in computing!
Nowadays, the leaps aren't that gigantic year on year. In the early days, you were playing Hercules or CGA graphics in year 1, EGA in year 2 and VGA in year 3. Those leaps were gigantic!
Also imagine going from crap sound to a Sound Blaster...
It's very rare to get an interview with Sophie Wilson.
Wow, she's totally amazing. The difference her and Steve's work has made to the world makes me just cry. I love her, and I don't know what she looked like as a man, but as a woman she's amazing-looking for someone in her 60s.
I find it amazing that she is unknown in the UK but has made such a huge impact to the modern world that we live in.
Thank you for doing this interview!
Absolute Genius, Fascinating to listen to Sophie.
12:09 my favorite part
I could listen to her for hours. :)
fascinating interview
Fantastic interview, but seriously... if you're going to meet one of the legends of technology at least know what their most successful early computer, the BBC Micro, is. I still subscribed but you guys should have done your research about so much of this first. She's enjoying the interview so much you got the bird at 12 minutes :-D
Thanks, but I did watch this movie before doing this interview th-cam.com/video/XXBxV6-zamM/w-d-xo.html and I did read up on some stuff, and I have been video-blogging all ARM devices since 2004 - my website is called ARMdevices.net. Same thing before my interviews with Steve Furber, Hermann Hauser and Robin Saxby: I tried reading some info and watching any other existing interviews on TH-cam first. Thanks for watching.
I loved the example of the finger.
So cool and interesting, can't wait for part 3!
38:00 "..so Acorn, Arm, FirePath, DSL stuff, Broadcom.. it's used everywhere, so You used to be designing parts that are everywhere now. How is to feel like that?"
That part of the question made my day :)
22:00 The interesting comparison here is the ARM3 at 25 MHz versus the Intel 486DX (with floating point) at 25 MHz. The 486 had cache memory with a higher program density (so more efficient icache per transistor) and it executed the majority of its simple instructions in one clock cycle. It had a 64-bit barrel shifter and a full 32×32 to 64 bit hardware multiplier. It was starved for registers, but it made efficient use of the stack as an auxiliary register file via the SIB read-modify-write instruction encodings. It also had specialized string instructions which updated both the source and destination pointer, as well as checking the termination condition, all in the same cycle as the memory transfer, without any interference on the memory bus from fetching further instruction bytes. The ARM chip had predication, which would have eliminated some branch overhead. I imagine the ARM chip ran a bit faster due to predication and less register starvation, but not by much. A fly in the ointment is that compilers of the era weren't as sophisticated at register colouring as they have since become; I'm not sure which processor would have been hurt more by this, but likely the 486.
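The predication point is worth illustrating. ARM can attach a condition code to almost any instruction, so a small max() needs no branch at all, while on the 486 a conditional jump was unavoidable (x86 only gained CMOV with the Pentium Pro). An illustrative sketch:

```asm
; ARM: max(r0, r1) via conditional execution - no branch taken or mispredicted
CMP    r0, r1
MOVLT  r0, r1          ; executes only when r0 < r1 (signed)

; 486: the same operation needs a conditional jump
        cmp  eax, ebx
        jge  done
        mov  eax, ebx   ; eax = max(eax, ebx)
done:
```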
Sophie is such a gem
@KJER ERRT Because she is a woman? The right wing needs to fuck off.
Sophie! The legend.
She's amazing with facts, dates and detail. Shame ARM as a company was taken out of British ownership and again looks to be sold.
Thanks for the interview but the interviewer doesn't know much about the subject. I keep wincing at some of the questions, e.g. at 6:25 "so when the first chips came back were you surprised by the performance...?"
When you get chips back from the fab you're either very happy if they fully work, or very annoyed if they don't; the device's performance is something you spend a _lot_ of time designing and very carefully simulating before anything is ever manufactured. The only surprising bit might be how much you can overclock an individual sample of a chip beyond what the simulations predicted. Any other surprises are usually bad ones :-)
Dr Tune I asked if they were surprised by how low the power consumption was. You can see Steve Furber talk about how it runs at even lower power than they expected th-cam.com/video/1jOJl8gRPyQ/w-d-xo.htmlm25s which was key to ARM's later success, even though low power wasn't crucial for the Acorn desktop PC that this first ARM processor was being designed for. He says the tools they had for the engineering design were not very accurate, so they applied Victorian engineering margins to the power design. All they knew was that they wanted to come in at under 1 W to fit in the low-cost plastic package instead of the expensive ceramic package; it actually turned out to consume only 100 mW. Steve Furber seems to say they were not expecting it to consume that little power.
Dr Tune That's what I thought! And I'm nowhere near the level of Sophie!
+Dr Tune I'd say a good interviewer has to pretend [s]he knows a bit less than [s]he really does, so uninformed viewers have a chance to understand all of it. That broadens the targeted audience.
Dr Tune in part 3, 18:44, she points out that both she and Steve also wrote the simulation code and included it with the chip. (!)
It hurts at times, but he got to interview the right person.
Literally derived the existence of rice pudding before its power terminals were even connected.
Absolute Genius
thanks @sophie wilson :))
I hope to see a powerful Arm PC that can challenge Windows PCs on my desk soon. Plus, that was the best explanation of FinFET I've ever heard. I understand things better when it's visual.
Jason Gooden you will never see ARM beat x86. Silicon can only work at a maximum speed, and x86 has the biggest instruction set, which can reduce the number of instructions needed to do a computation. Yes, ARM is more power efficient than x86, but it will never ever beat an x86 on the same node size - unless they increase the instruction set, in which case they will consume just as much power as x86.
@@fss1704 Intel hasn't been running their instruction set natively for a good while now. Internally they translate it to (gasp) *RISC* instructions and run those instead. That's the cost of x86: you buy a RISC CPU and two instruction set translators (one for x86, one for amd64). Those translators are a waste of power and resources, and they're only there because the legacy instruction set is so far removed from the hardware that can efficiently run it. A big and baroque instruction set isn't magic - you still need silicon to execute those instructions quickly, and that silicon is largely separate from the instruction set. A rich instruction set doesn't make a CPU fast. These days all it does is increase power consumption and make the design harder to develop and validate.
@@absurdengineering I have to disagree that a big instruction set isn't faster than a small set. If you look at stock market bots, you'll see that no one ever uses ARM for that purpose. Yes, ARM is WAY more power efficient than x86, but it will always need more cycles to process the same stuff. The instruction set translators won't hurt performance, as all the instructions are converted on the fly while they're cached in the pre-execution accelerators. Yes, it takes more energy to do that, but it's also faster to execute the instructions when the processor preemptively caches the next memory contents in a register instead of waiting to ask the memory.
@@absurdengineering Yeah, obviously Intel uses RISC; the CISC contains all of the RISC instructions plus more stuff.
"They [Intel] Tried to copy ARM recently didn't they?"
"......no"
I love how casual Sophie is when talking about Intel; but the grin on her face says it all.
Yeah, I coded in Assembler on the 6502, Z80, 68K and Archimedes. Then I tried Intel's 8086 and figured I had better learn assembler. What a POS!
Is there a part 3?
funny 12:00+ ... :D
I really was not expecting that
"Here's a message to all the heat sinks out there..."
Love to know what Sophie thinks of the new RISC-V architecture.
I'm interested too.
Given she hasn't completely trashed Intel x86 in this video, I think she would be equally condescending about RISC-V. In my opinion the latter is still inferior to ARM (including the very first ARM1) in many aspects.
Pretty much, the David Attenborough of Computer Chips.
It pretty much confirms what the tests I had done comparing ARM with Intel showed me (ARM much better in every way). Nice!
KJER ERRT Intel are half the power of similar Motorola corporation chips, from experience and logic.
KJER ERRT We are talking phones, tablets nowadays
KJER ERRT Well, I started the thread, and in my experience everything in general is still Motorola/ARM, including but not limited to tablets and phones!
Fascinating
Just in case anyone is trying to find the other parts of this interview:
Part 1 - th-cam.com/video/jhwwrSaHdh8/w-d-xo.html
Part 3 - th-cam.com/video/QqxThgLTLyk/w-d-xo.html
Is Sophie a reincarnation of Ada Lovelace, the writer of the first software program?
I would love to experiment with Android on this PC.
Just to know: how big would a 6502 CPU be if made with standard components?
@12:04 was the most interesting part 🤣
Sophie, as a pure genius interested in processing power and serious usage of a computer, simply forgets about the sound capability of the VIDC (video and sound controller): hey, that is 'simply' a chip offering 8 PCM channels, 8-bit logarithmic, going as high as 330 kHz on a single channel, with 7 independent stereo positions per channel.
And Amiga fanboys brag about their poor crippled Paula chip ... ROTFL.
Sounds good on paper, but then all they did was port Amiga MOD files... The Amiga was a design pretty much finalised in 1982/83 anyway, and totally under-invested in by Commodore - it was getting pretty old by then.
A3000 was my first, friend had an A440, later i had an A5000 before it all came down… sadly.
What kind of question is that on the transistors!? I’m not a computer person and even I knew what she was talking about….who is the person doing this interview!?
Charming.
Did Acorn actually make the chips/cpu ?
The chips were manufactured by VLSI to Acorn's design.
It's like the first Intel 386 processor. The difference is that here everything is on one package/die, whereas with the 386 you needed separate video, microcontroller and I/O controller chips, due to Intel's and AMD's problems back then. It's also faster than any 386, and 686-class processors hadn't even been made yet.
12:03 Sophie flips the bird
12:10 The same to you!!!!! Hahaha
She showed us all her finger!
A woman that just got on with it instead of whining about "glass ceilings"
@KJER ERRT Shes a woman.
When she was doing this stuff, She was living as a man, so that wasn't really part of the equation at the time. She was already very successful before transitioning.
I get the impression that's a transformer!? I just can't tell whether it's a bell transformer or one from a power supply.
Who, by face, is the interviewer?
Roger the Dodger
He says heat sinks are noisy. lol
It's funny to watch her kind of upset about ARM processors not being used in Apple computers at the time of the interview. I guess she would show much satisfaction if the same interview were repeated now.
yeah finally Apple got it 😁😁
@@charbax I suppose the big American corporate lobbies also played an important role in the way the story of processors developed. I mean, how is it possible that Intel and Motorola processors were chosen for virtually every high-end personal computer existing in the 80s, while the ARM processor had already demonstrated itself to be 5 to 7 times faster than anything else existing at the time?
@@RelayComputer US big tech perhaps waited to embrace/allow Arm in the high end until it was sold to non-European ownership.. the US even tried to take over Arm and fold it into Nvidia, but China and others disallowed it.. 😄 Now that they're talking of floating Arm on the US stock market, outside of Europe, I wonder if this is a way for the Americans to control that future.. 😄
@@charbax As a European, it's always hard for me to figure out or understand such forceful manoeuvres from the USA. It's like they are always afraid of something. But, I repeat, I don't quite understand these things.
12:16 to Intel! hahaha
But is an ARM machine this old able to run Android?
ARMv2 only addresses 26 bits of space, so not enough for Android. But ARMv3 (the ARM6/7 CPUs from Acorn's later RiscPC) could probably run it. gcc had ARMv3 support up to gcc 8, so if you got an early version of Android and built it with gcc 8, it'd run. You'd probably need to add far more RAM than those machines came with, and you'd need drivers for the hardware; I also don't know if Linux ever supported ARMv3, so you might have needed to port the low-level kernel bits to that architecture. So it could be done, if you had a thousand hours to spend on it and the know-how.
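To put the 26-bit limit mentioned above in context, a quick back-of-the-envelope check of the addressable space:

```python
# ARMv2 packs the PC and status flags into one 32-bit register,
# leaving 26 bits for the address -- a hard 64 MB ceiling.
ARM_V2_ADDRESS_BITS = 26
address_space = 2 ** ARM_V2_ADDRESS_BITS
print(address_space, "bytes =", address_space // (1024 * 1024), "MB")
# 67108864 bytes = 64 MB
```

Even a minimal early Android system image plus runtime would struggle inside a 64 MB total address space, which is why the 32-bit ARMv3 (4 GB) is the more plausible floor.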
Steve Job was right. ARM is the way. Ditch Intel and AMD....
Oh god, I'm being middle fingered at 12:04.
So why don't we have desktop ARM systems today?
Yes! THIS!!! Why don't we???
We do. The raspberry pi is ARM powered and runs the original RISCOS. Read up on it and be amazed!
Philip Cooper be
We do, and they are 40%-50% less powerful than x86; that's why. You can't make it go faster because they have a reduced instruction set, and if you increase the instruction set you will also increase power draw, just like x86. When you have a limited clock speed, if you make both processors on the same node you will find that ARM is much slower than x86.
@@fss1704 Your comment aged like milk.
electrons are too fat, Holy grail PHOTONS
ARM! The only way to escape from maniacal prices of shintel... G'bye Intel!
AMD?
Meh, AMD only exists because Intel hired them to manufacture 8088 CPUs for the IBM PC, and AMD… erm… "borrowed the way in which the chip behaved", shall we say.
RISC-V, a free and open source processor architecture.
Serious stuff in there, not much laughing though...
982 liker here :) ♡♡♡♡
What a pile of crap.. lol
I wish that ARM garbage never existed... Acorn RISC Machine by Al Alcorn IS NOT a portable, low power solution, but a processor architecture so poorly made... The ARM CPUs use much more software to run the same thing... Why x86 "mov al, 3;
mul 2" has to look like "bl getnum;
mul r0, r0, #6985;
bl printnum"? What we did to deserve this?? Intel architecture has HARDWARE SUPPORT for many commands (hundreds of instructions only in Pentium 4, think about i9-9900K), and therefore no need for long code that has to be debugged at each and every step...
Holy christ! Do you have but a bit of a clue what you are talking about? Such nonsense 😫
Yes, it had relatively poor code density, but you could buy a machine with 1 MB stock (and still expandable) whereas the IBM PC was limited to 640 kB of primary memory for a donkey's age. The IBM PC depending on the legendary code density of the x86 instruction set to offset its ludicrous system design limitations was not a chapter to be recalled fondly from computer history. I'm guessing you didn't study the x86 ISA long enough to get to the chapter on segment registers (probably chapter 2). Those segment registers alone cost the computer industry thousands of person-years (ask any compiler writer) in faffing around with a pointlessly baroque addressing kluge. But glad to know that someone out there thinks that segment registers were God's gift to French vanilla ice cream. Keeps the world a more interesting place. Interesting, but extremely weird.
@@afterthesmash I wrote quite a bit of x86 assembly in the late 80s and early 90s and it felt like dealing with a Frankenstein monster. Ridonkulous.
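For readers who never met the real-mode segment registers being complained about here: an 8086 physical address is segment × 16 + offset, truncated to a 20-bit bus, so many different segment:offset pairs alias the same byte — exactly the kluge compiler writers had to fight. A sketch in plain arithmetic (the function name is mine, for illustration):

```python
def real_mode_address(segment: int, offset: int) -> int:
    """8086 real-mode physical address: segment shifted left 4 bits plus offset,
    wrapped to the 20-bit address bus."""
    return ((segment << 4) + offset) & 0xFFFFF

# Two different segment:offset pairs hitting the same physical byte:
a = real_mode_address(0x1234, 0x0010)
b = real_mode_address(0x1235, 0x0000)
assert a == b == 0x12350

# And the infamous wrap-around at the top of the 1 MB space:
print(hex(real_mode_address(0xFFFF, 0x0010)))   # 0x0
```

That aliasing (4096 distinct segment:offset spellings of most addresses) is why comparing or normalising far pointers was such a chore in 16-bit compilers.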
@Dos Doktor I have no clue what you’re on about. A register load immediate and a multiplication by an immediate are two instructions on x86, three on ARMv2 (two loads then MUL). I have no idea where you got those subroutine calls (BL) from. Seems like you have no clue.
x86 was trash compared to ARM (still is, with the Apple M1 emulating Intel faster than Intel!). Of course Intel licensed ARM too - the XScale. I will say that programming the 486 after ARM (both in assembly) was a step back, but it was possible to make code run quickly on a 486 by programming it in a RISC way - using only the simple instructions with low cycle counts. The lack of registers on x86 was annoying, but tricks like disabling interrupts during rasterisation helped to a degree.