Discussing System On Chip (SoC) - Computerphile

  • Published 21 Nov 2021
  • With the hype around Apple's M1 chip, Dr Steve Bagley discusses what the big deal is with the system on chip approach to building computers - spoiler, it's not a new thing!
    This video was filmed and edited by Sean Riley.
    Computer Science at the University of Nottingham: bit.ly/nottscomputer
    Computerphile is a sister project to Brady Haran's Numberphile. More at www.bradyharan.com

Comments • 303

  • @TheBasementChannel
    @TheBasementChannel 2 ปีที่แล้ว +328

    I’m mostly impressed the camera guy casually knew what week number it is

    • @Computerphile
      @Computerphile  2 ปีที่แล้ว +220

      Pure coincidence from a different project I work on :) -Sean

    • @andrewharrison8436
      @andrewharrison8436 2 ปีที่แล้ว +19

      Could have been bluffing, after all who would know (apart from subscribers to Computerphile - oh wait).

    • @axelnils
      @axelnils 2 years ago +15

      Still have no clue why the rest of the world hasn't switched to the superior Swedish week-number/weekday way of referring to dates

    • @BriaandMatt10
      @BriaandMatt10 2 ปีที่แล้ว +5

      The camera man stays knowing

    • @aaronsims2569
      @aaronsims2569 2 years ago +4

      @@axelnils does the first week of a new year always start on the same day of the week, like a Sunday, or can it differ depending on what day Jan 1 falls on? I like the concept.
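For reference, the ISO 8601 week numbering used in Sweden answers this: weeks always start on Monday, and week 1 is the week containing the year's first Thursday, so the first week does not begin on a fixed date. A quick check with Python's standard library (illustrative, not from the video):

```python
from datetime import date

# ISO 8601: weeks start on Monday; week 1 is the week containing
# the year's first Thursday. So 1 January can belong to the last
# ISO week of the *previous* year.
print(date(2021, 1, 1).isocalendar())  # year 2020, week 53, weekday 5 (Friday)
print(date(2021, 1, 4).isocalendar())  # year 2021, week 1, weekday 1 (Monday)
```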

  • @spikeevans1488
    @spikeevans1488 2 years ago +71

    Very good video. As a retired HDD firmware engineer, I can tell you that storage devices have had systems on a chip (multicore) for a long time. In fact everything on an HDD has gotten smaller, faster, and less power-hungry over the decades.

  • @sepgorut2492
    @sepgorut2492 2 ปีที่แล้ว +73

    I'm sure everyone would like to join me in wishing the Acorn A3010 SoC a very happy birthday 🎂

  • @flyball1788
    @flyball1788 2 ปีที่แล้ว +80

    "I am not a chip designer" - never thought I'd hear that disclaimer, hooray for engineers :)
    IAACD (I am a chip designer) and this was a really good outside view of why SoC so good job all.
    Only addition I'd make is on timescales. It used to take a couple of engineers about two months from inception to prototype to design a new computer (what I did before SoCs), and a few thousand quid for a handful of prototypes; you could test and fix it in real life with a soldering iron until it was working, and start selling the final product a few months later. With an SoC it takes more than two months just to specify very precisely what it does, and probably more like a team of 12 engineers, two years and a few hundred thousand quid to get a chip that has a chance of working. And there's no bodging it with a soldering iron when it doesn't, so add another pile of cash and six months or more if you got it wrong, which is why most technological changes are now evolutions, not revolutions.

    • @bjarnenilsson80
      @bjarnenilsson80 2 years ago +7

      Not to mention getting capacity in a sufficiently advanced foundry with the right production tech, and apparently you can't just go to a competitor, at least not if you want the latest and greatest tech

    • @teasipperda3rd334
      @teasipperda3rd334 2 ปีที่แล้ว

      @@bjarnenilsson80 do u know how to code or program

  • @VibesNick
    @VibesNick 2 years ago +34

    Thank you Computerphile and Steve for a lovely video, once again! Probably my favorite channel on YouTube.

  • @brandonmack111
    @brandonmack111 2 ปีที่แล้ว +84

    As far as signaling, all of this is true, even without taking capacitance and em interference into account, which are far bigger problems for high frequency signals, and are much easier to manage / plan for in an SOC.

  • @WistrelChianti
    @WistrelChianti 2 ปีที่แล้ว +9

    Thanks! Very interesting discussion of the differences and pros and cons. The 3GHz track length thing was fascinating! Hadn't appreciated at that speed we were getting into such considerations.

  • @lerssilarsson6414
    @lerssilarsson6414 2 ปีที่แล้ว +75

    What we had before a microcontroller? A suitcase full of TTL circuits.

    • @narobii9815
      @narobii9815 2 ปีที่แล้ว +4

      You could fit it in a single suitcase?

    • @Dong_Harvey
      @Dong_Harvey 2 ปีที่แล้ว +2

      @@narobii9815 depends on how big a suit David Byrne can wear

  • @francismendes
    @francismendes 2 ปีที่แล้ว +23

    I already know what a SoC is, but I watch the video anyway because I know I'll still learn a thing or two from Dr. Steve.

  • @Species1571
    @Species1571 2 ปีที่แล้ว +29

    0:52 Disassembly was really easy back in those days.

    • @godfather7339
      @godfather7339 2 ปีที่แล้ว +2

      More disassembly = Less profit

  • @TaleTN
    @TaleTN 2 ปีที่แล้ว +69

    Actually I think you can go back a bit further, because in the early/mid 1980s there already were quite a few MCUs that combined CPU with timer(s), A/D converter, UART, etc.

    • @johnsenchak1428
      @johnsenchak1428 2 ปีที่แล้ว

      I agree with you as most tech channels resort to attention getting behavior to seek views

    • @tackline
      @tackline 2 ปีที่แล้ว +14

      SoCs are distinguished from microcontrollers by being at the heart of a PC or PC-like system. Microcontrollers typically integrate volatile and non-volatile memory. You could argue about terminology all day, but the intent is different between SoCs and MCUs.

    • @johnsenchak1428
      @johnsenchak1428 2 years ago

      @@tackline No, I am saying that custom ASICs are the same thing as an SoC, as they contain all the necessary building-block circuits needed for the main function of a device such as test equipment, big-screen TVs, network appliances, network routers, and specialist medical devices

    • @jeromethiel4323
      @jeromethiel4323 2 years ago +7

      @@johnsenchak1428 ASICs were the precursors to SoCs. Like most technology, everything builds on the previous generation. PALs were the precursors to ASICs, and are still in use today. I remember, back in the 80's, using ROM chips as cheap PALs once you finalized your design, and using EPROMs while in development.

    • @peterfireflylund
      @peterfireflylund 2 ปีที่แล้ว +2

      Yes, the 80188 and 80186 from 1982 were microcontroller versions of the 8088 and 8086. They had on-chip timers, DMA controllers, interrupt controller, clock generator, and wait state generator. No UARTs or A/D converters, though.
      There were plenty of 8051 variants with UARTs and A/D converters and lots of other goodies.

  • @Jone952
    @Jone952 2 ปีที่แล้ว +52

    It's crazy that in one clock cycle light can only travel 10cm. These are some incredible machines

    • @ArumesYT
      @ArumesYT 2 ปีที่แล้ว +17

      It's even worse, because signals travel slower through metal than light does in a vacuum.

    • @Erikitties
      @Erikitties 1 year ago +1

      Plus more radiation pollution, and low-voltage DC losses over the huge 1970s-era cables and connectors.
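The 10 cm figure above checks out: at 3 GHz one clock period is a third of a nanosecond, and light in a vacuum covers roughly 10 cm in that time. Signals on copper traces are slower still, as the reply notes. A back-of-envelope sketch (the ~0.5c trace velocity is a typical assumed figure, not from the video):

```python
C = 299_792_458      # speed of light in vacuum, m/s
f = 3e9              # 3 GHz clock
period = 1 / f       # one clock cycle, in seconds (~0.33 ns)

vacuum_cm = C * period * 100        # distance light covers per cycle, in cm
trace_cm = 0.5 * C * period * 100   # assuming signals travel at ~0.5c on a PCB trace

print(f"{vacuum_cm:.1f} cm per cycle in vacuum")   # ~10.0 cm
print(f"{trace_cm:.1f} cm per cycle on a trace")   # ~5.0 cm
```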

  • @irwainnornossa4605
    @irwainnornossa4605 2 ปีที่แล้ว +3

    This video was absolutely amazing. Please, do more like these. Similar topic, maybe about other HW…

  • @MeneGR
    @MeneGR 2 ปีที่แล้ว

    I love that CTM644 in the background, on top of the MP-3 TV tuner. Really nice!

  • @leomuzzitube
    @leomuzzitube 2 ปีที่แล้ว +48

    I think you should have mentioned the most important leap in modern SoC design: incorporating the RAM.

    • @jpdemer5
      @jpdemer5 2 ปีที่แล้ว +2

      The RAM on the new M1 Macs isn't really "on the chip" - it's still a soldered-on component. It is in fact possible (albeit via extremely specialized work) to upgrade the memory on an M1 device.

    • @leomuzzitube
      @leomuzzitube 2 years ago +1

      @@jpdemer5 True, but the fact that it is not on the same die doesn't change the characteristics in my opinion - it is still simpler, faster, less flexible, and non-upgradeable (for practical purposes). What most people consider a "chip" is the package anyway, although I agree that technically speaking it is a "System-in-a-Package".

    • @metroid031993
      @metroid031993 2 ปีที่แล้ว +1

      @@leomuzzitube that's the thing: the RAM is on a separate package, which is then soldered on top of the CPU. They are two separate packages. That said, most SoCs do have a decent bit of internal SRAM, which is great from a bootstrapping perspective (DRAM tends to need to be "trained"), as well as a security perspective (raises the barrier to entry for attackers).

    • @elimalinsky7069
      @elimalinsky7069 2 ปีที่แล้ว +1

      The RAM is on the same chip but on a separate die within the chip, if that makes sense. It's what we would call separate chips back in the day but now it's packaged together in an enclosed shell which looks like a classical chip at first glance. We now call an integrated system on the same silicon wafer - a die, and the package which may contain several dice - a chip.

    • @metroid031993
      @metroid031993 2 ปีที่แล้ว

      @@elimalinsky7069 this isn't correct. They have a BGA DRAM chip which then gets soldered on top of the CPU

  • @Lttlemoi
    @Lttlemoi 2 years ago +51

    Nowadays, if you want to customize the entire thing down to the CPU, you can always plonk down an FPGA, use a soft ARM or RISC-V core, and build the rest of your custom circuitry into the firmware without ever needing to design and produce a custom physical chip.

    • @anujmchitale
      @anujmchitale 2 ปีที่แล้ว +16

      It's too expensive for most companies to do this. Hence dedicated chip vendors perform this and sell the customized chips to the client business.

    • @Lttlemoi
      @Lttlemoi 2 ปีที่แล้ว +7

      @@anujmchitale It's worked wonders for the startup I've been a part of though. No custom chips or socs that limit choices of what other hardware you can attach. Just a big fpga that does all the custom signal handling and you're set.

    • @JahMusicTube
      @JahMusicTube 2 ปีที่แล้ว +23

      At the cost of lower clock speed and higher power than an equivalent ASIC though

    • @anujmchitale
      @anujmchitale 2 ปีที่แล้ว +3

      @@Lttlemoi oh nice.

    • @AntonioBarba_TheKaneB
      @AntonioBarba_TheKaneB 2 ปีที่แล้ว +4

      yeah, but then if you want to optimize the costs, speed and power consumption you have to convert it to ASIC. It's a trade off of course.

  • @scottfranco1962
    @scottfranco1962 2 ปีที่แล้ว +2

    There is also a tremendous speed and power advantage to routing high-fan-out peripherals on chip vs. off chip. Any time you have to pass a signal through a pad driver and off the package, the driver power goes up and the speed of the line goes down. One of the great drivers of SoCs was the making of cells out of standard design blocks such as CPUs. To do this, silicon designers would separate the cells (CPUs, peripherals, etc.) from their pad-ring drivers. Thus a single cell design could be dropped into an SoC using multiple cells internally, or dropped into a pad ring to make a stand-alone design.

  • @awanderer5446
    @awanderer5446 2 ปีที่แล้ว +3

    Dr. Steve Bagley, what a lovely guy. Looks like he's living the dream with all the retro devices in his office!

  • @Archimedes75009
    @Archimedes75009 2 ปีที่แล้ว +8

    ARM250-based Archimedes machines (A3010, A3020, A4000) can be very easily overclocked up to 24 MHz thanks to faster DRAMs.
    It's all detailed on the Stardot forum.

  • @PiechureQ
    @PiechureQ 2 years ago

    Wow. Now I understand the whole concept. Great video 😇

  • @batkung
    @batkung 2 ปีที่แล้ว

    the motorola 68HC11XX chips were a lot of fun to play with, and they were used in a lot of different consumer devices.

  • @Jamiered18
    @Jamiered18 2 ปีที่แล้ว +1

    Nice. Very comprehensive video. Lots of good info in the comments too

  • @Roxor128
    @Roxor128 2 ปีที่แล้ว +1

    I recall reading about an open source project years ago for a SoC that implemented an IBM PC, though with VGA rather than CGA or MDA and no FPU option.

  • @paco3447
    @paco3447 2 ปีที่แล้ว +3

    The 1984 Commodore/MOS "TED" chip was a sort of early SoC, prior to any ARM chip. Please, some credit for Commodore, MOS/CSG, etc. What insane obsession with Apple/ARM.

    • @jonragnarsson
      @jonragnarsson 2 ปีที่แล้ว

      I didn't know about that. Seems like a 6502 version of the ARM 250.

  • @jimsmind3894
    @jimsmind3894 2 ปีที่แล้ว +34

    I'm sure I heard somewhere that the latest Raspberry Pi is originally a SoC for a set top box.

    • @creesch
      @creesch 2 years ago +24

      That's entirely possible; a lot of those sorts of devices (which are effectively computers in their own right) use ARM SoCs like the one you'll find on a Raspberry Pi. Edit: did a bit of digging, and the SoC in the original Pi (v1) was indeed also used in set-top boxes like the first-gen Roku. The SoC in the latest generation (Pi 4B) is more or less made for the Pi, but very much resembles a different model that Broadcom (the SoC manufacturer) also makes, which is once again intended for set-top-box use.

    • @techobsessed1
      @techobsessed1 2 ปีที่แล้ว +8

      The first one certainly was.

    • @teasipperda3rd334
      @teasipperda3rd334 2 ปีที่แล้ว

      @Nick Williams do u know how to code or program

  • @SlinexUA
    @SlinexUA 2 years ago

    I would also add that an SoC is not only about computational power and the exact circuit design on the chip (features and so on) but also about how end-user devices are manufactured. The silicon itself is much smaller than the final chip "package", because the package serves not just as a shell but also as a heat conductor and an array of contact points. There are pros and cons that follow from this fact too.

  • @rafikadonishammoutene3998
    @rafikadonishammoutene3998 2 ปีที่แล้ว +2

    hey @Computerphile, I really hope you will discuss the quadtree next time and its use in image comparison and in sorting spatial data (the main purpose of the invention of the quadtree in 1974)! Thanks in advance ;

  • @dezmondwhitney1208
    @dezmondwhitney1208 2 ปีที่แล้ว

    I thought this explanation of SoCs was very good indeed, and most interesting as well. Many thanks.

  • @RagHelen
    @RagHelen 2 ปีที่แล้ว +20

    "Somewhere here in my archive" - which is everything in a 200 degree radius behind his back.

    • @morpheus6749
      @morpheus6749 2 ปีที่แล้ว

      Uh... radius is not measured in degrees. Angles are measured in degrees. Go back and repeat 5th grade.

    • @RagHelen
      @RagHelen 2 ปีที่แล้ว

      @@morpheus6749 Field of view, you nit.

    • @morpheus6749
      @morpheus6749 2 ปีที่แล้ว

      ​@@RagHelen LOL.. radius is not the same thing as field of view. Like I said, go back and repeat 5th grade. Maybe this time you'll learn what radius means. Maybe.

    • @reflectedcrosssite2848
      @reflectedcrosssite2848 2 ปีที่แล้ว +1

      @@morpheus6749 it was sort of obvious what he meant, and while he was wrong, you could have put it more nicely. Your little moment of "superiority" there just shows how badly adjusted person you are

    • @morpheus6749
      @morpheus6749 2 ปีที่แล้ว

      @@reflectedcrosssite2848 Obvious? What exactly does "a 200 degree radius" mean to you? Do tell.

  • @heyarno
    @heyarno 2 years ago +3

    One more advantage of an SoC is power efficiency.
    Transferring data uses more power than the actual calculations in some discrete systems.
    Having the data close reduces that energy requirement.
    Latency is also reduced.
    This also translates to cost savings on cooling solutions,
    and weight savings for mobile devices.

    • @Freshbott2
      @Freshbott2 2 years ago

      It beggars belief that we haven't already moved more discrete systems onto SoCs and left everything else for where you really need the extensibility

    • @heyarno
      @heyarno 2 years ago

      @@Freshbott2 There are multiple factors. It's harder to plan for the demand for a more specific piece of circuitry, and more complex SoCs use more die space, which reduces yields in production. A modern PC is close to an SoC, but since it's rather complex, it's more economical to leave some aspects modular. So at the moment it's either a hybrid approach, a smaller SoC, or a very big production run from a company that can dictate what the market demands. Apple's M1 chips, for example, are such a case: Apple can afford to ignore every customer whose needs are not fully met.

    • @Freshbott2
      @Freshbott2 2 years ago

      @@heyarno I don't really think that's it though, because whether it's a Mac or a PC laptop of whatever kind, it's already not fully meeting the majority of people's needs. The only people whose needs are fully met are the ones an SoC wouldn't benefit. Forget all the different encoders and enclaves and DSPs etc.; just the efficiency gain from moving to truly integrated graphics is what people want. Everyone wants that MacBook body.

  • @davidgillies620
    @davidgillies620 2 ปีที่แล้ว

    The primary design constraint in GHz+ PCB design isn't propagation delay but the fact that those are microwave frequencies. Traces act like transmission lines, not wires, and fast edges generate very high frequency harmonics. You can wave away impedance matching at the MHz clocks speeds of 1980s/90s machines, but not now.

  • @902Boots
    @902Boots 2 ปีที่แล้ว

    Any videos where you go over your collection of gear there? If not, make one!

  • @Z4KIUS
    @Z4KIUS 2 years ago

    Integration progresses in many directions, but it gets a bit complicated with MCMs. Should we call them System on Package? Or does the lack of a GPU make them not count? APUs, on the other hand, are clearly SoCs, as they are monolithic.
    Technically all of them are able to run without a chipset at all, but I don't think there are any aftermarket chipsetless motherboards.
    For Intel there are mobile SoCs that have the PCH (chipset) die on the "CPU PCB", but not all of them do that.

  • @Rob77896
    @Rob77896 2 ปีที่แล้ว +1

    Awesome video:)

  • @yumtumbout
    @yumtumbout 2 ปีที่แล้ว

    Great topic

  • @jkbullitt8986
    @jkbullitt8986 ปีที่แล้ว

    Superb!

  • @vietstonedotdev
    @vietstonedotdev 2 ปีที่แล้ว

    Nice, even for beginners. Thanks!

  • @DaveShipp
    @DaveShipp หลายเดือนก่อน

    I love that you used Acorns to explain SoC

  • @AgnostosGnostos
    @AgnostosGnostos 2 ปีที่แล้ว

    Great channel. I have downloaded all its videos with an external YouTube download program. Just check Wikipedia for the article "Comparison of YouTube downloaders".

  • @marklonergan3898
    @marklonergan3898 2 ปีที่แล้ว +1

    10:16 - I think you forgot to put the disclaimer up, Sean! 😀

  • @EvilSandwich
    @EvilSandwich 2 ปีที่แล้ว +9

    So would you consider the ULA on the ZX81 to be a primitive version of an SoC?

    • @framegrace1
      @framegrace1 2 ปีที่แล้ว +2

      ULA is generally considered part of the CPU, so no.

    • @Archimedes75009
      @Archimedes75009 2 ปีที่แล้ว +3

      @@framegrace1 it's another chip, nothing to do with a CPU

    • @framegrace1
      @framegrace1 2 ปีที่แล้ว +3

      @@Archimedes75009 YEah. sorry. Thought you were talking about the ALU. (ULA is ALU in my language, hence the confusion).

    • @Archimedes75009
      @Archimedes75009 2 ปีที่แล้ว +2

      @@framegrace1 Ah, I see, I am French and it's the same here :-)

    • @WistrelChianti
      @WistrelChianti 2 ปีที่แล้ว +2

      Yes and no. I mean, it isn't a complete system, so no, but presumably it allowed them to create one chip that did the work of what would otherwise be several different ones. So while it wasn't a complete system, it presumably had at least some of the advantages of an SoC around cost/complexity saving, at the alternative expense of needing to design (the final manufacturing stage of) the ULA chip.

  • @el_es
    @el_es 2 ปีที่แล้ว

    The modern x86 processors are kind of SoCs too: built-in cache, built-in memory controller, built-in PCI Express I/O bus controller, etc. Compare that to even 20-25 years ago, when there was a 'north bridge' interface for the memory controller and a 'south bridge' for I/O; if RAM serves, that lasted until around the Intel Core 'i' era. On certain even older parts (386, 486?) even the cache was external. If nowadays a GPU is built in too, all you need is RAM and storage, and video output feeds directly from the chip.

  • @SebastianUnterberg
    @SebastianUnterberg 2 ปีที่แล้ว

    I guess you can ignore curves or detours in traces when calculating transmission speed, due to transmission line theory? Only the distance between origin and destination is important when calculating the time needed for the signal to arrive. Is that correct?

    • @the1exnay
      @the1exnay 2 ปีที่แล้ว

      Transmission law theory?

    • @SebastianUnterberg
      @SebastianUnterberg 2 ปีที่แล้ว

      @@the1exnay ohh sorry. Transmission line theory.

  • @tackline
    @tackline 2 ปีที่แล้ว +4

    Given how many more transistors that, say, the Z80A of a Spectrum has than the Spectrum ULA, it's a shame that SoCs didn't come about in around 1976. Put a DRAM-compatible interface on a ROM/PROM and a relatively cheap computer could have been built with a 40-pin SoC.

    • @d2factotum
      @d2factotum 2 ปีที่แล้ว

      In a way they'd already started down that road with the Z80A--the CPU there included the capability to refresh the DRAM in the computer, which previously had been the job of additional circuitry on the board in earlier microprocessors.

    • @teasipperda3rd334
      @teasipperda3rd334 2 ปีที่แล้ว

      @@d2factotum do u know how to code or program

  • @Bianchi77
    @Bianchi77 2 ปีที่แล้ว

    Nice video, keep it up , thank you :)

  • @Veptis
    @Veptis 2 ปีที่แล้ว +1

    We are reaching performance levels where latency is limited by distance. Signals want to travel at the speed of light, but they get held back by the fields generated in the system.
    The trend will continue.

  • @MrVadymMykh
    @MrVadymMykh 2 ปีที่แล้ว

    Oh, that “if you remember” moment)))))) nice

  • @divanvanzyl7545
    @divanvanzyl7545 2 ปีที่แล้ว

    00:52 That lid was ready to go 🤣

  • @odysseus9672
    @odysseus9672 ปีที่แล้ว

    No mention of key terms from back in the day: northbridge and southbridge?

  • @roshnikumari3806
    @roshnikumari3806 2 ปีที่แล้ว

    Nice video

  • @zlcoolboy
    @zlcoolboy 2 ปีที่แล้ว +5

    This must have been the kind of thing that totally makes sense in hindsight. Of course you're not going to run a bunch of traces that are unnecessary if you just cram those modules together.

    • @techobsessed1
      @techobsessed1 2 ปีที่แล้ว +4

      It's made sense since the first integrated circuits at the tail end of the 1950s. It just took time for it to become cheap enough to put everything on a single chip.

    • @Saens406
      @Saens406 ปีที่แล้ว

      "just" like its that easy?

  • @talideon
    @talideon 2 ปีที่แล้ว +3

    The ARM250, which you found in the A3010 ran at a clock speed of 12MHz vs the ARM2's 8MHz, but while the latter only managed 4MIPS, the former ran at 7MIPS, and that extra 1MIPS of speed was, in part, down to the fact everything (aside from the memory) was so tightly integrated on the ARM250 SoC!

    • @tackline
      @tackline 2 ปีที่แล้ว +1

      I can't see how that would make any difference. It's literally all four of the original design placed on a single die, without so much as bothering to rework the layouts so that they fit together better. I can believe in better compiler optimisers (MIPS is measured from a particular C benchmark) and for the same screen mode, the video will be using a smaller fraction of the total bandwidth.

    • @Archimedes75009
      @Archimedes75009 2 years ago +1

      @@tackline That's 7 MIPS for a 12 MHz system vs 4.5 for 8 MHz. Yes: 2.5 extra MIPS DOES matter.
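Normalising the figures quoted in this thread per MHz separates the clock-speed gain from the integration gain (a rough sketch using the numbers above, not official benchmarks):

```python
chips = {
    "ARM2 (discrete)": {"mhz": 8,  "mips": 4.0},   # figures quoted in the thread
    "ARM250 (SoC)":    {"mhz": 12, "mips": 7.0},
}

for name, c in chips.items():
    per_mhz = c["mips"] / c["mhz"]
    print(f"{name}: {per_mhz:.3f} MIPS/MHz")

# ARM2 comes out at 0.500 MIPS/MHz, the ARM250 at ~0.583:
# scaling the clock alone would predict 6 MIPS at 12 MHz, so the
# extra ~1 MIPS is attributable to the tighter integration.
```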

  • @ChrisBigBad
    @ChrisBigBad 2 ปีที่แล้ว

    Ah! That was a Pixel 6, eh? Nice.
    So half of the PC systems today are SoCs? With integrated graphics and an integrated memory controller and integrated north bridge?

  • @SMJSmoK
    @SMJSmoK 2 ปีที่แล้ว

    I'm a simple man. I see Steve Bagley, I hit like.

  • @khordad1216
    @khordad1216 2 ปีที่แล้ว +1

    So... essentially...
    an SoC is just a fancier, more capable microcontroller?
    I'd like to hear some thoughts on this.

  • @ArturdeSousaRocha
    @ArturdeSousaRocha 2 ปีที่แล้ว

    It always amazes me that the UK had so many native computer types that were their own thing and weren't exported.

  • @szymon1051tv
    @szymon1051tv 2 ปีที่แล้ว

    Are Intel or AMD chips SoCs? They have many controllers built into the "CPU".

  • @MrAwesomeKociks
    @MrAwesomeKociks 2 ปีที่แล้ว +1

    Why are they talking about Shadow of Chernobyl?

  • @bentationfunkiloglio
    @bentationfunkiloglio 2 ปีที่แล้ว

    Oh man, 1987! Awesome.
    Not on topic, but... I wish I still had my old Commodore 64 with its floppy disks and cassette tape storage to show my teenage son.

  • @tma2001
    @tma2001 2 ปีที่แล้ว

    for a looong time we had chipsets (and still do for desktop motherboards).

  • @mattbunce2509
    @mattbunce2509 2 years ago

    So... how does this square with Veritasium's light-second light-bulb circuit, which lights almost instantly because of the energy field? If the light turns on almost immediately despite very long wires, why is PCB track length an issue that needs to be considered in relation to the time taken for the electrical signal to reach the other end? Based on what Veritasium was saying, it seems track length wouldn't be important and only the position of the chip relative to the CPU would matter (it wouldn't matter if the tracks were different lengths or went all around the houses to reach their destination).
    I understand there might be issues with attenuation, and potentially signals resonating within the wires, but here it seems you are focusing on the speed of light and the length of the tracks to determine when information arrives (which makes sense to me), yet this seems to go against the whole concept Veritasium has got people talking about.

    • @lawrencedoliveiro9104
      @lawrencedoliveiro9104 2 ปีที่แล้ว

      At gigahertz clock speeds, it takes a long time for signals to get anywhere.
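To put numbers on that reply: the issue isn't how quickly energy starts to flow (Veritasium's point) but whether a clean, settled logic edge can cross the trace within a clock period. A sketch assuming a typical ~0.5c signal velocity on FR-4 and a 15 cm trace (both illustrative figures, not from the video):

```python
C = 299_792_458          # speed of light in vacuum, m/s
v = 0.5 * C              # assumed signal velocity on an FR-4 PCB trace
length = 0.15            # 15 cm trace across a motherboard

delay_ps = length / v * 1e12     # one-way propagation time, picoseconds
period_ps = 1 / 3e9 * 1e12       # one 3 GHz clock period, picoseconds

print(f"trace delay:  {delay_ps:.0f} ps")    # ~1001 ps
print(f"clock period: {period_ps:.0f} ps")   # ~333 ps: the edge arrives cycles late
```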

  • @goodboi42
    @goodboi42 2 ปีที่แล้ว +13

    Me, an ECE student, who somehow managed to pass this semester: _"hmmm, interesting..."_
    Jokes apart, it's videos like this that keep me going. Thanks!

    • @jacoblin0820
      @jacoblin0820 2 ปีที่แล้ว +1

      EE student here, still struggling.

    • @QuantumFluxable
      @QuantumFluxable 2 ปีที่แล้ว +1

      studying EE is not worth it imo unless you want a job in academics or some fancy prosthetics firm. most of what i learned about engineering (digital circuits, microcontrollers, etc) i learned either on the job or DIY style. most complicated formulas from my studies i actually used was capacitor charging curves....

    • @jacoblin0820
      @jacoblin0820 2 ปีที่แล้ว +2

      @@QuantumFluxable I have no choice since its either EE or CS in my college. There is no Computer engineering, and I want to learn more about hardware design ( VLSI design, embedded system).

  • @Wizardofro
    @Wizardofro 2 ปีที่แล้ว +5

    You can make a wireless micro-bridge between GPU/CPU/microchips, shielded from outside interference, and it can act as a bus for interoperability and comms; e.g. cryptography/time-based schemes, as the chips would "know" the distance to each other and could then encrypt the data.

    • @XenoTravis
      @XenoTravis 2 ปีที่แล้ว +5

      Do you have any links demonstrating this? Sounds interesting.

    • @danieljensen2626
      @danieljensen2626 2 ปีที่แล้ว +2

      You could also just use wires. 🤷‍♂️ Wireless is slower and much harder to design. It's only really better for portability.

    • @tanmaypanadi1414
      @tanmaypanadi1414 2 years ago +2

      Intel Labs seems to be going the route of photonics. I heard about it in an interview with Ian Cutress (AnandTech) on his YouTube channel

  • @computer_toucher
    @computer_toucher 2 ปีที่แล้ว +11

    How did I know he'd use an Acorn as the first example lol

    • @DanEllis
      @DanEllis 2 ปีที่แล้ว +1

      They seem oddly obsessed with them there.

    • @Archimedes75009
      @Archimedes75009 2 ปีที่แล้ว +3

      @@DanEllis Because the ARM250 is the 1st ever SOC. And it was created by Acorn.

  • @MrMegaPussyPlayer
    @MrMegaPussyPlayer 2 ปีที่แล้ว

    ... 0:34 It really bugs me, that I can't make an abbreviation with SoC that turns it to SoCK ...

  • @syntaxerorr
    @syntaxerorr 2 ปีที่แล้ว

    South bridge, North bridge. I feel like these are terms you aren't using, but should be with these systems.

  • @morzee94
    @morzee94 2 ปีที่แล้ว

    Acorn could have been HUGE. I know that Arm is super successful but I wish Acorn were still around. I loved their computers, so ahead of their time.

  • @Erikitties
    @Erikitties 1 year ago

    In addition to signal-speed limitations, voltage loss over, say, 10x to 10,000x greater distances with 1960s-style motherboard technology is not insignificant, plus the increased radiation emissions. It seems there should be a huge market for highly upgradeable SoCs.
    True computer-modding "enthusiasts" would of course need to learn to solder, or pop down to the repair shop to have new components hooked up.
    It seems motherboards and all the ridiculous connector cables and cable-management nonsense need to become irrelevant relics!
    We would love an episode on the SoC modding going on at the institution there.

  • @heyarno
    @heyarno 2 ปีที่แล้ว +5

    One of the disadvantages of an SoC is that it requires more die area than any of the individual components would,
    so on big designs the yields become less economically viable.

  • @BobDiaz123
    @BobDiaz123 2 ปีที่แล้ว +1

    I believe that future low-cost computers for business and home use will all be SoC systems. It's an easy way to reduce cost, yet provide the functions the majority of people want.

    • @Freshbott2
      @Freshbott2 2 ปีที่แล้ว +1

      They already are! What I hope is console grade SoCs in Windows gaming thin-and-lights will show up soon

    • @fsxaircanada01
      @fsxaircanada01 2 ปีที่แล้ว

      Apple M1 arrived last year

  • @selvamthiagarajan8152
    @selvamthiagarajan8152 2 ปีที่แล้ว

    Who is the man asking questions?

  • @mdtrx
    @mdtrx 2 ปีที่แล้ว +2

    And now we are moving "back" to bridges with chiplets...

  • @stensoft
    @stensoft 2 ปีที่แล้ว

    3:44 It almost looks like the serial controller has USB logo on it :)

  • @Clancydaenlightened
    @Clancydaenlightened 2 ปีที่แล้ว

    Hey Ryan

  • @Clancydaenlightened
    @Clancydaenlightened 2 ปีที่แล้ว

    I'm still here

  • @johnsenchak1428
    @johnsenchak1428 2 ปีที่แล้ว +2

    Most flat-screen televisions today have an SoC (a custom ASIC). Also, when you combine more silicon on one chip you use less power, and therefore have less heat to dissipate, because the transistors on the die keep decreasing in size.

  • @neevpenkar4955
    @neevpenkar4955 2 ปีที่แล้ว

    Hey!

  • @treyquattro
    @treyquattro 2 ปีที่แล้ว

    I have an "archive" like Steve's (Dr. Bagley) too: a big pile of old computer crap that I can't bring myself to throw out!

  • @Clancydaenlightened
    @Clancydaenlightened 2 ปีที่แล้ว

    Goto work

  • @mbian0same762
    @mbian0same762 2 ปีที่แล้ว

    oh... I thought there was a "photoshop chip", a "Crysis chip" to address "can it run Crysis", an "e-peen chip" to run CPU-Z exclusively, and an Nvidia chip so that Linus (both of them) can give Nvidia the finger.
    Not exactly the exact understanding, but close enough. I do CS too!

  • @shanehebert396
    @shanehebert396 2 ปีที่แล้ว

    The market is for computer users, so enthusiasts and such aren't really a part of it. Moving computers to SoCs means that a number of things are given up. Expandability, particularly RAM expansion, is gone. You need to buy the computer as you want it (and as you think you may want it before its end of life), because you won't easily be adding more RAM to it; you would have to replace the whole SoC to go from, say, 8GB to 16GB. So if you buy a machine and then find you really need more memory, you have to buy an entire new computer and figure out something to do with the old one. Hopefully you figure that out within the return policy and can send it back for the one you want.

  • @micharoman9188
    @micharoman9188 2 years ago

    0:51 Amiga :)

  • @tranthien3932
    @tranthien3932 2 years ago

    I like the fact that the coke container just hangs around in the back

  • @sandy666ification
    @sandy666ification 2 years ago

    Mac?

  • @teasipperda3rd334
    @teasipperda3rd334 2 years ago

    Awesome video also I am making my own coding security company while in 11th grade lol

  • @cwtrain
    @cwtrain 2 years ago

    The case of Diet Coke on the desk is just... _Chef's kiss_

    • @AileTheAlien
      @AileTheAlien 2 years ago

      This episode of Computerphile brought to you by...

  • @worm628
    @worm628 2 years ago +1

    I'm going to start calling my pile of dusty electronics "the archive" also. Might improve partner approval factor.

  • @Clancydaenlightened
    @Clancydaenlightened 2 years ago

    how a computer scientist gets doxxed by a high-school dropout, it's hilarious

  • @lazypig93
    @lazypig93 2 years ago +2

    Is there a way to audit SoC to prevent big tech from adding subsystem or backdoor?

    • @Roxor128
      @Roxor128 2 years ago +2

      Creating it in a hardware description language would at least ensure the intended design is free of backdoors. If you used it to program an FPGA, you'd probably be alright, though it's still possible for the software used to create the bitstream to maliciously insert functionality. If you used an open-source toolchain as well, you'd eliminate that avenue. Anything beyond that, like a fixed-function block in the FPGA itself, is getting into "more trouble than it's worth" territory for the malicious actor.
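
One concrete step in the audit chain this reply describes (rebuilding the bitstream yourself with an open-source toolchain and checking it against a published known-good hash) can be sketched in a few lines; the file name and the reference hash here are hypothetical, not from any real project:

```python
# Sketch of one audit step: check that a locally rebuilt FPGA
# bitstream matches a hash published alongside the HDL source,
# so a maliciously modified toolchain output would be detected.
import hashlib

def sha256_of(path: str) -> str:
    """Hash a file incrementally so large bitstreams fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical usage, comparing against a published reference hash:
# known_good = "..."  # distributed with the HDL source
# assert sha256_of("build/top.bit") == known_good
```

This only proves the artifact matches the published build; trusting the build itself is where the reproducible open-source toolchain comes in.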

  • @Subbestionix
    @Subbestionix 2 years ago

    Was a bit repetitive but still quite good

  • @treyquattro
    @treyquattro 2 years ago +1

    and the new Apple SoCs have the RAM on chip too so you can't expand it after you've configured the system. (Non-upgradeable RAM has been a "feature" since before Apple went to ARM SoCs it must be said, just that it was still a separate component on a motherboard)

    • @QuantumFluxable
      @QuantumFluxable 2 years ago

      yea i don't get why dr bagley is still such an apple fanboy either. back in the day, sure, they made some pretty great & revolutionary products. today? not so much...

    • @bakedbeings
      @bakedbeings 2 years ago

      @@QuantumFluxable Perhaps try to think in terms of technologies and utility rather than brands; the extra resolution will help you to make more nuanced decisions. The A64FX chips in Fugaku (the #1 computer on the TOP500 supercomputers list) also use HBM2 on the CPU package. It's a design with great performance upside if you know how to gauge your needs at time of purchase and the configuration is within your budget.

    • @QuantumFluxable
      @QuantumFluxable 2 years ago

      @@bakedbeings we're talking about the right to repair of ordinary customers, why would i care what a random supercomputer does? for a supercomputer you could just do fully custom one-of-a-kind silicon and probably still have it work out financially...

    • @bakedbeings
      @bakedbeings 2 years ago +2

      @@QuantumFluxable You might be replying to the wrong thread. You didn't mention anything about right to repair, it was a general complaint about Apple products on a video about socs.

    • @techobsessed1
      @techobsessed1 2 years ago +2

      Microprocessors have had RAM on the chip for ~30 years. For most of that time, it's been used as a cache. The new Apple SoCs are no different in that regard. The main system memory isn't on the SoC, it's on separate chips in separate packages that are mounted on the same module as the SoC. The versions used in iPhones are more compact: they put the RAM package on top of the SoC package.

  • @lazystrike6835
    @lazystrike6835 2 years ago

    Has anyone noticed how poetic it is, an iMac and an Acorn sitting on the same table.

  • @carlborgen
    @carlborgen 2 years ago +8

    The idea that squiggly routing of connection wires would impact transfer speed, isn't that contrary to the point made in the latest Veritasium video, The Big Misconception About Electricity?

    • @AlRoderick
      @AlRoderick 2 years ago +6

      It's still a distance to cross. If the electrical signal needed to pass the long way around the case and traveled a meter to do so it would still matter that the component was 10 cm away.

    • @greenjom
      @greenjom 2 years ago +8

      The signal speed is the speed of the electric field, which is also around the speed of light. And the field also "travels" along the traces. The electrons themselves are still pretty slow.

    • @max_kl
      @max_kl 2 years ago +3

      Veritasium's claim is technically true, but in practice not really relevant since the amplitude of the signal induced by the fields is much, much lower than the actual signal. In high-speed circuits the signal path and its return path (i.e. ground) are routed very close together to minimize the spatial extent of the EM fields. In the Veritasium video they were "routed" 1 ly apart...

    • @danieljensen2626
      @danieljensen2626 2 years ago +5

      In the Veritasium video he's really talking about the ability of the wires in the circuit to act like antennas (which I really wish he had explained better, or at all). By treating the wires as antennas you can design them to control how much power is transmitted as radiation rather than via conduction. If you don't actually want an antenna then you just design your wires to suppress that effect as much as possible.
      The effect gets more prominent as the wavelength of the signal on the wire gets closer to the size of the wire, so high frequency (short wavelength) circuits are basically always limited by your ability to keep your wires from acting like antennas. That's one of the reasons why CPUs can't just keep getting faster and faster.

    • @mal2ksc
      @mal2ksc 2 years ago +3

      The leading edge may get there, but transistor logic doesn't fire on the faintly discernible leading edge, it fires on a voltage threshold. If the transistor needs to see 1V to switch on, it doesn't matter how fast the first couple microvolts get there.
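
The distance argument in this thread is easy to put in rough numbers: signals on PCB traces propagate at a large fraction of the speed of light, so routing "the long way around" adds delay in direct proportion to trace length. A back-of-the-envelope sketch, assuming a typical velocity factor of about 0.66c for FR-4 traces (an assumption, not a figure from the video):

```python
# Rough propagation delay along a PCB trace, assuming a typical
# velocity factor of ~0.66c for FR-4 microstrip.
C = 299_792_458.0        # speed of light, m/s
VELOCITY_FACTOR = 0.66   # assumed fraction of c for the trace

def trace_delay_ns(length_m: float) -> float:
    """Nanoseconds for a signal edge to traverse a trace."""
    return length_m / (C * VELOCITY_FACTOR) * 1e9

# A 10 cm direct route vs. a 1 m route around the case:
print(f"10 cm: {trace_delay_ns(0.10):.2f} ns")  # ~0.5 ns
print(f" 1 m: {trace_delay_ns(1.0):.2f} ns")    # ~5 ns
```

At a 1 GHz clock (1 ns period), the difference between the two routes is several clock cycles, which is the practical sense in which squiggly routing matters.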

  • @macavalon
    @macavalon 2 years ago

    VLSI ARM3, cool!

  • @oisnowy5368
    @oisnowy5368 2 years ago

    Thought I spotted an A3010 in the thumbnail... slightly disappointed it's not an early production model with a mini-motherboard...

    • @davidcotton8378
      @davidcotton8378 2 years ago

      From memory, the mezzanine board. Always loved that name.

  • @jondo7680
    @jondo7680 2 years ago

    Good phones still have separate audio connections

  • @thomasw4422
    @thomasw4422 2 years ago

    It does concern me a little that the chip could be more easily damaged, and there's no chance of recovering anything.
    Or that it's less upgradable.

    • @jpdemer5
      @jpdemer5 2 years ago

      This is why, with my new M1 Mac, I paid for AppleCare, for the first time ever (after being a Mac user for 35 years): I could always afford DIY repairs, but I can't easily afford replacements. Insurance has become necessary.

    • @lawrencedoliveiro9104
      @lawrencedoliveiro9104 2 years ago

      How do you think Apple became a trillion-dollar company?

  • @amberswarbrick6876
    @amberswarbrick6876 2 years ago

    i love you

  • @Jkauppa
    @Jkauppa 2 years ago

    if you had DC power rails, not AC, you would not have to have complicated power systems in the box; also, if you had memory in the chip, it would be just one chip

    • @Jkauppa
      @Jkauppa 2 years ago

      and USB-over-PCIe, one port standard

  • @rchandraonline
    @rchandraonline 2 years ago +2

    It's not just cost reduction, there is also the electrical engineering principle that the lower the component count, the higher the reliability.

    • @damianocaprari6991
      @damianocaprari6991 2 years ago +2

      That principle applies to any engineering field
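
The series-reliability rule behind this exchange can be made concrete: if a system works only when all of its independent components work, its reliability is the product of the component reliabilities, so cutting the part count raises reliability. A minimal sketch, with an arbitrary assumed per-component reliability of 0.999:

```python
# Series reliability: a system works only if every component works.
# With n independent components each of reliability r, the overall
# reliability is r**n, so fewer components means a more reliable whole.
def system_reliability(r: float, n: int) -> float:
    return r ** n

r = 0.999  # assumed per-component reliability (illustrative only)
print(f"50-part board: {system_reliability(r, 50):.3f}")  # ~0.951
print(f" 5-part board: {system_reliability(r, 5):.3f}")   # ~0.995
```

With these assumed numbers a 50-part board works about 95% of the time versus about 99.5% for a 5-part one; the exact figures are illustrative, but the direction is the engineering principle the comment cites.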