Why Moore’s Law Matters

  • Published Dec 24, 2024

Comments • 445

  • @Asianometry (1 year ago, +1371)

    RIP, Gordon Moore.

    • @ggboss8502 (1 year ago, +53)

      And his law too

    • @nomadhgnis9425 (1 year ago, +7)

      Whatever happened to Wang computer corporation? I remember my school having one. The ones with those little monitors and a large system unit.

    • @vrclckd-zz3pv (1 year ago, +9)

      This is how I found out

    • @dongshengdi773 (1 year ago, +10)

      @@ggboss8502 only God's Law exists

    • @dougdimmadoodahdaay7887 (1 year ago, +2)

      an omen

  • @AlanTheBeast100 (1 year ago, +668)

    "...a lucky guess that got a lot more publicity than it deserved."
    - Gordon Moore.

    • @matsv201 (1 year ago, +10)

      Well, with a span between one year and two years it was a pretty wide guess. Also, they did throttle their development during periods of the 80s and 90s. Basically cheating.

    • @AlanTheBeast100 (1 year ago, +24)

      @@matsv201 It was a very good guess IMO.

    • @Connection-Lost (1 year ago, +2)

      @@matsv201 Their*

    • @Connection-Lost (1 year ago)

      @@AlanTheBeast100 You not correcting him means you must be low IQ as well

    • @AlanTheBeast100 (1 year ago, +3)

      @@Connection-Lost No, it means that I know that predictions based on the very small data set that Moore had at the beginning have stood the test of time well. It's never going to be perfect. Indeed, look at the graph in the video ( @21:30 ): a pretty straight line on a log graph? Hmm? You do _know_ what that means, right? I mean, you passed grade 9 math, right?
      So, a little algebra:
      1) 1971: about 2,500 transistors
      2) 2020: about 35B transistors
      Based on that, if the number of transistors goes up 1.4x every year (on average), then you get close to the 2020 count.
      Of course you can adjust the limits as you will and get slightly different numbers. For example, if I make the end number 50B, then the factor would be 1.41 every year.
      So before accusing people of a low IQ, maybe you should do some basic high school math first and see if your own IQ is up to a level from which to throw cheap shots.
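The growth-factor arithmetic in the comment above can be checked with a short script; the transistor counts are the comment's own round figures, not exact product data:

```python
# Annual growth factor implied by ~2,500 transistors (1971) vs ~35B (2020).
start_count = 2_500
end_count = 35_000_000_000
years = 2020 - 1971  # 49 years

factor = (end_count / start_count) ** (1 / years)
print(round(factor, 2))   # ~1.4x per year, as the comment says

# Using 50B as the 2020 endpoint instead:
factor_50b = (50_000_000_000 / start_count) ** (1 / years)
print(round(factor_50b, 2))  # ~1.41x per year
```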

  • @tomhalla426 (1 year ago, +144)

    Another issue is that Moore's Law only applies to microchips. Some politicians act as if similar advances apply to solar cells or batteries.

    • @PainterVierax (1 year ago, +19)

      Yet it barely applies to integrated circuits either, as most of the improvement in recent decades has come from chip/system design and from smarter, fine-tuned algorithms. And the raw computational power gain has mainly been used to ease software programming and portability (e.g. cache levels, multicore, SMT, interposers, ASICs, DSPs, FPGAs, GPGPU, higher-level compiled languages, HALs, APIs, a metric ton of interpreted languages and cloud/server/Web-based apps).

    • @julioguardado (1 year ago, +4)

      One of my pet peeves. There will be some spillover of 300mm wafers to solar cells, which run mostly on 200mm because the equipment is cheaper and often second-hand. The switchover to 300mm claimed a cost advantage based on the wafer size ratio. The same should apply to solar, but I am not a solar manufacturing guy. The same could apply to LEDs, perhaps...

    • @tomhalla426 (1 year ago, +6)

      @@julioguardado My understanding is that solar cells are already at a high percentage of their theoretical performance (as are windmills), so only more efficient manufacturing is possible.

    • @julioguardado (1 year ago, +3)

      @@tomhalla426 Same here. Polysilicon is king and its efficiency hasn't changed from around 20% iirc. They're still looking for that high-efficiency material that can be manufactured cheaply. Don't see any breakthroughs there.

    • @be0wulfmarshallz (3 months ago)

      I REJECT THIS FORM OF TECH NIHILISM

  • @Palmit_ (1 year ago, +231)

    A forecast of a decade, based on a few years' data, just to fit sales targets... is now more than legendary. Mythological, as it were. Amazing. Thanks Jon. Really interesting stuff.

  • @chengong388 (1 year ago, +468)

    Camera lenses got better because designers could simply let the software run random variations of the lens over and over again to find a better design; the more compute power, the more variations you can try and the more likely it is to find a better design. Modern smartphone camera lenses are so ridiculously complex: each element basically has a radial wavy surface that makes no sense but somehow focuses the light just the right way at the end, with minimal aberrations.

    • @WaterZer0 (1 year ago, +105)

      Ah, the ol' brute-force approach.

    • @kylinblue (1 year ago, +13)

      Could you show us an example?

    • @seventhtenth (1 year ago, +81)

      Not entirely true; a lot is materials science and a lot is microfabrication costs

    • @Laundry_Hamper (1 year ago, +58)

      Also important is no longer needing to produce a geometrically correct image at the focal plane. Digital image corrections allow you to trade distortion for sharpness

    • @musaran2 (1 year ago)

      @@kylinblue Search "polynomial optics" and get ready for a headache.

  • @antman7673 (1 year ago, +69)

    Hearing the story of Moore's Law, it is a self-fulfilling prophecy:
    we wouldn't have the current level of tech without the ambitions of Moore's Law.

  • @bloqk16 (1 year ago, +4)

    I recall back in the 1990s [US], when it came to PCs, Moore's Law was entering the lexicon of engineering professionals who used PCs in their work, as the rapidly growing power of succeeding Pentium chips was rendering 18-month-old PCs obsolete. It was an amazing era, with the processing power of PCs increasing on an annual basis.
    The engineering company I worked at, as a means to unload the _obsolete_ Pentium PCs it had [less than two years old], was selling them to employees for around $100 [US]. Yet those Pentium PCs had been purchased at around $2.5K each, when new, two years prior.

  • @sambojinbojin-sam6550 (1 year ago, +21)

    Thanks for telling us Moore's Lore, not just the over-quoted "Law".

  • @hushedupmakiki (1 year ago, +65)

    21:00 - when mythology is so ingrained you create a global industry of adherents. Gordon Moore's passing really struck something in us, even in very adjacent semiconductor research/industries.

  • @ColeL88 (1 year ago, +5)

    It was a nice surprise to see a picture of mine used as the thumbnail and another used partway through the video!
    Also, thanks for listing the source :D

  • @jordanwalsh1691 (1 year ago, +141)

    Really interesting video. One small quibble, 2:19 - 2:49, "That 'roughly' is doing some serious heavy lifting": not really, in my opinion. As you point out, 50*2^10 is only 51,200, but the exponent is so large that you only need to increase the base by 2.5%, to 2.05, to make up the difference: 50*2.05^10 = 65,540.
    Personally, I think 2.05 falls comfortably within the neighbourhood of "roughly" 2.
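The figures in the comment above check out and can be reproduced directly:

```python
# Doubling ten times from a base count of 50 vs using a base of 2.05,
# i.e. nudging "roughly 2" up by 2.5%.
exact_double = 50 * 2 ** 10      # Moore's "roughly doubling" projection
slight_bump = 50 * 2.05 ** 10    # base raised by 2.5%

print(exact_double)        # 51200
print(round(slight_bump))  # 65540
```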

    • @woofcaptain8212 (1 year ago, +3)

      That was my thought

    • @spodule6000 (1 year ago, +2)

      I came here to make that comment. Thanks for saving me the trouble!

  • @bgop346 (1 year ago, +6)

    Good video on the history. This is also one of my favourite quotes, from "The Intel Trinity":
    "Gordon more than anyone else understood that it wasn't really a law in the sense that its fulfillment over the years was inevitable, but rather that it was a unique cultural contract made between the semiconductor industry and the rest of the world to double chip performance every couple of years and thus usher in an era of continuous, rapid technological innovation and the life-changing products that innovation produced.
    That much was understood pretty quickly by everyone in the electronics industry, and it wasn't long before most tech companies were designing their future products in anticipation of the future chip generations promised by Moore's Law. But what Gordon Moore understood before and better than anyone was that his law was also an incredibly powerful business strategy. As long as Intel made the law the heart of its business model, as long as it made the predictions of the law its polestar, and as long as it never, ever let itself fall behind the pace of the law, the company would be unstoppable. As Gordon would have put it, Moore's Law was like the speed of light. It was an upper boundary. If you tried to exceed its pace, as Gene Amdahl did at Trilogy, your wings would fall off. Conversely, if you fell off the law's pace, you quickly drew a swarm of competitors. But if you could stay in the groove, as Intel did for forty years, you were uncatchable."

  • @afterSt0rm (1 year ago, +86)

    Well, I'll take the opportunity, since creators generally see the initial comments, to thank you a lot for the amazing content you've been putting out. I wish you only the best! Hugs from (just) another one of your Brazilian viewers ❤

    • @gordonfreeman9965 (1 year ago, +1)

      Nice to see that I'm not the only Brazilian that knows this amazing channel kkkkkkkkkk

    • @OgbondSandvol (1 year ago, +1)

      @@gordonfreeman9965 Now there are three of us.

    • @DanielLavedoniodeLima_DLL (1 year ago, +3)

      It was also nice to see that a Brazilian was involved in the last paper he presented in the video

    • @zerotwo7319 (1 year ago)

      GG izi shrink brazil to a micro size so we can be more efficient also. Fit more brazils inside brazil

  • @StevieFQ (1 year ago, +7)

    I would never argue that we don't need improvements in compute performance, but you can make the related statement that improvements in computing power (along with a seemingly never-ending thirst for more SW developers) have led to less efficient SW being developed to take adequate advantage of compute performance.

    • @PainterVierax (1 year ago, +6)

      True. But this lack of code efficiency also comes with many advantages, like ease of writing, prototyping, debugging, reviewing, correcting, improving, porting or even installing programs. All of the heavy lifting is done by a few software bricks now (compilers, interpreters, OSes, HALs, APIs, game engines, Web browsers).
      Even in embedded, it has become far more practical to restrict ASM or RTOS usage to when it's imperatively required.

    • @son_guhun (1 year ago, +1)

      This claim is sort of absurd on its face. If it were more profitable to produce more efficient software, then that's what companies would make. However, the increasing complexity of business domains and infrastructure, and the sheer number of different platforms a piece of code must be compatible with, make it extremely inefficient (in terms of development costs) to attempt to squeeze every last bit of performance from a chip by writing code in low-level languages.
      Simply put, there's nothing stopping you from putting out highly optimized software today. But you would simply get out-competed unless you were working on a very specific domain or platform. So it's not that powerful hardware leads to less efficient software, but that less efficient software is usually more competitively produced and priced. Hardware performance simply dictates the minimum point at which software is simply not usable due to its inefficiency. For most applications, "adequate" advantage of compute performance IS the ability to produce less efficient software that is still usable, because this means you can produce MORE software or tackle problems that are more complex.
      If a browser already opens a webpage in less than 2 seconds, nobody realistically needs it to be faster. It (the browser or the webpage) just needs better features, bugfixes or increased stability. Or maybe just less costly maintenance.

    • @PainterVierax (1 year ago, +2)

      @@son_guhun It still really depends on the application.
      Sure, Linux got rid of its old ASM pieces to be plain C (and now Rust), but in the meantime Android mostly got rid of the Java runtime to compile software during install, like BSDs do, despite the vast improvement of ARM SoCs.
      Similarly, developing production software for microcontrollers is not done with extremely inefficient/portable code like MicroPython or Arduino. And sometimes ASM is still used for critical timing functions.
      Same thing with desktop applications: developing small tools and games with high-level languages and APIs can be done without taxing too much of the increasing computing resources (even on laptops and embedded), but that approach is never used for developing AAA games or production applications that want to use every resource available to speed up execution.

  • @covert0overt_810 (1 year ago, +16)

    We need Moore transistors….

  • @mrrolandlawrence (1 year ago, +60)

    RIP GM. A real visionary and titan of semiconductors!

    • @chrisbova9686 (1 year ago, +1)

      Humanity would be immeasurably better off without tech, or those who would enrich themselves from the death of humanity.

    • @Vysair (1 year ago)

      @@chrisbova9686 You mean extinction? Without tech, you are back to the Middle Ages / caveman times

    • @---------c5741 (1 year ago, +10)

      @@chrisbova9686 Ironic that you need to spread this wisdom using technology 😅

    • @chrisbova9686 (1 year ago)

      @@---------c5741 Indeed. Smoke signals aren't dependable, but they won't ruin the entire life experience.

    • @wyw201 (1 year ago)

      @@chrisbova9686 Wouldn't you say the transistor is one of mankind's greatest inventions?

  • @stephanhart9941 (1 year ago, +5)

    The Broken Silicon episode you were on was 🔥!!!

  • @lucidmoses (1 year ago, +3)

    And here I thought this was going to be some nonsense about how accurate it's been and how it will never end. Nice that you actually looked up the info first. Nicely done.

  • @acolyte1951 (1 year ago, +2)

    I appreciate your take on what you called technological nihilism: that improving the speed (among other things) of electronics is a good and necessary thing. Even if the average consumer doesn't see it, the track record of these developments has indeed transformed the lives of many humans. Seemingly for the better.

  • @niosanfrancisco (1 year ago, +30)

    Excellent presentation. RIP Dr. Moore.

  • @Freak80MC (1 year ago, +19)

    Listening to this, almost makes me wonder if Moore's Law was more of a self-fulfilling prophecy. Something that motivated people to push harder for technological advancement, which ended up making it come true.

    • @m_sedziwoj (1 year ago)

      Look at Ray Kurzweil's predictions; they are more interesting than the limited Moore's Law.

  • @AlokRayadurga (1 year ago, +4)

    Over the past weekend, I've been thinking of Moore, the traitorous 8, the last AMD video that you had out recently, and the impact these men made on the industry. Thanks for this video (or tribute?)

  • @nickbensema3045 (1 year ago, +4)

    A few years ago I saw a Philosophy Tube video in which, for the first time, I heard Moore's Law referred to as a marketing term, as opposed to a venerable guideline for technological progress. This video illustrates that the label wasn't just left-wing cynicism but kind of accurate: it demonstrably instilled a confidence in progress that drove sales.

  • @matthagy86 (1 year ago)

    Thanks!

  • @hareTom (1 year ago, +20)

    Good info.
    However, I just want to point out that "DRAM" should be pronounced "DEE-RAM",
    like SRAM is "S-RAM".
    The same logic should apply when you pronounce the word.

    • @matthiaskamm (1 year ago, +7)

      Yes please, I cringe every time I hear "dram" instead of "Dee-ram" :-) Great video.

    • @shanent5793 (1 year ago)

      Whose prescription was that?

    • @seanwieland9763 (1 year ago, +1)

      Same with people who say "oh-led" instead of "O-L-E-D".

    • @juhotuho10 (1 year ago, +2)

      It's not the first time he's done it; most likely he's saying it like this intentionally.
      Don't know why, though.

  • @ciCCapROSTi (1 year ago, +5)

    I fucking love how this guy does not separate jokes and information. You actually have to pay attention to distinguish them.

  • @johnhorner5711 (1 year ago, +28)

    Thank you for yet another educational video. The timing of its release is uncanny. RIP Gordon Moore indeed. A parallel technology trend which doesn't get as much attention is magnetic storage (disk drives). It has been on a similar trajectory as integrated circuits, and has been every bit as important to the development of technology. Data storage is now so plentiful and cheap that we don't even think about it. YouTube allows anyone to upload unlimited video content for the world to watch. That is thanks to the magnetic storage revolution. Maybe you could do a video on that topic at some point.

    • @InvictraX (1 year ago)

      I did notice a big gap in the development of data storage. Capacity growth has slowed down dramatically.

  • @supremebeme (1 year ago)

    This is probably in my top 10 channels. Keep it going!

  • @douginorlando6260 (1 year ago, +2)

    Shrinking transistor technology also led to lower-power-consumption solutions, particularly valuable for battery-powered devices

    • @ttb1513 (1 year ago, +2)

      Yes. The shrinking of a transistor's area by 50%:
      1) Allows twice as many transistors on a chip with the same area.
      2) The same area implies basically the same cost, for twice as many transistors (far from costing twice as much).
      3) A transistor with 1/2 the area consumes 1/2 the power. With twice as many at 1/2 the power, the power consumption stays basically unchanged, for a chip with twice as many transistors.
      Neither the power nor the cost doubles when the number of transistors doubles in the same unit of area. This scaling phenomenon is the really important thing.
      If transistors stopped shrinking, compute that needs twice as many transistors would start costing twice as much and using twice the power. Do this for a few generations, and at 8x the cost, 8x the power and 8x the area... that will make you appreciate the transistor shrinkage advances we've had in the past.
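The cost/power arithmetic in the reply above can be sketched numerically; the starting values of 1.0 are arbitrary normalized units:

```python
# Without shrinkage, each doubling of transistor count doubles cost,
# power and area; three such generations give the comment's ~8x figures.
cost = power = area = 1.0
for _ in range(3):  # three doublings with no shrink
    cost *= 2.0
    power *= 2.0
    area *= 2.0

print(cost, power, area)  # 8.0 8.0 8.0
```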

  • @mrlucasftw42 (1 year ago, +3)

    Modern SSDs have really revolutionized computing for the base user: boot in under 2 minutes, and even after Windows updates often still under 5 minutes.

    • @WaterZer0 (1 year ago, +8)

      TWO MINUTES?!
      JESUS CHRIST
      What are you doing?!
      There's no way it should take more than 30 seconds *maximum* to load the OS.

  • @samueltan9279 (1 year ago, +4)

    So basically Moore's law was a self-fulfilling prophecy. The industry followed it not because of some universal physical law but because everyone in the industry tried their best to follow it, for various reasons.

  • @stachowi (1 year ago, +23)

    How do you pump out so much great content?

  • @samueladams2340 (1 year ago, +2)

    Always look forward to your videos. Well researched and in depth.

  • @RoyAntaw (1 year ago, +15)

    Gordon Moore, a true pioneer. RIP.

  • @evinoshima9923 (1 year ago, +2)

    I remember in the late 70s in Manila we were making a K&S 478 add-on that turned that manual wire bonder into one controlled by a microprocessor. Zilog, AMD, Intel were our customers... what an amazing time... our computer had no monitor, used a trackpad with a billiard ball in it, and had 12 KB of RAM!

  • @thepenguin11 (1 year ago, +5)

    I disagree that regular people do not really care about it anymore. Software expects systems to advance; for the web alone, try browsing with a 10-year-old device and you will rage like crazy at how slow everything is. Productivity goes up as well with more powerful units.

    • @WaterZer0 (1 year ago, +1)

      That's because programmers are less and less efficient.

    • @thepenguin11 (1 year ago, +5)

      @@WaterZer0 Clearly you don't know how programming works, then. The same software these days is way more efficient than old software doing the same task on the same machines. Software, like cars, advances and needs more supporting functions to achieve more efficient and more advanced processes. The issue with badly optimized things these days is, in the majority of cases, not the fault of programmers but of leadership, who push unrealistic time frames and cost constraints.

    • @WaterZer0 (1 year ago, +1)

      @@thepenguin11 capitalism bad? loud and clear

  • @billfargo9616 (1 year ago, +1)

    Since "Moore's law is the observation that the number of transistors in an integrated circuit (IC) doubles about every two years," all that is required to keep it valid forever is to make bigger ICs.

  • @matthewvenn (1 year ago, +3)

    Another great video, Jon! The last few minutes were especially interesting for me - I'll read that paper. I'd like to know what other industries rely on increasing compute density. I think mobile phones are an example: the industry plans around people needing to buy new phones to keep up with the latest software. But if phones stop getting more powerful, then that industry needs to rethink its financial plans. I'm also guessing that AI (especially the training side) will want more and more compute going forward.

    • @m_sedziwoj (1 year ago)

      Many people buy new clothes each season, so I think the phone industry will find a way ;)

  • @RobBCactive (1 year ago)

    There ought to have been discussion of Dennard scaling, which was a key driver of rapid processor improvement: node shrinks gave not only smaller and cheaper but also faster transistors at a constant energy cost, meaning there weren't the heat and power wall issues which halted frequency scaling.
    I was a bit disappointed that this channel failed to mention that, as most people focus on the number of transistors, when multi-core is a response to physical limitations on uniprocessor performance.

  • @deem1819 (1 year ago, +4)

    Never thought I'd hear an overlap between all the dead space lore channels I follow and my semiconductor manufacturing interests

  • @piethein4355 (1 year ago, +1)

    I still need more compute for my gaming rig. My 380 can still just barely drive current-gen VR titles; I will need several orders of magnitude more compute before even being able to drive something currently high-end like an XR3 at native resolution. If I also want high refresh rates (at least 144 Hz, but preferably in the 200s) without reprojection, plus raycasting for improved lighting, then we are still many, many orders of magnitude away from what is needed.

  • @rollinwithunclepete824 (1 year ago, +16)

    Excellent video, Jon! Among other things, it explains why I was never sure what Moore's Law predicted: was it a doubling every year, every 2 years, every 18 months? The thing itself, Moore's Law, has been redefined to fit the data: the doubling over a span of time.

  • @glennmcgurrin8397 (1 year ago, +2)

    If you take a technology that's developing that rapidly and is that early in its lifecycle, give me a ten-year roadmap into the future, and at year 10 you are actually at what you said would be year 9, that's amazingly accurate. It wasn't perfectly accurate, but it's incredibly rare to see anything close to that, as far as I can see.

  • @gljames24 (1 year ago, +1)

    Moore's law, like every exponential in nature, is bounded by physical optimization limitations and more accurately follows a sigmoidal curve: as the technology hits an inflection point, it only gives you diminishing returns and the curve flattens out. The only way to advance is to switch to a new technology. Adding cores, new gate geometries, and 3D stacking have helped in many areas, but we'll likely have to switch from silicon MOSFETs to gallium nitride or other chemistries to get any sort of frequency improvement at this point.
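The sigmoid point above can be sketched with a logistic curve, which looks exponential early on and flattens past its inflection point; the constants here are arbitrary placeholders, not fitted to any real transistor data:

```python
import math

def logistic(t, K=1e6, r=math.log(2), t0=20):
    """Logistic curve: ceiling K, early growth rate r, inflection at t0."""
    return K / (1 + math.exp(-r * (t - t0)))

early = logistic(1) / logistic(0)    # far below the ceiling: still ~doubling
late = logistic(40) / logistic(39)   # past the inflection: diminishing returns
print(round(early, 2), round(late, 2))  # 2.0 1.0
```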

  • @Edward135i (1 year ago, +1)

    It's hard to think of a single person who affected more human lives than Gordon Moore. RIP Gordon, thank you. I hope Pat doesn't run your company into the ground completely.

    • @shorerocks (1 year ago)

      Yeah. Now think about the inventor, or one of the godfathers, of AI: Geoffrey Hinton. There is a CBS morning interview that is super interesting. And... I am not able to predict how much life will change in the next 10, 20 years.

    • @Noqtis (1 year ago)

      When you fart, you change the life of more bacteria than there are humans on the planet. Never forget: your asshole is a planet.

  • @colebevans8939 (1 year ago, +2)

    With ChatGPT-4 out this week and programmers already saying how much it's helping them code, I can't help but feel we are just at the start of another massive burst of exponential growth. Soon we will be able to code 10x more things, 10x faster, for a fraction of the cost. That alone will be a massive boost to efficiency. As AI models grow, it's only going to get better. Add to that how quickly quantum computing has grown these last few years. Sure, it's an entirely different field, but we have no idea what quantum computers could be capable of 20 years from now because we haven't had the tools to start playing with them until yesterday.

  • @tengkualiff (1 year ago, +24)

    Moore's Law will never truly die. :(

    • @benc3825 (1 year ago, +10)

      Moore's law, or Moore's observation, is dead, simple as that. We don't get to rename and redefine it so it still holds correct.

    • @RonnieMcNutt666 (1 year ago, +1)

      @@benc3825 AMD's and Intel's future server CPUs want to know your location; also GPU density

    • @benc3825 (1 year ago, +1)

      @@RonnieMcNutt666 Which one specifically? The Turin family, Venice, Emerald Rapids, Granite Rapids, Diamond Rapids, Sierra Forest and/or Clearwater Forest. :-P

    • @KekusMagnus (1 year ago, +1)

      It's been dead for a while now

    • @RonnieMcNutt666 (1 year ago, +1)

      @@benc3825 3090 to 4090: nearly 3x the transistors in a smaller die

  • @jaymacpherson8167 (1 year ago, +8)

    FYI... NOAA is typically pronounced "no-ah."
    Thank you for great documentation that Moore's Law is passé.

    • @EndOfLineTech (1 year ago)

      FOR YOUR INFORMATION you don't need to be a dick

  • @atanumaulik7093 (1 year ago)

    Brilliant, as always. The world needs more compute. Long live Moore's law!

  • @hsharma3933 (1 year ago)

    I’m so glad you addressed the elephant in the room right at the beginning.

  • @ku0n349 (1 year ago)

    I absolutely adore your conclusion. I hate when people say things which ignore everything that technology could be, in favour of what we already have.
    Recently at a party I saw someone who is an engineer at Bosch saying that most things are already invented and that there is so little innovation yet to come. When I heard that from an actual engineer, I felt disgusted, tbh.

    • @ceeb830 (1 year ago)

      I don't understand how someone could say that today

  • @truefan1367 (1 year ago)

    That truck backing up really sells this video.

  • @tonyv8925 (1 year ago, +9)

    Wow, incredible history. I remember my first computer, a VIC-20, then upgrading to the C-64, then the 8086. The first computer I programmed was with Hollerith cards on a Univac that used magnetic ring memory and large magnetic tape drums. It was a simple employee hours/wages program. So many things have changed since then. My cell phone has more computing power than the biggest computer our local college had at that time. Amazing!

    • @milantrcka121 (1 year ago)

      Which Univac - the 1108? Yes, those were the days of dropped card stacks...

    • @Agent-ie3uv (1 year ago, +1)

      Obviously a grandma, but where on earth did the notion come from that boomers can't use computers? 🤔🧐

  • @jamesmorton7881 (1 year ago, +1)

    1978, the Motorola 6800: the application uses exploded. The rocket was just launched.
    I loved all that was CMOS, the RCA 4-bit. Now we are at around 2000 MIPS; the IBM System/360 was about 16.6 MIPS in 1970.
    Self-heating now increases with operating temperature due to the higher leakage currents of smaller geometries, around 90nm.

  • @stuartmacintosh4868 (1 year ago, +2)

    Plot twist: it was a log curve

  • @polka23dot70 (1 year ago, +6)

    In the past decade there has been no improvement in CPU frequency and almost no improvement in single-thread performance and cost. The term "3 nanometer process" has no relation to any physical feature (such as gate length, metal pitch, or gate pitch) of the transistors. In other words, the "3 nanometer process" is a marketing term. According to the International Roadmap for Devices and Systems published by the IEEE Standards Association, the 3 nm process is expected to have a contacted gate pitch of 48 nanometers and a tightest metal pitch of 24 nanometers.

    • @elizabethwinsor-strumpetqueen (1 year ago)

      I like your style - reality rather than hype... thanks

    • @tinayoga8844 (1 year ago, +3)

      My current computers have CPUs from 2011. And the comparable CPU of today has only double the speed per core.

    • @shanent5793 (1 year ago, +2)

      Yet throughput has managed to increase, so maybe those things don't matter

    • @supersat (1 year ago, +3)

      CPU frequency is more about Dennard scaling, which has been dead for a while. Every time we think Moore's law is dead, we come up with another breakthrough to extend it, although I wouldn't be surprised if it were the end of the line soon.

  • @AdityaMehendale (1 year ago, +1)

    The actors at 11:05 haven't actually soldered a goddamn thing in their entire lives.

  • @danielsuguwa746 (1 year ago)

    Interesting video, and thanks for the content! I only learned today that Dr. Moore passed away on Friday... RIP, legend...

  • @pierQRzt180 (1 year ago)

    I am a simple man: I see an interesting article cited, and I upvote.

  • @julioguardado (1 year ago)

    There's another wave coming in semiconductor manufacturing - the maturing of the industry. Optical scaling has a physical limit and wafer size is not going to go beyond 300mm. What we'll see is all chips becoming much cheaper, particularly complex ones as industry laggards catch up. I think the best is yet to come.

  • @jrherita
    @jrherita ปีที่แล้ว

    Appreciate the deep respect for Moore!
    Only comment is that the node cadence didn't really end in 2006 for Intel. They executed pretty well until about 2014-2015.

  • @1998awest
    @1998awest ปีที่แล้ว +15

    Awesome video, Jon. I worked on Intel's 65nm node - you got most of the products of the time (Cedar Mill, Yonah, Tukwila). Great summary of the main drivers keeping Moore's Law alive (advancements in design and lithography).
    From 2000 - 2010, Intel's biggest worry was being categorized as a monopoly and broken up. Since then, Intel has lost its once-massive competitive advantage and been surpassed by TSMC and Samsung. The company was a juggernaut with Moore, but after his retirement hubris crept into the company culture and mismanagement became the norm, IMO.

    • @razorbackroar
      @razorbackroar ปีที่แล้ว +4

      Sick dude that's awesome we on 4nm now

    • @m_sedziwoj
      @m_sedziwoj ปีที่แล้ว

      @@razorbackroar Is anybody actually naming their node 4nm? Intel 4, TSMC N4... maybe only Samsung, but they lie anyway, so who cares.

    • @大砲はピュ
      @大砲はピュ ปีที่แล้ว

      @@m_sedziwoj tsmc on 2nm soon bruh

    • @rkan2
      @rkan2 ปีที่แล้ว

      ​@@大砲はピュEven TSMC doesn't call it with "nm"...

  • @EricJorgensen
    @EricJorgensen ปีที่แล้ว +1

    The less popular corollary to Moore's law is the one about how Moore's law will be redefined every 3-4 years in order to argue that it's not really meaningless.

  • @HellishPestilence
    @HellishPestilence ปีที่แล้ว

    There are, of course, business applications for more computing power. But for the first time in history, consumer products today are limited not by computing power but by things like network speed. The market for more advanced chips is a lot smaller if you're developing for a few HPC clusters at companies or universities rather than for smartphones, which perform just fine with a 7nm chip for the vast majority of people.

  • @d00dEEE
    @d00dEEE ปีที่แล้ว +2

    Moore's Law should be rewritten as an Old English epic poem, say in the style of "Beowulf".

  • @Nor-tc8vz
    @Nor-tc8vz ปีที่แล้ว +5

    Moore's law is dead, long live Gordon Moore.

  • @luke144
    @luke144 ปีที่แล้ว +1

    We need to adapt and change the way we compute itself. We need to quit looking for unicorn farts with dark matter detectors and tackle problems like P vs. NP. We need to make our computing more effective and efficient. Right now we run a LOT of flawed, bloated programs. I think things like gallium arsenide will surely help, but we need to go back to the basics. We need to change the geometry and architecture of processors. The advent of the ARM processor is a perfect example.

    • @buzzsaw838
      @buzzsaw838 ปีที่แล้ว

      Go back to the basics? So basically we've reached the "local maximum" for the current technological paradigms in place for compute. Sounds like some fundamental revolution is needed at the underlying architectural level to continue anything like Moore's law level growth.

    • @luke144
      @luke144 ปีที่แล้ว

      @@buzzsaw838 there are many ways the current model of computing can change.

  • @gp.gonzales
    @gp.gonzales ปีที่แล้ว

    Rest in Peace, Gordon Moore (March 24, 2023)

  • @FogelsChannel
    @FogelsChannel ปีที่แล้ว +1

    Amazing analytical history of transistors and chip design.

  • @ruperterskin2117
    @ruperterskin2117 ปีที่แล้ว

    Right on. Thanks for sharing.

  • @CalgarGTX
    @CalgarGTX ปีที่แล้ว +1

    We're still a few gens away from not needing more compute for gaming, imo. We have been trading more performance for equally more power too often over the last few gens. I'd be glad to go back to the days of an actual 50W TDP CPU and a sub-200W max TDP GPU that can handle all the latest games at 100-140fps without needing over-the-top cooling solutions and noise. Not to mention AMD and Nvidia seem to be abandoning the low and mid range as much as they can right now. So still a way to go if you ask me.

  • @Datamining101
    @Datamining101 ปีที่แล้ว

    20:17 Multicore is not related to Moore's law, which is a density thing. Multicore is a result of frequency scaling limits, which is more of a Dennard scaling thing.

  • @Schroinx
    @Schroinx ปีที่แล้ว

    Great video. If you are running out of topics, then the Wintel and x86 story could be one.

  • @m_sedziwoj
    @m_sedziwoj ปีที่แล้ว +1

    Personally I would look at "Kurzweil's law", i.e. the cost of computing as calculations/s per $1,000, because it extends back to the era of punched-card machines and ignores technical aspects, which makes for an interesting story of how it changed over time.
    About the ending, I would give a different argument: "you don't need more computing for today's software, but only for today's." To enable something new, you sometimes need a few orders of magnitude more before it becomes viable (real-time RT in games, really smart AI, etc.). There are limits to human perception in resolution, refresh rate, and interaction, but quality and complexity have a long way to go.
    The same goes for autonomous cars, robots (humanoid or not, ones that can deliver stuff to your door), and many more. The AI revolution is only at its beginning, and the von Neumann architecture will not be the best fit for it.

  • @sunroad7228
    @sunroad7228 ปีที่แล้ว

    "In any system of energy, Control is what consumes energy the most.
    Time taken in stocking energy to build an energy system, adding to it the time taken in building the system will always be longer than the entire useful lifetime of the system.
    No energy store holds enough energy to extract an amount of energy equal to the total energy it stores.
    No system of energy can deliver sum useful energy in excess of the total energy put into constructing it.
    This universal truth applies to all systems.
    Energy, like time, flows from past to future".

  • @marianmarkovic5881
    @marianmarkovic5881 ปีที่แล้ว +1

    I wouldn't call multicore a failure of Moore's law or anything; yes, it was a forced change (by pretty much hitting the practical limit on the frequency of a single CPU), but an extremely effective one. And the industry had been using multiprocessor units for decades by then; now it was just integrated into one chip.

  • @MrBun9l3
    @MrBun9l3 ปีที่แล้ว

    The mainboards at 11:05 must have a virus. What the hell are they doing with that soldering iron?

  • @JimCareyMulligan
    @JimCareyMulligan ปีที่แล้ว +1

    It’s just a trend line. You can make your own right now. Moore himself (or probably some analyst in the company) revisited the data. And if you're famous enough you can call it Gelsinger-Moore's law. Or RayJacket-Moore's law. RIP

  • @andrewlankford9634
    @andrewlankford9634 ปีที่แล้ว

    In the midnight hour, he cried Moore Moore Moore. With a rebel yell, he cried Moore Moore Moore

  • @ericcarabetta1161
    @ericcarabetta1161 ปีที่แล้ว

    Moore really got to see quite the transformation in computing power over his life.

  • @steveinmidtown
    @steveinmidtown ปีที่แล้ว

    really interesting...quick question, I'd like to watch the video about Wang Labs referred to at 6:05 but am not finding it?

  • @allthethings3071
    @allthethings3071 ปีที่แล้ว

    Problem is the speed of light and needing new substrates; heat dissipation and current leakage are the main barriers. Multi-core has reached a kind of dead end; we need a return of single-threaded performance and increased frequency if computing is going to get better. I don't see any major materials-science advancements on the horizon for a long time. With current tech, all Intel and AMD are doing is basically optimization at this point; fragility of gates and electromigration become massive issues as parts get smaller, not to mention thermal wear on contacts. I don't see how we're going to solve the heat, leakage, and memory bandwidth/latency issues anytime soon.
    The biggest issue now is that RAM is infinitely slower than L1/L2 cache. So we need new memory technologies, new materials, and new fabrication techniques, because heat and leakage are the biggest issues, and they aren't going away anytime soon.

  • @tonifakerman9639
    @tonifakerman9639 ปีที่แล้ว

    Next time you come across the acronym NOAA, you can pronounce it like the name Noah and everyone will still understand you. As always, great video man.

    • @ttb1513
      @ttb1513 ปีที่แล้ว

      Oh "dram", you’re right. He pronounces DRAM as ‘draah-m’ instead of ‘dee-ram’. Little quirks in such great content.

  • @saricubra2867
    @saricubra2867 ปีที่แล้ว +1

    He died with his law; after 2013, stagnation.

  • @VedranCro
    @VedranCro ปีที่แล้ว

    Moore's Law reminds me of Hubble's Law, which describes the expansion of the Universe. Edwin Hubble used only a few data points and boldly drew a line connecting them to prove that galaxies farther from us are receding faster. The values on Hubble's chart were inaccurate, as were his predictions based on it (that the Universe was 2 billion years old). Nevertheless, Hubble's Law inspired others to replicate his work and refine their measurements, which ultimately led to plausible theories and predictions.

    • @VedranCro
      @VedranCro ปีที่แล้ว

      And I love egg fried rice :)

  • @suryaps124
    @suryaps124 ปีที่แล้ว

    I have been looking into Graphene FETs for computational use and other alternatives to the existing Si. One of the main challenges facing any Si alternative is the repeated "extending" of Moore's law. Would you cover post Silicon technologies in a video ? I find your videos quite fascinating and they help keep me informed while I look for roles in the semiconductor industry. Thank you.

    • @m_sedziwoj
      @m_sedziwoj ปีที่แล้ว

      The industry can't even move to larger wafers because of cost, so moving to another technology entirely... I would bet more on using carbon in various forms as an addition to the silicon base, not a change of everything. At least I haven't heard anything suggesting it's viable yet; I think it needs a lot more research before it reaches the market. Researchers oversell their inventions ;)

  • @Jalae
    @Jalae ปีที่แล้ว +1

    1 trillion fold increase of compute for 2 degrees of accuracy seems like a gross misuse of energy.

  • @ps3301
    @ps3301 ปีที่แล้ว

    We need 4K games running at a 120 Hz refresh rate. That is another 6 years away.

  • @twilightknight123
    @twilightknight123 ปีที่แล้ว +1

    His estimate of 65,000 components in 10 years wasn't that far off from a (roughly) factor of two. It's a bit unfair to say the word roughly was "doing a lot of heavy lifting" when the 65k component figure comes from a factor of 2.05 instead of 2.

  • @michaelmoorrees3585
    @michaelmoorrees3585 ปีที่แล้ว

    From a chip user's standpoint, I remember, at the market-wide semiconductor level, the transition to new FET designs in the first half of the 1980s. The improved MOSFETs made faster CMOS chips, so big chips moved from NMOS to lower-power CMOS. At the discrete level, the creation of power MOSFETs.
    I thought Intel's big win was IBM choosing its uP for the IBM PC, and the clones having to stick with it to stay compatible, even moving to the 386 when IBM balked at using it, fearing it would step on its mainframe market.

  • @alpaykasal2902
    @alpaykasal2902 ปีที่แล้ว +3

    RIP Doctor Moore. Your fingerprints are all over everything, forever.

  • @JoshuaC923
    @JoshuaC923 ปีที่แล้ว

    Did not expect to see Dead Space in an Asianometry video😂😂👍🏻👍🏻

  • @yashsanghvi5956
    @yashsanghvi5956 ปีที่แล้ว

    An argument against the claim that Gordon Moore's math was wrong: yes, 50 times 2^10 would be 51,200 transistors. But Gordon Moore also said that the rate of growth was roughly 2x per year. Assuming the rate of growth was 2.05 instead of 2, we get 50 times 2.05^10 = 65,540 transistors, which is the figure Gordon Moore gave in the whitepaper.
    TL;DR: I wouldn't say his math was wrong; with exponentials, even a small approximation in the rate of growth can lead to very different answers.
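
The comment's arithmetic can be checked in a couple of lines; it also shows how sensitive a ten-year exponential projection is to a small change in the assumed growth factor:

```python
# Sensitivity of a 10-year exponential projection to the assumed doubling rate.
base = 50  # component count Moore started from in the 1965 paper

at_2_00 = base * 2.00 ** 10  # 51,200
at_2_05 = base * 2.05 ** 10  # ~65,540, essentially Moore's 65,000 figure

print(f"growth factor 2.00 for 10 years: {at_2_00:,.0f}")
print(f"growth factor 2.05 for 10 years: {at_2_05:,.0f}")
```
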

  • @trevinbeattie4888
    @trevinbeattie4888 ปีที่แล้ว

    No mention of the physical limits of transistors only a few atoms thick?

  • @paulmuaddib451
    @paulmuaddib451 ปีที่แล้ว +12

    I love the little bits of humor, wit and charm you add to each video, "...technically correct, the best kind of correct (from Futurama)", and that "Whomp Whomp".
    We see you, @Asianometry we see you. 😘

  • @Venkat2811
    @Venkat2811 ปีที่แล้ว +1

    As your narration and explanation are unbeatable, it would be great if you could do a video on the evolution of technology from 1400 AD (the printing press) to date, and how humans worried about jobs being replaced. This would be very relevant to the current GPT / AI revolution.

  • @krakhedd
    @krakhedd ปีที่แล้ว +1

    5:23 - the guy who discovered ALUMINUM named it, "aluminum", and it was only subsequent Brits who thought it needed more flourish and changed it to "aluminium".
    However, the correct pronunciation and spelling are still, "aluminum".
    Especially lacking any hint of King's English, you should be pronouncing it, "aluminum".

    • @Kieselmeister
      @Kieselmeister ปีที่แล้ว

      Technically he originally proposed Alum-ium, but people complained it wasn't classical enough, so he changed it to the actual Latin alumin-um (lit. meaning "from-alum")...
      Then the same complainers claimed that the "ium" ending "sounded" more classical, and wanted "alumin-ium", even though the "um" ending was LITERALLY CORRECT CLASSICAL LATIN, and they started calling it that even though he kept using "alumin-um".
      TLDR, the original invented name for the element was "Alum-ium", which is thus correct ENGLISH...
      The proper declension LATIN name given to the element is "Alumin-um", which is thus correct LATIN...
      And "alumin-ium" is only correct if you are speaking the language of ignorant sore losers.

  • @ronconte4292
    @ronconte4292 ปีที่แล้ว

    It used to be that several companies made lithography machines. As time went on, the number of companies making lithography machines narrowed. Only 3 companies make DUV machines. Only 1 company, ASML, makes EUV machines. They can make hundreds of DUV machines per year, a few dozen EUV machines, and they hope to be able to make 20 high-NA EUV machines per year by 2027-28. This narrowing of how many companies make these machines, how many machines can be made per year, and each machine's wafer output will do more to end Moore's Law than the ability to put more transistors in a smaller space. The end [of Moore's Law] is nigh!

  • @junker_joerg
    @junker_joerg ปีที่แล้ว

    According to the Dead Space analogy, who's Isaac Clarke in rl?

  • @Дми́трийВикторович-о3с
    @Дми́трийВикторович-о3с ปีที่แล้ว +2

    22:03 - It's not nihilism unless you're advocating for the abolition of consumer rights or something. The industry has been pushing for years to take control away from users while driving them to buy more expensive and less reliable hardware. Video games, software, and general-purpose apps keep getting more demanding without any meaningful payback; my smartphone is 10 years old and I could easily get along with a 20-year-old mobile phone just as well. I use incredibly outdated software and I see no benefit for me (not for some "NOAA") in upgrading any of it, besides a bunch of corporates breaking backwards compatibility in their own products because they imagined themselves some high profits. I'd like "things to get faster" if those "things" are mine, not G+/MS's spyware. It's not nihilism when the object of the nihilism is insultingly untrustworthy and the community is brainhammerragingly better at optimizing (not to mention making things work).