How Machine Learning Changed Computer Architecture Design (David Patterson) | AI Clips with Lex

  • Published Jul 1, 2020
  • Full episode with David Patterson (Jun 2020): • David Patterson: Compu...
    David Patterson is a Turing award winner and professor of computer science at Berkeley. He is known for pioneering contributions to RISC processor architecture used by 99% of new chips today and for co-creating RAID storage. The impact that these two lines of research and development have had on our world is immeasurable. He is also one of the great educators of computer science in the world. His book with John Hennessy "Computer Architecture: A Quantitative Approach" is how I first learned about and was humbled by the inner workings of machines at the lowest level.
  • Science & Technology

Comments • 22

  • @godspeed133
    @godspeed133 4 years ago +22

    Seems like ML will only advance us so far. Ultimately we need a new architecture or a semiconductor tech breakthrough to revive Moore's law in some form. Otherwise we have plateaued, and that, while not the end of the world, is quite disappointing.

    • @kaseyboles30
      @kaseyboles30 4 years ago +11

      There are several promising techs on the horizon. Gallium nitride, for example. It won't go much, if any, smaller than the current limit, but it will run at much higher frequencies with lower power. The trick is getting good crystals out of it. The current defect density per mm² is still around 100 times worse than current silicon, but once they get that down enough and refine the techniques to near current process nodes, it could become a viable replacement. It has already seen some use in lower-complexity devices due to its superior resilience to temperature.
      There has also been some progress in photonic computing, and other options involving graphene are being worked on. And quantum computing is of course being invested in. Note, though, that quantum computing is more a complement than a replacement, as there are things conventional binary computing can do much better than quantum computing. Kinda like how GPUs and CPUs have different domains they excel at.

    • @maximilliansbabo2099
      @maximilliansbabo2099 4 years ago +1

      Go check out the company Groq... because you are exactly correct.

    • @kaseyboles30
      @kaseyboles30 4 years ago

      @@maximilliansbabo2099 Unfortunately the site tries to trick one into loading a 'Flash update'. Big red flag.

    • @povelvieregg165
      @povelvieregg165 4 years ago +6

      It doesn't seem like that, though. As David Patterson says, these are actually interesting times we live in. We will simply start creating a lot more specialized hardware. It makes total sense to me. Today it is not just that hardware isn't improving as rapidly, but also that you don't need that much more power. I have been quite happy with the performance of my computers for at least a decade now. Almost everything you do today is fast enough, except for really specialized areas. There are things like compiling large programs, processing video, rendering, realistic computer games, voice recognition, and some other specialized areas where we really need or benefit from higher performance.
      But if you look at drawing applications, editors, 2D games, calendars, spreadsheets, presentation software, and lots of other everyday stuff, we already have plenty of performance.

    • @blocksrey
      @blocksrey 2 years ago

      Code is flawed; we already have the hardware. Once AI starts optimizing programs, that's when we'll see more results.

  • @d3ly51d
    @d3ly51d 3 years ago +6

    Why not simply standardize FPGAs and have the OS allocate areas of your chips to applications? Then you can have software that comes with its own hardware accelerators. Imagine an MP3 library coming with its own DSP chip design, or a crypto library with its own specialized hardware. And if you run out of FPGA floor space, your OS can then simulate the rest of it on the CPU, kinda like the trade-off between physical RAM and a swap file (see the sketch after this thread). Sounds like a good idea, but it is probably a massive standardization effort across the entire industry, and the economics of it all probably don't make sense at the moment. My bet is that in the future we'll see more reconfigurable hardware integrated with the software. Then you really have a thing that you can go ahead and optimize, and it will benefit everyone.

    • @minhajsixbyte
      @minhajsixbyte 2 years ago

      @Robert w And the rest of it was bought by AMD.

    • @W2wxftcxxtcrw
      @W2wxftcxxtcrw 1 year ago

      Idk, sounds non-trivial to implement. I say you develop the first prototype 😅

    • @Gooberpatrol66
      @Gooberpatrol66 3 months ago

      I've wondered this as well
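
  A minimal sketch of the idea in this thread, assuming an entirely hypothetical OS-level FPGA allocator: no such standard API exists today, and every name below is invented purely for illustration. It shows the allocator placing an accelerator on real fabric when floor space allows and falling back to CPU emulation otherwise, mirroring the RAM-versus-swap-file analogy.

  # Hypothetical OS-level FPGA "floor space" allocator with a CPU-emulation
  # fallback, mirroring the physical-RAM-versus-swap-file trade-off above.
  # All names are invented; no real OS or vendor API is being described.

  class HardwareAccelerator:
      """Accelerator placed on real FPGA fabric (fast path)."""
      def __init__(self, bitstream: bytes):
          self.bitstream = bitstream  # would be flashed to a fabric region

  class EmulatedAccelerator:
      """Software stand-in run on the CPU when the fabric is full (slow path)."""
      def __init__(self, bitstream: bytes):
          self.bitstream = bitstream  # would be interpreted in software

  class FpgaAllocator:
      """Tracks free reconfigurable-logic area in abstract 'tile' units."""
      def __init__(self, total_tiles: int):
          self.free_tiles = total_tiles

      def load_accelerator(self, bitstream: bytes, tiles_needed: int):
          # Place on real fabric if there is room, else emulate on the CPU,
          # just as the OS spills from physical RAM to the swap file.
          if tiles_needed <= self.free_tiles:
              self.free_tiles -= tiles_needed
              return HardwareAccelerator(bitstream)
          return EmulatedAccelerator(bitstream)

  # Usage: an MP3 library shipping "its own DSP" would ask the OS for space.
  os_fpga = FpgaAllocator(total_tiles=100)
  dsp = os_fpga.load_accelerator(b"mp3_dsp_bitstream", tiles_needed=40)
  print(type(dsp).__name__)  # HardwareAccelerator (the fabric had room)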

  • @ethiesm1
    @ethiesm1 3 years ago +4

    Machines that write our programs-- YES!

  • @ab8jeh
    @ab8jeh 3 years ago +1

    Seems like they should be talking about GPUs towards the end. Not sure why it didn't come up.

  • @GBlunted
    @GBlunted 8 months ago

    TPUs before they were a household term! Very telling little chat you clipped here...😮🤔😊

  • @thisguy9279
    @thisguy9279 3 years ago +6

    TensorFlow and PyTorch aren't "languages"!!! They are frameworks or libraries.

    • @machinephile
      @machinephile 3 years ago +1

      Actually, what I think is that they are languages in the sense that they build abstractions and interfaces on machine learning methodologies.

    • @thisguy9279
      @thisguy9279 3 years ago +1

      @@machinephile That's what most libraries or frameworks do. They build abstractions and interfaces over complex concepts. By that definition moviepy would be a language, because it builds abstractions and interfaces on ffmpeg. Even Bootstrap would be one. By that definition, most of the classes ever written would be a language, because they implement something that can be used more easily. And if you don't believe me, just google the word tensorflow; Google will say the same.

    • @machinephile
      @machinephile 3 years ago +4

      @@thisguy9279 I understand what you are trying to point out, but I would like you to think about what a programming language really is. Well, as we know, it consists of a set of rules by which you instruct the CPU and memory to compute an algorithm. I know it's quite silly to call libraries 'languages', and they certainly are not programming languages, but it is quite intriguing to think of them as semi-languages, in that they implement their own set of rules (see the snippet after this thread).

    • @manhalrahman5785
      @manhalrahman5785 3 years ago

      fight

    • @borazan
      @borazan 1 year ago +1

      Thank you for telling that to an AI researcher and a computer architecture pioneer with a Turing Award; bet they didn't know what "tenserflow" was called...
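
  As an aside on this thread, a small TensorFlow snippet illustrates both positions: TensorFlow is imported and called like any ordinary Python library, with no new syntax, yet tf.function traces the Python code into TensorFlow's own graph representation with its own execution rules, which is why "embedded DSL" is a reasonable middle-ground description.

  # TensorFlow used as a plain Python library: just imports and function
  # calls. tf.function then traces this into TensorFlow's own graph, the
  # "set of rules of its own" the thread above is arguing about.
  import tensorflow as tf

  @tf.function  # compiles the Python function into a TensorFlow graph
  def affine(x, w, b):
      return tf.matmul(x, w) + b

  x = tf.constant([[1.0, 2.0]])
  w = tf.constant([[3.0], [4.0]])
  b = tf.constant([0.5])
  print(affine(x, w, b))  # tf.Tensor([[11.5]], shape=(1, 1), dtype=float32)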

  • @jcb1orion
    @jcb1orion 3 years ago +3

    Why does this guy swallow cud every 10 seconds?