CPU vs GPU vs TPU vs DPU vs QPU

  • Published on Aug 25, 2023
  • What's the difference between a CPU and GPU? And what the heck is a TPU, DPU, or QPU? Learn how computers actually compute things in this quick lesson.
    #computerscience #tech #programming
    💬 Chat with Me on Discord
    / discord
    🔗 Resources
    Learn more about CPUs cpu.land
    CPU in 100 Seconds • How a CPU Works in 100...
    Math for Programmers • 10 Math Concepts for P...
    JS Worker Threads • PROOF JavaScript is a ...
    🔥 Get More Content - Upgrade to PRO
    Upgrade at fireship.io/pro
    Use code YT25 for 25% off PRO access
    🎨 My Editor Settings
    - Atom One Dark
    - vscode-icons
    - Fira Code Font
    🔖 Topics Covered
    - What is a CPU architecture?
    - ARM vs x86-64
    - CPU versus GPU
    - Why are GPUs so fast?
    - Why do you need a GPU?
    - What is a DPU?
    - Quantum computing basics
    - How are silicon chips made?
  • Science & Technology

Comments • 1.6K

  • @HerrMustermann
    @HerrMustermann 8 หลายเดือนก่อน +4599

    When you introduce CPU, GPU, TPU, DPU, and QPU at a party, CPU says 'I'm versatile', GPU says 'I'm visual', TPU says 'I'm trending', DPU says 'I'm data-centric', and QPU? Well, it's quantum; it knows all the answers, but when you look at it, it just says 'Maybe...'

    • @cosmos0909
      @cosmos0909 8 หลายเดือนก่อน +146

    • @ikedacripps
      @ikedacripps 8 หลายเดือนก่อน +75

      Maybe…

    • @madhououinkyoma
      @madhououinkyoma 8 หลายเดือนก่อน +16

      Nice try

    • @RADIT-ip3eq
      @RADIT-ip3eq 8 หลายเดือนก่อน +64

      So, schrodingus processor?

    • @Marginal0
      @Marginal0 8 หลายเดือนก่อน +177

      When you compare CPU, GPU, TPU, DPU, and QPU to superheroes, CPU is like Batman, GPU is like Superman, TPU is like Flash, DPU is like Cyborg, and QPU? Well, it’s quantum; it’s like Doctor Strange, but when you ask it to save the world, it just says ‘There are 14 million possibilities…’

  • @gemma2901
    @gemma2901 8 หลายเดือนก่อน +5083

    man, you are so good at explaining things and combining memes. prodigy.

    • @fackarov9412
      @fackarov9412 8 หลายเดือนก่อน

      he uses QuantumAI to fack our minds, thats why

    • @tedzards509
      @tedzards509 8 หลายเดือนก่อน +49

      1:29 was indeed a glorious commentary of what we have become.

    • @monkey-mike_xyz
      @monkey-mike_xyz 8 หลายเดือนก่อน +14

      THIS. He's the man .

    • @daem0n1ze
      @daem0n1ze 8 หลายเดือนก่อน +3

      his that guy?

    • @weabowoshi
      @weabowoshi 8 หลายเดือนก่อน +3

      savant

  • @vb6code
    @vb6code 7 หลายเดือนก่อน +252

    I asked the GPT to explain that video to my dog, he said,
    Explaining this to a dog might be a bit challenging since dogs don't understand complex technical concepts. However, you can use simple actions and treats to represent these concepts:
    1. **CPU (Central Processing Unit)**: Show your dog a treat (representing a task) and then give it to them. CPUs are like you, the treat giver, handling everyday tasks.
    2. **GPU (Graphics Processing Unit)**: Take a ball (representing a fun task), throw it, and encourage your dog to fetch it. GPUs are like a friend who helps with fun and sporty tasks.
    3. **TPU (Tensor Processing Unit)**: Place a puzzle in front of your dog and watch them solve it (or attempt to). TPUs are like a clever friend who's great at solving puzzles.
    4. **DPU (Data Processing Unit)**: Show your dog their leash (representing a connection) and take them for a short walk. DPUs are like a friend who helps you connect with others and stay safe during walks.
    5. **QPU (Quantum Processing Unit)**: Okay, this one's tricky! Pretend to be a magician and hide a treat in one hand. Then, show both hands and let your dog choose. Sometimes the treat appears, sometimes it doesn't (representing quantum unpredictability).
    Remember, dogs won't grasp these concepts fully, but it can be a fun and interactive way to spend time with your furry friend! 🐶🐾

    • @satyamanu2211
      @satyamanu2211 5 หลายเดือนก่อน +57

      This is fucking underrated and brilliant

    • @bilaltaj1725
      @bilaltaj1725 5 หลายเดือนก่อน

      k@@satyamanu2211

    • @whimsicalkins5585
      @whimsicalkins5585 2 หลายเดือนก่อน +7

      I am crying 😭

    • @zenta12
      @zenta12 2 หลายเดือนก่อน +20

      "Furry friend"

    • @ahmadal_shanqeety802
      @ahmadal_shanqeety802 หลายเดือนก่อน +5

      Wow that's actually awesome!
      Never thought ChatGPT was that useful

  • @user-tj9gj2wx5d
    @user-tj9gj2wx5d 8 หลายเดือนก่อน +396

    Multicore CPUs aren't the reason why we can run multiple applications at once. Operating systems could do that long before multicore CPUs were a thing. The technology which allows that is called process scheduling. The OS is basically switching between the running applications giving each of them a fraction of time (many times per second) to execute whatever code they are currently running. Having multiple cores just allows the OS to handle multiple processes more efficiently.

    • @kkounal974
      @kkounal974 8 หลายเดือนก่อน +44

      He means computing in true parallel, not context switching

    • @iwikal
      @iwikal 8 หลายเดือนก่อน +58

      @@kkounal974 He literally said at 3:02 that "modern CPUs also have multiple cores which allows them to do work in parallel, which allows you to use multiple applications on your PC at the same time". I'm all for benefit of the doubt, but anyone who doesn't already know about context switching is gonna leave this video thinking single core processors can't multitask.

    • @SahilP2648
      @SahilP2648 8 หลายเดือนก่อน +11

      Right but the difference between single core and now multicore processors is that instead of scheduling instructions of multiple applications on the same core, you can execute them on whichever core is available, provided the instructions don't require context.

    • @kanakTheGold
      @kanakTheGold 8 หลายเดือนก่อน +6

      Until multicore became a reality, the OS could only time-share slices between different threads; it truly became parallel processing only with the multiple pipelines of multi-core architecture.

    • @ChrisPepper1989
      @ChrisPepper1989 7 หลายเดือนก่อน +11

      Was coming here to say exactly that lol
      Also it's very important to distinguish between multiple processes and multiple applications.
      Because a single application can (and often will) have multiple processes that if all running on one core still have to be time shared.
      That's why the wizards have to ensure they use all cores if they want to get the best out of the CPU. Which of course means that you might be running multiple applications, that all use multiple cores. So the time sharing the OS does is still super important
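
To make the time-slicing idea from the thread above concrete, here is a minimal Python sketch (not from the video; the task names and step counts are made-up assumptions) of a round-robin scheduler that runs two "applications" on a single core by repeatedly pausing one and resuming the other:

```python
# Toy illustration of OS-style time slicing on one core: each task keeps its
# own state (its "context") and the scheduler gives each a short turn in turn.
from collections import deque

def task(name, steps):
    """A toy 'process' that yields control back to the scheduler after each step."""
    for i in range(steps):
        print(f"{name}: step {i}")
        yield  # pause here; the generator remembers where it left off

def round_robin(tasks):
    """Run every task a little bit at a time, on a single core, until all finish."""
    queue = deque(tasks)
    while queue:
        current = queue.popleft()
        try:
            next(current)          # give the task one time slice
            queue.append(current)  # not done yet: back to the end of the line
        except StopIteration:
            pass                   # task finished; drop it

round_robin([task("browser", 3), task("music player", 3)])
```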

  • @HeisenbergFam
    @HeisenbergFam 8 หลายเดือนก่อน +612

    "highly trained wizards called software engineers" gotta be one of the most accurate sentences said in history

    • @universaltoons
      @universaltoons 8 หลายเดือนก่อน +8

      W

    • @Namrec_Molai
      @Namrec_Molai 8 หลายเดือนก่อน +16

      This man spitted forbidden facts

    • @LuisSierra42
      @LuisSierra42 8 หลายเดือนก่อน +15

      I'm a wizard Harry

    • @rg2130
      @rg2130 8 หลายเดือนก่อน +12

      @@LuisSierra42 I'm a Jizzard Harry

    • @freddiem7993
      @freddiem7993 8 หลายเดือนก่อน

      Hello again heisenberg!
      For those who don't know, Heisenberg is the fresh account of the "NMRIH is a great source mod" which was banned for botting/violating YouTube TOS
      -Same mannerisms, Over 800+ subs to only popular/viewed channels, popped up right when the previous account was banned about four months ago, this account is a bot that spams and like baits channel's comment sections for subs.

  • @MrAcuriteOf1337
    @MrAcuriteOf1337 8 หลายเดือนก่อน +1136

    The thing with Quantum Computers is that, basically, you can take a function, feed it literally every possible input in one go, retrieve every possible output simultaneously, and then sample from that statistically to tell you things about the function. It's incredibly niche, and super hard to actually build, and we're not even sure how many problems can actually be accelerated with it, but it's fun to work through the Math for.

    • @FingerinUrDaughter
      @FingerinUrDaughter 8 หลายเดือนก่อน

      the thing with quantum computers is, theyre complete fucking nonsense, and not even an actual idea beyond "what if unlimited power?"

    • @ra2enjoyer708
      @ra2enjoyer708 8 หลายเดือนก่อน +94

      Don't quantum computers also get super fucked by background noise (much like anything involving quantum physics)? This reduces their usefulness to basically running in specific spots of outer space, assuming it can survive all the chaotic background radiation with no effect on its function.

    • @joankim123
      @joankim123 8 หลายเดือนก่อน +142

      it's true that you feed it all inputs, but you actually just get one output, like a normal computer. And then there's some incredibly complex math to statistically correlate repeated outputs with the true answer you want.

    • @FingerinUrDaughter
      @FingerinUrDaughter 8 หลายเดือนก่อน

      @@ra2enjoyer708 they dont do anything, because they dont actually exist.

    • @user0K
      @user0K 8 หลายเดือนก่อน +26

      @@joankim123 yea, as far as I remember it would collapse the quantum function, but you can choose specific parameters to be matching the required values.
      Basically, give me values for arguments of the function, which would result in the wanted result. Prolog 2.0 lol.
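
As a rough illustration of the "run it many times and look at the statistics" point discussed above, here is a small numpy sketch (a toy model, not anything from the video): one simulated qubit is put into an equal superposition, and each measurement collapses it to a single 0 or 1, so the probabilities only emerge from repeated runs:

```python
# One qubit starts in |0>, a Hadamard gate creates an equal superposition,
# and each simulated measurement returns just ONE outcome.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
ket0 = np.array([1.0, 0.0])                    # the |0> state

state = H @ ket0                               # (|0> + |1>) / sqrt(2)
probs = np.abs(state) ** 2                     # Born rule: outcome probabilities

rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=1000, p=probs)   # 1000 independent "runs"
print(f"P(0) ~ {np.mean(samples == 0):.2f}, P(1) ~ {np.mean(samples == 1):.2f}")
```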

  • @orbik_fin
    @orbik_fin 8 หลายเดือนก่อน +65

    You didn't mention DSP - digital signal processor. Specialized to run a single program continuously in a loop {input -> process -> output} with a strict time limit dictated by the sampling rate. Used extensively for stuff like audio, radio comm and oscilloscopes.

    • @PrathamInCloud
      @PrathamInCloud 24 วันที่ผ่านมา +1

      Yes because it's not used by a general purpose computer, even though technically it is still computing stuff

    • @abhinavnatarajan4180
      @abhinavnatarajan4180 18 วันที่ผ่านมา

      ​@@PrathamInCloudnot necessarily true, most general purpose computers have onboard audio chips that are doing A/D and D/A conversions, and that might involve some DSP. Lots of modern phones have dedicated DSP modules attached to their cameras and for dealing with microphone audio.
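
For a sense of the DSP-style loop described above, here is a toy Python sketch (the sample rate, tone, and filter are arbitrary assumptions, not from the video) of the input -> process -> output cycle, with a simple moving-average filter standing in for the signal processing:

```python
# Toy DSP loop: read a sample, filter it, write it out, over and over,
# fast enough to keep up with the sample rate.
import math
from collections import deque

SAMPLE_RATE = 48_000                 # samples per second the loop must keep up with
window = deque([0.0] * 4, maxlen=4)  # last 4 samples for a 4-tap filter

def read_sample(t):
    """Stand-in for an ADC read; here just a synthetic 440 Hz tone."""
    return math.sin(2 * math.pi * 440 * t / SAMPLE_RATE)

def write_sample(value):
    """Stand-in for a DAC write."""
    pass

for t in range(SAMPLE_RATE):         # one second of audio
    window.append(read_sample(t))
    write_sample(sum(window) / len(window))   # moving-average low-pass filter
```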

  • @Popipo85
    @Popipo85 8 หลายเดือนก่อน +90

    Are we all gonna ignore the guy playing League of Legends with a controller at 5:01? 💀

    • @BadDuDeShot
      @BadDuDeShot 8 หลายเดือนก่อน +4

      Was searching to see if anyone noticed it 💀

    • @EtaCarinaPhenixsChannel
      @EtaCarinaPhenixsChannel 4 หลายเดือนก่อน +2

      I was thinking the same thing XD

    • @HunterKiotori
      @HunterKiotori 3 หลายเดือนก่อน +2

      The superior way to play games

    • @lunyxappocalypse7071
      @lunyxappocalypse7071 หลายเดือนก่อน +4

      ​@@HunterKiotoriIt's best to play with whatever you grew up with, in my opinion. The brain remembers. My cousin still treats his computers keyboard like a controller, while I'm still wrapping my head around LT&RTs, and how to switch between two buttons simultaneously.
      It's a similar problem when switching musical instruments from, say a Guitar or Violin to a Keyboard.

  • @HerrMustermann
    @HerrMustermann 8 หลายเดือนก่อน +2183

    Seems like we have a family reunion here: CPU, the brainy one, GPU, the artist, TPU, the specialized smarty-pants, DPU, the traffic controller, and QPU, the one who says he's working, but you're always unsure because he lives in multiple realities!

    • @someguy9175
      @someguy9175 8 หลายเดือนก่อน

      the QPU is just the pot head

    • @iluvpandas2755
      @iluvpandas2755 8 หลายเดือนก่อน +21

      could not be better said

    • @DemPilafian
      @DemPilafian 8 หลายเดือนก่อน +73

      QPU uses a different work paradigm... it's known as WFH.

    • @fuzzy-02
      @fuzzy-02 8 หลายเดือนก่อน +7

      Rick and Morty, processing unit dimension?

    • @wheredhego47
      @wheredhego47 8 หลายเดือนก่อน +5

      Just thought about Neil Gaiman's The Sandman for some reason.

  • @mikedub
    @mikedub 8 หลายเดือนก่อน +332

    This is how history should be taught.

    • @brain5853
      @brain5853 8 หลายเดือนก่อน +34

      With Amouranth gifs hidden within the material? Agreed.

    • @Namrec_Molai
      @Namrec_Molai 8 หลายเดือนก่อน +4

      I think he read Oghma infinium

    • @LuisSierra42
      @LuisSierra42 8 หลายเดือนก่อน +5

      @@brain5853 🥵🥵🥵

    • @josue1996jc
      @josue1996jc 8 หลายเดือนก่อน +7

      you know... I've been having this conversation with my friends (who are also pretty well educated, I must say), and we all agree that the field of philosophy, whose people were usually the ones given the task of analyzing the facts studied by science, kinda digesting them and presenting them to the regular folk in an understandable way, has been less and less capable of doing this job and more and more disconnected than ever (science and philosophy should be more interconnected now than ever), sadly because the factual data presented by science is becoming more and more complicated by the introduction of... well, the "quantum everything" as we call it xd. So what happens when not even the philosophers can understand what the fuck is going on in the physics department? And to be fair, I don't blame them. We didn't really get to an answer in the end, but I think this channel has something that might help with the current situation.

    • @MarkelMathurin
      @MarkelMathurin 2 หลายเดือนก่อน

      But when you really think about it, the air does have a taste​@@josue1996jc

  • @yehitecharts
    @yehitecharts 8 หลายเดือนก่อน +22

    BRO! 6:16 was the MOST comprehensive visualization of matrix multiplication I've seen and I've watched several videos and tried reading several books because I've been self-studying and I came to the conclusion that I need a tutor or something but wow bravo. That actually made me excited to learn it! If you're not doing math videos can you point me in the right direction? Thank you and God bless!

    • @huverdoose
      @huverdoose 18 วันที่ผ่านมา +1

      Still no replies. That sucks. I like 3Blue1Brown's channel. He assumes you've gotten to Trig at least before you start with his earliest videos.
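
For anyone who, like the commenter above, wants the 6:16 matrix-multiplication animation in concrete form, here is a bare-bones Python sketch (an illustration only; the video shows no code) where each output cell is the dot product of a row of A with a column of B:

```python
A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]

def matmul(A, B):
    rows, inner, cols = len(A), len(B), len(B[0])
    C = [[0] * cols for _ in range(rows)]
    for i in range(rows):          # pick a row of A
        for j in range(cols):      # pick a column of B
            for k in range(inner): # multiply pairwise and accumulate
                C[i][j] += A[i][k] * B[k][j]
    return C

print(matmul(A, B))  # [[19, 22], [43, 50]]
```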

  • @TrippSaaS
    @TrippSaaS 8 หลายเดือนก่อน +3

    Blown away by how good this content is. Thanks!

  • @markzuckerbread1865
    @markzuckerbread1865 8 หลายเดือนก่อน +297

    An analogy I really liked for comparing cpu to gpu is trains vs cars, cars (cpu) are really fast for transporting one person as fast as possible, while trains (gpu) are faster than cars when transporting thousands of people as fast as possible, a cpu has really low latency for executing one instruction while gpus abuse simd operations to reduce the average latency over many parallel and similar instructions.

    • @attepatte8485
      @attepatte8485 8 หลายเดือนก่อน +7

      Thanks Zuc.

    • @headhunterz1000
      @headhunterz1000 8 หลายเดือนก่อน

      Yeah so which one is the car?

    • @HappyBeezerStudios
      @HappyBeezerStudios 4 หลายเดือนก่อน +2

      Don't forget that the train can only transport people and only transport them all to the same station while the car can transport all kinds of stuff and load on and off at any point.
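
The car-vs-train point above is roughly the scalar-vs-vectorized distinction. A quick Python/numpy sketch (the array size is arbitrary and timings will vary by machine; this is not the video's code) of handling elements one at a time versus in one batched operation:

```python
# "Car": one element per trip.  "Train": the whole batch in one vectorized op.
import time
import numpy as np

x = np.random.rand(1_000_000)

t0 = time.perf_counter()
doubled_scalar = [v * 2.0 for v in x]   # one passenger at a time
t1 = time.perf_counter()
doubled_vector = x * 2.0                # the whole trainload at once
t2 = time.perf_counter()

print(f"scalar loop: {t1 - t0:.4f}s   vectorized: {t2 - t1:.4f}s")
```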

  • @HerrMustermann
    @HerrMustermann 8 หลายเดือนก่อน +517

    CPU: We need to talk. GPU: Already calculated what you're about to say. TPU: Already completed your words with predictions. DPU: Already sent the message. QPU: I was on the next conversation.

    • @relix3267
      @relix3267 8 หลายเดือนก่อน +71

      All that while the human is watching onlyfans😂

    • @rem7412
      @rem7412 8 หลายเดือนก่อน +6

      I like this

    • @Pandazaar
      @Pandazaar 8 หลายเดือนก่อน +10

      brother is just spamming chatgpt comments

    • @rem7412
      @rem7412 8 หลายเดือนก่อน +4

      @@Pandazaar smh it's still funny

    • @dontblamepeopleblamethegov559
      @dontblamepeopleblamethegov559 8 หลายเดือนก่อน +5

      QPU: I own your bitcoins now

  • @Hansanca
    @Hansanca 8 หลายเดือนก่อน

    This was the best video you’ve made yet. Keep it up!

  • @sebby007
    @sebby007 8 หลายเดือนก่อน +2

    Your videos are epic! The information and humour density are perfect!

  • @jp46614
    @jp46614 8 หลายเดือนก่อน +309

    It's important to clarify that on most architectures (especially CISC) one clock cycle usually isn't one instruction; only some very fast instructions can execute in one clock cycle, but reading memory or division/multiplication can take several clock cycles.

    • @crazybeatrice4555
      @crazybeatrice4555 8 หลายเดือนก่อน +14

      Well there's also instructions per clock as well

    • @Luredreier
      @Luredreier 8 หลายเดือนก่อน +31

      Most of the common instructions actually finish in one clock cycle these days; AMD and Intel have both worked really hard on that to reduce latency.
      But you're right, some instructions might take multiple clock cycles.
      On the other hand, a core has multiple pipelines and can run multiple instructions simultaneously, filling pipelines with out-of-order execution, speculative execution and a second thread to ensure that execution resources are used even if one thread's code doesn't use that resource in that moment.

    • @MI08SK
      @MI08SK 8 หลายเดือนก่อน +4

      Some instructions can be executed in parallel in 1 cycle if they are not dependent; for example, if there are 4 sequential additions to 4 registers, the CPU will execute all of them in one clock cycle, because most CISC CPUs have multiple ALUs, so they can execute those operations simultaneously

    • @MI08SK
      @MI08SK 8 หลายเดือนก่อน +1

      Reading memory can take 1 clock cycle if it is in L1 cache

    • @trevoro.9731
      @trevoro.9731 8 หลายเดือนก่อน +2

      @@MI08SK Really ? Name the CPU which has a L1 latency of 1 cycle.
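
To put numbers on the point that different instructions cost different numbers of cycles, here is a back-of-the-envelope Python sketch using the classic CPU time formula (the instruction mix and CPI values below are illustrative assumptions, not measurements from the video):

```python
# time = (instruction count x average cycles per instruction) / clock frequency
instruction_mix = {             # fraction of the program, cycles per instruction
    "integer add": (0.50, 1),
    "load/store":  (0.30, 4),
    "multiply":    (0.15, 3),
    "divide":      (0.05, 20),
}

avg_cpi = sum(frac * cpi for frac, cpi in instruction_mix.values())
instructions = 1_000_000_000    # 1 billion instructions
clock_hz = 3_000_000_000        # a 3 GHz clock

print(f"average CPI: {avg_cpi:.2f}")
print(f"runtime: {instructions * avg_cpi / clock_hz:.3f} s")
```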

  • @noramwahmwah
    @noramwahmwah 8 หลายเดือนก่อน +457

    QPU stands for Quadruple Parallel Universes

    • @xbeatghost.6118
      @xbeatghost.6118 8 หลายเดือนก่อน +3

      What's that 🧐?

    • @ablobofgarbage
      @ablobofgarbage 8 หลายเดือนก่อน +43

      Ah, an individual of culture i see

    • @oguzhan.yilmaz
      @oguzhan.yilmaz 8 หลายเดือนก่อน

      Nvidia's official blog is saying QPU stands for Quantum processing units

    • @noramwahmwah
      @noramwahmwah 8 หลายเดือนก่อน +5

      @@ablobofgarbage Glad to see im not the only one (^∪^)

    • @akar_excel
      @akar_excel 8 หลายเดือนก่อน

      Bro

  • @PixyEm
    @PixyEm 8 หลายเดือนก่อน +10

    One downside of a QPU is that you need to stay aligned; you don't wanna know what happens when your QPU is misaligned

  • @zanes9898
    @zanes9898 7 หลายเดือนก่อน

    Wow! This is by far the best layman video I've ever experienced on CPU variants.
    Keep doing these whoever you are.

  • @DrakiniteOfficial
    @DrakiniteOfficial 8 หลายเดือนก่อน +122

    Correction: 1 Hz does not mean 1 instruction per second. Many types of instructions, like multiply/divide and memory operations, take multiple clock cycles to finish. 1 Hz just means its clock runs once per second.
    Edit: I'm not completely sure about this second one, but I think Neumann in von Neumann is pronounced NOY-min, not NEW-min.

    • @asdfssdfghgdfy5940
      @asdfssdfghgdfy5940 8 หลายเดือนก่อน +10

      That’s a tricky one as he is from Hungary and I’m not sure how they would pronounce it. But in German it is NOY-mann. But Americans tend to pronounce it NEW-mann and he lived there for a fair while so he was probably called that when he was there.

    • @someliker
      @someliker 8 หลายเดือนก่อน +5

      NOY-mann
      The "a" is pronounced the same as in "Aha!". Short "a", long "n".

    • @armyant7
      @armyant7 8 หลายเดือนก่อน +6

      Just need to remember how "Freud" is pronounced 😉
      This applies to "Euler" too...but not "Euclid" ☠️ (presumably due to cultural origin)

    • @asdfssdfghgdfy5940
      @asdfssdfghgdfy5940 8 หลายเดือนก่อน

      @@armyant7 Freud is easy. Try pronouncing Nietzsche

    • @DrakiniteOfficial
      @DrakiniteOfficial 8 หลายเดือนก่อน

      ​@@asdfssdfghgdfy5940My guess is "NEE-etch". Am I close?

  • @AR-yr5ov
    @AR-yr5ov 8 หลายเดือนก่อน +83

    OMG you're the best at explaining tech topics in a digestible, memeable format

  • @finadoggie
    @finadoggie 8 หลายเดือนก่อน +13

    6:50 somehow, touhou manages to appear everywhere

    • @SubPriestPepsi
      @SubPriestPepsi 8 หลายเดือนก่อน +1

      Submit to the cult.

  • @stachowi
    @stachowi 8 หลายเดือนก่อน +2

    Love everything aspect of this video, you have a gift

  • @PhillipAmthor
    @PhillipAmthor 8 หลายเดือนก่อน +41

    1:26 this is the ideal computing output; you may not like it, but this is what peak performance looks like

    • @TriNguyen-xi8ji
      @TriNguyen-xi8ji 8 หลายเดือนก่อน +6

      Any one have the source? for research purpose of course.

    • @KatyaAbc575
      @KatyaAbc575 8 หลายเดือนก่อน +12

      @@TriNguyen-xi8ji Amouranth, my dude.

    • @ColePanike
      @ColePanike 8 หลายเดือนก่อน +8

      Lol. I was looking for this. It seems the replay frequency is disabled, but I'd be willing to bet that bit would have a nice spike 😏

    • @Triangle1234
      @Triangle1234 หลายเดือนก่อน +1

      lmao

    • @Vifnis
      @Vifnis หลายเดือนก่อน

      Ah yes, the *binary logic gates-to-boobpic.jpg* pipeline

  • @qdaniele97
    @qdaniele97 8 หลายเดือนก่อน +64

    ALUs (arithmetic logic units) and FPUs (floating point units) also used to be a thing, but nowadays they are almost always part of the CPU (and are plenty powerful, so there is no need to add external ones).

    • @robertobokarev439
      @robertobokarev439 8 หลายเดือนก่อน +3

      The FPU is replaced by AVX; there's even a separate instruction for float addition and subtraction executing in just 2 cycles. The only case the FPU is useful in is OS development (that shit with debug registers and stuff)

    • @cambrown5777
      @cambrown5777 8 หลายเดือนก่อน

      @@robertobokarev439AVX is just the contrived name of the ISA extension on x86 that allows vectorization/SIMD ops . FPU is the name of the module within the microarchitecture. These are totally different things.

    • @TomNook.
      @TomNook. 8 หลายเดือนก่อน +6

      Yeah I remember you could buy a FPU for the Amiga to accelerate it somewhat

    • @acompletelyawesomenameyay2587
      @acompletelyawesomenameyay2587 2 หลายเดือนก่อน

      PPU (Physics Processing Unit)

  • @ThantiK
    @ThantiK 8 หลายเดือนก่อน +5

    TPU is a brand-specific chip made by Google for TensorFlow. TPU is not a standardized term; the generic term would be AI accelerator.

  • @elliotkopitske6222
    @elliotkopitske6222 8 หลายเดือนก่อน +3

    This is the most comprehensive CPU vs GPU vs TPU vs DPU vs QPU guide I have ever seen

  • @jaysistar2711
    @jaysistar2711 8 หลายเดือนก่อน +25

    RISC-V can, like ARM, be used for both high performance and low power consumption.

    • @jacob2808
      @jacob2808 8 หลายเดือนก่อน +11

      But is much much less mature than ARM

    • @breakfast7595
      @breakfast7595 8 หลายเดือนก่อน +1

      ​@@jacob2808Yes it's not as mature, but I personally see RISC-V is the way to go because of the long term viability, and the open nature being better for compatibility and usability

    • @jacob2808
      @jacob2808 8 หลายเดือนก่อน

      @breakfast7595 i respectfully disagree but won't write up a thesis at this hour lol

  • @pixiedev
    @pixiedev 8 หลายเดือนก่อน +9

    I liked the output 😅 1:27

  • @georgeakonjom6015
    @georgeakonjom6015 8 หลายเดือนก่อน

    The videos just keep getting better!

  • @seneral9804
    @seneral9804 8 หลายเดือนก่อน +9

    And here I thought QPU stood for Quad Processing Unit
    Yes, that's a thing, usually a GPU-like parallel processor. Raspberry Pis mentioned in the video have one as part of their VideoCore "GPU".
    E.g. VideoCore IV has 12 cores, each processes 4 vectors per instruction cycle, with each vector being 4x32bit - so a 16-way 32bit processor.
    Used for the OpenGL implementation, but can also be programmed manually, if you dare (assembly or some higher level languages though with lesser support).
    It actually has decent branching support for a GPU, as you can mask away results for each vector element, and can branch completely if all your vector elements are fine with that branch.

  • @andreastheone1
    @andreastheone1 8 หลายเดือนก่อน +3

    Awesome video! Thank you so much for all the good and accurate facts that i sure will use in the next months :D

  • @samuelgunter
    @samuelgunter 8 หลายเดือนก่อน +37

    they call me a YVPU -- youtube video processing unit -- because of my crippling addiction to watching youtube videos

  • @AndersHass
    @AndersHass 8 หลายเดือนก่อน +15

    Modern x86 processors do run fairly similarly to a RISC-type processor, but they still have a lot more instructions in case those are still being used (to not break compatibility).
    RISC-V will also be an interesting instruction set architecture, but it is mainly just in microcontrollers and Raspberry Pi type devices and not for personal and data center usage yet.
    There are a lot of special processors made, like for taking pictures/video on phones and encoders/decoders.
    I would think with the rise of various machine learning models, more processors will be made to optimize for them (or use FPGAs).

  • @AjayGautam-ik2dm
    @AjayGautam-ik2dm 8 หลายเดือนก่อน +1

    Pure Awesomeness. Thanks for posting this 😊

  • @vishalmakwana8391
    @vishalmakwana8391 8 หลายเดือนก่อน +35

    The highly trained wizards, called software engineers 😂

    • @LuisSierra42
      @LuisSierra42 8 หลายเดือนก่อน +3

      Avada Angular js!!

    • @darkwoodmovies
      @darkwoodmovies 8 หลายเดือนก่อน +2

      @@LuisSierra42 What dark magic is this!? Expecto Reactus!

    • @PowerK1
      @PowerK1 8 หลายเดือนก่อน +1

      @@darkwoodmoviessmd fr fr

    • @mummyjohn
      @mummyjohn 23 วันที่ผ่านมา

      King, Warrior, Magician, Lover; we are definitely in the age of the magician right now

  • @luke5100
    @luke5100 8 หลายเดือนก่อน +6

    Who else is old enough to remember math coprocessors? My first computer as a kid was a 486SX in the mid 90s and I remember reading in the manual that an optional math coprocessor could be installed. I always wondered how much faster that would have made my machine

    • @NNokia-jz6jb
      @NNokia-jz6jb 8 หลายเดือนก่อน +2

      286 had them also.

    • @jbird4478
      @jbird4478 8 หลายเดือนก่อน +1

      We still have those, but they're nowadays integrated into the CPU. In fact, they already were at the time. If you upgraded a 486SX to one with math coprocessor, you were actually given a full CPU with integrated coprocessor, and the 486SX would retire, remaining a useless artifact on your motherboard. As to how much faster that made it... depends on what you were doing. For most things, not any faster at all. For CAD, a lot.

    • @luke5100
      @luke5100 8 หลายเดือนก่อน +1

      @@jbird4478 interesting. So as always there was probably a degree of marketing hype behind it. Haha. My understanding was SX meant no math coprocessor and DX meant it had one. Not sure if that was actually how it worked. Back then we were still a couple years away from all the information we now have access to on the web so you learned about things through your friends, through manuals or through computer magazines

    • @jbird4478
      @jbird4478 8 หลายเดือนก่อน +2

      ​@@luke5100 That's what it meant. But there was no external math coprocessor for the 486, so the optional math coprocessor they sold was internally a full 486 DX with the FPU integrated, but rebranded and with different pins. Yes, that's marketing. Anyone who at the time would have needed an FPU would have known so, and would have bought a DX to begin with. Most general purpose software (especially back then) doesn't use the FPU at all, or hardly, so does not benefit from it. WordPerfect and Minesweeper would have run exactly the same with it :)

    • @warlockpaladin2261
      @warlockpaladin2261 8 หลายเดือนก่อน +1

      The SX was basically marketed as a cheaper chip than the DX simply because it wasn't built with the extra math capabilities, and unless you were doing something special, the odds were that SX was just fine even if slower on mathematic processes. Anyway, I had a 486DX back in 1992 (when it had just come out), but they were already obsolete and on the second-hand market by the mid-90's because of the Pentium (basically, 586). Oddly enough, the earliest Pentium chip I've ever personally owned or encountered was the Pentium II, so I've never had to deal with "the division error".

  • @lakshyabankey1429
    @lakshyabankey1429 3 หลายเดือนก่อน

    Bro this has to be the best opening out of any YouTube video I have ever seen

  • @leoaso6984
    @leoaso6984 8 หลายเดือนก่อน +33

    3:03 Just to expand on this, CPUs don't strictly *need* multiple cores to run programs at the same time. What really allows this to happen is context switching and I/O.
    If you record the states of all of the registers and memory of a program (i.e. the program context), and then restore the registers and memory to those states at a later time, the program will continue running like no time passed.
    Operating systems use this nifty feature to constantly switch out the currently running program, and they do this so many times per second that the user feels like the programs are running smoothly in parallel.
    And they switch contexts either when a certain number of milliseconds passes, or when the current program thread does some I/O, since a thread waiting for I/O to complete does not need the CPU.

    • @Max_G4
      @Max_G4 8 หลายเดือนก่อน +2

      Well, that is quasi-parallel computing. For actual parallel computing, you do need multiple processors

    • @ohalee-nkwochachijioke7624
      @ohalee-nkwochachijioke7624 8 หลายเดือนก่อน

      ​@@Max_G4Exactly 👌

  • @dylsplazy
    @dylsplazy 8 หลายเดือนก่อน +8

    You forgot the PPU (Picture Processing Unit), the old 80s 8-bit proto-GPU. You'd typically find one in retro game consoles

    • @warlockpaladin2261
      @warlockpaladin2261 8 หลายเดือนก่อน +1

      These were responsible mainly for converting video memory data directly into analog-ready signals. In that sense, a PPU was technically more of a DAC than a GPU. On that topic, a GPU is really only a PPU if it has an analog video output of some kind.

  • @computerblade
    @computerblade หลายเดือนก่อน +8

    Now there's NPU....

  • @myhandle__
    @myhandle__ 8 หลายเดือนก่อน

    This intro was such a simple, fun and creative explanation of what a computer is

  • @magicmanchloe
    @magicmanchloe 2 หลายเดือนก่อน

    That intro is fantastic! You just got a new subscriber! 😂

  • @vasilis23456
    @vasilis23456 7 หลายเดือนก่อน +5

    You could also go over older deprecated units: the FPU (floating point unit), which is now included in most CPUs; the short-lived physics cards, which have now merged with graphics cards; and the long-lived sound cards, from when CPUs were not powerful enough to do sound and other functions at the same time. As you can see, most of these units died due to CPUs becoming more powerful and taking over their jobs. That is because there is a fairly hard cap on the performance needed for things like sound, unlike graphics, where quality rose with the performance of these cards.

  • @luke5100
    @luke5100 8 หลายเดือนก่อน +4

    3:04 Jeff made it sound like multicore processors are necessary to run more than one application at once. I don’t think he meant for it to come out that way, but just to clarify for anyone who is still new to this stuff… Multitasking has been possible for decades through a process called time slicing, where even on a single core, the CPU can do bits of work For multiple processes, effectively simulating things happening simultaneously even though things are still happening sequentially. It’s like if you are preparing a meal and you have multiple things on the stovetop at once, you check on one of them, bounce over to the next one and go between them until the meal is ready

    • @arjundureja
      @arjundureja 8 หลายเดือนก่อน

      Yeah it's called context switching.

  • @michaalinski2925
    @michaalinski2925 8 หลายเดือนก่อน

    Great video. Keep posting more stuff like this.

  • @monstag616
    @monstag616 7 หลายเดือนก่อน +1

    Your videos are really informative , funny 😂😂 and short. Loved it.❤

  • @user-jd3gf5xw1x
    @user-jd3gf5xw1x 8 หลายเดือนก่อน +3

    that short animation just taught me how multiplying matrices works

  • @peterlach681
    @peterlach681 8 หลายเดือนก่อน +4

    thank you, this video is fire!

  • @Jerry.Luna63
    @Jerry.Luna63 8 หลายเดือนก่อน

    Another banger explainer video! Thank you 🙏

  • @FatherOshai
    @FatherOshai 6 หลายเดือนก่อน

    This guy is just next level 😂 I love it ...im a new sub!

  • @destroyer2973
    @destroyer2973 8 หลายเดือนก่อน +6

    Inside the Broadcom VideoCore GPU there are 4 slices. Each slice contains a quad processing unit, which is a quad-threaded RISC CPU core with additional instructions for graphics operations. It runs a proprietary firmware based on Microsoft threads and is also responsible for the boot sequence on the Raspberry Pi.

    • @sel4785
      @sel4785 7 หลายเดือนก่อน

      What in the goddamn fuck are they cooking over there

  • @jonas8708
    @jonas8708 8 หลายเดือนก่อน +5

    Even modern x86 CPUs use RISC under the hood. Instead, they simulate x86 instructions with a hardware compatibility layer, because the x86 instruction set has become so ridiculously complicated that implementing it directly into the base layer silicon was becoming a serious problem both for performance and circuit complexity.

  • @uprobo4670
    @uprobo4670 8 หลายเดือนก่อน

    MAN ... i so rarely comment on things but your first 40 seconds were pure art in many forms ...

  • @eskayML
    @eskayML 8 หลายเดือนก่อน

    You're the best man, love it!!!

  • @wiredWhiz27
    @wiredWhiz27 8 หลายเดือนก่อน +9

    It's fascinating how far graphics cards have come
    Initially for graphics rendering
    Then crypto
    Now Deep learning neural networks and Ai
    Wow i wonder what they'll do next

  • @ayushs_2k4
    @ayushs_2k4 2 หลายเดือนก่อน +3

    That last "Trust me bro, it doesn't work" 🤣😂

  • @weybansky
    @weybansky 8 หลายเดือนก่อน

    I just love the writing and explanations 💓

  • @tg3470
    @tg3470 8 หลายเดือนก่อน +2

    Protect this man at all costs! Thank you for this explanation

  • @minneelyyyy8923
    @minneelyyyy8923 8 หลายเดือนก่อน +5

    the TPU is called the template processing unit. it is a chip specifically designed to speed up the compile times of c++ programs.

    • @JATmatic
      @JATmatic 8 หลายเดือนก่อน

      Ah, the case of running an compiler on the template meta programming instruction set TMPI.
      Letting the compiler compile time compiler that runs on compile time.

    • @sciencecompliance235
      @sciencecompliance235 8 หลายเดือนก่อน

      It's my understanding that the T in TPU stands for tensor. Like a matrix but with n dimensions.

  • @yugshende3
    @yugshende3 8 หลายเดือนก่อน +4

    That cpu outputting amouranth was the funniest thing I’ve seen all day.

  • @AdrianoRodrigues
    @AdrianoRodrigues 8 หลายเดือนก่อน

    Man, i gotta say, i really like your way of explaining such complex subject with such a humor! The memes are hilarious

  • @Jake-mp7ex
    @Jake-mp7ex 8 หลายเดือนก่อน +35

    The reason you can do multiple things at once isn't because of multiple cores, we could do it back with only a single cpu.
    Your CPU does a tiny bit of computation at a time for multiple processes, and switches between them rapidly. Think of it like a chef cooking 20 meals at once.
    The reason this isn't noticeable is largely due to the vastly slower I/O commands it has to wait for. You can think of this as frying. You can think of the CPU as cracking the egg, plating up, etc.

    • @rankarat
      @rankarat 8 หลายเดือนก่อน +6

      Single core gives an illusion of parallelism.
      Multiple cores actually work in parallel.

    • @softbubble_
      @softbubble_ 8 หลายเดือนก่อน +2

      @@rankarat is hyperthreading an illusion or actual parallelism?

    • @stepansigut1949
      @stepansigut1949 8 หลายเดือนก่อน +1

      ⁠@@rankarat
      Do they though? They might share a memory controller which needs to fetch the data sequentially. Parallelism depends just on the expected latency of the output and can be achieved via interleaving.

    • @dominicdurkacs8321
      @dominicdurkacs8321 8 หลายเดือนก่อน

      A single CPU core doing multiple things at once is like you doing homework and eating food at the same time: you alternate.
      A multi-core CPU doing multiple things at once is like you doing homework and listening to music at the same time.

  • @Dicska
    @Dicska 8 หลายเดือนก่อน +20

    I think it's important to note that GPUs are much better at floating point operations which are essential to calculate co-ordinates for 3D rendering while CPUs are mainly good at integer operations - that's one of the reasons they co-exist and can't replace each other. I know the video explained some of it, but I'm surprised it didn't touch on the float-integer subject.
    Also, how did nobody point out the literal madlad playing League of Legends with a gamepad at 5:01, lol?

    • @rift1067
      @rift1067 8 หลายเดือนก่อน +1

      This. I was thinking the same in both cases. xD

  • @v.abhinav5637
    @v.abhinav5637 7 หลายเดือนก่อน

    mate that was crisp and informative

  • @CarlJohnson-iv7sn
    @CarlJohnson-iv7sn 8 หลายเดือนก่อน

    I love how you make it sound in the beginning like it's some crazy task and all we do is write javascript.

  • @honkhonk8009
    @honkhonk8009 8 หลายเดือนก่อน +3

    4:47
      That's from an Nvidia graphic showing their older Pascal architecture on the left vs their newer Turing architecture on the right when it comes to matrix math

  • @HerrMustermann
    @HerrMustermann 8 หลายเดือนก่อน +63

    CPU to GPU: "You're pretty graphic, huh?"
    GPU to TPU: "You tensor to be dramatic, don't you?"
    TPU to DPU: "Always data-centered huh?"
    DPU to QPU: "Quantum of nonsensical bragging!"
    QPU: "I've processed this joke in a parallel universe where it's actually funny!"

    • @HypnosisBear
      @HypnosisBear 8 หลายเดือนก่อน +1

      Lmfao now that's what I call a good comment! Made my day xd

    • @avrakadavra1552
      @avrakadavra1552 8 หลายเดือนก่อน

      AI-generated joke, good one

  • @MrMBSonic
    @MrMBSonic 8 หลายเดือนก่อน

    One of the best conclusions i've ever seen 👍

  • @stephenwlodarczyk175
    @stephenwlodarczyk175 5 หลายเดือนก่อน

    So cool So inspiring thanks for a great video. All clear as mud now.

  • @maxjohnson7623
    @maxjohnson7623 8 หลายเดือนก่อน +3

    Great video

  • @DJ-bo4pz
    @DJ-bo4pz 8 หลายเดือนก่อน +4

    1:26 I laughed so damnnn hard on this😂

  • @ianmacmoore-nk4vz
    @ianmacmoore-nk4vz 8 หลายเดือนก่อน

    I wasn’t expecting a review of computer hardware history, but I’m here for it.

  • @rohanrajput36940
    @rohanrajput36940 4 หลายเดือนก่อน

    Man the starting was insane 😂, video is very helpful❤

  • @b4ttlemast0r
    @b4ttlemast0r 8 หลายเดือนก่อน +6

    Modern GPUs actually have tensor cores included in them, so they're basically a GPU and TPU combined

  • @MaeLSTRoM1997
    @MaeLSTRoM1997 8 หลายเดือนก่อน +5

    0:53 "built by Konrad Zuse in 1936 in his mom's basement" lol you're the best

  • @cyberbiosecurity
    @cyberbiosecurity 8 หลายเดือนก่อน

    thank you Fireship

  • @thedecimalspace2977
    @thedecimalspace2977 8 หลายเดือนก่อน

    This is your funniest yet! Brilliant.

  • @LeonAlkoholik67
    @LeonAlkoholik67 6 หลายเดือนก่อน +3

    You forgot NPUs. They will be used in Windows in the near future if you happen to have one inside your PC case. Task Manager will also be able to recognize it.

  • @ElOroDelTigre
    @ElOroDelTigre 8 หลายเดือนก่อน +3

    It would be nice to have that output video complete and downloadable, for research reasons.

    • @piotrmazgaj
      @piotrmazgaj 8 หลายเดือนก่อน

      You know... I'm something of a scientist myself... output name: Amouranth (Kaitlyn Siragusa) from Twitch

  • @srinivasraghavendran9114
    @srinivasraghavendran9114 2 หลายเดือนก่อน

    I watched this on 2x speed but could fully comprehend due to having background knowledge but also the fact that your explanation is soo good! In 4 mins I understood this whole thing, thank you soo much !!

  • @talhashah
    @talhashah 8 หลายเดือนก่อน

    This channel is the best thing I have discovered on YouTube.

  • @M4rt1nX
    @M4rt1nX 8 หลายเดือนก่อน +12

    I think that I'm ready to get my degree after watching this video. My brain literally got overloaded with all that information at that pace.

    • @jarodmica
      @jarodmica 8 หลายเดือนก่อน +1

      It's time for us to get a degree in Wizardry 🤯

  • @macreator9497
    @macreator9497 8 หลายเดือนก่อน +4

    1:20 a CPU can do more than one instruction per clock cycle; it depends on transistor count

    • @warlockpaladin2261
      @warlockpaladin2261 8 หลายเดือนก่อน +1

      Not like that, it doesn't. 😅

    • @macreator9497
      @macreator9497 8 หลายเดือนก่อน

      @@warlockpaladin2261 google ipc

  • @Umarbit
    @Umarbit 8 หลายเดือนก่อน +1

    Please make a full separate video on quantum computer. You are good in teaching complex concepts.

  • @DevOpsBoss
    @DevOpsBoss 8 หลายเดือนก่อน

    Another straight-fire video 🔥🔥🔥

  • @KinoINFINITY
    @KinoINFINITY 8 หลายเดือนก่อน +18

    1:32 output 😂

  • @taimunozhan
    @taimunozhan 8 หลายเดือนก่อน +3

    There is a common misconception that quantum computers will replace regular computers. If quantum computers ever become available to the general public (and assuming society doesn't collapse after our current forms of encryption get obliterated), then it is likely that we'd see QPUs working together with CPUs, in the same way CPUs and GPUs coexist - a QPU can do some tasks much faster (by using a different kind of algorithms that exploit quantum weirdness) but they would be much slower for other computations (using traditional algorithms like the ones CPUs and GPUs are optimized to run).

    • @cazmatism
      @cazmatism 5 หลายเดือนก่อน

      Well apparently quantum encryption exists although the present form would still get obliterated

  • @sauravbv
    @sauravbv 8 หลายเดือนก่อน +1

    Fireship videos are like gym for the brain, it feels good after watching it ❤

  • @arthurhakobyan7343
    @arthurhakobyan7343 7 หลายเดือนก่อน

    Man the thumbnail is Fire 😂 you are a Legend 👍🏼

  • @tristanmisja
    @tristanmisja 4 หลายเดือนก่อน +4

    If you showed this video to someone 600 years ago they would start a new religion based off of it

    • @recongraves1269
      @recongraves1269 หลายเดือนก่อน

      Lol no need we are doing that now with ai Bitcoin and agi🎉 that's what all this is 😂

  • @mpusch88
    @mpusch88 8 หลายเดือนก่อน +7

    These videos are gold

  • @roshankalita9365
    @roshankalita9365 7 หลายเดือนก่อน

    Very informative. 👌

  • @lakshmanshankar
    @lakshmanshankar 8 หลายเดือนก่อน +2

    This is a top quality material 🔥🔥🔥🔥

  • @Dominik-K
    @Dominik-K 8 หลายเดือนก่อน +10

    I've gotten myself a Google Edge TPU USB stick, Coral Edge, which is super useful for some niche use cases. The power/energy efficiency makes it possible to let that run on battery too, interesting stuff

    • @vinylSummer
      @vinylSummer 8 หลายเดือนก่อน +3

      i wish i could get one here in russia. the thing costs a shit ton of money and it's only available through shady retailers

    • @RoflcopterLamo
      @RoflcopterLamo 8 หลายเดือนก่อน

      @@vinylSummer Probs got hardware/firmware malware aswell

  • @IvanRandomDude
    @IvanRandomDude 8 หลายเดือนก่อน +4

    All of that science and engineering so I can style a button with css.

  • @Kevin.Kawchak
    @Kevin.Kawchak 5 หลายเดือนก่อน

    Thank you for the discussion

  • @GhostTheDeveloper_
    @GhostTheDeveloper_ 8 หลายเดือนก่อน +1

    wonderful video bro

  • @bladetoto94
    @bladetoto94 8 หลายเดือนก่อน +5

    6:05 Yes pls, I have a RTX 4080 and that is what I plan to do with it. Please provide me a video on how to train AI, ty. I'm not even fuckin' joking!