GPU Computing Explained | How A GPU Works

  • Published Jan 8, 2025

Comments •

  • @OptimisticFuturology
    @OptimisticFuturology  6 years ago +4

    Want to learn more about the Technological Revolution? Watch our playlist here: th-cam.com/video/ENWsoWjzJTQ/w-d-xo.html
    - ALSO - Become a TH-cam member for many exclusive perks from exclusive posts, bonus content, shoutouts and more! subscribe.futurology.earthone.io/member - AND - Join our Discord server for much better community discussions! subscribe.futurology.earthone.io/discord

    • @holdthetruthhostage
      @holdthetruthhostage 6 years ago

      I don't think it will take until 2040; most likely a startup company with 3D printing will accomplish this sooner.

    • @bhuvaneshs.k638
      @bhuvaneshs.k638 5 years ago

      Your channel is awesome... Please do a video on the TPU.

    • @safaulansari7782
      @safaulansari7782 3 years ago

      P

    • @darrenc2370
      @darrenc2370 1 year ago

      the music from 7:46 onwards, anyone else getting Serial Experiments Lain vibes there?
      Edit:
      link for the uninitiated th-cam.com/video/iOVlx4PRxZE/w-d-xo.htmlsi=optCloI6NAEgjj9w&t=248

  • @PauLWaFFleZ
    @PauLWaFFleZ 6 years ago +146

    Please PLEASE do NOT stop this series on Computing.

  • @kayrosis5523
    @kayrosis5523 6 years ago +89

    Graphics cards 100,000x as good as now? That simulation hypothesis might be onto something

    • @gs-nq6mw
      @gs-nq6mw 5 years ago +3

      Nope, Moore's law is expected to end by 2020 by most specialists, including Moore himself.

    • @projectjt3149
      @projectjt3149 4 years ago +2

      g s how about Tensor Cores?

    • @thewalnutwoodworker6136
      @thewalnutwoodworker6136 3 years ago +1

      We are down to 2nm as of 2021; 1nm will be the end of silicon. We might be able to push it farther with other atoms to go sub-nm. We will probably make major advancements in architecture before we go quantum. For example, x86 is bloated af; RISC is what we need past 1nm. AMD is making major pushes in architecture such as chiplets/MCM. As of now the RDNA3 leaks are showing over 2x performance on rasterization.

    • @spacevspitch4028
      @spacevspitch4028 2 years ago +1

      @@gs-nq6mw Yowza. 2 years ago and still going.

    • @player111q7
      @player111q7 1 year ago

      No it's not; it's 2023 and it's still valid.

  • @PauLWaFFleZ
    @PauLWaFFleZ 6 years ago +49

    Bro your videos are simply AMAZING. Your presentation completely pulls you in and makes you wish the video never stops. Makes me more excited to be going to school for Computer Engineering. Keep at it bro...

    • @OptimisticFuturology
      @OptimisticFuturology  6 years ago +5

      Thank you for watching :)!

    • @projectjt3149
      @projectjt3149 5 years ago

      Oh it didn't stop for me! I am about to write a research abstract in college discussing GPU computing and this video was a HUGE help!

  • @L2Xenta
    @L2Xenta 6 years ago +7

    +1 sub for showing off the ambition of Star Citizen, a game project that takes a long time but challenges many of the industry's limits.

  • @annoloki
    @annoloki 6 years ago +15

    Do check out IBM's TrueNorth architecture, which'll put an end to all this GPU-for-AI stuff, as TrueNorth doesn't run code to simulate neurons: it provides silicon-based neurons on a chip. They are clockless, parallel, programmable, run faster than instruction-based neural nets, but use an absolutely minute amount of power to do the same thing. Neurons in silicon rather than in software is what will really revolutionise AI.

    • @OptimisticFuturology
      @OptimisticFuturology  6 years ago +6

      Thanks for watching! I have a video on neuromorphic computing coming soon!

    • @walter0bz
      @walter0bz 6 years ago

      I hope AI solutions keep being programmable. The same thing happened with graphics: we went from pure software, to fixed-function hardware accelerators, to programmable hardware that eventually generalised into a fully programmable parallel processor. The reason is the variety in algorithms (e.g. convolutions, capsules, various compression schemes...); I would prefer to see this variety increase. We'd just see chips that are more dataflow/low-precision oriented for AI (rather than GPUs, whose datapaths and precision grew around the demands of texturing and geometry).

    • @annoloki
      @annoloki 6 years ago

      Fantastic. If I can recommend something well worth looking at in that realm, which you may wish to incorporate bits of into your video (if you've not come across it already): an explanation of the general-purpose cortical column, which is slightly different from deep learning neural nets. It uses SDR (sparse distributed representation) to act as memory, plus pattern prediction that branches out to predict many different possible futures, allowing pruning of branches that didn't occur once data comes in. It's what powers mammalian (including our) brains, and is being put to use for things from recognising when faults will occur on computer systems to predicting power generation requirements on a grid. Numenta seem to be a/the leader in this work. It might be a bit out of scope if you're more on the [near-]consumer tech side, but it's super interesting and will play a big role in our futures, so I'll just give you this one video link as it contains some good visual explanations, and will spare you further recommendations unless they're helpful/welcome rather than redundant. Cheers for replying :-)
      th-cam.com/video/iNMbsvK8Q8Y/w-d-xo.html

    • @annoloki
      @annoloki 6 years ago +1

      walter0bz - TrueNorth is programmable, but not in the same way as a general-purpose processor, because it's clockless, asynchronous, very highly parallel, etc. It's more like electrical wiring: like having a load of gates, capacitors, and transistors all on silicon, where "programming" it uses an FPGA to wire them up into your own circuits. It's more like a physical brain, where you choose how neurons are connected to each other and how they behave, than like a CPU. But it can do the job of a rack of servers in something that will fit in a phone and barely touch your battery. I'll leave it there as SP's doing a video on the subject and he'll no doubt do a better job covering it. Oh, just to say: I wouldn't worry about limitations of hardware when it comes to AI; it's US who are going to be falling behind, especially now that Google are already using AIs to make AIs better than people can make them.

    • @walter0bz
      @walter0bz 6 years ago

      Programmability is a sliding scale. 'Fixed-function' GPUs are configurable and can do different effects with layering and ordering according to a 'program', but fine-grained programmability in shaders was superior. The device I hope 'wins' would be the grid of RISC cores with inter-core messaging (this can be clockless, highly parallel...), just with the right custom instructions (e.g. low-precision dot products) to handle AI workloads.

  • @googledev566
    @googledev566 5 years ago +1

    *_Keep creating such crucial and informative videos_*

  • @thejointcoach
    @thejointcoach 6 years ago +3

    Please do a video on quantum computing!! I love your channel and I think you deserve way more subscribers, I'll do my best to spread your name

  • @djsvideodiarys
    @djsvideodiarys 6 years ago +51

    Can't wait for QPU.

    • @OptimisticFuturology
      @OptimisticFuturology  6 years ago +32

      But could it run Crysis ;)

    • @djsvideodiarys
      @djsvideodiarys 6 years ago +3

      Hahahaha

    • @djsvideodiarys
      @djsvideodiarys 6 years ago +10

      Haha Hopefully it will solve Crisis

    • @ActualGenius
      @ActualGenius 6 years ago +3

      Feel like I was processing in CPU terms my whole life and instantaneously entered QPU age when I started watching this channel.

    • @TheDanm22
      @TheDanm22 6 years ago +1

      Nothing will run Crysis.
      @@OptimisticFuturology

  • @rahmanash9856
    @rahmanash9856 6 years ago +1

    Awesome as always... waiting for quantum and other types of computing, graphene applications, AI, and so much more.

  • @tomojeetchakraborty5459
    @tomojeetchakraborty5459 6 years ago +8

    I am your greatest fan, sir. I am very obliged for the knowledge you provide us, and for making us aware of modern trends.
    💐💐💐💐💐💐

  • @rawding1976
    @rawding1976 6 years ago +8

    One of my favorite channels!!! This channel is gonna explode with subscribers very soon! Watch & see! Great job as always!

  • @The_Masked_Frenchman
    @The_Masked_Frenchman 6 years ago

    Watching these is reinvigorating my love for technology and makes me want to go back to school for computer engineering

  • @karehaqt
    @karehaqt 5 years ago

    Just discovered your channel via this series and you gained my sub, great videos so far.

  • @Slayer3915
    @Slayer3915 6 years ago

    Apparently it's hard for some people, but I for one appreciate your spoken words per minute.

  • @NietJeffrey
    @NietJeffrey 6 years ago

    I wish you every bit of youtube fame coming to you. You have some great content!

  • @dangdiggity9916
    @dangdiggity9916 5 years ago +1

    One thing I'm wondering about for "AI" in gaming is whether they can optimize ray tracing for certain games, so that when you open the game it has already figured out where on the map it's used. It would basically be learning something before the user does it for the first time (but of course probably still improving).

  • @madsgrand
    @madsgrand 6 years ago +4

    Please change the ColdFusion-inspired intro; it's just too close for comfort. On the content side your videos are so much better!

  • @james_gemma
    @james_gemma 6 years ago +3

    Your channel should have way more views and subscribers. I guess it's only a matter of time before everyone discovers your excellent informative tech videos.

    • @cdreid99999
      @cdreid99999 6 years ago

      Funny, the number of comments mirroring this. In a row. But I'm sure you're not sockpuppets or bots...

    • @originproductions6120
      @originproductions6120 5 years ago

      He's completely ripping off ColdFusion's intro.

  • @meowmeowmoogabenrules4854
    @meowmeowmoogabenrules4854 5 years ago

    Why is YouTube barely showing me this? Amazing work, man.

  • @benyeo7930
    @benyeo7930 5 years ago

    I love all your videos: great content that is educational and serves as a good primer for the uninitiated. Two issues with all your videos: the voice of the commentator is way too low and flat (no fluctuation in tone at all; it drones on and on), and the speed of narration is way too rapid. Good thing there are subtitles to follow, which solves the issue for the super-interested audience. However, even scanning the subtitles requires full and complete attention, which reinforces the point that the speed and tone issues are real!

  • @bommaritohawaii
    @bommaritohawaii 6 years ago +9

    Great job!

  • @edwardbrownstien8741
    @edwardbrownstien8741 6 years ago

    Great channel. Love the content.

  • @Mordred478
    @Mordred478 6 years ago +2

    Great video, very informative. Is the implication here that Dell and other companies will soon start offering PCs with GPUs instead of CPUs?

    • @jwadaow
      @jwadaow 6 years ago

      No

    • @mike288190
      @mike288190 6 years ago

      I believe you would still need a CPU.

  • @PabloGonzalezVargas
    @PabloGonzalezVargas 6 years ago

    Wow!!! I'm a big fan of your channel; impressive, eloquent work *

  • @marymcreynolds8355
    @marymcreynolds8355 6 years ago +6

    Star Citizen... quite a peek at the latest peak.

  • @usertogo
    @usertogo 6 years ago

    Nice. Now if somebody has a graphics accelerator, how does one enable the cores to be easily used by the operating system and applications?

  • @wolfisraging
    @wolfisraging 6 years ago +2

    Great job, make more

  • @daniel_960_
    @daniel_960_ 6 years ago

    But what do the cores in Apple's mobile graphics mean? The A11 and A12 have only 3 or 4 graphics cores but are still really powerful. Previous generations had more cores as far as I know.

  • @ELECTR0HERMIT
    @ELECTR0HERMIT 5 years ago

    excellent job.

  • @anshulsharma9424
    @anshulsharma9424 6 years ago

    Keep up the good work

  • @pegasusted2504
    @pegasusted2504 6 years ago

    Good stuff all round :~)

  • @barney9008
    @barney9008 6 years ago +1

    Nice delivery; reminds me of ColdFusion.

  • @winkipinky
    @winkipinky 6 years ago

    Fantastic .... 😁

  • @dinozaurpickupline4221
    @dinozaurpickupline4221 4 years ago +1

    An AI could be used to map tonal changes of different structures and things, and the outcome could be used to decrease load times in games. How cool would it be if a computer or piece of software already knew the reflection details of every object, texture, color variation, size & scaling, using separate sets of data, and created further smaller tests? This would increase performance drastically.

  • @srungarapusaikrishna5583
    @srungarapusaikrishna5583 4 years ago +1

    That my friend felt like a rocket science class!!!😵🤒

  • @supremepartydude
    @supremepartydude 6 years ago +1

    As a computer enthusiast of 30 years: you did a great job.

  • @DANTHETUBEMAN
    @DANTHETUBEMAN 6 years ago +6

    As soon as computers get consciousness, they will no longer serve humanity.

    • @flynnkay
      @flynnkay 6 years ago

      OK then, I'll unplug that bitch haha

    • @cdreid99999
      @cdreid99999 6 years ago +1

      You don't understand how computers work then. And we are nowhere close to an AI. What the hype wagon is calling AI now is what we used to call expert systems, i.e. standard programming/algorithms. We simply don't have the processing power yet. We might be able to build a simulated human brain, but it would be built on FPGA-like neural network boards, be the size of a Walmart, and probably require a power plant to run.

    • @TheDanm22
      @TheDanm22 6 years ago

      @@cdreid99999 AI is going under a new dynamic now. One AI is programmed to evaluate and judge; another AI is programmed to design and invent. These are the judge and the inventor AIs: two AIs working together to improve each other's processes. It won't be the size of a Walmart; it's going to evolve exponentially.

    • @viniciusbueno2160
      @viniciusbueno2160 5 years ago

      And now, 1 or 2 months ago, IBM released the first quantum computer outside the lab!!!! Now things can move faster.

  • @PauLWaFFleZ
    @PauLWaFFleZ 6 years ago

    When can we plan on seeing some of the videos on the AI series?

    • @OptimisticFuturology
      @OptimisticFuturology  6 years ago

      Starting May!

    • @PauLWaFFleZ
      @PauLWaFFleZ 6 years ago

      Ah come on man, I can't wait that long... You gotta give me the lineup of what else is coming out until then...

  • @chuckbuckets1
    @chuckbuckets1 6 years ago

    AI and protein folding will be one of the most profound paradigm shifts in humanity's history.

  • @system2072
    @system2072 6 years ago +1

    Great videos, man... but can you please slow down while explaining? You speak fast, which is sometimes very hard to understand.

  • @HeadStronger-HS
    @HeadStronger-HS 6 years ago

    This blew my mind... major advances in GPUs.

  • @borisgotov9838
    @borisgotov9838 6 years ago +1

    Give a simple piece of code, a little bit more complex than a hello-world program. Something like a rolling ball or a rotating square...
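A minimal sketch of what such a rotating-square demo boils down to: the 2D rotation math, in plain Python, printing the rotated corners rather than drawing them (the drawing layer is left out, and the function name `rotate` is illustrative, not any real graphics API).

```python
import math

def rotate(point, angle_rad):
    """Rotate a 2D point about the origin by angle_rad."""
    x, y = point
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (x * c - y * s, x * s + y * c)

# Corners of a unit square centred on the origin.
square = [(-1.0, -1.0), (1.0, -1.0), (1.0, 1.0), (-1.0, 1.0)]

# Rotate by 90 degrees: each corner maps onto the next one.
turned = [rotate(p, math.pi / 2) for p in square]
print([(round(x), round(y)) for x, y in turned])
# -> [(1, -1), (1, 1), (-1, 1), (-1, -1)]
```

A real demo would run this every frame with a slowly increasing angle and hand the points to a renderer; on a GPU, the same rotation is applied to every vertex in parallel.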

  • @zahanjavaid
    @zahanjavaid 6 years ago +1

    I seriously liked your videos, but I'm not in a position to understand most of them.
    LOL 😂😂😂

  • @govinds3951
    @govinds3951 6 years ago

    Jheeez good work

  • @Army2willis
    @Army2willis 6 years ago

    I see you popped in some SCU to show just how crazy graphics are today. You know you like SC

  • @meatofpeach
    @meatofpeach 6 years ago +1

    Incredible TH-cam channel. Wow. Keep it up

  • @hemendrapratapsingh4156
    @hemendrapratapsingh4156 3 years ago

    So I decided to watch the complete ad today. But it skipped automatically. 🤐

  • @bassbs
    @bassbs 6 years ago

    Did you, SP, do SU?

  • @TheDanm22
    @TheDanm22 6 years ago

    In the first minute... that's called Moore's law. It deserves the reference.

  • @sarmadnajim4839
    @sarmadnajim4839 6 years ago

    Wonderful documentary: direct and clear, smartly done 👍🏻

  • @wandrinsheep
    @wandrinsheep 6 years ago

    Oho, a Star Citizen fan I see. Awesome.

  • @DarthRaver-og7qq
    @DarthRaver-og7qq 6 years ago

    Damn, think about what a laptop or desktop gaming rig will look like in, say, 50 years?? Could you imagine having something portable the size of a Nintendo Switch, yet as powerful as a full desktop gaming rig today with the best of everything? That's crazy lol. I hope I'm still alive by then. Everything in the world looks like it's actually heading toward a "Blade Runner" type civilization lol.

  • @nad1901
    @nad1901 5 years ago +1

    I still don't know how people get the idea of putting annoying background music on informative videos. And since when do we have music playing while we learn in school :/

  • @tonytony7225
    @tonytony7225 6 years ago

    You gotta talk about AMD's Threadripper and its AI technology.

  • @infinitworld7106
    @infinitworld7106 6 years ago +8

    MORE CONTENT!!!

  • @MyWatchIsEnded
    @MyWatchIsEnded 6 years ago +2

    But can the GPU from 2040 run Crysis?

    • @ahuttee
      @ahuttee 6 years ago +1

      Might be possible

  • @brushhog7089
    @brushhog7089 6 years ago

    Nice horn in the background, but really distracting. I guess I'll go somewhere else, as I was here for information.

  • @Keiktu
    @Keiktu 6 years ago

    Insta-subscribed

  • @Chrisimplayer
    @Chrisimplayer 6 years ago

    To me it's highly debatable whether Star Citizen is still in development.

  • @snoogboonin
    @snoogboonin 6 years ago

    Your vids are fucking unreal dude. Subbed.

  • @aoeu256
    @aoeu256 6 years ago

    GPU voice recognition? Never heard of it.

  • @kokomanation
    @kokomanation 6 years ago

    I feel that the CPU and GPU are getting merged together.

    • @rpzcsonli
      @rpzcsonli 6 years ago

      A CPU can calculate everything; a GPU needs special cores to be better than a CPU, so if they added cores to calculate everything, the GPU would become a CPU. It won't become one, because it would be too big, expensive, and power-hungry. They could make an AiPU (AI processing unit) and put it in a card, an RTPU (ray tracing processing unit) and put it in a card, and so on, and let you add 20 cards to your computer; then we would have true "performance" in everything, but it won't happen because it's stupid. The CPU will be the "CENTRAL Processing Unit" until humanity ends, and the GPU will just be its slave doing all the work; the CPU then puts everything together and makes it pleasant for you to interact with.
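The division of labour this reply describes can be sketched in plain Python: a sequential "CPU" part orchestrates, and the same small kernel is fanned out over many workers standing in for GPU cores. This is an analogy only (processes are nothing like GPU lanes), and the name `shade` is purely illustrative.

```python
from concurrent.futures import ProcessPoolExecutor

# Toy "shader": one small uniform computation applied to every element,
# the kind of work a GPU runs across thousands of cores at once.
def shade(x: int) -> int:
    return x * x + 1

def main() -> None:
    data = list(range(8))

    # "CPU role": general-purpose, sequential orchestration.
    serial = [shade(x) for x in data]

    # "GPU role" (simulated with processes): the same kernel mapped
    # over many workers, each handling part of the data.
    with ProcessPoolExecutor(max_workers=4) as pool:
        parallel = list(pool.map(shade, data))

    assert serial == parallel  # same answer; only the execution model differs
    print(parallel)  # -> [1, 2, 5, 10, 17, 26, 37, 50]

if __name__ == "__main__":
    main()
```

The point of the sketch: the result is identical either way; the GPU wins only because the per-element work is independent and can run on thousands of lanes at once, while the CPU still has to set the whole thing up.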

  • @fuzzylumpkin8030
    @fuzzylumpkin8030 6 years ago

    Yeah that’s cool but at what cost to gaming

  • @vyor8837
    @vyor8837 6 years ago

    Volta isn't on a true 12nm node.

  • @szirbektamas2571
    @szirbektamas2571 6 years ago

    After this video I feel so stupid.

  • @Johnwick99099
    @Johnwick99099 6 years ago

    i love you man...

  • @harrym8556
    @harrym8556 6 years ago

    "1^14 FLOPS in performance..." Dude, what are you talking about??
    You know that 1^14 = 1, right?
    Did you mean to say 10^14 FLOPS?
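The commenter's correction checks out; a quick sanity check (assuming the narration indeed meant 10^14 FLOPS, i.e. 100 teraflops):

```python
# 1**14 is just 1 -- not a meaningful FLOPS figure at all.
print(1**14)          # -> 1

# 10**14 FLOPS is 100 teraflops, the kind of figure a GPU spec would quote.
print(10**14 / 1e12)  # -> 100.0 (teraflops)
```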

  • @m_sedziwoj
    @m_sedziwoj 6 years ago

    From last week: look at Google's TPU 3, 100 petaflops for DL; that's 1000x more than an Nvidia Titan V.

  • @ekaterinavalinakova2643
    @ekaterinavalinakova2643 6 years ago +2

    1.13 quintillion FLOPS gaming system by 2040: 100,000 x 11.3 teraflops.

    • @xsuploader
      @xsuploader 6 years ago

      Not quite; he said in the 2040s, not 2040. At the current rate of 1.5x per year it would take log(100000)/log(1.5) years, approximately 28, putting the year at 2046, right around the 2045 singularity proposed by Kurzweil.
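The arithmetic in this reply can be checked directly; a quick sketch (assuming a steady 1.5x yearly improvement on an 11.3-teraflop baseline, as the thread does):

```python
import math

baseline_tflops = 11.3   # roughly a 2018 flagship GPU
target_factor = 100_000  # the "100,000x" improvement discussed above
yearly_growth = 1.5      # assumed improvement per year

# Years n such that 1.5**n reaches 100,000.
years = math.log(target_factor) / math.log(yearly_growth)
print(round(years))  # -> 28, i.e. around 2046 starting from 2018

# Resulting performance in exaflops: 11.3 TFLOPS x 100,000.
print(baseline_tflops * target_factor / 1e6)  # -> 1.13
```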

  • @Gollywog
    @Gollywog 6 years ago +7

    You talk too fast. I love the info, but it needs my full attention (it's not something I can listen to in the background) because you say everything so fast.

    • @735Secure
      @735Secure 6 years ago

      StiX, it's because he's just reading the stuff. If you have a technical background and are a scientist or an engineer, you don't just put on a show. He is all about the show. Decent information, but I don't trust the information he provides.

    • @curiosity1865
      @curiosity1865 6 years ago +1

      Turn down the speed of the video.

    • @originproductions6120
      @originproductions6120 5 years ago

      @@LearningRaven95 Stop trying to sound smart. I'm sure you wouldn't have a problem with 10x speed as well, right, because you're such an intellectual. It's annoying because my full attention has to be on the video, and if I'm playing Halo this guy talks too fast for that. Stop trying to show off and be honest with yourself. Can you understand him at 2x speed? Probably, but that misses the whole point of the video. That point is to absorb the information he's giving you and think about it, and you can't do all of that at 2x speed even if you're Albert fucking Einstein.

    • @originproductions6120
      @originproductions6120 5 years ago

      @@LearningRaven95 Also, I just watched it at 2x speed and now know that you're just bullshitting, if 2x speed isn't even an ideal speed. Stfu, no one cares about you trying to sound smart.

  • @jackharpe3rd233
    @jackharpe3rd233 5 years ago

    I couldn't care less about AI or a real-life HAL or Skynet. Not because I'm scared, but because all I truly want is more pixels and more polygons being rendered for my video games. Unless Moore's law affects that, then okay, let's use ray tracing and other visual tricks to fool our eyes into a better graphical future. I also want great storytelling, though thanks to the rise of SJWs, e-sports, Illumination Studios, and modern activism, the world of computer-generated imagery has been told that story doesn't matter anymore! Please don't let us down, Sony!

  • @SumWanYo
    @SumWanYo 6 years ago

    Why is the Nvidia CEO so nervous?

    • @rpzcsonli
      @rpzcsonli 6 years ago

      He doesn't know how to burn AMD to the ground so he can have all the money, and maybe there are some monopoly issues and anti-competitive practices that he pays governments not to dismantle Nvidia over. Nvidia single-handedly slowed progress for GPUs by making all the bullshit technology and buying competitors and other technologies to use in their cards only, so theirs will be "better", then stalling for 2-5 years until AMD catches on, then getting another "revolutionary" Nvidia-only technology, "helping" developers by giving them the tech and money, destroying AMD performance, waiting again until AMD catches on, and repeating. I'm not an AMD fanboy, but I hate Nvidia with everything I have because they did and do everything I said; google a little and you'll be enlightened by what Nvidia has done in the past 20 years.

  • @mohamedsalahoshi1486
    @mohamedsalahoshi1486 6 years ago

    *J* *U* *S* *T* *A* *M* *A* *Z* *I* *N* *G*

  • @Lightning9060
    @Lightning9060 6 years ago

    At this point I'm still waiting for the GTX 1180/2080 😂

  • @platin2148
    @platin2148 5 years ago

    The stacking will also make Nvidia obsolete, as both Intel and AMD can make pretty capable GPUs, so there's no need to work with Nvidia.
    So I suspect they either go server or bet on Arm; if they could make an Arm1000 chip that is tightly integrated with their GPUs, they would basically have won.
    And his saying that he made a special chip for AI is completely wrong: he made matrix calculations faster, not AI. And it could be even faster with FPGAs.

  • @BakiWho
    @BakiWho 6 years ago

    You sound like the Hardy Boys in South Park :) I have a raging clue.

  • @Julia-hk9jp
    @Julia-hk9jp 6 years ago +2

    To sum it up, this video is just an Nvidia commercial...

  • @strangevideos3048
    @strangevideos3048 6 years ago

    We live in the Matrix!

  • @kapilbsingh
    @kapilbsingh 6 years ago

    Whatever they develop, it will find a place in landfills.

  • @CCRob720
    @CCRob720 3 years ago

    What could we do with the power of a billion GPUs...

  • @Drixidamus
    @Drixidamus 2 years ago +2

    Your channel is criminally unsubscribed

  • @zalanta7
    @zalanta7 6 years ago

    This video is 4K.

  • @vladimirtchuiev2218
    @vladimirtchuiev2218 6 years ago

    And now people are starting to use GPUs for cryptocurrency mining, driving GPU prices up...

  • @itsotechai
    @itsotechai 4 months ago

    ❤❤❤ hi all very

  • @Goldnr
    @Goldnr 6 years ago +1

    Nvidia's CUDA - showing an AMD card...

  • @perspgold8945
    @perspgold8945 3 years ago

    Not sure if it was the speed of speaking or the content, but this video was disjointed.

  • @cameronh3260
    @cameronh3260 6 years ago

    But can it run Minecraft?

    • @rpzcsonli
      @rpzcsonli 6 years ago

      With 400 mods, yeah, but with 600 mods... I don't think so.

  • @MegaFlemo
    @MegaFlemo 6 years ago

    WOW

  • @albertgerard4639
    @albertgerard4639 6 years ago

    Moore's law never took bitcoin into account... ouch

  • @dr.zoidberg8666
    @dr.zoidberg8666 6 years ago

    We're creeping closer & closer every day to machines with the processing power & storage capacity to simulate human minds.
    Once that's achieved, all we need to do is figure out how to transfer someone over without breaking their stream of consciousness in the process, & we'll have a reliable path forward to radical life extension.

    • @raunak1147
      @raunak1147 6 years ago

      Dr. Zoidberg By 2021, or even before that, something revolutionary like graphene/3D processors will happen.

    • @rpzcsonli
      @rpzcsonli 6 years ago

      @@raunak1147 You are dreaming too big, just like the people in 1990 who thought we would have flying cars... maybe another 30 years until then.

  • @gertjanvandermeij4265
    @gertjanvandermeij4265 6 years ago

    Nvidia is just a big bully!

  • @tomislavnikolic5778
    @tomislavnikolic5778 6 years ago

    Holy shit

  • @projectjt3149
    @projectjt3149 6 months ago

    Even after all the recent success with generative #AI and #NVIDIA, no one seems to be watching this video!

  • @madscientistshusta
    @madscientistshusta 6 years ago +1

    Excuse me, Star Citizen is a joke.

  • @utubekullanicisi
    @utubekullanicisi 4 years ago

    Too fast.

  • @peefwellington8794
    @peefwellington8794 4 years ago

    Boy, the new gen of GPUs in 2020 is gonna be incredible.