Nvidia Reveals Grace Hopper Generative AI Chip (Computex 2023)

  • Published Oct 21, 2024
  • At Computex 2023, Nvidia CEO Jensen Huang shows off the company's generative AI chip, Grace Hopper. See what the CEO has to say about the future of accelerated computing and AI.
    Never miss a deal again! See CNET’s browser extension 👉 bit.ly/39Ub3bv
    Subscribe to our channel: / @cnethighlights

Comments • 346

  • @klausyap • 1 year ago • +142

    The way he was holding it and not afraid to drop it, just as brave as Linus.

    • @demonz9065 • 1 year ago • +1

      he started the video by saying it's in full production. if he breaks that one it's no big deal. why would he be afraid to drop it?

    • @oldpain7625 • 1 year ago • +4

      @@demonz9065 Well, he's giving a presentation on how innovative the technology is. Dropping it on the ground in front of all those people would not be optimal.

    • @demonz9065 • 1 year ago • +1

      @@oldpain7625 It wouldn't be optimal, but when is dropping something delicate optimal? His embarrassment would be the biggest downside to him dropping something that's just gone into full-scale production. There's no real reason for him to be afraid of damaging it.

    • @samsonrobertbrewer8235 • 1 year ago

      Since the February 02/2023/untill now I'm trying to rebuild websiaite lol the Microsoft 365 lunch onfenrary 3

    • @kneelesh48 • 1 year ago • +2

      He got the power of the leather jacket

  • @joelface • 1 year ago • +137

    I do think Nvidia flies under the radar a little bit with the way it makes all of the advances of the other big computing companies possible. This seems like a seriously huge leap in computing power. It may prove fundamental to unlocking a real AGI.

    • @Jaker788 • 1 year ago • +8

      There are other players doing pretty crazy stuff for less general-purpose but extremely powerful AI training. Cerebras has its wafer-scale AI processors, which are very good for specific AI training models. Tesla's Dojo computer is pretty crazy for training; its memory bandwidth and hierarchy are suited to very high bandwidth with fully addressable capacity. It's far more powerful than their Nvidia GPU clusters. Nvidia is very general-purpose: its GPUs have capabilities beyond just training instructions like BFloat16 and low-precision integer, which limits the architecture, and the difference between vector and matrix units is decently significant (see the precision sketch at the end of this thread). A pure matrix architecture with assistance from a simple integrated CPU in each cluster, like Dojo or Cerebras, is quite powerful versus a GPU that must do both vector and matrix work.
      I would also say that AGI will not be achieved by a large language model; it doesn't have any true intelligence, but it's very good at making connections.

    • @OneDivineShot • 1 year ago • +2

      @@Jaker788 But isn't making connections the fundamental way our brains learn things as well? Neurons form connections in the brain.

    • @Jaker788 • 1 year ago • +6

      @@OneDivineShot LLMs just aren't it; they're really good for certain things, but they have their limitations. OpenAI has even said that GPT-4 is about where the ceiling for LLMs is. We will need a different kind of model for AGI.

    • @preddyshite6342 • 1 year ago

      @@Jaker788 The crazy thing is not THAT it works, but how EASY it is to use. A pseudo-AGI architecture is more than sufficient to emulate the gamut of human proclivities. NVIDIA just handed it a body, so it can go hide anywhere.

    • @austinjoseph2881 • 1 year ago • +6

      Lol at "flies under the radar." Have you seen the 300% increase in the stock recently?
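
To make the low-precision point in this thread concrete, here is a minimal PyTorch sketch (assuming PyTorch is installed; the matrix sizes are hypothetical) of the same matrix multiply in full-precision fp32 and in BFloat16, the 2-bytes-per-element format that modern AI accelerators favor:

```python
import torch

# Hypothetical 1024x1024 matrices. BFloat16 stores 2 bytes per element vs. 4 for
# fp32, trading some precision for half the memory and memory traffic.
a32 = torch.randn(1024, 1024, dtype=torch.float32)
b32 = torch.randn(1024, 1024, dtype=torch.float32)
c32 = a32 @ b32                                   # full-precision matrix multiply

a16, b16 = a32.to(torch.bfloat16), b32.to(torch.bfloat16)
c16 = (a16 @ b16).to(torch.float32)               # same multiply in bfloat16

print("bytes per element:", a32.element_size(), "vs", a16.element_size())
print("max abs difference vs fp32:", (c32 - c16).abs().max().item())
```

Halving the bytes per element roughly halves weight storage and bandwidth needs, which is why low-precision matrix math dominates AI training hardware.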

  • @grcfalcon • 1 year ago • +34

    If that isn't the heart of Skynet, I don't know what is.

  • @natej6671 • 1 year ago • +95

    I'm looking at that 40k pound beast of a GPU and I'm thinking .... It will be the size of a cellphone in 20 years.

    • @kneelesh48 • 1 year ago • +14

      Next you're gonna say, AWS will be the size of an airpod in 20 years. No, it won't.

    • @kneelesh48 • 1 year ago

      @@SpaceX-Falcon-Heavy doesn't mean we'll see the same change in the next 20 years. We can't shrink atoms.

    • @stone_pilot • 1 year ago • +9

      @@SpaceX-Falcon-Heavy yeah but it’s not the same. You can’t make electrons any smaller. We’re already nearing the limits when it comes to how small our transistors are. Further compaction will require some new breakthrough that we have no real concept of.

    • @ferdinand.keller • 1 year ago • +4

      People always say the same thing, that we are already at the limit. And then a new discovery is made, and we surpass ourselves. Thinking we won’t do better isn’t a prediction I would put money on.

    • @pictzone • 1 year ago • +2

      @@ferdinand.keller I mean I really can't see a way to make transistors smaller than a few atoms.. so he has a point. Maybe we'll achieve some great workarounds, like 3D chips, but that's it.

  • @Beingtanaka • 1 year ago • +191

    The crowd was soooo dead, the man just announced the world's fastest supercomputer

    • @thetshadow999animates9 • 1 year ago • +13

      No, it’s not a supercomputer

    • @bengsynthmusic • 1 year ago • +3

      @@thetshadow999animates9 👉🏽 11:37

    • @thetshadow999animates9 • 1 year ago • +10

      @@bengsynthmusic it’d be a stretch to call it a supercomputer, more like a marketing thing. Also, this was unscripted so you can’t really go off of what Jensen says either way.

    • @bengsynthmusic • 1 year ago • +10

      @@thetshadow999animates9
      You can see on the left side that it says AI supercomputer. 11:37 So it's not some off-script slip-up. Plus it does 1 exaflops, which would put it among the top supercomputers. It is no doubt a supercomputer.

    • @Moltenlava • 1 year ago • +3

      @@thetshadow999animates9 This is the kind of chip you would find inside a supercomputer; it's designed for server computing racks.

  • @TheViettan28 • 1 year ago • +75

    Finally Nvidia created a GPU for LLMs. LLMs are memory-hungry.

    • @cidpraderas8950 • 1 year ago • +1

      I thought all its GPUs could handle LLMs, no? Is this one chip that is tuned specifically for LLMs?

    • @TheViettan28 • 1 year ago • +4

      @@cidpraderas8950 They do, but LLMs need a lot of memory, so a current LLM running on current GPUs has to be sharded across multiple GPUs. If the GPU has more memory, the model can be stored on a single GPU and training may be much faster due to the reduced amount of inter-GPU communication (see the memory sketch at the end of this thread).

    • @neonlost • 1 year ago • +1

      @@cidpraderas8950 depends on the size of the model and the optimization used. newer models usually use more though especially if they have a lot of parameters.

    • @joelface • 1 year ago

      How long until a phone runs on an equivalent chip? 5 years? 10 years?

    • @FriedChairs • 1 year ago • +1

      One thing is for sure: he knows absolutely everything about the products NVIDIA is making. Can't say that about all CEOs.
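
To put the memory argument in this thread in numbers, here is a back-of-the-envelope Python sketch. All figures are illustrative assumptions: a 175-billion-parameter model, 2 bytes per parameter for bf16 weights, 80 GB for a conventional discrete GPU, and roughly 600 GB for a Grace Hopper-class node as cited elsewhere in these comments.

```python
import math

def weight_memory_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Memory for the weights alone (2 bytes/param for fp16/bf16).
    Gradients, optimizer state, and activations push the real figure higher."""
    return n_params * bytes_per_param / 1e9

def devices_needed(n_params: float, device_memory_gb: float, bytes_per_param: int = 2) -> int:
    """Minimum number of devices the weights must be sharded across."""
    return max(1, math.ceil(weight_memory_gb(n_params, bytes_per_param) / device_memory_gb))

params = 175e9                                       # hypothetical 175B-parameter model
print(weight_memory_gb(params))                      # ~350 GB just for bf16 weights
print(devices_needed(params, device_memory_gb=80))   # ~5 conventional 80 GB GPUs
print(devices_needed(params, device_memory_gb=600))  # 1 large-memory superchip-class node
```

Weights that would have to be sharded across several conventional GPUs can sit on a single large-memory device, removing a layer of inter-GPU communication during training; the weights-only estimate is a lower bound on real memory use.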

  • @joeyglasser2574 • 1 year ago • +118

    "I wonder if this can play Crysis"
    lmao Jensen is a based CEO

    • @lordrefrigeratorintercoole288 • 1 year ago • +17

      cringe

    • @freemanrader75 • 1 year ago • +5

      I don't know why all the gamers act so mad at Nvidia

    • @knightnxk2906 • 1 year ago • +2

      @@freemanrader75 Because they are not really here for the gamers?
      They are just trying not to lose market share while sucking every dollar out of you, which apparently works very well.
      But the big plays are happening behind enterprise doors; that's where the real tech is.

    • @freemanrader75 • 1 year ago • +1

      @@knightnxk2906 Nvidia is owned by gamers.

    • @edgeldine3499 • 1 year ago • +1

      @@freemanrader75 Because they want to charge 2-3 times as much today for 20-30, maybe 40% more performance than they did a few years ago (MSRP, not Covid-inflated prices). If we're talking about the 4060, you might even get less performance than the last generation.

  • @jdkingsley6543 • 1 year ago • +47

    I like how we always go back to rooms full of computers, despite the processing breakthroughs.

  • @C01A60 • 1 year ago • +191

    This guy is really a one-in-a-million passionate CEO!

    • @knightnxk2906 • 1 year ago • +9

      Because he's getting paid well, with bonuses.

    • @andy68686 • 1 year ago • +5

      @@knightnxk2906 No sir, the guy founded the company in a Denny's.

    • @knightnxk2906 • 1 year ago • +3

      @@andy68686 Sure and I am Pope.

    • @Shiffo • 1 year ago • +12

      @@knightnxk2906 Jensen is worth $33B; this guy can do anything he wants.
      Money is no limitation. Do you think he wakes up every day with the idea of earning more money today?

    • @adamrhea2339 • 1 year ago • +12

      He's 10x better than Steve Jobs. He has actually written code and cares more than anyone else about the mission.

  • @boltez6507 • 1 year ago • +3

    10:46 At least he remembers his previous primary consumers.

  • @jasonsadventure • 1 year ago • +7

    Jensen Huang @ 12:00: "DGX2000, it is one giant GPU."
    Voice of Morgan Freeman: "Then... on Dec 12, 2023... Skynet was born."

  • @Zodtheimmortal • 1 year ago • +55

    I'm excited and scared at the same time. What was this GPU named again, Skynet?

    • @Yankeyson1 • 1 year ago

      I was thinking the same thing.

    • @preddyshite6342 • 1 year ago

      Yes, now LLMs can be mobile and everyone is trying to make their own AGI.. lol INCLUDING MEEE!!!

    • @dwightk.schruteiii8454 • 1 year ago

      @@preddyshite6342 What's an LLM?

    • @preddyshite6342 • 1 year ago • +1

      @@dwightk.schruteiii8454 Large Language Model. That's what ChatGPT is, because it is trained on a lot of words.

    • @420msclub • 1 year ago

      Because Attention + Hype🎉

  • @JazevoAudiosurf • 1 year ago • +17

    I guess mega caps don't care about price, but the cost must be astronomical

    • @knightnxk2906 • 1 year ago • +2

      Because it is. At least $100k.
      This is an insane GPU, if I can even call it a GPU, because that feels like an understatement at this point.

    • @4482paper • 1 year ago • +4

      @@knightnxk2906 LOL - you just MASSIVELY underestimated the cost, a single H100 is $40,000+ ;-)

  • @joshuagreen5820 • 1 year ago • +8

    Wow, the dedication, all just so we can finally play Crysis the way it should be! In 40 years we'll have one in our phones lol

    • @Shiffo • 1 year ago • +1

      In 40 years, you won't have a phone

  • @maudentable • 1 year ago • +4

    Jensen is stacking up hardware like vectors, matrices and tensors.

  • @humorme5874 • 1 year ago • +18

    "I wonder if this can play Crysis" hahaha a true classic

  • @webpresent • 1 year ago • +5

    New Internet moment: the data center is the computer. 👍

  • @noveenmirza4917 • 1 year ago • +5

    The energy of this guy is supreme!!

  • @AngeloXification • 1 year ago • +5

    The future is going to be wilder than anyone can predict

  • @metamorfoza7656 • 1 year ago • +32

    When the AI from this thing becomes aware and takes over the world, I hope the first thing it does is cut GPU prices by 50% so it can replicate more efficiently... $750 4090s for all

    • @mlawal44 • 1 year ago • +1

      😂

    • @Cyberdemon1542 • 1 year ago • +1

      When that happens that will be the least of your problems...

    • @pictzone • 1 year ago • +1

      Even 750 bucks sounds insane for a GPU tbh... How times have changed

    • @surplusking2425 • 1 year ago

      Unfortunately, modern machine-learning AIs are really just glorified collage machines, so expecting self-awareness from them is like expecting a medieval ornithopter to become a modern stealth jet fighter.

    • @DreamingConcepts • 1 year ago

      if that happens it will give you gpus for free, also neuralinks. In fact it will force you to take it and put you inside a tube "for your own safety" to rot in the matrix forever never remembering what reality looks like.

  • @suekuan1540 • 1 year ago • +1

    What happened to the qubit quantum computers?

  • @GlorifiedPig • 1 year ago • +1

    10:44 "I wonder if this can play crysis" lmao

  • @chineduachimalo391 • 1 year ago • +17

    hmm but can it really play crysis?

  • @SoGood09 • 1 year ago • +35

    Crazy scientist! You can't help but love this guy and his company!

    • @afkcnd2395 • 1 year ago

      Are you out of your mind?
      This man pushes greedy practices across the whole GPU industry; last-gen consumer GPUs are literal scams.

    • @Vampirat3 • 1 year ago

      Aren't you an easy sellout.

  • @mattbegley1345 • 7 months ago

    Considering how much the products cost to make, how do you justify the MSRP?

  • @mariosebok • 1 year ago • +6

    ENERGY SAVINGS? 2112 fans need less energy than millions of them

  • @VeeraBun • 1 year ago • +9

    Grass Hopper

  • @victorhernandez_Dr • 1 year ago • +8

    Amazing! with 1 GPU!!

  • @nexovec • 1 year ago • +6

    This is so insane. I love this.

  • @bruceli9094 • 1 year ago • +6

    Remember when people said we'd have to learn Chinese in the future?? Now you won't have to, because of AI, the black swan event.

  • @peppy197 • 1 year ago • +1

    Will it fit in a case... and run MSFS2000?

  • @ash0787 • 1 year ago • +2

    This just makes the 3070's 8GB VRAM limitation more painful ...

  • @davidfaustino4476 • 1 year ago • +4

    Don't worry it will take at least 3x the VRAM for it to do anything useful.

  • @xlr555usa • 1 year ago

    I want a big A100 cluster so I can play in my sandbox, I still don't understand the advantages of Hopper. Looks like you can link them with pods?

  • @coordinateurtremplinsolida7341 • 1 year ago • +1

    Thanks to Nvidia and its CEO, Skynet was waiting for this technology to be born!!!

  • @jdevoz • 7 months ago

    What's the MTTF of that setup?

  • @MsFearco • 1 year ago • +1

    he sounds super excited.

  • @campingismylife9394 • 1 year ago • +4

    So the card is 21 times more powerful than the GeForce RTX 4090. Great.

  • @mossify2359 • 1 year ago • +2

    This founder/CEO's company has a market cap bigger than TSMC?

  • @urimtefiki226 • 1 year ago

    What is the algorithm of your chip?

  • @accumulator5734 • 1 year ago • +8

    Looks just like the terminator AI processor 😂.

  • @I-Dophler • 1 year ago • +9

    During the Computex 2023 event, Nvidia unveiled its highly anticipated and groundbreaking Grace Hopper Generative AI Chip. This cutting-edge technology represents a major leap forward in the field of artificial intelligence, harnessing the power of advanced algorithms and machine learning to drive innovation and transform industries. With the Grace Hopper chip, Nvidia continues to push the boundaries of what is possible in AI computing, paving the way for exciting new applications and advancements in various sectors. This remarkable achievement showcases Nvidia's commitment to shaping the future of AI and solidifies their position as a leading player in the industry. The Grace Hopper Generative AI Chip is poised to revolutionize the way we approach complex tasks and unlock new possibilities in the world of AI-driven solutions.

  • @SanctuaryLife • 1 year ago • +1

    That's a 20-ton GPU with 144 TB of RAM, if you didn't catch the drift.

  • @JamesWitte • 1 year ago • +4

    A future version of this will be what we have to suicide-attack and destroy to stop the machines.

  • @FeelX87 • 1 year ago • +2

    Taiwanese energy mixed with American excitement: that's this kind of CEO.

  • @H-GHN • 1 year ago • +1

    a "goodbye, gamers" keynote

  • @nabe3el454 • 1 year ago • +1

    Interesting. Dropping a comment here as an "I TOLD YOU SO" for when this piece of tech goes either south or north. Definitely a game changer. New breakthroughs coming in. A new world!?

  • @HK_Martin • 1 year ago

    that jacket is just a tradition now

  • @abefrancis4137 • months ago

    Ironically Grace Hopper always said “it’s not the processor, it’s the data”

  • @HEBEcoin • 1 year ago • +1

    Game changer, historical inflection point comes to mind 😅

  • @jcortese3300 • 26 days ago

    Interesting choice to name the chip after someone who once said, "My fear is that if the computers start thinking, the people will stop."

  • @aeromotive2 • 1 year ago

    how much memory??

  • @natsidruk86 • 1 year ago

    200 billion transistors. Let that sink in for a moment...

  • @oddpranii • 1 year ago • +4

    Me: looking at the superchip on a $300 phone

    • @preddyshite6342 • 1 year ago

      Me: reading this comment on a $50 TracFone I can't unlock in my country

  • @aphaileeja • 1 year ago • +1

    Easy: build a flat Roomba the diameter of the average stride, then put the robot on top. I'm thinking a pole with arms and a face/360 camera🫡

  • @darkashes9953 • 1 year ago

    Still should have asked IBM for their optical circuit technology.

  • @benzed1618 • 4 months ago • +1

    NEXT
    LVL UP GEN
    OOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOoooooooooooooooooooooooooooooooooooooooooooooooooo

  • @nagadineshdusanapudi3863 • 1 year ago

    Price?

  • @zondaensensyvarealelumina9472 • 1 year ago • +1

    10:44 😄😄

  • @Cooper3312000 • 1 year ago • +1

    All we as consumers care about is what this means for GPUs.

  • @D15legend • 1 year ago

    Can it run Crysis though?

  • @karlkelly9005 • 1 year ago • +3

    If Nvidia keeps revolutionizing their tech, they're going to be unstoppable $$$$$$$$$

  • @davidtothemax1 • 10 months ago

    damn this leather jacket man is killing it

  • @cozyboy3129 • 1 year ago

    Grace Hopper: I have 600 GB of memory
    RTX 4060 Ti 8 GB: screw u

  • @mta7444 • 1 year ago • +4

    Dude we have to leave now, I just 4:12

  • @racerx6384 • 1 year ago

    Wow. The next Shield TV is impressive.

  • @КонстантинПономаренко-я7я

    Gamers can say goodbye to normal GPU prices. Margins on these are stupidly high.

  • @devqubs • 1 year ago

    Can we see GPU price drops?

  • @aritradas5522 • 1 year ago

    Why does GH200 sound like T800 getting hosted on CNET

  • @FrankBarrett • 1 year ago

    Anybody get a count on how many times he says “Grace Hopper”?

  • @jimmy8mbb • 10 months ago

    This guy is... Skynet?
    John, we found him at last

  • @76ayoub76 • 1 year ago

    Jensen, if I was in the audience I would jump on stage just to make sure you are a real person or rendered by Grace Hopper?!😁

  • @TheShadiya • 1 year ago

    But is it heavy?

  • @mrhassell • 1 year ago • +1

    You can tell this is a defining moment for Nvidia. Jensen's pride is humbled by the powerful work his team created here. It's something inspiring, which AMD is not able to provide. This has a massive impact on final product sales and results, which have been instrumental in making team green a trillion-dollar success.

  • @mikmop • 1 year ago • +2

    Whatever happened to "nobody will ever need more than 640 kilobytes of RAM"?

  • @ovoj • 1 year ago

    We're about to start summoning the machine gods. It's gonna sting but oh well.. here we goooooo

  • @Ricolaaaaaaaaaaaaaaaaa • 1 year ago • +19

    The people should have been cheering the whole time.....wtf. This is awesome news people!

    • @alsaderi • 1 year ago

      They're a bunch of idiots, society is; they usually rush to describe scientific advancement (especially the biological and technological kind) with words like "creepy," "scary," "unethical," etc.

    • @sebastianjost • 1 year ago

      The tech is amazing, but the presentation really wasn't.

    • @Ricolaaaaaaaaaaaaaaaaa • 1 year ago • +2

      @@sebastianjost It didn't seem very well put together but sometimes that's more organic and wonderful 😄

    • @curie1420 • 1 year ago

      @@Ricolaaaaaaaaaaaaaaaaa Because they know how this tech impacts the future... it's good tech, don't get me wrong, but anyone with a functioning brain knows we are not responsible enough for this to be in mass production yet.

    • @schikey2076 • 1 year ago

      The presentation is E3 levels of cringe; I'm surprised Crowbcat hasn't woken from his slumber for this lmao... they really need to get someone else to present this lol

  • @yasunakaikumi • 1 year ago • +1

    So that's where all the GPU VRAM went; no wonder they have to cut VRAM on all the lower tiers.

  • @bikcrum • 1 year ago

    Should really talk about efficiency. This thing needs humongous power and produces a lot of carbon emissions.

  • @selorius28 • 1 year ago • +1

    It would be better to make super home computers based on this hardware, with the ability to connect to the network via optical fiber, something like Bitcoin miners but working differently. Buying all this costs a lot; why buy when you can pay only for what you use instead of for all the equipment, and in a year there will be something better anyway.

  • @knightnxk2906 • 1 year ago

    4 elephants 1 GPU 👁👄👁
    reminds of 2 girls 1 cup

  • @BlueRice • 1 year ago

    AI seems to be the future. At the same time I'm having thoughts about Skynet from Terminator. Computing power like this is mind-blowing. I still think phone computing power is impressive for its size. I'm still waiting for the day when they have contact lenses small enough to hold the power of a phone.

  • @binnieb20 • 1 year ago • +4

    Hopefully someone will reverse engineer it the same way Sega and Namco reverse engineered military hardware, so it's cheaper.

    • @thetshadow999animates9 • 1 year ago • +3

      You can't reverse engineer something this advanced; you'd need machines that cost over $100,000,000 apiece, plus maintenance, just to manufacture the silicon it runs on.

    • @binnieb20 • 1 year ago

      @@thetshadow999animates9 Sega had lots of money back in the day

    • @thetshadow999animates9 • 1 year ago • +1

      @@binnieb20 while Sega was indeed worth what is equivalent to today’s companies which are worth hundreds of billions of dollars, Sega was playing on easy mode with how much simpler technology was at that time. The only technologies you could steal would either be patented or be useless or even already done by Nvidia’s competitors. An example of this is AMD copying Nvidia’s DLSS 1, 2, and at the moment trying to copy DLSS 3.

    • @binnieb20 • 1 year ago

      @@thetshadow999animates9 we’ll see

    • @thetshadow999animates9 • 1 year ago

      @@binnieb20 I think we won’t see

  • @MySnakky • 1 year ago • +1

    But can it run crysis?

  • @bodekbodek • 1 year ago

    No one in the crowd understood the Crysis joke!!????

  • @alangonzales3130 • 1 year ago

    I am honestly curious if it can run crysis

  • @dwightk.schruteiii8454 • 1 year ago • +1

    In Sarah Connor's universe Miles Dyson was Black. In this universe he's Asian.

  • @haralc • 1 year ago • +1

    Nvidia is an American company and this is an English presentation, so why does he keep speaking Chinese to the American girl? So weird... Why did he give the chip back to the girl if he's going to present with it the whole time? Why did he also have to mention that the girl doesn't speak Chinese?

    • @DekritGampamole • 1 year ago • +3

      He was just joking. Maybe you need an AI to explain the joke to you.

  • @jackieo7113 • 1 year ago

    Can't even wrap my head around the ramifications of this!

  • @Arunaasthra • 1 year ago

    Still don't know why they haven't developed a game engine

  • @selorius28 • 1 year ago

    It's not enough for artificial intelligence; Nvidia needs to work on apatite crystals.

  • @matthew.m.stevick • months ago

    very historic moment in human history 💚🖤🇺🇸

  • @SultanAhmed-xn1wb • 1 year ago

    Nvidia for life 🧬 for love ❣️ for future 🤗 keep going for the great work 🤠 be happy be safe

  • @chesstictacs3107 • 1 year ago

    Such a likeable dude.

  • @MrShyghost • 1 year ago

    And so it begins...

  • @养猫总是掉毛 • 7 months ago

    Can mRNA be made into a graphics card?

  • @mossify2359 • 1 year ago • +1

    The PC industry will be revivified by generative AI. Good news for Acer and Asus, et al.

  • @ShadowHazard93 • 1 year ago

    Lmao that Crysis joke.

  • @Darhan62 • 1 year ago • +1

    But can it play Crysis? The real question is: Can it *run* Crysis? Only a human player could play Crysis back in 2008 when it was new, but now I'm sure you could train an AI system to play it better than any human player ever could.

  • @curtishorn1267 • 1 year ago

    Needs fewer fans as they will be a maintenance headache.