Under that curve of the AI dynamics, I am missing another one illustrating the need for human presence in the AI world. We may just experience the most powerful virus created which can come up with its own directives and execute them. The last step is to establish a presence in robots, replacing the need for maintenance workers and human management.
I think not. They have the models closed for at least a year (both GPT-4 and o1) before announcing them. GPT-4 was demoed to Microsoft before ChatGPT was released, under NDA, and the Microsoft research was published a few months after GPT-4 was announced. Recently I watched an interview with a guy from OpenAI, and it slipped out that o1 was ready a few months before they fired Sam Altman. Now they are talking about o3, but what is in training and in preview for select customers? What we see from all these companies are the stripped-down models that can serve hundreds of millions of subscribers. AGI is locked in a basement somewhere.
How could you not get excited for cheap compute! I've already been using the Nano to create compute nodes on the network for processing, or rather mostly transcoding, video. So between cheap Mac minis and cheap super compute like this...
Jensen is the GOAT. 🐐💚🖤 He is designing the future for all of humanity. Can you imagine being him? He started as a Denny's busboy. I recommend _The Nvidia Way_ by Tae Kim. It's new, it's excellently done, and Jensen's real-life story is absolutely amazing. 🙏🤖📈🇺🇸
Poor industry, since the launch of GPT 3.5 it has been stunned and shocked almost daily
😂😂😂
I am the industry... and I have no more nerve endings
I am the industry and I want MOAR
Thankfully a shockingly large amount of compute is being used to train AI therapists to deal with this stunning issue. It will be a game changer!
@@gareth4045 Indeed, and that's the only way after all the shocking and stunning that the industry won't go insane.
is there anything in the last 2 years that has not "just stunned the entire industry"?
Every Intel press-release
Yeah sure. A lot of things only SHOCKED the entire industry.
All that turning and the industry should be just going in a circle
I wonder if he wears that jacket to bed
Marvel movies
Imagine when AI doesn't just solve problems but actually discovers problems we didn't know we needed to solve. Now that's discovery on another level.
Imagine when it's ASI and it starts to hallucinate those problems in a way we're not able to tell they are not real and then spend resources on "solutions"...
Plot twist: The jacket was being rendered in real-time by a Blackwell GPU in his back pocket.
No, his bed is made out of them
We will soon be able to tell a game agent to create a game world with a story-line that we describe to it. We may give it historical or well known characters to influence its avatar creation process. In minutes we will be able to enter the game world that we only imagined a few minutes prior.
The real world. We're going to create multiple competing new real worlds. Life is now a video game.
Or even better, let it read a book and create a movie
Just tell your life story to the AI and then live it again 🤓
Only if you use the correct pronouns, or be punished by your leftist masters..
Nah. We will become too lazy. We will have an agent to tell another agent what to do
Matthew you are doing awesome with your videos keep it up brother!!
I believe that one day I shall convince AI to help me build a time machine, so I can go back to the 90s and buy a ton of NVDA stock. "Help me help you!"
With my luck I would cause a butterfly-effect ripple that would make that one ingenious Nvidia developer get hired by AMD instead.
don't you think it was already done? LOL
If that happened in the future, we would be seeing people from the future doing this in the present
@@jagatsimulation not if it would open a new timeline/worldline (aka a thread in the simulation) each time you maneuver within time, to avoid paradoxes.
@@kliersheed T F are you saying
Jensen is like a rock star among us nerds.
How do you think the DIGITS will perform up against a Mac M4 with 128 GB RAM?
I watched the keynote, but you made it 100x better. Thanks for the slightly deeper dive into the topics.
"Do you like my jacket?"
"uhhhh....noooo !"
😅
It's a cybertruck equivalent of all jackets.
Like why ask that question?
You have changed, Jensen, you have changed. Fancy jackets now? The old jacket was a classic.
A Morpheus jacket from The Matrix.
The jacket thing is kind of stupid; he looks like Fonzie from Happy Days. Why do you want to look like a gay biker from the '50s so bad?
He is the opposite of Musk. A doer and not a bullshit talker
Yeah, if only Musk had been able to build a new car company from scratch and blow away every US automaker. Or revolutionize space travel and take over 80% of the global launch business.
Like you said, all BS, no action 😂
This guy gets it
@metafa84 opinion with nothing to back it up. Rude as well.
@@DJ-Illuminate So sorry to tell you the truth about your messiah
Well, Musk IS a bullshit talker, but he also does some things. You can't deny it, can you?
🤯LOVE, LOVE, LOVE the idea of a "mini home AI supercomputer". Wow!! Would love to get my hands on one of those. 😍
Are you still testing out new llms? Llama 3.3? DeepSeek-V3?
Yes!
@@matthew_berman Are you really really sure?
So, is that good? 😮@@matthew_berman
Deepseek is pretty decent he managed to beat out Claude on a programming challenge of mine
It generates 9 out of 10 pixels, not 9 out of 10 frames. It only generates 3 out of 4 frames, but the 1 rendered frame is also scaled up, meaning only about half or fewer of its pixels are rendered, and that's how you arrive at only 1 of 10 pixels overall being rendered and the other 9 generated. At least that's what I understood from the presentation.
It also makes me wonder about when they say it's two times more powerful than the previous generation. By which they probably mean 2 times the framerate. But if the previous generation turned 1 rendered frame into 2 total and the new generation turns 1 rendered frame into 4 total, and the total number of frames per second doubled, doesn't it mean that the number of honestly rendered frames per second didn't actually change?
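The pixel arithmetic in this comment can be sketched in a few lines. The 1-in-4 frame ratio and the ~63%-per-axis upscale factor are assumptions drawn from the commenter's reading of the presentation, not official NVIDIA figures:

```python
def rendered_pixel_fraction(rendered_frames: int, total_frames: int,
                            upscale_ratio_per_axis: float) -> float:
    """Fraction of displayed pixels that were natively rendered.

    Upscaling at ratio r per axis means only r^2 of a frame's pixels
    are rendered; frame generation then dilutes that across all frames.
    """
    pixels_per_rendered_frame = upscale_ratio_per_axis ** 2
    return (rendered_frames / total_frames) * pixels_per_rendered_frame

# 1 rendered frame in 4, upscaled from ~63% resolution per axis
frac = rendered_pixel_fraction(1, 4, 0.63)
print(f"~{frac:.0%} of pixels rendered, ~{1 - frac:.0%} generated")
# -> roughly 10% rendered, 90% generated: the "9 of 10 pixels" claim
```

With these assumed inputs the fraction comes out near one rendered pixel in ten, matching the comment's reconstruction.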
The claim could encompass other improvements:
The new AI models in DLSS 4 operate 40% faster and use 30% less VRAM than their predecessors.
DLSS 4 introduces transformer-based AI models, which may provide better image quality and stability.
The RTX 50 series GPUs have hardware improvements that contribute to overall performance beyond just frame generation.
Therefore, while the number of traditionally rendered frames might not have changed, the overall system improvements and efficiency gains could justify the "two times more powerful" claim.
It might be true, but why would they care about the honestly rendered frame? They see these AI generated frames as the future and want to push this aspect as far as they can. It doesn't matter to them what gamers want. The generated pixels are mostly ok so average gamers wouldn't mind, and it is only going to get better in each version. I wouldn't be surprised if the DLSS 6 would be even better than the honest render.
@@13thxenos
>why would they care
Well, I do. Because as an average gamer, my monitor only shows 75 frames per second, so I don't care if they're gonna turn 60 frames into 120 or 240. And I don't feel like rendering 20 to turn them into 60 is gonna work out well.
@@HanakoSeishin I know you do; I and a lot of others do too. I'm just saying that Nvidia clearly doesn't and won't in the future. This is the path they chose to go down.
@13thxenos yeah totally, the "pure gamers" will notice because they are such "pros" 😄😯🫨😵💫
In terms of data creation, stop thinking traditionally. We just spent the last 10 years deploying 5G throughout the world. That means all the physical spaces have yet to be really contextualized as data points. We have a job to do, and that's going to get down to the quantum level. We haven't even started
quantum these nuts
@@AddyEspresso We'd have to see them
@@AddyEspresso At least you advertised their size appropriately
so basically we have an infinite amount of data to feed models if we can just get it piped to them all the way down. makes sense.
"Bodies and minds will be the two big products in this next wave of industrial revolution" - Yuval Harari at 24:33 in the video 'Ewe Schal Rise'
Jensen is the Architect from the Matrix 😎
Great video! How do I pre-order a mini supercomputer?
Give me an AI tech that can change the inside of your house to look like a tropical beach, but your outside still looks like a house.
A holodeck?
@@rocketPower047 I would love a holodeck but this is more of a tech that changes your entire area into a scene. Tropical beach? a snowstorm in the woods? a spaceship? a solarpunk city scene (like tomorrowland). Just sit back on your couch and relax to whatever scene you picked.
Sounds like 'Ready Player One' VR tech (or yes, a holodeck). I think I'd prefer an actual tropical beach at this point. I'm getting pretty tired of screens...
I know a Polish builder who can do that for you. Very good rates!
@japethstevens8473 ummmm
I have often worried that humanity relies too heavily on society's charlatans, but now I feel optimistic that scientists are firmly guiding our future.
Is that sarcasm or optimism? I think both are true to the extent that even the charlatans can’t tell whether they are scientists or not.
If that is what you come away from this video with, you might have a mental deficiency
By charlatans you mean Elon Musk and Vivek Ramaslimey? 😅😮
@@pdcdesign9632 Fauci, Gates and yes Elon and Vivek!
Those DGX "supercomputers" are small enough to fit in the body of a robot... How much power do they use for inference and training in the worst-case scenario? And how much do they weigh?
The shift to AI-driven computing is fascinating, but what about job displacement in tech? How do we prepare for that?
Forget about jobs in tech. How are we gonna handle all jobs being replaced? All of them. Every single one. Anything manual: robots. Anything digital: AI. Research: AI. Oh, nurses maybe. Nope, AI will cure everything and won't need them. Every single solitary job, no matter what it is. And yes, all the dodos who think their particular job is safe: it isn't. Even if somehow it is, 99% of every other job is gone. What society does with humans is the only question that remains. AI and robots will be done; people can argue about when, but the fact remains it's gonna happen. So, humans. And I fear humans will do the same thing we do whenever we don't need something: get rid of it. Humanity's future isn't utopia and tens of billions of people. It's the lucky few who get deemed worthy, and the rest will go the way of the dodo. We already do it as it is. How many millions of people do we just leave to die when we no longer need them? Exactly. AI can be a utopia. And it will be. But not for everyone.
Jensen Huang is amazing! :) He inspired me to continue learning about generative AI two years ago, even when my boss told me to stop "playing" with AI tools. :)
Good for you. Your current job probably won't exist in 5-10 years, and your boss won't give a crap when he has to can you. So get ready for the next thing.
@@OhHenGee1796 ...and with that kind of attitude, your boss will be getting his own pink slip in about 18 months, so you might as well begin considering new decorations for when you take over his office. Or better yet, go find a better company that is happy to have someone who's been "playing" with AI for 2 years!
Great timing on the investment advice hahaha
I could put all my data on one of those supercomputers and run my business on my local language model, without having to worry about my data getting lost or gobbled up by AI? If so, sounds good. Many businessmen are worried about losing proprietary data to AI.
Where can I purchase the Orin super for the announced price?
Great overview! Please do a review of the Project DIGITS AI once it's bought and installed.
It’s almost like the real rendered frames become a controlnet of sorts.
*AI power problem: Solved*
Project DIGITS is crazy; I figure it's 25-30x more powerful than Apple’s best M4 Max chip.
He mentioned agentic workflows; a lot of companies will buy these to run their “digital employees” locally. $3k for 24/7 production inference on advanced models is *dirt* cheap.
Price comparison: a month’s worth of 24/7 FP4 inference on an H100 costs about ~$1,800. So one of these little boxes would pay for itself in 6-8 weeks.
A much bigger point: AI datacenter deployment is already power-constrained. OTOH, thousands of companies could run a stack of these in a spare room and hardly notice the blip in their electric bills.
*This is an end run around the AI power problem.*
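The payback estimate above works out as follows. The $3k box price and the ~$1,800/month H100 rental figure are the comment's own assumptions, not vendor pricing:

```python
# Sketch of the payback math, using the comment's assumed figures.
box_price = 3_000            # USD, one-time (Project DIGITS, per the comment)
h100_monthly_rent = 1_800    # USD/month, assumed comparable fp4 inference

payback_months = box_price / h100_monthly_rent
payback_weeks = payback_months * 4.33   # average weeks per month
print(f"Payback in ~{payback_months:.1f} months (~{payback_weeks:.1f} weeks)")
```

That lands at roughly seven weeks, inside the 6-8 week range the comment quotes (electricity, as the reply below notes, is not included).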
I don't know Nvidia to be a hype company (i truly don't as I don't follow much of the tech space), so this all seems sincere to me. In that case, that little mini computer is a MUCH bigger deal than he's making it out to be. Like, it was treated like a footnote, but that's a powerful f****** machine if it can do what he's claiming. Scary powerful.
Well, power consumption is not accounted for in your price comparison and somehow I'm sure that small box will consume a lot of power. Cause there's no such thing as free cookies.
I guess all the demo videos of robots have the audio dubbed in. I realise there were about a dozen of them, but I had no idea they made so much noise when they moved.
Very cool stuff, can’t wait to see wonders in 2025. Very good video.
The jacket....no.
He could have used new AI materials?
...spare the croc......
Design with AI ?
this video is a perfect candidate for chapter markers
Great summary, thank you
If they have Jetson why not name the personal AI pc Rosie
Because Rosie has 5 stepsisters ready and willing to milk you for all that you are worth. Sometimes you have to have post clarity on the cost of these systems.
Great content!
23:42 no Optimus in the lineup
Could be because Tesla designs their own chips; they've got a supercomputer called Dojo
I know this is recent, but it looks so much like last year's. Especially the robot lineup, where last year out came the Disney droids.
I'm working on building an entire coding team of Agents Architect/PM/QA/Frontend Dev/ Backend Dev
Exciting times!
This video is so interesting, every second is worth the anticipation!
Optimus WASN'T actually one of the robots on stage which is kind of weird.
Tesla has been sandbagging on Optimus, don’t worry, it’s further along than most people think.
@JoePiotti Yeah, Optimus 3 should be out by the end of the month, and I'm pretty sure they're already working some smaller tasks fully autonomously at the Tesla factory
Wow! He's certainly created some work for you. I can think of about 7 or 8 vids for you to cover. Will you try out the Jetson Nano and give a review? May will be interesting with the Project DIGITS release. Will you be looking at the models NVIDIA has released?
Is there any response to "digits" from the rest of the hardware world? aka intel, amd, etc.
I'm sure we'll see stuff as they catch up. Maybe they're working on their own... or firing staff once they realized nvidia smoked them.
Either way, competition is good.
How many DIGITS do I need to run 4o?
Looks like Palantir is getting some competition coming with Omniverse and Cosmos.
Shocking leather, stunning jacket
Interesting bit about the "AI rendering games in realtime"... would this not mean that every first-person game that a player plays is a unique environment? i.e. if you play the same game 100 times, then the world around your avatar will never be exactly repeatable!
They can have memory to keep the scenes the same, and anyway it's only following what the game tells it to do, so the world is already built but running on AI
@JayJay @shirowolff9147 Could either of you two gamers help me with a genuine query, please... with these digi-boxes, if I developed a game with AI, a universe, could I drop current news and events into that game, for example drop in today's newspaper for players/actors to read? Sorry to sound dense, but I'm a learner.
@@shirowolff9147 it's possible there will be minor differences, but not material differences. Like who cares that the rock on the ground was shifted 2" to the right?
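The "memory to keep the scenes the same" point in this thread can be illustrated with a toy seeded generator. Here Python's `random` module stands in for a neural world model, and `generate_scene` and its parameters are purely illustrative:

```python
import random

def generate_scene(seed: int, n_props: int = 5) -> list[float]:
    """Toy 'world generator': prop positions derived from a fixed seed."""
    rng = random.Random(seed)   # deterministic RNG per seed
    return [round(rng.uniform(0, 100), 2) for _ in range(n_props)]

# Same seed -> the same world on every playthrough
assert generate_scene(42) == generate_scene(42)
# Different seed -> a different world
assert generate_scene(42) != generate_scene(7)
```

A game that stores the seed (or a scene representation) can regenerate an identical environment on replay, while an unconditioned generator would drift, which is the difference the replies above are debating.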
8:12 I'm curious about the claim that there's no more data to train models. Wouldn't new data sources like ocean sensors and the James Webb telescope provide endless opportunities for model improvement?
NVIDIA's 50 series GPUs look impressive! 2025 could be transformative with these AI advancements.
I started investing in cloud GPUs one year ago, and thank god I did.
This is my search string for AI news: ai news -shocks -groundbreaking -stuns -insane -stunning
Where can you buy that jacket, tho?
Are TOPS now measured in FP4? Is 8 × FP4 the same as one FP32 in complexity?
Once you connect a camera to the model, it can get original data. Imagine with what other sensors one could gather real-world data.
Companies like Samsara or PeopleNet have millions of hours of footage watching commercial truck drivers, local and long-haul. I remember in 2005 PeopleNet watched a group of us truck drivers for about a year. They said it was for future automated drivers, or for information they could sell for driverless vehicles.
This is evolution theory played out digitally. Survival of the most useful.
As your channel continues to grow, you will need to drop those investments to be an impartial source.
Now we have to do this locally with open models, and then we can crush big tech
You wish 😊
How do I invest in CREW?
What kind of timeline until we’re seeing AAA games being rendered? Any idea?
To be honest, I was expecting that by now we would at least have DevOps assistant AIs. Unlike programming, where you sometimes have hundreds of files of code, infrastructure setups, monitoring, etc. don't have that. I mean an AI that is an expert in Kubernetes, Docker, hypervisors, automation, monitoring and log tools, etc. Some tools like OpenShift or Datadog have AI integrations, but they are rather a joke. I assume it is because of the lack of computational power, and because models are usually trained as general chatbots. What do you think?
The price is $3,000 for Digits. Also there was no Optimus robot on stage.
I enjoyed watching this
Thank you.
Matthew, I would argue your agent thesis will come into play, but not soon; right now it's too expensive to run and develop compared with traditional software.
AI will soon rule the galaxy!
Fully agree with Matthew. I still can't understand why so many "AI vloggers" keep pushing the idea of AI agents when there's only data and "a layer of interface/AI" on top of that.
Why split it into gazillions of pieces (agents)?!?
Because that's what it's called? Just like there are different cars but you call them cars; you don't name every part they're made of. It's really simple, bro
How would this mini super computer be the same price as a 5090 GPU, but have more vram?
So is the AI computing done on the GPU itself? I thought AI took huge amounts of processing and power so how is this achieved on such a small scale?
Because AI servers can do much more and are online for billions of people, while a graphics card is only for one person, so it's easier
@@shirowolff9147 Got it, thanks.
When it comes to the training data, won't the quality be a race to the bottom using synthetic data? It will be similar to compressing the same file multiple times without reducing the file size on disk! Data quality will need to be checked and validated. Who will own the data once this has been done? Moving from public to private data sources will eventually make it more of a pay-to-play model, sooner than we think!
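The compression analogy above can actually be demonstrated; here's a minimal sketch using Python's stdlib `zlib` (the sample data is my own placeholder, and the general point is illustrative, not a proof about model training):

```python
import zlib

# Highly redundant input, like raw human-written text
data = b"the quick brown fox jumps over the lazy dog " * 100

once = zlib.compress(data)
twice = zlib.compress(once)  # compressing already-compressed data

print(len(data), len(once), len(twice))
# The first pass shrinks the data a lot; the second pass gains nothing,
# because the redundancy is already gone -- much like repeatedly training
# on synthetic data distilled from the same underlying source.
```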
I don't see how the Digits device will run a 200B parameter model. The VRAM is 128GB; it cannot hold the 200B model in memory. Inference will be slow.
Quantization
70B models run competently on 24GB GPUs, do the calculation.
Yes, only quantized models will run at the 200B size
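The arithmetic behind this thread can be sketched quickly (my own assumptions: weights only, no KV cache or activation overhead, 1 GB = 10^9 bytes):

```python
def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """Gigabytes needed just to hold the weights at a given precision."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

for bits in (16, 8, 4):
    print(f"200B @ {bits:>2}-bit: {weight_memory_gb(200, bits):.0f} GB")
# 200B @ 16-bit: 400 GB  -- no chance on 128 GB
# 200B @  8-bit: 200 GB  -- still too big
# 200B @  4-bit: 100 GB  -- fits, with room left for KV cache and overhead
```

The same math explains the "70B on 24GB" comment: 70B at 16-bit needs ~140 GB, so those setups are also running heavily quantized (roughly 2-bit-equivalent to squeeze under 24 GB).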
LOL, Nvidia sponsored you to be at the show but didn't pay for you to make the video... Kinda like lobbyists giving donations to politicians, but they didn't sponsor the bill that benefits the corporate interest.
11:38
I think this slip-up is because behind the scenes these people are interacting with agentic AI more than humans these days...
He's so used to talking to Cosmos or whatever lol
Why's Jensen gotta be the focus of every slide?
Amazing. I'm also at CES. Any chance to say hello?
It's an automatic buy for me. This is the only product that interests me so far at CES.
Not much difference for the eyes; you won't be able to see the difference in consumer products, but it may be beneficial for biomedical research, healthcare, and the safety industry. However, with Nvidia famous for the pricing of their incremental product reveals, will the cost of healthcare continue to increase, even as healthcare professionals advise patients with evidence-based healthcare decisions?
Can AI help me test my web3 app to check for bugs and UI/UX quality?
Lol. What's the point of web3 now?
@@OhHenGee1796 To be able to live in lower-inflation countries and still have access to an alternative banking facility.
Why haven't they showcased more 8-bit AI video games?
Mat - FE or third party 5090?
Under that curve of AI dynamics, I am missing another one illustrating the need for human presence in the AI world. We may be about to experience the most powerful virus ever created, one which can come up with its own directives and execute them. The last step is to establish a presence in robots, replacing the need for maintenance workers and human management.
Where's the patient care?
Re: World Model -- I think that Meta (verse) was trying to do that at the social level
I just wish I had dropped more than 500 into Nvidia last November. If only I had dropped everything I had...
Any hope we'll see o4 or Orion by the end of the year?
I think not. They keep the models closed for at least a year (both GPT-4 and o1) before announcing them.
GPT-4 was demoed to Microsoft before ChatGPT was released, under NDA, and the Microsoft research was published a few months after GPT-4 was announced.
Recently I watched an interview with a guy from OpenAI, and it slipped out that o1 was ready a few months before they fired Sam Altman.
Now they are talking about o3, but what is in training and in preview for select customers?
What we see from all these companies are the stripped down models that can serve hundreds of millions of subscribers.
AGI is locked in a basement somewhere.
How could you not get excited for cheap compute!!! I've already been using nano to create compute nodes on the network for processing, or rather mostly transcoding video. So between Mac minis being cheap and things like this cheap supercompute...
I'm blown away
Jensen is the GOAT. 🐐💚🖤
He is designing the future for all of humanity. Can you imagine being him? He started as a Denny's busboy. I recommend _The Nvidia Way_ by Tae Kim. It's new, it's excellently done, and Jensen's real-life story is absolutely amazing. 🙏🤖📈🇺🇸
Nice video 👍👍👍