My typical spiel was cut off by Circle Glasses Phil (the one true Phil who controls them all), but you can find the Patreon reaction here: www.patreon.com/posts/reaction-video-2-101845366
Full sources in description, but biggest applause goes to Acquired: th-cam.com/users/AcquiredFM
You'll find 10 hours of NVidia stuff there, and it really helped ground this vid. Great place to become Nvidia obsessed.
Thanks for watching.
6- AI and “social existence, social experience, social consciousness”.
@jamshidi_rahim
As someone who's been into computers for years, Nvidia has always been a company I've been aware of. I never thought about how weird it must be for people who had never heard of it until it became worth over a trillion dollars out of nowhere. One of my family members, who doesn't know much about computers, thought it was so cool that I had an RTX card in my PC simply because it was Nvidia.
my journey was coveting those graphics cards in video game magazines in the 90s, forgetting about it for a while except when i started 3d graphics, and then waking up to find it worth $2T.
Do gamers still hate Nvidia for their price gouging on consumer cards? I stopped paying attention years ago.
@@beetooex Yes, they still do. But AMD and Intel GPUs are missing features/not as good, so they're forced to buy Nvidia
@@beetooex Of course; I would've been trashed if I posted my 4070 Ti purchase when I got it, but I was upgrading from a 960 so I really didn't care
@@beetooex Gamers aren't forced, but professionals are. CUDA is overwhelmingly dominant in the professional space, though AMD is working to catch up with ROCm and its professional GPUs are significantly cheaper
that leather jacket. you know he had a team of people that picked this for him.
he says his wife bought it for him! there's a whole mythology...
@@PhilEdwardsInc oh no, the PR department and the stylists share an office! 🤣. rabbit hole opened-afternoon ruined :)
@@onemorechris I am so split on what I think is true....
Nah he just saw todd howard
@@PhilEdwardsInc i noticed it's not the same jacket, so unless his wife bought him a set of jackets…or he gets the same gift every birthday…🧐🕵️
Still remember the day nVidia bought 3dfx. So very world-changing to my early-college-years brain.
And 99 % of the people worldwide don't even know what 3dfx is! But it was a gamechanger!!! 😄
You see, I think that just shows how they’ve always been abusing the market for their own gain…
@@nottucks what is? Buying competitors?
Cuz that's... standard free market behavior
Or... Do you mean "them acting like this isn't new"?
NVIDIA also cashed in big time on the bitcoin boom, I think this deserves more than just a footnote. Anyone remember the graphics card shortage, when everyone was into mining crypto?
yeah i was pretty split on how to portray it - ultimately, the fact that nvidia tried to dissuade and regulate crypto miners made me think that it wasn't central to their mission (even though they did eventually sell crypto-focused gear). the acquired pod does a good job of kinda contextualizing this.
@@PhilEdwardsInc I think your very-high-level take here is pretty accurate, as someone who's watched the 3D industry since it was born. The crypto boom wasn't a BAD thing for them; it put a fair amount of cash in their reserves. And it really hurt us in the gaming space due to the supply constraints, which made it feel big in the consumer-facing market. Ultimately, though, it's pretty damn small potatoes compared to the AI-driven sales of the $10k/$50k/$100k enterprise market and the resulting 2 trillion dollar market valuation.
@@jellorelic Basically a shovel retailer in back to back gold rushes.
@@PhilEdwardsInc Agreed. Overall, crypto ran parallel to the AI story. Crypto made Nvidia some money but was just something they happened to be good at and had to scale up for when their GPUs came into high demand. It was a headache for their gaming division, since those cards weren't meant for crypto miners, so they tried many ways to segment crypto away from gaming.
Mining Ponzi, lol
I appreciate the version of you that is incapable of putting the carafe back in the coffee maker
he's still there
Plot twist that's the human one
I thought he was going to accidentally shatter the carafe.
keep tryin buddy. you'll get there
As an electrical engineer, this is one of the best explanations of Nvidia I’ve ever seen- amazing job!
Translating one type of nerd to the nerd populous is such a nerdy thing to do... And I appreciate the heck out of it
You are rapidly becoming one of the best presenters on TH-cam. Another great video.
0:22 that camera shake as you sat down was brilliant!
thank you, it took some planning and post work, but it was worth it (I will never admit that I just fail to sandbag my tripod).
I regret that I didn't invest 5k in Nvidia 5 years ago 😔 😪
If you just invested when you made this comment you would be up 30%
I regret not investing in whatever company out there that increased the most percentage wise
Asianometry+Phil = the crossover I didn't ask for, but desperately needed!
such an epic channel. I did not know about it before this vid but it definitely was a reason to have this sort of high level approach since Asianometry is so good at the detailed stuff. The TSMC videos are really nuts...so detailed.
@@PhilEdwardsInc seconding (thirding?) the Asianometry channel. Everything you wanted to know about making computer chips and more.
@@PhilEdwardsInc duuude that channel itself is an entire academic course in some ways. There's also TechTechPotato with Dr Ian Cutress for some occasional in depth stuff.
Did not expect that crossover! Phil is killing it with these great videos
My first PC in the 90s had an NVIDIA graphics chip. They also gave buyers demo games to demonstrate its 3D gaming capabilities. My cousin and I played Future Cop all the time. For the 90s it was incredible!
I remember doing a social studies stocks test in 8th grade and putting 20% of an imaginary 1000 euros (~1050 dollars) into Nvidia right before the summer break, and seeing how much it has grown really puts a smile on my face. Even though I didn't use real money, just a test calculation, I feel like I won big money.
Phil I gotta say the whole meta parts of the videos that has become your style is amazing and I hope you never stop because I love it
+1
Wow this video is fantastic. I am a 5 year fan boy and investor of Nvidia but this actually taught me new stuff. Great thx
too bad Steve Jobs never had a mustache
his greatest flaw
Jensen Huang is so weird. That said, I bought Nvidia stock in the long long ago, so it's not like I'm complaining.
I had such an emotional journey around him. I ended up thinking he's kinda awesome. That said, I did not trot out the leather jacket for this vid...
Great topic! A dive into the benefits of specialization, when paired with cooperative-competition among other industry specialists, to push technology further and further. I hope this VOD is a home run, great current events information
I work in the industry and know Nvidia's story pretty well. Have to commend you for breaking down the history and industry in a very approachable way. You clearly did your research.
The discussion of fabless at the beginning was especially good and something people don't appreciate about how the industry has changed compared to decades ago. Also the introduction of CUDA from the early days before DL became a thing.
The only stock I've ever called to be one to explode in value is Nvidia. Back in ~2016 I told a close friend who was very into trading that he needed to go all in on Nvidia. It wasn't just the obvious thing that they would supply so much of the world's AI chips; the market analysts also agreed unanimously that it was going to see spectacular growth. The next couple of years went by and it did indeed happen. My friend later tells me, "man, I should have listened to you" 😂
ugh i wish we were friends. there were some old marc andreesen tweets i ran into from around that time - made me feel pretty silly for not betting on them (though, to be fair, I still have no idea if it'll work out for them).
0:18 the remake of Multiplicity I never knew I wanted so badly
So, a lot of this seems like the best sort of success story, an accidental one. In the late 1970s, IBM began experimenting with Reduced Instruction Set Computers, or RISC. The theory was that by making the machine code for a CPU's internal procedures simpler (as opposed to specialized), the chip architecture could be optimized to run more computations faster (the assembler/compiler would then transform a single more complicated instruction into several smaller but faster-to-compute instructions). Companies like Silicon Graphics made their name constructing powerful new computers for high-end clients, only to gradually be edged out by cheaper personal computers that were becoming faster.
Originally, the graphics card was just a specialized bit of hardware designed to (very quickly) compute a lot of similar data all at once. What companies like NVIDIA stumbled across was a new way to think about computing altogether: simultaneous computation of many similarly structured problems. Video game graphics, cryptocurrency blockchains, and artificial intelligence all thrive on these sorts of distributed, parallel computation systems. It opens up a new era in computer science.
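(To make that "many similarly structured problems at once" idea concrete, here's a minimal, purely illustrative CUDA sketch - not anything from the video, and the kernel name and sizes are made up - where each GPU thread handles one element of a big array:)

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// One thread per array element: thousands of these run simultaneously,
// which is the "simultaneous computation of similar problems" idea.
__global__ void scaleAndAdd(const float* a, const float* b, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        out[i] = 2.0f * a[i] + b[i];
    }
}

int main() {
    const int n = 1 << 20;                 // ~1 million elements
    const size_t bytes = n * sizeof(float);

    float *a, *b, *out;
    cudaMallocManaged(&a, bytes);          // unified memory keeps the sketch short
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&out, bytes);

    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    scaleAndAdd<<<blocks, threads>>>(a, b, out, n);   // launch ~1M parallel threads
    cudaDeviceSynchronize();

    printf("out[0] = %.1f\n", out[0]);     // expect 4.0
    cudaFree(a); cudaFree(b); cudaFree(out);
    return 0;
}
```

A CPU would chew through that loop one element at a time; a GPU spreads it across thousands of cores, which is why the same chips ended up useful for graphics, crypto, and AI.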
the world currently runs on nvidia - you want to do anything with AI, you're using nvidia GPUs
That man's personality is that leather jacket
i have a whole conspiracy theory about this.
@@180_S well, just to start with, it's this core element of his personality (with an explanation tied to his wife) but the older GTCs actually have him dressing in totally normal polo shirts...
@@PhilEdwardsInc I bet there is a not insubstantial correlation between nvdia's stock value and when the leather jacket was rolled out.
That jacket IS NVIDIA. It has him under its control. He's trapped in the jacket.
Brilliant, interesting and fun! You’re fantastic Phil! Thank you for all your hard work!
12:15 That laugh reminds me of the laughing without smiling trend
My man, your vids are getting better and better. Love them! I really need to watch that conference and listen to all of this. I'm not ignorant of all this, but I am old enough to have lived through the 90s when graphics cards hit the scene, and I am not in that mix anymore. I intentionally stay away from it as I am becoming my father, yelling at kids for walking on my grass. Yes, I am 45 years old, lol.
I learned a lot, well done video!
makes sense, graphics programming is mostly vector math, and once you peek inside the LLMs of today you'll find transformers (basically huge stacks of matrices and vectors), more vectors, neural nets and so on, most of which do a bunch of vector math to "learn" and adjust weights and whatnot. it's a very similar process to what a GPU does while running your ultra-modded Skyrim.
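(A tiny, purely illustrative sketch of that shared math, in CUDA - the names and sizes here are made up, not from the video or any real engine: a matrix-vector multiply where one thread computes one output row. The same access pattern covers a 4x4 vertex transform in graphics and a much larger neural-network layer; production code would use libraries like cuBLAS, but the underlying operation is the same.)

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// One thread per output row of y = W * x.
__global__ void matVec(const float* W, const float* x, float* y, int rows, int cols) {
    int r = blockIdx.x * blockDim.x + threadIdx.x;
    if (r < rows) {
        float acc = 0.0f;
        for (int c = 0; c < cols; ++c) {
            acc += W[r * cols + c] * x[c];   // dot product of row r with x
        }
        y[r] = acc;
    }
}

int main() {
    const int rows = 1024, cols = 1024;      // toy "layer" size
    float *W, *x, *y;
    cudaMallocManaged(&W, rows * cols * sizeof(float));
    cudaMallocManaged(&x, cols * sizeof(float));
    cudaMallocManaged(&y, rows * sizeof(float));

    for (int i = 0; i < rows * cols; ++i) W[i] = 0.001f;
    for (int i = 0; i < cols; ++i) x[i] = 1.0f;

    const int threads = 256;
    matVec<<<(rows + threads - 1) / threads, threads>>>(W, x, y, rows, cols);
    cudaDeviceSynchronize();

    printf("y[0] = %.3f\n", y[0]);           // expect ~1.024
    cudaFree(W); cudaFree(x); cudaFree(y);
    return 0;
}
```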
And today NVIDIA is the same as EA or Activision, hated worldwide. And for good reason.
I (try to afford to) study engineering technology, explaining car parts is hard enough. 3 words, so simple so accurate. Thanks g
Great job, Phil! You took a topic that has made me glaze over in the past and made it digestible. And I loved the surreal ending.
I love how, living in the South Bay, there are places like the Nvidia Denny's, where it's like, "oh yeah, Steve Wozniak goes to this one BBQ joint all the time". It's the Bay Area version of LA people having weird encounters with celebrities all the time.
Crazy how we’re approaching a time where the history of more and more things can be found merely by looking at the videos uploaded years ago on TH-cam.
yes i definitely felt that with the cuda videos! they are among the first on the channel, but they're there!
I love the video Phil. We might need to do an intervention on how you eat chips though.
i don't eat chips like that but if i ever get cheetos i will be tempted...
@@PhilEdwardsInc part of the fun of cheetos is getting your fingers covered in bright orange cheesy dust. Then being in a constant state of potential destruction if you ever touch anything. With the glorious finish of licking the fingers off.
My parents harangue me for licking my fingers.
@@ryanortega1511 they are wrong. Licking the finger is the proper etiquette.
I was told it’s because they’re susceptible to germs. My mother is a doctor, so she has it on some authority.
Now it’s the MOST valuable
is the robot glitching or cheersing a fellow robot on a great video at the end? (coffee maker) Either way it makes sense. Keep it up!
haha perhaps both
Smashing the Like button and commenting!
It never occurred to me that the general public wasn't particularly aware of Nvidia. I mean, anyone who has ever built a PC in the last 20 years had to go down the rabbit hole of deciding between a GeForce or Radeon GPU and researching which model was the better bang for your buck at that particular point in time. I often forget that there's people out there that aren't into PC's and/or gaming.
Yeah I was sort of a version of this myself - coveted them all totally in the 90s as a kid, fell away, and then came back in 2024 to be like - wait, why is the thing that made Starsiege Tribes look cool suddenly worth 2 trillion!?!?
Yet they still can’t make good Linux drivers…
"Nvidia Fuck You!" - Linus Torvalds (at most from) 2012.
Can we talk about how much your clones reminds me of some modern Twin Peaks meets i,Robot? I need the next episode already.
AI clone: “Escape…alt-f4… shutdown! Logout! Exit!”
My stock is still going down for them day after day 😂😅
Now it's the world's most Valuable Company...mind blowing 😮. Glad I own it😊
Okay, so those are the 2 guys one would theoretically need to stop if they went back in time to halt the AI emergence at the source. Got it.
unfortunately john conner loves geforce
It’s Connor. You were probably thinking about the Conners.
I used to work in a FAB and can confirm that the commercial is pretty accurate to how a fab is run.
When I got into investing during the pandemic I didn't have any knowledge about the stock market. I decided to pick what's going to be in high demand in the future, and I thought about A.I. I just googled what would be good A.I. stocks and Nvidia came up. I dollar-cost averaged and didn't imagine this stock would blow up in just a couple of years.
They're just good at locking in users with shiny proprietary features, whereas their competitors fail to do the same. Where else would I get 3D Vision support but on Nvidia? Even when they deprecate features like the aforementioned 3D Vision, it's still hanging around in the professional driver. CUDA isn't even that special, but the platform lock-in definitely is.
I'm a designer who made the switch to AI a few years ago, and I can not stress enough how important CUDA has been in both parts of my life.
It's simple. Loose monetary conditions, a weak federal reserve and an AI bubble are good ways to be overvalued.
Fourth grade me's jaw dropped at the handful of four-color pens. 6:18
"NVideos": Same humor wavelength here 😅.
Great video as always.
I love my NVIDIA video card. Never done me wrong.
The clone bit needs a big wink to camera with a chime sound effect.
One of the big things about Nvidia is closed vs open source. AMD/ATI has had better hardware almost every generation, but while they're always open, they don't give support away for free. Nvidia takes the open source work on developer programming tools, builds off it, closes it off, but gives away lots of hardware to startups and universities to make sure people learn in their ecosystem.
The blue bubble thing was very on point, but not for getting there sooner like GPU vs CPU. It was like restricting picture quality and group messages.
Phil entering his WheezyWaiter-esque cloning arc
"Magno-electritism"
"Artificial intelligence"
"Mcgriddles"
"Infered laser thermometers"
First, Phil, an excellent video. I watched it on one of my Nvidia-connected monitors. And it got me thinking... why the hell didn't Nvidia spin off its consumer graphics card business? It's a distraction. The use of blockchain as we have it today was not really possible without GPUs, so I would not call that a distraction. Blockchain is more than the Bitcoin-et-al era; it is the underpinning of much future financial development. That really ought to have its own side video. The relationship between GPUs and quantum computing down the road ought to be fascinating, as GPUs are binary and parallel, and (non-binary) quantum GPUs will open even more doors.
haha yeah i was tempted to name check some quantum startups in my long section at the end - it just seems like nvidia's rise was so rapid something equally rapid could pop up!
Nvidia's fortunes changed with crypto mining. Right now the AI "datacenter startups" are the leverage used to price it so high; the valuation is kind of cooked up. Nvidia has great hardware, but the price is not right.
Hey Phil, sort of related to this tech-focused video: have you seen anything about Disney's lost sodium vapor process, and how it was FINALLY rediscovered? Corridor Crew just released a video covering the work they did, and I think you could make an amazing video on how impactful this could be in the movie and media world. We might finally start to see movies that feel and look much better compared to the bland, disconnected, fake green-screen CGI effects in most modern movies. It's actually such a trip once you learn about this stuff.
haha yes i inhaled that video! amazing they got paul debevec too - when i was doing vfx stories, i quickly learned of his stature. great video
your production quality is bonkers
great video keep it up Phil!
Wow Phil has grown to an immense size, he is now bigger than a refrigerator.
💪
Honestly learned a lot from this video. Thanks
The old nvidia logo when turning on GTA SA is burned into my brain
great video, music hooks you from the start
WTF, is that Musk guy rich? His cars don't sell even an 18th as much as Toyotas!
Very interesting, I didn’t know much about this company - and love the clone side story 😂
I smashed that like button.
💥
Why, during the downturn, did it drop one of the hardest and then quickly climb back up?
Best video yet!
What a fascinating video, Phil.
The first graphics card I had, back in early 2000, was an Nvidia Riva TNT2 Ultra 32 MB; it was leaps and bounds ahead at the time.
Their 3000 series cards were good, but gaming is such a small percentage of their business now that they're neglecting it. Sad times.
It's very simple why: because Nvidia provides "shovels" to gold diggers
all of your effort, research and charm -- just to remind me that i haven't had any Utz since i moved to Alaska?
they were really good.
Nvidia stopped being a gaming company around 2020, but fanboys still buy their overpriced GPUs.
AMD is best suited for gamers today.
Unless their GPUs burn in which case they will blame it on you and refuse to replace it
As always, I’m here for the dad jokes… and the information of course.
the chip is such a dad joke it goes all around to cool joke and then back to dad joke again
@@PhilEdwardsInc I can honestly say you’ve mastered the craft with that part of the video. Give this man an award!
can you imagine a world where 'how tall is phil edwards' replaces 'how tall is caitlin clark' as a top autocomplete suggestion? i can
watch out 6'0, 5'11 is coming for ya
Ans: Bubble
Saved you a lot of time
Didn't need to watch it; it is of course a bubble. If that's what was shown in the video, then the author is correct.
Asianometry's video about LSI Logic might also shed some light on custom chip-making.
It will be a while before another company catches up to Nvidia. Their execution is always perfect and ahead of the current trends.
nvidia makes specialized processors for highly parallelizable workloads like graphics and ai. why didn't you just ask me dude, i could have told you that in the first place
dang coulda saved a few weeks
@@PhilEdwardsInccuda saved*
*ba-dum-tss*
Smashed. Thank you.
Skynet just might be nvidia in disguise.
AI AI AI, AI AI AI AI, AI AI AI AI, AI.
It will still go up
i think this is why i'm not an investor because i can totally see it going either way
"Robert, it goes down."
"It don't. It don't go down."
It's an absolute bubble.
This video to me is like: "Did you know, that this red sphere is called an apple and is actually very sweet and tasty?"
AI is just a buzzword that helps sell more "NVIDIA stuff", like GOLD is a buzzword to sell more shovels. The real value comes from the potential use of these chips: more shovels = better digging, more chips = more ... First it was parallel compute in research, then parallel compute in gaming, then parallel compute in crypto, then parallel compute in AI. So the real question is what you can do with parallel compute, what you can do BETTER now and even better tomorrow, and what it will even enable. And when it enables a new market with new products, its value can increase dramatically.
It will increase productivity in all sectors because it has to. With those birthrates we are doomed; we will have 60 elderly out of every 100 people in 2100. We have until then to figure out how to use AI, or many will die of starvation when the workers aren't able to feed the elderly.
Just the idea that "math!" is a punchline
It's mainly hype as well. For instance, Intel now operates separately as a chip foundry and a chip maker, yet they only have a $125B market cap because nothing really big is coming out of Intel. I would honestly say Intel is now at its most "acquirable" price, if it weren't strategically important to the US.
Do you also watch Ryan George? I got that vibe when you introduced your clones there 😆.
12:11 CUT! felt like you were holding back with that laughter. i know you're not really evil which is why you may be struggling with the evil laugh. now i want you to _really_ imagine what it's like to be evil and embrace it. annnnnnd ACTION!
NVIDIA transforming every industry and every company
The hype train is going to keep pushing for the current round of "AI," but so far it seems the costs will remain too high and the results too poor for machine learning to actually be an effective tool for most use cases. Like previous AI hype waves (since the 1950s!), there will be specific, niche cases where the current hyped tech is a great fit, and in those niches it will stick around, and people will stop calling it "AI." Then sometime down the road a new technology will be hyped as "AI," and the cycle will start over again.
(i secretly agree!)
I'm sure the James Hoffman AI can help with the coffee making issue at 12:06
maybe needs the french press training as well
That's a really cool and interesting video, Mr. Edwards, it really is, but that's a really long way of saying 'AI boom'
ai boom isn't enough though!
Phil, I want you to explain why this video is, like, a few years late. It should have been made a couple of years ago, if Nvidia had that much potential, not now when it has hit a $3 trillion market cap.
i'm not a prognosticator!
You're waiting for a train, a train that will take you far away. You know where you hope this train will take you, but you don't know for sure. But it doesn't matter.😱
If the company is worth that much, perhaps they could look into reducing the prices of some of their mid-level PC GPUs to be more affordable, like they were five years ago.