The explanation of parallel computing using GPUs was spot on! Vertical integration strategy is truly a game-changer in AI development.
As an AI enthusiast myself, I'm thrilled to see how multi-agent systems are revolutionizing businesses! I've been exploring SmythOS and I'm amazed at how easy it is to visually design AI workflows without coding
My GOD, the mainframe era is coming back to haunt IT nerds again
It always starts that way but will eventually shrink again. I guess it’s the way of progress.
Your videos are superb! Thanks for all the work that goes into these. ⭐️😊
ASML's extreme UV lithography machines are what make the fastest chips possible. One reason China really wants Taiwan and TSMC.
And for this reason TSMC is building huge fabs in the US.
And they would blow all that shit up before China ever got close to it.
It's one thing for TSMC to have the EU-built ASML machines; it's another thing entirely whether they could build them themselves by reverse engineering (that's a whole mega-project on its own).
China claims Taiwan and Taiwan claims China.
Meh, at the rate China is progressing, I think it's just a matter of time before they research lithography on their own.
You neglected to mention the very strange earnings call earlier this year, when Elon mumbled incoherently about random difficulties with the Dojo computer, which most likely means the whole project is not doing very well.
Just because you don't understand what he is saying doesn't mean he is incoherent 😅
Now imagine how bad it must be when Elon praises nonfunctional products and sells snake oil...
@@mr.monitor. Ah, the almighty oracle. Hail our saviour. 🤦♂️
@@Conservator. Excellent.
@@mr.monitor. Are you a very technically inclined person, someone who either works in the IT field or dabbles with IT projects for fun, to the point that if someone were saying random buzzwords you'd be able to catch them BS-ing you?
Cerebras has a few patents for cooling & getting data on-and-off the wafer. I guess the problem is that the wafer physically expands a lot as it heats up, and this matters more because it's so large. It will be interesting to see how Tesla solves the same issues either by licensing the patents, hoping no one notices, or coming up with a different way to do the same thing.
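For a rough sense of scale, here's a back-of-the-envelope sketch of that expansion, assuming silicon's typical ~2.6 ppm/K expansion coefficient and an assumed 50 K temperature swing under load (the real swing depends on the workload and cooling):

```python
# Rough back-of-the-envelope sketch: how much does a 300 mm wafer grow as it heats up?
# Assumes silicon's coefficient of thermal expansion is ~2.6e-6 per kelvin near room
# temperature and an assumed 50 K rise; actual numbers depend on operating conditions.
ALPHA_SI = 2.6e-6        # 1/K, approximate CTE of silicon
WAFER_DIAMETER_MM = 300
DELTA_T_K = 50           # assumed temperature rise under load

expansion_mm = WAFER_DIAMETER_MM * ALPHA_SI * DELTA_T_K
print(f"Diameter change: {expansion_mm * 1000:.0f} micrometres")  # ~39 µm
```

Tens of microns of movement across one piece of silicon is a lot compared to the tolerances of the connectors feeding power and data onto the wafer, which is presumably why this gets harder the bigger the "chip" is.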
The channel really won the MVP award in Division 1 meat riding
Lmaooo😂😂😂😂😂😂
It's literally called The Tesla Space.
If you didn't expect that, then.... well, you really are beyond words.
@@SangoProductions213 I mean the fact that you SHOULD expect that is also meat riding, fans of any fandom shouldn't be so deep in the cheeks where they can't be real
this is funny lmao
China spent more on importing chips than on crude oil last year. This should tell you something about the future and potential of this technology.
I'm sorry, all I heard was Dr. Evil saying "One-hundred MILLION dollars!" 4:19
Production yields for a whole wafer working at 100% must be low. They must have some built-in architecture to deal with bad processor cores.
They are not going to make a massive chip out of a single wafer! They are going to do it like Apple and make multiple smaller chips and interconnect them; that's how it's more efficient and successful.
Did you even watch the video?
That's what Cerebras does. If 4-10% of the cores are bad, it can route around them. Apparently the actual number of bad cores is even smaller, possibly because the physical size of each core is small.
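A toy way to see why small cores plus spare routing make wafer-scale yield workable. This is just a sketch with made-up defect and core-size numbers (only the rough core count is in the ballpark Cerebras quotes), not their actual figures:

```python
import math

# Toy yield model: with a Poisson defect model, P(core is good) = exp(-defect_density * core_area).
# defect_density_per_cm2 and core_area_cm2 are illustrative assumptions, not Cerebras data.
defect_density_per_cm2 = 0.1   # assumed defects per cm^2
core_area_cm2 = 0.01           # assumed ~1 mm^2 per core
cores_per_wafer = 800_000      # roughly the scale Cerebras quotes for its wafer

p_core_good = math.exp(-defect_density_per_cm2 * core_area_cm2)
expected_bad = cores_per_wafer * (1 - p_core_good)
print(f"P(core good) = {p_core_good:.4%}")
print(f"Expected bad cores: {expected_bad:.0f} of {cores_per_wafer}")
# With ~0.1% of cores bad under these assumptions, routing around them (plus a few
# spare rows/columns) keeps the whole wafer usable, whereas a single monolithic die
# this size would almost certainly contain at least one fatal defect.
```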
I had two Nvidia graphics cards with two Xeon processors for computational structural and computational aerodynamics problems. Great machines!
It will have a huge amount of compute, but right now that figure is just the sum of its parts. In order to have exascale computing, all of those parts need to work together in extremely fast, large-scale ways. There are lots of systems larger than this out there, but their communication overhead dramatically drags down performance relative to the sum of their parts. There's a lot of hype in saying they bought a ton of off-the-shelf components; hopefully it can be constructed in a way where the overhead isn't too substantial.
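A crude way to picture that "sum of the parts" point. Both the per-accelerator peak and the overhead curve below are made-up numbers just to show the shape of the problem:

```python
# Crude sketch of how communication overhead eats into "sum of the parts" compute.
# Assumes 100 TFLOPS peak per accelerator and an overhead fraction that grows with
# cluster size; both are illustrative assumptions, not measurements of any system.
def effective_petaflops(num_accelerators: int, peak_tflops: float = 100.0,
                        base_overhead: float = 0.05, per_1k_overhead: float = 0.05) -> float:
    """Peak compute minus an overhead share that grows with the number of accelerators."""
    overhead = min(0.9, base_overhead + per_1k_overhead * (num_accelerators / 1000))
    return num_accelerators * peak_tflops * (1 - overhead) / 1000  # PFLOPS

for n in (1_000, 10_000, 100_000):
    peak = n * 100.0 / 1000
    eff = effective_petaflops(n)
    print(f"{n:>7} accelerators: peak {peak:>8.0f} PFLOPS, effective ~{eff:>8.0f} PFLOPS")
```

The exact overhead curve depends entirely on the interconnect and software, which is exactly the hard part the comment is pointing at.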
Great info, appreciate the video!
Very informative and accurate information. Great video that anyone can understand!
A.I. - MACHINE LEARNING.
These crazy numbers and dollars are just incomprehensible to me. I’d have an easier time wrapping my brain around the number of stars in the universe.
I love FSD. When VI becomes AI in a car, OMG. Car may finally become an asset vs a depreciating liability
car will never become AI. there is not enough memory in a car to support AI.
Huge mistake by Tesla in relying 100% on machine vision. What are you going to do in pea-soup fog? What are you going to do when mud is splattered on a couple of the cams?
Elon should have studied a little bit of perceptual psychology. Once you understand how optical illusions can fool vision, you can appreciate the limits of visual perception alone, without complementary inputs.
Lidar isn't sexy. Aesthetically it will look horrible. Functionally, it is what is required, among other sensor modalities.
Time will tell
But who am I, just some Joe schmo putting his two cents in.
😂😂😂😂 keep on dreaming
@@KarmaMechanic988 Thank god someone actually agrees with me, I thought I was going crazy
@@KarmaMechanic988 How does anyone drive through dense fog? Slowly. If the cameras get dirty, maybe the car tells you to clean them, or else you'll need to take over and drive the vehicle yourself until you do.
Have you ever used one of the newer vehicles with a bird's-eye camera view for backing up? They're pretty neat, and a lot better than just a backup camera. You see a picture of the vehicle from overhead and everything around it in 360 degrees. It uses different cameras from around the vehicle and patches it all together to create one image (there's a rough sketch of how that stitching works just after this comment). We had one as a rental for a week and it worked flawlessly. I was so impressed I want that technology in my next vehicle. It's one of those things like a backup camera: once you get used to it, you can't imagine not having it. Except this is even better than a normal backup camera. I think all vehicles will have multiple cameras on them in the not-so-distant future.
It's tough to say where AI and supercomputers will take us. We're heading into uncharted territory. I can't believe some of the advancements I've seen in technology over the last 30 years, though. For the people who say we'll never solve FSD: well, 30 years ago I wouldn't have believed everyone in the world would be carrying around a phone in their pocket that is also capable of using something called the internet, either. No one had a clue what was coming. I didn't even have my first computer or cell phone until the late '90s. Most people didn't.
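For the curious: the overhead view is typically built by warping each camera's image onto the ground plane with a homography and blending the results. A minimal single-camera sketch with OpenCV, where the calibration points are placeholders (real surround-view systems calibrate each camera precisely and blend four or more warped images):

```python
import cv2
import numpy as np

# Minimal sketch of the "bird's-eye" trick for ONE camera: map four known ground-plane
# points seen in the camera image to where they should sit in a top-down canvas, then
# warp the whole frame with that homography. A real surround-view system does this for
# several calibrated cameras and blends the warped images into a single overhead view.
frame = np.zeros((480, 640, 3), dtype=np.uint8)               # stand-in for a camera frame
cv2.rectangle(frame, (280, 300), (360, 460), (0, 255, 0), 2)  # fake ground marking

# Placeholder correspondences: image pixels of a ground rectangle -> top-down coordinates.
src_pts = np.float32([[250, 300], [390, 300], [620, 470], [20, 470]])
dst_pts = np.float32([[100, 0], [300, 0], [300, 400], [100, 400]])

H = cv2.getPerspectiveTransform(src_pts, dst_pts)         # 3x3 homography
top_down = cv2.warpPerspective(frame, H, (400, 400))      # this camera's slice of the overhead view
print(top_down.shape)                                     # (400, 400, 3)
```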
When chips are grown straight up and layers are no longer needed, that's when it will be off the chain. No stacking needed. I have a feeling Elon knows this and probably has a plan for it too. Cooling for this would probably be straight-up nitrogen, or doing it in space.
Subscribed!
Please continue to keep us updated about the worlds innovations!!
about TSMC innovations?!
Tesla reinvented everything guys, soon we're gonna have Cars 2.
You ignored what initially brought the GPU boom: crypto. And then after that died, they found a new market in OpenAI.
CUDA was developed in 2006, though, a couple of years before the Bitcoin white paper was released. "Attention Is All You Need" was published in June 2017, just as the first alt bull run was taking off and driving GPU prices through the roof. So if anything, the periodic consumer hysteria over GPUs slowed the early development of LLMs by keeping cards out of the hands of compsci researchers.
@@puddles5501 Yeah, I'm talking about the split between graphics cards for gaming and for AI. Their initial boom was pushing CUDA for gaming, yes, but the next boom for them was when GPUs were being bought up by the millions by mining farms. They built these new Blackwell GPUs thinking they could sell them to banks in the crypto world, but they had AI in their back pocket if transformers ever took off, which they did. Either way, now data centers have reasons to buy Nvidia server chips in bulk, and as everything goes GPU and NPU under ARM, I bet you money crypto comes back, because by then the compute needed will be heavily available because of the AI push.
The Corona crisis helped mining a LOT, as people had more time at home for tinkering, creating content, and earning additional revenue. AI training needs (read: GPUs) existed before that, but only after Transformers and a few other technologies/methodologies became well known did it start growing exponentially. Helped again by Eth 2.0 and the drop in mining, AI scooped up enormous amounts of GPUs, and it's still increasing. I'm particularly interested in what will happen with Groq LPUs for "running" the millions of AI LLM models (aka inference).
@@puddles5501 Compsci researchers wouldn't be using domestic-grade GPUs... Unless we're talking about amateur MLers as "compsci researchers".
This sounds credible. How’s that Cyber Truck doing by the way?
Sold out for 2024.
comparing cars and computers is like comparing your new nails to your brain.
EDIT: SLOGAN PEDDLER REMOVED.
What does over 2M in pre-orders mean to you?
I just saw a cybertruck the other day for the first time. They are wild!!!
Dojo sounds one hell of a lot like a transputer.
To quote Elon, "If we can get it to work"
Amazing video, I'm a huge fan. When I first saw this I was looking for a really good channel to learn about this stuff. I'm so thankful for the channel. Like the video if you agree. You're a great public speaker.
Just to point out... the sequencing of the human genome didn't involve supercomputers... if you are talking about genome assembly, then indeed some mainframe and Linux clusters were used, but again, not supercomputers...
The supercomputers played a role in understanding protein folding, not genome sequencing. Computational chemistry is a serious frontier!
Ooh please can you give me some resources to find out more
@@johndawson6057 Google !
Computers work by algorithms, but AI changes that old-school approach.
I've been betting on Elon Musk for 30 years. I've never been wrong.
Yes, I understand Elon is naming it Skynet.
This is a tool that increasingly answers the demand for ever-greater capacity while still prioritizing sustainability and device compatibility within a portable architecture.
you glossed over how Musk poached those top chip designers - 5 years ago. Probably the key part of the whole story.
Wait... I thought Tesla dumped their hardware and just went with NVIDIA hardware...
The Tesla Supercomputer is like the Tesla Battery Cell . . . It is actually just bought from another company.
This is a great video. A+
Great video! Thanks
I’d rather Elon spend money on future tech than social networks. 🤓
too late, he already did. Now he is unhappy about what he did and he wants us investors to give him back the shares he sold.
He claims he doesn't work for free, but he is claiming more money than the company has ever made in its 20-year history.
@@unmanned_mission moron
If you've once been a victim of censorship you'll appreciate free speech.
I will get excited about all this computing technology when they can cure my multiple sclerosis...
At this point, I'm convinced that the reason Elon is so well sold on AI is because he has let it do all of his marketing so far.
"What you end up with, is one big ass chip"
This video did not mention the positive impact that BTC mining had on NVIDIA GPU sales from 2020 to 2023, as if the crypto hype never happened.
I'm going to bet my money on Nvidia not being dethroned by Tesla
Tesla is a system that's not evil and predatory, it's a coherent system of love and efficiency.
I don’t imagine he wants to dethrone NVIDIA anyway. They have good resource contacts.
@@brown2889 Big difference between one chip company surrounded by China stating "You are mine!" and a billionaire who is wisely hedging his bets for WHEN the invasion begins and NVIDIA chips are no longer sold to the West.
I watched this video 3 times! Thanks man this was very informative and educational
You.. just liked it enough to watch it x2 ?
That’s weird
🚶♂️🫷
Chip Wars ... disappearing airliners, ... they're here.
So basically what this thing is saying is: take the latest idea, copy it, and just make a couple of changes... and there you have it, the newest invention.
GPU aftermarket is about to be crazy.
Dojo/FSD is Project Stardust for legacy automotive ICE vehicle OEMs.
so reinvented that they didn't even speak about it in their Q1 call
Interesting, thank you. Should be interesting.
Nvidia has faster turnaround times for their hardware construction. Unless Tesla can manufacture their chips for 1-2 orders of magnitude cheaper, they'll likely just buy Nvidia instead of building Dojo further.
But will they put UBI in place before 2030!? Many pioneers, including billionaires, scientists, Nobel Prize winners, engineers, architects, analysts and so on, almost all agree that we will have AGI in 2027 and ASI in 2029, and they are looking at and evaluating the exponential technological acceleration curve. I wonder why they have not yet implemented universal basic income to anticipate the trends that will come from it. Just to name one, Elon Musk says that we will have AGI as early as 2025 and ASI in 2029.
Yes, that same vehicle IS still on sale today..
Not too sure what you mean by "quietly". They even demoed the chips. The entire fleet runs on Dojo.
The fleet doesn't run on Dojo. It runs on the FSD chip, which is also a Tesla-designed chip. Dojo is for training complex neural networks, but I believe most of the training work is done by A100 and H100 training clusters.
No. Dojo is a plan B to supplement and *potentially* replace their nvidia GPU cluster, if they are successful.
@@malax4013 yep, slightly misleading thanks for clearing that up.
Crazy
It's too bad Tesla doesn't take a more creative look at employing elderly folk. I'm 69 not an engineer but taught myself how to write and prosecute patents. Mostly because I'm an inventor and thought that was a way to fame and riches. Boy did I get that wrong. Patented a few things just to get the hang of it and then life happened. Now I'm too old for most employers to even consider hiring. Made my first analog-digital computer in 1965.
Much respect.
I am not sure how project Dojo will play out; NVIDIA is improving in leaps and bounds. The Blackwell project may make Dojo obsolete, but we shall see. The need for training compute is growing immensely, so prices will definitely increase.
It won't. They are completely different. Nvidia dominates the market for now, but all the major tech companies have a big interest in developing their own chips and getting rid of Nvidia.
"only one other chip designer doing wafer scale processing" That is hard to believe. Can you clarify?
But is it any good? Or will Tesla depend on Nvidia?
Cool, I hope we will see more about it.
I think that in the future those massive computing chips will reach normal consumers.
The design of the PC would be like a cube and very power efficient, because of the appearance of full-dive VR.
At this point I am just guessing what it would be used for by a normal customer.🤣
3:10
This is where I am
The ga144 does this already, my enumera project started this in 2000.
WoW! Thanks.
Do you think that Tesla's Optimus humanoid robot will be able to read, write, reason, think, simulate emotions and feelings, have reflexes, senses, etc.?
No
I don’t see why not. Our brain can interface with computers. What does that tell you?
Hello. Does the bv452 have this same feature to handle long PCBs? It fills a section, then adjusts the board and fills the other section.
Tesla stock swings too much to calmly accumulate and hold. After buying TSLA shares for just over 10 years, I'm struggling to make gains presently. How do I adjust or revamp my $2M portfolio? Or should I consider some defensive investments?
In these current unstable markets, it is advisable to diversify while retaining 70-80% in secure investments. Looking at the worth of your portfolio, you should consider financial advisory.
I'm in line with having an advisor oversee my day-to-day investing because my job doesn't permit me the time to analyze stocks myself. Thankfully, my portfolio has 5X'd in barely 5 years, summing up to nearly $1m after subsequent investments to date.
This caught my interest. I worry that I have a couple more months before retirement, and I want to switch to using a financial advisor, but I don’t really know how to find one.
Amber Michelle Smith is the licensed advisor I use. Just research the name. You’d find necessary details to work with a correspondence to set up an appointment.
I looked up her name online and found her page. I emailed and made an appointment to talk with her. Thanks for the tip
“Big ass chip “
Is that an engineering term? 🤣
Great video!
😂😂😂
TESLAA
How are the Personnel issues at Tesla AI? The chief designer you feature a lot in your video was fired a year ago, right?
Excellent
Dojo to scale
Full self-driving next year guys ⚡
A few months ago the new AI-based FSD was released, and it works extremely well. Your opinion is outdated.
It is "first principles" not "first principals."
All of this - and yet you didn't mention Quantum Computers which beat all of this.
Vastly more sustainable and highly intelligent to do so; Tesla was smart.
The original breakout for Nvidia was crypto mining. AI came later, I think a happy surprise, the scale of which could not have been anticipated. Also, it is important to understand that loading more transistors onto one chip reduces overall power consumption. It is more expensive to make, but faster, and cheaper to operate. Power consumption is a major obstacle for AI compute.
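A back-of-the-envelope illustration of why keeping data on one big chip saves power. The energy-per-bit figures below are rough, order-of-magnitude assumptions for illustration only, not measurements of any specific part:

```python
# Rough sketch: energy cost of moving a model's data on-chip vs. off-chip.
# The pJ/bit figures and traffic volume are order-of-magnitude assumptions.
ON_CHIP_PJ_PER_BIT = 0.1     # assumed on-die SRAM/interconnect
OFF_CHIP_PJ_PER_BIT = 10.0   # assumed off-package DRAM/links

bits_moved = 1e15            # assumed data traffic for one training step of a large model

on_chip_joules = bits_moved * ON_CHIP_PJ_PER_BIT * 1e-12
off_chip_joules = bits_moved * OFF_CHIP_PJ_PER_BIT * 1e-12
print(f"On-chip:  {on_chip_joules:.0f} J per step")
print(f"Off-chip: {off_chip_joules:.0f} J per step")
# Same arithmetic, ~100x difference in energy just for data movement -- which is why
# packing more of the system onto one piece of silicon can cut operating cost even
# though the chip itself is more expensive to make.
```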
True
Nvidia's power consumption advantage has rendered all its competitors' products worthless. And with Nvidia rolling out NEW chips each year, nobody is catching up anytime soon.
“… one, big ass chip,” hahahaha!😂
Actually, it's multiple chips on one tile.
With swell robotics everywhere, AI job loss is the only thing I worry about anymore. Anyone else feel the same? Should we cease AI?
Didn’t Elon very recently snatch unused Nvidia hardware from Tesla and put it to use in his new company xAI?
It's all bandwidth bound.
Yes, but does Dojo work and is it more effective and efficient than just using Nvidia? I got the impression that Elon has been downplaying the potential effectiveness of Dojo of late?
Dojo and Nvidia are doing things in different ways. Nvidia is a known operation. Dojo is not. Hard to hire people to do a job that has never been done before. Dojo is far more efficient and less power hungry, but Nvidia currently seems much faster and easier to work with.
@@davidbeppler3032 That's a silly comment lacking factual information. Musk's new startup xAI is using ONLY Nvidia hardware, and will provide AI services to Tesla. Nvidia is leading in power efficiency, AI Supercomputing is all about that. Everything "more than a car company", Musk is pilfering away from Tesla investors.
pretty sure that Nvidia is OP in every single way
@@Niberspace Not if you have a limit on how much you can spend on electricity every hour of every day. Or have a limited budget.
@@davidbeppler3032 What I'm saying is that I don't believe those claims. As for budget, I'm sure Tesla is spending a hundredfold more in R&D to make that than the Nvidia stuff would cost (which makes sense, because if they succeed then they have some original IP).
*STOP BS ING - THEY HAVE NOT EVEN GOT CLOSE TO PRODUCTION*
They need to rebuild the entire Nvidia architecture. They will always be far behind them
😊cOM❤putINg😂
Vertical integration doesn't work for car companies and was abandoned many decades ago.
FSD powerhouse
we all know that tesla uses nvidia h100s bud
I don't know any such details, but I also strongly suspect that this is just marketing, that in reality everything they do is based on Nvidia
I have a Hewlett-Packard ProDesk desktop computer; I think it is about 10 years old and not powerful enough to run Windows 11. Is it now possible, with just a few silicon wafers, to have a much more powerful computer that is very compact? How much do they cost, and what brands are there in the regular consumer market?
Dude, please upgrade. If you want something for less than $400, just buy any second-hand gaming PC with 6 or more cores and a dedicated GPU. Better yet, build a PC yourself. It's super easy to do; it just takes a little bit of research. The "regular" consumer market (Dell, HP, Acer, etc.) sucks when looking at PC towers. PC Part Picker is a great tool to compare prices and build your own system for your own budget. There are loads of guides out there and it's really not that difficult to do. Or you can spend $1000 and get a prebuilt that will be fast and work out of the box.
@@caseymurray7722 Hi Casey! Very nice of you to take the time to reply, and with first-hand knowledge. I got my computer for $150 about 3 years ago, the HP ProDesk, from a charity that gets computers from companies that are upgrading. It is a "business grade" computer. Does getting a newer business-grade computer from the charity mean it's a lot more powerful than a regular consumer computer? Maybe I should get brave and actually build a computer, but that is very intimidating to me. I'm not a very tech-oriented person. What's a "core"? And a "dedicated GPU"? Maybe I could build something for $150. Should I get their 3-5 year old business-grade computers for $150 and add more cores and GPUs?
Sky Net
Tesla dojo is a fiasco. It has nothing to show after years of working on that project.
Didn’t IBM announce quantum supremacy? Computers that can process every possibility available in that moment, all at one time.
This is so clueless on the GPU story. It wasn't just gamers who were GPU users; it was crypto miners and AI researchers. Don't forget that Elon was one of the founders of OpenAI. GPUs are really linear algebra machines: you can choose what matrix manipulations you want. The CUDA software is also needed; it's not just hardware. System on a chip is the concept that ARM perfected. It's why Apple picked ARM and why Nvidia tried to buy ARM. Touting TSMC as something that distinguishes Tesla is putting blinders on your understanding of recent history. Look at Nvidia's recent announcements on the design of the next iteration of system integration. They are on a one-year-per-generation calendar.
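To make the "linear algebra machines" point concrete, here's a tiny neural-network forward pass written as nothing but matrix multiplies (NumPy on CPU, but these same operations are exactly what a GPU accelerates in parallel):

```python
import numpy as np

# A neural network layer is just y = activation(x @ W + b): a matrix multiply plus
# a simple element-wise function. GPUs win because they do the matmuls in parallel.
rng = np.random.default_rng(0)
batch = rng.standard_normal((32, 128))        # 32 samples, 128 features each

W1, b1 = rng.standard_normal((128, 256)), np.zeros(256)
W2, b2 = rng.standard_normal((256, 10)), np.zeros(10)

hidden = np.maximum(batch @ W1 + b1, 0.0)     # ReLU(x @ W1 + b1)
logits = hidden @ W2 + b2                     # second matmul
print(logits.shape)                           # (32, 10)
```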
The support and computing materials.
lol no. You Tesla fanboys are something else. Get over them; it's a slowly failing car company with terrible build quality, not some great innovative computer tech company lol
The total 10-year cost of maintaining and repairing a Tesla is the cheapest in the industry, according to a data analysis shared by Consumer Reports. The data showed that, over the course of a decade, Teslas averaged just $4,035 in maintenance and repair costs, compared to those of automakers Land Rover and Porsche, which landed at $19,250 and $14,090, respectively.
AI greatly helps hardware vendors.
This is so from the future that you will never see it on this timeline, just like the Cybertruck, which is going to change trucking in the multiverse first...
sounds like another tangent by musk
Oh dear God. All supercomputers are parallel computers. Yes, Nvidia GPUs are well suited to processing neural networks because they are matrix-multiplication intensive.
Why don't we use the GTA game with some mods to make it look super realistic? Then we could create super-rare driving scenarios to simulate and learn from, and Tesla would not need cameras for training (faster and cheaper because it's all done on-site) :)
No wonder Nvidia stock is going crazy
It's 100 thousand chips right now.