My theory is that Europe will wait years before allowing any L3 features. The manufacturer will probably have to demonstrate the safety ratings acquired in the US. And they might only allow feature-by-feature... not the whole thing at once. I'm guessing driving on select highways will come first.
I know that in Europe we won't get FSD anytime soon, but I do look forward to seeing videos of V13. And I hope that this way Tesla can show the UNECE, with data, that FSD can be used safely.
Is the big holdup of FSD in Europe because of European regulations, or because Tesla just isn't releasing it there? It isn't available in other countries besides Europe either. I'm in New Zealand, and I don't think the New Zealand government is even considering it. Don't know about Australia.
@@DoomsYann It is the European and the other respective governments around the world that choose to use the UNECE guidelines. The UN comes up with the guidelines but they play no part in implementing them. In fact, at the most recent meeting where the new guidelines were discussed, the UNECE recommended that the guidelines be updated to allow ADAS systems to operate at the level that FSD is currently operating in America, but it was some of the European government representatives (Britain, Germany, and a couple others) that pushed back and said that the guidelines were too liberal and needed to be more restrictive.
There is even a chance that they won't get it at all, or will get some downgraded version, simply because the hardware is not good enough on older cars. Obviously Tesla won't tell you that.
I'm very excited for V13. I love how good 12.5.6.3 is. For the first time, my car drove me to the grocery store today and parked itself. It's struggled before, but it nailed it this time. Adding reverse, driveway, and garage parking... I'm super excited.
I seriously hope that the siren detection only listens outside the vehicle. Otherwise, I can envision kids in the back seat watching movies or cartoons on their iPads where the scene has police chasing suspects. Or maybe Tesla will look for consistency of the siren. Great video.
@@skull1829 In other words, measure Doppler shift (increasing pitch indicating speed of approach) and whether the volume is increasing. If it uses two microphones, it will be able to sense position: whether the siren is directly behind, or over to the left or right, say in the next block.
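A toy sketch of the heuristic described above. All thresholds and function names here are illustrative, not Tesla's actual approach: rising pitch plus rising volume suggests an approaching siren, and the level difference between two microphones gives a crude bearing.

```python
def siren_trend(pitch_hz, volume_db):
    """Approaching siren heuristic: pitch rising (Doppler shift)
    AND volume rising over successive samples."""
    pitch_rising = all(b > a for a, b in zip(pitch_hz, pitch_hz[1:]))
    volume_rising = all(b > a for a, b in zip(volume_db, volume_db[1:]))
    return pitch_rising and volume_rising


def bearing_from_levels(left_db, right_db, threshold_db=2.0):
    """Crude two-microphone direction estimate: the louder side points
    toward the siren; a small difference means roughly ahead/behind."""
    diff = left_db - right_db
    if abs(diff) < threshold_db:
        return "ahead/behind"
    return "left" if diff > 0 else "right"
```

A real system would work on spectrograms rather than pre-extracted pitch numbers, but the decision logic would look broadly similar.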
To park in our house garage, one must take a car elevator. My M3H freaks out every time. It thinks the now-open elevator doors are a wall, so collision avoidance kicks in. When in the elevator, it complains all the cameras are blocked and starts disabling safety and assist features.
There are a lot of songs out there that have sirens from emergency vehicles built-in. Would the car have the ability to distinguish external sirens from sound coming from its own speakers?
Efficient representation of maps and navigation inputs just means that they're still updating the model and the format of the data it works with. It doesn't mean anything tangible for your typical driver; it's an "under the hood" thing.
From a NN implementation perspective, latent space refers to the activation patterns of the internal layers of the NN after the raw input has been transformed. Typically (in the case of, for example, autoencoders), there are simply fewer neuron units than in the input layer, which requires that the NN be efficient in its representation; otherwise the output layer would not have sufficient information to correctly calculate the response. Again, typically, this is handled by raw brute-force compute, letting the NN learn the best latent representation to minimise the error during training. But if they're talking about doing work on the architecture leading to efficiency, then apparently they have some additional insight.
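The bottleneck idea above can be shown with a toy example. This is a minimal sketch with arbitrary, untrained weights (nothing to do with Tesla's networks): 4 input features are squeezed through a 2-unit latent layer, so those 2 numbers must stand in for all 4 inputs.

```python
def matvec(matrix, vec):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(w * x for w, x in zip(row, vec)) for row in matrix]

# Toy bottleneck: 4 inputs -> 2 latent units -> 4 outputs.
# Training would tune these weights so the latent pair preserves as
# much of the input as possible; here they are arbitrary.
W_enc = [[0.5, -0.2, 0.1, 0.3],   # latent unit 1
         [0.0, 0.4, -0.1, 0.2]]   # latent unit 2
W_dec = [[0.5, 0.0], [-0.2, 0.4], [0.1, -0.1], [0.3, 0.2]]  # back to 4

x = [1.0, 0.5, -0.3, 2.0]
latent = matvec(W_enc, x)    # 2 numbers encode the 4-feature input
x_hat = matvec(W_dec, latent)  # imperfect reconstruction attempt
```

With random weights the reconstruction error is large; gradient descent on that error is exactly what forces the latent layer to become an efficient representation.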
Longer context length doesn't mean it will include additional context such as flagmen. It will just remember for longer, or remember more. It might have better temporal coherence, less jiggling or fewer disappearing objects, or just keep some specific data always in the context window. Maybe some locale-specific laws/behaviours or something.
Are you sponsored by Elgato? I see Elgato flashing @2:47 and at some other moments. Subliminal advertising, or did the Elgato Stream Deck automatically create this in your videos?
I didn't buy a Tesla until HW4 was standard on the Model Y, since my main motivation for picking a Tesla is FSD and the safety systems it enables.
I think HW4 also is not enough for real FSD. I think they must add more sensors as backup. What does FSD do in fog, or with a camera problem? Vision-only will work 99.9% of the time, but that's not enough.
@mattm1686 Not like my 2015 Durango with 70,000 miles really needed to be traded in anyway. I can afford to buy new family cars before they are piles of trash. Transportation is just part of the budget. We needed a new car and wanted a Tesla. Knowing that HW4 was on the horizon, we waited until it was available, and even then held off until they ran the 1% financing. There is always something shiny coming, and when you need it, you need it, but sometimes waiting is worth it. The next car will come at the same point, once this one is paid off and getting old. Plus, the gas savings take ~$200 a month off the already low $460 payment. We basically got a new car for $260 a month.
@renekemna5620 The argument that cars can't drive in blind conditions is what I call unrealistic. No, they won't drive, just like humans shouldn't. Maybe one day it will come about, but IMO no one should expect their car to drive in conditions they couldn't. As far as "real FSD", do you have a Tesla with FSD? There isn't much it doesn't already do. We are getting into nitpicking at this point. It always drives me, and I rarely do more than tap the accelerator or turn signal here and there. I have also been using summon every chance I get, and it is so cool. I have never had to let go of the button; it just works. Can't wait till it will legit go park itself.
@@aaronb7990 But often you don't know about the conditions. You are on the highway and heavy snow or something else comes. Lidar can see other cars in those conditions. In Europe we don't have your FSD, so no chance to try it. I think you don't really need summon; it's only a toy for big boys. 🙂
Depreciation is the norm for cars. FSD isn't worthless when sold or traded in, but it depreciates heavily, probably because it's very steeply priced for what it can do today. Which part are you surprised about?
If you trade in or sell your Tesla in less than 5 years, it would be less expensive to lease FSD for $99/month instead of buying it outright. Occasionally, Tesla will offer a limited-time deal that allows owners who purchased FSD to move it to a new Tesla at no charge. These offers usually only last a month or two, so you have to be ready to jump. To stay apprised of these offers, I suggest subscribing to the YouTube channel "Now You Know."
Camera Occlusion messaging: 12.5.6.3 made a big improvement by eliminating the message of “one or more cameras occluded” when driving on a dark highway at night. Yay!
Efficient input to a neural network often means finding a way to represent the input data that makes it easier for the network to process and convert it into a useful internal form for further analysis or use.
Translation for Jimmah on "efficient representation ... inputs": imagine 10 raw vegetables vs. a soup made from them. To cook a dish (the neural net) where the vegetables are ONE of the many ingredients, it's more space-efficient (memory), faster (training speed), and cheaper (less fuel) to use the SOUP, everything else remaining constant. In that sense, efficient representation is like increasing the energy density of batteries: it creates positive ripple effects ACROSS range, handling, costs, etc. But the key is to identify that the SOUP is the best "form factor" (vs. puree, salad, broth, etc.) to integrate the veggies, and then figure out the optimum parameters WITHIN the soup (its recipe). Makes sense? PS: from my (albeit limited) experience with AI models, this is one of the biggest levers in this list.
Are TACC, lane following, and the safety features part of FSD? The reason I ask is that I'm wondering if the frustrating and sometimes scary intervention problems I'm having (and have had for the 2+ years I've owned my hardware 3 M3) might be fixed when the V12 FSD software is eventually released in Australia. I very much doubt there'll be any fixes to TACC, lane following, and the safety features in the existing software. What problems?
- Sudden braking and screeching audio warnings AFTER passing vehicles on the left or right waiting to turn into my lane. Yes, if I see it in time I can put my foot on the accelerator, but I still get the screeching audio.
- "Emergency steering action taken for your safety" 😱🙀 with accompanying audio warnings when doing the most innocuous things, like driving around a roundabout.
- "Automatic steering aborting" with audio warning, usually when crossing signal-controlled intersections where the lane markers disappear, but sometimes for no apparent reason at all.
- "Automatic steering aborting due to technical fault" with audio warning yesterday. I think that was the wording of the text.
Elon Musk likes to talk about driver interventions being reduced to once a year, but what about unwarranted software interventions? Currently multiple times a day. My usual passenger absolutely hates it.
Tesla should offer the option for antsy HW3 car owners to get AI4 for a price. I would pay to upgrade today so as not to wait for the "free" upgrade. Will FSD 13 be the same on cars vs. the CT or Semi?
I think the issue with that would be that the current HW4 can't fit in place of HW3 because they've changed the shape/size of the computer and how it connects etc. So they'll have to make a different retrofit version that's basically HW4 specs but in the HW3 computer form factor.
Great info... One of my questions, coming from other vehicles with cameras, is why Tesla doesn't use the cameras to show actual images when trying to park, like the 360° view on other vehicles' camera systems. When reversing it shows only 3 of the cameras, and going forward no camera view at all. Do you know anything about this? Thanks
Not sure if you take requests, but I'm hoping you'll test the latest version at the intersection of Olympic and Locust. The last time I was there, my HW4 V12.4.1 made a right-hand turn while the "No Right Turn" signs were lit.
When will FSD be able to read traffic signs? I live in Cali. There is a sign near my house: "No right turn on red light when children are present". That would be a necessary capability for a Cybercab.
@ 4:00 - It’s all about how it sees map information, road markings, and signs. Right now, the AI’s interpretation can lead to a messy and inefficient understanding of map data (representation). Tesla is likely training the neural network to better infer and simplify this data, creating a cleaner, more efficient internal representation. Think of it like cleaning up a messy rough sketch into a sharp, detailed map that’s easier to follow and wastes less paper. This helps the AI make decisions faster and with fewer mistakes, which means it should seem to make better overall decisions based on the information it has. Hope that clears things up... Source: I’m a nerd
idk why they are so focused on human-like senses such as cameras and sound. Is it because AI can be trained this way much better? Because arguments like "it works for humans this way, so why not for a machine" are a bit silly when you have so many more sensors you could use to validate the AI's prediction.
I'm driving a model Y equipped with hardware 3. Is it really impossible to upgrade to hardware 4? I'm really curious. I want to upgrade even if I have to pay for it. If you know, please answer :)
It is not literally possible to replace HW3 with HW4, but it's presumably possible to design a computer that physically fits where HW3 is but has HW4 performance. Further, Tesla has had several years to design this. So if it does turn out that HW3 can't support FSD (Unsupervised), my guess is that this is how Tesla will meet the commitment for FSD (Unsupervised) to run on HW3 Teslas.
Will it have a concept of object permanence now? Or will it still drive over someone if it can't avoid a collision and they go out of frame? Can it tell the difference between a distant car and a nearby motorcyclist at night? Or still run them over? Has its cyclist intimidation algorithm been updated? Or will it no longer swerve at them even when it sees them?
Can it not tell the difference between a distant car and a nearby MC today? I mean, obviously they can only use the tail lights to identify at night, but they should easily be able to distinguish the two from that, within limits. Distance, and change in distance, would be estimated from the typical spacing between a car's tail lights. When it comes to an MC, it is more difficult: since it only has one light, you have little or no clue how far away it is. Maybe they have to use the lane markings, placement in the lane, and height? That is, until there is enough light to actually see the contours.
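The tail-light geometry above follows from the pinhole camera model. A minimal sketch, with an assumed ~1.5 m tail-light spacing and a made-up focal length in pixels (real calibration values would differ):

```python
def distance_from_light_spacing(pixel_separation, real_separation_m=1.5,
                                focal_length_px=1000):
    """Pinhole-camera range estimate: distance = f * real_size / pixel_size.
    Assumes roughly 1.5 m between a car's tail lights; a motorcycle's
    single light offers no such baseline, which is what makes it harder."""
    return focal_length_px * real_separation_m / pixel_separation

# Tail lights 30 px apart -> ~50 m away; 150 px apart -> ~10 m away.
```

The same formula shows why a single light is ambiguous: with no known real-world separation to plug in, pixel measurements alone cannot give a range.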
0:20: 😄 1:10: Shows HW4 cams' superiority, then says only the computer needs upgrading. 🤔 I also have little confidence in the value of Elon's promise. There's no limit to how long they could delay delivering on that upgrade. They have a lot of incentive to stall. 2:50: "Context length, which I believe refers to how long its context is." ...Ah, yes, thank you for explaining, wise sage 😉💙
Suggestion for the FSD visualization: all Tesla cars should be in color and full detail like the owner's car in the middle, including Cybertrucks and Semis. All other brands can be generic cars in grey, only displaying the correct car type: a sedan, mini, van, etc.
Within a few miles of here there are 3 completely different versions of "no turn on red" signs. Standardization would help. Also, the "end speed limit" signs everywhere around here are ridiculous. What does that even mean? Pick your own speed limit?
Yeah. For true robotaxi status there's gonna have to be sign reading. There are so many things like construction signs that really need to be read for the car to make reasonable decisions 100% of the time.
@@StormyDog "End speed limit" is for roads that have, let's say, a 55mph limit but a short section where the road drops to 35mph. Where the 35mph limit ends is where the "end speed limit" sign is posted. Why it is like that, I do not know, but it would be nice if road signs were standardized everywhere.
@@HoldFastFilms Exactly, but why not put the actual speed limit on the sign instead of "end speed limit"? The car ends up thinking the speed limit is 35 for the next 20 miles.
@@StormyDog does the "end speed limit" sign even matter? At this point, on two-lane roads, FSD can't speed up when the limit increases or slow down when the limit decreases.
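The complaint in this thread boils down to state handling: an "end speed limit" sign should revert the tracked limit to the road's default instead of freezing the last posted value. A hypothetical sketch (the default-limit lookup and sign encoding are made up for illustration):

```python
def effective_limits(signs, default_mph=55):
    """Track the effective speed limit along a road: a numeric sign
    sets the limit, while an 'end' sign reverts to the road's default
    instead of keeping the last posted value."""
    limit = default_mph
    out = []
    for sign in signs:
        limit = default_mph if sign == "end" else sign
        out.append(limit)
    return out

# Without the revert-on-'end' rule, a car would keep assuming 35 mph
# for miles after the reduced zone actually ended.
```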
I hope they improve this thing. I was letting it drive this morning and it was taking me all kinds of places. I was just driving down the street when, out of nowhere, it made a U-turn and started driving who knows where. I don't know where the heck we were going. Kind of scary. Then I put in my home address and let it take me home, and it was wigging out. I must say I'm a big fan of Tesla and this autonomous driving; I just hope they get it right.
Could it be that V12 took us past the last "false horizon," and their current pathway to FSD is the correct (or at least one of the correct) paths forward? The preview of V13 sure makes it look like it. I have a bit of a unicorn car: a 2019 "Stealth" Model 3 Performance. Tesla called it "Model 3 Performance without the Performance Package." It has track mode and all of the performance of a standard Model 3 Performance, but it doesn't have the Performance suspension, spoiler, metal pedals, and the few other little Performance Package trinkets. Not very many were made. And it only has 25,000 miles on it. I really hate to give this car up for HW4, but wow... V13 has me tempted.
AI engineer's take. More efficient representation (of any model input, including maps) means the car will be able to re-evaluate its immediate trajectory faster, while planning long-term travel (lane staging, etc.) better.

One of the largest reasons why FSD is so "slow", "smooth", and "cautious" is NOT because they trained it to be that way. I mean, yes, they did train it to be that way, but it drives slowly to compensate for a computing performance limitation: not being able to update the travel trajectory fast enough. So when the car under-steers in a turn, or has jerky steering wheel movements during a sweeping turn, this is all an artifact of recalculating the travel trajectory too slowly. The slower the car goes, the safer it is, because it has more time to re-compute the travel trajectory. So it takes turns slowly, and has "learned" to be "cautious".

Now whether it recomputes the whole trajectory up to 0.3 miles ahead, or just the immediate trajectory, is unknown. But it sure looks like it could be doing the "immediate" trajectory a whole lot faster, and doesn't need to recompute the long-term trajectory all that often. So in my opinion it would be much more efficient to break up trajectory planning into a pyramid structure, where the long-term trajectory is updated less frequently, and the short-term trajectory (the immediate turn) is updated very frequently. It's more complex, but much more efficient. This is also the reason why we have late/weird lane changes today, by the way. And why 12.5.6.3 has late braking, and why we have weird "pulsing" behaviors when accelerating or braking.

So it is my opinion that they will break apart long-term path planning from short-term path planning in order to run the short-term plan faster, resulting in an overall more responsive driving experience and better long-term path planning. IMHO, this also translates to HW3.
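The pyramid structure described above can be sketched in a few lines. This is purely illustrative of the scheduling idea (the tick counts and periods are invented, and nothing here is Tesla's actual planner): the expensive long-horizon plan runs only every N ticks, while the cheap immediate trajectory runs on every tick.

```python
def plan_updates(ticks, long_period=10):
    """Count how often each planner layer runs when the long-horizon
    route is refreshed every `long_period` ticks while the immediate
    trajectory is refreshed on every control tick."""
    long_updates = 0
    short_updates = 0
    for t in range(ticks):
        if t % long_period == 0:
            long_updates += 1    # e.g. re-plan the 0.3-mile path
        short_updates += 1       # re-plan the next few seconds
    return long_updates, short_updates

# Over 100 ticks the long-term plan refreshes 10 times while the
# immediate trajectory refreshes all 100 times, freeing per-tick compute.
```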
It's a re-architecture and optimization of the model, and there is enough incentive to make it work on HW3 as of today. HW3 might be a bit more jerky and make lane change decisions not as "early and natural" as HW4, but this change should apply to both.
There’s a safety glitch where it signals too early that a car waiting for you to pass may potentially misinterpret your signal to turn before them and may pull out in front of you. FSD should pass the car then signal.
So should I wait to trade in my Model Y LR with version 3 hardware for version 5 hardware? I want to be 100% confident car can drive me while I am sleeping in the future.
Hardware 5 likely won’t happen until 2026 or 2027. If you can hold off that long then great, if not then you are going to be slightly disappointed. This is why you don’t make purchases with promises of the future. You make purchases on what is currently available and possible.
they should really just offer an upgrade path for HW4, I'm sure a lot of people wouldn't mind dropping money on basically a new PC in the car rather than trying to sell the entire car and buy a new one just for the upgrade in compute
@shannon6876 They promised to "upgrade" HW3, not "upgrade it to HW4." Tesla said not too long ago that such an upgrade is economically unfeasible, so it'll probably end up being HW3.5.
I'm excited for driveway and garage pull-in. I have a myQ garage that opens and closes as the car approaches. It would feel very future-forward if it pulled into my spot.
_Still waiting on handling school zones._ Since we're still on FSD (Supervised), it's still up to you, as the supervisor, to manage school zones. Obviously, this will have to be resolved prior to next year's proposed shift to FSD (Unsupervised) (AKA SAE Level 4), and you can bet they're working hard on this at the mothership.
School zones are hard, not because the car can't read the speed limit sign, but because different areas operate that speed limit at different times, among a number of other reasons. Maybe this version, with one of the future dot updates, will be the one to solve that problem.
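The time-dependence problem above can be made concrete with a hypothetical time-window rule. The hours, limits, and function name here are all invented for illustration; real signs vary wildly by district ("when children are present", flashing beacons, school-days-only, etc.), which is exactly why this is hard to generalize.

```python
from datetime import time

def school_zone_limit(now, base_mph=45, reduced_mph=25,
                      windows=((time(7, 0), time(9, 0)),
                               (time(14, 0), time(16, 0)))):
    """Apply the reduced school-zone limit only during posted hours;
    outside those windows the road's base limit applies."""
    for start, end in windows:
        if start <= now <= end:
            return reduced_mph
    return base_mph
```

Even this toy version needs per-sign metadata (the windows), which a camera alone often cannot read off the sign.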
I'm just looking at the video in amazement... It's incredible what it can already do, and how it scans the landscape for cars and people and changes the view at intersections... It now shows its intended path thru the cars ahead... The little animations for speeding up and slowing down. It's so damn cool!
It is intolerable that HW4 cars are being given priority over HW3 owners who handed over their money to Tesla five years ago. HW3 owners were sold the promise of FSD after the failure of earlier Tesla HW versions. We now need a big class action against Tesla.
No, but very close. FSD has been driving me around San Diego for several weeks now on v12.5.6.x in a 2024 MY HW4, completely hands-free, with only 2-3 disengagements, all of which were navigation issues, which FSD worked out by rerouting to get to the set destination. It's like being driven around by a professional chauffeur, with my hands in my lap or on the armrests. It even does limousine stops at every stop, a skill I've tried unsuccessfully to master. In my opinion, this level of performance is completely worth paying for, and I've done so gladly.
_what I want is the car not to be so close when I put it at the furthest distance between cars._ Sounds like you're using Autopilot, not FSD which is the subject of this video. FSD doesn't allow adjusting the following distance.
I have been following this channel and FSD for a few years now. The amount that FSD has improved in the past couple years is insane! It's basically incomparable to the version from 2 years ago. I am visually impaired, and this technology gives me hope that maybe someday soon I will be able to have a car that can drive me around. As of right now I cannot drive, which is something a lot of people take for granted. FSD and autonomous vehicles will give disabled people their freedom and independence, and I can't wait for it!
DL engineer here. Regarding "efficient representation of maps and navigation inputs", from the response you shared I'm pretty sure it just means they changed the way the model receives this data so it can process it more efficiently.
This was kind of what I thought too. There is a weakness in lane selection, so supply more map navigation data so that it can better "see" the full scope of where a selected lane might lead.
E.g., rather than map data saying "this is three lanes with a turning lane," it's all the organized data of the lanes and intersections within n routing decisions. Like depth for a chess bot.
It's like giving glasses to the model XD
Like adding a tokenizer for map data
I thought of it (for your chess comparison) as: instead of taking a photo of the chess board, it would simply write the move in standard chess notation, like Qf4 for the queen moving to the f4 square.
I hope it means FSD will stop going crazy when the GPS nav doesn't match what the cameras see. It happens to me a lot, such as when the GPS thinks I'm on an access road instead of the highway, or on the next street over. GPS nav is an input to FSD, and FSD has to decide where to go.
BABE WAKE UP V13 JUST DROPPED!!! oh it's just a prediction video 😢
I made this video so that V13 will release this weekend
@@AIDRIVR Now we just need chuck to be away on work
@@AIDRIVR Smart move! Let's summon the devil 😈
@@AIDRIVR Hopefully you're right 😅
@@AIDRIVR If it doesn't show up before the end of Q1 2025 I'm blaming you for jinxing it. jk.
Dang it I got so excited just seeing that v13 in the title, hahaha
I expect it any day now
@AIDRIVR literally considering buying my first Tesla next month so that would be amazing
I've worked on quite a few basic neural nets for a while now, and sometimes once you get a model working, you can scale it down. A lot of neural net stuff has to do with inputs more than anything else. That's the main reason why Tesla is so good at all this stuff, because they have so much input data. Just sharing my thoughts.
V11 to V12 was the biggest jump yet…excited for V13
Will they fix the fact that FSD can't see at night? I drove under a bridge in broad daylight and it complained that it couldn't see.
@@GizmoMaltese I don't recall how Tesla addresses exposure problems in the transition zones between dark and bright places, but it is a very common issue for all automakers and cameras in general.
Still rough to feel left behind on AI3, but as a shareholder, v13 brings a ton of excitement.
Yeah, just bought a new Model Y this year. Seems a bit ridiculous that it’s already losing support. I’m sure we’ll get little tweaks here and there. I’d pay for an upgrade to AI4 if it were available.
@@kungfoochicken08 my Model Y built July 2023 has HW4, so yours should too
@@kungfoochicken08 Your Model Y, if it is brand new, has hardware 4.
@@kungfoochicken08 Don't pay for something that someone owes you
You have AI4, fool @@kungfoochicken08
A moment of silence for people who have HW3 cars... Including me :(
I have HW4 but I don’t live in the US…
@@DoomsYann the coming world government might be just around the corner
@@DoomsYann I'll have a moment of silence for you too.
@ with the latest results of the UNECE meeting I doubt it.
I almost bought a Tesla when I was working there (we had JUST initiated the Highland release), but I stopped there. HW4 may not solve FSD, and HW5 could still end up not solving FSD to the satisfaction of the NHTSA. In fact, there were 2 recent fatal crashes where FSD was deployed, and another with a Cybertruck on FSD that would have driven right into the rear of a turning car if the OP hadn't intervened. There was still contact, but very minimal due to operator intervention.
Soon instead of "driver must pay attention to take over in emergency situations", we are going to have "driver's actions may be limited during emergency situations".
Image processing is used to “clean” the image feed. For example, remove a bug stain from the image feed that is irrelevant to the drive. This is probably an algorithm improvement for that.
I'll chime in as an ML/AI scientist: efficient representation of maps probably refers to a change in the way they process the data before input into the network. They may have used or altered their dimensionality reduction technique to restrict the input features, which describe the world/map, used to compute the next decision. Look up the curse of dimensionality for more info.
I have always felt that FSD needed audio inputs. Cool to hear it's coming.
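To make the dimensionality reduction point concrete, here is a minimal sketch of one of the simplest such techniques, variance-based feature filtering. The data and threshold choice are invented for illustration, and Tesla's actual preprocessing is not public:

```python
def top_variance_features(rows, k):
    """Keep only the k input features with the highest variance,
    dropping near-constant columns that add dimensions to the input
    without adding information."""
    n = len(rows)
    d = len(rows[0])
    variances = []
    for j in range(d):
        col = [r[j] for r in rows]
        mean = sum(col) / n
        variances.append(sum((v - mean) ** 2 for v in col) / n)
    keep = sorted(range(d), key=lambda j: variances[j], reverse=True)[:k]
    keep.sort()  # preserve original feature order
    return [[r[j] for j in keep] for r in rows]

data = [[1.0, 5.0, 0.0],
        [2.0, 5.0, 0.1],
        [9.0, 5.0, 0.2]]   # middle feature is constant, so it gets dropped
reduced = top_variance_features(data, 2)
```

Learned techniques (PCA, autoencoders, learned embeddings) do the same job more cleverly, but the goal is identical: fewer, denser input features for the network to digest.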
What would be very useful is an option to select how much battery percentage you would like to have when reaching a destination.
I mean, not really, because you can never account for weather or traffic conditions. It should just work how it does now and try to drive as efficiently as possible given the selected drive setting.
@@ProXcaliber The car gets up-to-date weather forecasts throughout the route, and accounting for these is not that difficult.
@@martinivan5884 Yes, yes it is. It already uses that to calculate its current battery percentage and an approximation of what the percentage will be when you arrive. But being able to select exactly how much battery you want to arrive with would mean predicting the future, including traffic and a number of other factors. Weather is just one aspect, and keep in mind that weather forecasts are not 100% accurate.
standard in EQA for 2 years
"Predicting the future..." A simple estimation is a 30-second calculation anybody can do by hand. Nobody expects perfection, but any idiot can determine that you need a charging stop to arrive with 80%.
@@benjaminmeusburger4254 At that point, what is the point? Might as well input your destination and rely on what the estimation is. If you use the app to input your destination before getting in the car, it will tell you what it thinks you will arrive with. And they are adding a feature to the app that allows you to select how much you want to arrive with and then charges the car accordingly, BEFORE departing. But even then, it still requires you to charge the car, and is not something that you can just do on the fly. Doing those kinds of calculations on the car on the fly takes up a lot of computing power, meaning less computing power that can be put towards FSD, and likely would also mean less power efficiency.
Running the newest V12 on a HW4 Model X. Mine will do a little maneuver when I get home now, mimicking flipping around and backing into my driveway; it ends right when I would shift it into reverse.
Outstanding Episode. Truly contagious. Thanks.
Context length scaling means that the model can see further back in time to make decisions.
Cries in europe 😭
Or anyone else under UNECE rules
no
My theory is that Europe will wait years before allowing any L3 features.
The manufacturer will probably have to demonstrate the safety ratings acquired in the US.
And they might only allow feature-by-feature... not the whole thing at once.
I'm guessing driving on select highways will come first.
Imagine having paid the full amount for FSD to see it not work.
@@ev.c6
Thanks. Can’t wait.
I know that in Europe we won't get FSD anytime soon, but I do look forward to seeing videos of V13.
And I hope that this way Tesla can show the UNECE, with data, that FSD can be used safely.
Is the big holdup of FSD in Europe because of European regulations, or because Tesla just isn't releasing it there? It's not in other countries besides Europe either. I'm in New Zealand, and I don't think the New Zealand government is even considering it. Don't know about Australia.
@ it isn’t Europe or your federal government, its the UNECE (a branch of the UN) that is prohibiting it.
@@DoomsYann It is the European and the other respective governments around the world that choose to use the UNECE guidelines. The UN comes up with the guidelines but they play no part in implementing them. In fact, at the most recent meeting where the new guidelines were discussed, the UNECE recommended that the guidelines be updated to allow ADAS systems to operate at the level that FSD is currently operating in America, but it was some of the European government representatives (Britain, Germany, and a couple others) that pushed back and said that the guidelines were too liberal and needed to be more restrictive.
@@DoomsYann Specifically that the US never signed up to the WP.29 regulation body, largely about harmonising regulations within Europe.
@@wr2382 Then all the country representatives agreed and so we're now lucky to get autonomous lane change by Sept 2025 at the earliest. Thanks UN.
Excellent video. 👍 These are exciting times!
It's disheartening that the first people in North America who bought FSD will be the last ones to get unsupervised.
There is even a chance that they won't get it at all, or will get some downgraded version, simply because the hardware is not good enough on older cars. Obviously Tesla won't tell you that.
Yo I just had my car "Come" pick me up from the back of my chiropractor, it was EPIC!
Hey that’s cool, but you should go to a real doctor instead of a quack
@@peterwmdavis a 'real' doctor won't adjust your hips, back or neck for you.
I'm very excited for V13. I love how good 12.5.6.3 is. For the first time, my car drove me to the grocery store today and parked itself. It's struggled before, but it nailed it this time. Adding reverse, driveway, and garage parking... I'm super excited.
It's neat but it drives 2x slower than a human
I seriously hope that the siren detection is done from outside the vehicle, Otherwise, I can envision kids in the back seat watching movies or cartoons on the iPads where the scene has police chasing suspects. Or maybe Tesla will look for consistency of the siren. Great video.
it will likely use data to test whether the sound is getting closer and if so whether or not other cars are reacting to it.
@@skull1829 In other words, measure Doppler shift (increasing pitch indicating speed of approach) and whether the volume is increasing. If it uses two microphones, it will also be able to sense position: whether the siren is directly behind, or off to the left or right, say in the next block.
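The two-microphone idea above is essentially time-difference-of-arrival (TDOA): cross-correlate the two channels and the lag of the correlation peak tells you which side the sound came from. A minimal sketch, with synthetic signals standing in for real microphone feeds:

```python
import numpy as np

def tdoa_direction(left, right, sample_rate):
    """Estimate time difference of arrival between two mic channels.

    Cross-correlate the channels; the lag of the peak shows which mic
    heard the signal first. A positive result means the left channel is
    delayed, i.e. the right mic heard it first (source on the right).
    """
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)   # delay in samples
    return lag / sample_rate                    # delay in seconds

# Toy 'siren' burst that reaches the right mic 5 samples earlier
rate = 8000
t = np.arange(200)
burst = np.sin(2 * np.pi * 0.05 * t) * np.exp(-t / 60)
right = np.zeros(300); right[20:220] = burst
left = np.zeros(300); left[25:225] = burst    # delayed copy -> source on right
print(tdoa_direction(left, right, rate))       # positive: siren to the right
```

Real systems would add Doppler and loudness tracking on top, as the comment describes, but the lag sign alone already gives left/right bearing.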
But if the cameras don't see cops, then it shouldn't affect it.
To park in our house garage, one must take a car elevator. My M3H freaks out every time. It thinks the now open elevator doors are a wall so collision avoidance kicks in. When in the elevator it complains all the cameras are blocked and starts disabling safety and assist features.
There are a lot of songs out there that have sirens from emergency vehicles built-in. Would the car have the ability to distinguish external sirens from sound coming from its own speakers?
Efficient representation of maps and navigation inputs just means that they're still updating the model and the format of the data it works with. It does not mean anything tangible for your typical driver; it's an "under the hood" thing.
From a NN implementation perspective, latent space refers to the activation patterns of the internal layers of the NN after the raw input has been transformed. Typically (in the case of, for example, autoencoders), there are simply fewer neuron units than in the input layer, which requires that the NN be efficient in its representation; otherwise the output layer would not have sufficient information to correctly calculate the response. Again, typically, this is handled by raw brute-force compute, letting the NN learn the best latent representation to minimise the error during training, but if they're talking about doing work on the architecture leading to efficiency, then apparently they have some additional insight.
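The autoencoder bottleneck described above can be sketched in a few lines. This is shape-level illustration only: the weights here are random (the training loop that would force an efficient latent code is omitted), and the 128/16 sizes are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# A 128-dim input squeezed through a 16-unit bottleneck (the
# "latent" layer), then decoded back to 128 dims. Training would
# minimise reconstruction error, forcing the 16 latent units to
# carry the essential information.
W_enc = rng.normal(scale=0.1, size=(128, 16))
W_dec = rng.normal(scale=0.1, size=(16, 128))

def encode(x):
    return np.tanh(x @ W_enc)    # latent activations, 16-dim

def decode(z):
    return z @ W_dec             # reconstruction, 128-dim

x = rng.normal(size=(1, 128))    # one raw input vector
z = encode(x)
x_hat = decode(z)
print(z.shape, x_hat.shape)      # (1, 16) (1, 128)
```

The latent `z` is the compressed representation; since it is 8x smaller than the input, the network must learn an efficient encoding or reconstruction fails.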
A longer context length doesn't mean it will include additional context such as flagmen. It will just remember for longer, or remember more. It might have better temporal coherence, less jiggling or fewer disappearing objects, or just keep some specific data always in the context window, maybe some local laws/behaviours or something.
When will this come to europe?😢
When the government allows it 😂
When the UNECE allows it
Are you sponsored by Elgato? I see Elgato flashing @2:47 and at some other moments. Subliminal advertising, or did the Elgato Stream Deck automatically create this in your videos?
I didn't buy a Tesla until HW4 was standard on the Model Y, since my main motivation for picking a Tesla is FSD and the safety systems it enables.
And Hardware 5 will come out; Tesla wants to be like your cell phone, constantly updating every six months. Good luck.
I think HW4 is also not enough for real FSD. I think they must add more sensors as backup. What does FSD do in fog or with a camera problem? Vision-only will work 99.9% of the time, but that's not enough.
@mattm1686 Not like my 2015 Durango with 70,000 miles really needed to be traded in anyway. I can afford to buy new family cars before they are piles of trash.
Transportation is just part of the budget. We needed a new car and wanted a Tesla. Knowing that Hw4 was on the horizon, we waited until it was available and even then held off until they ran the 1% financing. There is always something shiny coming, and when you need it, you need it, but sometimes waiting is worth it. The next car will come at the same point, once this one is paid off and getting old.
Plus, the gas savings take ~$200 a month off the already low $460 payment. We basically got a new car for $260 a month.
@renekemna5620
The argument about cars not being able to drive in blind conditions is what I call an unrealistic expectation. No, they won't drive in those conditions, just like humans shouldn't. Maybe one day it will come about, but IMO no one should expect their car to drive in conditions they couldn't.
As far as 'real FSD', do you have a Tesla with FSD? There isn't much it doesn't already do. We are getting into the nit picking at this point. It always drives me and I rarely do more than tap the accelerator or turn signal here and there.
I have also been using the summon every chance I get, and it is so cool. I have never had to let go of the button it just works. Can't wait till it will legit go park itself.
@@aaronb7990 But often you don't know about the conditions in advance; you are on the highway and heavy snow or something else comes. Lidar can see other cars in those conditions. In Europe we don't have your FSD, so no chance to try it. I think you don't really need Summon; it's only a toy for big boys. 🙂
I have bought FSD THREE times ($12k, $12k, and $10k), and I'm just a tad bit pissed that I've been shoved to the back of the line.
Does Apple give you a new phone every year because you bought one once?
Depreciation is the norm for cars. FSD isn't worthless when sold or traded in, but it depreciates heavily, probably because it's very steeply priced for what it can do today. Which part are you surprised about?
If you trade in or sell your Teslas in less than 5 years, it would be less expensive to lease FSD for $99/month instead of buying it outright. Occasionally, Tesla will offer a limited-time deal that allows owners who purchased FSD to move it to a new Tesla at no charge. These offers usually only last a month or two, so you have to be ready to jump. To stay apprised of these offers, I suggest subscribing to the YouTube channel "Now You Know."
@@joshgoodman6534 No, but they don't keep promising year after year that FSD is coming next year.
@@robertozanconato4992 Ah, so Apple is 100% truthful with their advertising?
Camera Occlusion messaging: 12.5.6.3 made a big improvement by eliminating the message of “one or more cameras occluded” when driving on a dark highway at night. Yay!
Efficient input to a neural network often means finding a way to represent the input data that makes it easier for the network to process and convert it into a useful internal form for further analysis or use.
Translation for Jimmah on "Efficient representation... inputs":
Imagine 10 raw vegetables vs a soup made from them.
To cook a dish (the neural net) where the vegetables are ONE of many ingredients, it's more space-efficient (memory), faster (training speed), and cheaper (less fuel) to use the SOUP,
everything else remaining constant.
In that sense, efficient representation is like increasing the energy density of batteries: it creates positive ripple effects ACROSS range, handling, costs, etc.
But the key is to identify that the SOUP is the best "form factor" (vs puree, salad, broth, etc.) to integrate the veggies, and then figure out the optimum parameters WITHIN the soup (its recipe).
Makes sense?
PS: this is one of, if not the biggest, levers in this list, from my (albeit limited) experience with AI models.
Are TACC, lane following and safety features part of FSD?
The reason I ask is that I'm wondering if the frustrating and sometimes scary intervention problems I'm having (and have had for over 2 years I've had my hardware 3 M3) might be fixed when the v12 FSD software is eventually released in Australia.
I very much doubt whether there'll be any fixes to the TACC, lane following and safety features in the existing software.
What problems?
- sudden braking and screeching audio warnings AFTER passing vehicles on the left or right waiting to turn into my lane - yes, if I see it in time I can put my foot on the accelerator but I still get the screeching audio.
- "Emergency steering action taken for your safety" 😱🙀 with accompanying audio warnings when doing the most innocuous things like driving around a roundabout.
- "Automatic steering aborting" with audio warning - usually crossing signal controlled intersections where the lane markers disappear but sometimes for no apparent reason at all.
- "Automatic steering aborting due to technical fault" with audio warning yesterday. I think that was the wording of the text.
Elon Musk likes to talk about driver interventions being reduced to once-a-year but what about unwarranted software interventions? Currently multiple times a day. My usual passenger absolutely hates it.
Tesla should offer antsy HW3 car owners the option to get AI4 for a price. I would pay to upgrade today so as not to wait for the "free" upgrade.
Will FSD 13 be same on cars vs CT or Semi?
Probably not considering their size differences.
Buy new car then
@ Buying a CT. My refreshed MS has just 10K miles so changing car not prudent.
I think the issue with that would be that the current HW4 can't fit in place of HW3 because they've changed the shape/size of the computer and how it connects etc. So they'll have to make a different retrofit version that's basically HW4 specs but in the HW3 computer form factor.
Will it be able to read traffic signs?
😂😂😂
Great info... One of my questions, coming from other vehicles with cameras: why doesn't Tesla use the cameras to show actual images when trying to park, like the 360-degree view on other vehicles? When reversing it shows only 3 of the cameras, and going forward no camera view at all. Do you know anything about this? Thanks
Destination options sounds extremely useful.
Very informative... thank you. Looking forward to your coverage on v 13. SoCalFreddy
This looks really exciting. I can hardly wait to see it firsthand.
Not sure if you take requests but I’m hoping you’ll test the latest version at the intersection of Olympic and Locust. The last time I was there my HW 4 V 12.4.1 made a right hand turn while the “No Right Turn” signs were lit.
When will FSD be able to read traffic signs? I live in Cali. There is a sign near my house: "No right turn on red light when children are present." That would be a necessary capability for a Cybercab.
Musk should pay for the upgrades for HW3 out of his $47B bonus.
I agree, starting and destinations are the only thing that my model y has a problem with.
Even if you could upgrade an AI3 to AI4, wouldn't you need new cameras as well to be compatible with v13 and greater?
@ 4:00 -
It’s all about how it sees map information, road markings, and signs. Right now, the AI’s interpretation can lead to a messy and inefficient understanding of map data (representation). Tesla is likely training the neural network to better infer and simplify this data, creating a cleaner, more efficient internal representation. Think of it like cleaning up a messy rough sketch into a sharp, detailed map that’s easier to follow and wastes less paper. This helps the AI make decisions faster and with fewer mistakes, which means it should seem to make better overall decisions based on the information it has. Hope that clears things up...
Source: I’m a nerd
0:05 Hmm, why is that guy signalling to the car that it can back up more? o.O
For fun, or does the car understand it? Or is it remote controlled?
I'd missed you. You may not believe it, but just yesterday I searched for your channel to check if there was a new video I might have missed.
I’ve been working on a top secret project. all will be revealed soon :]
Could the PWS speaker also be used to “hear” sirens?
I think for a few maneuvers it's going to need that forward camera in the Cybertruck and Cybercab.
Did I just meet you on Golf +? I do enjoy your videos very informative and detailed. Hopefully will see you around again in Golf + for a rematch.
Excellent video buddy. I love this stuff.
I fully expect epic explosions and good particle effects 😝
Idk why they are so focused on human-like senses such as cameras and sound. Is it because AI can be trained much better this way? Arguments like "it works for humans this way, so why not for a machine" are a bit silly when you have so many more sensors you could use to validate the AI's prediction.
I'm driving a model Y equipped with hardware 3. Is it really impossible to upgrade to hardware 4? I'm really curious. I want to upgrade even if I have to pay for it. If you know, please answer :)
It is not literally possible to replace HW3 with HW4, but it's presumably possible to design a computer that physically fits where HW3 is but has HW4 performance. Further, Tesla has had several years to design this. So if it does turn out that HW3 can't support FSD (Unsupervised), my guess is that this is how Tesla will meet the commitment for FSD (Unsupervised) to run on HW3 Teslas.
@@BigBen621 Thank you. Your comment gives me a small glimmer of hope.
Lucky number 13 baby!
I can't see how HW3 will work even with new computer since the HW3 cameras will still suck.
Will it have a concept of object permanence now, or will it still drive over someone if it can't avoid a collision and they go out of frame? Can it tell the difference between a distant car and a nearby motorcyclist at night, or will it still run them over? Has its cyclist-intimidation algorithm been updated, or will it no longer swerve at them even when it sees them?
Can it not tell the difference between a distant car and a nearby MC today?
I mean, obviously, they can only use the tail lights to identify, but they should easily be able to distinguish the two from that.. within limits.
Distance and difference in distance would be estimated by the common distance between the lights.
When it comes to MC, it is more difficult. Since it only has one light, you have little/no clue how far away it is. Maybe they have to use the line markings, placement in the lane, and height? That is, until there is enough light to actually see the contours.
Object permanence should be improved with the "longer context window".
Basically, a "context window" can be interpreted as short-term memory.
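The short-term-memory framing can be sketched with a toy tracker: objects detected in the last N frames stay "alive" even when a frame misses them, so a longer window means better object permanence. The class and scenario here are invented for illustration.

```python
from collections import deque

class ObjectMemory:
    """Toy 'context window' for perception: remember objects seen in
    the last `window` frames so a briefly occluded cyclist doesn't
    vanish from planning. A longer window = longer short-term memory.
    """
    def __init__(self, window):
        self.frames = deque(maxlen=window)   # oldest frames fall off

    def observe(self, detections):
        self.frames.append(set(detections))

    def active_objects(self):
        # Union over the window: anything seen recently still "exists"
        return set().union(*self.frames) if self.frames else set()

short_mem = ObjectMemory(window=2)
long_mem = ObjectMemory(window=8)
for frame in [{"cyclist"}, set(), set(), set()]:   # occluded for 3 frames
    short_mem.observe(frame)
    long_mem.observe(frame)

print("cyclist" in short_mem.active_objects())  # False: already forgotten
print("cyclist" in long_mem.active_objects())   # True: still tracked
```

A real system tracks positions and velocities, not just labels, but the memory-length tradeoff is the same.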
0:20: 😄
1:10: Shows HW4 cams' superiority. Says only the computer needs upgrading. 🤔 I also have little confidence in the value of Elon's promise. There's no limit to how long they could delay delivering on that upgrade. They have a lot of incentive to stall.
2:50: "Context length, which I believe refers to how long its context is." ...Ah, yes, thank you for explaining, wise sage 😉💙
Suggestion for the FSD visualization: all Tesla cars should be in color and full detail like the owner's car in the middle, including Cybertrucks and Semis. All other brands can be generic grey cars, displaying only the correct car type: sedan, mini, van, etc.
I was driving from LA to Vegas and saw you near Primm, NV. I was in a Utah car that said K LAMAR❤
@2:47 Don't think we didn't see the Elgato :P :O
Great video.. I can’t wait to receive V13
If it reads a “no turn on red” sign, that will help.
Within a few miles of here there's 3 completely different versions of “no turn on red” signs. Standardization would help. Also the "end speed limit" signs everywhere around here are ridiculous. What does that even mean? Pick your own speed limit?
Yeah. For true robotaxi status there's gonna have to be sign reading. There are so many things like construction signs that really need to be read for the car to make reasonable decisions 100% of the time.
@@StormyDog End speed limit is for roads that have, let's say 55mph limit but there is a short section when road goes to 35mph. When 35mph limit ends, is where end speed limit sign would be posted. Why it is like that, I do not know but it would be nice if road signs are standardized everywhere.
@@HoldFastFilms Exactly, but why not put the actual speed limit on the sign instead of "end speed limit". The car ends up thinking the speed limit is 35 for the next 20 miles.
@@StormyDog does the "end speed limit" sign even matter? At this point, on two-lane roads, FSD can't speed up when the limit increases or slow down when the limit decreases.
I hope they improve this thing. I was letting it drive this morning and it was taking me all kinds of places; I'm just driving down the street and out of nowhere it made a U-turn and started driving who knows where. I don't know where the heck we were going. Kind of scary. Then I put in my home address and let it take me home, and it was wigging out. I must say I'm a big fan of Tesla and this autonomous driving; I just hope they get it right.
I doubt older teslas with V3 hardware will ever get V4 hardware w/o $$$ spent by owners. I'm not holding my breath.
We’ve been waiting so long 😭
I’m sure the HW3 upgrade will be Customer Pay if you want it. I don’t see them covering the cost
I have HW3 and I think a good middle ground would be to give enhanced autopilot as a free upgrade
@@mathewluby9896 I don't see why HW3 owners should settle for less than what they were promised.
@TheSpartan3669 Agreed, but I can't see every HW3 car being recalled for an upgrade if the law states FSD requires HW4 or above.
_I don’t see them covering the cost_
Elon has already said they would, if FSD (Unsupervised) won't run on HW3 and you bought FSD originally.
Your track record tells me that V13 will be released within hours to days after this video is released 😂
Could it be that v12 took us past the last "false horizon," and their current pathway to FSD is the correct (or at least one of the correct) path(s) forward? The preview of v13 sure makes it look like it.
I have a bit of a Unicorn car: 2019 "Stealth" Model 3 Performance. Tesla called it "Model 3 Performance without the Performance Package." It has track mode, and all of the performance of a standard Model 3 Performance, but it doesn't have the Performance suspension, spoiler, metal pedals, and the few other little Performance Package trinkets. Not very many were made. And it only has 25,000 miles on it.
I really hate to give this car up for HW4, but wow... v13 has me tempted.
For the Cybertruck with Summon?
I’m excited to see how XAI33x performs! Your video made me a believer!
How's that phantom braking going?
it was phantom all along
Anyone else notice that 12.5.6.3 won’t change lanes until the last second
AI engineer's take. More efficient representation (of any model input, including maps) means the car will be able to re-evaluate its immediate trajectory faster, while planning long term travel (lane staging, etc) better. One of the largest reasons why FSD is so "slow", "smooth", and "cautious" is NOT because they trained it to be that way.
I mean, yes they did train it to be that way, but it drives slow to compensate for a computing performance limitation of not being able to update the travel trajectory fast enough. So when the car under-steers in a turn, or has jerky steering wheel movements during a sweeping turn - this is all an artifact of recalculating the travel trajectory too slowly. The slower the car goes, the safer it is because it has more time to re-compute the travel trajectory. So it takes turns slowly, and has "learned" to be "cautious".
Now whether or not it recomputes the whole trajectory up to 0.3 miles ahead, or just the immediate trajectory is unknown. But it sure looks like it could be doing the "immediate" trajectory a whole lot faster, and doesn't need to recompute long-term trajectory all that often. So in my opinion it would be much more efficient to break up the trajectory planning into a pyramid structure, where long-term trajectory is updated less frequently, and short term trajectory (immediate turn) is updated very frequently. It's more complex, but much more efficient.
This is also the reason why we have late / weird lane changes today, by the way. And why 12.5.6.3 has late braking, and why we have weird "pulsing" behaviors when accelerating or braking. So it is my opinion that they will break apart long term path planning from short term path planning in order to run the short term plan faster, resulting in an overall more responsive driving experience and better long-term path planning.
IMHO, this also translates to HW3. It's a re-architecture and optimization of the model, and there is enough incentive to make it work on HW3 as of today. HW3 might be a bit more jerky and make lane change decisions not as "early and natural" as HW4, but this change should apply to both.
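The "pyramid" scheduling proposed above can be sketched as two loops at different rates: an expensive long-horizon plan refreshed occasionally, and a cheap immediate-trajectory update on every tick. The planner internals here are stand-ins; only the scheduling structure is the point.

```python
def drive(ticks, long_period=10):
    """Hierarchical replanning sketch: refresh the costly long-horizon
    route every `long_period` ticks, while the cheap short-term
    trajectory update runs on every single tick.
    """
    long_updates = short_updates = 0
    route = None
    for tick in range(ticks):
        if tick % long_period == 0:
            # Costly: lane staging, path up to ~0.3 mi ahead
            route = f"route@{tick}"
            long_updates += 1
        # Cheap: immediate steering/speed, conditioned on the route
        _trajectory = (route, tick)
        short_updates += 1
    return long_updates, short_updates

print(drive(100))   # (10, 100): 10 long-horizon plans, 100 fast updates
```

Running the short-term loop 10x more often than the long-term one is what would let the car react faster in turns without recomputing the whole route every frame.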
3:28 Just so you know, „et cetera" is Latin and spelled with a t ✌️
There’s a safety glitch where it signals too early that a car waiting for you to pass may potentially misinterpret your signal to turn before them and may pull out in front of you. FSD should pass the car then signal.
So should I wait to trade in my Model Y LR with version 3 hardware for version 5 hardware? I want to be 100% confident car can drive me while I am sleeping in the future.
Hardware 5 likely won’t happen until 2026 or 2027. If you can hold off that long then great, if not then you are going to be slightly disappointed. This is why you don’t make purchases with promises of the future. You make purchases on what is currently available and possible.
It’s criminal what they are doing to the older cars. My car is two years old and already out of date?
Good!! That’s why I get a new car every year just like my phone.
they should really just offer an upgrade path for HW4, I'm sure a lot of people wouldn't mind dropping money on basically a new PC in the car rather than trying to sell the entire car and buy a new one just for the upgrade in compute
@shannon6876 They promised to "upgrade" HW3, not "upgrade it to HW4." Tesla said not too long ago that such an upgrade is economically unfeasible, so it'll probably end up being HW3.5.
I'm excited for driveway and garage pull-in. I have a myQ garage that opens and closes as the car approaches. It would feel very future-forward if it pulled into my spot itself.
Patiently waiting for real FSD in Europe, but the advanced cruise control is already nice and useful.
Still waiting on handling school zones. Where I am, they are photo enforced and you will get a ticket besides safety for kids.
_Still waiting on handling school zones._
Since we're still on FSD (Supervised), it's still up to you as the supervisor to manage school zones. Obviously, this will have to be resolved prior to next year's proposed shift to FSD (Unsupervised) (AKA SAE Level 4), and you can bet they're working hard on this at the mothership.
FSD supervised for HW3 should be discounted now.
I guess training on only HW4 data cuts the data supply down significantly, but being able to do more with that data is sweet. Thanks for the update!
How do you know if you have Hardware 3?
I believe it should say so in the vehicle information under the software tab.
FSD needs some level of reasoning, and to use different behaviors accordingly. For example, school zones.
School zones are hard, not because it can’t read the speed limit sign, but because different areas operate that speed limit at different times and a number of other reasons. Maybe this version with one of the future dot updates will be the one to solve that problem.
It would probably be enough to train more on such data.
I'm just looking at the video in amazement.. its incredible what it can already do, and how it scans the landscape for cars and people, changes the view at intersections.. it now shows its intended path thru the cars ahead.. the little animations for speeding up and slowing down. It's so damn cool!
when does it come man
4:13 “elgato no signal” for one frame :D
It is intolerable that HW4 cars are being given priority over HW3 owners who handed over their money to Tesla five years ago. HW3 owners were sold the promise of FSD after the failure of earlier Tesla HW versions. We now need a big class action against Tesla.
Version 13 may be a robotaxi supervisor.
Not full self driving I'm guessing
No, but very close. FSD has been driving me around San Diego for several weeks now on v12.5.6.x in a 2024 MY HW4, completely hands-free, with only 2-3 disengagements-all of which were navigation issues, which FSD worked out by rerouting to get to the set destination. It's like being driven around by a professional chauffeur, with my hands in my lap or on the armrests. It even does limousine stops at every stop, a skill I've tried unsuccessfully to master. In my opinion, this level of performance is completely worth paying for, and I've done so gladly.
What I want is for the car not to be so close when I put it at the furthest distance between cars. It follows a little too close; I'd like it to keep the distance better.
_what I want is the car not to be so close when I put it at the furthest distance between cars._
Sounds like you're using Autopilot, not FSD which is the subject of this video. FSD doesn't allow adjusting the following distance.
Current Free FSD (Supervised) is dangerous on my MYP HW3