I have had FSD on 2 cars since about 2021 (I think). It literally has not improved a bit and I bet FSD will never get to L3, let alone L4. You cannot drive a mile without a hard crash-avoiding intervention. Why the fanboys think vision-only autonomy will work like the idiot Musk claims is beyond me. Tesla is literally dead last today in its self-driving tech, and that is coming from someone who wasted $7k on this pipe dream (meaning I want it to work but know it can't).
_It literally has not improved a bit_ That's one of the most absurd claims I've ever heard, and pretty much destroys your credibility for commenting on FSD.
Maybe some of the smaller channels are focusing too much on problems, giving viewers a skewed view. I have been using FSD for 4 months, V12.3.4 to V12.5.1.1 now. It performs wonderfully most of the time, and I can tell when it is likely to have a problem well enough in advance to avoid the problem behavior and report it for fleet improvement. It once drove during a downpour of rain that flooded highways. No unexpected behavior.
Not sure if this was stated. But did you have it in chill, average or assertive setting? Thank you for your time and service. Jesus Christ is alive. Please get a Bible and read it. If you do what he commands. He will give awesome gifts and powers.
When doing these types of tests, do NOT EVER press the pedals. Have patience, unless it is absolutely necessary. You always need to pretend as if no one is behind the wheel.
Looks more like a horse to me. It can handle many things, but it can't read and can get "scared" or "confused" sometimes, so you need to grab the reins. At least it doesn't bolt.
I'd beg to differ. I use FSD v12.5 every day on every drive just not in a big city. Zero disengagements on just about every drive. Works at night in the rain and super helpful when I'm driving in a new area. Rarely makes mistakes any more. Speed control is the biggest problem now. A few other improvements needed but at the recent rate of change by next year it should be pretty awesome. Over the air updates are great. Continual improvement.
I've watched many FSD stress test videos and this is one of the best by far. So many people make videos on how it does in light or moderate traffic which we already know it does quite well, but hardly anybody puts it through full on big-city rush hour tests with complex routes. Please keep making videos like this!
Yeah, I haven't seen much of these videos so I figured I'll give it a go. Thanks for your support! Will continue to test on the streets when new updates come out.
@@JustAnotherTeslaGuy Great test. I’m preparing a video about Tesla's exciting future and wondering if I can use some scenes from this video, where I’ll give credit to you
This is the first time I've seen FSD in the rain. How does the camera stay clear of water?
@@gistfilmit yells LAND HO
@@FromFutureLab sure, not a problem! Thanks in advance for the shoutout ❤
This is the most stressful environment for FSD that I have ever seen. Great job, well done.
Thank you for your support! Will continue to make more videos on this as new updates roll out.
0:42: Right turn into immediate double lane-change.
1:36: Slow turn.
2:25: Counter-productive lane-change, as rain begins.
3:47: Fails to complete right turn. Intervention.
5:04: Hesitates at protected right turn.
5:18: Interesting protected left scenario. Car appropriately waits for cross-traffic.
6:57: Very smartly avoids blocking intersection.
7:15: Assertively advances through congested intersection.
10:08: Assertively takes gap to lane-change.
12:05: Slow to turn right.
13:33: FSD ignores on-coming traffic. Disengagement. Almost certainly would have been a crash.
14:28: Hesitates to enter congested intersection. Intervention.
15:25: Fails to follow lane through turn, creating awkward traffic interruption, and fails to lane-change, to make room for others. Intervention.
16:33: Fails to lane-change past truck. Intervention.
18:18: Slow to go after stop light.
20:00: Navigates construction.
_FSD ignores on-coming traffic. Disengagement. Almost certainly would have been a crash._
This is a common claim, and almost always wrong. NHTSA records show that there have been only 75 crashes of Teslas with FSD engaged in the 1.3 *billion* miles FSD has driven to date, or one every 17.3 *million* miles.
This claim is generally based on the assumption that the Tesla will continue doing exactly what it was doing before the disengagement. For example, if the Tesla gets in the wrong lane and drives into crossing traffic, the assumption is that it will continue into crossing traffic and likely hit someone. But in fact the Tesla will do everything within the limits of physics to avoid an accident, since this is obviously the prime directive. And of course the cars it's driving towards will themselves do everything possible to avoid being hit. And as shown by the accident statistics, the combination of the Tesla doing everything in its power to avoid hitting someone, and the other drivers doing everything in their power to avoid being hit, is almost always successful.
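For reference, the per-crash figure quoted above follows from simple division, taking the 75 crashes and 1.3 billion miles as given (they are the commenter's claims, not independently verified):

```python
# Back-of-the-envelope check of the rate quoted above.
# Inputs are the commenter's claimed figures, not verified data.
fsd_miles = 1.3e9      # claimed miles driven with FSD engaged
fsd_crashes = 75       # claimed crashes in NHTSA records

miles_per_crash = fsd_miles / fsd_crashes
print(f"{miles_per_crash / 1e6:.1f} million miles per crash")  # ~17.3
```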
@BigBen621
Water droplets create optical distortion and that is why it misjudged the distance. More sensors to confirm reality would help with errors arising from optical illusions.
@@yesno9592 _Water droplets create optical distortion and that is why it misjudged the distance,_
The front camera field of view is within the swept area of the windshield wiper, and it's clear from the photo that while there are drops of water outside the swept area, there are very few within the swept area to create the effect you claim.
_More sensors to confirm reality would help with errors arising from optical illusions._
Requiring an elaborate (and expensive) sensor suite is very likely to have the unintended consequence of reducing safety, rather than increasing it. This is because the more expensive a safety feature is, the fewer people will be able to afford it; so adding for example a Waymo-level sensor suite to Teslas is likely to result in more people opting to buy lower-cost vehicles with fewer safety features, foregoing the astonishing safety record of FSD, and increasing accidents, injuries and deaths.
@@BigBen621
Side "repeter" cameras don't have vipers. Ultrasonic sensors are not that unaffordable.
@@yesno9592 _that is why it misjudged the distance_
You're starting from a false premise. If you'll watch the video frame-by-frame, you'll see that the display shows the car exactly where it actually was, relative to the other cars around it. And again watching frame-by-frame, you can see that the Tesla recognized the threat and started turning away from it, rotating the wheel nearly a half turn before the driver took over. Since the Tesla can do the calculations at the speed of heat and can operate the controls much faster than the driver can, the fact that the driver avoided a sideswipe means that the Tesla would have also.
I'm not saying the driver should not have taken over; I would have done the same thing. But it's a virtual certainty that there would have been no accident, even if the driver hadn't.
This is the best situation to test FSD progress.
Thank you for the support! Will continue to make more videos like these.
Awesome city test! A lot of interesting scenarios on this test drive with hand picked hard paths. Amazing performance from FSD.
Thank you for the support! Will continue to make more videos like these
I mean... it almost crashed and that was not a long drive. It's definitely impressive but certainly a ways to go before safe enough for robotaxi.
Looks very similar to European traffic (maybe without trams), so when the law finally allows it, I think it will work great in Europe.
Bro you have ninja reflexes, damn 😂.
But in all due seriousness you were absolutely right to disengage
No, you did. It needed to disengage; you had the right of way, that car was coming into your lane. FSD would have stopped if he kept coming.
Need to stay alert with supervised FSD lol. glad I avoided a collision!
@@shdmd2118 still good to avoid a collision if it's possible.
@@JustAnotherTeslaGuy If you play the video at 0.25x it seems to correct by steering to the right before you disengage. I am also inclined to say it would have avoided the car. But it makes sense you do not want to crash your Tesla to find out.
Not about to mess around with it and find out lol
At 13:56. The other car was at fault. They had crossed the double line.
This is the most stressful environment for FSD that I have ever seen. Great job, well done. I hope for more videos like this on busy roads 👍
More to come on these videos!
Dang that was so chaotic
it was! Will continue to make more videos like these with new updates
Great test. Thanks for showing us crowded city rush hour.
Thank you for the support! Will continue to make more videos like these.
More videos showcasing FSD in busy traffic please. Observing its performance and failures in challenging conditions is far more valuable than watching flawless drives on simpler routes.
I will try my best to get these busy traffic videos with new updates!
Had a few critical interventions and a handful of non-critical interventions. Still a lot of work to be done around the route planning, context, and on the anticipation fronts but it's improving slowly
This is pretty amazing, FSD is pretty impressive as I would have struggled with that traffic.
Yeah, rush hour here in New Jersey is a lot haha
I’m very impressed by the progress. Especially in these rainy conditions.
Thanks for the support in watching the video. Will make more of them when next update comes out.
Pretty soon it might even learn not to try and drive through minivans 😉
It just gets better and better and better! Handling complex situations is almost perfect.
It is getting better. But still need to keep an eye on it while using it.
It did great overall. For the videos it would be nice to let the car figure things out on its own a bit more, I understand in regular use intervening though.
I think it did great too and want to always let the car figure it out. Just when the other car came at me and FSD didn't seem to move aside, I had to, to make sure it didn't crash.
I think you did the right thing to disengage. Better safe than sorry!
yeah, good to avoid a collision!
I don't see too many other channels give it gas when it's being slightly hesitant, but this was great content, and you were def a bit relaxed even after almost letting FSD crash lol
Haha thanks for the support. NJ drivers are always rushing. If there aren’t any cars around me, I’m okay with it sitting there to figure it out. But if there are cars, I push to make the car go so I don’t impact other drivers.
@3:47: Fails to complete right turn. Intervention. - As your action was how most of us would react, your intervention was absolutely correct in that sense. This FSD is still a beta version, and exactly these extreme stress situations are what the system needs in order to get smarter and be developed further. - Awesome city test - great job
I believe that you avoided a collision at 14 minutes!
I think so too!
Agree. It's still got some way to go in 'normal' city traffic. Love to see it on a Melbourne hook turn (pull to left lane to turn right), haha
@@StephenWard9 Go back and check the video. There may have been an accident but the other driver was at fault for crossing the double line.
Thanks for the video! It's a really nice set of testing scenarios. If you can, please don't press on the pedal next time. I wonder whether it would get stuck.
If I can avoid pressing on the pedal, I will. But because other drivers around me might not expect me to stop or expect me to continue flowing with traffic, I don't want to impact them. But if it is clear behind me, I'll let FSD do its thing.
I'm usually against handsy FSD testers, but you were 100% right to disengage there. It looked like it was going to touch the other car.
Nice save!
Yeah, need to supervise FSD when using it!
Thanks for sharing. I think the car on the other side should have waited instead of crossing the center line. I know it did that because of the parked car, but still, when crossing the center line here in Korea we have to wait until there is no car in the other lane.
People here are always rushing and just want to get home quicker.
Amazing test and performance. The silly comments about this being a poor level 2 system are wild. 😂 People must not realize how many humans, my wife included, would be really stressed driving in that environment. It needs more confidence and at times assertiveness, but it performed excellent overall.
It is scary at first, so many cars and having to account for them is a lot lol. I do wish FSD had a bit more assertiveness, especially in NJ.
If you're really stressed driving, perhaps it's good to drive more and adjust to the stress, or switch to public transportation.
Yep, this is what driving in any UK city is like any time between 8am and 6pm.
Halfway through this nail-biter and I am amazed!
It has come a long way!
I'm really happy to see FSD navigate the Newark traffic you took it through. It gives me more confidence that it could handle the nightmare of Manhattan traffic.
if it can make it in NY, it can make it anywhere lol
Subscribed! Please always pick the most difficult route for FSD
awesome, thank you for the support! Will continue to make more videos like these.
13:36 Hopefully the Tesla AI team watches this video and that bit and makes the necessary fix. This is going to be one of those things needed to make FSD effective in heavy urban driving like in New York City. FSD needs to be able to make these split-second decisions and adjustments that avoid collisions in heavy urban driving.
As there's no outside camera it seemed a bit scary, but I think FSD was already turning a bit more to the right when watching that moment in slow motion. It didn't seem to be that close, and the other car was coming into the Tesla's lane too. I bet it wouldn't have crashed, but it sure must have been a scary situation to be the passenger (or supervisor) onboard.
@@elloboksi Yeah, I think FSD will do what it needs to do to be safe and won't crash. But better to be safe than to be sorry with a collision!
Impressed with the progress of FSD. This system is amazing. Can't wait to see what it can do in a year or two.
I am not sure I could handle this kind of traffic well as a human driver. I can't wait to get this version installed on my HW3.
Yeah, rush hour here in New Jersey is a lot. Fingers crossed that you’ll get 12.5 soon!
Somehow hundreds of thousands of people handle it every day and without any close calls. Can you believe it?
@@mrlearner4153 _without any close calls_
Seriously? Are you truly not aware that human drivers kill 115 of our friends and loved ones each and every day, mostly due to deliberately and knowingly engaging in dangerous behavior? Are you saying an accident that kills someone is not a "close call"?
It was pretty good. I would have kept the right turn in because you're doing a video about traffic situations.
At around 10:30, if moving lanes all the way to the right to make the right turn is too challenging, does it seek an alternative route?
It likely won't change the route, and it will have its blinkers on to try to move over to the right side. But FSD will wait until it's clear before it changes lanes, while human drivers will likely move over once they see a tiny gap, so I don't want to impact other drivers when I don't need to.
Newark, NJ, now we're talking, perfect place to test
Traffic here gets crazy lol
Disengagement was advisable, but it would have been interesting to see if the car would have made an evasive move or taken the side scrape. "Interesting" is not important when it's your car, of course.
Haha yeah, I think FSD would have done what it needed to do to avoid a crash, but natural instincts kicked in.
At 13:40 you're absolutely correct to disengage. If you'd let it continue, you'd have ended up crashing, or both stopping at the last moment and one of you needing to back up.
From what I can see it would have been your fault too. FSD took it too wide completely unnecessarily.
Need to stay alert with supervised FSD, glad I avoided a collision!
The other car was crossing the double yellow lines. They should def be at fault.
@@stingerBTW In the UK, the rule is the car with the obstruction on its side of the road must give way. So in that case, the opposing traffic should have given way - in theory. In practice, of course, creeping along over the centre line to avoid parked cars is what happens. It seems FSD needs to learn to drive into the gap, not just into the lane.
@@skwdenyer Situational awareness is crucial for the software yes, but it would be a different story if FSD turned into the oncoming lane making the gap smaller. It legally made a turn, and the other driver was supposed to yield if they felt they couldn't squeeze the gap without causing a near collision.
"good apart from when I had to avoid the crash" 🤣 not ready to drive in Bali yet
Great city navigation with FSD. I drive every day through the GWB and would want to use FSD there, and this video builds my confidence in buying a Tesla in the future for FSD alone.
13:36 to 13:37. You avoided a collision! Blue SUV with left-turn signal on, intending to go around the stopped white car (with nobody in the driver's seat) with hazard lights on. The blue SUV is outside its lane because two of its tires have crossed the double yellow line. FSD, however, is visualizing the blue SUV as close to or on the double yellow line. The blue SUV moves forward to go around the white car at the same time FSD was turning right. FSD was accelerating from 3 to 14 mph and appears to be steering wide, biased towards the double yellow line, instead of hugging the right side of the road or taking an evasive maneuver and making a hard right (which you did) to avoid a head-on collision. Does FSD have no collision-avoidance system? Or can humans like you anticipate and react faster than the current FSD version? The blue SUV should not have gone around the white car, because it was not safe to do so in a no-pass zone (double solid yellow lines) at that time while you were turning right. The blue SUV should have waited for you to clear the lane it was intending to go into before passing the white car. Thanks for supervising FSD, you saved the day. I hope you report this intervention. It will help the Tesla team improve FSD.
When did 12.5.1 release to HW4 for the general public? Or are you just a tester?
12.5.1 came out to the public about 3 weeks ago. I am not part of the testers from Tesla
Nice to see Tesla still has issues with fit and finish. That ambient lighting on the left door🥵 but super interesting video
Yup, I’m setting up time with Tesla to have them realign it.
awesome stress test!
Thank you for your support!
wow. the traffic is insane
Yeah, Friday afternoon during rush hour when everyone is trying to get home, it is a lot lol. Thanks for watching and supporting!
Raise your hand if you would test FSD with a blindfold on... or while sleeping???
Eventually, when FSD is out of supervision, people can sleep while the car drives lol. Depending on when that actually comes out.
13:36 You should check your car's camera to see if everything is working correctly.
13:18 to 13:34. FSD failed to turn on its left-turn signal when it decided to move/merge into the left lane to go around the stopped van with hazard lights on that was obstructing traffic. The right-turn signal was on the whole time. I hope you report this next time it happens. Thanks for doing your part in making FSD better by testing it in busy and challenging areas.
I have seen it do that a few times, where the opposite blinkers are on while it is making the opposite turn. Hope that can be fixed! I don't have the "snapshot" button, so I'm unable to report anything without disengaging.
What a great test, also how funny would it be if they gave little friendly honks while in traffic... Too assertive? haha
Thanks, glad you liked it. I would love to see FSD give a honk to other drivers lol.
9:15 I would have liked to see how it would try to change lanes to the right. Why did you delete that route? Tesla will have to be able to handle congested lane changes.
Do you think pushing the pedal counts as an intervention ? I think it probably does - but it would be interesting to compile a list of moments when the pedal ought to have been pushed but wasn’t - to see what happened..
For example, at 7:52 i think that really was an intervention - the position the car stayed in was likely irritating others a bit.
For the most part (maybe 80%-90%), I think pushing on the pedal is a user preference, because it generally does the right thing, and if it is not moving or is moving slower, it's likely because it is not safe to move or go faster. So to me it's more just nudging it in the right direction.
That being said, here in NJ there are a lot of impatient drivers, so I have to nudge it a bit more to not impact the other drivers, so it's a bit of a gray area there lol
Do you think that robotaxis will be possible in the near future (as in 2024 or 2025)? I was thinking perhaps Tesla could have a control centre where one person monitors say 10 or 20 taxis, so if one gets stuck, they can take over remotely (I think this is done by other companies already). With the issues you had in this drive, it seems robotaxis are a long way off.
We already have Waymo with no drivers, although it only works in certain locations lol. There are still things FSD needs to work on, like cones in the road, construction blockage, buses blocking the way, etc. I still think it will be a while, but I don't mind being wrong!
@@JustAnotherTeslaGuy thanks so much for your reply. You are in a better position to know than me. Personally, I think it will be a long time before it reaches us here in Australia.
13:37 Those cars are all driving over the yellow lines. Yes those cars shouldn't be parked there.
14 mins, the other car would've certainly been at fault for crossing their double yellows unsafely. But I think that if you hadn't taken over, it's likely there would've been at least a small ding before both cars were able to fully stop. I'm actually surprised none of the safety features kicked in; do you have collision avoidance/warnings on?
Yup, I have the warning on in the "early" setting too
NJ is in the top 5 most stressful places to drive in the US
It is, especially with all the cars and how fast everyone is going!
Tell you, man, I have massive fears of traffic and rush hours, hence why I never drive and just stay in the countryside and live a nice peaceful life. Traffic is very stressful, especially if you have drivers like that who nearly hit you.
It saved your life 5:04,5:05 I'm sold on FSD
and it only gets better from here
Good tests. Newark isn't part of NYC's right on red default prohibition?
13:36 no you were defo not wrong to disengage, that looked like a certain collision. FSD wasn't figuring on the opposing car coming over during that turn in that particular situation. Opposing car would only really have been wrong if you couldn't make room although he proceeded with blind assumption at speed. But it's NY, it's different rules :)
Newark is part of NJ, so I can turn on red =) Glad I avoided a collision by disengaging.
The other car would be wrong at 14:24, they have to yield
Great test environments, this is the kind of driving that people hate
I would have disengaged too. Not worth the risk or hassle better be safe than sorry 👍🏻
100% don’t wanna have to deal with insurance to get the car fixed if something happened.
As good as FSD is, this just shows that it's still not ready for robotaxi.
Not bad, a lot of improvements. Most of the time when it hesitated, it's when it thought people were crossing or cars were changing lanes.
Thank you for the good video.
Great test drive, very challenging situations for FSD, though in general FSD was awesome. Still not ready for robotaxi yet, but I can see progress all the time.
Yup, FSD needs to do well in these challenging situations
Take it to NYC. That will be the ultimate test.
Is 4pm rush hour? I'd like to see someone do 5pm in a city like Melbourne (my home town in Australia). Thousands of people crossing roads constantly, bikes, trams and e-scooters galore. Bumper to bumper traffic. This video has lots of cars but looks a bit deserted in terms of people, bikes etc. Bikes, scooters and pedestrians are the most dynamic and unpredictable elements for FSD to handle.
Yeah, NJ has quite a lot of cars, not many people walking or that many bikers. NYC is where the traffic jam happens with cars, people crossing, and bikes. In NYC there are so many people crossing (even on red) that usually only 1 car is able to turn in a full light cycle. FSD def needs to be able to handle that.
awesome!!
Thank you for your support!
Oh snap, New Jersey! I have a hard time using FSD because everyone's speeding and the car goes too slow.
It is doing well for me, I use it daily. Give it a try!
@@JustAnotherTeslaGuy stuck on hardware 3, been using fsd for a while but the auto speed is horrendous on 3.6
Fingers crossed you get yours soon! Turn off the auto speed, so that you can manually control the speed.
Finally, a real test. The traffic situations in China could be 3x or even 10x more complicated than this; I don't really think any FSD could work there.
If you look at the path planner, it was going to do what you were doing, but maybe not as fast. It certainly wouldn't have hit that car; that car might have hit you, though!
7:52 that somehow invalidates the whole test 😢
I cant believe it 07:21, 10:11
That was a little scary.
Yeah a little lol. Almost crashed into another car, need to keep a close eye on it.
You intervened about 100 times with the pedal. This is gonna take another 10 years.
I'd say about 6-8 years until cars are driving themselves around.
It looks like FSD hesitates sometimes! QQ Maybe this should be fixed by the Tesla engineers.
FSD cannot avoid potholes. That alone is a deal breaker for me. I barely used FSD when I was given a free trial for a month.
Not good. Absolutely not good. We are nowhere near what the mainstream Tesla TH-camrs are hyping about.
The road conditions in this video are really too simple.
Tesla should be testing and training in Taiwan! Road conditions in Taiwan are far more complex than in Europe and America; if it can handle Taiwan, it can handle the whole world!
I guess I will come back in 3 weeks for the next updates. The comments look negative, so we're not there yet.
The near head-on was actually nothing wrong by FSD; it was just the other driver being reckless. And FSD wasn't given a chance to react, though, which was rightfully so...
Despite FSD's really impressive performance, this video unambiguously shows that Tesla FSD is not, and will likely never be, ready for even L3 automation. I think it's actually more stressful to drive with FSD under full human supervision/second-guessing than to drive without it. I predict that at some point in the future, when Elon figures out a face-saving way to back out of the vision-only FSD approach, FSD could move on to genuine L3/L4 automation.
For me at least, FSD does a pretty decent job on the highways, where it is just flowing traffic and staying in lane. It is the local streets where it gets much harder, because there could be a lot of things happening at once.
th-cam.com/video/TLUdxdS8bcM/w-d-xo.html Urgent situation, need to engage FSD, since the other car wants to go straight without checking!
You're too aggressive in your expectations for stop-sign pausing, IMO. Otherwise it's pretty amazing.
Can you stop interrupting FSD pls? You are supposed to test how it handles tough situations!
He's supposed to interrupt, it'd crash otherwise.
@@TheSpartan3669 it was just close, no collision
@beastonabun4618 Yeah, there was no collision because he intervened
@@beastonabun4618 I'd love to see you drive FSD in a busy city and just let it do its thing with no interventions, all on TH-cam of course. Naive fools like you are good entertainment.
These videos are so stressful
Haha yeah. trying to push FSD to its limit!
If you're wondering how close robotaxis are: he prevented a crash in this 20-minute video, so if a robotaxi were 1000x better than this example, a robotaxi network of 1000 vehicles driving 6 hours a day would have 18 crashes a day.
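For reference, the extrapolation in this comment works out as follows, taking the one-prevented-crash-per-20-minute-drive rate and the hypothetical "1000x better" factor at face value:

```python
# Rough reconstruction of the extrapolation above.
# All inputs are the commenter's assumptions, not measured data.
events_per_minute = 1 / 20      # one prevented crash in a 20-minute drive
improvement = 1000              # hypothetical "1000x better" robotaxi
fleet_size = 1000               # vehicles in the imagined network
hours_per_day = 6

fleet_minutes_per_day = fleet_size * hours_per_day * 60
crashes_per_day = fleet_minutes_per_day * events_per_minute / improvement
print(crashes_per_day)          # 18.0 crashes per day
```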
So let me get this right: you are going to give a prediction of crash frequency based on a single drive, in which the driver is trying to throw the car as big a challenge as he can come up with. When Tesla FSD Supervised reaches the point where Elon is confident that it is very much safer than human drivers, the government won't authorize it based on Elon's assurances alone, but rather on the driving safety data. I suspect it isn't a coincidence that your analysis is actually this bad; I think it's more likely that you are just one of the Tesla haters who try to spin any video in the worst possible light for Tesla. This might work with gullible, uninformed readers, but I'll bet most of the readers here are much more savvy and informed than you might give them credit for.
@@jamesmcneal1821 He made a good point. Elon talks of soon having seven million robotaxis out there. A fleet of one million robocars would put up about 20 million miles per day, about 40 human lifetimes of driving per day! It will have to be super-duper superior to an average human driver, as in able to go hundreds of millions of miles safely between bad crashes, to have a chance of keeping regulatory approval.
It's obvious that in busy urban driving, FSD can't average one safe hour currently. The guy was frequently pressing the accelerator, and he clearly prevented a crash that any normal human would have seen coming and prevented.
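For reference, the fleet-mileage figures in this comment imply roughly the following; the 20 miles per car per day and the ~500,000-mile "lifetime of driving" are back-solved assumptions, not numbers stated in the comment:

```python
# Implied arithmetic behind the fleet-mileage claim above.
# Per-car mileage and lifetime-driving figures are assumptions.
fleet = 1_000_000                  # robocars
miles_per_car_per_day = 20         # assumed average utilization
lifetime_driving_miles = 500_000   # assumed miles a human drives in a lifetime

fleet_miles_per_day = fleet * miles_per_car_per_day          # 20,000,000
lifetimes_per_day = fleet_miles_per_day / lifetime_driving_miles
print(fleet_miles_per_day, lifetimes_per_day)                # 20000000 40.0
```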
@@charliedoyle7824 The number of robotaxis on the road doesn’t matter, nor does it matter how many million miles they drive. What matters is how safe those miles are. If they are 10x safer than human drivers I would not want to have all of those extra deaths or crashes on my conscience and I don’t think anyone else should, including those who must make that regulatory decision. It doesn’t matter how this particular drive went, so far as robotaxi goes, how many interventions, etc. why? Because Tesla isn’t going to seek approval until it is many times safer than human driving. During this video, supervised FSD was not yet up to the robotaxi level, but that’s not a problem because there are no Tesla robotaxis running now. Every ride currently has a human driver supervising. That won’t be the case when Tesla’s robotaxi is implemented. You are making the mistake of comparing the current FSD build as seen in this video, to the future build which will be so much safer than human drivers. This current build doesn’t even have reverse available, doesn’t know how to respond to emergency vehicles yet, etc. it’s not ready and Tesla isn’t claiming it is. Critique Tesla’s FSD build when it seeks regulatory approval if at that time you see problems with it. I will tell you a few problems it won’t have: it will never be driving impaired because it is drunk, or sleepy or texting while driving or highly upset because it is yelling at other passengers or just broke up with its significant other or just got cut off in traffic and became enraged at another driver flipping them off or because it needs to prove something. If either you or I become elderly, we may or may not become unsafe drivers. Everyone is an individual. I could go on and on. The point is humans in general are not safe drivers. All you have to do is look at the statistics. I think what we both have in common is that we both want transportation to be as safe as possible.
@@jamesmcneal1821 the reason to critique current FSD performance is to point out how far it is from human levels of safety and ability. FSD can maybe go twenty minutes in busy city driving with no human help, if lucky, where a normal human driver can go forever with no help and decades with no bad crashes. You are fantasizing about FSD being 10x better than a human. They have a long, long way to go just to match an average human, especially in L5 driving.
You are also fantasizing about "10x safer than human drivers" being a figure that regulators will care about; they won't operate on an arbitrary figure like that.
AV regulators are already clearly operating the way the FAA does with aircraft: every crash and other driving violation is investigated, and any performance deemed dangerous will result in recalls and fix recommendations. If the crashes are bad enough, states will revoke licenses. One bad crash will be enough to threaten a shutdown of a robocar program unless the company can convince regulators it is addressing the problem. A series of similar bad crashes will definitely shut the program down, likely nationally, just as the Boeing 737 Max was grounded after its repeated crashes.
With one million robocars driven by FSD, it will cause a bad crash every few days somewhere if they are only 10x safer than a human at causing bad crashes. If Tesla ignores recall orders and continues driving, the lawsuits will write themselves and the press will destroy them, along with many states pulling licenses. A robotaxi operation won't work with such a bad reputation, even in a state with no interest in ADS regulation.
With Waymo and other AVs likely able to go hundreds of millions of miles between bad crashes, and fully cooperating with regulators when they do have a crash, Tesla won't be able to operate like it's the wild west with nonsense like: "we're 10x better than a dumb human, you have a moral obligation to let us continue".
@@charliedoyle7824 _With one million robocars driven by FSD, it will cause a bad crash every few days somewhere if they are only 10x safer than a human at causing bad crashes._
I hardly know where to start; but I'll start with this. If robotaxis are 10x safer than human drivers, and FSD is already provably *more than* 10x safer than human drivers, then for every "bad crash" caused by FSD, nine crashes will be avoided; that's just the math of "10x safer". And I'll take those odds, all day long. Yeah, it'll suck if you or one of your loved ones is the one "bad crash" out of ten; but it's nine times more likely that they'll be among the nine avoided crashes. Of course, you won't know that at the time; but it'll be clear in the statistics.
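To make the "10x safer" arithmetic explicit, here is a minimal sketch. The only inputs are the "10x" claim itself and an arbitrary fixed amount of driving; no real-world crash data is used:

```python
# Sketch of the "10x safer" logic: over the same miles, a fleet that is
# 10x safer has 1 crash where human drivers would have had 10,
# so 9 out of every 10 crashes are avoided.
human_crashes = 10      # crashes human drivers would cause over some fixed mileage
safety_multiple = 10    # the "10x safer" claim

fsd_crashes = human_crashes / safety_multiple   # 1
avoided = human_crashes - fsd_crashes           # 9

print(f"Crashes avoided per FSD crash: {avoided / fsd_crashes:.0f}")  # 9
```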
Not wrong to disengage. You would have crashed. The other driver was at fault.
Glad I disengaged here!
All of your interventions should be reported
How?
@@BigBen621 It's a standard Tesla software function, like print screen.
@@hevenko Not so. Disengagements can be reported, but only the OG can report interventions without disengaging.
@@hevenko Only for the "OG" (Original Group of Beta testers). Everyone else has to disengage before reporting the reason for the disengagement or intervention.
What garbage software. Camera only will never work.
Who exactly will INSURE a self-driving car? I suspect that my insurance company will put a rider on my Tesla policy stating that they are not responsible for damages when FSD (not-supervised) is engaged. Will Tesla insure? Who will show up when a Tesla runs over a pedestrian or cyclist that jumps out from between parked cars? Who will be 1st to put their child on the front seat, kiss them on the forehead, and tell their Tesla to take that child to school, then stop at McD for a burger before returning home? This ain't happening any time soon.
_I suspect that my insurance company will put a rider on my Tesla policy stating that they are not responsible for damages when FSD (not-supervised) is engaged._
FSD has proven to have an astonishing safety record. According to NHTSA records--not Tesla records--Teslas with FSD engaged have been involved in 75 crashes in 1.3 *billion* miles, an average of one every 17.3 million miles. Meanwhile, human drivers have a crash every 539,000 miles (those two figures are where the >30x ratio below comes from; see the sketch after this reply). Which would you rather insure?
_Will Tesla insure?_
Yes, in a growing number of states. Here's what @thejefflutz, who actually has Tesla C&C insurance, has to say:
"@Tesla insurance is paying for most or all of the monthly cost of FSD in my case. I started at $163/mo for a MX plaid driving score of 90. FSD Supervised 12.5.x has elevated my driving score to 100 over just a few months and lowered my premiums by about $70..."
_Who will show up when a Tesla runs over a pedestrian or cyclist that jumps out from between parked cars?_
First responders--who did you think would? Some accidents are unavoidable; but when that happens, it's >30 times more likely it'll be a human driver and not FSD at the wheel.
_Who will be 1st to put their child on the front seat, kiss them on the forehead, and tell their Tesla to take that child to school, then stop at McD for a burger before returning home?_
Once FSD achieves Level 4, anyone who wants their kids to be >30 times as safe as they would be with a human driver. I agree it's not happening soon; but it *is* happening.
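A quick check of the ">30 times" figure used above, as a sketch. The 75-crash / 1.3-billion-mile and 539,000-mile numbers are the ones cited in this reply, not independently verified here:

```python
# Ratio implied by the figures quoted in the reply above
# (75 crashes in 1.3 billion FSD miles vs. one human crash per 539,000 miles).
fsd_miles = 1_300_000_000
fsd_crashes = 75
human_miles_per_crash = 539_000

fsd_miles_per_crash = fsd_miles / fsd_crashes            # ~17.3 million
ratio = fsd_miles_per_crash / human_miles_per_crash      # ~32

print(f"FSD miles per crash: {fsd_miles_per_crash / 1e6:.1f} million")
print(f"Crashes per mile vs. the human figure: {ratio:.0f}x fewer")
```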
I wouldn’t disengage; it usually corrects its mistakes or handles tricky situations very quickly.
WTF u talking about? The Tesla's wheel turned towards the other car. It would have totally hit the other car.
P.S. From what I saw of this video, FSD is definitely not ready for this type of scenario.
Watch the video frame by frame, and you'll see that the Tesla had turned its wheel nearly 1/2 turn *away* from the car before he intervened.
I have had FSD on 2 cars since about 2021 (I think). It literally has not improved a bit, and I bet FSD will never get to L3, let alone L4. You cannot drive a mile without a hard, crash-avoiding intervention. Why the fanboys think vision-only autonomy will work like the idiot Musk claims is beyond me. Tesla is literally dead last today in its self-driving tech, and that is coming from someone who wasted $7k on this pipe dream (meaning I want it to work but know it can't).
_It literally has not improved a bit_ That's one of the most absurd claims I've ever heard, and pretty much destroys your credibility for commenting on FSD.
Why is quality of FSD so much worse on the smaller channels? Makes you wonder..... 👀. You not getting paid by Muskrat?
I am 100% not being paid on my videos lol
Maybe some of the smaller channels are focusing too much on problems, giving viewers a skewed view. I have been using FSD for 4 months, V12.3.4 to V12.5.1.1 now. It performs wonderfully most of the time and I can tell when it is likely to have a problem well enough in advance to avoid the problem behavior and report it for fleet improvement.
It once drove during a downpour of rain that flooded highways. No unexpected behavior.
@@genelane2243 FSD would have crashed into that car at 13:30. That's not staged.
@@seweso It was the other car that crossed the line and ran into the Tesla.
@@looping1990 Yet the HUMAN driver prevented the accident.
"But he was driving on the wrong side" is not a valid excuse to ram someone.
This is why Tesla FSD will never work
Care to elaborate?
Yep
@@BigBen621
Not sure if this was stated, but did you have it in the chill, average, or assertive setting? Thank you for your time and service. Jesus Christ is alive. Please get a Bible and read it. If you do what He commands, He will give you awesome gifts and powers.
12.5 is ass. Bring back 12.3
When doing this type of test, do NOT EVER press the pedals. Have patience, unless it is absolutely necessary. You always need to pretend as if no one is behind the wheel.
Without 100 percent human supervision, FSD cannot handle much of anything. It is a simple Level 2 assistant that makes frequent mistakes.
Looks more like a horse to me. It can handle many things, but it can't read and can get "scared" or "confused" sometimes, so you need to grab the reins. At least it doesn't bolt.
I'd beg to differ. I use FSD v12.5 every day on every drive, just not in a big city. Zero disengagements on just about every drive. It works at night in the rain and is super helpful when I'm driving in a new area. It rarely makes mistakes anymore; speed control is the biggest problem now. A few other improvements are needed, but at the recent rate of change it should be pretty awesome by next year. Over-the-air updates are great. Continual improvement.
As an engineer, I am amazed at how well it did in some of the worst conditions imaginable. Imagine the abilities in another year or two!
@@VasuJaganath A system can work 100% in its domain and still be pretty useless in our messy world, if the domain is too narrow.
Not anymore. FSD v12.5 can handle almost anything