Tesla FSD V12.3 almost causes me an accident…
- Published Mar 16, 2024
- The other day I posted a video of an #FSD V12.3 hard brake for no apparent reason in the middle of the road and some idiots in the comment section tried to justify it because a traffic cone was on the other side of a 2 lane road... The traffic cone is not there today. Had a car been close behind and not paying attention I would have been hit.
- Science & Technology
Not "what the fuck"... it's "what the musk"... the same...
Anyone who would trust anything Musk says must be pretty gullible, especially about FSD. Look at ALL the excuses people make for a system that is NOT going to work any time soon. FSD is another of Musk's "visions" that never amount to anything, or certainly not anything like what was pitched at the time of each project's announcement. Like The Boring Company's wormhole tunnels, SpaceX, Hyperloop, Solar City, Cybertruck, Roadster 2, and Tesla Semi.
All have either failed, are way below the specs promised, or are "in development" for year after year. Musk is selling bs, simple as that.
Looks like it mistook the crosswalk for a speed bump.
Always remember, if something goes wrong you are 100% liable with your "autonomous" car...
That's what's happening when you use vision only.
Same things humans use
@@MrTejibaby Humans use a lot more than just vision, only clowns think humans only use their eyes when they drive a car.
@@MrTejibaby But the car can use much better technologies to see. Like lidar.
@@jaromirandel543 "Lidar is lame... very low resolution and it doesn't detect the true nature of objects... a plastic bag is essentially the same as a brick. Lidar is also disgustingly expensive... its price won't significantly drop in the future..."
@@Kurat21 you can combine technologies.
This seems like a different kind of phantom braking. Thank you for documenting this. I hope you are able to log a report to Tesla. Great work identifying this issue so they can train it out of the model.
So there is a highway we take our Tesla on with FSD, and there is an exact pinpoint spot where FSD will come to a stop on the highway. We figured out it is because of Tesla's mapping: it reads that the highway stops and then continues, instead of just continuing. That confuses the Tesla and it stops. Not sure if the same thing is wrong here, but keep it in mind.
Three possible reasons, since you have noticed it several times; try it in the dark. 1. Spotting a speed bump with vision only is really hard, so two lines separated like this might be interpreted as the rise contour and the crest contour. 2. Since there is a street to the left, the line might be interpreted as a stop line; the two objects combined give the semantic meaning of a stop at an intersection. 3. Tesla does a poor job of judging how far away things are, so as in number 2, the stop sign 100 meters ahead might already be seen, and when a stop sign is detected and the car reaches a line, it combines them. It often confuses stop lights at the next intersection with the current one; it can see a stop sign at a great distance. Maybe it just sees the stop sign, and then the line is suddenly given semantic meaning. The problem for FSD is not maneuvering around objects and obeying semantic information; it is seeing things and understanding what it is seeing. That's why vision-only is doomed to be an extremely expensive assistance system: really good most of the time, but now and then as dangerous as a crazy Russian dictator.
12.3 still has some issues apparently.
I don't think just braking would cause an accident. It's just being safer.
My guess - the SUV on the left in the driveway shows up on the display and is partially obscured by the fence, this may have caused it to perceive it as moving and a possible collision hazard.
Yeah that is a BIG problem. So easy to get rear ended. Are you watching Tesla????
FSD has been in beta for almost 4 years. A LONG list of issues.
Doesn't matter; even with more FSD updates, you still have to pay way more attention.
The road on the left possibly caused the AI to frazzle; it didn't know what to make of the situation, so it slows down as a precaution.
Yeah, it considers the two horizontal lines on the road as an intersection stop. They are misleading for AI.
Great video!
Thanks for demonstrating that Phantom Braking is still present in version 12.3 FSD.
Not sure it's the same type of phantom braking; it was almost like it thought it saw a stop sign.
@@matty7834 But the result is the same: the simple minded FSD hit the brakes hard, and for no reason.
FSD has been in beta testing for almost 4 years, and it's still full of problems.
Is there unexplained braking? Yes. Is the phantom braking the same as it was 4 years ago? No. If you have ever had personal experience with FSD over the years, you'll find that phantom braking has largely improved. It's very rare now. It's on a positive improvement trend.
Best way to find out the truth is to test it for yourself. "Experiencing" through selective videos isn't representative of reality.
@@thesadboxman People have extensive knowledge of many subjects that they have not had hands-on experience with.
Many v12.3 videos on TH-cam clearly show human intervention situations. Those data points are completely valid, even if I personally was not in the car.
@@DerekDavis213 there's nothing like experiencing it yourself. Not saying you're unknowledgeable but there are things you cannot understand without first-hand experience.
You can watch 1000 videos on how to drive a stick shift car which will be helpful to learn from, but it won't substitute hands on experience that reveal nuances that cannot easily be understood from a video
It wouldn't brake that hard if you didn't let it. It's called supervised. You will cause the accident, not the car. If you notice it's braking and not supposed to, disengage or press the accelerator. You are still in control. It's not fully autonomous. I understand it's not supposed to do what it's doing, but you have to take over. Plain and simple. Don't use it if you can't overcome these issues.
My money is on the U-turn sign.
I bet if you flipped that down, it'd behave. It's probably seeing the "New Homes" U-turn sign as an actual road sign that it's trying to adhere to, then realizes it doesn't need to.
If you've got a buddy that's willing to flip it down for a minute, to test, then put it back up after, I think that sign is the problem.
It's due to the crosswalk painted on the road; by law you're supposed to slow down when crossing them. FSD only sees them when it gets really close, causing the abrupt stop. It shows on the screen for a second.
It's only necessary to slow down and stop if there are pedestrians present. This does not appear to be the case.
It's not phantom braking; it thinks those lines across the road are an intersection. Basically, it thinks it just ran a red light/stop sign, and I can see why it would think that. Make sure you send a voice note if you can.
Phantom braking basically means "it braked for a reason that is not provably explainable". Despite the pessimistic narrative of this video, it is still an accurate case of phantom braking until a clear and provable explanation can be provided.
Let me enlighten you. The Tesla computer gets confused when a lane becomes two lanes wide. It's designed for roads where the lanes are clearly marked and symmetrical. If your lane ends up being two lanes wide between the lines, it's going to brake because it can't see the lines that far apart!! Remember, always keep your hands on the wheel. If a lane starts to get really wide, TAKE OVER!!!!
I would guess people are looking for ways to reproduce in other situations, not to justify behavior. Although I've already found a few comments in this video appearing to justify it so :/
In hope of reproducing: the double white line crosswalk looks a little strange to me, is that normal in your area? are there others you can test on, maybe with a split intersection as well?
Thanks!
That's a good call-out. Honestly, I have not seen this happen in any other video. Based on what I am seeing, I can't find the reason, but if it is happening at the exact same place every time, the system is interpreting something. No clue as to what, but it needs to be investigated for sure. Hopefully you did the snapshot and voice recording for Tesla so they can review.
If something happens, Tesla will say it's your fault.
It's braking for a pedestrian crossing... there are yellow signs warning about the crossing... those are odd... why are they yellow? Yellow is usually a warning of some kind, perhaps it's thinking there is a speed bump?
The signs are odd.
Yellow sign, and a yard sign at the base. From a distance it may have read that as a pedestrian crossing with a person or object at the base. Then once it got close it determined it was a false positive. Understandable.
On 11.4.9 there's a spot a few miles from my house where it always stops for no apparent reason. It doesn't slam on the brakes though so I have time to hit the accelerator to cancel it.
You're making excuses for subpar performance in a scenario. Here's the question you should be asking: Are the signs confusing even for human drivers? Would human drivers also make the same mistake? If not, then FSD is subpar compared to human drivers in this scenario.
However, I do believe this will be easily fixable and if this is the worst problem FSD v12 has, then it's much better than v11.
Probably should log it and move on. FSD isn't perfect and won't be for a while. At the same time, human drivers do dangerous maneuvers and braking all the time.
Human drivers don't typically brake in the middle of a clear road
The only thing I can think of is maybe the camera is getting confused by something it thinks it sees or maybe bad map data. Either way it’s definitely something that shouldn’t be happening this far into FSD’s development
the crosswalk on the road might have been mistaken for a stop line, or maybe it thought someone was in the crosswalk or was about to cross
crosswalk
Just guessing here.. the car could have detected the stop sign ahead way too early; hopefully by now with FSD (Supervised) v12.3.4 you won’t have the problem anymore.
Not going to cause an accident unless someone is on their phone behind you. Not an issue with FSD just a minor glitch probably with the road paint
Yeah, take over and do the voice description so it goes in as a take over. The training and improvement this year will be exponential and much faster than before now that they have more compute coming online. Soon we will be laughing at how it was at the start....
I mean that's why you're supervising it as part of a beta program, right? You're teaching a neural net teenager how to drive. Hope you sent your findings back to Tesla.
First thing. Yes, it’s still in beta. Obviously it’s not perfect. I’m not sure why people are surprised that it makes some mistakes. Maybe they don’t know what beta software is? That’s why they want you to give feedback when it does something wrong. Just like any beta software.
From a defensive driving standpoint, it seems like the car was going a little fast, and I personally brake for crosswalks too (as a human) when they come up on me unnoticed like that. People who rear-end you should leave more space for reaction time, and it would be their fault. I am watching FSD videos and getting the idea it's a pretty cautious "driver".
You realize that the car brakes for two reasons. First, although there is a Ped xing sign, there is no crosswalk until later and the car brakes as soon as it sees the crosswalk. Second, you said the car was going 35 and at the same time as the sudden appearance of the crosswalk, a 30 mph sign is seen. This is not rocket science. You must not have very much experience with FSD Beta.
No matter the reasons, the behavior is incorrect. Calling out wrong behavior is a good thing so other testers can try to reproduce it. More data on failures is better.
It’s not ok to slam on the brakes when nobody is crossing, regardless of “experience” - period. I can see where the pedestrian crossing might be the cause but what does the speed limit sign have anything to do with it - car slowed down to 15, not 30. So if I went from 55 to 30 I should expect it to come to a full stop?!
That braking would ONLY cause an accident if the vehicle following you was following too closely.
Just sell it
And Where The F is the “almost causes you an accident”? If you can’t handle being a Beta tester, DONT BE ONE.
The real genius of musk is that he got you to buy an overpriced car... and instead of tesla testing it, he gets you to test it for FREE!!!