Is Tesla Autopilot TOO DANGEROUS?!
- Published Apr 27, 2024
- If you believe a recent Wired article (or Dan O'Dowd) about the recent NHTSA Tesla investigation, you might think that Tesla Autopilot (and potentially FSD) is too dangerous to be allowed on the roads! But is that really what NHTSA found? The results and the safety statistics paint an entirely different picture.
**If you are looking to purchase a new Tesla Car, Solar roof, Solar tiles or PowerWall, just click this link to get up to $500 off! www.tesla.com/referral/john11286. Thank you!
Join this channel to get access to perks:
/ @drknowitallknows
**To become part of our Patreon team, help support the channel, and get awesome perks, check out our Patreon site here: / drknowitallknows . Thanks for your support!
Get The Elon Musk Mission (I've got two chapters in it) here:
Paperback: amzn.to/3TQXV9g
Kindle: amzn.to/3U7f7Hr!
**Want some awesome Dr. Know-it-all merch, including the YEAR OF EMBODIED AI Shirt? Check out our awesome Merch store: drknowitall.itemorder.com/sale
For a limited time, use the code "Knows2021" to get 20% off your entire order!
**Check out Artimatic: www.artimatic.io
**You can help support this channel with one click! We have an Amazon Affiliate link in several countries. If you click the link for your country, anything you buy from Amazon in the next several hours gives us a small commission, and costs you nothing. Thank you!
* USA: amzn.to/39n5mPH
* Germany: amzn.to/2XbdxJi
* United Kingdom: amzn.to/3hGlzTR
* France: amzn.to/2KRAwXh
* Spain: amzn.to/3hJYYFV
**What do we use to shoot our videos?
-Sony alpha a7 III: amzn.to/3czV2XJ
--and lens: amzn.to/3aujOqE
-Feelworld portable field monitor: amzn.to/38yf2ah
-Neewer compact desk tripod: amzn.to/3l8yrUk
-Glidegear teleprompter: amzn.to/3rJeFkP
-Neewer dimmable LED lights: amzn.to/3qAg3oF
-Rode Wireless Go II Lavalier microphones: amzn.to/3eC9jUZ
-Rode NT USB+ Studio Microphone: amzn.to/3U65Q3w
-Focusrite Scarlett 2i2 audio interface: amzn.to/3l8vqDu
-Studio soundproofing tiles: amzn.to/3rFUtQU
-Sony MDR-7506 Professional Headphones: amzn.to/2OoDdBd
-Apple M1 Max Studio: amzn.to/3GfxPYY
-Apple M1 MacBook Pro: amzn.to/3wPYV1D
-Docking Station for MacBook: amzn.to/3yIhc1S
-Philips Brilliance 4K Docking Monitor: amzn.to/3xwSKAb
-Sabrent 8TB SSD drive: amzn.to/3rhSxQM
-DJI Mavic Mini Drone: amzn.to/2OnHCEw
-GoPro Hero 9 Black action camera: amzn.to/3vgVMrH
-GoPro Max 360 camera: amzn.to/3nORGYk
-Tesla phone mount: amzn.to/3U92fl9
-Suction car mount for camera: amzn.to/3tcUfRK
-Extender Rod for car mount camera: amzn.to/3wHQXsw
**Here are a few products we've found really fun and/or useful:
-NeoCharge Dryer/EV charger splitter: amzn.to/39UcKWx
-Lift pucks for your Tesla: amzn.to/3vJF3iB
-Emergency tire fill and repair kit: amzn.to/3vMkL8d
-CO2 Monitor: amzn.to/3PsQRh2
-Camping mattress for your Tesla model S/3/X/Y: amzn.to/3m7ffef
**Music by Zenlee. Check out his amazing music on instagram -@zenlee_music
or YouTube - / @zenlee_music
Tesla Stock: TSLA
**EVANNEX
Check out the Evannex web site: evannex.com/
If you use my discount code, KnowsEVs, you get $10 off any order over $100!
**For business inquiries, please email me here: DrKnowItAllKnows@gmail.com
Twitter: / drknowitall16
Also on Twitter: @Tesla_UnPR: / tesla_un
Instagram: @drknowitallknows
**Want some outdoorsy videos? Check out Whole Nuts and Donuts: / @wholenutsanddonuts5741
Sources:
www.wired.com/story/tesla-aut...
/ 1784228757760258533
/ 1707410938281009285
www.tesla.com/VehicleSafetyRe...
I've driven some other (non-Tesla) vehicles that have Lane Keep Assist and Adaptive Cruise Control. They basically do the same thing as Tesla's Autopilot, maybe not quite as well, but I don't recall them having a system that was as "naggy" as Tesla's Autopilot about forcing the driver's attention. So all this negative media attention toward Tesla seems a bit unfair and biased.
People are ignorant and take no personal responsibility.
I had a young female run out in front of my car yesterday and FSD immediately saw her and slowed down and didn't come close to hitting her.
Young males they drive right over. Is that a bug?
And I shutdown FSD because I felt it kept on getting dangerously close to Jersey barriers.
I remain utterly baffled by media pushing the narrative that "autopilot" implies to users that they don't need to pay attention. The term comes from airplanes. In that context, there is a pilot AND copilot who pay attention at all times, ready to take control. You know, exactly like L2 autonomy is supposed to work, and what users must agree to before enabling the feature. Where did this narrative that "autopilot" implies the user doesn't need to pay attention come from? It's literally exactly the opposite.
I use Autopilot regularly and it's obvious that if you don't pay attention you'll have an accident every 50km or so.
Anyone who complains that they became complacent because of Musk or whatever is lying.
Maybe they made a stupid decision and had a horrible event happen to them, and they prefer to go into denial because of the trauma and/or shame, but, again, it's obvious when you use it that you must pay attention (on top of that, Tesla tells you to pay attention and that you are responsible).
It came from Elon himself; he’s said Teslas would be fully autonomous by the end of the year, every year, for the last ten years in a row. Thus he created the perception amongst average consumers that autopilot’s underlying technology is far more capable than it actually is, always just a few months away from being fully hands-free and safer than a human driver, according to Elon.
Sure, when you enable AP or FSD it comes with disclaimers and alerts that you have to pay attention, but Elon has talked non-stop for a decade about how the cars are basically already autonomous, so many people think paying attention is more of a legal technicality than a necessity. Elon has worked hard to create that misconception so he could sell more cars, using credulous media outlets as a tool to help him promote it.
I’m baffled as to why this is surprising to Tesla fans. It sure seems like they’re playing dumb for the sake of cope. Pretending customers should know exactly what “autopilot” means because of how it’s used in aircraft is also ridiculous. Most people use the term colloquially to imply automation, and few people understand how it works in airplanes. The description of autopilot given here isn’t even correct: many single-pilot aircraft employ autopilot, and once it’s engaged pilots are free to focus on other aircraft functions, go through checklists, communicate, navigate, etc. It does not require hovering your hands over the stick/yoke, ready to make a split-second adjustment if something goes wrong, like how it works in a Tesla.
If the media is guilty of anything it’s being too charitable to Musk by uncritically repeating his dubious claims and failing to provide context for them. What’s funny is if media outlets actually clarified that “autopilot” is little more than a lane-departure and cruise control system, pointing out that every other automaker offers these features and they aren’t special, Tesla fans would be outraged by that, too. They’d say the media is downplaying the brilliance and ingenuity of the software that powers autopilot. No matter how the media covers Elon/Tesla, his fans will take umbrage with it and blame them for problems instead of holding Elon responsible.
@@mrjpb23 shuuuuuuuut uuupppppp
It's not only the media. I see channels on YouTube dedicated to legacy auto basically using that report as a gotcha moment to slander Autopilot as a failed system, then trying to link it to a Tesla doom-and-gloom story. This is like when seatbelts were introduced and had to become law before people reluctantly began wearing them. It's ridiculous how humans lack critical thinking at these pivotal moments in our history, but eventually all vehicles will require FSD as human driving will be far inferior. Let the complaining and crying begin!!
Thoughtful as ever. Like your suggestions to Tesla - get some good news narrative flowing. Like Apple does with the Apple Watch - their “Dear Tim…” emails 🤔
Autopilot prevented me from having a head on collision with a large truck on a two lane country road at 55 mph.
I've done the same thing myself without it
@@jayfizzle7931 As we all have. But computers never get tired or lazy and recheck the situation 30 times a second. Statistics show that FSD and Autopilot outperform human drivers in terms of safety today, and this will get better and better in the near future.
@@jayfizzle7931Autopilot or FSD needs to save you only one time in your life to make it worth it.
Me too.
I do that 1,000 a day sometimes. You wouldn’t believe the number of trucks I manage to pass without hitting, like a super-human cyborg AI God King.
Having recently survived a head-on car collision, FSD and safety are top priority for my next vehicle.
Me too. My model 3 saved my life, and my daughter’s.
Tesla correctly used the term autopilot. This term has been in use in commercial aviation for many years. In the context from which it was derived aircraft are not point to point autonomous, they are the equivalent to lane assist as the “autopilot” maintains a specific course for the aircraft.
The term is technically accurate but not to the general public. Copilot would have been a far better name choice.
Most consumers are not pilots; what matters is the colloquial understanding of the term and the claims made surrounding it. Musk has claimed Teslas would be fully autonomous by the end of the year, every year, for the last ten years, thereby giving people the (false) impression this technology was only a sliver away from being fully automated the entire time. Pairing that pitch with a name many people believe to mean full automation was unbelievably reckless and knowingly misleading. If autopilot was only meant to connote common features like lane assist and radar cruise control, he could have called it that like every other automaker. But he didn’t, because he wanted people to believe it was something much more than that for marketing advantage. He chose to prioritize sales and public perception over safety.
In fact, Tesla’s terminology is STILL intentionally fuzzy and misleading. They’re now calling it “Full Self-Driving (Supervised)”. That’s an obvious contradiction in terms. It’s not “Full” self-driving if it needs to be supervised. If they wanted to be clear about communicating expectations to the customer, they would have just called it Supervised Self-Driving and dropped the Full. Elon can’t cry foul when people misconstrue terms he intentionally implemented to be fuzzy and even contradictory.
And if you really want to nitpick terminology, commercial aviation automation actually does much more than maintain heading and altitude in modern aircraft. There’s autothrottle, autoland, autopilot can be preprogrammed to follow VORs/waypoints so it navigates itself, etc. It’s fuzzy in itself.
@@mrjpb23
Autopilot isn't FSD...
@@Zripas I never implied it was. Not sure what your point is.
If you're a pilot, you'd be able to discern the difference. How many Tesla drivers are pilots?
22:39 - don’t forget to add the average of 40 times per day, which Tesla does count, that it prevented a clearly accidental pedal misapplication. How many deaths and other accidents does that prevent every year?
Fair point, we should add that up in one chart someday.
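The exchange above can be put to a quick back-of-envelope calculation. Note the "40 times per day" figure comes from the comment itself and is treated here as an assumption, not a verified Tesla statistic:

```python
# Back-of-envelope sketch: if the fleet prevents ~40 accidental pedal
# misapplications per day (the figure cited in the comment above,
# taken here as an assumption), how many is that per year?
PREVENTIONS_PER_DAY = 40   # assumed daily figure from the comment
DAYS_PER_YEAR = 365

preventions_per_year = PREVENTIONS_PER_DAY * DAYS_PER_YEAR
print(preventions_per_year)  # 14600
```

Under that assumption, roughly 14,600 prevented pedal misapplications per year would be the number to put on such a chart.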
When you activate Autopilot for the first time, at least in Europe (but my guess is that it is the same in the US), it is clearly explained that it should only be used on roads with a physical separation between traffic directions, like highways. Autopilot still works on most other roads, and that might be what the article is referring to: they want Tesla to enforce that Autopilot only works on the specific types of roads it is meant for, according to Tesla. It makes sense, but it would greatly deprive sensible Autopilot users.
This is similar to how the media reports on gun violence but not defensive gun use.
Many comments by FSD drivers about "the nag" indicate they too are irritated by it so that reaction is not uncommon. In my experience, I usually miss the text that comes up to remind me to touch (actually hold and assert some small presence on) the steering wheel. But I easily see in my peripheral vision the flashing blue shading at the top of the display even if I'm watching the road, not the display. I then have plenty of time to apply pressure to the wheel or spin the volume wheel on the steering wheel to indicate I'm aware. If I'm listening to the radio, just a single click up or down does the job without changing the volume enough to miss something. It seems that in recent software releases, a minute or two can go by before nagging me. Maybe the interior camera has become the primary watchdog, not the wheel.
Perfectly reported and commented on. I've had a very similar experience with my FSD avoiding an accident. I've also had to get out of FSD as my vehicle tries to merge into traffic off of I-19 onto I-10 going east to El Paso. Sometimes it's not bad, but sometimes it's super crazy at that short transition as traffic is merging not only onto I-10 but merging to the right to exit almost immediately onto the exit for city streets. I-10 has several of these choke points entering and exiting I-10. They cause a lot of traffic slowing down to an almost complete stop at times because they are severe choke points in high traffic situations. I get out of FSD about 1/3 to 1/2 the time at this intersection, that I travel almost daily.
What do you think would have happened if you didn't intercede?
Over the years I have been involved in replacing several legacy computer systems. The incumbent users are generally very reluctant to see changes because they will have to learn some new things.
I think the new driver assist systems being developed now are meeting a similar form of reluctance. Older folks just don’t like change. Young people learn the new systems and readily appreciate the advantages of them.
Great commentary! Thanks very much. I’m 83 by the way and happen to love FSD Super(vised)!
I have to totally disagree with your apparent conclusion. It's not a question of not wanting to change; rather, it's the delivery of a product that hasn't matched what was sold to the public years ago by Musk. The reluctance to "accept" is that Musk oversold what was deliverable. No one, not even the YouTube Muskateers, can deny that. Overstating what was deliverable was fairly common practice in the computer industry; perhaps it still is. I was a systems engineer for 35 years, and part of my job was to accompany salespeople to client presentations. The drivel that came out of salespeople's mouths on occasion was beyond cringeworthy. Autopilot? The name suggested something it never could be. FSD also suggests that the product is "Full Self Driving"; it's not. We all know that, or at least we all should. "FSD" is an incredible display of hardware and software engineering, but to this day it is not what Musk sold in 2020. Do I believe Tesla's engineers are finally closing in on Musk's ancient sales pitch? Absolutely. Yet even today it appears that a couple of steps to autonomy are being conveniently overlooked, at least as far as public presentations are concerned. I'll sidestep government regulation, because everyone seems to think that Tesla will tell NHTSA what to conclude; I'm not sure it's going to work that way. Insurance? I'll call my insurance company tomorrow and ask whether a car that I own and insure will still be covered if it gets into an accident with no driver behind the wheel. I'm pretty sure I know the answer. Auto Summon comes to mind; you know Tesla will claim that Tesla is not responsible. Putting all of that aside, I firmly believe that removing a person from the decision-making process will turn out to be a horrible mistake.
The legal challenge is far greater than the technical one.
Hey Dr. Know-it-all, I think this is a good one to ponder. Now that they're introducing Grok into Teslas: once Grok learns how to drive the Semi with Pepsi's data, will it be over for every other hauling company on the planet?
Wired was a favorite magazine of mine back in the 1990s. Then I saw how they were reporting about EVs and gave up on that publication more than 10 years ago...
FSD greatly improves safety and is improved every month. Anything reported is far out of date and not relevant to the current system.
Those are pretty general statements I'm not sure how they can be proven.
British Columbia, Canada, just made any autonomous driving over level 2 illegal! Way to keep BC drivers in the Stone Age! I’m dumbfounded by our legislature thinking that full self driving is somehow more dangerous than some of the idiots currently on our roads.
What's your 1st hand experience with so called Full Self Driving? You shouldn't be baffled.
@@craighermle7727 Do you have 1st hand experience with ICBC?
Have been driving with Autopilot for about 4 weeks. Lots of phantom braking in tunnels, occasionally on a freeway. It is inconsistent reading speed signs and definitely does not disengage when the accelerator is pressed. Autopilot combined with regenerative braking and flashing brake lights did, however, save me from a nasty accident on the freeway. Coming from 110 km/h down to 80, the car in front was late to see a breakdown and slammed on the brakes. Autopilot saw it before me and started the regen before I could get my foot to the brake. The combination of the two pulled me up real quick. I looked in the rear view mirror and the car behind just pulled up in time, but behind him was chaos: cars cutting lanes and eventually a 3-car pile-up. Teslas brake very quickly, and every car should have flashing brake lights when emergency braking.
I've had a Model 3 Highland with Standard AP for about 2 months in Australia. I've had 2 very scary phantom braking episodes that I'm lucky didn't result in being rear-ended. The last was at 110 kph; I was 2nd in a queue of cars when my car slammed on the brakes. I can't explain it, as conditions were perfect at the time. The highway has some earlier roadworks, so the max speed sometimes dropped suddenly from 110 kph to 40 kph. Not sure if a car on AP with Autosteer would brake hard under those conditions.
@@gjhyland It can be dangerous for sure.
The blue steering wheel isn’t the only indicator. The entire path planner thickens and turns blue, as well as audible indication.
Thank you, John, for explaining in detail how bias against Tesla supervised FSD is a headwind that Tesla really needs to address, perhaps with a Media Relations department. Thank you for your excellent commentary!
Why isn't Tesla comparing miles driven between accidents in Teslas using Autopilot and FSD versus other brands of cars? At the very least it would make discussion of Autopilot/FSD less defensive. Right now the media has to dig to find this information, and then decode what the NHTSA reports actually mean. It is far easier to go with the Tesla deathtrap narrative.
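The comparison this comment asks for is just a miles-per-crash ratio. A minimal sketch, using made-up placeholder numbers (not Tesla's, NHTSA's, or any automaker's actual figures), only to show the shape of the comparison:

```python
def miles_per_crash(miles_driven: float, crashes: int) -> float:
    """Miles driven per reported crash; a higher value means fewer crashes per mile."""
    return miles_driven / crashes

# Placeholder inputs, purely illustrative:
assisted_rate = miles_per_crash(1_000_000_000, 200)              # hypothetical driver-assist fleet
fleet_average = miles_per_crash(3_000_000_000_000, 6_000_000)    # hypothetical overall fleet

print(assisted_rate)   # 5000000.0 miles per crash
print(fleet_average)   # 500000.0 miles per crash
```

The real difficulty, which raw ratios like this hide, is that assisted miles skew toward highways while the fleet average includes all road types, so any honest comparison needs matched road conditions.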
FSD and Autopilot make you drive more safely when you don't trust them. Increasing your level of attentiveness to always being ready to intervene already makes you less likely to get into an accident than the average driver. It is the attentiveness of the driver and the usage of FSD that makes FSD statistically successful.
“A trend of avoidable crashes”. Oh! OK, now I totally get it. In other words, regular human traffic on the roads. What percentage of traffic accidents is caused by human error? Something like 95%?
What other product out there is blamed when people use it wrong? If you separated out "accidents/deaths when people used it properly" it would be miniscule.
Did they use it wrong? Your statements are indefensible. Have you seen a user manual for FSD?
@@craighermle7727 In every accident I've seen reported on, the people were over-reliant on it. Many were found to be driving drunk. So yeah. By pure logic the accident rate *HAS* to be lower, because you're just as aware of potential accidents as if you were driving unassisted, but now you have the added awareness and faster reaction time of the car avoiding accidents you would have missed on your own. It can only possibly reduce accidents if you're using it correctly: as a driver assist feature, like it says to do in the manual, and like you have to read ad nauseam on the screen and acknowledge and agree to when activating it for the first time. So not only are my statements defensible, they're unassailable.
I've said it before. When a pilot puts a plane in autopilot doesn't mean he can sleep. The pilot still needs to pay attention and be vigilant.
FSD does not take the bad drivers off the highways!
Hey Doc, while you are absolutely correct about Tesla’s ability to avoid accidents, as you highlight in the video clip with the avoidance maneuver, I think you must also acknowledge that FSD, which was engaged at the time and prevented a serious collision, should probably have been automatically slowing down in response to the approaching emergency vehicle, as other cars appear to be doing during the clip. If the Tesla had slowed, the offending Suburban would probably have had space to pull over in front of the Tesla. While it's not illegal for the hearing impaired to drive a car, could not being able to hear be the next hurdle for ADAS to overcome? Bottom line, I would have used a better example to show the system’s capabilities. Keep up the good work. I like your videos.
My first car was a Corvair Corsa.
Ralph Nader called it "unsafe at any speed".
I have been ahead of the game since '82.
11:23 (Bigger autopilot indication) YES! When driving with my distance viewing glasses, it can be hard to see smaller stuff in the car. In some of my driving, I'll wear bifocals (Ben Franklin's answer to seeing things in the car and out), but on long trips especially, I'll use my distance-only glasses most of the time as they reduce eye strain better, and some of the tell-tales in the car can be a little harder to see.
Makes sense that Tesla no longer sells autopilot. There is no way to make it completely safe and not have FSD capability. So now people have to purchase FSD if they want the capabilities.
Even when not on, the Car is always watching and prevented me from rear ending a car on the highway when I was distracted. AP wasn’t even on and I don’t have fsd.
Before the update the Autopilot nag was annoying. After the update it is extremely annoying and sometimes even dangerous. There have been many times that I have gotten the nag and applying turning force on the wheel and/or the scroll wheels does not register, and the screen continues to pulsate blue. Sometimes I have to turn the wheel hard enough that it disengages Autosteer, and I have had to apply enough force that when it disengages the car jerks to the side. This is especially dangerous when I’m on a curve. I’ve had other scenarios where I barely apply pressure and Autosteer disengages, and if I'm listening to music loudly, or others in the car are loud, it can be easy to not hear the chime letting you know it disengaged. You only realize it when you get to a curve and the car doesn’t follow the curve. I have also had the issue where I turn my blinker on to change lanes just as the nag comes up; when this happens the side repeater view doesn’t come up, so you go to look at the screen to make sure there isn’t a car next to you and there is no video from the side repeater. So IMO NHTSA’s complaints and “recall” have actually resulted in Autopilot being more dangerous.
I’m loving FSD 12.3.4 as I rarely get nagged by it and it does much better than Autopilot
Thoughtful as ever, Dr John. Like your suggestions to Tesla - get some good news narrative flowing. Suggest like Apple does with the Apple Watch - their “Dear Tim…” emails 🤔
I have set the option to display Rainbow Road whenever Autopilot is enabled, as the little blue icon is very hard to see. This makes it much easier to see if it disengages for some reason, e.g. a small movement of the steering wheel.
On the steering torque thing, I’ve experienced this. I’ve used other driver assistance systems, and Tesla’s grips the steering wheel a bit tighter than others do, so it takes quite a bit more torque to ‘release’ it from Tesla’s ‘hands’. And when it does release the wheel, that extra torque you were putting in now makes you swerve a bit. It’s like having a passenger try to turn the wheel right while you try to go left, and the passenger suddenly lets go of the wheel. It’s not crazy extreme, but it can be enough to make you swerve a bit.
I just took a 2,000 mile road trip in my Tesla and this happened 2-3 times. I’ve done the same trip in my Mach E and never had this happen.
Just curious, what are the stats for non-Teslas monitoring the same parameters? My guess is that Tesla is much safer in accidents per 100k miles than any other system, or no system at all. Just saw the rest of your video with the stats. I am sticking with my FSD. I feel much safer.
IMO, the fact that NHTSA has found so few cases of "can't fix stupid" situations is shocking. I'm sure there are way more Tesla drivers out there who misuse the system, but it's likely that it has saved lives in situations that would have been fatal without such software. Over time the data will win out and show that Autopilot is saving lives statistically.
Maybe the 'opt-out' part is a reference to being able to disable AP in the menu settings, which would invalidate the OTA update?
On Thursday I took my first test drive of a Model 3 and on Friday I took a test drive of a Model Y with Full Self Driving Supervised.
First issue was getting used to regenerative braking. Although intellectually I understood the car would slow down if I took my foot off the accelerator, my muscle memory was not prepared for a braking experience rather than just coasting.
Agree steering wheel icon is too small for us 60+.
Autopark was amazing (at least in the showroom lot).
I set up FSD to go from showroom (John Young Parkway) to nearest supercharger (Winter Park Wawa) and it was amazing in traffic (though at first for a couple of minutes I had a lot of junk interventions until I understood the interaction).
My issues were trying to get it to navigate back to the showroom. I went out a back way from the supercharger and came to the divided highway at a point where there was no cut-through (my navigation mistake), but when I engaged FSD it attempted an unprotected left, which was not possible because of the median divider, so I obviously intervened. The next time I tried to engage FSD for the return trip, I was at an intersection where forward would go into a big parking lot and the correct move was to turn right. The car went into the big parking lot; this might have been another FSD navigation error or my user error engaging FSD without a navigation plan. Finally, I engaged FSD in the left turn lane at a red light and FSD drove back to the area of the Tesla showroom, but went sailing right past it! Again, I don't know whether I had mistakenly disengaged the destination or FSD had made a navigation error. I stopped in the median to make a U-turn, made sure the navigation was fully engaged, engaged FSD, and it made it to the showroom with no issues. I used Autopark again but disengaged because there was a family with several small children going to a Tesla about three parking places away from where I was trying to Autopark, and I did not want Autopark engaged when there were tiny tots wandering around. So I looped back in the parking lot and successfully engaged Autopark. So, it takes getting used to. Back home I looked up the Model 3 user manual, and I will have to describe the complexity of the terms in another post.
Autopilot was dangerous prior to V12, as I was still getting many phantom braking incidents on the interstate. Still the driver's responsibility, and I was always attentive. I never thought the nag was too severe or too lenient. I have not experienced phantom braking at all since V12.
Phantom braking shmaking. Suck it up, girly-boy.
The older iterations of AP were similar to the radar and single camera used by other OEMs. I don't see the same level of scrutiny leveled towards those systems, which have the same technical flaws. I have experience with Toyota's and Nissan's TACC and LKA systems. They perform far worse, have even less driver monitoring features, and can be enabled anywhere at any time. They also have no real updates being applied to improve the technology in existing vehicles. So defects are essentially permanent.
They also cut off suddenly without warning; Nissan's system kicked off while traveling a narrow lane between construction bollards in a rainstorm. It was exactly the type of situation where I needed a bit of assistance, but it kicked off immediately, without warning. Tesla AP is far more capable in such situations.
I have yet to see a single accident that was NOT the fault of the driver misusing the system. Someone (looking at you, NHTSA) show me just ONE.
I suggest an opaque, large steering wheel icon with “NHTSA REQUIRED” imprinted on the image.
I've never had an issue with disengaging when I wanted to. It does firmly want to keep you in a lane if you start to drift... that's by design so that if you "fall asleep" and aren't intending to leave your lane that your "hand on the wheel" doesn't disengage inadvertently.
I use FSD everyday I drive 3000 miles per month and have been for over 2 years. No issues.
I just don't get how the media can continue to spread this type of misinformation that hurts shareholders, not to mention those who don't buy a Tesla and get hurt because of purposeful misinformation.
How would they rate non-traffic-aware cruise control? It has no crash protection and no driver monitoring. By their logic, the system we have used for the last 30 years should be illegal.
(continuation) What I found confusing was that from YouTube I understood FSD and "Autopilot", and I was under the impression that "Autopilot" was just an enhanced cruise control for turnpikes and interstates. But then there is "Enhanced Autopilot", which includes "Traffic Aware Cruise Control" and, more confusingly, "Red Lights and Stop Signs", which one does not expect on turnpike cruise control. And if I understand correctly, these features are all indicated by the same steering wheel icon (I may be wrong about that). If I had just bought a Tesla I would be certain what I paid for, but in other situations, such as this test drive, a friend's Tesla, or a rental car, it might not be clear to a newbie what bundles of features are included and what the implications (and limitations) of that bundle are. This experience is common with new software, but less common with cars. For example, when I rent a car in Florida I have to immediately know how to turn the headlights and windshield wipers on and off because of sudden rain storms (FL law requires headlights on).
I've had a Model 3 Highland with Standard Autopilot, in Australia, for about 2 months now. I seldom travel on fast highways, but on the 4 or so occasions I have, I've had 2 alarming phantom braking incidents. Both almost resulted in being rear-ended. The last was alarming: traveling at 110 kph, 2nd in a queue of cars, ideal conditions, no overpass pending, when bang, the car emergency braked hard. Really scary situation. I had to rush for the accelerator. My car fails the wife test for AP. I'm hoping for improvements once the HW4 inference computer is used for highway AP here in Australia.
As a driver in Germany I can say that phantom braking is still occurring, and the situations the car produces can be quite dangerous. Also, moving the wheel disengages it, and stepping on the brake too, but when a phantom braking event happens you have to quickly counter a sudden brake with sudden acceleration, and it doesn't always disengage. Besides that, there is an annoying new nag in place at traffic lights. We are having lots of issues with traffic lights in tunnels over here. They are usually off because the tunnel is open for use, but the car often wants to brake because you seem not to be paying attention. To some degree that is true, because I will be alerted on screen, and if I don't look at the screen the car will decelerate and beep. So if I pay attention to the road and don't look at the screen every time there is a disabled traffic light, this happens frequently. Also, sometimes the car will not correctly classify a traffic light as such and slows you down for no reason, because it thought there was a traffic light where there wasn't. Again, this is Germany and no FSD. I cannot tell how Autopilot behaves in the US, but in Germany there are many situations where I'd rather drive myself. I decide based on the road conditions whether I use Autopilot or not; in less crowded scenarios I definitely feel more comfortable with it right now. I would love to have FSD Beta/Supervised here, but for now it often feels too risky, so I can definitely relate to some of the allegations. As a Tesla bull I want this to succeed, so I am still biased to the positive side and have no doubt they will fix this. I only hope to be able to use it here in the EU soon too, though I don't have the highest hopes for that. Let's see... Thanks for your awesome work, keep doing what you do. Best regards from Germany. Cheers
Which software version are you on?
@@user-nf4st5kn6l In the app it says v11.1 2024.8.9
The blue steering wheel icon is too subtle an indicator that you are in FSD mode. The big blue line in front of your car is the best indicator that you are in FSD.
Out of line of sight, not sufficient
I have had FSD since 2019, and the current version is very inconsistent, and even dangerous if you are not 100% watchful and available to intervene.
Yeah, would love to see all the accidents (and potential deaths) FSD and Autopilot have prevented.
Autopilot was designed for limited-access roads, as stated in the manual. So any rural road with lights, stop signs, and pedestrians is incorrect use of Autopilot.
Point taken. But I argue that it is due to precisely the point you raised that you should be mindful of the potential damage this may cause to Tesla's brand and reputation.
Further, the short sellers and struggling car manufacturers will hold on to these headlines deliberately to create distraction as they have been doing for some time.
well done
I have experienced the steering wheel being yanked fairly hard when I apply a small force to disengage AP. It may be trying to stay in the lane for the second or two before it computes that I am taking over. I no longer use the steering wheel torque to disengage AP and take over driving. Instead I put my foot touching the accelerator and then lift the right hand stalk once which disengages AP. Then I adjust my foot on the accelerator to maintain the approximate speed that AP was using. I have tried tapping the accelerator to disengage AP and it does work but it often results in increasing the speed noticeably which is not always the best thing to do given the surrounding traffic. Just my personal experience here, take it as anecdotal advice only.
Working in industrial automation, one can nearly perfect a machine to the point where the machine becomes the teacher and the operator just feeds it. If it makes just one minor mistake, or there is one oversight in interaction design (specifically safety), the complaints come out like a waterfall being fully released until a solution is made. Autopilot and FSD are going through a cultural change with eventual acceptance. Kind of like a new stop sign in the neighborhood.
“Roads it’s not designed for”: I’ve wondered if this could mean dirt roads; in my experience FSD (not Autopilot) seems to handle them too aggressively.
Remember, we're talking about the OLDER versions, pre-v12. v12 is a total restart, with an end-to-end neural network: not a programmed but a trained system, so the old issues are not relevant anymore. Heck, they may have full self driving in the 2nd half of the year (I assume the Robotaxi will not be available in larger numbers right after it is announced). Totally different system.
It seems like it would be trivial to tint the image of the car on the main screen bright red or something else really obvious to indicate the driver assistance status.
I've had my model Y since last October and I would like to know, without actually finding out by experience, if the car will save my butt if I don't have FSD turned on.
To my knowledge you can still cover the camera to avoid nags when using autopilot, which I think could be the issue.
I think you need to be on the call with Dan and Adrian 😂
It's not just WIRED reporting this way;
Being reported the same way at CNBC;
_KEY POINTS Federal authorities say a “critical safety gap” in Tesla’s Autopilot system contributed to at least 467 collisions, 13 resulting in fatalities_
And RT;
_The US Transportation Department’s National Highway Traffic Safety Administration_ (NHTSA) _said that their investigation into Tesla’s Autopilot, an advanced driver-assist feature that Elon Musk insists will eventually lead to fully autonomous cars, had identified at least 14 fatal crashes in which the feature was involved_
Also NBC and many, many more as a cursory search will show
We also never talk about the fact that autopilot is probably going to be replaced by FSD at some point.
By this logic, 30,000 US car deaths each year should be a liability to legacy auto. Also, humans are not perfect drivers and should not be allowed to drive, ever!
Tesla Autosteer without FSD really DOES rely on road lines. Intersections usually have a break in these lines. If the car is in the right lane and there is a slight right hand curvature in the road within the intersection then autosteer may choose to swerve to the left lane in a fairly violent manner inside the intersection. That has happened to me.
If there is an overtaking car in the left that car may be hit during this swerve. Autopilot "knows" when the car is not on a controlled access road and Tesla can simply lock out the autosteer (on cars without FSD) on city streets for example.
I think that is a reason for Tesla's warning about autosteer:
Warning (from Tesla owner's manual):
Autosteer is intended for use on with a fully attentive driver. Do not use Autosteer in construction zones, or in areas where bicyclists or pedestrians may be present.
With the increase of competence of FSD is it time to replace Autopilot with a limited version of FSD?
Talk about nagging ... Nagging is independent of both Autopilot and Supervised Full Self Driving. Yesterday I was driving in a 2021 Model Y with the assistance of (S)FSD 12.3.4. We were following a car in the right lane of a two-lane side of a four-lane road. The car we were following lit its right turn blinker and slowed to turn right. The car in front failed to exit the lane to the right sufficiently promptly to satisfy the nagging system, and Tesla nagged me to alert me to the possibility that if I did not slow or steer to the left I might hit the back of the turning car. I took no action, and (S)FSD 12.3.4 did NOT take action, and it did NOT hit the turning car.
I felt the humor of an instance of Tesla nagging its own (S)FSD 12.3.4. :)
FSD will eventually stop all accidents when all cars on the roads have it. As long as we still have cars without FSD, and their drivers make mistakes like cutting others off without leaving enough space, we will still have accidents. When all cars run FSD, the cars themselves will avoid hitting each other because they will all be communicating with each other.
They don't need to create a bigger icon. Anybody who uses Autopilot knows that when it's engaged, the lane lines turn blue.
Regarding the 2023 accident, I wonder why forward collision warning and emergency braking didn't activate? I looked for details online about the accident, but I couldn't find them.
The details I found were that it was a 2022 Model Y driven by a 51-year-old male, in Halifax County, North Carolina. The article says that the vehicle MAY have been operating on a partially automated driving system when it struck a student who had just exited a school bus. So in short, they don't yet know if AP was even engaged, when in fact it could simply have been a guy driving, not paying attention, who blew past a school bus with flashing lights.
Media gaslighting has been uniquely dangerous - and may still be.
2:37 The report says, 13 fatal crashes and 14 people killed. This means one of the accidents killed two people.
There are an estimated 14 million accidents in the USA per year, for some perspective.
I believe I posted a paragraph that made this point about Tesla FSD. The current generation, along with Generation Alpha, still enjoys old-school ways of transportation, and that goes for the current regulators. Tesla FSD will be adopted and embraced by Generation Beta and beyond. Generation Beta and beyond do not like to leave the house or work at the office. Those generations like everything to be delivered to them, and they will be the laziest drivers, requiring FSD to do the driving.
Sorry, my Tesla owner's manual quote did not include everything:
Warning
Autosteer is intended for use on CONTROLLED ACCESS highways with a fully attentive driver. Do not use Autosteer in construction zones, or in areas where bicyclists or pedestrians may be present.
Wired magazine has a political tilt to it that seeps into most of its articles. Like Reuters.
So if this ends up being the panacea, what are the ethics of not making it open source to all?
The human driver's mistake in the Flagstaff video was mitigated by FSD.
And then there's the case of FSD that drove a man suffering a heart-attack to the hospital...
No, those stats don't show what you say. You're assuming that the same type of roads are driven in Autopilot as manually, but that's clearly not the case. One tends to use Autopilot on divided highways or interstates (like one would with any adaptive cruise control or lane guidance feature in any car), and other roads are manually driven. The actuarial risk per mile differs between types of road by a substantial multiple, at least based on when I had Pay As You Go car insurance in the UK, in which I was charged per mile based on the type of road, in turn based on actuarial risk. I recall a difference of about 4.5 between the highest risk (small residential roads) and lowest risk (motorways), per mile. (Yes, the UK, despite being metric, uses miles. And stones. I don't recall if they gave vehicle weights in stones, or just people weights.)
Tesla releasing better data on this, comparing Autopilot or FSD to manual driving *on like for like roads*, would be very helpful, and make their point more firmly and credibly and falsifiably. (Or not, conceivably. It's up to them to show convincing data in a careful analysis, not hand waving arguments.)
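The road-mix confounder described above can be made concrete with a small sketch. This is purely illustrative, using hypothetical numbers (the ~4.5x risk multiple is taken from the comment's UK insurance anecdote, and the mileage splits are invented): even if a driver-assist system were exactly as safe as a human on every road type, using it mostly on low-risk highways would make its aggregate crash rate look far better than manual driving.

```python
# Hypothetical relative crash risk per mile by road type.
# The ~4.5x city-vs-highway multiple comes from the commenter's
# Pay As You Go insurance anecdote; it is not official data.
RISK = {"highway": 1.0, "city": 4.5}

def expected_crashes(miles_by_road):
    """Expected (relative) crashes for a given mix of miles across road types."""
    return sum(RISK[road] * miles for road, miles in miles_by_road.items())

# Invented mileage mixes: assistance used mostly on highways,
# manual driving done mostly on city roads. 100 miles each.
assist_miles = {"highway": 90.0, "city": 10.0}
manual_miles = {"highway": 30.0, "city": 70.0}

assist_rate = expected_crashes(assist_miles) / 100   # -> 1.35
manual_rate = expected_crashes(manual_miles) / 100   # -> 3.45

# Identical per-road safety, yet the aggregate rates differ ~2.6x,
# purely because of where each mode is used.
print(f"Assisted aggregate risk per mile: {assist_rate:.2f}")
print(f"Manual aggregate risk per mile:   {manual_rate:.2f}")
```

This is why the comment asks for like-for-like road comparisons: without stratifying by road type, the aggregate numbers cannot distinguish a genuinely safer system from one that is simply driven on safer roads.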
Where is the NHTSA report blaming legacy carmakers for not keeping DUI accidents from happening? Isn’t DUI a misuse of the vehicle’s system?
Autopilot is only designed to work on highways. It does lane keep and it will change lanes. It’s not designed for city streets as it does not stop or recognize traffic signals or any other turning options.
they absolutely need the counterfactuals on hand
Why not have the system verbally warn the driver?
9:42 DKIA, yes, but: I would love to be able to bias the lane holding WITHOUT causing a disengagement of Autopilot or FSD, in order to dodge debris and potholes, or to give more room to the oversized or under-controlled truck that I'm passing or being passed by. In other words, Elon, let me tweak the steering, just as you let me tweak the acceleration, without dropping me out of lane holding or FSD. I admit, dropping out of Autopilot isn't bad, as TACC still keeps the car going. But if I nudge the wheel enough to come out of FSD, the car will start braking if I don't put my foot on the accelerator soon enough... this is NOT GOOD. And yes, FSD and Autopilot resist the turning of the wheel by the driver and force one to put a little extra oomph into the turn, at which point it drops you out. Better if it let you steer the car but kept you in FSD... this would also be better for "teaching" the car to drive better, like at intersections of ditch-surrounded narrow roads and hard-to-access driveways.
24:20 Didn't like 30 people crash because they disengaged without even realising?!
This is _much ado about nothing._
The safety of the vehicle is not _perfect_ but it's much better than the safety of vehicles *without it.*
The biggest philosophical issue for me, is that FSD will save lives. Say, FSD stops 90% of accidents in the US, so it saves 36,000 lives annually. But 4000 still die. The problem is that those 4000 that die are not going to be the same 4000 that would have died if there were no FSD. There might be some overlap, but some of those people would not have been in a crash if they were driving on their own and some people who would have died because of their poor driving, drunk driving, etc. will not be in a crash and will live. This may be a minor point and there is probably no way to know which people were saved and which people died that wouldn't have. Of course, having 4000 people die instead of 40,000 is a big plus. Tragic that anyone dies.
I like to look at issues with an "all thing considered" approach. And we can't do that here because there's no way to report on all the lives AutoPilot saved... on all the accidents AutoPilot avoided.
You have to apply your analysis clearly to either FSD or AP or EAP. The rest of us, not in North America, are not using FSD.
You wanna know what's going on? Go watch the movie about the 1940s "Tucker" car.
An Audi would have given an alarm for the driver to react, and if there were no time for driver intervention it would brake and stop fully. We need more context to do a comparison, which we aren't given.
Ever since the last recall that added more nags, I decided to just drive my Honda CRV because the level 2 driver assistance doesn't even nag me.
Why is it ok for Honda to not have nags?
Probably an exaggeration about Honda. I drive a 2021 Subaru with an ADAS system and it nags every 15 seconds about the steering wheel even though my hands are on it. It will also nag promptly when my head is not aligned straight and up (even if I am watching the road).
@NoWastedCalories NHTSA involvement is retrospective, once issues are identified.
In addition, NHTSA has required reporting of incidents since 2020, so if the CRV is older, its incidents are not being reported.
Nissan used PRO PILOT for their driver assist software. Sooo
The only danger is the dead head behind the wheel.
13 cars - 14 killed. A car can carry more than 1 passenger.
Tesla's naming is completely irresponsible. They named the basic thing "Autopilot" and people started dying in dumb accidents; they named the second thing "Full Self-Driving" and more people have continued to die in dumb accidents. FSD is not full self-driving at all; level 2 is not full self-driving. The naming is deceiving and utterly irresponsible.