Tesla Dashcam Footage Suggests Reasons for Autopilot Crashes | WSJ Exclusive

  • Published Jun 5, 2024
  • The federal government is investigating 16 crashes involving Teslas operating in autopilot mode and emergency vehicles. The Journal obtained exclusive dashcam footage from a Model X in one of these accidents.
    In this video investigation, WSJ breaks down what happened and looks at the potential problems with Tesla’s autopilot technology.
    0:00 Exclusive Tesla crash footage
    0:32 A pattern of Teslas crashing into emergency vehicles
    1:52 Tesla’s autopilot system
    2:20 Autopilot in action - breaking down the drive
    3:42 What went wrong? Breaking down the crash
    4:52 Lawsuits and investigations against Tesla
    WSJ video investigations use visual evidence to reveal the truth behind the most important stories of the day.
    #Tesla #Autopilot #WSJ

Comments • 1.7K

  • @dd89210
    @dd89210 10 months ago +1195

    Can you imagine the driver just sitting there watching and waiting for the impact.... Or was he not watching.

    • @mluu510
      @mluu510 10 months ago +227

      He was too intoxicated to care

    • @tonespeaks
      @tonespeaks 10 months ago +147

      @@mluu510 This is the real issue, because what the report failed to mention is that Autopilot is just an advanced cruise control. It in no way takes responsibility for driving the vehicle. If the driver had just been using cruise control, would they have driven into the emergency vehicles???

    • @___Truth___
      @___Truth___ 10 months ago +84

      @@tonespeaks If that’s the case, then Tesla deserves to be sued for false advertising, since they constantly express how Autopilot is an early version of Self-Driving rather than cruise control

    • @lucaskp16
      @lucaskp16 10 months ago +60

      @@tonespeaks Still Tesla's fault for calling it AUTOPILOT and making it look better than it is. People will trust their lives, and others', to it and let the car drive in a state where they cannot.

    • @tonespeaks
      @tonespeaks 10 months ago +58

      @@lucaskp16 "making it look better than it is" Tesla can't be at fault in any way here. It is a driver aid, just like cruise control is a driver aid. Tesla has always stated it is a driver's aid and that the driver must be alert at all times, since it is their responsibility to control the vehicle.
      I'm a pilot and use autopilot; I know that I must stay alert at all times, because the system is just an aid. Not sure how anyone can misunderstand that. More importantly, there are millions of Teslas on the road; if this were really a huge issue with the name, wouldn't there be more crashes? Intoxicated people routinely crash into emergency vehicles. This is a common thing, and one of the reasons officers approach vehicles on the passenger side, because they used to get sideswiped by inattentive and intoxicated drivers.

  • @daviddefortier5970
    @daviddefortier5970 9 months ago +330

    I would say if the car has to remind you 150 times to focus... then you shouldn't be allowed to drive! The other problem here is the car should actually not disengage autopilot, but rather pull over to the side, then disengage autopilot and not allow it to re-engage for a set period of time.

    • @mmark300
      @mmark300 9 months ago +15

      It's not really like that. A driver could easily have both hands on the wheel and fully attentive and this warning will pop up. You have to actually be applying force to the steering wheel to let autopilot know you are there.
      It also now watches your eyes and after a few seconds of not looking straight ahead a different warning pops up - something like "keep your eyes on the road"

    • @DontMe__
      @DontMe__ 9 months ago +16

      Autopilot doesn't just flat out disengage when the driver doesn't respond to nags. It alerts the driver loudly, slows down to a stop, and turns on its hazards.

    • @aiyabanana2849
      @aiyabanana2849 9 months ago +6

      Autopilot is like: if you're not focusing, I'm not going to keep autopiloting, even though we're going to crash! 😂

    • @humphrey-7094
      @humphrey-7094 9 months ago +5

      Exactly. Autopilot should slow and park on the shoulder with hazard lights after reminding the driver three times to focus on the road. It should then disengage for the rest of the ride. If they are inattentive again, it should be reported and autopilot should be taken away for longer, even permanently.

    • @skip181sg
      @skip181sg 9 months ago +1

      @@mmark300 If the driver received 150 alerts… was coherent… was holding the steering wheel… pretty sure they (I certainly would have) would have disengaged autopilot

  • @mluu510
    @mluu510 10 months ago +892

    That Tesla driver should be held accountable. You need to take over when you see an emergency vehicle, no matter what

    • @danielreid6823
      @danielreid6823 10 months ago +42

      Yes, that Tesla "autopilot" SHOULD be held accountable

    • @xvx4848
      @xvx4848 10 months ago +36

      Musk needs to be held accountable.

    • @mluu510
      @mluu510 10 months ago +78

      @@danielreid6823 No, you agreed to the TOS to always be ready to take over when in autopilot. Clearly the driver wasn't

    • @danielreid6823
      @danielreid6823 10 months ago +1

      @@mluu510 I bought a welcome mat that says agree. You're welcome

    • @danielreid6823
      @danielreid6823 10 months ago

      @@mluu510 please.. come over 😉

  • @theforce34
    @theforce34 10 months ago +21

    150 notifications for a 45 min drive? That dude wasn't paying attention at any point.

    • @smallfgb
      @smallfgb 10 months ago

      Yes. If you watch the road, way fewer inputs are required. Also, it seems to prompt when it needs you to watch, just like it did for this accident

    • @ryantakahashi9130
      @ryantakahashi9130 10 months ago +1

      ​@@smallfgb No, those prompts also appear when it detects you aren't holding the wheel after a certain period of time, something like every 30-60 seconds, not just when something needs your attention. The prompts are different when it detects something you need to pay attention to. The "apply torque to the wheel" prompts are just the default "are you still there" prompts. Someone paying attention would have at least one hand on the wheel the whole time; they just wouldn't be steering themselves. This person was likely dazed and half asleep, but the beeps were enough to trigger them to put a hand on the wheel, which they probably dropped immediately, hence completing 150 of them in such a short time.

    • @mmark300
      @mmark300 9 months ago +1

      It's not really like that. A driver could easily have both hands on the wheel and fully attentive and this warning will pop up. You have to actually be applying rotational force to the steering wheel to let autopilot know you are there.
      It also now watches your eyes and after a few seconds of not looking straight ahead a different warning pops up - something like "keep your eyes on the road"
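The torque-based nag behavior described in the replies above can be sketched as a toy timer: the countdown resets only when measurable steering torque is applied, so hands resting on the wheel with zero torque still trigger prompts. This is an illustrative model only, not Tesla's actual implementation; the interval and torque threshold are made-up numbers.

```python
# Toy model of a hands-on-wheel nag timer. All names and constants are
# illustrative, not taken from any real vehicle firmware.

NAG_INTERVAL_S = 30.0      # assumed interval between "are you there?" prompts
TORQUE_THRESHOLD_NM = 0.5  # assumed minimum torque that counts as "hands on"

class AttentionMonitor:
    def __init__(self) -> None:
        self.seconds_since_torque = 0.0
        self.nag_count = 0

    def tick(self, dt_s: float, steering_torque_nm: float) -> bool:
        """Advance the clock by dt_s; return True when a nag should fire."""
        if abs(steering_torque_nm) >= TORQUE_THRESHOLD_NM:
            self.seconds_since_torque = 0.0   # torque resets the countdown
            return False
        self.seconds_since_torque += dt_s
        if self.seconds_since_torque >= NAG_INTERVAL_S:
            self.seconds_since_torque = 0.0   # prompt shown, restart countdown
            self.nag_count += 1
            return True
        return False
```

Under these assumed numbers, a 45-minute drive with no steering torque at all would produce 90 prompts; 150 prompts in a similar span would imply a shorter interval or extra event-triggered warnings on top.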

    • @elenabob4953
      @elenabob4953 24 days ago

      It doesn't matter, as the Tesla didn't detect the emergency vehicle. It could happen to any Tesla user.
      By the way, at other automotive companies, and especially in Europe, after a few prompts the autopilot will disengage, because that is the sane thing to do.

  • @xuepingsong5329
    @xuepingsong5329 10 months ago +804

    Time and time again, autopilot is L2 no matter what the marketing says. If you can’t drive without it, you can’t drive with it.

    • @heyaisdabomb
      @heyaisdabomb 10 months ago +22

      What about the cops? Are we going to act like parking in the middle of a highway for a "traffic stop" is safe? Blame the driver all you want but accidents with 2 parties take 2 parties to create. Had the cops parked on the side of the road like normal, nothing would have happened.

    • @vasiliyt8600
      @vasiliyt8600 10 months ago +82

      ​@@heyaisdabomb They blocked the lane to provide protection for the officers. They also had the flashing emergency lights on. If the Tesla driver's brain wasn't in constant hibernation mode, he/she would have cut the speed long before and driven at a safe distance around the obstacle.

    • @chadwickwood9843
      @chadwickwood9843 10 months ago +20

      Haha, they are suing Tesla? Sue the driver. What a bunch of idiots. The Autopilot team still has work to do, but this is ridiculous.

    • @vasiliyt8600
      @vasiliyt8600 10 months ago +8

      @@stanvassilev LOL. It's not about just "numbers". We are talking about scientific fields.
      And yes, it's 100% the driver's fault.
      *Level 2* (Partial Driving Automation)
      This means advanced driver assistance systems, or ADAS. The vehicle can control both steering and accelerating/decelerating. Here the automation falls short of self-driving because a human sits in the driver’s seat and can take control of the car at any time. Tesla Autopilot and Cadillac (General Motors) Super Cruise systems both qualify as Level 2.
      *Level 3* (Conditional Driving Automation)....
      *Level 4* (High Driving Automation)...
      *Level 5* (Full Driving Automation)
      Level 5 vehicles *do not* require human attention―the “dynamic driving task” is eliminated. Level 5 cars won’t even have steering wheels or acceleration/braking pedals. They will be free from geofencing, able to go anywhere and do anything that an experienced human driver can do. Fully autonomous cars are undergoing testing in several pockets of the world, but none are yet available to the general public.
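For reference, the SAE J3016 levels the comment above is paraphrasing can be restated compactly (in the published standard, Level 3 is "Conditional" and Level 4 is "High"); a minimal sketch:

```python
# SAE J3016 driving-automation levels, restated as an enum.
from enum import IntEnum

class SAELevel(IntEnum):
    NO_AUTOMATION = 0           # human does everything
    DRIVER_ASSISTANCE = 1       # one assist feature (steering OR speed)
    PARTIAL_AUTOMATION = 2      # e.g. Tesla Autopilot, GM Super Cruise
    CONDITIONAL_AUTOMATION = 3  # system drives, human must take over on request
    HIGH_AUTOMATION = 4         # no human fallback within a defined domain
    FULL_AUTOMATION = 5         # no human fallback anywhere

def human_is_driving(level: SAELevel) -> bool:
    """At Level 2 and below, the human is always the driver."""
    return level <= SAELevel.PARTIAL_AUTOMATION
```

This is the commenter's point: whenever a Level 2 system like Autopilot is engaged, the attentive human remains the driver in every legal and practical sense.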

    • @tgsoon2002
      @tgsoon2002 9 months ago +3

      @@chadwickwood9843 Also, Autopilot is the lower-tier version. Tesla's AI team is mostly working on FSD.

  • @st-tube
    @st-tube 10 months ago +270

    Should it be called Copilot? If it's warning the driver an excessive number of times, maybe some preventive measures should kick in

    • @itsmeprasad1987
      @itsmeprasad1987 10 months ago +5

      That’s what I was thinking this morning

    • @incyphe
      @incyphe 10 months ago +31

      They should just remove the word "pilot". Call it "assist".

    • @thula2890
      @thula2890 10 months ago

      The name is just a convenient excuse to write another attack piece on Tesla. They could name it whatever they like and people would still ignore the warnings to pay attention and cause an accident. Every driver knows that there is a risk every time they get behind the wheel; Autopilot is no different, and you are made aware of this when you engage it. There will always be idiots who choose to f@(k around and put everyone else at added risk.

    • @nickbien
      @nickbien 10 months ago +21

      Lots of automakers call it something like that. Ford calls it CoPilot360 or BlueCruise, VW calls it Travel Assist, Volvo calls it Pilot Assist. All those names make it pretty clear that it's an assist system, not a fully autonomous system.
      But yes, Tesla needs to rename "Autopilot"

    • @davenz000
      @davenz000 10 months ago +17

      Doesn't matter. It was a drunk driver.

  • @rickkern5785
    @rickkern5785 10 months ago +225

    My Buick has lane keep assist and cruise control. It does not prompt me to keep my hands on the steering wheel. If I had hit the emergency vehicle they would be suing me, not Buick.

    • @drjekelmrhyde
      @drjekelmrhyde 9 months ago +1

      Pretty sure your GM car's camera in the steering wheel is watching your eyes.

    • @dansin444
      @dansin444 9 months ago +4

      What if it was your family hit by the Tesla? Same with guns, I know: all's good as long as it's not one's own kid in the school where the shooting happens

    • @rickkern5785
      @rickkern5785 9 months ago +28

      @@dansin444 The obligation of anyone who gets behind the wheel of a car is to be sober. Yes, this was a terrible accident, caused by a drunk driver, not by the car. The car prompted him over and over to pay attention. Does your car do that? There are hundreds of terrible accidents every day; that's part of living with roads, highways, cars and trucks. The lowest accident rate per mile driven is Teslas on Autopilot. The data is overwhelming. Only a moron would blame this accident on Tesla and not the driver. No other car would have stopped this accident from happening with this drunk driver behind the wheel.

    • @blisphul8084
      @blisphul8084 9 months ago +4

      If the US had good public transportation, people wouldn't need to drive drunk. What do you expect people to do to get home? Ubers are expensive. (If they even operate in your area)

    • @dansin444
      @dansin444 9 months ago +1

      @@rickkern5785 Can't be sure. No other car has self-drive. And American ingenuity has no limits

  • @jonathanmelhuish4530
    @jonathanmelhuish4530 9 months ago +5

    So an appropriate title for this video would be "Tesla autopilot fails to prevent drunk-driving crash".

  • @rylans.5365
    @rylans.5365 9 months ago +190

    You can take over at any moment. If the driver was paying attention (which he wasn’t), there was more than enough time to slow down and change lanes. The driver is obviously at fault.

    • @feuermurmel
      @feuermurmel 9 months ago +4

      I think Tesla should focus on making driving safer instead of more convenient, i.e. autopilot paying attention _alongside_ the driver, not _instead of_ the driver. You'd still do all the maneuvers yourself, with autopilot just intervening when you're obviously making a mistake. Then the issue of keeping drivers attentive would be solved, or at least lessened.

    • @danharold3087
      @danharold3087 9 months ago +5

      @@feuermurmel How is Tesla not doing that with autopilot?

    • @trerou4300
      @trerou4300 9 months ago

      No, it's Tesla's fault. Why just keep going into the front of a stopped car?

    • @seeker4430
      @seeker4430 8 months ago

      Maybe he was expecting the autopilot to change lanes

    • @moos5221
      @moos5221 8 months ago

      @@seeker4430 I think he was too drunk to drive, so it's obviously the autopilot's fault only.

  • @theinteloutside
    @theinteloutside 10 months ago +22

    Don't drink and drive... but when stupidity strikes, this will always happen

    • @heyaisdabomb
      @heyaisdabomb 10 months ago

      At least the guy was smart enough to trust autopilot over himself. The human here would have 100% crashed. As much as I hate Elon Musk, autopilot IS safer than a human, and the data doesn't lie.

    • @paladro
      @paladro 10 months ago +1

      @@heyaisdabomb there is no silver lining to this story

    • @theinteloutside
      @theinteloutside 10 months ago +1

      @@heyaisdabomb Driving when you're drunk doesn't make you smart, bruh... get your facts straight😂😂😂

  • @shockingguy
    @shockingguy 9 months ago +34

    Good report. I drove a Model 3 for three months, 17,000 miles. I used Autopilot many times, but it has issues, and you need to be awake and aware. I do not think any of these products, whether it's Full Self-Driving or Autopilot, are perfect; they should all come with extreme warnings that you need to pay attention and be aware of what's going on

    • @kenbob1071
      @kenbob1071 9 months ago +11

      They do warn you to pay attention. Every single time AP is activated you are warned to keep your eyes on the road, hands on the steering wheel and be ready to take control. The system also warned this doofus over 150 times. This accident is the fault of the man behind the wheel.

    • @patriciazoerner
      @patriciazoerner 9 months ago +4

      They DO come with warnings!

    • @edwilko8819
      @edwilko8819 8 months ago

      There are loads of warnings. You have to touch the screen to say you agree to the T's and C's 😮 Morons, mind you.

    • @shockingguy
      @shockingguy 8 months ago

      @@edwilko8819 Yeah, yeah, yeah, I should've changed my initial statement, because I was fully aware of those warnings. But I honestly think they should be stronger

    • @Legorreta.M.D
      @Legorreta.M.D 8 months ago +1

      @@shockingguy Question: knowing these Teslas have had accidents (some even change direction to drive over cyclists), are you not more stressed with the autopilot than without?

  • @dw4070
    @dw4070 10 months ago +395

    The driver 100% caused the accident the moment he got in the car and Drove While Intoxicated. Hopefully no one was seriously injured.

    • @danielreid6823
      @danielreid6823 10 months ago +13

      Sorry, I'm confused. By driver, do you mean the Tesla?

    • @th0master
      @th0master 10 months ago +62

      @@danielreid6823 the person who got in the car, while they were under the influence, which is illegal btw. Not the Tesla, the Tesla is not the driver, the person behind the steering wheel is.

    • @danielreid6823
      @danielreid6823 10 months ago +2

      @@th0master So who was driving in this incident?

    • @javiervasquez8797
      @javiervasquez8797 10 months ago +9

      Not making judgments until I know the # of incidents per mile driven while in Tesla’s Autopilot/FSD, compared to the # of incidents per mile driven in all other vehicles.

    • @th0master
      @th0master 10 months ago +31

      @@danielreid6823 The person behind the wheel, the drunk, intoxicated driver; the Tesla is just assisting.
      When I use my Tesla with autopilot, I am still driving, I am still paying attention; the Tesla is just helping me.
      Is a car using cruise control driving itself? Or is it just helping you drive the car?

  • @christopherg2347
    @christopherg2347 9 months ago +3

    Okay, a few things:
    1. Do not market L2 automation as Autopilot. That is intentional mislabeling of a product.
    2. _Why_ does the radar have issues detecting stationary objects? Stationary things are way more dangerous than things that are moving away from you!
    The wall or end of the traffic jam is more dangerous than some vehicle changing lanes a full safety distance away.
    3. Do not disengage the assistant before impact. Keep it running so it keeps slowing down, until a human override happens.
    4. Somewhere before 150 attention warnings the system should say: "Okay, either my sensors are faulty or you are faulty. I am off for the night."
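The escalation policy proposed in points 3 and 4 above can be sketched as a small decision function. This is a hypothetical sketch of the commenter's proposal, not any manufacturer's actual behavior; the names and the three-warning threshold are invented.

```python
# Hypothetical escalation policy: never hand control back mid-emergency;
# after too many ignored attention warnings, pull over and lock the
# assistant out for the rest of the drive.

MAX_IGNORED_WARNINGS = 3  # invented threshold ("somewhere before 150")

def assistant_action(ignored_warnings: int,
                     driver_override: bool,
                     obstacle_ahead: bool) -> str:
    if driver_override:
        return "disengage"           # a human taking control is the only way out
    if ignored_warnings >= MAX_IGNORED_WARNINGS:
        return "pull_over_and_lock"  # "I am off for the night"
    if obstacle_ahead:
        return "brake"               # keep slowing until a human overrides
    return "cruise"
```

The key design choice is the ordering: an explicit driver override always wins, the lockout comes before normal operation, and the system never stops braking on its own initiative.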

  • @xac1d
    @xac1d 9 months ago +5

    1:35 "driver is intoxicated." end of video lol

  • @dluchin1998
    @dluchin1998 10 months ago +219

    I've yet to read about an autopilot case where the driver was actually confused about what it could do because of the name. They willingly misuse it, just like people text and drive. Nobody blames Apple when someone crashes texting and driving.

    • @quantuminfinity4260
      @quantuminfinity4260 9 months ago +38

      I would argue this analogy isn't fully comparable. Elon Musk has repeatedly and continuously vastly overstated the relative ability of self-driving and when full self-driving will be available, for essentially 6 years at this point. It would be like if Tim Cook said, "Actually, iPhones will be impervious to damage with iOS 17 later this year and actually be safer than using a phone case," but then the fine print said it's actually just as susceptible to damage. Except in this case, no matter how many issues it has, Siri won't make a sudden dangerous move to kill you out of nowhere. Elon flaunts the relative safety of full self-driving, but in reality it's full self-driving plus a human babysitting it and constantly preventing it from making terrible decisions. Even so, there are still many, many cases of Autopilot making an abrupt decision that an attentive human driver is incapable of correcting for in time.

    • @douglasgroff7648
      @douglasgroff7648 9 months ago +20

      Autopilot and FSD Beta are completely separate software suites. Autopilot has absolutely no intention of ever being more than a Level 2 Advanced Driver Assistance System. FSD Beta has the explicit goal of becoming a Level 5 ADAS, hence the current $15,000 price tag. Why can't anyone understand this difference?
      As far as NHTSA saying that the term Autopilot is misleading, maybe they should talk to some of their compatriots over at the FAA, which is also under the Department of Transportation umbrella. Nobody thinks that “Autopilot” replaces the need for at least two licensed pilots to fly commercial airliners, who are constantly monitoring the system’s performance. Just like autopilot reduces the workload for commercial pilots and reduces fatigue, Tesla Autopilot has the stated intent to reduce a car driver’s workload and thereby reduce fatigue. In neither case does autopilot eliminate the NEED for an alert and attentive operator at the controls.

    • @JoseSanchez-ku9mm
      @JoseSanchez-ku9mm 9 months ago

      Right they weird people is something else

    • @saaah707
      @saaah707 9 months ago +4

      The problem is it's designed to do something that is inherently dangerous in the first place (driving). Whether it's their "fault" or not, we as a society can't afford to have a device with such a catastrophic failure mode around us

    • @madmaximilian5783
      @madmaximilian5783 9 months ago +3

      @@saaah707 The Tesla driver being impaired or under the influence should still count as operating a motor vehicle while impaired, making them liable for the crash.

  • @berdugosocials9854
    @berdugosocials9854 10 months ago +170

    No matter how smart the technology, it can't compensate for human disregard and neglect.

    • @B-RaDD
      @B-RaDD 9 months ago

      How about when using FSD? Who is responsible then?

    • @berdugosocials9854
      @berdugosocials9854 9 months ago

      Do you not read Terms & Conditions? There is always that clause from companies that states that they can’t cover human negligence or stupidity.

    • @raybod1775
      @raybod1775 9 months ago +2

      Not true, bad software.

    • @xx133
      @xx133 9 months ago +1

      It’s marketed as full self-driving. That’s the whole point…

    • @nogibertv4824
      @nogibertv4824 9 months ago +1

      @@berdugosocials9854 You read the Terms & Conditions?

  • @ChrisWehadababyitsaboy
    @ChrisWehadababyitsaboy 10 months ago +12

    120 yards sure looks closer than I thought

    • @ProXcaliber
      @ProXcaliber 9 months ago

      Looks like at least 50 bananas to me.

  • @CreekThaGray12
    @CreekThaGray12 9 months ago +8

    The features of staying in one lane and the car alerting you when you're not holding the wheel are the same as in my 2017 VW and 2020 BMW. In both, the driver is the only one that could have prevented this. Anything the Tesla does beyond this is an improvement. The car company does not drive the car; the person in the car does, drunk or not.

  • @johngriswold2956
    @johngriswold2956 10 months ago +172

    Really good presentation of this event. Fair, and not clickbait-y. Thanks for the straight reporting, WSJ.

    • @Bruh-wb3qw
      @Bruh-wb3qw 10 months ago +2

      Other than the whole “recall” thing at the end which is misleading, it’s pretty good. Surprised they aren’t spewing constant lies anymore.

    • @garykubiak
      @garykubiak 9 months ago +9

      Horrible representation of the events. They don't mention that the driver was intoxicated until much later. The whole point of this video is to slam Tesla's autopilot. Think they'd make this video if the driver was using Honda's cruise control? Exactly.

    • @GeorgeOrwell-tp8dw
      @GeorgeOrwell-tp8dw 9 months ago +3

      Other than the fact they are describing Full Self Driving... which this car doesn't have, and calling it Auto Pilot.... sure...

  • @XP-nt9iy
    @XP-nt9iy 3 months ago +1

    Back-up cameras led to people not being able to reverse without one, and now self-driving cars are gonna give us people who literally just can't drive. It already feels that way when I drive, so just imagine.

  • @MarkusLanxinger
    @MarkusLanxinger 10 months ago +12

    Who would have thought that 1.2 MP cameras and no radar would be a terrible idea. Just look at the Waymo vehicles in contrast.

    • @Lazy2332
      @Lazy2332 9 months ago +1

      Why do they not have radar??????

    • @phillyphil1513
      @phillyphil1513 9 months ago +2

      Re: "Just look at the Waymo vehicles in contrast." Exactly; there's more SENSOR FUSION happening on those things than on an Apache helicopter.

    • @ProXcaliber
      @ProXcaliber 9 months ago

      This was in 2019, when the cars still had radar; it's even mentioned in the video. This is also what is called Autopilot, not FSD; they are two separate systems with entirely different capabilities. Autopilot is basically lane keep assist plus adaptive cruise control, like in every other modern car on the road.

  • @NigelzXX
    @NigelzXX 10 months ago +44

    I have to agree the name Autopilot confuses many. But doesn't autopilot for Tesla just mean cruise control and a certain level of lane keeping? Autopilot and FSD are totally different

    • @david100483
      @david100483 10 months ago +7

      Yes, it’s pretty clear when you engage Autopilot that you are ultimately in control and responsible for driving. It’s just that some drivers abuse the use of it. I think the officers are trying to get a payday from Tesla… the responsibility for this crash is all on the driver.

    • @dluchin1998
      @dluchin1998 10 months ago +8

      It doesn't confuse anyone. How could you think you don't need to pay attention when the car is screaming warnings at you that you need to pay attention? Some people are just fools and misuse it by not paying attention.

    • @ciello___8307
      @ciello___8307 10 months ago +3

      Yes, but Tesla's marketing suggests it is more like FSD (which it's not)

    • @lachlanB323
      @lachlanB323 10 months ago +2

      FSD is something that is paid for separately.

    • @NigelzXX
      @NigelzXX 10 months ago

      @@dluchin1998 You need to understand there are plenty of fools out there who don't read or follow instructions. 😂😂

  • @actualyoungsoo
    @actualyoungsoo 9 months ago +2

    If he was intoxicated, is that crash still the fault of the autopilot? Him crashing into the police cruiser was his destiny, after all.

  • @suvari225
    @suvari225 10 months ago +4

    The narrator says that Tesla Autopilot slowed the vehicle down before impact and then disengaged. Why is that? Does it mean the driver panicked at the last moment and disengaged the autopilot? I have driven a Tesla for 2 years now and never had a moment where it disengaged by itself. If the driver doesn't obey the warnings to touch the steering wheel, it actually slows the vehicle down and pulls over to the side of the road with hazards on. So it does not disengage like that.

    • @Hans-Yolo
      @Hans-Yolo 10 months ago +2

      My guess would be the car made an emergency brake and then the driver also hit the brakes or pulled on the steering wheel and that disengaged AP

    • @mmark300
      @mmark300 9 months ago

      I've had moments where it disengaged, but because I also grabbed the wheel at almost the same time, I couldn't always tell which happened first. There was an incident, though, where I don't think I touched anything forcefully enough, and it made me believe that it shut itself off. I was actually surprised that it wasn't still on. It was in road construction

  • @mro2352
    @mro2352 9 months ago +19

    Even before it was mentioned, my thought was AI confusion from the lights of the emergency vehicles. Heck, some drivers have a hard time distinguishing the details of an emergency vehicle at night because of how bright the lights are; how do we expect the same from an AI system?

    • @CoryVY
      @CoryVY 9 months ago +8

      This is what I was thinking watching this video. Emergency vehicle lights are becoming dangerously bright, especially at night. When they are flashing at night, other drivers simply cannot see the outline of vehicles, any personnel around the scene, and they are often so bright they cause a dazzle reflex and Flash Blindness. I understand the need for them to be bright to catch the attention of other drivers during the day, but they need to be dimmer at night.

    • @person749
      @person749 9 months ago +2

      ​@CoryVY Exactly. Modern emergency lights are a danger. Until that changes I think it would be wise if these cars automatically slowed and ultimately disengaged when emergency lights are detected.

    • @budisutanto5987
      @budisutanto5987 8 months ago

      LEDs outline the vehicle.
      There were two tourists using a three-wheeled vehicle made in India, called a Bajaj.
      Because they almost always got hit by larger vehicles at night,
      they put LED lights on it.
      Travel was safer after they added those LEDs.

  • @lukerinderknecht2982
    @lukerinderknecht2982 10 months ago +14

    Jeeze. Even my non-FSD / non-autopilot Mazda would have braked for that.

    • @kevinbrown9425
      @kevinbrown9425 10 months ago

      nuh uh

    • @smallfgb
      @smallfgb 10 months ago

      The Tesla did brake. The reporter says it slowed, but that probably means brake input. At 65 mph, a Mazda's system would probably hit it too.

    • @lukerinderknecht2982
      @lukerinderknecht2982 10 months ago +3

      @@smallfgb Tesla needs radar, not just cameras. Braked way late.

    • @logitech4873
      @logitech4873 10 months ago

      @@lukerinderknecht2982 This Tesla had radar, lol.
      The driver was drunk. They could've been pressing the accelerator all along, for example.

    • @smallfgb
      @smallfgb 10 months ago

      @@lukerinderknecht2982 2019 Teslas all have radar. This is an older story too, so it was likely still active in software. Vision-only was 2021 models forward. The prompt for hands on wheel was probably due to the radar ping. My Model 3 doesn't just ping for hands on wheel at a fixed interval; instead it prompts when it sees problems. Much longer intervals on a long empty highway than in DC suburbia, for example

  • @adamr4198
    @adamr4198 9 months ago +1

    We just got our first car with adaptive cruise control. It is great and very seductive. Stay alert, folks; the machines are good but far from perfect.

  • @jimfergusondev
    @jimfergusondev 9 months ago +2

    At 1:54, could you please provide clarification? Did the user have Autopilot, Enhanced Autopilot, or FSD (Full Self-Driving)? I'm asking this because Autopilot, which comes with every Tesla, does not perform Lane Changes, contrary to what the video suggests. Lane Changes are possible only with Activated Enhanced Autopilot or FSD.
    Basic Autopilot is designed to maintain speed and avoid cars in front, while Auto Steer keeps the vehicle centered in the lane. Also, if the driver's foot is on the accelerator, Autopilot will not disengage. The car owner can request diagnostic data from Tesla, which can reveal the pedal states and the car's decisions. I'm not dismissing the incident; I'm just seeking clarification

    • @mmark300
      @mmark300 9 months ago

      If they had their foot on the accelerator there would have been a message on the screen, if I remember correctly. Autopilot will also shut off if you accelerate past 85mph

    • @jimfergusondev
      @jimfergusondev 9 months ago

      @@mmark300 FSD and Enhanced will disable until the vehicle is parked if you take them above 85 with the accelerator without deactivating them.

  • @wi11ialvl
    @wi11ialvl 9 months ago +23

    There is a difference between Autopilot and self-drive for Teslas. Autopilot is basically just cruise control with lane keep assist and some other features. Self-drive mode might have recognized the situation, but I have no idea. What say you, Tesla?

    • @mikerigley1
      @mikerigley1 8 หลายเดือนก่อน +1

      This is an issue: too much marketing hype with “Autopilot”. It should be called “cruise control” or “copilot”.
      Full Self-Driving should have sorted this though. But it’s foggy and confusing for the cameras.
      Still the human's fault though; the driver should have noticed the bright flashing lights, taken control, slowed down and moved further across the lanes.
      Autopilot is trusted too much, which is partly due to the name and partly to owners not reading the manual, which does state that it is basically lane and cruise control, like so many other cars have.

    • @Robin-vv9iy
      @Robin-vv9iy 8 หลายเดือนก่อน

      Only Mercedes has a Level 3 Self Driving Assist.

  • @CannabisTechLife
    @CannabisTechLife 10 หลายเดือนก่อน +12

    Should also look into how Teslas take out motorcycles from behind at night, because they have difficulty reading the brake lights and think motorcycles are farther away than they actually are.

    • @danielreid6823
      @danielreid6823 10 หลายเดือนก่อน

      Yeah I saw too. Great video

  • @blond-in-blue
    @blond-in-blue 9 หลายเดือนก่อน

    People might think from looking at this footage that there was plenty of time for the driver to react, but owning a Tesla I can tell you that between the expectation that self-driving should know what to do, lack of signals that it doesn’t know what to do (asking to put your hand on the wheel is not a warning and happens constantly), and having about 0.5 seconds to understand that it’s doing something unexpected at 60+ MPH it’s not easy for a human to react correctly under those conditions. That’s why I’ve almost completely stopped using any autopilot/self-driving on my Tesla. If these companies are creating what they claim to be “full self-driving” systems, they should take liability for hurting and killing people with those systems.

  • @simplesepticsecrets
    @simplesepticsecrets 8 หลายเดือนก่อน

    What a great ad !!!

  • @thomi100
    @thomi100 10 หลายเดือนก่อน +4

    One accident was after the update was released - but did the driver install it?

    • @logitech4873
      @logitech4873 10 หลายเดือนก่อน +1

      You have to install them yourself, yes

  • @hansonel
    @hansonel 9 หลายเดือนก่อน +4

    If you can't drive a car without autopilot "driving" the vehicle for you, you shouldn't have a drivers license IMO.

    • @Alan-kz6fc
      @Alan-kz6fc 8 หลายเดือนก่อน

      If you can't set a car on autopilot and it drives the car .. You've been lied to and fell for a scam.

  • @MichaelSHartman
    @MichaelSHartman 8 หลายเดือนก่อน +3

    Interesting, and informative video.
    If the driver must be there for everything, including having their hands on the wheel, and can be sued if the autopilot malfunctions, why not just drive it manually? It is an impressive novelty.

  • @awesomeink
    @awesomeink 10 หลายเดือนก่อน +35

    Keep in mind Tesla "Autopilot" is a completely different piece of software from FSD. Autopilot is their lane-keeping software with cruise control; it does not drive the car and it's not responsible for braking or lane changing. Quite a lot of cars drive into other vehicles while cruise control is on and the driver's not paying attention. Maybe they should pay attention to the facts and not the tagline. NHTSA found that 392 accidents in the US in the last 10 months involved cars using advanced driver-assistance technologies. The other 384 cars were not Teslas, so it's not just a Tesla problem

    • @kutaplex
      @kutaplex 9 หลายเดือนก่อน

      There is only one company that actively tries to pretend that their cars have full self driving capability. It is Tesla, who even puts disclaimers like 'Driver is only there for legal reasons' on their promo videos. It's a scumbag company run by a scumbag.

    • @malikkelly
      @malikkelly 9 หลายเดือนก่อน +1

      Videos like this need to make that more clear. Otherwise they could cause panic

    • @alexyooutube
      @alexyooutube 9 หลายเดือนก่อน

      Are you sure "FSD" software will recognize that police vehicle?
      I agree Cruise Control or Tesla "Autopilot" shouldn't be responsible for changing lanes.
      But it should be responsible for braking. In fact, Tesla Autopilot attempted to brake, but it was too late.

    • @ProXcaliber
      @ProXcaliber 9 หลายเดือนก่อน

      @@alexyooutube It helps when the driver is actually alert and not intoxicated. Proper use of all these systems is what is important.

  • @RichardBaran
    @RichardBaran 10 หลายเดือนก่อน +7

    God I hope they find it flawed. You cannot have full autopilot without radar (removed in the newer models) and LIDAR, and even with those it doesn't seem likely under current conditions.

    • @wr2382
      @wr2382 10 หลายเดือนก่อน +3

      That car had radar and it was being used, as was repeatedly stated in the video.

  • @Omie334
    @Omie334 9 หลายเดือนก่อน +6

    Model X 2019 did not have a cabin camera to tell if the driver is paying attention. The camera was added later. The newer version of Full Self-Driving detects lights easily and slows down drastically for emergency vehicles, which is something that AP and FSD lacked until recently. It was something that we have been complaining about for the longest time. Everyone that drives on FSD or Autopilot knows and signs off on the fact that they are in control and there are situations where you have to take control. The driver should have been paying attention and should have taken over in this case. I hope that this brings attention to the issues and flaws that are being brought up in the systems and hopefully fixes the issue. I wonder how the new hardware 4 setup would respond to this situation, especially when they enable the radar module. I am not a fan of the fact that Tesla is depending on vision only. That's a huge flaw.

  • @intheshell35ify
    @intheshell35ify 10 หลายเดือนก่อน +2

    Will never work right. You can't code every scenario and you definitely can't code anticipation of every scenario. Too many drivers doing their own thing out there.

    • @ProXcaliber
      @ProXcaliber 9 หลายเดือนก่อน

      Mine drives me to and from work just fine. Then again, I don't ever drive drunk or drink in general.

  • @Futuristbillpicone
    @Futuristbillpicone 10 หลายเดือนก่อน +1

    I am never ever trusting a self driving car.

  • @AndersonChan
    @AndersonChan 10 หลายเดือนก่อน +28

    Autonomous driving technologies are destined to improve and prevent collisions such as these in the future. For now, we still live in a world where autonomous driving is not perfect, and it should be the human driver's responsibility to monitor the system, knowing that the system isn't perfect. With the driver being intoxicated, they should not be operating a vehicle, period.

  • @jaybratab
    @jaybratab 9 หลายเดือนก่อน +6

    The more you drive, the better you drive. This also applies within a single instance of driving: responsiveness increases as you continue to drive, until you get tired. Any driver is bound to lose his alertness and responsiveness over time if autopilot keeps doing all the tasks of driving correctly. Then he will not have the responsiveness required at high speed to avoid a dangerous situation. At lower speeds he will be fine. But higher speeds require a heightened response, which requires a heightened state of body and mind.

  • @PR3M3
    @PR3M3 9 หลายเดือนก่อน

    I could never be that comfortable with any autopilot driving system.

  • @dazm901
    @dazm901 21 วันที่ผ่านมา

    In 2021, Tesla decided it would be better off without radar and abandoned the sensor. It was first removed from Model 3 and Model Y production, followed by the Model S and Model X in 2022.

  • @steveblake4187
    @steveblake4187 10 หลายเดือนก่อน +4

    I have often wondered if it's really necessary for multiple emergency vehicles at one scene to all have their lights flashing. It can be disorienting for drivers going past.

    • @levilangley8744
      @levilangley8744 10 หลายเดือนก่อน

      …. Are you being serious

  • @DigSamurai
    @DigSamurai 10 หลายเดือนก่อน +13

    The road to autonomous driving Level 5 (no steering wheel) will not be a straight line. Expecting cars to magically skip to that level is lunacy.
    Until they get there, humans are clearly responsible for safely piloting their car and should act accordingly.

    • @biomaniac22
      @biomaniac22 9 หลายเดือนก่อน +1

      Until we get there, cars with these capabilities need to be marketed appropriately.

    • @DigSamurai
      @DigSamurai 9 หลายเดือนก่อน +1

      @@biomaniac22 that is a good point but where does the responsibility of the manufacturer end and the driver begin? Tesla already has measures to ensure drivers keep their hands on the wheel, however people try to do workarounds. That's on them.

    • @biomaniac22
      @biomaniac22 9 หลายเดือนก่อน

      @@DigSamurai The marketing for "Full Self Driving" is on Tesla, however. And very much on Musk. The title literally means the car does not need a driver. There are mountains of precedent where companies are held responsible for the names of their products.
      Additionally, safety features generally undergo strenuous testing before being put on the road. There need to be stricter regulations about carmakers putting beta features on the road. The driver may have signed a waiver about the dangers of FSD, but the drivers around them did not.

  • @byrey3
    @byrey3 8 หลายเดือนก่อน +1

    The autopilot disengaged right before the crash, in an effort to let Tesla claim "the vehicle was NOT in autopilot when it crashed, it is not our fault".
    A good response would have been to change lanes and avoid the crash... except, thank god it didn't do that, because there were people standing in that lane and they would be dead

  • @seanlavery6477
    @seanlavery6477 8 หลายเดือนก่อน

    At 3:08 in the video, the Wall Street Journal's image of the cameras is wrong. The left-side and right-side cameras forward of the door mirrors, on the front fenders, actually face the back of the car, not the front.

  • @LocTran-ws2ip
    @LocTran-ws2ip 10 หลายเดือนก่อน +21

    Police cars were parked in 2 lanes. That was a problem as well. On the highway, by the time you see that, it would be too late.

    • @wensiangfong
      @wensiangfong 10 หลายเดือนก่อน +7

      Wonder why the police don’t park inside the emergency lane…

    • @candle_eatist
      @candle_eatist 9 หลายเดือนก่อน +3

      The tesla camera saw the cars from like a mile away. That's too late? Yeah I don't know about that.

    • @Anon1mous
      @Anon1mous 9 หลายเดือนก่อน +1

      What if it wasn’t a police car but a disabled car?

    • @someguy6075
      @someguy6075 9 หลายเดือนก่อน +8

      If you can't see a large obstruction until it's too late to stop, you are driving too fast for conditions, period.

    • @user-nu9qi4cg4c
      @user-nu9qi4cg4c 9 หลายเดือนก่อน +3

      If you can't see police cars blocking two lanes ahead then you shouldn't drive, period...

  • @christang2371
    @christang2371 10 หลายเดือนก่อน +5

    This happened to my dad when he was in his Toyota. Lane keep didn't see a disabled car around the corner and crashed into it at 45.

    • @smallfgb
      @smallfgb 10 หลายเดือนก่อน +2

      I was thinking this too. Have a Ford that is hands free and does worse than my hands on the wheel Tesla. If they recall Autopilot, they’d better pull them all because I don’t think any others are as good

    • @TomNook.
      @TomNook. 10 หลายเดือนก่อน +2

      Irrelevant. Why didn't your dad see the car? HE crashed into it.

    • @thefamily512
      @thefamily512 9 หลายเดือนก่อน +1

      Shouldn’t be driving, time to retire

    • @christang2371
      @christang2371 9 หลายเดือนก่อน +1

      He did see it, assumed the car would do something, but it didn't and he couldn't correct in time. @@TomNook.

    • @christang2371
      @christang2371 9 หลายเดือนก่อน

      51 years must be too old. I'm 22, dude @@thefamily512

  • @Jason.W.
    @Jason.W. 9 หลายเดือนก่อน +1

    Unpopular opinion. Why are the police cars even in the lane? The initial suspect was on the shoulder, why impede traffic and cause higher risk of hazard? Why were there no cones or lead up warnings placed? Maybe stop the practice of pulling people over on narrow shoulders, lead them to under a bridge or a rest area.

  • @eprofessio
    @eprofessio 8 หลายเดือนก่อน +1

    Why were there police cars parked in the traffic lanes? The hubris of law enforcement is amazing and appalling at the same time.

  • @pao44445
    @pao44445 10 หลายเดือนก่อน +5

    Here we go again

  • @Ixa90
    @Ixa90 9 หลายเดือนก่อน +6

    As a person who works in the engineering space, I think it's better that the police are suing Tesla; this puts a heavier emphasis on training the model than on continuing the hype around Autopilot.
    Although the driver is still responsible for the crash, this could possibly make Tesla and other self-driving models safer

    • @peterh3213
      @peterh3213 9 หลายเดือนก่อน +3

      If someone texting while driving kills someone from your family, would you expect the police to sue Apple? (Just trying to understand your logic)

    • @KarelGut-rs8mq
      @KarelGut-rs8mq 9 หลายเดือนก่อน

      @@peterh3213 The logic is quite easy to understand.
      If a person, being as drunk as this driver appears to have been, is able to get on the highway and "drive" for 45 minutes then the Tesla Autopilot helped him achieve that feat and thus it helped/enabled him to commit a crime and it's not unreasonable to hold it responsible for that.
      If a system is capable of "assisting" people doing stupid things like climbing into the back seat as has been seen on numerous youtube videos then perhaps there should be some safeguards introduced that stops that sort of misuse. Failure to ensure that misuse of the system is impossible is reason enough to hold the manufacturer co-liable for the result of the behaviour.

    • @peterh3213
      @peterh3213 9 หลายเดือนก่อน +4

      ​@@KarelGut-rs8mq the logic is quite easy to understand, but unfortunately it does not make any sense :(
      We should then sue every car manufacturer where a person could get behind the wheel and crash, because without the car being produced it would never have happened. Let's sue the state for building a highway because, you know, by this logic it enabled a drunk driver to drive fast, which would never have happened if there was a forest there like 150 years ago, before the highway was built.
      Tesla's autopilot is actually a helpful system because it lowers the probability of a crash by someone drunk, unlike other car manufacturers' systems.
      I still stand strongly behind my opinion that the person being drunk is fully responsible, not the car manufacturer. The same applies to gun manufacturers: you can pull the trigger and there is no safeguard to stop you from doing it... sorry, but you and ONLY YOU are responsible and should be sued. No car or gun manufacturer is responsible for your actions. Nor the state for building the highway, if we strictly follow the logic.

    • @mahasish
      @mahasish 9 หลายเดือนก่อน

      @@peterh3213 But Tesla claims that their cars can drive on their own, which is misleading. Check Elon's tweet "Very few people realize that Teslas can drive on their own" and the name Autopilot itself.

    • @peterh3213
      @peterh3213 9 หลายเดือนก่อน +2

      @@mahasish I don't think some tweets by a major shareholder and CEO are considered a manual ... each manual clearly states that you can't be driving from the back seat or drunk ...

  • @nicholasnapier2684
    @nicholasnapier2684 9 หลายเดือนก่อน +1

    State troopers have a program that they can plug in underneath the dash; they can see the footage of what the car sees and what it was doing up to the accident: braking, etc.

  • @thihal123
    @thihal123 8 หลายเดือนก่อน

    Let’s stop calling it autopilot in the video. Call it assisted drive mode.

  • @djalan2000
    @djalan2000 10 หลายเดือนก่อน +5

    Oh wow!! I just watched this and realized a couple of things... First, the title says "Footage SUGGESTS reasons" but doesn't say that WAS the reason... In fact, they even say the driver was alerted PRIOR to the accident and that the "although the system WORKED AS DESIGNED" it was "Not enough to sideline THE IMPAIRED DRIVER"... So, the driver was IMPAIRED, yet they didn't really make that a POINT other than to 'casually' mention that the system wasn't ENOUGH to sideline the driver... Which, means, the DRIVER hit the cars, not the autopilot system...
    Too many times you will see a car (any type) that hit that ONE lonely tree or telephone pole on the side of the road 'head on'. And you have to wonder "Didn't they have enough time to swerve out of the way?" - well, yeah, they probably did, but SOME people can get 'fixated' on the object(s) and head straight for them instead! This happens A LOT with emergency vehicles as well... The driver fixates on the LIGHTS and automatically heads right for them rather than around them...
    Too many times drivers (who survived) have said "The last thing I saw was the telephone pole" or "Last thing I remember was seeing a big tree in front of me"... This of course ends up in an accident that eventually is called 'distracted driving' or such... But in reality it's actually FOCUSED driving - but focusing on the OBJECT instead of the road and their actual driving...
    It's also why so many people will 'cross the lines' into another lane when they look at something on the side of the road... If it's on the right side, they drift right... On the left side and they drift left... While, had they just let go of the wheel (or REALLY paid attention to the wheel like they should), they would have gone straight and not drifted to one side or the other... It's a NATURAL phenomenon... Some people even 'lean' to one side as well...
    So, *I* suspect that the driver may not have even NOTICED the 'warning' on his screen and, instead just headed his car right into the 'shiny lights' OR that he saw the warning, grabbed the wheel 'slightly' like he was used to doing when getting the 'nag' warnings and then it 'turned off' the autopilot and let him ram into the pretty lights...

  • @laudnunoo1915
    @laudnunoo1915 10 หลายเดือนก่อน +24

    The humans should be held accountable. why were their eyes off the road?

  • @isipwater
    @isipwater 10 หลายเดือนก่อน +1

    "What bleeds leads" - I get it, WSJ attempting to stay relevant.
    1) No mention of how the software prevented the drunk driver from causing even more harm.
    2) No mention of how many GM's, Ford's, Toyota's, etc. are involved in similar accidents.
    3) If someone misuses ANY product while intoxicated, risk of harm is increased.
    4) No data presented about crashes per mile driven by all auto manufacturers.
    5) Drunk driver causes serious injuries but injured suing Tesla - b/c they know Tesla has the $$.

  • @ropefreeze1660
    @ropefreeze1660 7 หลายเดือนก่อน

    Keep in mind that many safety features can be turned off, like emergency braking or obstacle avoidance. Unsure why it wouldn't slow down at all though

  • @mitchellsmith4601
    @mitchellsmith4601 9 หลายเดือนก่อน +11

    We can talk all day long about how Autopilot is the most advanced self-driving technology available…but Tesla can’t get Level 3 approval, while Mercedes has Level 3 approval in at least two states already. Mercedes claims they’re liable if the car makes a mistake while their Level 3 system is engaged and working.

    • @peterhelm522
      @peterhelm522 9 หลายเดือนก่อน +4

      I don’t know the limitations of the Mercedes Level 3 system Drivepilot in the US. In Germany it is useless, because it only works on certain highways and sections.
      Speed must be below 37mph, no construction site, good weather, must follow a car. When do you have these conditions? Mercedes is good on paper!

    • @timbehrens9678
      @timbehrens9678 9 หลายเดือนก่อน

      ​@@peterhelm522 You just described 50% of traffic on the SoCal highways. Seems pretty useful to me. By the way, all the Autobahnnetz in Germany is available for DrivePilot.

    • @gabrielchow1994
      @gabrielchow1994 9 หลายเดือนก่อน +1

      Autopilot and FSD are different.
      Autopilot is merely Adaptive cruise control.
      FSD is the one that is "most advanced self-driving technology available" that you referred to.

    • @timbehrens9678
      @timbehrens9678 9 หลายเดือนก่อน +1

      @@gabrielchow1994 It is not the "most advanced technology available" because it is running cars into the emergency vehicles or causes pileups by phantom-braking.

    • @kenbob1071
      @kenbob1071 9 หลายเดือนก่อน +2

      @@timbehrens9678 newsflash: human drivers run into emergency vehicles hundreds of times/year. And the fact that a drunk driver on Autopilot runs into an emergency vehicle means that the technology isn't perfect. It doesn't mean that it's not the most advanced tech available.

  • @Thebackson
    @Thebackson 10 หลายเดือนก่อน +54

    No mention of how many vehicles without autopilot hit emergency vehicles every year? Let me give you a hint, the answer is in the thousands. It isn't an autopilot problem, it's a driver problem.

    • @AshTheNimbat
      @AshTheNimbat 10 หลายเดือนก่อน

      Yet newer Tesla vehicles pretty much depend on the cameras rather than sensors that could've at least helped it swerve to the side.

    • @WinstonsGarage
      @WinstonsGarage 10 หลายเดือนก่อน +1

      Facts!

    • @Cassp0nk
      @Cassp0nk 10 หลายเดือนก่อน +1

      The cope is hard.

    • @Thebackson
      @Thebackson 10 หลายเดือนก่อน +1

      @@AshTheNimbat If you were up to date, the reason they removed the other sensors is that multiple types of sensors can provide conflicting information, and the software then does not know which one to believe. Relying on the cameras the way we rely on our eyes has proven to be the better solution. But there are over 3000 accidents of this type every year without any driver-assistance features. The problem is the drivers. The software is not perfect but it is better and getting better every day. This driver being intoxicated was the problem. Autopilot and FSD will eventually be the solution to drunk drivers.

    • @kurtphilly
      @kurtphilly 10 หลายเดือนก่อน

      @Thebackson How does the number of vehicles that hit emergency vehicles without Autopilot matter in this situation? The whole point of these types of systems, as they are marketed, is to prevent accidents. As you note, this is a quite common situation, one that should be a high priority in the design of this system. This is a driver and Autopilot problem because one relies on the other. They aren’t mutually exclusive.

  • @jacobcastro6398
    @jacobcastro6398 9 หลายเดือนก่อน +2

    FSD and AUTOPILOT are in leagues of their own. So make it known there’s a difference rather than putting them in the same category. They do the same thing and drive, but one is more advanced

  • @David-xl9cp
    @David-xl9cp 4 หลายเดือนก่อน

    I find it very scary that there are cars out there driving themselves, when the technology is not there yet.

  • @smallfgb
    @smallfgb 10 หลายเดือนก่อน +18

    Funny how Tesla is called out so much. I own a Ford with Blue Cruise. That system is not as good at staying in lane and recognizing even more standard dangers, yet requires no hands on the wheel unlike AutoPilot. My Tesla drives WAY better and also requires I pull on the wheel a few times a minute to ensure I’m paying attention. I closely supervise both systems at all times to avoid dying, but if they are considering pulling Tesla’s system there are many not as good aids out there that don’t require as much input as Tesla.

    • @michaelc7014
      @michaelc7014 9 หลายเดือนก่อน +1

      There's definitely a bias out there

    • @98f5
      @98f5 9 หลายเดือนก่อน +1

      Audi / VAG has a great system: requires hands on the wheel and actually stops for stopped objects. But yeah, when you're behind the wheel it seems so obvious to pay attention and not trust some glorified cruise control

    • @deamon5959
      @deamon5959 8 หลายเดือนก่อน

      Good point - probably because there are a lot of Teslas

  • @willardchi2571
    @willardchi2571 9 หลายเดือนก่อน +6

    How in the heck could Tesla not have tested their systems for the detection of stopped emergency vehicles with flashing lights, and have done so in all types of weather conditions and times of day?
    When airliners began to incorporate autopilots of increasing capabilities such that the human pilot was left with almost nothing to do except to monitor the autopilot, the human pilot's attention would frequently wander and the result was more plane crashes instead of fewer. The result was a scaling back of autopilot capabilities so that the human pilot would still have enough to do to keep his/her attention from wandering.

    • @541er
      @541er 9 หลายเดือนก่อน

      Airline pilot here. That is actually not true regarding scaling back of autopilot capabilities

    • @kenbob1071
      @kenbob1071 9 หลายเดือนก่อน

      It's the driver's responsibility to take control in situations like this. It's like following GPS instructions to drive into a lake. Sometimes you have to use your brain for critical thinking.

    • @willardchi2571
      @willardchi2571 9 หลายเดือนก่อน

      @@kenbob1071 - Pilot inattention caused more crashes than automating airplane flight to the point that pilots had nothing to do but "monitor"--the FAA found the only solution was to revert to having the pilots actually go back to being more hands on in flying the plane to keep pilots attentive.
      This was learned some time ago (because it was easier to automate flight than it was for automating driving on the ground).

  • @anhhoanginh4763
    @anhhoanginh4763 5 หลายเดือนก่อน

    The "auto pilot" feature didn't recognize the obstacle "just 2.5 sec and 37 yards before the crash".
    Do we have any regulation for this case?

  • @AntanWilson
    @AntanWilson 10 หลายเดือนก่อน +49

    Let's not blame the car because he was drunk. The system probably prevented 85 other crashes he could have caused in his 45 min trip.

    • @xandr13
      @xandr13 10 หลายเดือนก่อน +10

      Excellent point.

    • @hightechjoe1
      @hightechjoe1 10 หลายเดือนก่อน +1

      However, the system might have also enabled him driving while drunk many times before. Unfortunately, short of breathalyzer test no system may have reliably prevented the driver from operating the vehicle.

    • @ryantakahashi9130
      @ryantakahashi9130 10 หลายเดือนก่อน

      ​@@hightechjoe1 True, but the system could do something like prevent drivers from even using autopilot at all if they trigger a certain amount of prompts in a short period of time. Consistent failure to follow the rules means you aren't paying attention so you don't get to use the feature. It would at least help some, even if it doesn't completely solve the problem. At least then the drunk driver has to decide, do they risk it and try to do it themselves? Or do they make the better decision to just park and sleep it off? Would have been better if this person just crashed into a median in the first 5 minutes under their own control than at freeway speeds into a group of people. Not to mention if the autopilot wasn't hiding the fact that they were swerving without it, people might have just called 911 to report a drunk driver. I've called it in a bunch when I noticed really really drunk or erratic drivers.

    • @Screwycummings
      @Screwycummings หลายเดือนก่อน

      @@hightechjoe1 I think every car in the future should require a breathalyzer test to start the car. I know people will complain about it at first, but it will become a standard feature just like seatbelt did.

  • @johnmanderson2060
    @johnmanderson2060 10 หลายเดือนก่อน +12

    This is what happens when you have vision-only FSD; with LiDAR this would never have happened.

    • @thefamily512
      @thefamily512 9 หลายเดือนก่อน +3

      thanks for your expertise, can you teach us about rocket engines and spacex too?

    • @mujjuman
      @mujjuman 9 หลายเดือนก่อน +2

      This Model X had radar too

    • @lacikeri3102
      @lacikeri3102 9 หลายเดือนก่อน +3

      This type of accident is typically the fault of radar-based systems. It's just that if it doesn't happen with a Tesla, they don't make a case out of it, and other companies typically don't even record the case; they don't have data like Tesla. But anyone who deals with this sort of thing knows that the kryptonite of any such emergency brake assistant is the non-moving obstacle. Think about it: it would be quite a problem if you always braked for poles or trees on the side of the road. This has been fixed pretty well with the visual system. My Tesla still has radar, but it became much more accurate and better when it was switched to the visual system.
      In heavy fog or rain, it can clearly see better than I do. Besides, if it can't see, it slows down. (Lidar does not work in the rain)

    • @johnmanderson2060
      @johnmanderson2060 9 หลายเดือนก่อน

      @@mujjuman You are right, my bad. But is it Radar or LiDAR ?

    • @johnmanderson2060
      @johnmanderson2060 9 หลายเดือนก่อน

      I am still amazed that a simple color analysis of flashing lights doesn’t arrive at the conclusion that the car should slow down ( IF BRIGHT RED AND BRIGHT BLUE FLASHES ALTERNATE => SLOW DOWN ). You don’t even need high-level object recognition, JUST COLOR PATTERNS in the video feed. As humans we do it every day behind the steering wheel. Maybe FSD is colorblind except for traffic lights. Or color isn’t considered important in the recognition process.
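      The heuristic this commenter describes can be sketched as a toy in Python. This is purely an illustration of the idea, not Tesla's actual pipeline: the function names and thresholds are invented, and video frames are reduced to hypothetical average (R, G, B) tuples.

```python
# Toy sketch of the commenter's idea: treat alternating red/blue color
# dominance across video frames as an "emergency lights, slow down" signal.
# Frames are simplified to average (R, G, B) colors; thresholds are invented.

def dominant_channel(rgb, margin=40):
    """Return 'red' or 'blue' if that channel clearly dominates, else None."""
    r, g, b = rgb
    if r - max(g, b) > margin:
        return "red"
    if b - max(r, g) > margin:
        return "blue"
    return None

def emergency_lights_detected(frames, min_alternations=3):
    """True if red and blue dominance alternate enough times in sequence."""
    alternations = 0
    last = None
    for frame in frames:
        color = dominant_channel(frame)
        if color and color != last:
            if last is not None:
                alternations += 1  # a red->blue or blue->red switch
            last = color
    return alternations >= min_alternations

# A strobing light bar alternates red and blue; a grey road scene does not.
flashing = [(220, 30, 40), (25, 20, 230), (210, 35, 50), (30, 25, 225)]
steady = [(120, 120, 120)] * 4
```

      A real system would of course work on pixel regions, handle exposure and weather, and fuse this with object detection; the point is only that the alternation pattern itself is cheap to test for.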

  • @oneryanalexander
    @oneryanalexander 9 หลายเดือนก่อน

    Can you imagine how poor Tesla Autopilot is, to not recognize stationary vehicles?

  • @derekndosi
    @derekndosi 10 หลายเดือนก่อน +2

    The rest of us use the metric system; yards mean nothing to us. Put metric measurements alongside the imperial ones

  • @Emanuel-jr2ii
    @Emanuel-jr2ii 10 หลายเดือนก่อน +55

    I think the same would have happened if the driver had not engaged autopilot. The driver was clearly not paying attention at all. That's on top of the fact he or she was intoxicated... Imo, the driver is at full blame here. The autopilot software is in beta and should be used as one.

    • @Sam-cl1ow
      @Sam-cl1ow 10 หลายเดือนก่อน +8

      Why is beta self-driving software a feature in cars that are on the road? There are clearly some issues with the software, so it should not be available to everybody. Of course the driver is at fault, but that also doesn't mean that we should excuse Tesla for allowing drivers to use a buggy piece of software that is capable of killing people.

    • @carlosrodriguez1958
      @carlosrodriguez1958 10 หลายเดือนก่อน +13

      @@Sam-cl1ow wrong, people who don't use the feature correctly (i.e., don't pay attention while driving 🙄) are at fault, not the company that also explicitly says you must pay attention while driving. Tesla is not dishing out a “buggy piece of software capable of killing people”. People are killing people, abusing software that is not capable of driving autonomously YET. And then people blame it on the company; this bizarre attempt to shift responsibility just blows my mind

    • @Closer-than-it-appears
      @Closer-than-it-appears 10 หลายเดือนก่อน +3

      @@carlosrodriguez1958 Tesla cannot contract out of its liability to a 3rd party via an agreement with its customer. It is foreseeable that someone would use or misuse the autopilot in circumstances such as these, rendering the vehicle dangerous to someone not party to the agreement. Tesla must therefore take steps to ensure the system can safely bring the car to a stop when its customer does not comply with its limits. To do otherwise is to expose innocents to risks they did not incur, exposing Tesla to claims of negligence. Liability will likely be apportioned between Tesla and the driver.

    • @Cryaboutmyhandle
      @Cryaboutmyhandle 10 months ago +2

      Imagine letting a beta test run wild, risking others' lives.

    • @Cryaboutmyhandle
      @Cryaboutmyhandle 10 months ago +2

      @@carlosrodriguez1958 it's called autopilot, is it not?

  • @blackspiderman1887
    @blackspiderman1887 10 months ago +26

    The driver is at fault in that particular case. You're supposed to move over when you see emergency vehicles.

    • @ProXcaliber
      @ProXcaliber 9 months ago

      Just for that reason? Not because he/she got in a car while intoxicated in the first place?

    • @L2002
      @L2002 9 months ago

      Why didn't the Tesla change lanes? Also, they replaced sensors with cameras; why not keep both?

    • @ProXcaliber
      @ProXcaliber 9 months ago

      @@L2002 Does your average car with lane keep assist or adaptive cruise control change lanes automatically? Because that’s what this is.

  • @bradleywilliambusch5198
    @bradleywilliambusch5198 9 months ago

    That would be a problem every driver faces because that's what those lights are meant to do. Because of how long the accident is, it's hard to work out if those cars are leaving the scene.

  • @jenkinsmatthew
    @jenkinsmatthew 29 days ago +1

    The background music in this video is really annoying.

  • @vangelissotiropoulos7365
    @vangelissotiropoulos7365 10 months ago +8

    In what world are emergency vehicles allowed to block an entire highway lane with just their hazards on and no one stopping oncoming traffic!!
    Even if it hadn't been the Tesla, another vehicle could just as easily have crashed into them..

    • @evoluxsion7339
      @evoluxsion7339 10 months ago +1

      It's common in Texas to see badly trained law enforcement creating dangerous situations.

    • @netj
      @netj 9 months ago

      any sensible human driver would slow down and/or change lanes to avoid most hazards, but we'll have to train machines from scratch to do the same via fatal accidents from countless trial and errors like these... by the time we complete the technology, I'm afraid machine's track record will never be better than all human drivers' safety records

  • @battulga91
    @battulga91 9 months ago +3

    Can emergency vehicles not block the road when possible? Especially when the visibility is reduced.

  • @Mark-kt5mh
    @Mark-kt5mh 10 months ago +1

    Tesla's black box now also records from a cabin camera facing the driver.

  • @nicholasnapier2684
    @nicholasnapier2684 9 months ago

    It's not like a drone, where you have open airspace. Even that is subject to problems, and it happens in open airspace, but at least there is someone on the other end of that camera who can see what it's near on the ground..

  • @benjaminehrenberg
    @benjaminehrenberg 10 months ago +51

    I'm still waiting for my Tesla autopilot update which can turn it into a “Robo-Taxi” and make me $30,000, which Elon Musk claimed he'd deliver in 2019.😂

    • @rj8u
      @rj8u 10 months ago +5

      You wish 😂 You probably can't afford $15k for the FSD 😂😂

    • @danielreid6823
      @danielreid6823 10 months ago

      @@rj8u Well, you can get it online from G2A for $12.99. Mark 2702 or Rich Rebuilds

    • @wr2382
      @wr2382 10 months ago +6

      I doubt that you own a Tesla considering you don't seem to know the difference between Autopilot and FSD.

  • @mariovenenno
    @mariovenenno 10 months ago +19

    He was drunk… why are we even talking about Tesla? This entire video makes no sense. “HE WAS DRUNK”

  • @richardw3470
    @richardw3470 2 months ago

    There are numerous rear end crashes of both drunks and non-drunks driving non-autopilot cars. Drunks because they're drunk and non-drunks because they're texting or otherwise inattentive to their driving.

  • @issiewizzie
    @issiewizzie 10 months ago +1

    Good work wsj

  • @TomTom-cm2oq
    @TomTom-cm2oq 10 months ago +15

    How is technology that is designed to assist you, flawed? And how is “flawed” different from misuse of a system that CLEARLY tells you to pay attention at all times?

    • @arcadion448
      @arcadion448 10 months ago +1

      Did you even watch the video? At 5:25: drivers placed tape over the internal camera. That's a flaw right there.

    • @paladro
      @paladro 10 months ago +2

      dude, an autopilot that isn't an autopilot shouldn't be an option... tesla is beta testing on public roads and claiming no responsibility for its own ceo's words and marketing.

    • @TomTom-cm2oq
      @TomTom-cm2oq 10 months ago +3

      @@arcadion448 you can use a fake seatbelt clip to trick the car into thinking that you’re buckled and stop beeping at you while you drive around without a seatbelt. Lots of taxi drivers have them.
      Do you think that’s a flaw?

    • @arcadion448
      @arcadion448 10 months ago +2

      @@TomTom-cm2oq The fact you have to even ask shows how out of touch you are. Yes, that's a design flaw.

    • @61rampy65
      @61rampy65 9 months ago

      Wouldn't you call technology that can't recognize a stationary vehicle in the road flawed? Sounds like you think hi-tech will fix everything.

  • @th0master
    @th0master 10 months ago +15

    So now we’re blaming Tesla instead of the DRUNK driver? If the driver wasn’t drunk and was paying attention, this would’ve never happened.

    • @mattbrinkerhoff3229
      @mattbrinkerhoff3229 10 months ago +2

      Kinda funny how that works. Same thing as blaming a gun lol

    • @ironymatt
      @ironymatt 10 months ago

      I'm fine with blaming both. Generally speaking, the people who design driver-assist features are well aware of their potential side effect of decreased driver attentiveness. This is a problem of too much tech, and there comes a point where the solution of even more tech suffers diminishing returns.
      Good driving requires sound judgment; both need to be taught, and that dynamic is compromised the more the car does everything for the driver. This system, much like the drunk, should never have been on the road.

    • @th0master
      @th0master 10 months ago

      @@mattbrinkerhoff3229 agreed, we shouldn’t blame guns, we should blame the ones using them, aka more gun restrictions!

    • @evoluxsion7339
      @evoluxsion7339 10 months ago

      @@mattbrinkerhoff3229 👏👏👍

    • @evoluxsion7339
      @evoluxsion7339 10 months ago

      @@th0master LOL doesn't matter how much ink on a piece of paper you have. Bad people will always break rules and find a way to do it.

  • @Phat737
    @Phat737 9 months ago

    The guy in TX was drunk. That should eliminate this particular accident from any study of autopilot accidents.

  • @suprfli6018
    @suprfli6018 9 months ago

    I hate the sheer lack of accountability

  • @jeffchandler6285
    @jeffchandler6285 10 months ago +21

    It's a two-part system: the car did what it could; the drunk driver failed and thus should be charged for the crash.

    • @danielreid6823
      @danielreid6823 10 months ago +2

      Wait.. the "autopilot" was drunk?

    • @jeffchandler6285
      @jeffchandler6285 10 months ago

      @@danielreid6823 😂🤣😂 no the driver that car gave so many are you dead? alerts to.

    • @danielreid6823
      @danielreid6823 10 months ago

      Respectfully, it seems like you're drunk "no the driver that car gave so many are you dead"

    • @jeffchandler6285
      @jeffchandler6285 10 months ago

      @@danielreid6823 nope just a bit early in the am, inadvertently worked through the night. Fortunately no driving involved.

  • @danielforget9311
    @danielforget9311 9 months ago +3

    They don't mention the lives saved by automation (not the self-driving mode); the internet is loaded with clips of Teslas swerving to avoid stopped cars before the driver reacts.

    • @HiddenAgendas
      @HiddenAgendas 5 months ago

      That's stupid. You know how many times everyday drivers avoid accidents as well w/o the help of the POS beta system Tesla released. lol

  • @Forbes123
    @Forbes123 9 months ago +1

    Yet Tesla decided to remove the radar from the vehicle and instead rely on cheaper cameras for autopilot.

  • @TheRealVranesh
    @TheRealVranesh 3 months ago +1

    I don't own a Tesla, nor am I planning to buy one after I rented one for a long road trip. This accident is squarely on the driver. These systems are auxiliary systems; the driver is still responsible.

  • @rj8u
    @rj8u 10 months ago +20

    So wait 🫣 we're blaming autopilot for the drunk driver 🙈. What society and mainstream media have become is ridiculous. Autopilot is a glorified cruise control, period!!

    • @RichardBaran
      @RichardBaran 10 months ago

      Blaming Tesla for pretending “autopilot” is, well, autopilot, regardless of the driver. Remember robotaxis: “it would be financially insane to buy anything but a tesla”

    • @rj8u
      @rj8u 10 months ago +1

      @@RichardBaran The biggest flaws of the free Tesla autopilot system are that it does not react to traffic lights, stop signs and emergency vehicles, because Tesla wants people to shell out $15k to get those features. Federal regulators should mandate that Tesla include these features in the current Autopilot, given drivers' tendency to misuse the system and cause accidents.

    • @archigoel
      @archigoel 10 months ago

      @@RichardBaran both are separate matters.

  • @marchlopez9934
    @marchlopez9934 10 months ago +6

    The National Highway Traffic Safety Administration (NHTSA) is investigating 16 crashes involving Tesla's Autopilot system and emergency vehicles to determine whether the technology contributed to the accidents. The latest incident occurred on 27 February in Montgomery County, Texas, when a Tesla Model X driving in Autopilot mode hit a police vehicle at 54 mph, injuring five officers and hospitalising the subject of a traffic stop. The car's Autopilot system failed to recognise the stationary emergency vehicles in time, and while the driver monitoring system worked as designed, it was not enough to prevent the collision. The driver was drunk and had set the car to Autopilot mode four minutes after beginning his journey. Tesla's Autopilot system partially automates highway driving tasks, and drivers using Autopilot are supposed to remain engaged so they can take control of the car at any time. Federal investigators have said that Tesla's marketing exaggerates the technology's capabilities and encourages drivers to misuse it. Tesla denies that the Autopilot feature was responsible for the accident.

    • @Iahusha777Iahuah
      @Iahusha777Iahuah 9 months ago +1

      Will they treat Ford and GM and others the same way?

    • @ProXcaliber
      @ProXcaliber 9 months ago

      My Acura does all the same things as this, does this mean I can blame Acura when I get into a car accident because of it?

    • @marchlopez9934
      @marchlopez9934 9 months ago

      @@Iahusha777Iahuah The National Highway Traffic Safety Administration (NHTSA) investigates incidents involving all automakers to ensure safety standards are upheld. The focus on Tesla's Autopilot system is not exclusive, and any similar incidents involving other automakers would likely undergo similar scrutiny. The NHTSA's role is to evaluate safety concerns across the industry, irrespective of the manufacturer, to enhance overall road safety standards.

    • @marchlopez9934
      @marchlopez9934 9 months ago

      @@ProXcaliber Blaming the automaker entirely for accidents related to driver-assistance systems like Tesla's Autopilot or Acura's similar features is not the right approach. These systems are designed to assist drivers, not replace them, and their safe use relies on the driver's active engagement and adherence to safety guidelines. In accidents involving such systems, investigations typically consider driver behavior and adherence to safety instructions alongside the performance of the technology itself. The NHTSA's scrutiny aims to determine whether the technology played a role in the accident and to promote safer use of these systems across the industry.

  • @uswwt
    @uswwt 9 months ago

    On the bright side, we need to give Autopilot some credit for driving the drunk driver safely for 45 mins 😅

  • @rgrgraterol
    @rgrgraterol 10 months ago +5

    Intoxicated driver… If I see my Tesla not braking ahead of traffic congestion, I intervene, so this is mostly the driver's fault. But I fully agree the marketing is misleading and autopilot is far from perfect, so I'm happy to see legal action starting.

    • @jeffsteyn7174
      @jeffsteyn7174 9 months ago

      Yet they call it Full Self-Driving.

  • @icomxwing42
    @icomxwing42 10 months ago +46

    Having had two teslas with FSD, FSD beta is getting better, but is nowhere near being trustworthy. Under ideal conditions it usually works on the interstate but on anything else it drives like a 16 year old kid. I personally have seen autopilot freak out and disable with no warning. I suspect the many accidents Tesla says had autopilot disabled are due to this behavior.

    • @logitech4873
      @logitech4873 10 months ago +1

      Tesla counts AP accidents as long as AP was engaged within 5 seconds of the accident.

    • @CameronsCookingChannel
      @CameronsCookingChannel 10 months ago +3

      I think it's worse than a 16 year old kid. More like a 10 year old... I've had mine drive over curbs, drive between TWO lanes straddling the line, and make a sudden left turn at 25 miles per hour... I am too scared to use it now.

    • @icomxwing42
      @icomxwing42 10 months ago

      @@CameronsCookingChannel and the same system controls regular cruise control, windshield wipers, and headlights.

    • @kenbob1071
      @kenbob1071 9 months ago +1

      I suspect that many accidents were caused by negligently inattentive humans behind the wheel.
      P.S. I've never had AP or FSD disable without warning. It always beeps very loudly and says to take control before it disengages. In fact, it rarely disengages.

    • @marhanen
      @marhanen 7 months ago

      @@icomxwing42 factually untrue

  • @Runnifier
    @Runnifier 8 months ago +1

    I hate new and old cars. I want a comfortable and safe car that is controlled entirely and only by me. I also hate having cameras and microphones in a car. Looks like I’ll be building my own.

  • @joop1991
    @joop1991 9 months ago

    Why are they suing Tesla? The driver is responsible for the vehicle; he even acknowledged a warning to apply force to the wheel (3:56). He should have been looking at the road at that moment (at any moment, to be completely fair).
    It's insane that people just trust those self-driving vehicles even though you're always responsible as the driver.

  • @TesserId
    @TesserId 9 months ago +4

    I live in one of the states that requires that drivers slow or change lanes for emergency vehicles on the road. Since autopilot does not do this, there should be serious questions about how the law should handle this. My bet is that in our plutocracy, all burden will be put on the driver. So, how many more have to die?

    • @SmigGames
      @SmigGames 8 months ago

      This is essentially cruise control and lane assist, the driver still needs to be aware and make decisions about what's legal and whatnot. A fully autonomous system on the other hand, would need to take that into account.

  • @mustolourien5823
    @mustolourien5823 10 months ago +5

    Tesla is only targeted because it's monetarily viable to sue, but in a real sense the driver is 100 percent responsible.

    • @logitech4873
      @logitech4873 10 months ago

      @@callummcmac4079 Autopilot and FSD are separate. The system is extremely clear about the fact that you need to keep hands on wheel and eyes on road at all times.
      Have you ever driven a Tesla with autopilot? Maybe you should, because you don't seem to speak from experience.