The Hidden Autopilot Data That Reveals Why Teslas Crash | WSJ

  • Published Jan 18, 2025

Comments • 7K

  • @moken79 months ago +3140

    Tesla already lost that lawsuit in the EU for misleading marketing by calling it Autopilot. In Europe it's seen as glorified cruise control.

    • @georgepelton5645 months ago +187

      Autopilots are just glorified cruise control. No different in aircraft or cars. You can't just put your RV on cruise control and go into the back and fix yourself a drink.

    • @Katrina-mx2sf months ago +76

      People are napping in their front seat, so crazy.

    • @jdm1152 months ago +24

      I agree with you, but the timing makes this a WSJ hit piece on Musk. The question I asked myself is: did Musk try to give humans one last benefit of the doubt before catering to the Darwin award recipients? Also, people can be petty. If they can blame the car before an internal assessment of their own logic, they will.
      Overall, this report should include each individual's driving record. But more than that, it should be an analysis of what type of person would set Autopilot, climb in the back, fall asleep, and…

    • @Triple_J.1 months ago +30

      Autopilot literally is just cruise control in 3D. It holds altitude, heading, and bank angle. The jet will fly until it runs out of gas. The engines will stop, pressurization will be lost, and everyone will succumb to the reaper within minutes, unless the jet descends to a safe altitude quickly enough with its engines now inoperative.
      Auto-land is a much more sophisticated autopilot. It tracks the ILS as a pilot would to land in fog, clouds, and rain. But it cannot usually take you where you need to go. Perhaps from lineup on the runway to touchdown, but certainly not taxi to the gate.

    • @malk6277 months ago +112

      @@jdm1152 The timing makes it a hit piece against Musk? How so? Aren't journalists SUPPOSED to report on issues that are current and relevant? Trump hiding accident reports for Musk seems like the ultimate in current relevance, wouldn't you say? The hit piece, it would appear, is your post against WSJ.

  • @3dus months ago +2962

    CAMERAS ARE NOT EYES. I work in the film industry. Camera sensors have far worse dynamic range than human eyes. This means they won't gather light from dark environments the way our eyes do; they struggle to pick out lights (stops/cars) and objects in the dark. Note how many of these crashes happen at night/dusk.

    • @ChristianLaurinE months ago +115

      THIS!!! People do not seem to grasp that.

    • @nathanscandella6075 months ago +160

      People's eyes fail them constantly as drivers, too. Not to mention that we only have two of them, spaced fairly closely, on only one side of our head. Teslas have 8 cameras, on all 4 sides of the vehicle.
      Yes, the dynamic range issue matters, but again, there are multiple ways Teslas' eyes are better than humans' eyes.
      The bigger issue is that Tesla has the opportunity to augment those electronic eyes with all kinds of other things humans don't have, like radar, lidar, and ultrasonic sensors, but Musk has stubbornly resisted relying on them, leaving Teslas with ONLY those 8 eyes in too many scenarios.

    • @vonnikon months ago +180

      And critically: the depth perception of stereoscopic cameras is garbage compared to the depth perception of our eyes.
      Which is why Teslas crash into overturned trailers. The cameras are unable to reliably detect the position of the trailer, so the car can't tell the difference between it and a dirty road surface.
      LIDAR fixes that.

    • @pm7753 months ago +35

      They should have equipped all the exterior cameras with IR. That their engineers didn't pursue this is a blatant disregard for safety.

    • @Intelligencia months ago +89

      @@nathanscandella6075 Two human eyes and the human brain are far superior to 8 cameras and Tesla's AI. The problem is that Tesla's machine-learning systems can't model every potential situation. While the human mind knows a truck on its side is a threat and that one should brake, the Tesla didn't know that, and there are a million other scenarios that a Tesla won't ever be able to understand. Musk chose not to include lidar because it was too expensive, and it can't be retrofitted into most Teslas now. Musk thought he could BS his way through this, and he probably won't be able to.

  • @billyk5285 months ago +3703

    The name Autopilot is what's misleading people. It should be called driver assistance and should be marketed as such.

    • @rick4580 months ago +158

      Doesn't help that Musk promised Full Self-Driving and Robotaxi.

    • @melo0115 months ago +77

      Exactly. I'm sure most of these Autopilot crashes could've been prevented if the driver had stayed aware. I have a Tesla, and I always drive it like I don't trust the Autopilot.

    • @giacintoboccia9386 months ago +27

      As many other automakers do, but that doesn't make you the richest person in the world.

    • @rylie4495 months ago +8

      @@rick4580 He might have promised it, but he never stated it was 110% READY. That's the problem: things get fed to us, and we call it misleading when in reality the meaning was never fully broken down. With "Autopilot", at the very start we all think fully autonomous, but in the real world it means a function that pilots something automatically, and going deeper, the word also implies it is not 100% capable of controlling that something on its own; it can only do so much.

    • @kaba_me months ago +27

      Err... but by definition, Autopilot means "pilot assist".
      Also, lidar is not an appendix. Elon may as well argue that dolphins don't need their sonar.

  • @livelongandtroll9108 25 days ago +729

    02:10 *Elon is so confident in his tech that Trump's transition team has already recommended scrapping this requirement that companies report automated-vehicle crash data.*
    Great job America.

    • @markus1351 24 days ago

      This election was their endgame plan to stay out of jail

    • @viviand9493 24 days ago +25

      I wish I could leave

    • @EnjoyingMyTimeOnThisPlanet 24 days ago +40

      It's the people's fault for fighting over red and blue like we're in a gang. Right wing, left wing, when both wings belong to the same stupid bird. As long as Americans keep fighting over the lesser of two evils we will keep getting these results. What else did you guys expect? We need to get both parties out of our government and introduce a new government that is controlled by the people, where people introduce policies and Americans vote on them online.

    • @ProMace 23 days ago

      @EnjoyingMyTimeOnThisPlanet I agree with the general premise of ‘both sides’ exhibiting corruption, but 1) there is a *massive* difference in the extent and 2) MAGA goes *way* beyond ‘normal’ corruption, it’s pure fascism staring people in the eye. If you want to get money out of politics and install true representatives: great. But given the current state of affairs, I think that ship has sailed for good.

    • @edmurth 23 days ago +51

      The oligarchs are in charge now, but don’t worry it worked out really well for people in countries like Russia and North Korea…

  • @ev.c6 months ago +2156

    Now that Musk has bought his way into the government, everyone looking into regulating his companies is going to have a hard time.

    • @PhatPazzo months ago +120

      Regulation? They can be made more efficient by not doing it.
      - Enron Musk (probably)

    • @polygonalfortress months ago

      No incentive for companies to make good, safe, and competitive products with Musk in the government; sounds anti-capitalist and anti-competition to me ngl.

    • @slickshewz months ago

      He's not in the government, and will only be an independent advisor. He won't have any more sway over legislation than any other multibillionaire that can buy off lawmakers.

    • @Zeneroth months ago +35

      @@PhatPazzo Lol, I was actually thinking the opposite: more deregulation for his companies. He probably lobbied Trump and the Republicans.

    • @FighterFlash months ago +12

      Twitter/X objects to these comments

  • @chrisroode months ago +608

    From watching Mentour Pilot, I have come to realize that autopilot in aviation has a completely different meaning than what it means to the general population.

    • @skycaptain3344 29 days ago +135

      Pilot here, yep. The public looks at aircraft autopilot as a navigation system that does it all, not recognizing it has to be programmed with a very specific plan, has a couple dozen combinations of operating modes, and has to be managed.

    • @termitreter6545 28 days ago +47

      I think pilots don't even call it autopilot, or at least that's not what the manufacturers call it. Pilots get extensive training on what the system can and can't do.
      Big difference from Teslas, considering how Elon constantly talks about self-driving cars. Back in 2015 he said fully self-driving cars were just 2 years away; it's no secret why he called it Autopilot.

    • @CmdrTobs 28 days ago +17

      @@skycaptain3344 That is what Tesla claims Autopilot to be. Hence the distinction between Autopilot and "Full Self-Driving (Beta)".

    • @skycaptain3344 28 days ago +11

      @@CmdrTobs as the video shows, that’s not what Elon Musk has actually been saying

    • @adissentingopinion848 27 days ago +24

      EVEN if you have a lofty expectation of aviation autopilot, there is a distinct advantage in not having wrecked shipping trucks in the sky!

  • @sapier months ago +2974

    And that's why Elon wants to stop the requirement to report crash data.

    • @SmallSpoonBrigade months ago +116

      And we, the people who drive, have a right to know whether other vehicles on the road are safe to operate in that mode. Especially given that the rules about whom to sue when there is a crash aren't really clear.

    • @c.e.robinson1100 months ago

      Tesla has been late/dishonest with every launch of technology. Cybertruck = junk, Solar Roof = junk/lies, new Roadster = ha ha ha. "Oh, it was supposed to land in the ocean." People are fascinated with these crooks that become billionaires, who lead you to believe that you're part of some inner circle and will all be better off. It's just a shame that they're all now running the country because younger people are too stupid not to vote as if everything is a popularity contest. But when a Tesla runs them off the road and kills their family, they can always send a message to Elon on X.

    • @mich8411 months ago

      These are 2021 reports; why doesn't WSJ show 2024?
      Right, because they are dumb and have agendas.

    • @martontichi8611 months ago +99

      Trump will let him

    • @mich8411 months ago

      @@martontichi8611 That's how safe Teslas are: WSJ had to go back 3 years to find a talking point in 2024. Tesla Autopilot is 10X safer than human drivers. You're all NPCs, I mean, who believes the mainstream media?

  • @DD-wu7xq 26 days ago +236

    After a few phantom events where it tried to kill me and my family on the M5 in the UK, I now refuse to use it. I actually feel stupid for paying thousands for the full package.

    • @DiegoAntonioZárateAlvarez 22 days ago +10

      I'm glad you're ok and didn't leave the car to drive itself

    • @mhdivito 22 days ago +25

      I'm glad you and yours are ok, and no, you aren't stupid; you were deceived, and so were many, many others.
      We pay for things with the expectation we will get what we paid for.
      Shame belongs to those that profit from deception.

    • @RobertTaylor 21 days ago +20

      We have two 2023 Model 3s. I used FSD extensively from the 12.3.6 release and stopped using it during the fall due to several near collisions that could have been devastating; one was a drift by FSD over the double-yellow lines on a blind curve on US Hwy 50 west of Tahoe. In that case I was able to take over and return to my lane before meeting oncoming traffic. But the next time … not worth finding out.
      FSD is amazing until it isn't.

    • @erwile 20 days ago

      @@DD-wu7xq Yeah, ADAS are not fully autonomous. And outside of the US it's far worse.

    • @i2Them2 19 days ago +10

      Paying that kind of money for mediocre adaptive cruise control combined with lane-keeping assist is crazy....

  • @BaldurvanLew 28 days ago +171

    "Go fast and break things" sounds like a bold and dynamic business model until you're the thing that's broken.

    • @Steyr6500 25 days ago +9

      That's fine for an unmanned Starship, not so much when a human is onboard

    • @philprof 15 days ago +1

      Perfectly stated!

  • @freshjulian1 months ago +778

    Tesla should be forced to make all crash data public

    • @4lxAnd3r 29 days ago +20

      The problem is that not everyone is able to interpret it correctly, so there would be a lot more misleading stuff out there.

    • @larryc1616 29 days ago +28

      He's petitioned to stop Tesla crash-data reporting when Trump takes office.

    • @JaayTea2595 29 days ago +21

      It should just be an overarching law that this information be accessible to everyone, regardless of manufacturer.

    • @freshjulian1 28 days ago +12

      If companies like Tesla put automated systems in place, they should record the data (as they do). And it should be mandatory that other organisations or the public can look into the data, so the customer knows how many crashes, and what kinds of crashes, are caused by these systems.

    • @BeHappyByBike 28 days ago

      Okay, but thousands die every day from normal cars. Why is no one using this data to reduce our dependence on cars in general? We should not be killing our citizens by forcing everyone to drive.

  • @danisyx5804 months ago +316

    Elon wants to stop this reporting; I wonder why.

    • @rogerwilco2 months ago +11

      And on January 20th, he's going to get everything he wants.

    • @lovegod1000 28 days ago

      @@rogerwilco2 So Biden pardons his son of all charges, Hunter will never face the law, step into court, or answer questions, and you hate Trump 😂😂

    • @kenhiett5266 27 days ago +7

      The human behind the wheel is supposed to remain attentive, as if still controlling the vehicle. What we have here are examples that come down to human-controlled fatalities, and the number is minuscule as a percentage of Teslas and Autopilot/FSD miles driven. I have many miles of experience using both systems, and they give me a litany of safety advantages compared to vehicles that lack comparable technology.
      Teslas aren't designed for level 4 or 5 autonomy yet, so your argument is moot in relation to this story.

    • @gatowololo5629 27 days ago +3

      @kenhiett5266 what does that have to do with crash safety reporting?

    • @Pehz63 26 days ago

      @@rogerwilco2 What is the worst thing you think will happen that day? Anything falsifiable? Do you think he will increase the $7500 tax credit for EV buyers? Or increase the carbon credits? Or accelerate the ban on gas cars? Or create more regulations that are easy for his companies to comply with but hard for newer EV manufacturers, so Tesla won't have any competitors in its way? Because I think none of those things will happen; in fact, the opposite will.

  • @hudsonr6358 21 days ago +58

    You guys need to make a full 1-2 hour documentary on this. It sounds like you found out a lot more than what you said in this video. I would watch a full documentary on this.

    • @Dr.AnonymousPro 7 days ago +2

      I agree; it even looks like too much was edited out to keep it short enough. Strange editing job (and I'm a pro), lacking explanation of the arguments given.

    • @Astrodevil 6 days ago

      I fully suspect Tesla's legal team would be down their throats for stating anything "unsubstantiated" in this report. It is likely they found damning evidence on the hacked computer, but that evidence is likely considered "not legitimate".

    • @jasonwilkins1969 5 days ago

      @@Dr.AnonymousPro Definitely; that's pretty much the story of every Wall Street Journal video. They're always way too short for the various topics they cover.

  • @jeffo7188 months ago +869

    I don't use my Tesla Autopilot anymore because of too many phantom events. It's like driving with a new learner who needs constant attention. You are constantly on edge, waiting to grab the wheel.

    • @Jam-K months ago +26

      Me too. I barely use it. It also feels like only FSD users are getting "Autopilot" updates.

    • @micheljr.beaudry21 months ago +32

      I got the same feeling. I can't let the car drive when my family is inside.

    • @AlexisMoore-nx6wf months ago +38

      I live in the UK. I tried basic Autopilot once on the highway/motorway and experienced phantom braking. Haven't used it since.

    • @DJBaldPaul months ago +18

      I've had this in the UK just with EAP, not even FSD. It makes me so nervous that I am absolutely exhausted when I stop, because I've been concentrating so hard; it's more stressful than driving manually. So I rarely engage EAP now, and only really use it if the highway is empty, but even then I've had phantom braking occur for no discernible reason.

    • @Mr_Battlefield months ago +12

      FSD Version 13 has improved a lot. I hope Tesla users get this update as soon as it can be sent to their vehicles.

  • @iago9711 months ago +811

    Tesla's whole strategy of calling things "Full Self-Driving" and "Autopilot", and then arguing that the customer was wrong to expect the car to competently fully self-drive or operate on autopilot, is truly pathetic.

    • @SuperBotcreator months ago +35

      An airplane on autopilot cannot handle engine failure, altitude recovery, severe weather, instrument failure, hydraulic failure, runway changes, fuel management, route planning, collision avoidance; the list goes on. It really shouldn't be called autopilot, huh.

    • @12pentaborane months ago +41

      @@SuperBotcreator To be fair, cars with lane assist and adaptive cruise control are about as automated as aircraft autopilots. The problem is Tesla claiming it's more capable than it is, and a public that doesn't understand how autopilots actually work.

    • @LoyaFrostwind months ago +10

      Right? It’s false advertising.

    • @mellie4174 months ago

      Well, he's right... After all, the Ohio Supreme Court just ruled that "boneless" chicken wings aren't boneless, so it's totally allowable to have secret bones that kill you and it's no one's fault but your own.... This is where we're headed. The corporations rule it all while we die...

    • @mellie4174 months ago +7

      @@12pentaborane Omg no they're not! My god, those technologies are miles apart!

  • @rporobotjack3683 months ago +550

    People: when you are behind the wheel of any vehicle you have one job, driving the vehicle. Never mind all the other distractions today's modern cars have. It's your life you'll be saving, as well as others'.

    • @WarriorPaxo months ago +39

      You mean the driver is responsible for their own and others' safety? *gasp* But I thought it was Elon killing these people?

    • @rogerwilco2 months ago +20

      The guy leaving at 3 am for a 3-hour commute might have used it to get some more sleep?
      That commute at that time of night is insane and unsafe no matter what you do.

    • @Lun3k months ago +22

      @@rogerwilco2 Yeah, the driver's seat is the best place to get some sleep. OR maybe you should take a bus/train if you want to rest?

    • @geelangfordo3272 29 days ago +52

      @@WarriorPaxo If you market a product as FULL SELF DRIVING, take out crucial sensors needed to safely operate an autonomous vehicle, try to sabotage government oversight, and cover up the data, you are absolutely responsible for the outcome.

    • @Chuck8541 28 days ago +4

      @@WarriorPaxo 😂 I know, right?
      These people hate him so much; never mind the crash data that says Teslas are still better at saving lives than normal cars. ¯\_(ツ)_/¯

  • @JeffyShyChannel 25 days ago +50

    He was such an idiot for removing radar. Why even take it out, to save .0005% of profit?

    • @nizzie16 23 days ago +11

      Because profit is everything.

    • @JeffyShyChannel 23 days ago

      @ Man, but you'd think it outweighs the PR. I remember when he first announced it people were so confused, and now even more so. We know LiDAR and radar are better. 🤦

    • @daydreamer8373 23 days ago +7

      @@JeffyShyChannel Radar was removed because it was conflicting with the cameras, with the cameras being proved correct. Lidar is not needed. V13 is showing that, and will make our roads far safer.

    • @JeffyShyChannel 22 days ago +4

      @ Except cameras are terrible with reflectivity. Really it's Tesla's marketing that's at fault. People are way too confident letting it fully drive with no attention on the road.

    • @paxon57 22 days ago +1

      The entire car costs like 30,000 USD. One radar would cost a few hundred USD and wouldn't be of any real use. One LIDAR, which would be useful, costs a few thousand dollars for this use case.

  • @jeaniebird999 months ago +1047

    9:07 "He'd leave about 3am to get to work by 6."
    😮 3 HOURS, ONE WAY?! That's 6 hours' worth of just driving to frickin' work! Dayum!

    • @createx1751 months ago +253

      No wonder he used Autopilot; 6 hours' worth of driving on a boring highway is crazy.

    • @clubbizarre months ago

      @@createx1751 Definitely. I am not shocked he didn't put his hands on the wheel, but I suspect he was napping.

    • @someyoutuber99 months ago +133

      Good pickup. Pretty crazy commute. Obviously a super hardworking and loving dad; tragic for his wife, kids, parents, and friends.
      The comment at the end rings true. It's too much to ask of people to simultaneously trust their car's auto-driving feature while paying attention. I think fully autonomous is the way (i.e. Waymo) if you want that sort of thing, but we're so far away that this transitional period will be really bumpy.
      Personally, I found Tesla's Autopilot so nerve-racking, and had so many close calls, that I just never use it.

    • @karendarrenmclaren months ago +128

      Enjoy capitalism

    • @T13Nemo months ago +93

      Indeed, this is absolutely insane. I wish we could figure out high-speed rail like virtually the rest of the world. A train at 160 mph would do the same route in 1 hour, plus 30 minutes to navigate to and from the station; boom, you can leave your house at 4:30 instead of 3 AM.

  • @autumnfragrance6326 months ago +2676

    Note to self: step on the brake when you see a truck lying on the road.

    • @ferdiyansurya months ago +288

      That's what the video is trying to highlight: when we start to "trust" that the autonomous driving works, we lay back, let the system do its job, and lower our sense of awareness. Once the system fails to detect an obstacle and we see it in front of us, it is already too late: the distance is too close and our response time too slow to make that quick decision, due to the lowered state of awareness.

    • @williampisano7573 months ago +63

      Lol 😂 Tesla is the least safe car on the road lol 😂 It's the deadliest car.

    • @notarealchannelonyoutube months ago

      ​​@@dc-gm7luI think you miscounted. Yours only has 5

    • @austen2751 months ago +77

      @@ferdiyansurya Still 1000x safer than human drivers

    • @tomp66 months ago +43

      That's an idiotic statement. Did you not watch the WSJ video?

  • @Theactualguyyouarelookingfor months ago +2371

    Now we know who DOGE will shut down first.

    • @uromvictor months ago +24

      You 😂😂😂

    • @ptahchiev months ago +36

      The Wall Street Journal

    • @Ianzgnome months ago +168

      They are already working on removing the reporting mechanism mentioned in this video.

    • @oldsport500 months ago +94

      That's been my theory. Keeping a lid on the investigation is responsible for his recent interest in downsizing government

    • @lukeknowles5700 months ago +11

      @@oldsport500 Their goal is to make the safest cars. More data is better. Why do you think they have a cluster of supercomputers?

  • @FredBlogs-j7j 22 days ago +3

    I am a retired software engineer. We always had a rule that interface design had to be clear, understandable, and aware that people make mistakes. We had to be very careful how we framed prompts and checked user inputs. Here, Tesla presents the prompt that the car has an "autopilot". This violates every guideline a software engineer would use, because it guides user expectations in an undesirable direction. To those who commented here that drivers must bear responsibility and that Tesla cars simply perform better than humans: this "autopilot" prompt is criminal regardless, especially as it is used for marketing purposes rather than human safety.

  • @GeorgiBarzinski months ago +547

    Watching out for autopilot’s errors is actually more stressful than driving yourself.

    • @joecampo1770 months ago +31

      This has been my experience; it's total nonsense.

    • @raviarcot3145 months ago +4

      You suddenly wake up gasping, as if you crashed and can't reach out.

    • @letuinchi months ago +22

      I disagree entirely. Been using it for over a year now and I absolutely love it.

    • @angelofdeath275 months ago +1

      thank you

    • @Ottawa411 29 days ago +13

      I do not find even basic cruise control to be useful. I need to be involved in all aspects of driving the vehicle or my attention wanders.

  • @zonac14 months ago +314

    Now imagine a computer driving an 80,000 lb truck.

    • @Armor23OnPatrol 26 days ago +8

      Soooo the Tesla Semi truck

    • @growtocycle6992 24 days ago +18

      Or we could invest in decent train infrastructure?? 🤔

    • @DunningKruger778 24 days ago +3

      This video isn't a critique of self-driving cars. It's specifically investigating the camera-based technology that Tesla uses, and Tesla's misleading claims about it.
      Human drivers are far from perfect (how many millions of crashes have been caused by truckers making mistakes?). AI will 100% replace human drivers, and make driving enormously safer.

    • @d.b.cooper1 24 days ago +3

      Yikes

    • @dirnnroot5691 24 days ago

      Somehow I've not actually seen a Tesla Semi out there yet, but I've seen other brands without drivers. It's very worrying for many reasons.

  • @redge4553 28 days ago +221

    A Tesla sees something strange on the road, thinks it must be nothing, and drives at full speed.
    When I see something strange on the road, I slow down.

    • @gatowololo5629 27 days ago +23

      Right? Why isn't this the default behavior?

    • @blagohermann3481 25 days ago +13

      @@gatowololo5629 Well, I guess if it stopped each time the system wasn't sure what it sees, that could cause even more accidents. It seems to me that without radar it's not going to be a reliable system.

    • @necavenue 25 days ago +7

      But....ELON SAID IT'S SAFER THAN HUMAN DRIVERS!!!! 😂😂😂😂😂

    • @SongOfItself 25 days ago +4

      @@gatowololo5629 Probably because of the data it's trained on. Technically and humanly, it is impossible to train the system on every single possible thing it may encounter, so it's trained on only a small subset of all possibilities. It follows that "encountering an unforeseen/unrecognizable object" happens all the time, many times on each trip. And most of those encounters are harmless, of course. That's okay when you are training a computer to recognize cats and dogs to impress your CS tutor, but definitely not okay when people's lives depend on it.

    • @lindyf381 24 days ago

      It's still learning to recognise objects on the road

  • @haydynnfike9455 22 days ago +28

    I have a little different perspective here, also as a Tesla driver. I was driving down the road, admittedly a little in my own world. The driver in front of me slammed on their brakes. Full brake check. The car literally slammed on the brakes and swerved out of the way to an area with no cars. It was all captured on multiple cameras for me to rewatch and analyze what I could do better. I do not trust Autopilot the same at night, and I'm extra attentive. That truck would have been incredibly hard to see even if I was driving the car normally. The Tesla has saved me from near misses multiple times that in my old truck would have been accidents. I feel much safer in my Tesla, and you agree to MULTIPLE warnings that the system is not fully automated and that YOU are responsible for paying attention to the road.

    • @user98h5jkl4h 6 days ago

      Going to Vegas, my brother's car automatically braked when someone slammed on their brakes in front of us. The car then sped up when it realized the car behind us was going to hit us because it had started braking late. It was clutch.

    • @ceciliagarding4271
      @ceciliagarding4271 5 days ago +1

      So do you seriously feel that it is OK that Tesla wants to bury the reports on crashes from the public???

  • @blueconversechucks
    @blueconversechucks 28 days ago +171

    As a Tesla driver, I thank WSJ and the whistleblowers in this video for making plain the shortcomings of Tesla autopilot. Seeing these examples I can drive much more aware now. This should be required viewing for all Tesla drivers... to familiarize themselves with the specific weaknesses in the system.

    • @kenhiett5266
      @kenhiett5266 27 days ago +19

      You didn't know that you were supposed to remain attentive and are still ultimately responsible for the movement of your vehicle? Your Tesla came with a variety of such warnings.
      If you thought your Tesla was level 4 autonomy, that's on you personally. We receive constant reminders that's not the case.

    • @sneikiusas
      @sneikiusas 26 days ago +9

      You needed a report to know that you are the driver and not Tesla's autopilot? Like sorry to be rude, but no wonder Tesla drivers crash if this kind of thing is news to you.

    • @mariogarcia2254
      @mariogarcia2254 26 days ago +6

      Are u serious? Don’t you have common sense?

    • @TurdFergusen
      @TurdFergusen 26 days ago +5

      lol

    • @blueconversechucks
      @blueconversechucks 26 days ago +5

      Haha, I thank the reporters for sharing details about the types of accidents that have been taking place and you all are so threatened you start insulting me and making up things I didn't say. Sorry but you're the people who lack nuance, not me.

  • @frederickwelham3829
    @frederickwelham3829 1 month ago +143

    I think people equate autopilot on a car with autopilot on an aircraft. If there is a problem in an aircraft the pilot normally has plenty of time to react due to the separation interval between aircraft and the altitude above the ground. If the self driving automation fails when the nearest object to collide with is maybe 10 feet away the car will hit it before the driver even realises there is an issue.

    • @WarriorPaxo
      @WarriorPaxo 1 month ago +7

      If only there was a person behind the wheel who could have taken over before that point....

    • @-TheUnkownUser
      @-TheUnkownUser 29 days ago +22

      @@WarriorPaxoIf only you don’t call your system “full self driving”.
      If only false advertising was illegal… Oh wait.

    • @JaayTea2595
      @JaayTea2595 29 days ago +3

      So the obvious solution here is to have flying Teslas!

    • @Pehz63
      @Pehz63 26 days ago +1

      @@-TheUnkownUser false advertising is illegal. Litigation over it is why Tesla had to rename it to "Full Self Driving (supervised)". Stop fantasizing about a world where there's evil people getting away with crimes, when there are real people actually getting away with crimes.

    • @joseph88190
      @joseph88190 25 days ago +2

      The sad reality is that people autopilot their airplanes into the ground all the time.
      The public thinks autopilot is some magic button to just push and forget.
      And that is not true.

  • @Tyshark
    @Tyshark 1 month ago +244

    Dont worry. The orange one will give them immunity.

    • @whctjsdlfqhrlfprl
      @whctjsdlfqhrlfprl 29 days ago +1

      😂😂😂😂😂

    • @Pehz63
      @Pehz63 26 days ago

      Immunity from what? Tesla has already had immunity from all of this for years. Tesla has already won tons of lawsuits about crashes. He doesn't need immunity. If he does need immunity, can you name any cases he has lost that Trump will grant him immunity from?

    • @davidbeppler3032
      @davidbeppler3032 22 days ago

      Disagree. GM will pay for the lives they have taken. GM does not report crash data. GM does not collect crash data. GM is killing you.

  • @frogturtle
    @frogturtle 22 days ago +5

    ABS, blind spot monitoring and rear view cameras are good enough for me and my eyeballs

  • @rjlavallee3575
    @rjlavallee3575 1 month ago +587

    Had a Model 3 with FSD for two years. Important note: I live in the Northeast but lived for six years out west. I understand why Tesla was all in for cameras only, but in every other part of the US where roads are not well marked or lit, that FSD is NOT ready for prime time. We had numerous false reactions from the vehicle, from NOT seeing hazards on the road to reacting to hazards that did not exist. The car on more than one occasion slammed on the brakes on a totally empty highway, while doing 70mph. Sold the car after too many occurrences. On top of that, the lowest end Hyundai has better build quality than those cars, and I'm not the first person to point that out.

    • @stoomkracht
      @stoomkracht 1 month ago +5

      Should have shared the stories and camera feeds. If that is even allowed. Computer says no?

    • @andrewm190E
      @andrewm190E 1 month ago +20

      I've had a Tesla Model 3 for the past 2 years, and although I didn't buy it for the autonomous driving feature, the improvements that have come in the last 6 months have been remarkable.

    • @isthatatesla
      @isthatatesla 1 month ago +3

      FSD SUPERVISED

    • @gw2301
      @gw2301 1 month ago +7

      Hyundai has (& has had) awesome build quality! (for years) Got a 100,000 mile / 8 year warranty and panel spacing is mint! Chrysler cars are a better comparison…

    • @rjlavallee3575
      @rjlavallee3575 1 month ago +6

      @@gw2301 Sorry to bag on Hyundai. You're right. They are actually made really well. A more appropriate comparison would have been a vintage 80s Yugo.

  • @a_d_duran
    @a_d_duran 1 month ago +351

    Night cases with the overturned truck and the pick-up are definitely avoidable by radar assisted brakes, which are available even in low-end cars today.

    • @nathanscandella6075
      @nathanscandella6075 1 month ago +49

      Teslas originally relied on radar, too, but Musk has decided to phase that out, even in cars that still have radar hardware. His explanation was that for lower-resolution radar, there are too many false positive events that cause the cars to slow down for nonexistent threats.
      I can confirm that this was indeed a common annoyance in the past, and possibly a way to get rear-ended, and since going to vision-only, that annoyance is mostly gone. But, that doesn't mean Musk is necessarily right. Could be that his engineers just weren't good enough to properly filter the radar data. Radar has always been an imprecise sensor, it's only good signal processing that makes it useful.

    • @georgepelton5645
      @georgepelton5645 1 month ago +13

      AFAIK, radar is not good at detecting stationary objects of any kind, only moving objects.

    • @kaitlyn__L
      @kaitlyn__L 1 month ago +8

      @@nathanscandella6075 yeah, part of the issue AIUI was the conflict between camera and radar in Tesla's stack _specifically_ rather than necessarily an inherent issue to radar.
      And when radar is bad-enough to freak-out because of a leaf in the road, it's at least consistent-enough that you learn it quickly and compensate; rather than inconsistent and seemingly out of nowhere like with Phantom Braking.

    • @sureshkrjsl
      @sureshkrjsl 1 month ago +24

      Most humans would also have crashed in the situations shown in those two accidents. The overturned truck was completely invisible until the last second.

    • @UserErr0r3578
      @UserErr0r3578 1 month ago +32

      @@sureshkrjsl Just because it was not visible in the onboard camera replay does not mean it was the same for a human in that situation, if the human had been paying attention. The event replay is of very bad quality, probably on purpose, to give you the impression of not being able to see.
      I have been in a somewhat similar situation: autumn storm, rainy conditions. Suddenly I saw what looked like a tree fallen on the road and braked later and harder than normal due to bad visibility, but still before my car's auto braking activated. The Model X next to me ploughed straight on until suddenly swerving just before the tree - that was most likely driver intervention. It took significant front end damage and punctured a tyre. No people were harmed. But he cleared the left lane for the rest of us at the cost of his own car. :D

  • @TheRadioAteMyTV
    @TheRadioAteMyTV 1 month ago +277

    "Auto pilot" was the worst name in safety history for these cars.

    • @alexbian5567
      @alexbian5567 1 month ago +21

      And yet, no regulatory body can take any action towards it. Insane.

    • @isthatatesla
      @isthatatesla 1 month ago

      Cruise Control is just as bad.

    • @fdelaneau
      @fdelaneau 1 month ago +20

      The Autopilot name comes from the aviation industry and is technically correct. In a plane, having the autopilot active doesn't mean the pilots can leave their seats; the pilots need to be able to take over at any instant. The problem is that the general public doesn't understand it as such, and this causes much confusion.
      Now, if you are a Tesla owner, when you activate Autopilot in order to be able to use it on the road, the instructions are very clear: this is meant to be used on one-way roads and you have to be able to take over at any time. There are also warnings if you don't keep your hands on the wheel, and if you don't comply Autopilot will disable.

    • @fdelaneau
      @fdelaneau 1 month ago +4

      @@alexbian5567in some countries they had to change the name, but in most it has been kept as it technically describes precisely the feature making it difficult to rule against it.

    • @015AEX
      @015AEX 1 month ago

      @@fdelaneau They've updated it so you don't need to have your hands on the wheel. I think Autopilot is absolutely awful compared to FSD, and not in a feature way; it is actually just terrible at making decisions. I get cutting features off FSD, but the actual logic and programming is terrible.

  • @sgtreid7659
    @sgtreid7659 26 days ago +37

    I’m a Tesla driver. I don’t like Musk, and I feel lied to about the car’s capabilities. BUT, it’s saved me from at least two accidents. Here’s the data point we need but they didn’t provide: “how many accidents per million miles driven WITH and WITHOUT autopilot”. Naturally corrected for other variables. Give me that data.

    • @BigBen621
      @BigBen621 18 days ago +6

      _Give me that data._
      Ask and ye shall receive. "In the 3rd quarter [of 2024], we recorded one crash for every 7.08 million miles driven in which drivers were using Autopilot technology. By comparison, the most recent data available from NHTSA and FHWA (from 2022) shows that in the United States there was an automobile crash approximately every 670,000 miles" (Tesla Vehicle Safety Report 3Q2024).

    • @economicprisoner
      @economicprisoner 17 days ago +3

      @@BigBen621 That is somewhat skewed because you are only supposed to use autopilot on freeways, which limit vehicle interactions by design.

    • @bardz0sz
      @bardz0sz 16 days ago

      Now the issue is: how would you count the accidents prevented by autopilot, since, well, they didn't happen because they were prevented?

    • @BigBen621
      @BigBen621 16 days ago +3

      ​@@economicprisoner Yes, I agree, although speeds are much higher on freeways, which means accidents that do occur are more likely to be fatal.
      The problem is that NHTSA doesn't segregate crash data based on road type. But they *do* segregate fatality data roughly by road type (actually rural vs. urban), and one might assume that crash data would at least roughly follow fatality data. In brief, the rate of fatal accidents on rural roads, which of course include intercity freeways, is 1.68 per 100 million VMT, while the rate for all drivers is 1.33 per 100 million VMT. This suggests that there are *more* fatalities per mile in rural areas than the average, which would make the ratio of 10.6 between Teslas on Autopilot and average drivers even more remarkable.
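The ratios quoted in this exchange are easy to check; a quick sketch using the numbers as cited in the thread (not independently verified):

```python
# Check the crash-rate ratio quoted above: Tesla's 3Q2024 figure of one
# crash per 7.08 million Autopilot miles vs. the NHTSA/FHWA 2022 US average
# of one crash per ~670,000 miles. Figures are as cited, unverified.
miles_per_crash_autopilot = 7.08e6
miles_per_crash_us_average = 670e3

ratio = miles_per_crash_autopilot / miles_per_crash_us_average
print(f"{ratio:.1f}x fewer crashes per mile")  # ~10.6x

# Rural vs. overall fatality rates (per 100 million vehicle-miles traveled)
rural, overall = 1.68, 1.33
print(f"rural rate is {rural / overall:.2f}x the overall rate")  # ~1.26x
```

The 10.6 figure mentioned in this thread is just the ratio of the two miles-per-crash numbers; it says nothing about road-type mix, driver demographics, or vehicle age, which is why the comparison is contested.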

    • @BigBen621
      @BigBen621 16 days ago +1

      @@bardz0sz _how would you count the accidents prevented by autopilot_
      You can only infer avoided accidents statistically. Say you have a group of 100,000 vehicles without autopilot, which each experience 5 accidents per year; so the total accidents are 500,000. Now you replace 10,000 of them with vehicles *with* autopilot that experience only 4 accidents per year. Now the total accidents are 90,000 x 5 plus 10,000 x 4 = 490,000. So you can infer that replacing 10,000 vehicles without autopilot with 10,000 vehicles with autopilot has prevented 500,000 minus 490,000 = 10,000 accidents.
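The fleet-replacement inference above can be sketched directly; the per-vehicle rates are the commenter's illustrative numbers, not real-world data:

```python
# Fleet-replacement inference with illustrative accident rates
# (the commenter's hypothetical numbers, not real-world data).
fleet = 100_000            # vehicles in the fleet
rate_manual = 5            # accidents per vehicle per year, no autopilot
rate_autopilot = 4         # accidents per vehicle per year, with autopilot
replaced = 10_000          # vehicles switched to autopilot

baseline = fleet * rate_manual
mixed = (fleet - replaced) * rate_manual + replaced * rate_autopilot
prevented = baseline - mixed
print(prevented)  # 10000 accidents inferred as prevented
```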

  • @SurrealAdventure
    @SurrealAdventure 1 month ago +55

    After watching this, I think I need to point out that Toyota has had a level 2 autonomous system available since they put it in the Prius in 2010. The system is similar to the Highway Autopilot on Tesla. Toyota uses a radar array and cameras. The radar sees stuff like this and will slam on the brakes, and this is why Toyota is seldom if ever in the news for this kind of accident. The radar system doesn't have to interpret anything; it knows if you are about to slam into a stationary object. In 8 years of owning a Prius, I had it slam on the brakes once (about 75% faster than my foot could, which hadn't even left the gas pedal and the brakes were already applied, avoiding a collision) and warn 3 times to stop. 2 of those were false, as that is the nature of radar; sometimes it doesn't understand sudden curves, angles, or sudden inclines in the road, but I'd rather have a few false positives than slam into something.

    • @sylvaing1
      @sylvaing1 1 month ago +8

      Beside my 2021 Model3, I have a 2017 Prius Prime. Its lane assist is a joke compared to Autopilot. It can't even follow a gentle highway bend. I keep it off. Its lane departure is annoying. I only turn it on while on highways. A radar wouldn't have been helpful with the overturned semi as it would have been ignored because it has been stationary since the first time the radar saw it (radar are designed to ignore stationary objects to prevent phantom braking). The difference here is the Prius driver would have been looking at the road 100%, otherwise he would have ended in the ditch. In a Tesla, 99% of the time, Autopilot will do fine. It's that 1% that is problematic if you're not looking at the road, hence why you SHOULD KEEP LOOKING AT THE ROAD, it's NOT AN EYES OFF system!

    • @Struct.3
      @Struct.3 1 month ago

      No the reason why you don't see Toyota in the news is because they don't get clicks and nobody in the news team has a vendetta against them. Toyota's lately have had quite a lower safety record than Tesla, its not even a competition at this point. So Toyota better step up their game. And btw those false positives are why tesla decided against the radar systems, because they cause people to rear end you. Yes because people suck at driving and they will rear end you.

    • @Robbedem
      @Robbedem 1 month ago +3

      @@sylvaing1 A 99% correct system shouldn't be used by the public. And calling it beta is not an excuse.
      Actually, a 70% system would probably be safer to use, because at least you'd be able to keep your attention on the driving.
      But when it's at 99%, you don't have enough to do, so you'll get bored. When you're bored, you'll start thinking about other things (daydreaming) or even do other things (go check your phone,...). It's just human nature.

    • @alamogiftshop
      @alamogiftshop 27 days ago +4

      I had a “false positive” in a Subaru Outback once and it decided to lock its brakes up at speed on the interstate. Almost got steamrolled by the traffic behind me. All because there was a different color road surface patch. Be careful what you wish for.

    • @sylvaing1
      @sylvaing1 27 days ago +1

      @@alamogiftshop I had a 2024 Volvo XC40 Recharge Ultimate while my Model 3 was in the body shop last summer. It also did a phantom emergency braking on a curve on a regional road while I was going 105 km/h. In my case, there was nothing on the road, no change in the asphalt condition, nothing. The dash displayed "Emergency braking applied to avoid a collision"; well nope, there was nothing.

  • @outbackshaqYT
    @outbackshaqYT 1 month ago +321

    Imagine always having a beginner driver driving, and then having to be ready to take over all the time. This seems to be what it's like to use autopilot

    • @chriswright9096
      @chriswright9096 1 month ago +24

      I agree. I think it's more difficult than simply driving yourself. Most of the people saying the drivers should have been paying more attention are missing this point.

    • @medea27
      @medea27 1 month ago +18

      Exactly - they are relying on humans being the back-up for a computer, rather than the computer helping the human avoid mistakes (like aviation automation works).

    • @christianmuriel4088
      @christianmuriel4088 1 month ago +3

      💯. Need to text at a red light? Autopilot is great to get going when the light turns green. It's surprised me a few times. Why is that car camping in the left lane? Oh, because it's on autopilot. Always drive; let autopilot assist. 🤦🏽‍♂️ This crash is dumb. Condolences to the family, because their father and husband had to be a statistic of why it's important that only you are responsible for your safety.

    • @spaceprior
      @spaceprior 1 month ago +4

      I think people should be a lot more curious about how the people who get into trouble with it were able to convince themselves that it was safe, because I've never gotten that impression from it. You can watch people demonstrating it, and it makes mistakes constantly.

    • @h.seanhsu8965
      @h.seanhsu8965 1 month ago +7

      It's more like having a helmsman handling the maneuvering so you can attend to higher-level tasks such as maintaining situational awareness.

  • @joedellinger9437
    @joedellinger9437 1 month ago +106

    In Houston old traffic markings are often left behind after lanes have been re-routed. In rain and fog at night sometimes the old remnant lane markings are MORE visible than the “correct” ones. It’s sometimes difficult for even a highly experienced person to understand what to do. I don’t even trust myself to get it right sometimes. A low-resolution camera has no hope!

    • @user-bb3ci3gn7s
      @user-bb3ci3gn7s 1 month ago +3

      As a fellow Houstonian, I agree, the marking really get me confused at times. It's dangerous

    • @rpdx3
      @rpdx3 1 month ago +3

      Ditto here. The “lane keeping” function of my Kia Sportage gets fooled all the time by all the different markings on the road. I put it on once in a while for fun, but not for long!

    • @marcmcreynolds2827
      @marcmcreynolds2827 28 days ago +2

      "In rain and fog at night" Add to that bright sunlight but at a low angle, e.g. just after sunrise. With concrete pavement in particular (high albedo), seemingly every place where a marking had ever been on the pavement lights up.

    • @paulmadsen51
      @paulmadsen51 25 days ago +1

      Just had this experience myself while driving close to sundown. The only way to tell which markings were correct was to use the mirrors. You could clearly see the lanes in the rear view, but looking forward, it was almost impossible to tell.

    • @YamiHoOu
      @YamiHoOu 23 days ago

      We have that problem in some of the cities in New Zealand. It's a nightmare driving through those areas when it's raining

  • @didier_777
    @didier_777 2 days ago +1

    I don't think I would have seen that car in the dark either.

  • @KrzysztofBob
    @KrzysztofBob 1 month ago +65

    And people wonder why Tesla doesn't offer Full Self-Driving in Europe and China.

    • @isthatatesla
      @isthatatesla 1 month ago +4

      It's called FSD SUPERVISED.

    • @nebylicza
      @nebylicza 1 month ago +14

      @@isthatateslawhich is a self-contradictory name

    • @muadhnate
      @muadhnate 29 days ago

      We're literally used as guinea pigs and so many people are willing to sign up. Smh

    • @termitreter6545
      @termitreter6545 28 days ago +11

      @@isthatatesla And FSD is called "fully self driving". Only later did they add the paradox "Supervised" behind it, which should tell you something.

    • @gledatelj1979
      @gledatelj1979 26 days ago

      @@termitreter6545 FSD is in development and will be fully functional in '26. Maybe it should be called Training Self Driving until then.

  • @DarkNemesis25
    @DarkNemesis25 1 month ago +52

    Been using Tesla's FSD for 2 years now and over 70,000 km across the US and Canada. I think it's incredibly beneficial to safety, but I honestly think standard Autopilot is dangerous. It is cruise control; that's all it is. It will drive straight into anything in the road. The name Autopilot is misleading tbh. It should only be referred to as advanced cruise control with lane keeping.

    • @personyt55
      @personyt55 1 month ago

      I always knew that’s what it was. My family owns a Tesla and never uses autopilot at all. People should do more research before allowing a car to drive you.

    • @colingenge9999
      @colingenge9999 1 month ago +9

      @@personyt55 The only "research" you need to do is to read the Tesla documentation on Autopilot, which states it's a driver's aid and you are fully responsible. Operate Autopilot, recognize its capabilities, and use it within those capabilities. Simple as that. 80% of my driving is on Autopilot, and I understand completely what it can do and what it can't. It adds to my safety, because the fatigue of maintaining speed and maintaining lanes detracts from my ability to watch the road carefully, which is what I can do more intently with Autopilot on than off.

    • @valuemastery
      @valuemastery 28 days ago +8

      The term is correct. This is exactly what autopilot in aviation is. It will keep an airplane's height, speed and heading, comparable to keeping a car in its lane. And it will fly into any mountains that show up if the pilot does not do his job.

  • @Column1313
    @Column1313 1 month ago +47

    The public needs to know when a car is on “auto pilot”

    • @canwoop
      @canwoop 27 days ago +7

      It won't help stopped vehicles know a Tesla is about to rear end them at 70mph.

    • @Uouttooo
      @Uouttooo 19 days ago

      I read that there are already plans for cyan or green (can't remember which) colored lights on vehicles in the near future to let everyone know that a car is in autonomous mode.

  • @leog6177
    @leog6177 20 days ago +9

    After I got my first Tesla this year, common sense made me not rely on autopilot in these situations: when it's dark, when it's foggy or raining heavily, when the road markings or road conditions are bad. But it's a nice feature in good weather on nice roads, and I enjoy it.

    • @Uouttooo
      @Uouttooo 19 days ago

      You have the option to reduce the maximum speed to whatever you want.

  • @power4things
    @power4things 1 month ago +162

    The problem isn't only the Tesla driver, but those of us who did not sign up for the program and get hit by one of these. Makes the Corvair look pretty tame now.

    • @daydreamer8373
      @daydreamer8373 1 month ago +2

      Who has been hit By FSD?

    • @RoadSurferOfficial
      @RoadSurferOfficial 1 month ago +24

      All the people in this video 🤡

    • @Pressure165
      @Pressure165 1 month ago +6

      Tragic, no doubt. At 3:40, the driver was warned 19 times by the autopilot to keep his hands on the steering wheel before the crash. Most likely he fell asleep; the autopilot kept him from crashing long before the flipped truck appeared. This accident would have happened anyway in any other car, with him losing control of the vehicle. I'm hoping that this technology will become so good that it will prevent anybody from dying on the roads.

    • @daydreamer8373
      @daydreamer8373 1 month ago +3

      @@RoadSurferOfficial That was not FSD. It was autopilot. So my question still stands. Who has been hit by FSD?

    • @nathanscandella6075
      @nathanscandella6075 1 month ago +8

      This is a dumb, yet extremely common, form of sophistry used against deployed autonomy.
      I also didn't sign up to be on the road with rednecks' monster trucks who achieve "safety" via enormous mass and a bumper that's at the same height as my head in a sedan. Or, a child's head on a crosswalk.
      Because, you know why? We don't all get to approve/reject each others' vehicles.
      If you want to argue that Tesla should be better about publicly sharing safety statistics on their autonomous cars, and IF those statistics show significantly worse safety performance, then use that to push for specific regulation. "I didn't sign up for ..." is irrelevant. This is a big, stupid country full of idiots doing harmful things to one another. I didn't sign up for that, either, but we live in a society.

  • @24dbuforddb
    @24dbuforddb 1 month ago +29

    The mainstream media should be focused on this as a disqualification for DOGE oversight.

    • @wadexyz
      @wadexyz 1 month ago

      They're afraid of going to jail under the new Putin, I mean Trump, administration

    • @JW-mb6tq
      @JW-mb6tq 21 days ago +2

      The basic conflict of interest between DOGE and being a CEO at so many large companies should be obvious. Musk should be made to divest as CEO in many places before being able to head a government program meant to guide regulations and enforcement.
      I also point out that this is not uncommon in the corporate world. Often when you take one position you might not be allowed to lead or own conflicting situations.

  • @LSolomon13
    @LSolomon13 29 days ago +32

    Literally every situation shown in this video wouldn't have happened if they used LIDAR. Elon was so set on just using cameras despite warnings from the engineers. The blood is on his hands.

    • @daydreamer8373
      @daydreamer8373 29 days ago

      Why does everyone think lidar is some kind of magic solution. A few months back a Waymo crashed into a large pole, despite being equipped with Lidar. V13 of FSD is proving Lidar is not needed.

    • @CmdrTobs
      @CmdrTobs 28 days ago +4

      Yes, dropping lidar over technological ideology was bad.

    • @daydreamer8373
      @daydreamer8373 28 days ago +1

      @@CmdrTobs V13 is showing it to be the right decision. Tesla creates a pseudo-lidar with its cameras, and has shown it doing remarkable things. Lidar is not some super sensor. Only a few months ago, a Waymo crashed into a large pole.

    • @Nullmoose
      @Nullmoose 27 days ago +1

      I think the abrupt shift away from lidar to camera based reinforcement learning was the bad move. At least continue to use lidar to build the model for detections and reactions and then introduce that training data to newer vehicles without lidar. Their decision to just replace everything with cameras then push Tesla Vision as an OTA update almost 2 years later was truly the dumbest decision their engineering team has ever made. No idea why they didn't prioritize that rollout first, then focus on physical lidar removal.
      Tesla Vision could have been such a huge deal when they shipped it, but instead it felt more like a patch update to fix bugs, since that's basically what it did for all of the vehicles produced w/o lidar.

    • @daydreamer8373
      @daydreamer8373 27 days ago

      @@Nullmoose I'm not sure what you are talking about. Tesla has never used lidar. They did use radar, but got rid of it because it conflicted with what vision was saying, with vision being proved correct. Looking at the performance of FSD, it would seem to be the correct decision.

  • @edgarortega7609
    @edgarortega7609 16 days ago +1

    I was actually working on the same job site as Steven before this incident. I had just gotten my Tesla and would park next to Steven, until he stopped showing up one day. After I heard about what happened to him, I learned to never trust the autopilot without supervision. I didn't know him personally aside from the occasional morning pleasantry, but it's tragic how many construction workers die from having to commute far while sleep deprived. My heart goes out to him and his family.

  • @CptSpears007
    @CptSpears007 26 days ago +4

    Elon has gone from Iron Man to Lex Luthor.

    • @alr2157
      @alr2157 26 days ago +1

      Iron Man was selling weapons to whoever wanted them, wasn't he?
      Not really the hero I dream about.

  • @Aniket2712
    @Aniket2712 1 month ago +149

    3:04 what is the driver doing? ... sleeping?

    • @FrancescoDiMauro
      @FrancescoDiMauro 1 month ago +49

      Highly likely, or looking at the phone. How this is Tesla's fault is beyond comprehension. One can get complacent if the car does most of the work most of the time, but failing to look ahead while you're cruising at that speed is reckless.

    • @RichardBaran
      @RichardBaran 1 month ago +5

      It's called autopilot....

    • @Aimaiai
      @Aimaiai 1 month ago +22

      @@FrancescoDiMauro It's Tesla's fault because they allow people to engage in this behaviour without properly accounting for it, at the same time as Elon Musk brags about how perfect it is to boost shareholder value, while it isn't even close to being as safe as a human driver in dire scenarios. It isn't allowed to be called "autopilot" in Europe for this reason, because the branding puts a false sense of security in the brain of the driver.

    • @karendarrenmclaren
      @karendarrenmclaren 1 month ago +1

      Mostly. It "drives itself" after all

    • @timhill9039
      @timhill9039 1 month ago +8

      @@Aimaiai So should you also ban dumb cruise control? And how do you know it isn't as safe as humans in "dire situations"? What IS a "dire situation", and what FACTUAL statistics do you have to compare AP vs human responses in such cases? Showing one accident like this video does is meaningless. I can post a video showing a human driver doing something dumb and causing a terrible accident. Do we then ban all human drivers? Seat belts occasionally jam in freak accidents and cause an occasional death. Do you propose we ban seat belts too?

  • @hihihihihi6351
    @hihihihihi6351 8 days ago +2

    Seeing this video makes me very happy that I made the correct decision not to use autopilot when driving a Tesla rental car for 2 days. During the rental period, I was even driving at midnight. FYI, without using autopilot, Tesla cars are very unpleasant to drive.

  • @mycalltoadventure9712
    @mycalltoadventure9712 12 days ago +2

    I drive around 6k miles a month with at least 2500 of those on autopilot and FSD. Amazing tech but it’s still supervised and you have to keep your attention on the road.

  • @Helloworld0011-q2m
    @Helloworld0011-q2m 9 days ago +1

    Amazed that WSJ, owned by the Murdochs' News Corp, investigated this and no one from the Murdoch family tried to bury it. Good job WSJ.

  • @valenciainc2596
    @valenciainc2596 27 days ago +12

    The fact that the cameras are at different positions and angles is what allows the computer to calculate depth; it's not a flaw.

    • @paulfrydlewicz6736
      @paulfrydlewicz6736 25 days ago +4

      as I understood the technician, the problem is that the computer doesn't know the exact position and orientation. It's not about the mounting, but what the computer is being told about the mounting. Bad calibration leads to bad assumptions by the computer

    • @ericew
      @ericew 22 days ago +1

      @@paulfrydlewicz6736 Every vehicle goes through a "camera calibration" during the first 100 miles or so during which advanced driver assistance is not available. This is designed to calibrate the camera using stationary points on the ground repeatedly until there is a map of this vehicle's camera for the computer.
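
A numeric aside on the depth point above: in textbook stereo triangulation, depth is Z = f * B / d (focal length times baseline over disparity), so a small error in the computer's model of where the cameras sit shifts the measured disparity and, with it, the estimated distance. A minimal sketch with illustrative numbers only (this is the generic textbook model, not Tesla's actual pipeline):

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Textbook stereo triangulation: depth Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# With a 1000 px focal length and a 0.3 m baseline, a single pixel of
# disparity error at long range moves the depth estimate by meters:
well_calibrated = stereo_depth(1000, 0.3, 10)  # 30.0 m
miscalibrated = stereo_depth(1000, 0.3, 9)     # ~33.3 m
```

This is why the calibration map described above matters: depth estimates are only as good as the computer's model of the camera geometry.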

    • @honkhonk8009
      @honkhonk8009 20 days ago

      WSJ is clueless and mostly pumps out propaganda based on whatever company is bribing them.
      They have a colorful history of Tesla propaganda that only died down because their stock price rallied in 2020, leading them to calm down on it.

  • @concernedcitizen2428
    @concernedcitizen2428 27 days ago +34

    Why is there no data presented in this article/video? People crash every day; the question is whether autopilot crashes at a lower or higher rate.

    • @mazicort_
      @mazicort_ 26 days ago +8

      Came here to say exactly this; after this video we still don't know whether AP is safer than a regular driver or not

    • @ericew
      @ericew 22 days ago

      The data doesn't support their assertion that the software is faulty; that's the problem. The article also says "autopilot", which is a nebulous term in the Tesla world but in this case applies only to autosteer (lane keeping + adaptive cruise control) and does not apply to FSD.
      Feel free to search for "Tesla Vehicle Safety Report", which has the breakdown by quarter.

    • @honkhonk8009
      @honkhonk8009 20 days ago +7

      Because Teslas crash at lower rates. It's a hit piece that WSJ started putting out again because their CEO has differing political beliefs.

    • @Uouttooo
      @Uouttooo 19 days ago +8

      You can see from the first two videos that if the drivers were awake, there is no way they would crash into a huge object in front or heading straight when approaching a T-junction. Both were very likely asleep at the time.

    • @BigBen621
      @BigBen621 18 days ago +4

      @@mazicort_ _we still don’t know if AP is safer than regular driver or not_
      Well, yes we do. "In the 3rd quarter [of 2024], we recorded one crash for every *7.08 million miles* driven in which drivers were using Autopilot technology. By comparison, the most recent data available from NHTSA and FHWA (from 2022) shows that in the United States there was an automobile crash approximately every *670,000 miles"* (Tesla Vehicle Safety Report 3Q2024).

  • @brandinojam24
    @brandinojam24 17 days ago +3

    Tesla Full Self-Driving has come a long way. I was driving and the Tesla recognized a plastic bag in the wind as an obstacle and slowed down accordingly.

    • @necavenue
      @necavenue 17 days ago

      🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣

  • @timjudshore6907
    @timjudshore6907 13 days ago +2

    A Tesla steered around me while I was biking; the driver did not care ❤ saved me from getting hit or killed

  • @pietersmit621
    @pietersmit621 16 days ago +5

    We should be focusing on data and statistics. People die every day in many different cars; focusing on specific examples is misleading and doesn't help make the roads safer.

    • @ErinWilke
      @ErinWilke 14 days ago

      It is important for people to understand the weaknesses of a system to prevent tragedies. Statistics are important, but they don't help analysis to that extent, especially when the data needed to make more specific statistical inferences is kept hidden

    • @41-Haiku
      @41-Haiku 12 days ago

      @@ErinWilke Yes, that data should be made available to regulatory bodies. I don't think it should be publicly available, since that seems like a breach of privacy with no actual upside. As it concerns this video, making this an emotional story about a specific case is in poor taste. The fact that it's about an accident that happened several years ago with a past system makes me question its relevance.

    • @jonjimihendrix
      @jonjimihendrix 2 days ago

      @@41-Haiku The system may have changed, but Tesla has not. That's the problem, and that’s the story.

  • @andybrice2711
    @andybrice2711 29 days ago +7

    Surely if the cameras are giving contradictory data, the vehicle ought to slow down in order to get a clear picture and minimize the danger of a collision?

    • @Steyr6500
      @Steyr6500 25 days ago +2

      Or sound an urgent alarm in the cabin to alert the driver?

    • @bikeman7982
      @bikeman7982 23 days ago

      @@Steyr6500 It does sound a loud alarm if a collision is imminent. The "forward collision warning" in my Tesla has 3 choices: early, medium and late. I set mine to medium because early was giving too many annoying false positives. There's also automatic emergency braking. It doesn't guarantee a crash won't occur, but it should reduce the severity of the impact.

    • @bikeman7982
      @bikeman7982 23 days ago

      That’s the challenge with technology. It could lead to false positives. Unnecessary braking could result in getting rear ended.

    • @andybrice2711
      @andybrice2711 23 days ago

      @@bikeman7982 I don’t mean do an emergency stop though. Just slow down a bit to minimize danger. And of course, that can also depend on the rear-view cameras.

  • @rarespetrusamartean5433
    @rarespetrusamartean5433 16 days ago +4

    You presented maybe 5 actual data POINTS in the video, with an additional data set in the form of what crashed based on what.
    I really expected more from a multibillion-dollar journalism company.

    • @rayphenicie7344
      @rayphenicie7344 16 days ago

      Their articles, of which there are at least 10, give links to loads of information. The government (NHTSA) has published a lot of information as well.

    • @rarespetrusamartean5433
      @rarespetrusamartean5433 16 days ago +1

      @rayphenicie7344 That changes nothing about the quality of this video; they never even referred to those articles.

  • @JoanMendoza
    @JoanMendoza 12 days ago +2

    I've also heard that most people who work on automated driving don't trust it with their own lives.

  • @evoljfe48
    @evoljfe48 16 days ago +8

    Now compare this to human error crashes and you'll realize how much worse we are.

    • @edinhusic8892
      @edinhusic8892 16 days ago +1

      Well, this would be very easy, but for some reason (wonder why) no company has released its crash data so far

  • @russell5700
    @russell5700 25 days ago +38

    I would have liked it if the video did a better job contextualizing Tesla Autopilot crash rates vs human crash rates. This would help better understand Elon’s claim that Autopilot is safer than human driving.

    • @johnstephenson5158
      @johnstephenson5158 24 days ago +3

      This 👆

    • @GizmoMaltese
      @GizmoMaltese 23 days ago +3

      Saying it's safer than a human driver includes all kinds of reckless drivers and drunk drivers. It has to be safer than a responsible human driver. I'm pretty sure that's nearly impossible in the short term.

    • @elviswsjr
      @elviswsjr 23 days ago +3

      Saying it’s safer than a human driver is like saying that AI is a better writer than a human. Sure, it’s better in some cases and in some ways, but not all. It’s a huge generalization. Autopilot is a better driver in that it can stay centered in the lane better, but is a worse driver in so many other ways. Same with FSD. It’s a way better driver… until it’s not.

    • @paxon57
      @paxon57 22 days ago

      but then the anti-Musk narrative won't work

    • @alternat8771
      @alternat8771 20 days ago +1

      Humans have more total crashes than Tesla autopilot has crashes.
      Therefore, autopilot is safer.
      This is Elon logic.
      Please ignore that there are a much larger total of human drivers than Tesla autopilot drivers.
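
The sarcasm above lands on a real statistical point: raw crash counts are meaningless until normalized by exposure (miles driven). A toy calculation with entirely made-up numbers:

```python
def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Normalize a raw crash count by miles of driving exposure."""
    return crashes / (miles / 1e6)

# A small fleet with fewer total crashes can still be the riskier one:
small_fleet = crashes_per_million_miles(crashes=100, miles=50e6)       # 2.0
large_fleet = crashes_per_million_miles(crashes=1_000, miles=1_000e6)  # 1.0
```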

  • @MrCristopher99
    @MrCristopher99 16 days ago +5

    Why? Because of inattentive drivers, that's why.

  • @didier_777
    @didier_777 2 days ago +1

    FSD is not perfect but it keeps getting better. I know that's no consolation to the people who lost loved ones, but the truth is that this technology is going to save thousands of lives in the near future.

  • @wkymole3
    @wkymole3 25 days ago +3

    Tesla- we have autopilot.
    Normal car manufacturers- even if we had autopilot, we wouldn't be stupid enough to call it that.

    • @BigBen621
      @BigBen621 17 days ago

      _even if we had autopilot, we wouldn't be stupid enough to call it that._
      Is this somehow dramatically different from Drive Pilot, which is what M-B calls their Level 3 ADS?

    • @wkymole3
      @wkymole3 16 days ago

      @BigBen621 As I said, they just wouldn't call it autopilot. For liability reasons. Even if it did function as such.

    • @gridcoregilry666
      @gridcoregilry666 13 days ago +1

      It's a GOOD term! Look at the definition from Wikipedia, for example: "An autopilot is a system used to control the path of a vehicle without requiring constant manual control by a human operator. Autopilots do not replace human operators. Instead, the autopilot assists the operator's control of the vehicle"

    • @BigBen621
      @BigBen621 13 days ago

      @@wkymole3 _they just wouldn't call it autopilot. For liability reasons._
      Just what liability does this create for Tesla? They've already won two lawsuits, one in Germany and one in the U.S., brought by folks who claimed it's misleading.

  • @Voiceofreason772
    @Voiceofreason772 20 days ago +2

    Years ago, I knew Elon was not smart when he, with full confidence, announced that he was shipping "autopilot"-capable cars without radar. A pure camera-based solution does not work for that use case.

  • @didier_777
    @didier_777 2 days ago +1

    One sensor for head-on collisions would make it safer. But I agree with Elon that vision is better than relying on sensors.

    • @jacksmith-mu3ee
      @jacksmith-mu3ee 2 days ago +1

      Lame excuses.
      Lidar exists for a reason.

    • @didier_777
      @didier_777 1 day ago

      @jacksmith-mu3ee LIDAR (all caps because it's an acronym) was invented in 1961; it wasn't invented for self-driving cars. I'm not sure what your point is.

    • @jacksmith-mu3ee
      @jacksmith-mu3ee 1 day ago

      @@didier_777 Another lame excuse. Cars pre-Tesla had it.
      Thanks for proving my point, bot.

  • @MrCristopher99
    @MrCristopher99 16 days ago +7

    Can we talk about how many accidents it PREVENTED? Can we do a similar video for another automaker?

  • @ajaypala3475
    @ajaypala3475 26 days ago +8

    Not a Tesla driver here, but I'm assuming self-driving tech has improved since 2021. I can't believe this is the focus of a report in 2024 (almost 2025).

    • @mikebronicki8264
      @mikebronicki8264 23 days ago +2

      That is an assumption with no data to back it up.

    • @41-Haiku
      @41-Haiku 13 days ago

      @@mikebronicki8264 That is an assertion with no evidence to back it up.

  • @rken3116
    @rken3116 1 month ago +25

    Sorry, but I trust myself more than I ever will a computer. My condolences to those who have lost their lives to technology.

    • @honkhonk8009
      @honkhonk8009 20 days ago +1

      And you should. That's what the company expects you to do.
      Seriously, it's a glorified cruise control that literally tells you at night when it's unsure about the conditions.
      There are so many videos out there of morons who try to game the system and end up dying because they don't pay attention.
      In real life, autopilots on airplanes require the same level of supervision, if not more.

  • @ZelForShort
    @ZelForShort 26 days ago +2

    Imma be honest, getting warned 17 times is wild

  • @fredreeves7652
    @fredreeves7652 16 days ago +3

    Unfortunately, in this case, she is going to lose her court case simply because her husband IGNORED the car's repeated warnings about having his hands off the wheel. He was pre-warned. : (

  • @mikaelhugg1112
    @mikaelhugg1112 15 days ago +5

    Okay, maybe this is an unpopular opinion, but the driver is still the one responsible. Autopilot, ABS, parking sensors, etc. are just features that make driving easier and safer, but the driver needs to stay vigilant and observe the road. I feel like many of the crash videos from Teslas look like the driver wasn't paying attention to driving. :/

  • @Alpha-Alpha
    @Alpha-Alpha 8 days ago +1

    Honestly, seeing the truck lying in the middle of the road, I'm not sure a human driver could have done anything either. The result would probably be the same. It's the truck driver who killed him, not the autopilot.

  • @conceptrat
    @conceptrat 29 days ago +4

    If I were John Bernal, I wouldn't be putting my Tesla in 'autopilot' mode, just in case my vehicle had received a unique update and diverted to somewhere dark.

  • @jamesphillips7150
    @jamesphillips7150 1 month ago +10

    What is the average compared to the total?
    Pretty important fact to leave out

  • @DIYAudit
    @DIYAudit 28 days ago +14

    Not a Tesla fanboy, but shouldn't the news also show how many times autopilot saved someone's life or avoided a crash?
    I mean, the news is supposed to be fair and balanced, right?

    • @denise8401234
      @denise8401234 27 days ago +4

      This isn't a fluff piece; it's about the tech info Tesla has been keeping from the public. It's about public safety.

    • @DIYAudit
      @DIYAudit 27 days ago

      @ It's 100% "orange man bad, friend's company" being targeted because the orange 🍊 🧍‍♂️ won.
      News needs to be fair and balanced, not just hit pieces.

    • @bikeman7982
      @bikeman7982 23 days ago +2

      That type of reporting doesn’t sell.

    • @username7763
      @username7763 22 days ago

      Sounds good, other than Tesla not allowing others to look at the data. How can the media report on something that cannot be independently verified?

  • @kinngrimm
    @kinngrimm 5 days ago +1

    When you give away control, only to realise you have none when it matters.

  • @TrynaBeCray
    @TrynaBeCray 25 days ago +8

    4:51 Sounds similar to what a certain submarine pilot/CEO said about his product. I wonder where he is now?

  • @JessicaNutt33
    @JessicaNutt33 14 days ago +6

    I'm sorry, but when the car warns you NINETEEN TIMES?!? I have empathy for anyone who loses their life, but don't blame the car when the humans don't do what they're supposed to, which is pay attention to the road. Also, how many people died in other cars… at night… when visibility is low? Let's compare apples to apples and see how that goes.

    • @Mike-f8b8i
      @Mike-f8b8i 14 days ago

      In other words, this self-driving technology is perfectly safe... as long as it isn't used for self driving like CEO Musk claims. And morons continue buying Teslas for this? 😂

  • @nerva-
    @nerva- 24 days ago +1

    I'm surprised they don't mention the crash in the Bay Area where the guy's Tesla veered out of his lane and tried to merge with a concrete divider that was missing its normal crash barrier - he'd told his wife that several previous times on his daily commute he would see the autopilot turn toward the divider as if it thought it was seeing a passing lane, and he'd grabbed the wheel to override it - but on this occasion he was preoccupied for a couple critical seconds and didn't react quickly enough, so he died in a head-on collision with a concrete barrier. Tesla reviewed the accident and pronounced it "driver error" because the driver had failed to override the autopilot when it made a mistake. The fact they could say something that absurd is why I'll never buy a Tesla.

  • @contentcentral_
    @contentcentral_ 17 days ago +7

    3:47 Sure, the autopilot messed up here, but if the driver had been alert and supervising the FSD, you can clearly see the flashing lights, and that crash would never have happened

    • @coolkidsahib
      @coolkidsahib 14 days ago +1

      That's the issue with creating a false sense of safety. People end up relying on the tech too much and either get distracted or zone out. Hence why they're getting investigated for false marketing, because even people like my aunt think it drives her everywhere with no issues

    • @ErinWilke
      @ErinWilke 14 days ago

      Definitely, that specific one could have been avoided by the driver paying attention. The problem is Tesla's marketing (self-driving, autopilot, etc.) combined with their refusal to make crash data public means that people have been misled into trusting the technology. If you think Tesla crashes are one-off tragedies, versus actually understanding the shortcomings of the technology, it's easy to become overly reliant and trusting of the tech. I don't think it's fair to blame individuals when corporations and billionaires lie to make a profit and endanger their customers and the public as a result. Billionaires and corporations are NEVER held accountable.

  • @patriciachimienti8094
    @patriciachimienti8094 16 days ago +4

    Autopilot is not FSD. BUT BOTH CAN BE DISENGAGED BY MERELY APPLYING THE BRAKE! Seeing anything blocking the road like a downed semi, I would have applied THE BRAKES! I have been using FSD since 2022 and it is improving quickly. Autopilot is NOT FSD, as it only allows one to steer without having to use the accelerator; to stop, one needs to brake. The video of the accident proves 1) there was plenty of time to brake, 2) he was going faster than he should have, 3) he was not paying attention. Did his Tesla even have FSD? Because it would have slowed and sent alarms to the driver.

  • @fyrelorde
    @fyrelorde 22 days ago +1

    "I don't think this is a long-term technology that we're gonna keep in the cars." is actually a fair assessment of why it's so controversial for cars.
    If this type of tech were put into something that didn't encounter so many unknown variables, then it would be more effective than what it was marketed for.

  • @StevenBanks123
    @StevenBanks123 26 days ago +11

    I can easily imagine running into that overturned semi truck. No flares out, and it was dark. It fooled the autopilot, but it could easily fool a human.

    • @pdavisnwa
      @pdavisnwa 24 days ago +2

      The principles of defensive driving say you should slow down and even stop until you figure out what is in the road ahead.
      You seem to be forgetting that.

    • @StevenBanks123
      @StevenBanks123 23 days ago +3

      @ I went along with you until you added “you seem to be forgetting that” It was a dig, and unnecessary. And I do trust that you do indeed know that. See? I trust you.

    • @bikeman7982
      @bikeman7982 23 days ago +1

      Yeah. The human mind can easily go into ‘autopilot’ and fail to register the obstacle. Especially at 3am on a deserted highway.

  • @stephenspackman5573
    @stephenspackman5573 18 days ago +9

    To be clear, I'm no fan of Tesla's system. But are they, or are they not, safer than human drivers? Are they, or are they not, safer than competing systems? Are they, or are they not, safer than (extant, or possible) public transit alternatives? It's answers to these questions, ones about _how to improve our safety overall,_ and not anecdote-driven sensationalism, that will serve the public.
    Of course it is a tragedy when someone dies on the road, but that tragedy is not increased or lessened by the brand or technology of the vehicle involved. Several of the videos you show are circumstances where _I_ might have been killed had I been the driver, so blaming the manufacturer without looking at the statistics is just muddying the water of an issue that we desperately need to understand clearly.
    I don't feel the video, as presented, was a service at all.

    • @BigBen621
      @BigBen621 18 days ago

      @stephenspackman5573
      _But are they, or are they not, safer than human drivers?_
      Without a doubt.
      _Are they, or are they not, safer than competing systems?_
      Competing systems to Tesla Autopilot are the TACC and lane centering systems that have been included in pretty much every new car in the past 10-15 years. The difference is that unlike Tesla Autopilot, nobody's monitoring the safety of the other hundreds of millions of cars, so there's no way to measure whether Autopilot is or isn't safer than the scores of different versions of TACC and lane centering from all the other manufacturers. But because Tesla does do attention monitoring even on Autopilot, and many if not most of the competing systems don't do attention monitoring, it's highly likely that Tesla Autopilot is safer than most if not all competing systems.
      _Are they, or are they not, safer than (extant, or possible) public transit alternatives?_
      Probably not.
      This is probably one of the most well-thought-out and rational posts in the entire comments section of this blatant hit piece.

  • @pyonchan1804
    @pyonchan1804 12 days ago +1

    Autopilot and similar tech just needs to be regulated properly. It's like any new technology: it starts as the Wild West, then it gets regulated. It's part of the process

  • @ryankarr7109
    @ryankarr7109 13 days ago +4

    Over 9 years, 51 deaths from autopilot. Yes, that is 51 deaths too many; however, this number pales in comparison to non-autopilot deaths in the US.

  • @thibaultlibat368
    @thibaultlibat368 19 days ago +6

    The average driver would have crashed in the majority of these examples

  • @Diptera_Larvae
    @Diptera_Larvae 1 month ago +19

    I guess with Elon Musk basically being the President now he and Tesla are immune from regulation.

    • @leoshrotri892
      @leoshrotri892 23 days ago

      not how it works

  • @neodym5809
    @neodym5809 9 days ago +1

    If lidar is so expensive, how come you can find it in sub-$1k smartphones?

    • @BigBen621
      @BigBen621 5 days ago

      @neodym5809 _If Lidar is so expensive, how come that you can find them in sub $1k smartphones?_
      Are you seriously comparing the cost of a smartphone LIDAR, which has a range of around 15 ft., with the cost of LIDAR for ADS, which must have a range of perhaps 30 times that?

  • @sepster2740
    @sepster2740 1 month ago +10

    I find it hard to have sympathy when the car alerted the guy 19 times to take over. He was misusing the technology, being careless, and having no regard for his own safety, his family, or other people on the road.
    He clearly wasn't paying attention to the road and was over-relying on self-driving.

    • @mikespohn3948
      @mikespohn3948 27 days ago +2

      Agreed. It seems most likely he fell asleep.

    • @aussie405
      @aussie405 19 days ago

      Should the car be monitoring the driver and have alarms to wake a sleeping driver?

  • @KevinL-f7r
    @KevinL-f7r 26 days ago +3

    Has any study shown that self-driving cars crash more than other cars?
    Because all I ever see is data about self-driving cars, and emotional reactions to there not being a human driver (as if human drivers were perfect)

    • @Uouttooo
      @Uouttooo 19 days ago

      Yes it has been proven that sleeping drivers = 100% crash rate.

  • @chunkycornbread4773
    @chunkycornbread4773 18 days ago +1

    The only thing Tesla Autopilot trains drivers to do is become complacent

  • @Thedishesss
    @Thedishesss 15 days ago +4

    This whole fretting is so stupid. The only thing that matters for AI driving is: does it SAVE MORE LIVES THAN IT KILLS? No driving system will ever be perfect. It just needs to be better than the average human driver.
    With a Level 1 or 2 system like Tesla's, it's even better because you combine the features of both AI and human safety systems. The AI is never distracted, and the human is supervising. The combination is only unsafe when the human becomes distracted, which is 100% the fault of the human.

  • @MBHockey
    @MBHockey 1 month ago +17

    Have to wonder what the actual reason was for removing lidar/radar from Teslas. Was it a cost-saving measure during COVID and supply-chain difficulties?

    • @Tardigrade-n7y
      @Tardigrade-n7y 1 month ago +2

      Cost is always a factor, but the issue is that (1) if a human can drive with just vision, then so can a computer, and (2) sorting out conflicting information from the cameras and radar is difficult, especially in bad weather

    • @Tschacki_Quacki
      @Tschacki_Quacki 1 month ago +2

      Sensor interference and cost. They said that HD radars would integrate better and would make more sense. The current Model S and X do come with HD radar but afaik they don't make any use of it yet.

    • @oyuyuy
      @oyuyuy 1 month ago +4

      Tesla's approach is to train their cars to drive like humans do. Humans don't have lidar.

    • @imsrini
      @imsrini 1 month ago +2

      One, it is quite expensive. Two, lidar gives you a point cloud in front of the car which can be used to detect obstacles with a lot of compute, but you have no sense of what those obstacles are. A 1-foot cube of styrofoam and a 1-foot cube of solid steel will look the same to lidar. So from an information-utility and processing standpoint, lidar does not offer much over using cameras and computer vision.

    • @MBHockey
      @MBHockey 1 month ago +6

      @@imsrini But all of these autopilot crashes where the cameras didn't know what they were looking at would not have happened with radar/lidar

  • @MrPhilipReed
    @MrPhilipReed 23 days ago +5

    If a human would have been able to see it, and the Tesla alerted the driver 19 times to put his hands on the wheel, then who was really at fault?

  • @SteveBakerIsHere
    @SteveBakerIsHere 10 days ago +2

    This is VERY outdated news. That semi crash that you opened with was when they were still using radar, and was LONG before FSD was mature. There are now 6 million Teslas on the road. At an average 15,000 miles driven per year, that's 90 billion Tesla-miles driven per year. Over all car types, there are 14.3 collisions per million miles driven. So if Tesla vehicles were like ordinary cars, you'd expect 90,000 x 14.3 = 1.3 million Tesla collisions per year. You say that "thousands" of Tesla-related accidents have been reported. If it were "millions per year" there would be a problem, but it's not. Also, you're talking about the UK version of FSD, which is FAR from complete. All Tesla owners are told at the start of every trip that they must supervise their car while it's driving. Clearly, if you slam into a very obvious semi truck, then you're not supervising the car's driving. Teslas are (statistically) the safest cars on the road, and they are safest when using FSD.
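
For what it's worth, the arithmetic in the comment above is internally consistent, though the 14.3-collisions-per-million-miles input is the commenter's own figure and is not verified here:

```python
fleet_size = 6e6             # Teslas on the road (commenter's figure)
miles_per_car_year = 15_000  # average annual mileage (commenter's figure)
fleet_million_miles = fleet_size * miles_per_car_year / 1e6  # 90,000 million miles/year

collisions_per_million_miles = 14.3  # commenter's all-vehicle rate
expected = fleet_million_miles * collisions_per_million_miles
print(f"{expected:,.0f} expected collisions per year")  # ~1.3 million
```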

  • @BALANCEDPORTFOLIO
    @BALANCEDPORTFOLIO 25 days ago +9

    another low-quality hit piece from the WSJ

    • @Kami84
      @Kami84 24 days ago

      What did they say that wasn't accurate? You shouldn't call something a hit piece just because you don't like the facts. Elon is incredibly irresponsible, and he constantly misrepresents the capabilities of "self driving." Did you know that Tesla has faked some of their promotional material showing the Tesla commuting someone to work? It was an edited compilation of many, many attempts, and they made it look like it did it right the first time with no mistakes.

  • @vabriga1
    @vabriga1 21 days ago +20

    Tesla FSD is Level 2 autonomous driving, which means the driver must watch the road at all times and keep hands on the steering wheel. End of story. If a driver does not pay attention to the road, it is (legally) his/her fault. I drive a Tesla and have FSD.

    • @joso5554
      @joso5554 17 days ago +5

      This is a legal death machine. Its name and advertising totally downplay this "legal" part. Confusion is enhanced by the name "autopilot" and the existence of truly autonomous taxi services.
      And it is vastly documented in the scientific literature that a high degree of automation seriously decreases the vigilance of any person supposed to monitor it and retake control immediately in rare cases of failure or error. The ability to react in a timely manner and take the right corrective action in unexpected, complex situations where the automation suddenly fails is proven to be much degraded, as opposed to continuous manual control, where dealing with a sudden unusual situation is much more likely to be successful (within human limits, obviously).
      This is a low-cost hardware system made for profit, covered by a mere legal umbrella to hide known, documented, dangerous flaws originating from a cheap, flawed design. Period.