Exactly, because I’m sure most of these autopilot crashes could’ve been prevented if the driver had stayed aware. I have a Tesla and I’m always driving it like I don’t trust the autopilot.
@@rick4580 He might have promised it, but he never stated that it was 110% READY TO DO SO. That's the problem: things get fed to us and we call it misleading, when in reality the meaning was never fully broken down. Just like when driving and it's called "AutoPilot": at the very start we all think fully autonomous, but in the real world it means a function that pilots something automatically, and going further in depth, the word "pilot" also implies it is not 100% capable of controlling that something on its own; it can only do so much.
Err... but by definition, Autopilot means "pilot assist". Also, lidar is not an appendix you can just remove. Elon may as well argue that dolphins don't need their sonar.
02:10 *Elon is so confident in his tech that President-elect Trump's transition team has already recommended scrapping this requirement that companies report automated vehicle crash data.* Great job, America.
It's the people's fault for fighting over red and blue like we're in a gang. Right wing, left wing, when both wings belong to the same stupid bird. As long as Americans keep fighting over the lesser of two evils we will keep getting these results. What else did you guys expect? We need to get both parties out of our government and introduce a new government that is controlled by the people, where people introduce policies and Americans vote on them online.
@EnjoyingMyTimeOnThisPlanet I agree with the general premise of ‘both sides’ exhibiting corruption, but 1) there is a *massive* difference in the extent and 2) MAGA goes *way* beyond ‘normal’ corruption, it’s pure fascism staring people in the eye. If you want to get money out of politics and install true representatives: great. But given the current state of affairs, I think that ship has sailed for good.
He's not in the government, and will only be an independent advisor. He won't have any more sway over legislation than any other multibillionaire that can buy off lawmakers.
From watching Mentour Pilot, I have come to realize that autopilot in aviation has a completely different meaning than what it means to the general population.
Pilot here, yep. The public looks at aircraft autopilot as a navigation system that does it all, not recognizing it has to be programmed with a very specific plan, has a couple dozen combinations of operating modes, and has to be managed.
I think pilots don't even call it autopilot, or at least that's not what the manufacturers call it. Pilots get extensive training on what the system can and can't do. Big difference from Teslas, considering how Elon constantly talks about self-driving cars. Back in 2015 he said fully self-driving cars were just 2 years away; it's not a secret why he called it Autopilot.
And we the people who drive have a right to know if other vehicles on the road are safe to operate in that mode. Especially given that the rules about whom to sue when there is a crash aren't really clear.
Tesla has been late or dishonest with every launch of technology: Cybertruck = junk, Solar Roof = junk/lies, new Roadster = ha ha ha. "Oh, it was supposed to land in the ocean." People are fascinated with these crooks who become billionaires by leading you to believe that, as part of some inner circle, you'll all be better off. It's just a shame that they're all now running the country because younger people are too stupid not to vote as if everything is a popularity contest. But when a Tesla runs them off the road and kills their family, they can always send a message to Elon on X.
@@martontichi8611 That's how safe Teslas are: WSJ had to go back 3 years to find a talking point in 2024. Tesla Autopilot is 10X safer than human drivers. You're all NPCs, I mean, who believes mainstream media?
After a few phantom events where it tried to kill me and my family on the M5 in the UK, I now refuse to use it. I actually feel stupid for paying thousands for the full package.
I’m glad you and yours are OK, and no, you aren’t stupid, you were deceived, and so were many, many others. We pay for things with the expectation we will get what we paid for. Shame belongs to those that profit from deception.
We have two 2023 Model 3s. I used FSD extensively from 12.3.6 release and stopped using it during the fall due to several near collisions that could have been devastating - one a drift by FSD over the double-yellow lines on a blind curve on US Hwy 50 west of Tahoe. In that case I was able to take over and return to my lane before meeting oncoming traffic. But the next time … not worth finding out. FSD is amazing until it isn’t.
If companies like Tesla put automated systems in place, they should record the data (as they do). And it should be mandatory that other organisations or the public can look into the data, so the customer knows how many crashes, and what kinds of crashes, are caused by these systems.
Okay, but thousands die every day from normal cars. Why is no one using this data to reduce our dependence on cars in general? We should not be killing our citizens by forcing everyone to drive.
The human behind the wheel is supposed to remain attentive as if still controlling the vehicle. What we have here are examples that lead to human controlled fatalities and the number is miniscule as a percentage of Teslas and Autopilot/FSD miles driven. I have many miles of experience using both systems and it gives me a litany of safety advantages compared to vehicles that lack comparable technology. Teslas aren't designed for level 4 or 5 autonomy yet, so your argument is moot in relation to this story.
@@rogerwilco2what is the worst thing you think will happen that day? Anything falsifiable? Do you think he will increase the $7500 tax credit for EV buyers? Or increase the carbon credits? Or accelerate the ban of gas cars? Or create more regulations that are easy for his companies to comply with but hard for newer EV manufacturers to comply with so Tesla won't have any competitors in their way? Because I think none of those things will happen, and actually the opposite will happen.
You guys need to make a full 1-2 hour documentary on this. It sounds like you guys found out a lot more than what you said in this video. I would watch a full documentary on this.
I agree, it even looks like too much was edited out to have it be short enough. Strange editing job (and I'm a pro), lacking explanation of given arguments.
I fully suspect Tesla’s legal team would be down their throats for stating anything “unsubstantiated” in this report. It is likely they found damning evidence on the hacked computer, but that evidence is likely considered “not legitimate”
@@Dr.AnonymousPro definitely, that’s pretty much the story of every Wall Street Journal video. It’s always way too short about the various topics they cover.
I don’t use my tesla autopilot anymore because of too many phantom events. It’s like driving with a new learner who needs constant attention. You are constantly on edge waiting to grab the wheel
I've had this in the UK just with EAP, not even FSD. It makes me so nervous that I am absolutely exhausted when I stop because I've been concentrating so hard; it's more stressful than driving manually. So I rarely engage EAP now, and only really use it if the highway is empty, but even then I've had phantom braking occur for absolutely no discernible reason.
Tesla's whole strategy of calling things "Full Self-Driving" and "Autopilot" and then arguing that the customer was wrong to expect the car to competently fully self-drive or operate on autopilot is truly pathetic.
An airplane on autopilot cannot handle engine failure, altitude recovery, severe weather, instrument failure, hydraulic failure, runway changes, fuel management, route planning, collision avoidance, the list goes on. It really shouldn't be called autopilot huh.
@@SuperBotcreator To be fair cars with lane assist and adaptive cruise control are about as automated as aircraft autopilots. The problem is Tesla claiming it's more capable than it is and a public that don't understand how autopilots actually work.
Well, he's right... After all, the Ohio Supreme Court just ruled that 'boneless' chicken wings aren't necessarily boneless, so it's totally allowable to have secret bones that kill you and it's no one's fault but your own... This is where we're headed. The corporations rule it all while we die...
People, when you are behind the wheel of any vehicle you have one job: drive the vehicle. Never mind all the other distractions today's modern cars have. It's your life you'll be saving, as well as others'.
The guy leaving at 3 am for a 3 hour commute might have used it to get some more sleep? That commute at that time of night is insane and unsafe no matter what you do.
@@WarriorPaxo You market a product as FULL SELF DRIVING, then take out crucial sensors needed to safely operate an autonomous vehicle, try to sabotage government oversight, and cover up the data. You are absolutely responsible for the outcome.
@@WarriorPaxo 😂 I know, right? These people hate him so much, never mind crash data that says Teslas are still better at saving lives than normal cars. ¯\_(ツ)_/¯
@ man, but you'd think it outweighs the PR. I remember when he first announced it people were so confused, and now even more so. We know lidar and radar are better. 🤦
@@JeffyShyChannel Radar was removed because it was conflicting with the cameras, with the cameras being proved correct. Lidar is not needed. V13 is showing that, and will make our roads far safer.
@ except cameras are terrible with reflectivity. Really it's the marketing at Tesla that's at fault. People are way too confident letting it fully drive with no attention to the road.
The entire car costs like 30,000 USD. One radar would cost a few hundred USD and wouldn't be of any real use. One LIDAR, which would be useful, costs a few thousand dollars for this use case.
Good pick-up. Pretty crazy commute. Obviously a super hard-working and loving dad; tragic for his wife, kids, parents, and friends. The comment at the end rings true: it's too much to ask of people to simultaneously trust their car's auto-driving feature while paying attention. I think fully autonomous is the way (i.e. Waymo) if you want that sort of thing, but we're so far away that this transitional period will be really bumpy. Personally I found Tesla's Autopilot too nerve-racking and had too many close calls, so I just never use it.
Indeed, this is absolutely insane. I wish we could figure out high-speed rail like virtually the rest of the world. A train at 160 mph would do the same route in 1 hour, plus 30 minutes to get to and from the station; boom, you can leave your house at 4:30 instead of 3 AM.
That's what the video is trying to highlight: when we start to "trust" that the autonomous driving works, we lay back, let the system do its job, and lower our sense of awareness. Once the system fails to detect an obstacle and we see it in front of us, it is already too late: the distance is too short and our response time is too slow to make that quick decision because of the lowered state of awareness.
I am a retired software engineer. We always had a rule that interface design had to be clear, understandable, and aware that people make mistakes. We had to be very careful how to frame the prompts and check the user inputs. Here, Tesla has the prompt that the car has an "autopilot". This violates every guideline that a software engineer would use because it guides user inputs in an undesirable direction. To those who commented here that the drivers must bear responsibility and that Tesla cars give better performance than humans alone: this "autopilot" prompt is criminal regardless, especially as it is used for marketing purposes rather than human safety.
This video isn't a critique of self-driving cars. It's specifically investigating the camera-based technology that Tesla uses, and its misleading claims about it. Human drivers are far from perfect (how many millions of crashes have been caused by truckers making mistakes?). AI will 100% replace human drivers, and make driving enormously safer.
@@gatowololo5629 Well I guess it may be stopping each time the system is not sure what it sees. That could again cause even more accidents. It seems to me that without a radar it's not going to be a reliable system.
@@gatowololo5629 Probably because of the data it's trained on. Technically or humanly, it is impossible to train the system on every single possible thing it may encounter, so it's trained only on a small subset of all possibilities. From that it follows that the case of "encountering an unforeseen / unrecognizable object" is what's happening all the time, many times on each trip. And most of those encounters are harmless, of course. That's okay when you are training a computer to recognize cats and dogs to impress your CS tutor, but definitely not okay when people's lives depend on it.
I have a little different perspective here, also as a Tesla driver. I was driving down the road, admittedly a little in my own world. The driver in front of me slams on their brakes. Full brake check. The car literally slammed on the brakes and swerved out of the way to an area with no cars. It was all captured on multiple cameras for me to rewatch and analyze what I could do better. I do not trust Autopilot the same at night and I'm extra attentive. That truck would have been incredibly hard to see even if I was driving the car normally. The Tesla has saved me from near misses multiple times that in my old truck would have been accidents. I feel much safer in my Tesla, and you agree to MULTIPLE warnings that the system is not fully automated and that YOU are responsible for paying attention to the road.
Going to Vegas, my brother’s car automatically braked when someone slammed on their brakes in front of us. Car then sped up when it realized the car behind us was going to hit us because it started braking late. It was clutch.
as a Tesla driver, i thank WSJ and the whistleblowers in this video for making plain the shortcomings of Tesla autopilot. seeing these examples i can drive much more aware now. this should be required viewing for all Tesla drivers...to familiarize themselves with the specific weaknesses in the system.
You didn't know that you were supposed to remain attentive and are still ultimately responsible for the movement of your vehicle? Your Tesla came with a variety of such warnings. If you thought your Tesla was level 4 autonomy, that's on you personally. We receive constant reminders that's not the case.
You needed a report to know that you are the driver and not Tesla's autopilot? Like sorry to be rude, but no wonder Tesla drivers crash if this kind of thing is news to you.
Haha, I thank the reporters for sharing details about the types of accidents that have been taking place and you all are so threatened you start insulting me and making up things I didn't say. Sorry but you're the people who lack nuance, not me.
I think people equate autopilot on a car with autopilot on an aircraft. If there is a problem in an aircraft the pilot normally has plenty of time to react due to the separation interval between aircraft and the altitude above the ground. If the self driving automation fails when the nearest object to collide with is maybe 10 feet away the car will hit it before the driver even realises there is an issue.
@@-TheUnkownUser false advertising is illegal. Litigation over it is why Tesla had to rename it to "Full Self Driving (supervised)". Stop fantasizing about a world where there's evil people getting away with crimes, when there are real people actually getting away with crimes.
Sad reality is, people autopilot their airplanes into the ground all the time. The public thinks autopilot is some magic button to just push and forget. And that is not true.
Immunity from what? Tesla has already had immunity from all of this for years. Tesla has already won tons of lawsuits about crashes. He doesn't need immunity. If he does need immunity, can you name any cases he has lost that Trump will grant him immunity from?
Had a Model 3 with FSD for two years. Important note: I live in the Northeast but lived for six years out west. I understand why Tesla went all in on cameras only, but in every other part of the US, where roads are not well marked or lit, that FSD is NOT ready for prime time. We had numerous false reactions from the vehicle, from NOT seeing hazards on the road to reacting to hazards that did not exist. The car on more than one occasion slammed on the brakes on a totally empty highway while doing 70 mph. Sold the car after too many occurrences. On top of that, the lowest-end Hyundai has better build quality than those cars, and I'm not the first person to point that out.
I've had a Tesla Model 3 for the past 2 years, and although I didn't buy it for the autonomous driving feature, the improvements that have come in the last 6 months have been remarkable.
Hyundai has (and has had) awesome build quality for years! You get a 100,000-mile / 8-year warranty and the panel spacing is mint! Chrysler cars are a better comparison…
Night cases with the overturned truck and the pick-up are definitely avoidable by radar assisted brakes, which are available even in low-end cars today.
Teslas originally relied on radar, too, but Musk has decided to phase that out, even in cars that still have radar hardware. His explanation was that for lower-resolution radar, there are too many false positive events that cause the cars to slow down for nonexistent threats. I can confirm that this was indeed a common annoyance in the past, and possibly a way to get rear-ended, and since going to vision-only, that annoyance is mostly gone. But, that doesn't mean Musk is necessarily right. Could be that his engineers just weren't good enough to properly filter the radar data. Radar has always been an imprecise sensor, it's only good signal processing that makes it useful.
@@nathanscandella6075 yeah, part of the issue AIUI was the conflict between camera and radar in Tesla's stack _specifically_, rather than necessarily an issue inherent to radar. And when radar is bad enough to freak out because of a leaf in the road, it's at least consistent enough that you learn it quickly and compensate, rather than inconsistent and seemingly out of nowhere like with phantom braking.
@@sureshkrjsl Just because it was not visible in the onboard camera replay does not mean it was the same for a human in that situation, if the human had been paying attention. The event replay is of very bad quality, probably on purpose, to give you the impression of not being able to see. I have been in a somewhat similar situation: autumn storm, rainy conditions; suddenly I saw what looked like a tree fallen on the road and braked later and harder than normal due to bad visibility, but still before my car's auto braking activated. The Model X next to me ploughed straight on until suddenly swerving just before the tree - that was most likely driver intervention. It took significant front-end damage and punctured a tyre. No people were harmed. But he cleared the left lane for the rest of us at the cost of his own car. :D
The Autopilot name comes from the aviation industry and it is technically correct. In a plane, having the autopilot active doesn't mean the pilots can leave their seats; the pilots need to be able to take over at any instant. The problem is that the general public doesn't understand it as such, and this causes much confusion. Now, if you are a Tesla owner, when you activate Autopilot, the instructions for using it on the road are very clear: this is meant to be used on one-way roads and you have to be able to take over at any time. There are also warnings if you don't keep your hands on the wheel, and if you don't comply, Autopilot will disable.
@@alexbian5567in some countries they had to change the name, but in most it has been kept as it technically describes precisely the feature making it difficult to rule against it.
@@fdelaneau They've updated it so you don't need to have your hands on the wheel. I think Autopilot is absolutely awful compared to FSD, and not just in a features way; it is actually just terrible at making decisions. I get cutting features off FSD, but the actual logic and programming is terrible.
I’m a Tesla driver. I don’t like Musk, and I feel lied to about the car’s capabilities. BUT, it’s saved me from at least two accidents. Here’s the data point we need but they didn’t provide: “how many accidents per million miles driven WITH and WITHOUT autopilot”. Naturally corrected for other variables. Give me that data.
_Give me that data._ Ask and ye shall receive. "In the 3rd quarter [of 2024], we recorded one crash for every 7.08 million miles driven in which drivers were using Autopilot technology. By comparison, the most recent data available from NHTSA and FHWA (from 2022) shows that in the United States there was an automobile crash approximately every 670,000 miles" (Tesla Vehicle Safety Report 3Q2024).
@@economicprisoner Yes, I agree, although speeds are much higher on freeways, which means accidents that do occur are more likely to be fatal. The problem is that NHTSA doesn't segregate crash data based on road type. But they *do* segregate fatality data roughly by road type (actually rural vs. urban), and one might assume that crash data would at least roughly follow fatality data. In brief, the rate of fatal accidents on rural roads, which of course include intercity freeways, is 1.68 per 100 million VMT, while the rate for all drivers is 1.33 per 100 million VMT. This suggests that there are *more* fatalities per mile in rural areas than the average, which would make the ratio of 10.6 between Teslas on Autopilot and average drivers even more remarkable.
@@bardz0sz _how would you count the accidents prevented by autopilot_ You can only infer avoided accidents statistically. Say you have a group of 100,000 vehicles without autopilot, which experience 5 accidents per year; so the total accidents are 500,000. Now you replace 10,000 of them with vehicles *with* autopilot that experience only 4 accidents per year. Now the total accidents are 90,000 x 5 plus 10,000 x 4 = 490,000. So you can infer that replacing 10,000 vehicles without autopilot with 10,000 vehicles with autopilot has prevented 500,000 minus 490,000 = 10,000 accidents.
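For anyone who wants to sanity-check the arithmetic in this thread, here is a rough back-of-the-envelope sketch in Python. It only replays the figures quoted above (Tesla's 3Q2024 report figure, the cited NHTSA/FHWA rate, and the purely hypothetical 100,000-vehicle fleet); none of them are verified here, and the fleet numbers are an illustration, not real data:

```python
# Figures quoted in the comments above; not independently verified.
miles_per_crash_autopilot = 7.08e6    # Tesla Vehicle Safety Report, 3Q2024
miles_per_crash_us_average = 670_000  # NHTSA/FHWA figure cited for 2022

ratio = miles_per_crash_autopilot / miles_per_crash_us_average
print(f"Autopilot vs. average miles per crash: {ratio:.1f}x")     # ~10.6x

# Hypothetical fleet from the comment above: 100,000 vehicles at 5 accidents
# per vehicle per year, of which 10,000 are replaced by vehicles at 4 per year.
fleet, replaced = 100_000, 10_000
rate_without, rate_with = 5, 4

baseline = fleet * rate_without                                    # 500,000
mixed = (fleet - replaced) * rate_without + replaced * rate_with   # 490,000
print(f"Accidents prevented (inferred): {baseline - mixed:,}")     # 10,000
```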
After watching this, I think I need to point out that Toyota has had a level 2 autonomous system available since they put it in the Prius in 2010. The system is similar to Highway Autopilot on a Tesla. Toyota uses a radar array and cameras. The radar sees stuff like this and will slam on the brakes, and this is why Toyota is seldom if ever in the news for this kind of accident. The radar system doesn't have to interpret anything; it knows if you are about to slam into a stationary object. In 8 years of owning a Prius, I had it slam on the brakes once (about 75% faster than my foot could; my foot hadn't even left the gas pedal and the brakes were already applied, avoiding a collision) and warn 3 times to stop. 2 of those were false, as that is the nature of radar; sometimes it doesn't understand sudden curves, angles, or sudden inclines in the road, but I'd rather have a few false positives than slam into something.
Besides my 2021 Model 3, I have a 2017 Prius Prime. Its lane assist is a joke compared to Autopilot. It can't even follow a gentle highway bend. I keep it off. Its lane departure warning is annoying; I only turn it on while on highways. A radar wouldn't have been helpful with the overturned semi, as it would have been ignored because it had been stationary since the first time the radar saw it (radars are designed to ignore stationary objects to prevent phantom braking). The difference here is the Prius driver would have been looking at the road 100%, otherwise he would have ended up in the ditch. In a Tesla, 99% of the time, Autopilot will do fine. It's that 1% that is problematic if you're not looking at the road, hence why you SHOULD KEEP LOOKING AT THE ROAD; it's NOT an eyes-off system!
No, the reason why you don't see Toyota in the news is because they don't get clicks and nobody in the news team has a vendetta against them. Toyotas lately have had quite a lower safety record than Tesla; it's not even a competition at this point. So Toyota better step up their game. And btw, those false positives are why Tesla decided against the radar systems, because they cause people to rear-end you. Yes, because people suck at driving and they will rear-end you.
@@sylvaing1 A 99% correct system shouldn't be used by the public. And calling it a beta is not an excuse. Actually, a 70% system would probably be safer to use, because at least you'd be able to keep your attention on the driving. But when it's at 99%, you don't have enough to do, so you get bored. When you're bored, you'll start thinking about other things (daydreaming) or even do other things (go check your phone, ...). It's just human nature.
I had a “false positive” in a Subaru Outback once and it decided to lock its brakes up at speed on the interstate. Almost got steamrolled by the traffic behind me. All because there was a different color road surface patch. Be careful what you wish for.
@@alamogiftshop I had a 2024 Volvo XC40 Recharge Ultimate while my Model 3 was in the body shop last summer. It also did a phantom emergency braking on a curve on a regional road while I was going 105 km/h. In my case, there was nothing on the road, no change in the asphalt condition, nothing. The dash displayed "Emergency braking applied to avoid a collision"; well nope, there was nothing.
Imagine always having a beginner driver driving, and then having to be ready to take over all the time. This seems to be what it's like to use autopilot
I agree. I think it's more difficult than simply driving yourself. Most of the people saying the drivers should have been paying more attention are missing this point.
Exactly - they are relying on humans being the back-up for a computer, rather than the computer helping the human avoid mistakes (like aviation automation works).
💯. Need to text at a red light? Autopilot is great for getting going when the light turns green; it's surprised me a few times. Why is that car camping in the left lane? Oh, because it's on Autopilot. Always drive; let Autopilot assist. 🤦🏽♂️ This crash is dumb. Condolences to the family, because a father and a husband had to become a statistic showing why it's important that only you are responsible for your safety.
I think people should be a lot more curious about how the people who get into trouble with it were able to convince themselves that it was safe, because I've never gotten that impression from it; you can watch people demonstrating it and it makes mistakes constantly.
In Houston old traffic markings are often left behind after lanes have been re-routed. In rain and fog at night sometimes the old remnant lane markings are MORE visible than the “correct” ones. It’s sometimes difficult for even a highly experienced person to understand what to do. I don’t even trust myself to get it right sometimes. A low-resolution camera has no hope!
Ditto here. The “lane keeping” function of my Kia Sportage gets fooled all the time by all the different markings on the road. I put it on once in a while for fun, but not for long!
"In rain and fog at night" Add to that bright sunlight but at a low angle, e.g. just after sunrise. With concrete pavement in particular (high albedo), seemingly every place where a marking had ever been on the pavement lights up.
Just had this experience myself while driving close to sundown. The only way to tell which markings were correct was to use the mirrors. You could clearly see the lanes in the rear view, but looking forward, it was almost impossible to tell.
Been using Tesla's FSD for 2 years now and over 70,000 km across the US and Canada. I think it's incredibly beneficial to safety, but I honestly think standard Autopilot is dangerous. It is cruise control; that's all it is. It will drive straight into anything in the road. The name Autopilot is misleading tbh. It should only be referred to as advanced cruise control with lane keeping.
I always knew that’s what it was. My family owns a Tesla and never uses autopilot at all. People should do more research before allowing a car to drive you.
@@personyt55 The only "research" you need to do is to read the Tesla documentation on Autopilot, which states it's a driver's aid and you are fully responsible. Operate Autopilot, recognize its capabilities, and use it within those capabilities. Simple as that. 80% of my driving is on Autopilot and I understand completely what it can and can't do, and it adds to my safety because the fatigue of maintaining speed and maintaining lanes detracts from my ability to watch the road carefully, which is what I can do more intently with Autopilot on than off.
The term is correct. This is exactly what autopilot in aviation is. It will keep an airplane's height, speed and heading, comparable to keeping a car in its lane. And it will fly into any mountains that show up if the pilot does not do his job.
I read that there are already plans for cyan or green (can't remember which) colored lights on vehicles in the near future to let everyone know that a car is in autonomous mode.
After I got my first Tesla this year, common sense made me not rely on Autopilot in these situations: when it's dark, when it's foggy or raining heavily, or when the road markings or road conditions are bad. But it's a nice feature in good weather on nice roads and I enjoy it.
The problem isn't only the Tesla driver, but those of us who did not sign up for the program and get hit by one of these. Makes the Corvair look pretty tame now.
Tragic, no doubt. 3:40 The driver was warned 19 times by the Autopilot to keep his hands on the steering wheel before the crash. Most likely he fell asleep, and the Autopilot kept him from crashing well before the flipped truck appeared. This accident would have happened anyway in any other car, with him losing control of the car. I'm hoping that this technology will get so good that it will prevent anybody from dying on the roads.
This is a dumb, yet extremely common, form of sophistry used against deployed autonomy. I also didn't sign up to be on the road with rednecks' monster trucks who achieve "safety" via enormous mass and a bumper that's at the same height as my head in a sedan. Or, a child's head on a crosswalk. Because, you know why? We don't all get to approve/reject each others' vehicles. If you want to argue that Tesla should be better about publicly sharing safety statistics on their autonomous cars, and IF those statistics show significantly worse safety performance, then use that to push for specific regulation. "I didn't sign up for ..." is irrelevant. This is a big, stupid country full of idiots doing harmful things to one another. I didn't sign up for that, either, but we live in a society.
The basic conflict of interest between DOGE and being a CEO at so many large companies should be obvious. Musk should be made to divest as CEO in many places before being able to head a government program meant to guide regulations and enforcement. I'll also point out that this is not uncommon in the corporate world. Often when you take one position you are not allowed to lead or own conflicting interests.
Literally every situation shown in this video wouldn't have happened if they used LIDAR. Elon was so set on just using cameras despite warnings from the engineers. The blood is on his hands.
Why does everyone think lidar is some kind of magic solution. A few months back a Waymo crashed into a large pole, despite being equipped with Lidar. V13 of FSD is proving Lidar is not needed.
@@CmdrTobs V13 is showing it to be the right decision. Tesla creates a pseudo-lidar with its cameras, and it has been shown doing remarkable things. Lidar is not some super sensor. Only a few months ago, a Waymo crashed into a large pole.
I think the abrupt shift away from lidar to camera-based reinforcement learning was the bad move. At least continue to use lidar to build the model for detections and reactions, and then introduce that training data to newer vehicles without lidar. Their decision to just replace everything with cameras, then push Tesla Vision as an OTA update almost 2 years later, was truly the dumbest decision their engineering team has ever made. No idea why they didn't prioritize that rollout first, then focus on physical lidar removal. Tesla Vision could have been such a huge deal when they shipped it, but instead it felt more like a patch update to fix bugs, since that's basically what it did for all of the vehicles produced without lidar.
@@Nullmoose I'm not sure what you are talking about? Tesla has never used Lidar. They did use radar, but got rid of it because of it conflicting with what vision was saying. With vision being proved correct. Looking at the performance of FSD. It would seem to be the correct decision.
I was actually working on the same job site as Steven before this incident. I had just got my Tesla and would park next to Steven, until he stopped showing up one day. After I heard about what happened to him, I learned to never trust the Autopilot without supervision. I didn't know him personally aside from the occasional morning pleasantry, but it's tragic how many construction workers die due to having to commute far while sleep deprived. My heart goes out to him and his family.
Highly likely, or looking at the phone. How this is Tesla's fault is beyond comprehension. One can get complacent if the car does most of the work most of the time, but failing to look ahead while you're cruising at that speed is reckless.
@@FrancescoDiMauro It's Tesla's fault because they allow people to engage in this behaviour without properly accounting for it, at the same time as Elon Musk brags about how perfect it is to pump up shareholder value, while it isn't even close to being as safe as a human driver in dire scenarios. It isn't allowed to be called "Autopilot" in Europe for this reason, because the branding puts a false sense of security in the brain of the driver.
@@Aimaiai So should you also ban dumb cruise control? And how do you know it isn't as safe as humans in "dire situations"? What IS a "dire situation", and what FACTUAL statistics do you have to compare AP vs human responses in such cases? Showing one accident like this video does is meaningless. I can post a video showing a human driver doing something dumb and causing a terrible accident. Do we then ban all human drivers? Seat belts occasionally jam in freak accidents and cause an occasional death. Do you propose we ban seat belts too?
Seeing this video makes me very happy that I made the correct decision not to use Autopilot when driving a Tesla rental car for 2 days. During the rental period, I was even driving at midnight. FYI, without using Autopilot, Tesla cars are very unpleasant to drive.
I drive around 6k miles a month with at least 2500 of those on autopilot and FSD. Amazing tech but it’s still supervised and you have to keep your attention on the road.
as I understood the technician, the problem is that the computer doesn't know the exact position and orientation. It's not about the mounting, but what the computer is being told about the mounting. Bad calibration leads to bad assumptions by the computer
@@paulfrydlewicz6736 Every vehicle goes through a "camera calibration" during the first 100 miles or so during which advanced driver assistance is not available. This is designed to calibrate the camera using stationary points on the ground repeatedly until there is a map of this vehicle's camera for the computer.
WSJ is clueless and mostly pumps out propaganda based on whatever company is bribing them. They have a colorful history of Tesla propaganda that only died down because Tesla's stock price rallied in 2020, leading them to calm down on it.
The data doesn't support their assertion that the software is faulty; that's the problem. The article also says "autopilot", which is a nebulous term in the Tesla world but in this case applies only to Autosteer (lane keeping + adaptive cruise control) and does not apply to FSD. Feel free to search for "Tesla Vehicle Safety Report", which has the breakdown by quarter.
You can see from the first two videos that if the drivers were awake, there is no way they would crash into a huge object in front or heading straight when approaching a T-junction. Both were very likely asleep at the time.
@@mazicort_ _we still don’t know if AP is safer than regular driver or not_ Well, yes we do. "In the 3rd quarter [of 2024], we recorded one crash for every *7.08 million miles* driven in which drivers were using Autopilot technology. By comparison, the most recent data available from NHTSA and FHWA (from 2022) shows that in the United States there was an automobile crash approximately every *670,000 miles"* (Tesla Vehicle Safety Report 3Q2024).
Tesla Full Self-Driving has come a long way. I was driving and the Tesla recognized a plastic bag in the wind as an obstacle and slowed down accordingly.
We should be focusing on data and statistics. People die every day in multiple different cars, focusing on specific examples is just misleading and not focusing on making the roads safer.
It is important for people to understand the weaknesses of a system to prevent tragedies. Statistics are important, but they don't help analysis to that extent, especially when the data needed to make more specific statistical inferences is kept hidden.
@@ErinWilke Yes, that data should be made available to regulatory bodies. I don't think it should be publicly available, since that seems like a breach of privacy with no actual upside. As it concerns this video, making this an emotional story about a specific case is in poor taste. The fact that it's about an accident that happened several years ago with a past system makes me question its relevancy.
Surely if the cameras are giving contradictory data, the vehicle ought to slow down in order to get a clear picture and minimize the danger of a collision?
@@Steyr6500it does sound a loud alarm if a collision is imminent. The “forward collision warning” in my Tesla has 3 choices: early, medium and late. I set mine to medium because early was giving too many annoying false positives. There’s also automatic emergency braking. It doesn’t guarantee a crash won’t occur, but should reduce the severity of the impact.
@@bikeman7982 I don’t mean do an emergency stop though. Just slow down a bit to minimize danger. And of course, that can also depend on the rear-view cameras.
You presented maybe 5 actual data POINTS in the video, with an additional data set in the form of what crashed based on what. I really expected more from a multibillion dollar journalistic company.
I would have liked it if the video did a better job contextualizing Tesla Autopilot crash rates vs human crash rates. This would help better understand Elon’s claim that Autopilot is safer than human driving.
Saying it’s safer than a human driver includes all kinds of reckless drivers and drunk drivers. It has to be safer than a responsible human driver. I’m pretty sure that’s nearly impossible in the short term.
Saying it’s safer than a human driver is like saying that AI is a better writer than a human. Sure, it’s better in some cases and in some ways, but not all. It’s a huge generalization. Autopilot is a better driver in that it can stay centered in the lane better, but is a worse driver in so many other ways. Same with FSD. It’s a way better driver… until it’s not.
Humans have more total crashes than Tesla autopilot has crashes. Therefore, autopilot is safer. This is Elon logic. Please ignore that there are a much larger total of human drivers than Tesla autopilot drivers.
FSD is not perfect but it keeps getting better. I know that's no consolation to the people who lost loved ones, but the truth is that this technology is going to save thousands of lives in the near future.
_even if we had autopilot, we wouldn't be stupid enough to call it that._ Is this somehow dramatically different from Drive Pilot, which is what M-B calls their Level 3 ADS?
it's a GOOD term! look at the definition from Wikipedia for example: "An autopilot is a system used to control the path of a vehicle without requiring constant manual control by a human operator. Autopilots do not replace human operators. Instead, the autopilot assists the operator's control of the vehicle"
@@wkymole3 _they just wouldn't call it autopilot. For liability reasons._ Just what liability does this create for Tesla? They've already won two lawsuits, one in Germany and one in the U.S., brought by folks who claimed it's misleading.
Years ago, I knew Elon was not smart when he, with full confidence, announced that he was shipping “autopilot” capable cars without radar. A pure camera based solution does not work for that use case.
@jacksmith-mu3ee LIDAR (all caps because it's an acronym) was invented in 1961; it wasn't invented for self-driving cars. I'm not sure what your point is.
Not a Tesla driver here, but I’m assuming self driving tech has improved since 2021. I can’t believe this is the focus of a report in 2024 ( almost 2025).
And you should. That's what the company expects you to do. Seriously, it's a glorified cruise control that literally tells you at night when it's unsure about the conditions. There are so many videos out there of morons who try to game the system and end up dying because they don't pay attention. IRL, autopilots on airplanes require the same level of supervision, if not more.
Unfortunately, in this case, she is going to lose her court case simply because her husband IGNORED the car's repeated warnings to keep his hands on the wheel. He was pre-warned. :(
Okay, maybe this is an unpopular opinion, but the driver is still the one responsible. Autopilot, ABS, parking sensors etc. are just features that make driving easier and safer, but the driver needs to stay vigilant and observe the road. I feel like in many of the crash videos from Teslas it looks like the driver wasn't paying attention to driving. :/
Honestly, seeing the truck lying on its side in the middle of the road, I'm not sure a human driver could have done anything either. The result would probably have been the same. It's the truck driver who killed him, not the Autopilot.
If I were John Bernal, I wouldn't be putting my Tesla in 'Autopilot' mode, just in case my vehicle had gotten a unique update that diverts it to somewhere dark.
Not a Tesla fanboy, but shouldn't the news also show how many times Autopilot saved someone's life or avoided a crash? I mean, the news is supposed to be fair and balanced, right?
I’m sorry, but when the car warns you NINETEEN TIMES?!? I have empathy for anyone who loses their life, but don't blame the car when the humans don't do what they're supposed to, and that's called paying attention to the road. Also, how many people died in other cars…at night…when visibility is low? Let's compare apples to apples and see how that goes.
In other words, this self-driving technology is perfectly safe... as long as it isn't used for self driving like CEO Musk claims. And morons continue buying Teslas for this? 😂
I'm surprised they don't mention the crash in the Bay Area where the guy's Tesla veered out of his lane and tried to merge with a concrete divider that was missing its normal crash barrier - he'd told his wife that several previous times on his daily commute he would see the autopilot turn toward the divider as if it thought it was seeing a passing lane, and he'd grabbed the wheel to override it - but on this occasion he was preoccupied for a couple critical seconds and didn't react quickly enough, so he died in a head-on collision with a concrete barrier. Tesla reviewed the accident and pronounced it "driver error" because the driver had failed to override the autopilot when it made a mistake. The fact they could say something that absurd is why I'll never buy a Tesla.
3:47 Sure, the Autopilot messed up here, but if the driver had been alert and supervising the FSD, you can clearly see the flashing lights and that crash would never have happened.
That’s the issue with creating a false sense of safety. People end up relying on the tech too much and either get distracted or zone out. Hence why they’re getting investigated for false marketing, because even people like my aunt think that it drives her everywhere with no issues.
Definitely, that specific one could have been avoided by the driver paying attention. The problem is Tesla's marketing (self-driving, autopilot, etc.) combined with their refusal to make crash data public means that people have been misled into trusting the technology. If you think Tesla crashes are one-off tragedies, versus actually understanding the shortcomings of the technology, it's easy to become overly reliant and trusting of the tech. I don't think it's fair to blame individuals when corporations and billionaires lie to make a profit and endanger their customers and the public as a result. Billionaires and corporations are NEVER held accountable.
Autopilot is not FSD, BUT BOTH CAN BE DISENGAGED BY MERELY APPLYING THE BRAKE! Seeing anything blocking the road like a downed semi, I would have applied THE BRAKES! I have been using FSD since 2022 and it is improving so quickly. Autopilot is NOT FSD, as it only steers without you having to use the accelerator; to stop, one needs to brake. The video of the accident proves 1) there was plenty of time to brake, 2) he was going faster than he should have, 3) he was not paying attention. Did his Tesla even have FSD? Because it would have slowed and sent alarms to the driver.
"I don't think this is a long term technology that we're gonna keep in the cars." is actually a fair assessment of why its so controversial for cars. If this type of tech were put into something that didn't encounter so many unknown variables, then it would be more effective than what it was marketed for.
I can easily imagine running into that overturned semi truck. No flares out and it was dark. It fooled the auto pilot, but it could easily fool the human.
The principles of defensive driving say you should slow down and even stop until you figure out what is in the road ahead. You seem to be forgetting that.
@ I went along with you until you added “you seem to be forgetting that” It was a dig, and unnecessary. And I do trust that you do indeed know that. See? I trust you.
To be clear, I'm no fan of Tesla's system. But are they, or are they not, safer than human drivers? Are they, or are they not, safer than competing systems? Are they, or are they not, safer than (extant, or possible) public transit alternatives? It's answers to these questions, ones about _how to improve our safety overall,_ and not anecdote-driven sensationalism, that will serve the public. Of course it is a tragedy when someone dies on the road, but that tragedy is not increased or lessened by the brand or technology of the vehicle involved. Several of the videos you show are circumstances where _I_ might have been killed had I been the driver, so blaming the manufacturer without looking at the statistics is just muddying the water of an issue that we desperately need to understand clearly. I don't feel the video, as presented, was a service at all.
@stephenspackman5573 _But are they, or are they not, safer than human drivers?_ Without a doubt. _Are they, or are they not, safer than competing systems?_ Competing systems to Tesla Autopilot are the TACC and lane centering systems that have been included in pretty much every new car in the past 10-15 years. The difference is that unlike Tesla Autopilot, nobody's monitoring the safety of the other hundreds of millions of cars, so there's no way to measure whether Autopilot is or isn't safer than the scores of different versions of TACC and lane centering from all the other manufacturers. But because Tesla does do attention monitoring even on Autopilot, and many if not most of the competing systems don't do attention monitoring, it's highly likely that Tesla Autopilot is safer than most of all competing systems. _Are they, or are they not, safer than (extant, or possible) public transit alternatives?_ Probably not. This is probably one of the most well-thought-out and rational posts in the entire comments section of this blatant hit piece.
Autopilot and similar tech just needs to be regulated properly, but it's like any new technology it starts as the wild west then it gets regulated, it's part of the process
Over 9 years, 51 deaths from autopilot. Yes, that is 51 deaths too many; however, compared to non-autopilot deaths in the US this number pales in comparison.
@neodym5809 _If Lidar is so expensive, how come that you can find them in sub $1k smartphones?_ Are you seriously comparing the cost of a smartphone LIDAR, which has a range of around 15 ft., with the cost of LIDAR for ADS, which must have a range of perhaps 30 times that?
I find it hard to have sympathy when the car alerted the guy 19 times to take over. He was misusing the technology, being careless, and having no regard for his own safety, his family, or other people on the road. He clearly wasn't paying attention to the road and was over-relying on self-driving.
Has any study shown that self-driving cars crash more than other cars? Because all I ever see is data about self-driving cars, and emotional reactions to there not being a human driver (as if human drivers were perfect).
This whole fretting thing is so stupid. The only thing that matters for AI driving is, does it SAVE MORE LIVES THAN IT KILLS? No driving system will ever be perfect. It just needs to be better than the average human driver. With a level 1 or 2 system like Tesla, it’s even better because you combine the features of both AI and Human safety systems. The AI is never distracted, and the Human is supervising. The combination only is unsafe when the human becomes distracted, which is 100% the fault of the human.
Cost is always a factor, but the issue is that (1) if a human can drive with just vision, then so can a computer, and (2) sorting out conflicting information from the cameras and radar is difficult, especially in bad weather
Sensor interference and cost. They said that HD radars would integrate better and would make more sense. The current Model S and X do come with HD radar but afaik they don't make any use of it yet.
One, it is quite expensive. Two, lidar gives you a point cloud in front of the car which can be used to detect obstacles with a lot of compute, but you have no sense of what those obstacles are. A 1-foot cube of styrofoam and a 1-foot cube of solid steel will look the same to lidar. So from an information utility and processing standpoint, lidar does not offer much over using cameras and computer vision.
This is VERY outdated news. That semi crash that you opened with was when they were still using radar, and was LONG before FSD was mature. There are now 6 million Teslas on the road. At an average of 15,000 miles driven per year, that's 90 billion Tesla-miles driven per year. Over all car types, there are 14.3 collisions per million miles driven. So if Tesla vehicles were like ordinary cars, you'd expect 90,000 x 14.3 = 1.3 million Tesla collisions per year. You say that "thousands" of Tesla-related accidents have been reported. If it were "millions per year" there would be a problem, but it's not. Also, you're talking about the UK version of FSD, which is FAR from complete. All Tesla owners are told at the start of every trip that they must supervise their car while it's driving. Clearly if you slam into a very obvious semi-truck, you're not supervising the car's driving. Teslas are (statistically) the safest cars on the road, and they are safest when using FSD.
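Taking that comment's own inputs at face value (6 million cars, 15,000 miles per year each, and its claimed all-vehicle rate of 14.3 collisions per million miles, none of which are verified here), the expected-collision arithmetic does work out to roughly 1.3 million per year, as a quick sketch shows:

```python
# Inputs are the commenter's figures above, not verified.
teslas_on_road = 6_000_000
miles_per_year_each = 15_000
collisions_per_million_miles = 14.3   # claimed rate across all car types

total_miles = teslas_on_road * miles_per_year_each             # 90 billion
expected = (total_miles / 1e6) * collisions_per_million_miles  # ~1.29 million
print(f"Expected collisions per year at the average rate: {expected:,.0f}")
```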
What did they say that wasn't accurate? You shouldn't call something a hit piece just because you don't like the facts. Elon is incredibly irresponsible and he constantly misrepresents the capabilities of "self driving." Did you know that Tesla has faked some of their promotional material showing a Tesla commuting someone to work? It was an edited compilation of many, many attempts, and they made it look like it did it right the first time with no mistakes.
Tesla FSD is Level 2 autonomous driving, which means the driver must watch the road at all times and keep hands on the steering wheel. End of story. If a driver does not pay attention to the road, it is (legally) his/her fault. I drive a Tesla and have FSD.
This is a legal death machine. Its name and advertisement totally downplay this « legal » part. Confusion is enhanced by the name « autopilot » and the existence of truly autonomous taxi services. And it is vastly documented in the scientific literature that a high degree of automation seriously decreases the vigilance level of any person supposed to monitor it and retake control immediately in rare cases of failure or error. The ability to react in a timely manner and adopt the right corrective action in unexpected, complex situations where the automation suddenly fails is proven to be very degraded, as opposed to continuous manual control, where dealing with a sudden unusual situation is much more likely to be successful (within human limits, obviously). This is a low-cost hardware system made for profit, covered by a mere legal umbrella to hide known, documented, dangerous flaws originating from a flawed cheap design. Period.
Tesla already lost a lawsuit in the EU over misleading marketing for calling it Autopilot. In Europe it's seen as glorified cruise control.
Autopilots are just glorified cruise control. No different in aircraft or cars. You can't just put your RV on cruise control and go into the back and fix yourself a drink.
People are napping in their front seat, so crazy.
I agree with you, but the timing of this is WSJ Musk hit piece. The question I asked myself is, did Musk try to give humans one last benefit of the doubt before catering to the Darwin award recipients. Also, people can be petty. If they can blame the car before an internal assessment of their logic, they will.
Overall, this report should include an individual’s driving record. But moreover, should be an analysis of what type would set autopilot, climb in the back, fall asleep, and…
Autopilot literally is just cruise control in 3D. It holds altitude, heading, and bank angle. The jet will fly until it runs out of gas. The engines will stop. Pressurization will be lost. And everyone will succumb to the reaper within minutes. Unless the jet descends to a safe altitude quickly enough with its engines now inoperative.
Auto-Land is a much more sophisticated autopilot. It tracks the ILS as a pilot would land in fog, clouds, and rain. But it cannot usually take you where you need to go. Perhaps from lineup on runway, to touch down. But certainly not taxi to the gate.
@@jdm1152The timing makes it a hit piece against Musk? How so? Aren't journalists SUPPOSED to report on issues that are current and relevant? Trump hiding accident reports for Musk seems like the ultimate in current relevance, wouldn't you say? The hit piece, it would appear, is your post, against WSJ.
CAMERAS ARE NOT EYES. I work in the film industry. Cameras sensors are way worse in dynamic range than human eyes. This means it won’t gather light from dark environments like our eyes do, it will suffer to get light (stop/cars) and objects in dark. Note how many of these crashes happen in the night/dusk.
THIS!!! People do not seem to grasp that.
People's eyes fail them constantly as drivers, too. Not to mention that we only have two of them, spaced fairly closely, on only one side of our head. Teslas have 8 cameras, on all 4 sides of the vehicle.
Yes, the dynamic range issue matters, but again, there's multiple ways Teslas' eyes are better than humans' eyes.
The bigger issue is that Tesla has the opportunity to augment those electronic eyes with all kinds of other things humans don't have, like radar, lidar, and ultrasonic sensors, but Musk has stubbornly resisted relying on them, leaving Teslas with ONLY those 8 eyes in too many scenarios.
And critically: The depth perception of stereoscopic cameras is garbage compared to the depth perception of our eyes.
Which is why Teslas crash into overturned trailers. The cameras are unable to reliably detect the position of that trailer, so it can't tell the difference between that trailer and just a dirty road surface.
LIDAR fixes that.
They should have equipped all exterior cameras with IR. Why their engineers didn't pursue this is a blatant disregard for safety.
@@nathanscandella6075 Two human eyes and the human brain are far superior to 8 cameras and the Tesla AI. The problem is the Tesla machine learning systems can't model for any and every potential situation. While the human mind can know a truck on its side is a threat and one should brake, the Tesla didn't know that and there are a million other scenarios that a Tesla won't ever be able to understand. Musk chose not to include Lidar because it was too expensive and that can't be retrofitted into most Teslas now. Musk thought he could BS his way through this and he probably won't be able to.
The name Autopilot is what's misleading people. It should be called driving assistance and should be marketed as such.
Doesn't help that Musk promised Full Self Driving and Robotaxi
Exactly because I’m sure most of these auto pilot crashes could’ve been prevented if the driver was staying aware I have a Tesla and I’m always driving it like I don’t trust the auto pilot
As many other automakers do, but that doesn't make you the richest person in the world.
@@rick4580 He might have promised it, but he never stated it was 110% READY TO DO SO. That's the problem: things get fed to us and we call it misleading, when in reality the meaning was never fully broken down. Just like when driving and it's called "AutoPilot" - at first we all think fully autonomous, but in the real world it means a function that pilots something automatically; and digging further into the word "pilot", it also means it is not 100% capable of controlling that something on its own, it can only do so much.
Err... but by definition, Autopilot means "pilot assist".
Also, lidar is not an appendix. Elon may as well argue that dolphins don't need their sonar.
02:10 Elon is so confident in his tech that President-elect Trump's transition team has already recommended scrapping this requirement that companies report automated vehicle crash data.
Great job America.
This election was their endgame plan to stay out of jail
I wish I could leave
It's the people's fault for fighting over red and blue like we're in a gang. Right wing, left wing - both wings belong to the same stupid bird. As long as Americans keep fighting over the lesser of two evils we will keep getting these results. What else did you guys expect? We need to get both parties out of our government and introduce a new government that is controlled by the people, where people introduce policies and Americans vote on them online.
@EnjoyingMyTimeOnThisPlanet I agree with the general premise of ‘both sides’ exhibiting corruption, but 1) there is a *massive* difference in the extent and 2) MAGA goes *way* beyond ‘normal’ corruption, it’s pure fascism staring people in the eye. If you want to get money out of politics and install true representatives: great. But given the current state of affairs, I think that ship has sailed for good.
The oligarchs are in charge now, but don’t worry it worked out really well for people in countries like Russia and North Korea…
Now that Musk has bought his way into the government, everyone looking into regulating his companies is going to have a hard time.
Regulation? They can be made more efficient by not doing it.
- Enron Musk (probably)
No incentive for companies to make good, safe and competitive products with Musk in the government - sounds anti-capitalist and anti-competition to me, ngl.
He's not in the government, and will only be an independent advisor. He won't have any more sway over legislation than any other multibillionaire that can buy off lawmakers.
@@PhatPazzo Lol I was actually thinking the opposite, more de-regulations for his companies. He prob lobbied Trump and the Republicans.
Twitter/X objects to these comments
From watching Mentour Pilot, I have come to realize that autopilot in aviation has a completely different meaning than what it means to the general population.
Pilot here, yep. The public looks at aircraft autopilot as a navigation system that does it all, not recognizing it has to be programmed with a very specific plan, has a couple dozen combinations of operating modes, and has to be managed.
I think pilots don't even call it autopilot, or at least that's not what the manufacturers call it. Pilots get extensive training on what the system can and can't do.
Big difference from Tesla, considering how Elon constantly talks about self-driving cars. Like in 2015 he said fully self-driving cars were just 2 years away; it's not a secret why he called it Autopilot.
@@skycaptain3344 That is what Tesla claims Autopilot to be. Hence the distinction between Autopilot and 'Full Self Driving (Beta).'
@@CmdrTobs as the video shows, that’s not what Elon Musk has actually been saying
EVEN if you have a lofty expectation in aviation autopilot, there is a distinct advantage in not having wrecked shipping trucks in the sky!
And that's why Elon wants to stop the requirement to report crash data.
And we the people who drive have a right to know if other vehicles on the road are safe to operate in that mode. Especially given that the rules about whom to sue when there is a crash aren't really clear.
Tesla has been late or dishonest with every technology launch. Cybertruck = junk, Solar Roof = junk/lies, new Roadster = ha ha ha. "Oh, it was supposed to land in the ocean." People get fascinated with these crooks who become billionaires by leading you to believe that you're part of some inner circle and will all be better off. It's just a shame that they're all now running the country because younger people are too stupid not to vote as if everything is a popularity contest. But when a Tesla runs them off the road and kills their family, they can always send a message to Elon on X.
These are 2021 reports; why doesn't WSJ show 2024?
Right because they are dumb and have agendas
Trump will let him
@@martontichi8611 That's how safe Teslas are - WSJ had to go back three years to find a talking point in 2024. Tesla Autopilot is 10X safer than human drivers. You're all NPCs; I mean, who believes the mainstream media?
After a few phantom events where it tried to kill me and my family on the M5 in the UK, I now refuse to use it. I actually feel stupid for paying thousands for the full package.
I'm glad you're ok and didn't leave the car to drive itself
I’m glad you and yours are ok, and no, you aren’t stupid, you were deceived, and so were many, many others.
We pay for things with the expectation we will get what we paid for.
shame belongs to those that profit from deception.
We have two 2023 Model 3s. I used FSD extensively from 12.3.6 release and stopped using it during the fall due to several near collisions that could have been devastating - one a drift by FSD over the double-yellow lines on a blind curve on US Hwy 50 west of Tahoe. In that case I was able to take over and return to my lane before meeting oncoming traffic. But the next time … not worth finding out.
FSD is amazing until it isn’t.
@@DD-wu7xq yeah ADAS are not fully autonomous. And outside of the US it's far worse.
Paying that kind of money for mediocre adaptive cruise control combined with lane keeping assist is crazy....
"Go fast and break things" sounds like a bold and dynamic business model until you're the thing that's broken.
That's fine for an unmanned Starship, not so much when a human is onboard
Perfectly stated!
Tesla should be forced to make all crash data public
The problem is that not everyone is able to interpret it correctly, so there would be a lot more misleading stuff out there.
He's petitioned to stop Tesla crash data when Trump takes office
It should just be an overarching law that this information be accessible to everyone, regardless of manufacturer.
If companies like tesla put automated systems in place, they should record the data (as they do). And it should be mandatory that other organisations or public can look into the data, so the customer knows how many crashes and what crashes are caused by these systems
Okay, but thousands die every day from normal cars. Why is no one using this data to reduce our dependence on cars in general? We should not be killing our citizens by forcing everyone to drive.
elon wants to stop this reporting, i wonder why
And on January 20th, he's going to get everything he wants.
So Biden pardoned his son of all charges, Hunter will never face the law, step into court, or answer questions - and you hate Trump 😂😂 @@rogerwilco2
The human behind the wheel is supposed to remain attentive as if still controlling the vehicle. What we have here are examples that lead to human controlled fatalities and the number is miniscule as a percentage of Teslas and Autopilot/FSD miles driven. I have many miles of experience using both systems and it gives me a litany of safety advantages compared to vehicles that lack comparable technology.
Teslas aren't designed for level 4 or 5 autonomy yet, so your argument is moot in relation to this story.
@kenhiett5266 what does that have to do with crash safety reporting?
@@rogerwilco2what is the worst thing you think will happen that day? Anything falsifiable? Do you think he will increase the $7500 tax credit for EV buyers? Or increase the carbon credits? Or accelerate the ban of gas cars? Or create more regulations that are easy for his companies to comply with but hard for newer EV manufacturers to comply with so Tesla won't have any competitors in their way? Because I think none of those things will happen, and actually the opposite will happen.
You guys need to make a full 1-2 hour documentary on this. It sounds like you guys found out a lot more than what you said in this video. I would watch a full documentary on this.
I agree, it even looks like too much was edited out to have it be short enough. Strange editing job (and I'm a pro), lacking explanation of given arguments.
I fully suspect Tesla’s legal team would be down their throats for stating anything “unsubstantiated” in this report. It is likely they found damning evidence on the hacked computer, but that evidence is likely considered “not legitimate”
@@Dr.AnonymousPro definitely, that’s pretty much the story of every Wall Street Journal video. It’s always way too short about the various topics they cover.
I don’t use my tesla autopilot anymore because of too many phantom events. It’s like driving with a new learner who needs constant attention. You are constantly on edge waiting to grab the wheel
Me too. I barely use it. It also feels like only FSD users are getting „Autopilot“ updates.
Got the same feeling. Can't let the car drive when my family is inside
I live in the UK. I tried basic Autopilot once on the highway/motorway and experienced phantom braking. Haven't used it since.
I've had this in the UK with just EAP, not even FSD. It makes me so nervous that I am absolutely exhausted when I stop, because I've been concentrating so hard; it's more stressful than driving manually. So I rarely engage EAP now, only really when the highway is empty, but even then I've had phantom braking occur for absolutely no discernible reason.
FSD Version 13 has been improved a lot. I hope Tesla users get this update as soon as it can be sent to your vehicle.
Teslas whole strategy of calling things "full self driving" and "autopilot" and then arguing that the customer was wrong to expect the car to competently fully self drive or operate on autopilot is truly pathetic.
An airplane on autopilot cannot handle engine failure, altitude recovery, severe weather, instrument failure, hydraulic failure, runway changes, fuel management, route planning, collision avoidance, the list goes on. It really shouldn't be called autopilot huh.
@@SuperBotcreator To be fair cars with lane assist and adaptive cruise control are about as automated as aircraft autopilots. The problem is Tesla claiming it's more capable than it is and a public that don't understand how autopilots actually work.
Right? It’s false advertising.
Well he's right... After all, the Ohio Supreme Court just ruled that 'boneless' chicken wings aren't necessarily boneless, so it's totally allowable to have secret bones that kill you and it's no one's fault but your own.... This is where we're headed. The corporations rule it all while we die...
@@12pentaboraneomg no they're not! My god those technologies are miles apart!
People. When you are behind the wheel of any vehicle you have one job: drive the vehicle. Never mind all the other distractions today's modern cars have. It's your life you'll be saving, as well as others'.
You mean the driver is responsible for their own and others' safety? *gasp* But I thought it was Elon killing these people?
The guy leaving at 3 am for a 3 hour commute might have used it to get some more sleep?
That commute at that time of night is insane and unsafe no matter what you do.
@@rogerwilco2 yeah, drivers seat is the best place to get some sleep. OR maybe you should take a bus/train if you want to rest?
@@WarriorPaxo You market a product as FULL SELF DRIVING, then take out crucial sensors needed to safely operate an autonomous vehicle, try to sabotage government oversight, and cover up the data - you are absolutely responsible for the outcome.
@@WarriorPaxo 😂 I know, right?
These people hate him so much, never mind crash data that says teslas are still better at saving lives than normal cars. ¯\_(ツ)_/¯
He was such an idiot for removing radar. Why even take it out to save .0005% of profit?
Because profit is everything.
@ man but you’d think it outweighs the pr. I remember when he first announced it people were so confused and now even more. We know LiDAR and radar are better. 🤦
@@JeffyShyChannel Radar was removed because it was conflicting with the cameras, with the cameras being proved correct. Lidar is not needed. V13 is showing that, and will make our roads far safer.
@ except cameras are terrible at reflectivity. Really it’s the marketing at Tesla at fault. People are way too confident to let it fully drive with no attention to the road.
The entire car costs like 30,000 USD. One radar would cost a few hundred USD and wouldn't be of any real use. One LIDAR, which would be useful, costs a few thousand dollars for this use case.
9:07 "He'd leave about 3am to get to work by 6."
😮3 HOURS, ONE WAY?! That's 6 hours worth of just driving to frickin work! Dayum!
No wonder he used autopilot; 6 hours' worth of driving on a boring highway is crazy.
Definitely. I am not shocked he didn't put his hands on the wheel but i am suspecting he was napping @@createx1751
Good pick up. Pretty crazy commute. Obviously a super hard working and loving dad, tragic it happened for his wife, kids, parents, and friends.
The comment at the end rings true. It’s too much to ask of people to simultaneously trust their cars auto driving feature while paying attention. I think fully autonomous is the way (ie Waymo) if you want that sort of thing but we’re so far away that this transitional period will be really bumpy.
Personally, I found Tesla's Autopilot too nerve-racking and had too many close calls, so I just never use it.
Enjoy capitalism
Indeed, this is absolutely insane. I wish we could figure out high speed rail train like virtually the rest of the world. Train at 160 mph would do the same route in 1 hour + 30 minutes to navigate to and from station - boom, you can leave your house at 4:30 instead of 3 AM
Note to self: step on brake when you see a truck laying on the road.
That's what the video is trying to highlight: when we start to "trust" that the autonomous driving works, we start to lay back, let the system do its job, and lower our sense of awareness. By the time the system fails to detect an obstacle and we see it in front of us, it is already too late: the distance is too short and our response time too slow to make that quick decision, because of our lowered state of awareness.
Lol 😂 Tesla is the least safe car on the road lol 😂 They're the deadliest cars.
@@dc-gm7luI think you miscounted. Yours only has 5
@@ferdiyansurya Still x1000 safer than human drivers
That's an idiotic statement. Did you not watch the WSJ video?
Now we know who DOGE will shut down first.
You 😂😂😂
The Wall Street Journal
They already are working on removing the reporting mechanism mentioned in this video
That's been my theory. Keeping a lid on the investigation is responsible for his recent interest in downsizing government
@@oldsport500 Their goal is to make the safest cars. More data is better. Why do you think they have a cluster of supercomputers?
I am a retired software engineer. We always had a rule that interface design had to be clear, understandable, and aware that people make mistakes. We had to be very careful how to frame the prompts and check the user inputs. Here, Tesla has a prompt telling you the car has an "autopilot". This violates every guideline a software engineer would use, because it guides user behavior in an undesirable direction. To those who commented here that the drivers must bear responsibility and that Tesla cars perform better than humans alone: this "autopilot" prompt is criminal regardless, especially as it is used for marketing purposes rather than human safety.
Watching out for autopilot’s errors is actually more stressful than driving yourself.
This has been my experience, it’s total nonsense
You suddenly wake up gasping - as if you crashed and can't reach out.
I disagree entirely. Been using it for over a year now and I absolutely love it.
thank you
I do not find even basic cruise control to be useful. I need to be involved in all aspects of driving the vehicle or my attention wanders.
Now imagine a computer driving an 80,000lb truck
Soooo the tesla semitruck
Or we could invest into decent train infrastructure?? 🤔
This video isn't a critique of self-driving cars. It's specifically investigating the camera-based technology that Tesla uses, and its misleading claims about it.
Human drivers are far from perfect (how many millions of crashes have been caused by truckers making mistakes?). AI will 100% replace human drivers, and make driving enormously safer.
Yikes
I’ve actually not seen a Tesla semi yet somehow out there but I’ve seen other brands without drivers. It’s very worrying for many reasons.
Tesla sees something strange on the road and thinks it must be nothing, and drives full speed.
When I see something strange on the road, I slow down
Right? Why isn't this the default behavior?
@@gatowololo5629 Well, I guess it may end up stopping each time the system is not sure what it sees. That could again cause even more accidents. It seems to me that without radar it's not going to be a reliable system.
But....ELON SAID IT'S SAFER THAN HUMAN DRIVERS!!!! 😂😂😂😂😂
@@gatowololo5629 Probably because of the data it's trained on. Technically or humanly, it is impossible to train the system on every single possible thing it may encounter, so it's trained only on a small subset of all possibilities. From that it follows that the case of "encountering an unforeseen / unrecognizable object" is what's happening all the time, many times on each trip. And most of those encounters are harmless, of course. That's okay when you are training a computer to recognize cats and dogs to impress your CS tutor, but definitely not okay when people's lives depend on it.
It's still learning to recognise objects on the road
I have a little different perspective here, also as a Tesla driver. I was driving down the road, admittedly a little in my own world. The driver in front of me slams on their brakes. Full brake check. The car literally slammed on the brakes and swerved out of the way to an area with no cars. It was all captured on multiple cameras for me to rewatch and analyze what I could do better. I do not trust Autopilot the same at night, and I'm extra attentive. That truck would have been incredibly hard to see even if I was driving the car normally. The Tesla has saved me from near misses multiple times that in my old truck would have been an accident. I feel much safer in my Tesla, and you agree to MULTIPLE warnings that the system is not fully automated and that YOU are responsible for paying attention to the road.
Going to Vegas, my brother’s car automatically braked when someone slammed on their brakes in front of us. Car then sped up when it realized the car behind us was going to hit us because it started braking late. It was clutch.
So do you seriously feel that it is OK that Tesla wants to bury the reports on crashes from the public???
as a Tesla driver, i thank WSJ and the whistleblowers in this video for making plain the shortcomings of Tesla autopilot. seeing these examples i can drive much more aware now. this should be required viewing for all Tesla drivers...to familiarize themselves with the specific weaknesses in the system.
You didn't know that you were supposed to remain attentive and are still ultimately responsible for the movement of your vehicle? Your Tesla came with a variety of such warnings.
If you thought your Tesla was level 4 autonomy, that's on you personally. We receive constant reminders that's not the case.
You needed a report to know that you are the driver and not Tesla's autopilot? Like sorry to be rude, but no wonder Tesla drivers crash if this kind of thing is news to you.
Are u serious? Don’t you have common sense?
lol
Haha, I thank the reporters for sharing details about the types of accidents that have been taking place and you all are so threatened you start insulting me and making up things I didn't say. Sorry but you're the people who lack nuance, not me.
I think people equate autopilot on a car with autopilot on an aircraft. If there is a problem in an aircraft the pilot normally has plenty of time to react due to the separation interval between aircraft and the altitude above the ground. If the self driving automation fails when the nearest object to collide with is maybe 10 feet away the car will hit it before the driver even realises there is an issue.
If only there was a person behind the wheel who could have taken over before that point....
@@WarriorPaxo If only you didn't call your system "full self driving".
If only false advertising was illegal… Oh wait.
So the obvious solution here is to have flying Teslas!
@@-TheUnkownUser false advertising is illegal. Litigation over it is why Tesla had to rename it to "Full Self Driving (supervised)". Stop fantasizing about a world where there's evil people getting away with crimes, when there are real people actually getting away with crimes.
Sad reality is people autopilot their airplanes into the ground all the time.
The public thinks autopilot is some magic button to just push and forget.
And that is not true.
Dont worry. The orange one will give them immunity.
😂😂😂😂😂
Immunity from what? Tesla has already had immunity from all of this for years. Tesla has already won tons of lawsuits about crashes. He doesn't need immunity. If he does need immunity, can you name any cases he has lost that Trump will grant him immunity from?
Disagree. GM will pay for the lives they have taken. GM does not report crash data. GM does not collect crash data. GM is killing you.
ABS, blind spot monitoring and rear view cameras are good enough for me and my eyeballs
Had a Model 3 with FSD for two years. Important note: I live in the Northeast but lived for six years out west. I understand why Tesla went all in on cameras only, but in every other part of the US, where roads are not well marked or lit, FSD is NOT ready for prime time. We had numerous false reactions from the vehicle, from NOT seeing hazards on the road to reacting to hazards that did not exist. The car on more than one occasion slammed on the brakes on a totally empty highway while doing 70 mph. Sold the car after too many occurrences. On top of that, the lowest-end Hyundai has better build quality than those cars, and I'm not the first person to point that out.
Should have shared the stories and camera feeds. If that is even allowed. Computer says no?
I've had a Tesla Model 3 for the past 2 years, and although I didn't buy it for the autonomous driving feature, the improvements that have come in the last 6 months have been remarkable.
FSD SUPERVISED
Hyundai has (and has had) awesome build quality for years! It comes with a 100,000-mile / 8-year warranty, and the panel spacing is mint! Chrysler cars are a better comparison…
@@gw2301 Sorry to bag on Hyundai. You're right. They are actually made really well. A more appropriate comparison would have been a vintage 80s Yugo.
Night cases with the overturned truck and the pick-up are definitely avoidable by radar assisted brakes, which are available even in low-end cars today.
Teslas originally relied on radar, too, but Musk has decided to phase that out, even in cars that still have radar hardware. His explanation was that for lower-resolution radar, there are too many false positive events that cause the cars to slow down for nonexistent threats.
I can confirm that this was indeed a common annoyance in the past, and possibly a way to get rear-ended, and since going to vision-only, that annoyance is mostly gone. But, that doesn't mean Musk is necessarily right. Could be that his engineers just weren't good enough to properly filter the radar data. Radar has always been an imprecise sensor, it's only good signal processing that makes it useful.
AFAIK, radar is not good at detecting stationary objects of any kind, only moving objects.
@@nathanscandella6075 yeah, part of the issue AIUI was the conflict between camera and radar in Tesla's stack _specifically_ rather than necessarily an inherent issue to radar.
And when radar is bad-enough to freak-out because of a leaf in the road, it's at least consistent-enough that you learn it quickly and compensate; rather than inconsistent and seemingly out of nowhere like with Phantom Braking.
Most humans would also crash in the situations shown in those two accidents. The overturned truck was completely invisible until the last second.
@@sureshkrjsl Just because it was not visible in the onboard camera replay, does not mean it was the same for human in that situation, if the human would have paid attention. The event replay is of very bad quality, probably on purpose, to give you the impression of not being able to see.
I have been in a somewhat similar situation, autumn storm, rainy conditions, suddenly I saw what looked like a tree fallen on the road and braked later and harder than normal due to bad visibility but still before my car's auto braking activated. Model X next to me ploughed straight until sudden swerving just before the tree - that was most likely driver intervention. Took significant front end damage and punctured a tyre. No people were harmed. But he cleared the left lane for the rest of us at the cost of his own car. :D
"Auto pilot" was the worst name in safety history for these cars.
And yet, no regulatory body can take any action towards it. Insane.
Cruise Control is just as bad.
The Autopilot name is coming from the aviation industry and it is technically correct. In a plane having the auto pilot active doesn’t mean the pilots can’t leave their seats, the pilots needs to be able to take over at any instant. The problem is that the general public doesn’t understand it as such and this causes much confusion.
Now, if you are a Tesla owner, when you activate Autopilot in order to use it on the road, the instructions are very clear: this is meant to be used on one-way roads, and you have to be able to take over at any time. There are also warnings if you don't keep your hands on the wheel, and if you don't comply, Autopilot will disable itself.
@@alexbian5567 In some countries they had to change the name, but in most it has been kept, as it technically describes the feature precisely, making it difficult to rule against it.
@@fdelaneau They've updated it so you don't need to have your hands on the wheel. I think Autopilot is absolutely awful compared to FSD, and not just in terms of features - it is actually just terrible at making decisions. I get cutting features out of FSD, but the actual logic and programming is terrible.
I’m a Tesla driver. I don’t like Musk, and I feel lied to about the car’s capabilities. BUT, it’s saved me from at least two accidents. Here’s the data point we need but they didn’t provide: “how many accidents per million miles driven WITH and WITHOUT autopilot”. Naturally corrected for other variables. Give me that data.
_Give me that data._
Ask and ye shall receive. "In the 3rd quarter [of 2024], we recorded one crash for every 7.08 million miles driven in which drivers were using Autopilot technology. By comparison, the most recent data available from NHTSA and FHWA (from 2022) shows that in the United States there was an automobile crash approximately every 670,000 miles" (Tesla Vehicle Safety Report 3Q2024).
@@BigBen621 That is somewhat skewed because you are only supposed to use autopilot on freeways, which limit vehicle interactions by design.
Now the issue is: how would you count the accidents prevented by autopilot, since, well, they didn't happen because they were prevented?
@@economicprisoner Yes, I agree, although speeds are much higher on freeways, which means accidents that do occur are more likely to be fatal.
The problem is that NHTSA doesn't segregate crash data based on road type. But they *do* segregate fatality data roughly by road type (actually rural vs. urban), and one might assume that crash data would at least roughly follow fatality data. In brief, the rate of fatal accidents on rural roads, which of course include intercity freeways, is 1.68 per 100 million VMT, while the rate for all drivers is 1.33 per 100 million VMT. This suggests that there are *more* fatalities per mile in rural areas than the average, which would make the ratio of 10.6 between Teslas on Autopilot and average drivers even more remarkable.
@@bardz0sz _how would you count the accidents prevented by autopilot_
You can only infer avoided accidents statistically. Say you have a group of 100,000 vehicles without autopilot, which experience 5 accidents per year; so the total accidents are 500,000. Now you replace 10,000 of them with vehicles *with* autopilot that experience only 4 accidents per year. Now the total accidents are 90,000 x 5 plus 10,000 x 4 = 490,000. So you can infer that replacing 10,000 vehicles without autopilot with 10,000 vehicles with autopilot has prevented 500,000 minus 490,000 = 10,000 accidents.
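A minimal sketch of that back-of-the-envelope inference, using the same made-up numbers as the comment above (the 5 and 4 accidents per vehicle-year are purely illustrative, not real rates):

```python
# Reproducing the hypothetical inference above; every number here is illustrative.
fleet = 100_000          # total vehicles in the comparison group
rate_without = 5         # accidents per vehicle-year without autopilot (made up)
rate_with = 4            # accidents per vehicle-year with autopilot (made up)
converted = 10_000       # vehicles switched over to autopilot

baseline = fleet * rate_without                                      # 500,000
mixed = (fleet - converted) * rate_without + converted * rate_with   # 490,000
prevented = baseline - mixed                                         # 10,000 inferred

print(baseline, mixed, prevented)
```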
After watching this, I think I need to point out that Toyota has had a level 2 autonomous system available since they put it in the Prius in 2010. The system is similar to Highway Autopilot on a Tesla. Toyota uses a radar array and cameras. The radar sees stuff like this and will slam on the brakes, and this is why Toyota is seldom if ever in the news for this kind of accident. The radar system doesn't have to interpret anything; it knows if you are about to slam into a stationary object. In 8 years of owning a Prius, I had it slam on the brakes once (about 75% faster than my foot could have - my foot hadn't even left the gas pedal and the brakes were already applied, avoiding a collision) and warn me 3 times to stop. 2 of those warnings were false; that is the nature of radar, sometimes it doesn't understand sudden curves, angles, or sudden inclines in the road, but I'd rather have a few false positives than slam into something.
Beside my 2021 Model3, I have a 2017 Prius Prime. Its lane assist is a joke compared to Autopilot. It can't even follow a gentle highway bend. I keep it off. Its lane departure is annoying. I only turn it on while on highways. A radar wouldn't have been helpful with the overturned semi as it would have been ignored because it has been stationary since the first time the radar saw it (radar are designed to ignore stationary objects to prevent phantom braking). The difference here is the Prius driver would have been looking at the road 100%, otherwise he would have ended in the ditch. In a Tesla, 99% of the time, Autopilot will do fine. It's that 1% that is problematic if you're not looking at the road, hence why you SHOULD KEEP LOOKING AT THE ROAD, it's NOT AN EYES OFF system!
No, the reason you don't see Toyota in the news is that they don't get clicks and nobody on the news team has a vendetta against them. Toyotas lately have had quite a lower safety record than Tesla; it's not even a competition at this point. So Toyota had better step up their game. And btw, those false positives are why Tesla decided against radar systems, because they cause people to rear-end you. Yes, because people suck at driving and they will rear-end you.
@@sylvaing1 A 99% correct system shouldn't be used by the public. And calling it beta is not an excuse.
Actually, a 70% system would probably be safer to use, because at least you'd be able to keep your attention on the driving.
But when it's at 99%, you don't have enough to do, so you'll get bored. When you're bored, you'll start thinking about other things (daydreaming) or even doing other things (checking your phone, ...). It's just human nature.
I had a “false positive” in a Subaru Outback once and it decided to lock its brakes up at speed on the interstate. Almost got steamrolled by the traffic behind me. All because there was a different color road surface patch. Be careful what you wish for.
@@alamogiftshop I had a 2024 Volvo XC40 Recharge Ultimate while my Model 3 was in the body shop last summer. It also did a phantom emergency braking on a curve on a regional road while I was going 105 km/h. In my case, there was nothing on the road, no change in the asphalt condition, nothing. The dash displayed "Emergency braking applied to avoid a collision" - well, nope, there was nothing.
Imagine always having a beginner driver driving, and then having to be ready to take over all the time. This seems to be what it's like to use autopilot
I agree. I think its more difficult than simply driving yourself. Most of the people saying the drivers should have been paying more attention are missing this point.
Exactly - they are relying on humans being the back-up for a computer, rather than the computer helping the human avoid mistakes (like aviation automation works).
💯. Need to text at a red light? Autopilot is great for getting going when the light turns green. It's surprised me a few times. Why is that car camping in the left lane? Oh, because it's on Autopilot. Always drive; let Autopilot assist. 🤦🏽‍♂️ This crash is dumb. Condolences to the family, because a father and husband had to become a statistic of why it's important that only you are responsible for your safety.
I think people should be a lot more curious about how these people who get into trouble with it were able to convince themselves that it was safe, because I've never gotten that impression from it, you can watch people demonstrating it and it made mistakes constantly.
It's more like having a helmsman handling the maneuvering so you can attend to higher-level tasks such as maintaining situational awareness.
In Houston old traffic markings are often left behind after lanes have been re-routed. In rain and fog at night sometimes the old remnant lane markings are MORE visible than the “correct” ones. It’s sometimes difficult for even a highly experienced person to understand what to do. I don’t even trust myself to get it right sometimes. A low-resolution camera has no hope!
As a fellow Houstonian, I agree, the marking really get me confused at times. It's dangerous
Ditto here. The “lane keeping” function of my Kia Sportage gets fooled all the time by all the different markings on the road. I put it on once in a while for fun, but not for long!
"In rain and fog at night" Add to that bright sunlight but at a low angle, e.g. just after sunrise. With concrete pavement in particular (high albedo), seemingly every place where a marking had ever been on the pavement lights up.
Just had this experience myself while driving close to sundown. The only way to tell which markings were correct was to use the mirrors. You could clearly see the lanes in the rear view, but looking forward, it was almost impossible to tell.
We have that problem in some of the cities in New Zealand. It's a nightmare driving through those areas when it's raining
I don't think I would have seen that car in the dark either.
And people wonder why Tesla doesn't offer Full Self Driving in Europe and China.
It's called FSD SUPERVISED.
@@isthatateslawhich is a self-contradictory name
We're literally used as guinea pigs and so many people are willing to sign up. Smh
@@isthatatesla And FSD is called "fully self driving". Only later did they add the paradox "Supervised" behind it, which should tell you something.
@@termitreter6545 FSD is in development and will be fully functional in '26. Maybe it should be called Training Self Driving until then.
I've been using Tesla's FSD for 2 years now, over 70,000 km across the US and Canada. I think it's incredibly beneficial to safety, but I honestly think standard Autopilot is dangerous. It is cruise control; that's all it is. It will drive straight into anything in the road. The name Autopilot is misleading, tbh. It should only be referred to as advanced cruise control with lane keeping.
I always knew that’s what it was. My family owns a Tesla and never uses autopilot at all. People should do more research before allowing a car to drive you.
@@personyt55 The only "research" you need to do is to read the Tesla documentation on Autopilot, which states it's a driver's aid and you are fully responsible. Operate Autopilot, recognize its capabilities, and use it within those capabilities. Simple as that. 80% of my driving is on Autopilot, and I understand completely what it can and can't do. It adds to my safety because the fatigue of maintaining speed and staying in the lane detracts from my ability to watch the road carefully, which is what I can do more intently with Autopilot on than off.
The term is correct. This is exactly what autopilot in aviation is. It will keep an airplane's height, speed and heading, comparable to keeping a car in its lane. And it will fly into any mountains that show up if the pilot does not do his job.
The public needs to know when a car is on “auto pilot”
It won't help stopped vehicles know a Tesla is about to rear end them at 70mph.
I read that there are already plans for a cyan or green (can't remember which) colored lights on vehicles in the near future to let everyone know that a car in autonomous mode.
After I got my first Tesla this year, common sense made me not rely on Autopilot in these situations: when it's dark, when it's foggy or raining heavily, or when the road markings or road conditions are bad. But it's a nice feature in good weather on nice roads, and I enjoy it.
You have the option to reduce the maximum speed to whatever you want.
The problem isn't only the Tesla driver, but those of us who did not sign up for the program and get hit by one of these. Makes the Corvair look pretty tame now.
Who has been hit By FSD?
All the people in this video 🤡
Tragic, no doubt. At 3:40, the driver was warned 19 times by the autopilot to keep his hands on the steering wheel before the crash. Most likely he fell asleep; the autopilot kept him from crashing long before the flipped truck appeared. This accident would have happened anyway in any other car, with him losing control of it. I'm hoping this technology gets so good that it will prevent anybody from dying on the roads.
@@RoadSurferOfficial That was not FSD. It was autopilot. So my question still stands. Who has been hit by FSD?
This is a dumb, yet extremely common, form of sophistry used against deployed autonomy.
I also didn't sign up to be on the road with rednecks' monster trucks who achieve "safety" via enormous mass and a bumper that's at the same height as my head in a sedan. Or, a child's head on a crosswalk.
Because, you know why? We don't all get to approve/reject each others' vehicles.
If you want to argue that Tesla should be better about publicly sharing safety statistics on their autonomous cars, and IF those statistics show significantly worse safety performance, then use that to push for specific regulation. "I didn't sign up for ..." is irrelevant. This is a big, stupid country full of idiots doing harmful things to one another. I didn't sign up for that, either, but we live in a society.
The mainstream media should be focused on this as a disqualification for DOGE oversight.
They're afraid of going to jail under the new Putin, I mean Trump, administration
The basic conflict of interest between DOGE and being the CEO of so many large companies should be obvious. Musk should be made to divest as CEO in many places before being able to head a government program meant to guide regulations and enforcement.
I also point out that this is not uncommon in the corporate world. Often when you take one position you might not be allowed to lead or own conflicting situations.
Literally every situation shown in this video wouldn't have happened if they used LIDAR. Elon was so set on just using cameras despite warnings from the engineers. The blood is on his hands.
Why does everyone think lidar is some kind of magic solution. A few months back a Waymo crashed into a large pole, despite being equipped with Lidar. V13 of FSD is proving Lidar is not needed.
yes, dropping lidar on technological ideology was bad.
@@CmdrTobs V13 is showing it to be the right decision. Tesla creates a pseudo-lidar with its cameras, and has shown it doing remarkable things. Lidar is not some super sensor. Only a few months ago, a Waymo crashed into a large pole.
I think the abrupt shift away from lidar to camera based reinforcement learning was the bad move. At least continue to use lidar to build the model for detections and reactions and then introduce that training data to newer vehicles without lidar. Their decision to just replace everything with cameras then push Tesla vision as an OTA update almost 2 years later was truly the dumbest decision their engineering team has ever made. No idea why the didn’t prioritize that rollout first, then focus on physical lidar removal.
Tesla Vision could have been such a huge deal when they shipped it, but instead it felt more like a patch update to fix bugs, since that's basically what it did for all of the vehicles produced w/o lidar.
@@Nullmoose I'm not sure what you are talking about? Tesla has never used Lidar. They did use radar, but got rid of it because of it conflicting with what vision was saying. With vision being proved correct. Looking at the performance of FSD. It would seem to be the correct decision.
I was actually working on the same job site as Steven before this incident. I had just got my Tesla and would park next to Steven, until he stopped showing up one day. After I heard about what happened to him I learned to never trust the autopilot without supervision. I didn't know him personally aside from the occasional morning pleasantry, but its tragic how many construction workers die due to having to commute far while sleep deprived. My heart goes out him and his family.
Elon has gone from Iron Man to Lex Luthor.
Iron Man was selling weapons to whoever wanted them, wasn't he?
not really the hero i dream about
3:04 what is the driver doing? ... sleeping?
Highly likely, or looking at their phone. How this is Tesla's fault is beyond comprehension. One can get complacent if the car does most of the work most of the time, but failing to look ahead while you're cruising at that speed is reckless.
It's called autopilot....
@@FrancescoDiMauro It's Tesla's fault because they allow people to engage in this behaviour without properly accounting for it, all while Elon Musk brags about how perfect it is to pump shareholder value, even though it isn't even close to being as safe as a human driver in dire scenarios. It isn't allowed to be called "autopilot" in Europe for this reason, because the branding puts a false sense of security in the brain of the driver.
Mostly. It "drives itself" after all
@@Aimaiai So you should also ban dumb cruise control? And how do you know it isn't as safe as humans in "dire situations"? What IS a "dire situation", and what FACTUAL statistics do you have to compare AP vs human responses in such cases? Showing one accident like this video does is meaningless. I can post a video showing a human driver doing something dumb and causing a terrible accident. Do we then ban all human drivers? Seat belts occasionally jam in freak accidents and cause an occasional death. Do you propose we ban seat belts too?
Seeing this video makes me very happy that I made the correct decision not to use Autopilot when driving a Tesla rental car for 2 days. During the rental period, I was even driving at midnight. FYI, without using Autopilot, Tesla cars are very unpleasant to drive.
I drive around 6k miles a month with at least 2500 of those on autopilot and FSD. Amazing tech but it’s still supervised and you have to keep your attention on the road.
Amazed that WSJ, owned by the Murdochs' News Corp, investigated this and no one from the Murdoch family tried to bury it. Good job WSJ.
lol true
The fact that the cameras are at different positions and angles is what allows the computer to calculate depth; it's not a flaw.
as I understood the technician, the problem is that the computer doesn't know the exact position and orientation. It's not about the mounting, but what the computer is being told about the mounting. Bad calibration leads to bad assumptions by the computer
@@paulfrydlewicz6736 Every vehicle goes through a "camera calibration" during the first 100 miles or so during which advanced driver assistance is not available. This is designed to calibrate the camera using stationary points on the ground repeatedly until there is a map of this vehicle's camera for the computer.
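To illustrate why that calibration matters so much, here is a toy depth-from-disparity example (the focal length, camera spacing, and the one-pixel error below are made-up illustration values; this is not a description of Tesla's actual vision pipeline):

```python
# Toy stereo-depth example; illustrative only, not Tesla's actual vision pipeline.
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance to an object from the pixel offset it shows between two cameras."""
    return focal_px * baseline_m / disparity_px

focal_px = 1000.0     # assumed focal length, in pixels
baseline_m = 0.30     # assumed spacing between the two cameras, in metres
disparity_px = 3.0    # measured pixel offset of the object between the two images

calibrated = depth_from_disparity(focal_px, baseline_m, disparity_px)        # 100 m
off_by_one = depth_from_disparity(focal_px, baseline_m, disparity_px + 1.0)  # 75 m

print(f"calibrated estimate:        {calibrated:.0f} m")
print(f"with 1 px calibration bias: {off_by_one:.0f} m")
# A single pixel of miscalibration shifts the estimate by 25 m at this range,
# which is why getting the initial calibration right matters so much.
```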
WSJ is clueless and mostly pumps out propaganda based on whatever company is bribing them.
They have a colorful history of Tesla propaganda that only died down because the stock price rallied in 2020, leading them to calm down on it.
Why is there no data presented in this article/video? People crash every day; the question is whether autopilot crashes at a lower or higher rate.
Came here to say exactly this; after this video we still don't know whether AP is safer than a regular driver or not.
The data doesn't support their assertion that the software is faulty; that's the problem. The article also says "autopilot", which is a nebulous term in the Tesla world, but in this case applies only to Autosteer (lane keeping + adaptive cruise control) and does not apply to FSD.
Feel free to search for "Tesla Vehicle Safety Report" which has the breakdown by quarter.
Because Teslas crash at lower rates. It's a hit piece that WSJ started putting out again because their CEO has differing political beliefs.
You can see from the first two videos that if the drivers had been awake, there is no way they would have crashed into a huge object in front of them or kept heading straight when approaching a T-junction. Both were very likely asleep at the time.
@@mazicort_ _we still don’t know if AP is safer than regular driver or not_
Well, yes we do. "In the 3rd quarter [of 2024], we recorded one crash for every *7.08 million miles* driven in which drivers were using Autopilot technology. By comparison, the most recent data available from NHTSA and FHWA (from 2022) shows that in the United States there was an automobile crash approximately every *670,000 miles"* (Tesla Vehicle Safety Report 3Q2024).
Tesla Full Self Driving has come a long way. I was driving and the Tesla recognized a plastic bag in the wind as an obstacle and slowed down accordingly.
🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣
A Tesla steered around me while I was biking; the driver did not care ❤ it saved me from getting hit or killed.
We should be focusing on data and statistics. People die every day in multiple different cars, focusing on specific examples is just misleading and not focusing on making the roads safer.
It is important for people to understand the weaknesses of a system to prevent tragedies. Statistics are important, but they don't help analysis to that extent, especially when the data needed to make more specific statistical inferences is kept hidden.
@@ErinWilke Yes, that data should be made available to regulatory bodies. I don't think it should be publicly available, since that seems like a breach of privacy with no actual upside. As it concerns this video, making this an emotional story about a specific case is in poor taste. The fact that it's about an accident that happened several years ago with a past system makes me question its relevancy.
@@41-Haiku The system may have changed, but Tesla has not. That's the problem, and that’s the story.
Surely if the cameras are giving contradictory data, the vehicle ought to slow down in order to get a clear picture and minimize the danger of a collision?
Or sound an urgent alarm in the cabin to alert the driver?
@@Steyr6500it does sound a loud alarm if a collision is imminent. The “forward collision warning” in my Tesla has 3 choices: early, medium and late. I set mine to medium because early was giving too many annoying false positives. There’s also automatic emergency braking. It doesn’t guarantee a crash won’t occur, but should reduce the severity of the impact.
That’s the challenge with technology. It could lead to false positives. Unnecessary braking could result in getting rear ended.
@@bikeman7982 I don’t mean do an emergency stop though. Just slow down a bit to minimize danger. And of course, that can also depend on the rear-view cameras.
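As a rough picture of what the commenters in this thread are asking for, here is a purely hypothetical sketch of a confidence-based speed cap - an illustration of the idea, not anything Tesla actually implements (the thresholds and the target_speed helper are invented for this example):

```python
# Hypothetical illustration of "slow down when the sensors disagree".
# The thresholds and the target_speed helper are invented for this sketch; this is
# not how any shipping driver-assistance system is actually implemented.
def target_speed(limit_mph: float, perception_confidence: float) -> float:
    """Cap the target speed as perception confidence drops, instead of hard braking."""
    c = max(0.0, min(1.0, perception_confidence))
    if c >= 0.9:                       # cameras agree: drive at the normal limit
        return limit_mph
    if c >= 0.5:                       # mild disagreement: ease off proportionally
        return limit_mph * (0.5 + 0.5 * c)
    return min(limit_mph, 25.0)        # serious disagreement: slow right down, alert driver

print(target_speed(70, 0.95))  # 70.0
print(target_speed(70, 0.70))  # 59.5
print(target_speed(70, 0.30))  # 25.0
```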
You presented maybe 5 actual data POINTS in the video, with an additional data set in the form of what crashed based on what.
I really expected more from a multibillion dollar journalistic company.
Their articles, of which there are at least 10, give links to loads of information. The government (NHTSA) has published a lot of information as well.
@rayphenicie7344 that changes nothing about the quality of this video, never did they even refer to those articles.
I've also heard that most people who work on the automatic driving don't trust it with their own lives.
Now compare this to human error crashes and you'll realize how much worse we are.
Well, this would be very, very easy, but for some reason (wonder why) no company has released its crash data so far.
I would have liked it if the video did a better job contextualizing Tesla Autopilot crash rates vs human crash rates. This would help better understand Elon’s claim that Autopilot is safer than human driving.
This 👆
Saying it’s safer than a human driver includes all kinds of reckless drivers and drunk driver. It has to be safer than a responsible human driver. I’m pretty sure that’s nearly impossible in the short term.
Saying it’s safer than a human driver is like saying that AI is a better writer than a human. Sure, it’s better in some cases and in some ways, but not all. It’s a huge generalization. Autopilot is a better driver in that it can stay centered in the lane better, but is a worse driver in so many other ways. Same with FSD. It’s a way better driver… until it’s not.
but then anti musk narrative won't work
Humans have more total crashes than Tesla autopilot has crashes.
Therefore, autopilot is safer.
This is Elon logic.
Please ignore that there are a much larger total of human drivers than Tesla autopilot drivers.
Why? Because of inattentive drivers thats why.
FSD is not perfect but it keeps getting better. I know that's no consolation to the people who lost loved ones, but the truth is that this technology is going to save thousands of lives in the near future.
Tesla- we have autopilot.
Normal car manufacturers- even if we had autopilot, we wouldn't be stupid enough to call it that.
_even if we had autopilot, we wouldn't be stupid enough to call it that._
Is this somehow dramatically different from Drive Pilot, which is what M-B calls their Level 3 ADS?
@BigBen621 as i said, they just wouldn't call it autopilot. For liability reasons. Even if it did function as such.
it's a GOOD term! look at the definition from Wikipedia for example: "An autopilot is a system used to control the path of a vehicle without requiring constant manual control by a human operator. Autopilots do not replace human operators. Instead, the autopilot assists the operator's control of the vehicle"
@@wkymole3 _they just wouldn't call it autopilot. For liability reasons._
Just what liability does this create for Tesla? They've already won two lawsuits, one in Germany and one in the U.S., brought by folks who claimed it's misleading.
I am sure that XAI600K will go 100x just like you said
I am sure, this is a scam :D
Years ago, I knew Elon was not smart when he, with full confidence, announced that he was shipping “autopilot” capable cars without radar. A pure camera based solution does not work for that use case.
One sensor for head-on collision would make it safer. But I agree with Elon that vision is better than relying on sensors.
Lame excuses
Lidar exists for a reason
@jacksmith-mu3ee LIDAR (all caps because it's an acronym) was invented in 1961; it wasn't invented for self-driving cars. I'm not sure what your point is.
@@didier_777 another lame excuse . Cars pre tesla had it
Thanks for proving my point bot .
Can we talk about how many accidents it SAVED? Can we do a similar video from another automaker?
Not a Tesla driver here, but I’m assuming self driving tech has improved since 2021. I can’t believe this is the focus of a report in 2024 ( almost 2025).
That is an assumption with no data to back it up.
@@mikebronicki8264 That is an assertion with no evidence to back it up.
Sorry, but I trust myself more than I ever will a computer, my condolences to those who have lost their lives to technology.
And you should. That's what the company expects you to do.
Seriously, it's a glorified cruise control that literally tells you at night when it's unsure about the conditions.
There are so many videos out there of morons who try to game the system and end up dying because they don't pay attention.
In real life, autopilots on airplanes require the same level of supervision, if not more.
Imma be honest, getting warned 17 times is wild
Unfortunately, in this case, she is going to lose her court case simply because her husband IGNORED the car's repeated warnings to keep his hands on the wheel. He was pre-warned. : (
Okay, maybe this is an unpopular opinion, but the driver is still the one responsible. Autopilot, ABS, parking sensors, etc. are just features that make driving easier and safer, but the driver needs to stay vigilant and observe the road. I feel like many of the Tesla crash videos look like the driver wasn't paying attention to driving. :/
Honestly, seeing the truck lying on its side in the middle of the road, I'm not sure a human driver could have done anything either. The result would probably have been the same. It's the truck driver who killed him, not the autopilot.
If I were John Bernal, I wouldn't be putting my Tesla in 'autopilot' mode, just in case my vehicle has had a unique update and diverts somewhere dark.
What is the Autopilot crash rate compared to the overall average?
Pretty important fact left out.
Not a Tesla fan boy but should the news also show how many times autopilot saved someone’s life or avoided a crash?
I mean, the news is supposed to be fair and balanced right?
This isn't a fluff piece; it's about the tech info Tesla has been keeping from the public. It's about public safety.
@ it’s 100% “orange man bad friends company” being targeted bc orange 🍊 🧍♂️ won.
News need to be fair and balanced, not just hit pieces.
That type of reporting doesn’t sell.
Sounds good, other than Tesla not allowing others to look at the data. How can the media report on something that cannot be independently verified?
When you give away control, only to realise you haven't any when it matters.
4:51 Sounds similar to what a certain submarine pilot/CEO said about his product. I wonder where he is now?
I’m sorry, but when the car warns you NINETEEN TIMES?!? I have empathy for anyone who loses their life but don’t blame the car when the humans don’t do what they’re supposed to- and that’s called pay attention to the road. Also, how many people died in other cars…at night…when visibility is low? Let’s compare apples to apples and see how that goes.
In other words, this self-driving technology is perfectly safe... as long as it isn't used for self driving like CEO Musk claims. And morons continue buying Teslas for this? 😂
I'm surprised they don't mention the crash in the Bay Area where the guy's Tesla veered out of his lane and tried to merge with a concrete divider that was missing its normal crash barrier - he'd told his wife that several previous times on his daily commute he would see the autopilot turn toward the divider as if it thought it was seeing a passing lane, and he'd grabbed the wheel to override it - but on this occasion he was preoccupied for a couple critical seconds and didn't react quickly enough, so he died in a head-on collision with a concrete barrier. Tesla reviewed the accident and pronounced it "driver error" because the driver had failed to override the autopilot when it made a mistake. The fact they could say something that absurd is why I'll never buy a Tesla.
3:47 Sure, the autopilot messed up here, but if the driver had been alert and supervising FSD, you can clearly see the flashing lights, and that crash would never have happened.
That's the issue with creating a false sense of safety. People end up relying on the tech too much and either get distracted or zone out. That's why they're being investigated for false marketing: even people like my aunt think it drives her everywhere with no issues.
Definitely, that specific one could have been avoided by the driver paying attention. The problem is Tesla's marketing (self-driving, autopilot, etc.) combined with their refusal to make crash data public means that people have been misled into trusting the technology. If you think Tesla crashes are one-off tragedies, versus actually understanding the shortcomings of the technology, it's easy to become overly reliant and trusting of the tech. I don't think it's fair to blame individuals when corporations and billionaires lie to make a profit and endanger their customers and the public as a result. Billionaires and corporations are NEVER held accountable.
Autopilot is not FSD. BUT BOTH CAN BE DISENGAGED BY MERELY APPLYING THE BRAKE! Seeing anything blocking the road, like a downed semi, I would have applied THE BRAKES! I have been using FSD since 2022 and it is improving quickly. Autopilot is NOT FSD, as it only allows one to steer without having to use the accelerator; however, to stop, one needs to brake. The video of the accident proves 1) there was plenty of time to brake, 2) he was going faster than he should have, and 3) he was not paying attention. Did his Tesla even have FSD? Because FSD would have slowed down and sent alarms to the driver.
"I don't think this is a long term technology that we're gonna keep in the cars." is actually a fair assessment of why its so controversial for cars.
If this type of tech were put into something that didn't encounter so many unknown variables, then it would be more effective than what it was marketed for.
I can easily imagine running into that overturned semi truck. No flares out, and it was dark. It fooled the autopilot, but it could easily have fooled a human too.
The principles of defensive driving say you should slow down and even stop until you figure out what is in the road ahead.
You seem to be forgetting that.
@ I went along with you until you added "you seem to be forgetting that." It was a dig, and unnecessary. And I do trust that you do indeed know that. See? I trust you.
Yeah. The human mind can easily go into ‘autopilot’ and fail to register the obstacle. Especially at 3am on a deserted highway.
To be clear, I'm no fan of Tesla's system. But are they, or are they not, safer than human drivers? Are they, or are they not, safer than competing systems? Are they, or are they not, safer than (extant, or possible) public transit alternatives? It's answers to these questions, ones about _how to improve our safety overall,_ and not anecdote-driven sensationalism, that will serve the public.
Of course it is a tragedy when someone dies on the road, but that tragedy is not increased or lessened by the brand or technology of the vehicle involved. Several of the videos you show are circumstances where _I_ might have been killed had I been the driver, so blaming the manufacturer without looking at the statistics is just muddying the water of an issue that we desperately need to understand clearly.
I don't feel the video, as presented, was a service at all.
@stephenspackman5573
_But are they, or are they not, safer than human drivers?_
Without a doubt.
_Are they, or are they not, safer than competing systems?_
Competing systems to Tesla Autopilot are the TACC and lane-centering systems that have been included in pretty much every new car in the past 10-15 years. The difference is that, unlike Tesla Autopilot, nobody is monitoring the safety of the other hundreds of millions of cars, so there's no way to measure whether Autopilot is or isn't safer than the scores of different TACC and lane-centering versions from all the other manufacturers. But because Tesla does do attention monitoring even on Autopilot, and many if not most of the competing systems don't, it's highly likely that Tesla Autopilot is safer than most, if not all, competing systems.
_Are they, or are they not, safer than (extant, or possible) public transit alternatives?_
Probably not.
This is probably one of the most well-thought-out and rational posts in the entire comments section of this blatant hit piece.
Autopilot and similar tech just needs to be regulated properly. Like any new technology, it starts as the Wild West and then gets regulated; it's part of the process.
Over 9 years, 51 deaths from Autopilot. Yes, that is 51 deaths too many; however, it pales in comparison to non-Autopilot deaths in the US.
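(A rough worked example of why raw death counts alone can't settle this: the comparison only makes sense per mile driven, and Autopilot-engaged mileage is exactly the figure that hasn't been made public. A minimal Python sketch; the 51-deaths figure is the one cited above, the US totals are only approximate, and the Autopilot-miles value is a made-up placeholder.)

```python
# Why raw death counts can't be compared directly: normalize by exposure (miles).
# All inputs are assumptions/placeholders, not verified data.

autopilot_deaths = 51          # figure cited in the comment above (over 9 years)
autopilot_miles = 5e9          # HYPOTHETICAL placeholder: Autopilot-engaged miles (not public)

us_deaths_per_year = 40_000    # approximate US road deaths per year
us_miles_per_year = 3e12       # approximate US vehicle-miles travelled per year

autopilot_rate = autopilot_deaths / autopilot_miles * 1e9   # deaths per billion miles
us_rate = us_deaths_per_year / us_miles_per_year * 1e9      # deaths per billion miles

print(f"Autopilot (assumed miles): {autopilot_rate:.1f} deaths per billion miles")
print(f"US fleet average:          {us_rate:.1f} deaths per billion miles")
# The conclusion flips entirely depending on the Autopilot-miles assumption,
# which is exactly the data that hasn't been released.
```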
The average driver would have crashed in the majority of these examples
I guess with Elon Musk basically being the President now he and Tesla are immune from regulation.
not how it works
If lidar is so expensive, how come you can find it in sub-$1k smartphones?
@neodym5809 _If Lidar is so expensive, how come that you can find them in sub $1k smartphones?_
Are you seriously comparing the cost of a smartphone LIDAR, which has a range of around 15 ft., with the cost of LIDAR for ADS, which must have a range of perhaps 30 times that?
I find it hard to have sympathy when the car alerted the guy 19 times to take over. He was misusing the technology, being careless, and having no regard for his own safety, his family, or other people on the road.
He clearly wasn't paying attention to the road and was over relying on self driving.
Agreed. It seems most likely he fell asleep.
Should the car be monitoring the driver and have alarms to wake a sleeping driver?
Has any study shown that self-driving cars crash more than other cars?
Because all I ever see is data about self-driving cars, and emotional reactions to there not being a human driver (as if human drivers were perfect).
Yes it has been proven that sleeping drivers = 100% crash rate.
The only thing Tesla autopilot trains is drivers to become complacent
This whole fretting thing is so stupid. The only thing that matters for AI driving is: does it SAVE MORE LIVES THAN IT KILLS? No driving system will ever be perfect. It just needs to be better than the average human driver.
With a Level 1 or 2 system like Tesla's, it's even better, because you combine the strengths of both the AI and the human. The AI is never distracted, and the human is supervising. The combination is only unsafe when the human becomes distracted, which is 100% the fault of the human.
I have to wonder what the actual reason for removing lidar/radar from Teslas was. Was it a cost-saving measure during COVID and the supply-chain difficulties?
Cost is always a factor, but the issue is that (1) if a human can drive with just vision, then so can a computer, and (2) sorting out conflicting information from the cameras and radar is difficult, especially in bad weather
Sensor interference and cost. They said that HD radars would integrate better and would make more sense. The current Model S and X do come with HD radar but afaik they don't make any use of it yet.
Tesla's approach is to train their cars to drive like humans do. Humans don't have lidar.
One, it is quite expensive. Two, lidar gives you a point cloud in front of the car, which can be used to detect obstacles with a lot of compute, but you have no sense of what those obstacles are. A 1-foot cube of styrofoam and a 1-foot cube of solid steel will look the same to lidar. So from an information-utility and processing standpoint, lidar does not offer much over cameras and computer vision.
@@imsrini But all of these autopilot crashes where the cameras didn't know what they were looking at would not have happened with radar/lidar.
If a human would have been able to see it, and the Tesla alerted the driver 19 times to put his hands on the wheel, then who was really at fault?
This is VERY outdated news. That semi crash you opened with happened when they were still using radar - and was LONG before FSD was mature. There are now 6 million Teslas on the road. At an average of 15,000 miles driven per year, that's 90 billion Tesla-miles driven per year. Over all car types, there are 14.3 collisions per million miles driven. So if Tesla vehicles were like ordinary cars, you'd expect 90,000 x 14.3 = 1.3 million Tesla collisions per year. You say that "thousands" of Tesla-related accidents have been reported. If it were "millions per year" there would be a problem - but it's not. Also, you're talking about the UK version of FSD - which is FAR from complete. All Tesla owners are told at the start of every trip that they must supervise their car while it's driving. Clearly, if you slam into a very obvious semi-truck, then you're not supervising the car's driving. Teslas are (statistically) the safest cars on the road - and they are safest when using FSD.
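(For what it's worth, the back-of-envelope arithmetic in the comment above can be written out explicitly. The fleet size, annual mileage, and per-million-mile collision rate are the commenter's own claims, not verified figures; this sketch only reproduces their calculation.)

```python
# Reproduces the commenter's own arithmetic; inputs are their claims, not verified data.
teslas_on_road = 6_000_000            # claimed fleet size
miles_per_car_per_year = 15_000       # claimed average annual mileage per car
collisions_per_million_miles = 14.3   # claimed all-vehicle collision rate

tesla_miles_per_year = teslas_on_road * miles_per_car_per_year       # 90 billion miles
expected_collisions = (tesla_miles_per_year / 1_000_000) * collisions_per_million_miles

print(f"Tesla-miles per year: {tesla_miles_per_year:,.0f}")
print(f"Expected collisions at the claimed average rate: {expected_collisions:,.0f}")  # ~1.3 million
# The argument is that reported Tesla crash counts are far below this figure,
# but without per-mile data released by Tesla the comparison stays rough.
```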
another low-quality hit piece from the WSJ
What did they say that wasn't accurate? You shouldn't call something a hit piece just because you don't like the facts. Elon is incredibly irresponsible, and he constantly misrepresents the capabilities of "self driving." Did you know that Tesla faked some of its promotional material showing a Tesla driving someone to work? It was an edited compilation of many, many attempts, and they made it look like it did it right the first time with no mistakes.
Tesla FSD is Level 2 autonomous driving, which means the driver must watch the road at all times and keep hands on the steering wheel. End of story. If a driver does not pay attention to the road, it is (legally) his/her fault. I drive a Tesla and have FSD.
This is a legal death machine. Its name and advertising totally downplay this "legal" part. The confusion is enhanced by the name "autopilot" and the existence of truly autonomous taxi services.
And it is vastly documented in the scientific literature that a high degree of automation seriously decreases the vigilance of anyone who is supposed to monitor it and retake control immediately in the rare cases of failure or error. The ability to react in a timely manner and take the right corrective action in unexpected, complex situations where the automation suddenly fails is proven to be badly degraded, compared with continuous manual control, where dealing with a sudden, unusual situation is much more likely to succeed (within human limits, obviously).
This is a low-cost hardware system made for profit, covered by a mere legal umbrella to hide known, documented, dangerous flaws originating from a cheap, flawed design. Period.