Tesla's We, Robot Event: AI Confusion or Cybernetic Revolution?

  • Published Oct 16, 2024

Comments • 3

  • @charliedoyle7824 • 4 days ago • +1

    I'm glad to see Grayson recognizes that FSD won't be ready until 2030. That's somewhat reasonable, but still too optimistic. FSD is very, very far from being reliable enough to go driverless on a large scale in big cities, on random taxi rides.
    The "zero intervention" drives Grayson is so impressed with are nearly all in light traffic on simple road infrastructure.
    It's not an exaggeration to say that 99% of driving is easy, and the last 1% is 99% of the challenge. Tesla 12.5.xx FSD is now a good demo, able to fool the naive into thinking it's almost ready to solve driving. It's about where Google Self-Driving was in 2011.
    In fact, FSD likely couldn't last a safe hour in a busy city, and I'm sure Grayson knows this. If he took FSD away from his simple Palm Beach leisure drives to downtown Miami or Orlando, and drove around on random trips for one hour during business hours in good weather, FSD would need at least one safety intervention, and likely many more. It would get lost, miss lots of turns, and make the human nervous by hesitating in intersections, driving in bus lanes, misreading traffic rules, and balking at all the pedestrian chaos.
    I'll tell you, Grayson, right now: Tesla won't have a big robotaxi business able to deploy 24/7 across a full metro by 2030. The last 1% of autonomy, the very long tail of difficult and dangerous situations, is an enormous challenge, and it probably won't be solved by FSD with lousy maps; cheap, badly placed, too-few cameras; no roof-mounted or long-range sensors; no remote ops; and only a black-box AI making driving-policy decisions from raw camera images. That stack won't be reliable or redundant enough against everything the long tail throws at it over a hundred million or more miles of metro driving to reach the necessary MTBF numbers.
    The minimum safety level FSD has to achieve is the current Waymo Driver safety level, but over at least 100 million miles of city driving. That includes Waymo's ZERO AT-FAULT BODILY-INJURY ACCIDENTS over their first 30 million miles. If it's significantly worse than this, Tesla would end up like Cruise: on the sidelines in most cities, with a very bad public image and big legal problems (see the rough math sketched below).
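
A rough sense of what those mileage figures actually demonstrate: with zero events observed over n miles, the classic "rule of three" puts the 95% upper confidence bound on the event rate at about 3/n. A minimal Python sketch, assuming a Poisson crash process; the 30-million-mile figure comes from the comment above, and everything else here is illustrative:

```python
import math

def rate_upper_bound(event_free_miles: float, confidence: float = 0.95) -> float:
    """Upper confidence bound on a per-mile event rate after observing zero
    events over event_free_miles, assuming a Poisson process:
    bound = -ln(1 - confidence) / miles, i.e. roughly 3/miles at 95%."""
    return -math.log(1.0 - confidence) / event_free_miles

# Zero at-fault bodily-injury crashes over ~30 million driverless miles:
bound = rate_upper_bound(30e6)
print(f"95% upper bound: ~1 injury crash per {1 / bound:,.0f} miles")
# Prints roughly 1 per 10 million miles. Demonstrating a 100-million-mile
# MTBF at the same confidence would take on the order of 300 million
# event-free miles.
```

By this bound, 30 million clean miles only demonstrates about one injury crash per 10 million miles at best, which is why the commenter's call for 100+ million miles of evidence is the right order of magnitude.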

  • @Nunya-lz9ey • 3 days ago • +1

    Everybody was convinced? Lol, it was pretty obvious to me they were remotely operated.

  • @charliedoyle7824 • 4 days ago

    Texas doesn't need a carveout to exclude personally-owned autonomous vehicles from regulation.
    AVs are already legal on Texas roads, as Grayson said. The only catch is that driving-safety laws against reckless driving still apply, and of course that won't change.
    The only requirement for AVs in TX and many other states is that the ADS needs a "fallback" system that prevents crashes when the ADS fails at driving, and the vehicle needs to get out of the way after an accident, if possible. Current FSD doesn't have any reasonable fallback system to prevent crashes. It does beep for a handoff to the human in some cases, but too often it just makes a dangerous move and needs sudden human action to prevent a crash. That's why it's Level 2 and doesn't have a legal fallback system. Any lawyer could show that in court by comparing FSD to Waymo, which has an elaborate and effective fallback system along with a great safety record. Tesla shouldn't try to deploy driverlessly until they are as good as Waymo.
    All deployed driverless AVs can avoid crashes without a human intervention, by stopping or pulling over when the computed risk is too high; then a remote human gets on the line to help the vehicle on its way, or someone comes to retrieve it (a sketch of that logic follows below). When they do cause a bad crash, it's very risky for the ADS company, even in Arizona, and certainly in CA. Maybe in Texas the state won't prosecute AV drivers, for political reasons. Or maybe it will, if the public finds them too unsafe. Either way, a reckless AV company still has personal-injury and property-damage lawsuits to worry about.
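
For concreteness, here is a minimal sketch of the fallback pattern described above: a tiny state machine that degrades to a minimal-risk maneuver when estimated risk crosses a threshold, then waits for remote assistance once stopped. The mode names, the `estimated_risk` signal, and the threshold are illustrative assumptions, not any vendor's actual design:

```python
from enum import Enum, auto

class Mode(Enum):
    DRIVING = auto()           # nominal autonomous operation
    MINIMAL_RISK = auto()      # executing a stop / pull-over maneuver
    AWAIT_ASSISTANCE = auto()  # stopped; remote ops helps or car is retrieved

def fallback_step(mode: Mode, estimated_risk: float, stopped: bool,
                  risk_limit: float = 0.8) -> Mode:
    """One tick of an illustrative fallback policy: rather than handing
    control to a human mid-crisis, the vehicle degrades to a minimal-risk
    maneuver, then waits for remote assistance once safely stopped."""
    if mode is Mode.DRIVING and estimated_risk > risk_limit:
        return Mode.MINIMAL_RISK
    if mode is Mode.MINIMAL_RISK and stopped:
        return Mode.AWAIT_ASSISTANCE
    return mode
```

The key contrast with a Level 2 system is that no step here depends on a human taking over in time.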
    Perhaps Texas could get rid of the fallback requirement with a new bill, but that would be odd. Even without a fallback requirement, AVs still can't legally cause crashes in any state, and owners of crashing cars will be breaking the law and liable, potentially losing their right to drive.
    So the only thing stopping FSD from going driverless in most states is that it would get stuck and crash far too often, and the owners would eventually be prosecuted and lose the right to drive. In Florida, Tesla itself would be charged, because the producer of the ADS is the legal driver in FL.
    AV companies don't deploy for driverless testing until they are statistically convinced they won't cause serious crashes in any foreseeable way. That's why Zoox is taking so long to pull the driver on real roads, despite being far safer than FSD.
    Pulling the driver is a "bet the company" moment.