Model S Plaid | ADAS Breakdown

  • Published Nov 7, 2024

Comments • 495

  • @fordnut74
    @fordnut74 2 ปีที่แล้ว +60

    I have to say that I appreciate putting your "behind the scenes" guys in front of the camera. Munro doesn't care about perception, they care about facts and knowledge.

    • @AdmiralQuality
      @AdmiralQuality 2 ปีที่แล้ว +3

      If only he wasn't so wrong about so many things.

    • @AwesomeBlackDude
      @AwesomeBlackDude 2 ปีที่แล้ว +1

      @@AdmiralQuality how's that exactly?

    • @eubikedude
      @eubikedude 2 ปีที่แล้ว +1

      @@AwesomeBlackDude The iris is not what does the focusing in the human eye for one thing.

    • @AwesomeBlackDude
      @AwesomeBlackDude 2 ปีที่แล้ว +1

      @@eubikedude pls continues... 😬

    • @AdmiralQuality
      @AdmiralQuality 2 ปีที่แล้ว

      @@eubikedude Exactly. Several more too but I just don't care to get into it, he's not worth it.

  • @space_2597
    @space_2597 2 ปีที่แล้ว +67

    Structure from motion is used to get depth from a single camera. Stereo vision is only useful at close range, ~20 m, given how close together the cameras are (see the depth-resolution sketch at the end of this thread).

    • @briansilver3412
      @briansilver3412 2 ปีที่แล้ว +16

      Exactly. Stereo vision is needed for static images. Tesla is using video (4D) processing.

    • @jonwatte4293
      @jonwatte4293 2 ปีที่แล้ว +5

      @@briansilver3412 stereo still helps, especially if you can also get wide baseline (two upper windshield corners for example.)

    • @ricknash3055
      @ricknash3055 2 ปีที่แล้ว +1

      How do birds manage depth with eyeballs less than 1cm apart?

    • @space_2597
      @space_2597 2 ปีที่แล้ว +3

      @@ricknash3055 same as humans they use other cues. We use stereo for objects within hand reach/throwing distance

    • @harsimranbansal5355
      @harsimranbansal5355 2 ปีที่แล้ว +1

      Exactly. A good question to ask is how Tesla gets distance measurements from the side or rear cameras, where there is only one camera facing each direction, not two.
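
A rough sense of why a narrow camera spacing only gives useful stereo depth out to a few tens of metres: with the pinhole stereo relation Z = f·B/d, depth error grows with the square of distance. The sketch below uses assumed numbers (focal length in pixels, camera spacing, disparity matching error), not actual Tesla camera specs.

```python
# Minimal stereo-depth sketch (illustrative numbers, not Tesla's camera specs).
# Pinhole stereo: depth Z = f * B / d, so a disparity error dd maps to a depth
# error of roughly Z^2 * dd / (f * B), i.e. it grows with the square of range.
focal_px = 1000.0      # assumed focal length in pixels
baseline_m = 0.15      # assumed spacing between the two forward cameras (m)
disp_err_px = 0.5      # assumed disparity matching error (pixels)

def depth_uncertainty(depth_m):
    return depth_m ** 2 * disp_err_px / (focal_px * baseline_m)

for z in (5, 10, 20, 50, 100):
    print(f"at {z:>3} m: +/- {depth_uncertainty(z):5.1f} m depth uncertainty")
```

With these assumptions the error is around a metre at 20 m but tens of metres at 100 m, which is why the ~20 m figure quoted above is plausible for a windshield-width baseline.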

  • @raddaks2039
    @raddaks2039 2 ปีที่แล้ว +30

    Drive safe everyone! That's good advice from someone who deals with these safety systems a lot. Thanks for the video!

  • @naeem_bari
    @naeem_bari 2 ปีที่แล้ว +30

    Excellent job as usual. Chris is clearly an expert and Ben did a great job moving the conversation along on the right track.

  • @gelisob
    @gelisob 2 ปีที่แล้ว +19

    2:55 The front camera feeds (probably all of them) are used to generate a 3D world through photogrammetry-like algorithms. It is constantly building a point cloud of what it sees; three images from separated vantage points are enough to make one.
    And when moving, they have something like 60 images per second per camera. That's why they don't need the radar: they are constantly building a 3D world in front of and around the car, on the fly, and you can tell fairly accurately how far away each point is (see the frame-spacing arithmetic at the end of this thread).

    • @sjokomelk
      @sjokomelk 2 ปีที่แล้ว +2

      It is 36 frames per second

    • @AwesomeBlackDude
      @AwesomeBlackDude 2 ปีที่แล้ว

      @@sjokomelk what's the reason for the extra six frames? 🤔

    • @sjokomelk
      @sjokomelk 2 ปีที่แล้ว +1

      @@AwesomeBlackDude You have to ask Tesla about that. But that is the frame rate they use for capturing data from the cameras.

    • @daniel_960_
      @daniel_960_ 2 ปีที่แล้ว +1

      @@AwesomeBlackDude best compromise between amount of information and processing.

    • @AwesomeBlackDude
      @AwesomeBlackDude 2 ปีที่แล้ว

      @@daniel_960_ Thanks guys for the (CFF) electro-retinogram fps explanation. Do you guys have any idea when Tesla will introduce audio detection? 😬
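
For scale, here is how much "baseline" structure from motion gets simply from the car's own movement between frames. The 36 fps figure is the one quoted in this thread; the speeds are illustrative, and forward motion mainly yields parallax for points off the optical axis.

```python
# "Baseline" that structure from motion gets for free from ego-motion.
# 36 fps is the figure quoted in this thread; speeds are illustrative.
fps = 36.0

for speed_kmh in (30, 60, 120):
    speed_ms = speed_kmh / 3.6
    print(f"{speed_kmh:>3} km/h: {speed_ms / fps * 100:4.1f} cm of travel per frame, "
          f"{speed_ms:4.1f} m over one second of video")
```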

  • @1800Supreme
    @1800Supreme 2 ปีที่แล้ว +24

    You can somewhat substitute for stereoscopic vision with one camera and time. The single camera can compare the image from one second ago to the present image and, assuming the car has moved, use the difference between the two images to extrapolate depth data. This can only happen as fast as the camera frame rate (see the motion-parallax sketch after this thread).
    Temporal stereoscopic imaging.

    • @nettlesoup
      @nettlesoup 2 ปีที่แล้ว +2

      Yes, cf. human drivers with only one usable eye, who are legally allowed to drive. Or, put your hand over one eye, then start walking around your house. You'll find your brain fills in most of the 3D information as you move around.

    • @AlexanderNassian
      @AlexanderNassian 2 ปีที่แล้ว

      Not only "somewhat substitute", it's far more precise. Stereoscopic vision only works around 60cm in front of the head due to the distance of our eyes.

  • @pablopicaro7649
    @pablopicaro7649 2 ปีที่แล้ว +48

    Fabulous review of electronics that no one outside an OEM or supplier would ever see at this level of detail.

    • @TheMoni7548
      @TheMoni7548 2 ปีที่แล้ว +8

      It was actually pretty shallow

    • @SupremeRuleroftheWorld
      @SupremeRuleroftheWorld 2 ปีที่แล้ว

      Try this channel, it goes way deeper into Tesla electronics: th-cam.com/users/Ingineerix

    • @KarmaticEvolution
      @KarmaticEvolution 2 ปีที่แล้ว +1

      @@TheMoni7548 So not in-depth enough for you?

    • @AudiTTQuattro2003
      @AudiTTQuattro2003 2 ปีที่แล้ว

      @@TheMoni7548 ...in that much of it is off the shelf? Seems the Tesla hype is maybe just that, other than the software part.

    • @diavalus
      @diavalus 2 ปีที่แล้ว +1

      @@KarmaticEvolution for someone who may be passionate about cars, this is nothing new but a simple introduction on how ADAS works

  • @fredbloggs5902
    @fredbloggs5902 2 ปีที่แล้ว +35

    Historical note: The Tesla FSD chip was designed by Jim Keller, who previously designed the iPhone A4/A5 chip.

    • @markplott4820
      @markplott4820 2 ปีที่แล้ว +4

      FACT , Tesla D1 chip is better than apple M1

    • @meamzcs
      @meamzcs 2 ปีที่แล้ว +12

      @@markplott4820 You can't compare them AT ALL.

    • @markplott4820
      @markplott4820 2 ปีที่แล้ว +3

      @@meamzcs - when does Apple have AI DAY, production robot cars, humanoid robots, AI factory software, training tile, DOJO Supercomputer ?
      when was last time APPLE innovated anything ? not lately......

    • @chickenp7038
      @chickenp7038 2 ปีที่แล้ว +8

      @@markplott4820 dojo chip is a neural network training chip while m1 is soc

    • @Hibbidyhai
      @Hibbidyhai 2 ปีที่แล้ว +10

      @@markplott4820 Tesla really is following in Apple's footsteps. The emphasis on vertical integration, designing almost every aspect of your product yourself, and having all of those components working together on a deeper level than the competition (who is outsourcing more of their core competencies) is very Apple.
      Rather than use a common standard (micro USB) Apple developed their own superior standard, the Lightning port.
      Rather than use a common standard (J plug) Tesla developed their own superior proprietary charging port.
      Rather than continuing to rely on a supplier's chip (Nvidia) Tesla developed their own proprietary chip designs.
      Rather than continuing to rely on a supplier's chip (Intel) Apple leveraged their previous proprietary chip designs, the A-series, and evolved them into desktop chips.
      Apple was not the first company to enter into most of the product categories they compete in, but when they enter a category they tend to offer a product superior or at least equal to anything available from competitors, if they do tend to be more expensive.
      Likewise Tesla was not the first company to offer a commercially available electric vehicle, but it was far superior to any available competition at the time. There are now plenty of viable cars competing with Tesla (even if competitors struggle to manufacture them at scale). Some competing models are cheaper but less capable, and others are on par capability-wise but just as expensive (and not widely available). Tesla is also no longer the first to enter every product category (the Rivian truck entering the market before the Cybertruck). But Tesla is still the standard bearer we judge every other car against, just as in many product categories Apple is the standard that others are judged by (in tablets, smartphones, and smartwatches, etc.)
      Knocking Apple for a lack of innovation compared to Tesla is just an ignorant opinion, when Tesla is so clearly taking a lot of lessons from Apple (not just in the ability to offer something new and revolutionary, but also in how the development and manufacturing of a product happens).
      Even the fancy presentation of a new product with a charismatic CEO on a big stage is something that Tesla does that follows in Apple's footsteps.

  • @paullester2535
    @paullester2535 2 ปีที่แล้ว +18

    PSA: "pay attention when driving" - wise words Chris

  • @magnuslarsson337
    @magnuslarsson337 2 ปีที่แล้ว +8

    1. The dual SoCs on the Gen 3 HW are there for redundancy, but both are used to increase compute when both are working.
    Elon:
    “Good analysis. Both computers will be used & sync ~20 times per second. This is a long time for a computer. Like a twin-engine plane, use both engines to max for normal operation, but can safely operate on just one.”
    2.
    Teslas can hear using the cabin microphone.
    Apple has similar functionality in the accessibility settings to detect sounds like: fire, fire alarm, crying baby, yelling, sirens, running water, shattered glass, etc.
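
A generic sketch of the cross-check pattern the quoted tweet describes: two compute nodes each produce a result, the results are compared at the sync rate, and the system degrades to the healthy node on disagreement or silence. This only illustrates the general redundancy pattern, with invented placeholder planners; it is not Tesla's actual software.

```python
import random

# Generic dual-computer cross-check (an illustration of the pattern in the
# quoted tweet, NOT Tesla's actual design). Two nodes compute a plan each
# cycle; results are compared at the sync rate; on disagreement or silence,
# fall back to the healthy node.
SYNC_HZ = 20
random.seed(1)

def plan_a(cycle):
    return cycle * 0.1                     # stand-in for node A's output

def plan_b(cycle):
    if random.random() < 0.02:             # inject an occasional fault
        return None
    return cycle * 0.1                     # stand-in for node B's output

for cycle in range(5 * SYNC_HZ):           # ~5 seconds of 20 Hz sync cycles
    a, b = plan_a(cycle), plan_b(cycle)
    if b is None:
        print(f"cycle {cycle}: node B silent -> continue on node A alone")
    elif abs(a - b) > 1e-9:
        print(f"cycle {cycle}: nodes disagree -> degrade to safe behaviour")
    # otherwise both agree: normal operation with combined compute
```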

  • @kenchow8213
    @kenchow8213 2 ปีที่แล้ว +14

    I remember Tesla saying that they use the internal radar to detect when busy parents accidentally forget/leave their kids in car seats in the car. Apparently it's more common than you think.

  • @slartybartfarst9737
    @slartybartfarst9737 2 ปีที่แล้ว +10

    Ben and Chris, thank you for the granularity on the electronics; the insight into the progression over time gives us engineer investors the green light to keep on investing. When I see evidence of Tesla's continuous-improvement pace, I sleep at night.

    • @surferdude4487
      @surferdude4487 2 ปีที่แล้ว

      How's progress on the Earth II?

  • @chicomtl
    @chicomtl 2 ปีที่แล้ว +2

    Man, I never get tired of your videos. This is way beyond my field of expertise and I always listen to the end. Always learn something new. Thanks to all the Munro team. Great work on these videos.

  • @stevebakker6884
    @stevebakker6884 2 ปีที่แล้ว +1

    Always a good idea to explain acronyms when they are introduced

  • @TeslaFSDStudent
    @TeslaFSDStudent 2 ปีที่แล้ว +1

    Really enjoyed this video and thanks for bringing some clarity on the Tesla chipset.
    Also, the best advice ever is “be safe while driving” which is applied at any moment in any vehicle!

  • @Slebonson
    @Slebonson 2 ปีที่แล้ว +5

    Thanks Chris really good explanation of what is going on with the car.

  • @john0constantine
    @john0constantine 2 ปีที่แล้ว +8

    3:35 I don't mean to be rude, but you seem to have a quite idealized view of radar. I've worked quite a lot with Conti automotive radar. The spatial resolution is not what you might expect. The amount of noise and artifacts is substantial. You have a very non-ideal antenna characteristic which routinely gives you phantom objects in weird locations. Distinguishing multiple cars on the road is a challenge. Radar might be useful because it can directly measure the velocity of other objects, without needing to derive it from successive distance measurements (see the Doppler arithmetic after this thread), but apart from that it's much more unreliable than you make it sound.

    • @martinmusli3044
      @martinmusli3044 2 ปีที่แล้ว

      He only said Radar is used to confirm an object is there. Period. No judgment
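
On radar measuring velocity directly: the return's Doppler shift is proportional to the target's radial speed, so no differencing of range over time is needed. A quick sketch with the usual 77 GHz automotive radar band and illustrative speeds:

```python
# Doppler shift of a radar return vs. the target's radial (closing) speed.
# 77 GHz is the common automotive radar band; speeds are illustrative.
C = 3.0e8            # speed of light (m/s)
F_CARRIER = 77e9     # radar carrier frequency (Hz)

def doppler_hz(radial_speed_ms):
    # two-way Doppler: f_d = 2 * v * f_c / c
    return 2.0 * radial_speed_ms * F_CARRIER / C

for v in (1, 10, 30):
    print(f"{v:>2} m/s closing -> {doppler_hz(v) / 1e3:5.1f} kHz shift")
```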

  • @DarkBrandon1
    @DarkBrandon1 2 ปีที่แล้ว +21

    My favorite explanation of Autonomous driving from Tesla is: “It’s a software problem, not a hardware problem.” Other OEMs are trying to solve a software problem with lots of hardware (Cameras, Radar, LiDAR, etc). I’d love to see a “tear down” of Tesla’s software strategy vs. other OEMs. Might be too much to ask.

    • @pipooh1
      @pipooh1 2 ปีที่แล้ว +6

      One thing I learned is: if you can remove any variable from a system you are optimizing, do so. Tesla did it by removing as many systems as possible and relying only on vision and sensors. That way they can fine-tune these 2 systems to 99.9% and can later add more systems without issue, since they know their base system is fully reliable as a backup.

    • @bobqzzi
      @bobqzzi 2 ปีที่แล้ว

      It's an explanation that makes me laugh every time I hear it.

    • @DarkBrandon1
      @DarkBrandon1 2 ปีที่แล้ว +1

      @@bobqzzi that it’s a software problem? How so?

    • @JosiahLuscher
      @JosiahLuscher 2 ปีที่แล้ว

      Have you seen the AI day presentation? They gave a nice overview of how FSD is being developed.

    • @DarkBrandon1
      @DarkBrandon1 2 ปีที่แล้ว

      @@JosiahLuscher right, but we don’t have anything else even close to AI day for the other OEMs. It’s impossible to compare right now because they keep us in the dark.

  • @robcossin4690
    @robcossin4690 2 ปีที่แล้ว +10

    9:00 correction: Model Y to S, not 3 to S. But loved this explanation, thank you.

  • @TheGreatestJuJu
    @TheGreatestJuJu 2 ปีที่แล้ว +9

    *Tesla has a huge blind spot in cross traffic detection*
    The only cameras that can view cross traffic are on the B-pillar. Tesla's next hardware suite needs to add cameras to the headlights looking sideways, one near-sighted and one far-sighted. You can see the problem in the indecision in their full self-driving videos. New headlights make a retrofit easy, but they need to get realistic and do it as soon as possible, before there are too many cars on the road and the retrofit costs more.

  • @3nityC
    @3nityC 2 ปีที่แล้ว

    Ben is the best! We laymen can easily digest it. Good job Munro!

  • @eubikedude
    @eubikedude 2 ปีที่แล้ว +1

    2:29 The human iris adjusts for the amount of light coming in, the natural lens, inside the eye, is what does the focusing.

  • @Urgelt
    @Urgelt 2 ปีที่แล้ว +29

    No.
    Exceeding a human driver's capabilities does not require radar and lidar. The keys to beating human drivers are decision speed and appropriateness of decision-making, not sensor input.
    This is the reason that Tesla is focused on machine learning at scale and advanced neural net chips. They are attempting to get decision speed up and improve appropriateness of decision-making. More sensors won't help with either goal.

    • @AudiTTQuattro2003
      @AudiTTQuattro2003 2 ปีที่แล้ว

      ...attempting is a good description. They may get there, even this year maybe, but it obviously isn't a simple problem regardless of what Elon has implied through countless ready date revisions.

    • @Urgelt
      @Urgelt 2 ปีที่แล้ว +3

      @@AudiTTQuattro2003 I agree that this is a difficult problem. I also agree that, in the past, Elon has been overoptimistic on timelines. He may be overoptimistic even now.
      That's a point worth considering. But there is another point worth considering, too.
      That point is, only Tesla is aggressively pursuing machine learning at scale to solve autonomy. No-one else can do this. No-one else has *ever* attempted to apply machine learning tech on a problem even a tenth as difficult.
      Which means that there are a slew of other companies working on robotaxis who are using 90's tech. Their prototypes are vehicles on virtual rails (geomaps) which use sensors, not to identify objects encountered or to anticipate what they might do, but to reactively avoid collisions as best they can. They almost all rely on lidar and radar as well as optical sensors, they drive on their virtual rails and stop if something is in the way.
      And those competitors are worried. If Tesla succeeds, the competitors' approach will be instantly rendered obsolete, their investments wasted, their market opportunities squandered.
      Even worse, Tesla's approach is sufficiently inexpensive that they will be able to offer it on every car they sell, and operate it anywhere regulators will permit it. By contrast, geofencing requires aggressive map updates (expensive) and expensive add-on equipment that would price cars beyond the reach of most consumers. So they're really only good for robotaxis in high-density urban areas, and their fare prices will be higher than Tesla could offer, if Tesla succeeds.
      Hence the opposition from operators like Green Hills. Tesla is posing an existential threat to all other developers of autonomous vehicles.
      And they're squawking about it, too - running to legislators, to NHTSA, to the public, trying desperately to get government to slam the brakes on Tesla.
      You know what that tells me?
      It tells me that these competitors believe that Tesla will succeed. Maybe by year's end, maybe not, doesn't matter. They believe Tesla's tech will work, if nothing is done to stop Tesla.
      If they didn't believe that, they would just quietly go about their business, content in the knowledge that Tesla was wasting its time, energy and money on a fruitless technology.
      I think they're correct to be worried. Tesla will disrupt its autonomy competitors completely out of the business they've worked so hard to enter.

  • @frankkeel8410
    @frankkeel8410 2 ปีที่แล้ว

    Excellent presentation! This engineer is so knowledgeable! A great asset to Munro!

  • @fredbloggs5902
    @fredbloggs5902 2 ปีที่แล้ว +8

    02:20 In humans the iris is for aperture control; focus is through distortion of the lens by the ciliary muscles.
    (Note, this is from my biology lessons in 1974, so I could be wrong 🤣).

    • @SodaPopin5ki
      @SodaPopin5ki 2 ปีที่แล้ว +1

      They weren't talking about aperture, but the focal length of the camera lenses. Focal length does different things for cameras and eyes. Eyes have a fixed field-of-view angle (say 90 degrees per eye), and our lenses change focus to change the distance of what we're focusing on. Cameras have different focal lengths to change the field-of-view angle (see the sketch at the end of this thread), and move the lens back and forth to change what's being focused on, distance-wise.

    • @fredbloggs5902
      @fredbloggs5902 2 ปีที่แล้ว +2

      @@SodaPopin5ki I gave the precise time code in the video, he specifically stated that the iris was for focus:
      “So for a human eye, your IRIS will resize to focus on something close or something far”.
      But you’re welcome to believe whatever you like, everyone here can see for themselves.

    • @patricklegault6383
      @patricklegault6383 2 ปีที่แล้ว

      @@SodaPopin5ki Same thing happens to a human driver... I had cataract surgery, so I have a long-range lens in one eye and a short-range one in my other eye. I usually don't wear glasses but I can drive. However it gives me a hell of a headache after 20-30 minutes of driving. If I can do it, so can a computer. Fixed-distance lenses are a pain to deal with, but it is manageable for me. A computer should be able to handle it and its distortion without the headache :p

    • @EwanM11
      @EwanM11 2 ปีที่แล้ว

      Don't worry, eyes haven't changed since 1974

    • @SodaPopin5ki
      @SodaPopin5ki 2 ปีที่แล้ว

      @@fredbloggs5902 I assumed he meant lens, as the iris has no focal length. It's the thing that changes the size of the pupil, which is a hole, akin to the aperture of a camera. It lets more or less light in.
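
For reference, the pinhole relation between focal length and field of view that this thread is circling around: a shorter focal length gives a wider view, a longer one a narrower, more "zoomed-in" view. The sensor width and focal lengths below are generic assumptions, not the specs of any particular camera module.

```python
import math

# Pinhole relation between focal length and horizontal field of view:
# FOV = 2 * atan(sensor_width / (2 * focal_length)).
# Sensor width and focal lengths are generic assumptions, not any real module.
SENSOR_WIDTH_MM = 5.0

for focal_mm in (2.0, 6.0, 12.0):
    fov = math.degrees(2 * math.atan(SENSOR_WIDTH_MM / (2 * focal_mm)))
    print(f"{focal_mm:4.1f} mm lens -> {fov:5.1f} deg horizontal FOV")
```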

  • @john0constantine
    @john0constantine 2 ปีที่แล้ว +5

    2:45 I'm pretty sure it's not used for stereoscopic vision. The cameras would need to be much further apart; at that small a separation the depth resolution is just too bad. Besides that, for stereoscopy you need to very precisely sync the times at which images are recorded, which is a huge pain. And the matching algorithm doesn't run itself. It's a lot of trouble for very limited value.

  • @frangalarza
    @frangalarza 2 ปีที่แล้ว +8

    This is a great insight into how people think inside their silos. When all you have is a hammer, everything looks like a nail.
    When your job is to analyse low-voltage electronics, the self-driving problem becomes just a hardware problem. You can tell this by how Chris never mentioned software and neural networks. He ignored what Tesla told us during Autonomy Day and AI Day and assumed they use stereoscopic vision just because Subaru does it. They don't; they feed all the cameras to a neural network that reconstructs the world in 3D.
    Another example is when he associates the FSD computers with the cars they belong to, implying that the 3, Y and S have different hardware. They don't, they use the same part. The difference is due to the year they were manufactured in.
    This is what disruption looks like, someone comes along and does things completely differently and most people don't see because they're stuck on their mental frameworks.
    You can extrapolate this to the media and Wall Street and understand why some people are so fixated on the idea that Tesla is grossly overvalued compared to Toyota.

    • @sarvphilly7983
      @sarvphilly7983 2 ปีที่แล้ว +2

      Neural network algorithms are not infallible - they are limited by 1. The model itself, 2. The computational complexity of the said model, and 3. The training set + adaptation inputs. #2 has to be kept low to allow real time processing on today’s hardware and will continue to be a problem for a long time (This is your bog standard problem addressed in any 2nd level programming class in Comp Science). Software does not drive itself - it needs mathematical models to be simple enough to run on the current hardware. So, Chris is correct. One shouldn’t be so quick to swallow marketing talking points, even from a company as admirable as Tesla. Also: how confident are we of #3? We own a Tesla, and let me tell you - it inspires no confidence. It brakes suddenly on autopilot for no reason, so we have to disengage it when someone follows too close behind. Based on my observation as a Tesla owner, they haven’t even mastered level 2 yet. I don’t really have much confidence in their rosy predictions.

    • @frodekleppe3884
      @frodekleppe3884 2 ปีที่แล้ว

      Spot on.

  • @KenLord
    @KenLord 2 ปีที่แล้ว +15

    There were some interesting distinctions in how the levels of autonomy were described. Sure, a Tesla can't hear like we can, and its decisions are primarily image-based... But with so many more cameras than we have eyes, and with its ultrasonic sensors... and most of all its ability to pay attention to everything without fatigue while making thousands of decisions per second... it very much is (or will be) capable of exceeding humans.
    And that shows in Tesla's accident reports, where Teslas operating with Autopilot / ADAS / accident avoidance activated get into accidents 8x less often than the rest of traffic on US roads.

    • @GregHassler
      @GregHassler 2 ปีที่แล้ว +2

      Tesla does have a microphone, no reason it couldn't be used for the ADAS system if it were to offer a benefit. But deaf people are allowed to drive. Binaural hearing for humans doesn't really work well inside a vehicle cabin.

    • @harsimranbansal5355
      @harsimranbansal5355 2 ปีที่แล้ว

      @@GregHassler pretty sure elon said they’d be using the microphone in v11?

  • @randallpatkin9630
    @randallpatkin9630 2 ปีที่แล้ว

    With this video and others, a compare and contrast with competitors’ systems would be more enlightening than a straightforward description. Sandy is good at telling us what he likes and doesn’t like. It would be nice to hear that from the other terrific hosts as well.

  • @MrFoxRobert
    @MrFoxRobert 2 ปีที่แล้ว +4

    Thank you!

  • @jbarvideo12
    @jbarvideo12 2 ปีที่แล้ว

    Fabulous detailed overview by Ben and Chris!

  • @GeeKayKayGee
    @GeeKayKayGee 2 ปีที่แล้ว +3

    Good presentation guys, well done.

  • @TWFydGlu
    @TWFydGlu 2 ปีที่แล้ว +9

    A human can't look in all directions at the same time, so it has the potential of surpassing a human.

  • @davidbeppler3032
    @davidbeppler3032 2 ปีที่แล้ว +9

    Lidar is coming. Yes, it will be very useful in geofenced niche vehicles used as local cabs. Roomba needs to adopt lidar. Damn thing keeps getting stuck.

    • @SodaPopin5ki
      @SodaPopin5ki 2 ปีที่แล้ว +3

      Neato Robotics has had a low cost Lidar in their robotic vacuum for years.

    • @tesla_tap
      @tesla_tap 2 ปีที่แล้ว +9

      LIDAR sounds a bit like hydrogen fueled cars. They are going to be everywhere in 10 years. And every year it's only 10 more years out. LIDAR is an expensive, ugly crutch, and while it works for cars that are not for sale (Waymo), that's when you have an unlimited budget and don't care how awful it looks or about the practicalities of working in all automotive environments. It is great that LIDAR costs are coming down, but the cheapest forecasts (at some unknown point in the future) are still 100x the cost of a camera.

    • @meamzcs
      @meamzcs 2 ปีที่แล้ว +5

      It's especially great if you're on a budget and you need your development team to have a nice demo on a street you choose beforehand...

  • @FutureSystem738
    @FutureSystem738 2 ปีที่แล้ว +2

    Great video!
    My friend’s Subaru has way worse phantom braking compared to my 2019 Model 3.

    • @arondaniel
      @arondaniel 2 ปีที่แล้ว

      2019 model, so I suppose your 3 has the front radar. I recall hearing Tesla's phantom braking was caused, at least in part, by radar/camera disagreement. Deleting radar may have helped.

  • @DivineSoundsMaster-nm6re
    @DivineSoundsMaster-nm6re 2 หลายเดือนก่อน

    Thanks !
    Would be nice to have more board level details of the ADAS board explained in a future video!

  • @dhargarten
    @dhargarten 2 ปีที่แล้ว

    Chris is a no bullshit type engineer. I like it

  • @bobqzzi
    @bobqzzi 2 ปีที่แล้ว

    Love this detailed look at the sensor package.
    excellent work

  • @uob5
    @uob5 2 ปีที่แล้ว +1

    Interesting EQS in the background ;-)

  • @gfreaq
    @gfreaq 2 ปีที่แล้ว

    As an electrical hardware designer, I think there was quite a bit more information that could have been discussed. How many layers is the PCB? What type of material was used? Did they keep the design simple (larger pitch components, no blind or buried vias, standard spacing between copper, etc.) or did they make it harder to manufacture to make it as small as possible and to have better electrical performance? How many different components were used on the PCBA? What was the smallest size component used? Are all components automotive grade or better or were some industrial grade components used? Were there special things they did like flooding the outside layers with ground to help prevent noise? How does this compare to other ADAS?

  • @briansilver3412
    @briansilver3412 2 ปีที่แล้ว +3

    8-meter range on the sonar sensors. Good to know.

  • @mishko79
    @mishko79 2 ปีที่แล้ว +13

    Very good content, and detail information.
    Thanks.
    Keep up the great work!

    • @MunroLive
      @MunroLive  2 ปีที่แล้ว +1

      Thanks for watching!

  • @danielhull9079
    @danielhull9079 2 ปีที่แล้ว +2

    Nice to see Magna and Continental in the Tesla Plaid S.
    Magna & Continental are each working (if I recall) with BlackBerry QNX, which provides a best-in-breed ISO 26262 pre-certified OS with Automotive Safety Integrity Level D for safe and secure components.

    • @meamzcs
      @meamzcs 2 ปีที่แล้ว +2

      Yeah well 😂 Qualified stuff is great for regulators and CEOs and corporate as well as public bureaucracy; other than that... Meh... SpaceX flies rockets to space and lands them, too, on open-source Linux with the PREEMPT_RT patch...
      Not saying QNX is bad, it's just that it being certified doesn't necessarily make it better than other stuff, but it sure makes it a lot more expensive.

    • @danielhull9079
      @danielhull9079 2 ปีที่แล้ว

      @@meamzcs Cyber security is one of the biggest topics in the world. I don't see autonomous driving going far without a secure SBOM along with a high safety-integrity-level rating. I am not getting in a Huawei taxi, for example.

    • @meamzcs
      @meamzcs 2 ปีที่แล้ว

      @@danielhull9079 Being certified doesn't mean you're even a tiny little bit more secure than an uncertified open-source system...
      If Tesla is gonna achieve autonomous driving, it's also gonna run on a customized Linux kernel and not something proprietary... I would probably take a Linux kernel over some certified proprietary stuff too in most cases...

  • @RogerM88
    @RogerM88 2 ปีที่แล้ว +10

    Sandy should make a video about where Tesla cut expenses to handle the supply chain shortages and increase profit margins, such as by reducing modules, features, and accessories.

    • @polishfish
      @polishfish 2 ปีที่แล้ว +1

      Great idea!

    • @markplott4820
      @markplott4820 2 ปีที่แล้ว +1

      MUNRO already has an older video on that.

    • @neuropilot7310
      @neuropilot7310 2 ปีที่แล้ว

      I agree. With certain types of sensors, it's extremely difficult to substitute with another supplier or part series. This might also explain why Tesla switched to vision only, and also why they actually increased production when others struggled.

    • @RogerM88
      @RogerM88 2 ปีที่แล้ว

      @@neuropilot7310 Their growth in production numbers is also explained by some questionable health strategies during the pandemic, as now in Shanghai, with workers sleeping in the factory. I wonder what the average quality control is like with workers in such conditions.

    • @markplott4820
      @markplott4820 2 ปีที่แล้ว +1

      @@neuropilot7310 - Tesla has a deal with Samsung for camera.

  • @BigBrotherHal2001
    @BigBrotherHal2001 2 ปีที่แล้ว

    Awesome breakdown. FSD Beta is not using radar at all, in my understanding. Is this your assessment?

  • @GregHassler
    @GregHassler 2 ปีที่แล้ว +5

    The cameras are too close for useful stereoscopic vision at driving distances, especially with different focal lengths. Human stereoscopic vision is only useful at short distances; you don't need/use stereoscopic vision for long distance. It's primarily only useful for things within your physical grasp. Your brain estimates distance based on context and perspective. It's basic geometry: you would need cameras a meter apart to have useful stereoscopic functionality at hundreds of feet.
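
Turning that geometry argument into numbers: inverting the stereo relation gives the baseline needed for a target depth accuracy at a given range, B = Z² · Δd / (f · ΔZ). The focal length and matching-error figures below are the same illustrative assumptions as in the earlier depth-resolution sketch, and the accuracy target is arbitrary.

```python
# Required stereo baseline for a target accuracy at range: B = Z^2 * dd / (f * dZ).
# Same illustrative focal length and matching error as the earlier sketch.
focal_px = 1000.0
disp_err_px = 0.5

def baseline_needed(range_m, accuracy_m):
    return range_m ** 2 * disp_err_px / (focal_px * accuracy_m)

for z in (30, 60, 100):
    print(f"+/-3 m accuracy at {z:>3} m needs ~{baseline_needed(z, 3.0):.2f} m of baseline")
```

Under these assumptions, holding a few metres of accuracy at 100 m takes well over a metre of baseline, consistent with the comment above.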

  • @michaelkimani4207
    @michaelkimani4207 2 ปีที่แล้ว

    Great video. Would be nice to have a "teardown" video of the ADAS board itself.

  • @2nd3rd1st
    @2nd3rd1st 2 ปีที่แล้ว +38

    I'm a bit puzzled by Chris' statement, and I guess belief, that camera/image-based autonomy can only allow a car to be as capable as a human, but not exceed human capabilities.
    Elon Musk has in the past explained and demonstrated that the cameras see more and the computers react faster than any human could, and this should be obvious even to a layman.
    I know that there is scepticism about Tesla's optics-based approach to autonomy, but the superiority of cameras and computers over eyes and brains isn't debatable.

    • @TheEvilmooseofdoom
      @TheEvilmooseofdoom 2 ปีที่แล้ว +3

      Everyone has a different level of understanding, and I guess from where he's sitting that's his impression. Who is right is yet to be known. I myself think another hardware upgrade might be needed to push that level 5 boundary, but I am no expert either.

    • @yourcrazybear
      @yourcrazybear 2 ปีที่แล้ว +5

      For both humans and computers it's the brain that is the crucial part, and it's a fact that computers can process information much faster than a human can. Computers also don't get distracted like us and can have 360-degree awareness at all times.

    • @GregHassler
      @GregHassler 2 ปีที่แล้ว +8

      Exactly, I don't know if that's what he meant to say but that's how I interpreted it. But the commodity cameras have better vision, in 360°, than humans do and certainly the vehicle will be able to think and react at superhuman levels.
      And when we say a "human" do we mean an average human or a race car driver?
      I don't doubt that some implementations will use lidar and radar, but those technologies aren't directly correlated to ADAS capabilities.

    • @EwanM11
      @EwanM11 2 ปีที่แล้ว +9

      Most accidents occur because of distraction or emotional impairment. Once the AI system has similar environment awareness and pathfinding, they will crash much less frequently than people. I wouldn't be surprised if they become safer with way worse actual driving skill, just because of constant attentiveness and not doing stupid stuff.

    • @adamrak7560
      @adamrak7560 2 ปีที่แล้ว +3

      @@yourcrazybear Computers can get distracted too, by charged particles for example. So FSD hardware in the future will likely be triple redundant, to protect against single-event upsets. (In large servers those charged particles cause detectable problems, especially at higher elevations.)

  • @zaferroni
    @zaferroni 2 ปีที่แล้ว

    Sandy is right, it is the best car; I've had one for 2 years from new.

  • @paulgracey4697
    @paulgracey4697 2 ปีที่แล้ว +5

    About binocular vision, let me tell you a true story...:
    A teen aged friend my age in a family my family knew living in Needles, California lost one eye when he was 15 when a rock hammer broke as he struck. UCLA eye doctors saw to him, and when I visited him in the hospital I was certain he would now not be able to drive, since I had been taught that binocular vision was necessary. The eye doctors knew differently though.
    Fast forward a year or so and another family visit: my friend had his own hot-rodded '52 Ford V8, was on a first-name basis with all of the local Highway Patrol officers in Needles, and was having no trouble whatsoever driving fast on those lonely desert roads. He pointed out to me what those eye doctors had told him: binocular vision is not how distance is judged in the mind, at any distance much beyond that hood ornament. Perspective and our visual cortex do the work.
    Maybe the ADAS systems can make a better distinction pixel v pixel, but not likely at distance with that close to human spacing of the two cameras. Perhaps Subaru might use two very separated cameras to gain the needed spacing for far field accuracy, but I seriously doubt it is of much use to Tesla unless just employed in parking situations.

    • @TheEvilmooseofdoom
      @TheEvilmooseofdoom 2 ปีที่แล้ว +1

      Yup, this is why perspective tricks work so well in photography.

  • @brandbryce
    @brandbryce 2 ปีที่แล้ว +4

    love to hear a bit about the component costs. Ie, what does a company like Tesla pay for that radar unit, and other sensors?

    • @diavalus
      @diavalus 2 ปีที่แล้ว

      It’s hard to estimate these costs. Even if you’re estimating the cost of the components, these are just the components themselves, leaving out the money Tesla (or any other company) paid to develop, manufacture, the logistics and so on.

  • @reggiejewett7591
    @reggiejewett7591 2 ปีที่แล้ว +5

    This was great, I have a lot better understanding of how this automatic driving works and the different levels. Very nice video. Reggie

    • @MunroLive
      @MunroLive  2 ปีที่แล้ว +1

      Thanks Reggie! Please subscribe if you haven't already!

    • @markplott4820
      @markplott4820 2 ปีที่แล้ว +1

      Reggie, you can understand it better if you are an actual user of the Tesla FSD beta. You appreciate it more.

  • @marklearmonth9380
    @marklearmonth9380 2 ปีที่แล้ว +2

    Chris is great, no nonsense here, straight to the point.

  • @bjs2022
    @bjs2022 2 ปีที่แล้ว +1

    iPhone video setting error/failure? I suspect it was in Cinematic mode that resulted in some shallow focus and too much auto-out-of focus. Also, try setting to the .5 (x) camera and get closer to the subject for better depth-of-field.

  • @ApteraEV2024
    @ApteraEV2024 2 ปีที่แล้ว

    Nice showdown, & Advice for ALL!
    PAY ATTENTION CEL PHONE TALKERS!!

  • @asaftzadok6647
    @asaftzadok6647 2 ปีที่แล้ว +1

    Thanks Chris !
    Can you please share the model of the interior radar ?

  • @pranjal86able
    @pranjal86able 2 ปีที่แล้ว +9

    Munro knows cars but they do not know much about artificial neural networks; hence, on the topic of FSD, they don't know what they are talking about. I still like them.

  • @Pedro5antos_
    @Pedro5antos_ 2 ปีที่แล้ว +4

    Awesome content!

    • @MunroLive
      @MunroLive  2 ปีที่แล้ว

      Thanks, Pedro!

  • @Botniasolutions
    @Botniasolutions 2 ปีที่แล้ว

    Munro, please explain why Tesla doesn't have some type of cleaning action on their cameras. Is it difficult to draw solvent lines to the rear and side cameras?

  • @polkijain97
    @polkijain97 2 ปีที่แล้ว +5

    As I remember from Tesla Autonomy Day event, the dual chips are for redundancy. Can you please confirm?

    • @frangalarza
      @frangalarza 2 ปีที่แล้ว +2

      That's correct. There's no way to tell just looking at the hardware because it's all controlled by software, but Tesla have told us that they use them for redundancy.

    • @harsimranbansal5355
      @harsimranbansal5355 2 ปีที่แล้ว

      They were. But Tesla isn't doing that now. They were only getting 17 fps on each chip for a total of 8 cameras, but now, combining them, they get 34 fps, which is much better.

  • @ricknash3055
    @ricknash3055 2 ปีที่แล้ว +1

    That was an interesting identification of various sensors used by Tesla. I was hoping Chris would touch upon how vehicles will be certified to be FSD 3.0. I wouldn't want it left up to the automaker to decide if it's good enough.

  • @billg4399
    @billg4399 2 ปีที่แล้ว +2

    There is an ERROR on the slide titled Levels of Autonomy. The greater-than/equal-to and less-than/equal-to symbols are backwards. ADAS is levels LESS than or equal to Level 2. Full autonomy is levels GREATER than or equal to Level 3.

    • @musaran2
      @musaran2 2 ปีที่แล้ว

      At 10:54
      C'mon it's not hard, < & > work like funnels.

  • @theipc-twizzt2789
    @theipc-twizzt2789 2 ปีที่แล้ว +10

    I wonder if there was ever - in the history of driving- an accident, that could have been avoided by knowing distances between objects down to the millimeter and not centimeters.

    • @adamrak7560
      @adamrak7560 2 ปีที่แล้ว +3

      if you look hard enough you may find an edge case for anything.
      After reaching superhuman abilities, self driving systems will race to distinguish themselves in extremely rare edge cases. So I can imagine a future where high res radar, lidar and flir are all optional extras for Tesla FSD.
      Basically Tesla is solving the "thinking part" of the problem. After their solution is working, adding other sensors is relatively easy. They have found that adding other sensors can seriously confuse the system if it is not super smart already. So the problem always comes back to smart perception, and not sensors.

    • @theipc-twizzt2789
      @theipc-twizzt2789 2 ปีที่แล้ว +1

      @@adamrak7560 Yeah, I fully agree. I think current hardware is sufficient for safer-than-human, but there will be better hardware in the future. Robotaxi will likely have higher-resolution, better light-sensitivity cameras, maybe more cameras, and HW4. I am not sure about lidar yet. However, I expect the Bot may have lidar.

    • @harsimranbansal5355
      @harsimranbansal5355 2 ปีที่แล้ว

      @@theipc-twizzt2789 nope doesn’t make sense to use lidar for bot. Tesla is literally solving the lidar problem using cameras. It’s getting really really good. Also if you want another sensor you’d rather use something like infrared light rays instead of visible light.

    • @theipc-twizzt2789
      @theipc-twizzt2789 2 ปีที่แล้ว

      @@harsimranbansal5355 I think they absolutely can do the bot without Lidar. However its use cases at first will need high precision movements, down to the millimeter. That's what lidar (or any other emitting sensor) is good for. You don't need it for cars and later bots. But maybe they use it for the first gen to also get some good training data. After all Tesla is using lidar for ground truthing.

  • @collinsfamilync
    @collinsfamilync 2 ปีที่แล้ว +5

    ADAS == Advanced Driver Assistance Systems

  • @wilber8260
    @wilber8260 2 ปีที่แล้ว

    Thanks guys. Very useful info. Where is the forward radar located? In the bumper?

  • @garyswift9347
    @garyswift9347 2 ปีที่แล้ว

    Cool episode. Thanks guys

  • @kevinmusyoka5940
    @kevinmusyoka5940 2 ปีที่แล้ว +4

    Love the content; so insightful!

  • @davincisghost9228
    @davincisghost9228 2 ปีที่แล้ว

    Nice one Chris. Thank you.

  • @brandoYT
    @brandoYT 2 ปีที่แล้ว +2

    Diagram of sensor locations would really help. (I guess if I was an actual engineer I'd go look it up)
    I would expect all ADAS boards for all Teslas to be the same as continual improvements progress thru all models & factories.

  • @fineartz99
    @fineartz99 2 ปีที่แล้ว

    Competent, intriguing, compelling!

  • @johnhigman9371
    @johnhigman9371 2 ปีที่แล้ว +6

    So Chris Fox is on record for Munro that Elon Musk and Tesla will not succeed without LiDAR…I’m amazed at Chris’s hubris.

    • @rugbeaters1
      @rugbeaters1 2 ปีที่แล้ว

      Elon goes against the grain and is right most of the time and is willing to change when needed
      It's way too easy to get caught up with stuff when you deal with it every day and lose focus on the big picture; just because we always do things a certain way does not mean it's the best way of doing it
      Change is very hard for us mere mortals
      Tho I really do enjoy these videos even when some of them don't understand everything completely because none of us understand everything all the time
      It's extremely hard for a lot of people to understand physics at its basic level

    • @AudiTTQuattro2003
      @AudiTTQuattro2003 2 ปีที่แล้ว +1

      It's their best guess for both Elon and Chris. I'm with Chris, but that doesn't mean Elon is wrong. Time will tell, but being obsessed with who's right is immature.

  • @hawedehre
    @hawedehre 2 ปีที่แล้ว

    I appreciate the Rosenberg connectors on modules.

  • @arnomaas6452
    @arnomaas6452 2 ปีที่แล้ว +1

    excellent analysis !

  • @Dukewiththefluke
    @Dukewiththefluke 2 ปีที่แล้ว +1

    Dual cameras and stereoscopic vision are not really workable beyond 5 meters; I'm sure they calculate distance to cars with a neural net and train it with radar and lidar. Radars give sporadic false signals, which is the reason why almost all cars with adaptive cruise have random braking.

  • @1968matrix
    @1968matrix 2 ปีที่แล้ว

    Personal opinions don't matter. We are talking about what is possible. As for sound perception for a car, I don't think it is important; the horn is necessary when you have just two eyes and can be distracted. A computer is never distracted from its tasks, and eight eyes are covering 360 degrees all the time.

  • @brandoYT
    @brandoYT 2 ปีที่แล้ว +3

    reminder: I think all ADAS boards have components on BOTH sides.

  • @lebeech876
    @lebeech876 ปีที่แล้ว

    Very insightful

  • @tuskiomisham
    @tuskiomisham 2 ปีที่แล้ว +4

    One thing to keep in mind is that in industrial settings, 'low voltage' is anything below 1,000 V. So, your home wall voltage is low voltage.

    • @briansilver3412
      @briansilver3412 2 ปีที่แล้ว +2

      You are correct in the electrical code definition. But in general use low voltage is under 120V. When I worked on 575VAC low voltage motors, it was a bit misleading to say the least.

    • @tuskiomisham
      @tuskiomisham 2 ปีที่แล้ว

      @@briansilver3412 no kidding. Low voltage is the worst engineering term. Is it 0.7v, 5v, 24v, 120v, or 1000v? Nobody knows.

  • @TheElectricMan
    @TheElectricMan 2 ปีที่แล้ว +1

    great video

  • @mxj247
    @mxj247 2 ปีที่แล้ว +1

    I love you guys!

  • @markplott4820
    @markplott4820 2 ปีที่แล้ว +2

    MUNRO Live - interesting take, but you are just GUESSING. You really don't know what Tesla is doing unless you are using FSD beta w/ visualization.
    Tesla has been using pseudo-lidar (vision) for a while now.
    It does not use stereo vision for anything.
    Go see the Karpathy video from AI Day.

  • @surferdude4487
    @surferdude4487 2 ปีที่แล้ว +1

    I am certain that computers with cameras will be able to exceed human ability in recognition and speed of decision making. They never get tired. They never get drunk. They never get sick. They never get angry. They never take their eyes off the road to text and drive.
    Having said that, I would still like vehicles to have sensors to allow them to "see" even in conditions where normal vision is obscured such as: smoke, fog, heavy rain or snow.
    So tell me... Exactly how would lidar be an improvement over camera vision in those situations? I can understand how radar and sonar would help, but not something that scans one dot at a time in the visible spectrum. RIP lidar.

  • @antoniopalmero4063
    @antoniopalmero4063 2 ปีที่แล้ว

    Cheers guys 🇬🇧👍

  • @eugeniustheodidactus8890
    @eugeniustheodidactus8890 2 ปีที่แล้ว +1

    What is the B Pillar camera used for and why is it said to be an atypical type of installation?

    • @pipooh1
      @pipooh1 2 ปีที่แล้ว

      The camera is used for side capture around the car; for example, if a car decides to swing into your lane from your blind spot, the car can already predict/prevent that by monitoring the other car's speed/trajectory and probably checking whether the other side is clear, so it can safely move out of the other car's path and prevent a collision.
      Tesla also has Sentry mode, where the car captures 360 degrees around itself when it's parked, in case someone damages your car / bumps into it. It gets logged and can be used as evidence in case the person does a hit and run.

    • @eugeniustheodidactus8890
      @eugeniustheodidactus8890 2 ปีที่แล้ว

      @@pipooh1 So the B pillar camera is part of both Sentry Mode and ADAS. Thank you.

  • @user-to2rf1rj5v
    @user-to2rf1rj5v 2 ปีที่แล้ว +18

    11:40 absolutely incorrect. A Tesla with FSD will be able to drive at superhuman levels. Having 8 cameras in 360 and a dedicated neural net computer for driving tasks is demonstrative of this fact as humans have only two cameras facing the same direction, and a bio neural net that isn't dedicated to the task of driving (see: distracted drivers).

    • @SWIMMERSHARKJOEL89
      @SWIMMERSHARKJOEL89 2 ปีที่แล้ว +5

      Yea I wasn’t a fan of this guy’s attitude very pessimistic seemed like a tesla q member lol

    • @AudiTTQuattro2003
      @AudiTTQuattro2003 2 ปีที่แล้ว

      ...and yet, it still isn't ready. It's ready when insurance companies charge next to nothing when you use FSD exclusively. Otherwise, it isn't ready.

  • @seanz6586
    @seanz6586 2 ปีที่แล้ว +2

    You can measure the distance/speed of moving objects with one camera by comparing two frames (see the time-to-collision sketch after this thread).

    • @justinmallaiz4549
      @justinmallaiz4549 2 ปีที่แล้ว

      One camera can estimate it. But Tesla's multi-camera setup is another thing entirely
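
One concrete single-camera, two-frame quantity that needs no absolute distance at all is time-to-collision, estimated from how fast an object's apparent size grows between frames. A minimal sketch with made-up pixel sizes and the 36 fps frame interval mentioned earlier in the comments:

```python
# Time-to-collision from one camera and two frames, using only the growth of an
# object's apparent size (no absolute distance needed). Values are illustrative.
dt = 1.0 / 36.0                       # frame interval at 36 fps (s)

def time_to_collision(size_prev_px, size_now_px):
    # TTC ~= dt / (s_now / s_prev - 1) for constant closing speed
    return dt / (size_now_px / size_prev_px - 1.0)

# a lead car's image width grows from 80.0 px to 80.6 px in one frame
print(f"TTC ~= {time_to_collision(80.0, 80.6):.1f} s")
```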

  • @richardalexander5758
    @richardalexander5758 2 ปีที่แล้ว +5

    Would Cybertruck or future vehicles benefit substantially from a 48 V system for these sort of electronics?

    • @KCJbomberFTW
      @KCJbomberFTW 2 ปีที่แล้ว

      😏

    • @billy-go9kx
      @billy-go9kx 2 ปีที่แล้ว +1

      They would have to be completely re-engineered. Components would have to be changed if they can't handle 48v.

    • @tesla_tap
      @tesla_tap 2 ปีที่แล้ว +3

      I don't see how. Almost all digital electronics run at 5 V or less. The larger the delta from the power input to the desired voltage, the more complex the supply, and it is likely to generate a bit more heat and cost more. This answer only addresses electronics, not the total value of going to 48 V. The prime reason for 48 V is to reduce the copper in wiring going between modules, small motors and lights (see the worked example at the end of this thread). So far, the lack of third-party 48 V components at an equal or lower cost than high-volume 12 V components has held back 48 V adoption. Vehicles from Tesla having a 48 V internal bus is likely to happen, but I'm not sure when.

    • @markplott4820
      @markplott4820 2 ปีที่แล้ว +1

      CT is using a 48v system, and 800v platform. w/ 240v export.

    • @drkastenbrot
      @drkastenbrot 2 ปีที่แล้ว

      @@tesla_tap The cost difference is minimal and the savings in wiring are huge. 48 V is well within reach of the same buck-converter topology used for 12 V, so there should not be a significant design change needed.
      It's just that for most of the ECUs this doesn't matter; they don't consume enough power for 48 V to make a difference. It would be interesting to see a split system where high-power components draw 48 V and simultaneously provide 12 V rails for any nearby low-power components.
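
The copper argument in numbers: for the same delivered power, a 48 V bus carries a quarter of the current of a 12 V bus, and resistive loss in a given wire falls by 16x (or, equivalently, a much thinner wire gives the same loss). The load and wire resistance below are illustrative values, not figures from any particular vehicle.

```python
# Same delivered power at 12 V vs 48 V: current drops 4x, I^2*R wiring loss 16x.
# Load and harness resistance are illustrative values.
load_w = 240.0        # some accessory load (W)
wire_ohm = 0.05       # round-trip resistance of the harness run (ohms)

for bus_v in (12.0, 48.0):
    amps = load_w / bus_v
    loss = amps ** 2 * wire_ohm
    print(f"{bus_v:4.0f} V bus: {amps:5.1f} A in the wire, {loss:5.2f} W lost as heat")
```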

  • @ericsandberg3167
    @ericsandberg3167 2 ปีที่แล้ว

    When you guys talk about liquid cooling on those PCB's...is that like a heat pipe layer on the PCB to handle the thermal load, or do they actually pump cooling fluid through a heat plane on the back of the board.....thanks.

    • @harsimranbansal5355
      @harsimranbansal5355 2 ปีที่แล้ว

      Pretty sure it's just connected to the cooling system through a metal casing that can transfer heat away from the board. Basically the heat-pump Octovalve system.

  • @HWKier
    @HWKier 2 ปีที่แล้ว +2

    How about this for an idea: Each Robo-Taxi comes with an Optimus Prime conductor. The robot conductor loads suitcases, wipe down seats between fares, makes sure no one forgets their purse or cell phone, assists with destination selection and fare payment, and provides security. Also, the robot conductor might be a security blanket for those hesitant about riding in a driverless vehicle. It could perform small courtesy chores like helping select music.

    • @fredbloggs5902
      @fredbloggs5902 2 ปีที่แล้ว

      ...and reduces the passenger capacity...
      ...not a chance.

    • @bajajuicy
      @bajajuicy 2 ปีที่แล้ว

      th-cam.com/video/eWgrvNHjKkY/w-d-xo.html

    • @HWKier
      @HWKier 2 ปีที่แล้ว

      But "conductor" might not be the best word for the ride-along robot. Perhaps "porter" ?

    • @TheEvilmooseofdoom
      @TheEvilmooseofdoom 2 ปีที่แล้ว

      You have a mighty functional bot planned! :)

  • @valeyard00
    @valeyard00 2 ปีที่แล้ว +3

    Would be nice if you spent 2 seconds explaining what ADAS stands for. More than half of us probably had to go google it during your video.

  • @planetmuskvlog3047
    @planetmuskvlog3047 2 ปีที่แล้ว

    When/if my Tesla cars achieve L5 autonomy, AND automated supercharging is available, I will consider emancipating my fleet. They will need to earn enough to reimburse me for the initial cost, but after that, a car may be able to survive in the wild, not only as an autonomous driver, but as an AI with its own agency as well

  • @error079
    @error079 2 ปีที่แล้ว

    I will never let the car drive itself. How much cost is added by all those thingamabobs for self-driving?

  • @ronaldlenz5745
    @ronaldlenz5745 2 ปีที่แล้ว +4

    Chris needs to watch all of the Tesla "days" (autonomy, AI) special presentations. He appears to be in the camp of solving ADAS with hardware rather than software and large compute. BTW, my left eye has a new intraocular lens, the Alcon Panoptix trifocal fresnel lens. Three fixed focal length lenses all in one tiny package. The neural net, our brain, interprets what it sees like the variable focus lens we were born with. Eight cameras should be enough for Level 4 FSD.

    • @awebuser5914
      @awebuser5914 2 ปีที่แล้ว +1

      One issue you are completely missing is one of redundancy (sensor glare, dirt, etc.) the other is blind faith in what "AI" is really capable of...
      Basically, computers don't "understand" what they are seeing. A five year-old can infer an enormous amount of information in what they see that the most advanced "AI" in the world is utterly incapable of.
      "AI" systems are comparison-driven and if there is no relevant comparison, then there is no solution, that's why "edge cases" are so staggeringly hard to predict. It's trivial for a human to infer when an unfamiliar/unexpected situation might be dangerous, an AI would have no way to solve such a situation. This is exactly why redundant sensors, not operating in the visible spectrum, are that "insurance" policy to make sure the 3D imaging algorithms are not misidentifying or ignoring an actual hazard.
      I suggest you watch a few videos on how Waymo "robotaxis" work. They have 27(!) cameras, radar, sonar and LIDAR and it's the complete fusion of those inputs that allows them to be Lv. 4. Tesla's (read Elon's) insistence on visible-light only has a lot to do with his fantasy that the existing Tesla fleet is only "a software update away from FSD" since they are configured only with visible-light sensors and can't practically be retrofitted.

  • @budgetaudiophilelife-long5461
    @budgetaudiophilelife-long5461 2 ปีที่แล้ว

    🤗👍THANKS BEN …FOR ASKING THE RIGHT QUESTIONS FOR US LAYPEOPLE AND CHRIS 🤗 FOR EXPLAINING WHAT WE ARE LOOKING 👁 AT …and do I detect a little HANS SOLO In CHRIS 😉🤷‍♂️😍😍😍

  • @by9917
    @by9917 2 ปีที่แล้ว

    I have a 2020 Chevy with full ADAS. I've seen the ads where a Chevy comes to a complete stop when a car ahead stops and the driver is inattentive. My wife lightly bumped another car and there was no warning and no braking. Odd because it regularly warns for nothing. It certainly didn't work the way Chevy ads show.

  • @cengeb
    @cengeb 2 ปีที่แล้ว

    VW Golf R has the best location for rear camera, the VW logo pops up when you put it in reverse, camera is out of weather when not in use, and it looks cooler when the VW logo pops open, which is also the hatch release handle. GERMAN engineering

    • @pipooh1
      @pipooh1 2 ปีที่แล้ว +2

      For me that is just another mechanical moving part that can break and prevent you from safely using your car if it does break.

    • @cengeb
      @cengeb 2 ปีที่แล้ว

      @@pipooh1 That was my first impression when I originally saw it on my first GTI years ago; I even said to the dealer, hmmm, how long will this last... but guess what? Never had a failure: 147,000 miles on one, 87,000 miles on the last one, and my current 2019 Golf R has 56,000 miles, with not one issue with the flip-up lock camera assembly. I agree, fewer moving parts is always better... but they paid attention to the logo camera flap / hatch-opening assembly. Things can work right if they are paid attention to and you don't try to keep it cheap; cheap always costs more later

  • @seenstee
    @seenstee 2 ปีที่แล้ว

    Why do we need interior radar and is it being used today?

  • @colinmerrilees
    @colinmerrilees 2 ปีที่แล้ว +5

    It still feels like the B-Pillar camera is not the correct position for good cross traffic detection, and was put there originally for highway driving only, not to deal with cross traffic situations.
    Now that Tesla is trying to solve driving for non-highway situations it seems like they need a camera positioned at the very front of the vehicle (corner or centre) facing left and right to see cross traffic.

    • @tesla_tap
      @tesla_tap 2 ปีที่แล้ว +1

      I suspect the center wide-angle camera can capture front cross traffic, and it's farther forward than a human driver. There is some overlap with this camera and the side cameras, but I don't know if the side cameras are used much for cross-traffic detection.

    • @markplott4820
      @markplott4820 2 ปีที่แล้ว +1

      TESLA FSD beta is LEVEL 4 autonomy. The rest of the industry is CRAP. They don't have the VISION or COMPUTE Tesla has.
      Tesla is 25 watts, compared to a 500-watt NVIDIA system..... Lol

  • @garybrotherton5732
    @garybrotherton5732 2 ปีที่แล้ว +1

    Sandy mentioned FLIR in a recent interview on E for Electric. Any thoughts on FLIR and Tesla?

    • @TheEvilmooseofdoom
      @TheEvilmooseofdoom 2 ปีที่แล้ว +1

      Likely not worth the trouble for the limited utility. Sandy should stick to what he knows.

    • @pipooh1
      @pipooh1 2 ปีที่แล้ว +2

      FLIR is nice technology, but I don't think Tesla is at this stage interested in anything new. Since they want as few variables as possible, adding ADAS/radar/lidar/FLIR can all give false positives, and then which system is right? It probably requires 10x the calculation power to correct a false positive, since you need to double-check everything.
      The more sensors / types of technology, the more fine-tuning you have to do, not only on your overall system but also on the detection of all the side systems. Tesla now only has to correct one system: vision. Once they have that to 95+% they might add more systems for overall safety etc., but you first need a good baseline to know that system X is correct and Y is not. If you have 5 systems running, how do you know which one is correct when you're still fine-tuning them all?

  • @sebir.584
    @sebir.584 2 ปีที่แล้ว +1

    Aptera update would be great