4:18 Better memory here would be totally unnecessary if it just had a front-facing camera down low... like the Cybertruck. Amazing that Tesla is just totally ignoring this.
Damn! I want you on my podcast so bad! I own a Model 3 Highland and have been testing the park assist feature a couple of times, and like you’ve said: a front facing camera would make the system more accurate!
When I test drove a Model 3, trying to park in our garage was just horrendous. The visualisation was off by so much and I had to be waved in (since I did not know the dimensions of the car and did not want to damage it). I wonder how this would do in this scenario (nosing into a narrow garage), since the ultrasonics on my current car feel way more reliable
I park in a narrow garage every day. It's awful. I can't rely on it at all. I'm stuck using my mirrors... makes me so upset every time. I truly can't believe I'm stuck with this car for 3 years.
Great work! Love the effort you've put into this video! 👍👍👍 I agree there are so many areas where this fails. I just reviewed the Highland and the parking was a 6/10.
A car not being able to see what is directly in front of it is absolute negligence, and I’m surprised you’ve given them so much slack on this point. No regulator will allow self driving cars on the road that can’t know what’s in front of their bumper. Making up what’s in front with “context” doesn’t count.
The FSD visuals are currently something completely different... what the car logically knows about/has modeled. The Park Assist visuals are the topology map + some markings, and are much closer to a camera-based view. I think they serve different purposes, and hope that both exist in the future.
The problem you mentioned several times ("Why can't Tesla remember the right position and stick to that instead of updating to a wrong one?") is actually a pretty common problem in programming overall! Especially when you're solving problems that require calculations, probability estimations, etc. It's very easy and obvious for our eyes to catch the correct solution, but sadly for the computer it's just several cases with equal probability. How should the car know that the curb's position was estimated correctly on the first attempt? What if it was wrong and the thing was actually closer? That's where the real debugging pain comes in😂
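What this comment describes is essentially the measurement-update problem that a scalar Kalman filter solves. A minimal sketch (not Tesla's actual code; every number here is made up for illustration) of why fusing repeated noisy estimates beats locking in the first one:

```python
import random

def fuse(mean, var, z, z_var):
    """Standard scalar Kalman measurement update: blend a noisy
    measurement z (with variance z_var) into the running estimate."""
    k = var / (var + z_var)              # gain: how much to trust the new frame
    return mean + k * (z - mean), (1 - k) * var

random.seed(0)
true_curb = 2.0                          # true distance to the curb, meters
mean, var = 0.0, 1e6                     # start knowing nothing
for _ in range(20):
    z = random.gauss(true_curb, 0.3)     # each camera frame is a noisy reading
    mean, var = fuse(mean, var, z, 0.3 ** 2)
print(mean, var)                         # variance shrinks with every frame
```

Trusting only the first frame would leave you with the variance of a single measurement; fusing twenty frames cuts that variance by roughly a factor of twenty, which is exactly why the system keeps re-estimating instead of sticking with its first guess.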
I've been driving a Tesla M3 for many years and am usually open to new technologies. But despite all the enthusiasm, I think we should keep things in perspective. For me, what Tesla is offering here is at most a nice gimmick, because we are talking about a parking aid, i.e. a system that is supposed to help you park. If the system on offer is unsuitable for its intended purpose because it does not recognize obstacles to the front or rear, detects all sorts of things in the rain and, since the discontinuation of the USS, does not provide reliable distance information, then for me this is not revolutionary technology but a poor parking aid with fancy graphics. If I had the choice, I would immediately swap the Tesla's computer-generated display for the 360-degree camera view plus USS that most cars have as standard these days. I may not get the wow effect when the Tesla detects some object, but I do have a system that fulfills its purpose almost 100% of the time and that I can trust in all weather conditions. Despite all the enthusiasm for technology, evolution always comes before revolution; but because Tesla always prefers revolution, we guinea-pig testers now have to risk our sheet metal.
Wow. A 20-year-old Daihatsu for less than $3,000 has no cameras and still has better park assist by using the proper sensors for the job. But I'm happy for Tesla that they save $112 per car and create new jobs in the car repair industry.
This video was sponsored by Brilliant - Sign up at brilliant.org/AIDRIVR/ for a free 30-day trial + 20% off for the first 200 people!
I should have listened to your warning. FSD visuals forever ruined for me. TSLA please fix.
The Tesla footage with top down real life footage is awesome. can't wait to see this more in future videos!
Yeah! Great work!
Don't be fooled by sponsored videos. The thing sucks when the weather isn't perfect, and especially at night. A few cameras are not enough.
@@carloareaser Nope. Don't spread b.s.. It works just as well at night as it does in the day. The only time I've had FSD degrade is in very heavy fog that is dicey for a human to drive in.
@@carloareaser This isn't even sponsored by Tesla. How in any conceivable way would a completely different company sponsoring this lead to bias?
@@carloareaser The video is sponsored by Brilliant ... not Telsa. What's the point?
glad to be stuck with you! 😂 thanks for staying!
WOW... just WOW! The drone footage perfectly above the car... even just that is amazingly done, but OVERLAYING FSD visuals... BRILLIANT! Great job! I'm sure it was a TON of work.
Glad you enjoyed it! Took a while to get right but I know what to do to make it easier next time
Wow, these visuals of the car's perception superimposed over reality are stunning! No one else conveys this like you do. Keep it coming! I cannot wait to see v12 continue to improve this. I'm waiting on my v12 download!
Can't wait until you get FSD 12 :) You always go into so much detail covering every aspect.
Dude this video is KILLER. I love these. I'm not even a real Tesla fan or Elon-stan, I just genuinely enjoy the tech and your testing of it. Can't wait for more!
Wow, this is the gold standard of high fidelity parking assist videos. No one else has come close. Thanks.
Except for Rivian, they "hit it spot on". But that's a problem...
You're joking right?
And it is totally useless. There are a shitload of other solutions that actually work. Rolls-Royce representatives were once asked why they do not invest in developing driving aids and self-driving for their cars. They politely replied that they solved this problem 100 years ago. It's called a personal driver: he gets you anywhere safely in any conditions, parks and picks you up, does not need lanes or other road infrastructure to operate, and works 100% anywhere in the world.
@@MonkeySmart12325 Sorry that you find it totally useless. I find it somewhat useful, although it could be better of course. I think parking assist is a more viable option for 99+% of owners rather than hiring a driver.
@@BobbieGWhiz You missed the point. The key problem with Tesla is that they try to reinvent the wheel instead of using it. Instead of making sure the solutions they have work 100% of the time, they deliver half-baked solutions that work occasionally, or only when certain favorable conditions are met.
Fastest notification click ever, this is the test I’ve been waiting for. Aerial view compared to the 3D reconstruction, this is the only thing I wanted to see and nobody did this test, THANK YOU!
That top-down editing was really helpful and truly shows just how accurate the Vision system can be, while also pointing out where exactly it currently has weaknesses!
It almost looks like even Tesla isn't using this kind of methodology to check the performance.
I really appreciate when someone puts this much work into such a short video, excellent job! Although I would watch 1 hour videos from you anytime :)
Happy you're sticking with the TH-cam game. Congrats on 100K. It's going to be a great year for your channel given what's happening with FSD.
Literally all i can say is, thank you. This is the perfect video to display this feature!!! Amazing, will gladly stick here with you!
Thank you for watching and sticking around!
3:59 This is likely less about "memory" than about uncertainty over whether the curb is an object that might move. So as the curb passes out of view, FSD has to ascribe some chance that the curb has moved since the last time it was seen. FSD clearly has a degree of "object permanence", but much less certainty in predicting what occluded objects are likely to do.
Good point! Haven't thought of that myself
I'm pretty sure the reason for this is that the car doesn't have a bumper camera looking forward, from afar you have the windshield camera accurately calculating, but as soon as you get close it loses visibility and is just an estimation.
@@Juan.Irizar Lack of camera is _why_ the curb goes out of view. This happens to humans all the time, but we know that curbs do not move and have a pretty accurate estimation of where it is and how close we are. Eventually FSD will be able to do this as well, but right now once something is out of view of any cameras, it becomes increasingly uncertain about whether it is there and where it is, which is what we are seeing in the visualization.
@@GreylanderTV Why not just assume that anything that's not a human and doesn't have wheels has practically zero chance of moving? Why does it have to assign a chance of moving away to every single object it sees? The only objects that move that easily during the same drive are lighter objects like plastic bags and paper, which are not a threat to the car anyway.
@@baab4229 Of course. But this is not something you just program into a neural network. The NN must _learn_ even the most basic facts about the 3D visual world. This takes time and data.
FSD has the inference-time compute power of something like a squirrel brain, while the training server clusters have the computing power of maybe a chimp, which is what is responsible for the _learning._ Think about how long it takes a human baby just to learn object permanence (about 8 months). Then 2-3 years to get all the basic physics of what objects are and how they behave.
Also note that "practically zero chance" is not _zero_ chance. You don't want self driving cars rolling dice on getting into accidents. But as you can see from the video, it completed the maneuver, so it was certain enough that it would not hit the curb. Just not as certain as a human, so we see the curb "smear" in the visualization.
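The growing-uncertainty idea this thread describes can be sketched with a toy tracker that shrinks a position variance while an object is visible and inflates it while occluded. This is purely a hypothetical illustration, not Tesla's implementation; the noise values are arbitrary:

```python
def step(var, visible, meas_var=0.05**2, drift_var=0.02**2):
    """One tick of a toy static-object tracker.
    Visible: a measurement shrinks the variance (scalar Kalman update).
    Occluded: variance grows by drift_var, because the model cannot
    rule out that the object has moved since it was last seen."""
    if visible:
        k = var / (var + meas_var)   # Kalman gain
        return (1 - k) * var         # posterior variance after the update
    return var + drift_var           # process noise accumulates while hidden

var = 1.0
for _ in range(10):                  # curb in view: uncertainty collapses
    var = step(var, visible=True)
seen = var
for _ in range(10):                  # curb behind the bumper: it grows back
    var = step(var, visible=False)
print(seen < var)                    # True: the "smear" in the visualization
```

The longer the curb stays out of view, the wider the variance, which is one plausible reading of why the rendered curb appears to drift and smear once the car gets close.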
Can’t wait for your videos on FSDv12. You make some of the best videos to assess FSD performance.
Appreciate it! Tell Elon to send it to me pls
Thanks for another good video. Comparing the screen to an actual drone shot is genius
4:42 isn’t that because the front camera is above the windshield meaning it has a very significant blind spot in front of the bumper? The lack of ultrasonic sensors makes it even worse. This is a hardware design issue and something the software simply can’t fix.
Amazing work as always you do. Thank you for rethinking the way of showing us the features.
Agreed. amazing test of the feature.
This is by far the best analysis of the new park assist I have seen. Thanks so much for sharing.
you always have a fresh way of looking at this stuff. Production value is insane. Thanks
I appreciate that, thank you!
People tell me to drive forward into spots for best Sentry recordings, but the cameras are so much better when reversing in (and easier when exiting).
Glad you're sticking around 👍👍👍👍
Again and again you create the best content regarding a topic. Everybody just places boxes here and there. Your idea and implementation was brilliant. Thanks 👌👌
Your editing in this video is crazy good- well done.
I appreciate it!
Damn, this overlay footage is amazing. So great to see it like this. Keep up putting out these amazing and original videos!
removing ultrasonics and radar was a mistake
Why isn’t Tesla’s AP team calling you to provide you with V12, like they are with Omar from the Whole Mars Catalog? Your analysis of FSD is light years better than anyone else on social media.
I've been staring at my phone waiting for the call that hasn't come :[
My Tesla cameras are affected by rain, snow, ice, dirt, salt, condensation in tunnels, and even bright sunshine. That doesn't leave many days with weather conditions where the cameras are usable. So as impressive as the video here looks, I'm wondering if the tech is effectively unavailable most of the time for half the countries on the planet. So at the same time as Tesla removes the manual controls that are still vital for driving the car yourself, the camera-only approach means the car won't be self-driving for much of the time people would usually keep it, either.
Your testing methodology, editing, and analysis is simply the best there is! Super looking forward to V12 testing!
Soooo glad you’re not quitting! Keep up the great work! ❤
Really impressive overlay of the real with the virtual, makes the comparison easy to see.
That is a BRILLIANT way to show how accurate it is. Thanks for all that hard work. I have ultrasonic but it's not that good for me where I live as I really need more accuracy. If the new vision can get accuracy to within 10cm then that will be amazing.
Phillip.
The ultrasonics and 360 camera on my Volvo work very well. And yes, it can detect curbs, and the camera makes it even better. That outperforms this from Tesla.
Why?
Because it is not guesstimating. It is much closer to reality. Ultrasonics also work in pitch black and in very low contrast, like a snow wall against snow. And they keep working even when dirty.
The real "high fidelity" tech on display here is your drone footage overlay! Amazing stuff!
7:19 “worst this is ever going to be.” Generally yes, though the history of isolated regressions has been acute enough for some people to disagree.
IKR. The next update could be a buggy mess!
“This is the worst it’s going to be”, if that was true we’d have working auto wipers by now.
Great work on this video. The time and effort you spent truly shows. I hope the tesla team sees it
@4:05 the car thinks the shadow of the curb is the curb when it "reupdates"
I am soo happy I found your channel 1 year ago !
crazy good visualization method
The lack of sensors means this will never get enough accuracy to be useful. Cameras only isn't up to the task, and this is why Tesla is at the back of the pack when it comes to FSD.
Tesla is the king 🤴 duder
Not trying to make an ass out of myself here. I'm kinda a tesla fan boy due to it being the only information I have been exposed to. But what other production car can I go purchase right now that gives me access to FSD. FSD is in my dream car. Up until now that has been Tesla. You seem pretty confident....Change my mind.
Man, your voice is just soo calm, one of the chilliest TH-camrs I follow. I wasn't feeling the need to skip the sponsor segments cause it didn't disturb me unlike some other TH-camrs do. Keep up the amazing work and don't you dare to leave TH-cam!
The memory issue is spot on. I pulled out of my driveway and had my garbage can at the curb. I brought it in the garage and moved the car. It thought the garbage can was still there.
"with time it will getting better, hopefully" is the most used sentence in this videos. Whats about Windscreen wipers? They are usable now? Or will they get better?😂
Your editing on these videos is very impressive.
the drone overlay is sick, nice work
Bro, did you wake up one morning and go “I’m going to make the best freaking tesla videos in the world”. Drone overlay with high fidelity was genius.
I have been driving my Model 3 Highland since late November.
I love the car on so many levels - but I have to say, the parking assist is pretty terrible. When I park in my driveway it always insists I am driving into the walls, despite still having 30-50 cm of clearance. That is usually when parking front-first.
Generally parking front first gives very inaccurate estimations.
Version 11 tests are not pointless. I think they are an opportunity for even more videos, because you can set up some version 11 tests specifically for comparison with version 12 after its release last November.
Yeah this is a good point. See if you can get some footage of tests that you can compare against directly when you get V12
It's unfortunate that the fidelity and accuracy decreases as you approach objects, given that the modeling and visualization are ostensibly for parking in tight spaces.
Sucks that everyone is quitting, but it's great to be stuck with you!
The issue with objects disappearing when you get too close is most definitely because it's vision-only; if they merged the two technologies it would have been much more accurate. But yeah, I believe they can still make a lot of improvements to this over time.
BIG YIKES with that Rivian crash for sure
I hope this improves quickly, it's disconcerting when it doesn't display some vehicles and everything jiggles around
Much lower ultrasonic sensors paired with this would better handle the cone scenario and maybe even curbs. Plus I really don't think this system alone would detect a kid playing hide-and-seek in front of the car.
This is actually a pretty cool system! With a few tweaks to how it's rendering the environment, you could make it work pretty flawlessly. One thing that stuck out is how it handles static objects. If it knew something was a line or a curb within a certain perimeter of the car, it would not need to keep updating it but could instead just use reference frames. The cone is a good example: it knew a cone was there, but once it lost sight of it, it deleted it from its surroundings. If it knew it was a cone once it hit a high enough certainty level and had the position locked in, objects like that would not be an issue.
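The lock-in idea this comment proposes can be sketched as a small tracker that freezes an object's position once it has been confirmed enough times. Everything here (the class, the threshold, the object IDs) is hypothetical, just to make the proposal concrete:

```python
class StaticObjectMemory:
    """Hypothetical sketch: once a detection (e.g. a cone) has been
    confirmed enough frames, lock its position so that losing sight
    of it no longer deletes it from the map."""
    LOCK_AFTER = 5                       # confirmations needed to lock

    def __init__(self):
        self.hits = {}                   # object id -> confirmation count
        self.locked = {}                 # object id -> frozen position

    def observe(self, obj_id, pos):
        if obj_id in self.locked:
            return                       # position frozen; ignore jitter
        self.hits[obj_id] = self.hits.get(obj_id, 0) + 1
        if self.hits[obj_id] >= self.LOCK_AFTER:
            self.locked[obj_id] = pos

    def lost_sight(self, obj_id):
        # Unlocked (unconfirmed) objects are forgotten; locked ones persist.
        self.hits.pop(obj_id, None)

mem = StaticObjectMemory()
for frame in range(5):
    mem.observe("cone-1", (1.2, 0.4))    # five consecutive detections
mem.lost_sight("cone-1")
print("cone-1" in mem.locked)            # True: the cone survives occlusion
```

The catch, as other comments point out, is choosing when locking is safe: a genuinely static curb benefits, but anything that might move (or a misclassified detection) would be frozen wrongly, so a real system has to keep some residual uncertainty.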
It's all fun and games until someone puts an RC cone in the parking lot... /s
Other cars don't just have the stitched top-down view. You can also switch to viewing individual cameras, without distortion or fantasies. And also - yes, there are dedicated curb-detection ultrasonic sensors that are pointing *down* at the curbs and measure distance between your wheel and the curb at sub-centimeter precision.
Imagine if Tesla actually had accuracy like that!
I'm so confused about whether self driving will be real or not. I've seen enough long-form essays disparaging the idea from people with sound reason that my confidence in the idea is shaken. I'm glad you're holding the line, and keeping an objective view on the matter. Cheers!
6:08 - _"In my opinion all it really needs is a memory of what it's already seen"_
Easier said than done... what you said brought me back to 2018-2019 when I was working with computer vision and "all we really needed was memory of what our models had already seen". We never solved that problem with the model(s) we were using ;)
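For what it's worth, the naive version of that "memory" is plain dead reckoning: carry the last-seen point forward using ego-motion. A toy sketch (my own illustration, not anyone's production code) that also hints at why it's fragile:

```python
import math

def dead_reckon(point, dx, dy, dtheta):
    """Carry a last-seen point (in the old car frame) into the new car
    frame using only ego-motion (dx forward, dy left, dtheta yaw)."""
    x, y = point[0] - dx, point[1] - dy       # undo the translation
    c, s = math.cos(-dtheta), math.sin(-dtheta)
    return (c * x - s * y, s * x + c * y)     # undo the rotation

curb = (2.0, 0.0)                             # curb last seen 2 m ahead
curb = dead_reckon(curb, 0.5, 0.0, 0.0)       # car crept forward 0.5 m
print(curb)                                   # (1.5, 0.0): now 1.5 m ahead
# Real odometry has noise in dx/dy/dtheta, so the remembered point
# drifts a little every tick: the longer the curb stays hidden,
# the less the memory can be trusted.
```

That compounding drift is exactly the part that makes "just remember it" harder than it sounds, and why learned systems end up expressing the memory as a probability rather than a fixed point.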
Yes. More cameras and sensors all around the EVs would greatly help. Look into gasless and oilless self-running generators. Already proven and patented for many years. Now, combine the self-running generators and solar in EVs and everything else. No more plugging in, no more charging stations and no more limited range.😎🇺🇸 Please look up the gasless and oilless self-running generator patents and technologies thoroughly, not just a little or not at all. We already have the tech needed to go far beyond our many issues and limitations.
Drone stabilization is really impressive
Whoa....There are 2 S's in stabilization? That is cool and good to know. Thanks!
Yep my english is sucks@@jaystarr6571
Holy cow! The top down viewpoint super imposed on top of the virtual one is so frikin cool! Keep up the good work!
Second thought, definitely hellishly difficult I’m sure, but imagine if someone could do that with the regular FSD view.
I’ve given it a shot before, I can make still frames look pretty good but while the car is moving it’s nearly impossible
@@AIDRIVR Sounds about right. Just keep doing what you're doing then! You are easily my favorite content creator when it comes to FSD beta. I love your videos and I look forward to you getting V12.
@@patrickday3393 Me too my friend, me too
This is incredibly interesting footage! Thanks for making the effort to create this "hybrid view" for us...
For me it shows several key aspects:
1) Cars without a front bumper camera will never be reliable with a camera-only system. The blind spot is simply too large.
2) To make this work beyond 3 mph will likely require hardware that is orders of magnitude faster than what current cars have onboard.
3) Combining USS and cameras would likely have yielded better and more accurate results.
As always, your use of visualization is consistently innovative and super helpful. Well done, man
Excellent video. No problem on the lack of FSD videos. Looking forward to future version 12 vids.
Excellent concept and execution here. I've never seen this done before. Really helps us to see what it's doing. Keep it up!
I have a BYD Atto 3 with a 360 cam. Those cams are only useful for looking at the lines on the ground and the curbs in front that are hidden by the nose of the car. Anything higher than about 10 cm off the ground gets completely distorted and unrecognizable, just like that trailer. So yeah, we don't rely on it that much, but it's good for blind spots and parallel parking ❤
The drone view footage combined with Tesla vision was amazing! I would love to see more of this. With pillars, cones, small things, sharp edges, steep roads?? An extensive review would be awesome!
Tesla could also record everything and make a high fidelity 3D model of cities. Or even let us contribute kinda like Waze gps when there’s an accident or road maintenance, etc.
Great visuals. You always kill it with the unique viewpoint.
Totally agree further videos on v11 aren't worth making. Wait for v12.
That was amazing. Really cool how you lined up the visualizations. Well done.
It's still not as good as the ultrasonic sensors or the bird's-eye view that other vehicles have. It needs a front bumper camera.
Congrats to the 100k. Keep up the good work. I love your channel.
4:18 Better memory here would be totally unnecessary if it just had a front-facing camera down low... like the Cybertruck. It's amazing Tesla is just totally ignoring this.
Love the setup. So cool. Awesome video!
That overlay footage was og. Great work/effort my man
Damn! I want you on my podcast so bad! I own a Model 3 Highland and have been testing the park assist feature a couple of times, and like you’ve said: a front facing camera would make the system more accurate!
When I test drove a Model 3, trying to park in our garage was just horrendous. The visualisation was off by so much and I had to be waved in (since I did not know the dimensions of the car and did not want to damage it). I wonder how this would do in this scenario (nosing into a narrow garage), since the ultrasonics on my current car feel way more reliable
I park in a narrow garage every day. It's awful. I can't rely on it at all. I'm stuck using my mirrors... makes me so upset every time. I truly can't believe I'm stuck with this car for 3 years.
@@geekgrrle getting rid of the ultrasonics was a mistake, and yet they doubled down on it
Awesome job combining real videos with this visualization, thanks for doing this! 😎💪
Great work! Love the effort you’ve gone into on this video! 👍👍👍
I agree, there are so many areas this fails at. I just reviewed the Highland and the parking was a 6/10.
A car not being able to see what is directly in front of it is absolute negligence, and I’m surprised you’ve given them so much slack on this point. No regulator will allow self driving cars on the road that can’t know what’s in front of their bumper. Making up what’s in front with “context” doesn’t count.
The FSD visuals are currently something completely different... what the car logically knows about/has modeled. The Park Assist visuals are the topology map + some markings, and are much closer to a camera-based view. I think they serve different purposes, and hope that both exist in the future.
Great footage and edit, thanks - happy to wait for V12
Damn fine use of your magic, "How dey do dat?", cameras! Kudos and Glory to ya!
The problem you mentioned several times, "Why can't Tesla remember the right position and stick to that instead of updating to the wrong one?", is actually a pretty common problem in programming overall!
Especially when you're solving problems that require calculations, probability estimations, etc. It's very easy and obvious for our eyes to pick the correct solution, but sadly for the computer it's just several cases with equal probability.
How should the car know that the curb's position was estimated correctly on the first attempt? What if that estimate was wrong and the thing was actually closer?
That's where a real debugging pain comes in😂
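The dilemma above, whether to trust a remembered estimate or a new (possibly worse) measurement, is exactly what recursive filters address: weight each source by how uncertain it is rather than picking one outright. Here's a minimal 1D Kalman-style update as an illustration; the function name and all the numbers are hypothetical, not anything from the car's actual software.

```python
def fuse(est, est_var, meas, meas_var):
    """Blend a remembered estimate with a new measurement,
    weighting each by its variance (1D Kalman-style update)."""
    k = est_var / (est_var + meas_var)   # gain: trust in the new measurement
    new_est = est + k * (meas - est)     # nudge the estimate toward it
    new_var = (1 - k) * est_var          # fused estimate is more certain
    return new_est, new_var

# A confident early sighting (low variance) dominates later noisy readings:
est, var = 0.50, 0.01                  # curb seen clearly at 0.50 m
for noisy in [0.70, 0.65, 0.72]:       # later occluded, high-variance readings
    est, var = fuse(est, var, noisy, 0.25)
# est stays close to 0.50 instead of jumping to each new reading
```

In this framing the car doesn't "forget" the first good estimate; new noisy observations simply fail to pull the fused estimate very far, which is the behavior the video wishes for.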
Congrats on the 100k subs! Awesome work on the overlay too!
Thank God you're sticking around!
Okay, so you're going to continue to innovate beautifully, it seems. Chapeau!
This is an outstanding video. Well done mate.
Absolutely stellar job with the camera work!
Great video! Can’t wait till the vehicles with ultrasonic sensors get this too 👌🏼
Really hope the Raven Model S/X get this too
Agree, v11 videos are pointless when you know v12 is rolling out 🎯
This video is awesome! truly my fav FSD youtube channel
Can these shits at tesla let me pay 300 bucks for a bumper cam so the front measurements are good
I've been driving a Tesla M3 for many years and am usually open to new technologies.
But despite all the enthusiasm, I think we should keep things in perspective.
For me, what Tesla is offering here is at most a nice gimmick, because we are talking about a parking aid, i.e. a system that is supposed to help you park. If the system on offer is unsuitable for its intended purpose because it does not recognize obstacles to the front or rear, detects all sorts of things in the rain and, since the discontinuation of the USS, does not provide reliable distance information, then for me this is not a revolutionary technology but rather a poor parking aid with fancy graphics.
If I had the choice, I would immediately swap the Tesla's computer-generated display for the 360-degree camera view with USS that most cars have installed as standard these days. I wouldn't get the wow effect of the Tesla detecting some object, but I would have a system that fulfills its purpose almost 100% of the time and that I can trust in all weather conditions.
Despite all the enthusiasm for technology, evolution always comes before revolution; but because Tesla always prefers revolution, we guinea-pig testers now have to risk our own sheet metal.
Imagine if someone drew a realistic hole on the road. It could stop every Tesla from passing.
Damn dude. Great production quality. Very helpful
Wow. A 20-year-old Daihatsu for less than $3,000 has no cameras and still has better park assist by using the proper sensors for the job. But I'm happy for Tesla that they save $112 per car and create new jobs in the car repair industry.
lol @ intro. There's comfort knowing I'm not the only one who can’t wait for v12. 😛
These videos are so well planned and executed! Thx!
So refreshing to watch a video that tells everything as it is, instead of another pumper video where everything is just great with Tesla.