Imagine using swarms of small drones with lidars and cameras to scan the entire Earth from every angle! It could become the most interesting open world game/driving/flight simulator ever! Or even just crowdsourcing 3d-scans from enthusiasts with the latest smartphones. Then developers and AI can adjust things to make it super realistic.
I've seen similar devices on jobsites years ago. Last time, a company was taking 3D measurements of a framed staircase so they could use that information to cut the marble pieces to be installed later.
thanks for your great review! I'm also planning to test iPhone 12's "lidar" myself so this is a great reference. But I don't think that at this point "lidar" can replace photogrammetry yet, judging from the results in your video. The big advantage of photogrammetry is the way we can manipulate (to a certain degree) the end resolution to our needs depending on the camera-to-object distance, and yeah costs matter too. PS. I put "lidar" in quotes because as far as I can see despite the name Apple's system does not seem to be a real lidar/laser scanning per se, more like a depth-camera system. An upgraded Kinect if you will. CMIIW PPS. another great alternative is videogrammetry/photogrammetry from videos!
I can see the use of Lidar for static scenes like what is shown here but I don't see a whole lot of use cases for it outside of that such as in games. The cleanup work seems like it would be much more complicated, and the texture resolution seems to be lacking from what I've seen as well as the accuracy of its simplistic geometry generation. It is however very fast and easy to do from what I can tell. I look forward to seeing how it improves over time.
@@sircher943 This. I was shocked that a PC on the lowest settings is just as fast at making a scene as ARCore apps, and with so much more accuracy and control. I'm using Metashape.
Just imagine the applications in VR. You could make photorealistic videogames with explorable environments just by sending a few drones around to film stuff. Microsoft Flight Sim, this - the only problem is it's not perfect, but the framework is there for cyberspace.
You missed a key difference between laser-based LIDAR (see the top of Google Maps cars) and image processing based on stereoscopy and its variants (leveraging multiple views of the same scene). Apart from that, thanks for the video, nice of you to share your work!
For the latest Huawei devices, there is an app called "3D Scanner". The free version works on ARCore-enabled phones, but without the lidar sensor. The latest Huawei and Honor phones with lidar make really nice scans with the pro version, which is updated every few months. Check out the developer's name, Lubosh, search that together with "3d" and you can see his clips on YouTube.
Is there one available with a higher triangle count for more curved edges rather than a jagged appearance? I'm only asking as I love 3D printing and I can see this coming in handy rather than making the models in Blender before printing.
So, I know that when mesh resolution is low, UV maps in Blender are often stretched or distorted, similar to how they were in your video. Understandably, subdividing your mesh might not be an option, as it might make your geometric density unmanageably high, but I was just wondering if you had tried that to help with the stretching/distortion.
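For what it's worth, that kind of stretching can be measured before deciding whether to subdivide: compare each edge's length in 3D with its length in UV space, and edges whose ratio deviates a lot from the mesh average are where texels get smeared. A rough sketch of the idea (the edge coordinates here are invented, not from any real scan):

```python
import math

def stretch_ratio(p3d_a, p3d_b, uv_a, uv_b):
    """Ratio of an edge's 3D length to its UV length.
    Uneven ratios across a mesh mean distorted, stretched texels."""
    len_3d = math.dist(p3d_a, p3d_b)
    len_uv = math.dist(uv_a, uv_b)
    return len_3d / len_uv

# Two edges of equal 3D length, but the second is squashed in UV space:
ok = stretch_ratio((0, 0, 0), (1, 0, 0), (0.0, 0.0), (0.1, 0.0))
squashed = stretch_ratio((0, 0, 0), (1, 0, 0), (0.0, 0.0), (0.02, 0.0))
print(ok, squashed)  # the second edge's texels are ~5x more stretched
```

Subdividing doesn't fix this metric by itself; it just gives the unwrapper more edges to distribute the distortion over.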
Thanks for this! Just got this phone and I’m well into a project making maps for a game/app and this is (potentially) gonna make building custom assets so much easier. Thanks for saving me some time with failed experiments lol
Our geology teacher (I'm studying for a Civil Engineering degree) used a LIDAR scanner (a proper stationary tool rather than a smartphone toy) to scan an entire cliff (about 15m high). Really incredible stuff to be able to manipulate something from real life on your desktop. Perhaps using a dolly or some stabilisation handle would give better end results with a phone as well. Hopefully LIDAR will become more common, though I bet people would rather have something dumb like yet another camera ¬_¬
Any app recommended for Android? This looks awesome. Can't wait for this tech to mature enough to give good result on budget phones and not needing to buy expensive lidar scanners.
This will be the next most-wanted feature for every phone. The precision is acceptable, and it's fast. The main problem for photogrammetry is precision, and precision requires more processing power, which means more time. In many situations we don't have the time. When phones become powerful enough, we'll combine both to achieve better quality.
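On the precision point: even without more processing power, software can claw back some precision by fusing repeated readings of the same surface point across scan passes. A deterministic toy illustration, where alternating ±2 cm error stands in for real sensor noise:

```python
def average_depth(readings):
    """Fuse repeated depth readings of one surface point by averaging.
    With zero-mean noise, the error of the mean shrinks as samples grow."""
    return sum(readings) / len(readings)

# 100 readings of a surface 1 m away, each off by +/-2 cm (toy noise model):
readings = [1.0 + (0.02 if i % 2 == 0 else -0.02) for i in range(100)]
print(average_depth(readings))  # the alternating errors cancel out to ~1.0
```

Real scanners see correlated noise and drift, so the gain is smaller than this idealised case, but it's one reason slow, repeated passes scan better than a quick sweep.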
I forgot I had a black moustache lol.
great vid
Lol
I bet it was cgi
He & his mustachio were being CGMatter, I was being me.
lmfao..
Now buy an iPhone, CG
Oh whoa! Curtis! Dang, that's a hefty shout-out, thank you so much! Also hot damn this is cool. With the LIDAR and facial mocap tracking/scanning and all that wild stuff, it seems like an iPhone is totally the way to go if you're into a virtual-productiony workflow. Also had no idea about the clone tool in texture paint! That's going to help cleanup an absurd amount.
Personally I can't wait to see what happens once machine learning gets more involved in the LIDAR/photogrammetry workflow (is it already??)- it seems like stuff'll just get insane at that point.
Totally! I'm really looking forward to seeing where this will go. And honestly, I'm starting to feel like a kid in a candy store having all this stuff to play with. There are so many cool things to make, but so little time 😊
There are some papers floating around demonstrating AI generated photogrammetry, albeit mostly for a lot more niche purposes than virtual production, though it is very promising. Especially if they were to start using stuff like LIDAR for training data. It just takes someone with the right project idea in mind.
You should check out Two Minute Papers if you haven't already, they cover a lot of awesome stuff relating to AI use in CG as well as other fields and technology.
Give it a couple of months and all newer Android phones, even the budget ones, will have it
Way to go? Android has had the same shit for years.
I also wanted to say that the clone tool for texture painting was an incredible knowledge drop! Brilliant
Friendship with photogrammetry ended, now lidar is my best friend
Dying...
You'll have to wait for another 5 years...
FRIENDSHIP END WITH PHOTOGRAMMETRY, NOW LIDAR IS MY BESTFRIEND
Careful now with ending friendships! I’m sure you want textures to go with your geometry :p
LiDAR is one of the tools used in photogrammetry...
Props to you for giving Ian Hubert a shout out, he's a seriously underrated blender genius
Underrated? Ian literally has about 4.5 times more subscribers than this channel. Blender Guru is probably the only Blender tutorial channel with more subscribers than Ian.
Highly recommend his Ted Talk. Very entertaining and informative at the same time.
(Edit) lol he JUST mentioned it as I was watching... Touché
Ian Hubert is among the most applauded vfx artists today lol
@@dirtiestharry6551 lmao facts and Ian's a god compared to that self claimed guru
Oh look, it's the "seriously underrated", meme comment.
Plus it's super handy for vfx work: a rough scan of the room + a 3D camera track saves a lot of time placing reference walls etc. The scanned room could even be used for better reflections than an HDRI in the middle of the room.
Yes!^
Hadn’t considered hdri alternative!
“Export an OBJ and E-mail that to yourself...” lol I feel your pain, iPhone and Windows computer user!
your poor soul
"Light + Radar" and "Light Detection and Ranging" are basically the same, since Radar stands for "radio detection and ranging". But the second is more accurate, since there are no radiowaves involved in Lidar
@Curtis Holt -- Thanks for sharing Ian's Blender Talk. I loved it.
I've been making sure to grab photos as I grab LIDAR models, which allows you to project them back on to really messed up areas, texture-wise.
God this is insane, this technology is just going to keep advancing and advancing, imagine this in 10 years
This technology could've been and was implemented over ten years ago (Kinect, RealSense). What it needs is AI denoising and aligning; the current algorithms are probably the best at what they do, so unless that happens you won't see it get much better in the future - the point clouds will just get denser, but still only ~2 cm accurate at 1 m.
@@sircher943 yeah this, as the image is ultimately limited by the wavelength they're using.
yeah, and it's all going to be used for advertising :D
@@HerbaMachina What do you mean by "the wavelength they're using"? Doesn't this just virtualise the geometry on the device and call it a day? Why wouldn't increasing the speed at which the device can capture scans, along with increasing the subdivisions of the geometry, simply improve the tech to the point where it's the norm to use for real-world objects (obviously cleanup would likely be needed)? Why isn't this a big thing they're trying to advance yet?
@@HerbaMachina No, this is not limited by the wavelength. This uses LIDAR, not RADAR, and LIDAR uses a laser in either the red or infrared range. Either way, we're talking wavelengths somewhere between a few hundred nanometers and a couple of micrometres - tiny compared to radio waves. The wavelength is not what's limiting this technique.
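For anyone following this thread: however the wavelength debate lands, the ranging principle is the same - the sensor times how long a light pulse takes to bounce back. A minimal sketch of the arithmetic (the pulse timing is an illustrative value, not a real sensor spec):

```python
# Time-of-flight ranging, the principle behind lidar:
# distance = (speed of light * round-trip time) / 2
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the target, given the light pulse's round-trip time."""
    return C * round_trip_seconds / 2.0

# A pulse returning after ~6.67 nanoseconds hit something about 1 m away.
print(round(tof_distance(6.671e-9), 3))  # -> 1.0
```

The practical limits come from how precisely those nanosecond timings can be measured and how many points per second the sensor emits, not from the light's wavelength.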
Measuring the architecture and composing an environment from photos is in the past! Man this evolves so fast!
Thank you for this video, incredibly helpful, the LIDAR is much better than I expected on the iPhone 12 Pro. Thanks!
"I need your clothes, your boots and your motorcycle."
You forgot to say please
the app needs a terminator vision mode
*scans them with phone then politely gives items back*
THANK YOU
Modozycle!
@@flavienvolken3733 😂😂
Thank you for introducing this. By the way Happy New Year! Keep Blending.
Great video and new information about working with my iPhone 12Pro and Blender.
More like the beginning of photogrammetry. Photogrammetry neural nets trained with lidar data will be scary good in the future.
at 4:20 do u have a tutorial on how you created this pic? I'm interested in how you got the dirt... and the spaceship
Damn this is cool. Iphone seems to have a lot of interesting features for 3D, also for facial capture in combination with mocap suits... great video man!
This is awesome. I'd digitize my whole town. Imagine technology like this in 10 years: being able to 3D map a building like a grocery store, inside and out, by simply filming it like a video. And then you drop the file into any game creation program and it automatically makes it so you can play it like a first-person shooter.
been using polycam a lot with blender and its awesome. learnt some cool tips here so thank you :)
In a couple of years a bunch of Indie filmmakers will be able to take on the blockbuster market. I can’t wait to see that!
Interesting!!
Looking forward to your next works involving Lidar 👀👀
I’m a gearhead, and I’m really enthusiastic about LIDAR. Imagine seeing a cool car and being able to make a 3D model of it, instead of making a 2D picture!
Totally.
Or walk around it while recording a video for 10 secs? 😂
I've always wanted to combine high quality lidar with high quality photogrammetry, best of both worlds. That would be for film/game close up work though
I tried doing this and it worked in limited scenarios; most of the time the scanned mesh is so different from the photogrammetry point cloud that the textures look warped.
I would also love an AI solver that approximates materials beyond albedo, with roughness, and specular, maybe even metalness.
Just need to wait till it is out of Apple territory
It does exist on Huawei's territory. Check out Lubos's 3D scanner app videos in youtube
@@largerification , cool, Thank You! Happy New Year! :-)
th-cam.com/video/NMAJLGKXvxM/w-d-xo.html. This is the latest video, with the latest updates
@@largerification , Thank You! I also tweeted to my friends, to spread the word more:
twitter.com/Thunder_Owl/status/1345015443522727936?s=20
LiDAR was never Apple's "territory". Holy shit, people say the dumbest things.
Honestly a major reason I bought an iPhone 12 Pro over another phone was for the LiDAR Scanner, you did a great job of highlighting just how useful these kinds of tools can be for environment design
that UFO looks so great!
That’s a great video. LIDAR looks awesome and you can quickly get a ton of background environment material quickly. I’ve seen this on $100K cameras, but seeing it on a f🤘🤘🧟♀️ing iPhone... dayummmm!
The biggest benefit I see for this, in filmmaking anyway, is getting a live accurate model of your set or location for both accurate compositing with live action footage, and reflection maps. Like, holy crap, this is invaluable.
RGBD cameras have existed for a good while.
The D being depth
@@avnzx5177 I did see a video back in 2015 about a camera that did that. It could do live, depth accurate effects like post-production refocusing, Z-depth based color correction, add fog I think, and was able to composite backgrounds and other elements without the use of green screen because it knew where the actors or subjects were vs their environment.
Is that what you're referring to? Cause I actually haven't heard much about that technology since that video.
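The depth-keying idea these cameras enable can be sketched in a few lines: for each pixel, keep whichever source is closer to the camera, so foreground and background separate without a green screen. A toy illustration (the pixel lists and depth values here are made up for the example, not a real compositor API):

```python
def depth_composite(fg_rgb, fg_depth, bg_rgb, bg_depth):
    """Per-pixel depth test: keep whichever source is nearer the camera."""
    out = []
    for f, fd, b, bd in zip(fg_rgb, fg_depth, bg_rgb, bg_depth):
        out.append(f if fd < bd else b)
    return out

live = ["actor", "wall", "wall"]     # live-action plate (toy 3-pixel row)
live_d = [1.0, 5.0, 5.0]             # lidar/RGBD depth per pixel, metres
cg = ["robot", "robot", "sky"]       # CG render of the same row
cg_d = [2.0, 3.0, 9.0]               # CG z-depth per pixel, metres
print(depth_composite(live, live_d, cg, cg_d))
# The actor (1 m) stays in front of the robot; the robot (3 m) occludes
# the wall (5 m); the wall occludes the distant sky.
```

Real pipelines also need soft edges and matched camera tracks, but the core occlusion logic really is this simple once per-pixel depth exists for both sources.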
Had terrible results with lidar apps, apparently it’s software locked, I get overlapped textures and the application locks up often..
but you say it’s usable?
Remember to film really slowly
@@hugoantunesartwithblender not sure if there was a tutorial with the app..
..people in the 3d scan group called this a toy really :/ .. but I'm hoping the software improves.. I was really asking for the best apps - apparently they all have the same issues..
But we can see from the video, you can use some of it..
And my reasoning is the same - I can do photogrammetry - but it requires so many steps - just copy-pasting a bunch of photos is enough work..
Lidar isn't a toy, it's well-proven technology across all kinds of industries, from industrial design and geological study to archaeology and, damn, even NASA have been using it. Software design seems to be the most limiting factor at the moment. Using just photo or lidar data to generate a mesh is a mistake I think, perhaps using data from both to inform more accurate decisions about surfaces would be better. Having an effective/efficient way of doing that live is tricky.
@@CurtisHolt the iPhone lidar is a toy, not even real lidar - that's what the 3d scan people who use the expensive lidar claimed; accuracy is very low apparently - not my claims - but the quality is pretty low - and apparently Apple locked the resolution, so that might be it. There might be better solutions to come
@@robob3ar Like I said, software is still a limiting factor. Realtime photogrammetry is possible, using Lidar data even at a low resolution should make it better, but it seems choices have been made to exclusively use low resolution Lidar data to construct meshes. Why not both? People think that just because something isn't a hyper-expensive top of the range specialised tool that it's useless.
It's not.
This is indeed very interesting! We have even launched a business based on lidar.
Considering the huge price drop and size reduction of lidar hardware, we think it's actually not expensive anymore and very accessible. Thanks for sharing🙏🏻
This creates a lot of new possible scenarios! Interesting video!
Can you use the front cam for smaller objects?
This seems really important.
I want to play around with point clouds and I am keeping my eye out for a second-hand Xbox Kinect - which a quick Google search suggests would work.
Now, I would think that LIDAR would work for this. The issues holding me back would be the resolution (your point about small items) and that I have an android phone.
Thanks!
I've tried it; results can be a little underwhelming when using an original Xbox 360 Kinect. Maybe try the Xbox One Kinect. Using software like Skanect gave me some useful results, but little details get ruined, like with LIDAR
How does the lidar on the iPhone compare to the lidar in that Project Tango tablet from Google?
So cool! Would love to see a Meshroom photogrammetry video from you sometime... and what that pipeline looks like compared to the lidar pipeline - how you decimate and retopo in Meshroom etc. Lidar seems interesting because it looks like a continuous stream of collection, whereas photogrammetry is multiple photos that create a point cloud. I would love to find out if Meshroom calculates GPS data as well?... in which case it would make sense to use a camera that has GPS in its metadata for Meshroom to incorporate. Lots more to learn 😂 thank you so much for your contributions. Keep up the great work.
Just got the iPhone 12 Pro and forgot it had lidar capabilities - this is really useful to me and I didn't even know about it
Hi There
Love to know more about scanning the outside and inside of houses for renovation and extension plans please
What app did you use?
Me: no way, ever since display.land got shut down I've needed something like this.
Sees that an iPhone 12 is required...
:(
:(
I got my 12 pro max yesterday 🤭
@@manavgala2361 nice
well, yes but no
you can still do this without proper lidar
but the process will be far more difficult and the result won't be as clean
it's called photogrammetry, and there are free apps you can use to do it
there are also a lot of tutorials for it, so I don't need to go into too much detail
but a fair warning: it takes much more time and effort
@@aronseptianto8142 With photogrammetry you find out where your holes are way too late. Realtime scanning is a big step forward: seeing what you get has tremendous value
Just the iPhone? Is there something special about the iPhone, or...?
I chose to switch to an iPhone at the right time, this is one of my favorite features of any phone I've ever owned.
Curtis, regarding the UV issue, that's one example where Ptex could be of large benefit. Perhaps you could touch on what Ptex could potentially bring to the table, seeing as it has moved into the background. It has the potential to be a real gamechanger, as it doesn't need UVs to map and paint.
I have been hoping for this tech for 20 years ! Can’t wait for it to evolve so I can do hyper accurate scans for engineering purposes
Laser scanning is already a thing
You can already do this for years it’s just the technology being integrated into smaller and cheaper devices
It's not that expensive anymore, for 1k usd you can get a HP SLS-2 which will let you scan any mechanical parts with great accuracy and resolution.
@@sircher943 I knew the tech has been around but didnt realize the price had dropped so far. thanks for the advice
@@jittertn no duh..... the comment is related to the lidar in the phone genius.
I've been 3D camera tracking scenes, and on site I use the lidar. Then after tracking, I use the models in the camera-tracked scene for particle systems, with the lidar scans as guides for effectors or collision objects. Works beautifully
How long until VR headsets have this for scanning your environment and converting it to a map?
Such a cool idea. I guess that's technically Augmented Reality, but you can imagine walking around your house with a headset on and the rooms and objects are all digitally viewable and interactable but maybe with different textures or filters on them, or even highlights or graphics overlaid onto the image. Arrows pointing you around your house to where you left your keys, or virtually instructing you on a recipe you're cooking.
this is amazing, but the meshes look wonky in places, which can be a problem for further modelling/texturing... thoughts?
What Android phone does this? Asking because I don't see any use for iPhone except that scanning.
i don't even use blender but man this is fascinating to watch
4:15 “because datar is data” lol. Love those random R's in words
The mesh quality limitations probably stem from its real-time roots. If you were able to capture pictures with LIDAR data, which can have higher resolution and detail than video, and then process that in a desktop application instead, it'd open up the opportunity to use the mesh data with the parallax data generated from standard photogrammetry. The LIDAR sensor probably has a much higher resolution than the phone can process in real time as well.
The main issue is that the LiDAR sensor's resolution is extremely low, at least on iPhones, and the dots are very widely spaced.
Really interesting. Do you have any more recent experiments with lidar scanning? A friend of mine has a new iPhone 13 pro coming in the post today and I'm trying to get him to send me some scans to play with.
Very cool to see the 'adaptive refinement' of the mesh when you move around and cover a previously 'undiscovered' area with the camera.
And yes @IanHubert is great 👍🏼
I wonder if they've improved the Lidar this year. I've watched a number of videos showing the iPhone 12 Lidar and quite a few are waiting for it to improve.
Exciting. One more thing to look for the next time I buy a phone.
Mind blown, really enjoyed this. Do some more videos on how lidar can be used
It would be cool to combine this with photogrammetry. In other words, LiDAR could be combined with high-res photos, using the distance data to assist in the generation of the point cloud.
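The core of that idea is unprojecting depth samples into 3D points that a photogrammetry solver could then refine. A minimal sketch of that unprojection, assuming a simple pinhole camera model (the field of view and depth map here are made up for illustration, not from any specific app):

```python
import math

def unproject(depth, width, height, fov_x_deg):
    """Unproject a per-pixel depth map (row-major list of metres)
    into camera-space 3D points using a pinhole model."""
    fx = (width / 2) / math.tan(math.radians(fov_x_deg) / 2)
    fy = fx  # assume square pixels
    cx, cy = width / 2, height / 2
    points = []
    for v in range(height):
        for u in range(width):
            z = depth[v * width + u]
            if z <= 0:  # no depth return for this pixel
                continue
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points

# A flat wall 2 m away, seen through a tiny 4x3 "depth map":
pts = unproject([2.0] * 12, 4, 3, 60.0)
print(len(pts))  # one 3D point per valid depth sample
```

In a real pipeline these LiDAR-seeded points would give the photogrammetry solver a metric scale and a starting structure, instead of it having to estimate depth from parallax alone.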
wow this is amazing! this makes me wanna buy an iPhone 12 :D
it's pretty cool but you don't need an iPhone to use Lidar
If you want the LiDAR scanner, make sure it's the 12 Pro or Pro Max. There are also Android phones with ToF sensors if you want similar tech but don't want iOS, but there seems to be nothing around in terms of phones that beats Apple's implementation of the technology. I may be wrong, of course, but I genuinely doubt it.
Great vid. Though LiDAR still has barriers for mid- to low-end phones, right? Can we foresee mainstream commercial usage 2 to 3 years from now? Or are there any alternatives to LiDAR?
Super nice video! :) Is any phone or device better than the others with the LiDAR scanner, or is the iPhone 12 Pro as good as it gets if you don't want to invest big in something ONLY for LiDAR scanning?
Imagine using swarms of small drones with LiDAR and cameras to scan the entire Earth from every angle! It could become the most interesting open-world game/driving/flight simulator ever! Or even just crowdsourcing 3D scans from enthusiasts with the latest smartphones. Then developers and AI could adjust things to make it super realistic. It's coming.
Have you ever tried Microsoft Flight Simulator 2020? It might blow your mind.
7:28 How come the iPhone 12 wasn't "advertised" as having the LiDAR scanner?
I literally got the iPhone 12 Pro Max because of this. Plus the other good things that come with it.
I have seen similar devices on jobsites over the years. Last time, a company was getting 3D measurements of a framed staircase so they could use that information to cut the marble pieces to be installed later.
Blender + LIDAR + AI + Cerewave Natural speech engine = Unbelievable explosion in creativity at the individual level.
Thanks for your great review! I'm also planning to test the iPhone 12's "lidar" myself, so this is a great reference. But I don't think "lidar" can replace photogrammetry yet, judging from the results in your video. The big advantage of photogrammetry is the way we can manipulate (to a certain degree) the end resolution to our needs depending on the camera-to-object distance, and yeah, costs matter too.
PS. I put "lidar" in quotes because as far as I can see despite the name Apple's system does not seem to be a real lidar/laser scanning per se, more like a depth-camera system. An upgraded Kinect if you will. CMIIW
PPS. another great alternative is videogrammetry/photogrammetry from videos!
I can see the use of LiDAR for static scenes like what is shown here, but I don't see a whole lot of use cases for it outside of that, such as in games. The cleanup work seems like it would be much more complicated, and the texture resolution seems lacking from what I've seen, as does the accuracy of its simplistic geometry generation. It is, however, very fast and easy to do from what I can tell.
I look forward to seeing how it improves over time.
How would you make that photogrammetry or LiDAR model 3D-printing ready?
Thanks for sharing
Super cool Curtis!
god damn it i just got into photoscanning
As engineers, can we use this like a digital satellite image for engineering survey work if we capture the project land with this technology?
Are there any limitations to the 3D scanned map you made with the iPhone?
I'm still sad display.land is gone. That thing was very useful.
trnio is a pretty good alternative for objects (not so much for scenes)
Just use PC software; it gives you more control and the results are much better.
@@sircher943 This. I was shocked that a PC on the lowest settings is just as fast at making a scene as ARCore apps. And with so much more accuracy and control.
I'm using Metashape.
@@sircher943 I've used PC software before. Display.land was much, much faster for stuff that doesn't need to be perfect. They both have their uses.
Wait what?
Really cool stuff. I looked up the particular phone you have and it's a little eye watering for me haha
Just imagine the applications in VR. You could make photorealistic video games with explorable environments just by sending a few drones around to film stuff. Microsoft Flight Sim, this - the only problem is it's not perfect, but the framework is there for cyberspace.
Truly awesome. Very impressive!
You missed a key difference between laser-based LiDAR (see the top of Google Maps cars) and image processing based on stereoscopy and its variants (leveraging multiple views of the same scene). Apart from that, thanks for the video; nice of you to share your work!
Photogrammetry does add more detailed resolution, right? Like small gaps.
What application is it? For scanning and blending the objects?
For the latest Huawei devices, there is an app called "3d Scanner". The free version works on ARCore-enabled phones, but without the LiDAR sensor. The latest Huawei and Honor phones with LiDAR make really nice scans with the pro version, which is updated every few months. Check out the developer's name, Lubosh, together with "3d", and you can see his clips on YouTube.
Any information about an online library where you can buy assets from this app?
Why does the mesh resolution not scale inversely with distance?
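One reason resolution degrades with distance: the LiDAR projects a fixed grid of dots, so the spacing between samples on a surface grows roughly linearly with range. A back-of-the-envelope sketch (the field of view and dot count here are assumed round numbers for illustration, not Apple's actual specs):

```python
import math

def dot_spacing(distance_m, fov_deg=60.0, dots_per_row=24):
    """Approximate spacing between neighbouring LiDAR dots on a
    flat surface at a given distance (small-angle estimate)."""
    angular_step = math.radians(fov_deg) / dots_per_row
    return distance_m * angular_step

for d in (0.5, 1.0, 2.0):
    print(f"{d} m -> ~{dot_spacing(d) * 100:.1f} cm between dots")
```

So a surface twice as far away gets sampled half as densely in each direction, which is why distant geometry comes out so much blockier.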
Is there one available with a higher triangle count, for more curved edges rather than a jagged appearance? I am only asking as I love 3D printing and I can see this coming in handy, rather than making the models in Blender before printing.
Can someone recommend such an app, but for Android?
So, I know that when mesh resolution is low, UV maps in Blender are often stretched or distorted similar to how they were in your video. Understandably, subdividing your mesh might not be an option as it might put your geometric density unmanageably high, but I was just wondering if you had tried that to help with the stretching/distortion.
Any sample environment where we can walk around in the Quest 2? Does such an app exist? Thanks.
So would a more powerful GPU help with the polygon resolution?
Is this Polycam you're using? Oh, LiDAR is the tech? I love it!!! I have the demo. I'm about to buy or subscribe to it.
This algorithm has a lot of room for improvement; it should just resample the textures once it has resolved a smoother version of the mesh somewhere.
Oh, this would save me SO much work!! I'm in!
Did you actually use the phone for the processing & building of the images or just for the initial scan?
Thanks for this! Just got this phone and I’m well into a project making maps for a game/app and this is (potentially) gonna make building custom assets so much easier. Thanks for saving me some time with failed experiments lol
How is it going?
Could you please suggest an app to extract LiDAR data in an appropriate format so I can import it into Blender?
Cool! What UI theme is that in Blender?
I'm using a custom theme that I share with my second tier patrons :)
You’ll have to wait for Android phones to add LIDAR technology.
Our geology teacher (I'm studying for a Civil Engineering degree) used a LiDAR scanner (a proper stationary tool rather than a smartphone toy) to scan an entire cliff (about 15 m high). Really incredible stuff, being able to manipulate something from real life on your desktop. Perhaps using a dolly or some stabilisation handle would give better end results with a phone as well.
Hopefully LiDAR will become more common, though I bet people would rather have something dumb like yet another camera ¬_¬
This is one of the most impressive things I've seen in a long time
You haven't seen much, have you?
Is there a program that converts a LiDAR scan to an STL file?
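Most mesh tools (Blender included) can export STL directly, but the format is simple enough that writing one yourself is feasible. A minimal ASCII STL writer for a list of triangles (a sketch, not production code; real scans usually go through a mesh tool for cleanup first):

```python
def write_ascii_stl(path, triangles, name="scan"):
    """Write triangles (each a tuple of three (x, y, z) vertices)
    as an ASCII STL file. Normals are written as (0, 0, 0); most
    slicers recompute them from the vertex winding anyway."""
    with open(path, "w") as f:
        f.write(f"solid {name}\n")
        for tri in triangles:
            f.write("  facet normal 0 0 0\n    outer loop\n")
            for x, y, z in tri:
                f.write(f"      vertex {x:.6f} {y:.6f} {z:.6f}\n")
            f.write("    endloop\n  endfacet\n")
        f.write(f"endsolid {name}\n")

# One example triangle:
write_ascii_stl("tri.stl", [((0, 0, 0), (1, 0, 0), (0, 1, 0))])
```

For printing, the bigger job is making the scan watertight (filling holes, removing non-manifold geometry) before the STL export; the file format itself is the easy part.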
Any app recommended for Android? This looks awesome. Can't wait for this tech to mature enough to give good results on budget phones without needing to buy expensive LiDAR scanners.
Not possible, as it needs a special sensor on your device.
@@pranaykumar6648 I would say it's 100% certain to come, as there is a big push for AR content on smartphones from companies like Apple and Google.
@@shayan-gg True, but the sensors are expensive, so they wouldn't be available on budget smartphones.
I think we can already see that reconstruction from a camera alone will be possible. If we're lucky, it will reach the public soon.
@@danielloo1255 It is already possible, just not in real time, as Curtis mentioned; google "photogrammetry".
This will be the next most-wanted feature for every phone. The precision is acceptable and it's fast. The main problem with photogrammetry is precision: more precision requires more processing power, which means more time, and in many situations we don't have time. When phones become powerful enough, we'll combine both to achieve better quality.