The OAK-D data is not very usable in real time for many robotic purposes, such as avoidance and mapping for drones. You are clearly experiencing the false point-cloud lines at the top of the image, due either to factory misalignment or to using the unwarp mesh. These fake point clouds are lines of points that follow the camera pose and draw lines where there is nothing; they are easy to see wherever there is not enough texture, such as the walls near the ceiling, in the windows, etc. I would say it looks good at first glance, but on closer inspection it is terrible for many use cases. The RealSense D455 doesn't have that effect, but it will tint outdoor scenes with a reddish color. Are you using the OAK-D's internal IMU? Do you get raw data or quaternions? The VIO looks like the most usable part of the video, but a drone cannot carry a laptop; it would be nice to make a video running on an RPi4, Xavier, or Nano, or at a reduced 400p, to see how lower resolution and FPS affect the algorithm's reliability.
Why not colorized point clouds with the OAK-D? The RGB alignment, which crops the depth stream, will fix the false point-cloud lines, at least at 400p; I haven't tried 720p.
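For what it's worth, the cropping idea can be sketched in a few lines of numpy. This is only an illustration, not the SDK's code, and the number of rows to discard is a placeholder you would tune per device:

```python
import numpy as np

def crop_false_lines(depth, top_rows=40):
    """Invalidate the top rows of a depth map, where the false
    point-cloud lines tend to appear. A depth of 0 is treated as
    'no measurement', so these pixels never become 3D points."""
    out = depth.copy()
    out[:top_rows, :] = 0
    return out
```

Any back-projection step that skips zero-depth pixels will then drop the cropped region automatically.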
Hello. Thank you for the tip. The point clouds in this video are computed rather directly from the OAK-D default depth map and used as-is. One benefit of this low-level API is that it is easy for the user to post-process or filter them to best fit their use case. You are also correct that passive stereo devices such as the standard OAK-D (but not necessarily newer / upcoming OAK-D models, such as the Pro-W) have fundamental issues with textureless surfaces, which are very common indoors.
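To illustrate the post-processing point: back-projecting a depth map into a point cloud and range-gating it takes only a few lines of numpy. This is a generic sketch, not the SDK's implementation; the intrinsics (fx, fy, cx, cy) and the depth band are placeholder values:

```python
import numpy as np

def depth_to_pointcloud(depth_m, fx, fy, cx, cy):
    """Back-project a metric depth map (H x W, metres) into an N x 3
    point cloud using the pinhole model; zero-depth pixels are dropped."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]

def range_gate(pts, z_min=0.3, z_max=6.0):
    """Discard points outside the stereo pair's reliable depth band,
    a cheap filter against far-away false points."""
    z = pts[:, 2]
    return pts[(z >= z_min) & (z <= z_max)]
```

More aggressive filters (statistical outlier removal, confidence thresholds) can be layered on top in the same way.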
Hi, can we implement real-time mapping just like in the video with only a RealSense D455 camera? I don't have the OAK-D.
How is it different from HybVIO?
Cool, will that be colored?
Wow