(Preview) MonoNav: MAV Navigation via Monocular Depth Estimation and Reconstruction
- Published Oct 3, 2023
- This is a research preview of the MonoNav system, which enables micro aerial vehicles, or MAVs, to fly in previously unseen environments.
MAVs are often constrained to use lightweight sensors - for perception, this means a tiny monocular camera. This renders many conventional planning techniques unusable. However, with access to offboard computation, MAVs can take advantage of recent advances in single-image depth estimation.
In this work, we show how MAVs can use pre-trained depth estimation tools to build a 3D reconstruction of their environment in real time, enabling the use of powerful planning tools.
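As a rough illustration of the reconstruction step described above, a metric depth map can be back-projected into a camera-frame point cloud with the standard pinhole model. This is a minimal sketch, not MonoNav's actual pipeline; the intrinsics below are illustrative values, not a real calibration.

```python
import numpy as np

def depth_to_pointcloud(depth, fx, fy, cx, cy):
    """Back-project a metric depth map (H x W, meters) into camera-frame
    3D points via the pinhole model: X = (u - cx) * Z / fx, and likewise
    for Y with fy, cy. Returns an (H*W, 3) array of points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# Toy example: a 4x4 depth map of a flat wall 2 m in front of the camera,
# with made-up intrinsics (fx, fy, cx, cy are placeholders).
depth = np.full((4, 4), 2.0)
pts = depth_to_pointcloud(depth, fx=100.0, fy=100.0, cx=2.0, cy=2.0)
print(pts.shape)  # (16, 3)
```

Fusing such per-frame point clouds into a single map (as a full system would) additionally requires camera poses, e.g. from the MAV's odometry.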
Warning: This is a research preview for the Learning Robot Super Autonomy workshop at IROS2023. Work to compare MonoNav to state of the art baseline approaches is underway. Stay tuned!
For more information, please visit:
natesimon.github.io/mononav/
very nice result
Any suggestions on depth estimation for monocular systems without ZoeDepth?
Maybe some feature tracking and matching, recording the displacement of each tracked feature over time?
Something like building a depth map from motion parallax values?
Is ZoeDepth accurate enough? It seems like the reconstructed map has a lot of distortion.
I have a question:
What video card did you use for processing? I have an NVIDIA GeForce GTX 1650 and the frame capture is quite slow.
We used an NVIDIA® GeForce RTX™ 4090
Does ZoeDepth actually provide metric depth information? I thought it only provides a relative depth map?
ZoeDepth aims to output metric depth. We found that it was fairly accurate in hallway environments. Please check out their paper for more information! github.com/isl-org/ZoeDepth
Is it built in Python or C++?
The MonoNav stack is written in Python! We will release the code soon.