Event-based Robot Vision
Joined 14 Oct 2020
Here you will find the videos from the Course "Event-based Robot Vision", taught during the Spring Semester at TU Berlin, Germany. The course is an introduction to the topic of event-based cameras and processing.
Reference text: doi.org/10.1109/TPAMI.2020.3008413
EPBA: Event-based Photometric Bundle Adjustment
Project page: github.com/tub-rip/epba
PDF: arxiv.org/pdf/2412.14111
Dataset: github.com/tub-rip/ECRot
We tackle the problem of bundle adjustment (i.e., simultaneous refinement of camera poses and scene map) for a purely rotating event camera. Starting from first principles, we formulate the problem as a classical non-linear least squares optimization. The photometric error is defined using the event generation model directly in the camera rotations and the semi-dense scene brightness that triggers the events. We leverage the sparsity of event data to design a tractable Levenberg-Marquardt solver that handles the very large number of variables involved. To the best of our knowledge, our method, which we call Event-based Photometric Bundle Adjustment (EPBA), is the first event-only photometric bundle adjustment method that works on the brightness map directly and exploits the space-time characteristics of event data, without having to convert events into image-like representations. Comprehensive experiments on both synthetic and real-world datasets demonstrate EPBA’s effectiveness in decreasing the photometric error (by up to 90%), yielding results of unparalleled quality. The refined maps reveal details that were hidden using prior state-of-the-art rotation-only estimation methods. The experiments on modern high-resolution event cameras show the applicability of EPBA to panoramic imaging in various scenarios (without map initialization, at multiple resolutions, and in combination with other methods, such as IMU dead reckoning or previous event-based rotation estimation methods). We make the source code publicly available.
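The abstract formulates the problem as a classical non-linear least-squares optimization solved with Levenberg-Marquardt. As a hedged illustration of that solver class only (a toy 1-D curve fit, not EPBA's photometric objective or its sparsity-exploiting normal equations), a minimal LM loop looks like:

```python
import numpy as np

def levenberg_marquardt(residual, jac, x0, iters=50, lam=1e-3):
    """Minimal Levenberg-Marquardt loop: minimize 0.5 * ||r(x)||^2."""
    x = x0.copy()
    for _ in range(iters):
        r = residual(x)
        J = jac(x)
        H = J.T @ J + lam * np.eye(x.size)      # damped Gauss-Newton Hessian
        step = np.linalg.solve(H, -J.T @ r)
        if residual(x + step) @ residual(x + step) < r @ r:
            x, lam = x + step, lam * 0.5        # accept step, trust model more
        else:
            lam *= 10.0                         # reject step, increase damping
    return x

# Toy problem: fit y = a * exp(b * t) to samples generated with a=2, b=1.5.
t = np.linspace(0, 1, 30)
y = 2.0 * np.exp(1.5 * t)
res = lambda p: p[0] * np.exp(p[1] * t) - y
jac = lambda p: np.stack([np.exp(p[1] * t), p[0] * t * np.exp(p[1] * t)], axis=1)
p = levenberg_marquardt(res, jac, np.array([1.0, 1.0]))
```

In EPBA the state would be the camera rotations plus the semi-dense brightness map, and the system would be kept sparse; here `H` is a dense 2x2 purely for clarity.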
Reference:
Shuang Guo and Guillermo Gallego,
Event-based Photometric Bundle Adjustment,
(under review), 2024.
Affiliations:
Technical University of Berlin (Berlin, Germany),
Robotics Institute Germany (RIG), www.robotics-institute-germany.de/
Science of Intelligence Excellence Cluster (Berlin, Germany), www.scienceofintelligence.de/
Einstein Center Digital Future (Berlin, Germany), www.digital-future.berlin/en/
Event-based Vision:
- CMax-SLAM: github.com/tub-rip/cmax_slam
- EMBA: github.com/tub-rip/emba
- Research: sites.google.com/view/guillermogallego/research/event-based-vision
- Survey paper: arxiv.org/abs/1904.08405
- Course at TU Berlin: sites.google.com/view/guillermogallego/teaching/event-based-robot-vision
Views: 149
Videos
EMBA: Event-based Mosaicing Bundle Adjustment (ECCV 2024)
236 views · 4 months ago
Project page: github.com/tub-rip/emba PDF: arxiv.org/pdf/2409.07365 Dataset: github.com/tub-rip/ECRot We tackle the problem of mosaicing bundle adjustment (i.e., simultaneous refinement of camera orientations and scene map) for a purely rotating event camera. We formulate the problem as a regularized non-linear least squares optimization. The objective function is defined using the linearized e...
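The truncated sentence refers to the linearized event generation model: an event of polarity p fires when the log-brightness increment at a pixel reaches the contrast threshold C, and under brightness constancy the increment is linearized as dL ≈ -grad(L)·v·dt. A hedged numeric sketch with made-up values (the gradient, velocity, and threshold below are illustrative assumptions, not EMBA's map parametrization):

```python
import numpy as np

# Linearized event generation model: dL/dt ≈ -grad(L) · v  (brightness constancy)
grad_L = np.array([0.8, -0.2])   # spatial log-brightness gradient at a pixel (assumed)
v = np.array([-3.0, 1.0])        # image-plane velocity in px/s (assumed)
C = 0.2                          # contrast sensitivity threshold (assumed)

dL_dt = -grad_L @ v              # predicted log-brightness rate: 2.6 here
polarity = 1 if dL_dt > 0 else -1   # ON event: brightness increasing
dt_next = C / abs(dL_dt)         # time until the increment reaches the threshold
```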
On the Benefits of Visual Stabilization for Frame- and Event-based Perception (RA-L 2024)
241 views · 5 months ago
Vision-based perception systems are typically exposed to large orientation changes in different robot applications. In such conditions, their performance might be compromised due to the inherent complexity of processing data captured under challenging motion. Integration of mechanical stabilizers to compensate for the camera rotation is not always possible due to the robot payload constraints. T...
ES-PTAM: Event-based Stereo Parallel Tracking and Mapping (ECCVW 2024)
357 views · 5 months ago
Project page: github.com/tub-rip/ES-PTAM Paper: arxiv.org/pdf/2408.15605 Visual Odometry (VO) and SLAM are fundamental components for spatial perception in mobile robots. Despite enormous progress in the field, current VO/SLAM systems are limited by their sensors’ capability. Event cameras are novel visual sensors that offer advantages to overcome the limitations of standard cameras, enabling r...
Motion-prior Contrast Maximization for Dense Continuous-Time Motion Estimation (ECCV 2024)
427 views · 6 months ago
Project page: github.com/tub-rip/MotionPriorCMax PDF: arxiv.org/pdf/2407.10802 Dataset (20GB): drive.google.com/file/d/1YRlvjl0BiNxSQ-jZlw7QcebiBe7Zm2l9/view?usp=drive_link Current optical flow and point-tracking methods rely heavily on synthetic datasets. Event cameras are novel vision sensors with advantages in challenging visual conditions, but state-of-the-art frame-based methods cannot be ...
CMax-SLAM: Event-based Rotational-Motion Bundle Adjustment and SLAM System using Contrast Max (TRO)
767 views · 10 months ago
Project page (Code): github.com/tub-rip/cmax_slam PDF: arxiv.org/pdf/2403.08119 Dataset: github.com/tub-rip/ECRot Event cameras are bio-inspired visual sensors that capture pixel-wise intensity changes and output asynchronous event streams. They show great potential over conventional cameras to handle challenging scenarios in robotics and computer vision, such as high-speed and high dynamic ran...
Event based Background-Oriented Schlieren (TPAMI 2023) (5 min)
1.5K views · 1 year ago
Project page: github.com/tub-rip/event_based_bos PDF: doi.org/10.1109/TPAMI.2023.3328188 Schlieren imaging is an optical technique to observe the flow of transparent media, such as air or water, without any particle seeding. However, conventional frame-based techniques require both high spatial and temporal resolution cameras, which impose bright illumination and expensive computation limitatio...
Event Penguins: Low-power, Continuous Remote Behavioral Localization with Event Cameras (CVPR 2024)
585 views · 1 year ago
Project page: tub-rip.github.io/eventpenguins/ www.scienceofintelligence.de/news-scioi-in-antarctica/ Researchers in natural science need reliable methods for quantifying animal behavior. Recently, numerous computer vision methods emerged to automate the process. However, observing wild species at remote locations remains a challenging task due to difficult lighting conditions and constraints o...
Event based Background-Oriented Schlieren (TPAMI 2023)
285 views · 1 year ago
Project page: github.com/tub-rip/event_based_bos PDF: doi.org/10.1109/TPAMI.2023.3328188 Schlieren imaging is an optical technique to observe the flow of transparent media, such as air or water, without any particle seeding. However, conventional frame-based techniques require both high spatial and temporal resolution cameras, which impose bright illumination and expensive computation limitatio...
A Fast Geometric Regularizer to Mitigate Event Collapse in the Contrast Maximization Framework
424 views · 2 years ago
Project page: github.com/tub-rip/event_collapse PDF: arxiv.org/pdf/2212.07350 Event cameras are emerging vision sensors and their advantages are suitable for various applications such as autonomous robots. Contrast maximization (CMax), which provides state-of-the-art accuracy on motion estimation using events, may suffer from an overfitting problem called event collapse. Prior works are computa...
Optical Flow Estimation using Event Cameras: What is Motion? (ViEW 2022)
1.7K views · 2 years ago
Optical Flow Estimation using Event Cameras. What is Motion? Odawara Prize: Best presentation award at ViEW 2022. Code: github.com/tub-rip/event_based_optical_flow Dec 8th, 2022 Summary: We introduce event cameras and the latest event-based optical flow estimation methods. Event cameras are novel vision sensors that respond asynchronously to the motion of edges in image space and offer advantages such as high speed, high dynamic range, and low power consumption. Their asynchronous, sparse data require algorithms to be rethought; following recent achievements in image-based machine learning, event-based optical flow methods have rushed to combine such image-based approaches...
Event-based Stereo Depth for SLAM in Autonomous Driving (BADUE Workshop at IROS 2022)
999 views · 2 years ago
Project page (Code): github.com/tub-rip/dvs_mcemvs PDF: arxiv.org/pdf/2207.10494 Workshop page: gamma.umd.edu/workshops/badue22/ Event cameras are bio-inspired sensors that offer advantages over traditional cameras. They operate asynchronously, sampling the scene at microsecond resolution and producing a stream of brightness changes.This unconventional output has sparked novel computer vision m...
Multi-Event-Camera Depth Estimation and Outlier Rejection by Refocused Events Fusion (AISY 2022)
2.7K views · 2 years ago
Project page (Code): github.com/tub-rip/dvs_mcemvs PDF: arxiv.org/pdf/2207.10494 Advanced Science News: www.advancedsciencenews.com/silicon-retinas-to-help-robots-navigate-the-world Presentation at IEEE MFI 2022: th-cam.com/video/MaRoJ16obdI/w-d-xo.html Event cameras are bio-inspired sensors that offer advantages over traditional cameras. They operate asynchronously, sampling the scene at micro...
Secrets of Event-Based Optical Flow (ECCV 2022, Oral)
3.2K views · 2 years ago
PDF: arxiv.org/pdf/2207.10022 Code: github.com/tub-rip/event_based_optical_flow Poster: drive.google.com/file/d/1mF-mM4teb8A9bKJJiQwN7IFsGsRIsRaX/view Event cameras respond to scene dynamics and offer advantages to estimate motion. Following recent image-based deep-learning achievements, optical flow estimation methods for event cameras have rushed to combine those image-based methods with even...
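The contrast-maximization recipe referenced in this and the surrounding entries warps events along a candidate motion, accumulates them into an image of warped events (IWE), and scores sharpness, e.g. by image variance. A minimal sketch, assuming a single global flow and synthetic events (the paper's per-pixel, regularized formulations are far richer):

```python
import numpy as np

def iwe_variance(events, flow, shape=(32, 32)):
    """Warp events (x, y, t) to t=0 with a global flow; score IWE contrast."""
    x = events[:, 0] - flow[0] * events[:, 2]
    y = events[:, 1] - flow[1] * events[:, 2]
    xi = np.clip(np.round(x).astype(int), 0, shape[1] - 1)
    yi = np.clip(np.round(y).astype(int), 0, shape[0] - 1)
    iwe = np.zeros(shape)
    np.add.at(iwe, (yi, xi), 1.0)   # accumulate event count per pixel
    return iwe.var()                # sharper image -> higher variance

# Toy data: a vertical edge at x=10 moving right at 5 px/s generates events.
rng = np.random.default_rng(0)
t = rng.uniform(0, 1, 500)
ev = np.stack([10 + 5 * t, rng.uniform(0, 32, 500), t], axis=1)

# Grid search over candidate horizontal flows; the correct flow (vx = 5)
# collapses all events back onto one column, maximizing the contrast.
cands = np.arange(0.0, 11.0, 1.0)
best = max(cands, key=lambda vx: iwe_variance(ev, (vx, 0.0)))
```

Real systems replace the grid search with gradient-based optimization over richer motion models (rotations, per-pixel flow).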
Exercise 5: result of spatial convolutions on real event data (bicycle sequence)
411 views · 4 years ago
Exercise 7: panoramic image reconstruction with a rotating event camera (i.e., Mosaicing)
481 views · 4 years ago
Event cameras: Sampling in time vs in range
1.4K views · 4 years ago
Event-based Optical Flow: model-based methods
1.7K views · 4 years ago
Feature Detection and Tracking with a DAVIS
883 views · 4 years ago
Exercise 4: event integrator (ordinary or "direct" integrator)
885 views · 4 years ago
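The ordinary or "direct" integrator of Exercise 4 reconstructs relative log-brightness by summing signed contrast steps per pixel: L[y, x] += p * C for each event. A minimal sketch with toy events (the threshold C = 0.2 is an assumed value; real data would also need noise handling and a leak/decay term):

```python
import numpy as np

def integrate_events(events, C=0.2, shape=(4, 4)):
    """Direct event integrator: accumulate polarity * C at each event's pixel."""
    L = np.zeros(shape)                 # relative log-brightness map
    for x, y, t, p in events:           # p is +1 (ON) or -1 (OFF)
        L[int(y), int(x)] += p * C
    return L

# Three ON events and one OFF event at pixel (row 1, col 2):
events = [(2, 1, 0.01, +1), (2, 1, 0.02, +1), (2, 1, 0.05, +1), (2, 1, 0.09, -1)]
L = integrate_events(events)
# Net change at (1, 2): (3 - 1) * 0.2 = 0.4
```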
Methods of Event Processing
1.6K views · 4 years ago
Event-based Robot Vision © Guillermo Gallego 2020 Slides: drive.google.com/file/d/1oaFGwqtIV3eSMa-xHUAnh6BJYMk2t02m/view?usp=sharing Overview paper: arxiv.org/abs/1904.08405 Research on Event-based Vision: sites.google.com/view/guillermogallego/research/event-based-vision?authuser=0
Event-based cameras: Advantages, Disadvantages and Challenges
1.6K views · 4 years ago
Comparison of five event-based feature trackers
474 views · 4 years ago
Tracking motion-compensated event features
609 views · 4 years ago
Image Reconstruction Methods 2018 - 2020.05
622 views · 4 years ago
Tracking blobs of events. Ingredients of the per-event processing paradigm
871 views · 4 years ago
Event-based Optical Flow: learning-based methods and comparison
930 views · 4 years ago
How much does quality decrease when used at night?
A very interesting application for building panoramic images. The result resembles a depth map, and algorithms for detecting vertical walls and ceilings will likely work well here, enabling the automatic creation of 3D rooms, similar to how it is done with 360 cameras.
That's sooo cool!
How did you do the detection?
How can I process the event data on ESIM? I'm trying to detect a QR code, please.
How can I find Dr. Gallego to ask some questions?
Very interesting 👍🏼
The so far most adorable research on event-based cameras;)
Good day. Do you have an email? How can I write to you?
The quality (density and uncertainty) of the depth map looks great!
Hello professor, many thanks for these lectures. I really like them and they are a big help. Is it possible to arrange your videos in order? I tried my best, but it's not easy to do it myself. Thanks :)
Hello. There is this playlist: th-cam.com/play/PL03Gm3nZjVgUFYUh3v5x8jVonjrGfcal8.html Best regards
Hi, can someone please recommend software or an IDE to work on the DAVIS240C/DVS dataset, for a simple motion-sensing algorithm?
nice talk!
Great works! And I learn a lot from your videos. Thank you so much for sharing them!