Event-Based Vision for Robust SLAM: An Evaluation Using Hyper E2VID Event Reconstruction Algorithm
Hamza Anwar
Additional contact information
Hamza Anwar: Department of Electronic Engineering, Balochistan University of IT, Engineering and Management Sciences, Quetta, Pakistan
International Journal of Innovations in Science & Technology, 2024, vol. 6, Special Issue 7, 131-145
Abstract:
This paper investigates the limitations of traditional visual sensors in challenging environments by integrating event-based cameras with visual SLAM (Simultaneous Localization and Mapping). The work presents a novel comparison between a visual-only SLAM implementation using the state-of-the-art HyperE2VID reconstruction method and conventional frame-based SLAM. Traditional cameras struggle in scenarios involving low dynamic range and motion blur; event-based cameras address these limitations by offering high temporal resolution and robustness in such conditions. The study employs the HyperE2VID algorithm to reconstruct event frames from event data, which are then processed through the SLAM pipeline and compared with conventional frames. Performance metrics, including Absolute Pose Error (APE) and feature tracking performance, were evaluated by contrasting visual SLAM implementations on reconstructed images against those from traditional cameras across three event camera dataset sequences: Dynamic-6DoF, Poster-6DoF, and Slider depth. Experimental results demonstrate that event-based cameras yield higher-quality reconstructions, significantly outperforming conventional cameras, especially in scenarios marked by motion blur and low dynamic range. Among the tested sequences, the Poster-6DoF sequence exhibited the best performance due to its information-rich scenes, while the Slider depth sequence faced challenges related to drag and scaling because it lacked rotational motion. Although the APE values for the Slider depth sequence were the lowest, its trajectory still drifted. In contrast, the Poster-6DoF sequence displayed superior overall performance, with reconstructions closely aligning with those produced by conventional camera-based SLAM. The Dynamic-6DoF sequence showed the poorest performance, marked by high absolute pose error and trajectory drift. Overall, these findings highlight the substantial improvements that event-based cameras can bring to SLAM systems operating in challenging environments characterized by motion blur and low dynamic range.
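To make the evaluation metric named above concrete, the sketch below shows a minimal, illustrative computation of translational Absolute Pose Error (APE) between time-associated estimated and ground-truth trajectories, assuming a simple rigid (Umeyama-style) alignment. This is not the authors' evaluation code; the function names (align_umeyama, absolute_pose_error) and the toy trajectories are hypothetical, and the numbers are for illustration only.

```python
# Illustrative sketch (not the authors' code): translational APE between an
# estimated and a ground-truth trajectory, each given as an Nx3 array of
# positions that are already time-associated.
import numpy as np

def align_umeyama(est: np.ndarray, gt: np.ndarray):
    """Least-squares rigid alignment (rotation + translation) of est onto gt."""
    mu_est, mu_gt = est.mean(axis=0), gt.mean(axis=0)
    cov = (gt - mu_gt).T @ (est - mu_est) / est.shape[0]
    U, _, Vt = np.linalg.svd(cov)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # guard against reflections
    R = U @ S @ Vt
    t = mu_gt - R @ mu_est
    return R, t

def absolute_pose_error(est: np.ndarray, gt: np.ndarray) -> float:
    """Translational APE (RMSE) after rigid alignment of the estimate onto ground truth."""
    R, t = align_umeyama(est, gt)
    est_aligned = est @ R.T + t
    errors = np.linalg.norm(est_aligned - gt, axis=1)
    return float(np.sqrt(np.mean(errors ** 2)))

if __name__ == "__main__":
    # Toy example: ground truth is a short arc, the estimate carries small noise.
    rng = np.random.default_rng(0)
    theta = np.linspace(0.0, np.pi / 4, 100)
    gt = np.stack([np.cos(theta), np.sin(theta), 0.01 * theta], axis=1)
    est = gt + rng.normal(scale=0.01, size=gt.shape)
    print(f"APE (trans. RMSE): {absolute_pose_error(est, gt):.4f} m")
```

Trajectory-evaluation tools such as evo perform equivalent alignment and APE computation on full SE(3) pose sequences; the sketch above only covers the translational component.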
Keywords: SLAM; Event Camera; Neuromorphic; Feature Detection; Computer Vision
Date: 2024
Downloads:
https://journal.50sea.com/index.php/IJIST/article/view/1097/1639 (application/pdf)
https://journal.50sea.com/index.php/IJIST/article/view/1097 (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:abq:ijist1:v:6:y:2024:i:7:p:131-145