CEAS EuroGNC 2026 Conference on Guidance, Navigation & Control
Event-Only Framework for Simultaneous Localization and Mapping in space using Gaussian Splatting
Piercarlo Fontana 1, Annachiara Ippolito 1,*, Michele Maestrini 1,*
1: Politecnico di Milano, Milan
*: Corresponding author

Event cameras are a novel class of vision sensors that operate fundamentally differently from traditional frame-based cameras. Instead of capturing images at fixed intervals, they asynchronously record changes in pixel intensity as individual events, each triggered when a predefined brightness-change threshold is exceeded. This data acquisition paradigm offers extremely high temporal resolution, low latency, low power consumption, and an inherently high dynamic range, making event cameras particularly well suited to visually and dynamically challenging environments. One of their primary applications is Simultaneous Localization and Mapping (SLAM), where the goal is to estimate the trajectory of a moving camera while simultaneously reconstructing a 3D map of the environment. In this paper, Gaussian Splatting, originally developed for photorealistic 3D scene reconstruction, is integrated into an event-based SLAM pipeline with a particular focus on space scenarios such as autonomous docking. By leveraging temporally aggregated event representations and modeling the scene as a collection of volumetric Gaussian primitives, the approach aims to provide robust, accurate, and continuous 3D mapping from sparse visual input. The system is designed for incremental operation, ensuring consistent scene reconstruction and reliable pose estimation even under the extreme lighting and motion conditions typical of orbital proximity operations.
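The two sensor-side ideas the abstract relies on, per-pixel events fired when a log-intensity change crosses a contrast threshold, and temporal aggregation of the event stream into frame-like representations, can be sketched as follows. This is a minimal illustration, not the paper's pipeline: the function names, the contrast threshold value, and the one-event-per-pixel-per-frame simplification are assumptions for the sketch.

```python
import numpy as np

def simulate_events(frames, timestamps, C=0.2):
    """Emit events (x, y, t, polarity) whenever a pixel's log-intensity
    has changed by at least the contrast threshold C since its last event.
    Simplification (assumption): at most one event per pixel per frame."""
    ref = np.log(frames[0].astype(np.float64) + 1e-3)  # reference log intensity
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_i = np.log(frame.astype(np.float64) + 1e-3)
        diff = log_i - ref
        ys, xs = np.nonzero(np.abs(diff) >= C)
        for x, y in zip(xs, ys):
            p = 1 if diff[y, x] > 0 else -1   # polarity of the brightness change
            events.append((int(x), int(y), t, p))
            ref[y, x] += p * C                # advance reference by one threshold step
    return events

def aggregate_events(events, shape):
    """Temporally aggregate an event slice into a signed 'event frame'
    by accumulating polarities per pixel."""
    img = np.zeros(shape, dtype=np.float64)
    for x, y, _, p in events:
        img[y, x] += p
    return img
```

Such aggregated event frames are what a splatting-based mapper can consume in place of conventional images, since each one summarizes brightness change over a short time window rather than absolute intensity.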

