CEAS EuroGNC 2026 Conference on Guidance, Navigation & Control
AI-Based Asynchronous Sensor Data Fusion for Autonomous Relative Navigation around Asteroids
Iain Hall (1), Jinglang Feng (1), Jesus Gil Fernandez (2), Massimiliano Vasile (1)
1 : University of Strathclyde, Glasgow
2 : ESA - ESTEC (Netherlands)

The estimation of relative position and attitude (pose) is necessary for enabling autonomous close-proximity operations of spacecraft visiting asteroids. This can be achieved by tracking surface features of the target asteroid, but tracking these features is challenging due to the extreme illumination conditions and uncertainties in the target's motion and shape. To address these challenges, we apply Deep Learning (DL) networks to track keypoints in visible and thermal images of an asteroid. We then develop a factor-graph-based Visual Odometry (VO) method that estimates the relative pose of the spacecraft using the tracked keypoints and a LIDAR range measurement to remove scale ambiguity. Sensor data fusion of visible and thermal images is achieved using the factor graph, providing robustness to the extreme illumination conditions. We show that DL and factor-graph-based VO are able to accurately estimate the relative pose at low sun phase angles, while thermal images and sensor fusion provide robustness at higher sun phase angles. Finally, we test the developed method on real images of Ryugu from the spacecraft Hayabusa2, and successfully demonstrate that the DL networks can adapt to real images.
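The scale-ambiguity removal mentioned above can be illustrated with a minimal sketch (hypothetical, not the authors' implementation): monocular VO recovers the inter-frame translation only up to an unknown scale, and a single metric LIDAR range to a tracked surface point is enough to fix that scale. The function name and variables below are illustrative assumptions.

```python
# Minimal sketch of resolving monocular-VO scale ambiguity with a LIDAR
# range measurement. All names here are hypothetical, for illustration only.

def resolve_scale(t_vo, depth_vo, lidar_range):
    """Rescale a unit-less VO translation to metric units.

    t_vo        : VO translation estimate, in arbitrary VO units
    depth_vo    : depth of the ranged surface point, in the same VO units
    lidar_range : measured metric range to that point (metres)
    """
    s = lidar_range / depth_vo        # metres per VO unit
    return [s * c for c in t_vo]      # metric translation

# Example: VO reports a motion of (0.1, 0.0, 0.5) VO units, the ranged
# keypoint sits at depth 2.0 VO units, and the LIDAR measures 500 m.
t_metric = resolve_scale([0.1, 0.0, 0.5], 2.0, 500.0)
```

In the factor-graph formulation, the same information would instead enter as a range factor constraining the pose variables jointly with the visible- and thermal-keypoint reprojection factors, rather than as a one-shot rescaling.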

