Unmanned Aircraft Systems (UAS) must determine their position with high
accuracy and continuity, even in degraded environments. Traditional localisation approaches
combine data from Global Navigation Satellite Systems (GNSS) and Inertial Measurement Units
(IMUs), which together provide reliable navigation under nominal conditions. However, the increasing
sophistication of jamming and spoofing threats has exposed the dependence of these systems
on external signals, creating a demand for alternative methods that can ensure precise navigation
when GNSS data becomes unreliable or unavailable. To address this challenge, this work explores
the integration of classical navigation sensors with artificial intelligence techniques to enhance
navigation precision and robustness in complex environments. The proposed framework
combines IMU and GNSS information with visual data processed through deep learning algorithms
for odometry estimation and map correlation. All measurements are subsequently fused within
an Extended Kalman Filter (EKF), which estimates the vehicle state and
dynamically balances sensor contributions according to their estimated reliability. The resulting
system enables UAS to adaptively select the most accurate and stable source of navigation data
depending on mission context, terrain visibility, and environmental conditions. Beyond the technical
contribution, this approach aims to reduce operational dependency on external infrastructure
while improving safety in autonomous flight missions. The proposed architecture is systematically
evaluated by comparing different sensor configurations, using the classical GNSS+IMU solution as a
reference baseline. This controlled assessment allows the contribution and limitations of visual aiding
to be clearly quantified relative to standard navigation performance. The results demonstrate
the feasibility of deploying visual-aided navigation as a resilient complementary component within
small UAS Positioning, Navigation, and Timing (PNT) architectures, while identifying robustness
to visual outliers as a key avenue for further performance enhancement.
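The reliability-weighted EKF fusion described above can be sketched in a minimal form. The snippet below shows a generic EKF measurement update together with an innovation-based covariance inflation step that down-weights a sensor when its measurements disagree with the filter's prediction; the function names, the chi-square gate, and the inflation rule are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """Standard EKF measurement update for state x with covariance P,
    measurement z, measurement function h, Jacobian H, and noise R."""
    y = z - h(x)                       # innovation (measurement residual)
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new, y, S

def adaptive_R(R, y, S, gate=9.0):
    """Illustrative reliability weighting: if the normalized innovation
    squared (NIS) exceeds a chi-square gate, inflate R so the offending
    sensor (e.g. a visual-odometry outlier) contributes less."""
    nis = float(y.T @ np.linalg.inv(S) @ y)
    if nis > gate:
        R = R * (nis / gate)           # scale up measurement noise
    return R

# Toy usage: a 2-D position state observed directly (H = I).
x, P = np.zeros(2), np.eye(2)
z, R = np.array([1.0, 0.5]), 0.1 * np.eye(2)
x1, P1, y, S = ekf_update(x, P, z, lambda v: v, np.eye(2), R)
R_outlier = adaptive_R(R, np.array([5.0, 5.0]), S)  # inflated
```

In a full pipeline, the inflated covariance would be fed back into the next update, so GNSS, IMU, and visual measurements are each weighted according to how consistent they are with the predicted state.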

