Robust Fusion for Degraded Multi-Sensor Localization
A unified timestamp-ordered estimator for resilient localization across sensors, platforms, and challenging transportation scenarios.
Ultra-Fusion is an ultra-resilient tightly-coupled multi-sensor fusion SLAM framework for intelligent transportation systems. It unifies RGB, IMU, depth, wheel, GNSS, and LiDAR measurements within a timestamp-ordered sliding-window estimator, enabling robust localization under sensor degradation and spatiotemporal uncertainty.
Reliable localization is a fundamental capability for intelligent transportation systems, spanning autonomous driving, legged robots, and aerial vehicles. Although multi-sensor fusion has become a promising paradigm for robust localization, practical systems remain fragile under sensor degradation, such as poor illumination, LiDAR degeneracy, wheel slippage, or GNSS denial, and under imperfect spatiotemporal calibration.
To address these challenges, we propose Ultra-Fusion, an ultra-resilient tightly-coupled multi-sensor fusion SLAM framework built upon a unified timestamp-ordered estimator. Ultra-Fusion combines condition-aware initialization, continuous-time LiDAR-inertial geometric modeling, degradation-aware factor scheduling, and online refinement of sensor time offsets and extrinsic parameters. We further introduce the M3DGR benchmark and conduct a large-scale evaluation of 60 representative SLAM systems. Extensive experiments on M3DGR, M2DGR-Plus, KAIST, GrandTour, and MARS-LVIG demonstrate consistent state-of-the-art performance across wheeled robots, legged platforms, and autonomous vehicles.
Ultra-Fusion pipeline. Heterogeneous sensor measurements are organized in timestamp order and fused in a common sliding-window estimator with condition-aware initialization, continuous-time LiDAR modeling, degradation-aware factor scheduling, and online spatiotemporal refinement.
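To make the timestamp-ordered fusion idea concrete, here is a minimal sketch of how heterogeneous measurements could be merged into strict chronological order before entering a sliding-window estimator. All names (`Measurement`, `TimestampOrderedBuffer`) are hypothetical illustrations, not the paper's actual implementation; the release rule (only emit measurements older than the slowest sensor's newest stamp) is one common way to guard against out-of-order arrivals.

```python
import heapq
from dataclasses import dataclass, field
from typing import Any, List

@dataclass(order=True)
class Measurement:
    stamp: float
    sensor: str = field(compare=False)
    data: Any = field(compare=False, default=None)

class TimestampOrderedBuffer:
    """Min-heap buffer that releases measurements in strict timestamp order.

    A measurement is released only once every sensor has reported a newer
    stamp, so a late-arriving packet cannot slip past the estimator.
    """
    def __init__(self, sensors):
        self.heap: List[Measurement] = []
        self.latest = {s: float("-inf") for s in sensors}

    def push(self, m: Measurement) -> None:
        heapq.heappush(self.heap, m)
        self.latest[m.sensor] = max(self.latest[m.sensor], m.stamp)

    def pop_ready(self) -> List[Measurement]:
        # Safe horizon: the newest stamp of the slowest sensor.
        horizon = min(self.latest.values())
        out = []
        while self.heap and self.heap[0].stamp <= horizon:
            out.append(heapq.heappop(self.heap))
        return out
```

In this sketch, an IMU sample stamped 0.02 s would be held back until the LiDAR stream has also advanced past 0.02 s, at which point both are handed to the estimator in timestamp order.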
M3DGR / M2DGR-Plus

We evaluate Ultra-Fusion against representative vision-based, LiDAR-based, and fusion-based SLAM systems on challenging benchmark sequences. The proposed framework consistently improves the balance between nominal accuracy and robustness under degradation, especially in scenarios involving visual failure, LiDAR degeneracy, wheel slippage, GNSS denial, and calibration uncertainty.

KAIST / GrandTour / MARS-LVIG

To validate deployment-oriented performance, we further evaluate Ultra-Fusion on high-speed driving, legged locomotion, and aerial localization benchmarks. The results show that the proposed framework remains accurate and stable across long-term, degraded, and cross-platform operating conditions, rather than being tuned to a single robot or sensing regime.
Robustness visualization. Representative qualitative examples of multi-sensor localization and mapping under challenging conditions.
Trajectory evaluation. Quantitative trajectory comparison and error analysis using the evo toolbox.
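The evo toolbox reports metrics such as absolute pose error after rigid alignment. As a self-contained illustration of what that metric computes, here is a minimal sketch of translational ATE RMSE with a Kabsch/Umeyama-style SE(3) alignment, assuming the two position sequences are already time-synchronized; the function name `ate_rmse` is our own, not part of evo's API.

```python
import numpy as np

def ate_rmse(gt: np.ndarray, est: np.ndarray) -> float:
    """Translational ATE RMSE after rigid (SE(3)) alignment of est onto gt.

    gt, est: (N, 3) arrays of time-synchronized positions.
    """
    mu_g, mu_e = gt.mean(axis=0), est.mean(axis=0)
    G, E = gt - mu_g, est - mu_e
    # Kabsch: rotation R minimizing ||R @ E.T - G.T|| in least squares.
    H = E.T @ G
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    aligned = (R @ E.T).T + mu_g
    err = np.linalg.norm(aligned - gt, axis=1)
    return float(np.sqrt(np.mean(err ** 2)))
```

A rigidly transformed copy of the ground-truth trajectory should score (numerically) zero under this metric, since the alignment step removes any global rotation and translation offset.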
Framework demonstration. Illustration of the Ultra-Fusion pipeline and its resilience to sensing degradation and calibration error.
@article{anonymous2026ultrafusion,
author = {Anonymous Authors},
title = {Ultra-Fusion: An Ultra-Resilient Tightly-Coupled Multi-Sensor Fusion SLAM Framework Under Sensor Degradation and Spatiotemporal Uncertainty for Intelligent Transportation Systems},
journal = {IEEE Transactions on Intelligent Transportation Systems},
year = {2026},
}