Ultra-Fusion: An Ultra-Resilient Tightly-Coupled Multi-Sensor Fusion SLAM Framework Under Sensor Degradation and Spatiotemporal Uncertainty for Intelligent Transportation Systems

IEEE Transactions on Intelligent Transportation Systems (Under Review)

Anonymous Authors
Anonymous Institution(s)

Robust Fusion for Degraded Multi-Sensor Localization

A unified timestamp-ordered estimator for resilient localization across sensors, platforms, and challenging transportation scenarios.

Ultra-Fusion teaser

TL;DR

Ultra-Fusion is an ultra-resilient tightly-coupled multi-sensor fusion SLAM framework for intelligent transportation systems. It unifies RGB, IMU, depth, wheel, GNSS, and LiDAR measurements within a timestamp-ordered sliding-window estimator, enabling robust localization under sensor degradation and spatiotemporal uncertainty.

Abstract

Reliable localization is a fundamental capability for intelligent transportation systems, including autonomous driving, legged robots, and aerial vehicles. Although multi-sensor fusion has emerged as a promising paradigm for robust localization, practical systems remain fragile when individual sensors degrade (e.g., under poor illumination, LiDAR degeneracy, wheel slippage, or GNSS denial) and when spatiotemporal calibration is imperfect.

To address these challenges, we propose Ultra-Fusion, an ultra-resilient tightly-coupled multi-sensor fusion SLAM framework built upon a unified timestamp-ordered estimator. Ultra-Fusion combines condition-aware initialization, continuous-time LiDAR-inertial geometric modeling, degradation-aware factor scheduling, and online refinement of sensor time offsets and extrinsic parameters. We further introduce the M3DGR benchmark and conduct a large-scale evaluation of 60 representative SLAM systems. Extensive experiments on M3DGR, M2DGR-Plus, KAIST, GrandTour, and MARS-LVIG demonstrate consistent state-of-the-art performance across wheeled robots, legged platforms, and autonomous vehicles.

Contributions

  • We propose Ultra-Fusion, a tightly-coupled multi-sensor fusion SLAM framework that unifies asynchronous sensing streams within a timestamp-ordered sliding-window estimator, enabling tighter cross-modal interaction than modular coordination schemes.
  • We introduce a condition-aware and spatiotemporally consistent estimation strategy that combines dynamic and low-excitation initialization, continuous-time LiDAR modeling, and online refinement of selected time-offset and extrinsic parameters.
  • We develop a degradation-aware tightly-coupled fusion mechanism that performs factor-level scheduling, gating, and down-weighting directly within the unified optimization problem, and we validate it through large-scale benchmarking across M3DGR, M2DGR-Plus, KAIST, GrandTour, and MARS-LVIG.
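As a concrete illustration of the degradation-aware scheduling idea, the sketch below gates implausible factors with a chi-square test and down-weights suspicious ones with a Huber kernel before they enter the optimization. This is a hypothetical simplification for a scalar (1-DoF) residual; the function names, thresholds, and kernel choice are assumptions for exposition, not the paper's actual implementation.

```python
import math

# 95% chi-square threshold for a 1-DoF whitened residual (assumed gate level).
CHI2_GATE_1DOF = 3.84


def huber_weight(residual: float, sigma: float, delta: float = 1.345) -> float:
    """Multiplicative weight in (0, 1] for a whitened residual (Huber kernel)."""
    r = abs(residual) / sigma
    return 1.0 if r <= delta else delta / r


def schedule_factor(residual: float, sigma: float) -> tuple[bool, float]:
    """Gate (drop) outlier factors, down-weight suspicious but plausible ones.

    Returns (keep, weight): keep=False means the factor is excluded from the
    sliding-window problem; otherwise the weight scales its information.
    """
    whitened_sq = (residual / sigma) ** 2
    if whitened_sq > CHI2_GATE_1DOF:  # hard gate: statistically implausible
        return False, 0.0
    return True, huber_weight(residual, sigma)
```

In a real tightly-coupled estimator this decision would be made per factor type (visual, wheel, GNSS, ...) against modality-specific degradation cues, and the robust kernel would typically be applied by the solver itself rather than as an explicit weight.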

Pipeline

Ultra-Fusion pipeline overview

Ultra-Fusion pipeline. Heterogeneous sensor measurements are organized in timestamp order and fused in a common sliding-window estimator with condition-aware initialization, continuous-time LiDAR modeling, degradation-aware factor scheduling, and online spatiotemporal refinement.
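The timestamp ordering underlying the estimator can be sketched as a k-way merge of asynchronous sensor streams into a single chronologically ordered measurement queue. This is a minimal, assumed illustration of the idea (not the paper's code): each stream is already time-sorted, and a heap picks the globally earliest measurement next.

```python
import heapq


def timestamp_ordered(streams):
    """Merge asynchronous sensor streams into one timestamp-ordered queue.

    streams: dict mapping sensor name -> iterable of (timestamp, payload),
    each iterable assumed sorted by timestamp. Yields (timestamp, name,
    payload) in global timestamp order.
    """
    iters = {name: iter(s) for name, s in streams.items()}
    heap = []
    for name, it in iters.items():
        first = next(it, None)
        if first is not None:
            t, payload = first
            heapq.heappush(heap, (t, name, payload))
    while heap:
        t, name, payload = heapq.heappop(heap)
        yield t, name, payload
        nxt = next(iters[name], None)
        if nxt is not None:
            heapq.heappush(heap, (nxt[0], name, nxt[1]))
```

A production estimator would additionally buffer against out-of-order arrival and apply the online-estimated time offsets before ordering, but the chronological merge above is the core invariant the sliding window consumes.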

We evaluate Ultra-Fusion against representative vision-based, LiDAR-based, and fusion-based SLAM systems on challenging benchmark sequences. The proposed framework consistently improves the balance between nominal accuracy and robustness under degradation, especially in scenarios involving visual failure, LiDAR degeneracy, wheel slippage, GNSS denial, and calibration uncertainty.

Trajectory comparison grid (each method plotted against ground truth, GT): vision-based (ORB-SLAM3, VINS-Mono, GVINS), LiDAR-based (CT-LIO, HM-LIO, IESKF-LIO), fusion-based (Fast-LIVO2, Ground-Fusion++, Coco-LIC), and Ultra-Fusion (ours).

To validate deployment-oriented performance, we further evaluate Ultra-Fusion on high-speed driving, legged locomotion, and aerial localization benchmarks. The results show that the proposed framework remains accurate and stable across long-term, degraded, and cross-platform operating conditions, rather than being tuned to a single robot or sensing regime.

Robustness visualization. Representative qualitative examples of multi-sensor localization and mapping under challenging conditions.

SLAM trajectory evaluation

Trajectory evaluation. Quantitative trajectory comparison and error analysis using the evo toolbox.
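The headline metric reported by evo's `evo_ape` is the RMSE of the absolute translational error over timestamp-associated pose pairs. The sketch below reproduces that reduction in plain Python as a simplified illustration: it assumes poses are already associated and skips the SE(3)/Umeyama alignment that `evo_ape --align` performs.

```python
import math


def ate_rmse(gt, est):
    """Root-mean-square translational error over associated pose pairs.

    gt, est: equal-length lists of (x, y, z) positions with matching
    timestamps. Simplified: no trajectory alignment, unlike evo_ape --align.
    """
    assert len(gt) == len(est) and gt, "need equal-length, non-empty inputs"
    sq_errors = [
        (gx - ex) ** 2 + (gy - ey) ** 2 + (gz - ez) ** 2
        for (gx, gy, gz), (ex, ey, ez) in zip(gt, est)
    ]
    return math.sqrt(sum(sq_errors) / len(sq_errors))
```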

Framework demonstration. Illustration of the Ultra-Fusion pipeline and its resilience to sensing degradation and calibration error.

BibTeX

@article{anonymous2026ultrafusion,
  author    = {Anonymous Authors},
  title     = {Ultra-Fusion: An Ultra-Resilient Tightly-Coupled Multi-Sensor Fusion SLAM Framework Under Sensor Degradation and Spatiotemporal Uncertainty for Intelligent Transportation Systems},
  journal   = {IEEE Transactions on Intelligent Transportation Systems},
  year      = {2026},
}