Spencer Carmichael
Katherine A. Skinner
specarmi@umich.edu
kskin@umich.edu
Robotics Department, University of Michigan, Ann Arbor
Paper (arXiv)
SLAM Code
B-Spline Code
Thermal cameras offer several advantages for simultaneous localization and mapping (SLAM) with mobile robots: they provide a passive, low-power solution for operating in darkness, are invariant to rapidly changing or high dynamic range illumination, and can see through fog, dust, and smoke. However, uncooled microbolometer thermal cameras, the only practical option in most robotics applications, suffer from significant motion blur, rolling shutter distortions, and fixed pattern noise. In this paper, we present TRGS-SLAM, a 3D Gaussian Splatting (3DGS) based thermal inertial SLAM system uniquely capable of handling these degradations. To overcome the challenges of thermal data, we introduce a model-aware 3DGS rendering method and several general innovations to 3DGS SLAM, including B-spline trajectory optimization with a two-stage IMU loss, view-diversity-based opacity resetting, and pose drift correction schemes. Our system demonstrates accurate tracking on real-world, fast-motion, high-noise thermal data that causes all other tested SLAM methods to fail. Moreover, through offline refinement of our SLAM results, we demonstrate thermal image restoration competitive with prior work that required ground truth poses.
Our prior work, TRNeRF, restored degraded thermal images by incorporating a microbolometer image formation model into the NeRF rendering pipeline. This project translates that idea to 3DGS, extends the fixed pattern noise (FPN) parameterization, and integrates it into a SLAM system. To model blur and rolling shutter, we rasterize multiple sharp images and blend them together using precomputed pixel-wise weights (derived as a discrete integral accounting for lens distortion, rolling shutter, and the thermal time constant). The blended result is then combined with an estimate of FPN (modeled as a time-varying spline) to obtain the final render used in the image loss.
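The blending step above can be sketched as follows. This is a minimal NumPy illustration (the actual system is implemented in PyTorch), and the function name, the additive FPN combination, and the demo values are our assumptions for illustration; the real pipeline derives the weights from the lens distortion, rolling shutter timing, and thermal time constant as described in the text.

```python
import numpy as np

def blend_with_fpn(sharp_renders, weights, fpn):
    """Combine K sharp rasterized images into one degraded render.

    sharp_renders: (K, H, W) images rasterized at K nearby timestamps
    weights:       (K, H, W) precomputed pixel-wise blending weights,
                   normalized to sum to 1 at each pixel
    fpn:           (H, W) fixed pattern noise estimate at this timestamp
    """
    # Weighted blend models motion blur and rolling shutter jointly.
    blurred = (weights * sharp_renders).sum(axis=0)
    # FPN is applied to the blended result to form the final render
    # compared against the captured image in the image loss.
    return blurred + fpn

# Toy usage with random data (shapes only; values are illustrative).
K, H, W = 8, 4, 5
renders = np.random.rand(K, H, W)
weights = np.random.rand(K, H, W)
weights /= weights.sum(axis=0, keepdims=True)  # per-pixel normalization
fpn = 0.01 * np.random.randn(H, W)
degraded = blend_with_fpn(renders, weights, fpn)
```

With normalized weights, identical sharp inputs pass through unchanged, so the blend only redistributes intensity over time rather than creating or destroying it.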
We use a continuous-time trajectory representation formed from two uniform B-splines (one for positions in R^3 and another for rotations in SO(3)). With this continuous trajectory we can rasterize images at arbitrary times and compute a loss directly against high-rate IMU data (without preintegration). We utilize gyroscope data immediately (which we find to be essential in the early mapping stages), and include accelerometer data after IMU initialization. We provide a standalone package for R^d and SO(3) B-spline optimization in PyTorch (link above).
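As background for the trajectory representation, a uniform cubic B-spline in R^d can be evaluated with the standard basis matrix; a minimal NumPy sketch is below. This is textbook material, not the released package: the function names are ours, the SO(3) spline additionally requires the cumulative basis form with exponential-map increments, and the reference implementation is the linked PyTorch package.

```python
import numpy as np

# Uniform cubic B-spline basis matrix (matrix form of De Boor's recursion).
M = (1.0 / 6.0) * np.array([
    [ 1.,  4.,  1., 0.],
    [-3.,  0.,  3., 0.],
    [ 3., -6.,  3., 0.],
    [-1.,  3., -3., 1.],
])

def eval_spline(ctrl, t, dt):
    """Evaluate a uniform cubic B-spline in R^d at time t.

    ctrl: (N, d) control points spaced dt seconds apart
    Returns the (d,) position at time t; each query touches only the
    4 control points of the active segment.
    """
    i = int(np.floor(t / dt))          # index of the active segment
    u = t / dt - i                     # normalized time in [0, 1)
    U = np.array([1.0, u, u * u, u * u * u])
    return U @ M @ ctrl[i : i + 4]

# Collinear, equally spaced control points reproduce a straight line.
ctrl = np.arange(8, dtype=float).reshape(-1, 1)
p = eval_spline(ctrl, 1.5, 1.0)        # halfway through segment 1
```

Because the spline is C2-continuous, angular velocity and linear acceleration follow from analytic first and second derivatives of the same basis, which is what makes a direct (preintegration-free) loss against raw gyroscope and accelerometer samples possible.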
We develop a novel 3DGS mapping scheme that corrects drift rather than overwriting previously observed portions of the map. Additionally, we introduce a scheme to limit opacity resetting to Gaussians with poor view-diversity. This tends to target poorly constrained Gaussians at the edges of the scene, as shown below. Eliminating these Gaussians reduces the map size and prevents them from harming tracking.
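One way to realize view-diversity-based opacity resetting is sketched below. The paper text does not specify the diversity metric, so the scalar used here (one minus the norm of the mean unit viewing direction) and all names are our illustrative assumptions; the idea is only that Gaussians seen from nearly identical directions are poorly constrained and get their opacity reset.

```python
import numpy as np

def view_diversity(dirs):
    """Diversity of unit viewing directions observed for one Gaussian.

    Returns 0 when every observation comes from the same direction and
    approaches 1 as the directions spread over the sphere.
    """
    mean_dir = np.asarray(dirs, dtype=float).mean(axis=0)
    return 1.0 - np.linalg.norm(mean_dir)

def reset_low_diversity_opacity(opacities, dirs_per_gaussian,
                                thresh=0.05, reset_val=0.01):
    """Reset opacity only for Gaussians with poor view-diversity.

    opacities:          (G,) current opacity per Gaussian
    dirs_per_gaussian:  length-G list of (n_i, 3) unit view directions
    """
    out = opacities.copy()
    for g, dirs in enumerate(dirs_per_gaussian):
        if view_diversity(dirs) < thresh:
            out[g] = reset_val  # pruned later if it never re-densifies
    return out

opac = np.array([0.9, 0.9])
views = [
    [[0.0, 0.0, 1.0]] * 5,                    # always seen head-on: reset
    [[1.0, 0.0, 0.0], [-1.0, 0.0, 0.0]],      # opposing views: keep
]
new_opac = reset_low_diversity_opacity(opac, views)
```

Restricting the reset this way leaves well-observed Gaussians untouched, so the periodic reset prunes the edge-of-scene floaters without degrading the converged interior of the map.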
We test on the dataset we introduced in TRNeRF. This dataset includes six sequences combining three camera speeds (slow, medium, and fast) with two scenes (indoor and outdoor). These combinations isolate the targeted degradations: blur and rolling shutter are insignificant in the slow sequences, and FPN is most prominent in the low-contrast indoor scene. All tested methods succeed on the slow outdoor sequence, where the degradations we target are nearly absent. Several methods succeed on the moderately degraded sequences (medium outdoor and slow indoor), but TRGS-SLAM is the only method capable of accurate tracking in the most challenging sequences (fast outdoor, medium indoor, and fast indoor).
RMSE ATE in cm on the TRNeRF dataset
We also show that offline refinement of the TRGS-SLAM results yields restoration performance comparable to TRNeRF, despite the latter relying on ground truth poses.
@misc{trgs_slam_2026,
title={{TRGS-SLAM}: {IMU}-Aided {Gaussian} Splatting {SLAM} for Blurry, Rolling Shutter, and Noisy Thermal Images},
author={Spencer Carmichael and Katherine A. Skinner},
year={2026},
note={arXiv:2603.20443}}