Pou-Chun Kung
Skanda Harisha
Ram Vasudevan
Aline Eid
Katherine A. Skinner
pckung@umich.edu
skandah@umich.edu
ramv@umich.edu
alineeid@umich.edu
kskin@umich.edu
University of Michigan, Ann Arbor
Paper
GitHub (Coming Soon)
RadarSplat enables radar 2D-to-3D scene reconstruction, image synthesis, and occupancy estimation. It outperforms the state-of-the-art neural rendering method both qualitatively and quantitatively while also enabling radar inverse rendering.
TL;DR: We introduce RadarSplat, a novel Radar Gaussian Splatting method for scene reconstruction, novel view synthesis, and inverse rendering.

High-fidelity 3D scene reconstruction plays a crucial role in autonomous driving by enabling novel data generation from existing datasets. This allows simulating safety-critical scenarios and augmenting training datasets without incurring further data collection costs. While recent advances in radiance fields have demonstrated promising results in 3D reconstruction and sensor data synthesis using cameras and LiDAR, their potential for radar remains largely unexplored. Radar is crucial for autonomous driving due to its robustness in adverse weather conditions such as rain, fog, and snow, where optical sensors often struggle. Although the state-of-the-art radar-based neural representation shows promise for 3D driving scene reconstruction, it performs poorly in scenarios with significant radar noise, including receiver saturation and multipath reflection. Moreover, it is limited to synthesizing preprocessed, noise-excluded radar images and therefore does not address realistic radar data synthesis. To address these limitations, this paper proposes RadarSplat, which integrates Gaussian Splatting with novel radar noise modeling to enable realistic radar data synthesis and enhanced 3D reconstruction. Compared to the state of the art, RadarSplat achieves superior radar image synthesis (+3.4 PSNR / 2.6x SSIM) and improved geometric reconstruction (-40% RMSE / 1.5x Accuracy), demonstrating its effectiveness in high-fidelity radar data synthesis and scene reconstruction.
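To make the core rendering idea concrete before the code is released, here is a minimal sketch (our assumption of the general technique, not the authors' implementation) that splats isotropic 2D Gaussians into a polar range-azimuth radar power image. All function names and parameters below are illustrative; in the actual method, Gaussian parameters and reflectances would be optimized against observed radar images, with the paper's noise modeling layered on top.

```python
# Minimal sketch (assumed, not RadarSplat's code): splat isotropic 2D
# Gaussians into a polar range-azimuth radar power image with NumPy.
import numpy as np

def splat_radar_image(means, intensities, sigma, n_range=256, n_azimuth=400,
                      max_range=100.0):
    """Render an (n_azimuth, n_range) power image from 2D Gaussians.

    means:       (N, 2) Gaussian centers in the Cartesian sensor frame, meters
    intensities: (N,)   per-Gaussian reflected power (learnable in practice)
    sigma:       scalar Gaussian std-dev in meters (isotropic for simplicity)
    """
    # Convert Gaussian centers to polar coordinates (range, azimuth).
    rng = np.linalg.norm(means, axis=1)            # (N,) ranges
    az = np.arctan2(means[:, 1], means[:, 0])      # (N,) azimuths in [-pi, pi]

    # Polar pixel grid of the synthetic radar image.
    r_bins = np.linspace(0.0, max_range, n_range)      # (n_range,)
    a_bins = np.linspace(-np.pi, np.pi, n_azimuth)     # (n_azimuth,)

    image = np.zeros((n_azimuth, n_range))
    for r, a, w in zip(rng, az, intensities):
        dr = r_bins - r
        da = np.angle(np.exp(1j * (a_bins - a)))       # wrapped angle difference
        g_r = np.exp(-0.5 * (dr / sigma) ** 2)         # range footprint
        g_a = np.exp(-0.5 * (da * max(r, 1e-3) / sigma) ** 2)  # angular footprint
        image += w * np.outer(g_a, g_r)                # simple additive blend
    return image

# Toy usage: three point scatterers.
means = np.array([[10.0, 0.0], [30.0, 15.0], [-20.0, 40.0]])
img = splat_radar_image(means, intensities=np.ones(3), sigma=0.5)
print(img.shape)  # (400, 256)
```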
System Overview. RadarSplat takes radar images and poses as input. The preprocessing step performs noise detection and initial occupancy mapping. The multipath source map and Gaussian splats then reconstruct the 3D scene and model multipath effects for novel view synthesis.
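As a toy illustration of the noise-detection step in the preprocessing stage (the paper's detector is more involved; the thresholds and names here are assumptions for demonstration only), one could flag receiver-saturated azimuth rows, which appear as bright streaks spanning most range bins:

```python
# Illustrative (assumed) preprocessing step: flag receiver-saturated azimuth
# rows in a polar radar image. Thresholds are made up for demonstration.
import numpy as np

def saturation_mask(polar_img, power_thresh=0.8, frac_thresh=0.6):
    """polar_img: (n_azimuth, n_range) power image normalized to [0, 1].
    Returns a boolean mask, True on azimuth rows flagged as saturated."""
    bright_frac = (polar_img > power_thresh).mean(axis=1)  # bright-bin fraction per row
    return bright_frac > frac_thresh                       # (n_azimuth,) row mask

rng = np.random.default_rng(0)
img = rng.random((400, 256)) * 0.3
img[42] = 0.95                      # inject a synthetic saturated streak
mask = saturation_mask(img)
print(np.nonzero(mask)[0])          # -> [42]
```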
@article{kung2025radarsplat,
  title={RadarSplat: Radar Gaussian Splatting for High-Fidelity Data Synthesis and 3D Reconstruction of Autonomous Driving Scenes},
  author={Kung, Pou-Chun and Harisha, Skanda and Vasudevan, Ram and Eid, Aline and Skinner, Katherine A},
  journal={arXiv preprint arXiv:2506.01379},
  year={2025}
}