Comparison to 3DGS
Our method is more robust on challenging captures than 3D Gaussian Splatting (3DGS) while rendering at higher FPS.
Recent advances in view synthesis and real-time rendering have achieved photorealistic quality at impressive rendering speeds. While Radiance Field-based methods achieve state-of-the-art quality in challenging scenarios such as in-the-wild captures and large-scale scenes, they often suffer from excessively high compute requirements linked to volumetric rendering. Gaussian Splatting-based methods, on the other hand, rely on rasterization and naturally achieve real-time rendering but suffer from brittle optimization heuristics that underperform on more challenging scenes. In this work, we present RadSplat, a lightweight method for robust real-time rendering of complex scenes. Our main contributions are threefold. First, we use radiance fields as a prior and supervision signal for optimizing point-based scene representations, leading to improved quality and more robust optimization. Next, we develop a novel pruning technique that reduces the overall point count while maintaining high quality, leading to smaller and more compact scene representations with faster inference speeds. Finally, we propose a novel test-time filtering approach that further accelerates rendering and allows scaling to larger, house-sized scenes. We find that our method enables state-of-the-art synthesis of complex captures at 900+ FPS.
Our method consists of three main steps (illustrative sketches of each step follow below):
1. We first optimize a radiance field (NeRF) on the input capture and use it as a prior: it initializes the point-based 3DGS scene representation, and its renderings serve as the supervision signal during optimization.
2. During optimization, we prune low-importance Gaussians, reducing the overall point count while maintaining high quality, which yields smaller, more compact representations with faster inference.
3. At test time, we filter the remaining points per view, further accelerating rendering and allowing the method to scale to larger, house-sized scenes.
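To make step 1 concrete, here is a minimal sketch of one radiance-field-supervised optimization step. The helpers `render_nerf` and `render_gaussians`, the parameter layout, and the weight `lambda_ssim` are illustrative assumptions, not the released implementation; the L1 + D-SSIM objective mirrors standard 3DGS training, with the NeRF rendering replacing the ground-truth photo as the target.

```python
import torch
from torchmetrics.functional import structural_similarity_index_measure as ssim

def supervised_step(render_gaussians, render_nerf, camera, params, optimizer,
                    lambda_ssim: float = 0.2):
    """One hypothetical training step where the NeRF render, not the raw
    photo, is the supervision target (all names and weights are assumptions)."""
    with torch.no_grad():
        target = render_nerf(camera)           # robust, appearance-normalized target (NCHW)
    pred = render_gaussians(params, camera)    # differentiable rasterization (NCHW)
    l1 = (pred - target).abs().mean()
    d_ssim = 1.0 - ssim(pred, target)          # structural term, as in standard 3DGS
    loss = (1.0 - lambda_ssim) * l1 + lambda_ssim * d_ssim
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return float(loss)
```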
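For step 2, a hedged sketch of the pruning, assuming an importance score defined as each Gaussian's maximum alpha-blending contribution across all training views; `contribution_fn` is a hypothetical helper exposing those per-Gaussian weights, and the threshold `tau` is an illustrative choice.

```python
import torch

def prune_gaussians(params, train_cameras, contribution_fn, tau: float = 0.01):
    """Drop Gaussians whose peak blending weight never exceeds tau in any view.

    params: dict of per-Gaussian tensors (e.g. "means", "opacities", ...).
    contribution_fn(params, camera): assumed helper returning, for one view,
    each Gaussian's maximum alpha-blending weight along any ray.
    """
    importance = torch.zeros(params["means"].shape[0])
    for camera in train_cameras:
        importance = torch.maximum(importance, contribution_fn(params, camera))
    keep = importance > tau                             # boolean mask over Gaussians
    return {name: tensor[keep] for name, tensor in params.items()}
```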
Our method achieves similar view synthesis quality as ZipNeRF while rendering 3000x faster.
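Speedups like the one above rely in part on step 3, the test-time filtering. The sketch below precomputes, per group of nearby training cameras, which Gaussians ever contribute noticeably, and rasterizes only that subset for test views assigned to the group; how views are clustered and the threshold are assumptions for illustration, reusing the same hypothetical `contribution_fn` as in the pruning sketch.

```python
import torch

def build_visibility_lists(params, camera_clusters, contribution_fn, tau: float = 1e-3):
    """For each cluster of nearby training cameras, record which Gaussians
    ever contribute noticeably (clustering scheme and tau are assumptions)."""
    num_gaussians = params["means"].shape[0]
    visibility_lists = []
    for cluster in camera_clusters:
        visible = torch.zeros(num_gaussians, dtype=torch.bool)
        for camera in cluster:
            visible |= contribution_fn(params, camera) > tau
        visibility_lists.append(visible)
    return visibility_lists

def render_filtered(params, camera, cluster_id, visibility_lists, render_gaussians):
    # Rasterize only the subset visible from the test view's cluster.
    mask = visibility_lists[cluster_id]
    subset = {name: tensor[mask] for name, tensor in params.items()}
    return render_gaussians(subset, camera)
```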
Our NeRF-based initialization and supervision lead to more stable, higher-quality view synthesis than 3DGS.
We achieve higher SSIM and lower (better) LPIPS than ZipNeRF on the Mip-NeRF360 benchmark while rendering 3000x faster.
@article{niemeyer2024radsplat,
  author  = {Niemeyer, Michael and Manhardt, Fabian and Rakotosaona, Marie-Julie and Oechsle, Michael and Duckworth, Daniel and Gosula, Rama and Tateno, Keisuke and Bates, John and Kaeser, Dominik and Tombari, Federico},
  title   = {RadSplat: Radiance Field-Informed Gaussian Splatting for Robust Real-Time Rendering with 900+ FPS},
  journal = {arXiv.org},
  year    = {2024},
}
We would like to thank Georgios Kopanas, Peter Zhizhin, Peter Hedman, and Jon Barron for fruitful discussions and advice, Cengiz Oztireli for reviewing the draft, and Zhiwen Fan and Kevin Wang for sharing additional baseline results. The results shown above are from the Mip-NeRF360 and ZipNeRF datasets. The website is built on top of the Nerfies template and uses image sliders; the zoom-in video comparison is inspired by Binary Opacity Grids.