The reconstruction of indoor scenes remains challenging due to the inherent complexity of spatial structures and the prevalence of textureless regions. Recent advancements in 3D Gaussian Splatting have improved novel view synthesis with accelerated processing but have yet to deliver comparable performance in surface reconstruction. In this paper, we introduce 2DGS-Room, a novel method leveraging 2D Gaussian Splatting for high-fidelity indoor scene reconstruction. Specifically, we employ a seed-guided mechanism to control the distribution of 2D Gaussians, with the density of seed points dynamically optimized through adaptive growth and pruning mechanisms. To further improve geometric accuracy, we incorporate monocular depth and normal priors to provide constraints for details and textureless regions respectively. Additionally, multi-view consistency constraints are employed to mitigate artifacts and further enhance reconstruction quality. Extensive experiments on ScanNet and ScanNet++ datasets demonstrate that our method achieves state-of-the-art performance in indoor scene reconstruction.
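The geometric constraints described above can be illustrated with a minimal sketch. The function names and exact formulations below are hypothetical (they are not taken from the released code): monocular depth is only defined up to an affine transform, so a least-squares scale-and-shift alignment is applied before the depth term, and the normal term penalizes angular deviation from the prior normals.

```python
import numpy as np

def normal_prior_loss(rendered_normals, prior_normals):
    # Hypothetical sketch: mean (1 - cosine similarity) between rendered
    # normals and monocular prior normals, both of shape (N, 3).
    r = rendered_normals / np.linalg.norm(rendered_normals, axis=1, keepdims=True)
    p = prior_normals / np.linalg.norm(prior_normals, axis=1, keepdims=True)
    return float(np.mean(1.0 - np.sum(r * p, axis=1)))

def depth_prior_loss(rendered_depth, prior_depth):
    # Monocular depth priors are ambiguous up to scale and shift, so align
    # the prior to the rendered depth by least squares before the L1 term.
    d = prior_depth.ravel()
    A = np.stack([d, np.ones_like(d)], axis=1)
    scale, shift = np.linalg.lstsq(A, rendered_depth.ravel(), rcond=None)[0]
    return float(np.mean(np.abs(rendered_depth.ravel() - (scale * d + shift))))
```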
Overview of 2DGS-Room. Given multi-view posed images, we improve 2DGS to achieve high-fidelity geometric reconstruction for indoor scenes. (a) Starting from an SfM-derived point cloud, we generate a set of seed points through voxelization, establishing a stable foundation for guiding the distribution and density of 2D Gaussians. We further introduce an adaptive growth and pruning strategy to optimize seed points. (b) We incorporate depth and normal priors, addressing the challenges of detailed areas and textureless regions. (c) We introduce multi-view consistency constraints to further enhance the quality of the indoor scene reconstruction.
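Step (a), generating seed points from the SfM point cloud via voxelization, might look like the following sketch (the function name and centroid choice are assumptions for illustration, not the released implementation): points are snapped to a voxel grid and one seed is kept per occupied voxel.

```python
import numpy as np

def voxelize_seeds(points, voxel_size=0.1):
    # Hypothetical sketch: assign each SfM point (N, 3) to a voxel, then
    # keep one seed per occupied voxel, here the centroid of its points.
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    n_voxels = int(inv.max()) + 1
    sums = np.zeros((n_voxels, 3))
    counts = np.zeros(n_voxels)
    np.add.at(sums, inv, points)   # accumulate point coordinates per voxel
    np.add.at(counts, inv, 1.0)    # count points per voxel
    return sums / counts[:, None]  # voxel centroids as seed points
```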
Our code builds on 2D Gaussian Splatting; we thank the authors for their open-source release.
If you find this project useful in your research, please consider citing:
@article{zhang20242dgsroom,
title={2DGS-Room: Seed-Guided 2D Gaussian Splatting with Geometric Constraints for High-Fidelity Indoor Scene Reconstruction},
author={Zhang, Wanting and Xiang, Haodong and Liao, Zhichao and Lai, Xiansong and Li, Xinghui and Zeng, Long},
journal={arXiv preprint arXiv:2412.03428},
year={2024}
}