4D Gaussian Splatting in the Wild with Uncertainty-Aware Regularization

Authors: Mijeong Kim, Jongwoo Lim, Bohyung Han

Abstract: Novel view synthesis of dynamic scenes is becoming important in various
applications, including augmented and virtual reality. We propose a novel 4D
Gaussian Splatting (4DGS) algorithm for dynamic scenes from casually recorded
monocular videos. To overcome the overfitting problem that existing methods exhibit on
such real-world videos, we introduce an uncertainty-aware regularization that
identifies uncertain regions with few observations and selectively imposes
additional priors based on diffusion models and depth smoothness on such
regions. This approach improves both the performance of novel view synthesis
and the quality of training image reconstruction. We also identify the
initialization problem of 4DGS in fast-moving dynamic regions, where the
Structure from Motion (SfM) algorithm fails to provide reliable 3D landmarks.
To initialize Gaussian primitives in such regions, we present a dynamic region
densification method using the estimated depth maps and scene flow. Our
experiments show that the proposed method improves the performance of 4DGS
reconstruction from a video captured by a handheld monocular camera and also
exhibits promising results in few-shot static scene reconstruction.

Source: http://arxiv.org/abs/2411.08879v1
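
The two components summarized in the abstract can be illustrated with a brief sketch. The PyTorch snippet below is not the authors' implementation; it is a minimal illustration under assumed interfaces. The per-pixel observation counts (`obs_count`), rendered depth map, dynamic-region mask, camera intrinsics `K`, and estimated scene flow are all hypothetical inputs, and the diffusion-model prior mentioned in the abstract is omitted. The first part builds an uncertainty mask from regions with few observations and applies an edge-aware depth-smoothness prior only there; the second part back-projects masked pixels with the estimated depth map and advects them by scene flow to seed Gaussian centers where SfM landmarks are missing.

```python
import torch


def uncertainty_mask(obs_count: torch.Tensor, tau: float = 3.0) -> torch.Tensor:
    """Mark pixels seen by fewer than `tau` effective observations as uncertain."""
    return (obs_count < tau).float()                      # (H, W), 1 = uncertain


def masked_depth_smoothness(depth: torch.Tensor,
                            image: torch.Tensor,
                            mask: torch.Tensor) -> torch.Tensor:
    """Edge-aware depth-smoothness prior, applied only inside the uncertain mask.

    depth: (H, W) rendered depth, image: (3, H, W) rendered color, mask: (H, W).
    """
    d_dx = (depth[:, 1:] - depth[:, :-1]).abs()           # horizontal depth gradient
    d_dy = (depth[1:, :] - depth[:-1, :]).abs()           # vertical depth gradient

    # Down-weight smoothness across strong image edges.
    w_x = torch.exp(-(image[:, :, 1:] - image[:, :, :-1]).abs().mean(0))
    w_y = torch.exp(-(image[:, 1:, :] - image[:, :-1, :]).abs().mean(0))

    # Restrict the prior to rarely observed (uncertain) regions.
    m_x = mask[:, 1:] * mask[:, :-1]
    m_y = mask[1:, :] * mask[:-1, :]

    return (m_x * w_x * d_dx).mean() + (m_y * w_y * d_dy).mean()


def backproject(depth: torch.Tensor, K: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    """Lift masked pixels to camera-space 3D points using the estimated depth map."""
    H, W = depth.shape
    v, u = torch.meshgrid(torch.arange(H), torch.arange(W), indexing="ij")
    uv1 = torch.stack([u, v, torch.ones_like(u)], dim=-1).float()   # (H, W, 3)
    rays = uv1 @ torch.linalg.inv(K).T                              # (H, W, 3)
    return (rays * depth.unsqueeze(-1))[mask.bool()]                # (N, 3)


def densify_dynamic(depth, K, dyn_mask, scene_flow):
    """Seed Gaussian centers at time t, plus copies advected by scene flow toward t+1."""
    pts_t = backproject(depth, K, dyn_mask)
    return pts_t, pts_t + scene_flow[dyn_mask.bool()]


# Toy usage with random tensors standing in for rendered/estimated quantities.
H, W = 64, 64
depth = torch.rand(H, W, requires_grad=True)
image = torch.rand(3, H, W)
obs_count = torch.randint(0, 10, (H, W)).float()
K = torch.tensor([[60.0, 0.0, W / 2], [0.0, 60.0, H / 2], [0.0, 0.0, 1.0]])

loss_reg = masked_depth_smoothness(depth, image, uncertainty_mask(obs_count))
loss_reg.backward()                                       # folded into the training loss

dyn_mask = obs_count < 2                                  # stand-in for a dynamic-region mask
pts_t, pts_t1 = densify_dynamic(depth.detach(), K, dyn_mask, torch.zeros(H, W, 3))
```

The selective application of the prior is the key point of the sketch: regions with many observations are left to the photometric loss alone, while rarely observed regions receive the extra smoothness constraint, mirroring the abstract's description of imposing additional priors only on uncertain regions.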
