EnvGS: Modeling View-Dependent Appearance with Environment Gaussian
- Authors: Tao Xie, Xi Chen, Zhen Xu, Yiman Xie, Yudong Jin, Yujun Shen, Sida Peng, Hujun Bao, Xiaowei Zhou
- Affiliations: Zhejiang University, Ant Group
Reconstructing complex reflections in real-world scenes from 2D images is essential for achieving photorealistic novel view synthesis. Existing methods that use environment maps to model reflections from distant lighting often struggle with high-frequency reflection details and fail to account for near-field reflections. In this work, we introduce EnvGS, a novel approach that employs a set of Gaussian primitives as an explicit 3D representation for capturing reflections of the environment. These environment Gaussian primitives are combined with base Gaussian primitives to model the appearance of the whole scene. To render the environment Gaussian primitives efficiently, we develop a ray-tracing-based renderer that leverages the GPU's RT cores for fast rendering. This allows us to jointly optimize our model for high-quality reconstruction while maintaining real-time rendering speeds. Results on multiple real-world and synthetic datasets demonstrate that our method produces significantly more detailed reflections, achieving the best rendering quality among real-time novel view synthesis methods.
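For intuition, the sketch below shows one way the per-pixel appearance could be composed from the two sets of primitives: a base color and normal rendered from the base Gaussians, plus a reflection color obtained by tracing the reflected ray into the environment Gaussians. The function names, the learned per-pixel blending weight, and the `trace_env_gaussians` callable are illustrative assumptions for exposition, not the paper's actual API or implementation.

```python
import torch


def reflect(view_dir: torch.Tensor, normal: torch.Tensor) -> torch.Tensor:
    # Mirror the viewing direction about the surface normal (both assumed unit-length).
    return view_dir - 2.0 * (view_dir * normal).sum(dim=-1, keepdim=True) * normal


def compose_appearance(base_color, base_normal, surface_xyz, view_dir,
                       blend_weight, trace_env_gaussians):
    """Hypothetical per-pixel composition of base and environment Gaussian colors.

    base_color    (H, W, 3): color rasterized from the base Gaussians
    base_normal   (H, W, 3): unit normals rendered from the base Gaussians
    surface_xyz   (H, W, 3): surface points from which reflection rays originate
    view_dir      (H, W, 3): unit directions from the camera toward each surface point
    blend_weight  (H, W, 1): learned reflection weight in [0, 1]
    trace_env_gaussians: callable (origins, directions) -> (H, W, 3) reflection color,
        standing in for the ray-traced rendering of the environment Gaussians
    """
    refl_dir = reflect(view_dir, base_normal)
    refl_color = trace_env_gaussians(surface_xyz, refl_dir)
    # Blend the diffuse-like base appearance with the ray-traced reflection.
    return (1.0 - blend_weight) * base_color + blend_weight * refl_color


if __name__ == "__main__":
    H, W = 4, 4
    dummy_tracer = lambda o, d: torch.rand(H, W, 3)  # placeholder for the RT-core tracer
    out = compose_appearance(
        base_color=torch.rand(H, W, 3),
        base_normal=torch.nn.functional.normalize(torch.randn(H, W, 3), dim=-1),
        surface_xyz=torch.randn(H, W, 3),
        view_dir=torch.nn.functional.normalize(torch.randn(H, W, 3), dim=-1),
        blend_weight=torch.rand(H, W, 1),
        trace_env_gaussians=dummy_tracer,
    )
    print(out.shape)  # torch.Size([4, 4, 3])
```

Because both the base and environment Gaussians contribute to the final color, the blended output can be supervised end-to-end with a photometric loss, which is what enables the joint optimization described above.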