Neural Parameterization for Dynamic Human Head Editing
Research output: Journal Publications and Reviews (RGC: 21, 22, 62) › 21_Publication in refereed journal › peer-review
Author(s)
MA, Li; LI, Xiaoyu; LIAO, Jing et al.
Detail(s)
Original language | English
---|---
Article number | 236
Journal / Publication | ACM Transactions on Graphics
Volume | 41
Issue number | 6
Online published | 30 Nov 2022
Publication status | Published - Dec 2022
Conference
Title | 15th ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia (SIGGRAPH Asia 2022)
---|---
Location | Daegu Exhibition & Convention Center (EXCO)
Place | Korea, Republic of
City | Daegu
Period | 6 - 9 December 2022
Abstract
Implicit radiance functions have emerged as a powerful scene representation for reconstructing and rendering photo-realistic views of a 3D scene. These representations, however, suffer from poor editability. On the other hand, explicit representations such as polygonal meshes allow easy editing but are less suitable for reconstructing accurate details in dynamic human heads, such as fine facial features, hair, teeth, and eyes. In this work, we present Neural Parameterization (NeP), a hybrid representation that provides the advantages of both implicit and explicit methods. NeP is capable of photo-realistic rendering while allowing fine-grained editing of the scene geometry and appearance. We first disentangle geometry and appearance by parameterizing the 3D geometry into a 2D texture space. We enable geometric editability by introducing an explicit linear deformation blending layer. The deformation is controlled by a set of sparse key points, which can be explicitly and intuitively displaced to edit the geometry. For appearance, we develop a hybrid 2D texture consisting of an explicit texture map for easy editing and implicit view- and time-dependent residuals that model temporal and view variations. We compare our method to several reconstruction and editing baselines. The results show that NeP achieves nearly the same rendering accuracy as the baselines while maintaining high editability.
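The key-point-driven deformation described in the abstract can be illustrated with a minimal sketch. The snippet below is a hypothetical toy version, not the paper's implementation: it blends sparse key-point displacements into a dense deformation field using Gaussian RBF weights (an assumption here; NeP learns its blending weights), so that moving one key point intuitively drags nearby surface points with it.

```python
import numpy as np

def linear_blend_deform(points, keypoints, displacements, sigma=0.1):
    """Deform query points by blending sparse key-point displacements.

    Toy sketch of a linear deformation blending layer: each point moves
    by a normalized weighted sum of key-point displacements. The Gaussian
    RBF weighting is an illustrative assumption, not the paper's scheme.

    points        : (P, 3) query points
    keypoints     : (K, 3) sparse control key points
    displacements : (K, 3) per-key-point edit offsets
    """
    # (P, K) squared distances between query points and key points
    d2 = np.sum((points[:, None, :] - keypoints[None, :, :]) ** 2, axis=-1)
    w = np.exp(-d2 / (2.0 * sigma**2))
    w = w / (w.sum(axis=1, keepdims=True) + 1e-8)  # normalize blend weights
    return points + w @ displacements              # (P,3) + (P,K)@(K,3)

# Displacing one key point deforms points near it, leaving distant points fixed:
pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
kps = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
disp = np.array([[0.0, 0.1, 0.0], [0.0, 0.0, 0.0]])  # lift only the first key point
deformed = linear_blend_deform(pts, kps, disp)
```

With `sigma=0.1`, the point at the origin follows the first key point's upward edit almost exactly, while the point at `(1, 0, 0)` stays essentially in place, which is the locality that makes sparse key-point editing intuitive.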
Research Area(s)
- Neural Rendering, Scene Representation, Editable Neural Radiance Field, Dynamic Scenes
Citation Format(s)
Neural Parameterization for Dynamic Human Head Editing. / MA, Li; LI, Xiaoyu; LIAO, Jing et al.
In: ACM Transactions on Graphics, Vol. 41, No. 6, 236, 12.2022.