Neural Parameterization for Dynamic Human Head Editing

Research output: Journal Publications and Reviews (RGC: 21, 22, 62) › Publication in refereed journal › peer-review


Author(s)

  • Li MA
  • Xiaoyu LI
  • Xuan WANG
  • Qi ZHANG
  • Jue WANG
  • Pedro V. SANDER

Detail(s)

Original language: English
Article number: 236
Journal / Publication: ACM Transactions on Graphics
Volume: 41
Issue number: 6
Online published: 30 Nov 2022
Publication status: Published - Dec 2022

Conference

Title: 15th ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia (SIGGRAPH Asia 2022)
Location: Daegu Exhibition & Convention Center (EXCO)
Place: Korea, Republic of
City: Daegu
Period: 6 - 9 December 2022

Abstract

Implicit radiance functions have emerged as a powerful scene representation for reconstructing and rendering photo-realistic views of a 3D scene. These representations, however, suffer from poor editability. On the other hand, explicit representations such as polygonal meshes allow easy editing but are not as suitable for reconstructing accurate details in dynamic human heads, such as fine facial features, hair, teeth, and eyes. In this work, we present Neural Parameterization (NeP), a hybrid representation that provides the advantages of both implicit and explicit methods. NeP is capable of photo-realistic rendering while allowing fine-grained editing of the scene geometry and appearance. We first disentangle the geometry and appearance by parameterizing the 3D geometry into 2D texture space. We enable geometric editability by introducing an explicit linear deformation blending layer. The deformation is controlled by a set of sparse key points, which can be explicitly and intuitively displaced to edit the geometry. For appearance, we develop a hybrid 2D texture consisting of an explicit texture map for easy editing and implicit view- and time-dependent residuals to model temporal and view variations. We compare our method to several reconstruction and editing baselines. The results show that NeP achieves nearly the same rendering accuracy as these baselines while maintaining high editability.
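The abstract's "explicit linear deformation blending layer" controlled by sparse key points can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes a simple Gaussian RBF weighting (the function names, the `sigma` bandwidth parameter, and the weighting scheme are all illustrative choices), but it shows the core idea: every surface point moves by a blend-weighted sum of the key-point displacements.

```python
import numpy as np

def blend_weights(points, keypoints, sigma=0.1):
    """Per-point blend weights over the sparse key points.

    Illustrative choice: Gaussian RBF of the point-to-keypoint
    distance, normalized so each point's weights sum to one.
    points:    (N, 3) surface points
    keypoints: (K, 3) rest-pose key points
    returns:   (N, K) blend-weight matrix
    """
    d2 = ((points[:, None, :] - keypoints[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / (2.0 * sigma**2))
    return w / w.sum(axis=1, keepdims=True)

def deform(points, keypoints, displaced_keypoints, sigma=0.1):
    """Linear deformation blending: each point is offset by the
    weighted sum of the key-point displacements, so dragging one
    key point smoothly edits the nearby geometry."""
    w = blend_weights(points, keypoints, sigma)
    return points + w @ (displaced_keypoints - keypoints)
```

Displacing a single key point then deforms points near it almost rigidly while leaving distant points essentially unchanged, which matches the intuitive drag-to-edit workflow the abstract describes.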

Research Area(s)

  • Neural Rendering, Scene Representation, Editable Neural Radiance Field, Dynamic Scenes

Citation Format(s)

Neural Parameterization for Dynamic Human Head Editing. / MA, Li; LI, Xiaoyu; WANG, Xuan et al.

In: ACM Transactions on Graphics, Vol. 41, No. 6, 236, 12.2022.
