Human Motion Transfer With 3D Constraints and Detail Enhancement

Yang-Tian Sun, Qian-Cheng Fu, Yue-Ren Jiang, Zitao Liu, Yu-Kun Lai, Hongbo Fu, Lin Gao*

*Corresponding author for this work

Research output: Journal Publications and Reviews (RGC 21 - Publication in refereed journal), peer-reviewed

8 Citations (Scopus)

Abstract

We propose a new method for realistic human motion transfer using a generative adversarial network (GAN), which generates a video of a target character imitating the actions of a source character while maintaining high authenticity of the generated results. We tackle the problem by decoupling and recombining the posture information and appearance information of both the source and target characters. The innovation of our approach lies in using the projection of a reconstructed 3D human model as the condition of the GAN, which better maintains the structural integrity of transfer results across different poses. We further introduce a detail enhancement network that enhances the details of transfer results by exploiting the details in real source frames. Extensive experiments show that our approach yields better results, both qualitatively and quantitatively, than state-of-the-art methods. © 2022 IEEE.
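The core idea in the abstract is to condition the GAN on a 2D projection of a reconstructed 3D body model rather than on sparse keypoints. A minimal sketch of that conditioning step is shown below, assuming a simple pinhole camera and vertex splatting; the paper's actual reconstruction and rendering pipeline is more involved, and all camera parameters and sizes here are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def project_vertices(vertices, f=500.0, cx=128.0, cy=128.0):
    """Pinhole projection of reconstructed 3D body-model vertices
    into the image plane. The resulting 2D map is the structural
    condition fed to the generator. f, cx, cy are assumed values."""
    x, y, z = vertices[:, 0], vertices[:, 1], vertices[:, 2]
    u = f * x / z + cx
    v = f * y / z + cy
    return np.stack([u, v], axis=1)

def rasterize_condition(points_2d, size=256):
    """Splat projected vertices into a binary occupancy map — a crude
    stand-in for rendering the model's silhouette as the GAN condition."""
    cond = np.zeros((size, size), dtype=np.float32)
    px = np.clip(points_2d.astype(int), 0, size - 1)
    cond[px[:, 1], px[:, 0]] = 1.0
    return cond

# Toy "body" point cloud roughly 2 m in front of the camera.
rng = np.random.default_rng(0)
verts = rng.uniform([-0.3, -0.6, 1.8], [0.3, 0.6, 2.2], size=(500, 3))
cond = rasterize_condition(project_vertices(verts))
```

Unlike a skeleton heatmap, such a dense projection encodes body shape and occlusion boundaries, which is what lets the generator preserve structural integrity across poses.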
Original language: English
Pages (from-to): 4682-4693
Number of pages: 12
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 45
Issue number: 4
Online published: 26 Aug 2022
Publication status: Published - Apr 2023

Research Keywords

  • 3D constraints
  • deep learning
  • detail enhancement
  • Feature extraction
  • Generative adversarial networks
  • Image reconstruction
  • motion transfer
  • Solid modeling
  • Task analysis
  • Three-dimensional displays
  • Training
