Controllable Facial Caricaturization with Localized Deformation and Personalized Semantic Attentions

Research output: Journal Publications and Reviews › Publication in refereed journal › peer-review


Author(s)

  • Ming Zeng
  • Yinglin Zheng
  • Jinpeng Lin
  • Xuan Cheng
  • Zizhao Wu
  • Wenjin Deng

Detail(s)

Original language: English
Number of pages: 13
Journal / Publication: IEEE Transactions on Multimedia
Online published: 10 Sep 2021
Publication status: Online published - 10 Sep 2021

Abstract

A facial caricature highlights the distinct characteristics of a person through exaggerations of both shape and appearance. This paper presents a novel framework that automatically generates vivid facial caricatures by encoding personalized semantic information. To this end, we first design a part-based scheme for geometry warping, which composes local semantic deformations into a global warping field and provides sufficient warping freedom for the different facial components. Second, building on this part-based warping scheme, we design a photo-to-caricature translation network called PbWarpGAN and adopt several novel losses to better capture the personalized characteristics of each input face and preserve its identity. Third, on top of PbWarpGAN, we develop a user-friendly interface by introducing an attention scheme on each facial component, allowing ordinary users to conveniently adjust the caricature automatically generated by PbWarpGAN according to their preferences. Experimental results show that our PbWarpGAN is more effective at capturing personalized characteristics than competing methods and provides an efficient tool for caricature design applications.
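The core idea of the part-based scheme is that each facial component contributes its own local deformation, and these are blended into one global warping field that can then be modulated per component by a user-controlled attention weight. The sketch below is only an illustration of that composition under assumed conventions (dense 2D displacement fields blended with soft per-part masks and scalar attention weights); the field representation, mask source, and the `compose_global_warp` helper are not taken from the paper.

```python
# Illustrative sketch (assumptions): per-part dense displacement fields are
# blended into a single global warp, weighted by soft part masks and by a
# per-component attention (exaggeration) scalar that a user could adjust.
import numpy as np

H, W = 256, 256  # assumed image resolution


def compose_global_warp(part_flows, part_masks, attention):
    """Blend per-component displacement fields into one global warping field.

    part_flows : dict name -> array (H, W, 2), local semantic deformation
    part_masks : dict name -> array (H, W) in [0, 1], soft component region
    attention  : dict name -> float, user-adjustable exaggeration weight
    """
    global_flow = np.zeros((H, W, 2), dtype=np.float32)
    for name, flow in part_flows.items():
        weight = attention.get(name, 1.0) * part_masks[name][..., None]
        global_flow += weight * flow
    return global_flow


# Toy usage: exaggerate the nose twice as strongly as the eyes and mouth.
parts = ["eyes", "nose", "mouth"]
flows = {p: np.random.randn(H, W, 2).astype(np.float32) for p in parts}
masks = {p: np.random.rand(H, W).astype(np.float32) for p in parts}
attn = {"eyes": 1.0, "nose": 2.0, "mouth": 1.0}

warp = compose_global_warp(flows, masks, attn)
print(warp.shape)  # (256, 256, 2)
```

In this reading, setting an attention weight to 1.0 keeps the automatically generated deformation for that component, while raising or lowering it strengthens or suppresses the exaggeration locally without disturbing the rest of the face.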

Research Area(s)

  • Caricature, Faces, Generative adversarial networks, Geometry, Image-to-Image Translation, Semantics, Shape, Strain, Style Transfer, Task analysis