Visual attribute transfer through deep image analogy

Research output: Journal Publications and Reviews (RGC: 21, 22, 62) > 21_Publication in refereed journal

83 Scopus Citations

Author(s)

  • Jing Liao
  • Yuan Yao
  • Lu Yuan
  • Gang Hua
  • Sing Bing Kang

Detail(s)

Original language: English
Article number: 120
Journal / Publication: ACM Transactions on Graphics
Volume: 36
Issue number: 4
Publication status: Published - Jul 2017
Externally published: Yes

Conference

Title: ACM SIGGRAPH 2017
Place: United States
City: Los Angeles
Period: 30 July - 3 August 2017

Abstract

We propose a new technique for visual attribute transfer across images that may differ greatly in appearance but share perceptually similar semantic structure. By visual attribute transfer, we mean transfer of visual information (such as color, tone, texture, and style) from one image to another. For example, one image could be a painting or a sketch while the other is a photo of a real scene, and both depict the same type of scene. Our technique finds semantically-meaningful dense correspondences between two input images. To accomplish this, it adapts the notion of "image analogy" [Hertzmann et al. 2001] with features extracted from a Deep Convolutional Neural Network for matching; we call our technique deep image analogy. A coarse-to-fine strategy is used to compute the nearest-neighbor field for generating the results. We validate the effectiveness of our proposed method in a variety of cases, including style/texture transfer, color/style swap, sketch/painting to photo, and time lapse.
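The abstract above describes computing a dense nearest-neighbor field (NNF) between deep CNN feature maps of the two images, refined coarse-to-fine. As a rough illustration of that generic matching idea only (a minimal sketch, not the paper's released algorithm or code; the full method involves additional steps not shown here), the Python/NumPy snippet below brute-forces an NNF over toy feature maps; all names, shapes, and the toy driver are hypothetical.

```python
# Illustrative sketch only: a brute-force nearest-neighbor field (NNF) search
# over precomputed feature maps, run coarse-to-fine. This is NOT the authors'
# implementation; names, shapes, and the toy driver are hypothetical.
import numpy as np

def match_nnf(feat_a, feat_b, patch=3):
    """For each position in feat_a, find the most similar patch in feat_b
    (L2 distance over feature patches)."""
    ha, wa, _ = feat_a.shape
    hb, wb, _ = feat_b.shape
    r = patch // 2
    pa = np.pad(feat_a, ((r, r), (r, r), (0, 0)), mode="edge")
    pb = np.pad(feat_b, ((r, r), (r, r), (0, 0)), mode="edge")
    nnf = np.zeros((ha, wa, 2), dtype=int)
    for y in range(ha):
        for x in range(wa):
            qa = pa[y:y + patch, x:x + patch]
            best, best_d = (0, 0), np.inf
            for yb in range(hb):
                for xb in range(wb):
                    d = np.sum((qa - pb[yb:yb + patch, xb:xb + patch]) ** 2)
                    if d < best_d:
                        best_d, best = d, (yb, xb)
            nnf[y, x] = best
    return nnf

if __name__ == "__main__":
    # Toy driver with random stand-ins for CNN feature pyramids (coarse -> fine).
    # A real system would extract features with a pretrained CNN and warm-start
    # each finer level from the upsampled coarse NNF instead of recomputing it.
    rng = np.random.default_rng(0)
    pyr_a = [rng.standard_normal((s, s, 8)) for s in (4, 8)]
    pyr_b = [rng.standard_normal((s, s, 8)) for s in (4, 8)]
    for fa, fb in zip(pyr_a, pyr_b):
        nnf = match_nnf(fa, fb)
    print(nnf.shape)  # (8, 8, 2): a (y, x) correspondence for each position
```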

Research Area(s)

  • Deep matching, Image analogy, Transfer

Citation Format(s)

Visual attribute transfer through deep image analogy. / Liao, Jing; Yao, Yuan; Yuan, Lu; Hua, Gang; Kang, Sing Bing.

In: ACM Transactions on Graphics, Vol. 36, No. 4, 120, 07.2017.

Research output: Journal Publications and Reviews (RGC: 21, 22, 62) > 21_Publication in refereed journal