Abstract
The generative adversarial network (GAN) is a generative modelling framework that has been shown to minimise various divergence measures under an optimal discriminator. However, there is a gap between the GAN loss function used in theory and the one used in practice. In theory, the proof that GAN minimises the Jensen-Shannon divergence relies on the min-max criterion, whereas in practice the non-saturating criterion is used instead to avoid vanishing gradients. We argue that this formulation of divergence minimisation via GAN is biased and may lead to poor convergence of the algorithm. In this paper, we propose the Residual Generator for GAN (Rg-GAN), inspired by closed-loop control theory, to bridge the gap between theory and practice. Rg-GAN minimises the residual between the discriminator's loss for classifying generated data as real and its loss for classifying generated data as fake. In this setting, the generator's loss terms depend only on the generated data and therefore all contribute to optimising the model. We formulate the residual generator for the standard GAN and the least-squares GAN and show that the resulting objectives are equivalent to minimising the reverse KL divergence and a novel instance of the f-divergence, respectively. Furthermore, we prove that Rg-GAN can be reduced to Integral Probability Metric (IPM) GANs (e.g., Wasserstein GAN), thereby bridging the gap between IPMs and f-divergences. We further improve Rg-GAN by proposing a discriminator loss function with better discrimination ability. Experiments on synthetic and natural image data sets show that Rg-GAN is robust to mode collapse and improves the generation quality of GAN in terms of FID and IS scores.
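To make the losses referred to in the abstract concrete, the following is a minimal sketch reconstructed from the abstract's wording alone, not quoted from the paper; the notation D for the discriminator, G for the generator, and z ~ p_z for the latent prior is assumed here.

```latex
% Minimal sketch (amsmath assumed); notation D, G, z ~ p_z is an assumption of this note.
\begin{align*}
  \text{min-max (saturating) generator loss:} \quad
    & L_G^{\text{sat}} = \mathbb{E}_{z}\!\left[\log\bigl(1 - D(G(z))\bigr)\right] \\
  \text{non-saturating generator loss (used in practice):} \quad
    & L_G^{\text{ns}} = -\,\mathbb{E}_{z}\!\left[\log D(G(z))\right] \\
  \text{residual generator loss (as described in the abstract):} \quad
    & L_G^{\text{Rg}} = \mathbb{E}_{z}\!\left[-\log D(G(z))\right]
      - \mathbb{E}_{z}\!\left[-\log\bigl(1 - D(G(z))\bigr)\right]
      = \mathbb{E}_{z}\!\left[\log\frac{1 - D(G(z))}{D(G(z))}\right]
\end{align*}
% With the optimal discriminator D^*(x) = p_data(x) / (p_data(x) + p_g(x)),
% the residual loss equals E_{x ~ p_g}[log(p_g(x)/p_data(x))] = KL(p_g || p_data),
% i.e. the reverse KL divergence mentioned in the abstract.
```

This only illustrates the "residual" idea for the standard GAN case; the paper itself derives the exact objectives for both the standard and least-squares formulations.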
| Original language | English |
| --- | --- |
| Article number | 108222 |
| Journal | Pattern Recognition |
| Volume | 121 |
| Online published | 2 Aug 2021 |
| DOIs | |
| Publication status | Published - Jan 2022 |
Research Keywords
- Deep learning
- Generative adversarial networks
- Image synthesis
Fingerprint
Dive into the research topics of 'The residual generator: An improved divergence minimization framework for GAN'. Together they form a unique fingerprint.
Projects
- 1 Active
- GRF: Beyond Model Adaptation: Transforming a Complete Probability Distribution of Model Parameters across Different Domains in Transfer Learning
  WONG, H. S. (Principal Investigator / Project Coordinator)
  1/01/21 → …
  Project: Research