Simple Yet Effective Way for Improving the Performance of GAN

Yoon Jae Yeo, Yong Goo Shin, Seung Park, Sung Jea Ko

Research output: Contribution to journal › Article › peer-review

Abstract

In adversarial learning, the discriminator often fails to guide the generator successfully because it distinguishes between real and generated images using trivial or nonrobust features. To alleviate this problem, this brief presents a simple but effective way to improve the performance of the generative adversarial network (GAN) without imposing training overhead or modifying the network architectures of existing methods. The proposed method employs a novel cascading rejection (CR) module for the discriminator, which extracts multiple nonoverlapping features in an iterative manner using the vector rejection operation. Since the extracted diverse features prevent the discriminator from concentrating on nonmeaningful features, the discriminator can guide the generator effectively to produce images that are more similar to the real images. In addition, since the proposed CR module requires only a few simple vector operations, it can be readily applied to existing frameworks with marginal training overhead. Quantitative evaluations on various data sets, including CIFAR-10, CelebA, CelebA-HQ, LSUN, and tiny-ImageNet, confirm that the proposed method significantly improves the performance of GAN and conditional GAN in terms of the Fréchet inception distance (FID), which reflects the diversity and visual appearance of the generated images.
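The vector rejection at the heart of the CR module is the component of one vector orthogonal to another. The abstract does not give the paper's exact per-stage classifier design, so the following is only a minimal sketch, assuming hypothetical per-stage weight vectors `weights`: each stage scores the current feature, then rejects the used direction so the next stage must rely on a nonoverlapping component.

```python
import numpy as np

def vector_rejection(f, w):
    """Component of f orthogonal to w: f minus its projection onto w."""
    return f - (np.dot(f, w) / np.dot(w, w)) * w

def cascading_rejection(f, weights):
    """Sketch of iterative feature extraction via vector rejection.
    `weights` is a hypothetical list of per-stage classifier weight
    vectors (not taken from the paper). Each stage produces a decision
    score from the residual feature, then removes that direction so
    later stages see only nonoverlapping components."""
    residual = f
    scores = []
    for w in weights:
        scores.append(float(np.dot(residual, w)))  # stage decision score
        residual = vector_rejection(residual, w)   # strip used direction
    return scores, residual
```

With orthonormal stage weights, each rejection zeroes exactly one coordinate of the feature, which illustrates why the per-stage features cannot overlap: once a direction is consumed by one stage, it contributes nothing to the next.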

Original language: English
Journal: IEEE Transactions on Neural Networks and Learning Systems
DOIs
Publication status: Accepted/In press - 2021

Keywords

  • Adversarial learning
  • generative adversarial network (GAN)
  • training strategy

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence

