No modes left behind: Capturing the data distribution effectively using GANS

Shashank Sharma, Vinay P. Namboodiri

Research output: Chapter in Book/Report/Conference proceedingConference contribution

Abstract

Generative adversarial networks (GANs), while highly versatile for realistic image synthesis, remain sensitive to the input distribution. Given a dataset with an imbalanced distribution, the networks are susceptible to missing modes and failing to capture the data distribution. While various methods have been proposed to improve the training of GANs, these have not addressed the challenge of covering the full data distribution; specifically, a generator is not penalized for missing a mode. We show that such methods are therefore still susceptible to not capturing the full data distribution. In this paper, we propose a simple approach that combines an encoder-based objective with novel loss functions for the generator and discriminator, improving the solution in terms of capturing missing modes. We validate that the proposed method yields substantial improvements through detailed analysis on toy and real datasets. The quantitative and qualitative results demonstrate that the proposed method mitigates the problem of missing modes and improves the training of GANs.
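The encoder-based objective described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the linear `generator`/`encoder` maps, the weight `lam`, and all function names are assumptions. The key idea shown is that an encoder reconstruction term charges the generator for every real sample it cannot reproduce, so dropping a mode incurs a cost.

```python
import numpy as np

# Hedged sketch: linear maps stand in for the generator and encoder
# networks; the actual method uses neural networks and different losses.

def generator(z, W_g):
    # Map latent codes z (shape [n, d_z]) to data space.
    return z @ W_g

def encoder(x, W_e):
    # Map data samples x (shape [n, d_x]) back to latent space.
    return x @ W_e

def reconstruction_penalty(x, W_g, W_e):
    # Encoder-based term: every real sample must be reconstructable
    # via generator(encoder(x)), so no mode can be dropped for free.
    x_rec = generator(encoder(x, W_e), W_g)
    return np.mean(np.sum((x - x_rec) ** 2, axis=1))

def generator_loss(d_fake, x, W_g, W_e, lam=1.0):
    # Non-saturating adversarial term plus the mode-coverage penalty
    # (lam is an illustrative trade-off weight).
    adv = -np.mean(np.log(d_fake + 1e-8))
    return adv + lam * reconstruction_penalty(x, W_g, W_e)
```

With a generator/encoder pair that exactly invert each other (identity matrices here), the penalty vanishes and only the adversarial term remains; any mode the pair fails to reconstruct raises the generator's loss.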

Original language: English
Title of host publication: 32nd AAAI Conference on Artificial Intelligence, AAAI 2018
Publisher: AAAI Press
Pages: 4042-4049
Number of pages: 8
ISBN (Electronic): 9781577358008
Publication status: Published - 1 Jan 2018
Event: 32nd AAAI Conference on Artificial Intelligence, AAAI 2018 - New Orleans, United States
Duration: 2 Feb 2018 - 7 Feb 2018

Publication series

Name: 32nd AAAI Conference on Artificial Intelligence, AAAI 2018

Conference

Conference: 32nd AAAI Conference on Artificial Intelligence, AAAI 2018
Country: United States
City: New Orleans
Period: 2/02/18 - 7/02/18

ASJC Scopus subject areas

  • Artificial Intelligence

