Open Access Proceedings Article

Improved Training of Wasserstein GANs

TLDR
The authors propose penalizing the norm of the gradient of the critic with respect to its input, which improves the training stability of Wasserstein GANs and enables stable training of a wide variety of GAN architectures with almost no hyperparameter tuning.
Abstract
Generative Adversarial Networks (GANs) are powerful generative models, but suffer from training instability. The recently proposed Wasserstein GAN (WGAN) makes progress toward stable training of GANs, but sometimes can still generate only poor samples or fail to converge. We find that these problems are often due to the use of weight clipping in WGAN to enforce a Lipschitz constraint on the critic, which can lead to undesired behavior. We propose an alternative to clipping weights: penalize the norm of the gradient of the critic with respect to its input. Our proposed method performs better than standard WGAN and enables stable training of a wide variety of GAN architectures with almost no hyperparameter tuning, including 101-layer ResNets and language models with continuous generators. We also achieve high quality generations on CIFAR-10 and LSUN bedrooms.
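In practice, the penalty described in the abstract is evaluated at random interpolates between real and generated samples and added to the critic's loss. The following is a minimal sketch of that term, assuming a PyTorch-style critic over image-shaped inputs; the helper name gradient_penalty is ours, though the two-sided penalty and the default coefficient of 10 follow the paper's recommended setting. This is an illustration, not the authors' reference implementation.

```python
# Minimal sketch of a WGAN-GP gradient penalty term (not the authors'
# reference code); assumes `critic` maps a batch of images to scores.
import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    batch_size = real.size(0)
    # One random interpolation coefficient per sample, broadcast over
    # the (channel, height, width) image dimensions.
    eps = torch.rand(batch_size, 1, 1, 1, device=real.device)
    interpolates = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interpolates)
    # Gradient of the critic's output with respect to its input.
    grads = torch.autograd.grad(
        outputs=scores,
        inputs=interpolates,
        grad_outputs=torch.ones_like(scores),
        create_graph=True,
    )[0]
    grad_norm = grads.view(batch_size, -1).norm(2, dim=1)
    # Two-sided penalty: drive the gradient norm toward 1.
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()
```

The returned term is simply added to the usual WGAN critic loss before the backward pass.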



Citations
Journal Article

Optimizing the Sediment Classification of Small Side-Scan Sonar Images Based on Deep Learning

TL;DR: In this paper, the authors combined shallow-water side-scan sonar images collected from the Pearl River Estuary with deep learning to study sediment classification and optimization methods for a small dataset of seabed acoustic images.
Journal Article

A Data-Driven Approach for Generating Synthetic Load Patterns and Usage Habits

TL;DR: This paper proposes a flexible framework for generating synthetic labelled load patterns and usage habits via a novel, non-intrusive, data-driven approach, leveraging recent developments in generative adversarial networks (GANs) and kernel density estimators (KDEs) to eliminate model-based assumptions that would otherwise introduce bias.
Proceedings Article

Simulating Brain Signals: Creating Synthetic EEG Data via Neural-Based Generative Models for Improved SSVEP Classification

TL;DR: In this paper, the authors explore the use of modern neural-based generative models trained on a limited quantity of EEG data collected from different subjects to generate supplementary synthetic EEG signal vectors, subsequently used to train an SSVEP classifier.
Journal Article

A Neural Vocoder With Hierarchical Generation of Amplitude and Phase Spectra for Statistical Parametric Speech Synthesis

TL;DR: This article presents a neural vocoder named HiNet, which reconstructs speech waveforms from acoustic features by predicting amplitude and phase spectra hierarchically, and which achieves better naturalness of reconstructed speech than the conventional STRAIGHT vocoder, an open-source 16-bit WaveNet vocoder, and an NSF vocoder of complexity similar to the PSP.
Posted Content

Dual Contradistinctive Generative Autoencoder

TL;DR: The two contradistinctive losses work harmoniously in DC-VAE, leading to significant qualitative and quantitative performance gains over baseline VAEs without architectural changes.
References
Dissertation

Learning Multiple Layers of Features from Tiny Images

TL;DR: In this dissertation, the author describes how to train a multi-layer generative model of natural images using a dataset of millions of tiny colour images.
Journal Article

Simple Statistical Gradient-Following Algorithms for Connectionist Reinforcement Learning

TL;DR: This article presents a general class of associative reinforcement learning algorithms for connectionist networks containing stochastic units. These algorithms are shown to make weight adjustments in a direction that lies along the gradient of expected reinforcement, in both immediate-reinforcement tasks and certain limited forms of delayed-reinforcement tasks, without explicitly computing gradient estimates. A hedged sketch of this idea appears below.
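As a rough illustration of the score-function idea summarized above (PyTorch-style; the function and tensor names are ours, not the article's):

```python
# Minimal REINFORCE-style sketch: the gradient of this loss lies along
# the gradient of expected reinforcement, with no gradient through rewards.
import torch

def reinforce_loss(logits, actions, rewards):
    # Log-probability of each sampled action under the stochastic policy.
    log_probs = torch.distributions.Categorical(logits=logits).log_prob(actions)
    # Negated because optimizers minimize; rewards act as per-sample weights.
    return -(log_probs * rewards.detach()).mean()
```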
Posted Content

Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks

TL;DR: This work introduces a class of CNNs called deep convolutional generative adversarial networks (DCGANs) that have certain architectural constraints, and demonstrates that they are a strong candidate for unsupervised learning.
Posted Content

Improved Techniques for Training GANs

TL;DR: In this article, the authors present a variety of new architectural features and training procedures that apply to the generative adversarial networks (GANs) framework and achieve state-of-the-art results in semi-supervised classification on MNIST, CIFAR-10 and SVHN.
Proceedings Article

Categorical Reparameterization with Gumbel-Softmax

TL;DR: Gumbel-Softmax, as introduced in this paper, replaces non-differentiable samples from a categorical distribution with differentiable samples from a novel Gumbel-Softmax distribution, which has the essential property that it can be smoothly annealed into the categorical distribution.
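A hedged sketch of the reparameterization described above (assuming PyTorch; the small epsilon guard is a common numerical convention, not part of the paper):

```python
# Differentiable surrogate for a categorical sample: as tau -> 0 the
# output approaches one-hot, recovering the categorical distribution.
import torch
import torch.nn.functional as F

def gumbel_softmax_sample(logits, tau=1.0, eps=1e-20):
    # Gumbel(0, 1) noise via the inverse transform of uniform samples.
    gumbel = -torch.log(-torch.log(torch.rand_like(logits) + eps) + eps)
    return F.softmax((logits + gumbel) / tau, dim=-1)
```

PyTorch also ships an equivalent built-in, torch.nn.functional.gumbel_softmax.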