Open Access · Proceedings Article
Improved Training of Wasserstein GANs
Ishaan Gulrajani, Faruk Ahmed, Martin Arjovsky, Vincent Dumoulin, Aaron Courville
Advances in Neural Information Processing Systems, Vol. 30, pp. 5769-5779
TL;DR: The authors propose penalizing the norm of the gradient of the critic with respect to its input, improving the training stability of Wasserstein GANs and enabling stable training of a wide variety of GAN architectures with almost no hyperparameter tuning.
Abstract:
Generative Adversarial Networks (GANs) are powerful generative models, but suffer from training instability. The recently proposed Wasserstein GAN (WGAN) makes progress toward stable training of GANs, but sometimes can still generate only poor samples or fail to converge. We find that these problems are often due to the use of weight clipping in WGAN to enforce a Lipschitz constraint on the critic, which can lead to undesired behavior. We propose an alternative to clipping weights: penalize the norm of the gradient of the critic with respect to its input. Our proposed method performs better than standard WGAN and enables stable training of a wide variety of GAN architectures with almost no hyperparameter tuning, including 101-layer ResNets and language models with continuous generators. We also achieve high quality generations on CIFAR-10 and LSUN bedrooms.
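To make the gradient-penalty idea concrete, the snippet below is a minimal PyTorch sketch of the penalty term the abstract describes, not the authors' reference implementation; the critic module, the batch shapes, and the penalty coefficient of 10 (the paper's default) are assumptions for illustration.

```python
import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    """Two-sided penalty on the critic's gradient norm at interpolated points."""
    batch_size = real.size(0)
    # Random interpolation weights, broadcast over the non-batch dimensions
    eps = torch.rand(batch_size, *([1] * (real.dim() - 1)), device=real.device)
    interpolates = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interpolates)
    grads, = torch.autograd.grad(
        outputs=scores,
        inputs=interpolates,
        grad_outputs=torch.ones_like(scores),
        create_graph=True,  # keep the graph so the penalty itself is differentiable
    )
    grad_norm = grads.reshape(batch_size, -1).norm(2, dim=1)
    # Penalize deviation of the gradient norm from 1 (the Lipschitz target)
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()
```

In a training loop this term would typically be added to the critic's Wasserstein loss, e.g. `loss_D = critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)`.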
Citations
Posted Content
GDPP: Learning Diverse Generations Using Determinantal Point Process
TL;DR: The Generative DPP approach shows consistent resistance to mode collapse on a wide variety of synthetic and natural image datasets while outperforming state-of-the-art methods in data efficiency, generation quality, and convergence time, and is 5.8x faster than its closest competitor.
Journal Article
Towards Sustainable Energy Efficiency With Intelligent Electricity Theft Detection in Smart Grids Emphasising Enhanced Neural Networks
Abdulaziz Aldegheishem, Mubbashra Anwar, Nadeem Javaid, Nabil Alrajeh, Muhammad Shafiq, Hasan Ahmed +5 more
TL;DR: This paper proposes a hybrid sampling approach, i.e., the synthetic minority oversampling technique combined with the edited nearest neighbour rule, and uses AlexNet for dimensionality reduction and for extracting useful information from electricity consumption data.
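For readers unfamiliar with the resampling step, here is a minimal sketch using the imbalanced-learn library's SMOTEENN (SMOTE oversampling followed by edited-nearest-neighbours cleaning); the feature matrix and labels are placeholders, and the AlexNet feature-extraction stage mentioned in the summary is omitted.

```python
import numpy as np
from imblearn.combine import SMOTEENN

# Placeholder consumption features and theft labels (imbalanced: few positives)
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 24))            # e.g. 24 hourly consumption readings
y = (rng.random(1000) < 0.05).astype(int)  # roughly 5% theft cases

# SMOTE oversampling of the minority class + edited-nearest-neighbours cleaning
resampler = SMOTEENN(random_state=0)
X_balanced, y_balanced = resampler.fit_resample(X, y)

print(X.shape, "->", X_balanced.shape)
```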
Journal Article
Bi-Modality Medical Image Synthesis Using Semi-Supervised Sequential Generative Adversarial Networks
TL;DR: This paper proposes a bi-modality medical image synthesis approach based on a sequential generative adversarial network (GAN) and semi-supervised learning, consisting of two generative modules that synthesize images of the two modalities in sequential order.
Journal Article
Painting halos from cosmic density fields of dark matter with physically motivated neural networks
TL;DR: A halo painting network is proposed that maps approximate 3D dark matter fields to realistic halo distributions, using a physically motivated network that learns the nontrivial local relation between the dark matter density field and the halo distribution without relying on a physical model.
Journal Article
Study of low-dose PET image recovery using supervised learning with CycleGAN
Kui Zhao, Long Zhou, Gao Size, Wang Xiaozhuang, Yaofa Wang, Xin Zhao, Huatao Wang, Kanfeng Liu, Yunqi Zhu, Hongwei Ye +9 more
TL;DR: Quantitative and qualitative evaluations indicate that the proposed approach for establishing a non-linear end-to-end mapping model is accurate, efficient, and robust compared to other state-of-the-art deep learning methods.
References
Dissertation
Learning Multiple Layers of Features from Tiny Images
TL;DR: The authors describe how to train a multi-layer generative model of natural images using a dataset of millions of tiny colour images.
Journal Article
Simple Statistical Gradient-Following Algorithms for Connectionist Reinforcement Learning
TL;DR: This article presents a general class of associative reinforcement learning algorithms for connectionist networks containing stochastic units. These algorithms are shown to make weight adjustments in a direction that lies along the gradient of expected reinforcement, in both immediate-reinforcement tasks and certain limited forms of delayed-reinforcement tasks, without explicitly computing gradient estimates.
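As a concrete illustration of the score-function update this summary describes, here is a minimal PyTorch sketch of the REINFORCE estimator for a single stochastic categorical unit; the policy network, baseline, and reward are placeholder assumptions rather than anything prescribed by the article.

```python
import torch
import torch.nn as nn

policy = nn.Linear(4, 3)           # placeholder: 4-dim state -> 3 action logits
optimizer = torch.optim.SGD(policy.parameters(), lr=1e-2)
baseline = 0.0                     # fixed reinforcement baseline for illustration

state = torch.randn(1, 4)
dist = torch.distributions.Categorical(logits=policy(state))
action = dist.sample()             # non-differentiable draw from the stochastic unit
reward = 1.0                       # placeholder scalar reinforcement from the environment

# REINFORCE: the expected update follows the gradient of expected reinforcement,
# via (reward - baseline) * grad log pi(action | state), without differentiating
# through the sampling step itself.
loss = (-(reward - baseline) * dist.log_prob(action)).mean()
optimizer.zero_grad()
loss.backward()
optimizer.step()
```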
Posted Content
Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks
TL;DR: This work introduces a class of CNNs called deep convolutional generative adversarial networks (DCGANs) that have certain architectural constraints, and demonstrates that they are a strong candidate for unsupervised learning.
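The architectural constraints referred to here include, among others, replacing pooling with strided (transposed) convolutions, using batch normalization, and avoiding fully connected hidden layers. Below is a minimal sketch of a DCGAN-style generator along those lines; the layer widths and output resolution are chosen purely for illustration and are not taken from the paper.

```python
import torch.nn as nn

# Illustrative DCGAN-style generator: transposed strided convolutions,
# batch normalization, ReLU activations, and a Tanh output layer.
# Expects a latent input of shape (N, 100, 1, 1).
generator = nn.Sequential(
    nn.ConvTranspose2d(100, 256, kernel_size=4, stride=1, padding=0, bias=False),
    nn.BatchNorm2d(256),
    nn.ReLU(inplace=True),                       # 4x4 feature maps
    nn.ConvTranspose2d(256, 128, 4, 2, 1, bias=False),
    nn.BatchNorm2d(128),
    nn.ReLU(inplace=True),                       # 8x8
    nn.ConvTranspose2d(128, 64, 4, 2, 1, bias=False),
    nn.BatchNorm2d(64),
    nn.ReLU(inplace=True),                       # 16x16
    nn.ConvTranspose2d(64, 3, 4, 2, 1, bias=False),
    nn.Tanh(),                                   # 32x32 RGB output in [-1, 1]
)
```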
Posted Content
Improved Techniques for Training GANs
TL;DR: In this article, the authors present a variety of new architectural features and training procedures that apply to the generative adversarial networks (GANs) framework and achieve state-of-the-art results in semi-supervised classification on MNIST, CIFAR-10 and SVHN.
Proceedings Article
Categorical Reparameterization with Gumbel-Softmax
Eric Jang, Shixiang Gu, Ben Poole
TL;DR: Gumbel-Softmax replaces non-differentiable samples from a categorical distribution with a differentiable sample from a novel Gumbel-Softmax distribution, which has the essential property that it can be smoothly annealed into the categorical distribution.
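As a concrete illustration of the reparameterization this summary describes, here is a minimal PyTorch sketch of drawing a differentiable Gumbel-Softmax sample from class logits; the logits and temperature are placeholders, and lowering `tau` anneals the soft sample toward a one-hot categorical draw.

```python
import torch
import torch.nn.functional as F

def gumbel_softmax_sample(logits, tau=1.0, eps=1e-20):
    """Differentiable sample that approaches a one-hot categorical draw as tau -> 0."""
    uniform = torch.rand_like(logits)
    gumbel_noise = -torch.log(-torch.log(uniform + eps) + eps)  # Gumbel(0, 1) noise
    return F.softmax((logits + gumbel_noise) / tau, dim=-1)

logits = torch.tensor([[1.0, 0.5, -1.0]], requires_grad=True)
soft_sample = gumbel_softmax_sample(logits, tau=0.5)  # gradients flow back to `logits`
print(soft_sample)
```

PyTorch also ships a built-in `torch.nn.functional.gumbel_softmax` that implements the same idea.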