
Showing papers by "Ethem Alpaydin published in 2020"


Journal ArticleDOI
TL;DR: In this paper, the authors propose two approaches that automatically update the network structure while also learning its weights, where depth, and hence model complexity, is encapsulated continuously in the parameter space through control parameters.
Abstract: Traditionally, deep learning algorithms update the network weights, whereas the network architecture is chosen manually through trial and error. In this paper, we propose two novel approaches that automatically update the network structure while also learning its weights. The novelty of our approach lies in our parameterization, where depth, or additional complexity, is encapsulated continuously in the parameter space through control parameters. In tunnel networks, complexity is added at the level of a hidden unit, and in budding perceptrons, at the level of a network layer; updating the corresponding control parameter introduces another hidden unit or another layer, respectively. We show the effectiveness of our methods on the synthetic two-spiral data and on three real data sets, MNIST, MIRFLICKR, and CIFAR, where our proposed methods, with the same set of hyperparameters, correctly adjust the network complexity to the task complexity.
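To make the growth mechanism concrete, the following is a minimal sketch, not the authors' exact parameterization: we assume a scalar control parameter, squashed by a sigmoid, that blends a skip path with a newly added ("budded") layer, so the layer's contribution, and with it the effective depth, grows continuously as the gate opens. The unit-level tunnel-network variant would use one such gate per hidden unit instead.

```python
import torch
import torch.nn as nn

class BuddingLayer(nn.Module):
    """Hypothetical sketch of a layer whose contribution is gated.

    A learnable scalar `raw_alpha` is squashed to alpha in (0, 1) and used
    to blend an identity (skip) path with a new hidden layer; at alpha ~ 0
    the layer is effectively absent, and training can open the gate to
    introduce it. This illustrates the idea, not the paper's actual code.
    """

    def __init__(self, dim):
        super().__init__()
        self.linear = nn.Linear(dim, dim)
        # Start near zero so the network begins shallow: sigmoid(-4) ~ 0.018.
        self.raw_alpha = nn.Parameter(torch.tensor(-4.0))

    def forward(self, x):
        alpha = torch.sigmoid(self.raw_alpha)  # continuous control parameter
        return (1.0 - alpha) * x + alpha * torch.tanh(self.linear(x))

# Usage: stack several budding layers; gradient descent decides how many "open".
net = nn.Sequential(BuddingLayer(64), BuddingLayer(64), nn.Linear(64, 10))
out = net(torch.randn(8, 64))
```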

13 citations


Journal ArticleDOI
TL;DR: BiGANs trained with the Wasserstein loss and augmented with hints learn better generators in terms of image generation quality and diversity, as measured quantitatively by the 1-nearest-neighbor test, the Fréchet inception distance, and the reconstruction error, and qualitatively by visual inspection of the generated samples.
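As a rough illustration of this setup, here is a minimal PyTorch sketch of a BiGAN-style objective with a Wasserstein critic over joint (x, z) pairs. The TL;DR does not specify the form of the hints, so the reconstruction penalty below, like all modules and dimensions, is an assumption made for illustration.

```python
import torch
import torch.nn as nn

x_dim, z_dim = 784, 64                  # assumed sizes, for illustration only

G = nn.Linear(z_dim, x_dim)             # generator z -> x (stand-in for a deep net)
E = nn.Linear(x_dim, z_dim)             # encoder   x -> z (stand-in for a deep net)
D = nn.Linear(x_dim + z_dim, 1)         # Wasserstein critic on joint (x, z) pairs

def critic(x, z):
    # BiGAN critic sees the image and the code together.
    return D(torch.cat([x, z], dim=1))

x = torch.randn(32, x_dim)              # a batch of real images (flattened)
z = torch.randn(32, z_dim)              # a batch of latent codes

real = critic(x, E(x)).mean()           # score on real pairs (x, E(x))
fake = critic(G(z), z).mean()           # score on generated pairs (G(z), z)
critic_loss = fake - real               # critic maximizes real - fake
hint = (x - G(E(x))).abs().mean()       # assumed "hint": a reconstruction penalty
gen_enc_loss = real - fake + 1.0 * hint # generator/encoder objective
```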

11 citations