Book ChapterDOI

PD-DARTS: Progressive Discretization Differentiable Architecture Search

TLDR
This work proposes two strategies to narrow the search/evaluation gap in differentiable architecture search: first, rectify the operation with the highest confidence; second, iteratively prune the operation with the lowest confidence.
Abstract
Architecture design is a crucial step for neural-network-based methods, and it requires years of experience and extensive work. Encouragingly, with the recently proposed neural architecture search (NAS), the architecture design process can be automated. In particular, differentiable architecture search (DARTS) reduces the search cost to a couple of GPU days. However, due to the inconsistency between the architecture search and evaluation phases of DARTS, its performance still leaves room for improvement. We propose two strategies to narrow this search/evaluation gap: first, rectify the operation with the highest confidence; second, iteratively prune the operation with the lowest confidence. Experiments show that our method achieves 2.46%/2.48% test error (Strategy 1/2) on CIFAR-10 and 16.48%/16.15% test error (Strategy 1/2) on CIFAR-100 at a low cost of 11/8 GPU hours (Strategy 1/2), outperforming state-of-the-art algorithms.
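The two strategies in the abstract can be illustrated with a minimal NumPy sketch. The function name, the logits-per-edge representation, and the single-step granularity below are illustrative assumptions, not taken from the paper; "confidence" is read as the softmax probability of an operation's architecture weight on its edge.

```python
import numpy as np

def softmax(a):
    """Numerically stable softmax over a 1-D logit vector."""
    e = np.exp(a - a.max())
    return e / e.sum()

def progressive_discretize(alphas, mode="prune"):
    """One hypothetical discretization step over per-edge architecture logits.

    alphas: dict mapping edge name -> np.ndarray of logits (one per candidate op).
    mode="rectify": fix the single (edge, op) pair with the globally highest
                    softmax confidence (Strategy 1).
    mode="prune":   drop the lowest-confidence op on every edge that still
                    has more than one candidate (Strategy 2).
    Returns (decided, alphas): ops fixed this step, and the updated logits.
    """
    decided = {}
    if mode == "rectify":
        # Pick the (edge, op) pair whose softmax probability is highest overall.
        edge, op = max(
            ((e, int(np.argmax(softmax(a)))) for e, a in alphas.items()),
            key=lambda eo: softmax(alphas[eo[0]])[eo[1]],
        )
        decided[edge] = op
    else:
        for e, a in list(alphas.items()):
            if len(a) > 1:
                # Remove the least confident candidate from this edge.
                alphas[e] = np.delete(a, int(np.argmin(softmax(a))))
    return decided, alphas
```

Rectifying freezes one confident choice at a time, while pruning shrinks every edge's candidate set; both move the continuous search gradually toward the discrete architecture used at evaluation, which is how the search/evaluation gap is narrowed.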


Citations
Proceedings ArticleDOI

A General Method For Automatic Discovery of Powerful Interactions In Click-Through Rate Prediction

TL;DR: Li et al. propose AutoPI, which adopts a more general search space in which the computational graph is generalized from existing network connections, and the interactive operators on the edges of the graph are extracted from representative hand-crafted works.
Journal ArticleDOI

Efficient Evaluation Methods for Neural Architecture Search: A Survey

TL;DR: In this paper, the authors comprehensively survey the evaluation methods used in neural architecture search, dividing the existing methods into four categories based on the number of Deep Neural Networks (DNNs) trained to construct them, and provide a detailed analysis to motivate further development of this research direction.
Posted Content

DARTS-PRIME: Regularization and Scheduling Improve Constrained Optimization in Differentiable NAS.

TL;DR: In this article, the authors formulate differentiable architecture search (DARTS) as a constrained bilevel optimization and propose improvements to architectural weight update scheduling and regularization towards discretization.
Posted Content

On Constrained Optimization in Differentiable Neural Architecture Search.

TL;DR: In this article, the authors propose and analyze three improvements to differentiable architecture search (DARTS): architectural weight competition, update scheduling, and regularization towards discretization.
References
Posted Content

Neural Architecture Search with Reinforcement Learning

Barret Zoph, +1 more
05 Nov 2016
TL;DR: This paper uses a recurrent network to generate the model descriptions of neural networks and trains this RNN with reinforcement learning to maximize the expected accuracy of the generated architectures on a validation set.
Proceedings Article

DARTS: Differentiable Architecture Search

TL;DR: The proposed algorithm excels in discovering high-performance convolutional architectures for image classification and recurrent architectures for language modeling, while being orders of magnitude faster than state-of-the-art non-differentiable techniques.
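DARTS makes the search differentiable by relaxing the categorical choice of operation on each edge into a softmax-weighted mixture of all candidates. A minimal sketch of that mixed operation, with placeholder operations standing in for the real convolution/pooling candidates:

```python
import numpy as np

def softmax(a):
    """Numerically stable softmax over a 1-D logit vector."""
    e = np.exp(a - a.max())
    return e / e.sum()

# Illustrative stand-ins for an edge's candidate operations.
ops = [
    lambda x: x,           # identity / skip-connect
    lambda x: 0.0 * x,     # zero (no connection)
    lambda x: np.tanh(x),  # placeholder for a conv/pool op
]

def mixed_op(x, alpha):
    """Continuous relaxation: softmax-weighted sum of all candidate ops."""
    w = softmax(alpha)
    return sum(wi * op(x) for wi, op in zip(w, ops))
```

Because `mixed_op` is differentiable in the architecture logits `alpha`, both `alpha` and the network weights can be optimized by gradient descent; the discrete architecture is recovered afterwards by keeping the highest-weighted operation per edge.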
Journal ArticleDOI

Regularized Evolution for Image Classifier Architecture Search

TL;DR: In this paper, the authors modify the tournament selection evolutionary algorithm by introducing an age property that favors younger genotypes; the resulting AmoebaNet-A achieves state-of-the-art performance.