Neural Architecture Search with Reinforcement Learning
Citations
5,782 citations
Cites background from "Neural Architecture Search with Rei..."
...NAS takes a novel approach to meta-learning architectures by using a recurrent network trained with Reinforcement Learning to design architectures that result in the best accuracy....
[...]
...28 Concept behind Neural Architecture Search [33]...
[...]
...This approach has become very popular since the publication of NAS [33] from Zoph and Le....
[...]
...Since then, GANs were introduced in 2014 [31], Neural Style Transfer [32] in 2015, and Neural Architecture Search (NAS) [33] in 2017....
[...]
...Applying meta-learning concepts from NAS to Data Augmentation has become increasingly popular with works such as Neural Augmentation [36], Smart Augmentation [37], and AutoAugment [38] published in 2017, 2017, and 2018, respectively....
[...]
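The excerpt above summarizes the core NAS idea: a controller policy samples architecture choices, each sampled child network is trained, and its validation accuracy serves as the REINFORCE reward for updating the controller. A minimal, self-contained sketch of that loop; the search space, toy reward (a stand-in for actual child-network training), and all names are illustrative assumptions, not taken from the paper:

```python
import math
import random

# Toy stand-in for the NAS controller: a factorized categorical policy
# over architecture choices, updated with REINFORCE. Search space,
# reward, and hyperparameters are illustrative only.
SEARCH_SPACE = {"filters": [24, 36, 48], "kernel": [3, 5, 7]}

def softmax(logits):
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def sample_architecture(logits):
    # Sample one option per architecture decision from the policy.
    arch = {}
    for key, options in SEARCH_SPACE.items():
        probs = softmax(logits[key])
        arch[key] = random.choices(options, weights=probs)[0]
    return arch

def toy_reward(arch):
    # Placeholder for "train the child network, return validation accuracy".
    return 0.5 + 0.2 * (arch["filters"] == 48) + 0.2 * (arch["kernel"] == 3)

def search(steps=500, lr=1.0):
    logits = {k: [0.0] * len(v) for k, v in SEARCH_SPACE.items()}
    baseline = 0.0  # moving-average baseline reduces gradient variance
    for _ in range(steps):
        arch = sample_architecture(logits)
        reward = toy_reward(arch)
        advantage = reward - baseline
        baseline = 0.9 * baseline + 0.1 * reward
        # REINFORCE gradient for a categorical: (one_hot - probs) * advantage
        for key, options in SEARCH_SPACE.items():
            probs = softmax(logits[key])
            chosen = options.index(arch[key])
            for i in range(len(options)):
                grad = (1.0 if i == chosen else 0.0) - probs[i]
                logits[key][i] += lr * advantage * grad
    return logits
```

In the actual method the policy is a recurrent network emitting a variable-length sequence of decisions; the factorized per-decision policy here is a simplification to keep the sketch short.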
3,393 citations
2,353 citations
1,897 citations
Cites methods from "Neural Architecture Search with Rei..."
...It is natural to consider automated design of detection backbone architectures, such as the recent Automated Machine Learning (AutoML) [219], which has been applied to image classification and object detection [22, 39, 80, 171, 331, 332]....
[...]
1,707 citations
References
2,364 citations
"Neural Architecture Search with Rei..." refers to background or methods in this paper
...Search space: Our search space consists of convolutional architectures, with rectified linear units as non-linearities (Nair & Hinton, 2010), batch normalization (Ioffe & Szegedy, 2015) and skip connections between layers (Section 3.3)....
[...]
...Every child model has two layers, with the number of hidden units adjusted so that total number of learnable parameters approximately match the “medium” baselines (Zaremba et al., 2014; Gal, 2015)....
[...]
...On this task, LSTM architectures tend to excel (Zaremba et al., 2014; Gal, 2015), and improving them is difficult (Jozefowicz et al., 2015)....
[...]
...Additionally, we upsample each image then choose a random 32x32 crop of this upsampled image....
[...]
...First, we make use of the embedding dropout and recurrent dropout techniques proposed in Zaremba et al. (2014) and (Gal, 2015)....
[...]
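The upsample-then-crop excerpt above describes a standard image augmentation: enlarge each image (commonly by zero-padding its borders), then take a random 32x32 crop. A pure-Python sketch on nested lists; the 4-pixel padding amount is an assumption for illustration, not stated in the excerpt:

```python
import random

def pad_image(img, pad=4):
    # Zero-pad a 2-D image (list of rows) on all four sides.
    h, w = len(img), len(img[0])
    padded = [[0] * (w + 2 * pad) for _ in range(h + 2 * pad)]
    for i in range(h):
        for j in range(w):
            padded[i + pad][j + pad] = img[i][j]
    return padded

def random_crop(img, size=32):
    # Take a random size x size crop from a (larger) image.
    h, w = len(img), len(img[0])
    top = random.randint(0, h - size)
    left = random.randint(0, w - size)
    return [row[left:left + size] for row in img[top:top + size]]

def augment(img):
    # Pad, then randomly crop back to the original 32x32 resolution.
    return random_crop(pad_image(img, pad=4), size=32)
```

With 4-pixel padding, a 32x32 input becomes 40x40, so the crop offset varies over 9 positions per axis; the shifted borders are filled with zeros.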
2,317 citations
"Neural Architecture Search with Rei..." refers to background in this paper
...Additionally, it is also possible to predict pooling, local contrast normalization (Jarrett et al., 2009; Krizhevsky et al., 2012), and batchnorm (Ioffe & Szegedy, 2015) in the architectures....
[...]
...This scheme of parallelism is summarized in Figure 3....
[...]
1,952 citations
"Neural Architecture Search with Rei..." refers to background or methods in this paper
...Like residual networks (He et al., 2016a), the architecture also has many one-step skip connections....
[...]
...Along with this success is a paradigm shift from feature designing to architecture designing, i.e., from SIFT (Lowe, 1999), and HOG (Dalal & Triggs, 2005), to AlexNet (Krizhevsky et al., 2012), VGGNet (Simonyan & Zisserman, 2014), GoogleNet (Szegedy et al., 2015), and ResNet (He et al., 2016a)....
[...]
...In Section 3.1, the search space does not have skip connections, or branching layers used in modern architectures such as GoogleNet (Szegedy et al., 2015), and Residual Net (He et al., 2016a)....
[...]
1,667 citations
"Neural Architecture Search with Rei..." refers to methods in this paper
...There are Bayesian optimization methods that allow searching over non-fixed-length architectures (Bergstra et al., 2013; Mendoza et al., 2016), but they are less general and less flexible than the method proposed in this paper....
[...]
1,382 citations