Neural Architecture Search with Reinforcement Learning
Citations
19 citations
Cites background or methods from "Neural Architecture Search with Reinforcement Learning"
...We examine the utility of NNGP validation accuracy for predicting final network performance, and compare it against that of shortened-training, which is a common method for approximating network performance [49]....
[...]
...An important topic in neural architecture search (NAS) [49, 50] is the discovery of computationally cheap methods to predict the fully-trained performance of a given network....
[...]
..., computationally manageable tasks for approximating the performance of a neural network, are commonly used in neural architecture search [49, 56, 58, 59]....
[...]
Cites methods from "Neural Architecture Search with Reinforcement Learning"
...That is again attainable and comparable to other automatic techniques [21], [23], some of which used hundreds of GPUs; the method is therefore effective and computationally cheap....
[...]
...NASNet [75], by contrast, used 500 GPUs and needed four days for training....
[...]
...Neural Architecture Search (NAS) [21] is a cell (block)-based model that evolves blocks using a Recurrent Neural Network (RNN) [22]....
[...]
...Besides, NAS U-Net [38] needs two days to find a network; however, in this method natural images are used to find the network, and medical images are then used for network evaluation....
[...]
...NAS [21] utilised 800 GPUs and took 28 days for training....
[...]
References
"Neural Architecture Search with Rei..." refers methods in this paper
...Along with this success is a paradigm shift from feature designing to architecture designing, i.e., from SIFT (Lowe, 1999), and HOG (Dalal & Triggs, 2005), to AlexNet (Krizhevsky et al., 2012), VGGNet (Simonyan & Zisserman, 2014), GoogleNet (Szegedy et al., 2015), and ResNet (He et al., 2016a)....
[...]