Neural Architecture Search with Reinforcement Learning
Citations
82 citations
Cites methods from "Neural Architecture Search with Reinforcement Learning"
...In machine learning, many problems can be formulated as bilevel programs, including hyperparameter optimization, GAN training (Goodfellow et al., 2014), meta-learning, and neural architecture search (Zoph & Le, 2016)....
[...]
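The bilevel view in the excerpt above can be written out explicitly. A standard formulation (as used in differentiable NAS work; the symbols here are illustrative, with $w$ denoting network weights and $\alpha$ the architecture or hyperparameters):

```latex
\min_{\alpha} \; \mathcal{L}_{\mathrm{val}}\bigl(w^{*}(\alpha),\, \alpha\bigr)
\quad \text{s.t.} \quad
w^{*}(\alpha) = \arg\min_{w} \; \mathcal{L}_{\mathrm{train}}(w,\, \alpha)
```

Hyperparameter optimization, GAN training, meta-learning, and NAS each instantiate this template with different choices of inner loss $\mathcal{L}_{\mathrm{train}}$ and outer loss $\mathcal{L}_{\mathrm{val}}$.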
81 citations
Cites background from "Neural Architecture Search with Reinforcement Learning"
...Despite many advancements in NAS algorithms [43, 24, 1, 19, 5, 32, 40, 30, 13, 10], it is remarkable that inverted bottlenecks (IBN) [27] remain the predominant building block in state-of-the-art mobile models....
[...]
81 citations
Cites background from "Neural Architecture Search with Reinforcement Learning"
...Therefore, NAS is always formulated as a hyper-parameter optimization problem, whose algorithmic realization spans evolutionary algorithms [21, 7], reinforcement learning [27], Bayesian optimization [9], Monte Carlo Tree Search [24], and differentiable architecture search [14, 26, 3]....
[...]
...Basically, SNAS is a differentiable NAS framework that maintains the generative nature of reinforcement-learning-based methods [27]....
[...]
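The RL-based NAS approach referenced as [27] in the excerpts above trains a controller with policy gradients to generate architectures. A minimal toy sketch of that idea is below; the operation menu, the per-layer softmax controller, and the reward stub are all illustrative simplifications (the paper itself uses an RNN controller and real validation accuracy as the reward):

```python
import math
import random

# Hypothetical toy search space: pick one operation per layer from a small menu.
OPS = ["conv3x3", "conv5x5", "maxpool"]
NUM_LAYERS = 2

# Controller: independent softmax logits per layer (a stand-in for an RNN controller).
logits = [[0.0] * len(OPS) for _ in range(NUM_LAYERS)]

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def sample_architecture():
    """Sample one operation index per layer from the controller's policy."""
    arch = []
    for layer_logits in logits:
        probs = softmax(layer_logits)
        arch.append(random.choices(range(len(OPS)), weights=probs)[0])
    return arch

def toy_reward(arch):
    # Stub for "train the child network, return validation accuracy":
    # here we simply pretend conv3x3 everywhere is best.
    return sum(1.0 for op in arch if OPS[op] == "conv3x3") / NUM_LAYERS

def reinforce_step(lr=0.5, baseline=0.5):
    """One REINFORCE update: d/d logit of log p(choice) = onehot - probs."""
    arch = sample_architecture()
    advantage = toy_reward(arch) - baseline
    for layer, choice in enumerate(arch):
        probs = softmax(logits[layer])
        for k in range(len(OPS)):
            grad = (1.0 if k == choice else 0.0) - probs[k]
            logits[layer][k] += lr * advantage * grad

random.seed(0)
for _ in range(500):
    reinforce_step()

# Greedy decode of the trained controller.
best = [max(range(len(OPS)), key=lambda k: logits[layer][k])
        for layer in range(NUM_LAYERS)]
print([OPS[i] for i in best])
```

The baseline subtraction reduces the variance of the gradient estimate, a trick RL-based NAS methods also rely on since each reward evaluation (training a child network) is expensive.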
References
"Neural Architecture Search with Reinforcement Learning" refers to methods in this paper
...Along with this success is a paradigm shift from feature designing to architecture designing, i.e., from SIFT (Lowe, 1999), and HOG (Dalal & Triggs, 2005), to AlexNet (Krizhevsky et al., 2012), VGGNet (Simonyan & Zisserman, 2014), GoogleNet (Szegedy et al., 2015), and ResNet (He et al., 2016a)....
[...]