Dropout: a simple way to prevent neural networks from overfitting
Citations
Cites background from "Dropout: a simple way to prevent neural networks from overfitting"
...This assumption, however, might restrict modeling capacity, as graph edges need not necessarily encode node similarity, but could contain additional information....
References
"Dropout: a simple way to prevent neural networks from overfitting" refers to background or methods in this paper
...Method                                                Code Quality (bits)
Neural Network (early stopping) (Xiong et al., 2011)     440
Regression, PCA (Xiong et al., 2011)                     463
SVM, PCA (Xiong et al., 2011)                            487
Neural Network with dropout                              567
Bayesian Neural Network (Xiong et al....

...Xiong et al. (2011) used Bayesian neural nets for this task....

...This can sometimes be approximated quite well for simple or small models (Xiong et al., 2011; Salakhutdinov and Mnih, 2008), but we would like to approach the performance of the Bayesian gold standard using considerably less computation....

...The data set that we use (Xiong et al., 2011) comes from the domain of genetics....
"Dropout: a simple way to prevent neural networks from overfitting" refers to methods in this paper
...We used the excellent CUDA libraries—cudamat (Mnih, 2009) and cuda-convnet (Krizhevsky et al....
"Dropout: a simple way to prevent neural networks from overfitting" refers to background in this paper
...A motivation for dropout comes from a theory of the role of sex in evolution (Livnat et al., 2010)....
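For reference, the technique named in the paper's title can be sketched in a few lines. This is a minimal NumPy illustration, not the authors' code: it uses the "inverted" dropout convention (survivors scaled by 1/(1 - p) during training, no scaling at test time), whereas the paper itself scales the weights by the retention probability at test time; the two are equivalent in expectation.

```python
import numpy as np

def dropout_forward(x, p_drop, rng, train=True):
    """Apply dropout to an array of activations x.

    During training, each unit is zeroed independently with probability
    p_drop, and surviving units are scaled by 1/(1 - p_drop) so the
    expected activation matches test time. At test time the input is
    returned unchanged.
    """
    if not train or p_drop == 0.0:
        return x, None
    keep = 1.0 - p_drop
    # Boolean keep-mask, folded together with the 1/keep scaling.
    mask = (rng.random(x.shape) < keep) / keep
    return x * mask, mask

rng = np.random.default_rng(0)
x = np.ones((4, 8))
y, mask = dropout_forward(x, p_drop=0.5, rng=rng)
# With p_drop=0.5, each entry of y is either 0 (dropped) or 2.0 (kept
# and scaled), so the expected value of every entry remains 1.0.
```

A fresh mask is drawn for every training minibatch, which is what makes the procedure act like averaging over an exponential number of thinned networks.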