Dropout: a simple way to prevent neural networks from overfitting
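For context, a minimal sketch of the technique the title names (an illustration, not the paper's reference code; all names below are ours). During training, each unit is retained with probability p and zeroed otherwise; the paper then multiplies the weights by p at test time, whereas the common "inverted" variant sketched here rescales the surviving activations by 1/p during training so the test-time network needs no adjustment.

    import numpy as np

    rng = np.random.default_rng(0)

    def dropout(activations, p_keep=0.5, train=True):
        """Inverted dropout on one layer's activations.

        Each unit is kept with probability p_keep and zeroed otherwise;
        survivors are scaled by 1/p_keep so the expected activation
        matches the full (test-time) network. The original paper keeps
        activations unscaled in training and scales weights by p_keep
        at test time instead; the two agree in expectation.
        """
        if not train:
            return activations                    # full network at test time
        mask = rng.random(activations.shape) < p_keep
        return activations * mask / p_keep

    # Toy usage: a hidden layer of 5 units for a batch of 2 examples.
    h = np.array([[0.2, 1.1, 0.3, 0.8, 0.5],
                  [1.0, 0.4, 0.9, 0.2, 0.7]])
    print(dropout(h, p_keep=0.8))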
Citations
[...]
Cites background from "Dropout: a simple way to prevent neural networks from overfitting":
...This assumption, however, might restrict modeling capacity, as graph edges need not necessarily encode node similarity, but could contain additional information....
[...]
References
"Dropout: a simple way to prevent ne..." refers methods in this paper
...Unsupervised Pretraining: Neural networks can be pretrained using stacks of RBMs (Hinton and Salakhutdinov, 2006), autoencoders (Vincent et al., 2010), or Deep Boltzmann Machines (Salakhutdinov and Hinton, 2009)....
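To make the stacking idea concrete, here is a sketch under simplifying assumptions: plain sigmoid autoencoders with tied weights, trained by gradient descent, stand in for the RBM/DBM stacks of the cited works, and all function and variable names are illustrative. Each layer learns to reconstruct the codes of the layer below; the learned encoders then initialize a deep network for fine-tuning.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def train_autoencoder(X, n_hidden, epochs=200, lr=0.5):
        """Fit a one-hidden-layer autoencoder with tied weights by
        full-batch gradient descent; returns the encoder parameters."""
        n, n_in = X.shape
        W = rng.normal(0.0, 0.1, (n_in, n_hidden))
        b, c = np.zeros(n_hidden), np.zeros(n_in)
        for _ in range(epochs):
            H = sigmoid(X @ W + b)                # encode
            R = sigmoid(H @ W.T + c)              # decode via transposed weights
            dR = (R - X) * R * (1.0 - R)          # grad of 0.5*||R - X||^2
            dH = (dR @ W) * H * (1.0 - H)
            W -= lr * (X.T @ dH + dR.T @ H) / n   # encoder + decoder terms
            b -= lr * dH.mean(axis=0)
            c -= lr * dR.mean(axis=0)
        return W, b

    def greedy_pretrain(X, layer_sizes):
        """Train an autoencoder on the data, another on its hidden
        codes, and so on; the stacked encoders initialize a deep net."""
        params, inp = [], X
        for n_hidden in layer_sizes:
            W, b = train_autoencoder(inp, n_hidden)
            params.append((W, b))
            inp = sigmoid(inp @ W + b)            # codes feed the next layer
        return params

    X = rng.random((64, 20))                      # toy data: 64 cases, 20 features
    stack = greedy_pretrain(X, [16, 8])
    print([W.shape for W, _ in stack])            # [(20, 16), (16, 8)]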
[...]
...The idea of adding noise to the states of units has previously been used in the context of Denoising Autoencoders (DAEs) by Vincent et al. (2008, 2010) where noise is added to the input units of an autoencoder and the network is trained to reconstruct the noise-free input....
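The mechanical link to dropout is that corruption step: a DAE zeroes a random subset of input units (masking noise, one of the corruptions studied by Vincent et al.) and is trained to reconstruct the original, noise-free input. A minimal sketch of one loss evaluation, with illustrative parameter names and tied weights assumed:

    import numpy as np

    rng = np.random.default_rng(1)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def dae_loss(x, W, b, c, corrupt=0.3):
        """Zero a random fraction of input units, encode/decode the
        corrupted input, and score reconstruction against the
        ORIGINAL, uncorrupted input."""
        x_tilde = x * (rng.random(x.shape) >= corrupt)  # masking noise
        h = sigmoid(x_tilde @ W + b)                    # encode corrupted x
        r = sigmoid(h @ W.T + c)                        # decode (tied weights)
        return 0.5 * np.sum((r - x) ** 2)               # target is clean x

    x = rng.random((4, 10))               # toy batch: 4 examples, 10 inputs
    W = rng.normal(0.0, 0.1, (10, 6))
    print(dae_loss(x, W, np.zeros(6), np.zeros(10)))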
[...]
"Dropout: a simple way to prevent ne..." refers background in this paper
...On the other hand, Bayesian neural networks (Neal, 1996) are the proper way of doing model averaging over the space of neural network structures and parameters....
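The averaging the excerpt points to is the posterior predictive, p(y|x, D) = E_{theta ~ p(theta|D)}[p(y|x, theta)], in practice approximated by averaging the predictions of many sampled networks. A toy Monte Carlo sketch, where the Gaussian "posterior" over the weights of a single softmax layer is a stand-in purely for illustration:

    import numpy as np

    rng = np.random.default_rng(2)

    def softmax(z):
        e = np.exp(z - z.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)

    x = rng.random(8)                     # one 8-dimensional input
    W_mean = rng.normal(0.0, 1.0, (8, 3)) # stand-in posterior mean
    S = 100                               # number of posterior samples
    preds = np.zeros(3)
    for _ in range(S):
        W_s = W_mean + rng.normal(0.0, 0.1, W_mean.shape)  # sample weights
        preds += softmax(x @ W_s)         # p(y | x, theta_s)
    preds /= S                            # Monte Carlo average
    print(preds)                          # approximate predictive distribution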
[...]
"Dropout: a simple way to prevent ne..." refers background or methods in this paper
...Method | Unit Type | Architecture | Error %
Standard Neural Net (Simard et al., 2003) | Logistic | 2 layers, 800 units | 1.60...
[...]
...The best performing neural networks for the permutation invariant setting that do not use dropout or unsupervised pretraining achieve an error of about 1.60% (Simard et al., 2003)....
[...]
"Dropout: a simple way to prevent ne..." refers background in this paper
...Replacing logistic units with rectified linear units (ReLUs) (Jarrett et al., 2009) further reduces the error to 1.25%....
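The substitution in that excerpt is a one-line change of activation function; for reference, a sketch (not the paper's code):

    import numpy as np

    def logistic(z):
        return 1.0 / (1.0 + np.exp(-z))   # saturates for large |z|

    def relu(z):
        return np.maximum(0.0, z)         # rectified linear unit

    z = np.linspace(-3.0, 3.0, 7)
    print(logistic(z))
    print(relu(z))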
[...]