Dropout as a Bayesian approximation: representing model uncertainty in deep learning
Citations
1,557 citations
1,257 citations
1,114 citations
950 citations
Cites background or methods from "Dropout as a Bayesian approximation..."
...Moving forward, we might expect that a variety of tools developed for addressing distributional shift and facilitating generalization may find use in offline RL algorithms, including techniques from causal inference (Schölkopf, 2019), uncertainty estimation (Gal and Ghahramani, 2016; Kendall and Gal, 2017), density estimation and generative modeling (Kingma et al., 2014), distributional robustness (Sinha…...
[...]
809 citations
Cites methods from "Dropout as a Bayesian approximation..."
...To combat this, Monte Carlo (MC) dropout has been introduced, which uses dropout [29] as a regularization term to compute the prediction uncertainty [30]....
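The MC dropout recipe the excerpt above refers to — keeping dropout active at test time and averaging several stochastic forward passes — can be sketched in a few lines. The two-layer network, random weights, and dropout rate below are illustrative assumptions for the sketch, not taken from any of the cited papers:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network; weights are illustrative, not trained.
W1 = rng.normal(size=(3, 16))
W2 = rng.normal(size=(16, 1))

def forward(x, p_drop=0.5):
    """One stochastic forward pass with dropout kept ON at test time."""
    h = np.maximum(x @ W1, 0.0)          # ReLU hidden layer
    mask = rng.random(h.shape) > p_drop  # Bernoulli dropout mask
    h = h * mask / (1.0 - p_drop)        # inverted-dropout scaling
    return h @ W2

def mc_dropout_predict(x, T=100):
    """MC dropout: T stochastic passes -> predictive mean and spread."""
    samples = np.stack([forward(x) for _ in range(T)])
    return samples.mean(axis=0), samples.std(axis=0)

x = rng.normal(size=(1, 3))
mean, std = mc_dropout_predict(x)  # std serves as the uncertainty estimate
```

The per-input standard deviation across passes is what the cited works use as a prediction-uncertainty signal.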
[...]
...They modeled the uncertainty by leveraging Bayesian active learning (BAL) using dropout sampling....
[...]
...[Figure: Monte Carlo Dropout framework; input: left atrium 3D MRI images; noise ξ and ξ′; labelled set 𝐷𝐿 and unlabelled set 𝐷𝑈; losses ℒ𝑐 and ℒ𝑠; an uncertainty map guides the estimate of the distribution of the output segmentation.]...
[...]
...To alleviate [Figure: (a) baseline neural network; (b) Bernoulli DropConnect; (c) Gaussian DropConnect; (d) Bernoulli Dropout; (e) Gaussian Dropout; (f) Spike-and-Slab Dropout]...
[...]
...Kahn et al. [173] used both bootstrapping and dropout methods to estimate uncertainty in NNs, which was then used in a UA collision prediction model....
[...]
References
111,197 citations
"Dropout as a Bayesian approximation..." refers to methods in this paper
...Finally, we used mini-batches of size 32 and the Adam optimiser (Kingma & Ba, 2014)....
[...]
42,067 citations
33,597 citations
"Dropout as a Bayesian approximation..." refers to background or methods in this paper
...Dropout is used in many models in deep learning as a way to avoid over-fitting (Srivastava et al., 2014), and our interpretation suggests that dropout approximately integrates over the models’ weights....
[...]
...Furthermore, our results carry to other variants of dropout as well (such as drop-connect (Wan et al., 2013), multiplicative Gaussian noise (Srivastava et al., 2014), etc.)....
[...]
...In this paper we give a complete theoretical treatment of the link between Gaussian processes and dropout, and develop the tools necessary to represent uncertainty in deep learning....
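As context for the excerpt above, the uncertainty representation developed in the paper reduces to simple moment-matching estimators over T stochastic forward passes (notation follows Gal and Ghahramani, 2016: Ŵ_t are dropout-sampled weights and τ is the model precision):

```latex
% Predictive mean: average of T stochastic forward passes
\mathbb{E}[\mathbf{y}^*] \approx \frac{1}{T} \sum_{t=1}^{T}
    \hat{\mathbf{y}}^*\!\left(\mathbf{x}^*, \widehat{\mathbf{W}}_1^t, \widehat{\mathbf{W}}_2^t\right)

% Predictive variance: sample second moment of the passes,
% corrected by the inverse model precision
\operatorname{Var}[\mathbf{y}^*] \approx \tau^{-1}\mathbf{I}_D
    + \frac{1}{T} \sum_{t=1}^{T}
      \hat{\mathbf{y}}^{*\top}\!\left(\mathbf{x}^*, \widehat{\mathbf{W}}_1^t, \widehat{\mathbf{W}}_2^t\right)
      \hat{\mathbf{y}}^{*}\!\left(\mathbf{x}^*, \widehat{\mathbf{W}}_1^t, \widehat{\mathbf{W}}_2^t\right)
    - \mathbb{E}[\mathbf{y}^*]^{\top}\mathbb{E}[\mathbf{y}^*]
```

These are the quantities the citing works above compute in practice when they report "MC dropout uncertainty".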
[...]
23,074 citations
20,769 citations
"Dropout as a Bayesian approximation..." refers to background or methods in this paper
...Recent advances in variational inference introduced new techniques into the field such as sampling-based variational inference and stochastic variational inference (Blei et al., 2012; Kingma & Welling, 2013; Rezende et al., 2014; Titsias & Lázaro-Gredilla, 2014; Hoffman et al., 2013)....
[...]