Open Access · Proceedings Article
Streaming Variational Bayes
Tamara Broderick, Nicholas Boyd, Andre Wibisono, Ashia C. Wilson, Michael I. Jordan
Vol. 26, pp. 1727–1735
TL;DR: SDA-Bayes is presented, a framework for streaming updates to an estimated Bayesian posterior, with variational Bayes (VB) as the primitive; the framework's usefulness is demonstrated by fitting the latent Dirichlet allocation model to two large-scale document collections.

Abstract:
We present SDA-Bayes, a framework for (S)treaming, (D)istributed, (A)synchronous computation of a Bayesian posterior. The framework makes streaming updates to the estimated posterior according to a user-specified approximation batch primitive. We demonstrate the usefulness of our framework, with variational Bayes (VB) as the primitive, by fitting the latent Dirichlet allocation model to two large-scale document collections. We demonstrate the advantages of our algorithm over stochastic variational inference (SVI) by comparing the two after a single pass through a known amount of data—a case where SVI may be applied—and in the streaming setting, where SVI does not apply.
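The streaming-update idea in the abstract can be sketched in a few lines: the posterior computed from each data batch becomes the prior for the next batch. This is a minimal, hypothetical sketch in which the batch primitive is an exact conjugate Beta–Bernoulli update standing in for an approximation such as VB; all function names are illustrative and not from the paper's code.

```python
# Streaming posterior updates: fold a batch primitive over a data stream,
# feeding each batch's posterior back in as the next batch's prior.
# Here the primitive is an exact Beta-Bernoulli conjugate update; in
# SDA-Bayes it would be a user-specified approximation such as VB.

def batch_primitive(prior, batch):
    """Update Beta(alpha, beta) pseudo-counts with a batch of 0/1 outcomes."""
    alpha, beta = prior
    successes = sum(batch)
    return (alpha + successes, beta + len(batch) - successes)

def streaming_posterior(prior, batches):
    """Fold the batch primitive over a stream of mini-batches."""
    post = prior
    for batch in batches:
        post = batch_primitive(post, batch)  # posterior becomes next prior
    return post

posterior = streaming_posterior((1.0, 1.0), [[1, 1, 0], [1, 0], [1, 1, 1]])
print(posterior)  # (7.0, 3.0): 6 successes, 2 failures on a Beta(1, 1) prior
```

Because the conjugate update is exact, the streamed result equals the batch result here; with an approximate primitive, the quality of the streamed posterior depends on the primitive, which is the trade-off the paper studies.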
Citations
Journal Article
Advances in Variational Inference
TL;DR: Variational inference (VI) approximates a high-dimensional Bayesian posterior with a simpler variational distribution by solving an optimization problem; the approach has been successfully applied to various models and large-scale applications.
Posted Content
Advances in Variational Inference
TL;DR: An overview of recent trends in variational inference is given and a summary of promising future research directions is provided.
Posted Content
Variational Federated Multi-Task Learning
Luca Corinzia, Joachim M. Buhmann +1 more
TL;DR: In VIRTUAL, the federated network of server and clients is treated as a star-shaped Bayesian network, learning is performed on the network using approximate variational inference, and the method is shown to be effective on real-world federated datasets.
Proceedings Article
Scalable and Robust Bayesian Inference via the Median Posterior
TL;DR: This work proposes a novel, general approach to Bayesian inference that is scalable and robust to corruption in the data, based on splitting the data into non-overlapping subgroups, evaluating the posterior distribution for each subgroup independently, and then combining the results.
Proceedings Article
Rényi Divergence Variational Inference
Yingzhen Li, Richard E. Turner +1 more
TL;DR: The variational Rényi bound (VR) extends traditional variational inference to Rényi's α-divergences, enabling a smooth interpolation from the evidence lower bound to the log (marginal) likelihood, controlled by the value of α that parametrises the divergence.
References
Journal Article
Latent Dirichlet Allocation
TL;DR: This work proposes a generative model for text and other collections of discrete data that generalizes or improves on several previous models including naive Bayes/unigram, mixture of unigrams, and Hofmann's aspect model.
Book
Graphical Models, Exponential Families, and Variational Inference
TL;DR: The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models.
Journal Article
Stochastic variational inference
TL;DR: Stochastic variational inference makes it possible to apply complex Bayesian models to massive data sets; the Bayesian nonparametric topic model is shown to outperform its parametric counterpart.
Proceedings Article
Hogwild: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent
TL;DR: The authors present an update scheme called HOGWILD!, which allows processors to access shared memory with the possibility of overwriting each other's work, and show that it achieves a nearly optimal rate of convergence.
Proceedings Article
Online Learning for Latent Dirichlet Allocation
TL;DR: An online variational Bayes (VB) algorithm for Latent Dirichlet Allocation (LDA) based on online stochastic optimization with a natural gradient step is developed, and is shown to converge to a local optimum of the VB objective function.
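The online stochastic optimization in this last reference blends a per-mini-batch estimate of the global variational parameter into the current value with a decaying step size. A minimal sketch, assuming the commonly used Robbins–Monro schedule with parameters tau and kappa and a scalar parameter for clarity; `estimates` stands in for the per-batch computations a real implementation would perform:

```python
# Sketch of a stochastic update with a Robbins-Monro step-size schedule,
# as used by online VB / SVI: blend each mini-batch estimate lam_hat into
# the global parameter lam with step size rho_t = (t + tau)^(-kappa).
# tau and kappa here follow common SVI notation; kappa in (0.5, 1] ensures
# the schedule satisfies the Robbins-Monro conditions.

def step_size(t, tau=1.0, kappa=0.7):
    """Robbins-Monro schedule: sum of rho_t diverges, sum of rho_t^2 converges."""
    return (t + tau) ** (-kappa)

def online_updates(lam, estimates, tau=1.0, kappa=0.7):
    """Blend each mini-batch estimate into the global parameter lam."""
    for t, lam_hat in enumerate(estimates):
        rho = step_size(t, tau, kappa)
        lam = (1.0 - rho) * lam + rho * lam_hat
    return lam
```

With tau = 1.0 the first step size is rho_0 = 1, so the first mini-batch estimate replaces the initial value outright; later batches are blended in with progressively smaller weights.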