Yee Whye Teh
Researcher at University of Oxford
Publications - 351
Citations - 42930
Yee Whye Teh is an academic researcher from the University of Oxford. The author has contributed to research in topics: Computer science & Inference. The author has an h-index of 68 and has co-authored 326 publications receiving 36155 citations. Previous affiliations of Yee Whye Teh include University of Toronto & University College London.
Papers
Journal ArticleDOI
A fast learning algorithm for deep belief nets
TL;DR: A fast, greedy algorithm is derived that can learn deep, directed belief networks one layer at a time, provided the top two layers form an undirected associative memory.
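The greedy, layer-wise idea can be sketched with a minimal binary RBM trained by one-step contrastive divergence, then stacked: each layer's hidden activations become the "data" for the next layer. This is an illustrative toy on random data, not the paper's implementation; bias terms and the learning-rate schedule are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, n_hidden, lr=0.1, epochs=50):
    """One-step contrastive divergence (CD-1) for a binary RBM.
    Bias terms are omitted to keep the sketch short."""
    n_visible = data.shape[1]
    W = rng.normal(scale=0.01, size=(n_visible, n_hidden))
    for _ in range(epochs):
        v0 = data
        ph0 = sigmoid(v0 @ W)                               # positive phase
        h0 = (rng.uniform(size=ph0.shape) < ph0).astype(float)
        v1 = sigmoid(h0 @ W.T)                              # mean-field reconstruction
        ph1 = sigmoid(v1 @ W)                               # negative phase
        W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(data)
    return W

# Greedy stacking: train one layer at a time, feeding activations upward.
data = (rng.uniform(size=(100, 16)) < 0.5).astype(float)
W1 = train_rbm(data, n_hidden=8)
W2 = train_rbm(sigmoid(data @ W1), n_hidden=4)
```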
Journal ArticleDOI
Hierarchical Dirichlet Processes
TL;DR: This work considers problems involving groups of data where each observation within a group is a draw from a mixture model and where it is desirable to share mixture components between groups, and considers a hierarchical model, specifically one in which the base measure for the child Dirichlet processes is itself distributed according to a Dirichlet process.
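The sharing mechanism can be sketched with a truncated stick-breaking construction (a hypothetical toy; the truncation level and hyperparameters are chosen for illustration): global weights beta ~ GEM(gamma) pick out shared components, and each group's mixing weights pi_j ~ DP(alpha0, beta), which at a finite truncation reduces to a Dirichlet draw centred on beta.

```python
import numpy as np

rng = np.random.default_rng(0)

K = 20                 # truncation level for the global DP (illustrative)
gamma, alpha0 = 1.0, 5.0

# Global stick-breaking weights beta ~ GEM(gamma), truncated at K sticks.
v = rng.beta(1.0, gamma, size=K)
v[-1] = 1.0            # close the stick at the truncation
beta = v * np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))

# Each group's mixing weights are a fresh draw centred on the shared beta:
# pi_j ~ DP(alpha0, beta) ~= Dirichlet(alpha0 * beta) at this truncation,
# so all groups reuse the same global components with group-specific weights.
groups = [rng.dirichlet(alpha0 * beta) for _ in range(3)]
```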
Proceedings Article
Bayesian Learning via Stochastic Gradient Langevin Dynamics
Max Welling, Yee Whye Teh +1 more
TL;DR: This paper proposes a new framework for learning from large-scale datasets based on iterative learning from small mini-batches. By adding the right amount of noise to a standard stochastic gradient optimization algorithm, it shows that the iterates converge to samples from the true posterior distribution as the stepsize is annealed.
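The update can be sketched on a toy Gaussian-mean problem (a hedged illustration; the data, prior, and step size are invented for the example): each step takes a half-step along the minibatch gradient of the log posterior, with the likelihood term rescaled by N/n to keep it unbiased, and injects Gaussian noise whose variance equals the step size.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: estimate the mean of a Gaussian with known unit variance.
N = 1000
data = rng.normal(loc=2.0, scale=1.0, size=N)

prior_var = 10.0       # prior: theta ~ N(0, prior_var)
batch = 50
eps = 1e-3             # step size; in the paper this would be annealed
theta = 0.0
samples = []

for t in range(5000):
    x = rng.choice(data, size=batch, replace=False)
    # Gradient of the log posterior, minibatch likelihood rescaled by N / batch.
    grad = -theta / prior_var + (N / batch) * np.sum(x - theta)
    # Langevin update: half-step gradient plus noise of variance eps.
    theta = theta + 0.5 * eps * grad + rng.normal(scale=np.sqrt(eps))
    if t > 1000:       # discard burn-in
        samples.append(theta)
```

The collected `samples` then approximate the posterior over the mean, whose centre sits essentially at the sample mean of the data.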
Proceedings Article
The Concrete Distribution: A Continuous Relaxation of Discrete Random Variables
TL;DR: This paper introduces the Concrete distribution, a new family of distributions with closed-form densities and a simple reparameterization, which enables optimizing large-scale stochastic computation graphs via gradient descent.
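The reparameterization can be sketched as Gumbel noise added to the logits, followed by a tempered softmax (a minimal illustration; the probabilities and temperatures here are arbitrary): as the temperature goes to zero, samples concentrate on one-hot vectors, recovering the discrete variable.

```python
import numpy as np

rng = np.random.default_rng(0)

def concrete_sample(logits, temperature):
    """Draw one Concrete (Gumbel-softmax) sample: a point on the simplex
    that concentrates on one-hot vectors as temperature -> 0."""
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + gumbel) / temperature
    y = y - y.max()            # stabilize the softmax numerically
    expy = np.exp(y)
    return expy / expy.sum()

logits = np.log(np.array([0.2, 0.3, 0.5]))   # arbitrary class probabilities
soft = concrete_sample(logits, temperature=1.0)
```

Because the sample is a deterministic, differentiable function of the logits given the Gumbel noise, gradients can flow through it, which is what makes the relaxation usable inside a stochastic computation graph.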
Posted Content
The Concrete Distribution: A Continuous Relaxation of Discrete Random Variables
TL;DR: Concrete random variables, continuous relaxations of discrete random variables, are a new family of distributions with closed-form densities and a simple reparameterization; the effectiveness of Concrete relaxations is demonstrated on density estimation and structured prediction tasks using neural networks.