Michael I. Jordan
Researcher at University of California, Berkeley
Publications - 1110
Citations - 241763
Michael I. Jordan is an academic researcher from the University of California, Berkeley. He has contributed to research on topics including computer science and inference. He has an h-index of 176 and has co-authored 1016 publications receiving 216204 citations. Previous affiliations of Michael I. Jordan include Stanford University & Princeton University.
Papers
Journal ArticleDOI
Robust sparse hyperplane classifiers: application to uncertain molecular profiling data.
TL;DR: The task of learning a robust sparse hyperplane from uncertain data is formulated as a second-order cone program (SOCP).
Posted Content
Ray: A Distributed Framework for Emerging AI Applications
Philipp Moritz, Robert Nishihara, Stephanie Wang, Alexey Tumanov, Richard Liaw, Eric Liang, Melih Elibol, Zongheng Yang, William Paul, Michael I. Jordan, Ion Stoica +10 more
TL;DR: Ray is a distributed system with a unified interface that can express both task-parallel and actor-based computations, backed by a single dynamic execution engine; it employs a distributed scheduler and a distributed, fault-tolerant store to manage control state.
Posted Content
Ancestor Sampling for Particle Gibbs
TL;DR: Proposes a particle Gibbs with ancestor sampling (PG-AS) method that improves the mixing of the particle MCMC kernel in a single forward sweep, instead of using separate forward and backward sweeps.
Proceedings Article
Learning Fine Motion by Markov Mixtures of Experts
Marina Meila, Michael I. Jordan +1 more
TL;DR: Presents a method to learn a model of fine motion from measured data that requires little or no prior knowledge; the resulting model explicitly estimates the state of contact.
Posted Content
Accelerated Gradient Descent Escapes Saddle Points Faster than Gradient Descent
TL;DR: Shows that a simple variant of Nesterov's accelerated gradient descent (AGD) achieves a faster convergence rate than gradient descent in the nonconvex setting.
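The idea behind this result can be illustrated with a minimal sketch of Nesterov-style accelerated gradient descent escaping a saddle region on a nonconvex toy function; the test function, step size, and momentum value below are illustrative assumptions, not the paper's exact perturbed variant:

```python
import numpy as np

def grad(x):
    # Gradient of the nonconvex test function f(x, y) = (x^2 - 1)^2 + y^2,
    # which has a saddle point at the origin and minima at (+1, 0), (-1, 0).
    return np.array([4 * x[0] * (x[0] ** 2 - 1), 2 * x[1]])

def agd(x0, lr=0.01, momentum=0.9, steps=2000):
    # Nesterov-style accelerated gradient descent: the gradient is
    # evaluated at the look-ahead point x + momentum * v.
    x = np.array(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        lookahead = x + momentum * v
        v = momentum * v - lr * grad(lookahead)
        x = x + v
    return x

# Start slightly off the saddle at the origin; the momentum term helps
# the iterate accelerate away from the flat region toward the minimum at (1, 0).
x_star = agd([0.01, 0.5])
```

This is a sketch of plain momentum dynamics, not the paper's analyzed algorithm, but it shows the qualitative behavior the TL;DR refers to: the iterate leaves the saddle neighborhood and converges to a local minimum.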