
Michael I. Jordan

Researcher at University of California, Berkeley

Publications: 1,110
Citations: 241,763

Michael I. Jordan is an academic researcher at the University of California, Berkeley. He has contributed to research topics including computer science and inference, has an h-index of 176, and has co-authored 1,016 publications receiving 216,204 citations. His previous affiliations include Stanford University and Princeton University.

Papers
Proceedings Article

Towards Accurate Model Selection in Deep Unsupervised Domain Adaptation

TL;DR: This work proposes Deep Embedded Validation (DEV), which embeds the adapted feature representation into the validation procedure to obtain an unbiased estimate of the target risk with bounded variance.
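
The flavor of this estimator can be illustrated as an importance-weighted validation risk with a control variate. The sketch below is a rough approximation, not the paper's reference implementation: the function name is hypothetical, and it assumes the density ratios come from some source-vs-target domain discriminator.

```python
import numpy as np

def dev_risk_estimate(losses, weights):
    """Importance-weighted estimate of target risk with a control
    variate, in the spirit of Deep Embedded Validation (sketch only).

    losses  : per-example losses on held-out labeled source data
    weights : density ratios w(x) ~ p_target(x) / p_source(x), e.g.
              derived from a source-vs-target domain discriminator
    """
    weighted = np.asarray(losses) * np.asarray(weights)
    cov = np.cov(weighted, weights)      # 2x2 sample covariance matrix
    eta = cov[0, 1] / cov[1, 1]          # control-variate coefficient
    # For exact density ratios E[w] = 1, so subtracting
    # eta * (mean(w) - 1) preserves unbiasedness while reducing variance.
    return float(np.mean(weighted) - eta * (np.mean(weights) - 1.0))
```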
Proceedings Article

MAD-Bayes: MAP-based Asymptotic Derivations from Bayes

TL;DR: The authors apply small-variance asymptotics directly to the posterior in Bayesian nonparametric models, obtaining novel objective functions that go beyond clustering to learn (and penalize new) groupings, relaxing the mutual exclusivity and exhaustivity assumptions of standard clustering.
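
In the simplest special case, a Dirichlet-process mixture, this small-variance limit is known to recover a k-means-style objective with a per-cluster penalty, roughly of the form below (a sketch of the objective's flavor; the paper's objectives generalize well beyond it):

$$\min_{K,\;\{z_i\},\;\{\mu_k\}} \;\sum_{i=1}^{n} \lVert x_i - \mu_{z_i} \rVert_2^2 \;+\; \lambda K,$$

where $\lambda > 0$ is the penalty on the number of clusters $K$ induced by the asymptotics.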
Proceedings Article

The Missing Piece in Complex Analytics: Low Latency, Scalable Model Management and Serving with Velox

TL;DR: This paper describes the challenges and architectural considerations required to achieve this functionality, including the ability to span online and offline systems, to adaptively adjust model materialization strategies, and to exploit inherent statistical properties such as model error tolerance, all while operating at "Big Data" scale.
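
One hedged illustration of the online/offline split such a serving system needs (class and method names are assumptions for this sketch, not Velox's actual API): a feature model trained offline in batch stays fixed, while lightweight per-user weights are updated online as feedback arrives.

```python
import numpy as np

class HybridModel:
    """Sketch of an online/offline split for model serving:
    a feature map trained offline in batch plus per-user linear
    weights updated online. Illustrative only."""

    def __init__(self, feature_map, dim, lr=0.05):
        self.feature_map = feature_map   # trained offline, held fixed
        self.weights = {}                # per-user weights, updated online
        self.dim = dim
        self.lr = lr

    def predict(self, user, item):
        w = self.weights.setdefault(user, np.zeros(self.dim))
        return float(w @ self.feature_map(item))

    def observe(self, user, item, rating):
        # One online SGD step on squared error; full retraining of the
        # feature map happens separately, in the offline batch system.
        f = self.feature_map(item)
        w = self.weights.setdefault(user, np.zeros(self.dim))
        w += self.lr * (rating - w @ f) * f
```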
Proceedings Article

Approximating Posterior Distributions in Belief Networks Using Mixtures

TL;DR: This paper derives an efficient algorithm for updating the mixture parameters, applies it to the problem of learning in sigmoid belief networks, and demonstrates a systematic improvement over simple mean field theory as the number of mixture components is increased.
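
The key identity behind this approach, as usually stated in the mixture mean-field literature (sketched here from memory), is that a mixture approximation $q(H) = \sum_m \alpha_m q_m(H)$ yields a variational lower bound equal to the average of the component bounds plus a non-negative mutual-information term:

$$\mathcal{L}[q_{\mathrm{mix}}] \;=\; \sum_{m} \alpha_m \, \mathcal{L}[q_m] \;+\; I(m; H), \qquad I(m; H) \ge 0,$$

so the mixture can only tighten the bound relative to averaging the single mean-field solutions.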
Posted Content

Local Privacy, Data Processing Inequalities, and Statistical Minimax Rates

TL;DR: This work proves bounds on information-theoretic quantities, including mutual information and Kullback-Leibler divergence, that depend on the privacy guarantees, and provides a treatment of several canonical families of problems: mean estimation, parameter estimation in fixed-design regression, multinomial probability estimation, and nonparametric density estimation.
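
For concreteness, a channel $Q$ releasing $Z$ given data $X$ is $\alpha$-locally differentially private when

$$\sup_{S}\;\sup_{x,\,x'} \frac{Q(Z \in S \mid X = x)}{Q(Z \in S \mid X = x')} \;\le\; e^{\alpha},$$

and the paper's central quantitative data-processing inequality is, as I recall it (constants hedged), of the form

$$D_{\mathrm{kl}}(M_1 \,\|\, M_2) + D_{\mathrm{kl}}(M_2 \,\|\, M_1) \;\le\; 4\,(e^{\alpha} - 1)^2\, \lVert P_1 - P_2 \rVert_{\mathrm{TV}}^2,$$

where $M_i$ is the marginal distribution of the privatized output $Z$ when $X \sim P_i$. This contraction of divergences under private channels is what drives the minimax lower bounds for the canonical estimation problems listed above.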