
Michael I. Jordan

Researcher at University of California, Berkeley

Publications -  1110
Citations -  241763

Michael I. Jordan is an academic researcher from the University of California, Berkeley. The author has contributed to research in topics including computer science and inference. The author has an h-index of 176 and has co-authored 1016 publications receiving 216204 citations. Previous affiliations of Michael I. Jordan include Stanford University and Princeton University.

Papers
Proceedings Article

Information-theoretic lower bounds for distributed statistical estimation with communication constraints

TL;DR: Lower bounds on minimax risks for distributed statistical estimation under a communication budget are established for several problems, including various types of location models, as well as for parameter estimation in regression models.
Journal ArticleDOI

Communication-Efficient Distributed Statistical Inference

TL;DR: In this paper, a communication-efficient surrogate likelihood (CSL) framework for distributed statistical inference problems is presented, which replaces the global likelihood with a surrogate that can be optimized after a single round of gradient communication.
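To give a flavor of the surrogate-likelihood idea, here is a toy sketch (a simplified rendering for least-squares regression, not the paper's exact construction): the first machine keeps its full local loss and adds a linear correction built from the globally averaged gradient at a reference point, so only gradient vectors, never raw data, are communicated. The data sizes and the reference point `theta_bar = 0` are assumptions of the example.

```python
import numpy as np

rng = np.random.default_rng(1)
theta_true = np.array([2.0, -1.0])

# Data scattered across m machines, n points each.
m, n = 8, 200
X = [rng.normal(size=(n, 2)) for _ in range(m)]
y = [Xi @ theta_true + 0.1 * rng.normal(size=n) for Xi in X]

def local_grad(Xi, yi, theta):
    """Gradient of the local least-squares loss (1/2n)||Xi @ theta - yi||^2."""
    return Xi.T @ (Xi @ theta - yi) / len(yi)

theta_bar = np.zeros(2)  # reference estimate (e.g. from a first round)
global_grad = np.mean(
    [local_grad(Xi, yi, theta_bar) for Xi, yi in zip(X, y)], axis=0
)

# Surrogate loss on machine 1:
#   L~(theta) = L1(theta) - <grad L1(theta_bar) - global_grad, theta>
# Minimizing it needs only machine 1's data plus one communicated gradient.
correction = local_grad(X[0], y[0], theta_bar) - global_grad
A = X[0].T @ X[0] / n                 # local Hessian of L1
b = X[0].T @ y[0] / n + correction    # sets grad of surrogate to zero
theta_hat = np.linalg.solve(A, b)
```

The resulting `theta_hat` uses curvature only from machine 1 but first-order information from all machines, which is the essence of the communication savings.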
Posted Content

A Kernelized Stein Discrepancy for Goodness-of-fit Tests and Model Evaluation

TL;DR: In this paper, a new discrepancy statistic for measuring differences between two probability distributions based on combining Stein's identity with the reproducing kernel Hilbert space theory is derived, and applied to test how well a probabilistic model fits a set of observations.
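As an illustration of combining Stein's identity with an RKHS kernel (a minimal one-dimensional sketch, not the paper's implementation), the discrepancy below tests samples against a standard normal model; the RBF bandwidth `h` is an assumption of the example.

```python
import numpy as np

def ksd_gaussian(x, h=1.0):
    """V-statistic estimate of a kernelized Stein discrepancy between
    samples x and a standard normal model, using an RBF kernel.

    Stein score of N(0,1): s(x) = d/dx log q(x) = -x.
    """
    x = np.asarray(x, dtype=float)
    s = -x                              # model score at each sample
    d = x[:, None] - x[None, :]         # pairwise differences x_i - x_j
    k = np.exp(-d**2 / (2 * h**2))      # RBF kernel matrix
    dk_dx = -d / h**2 * k               # d k / d x_i
    dk_dy = d / h**2 * k                # d k / d x_j
    d2k = (1 / h**2 - d**2 / h**4) * k  # d^2 k / d x_i d x_j
    # Stein kernel: u(x,y) = s(x)s(y)k + s(x) dk/dy + s(y) dk/dx + d2k
    u = (s[:, None] * s[None, :] * k
         + s[:, None] * dk_dy
         + s[None, :] * dk_dx
         + d2k)
    return u.mean()

rng = np.random.default_rng(0)
good = rng.normal(0.0, 1.0, size=500)  # drawn from the model
bad = rng.normal(3.0, 1.0, size=500)   # mismatched mean
assert ksd_gaussian(good) < ksd_gaussian(bad)
```

Because the Stein kernel has zero expectation under the model, the statistic stays near zero for well-fitting samples and grows as the samples depart from the model, which is what makes it usable as a goodness-of-fit test.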
Journal ArticleDOI

Bayesian Nonparametric Inference of Switching Dynamic Linear Models

TL;DR: In this article, a Bayesian nonparametric approach utilizes a hierarchical Dirichlet process prior to learn an unknown number of persistent, smooth dynamical modes, and additionally employs automatic relevance determination to infer a sparse set of dynamic dependencies, making it possible to learn SLDS with varying state dimension or switching VAR processes with varying autoregressive order.
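The "unknown number of modes" comes from the Dirichlet process prior on mixture weights. As a rough illustration (a generic truncated stick-breaking draw, not the hierarchical construction in the paper; the concentration `alpha` and truncation level are assumptions):

```python
import numpy as np

def stick_breaking(alpha, n_sticks, rng):
    """Draw mixture weights from a truncated Dirichlet process via
    stick-breaking: beta_k ~ Beta(1, alpha),
    pi_k = beta_k * prod_{j<k} (1 - beta_j)."""
    betas = rng.beta(1.0, alpha, size=n_sticks)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining

rng = np.random.default_rng(7)
weights = stick_breaking(alpha=2.0, n_sticks=50, rng=rng)
# The weights (nearly) sum to one but concentrate on a few dominant
# components, letting the data decide how many modes are actually used.
```

Smaller `alpha` concentrates mass on fewer sticks; larger `alpha` spreads it over more, so the effective number of modes is inferred rather than fixed in advance.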
Journal ArticleDOI

Privacy Aware Learning

TL;DR: This work establishes sharp upper and lower bounds on the convergence rates of statistical estimation procedures in a local privacy framework and exhibits a precise tradeoff between the amount of privacy the data preserves and the utility of any statistical estimator or learning procedure.
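To give a flavor of the local privacy setting (a generic Laplace-mechanism sketch for mean estimation, not the estimators analyzed in the paper; the clipping range and epsilon are assumptions): each individual perturbs their own datum before release, and the analyst averages the noisy reports.

```python
import numpy as np

def privatize(x, epsilon, lo=0.0, hi=1.0):
    """Release each datum under epsilon-local differential privacy by
    clipping to [lo, hi] and adding Laplace noise scaled to the range."""
    x = np.clip(np.asarray(x, dtype=float), lo, hi)
    scale = (hi - lo) / epsilon  # sensitivity / epsilon
    rng = np.random.default_rng(42)
    return x + rng.laplace(0.0, scale, size=x.shape)

data = np.linspace(0.2, 0.8, 10_000)  # true mean 0.5
noisy = privatize(data, epsilon=1.0)
# The per-datum noise averages out as n grows, but a smaller epsilon
# (stronger privacy) inflates the variance of the mean estimate --
# the privacy/utility tradeoff the paper quantifies.
```

The quantitative version of this tradeoff, sharp minimax rates as a function of epsilon, is the contribution summarized above.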