Mingyuan Zhou
Researcher at University of Texas at Austin
Publications - 223
Citations - 4791
Mingyuan Zhou is an academic researcher at the University of Texas at Austin. His research topics include computer science and Gibbs sampling. He has an h-index of 30 and has co-authored 189 publications receiving 3,719 citations. Previous affiliations of Mingyuan Zhou include Microsoft and Duke University.
Papers
Journal ArticleDOI
Nonparametric Bayesian Dictionary Learning for Analysis of Noisy and Incomplete Images
Mingyuan Zhou, Haojun Chen, John Paisley, Lu Ren, Lingbo Li, Zhengming Xing, David B. Dunson, Guillermo Sapiro, Lawrence Carin +8 more
TL;DR: Nonparametric Bayesian methods are considered for recovery of imagery based upon compressive, incomplete, and/or noisy measurements and significant improvements in image recovery are manifested using learned dictionaries, relative to using standard orthonormal image expansions.
Proceedings Article
Non-Parametric Bayesian Dictionary Learning for Sparse Image Representations
TL;DR: The beta process is employed as a prior for learning the dictionary, and this non-parametric Bayesian method naturally infers an appropriate dictionary size, thereby allowing scaling to large images.
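The dictionary-size inference described above can be illustrated with a minimal sketch of a truncated beta-Bernoulli prior: atom usage probabilities are drawn from a beta distribution whose parameters shrink with the truncation level, so only a subset of candidate atoms ends up used. This is an illustrative simulation, not the paper's actual sampler; all parameter values below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_beta_bernoulli(n_items, K=100, a=1.0, b=1.0, rng=rng):
    """Draw from a truncated beta-Bernoulli process prior (illustrative sketch).

    Each of K candidate dictionary atoms gets a usage probability
    pi_k ~ Beta(a/K, b*(K-1)/K); binary indicators z_nk ~ Bernoulli(pi_k)
    select which atoms each of the n_items data items uses.
    """
    pi = rng.beta(a / K, b * (K - 1) / K, size=K)  # atom usage probabilities
    Z = rng.random((n_items, K)) < pi              # sparse binary usage matrix
    return pi, Z

pi, Z = sample_beta_bernoulli(n_items=500)
# Atoms used by at least one item: the "effective" dictionary size,
# which the prior keeps well below the truncation level K.
effective_size = int((Z.sum(axis=0) > 0).sum())
```

Because most `pi_k` values drawn from `Beta(a/K, b*(K-1)/K)` are tiny, most candidate atoms go unused, which is the sense in which the nonparametric prior infers an appropriate dictionary size.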
Proceedings Article
Beta-Negative Binomial Process and Poisson Factor Analysis
TL;DR: A beta-negative binomial (BNB) process is proposed, leading to a beta-gamma-Poisson process, which may be viewed as a "multi-scoop" generalization of the beta-Bernoulli process.
Journal ArticleDOI
Negative Binomial Process Count and Mixture Modeling
Mingyuan Zhou, Lawrence Carin +1 more
TL;DR: It is shown that with augmentation and normalization, the NB process and gamma-NB process can be reduced to the Dirichlet process and hierarchical Dirichlet process, respectively, and relationships between various count- and mixture-modeling distributions are revealed.
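A standard identity underlying this line of count modeling is that a gamma-Poisson mixture is marginally negative binomial: if lambda ~ Gamma(r, scale = p/(1-p)) and m | lambda ~ Poisson(lambda), then m ~ NB(r, p). A Monte Carlo check sketches this (parameter values here are illustrative assumptions, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Gamma-Poisson mixture: lambda ~ Gamma(r, scale=p/(1-p)), m | lambda ~ Poisson(lambda)
# Marginally, m ~ NB(r, p) with mean r*p/(1-p).
r, p, n = 5.0, 0.3, 200_000
lam = rng.gamma(shape=r, scale=p / (1 - p), size=n)
m = rng.poisson(lam)

nb_mean = r * p / (1 - p)   # analytic NB(r, p) mean, approx. 2.143
sim_mean = m.mean()         # simulated mean should agree closely
```

The close agreement between `sim_mean` and `nb_mean` reflects the gamma-Poisson construction of the negative binomial that the augmentation arguments in this line of work build on.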
Journal ArticleDOI
Dictionary Learning for Noisy and Incomplete Hyperspectral Images
TL;DR: This work considers analysis of noisy and incomplete hyperspectral imagery, with the objective of removing the noise and inferring the missing data, and addresses dictionary learning from a Bayesian perspective, considering two distinct means of imposing sparse dictionary usage.