scispace - formally typeset

Expectation–maximization algorithm

About: Expectation–maximization algorithm is a research topic. Over its lifetime, 11,823 publications have been published within this topic, receiving 528,693 citations. The topic is also known as: EM algorithm & Expectation Maximization.


Papers
Journal ArticleDOI
TL;DR: Using simulations and an example, it is shown that by virtue of the ER algorithm, the properties of the existing methods for robust PCA carry through to data with missing elements.

156 citations

Proceedings ArticleDOI
07 Oct 2001
TL;DR: The EM of GMM can be regarded as a special case of the EM of HMM, and the EM algorithm of GMM based on symbols is faster in implementation than the traditional EM algorithm of GMM based on samples (or observations).
Abstract: The HMM (hidden Markov model) is a probabilistic model of the joint probability of a collection of random variables with both observations and states. The GMM (Gaussian mixture model) is a finite mixture probability distribution model. Although the two models are closely related, they are usually discussed independently and separately. The EM (expectation–maximization) algorithm is a general iterative method for finding maximum likelihood estimates. The EM of HMM and the EM of GMM have similar formulae. Two points are proposed in this paper. One is that the EM of GMM can be regarded as a special case of the EM of HMM. The other is that the EM algorithm of GMM based on symbols is faster in implementation than the traditional EM algorithm of GMM based on samples (or observations).

155 citations
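The sample-based EM for a GMM that the abstract above contrasts against can be sketched in a few lines. This is a generic, illustrative implementation for a two-component 1-D Gaussian mixture, not the authors' symbol-based variant; the function name, initialization scheme, and iteration count are my own choices.

```python
import numpy as np

def em_gmm_1d(x, n_iter=50):
    """Illustrative EM for a two-component 1-D Gaussian mixture."""
    # Crude initialization: place the means at the 25th/75th percentiles.
    mu = np.array([np.percentile(x, 25), np.percentile(x, 75)])
    var = np.array([x.var(), x.var()])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each sample.
        dens = (pi / np.sqrt(2 * np.pi * var)) * np.exp(
            -(x[:, None] - mu) ** 2 / (2 * var))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixing weights, means, and variances.
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var
```

The speed-up the paper claims comes from running these updates over a histogram of discrete symbols rather than over every raw sample, so the E-step cost scales with the alphabet size instead of the sample count.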

Journal ArticleDOI
TL;DR: An iterative procedure that performs the parameter estimation and image reconstruction tasks simultaneously; it is a generalization, to the MRF context, of the EM algorithm, which approximates maximum-likelihood estimates for incomplete-data problems.

155 citations

Journal ArticleDOI
TL;DR: The TV norm minimization constraint is extended to the field of SPECT image reconstruction with a Poisson noise model and the proposed iterative Bayesian reconstruction algorithm has the capacity to smooth noise and maintain sharp edges without introducing over/under shoots and ripples around the edges.
Abstract: An iterative Bayesian reconstruction algorithm based on the total variation (TV) norm constraint is proposed. The motivation for using TV regularization is that it is extremely effective for recovering edges of images. This paper extends the TV norm minimization constraint to the field of SPECT image reconstruction with a Poisson noise model. The regularization norm is included in the OSL-EM (one-step-late expectation maximization) algorithm. Unlike many other edge-preserving regularization techniques, the TV-based method depends on only one parameter. Reconstructions of computer simulations and patient data show that the proposed algorithm has the capacity to smooth noise and maintain sharp edges without introducing overshoots, undershoots, or ripples around the edges.

155 citations
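The OSL-EM scheme in the abstract above builds on the plain ML-EM iteration for Poisson data. A minimal sketch of that base iteration, assuming a known system matrix `A` and omitting the TV penalty that the paper adds via the one-step-late correction (function and variable names are mine):

```python
import numpy as np

def mlem(A, y, n_iter=100):
    """Plain ML-EM iteration for a Poisson model y ~ Poisson(A @ lam).

    The OSL-EM variant divides by (sens + beta * dU(lam)) instead of sens,
    where U is the regularizer (here, the TV norm) evaluated at the
    previous iterate -- the "one step late" trick.
    """
    lam = np.ones(A.shape[1])           # flat initial image
    sens = A.sum(axis=0)                # sensitivity image, A^T 1
    for _ in range(n_iter):
        proj = A @ lam                  # forward projection
        ratio = y / np.maximum(proj, 1e-12)
        lam *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return lam
```

The multiplicative update keeps the image nonnegative automatically, which is one reason EM-type iterations are popular in emission tomography.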

Journal ArticleDOI
TL;DR: This paper presents a deterministic algorithm to approximately optimize the objective function using split and merge operations, which were previously proposed within the maximum likelihood framework, and applies the method to mixture of experts (MoE) models, experimentally showing that it can find the optimal number of experts of an MoE while avoiding local maxima.

155 citations


Network Information
Related Topics (5)
Estimator: 97.3K papers, 2.6M citations (91% related)
Deep learning: 79.8K papers, 2.1M citations (84% related)
Support vector machine: 73.6K papers, 1.7M citations (84% related)
Cluster analysis: 146.5K papers, 2.9M citations (84% related)
Artificial neural network: 207K papers, 4.5M citations (82% related)
Performance Metrics
No. of papers in the topic in previous years:
Year  Papers
2023  114
2022  245
2021  438
2020  410
2019  484
2018  519