Topic

Expectation–maximization algorithm

About: Expectation–maximization algorithm is a research topic. Over its lifetime, 11,823 publications have been published within this topic, receiving 528,693 citations. The topic is also known as: EM algorithm & Expectation Maximization.
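
To make the topic concrete: EM alternates an E-step, which computes expected sufficient statistics (responsibilities) under the current parameters, with an M-step, which re-estimates the parameters from them; each full iteration never decreases the likelihood. The snippet below is a minimal illustrative sketch for a two-component univariate Gaussian mixture; it is not taken from any paper listed here, and the function name and initialization are assumptions.

```python
import numpy as np

def em_gaussian_mixture(x, n_iter=100):
    """Minimal EM for a two-component univariate Gaussian mixture."""
    mu = np.quantile(x, [0.25, 0.75])      # crude initial means
    sigma = np.array([x.std(), x.std()])   # shared initial spread
    w = np.array([0.5, 0.5])               # mixing weights
    for _ in range(n_iter):
        # E-step: responsibility of each component for each observation.
        dens = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        resp = w * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances from the responsibilities.
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return w, mu, sigma
```

On data pooled from, say, N(0, 1) and N(5, 1) samples, the recovered means should land close to 0 and 5; the monotone increase of the log likelihood is the defining property of EM that the papers below build on.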


Papers
Journal ArticleDOI
Linda Kaufman
TL;DR: It is shown that the same scaled steepest descent algorithm can be applied to the least squares merit function, and that it can be accelerated using the conjugate gradient approach.
Abstract: The EM algorithm is the basic approach used to maximize the log likelihood objective function for the reconstruction problem in positron emission tomography (PET). The EM algorithm is a scaled steepest ascent algorithm that elegantly handles the nonnegativity constraints of the problem. It is shown that the same scaled steepest descent algorithm can be applied to the least squares merit function, and that it can be accelerated using the conjugate gradient approach. The experiments suggest that one can cut the computation by about a factor of 3 by using this technique. The results are applied to various penalized least squares functions which might be used to produce a smoother image.

193 citations
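
For context on the abstract above: the classical ML-EM (Shepp–Vardi) iteration for emission tomography is a multiplicative update that keeps the image nonnegative automatically, and it can be read as steepest ascent on the Poisson log likelihood with the step scaled by x_j / Σ_i a_ij. A minimal NumPy sketch, assuming a dense system matrix A and count vector y (hypothetical placeholders, not the paper's code):

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """ML-EM reconstruction: x is the image, y the measured counts, A the system matrix."""
    x = np.ones(A.shape[1])            # strictly positive start keeps all iterates nonnegative
    sens = A.sum(axis=0)               # sensitivity image, sum_i a_ij
    for _ in range(n_iter):
        proj = A @ x                   # forward projection (expected counts)
        ratio = y / np.maximum(proj, 1e-12)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)   # multiplicative EM update
    return x
```

Because the update multiplies the current image by a nonnegative correction factor, no explicit projection onto the constraint set is needed, which is the property the abstract refers to as elegantly handling the nonnegativity constraints.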

Journal ArticleDOI
TL;DR: In this article, the authors relax the normality assumption by letting the random effects and model errors follow a skew-normal distribution, which includes normality as a special case and provides flexibility in capturing a broad range of non-normal behavior.
Abstract: Normality (symmetry) of the random effects and the within-subject errors is a routine assumption for the linear mixed model, but it may be unrealistic, obscuring important features of among- and within-subject variation. We relax this assumption by assuming that the random effects and model errors follow a skew-normal distribution, which includes normality as a special case and provides flexibility in capturing a broad range of non-normal behavior. The marginal distribution of the observed quantities is derived in closed form, so inference may be carried out using existing statistical software and standard optimization techniques. We also implement an EM-type algorithm, which seems to provide some advantages over direct maximization of the likelihood. Results of simulation studies and applications to real data sets are reported.

193 citations
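
One way to see the "normality as a special case" property mentioned above: the skew-normal density 2·φ(x)·Φ(αx) reduces to the normal density φ(x) when α = 0, and a nonzero α introduces asymmetry. A small check using SciPy's skewnorm (illustrative only, unrelated to the authors' own implementation):

```python
import numpy as np
from scipy.stats import norm, skewnorm

x = np.linspace(-4, 4, 9)

# Shape parameter a = 0 recovers the normal density exactly
# ("normality as a special case").
assert np.allclose(skewnorm.pdf(x, a=0.0), norm.pdf(x))

# A nonzero shape parameter skews the density; a > 0 pushes mass to the right.
print(skewnorm.pdf(x, a=4.0).round(4))
```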

Journal ArticleDOI
TL;DR: In this article, the authors show that the likelihood function always has a stationary point at one particular set of parameter values, and they give a condition for when this point is a local maximum and when it is not.

192 citations

Journal ArticleDOI
TL;DR: Two solutions are proposed to solve the problem of model parameter estimation from incomplete data: a Monte Carlo scheme and a scheme related to Besag's (1986) iterated conditional mode (ICM) method, both of which make use of Markov random-field modeling assumptions.
Abstract: An unsupervised stochastic model-based approach to image segmentation is described, and some of its properties investigated. In this approach, the problem of model parameter estimation is formulated as a problem of parameter estimation from incomplete data, and the expectation-maximization (EM) algorithm is used to determine a maximum-likelihood (ML) estimate. Previously, the use of the EM algorithm in this application has encountered difficulties since an analytical expression for the conditional expectations required in the EM procedure is generally unavailable, except for the simplest models. In this paper, two solutions are proposed to solve this problem: a Monte Carlo scheme and a scheme related to Besag's (1986) iterated conditional mode (ICM) method. Both schemes make use of Markov random-field modeling assumptions. Examples are provided to illustrate the implementation of the EM algorithm for several general classes of image models. Experimental results on both synthetic and real images are provided.

192 citations
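
For the ICM-related scheme mentioned above, the core idea is a greedy sweep that reassigns each pixel to the label minimizing a local energy: a Gaussian data term plus a Potts-style penalty on disagreeing neighbours. The sketch below is a simplified, assumed version with 4-neighbour connectivity and class means/standard deviations taken as given (e.g. from an M-step); it is not the paper's algorithm verbatim.

```python
import numpy as np

def icm_sweep(image, labels, means, sigmas, beta=1.0):
    """One ICM sweep over a greyscale image with integer class labels."""
    h, w = image.shape
    k = len(means)
    padded = np.pad(labels, 1, constant_values=-1)   # -1 never matches a real label
    for i in range(h):
        for j in range(w):
            # 4-neighbour labels of pixel (i, j), in padded coordinates.
            neigh = np.array([padded[i, j + 1], padded[i + 2, j + 1],
                              padded[i + 1, j], padded[i + 1, j + 2]])
            costs = np.empty(k)
            for c in range(k):
                # Negative Gaussian log likelihood of the intensity under class c ...
                data_term = 0.5 * ((image[i, j] - means[c]) / sigmas[c]) ** 2 + np.log(sigmas[c])
                # ... plus a Potts prior that penalises disagreement with neighbours.
                costs[c] = data_term + beta * np.sum(neigh != c)
            labels[i, j] = np.argmin(costs)
            padded[i + 1, j + 1] = labels[i, j]      # use the new label immediately
    return labels
```

Alternating such sweeps with re-estimation of the class means and variances gives an EM-like segmentation loop of the kind the abstract describes.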

Journal ArticleDOI
TL;DR: The clarifications of the EM algorithm lead to several applications of the algorithm to models that had appeared less tractable, as well as to an estimation method based on simulations.

192 citations


Network Information
Related Topics (5)
Estimator: 97.3K papers, 2.6M citations, 91% related
Deep learning: 79.8K papers, 2.1M citations, 84% related
Support vector machine: 73.6K papers, 1.7M citations, 84% related
Cluster analysis: 146.5K papers, 2.9M citations, 84% related
Artificial neural network: 207K papers, 4.5M citations, 82% related
Performance
Metrics
No. of papers in the topic in previous years

Year    Papers
2023    114
2022    245
2021    438
2020    410
2019    484
2018    519