Topic

Expectation–maximization algorithm

About: Expectation–maximization algorithm is a research topic. Over its lifetime, 11,823 publications have been published within this topic, receiving 528,693 citations. The topic is also known as: EM algorithm & Expectation Maximization.
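
The EM algorithm alternates an expectation (E) step, which computes posterior responsibilities under the current parameters, with a maximization (M) step, which re-estimates the parameters from those responsibilities. As a minimal illustration, the sketch below runs EM for a two-component univariate Gaussian mixture; the synthetic data, initial values, and iteration count are illustrative assumptions, not taken from any paper listed on this page.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data drawn from two Gaussians (illustrative only).
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1.5, 700)])

# Initial guesses for mixture weights, means, and standard deviations.
w, mu, sd = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])

for _ in range(200):
    # E-step: posterior responsibility of each component for each point.
    dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the responsibility-weighted data.
    n_k = r.sum(axis=0)
    w = n_k / len(x)
    mu = (r * x[:, None]).sum(axis=0) / n_k
    sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n_k)

print(w, mu, sd)
```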


Papers
Journal ArticleDOI
01 Nov 2009 - Test
TL;DR: In this paper, the authors consider statistical inference for the unknown parameters of the generalized exponential distribution in the presence of progressive censoring and obtain maximum likelihood estimators of the unknown parameters using the EM algorithm.
Abstract: In this paper, we consider statistical inference for the unknown parameters of the generalized exponential distribution in the presence of progressive censoring. We obtain maximum likelihood estimators of the unknown parameters using the EM algorithm. We also compute the expected Fisher information matrix using the missing value principle, and we then use these values to determine the optimal progressive censoring plans. Different optimality criteria are considered, and selected optimal progressive censoring plans are presented. One example is provided for illustrative purposes.
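
As a rough companion to this abstract: the sketch below fits the generalized exponential distribution, with CDF F(x) = (1 - e^(-λx))^α, to a hypothetical progressively Type-II censored sample by direct numerical maximization of the censored log-likelihood, rather than by the authors' EM iteration. The data, starting values, and optimizer choice are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical progressively Type-II censored sample: x holds the observed
# failure times, and R[i] units are withdrawn at the i-th failure.
x = np.array([0.19, 0.34, 0.61, 0.84, 1.12, 1.48, 1.91])
R = np.array([2, 0, 1, 0, 2, 0, 3])

def neg_loglik(theta):
    alpha, lam = np.exp(theta)        # optimize on the log scale for positivity
    G = 1.0 - np.exp(-lam * x)        # exponential-kernel CDF at each failure
    ll = np.sum(np.log(alpha) + np.log(lam) - lam * x + (alpha - 1) * np.log(G))
    ll += np.sum(R * np.log1p(-G ** alpha))   # censored (withdrawn) contributions
    return -ll

res = minimize(neg_loglik, x0=np.log([1.0, 1.0]), method="Nelder-Mead")
alpha_hat, lam_hat = np.exp(res.x)
print(alpha_hat, lam_hat)
```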

129 citations

Journal ArticleDOI
TL;DR: An algorithm for maximum-likelihood image estimation is derived within the expectation-maximization (EM) formalism, using a new approximate model of depth-varying image formation for optical sectioning microscopy; the model incorporates spherical aberration that worsens as the microscope is focused deeper under the cover slip.
Abstract: We derive an algorithm for maximum-likelihood image estimation on the basis of the expectation-maximization (EM) formalism by using a new approximate model for depth-varying image formation for optical sectioning microscopy. This new strata-based model incorporates spherical aberration that worsens as the microscope is focused deeper under the cover slip and is the result of the refractive-index mismatch between the immersion medium and the mounting medium of the specimen. Images of a specimen with known geometry and refractive index show that the model captures the main features of the image. We analyze the performance of the depth-variant EM algorithm with simulations, which show that the algorithm can compensate for image degradation changing with depth.
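
For a shift-invariant PSF, the EM image estimate of this kind reduces to the classical multiplicative Richardson-Lucy update. The sketch below implements that simplified depth-invariant case, not the paper's strata-based depth-varying model; the function and variable names are my own.

```python
import numpy as np
from scipy.signal import fftconvolve

def em_deconvolve(y, psf, n_iter=50, eps=1e-12):
    """Richardson-Lucy / ML-EM deconvolution with a shift-invariant 2-D PSF.

    The paper's algorithm uses a depth-varying, strata-based PSF; this sketch
    keeps the same multiplicative EM update but assumes a single PSF.
    Expects float arrays.
    """
    psf = psf / psf.sum()
    psf_flip = psf[::-1, ::-1]                  # adjoint (flipped) PSF
    x = np.full(y.shape, y.mean())              # flat nonnegative initial estimate
    for _ in range(n_iter):
        blur = fftconvolve(x, psf, mode="same")         # forward model H x
        ratio = y / np.maximum(blur, eps)
        x *= fftconvolve(ratio, psf_flip, mode="same")  # multiplicative EM update
        x = np.maximum(x, 0.0)
    return x
```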

128 citations

Journal ArticleDOI
TL;DR: Results indicate that sample sizes significantly larger than 100 should be used to obtain reliable estimates through maximum likelihood; the appropriateness of using asymptotic methods is also examined.
Abstract: Continuing increases in computing power and availability mean that many maximum likelihood estimation (MLE) problems previously thought intractable or too computationally difficult can now be tackled numerically. However, ML parameter estimation for distributions whose only analytical expression is as a quantile function has received little attention. Numerical MLE procedures for the parameters of two new families of distributions, the g-and-k and the generalized g-and-h distributions, are presented and investigated here. Simulation studies are included, and the appropriateness of using asymptotic methods is examined. Because of the generality of these distributions, this serves not only as an investigation of numerical MLE for these families but also as an initial investigation into the performance and problems of numerical MLE applied to quantile-defined distributions in general. Datasets are also fitted using the procedures presented here. Results indicate that sample sizes significantly larger than 100 should be used to obtain reliable estimates through maximum likelihood.
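
For quantile-defined families such as the g-and-k distribution, the likelihood has no closed form: the density at x is φ(z)/Q'(z), where z solves Q(z) = x. The sketch below shows this inversion-based likelihood, assuming the usual g-and-k quantile function with c = 0.8; the data and optimizer settings are illustrative, and the parameter constraints (b > 0, k > -1/2) are handled only loosely.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq, minimize

def gk_quantile(z, a, b, g, k, c=0.8):
    """g-and-k quantile function evaluated at the standard-normal quantile z."""
    return a + b * (1 + c * np.tanh(g * z / 2)) * (1 + z ** 2) ** k * z

def gk_logpdf(x, a, b, g, k, h=1e-5):
    """Log-density via numerical inversion: f(x) = phi(z) / Q'(z) with Q(z) = x.

    Assumes parameter values for which Q is monotone (b > 0, k > -1/2).
    """
    out = np.empty_like(x)
    for i, xi in enumerate(x):
        z = brentq(lambda t: gk_quantile(t, a, b, g, k) - xi, -15, 15)
        dQ = (gk_quantile(z + h, a, b, g, k) - gk_quantile(z - h, a, b, g, k)) / (2 * h)
        out[i] = norm.logpdf(z) - np.log(dQ)
    return out

def neg_loglik(theta, x):
    a, log_b, g, k = theta                    # log scale keeps b positive
    return -np.sum(gk_logpdf(x, a, np.exp(log_b), g, k))

# Illustrative data; a real study would use the fitted datasets from the paper.
x = np.random.default_rng(1).normal(size=200)
res = minimize(neg_loglik, x0=[0.0, 0.0, 0.0, 0.0], args=(x,), method="Nelder-Mead")
print(res.x)
```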

128 citations

Journal ArticleDOI
TL;DR: Model-based clustering using a family of Gaussian mixture models, with parsimonious factor analysis-like covariance structure, is described and an efficient algorithm for its implementation is presented, showing its effectiveness when compared to existing software.
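
The paper's family (parsimonious Gaussian mixture models with factor-analytic covariance structures) is implemented in specialized software; as a loose stand-in, the sketch below runs ordinary Gaussian mixture model-based clustering with scikit-learn, selecting the number of components and the covariance constraint by BIC. The synthetic data and candidate grids are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Two synthetic clusters in 5 dimensions (illustrative only).
X = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(3, 1, (100, 5))])

# Fit mixtures under different covariance constraints and pick the best by
# BIC, mirroring the model-selection step in model-based clustering.
best = min(
    (GaussianMixture(n_components=g, covariance_type=ct, random_state=0).fit(X)
     for g in (1, 2, 3) for ct in ("full", "tied", "diag", "spherical")),
    key=lambda m: m.bic(X),
)
labels = best.predict(X)
print(best.n_components, best.covariance_type)
```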

128 citations

Proceedings ArticleDOI
08 Nov 1998
TL;DR: Reconstructions of computer simulations and patient data show that the proposed iterative Bayesian reconstruction algorithm has the capacity to smooth the noise and maintain sharp edges without introducing over/undershoots and ripples around the edges.
Abstract: An iterative Bayesian reconstruction algorithm based on the total variation (TV) norm constraint is proposed. The motivation for using TV regularization is that it is extremely effective at recovering the edges of images. TV norm minimization, introduced in 1992, was shown to be effective for restoring blurred images under a Gaussian noise model and was demonstrated to suppress noise while preserving edges. In that earlier work, images were diffused according to a set of nonlinear anisotropic diffusion partial differential equations, an approach that suffered from computational difficulties. This paper extends the TV norm minimization constraint to SPECT image reconstruction with a Poisson noise model. The regularization norm is included in the ML-EM (maximum likelihood expectation maximization) algorithm; the partial differential equation approach is not used here. Reconstructions of computer simulations and patient data show that the proposed algorithm can smooth noise and maintain sharp edges without introducing over/undershoots and ripples around the edges.
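
One common way to fold a TV penalty into ML-EM is Green's one-step-late (OSL) scheme, in which the penalty gradient enters the denominator of the multiplicative update. The sketch below applies this to a 1-D Poisson problem with a smoothed TV term; the paper works on 2-D SPECT slices, and beta, delta, and the OSL choice itself are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def osl_em_tv(y, A, beta=0.01, n_iter=100, eps=1e-8, delta=1e-3):
    """ML-EM for Poisson data y ~ Poisson(A x), with a TV penalty folded in
    via the one-step-late (OSL) trick. 1-D signal for simplicity."""
    n = A.shape[1]
    x = np.ones(n)
    sens = A.sum(axis=0)                      # A^T 1, the sensitivity image
    for _ in range(n_iter):
        ratio = y / np.maximum(A @ x, eps)
        # Gradient of the smoothed TV term sum_j sqrt((x_{j+1}-x_j)^2 + delta^2).
        d = np.diff(x)
        g = d / np.sqrt(d ** 2 + delta ** 2)
        tv_grad = np.concatenate([[-g[0]], g[:-1] - g[1:], [g[-1]]])
        # OSL update: penalty gradient joins the denominator of the EM step.
        x *= (A.T @ ratio) / np.maximum(sens + beta * tv_grad, eps)
    return x
```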

127 citations


Network Information
Related Topics (5)
Estimator: 97.3K papers, 2.6M citations, 91% related
Deep learning: 79.8K papers, 2.1M citations, 84% related
Support vector machine: 73.6K papers, 1.7M citations, 84% related
Cluster analysis: 146.5K papers, 2.9M citations, 84% related
Artificial neural network: 207K papers, 4.5M citations, 82% related
Performance
Metrics
No. of papers in the topic in previous years:

Year    Papers
2023    114
2022    245
2021    438
2020    410
2019    484
2018    519