scispace - formally typeset
Topic

Expectation–maximization algorithm

About: The expectation–maximization (EM) algorithm is a research topic. Over the lifetime of the topic, 11,823 publications have been published, receiving 528,693 citations. The topic is also known as: EM algorithm & Expectation Maximization.


Papers
Proceedings Article
03 Jan 2001
TL;DR: An equivalence is derived between AdaBoost and the dual of a convex optimization problem, showing that the only difference between minimizing the exponential loss used by AdaBoost and maximum likelihood for exponential models is that the latter requires the model to be normalized to form a conditional probability distribution over labels.
Abstract: We derive an equivalence between AdaBoost and the dual of a convex optimization problem, showing that the only difference between minimizing the exponential loss used by AdaBoost and maximum likelihood for exponential models is that the latter requires the model to be normalized to form a conditional probability distribution over labels. In addition to establishing a simple and easily understood connection between the two methods, this framework enables us to derive new regularization procedures for boosting that directly correspond to penalized maximum likelihood. Experiments on UCI datasets support our theoretical analysis and give additional insight into the relationship between boosting and logistic regression.

198 citations
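The loss-function contrast at the heart of the abstract above can be made concrete on a toy margin: AdaBoost minimizes the exponential loss exp(-y·f(x)) of an unnormalized model, while the normalized (logistic) model yields the log-loss log(1 + exp(-y·f(x))). In this sketch the linear score w·x stands in for the boosted ensemble, and the data are made up for illustration.

```python
import math

# Toy data: (feature, label) pairs with labels in {-1, +1}.
DATA = [(1.0, 1), (2.0, 1), (-1.5, -1), (-0.5, -1)]

def exponential_loss(w, data):
    # AdaBoost's criterion on the unnormalized model: sum_i exp(-y_i * w * x_i).
    return sum(math.exp(-y * w * x) for x, y in data)

def logistic_loss(w, data):
    # Negative log-likelihood of the normalized conditional model:
    # sum_i log(1 + exp(-y_i * w * x_i)).
    return sum(math.log(1 + math.exp(-y * w * x)) for x, y in data)
```

Since log(1 + z) <= z for z >= 0, the logistic loss is bounded above by the exponential loss term by term, and both decrease as the margins y·w·x grow, which is the sense in which the two methods differ only by normalization.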

Journal ArticleDOI
TL;DR: In this article, it was shown that unbiasedness is enough when the estimated likelihood is used inside a Metropolis-Hastings algorithm, which is perhaps surprising given the celebrated results on maximum simulated likelihood estimation.
Abstract: Suppose we wish to carry out likelihood based inference but we solely have an unbiased simulation based estimator of the likelihood. We note that unbiasedness is enough when the estimated likelihood is used inside a Metropolis-Hastings algorithm. This result has recently been introduced in the statistics literature by Andrieu, Doucet, and Holenstein (2007) and is perhaps surprising given the celebrated results on maximum simulated likelihood estimation. Bayesian inference based on simulated likelihood can be widely applied in microeconomics, macroeconomics and financial econometrics. One way of generating unbiased estimates of the likelihood is by the use of a particle filter. We illustrate these methods on four problems in econometrics, producing rather generic methods. Taken together, these methods imply that if we can simulate from an economic model we can carry out likelihood based inference using its simulations.

197 citations
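The pseudo-marginal idea described above can be sketched in a few lines: replace the exact likelihood in the Metropolis-Hastings ratio with an unbiased Monte Carlo estimate, and, crucially, store and reuse the estimate for the current state rather than re-estimating it. Everything below (the toy latent-variable model, function names, prior, and tuning constants) is an illustrative assumption, not the paper's setup.

```python
import math
import random

def normal_pdf(x, mu, sd):
    return math.exp(-((x - mu) ** 2) / (2 * sd * sd)) / (sd * math.sqrt(2 * math.pi))

def estimate_likelihood(theta, ys, m=20):
    # Unbiased Monte Carlo estimate of p(ys | theta) for the toy model
    # y = theta + u + e with latent u ~ N(0, 1) and noise e ~ N(0, 1):
    # p(y | theta) = E_u[ N(y; theta + u, 1) ], averaged over m draws of u.
    like = 1.0
    for y in ys:
        draws = [normal_pdf(y, theta + random.gauss(0.0, 1.0), 1.0) for _ in range(m)]
        like *= sum(draws) / m
    return like

def pseudo_marginal_mh(ys, n_iter=2000, step=0.5):
    theta = 0.0
    # Estimated unnormalized posterior (vague N(0, 10^2) prior) at the current
    # state; it is kept with the state and reused, never recomputed in place.
    target = estimate_likelihood(theta, ys) * normal_pdf(theta, 0.0, 10.0)
    samples = []
    for _ in range(n_iter):
        prop = theta + random.gauss(0.0, step)
        target_prop = estimate_likelihood(prop, ys) * normal_pdf(prop, 0.0, 10.0)
        if random.random() < target_prop / target:
            theta, target = prop, target_prop
        samples.append(theta)
    return samples
```

Because the estimator is unbiased, this chain targets the exact posterior despite never evaluating the true likelihood; in the paper's applications the simple averaging step is played by a particle filter.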

Journal ArticleDOI
TL;DR: In this paper, the results of Wald on the consistency of the maximum likelihood estimate are extended and applications are made to mixture distributions and to clustering when the number of clusters is not known.
Abstract: The results of Wald on the consistency of the maximum likelihood estimate are extended. Applications are made to mixture distributions and to clustering when the number of clusters is not known.

196 citations

Journal ArticleDOI
TL;DR: This paper uses a methodology to enable estimators of ERG model parameters to be compared and shows the superiority of the likelihood-based estimators over those based on pseudo-likelihood, with the bias-reduced pseudo-likelihood outperforming the general pseudo-likelihood.

196 citations

Proceedings ArticleDOI
11 Jun 2000
TL;DR: The authors formulate feature registration problems as maximum likelihood or Bayesian maximum a posteriori estimation problems using mixture models and embedding of the EM algorithm within a deterministic annealing scheme in order to directly control the fuzziness of the correspondences.
Abstract: The authors formulate feature registration problems as maximum likelihood or Bayesian maximum a posteriori estimation problems using mixture models. An EM-like algorithm is proposed to jointly solve for the feature correspondences as well as the geometric transformations. A novel aspect of the authors' approach is the embedding of the EM algorithm within a deterministic annealing scheme in order to directly control the fuzziness of the correspondences. The resulting algorithm, termed mixture point matching (MPM), can solve for both rigid and high-dimensional (thin-plate spline-based) non-rigid transformations between point sets in the presence of noise and outliers. The authors demonstrate the algorithm's performance on 2D and 3D data.

195 citations
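The annealed-EM scheme in the abstract above can be caricatured in one dimension: an E-step builds a fuzzy correspondence matrix whose sharpness is set by a temperature, an M-step updates the transformation in closed form, and the temperature is lowered on a schedule so the correspondences gradually harden. This toy version estimates only a 1D translation; the function name, temperature schedule, and iteration counts are illustrative, not the paper's.

```python
import math

def mpm_translation(xs, ys, temps=(4.0, 2.0, 1.0, 0.5, 0.1), iters=10):
    """Toy annealed-EM estimate of a 1D translation between two point sets."""
    t = 0.0
    for temp in temps:  # deterministic annealing: high -> low temperature
        for _ in range(iters):
            # E-step: fuzzy correspondences; the temperature controls fuzziness.
            weights = []
            for x in xs:
                row = [math.exp(-((x + t) - y) ** 2 / temp) for y in ys]
                total = sum(row)
                weights.append([w / total for w in row])
            # M-step: correspondence-weighted least-squares update of t.
            t = sum(weights[i][j] * (ys[j] - xs[i])
                    for i in range(len(xs))
                    for j in range(len(ys))) / len(xs)
    return t
```

At high temperature every point is softly matched to every other, which smooths the objective; as the temperature falls the weights approach a hard nearest-neighbor assignment, which is how the full MPM algorithm controls correspondence fuzziness directly.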


Network Information
Related Topics (5)
Estimator: 97.3K papers, 2.6M citations (91% related)
Deep learning: 79.8K papers, 2.1M citations (84% related)
Support vector machine: 73.6K papers, 1.7M citations (84% related)
Cluster analysis: 146.5K papers, 2.9M citations (84% related)
Artificial neural network: 207K papers, 4.5M citations (82% related)
Performance
Metrics
No. of papers in the topic in previous years
Year  Papers
2023  114
2022  245
2021  438
2020  410
2019  484
2018  519