Open Access Journal Article

A Gentle Tutorial of the EM Algorithm and its Application to Parameter Estimation for Gaussian Mixture and Hidden Markov Models

TL;DR
In this paper, the authors describe the EM algorithm for finding the parameters of a mixture of Gaussian densities and a hidden Markov model (HMM) for both discrete and Gaussian mixture observation models.
Abstract
We describe the maximum-likelihood parameter estimation problem and how the Expectation-Maximization (EM) algorithm can be used for its solution. We first describe the abstract form of the EM algorithm as it is often given in the literature. We then develop the EM parameter estimation procedure for two applications: 1) finding the parameters of a mixture of Gaussian densities, and 2) finding the parameters of a hidden Markov model (HMM) (i.e., the Baum-Welch algorithm) for both discrete and Gaussian mixture observation models. We derive the update equations in fairly explicit detail but we do not prove any convergence properties. We try to emphasize intuition rather than mathematical rigor.
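To make the first application concrete, below is a minimal NumPy sketch of EM for a one-dimensional Gaussian mixture in the spirit of the update equations the tutorial derives; the function name, initialization scheme, and synthetic data are illustrative assumptions, not code from the paper.

```python
# Minimal EM for a 1-D Gaussian mixture (illustrative sketch, not the paper's code).
import numpy as np

def em_gmm_1d(x, k, n_iter=100, seed=0):
    """Fit a k-component 1-D Gaussian mixture to data x with EM."""
    rng = np.random.default_rng(seed)
    n = x.shape[0]
    # Initialize mixing weights, means, and variances (illustrative choice).
    pi = np.full(k, 1.0 / k)
    mu = rng.choice(x, size=k, replace=False)
    var = np.full(k, x.var())

    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = P(component j | x_i).
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = pi * dens
        r /= r.sum(axis=1, keepdims=True)

        # M-step: re-estimate parameters from the weighted sufficient statistics.
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk

    return pi, mu, var

# Example: data drawn from two well-separated components.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.5, 200)])
print(em_gmm_1d(data, k=2))
```

Each iteration computes posterior responsibilities for every data point (the E-step) and then re-estimates the mixing weights, means, and variances from those weighted statistics (the M-step); the HMM case treated in the tutorial (Baum-Welch) follows the same pattern, with the forward-backward recursions supplying the expected counts.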



Citations
Journal Article

On Technological Change in Crop Yields

TL;DR: In this paper, the authors proposed using mixtures with embedded trend functions to account for potentially different rates of technological change in different components of the yield distribution, and showed that such change leads to nonconstant variance with respect to time (i.e., heteroscedasticity).
Journal Article

Group Event Detection With a Varying Number of Group Members for Video Surveillance

TL;DR: In this paper, a group representative is used to handle recognition with a varying number of group members, and an asynchronous hidden Markov model (AHMM) is used to model the relationship between people.
Journal Article

KPCA for semantic object extraction in images

TL;DR: This paper demonstrates that kernel K-means (KKMeans) is equivalent to applying kernel principal component analysis (KPCA) before the conventional K-means algorithm, and generalizes the Gaussian mixture model (GMM) to its kernel version, the kernel GMM (KGMM).
Journal Article

PSO-EM: A Hyperspectral Unmixing Algorithm Based On Normal Compositional Model

TL;DR: In this paper, a new hyperspectral unmixing algorithm based on the normal compositional model (NCM) is proposed to estimate the endmembers and abundance parameters jointly, demonstrating the superior performance of PSO-EM compared with other NCM-based as well as LMM-based methods.
Journal Article

Community sense and response systems: your phone as quake detector

TL;DR: The Caltech CSN project collects sensor data from thousands of personal devices for real-time response to dangerous earthquakes.