Open Access Journal Article

A gentle tutorial of the em algorithm and its application to parameter estimation for Gaussian mixture and hidden Markov models

TL;DR
In this paper, the authors describe the EM algorithm for finding the parameters of a mixture of Gaussian densities and a hidden Markov model (HMM) for both discrete and Gaussian mixture observation models.
Abstract
We describe the maximum-likelihood parameter estimation problem and how the Expectation-Maximization (EM) algorithm can be used for its solution. We first describe the abstract form of the EM algorithm as it is often given in the literature. We then develop the EM parameter estimation procedure for two applications: 1) finding the parameters of a mixture of Gaussian densities, and 2) finding the parameters of a hidden Markov model (HMM) (i.e., the Baum-Welch algorithm) for both discrete and Gaussian mixture observation models. We derive the update equations in fairly explicit detail but we do not prove any convergence properties. We try to emphasize intuition rather than mathematical rigor.
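The first application the abstract mentions, fitting a mixture of Gaussian densities by EM, can be sketched in a few lines. The following is a minimal illustration for a two-component 1-D mixture, not the tutorial's own code; the function name and initialization choices are our own assumptions:

```python
import numpy as np

def em_gmm_1d(x, n_iter=50):
    """EM for a two-component 1-D Gaussian mixture (illustrative sketch)."""
    # Initialize: equal weights, means at the data extremes, full-data variance.
    pi = np.array([0.5, 0.5])
    mu = np.array([x.min(), x.max()])
    var = np.array([x.var(), x.var()])
    for _ in range(n_iter):
        # E-step: responsibility r[i, k] = P(component k | x_i).
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = pi * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: closed-form updates from the expected sufficient statistics.
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

# Two well-separated Gaussians: EM should recover means near -3 and +3.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-3, 1, 500), rng.normal(3, 1, 500)])
pi, mu, var = em_gmm_1d(x)
```

The second application, Baum-Welch for HMMs, follows the same E/M pattern: the E-step computes expected state occupancies via the forward-backward recursions instead of per-point responsibilities, and the M-step re-estimates transition and emission parameters from those expectations.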



Citations
Journal Article

A stochastic version of Expectation Maximization algorithm for better estimation of Hidden Markov Model

TL;DR: A hybrid algorithm, a Simulated Annealing stochastic version of EM (SASEM), is proposed, combining Simulated Annealing with EM; it reformulates the HMM estimation process using a stochastic step between the EM steps and the simulated annealing.
Book Chapter

Probabilistic Models for Text Mining

TL;DR: This chapter provides an overview of a variety of probabilistic models for text mining, focusing on the fundamental probabilistic techniques, and also covers their various applications to different text mining problems.
Book Chapter

Set-Oriented Dimension Reduction: Localizing Principal Component Analysis via Hidden Markov Models

TL;DR: This work demonstrates the performance of the method on a generic 102-dimensional example, applies the new HMM-PCA algorithm to a molecular dynamics simulation of 12-alanine in water, and interprets the results.
Journal Article

Unsupervised Estimation of Mouse Sleep Scores and Dynamics Using a Graphical Model of Electrophysiological Measurements

TL;DR: HMMs of EEG/EMG features can characterize sleep dynamics from EEG/EMG measurements, a prerequisite for characterizing the effects of perturbation in sleep monitoring and control applications.
Journal Article

Early prediction of the future popularity of uploaded videos

TL;DR: This study first uses a supervised learning approach to develop a video popularity prediction model, then develops an ensemble model that integrates the classification results to produce the most accurate predictions.