Open Access Journal Article
A gentle tutorial of the EM algorithm and its application to parameter estimation for Gaussian mixture and hidden Markov models
TLDR
In this paper, the authors describe the EM algorithm for finding the parameters of a mixture of Gaussian densities and of a hidden Markov model (HMM) for both discrete and Gaussian mixture observation models.
Abstract
We describe the maximum-likelihood parameter estimation problem and how the Expectation-Maximization (EM) algorithm can be used for its solution. We first describe the abstract form of the EM algorithm as it is often given in the literature. We then develop the EM parameter estimation procedure for two applications: 1) finding the parameters of a mixture of Gaussian densities, and 2) finding the parameters of a hidden Markov model (HMM) (i.e., the Baum-Welch algorithm) for both discrete and Gaussian mixture observation models. We derive the update equations in fairly explicit detail, but we do not prove any convergence properties. We try to emphasize intuition rather than mathematical rigor.
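The first application the abstract mentions, fitting a mixture of Gaussian densities with EM, can be sketched in a few lines. The following is a minimal illustration for a two-component 1-D mixture, not the paper's derivation: the function name, initialization scheme, and synthetic data are all illustrative assumptions.

```python
# Minimal sketch of EM for a two-component 1-D Gaussian mixture.
# Illustrative only: names, initialization, and data are assumptions.
import math
import random

def gaussian_pdf(x, mu, var):
    # Density of N(mu, var) evaluated at x.
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm(data, n_iter=50):
    # Crude initialization: place the two means at the data extremes.
    mu = [min(data), max(data)]
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: responsibilities gamma[i][k] = P(component k | x_i).
        gamma = []
        for x in data:
            w = [pi[k] * gaussian_pdf(x, mu[k], var[k]) for k in range(2)]
            s = sum(w)
            gamma.append([wk / s for wk in w])
        # M-step: re-estimate mixture weights, means, and variances.
        for k in range(2):
            nk = sum(g[k] for g in gamma)
            pi[k] = nk / len(data)
            mu[k] = sum(g[k] * x for g, x in zip(gamma, data)) / nk
            var[k] = sum(g[k] * (x - mu[k]) ** 2
                         for g, x in zip(gamma, data)) / nk
            var[k] = max(var[k], 1e-6)  # guard against variance collapse
    return pi, mu, var

# Synthetic data from two well-separated Gaussians.
random.seed(0)
data = ([random.gauss(-2.0, 0.5) for _ in range(200)] +
        [random.gauss(3.0, 1.0) for _ in range(200)])
pi, mu, var = em_gmm(data)
```

Each iteration monotonically increases the data likelihood; on this synthetic sample the recovered means land near the true values of -2 and 3.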
Citations
Proceedings Article
Unlabeled Data Can Degrade Classification Performance of Generative Classifiers
Fabio Gagliardi Cozman, Ira Cohen, et al.
TL;DR: It is shown that unlabeled data can degrade the performance of a classifier when there are discrepancies between modeling assumptions used to build the classifier and the actual model that generates the data.
Journal Article
Analysis and automatic identification of sleep stages using higher order spectra.
TL;DR: This study uses a nonlinear technique, higher order spectra (HOS), to extract hidden information in the sleep EEG signal, and indicates that the proposed system is able to identify sleep stages with an accuracy of 88.7%.
Proceedings Article
EM algorithms of Gaussian mixture model and hidden Markov model
TL;DR: The EM of GMM can be regarded as a special case of the EM of HMM, and the EM algorithm of GMM based on symbols is faster in implementation than the traditional EM algorithms based on samples (or observations).
Journal Article
Automatic identification of epileptic and background EEG signals using frequency domain parameters.
TL;DR: This paper used the autoregressive moving average as well as the Yule-Walker and Burg methods to extract the power density spectrum from representative signal samples, and found that Burg's method for spectrum estimation together with a support vector machine classifier yields the best classification results.
Proceedings Article
Grounded Language Learning from Video Described with Sentences
Haonan Yu, Jeffrey Mark Siskind, et al.
TL;DR: A method that learns representations for word meanings from short video clips paired with sentences; the learned representations can subsequently be used to automatically generate descriptions of new video.
References
Journal Article
Maximum likelihood from incomplete data via the EM algorithm
Book
The Nature of Statistical Learning Theory
TL;DR: Covers the setting of the learning problem, consistency of learning processes, bounds on the rate of convergence of learning processes, controlling the generalization ability of learning processes, constructing learning algorithms, and what is important in learning theory.
Statistical learning theory
TL;DR: Presents a method for determining the necessary and sufficient conditions for consistency of learning processes, covering function estimation from small data pools, the application of these estimates to real-life problems, and much more.
Book
The Fractal Geometry of Nature
TL;DR: This book is a blend of erudition, popularization, and exposition, and the illustrations include many superb examples of computer graphics that are works of art in their own right.