Showing papers by "Vincent Vanhoucke published in 2003"


Proceedings Article (DOI)
06 Apr 2003
TL;DR: A model which approximates full covariances in a Gaussian mixture while reducing significantly both the number of parameters to estimate and the computations required to evaluate the Gaussian likelihoods is described.
Abstract: We introduce a model that approximates full and block-diagonal covariances in a Gaussian mixture, while reducing significantly both the number of parameters to estimate and the computations required to evaluate the Gaussian likelihoods. The inverse covariance of each Gaussian is expressed as a mixture of a small set of prototype matrices. Estimation of both the mixture weights and the prototypes is performed using maximum likelihood estimation. Experiments on a variety of speech recognition tasks show that this model significantly outperforms a diagonal covariance model, while using the same number of Gaussian-dependent parameters.
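The model described in the abstract can be sketched in a few lines of NumPy: each Gaussian's inverse covariance is a weighted combination of a small shared set of prototype matrices, and the log-likelihood is evaluated directly from that combination. This is an illustrative reconstruction, not the authors' code; all sizes, the weight normalization, and the way prototypes are generated here are assumptions for the sake of a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)
d, K, G = 3, 2, 4   # feature dim, num prototypes, num Gaussians (illustrative sizes)

# K shared prototype matrices, constructed symmetric positive definite here
protos = []
for _ in range(K):
    A = rng.standard_normal((d, d))
    protos.append(A @ A.T + d * np.eye(d))
protos = np.array(protos)                      # (K, d, d)

# Per-Gaussian mixture weights over the prototypes (assumed nonnegative,
# normalized here only so the combination stays positive definite)
w = rng.random((G, K))
w /= w.sum(axis=1, keepdims=True)              # (G, K)

means = rng.standard_normal((G, d))            # (G, d)

def log_likelihoods(x):
    """log N(x; mu_g, Sigma_g) where inv(Sigma_g) = sum_k w[g, k] * protos[k]."""
    out = np.empty(G)
    for g in range(G):
        P = np.einsum('k,kij->ij', w[g], protos)   # mixture of inverse covariances
        diff = x - means[g]
        _, logdet = np.linalg.slogdet(P)           # log|inv(Sigma)| = -log|Sigma|
        out[g] = 0.5 * (logdet - d * np.log(2 * np.pi) - diff @ P @ diff)
    return out

x = rng.standard_normal(d)
ll = log_likelihoods(x)
```

The saving the abstract refers to comes from each Gaussian storing only K mixture weights instead of a full d(d+1)/2 covariance; in the paper the weights and prototypes are fit by maximum likelihood, which this sketch does not attempt.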

24 citations


Proceedings Article
01 Jan 2003
TL;DR: This paper introduces an extension of the mixture of inverse covariances model which uses a variable number of prototypes per Gaussian for improved efficiency and is optimized using a maximum likelihood criterion.
Abstract: The mixture of inverse covariances model is a low-complexity, approximate decomposition of the inverse covariance matrices in a Gaussian mixture model which achieves high modeling accuracy with very good computational efficiency. In this model, the inverse covariances are decomposed into a linear combination of K shared prototype matrices. In this paper, we introduce an extension of this model which uses a variable number of prototypes per Gaussian for improved efficiency. The number of prototypes per Gaussian is optimized using a maximum likelihood criterion. This variable length model is shown to achieve significantly better accuracy at a given complexity level on several speech recognition tasks.
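The efficiency argument in this abstract can be illustrated with a short sketch: the quadratic forms x^T P_k x are computed once per input frame and shared across all Gaussians, while each Gaussian combines only its own (variable-size) subset of prototypes. The subset assignments, weights, and sizes below are made up for illustration; the paper selects the number of prototypes per Gaussian by a maximum likelihood criterion, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
d, K = 3, 4   # feature dim, total shared prototypes (illustrative sizes)

# Shared prototype matrices, constructed symmetric positive definite
protos = []
for _ in range(K):
    A = rng.standard_normal((d, d))
    protos.append(A @ A.T + d * np.eye(d))
protos = np.array(protos)                            # (K, d, d)

# Hypothetical per-Gaussian prototype subsets of *variable* length
subsets = [np.array([0, 2]), np.array([1, 2, 3]), np.array([0])]
weights = [rng.random(len(s)) for s in subsets]      # positive weights per subset
means = rng.standard_normal((len(subsets), d))

x = rng.standard_normal(d)

# Shared work, done once per frame: quadratic forms x^T P_k x for every prototype
qx = np.array([x @ P @ x for P in protos])           # (K,)

def log_like(g):
    S, wg = subsets[g], weights[g]
    P = np.einsum('k,kij->ij', wg, protos[S])        # inverse covariance of Gaussian g
    mu = means[g]
    # (x-mu)^T P (x-mu) = sum_k w_k x^T P_k x - 2 (P mu)^T x + mu^T P mu;
    # in practice P @ mu and mu^T P mu would be precomputed offline per Gaussian
    quad = wg @ qx[S] - 2 * (P @ mu) @ x + mu @ P @ mu
    _, logdet = np.linalg.slogdet(P)
    return 0.5 * (logdet - d * np.log(2 * np.pi) - quad)
```

With the shared terms cached, the per-Gaussian cost scales with the size of its prototype subset rather than with d^2, which is why allowing a variable number of prototypes per Gaussian trades accuracy against complexity so directly.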

1 citation