Topic
Expectation–maximization algorithm
About: Expectation–maximization algorithm is a research topic. Over its lifetime, 11,823 publications have been published within this topic, receiving 528,693 citations. The topic is also known as: EM algorithm & Expectation Maximization.
Papers published on a yearly basis
Papers
TL;DR: This article proposes a framework in which deep neural networks model the source spectra and are combined with the classical multichannel Gaussian model to exploit spatial information, and presents an application to speech enhancement.
Abstract: This article addresses the problem of multichannel audio source separation. We propose a framework where deep neural networks (DNNs) are used to model the source spectra and combined with the classical multichannel Gaussian model to exploit the spatial information. The parameters are estimated in an iterative expectation-maximization (EM) fashion and used to derive a multichannel Wiener filter. We present an extensive experimental study to show the impact of different design choices on the performance of the proposed technique. We consider different cost functions for the training of DNNs, namely the probabilistically motivated Itakura–Saito divergence, and also Kullback–Leibler, Cauchy, mean squared error, and phase-sensitive cost functions. We also study the number of EM iterations and the use of multiple DNNs, where each DNN aims to improve the spectra estimated by the preceding EM iteration. Finally, we present its application to a speech enhancement problem. The experimental results show the benefit of the proposed multichannel approach over a single-channel DNN-based approach and the conventional multichannel nonnegative matrix factorization-based iterative EM algorithm.
304 citations
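In the single-channel case, the Wiener filter derived from this Gaussian model reduces to weighting each time–frequency bin of the mixture by each source's share of the total power. A minimal sketch of that reduction (the multichannel version in the paper replaces the scalar variances below with spatial covariance matrices; this simplification is ours, not the paper's code):

```python
def wiener_filter(mix, source_vars):
    """Per-bin Wiener filtering under a Gaussian source model.

    mix: list of complex STFT values of the mixture, one per frequency bin.
    source_vars: one list of estimated power-spectrum values per source
    (in the paper these would come from the DNN spectral models).
    Each source estimate is its power share of the mixture; the estimates
    of all sources sum back to the mixture by construction.
    """
    estimates = []
    for v_s in source_vars:
        est = []
        for f, x in enumerate(mix):
            total = sum(v[f] for v in source_vars)  # total power in bin f
            est.append(v_s[f] / total * x)          # Wiener gain * mixture
        estimates.append(est)
    return estimates
```

Because the gains sum to one in every bin, the separated sources always reconstruct the mixture exactly, which is a useful sanity check on any implementation.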
TL;DR: An adaptive algorithm is proposed that iteratively updates both the weights and component parameters of a mixture importance-sampling density so as to optimise the performance of importance sampling, as measured by an entropy criterion.
Abstract: In this paper, we propose an adaptive algorithm that iteratively updates both the weights and component parameters of a mixture importance sampling density so as to optimise the performance of importance sampling, as measured by an entropy criterion. The method, called M-PMC, is shown to be applicable to a wide class of importance sampling densities, which includes in particular mixtures of multivariate Student t distributions. The performance of the proposed scheme is studied on both artificial and real examples, highlighting in particular the benefit of a novel Rao-Blackwellisation device which can be easily incorporated in the updating scheme.
302 citations
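The core loop of such a scheme — sample from a mixture proposal, weight by target over proposal, then re-estimate the mixture from the weighted responsibilities — can be sketched as follows. This is an illustrative toy, not the paper's exact M-PMC algorithm: the Gaussian (rather than Student t) components, the toy target `target_pdf`, and the fixed component count are all our simplifying assumptions.

```python
import math
import random

def normal_pdf(x, mu, sigma):
    return math.exp(-((x - mu) / sigma) ** 2 / 2.0) / (sigma * math.sqrt(2.0 * math.pi))

def target_pdf(x):
    # Unnormalised toy target: a Gaussian centred at 3 (stand-in for a real posterior).
    return math.exp(-((x - 3.0) ** 2) / 2.0)

def adaptive_mixture_is(n=2000, iters=5, seed=0):
    """Toy adaptive mixture importance sampling in the spirit of PMC:
    the proposal is a two-component Gaussian mixture whose weights and
    means are re-fitted each round from importance-weighted responsibilities
    (a Rao-Blackwellised, EM-like update). Sketch only."""
    rng = random.Random(seed)
    w = [0.5, 0.5]          # mixture weights
    mu = [-5.0, 5.0]        # deliberately poor initial component means
    sigma = 1.0             # component scales held fixed for simplicity
    est = 0.0
    for _ in range(iters):
        xs, iw, resp = [], [], []
        for _ in range(n):
            j = 0 if rng.random() < w[0] else 1          # pick a component
            x = rng.gauss(mu[j], sigma)                  # then sample from it
            q = w[0] * normal_pdf(x, mu[0], sigma) + w[1] * normal_pdf(x, mu[1], sigma)
            xs.append(x)
            iw.append(target_pdf(x) / q)                 # importance weight
            resp.append([w[k] * normal_pdf(x, mu[k], sigma) / q for k in range(2)])
        tot = sum(iw)
        # Self-normalised estimate of the target mean from this round.
        est = sum(wi * xi for wi, xi in zip(iw, xs)) / tot
        # Weighted EM-style update of mixture weights and means.
        for k in range(2):
            nk = sum(wi * r[k] for wi, r in zip(iw, resp))
            if nk > 1e-12 * tot:                         # guard a dying component
                mu[k] = sum(wi * r[k] * xi for wi, r, xi in zip(iw, resp, xs)) / nk
            w[k] = nk / tot
        s = sum(w)
        w = [wk / s for wk in w]
    return w, mu, est
```

After a few rounds the dominant component migrates to the target's mode, which is the adaptation behaviour the entropy criterion in the paper is designed to drive.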
TL;DR: Experimental results show that the estimated Gaussian mixture model fits skin images from a large database and applications of the estimated density function in image and video databases are presented.
Abstract: This paper is concerned with estimating a probability density function of human skin color, using a finite Gaussian mixture model, whose parameters are estimated through the EM algorithm. Hawkins' statistical test on the normality and homoscedasticity (common covariance matrix) of the estimated Gaussian mixture models is performed and McLachlan's bootstrap method is used to test the number of components in a mixture. Experimental results show that the estimated Gaussian mixture model fits skin images from a large database. Applications of the estimated density function in image and video databases are presented.
302 citations
TL;DR: In this paper, a maximum likelihood estimation procedure of Hawkes' self-exciting point process model is proposed with explicit presentations of the log-likelihood of the model and its gradient and Hessian.
Abstract: A maximum likelihood estimation procedure of Hawkes' self-exciting point process model is proposed with explicit presentations of the log-likelihood of the model and its gradient and Hessian. A simulation method of the process is also presented. Some numerical results are given.
301 citations
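For the common exponential-kernel special case, the explicit log-likelihood can be evaluated in linear time using a standard recursion. A sketch (the particular kernel parameterisation, with `alpha` as the branching ratio, is our assumption, not necessarily the paper's exact form):

```python
import math

def hawkes_loglik(times, mu, alpha, beta, T):
    """Log-likelihood of a univariate Hawkes process on [0, T] with
    intensity  lambda(t) = mu + sum_{t_i < t} alpha * beta * exp(-beta * (t - t_i)),
    so alpha is the branching ratio. `times` must be sorted and lie in [0, T].
    Uses the recursion A_i = exp(-beta * (t_i - t_{i-1})) * (1 + A_{i-1})
    to avoid the naive O(n^2) double sum. Illustrative sketch only.
    """
    loglik = -mu * T                      # compensator of the baseline rate
    A = 0.0
    prev = None
    for t in times:
        if prev is not None:
            A = math.exp(-beta * (t - prev)) * (1.0 + A)
        loglik += math.log(mu + alpha * beta * A)          # sum of log-intensities
        loglik -= alpha * (1.0 - math.exp(-beta * (T - t)))  # excitation compensator
        prev = t
    return loglik
```

With the log-likelihood in this form, its gradient and Hessian in (mu, alpha, beta) are also available in closed form, which is what makes the Newton-type maximum likelihood procedure of the paper practical.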
01 Jan 2001
TL;DR: In this article, a program for maximum likelihood estimation of general stable parameters is described, and the Fisher information matrix is computed, making large-sample estimation of stable parameters a practical tool.
Abstract: A program for maximum likelihood estimation of general stable parameters is described. The Fisher information matrix is computed, making large sample estimation of stable parameters a practical tool. In addition, diagnostics are developed for assessing the stability of a data set. Applications to simulated data, stock price data, foreign exchange rate data, radar data, and ocean wave energy are presented.
300 citations