
Showing papers on "Independent component analysis published in 2006"


Journal ArticleDOI
TL;DR: This paper considers four different sets of allowed distortions in blind audio source separation algorithms, from time-invariant gains to time-varying filters, and derives a global performance measure using an energy ratio, plus a separate performance measure for each error term.
Abstract: In this paper, we discuss the evaluation of blind audio source separation (BASS) algorithms. Depending on the exact application, different distortions can be allowed between an estimated source and the wanted true source. We consider four different sets of such allowed distortions, from time-invariant gains to time-varying filters. In each case, we decompose the estimated source into a true source part plus error terms corresponding to interferences, additive noise, and algorithmic artifacts. Then, we derive a global performance measure using an energy ratio, plus a separate performance measure for each error term. These measures are computed and discussed on the results of several BASS problems with various difficulty levels.

2,855 citations
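The decomposition described above can be sketched for the simplest allowed-distortion class (a time-invariant gain). This is a minimal numpy illustration, not the authors' toolkit; the function name `sdr_db` and the test signals are our own.

```python
import numpy as np

def sdr_db(estimate, true_source):
    """Energy ratio (in dB) between the part of the estimate explained by
    the true source under a time-invariant gain, and everything else
    (interference + noise + artifacts lumped together in this sketch)."""
    gain = np.dot(estimate, true_source) / np.dot(true_source, true_source)
    target = gain * true_source        # allowed distortion: scalar gain only
    error = estimate - target          # all remaining error terms
    return 10.0 * np.log10(np.sum(target ** 2) / np.sum(error ** 2))

rng = np.random.default_rng(0)
s = rng.standard_normal(10_000)                    # "true" source
e = 0.9 * s + 0.05 * rng.standard_normal(10_000)   # estimate with small error
print(sdr_db(e, s) > 20)                           # high ratio = good separation
```

A full evaluation would compute one such ratio per error term (interference, noise, artifacts), as the paper does.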


Journal ArticleDOI
TL;DR: The ability of several algorithms to identify the correct muscle synergies and activation coefficients in simulated data, combined with their consistency when applied to physiological data sets, suggests that the muscle synergies found by a particular algorithm are not an artifact of that algorithm, but reflect basic aspects of the organization of muscle activation patterns underlying behaviors.
Abstract: Several recent studies have used matrix factorization algorithms to assess the hypothesis that behaviors might be produced through the combination of a small number of muscle synergies. Although generally agreeing in their basic conclusions, these studies have used a range of different algorithms, making their interpretation and integration difficult. We therefore compared the performance of these different algorithms on both simulated and experimental data sets. We focused on the ability of these algorithms to identify the set of synergies underlying a data set. All data sets consisted of nonnegative values, reflecting the nonnegative data of muscle activation patterns. We found that the performance of principal component analysis (PCA) was generally lower than that of the other algorithms in identifying muscle synergies. Factor analysis (FA) with varimax rotation was better than PCA, and was generally at the same level as independent component analysis (ICA) and nonnegative matrix factorization (NMF). ICA performed very well on data sets corrupted by constant variance Gaussian noise, but was impaired on data sets with signal-dependent noise and when synergy activation coefficients were correlated. Nonnegative matrix factorization (NMF) performed similarly to ICA and FA on data sets with signal-dependent noise and was generally robust across data sets. The best algorithms were ICA applied to the subspace defined by PCA (ICAPCA) and a version of probabilistic ICA with nonnegativity constraints (pICA). We also evaluated some commonly used criteria to identify the number of synergies underlying a data set, finding that only likelihood ratios based on factor analysis identified the correct number of synergies for data sets with signal-dependent noise in some cases. We then proposed an ad hoc procedure, finding that it was able to identify the correct number in a larger number of cases.
Finally, we applied these methods to an experimentally obtained data set. The best performing algorithms (FA, ICA, NMF, ICAPCA, pICA) identified synergies very similar to one another. Based on these results, we discuss guidelines for using factorization algorithms to analyze muscle activation patterns. More generally, the ability of several algorithms to identify the correct muscle synergies and activation coefficients in simulated data, combined with their consistency when applied to physiological data sets, suggests that the muscle synergies found by a particular algorithm are not an artifact of that algorithm, but reflect basic aspects of the organization of muscle activation patterns underlying behaviors.

672 citations
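One of the factorizations compared above, NMF, can be sketched with the classic Lee-Seung multiplicative updates on synthetic nonnegative "activation" data; the dimensions, synergy count, and iteration budget below are our own illustrative choices, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic nonnegative data: 8 "muscles" x 200 samples, generated from
# 3 hypothetical synergies (all numbers chosen for illustration only).
W_true = rng.random((8, 3))
H_true = rng.random((3, 200))
V = W_true @ H_true

# Lee-Seung multiplicative updates for the Euclidean NMF cost ||V - WH||^2.
k = 3
W = rng.random((8, k)) + 1e-3
H = rng.random((k, 200)) + 1e-3
for _ in range(500):
    H *= (W.T @ V) / (W.T @ W @ H + 1e-12)   # update activations
    W *= (V @ H.T) / (W @ H @ H.T + 1e-12)   # update synergies

err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(err < 0.1)   # rank-3 nonnegative data is recovered to low residual
```

The updates keep W and H nonnegative by construction, which is why NMF suits muscle activation data.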


Journal ArticleDOI
TL;DR: It is believed that widespread application of independent component analysis and related analysis methods should bring EEG once again to the forefront of brain imaging, merging its high time and frequency resolution with enhanced cm-scale spatial resolution of its cortical sources.

634 citations


Journal ArticleDOI
TL;DR: This paper presents an independent component analysis (ICA) approach to DR, called ICA-DR, which uses mutual information as a criterion to measure statistical dependence beyond second-order statistics.
Abstract: In hyperspectral image analysis, the principal components analysis (PCA) and the maximum noise fraction (MNF) are the most commonly used techniques for dimensionality reduction (DR), referred to as PCA-DR and MNF-DR, respectively. The criteria used by the PCA-DR and the MNF-DR are data variance and signal-to-noise ratio (SNR), which are designed to measure data second-order statistics. This paper presents an independent component analysis (ICA) approach to DR, called ICA-DR, which uses mutual information as a criterion to measure statistical dependence beyond second-order statistics. As a result, the ICA-DR can capture information that cannot be retained or preserved by second-order statistics-based DR techniques. In order for the ICA-DR to perform effectively, the virtual dimensionality (VD) is introduced to estimate the number of dimensions needed to be retained, as opposed to the energy percentage that has been used by the PCA-DR and MNF-DR to determine energies contributed by signal sources and noise. Since there is no prioritization among components generated by the ICA-DR due to the use of random initial projection vectors, we further develop criteria and algorithms to measure the significance of information contained in each of the ICA-generated components for component prioritization. Finally, a comparative study and analysis is conducted among the three DR techniques, PCA-DR, MNF-DR, and ICA-DR, in two applications, endmember extraction and data compression, where the proposed ICA-DR has been shown to provide advantages over the PCA-DR and MNF-DR.

594 citations


Proceedings ArticleDOI
17 Jun 2006
TL;DR: An approach for blindly recovering the parameter needed for separating the airlight from the measurements, thus recovering contrast, with neither user interaction nor existence of the sky in the frame is derived, which eases the interaction and conditions needed for image dehazing.
Abstract: Outdoor imaging is plagued by poor visibility conditions due to atmospheric scattering, particularly in haze. A major problem is spatially-varying reduction of contrast by stray radiance (airlight), which is scattered by the haze particles towards the camera. Recent computer vision methods have shown that images can be compensated for haze, and even yield a depth map of the scene. A key step in such a scene recovery is subtraction of the airlight. In particular, this can be achieved by analyzing polarization-filtered images. However, the recovery requires parameters of the airlight. These parameters were estimated in past studies by measuring pixels in sky areas. This paper derives an approach for blindly recovering the parameter needed for separating the airlight from the measurements, thus recovering contrast, with neither user interaction nor existence of the sky in the frame. This eases the interaction and conditions needed for image dehazing, which also requires compensation for attenuation. The approach has proved successful in experiments, some of which are shown here.

542 citations


Journal ArticleDOI
TL;DR: This work shows that a "leak" of cerebral activity of interest into components marked as artifactual means that this activity will be lost, and proposes a novel wavelet-enhanced ICA method (wICA) that applies wavelet thresholding not to the observed raw EEG but to the demixed independent components as an intermediate step.

472 citations


Journal ArticleDOI
TL;DR: In this paper, a multivariate statistical process monitoring (MSPM) method based on modified independent component analysis (ICA) is proposed for fault detection and diagnosis in a wastewater treatment process, the Tennessee Eastman process, and a semiconductor etch process.
Abstract: A novel multivariate statistical process monitoring (MSPM) method based on modified independent component analysis (ICA) is proposed. ICA is a multivariate statistical tool to extract statistically independent components from observed data, which has drawn considerable attention in research fields such as neural networks, signal processing, and blind source separation. In this article, some drawbacks of the original ICA algorithm are analyzed and a modified ICA algorithm is developed for the purpose of MSPM. The basic idea of the approach is to use the modified ICA to extract some dominant independent components from normal operating process data and to combine them with statistical process monitoring techniques. Variable contribution plots to the monitoring statistics (T2 and SPE) are also developed for fault diagnosis. The proposed monitoring method is applied to fault detection and diagnosis in a wastewater treatment process, the Tennessee Eastman process, and a semiconductor etch process and is compared with conventional PCA monitoring methods. The monitoring results clearly illustrate the superiority of the proposed method. © 2006 American Institute of Chemical Engineers AIChE J, 2006

374 citations
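The two monitoring statistics used above (T² and SPE) can be sketched with a plain PCA model standing in for the paper's modified ICA; the data dimensions, number of retained components, and injected fault are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
# Normal operating data: 500 samples x 5 variables with a 2-D latent structure.
T_lat = rng.standard_normal((500, 2))
P_mix = rng.standard_normal((2, 5))
X = T_lat @ P_mix + 0.1 * rng.standard_normal((500, 5))
X = X - X.mean(axis=0)

# PCA monitoring model with 2 retained components (the paper uses a
# modified ICA in this role; PCA is shown here for brevity).
U, S, Vt = np.linalg.svd(X, full_matrices=False)
P = Vt[:2].T                        # loading matrix (5 x 2)
lam = (S[:2] ** 2) / (len(X) - 1)   # variances of the retained scores

def t2_spe(x):
    t = P.T @ x                     # scores of sample x
    t2 = float(np.sum(t ** 2 / lam))       # Hotelling T^2 statistic
    resid = x - P @ t
    spe = float(resid @ resid)             # squared prediction error (Q)
    return t2, spe

t2_ok, spe_ok = t2_spe(X[0])                                  # normal sample
t2_f, spe_f = t2_spe(X[0] + np.array([0, 0, 5.0, 0, 0]))      # sensor fault
print(spe_f > spe_ok)   # fault leaves the model subspace, inflating SPE
```

Contribution plots, as in the paper, would then apportion the inflated statistic across the original variables to diagnose the faulty sensor.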


Book ChapterDOI
TL;DR: Continued application of ICA methods in EEG research should continue to yield new insights into the nature and role of the complex macroscopic cortical dynamics captured by scalp electrode recordings.
Abstract: We discuss the theory and practice of applying independent component analysis (ICA) to electroencephalographic (EEG) data. ICA blindly decomposes multi-channel EEG data into maximally independent component processes (ICs) that typically express either particular brain-generated EEG activities or some type of non-brain artifact (line or other environmental noise, eye blinks and other eye movements, or scalp or heart muscle activity). Each brain and non-brain IC is identified with an activity time course (its 'activation') and a set of relative strengths of its projections (by volume conduction) to the recording electrodes (its 'scalp map'). Many non-artifact IC scalp maps strongly resemble the projection of a single dipole, allowing the location and orientation of the best-fitting equivalent dipole (or other source model) to be easily determined. In favorable circumstances, ICA decomposition of high-density scalp EEG data appears to allow concurrent monitoring, with high time resolution, of separate EEG activities in twenty or more separate cortical EEG source areas. We illustrate the differences between ICA and traditional approaches to EEG analysis by comparing time courses and mean event-related spectral perturbations (ERSPs) of scalp channel and IC data. Comparing IC activities across subjects necessitates clustering of similar ICs based on common dynamic and/or spatial features. We discuss and illustrate such a component clustering strategy. In sum, continued application of ICA methods in EEG research should continue to yield new insights into the nature and role of the complex macroscopic cortical dynamics captured by scalp electrode recordings.

349 citations


Journal ArticleDOI
TL;DR: ICA has recently demonstrated considerable promise in characterizing functional magnetic resonance imaging data, primarily due to its intuitive nature and ability for flexible characterization of the brain function.
Abstract: Independent component analysis (ICA) is a statistical method used to discover hidden factors (sources or features) from a set of measurements or observed data such that the sources are maximally independent. Typically, it assumes a generative model where observations are assumed to be linear mixtures of independent sources and works with higher-order statistics to achieve independence. ICA has recently demonstrated considerable promise in characterizing functional magnetic resonance imaging (fMRI) data, primarily due to its intuitive nature and ability for flexible characterization of the brain function. In this article, ICA is introduced and its application to fMRI data analysis is reviewed.

307 citations


Journal ArticleDOI
TL;DR: An improved version of the FastICA algorithm is proposed which is asymptotically efficient, i.e., its accuracy given by the residual error variance attains the Cramer-Rao lower bound (CRB).
Abstract: FastICA is one of the most popular algorithms for independent component analysis (ICA), demixing a set of statistically independent sources that have been mixed linearly. A key question is how accurate the method is for finite data samples. We propose an improved version of the FastICA algorithm which is asymptotically efficient, i.e., its accuracy given by the residual error variance attains the Cramer-Rao lower bound (CRB). The error is thus as small as possible. This result is rigorously proven under the assumption that the probability distribution of the independent signal components belongs to the class of generalized Gaussian (GG) distributions with parameter alpha, denoted GG(alpha) for alpha>2. We name the algorithm efficient FastICA (EFICA). Computational complexity of a Matlab implementation of the algorithm is shown to be only slightly (about three times) higher than that of the standard symmetric FastICA. Simulations corroborate these claims and show superior performance of the algorithm compared with algorithm JADE of Cardoso and Souloumiac and nonparametric ICA of Boscolo on separating sources with distribution GG(alpha) with arbitrary alpha, as well as on sources with bimodal distribution, and a good performance in separating linearly mixed speech signals.

284 citations
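For reference, the standard one-unit FastICA fixed-point iteration that EFICA refines (this is the baseline algorithm, not the EFICA modification itself) looks as follows in numpy; the mixing matrix and source distributions are our own choices.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20_000
# Two non-Gaussian sources (Laplacian and uniform), linearly mixed.
S = np.vstack([rng.laplace(size=n), rng.uniform(-1, 1, size=n)])
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S
X -= X.mean(axis=1, keepdims=True)

# Whitening: decorrelate the mixtures and scale to unit variance.
d, E = np.linalg.eigh(np.cov(X))
Z = (E / np.sqrt(d)) @ E.T @ X

# One-unit FastICA fixed-point iteration with the tanh nonlinearity.
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(100):
    g = np.tanh(Z.T @ w)
    w_new = (Z @ g) / n - (1.0 - g ** 2).mean() * w
    w_new /= np.linalg.norm(w_new)
    converged = abs(w_new @ w) > 1 - 1e-10
    w = w_new
    if converged:
        break

y = w @ Z   # recovered component: matches one source up to sign and scale
corr = max(abs(np.corrcoef(y, S[0])[0, 1]), abs(np.corrcoef(y, S[1])[0, 1]))
print(corr > 0.95)
```

EFICA's contribution is to adapt the nonlinearity and combine unit estimates so that the residual error variance reaches the CRB; the fixed-point skeleton stays the same.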


Journal ArticleDOI
TL;DR: Among the ICA algorithms, the best performance was achieved by Infomax when using all 22 components as well as for the selected 6 components; however, the performance of Laplacian derivations was comparable with Infomax for both cross-validated and unseen data.
Abstract: This paper compares different ICA preprocessing algorithms on cross-validated training data as well as on unseen test data. The EEG data were recorded from 22 electrodes placed over the whole scalp during motor imagery tasks consisting of four different classes, namely the imagination of right hand, left hand, foot and tongue movements. Two sessions on different days were recorded for eight subjects. Three different independent components analysis (ICA) algorithms (Infomax, FastICA and SOBI) were studied and compared to common spatial patterns (CSP), Laplacian derivations and standard bipolar derivations, which are other well-known preprocessing methods. Among the ICA algorithms, the best performance was achieved by Infomax when using all 22 components as well as for the selected 6 components. However, the performance of Laplacian derivations was comparable with Infomax for both cross-validated and unseen data. The overall best four-class classification accuracies (between 33% and 84%) were obtained with CSP. For the cross-validated training data, CSP performed slightly better than Infomax, whereas for unseen test data, CSP yielded significantly better classification results than Infomax in one of the sessions.

Book ChapterDOI
05 Mar 2006
TL;DR: This paper solves an ICA problem where both source and observation signals are multivariate, thus, vectorized signals and proposes the frequency domain blind source separation (BSS) for convolutive mixtures as an application of IVA.
Abstract: In this paper, we solve an ICA problem where both source and observation signals are multivariate, thus, vectorized signals. To derive the algorithm, we define dependence between vectors as the Kullback-Leibler divergence between the joint probability and the product of marginal probabilities, and propose a vector density model that has a variance dependency within a source vector. The example shows that the algorithm successfully recovers the sources and does not cause any permutation ambiguities within the sources. Finally, we propose frequency-domain blind source separation (BSS) for convolutive mixtures as an application of independent vector analysis (IVA), which separates six speech signals recorded with six microphones in a reverberant room environment.

Journal ArticleDOI
TL;DR: The signal separation performance of the proposed algorithm is superior to that of the conventional ICA-based BSS method, even under reverberant conditions, and the temporal alternation between ICA and beamforming can realize fast- and high-convergence optimization.
Abstract: We propose a new algorithm for blind source separation (BSS), in which independent component analysis (ICA) and beamforming are combined to resolve the slow-convergence problem through optimization in ICA. The proposed method consists of the following three parts: (a) frequency-domain ICA with direction-of-arrival (DOA) estimation, (b) null beamforming based on the estimated DOA, and (c) integration of (a) and (b) based on the algorithm diversity in both iteration and frequency domain. The unmixing matrix obtained by ICA is temporally substituted by the matrix based on null beamforming through iterative optimization, and the temporal alternation between ICA and beamforming can realize fast- and high-convergence optimization. The results of the signal separation experiments reveal that the signal separation performance of the proposed algorithm is superior to that of the conventional ICA-based BSS method, even under reverberant conditions.

Book ChapterDOI
05 Mar 2006
TL;DR: This work proposes a new framework for separation of the whole spectrograms instead of the conventional binwise separation, and demonstrates a gradient-based algorithm using multivariate activation functions derived from the PDFs.
Abstract: Conventional Independent Component Analysis (ICA) in frequency domain inherently causes the permutation problem. To solve the problem fundamentally, we propose a new framework for separation of the whole spectrograms instead of the conventional binwise separation. Under our framework, a measure of independence is calculated from the whole spectrograms, not individual frequency bins. For the calculation, we introduce some multivariate probability density functions (PDFs) which take a spectrum as arguments. To seek the unmixing matrix that makes spectrograms independent, we demonstrate a gradient-based algorithm using multivariate activation functions derived from the PDFs. Through experiments using real sound data, we have confirmed that our framework is effective to generate permutation-free unmixed results.

Journal ArticleDOI
TL;DR: Experimental results demonstrate that the ICA-AQA performs at least comparably to abundance-constrained methods, which is a high-order statistics-based technique.
Abstract: Independent component analysis (ICA) has shown success in many applications. This paper investigates a new application of the ICA in endmember extraction and abundance quantification for hyperspectral imagery. An endmember is generally referred to as an idealized pure signature for a class whose presence is considered to be rare. When it occurs, it may not appear in a large population. In this case, the commonly used principal components analysis may not be effective since endmembers usually contribute very little in statistics to data variance. In order to substantiate the author's findings, an ICA-based approach, called the ICA-based abundance quantification algorithm (ICA-AQA), is developed. Three novelties result from the author's proposed ICA-AQA. First, unlike the commonly used least squares abundance-constrained linear spectral mixture analysis (ACLSMA), which is a second-order statistics-based method, the ICA-AQA is a high-order statistics-based technique. Second, due to the use of statistical independency, it is generally thought that the ICA cannot be implemented as a constrained method. The ICA-AQA shows otherwise. Third, in order for the ACLSMA to perform the abundance quantification, it requires an algorithm to find image endmembers first, followed by an abundance-constrained algorithm for quantification. As opposed to such a two-stage process, the ICA-AQA can accomplish endmember extraction and abundance quantification simultaneously in a one-shot operation. Experimental results demonstrate that the ICA-AQA performs at least comparably to abundance-constrained methods.

Journal ArticleDOI
TL;DR: A neural algorithm is proposed using a Newton-like approach to obtain an optimal solution to the constrained optimization problem and experiments with synthetic signals and real fMRI data demonstrate the efficacy and accuracy of the proposed algorithm.

Journal ArticleDOI
TL;DR: The conditions for identifiability, separability and uniqueness of linear complex valued independent component analysis (ICA) models are established and the Darmois-Skitovich theorem for complex-valued models is extended.
Abstract: In this paper, the conditions for identifiability, separability and uniqueness of linear complex-valued independent component analysis (ICA) models are established. These results extend the well-known conditions for solving real-valued ICA problems to complex-valued models. Relevant properties of complex random vectors are described in order to extend the Darmois-Skitovich theorem to complex-valued models. This theorem is used to construct a proof of a theorem for each of the above ICA model concepts. Both circular and noncircular complex random vectors are covered. Examples clarifying the above concepts are presented.

Journal ArticleDOI
TL;DR: A novel and successful method for palmprint recognition based on a radial basis probabilistic neural network (RBPNN), which achieves a higher recognition rate and better classification efficiency than other commonly used classifiers.

Journal ArticleDOI
TL;DR: An ICA algorithm is tested on three-channel ECG recordings taken from human subjects, mostly in the coronary care unit, and results are presented that show that ICA can detect and remove a variety of noise and artefact sources in these ECGs.
Abstract: Routinely recorded electrocardiograms (ECGs) are often corrupted by different types of artefacts and many efforts have been made to enhance their quality by reducing the noise or artefacts. This paper addresses the problem of removing noise and artefacts from ECGs using independent component analysis (ICA). An ICA algorithm is tested on three-channel ECG recordings taken from human subjects, mostly in the coronary care unit. Results are presented that show that ICA can detect and remove a variety of noise and artefact sources in these ECGs. One difficulty with the application of ICA is the determination of the order of the independent components. A new technique based on simple statistical parameters is proposed to solve this problem in this application. The developed technique is successfully applied to the ECG data and offers potential for online processing of ECG using ICA.

Book ChapterDOI
05 Mar 2006
TL;DR: A novel method for blind separation of instruments in single channel polyphonic music based on a non-negative matrix factor 2-D deconvolution algorithm that has applications in computational auditory scene analysis, music information retrieval, and automatic music transcription is presented.
Abstract: We present a novel method for blind separation of instruments in single-channel polyphonic music based on a non-negative matrix factor 2-D deconvolution algorithm. The method is an extension of NMFD recently introduced by Smaragdis [1]. Using a model which is convolutive in both time and frequency, we factorize a spectrogram representation of music into components corresponding to individual instruments. Based on this factorization, we separate the instruments using spectrogram masking. The proposed algorithm has applications in computational auditory scene analysis, music information retrieval, and automatic music transcription.

Journal ArticleDOI
TL;DR: The results show the improved sound quality obtained with the Student t prior and the better robustness to mixing matrices close to singularity of the Markov chain Monte Carlo approach.
Abstract: We present a Bayesian approach for blind separation of linear instantaneous mixtures of sources having a sparse representation in a given basis. The distributions of the coefficients of the sources in the basis are modeled by a Student t distribution, which can be expressed as a scale mixture of Gaussians, and a Gibbs sampler is derived to estimate the sources, the mixing matrix, the input noise variance and also the hyperparameters of the Student t distributions. The method allows for separation of underdetermined (more sources than sensors) noisy mixtures. Results are presented with audio signals using a modified discrete cosine transform basis and compared with a finite mixture of Gaussians prior approach. These results show the improved sound quality obtained with the Student t prior and the better robustness to mixing matrices close to singularity of the Markov chain Monte Carlo approach.
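The scale-mixture representation used above, a Student t expressed as a Gaussian whose precision is Gamma-distributed, can be verified numerically; the degrees of freedom ν = 3 are our own illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(4)
nu = 3.0       # degrees of freedom, chosen for illustration
n = 200_000

# Scale-mixture construction: x | lambda ~ N(0, 1/lambda),
# with lambda ~ Gamma(nu/2, rate = nu/2)  =>  x ~ Student t with nu d.o.f.
precision = rng.gamma(shape=nu / 2, scale=2.0 / nu, size=n)
x_mix = rng.standard_normal(n) / np.sqrt(precision)

# Direct Student t draws for comparison.
x_t = rng.standard_t(df=nu, size=n)

# The two samples should have matching quantiles (here: the 90th percentile).
q_mix = np.quantile(x_mix, 0.9)
q_t = np.quantile(x_t, 0.9)
print(abs(q_mix - q_t) < 0.05)
```

This representation is what makes the Gibbs sampler tractable: conditioned on the latent precisions, the source coefficients are Gaussian.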

Journal ArticleDOI
TL;DR: An automated system to remove artifacts from ictal scalp EEG using independent component analysis (ICA) successfully rejected a good proportion of artifactual components extracted by ICA, while preserving almost all EEG components.

Journal ArticleDOI
TL;DR: This paper shows that the good convergence properties of the one-unit case are also shared by the full algorithm with symmetrical normalization and the global behavior is illustrated numerically for two sources and two mixtures in several typical cases.
Abstract: The fast independent component analysis (FastICA) algorithm is one of the most popular methods to solve problems in ICA and blind source separation. It has been shown experimentally that it outperforms most of the commonly used ICA algorithms in convergence speed. A rigorous local convergence analysis has been presented only for the so-called one-unit case, in which just one of the rows of the separating matrix is considered. However, in the FastICA algorithm, there is also an explicit normalization step, and it may be questioned whether the extra rotation caused by the normalization will affect the convergence speed. The purpose of this paper is to show that this is not the case and the good convergence properties of the one-unit case are also shared by the full algorithm with symmetrical normalization. A local convergence analysis is given for the general case, and the global behavior is illustrated numerically for two sources and two mixtures in several typical cases.
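The symmetric normalization step analyzed here replaces the separating matrix W by (W Wᵀ)^(-1/2) W after each parallel update, so that all rows are treated equally; a minimal numpy sketch of just that step (not the full FastICA algorithm):

```python
import numpy as np

def symmetric_decorrelation(W):
    """Return (W W^T)^(-1/2) W via the eigendecomposition of W W^T.
    This is the symmetric normalization applied after each parallel
    FastICA update."""
    d, E = np.linalg.eigh(W @ W.T)
    return E @ np.diag(1.0 / np.sqrt(d)) @ E.T @ W

rng = np.random.default_rng(5)
W = rng.standard_normal((3, 3))
W_sym = symmetric_decorrelation(W)

# After normalization the rows are orthonormal: W_sym W_sym^T = I.
print(np.allclose(W_sym @ W_sym.T, np.eye(3), atol=1e-8))
```

The paper's result is that this extra rotation does not degrade the cubic local convergence known from the one-unit analysis.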

Journal ArticleDOI
TL;DR: The main results of this paper are analytic closed-form expressions that characterize the separating ability of both versions of the FastICA algorithm in a local sense, assuming a "good" initialization of the algorithms and long data records.
Abstract: The FastICA or fixed-point algorithm is one of the most successful algorithms for linear independent component analysis (ICA) in terms of accuracy and computational complexity. Two versions of the algorithm are available in literature and software: a one-unit (deflation) algorithm and a symmetric algorithm. The main results of this paper are analytic closed-form expressions that characterize the separating ability of both versions of the algorithm in a local sense, assuming a "good" initialization of the algorithms and long data records. Based on the analysis, it is possible to combine the advantages of the symmetric and one-unit version algorithms and predict their performance. To validate the analysis, a simple check of saddle points of the cost function is proposed that allows one to find a global minimum of the cost function in almost 100% of simulation runs. In addition, the Cramér-Rao lower bound for linear ICA is derived as an algorithm-independent limit of the achievable separation quality. The FastICA algorithm is shown to approach this limit in certain scenarios. Extensive computer simulations supporting the theoretical findings are included.

Journal ArticleDOI
TL;DR: An enhanced ICA algorithm by ensemble learning approach, named as random independent subspace (RIS), to deal with the two problems of unstable ICA classifier and small sample size problem is proposed.

Proceedings ArticleDOI
24 Jul 2006
TL;DR: In this paper, a scalable and efficient parameterized block-based statistical static timing analysis algorithm incorporating both Gaussian and non-Gaussian parameter distributions, capturing spatial correlations using a grid-based model is proposed.
Abstract: We propose a scalable and efficient parameterized block-based statistical static timing analysis algorithm incorporating both Gaussian and non-Gaussian parameter distributions, capturing spatial correlations using a grid-based model. As a preprocessing step, we employ independent component analysis to transform the set of correlated non-Gaussian parameters to a basis set of parameters that are statistically independent, and principal components analysis to orthogonalize the Gaussian parameters. The procedure requires minimal input information: given the moments of the variational parameters, we use a Pade approximation-based moment matching scheme to generate the distributions of the random variables representing the signal arrival times, and preserve correlation information by propagating arrival times in a canonical form. For the ISCAS89 benchmark circuits, as compared to Monte Carlo simulations, we obtain average errors of 0.99% and 2.05%, respectively, in the mean and standard deviation of the circuit delay. For a circuit with |G| gates and a layout with g spatial correlation grids, the complexity of our approach is O(g|G|).
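The PCA preprocessing step mentioned above, orthogonalizing the correlated Gaussian parameters, can be sketched as follows; the covariance matrix and sample size are hypothetical, chosen only to show the rotation.

```python
import numpy as np

rng = np.random.default_rng(6)
# Correlated Gaussian process parameters (e.g., per-grid channel length);
# covariance values are arbitrary, for illustration only.
cov = np.array([[1.0, 0.8, 0.3],
                [0.8, 1.0, 0.5],
                [0.3, 0.5, 1.0]])
L = np.linalg.cholesky(cov)
X = rng.standard_normal((50_000, 3)) @ L.T   # samples with covariance `cov`

# PCA step: rotate onto the eigenvectors of the sample covariance so the
# transformed parameters are uncorrelated (and, being Gaussian, independent).
d, E = np.linalg.eigh(np.cov(X.T))
Y = X @ E

C = np.cov(Y.T)
off_diag = C - np.diag(np.diag(C))
print(np.max(np.abs(off_diag)) < 0.02)   # cross-correlations removed
```

The non-Gaussian parameters cannot be handled this way, which is why the paper applies ICA to them instead: decorrelation alone does not imply independence beyond the Gaussian case.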

Journal ArticleDOI
TL;DR: A family of probabilistic mixture generative models combining modified positive independent subspace analysis, localization models, and segmental models is designed and it is shown that they outperform methods exploiting spatial diversity only and that they are robust against approximate localization of the sources.
Abstract: This article deals with the source separation problem for stereo musical mixtures using prior information about the sources (instrument names and localization). After a brief review of existing methods, we design a family of probabilistic mixture generative models combining modified positive independent subspace analysis (ISA), localization models, and segmental models (SM). We express source separation as a Bayesian estimation problem and we propose efficient resolution algorithms. The resulting separation methods rely on a variable number of cues including harmonicity, spectral envelope, azimuth, note duration, and monophony. We compare these methods on two synthetic mixtures with long reverberation. We show that they outperform methods exploiting spatial diversity only and that they are robust against approximate localization of the sources.

Journal ArticleDOI
TL;DR: Profiles extracted by MCR-WALS exhibit a strong correlation with cell cycle-associated genes, but also suggest new insights into the regulation of those genes, as demonstrated through the method's application to yeast cell cycle data.
Abstract: Background: Modeling of gene expression data from time course experiments often involves the use of linear models such as those obtained from principal component analysis (PCA), independent component analysis (ICA), or other methods. Such methods do not generally yield factors with a clear biological interpretation. Moreover, implicit assumptions about the measurement errors often limit the application of these methods to log-transformed data, destroying linear structure in the untransformed expression data.

Journal ArticleDOI
TL;DR: A new sophisticated method for deciding the number of target sources and then selecting their frequency components is proposed and a new criterion for specifying time-frequency masks is proposed.
Abstract: This paper presents a method for enhancing target sources of interest and suppressing other interference sources. The target sources are assumed to be close to sensors, to have dominant powers at these sensors, and to have non-Gaussianity. The enhancement is performed blindly, i.e., without knowing the position and active time of each source. We consider a general case where the total number of sources is larger than the number of sensors, and neither the number of target sources nor the total number of sources is known. The method is based on a two-stage process where independent component analysis (ICA) is first employed in each frequency bin and then time-frequency masking is used to improve the performance further. We propose a new sophisticated method for deciding the number of target sources and then selecting their frequency components. We also propose a new criterion for specifying time-frequency masks. Experimental results for simulated cocktail party situations in a room, whose reverberation time was 130 ms, are presented to show the effectiveness and characteristics of the proposed method.

Journal ArticleDOI
TL;DR: A fast incremental principal non-Gaussian directions analysis algorithm, called IPCA-ICA, is introduced that computes the principal components of a sequence of image vectors incrementally without estimating the covariance matrix (so covariance-free) and at the same time transforms these principal components to the independent directions that maximize the non-Gaussianity of the source.
Abstract: In this paper, a fast incremental principal non-Gaussian directions analysis algorithm, called IPCA-ICA, is introduced. This algorithm computes the principal components of a sequence of image vectors incrementally without estimating the covariance matrix (so covariance-free) and at the same time transforming these principal components to the independent directions that maximize the non-Gaussianity of the source. Two major techniques are used sequentially in a real-time fashion in order to obtain the most efficient and independent components that describe a whole set of human faces database. This procedure is done by merging the runs of two algorithms based on principal component analysis (PCA) and independent component analysis (ICA) running sequentially. This algorithm is applied to face recognition problem. Simulation results on different databases showed high average success rate of this algorithm compared to others.