
Showing papers on "Independent component analysis published in 2008"


Journal ArticleDOI
TL;DR: The impressive results prove that the integration of independent component analysis and neural networks, especially PNN, is a promising scheme for the computer-aided diagnosis of heart diseases based on ECG.
Abstract: In this paper, we propose a scheme to integrate independent component analysis (ICA) and neural networks for electrocardiogram (ECG) beat classification. The ICA is used to decompose ECG signals into a weighted sum of basic components that are statistically mutually independent. The projections on these components, together with the RR interval, then constitute a feature vector for the subsequent classifier. Two neural networks, a probabilistic neural network (PNN) and a back-propagation neural network (BPNN), are employed as classifiers. ECG samples attributed to eight different beat types were drawn from the MIT-BIH arrhythmia database for experiments. The results show high classification accuracy of over 98% with either of the two classifiers. Between them, the PNN shows slightly better performance than the BPNN in terms of accuracy and robustness to the number of ICA bases. The impressive results prove that the integration of independent component analysis and neural networks, especially PNN, is a promising scheme for the computer-aided diagnosis of heart diseases based on ECG.
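The classifier stage of a pipeline like this can be sketched with a minimal probabilistic neural network, which is essentially a Parzen-window density estimate per beat class. The toy 2-D features below (standing in for ICA projections plus the RR interval) and the smoothing width `sigma` are invented for illustration, not taken from the paper:

```python
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    """Probabilistic neural network: pick the class whose Parzen-window
    (Gaussian-kernel) density estimate is highest at each test point."""
    classes = np.unique(y_train)
    scores = []
    for c in classes:
        Xc = X_train[y_train == c]                     # training beats of class c
        d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(-1)
        scores.append(np.exp(-d2 / (2 * sigma ** 2)).mean(axis=1))
    return classes[np.argmax(np.stack(scores), axis=0)]

# Toy usage: two well-separated "beat types" in a 2-D feature space.
rng = np.random.default_rng(0)
X0 = rng.normal([0, 0], 0.3, size=(50, 2))   # beat type A
X1 = rng.normal([3, 3], 0.3, size=(50, 2))   # beat type B
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)
pred = pnn_predict(X, y, np.array([[0.1, -0.1], [2.9, 3.2]]))
```

A PNN of this form has no iterative training phase, which is consistent with the robustness the authors report: the only free parameter is the kernel width.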

245 citations


Journal ArticleDOI
TL;DR: It is concluded that automatic ICA-based denoising offers a potentially useful approach to improve the quality of fMRI data and consequently increase the accuracy of the statistical analysis of these data.

221 citations


Journal ArticleDOI
TL;DR: A comparative study of widely used ICA algorithms in the BCI community, conducted on simulated electroencephalography (EEG) data, shows that an appropriate selection of an ICA algorithm may significantly improve the capabilities of BCI systems.
Abstract: Several studies dealing with independent component analysis (ICA)-based brain-computer interface (BCI) systems have been reported. Most of them have only explored a limited number of ICA methods, mainly FastICA and INFOMAX. The aim of this article is to help the BCI community researchers, especially those who are not familiar with ICA techniques, to choose an appropriate ICA method. For this purpose, the concept of ICA is reviewed and different measures of statistical independence are reported. Then, the application of these measures is illustrated through a brief description of the widely used algorithms in the ICA community, namely SOBI, COM2, JADE, ICAR, FastICA, and INFOMAX. The implementation of these techniques in the BCI field is also explained. Finally, a comparative study of these algorithms, conducted on simulated electroencephalography (EEG) data, shows that an appropriate selection of an ICA algorithm may significantly improve the capabilities of BCI systems.
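As a concrete reference point for readers new to these algorithms, here is a minimal symmetric FastICA with the kurtosis (cube) nonlinearity written from scratch in NumPy. This is a teaching sketch, not the implementation benchmarked in the paper; the iteration count and seed are arbitrary choices:

```python
import numpy as np

def fastica(X, n_iter=200, seed=0):
    """Minimal symmetric FastICA (cube nonlinearity).
    X: (n_sources, n_samples) mixtures. Returns the unmixing matrix W
    such that W @ X approximates the sources (up to order/sign/scale)."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))                    # whitening transform
    K = E @ np.diag(d ** -0.5) @ E.T
    Z = K @ X
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[0], X.shape[0]))
    for _ in range(n_iter):
        W = ((W @ Z) ** 3) @ Z.T / Z.shape[1] - 3 * W   # kurtosis fixed point
        U, _, Vt = np.linalg.svd(W)                     # symmetric decorrelation
        W = U @ Vt
    return W @ K

# Usage: separate two non-Gaussian sources from a random mixture.
rng = np.random.default_rng(1)
S = np.vstack([rng.uniform(-1, 1, 5000), rng.laplace(size=5000)])
A = rng.standard_normal((2, 2))
Y = fastica(A @ S) @ (A @ S)
```

The cube nonlinearity handles both sub-Gaussian (uniform) and super-Gaussian (Laplace) sources, which is why it serves as a reasonable default in comparisons like the one above.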

217 citations


Journal ArticleDOI
TL;DR: In this article, a component separation challenge has been organized, based on a set of realistically complex simulations of sky emission, and several methods including those based on internal template subtraction, maximum entropy method, parametric method, spatial and harmonic cross correlation methods, and independent component analysis have been tested.
Abstract: The Planck satellite will map the full sky at nine frequencies from 30 to 857 GHz. The CMB intensity and polarization that are its prime targets are contaminated by foreground emission. The goal of this paper is to compare proposed methods for separating CMB from foregrounds based on their different spectral and spatial characteristics, and to separate the foregrounds into components of different physical origin. A component separation challenge has been organized, based on a set of realistically complex simulations of sky emission. Several methods including those based on internal template subtraction, maximum entropy method, parametric method, spatial and harmonic cross correlation methods, and independent component analysis have been tested. Different methods proved to be effective in cleaning the CMB maps from foreground contamination, in reconstructing maps of diffuse Galactic emissions, and in detecting point sources and thermal Sunyaev-Zeldovich signals. The power spectrum of the residuals is, on the largest scales, four orders of magnitude lower than that of the input Galaxy power spectrum at the foreground minimum. The CMB power spectrum was accurately recovered up to the sixth acoustic peak. The point source detection limit reaches 100 mJy, and about 2300 clusters are detected via the thermal SZ effect on two thirds of the sky. We have found that no single method performs best for all scientific objectives. We foresee that the final component separation pipeline for Planck will involve a combination of methods and iterations between processing steps targeted at different objectives such as diffuse component separation, spectral estimation and compact source extraction.

211 citations


Journal ArticleDOI
TL;DR: A method for removing unwanted components of biological origin from neurophysiological recordings such as magnetoencephalography, electroencephalography, or multichannel electrophysiological or optical recordings, using spatial filters synthesized via a blind source separation method known as denoising source separation (DSS).

208 citations


Journal ArticleDOI
TL;DR: Conditions under which the mixing matrix is unique are presented and several algorithms for its computation are discussed, including a generalization to underdetermined mixtures of the well-known SOBI algorithm.
Abstract: In this paper, we study simultaneous matrix diagonalization-based techniques for the estimation of the mixing matrix in underdetermined independent component analysis (ICA). This includes a generalization to underdetermined mixtures of the well-known SOBI algorithm. The problem is reformulated in terms of the parallel factor decomposition (PARAFAC) of a higher-order tensor. We present conditions under which the mixing matrix is unique and discuss several algorithms for its computation.
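A single-lag simplification of the SOBI idea (essentially the AMUSE algorithm, and for the determined case rather than the paper's underdetermined setting) can be sketched as follows: whitening followed by eigendecomposition of one symmetrized time-lagged covariance performs the diagonalization that SOBI carries out jointly over many lags:

```python
import numpy as np

def amuse(X, tau=1):
    """One-lag second-order separation: whiten, then diagonalize the
    symmetrized lag-tau covariance. Works when the sources have distinct
    autocorrelations at lag tau. Returns the unmixing matrix W."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))
    K = E @ np.diag(d ** -0.5) @ E.T                  # whitening
    Z = K @ X
    C = Z[:, tau:] @ Z[:, :-tau].T / (Z.shape[1] - tau)
    C = (C + C.T) / 2                                 # symmetrize lagged covariance
    _, V = np.linalg.eigh(C)
    return V.T @ K

# Usage: two sinusoids with distinct lag-1 autocorrelations.
t = np.arange(10000)
S = np.vstack([np.sin(2 * np.pi * 0.01 * t), np.sin(2 * np.pi * 0.13 * t)])
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S
S_hat = amuse(X) @ X
```

The single-lag version fails when two sources share the same autocorrelation at the chosen lag; joint diagonalization over several lags, as in SOBI proper, is the standard remedy.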

201 citations


Proceedings ArticleDOI
12 May 2008
TL;DR: An asymptotic Newton algorithm is derived for quasi-maximum likelihood estimation of the ICA mixture model, using the ordinary gradient and Hessian, which yields an algorithm that can accommodate non-stationary environments and arbitrary source densities.
Abstract: We derive an asymptotic Newton algorithm for quasi-maximum likelihood estimation of the ICA mixture model, using the ordinary gradient and Hessian. The probabilistic mixture framework yields an algorithm that can accommodate non-stationary environments and arbitrary source densities. We prove asymptotic stability when the source models match the true sources. An example application to EEG segmentation is given.

200 citations


Journal ArticleDOI
TL;DR: In this paper, anisotropic diffusion kernels on observable data manifolds are used to approximate a Laplacian on the inaccessible independent variable domain, using the metric distortion induced by the Jacobian of the unknown mapping from variables to data.

183 citations


Journal ArticleDOI
Nojun Kwak
TL;DR: The experimental results show that the proposed method performs well for face recognition problems, compared with conventional methods such as the principal component analysis (PCA), Fisher's linear discriminant (FLD), etc.

169 citations


Journal ArticleDOI
TL;DR: The current results provided evidence that the brain areas within the two anti-correlated networks are highly integrated at both the intra- and inter-regional levels.

158 citations


Journal ArticleDOI
TL;DR: A combination of spatial ICA and spectral Bayesian positive source separation (BPSS) with a rough classification of pixels is proposed, which allows selection of a small but relevant number of pixels for component extraction and, consequently, endmember classification.

Journal ArticleDOI
TL;DR: This work generated group DM maps and showed that the overall ICA-DM connectivity is negatively correlated with age, and considered different strategies for combining ICA results from individual-level and population-level analyses and used them to evaluate and predict the effect of aging on the DM component.

Journal ArticleDOI
TL;DR: This review begins by placing the BSS linear instantaneous model of EEG within the framework of brain volume conduction theory, and considers the fitness of SOS-based and HOS-based methods for the extraction of spontaneous and induced EEG and their separation from extra-cranial artifacts.

Journal ArticleDOI
TL;DR: This study presents an independent vector analysis (IVA) method to address the permutation problem during fMRI group data analysis and shows that IVA effectively inferred group-activation patterns of unknown origins without the requirement for a pre-processing stage.

Journal ArticleDOI
TL;DR: In this paper, a second-order statistical method employed in blind source separation (BSS) is adapted for use in modal parameter identification, and a class of new non-parametric output-only modal identification algorithms is proposed and examples of its use are provided.

Journal ArticleDOI
TL;DR: A novel statistical approach for automatic vehicle detection based on local features located within three significant subregions of the image; computational costs are reduced by eliminating the requirement for an ICA residual image reconstruction process and by computing the likelihood probability using a weighted Gaussian mixture model.
Abstract: This paper develops a novel statistical approach for automatic vehicle detection based on local features that are located within three significant subregions of the image. In the detection process, each subregion is projected onto its associated eigenspace and independent basis space to generate a principal components analysis (PCA) weight vector and an independent component analysis (ICA) coefficient vector, respectively. A likelihood evaluation process is then performed based on the estimated joint probability of the projection weight vectors and the coefficient vectors of the subregions with position information. The use of subregion position information minimizes the risk of false acceptances, whereas the use of PCA to model the low-frequency components of the eigenspace and ICA to model the high-frequency components of the residual space improves the tolerance of the detection process toward variations in the illumination conditions and vehicle pose. The use of local features not only renders the system more robust toward partial occlusions but also reduces the computational overhead. The computational costs are further reduced by eliminating the requirement for an ICA residual image reconstruction process and by computing the likelihood probability using a weighted Gaussian mixture model, whose parameters and weights are iteratively estimated using an expectation-maximization algorithm.
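The final step above relies on a weighted Gaussian mixture whose parameters are fitted by expectation-maximization. A minimal 1-D version of that EM loop, on invented toy data rather than the paper's likelihood model, looks like this:

```python
import numpy as np

def em_gmm_1d(x, k=2, n_iter=100):
    """EM for a 1-D Gaussian mixture: returns (weights, means, stds).
    Means are initialized deterministically from data quantiles."""
    mu = np.quantile(x, np.linspace(0.25, 0.75, k))
    w, sd = np.full(k, 1 / k), np.full(k, x.std())
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point
        pdf = w * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, standard deviations
        n = r.sum(axis=0)
        w, mu = n / len(x), (r * x[:, None]).sum(axis=0) / n
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n)
    return w, mu, sd

# Usage: recover two well-separated components.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 0.5, 400), rng.normal(3, 0.8, 600)])
w, mu, sd = em_gmm_1d(x)
```

The quantile initialization is a convenience for the toy example; practical systems, including the one described above, typically restart EM or initialize from clustering.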

Journal ArticleDOI
TL;DR: The proposed framework, referred to herein as convex analysis of mixtures of non-negative sources (CAMNS), is deterministic, requiring no source independence assumption, the entrenched premise in many existing BSS frameworks.
Abstract: This paper presents a new framework for blind source separation (BSS) of non-negative source signals. The proposed framework, referred to herein as convex analysis of mixtures of non-negative sources (CAMNS), is deterministic, requiring no source independence assumption, the entrenched premise in many existing (usually statistical) BSS frameworks. The development is based on a special assumption called local dominance. It is a good assumption for source signals exhibiting sparsity or high contrast, and thus is considered realistic for many real-world problems such as multichannel biomedical imaging. Under local dominance and several standard assumptions, we apply convex analysis to establish a new BSS criterion, which states that the source signals can be perfectly identified (in a blind fashion) by finding the extreme points of an observation-constructed polyhedral set. Methods for fulfilling the CAMNS criterion are also derived, using either linear programming or simplex geometry. Simulation results on several data sets are presented to demonstrate the efficacy of the proposed method over several other reported BSS methods.

Journal ArticleDOI
TL;DR: This paper derives both a gradient-descent and a quasi-Newton algorithm that use the full second-order statistics providing superior performance with circular and noncircular sources as compared to existing methods.
Abstract: In this paper, we use complex analytic functions to achieve independent component analysis (ICA) by maximization of non-Gaussianity and introduce the complex maximization of non-Gaussianity (CMN) algorithm. We derive both a gradient-descent and a quasi-Newton algorithm that use the full second-order statistics, providing superior performance with circular and noncircular sources as compared to existing methods. We show the connection among ICA methods through maximization of non-Gaussianity, mutual information, and maximum likelihood (ML) for the complex case, and emphasize the importance of density matching for all three cases. Local stability conditions are derived for the CMN cost function that explicitly show the effects of noncircularity on convergence, which are demonstrated through simulation examples.

Journal ArticleDOI
TL;DR: An empirical criterion for the selection of ICA algorithms in signal processing for analytical chemistry is given, and the preprocessing methods for ICA applications and the robustness of different ICA algorithms are reviewed.
Abstract: Independent component analysis (ICA) is a statistical method the goal of which is to find a linear representation of non-Gaussian data so that the components are statistically independent, or as independent as possible. In an ICA procedure, the estimated independent components (ICs) are identical to or highly correlated to the spectral profiles of the chemical components in mixtures under certain circumstances, so the latent variables obtained are chemically interpretable and useful for qualitative analysis of mixtures without prior information about the sources or reference materials, and the calculated demixing matrix is useful for simultaneous determination of polycomponents in mixtures. We review commonly used ICA algorithms and recent ICA applications in signal processing for qualitative and quantitative analysis. Furthermore, we also review the preprocessing method for ICA applications and the robustness of different ICA algorithms, and we give the empirical criterion for selection of ICA algorithms in signal processing for analytical chemistry.

Journal ArticleDOI
TL;DR: This study extends the work of Bingham and Hyvarinen to the more general case of noncircular sources by deriving a new fixed-point algorithm that uses the information in the pseudo-covariance matrix to provide significant improvement in performance when confronted with noncircular sources.
Abstract: The complex fast independent component analysis (c-FastICA) algorithm is one of the most ubiquitous methods for solving the ICA problems with complex-valued data. In this study, we extend the work of Bingham and Hyvarinen to the more general case of noncircular sources by deriving a new fixed-point algorithm that uses the information in the pseudo-covariance matrix. This modification provides significant improvement in performance when confronted with noncircular sources, specifically with sub-Gaussian noncircular signals such as binary phase-shift keying (BPSK) signals, where c-FastICA fails to achieve separation. We also present a rigorous local stability analysis that we use to quantify the effects of noncircularity on performance. Simulations are presented to demonstrate the effectiveness of our method.

Journal ArticleDOI
TL;DR: Independent component analysis (ICA) was implemented as a group-level ICA that extracts a single set of components from the data and directly allows for population inferences about consistently expressed function-relevant spatiotemporal responses.

15 Sep 2008
TL;DR: This paper proposes a new method of approximating mutual information based on maximum likelihood estimation of a density ratio function, called Maximum Likelihood Mutual Information (MLMI), which has several attractive properties, e.g., density estimation is not involved, it is a single-shot procedure, the global optimal solution can be efficiently computed, and cross-validation is available for model selection.
Abstract: Mutual information is useful in various data processing tasks such as feature selection or independent component analysis. In this paper, we propose a new method of approximating mutual information based on maximum likelihood estimation of a density ratio function. Our method, called Maximum Likelihood Mutual Information (MLMI), has several attractive properties, e.g., density estimation is not involved, it is a single-shot procedure, the global optimal solution can be efficiently computed, and cross-validation is available for model selection. Numerical experiments show that MLMI compares favorably with existing methods.
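For contrast with MLMI, here is the naive histogram plug-in estimator of mutual information that density-ratio approaches like the one above are designed to improve upon. The bin count is an arbitrary assumption, and this estimator is known to be biased, which is exactly the point:

```python
import numpy as np

def mi_hist(x, y, bins=20):
    """Plug-in mutual information estimate (in nats) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                          # avoid log(0) on empty cells
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.standard_normal(20000)
mi_dep = mi_hist(x, x + 0.3 * rng.standard_normal(20000))   # dependent pair
mi_ind = mi_hist(x, rng.standard_normal(20000))             # independent pair
```

The plug-in route requires explicit density estimation and a binning choice; MLMI's appeal, as the abstract notes, is that it models the density ratio p(x,y)/(p(x)p(y)) directly and sidesteps both.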

Journal ArticleDOI
TL;DR: By projecting the measured signal into a decorrelated signal space, positioning accuracy is improved, since the cross correlation between APs is reduced; experimental results show that the size of the training set can be greatly reduced in the decorrelated space.
Abstract: We present a novel approach to the problem of indoor localization in wireless environments. The main contribution of this paper is fourfold: 1) we show that by projecting the measured signal into a decorrelated signal space, the positioning accuracy is improved, since the cross correlation between APs is reduced, 2) we demonstrate that this novel approach achieves a more efficient information compaction and provides a better scheme to reduce online computation (the drawback of AP selection techniques is overcome, since we reduce the dimensionality by combining features, and each component in the decorrelated space is a linear combination of all APs; therefore, a more efficient mechanism is provided to utilize information of all APs while reducing the computational complexity), 3) experimental results show that the size of training samples can be greatly reduced in the decorrelated space; that is, fewer human efforts are required for developing the system, and 4) we carry out comparisons between RSS and three classical decorrelated spaces, including Discrete Cosine Transform (DCT), Principal Component Analysis (PCA), and Independent Component Analysis (ICA) in this paper. Two AP selection criteria proposed in the literature, MaxMean and InfoGain, are also compared. Testing on a realistic WLAN environment, we find that PCA achieves the best performance on the location fingerprinting task.
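The decorrelated-space fingerprinting idea can be sketched with plain PCA and nearest-neighbor matching. The synthetic RSS model below (log-distance path loss plus Gaussian noise) and the 5 x 5 grid are invented stand-ins for real WLAN measurements:

```python
import numpy as np

rng = np.random.default_rng(0)
locations = np.array([[i, j] for i in range(5) for j in range(5)], float)
aps = np.array([[0, 0], [4, 0], [0, 4], [4, 4], [2, 2]], float)

def rss(p):
    """Toy path-loss model: RSS (dB) falls off with distance to each AP."""
    return -30 - 20 * np.log10(1 + np.linalg.norm(p - aps, axis=1))

train = np.array([rss(p) for p in locations])      # fingerprint database

# Project fingerprints into the decorrelated PCA space (top-k components).
mean = train.mean(axis=0)
_, _, Vt = np.linalg.svd(train - mean, full_matrices=False)
k = 3
train_pc = (train - mean) @ Vt[:k].T

def localize(measurement):
    """Nearest-neighbor match in the decorrelated (PCA) space."""
    q = (measurement - mean) @ Vt[:k].T
    return locations[np.argmin(((train_pc - q) ** 2).sum(axis=1))]

est = localize(rss(np.array([2.0, 3.0])) + rng.normal(0, 0.5, 5))
```

Keeping k components of the 5-AP fingerprint is the dimensionality reduction the paper describes: every component combines all APs, so no per-AP selection step is needed.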

Journal ArticleDOI
TL;DR: In this article, the authors proposed probabilistic latent semantic analysis (pLSA) for non-negative decomposition and elucidation of interpretable component spectra and abundance maps.
Abstract: Imaging mass spectrometry (IMS) is a promising technology which allows for detailed analysis of spatial distributions of (bio)molecules in organic samples. In many current applications, IMS relies heavily on (semi)automated exploratory data analysis procedures to decompose the data into characteristic component spectra and corresponding abundance maps, visualizing spectral and spatial structure. The most commonly used techniques are principal component analysis (PCA) and independent component analysis (ICA). Both methods operate in an unsupervised manner. However, their decomposition estimates usually feature negative counts and are not amenable to direct physical interpretation. We propose probabilistic latent semantic analysis (pLSA) for non-negative decomposition and the elucidation of interpretable component spectra and abundance maps. We compare this algorithm to PCA, ICA, and non-negative PARAFAC (parallel factors analysis) and show on simulated and real-world data that pLSA and non-negative PARAFAC are superior to PCA or ICA in terms of complementarity of the resulting components and reconstruction accuracy. We further combine pLSA decomposition with a statistical complexity estimation scheme based on the Akaike information criterion (AIC) to automatically estimate the number of components present in a tissue sample data set and show that this results in sensible complexity estimates.
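A compact way to reproduce the non-negative decomposition idea is multiplicative-update NMF with the KL (I-divergence) objective, which is equivalent to pLSA's EM up to normalization. The rank, iteration count, and random data below are arbitrary choices for illustration:

```python
import numpy as np

def nmf_kl(V, k=2, n_iter=300, seed=0):
    """Lee-Seung multiplicative updates for KL-divergence NMF: V ~ W @ H
    with all factors non-negative (pLSA-equivalent up to normalization)."""
    rng = np.random.default_rng(seed)
    W = rng.random((V.shape[0], k)) + 0.1
    H = rng.random((k, V.shape[1])) + 0.1
    for _ in range(n_iter):
        WH = W @ H + 1e-12
        W *= ((V / WH) @ H.T) / H.sum(axis=1)          # update "spectra" factor
        WH = W @ H + 1e-12
        H *= (W.T @ (V / WH)) / W.sum(axis=0)[:, None]  # update "abundance" factor
    return W, H

# Usage: recover an exactly rank-2 non-negative structure.
rng = np.random.default_rng(1)
V = rng.random((40, 2)) @ rng.random((2, 60))           # spectra x abundances
W, H = nmf_kl(V, k=2)
err = np.abs(V - W @ H).mean() / V.mean()
```

Because both factors stay non-negative by construction, the components admit the direct physical reading (component spectra and abundance maps) that PCA and ICA decompositions lack.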

Journal ArticleDOI
TL;DR: A framework based on Wirtinger calculus for nonlinear complex-valued signal processing such that all computations can be directly carried out in the complex domain is introduced.
Abstract: We introduce a framework based on Wirtinger calculus for nonlinear complex-valued signal processing such that all computations can be directly carried out in the complex domain. The two main approaches for performing independent component analysis, maximum likelihood and maximization of non-Gaussianity, which are intimately related to each other, are studied using this framework. The main update rules for the two approaches are derived, and their properties and density matching strategies are discussed along with numerical examples to highlight their relationships.

Journal ArticleDOI
TL;DR: The monitoring of multivariate systems that exhibit non-Gaussian behavior is addressed and the use of principal component analysis (PCA) is proposed to capture the Gaussian and non- Gaussian source signals.
Abstract: The monitoring of multivariate systems that exhibit non-Gaussian behavior is addressed. Existing work advocates the use of independent component analysis (ICA) to extract the underlying non-Gaussian data structure. Since some of the source signals may be Gaussian, the use of principal component analysis (PCA) is proposed to capture the Gaussian and non-Gaussian source signals. A subsequent application of ICA then allows the extraction of non-Gaussian components from the retained principal components (PCs). A further contribution is the utilization of a support vector data description to determine a confidence limit for the non-Gaussian components. Finally, a statistical test is developed for determining how many non-Gaussian components are encapsulated within the retained PCs, and associated monitoring statistics are defined. The utility of the proposed scheme is demonstrated by a simulation example, and the analysis of recorded data from an industrial melter. © 2008 American Institute of Chemical Engineers AIChE J, 2008
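A minimal version of the "how many non-Gaussian components" question can be sketched by screening retained principal components with excess kurtosis. This is a crude stand-in for the paper's support-vector-based test, and the toy data observes the sources directly so that the PCs align with them:

```python
import numpy as np

def nongaussian_pc_count(X, n_keep, kurt_thresh=0.5):
    """PCA-project X (samples x variables), keep n_keep PCs, and count
    those whose absolute excess kurtosis exceeds a threshold."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    T = Xc @ Vt[:n_keep].T                       # retained PC scores
    Tn = (T - T.mean(axis=0)) / T.std(axis=0)
    kurt = (Tn ** 4).mean(axis=0) - 3.0          # excess kurtosis per PC
    return int((np.abs(kurt) > kurt_thresh).sum())

rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(-1, 1, 8000),    # non-Gaussian (sub-Gaussian)
                     rng.laplace(size=8000),      # non-Gaussian (super-Gaussian)
                     rng.standard_normal(8000)])  # Gaussian
count = nongaussian_pc_count(X, n_keep=3)
```

Kurtosis screening only probes one moment; the paper's monitoring statistics and confidence limits are considerably more robust, but the counting logic is the same.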

Journal ArticleDOI
TL;DR: A novel method for underdetermined blind source separation using an instantaneous mixing model which assumes closely spaced microphones is proposed and is applicable to segregate speech signals under reverberant conditions and is compared to another state-of-the-art algorithm.
Abstract: Separation of speech mixtures, often referred to as the cocktail party problem, has been studied for decades. In many source separation tasks, the separation method is limited by the assumption of at least as many sensors as sources. Further, many methods require that the number of signals within the recorded mixtures be known in advance. In many real-world applications, these limitations are too restrictive. We propose a novel method for underdetermined blind source separation using an instantaneous mixing model which assumes closely spaced microphones. Two source separation techniques have been combined, independent component analysis (ICA) and binary time-frequency (T-F) masking. By estimating binary masks from the outputs of an ICA algorithm, it is possible in an iterative way to extract basis speech signals from a convolutive mixture. The basis signals are afterwards improved by grouping similar signals. Using two microphones, we can separate, in principle, an arbitrary number of mixed speech signals. We show separation results for mixtures with as many as seven speech signals under instantaneous conditions. We also show that the proposed method is applicable to segregate speech signals under reverberant conditions, and we compare our proposed method to another state-of-the-art algorithm. The number of source signals is not assumed to be known in advance and it is possible to maintain the extracted signals as stereo signals.
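The binary masking half of the method can be illustrated in its simplest form: a single oracle frequency-domain mask separating two band-disjoint signals from a one-channel mixture. The real method estimates time-frequency masks from the outputs of an ICA stage rather than assuming the bands are known:

```python
import numpy as np

n = 4096
t = np.arange(n)
s1 = np.sin(2 * np.pi * 40 * t / n)               # low-frequency source
s2 = np.sign(np.sin(2 * np.pi * 500 * t / n))     # square wave, energy in high bins
x = s1 + s2                                       # single-channel mixture

X = np.fft.rfft(x)
freqs = np.fft.rfftfreq(n)
mask = freqs < 0.06                               # binary mask: low band -> s1
s1_hat = np.fft.irfft(X * mask, n)
corr = np.corrcoef(s1, s1_hat)[0, 1]
```

Speech sources are of course not band-disjoint, but they are approximately disjoint in the time-frequency plane, which is what makes binary T-F masking effective there.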

Journal ArticleDOI
TL;DR: A novel way of performing real-valued optimization in the complex domain is introduced, enabling a direct complex optimization technique when the cost function satisfies Brandwood's independent analyticity condition; results indicate that the fixed-point and gradient versions are superior to other similar algorithms when the sources include both circular and noncircular distributions and the dimension is relatively high.
Abstract: In this paper, we introduce a novel way of performing real-valued optimization in the complex domain. This framework enables a direct complex optimization technique when the cost function satisfies Brandwood's independent analyticity condition. In particular, this technique has been used to derive three algorithms, namely, kurtosis maximization using gradient update (KM-G), kurtosis maximization using fixed-point update (KM-F), and kurtosis maximization using Newton update (KM-N), to perform complex independent component analysis (ICA) based on the maximization of the complex kurtosis cost function. The derivation and related analysis of the three algorithms are performed in the complex domain without using any complex-real mapping for differentiation and optimization. A general complex Newton rule is also derived for developing the KM-N algorithm. The real conjugate gradient algorithm is extended to the complex domain similarly to the derivation of the complex Newton rule. The simulation results indicate that the fixed-point version (KM-F) and gradient version (KM-G) are superior to other similar algorithms when the sources include both circular and noncircular distributions and the dimension is relatively high.

Journal ArticleDOI
TL;DR: RAICAR (Ranking and Averaging Independent Component Analysis by Reproducibility) is introduced to address ranking and initialization issues for spatial ICA applied to fMRI; it relies on the reproducibility between repeated ICA realizations to rank and select components.
Abstract: Independent component analysis (ICA) is a data-driven approach that has exhibited great utility for functional magnetic resonance imaging (fMRI). Standard ICA implementations, however, do not provide the number and relative importance of the resulting components. In addition, ICA algorithms utilizing gradient-based optimization give decompositions that are dependent on initialization values, which can lead to dramatically different results. In this work, a new method, RAICAR (Ranking and Averaging Independent Component Analysis by Reproducibility), is introduced to address these issues for spatial ICA applied to fMRI. RAICAR utilizes repeated ICA realizations and relies on the reproducibility between them to rank and select components. Different realizations are aligned based on correlations, leading to aligned components. Each component is ranked and thresholded based on between-realization correlations. Furthermore, different realizations of each aligned component are selectively averaged to generate the final estimate of the given component. Reliability and accuracy of this method are demonstrated with both simulated and experimental fMRI data.
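The alignment-and-averaging step can be sketched independently of any particular ICA implementation: given several realizations whose components differ only by order and sign, match by maximal absolute correlation, fix signs, and average. This is a greedy matcher for illustration; the full RAICAR procedure also ranks and thresholds components by between-realization correlation:

```python
import numpy as np

def align_and_average(runs):
    """Align each ICA realization to the first by max |correlation|,
    fix signs, and average the aligned components row-wise."""
    ref = runs[0]
    m = ref.shape[0]
    aligned = [ref]
    for R in runs[1:]:
        C = np.corrcoef(ref, R)[:m, m:]          # ref rows vs R rows
        order = np.abs(C).argmax(axis=1)         # greedy match per ref row
        signs = np.sign(C[np.arange(m), order])  # resolve sign ambiguity
        aligned.append(R[order] * signs[:, None])
    return np.mean(aligned, axis=0)

# Usage: second "realization" is a permuted, sign-flipped, noisy copy.
rng = np.random.default_rng(0)
ref = rng.standard_normal((3, 500))
run2 = -ref[[2, 0, 1]] + 0.1 * rng.standard_normal((3, 500))
avg = align_and_average([ref, run2])
```

The greedy per-row argmax can in principle assign two reference components to the same realization row; a full implementation would use one-to-one matching.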

Journal ArticleDOI
Tõnu Kollo
TL;DR: In this paper, the skewness measure is defined as a p-vector while the kurtosis is characterized by a p × p matrix, which is an extension of the corresponding measures of Mardia.