Journal ArticleDOI

The nonlinear PCA learning rule in independent component analysis

Erkki Oja
30 Sep 1997 · Vol. 17, Iss. 1, pp. 25-45
TL;DR: It has been verified experimentally that when nonlinear Principal Component Analysis (PCA) learning rules are used for the weights of a neural layer, the neurons have signal separation capabilities and can be used for image and speech signal separation.
About
This article was published in Neurocomputing on 1997-09-30 and has received 237 citations to date. It focuses on the topics: Learning rule & Sparse PCA.
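For context, the nonlinear PCA learning rule studied in the article is commonly written, for whitened input z and output y = Wz, as W ← W + η g(y)(z − Wᵀg(y))ᵀ, with a nonlinearity such as tanh. Below is a minimal sketch of one pass of such a rule; the mixing matrix, step size, initialization, and choice of g are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent sub-Gaussian (uniform) unit-variance sources, linearly mixed.
n = 5000
S = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, n))
A = np.array([[1.0, 0.6], [0.4, 1.0]])  # illustrative mixing matrix
X = A @ S

# Whiten the mixtures: zero mean, identity covariance.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# Nonlinear PCA learning rule: W += eta * g(y) (z - W^T g(y))^T, with y = W z.
W = np.eye(2) + 0.1 * rng.standard_normal((2, 2))
eta = 0.01
g = np.tanh
for t in range(n):
    z = Z[:, t]
    y = W @ z
    gy = g(y)
    W += eta * np.outer(gy, z - W.T @ gy)

Y = W @ Z  # separated outputs, up to permutation, sign, and scale
```

The nonlinearity makes the rule sensitive to higher-order statistics, which is what gives the otherwise PCA-like update its separation ability.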


Citations

Statistical pattern recognition: a review

TL;DR: The objective of this review paper is to summarize and compare some of the well-known methods used in various stages of a pattern recognition system and identify research topics and applications which are at the forefront of this exciting and challenging field.

Fast and robust fixed-point algorithms for independent component analysis

TL;DR: Using maximum entropy approximations of differential entropy, a family of new contrast (objective) functions for ICA is introduced; these enable both estimation of the whole decomposition by minimizing mutual information and estimation of individual independent components as projection pursuit directions.
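The fixed-point algorithm referred to here is FastICA, whose one-unit update for whitened data z and nonlinearity g is w ← E[z g(wᵀz)] − E[g′(wᵀz)] w, followed by normalization. A minimal sketch, assuming tanh as the contrast nonlinearity and an illustrative mixing matrix:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two independent uniform unit-variance sources, mixed and whitened.
n = 5000
S = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, n))
A = np.array([[1.0, 0.5], [0.3, 1.0]])
X = A @ S
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# One-unit FastICA fixed-point iteration with g = tanh.
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(100):
    w_old = w
    wz = w @ Z
    # w <- E[z g(w'z)] - E[g'(w'z)] w, then renormalize.
    w = (Z * np.tanh(wz)).mean(axis=1) - (1 - np.tanh(wz) ** 2).mean() * w
    w /= np.linalg.norm(w)
    if abs(w @ w_old) > 1 - 1e-9:  # converged (direction fixed up to sign)
        break

y = w @ Z  # one estimated independent component
```

Further components can be obtained by repeating the iteration with deflationary orthogonalization against already-found vectors.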

Independent component analysis using an extended infomax algorithm for mixed subgaussian and supergaussian sources

TL;DR: An extension of the infomax algorithm of Bell and Sejnowski (1995) is presented that can blindly separate mixed signals with sub- and supergaussian source distributions, and is effective at separating artifacts such as eye blinks and line noise from the weaker electrical signals that arise from sources in the brain.

Regularization Networks and Support Vector Machines

TL;DR: Regularization Networks and Support Vector Machines are both reviewed in the context of Vapnik's theory of statistical learning, which provides a general foundation for the learning problem by combining functional analysis and statistics.

Survey on Independent Component Analysis

TL;DR: This paper surveys the existing theory and methods for independent component analysis (ICA), in which the desired representation is the one that minimizes the statistical dependence of the components of the representation.
References

Independent component analysis, a new concept?

Pierre Comon
- 01 Apr 1994 - 
TL;DR: An efficient algorithm is proposed that allows the computation of the ICA of a data matrix in polynomial time; ICA may actually be seen as an extension of principal component analysis (PCA).

Blind separation of sources, Part 1: an adaptive algorithm based on neuromimetic architecture

TL;DR: A new concept, that of INdependent Component Analysis (INCA), more powerful than classical Principal Component Analysis in decision tasks, emerges from this work.

Equivariant adaptive source separation

TL;DR: A class of adaptive algorithms for source separation that implements an adaptive version of equivariant estimation and is henceforth called EASI, which yields algorithms with a simple structure for both real and complex mixtures.
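The EASI update is serial and equivariant: with output y = Bx, the separating matrix is adapted as B ← B + λ(I − yyᵀ − g(y)yᵀ + y g(y)ᵀ)B. A minimal sketch; the step size, nonlinearity (whose suitability depends on the sources' kurtosis signs), and mixing matrix are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two independent unit-variance sources, linearly mixed.
n = 5000
S = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, n))
A = np.array([[1.0, 0.7], [0.2, 1.0]])
X = A @ S

lam = 0.005   # step size (assumption)
g = np.tanh   # nonlinearity (assumption)
B = np.eye(2)
I = np.eye(2)
for t in range(n):
    y = B @ X[:, t]
    gy = g(y)
    # Equivariant serial update: B += lam * (I - y y' - g(y) y' + y g(y)') B
    B += lam * (I - np.outer(y, y) - np.outer(gy, y) + np.outer(y, gy)) @ B

Y = B @ X  # adapted outputs
```

The symmetric part of the update (I − yyᵀ) drives the outputs toward unit covariance, while the skew-symmetric part (y g(y)ᵀ − g(y)yᵀ) rotates toward independence; multiplying the whole bracket by B is what makes the algorithm's behavior independent of the mixing matrix (equivariance).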

Signal processing with higher-order spectra

TL;DR: The strengths and limitations of correlation-based signal processing methods are discussed, along with the applications of higher-order spectra in signal processing, with emphasis on the bispectrum and trispectrum.

Original Contribution: Principal components, minor components, and linear neural networks

TL;DR: The Stochastic Gradient Ascent (SGA) neural network is proposed and shown to be closely related to the Generalized Hebbian Algorithm (GHA); SGA behaves better for extracting the less dominant eigenvectors.