Journal ArticleDOI

Mixtures of probabilistic principal component analyzers

Michael E. Tipping, Christopher M. Bishop
01 Feb 1999 - Neural Computation, Vol. 11, Iss. 2, pp. 443-482
TLDR
PCA is formulated within a maximum likelihood framework, based on a specific form of gaussian latent variable model, which leads to a well-defined mixture model for probabilistic principal component analyzers, whose parameters can be determined using an expectation-maximization algorithm.
Abstract
Principal component analysis (PCA) is one of the most popular techniques for processing, compressing, and visualizing data, although its effectiveness is limited by its global linearity. While nonlinear variants of PCA have been proposed, an alternative paradigm is to capture data complexity by a combination of local linear PCA projections. However, conventional PCA does not correspond to a probability density, and so there is no unique way to combine PCA models. Therefore, previous attempts to formulate mixture models for PCA have been ad hoc to some extent. In this article, PCA is formulated within a maximum likelihood framework, based on a specific form of gaussian latent variable model. This leads to a well-defined mixture model for probabilistic principal component analyzers, whose parameters can be determined using an expectation-maximization algorithm. We discuss the advantages of this model in the context of clustering, density modeling, and local dimensionality reduction, and we demonstrate its application to image compression and handwritten digit recognition.
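As a concrete illustration of the framework described above: for a single probabilistic PCA model, the paper derives a closed-form maximum-likelihood solution in which the noise variance is the average of the discarded eigenvalues of the sample covariance and the loading matrix is built from the leading eigenvectors. A minimal numpy sketch of that single-model fit (the function name and interface are illustrative, not from the paper):

```python
import numpy as np

def ppca_ml(X, q):
    """Closed-form maximum-likelihood fit of one probabilistic PCA model.

    X: (n, d) data matrix; q: latent dimensionality (q < d).
    Returns the mean, the (d, q) loading matrix W, and the noise variance.
    """
    mu = X.mean(axis=0)
    S = np.cov(X, rowvar=False)                      # (d, d) sample covariance
    eigval, eigvec = np.linalg.eigh(S)               # ascending eigenvalues
    eigval, eigvec = eigval[::-1], eigvec[:, ::-1]   # reorder to descending
    sigma2 = eigval[q:].mean()                       # average of discarded eigenvalues
    W = eigvec[:, :q] * np.sqrt(np.maximum(eigval[:q] - sigma2, 0.0))
    return mu, W, sigma2
```

The mixture model of the paper maintains one mean, loading matrix, and noise variance per component and re-estimates responsibility-weighted versions of these quantities inside an EM loop.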



Citations

Pattern Recognition and Machine Learning

TL;DR: Probability distributions and linear models for regression and classification are covered, along with a discussion of combining models in the context of machine learning.
Book

Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems

Peter Dayan, L. F. Abbott
TL;DR: This text introduces the basic mathematical and computational methods of theoretical neuroscience and presents applications in a variety of areas including vision, sensory-motor integration, development, learning, and memory.
Journal ArticleDOI

Investigations into resting-state connectivity using independent component analysis

TL;DR: A probabilistic independent component analysis approach, optimized for the analysis of fMRI data, is reviewed, and it is demonstrated to be an effective and robust tool for identifying low-frequency resting-state patterns from data acquired at various spatial and temporal resolutions.
Journal ArticleDOI

Probabilistic independent component analysis for functional magnetic resonance imaging

TL;DR: An integrated approach to probabilistic independent component analysis for functional MRI (FMRI) data that allows for non-square mixing in the presence of Gaussian noise is presented, and its spatio-temporal accuracy is compared with that of classical ICA and GLM analyses.
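For orientation, the conventional pipeline this model generalizes (PCA reduction to handle non-square mixing, followed by classical noise-free ICA) can be sketched with scikit-learn; this is the baseline the paper improves upon, not the probabilistic method itself:

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

def classical_ica_baseline(X, n_sources):
    """PCA-reduce to n_sources dimensions, then unmix with classical FastICA.

    X: (n_samples, n_features) data matrix, e.g. time points x voxels for FMRI.
    """
    X_red = PCA(n_components=n_sources, whiten=True).fit_transform(X)
    return FastICA(n_components=n_sources, random_state=0).fit_transform(X_red)
```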
Journal ArticleDOI

Sparse Subspace Clustering: Algorithm, Theory, and Applications

TL;DR: In this article, a sparse subspace clustering algorithm is proposed to cluster high-dimensional data points that lie in a union of low-dimensional subspaces, where a sparse representation corresponds to selecting a few points from the same subspace.
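A compact sketch of the self-expressiveness idea this summary refers to, assuming scikit-learn: each point is regressed on all other points with an l1 penalty, and the sparse coefficients define an affinity for spectral clustering. The plain per-point lasso (no affine or outlier terms) is a simplification of the published algorithm:

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.cluster import SpectralClustering

def sparse_subspace_clustering(X, n_clusters, alpha=0.01):
    """X: (n, d) points drawn from a union of low-dimensional subspaces."""
    n = X.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        others = np.delete(np.arange(n), i)
        # Express point i as a sparse combination of the other points.
        coefs = Lasso(alpha=alpha, max_iter=5000).fit(X[others].T, X[i]).coef_
        C[i, others] = coefs
    affinity = np.abs(C) + np.abs(C).T        # symmetrize the sparse weights
    return SpectralClustering(n_clusters=n_clusters,
                              affinity="precomputed").fit_predict(affinity)
```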
References
Journal ArticleDOI

An approach to non-linear principal components analysis using radially symmetric kernel functions

TL;DR: An approach to non-linear principal components analysis using radially symmetric kernel basis functions is described; it can be related to the homogeneity analysis approach of Gifi through the minimization of a loss function.
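For a runnable point of comparison only: scikit-learn's KernelPCA performs nonlinear PCA via the kernel trick with a radial (RBF) kernel, which is a different construction from the radially symmetric kernel-function approach this reference describes:

```python
from sklearn.datasets import make_moons
from sklearn.decomposition import KernelPCA

X, _ = make_moons(n_samples=200, noise=0.05, random_state=0)
# Project onto two nonlinear components using a radial basis function kernel.
Z = KernelPCA(n_components=2, kernel="rbf", gamma=5.0).fit_transform(X)
```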
Proceedings Article

A Neural Network Autoassociator for Induction Motor Failure Prediction

TL;DR: It is demonstrated that the trained autoassociator has a small reconstruction error on measurements recorded from healthy motors but a larger error on those recorded from a motor with a fault.
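The recipe in this summary is reconstruction-error novelty detection: train a compressive model on healthy data only, then flag inputs it reconstructs poorly. A hedged sketch with a linear PCA "autoassociator" standing in for the neural network (function name and threshold choice are illustrative):

```python
import numpy as np
from sklearn.decomposition import PCA

def fault_scores(healthy, new, n_components=5):
    """Score new measurements by reconstruction error under a model
    fit only to measurements from healthy motors."""
    pca = PCA(n_components=n_components).fit(healthy)

    def recon_error(Z):
        return np.linalg.norm(Z - pca.inverse_transform(pca.transform(Z)), axis=1)

    threshold = np.percentile(recon_error(healthy), 99)  # tolerance from healthy data
    scores = recon_error(new)
    return scores, scores > threshold                    # large errors suggest a fault
```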

Local models and Gaussian mixture models for statistical data processing

TL;DR: Local models or Gaussian mixture models can be efficient tools for dimension reduction, exploratory data analysis, feature extraction, classification, and regression, and algorithms for regularizing them are proposed.
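One way to read "local models": each Gaussian component's covariance eigenvectors form a local PCA basis, and the responsibilities route points softly to their local model. A brief sketch, assuming scikit-learn (names are illustrative):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def local_pca_bases(X, n_components=3, q=2):
    """Fit a GMM and extract a top-q local principal basis per component."""
    gmm = GaussianMixture(n_components=n_components,
                          covariance_type="full").fit(X)
    bases = [np.linalg.eigh(cov)[1][:, -q:]    # leading eigenvectors per component
             for cov in gmm.covariances_]
    resp = gmm.predict_proba(X)                # soft assignments of points
    return gmm, bases, resp
```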
Journal ArticleDOI

Bayesian Analysis of Mixtures of Factor Analyzers

Akio Utsugi, +1 more
01 May 2001
TL;DR: For Bayesian inference on the mixture of factor analyzers, natural conjugate priors on the parameters are introduced, and a Gibbs sampler that generates parameter samples following the posterior is constructed; a deterministic variant of the sampler can be regarded as a maximum a posteriori estimation algorithm with hyperparameter search.
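The factor-analyzer sampler itself is model-specific, but the Gibbs principle (alternate draws from each full conditional, and the chain's samples follow the joint target) can be shown on a toy posterior; this bivariate-normal example is a stand-in, not the paper's sampler:

```python
import numpy as np

def gibbs_bivariate_normal(rho, n_samples=5000, seed=0):
    """Sample a zero-mean bivariate normal with correlation rho by
    alternating draws from the two full conditionals."""
    rng = np.random.default_rng(seed)
    x = y = 0.0
    s = np.sqrt(1.0 - rho**2)          # conditional standard deviation
    samples = np.empty((n_samples, 2))
    for t in range(n_samples):
        x = rng.normal(rho * y, s)     # draw x | y
        y = rng.normal(rho * x, s)     # draw y | x
        samples[t] = (x, y)
    return samples
```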
Journal ArticleDOI

The Bayesian Evidence Scheme for Regularizing Probability-Density Estimating Neural Networks

TL;DR: A regularization method for mixture models with generalized linear kernel centers is proposed, which adopts the Bayesian evidence approach and optimizes the hyperparameters of the prior by type II maximum likelihood.
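Type II maximum likelihood in its most accessible form is scikit-learn's BayesianRidge, which re-estimates the prior and noise precisions by maximizing the marginal likelihood (the "evidence") rather than fixing them by hand. The paper applies the same principle to mixture-model regularization; this is only the textbook regression instance:

```python
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=200)

model = BayesianRidge().fit(X, y)      # hyperparameters tuned by evidence maximization
print(model.alpha_, model.lambda_)     # learned noise and prior precisions
```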