Journal ArticleDOI

Mixtures of probabilistic principal component analyzers

Michael E. Tipping, Christopher M. Bishop
- Neural Computation - 01 Feb 1999 -
- Vol. 11, Iss. 2, pp. 443-482
TLDR
PCA is formulated within a maximum likelihood framework, based on a specific form of Gaussian latent variable model, which leads to a well-defined mixture model for probabilistic principal component analyzers, whose parameters can be determined using an expectation-maximization algorithm.
Abstract
Principal component analysis (PCA) is one of the most popular techniques for processing, compressing, and visualizing data, although its effectiveness is limited by its global linearity. While nonlinear variants of PCA have been proposed, an alternative paradigm is to capture data complexity by a combination of local linear PCA projections. However, conventional PCA does not correspond to a probability density, and so there is no unique way to combine PCA models. Therefore, previous attempts to formulate mixture models for PCA have been ad hoc to some extent. In this article, PCA is formulated within a maximum likelihood framework, based on a specific form of Gaussian latent variable model. This leads to a well-defined mixture model for probabilistic principal component analyzers, whose parameters can be determined using an expectation-maximization algorithm. We discuss the advantages of this model in the context of clustering, density modeling, and local dimensionality reduction, and we demonstrate its application to image compression and handwritten digit recognition.
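The probabilistic PCA model the abstract describes admits a closed-form maximum likelihood solution for a single component: the loading matrix is built from the top eigenvectors of the sample covariance, and the isotropic noise variance is the average of the discarded eigenvalues. The sketch below illustrates that single-component solution only (the full mixture requires the EM algorithm discussed in the paper); function and variable names are illustrative, not from the paper.

```python
import numpy as np

def ppca_ml(X, q):
    """Closed-form maximum likelihood fit of a single probabilistic
    PCA model to data X (n samples, d features), with q latent dims.

    Returns the mean mu, loading matrix W (d x q), and isotropic
    noise variance sigma2, so the model covariance is W W^T + sigma2 I.
    """
    n, d = X.shape
    mu = X.mean(axis=0)
    S = np.cov(X - mu, rowvar=False, bias=True)   # sample covariance
    evals, evecs = np.linalg.eigh(S)              # ascending order
    evals, evecs = evals[::-1], evecs[:, ::-1]    # sort descending
    sigma2 = evals[q:].mean()                     # avg. discarded variance
    # Scale retained eigenvectors by sqrt(eigenvalue - noise variance)
    W = evecs[:, :q] * np.sqrt(np.maximum(evals[:q] - sigma2, 0.0))
    return mu, W, sigma2
```

By construction, the model covariance W W^T + sigma2 I reproduces the top q eigenvalues of the sample covariance exactly and replaces the remaining ones with their average, which is what distinguishes PPCA's density from an ad hoc projection.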



Citations
Journal ArticleDOI

A Data Mining Algorithm for Monitoring PCB Assembly Quality

TL;DR: In this article, a pattern clustering algorithm is proposed as a statistical quality control technique for diagnosing solder paste variability when a huge number of binary inspection outputs are involved. A latent variable model is first introduced and incorporated into a classical logistic regression model so that the interdependencies between measured physical characteristics, and their relationship to the final solder defects, can be explained.
Journal ArticleDOI

Efficient Algorithms on Robust Low-Rank Matrix Completion Against Outliers

TL;DR: The objective is to recover a low-rank data matrix from a small number of noisy observations and the proposed algorithm obtains a better solution with faster convergence speed than the benchmark algorithms in both synthetic and real data scenarios.
Book

Learning-Based Robot Vision

Josef Pauli
TL;DR: The decomposition of the high-level, deliberate task into sub-tasks, and the configuration and implementation of task-specific modules, are based on an experimental design phase; for each sub-task, the design phase and the application phase are then explained together in the corresponding subsections.
Proceedings Article

Product Grassmann Manifold representation and its LRR models

TL;DR: This paper proposes a novel representation, the Product Grassmann Manifold (PGM), to represent complex high-dimensional data with multiple varying factors, and its low-rank representation (LRR) models obtain superior clustering accuracy compared with methods on single manifolds or in conventional Euclidean spaces.
Proceedings ArticleDOI

Capturing appearance variation in active appearance models

TL;DR: This paper presents an extension of active appearance models (AAMs) that better handles the large variation in face appearance encountered in large multi-person face data sets; it employs a mixture of probabilistic PCA to describe texture variation, leading to a richer model.
References
Book

Neural networks for pattern recognition

TL;DR: This is the first comprehensive treatment of feed-forward neural networks from the perspective of statistical pattern recognition, and is designed as a text, with over 100 exercises, to benefit anyone involved in the fields of neural computation and pattern recognition.
Book

Principal Component Analysis

TL;DR: This book presents graphical representations of data using principal component analysis (PCA), covers PCA for time series and other non-independent data, and discusses generalizations and adaptations of principal component analysis.
Book ChapterDOI

Neural Networks for Pattern Recognition

TL;DR: The chapter discusses two important directions of research to improve learning algorithms: the dynamic node generation, which is used by the cascade correlation algorithm; and designing learning algorithms where the choice of parameters is not an issue.
Journal ArticleDOI

LIII. On lines and planes of closest fit to systems of points in space

TL;DR: This paper is concerned with the construction of lines and planes of closest fit to systems of points in space, minimizing the sum of squared perpendicular distances from the points to the fitted plane.