
Principal geodesic analysis

About: Principal geodesic analysis is a research topic. Over its lifetime, 336 publications have been published on this topic, receiving 25,808 citations.


Papers
Journal Article
TL;DR: In this paper, the principal axes of a set of observed data vectors may be determined through maximum-likelihood estimation of parameters in a latent variable model closely related to factor analysis.
Abstract: Principal component analysis (PCA) is a ubiquitous technique for data analysis and processing, but one which is not based upon a probability model. In this paper we demonstrate how the principal axes of a set of observed data vectors may be determined through maximum-likelihood estimation of parameters in a latent variable model closely related to factor analysis. We consider the properties of the associated likelihood function, giving an EM algorithm for estimating the principal subspace iteratively, and discuss the advantages conveyed by the definition of a probability density function for PCA.

3,362 citations
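The latent-variable formulation above lends itself to a compact EM iteration. The following is a minimal NumPy sketch of EM updates of the kind the paper describes for probabilistic PCA; the variable names (`W`, `sigma2`, `q`) and the toy data are illustrative assumptions, not material from the paper.

```python
import numpy as np

def ppca_em(X, q, n_iter=200, seed=0):
    """EM sketch for probabilistic PCA (latent-variable model of the paper).

    X : (N, d) data matrix; q : number of principal axes to estimate.
    Returns the loading matrix W (d, q), noise variance sigma2, and the mean.
    """
    rng = np.random.default_rng(seed)
    N, d = X.shape
    mu = X.mean(axis=0)
    Xc = X - mu
    S = Xc.T @ Xc / N                    # sample covariance (d, d)

    W = rng.normal(size=(d, q))          # random initial loadings
    sigma2 = 1.0                         # initial isotropic noise variance

    for _ in range(n_iter):
        M = W.T @ W + sigma2 * np.eye(q)               # (q, q)
        Minv = np.linalg.inv(M)
        SW = S @ W
        # M-step updates built from the E-step sufficient statistics
        W_new = SW @ np.linalg.inv(sigma2 * np.eye(q) + Minv @ W.T @ SW)
        sigma2 = np.trace(S - SW @ Minv @ W_new.T) / d
        W = W_new

    return W, sigma2, mu

if __name__ == "__main__":
    # Toy data: 3-D points varying mostly along one direction plus noise.
    rng = np.random.default_rng(1)
    z = rng.normal(size=(500, 1))
    X = z @ np.array([[2.0, 1.0, 0.5]]) + 0.1 * rng.normal(size=(500, 3))
    W, sigma2, mu = ppca_em(X, q=1)
    print("estimated principal axis:", (W / np.linalg.norm(W)).ravel())
    print("estimated noise variance:", sigma2)
```

At convergence the columns of `W` span the same subspace as the leading principal axes, which is the sense in which the probabilistic model recovers ordinary PCA.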

Journal Article
TL;DR: This work presents tools for hierarchical clustering of imaged objects according to the shapes of their boundaries, learning of probability models for clusters of shapes, and testing of newly observed shapes under competing probability models.
Abstract: Using a differential-geometric treatment of planar shapes, we present tools for: 1) hierarchical clustering of imaged objects according to the shapes of their boundaries, 2) learning of probability models for clusters of shapes, and 3) testing of newly observed shapes under competing probability models. Clustering at any level of hierarchy is performed using a minimum variance type criterion and a Markov process. Statistical means of clusters provide shapes to be clustered at the next higher level, thus building a hierarchy of shapes. Using finite-dimensional approximations of spaces tangent to the shape space at sample means, we (implicitly) impose probability models on the shape space, and results are illustrated via random sampling and classification (hypothesis testing). Together, hierarchical clustering and hypothesis testing provide an efficient framework for shape retrieval. Examples are presented using shapes and images from ETH, Surrey, and AMCOM databases.

2,858 citations
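The clustering step sketched in the abstract (pairwise shape distances feeding a minimum-variance-style agglomerative grouping) can be illustrated with standard tools. The sketch below is a simplified stand-in: a Procrustes-style landmark distance replaces the paper's elastic geodesic distance, and SciPy's average-linkage clustering replaces the Markov-process search; the function names and the toy shapes are assumptions for illustration only.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def procrustes_distance(a, b):
    """Procrustes-style distance between two landmark shapes of shape (k, 2).

    Shapes are centred, scaled to unit norm, and optimally rotated before
    comparison; this stands in for the geodesic shape distance of the paper.
    """
    a = a - a.mean(axis=0); a = a / np.linalg.norm(a)
    b = b - b.mean(axis=0); b = b / np.linalg.norm(b)
    u, _, vt = np.linalg.svd(a.T @ b)
    r = u @ vt                            # optimal alignment of b onto a
    return np.linalg.norm(a - b @ r.T)

def cluster_shapes(shapes, n_clusters):
    """Agglomerative clustering of shapes from pairwise shape distances."""
    n = len(shapes)
    d = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            d[i, j] = d[j, i] = procrustes_distance(shapes[i], shapes[j])
    # average linkage on the precomputed (condensed) distance matrix
    z = linkage(squareform(d, checks=False), method="average")
    return fcluster(z, t=n_clusters, criterion="maxclust")

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0, 2 * np.pi, 60, endpoint=False)
    circle = np.c_[np.cos(t), np.sin(t)]
    ellipse = np.c_[2 * np.cos(t), np.sin(t)]
    shapes = [circle + 0.02 * rng.normal(size=circle.shape) for _ in range(5)] + \
             [ellipse + 0.02 * rng.normal(size=ellipse.shape) for _ in range(5)]
    print(cluster_shapes(shapes, n_clusters=2))   # circles vs. ellipses
```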

Journal Article
TL;DR: The powerful visualization tools of geometric morphometrics and the typically large number of shape variables give rise to a specific exploratory style of analysis, allowing the identification and quantification of previously unknown shape features.
Abstract: Geometric morphometrics is the statistical analysis of form based on Cartesian landmark coordinates. After separating shape from overall size, position, and orientation of the landmark configurations, the resulting Procrustes shape coordinates can be used for statistical analysis. Kendall shape space, the mathematical space induced by the shape coordinates, is a metric space that can be approximated locally by a Euclidean tangent space. Thus, notions of distance (similarity) between shapes or of the length and direction of developmental and evolutionary trajectories can be meaningfully assessed in this space. Results of statistical techniques that preserve these convenient properties—such as principal component analysis, multivariate regression, or partial least squares analysis—can be visualized as actual shapes or shape deformations. The Procrustes distance between a shape and its relabeled reflection is a measure of bilateral asymmetry. Shape space can be extended to form space by augmenting the shape coordinates with the natural logarithm of Centroid Size, a measure of size in geometric morphometrics that is uncorrelated with shape for small isotropic landmark variation. The thin-plate spline interpolation function is the standard tool to compute deformation grids and 3D visualizations. It is also central to the estimation of missing landmarks and to the semilandmark algorithm, which makes it possible to include outlines and surfaces in geometric morphometric analysis. The powerful visualization tools of geometric morphometrics and the typically large number of shape variables give rise to a specific exploratory style of analysis, allowing the identification and quantification of previously unknown shape features.

1,017 citations
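The core workflow in the abstract (Procrustes superimposition followed by principal component analysis of the resulting shape coordinates) can be sketched directly. The code below is a minimal illustration with 2-D landmarks; the helper names, the simple iteration scheme, and the simulated specimens are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def align_to(ref, shape):
    """Centre, unit-scale, and rotate `shape` onto `ref` (orthogonal Procrustes)."""
    s = shape - shape.mean(axis=0)
    s = s / np.linalg.norm(s)
    u, _, vt = np.linalg.svd(s.T @ ref)
    return s @ (u @ vt)                  # rotated copy of the shape

def generalized_procrustes(shapes, n_iter=10):
    """Iteratively superimpose landmark configurations on their consensus shape.

    shapes : list of (k, 2) landmark arrays.
    Returns Procrustes shape coordinates as an (N, 2k) matrix.
    """
    ref = shapes[0] - shapes[0].mean(axis=0)
    ref = ref / np.linalg.norm(ref)
    aligned = shapes
    for _ in range(n_iter):
        aligned = [align_to(ref, s) for s in aligned]
        ref = np.mean(aligned, axis=0)
        ref = ref / np.linalg.norm(ref)  # keep the consensus at unit size
    return np.stack([a.ravel() for a in aligned])

def shape_pca(coords):
    """PCA on Procrustes shape coordinates (a local tangent-space approximation)."""
    centred = coords - coords.mean(axis=0)
    _, svals, vt = np.linalg.svd(centred, full_matrices=False)
    variances = svals**2 / (len(coords) - 1)
    return vt, variances                 # principal shape modes and their variances

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    base = np.array([[0, 0], [1, 0], [1, 1], [0, 1], [0.5, 1.5]], dtype=float)
    shapes = []
    for _ in range(20):
        # Simulated specimens: random rotation, scale, translation, landmark noise.
        a = rng.uniform(0, 2 * np.pi)
        rot = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
        s = (base + 0.03 * rng.normal(size=base.shape)) @ rot.T
        shapes.append(s * rng.uniform(0.5, 2.0) + rng.normal(size=2))
    coords = generalized_procrustes(shapes)
    modes, variances = shape_pca(coords)
    print("variance explained by first mode:", variances[0] / variances.sum())
```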

Journal Article
TL;DR: This work develops principal geodesic analysis, a generalization of principal component analysis to the manifold setting, and demonstrates its use in describing the variability of medially defined anatomical objects.
Abstract: A primary goal of statistical shape analysis is to describe the variability of a population of geometric objects. A standard technique for computing such descriptions is principal component analysis. However, principal component analysis is limited in that it only works for data lying in a Euclidean vector space. While this is certainly sufficient for geometric models that are parameterized by a set of landmarks or a dense collection of boundary points, it does not handle more complex representations of shape. We have been developing representations of geometry based on the medial axis description or m-rep. While the medial representation provides a rich language for variability in terms of bending, twisting, and widening, the medial parameters are not elements of a Euclidean vector space. They are in fact elements of a nonlinear Riemannian symmetric space. In this paper, we develop the method of principal geodesic analysis, a generalization of principal component analysis to the manifold setting. We demonstrate its use in describing the variability of medially-defined anatomical objects. Results of applying this framework on a population of hippocampi in a schizophrenia study are presented.

840 citations
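Principal geodesic analysis is commonly computed in a linearized form: estimate the intrinsic (Frechet) mean, map the data into the tangent space at that mean with the Riemannian log map, and run ordinary PCA there. The sketch below applies this recipe on the unit sphere as a stand-in for the m-rep symmetric space of the paper; the function names and the toy data are assumptions for illustration only.

```python
import numpy as np

def log_map(p, x):
    """Log map on the unit sphere: tangent vector at p pointing toward x."""
    cos_t = np.clip(np.dot(p, x), -1.0, 1.0)
    theta = np.arccos(cos_t)
    if theta < 1e-12:
        return np.zeros_like(p)
    v = x - cos_t * p                    # component of x orthogonal to p
    return theta * v / np.linalg.norm(v)

def exp_map(p, v):
    """Exp map on the unit sphere: follow the geodesic from p along v."""
    theta = np.linalg.norm(v)
    if theta < 1e-12:
        return p
    return np.cos(theta) * p + np.sin(theta) * v / theta

def frechet_mean(points, n_iter=50, step=1.0):
    """Intrinsic (Frechet) mean of points on the sphere by fixed-point iteration."""
    mu = points[0]
    for _ in range(n_iter):
        grad = np.mean([log_map(mu, x) for x in points], axis=0)
        mu = exp_map(mu, step * grad)
    return mu

def principal_geodesic_analysis(points):
    """Linearized PGA: PCA of log-mapped data in the tangent space at the mean."""
    mu = frechet_mean(points)
    tangent = np.stack([log_map(mu, x) for x in points])
    _, svals, vt = np.linalg.svd(tangent - tangent.mean(axis=0), full_matrices=False)
    variances = svals**2 / (len(points) - 1)
    return mu, vt, variances             # mean, principal directions, variances

if __name__ == "__main__":
    # Toy data: points scattered along a great circle near the north pole.
    rng = np.random.default_rng(0)
    angles = 0.4 * rng.normal(size=200)
    noise = 0.05 * rng.normal(size=200)
    pts = np.stack([np.sin(angles), np.sin(noise),
                    np.cos(angles) * np.cos(noise)], axis=1)
    pts /= np.linalg.norm(pts, axis=1, keepdims=True)
    mu, directions, var = principal_geodesic_analysis(pts)
    print("Frechet mean:", np.round(mu, 3))
    print("explained variances:", np.round(var, 4))
```

For data that lie in a small neighborhood of the mean, this tangent-space approximation is the standard practical route; exact PGA requires optimizing over geodesic submanifolds directly.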


Network Information

Related Topics (5)
Image segmentation: 79.6K papers, 1.8M citations (73% related)
Feature (computer vision): 128.2K papers, 1.7M citations (73% related)
Feature extraction: 111.8K papers, 2.1M citations (69% related)
Support vector machine: 73.6K papers, 1.7M citations (69% related)
Convolutional neural network: 74.7K papers, 2M citations (68% related)
Performance Metrics

Number of papers in this topic in previous years:

Year    Papers
2021    5
2020    7
2019    7
2018    7
2017    12
2016    13