Topic

Linear discriminant analysis

About: Linear discriminant analysis is a research topic. Over its lifetime, 18,361 publications have been published within this topic, receiving 603,195 citations. The topic is also known as LDA.


Papers
Journal ArticleDOI
TL;DR: This paper presents the R package HDclassif, which is devoted to the clustering and the discriminant analysis of high-dimensional data; the package is free software distributed under the GNU General Public License as part of the R project.
Abstract: This paper presents the R package HDclassif, which is devoted to the clustering and the discriminant analysis of high-dimensional data. The classification methods proposed in the package result from a new parametrization of the Gaussian mixture model which combines the idea of dimension reduction with model constraints on the covariance matrices. The supervised classification method using this parametrization is called High Dimensional Discriminant Analysis (HDDA). In a similar manner, the associated clustering method is called High Dimensional Data Clustering (HDDC) and uses the Expectation-Maximization (EM) algorithm for inference. In order to fit the data correctly, both methods estimate the specific subspace and the intrinsic dimension of each group. Due to the constraints on the covariance matrices, the number of parameters to estimate is significantly lower than in other model-based methods, which allows the methods to be stable and efficient in high-dimensional spaces. Experiments on artificial and real datasets show that HDDC and HDDA perform better than existing classical methods on high-dimensional data, even when the datasets are small. HDclassif is free software distributed under the GNU General Public License, as part of the R software project.

116 citations
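HDclassif itself is an R package, so what follows is only a loose Python sketch, added for illustration, of the HDDA idea summarized in the abstract above: each class gets its own PCA-estimated subspace and intrinsic dimension, a per-dimension variance inside that subspace, and a single pooled noise variance outside it, and new samples are assigned to the class with the highest Gaussian log-score. The class name HDDASketch and the var_threshold rule for choosing the intrinsic dimension are assumptions of this sketch, not the package's actual parametrization or API.

import numpy as np
from sklearn.decomposition import PCA

class HDDASketch:
    """Illustrative HDDA-style classifier: per-class subspace plus pooled noise term."""

    def __init__(self, var_threshold=0.9):
        self.var_threshold = var_threshold  # fraction of variance kept per class
        self.models = {}

    def fit(self, X, y):
        for c in np.unique(y):
            Xc = X[y == c]
            pca = PCA().fit(Xc)
            # intrinsic dimension: smallest d explaining var_threshold of the variance
            d = int(np.searchsorted(np.cumsum(pca.explained_variance_ratio_),
                                    self.var_threshold)) + 1
            tail = pca.explained_variance_[d:]
            noise_var = tail.mean() if tail.size else 1e-6
            self.models[c] = (Xc.mean(axis=0), pca.components_[:d],
                              pca.explained_variance_[:d], noise_var,
                              np.log(len(Xc) / len(X)))
        return self

    def _log_score(self, X, model):
        mean, W, signal_var, noise_var, log_prior = model
        centered = X - mean
        proj = centered @ W.T              # coordinates inside the class subspace
        resid = centered - proj @ W        # residual outside the subspace
        ll = -0.5 * np.sum(proj ** 2 / signal_var + np.log(signal_var), axis=1)
        ll -= 0.5 * (np.sum(resid ** 2, axis=1) / noise_var
                     + (X.shape[1] - W.shape[0]) * np.log(noise_var))
        return ll + log_prior

    def predict(self, X):
        classes = list(self.models)
        scores = np.column_stack([self._log_score(X, self.models[c]) for c in classes])
        return np.array(classes)[scores.argmax(axis=1)]

A fuller treatment would also cover the EM-based HDDC clustering variant and the family of covariance constraints described in the paper; this sketch only illustrates the supervised case.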

Journal ArticleDOI
TL;DR: This work considers the use of time-varying spectra for the classification and clustering of non-stationary time series, drawing on recent developments in local stationarity and Kullback-Leibler discrimination measures of distance to classify earthquakes and mining explosions at regional distances.

115 citations
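The Kullback-Leibler discrimination idea mentioned above can be illustrated in a much simplified, stationary form: estimate the spectrum of each series and compute the spectral KL discrimination between the two estimates. The paper itself works with time-varying (locally stationary) spectra, so the sketch below, with its Welch estimates, the function name kl_spectral_discrimination, and the AR(1) toy series, is only an assumed, simplified stand-in for the actual method.

import numpy as np
from scipy.signal import welch

def kl_spectral_discrimination(x, y, fs=1.0, nperseg=256):
    """Approximate KL discrimination: mean of f_x/f_y - log(f_x/f_y) - 1 over frequencies."""
    _, f_x = welch(x, fs=fs, nperseg=nperseg)
    _, f_y = welch(y, fs=fs, nperseg=nperseg)
    ratio = f_x / f_y
    return float(np.mean(ratio - np.log(ratio) - 1.0))

# Toy usage: series with very different dynamics should be easier to discriminate.
rng = np.random.default_rng(0)

def ar1(phi, n=2048):
    z = np.zeros(n)
    for t in range(1, n):
        z[t] = phi * z[t - 1] + rng.standard_normal()
    return z

print(kl_spectral_discrimination(ar1(0.9), ar1(0.1)))   # typically large
print(kl_spectral_discrimination(ar1(0.9), ar1(0.85)))  # typically much smaller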

Journal ArticleDOI
TL;DR: This article reviews the wealth of different pattern recognition methods that have been used for magnetic resonance spectroscopy (MRS)-based tumor classification and discusses the different approaches in view of practical and theoretical considerations.
Abstract: This article reviews the wealth of different pattern recognition methods that have been used for magnetic resonance spectroscopy (MRS)-based tumor classification. The methods have in common that the entire MR spectrum is used to develop linear and non-linear classifiers. The following issues are addressed: (i) pre-processing, such as normalization and digitization, (ii) extraction of relevant spectral features by multivariate methods, such as principal component analysis, linear discriminant analysis (LDA), and optimal discriminant vectors, and (iii) classification by LDA, cluster analysis and artificial neural networks. The different approaches are compared and discussed in view of practical and theoretical considerations.

115 citations
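The processing chain summarized in that review (pre-processing and normalization, multivariate feature extraction such as PCA, then classification with LDA) maps naturally onto a standard scikit-learn pipeline. The snippet below is a generic, hedged illustration of that chain on synthetic stand-in data; the array shapes and the choice of 20 principal components are assumptions for the example, not the protocol of any study covered by the review.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Synthetic stand-in: one row per spectrum (512 spectral points), three tumor classes.
rng = np.random.default_rng(0)
y = np.repeat(np.arange(3), 40)
X = rng.standard_normal((120, 512)) + y[:, None] * 0.2

pipeline = make_pipeline(
    StandardScaler(),              # pre-processing: per-feature normalization
    PCA(n_components=20),          # multivariate feature extraction
    LinearDiscriminantAnalysis(),  # classification by LDA on the reduced features
)
print(cross_val_score(pipeline, X, y, cv=5).mean())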

Journal ArticleDOI
TL;DR: In the framework of handwriting recognition, a novel GA-based feature selection algorithm is proposed in which feature subsets are evaluated by means of a specifically devised separability index. The index extends the Fisher Linear Discriminant method and uses covariance matrices to estimate how the class probability distributions are spread out in the considered N-dimensional feature space.

115 citations
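The paper's exact separability index is not reproduced on this page, so the sketch below only shows the generic form such a Fisher-style, covariance-based fitness function often takes: the trace of Sw^-1 Sb, computed from the within-class and between-class scatter matrices of a candidate feature subset. The function name, the ridge term that keeps Sw invertible, and the chromosome encoding in the usage comment are all assumptions of the sketch.

import numpy as np

def separability_index(X, y, feature_subset):
    """Fisher-style multiclass criterion trace(Sw^-1 Sb) on the selected features."""
    Xs = X[:, feature_subset]
    overall_mean = Xs.mean(axis=0)
    d = Xs.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in np.unique(y):
        Xc = Xs[y == c]
        class_mean = Xc.mean(axis=0)
        Sw += (Xc - class_mean).T @ (Xc - class_mean)
        diff = (class_mean - overall_mean)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    # small ridge keeps Sw invertible for degenerate feature subsets
    return float(np.trace(np.linalg.solve(Sw + 1e-6 * np.eye(d), Sb)))

# A GA would use this as the fitness of a binary chromosome selecting features:
# fitness = separability_index(X, y, np.flatnonzero(chromosome))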

Proceedings ArticleDOI
26 Mar 2000
TL;DR: Two methods using mixtures of linear subspaces for face detection in gray-level images are presented; one of them uses Kohonen's self-organizing map for clustering and the Fisher linear discriminant to find the optimal projection for pattern classification.
Abstract: We present two methods using mixtures of linear subspaces for face detection in gray-level images. One method uses a mixture of factor analyzers to concurrently perform clustering and, within each cluster, perform local dimensionality reduction. The parameters of the mixture model are estimated using an EM algorithm. A face is detected if the probability of an input sample is above a predefined threshold. The other mixture-of-subspaces method uses Kohonen's self-organizing map for clustering and the Fisher linear discriminant to find the optimal projection for pattern classification, and a Gaussian distribution to model the class-conditional density function of the projected samples for each class. The parameters of the class-conditional density functions are maximum likelihood estimates and the decision rule is also based on maximum likelihood. A wide range of face images, including ones in different poses, with different expressions and under different lighting conditions, is used as the training set to capture the variations of human faces. Our methods have been tested on three sets of 225 images which contain 871 faces. Experimental results on the first two datasets show that our methods perform as well as the best methods in the literature, yet have fewer false detections.

115 citations
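The decision stage of the second method above (a Fisher projection, per-class Gaussian densities fitted to the projected samples, and a maximum-likelihood decision rule) can be sketched as follows. The self-organizing-map clustering stage is omitted and scikit-learn's LDA is used for the Fisher projection, so this is an illustration of the idea under those assumptions rather than the authors' detector; the function names are also mine.

import numpy as np
from scipy.stats import norm
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def fit_fld_gaussians(X, y):
    """Fisher projection to one dimension plus a Gaussian density per class."""
    fld = LinearDiscriminantAnalysis(n_components=1).fit(X, y)
    z = fld.transform(X).ravel()
    params = {c: (z[y == c].mean(), z[y == c].std() + 1e-6) for c in np.unique(y)}
    return fld, params

def predict_max_likelihood(fld, params, X):
    """Assign each sample to the class whose projected Gaussian gives the highest likelihood."""
    z = fld.transform(X).ravel()
    classes = list(params)
    loglik = np.column_stack([norm(*params[c]).logpdf(z) for c in classes])
    return np.array(classes)[loglik.argmax(axis=1)]

# Usage with hypothetical face (1) vs. non-face (0) training patterns X_train, y_train:
# fld, params = fit_fld_gaussians(X_train, y_train)
# labels = predict_max_likelihood(fld, params, X_candidates)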


Network Information
Related Topics (5)
Regression analysis: 31K papers, 1.7M citations, 85% related
Artificial neural network: 207K papers, 4.5M citations, 80% related
Feature extraction: 111.8K papers, 2.1M citations, 80% related
Cluster analysis: 146.5K papers, 2.9M citations, 79% related
Image segmentation: 79.6K papers, 1.8M citations, 79% related
Performance Metrics
No. of papers in the topic in previous years
Year    Papers
2025    1
2024    2
2023    756
2022    1,711
2021    678
2020    815