Open Access Journal Article (DOI)

Linear discriminant analysis: A detailed tutorial

TLDR
A solid intuition is built for what LDA is and how it works, enabling readers of all levels to better understand LDA and to know how to apply this technique in different applications.
Abstract
Linear Discriminant Analysis (LDA) is a very common technique for dimensionality reduction problems, used as a preprocessing step for machine learning and pattern classification applications. At the same time, it is usually treated as a black box and is sometimes not well understood. The aim of this paper is to build a solid intuition for what LDA is and how it works, enabling readers of all levels to gain a better understanding of LDA and to know how to apply this technique in different applications. The paper first gives the basic definitions and steps of the LDA technique, supported with visual explanations of these steps. Moreover, the two methods of computing the LDA space, i.e. the class-dependent and class-independent methods, are explained in detail. Then, in a step-by-step approach, two numerical examples demonstrate how the LDA space can be calculated for the class-dependent and class-independent methods. Furthermore, two of the most common LDA problems (i.e. the Small Sample Size (SSS) and non-linearity problems) are highlighted and illustrated, and state-of-the-art solutions to these problems are investigated and explained. Finally, a number of experiments were conducted with different datasets to (1) investigate the effect of the eigenvectors used in the LDA space on the robustness of the extracted features for classification accuracy, and (2) show when the SSS problem occurs and how it can be addressed.
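
The class-independent computation summarized in the abstract can be sketched in a few lines of code. The following is a minimal sketch, not the paper's own implementation: it assumes NumPy, and the function name lda_fit, the synthetic data, and the use of a pseudo-inverse are illustrative choices. It builds the within-class scatter matrix S_W and the between-class scatter matrix S_B, then projects the data onto the leading eigenvectors of pinv(S_W) @ S_B.

```python
import numpy as np

def lda_fit(X, y, n_components):
    """Class-independent LDA sketch: project X onto the leading
    eigenvectors of pinv(S_W) @ S_B."""
    classes = np.unique(y)
    n_features = X.shape[1]
    mean_total = X.mean(axis=0)

    S_W = np.zeros((n_features, n_features))  # within-class scatter
    S_B = np.zeros((n_features, n_features))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        S_W += (Xc - mean_c).T @ (Xc - mean_c)
        diff = (mean_c - mean_total).reshape(-1, 1)
        S_B += Xc.shape[0] * (diff @ diff.T)

    # Eigen-decomposition of pinv(S_W) @ S_B; the pseudo-inverse guards
    # against a singular S_W, as in the Small Sample Size (SSS) case.
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(S_W) @ S_B)
    order = np.argsort(eigvals.real)[::-1]
    return eigvecs[:, order[:n_components]].real  # LDA projection matrix

# Example: project synthetic 3-class, 4-feature data onto a 2-D LDA space.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=m, size=(50, 4)) for m in (0.0, 2.0, 4.0)])
y = np.repeat([0, 1, 2], 50)
W = lda_fit(X, y, n_components=2)
X_lda = X @ W  # shape (150, 2)
```

The class-dependent variant described in the paper differs in that a separate within-class scatter (and hence a separate projection) is computed per class, rather than the single pooled S_W used here.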


Citations
Journal Article (DOI)

Selecting critical features for data classification based on machine learning methods

TL;DR: This paper adopts Random Forest to select the important features for classification and compares results on the dataset with and without feature selection using the RF methods varImp(), Boruta, and Recursive Feature Elimination, reporting the best accuracy and kappa.
Journal Article (DOI)

Overview and comparative study of dimensionality reduction techniques for high dimensional data

TL;DR: This paper presents state-of-the-art dimensionality reduction techniques, their suitability for different types of data and application areas, and the issues of dimensionality reduction techniques that can affect the accuracy and relevance of results.
Journal Article (DOI)

Intelligent Bézier curve-based path planning model using Chaotic Particle Swarm Optimization algorithm

TL;DR: A novel Chaotic Particle Swarm Optimization (CPSO) algorithm is proposed to optimize the control points of a Bézier curve, and the proposed algorithm is shown to be capable of finding the optimal path.
Journal Article (DOI)

A Deep Feature Learning Model for Pneumonia Detection Applying a Combination of mRMR Feature Selection and Machine Learning Models

TL;DR: It is shown that the deep features provide robust and consistent features for pneumonia detection, and the minimum redundancy maximum relevance (mRMR) method was found to be a beneficial tool for reducing the dimension of the feature set.
Journal Article (DOI)

Chaotic dragonfly algorithm: an improved metaheuristic algorithm for feature selection

TL;DR: Experimental results demonstrated the capability of CDA to find the optimal feature subset, maximizing classification performance while minimizing the number of selected features compared with DA and other meta-heuristic optimization algorithms.
References
Book

The Nature of Statistical Learning Theory

TL;DR: Setting of the learning problem; consistency of learning processes; bounds on the rate of convergence of learning processes; controlling the generalization ability of learning processes; constructing learning algorithms; what is important in learning theory?
Journal Article (DOI)

Reducing the Dimensionality of Data with Neural Networks

TL;DR: In this article, an effective way of initializing the weights is described that allows deep autoencoder networks to learn low-dimensional codes, which work much better than principal components analysis as a tool to reduce the dimensionality of data.
Journal Article (DOI)

Eigenfaces for recognition

TL;DR: A near-real-time computer system is presented that can locate and track a subject's head and then recognize the person by comparing characteristics of the face to those of known individuals; it is easy to implement using a neural network architecture.
Journal Article (DOI)

Eigenfaces vs. Fisherfaces: recognition using class specific linear projection

TL;DR: A face recognition algorithm is developed which is insensitive to large variations in lighting direction and facial expression; it is based on Fisher's linear discriminant and produces well-separated classes in a low-dimensional subspace, even under severe variations in lighting and facial expressions.
Trending Questions (2)
What is discriminant analysis in machine learning?

The paper provides a detailed tutorial on Linear Discriminant Analysis (LDA), which is a common technique for dimensionality reduction in machine learning and pattern classification applications. It explains the basic definitions, steps, and methods of computing the LDA space, as well as solutions to common LDA problems.
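
As a hedged illustration of this dimensionality-reduction role (not taken from the paper, which derives LDA from scratch), the same technique is available off the shelf in scikit-learn; the iris dataset and parameter choices below are assumptions for the example:

```python
# Minimal usage sketch: LDA as a supervised dimensionality reducer.
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)
lda = LinearDiscriminantAnalysis(n_components=2)  # at most n_classes - 1
X_reduced = lda.fit_transform(X, y)               # project onto the LDA space
print(X_reduced.shape)  # (150, 2)
```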

What is linear and quadratic discriminant analysis in machine learning?

The paper focuses on linear discriminant analysis; quadratic discriminant analysis is not specifically covered.