Open Access · Proceedings Article

Efficient Kernel Discriminant Analysis via Spectral Regression

TLDR
By using spectral graph analysis, SRKDA casts discriminant analysis into a regression framework that facilitates both efficient computation and the use of regularization techniques, greatly reducing the computational cost.
Abstract
Linear discriminant analysis (LDA) has been a popular method for extracting features that preserve class separability. The projection vectors are commonly obtained by maximizing the between-class covariance while simultaneously minimizing the within-class covariance. LDA can be performed either in the original input space or in the reproducing kernel Hilbert space (RKHS) into which data points are mapped, which leads to Kernel Discriminant Analysis (KDA). When the data are highly nonlinearly distributed, KDA can achieve better performance than LDA. However, computing the projective functions in KDA involves an eigen-decomposition of the kernel matrix, which is very expensive when a large number of training samples exist. In this paper, we present a new algorithm for kernel discriminant analysis, called spectral regression kernel discriminant analysis (SRKDA). By using spectral graph analysis, SRKDA casts discriminant analysis into a regression framework, which facilitates both efficient computation and the use of regularization techniques. Specifically, SRKDA only needs to solve a set of regularized regression problems, and no eigenvector computation is involved, which greatly reduces the computational cost. Our computational analysis shows that SRKDA is 27 times faster than ordinary KDA. Moreover, the new formulation makes it very easy to develop an incremental version of the algorithm, which can fully utilize the computational results of the existing training samples. Experiments on face recognition demonstrate the effectiveness and efficiency of the proposed algorithm.
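
To make the computational idea concrete, here is a minimal sketch of the procedure described above. It is an illustration only, not the authors' released code: it assumes a Gaussian (RBF) kernel, uses class-indicator vectors orthogonalized against the all-ones vector as the spectral-regression targets, and solves the regularized systems (K + delta*I) alpha = y with a single Cholesky factorization in place of the kernel-matrix eigen-decomposition. All function and parameter names (srkda_fit, srkda_transform, gamma, delta) are illustrative.

import numpy as np
from scipy.linalg import cho_factor, cho_solve

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and the rows of Y.
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def srkda_fit(X, labels, gamma=1.0, delta=0.01):
    # Step 1 (spectral-regression targets): build c-1 response vectors that are
    # piecewise constant on the classes and orthogonal to the all-ones vector.
    labels = np.asarray(labels)
    classes = np.unique(labels)
    n, c = len(labels), len(classes)
    indicators = np.stack([(labels == cls).astype(float) for cls in classes], axis=1)

    basis = [np.ones(n) / np.sqrt(n)]   # start Gram-Schmidt from the all-ones vector
    targets = []
    for k in range(c):
        v = indicators[:, k].copy()
        for b in basis:
            v -= (v @ b) * b
        norm = np.linalg.norm(v)
        if norm > 1e-10:                # the last indicator becomes ~0 and is dropped
            v /= norm
            basis.append(v)
            targets.append(v)
    Y = np.stack(targets, axis=1)       # n x (c-1) regression targets

    # Step 2 (regularized regression): solve (K + delta*I) alpha = y_k for all
    # targets with one Cholesky factorization -- no eigenvector computation.
    K = rbf_kernel(X, X, gamma)
    factor = cho_factor(K + delta * np.eye(n))
    A = cho_solve(factor, Y)            # n x (c-1) expansion coefficients
    return A

def srkda_transform(X_train, A, X_new, gamma=1.0):
    # Project new points with f_k(x) = sum_i alpha_ik * K(x, x_i).
    return rbf_kernel(X_new, X_train, gamma) @ A

Because the Cholesky factorization of K + delta*I is computed once and reused for all c-1 targets, the dominant cost is a single factorization of an n x n matrix rather than its full eigen-decomposition, which is where the reported speed-up over ordinary KDA comes from.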


Citations
Journal Article

SRDA: An Efficient Algorithm for Large-Scale Discriminant Analysis

TL;DR: By using spectral graph analysis, SRDA casts discriminant analysis into a regression framework that facilitates both efficient computation and the use of regularization techniques; no eigenvector computation is involved, which yields large savings in both time and memory.
Journal Article

Driver Drowsiness Classification Using Fuzzy Wavelet-Packet-Based Feature-Extraction Algorithm

TL;DR: The experimental results demonstrated the significance of FMIWPT in extracting features that highly correlate with the different drowsiness levels, achieving a classification accuracy of 95%-97% on average across all subjects.

The MediaMill TRECVID 2006 semantic video search engine

TL;DR: The MediaMill Challenge 2006, as discussed by the authors, divided the generic video indexing problem into visual-only, textual-only, early fusion, late fusion, and combined analysis experiments; the MediaMill team participated in two tasks: concept detection and search.
Journal Article

Tensor decompositions for feature extraction and classification of high dimensional datasets

TL;DR: This work proposes algorithms for feature extraction and classification based on orthogonal or nonnegative tensor (multi-array) decompositions and higher-order (multilinear) discriminant analysis (HODA), whereby input data are treated as tensors instead of more conventional vector or matrix representations.
Journal Article

Speed up kernel discriminant analysis

TL;DR: Spectral Regression Kernel Discriminant Analysis is presented, which casts discriminant analysis into a regression framework that facilitates both efficient computation and the use of regularization techniques.
References
Journal Article

LIBSVM: A library for support vector machines

TL;DR: Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
Book

Matrix computations

Gene H. Golub

Statistical learning theory

TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.
Journal Article

A Tutorial on Support Vector Machines for Pattern Recognition

TL;DR: Several arguments supporting the observed high accuracy of SVMs are reviewed, and numerous examples and proofs of most of the key theorems are given.
Book

Introduction to Statistical Pattern Recognition

TL;DR: This completely revised second edition presents an introduction to statistical pattern recognition, which is appropriate as a text for introductory courses in pattern recognition and as a reference book for workers in the field.