Book Chapter

A Criterion for Learning the Data-Dependent Kernel for Classification

TLDR
A novel kernel optimization method based on the maximum margin criterion is proposed, which resolves the drawback of Xiong's work that, owing to matrix singularity, the optimal solution can only be found by an iterative update algorithm.
Abstract
A novel criterion, namely the Maximum Margin Criterion (MMC), is proposed for learning the data-dependent kernel for classification. Different kernels induce different geometrical structures of the data in the feature space and therefore lead to different degrees of class discrimination, so the choice of kernel strongly influences the performance of kernel-based learning, and optimizing the kernel is an effective way to improve classification performance. In this paper, we propose a novel kernel optimization method based on the maximum margin criterion, which resolves the drawback of Xiong's work [1] that, owing to matrix singularity, the optimal solution can only be found by an iterative update algorithm. Our method obtains a unique optimal solution by solving a single eigenvalue problem, improving performance while reducing computation time. Experimental results show that the proposed algorithm achieves better performance at lower computational cost than Xiong's work.
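The construction described in the abstract can be sketched concretely. The following is a minimal NumPy sketch, not the authors' implementation: it assumes the common data-dependent kernel form k(x, z) = q(x) q(z) k0(x, z), where q(x) = sum_i alpha_i k1(x, a_i) over a set of "empirical cores" a_i, and one plausible construction of the "between" and "within" matrices (exact definitions vary across papers). The key point matches the abstract: with the MMC objective alpha^T (M - N) alpha under a unit-norm constraint, the optimum is the leading eigenvector of a symmetric matrix, so no iterative update and no matrix inversion are needed.

```python
import numpy as np

def mmc_kernel_weights(K0, K1, y):
    """Sketch: learn the q-function weights alpha of a data-dependent
    kernel k(x,z) = q(x) q(z) k0(x,z) via a maximum-margin-style
    criterion, solved as a single symmetric eigenvalue problem.

    K0 : (n, n) basic kernel matrix k0(x_i, x_j)
    K1 : (n, m) matrix k1(x_i, a_j) over m "empirical cores" a_j
    y  : (n,) class labels
    """
    n = len(y)
    classes, counts = np.unique(y, return_counts=True)

    # Within-class averaging matrix W (block-diagonal by class) and
    # the global averaging matrix B, from class-indicator structure.
    W = np.zeros((n, n))
    for c, nc in zip(classes, counts):
        idx = np.where(y == c)[0]
        W[np.ix_(idx, idx)] = 1.0 / nc
    B = np.full((n, n), 1.0 / n)

    # "Between" and "within" matrices in the empirical feature space
    # (one plausible construction; papers define these differently).
    B0 = K0 * (W - B)          # between-class spread
    W0 = K0 * (np.eye(n) - W)  # within-class spread

    M = K1.T @ B0 @ K1
    N = K1.T @ W0 @ K1

    # MMC: maximize alpha^T (M - N) alpha  s.t.  ||alpha|| = 1.
    # The unique optimum is the leading eigenvector of the symmetric
    # matrix (M - N) -- no iteration, no inversion of a singular N.
    S = (M - N + (M - N).T) / 2.0   # symmetrize for numerical safety
    eigvals, eigvecs = np.linalg.eigh(S)
    return eigvecs[:, -1]           # eigenvector of largest eigenvalue
```

A downstream classifier (e.g. an SVM) can then be trained on the induced kernel q(x) q(z) k0(x, z) built from the learned alpha.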


Citations
Journal Article

Learning SVM with weighted maximum margin criterion for classification of imbalanced data

TL;DR: A weighted maximum margin criterion is proposed to optimize the data-dependent kernel, making the minority class more clustered in the induced feature space; experimental results indicate the effectiveness of the proposed algorithm for imbalanced data classification problems (a sketch of the weighting idea follows).
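As a hedged illustration of the weighting idea (the cited paper defines its own weights; the inverse-frequency scheme below is an assumption), one way to make the minority class more clustered is to scale each class's block of the within-class averaging matrix before forming the MMC matrices:

```python
import numpy as np

def weighted_within_matrix(y):
    """Sketch: scale each class's within-class averaging block by an
    inverse-frequency weight so the minority class's scatter is
    penalized more strongly (hypothetical weighting scheme)."""
    n = len(y)
    classes, counts = np.unique(y, return_counts=True)
    weights = n / (len(classes) * counts)  # larger for minority classes
    W = np.zeros((n, n))
    for c, nc, wc in zip(classes, counts, weights):
        idx = np.where(y == c)[0]
        W[np.ix_(idx, idx)] = wc / nc
    return W
```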
References
Book

The Nature of Statistical Learning Theory

TL;DR: Setting of the learning problem; consistency of learning processes; bounds on the rate of convergence of learning processes; controlling the generalization ability of learning processes; constructing learning algorithms; what is important in learning theory?
Journal Article

Nonlinear component analysis as a kernel eigenvalue problem

TL;DR: A new method for performing a nonlinear form of principal component analysis by the use of integral operator kernel functions is proposed and experimental results on polynomial feature extraction for pattern recognition are presented.
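The eigenvalue formulation summarized above is easy to illustrate. Below is a minimal kernel PCA sketch in NumPy; the RBF kernel and the gamma value are illustrative choices, not prescribed by the paper:

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Minimal kernel PCA sketch with an RBF kernel."""
    # Gram matrix K_ij = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

    # Center the kernel matrix in feature space.
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one

    # Nonlinear PCA reduces to an eigenvalue problem on Kc.
    eigvals, eigvecs = np.linalg.eigh(Kc)
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # descending

    # Normalize eigenvectors so projections have the right scale,
    # then project the training data onto the leading components.
    alphas = eigvecs[:, :n_components] / np.sqrt(
        np.maximum(eigvals[:n_components], 1e-12))
    return Kc @ alphas
```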
Journal Article

An introduction to kernel-based learning algorithms

TL;DR: This paper provides an introduction to support vector machines, kernel Fisher discriminant analysis, and kernel principal component analysis, as examples for successful kernel-based learning methods.
Proceedings Article

Fisher discriminant analysis with kernels

TL;DR: In this article, a non-linear classification technique based on Fisher's discriminant is proposed; the main ingredient is the kernel trick, which allows the efficient computation of the Fisher discriminant in feature space.
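A compact illustration of this kernel trick for the two-class case follows. This is a hedged sketch of the regularized dual formulation (the regularization constant reg is an assumed hyperparameter), not the paper's exact algorithm:

```python
import numpy as np

def kernel_fda(K, y, reg=1e-3):
    """Two-class kernel Fisher discriminant sketch.
    K : (n, n) precomputed kernel matrix; y : (n,) labels in {0, 1}."""
    n = K.shape[0]
    idx0, idx1 = np.where(y == 0)[0], np.where(y == 1)[0]

    # Class means expressed in the span of the kernel columns.
    m0 = K[:, idx0].mean(axis=1)
    m1 = K[:, idx1].mean(axis=1)

    # Within-class matrix N in dual form: sum_c K_c (I - 1/n_c) K_c^T.
    N = np.zeros((n, n))
    for idx in (idx0, idx1):
        Kc = K[:, idx]
        H = np.eye(len(idx)) - np.full((len(idx), len(idx)), 1.0 / len(idx))
        N += Kc @ H @ Kc.T
    N += reg * np.eye(n)  # regularize: N is typically singular

    # Fisher direction in feature space: alpha proportional to
    # N^{-1} (m1 - m0) in the two-class case.
    alpha = np.linalg.solve(N, m1 - m0)
    return alpha  # project a new point x via sum_i alpha_i k(x_i, x)
```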
Proceedings Article

Parameterisation of a stochastic model for human face identification

TL;DR: This paper presents a set of experimental results in which various HMM parameterisations are analysed, showing that stochastic modelling can be used successfully to encode feature information.