Open Access Journal Article

Learning over sets using kernel principal angles

TLDR
A new positive definite kernel f(A,B), defined over pairs of matrices A, B, is derived from the concept of principal angles between two linear subspaces. It is shown that the principal angles can be recovered using only inner products between pairs of column vectors of the input matrices, thereby allowing the original column vectors to be mapped onto arbitrarily high-dimensional feature spaces.
Abstract
We consider the problem of learning with instances defined over a space of sets of vectors. We derive a new positive definite kernel f(A,B) defined over pairs of matrices A, B, based on the concept of principal angles between two linear subspaces. We show that the principal angles can be recovered using only inner products between pairs of column vectors of the input matrices, thereby allowing the original column vectors of A, B to be mapped onto arbitrarily high-dimensional feature spaces. We demonstrate the usage of the matrix-based kernel function f(A,B) with experiments on two visual tasks. The first task is the discrimination of "irregular" motion trajectories of an individual or a group of individuals in a video sequence. We use the SVM approach with f(A,B), where an input matrix represents the motion trajectory of a group of individuals over a certain (fixed) time frame. We show that this classification (irregular versus regular) greatly outperforms the conventional representation in which all the trajectories form a single vector. The second application is the visual recognition of faces from input video sequences representing head motion and facial expressions, where f(A,B) is used to compare two image sequences.
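The abstract's construction can be sketched concretely: orthonormalize the columns of each input matrix, take the singular values of the cross product of the two orthonormal bases (these are the cosines of the principal angles), and combine them into a scalar kernel value. The function name and the specific combination (product of squared cosines) below are an illustrative assumption, not the paper's exact definition:

```python
import numpy as np

def principal_angle_kernel(A, B):
    """Illustrative kernel over column spans of matrices A and B:
    the product of squared cosines of the principal angles between
    span(A) and span(B). This is a sketch, not the paper's exact f(A,B)."""
    # Orthonormal bases for the two column spans.
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    # Singular values of Qa^T Qb are the cosines of the principal angles.
    cos_theta = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    # Clip to guard against tiny numerical overshoot above 1.
    return float(np.prod(np.clip(cos_theta, 0.0, 1.0) ** 2))
```

Note that f(A,A) = 1 (all principal angles are zero) and f(A,B) = 0 when the two subspaces are orthogonal; the paper's kernelized version computes the same quantities from inner products alone, so the columns may live in an implicit feature space.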



Citations
Proceedings Article

The pyramid match kernel: discriminative classification with sets of image features

TL;DR: A new fast kernel function is presented which maps unordered feature sets to multi-resolution histograms and computes a weighted histogram intersection in this space. The kernel is shown to be positive-definite, making it valid for use in learning algorithms whose optimal solutions are guaranteed only for Mercer kernels.
Proceedings Article

Face recognition in unconstrained videos with matched background similarity

TL;DR: A comprehensive database of labeled videos of faces in challenging, uncontrolled conditions, the 'YouTube Faces' database, is presented along with benchmark pair-matching tests, and a novel set-to-set similarity measure, the Matched Background Similarity (MBGS), is described.
Proceedings Article

Grassmann discriminant analysis: a unifying view on subspace-based learning

TL;DR: This paper proposes a discriminant learning framework for problems in which data consist of linear subspaces instead of vectors; it treats each subspace as a point in the Grassmann space and performs feature extraction and classification in that same space.
Journal Article

Discriminative Learning and Recognition of Image Set Classes Using Canonical Correlations

TL;DR: A novel discriminative learning method over sets is proposed for set classification that maximizes the canonical correlations of within-class sets and minimizes the canonical correlations of between-class sets.
Patent

Vision system for vehicle

TL;DR: In this patent, a forward-facing vision system for a vehicle includes a forward-facing camera disposed in a windshield electronics module attached at a windshield of the vehicle and viewing through the windshield.
References
Book

The Nature of Statistical Learning Theory

TL;DR: Setting of the learning problem; consistency of learning processes; bounds on the rate of convergence of learning processes; controlling the generalization ability of learning processes; constructing learning algorithms; what is important in learning theory?
Proceedings Article

A training algorithm for optimal margin classifiers

TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including perceptrons, polynomials, and radial basis functions.
Book Chapter

Relations Between Two Sets of Variates

TL;DR: The concepts of correlation and regression may be applied not only to ordinary one-dimensional variates but also to variates of two or more dimensions, as discussed by the authors: whereas the correlation of single components is ordinarily discussed, the complex consisting of horizontal and vertical deviations may be even more interesting.
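Hotelling's canonical correlations, cited here, are the quantity underlying the principal-angle kernel: the cosines of the principal angles between the column spans of two centered data blocks are exactly their canonical correlations. A minimal sketch (function name assumed for illustration):

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between two blocks of column variates
    (rows are observations). Sketch: center each block, orthonormalize,
    then the singular values of the cross product are the correlations."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(Xc)
    Qy, _ = np.linalg.qr(Yc)
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)
```

When Y is an invertible linear transform of X, every canonical correlation is 1, reflecting that the two blocks span the same subspace after centering.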
Proceedings Article

Learning with Kernels
