Open Access

Statistical learning theory

TLDR
Presenting a method for determining the necessary and sufficient conditions for consistency of learning processes, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.
Abstract
A comprehensive look at learning and generalization theory. The statistical theory of learning and generalization concerns the problem of choosing desired functions on the basis of empirical data. Highly applicable to a variety of computer science and robotics fields, this book offers lucid coverage of the theory as a whole. Presenting a method for determining the necessary and sufficient conditions for consistency of learning processes, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.
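As a brief illustration of the problem the abstract describes (the notation below follows the standard statistical-learning-theory formulation and is not quoted from the book), learning is cast as minimizing an expected risk when only a finite sample is available:

\[
R(f) = \int L\big(y, f(x)\big)\, dP(x, y), \qquad
R_{\mathrm{emp}}(f) = \frac{1}{n} \sum_{i=1}^{n} L\big(y_i, f(x_i)\big),
\]

where empirical risk minimization selects \( f_n = \arg\min_{f \in \mathcal{F}} R_{\mathrm{emp}}(f) \); consistency asks whether \( R(f_n) \to \inf_{f \in \mathcal{F}} R(f) \) as the sample size \( n \) grows.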


Citations
Book Chapter

The Support Vector Method of Function Estimation

TL;DR: For the Support Vector method, neither the quality nor the complexity of the solution depends directly on the dimensionality of the input space, so on the basis of this technique one can obtain a good estimate from a given number of high-dimensional data points.
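As a minimal sketch of the dimensionality point above (using scikit-learn's SVR purely for illustration; the chapter is not tied to this library, and all parameter values are assumptions), the fitted model is expressed through kernel evaluations on support vectors rather than through one weight per input dimension:

import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n, d = 200, 500                                   # few samples, many input dimensions
X = rng.normal(size=(n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)    # target depends on one coordinate only

# Epsilon-insensitive support vector regression with an RBF kernel.
model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y)
print("support vectors used:", model.support_vectors_.shape[0], "out of", n)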
Journal Article

Emotional state classification from EEG data using machine learning approach

TL;DR: From experimental results, it is found that the power spectrum feature is superior to the other two kinds of features; a feature smoothing method based on a linear dynamic system significantly improves emotion classification accuracy; and the trajectory of emotion changes can be visualized by reducing subject-independent features with manifold learning.
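A simplified sketch of such a pipeline is given below; the band-power feature, the moving-average smoother (a crude stand-in for the linear-dynamic-system smoothing described in the paper), and all parameter values are illustrative assumptions rather than the authors' code:

import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC

def band_power(trial, fs=128, band=(8, 13)):
    # Mean power spectral density of one channel within a frequency band.
    f, pxx = welch(trial, fs=fs, nperseg=fs)
    mask = (f >= band[0]) & (f <= band[1])
    return pxx[mask].mean()

def smooth(features, k=5):
    # Moving-average smoothing over trials; stands in for LDS-based smoothing.
    kernel = np.ones(k) / k
    return np.apply_along_axis(lambda col: np.convolve(col, kernel, mode="same"), 0, features)

# Toy data: 100 single-channel EEG trials of 1 s at 128 Hz with binary emotion labels.
rng = np.random.default_rng(0)
trials = rng.normal(size=(100, 128))
labels = rng.integers(0, 2, size=100)

features = np.array([[band_power(t)] for t in trials])
clf = SVC(kernel="linear").fit(smooth(features), labels)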
Journal Article

A Novel Transductive SVM for Semisupervised Classification of Remote-Sensing Images

TL;DR: A novel modified TSVM classifier is proposed for addressing ill-posed remote-sensing problems; it mitigates the effects of suboptimal model selection and can address multiclass cases.
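The paper's modified TSVM is not reproduced here; the following self-training loop around scikit-learn's SVC is only a rough sketch of the transductive idea of letting unlabeled samples reshape the decision boundary (function name, threshold, and parameters are all assumptions, and the binary case is used for simplicity):

import numpy as np
from sklearn.svm import SVC

def transductive_like_svm(X_lab, y_lab, X_unlab, rounds=3, conf=1.0):
    # Iteratively add confidently classified unlabeled samples to the training set.
    X_train, y_train = X_lab.copy(), y_lab.copy()
    clf = SVC(kernel="rbf").fit(X_train, y_train)
    for _ in range(rounds):
        if len(X_unlab) == 0:
            break
        scores = np.abs(clf.decision_function(X_unlab))
        keep = scores > conf                       # far from the margin => confident
        if not keep.any():
            break
        X_train = np.vstack([X_train, X_unlab[keep]])
        y_train = np.concatenate([y_train, clf.predict(X_unlab[keep])])
        X_unlab = X_unlab[~keep]
        clf = SVC(kernel="rbf").fit(X_train, y_train)
    return clf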
Journal Article

Evolutionary Optimization of Computationally Expensive Problems via Surrogate Modeling

TL;DR: The backbone of the framework is an evolutionary algorithm coupled with a feasible sequential quadratic programming solver in the spirit of Lamarckian learning; it leverages surrogate models for solving computationally expensive design problems with general constraints on a limited computational budget.
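A compact sketch of that coupling, with a Gaussian-process surrogate and scipy's SLSQP routine playing the role of the sequential quadratic programming solver (the objective, constraints, and every parameter here are illustrative assumptions, not the authors' framework):

import numpy as np
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor

def expensive(x):                                  # stand-in for a costly simulation
    return float(np.sum((x - 0.5) ** 2))

rng = np.random.default_rng(0)
dim = 5
pop = rng.uniform(0, 1, size=(20, dim))            # initial population
fitness = np.array([expensive(x) for x in pop])

for gen in range(10):
    # Fit a cheap surrogate of the expensive objective on all evaluated points.
    gp = GaussianProcessRegressor().fit(pop, fitness)

    # Evolutionary step: mutate the current best individual.
    child = pop[np.argmin(fitness)] + 0.1 * rng.normal(size=dim)

    # Lamarckian local refinement on the surrogate with an SQP solver (SLSQP),
    # subject to simple inequality constraints keeping the design in [0, 1]^dim.
    res = minimize(lambda x: float(gp.predict(x.reshape(1, -1))[0]), child, method="SLSQP",
                   constraints=[{"type": "ineq", "fun": lambda x: x},
                                {"type": "ineq", "fun": lambda x: 1.0 - x}])

    # Only the refined candidate is evaluated with the true expensive function.
    pop = np.vstack([pop, res.x])
    fitness = np.append(fitness, expensive(res.x))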
Posted Content

Gradient Episodic Memory for Continual Learning

TL;DR: In this article, Gradient Episodic Memory (GEM) is proposed for continual learning, where the model observes examples from a sequence of tasks one at a time, and each example only once.
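GEM constrains each update so that the loss on episodic memories of earlier tasks does not increase, projecting the current gradient when it conflicts with stored gradients. The full method solves a small quadratic program over all past tasks; the numpy sketch below is a single-memory simplification for illustration only:

import numpy as np

def gem_project(grad, mem_grad):
    # Project the current-task gradient so that <grad', mem_grad> >= 0,
    # i.e. the update does not increase the loss on the stored memory.
    # Single-memory simplification of GEM's quadratic program.
    dot = grad @ mem_grad
    if dot >= 0:                                   # no interference with the past task
        return grad
    return grad - (dot / (mem_grad @ mem_grad)) * mem_grad

# Toy usage: after projection the gradient no longer conflicts with the memory gradient.
g = np.array([1.0, -2.0, 0.5])
g_mem = np.array([0.0, 1.0, 0.0])
g_proj = gem_project(g, g_mem)
print(g_proj, g_proj @ g_mem >= 0)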
References
Book

The Nature of Statistical Learning Theory

TL;DR: Setting of the learning problem; consistency of learning processes; bounds on the rate of convergence of learning processes; controlling the generalization ability of learning processes; constructing learning algorithms; and what is important in learning theory.
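The topics on bounding the rate of convergence and controlling generalization ability refer to results of the following form (a standard VC-type bound stated from general knowledge, not quoted from the text above): with probability at least \(1 - \eta\), for a function class of VC dimension \(h\) and a sample of size \(n\),

\[
R(f) \;\le\; R_{\mathrm{emp}}(f) + \sqrt{\frac{h\left(\ln\frac{2n}{h} + 1\right) - \ln\frac{\eta}{4}}{n}},
\]

and structural risk minimization chooses, from a nested sequence of function classes, the one that minimizes this bound.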