Journal ArticleDOI

The new interpretation of support vector machines on statistical learning theory

TLDR
It is shown that the decision function obtained by C-SVC is just one of the decision functions obtained by solving the optimization problem derived directly from the structural risk minimization principle.
Abstract
This paper is concerned with the theoretical foundation of support vector machines (SVMs). The purpose is to further develop an exact relationship between SVMs and statistical learning theory (SLT). As a representative, the standard C-support vector classification (C-SVC) is considered here. More precisely, we show that the decision function obtained by C-SVC is just one of the decision functions obtained by solving the optimization problem derived directly from the structural risk minimization principle. In addition, an interesting interpretation of the parameter C in C-SVC is given by showing that C corresponds to the size of the decision function candidate set in the structural risk minimization principle.
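To make the claimed relationship concrete, here is a minimal sketch of the two optimization problems involved; the standard C-SVC primal is well known, while the exact form of the SRM-side problem and the bound D below are assumptions for illustration, not taken from the paper. The C-SVC primal is

\[
\min_{w,\,b,\,\xi}\ \tfrac{1}{2}\|w\|^{2} + C\sum_{i=1}^{\ell}\xi_{i}
\quad\text{s.t.}\quad y_{i}\,(w\cdot x_{i} + b) \ge 1 - \xi_{i},\quad \xi_{i} \ge 0,\quad i=1,\dots,\ell,
\]

versus a structural-risk-minimization-style problem that restricts the candidate set explicitly,

\[
\min_{w,\,b,\,\xi}\ \sum_{i=1}^{\ell}\xi_{i}
\quad\text{s.t.}\quad y_{i}\,(w\cdot x_{i} + b) \ge 1 - \xi_{i},\quad \xi_{i} \ge 0,\quad \|w\| \le D.
\]

Under this reading, C acts as the trade-off corresponding to the size bound D on the admissible decision functions (roughly, larger C permits a larger candidate set), which is the correspondence the paper makes precise.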


Citations
Journal ArticleDOI

Improvements on Twin Support Vector Machines

TL;DR: An improved version of TWSVM, named twin bounded support vector machines (TBSVM), is proposed, in which the structural risk minimization principle is implemented by introducing a regularization term.
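As a rough, hedged illustration of where that regularization term enters (the notation and the exact constants are assumptions following the usual twin-SVM convention, with A and B collecting the positive and negative training points and e_1, e_2 vectors of ones), the first of the two TBSVM-style primal problems can be sketched as

\[
\min_{w_{1},\,b_{1},\,\xi}\ \tfrac{c_{3}}{2}\left(\|w_{1}\|^{2} + b_{1}^{2}\right)
+ \tfrac{1}{2}\|A w_{1} + e_{1} b_{1}\|^{2} + c_{1}\, e_{2}^{\top}\xi
\quad\text{s.t.}\quad -(B w_{1} + e_{2} b_{1}) + \xi \ge e_{2},\quad \xi \ge 0,
\]

where the extra term \(\tfrac{c_{3}}{2}(\|w_{1}\|^{2} + b_{1}^{2})\) is the regularization that brings structural risk minimization into the twin formulation; the original TWSVM objective omits it.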
Journal ArticleDOI

Prediction of daily global solar radiation using different machine learning algorithms: Evaluation and comparison

TL;DR: All machine learning algorithms tested in this study can be used to predict daily global solar radiation data with high accuracy; however, the ANN algorithm is the best-fitting algorithm among all those tested.
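A hypothetical, minimal sketch of the kind of comparison such a study runs is given below; the data file, column names, and hyperparameters are illustrative assumptions, not details from the cited paper.

# Hypothetical sketch (not from the cited paper): compare a few regressors for
# daily global solar radiation prediction. The CSV file and column names below
# are assumptions for illustration only.
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.ensemble import RandomForestRegressor

df = pd.read_csv("solar_daily.csv")                # assumed data file
X = df[["tmax", "tmin", "sunshine_hours"]]         # assumed meteorological predictors
y = df["global_radiation"]                         # assumed target column

models = {
    "SVR": make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.1)),
    "ANN": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)),
    "RF": RandomForestRegressor(n_estimators=200, random_state=0),
}

for name, model in models.items():
    # 5-fold cross-validated R^2 as a simple accuracy summary
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean R^2 = {scores.mean():.3f}")

In practice, the reported ranking would come from comparing such cross-validated scores (and error metrics such as RMSE) across the candidate models.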
Journal ArticleDOI

Recent advances on support vector machines research

TL;DR: The purpose of this paper is to understand SVMs from the optimization point of view and to review several representative optimization models in SVMs, together with their applications in economics, in order to promote research interest in both optimization-based SVM theory and its economic applications.

Improvements on Twin Support Vector Machines

TL;DR: For classification problems, the generalized eigenvalue proximal support vector machine (GEPSVM) and twin support vector machine (TWSVM) are regarded as milestones in the development of the powerful SVMs, as they replace one large problem with a pair of smaller ones (generalized eigenvalue problems for GEPSVM, quadratic programs for TWSVM).
Journal ArticleDOI

Electricity production based forecasting of greenhouse gas emissions in Turkey with deep learning, support vector machine and artificial neural network algorithms

TL;DR: In this article, the authors used deep learning, support vector machine (SVM), and artificial neural network (ANN) algorithms to forecast GHG emissions from the electricity production sector in Turkey.
References
Journal ArticleDOI

Support-Vector Networks

TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
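A minimal sketch of how such a polynomial support-vector classifier can be trained today with scikit-learn; the dataset and hyperparameter values are illustrative assumptions, standing in for the OCR benchmark described in the paper.

# Minimal sketch: a support-vector classifier with a polynomial kernel, run on
# scikit-learn's small built-in digits data as a stand-in (an assumption) for
# the OCR benchmarks discussed in the paper.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# C is the soft-margin trade-off parameter discussed in the abstract above;
# degree controls the polynomial input transformation.
clf = SVC(kernel="poly", degree=3, C=1.0, gamma="scale")
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))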

Statistical learning theory

TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.
Book

An Introduction to Support Vector Machines and Other Kernel-based Learning Methods

TL;DR: This is the first comprehensive introduction to support vector machines (SVMs), a new generation of learning systems based on recent advances in statistical learning theory, and will guide practitioners to updated literature, new applications, and on-line software.
Book

Perturbation Analysis of Optimization Problems

Journal ArticleDOI

Convexity, Classification, and Risk Bounds

TL;DR: A general quantitative relationship between the risk as assessed using the 0–1 loss and the risk as assessed using any nonnegative surrogate loss function is provided, and it is shown that this relationship gives nontrivial upper bounds on excess risk under the weakest possible condition on the loss function.
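The flavor of the result, stated loosely and with notation that is an assumption here rather than quoted from the paper: if \(R(f)\) is the 0–1 risk, \(R^{*}\) its infimum, \(R_{\phi}(f)\) the risk under a surrogate loss \(\phi\), and \(R_{\phi}^{*}\) its infimum, then there is a nondecreasing function \(\psi\) depending only on \(\phi\) with

\[
\psi\!\left(R(f) - R^{*}\right) \;\le\; R_{\phi}(f) - R_{\phi}^{*},
\]

so driving the excess surrogate risk to zero drives the excess 0–1 risk to zero whenever \(\psi\) is strictly increasing near the origin; for the hinge loss used by SVMs, for example, \(\psi(\theta) = |\theta|\), so the excess 0–1 risk is bounded by the excess hinge risk.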