
Proceedings ArticleDOI

Emphatic Constraints Support Vector Machines for Multi-class Classification

25 Nov 2009, pp. 118–123

TL;DR: Emphatic Constraints Support Vector Machines (ECSVM) are first proposed as a new, powerful classification method and then extended to find efficient multi-class classifiers; the obtained results show the superiority of the method.

Abstract—Support vector machine (SVM) formulation has been originally developed for binary classification problems. Finding the direct formulation for multi-class case is not easy but still an on-going research issue. This paper presents a novel approach for multi-class SVM by modifying the training phase of the SVM. First, we propose the Emphatic Constraints Support Vector Machines (ECSVM) as a new powerful classification method. Then, we extend our method to find efficient multi-class classifiers. We evaluate the performance of the proposed scheme by means of real world data sets. The obtained results show the superiority of our method.



Citations

Journal ArticleDOI
17 Feb 2021
Abstract: A noise-aware version of the support vector machine is utilized for feature selection in this paper. Combining this method with sequential backward search (SBS), a new algorithm for removing irrelevant features is proposed. Although feature selection methods in the literature that utilize support vector machines have provided acceptable results, noisy samples and outliers may affect the performance of the SVM and, consequently, of the feature selection method. Recently, we proposed the relaxed constraints SVM (RSVM), which handles noisy data and outliers. Each training sample in RSVM is associated with a degree of importance obtained via fuzzy c-means clustering, so noisy data and outliers receive a lower importance degree. Moreover, RSVM has more relaxed constraints that can reduce the effect of noisy samples. Feature selection increases the accuracy of different machine learning applications by eliminating noisy and irrelevant features. In the proposed RSVM-SBS feature selection algorithm, noisy data have little effect on the elimination of irrelevant features. Experimental results on real-world data verify that RSVM-SBS compares favorably with other feature selection approaches that utilize support vector machines.
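The sequential backward search half of this combination can be sketched in a few lines. The snippet below is a generic SBS loop; the scoring function, feature names, and weights are invented toy stand-ins for the cross-validated RSVM accuracy the paper actually uses.

```python
from itertools import combinations

def sbs(features, score, k_target):
    """Sequential backward search: repeatedly drop the feature whose
    removal hurts the score least, until k_target features remain."""
    selected = list(features)
    while len(selected) > k_target:
        # Evaluate every candidate subset with exactly one feature removed
        # and keep the best-scoring one.
        best = max(combinations(selected, len(selected) - 1), key=score)
        selected = list(best)
    return selected

# Toy scorer: pretend features "a" and "c" carry all the signal, so
# subsets keeping them score highest (stands in for classifier accuracy).
weights = {"a": 0.9, "b": 0.1, "c": 0.8, "d": 0.05}
score = lambda subset: sum(weights[f] for f in subset)

print(sbs(["a", "b", "c", "d"], score, 2))  # -> ['a', 'c']
```

Because each pass evaluates all one-feature-removed subsets, the loop is greedy and quadratic in the number of features, which is why the paper pairs it with a classifier that is cheap to retrain.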

2 citations


References

Book
Vladimir Vapnik
01 Jan 1995
TL;DR: Setting of the learning problem; consistency of learning processes; bounds on the rate of convergence of learning processes; controlling the generalization ability of learning processes; constructing learning algorithms; what is important in learning theory?

Abstract: Setting of the learning problem; consistency of learning processes; bounds on the rate of convergence of learning processes; controlling the generalization ability of learning processes; constructing learning algorithms; what is important in learning theory?

38,164 citations


"Emphatic Constraints Support Vector..." refers background or methods in this paper

  • ...The theory of support vector machine (SVM), which is based on the idea of structural risk minimization, is a new classification technique and has drawn much attention due to its good performance and solid theoretical foundations [1, 2]....


  • ...The one-against-all [1] method constructs n SVMs (where n is the number of classes)....



Journal ArticleDOI
TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Abstract: The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimension feature space. In this feature space a linear decision surface is constructed. Special properties of the decision surface ensure high generalization ability of the learning machine. The idea behind the support-vector network was previously implemented for the restricted case where the training data can be separated without errors. We here extend this result to non-separable training data. High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated. We also compare the performance of the support-vector network to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
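The extension to non-separable data summarized here is usually realized as a soft-margin (hinge-loss) objective. The sketch below minimizes that objective with a Pegasos-style stochastic subgradient loop rather than the quadratic program of the original paper; the bias term is omitted for brevity (as in the classic Pegasos formulation), and the toy data and hyperparameters are illustrative only.

```python
def train_soft_margin_svm(X, y, lam=0.1, epochs=1000):
    """Linear soft-margin SVM trained by stochastic subgradient descent
    on lam/2 * ||w||^2 + hinge loss (a stand-in for the QP solver)."""
    w = [0.0] * len(X[0])
    t = 0
    for _ in range(epochs):
        for i in range(len(X)):
            t += 1
            eta = 1.0 / (lam * t)  # decaying step size
            margin = y[i] * sum(wj * xj for wj, xj in zip(w, X[i]))
            # Shrink step: subgradient of the (lam/2)*||w||^2 regularizer.
            w = [(1 - eta * lam) * wj for wj in w]
            if margin < 1:  # point inside the margin: hinge loss is active
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
    return w

# Linearly separable toy data with labels in {-1, +1}.
X = [(1.0, 2.0), (2.0, 3.0), (-1.0, -1.5), (-2.0, -1.0)]
y = [1, 1, -1, -1]
w = train_soft_margin_svm(X, y)
sign = lambda x: 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else -1
print([sign(x) for x in X])  # -> [1, 1, -1, -1]
```

On non-separable data the same loop simply keeps a few hinge terms active at the optimum, which is exactly the slack-variable behaviour the abstract describes.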

35,157 citations


"Emphatic Constraints Support Vector..." refers background in this paper

  • ...The theory of support vector machine (SVM), which is based on the idea of structural risk minimization, is a new classification technique and has drawn much attention due to its good performance and solid theoretical foundations [1, 2]....



01 Jan 1998
TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.

Abstract: A comprehensive look at learning and generalization theory. The statistical theory of learning and generalization concerns the problem of choosing desired functions on the basis of empirical data. Highly applicable to a variety of computer science and robotics fields, this book offers lucid coverage of the theory as a whole. Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.

26,121 citations


Additional excerpts

  • ...To resolve this problem, Vapnik [3] proposed to use continuous decision functions....



01 Jan 1998

12,776 citations


"Emphatic Constraints Support Vector..." refers methods in this paper

  • ...All data sets used in the following tests are obtained from the UCI Repository of Machine Learning Databases and Domain Theories [14]....



Posted Content
TL;DR: It is demonstrated that error-correcting output codes provide a general-purpose method for improving the performance of inductive learning programs on multiclass problems.
Abstract: Multiclass learning problems involve finding a definition for an unknown function f(x) whose range is a discrete set containing k > 2 values (i.e., k "classes"). The definition is acquired by studying collections of training examples of the form [x_i, f(x_i)]. Existing approaches to multiclass learning problems include direct application of multiclass algorithms such as the decision-tree algorithms C4.5 and CART, application of binary concept learning algorithms to learn individual binary functions for each of the k classes, and application of binary concept learning algorithms with distributed output representations. This paper compares these three approaches to a new technique in which error-correcting codes are employed as a distributed output representation. We show that these output representations improve the generalization performance of both C4.5 and backpropagation on a wide range of multiclass learning tasks. We also demonstrate that this approach is robust with respect to changes in the size of the training sample, the assignment of distributed representations to particular classes, and the application of overfitting avoidance techniques such as decision-tree pruning. Finally, we show that, like the other methods, the error-correcting code technique can provide reliable class probability estimates. Taken together, these results demonstrate that error-correcting output codes provide a general-purpose method for improving the performance of inductive learning programs on multiclass problems.
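The decoding step of the error-correcting output code technique can be sketched directly: each class is assigned a codeword, the binary classifiers jointly emit a bit vector, and the class with the nearest codeword in Hamming distance wins. The codebook and class names below are invented for illustration; in practice each bit would come from a trained binary learner.

```python
def hamming(a, b):
    """Number of positions where two bit vectors disagree."""
    return sum(x != y for x, y in zip(a, b))

# Toy output code for 4 classes with 7 bits per codeword; any two
# codewords differ in 4 bits, so a single flipped bit is correctable.
codebook = {
    "cat":  (0, 0, 0, 0, 0, 0, 0),
    "dog":  (0, 1, 1, 1, 1, 0, 0),
    "bird": (1, 0, 1, 1, 0, 1, 0),
    "fish": (1, 1, 0, 1, 0, 0, 1),
}

def decode(bits):
    """Map a (possibly corrupted) bit vector from the 7 binary
    classifiers to the class with the nearest codeword."""
    return min(codebook, key=lambda c: hamming(codebook[c], bits))

# One binary classifier flipped its bit (position 3 of "dog"), but
# nearest-codeword decoding still recovers the right class.
print(decode((0, 1, 1, 0, 1, 0, 0)))  # -> 'dog'
```

This redundancy is the whole point of the technique: individual binary learners may err, yet the multiclass prediction survives as long as fewer bits flip than half the minimum codeword distance.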

2,455 citations


"Emphatic Constraints Support Vector..." refers methods in this paper

  • ...Although other methods exist, for instance, the error-correcting code techniques [10]....
