Proceedings ArticleDOI

Emphatic Constraints Support Vector Machines for Multi-class Classification

TL;DR: The Emphatic Constraints Support Vector Machines (ECSVM) are first proposed as a new, powerful classification method and then extended to build efficient multi-class classifiers; the obtained results show the superiority of the method.
Abstract: Support vector machine (SVM) formulation was originally developed for binary classification problems. Finding a direct formulation for the multi-class case is not easy and remains an ongoing research issue. This paper presents a novel approach to multi-class SVM by modifying the training phase of the SVM. First, we propose the Emphatic Constraints Support Vector Machines (ECSVM) as a new powerful classification method. Then, we extend our method to find efficient multi-class classifiers. We evaluate the performance of the proposed scheme on real-world data sets. The obtained results show the superiority of our method.
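For context, the sketch below shows the standard one-vs-one ("Max Wins") decomposition that direct multi-class formulations such as ECSVM are typically compared against. The ECSVM constraints themselves are not reproduced here, since the abstract does not state the formulation; scikit-learn's SVC and the class name OneVsOneSVM are illustrative choices, not the paper's implementation.

```python
# Minimal sketch: one-vs-one multi-class SVM with "Max Wins" voting,
# the standard baseline for direct multi-class formulations.
from itertools import combinations
import numpy as np
from sklearn.svm import SVC

class OneVsOneSVM:
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.models_ = {}
        # One binary SVM per pair of classes: k(k-1)/2 classifiers in total.
        for a, b in combinations(self.classes_, 2):
            mask = np.isin(y, [a, b])
            self.models_[(a, b)] = SVC(kernel="rbf").fit(X[mask], y[mask])
        return self

    def predict(self, X):
        # Each pairwise classifier casts one vote; the class with the
        # most votes ("Max Wins") is the prediction.
        votes = np.zeros((len(X), len(self.classes_)), dtype=int)
        index = {c: i for i, c in enumerate(self.classes_)}
        for (a, b), model in self.models_.items():
            for row, p in enumerate(model.predict(X)):
                votes[row, index[p]] += 1
        return self.classes_[votes.argmax(axis=1)]
```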


Citations
Journal ArticleDOI
17 Feb 2021
TL;DR: In this article, a noise-aware version of support vector machines (SVM) is utilized for feature selection, and a new algorithm for removing irrelevant features is proposed by combining this method with sequential backward search (SBS).
Abstract: A noise-aware version of support vector machines is utilized for feature selection in this paper. Combining this method with sequential backward search (SBS), a new algorithm for removing irrelevant features is proposed. Although feature selection methods in the literature which utilize support vector machines have provided acceptable results, noisy samples and outliers may affect the performance of the SVM and, consequently, of the feature selection method. Recently, we have proposed the relaxed constraints SVM (RSVM), which handles noisy data and outliers. Each training sample in RSVM is associated with a degree of importance obtained with the fuzzy c-means clustering method, so a lower importance degree is assigned to noisy data and outliers. Moreover, RSVM has more relaxed constraints, which can reduce the effect of noisy samples. Feature selection increases the accuracy of different machine learning applications by eliminating noisy and irrelevant features. In the proposed RSVM-SBS feature selection algorithm, noisy data have little effect on the elimination of irrelevant features. Experimental results using real-world data verify that RSVM-SBS compares favorably with other feature selection approaches that utilize support vector machines.
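The RSVM-SBS procedure lends itself to a short sketch. The following is a minimal, hypothetical Python rendering: RSVM is approximated by scikit-learn's SVC with per-sample weights, and the fuzzy c-means importance degrees are replaced by a simple k-means distance heuristic. The function names, clustering choice, and hold-out evaluation are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def importance_degrees(X, n_clusters=3):
    # Stand-in for the paper's fuzzy c-means memberships: samples far
    # from their nearest cluster centre get lower importance.
    km = KMeans(n_clusters=n_clusters, n_init=10).fit(X)
    d = np.linalg.norm(X - km.cluster_centers_[km.labels_], axis=1)
    return 1.0 / (1.0 + d)

def sbs_rsvm(X, y, n_keep):
    # Sequential backward search: start from all features and greedily
    # drop the one whose removal hurts validation accuracy the least.
    Xtr, Xva, ytr, yva = train_test_split(X, y, test_size=0.3, random_state=0)
    w = importance_degrees(Xtr)          # noise-aware sample weights
    feats = list(range(X.shape[1]))
    while len(feats) > n_keep:
        scores = []
        for f in feats:
            trial = [g for g in feats if g != f]
            clf = SVC(kernel="rbf").fit(Xtr[:, trial], ytr, sample_weight=w)
            scores.append((clf.score(Xva[:, trial], yva), f))
        _, drop = max(scores)            # removal that costs least accuracy
        feats.remove(drop)
    return feats
```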

6 citations

References
Posted Content
TL;DR: It is demonstrated that error-correcting output codes provide a general-purpose method for improving the performance of inductive learning programs on multiclass problems.
Abstract: Multiclass learning problems involve finding a definition for an unknown function f(x) whose range is a discrete set containing k > 2 values (i.e., k ``classes''). The definition is acquired by studying collections of training examples of the form [x_i, f (x_i)]. Existing approaches to multiclass learning problems include direct application of multiclass algorithms such as the decision-tree algorithms C4.5 and CART, application of binary concept learning algorithms to learn individual binary functions for each of the k classes, and application of binary concept learning algorithms with distributed output representations. This paper compares these three approaches to a new technique in which error-correcting codes are employed as a distributed output representation. We show that these output representations improve the generalization performance of both C4.5 and backpropagation on a wide range of multiclass learning tasks. We also demonstrate that this approach is robust with respect to changes in the size of the training sample, the assignment of distributed representations to particular classes, and the application of overfitting avoidance techniques such as decision-tree pruning. Finally, we show that---like the other methods---the error-correcting code technique can provide reliable class probability estimates. Taken together, these results demonstrate that error-correcting output codes provide a general-purpose method for improving the performance of inductive learning programs on multiclass problems.
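As a rough sketch of the error-correcting output code idea: each class is assigned a codeword, one binary learner is trained per code bit, and prediction decodes to the nearest codeword in Hamming distance. The base learner below is an SVM for brevity, whereas the paper evaluated C4.5 and backpropagation; the code matrix is supplied by the caller.

```python
import numpy as np
from sklearn.svm import SVC

def fit_ecoc(X, y, code):
    # code: (k, n_bits) matrix over {-1, +1}; every column must contain
    # both signs so each bit defines a two-class problem.
    classes = np.unique(y)
    row = {c: i for i, c in enumerate(classes)}
    learners = []
    for bit in range(code.shape[1]):
        target = np.array([code[row[c], bit] for c in y])
        learners.append(SVC(kernel="linear").fit(X, target))
    return classes, learners

def predict_ecoc(X, code, classes, learners):
    # Decode each sample to the class whose codeword is nearest in
    # Hamming distance to the predicted bit string.
    bits = np.column_stack([m.predict(X) for m in learners])
    dists = (bits[:, None, :] != code[None, :, :]).sum(axis=2)
    return classes[dists.argmin(axis=1)]

# Example 3-class code with 3 bits (here the one-vs-rest pattern):
# code = np.array([[+1, -1, -1],
#                  [-1, +1, -1],
#                  [-1, -1, +1]])
```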

2,455 citations


"Emphatic Constraints Support Vector..." refers methods in this paper

  • ...Although other methods exist, for instance, the error-correcting code techniques [10]....


Proceedings Article
29 Nov 1999
TL;DR: An algorithm, DAGSVM, is presented, which operates in a kernel-induced feature space and uses two-class maximal margin hyperplanes at each decision node of the DDAG; it is substantially faster to train and evaluate than either the standard algorithm or Max Wins, while maintaining comparable accuracy to both.
Abstract: We present a new learning architecture: the Decision Directed Acyclic Graph (DDAG), which is used to combine many two-class classifiers into a multiclass classifier. For an N-class problem, the DDAG contains N(N - 1)/2 classifiers, one for each pair of classes. We present a VC analysis of the case when the node classifiers are hyperplanes; the resulting bound on the test error depends on N and on the margin achieved at the nodes, but not on the dimension of the space. This motivates an algorithm, DAGSVM, which operates in a kernel-induced feature space and uses two-class maximal margin hyperplanes at each decision-node of the DDAG. The DAGSVM is substantially faster to train and evaluate than either the standard algorithm or Max Wins, while maintaining comparable accuracy to both of these algorithms.
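The DDAG evaluation rule is compact enough to sketch: keep a list of candidate classes and, at each node, let the pairwise classifier for the first and last candidates eliminate one of them, so an N-class prediction needs only N - 1 node evaluations. The pairwise-classifier interface below is a hypothetical stand-in, not the paper's code.

```python
def ddag_predict(x, classes, pairwise):
    """Evaluate a Decision DAG on one sample.

    pairwise[(a, b)](x) returns whichever of a, b the binary classifier
    for that pair prefers; keys are assumed to exist for both orderings.
    """
    remaining = list(classes)
    while len(remaining) > 1:
        a, b = remaining[0], remaining[-1]
        winner = pairwise[(a, b)](x)
        # The losing class is eliminated from further consideration.
        remaining.remove(a if winner == b else b)
    return remaining[0]
```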

1,857 citations


"Emphatic Constraints Support Vector..." refers methods in this paper

  • ...Another popular solution is Directed Acyclic Graph Support Vector Machines (DAG SVM) [12] that uses a decision tree in the testing stage....


Journal ArticleDOI
TL;DR: In this article, the authors discuss a strategy for polychotomous classification that involves estimating class probabilities for each pair of classes, and then coupling the estimates together, similar to the Bradley-Terry method for paired comparisons.
Abstract: We discuss a strategy for polychotomous classification that involves estimating class probabilities for each pair of classes, and then coupling the estimates together. The coupling model is similar to the Bradley-Terry method for paired comparisons. We study the nature of the class probability estimates that arise, and examine the performance of the procedure in real and simulated data sets. Classifiers used include linear discriminants, nearest neighbors, adaptive nonlinear methods and the support vector machine.
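The coupling step can be sketched as an iterative scaling procedure: given pairwise estimates r_ij ~ P(class i | class i or j), find class probabilities p whose induced mu_ij = p_i / (p_i + p_j) match the r_ij, in the spirit of the Bradley-Terry model. The version below assumes equal pairwise sample counts, a simplifying assumption relative to the article.

```python
import numpy as np

def couple(r, n_iter=100):
    # r: (k, k) matrix with r[i, j] ~ P(class i | i or j) and
    # r[j, i] = 1 - r[i, j]; the diagonal is ignored.
    k = r.shape[0]
    p = np.full(k, 1.0 / k)              # start from uniform probabilities
    for _ in range(n_iter):
        for i in range(k):
            num = sum(r[i, j] for j in range(k) if j != i)
            den = sum(p[i] / (p[i] + p[j]) for j in range(k) if j != i)
            p[i] *= num / den            # scale p_i toward agreement
        p /= p.sum()                     # renormalize after each sweep
    return p
```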

1,569 citations

Journal ArticleDOI
TL;DR: This paper applies a fuzzy membership to each input point and reformulates the SVMs such that different input points can make different contributions to the learning of the decision surface.
Abstract: A support vector machine (SVM) learns the decision surface from two distinct classes of the input points. In many applications, each input point may not be fully assigned to one of these two classes. In this paper, we apply a fuzzy membership to each input point and reformulate the SVMs such that different input points can make different contributions to the learning of decision surface. We call the proposed method fuzzy SVMs (FSVMs).
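The reformulation scales each slack penalty by a membership s_i, so the penalty term becomes C * sum_i s_i * xi_i. Below is a minimal sketch, assuming scikit-learn's per-sample weights reproduce this scaling (an effective C_i = C * s_i) and using one simple membership rule, distance to the class mean; other membership functions are possible.

```python
import numpy as np
from sklearn.svm import SVC

def fuzzy_memberships(X, y, eps=1e-3):
    # Assign each sample a membership in (0, 1]: points far from their
    # class mean (likely noise or outliers) get smaller values.
    s = np.empty(len(y))
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        d = np.linalg.norm(X[idx] - X[idx].mean(axis=0), axis=1)
        s[idx] = 1.0 - d / (d.max() + eps)
    return s

# Usage: per-sample weights scale each point's slack penalty.
# clf = SVC(C=10.0).fit(X, y, sample_weight=fuzzy_memberships(X, y))
```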

1,374 citations


"Emphatic Constraints Support Vector..." refers background in this paper

  • ...Lin and Wang proposed Fuzzy SVM (FSVM) [8, 9] to overcome this problem....


  • ...SVM lacks this kind of ability and it is another problem with SVM [8]....
