Journal ArticleDOI

Combining MLC and SVM classifiers for learning based decision making: analysis and evaluations

01 Jan 2015 - Computational Intelligence and Neuroscience (Hindawi Publishing Corporation) - Vol. 2015, Article ID 423581
TL;DR: MLC and SVM are combined in learning and classification, which yields probabilistic output for the SVM and facilitates soft decision making; results are reported to indicate how the combined classifier may work under various conditions.
Abstract: Maximum likelihood classifier (MLC) and support vector machines (SVM) are two commonly used approaches in machine learning. MLC is based on Bayesian theory in estimating parameters of a probabilistic model, whilst SVM is an optimization-based nonparametric method in this context. Recently, it has been found that SVM in some cases is equivalent to MLC in probabilistically modeling the learning process. In this paper, MLC and SVM are combined in learning and classification, which helps to yield probabilistic output for the SVM and facilitates soft decision making. In total, four groups of data are used for evaluations, covering sonar, vehicle, breast cancer, and DNA sequences. The data samples are characterized as Gaussian or non-Gaussian distributed and balanced or unbalanced, and these characteristics are then used for performance assessment in comparing the SVM and the combined SVM-MLC classifier. Interesting results are reported to indicate how the combined classifier may work under various conditions.
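
As an illustration of the core idea, the sketch below maps an SVM's decision values through a fitted sigmoid so they can be used as posterior probabilities for soft decision making. This is a minimal, hypothetical example assuming scikit-learn (whose SVC wraps LIBSVM) and synthetic data; it is not the authors' exact pipeline.

```python
# Minimal sketch: Platt-style probabilistic output for an SVM,
# enabling soft decision making. Synthetic data; illustrative only.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Stage 1: train a plain SVM and collect its decision values g_SVM(x).
svm = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
g_tr = svm.decision_function(X_tr).reshape(-1, 1)

# Stage 2: fit a sigmoid p(y=1|x) = 1/(1 + exp(-(w*g + b))) on the
# decision values; a 1-D logistic regression is exactly this map.
platt = LogisticRegression().fit(g_tr, y_tr)

# Soft decision: posterior probabilities instead of hard labels.
g_te = svm.decision_function(X_te).reshape(-1, 1)
posteriors = platt.predict_proba(g_te)[:, 1]
print(posteriors[:5])          # probabilistic output
print((posteriors > 0.5)[:5])  # hard decision, if one is needed
```

Note that scikit-learn's SVC(probability=True) performs a cross-validated variant of this calibration internally; the explicit two-stage version is shown here only to mirror the SVM-then-probabilistic-model structure discussed in the paper.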


Citations
Journal ArticleDOI
TL;DR: A 64-quadrature-amplitude-modulation-based radio over fiber (RoF) system is demonstrated over 10 km of standard single-mode fiber, utilizing a support vector machine (SVM) method to achieve effective nonlinearity mitigation in front-hauls.

16 citations

Journal ArticleDOI
27 Aug 2020 - PLOS ONE
TL;DR: The new strategy is based on the SVM and conducts a modified stepwise selection, where the tuning parameter can be determined by 10-fold cross-validation that minimizes the mean squared error.
Abstract: An essential aspect of medical research is the prediction of a health outcome and the scientific identification of important factors. As a result, numerous methods have been developed for model selection in recent years. In the era of big data, machine learning has been broadly adopted for data analysis. In particular, the Support Vector Machine (SVM) has excellent performance in classification and prediction with high-dimensional data. In this research, a novel model selection strategy is carried out, named the Stepwise Support Vector Machine (StepSVM). The new strategy is based on the SVM and conducts a modified stepwise selection, where the tuning parameter can be determined by 10-fold cross-validation that minimizes the mean squared error. Two popular methods, the conventional stepwise logistic regression model and SVM Recursive Feature Elimination (SVM-RFE), were compared to the StepSVM. The stability and accuracy of the three strategies were evaluated by simulation studies with a complex hierarchical structure. Up to five variables were selected to predict the dichotomous cancer remission of a lung cancer patient. For the stepwise logistic regression, the mean C-statistic was 69.19%. The overall accuracy of the SVM-RFE was estimated at 70.62%. In contrast, the StepSVM provided the highest prediction accuracy of 80.57%. Although the StepSVM is more time-consuming, it is more consistent and outperforms the other two methods.
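
The abstract describes the procedure only at a high level; the sketch below shows what one forward pass of such a selection could look like, assuming scikit-learn and using 10-fold cross-validated error as the selection criterion. The helper name cv_error and the stopping rule are illustrative assumptions, not the published StepSVM algorithm.

```python
# Hypothetical sketch of forward stepwise selection wrapped around an
# SVM, scored by 10-fold cross-validation (illustrative only; the
# published StepSVM procedure may differ in detail).
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=12, random_state=1)

def cv_error(features):
    """10-fold CV error of an SVM restricted to the given features.
    For 0/1 labels this equals the mean squared prediction error."""
    acc = cross_val_score(SVC(kernel="rbf"), X[:, features], y, cv=10)
    return 1.0 - acc.mean()

selected, remaining = [], list(range(X.shape[1]))
for _ in range(5):  # select up to five variables, as in the study
    best = min(remaining, key=lambda f: cv_error(selected + [f]))
    if selected and cv_error(selected + [best]) >= cv_error(selected):
        break  # stop once no candidate improves the CV error
    selected.append(best)
    remaining.remove(best)
print("selected features:", selected)
```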

15 citations


Cites methods from "Combining MLC and SVM classifiers f..."

  • ...The machine learning technique is somewhat desired [10] since the kernel function is powerful in classification problems [11]....


Journal ArticleDOI
TL;DR: In this article, the authors map land and aquatic vegetation of coastal areas using remote sensing for better management and conservation in many parts of the world, such as India, Australia, and New Zealand.
Abstract: Mapping land and aquatic vegetation of coastal areas using remote sensing for better management and conservation has been a long-standing interest in many parts of the world. Due to natural complex...

14 citations


Cites background or methods or result from "Combining MLC and SVM classifiers f..."

  • ...Due to a relatively competitive strength of the MLC and SVM methods, as shown in a number of previous studies, the performance of these two supervised classifiers appears to be different from one case study to another (Zhang, Ren, and Jiang 2015)....


  • ...Differently from MLC, which is a parametric method based on Bayesian approach, SVM is an optimization-based non-parametric machine learning method (Zhang, Ren, and Jiang 2015)....


  • ...The findings in this study are in accordance with several other studies (Pal and Mather 2005; Zhang, Ren, and Jiang 2015); the SVM method outperforms MLC for all three sensors regardless of the spectral range or spectral and spatial resolutions of the data....


  • ...…SVM classifiers have different underlying statistics, which is important to consider when exploring the performance of a new sensor such as HSI2 (Zhang, Ren, and Jiang 2015); 2) The two classifiers are commonly competing in their performance although SVM performed better for hyperspectral data…...


Journal ArticleDOI
01 Feb 2019 - Cortex
TL;DR: fMRI is used in conjunction with the machine learning technique of multi-voxel pattern analysis (MVPA) to ascertain how well neuronal patterns can classify individuals' intertemporal decisions across two independent samples; the results show that information encoded in three brain functional networks predicts individuals' decisions with significant discriminative power across samples.

10 citations

Journal ArticleDOI
TL;DR: This work proposes a novel tensor-based logistic regression algorithm via Tucker decomposition for multimedia classification and indicates that the proposed algorithm outperforms existing state-of-the-art methods in relevant areas.

8 citations


Cites background from "Combining MLC and SVM classifiers f..."

  • ...As reported in the literature, many vector-based multimedia analysis and classification methods have been presented, of which representative techniques include K-nearest neighbor (KNN) [9][10], support vector regression (SVR) [11][12], and logistic regression (LR) [13] etc....


References
Journal ArticleDOI
TL;DR: Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
Abstract: LIBSVM is a library for Support Vector Machines (SVMs). We have been actively developing this package since the year 2000. The goal is to help users easily apply SVM to their applications. LIBSVM has gained wide popularity in machine learning and many other areas. In this article, we present all implementation details of LIBSVM. Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
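
For readers unfamiliar with the library, the hedged example below touches each of the topics the abstract lists: multiclass classification, probability estimates, and parameter selection. It uses scikit-learn's SVC, which is built on LIBSVM; LIBSVM's own command-line tools and bindings expose the same functionality.

```python
# Illustration of LIBSVM-backed usage via scikit-learn's SVC:
# multiclass classification, probability estimates, and parameter
# selection by cross-validated grid search.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)  # three classes
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Parameter selection: search over C and the RBF kernel width gamma.
grid = GridSearchCV(
    SVC(kernel="rbf", probability=True),  # Platt-style probabilities
    {"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1]},
    cv=5,
).fit(X_tr, y_tr)

print("best parameters:", grid.best_params_)
print("test accuracy:", grid.score(X_te, y_te))
print("class probabilities:\n", grid.predict_proba(X_te[:3]))
```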

40,826 citations


"Combining MLC and SVM classifiers f..." refers methods in this paper

  • ...Among these four datasets, SamplesNew is a dataset of suspicious microcalcification clusters extracted from [16] and svmguide3 is a demo dataset of a practical SVM guide [28], whilst the sonar and splice datasets come from the UCI repository of machine learning databases [29]....


  • ...Stage 1: SVM for initial training and classification. The open source library libSVM [28] is used for initial training and classification of the aforementioned four datasets, and both the linear and the Gaussian radial basis function (RBF) kernels are tested....


Journal ArticleDOI
TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Abstract: The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimension feature space. In this feature space a linear decision surface is constructed. Special properties of the decision surface ensure high generalization ability of the learning machine. The idea behind the support-vector network was previously implemented for the restricted case where the training data can be separated without errors. We here extend this result to non-separable training data. High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated. We also compare the performance of the support-vector network to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
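
A minimal sketch of the construction the abstract describes, a soft-margin SVM with a polynomial input transformation, can be run as follows; this uses scikit-learn on toy data rather than the paper's OCR benchmark, so treat it as illustrative only.

```python
# Soft-margin SVM with a polynomial kernel: inputs are implicitly
# mapped to a high-dimensional feature space in which a linear
# decision surface is constructed (toy data, not the original setup).
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# C controls the soft margin: smaller C tolerates more training
# errors, the non-separable extension introduced in this paper.
clf = SVC(kernel="poly", degree=3, C=1.0).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
print("support vectors per class:", clf.n_support_)
```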

37,861 citations


"Combining MLC and SVM classifiers f..." refers background in this paper


  • ...In Cortes and Vapnik [22], the principles of SVM are comprehensively discussed....


  • ...Machine Learning, 2011; [22] Cortes, C., Vapnik, V., "Support-vector networks," Machine Learning, 20: 273-297, 1995; [23] Hsu, C.-W., Lin, C.-J., "A comparison of methods for multiclass support vector machines," IEEE Transactions on Neural Networks, 13(2): 415-425, 2002; [24] Lee, Y., Lin, Y., Wahba, G., "Multicategory support vector machines: theory, and application to the classification of microarray data and satellite radiance data," J. Amer....


Journal ArticleDOI
TL;DR: Decomposition implementations for two "all-together" multiclass SVM methods are given, and it is shown that for large problems, methods that consider all data at once generally need fewer support vectors.
Abstract: Support vector machines (SVMs) were originally designed for binary classification. How to effectively extend them to multiclass classification is still an ongoing research issue. Several methods have been proposed where typically we construct a multiclass classifier by combining several binary classifiers. Some authors have also proposed methods that consider all classes at once. As it is computationally more expensive to solve multiclass problems, comparisons of these methods using large-scale problems have not been seriously conducted. Especially for methods solving multiclass SVM in one step, a much larger optimization problem is required, so up to now experiments have been limited to small data sets. In this paper we give decomposition implementations for two such "all-together" methods. We then compare their performance with three methods based on binary classifications: "one-against-all," "one-against-one," and directed acyclic graph SVM (DAGSVM). Our experiments indicate that the "one-against-one" and DAG methods are more suitable for practical use than the other methods. Results also show that for large problems, methods that consider all data at once generally need fewer support vectors.
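
The two decomposition strategies the experiments favour can be compared directly in a few lines; the sketch below assumes scikit-learn, whose SVC trains "one-against-one" internally while OneVsRestClassifier builds the "one-against-all" scheme (toy data, default settings, illustrative only).

```python
# Hedged comparison of "one-against-one" vs "one-against-all"
# multiclass SVM decompositions on a small 10-class dataset.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)  # 10 classes
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# SVC fits one binary SVM per pair of classes: one-against-one.
ovo = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
# OneVsRestClassifier fits one SVM per class: one-against-all.
ova = OneVsRestClassifier(SVC(kernel="rbf", gamma="scale")).fit(X_tr, y_tr)

print("one-against-one accuracy:", ovo.score(X_te, y_te))
print("one-against-all accuracy:", ova.score(X_te, y_te))
```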

6,562 citations


"Combining MLC and SVM classifiers f..." refers background or methods in this paper

  • ...Results from a RBF-kernelled SVM and the MLC: In this group of experiments, the RBF kernel is used for the SVM in the combined classifier as it is popularly used in various classification problems [16, 23]....


  • ...Some useful further readings can be found in [23], [24] and [25]....


01 Jan 1999 - Probabilistic Outputs for Support Vector Machines and Comparisons to Regularized Likelihood Methods

4,584 citations


"Combining MLC and SVM classifiers f..." refers methods in this paper


  • ...In Platt [26], a posterior class probability $p_i$ is estimated by the sigmoid function below....


  • ...[25] Crammer, K., Singer, Y., "On the algorithmic implementation of multiclass kernel-based vector machines," Journal of Machine Learning Research, 2: 265-292, 2001; [26] Platt, J., "Probabilistic outputs for support vector machines and comparisons to regularized likelihood methods," in: A. Smola, P. Bartlett, B. Scholkopf, and D. Schuurmans (eds.)....

    [...]

  • ...In addition, in Lin et al. [27] Platt's approach is further improved to avoid numerical difficulty, i.e. overflow or underflow, in determining $p_i$ in cases where $E_i = A\,g_{\mathrm{SVM}}(\mathbf{x}_i) + B$ is either too large or too small:

    $$p_i = \begin{cases} \dfrac{e^{-E_i}}{1 + e^{-E_i}}, & \text{if } E_i \ge 0 \\ \dfrac{1}{1 + e^{E_i}}, & \text{otherwise} \end{cases} \qquad (24)$$

    Although there are significant differences between SVM and MLC, the probabilistic model above has uncovered the connection between these two classifiers....


  • ...Cambridge, MA, 2000; [27] Lin, H.-T., Lin, C.-J., Weng, R. C., "A note on Platt's probabilistic outputs for support vector machines," Machine Learning, 68(3): 267-276, 2007....


Journal Article
TL;DR: This paper describes the algorithmic implementation of multiclass kernel-based vector machines using a generalized notion of margin for multiclass problems, describes an efficient fixed-point algorithm for solving the reduced optimization problems, and proves its convergence.
Abstract: In this paper we describe the algorithmic implementation of multiclass kernel-based vector machines. Our starting point is a generalized notion of the margin for multiclass problems. Using this notion we cast multiclass categorization problems as a constrained optimization problem with a quadratic objective function. Unlike most previous approaches, which typically decompose a multiclass problem into multiple independent binary classification tasks, our notion of margin yields a direct method for training multiclass predictors. By using the dual of the optimization problem we are able to incorporate kernels with a compact set of constraints and decompose the dual problem into multiple optimization problems of reduced size. We describe an efficient fixed-point algorithm for solving the reduced optimization problems and prove its convergence. We then discuss technical details that yield significant running time improvements for large datasets. Finally, we describe various experiments with our approach comparing it to previously studied kernel-based methods. Our experiments indicate that for multiclass problems we attain state-of-the-art accuracy.
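
For the linear case, an implementation of this "all-together" formulation is available in scikit-learn's LinearSVC; the hedged example below contrasts it with a one-vs-rest baseline (toy data; the option name multi_class="crammer_singer" reflects current scikit-learn and may change across releases).

```python
# Crammer-Singer "all-together" multiclass formulation versus a
# one-vs-rest baseline, linear kernels only (illustrative sketch).
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

X, y = load_wine(return_X_y=True)  # three classes
X = StandardScaler().fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

cs = LinearSVC(multi_class="crammer_singer").fit(X_tr, y_tr)
ovr = LinearSVC(multi_class="ovr").fit(X_tr, y_tr)

print("Crammer-Singer accuracy:", cs.score(X_te, y_te))
print("one-vs-rest accuracy:", ovr.score(X_te, y_te))
```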

2,214 citations


"Combining MLC and SVM classifiers f..." refers background in this paper

  • ...Some useful further readings can be found in [23], [24] and [25]....

