
Answers from top 8 papers

Papers (8)
The test results show that the proposed hybrid SVM ensemble has better classification accuracy than the other methods compared.
Proceedings ArticleDOI
Jianli Xiao, Yuncai Liu 
01 Sep 2012
12 Citations
Thus, compared with the SVM ensemble algorithm, the complexity of the proposed algorithm's ensemble classifier is greatly reduced.
Meanwhile, to achieve comparable or better performance, the proposed algorithm needs fewer individual classifiers to construct the ensemble than the SVM ensemble algorithm.
Open accessProceedings ArticleDOI
08 Dec 2011
23 Citations
The experimental results show that ensemble methods are more accurate than a single SVM classifier.
Comparing the experimental results of the SVM ensemble with those of the single SVM and the neural network ensemble, the proposed method outperforms both in terms of classification accuracy.
This characteristic illustrates that the SVM ensemble consistently improves classification performance.
Proceedings ArticleDOI
Bing-Yu Sun, De-Shuang Huang 
25 Jul 2004
10 Citations
Traditionally, the aggregation step of the ensemble uses all the available individual LS-SVMs, while our approach can exclude those that may degrade the ensemble's performance.
The results show that the proposed ensemble method is more successful than the single SVM and the classical generalized linear model approach.
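The insights above repeatedly compare an SVM ensemble against a single SVM. A minimal sketch of that comparison (not taken from the cited papers) can be built with scikit-learn's BaggingClassifier, which trains each SVC member on a bootstrap sample of the data; the dataset and hyperparameters here are illustrative assumptions.

```python
# Sketch: bagged ensemble of SVMs vs. a single SVM (illustrative, not the papers' setup).
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Baseline: one SVM trained on all the training data.
single = SVC(kernel="rbf").fit(X_tr, y_tr)

# Ensemble: 15 SVMs, each trained on a bootstrap sample, combined by voting.
ensemble = BaggingClassifier(SVC(kernel="rbf"), n_estimators=15,
                             random_state=0).fit(X_tr, y_tr)

print("single SVM:", single.score(X_te, y_te))
print("SVM ensemble:", ensemble.score(X_te, y_te))
```

Whether the ensemble wins on a given dataset depends on the diversity of its members, which is the point of the selective-aggregation insight above.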

Related Questions

What is the latest achievement in ensemble numerical weather prediction with machine learning? (5 answers)
The latest achievement in ensemble numerical weather prediction with machine learning involves the development of novel deep learning methods to generate realistic ensemble forecasts for high-impact weather phenomena. Researchers have proposed innovative approaches utilizing generative adversarial networks and deep learning architectures to synthesize additional ensemble members for high-resolution weather models. These methods focus on generating physically consistent and diverse ensemble members, enhancing forecast uncertainty quantification and improving forecast accuracy for events like storms and heat waves. By leveraging advanced neural networks, such as Generative Pretrained Transformers and StyleGANs, these approaches offer computationally efficient ways to produce large hybrid physical/statistical ensembles at a reduced numerical cost, ultimately enhancing the reliability and relevance of ensemble predictions in critical weather situations.
How does the SVM algorithm work? (5 answers)
The Support Vector Machine (SVM) algorithm is a robust machine learning algorithm used for classification, regression, and outlier detection. It requires tuning the regularization parameter (RP) to control model capacity and generalization performance. The optimum RP is typically found through the Cross-Validation (CV) procedure, which involves comparing a range of values. For non-linearly separable data, SVM uses kernels, and a grid of kernels with different parameters is considered. The optimal choice of RP and the grid of kernels is determined through the grid-search of CV. By analyzing the behavior of the regularization parameter, the SVM performance can be modeled as a function of the separability and scatteredness (S&S) of the data. The S&S ratio-based table provides the optimum RP, which can automatically detect linear or non-linear separability before using the SVM algorithm. The S&S ratio-based table can also provide the optimum kernel and its parameters. This approach reduces the computational complexity of the CV grid-search.
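The grid-search CV procedure described above can be sketched with scikit-learn's GridSearchCV: the regularization parameter C and a grid of kernels (with their parameters) are compared by cross-validation. The parameter values and dataset here are illustrative assumptions, not part of the cited answer.

```python
# Sketch: choosing the SVM regularization parameter and kernel by grid-search CV.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

param_grid = {
    "C": [0.1, 1, 10, 100],        # regularization parameter (RP) candidates
    "kernel": ["linear", "rbf"],   # grid of kernels to compare
    "gamma": ["scale", 0.01, 0.1], # RBF kernel parameter (ignored for linear)
}

# 5-fold cross-validation over every combination in the grid.
search = GridSearchCV(SVC(), param_grid, cv=5).fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

The S&S-ratio approach in the answer aims to replace exactly this exhaustive grid with a table lookup.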
What is ensemble learning in convolutional neural networks? (5 answers)
Ensemble learning in convolutional neural networks (CNNs) refers to the technique of combining multiple CNN models to improve performance. It involves training multiple CNN models with different architectures or training approaches and then combining their predictions to make a final decision. This approach has been applied in various domains, such as planktic foraminifera identification, plant disease recognition, and action recognition from still images. Ensemble learning helps to address the limitations of individual models, such as high bias or lack of temporal information, and can lead to higher prediction accuracy. It also allows for the deployment of lightweight models on devices with limited resources. Additionally, ensemble learning can be used in combination with transfer learning to further enhance the accuracy of CNN models in medical image analysis.
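The "combining their predictions" step above is often plain soft voting: average each model's predicted class probabilities and take the argmax. A framework-agnostic sketch with hypothetical softmax outputs (the numbers below are invented for illustration):

```python
# Sketch: soft-voting over the softmax outputs of three CNNs (hypothetical values),
# for 4 samples over 3 classes.
import numpy as np

p1 = np.array([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1], [0.3, 0.3, 0.4], [0.6, 0.3, 0.1]])
p2 = np.array([[0.6, 0.3, 0.1], [0.2, 0.7, 0.1], [0.2, 0.5, 0.3], [0.5, 0.4, 0.1]])
p3 = np.array([[0.8, 0.1, 0.1], [0.1, 0.6, 0.3], [0.1, 0.2, 0.7], [0.7, 0.2, 0.1]])

avg = (p1 + p2 + p3) / 3      # average the probabilities across members
pred = avg.argmax(axis=1)     # final ensemble decision per sample
print(pred)                   # -> [0 1 2 0]
```

Note the third sample: two of the three members individually favor other classes, yet the averaged probabilities pick class 2, which is how an ensemble can correct individual models.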
What are the different ensemble methods that can be used with Artificial neural network models in sklearn? (5 answers)
Ensemble methods that can be used with Artificial neural network models in sklearn include neural network ensembles, deep ensembles, probabilistic backpropagation, Monte Carlo dropout, distribution-free ensembles, and hyper ensemble methods. Additionally, there are approaches that focus on generating diverse ensembles through sparsity-inducing priors and exploration and exploitation phases, such as sequential ensembling of dynamic Bayesian neural subnetworks. Another approach is to build a bucket or ensemble of weak models, such as small neural networks, which can be combined to produce reliable predictions.
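The "ensemble of weak models, such as small neural networks" idea can be sketched in scikit-learn with a soft-voting ensemble of MLPClassifier instances; here diversity comes only from different random initializations, which is an assumption of this sketch rather than one of the methods named above.

```python
# Sketch: soft-voting ensemble of three small MLPs in scikit-learn.
from sklearn.datasets import make_moons
from sklearn.ensemble import VotingClassifier
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=300, noise=0.25, random_state=0)

# Three small networks that differ only in their random initialization.
members = [
    (f"mlp{seed}", MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000,
                                 random_state=seed))
    for seed in range(3)
]

# voting="soft" averages the members' predicted probabilities.
vote = VotingClassifier(members, voting="soft").fit(X, y)
print(round(vote.score(X, y), 3))
```

BaggingClassifier with an MLP base estimator is another sklearn route, trading initialization diversity for bootstrap-sample diversity.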
How can ensemble of Artificial neural network be used to improve the performance of the model? (3 answers)
Ensemble learning has gained attention in recent deep learning research as a way to boost the accuracy and generalizability of deep neural network (DNN) models. By exploring the intrinsic diversity within a model architecture, an efficient DNN ensemble can be built. Pruning and quantization, which lead to an efficient model architecture with a small accuracy drop, result in distinct behavior in the decision boundary. Heterogeneously Compressed Ensemble (HCE) is proposed, where an efficient ensemble is built with pruned and quantized variants from a pretrained DNN model. Additionally, a diversity-aware training objective is proposed to further boost the performance of the HCE ensemble. This approach achieves significant improvement in the efficiency-accuracy tradeoff compared to traditional DNN ensemble training methods and previous model compression methods.
Is SVM ensemble learning? (7 answers)