Choosing Multiple Parameters for Support Vector Machines
TL;DR
The problem of automatically tuning multiple parameters for pattern recognition Support Vector Machines (SVMs) is considered by minimizing estimates of the generalization error of SVMs with a gradient descent algorithm over the set of parameters.

Abstract
The problem of automatically tuning multiple parameters for pattern recognition Support Vector Machines (SVMs) is considered. This is done by minimizing some estimates of the generalization error of SVMs using a gradient descent algorithm over the set of parameters. Usual methods for choosing parameters, based on exhaustive search, become intractable as soon as the number of parameters exceeds two. Experimental results assess the feasibility of our approach for a large number of parameters (more than 100) and demonstrate an improvement of generalization performance.
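The approach in the abstract can be sketched numerically. The toy below is an illustration under loud assumptions: a regularized kernel least-squares classifier stands in for the SVM, validation squared error stands in for the paper's generalization-error estimates, and central finite differences stand in for the analytic gradients the paper derives. It tunes one RBF width per input feature (the "more than 100 parameters" setting, here with 10) by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(X, Z, theta):
    # Anisotropic RBF kernel with one width parameter theta[j] per feature.
    d2 = (X[:, None, :] - Z[None, :, :]) ** 2
    return np.exp(-np.einsum('ijk,k->ij', d2, theta))

def val_loss(theta, Xtr, ytr, Xva, yva, lam=1e-2):
    # Kernel least-squares classifier as a stand-in for the SVM; mean
    # squared validation error as a smooth stand-in for the paper's
    # generalization-error estimates.
    K = rbf(Xtr, Xtr, theta)
    alpha = np.linalg.solve(K + lam * np.eye(len(ytr)), ytr)
    return np.mean((rbf(Xva, Xtr, theta) @ alpha - yva) ** 2)

# Toy data: the label depends on only 2 of the 10 input features.
X = rng.normal(size=(80, 10))
y = np.sign(X[:, 0] + X[:, 1])
Xtr, ytr, Xva, yva = X[:50], y[:50], X[50:], y[50:]

theta = np.full(10, 0.5)
init_loss = val_loss(theta, Xtr, ytr, Xva, yva)
for _ in range(25):
    # Central-difference gradient of the error estimate w.r.t. theta
    # (the paper uses analytic gradients instead).
    g = np.zeros_like(theta)
    for j in range(theta.size):
        e = np.zeros_like(theta)
        e[j] = 1e-4
        g[j] = (val_loss(theta + e, Xtr, ytr, Xva, yva)
                - val_loss(theta - e, Xtr, ytr, Xva, yva)) / 2e-4
    # Backtracking step keeps the widths positive and the loss decreasing.
    cur, step = val_loss(theta, Xtr, ytr, Xva, yva), 1.0
    while step > 1e-8:
        cand = np.clip(theta - step * g, 1e-6, None)
        if val_loss(cand, Xtr, ytr, Xva, yva) < cur:
            theta = cand
            break
        step /= 2
final_loss = val_loss(theta, Xtr, ytr, Xva, yva)
```

Gradient descent drives the widths of irrelevant features toward zero, which is why the same machinery doubles as feature selection when the number of kernel parameters matches the input dimension.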
Citations
Book Chapter
Faster Support Vector Machines
TL;DR: This work presents a faster multilevel support vector machine that uses a label propagation algorithm to construct the problem hierarchy; experiments indicate that this approach is up to orders of magnitude faster than the previous fastest algorithm while achieving comparable classification quality.
Journal Article
Kernel C-Means Clustering Algorithms for Hesitant Fuzzy Information in Decision Making
Chaoqun Li, Hua Zhao, Zeshui Xu +2 more
TL;DR: A novel hesitant fuzzy clustering algorithm, hesitant fuzzy kernel C-means clustering (HFKCM), is proposed; by means of kernel functions it maps the data from the sample space to a high-dimensional feature space, making the clustering results much more accurate.
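The kernel mapping this summary refers to never has to be computed explicitly: distances in the high-dimensional feature space follow from the kernel alone via ||φ(x) − φ(z)||² = k(x,x) − 2k(x,z) + k(z,z). A generic sketch of that identity (not the hesitant-fuzzy HFKCM algorithm itself):

```python
import numpy as np

def rbf(x, z, gamma=1.0):
    # Standard RBF kernel k(x, z) = exp(-gamma * ||x - z||^2).
    return np.exp(-gamma * np.sum((x - z) ** 2))

def feature_space_dist2(x, z, k):
    # Kernel trick: squared distance in feature space without forming phi.
    # ||phi(x) - phi(z)||^2 = k(x,x) - 2 k(x,z) + k(z,z)
    return k(x, x) - 2 * k(x, z) + k(z, z)

x = np.array([0.0, 0.0])
z = np.array([1.0, 0.0])
d2 = feature_space_dist2(x, z, rbf)  # = 2 - 2*exp(-1)
```

A kernel C-means update uses exactly this kind of expression to assign points to the nearest feature-space centroid.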
Journal Article
An EnKF-based scheme to optimize hyper-parameters and features for SVM classifier
TL;DR: A novel scheme optimizes hyper-parameters and features using the Ensemble Kalman Filter (EnKF), an iterative optimization algorithm for high-dimensional nonlinear systems, and proposes ensemble evolution to converge to the global optimum.
Proceedings Article
Minimum reference set based feature selection for small sample classifications
Xue-wen Chen, Jong Cheol Jeong +1 more
TL;DR: This work proposes a novel method using a minimum reference set (MRS) generated by the nearest neighbor rule, which is related to the structural risk minimization principle and thus leads to good generalization in microarray-based cancer classification problems.
Posted Content
The two-sample problem for Poisson processes: adaptive tests with a non-asymptotic wild bootstrap approach
TL;DR: In this paper, the authors considered two independent Poisson processes and addressed the question of testing equality of their respective intensities, and proposed single tests whose test statistics are U-statistics based on general kernel functions.
References
Book
The Nature of Statistical Learning Theory
TL;DR: Covers the setting of the learning problem, consistency of learning processes, bounds on the rate of convergence of learning processes, controlling the generalization ability of learning processes, constructing learning algorithms, and what is important in learning theory.
Journal Article
Support-Vector Networks
Corinna Cortes, Vladimir Vapnik +1 more
TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Statistical Learning Theory
TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimates from small data pools, applying these estimations to real-life problems, and much more.
Book
An Introduction to Support Vector Machines and Other Kernel-based Learning Methods
TL;DR: This is the first comprehensive introduction to Support Vector Machines (SVMs), a new generation learning system based on recent advances in statistical learning theory, and will guide practitioners to updated literature, new applications, and on-line software.
Journal Article
Molecular classification of cancer: class discovery and class prediction by gene expression monitoring.
Todd R. Golub, Donna K. Slonim, Pablo Tamayo, Christine Huard, Michelle Gaasenbeek, Jill P. Mesirov, Hilary A. Coller, Mignon L. Loh, James R. Downing, Michael A. Caligiuri, Clara D. Bloomfield, Eric S. Lander +12 more
TL;DR: A generic approach to cancer classification based on gene expression monitoring by DNA microarrays is described and applied to human acute leukemias as a test case and suggests a general strategy for discovering and predicting cancer classes for other types of cancer, independent of previous biological knowledge.