Open Access Journal ArticleDOI

Model selection for primal SVM

TLDR
This paper introduces two types of nonsmooth optimization methods for selecting model hyperparameters in primal SVM models based on cross-validation; the methods are directly applicable to other learning tasks with differentiable loss functions and regularization functions.
Abstract
This paper introduces two types of nonsmooth optimization methods for selecting model hyperparameters in primal SVM models based on cross-validation. Unlike common grid search approaches for model selection, these approaches are scalable both in the number of hyperparameters and the number of data points. Taking inspiration from linear-time primal SVM algorithms, scalability in model selection is achieved by directly working with the primal variables without introducing any dual variables. The proposed implicit primal gradient descent (ImpGrad) method can utilize existing SVM solvers. Unlike prior methods for gradient descent in hyperparameter space, all work is done in the primal space, so no inversion of the kernel matrix is required. The proposed explicit penalized bilevel programming (PBP) approach optimizes both the hyperparameters and parameters simultaneously. It solves the original cross-validation problem by solving a series of least squares regression problems with simple constraints in both the hyperparameter and parameter space. Computational results on least squares support vector regression problems with multiple hyperparameters establish that both the implicit and explicit methods perform quite well in terms of generalization and computational time. These methods are directly applicable to other learning tasks with differentiable loss functions and regularization functions. Both the implicit and explicit algorithms investigated represent powerful new approaches to solving large bilevel programs involving nonsmooth loss functions.
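The implicit gradient idea described in the abstract can be illustrated on a single hyperparameter. For least squares support vector regression with squared loss and L2 regularization, the inner primal problem on a fold reduces to ridge regression with a closed-form solution, and the validation loss can be differentiated with respect to the regularization weight via implicit differentiation of the optimality conditions. The following is a minimal NumPy sketch under those assumptions, not the authors' ImpGrad implementation; the function names and the single train/validation split are illustrative.

```python
import numpy as np

def fit_primal(X, y, lam):
    # Primal least squares solution: argmin_w ||Xw - y||^2 + lam * ||w||^2.
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def val_loss_and_grad(lam, X_tr, y_tr, X_va, y_va):
    # Validation loss V(lam) and its derivative dV/dlam.
    w = fit_primal(X_tr, y_tr, lam)
    r = X_va @ w - y_va
    loss = 0.5 * (r @ r)
    # Implicit differentiation of (X'X + lam I) w = X'y gives
    # dw/dlam = -(X'X + lam I)^{-1} w.
    d = X_tr.shape[1]
    dw = -np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(d), w)
    return loss, (X_va @ dw) @ r

def tune_lambda(X_tr, y_tr, X_va, y_va, lam0=1.0, lr=0.1, steps=100):
    # Gradient descent on t = log(lambda) keeps lambda positive.
    t = np.log(lam0)
    for _ in range(steps):
        lam = np.exp(t)
        _, g = val_loss_and_grad(lam, X_tr, y_tr, X_va, y_va)
        t -= lr * g * lam  # chain rule: dV/dt = (dV/dlam) * lam
    return np.exp(t)
```

Because every solve is against the primal system of size d-by-d, no kernel matrix is ever formed or inverted, which is the scalability point made in the abstract.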



Citations
Journal ArticleDOI

A PSO and pattern search based memetic algorithm for SVMs parameters optimization

TL;DR: An efficient memetic algorithm based on particle swarm optimization (PSO) and pattern search is proposed, with a novel probabilistic selection strategy that chooses appropriate individuals from the current population to undergo local refinement, maintaining a good balance between exploration and exploitation.
Journal ArticleDOI

Optimization problems for machine learning: A survey

TL;DR: The machine learning literature is surveyed, and several commonly used machine learning approaches are presented in an optimization framework for regression, classification, clustering, deep learning, and adversarial learning, as well as new emerging applications in machine teaching, empirical model learning, and Bayesian network structure learning.
Proceedings ArticleDOI

Design of the 2015 ChaLearn AutoML challenge

TL;DR: The AutoML contest for IJCNN 2015 challenges participants to solve classification and regression problems without any human intervention, aiming to push the state of the art in fully automatic machine learning on a wide range of real-world problems.
Journal ArticleDOI

Parameters optimization of support vector machines for imbalanced data using social ski driver algorithm

TL;DR: This paper proposes a social ski-driver (SSD) optimization algorithm, inspired by several evolutionary optimization algorithms, for optimizing the parameters of SVMs with the aim of improving classification performance.
References
Book

Optimization and nonsmooth analysis

TL;DR: A generalization of classical differential calculus based on the generalized gradient, with applications in many aspects of analysis and in optimal control.
Book

Nonlinear Programming: Theory and Algorithms

TL;DR: The book is a solid reference for professionals as well as a useful text for students in the fields of operations research, management science, industrial engineering, applied mathematics, and also in engineering disciplines that deal with analytical optimization techniques.
Journal ArticleDOI

Generalized Cross-Validation as a Method for Choosing a Good Ridge Parameter

TL;DR: The generalized cross-validation (GCV) method as discussed by the authors is a generalized version of Allen's PRESS, which can be used in subset selection and singular value truncation, and even to choose from among mixtures of these methods.
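For ridge regression, the GCV criterion referenced above can be written as GCV(λ) = n·||(I − A(λ))y||² / tr(I − A(λ))², where A(λ) = X(XᵀX + λI)⁻¹Xᵀ is the influence (hat) matrix. A short NumPy sketch of this standard formula follows; the function names are illustrative, not from the cited work.

```python
import numpy as np

def gcv_score(X, y, lam):
    # GCV(lam) = n * ||(I - A) y||^2 / tr(I - A)^2,
    # with hat matrix A = X (X'X + lam I)^{-1} X'.
    n, d = X.shape
    A = X @ np.linalg.solve(X.T @ X + lam * np.eye(d), X.T)
    resid = y - A @ y
    return n * (resid @ resid) / (n - np.trace(A)) ** 2

def choose_ridge_lambda(X, y, grid):
    # Pick the candidate lambda with the smallest GCV score.
    return min(grid, key=lambda lam: gcv_score(X, y, lam))
```

Unlike the cross-validation bilevel formulation in the paper above, GCV needs no held-out split: the trace term plays the role of an effective degrees-of-freedom penalty.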
Book

Least Squares Support Vector Machines

TL;DR: Covers support vector machine basics, least squares support vector machine methods, Bayesian inference for LS-SVM models, robustness, large-scale problems, LS-SVM for unsupervised learning, and LS-SVM for recurrent networks and control.
Journal ArticleDOI

Choosing Multiple Parameters for Support Vector Machines

TL;DR: The problem of automatically tuning multiple parameters for pattern recognition Support Vector Machines (SVMs) is considered by minimizing some estimates of the generalization error of SVMs using a gradient descent algorithm over the set of parameters.