Open Access Proceedings Article

Hard Shape-Constrained Kernel Machines

TLDR
In this paper, the authors prove that hard affine shape constraints on function derivatives can be encoded in kernel machines, which represent one of the most flexible and powerful tools in machine learning and statistics.
Abstract
Shape constraints (such as non-negativity, monotonicity, convexity) play a central role in a large number of applications, as they usually improve performance for small sample sizes and help interpretability. However, enforcing these shape requirements in a hard fashion is an extremely challenging problem. Classically, this task is tackled (i) in a soft way (without out-of-sample guarantees), (ii) by specialized transformation of the variables on a case-by-case basis, or (iii) by using highly restricted function classes, such as polynomials or polynomial splines. In this paper, we prove that hard affine shape constraints on function derivatives can be encoded in kernel machines, which represent one of the most flexible and powerful tools in machine learning and statistics. In particular, we present a tightened second-order cone constrained reformulation that can be readily implemented in convex solvers. We prove performance guarantees on the solution, and demonstrate the efficiency of the approach in joint quantile regression with applications to economics and to the analysis of aircraft trajectories, among others.
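To make the abstract's idea concrete, the sketch below (not the authors' code) shows how a hard monotonicity constraint, i.e. a non-negative first derivative, can be imposed on a kernel ridge regressor as a second-order cone constraint over a grid of virtual points and handed to an off-the-shelf convex solver. The Gaussian kernel, the 1-D toy data, the virtual-point grid, and the constant buffer `eta` are illustrative assumptions; `eta` stands in for the paper's tightened, coverage-based buffer.

```python
# Minimal sketch: kernel ridge regression with a hard monotonicity constraint
# enforced as a second-order cone constraint at a grid of virtual points.
import numpy as np
import cvxpy as cp

def gauss_k(x, y, s=0.3):
    # Gaussian kernel matrix k(x_i, y_j)
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * s ** 2))

def gauss_k_dx(x, y, s=0.3):
    # derivative of k(x, y) with respect to the first argument
    return -(x[:, None] - y[None, :]) / s ** 2 * gauss_k(x, y, s)

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 1, 30))             # training inputs
y = X + 0.1 * rng.normal(size=30)              # noisy increasing target
Xv = np.linspace(0, 1, 50)                     # virtual points for the constraint

K = gauss_k(X, X)                               # Gram matrix
Kd = gauss_k_dx(Xv, X)                          # derivative of f at the virtual points
L = np.linalg.cholesky(K + 1e-8 * np.eye(len(X)))   # ||f||_K = ||L.T @ a||_2

a = cp.Variable(len(X))
eta = 0.05                                      # illustrative tightening buffer
loss = cp.sum_squares(K @ a - y) + 1e-3 * cp.sum_squares(L.T @ a)
cons = [Kd @ a >= eta * cp.norm(L.T @ a, 2)]    # f'(x_m) >= eta * ||f||_K  (SOC)
prob = cp.Problem(cp.Minimize(loss), cons)
prob.solve()
print("solver status:", prob.status)
```

Predictions at new points then follow as `gauss_k(x_new, X) @ a.value`. With the constant buffer above the derivative is only guaranteed non-negative at the virtual points; the paper's tightened buffer is what extends the guarantee to the whole domain.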



Citations
Posted Content

Linearly-constrained Linear Quadratic Regulator from the viewpoint of kernel methods.

TL;DR: This study presents how matrix-valued reproducing kernels allow for an alternative viewpoint on the linear quadratic regulator problem, and introduces a strengthened continuous-time convex optimization problem that can be tackled exactly with finite-dimensional solvers and whose solution is interior to the constraints.
Posted Content

Deep Local Volatility

TL;DR: A deep learning approach for interpolation of European vanilla option prices that jointly yields the full surface of local volatilities, using the Dupire formula to enforce bounds on the local volatility associated with option prices during network fitting.
Journal Article

Capturing and incorporating expert knowledge into machine learning models for quality prediction in manufacturing

TL;DR: In this paper, a general methodology is introduced for building quality prediction models with ML methods on small datasets by integrating shape expert knowledge, that is, prior knowledge about the shape of the input-output relationship to be learned.
Posted Content

A Dimension-free Computational Upper-bound for Smooth Optimal Transport Estimation

TL;DR: In this paper, an infinite-dimensional sum-of-squares representation is used to derive a statistical estimator of smooth optimal transport which achieves a precision of $\varepsilon$ from independent and identically distributed samples from the distributions.
Proceedings Article

On Controller Tuning with Time-Varying Bayesian Optimization

TL;DR: A novel TVBO forgetting strategy using Uncertainty-Injection (UI), which incorporates the assumption of incremental and lasting changes in the objective caused by changes to the system dynamics, and which outperforms the state-of-the-art method in TVBO.
References
Book

Inequalities: Theory of Majorization and Its Applications

TL;DR: In this paper, Doubly Stochastic Matrices and Schur-Convex Functions are used to represent matrix functions in the context of matrix factorizations, compounds, direct products and M-matrices.
Book

Spline Models for Observational Data

Grace Wahba
TL;DR: In this book, a theory and practice for the estimation of functions from noisy data on functionals is developed; convergence properties, data-based smoothing parameter selection, confidence intervals, and numerical methods appropriate to a number of problems within this framework are established.
Journal Article

Theory of Reproducing Kernels.

TL;DR: In this paper, a short historical introduction is given to indicate the different manners in which these kernels have been used by various investigators, and to discuss the more important trends of the application of these kernels without attempting, however, a complete bibliography of the subject matter.
Book

Support Vector Machines

TL;DR: This book explains the principles that make support vector machines (SVMs) a successful modelling and prediction tool for a variety of applications and provides a unique in-depth treatment of both fundamental and recent material on SVMs that so far has been scattered in the literature.
Book

Lectures on Stochastic Programming: Modeling and Theory

TL;DR: The authors dedicate this book to Julia, Benjamin, Daniel, Natan and Yael; to Tsonka, Konstatin and Marek; and to the Memory of Feliks, Maria, and Dentcho.