Journal ArticleDOI
Quantized Kernel Recursive Least Squares Algorithm
TL;DR: By incorporating a simple online vector quantization method, a recursive algorithm is derived to update the solution, namely the quantized kernel recursive least squares algorithm.
Abstract: In a recent paper, we developed a novel quantized kernel least mean square algorithm, in which the input space is quantized (partitioned into smaller regions) and the network size is upper bounded by the quantization codebook size (the number of regions). In this paper, we propose quantized kernel least squares regression and derive the optimal solution. By incorporating a simple online vector quantization method, we derive a recursive algorithm to update the solution, namely the quantized kernel recursive least squares algorithm. The good performance of the new algorithm is demonstrated by Monte Carlo simulations.
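The online vector quantization step described in the abstract can be sketched as follows: a new input is merged into the nearest codebook entry when it lies within a quantization radius, and allocates a new unit only otherwise, which bounds the network size by the codebook size. This is a minimal illustration, not the authors' implementation; the parameter names `eta`, `sigma`, and `eps` are assumptions.

```python
import numpy as np

def qklms(inputs, targets, eta=0.5, sigma=1.0, eps=0.3):
    """Sketch of a quantized kernel LMS-style learner: the codebook grows
    only when an input is farther than `eps` from every existing center;
    otherwise the nearest center's coefficient absorbs the update."""
    centers, alphas = [], []          # quantization codebook and coefficients
    gauss = lambda a, b: np.exp(-np.sum((a - b) ** 2) / (2 * sigma ** 2))
    for u, d in zip(inputs, targets):
        u = np.atleast_1d(np.asarray(u, dtype=float))
        y = sum(a * gauss(c, u) for c, a in zip(centers, alphas))
        e = d - y                     # prediction error
        if centers:
            j = min(range(len(centers)),
                    key=lambda i: np.linalg.norm(centers[i] - u))
            if np.linalg.norm(centers[j] - u) <= eps:
                alphas[j] += eta * e  # within radius: merge into nearest entry
                continue
        centers.append(u)             # farther than eps: allocate a new unit
        alphas.append(eta * e)
    return centers, alphas
```

On clustered inputs the codebook stays small: five samples drawn near 0 and near 1 produce only two centers with `eps=0.3`.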
Citations
Journal ArticleDOI
Kernel recursive maximum correntropy
TL;DR: A robust kernel adaptive algorithm is derived in kernel space under the maximum correntropy criterion (MCC); it is particularly useful for nonlinear and non-Gaussian signal processing, especially when the data contain large outliers or are disturbed by impulsive noise.
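The outlier robustness under MCC comes from a Gaussian weighting of the error in the update rule: large errors are exponentially down-weighted, so a single outlier barely moves the solution. A minimal LMS-style sketch of this weighting (not the kernel-space algorithm of the cited paper; all names are illustrative):

```python
import numpy as np

def mcc_lms_step(w, x, d, eta=0.1, sigma=1.0):
    """One linear LMS-style update under the maximum correntropy criterion:
    the factor exp(-e^2 / (2 sigma^2)) shrinks the step for large errors,
    so outliers have almost no effect on the weights."""
    e = d - w @ x                                  # instantaneous error
    return w + eta * np.exp(-e ** 2 / (2 * sigma ** 2)) * e * x
```

Compared with plain LMS (`w + eta * e * x`), an outlier with a huge error contributes a step scaled by a near-zero Gaussian factor.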
Journal ArticleDOI
Online Sequential Extreme Learning Machine With Kernels
TL;DR: This work proposes a straightforward extension of the well-known kernel recursive least-squares algorithm, a member of the kernel adaptive filtering (KAF) family, to the ELM framework, yielding a highly efficient algorithm in terms of both generalization error and training time.
Journal ArticleDOI
Mixture correntropy for robust learning
TL;DR: Experimental results show that learning algorithms under MMCC perform very well, outperforming conventional MCC-based algorithms as well as several other state-of-the-art algorithms.
Journal ArticleDOI
Retargeted Least Squares Regression Algorithm
TL;DR: This brief presents a framework of retargeted least squares regression (ReLSR) for multicategory classification that learns the regression targets directly from the data rather than using the traditional zero-one matrix as targets.
Journal ArticleDOI
Decorrelation of Neutral Vector Variables: Theory and Applications
TL;DR: In this paper, two fundamental invertible transformations, namely the serial nonlinear transformation and the parallel nonlinear transformation, are proposed to decorrelate neutral vector variables.
References
BookDOI
Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond
TL;DR: Learning with Kernels provides an introduction to SVMs and related kernel methods, covering all of the concepts necessary to enable a reader equipped with some basic mathematical knowledge to enter the world of machine learning using theoretically well-founded yet easy-to-use kernel algorithms.
Journal ArticleDOI
Regularization theory and neural networks architectures
TL;DR: This paper shows that regularization networks encompass a much broader range of approximation schemes, including many popular general additive models and some neural networks, and introduces new classes of smoothness functionals that lead to different classes of basis functions.
Journal ArticleDOI
A resource-allocating network for function interpolation
TL;DR: A network that allocates a new computational unit whenever an unusual pattern is presented to it; it learns much faster than backpropagation networks while using a comparable number of synapses.
Journal ArticleDOI
The kernel recursive least-squares algorithm
TL;DR: A nonlinear version of the recursive least squares (RLS) algorithm that uses a sequential sparsification process that admits into the kernel representation a new input sample only if its feature space image cannot be sufficiently well approximated by combining the images of previously admitted samples.
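The sparsification rule summarized above is the approximate linear dependency (ALD) test: a sample is admitted only if the residual of the best feature-space approximation by the current dictionary exceeds a threshold. A minimal sketch of the test, with `nu` as an assumed threshold name:

```python
import numpy as np

def ald_admit(kernel, dictionary, x, nu=1e-2):
    """ALD test used by KRLS-style sparsification: admit x only if its
    feature-space image cannot be sufficiently well approximated by the
    images of already-admitted samples."""
    if not dictionary:
        return True                   # first sample is always admitted
    K = np.array([[kernel(a, b) for b in dictionary] for a in dictionary])
    k = np.array([kernel(a, x) for a in dictionary])
    # residual of the best least-squares approximation in feature space
    delta = kernel(x, x) - k @ np.linalg.solve(K, k)
    return delta > nu
```

With a Gaussian kernel, an input nearly identical to a dictionary element yields a residual near zero and is rejected, while a distant input is admitted.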