Open Access · Posted Content

Study of Set-Membership Adaptive Kernel Algorithms.

TLDR
These algorithms use an adaptive step-size to accelerate learning and offer an excellent tradeoff between convergence speed and steady-state performance, which allows them to solve nonlinear filtering and estimation problems with a large number of parameters without requiring a large computational cost.
Abstract
In the last decade, a considerable research effort has been devoted to developing adaptive algorithms based on kernel functions. One of the main features of these algorithms is that they form a family of universal approximation techniques, solving problems with nonlinearities elegantly. In this paper, we present data-selective adaptive kernel normalized least-mean-square (KNLMS) algorithms that can increase their learning rate and reduce their computational complexity. These methods deal with kernel expansions, creating a growing structure known as the dictionary, whose size depends on the number of observations and their innovation. The algorithms described herein use an adaptive step-size to accelerate learning and can offer an excellent tradeoff between convergence speed and steady-state performance, which allows them to solve nonlinear filtering and estimation problems with a large number of parameters without requiring a large computational cost. The data-selective update scheme also limits the number of operations performed and the size of the dictionary created by the kernel expansion, saving computational resources and addressing one of the major problems of kernel adaptive algorithms. A statistical analysis is carried out along with a computational complexity analysis of the proposed algorithms. Simulations show that the proposed KNLMS algorithms outperform existing algorithms in examples of nonlinear system identification and prediction of a time series originating from a nonlinear difference equation.
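
Purely as an illustration of the ingredients described in the abstract (a kernel expansion over a growing dictionary, a data-selective error-bound test, and an adaptive step-size), a minimal Python sketch could look as follows. The Gaussian kernel, the coherence rule used to grow the dictionary, and all numerical thresholds are assumptions made for this example, not the authors' exact design.

import numpy as np

def gaussian_kernel(x, D, sigma=0.5):
    # Gaussian kernel between input x and every dictionary entry in D (assumed kernel choice)
    diffs = D - x
    return np.exp(-np.sum(diffs ** 2, axis=1) / (2.0 * sigma ** 2))

def sm_knlms(X, d, gamma=0.1, mu0=0.9, sigma=0.5, eps=1e-6):
    # Data-selective (set-membership) KNLMS sketch.
    # X: (N, n) inputs, d: (N,) desired outputs.
    # gamma: error bound, mu0: coherence threshold for dictionary growth.
    D = X[:1].copy()                 # dictionary starts with the first input
    alpha = np.zeros(1)              # kernel expansion coefficients
    errors = np.zeros(len(d))

    for i in range(1, len(d)):
        k = gaussian_kernel(X[i], D, sigma)
        e = d[i] - k @ alpha         # a priori error
        errors[i] = e

        if abs(e) <= gamma:          # error already within the bound: skip the update
            continue

        step = 1.0 - gamma / abs(e)  # adaptive step-size in (0, 1)

        if np.max(k) <= mu0:         # coherence test: input is novel, grow the dictionary
            D = np.vstack([D, X[i]])
            alpha = np.append(alpha, 0.0)
            k = np.append(k, 1.0)    # Gaussian kernel of X[i] with itself

        alpha += step * e * k / (eps + k @ k)   # normalized LMS-style update

    return errors, len(alpha)

# Toy usage on a hypothetical nonlinear identification problem
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(2000, 2))
d = np.tanh(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.01 * rng.standard_normal(2000)
errors, dict_size = sm_knlms(X, d)
print(dict_size, np.mean(errors[-200:] ** 2))

Because the coefficient update runs only when the error exceeds the bound gamma, most samples trigger no arithmetic beyond the kernel evaluation, which is where the computational savings of data-selective schemes come from.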


References
Journal Article

Identification and control of dynamical systems using neural networks

TL;DR: It is demonstrated that neural networks can be used effectively for the identification and control of nonlinear dynamical systems and that the models introduced are practically feasible.
Journal Article

An introduction to kernel-based learning algorithms

TL;DR: This paper provides an introduction to support vector machines, kernel Fisher discriminant analysis, and kernel principal component analysis as examples of successful kernel-based learning methods.
Book Chapter

A Generalized Representer Theorem

TL;DR: The result shows that a wide range of problems have optimal solutions that live in the finite-dimensional span of the training examples mapped into feature space, thus enabling kernel algorithms to be carried out independently of the (potentially infinite) dimensionality of the feature space.
Book

Adaptive Filters

Ali H. Sayed
TL;DR: Adaptive Filters offers a fresh, focused look at the subject in a manner that will entice students, challenge experts, and appeal to practitioners and instructors.
Journal Article

A resource-allocating network for function interpolation

John Platt
01 Jun 1991
TL;DR: A network that allocates a new computational unit whenever an unusual pattern is presented; it learns much faster than backpropagation networks and uses a comparable number of synapses.
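
As a rough sketch of that allocation idea (not Platt's exact procedure), a radial unit is added only when the input lies far from every existing center and the prediction error is large; otherwise the existing units are nudged by a small gradient step. The thresholds, width rule, and learning rate below are illustrative assumptions.

import numpy as np

def rbf_predict(x, centers, heights, widths):
    # Output of a Gaussian radial-basis network at input x
    if len(centers) == 0:
        return 0.0
    r2 = np.sum((centers - x) ** 2, axis=1)
    return float(heights @ np.exp(-r2 / widths ** 2))

def ran_update(x, y, centers, heights, widths, delta=0.5, eps_err=0.05, lr=0.02):
    # One step of a resource-allocating network (illustrative sketch)
    err = y - rbf_predict(x, centers, heights, widths)
    dist = np.min(np.linalg.norm(centers - x, axis=1)) if len(centers) else np.inf

    if dist > delta and abs(err) > eps_err:            # unusual pattern: allocate a unit
        centers = np.vstack([centers, x]) if len(centers) else x[None, :]
        heights = np.append(heights, err)              # new unit absorbs the current error
        new_w = dist if np.isfinite(dist) else delta   # first unit gets a default width
        widths = np.append(widths, new_w)              # width tied to distance from nearest unit
    elif len(centers):                                 # familiar pattern: adapt existing heights
        phi = np.exp(-np.sum((centers - x) ** 2, axis=1) / widths ** 2)
        heights = heights + lr * err * phi
    return centers, heights, widths

# Toy usage: fit sin(t) on [0, 4]
centers, heights, widths = np.empty((0, 1)), np.array([]), np.array([])
for t in np.linspace(0.0, 4.0, 400):
    centers, heights, widths = ran_update(np.array([t]), np.sin(t), centers, heights, widths)
print(len(centers))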