Journal ArticleDOI

Fading memory and the problem of approximating nonlinear operators with Volterra series

TL;DR: In this article, it was shown that any time-invariant continuous nonlinear operator with fading memory can be approximated by a Volterra series operator, and that the approximating operator can be realized as a finite-dimensional linear dynamical system with a nonlinear readout map.
Abstract: Using the notion of fading memory we prove very strong versions of two folk theorems. The first is that any time-invariant (TI) continuous nonlinear operator can be approximated by a Volterra series operator, and the second is that the approximating operator can be realized as a finite-dimensional linear dynamical system with a nonlinear readout map. While previous approximation results are valid over finite time intervals and for signals in compact sets, the approximations presented here hold for all time and for signals in useful (noncompact) sets. The discrete-time analog of the second theorem asserts that any TI operator with fading memory can be approximated (in our strong sense) by a nonlinear moving-average operator. Some further discussion of the notion of fading memory is given.
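
To make the discrete-time statement concrete, here is a minimal numerical sketch (not taken from the paper): a toy fading-memory system of my own choosing is approximated by a nonlinear moving-average operator, i.e. a second-order truncated Volterra polynomial in a finite window of past inputs, fitted by least squares. The system `target_system`, the window length `M`, and all other names are illustrative assumptions.

```python
# Sketch: approximate a toy fading-memory system by a nonlinear moving-average
# (second-order truncated Volterra) operator fitted with least squares.
import numpy as np

rng = np.random.default_rng(0)

def target_system(u):
    """Toy fading-memory system: a leaky internal state with a static nonlinear readout."""
    x = 0.0
    y = np.zeros_like(u)
    for n, un in enumerate(u):
        x = 0.8 * x + 0.2 * un            # influence of past inputs decays geometrically
        y[n] = np.tanh(x) + 0.1 * x ** 2  # memoryless nonlinear readout
    return y

def features(u, M):
    """Constant, linear, and quadratic monomials of the last M input samples."""
    N = len(u)
    win = np.stack([np.concatenate([np.zeros(i), u[:N - i]]) for i in range(M)], axis=1)
    quad = np.stack([win[:, i] * win[:, j] for i in range(M) for j in range(i, M)], axis=1)
    return np.hstack([np.ones((N, 1)), win, quad])

M = 8                                      # memory window of the moving-average operator
u = rng.uniform(-1.0, 1.0, 5000)           # bounded training input
theta, *_ = np.linalg.lstsq(features(u, M), target_system(u), rcond=None)

u_test = rng.uniform(-1.0, 1.0, 2000)      # fresh bounded input
err = np.max(np.abs(features(u_test, M) @ theta - target_system(u_test)))
print(f"worst-case error of the fitted nonlinear moving-average operator: {err:.4f}")
```

Because the toy system's dependence on past inputs decays geometrically, a short window already gives a small uniform error; that is the qualitative content of the fading-memory assumption.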


Citations
Book
01 Jan 1994
TL;DR: In this paper, the authors present a brief history of LMIs in control theory and discuss some of the standard problems involving LMIs, such as linear matrix inequalities, linear differential inclusions, and matrix problems with analytic solutions.
Abstract: Preface. 1. Introduction. Overview; A Brief History of LMIs in Control Theory; Notes on the Style of the Book; Origin of the Book. 2. Some Standard Problems Involving LMIs. Linear Matrix Inequalities; Some Standard Problems; Ellipsoid Algorithm; Interior-Point Methods; Strict and Nonstrict LMIs; Miscellaneous Results on Matrix Inequalities; Some LMI Problems with Analytic Solutions. 3. Some Matrix Problems. Minimizing Condition Number by Scaling; Minimizing Condition Number of a Positive-Definite Matrix; Minimizing Norm by Scaling; Rescaling a Matrix Positive-Definite; Matrix Completion Problems; Quadratic Approximation of a Polytopic Norm; Ellipsoidal Approximation. 4. Linear Differential Inclusions. Differential Inclusions; Some Specific LDIs; Nonlinear System Analysis via LDIs. 5. Analysis of LDIs: State Properties. Quadratic Stability; Invariant Ellipsoids. 6. Analysis of LDIs: Input/Output Properties. Input-to-State Properties; State-to-Output Properties; Input-to-Output Properties. 7. State-Feedback Synthesis for LDIs. Static State-Feedback Controllers; State Properties; Input-to-State Properties; State-to-Output Properties; Input-to-Output Properties; Observer-Based Controllers for Nonlinear Systems. 8. Lur'e and Multiplier Methods. Analysis of Lur'e Systems; Integral Quadratic Constraints; Multipliers for Systems with Unknown Parameters. 9. Systems with Multiplicative Noise. Analysis of Systems with Multiplicative Noise; State-Feedback Synthesis. 10. Miscellaneous Problems. Optimization over an Affine Family of Linear Systems; Analysis of Systems with LTI Perturbations; Positive Orthant Stabilizability; Linear Systems with Delays; Interpolation Problems; The Inverse Problem of Optimal Control; System Realization Problems; Multi-Criterion LQG; Nonconvex Multi-Criterion Quadratic Problems. Notation; List of Acronyms; Bibliography; Index.

11,085 citations
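
As a concrete illustration of the simplest of the standard problems listed above, the sketch below (my own example, not from the book) checks quadratic stability of dx/dt = Ax: the LMI asks for P > 0 with AᵀP + PA < 0, and for a single stable A this has the classical analytic solution via the Lyapunov equation AᵀP + PA = -Q. The matrices A and Q are arbitrary choices.

```python
# Sketch: solve the Lyapunov equation A^T P + P A = -Q and verify that P
# certifies the LMI conditions P > 0 and A^T P + P A < 0.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])   # a stable example matrix (eigenvalues -1 and -2)
Q = np.eye(2)                   # any positive-definite right-hand side

# solve_continuous_lyapunov(a, q) solves a X + X a^H = q, so pass a = A^T and q = -Q.
P = solve_continuous_lyapunov(A.T, -Q)

print("P =\n", P)
print("P > 0:", np.all(np.linalg.eigvalsh(P) > 0))
print("A^T P + P A < 0:", np.all(np.linalg.eigvalsh(A.T @ P + P @ A) < 0))
```

For families of matrices (the linear differential inclusions treated in the book) no such closed form exists, and the feasibility problem is solved numerically, e.g. by the ellipsoid or interior-point methods listed in the contents.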

Journal ArticleDOI
TL;DR: A new computational model for real-time computing on time-varying input is proposed as an alternative to paradigms based on Turing machines or attractor neural networks; it is based on principles of high-dimensional dynamical systems combined with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry.
Abstract: A key challenge for neural modeling is to explain how a continuous stream of multimodal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire neurons in real time. We propose a new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks. It does not require a task-dependent construction of neural circuits. Instead, it is based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry. It is shown that the inherent transient dynamics of the high-dimensional dynamical system formed by a sufficiently large and heterogeneous neural circuit may serve as universal analog fading memory. Readout neurons can learn to extract in real time from the current state of such recurrent neural circuit information about current and past inputs that may be needed for diverse tasks. Stable internal states are not required for giving a stable output, since transient internal states can be transformed by readout neurons into stable target outputs due to the high dimensionality of the dynamical system. Our approach is based on a rigorous computational model, the liquid state machine, that, unlike Turing machines, does not require sequential transitions between well-defined discrete internal states. It is supported, as the Turing machine is, by rigorous mathematical results that predict universal computational power under idealized conditions, but for the biologically more realistic scenario of real-time processing of time-varying inputs. Our approach provides new perspectives for the interpretation of neural coding, the design of experiments and data analysis in neurophysiology, and the solution of problems in robotics and neurotechnology.

3,446 citations
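
The sketch below is a rate-based caricature of this idea (my own simplification, not the paper's spiking liquid state machine): a fixed random recurrent network serves as a high-dimensional analog fading memory, and only a linear readout is trained, here to reconstruct the input from a few steps in the past. The network size, weight scaling, and task are illustrative assumptions.

```python
# Sketch: fixed random recurrent "reservoir" driven by an input stream; a linear
# readout trained by least squares recovers information about past inputs from
# the current network state.
import numpy as np

rng = np.random.default_rng(1)
N, T, delay = 200, 4000, 5

W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale for fading (echo-state-like) memory
w_in = rng.uniform(-1, 1, N)

u = rng.uniform(-1, 1, T)                          # time-varying input stream
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])               # transient high-dimensional dynamics
    states[t] = x

target = np.roll(u, delay)                         # task: reconstruct u[t - delay]
split = 3000
w_out, *_ = np.linalg.lstsq(states[100:split], target[100:split], rcond=None)

mse = np.mean((states[split:] @ w_out - target[split:]) ** 2)
print(f"readout MSE on the held-out part of the stream: {mse:.4e}")
```

Only the readout weights are adapted; the recurrent weights stay fixed, which is the point the abstract makes about generic, task-independent circuitry.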

Journal ArticleDOI
TL;DR: An overview of recent advances in physical reservoir computing is provided, classifying them according to the type of reservoir, with the aim of expanding practical applications and developing next-generation machine learning systems.

959 citations

Journal ArticleDOI
TL;DR: It is shown that only near the critical boundary can recurrent networks of threshold gates perform complex computations on time series, strongly supporting conjectures that dynamical systems capable of complex computational tasks should operate near the edge of chaos.
Abstract: Depending on the connectivity, recurrent networks of simple computational units can show very different types of dynamics, ranging from totally ordered to chaotic. We analyze how the type of dynamics (ordered or chaotic) exhibited by randomly connected networks of threshold gates driven by a time-varying input signal depends on the parameters describing the distribution of the connectivity matrix. In particular, we calculate the critical boundary in parameter space where the transition from ordered to chaotic dynamics takes place. Employing a recently developed framework for analyzing real-time computations, we show that only near the critical boundary can such networks perform complex computations on time series. Hence, this result strongly supports conjectures that dynamical systems that are capable of doing complex computational tasks should operate near the edge of chaos, that is, the transition from ordered to chaotic dynamics.

746 citations
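
A crude way to see the ordered/chaotic distinction discussed above is a damage-spreading experiment, sketched below; it is in the spirit of the paper's setting (randomly connected threshold gates driven by a time-varying input) but is not its analytical calculation of the critical boundary. The in-degree K, the weight scales, and the network size are arbitrary choices.

```python
# Sketch: flip one unit of a random threshold-gate network's state and watch whether
# the perturbation dies out (ordered regime) or spreads (chaotic regime) as the
# weight variance increases.
import numpy as np

rng = np.random.default_rng(2)
N, T, K = 500, 60, 10   # network size, time steps, in-degree per gate

def final_hamming_distance(sigma):
    W = np.zeros((N, N))
    for i in range(N):
        idx = rng.choice(N, K, replace=False)       # sparse random connectivity
        W[i, idx] = rng.normal(0, sigma, K)         # Gaussian weights, std dev sigma
    u = rng.choice([-1.0, 1.0], T)                  # common time-varying input signal
    x = rng.choice([-1.0, 1.0], N)
    x_pert = x.copy()
    x_pert[0] = -x_pert[0]                          # single-unit perturbation
    for t in range(T):
        x = np.sign(W @ x + u[t])                   # threshold-gate update
        x_pert = np.sign(W @ x_pert + u[t])
    return np.mean(x != x_pert)                     # normalized Hamming distance

for sigma in (0.1, 0.3, 1.0, 3.0):
    print(f"sigma = {sigma:4.1f}   final Hamming distance = {final_hamming_distance(sigma):.3f}")
```

For small weight variance the perturbation typically dies out (ordered regime); for large variance it spreads through the network (chaotic regime), with the computationally interesting behavior conjectured to lie near the transition.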

Journal ArticleDOI
V.J. Mathews1
TL;DR: The polynomial systems considered are those nonlinear systems whose output signals can be related to the input signals through a truncated Volterra series expansion or a recursive nonlinear difference equation.
Abstract: Adaptive nonlinear filters equipped with polynomial models of nonlinearity are explained. The polynomial systems considered are those nonlinear systems whose output signals can be related to the input signals through a truncated Volterra series expansion or a recursive nonlinear difference equation. The Volterra series expansion can model a large class of nonlinear systems and is attractive in adaptive filtering applications because the expansion is a linear combination of nonlinear functions of the input signal. The basic ideas behind the development of gradient and recursive least-squares adaptive Volterra filters are first discussed. Adaptive algorithms using system models involving recursive nonlinear difference equations are then treated. Such systems may be able to approximate many nonlinear systems with great parsimony in the use of coefficients. Also discussed are current research trends and new results and problem areas associated with these nonlinear filters. A lattice structure for polynomial models is described.

541 citations
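
As a minimal illustration of the gradient-type adaptive Volterra filters surveyed above, the sketch below runs an LMS update on a second-order truncated Volterra model identifying a toy nonlinear plant; the plant, memory length, and step size are illustrative assumptions rather than anything taken from the article.

```python
# Sketch: LMS adaptation of a second-order truncated Volterra filter.
import numpy as np

rng = np.random.default_rng(3)
M, mu, T = 4, 0.05, 20000                       # memory length, step size, samples

def volterra_features(window):
    """Constant, linear, and quadratic products of the current input window."""
    quad = [window[i] * window[j] for i in range(len(window)) for j in range(i, len(window))]
    return np.concatenate(([1.0], window, quad))

def unknown_plant(window):
    """Toy nonlinear plant to be identified (illustrative, second-order in the input)."""
    return 0.5 * window[0] - 0.3 * window[1] + 0.4 * window[0] * window[1] + 0.2 * window[2] ** 2

u = rng.uniform(-1, 1, T)
w = np.zeros(1 + M + M * (M + 1) // 2)          # adaptive Volterra kernel coefficients

for n in range(M, T):
    window = u[n - M + 1:n + 1][::-1]           # u[n], u[n-1], ..., u[n-M+1]
    phi = volterra_features(window)
    d = unknown_plant(window)                   # desired (plant) output
    e = d - w @ phi                             # a priori error
    w += mu * e * phi                           # LMS coefficient update

print(f"final squared error: {e**2:.2e}")
```

The same regressor vector could instead be fed to a recursive least-squares update, which trades higher per-sample cost for faster convergence, as discussed in the article.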

References
Book
01 Jan 1973

14,545 citations

Book
01 Jan 1960

2,756 citations

Book
01 Jan 1958
TL;DR: A series of lectures on the role of nonlinear processes in physics, mathematics, electrical engineering, physiology, and communication theory; the last few lectures are devoted to applying these ideas to problems in the statistical mechanics of gases.
Abstract: A series of lectures on the role of nonlinear processes in physics, mathematics, electrical engineering, physiology, and communication theory. From the preface: "For some time I have been interested in a group of phenomena depending upon random processes. On the one hand, I have recorded the random shot effect as a suitable input for testing nonlinear circuits. On the other hand, for some of the work that Professor W. A. Rosenblith and I have been doing concerning the nature of the electroencephalogram, and in particular of the alpha rhythm, it has occurred to me to use the model of a system of random nonlinear oscillators excited by a random input... At the beginning we had contemplated a series of only four or five lectures. My ideas developed pari passu with the course, and by the end of the term we found ourselves with a set of fifteen lectures. The last few of these were devoted to the application of my ideas to problems in the statistical mechanics of gases. This work is both new and tentative, and I found that I had to supplement my course by the writing over of these with the help of Professor Y. W. Lee."

1,504 citations

Book
01 Jan 1970

835 citations

Journal ArticleDOI
TL;DR: In this paper, the foundations of a theory of non-linear causal functionals are laid down using non-commutative indeterminates.
Abstract: The foundations of a theory of non-linear causal functionals are laid down using non-commutative indeterminates.

354 citations