Open Access · Journal Article (DOI)

Orthogonal least squares methods and their application to non-linear system identification

Sheng Chen, +2 more
01 Nov 1989 · Vol. 50, Iss. 5, pp. 1873-1896
Abstract
Identification algorithms based on the well-known linear least squares methods of Gaussian elimination, Cholesky decomposition, classical Gram-Schmidt, modified Gram-Schmidt, Householder transformation, the Givens method, and singular value decomposition are reviewed. The classical Gram-Schmidt, modified Gram-Schmidt, and Householder transformation algorithms are then extended to combine structure determination (that is, deciding which terms to include in the model) and parameter estimation in a very simple and efficient manner for a class of multivariate discrete-time non-linear stochastic systems which are linear in the parameters.
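To make the combined structure-selection and parameter-estimation idea concrete, here is a minimal sketch of forward regression driven by the error reduction ratio (ERR), assuming NumPy. The function name ols_forward_select, the tolerance tol, and the final least-squares re-fit of the chosen columns are illustrative choices, not the paper's exact procedure, which instead back-solves the triangular system produced by the orthogonalization.

```python
import numpy as np

def ols_forward_select(P, y, tol=0.999):
    """Greedy forward selection of regressors by error reduction ratio (ERR).

    P: (N, M) matrix of candidate terms, y: (N,) output.
    Selects columns until the accumulated ERR reaches `tol`.
    """
    N, M = P.shape
    W = P.astype(float).copy()       # candidates, orthogonalized in place
    sigma = float(y @ y)             # total output energy
    selected, err_total = [], 0.0
    for _ in range(M):
        num = (W.T @ y) ** 2                        # (w_i' y)^2 per column
        den = np.einsum('ij,ij->j', W, W) * sigma   # (w_i' w_i)(y' y)
        err = np.where(den > 1e-12, num / den, 0.0)
        err[selected] = 0.0                          # skip chosen terms
        k = int(np.argmax(err))
        if err[k] <= 1e-12:
            break                                    # nothing useful left
        selected.append(k)
        err_total += err[k]
        if err_total >= tol:
            break                                    # enough variance explained
        # modified Gram-Schmidt step: deflate remaining candidates by w_k
        wk = W[:, k].copy()
        coef = (W.T @ wk) / (wk @ wk)
        coef[selected] = 0.0                         # keep chosen columns intact
        W -= np.outer(wk, coef)
    # simple final step: ordinary LS on the chosen original columns
    theta, *_ = np.linalg.lstsq(P[:, selected], y, rcond=None)
    return selected, theta
```

When the selected columns are well conditioned, this plain re-fit returns the same parameter values as back-substitution through the triangular matrix of the orthogonalization.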


Citations
Posted Content

Multi-Branch Matching Pursuit with applications to MIMO radar.

TL;DR: A sufficient condition, named MB-coherence, under which MBMP can recover a sparse signal from noiseless measurements is derived; it incorporates the number of branches of MBMP and requires fewer measurements than competing conditions (e.g., the Neumann ERC or the cumulative coherence).
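For context, a minimal single-branch matching pursuit sketch, assuming NumPy; MBMP generalizes this greedy loop by keeping several candidate atoms per iteration as parallel branches rather than committing to one.

```python
import numpy as np

def matching_pursuit(A, y, n_iter=10):
    """Single-branch matching pursuit: repeatedly pick the dictionary
    column most correlated with the residual and deflate the residual."""
    A = A / np.linalg.norm(A, axis=0)   # unit-norm atoms
    x = np.zeros(A.shape[1])
    r = y.astype(float).copy()
    for _ in range(n_iter):
        c = A.T @ r                     # correlation of each atom with residual
        k = int(np.argmax(np.abs(c)))
        x[k] += c[k]                    # coefficient update for the chosen atom
        r -= c[k] * A[:, k]             # remove its contribution
    return x, r
```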
Journal Article (DOI)

Nonlinear system identification and overparameterization effects in multisensory evoked potential studies

TL;DR: A nonlinear modeling technique is proposed to demonstrate possible mechanisms of interaction between sensory pathways; the intersensory phenomenon concept is then extended using nonlinear system theory and applied to show possible interactions between the visual and auditory pathways.
Journal Article (DOI)

Readouts for echo-state networks built using locally regularized orthogonal forward regression

TL;DR: A locally regularized linear readout built using locally regularized orthogonal forward regression (LROFR) is presented; it may have a smaller dimensionality than the ESN model itself and improves the robustness and accuracy of the ESN.
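As a rough illustration of what a linear readout is, here is a generic ridge-regularized readout over collected reservoir states, assuming NumPy. Note it uses one global regularizer lam, whereas LROFR assigns individual local regularizers to forward-selected regressors and can thereby reduce the readout's dimensionality.

```python
import numpy as np

def train_readout(states, targets, lam=1e-6):
    """Generic ridge readout: solve (X'X + lam*I) W = X'Y for the readout
    weights, where X holds the collected reservoir states row-wise."""
    X, Y = states, targets
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ Y)
```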
Journal Article (DOI)

Automatic kernel regression modelling using combined leave-one-out test score and regularised orthogonal least squares.

TL;DR: An automatic, robust nonlinear identification algorithm is proposed that combines the leave-one-out test score (also known as the PRESS statistic) with regularised orthogonal least squares, achieving maximised model robustness via two effective and complementary approaches.
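The PRESS statistic has a closed form for linear-in-the-parameters models, so the n leave-one-out refits collapse into a single solve; a minimal sketch assuming NumPy:

```python
import numpy as np

def press_statistic(X, y):
    """PRESS via the hat matrix: the leave-one-out residual of sample i is
    e_i / (1 - h_ii), avoiding n separate refits."""
    H = X @ np.linalg.solve(X.T @ X, X.T)   # hat (projection) matrix
    e = y - H @ y                           # ordinary residuals
    loo = e / (1.0 - np.diag(H))            # leave-one-out residuals
    return float(loo @ loo)                 # sum of squared LOO errors
```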
References
Book

Applied Regression Analysis

TL;DR: Fitting a straight line by least squares is treated as the basic case, and the Durbin-Watson test is used to check the adequacy of the straight-line fit.
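A short sketch of the two steps named above, assuming NumPy: fit a line by least squares, then compute the Durbin-Watson statistic on the residuals (values near 2 suggest uncorrelated residuals; values well below 2 suggest positive autocorrelation).

```python
import numpy as np

def line_fit_dw(x, y):
    """Fit y = a + b*x by least squares, then compute the Durbin-Watson
    statistic of the residuals."""
    b, a = np.polyfit(x, y, 1)              # slope, intercept
    e = y - (a + b * x)                     # residuals, in x-order
    d = np.sum(np.diff(e) ** 2) / np.sum(e ** 2)
    return a, b, d
```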
Journal Article (DOI)

Singular value decomposition and least squares solutions

TL;DR: The decomposition A = UΣV^T is called the singular value decomposition (SVD); the diagonal elements of Σ are the non-negative square roots of the eigenvalues of A^T A and are called the singular values.
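This relationship is easy to check numerically, assuming NumPy:

```python
import numpy as np

# Singular values of A equal the non-negative square roots of the
# eigenvalues of A^T A.
A = np.random.default_rng(0).normal(size=(6, 4))
s = np.linalg.svd(A, compute_uv=False)       # singular values, descending
lam = np.linalg.eigvalsh(A.T @ A)[::-1]      # eigenvalues of A^T A, descending
assert np.allclose(s, np.sqrt(np.clip(lam, 0.0, None)))
```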
Book

Linear regression analysis

TL;DR: The authors give serious consideration to the further development of regression computer programs that are efficient and accurate, regarding such programs as an important part of statistical research, and provide up-to-date accounts of the computational methods and algorithms in current use without getting entrenched in minor computing details.
Journal Article (DOI)

Input-output parametric models for non-linear systems Part II: stochastic non-linear systems

TL;DR: Recursive input-output models for non-linear multivariate discrete-time systems are derived, and sufficient conditions for their existence are defined.
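To illustrate the model class, here is a toy polynomial NARX example, assuming NumPy: the regressors are non-linear in past inputs and outputs, but the model is linear in its parameters, so ordinary least squares (or the orthogonal methods reviewed in the main paper) applies directly. The lags, terms, and coefficients below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.uniform(-1.0, 1.0, 300)              # input sequence
y = np.zeros(300)
for t in range(2, 300):                      # simulate a toy stochastic NARX system
    y[t] = 0.5*y[t-1] + 0.8*u[t-1] - 0.3*y[t-1]*u[t-2] + 0.05*rng.normal()

k = np.arange(2, 300)                        # usable time indices
# Candidate terms: non-linear in the data, linear in the parameters.
P = np.column_stack([y[k-1], u[k-1], y[k-1]*u[k-2], y[k-2]**2])
theta, *_ = np.linalg.lstsq(P, y[k], rcond=None)
# theta recovers approximately [0.5, 0.8, -0.3, 0.0]
```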