Open Access Journal Article

Orthogonal least squares methods and their application to non-linear system identification

Sheng Chen, +2 more
01 Nov 1989, Vol. 50, Iss. 5, pp. 1873-1896
TLDR
Identification algorithms based on the well-known linear least squares methods of Gaussian elimination, Cholesky decomposition, classical Gram-Schmidt, modified Gram-Schmidt, Householder transformation, Givens method, and singular value decomposition are reviewed.
Abstract
Identification algorithms based on the well-known linear least squares methods of Gaussian elimination, Cholesky decomposition, classical Gram-Schmidt, modified Gram-Schmidt, Householder transformation, Givens method, and singular value decomposition are reviewed. The classical Gram-Schmidt, modified Gram-Schmidt, and Householder transformation algorithms are then extended to combine structure determination, that is, the selection of which terms to include in the model, with parameter estimation in a very simple and efficient manner for a class of multivariate discrete-time non-linear stochastic systems which are linear in the parameters.
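
As a concrete illustration of how structure determination and parameter estimation can be combined, the following Python sketch implements a forward orthogonal least squares selection loop for a generic linear-in-the-parameters model y = P theta + e, with candidate regressor matrix P (N samples by M candidate terms). It is a minimal sketch, not the authors' implementation: the function name ols_select, the tolerance-based stopping rule, and the use of the error reduction ratio as the selection criterion are illustrative assumptions.

import numpy as np

def ols_select(P, y, tol=0.01):
    """Forward OLS term selection: greedily pick columns of the candidate
    regressor matrix P until the unexplained fraction of the output energy
    falls below `tol`; return the selected column indices and parameters."""
    P = np.asarray(P, dtype=float)
    y = np.asarray(y, dtype=float)
    sigma = y @ y                          # total output energy
    W, g, selected = [], [], []            # orthogonal regressors, their coefficients, chosen indices
    remaining = list(range(P.shape[1]))
    unexplained = 1.0
    while remaining and unexplained > tol:
        best = None
        for j in remaining:
            w = P[:, j].copy()
            for wk in W:                   # orthogonalise against already-selected terms
                w -= (wk @ w) / (wk @ wk) * wk
            d = w @ w
            if d < 1e-12:                  # candidate is (numerically) dependent on chosen terms
                continue
            gj = (w @ y) / d
            err = gj * gj * d / sigma      # error reduction ratio of this candidate
            if best is None or err > best[0]:
                best = (err, j, w, gj)
        if best is None:
            break
        err, j, w, gj = best
        W.append(w); g.append(gj); selected.append(j)
        remaining.remove(j)
        unexplained -= err
    # Back-substitute through the unit upper-triangular matrix that links the
    # orthogonal regressors to the original selected columns.
    k = len(selected)
    A = np.eye(k)
    for i in range(k):
        for m in range(i + 1, k):
            A[i, m] = (W[i] @ P[:, selected[m]]) / (W[i] @ W[i])
    theta = np.linalg.solve(A, np.array(g))
    return selected, theta

In a non-linear identification setting the columns of P would typically be lagged input and output samples and their polynomial products; the sum of the selected error reduction ratios then indicates how much of the output variance the chosen terms explain.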



Citations
Book

Neural networks for pattern recognition

TL;DR: This is the first comprehensive treatment of feed-forward neural networks from the perspective of statistical pattern recognition, and is designed as a text, with over 100 exercises, to benefit anyone involved in the fields of neural computation and pattern recognition.
Journal Article

Atomic Decomposition by Basis Pursuit

TL;DR: Basis Pursuit (BP) is a principle for decomposing a signal into an "optimal" superposition of dictionary elements, where optimal means having the smallest l1 norm of coefficients among all such decompositions; the authors report advantages over MOF, MP, and BOB, including better sparsity and superresolution, and obtain reasonable success with a primal-dual logarithmic barrier method and conjugate-gradient solver.
Journal Article

K-SVD: An Algorithm for Designing Overcomplete Dictionaries for Sparse Representation

TL;DR: A novel algorithm for adapting dictionaries in order to achieve sparse signal representations, the K-SVD algorithm, an iterative method that alternates between sparse coding of the examples based on the current dictionary and a process of updating the dictionary atoms to better fit the data.
Journal Article

Greed is good: algorithmic results for sparse approximation

TL;DR: This article presents new results on using a greedy algorithm, orthogonal matching pursuit (OMP), to solve the sparse approximation problem over redundant dictionaries and develops a sufficient condition under which OMP can identify atoms from an optimal approximation of a nonsparse signal.
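
The orthogonal matching pursuit summarised above is a close relative of the forward orthogonal least squares selection sketched earlier. A minimal illustrative implementation in Python (not the paper's code; the function name omp, the fixed n_atoms stopping rule, and the assumption of unit-norm dictionary columns are illustrative choices) is:

import numpy as np

def omp(D, y, n_atoms):
    """Greedy sparse approximation: repeatedly pick the dictionary column of D
    most correlated with the residual, then re-fit all chosen columns by least
    squares (the 'orthogonal' step) before updating the residual."""
    D = np.asarray(D, dtype=float)
    y = np.asarray(y, dtype=float)
    residual, support, coeffs = y.copy(), [], np.array([])
    for _ in range(n_atoms):
        j = int(np.argmax(np.abs(D.T @ residual)))   # most correlated atom
        if j in support:                             # no new atom improves the fit
            break
        support.append(j)
        coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coeffs
    return support, coeffs

Re-solving the least squares problem over all selected atoms at every step is what distinguishes OMP from plain matching pursuit, which only updates the coefficient of the newly chosen atom.
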
References
Journal Article

Orthogonal parameter estimation algorithm for non-linear stochastic systems

TL;DR: In this article, an orthogonal parameter estimation algorithm is proposed which allows each parameter in a linear-in-the-parameters nonlinear difference equation model to be estimated sequentially and quite independently of the other parameters in the model.
Journal Article

Identification of non-linear output-affine systems using an orthogonal least-squares algorithm

TL;DR: In this article, an orthogonal least-squares algorithm is derived that can determine which terms to regress upon, detect significant terms in the model expansion, and provide the final parameter estimates.
Journal Article

A prediction-error and stepwise-regression estimation algorithm for non-linear systems

TL;DR: The identification of non-linear systems based on a NARMAX (Non-linear AutoRegressive Moving-Average model with exogenous inputs) model representation is considered, and a combined stepwise-regression/prediction-error estimation algorithm is derived.