Proceedings ArticleDOI

Numerical Method of an Orthogonal Array Optimization

TLDR
In this article, the damping idea is introduced into the conventional orthogonal array optimization (OAO), and numerical experiments indicate the superior application performance of the resulting damping OAO.
Abstract
Nonlinear regression analysis (RA) is widely used in signal and image processing, but the ill-conditioning and convergence behavior of classical nonlinear RA methods depend strongly on the initial parameter estimates. Orthogonal array optimization (OAO) is one approach to obtaining initial estimates that are close to the optimal values. We introduce the damping idea into conventional OAO, hereby termed the damping OAO. This paper presents the numerical method of the damping OAO, and numerical experiment results indicate its superior application performance. In statistics, nonlinear RA is a form of regression in which observational data are modeled by a function that is a nonlinear combination of the model parameters and depends on one or more independent variables. Nonlinear least squares is the most common form of nonlinear RA. The basis of the method is to approximate the model by a linear one and to refine the unknown parameters by successive iterations; the initial parameter estimates are therefore highly significant for the ill-conditioning and convergence properties of the method (1). Among the many approaches to obtaining initial parameter estimates close to the optimal values is the orthogonal array optimization (OAO) (1-2).
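The two ideas in the abstract can be sketched in code: first screen a small set of parameter-level combinations to pick a good initial estimate (the OAO idea), then refine it with a damped Gauss-Newton iteration in the Levenberg-Marquardt style, where the damping term guards against ill-conditioning. This is a minimal illustrative sketch, not the paper's actual algorithm; the exponential model, the parameter levels, and the full 3x3 level grid (a stand-in for a proper orthogonal array, which matters when there are many parameters) are all assumptions for the example.

```python
import numpy as np

# Hypothetical model for illustration: y = a * exp(b * x), with synthetic data.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 40)
a_true, b_true = 2.5, 1.3
y = a_true * np.exp(b_true * x) + 0.05 * rng.standard_normal(x.size)

def residuals(p):
    a, b = p
    return a * np.exp(b * x) - y

def jacobian(p):
    a, b = p
    e = np.exp(b * x)
    return np.column_stack([e, a * x * e])

def sse(p):
    return float(np.sum(residuals(p) ** 2))

# Step 1 (OAO idea): screen candidate parameter levels and keep the best
# combination as the initial estimate. A full 3x3 grid is used here for
# simplicity; a true orthogonal array samples the level combinations more
# economically when the number of parameters grows.
levels_a = [0.5, 2.0, 4.0]
levels_b = [0.5, 1.0, 2.0]
p = np.array(min(((a, b) for a in levels_a for b in levels_b),
                 key=lambda c: sse(np.array(c))))

# Step 2: damped Gauss-Newton refinement (Levenberg-Marquardt style).
# The damping term lam * I keeps the normal equations well conditioned.
lam = 1e-2
for _ in range(100):
    r = residuals(p)
    J = jacobian(p)
    step = np.linalg.solve(J.T @ J + lam * np.eye(2), -J.T @ r)
    if sse(p + step) < sse(p):
        p = p + step
        lam *= 0.5   # step accepted: trust the linearization more
    else:
        lam *= 2.0   # step rejected: increase damping and retry
    if np.linalg.norm(step) < 1e-10:
        break

print(p)  # should land near the true parameters (2.5, 1.3)
```

The damping update is the key point: when a step fails to reduce the sum of squares, increasing `lam` shifts the iteration toward steepest descent, which is exactly the remedy for the ill-conditioning the abstract describes.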

References
Journal ArticleDOI

A method for the solution of certain non-linear problems in least squares

TL;DR: In this article, least-squares problems with non-linear normal equations are solved by an extension of the standard method that ensures improvement of the initial solution; the method can also be considered an extension of Newton's method.
Book

Nonlinear Regression Analysis and Its Applications

TL;DR: This book offers a balanced presentation of the theoretical, practical, and computational aspects of nonlinear regression and provides background material on linear regression, including the geometrical development for linear and nonlinear least squares.
Book

Iterative Methods for Optimization

C. T. Kelley
TL;DR: Iterative Methods for Optimization does more than cover traditional gradient-based optimization: it is the first book to treat sampling methods, including the Hooke & Jeeves, implicit filtering, MDS, and Nelder & Mead schemes, in a unified way.
Book

Orthogonal Arrays: Theory and Applications

Lih-Yuan Deng
TL;DR: Covers the Rao inequalities for mixed orthogonal arrays and constructions of orthogonal arrays inspired by coding theory.