
Showing papers by "Luc Pronzato published in 2009"


Journal ArticleDOI
TL;DR: In this article, sufficient conditions for the strong consistency and asymptotic normality of the least-squares estimator in nonlinear stochastic regression models are derived.

18 citations


Book ChapterDOI
01 Jan 2009
TL;DR: In this article, the consistency and asymptotic normality of the LS estimator of a function h(θ) of the parameters θ in a nonlinear regression model were studied.
Abstract: We study the consistency and asymptotic normality of the LS estimator of a function h(θ) of the parameters θ in a nonlinear regression model with observations \(y_i=\eta(x_i,\theta) +\varepsilon_i\), \(i=1,2\ldots\), and independent errors \(\varepsilon_i\). Optimum experimental design for the estimation of h(θ) frequently yields singular information matrices, which is the situation considered here. The difficulties caused by such singular designs are illustrated by a simple example: depending on the true value of the model parameters and on the type of convergence of the sequence of design points \(x_1,x_2\ldots\) to the limiting singular design measure ξ, the convergence of the estimator of h(θ) may be slower than \(1/\sqrt{n}\); and, when convergence is at the rate \(1/\sqrt{n}\) and the estimator is asymptotically normal, its asymptotic variance may differ from that obtained for the limiting design ξ (which we call irregular asymptotic normality of the estimator). For that reason we focus our attention on two types of design sequences: those that converge strongly to a discrete measure and those that correspond to sampling randomly from ξ. We then give assumptions on the limiting expectation surface of the model and on the estimated function h which, for the designs considered, are sufficient to ensure the regular asymptotic normality of the LS estimator of h(θ).

10 citations


Book ChapterDOI
01 Jan 2009
TL;DR: Algorithms that switch periodically between s = 1 and s = 2 are shown to converge much faster than when s is fixed at 2 and bounds are obtained through optimum design theory.
Abstract: We study the asymptotic behaviour of Forsythe's s-optimum gradient algorithm for the minimization of a quadratic function in \({\mathbb R}^d\), using a renormalization that converts the algorithm into iterations applied to a probability measure. Bounds on the performance of the algorithm (rate of convergence) are obtained through optimum design theory, and the limiting behaviour of the algorithm for s = 2 is investigated in detail. Algorithms that switch periodically between s = 1 and s = 2 are shown to converge much faster than when s is fixed at 2.
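The s = 1 member of this family is the classical steepest-descent method with exact line search, whose slow zigzagging on ill-conditioned quadratics is what the switching schemes above are designed to break. A minimal Python sketch of that s = 1 case (the matrix, starting point, and iteration count are illustrative choices, not taken from the paper):

```python
import numpy as np

# Steepest descent with exact line search on f(x) = 0.5 x'Ax - b'x.
# For a quadratic, the optimal step is gamma = g'g / (g'Ag).
A = np.diag([1.0, 10.0])   # ill-conditioned SPD matrix (illustrative)
b = np.zeros(2)
x = np.array([10.0, 1.0])  # illustrative starting point

for _ in range(50):
    g = A @ x - b                      # gradient of the quadratic
    gamma = (g @ g) / (g @ (A @ g))    # exact (optimal) step length
    x = x - gamma * g

# x approaches the minimizer A^{-1} b = 0, at the slow linear rate
# governed by the condition number of A
```

The per-step contraction of f is roughly ((κ-1)/(κ+1))², with κ the condition number of A, which is the slow behaviour that motivates alternating s = 1 with s = 2 steps.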

9 citations


Journal ArticleDOI
TL;DR: In this article, the authors consider the design of c-optimal experiments for the estimation of a scalar function h(θ) of the parameters θ in a nonlinear regression model.
Abstract: We consider the design of c-optimal experiments for the estimation of a scalar function h(θ) of the parameters θ in a nonlinear regression model. A c-optimal design ξ* may be singular, and we derive conditions ensuring the asymptotic normality of the Least-Squares estimator of h(θ) for a singular design over a finite space. As illustrated by an example, the singular designs for which asymptotic normality holds typically depend on the unknown true value of θ, which makes singular c-optimal designs of no practical use in nonlinear situations. Some simple alternatives are then suggested for constructing nonsingular designs that approach a c-optimal design under some conditions.

8 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present new conditions for the strong consistency and asymptotic normality of the least squares estimator in nonlinear stochastic models when the design variables vary in a finite set.

7 citations


Book ChapterDOI
01 Jan 2009
TL;DR: The results of a numerical study show that some of the proposed algorithms for solving quadratic optimization problems are extremely efficient.
Abstract: We study the family of gradient algorithms for solving quadratic optimization problems, where the step length \(\gamma_k\) is chosen according to a particular procedure. To carry out the study, we re-write the algorithms in a normalized form and make a connection with the theory of optimum experimental design. We provide the results of a numerical study which shows that some of the proposed algorithms are extremely efficient.
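The normalized form mentioned in the abstract can be illustrated in the simplest (exact line-search) case: for a quadratic with diagonal Hessian, the squared gradient components define a probability measure on the eigenvalues, and one gradient step becomes a multiplicative update of that measure. A hedged Python sketch under these assumptions (the eigenvalues and initial measure are arbitrary illustrative values):

```python
import numpy as np

# Renormalized view of steepest descent on a diagonal quadratic:
# p_i is proportional to the squared gradient component along the
# i-th eigenvector; the exact step is 1 over the first moment of p,
# and g <- (I - gamma A) g becomes a multiplicative update of p.
lam = np.array([1.0, 3.0, 10.0])   # eigenvalues of A (illustrative)
p = np.array([0.3, 0.4, 0.3])      # initial measure on the spectrum

for _ in range(30):
    gamma = 1.0 / np.sum(p * lam)       # exact step: 1 / first moment
    p = (1.0 - gamma * lam) ** 2 * p    # image of g <- (I - gamma A) g
    p = p / p.sum()                     # renormalize to a probability

# the measure concentrates on the extreme eigenvalues {1, 10},
# reflecting the classical two-dimensional asymptotic zigzag
```

The mass on interior eigenvalues decays geometrically, so the renormalized iteration effectively lives on the two extreme points of the spectrum, which is what makes design-theoretic bounds on the rate possible.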

3 citations


Journal ArticleDOI
TL;DR: In this paper, a design method for the sequential generation of observation sites used for the inversion of a prediction model is extended to cope with practical issues such as delayed observations and design of batches of imposed size.

1 citation


15 Jun 2009
TL;DR: In this article, a scalar coefficient is used to force the support points of an optimal design measure to concentrate around points of minimum cost, and the choice of each new design point is based on the current estimated value of the model parameters.
Abstract: Optimal design under a cost constraint is considered, with a scalar coefficient setting the compromise between information (i.e., precision of the estimation of the model parameters) and cost. For suitable cost functions, by increasing the value of the coefficient one can force the support points of an optimal design measure to concentrate around points of minimum cost. When the experiment is constructed sequentially, the choice of each new design point being based on the current estimated value of the model parameters (response-adaptive design), the strong consistency and asymptotic normality of the estimator of the model parameters are obtained under the assumption that the design variables belong to a finite set. An example of adaptive design in a dose-finding problem with a bivariate binary model is presented, showing the effectiveness of the approach.
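A sequential information-versus-cost trade-off of this general kind can be sketched for a simple linear regression surrogate: each new point maximizes a standard D-optimality gain penalized by a cost term. Everything below (the quadratic model, linear cost, trade-off coefficient, and regularization) is an illustrative assumption, not the paper's bivariate binary dose-finding model:

```python
import numpy as np

# Sequential penalized-design sketch on a finite design space:
# pick x maximizing f(x)' M^{-1} f(x) - lam * cost(x), then update
# the information matrix M with the chosen point.
cand = np.linspace(0.0, 1.0, 11)              # finite candidate set
feat = lambda x: np.array([1.0, x, x ** 2])   # quadratic regression model
cost = lambda x: x                            # higher "doses" cost more
lam = 0.5                                     # information/cost trade-off

M = 1e-6 * np.eye(3)                          # regularized to start invertible
design = []
for _ in range(20):
    gains = [feat(x) @ np.linalg.solve(M, feat(x)) - lam * cost(x)
             for x in cand]
    x_new = cand[int(np.argmax(gains))]       # greedy penalized choice
    design.append(x_new)
    M += np.outer(feat(x_new), feat(x_new))   # rank-one information update
```

Because the candidate set is finite, the selected points quickly settle on a few support points, mirroring the finite-design-space assumption under which the paper's consistency and asymptotic normality results are stated.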