scispace - formally typeset
Author

David F. Andrews

Other affiliations: Princeton University, Bell Labs
Bio: David F. Andrews is an academic researcher from the University of Toronto. The author has contributed to research in topics: Population & Symbolic computation. The author has an h-index of 34 and has co-authored 85 publications receiving 7,411 citations. Previous affiliations of David F. Andrews include Princeton University & Bell Labs.


Papers
Journal ArticleDOI
TL;DR: In this paper, necessary and sufficient conditions are presented under which a random variable X may be generated as the ratio Z/V, where Z and V are independent and Z has a standard normal distribution.
Abstract: SUMMARY This paper presents necessary and sufficient conditions under which a random variable X may be generated as the ratio Z/V, where Z and V are independent and Z has a standard normal distribution. This representation is useful in Monte Carlo calculations. It is established that when ½V⁻² is exponential, X is double exponential; and that when ½V has the asymptotic distribution of the Kolmogorov distance statistic, X is logistic.
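The exponential case of this representation is easy to check by simulation. A minimal sketch (illustrative, not from the paper): writing W = ½V⁻², so that W ~ Exp(1) gives X = Z·√(2W), the draws should match a standard double exponential (Laplace) distribution, for which E|X| = 1 and Var(X) = 2.

```python
import math
import random

def sample_scale_mixture(n, seed=0):
    """Draw X = Z / V where Z ~ N(0,1) and (1/2) V**-2 ~ Exp(1),
    i.e. X = Z * sqrt(2*W) with W ~ Exp(1)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        z = rng.gauss(0.0, 1.0)
        w = rng.expovariate(1.0)            # W = (1/2) V**-2
        out.append(z * math.sqrt(2.0 * w))  # equals Z / V
    return out

xs = sample_scale_mixture(200_000)
n = len(xs)
mean_abs = sum(abs(x) for x in xs) / n  # standard Laplace: E|X| = 1
var = sum(x * x for x in xs) / n        # standard Laplace: Var = 2
```

With 200,000 draws, both sample moments should land within a few percent of the Laplace values.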

936 citations

Journal ArticleDOI
TL;DR: In this article, a method of plotting data of more than two dimensions is proposed: each data point x = (x_1, …, x_k) is mapped into the function f_x(t) = x_1/√2 + x_2 sin t + x_3 cos t + x_4 sin 2t + x_5 cos 2t + ⋯, which is plotted over the range −π < t < π.
Abstract: SUMMARY A method of plotting data of more than two dimensions is proposed. Each data point, x = (x_1, …, x_k), is mapped into a function of the form f_x(t) = x_1/√2 + x_2 sin t + x_3 cos t + x_4 sin 2t + x_5 cos 2t + ⋯, and the function is plotted on the range −π < t < π. Some statistical properties of the method are explored. The application of the method is illustrated with an example from anthropology.
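The mapping is simple to implement. A minimal sketch (illustrative; the function name is ours): each data point contributes one Fourier-style curve, and a dataset is visualized by evaluating f_x over a grid of t in (−π, π) for every point.

```python
import math

def andrews_curve(x, t):
    """Evaluate f_x(t) = x1/sqrt(2) + x2 sin t + x3 cos t + x4 sin 2t + ...
    for a data point x of any dimension k."""
    total = x[0] / math.sqrt(2.0)
    for i, xi in enumerate(x[1:], start=1):
        harmonic = (i + 1) // 2       # 1, 1, 2, 2, 3, 3, ...
        if i % 2 == 1:
            total += xi * math.sin(harmonic * t)
        else:
            total += xi * math.cos(harmonic * t)
    return total

# One curve per data point, evaluated on a grid over (-pi, pi):
grid = [-math.pi + 2 * math.pi * j / 200 for j in range(201)]
curve = [andrews_curve([1.0, 2.0, 3.0], t) for t in grid]
```

Points that are close in k-dimensional space produce curves that stay close for all t, which is what makes the plot useful for spotting clusters and outliers.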

708 citations

Journal ArticleDOI
TL;DR: It is concluded that sustained administration of desferrioxamine may slow the clinical progression of the dementia associated with AD.

666 citations

Journal ArticleDOI
David F. Andrews
TL;DR: In this paper, fitting techniques are called robust of efficiency when their statistical efficiency remains high under conditions more realistic than the utopian case of Gaussian distributions with errors of equal variance, and resistant when altering a small fraction of the data does not greatly alter the result; techniques with both properties are proposed.
Abstract: Techniques of fitting are said to be resistant when the result is not greatly altered when a small fraction of the data is altered; techniques of fitting are said to be robust of efficiency when their statistical efficiency remains high for conditions more realistic than the utopian cases of Gaussian distributions with errors of equal variance. These properties are particularly important in the formative stages of model building, when the form of the response is not known exactly. Techniques with these properties are proposed and discussed.
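The definition of resistance can be illustrated directly (an illustrative sketch, not one of the paper's proposed techniques): altering a small fraction of a Gaussian sample moves the sample mean by an arbitrary amount but barely moves the median.

```python
import random

def mean(xs):
    return sum(xs) / len(xs)

def median(xs):
    s = sorted(xs)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else 0.5 * (s[mid - 1] + s[mid])

rng = random.Random(1)
clean = [rng.gauss(0.0, 1.0) for _ in range(100)]

bad = clean[:]
for i in range(3):        # alter a small fraction (3%) of the data
    bad[i] = 1000.0

shift_mean = abs(mean(bad) - mean(clean))      # large: mean is not resistant
shift_median = abs(median(bad) - median(clean))  # small: median is resistant
```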

525 citations

Book
21 Jul 1972
TL;DR: This book, as discussed by the authors, reports a study of robust estimation that examines the interaction between estimators, their properties, and the underlying conditions over as large a range as possible.
Abstract: Because estimation involves inferring information about an unknown quantity on the basis of available data, the selection of an estimator is influenced by its ability to perform well under the conditions that are assumed to underlie the data. Since these conditions are never known exactly, the estimators chosen must be robust; i.e., they must be able to perform well under a variety of underlying conditions. The theory of robust estimation is based on specified properties of specified estimators under specified conditions. This book was written as the result of a study undertaken to establish the interaction of these three components over as large a range as possible. Originally published in 1972. The Princeton Legacy Library uses the latest print-on-demand technology to again make available previously out-of-print books from the distinguished backlist of Princeton University Press. These paperback editions preserve the original texts of these important books while presenting them in durable paperback editions. The goal of the Princeton Legacy Library is to vastly increase access to the rich scholarly heritage found in the thousands of books published by Princeton University Press since its founding in 1905.
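The kind of comparison the study describes can be sketched in a few lines (illustrative only; the actual study examined many estimators under many conditions): under a contaminated Gaussian, one of the non-utopian conditions robust estimators are meant to handle, a 10% trimmed mean is markedly more efficient than the sample mean.

```python
import random

def mean(xs):
    return sum(xs) / len(xs)

def trimmed_mean(xs, prop=0.10):
    """Discard the lowest and highest `prop` fraction before averaging."""
    s = sorted(xs)
    g = int(prop * len(s))
    core = s[g:len(s) - g]
    return sum(core) / len(core)

def contaminated_sample(rng, n, eps=0.10, scale=9.0):
    """(1-eps) N(0,1) + eps N(0, scale^2): a classic non-Gaussian condition."""
    return [rng.gauss(0.0, scale if rng.random() < eps else 1.0)
            for _ in range(n)]

reps = 2000
rng = random.Random(2)
est_mean = [mean(contaminated_sample(rng, 40)) for _ in range(reps)]
rng = random.Random(2)  # same seed: both estimators see identical samples
est_trim = [trimmed_mean(contaminated_sample(rng, 40)) for _ in range(reps)]

def mse(es):
    """Mean squared error about the true location, 0."""
    return sum(e * e for e in es) / len(es)

m_mean = mse(est_mean)
m_trim = mse(est_trim)
```

Under pure Gaussian data the two estimators would be nearly tied; the contamination is what separates them.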

460 citations


Cited by
Journal ArticleDOI
TL;DR: In this paper, the authors consider the problem of comparing complex hierarchical models in which the number of parameters is not clearly defined, and derive a measure pD for the effective number of parameters in a model as the difference between the posterior mean of the deviance and the deviance at the posterior means of the parameters of interest; adding pD to the posterior mean deviance gives a deviance information criterion for comparing models, which is related to other information criteria and has an approximate decision-theoretic justification.
Abstract: Summary. We consider the problem of comparing complex hierarchical models in which the number of parameters is not clearly defined. Using an information theoretic argument we derive a measure pD for the effective number of parameters in a model as the difference between the posterior mean of the deviance and the deviance at the posterior means of the parameters of interest. In general pD approximately corresponds to the trace of the product of Fisher's information and the posterior covariance, which in normal models is the trace of the ‘hat’ matrix projecting observations onto fitted values. Its properties in exponential families are explored. The posterior mean deviance is suggested as a Bayesian measure of fit or adequacy, and the contributions of individual observations to the fit and complexity can give rise to a diagnostic plot of deviance residuals against leverages. Adding pD to the posterior mean deviance gives a deviance information criterion for comparing models, which is related to other information criteria and has an approximate decision theoretic justification. The procedure is illustrated in some examples, and comparisons are drawn with alternative Bayesian and classical proposals. Throughout it is emphasized that the quantities required are trivial to compute in a Markov chain Monte Carlo analysis.
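The quantities in the abstract are indeed trivial to compute from posterior draws. A minimal sketch for a normal mean with known variance (illustrative; the "posterior draws" here are generated analytically as a stand-in for MCMC output, since with a flat prior the posterior of the mean is N(x̄, σ²/n)):

```python
import math
import random

def deviance(theta, data, sigma=1.0):
    """-2 log likelihood for a N(theta, sigma^2) model."""
    n = len(data)
    ll = (-0.5 * n * math.log(2 * math.pi * sigma ** 2)
          - sum((x - theta) ** 2 for x in data) / (2 * sigma ** 2))
    return -2.0 * ll

rng = random.Random(3)
data = [rng.gauss(1.0, 1.0) for _ in range(50)]

# Stand-in posterior draws for theta (in practice these come from MCMC):
xbar = sum(data) / len(data)
post = [rng.gauss(xbar, 1.0 / math.sqrt(len(data))) for _ in range(5000)]

d_bar = sum(deviance(t, data) for t in post) / len(post)  # posterior mean deviance
d_hat = deviance(sum(post) / len(post), data)  # deviance at posterior mean
p_d = d_bar - d_hat          # effective number of parameters
dic = d_bar + p_d            # deviance information criterion
```

For this one-parameter model pD should come out close to 1, matching the intuition behind the measure.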

11,691 citations

Journal ArticleDOI
William S. Cleveland
TL;DR: Robust locally weighted regression, as discussed in this article, is a method for smoothing a scatterplot in which the fitted value at x_k is the value of a polynomial fit to the data using weighted least squares, where the weight for (x_i, y_i) is large if x_i is close to x_k and small if it is not.
Abstract: The visual information on a scatterplot can be greatly enhanced, with little additional cost, by computing and plotting smoothed points. Robust locally weighted regression is a method for smoothing a scatterplot, (x_i, y_i), i = 1, …, n, in which the fitted value at x_k is the value of a polynomial fit to the data using weighted least squares, where the weight for (x_i, y_i) is large if x_i is close to x_k and small if it is not. A robust fitting procedure is used that guards against deviant points distorting the smoothed points. Visual, computational, and statistical issues of robust locally weighted regression are discussed. Several examples, including data on lead intoxication, are used to illustrate the methodology.
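A single smoothing pass can be sketched as follows (illustrative; Cleveland's procedure additionally uses nearest-neighbor bandwidths and iterated bisquare robustness weights, omitted here): at each point of interest, fit a weighted least-squares line with tricube weights and take its value there.

```python
def tricube(u):
    """Tricube weight: (1 - |u|^3)^3 for |u| < 1, else 0."""
    u = abs(u)
    return (1.0 - u ** 3) ** 3 if u < 1.0 else 0.0

def loess_point(x0, xs, ys, bandwidth):
    """Locally weighted linear fit at x0 (one non-robust pass)."""
    ws = [tricube((xi - x0) / bandwidth) for xi in xs]
    sw = sum(ws)
    swx = sum(w * x for w, x in zip(ws, xs))
    swy = sum(w * y for w, y in zip(ws, ys))
    swxx = sum(w * x * x for w, x in zip(ws, xs))
    swxy = sum(w * x * y for w, x, y in zip(ws, xs, ys))
    denom = sw * swxx - swx * swx
    if denom == 0.0:            # degenerate neighborhood: fall back to mean
        return swy / sw
    slope = (sw * swxy - swx * swy) / denom
    intercept = (swy - slope * swx) / sw
    return intercept + slope * x0

xs = [i / 10.0 for i in range(21)]
ys = [2.0 * x + 1.0 for x in xs]
smoothed = loess_point(1.0, xs, ys, bandwidth=0.5)  # exactly linear data
```

Because the local model is a line, exactly linear data are reproduced without bias; the robustness iterations matter only when deviant points are present.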

10,225 citations

Book
10 Mar 1986
TL;DR: In this book, the use of theoretical models as an alternative to experiment for making accurate predictions of chemical phenomena is described, addressing the formulation of theoretical molecular orbital models starting from quantum mechanics and their comparison with experimental results.
Abstract: Describes and discusses the use of theoretical models as an alternative to experiment in making accurate predictions of chemical phenomena. Addresses the formulation of theoretical molecular orbital models starting from quantum mechanics, and compares them to experimental results. Draws on a series of models that have already received widespread application and are available for new applications. A new and powerful research tool for the practicing experimental chemist.

8,210 citations

Journal ArticleDOI
TL;DR: In this article, a generalized form of the cross-validation criterion is applied to the choice and assessment of prediction using the data-analytic concept of a prescription, and examples used to illustrate the application are drawn from the problem areas of univariate estimation, linear regression and analysis of variance.
Abstract: SUMMARY A generalized form of the cross-validation criterion is applied to the choice and assessment of prediction using the data-analytic concept of a prescription. The examples used to illustrate the application are drawn from the problem areas of univariate estimation, linear regression and analysis of variance.
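Leave-one-out cross-validation, the simplest member of this family, can be sketched as follows (illustrative; the paper's generalized criterion covers a wider class of prescriptions): each observation is predicted from a fit to the remaining n − 1, and the average squared prediction error is used to compare prescriptions such as a constant versus a linear fit.

```python
def loo_cv_score(xs, ys, fit_predict):
    """Leave-one-out CV: average squared error predicting each point
    from a fit on the other n-1 points."""
    n = len(xs)
    total = 0.0
    for i in range(n):
        xtr = xs[:i] + xs[i + 1:]
        ytr = ys[:i] + ys[i + 1:]
        pred = fit_predict(xtr, ytr, xs[i])
        total += (ys[i] - pred) ** 2
    return total / n

def fit_constant(xtr, ytr, x0):
    """Prescription 1: predict with the training mean."""
    return sum(ytr) / len(ytr)

def fit_linear(xtr, ytr, x0):
    """Prescription 2: least-squares line through the training data."""
    n = len(xtr)
    mx = sum(xtr) / n
    my = sum(ytr) / n
    sxx = sum((x - mx) ** 2 for x in xtr)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xtr, ytr))
    return my + (sxy / sxx) * (x0 - mx)

xs = [float(i) for i in range(10)]
ys = [2.0 * x + 1.0 for x in xs]       # trending data
cv_const = loo_cv_score(xs, ys, fit_constant)
cv_lin = loo_cv_score(xs, ys, fit_linear)
```

On trending data the linear prescription wins decisively, which is exactly the kind of choice-of-prescription question the criterion is designed to settle.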

7,385 citations