Author

Frank R. Hampel

Bio: Frank R. Hampel is an academic researcher from ETH Zurich. He has contributed to research on topics including estimators and parametric models. He has an h-index of 15 and has co-authored 31 publications receiving 8,375 citations. Previous affiliations of Frank R. Hampel include the University of North Carolina at Chapel Hill and the University of Zurich.

Papers
Book
01 Jan 1986
TL;DR: A book-length treatment of robust statistics, covering one-dimensional estimators and tests, multidimensional estimators, estimation of covariance matrices and multivariate location, and robust estimation and testing in linear models.
Abstract: 1. Introduction and Motivation. 2. One-Dimensional Estimators. 3. One-Dimensional Tests. 4. Multidimensional Estimators. 5. Estimation of Covariance Matrices and Multivariate Location. 6. Linear Models: Robust Estimation. 7. Linear Models: Robust Testing. 8. Complements and Outlook. References. Index.

3,818 citations

Journal ArticleDOI
Frank R. Hampel
TL;DR: In this article, the first derivative of an estimator viewed as a functional is discussed, along with the ways in which it can be used to study local robustness properties; a theory of robust estimation "near" strict parametric models is briefly sketched and applied to some classical situations.
Abstract: This paper treats essentially the first derivative of an estimator viewed as functional and the ways in which it can be used to study local robustness properties. A theory of robust estimation “near” strict parametric models is briefly sketched and applied to some classical situations. Relations between von Mises functionals, the jackknife and U-statistics are indicated. A number of classical and new estimators are discussed, including trimmed and Winsorized means, Huber-estimators, and more generally maximum likelihood and M-estimators. Finally, a table with some numerical robustness properties is given.

2,410 citations
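The trimmed and Winsorized means discussed in this paper are easy to state in code. The following is a minimal illustrative NumPy sketch (the function names, trimming fraction, and sample data are my own, not from the paper):

```python
import numpy as np

def trimmed_mean(x, alpha=0.2):
    """Mean after discarding the floor(alpha*n) smallest and largest values."""
    x = np.sort(np.asarray(x, dtype=float))
    k = int(alpha * len(x))
    return x[k:len(x) - k].mean()

def winsorized_mean(x, alpha=0.2):
    """Mean after pulling the extremes in to the nearest retained values
    instead of discarding them."""
    x = np.sort(np.asarray(x, dtype=float))  # np.sort returns a copy
    k = int(alpha * len(x))
    x[:k] = x[k]
    x[len(x) - k:] = x[len(x) - k - 1]
    return x.mean()

# One gross outlier barely moves the robust versions,
# while the ordinary mean is dragged far off.
data = np.array([1.0, 2.0, 2.5, 3.0, 3.5, 4.0, 100.0])
```

With `alpha=0.2` and seven observations, one point is trimmed (or Winsorized) at each end, so both robust means ignore the outlier at 100.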

Journal ArticleDOI
TL;DR: In this paper, two very closely related definitions of robustness of a sequence of estimators are given which take into account the types of deviations from parametric models that occur in practice.
Abstract: Two very closely related definitions of robustness of a sequence of estimators are given which take into account the types of deviations from parametric models that occur in practice. These definitions utilize the properties of the Prokhorov distance between probability distributions. It is proved that weak $^\ast$-continuous functionals on the space of probability distributions define robust sequences of estimators (in either sense). The concept of the "breakdown point" of a sequence of estimators is defined, and some examples are given.

949 citations
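The breakdown point defined in this paper can be demonstrated numerically: the sample mean breaks down under a single gross outlier, while the median stays near the truth until just under half the sample is replaced. An illustrative sketch (the data and contamination scheme are my own):

```python
import numpy as np

rng = np.random.default_rng(0)
clean = rng.normal(loc=0.0, scale=1.0, size=100)

def contaminate(x, frac, value=1e6):
    """Replace a fraction `frac` of the sample with gross outliers at `value`."""
    y = x.copy()
    k = int(frac * len(y))
    y[:k] = value
    return y

# The mean is ruined by 1% contamination; the median resists up to
# (but not including) 50% -- its breakdown point is 1/2.
for frac in (0.01, 0.25, 0.49):
    y = contaminate(clean, frac)
    print(frac, np.mean(y), np.median(y))
```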

Book
01 Jan 1968

515 citations

Book
21 Jul 1972
TL;DR: A systematic study of the behavior of a large collection of robust location estimators under a wide variety of assumed underlying conditions, undertaken to establish how estimators, their properties, and those conditions interact.
Abstract: Because estimation involves inferring information about an unknown quantity on the basis of available data, the selection of an estimator is influenced by its ability to perform well under the conditions that are assumed to underlie the data. Since these conditions are never known exactly, the estimators chosen must be robust; i.e., they must be able to perform well under a variety of underlying conditions. The theory of robust estimation is based on specified properties of specified estimators under specified conditions. This book was written as the result of a study undertaken to establish the interaction of these three components over as large a range as possible. Originally published in 1972.

460 citations


Cited by
Journal ArticleDOI

9,941 citations

Journal ArticleDOI
TL;DR: The bootstrap is extended to other measures of statistical accuracy such as bias and prediction error, and to complicated data structures such as time series, censored data, and regression models.
Abstract: This is a review of bootstrap methods, concentrating on basic ideas and applications rather than theoretical considerations. It begins with an exposition of the bootstrap estimate of standard error for one-sample situations. Several examples, some involving quite complicated statistical procedures, are given. The bootstrap is then extended to other measures of statistical accuracy such as bias and prediction error, and to complicated data structures such as time series, censored data, and regression models. Several more examples are presented illustrating these ideas. The last third of the paper deals mainly with bootstrap confidence intervals.

5,894 citations
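The bootstrap estimate of standard error described in the opening of this review can be sketched in a few lines. This is an illustrative implementation (function names and the test data are my own); for the sample mean it should agree closely with the textbook formula s / sqrt(n):

```python
import numpy as np

def bootstrap_se(x, stat=np.mean, n_boot=2000, seed=0):
    """Bootstrap standard error of `stat` for a one-sample problem:
    resample with replacement, recompute the statistic, and take the
    standard deviation of the replicates."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    reps = np.array([stat(rng.choice(x, size=len(x), replace=True))
                     for _ in range(n_boot)])
    return reps.std(ddof=1)

# Sanity check against the classical SE of the mean.
x = np.random.default_rng(1).normal(size=50)
se_boot = bootstrap_se(x)
se_formula = x.std(ddof=1) / np.sqrt(len(x))
```

The same `bootstrap_se` call works unchanged for statistics with no closed-form standard error (e.g. `stat=np.median`), which is the method's main appeal.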

Journal ArticleDOI
TL;DR: It is demonstrated that Ethernet LAN traffic is statistically self-similar, that none of the commonly used traffic models is able to capture this fractal-like behavior, and that such behavior has serious implications for the design, control, and analysis of high-speed, cell-based networks.
Abstract: Demonstrates that Ethernet LAN traffic is statistically self-similar, that none of the commonly used traffic models is able to capture this fractal-like behavior, that such behavior has serious implications for the design, control, and analysis of high-speed, cell-based networks, and that aggregating streams of such traffic typically intensifies the self-similarity ("burstiness") instead of smoothing it. These conclusions are supported by a rigorous statistical analysis of hundreds of millions of high quality Ethernet traffic measurements collected between 1989 and 1992, coupled with a discussion of the underlying mathematical and statistical properties of self-similarity and their relationship with actual network behavior. The authors also present traffic models based on self-similar stochastic processes that provide simple, accurate, and realistic descriptions of traffic scenarios expected during B-ISDN deployment.

5,567 citations
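The self-similarity ("burstiness") diagnosis can be illustrated with the aggregated-variance method, a standard tool in this literature: block-average the series at increasing scales and examine how the variance of the block means decays. In this sketch (the function and data are my own illustration, not the paper's code), short-range-dependent data give a log-log slope near -1, while self-similar data with Hurst parameter H > 1/2 decay more slowly, with slope 2H - 2:

```python
import numpy as np

def agg_var_slope(x, scales=(1, 2, 4, 8, 16, 32, 64)):
    """Slope of log(variance of block means) versus log(block size).
    Roughly -1 for short-range-dependent data; 2H - 2 (flatter) for
    self-similar data with Hurst parameter H > 1/2."""
    logs_m, logs_v = [], []
    for m in scales:
        k = len(x) // m
        blocks = x[:k * m].reshape(k, m).mean(axis=1)
        logs_m.append(np.log(m))
        logs_v.append(np.log(blocks.var()))
    return np.polyfit(logs_m, logs_v, 1)[0]

# i.i.d. noise is the short-range-dependent baseline: slope near -1.
noise = np.random.default_rng(42).normal(size=1 << 16)
slope = agg_var_slope(noise)
```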


Journal ArticleDOI
TL;DR: In this article, the sum in least-squares regression is replaced by the median of the squared residuals, yielding an estimator that resists the effect of nearly 50% contamination in the data; in the special case of simple regression, it corresponds to finding the narrowest strip covering half of the observations.
Abstract: Classical least squares regression consists of minimizing the sum of the squared residuals. Many authors have produced more robust versions of this estimator by replacing the square by something else, such as the absolute value. In this article a different approach is introduced in which the sum is replaced by the median of the squared residuals. The resulting estimator can resist the effect of nearly 50% of contamination in the data. In the special case of simple regression, it corresponds to finding the narrowest strip covering half of the observations. Generalizations are possible to multivariate location, orthogonal regression, and hypothesis testing in linear models.

3,713 citations
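The least-median-of-squares idea in this article — minimize the median rather than the sum of the squared residuals — has a non-convex criterion, so it is usually approximated by resampling. A minimal sketch for simple regression, drawing candidate lines through random pairs of points (the function name, trial count, and test data are my own):

```python
import numpy as np

def lms_line(x, y, n_trials=500, seed=0):
    """Approximate least-median-of-squares simple regression:
    keep the candidate line whose MEDIAN squared residual is smallest.
    Candidates are lines through random pairs of observations."""
    rng = np.random.default_rng(seed)
    best_a, best_b, best_crit = 0.0, 0.0, np.inf
    for _ in range(n_trials):
        i, j = rng.choice(len(x), size=2, replace=False)
        if x[i] == x[j]:
            continue  # vertical candidate line, skip
        b = (y[j] - y[i]) / (x[j] - x[i])
        a = y[i] - b * x[i]
        crit = np.median((y - (a + b * x)) ** 2)
        if crit < best_crit:
            best_a, best_b, best_crit = a, b, crit
    return best_a, best_b

# 14 of 20 points lie exactly on y = 2x + 1; 6 are gross outliers.
x = np.arange(20.0)
y = 2.0 * x + 1.0
y[:6] = 100.0
a, b = lms_line(x, y)  # recovers the clean line despite 30% contamination
```

With 30% contamination, ordinary least squares would be pulled far from the true line, while any candidate line through two clean points achieves a median squared residual of zero and is retained.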