Author

James R. Wallis

Other affiliations: National Academy of Sciences
Bio: James R. Wallis is an academic researcher from IBM. The author has contributed to research in topics: Quantile & Generalized extreme value distribution. The author has an h-index of 34 and has co-authored 54 publications receiving 12,571 citations. Previous affiliations of James R. Wallis include the National Academy of Sciences.


Papers
Monograph
TL;DR: In this monograph, the authors present an approach to regional frequency analysis based on L-moments: screening the data, identifying homogeneous regions, choosing a frequency distribution for each region, and estimating it with the regional L-moment algorithm.
Abstract: Preface 1. Regional frequency analysis 2. L-moments 3. Screening the data 4. Identification of homogeneous regions 5. Choice of a frequency distribution 6. Estimation of the frequency distribution 7. Performance of the regional L-moment algorithm 8. Other topics 9. Examples Appendix References Index of notation.
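
The regional L-moment algorithm summarized above rests on two computations: at-site sample L-moments obtained from probability weighted moments, and their record-length-weighted regional averages. The Python sketch below illustrates only those two steps; the function names and NumPy layout are assumptions rather than the book's notation, and each site's record is assumed to have at least four observations.

import numpy as np

def sample_lmoments(x):
    """First four sample L-moments via unbiased probability weighted moments."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    b3 = np.sum((j - 1) * (j - 2) * (j - 3) / ((n - 1) * (n - 2) * (n - 3)) * x) / n
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l1, l2 / l1, l3 / l2, l4 / l2   # mean, L-CV, L-skewness, L-kurtosis

def regional_average_ratios(samples):
    """Record-length-weighted regional averages of the at-site L-moment ratios."""
    stats = [sample_lmoments(x) for x in samples]
    n = np.array([len(x) for x in samples], dtype=float)
    w = n / n.sum()
    t, t3, t4 = (np.array([s[k] for s in stats]) for k in (1, 2, 3))
    return w @ t, w @ t3, w @ t4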

2,329 citations

Journal Article
TL;DR: In this paper, the authors use the method of probability-weighted moments to derive estimators of the parameters and quantiles of the generalized extreme-value distribution, and investigate the properties of these estimators in large samples via asymptotic theory, and in small and moderate samples, via computer simulation.
Abstract: We use the method of probability-weighted moments to derive estimators of the parameters and quantiles of the generalized extreme-value distribution. We investigate the properties of these estimators in large samples, via asymptotic theory, and in small and moderate samples, via computer simulation. Probability-weighted moment estimators have low variance and no severe bias, and they compare favorably with estimators obtained by the methods of maximum likelihood or sextiles. The method of probability-weighted moments also yields a convenient and powerful test of whether an extreme-value distribution is of Fisher-Tippett Type I, II, or III.
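
As a rough illustration of the estimators discussed above, the sketch below computes unbiased probability-weighted-moment estimates b0, b1, b2 from a sample and converts them to GEV location, scale, and shape using the approximation k = 7.8590c + 2.9554c^2 reported in the paper. The helper names are assumptions; the shape-parameter sign convention follows the paper's parameterization, in which k > 0 corresponds to a bounded upper tail.

import numpy as np
from math import gamma, log

def pwm_b(x, r):
    """Unbiased estimator b_r of beta_r = E[X F(X)^r]."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    w = np.ones(n)
    for i in range(1, r + 1):
        w *= (j - i) / (n - i)
    return np.sum(w * x) / n

def gev_from_pwm(x):
    """Location xi, scale alpha, shape k of the GEV fitted by probability weighted moments."""
    b0, b1, b2 = (pwm_b(x, r) for r in range(3))
    c = (2 * b1 - b0) / (3 * b2 - b0) - log(2) / log(3)
    k = 7.8590 * c + 2.9554 * c ** 2
    alpha = (2 * b1 - b0) * k / (gamma(1 + k) * (1 - 2 ** (-k)))
    xi = b0 + alpha * (gamma(1 + k) - 1) / k
    return xi, alpha, k

def gev_quantile(p, xi, alpha, k):
    """Quantile function x(F) = xi + alpha * (1 - (-log F)^k) / k, for k != 0."""
    return xi + alpha * (1 - (-np.log(p)) ** k) / k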

1,275 citations

Journal Article
TL;DR: Probability weighted moments are introduced and shown to be potentially useful for expressing the parameters of distributions whose inverse forms are explicitly defined, such as Tukey's lambda, whose parameters may be difficult to derive by more conventional means.
Abstract: Distributions whose inverse forms are explicitly defined, such as Tukey's lambda, may present problems in deriving their parameters by more conventional means. Probability weighted moments are introduced and shown to be potentially useful in expressing the parameters of these distributions.
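
A small worked illustration, not taken from the paper, of why probability weighted moments suit distributions with an explicit inverse form: the r-th PWM beta_r = E[X F(X)^r] equals the integral of x(F) F^r over 0 < F < 1, so it can be evaluated directly from the quantile function. The symmetric Tukey lambda quantile function is used below, with lambda = 0.14 chosen arbitrarily and a midpoint-rule grid standing in for exact integration.

import numpy as np

def tukey_lambda_quantile(F, lam):
    """Quantile function of the symmetric Tukey lambda distribution."""
    return (F ** lam - (1.0 - F) ** lam) / lam

def pwm_beta(quantile_fn, r, n_grid=200_000):
    """Approximate beta_r = integral_0^1 x(F) F^r dF by the midpoint rule."""
    F = (np.arange(n_grid) + 0.5) / n_grid
    return np.mean(quantile_fn(F) * F ** r)

# First three PWMs of a Tukey lambda distribution with lambda = 0.14
betas = [pwm_beta(lambda F: tukey_lambda_quantile(F, 0.14), r) for r in range(3)]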

1,147 citations

Journal Article
TL;DR: This paper introduces and summarizes a series of investigations on self-similar operational hydrology. As a replacement for current statistical hydrological models, self-similar models appear very promising; they account particularly well for the remarkable empirical observations of Harold Edwin Hurst.
Abstract: By ‘Noah Effect’ we designate the observation that extreme precipitation can be very extreme indeed, and by ‘Joseph Effect’ the finding that a long period of unusual (high or low) precipitation can be extremely long. Current models of statistical hydrology cannot account for either effect and must be superseded. As a replacement, ‘self-similar’ models appear very promising. They account particularly well for the remarkable empirical observations of Harold Edwin Hurst. The present paper introduces and summarizes a series of investigations on self-similar operational hydrology.

1,016 citations

Journal Article
TL;DR: In this article, the rescaled range R(t, s)/S(t, s) is shown to be a very robust statistic for testing the presence of noncyclic long run statistical dependence in records and, in cases where such dependence is present, for estimating its intensity.
Abstract: The rescaled range R(t, s) / S(t, s) is shown by extensive computer simulation to be a very robust statistic for testing the presence of noncyclic long run statistical dependence in records and, in cases where such dependence is present, for estimating its intensity. The processes examined in this paper extend to extraordinarily non-Gaussian processes with huge skewness and/or kurtosis (that is, third and/or fourth moments).
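
For concreteness, here is a minimal sketch of the rescaled-range computation described above and the usual log-log regression used to estimate its growth exponent. The window lengths and the use of non-overlapping windows are simplifying assumptions, not the paper's exact procedure; for an independent Gaussian record the fitted exponent should come out near 0.5.

import numpy as np

def rescaled_range(x):
    """R/S for one window: range of cumulative mean-deviations divided by the window's std."""
    x = np.asarray(x, dtype=float)
    dev = np.cumsum(x - x.mean())
    return (dev.max() - dev.min()) / x.std()

def rs_exponent(x, lags=(8, 16, 32, 64, 128, 256)):
    """Average R/S over non-overlapping windows of each length s, then fit log(R/S) on log(s)."""
    x = np.asarray(x, dtype=float)
    rs = [np.mean([rescaled_range(x[i:i + s])
                   for i in range(0, len(x) - s + 1, s)]) for s in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(rs), 1)
    return slope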

907 citations


Cited by
Book Chapter
TL;DR: This paper provides a concise overview of time series analysis in the time and frequency domains, with many references for further reading.
Abstract: Any series of observations ordered along a single dimension, such as time, may be thought of as a time series. The emphasis in time series analysis is on studying the dependence among observations at different points in time. What distinguishes time series analysis from general multivariate analysis is precisely the temporal order imposed on the observations. Many economic variables, such as GNP and its components, price indices, sales, and stock returns are observed over time. In addition to being interested in the contemporaneous relationships among such variables, we are often concerned with relationships between their current and past values, that is, relationships over time.

9,919 citations

Journal Article
TL;DR: This paper reviews the design and analysis of computer experiments, modelling deterministic code output as the realization of a stochastic process so that a cheap predictor can be fitted to the training runs, prediction uncertainty can be quantified, and designs (the choice of inputs) can be selected for efficient prediction.
Abstract: Many scientific phenomena are now investigated by complex computer models or codes. A computer experiment is a number of runs of the code with various inputs. A feature of many computer experiments is that the output is deterministic: rerunning the code with the same inputs gives identical observations. Often, the codes are computationally expensive to run, and a common objective of an experiment is to fit a cheaper predictor of the output to the data. Our approach is to model the deterministic output as the realization of a stochastic process, thereby providing a statistical basis for designing experiments (choosing the inputs) for efficient prediction. With this model, estimates of uncertainty of predictions are also available. Recent work in this area is reviewed, a number of applications are discussed, and we demonstrate our methodology with an example.
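
The sketch below shows the basic idea in miniature: treat deterministic code output as a realization of a stochastic process (here a zero-mean Gaussian process) and use the fitted process as a cheap predictor with an accompanying uncertainty estimate. The squared-exponential kernel, its fixed hyperparameters, the jitter term, and the toy "expensive code" are all assumptions for illustration.

import numpy as np

def sq_exp_kernel(A, B, length_scale=0.3, variance=1.0):
    """Squared-exponential covariance between two sets of input points."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / length_scale ** 2)

def gp_predict(X_train, y_train, X_new, jitter=1e-8):
    """Posterior mean and variance of a zero-mean GP interpolating the training runs."""
    K = sq_exp_kernel(X_train, X_train) + jitter * np.eye(len(X_train))
    Ks = sq_exp_kernel(X_new, X_train)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    cov = sq_exp_kernel(X_new, X_new) - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

def expensive_code(x):                     # stand-in for the real simulator
    return np.sin(6 * x[:, 0]) + x[:, 1] ** 2

rng = np.random.default_rng(0)
X_train = rng.random((20, 2))              # the experimental design (chosen inputs)
X_new = rng.random((5, 2))
mean, var = gp_predict(X_train, expensive_code(X_train), X_new)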

6,583 citations

Journal Article
TL;DR: It is demonstrated that Ethernet LAN traffic is statistically self-similar, that none of the commonly used traffic models is able to capture this fractal-like behavior, and that such behavior has serious implications for the design, control, and analysis of high-speed, cell-based networks.
Abstract: Demonstrates that Ethernet LAN traffic is statistically self-similar, that none of the commonly used traffic models is able to capture this fractal-like behavior, that such behavior has serious implications for the design, control, and analysis of high-speed, cell-based networks, and that aggregating streams of such traffic typically intensifies the self-similarity ("burstiness") instead of smoothing it. These conclusions are supported by a rigorous statistical analysis of hundreds of millions of high quality Ethernet traffic measurements collected between 1989 and 1992, coupled with a discussion of the underlying mathematical and statistical properties of self-similarity and their relationship with actual network behavior. The authors also present traffic models based on self-similar stochastic processes that provide simple, accurate, and realistic descriptions of traffic scenarios expected during B-ISDN deployment.
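
One simple diagnostic consistent with the analysis described above is the variance-time plot: for a self-similar count series the variance of the aggregated series decays like m^(2H-2) with H > 0.5, i.e. more slowly than the 1/m expected under short-range-dependent models. The sketch below is a generic illustration of that check, not the authors' code; the block sizes are arbitrary.

import numpy as np

def variance_time_exponent(counts, block_sizes=(1, 2, 4, 8, 16, 32, 64, 128)):
    """Estimate a Hurst-style exponent H from the decay of Var of the aggregated series."""
    counts = np.asarray(counts, dtype=float)
    variances = []
    for m in block_sizes:
        n_blocks = len(counts) // m
        block_means = counts[:n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        variances.append(block_means.var())
    slope, _ = np.polyfit(np.log(block_sizes), np.log(variances), 1)
    return 1.0 + slope / 2.0               # Var ~ m^(2H-2), so H = 1 + slope/2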

5,567 citations

Journal Article
TL;DR: A Bayesian calibration technique is presented that improves on the traditional calibrate-then-predict approach in two respects: the predictions allow for all sources of uncertainty, and they attempt to correct for any inadequacy of the model revealed by a discrepancy between the observed data and the model predictions from even the best-fitting parameter values.
Abstract: We consider prediction and uncertainty analysis for systems which are approximated using complex mathematical models. Such models, implemented as computer codes, are often generic in the sense that by a suitable choice of some of the model's input parameters the code can be used to predict the behaviour of the system in a variety of specific applications. However, in any specific application the values of necessary parameters may be unknown. In this case, physical observations of the system in the specific context are used to learn about the unknown parameters. The process of fitting the model to the observed data by adjusting the parameters is known as calibration. Calibration is typically effected by ad hoc fitting, and after calibration the model is used, with the fitted input values, to predict the future behaviour of the system. We present a Bayesian calibration technique which improves on this traditional approach in two respects. First, the predictions allow for all sources of uncertainty, including the remaining uncertainty over the fitted parameters. Second, they attempt to correct for any inadequacy of the model which is revealed by a discrepancy between the observed data and the model predictions from even the best-fitting parameter values. The method is illustrated by using data from a nuclear radiation release at Tomsk, and from a more complex simulated nuclear accident exercise.
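
The sketch below is a deliberately simplified illustration of the calibration-with-discrepancy idea: observations are modelled as eta(x, theta) + delta(x) + noise, and theta is inferred jointly with the discrepancy. Here delta is collapsed to a single constant bias d, whereas the paper uses a Gaussian-process discrepancy; the toy model eta, the flat priors, the noise level, and the random-walk step size are all assumptions. With these synthetic data the sampler should recover theta near 1.7 while d absorbs the added bias.

import numpy as np

rng = np.random.default_rng(0)

def eta(x, theta):                          # stand-in for the expensive computer model
    return theta * np.sin(x)

# Synthetic "field" observations generated from a slightly biased truth
x_obs = np.linspace(0, 3, 25)
y_obs = 1.7 * np.sin(x_obs) + 0.3 + rng.normal(0, 0.1, x_obs.size)

def log_post(theta, d, sigma=0.1):
    """Log posterior with flat priors on a bounded box and Gaussian observation error."""
    if not (0 < theta < 10 and -2 < d < 2):
        return -np.inf
    resid = y_obs - eta(x_obs, theta) - d
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

def metropolis(n_iter=20_000, step=0.05):
    """Random-walk Metropolis over (theta, d)."""
    theta, d = 1.0, 0.0
    lp = log_post(theta, d)
    samples = []
    for _ in range(n_iter):
        theta_p, d_p = theta + step * rng.normal(), d + step * rng.normal()
        lp_p = log_post(theta_p, d_p)
        if np.log(rng.random()) < lp_p - lp:
            theta, d, lp = theta_p, d_p, lp_p
        samples.append((theta, d))
    return np.array(samples)

posterior = metropolis()                    # columns: calibrated theta, estimated bias d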

3,745 citations