
Showing papers in "International Statistical Review in 1989"


Journal ArticleDOI
TL;DR: In this paper, the discrimination problem is defined as follows: a random variable Z, of observed value z, is distributed over some space (say, p-dimensional) either according to distribution F or according to distribution G. The problem is to decide, on the basis of z, which of the two distributions Z has.
Abstract: The discrimination problem (two-population case) may be defined as follows: a random variable Z, of observed value z, is distributed over some space (say, p-dimensional) either according to distribution F or according to distribution G. The problem is to decide, on the basis of z, which of the two distributions Z has.

2,520 citations
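The decision rule the abstract describes can be sketched as a likelihood-ratio comparison. The two normal densities below are illustrative assumptions, not taken from the paper:

```python
import math

def normal_pdf(z, mu, sigma):
    """Density of a N(mu, sigma^2) variable at z."""
    return math.exp(-0.5 * ((z - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def discriminate(z, f, g):
    """Assign z to 'F' if the likelihood ratio f(z)/g(z) exceeds 1, else 'G'."""
    return "F" if f(z) > g(z) else "G"

f = lambda z: normal_pdf(z, mu=0.0, sigma=1.0)   # distribution F (assumed)
g = lambda z: normal_pdf(z, mu=3.0, sigma=1.0)   # distribution G (assumed)

print(discriminate(0.2, f, g))  # close to F's mean -> "F"
print(discriminate(2.9, f, g))  # close to G's mean -> "G"
```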


Journal ArticleDOI
TL;DR: In this paper, the Markov point processes introduced by Ripley & Kelly are generalised by replacing fixed-range spatial interactions by interactions between neighbouring particles, where the neighbourhood relation may depend on the point configuration.
Abstract: Summary The Markov point processes introduced by Ripley & Kelly are generalised by replacing fixed-range spatial interactions by interactions between neighbouring particles, where the neighbourhood relation may depend on the point configuration. The corresponding Hammersley-Clifford characterisation theorem is proved. The results are used to construct new models for point processes, multitype point processes and random processes of geometrical objects. Monte Carlo simulations of these models can be generated by running spatial birth-and-death processes.

198 citations
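The fixed-range models being generalised here can be simulated by a birth-and-death Metropolis-Hastings chain. The sketch below samples the classical Strauss process (pairwise inhibition at a fixed range r) on the unit square; all parameter values are chosen for illustration only:

```python
import math
import random

def strauss_birth_death(beta=50.0, gamma=0.5, r=0.05, steps=20000, seed=0):
    """Birth-and-death Metropolis-Hastings sampler for a Strauss process on
    the unit square: density proportional to beta**n * gamma**s(x), where
    s(x) is the number of point pairs closer than r."""
    rng = random.Random(seed)
    pts = []

    def neighbours(u, conf):
        # number of points of conf within distance r of u
        return sum(1 for p in conf if math.dist(u, p) < r)

    for _ in range(steps):
        n = len(pts)
        if rng.random() < 0.5:                 # propose a birth
            u = (rng.random(), rng.random())
            lam = beta * gamma ** neighbours(u, pts)   # Papangelou intensity
            if rng.random() < min(1.0, lam / (n + 1)):
                pts.append(u)
        elif n > 0:                            # propose a death
            i = rng.randrange(n)
            rest = pts[:i] + pts[i + 1:]
            lam = beta * gamma ** neighbours(pts[i], rest)
            if rng.random() < min(1.0, n / lam):
                del pts[i]
    return pts

pts = strauss_birth_death()
print(len(pts))  # points in the sampled pattern tend to avoid lying within r of each other
```

With gamma < 1 the chain favours configurations with few close pairs; gamma = 1 reduces to a Poisson process with rate beta.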




Journal ArticleDOI
TL;DR: In this article, a comparative survey of the power-divergence family of statistics for discrete multivariate data is presented, focusing on Pearson's X2 statistic and the loglikelihood ratio statistic G2.
Abstract: Summary The importance of developing useful and appropriate statistical methods for analyzing discrete multivariate data is apparent from the enormous amount of attention this subject has commanded in the literature over the last thirty years. Central to these discussions has been Pearson's X2 statistic and the loglikelihood ratio statistic G2. Our review seeks to consolidate this fragmented literature and develop a unifying theme for much of this research. The traditional X2 and G2 statistics are viewed as members of the power-divergence family of statistics, and are linked through a single real-valued parameter. The principal areas covered in this comparative survey are small-sample comparisons of X2 and G2 under both classical (fixed-cells) assumptions and sparseness assumptions, efficiency comparisons, and various modifications to the test statistics (including parameter estimation for ungrouped data, data-dependent and overlapping cell boundaries, serially dependent data, and smoothing). Finally some future areas for research are discussed.

119 citations
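The single real-valued parameter linking X2 and G2 can be made concrete: in the Cressie-Read form below, the statistic reduces to Pearson's X2 at lambda = 1 and to G2 in the limit lambda -> 0. The cell counts are made up for illustration:

```python
import math

def power_divergence(obs, exp, lam):
    """Cressie-Read power-divergence statistic 2nI^lam.
    lam = 1 recovers Pearson's X^2; lam -> 0 gives the loglikelihood ratio G^2."""
    if abs(lam) < 1e-12:                      # continuous limit at lam = 0
        return 2.0 * sum(o * math.log(o / e) for o, e in zip(obs, exp) if o > 0)
    return (2.0 / (lam * (lam + 1.0))) * sum(
        o * ((o / e) ** lam - 1.0) for o, e in zip(obs, exp)
    )

obs = [35, 25, 20, 20]                        # hypothetical observed counts
exp = [25, 25, 25, 25]                        # expected counts under the model
x2 = power_divergence(obs, exp, 1.0)          # Pearson X^2
g2 = power_divergence(obs, exp, 0.0)          # loglikelihood ratio G^2
pearson = sum((o - e) ** 2 / e for o, e in zip(obs, exp))
print(x2, pearson, g2)                        # x2 equals the direct Pearson sum
```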


Journal ArticleDOI
TL;DR: In this paper, a simple expression for the volume of a tube about a two-dimensional manifold with boundary embedded in the unit sphere in R^n is derived, with applications to testing for a single harmonic of undetermined frequency and phase.
Abstract: The method suggested by Hotelling (1939) to test for a nonlinear parameter in a regression model is reviewed. Using the method of Weyl (1939), we derive a simple expression for the volume of a tube about a two-dimensional manifold with boundary embedded in the unit sphere in R^n. Applications to testing for a single harmonic of undetermined frequency and phase and to testing for a change-point in linear regression are discussed. Keywords: Differential geometry.

65 citations


Journal ArticleDOI
TL;DR: In this paper, the structure of exponential families of semimartingales is studied using the corresponding local characteristics, and general types of exponential families of semimartingales, continuous as well as with jumps, are established.
Abstract: Summary The structure of exponential families of semimartingales is studied using the corresponding local characteristics. General types of exponential families of semimartingales, continuous as well as processes with jumps, are established, which cover the examples given in the literature. It is demonstrated how the martingale properties of the corresponding score functions imply asymptotic statistical results. Some illustrative examples are given. Processes with independent stationary increments are considered in detail. The necessary semimartingale theory is reviewed in an appendix.

43 citations



Journal ArticleDOI
TL;DR: A unifying account of the 'discrimination approach' to convergence rate problems is developed, and two basic theorems from which optimal achievable convergence rates may be easily deduced in a variety of circumstances are given.
Abstract: This paper aims to introduce non-experts to the results and techniques of nonparametric convergence rates. It opens by bridging the gap between familiar Cramer-Rao theory for parametric problems, and nonparametric minimax theory. This is done by putting parametric convergence rates into a minimax setting, and by solving nonparametric problems via the Cramer-Rao bound. Then we develop a unifying account of the 'discrimination approach' to convergence rate problems. We give two basic theorems from which optimal achievable convergence rates (both local and global) may be easily deduced in a variety of circumstances. Four examples illustrate use of the technique.

30 citations


Journal ArticleDOI
TL;DR: Analysis of the Gypsy moth data suggests that the addition of an enzyme to an environmentally safe, but not very potent, microbial control agent produces a mixture that is a more effective toxicant for the gypsy moth than the microbial agent used alone.
Abstract: This paper is concerned with the modeling and analysis of data collected in a large experiment designed to study the mortality in gypsy moths exposed to a mixture of two toxicants and observed over three time periods. The stochastic survival model employed is based on a pertinent biological model that describes the mode of action of synergism between the toxicants. Conditional probability of death in an interval, given survival up to that interval, is fitted by a binary response model with nested random effects added to fixed treatment effects. The random effects factors are used to account for intercorrelation and extravariation. Approximate maximum likelihood estimates of the parameters are evaluated by adapting the iteratively weighted least squares algorithm within GLIM. Results from the nested random effect model are compared with those from the quasi-likelihood procedure for overdispersed data. Analysis of the gypsy moth data suggests that the addition of an enzyme to an environmentally safe, but not very potent, microbial control agent produces a mixture that is a more effective toxicant for the gypsy moth than the microbial agent used alone.

29 citations
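The fixed-effects core of the fitting procedure is ordinary iteratively weighted least squares for a binomial logit model. The sketch below shows that step only, on made-up dose-mortality data; the paper's nested random effects and the GLIM adaptation are omitted:

```python
import math

def logistic_iwls(x, y, m, iters=25):
    """Fisher scoring / IWLS for a binomial logit model
    P(death) = logistic(b0 + b1*x), with m[i] trials and y[i] deaths.
    The paper nests random effects inside this loop; that part is omitted."""
    b0, b1 = 0.0, 0.0
    for _ in range(iters):
        g = [0.0, 0.0]
        H = [[0.0, 0.0], [0.0, 0.0]]
        for xi, yi, mi in zip(x, y, m):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
            w = mi * p * (1.0 - p)             # IWLS weight
            r = yi - mi * p                    # score contribution
            g[0] += r
            g[1] += r * xi
            H[0][0] += w
            H[0][1] += w * xi
            H[1][0] += w * xi
            H[1][1] += w * xi * xi
        det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
        b0 += (H[1][1] * g[0] - H[0][1] * g[1]) / det
        b1 += (-H[1][0] * g[0] + H[0][0] * g[1]) / det
    return b0, b1

# hypothetical dose-mortality data: dose level, deaths, group size
dose = [0.0, 1.0, 2.0, 3.0]
deaths = [2, 5, 12, 18]
size = [20, 20, 20, 20]
b0, b1 = logistic_iwls(dose, deaths, size)
print(b0, b1)  # b1 > 0: mortality increases with dose
```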


Journal ArticleDOI
TL;DR: In this paper, robustness with respect to prior distributions in the problem of testing a point null hypothesis is considered, and the relationship between P-values (or observed significance levels) and a Bayesian measure of evidence against the null hypothesis, expressed as the infimum of the posterior probability of the hypothesis, is discussed.
Abstract: Summary The robustness with respect to prior distributions in the problem of testing a point null hypothesis is considered. Of interest is the relationship between P-values (or observed significance levels) and a Bayesian measure of evidence against the null hypothesis expressed in terms of the infimum of the posterior probability of the null hypothesis. This infimum is taken over the class of all priors whose marginals on a given sub-σ-field are fixed. An asymptotic analysis is developed, so the results obtained can be applied approximately to a wide range of sampling models for moderate sample sizes. When the dimension of the parameter space is one, the P-value is much lower than the lowest posterior probability obtained even if all priors are allowed. This means that P-values and any Bayesian analysis of a null hypothesis in a space of dimension one are irreconcilable. However, for null
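The one-dimensional irreconcilability can be illustrated numerically for a point null about a normal mean. Over the class of all priors on the alternative, the minimised Bayes factor is exp(-z^2/2), so the infimum of the posterior probability (with prior mass 1/2 on the null) stays far above the two-sided P-value. This is a standard Berger-Sellke style computation, not the paper's own sub-σ-field class:

```python
import math

def p_value(z):
    """Two-sided P-value for a standard normal test statistic z."""
    return math.erfc(z / math.sqrt(2.0))

def posterior_lower_bound(z, pi0=0.5):
    """Infimum of P(H0 | x) over all priors on the alternative, for a point
    null about a normal mean: the minimised Bayes factor is exp(-z^2/2)."""
    b_min = math.exp(-z * z / 2.0)
    return 1.0 / (1.0 + ((1.0 - pi0) / pi0) / b_min)

for z in (1.645, 1.96, 2.576):
    # the Bayesian lower bound is always far above the P-value
    print(round(p_value(z), 3), round(posterior_lower_bound(z), 3))
```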

Journal ArticleDOI
TL;DR: In this paper, two stochastic models giving rise to the Yule distribution are proposed to explain and fit some observed surname frequency distributions, one based on a contagion hypothesis in the sense that the more occurrences a surname has had the more likely it is to have a further occurrence.
Abstract: The problem of determining how family names evolve preoccupies both statistics and human biology. The determination of a proper and well-justified probability model to describe the probability distribution of surnames has been confronted by many authors. In this paper two stochastic models giving rise to the Yule distribution are proposed to explain and fit some observed surname frequency distributions. The first is based on a contagion hypothesis in the sense that the more occurrences a surname has had the more likely it is to have a further occurrence. The second model is based on a weaker set of assumptions which also allows "immigration" of new surnames. The distribution that arises from these models is then fitted to actual data and the fit is compared to that provided by the discrete Pareto distribution.
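The contagion hypothesis can be simulated directly: each new bearer either founds a new surname or copies an existing one with probability proportional to its current frequency (a Yule-Simon scheme). The population size and innovation rate below are arbitrary:

```python
import random

def simulate_surnames(n_people=50000, alpha=0.2, seed=1):
    """Contagion / preferential-attachment sketch: each new person founds a
    new surname with probability alpha (the 'immigration' of names), or
    takes an existing surname with probability proportional to its current
    frequency. The limiting frequency distribution is Yule-like."""
    rng = random.Random(seed)
    bearers = []          # one entry per person, holding a surname id
    counts = {}
    for _ in range(n_people):
        if not bearers or rng.random() < alpha:
            name = len(counts)            # brand-new surname
        else:
            name = rng.choice(bearers)    # existing name, chosen by frequency
        bearers.append(name)
        counts[name] = counts.get(name, 0) + 1
    return counts

counts = simulate_surnames()
sizes = sorted(counts.values(), reverse=True)
print(len(counts), sizes[:5])  # many rare surnames, a few very common ones
```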

Journal ArticleDOI
TL;DR: In this article, a review of safety index methods, in which failure is described in terms of one or more limit states, is provided, and the last part of the exposition is devoted to parametric and omission sensitivity factors, used to study the sensitivity of the failure probability to variations of the parameters in the problem.
Abstract: Summary The review addresses the general problem of consistent evaluation of the safety of structures. An account of safety index methods, in which failure is described in terms of one or more limit states, is provided. Structural reliability methods are divided into levels according to the extent of information about the structural problem that is used. Emphasis has been put on level II and level III methods. Level II methods typically involve mean and variance supplemented with a measure of the correlation between the uncertain parameters whereas level III methods require the joint distribution of all uncertain parameters. The last part of the exposition is devoted to parametric and omission sensitivity factors. These measures are used to study the sensitivity of the failure probability to variations of the parameters in the problem. The omission sensitivity factor may be used to reduce the number of basic variables significantly.
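A minimal level II computation, assuming the simplest limit state g = R - S with independent normal resistance R and load S: the safety index beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2) and the first-order failure probability Phi(-beta). The numbers are illustrative:

```python
import math

def reliability_index(mu_R, sigma_R, mu_S, sigma_S):
    """Level II (mean/variance) sketch for the linear limit state g = R - S
    with independent normal R and S: safety index beta and the corresponding
    failure probability Phi(-beta)."""
    beta = (mu_R - mu_S) / math.sqrt(sigma_R ** 2 + sigma_S ** 2)
    p_f = 0.5 * math.erfc(beta / math.sqrt(2.0))   # Phi(-beta)
    return beta, p_f

# illustrative resistance and load moments
beta, p_f = reliability_index(mu_R=10.0, sigma_R=1.0, mu_S=6.0, sigma_S=1.5)
print(round(beta, 3), p_f)
```

A level III treatment would instead integrate the joint distribution of all uncertain parameters over the failure region.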

Journal ArticleDOI
TL;DR: The Gregory manuscript collection of the University of Edinburgh Library contains a treatise on chance written by John Arbuthnot (MS Dk.1.2. Fol. B [no. 19]), probably in 1694, containing two theorems, four problems and two types of significance tests.
Abstract: The Gregory manuscript collection held by the University of Edinburgh Library contains a treatise on chance written by John Arbuthnot (MS Dk.1.2. Fol. B [no. 19]), probably in 1694. The manuscript consists of two theorems and four problems in which either generalizations of results in Arbuthnot (1692) or anticipations of results in Arbuthnot (1710) are given. Two types of significance tests are given as applications to the results derived. One of the tests is the same form as that used in 1710; the other is a crude one-sample test of location. The mathematical content of the manuscript indicates that Arbuthnot was an able mathematician working on problems of current interest to probabilists.
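The 1710 test the abstract refers to is, in modern terms, a sign test: Arbuthnot observed that male christenings exceeded female ones in all 82 years of his London records and computed the chance probability (1/2)^82. A minimal version:

```python
from math import comb

def sign_test_tail(k, n):
    """P(at least k successes out of n) under the chance hypothesis p = 1/2,
    i.e. the upper tail of a Binomial(n, 1/2) distribution."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

# Arbuthnot's 1710 data: all 82 years showed a male excess.
print(sign_test_tail(82, 82))  # (1/2)^82, astronomically small
```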

Journal ArticleDOI
TL;DR: In this article, the authors propose a method for decomposing net change of population means and totals through time into components which are interpretable and lead to a better understanding of net change.
Abstract: Proposals are made for decomposing net change of population means and totals through time into components which are interpretable and lead to a better understanding of net change. The framework is extended to allow for non-stationary populations with births, deaths and changing characteristics of continuing members. Design and estimation issues are investigated and a number of numerical examples given.
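One natural decomposition of the kind described (the paper's actual components may differ) splits net change in a population total into continuing members' change, plus entries from births, minus exits from deaths. The toy values below are hypothetical:

```python
# unit -> value at each time point; 'c' dies between the occasions, 'd' is born
t1 = {"a": 10.0, "b": 7.0, "c": 3.0}
t2 = {"a": 12.0, "b": 6.0, "d": 5.0}

continuing = set(t1) & set(t2)
change_continuing = sum(t2[u] - t1[u] for u in continuing)   # changing characteristics
births = sum(v for u, v in t2.items() if u not in t1)        # new members' contribution
deaths = sum(v for u, v in t1.items() if u not in t2)        # departed members' contribution

net_change = sum(t2.values()) - sum(t1.values())
print(net_change, change_continuing + births - deaths)  # the components sum to net change
```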

Journal ArticleDOI
TL;DR: In this paper, modified F-tests are proposed for testing the presence of row and column effects in a two-way layout, assuming that observations follow a seasonal ARIMA process. The tests are illustrated with the airline series.
Abstract: Summary When analysing periodic data, there are not one but two time intervals to be considered. Often these intervals correspond to seasons and years. Specifically, one expects relationships to exist (a) between observations for successive seasons in a particular year, and (b) between the observations for the same season in successive years. Periodic data of this type may be tabulated in s rows due to s seasons and n columns due to n years, where s is the periodicity of the data. The usual two-way analysis of variance for examining the seasonal and annual effects may be highly misleading because of the correlation among observations within rows as well as within columns. In the present paper, we propose modified F-tests for testing the presence of row and column effects in a two-way layout assuming that observations follow a seasonal ARIMA process. A Gaussian approximation to the distribution of a linear combination of quadratic forms is proposed to obtain percentile values of the test statistics. The test is illustrated with the airline series.
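For reference, the statistics being modified are the classical two-way ANOVA F-ratios for an s x n season-by-year table. The sketch below computes only these naive fixed-effects ratios on hypothetical quarterly data; the paper's Gaussian approximation to their null distribution under seasonal-ARIMA correlation is not reproduced:

```python
def two_way_F(table):
    """Classical two-way ANOVA F statistics for row (season) and column
    (year) effects in an s x n table with one observation per cell.
    These ratios are misleading under within-row/column correlation,
    which is exactly what the paper's modified tests correct for."""
    s, n = len(table), len(table[0])
    grand = sum(sum(row) for row in table) / (s * n)
    row_m = [sum(row) / n for row in table]
    col_m = [sum(table[i][j] for i in range(s)) / s for j in range(n)]
    ss_row = n * sum((m - grand) ** 2 for m in row_m)
    ss_col = s * sum((m - grand) ** 2 for m in col_m)
    ss_err = sum(
        (table[i][j] - row_m[i] - col_m[j] + grand) ** 2
        for i in range(s) for j in range(n)
    )
    mse = ss_err / ((s - 1) * (n - 1))
    return (ss_row / (s - 1)) / mse, (ss_col / (n - 1)) / mse

# s = 4 quarters (rows) x n = 3 years (columns), made-up data
table = [
    [112, 118, 132],
    [121, 135, 148],
    [136, 148, 163],
    [104, 110, 122],
]
F_season, F_year = two_way_F(table)
print(round(F_season, 2), round(F_year, 2))
```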

Journal ArticleDOI
TL;DR: In their 1906 paper, R.H. Hooker and G.U. Yule presented an approach to the relative influence of two variables upon a third in a multiple regression context; analysis points toward regression coefficient times average of the independent variable ('level importance') as the intended concept.
Abstract: Summary In their 1906 J. R. Statist. Soc. paper, R.H. Hooker and G.U. Yule present an approach to the relative influence of two variables upon a third in a multiple regression context. The primary formulas suggest that by 'relative influence' Hooker and Yule meant just the ratio of (partial) regression coefficients. Inspection of the numerical example and its text description, however, suggests something different, but not clearly described. Analysis strongly points toward regression coefficient times average (of independent variable) as the intended concept of relative importance. This sometimes-called 'level importance' is favored by economists because of its proportionality to elasticity. Hooker himself returned to the topic in 1907 and created fresh puzzles.
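The two competing readings of 'relative influence' can be contrasted in a few lines. The coefficients and variable means below are made-up numbers, not Hooker and Yule's:

```python
# hypothetical fitted regression y = b0 + b1*x1 + b2*x2
b1, b2 = 0.8, 2.5               # partial regression coefficients (assumed)
x1_mean, x2_mean = 50.0, 4.0    # averages of the independent variables (assumed)

# reading 1: ratio of (partial) regression coefficients
ratio_of_coefficients = b1 / b2
# reading 2: 'level importance' -- coefficient times variable average
level_importance = (b1 * x1_mean) / (b2 * x2_mean)

print(ratio_of_coefficients, level_importance)  # the two readings disagree sharply
```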

Journal ArticleDOI
TL;DR: In this paper, a review of spectral analysis is offered; the authors note that some good results in the field may have been missed or not fully described, and that the review reflects only their own points of view.
Abstract: Summary In recent years more and more people in China have shown great interest in time series analysis, in both its theory and applications. This paper offers a review of spectral analysis, although some good results in this field may have been missed or not fully described by us. Section 1 presents some results in limit theory and Section 2 introduces some concepts and methods in spectral analysis. This paper reflects only our own points of view.