
Showing papers in "Journal of the Royal Statistical Society, Series B (Methodological)" in 1970


Journal ArticleDOI
TL;DR: In this article, a maximum-likelihood estimator based on the conditional distribution given minimal sufficient statistics for the incidental parameters is proposed, and it is proved that conditional maximum likelihood estimates in the regular case are consistent and asymptotically normally distributed.
Abstract: The problem of obtaining consistent estimates for structural parameters in the presence of infinitely many incidental parameters was discussed first by Neyman and Scott (1948). In this paper a maximum-likelihood method based on the conditional distribution given minimal sufficient statistics for the incidental parameters is suggested. It is proved that conditional maximum-likelihood estimates in the regular case are consistent and asymptotically normally distributed with a simple asymptotic variance. The efficiency problem of this new estimator is discussed in particular with respect to some situations with ancillary information.

751 citations
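The classic instance of the incidental-parameter problem this abstract refers to is the Neyman–Scott pairs example: each pair has its own mean, and the ordinary joint MLE of the common variance converges to half the true value, while the estimate based on the conditional distribution (here, equivalently, the within-pair differences, whose distribution is free of the incidental means) stays consistent. A minimal simulation sketch of that contrast (illustrative, not code from the paper):

```python
import random

random.seed(0)
n = 20000
sigma2 = 4.0          # true common variance
joint_mle = 0.0
cond_mle = 0.0
for _ in range(n):
    mu = random.uniform(-10, 10)           # incidental parameter, one per pair
    x1 = random.gauss(mu, sigma2 ** 0.5)
    x2 = random.gauss(mu, sigma2 ** 0.5)
    d = x1 - x2                            # distribution free of mu: N(0, 2*sigma2)
    joint_mle += d * d / 4                 # joint MLE contribution: sum of (x - xbar_i)^2 over 2n obs
    cond_mle += d * d / 2                  # conditional estimate based on d alone
joint_mle /= n
cond_mle /= n
print(joint_mle)      # converges to sigma2 / 2 = 2.0: inconsistent
print(cond_mle)       # converges to sigma2 = 4.0: consistent
```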


Journal ArticleDOI
TL;DR: In this article, the authors give modifications of eleven statistics, usually used for goodness of fit, so as to dispense with the usual tables of percentage points, and some test situations are illustrated, and formulae given for calculating significance levels.
Abstract: SUMMARY This paper gives modifications of eleven statistics, usually used for goodness of fit, so as to dispense with the usual tables of percentage points. Some test situations are illustrated, and formulae given for calculating significance levels.

478 citations
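The idea is to multiply a goodness-of-fit statistic by a simple function of n so that a single small set of percentage points serves all sample sizes. A sketch for the Kolmogorov–Smirnov statistic D under a fully specified null; the modification constants are the commonly quoted ones and should be checked against the paper before use:

```python
import random

random.seed(1)
n = 100
x = sorted(random.random() for _ in range(n))   # sample to test against U(0,1)

# Kolmogorov-Smirnov D for a fully specified null distribution:
# D = max(D+, D-), the largest gap between empirical and null CDF.
d_plus = max((i + 1) / n - xi for i, xi in enumerate(x))
d_minus = max(xi - i / n for i, xi in enumerate(x))
D = max(d_plus, d_minus)

# Stephens-style modification: one statistic whose percentage points
# no longer depend on n (constants quoted from memory).
T = D * (n ** 0.5 + 0.12 + 0.11 / n ** 0.5)
print(D, T)   # reject at the 5% level if T exceeds roughly 1.358
```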


Journal ArticleDOI
TL;DR: These methods indicate that, in many commonly encountered situations, objective methods of eliminating unwanted parameters from the likelihood function can be adopted, giving an alternative to the Bayesian approach for interpreting multiparameter likelihoods.
Abstract: [Read before the ROYAL STATISTICAL SOCIETY at a meeting organized by the RESEARCH SECTION on Wednesday, March 11th, 1970, Professor J. DURBIN in the Chair] SUMMARY Likelihood methods of dealing with some multiparameter problems are introduced and exemplified. Specifically, methods of eliminating nuisance parameters from the likelihood function so that inferences can be made about the parameters of interest are considered. In this regard integrated likelihoods, maximum relative likelihoods, conditional likelihoods, marginal likelihoods and second-order likelihoods are introduced and their uses illustrated in examples. Marginal and conditional likelihoods are dependent upon factorings of the likelihood function. They are applied to the linear functional relationship and to related models and are found to give intuitively appealing results. These methods indicate that in many situations commonly encountered objective methods of eliminating unwanted parameters from the likelihood function can be adopted. This gives an alternative method of interpreting multiparameter likelihoods to that offered by the Bayesian approach.

347 citations
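A small worked instance of eliminating a nuisance parameter: for a normal sample, a marginal likelihood for σ² based on the residuals (which removes the mean μ) peaks at the estimate with divisor n − 1, rather than the divisor n given by ordinary maximum likelihood. A numerical sketch under that standard setup (not one of the paper's own examples):

```python
import math
import random

random.seed(2)
n = 50
x = [random.gauss(5.0, 3.0) for _ in range(n)]
xbar = sum(x) / n
S = sum((xi - xbar) ** 2 for xi in x)   # residual sum of squares

# Ordinary (profile) maximum-likelihood estimate of sigma^2:
prof = S / n

# Marginal log-likelihood of sigma^2 based on the residuals alone
# (mu eliminated); up to a constant:
#   log L_m(s2) = -(n - 1)/2 * log(s2) - S / (2 * s2)
def log_marg(s2):
    return -(n - 1) / 2 * math.log(s2) - S / (2 * s2)

# Maximize on a fine grid; the maximizer sits at S / (n - 1).
grid = [0.01 * k for k in range(1, 3000)]
marg = max(grid, key=log_marg)

print(prof, marg, S / (n - 1))
```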



Journal ArticleDOI
TL;DR: Using a combined distribution containing the component models as special cases, statistics are developed for testing for departures from one model in the direction of another and for testing the hypothesis that all models fit the data equally well.
Abstract: [Read before the ROYAL STATISTICAL SOCIETY at a meeting organized by the RESEARCH SECTION on Wednesday, May 13th, 1970, Professor J. DURBIN in the Chair] SUMMARY It is desired to determine which of several alternative models adequately describe the data. The properties of a combined distribution containing the component models as special cases are investigated. Using this distribution, statistics are developed for testing for departures from one model in the direction of another and for testing the hypothesis that all models fit the data equally well. The relationship with other procedures is investigated. Examples are given of the use of the method, especially when there are two component models belonging to separate parametric families.

259 citations


Journal ArticleDOI
TL;DR: In this article, it is argued that even in the simple cases for which explicit analytical solutions can be found, those solutions are often too complicated to be of practical use, and that this criticism can be met in part by analysing situations where robust approximations exist, such as heavy traffic; inequalities then quantify how accurately the approximations represent the true solution.
Abstract: It is a fair criticism of the theory of queues as it has developed through the years that, even in the simple cases for which explicit analytical solutions can be found, these solutions are often too complicated to be of practical use. It has been argued elsewhere (Kingman, 1966) that the criticism is to be met to some degree by the analysis of situations where robust approximations exist, such as that of "heavy traffic". It is, however, important to know how accurately such approximations represent the true solution, and the significance of inequalities for the various quantities of interest thus becomes apparent. To be useful, an inequality must have two properties which are to some extent incompatible with one another. If θ is some quantity not perhaps easy to calculate, the inequality θ ≤ θ′ will not be significant unless there is reason to hope that, in an appropriate sense, θ′ is reasonably close to θ. This can of course best be determined if the inequality is one of a pair

244 citations
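A well-known inequality in this spirit is Kingman's upper bound on the mean GI/G/1 waiting time, E[W] ≤ λ(σ_a² + σ_s²)/(2(1 − ρ)), which becomes tight in heavy traffic. A quick simulation check for an M/M/1 queue via the Lindley recursion (a sketch, not code from the paper):

```python
import random

random.seed(3)
lam, mu = 0.8, 1.0            # arrival and service rates
rho = lam / mu
n = 200000

# Lindley recursion: W_{k+1} = max(0, W_k + service - interarrival)
w = 0.0
total = 0.0
for _ in range(n):
    w = max(0.0, w + random.expovariate(mu) - random.expovariate(lam))
    total += w
mean_wait = total / n

# Kingman's upper bound: E[W] <= lam * (var_a + var_s) / (2 * (1 - rho)),
# with exponential interarrivals (var 1/lam^2) and services (var 1/mu^2).
bound = lam * (1 / lam ** 2 + 1 / mu ** 2) / (2 * (1 - rho))
exact = rho / (mu - lam)      # known M/M/1 mean wait, for reference
print(mean_wait, exact, bound)
```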



Journal ArticleDOI
TL;DR: In this article, the authors use the method of weighted least squares to estimate the time-dependent parameters of non-stationary time-series models, obtaining approximate expressions for the bias and variance of the estimates that can be used to assess the merits of various weight functions on a mean-square-error criterion.
Abstract: SUMMARY We use the method of weighted least squares to estimate the time-dependent parameters of non-stationary time-series models. Approximate expressions for the bias and variance of the estimates, which can be used to assess the merits of various weight functions on the basis of a mean-square-error criterion, are obtained. In Section 5, it is shown that the estimates obtained by the weighted least-squares method are the same as those obtained by a weighted maximum-likelihood procedure. The procedures are illustrated numerically in the last section.

191 citations
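For a local-level model with geometrically discounted weights, the weighted least-squares estimate of the time-dependent level reduces to an exponentially weighted moving average; heavier discounting lowers the bias (faster tracking) at the cost of variance, which is exactly the trade-off the abstract's bias/variance expressions are meant to assess. A minimal sketch, with the weight function and drifting parameter chosen purely for illustration:

```python
import math
import random

random.seed(4)
T = 500
beta = 0.95   # discount factor: weight beta^k on the observation k steps back

level_est = None
true_level = 0.0
for t in range(T):
    true_level = math.sin(2 * math.pi * t / 400)   # slowly drifting parameter
    y = true_level + random.gauss(0, 0.3)
    # Weighted least squares for a constant-level model with geometric
    # weights collapses to this exponentially weighted recursion:
    level_est = y if level_est is None else (1 - beta) * y + beta * level_est

print(level_est, true_level)   # estimate tracks the drift with a small lag bias
```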


Journal ArticleDOI
TL;DR: In this article, a solution to the problem of estimating the positions and times of the branch points of a Brownian-motion/Yule process, given the positions of all the particles at a particular time, is outlined.
Abstract: A solution to the problem of estimating the positions and times of the branch points of a Brownian-motion/Yule process, given the positions of all the particles at a particular time, is outlined. A likelihood approach is used, and it is shown that the solution involves maintaining a clear distinction between likelihood and conditional probability if difficulties over mathematical singularities are to be avoided. Several unsolved mathematical problems are encountered, and it is concluded that some simulation studies may be required for a complete solution. Questions of scientific inference which the problem raises are discussed briefly.

105 citations


Journal ArticleDOI
TL;DR: In this article, a metric on equivalence classes is proposed to measure the association of the rows and columns of a 2 x 2 table, generalized to an r x s table whose rows and columns are assumed unordered.
Abstract: SUMMARY The generalization of Edwards's argument for the measure of association of the rows and columns of a 2 x 2 table, to that of an r x s table whose rows and columns are assumed unordered, shows, not surprisingly, that association ought to be measured by some function of the (r - 1)(s - 1) cross-ratios. Such a function is suggested by the introduction of a metric on certain equivalence classes. The properties of such metrics are examined, and in particular comparisons are made with Good's suggestion of the use of the algebraic rank of the contingency table, and with Lindley's significance test for association in the r x s table.

95 citations
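The (r - 1)(s - 1) cross-ratios of an r x s table can be formed from its adjacent 2 x 2 sub-tables; independence corresponds to all of them equalling 1 (all log cross-ratios zero). A sketch that computes them and a simple Euclidean-type distance from independence (this distance is illustrative only, not the particular metric proposed in the paper):

```python
import math

# A 3x3 contingency table; rows and columns unordered.
N = [[30, 10, 5],
     [10, 20, 10],
     [5, 10, 25]]
r, s = len(N), len(N[0])

# The (r-1)(s-1) cross-ratios from adjacent 2x2 sub-tables:
#   alpha_ij = N[i][j] * N[i+1][j+1] / (N[i][j+1] * N[i+1][j])
log_cross_ratios = [
    math.log(N[i][j] * N[i + 1][j + 1] / (N[i][j + 1] * N[i + 1][j]))
    for i in range(r - 1) for j in range(s - 1)
]

# Independence <=> every log cross-ratio is 0, so a crude (unweighted)
# association measure is the Euclidean distance from that point.
assoc = math.sqrt(sum(v * v for v in log_cross_ratios))
print(len(log_cross_ratios), assoc)
```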


Journal ArticleDOI
TL;DR: In this article, the authors considered spectral analysis of data with randomly missing observations, and provided conditions under which consistent estimation of the spectral density function of the observed process is possible, and the results of a simulation study of its first two moments are presented.
Abstract: SUMMARY Spectral analysis of data with randomly missing observations is considered. Conditions are found under which consistent estimation of the spectral density function of the observed process is possible. An estimate is proposed and the results of a simulation study of its first two moments are presented. Missing data problems in spectral analysis were first considered by Jones (1962), who examined the case where a block of observations is periodically unobtainable. Parzen (1962) developed the theory of amplitude-modulated stationary processes, and applied this theory to missing data problems (Parzen, 1963), considering in detail the case where observations are missed in some periodic way. The amplitude-modulated series, Z(t), is constructed by replacing missing observations in the original series, X(t), by their mean value, which we suppose to be zero. Thus
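Under the amplitude-modulation construction, with missing values replaced by zero and points retained independently with probability p, the lag-k autocovariance of Z(t) is p² times that of X(t) for k ≠ 0, so dividing the lag-k sum by the number of pairs in which both points were actually observed restores consistency. A simulation sketch of that correction (Bernoulli missingness assumed purely for illustration):

```python
import random

random.seed(5)
n = 50000
phi = 0.6   # AR(1) coefficient; lag-1 autocovariance is phi / (1 - phi^2)
p = 0.7     # probability an observation survives

# Simulate the underlying AR(1) series X(t) with unit innovation variance.
x = [random.gauss(0, 1)]
for _ in range(n - 1):
    x.append(phi * x[-1] + random.gauss(0, 1))

# Amplitude modulation: missing values replaced by their mean (zero).
a = [1 if random.random() < p else 0 for _ in range(n)]
z = [ai * xi for ai, xi in zip(a, x)]

# The naive lag-1 autocovariance of Z underestimates that of X by p^2;
# dividing instead by the count of both-observed pairs is consistent.
k = 1
pairs = [(z[t] * z[t + k], a[t] * a[t + k]) for t in range(n - k)]
naive = sum(v for v, _ in pairs) / (n - k)
consistent = sum(v for v, _ in pairs) / sum(m for _, m in pairs)

true_cov = phi / (1 - phi ** 2)
print(naive, consistent, true_cov)
```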


Journal ArticleDOI
TL;DR: In this article, the authors developed a sampling procedure which for n > 4 and a linear trend has a smaller variance for the sample mean than 1 per stratum and for which an unbiased estimator of the variance is available.
Abstract: For populations arranged in natural order, say in increasing values of a concomitant variable, one common sampling scheme is to divide the population into strata and sample proportionately from each stratum. Variance of the sample mean is minimized (with the possible exception of finite corrections) if the population is divided into n strata and one unit selected from each. A second common procedure, particularly if the sampling is with unequal probabilities, is to sample systematically. It is well known that if the y characteristic is composed of a linear trend plus random elements the 1-per-stratum design is more efficient than systematic sampling of the population in natural order. The disadvantage of both of these sampling schemes is, of course, that no unbiased estimator of variance is available. In this paper we develop a sampling procedure which for n > 4 and a linear trend has a smaller variance for the sample mean than 1 per stratum and for which an unbiased estimator of the variance is available.
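The efficiency ordering claimed above for a linear trend is easy to check by simulation: with the population in natural order, taking one random unit per stratum averages out the trend across strata, whereas a systematic sample's single random start shifts every selected unit together and inherits the full trend variation. A sketch, with population size and noise level chosen purely for illustration:

```python
import random

random.seed(6)
n_strata, h = 10, 10   # population of 100 units in natural order
pop = [i + random.gauss(0, 1) for i in range(n_strata * h)]  # linear trend + noise
true_mean = sum(pop) / len(pop)

def one_per_stratum():
    # One unit drawn at random from each stratum of h consecutive units.
    return sum(pop[s * h + random.randrange(h)] for s in range(n_strata)) / n_strata

def systematic():
    # One random start r, then every h-th unit.
    r = random.randrange(h)
    return sum(pop[s * h + r] for s in range(n_strata)) / n_strata

reps = 20000
var_strat = sum((one_per_stratum() - true_mean) ** 2 for _ in range(reps)) / reps
var_sys = sum((systematic() - true_mean) ** 2 for _ in range(reps)) / reps
print(var_strat, var_sys)   # 1-per-stratum is far more efficient under a trend
```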