
Showing papers in "Journal of the American Statistical Association in 1977"


Journal ArticleDOI
TL;DR: In this paper, the authors discuss the restricted maximum likelihood (reml) approach of Patterson and Thompson (1971), which takes into account the loss in degrees of freedom resulting from estimating fixed effects, and iterative algorithms for computing ml and reml estimates of variance components.
Abstract: Recent developments promise to increase greatly the popularity of maximum likelihood (ml) as a technique for estimating variance components. Patterson and Thompson (1971) proposed a restricted maximum likelihood (reml) approach which takes into account the loss in degrees of freedom resulting from estimating fixed effects. Miller (1973) developed a satisfactory asymptotic theory for ml estimators of variance components. There are many iterative algorithms that can be considered for computing the ml or reml estimates. The computations on each iteration of these algorithms are those associated with computing estimates of fixed and random effects for given values of the variance components.

2,440 citations
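The iterative algorithms mentioned in the abstract all require evaluating the restricted likelihood at trial values of the variance components. A minimal sketch, assuming a one-way random-effects model and substituting a crude grid search for the paper's iterative schemes (the data and grid are hypothetical choices, not the paper's):

```python
import numpy as np

# Sketch: REML log-likelihood for the one-way random-effects model
# y_ij = mu + u_i + e_ij, u_i ~ N(0, s2u), e_ij ~ N(0, s2e).
rng = np.random.default_rng(0)
groups, per_group = 10, 5
u = rng.normal(0.0, 2.0, groups)                       # true s2u = 4
y = (u[:, None] + rng.normal(0.0, 1.0, (groups, per_group))).ravel()
Z = np.kron(np.eye(groups), np.ones((per_group, 1)))   # group indicator matrix
X = np.ones((y.size, 1))                               # fixed effect: intercept only

def reml_loglik(s2u, s2e):
    """REML log-likelihood, up to an additive constant."""
    V = s2u * Z @ Z.T + s2e * np.eye(y.size)
    Vi = np.linalg.inv(V)
    XtViX = X.T @ Vi @ X
    P = Vi - Vi @ X @ np.linalg.inv(XtViX) @ X.T @ Vi
    _, logdetV = np.linalg.slogdet(V)
    _, logdetXtViX = np.linalg.slogdet(XtViX)
    return -0.5 * (logdetV + logdetXtViX + y @ P @ y)

# Crude grid search standing in for an iterative algorithm
grid = np.linspace(0.1, 8.0, 80)
s2u_hat, s2e_hat = max(((a, b) for a in grid for b in grid),
                       key=lambda ab: reml_loglik(*ab))
```

Each evaluation mirrors the per-iteration cost the abstract refers to: forming V and solving the systems associated with estimating fixed and random effects at the candidate variance components.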



Journal ArticleDOI
TL;DR: In this paper, the authors show that Cox's method has full asymptotic efficiency under conditions which are likely to be satisfied in many realistic situations, and the connection of Cox's method with the Kaplan-Meier estimator of a survival curve is made explicit.
Abstract: D.R. Cox has suggested a simple method for the regression analysis of censored data. We carry out an information calculation which shows that Cox's method has full asymptotic efficiency under conditions which are likely to be satisfied in many realistic situations. The connection of Cox's method with the Kaplan-Meier estimator of a survival curve is made explicit.

1,149 citations
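Cox's method rests on a partial likelihood built from the risk set at each observed death time. A rough sketch with hypothetical data and no tied death times (this illustrates the estimating criterion only, not the paper's efficiency calculation):

```python
import numpy as np

# Hypothetical censored data: time, event indicator (1 = death), covariate x.
time  = np.array([5., 8., 12., 20., 33., 40.])
event = np.array([1, 1, 0, 1, 0, 1])
x     = np.array([0.5, -1.2, 0.3, 1.8, -0.7, 0.9])

def cox_partial_loglik(beta):
    """Cox partial log-likelihood for a single covariate, no tied deaths."""
    ll = 0.0
    for i in np.where(event == 1)[0]:
        risk = time >= time[i]              # risk set at this death time
        ll += beta * x[i] - np.log(np.sum(np.exp(beta * x[risk])))
    return ll

# The estimate maximizes the partial likelihood; here a simple grid suffices.
betas = np.linspace(-3, 3, 301)
beta_hat = betas[np.argmax([cox_partial_loglik(b) for b in betas])]
```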



Journal ArticleDOI
TL;DR: In this paper, the unconditional mean square error of prediction is used as a criterion for comparing stopping rules for the forward "stepwise" selection procedure in multivariate normal samples, based on simulations of 48 population correlation matrices.
Abstract: This paper uses the unconditional mean square error of prediction as a criterion for comparing stopping rules used with the forward “stepwise” selection procedure in multivariate normal samples, based on simulations of 48 population correlation matrices. The Cp statistic, “F to enter” (.15 < α < .25), a rule which minimizes the sample criterion, and one which sequentially tests the equality of the population criterion (.25 < α < .35) are superior. For these rules, the criterion seldom differs by more than three percent, although there are considerable differences between these and some of the other rules.

776 citations
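A minimal sketch of forward stepwise selection scored by Mallows' Cp, one of the superior stopping rules above; the synthetic data and the stop-when-Cp-worsens rule are illustrative assumptions, not the paper's simulation design:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 60, 6
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(size=n)   # only vars 0 and 2 matter

def rss(cols):
    """Residual sum of squares for a model with an intercept plus cols."""
    A = np.column_stack([np.ones(n)] + [X[:, j] for j in cols])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return r @ r

s2_full = rss(range(p)) / (n - p - 1)        # error variance from the full model

def cp(cols):
    """Mallows' Cp for the submodel with the given columns."""
    return rss(cols) / s2_full - n + 2 * (len(cols) + 1)

selected, best_cp = [], cp([])
while len(selected) < p:
    candidates = [c for c in range(p) if c not in selected]
    scores = {c: cp(selected + [c]) for c in candidates}
    c_best = min(scores, key=scores.get)
    if scores[c_best] >= best_cp:             # stop when Cp no longer improves
        break
    selected.append(c_best)
    best_cp = scores[c_best]
```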



Journal ArticleDOI
TL;DR: This second edition presents a systematic and comprehensive exposition of the methods used by technicians and research workers in dealing with demographic data, concerned with the ways data on population are gathered, classified, and treated to produce tabulations and various summarizing measures.
Abstract: This book is a thorough update of the original "Methods and Materials of Demography" (1976). Every chapter is new, written exclusively for this edition. Like the original "Red Book," the second edition presents a systematic and comprehensive exposition of the methods used by technicians and research workers in dealing with demographic data. It is concerned with the ways data on population are gathered, classified, and treated to produce tabulations and various summarizing measures that reveal the significant aspects of the composition and dynamics of populations. It also sets forth the sources, limitations, underlying definitions, and bases of classification, as well as the techniques and methods that have been developed for summarizing and analyzing the data. Format: Hardback. Pagination: 819. Price: £91.99 / $164.99 / €130.99. Publication date: 17 March 2004. ISBN: 9780126419559.

549 citations



Journal ArticleDOI
TL;DR: A number of analyses are presented to assess the effects of various covariates on the survival of patients in the Stanford Heart Transplantation Program and provide estimates of the relative risk of transplantation as well as significance tests.
Abstract: This paper presents a number of analyses to assess the effects of various covariates on the survival of patients in the Stanford Heart Transplantation Program. The data have been updated from previously published versions and include some additional covariates, such as measures of tissue typing. The methods used allow for simultaneous investigation of several covariates and provide estimates of the relative risk of transplantation as well as significance tests.

468 citations


Journal ArticleDOI
TL;DR: In this article, it is shown that the Wald test statistic decreases to zero as the distance between the parameter estimate and the null value increases, and that the power of the test, based on its large-sample distribution, decreases to the significance level for alternatives sufficiently far from the null value.
Abstract: For tests of a single parameter in the binomial logit model, Wald's test is shown to behave in an aberrant manner. In particular, the test statistic decreases to zero as the distance between the parameter estimate and null value increases, and the power of the test, based on its large-sample distribution, decreases to the significance level for alternatives sufficiently far from the null value.

457 citations
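The aberration is easy to reproduce in the simplest case, a single binomial proportion with logit parameter β: as the estimate moves farther from the null, the estimated standard error grows even faster, so the Wald statistic shrinks. A short numerical illustration (the sample size n = 100 is an arbitrary choice):

```python
import math

def wald_z(p_hat, n=100):
    """Wald statistic for H0: beta = 0 in the one-sample binomial logit model
    p = exp(beta) / (1 + exp(beta)); se(beta_hat) = 1/sqrt(n p (1-p))."""
    beta_hat = math.log(p_hat / (1.0 - p_hat))
    se = 1.0 / math.sqrt(n * p_hat * (1.0 - p_hat))
    return beta_hat / se

# beta_hat moves ever farther from 0, yet the statistic decreases.
zs = [wald_z(p) for p in (0.9, 0.99, 0.999)]
```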


Journal ArticleDOI
TL;DR: In this article, a method is given for estimating the effect of nonresponse in sample surveys based on Bayesian techniques, which produces a subjective probability interval for the statistic that would have been calculated if all nonrespondents had responded.
Abstract: A method is given for estimating, in a subjective sense, the effect of nonresponse in sample surveys. Based on Bayesian techniques, this method produces a subjective probability interval for the statistic that would have been calculated if all nonrespondents had responded. Background information which is recorded for both respondents and nonrespondents plays an important role in sharpening the subjective interval. Real survey data of 660 schools with 188 nonrespondents indicates that the method can be useful in practical problems. The general idea can be applied to any problem with nonrespondents or missing data.


Journal ArticleDOI
TL;DR: In this article, the authors show that many economic variables which are generally regarded as being strongly interrelated may with equal validity, based on recent empirical evidence, be regarded as independent or only weakly related.
Abstract: Extensions of time-series modeling procedures of Box and Jenkins [5] reveal that numerous economic variables which are generally regarded as being strongly interrelated may with equal validity, based on recent empirical evidence, be regarded as independent or only weakly related. Differences between these results and the bulk of econometric literature are attributed to the failure of the latter to satisfactorily account for autocorrelation. Due to limitations in the data, it is concluded that in many instances where economic relationships clearly do exist, their existence cannot reliably be ascertained by econometric or other empirical means.

Journal ArticleDOI
TL;DR: In this paper, a possible alternative to the hypothesis that the sequence X1, X2, …, Xn are iid N(ξ, σ2) random variables is considered, namely that at some unknown instant the expectation ξ shifts; the likelihood ratio test for this location-shift alternative is studied.
Abstract: A possible alternative to the hypothesis that the sequence X 1, X 2, …, Xn are iid N(ξ, σ2) random variables is that at some unknown instant the expectation ξ shifts. The likelihood ratio test for the alternative of a location shift is studied and its distribution under the null hypothesis found. Tables of standard fractiles are given, along with asymptotic results.
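A sketch of the statistic's core computation, assuming for simplicity that σ is known and equal to one (the paper treats the general case and tabulates fractiles); the simulated data are illustrative:

```python
import numpy as np

def shift_statistic(x):
    """Max over split points k of the standardized mean difference,
    the core of the likelihood ratio test for a shift in mean (sigma = 1)."""
    n = len(x)
    stats = []
    for k in range(1, n):
        d = x[:k].mean() - x[k:].mean()
        stats.append(np.sqrt(k * (n - k) / n) * abs(d))
    return max(stats), int(np.argmax(stats)) + 1   # statistic, estimated shift point

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0, 1, 30), rng.normal(2, 1, 30)])  # shift after 30
stat, k_hat = shift_statistic(x)
```

The test rejects the no-shift hypothesis for large values of the maximum; the maximizing k estimates the instant of the shift.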

Journal ArticleDOI
TL;DR: In this paper, the authors compare OREG with 56 alternatives which pull some or all of the estimated regression coefficients some or all of the way to zero, and show substantial improvements over OREG when collinearity effects are present, noncentrality in the original model is small, and selected true regression coefficients are small.
Abstract: Estimated regression coefficients and errors in these estimates are computed for 160 artificial data sets drawn from 160 normal linear models structured according to factorial designs. Ordinary multiple regression (OREG) is compared with 56 alternatives which pull some or all estimated regression coefficients some or all the way to zero. Substantial improvements over OREG are exhibited when collinearity effects are present, noncentrality in the original model is small, and selected true regression coefficients are small. Ridge regression emerges as an important tool, while a Bayesian extension of variable selection proves valuable when the true regression coefficients vary widely in importance.
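A minimal sketch contrasting OREG with ridge regression, one of the estimators that pulls coefficients part of the way to zero; the collinear synthetic data and penalty value are illustrative assumptions, not the paper's factorial design:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)          # nearly collinear with x1
X = np.column_stack([x1, x2])
y = x1 + x2 + rng.normal(size=n)

def ridge(X, y, lam):
    """Ridge estimator (X'X + lam*I)^{-1} X'y; lam = 0 recovers OREG."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

b_ols   = ridge(X, y, 0.0)
b_ridge = ridge(X, y, 5.0)
# Under collinearity the OLS coefficients are unstable; the ridge penalty
# shrinks the coefficient vector: ||b_ridge|| < ||b_ols||.
```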

Book ChapterDOI
TL;DR: In this paper, the marginalizing and conditioning methods for eliminating nuisance parameters are critically reviewed, and the marginalization and conditionality arguments are reexamined from the Bayesian point of view.
Abstract: Eliminating nuisance parameters from a model is universally recognized as a major problem of statistics. A surprisingly large number of elimination methods have been proposed by various writers on the topic. In this article we propose to critically review two such elimination methods. We shall be concerned with some particular cases of the marginalizing and the conditioning methods. The origin of these methods may be traced to the work of Sir Ronald A. Fisher. The contents of the marginalization and the conditionality arguments are then reexamined from the Bayesian point of view. This article should be regarded as a sequel to the author's three-part essay (Basu 1975) on statistical information and likelihood.

Journal ArticleDOI
TL;DR: A two-stage identification procedure is presented which involves fitting univariate time-series models to each series, and identifying a dynamic shock model relating the two univariate model innovation series.
Abstract: A methodology is introduced for identifying dynamic regression or distributed lag models relating two time series. First, specification of a bivariate time-series model is discussed, and its relationship to the usual dynamic regression model is indicated. Then, a two-stage identification procedure is presented which involves fitting univariate time-series models to each series, and identifying a dynamic shock model relating the two univariate model innovation series. The models obtained at these two stages are combined to identify a dynamic regression model, which may then be fitted in the usual ways. Two systems of economic time series illustrate the methodology.

Journal ArticleDOI
TL;DR: The authors investigated the power of two methodologies, the tests of Brown, Durbin and Evans [2] and variable parameter regression, to detect several varieties of instability in the coefficients of a linear regression model.
Abstract: This paper investigates the power of two methodologies, the tests of Brown, Durbin, and Evans [2] and variable parameter regression, to detect several varieties of instability in the coefficients of a linear regression model. The study reported by Khan [10] on the stability of the demand for money is replicated with variable parameter regression, and his results are in part questioned and in part sharpened.

Journal ArticleDOI
Robert V. Foutz1
TL;DR: In this article, a unique consistent solution to the likelihood equations is obtained as a consequence of the Inverse Function Theorem, which deals directly with the vector parameter case rather than with extending single parameter arguments.
Abstract: Existence of a unique consistent solution to the likelihood equations is obtained as a consequence of the Inverse Function Theorem. The approach differs from those already in the literature by showing existence and uniqueness in one argument. The argument deals directly with the vector parameter case rather than with extending single parameter arguments.

Journal ArticleDOI
TL;DR: In this article, the authors show that the strong consistency of the Kaplan-Meier estimator can be established by expressing it as an explicit function of two empirical subsurvival functions, which is the natural function that expresses the survival function in terms of the subsurvival functions.
Abstract: The Kaplan-Meier estimator for the survival function in the censored data problem can be expressed for finite samples as an explicit function of two empirical subsurvival functions. This function is the natural one that expresses the survival function in terms of the subsurvival functions. As an illustration of the usefulness of expressing the Kaplan-Meier estimator in this way, the strong consistency of the Kaplan-Meier estimator is easily proved.
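For reference, the familiar product-limit form of the Kaplan-Meier estimator, sketched for distinct event times with hypothetical data; the paper's algebraically equivalent representation via two empirical subsurvival functions is not reproduced here:

```python
import numpy as np

def kaplan_meier(time, event):
    """Product-limit estimator for distinct event times.
    Returns (death_times, S_hat at those times); event: 1 = death, 0 = censored."""
    order = np.argsort(time)
    time, event = np.asarray(time)[order], np.asarray(event)[order]
    n = len(time)
    s, times, surv = 1.0, [], []
    for i in range(n):
        if event[i] == 1:
            at_risk = n - i                 # subjects with time >= time[i]
            s *= 1.0 - 1.0 / at_risk        # multiply in this factor of the product
            times.append(time[i])
            surv.append(s)
    return times, surv

t, s = kaplan_meier([3, 5, 7, 2, 18, 16], [1, 0, 1, 1, 0, 1])
```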

Journal ArticleDOI
TL;DR: In this article, two new stepwise (step-up) multiple comparison significance tests are proposed which control the experimentwise type I error rate, and these tests are compared with existing step-down procedures.
Abstract: This article proposes two new stepwise multiple comparison significance tests which control the experimentwise type I error rate and compares these tests with some existing procedures. The new (step-up) tests begin by examining the gaps between adjacent ordered sample means, then the stretches of three adjacent means, etc., until the range is reached. This reverses a procedure (step-down) proposed by Ryan (1960). Tables for the new tests were constructed using improved Monte Carlo techniques. A simulation study showed that one of the new step-up tests and a modification of the Ryan procedure provided significantly greater power than the commonly used Tukey honestly significant difference test.

Journal ArticleDOI
TL;DR: Statistics in Medicine is a useful elementary text which gives careful attention to many of the statistical methods and issues medical students and physicians are confronted with in the current medical literature.

Journal ArticleDOI
TL;DR: In this paper, the authors consider the test RESET, which is intended to detect a nonzero mean of the disturbance in a linear regression model, and find that the power of the test may decline as the disturbance mean increases.
Abstract: This article considers the test RESET, which is intended to detect a nonzero mean of the disturbance in a linear regression model. Analysis of an approximation to the test statistic's distribution and Monte Carlo experiments reveal that the power of the test may decline as the size of the disturbance mean increases. However, the possibility is remote and declines with increasing sample size. Alternative sets of test variables are considered, and their effect on the power of the test is studied in Monte Carlo experiments. The best set seems to be composed of powers of the explanatory variables.

Journal ArticleDOI
TL;DR: In this article, it is shown that, in the case of the normal distribution, the maximum likelihood estimates of the second-order moments of the true slope and the true initial value are obtained by simple adjustments of the corresponding moments of estimated quantities.
Abstract: Given a series of consecutive measurements on a random sample of individuals, it is often of interest to investigate whether there exists a relationship between the rate of change and the initial value. Assuming that the observations deviate in a random manner from the true values, straightforward regression computations will yield biased results. It is shown that, in the case of the normal distribution, the maximum likelihood (ML) estimates of the second-order moments of the true slope and the true initial value are obtained by simple adjustments of the corresponding moments of the estimated quantities. An asymptotic formula for the standard error of the regression coefficient of slope on initial value is derived, and the methods are applied to longitudinal blood pressure data. The case with concomitant variables is discussed briefly.


Journal ArticleDOI
David S. Moore1
TL;DR: In this article, the generalized Wald's method is applied to give a quite general solution to the problem of constructing chi-squared type statistics for testing fit to a parametric family when unknown parameters are estimated in various ways.
Abstract: Wald's method constructs test statistics having chi-squared limiting distributions from estimators having nonsingular multivariate normal limiting distributions, using the inverse of the covariance matrix of the limiting distribution. This method is here extended to singular multivariate normal limiting distributions by use of generalized inverses. The generalized Wald's method is then applied to give a quite general solution to the problem of constructing chi-squared type statistics for testing fit to a parametric family when unknown parameters are estimated in various ways.
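A small instance of the construction: multinomial cell proportions have a singular limiting covariance matrix (the cells sum to one), yet a chi-squared statistic is obtained by replacing the inverse with a Moore-Penrose generalized inverse. The counts below are hypothetical; in this particular case the statistic coincides with Pearson's chi-squared:

```python
import numpy as np

p0 = np.array([0.2, 0.3, 0.5])               # hypothesized cell probabilities
counts = np.array([25, 27, 48])              # hypothetical observed counts
n = counts.sum()
p_hat = counts / n

# Covariance of sqrt(n)*(p_hat - p0) under H0: diag(p0) - p0 p0', a singular matrix.
Sigma = np.diag(p0) - np.outer(p0, p0)

# Generalized Wald statistic, asymptotically chi-squared with rank(Sigma) = 2 df.
W = n * (p_hat - p0) @ np.linalg.pinv(Sigma) @ (p_hat - p0)
```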

Journal ArticleDOI
TL;DR: For α = 0.05 and 0.01 and for a range of values of ν, the number of degrees of freedom, the tables give the points of Student's t distribution above which lie specified tail proportions, for k = 1(1)20 and for k = 7(1)20.
Abstract: For α = 0.05 and 0.01 and for a range of values of ν, the number of degrees of freedom, the tables give the points of Student's t distribution above which lie specified tail proportions, for k = 1(1)20 and for k = 7(1)20.

Journal ArticleDOI
TL;DR: In this article, the conclusiveness of a decision is assessed in terms of a conditional confidence coefficient Γ that has frequentist interpretability analogous to that of a traditional confidence coefficient.
Abstract: Procedures are given for assessing the conclusiveness of a decision in terms of a (chance) conditional confidence coefficient Γ that has frequentist interpretability analogous to that of a traditional confidence coefficient. Properties of such procedures are compared in terms of the distribution of Γ. This leads to recommendations on the choice of conditioning. Also, a methodology for estimating the confidence when it depends on unknown parameter values is given. The notions of confidence are not limited to interval estimation; examples are also given in hypothesis testing and selection problems and in nonparametric and sequential settings.

Journal ArticleDOI
TL;DR: In this paper, the generalized jackknife method is employed to reduce the asymptotic and small sample mean square error and bias of these estimators, and the procedure presented has the flexibility to afford the user a choice between bias reduction, variance reduction, or both.
Abstract: Estimation of the value of a density function at a point of continuity using a kernel-type estimator is discussed and improvements of the technique are presented. The generalized jackknife method is employed to reduce the asymptotic and small sample mean square error and bias of these estimators. The procedure presented has the flexibility to afford the user a choice between bias reduction, variance reduction, or both.
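One concrete instance of the bias-reduction idea is a Richardson-type combination of two bandwidths, which cancels the leading h² bias term of a second-order kernel estimator; the Gaussian kernel, the bandwidth pair (h, h/2), and the data below are illustrative assumptions, not the paper's full generalized-jackknife framework:

```python
import numpy as np

def kde(x, data, h):
    """Gaussian kernel density estimate at the point x."""
    z = (x - data) / h
    return np.mean(np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)) / h

def kde_jackknifed(x, data, h):
    """Combine bandwidths h and h/2: f_h ~ f + c h^2, f_{h/2} ~ f + c h^2/4,
    so (4 f_{h/2} - f_h)/3 cancels the leading bias term."""
    return (4.0 * kde(x, data, h / 2.0) - kde(x, data, h)) / 3.0

rng = np.random.default_rng(4)
data = rng.normal(size=500)
f0 = kde_jackknifed(0.0, data, h=0.5)   # estimate of the N(0,1) density at 0
```

The trade-off the abstract mentions is visible here: the combined estimator has smaller bias than either component but a larger variance than the wider-bandwidth one.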