
Showing papers in "Applied statistics in 1985"


Journal ArticleDOI
TL;DR: In this paper, a parametric mixture model is proposed to analyse failure-time data that are subject to censoring and multiple modes of failure; the hazard rate for each conditional distribution of time to failure, given type of failure, is modelled as the product of a piece-wise exponential function of time and a log-linear function of the covariates.
Abstract: SUMMARY A parametric mixture model provides a regression framework for analysing failure-time data that are subject to censoring and multiple modes of failure. The regression context allows us to adjust for concomitant variables and to assess their effects on the joint distribution of time and type of failure. The mixing parameters correspond to the marginal probabilities of the various failure types and are modelled as logistic functions of the covariates. The hazard rate for each conditional distribution of time to failure, given type of failure, is modelled as the product of a piece-wise exponential function of time and a log-linear function of the covariates. An EM algorithm facilitates the maximum likelihood analysis and illuminates the contributions of the censored observations. The methods are illustrated with data from a heart transplant study and are compared with a cause-specific hazard analysis. The proposed mixture model can also be used to analyse multivariate failure-time data.

195 citations
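The mixture structure described above (logistic mixing probabilities over failure types, cause-specific hazards formed as a baseline function of time scaled by a log-linear covariate term) can be sketched numerically. The function below is a minimal illustration, not the paper's implementation: it uses a single-piece (constant) baseline hazard per cause rather than the full piecewise-exponential baseline, and all parameter names are hypothetical.

```python
import math

def mixture_loglik_point(t, cause, x, alpha, beta0, lam0, beta):
    """Log-likelihood contribution of one observation under a simplified
    version of the mixture model. For illustration only: a one-piece
    (constant) baseline hazard stands in for the piecewise-exponential one.
    cause = k (0-based) for an observed failure of type k, or None if censored.
    alpha[k], beta0[k]: logistic coefficients for cause k's mixing probability.
    lam0[k], beta[k]:   baseline hazard and log-linear covariate effect.
    """
    K = len(lam0)
    # logistic (softmax) mixing probabilities pi_k(x)
    scores = [math.exp(alpha[k] + beta0[k] * x) for k in range(K)]
    total = sum(scores)
    pi = [s / total for s in scores]
    # cause-specific hazards: constant baseline times log-linear covariate term
    lam = [lam0[k] * math.exp(beta[k] * x) for k in range(K)]
    if cause is None:
        # censored observation contributes sum_k pi_k * S_k(t | x)
        return math.log(sum(pi[k] * math.exp(-lam[k] * t) for k in range(K)))
    # observed failure of type `cause` contributes pi_k * f_k(t | x)
    return math.log(pi[cause] * lam[cause] * math.exp(-lam[cause] * t))
```

Summing such contributions over the sample gives the observed-data log-likelihood that the paper's EM algorithm maximizes, with the censored terms showing exactly where the EM "missing cause" step enters.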


Journal ArticleDOI
TL;DR: In this article, a simple alternative to the proportional hazards model is considered, whereby the regression coefficients can vary with time, leading to a statistic which can be used for checking the assumption of proportional hazards.
Abstract: SUMMARY A simple alternative to the proportional hazards model is considered, whereby the regression coefficients can vary with time. It leads to a statistic which can be used for checking the assumption of proportional hazards. In the two-sample problem, this statistic is the same as Schoenfeld's (1980); a conservative version, computationally simple, is proposed in this case.

137 citations



Journal ArticleDOI
TL;DR: In this article, the authors consider the problem of choosing among multiple roots of the likelihood equation for a mixture of multivariate normal distributions, and demonstrate that the adoption of a homoscedastic normal model in the presence of some heteroscedasticity can considerably influence the likelihood estimates, in particular of the mixing proportions, and hence the consequent clustering of the sample at hand.
Abstract: SUMMARY We consider some of the problems associated with likelihood estimation in the context of a mixture of multivariate normal distributions. Unfortunately with mixture models, the likelihood equation usually has multiple roots and so there is the question of which root to choose. In the case of equal covariance matrices the choice of root is straightforward in the sense that the maximum likelihood estimator exists and is consistent. However, an example is presented to demonstrate that the adoption of a homoscedastic normal model in the presence of some heteroscedasticity can considerably influence the likelihood estimates, in particular of the mixing proportions, and hence the consequent clustering of the sample at hand.

64 citations
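The homoscedastic-versus-heteroscedastic contrast can be illustrated with a small EM fit. This is a minimal univariate two-component sketch, not the paper's multivariate analysis; the `equal_var` switch imposes or relaxes the common-covariance (here common-variance) assumption.

```python
import numpy as np

def em_mixture_1d(x, equal_var=True, iters=300):
    """EM for a two-component univariate normal mixture; a minimal sketch,
    not the paper's multivariate treatment. equal_var=True fits the
    homoscedastic model (pooled variance); equal_var=False lets each
    component have its own variance."""
    x = np.asarray(x, float)
    # deterministic initialisation: split the sample at its median
    lo, hi = x[x <= np.median(x)], x[x > np.median(x)]
    mu = np.array([lo.mean(), hi.mean()])
    var = np.array([x.var(), x.var()])
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: posterior responsibilities of each component for each point
        dens = pi / np.sqrt(2 * np.pi * var) * np.exp(-(x[:, None] - mu) ** 2 / (2 * var))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update mixing proportions, means, variances
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        resid2 = r * (x[:, None] - mu) ** 2
        if equal_var:
            var = np.full(2, resid2.sum() / len(x))  # pooled (homoscedastic)
        else:
            var = resid2.sum(axis=0) / nk            # component-specific
    return pi, mu, var
```

Fitting both versions to data with genuinely unequal component variances shows the effect the abstract warns about: the homoscedastic fit shifts the estimated mixing proportions relative to the heteroscedastic fit.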


Journal ArticleDOI
TL;DR: In this paper, a maximum likelihood procedure for estimating simultaneously the concentration parameter of the Dimroth-Watson distribution and either the length or the surface density for the structure is described and illustrated with the capillary data.
Abstract: SUMMARY Classical stereological methods are based on the assumption of isotropy, either of the structure under study or of the sectioning probe relative to the structure. Thus, with "distribution-free stereology", anisotropic structures can be studied only via isotropic uniform random probes, which are often difficult to generate or inefficient in practice. Here we present a parametric approach to model the direction of a randomly chosen line or surface element, whereby length and surface densities can be estimated by probes at fixed and known directions. For the material studied, namely skeletal muscle capillaries, the Dimroth-Watson axial distribution provides a good fit. A maximum likelihood procedure for estimating simultaneously the concentration parameter of the Dimroth-Watson distribution and either the length or the surface density for the structure is described and illustrated with the capillary data. When a particular axial model is known to be satisfactory, a short-cut estimation method of practical value can be used.

56 citations



Journal ArticleDOI
TL;DR: A notice of the book Experimental Design, Statistical Models, and Genetic Statistics.
Abstract: A notice of the book Experimental Design, Statistical Models, and Genetic Statistics.

38 citations




Journal ArticleDOI
TL;DR: In this article, conditions for the validity of the Hoel-Walburg and Peto tests are replaced by a more general condition, representativeness, and a large carcinogenicity experiment provided a unique opportunity to assess empirically whether this condition holds.
Abstract: SUMMARY Conditions for the validity of the Hoel-Walburg and Peto tests, which compare dose groups with respect to tumour prevalence, are replaced by a more general condition, representativeness. A large carcinogenicity experiment provided a unique opportunity to assess empirically if this condition holds. Though representativeness was generally violated, neither the Hoel-Walburg nor the Peto tests were seriously distorted. Analytic considerations suggest that such robustness can occur in many situations.

32 citations


Journal ArticleDOI
TL;DR: In this article, the structural equations model is used to estimate the linear relationship between new and standard measurement methods, and a comparison with regression analysis is made; the approach is illustrated by a study of a new method of measuring cardiac output, with data drawn from published reports.
Abstract: SUMMARY Methods for assessing a new measurement technique are examined. The structural equations model is used to estimate the linear relationship between new and standard methods. The delta method is used to find the variance of the estimated parameters. A comparison with regression analysis is made. This is illustrated by a study on a new method of measuring cardiac output with data drawn from published reports. The estimates from the different reports are combined to give an overall estimate of the relationship between the new and standard methods.
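The structural-equations idea, relating one error-prone measurement method to another, can be sketched with the classical errors-in-variables (Deming) slope, assuming the ratio of the two error variances is known. This is a simplified stand-in for the paper's estimator, and the delta-method variances are omitted.

```python
import numpy as np

def deming_fit(x, y, lam=1.0):
    """Errors-in-variables (structural/Deming) fit of y = a + b*x, with
    lam = (error variance of y) / (error variance of x) assumed known.
    A sketch of the structural-equations idea, not the paper's exact
    estimator; the paper's delta-method variance formulas are not
    reproduced here."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxx = np.var(x)
    syy = np.var(y)
    sxy = np.cov(x, y, bias=True)[0, 1]
    # closed-form slope for the structural model with known variance ratio
    b = (syy - lam * sxx + np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
    a = y.mean() - b * x.mean()
    return a, b
```

Unlike ordinary regression of new on standard, which attenuates the slope when the standard method is itself measured with error, this fit treats both methods symmetrically.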

Journal ArticleDOI
TL;DR: In this paper, multiplicative, additive, and exponential additive models of the relative risk in stratified epidemiologic studies are considered and an alternative is suggested to the additive model.
Abstract: Multiplicative, additive, and exponential additive models of the relative risk in stratified epidemiologic studies are considered. Parameter estimates of the additive model are shown to have poor statistical properties and an alternative is suggested. This model is generalized to allow each predictor in the model to have either a multiplicative or additive effect on the relative risk. Two examples are presented. One is a standard matched case‐control design and the second is a large prospective study analysed as a synthetic retrospective study. All models may be fitted by making slight changes in existing software.

Journal ArticleDOI
TL;DR: An algorithm "randomly" generates an n x n population correlation matrix from the class of all correlation matrices having the same specified eigenvalues, starting from a random orthogonal matrix drawn from the invariant Haar measure on the group of orthogonal matrices.
Abstract: Description and Purpose. Given a set of n eigenvalues, the algorithm "randomly" generates an n x n population correlation matrix from the class of all correlation matrices having these same eigenvalues. The generated population correlation matrices can be used for performing Monte Carlo simulations or sampling experiments; see e.g. Bendel and Afifi (1977) and Lin and Perlman (1985). Let D be a diagonal matrix whose diagonal elements are the specified eigenvalues. The algorithm first generates a random orthogonal matrix A from the invariant Haar measure on the group of orthogonal matrices (see Heiberger, 1978; Tanner and Thisted, 1982), so that C0 = ADA' is a random covariance matrix with eigenvalues given by D. Then the algorithm successively obtains n-1 orthogonal matrices Pi so that C = Pn-1 Pn-2 ... P2 P1 C0 P1' P2' ... Pn-2' Pn-1' is a correlation matrix having the same eigenvalues as C0 (see Bendel and Mickey, 1978). Each matrix Pi is an elementary rotation matrix.
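A sketch of the algorithm as described: generate a Haar-distributed orthogonal matrix, form a covariance matrix with the prescribed spectrum, then apply elementary rotations until every diagonal entry equals one. This follows the published outline but is an independent reimplementation, not the authors' code.

```python
import numpy as np

def random_correlation(eigs, rng):
    """Random correlation matrix with the given eigenvalues, in the spirit
    of Bendel and Mickey (1978). The eigenvalues must sum to n, since the
    trace of an n x n correlation matrix is n."""
    eigs = np.asarray(eigs, float)
    n = len(eigs)
    assert np.isclose(eigs.sum(), n)
    # Haar-distributed random orthogonal matrix via QR of a Gaussian matrix
    q, r = np.linalg.qr(rng.standard_normal((n, n)))
    q *= np.sign(np.diag(r))  # sign fix so Q follows the Haar measure
    c = q @ np.diag(eigs) @ q.T  # random covariance with the right spectrum
    # Elementary rotations: drive each diagonal entry to 1, one at a time,
    # without changing the eigenvalues (orthogonal similarity transforms).
    for _ in range(n - 1):
        d = np.diag(c)
        if np.allclose(d, 1.0):
            break
        i = int(np.argmin(d))  # diagonal entry below 1
        j = int(np.argmax(d))  # diagonal entry above 1
        a, b, cij = c[i, i], c[j, j], c[i, j]
        # rotation angle solving (b-1)t^2 + 2*cij*t + (a-1) = 0, t = tan(theta)
        t = (-cij + np.sqrt(cij ** 2 - (a - 1) * (b - 1))) / (b - 1)
        cs = 1.0 / np.sqrt(1 + t ** 2)
        sn = t * cs
        g = np.eye(n)
        g[i, i] = cs; g[i, j] = sn; g[j, i] = -sn; g[j, j] = cs
        c = g @ c @ g.T
    np.fill_diagonal(c, 1.0)  # clean up floating-point rounding
    return c
```

Each rotation acts only in the (i, j) plane, so diagonal entries already fixed at 1 are left untouched, and n-1 rotations suffice because the trace constraint forces the last entry to 1 automatically.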

Journal ArticleDOI
TL;DR: In this article, the treatment effect in a cross-over trial with a binary response is examined in the context of Gart's (1969) logistic regression model for matched pairs.
Abstract: SUMMARY The possible sources of information on the treatment effect in a cross-over trial with a binary response are examined. In the context of Gart's (1969) logistic regression model for matched pairs, proposals to improve upon Gart's conditional analysis are examined. The use of a general logistic regression model is considered.


Journal ArticleDOI
TL;DR: In this paper, an alternative model for estimating the centre and radius of the stone ring at Brogar is proposed, which is an approximate generalization of a model considered by Berman (1983); an extension of the new model is applied to the stone ring at Avebury and used to test an hypothesis of Thom, Thom and Foord (1976) concerning the original construction.
Abstract: This paper proposes an alternative model for estimating the centre and radius of the stone ring at Brogar, and shows how this is an approximate generalization of a model considered by Berman (1983). An extension of the new model is applied to the stone ring at Avebury, and is used to test an hypothesis of Thom, Thom and Foord (1976) concerning the original construction of the ring.
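For context, the simplest way to estimate a ring's centre and radius from stone coordinates is an algebraic least-squares circle fit. The sketch below uses the Kasa method, which is not the model proposed in the paper, only a baseline of the kind such models generalize.

```python
import numpy as np

def fit_circle(xs, ys):
    """Algebraic least-squares circle fit (Kasa method): minimizes the
    residuals of x^2 + y^2 + D*x + E*y + F ~ 0, which is linear in
    (D, E, F). A baseline estimator of centre and radius, not the
    paper's model."""
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    b = -(xs ** 2 + ys ** 2)
    (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = -D / 2.0, -E / 2.0          # centre from the linear coefficients
    r = np.sqrt(cx ** 2 + cy ** 2 - F)   # radius from the constant term
    return cx, cy, r
```

Because the problem is linear in (D, E, F), the fit is a single least-squares solve; models like the ones compared in the paper replace this purely geometric criterion with an explicit statistical error structure.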

Journal ArticleDOI
TL;DR: In this paper, balanced factorial designs are presented for experiments in two-dimensional blocks, constructed from sets of generalized cyclic blocks.
Abstract: Balanced factorial designs are presented for experiments in two-dimensional blocks, constructed from sets of generalized cyclic blocks.

Journal ArticleDOI
TL;DR: A review of Ensemble Modeling: Inference from Small-Scale Properties to Large-Scale Systems by Alan E. Gelfand and Crayton C. Walker (Marcel Dekker, 1984).
Abstract: Ensemble Modeling: Inference from Small-Scale Properties to Large-Scale Systems. By Alan E. Gelfand and Crayton C. Walker. New York: Marcel Dekker, 1984 (Volume 58 in the Statistics: Textbooks and Monographs series).

Journal ArticleDOI
TL;DR: A sequential batch acceptance sampling scheme is presented within the framework of decisive prediction as described for example in Aitchison and Dunsmore (1975) and provides intuitively satisfactory plans which are simple to operate.
Abstract: A sequential batch acceptance sampling scheme is presented within the framework of decisive prediction as described for example in Aitchison and Dunsmore (1975). The scheme considers a batch of N components and a sequential testing design is derived in which at each stage we can either scrap the batch, accept the batch or test another component. The testing involves measuring the (perhaps censored) lifetimes of M components simultaneously with an exponential model for lifetimes. The approach uses the Bayesian predictive distribution and a utility structure which depends on the lifetimes of the untested components. The sampling scheme is determined from the solution of a dynamic programming formulation and provides intuitively satisfactory plans which are simple to operate.

Journal ArticleDOI
TL;DR: The simulated small-sample behaviors of several estimators of a common hazard ratio are compared in this paper, where a modified version of the empirical logit (Haldane 1955; Anscombe 1956) seems to be the analytic technique of choice.
Abstract: The simulated small‐sample behaviours of several estimators of a common hazard ratio are compared. In most circumstances, a modified version of the empirical logit (Haldane 1955; Anscombe 1956) seems to be the analytic technique of choice. The performance of the Standardized Mortality Ratio (SMR) is indistinguishable from that of the ML estimator when the denominator experience is much larger than that of the numerator experience. When many cells have small expectations, then the empirical logit becomes biased, and a variant of the Mantel–Haenszel estimator (Mantel and Haenszel, 1959) is the most widely reliable non‐iterative estimator.
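A Mantel-Haenszel-type common rate-ratio estimator for stratified person-time data can be written in a few lines. This is the standard non-iterative form, offered as an illustration; the specific variant studied in the paper is not reproduced here.

```python
def mh_rate_ratio(strata):
    """Mantel-Haenszel common rate-ratio estimator over strata for
    person-time data; the standard non-iterative form, not the paper's
    exact variant. Each stratum is a tuple (d1, pt1, d0, pt0): deaths
    and person-time in the index and reference groups."""
    num = sum(d1 * pt0 / (pt1 + pt0) for d1, pt1, d0, pt0 in strata)
    den = sum(d0 * pt1 / (pt1 + pt0) for d1, pt1, d0, pt0 in strata)
    return num / den
```

Because each stratum enters as a simple weighted count, the estimator stays usable when many strata have small expected counts, which is exactly the regime the abstract identifies as troublesome for the empirical logit.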


Journal ArticleDOI
TL;DR: In this article, examples are given of the use of mean marks for a range of university subjects.
Abstract: Examples are given of the use of mean marks for a range of university subjects.





Journal ArticleDOI
TL;DR: In this article, updating formulae are presented for allocation based on individual Mahalanobis distances.
Abstract: Updating formulae are presented for allocation based on individual Mahalanobis distances.



Journal ArticleDOI
TL;DR: The statistical analysis of pharmacokinetic data shows clear trends in approaches to uptake by heterogeneous perfused organs, and basic principles underlying radioisotopic methods for the assay of biochemical processes in vivo are revealed.