
Showing papers in "Journal of Statistical Computation and Simulation in 1988"


Journal ArticleDOI
TL;DR: Compared to traditional methods of distribution fitting based on moment matching, percentile matching, L1 estimation, and L∞ estimation, the least-squares technique is seen to yield fits of similar accuracy and to converge more rapidly and reliably to a set of acceptable parameter estimates.
Abstract: To summarize a set of data by a distribution function in Johnson's translation system, we use a least-squares approach to parameter estimation wherein we seek to minimize the distance between the vector of "uniformized" order statistics and the corresponding vector of expected values. We use the software package FITTRI to apply this technique to three problems arising respectively in medicine, applied statistics, and civil engineering. Compared to traditional methods of distribution fitting based on moment matching, percentile matching, L1 estimation, and L∞ estimation, the least-squares technique is seen to yield fits of similar accuracy and to converge more rapidly and reliably to a set of acceptable parameter estimates.
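The least-squares idea described above is easy to sketch: choose the parameters that minimize the squared distance between the "uniformized" order statistics F(x_(i); θ) and their expected values i/(n+1). A minimal illustration, using scipy's Johnson SU family as a stand-in (the FITTRI package is not used here, and the simulated data, starting values, and optimizer choice are all assumptions of this sketch):

```python
# Least-squares distribution fitting in the spirit of the abstract: pick
# Johnson SU parameters so the "uniformized" order statistics F(x_(i); theta)
# are close to their expected values i/(n+1).
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)
x = np.sort(rng.normal(10.0, 2.0, size=200))   # sample to be summarized
n = len(x)
targets = np.arange(1, n + 1) / (n + 1)        # expected uniform order statistics

def loss(theta):
    a, b, loc, scale = theta
    if b <= 0 or scale <= 0:                   # stay inside the parameter space
        return np.inf
    u = stats.johnsonsu.cdf(x, a, b, loc=loc, scale=scale)
    return np.sum((u - targets) ** 2)

res = optimize.minimize(loss, x0=[0.0, 1.0, np.mean(x), np.std(x)],
                        method="Nelder-Mead")
print(res.x, res.fun)
```

Nelder-Mead is used only because it needs no derivatives; any minimizer that respects b > 0 and scale > 0 would do.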

403 citations


Journal ArticleDOI
TL;DR: In this article, an iterative imputation procedure based on the idea of a Markov chain is proposed, in which the incomplete values are filled in through sampling from their predictive distribution, a theoretically sound method of imputation.
Abstract: Broadly speaking, imputation means filling in incomplete values. A theoretically sound method is to impute the incomplete values through sampling from their predictive distribution. In this paper, an iterative imputation procedure, based on the idea of a Markov chain, is proposed. Examples are presented to illustrate its applications.
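The Markov-chain idea can be sketched in a few lines: repeatedly redraw the missing values from an estimated predictive distribution, refreshing the parameter estimates in between. The bivariate regression setup below is an illustrative assumption, not one of the paper's examples:

```python
# Gibbs-style iterative imputation sketch: missing values of y are repeatedly
# redrawn from the current estimate of the predictive distribution y | x,
# and the regression parameters are re-estimated between draws.
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
y = 2.0 + 3.0 * x + rng.normal(scale=0.5, size=n)
miss = rng.random(n) < 0.3                        # 30% of y is missing
y_obs = np.where(miss, np.nan, y)

y_fill = np.where(miss, np.nanmean(y_obs), y_obs)  # crude starting imputation
for _ in range(50):                                # Markov-chain iterations
    b1, b0 = np.polyfit(x, y_fill, 1)              # refresh parameter estimates
    sigma = (y_fill - (b0 + b1 * x)).std()
    # redraw the missing entries from the current predictive distribution
    y_fill[miss] = b0 + b1 * x[miss] + rng.normal(scale=sigma, size=miss.sum())

print(round(b0, 2), round(b1, 2))
```

After the chain settles, the filled-in data give parameter estimates close to the values used to simulate (intercept 2, slope 3).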

123 citations


Journal ArticleDOI
TL;DR: In this paper, the method in Krutchkoff (1988b) for one-way ANOVA is extended to the two-way situation and the K-ANOVA application program is discussed.
Abstract: The method in Krutchkoff (1988b) for one-way ANOVA is extended to the two-way situation. The K-ANOVA application program is discussed.

53 citations


Journal ArticleDOI
TL;DR: In this paper, the authors considered the probability distribution of the maximum of r statistics each distributed as the Studentized range of means calculated from c random samples of size n from normal populations.
Abstract: We consider the probability distribution of the maximum of r statistics, each distributed as the Studentized range of means calculated from c random samples of size n from normal populations. The rc samples are assumed to be mutually independent, and a common pooled-within-sample variance is used throughout. An algorithm using Gaussian quadrature and the secant method is presented for calculating the upper 100α percentage points of this distribution. We discuss the application in a new procedure for constructing multiple significance tests and simultaneous confidence limits for simple effects in multiway analysis of variance.

42 citations


Journal ArticleDOI
TL;DR: In this article, the authors present a procedure to determine the number of change points in a sequence of independent exponentially distributed variables, when there is no prior information concerning the (distribution of the) number, location or magnitude of the changes.
Abstract: The literature on change point problems concerns almost exclusively situations with at most one change. However, in practice it is often not certain that this is so, especially in long observation periods. There is an urgent need for methods to deal with such observations. We give a procedure to determine the number of change points in a sequence of independent exponentially distributed variables, when there is no prior information concerning the (distribution of the) number, location or magnitude of the changes. The procedure is based on partitioning of the likelihood according to a hierarchy of sub-hypotheses. It consists of a sequence of likelihood ratio tests of nested hypotheses corresponding to a decreasing number of change points. Here, we consider a maximum of two change points, but the procedure is easily generalized to a higher maximum. Since the distribution of the test statistics cannot, as yet, be derived analytically, the properties of the procedure were analyzed by Monte Carlo methods. Under the hypothes...
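The likelihood-ratio building block of such a procedure can be sketched for the simplest sub-hypothesis, a single change in the mean of exponential data, with the statistic profiled over all candidate split points. The simulated data and the rule of keeping at least two observations per segment are assumptions of this sketch, and critical values would have to come from Monte Carlo, as in the paper:

```python
# Likelihood-ratio statistic for one change point in independent exponential
# data: profile the exponential log-likelihood on each side of every candidate
# split k and maximize over k.
import numpy as np

def lr_one_change(x):
    """max over k of 2*[n log xbar - k log xbar1 - (n-k) log xbar2]."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    best, best_k = -np.inf, None
    for k in range(2, n - 1):              # keep a couple of points per segment
        m1, m2, m = x[:k].mean(), x[k:].mean(), x.mean()
        stat = 2.0 * (n * np.log(m) - k * np.log(m1) - (n - k) * np.log(m2))
        if stat > best:
            best, best_k = stat, k
    return best, best_k

rng = np.random.default_rng(2)
# mean jumps from 1 to 4 at observation 100
x = np.concatenate([rng.exponential(1.0, 100), rng.exponential(4.0, 100)])
stat, k = lr_one_change(x)
print(stat, k)
```

With a clear fourfold jump in the mean, the maximizing split lands near the true change point.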

22 citations


Journal ArticleDOI
TL;DR: In this article, Monte Carlo simulation is used to investigate the finite sample properties of maximum likelihood estimators of Weibull and lognormal parameters and quantiles from interval censored data.
Abstract: Interval censored data arise frequently in industrial life tests and other applications. Maximum likelihood estimation provides a convenient means for making inferences on important distribution properties like quantiles and failure probabilities. The asymptotic normal distribution of the maximum likelihood estimators provides a simple method of setting approximate confidence bounds on these quantiles. Inverting likelihood ratio tests is another alternative. This paper uses Monte Carlo simulation to investigate the finite sample properties of maximum likelihood estimators of Weibull and lognormal parameters and quantiles from interval censored data. We evaluate the accuracy of large sample one- and two-sided confidence bounds based on asymptotic normal theory and compare their accuracy (with respect to coverage probability) to those obtained by inverting likelihood ratio tests. Even though these procedures are asymptotically equivalent, our results show that the intervals based on inverting a likelihood r...
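The interval-censored likelihood itself is simple: each failure time is known only to lie in an interval (l_i, u_i], so the log-likelihood is the sum of log(F(u_i) − F(l_i)) terms, maximized numerically. A hedged sketch for the Weibull case (the inspection grid, sample size, and optimizer below are assumptions of this sketch, not the paper's simulation design):

```python
# Maximum-likelihood estimation from interval-censored Weibull data:
# each observation contributes log(F(hi) - F(lo)) to the log-likelihood.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(3)
t = rng.weibull(2.0, size=300) * 10.0       # true shape 2, true scale 10
grid = np.arange(0.0, 40.0, 2.0)            # assumed inspection times
lo = grid[np.searchsorted(grid, t, side="right") - 1]
hi = lo + 2.0                               # each failure observed in (lo, hi]

def negloglik(theta):
    shape, scale = np.exp(theta)            # log-parameterization keeps both > 0
    p = (stats.weibull_min.cdf(hi, shape, scale=scale)
         - stats.weibull_min.cdf(lo, shape, scale=scale))
    return -np.sum(np.log(np.clip(p, 1e-300, None)))

res = optimize.minimize(negloglik, x0=np.log([1.0, 5.0]), method="Nelder-Mead")
shape_hat, scale_hat = np.exp(res.x)
print(round(shape_hat, 2), round(scale_hat, 2))
```

With 300 observations and a fairly coarse grid, the estimates recover the true shape and scale reasonably well.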

22 citations


Journal ArticleDOI
TL;DR: In this article, the jackknife method of Quenouille (1949) is used to eliminate the first-order bias of an estimate of the median life in a family of life distributions introduced by Birnbaum and Saunders (1969a), which characterizes a certain type of fatigue crack.
Abstract: A family of life distributions was introduced by Birnbaum and Saunders (1969a). The model characterizes a certain type of fatigue crack. In a subsequent investigation, Birnbaum and Saunders (1969b) discuss the estimation of the median life and provide a simple consistent estimate that has a limiting distribution within the same class of distributions as the observations themselves. This estimate, however, overestimates the median life. In this investigation, the jackknife method of Quenouille (1949) is used to eliminate this first-order bias. The new estimate has the same limiting behavior as that of Birnbaum and Saunders. The mean square error of the jackknife estimate is obtained. A simulation study is reported, and a set of fatigue life data of Birnbaum and Saunders is also discussed.
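The first-order bias removal referred to here is the classical Quenouille jackknife: θ_J = n·θ̂ − (n−1)·(average of the leave-one-out estimates). The sketch below applies it to a generic estimator rather than to the Birnbaum-Saunders median-life estimate; the plug-in variance is used as the example because its first-order bias is removed exactly:

```python
# Quenouille's jackknife: combine the full-sample estimate with the
# leave-one-out estimates to cancel the O(1/n) bias term.
import numpy as np

def jackknife(x, estimator):
    x = np.asarray(x, dtype=float)
    n = len(x)
    theta = estimator(x)
    loo = np.array([estimator(np.delete(x, i)) for i in range(n)])
    return n * theta - (n - 1) * loo.mean()     # bias-corrected estimate

rng = np.random.default_rng(4)
x = rng.exponential(2.0, size=50)
plug_in_var = lambda s: s.var()                 # biased: E = (n-1)/n * sigma^2
print(round(jackknife(x, plug_in_var), 3), round(x.var(ddof=1), 3))
```

For the plug-in variance the jackknife reproduces the unbiased sample variance exactly, which makes it a convenient sanity check for the implementation.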

21 citations


Journal ArticleDOI
TL;DR: In this paper, two classes of estimators of the current intensity are proposed when the data are failure truncated, and their bias and mean squared error are evaluated for several values of β and for sample sizes of 5, 10, 20 and 40.
Abstract: The Weibull process, a nonhomogeneous Poisson process with power-law intensity function, is often used to model the failure times of complex systems which are repaired after failure. Point estimation of the value of the intensity function at the current time is frequently an important problem, and in this paper two classes of estimators of this quantity are proposed when the data are failure truncated. Several members of these classes are suggested as estimators. Expressions for the bias and the mean squared error of these estimators are derived and are evaluated for several values of β and for sample sizes of 5, 10, 20 and 40. Some estimators have smaller mean squared error than the conditional MLE for a wide range of the parameters.
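The setting can be mirrored with a small Monte Carlo: simulate a failure-truncated power-law process with intensity λ(t) = (β/θ)(t/θ)^(β−1), compute the usual MLE of the intensity at the last failure time, and estimate its bias and MSE. Only the standard MLE λ̂ = nβ̂/t_n is shown; the estimator classes proposed in the paper are not reproduced, and the parameter values below are arbitrary choices for illustration:

```python
# Monte Carlo evaluation of the intensity MLE for a failure-truncated
# power-law (Weibull) process with intensity (beta/theta)*(t/theta)**(beta-1).
import numpy as np

def simulate_times(n, beta, theta, rng):
    """Failure times: t_i = theta * (cumulative sum of unit exponentials)**(1/beta)."""
    e = rng.exponential(size=n).cumsum()
    return theta * e ** (1.0 / beta)

beta, theta, n = 2.0, 1.0, 10
rng = np.random.default_rng(5)
err = []
for _ in range(2000):
    t = simulate_times(n, beta, theta, rng)
    beta_hat = n / np.sum(np.log(t[-1] / t[:-1]))        # MLE of beta
    lam_hat = n * beta_hat / t[-1]                       # MLE of intensity at t_n
    lam_true = (beta / theta) * (t[-1] / theta) ** (beta - 1)
    err.append(lam_hat - lam_true)
err = np.array(err)
bias, mse = err.mean(), np.mean(err ** 2)
print(round(bias, 2), round(mse, 2))
```

The positive bias of the MLE at small n is exactly the kind of behavior that motivates the alternative estimator classes studied in the paper.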

20 citations


Journal ArticleDOI
TL;DR: In this paper, exact tables of the null distribution are presented for n = 12 to 18 for the Rho measure of association with respect to the Spearman's rank correlation coefficient.
Abstract: Spearman's rank correlation coefficient, Rho, is a widely used nonparametric measure of association. Complete, exact tables of the null distribution are calculated and presented for n = 12 to 18.
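For small n, the null distribution behind such tables can be obtained by brute force, enumerating all n! equally likely rank permutations. (The paper's tables reach n = 18, which requires far cleverer computation than this sketch; plain enumeration is only practical for small n.)

```python
# Exact null distribution of Spearman's Rho by enumerating all n! rank
# permutations; exact fractions avoid any rounding in the tail probabilities.
from fractions import Fraction
from itertools import permutations

def rho_null_distribution(n):
    base = list(range(1, n + 1))
    counts = {}
    for perm in permutations(base):
        d2 = sum((a - b) ** 2 for a, b in zip(base, perm))
        rho = 1 - Fraction(6 * d2, n * (n * n - 1))   # rho = 1 - 6*sum(d^2)/(n(n^2-1))
        counts[rho] = counts.get(rho, 0) + 1
    total = sum(counts.values())
    return {r: Fraction(c, total) for r, c in sorted(counts.items())}

dist = rho_null_distribution(5)
print(dist[Fraction(1)])   # P(rho = 1) = 1/120: only the identity permutation
```

Only the identity permutation gives Rho = 1 and only the full reversal gives Rho = −1, so both extreme probabilities are 1/n!.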

12 citations


Journal ArticleDOI
TL;DR: In this article, a modified Winsorized regression procedure for estimating parameters in linear regression models is proposed and it is shown that these estimators provide a close approximation to Tiku's MMLE.
Abstract: In this paper we propose a modified Winsorized regression procedure for estimating parameters in linear regression models. It is shown that these estimators provide a close approximation to Tiku's MMLE. For contaminated normal distributions, Monte Carlo studies indicate that the modified Winsorized estimators are more efficient than Winsorized regression estimators.

11 citations


Journal ArticleDOI
TL;DR: In this paper, the test statistics for two and three outliers are expanded to give more insight, and critical values, based on simulation, are given for these statistics.
Abstract: The work by Wilks (1963) is discussed, and the test statistics for two and three outliers are expanded to give more insight. Critical values, based on simulation, are given for the statistics for two and three outliers. Approximations for the critical values are also suggested.

Journal ArticleDOI
TL;DR: In this article, the authors developed a new sequential test based on the Wilcoxon-Mann-Whitney two-sample test for fixed sample size, which can be used against both one-sided and two-sided alternatives.
Abstract: Only a few sequential tests have been developed for the two-sample design common in clinical trials. By stochastic simulation, we have developed a new sequential test based on the Wilcoxon–Mann–Whitney two-sample test for fixed sample size. The development is based on a shift model of normal distributions. The resulting test can be used against both one-sided and two-sided alternatives. Formulas for calculating the parameters in the equations for the boundaries make it possible to choose significance level and power against specified alternatives. Simulation investigations of this sequential test show that it requires substantially fewer patients to reach a conclusion than the corresponding test for fixed sample size. Monte Carlo investigations of its robustness properties show that it is robust and approximately distribution free.

Journal ArticleDOI
TL;DR: In this article, a more general class of Monte Carlo estimators of orthant probabilities is developed, and methods for obtaining efficiency gains over Moran's (1984) procedure are discussed.
Abstract: The computation of orthant probabilities represents a difficult numerical problem for even modest dimensions. Moran (1984) proposed a Monte Carlo estimator of these quantities. In this paper a more general class of estimators is developed, and methods for obtaining efficiency gains over Moran's procedure are discussed.
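A crude version of the Monte Carlo estimator being improved upon: estimate P(X₁ > 0, …, X_p > 0) for a zero-mean multivariate normal by the fraction of simulated vectors landing in the positive orthant. The equicorrelated covariance below is an illustrative assumption with a known answer: for three variables with common correlation ρ, the orthant probability is 1/8 + (3/(4π))·arcsin ρ, which equals exactly 1/4 at ρ = 1/2. Moran's estimator and the paper's variance-reduced class are not reproduced.

```python
# Naive Monte Carlo orthant-probability estimator: the fraction of simulated
# multivariate-normal vectors with all coordinates positive.
import numpy as np

rng = np.random.default_rng(6)
p = 3
cov = np.full((p, p), 0.5) + 0.5 * np.eye(p)   # equicorrelated, rho = 0.5
x = rng.multivariate_normal(np.zeros(p), cov, size=200_000)
est = np.mean(np.all(x > 0, axis=1))
print(round(est, 3))                            # exact value is 1/4 here
```

The standard error of this naive estimator shrinks only like 1/√N, which is precisely why the efficiency gains studied in the paper matter.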

Journal ArticleDOI
TL;DR: In this article, the problem of estimating the parameters in a polychoric correlation model is considered, with the latent variables distributed according to a bivariate elliptical distribution, and algorithms for computing the maximum likelihood estimates, the minimum chi-square estimates and the modified minimum chi-square estimates are implemented.
Abstract: The problem of estimating the parameters in a polychoric correlation model is considered, with the latent variables distributed according to a bivariate elliptical distribution. Situations based on the bivariate t distribution and the bivariate contaminated normal distribution are studied in detail. Algorithms for computing the maximum likelihood estimates, the minimum chi-square estimates and the modified minimum chi-square estimates are implemented. Based on the results developed, simulation studies are conducted to investigate the robustness of the normality assumption.

Journal ArticleDOI
TL;DR: In this paper, the distributions of the likelihood ratio statistics for testing hypotheses about a p-variate normal distribution with mean vector µ and covariance matrix Σ, where Σc is a circular symmetric matrix, are obtained through the techniques of the inverse Mellin transform and the calculus of residues.
Abstract: This article deals with the distributions of the likelihood ratio statistics for testing the hypotheses (i) , (ii) , (iii) , (iv) , and (v) , in a p-variate normal distribution with mean vector µ and covariance matrix Σ, where Σc is a circular symmetric matrix and . The distributions are obtained through the techniques of the inverse Mellin transform and the calculus of residues. Results for small values of p are given in closed form.

Journal ArticleDOI
TL;DR: The study provides a complete and systematic graphical exposition of twenty–one existing influence measures and the resulting classification of these measures into five similarity classes greatly simplifies the influence diagnostics menu.
Abstract: Several influence measures have been developed for evaluating the effects of individual cases on parameter estimates, fitted values, and other least squares regression statistics. Cook and Weisberg (1982), Hocking (1983), and others feel that the average user of regression diagnostics would be overwhelmed and confused by the use of all such diagnostics. However, as Hocking (1983) points out, evidence from which to draw conclusions about the relative merits of existing influence measures is insufficient to make general recommendations about their use. This study provides a complete and systematic graphical exposition of twenty-one existing influence measures. The resulting classification of these measures into five similarity classes greatly simplifies the influence diagnostics menu. Recommendations based on the results of this analysis are made for the use of influence diagnostics.
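As a concrete instance of the genre being surveyed, here is the canonical influence measure, Cook's distance, computed from first principles via leave-one-out refits. (Which of the twenty-one measures fall into which similarity class is not reproduced here; the simulated data and the planted outlier are assumptions of this sketch.)

```python
# Cook's distance from first principles: how far do the fitted values move
# when case i is deleted, scaled by k*s^2?
import numpy as np

def cooks_distance(X, y):
    n, k = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    s2 = resid @ resid / (n - k)
    d = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i
        beta_i = np.linalg.lstsq(X[keep], y[keep], rcond=None)[0]
        shift = X @ (beta - beta_i)            # change in all fitted values
        d[i] = shift @ shift / (k * s2)
    return d

rng = np.random.default_rng(7)
X = np.column_stack([np.ones(30), rng.normal(size=30)])
y = 1.0 + 2.0 * X[:, 1] + rng.normal(scale=0.2, size=30)
y[0] += 5.0                                    # plant one influential outlier
d = cooks_distance(X, y)
print(int(np.argmax(d)))
```

The planted outlier dominates the diagnostic, which is exactly the behavior an influence measure is supposed to exhibit.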

Journal ArticleDOI
TL;DR: In this article, the properties of several jackknife-based estimators and a bootstrap estimator are investigated in the context of the quasi-likelihood functions, and it is shown that often confidence regions based on the usual quasilikelihood procedures are severely anticonservative, and cannot be trusted.
Abstract: The properties of several jackknife–based estimators and a bootstrap estimator are investigated in the context of the quasi–likelihood functions. It is shown that often confidence regions based on the usual quasi-likelihood procedures are severely anticonservative, and cannot be trusted. In contrast, a linear and modified linear jackknife have good confidence region properties, being reasonably robust to small sample size, high dispersion, outliers and leverage points.

Journal ArticleDOI
TL;DR: In this paper, asymptotic relative efficiencies are considered for a three factor mixed effects model with the assumption that the observations are realisations of a univariate normal distribution.
Abstract: A basic assumption in the analysis of variance is that the observations are realisations of a univariate normal distribution. However, if the observations are rounded, the question arises of how this affects the usual tests. Rayner, Dodds and Best (1986) considered a one factor fixed effects model with regard to simulated sizes and approximate asymptotic relative efficiencies. Here asymptotic relative efficiencies are considered for a three factor mixed effects model. The method generalises readily to other ANOVA models.

Journal ArticleDOI
TL;DR: In this paper, the authors used low-order polynomial approximators as control variates and applied them to the raw marginal moments and the variance matrix of the parameter estimators.
Abstract: Approximations and Monte Carlo methods for evaluating the statistical properties of estimators for nonlinear-model parameters are discussed in Swain and Schmeiser (1984), where low-order polynomial approximators are used as control variates. Here, these control variates are specialized and applied to the raw marginal moments and the variance matrix of the parameter estimators. As an example, a catalytic-reaction model is studied empirically, with emphasis on the relationship between the control weights and the variance reduction obtained. In this example, as well as in many others, the generalized variance of the estimator of the mean is reduced by orders of magnitude.

Journal ArticleDOI
TL;DR: In this article, the correlation between generated variates for the purpose of inducing dependence among the output of simulation runs is examined, and the concept of obtaining correlation via algorithms other than the inverse transformation is examined.
Abstract: Traditionally, exactness, numerical stability and speed are the three main criteria for evaluating algorithms for random variate generation. However, it is sometimes required that the algorithms provide correlation between generated variates for the purpose of inducing dependence among the output of simulation runs. The inverse transformation, which produces optimal correlation induction, often performs poorly in terms of the first three criteria. Algorithms based on composition, rejection, and special properties which often excel in terms of the first three criteria, tend to scramble the use of random numbers, causing many attempts at common random numbers, antithetic variates and external control variates to fail. The concept of obtaining correlation via algorithms other than the inverse transformation is examined here. To demonstrate feasibility, previously developed algorithms for Poisson and binomial random variate generation are modified to obtain both positive and negative correlation between runs....
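The optimal-correlation baseline mentioned above is easy to demonstrate: feeding the same uniforms through two inverse CDFs (common random numbers) gives strongly positive correlation between runs, while feeding U and 1−U (antithetic variates) gives strongly negative correlation. The Poisson means below are arbitrary illustrative choices; the modified composition/rejection generators studied in the paper are not reproduced.

```python
# Correlation induction via the inverse transformation: common random numbers
# give positive correlation across runs, antithetic variates give negative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
u = rng.random(100_000)
x = stats.poisson.ppf(u, mu=5.0)        # run 1
y = stats.poisson.ppf(u, mu=8.0)        # run 2, common random numbers
z = stats.poisson.ppf(1.0 - u, mu=8.0)  # run 2, antithetic variates
print(round(np.corrcoef(x, y)[0, 1], 2), round(np.corrcoef(x, z)[0, 1], 2))
```

The inverse transformation achieves near-maximal induced correlation precisely because both outputs are monotone functions of the same uniform stream; composition and rejection methods scramble that monotone link, which is the failure mode the abstract describes.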

Journal ArticleDOI
TL;DR: A large number of approximate methods have been proposed for the components-of-variance problem, and two of these methods have shown promise in the one factor components-of-variance problem as mentioned in this paper.
Abstract: There is no exact small sample solution for setting confidence intervals for the treatment component in the one factor components-of-variance problem, or for the problem of setting confidence intervals for the difference in means of two exponential distributions. A large number of approximate methods have been proposed for the components-of-variance problem. In a published study of nine of these methods, two have shown promise. The properties of these two, as well as a third method proposed by the authors, are investigated and shown to perform surprisingly well in the components-of-variance setting. The problem concerning the difference of two exponential means is mathematically similar to the components-of-variance problem, except that the parameter about which a confidence interval is to be built may take negative values. One may also wish to require a symmetry in the method so that the solution does not depend on the order in which the two samples are labelled. Adaptations of the above mentioned methods to the e...

Journal ArticleDOI
TL;DR: In this paper, the authors used recursive integration to calculate the percentage points of the distribution of Moran's statistic for sample sizes up to 20, and provided a table of the results.
Abstract: Burrows (1979) and Currie (1981) used a method based on recursive integration to calculate the percentage points of Greenwood's statistic for sample sizes up to 20. This paper presents a similar technique applied to the distribution of Moran's statistic and provides a table of the percentage points obtained.



Journal ArticleDOI
TL;DR: In this article, an algorithm suggested by Day for finding maximum likelihood estimates of the parameters of a mixture of two multivariate normal distributions with common covariance matrix, is used to generate critical values for a test of multivariate normality against the alternative that the distribution is a multivariate mixture of multi-normals.
Abstract: An algorithm suggested by Day for finding maximum-likelihood estimates of the parameters of a mixture of two multivariate normal distributions with common covariance matrix is used to generate critical values for a test of multivariate normality against the alternative that the distribution is a mixture of multivariate normals. The test is based on the maximum likelihood estimate of the generalized distance and the null distribution of this statistic is found by simulation. The power of the test is also investigated, and a suggestion is made as to how the test procedure and the estimation algorithm might be combined into a method of hierarchical cluster analysis.


Journal ArticleDOI
TL;DR: In this article, it is shown that the performance of the sample quadratic discriminant function (QDF) quickly deteriorates as the dimension p increases relative to the training sample sizes n1 and n2.
Abstract: The sample quadratic discriminant function (QDF) has been shown by Marks and Dunn (1974) to be superior to the linear discriminant function for two normal populations with unequal covariance matrices, provided the training sample sizes n1 and n2 are sufficiently large. However, the performance of the QDF quickly deteriorates as the dimension p increases relative to the sample sizes ni, i = 1, 2. The deterioration is principally due to poor estimates of the inverses of the covariance matrices. One method of combating this problem is to apply biased estimators of the inverses of the covariance matrices. In this paper we contrast the performance of the QDF with respect to several biased estimators and one unbiased estimator of these inverses. A shrinkage estimator proposed by Peck and Van Ness (1982) is found to yield superior performance over a wide range of configurations and training sample sizes.

Journal ArticleDOI
TL;DR: In this article, a method for approximating the p-value of a goodness-of-fit test developed by Foutz (1980) is proposed; it is based on a Monte Carlo method and is compared with the normal approximation and Franke and Jayachandran's (1983) approximation.
Abstract: We propose a method for approximating the p-value of a goodness-of-fit test developed by Foutz (1980). This approximation is based on a Monte Carlo method and is compared with the normal approximation and with Franke and Jayachandran's (1983) approximation. Two examples, including one bivariate case, are given.

Journal ArticleDOI
TL;DR: In this article, a modified one-sample quantile-quantile plot is proposed as an alternative to the ordinary quantilequantile plots and a modified correlation coefficient test for normality is introduced.
Abstract: A modified one-sample quantile-quantile plot is proposed as an alternative to the ordinary quantile-quantile plot. A modified correlation coefficient test for normality is introduced. The percentage points of the proposed test statistic are obtained by a simulation experiment.

Journal ArticleDOI
TL;DR: In this paper, a robust procedure for comparing two means when variances are unequal based on asymmetric Type-II censored samples and assuming normality for the censored samples is developed.
Abstract: Adopting a Bayesian method, we develop in this paper a robust procedure for comparing two means when variances are unequal, based on asymmetric Type-II censored samples and assuming normality for the censored samples. The posterior distribution of the difference e of the means is derived and approximated. A Bayesian interval for e is then obtained and used to make inference about e. Further, the effects of asymmetric censoring on the posterior probabilities are assessed. These results are finally applied to Lehmann's data and to Brownlee's data. The results show that the effects of asymmetric censoring are in general quite small.