
Showing papers in "Biometrical Journal in 1986"


Journal ArticleDOI
TL;DR: In this article, the authors discuss how two-sample t-tests behave when applied to data that may violate the classical statistical assumptions of independence, homoscedasticity and Gaussianity.
Abstract: This work discusses how two-sample t-tests behave when applied to data that may violate the classical statistical assumptions of independence, homoscedasticity and Gaussianity. The usual two-sample t-statistic based on a pooled variance estimate and the Welch-Aspin statistic are treated in detail. Practical “rules-of-thumb” are given along with their applications to various examples so that readers will easily be able to use such tests on their own data sets.

188 citations
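Both statistics treated in the abstract above are available in SciPy; the following is a minimal sketch on simulated heteroscedastic data (illustrative only — it does not reproduce the paper's rules-of-thumb):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Heteroscedastic samples: same mean, very different variances and sizes.
x = rng.normal(loc=0.0, scale=1.0, size=10)
y = rng.normal(loc=0.0, scale=3.0, size=40)

# Pooled-variance two-sample t-statistic (assumes equal variances).
t_pool, p_pool = stats.ttest_ind(x, y, equal_var=True)
# Welch-Aspin statistic (separate variance estimates, Satterthwaite df).
t_welch, p_welch = stats.ttest_ind(x, y, equal_var=False)
```

Under heteroscedasticity with unequal sample sizes the two P-values can differ noticeably, which is the situation the paper's rules-of-thumb address.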


Journal ArticleDOI
TL;DR: In this article, the authors present methods for calculating the variances of direct adjusted survival curves and of their differences, and compare the estimates for non-parametric (Kaplan-Meier), semi-parametric (Cox) and parametric (Weibull) models applied to censored exponential data.
Abstract: Several investigators have recently constructed survival curves adjusted for imbalances in prognostic factors by a method which we call direct adjustment. We present methods for calculating variances of these direct adjusted survival curves and their differences. Estimates of the adjusted curves, their variances, and the variances of their differences are compared for non-parametric (Kaplan-Meier), semi-parametric (Cox) and parametric (Weibull) models applied to censored exponential data. Semi-parametric proportional hazards models were nearly fully efficient for estimating differences in adjusted curves, but parametric estimates of individual adjusted curves may be substantially more precise. Standardized differences between direct adjusted survival curves may be used to test the null hypothesis of no treatment effect. This procedure may prove especially useful when the proportional hazards assumption is questionable.

55 citations
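As background for the non-parametric case named above, a minimal Kaplan-Meier estimator can be sketched as follows (an illustrative sketch only — the paper's direct adjustment and its variance formulas are not reproduced here):

```python
import numpy as np

def kaplan_meier(time, event):
    """Kaplan-Meier estimate: S(t) = prod over event times t_i <= t
    of (1 - d_i / n_i), with d_i events among n_i subjects at risk."""
    time = np.asarray(time, dtype=float)
    event = np.asarray(event, dtype=bool)
    curve, s = [], 1.0
    for t in np.unique(time[event]):
        n_at_risk = np.sum(time >= t)
        d = np.sum((time == t) & event)
        s *= 1.0 - d / n_at_risk
        curve.append((t, s))
    return curve

# Four subjects: events at times 1, 2, 4; one censored at time 3.
curve = kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])
```

Direct adjustment then averages such model-based curves over the observed covariate distribution, which is where the paper's variance calculations come in.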



Journal ArticleDOI
TL;DR: In this paper, two methods for the estimation of pair-potentials of Gibbsian point processes are described and applied to a point pattern of ants' nests to obtain information on the degree of interaction of two different species.
Abstract: In this paper two methods for the estimation of pair-potentials of Gibbsian point processes are described. These are applied to a point pattern of ants' nests to obtain information on the degree of interaction of two different species. Furthermore an estimation method of mark-point potentials is demonstrated and applied to the same example.

29 citations


Journal ArticleDOI
Michael Haber
TL;DR: In this paper, a modified exact test for 2×2 contingency tables, based on a less conservative definition of the concept of significance (STONE, 1969), is compared with a modified form of Pearson's X2 test and with Tocher's randomized exact (UMPU) test.
Abstract: A modified exact test is proposed for 2×2 contingency tables. This test, which is based on a less conservative definition of the concept of significance (STONE, 1969) is compared with a modified form of Pearson's X2 test and with Tocher's randomized exact (UMPU) test. The sizes of the new test lie near the nominal 0.05 levels while those of the X2 test usually exceed the nominal level, sometimes by a factor of 2 or more. The power of the modified test is usually close to that of the UMPU test.

26 citations
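The paper's modified exact test is not in standard libraries, but the classical conditional exact test whose conservatism such modifications aim to reduce is; a minimal baseline sketch (not the paper's test):

```python
from scipy import stats

# A 2x2 table: rows = two groups, columns = success/failure counts.
table = [[7, 3], [2, 8]]

# Fisher's conventional exact test: conditional on the margins, its
# attained size typically falls below the nominal level, which is the
# conservatism that modified and randomized (UMPU) tests address.
odds_ratio, p_exact = stats.fisher_exact(table, alternative="two-sided")
```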


Journal ArticleDOI
TL;DR: In this article, a method of deriving confidence limits for a difference of two proportions using Bayesian theory is presented, and its reliability is assessed for a selection of small sample sizes.
Abstract: A method is presented of deriving confidence limits for a difference of two proportions using Bayesian theory. The reliability of the method is assessed for a selection of small sample sizes. A limited table of confidence limits is displayed.

23 citations
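A minimal sketch in the same spirit, assuming independent Jeffreys Beta posteriors and a Monte Carlo quantile interval (the paper's exact prior and method of computation may differ):

```python
import numpy as np

def diff_ci(x1, n1, x2, n2, level=0.95, draws=100_000, seed=0):
    """Approximate Bayesian interval for p1 - p2: sample from the two
    independent Jeffreys posteriors Beta(x + 1/2, n - x + 1/2) and take
    the central quantiles of the sampled differences."""
    rng = np.random.default_rng(seed)
    p1 = rng.beta(x1 + 0.5, n1 - x1 + 0.5, size=draws)
    p2 = rng.beta(x2 + 0.5, n2 - x2 + 0.5, size=draws)
    d = p1 - p2
    lo, hi = np.quantile(d, [(1 - level) / 2, (1 + level) / 2])
    return lo, hi

# Small-sample example: 8/10 successes versus 2/10 successes.
lo, hi = diff_ci(8, 10, 2, 10)
```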


Journal ArticleDOI
TL;DR: In this paper, rank statistics and their associated variance estimators are derived for general two-sample designs for independent observations, if a location model is assumed and if only two samples are considered.
Abstract: In the several-sample case with independent observations, suitable rank tests can be derived if a location model is assumed. In more general models it is difficult to deduce such statistics. If only two samples are considered, rank tests can be developed without problems. For general two-sample designs, rank statistics and their associated variance estimators are derived.

22 citations


Journal ArticleDOI
TL;DR: In this article, the cross-over design for clinical trials when responses are binary is discussed and three tests which have been proposed for the analysis of this problem are compared by an assessment of their assumptions.
Abstract: The cross-over design for clinical trials when responses are binary is discussed. Three tests which have been proposed for the analysis of this problem are compared by an assessment of their assumptions. A simple test to establish whether it is appropriate to include observations from the second period is presented.

21 citations


Journal ArticleDOI
TL;DR: In this paper, the relative merits of the P-value and the mid-P-value were compared, and it was shown that inference based on the mid-P-value is in a certain sense on firmer ground.
Abstract: This paper considers the relative merits of the P-value and the mid-P-value. It is shown that inference based on the mid-P-value is in a certain sense on firmer ground. In particular, the expected mid-P-value does not change under an irrelevant breakup of the test statistic.

20 citations
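For a concrete one-sided binomial example: the mid-P-value counts only half of the probability of the observed point, which is what gives its expectation the stable behaviour described above. A minimal sketch:

```python
from scipy import stats

def binom_p_and_midp(x, n, p0=0.5):
    """One-sided (upper tail) exact P-value and mid-P-value for an
    observed binomial count x out of n under H0: p = p0.
    P = P(X >= x); mid-P = P(X > x) + 0.5 * P(X = x)."""
    point = stats.binom.pmf(x, n, p0)
    tail = stats.binom.sf(x - 1, n, p0)   # P(X >= x)
    return tail, tail - 0.5 * point

# 8 successes in 10 trials under a fair-coin null.
p, midp = binom_p_and_midp(8, 10)
```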


Journal ArticleDOI
TL;DR: The Brown-Forsythe test is at least as good as the James test and the Welch test, but the differences are so small that the choice is immaterial for practical purposes as mentioned in this paper.
Abstract: By using Monte Carlo studies, this paper compares the Welch test, the James test and the Brown-Forsythe test for comparing several means under heteroscedasticity. It appears that all the tests are quite robust with respect to departure from normality. The Brown-Forsythe test is at least as good as the James test and the Welch test; but the differences are so small that the choice is immaterial for practical purposes.

16 citations
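Of the three procedures compared above, Welch's test has the most commonly quoted closed form; the following sketches that statistic (the James and Brown-Forsythe tests differ in their denominator and degrees-of-freedom corrections, which are not reproduced here):

```python
import numpy as np
from scipy import stats

def welch_anova(*groups):
    """Welch's heteroscedasticity-robust one-way ANOVA: groups are
    weighted by w_j = n_j / s_j^2 and compared to an approximating
    F(k - 1, df2) distribution."""
    k = len(groups)
    n = np.array([len(g) for g in groups], dtype=float)
    m = np.array([np.mean(g) for g in groups])
    v = np.array([np.var(g, ddof=1) for g in groups])
    w = n / v
    mw = np.sum(w * m) / np.sum(w)                 # weighted grand mean
    a = np.sum(w * (m - mw) ** 2) / (k - 1)
    h = np.sum((1 - w / np.sum(w)) ** 2 / (n - 1))
    f = a / (1 + 2 * (k - 2) * h / (k ** 2 - 1))
    df2 = (k ** 2 - 1) / (3 * h)
    return f, stats.f.sf(f, k - 1, df2)

f_stat, p_val = welch_anova([1, 2, 3, 4, 5], [2, 3, 4, 5, 6],
                            [11, 12, 13, 14, 15])
```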


Journal ArticleDOI
TL;DR: In this article, two new linear calibration methods for multicollinear spectral data are proposed and compared with established predictors on a set of data, which consists of measurements of fat in fish meat and near infrared (NIR) reflectance measurements.
Abstract: In this paper we treat linear calibration of multivariate chemical measurement instruments. Two new calibration methods for multicollinear spectral data are proposed. The methods are compared with established predictors on a set of data, which consists of measurements of fat in fish meat and near-infrared (NIR) reflectance measurements. The new methods give improvements compared with the established ones.

Journal ArticleDOI
TL;DR: In this article, the inclined point-symmetry model is proposed, a decomposition of this model is shown, and a test theory for the decomposed models and an example are given.
Abstract: In a square contingency table the inclined point-symmetry model, which is an extension of the point-symmetry model, is proposed, and a decomposition of this model is shown. Moreover, a test theory for the decomposed models and an example are given.

Journal ArticleDOI
TL;DR: In this paper, four kinds of symmetry models are proposed and their decompositions are given in a square contingency table; two of them are extensions of the local point-symmetry model and the reverse local point-symmetry model by TOMIZAWA (1985), and the other two models are concerned with the inclined point-symmetry model by TOMIZAWA (1986) and the conditional symmetry model by MCCULLAGH (1978).
Abstract: In a square contingency table four kinds of symmetry models are proposed and their decompositions are given. Two models of them are extensions of the local point-symmetry model and the reverse local point-symmetry model by TOMIZAWA (1985), and the other two models are concerned with the inclined point-symmetry model by TOMIZAWA (1986) and the conditional symmetry model by MCCULLAGH (1978). An example is given.

Journal ArticleDOI
TL;DR: It is shown how a concept from queuing theory, namely first-in-first-out (FIFO), can be profitably used here; another nonstandard situation considered is one in which the lifespan of the individual entity is too long compared to the duration of the experiment.
Abstract: One difficulty in summarising biological survivorship data is that the hazard rates are often neither constant, nor monotonically increasing or decreasing, over the entire life span. The promising Weibull model does not work here. The paper demonstrates how bathtub-shaped quadratic models may be used in such a case. Further, sometimes, due to a paucity of data, actual lifetimes are not ascertainable. It is shown how a concept from queuing theory, namely first-in-first-out (FIFO), can be profitably used here. Another nonstandard situation considered is one in which the lifespan of the individual entity is too long compared to the duration of the experiment. This situation is dealt with by using ancillary information. In each case the methodology is illustrated with numerical examples.

Journal ArticleDOI
TL;DR: In this paper, a continuity correction is proposed for the test statistics of HABERMAN (1978) and LEHMACHER (1981) for identifying overfrequented (or under-requented) cells in the three-dimensional Configural Frequency Analysis (CFA).
Abstract: A continuity correction is proposed for the test statistics of HABERMAN (1978) and LEHMACHER (1981) for identifying overfrequented (or underfrequented) cells in the three-dimensional Configural Frequency Analysis (CFA). Its quality is shown by comparison with the test based on the exact distribution of the cell frequencies.

Journal ArticleDOI
TL;DR: During preparatory steps of data for automatic classification routines, the amount of information contained by the character distribution is reduced by standardization of the character values but can be regained through special weighting schemes of standardized character values.
Abstract: During preparatory steps of data for automatic classification routines, the amount of information contained by the character distribution is reduced by standardization of the character values. This information can be regained through special weighting schemes of standardized character values. Two weight coefficients derived from decomposition of the frequency distributions of the characters are proposed.


Journal ArticleDOI
M. Katz, J. M. Goux
TL;DR: The statistical properties of one estimator of the absolute genetic distance (1/2) Σ_{i=1..K} |p_Xi − p_Yi| between two populations X and Y are presented in this paper.
Abstract: The statistical properties of one estimator of the absolute genetic distance (1/2) Σ_{i=1..K} |p_Xi − p_Yi| between two populations X and Y are presented. It is shown that using this distance in small samples can be misleading, particularly when the populations are close to each other.
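The estimator itself is a one-liner over the K allele frequencies; a minimal sketch:

```python
import numpy as np

def absolute_genetic_distance(px, py):
    """Absolute genetic distance d = (1/2) * sum_i |p_Xi - p_Yi|
    between two populations X and Y over K allele frequencies."""
    px = np.asarray(px, dtype=float)
    py = np.asarray(py, dtype=float)
    return 0.5 * np.sum(np.abs(px - py))

# Three alleles; frequencies sum to 1 within each population.
d = absolute_genetic_distance([0.5, 0.3, 0.2], [0.1, 0.4, 0.5])
```

The paper's point is about the sampling distribution of this quantity: with small samples the plug-in estimate is biased upward, especially when the true distance is near zero.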

Journal ArticleDOI
TL;DR: In this paper, the problem of stereological determination of the covariance of a random set from thin sections is considered, and a simple approximation formula is suggested using results of numerical experiments for two examples.
Abstract: The problem of stereological determination of the covariance of a random set from thin sections is considered. Using results of numerical experiments for two examples, a simple approximation formula is suggested.

Journal ArticleDOI
TL;DR: In this article, the Gram-Schmidt orthogonalization process is used to express a multinormal density as a product of univariate normal densities; without such a factorization, when each mean depends on u unobservable factors, a mixture of u^N N-variate normal densities must be computed, making likelihood calculations impractical even for moderate N.
Abstract: The computation of an N-variate normal density function requires the inversion of an N × N covariance matrix. Furthermore, if each mean depends on u unobservable factors, a mixture of u^N N-variate normal densities must be computed, making likelihood calculations impractical even for moderate N. The Gram-Schmidt orthogonalization process is used to express a multinormal density as a product of univariate normal densities. When the pattern of the correlation matrix is taken into account, the formulas may be considerably simplified. In some cases each of the orthogonal variates can be written as a linear combination of only a few of the original variates. Such results are crucial for applications of multinormal distributions and of mixtures of multinormal distributions. An intraclass correlation model and a genetic variance components model applicable to family data are discussed as examples.
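The product-of-univariate-densities identity can be illustrated with the Cholesky decomposition, which is the matrix form of Gram-Schmidt orthogonalization applied to the variates; a sketch that checks the identity on an intraclass correlation matrix (an illustration of the principle, not the paper's pattern-specific simplifications):

```python
import numpy as np
from scipy import stats

def mvn_logpdf_factored(x, mean, cov):
    """N-variate normal log-density as a sum of N univariate normal
    log-densities, avoiding explicit inversion of the covariance."""
    L = np.linalg.cholesky(cov)                      # cov = L @ L.T
    z = np.linalg.solve(L, np.asarray(x) - np.asarray(mean))  # z ~ N(0, I)
    # sum(log diag(L)) = (1/2) log|cov| is the Jacobian of the transform.
    return stats.norm.logpdf(z).sum() - np.log(np.diag(L)).sum()

# Intraclass correlation structure, as in the family-data example.
cov = 0.4 * np.ones((3, 3)) + 0.6 * np.eye(3)
val = mvn_logpdf_factored([1.0, 0.0, -1.0], np.zeros(3), cov)
```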


Journal ArticleDOI
TL;DR: In this paper, improved Bonferroni-type inequalities for the union of a set of nonexchangeable events are presented; these are stronger than or equivalent to the bounds proposed by KOUNIAS (1968), HUNTER (1976), MARGOLIN & MAURER (1976), GALAMBOS (1977) and MĂRGĂRITESCU (1983).
Abstract: In this note we present improved Bonferroni-type inequalities for the union of a set of nonexchangeable events. These new bounds are stronger than or equivalent to the bounds proposed by KOUNIAS (1968), HUNTER (1976), MARGOLIN & MAURER (1976), GALAMBOS (1977) and MĂRGĂRITESCU (1983). In the particular case of exchangeable events, the bounds given by SOBEL & UPPULURI (1972) can be derived. A generalization in terms of a partition of {1, 2, …, n} is also established.
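One member of this family, the Kounias (1968) bound that the note strengthens, is easy to state in code; a sketch (the note's sharper bounds are not reproduced):

```python
import numpy as np

def kounias_upper_bound(p, p_pair):
    """Kounias (1968) improved Bonferroni upper bound:
    P(A_1 ∪ ... ∪ A_n) <= sum_i P(A_i) - max_j sum_{i != j} P(A_i ∩ A_j).
    p: marginal probabilities; p_pair: symmetric matrix of pairwise
    intersection probabilities (its diagonal is ignored)."""
    p = np.asarray(p, dtype=float)
    q = np.array(p_pair, dtype=float)
    np.fill_diagonal(q, 0.0)
    return p.sum() - q.sum(axis=0).max()

# Three independent events, P(A_i) = 0.1, so P(A_i ∩ A_j) = 0.01:
# the bound 0.28 improves the plain Bonferroni bound 0.30.
p = [0.1, 0.1, 0.1]
p_pair = [[0.0, 0.01, 0.01], [0.01, 0.0, 0.01], [0.01, 0.01, 0.0]]
bound = kounias_upper_bound(p, p_pair)
```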

Journal ArticleDOI
TL;DR: In this article, power tables for univariate and multivariate analyses of variance and regression are presented, covering a wide range of values for α (.0005 to .40), for power (.50 to .9995), and 45 different values of the degrees of freedom for the numerator of the F ratio (u = 1 to 150).
Abstract: The available power tables for use in experimental design serve only limited practical purposes, since they are restricted to very few levels of significance such as .01, .05, and .10. With these values, however, usually no correction for cumulating error probabilities, for example by the Dunn-Bonferroni method, can be achieved, because (very) low per-test levels of α are then necessary. Therefore, power tables are presented that encompass a wide range of different values for α (.0005 to .40), for power (.50 to .9995), and for 45 different values of the degrees of freedom of the numerator of the F ratio (u = 1 to 150). Four of the 16 tables are printed. Their use is demonstrated for some paradigmatic problems in univariate and multivariate analyses of variance and regression.

Journal ArticleDOI
TL;DR: In this article, the authors consider a simple ANOVA model and assess the effect of categorisation by simulating sizes and calculating asymptotic relative efficiencies, showing that even for severe categorisation the effects are small.
Abstract: Many statistical procedures assume a continuous model in spite of the fact that observations are necessarily discrete. Here we consider a simple ANOVA model—one factor with fixed effects—and assess the effect of categorisation by simulating sizes and calculating asymptotic relative efficiencies. Even for severe categorisation the effects are small.
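The size simulation described above can be sketched as a small Monte Carlo experiment (illustrative settings, not the paper's exact design):

```python
import numpy as np
from scipy import stats

def anova_size(round_to=None, reps=1000, n=20, k=3, alpha=0.05, seed=1):
    """Monte Carlo size of the one-way fixed-effects F test under H0,
    with observations optionally categorised by rounding to a grid."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(reps):
        groups = [rng.normal(size=n) for _ in range(k)]
        if round_to is not None:
            groups = [np.round(g / round_to) * round_to for g in groups]
        _, p = stats.f_oneway(*groups)
        rejections += p < alpha
    return rejections / reps

size_continuous = anova_size()
size_categorised = anova_size(round_to=1.0)   # severe categorisation
```

Rounding standard normal observations to a unit grid leaves only a handful of categories, yet the simulated size stays close to the nominal level, matching the abstract's conclusion.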

Journal ArticleDOI
TL;DR: A synopsis of statistical methods which can be used for the sequential analysis of possibly censored survival times in clinical trials and results on the asymptotic behaviour of the Breslow-Haug statistic and the sequential version of the logrank statistic are presented.
Abstract: This paper provides a synopsis of statistical methods which can be used for the sequential analysis of possibly censored survival times in clinical trials. In particular, results on the asymptotic behaviour of the Breslow-Haug statistic and on the sequential version of the logrank statistic are presented in a standardized terminology. In addition, formulae for the explicit calculation of linear and square-root boundaries for sequential plans are given and illustrated by an example. Practical problems of applying these methods when monitoring a fixed-sample clinical trial, as well as group sequential methods and the calculation of P-values, are also discussed.

Journal ArticleDOI
TL;DR: The negative binomial distribution (N.B.) was previously proposed (BENNETT, 1981) in the estimation of and tests of significance for the relative risk (ψ) in prospective studies in epidemiology as discussed by the authors.
Abstract: The negative binomial distribution (N.B.) was previously proposed (BENNETT, 1981) in the estimation of and tests of significance for the relative risk (ψ) in prospective studies in epidemiology. This paper discusses the application of the N.B. model in combining estimates of ψ from a series of prospective studies and in statistical tests of the equality of these estimates.

Journal ArticleDOI
Peter Bauer
TL;DR: In this article, two stage sampling plans are considered for simultaneously testing a main and side effect, assumed to follow a bivariate normal distribution with known variances, but unknown correlation.
Abstract: In clinical trials for the comparison of two treatments it seems reasonable to stop the study if either one treatment has turned out to be markedly superior in the main effect, or one to be severely inferior with respect to an adverse side effect. Two-stage sampling plans are considered for simultaneously testing a main and a side effect, assumed to follow a bivariate normal distribution with known variances but unknown correlation. The test procedure keeps the global significance level under the null hypothesis of no differences in main and side effects. The critical values are chosen under the side condition that the probability of ending at the first or second stage with a rejection of the elementary null hypothesis for the main effect is controlled when a particular constellation of differences in means holds; analogously, the probability of ending with a rejection of the null hypothesis for the side effect, given certain treatment differences, is controlled too. Plans “optimal” with respect to sample size are given.

Journal ArticleDOI
Abstract: An analysis of experimental design is suggested for comparing progenies of a line × tester mating system. The line × tester mating system is an expanded top-cross mating system, used frequently in practice. This system is of interest for breeders dealing with estimation and testing of general as well as specific combining abilities. In the suggested design, treatments, i.e. hybrids obtained in crosses, and testers are studied in an orthogonally supplemented incomplete block design. An analysis of variance is presented, estimators of expected values for progenies and for testers are given, and testing of hypotheses concerning contrasts of interest for progenies, lines and testers is discussed. Estimators of effects of general and specific combining abilities, as well as tests of the significance of various genetic parameters, are also given.

Journal ArticleDOI
H. Knolle
TL;DR: In this paper, the mathematical tools of Hilbert space theory and Gauss quadrature are used to derive a new method of curve fitting and calculation of statistical moments of the concentration-time curve.
Abstract: The mathematical tools of Hilbert space theory and Gauss quadrature are used to derive a new method of curve fitting and calculation of statistical moments of the concentration-time curve.

Journal ArticleDOI
TL;DR: In this article, a two-stage procedure for testing the bioequivalence of two drug formulations is presented, assuming that the performance of the drug is characterized by a normally distributed variate.
Abstract: The procedures currently used for testing the bioequivalence of two drug formulations achieve control over the error probability of erroneously accepting bioequivalence or over the probability of erroneous rejection, but not over both error probabilities. A two-stage procedure that rectifies this drawback is presented, assuming that the performance of the drug is characterized by a normally distributed variate.