
Showing papers in "Biometrics in 1967"



Journal ArticleDOI
TL;DR: In this article, the problem of resolving an observed frequency distribution into Gaussian components is considered, with application to morphometric data such as the length-frequency distributions of fish samples.
Abstract: The distribution of a morphometric character in a biological population is a mixture of components corresponding to different species, broods, sexes, etc. A problem which frequently arises is to find the relative frequencies and the frequency distribution of such components by an analysis of the observed frequency distribution. The frequency distribution of any such component is usually assumed to be normal: hence the problem is one of resolution of a distribution into Gaussian components. For a population of fish such an analysis has been found to be very helpful for population studies, particularly when determination of the age of a fish is difficult. The frequency distribution of length obtained from a sample of fish is usually skew and polymodal: in many cases, the modes correspond to individual age-groups and are very helpful for separating them. Buchanan-Wollaston and Hodgeson [1929] disapproved of the smoothing out of 'bumpy' distributions, as practiced by the early fishery biologists, even for small samples. They suggested that the individual 'humps' indicate meaningful modes around which normal curves ought to be fitted. The problem of resolution of a distribution into two Gaussian components, and some particular cases of it, have been considered by several authors.
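
The paper's own resolution procedure is not reproduced in this listing; the following is a minimal modern sketch of the same task, fitting a two-component normal mixture to simulated length-frequency data with scikit-learn's EM-based GaussianMixture. The means, standard deviations, and sample sizes are invented for illustration.

```python
# Minimal sketch (not the paper's own method): resolving a polymodal
# length-frequency sample into two normal components with an EM-based
# mixture fit from scikit-learn. Data are simulated for illustration.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Two hypothetical age-groups of fish with different mean lengths (cm)
lengths = np.concatenate([rng.normal(12.0, 1.5, 300),
                          rng.normal(18.0, 2.0, 200)])

gm = GaussianMixture(n_components=2, random_state=0).fit(lengths.reshape(-1, 1))
for w, m, v in zip(gm.weights_, gm.means_.ravel(), gm.covariances_.ravel()):
    print(f"weight={w:.2f}  mean={m:.1f}  sd={np.sqrt(v):.1f}")
```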

1,015 citations



Journal ArticleDOI
TL;DR: Applied Regression Analysis, Third Edition as discussed by the authors provides a complete introduction to the fundamentals of regression analysis with a focus on fitting and checking of both linear and nonlinear regression models, using small and large data sets.
Abstract: Regression analysis is a method for discovering the relationships among variables. This classic text, with its emphasis on clear, thorough presentation of concepts and applications, offers a complete, easily accessible introduction to the fundamentals of regression analysis. Assuming only a basic knowledge of elementary statistics, Applied Regression Analysis, Third Edition focuses on the fitting and checking of both linear and nonlinear regression models, using small and large data sets, with pocket calculators or computers.

474 citations


Journal ArticleDOI

376 citations



Journal ArticleDOI
TL;DR: In this article, a method of discrimination based on maximum likelihood estimation is described and compared with others in its application to a diagnostic problem, and the discriminant reduces to multivariate logistic analysis.
Abstract: A method of discrimination, based on maximum likelihood estimation, is described. On a variety of mathematical models, including and extending the models most commonly assumed in discriminant theory, the discriminant reduces to multivariate logistic analysis. Even when no simple model can be assumed, other considerations show that this method should work well in practice, and should be very robust with respect to departures from the theoretical assumptions. The method is compared with others in its application to a diagnostic problem.
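
In modern terms, a multivariate logistic discriminant fitted by maximum likelihood corresponds to logistic regression on the predictor variables and classification by posterior probability. The sketch below illustrates that correspondence on simulated two-group "diagnostic" data with scikit-learn; the variables, group structure, and data are invented, not the paper's diagnostic example.

```python
# Minimal sketch of multivariate logistic discrimination: posterior group
# probabilities from a logistic model fitted by maximum likelihood.
# Simulated two-group "diagnostic" data; not the paper's own dataset.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X0 = rng.normal([0.0, 0.0], 1.0, size=(100, 2))   # group 0
X1 = rng.normal([1.5, 1.0], 1.0, size=(100, 2))   # group 1
X = np.vstack([X0, X1])
y = np.r_[np.zeros(100), np.ones(100)]

clf = LogisticRegression().fit(X, y)
new_case = np.array([[1.0, 0.5]])
print("P(group 1 | x) =", clf.predict_proba(new_case)[0, 1])
```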

185 citations



Journal ArticleDOI
TL;DR: In this article, the distance from a randomly located point to the nearest (or nth nearest) individual plant or animal is described and the efficiency of distance methods relative to plot sampling is considered in terms of number of individuals enumerated.
Abstract: Frequency distributions of the distance from a randomly located point to the nearest (or nth nearest) individual plant or animal are described. Results applicable to systematic and contagious (negative binomial) distributions are derived as supplements to the considerable existing literature on randomly distributed individuals. A new 'index of nonrandomness' is suggested, and demonstrated on some previously published data. Efficiency of distance methods relative to plot sampling is considered in terms of number of individuals enumerated.
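
To make the sampling scheme concrete, here is a minimal sketch of point-to-nearest-plant distance sampling on a simulated plant map, using scipy's KD-tree. The index printed (pi times intensity times the mean squared nearest-plant distance, which is about 1 for a completely random pattern) is an illustrative check of randomness, not necessarily the paper's 'index of nonrandomness'.

```python
# Minimal sketch: distances from random sample points to the nearest plant.
# The ratio pi * lambda * mean(d^2) is about 1 for a random (Poisson) pattern;
# this illustrative index is not necessarily the paper's own index.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)
area = 100.0 * 100.0
plants = rng.uniform(0, 100, size=(500, 2))     # simulated plant map
lam = len(plants) / area                        # intensity (plants per unit area)

points = rng.uniform(0, 100, size=(50, 2))      # randomly located sample points
d, _ = cKDTree(plants).query(points, k=1)       # nearest-plant distances

index = np.pi * lam * np.mean(d ** 2)
print("index (about 1 under randomness):", round(index, 2))
```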

127 citations


Journal ArticleDOI

98 citations


Journal ArticleDOI
TL;DR: The authors amplify certain aspects of their past work on the construction of models for quantal responses to mixtures of drugs, introduce some new material, and compare their approach with that adopted by Ashford and Smith, to highlight the most important difference between the two approaches.
Abstract: The authors amplify certain aspects of their past work on the construction of models for quantal responses to mixtures of drugs, introduce some new material, and compare their approach with that adopted by Ashford and Smith. The authors point out that their attitude towards classification of joint drug actions has changed since 1952, although they disagree with Ashford and Smith's suggestion that the authors' classification of 1952 is subject to anomalies. The authors' later work has emphasized a general method for development of mathematical models to correspond with specific biological situations, and questions of classification of biological or mathematical models appear to them less important than formerly. Perhaps the most important difference between the two approaches lies in the concept of dose as a variable. The authors have used the word 'dose' in its usual sense, to mean a quantity of drug measured in ordinary scientific units, whereas Ashford and Smith use it to mean any monotonic transformation of the dose as ordinarily conceived. This drastic generalization of the dose variable inevitably leads to mathematical models that are in general almost devoid of predictive properties. However, such generalization seems to be unnecessary for most pharmacological and toxicological situations. Concerning more specialized terms, it is stressed that Ashford and Smith use several of the same terms as the authors do, but with different meanings. The significance of Ashford and Smith's concepts is discussed.

Journal ArticleDOI
TL;DR: A technique used in virus research requires the arrangement of samples from a number of virus preparations in such a way that over the whole set a sample from each virus preparation appears next to a sample from every other virus preparation.
Abstract: A technique used in virus research requires the arrangement in circles of samples from a number of virus preparations in such a way that over the whole set a sample from each virus preparation appears next to a sample from every other virus preparation. This paper shows how this may be done with the minimum total number of samples, in circles containing samples from either all or some of the preparations, and gives tables of designs for up to 41 virus preparations. Comparisons with related problems are made.




Journal ArticleDOI
TL;DR: A skewed two-parameter distribution is described which has been found useful in the analysis of human survival time data; its density is integrable and has manageable first and second moments.
Abstract: A skewed two-parameter distribution is described which has been found useful in the analysis of human survival time data. The density is integrable and has manageable first and second moments. Since the distribution has non-zero density at the origin it may be of value in connection with those types of responses which take place even before observation begins. Description of a maximum likelihood technique of estimating the parameters is followed by discussion of damage models that incorporate the distribution.




Journal ArticleDOI
TL;DR: The fitting of additive parental main effects, or 'combining abilities,' to progeny records accumulated from a plant breeding program is described, using data on English apples and rubber.
Abstract: A plant breeding program, extending over several years, accumulates records of the progeny obtained from numerous crosses, made according to no systematic plan. This paper describes the fitting of additive parental main effects, or 'combining abilities,' to such data. It examines the feasibility and practical utility of the analysis, using data on English apples (from this Institute) and on rubber (supplied by the Rubber Institute of Malaya). The analysis is a straightforward application of least-squares, involving no new statistical theory.
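
A minimal sketch of the kind of least-squares fit described: each cross's progeny mean is modelled as an overall mean plus the effects of its two parents, and the unbalanced system is solved by ordinary least squares. The parents, crosses, and progeny means below are invented for illustration, and the handling of estimability is simplified.

```python
# Minimal sketch: additive parental main effects ("combining abilities")
# fitted by ordinary least squares to unplanned cross data.
# Parents, crosses, and progeny means are invented for illustration.
import numpy as np

parents = ["A", "B", "C", "D"]
# (parent1, parent2, mean progeny score) for each recorded cross
crosses = [("A", "B", 7.2), ("A", "C", 6.5), ("B", "C", 5.9),
           ("B", "D", 6.8), ("C", "D", 5.1), ("A", "D", 7.0)]

idx = {p: i for i, p in enumerate(parents)}
X = np.zeros((len(crosses), 1 + len(parents)))
y = np.zeros(len(crosses))
for r, (p1, p2, mean) in enumerate(crosses):
    X[r, 0] = 1.0                 # overall mean
    X[r, 1 + idx[p1]] = 1.0       # effect of first parent
    X[r, 1 + idx[p2]] = 1.0       # effect of second parent
    y[r] = mean

# Parental effects are only estimable subject to a constraint (e.g. sum to 0);
# lstsq returns the minimum-norm solution, which serves for illustration.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["mu"] + parents, np.round(beta, 2))))
```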


Journal ArticleDOI
TL;DR: Results suggest that, in general, because of extremely high correlations between them, the estimates of the parameters will be subject to very large sampling errors, which implies severe limitations on the use of the distribution.
Abstract: Cresswell & Froggatt [1963] devised a model leading to a three-parameter distribution, which they called the 'Short' distribution, to describe accident data. It is the convolution of a Poisson distribution and a Neyman Type A distribution. We consider first its general properties. Then we derive a recurrence relationship for the probabilities, and also the maximum-likelihood equations, from the probability generating function by a very simple differentiation method. An Algol program has been written to solve these equations and a numerical example is given. The efficiency of the method of moments for estimating the parameters is also examined. Results suggest that, in general, because of extremely high correlations between them, the estimates of the parameters (by either method) will be subject to very large sampling errors. This implies severe limitations on the use of the distribution.
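
The paper's recurrence relation and maximum-likelihood machinery are not reproduced in this listing. As a rough illustration of the stated construction, the sketch below evaluates Short-distribution probabilities directly as the convolution of a Poisson pmf with a Neyman Type A pmf (the latter obtained by truncating its Poisson mixing sum). The parameter names (mu, lam, phi) and values are my own illustrative choices.

```python
# Minimal sketch: probabilities of the 'Short' distribution, built as the
# convolution of a Poisson(mu) pmf with a Neyman Type A(lam, phi) pmf.
# The Neyman Type A pmf is evaluated by truncating its Poisson mixing sum,
# not by the paper's recurrence relation; parameter values are illustrative.
import numpy as np
from scipy.stats import poisson

def neyman_a_pmf(r_max, lam, phi, n_max=200):
    r = np.arange(r_max + 1)
    pmf = np.zeros(r_max + 1)
    for n in range(n_max + 1):
        pmf += poisson.pmf(n, lam) * poisson.pmf(r, n * phi)
    return pmf

def short_pmf(r_max, mu, lam, phi):
    pois = poisson.pmf(np.arange(r_max + 1), mu)
    ney = neyman_a_pmf(r_max, lam, phi)
    return np.convolve(pois, ney)[: r_max + 1]

probs = short_pmf(10, mu=0.5, lam=1.2, phi=0.8)
print(np.round(probs, 4), "sum =", round(probs.sum(), 4))
```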

Journal ArticleDOI
TL;DR: In this article, the problem of pooling means of two independent random samples from discrete distributions (in particular Poisson and binomial) which can be approximated by normal distributions after the appropriate transformations is considered.
Abstract: This paper considers the problem of pooling means of two independent random samples from discrete distributions (in particular Poisson and binomial) which can be approximated by normal distributions after the appropriate transformations. We first develop the theory for two samples from N(μi, σ²), i = 1, 2, with σ² assumed known and the parameter of interest being μ1. Using a preliminary test of significance (PTS) at level α to test μ1 = μ2, a new estimator x* is proposed, both to estimate μ1 and to test the hypothesis μ1 = μ0. The bias and mean squared error of x* are studied and the regions in the parameter space in which x* has smaller mean squared error than x̄1, the mean of the first sample, are determined. Similarly, the size and power of the overall hypothesis testing procedure are studied. It is recommended that, in order to control the mean squared error and the size of the overall test procedure based on x*, the level of significance of the PTS should be about .25. These results are then applied to corresponding problems for data from Poisson and binomial distributions.
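
A minimal sketch of a "sometimes pool" estimator of this kind for the known-variance normal case: a preliminary z-test of μ1 = μ2 at level α decides whether to use an inverse-variance-weighted pooled mean or only the first sample's mean. The α = 0.25 default follows the abstract's recommendation; the function name, weighting, and numbers are my own illustrative choices, not the paper's exact estimator.

```python
# Minimal sketch of a preliminary-test ("sometimes pool") estimator of mu1
# with known variances: pool the two sample means unless a z-test rejects
# mu1 = mu2 at level alpha (about .25, as the abstract recommends).
import numpy as np
from scipy.stats import norm

def pts_estimator(xbar1, xbar2, var1, var2, n1, n2, alpha=0.25):
    se_diff = np.sqrt(var1 / n1 + var2 / n2)
    z = (xbar1 - xbar2) / se_diff
    if abs(z) <= norm.ppf(1 - alpha / 2):          # PTS does not reject mu1 = mu2
        w1, w2 = n1 / var1, n2 / var2              # inverse-variance weights
        return (w1 * xbar1 + w2 * xbar2) / (w1 + w2)
    return xbar1                                   # otherwise use first sample only

print(pts_estimator(10.2, 10.6, var1=4.0, var2=4.0, n1=30, n2=25))
```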

Journal ArticleDOI
TL;DR: The withdrawal rate and the withdrawal death rate were found to be the variables most significant in introducing discrepancies between survival rates and standard errors.
Abstract: A comparison was made of results from the life table methods of Cutler and Ederer [1958] and Chiang [1960a, b; 1961]. Survival rates and standard errors were calculated from hypothetical populations with various survival experiences. The discrepancy between the two methods was found to be negligible for survival rates and standard errors when the withdrawal rates were < 30% and the lost-to-follow-up rate < 40%. The survival rates for these methods were equal when the death rate in the withdrawal group was zero. The withdrawal rate and the withdrawal death rate were found to be the variables most significant in introducing discrepancies between (a) survival rates and (b) standard errors. Application of these methods to data from an observed population showed that the differences are negligible when the withdrawal rate is small.
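
For reference, a minimal sketch of the standard actuarial calculation in the Cutler-Ederer style, in which withdrawals during an interval contribute half an exposure. The interval counts are invented, and Chiang's variant and the standard-error formulas compared in the paper are not reproduced here.

```python
# Minimal sketch of an actuarial (Cutler-Ederer style) life table: withdrawals
# count as half an exposure in their interval. Interval counts are invented;
# standard errors and Chiang's method are not reproduced here.
deaths      = [10, 8, 6, 4]    # deaths in each interval
withdrawals = [ 5, 6, 4, 3]    # withdrawn alive during each interval
entering    = [100]            # alive at the start of the first interval

cum_surv = 1.0
for i, (d, w) in enumerate(zip(deaths, withdrawals)):
    n_eff = entering[i] - w / 2.0          # effective number exposed to risk
    p = 1.0 - d / n_eff                    # interval survival proportion
    cum_surv *= p
    entering.append(entering[i] - d - w)   # alive entering next interval
    print(f"interval {i + 1}: p = {p:.3f}, cumulative survival = {cum_surv:.3f}")
```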

Journal ArticleDOI
TL;DR: The simple method of partial totals is suggested as a possible alternative to the method of maximum likelihood for small samples with equally spaced observations.
Abstract: Methods of parameter estimation for an exponential model arising in epidemiological studies and biological assay.

Journal ArticleDOI
TL;DR: The principle of the Karber method of estimation is rather simple, and its application to the analysis of quantal data has been effectively extended by Cornfield and Mantel [1950].
Abstract: Karber estimation for the exponential parameter in a simple death or failure process requires assigning right-tail proportions the average of the left-truncated exponential, while interval proportions are assigned the average of a doubly-truncated exponential. In the case of equal arithmetic spacing, such assignments result in a simple estimate for which the asymptotic variance and relative efficiency are obtained, but the efficiency of the estimate can be poor. Optimally, testing should be concentrated at a single dose for which the survival probability is 20.32%. Multiple doses, arithmetically or logarithmically spaced, are justified where there is sufficient prior uncertainty. The estimate obtained is identical with the maximum likelihood estimate for serial equi-spaced observation on the same items. A maximum likelihood estimate similar in form arises in the simple birth process. General use of that estimating form is suggested for branching processes. Equi-spacing of observations permits simplified estimates where the parametric form is known, and valid estimates even when the parametric form is unknown. The principle of the Karber method of estimation (also called Spearman estimation or Spearman-Karber estimation) is rather simple, and its application to the analysis of quantal data has been effectively extended by Cornfield and Mantel [1950]. One extension that Cornfield and Mantel made was to drop the requirement of equal spacing of the dose levels. If at the successively increasing (log) dose levels X1, X2, ..., Xk the independently observed proportions of animals responding are pi = ri/ni, and the dosage range covered is so extensive that p1 = 0, pk = 1, then the Karber estimate of the mean of the tolerance distribution is given by
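
The abstract is truncated at the formula. For reference, the unequal-spacing Spearman-Karber estimate of the mean of the tolerance distribution is usually written (in the notation of the abstract, with this expression supplied from the standard literature rather than recovered from the listing) as

\[
  \hat{\mu} \;=\; \sum_{i=1}^{k-1} \left(p_{i+1} - p_i\right)\,\frac{X_i + X_{i+1}}{2}.
\]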


Journal ArticleDOI
TL;DR: The generalization that yields influenced mainly by vegetative and reproductive growth give asymptotic and parabolic relationships respectively is queried, and equation (1) is found to be inadequate.
Abstract: where w is the yield per plant or biologically definable part of a plant (e.g. leaves, roots), p is the number of plants per unit area, and a and c are constants. As density increases, the total yield per unit area (wp) given by (1) approaches an asymptotic maximum yield (1/c). An equivalent relationship was proposed independently by De Wit and Ennik [1959] and by Holliday [1960]. Holliday considered it typical where yield was the result of vegetative growth but inadequate to describe cases where yield was the result of reproductive growth, since he then found that a maximum yield per unit area occurred for some finite density, with higher densities resulting in lower yields. For convenience such relationships will be referred to as parabolic, although this term is not to be understood as applying in the strict mathematical sense. Bleasdale and Nelder [1960] queried the generalization that yields influenced mainly by vegetative and reproductive growth give asymptotic and parabolic relationships respectively, but also found that equation (1) was inadequate. They proposed a modification to allow for both asymptotic and parabolic relationships, a simplified
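
The abstract opens mid-sentence because equation (1) did not survive extraction. Consistent with the stated asymptotic total yield of 1/c, equation (1) is presumably the reciprocal yield-density relationship

\[
  \frac{1}{w} \;=\; a + c\,p,
  \qquad\text{so that}\qquad
  wp \;=\; \frac{p}{a + cp} \;\to\; \frac{1}{c} \quad\text{as } p \to \infty.
\]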

Journal ArticleDOI
TL;DR: In this article, the application of extreme value theory for estimation of extreme percentiles of arbitrary continuous distributions is discussed and the use of a generalised extreme value distribution is proposed, given in a form containing three parameters: a location parameter, a scale parameter and a shape parameter.
Abstract: The application of extreme-value theory for estimation of extreme percentiles of arbitrary continuous distributions is discussed and the use of a generalised extreme-value distribution is proposed. This distribution is given in a form containing three parameters: a location parameter, a scale parameter and a shape parameter. A method of estimating the parameters of the distribution is developed. It uses order statistics and general least-squares theory and is shown to be suitable for censored as well as uncensored samples. Tables of expectations of order statistics for certain sample sizes and certain values of the shape parameter are provided. An application illustrating the numerical procedures is given, using data from the field of Applied Physiology.
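
A minimal sketch of extreme-percentile estimation with a three-parameter generalised extreme-value distribution, using scipy. Note that this uses scipy's maximum-likelihood fit rather than the paper's order-statistics/least-squares method, and the "annual maxima" data are simulated for illustration.

```python
# Minimal sketch: extreme-percentile estimation with a three-parameter
# generalised extreme-value (GEV) distribution. Uses scipy's maximum-
# likelihood fit, not the paper's order-statistics least-squares method;
# data are simulated maxima for illustration.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(3)
maxima = genextreme.rvs(c=-0.1, loc=50.0, scale=5.0, size=80, random_state=rng)

shape, loc, scale = genextreme.fit(maxima)
p99 = genextreme.ppf(0.99, shape, loc=loc, scale=scale)
print(f"shape={shape:.2f} loc={loc:.1f} scale={scale:.1f}  99th percentile={p99:.1f}")
```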

Journal ArticleDOI
TL;DR: The balls-in-boxes model as mentioned in this paper is used to test whether the species in a ground vegetation mosaic are randomly mingled, regardless of the sizes of the component patches of the mosaic.
Abstract: Suppose an n-phase vegetation mosaic is sampled at equidistant points along a line transect. The observed sequence of symbols (a different symbol for each phase) is then 'collapsed'; i.e., any run of two or more identical symbols is replaced by a single symbol of the same kind. The resulting sequence is then such that no two adjacent symbols are the same. If the patches of the several species (or phases) in the mosaic are randomly mingled with one another, the observed sequence of symbols is a realization of an n-state Markov chain for which a balls-in-boxes model may be constructed. It is shown how one may test the fit of the model to an observed collapsed sequence. This provides a method for testing whether the species in a mosaic are randomly mingled whatever the sizes of the component patches may be. Examples are given of applications of the test to two mosaics of ground vegetation.
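
A minimal sketch of the data-preparation steps the abstract describes: collapsing a transect sequence so that no two adjacent symbols are the same, and tallying transitions between unlike symbols. The goodness-of-fit comparison with the balls-in-boxes model's expected transition frequencies is not reproduced here, and the symbols are simulated for illustration.

```python
# Minimal sketch: 'collapse' a transect sequence (merge runs of identical
# symbols) and tally transitions between unlike symbols. The fit to the
# balls-in-boxes model's expected frequencies is not reproduced here;
# symbols are simulated for illustration.
from collections import Counter
from itertools import groupby
import random

random.seed(4)
phases = "ABC"
sequence = [random.choice(phases) for _ in range(200)]   # equidistant samples

collapsed = [symbol for symbol, _ in groupby(sequence)]  # no two neighbours equal
transitions = Counter(zip(collapsed, collapsed[1:]))

print("collapsed length:", len(collapsed))
for (a, b), count in sorted(transitions.items()):
    print(f"{a} -> {b}: {count}")
```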