
Showing papers on "Sampling (statistics)" published in 1980


Book
01 Jan 1980
Abstract: History; Conceptual Foundations; Uses and Kinds of Inference; The Logic of Content Analysis Designs; Unitizing; Sampling; Recording; Data Languages; Constructs for Inference; Analytical Techniques; The Use of Computers; Reliability; Validity; A Practical Guide

25,749 citations


Journal ArticleDOI
TL;DR: In this article, an analysis of the small-angle scattering problem based on Shannon sampling is presented, which leads to an accurate assessment of the information contained in a given data set and rigorous estimates of the errors inherent in the parameters derived therefrom.
Abstract: An analysis of the small-angle scattering problem based on Shannon sampling is presented. It is shown that this approach leads to an accurate assessment of the information contained in a given data set and rigorous estimates of the errors inherent in the parameters derived therefrom.

310 citations
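
A quick way to see what this Shannon-sampling view implies in practice is to count the number of independent "Shannon channels" in a measured curve, roughly (q_max - q_min)·D_max/π for a particle of maximum dimension D_max. The sketch below is illustrative only; the numbers and the helper name are assumptions, not taken from the paper.

```python
import numpy as np

def shannon_channels(q_min, q_max, d_max):
    """Approximate number of independent (Shannon) channels in a small-angle
    scattering curve measured over [q_min, q_max] for a particle of maximum
    dimension d_max (lengths in units reciprocal to q)."""
    return (q_max - q_min) * d_max / np.pi

# Hypothetical example: d_max = 100 A particle, q measured from 0.01 to 0.30 1/A
print(round(shannon_channels(0.01, 0.30, 100.0), 1))   # about 9 independent parameters
```

Parameters beyond roughly this count are not independently determined by the data, which is the kind of information accounting the paper formalizes.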


Proceedings ArticleDOI
01 Dec 1980
TL;DR: In this paper, the zeros of the discrete time system obtained when sampling a continuous time system are explored and theorems for the limiting zeros for large and small sampling periods are given.
Abstract: The zeros of the discrete time system obtained when sampling a continuous time system are explored. Theorems for the limiting zeros for large and small sampling periods are given. A condition which guarantees that the sampled system only has stable zeros is also presented.

299 citations
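
The limiting behaviour of sampling zeros can be illustrated numerically. The sketch below (not the paper's code) discretizes a strictly proper example system with a zero-order hold at several sampling periods and prints the zeros of the resulting discrete-time transfer function; the choice of system is an assumption.

```python
import numpy as np
from scipy.signal import cont2discrete

# Example continuous-time system G(s) = 1/(s + 1)^3: relative degree 3, no finite zeros
num = [1.0]
den = [1.0, 3.0, 3.0, 1.0]

for h in (1.0, 0.1, 0.01):                      # sampling periods
    numd, dend, _ = cont2discrete((num, den), h, method='zoh')
    zeros = np.roots(numd.flatten())            # zeros introduced by sampling
    print(f"h = {h:5.2f}   sampling zeros: {np.round(np.sort(zeros), 3)}")

# As h -> 0 the two sampling zeros approach the roots of z^2 + 4z + 1
# (about -3.73 and -0.27); one lies outside the unit circle, i.e. it is unstable.
```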


Journal ArticleDOI
TL;DR: In this paper, procedures for careful shopping center sampling are described for careful sampling of shopping centers, which is a popular technique because of low costs, but typical procedures are haphazard and thus generalization is difficult.
Abstract: Shopping center sampling is popular because of low costs, but typical procedures are haphazard and thus generalization is difficult. In this article, procedures are described for careful sampling o...

216 citations





Journal ArticleDOI
01 Jun 1980-Genetics
TL;DR: Measurement of linkage disequilibrium involves two sampling processes; two-locus descent measures are used to describe the mating system and are transformed to disequilibrium moments at the final sampling.
Abstract: Measurement of linkage disequilibrium involves two sampling processes. First, there is the sampling of gametes in the population to form successive generations, and this generates disequilibrium dependent on the effective population size (Ne) and the mating structure. Second, there is sampling of a finite number (n) of individuals to estimate the population disequilibrium.——Two-locus descent measures are used to describe the mating system and are transformed to disequilibrium moments at the final sampling. Approximate eigenvectors for the transition matrix of descent measures are used to obtain formulae for the variance of the observed disequilibria as a function of Ne, mating structure, n, and linkage or recombination parameter.——The variance of disequilibrium is the same for monoecious populations with or without random selfing and for dioecious populations with random pairing for each progeny. With monogamy, the variance is slightly higher, the proportional difference being greater for unlinked loci.

147 citations
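
The two sampling stages described in the abstract can be mimicked with a small Monte Carlo sketch: Wright-Fisher drift of two-locus gamete frequencies in a population of effective size Ne, followed by drawing n individuals to estimate the disequilibrium D. This is only an illustrative simulation under a simplified random-union-of-gametes model, not the descent-measure derivation used in the paper, and all parameter values are assumed.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_D(Ne=50, n=30, c=0.05, generations=200, reps=1000):
    """Wright-Fisher drift of two-locus gamete frequencies (recombination
    fraction c) for `generations`, then sampling of n individuals (2n gametes)
    to estimate the disequilibrium D = p_AB*p_ab - p_Ab*p_aB."""
    D_obs = np.empty(reps)
    for r in range(reps):
        x = np.array([0.25, 0.25, 0.25, 0.25])                     # freqs of AB, Ab, aB, ab
        for _ in range(generations):
            D = x[0] * x[3] - x[1] * x[2]
            x_rec = x + c * D * np.array([-1.0, 1.0, 1.0, -1.0])   # recombination
            x = rng.multinomial(2 * Ne, x_rec) / (2 * Ne)          # drift: first sampling stage
        counts = rng.multinomial(2 * n, x)                         # estimation: second sampling stage
        p = counts / (2 * n)
        D_obs[r] = p[0] * p[3] - p[1] * p[2]
    return D_obs

D = simulate_D()
print("mean D:", round(D.mean(), 4), "  var(D):", round(D.var(), 5))
```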


Journal ArticleDOI
TL;DR: In this paper, the authors give variables sampling plans for items whose failure times are distributed as either extreme-value variates or Weibull variates (the logarithms of which are from an extreme-value distribution).
Abstract: In this paper, we give variables sampling plans for items whose failure times are distributed as either extreme-value variates or Weibull variates (the logarithms of which are from an extreme-value distribution). Tables applying to acceptance regions and operating characteristics for sample size n, ranging from 3 to 18, are given. The tables allow for Type II censoring, with censoring number r ranging from 3 to n. In order to fix the maximum time on test, the sampling plan also allows for Type I censoring. Acceptance/rejection is based upon a statistic incorporating best linear invariant estimates, or, alternatively, maximum likelihood estimates of the location and scale parameters of the underlying extreme value distribution. The operating characteristics are computed using an approximation discussed by Fertig and Mann (1980).

125 citations
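
The fitting step behind such plans can be sketched as follows: maximum likelihood estimation of the location and scale of the smallest-extreme-value distribution of log failure times under Type II censoring. The acceptance constants themselves come from the paper's tables and are not reproduced here; the data, sample sizes, and function names below are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def evd_type2_mle(log_times, n):
    """ML estimates (u, b) of the smallest-extreme-value location/scale from the
    r smallest log failure times out of n items on test (Type II censoring)."""
    y = np.sort(np.asarray(log_times, float))
    r = len(y)

    def negloglik(theta):
        u, log_b = theta
        b = np.exp(log_b)                                # keep the scale positive
        z = (y - u) / b
        ll = np.sum(-np.log(b) + z - np.exp(z))          # observed failures
        ll += (n - r) * (-np.exp(z[-1]))                 # survivors censored at y_(r)
        return -ll

    res = minimize(negloglik, x0=[y.mean(), np.log(y.std() + 1e-6)], method='Nelder-Mead')
    return res.x[0], np.exp(res.x[1])

# Hypothetical lot: Weibull(shape=2, scale=100) lifetimes, n=10 on test, first r=6 failures observed
rng = np.random.default_rng(1)
t = np.sort(rng.weibull(2.0, size=10) * 100.0)
u_hat, b_hat = evd_type2_mle(np.log(t[:6]), n=10)
print("location:", round(u_hat, 3), "scale:", round(b_hat, 3))   # true values: ln(100) ~ 4.61 and 0.5
```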


Book ChapterDOI
01 Jan 1980
TL;DR: Sampling is only a tool which the entomologist should use to obtain certain information, provided there is no easier way to get the information, and only by having a clear understanding of the end can the sampling program be optimized.
Abstract: One of the first things that a field entomologist learns is how to sample an insect population, for it is a tenet of the discipline that until one knows what species are present and how many there are, nothing is known. Some entomologists spend a major portion of their time working on sampling techniques or on interpreting sampling data. It is easy for entomologists to become convinced that the object of their professional existence is to work on sampling. As one drifts in this direction, it is beneficial to consider a statement by Morris (1960): “Sampling has no intrinsic merit, but is only a tool which the entomologist should use to obtain certain information, provided there is no easier way to get the information.” Sampling is a means to an end. Only by having a clear understanding of the end can the sampling program be optimized.

124 citations


Journal ArticleDOI
TL;DR: An efficient method for weighted sampling of K objects without replacement from a population of n objects is proposed, which requires fewer additions and comparisons than the method proposed by Fagin and Price.
Abstract: In this note, an efficient method for weighted sampling of K objects without replacement from a population of n objects is proposed. The method requires $O(K\log n)$ additions and comparisons, and $O(K)$ multiplications and random number generations while the method proposed by Fagin and Price requires $O(Kn)$ additions and comparisons, and $O(K)$ divisions and random number generations.
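
The note's abstract does not spell out the data structure, but one standard way to reach an O(K log n) cost is to keep the weights in a binary-indexed (Fenwick) tree of partial sums, so each draw and removal takes O(log n) additions and comparisons. The sketch below is a generic implementation of that idea, not necessarily the authors' procedure.

```python
import random

def weighted_sample_without_replacement(weights, k, rng=random):
    """Draw k distinct indices with probability proportional to `weights`,
    removing each chosen item. A Fenwick tree of partial sums makes each
    draw cost O(log n) additions and comparisons."""
    n = len(weights)
    tree = [0.0] * (n + 1)                   # 1-based Fenwick tree over the weights

    def add(i, delta):                       # add delta to item i (1-based)
        while i <= n:
            tree[i] += delta
            i += i & (-i)

    def find(u):                             # smallest i with prefix_sum(i) >= u
        i, mask = 0, 1 << n.bit_length()
        while mask:
            j = i + mask
            if j <= n and tree[j] < u:
                u -= tree[j]
                i = j
            mask >>= 1
        return i + 1

    total = 0.0
    for idx, w in enumerate(weights, start=1):
        add(idx, w)
        total += w

    chosen = []
    for _ in range(k):
        u = rng.random() * total
        i = find(u)
        add(i, -weights[i - 1])              # remove the chosen item from the tree
        total -= weights[i - 1]
        chosen.append(i - 1)
    return chosen

print(weighted_sample_without_replacement([5.0, 1.0, 1.0, 3.0], k=2))
```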

Journal ArticleDOI
TL;DR: In this paper, it was shown that orthant sampling, which has previously been used to approximately sample microcanonical ensembles of anharmonic oscillators, is an exact sampling technique when applied to harmonic oscillators.

01 Jan 1980
TL;DR: This document is designed for users of the program developed at Sandia Laboratories by the authors to generate Latin hypercube samples to study effects of distributional assumptions on key input variables without rerunning the computer model.
Abstract: This document is designed for users of the program developed at Sandia Laboratories by the authors to generate Latin hypercube samples. Latin hypercube sampling is a recently developed sampling technique for generating input vectors into computer models for purposes of sensitivity analysis studies. In addition to providing a cost-effective and reliable sampling scheme, the Latin hypercube sampling technique also gives the user the flexibility to efficiently study the effects of distributional assumptions on key input variables without rerunning the computer model. 5 figures, 2 tables.
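
For readers unfamiliar with the technique, a minimal sketch of the basic Latin hypercube construction follows: each input's range is split into N equal-probability strata, one value is drawn per stratum, and the strata are shuffled independently for each variable. This is a generic illustration, not the Sandia program itself.

```python
import numpy as np
from scipy.stats import norm

def latin_hypercube(n_samples, n_vars, rng=None):
    """Return an (n_samples, n_vars) array of LHS points on [0, 1)^n_vars:
    each variable's range is split into n_samples equal strata, one point is
    drawn per stratum, and the strata are independently shuffled per variable."""
    rng = np.random.default_rng(rng)
    u = rng.random((n_samples, n_vars))                  # position within each stratum
    strata = np.array([rng.permutation(n_samples) for _ in range(n_vars)]).T
    return (strata + u) / n_samples

X = latin_hypercube(10, 3, rng=42)
# Map the uniform design to input distributions via inverse CDFs, e.g. a normal input:
x_normal = norm.ppf(X[:, 0], loc=0.0, scale=1.0)
print(X.shape, np.round(x_normal, 2))
```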

Journal ArticleDOI
TL;DR: A semisystematic sampling scheme is introduced which allows a higher sampling intensity and permits Hopkins' method to be used without complete enumeration of the study region and a new test for 'randomness' related to Hopkins' test is introduced.
Abstract: 'Distance' or 'nearest neighbour' methods are often used as alternatives to counting plants within squares ('quadrat' methods), either to estimate the number of plants in a study region or to test the 'randomness' of their pattern. To be specific we will consider trees; Figure 1 illustrates a 10 metre square plot of pines (from Strand, 1972). The suggested procedures for density estimation and for testing have been compared by Diggle (1975, 1977), Diggle, Besag and Gleaves (1976), and Hines and Hines (1979). The method suggested by Hopkins (1954) involves measuring the following squared distances: u from a random point to the nearest tree and v from a randomly chosen tree to its nearest neighbour. Edge effects should be made negligible by placing the study region from which random points and plants are to be selected well within the region of interest. Hopkins' method has generally been preferred in comparison studies but is usually regarded as impracticable since, to find a random tree, all trees in the study region should be counted and a randomly numbered tree chosen. Squared distances arise because the areas swept out in searches for the nearest trees are πu and πv, if the search is thought of in concentric circles about the chosen point or tree. The distribution theory assumes that for a Poisson process the areas πu and πv will be independent. For this to hold it should be checked that the areas searched do not overlap, which constrains the number of samples that can be taken. We consider bounds on the sampling intensity and we introduce a semisystematic sampling scheme which allows a higher sampling intensity and permits Hopkins' method to be used without complete enumeration of the study region. A new test for 'randomness' related to Hopkins' test is introduced; a Monte Carlo study shows that this test and
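
As a reference point, the classical Hopkins construction mentioned above can be computed as below: squared distances from random points to the nearest plant (u) and from randomly chosen plants to their nearest neighbour (v), combined into the ratio Σu/(Σu+Σv), which is near 1/2 under complete spatial randomness. This sketch ignores edge effects and does not implement the paper's semisystematic scheme; the data are simulated.

```python
import numpy as np

rng = np.random.default_rng(3)

def hopkins_statistic(plants, m, window=(0.0, 10.0)):
    """Classical Hopkins statistic: sum of squared point-to-nearest-plant
    distances (u) over the sum of u and squared plant-to-nearest-neighbour
    distances (v); near 1/2 for a completely random pattern."""
    lo, hi = window
    pts = rng.uniform(lo, hi, size=(m, 2))                    # m random sample points
    idx = rng.choice(len(plants), size=m, replace=False)      # m randomly chosen plants
    u = np.min(np.linalg.norm(plants[None, :, :] - pts[:, None, :], axis=2), axis=1) ** 2
    d_pp = np.linalg.norm(plants[None, :, :] - plants[idx][:, None, :], axis=2)
    d_pp[np.arange(m), idx] = np.inf                          # exclude each plant itself
    v = np.min(d_pp, axis=1) ** 2
    return u.sum() / (u.sum() + v.sum())

plants = rng.uniform(0.0, 10.0, size=(100, 2))                # a simulated 'random' pattern
print(round(hopkins_statistic(plants, m=15), 3))
```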

Journal ArticleDOI
TL;DR: The index is likely to be least stable when the transformed variables of low heritability have high economic weights, and the weights will have highest sampling variance when these heritabilities are nearly equal.
Abstract: A transformation is proposed of the variables used for constructing genetic selection indices, such that the reparameterized phenotypic covariance matrix is identity and the genetic covariance matrix is diagonal. These diagonal elements play the role of heritabilities of the transformed variables. The reparameterization enables sampling properties of the index weights to be easily computed and formulae are given for data from half-sib families. The index is likely to be least stable when the transformed variables of low heritability have high economic weights, and the weights will have highest sampling variance when these heritabilities are nearly equal. It is suggested that the sample roots of the determinantal equation be inspected when constructing an index in order to give some guide to its accuracy.
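
The reparameterization described corresponds to a simultaneous diagonalization of the phenotypic (P) and genetic (G) covariance matrices, which can be obtained from the generalized eigendecomposition of (G, P). Below is a minimal numpy sketch of that standard construction with made-up covariance matrices; it is not code from the paper.

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical phenotypic (P) and genetic (G) covariance matrices for 3 traits
P = np.array([[1.0, 0.3, 0.2],
              [0.3, 1.5, 0.4],
              [0.2, 0.4, 2.0]])
G = np.array([[0.4, 0.1, 0.1],
              [0.1, 0.6, 0.2],
              [0.1, 0.2, 0.5]])

# Generalized eigenproblem G v = lambda P v; columns of V satisfy V.T P V = I
lam, V = eigh(G, P)
T = V.T                                    # transform x -> y = T x

print(np.round(T @ P @ T.T, 10))           # identity: transformed phenotypic covariance
print(np.round(T @ G @ T.T, 10))           # diag(lam): 'heritabilities' of the new variables
```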

Journal ArticleDOI
TL;DR: Effect of insufficient transverse sampling on quantitative positron emission computed tomography (ECT) was investigated with computer simulation and measurements on parallel bar and line source phantoms and employing sampling distances smaller than one-third of the intrinsic detector FWHM eliminated noticeable aliasing artifacts.
Abstract: Effect of insufficient transverse sampling on quantitative positron emission computed tomography (ECT) was investigated with computer simulation and measurements on parallel bar and line source phantoms. Aliasing artifacts were observed and were found to be dependent on both the configuration and the location of imaged objects. Images of parallel bar phantoms were found to have aliasing artifacts similar in characteristics to aliasing on one-dimensional signals. In line source images, aliasing effects were manifested as variations in amplitude and full width at half maximum resolution (FWHM) for sources at even slightly different locations in the field of view. It was found that employing sampling distances smaller than one-third of the intrinsic detector FWHM eliminated noticeable aliasing artifacts. Image resolution was also found to be affected by the sampling distance. For a sampling distance equal to one-half of the intrinsic detector FWHM, the imaging FWHM is about 10% worse than the intrinsic FWHM. Selection of sampling distance in noisy environments is discussed. Parallel bar phantoms are shown to have advantages over line sources in the evaluation of sampling and resolution performance of ECT scanners.


Journal ArticleDOI
TL;DR: In this paper, the authors address the technology for soil sampling of large agricultural fields which are inherently variable in both space and time, and they use geostatistical and classical statistical methods to identify the fiducial limits within which it is expected that the true mean salinity exists for given levels of probability.
Abstract: This study addresses the technology for soil sampling of large agricultural fields, which are inherently variable in both space and time. Three several-hundred-hectare fields in southwest Iran, initially sampled on an arbitrarily selected grid of 80 m to ascertain soil salinity levels, were analyzed using both geostatistical and classical statistical methods. The results from two fields showed that the variance structure of the salinity observations was spatially dependent, and hence, geostatistical techniques allowed best linear unbiased estimates of salinity values interpolated between spatially observed sampling locations to yield contour lines of isosalinity. In the third field, salinity observations were found to be spatially independent, and hence, were analyzed by classical methods to yield the number of soil samples necessary to observe the fiducial limits within which it is expected that the true mean salinity exists for given levels of probability.
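
The first geostatistical step implied here, checking whether salinity observations are spatially dependent, can be sketched with an empirical semivariogram: a roughly flat curve suggests spatial independence, a rising curve suggests dependence. The grid, salinity values, and bin choices below are invented for illustration; kriging itself is not shown.

```python
import numpy as np

def empirical_semivariogram(coords, values, bin_edges):
    """gamma(h) = 0.5 * mean[(z_i - z_j)^2] over pairs whose separation falls
    in each distance bin."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)                 # each pair counted once
    d, sq = d[iu], sq[iu]
    centers, gamma = [], []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (d >= lo) & (d < hi)
        if mask.any():
            centers.append(0.5 * (lo + hi))
            gamma.append(sq[mask].mean())
    return np.array(centers), np.array(gamma)

# Hypothetical 80 m grid of salinity (EC) readings with a smooth spatial trend plus noise
rng = np.random.default_rng(7)
gx, gy = np.meshgrid(np.arange(0, 800, 80), np.arange(0, 800, 80))
coords = np.column_stack([gx.ravel(), gy.ravel()]).astype(float)
ec = 4.0 + 0.004 * coords[:, 0] + rng.normal(0, 0.5, len(coords))
h, gamma = empirical_semivariogram(coords, ec, np.arange(0, 500, 80))
print(np.round(h), np.round(gamma, 2))
```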

Journal ArticleDOI
TL;DR: A useful measure of diversity was calculated for microbial communities collected from lake water and sediment samples using the Shannon index (H') and rarefaction [E(S)], which accounts for differences in sample size inherently so that comparisons are made simple.
Abstract: A useful measure of diversity was calculated for microbial communities collected from lake water and sediment samples using the Shannon index (H′) and rarefaction [E(S)]. Isolates were clustered by a numerical taxonomy approach in which limited (<20) tests were used so that the groups obtained represented a level of resolution other than species. The numerical value of diversity for each sample was affected by the number of tests used; however, the relative diversity compared among several sampling locations was the same whether 11 or 19 characters were examined. The number of isolates (i.e., sample size) strongly influenced the value of H′ so that unequal sized samples could not be compared. Rarefaction accounts for differences in sample size inherently so that such comparisons are made simple. Due to the type of sampling carried out by microbiologists, H′ is estimated and not determined and therefore requires a statement of error associated with it. Failure to report error provided potentially misleading results. Calculation of the variance of H′ is not a simple matter and may be impossible when handling a large number of samples. With rarefaction, the variance of E(S) is readily determined, facilitating the comparison of many samples.
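
Below is a sketch of the two quantities being compared: the plug-in Shannon index H' and the rarefaction expectation E(S) (Hurlbert's formula, which standardizes samples to a common number of isolates). The cluster counts are invented, and the variance calculations discussed in the abstract are not reproduced here.

```python
import numpy as np
from scipy.special import gammaln

def shannon_index(counts):
    """Plug-in Shannon index H' = -sum p_i ln p_i."""
    p = np.asarray(counts, float)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log(p))

def rarefaction(counts, n):
    """Expected number of groups E(S_n) in a random subsample of n isolates
    (Hurlbert's formula), using log-binomial coefficients for stability."""
    counts = np.asarray(counts)
    N = counts.sum()

    def log_choose(a, b):
        return gammaln(a + 1) - gammaln(b + 1) - gammaln(a - b + 1)

    p_absent = np.where(N - counts >= n,
                        np.exp(log_choose(N - counts, n) - log_choose(N, n)),
                        0.0)
    return float(np.sum(1.0 - p_absent))

sediment = [40, 25, 10, 8, 5, 3, 2, 1, 1]   # hypothetical cluster sizes (95 isolates)
lake = [20, 15, 5, 3, 2]                    # hypothetical cluster sizes (45 isolates)
print(round(shannon_index(sediment), 3), round(shannon_index(lake), 3))
print(round(rarefaction(sediment, 45), 2), round(rarefaction(lake, 45), 2))   # compared at a common n
```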

Journal ArticleDOI
TL;DR: In this paper, the authors examined the impact of sample design on nonlinear statistics from complex samples, and the behavior of the chi-squared statistic computed from a complex sample to test hypotheses of goodness of fit or independence is studied.
Abstract: The impact on linear statistics of the sample design used in obtaining survey data is the subject of much of sampling literature. Recently, more attention has been paid to the design's impact on nonlinear statistics; the major factor inhibiting these investigations has been the problem of estimating at least the first two moments of such statistics. The present article examines the problem of estimating the variances of nonlinear statistics from complex samples, in the light of existing literature. The behavior of the chi-squared statistic computed from a complex sample to test hypotheses of goodness of fit or independence is studied. Alternative tests are developed and their properties studied in simulation experiments.
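
The paper develops and evaluates its own corrected tests; purely as a rough illustration of the general idea, the sketch below applies a first-order design-effect adjustment to the Pearson statistic (dividing by an assumed average design effect), a common device for survey data. The table and the deff value are made up, and this is not the article's method.

```python
import numpy as np
from scipy.stats import chi2

def pearson_X2(table):
    """Pearson chi-squared statistic for independence in an r x c table."""
    table = np.asarray(table, float)
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / table.sum()
    return np.sum((table - expected) ** 2 / expected)

# Survey-weighted contingency table and an assumed average design effect
table = np.array([[120.0, 80.0],
                  [ 60.0, 90.0]])
deff = 1.8                                     # hypothetical average design effect

X2 = pearson_X2(table)
X2_adj = X2 / deff                             # first-order correction for the complex design
df = (table.shape[0] - 1) * (table.shape[1] - 1)
print(round(X2, 2), round(X2_adj, 2), round(chi2.sf(X2_adj, df), 4))
```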

Journal ArticleDOI
TL;DR: A nationwide study on the natural radioactivity of drinking water has been made in Finland, directed mainly at drinking water distributed by water supply plants with more than 200 users, and samples were collected from private dug and drilled wells.
Abstract: A nationwide study on the natural radioactivity of drinking water has been made in Finland. The study was directed mainly at drinking water distributed by water supply plants with more than 200 users. Additional samples were collected from private dug and drilled wells. The samples were analysed for 222Rn, 226Ra, gross α- and gross β-activity. Half of the samples collected from drilled wells were analysed for uranium as well. The radioactivity of drinking water distributed by the water supply plants was on the average low, the mean concentration being 670 pCi/l for 222Rn and 0.1 pCi/l for 226Ra. The most radioactive water was found in drilled wells, in which the mean concentrations were 17,000 pCi/l for 222Rn and 2.9 pCi/l for 226Ra. Some of the drilled wells also had abnormally high concentrations of uranium, up to 2100 µg/l and even higher in some wells in the Helsinki region. INTRODUCTION Two-thirds of the population of Finland live in areas served by municipal and privately owned water supply plants with more than 200 users. Most of the water distributed for consumption is surface water. The use of ground water is, however, increasing, and in 1975 accounted for about 38% of the total water consumed (Na76). The population not served by the water supply plant distribution network uses untreated ground water for drinking and household use. The two most common types of well are the dug well and the deep-drilled well. Drilled wells have become more and more common, especially in recent years, because in some areas they are the only means of obtaining potable water. Several natural radioactive elements are present in water, but only 222Rn and the long-lived radium isotopes 226Ra and 228Ra have been found in concentrations that are significant from the point of view of radiation hygiene. Uranium, if present in large amounts in drinking water, is harmful because of its chemical toxicity. The study of the natural radioactivity of tap and raw water distributed by water supply plants was performed in Finland from 1974 to 1977. The study was supplemented later on by taking samples from private drilled wells and dug wells in areas where municipal tap water systems were not in use. The results of the study are presented in this paper, together with a summary of the results obtained in two separate studies carried out in the Helsinki region from 1967 to 1968 and from 1975 to 1978. MATERIAL AND METHODS Sample collection The scope of the study of water supply plants is given in Table 1. Of all the samples collected, 735 represented tap water, which was partly surface or ground water treated by some purification method and partly untreated ground water. Including the samples collected from private wells, a total of 690 ground water supplies (dug wells and springs) was studied. The study on drilled wells is still going on. At the end of 1978, 878 samples had been taken from ground water in the bedrock. A small proportion of these samples was




Journal Article
TL;DR: Results of mold surveys in seven homes are reported and the advantages and disadvantages of the different techniques for mold evaluation are illustrated.
Abstract: Results of mold surveys in seven homes are reported. The advantages and disadvantages of the different techniques for mold evaluation are illustrated. No one sampling technique is adequate for detection of endogenous mold problems. Ideally, one should incorporate direct Scotch tape imprints, direct cultures of suspected material, Andersen sampling and a rotorod study to obtain an accurate assessment of the endogenous mold population.

Journal ArticleDOI
TL;DR: A new criterion for accurate aerosol sampling in calm air is derived and experimental data are discussed in the light of this new sampling criterion, and the data are found to be in accord with the theoretical predictions.
Abstract: The problem of aerosol sampling under calm air conditions has been studied by solving the Navier-Stokes equations and integrating the equations of particle motion. Both particle inertia and settling are considered. The sampling efficiency of an inlet has been found to depend on two dimensionless parameters, the Stokes number and the relative settling velocity. A new criterion for accurate aerosol sampling in calm air is then derived. Experimental data of several researchers are discussed in the light of this new sampling criterion, and the data are found to be in accord with the theoretical predictions.
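
The two dimensionless parameters named in the abstract can be evaluated from standard Stokes-regime particle mechanics: a Stokes number τU/D built from the particle relaxation time τ, and the settling velocity τg expressed relative to the inlet velocity. The inlet dimensions, flow velocity, and particle properties below are assumed values for illustration (slip correction is neglected), and the paper's sampling criterion itself is not reproduced.

```python
# Standard Stokes-regime particle mechanics (illustrative values only)
rho_p = 1000.0        # particle density, kg/m^3 (water-like, hypothetical)
d_p   = 10e-6         # particle diameter, m
mu    = 1.81e-5       # air viscosity, Pa*s
g     = 9.81          # gravitational acceleration, m/s^2

# Hypothetical sampling inlet: diameter D, face velocity U
D = 0.01              # m
U = 1.0               # m/s

tau = rho_p * d_p**2 / (18.0 * mu)     # particle relaxation time, s
stokes_number = tau * U / D            # inertia parameter
settling_ratio = tau * g / U           # settling velocity relative to inlet velocity

print(f"tau = {tau:.2e} s, Stk = {stokes_number:.3f}, Vs/U = {settling_ratio:.3f}")
```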

Journal ArticleDOI
01 Apr 1980-Ecology
TL;DR: The small sample bias, at least in the particular case examined in detail, is much smaller for an estimator of the Simpson Index than for any proposed estimators of the Shannon—Wiener Index.
Abstract: If a community is too large to be censused, quantities such as the diversity must be estimated from a sample. Methods using data from a simple random sample of individuals to estimate either the Shannon-Wiener or the Simpson Index of diversity are of limited use. Seldom is it practical to obtain such a sample. It is usually easier to use a sampling unit such as the plants in a quadrat or the animals trapped in a net. Methods which use the data from such a scheme to estimate the two commonly used indices are discussed here. (Computational procedures are outlined in a separate display.) The small sample bias, at least in the particular case examined in detail, is much smaller for an estimator of the Simpson Index than for any proposed estimator of the Shannon-Wiener Index. This property appears to provide the former index with a distinct advantage for measuring diversity and its ecological components.
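
The bias contrast can be seen even in the simplest individual-sampling case (not the quadrat-based schemes the article treats): a small simulation comparing the bias-corrected Simpson estimator with the plug-in Shannon-Wiener estimator. Community proportions and sample sizes below are invented.

```python
import numpy as np

rng = np.random.default_rng(11)
p_true = np.array([0.4, 0.2, 0.15, 0.1, 0.07, 0.05, 0.03])   # hypothetical community

simpson_true = 1.0 - np.sum(p_true ** 2)
shannon_true = -np.sum(p_true * np.log(p_true))

n, reps = 20, 20000
x = rng.multinomial(n, p_true, size=reps)                     # repeated samples of n individuals

# Bias-corrected Simpson estimator: 1 - sum n_i(n_i - 1) / (n(n - 1))
simpson_hat = 1.0 - np.sum(x * (x - 1), axis=1) / (n * (n - 1))

# Plug-in Shannon-Wiener estimator (0 log 0 treated as 0)
q = x / n
shannon_hat = -np.sum(np.where(q > 0, q * np.log(np.where(q > 0, q, 1.0)), 0.0), axis=1)

print("Simpson bias:", round(simpson_hat.mean() - simpson_true, 4))    # close to 0
print("Shannon bias:", round(shannon_hat.mean() - shannon_true, 4))    # clearly negative
```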

Journal ArticleDOI
TL;DR: A review of sampling with unequal probabilities without replacement is presented, along with an approximate formula for estimating the variance that does not involve the joint inclusion probabilities π_ij and a comparison of special estimators.
Abstract: Summary This paper deals with a review of sampling with unequal probabilities without replacement. In Section 2 of this paper, a list of selection procedures along with their properties (in brief) is given. Section 3 deals with comparisons involving the Horvitz-Thompson estimator; an approximate formula for the estimation of variance, which does not involve the joint inclusion probabilities π_ij, is presented, and a comparison of special estimators is made in Section 4.
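
For context, the Horvitz-Thompson estimator at the centre of this comparison is simply the sum of sampled values weighted by reciprocal inclusion probabilities. The sketch below shows the point estimator only; the variance approximation avoiding π_ij that the paper reviews is not reproduced, and the data and probabilities are invented.

```python
import numpy as np

def horvitz_thompson_total(y, pi):
    """Horvitz-Thompson estimator of a population total: sum(y_i / pi_i)
    over the sampled units, where pi_i is unit i's inclusion probability."""
    y, pi = np.asarray(y, float), np.asarray(pi, float)
    return np.sum(y / pi)

# Hypothetical sample of 4 units drawn without replacement with unequal probabilities
y_sample  = np.array([12.0, 30.0,  7.0, 22.0])   # observed values
pi_sample = np.array([0.10, 0.35, 0.05, 0.25])   # their inclusion probabilities
print(horvitz_thompson_total(y_sample, pi_sample))
```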

Journal ArticleDOI
TL;DR: An accurate acceptance-rejection algorithm is devised and tested, which requires an average of less than 3 uniform deviates whenever the standard deviation σ of the distribution is at least 4, and this number decreases monotonically to 2.63 as σ→∞.
Abstract: An accurate acceptance-rejection algorithm is devised and tested. The procedure requires an average of less than 3 uniform deviates whenever the standard deviation σ of the distribution is at least 4, and this number decreases monotonically to 2.63 as σ→∞. Variable parameters are permitted, and no subroutines for sampling from other statistical distributions are needed.
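
The paper's algorithm and its constants are not reproduced in this listing; as a generic illustration of the acceptance-rejection idea, the sketch below samples a standard normal variate using an exponential-magnitude (Laplace) envelope, the textbook example. The per-variate cost quoted in the abstract refers to the paper's own procedure, not to this sketch.

```python
import math
import random

def std_normal_ar(rng=random):
    """Standard normal via acceptance-rejection: propose |X| ~ Exp(1) (the
    magnitude of a Laplace envelope), accept with probability
    exp(-(x - 1)^2 / 2), i.e. f(x) / (c * g(x)), then attach a random sign.
    The acceptance rate per trial is 1/c = sqrt(pi / (2e)), about 0.76."""
    while True:
        x = -math.log(1.0 - rng.random())               # Exp(1) proposal magnitude
        if rng.random() <= math.exp(-0.5 * (x - 1.0) ** 2):
            return x if rng.random() < 0.5 else -x      # random sign after acceptance

sample = [std_normal_ar() for _ in range(100000)]
mean = sum(sample) / len(sample)
var = sum((s - mean) ** 2 for s in sample) / len(sample)
print(round(mean, 3), round(var, 3))                    # should be near 0 and 1
```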