
Showing papers on "Sampling (statistics)" published in 1985


Journal ArticleDOI
TL;DR: Theoretical and empirical results indicate that Algorithm Z outperforms current methods by a significant margin, and an efficient Pascal-like implementation is given that incorporates these modifications and that is suitable for general use.
Abstract: We introduce fast algorithms for selecting a random sample of n records without replacement from a pool of N records, where the value of N is unknown beforehand. The main result of the paper is the design and analysis of Algorithm Z; it does the sampling in one pass using constant space and in O(n(1 + log(N/n))) expected time, which is optimum, up to a constant factor. Several optimizations are studied that collectively improve the speed of the naive version of the algorithm by an order of magnitude. We give an efficient Pascal-like implementation that incorporates these modifications and that is suitable for general use. Theoretical and empirical results indicate that Algorithm Z outperforms current methods by a significant margin.
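For context, here is a minimal Python sketch of the reservoir idea that Algorithm Z accelerates (the paper itself gives a Pascal-like implementation; Algorithm Z additionally generates random skip distances so that most records are passed over without a random draw per record):

import random

def reservoir_sample(stream, n):
    """One-pass uniform sample of n records from a stream of unknown length."""
    reservoir = []
    for t, record in enumerate(stream):
        if t < n:
            reservoir.append(record)
        else:
            # Record t+1 replaces a random reservoir slot with probability n/(t+1).
            j = random.randrange(t + 1)
            if j < n:
                reservoir[j] = record
    return reservoir

# Example: 5 records from a pool whose size is not known in advance.
print(reservoir_sample(iter(range(100_000)), 5))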

1,725 citations


Book
01 Jan 1985
TL;DR: Foundations of social science research, basic research designs, elements of measurement and sampling, data collection strategies, data analysis, and report writing.
Abstract: Foundations of social science research, basic research designs, elements of measurement and sampling, data collection strategies, data analysis, and report writing.

562 citations


Journal ArticleDOI
TL;DR: In this paper, the authors examined the effect of autocorrelation on six measures of home range and found that positive autocorrelation resulted in underestimation of home-range size.
Abstract: Accurate estimation of home-range size often requires large numbers of observations. Radiotelemetry and direct observation are capable of yielding large sample sizes in a short period of time, but observations collected using a short sampling interval often are autocorrelated (i.e., not independent). We examined the effect of autocorrelation on six measures of home range and found that positive autocorrelation resulted in underestimation of home-range size. In long-term studies of movement, sampling intervals should be chosen so that autocorrelation between successive observations is negligible. If home-range estimates must be obtained in a relatively short period of time, collection of autocorrelated data may be unavoidable; under these circumstances nonstatistical measures of home-range size are more appropriate than statistical measures.

445 citations


Journal ArticleDOI
01 Jul 1985
TL;DR: Stochastic sampling techniques allow the construction of alias-free approximations to continuous functions using discrete calculations and can be applied spatiotemporally as well as to other aspects of scene simulation.
Abstract: Stochastic sampling techniques, in particular Poisson and jittered sampling, are developed and analyzed. These approaches allow the construction of alias-free approximations to continuous functions using discrete calculations. Stochastic sampling scatters high frequency information into broadband noise rather than generating the false patterns produced by regular sampling. The type of randomness used in the sampling process controls the spectral character of the noise. The average sampling rate and the function being sampled determine the amount of noise that is produced. Stochastic sampling is applied adaptively so that a greater number of samples are taken where the function varies most. An estimate is used to determine how many samples to take over a given region. Noise reducing filters are used to increase the efficacy of a given sampling rate. The filter width is adaptively controlled to further improve performance. Stochastic sampling can be applied spatiotemporally as well as to other aspects of scene simulation. Ray tracing is one example of an image synthesis approach that can be antialiased by stochastic sampling.
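A rough illustration of the jittered variant (our sketch, not code from the paper): place one uniformly random sample inside each cell of a grid and average a function over those positions, so that aliasing shows up as broadband noise rather than structured patterns.

import math
import random

def jittered_samples(nx, ny):
    # One uniformly random sample inside each cell of an nx-by-ny grid.
    return [((i + random.random()) / nx, (j + random.random()) / ny)
            for i in range(nx) for j in range(ny)]

def estimate_average(f, samples):
    # Monte Carlo estimate of the mean of f over the unit square.
    return sum(f(x, y) for x, y in samples) / len(samples)

# Example with a high-frequency test function that would alias badly
# under regular sampling at this rate.
f = lambda x, y: math.sin(200 * x) * math.sin(200 * y)
print(estimate_average(f, jittered_samples(16, 16)))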

445 citations


Journal ArticleDOI
TL;DR: In this work, the authors discuss the role of cognitive psychology, surveying networks, and questionnaire construction and question writing, as well as survey implementation and management, survey data analysis, special types of surveys, and integrating surveys with other data collection methods.
Abstract: With chapters on sampling, measurement, questionnaire construction and question writing, survey implementation and management, survey data analysis, special types of surveys, and integrating surveys with other data collection methods, this title also covers topics such as measurement models, the role of cognitive psychology, and surveying networks.

407 citations


Journal ArticleDOI
01 Apr 1985
TL;DR: An overview of the theory of sampling and reconstruction of multidimensional signals is presented, including the role of the camera and display apertures and the human visual system, along with a class of nonlinear interpolation algorithms that adapt to the motion in the scene.
Abstract: Sampling is a fundamental operation in all image communication systems. A time-varying image, which is a function of three independent variables, must be sampled in at least two dimensions for transmission over a one-dimensional analog communication channel, and in three dimensions for digital processing and transmission. At the receiver, the sampled image must be interpolated to reconstruct a continuous function of space and time. In imagery destined for human viewing, the visual system forms an integral part of the reconstruction process. This paper presents an overview of the theory of sampling and reconstruction of multidimensional signals. The concept of sampling structures based on lattices is introduced. The important problem of conversion between different sampling structures is also treated. This theory is then applied to the sampling of time-varying imagery, including the role of the camera and display apertures, and the human visual system. Finally, a class of nonlinear interpolation algorithms which adapt to the motion in the scene is presented.
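To make the lattice idea concrete, a small hypothetical sketch (ours, not the paper's): a sampling structure is the set of points {Vn : n an integer vector} for a nonsingular sampling matrix V, and the matrix below generates a quincunx-style vertical-temporal pattern in which alternate time samples are offset by one line.

import numpy as np

# 1/|det V| is the sampling density (samples per unit area).
V = np.array([[2.0, 1.0],   # vertical position = 2*n1 + n2
              [0.0, 1.0]])  # temporal position = n2

def lattice_points(V, n_range):
    ns = np.array([(n1, n2) for n1 in n_range for n2 in n_range])
    return ns @ V.T          # each row is one (vertical, temporal) sample site

print(lattice_points(V, range(-2, 3)))
print("samples per unit area:", 1 / abs(np.linalg.det(V)))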

301 citations


Journal ArticleDOI
TL;DR: Monte Carlo methods were used to systematically study the effects of sampling error and model characteristics upon parameter estimates and their associated standard errors in maximum likelihood confirmatory factor analysis.
Abstract: Monte Carlo methods were used to systematically study the effects of sampling error and model characteristics upon parameter estimates and their associated standard errors in maximum likelihood confirmatory factor analysis. Sample sizes were varied from 50 to 300 for models defined by different numbers of indicators per factor, numbers of factors, correlations between factors, and indicator reliabilities. The measurement and structural parameter estimates were generally unbiased, except for the structural parameters relating factors defined by only two indicators. Sampling variability can be quite large, though, particularly when the sample size is smaller, there are fewer indicators per factor, and the reliabilities are lower. However, the estimated standard errors adjusted accordingly.

256 citations


Journal ArticleDOI
TL;DR: A sampling theory is presented that extends the uniform sampling theory of Whittaker et al. to include nonuniform sample distributions, showing that a more general result can be obtained by treating the sample sequence as the result of applying a coordinate transformation to the uniform sequence.
Abstract: The reconstruction of functions from their samples at nonuniformly distributed locations is an important task for many applications. This paper presents a sampling theory which extends the uniform sampling theory of Whittaker et al. [11] to include nonuniform sample distributions. This extension is similar to the analysis of Papoulis [15], who considered reconstructions of functions that had been sampled at positions deviating slightly from a uniform sequence. Instead of treating the sample sequence as deviating from a uniform sequence, we show that a more general result can be obtained by treating the sample sequence as the result of applying a coordinate transformation to the uniform sequence. It is shown that the class of functions reconstructible in this manner generally includes nonband-limited functions. The two-dimensional uniform sampling theory of Petersen and Middleton [16] can be similarly extended, as is shown in this paper. A practical algorithm for performing reconstructions of two-dimensional functions from nonuniformly spaced samples is described, as well as examples illustrating the performance of the algorithm.
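A one-dimensional sketch of the coordinate-transformation view (the warp and test function here are our own inventions, not from the paper): sample sites x_n = gamma(n) are a warp of the integers, and reconstruction is ordinary sinc interpolation carried out in the unwarped coordinate u = gamma^{-1}(x).

import numpy as np

def gamma(u):
    return u + 0.3 * np.sin(u)            # monotone, invertible warp

def gamma_inv(x, iters=60):
    u = np.array(x, dtype=float)
    for _ in range(iters):                 # fixed-point iteration; |0.3 cos u| < 1
        u = x - 0.3 * np.sin(u)
    return u

n = np.arange(-60, 61)
x_n = gamma(n)                             # nonuniform sample locations
f = lambda x: np.sinc(0.8 * gamma_inv(x))  # band-limited after unwarping

x = np.linspace(-10, 10, 201)
# Reconstruction: f(x) = sum_n f(x_n) * sinc(gamma_inv(x) - n)
recon = np.sinc(gamma_inv(x)[:, None] - n[None, :]) @ f(x_n)
print(np.max(np.abs(recon - f(x))))        # small (series-truncation error only)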

244 citations


Journal ArticleDOI
TL;DR: In this paper, the authors define two levels of parameters: the basic parameters are associated with the model and experiment(s), and the observations define a set of identifiable observational parameters that are functions of the basic parameters.
Abstract: We define two levels of parameters. The basic parameters are associated with the model and experiment(s). However, the observations define a set of identifiable observational parameters that are functions of the basic parameters. Starting with this formulation, we show that an implicit function approach provides a common basis for examining local identifiability and estimability and gives a lead-in to the problem of optimal sampling design. A least squares approach based on a large but finite set of observations generated at initial parameter estimates then gives a uniform approach to local identifiability, estimability, and the generation of an optimal sampling schedule.
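A sketch of the least-squares identifiability check at initial parameter estimates (the two-parameter model and the candidate sampling times below are hypothetical, not from the paper):

import numpy as np

def model(theta, t):
    a, b = theta
    return a * np.exp(-b * t)   # toy observational model

def sensitivity(theta0, times, h=1e-6):
    # Finite-difference Jacobian of the observations w.r.t. the basic parameters.
    theta0 = np.asarray(theta0, dtype=float)
    J = np.empty((len(times), len(theta0)))
    for k in range(len(theta0)):
        d = np.zeros_like(theta0)
        d[k] = h
        J[:, k] = (model(theta0 + d, times) - model(theta0 - d, times)) / (2 * h)
    return J

J = sensitivity((2.0, 0.5), np.array([0.5, 1.0, 2.0, 4.0]))
# Full column rank of J <=> local identifiability from these observations;
# det(J.T @ J) is one score for comparing candidate sampling schedules.
print(np.linalg.matrix_rank(J), np.linalg.det(J.T @ J))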

216 citations



Journal ArticleDOI
TL;DR: It was recommended that other investigators validate their handwiping, house dust sampling, and digestion techniques to facilitate comparison of results across studies.

01 Jan 1985
TL;DR: A general protocol for evaluating new or existing products under accelerated shelf-life testing (ASLT) procedures is presented and discussed, and the use of this protocol for predicting the shelf life of a frozen pizza product is illustrated.
Abstract: A general protocol for evaluating new or existing products under accelerated shelf-life testing (ASLT) procedures is presented and discussed, and the use of this protocol for predicting the shelf life of a frozen pizza product is illustrated. Problems, errors, and guidelines for using the protocol (including testing design, storage conditions, sampling, and analysis) are discussed. Considerations in ASLT must be given to assessing shelf-life losses due to microbiological, chemical, and physical changes as well as to nutrient losses.

Journal ArticleDOI
TL;DR: Although the purpose of many drift studies is to describe quantitatively the abundance of drifting invertebrates and make comparisons between seasons or sites, almost no investigations have employe... as discussed by the authors.
Abstract: Although the purpose of many drift studies is to describe quantitatively the abundance of drifting invertebrates and make comparisons between seasons or sites, almost no investigations have employe...

Journal ArticleDOI
TL;DR: Environmental data usually have the following characteristics which make them difficult to analyze: (1) the data are cut off from below by detection limits; (2) the sample size is very small.
Abstract: When environmental phenomena are measured, the measuring devices/procedures used are unable to detect low concentrations. Thus, concentrations below certain threshold levels are not measurable. Standard “detection limits” are set by various agencies for various phenomena for various types of measuring devices. Measured values below these limits are reported as “below detection limit” or as “trace” and are thus not available for statistical analysis. (Sometimes values below these limits are available, but their accuracy is greatly in doubt.) Consequently, the statistician often has a very basic problem facing him: how does he analyze data sets that contain a reasonable percentage of “below detection limit” entries? Environmental data are characterized not only by detection limits but also by small sample size. Required measurements for compliance purposes often are performed annually, quarterly, or, at most, monthly due to the expense or disruption caused by the testing. Studies of pilot plants or demonstration plants are often of such short duration that 5-10 samples are all that are obtained. Thus, methods for estimating the parameters of environmental data using asymptotic or large sample size procedures are usually inapplicable. In summary, environmental data usually have the following characteristics which make them difficult to analyze: (1) The data are cut off from below by detection limits. (2) The sample size is very small. As an example, suppose we have taken eight samples of air near a chemical warehouse in order to see if there are leaks (fugitive emissions). Concentrations below 0.8 ppb, say, are below the reliability of the measurement procedure. Of the eight samples, suppose five are below the detection limit while the other three are measured to have concentrations of 1, 2, and 5 ppb. How do we find the average concentration? Surely the smallest value that the true average could be is (0 + 0 + 0 + 0 + 0 + 1 + 2 + 5)/8 = 1 ppb.
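The abstract's closing arithmetic generalizes to a simple bounding rule: substituting zero for each censored value gives the smallest possible mean, and substituting the detection limit gives the largest. A quick sketch using the abstract's own numbers:

detection_limit = 0.8
measured = [1.0, 2.0, 5.0]
n_censored = 5
n = n_censored + len(measured)

low = sum(measured) / n                                    # censored -> 0
high = (n_censored * detection_limit + sum(measured)) / n  # censored -> detection limit
print(f"true mean lies between {low:.2f} and {high:.2f} ppb")  # 1.00 and 1.50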


01 Jan 1985
TL;DR: In this article, the authors provide recommendations, drawn from both the scientific literature and their own experience, for the collection of high quality groundwater samples, noting that the precautions and procedures recommended must be considered carefully for specific sampling applications.
Abstract: This document provides recommendations for the collection of high quality groundwater samples. The supporting information provided is drawn from both the scientific literature and the authors' experience. It should be noted that the precautions and procedures recommended herein must be considered carefully for specific sampling applications.

Journal ArticleDOI
TL;DR: An algorithm for optimal data collection in random fields, the so-called variance reduction analysis, is presented; it is an extension of kriging, and the sequence of sampling sites it selects shows a high degree of stability with respect to noisy inputs.
Abstract: This paper presents an algorithm for optimal data collection in random fields, the so-called variance reduction analysis, which is an extension of kriging. The basis of variance reduction analysis is an information response function (i.e., the amount of information gain at an arbitrary point due to a measurement at another site). The ranking of potential sites is conducted using an information ranking function. The optimal number of new points is then identified by an economic gain function. The selected sequence of sites for further sampling shows a high degree of stability with respect to noisy inputs.
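A heavily simplified sketch of the site-ranking step, under strong assumptions of ours (a known exponential covariance, error-free measurements, one new site at a time; the sites and covariance parameters are invented): for simple kriging, a single observation at site s reduces the estimation variance at point x by C(x,s)^2 / C(s,s), which is one concrete form of an information response function.

import numpy as np

def cov(p, q, sill=1.0, corr_len=3.0):
    return sill * np.exp(-np.linalg.norm(np.subtract(p, q)) / corr_len)

field_grid = [(i, j) for i in range(10) for j in range(10)]
candidate_sites = [(2, 2), (5, 5), (8, 1)]

def information_gain(s):
    # Total variance reduction over the field from one measurement at s.
    return sum(cov(x, s) ** 2 / cov(s, s) for x in field_grid)

ranking = sorted(candidate_sites, key=information_gain, reverse=True)
print(ranking)   # best next sampling site first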

Journal ArticleDOI
TL;DR: The selection at list time (SALT) scheme as discussed by the authors controls sampling of concentration for estimating total suspended sediment yield and the variance of the estimate while automatically emphasizing sampling at higher flows.
Abstract: The “Selection At List Time” (SALT) scheme controls sampling of concentration for estimating total suspended sediment yield. The probability of taking a sample is proportional to its estimated contribution to total suspended sediment discharge. This procedure gives unbiased estimates of total suspended sediment yield and the variance of the estimate while automatically emphasizing sampling at higher flows. When applied to real data with known yield, the SALT method underestimated total suspended sediment yield by less than 1%, whereas estimates by the flow duration sediment rating curve method averaged about 51% underestimation. Implementing the SALT scheme requires obtaining samples with a pumping sampler, stage sensing device, and small battery-powered computer.
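The underlying estimator is a variable-probability one. As a rough stand-in for the SALT machinery (plain Poisson/Horvitz-Thompson sampling rather than the paper's list-based scheme; all numbers invented), the sketch below samples each flow period with probability proportional to its predicted sediment contribution and weights measured values by the inverse of that probability, which keeps the total unbiased:

import random

predicted = [0.2, 0.5, 3.0, 12.0, 1.1, 7.5]   # rating-curve predictions per period
expected_samples = 3

total_pred = sum(predicted)
p = [min(1.0, expected_samples * z / total_pred) for z in predicted]

def estimate_total(true_values):
    estimate = 0.0
    for y, p_i in zip(true_values, p):
        if random.random() < p_i:   # take a physical sample this period?
            estimate += y / p_i     # inverse-probability weighting
    return estimate

true_yield = [0.25, 0.4, 2.7, 13.0, 1.0, 8.2]
print(estimate_total(true_yield), "vs true total", sum(true_yield))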

Journal ArticleDOI
TL;DR: The electrofishing device and sampling procedure developed to sample lotic microhabitats was very effective in immobilizing fish within the electrode frame and allows multispecies abundances to be quantified with regard to discrete units of microhabitat and a priori sampling designs to be used.
Abstract: An electrofishing device and sampling procedure was developed to sample lotic microhabitats. The device consists of a rectangular electrode frame powered by an alternating current generator. The device was very effective in immobilizing fish within the electrode frame. A time delay between setting and sampling electrode frames permitted a period without disturbance prior to sampling. No significant correlations were found between capture rates and time delays greater than 11 minutes. Consistent results were obtained in repeated sampling of a variety of microhabitats within a single stream reach. The device and sampling procedure allows multispecies abundances to be quantified with regard to discrete units of microhabitat and a priori sampling designs to be used.

Journal ArticleDOI
14 Feb 1985-Nature
TL;DR: Many attempts have been made to trace increases in global pollution from the record of impurities preserved in polar snow and ice layers as mentioned in this paper, although inadequate sampling and analysis techniques have hampered some of the studies, others have improved uniquely our understanding of the history of changes in airborne pollution.
Abstract: Many attempts have been made to trace increases in global pollution from the record of impurities preserved in polar snow and ice layers. Although inadequate sampling and analysis techniques have hampered some of the studies, others have improved uniquely our understanding of the history of changes in airborne pollution.

Proceedings ArticleDOI
01 Dec 1985
TL;DR: The present work characterizes the discrete time system which exactly reproduces the state evolutions of a given vector-input linear analytic continuous time system driven by inputs which are constant on time intervals of fixed length.
Abstract: In the present work we characterize the discrete time system which exactly reproduces the state evolutions of a given vector-input linear analytic continuous time system driven by inputs which are constant on time intervals of fixed length. This is achieved by comparing the Volterra series associated respectively with the sampled and continuous input-state functionals. Moreover, we give a compact Lie formula for the solution of a parametrized nonlinear differential equation, which makes it possible to characterize the nonlinear difference equation solving the problem in terms of formal Lie series. On this basis, it becomes very natural to introduce a notion of approximate sampling that is efficient in practical computation.
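A toy illustration of sampling via a formal Lie series, for the scalar, input-free system x' = x^2 (our own example, far simpler than the paper's vector-input setting): here L_f^k(x) = k! * x^(k+1), so the exponential Lie series sum_k (d^k / k!) L_f^k(x) collapses to sum_k (d*x)^k * x, which converges to the exact sampled flow x / (1 - d*x).

def lie_series_step(x, d, terms=12):
    # Truncated Lie series for one sampling period of length d.
    return sum((d * x) ** k * x for k in range(terms))

x0, d = 0.5, 0.1
exact = x0 / (1 - d * x0)              # exact state after one sampling period
print(exact, lie_series_step(x0, d))   # agree to truncation error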

Journal ArticleDOI
01 Jun 1985-Botany
TL;DR: Decomposition of Scots pine needle litter was studied in a Scots pine forest in central Sweden using a 6-year series with annual incubations of needle litter to analyse the climatic influence on the process.
Abstract: Decomposition of Scots pine needle litter was studied in a Scots pine forest in central Sweden. A 6-year series with annual incubations of needle litter was used to analyse the climatic influence on the process. The original litter was of similar chemical properties between years and each year new litter was incubated, in the same way, in the autumn. Sampling took place at time intervals ranging from 1 month to 1 year. Soil climate variables such as temperature and water contents and tensions were calculated with a soil water and heat model from standard meteorological data. Decomposition rates from periods longer than 145 days were correlated with different soil climatic factors. The responses for the 1st and 2nd incubation years were not significantly different, but higher coefficients of determination (r2) were found for the 2nd year. Estimated actual evapotranspiration or soil temperature explained temporal variation of decomposition to about 70%; soil water content only or soil water tension only exp...

Journal ArticleDOI
TL;DR: In this article, the performances of four light-weight, open sampling devices intended for use in soft sediments, the Axelsson-Hakanson gravity corer, the Kajak gravity corer, the Jenkin bottom sampler and the Ekman grab (box corer), were examined in situ by direct observation, measurement and photographic documentation by a SCUBA diver.
Abstract: The performances of four light-weight, open sampling devices intended for use in soft sediments, the Axelsson-Hakanson gravity corer, the Kajak gravity corer, the Jenkin bottom sampler and the Ekman grab (box corer), were examined in situ by direct observation, measurement and photographic documentation by a SCUBA diver. Restrictions on the reliability of the sediment samples obtained with these devices and sediment coring instruments in general are evaluated. Separate studies of core shortening show: (1) a positive linear relationship between sediment penetration depth at which shortening of cores starts and coring tube inner diameter, (2) a tube size related shift of curve pattern in the regressions of the core shortening versus sediment depth, and (3) a negative non-linear relationship of shortening intensity versus increasing coring tube inner diameter. These findings show the great risk, when sampling soft sediments, of obtaining a sample quantitatively unrepresentative of the in situ stratification. An accurate correction factor for the degree of core shortening requires a knowledge of: (1) the sediment depth at which core shortening commences, and (2) the curve describing the relationship of shortening to depth of penetration.



Journal ArticleDOI
TL;DR: The results indicate that significant differences may arise from the application of the various techniques of grain selection and it is important that appropriate procedures be rigorously applied.
Abstract: ‘Textural analysis’ is being increasingly used for the characterisation of pottery fabrics. In this study we have investigated some of the effects on the observed size distribution data of applying a number of different grain selection procedures (principally various point-counting and ribbon-counting procedures). The results indicate that significant differences may arise from the application of the various techniques of grain selection and it is important that appropriate procedures be rigorously applied.

Journal ArticleDOI
TL;DR: Sampling methods and taxon analysis in vegetation science: relevé surveys ("Vegetationsaufnahmen"), floristic analysis of plant communities, and more.

Journal Article
TL;DR: Mosquito responses to terrain features and various meteorological factors are briefly summarized with the object of improving understanding of the samples provided by several classes of sampling techniques.
Abstract: Day-to-day changes in adult mosquito populations are difficult to measure due to the interactions between specific mosquito behavior, environmental influences upon behavior, and the mode of operation of the sampling technique. Mosquito responses to terrain features and various meteorological factors are briefly summarized with the object of improving our understanding of the samples provided by several classes of sampling techniques. The two major environmental influences upon the composition of a sample are the terrain features and several meteorological factors. As each sampling site is unique, a sample provides little direct information about the numbers of mosquitoes within the much broader area it is supposed to represent, but it can reflect population changes at the site. However, the population changes usually are masked by meteorological effects upon flight activity. Data from Florida field studies were utilized to adjust trap catches to compensate for meteorological conditions during the catch period to provide more standard samples.

Journal ArticleDOI
TL;DR: It is suggested that none of the commonly used transformations will stabilize the variance at all the densities encountered, and information about sampler size improves the prediction of the logarithm of sampling variance, which increases with an increase in mean density and decreases with an increase in the size of the sampler.
Abstract: Several sets of published data were reanalyzed to examine the effect of mean density size and type of sampler, and functional group of the collected organisms on the variance in estimates of stream...
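A sketch of the kind of regression the TL;DR describes (all data invented): predict log sampling variance from log mean density and log sampler size, expecting a positive coefficient on density and a negative one on sampler size.

import numpy as np

log_mean = np.log([5, 12, 30, 80, 150, 400])            # organisms per m^2
log_size = np.log([0.05, 0.05, 0.1, 0.1, 0.25, 0.25])   # sampler area, m^2
log_var  = np.log([20, 90, 400, 2500, 4000, 30000])     # sampling variance

X = np.column_stack([np.ones_like(log_mean), log_mean, log_size])
coef, *_ = np.linalg.lstsq(X, log_var, rcond=None)
# For these numbers the density coefficient comes out positive and the
# sampler-size coefficient negative, matching the qualitative finding.
print(coef)   # [intercept, density slope, sampler-size slope]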

Journal Article
TL;DR: Spatial patterns of Meloidogyne incognita, Tylenchorhynchus claytoni, Helicotylenchus dihystera, and Criconemella ornata were analyzed using Hill's two-term local quadrat variance method, spectral analysis, and spatial correlation, indicating that clusters were ellipsoidal with long axes oriented along plant rows.
Abstract: Spatial patterns of Meloidogyne incognita, Tylenchorhynchus claytoni, Helicotylenchus dihystera, and Criconemella ornata were analyzed using Hill's two-term local quadrat variance method (TTLQV), spectral analysis, and spatial correlation. Data were collected according to a systematic grid sampling plan from seven tobacco fields in North Carolina. Different estimates of nematode cluster size were obtained through TTLQV and spectral analysis. No relationship was observed between either estimate and nematode species, time of sampling (spring vs. fall), or mean density. Cluster size estimates obtained from spectral analysis depended on sampling block size. For each species examined, spatial correlations among nematode population densities were greater within plant rows than across rows, indicating that clusters were ellipsoidal with long axes oriented along plant rows. Analysis of mean square errors indicated that significant gains in sampling efficiency resulted from orienting the long axis of sampling blocks across plant rows. Spatial correlation was greater in the fall than in spring and was greater among 1 × 1-m quadrats than among 3 × 3-m quadrats. Key words: nematode distribution, spatial pattern, sampling design.