
Showing papers on "Sampling (statistics) published in 1997"


Journal ArticleDOI
TL;DR: A new variant of chain-referral sampling, respondent-driven sampling, is introduced that employs a dual system of structured incentives to overcome some of the deficiencies of such samples and discusses how respondent-driven sampling can improve both network sampling and ethnographic investigation.
Abstract: A population is “hidden” when no sampling frame exists and public acknowledgment of membership in the population is potentially threatening. Accessing such populations is difficult because standard probability sampling methods produce low response rates and responses that lack candor. Existing procedures for sampling these populations, including snowball and other chain-referral samples, the key-informant approach, and targeted sampling, introduce well-documented biases into their samples. This paper introduces a new variant of chain-referral sampling, respondent-driven sampling, that employs a dual system of structured incentives to overcome some of the deficiencies of such samples. A theoretic analysis, drawing on both Markov-chain theory and the theory of biased networks, shows that this procedure can reduce the biases generally associated with chain-referral methods. The analysis includes a proof showing that even though sampling begins with an arbitrarily chosen set of initial subjects, as do most chain-referral samples, the composition of the ultimate sample is wholly independent of those initial subjects. The analysis also includes a theoretic specification of the conditions under which the procedure yields unbiased samples. Empirical results, based on surveys of 277 active drug injectors in Connecticut, support these conclusions. Finally, the conclusion discusses how respondent-driven sampling can improve both network sampling and ethnographic investigation.
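
The seed-independence claim follows from standard Markov chain convergence: if recruitment is modeled as a chain over respondent groups, the composition of later recruitment waves approaches the chain's stationary distribution no matter how the seeds were chosen. A minimal Python sketch of that argument follows; the two-group transition matrix and seed mixes are hypothetical illustrations, not data or code from the paper.

```python
import numpy as np

# Hypothetical referral chain over two groups (A, B):
# row i gives the probability that a recruiter in group i recruits
# someone from each group.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

def wave_composition(seed_mix, n_waves):
    """Group composition after n_waves of referral, starting from seed_mix."""
    mix = np.asarray(seed_mix, dtype=float)
    for _ in range(n_waves):
        mix = mix @ P
    return mix

# Two very different (arbitrary) seed compositions converge to the same
# stationary mix, which is why the ultimate sample composition does not
# depend on the initial subjects.
print(wave_composition([1.0, 0.0], 20))   # all seeds from group A
print(wave_composition([0.0, 1.0], 20))   # all seeds from group B
```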

3,950 citations


Book
22 May 1997
TL;DR: This book presents the principles of Estimation for Finite Populations and Important Sampling Designs and a Broader View of Errors in Surveys: Nonsampling Errors and Extensions of Probability Sampling Theory.
Abstract: PART I: Principles of Estimation for Finite Populations and Important Sampling Designs: Survey Sampling in Theory and Practice. Basic Ideas in Estimation from Probability Samples. Unbiased Estimation for Element Sampling Designs. Unbiased Estimation for Cluster Sampling and Sampling in Two or More Stages. Introduction to More Complex Estimation Problems.- PART II: Estimation through Linear Modeling, Using Auxiliary Variables: The Regression Estimator. Regression Estimators for Element Sampling Designs. Regression Estimators for Cluster Sampling and Two-Stage Sampling.- PART III: Further Questions in Design and Analysis of Surveys: Two-Phase Sampling. Estimation for Domains. Variance Estimation. Searching for Optimal Sampling Designs. Further Statistical Techniques for Survey Data.- PART IV: A Broader View of Errors in Surveys: Nonsampling Errors and Extensions of Probability Sampling Theory. Nonresponse. Measurement Errors. Quality Declarations for Survey Data.- Appendix A - D.- References.

3,197 citations


Journal ArticleDOI
TL;DR: In this article, the authors compare several methods of estimating Bayes factors when it is possible to simulate observations from the posterior distributions, via Markov chain Monte Carlo or other techniques, provided that each posterior distribution is well behaved in the sense of having a single dominant mode.
Abstract: The Bayes factor is a ratio of two posterior normalizing constants, which may be difficult to compute. We compare several methods of estimating Bayes factors when it is possible to simulate observations from the posterior distributions, via Markov chain Monte Carlo or other techniques. The methods that we study are all easily applied without consideration of special features of the problem, provided that each posterior distribution is well behaved in the sense of having a single dominant mode. We consider a simulated version of Laplace's method, a simulated version of Bartlett correction, importance sampling, and a reciprocal importance sampling technique. We also introduce local volume corrections for each of these. In addition, we apply the bridge sampling method of Meng and Wong. We find that a simulated version of Laplace's method, with local volume correction, furnishes an accurate approximation that is especially useful when likelihood function evaluations are costly. A simple bridge sampli...
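
As a concrete, hedged illustration of estimating a normalizing constant by simulation, the sketch below applies plain importance sampling to the marginal likelihood of a toy conjugate normal model, where the exact value is known for comparison; the model, proposal, and sample size are arbitrary choices, and the paper's preferred estimators (simulated Laplace with volume correction, bridge sampling) are more refined than this. A Bayes factor would then be the ratio of two such marginal-likelihood estimates.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy conjugate model: theta ~ N(0, tau^2), y | theta ~ N(theta, sigma^2).
# The marginal likelihood m(y) = N(y; 0, sigma^2 + tau^2) is known exactly.
sigma, tau, y = 1.0, 2.0, 1.5

# Proposal g: a normal centered near the posterior, slightly overdispersed.
post_var = 1.0 / (1.0 / tau**2 + 1.0 / sigma**2)
post_mean = post_var * y / sigma**2
g_mean, g_sd = post_mean, 1.5 * np.sqrt(post_var)

theta = rng.normal(g_mean, g_sd, size=100_000)
weights = (norm.pdf(y, loc=theta, scale=sigma)          # likelihood
           * norm.pdf(theta, loc=0.0, scale=tau)        # prior
           / norm.pdf(theta, loc=g_mean, scale=g_sd))   # proposal density

print("IS estimate :", weights.mean())
print("exact value :", norm.pdf(y, loc=0.0, scale=np.sqrt(sigma**2 + tau**2)))
```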

2,191 citations


Journal ArticleDOI
TL;DR: This paper critically analyses purposeful and theoretical sampling and offers clarification on the use of theoretical sampling in nursing research, to enhance understanding of the differences between purposeful and theoretical sampling.
Abstract: Sampling is a very complex issue in qualitative research as there are many variations of qualitative sampling described in the literature and much confusion and overlapping of types of sampling, particularly in the case of purposeful and theoretical sampling. The terms purposeful and theoretical are viewed synonymously and used interchangeably in the literature. Many of the most frequent misinterpretations relate to the disparate meanings and usage of the terminology. It is important that the terminology is examined so that underlying assumptions are made more explicit. Lack of shared meanings and terminology in the nursing discourse creates confusion for the neophyte researcher and increases the production of studies with weak methodologies. This paper critically analyses purposeful and theoretical sampling and offers clarification on the use of theoretical sampling for nursing research. The aim is not to make prescriptive statements on sampling; rather, to enhance understanding of the differences between purposeful and theoretical sampling for nursing research.

2,130 citations


Journal ArticleDOI
TL;DR: A series of automated tests is developed for tower and aircraft time series to identify instrumentation problems, flux sampling problems, and physically plausible but unusual situations and serves as a safety net for quality controlling data.
Abstract: A series of automated tests is developed for tower and aircraft time series to identify instrumentation problems, flux sampling problems, and physically plausible but unusual situations. The automated procedures serve as a safety net for quality controlling data. A number of special flags are developed representing a variety of potential problems such as inconsistencies between different tower levels and the flux error due to fluctuations of aircraft height. The tests are implemented by specifying critical values for parameters representing each specific error. The critical values are developed empirically from experience of applying the tests to real turbulent time series. When these values are exceeded, the record is flagged for further inspection and comparison with the rest of the concurrent data. The inspection step is necessary to either verify an instrumentation problem or identify physically plausible behavior. The set of tests is applied to tower data from the Riso Air Sea Experiment and...
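
The flagging logic described above, compare a per-record statistic against an empirically chosen critical value and set flagged records aside for inspection, can be sketched in a few lines; the spike statistic, window, and threshold below are placeholder assumptions, not the paper's actual tests.

```python
import numpy as np

def flag_spikes(series, window=100, critical=4.5):
    """Flag records whose deviation from a local mean exceeds `critical`
    local standard deviations. Both parameters are illustrative and would
    be tuned empirically, as the paper does for its real tests."""
    x = np.asarray(series, dtype=float)
    flags = np.zeros(x.size, dtype=bool)
    for start in range(0, x.size, window):
        block = x[start:start + window]
        mu, sd = block.mean(), block.std()
        if sd > 0:
            flags[start:start + window] = np.abs(block - mu) > critical * sd
    return flags  # flagged records are inspected, not automatically discarded

# Example: a synthetic turbulence-like series with one inserted spike.
rng = np.random.default_rng(1)
w = rng.normal(0.0, 0.5, size=1000)
w[400] += 10.0
print(np.flatnonzero(flag_spikes(w)))   # expected: the spike at index 400 is flagged
```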

1,354 citations


Book
27 Jan 1997
TL;DR: Quantitative Data Analysis with SPSS for Windows explains statistical tests using the latest version of SPSS, the most widely used computer package for analyzing quantitative data, using the same formula-free, non-technical approach.
Abstract: From the Publisher: Quantitative Data Analysis with SPSS for Windows explains statistical tests using the latest version of SPSS, the most widely used computer package for analyzing quantitative data. Using the same formula-free, non-technical approach as the highly successful non-Windows version, it assumes no previous familiarity with either statistics or computing, and takes the reader step-by-step through each of the techniques for which SPSS for Windows can be used. The book also contains exercises with answers, and covers issues such as sampling, statistical significance, and the selection of appropriate tests.

1,056 citations


Journal ArticleDOI
TL;DR: The sampling methodology used by Faugier (1996) in her study of prostitutes, HIV and drugs is used as a current example within this context.
Abstract: Studies on 'hidden populations', such as homeless people, prostitutes and drug addicts, raise a number of specific methodological questions usually absent from research involving known populations and less sensitive subjects. This paper examines the advantages and limitations of nonrandom methods of data collection such as snowball sampling. It reviews the currently available literature on sampling hard to reach populations and highlights the dearth of material currently available on this subject. The paper also assesses the potential for using these methods in nursing research. The sampling methodology used by Faugier (1996) in her study of prostitutes, HIV and drugs is used as a current example within this context.

1,006 citations


Book
01 Jan 1997
TL;DR: In this article, the authors present confidence limits and intervals for density, mean and variance, and diversity indices for fieldwork data, in terms of the number of field samples and species diversity.
Abstract: Introduction. Density: Mean and Variance. Normal and Sampling Distributions for Fieldwork. Confidence Limits and Intervals for Density. How Many Field Samples? Spatial Distribution: The Power Curve. Field Sampling Schemes. Species Proportions: Relative Abundances. Species Distributions. Regression: Occurrences and Density. Species Occurrences. Species Diversity: The Number of Species. Diversity Indices. SHE Analysis.

595 citations


Journal ArticleDOI
TL;DR: A new method, the farthest point strategy (FPS), for progressive image acquisition (an acquisition process that enables an approximation of the whole image at each sampling stage) is presented; it retains uniformity as the sampling density increases, providing an efficient means for sparse image sampling and display.
Abstract: A new method, the farthest point strategy (FPS), for progressive image acquisition (an acquisition process that enables an approximation of the whole image at each sampling stage) is presented. Its main advantage is in retaining its uniformity with the increased density, providing efficient means for sparse image sampling and display. In contrast to previously presented stochastic approaches, the FPS guarantees the uniformity in a deterministic min-max sense. Within this uniformity criterion, the sampling points are irregularly spaced, exhibiting anti-aliasing properties comparable to those characteristic of the best available method (Poisson disk). A straightforward modification of the FPS yields an image-dependent adaptive sampling scheme. An efficient O(N log N) algorithm for both versions is introduced, and several applications of the FPS are discussed.
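
A naive version of the farthest point strategy is easy to write down: repeatedly add the candidate location whose distance to the already-chosen samples is largest. The sketch below implements that O(kN) version over a grid of candidate pixel positions; it only illustrates the min-max uniformity idea and does not reproduce the paper's O(N log N) algorithm or its image-dependent adaptive variant.

```python
import numpy as np

def farthest_point_samples(width, height, k, seed=0):
    """Pick k sample positions on a width x height grid so that each new
    point is as far as possible from all previously chosen points."""
    rng = np.random.default_rng(seed)
    ys, xs = np.mgrid[0:height, 0:width]
    candidates = np.column_stack([xs.ravel(), ys.ravel()]).astype(float)

    first = rng.integers(candidates.shape[0])
    chosen = [candidates[first]]
    # Distance from every candidate to its nearest chosen sample.
    dist = np.linalg.norm(candidates - chosen[0], axis=1)

    for _ in range(k - 1):
        nxt = int(np.argmax(dist))   # min-max step: take the farthest uncovered point
        chosen.append(candidates[nxt])
        dist = np.minimum(dist, np.linalg.norm(candidates - candidates[nxt], axis=1))
    return np.array(chosen)

# 64 progressively refinable sample positions for a 128 x 128 image.
print(farthest_point_samples(128, 128, 64)[:5])
```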

407 citations


Journal ArticleDOI
01 Oct 1997-Geoderma
TL;DR: In this article, the authors compare the performance of two sampling strategies, stratified simple random sampling (SRS) and systematic sampling combined with the block kriging predictor (SY, t_OK), in a simulation study on the basis of design-based quality criteria; these criteria can be assembled in a decision tree that can be helpful in choosing between the two approaches.

388 citations


Journal ArticleDOI
TL;DR: In this paper, an improved sequential response surface method is proposed. Using the gradient projection method, the sampling points for the response surface approximation are selected close to the original failure surface, and a method that controls the selection range of the sampling points according to the nonlinearity of the limit states is proposed to reduce the error produced by approximating the original nonlinear limit state with a linear response surface.

Patent
17 Jun 1997
TL;DR: In this article, a percutaneous agent sampling device and sampling method are provided; the device comprises a collector and a sheet having a plurality of microblades that pierce the skin to increase transdermal flux of an agent.
Abstract: A percutaneous agent sampling device and method are provided. The device comprises a collector and a sheet having a plurality of microblades for piercing the skin for increasing transdermal flux of an agent.

Book
01 Apr 1997
TL;DR: Part 1, Chemometric methods: measurements and basic statistics; experimental design and analysis of variances; sampling and sampling design; multivariate data analysis; graphical methods; data preprocessing; cluster analysis; factorial methods; multivariate classification; multivariate modelling; time series analysis.
Abstract: Part 1, Chemometric methods: measurements and basic statistics; experimental design and analysis of variances; sampling and sampling design; multivariate data analysis; graphical methods; data preprocessing; cluster analysis; factorial methods; multivariate classification; multivariate modelling; time series analysis. Part 2, Case studies: atmosphere; hydrosphere; pedosphere; related topics.

Journal ArticleDOI
01 Dec 1997
TL;DR: This article introduces a general planning scheme that consists of randomly sampling the robot's configuration space, and describes two previously developed planners as instances of planners based on this scheme, but applying very different sampling strategies.
Abstract: Several randomized path planners have been proposed during the last few years. Their attractiveness stems from their applicability to virtually any type of robots, and their empirically observed success. In this paper we attempt to present a unifying view of these planners and to theoretically explain their success. First, we introduce a general planning scheme that consists of randomly sampling the robot’s configuration space. We then describe two previously developed planners as instances of planners based on this scheme, but applying very different sampling strategies. These planners are probabilistically complete: if a path exists, they will find one with high probability, if we let them run long enough. Next, for one of the planners, we analyze the relation between the probability of failure and the running time. Under assumptions characterizing the “goodness” of the robot’s free space, we show that the running time only grows as the absolute value of the logarithm of the probability of failure that we are willing to tolerate. We also show that it increases at a reasonable rate as the space goodness degrades. In the last section we suggest directions for future research.
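
The basic scheme, drawing configurations uniformly at random and keeping the collision-free ones as milestones, can be sketched for a point robot among circular obstacles; the workspace, obstacles, and milestone count below are made-up illustrations, and the connection and query stages of a full planner are omitted.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 2D configuration space [0, 1]^2 with circular obstacles
# given as (center_x, center_y, radius).
OBSTACLES = [(0.3, 0.3, 0.15), (0.7, 0.6, 0.2)]

def collision_free(q):
    """True if configuration q avoids every obstacle."""
    return all(np.hypot(q[0] - cx, q[1] - cy) > r for cx, cy, r in OBSTACLES)

def sample_milestones(n):
    """Uniformly sample the configuration space and keep free configurations.
    With enough samples, a 'good' free space is covered with high
    probability, which is the intuition behind probabilistic completeness."""
    milestones = []
    while len(milestones) < n:
        q = rng.uniform(0.0, 1.0, size=2)
        if collision_free(q):
            milestones.append(q)
    return np.array(milestones)

print(sample_milestones(100).shape)   # (100, 2)
```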

Journal ArticleDOI
TL;DR: A new adaptive umbrella sampling technique for molecular dynamics simulations is described; its efficiency is achieved by using the weighted histogram analysis method to combine the results from different simulations, by a suitable extrapolation scheme to define the umbrella potential for regions that have not been sampled, and by a criterion to identify simulations during which the system was not in equilibrium.
Abstract: A new adaptive umbrella sampling technique for molecular dynamics simulations is described. The high efficiency of the technique renders multidimensional adaptive umbrella sampling possible and thereby enables uniform sampling of the conformational space spanned by several degrees of freedom. The efficiency is achieved by using the weighted histogram analysis method to combine the results from different simulations, by a suitable extrapolation scheme to define the umbrella potential for regions that have not been sampled, and by a criterion to identify simulations during which the system was not in equilibrium. The technique is applied to two test systems, the alanine dipeptide and the threonine dipeptide, to sample the configurational space spanned by one or two dihedral angles. The umbrella potentials applied at the end of each adaptive umbrella sampling run are equal to the negative of the corresponding potentials of mean force. The trajectories obtained in the simulations can be used to calculate dynamical variables that are of interest. An example is the distribution of the distance between the HN and the Hβ proton that can be important for the interpretation of NMR experiments. Factors influencing the accuracy of the calculated quantities are discussed. © 1997 John Wiley & Sons, Inc. J Comput Chem 18: 1450-1462, 1997
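
The reweighting relations behind umbrella sampling make the adaptive choice of the bias transparent; the following is a brief restatement of standard umbrella-sampling identities, not equations quoted from the paper.

```latex
% Adding an umbrella potential U(x) to the physical energy gives a biased
% density, from which the unbiased density and the potential of mean
% force W(x) are recovered by reweighting:
\[
  \rho_{\mathrm{biased}}(x) \propto \rho(x)\, e^{-U(x)/k_B T},
  \qquad
  \rho(x) \propto \rho_{\mathrm{biased}}(x)\, e^{+U(x)/k_B T},
  \qquad
  W(x) = -k_B T \ln \rho(x).
\]
% Setting the next umbrella potential to U_{k+1}(x) = -W_k(x), the negative
% of the current PMF estimate, makes the biased density approximately
% uniform, which is the sense in which the final umbrella potentials equal
% the negative of the potentials of mean force.
```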

Journal ArticleDOI
TL;DR: A comprehensive history and survey of IS methods is presented, and a guide to the strengths and weaknesses of the techniques is offered, to indicate which techniques are suitable for various types of communications systems.
Abstract: Importance sampling (IS) is a simulation technique which aims to reduce the variance (or other cost function) of a given simulation estimator. In communication systems, this usually, but not always, means attempting to reduce the variance of the bit error rate (BER) estimator. By reducing the variance, IS estimators can achieve a given precision from shorter simulation runs; hence the term "quick simulation." The idea behind IS is that certain values of the input random variables in a simulation have more impact on the parameter being estimated than others. If these "important" values are emphasized by sampling more frequently, then the estimator variance can be reduced. Hence, the basic methodology in IS is to choose a distribution which encourages the important values. This use of a "biased" distribution will, of course, result in a biased estimator if applied directly in the simulation. However, there is a simple procedure whereby the simulation outputs are weighted to correct for the use of the biased distribution, and this ensures that the new IS estimator is unbiased. Hence, the "art" of designing quick simulations via IS is entirely dependent on the choice of biased distribution. Over the last 50 years, IS techniques have flourished, but it is only in the last decade that coherent design methods have emerged. The outcome of these developments is that at the expense of increasing technical content, modern techniques can offer substantial run-time saving for a very broad range of problems. We present a comprehensive history and survey of IS methods. In addition, we offer a guide to the strengths and weaknesses of the techniques, and hence indicate which techniques are suitable for various types of communications systems. We stress that simple approaches can still yield useful savings, and so the simulation practitioner as well as the technical researcher should consider IS as a possible simulation tool.
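
As a minimal illustration of the weighting step that keeps an IS estimator unbiased, the sketch below estimates a Gaussian tail probability (a stand-in for a bit error rate) by drawing from a mean-shifted "biased" noise distribution and weighting each sample by the density ratio; the threshold, shift, and sample size are arbitrary illustrative choices.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)

t = 5.0                      # decision threshold; estimate P(X > t) for X ~ N(0, 1)
n = 20_000
exact = norm.sf(t)           # about 2.9e-7, hopeless for naive Monte Carlo at this n

# Biased distribution: shift the noise mean to the threshold so that the
# "important" samples (errors) occur roughly half the time.
x = rng.normal(loc=t, scale=1.0, size=n)
weights = norm.pdf(x, loc=0.0, scale=1.0) / norm.pdf(x, loc=t, scale=1.0)
is_estimate = np.mean((x > t) * weights)   # weighting restores unbiasedness

print("IS estimate:", is_estimate)
print("exact      :", exact)
```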

Journal ArticleDOI
TL;DR: The Hammersley and Halton point sets, two well-known, low discrepancy sequences, have been used for quasi-Monte Carlo integration and the sampling scheme is also applied to ray tracing, with a significant improvement in error over standard sampling techniques.
Abstract: The Hammersley and Halton point sets, two well-known, low discrepancy sequences, have been used for quasi-Monte Carlo integration in previous research. A deterministic formula generates a uniformly distributed and stochastic-looking sampling pattern at low computational cost. The Halton point set is also useful for incremental sampling. In this paper, we discuss detailed implementation issues and our experience of choosing suitable bases for the point sets, not just on the two-dimensional plane but also on a spherical surface. The sampling scheme is also applied to ray tracing, with a significant improvement in error over standard sampling techniques.
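
For concreteness, a small generator of Halton points via the radical-inverse construction is sketched below, using the customary bases 2 and 3 for the plane; the paper's treatment of base selection and of sampling on a spherical surface goes beyond this.

```python
def radical_inverse(i, base):
    """Reflect the base-`base` digits of the integer i about the radix point."""
    inv, f = 0.0, 1.0 / base
    while i > 0:
        inv += f * (i % base)
        i //= base
        f /= base
    return inv

def halton_points(n, bases=(2, 3)):
    """First n points of the Halton sequence in [0, 1)^len(bases).
    The sequence is incremental: the first n points of a longer run are
    identical, which is what makes it useful for progressive sampling."""
    return [tuple(radical_inverse(i, b) for b in bases) for i in range(1, n + 1)]

print(halton_points(5))
# first point is (0.5, 0.333...), second (0.25, 0.666...), and so on
```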

Journal ArticleDOI
TL;DR: The sampler performance data obtained in this project were affected by large experimental errors, but are nevertheless a useful input to decisions on how to incorporate the CEN inhalable sampling convention into regulation, guidance and occupational hygiene practice.
Abstract: Following the adoption of new international sampling conventions for inhalable, thoracic and respirable aerosol fractions, a working group of Comite Europeen de Normalisation (CEN) drafted a standard for the performance of workplace aerosol sampling instruments. The present study was set up to verify the experimental, statistical and mathematical procedures recommended in the draft performance standard and to check that they could be applied to inhalable aerosol samplers. This was achieved by applying the tests to eight types of personal inhalable aerosol sampler commonly used for workplace monitoring throughout Europe. The study led to recommendations for revising the CEN draft standard, in order to simplify the tests and reduce their cost. However, some further work will be needed to develop simpler test facilities and methods. Several of the samplers tested were found to perform adequately with respect to the inhalable sampling convention, at least over a limited range of typical workplace conditions. In general the samplers were found to perform best in low external wind speeds, which are the test conditions thought to be closest to those normally found in indoor workplaces. The practical implementation of the CEN aerosol sampling conventions requires decisions on which sampling instruments to use, estimation of the likely impact that changing sampling methods could have on apparent exposures, and adjustment where necessary of exposure limit values. The sampler performance data obtained in this project were affected by large experimental errors, but are nevertheless a useful input to decisions on how to incorporate the CEN inhalable sampling convention into regulation, guidance and occupational hygiene practice.

Patent
Janet Tamada
TL;DR: In this paper, a method for sampling a substance from a subject is provided, which comprises placing one or more sampling chambers on a collection site on a tissue surface on the subject; conducting electric current through the tissue to extract the substance from the subject in a first direction; reversing the polarity to cause direct current to flow in a second direction opposite the first direction; and analyzing the sampling chamber or chambers for the concentration of a substance or a substance metabolite.
Abstract: A method for sampling of a substance from a subject is provided, which comprises placing one or more sampling chambers on a collection site on a tissue surface on the subject; conducting electric current through the tissue to extract a substance from the subject in a first direction in one or more sampling chambers that function alternately as both an anode and a cathode during the course of the method; reversing the polarity to cause direct current to flow in a second direction opposite the first direction; and analyzing the sampling chamber or chambers for the concentration of a substance or a substance metabolite. There is also provided a device for sampling of a substance from an organism, continuously or intermittently, using an alternating-polarity method based on the application of low intensity electric fields of alternating polarity across the skin (iontophoresis) to enhance the transport of a substance (such as glucose, lactic acid, pyruvic acid, and the like) from body tissues to a sampling chamber. The device comprises an electrical power supply; a transdermal system that contains one or more sampling chambers that function as both anode and cathode during the course of sampling; a means to alternate or reverse the polarity during the iontophoretic sampling; and means for analyzing for the concentration of a substance or a substance metabolite.

Journal ArticleDOI
TL;DR: A family of designs is developed to permit control of the spatial dispersion of the sample, variable spatial density, and nested subsampling, so that rigorous design-based inference and variance estimation are possible.
Abstract: Many environmental resources, such as mineral resources or vegetation cover, or environmental attributes, such as chemical concentration in a stream or benthic community structure, are most appropriately sampled as continuous populations distributed over space, but most applied sampling theory and methodology is concerned with finite, discrete populations. This paper reports sampling methodology that explicitly recognizes the continuous nature of ecological resources. A family of designs is developed to permit control of the spatial dispersion of the sample, variable spatial density, and nested subsampling. The designs have non-zero joint inclusion probability densities, so that rigorous design-based inference and variance estimation are possible. © 1997 John Wiley & Sons, Ltd.
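
The design-based inference mentioned here follows the continuous-population analogue of Horvitz-Thompson estimation; the formulation below is the standard one, not notation taken from the paper.

```latex
% Estimator of the total tau = \int_R y(s)\,ds over a region R, when site
% s_i is selected with inclusion probability density \pi(s_i) > 0:
\[
  \hat{\tau} \;=\; \sum_{i=1}^{n} \frac{y(s_i)}{\pi(s_i)}.
\]
% Design-based variance estimation additionally requires non-zero joint
% inclusion probability densities \pi(s_i, s_j), which is the property the
% designs in this paper are constructed to have.
```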

Journal ArticleDOI
TL;DR: This paper presents new fast algorithms for computing long range electrostatic interactions for new polarizable force fields and new methods for sampling low energy molecular conformations, which allow the rapid determination of thermodynamically dominant regions on the potential-energy surface.

Journal ArticleDOI
TL;DR: In this article, kriging and cokriging methods were applied to estimate the spatial distribution of soil properties from available large-scale survey data of Taiwan, and the results suggested that the existing sampling density could be decreased under the large scale sampling interval by nearly half.
Abstract: Analysis and interpretation of soil survey data are very important for effective management of agricultural fields. In this study, kriging and cokriging methods were applied to estimate the spatial distribution of soil properties from available large-scale survey data of Taiwan. The data were derived from soils in a 10-km² area divided into 250 m × 250 m node intervals. The soil properties examined included the extractable P, Ca, Mg, and Fe contents, the sum of exchangeable bases (SEB), %sand, %silt, and %clay. The sum of exchangeable bases and particle-size distribution were regarded as the primary and auxiliary variables, respectively, in the cokriging procedure. The ratio of nugget to total variation was about 57 to 80%, indicating that the spatial correlation of the tested soil properties at the large scale was moderately (cross-)dependent. The estimated spatial distributions of the soil properties by kriging, under decreasing sampling densities, all correlated significantly (P < 0.1%) with those obtained from original data. Furthermore, with the over-sampled particle-size distribution, the overall estimation of SEB quality by cokriging was superior to that by kriging. The results suggested that by kriging and cokriging, the existing sampling density could be decreased under the large-scale sampling interval by nearly half and that sufficient spatial information about the soil properties could still be retained. The information obtained could be used to improve the long-term sampling designs of soil surveys in Taiwan. It also may be useful for identifying the appropriate sampling densities for these scales of soil surveys.

Journal ArticleDOI
TL;DR: In this paper, various types of finite mixtures of confirmatory factor-analysis models are proposed for handling data heterogeneity, and three different sampling schemes for these mixture models are distinguished.
Abstract: In this paper, various types of finite mixtures of confirmatory factor-analysis models are proposed for handling data heterogeneity. Under the proposed mixture approach, observations are assumed to be drawn from mixtures of distinct confirmatory factor-analysis models. But each observation does not need to be identified to a particular model prior to model fitting. Several classes of mixture models are proposed. These models differ by their unique representations of data heterogeneity. Three different sampling schemes for these mixture models are distinguished. A mixed type of these three sampling schemes is considered throughout this article. The proposed mixture approach reduces to regular multiple-group confirmatory factor-analysis under a restrictive sampling scheme, in which the structural equation model for each observation is assumed to be known. By assuming a mixture of multivariate normals for the data, maximum likelihood estimation using the EM (Expectation-Maximization) algorithm and the AS (Approximate-Scoring) method are developed, respectively. Some mixture models were fitted to a real data set for illustrating the application of the theory. Although the EM algorithm and the AS method gave similar sets of parameter estimates, the AS method was found computationally more efficient than the EM algorithm. Some comments on applying the mixture approach to structural equation modeling are made.
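
The EM machinery used for these mixture models can be illustrated on a much simpler case; the sketch below fits a two-component univariate Gaussian mixture by EM as a stand-in for the far richer mixtures of confirmatory factor-analysis models estimated in the paper, with made-up data and a crude initialisation.

```python
import numpy as np
from scipy.stats import norm

def em_gaussian_mixture(x, n_iter=200):
    """EM for a two-component univariate normal mixture (illustrative only)."""
    x = np.asarray(x, dtype=float)
    pi = 0.5                                   # mixing proportion of component 0
    mu = np.array([x.min(), x.max()])          # crude initialisation from the data
    sd = np.array([x.std(), x.std()])
    for _ in range(n_iter):
        # E-step: posterior probability that each point belongs to component 0.
        d0 = pi * norm.pdf(x, mu[0], sd[0])
        d1 = (1 - pi) * norm.pdf(x, mu[1], sd[1])
        r = d0 / (d0 + d1)
        # M-step: update mixing proportion, means, and standard deviations.
        pi = r.mean()
        mu = np.array([np.average(x, weights=r), np.average(x, weights=1 - r)])
        sd = np.sqrt(np.array([np.average((x - mu[0])**2, weights=r),
                               np.average((x - mu[1])**2, weights=1 - r)]))
    return pi, mu, sd

rng = np.random.default_rng(3)
data = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 0.5, 700)])
print(em_gaussian_mixture(data))   # roughly (0.3, [-2, 3], [1, 0.5])
```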

01 Jan 1997
TL;DR: In this article, the authors provide a general theory about the Poisson-Binomial distribution concerning its computation and applications, and as by-products, they propose new weighted sampling schemes for finite population, a new method for hypothesis testing in logistic regression, and a new algorithm for finding the maximum conditional likelihood estimate (MCLE) in case-control studies.
Abstract: The distribution of Z1 +···+ZN is called Poisson-Binomial if the Zi are independent Bernoulli random variables with not-all-equal probabilities of success. It is noted that such a distribution and its computation play an important role in a number of seemingly unrelated research areas such as survey sampling, case-control studies, and survival analysis. In this article, we provide a general theory about the Poisson-Binomial distribution concerning its computation and applications, and as by-products, we propose new weighted sampling schemes for finite population, a new method for hypothesis testing in logistic regression, and a new algorithm for finding the maximum conditional likelihood estimate (MCLE) in case-control studies. Two of our weighted sampling schemes are direct generalizations of the "sequential" and "reservoir" methods of Fan, Muller and Rezucha (1962) for simple random sampling, which are of interest to computer scientists. Our new algorithm for finding the MCLE in case-control studies is an iterative weighted least squares method, which naturally bridges prospective and retrospective GLMs.
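
The computation referred to here amounts to a sequence of convolutions; a minimal sketch of the exact Poisson-Binomial probability mass function, using the standard dynamic-programming recursion rather than the paper's specific algorithms, is:

```python
def poisson_binomial_pmf(probs):
    """Exact PMF of Z1 + ... + ZN for independent Bernoulli(p_i) variables,
    built up by convolving in one Bernoulli at a time."""
    pmf = [1.0]                          # distribution of an empty sum
    for p in probs:
        new = [0.0] * (len(pmf) + 1)
        for k, mass in enumerate(pmf):
            new[k] += mass * (1.0 - p)   # this Z_i = 0
            new[k + 1] += mass * p       # this Z_i = 1
        pmf = new
    return pmf

# Unequal success probabilities, e.g. unequal inclusion probabilities in a
# weighted sampling scheme (illustrative numbers).
print(poisson_binomial_pmf([0.1, 0.5, 0.9]))   # [0.045, 0.455, 0.455, 0.045]
```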

Journal ArticleDOI
TL;DR: In this article, the X-bar chart with variable sampling intervals (VSI) and the X-bar chart with variable sample size (VSS) are compared to the traditional X-bar chart in detecting shifts in the process.
Abstract: Recent theoretical studies have shown that the X-bar chart with variable sampling intervals (VSI) and the X-bar chart with variable sample size (VSS) are quicker than the traditional X-bar chart in detecting shifts in the process. This article considers..
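
The VSI idea is simple to state: keep the usual X-bar control limits, but choose the time until the next sample according to where the current sample mean falls, short when it lies in a warning zone and long otherwise. The sketch below uses illustrative limits and interval lengths, not the optimised values studied in the article.

```python
def next_action(sample_mean, mu0, sigma, n,
                control=3.0, warning=1.0,
                short_interval=0.1, long_interval=1.0):
    """Variable-sampling-interval X-bar rule (illustrative constants).

    Returns either an out-of-control signal or the waiting time until the
    next sample of size n should be drawn."""
    z = (sample_mean - mu0) / (sigma / n ** 0.5)
    if abs(z) > control:
        return "signal"           # point outside the control limits
    if abs(z) > warning:
        return short_interval     # warning zone: sample again soon
    return long_interval          # central zone: relax the sampling rate

# Process with target mean 10, sigma 2, samples of size 4 (made-up numbers).
for xbar in (10.2, 11.6, 13.5):
    print(xbar, "->", next_action(xbar, mu0=10.0, sigma=2.0, n=4))
```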

Proceedings ArticleDOI
07 Apr 1997
TL;DR: In this article, the authors show that random sampling of transactions in the database is an effective method for finding association rules; sampling can speed up the mining process by more than an order of magnitude by reducing I/O costs and drastically shrinking the number of transactions to be considered.
Abstract: The discovery of association rules is a prototypical problem in data mining. The current algorithms proposed for data mining of association rules make repeated passes over the database to determine the commonly occurring item sets (or set of items). For large databases, the I/O overhead in scanning the database can be extremely high. The authors show that random sampling of transactions in the database is an effective method for finding association rules. Sampling can speed up the mining process by more than an order of magnitude by reducing I/O costs and drastically shrinking the number of transactions to be considered. They may also be able to make the sampled database resident in main-memory. Furthermore, they show that sampling can accurately represent the data patterns in the database with high confidence. They experimentally evaluate the effectiveness of sampling on different databases, and study the relationship between the performance, accuracy, and confidence of the chosen sample.
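
The core trick, mining a random sample of transactions with a slightly lowered support threshold and optionally verifying the result against the full database, can be sketched as follows; the data, thresholds, and restriction to item pairs are illustrative simplifications, not the paper's algorithm.

```python
import random
from collections import Counter
from itertools import combinations

def sample_frequent_itemsets(transactions, sample_frac=0.1,
                             min_support=0.02, slack=0.75, seed=0):
    """Estimate frequent itemsets (here: item pairs) from a random sample of
    transactions, using a lowered threshold to reduce false negatives."""
    random.seed(seed)
    n = max(1, int(len(transactions) * sample_frac))
    sample = random.sample(transactions, n)      # uniform sample, no repeated DB scans

    counts = Counter()
    for t in sample:
        for pair in combinations(sorted(set(t)), 2):
            counts[pair] += 1

    threshold = slack * min_support * n          # lowered support on the sample
    return {pair for pair, c in counts.items() if c >= threshold}

# Tiny synthetic transaction database (illustrative).
db = [["bread", "milk"], ["bread", "milk", "eggs"], ["milk", "eggs"],
      ["bread", "milk"], ["eggs"]] * 200
print(sample_frequent_itemsets(db))
```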

Patent
S. Gowda, Hyun Jong Shin, Hon-Sum Philip Wong, Peter Hong Xiao, Jungwook Yang
12 Jun 1997
TL;DR: In this article, a correlated double sampling (CDS) circuit is proposed, where a plurality of up/down counters are coupled to respective ones of the comparators, and each is operable to count in a first direction during the first sampling interval and in an opposite direction during a second sampling interval.
Abstract: Disclosed is a circuit for performing correlated double sampling entirely in the digital domain. In an exemplary embodiment, the circuit includes a plurality of comparators, each having a first input coupled to an associated data line for receiving first and second signals in first and second sampling intervals, respectively. A time varying reference signal is applied to the second input of each comparator. A plurality of up/down counters are coupled to respective ones of the comparators, and each is operable to count in a first direction during the first sampling interval and in an opposite direction during the second sampling interval. Each up/down counter is caused to stop counting when the amplitude of the variable reference signal substantially equals the amplitude of the respective first or second signal. As a result, each up/down counter provides an output representing a subtraction of one of said first or second signals from the other. The invention has particular utility when used in conjunction with a CMOS image sensor.

Journal ArticleDOI
TL;DR: The authors show that the pseudovalues used in the jackknife method are directly linked to the placement values, and because of the close link, the choice between the two methods can be based on users' preferences.

Posted Content
TL;DR: In this article, the authors study interactive situations in which players are boundedly rational and use the following choice procedure: each player associates one consequence with each of her actions by sampling (literally or virtually) each action once, and then chooses the action that has the best consequence.
Abstract: We study interactive situations in which players are boundedly rational. Each player, rather than optimizing given a belief about the other players' behavior, as in the theory of Nash equilibrium, uses the following choice procedure. She first associates one consequence with each of her actions by sampling (literally or virtually) each of her actions once. Then she chooses the action that has the best consequence. We define a notion of equilibrium for such situations and study its properties.

Book
01 Jan 1997
TL;DR: In this article, the authors present a compilation of research data on sampling of eucalypts, describing new methods and tools for rapid and cost-effective analysis of wood quality.
Abstract: This book was written to help the forest industry assess wood quality by using non-destructive samples taken from specific points within a tree. It is the first compilation of research data on sampling of eucalypts, describing new methods and tools for rapid and cost-effective analysis. The book provides information needed to design a sampling program, obtain and process wood samples, and shows how to relate the data to an average tree value.