
Showing papers on "Resampling published in 1986"


Journal ArticleDOI
TL;DR: In this paper, a class of weighted jackknife variance estimators for the least squares estimator, obtained by deleting any fixed number of observations at a time, is proposed, and three bootstrap methods are considered.
Abstract: Motivated by a representation for the least squares estimator, we propose a class of weighted jackknife variance estimators for the least squares estimator by deleting any fixed number of observations at a time. They are unbiased for homoscedastic errors and a special case, the delete-one jackknife, is almost unbiased for heteroscedastic errors. The method is extended to cover nonlinear parameters, regression $M$-estimators, nonlinear regression and generalized linear models. Interval estimators can be constructed from the jackknife histogram. Three bootstrap methods are considered. Two are shown to give biased variance estimators and one does not have the bias-robustness property enjoyed by the weighted delete-one jackknife. A general method for resampling residuals is proposed. It gives variance estimators that are bias-robust. Several bias-reducing estimators are proposed. Some simulation results are reported.

1,657 citations
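
As a point of reference for the jackknife machinery discussed above, here is a minimal sketch of the plain delete-one jackknife variance estimate for an OLS slope; it is not the paper's weighted variant, and the function names and simulated data are illustrative only.

```python
import numpy as np

def ols_slope(x, y):
    """Ordinary least squares slope of y on x (with an intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

def jackknife_variance(x, y):
    """Plain delete-one jackknife variance estimate for the OLS slope."""
    n = len(x)
    # recompute the estimator n times, each time leaving one observation out
    leave_one_out = np.array(
        [ols_slope(np.delete(x, i), np.delete(y, i)) for i in range(n)]
    )
    return (n - 1) / n * np.sum((leave_one_out - leave_one_out.mean()) ** 2)

rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 2.0 * x + rng.normal(size=50)
print(jackknife_variance(x, y))
```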


Journal ArticleDOI
Karl M. Fant
TL;DR: Because of the complete and continuous nature of the resampling algorithm, the resulting image is free of the classic sampling artifacts such as graininess, degradation, and edge aliasing.
Abstract: A two-pass spatial transform technique that does not exhibit the aliasing artifacts associated with techniques for spatial transform of discrete sampled images is possible through the use of a complete and continuous resampling interpolation algorithm. The algorithm is complete in the sense that all the pixels of the input image under the map of the output image fully contribute to the output image. It is continuous in the sense that no gaps or overlaps exist in the sampling of the input pixels and that the sampling can be performed with arbitrary precision. The technique is real time in the sense that it can be guaranteed to operate for any arbitrary transform within a given time limit. Because of the complete and continuous nature of the resampling algorithm, the resulting image is free of the classic sampling artifacts such as graininess, degradation, and edge aliasing.

123 citations
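
To illustrate the "complete and continuous" sampling idea in one dimension, the following is a hedged sketch of area-weighted resampling in which every input pixel contributes its full area to the output, with no gaps or overlaps. The paper's actual algorithm is a two-pass 2-D transform; this simplification only conveys the accumulation scheme, and the function name is illustrative.

```python
import numpy as np

def resample_1d(src, out_len):
    """Area-weighted 1-D resampling: each input pixel's full area is
    distributed over the output samples, with no gaps or overlaps."""
    in_len = len(src)
    scale = in_len / out_len                  # input pixels per output pixel
    out = np.zeros(out_len)
    for j in range(out_len):
        lo, hi = j * scale, (j + 1) * scale   # input interval covered by pixel j
        i = int(np.floor(lo))
        while lo < hi and i < in_len:
            seg = min(hi, i + 1) - lo         # overlap with input pixel i
            out[j] += src[i] * seg
            lo = i + 1
            i += 1
    return out / scale                        # normalize to an average

print(resample_1d(np.array([0.0, 0.0, 10.0, 10.0]), 6))
```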


Journal ArticleDOI
TL;DR: In this paper, a new test is proposed which can be applied as a one-sided t-test and which is uniformly most powerful invariant (UMPI) in a subspace of the parameter space.

27 citations


Book ChapterDOI
01 Jan 1986
TL;DR: The applications of bootstrap and jackknife for covariance structure analysis under violation of standard maximum likelihood assumptions (small sample size, analysis of correlation matrices, and nonnormality) are discussed in this article.
Abstract: Applications of bootstrap and jackknife for covariance structure analysis under violation of standard maximum likelihood assumptions (small sample size, analysis of correlation matrices, and nonnormality) are discussed. Procedures are illustrated for a factor analysis model, using LISREL. Facts and failures of the resampling methods are covered. For bootstrap and jackknife, a computer program with graphical facilities is introduced.

17 citations
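
As a toy illustration of the case-resampling bootstrap for a covariance-structure statistic (not the LISREL-based procedures this chapter covers), the sketch below estimates a bootstrap standard error for the largest eigenvalue of a sample correlation matrix; all names and data are illustrative.

```python
import numpy as np

def bootstrap_se(data, statistic, n_boot=500, seed=0):
    """Bootstrap standard error of a statistic of a data matrix,
    resampling rows (cases) with replacement."""
    rng = np.random.default_rng(seed)
    n = data.shape[0]
    reps = np.array(
        [statistic(data[rng.integers(0, n, n)]) for _ in range(n_boot)]
    )
    return reps.std(ddof=1)

def largest_eigenvalue(x):
    """Largest eigenvalue of the sample correlation matrix."""
    return np.linalg.eigvalsh(np.corrcoef(x, rowvar=False))[-1]

rng = np.random.default_rng(1)
cov = [[1.0, 0.5, 0.3], [0.5, 1.0, 0.4], [0.3, 0.4, 1.0]]
x = rng.multivariate_normal([0.0, 0.0, 0.0], cov, size=100)
print(bootstrap_se(x, largest_eigenvalue))
```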


Journal ArticleDOI
TL;DR: An error in the calculation of the apparent error rate affected the simulation results reported in this paper; it did not have much effect on the overall trends and conclusions, but it did affect the relative performance of the MC estimator.

Journal ArticleDOI
TL;DR: A cylindrical sector image grid with equal area pixels for representing tomographic images is described which offers computational advantages for some algebraic and stochastic reconstruction strategies.
Abstract: A cylindrical sector image grid with equal area pixels for representing tomographic images is described which offers computational advantages for some algebraic and stochastic reconstruction strategies. An evaluation of techniques for resampling from the cylindrical representation to the standard square pixel representation is included. The resampling techniques of nearest-neighbor, bilinear, cubic B-spline, two high-resolution cubic splines, and overlap weighting are evaluated by their noise propagation, resolution recovery, noise power spectra, and visual appearance.
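
A hedged sketch of one of the evaluated techniques, bilinear resampling from a polar (r, theta) grid to a square pixel grid, is given below; it uses a uniform polar grid as a simplified stand-in for the paper's equal-area cylindrical-sector grid, and the function name and parameters are illustrative.

```python
import numpy as np

def polar_to_cartesian(polar, n_out, r_max):
    """Bilinear resampling from an (r, theta) image grid to an
    n_out x n_out square pixel grid centered on the origin.

    polar[i, j] holds the value at radius (i + 0.5) * dr and angle
    j * dtheta on a uniform polar grid (a simplification of the
    paper's cylindrical-sector representation).
    """
    n_r, n_t = polar.shape
    dr, dt = r_max / n_r, 2 * np.pi / n_t
    ys, xs = np.mgrid[0:n_out, 0:n_out]
    # square pixel centers in physical coordinates
    x = (xs + 0.5) / n_out * 2 * r_max - r_max
    y = (ys + 0.5) / n_out * 2 * r_max - r_max
    r = np.hypot(x, y)
    th = np.mod(np.arctan2(y, x), 2 * np.pi)
    # fractional indices into the polar grid
    fi = np.clip(r / dr - 0.5, 0, n_r - 1)
    fj = th / dt
    i0 = np.floor(fi).astype(int)
    i1 = np.minimum(i0 + 1, n_r - 1)
    j0 = np.floor(fj).astype(int) % n_t
    j1 = (j0 + 1) % n_t            # angle wraps around
    wi, wj = fi - i0, fj - np.floor(fj)
    out = ((1 - wi) * (1 - wj) * polar[i0, j0]
           + (1 - wi) * wj * polar[i0, j1]
           + wi * (1 - wj) * polar[i1, j0]
           + wi * wj * polar[i1, j1])
    out[r > r_max] = 0.0           # outside the reconstruction circle
    return out

print(polar_to_cartesian(np.ones((32, 64)), 8, 1.0))
```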


Journal ArticleDOI
A.R. Mitchell, W.D. Stokes
TL;DR: Examples show that the sampling operation, i.e., the change from the continuous time domain to the discrete time domain, does not necessarily preserve the minimum-phase property, and that the property can be created or destroyed by sampling or resampling.
Abstract: Examples show that the sampling operation, i.e., the change from the continuous time domain to the discrete time domain, does not necessarily preserve the minimum-phase property. Further examples can be constructed to show that the resampling operation on the discrete time domain does not necessarily preserve the minimum-phase property. Finally, it can be shown that the minimum-phase property can be either created or destroyed by sampling or resampling.
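
The discrete-time half of the claim is easy to demonstrate numerically: an FIR sequence is minimum phase iff all zeros of its transfer function lie strictly inside the unit circle, and decimation (one form of resampling) can push a zero outside. The example below is illustrative and is not taken from the paper.

```python
import numpy as np

def is_minimum_phase(h, tol=1e-9):
    """True iff all zeros of the FIR transfer function lie strictly
    inside the unit circle."""
    zeros = np.roots(h)
    return bool(np.all(np.abs(zeros) < 1 - tol))

# Minimum-phase FIR: (1 - 0.9 z^-1)^3, all zeros at z = 0.9
h = np.array([1.0, -2.7, 2.43, -0.729])
print(is_minimum_phase(h))    # True

# Resample by keeping every other sample (decimation by 2)
h2 = h[::2]                   # [1.0, 2.43] -> zero at z = -2.43
print(is_minimum_phase(h2))   # False: resampling destroyed the property
```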


Journal ArticleDOI
TL;DR: In this paper, the authors used permutation tests to evaluate the significance of the obtained value of a statistic S against a distribution of values generated by all partitions of the observations into two sets of sizes n and m. If the number of partitions is large, a random sample of partitions may be used.
Abstract: Modern computer technology has made it practical for investigators to utilize a number of statistical techniques for analyzing data with unknown distribution functions. These include the randomization test (Fisher, 1935), the quadratic assignment technique (Hubert & Schultz, 1976), and the bootstrap (Efron, 1979). The first two of these are permutation tests. For two independent random samples of sizes n and m, the randomization test evaluates the significance of the obtained value of a statistic S against a distribution of values generated by all partitions of the observations into two sets of sizes n and m. If the number of partitions is large, a random sample of partitions may be used. In the quadratic assignment technique, multivariate observations in matrix form are collected, and a statistic S is evaluated against a distribution generated by permuting the matrix rows and corresponding columns in one of the two samples. By contrast, bootstrapping involves drawing repeated samples from the two groups and generating a distribution of S-values. The standard deviation of the distribution provides an estimate of the standard error of S, and the percentile points of the distribution can be used to set nonparametric confidence bounds for the true value of S. The program described in this report allows these techniques to be used with proximity matrix data frequently collected in psychological research. As an example, consider an experiment in which an investigator is interested in studying how structural knowledge about a topic might change as a result of reading textual material about that topic. Relevant data could be collected by having subjects rate the relatedness of propositions, or of arguments in propositions, drawn from the topic passage. The ratings could be collected from two groups of subjects, one group which rates without reading the passage and a second group which rates after reading the passage (unmatched sample case), or from a single group of subjects which rates both before and after reading the passage (matched sample case). For both cases, the two sets of ratings would be entered into separate files as individual subject matrices with the text units as marginal row and column entries, the matrices summed across subjects in each file, and a correlation coefficient computed between corresponding entries in the two total matrices. In this example, the randomization technique generates the distribution of the correlation coefficient under the
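
The following is a minimal sketch of the randomization test described above, using a random sample of partitions when the full set is too large to enumerate; the statistic (a difference in means), the data, and the parameter names are illustrative, and this is not the program the report describes.

```python
import numpy as np

def randomization_test(a, b, n_perm=9999, seed=0):
    """Two-sample randomization test for a difference in means.

    The observed statistic is referred to the distribution obtained by
    repeatedly repartitioning the pooled observations into groups of
    sizes n and m; a random sample of partitions stands in for the
    full set when enumeration is impractical.
    """
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([a, b])
    n = len(a)
    observed = a.mean() - b.mean()
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        if abs(perm[:n].mean() - perm[n:].mean()) >= abs(observed):
            count += 1
    # the observed partition itself counts as one of the partitions
    return (count + 1) / (n_perm + 1)

rng = np.random.default_rng(1)
a = rng.normal(0.8, 1.0, 15)
b = rng.normal(0.0, 1.0, 20)
print(randomization_test(a, b))   # two-sided p-value
```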


Journal ArticleDOI
TL;DR: In this paper, a two-stage method for registering two digital images is described: a coarse registration is followed by a high-precision registration based on automatic image matching, in which positional errors at control points are measured by window correlation and the image is resampled to minimize the sum of squared errors.
Abstract: This paper describes a method for superimposing two digital images in two stages: first, a coarse registration is performed, and then a high-precision registration is carried out using an automatic image-matching technique. To register the two images accurately, as many control points as possible are selected evenly over the entire image, the positional error between the two images at each control point is measured, and the image is regenerated so that the sum of the squared errors is minimized. The positional error at each control point could be determined automatically by computing the correlation coefficient between the image data in small regions (16 × 16) around the point, which made it possible to produce registered images quickly. For Landsat MSS images, the band 6 image proved best suited to automatic matching. The relationship between the number of selected control points and the registration accuracy was also examined, and a tentative scheme for selecting control points is presented for keeping the registration error within one pixel.
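
To convey the window-matching step, here is a hedged sketch that estimates the positional error of a single control point by maximizing the correlation coefficient between 16 × 16 windows of the two images over a small search range; the synthetic data and function names are illustrative, and the paper's full pipeline (control-point selection and least-squares resampling) is not reproduced.

```python
import numpy as np

def corrcoef_2d(a, b):
    """Correlation coefficient between two equally sized image windows."""
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

def match_control_point(img1, img2, cy, cx, half=8, search=4):
    """Estimate the positional error of one control point.

    Correlates the (2*half) x (2*half) window of img1 around (cy, cx)
    against shifted windows of img2, returning the (dy, dx) shift that
    maximizes the correlation coefficient.
    """
    ref = img1[cy - half:cy + half, cx - half:cx + half]
    best, best_shift = -np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            win = img2[cy + dy - half:cy + dy + half,
                       cx + dx - half:cx + dx + half]
            r = corrcoef_2d(ref, win)
            if r > best:
                best, best_shift = r, (dy, dx)
    return best_shift, best

# synthetic example: img2 is img1 shifted by (2, -1)
rng = np.random.default_rng(0)
img1 = rng.normal(size=(64, 64))
img2 = np.roll(img1, shift=(2, -1), axis=(0, 1))
print(match_control_point(img1, img2, 32, 32))   # ((2, -1), ~1.0)
```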