
Showing papers on "Parametric statistics published in 2000"


Journal ArticleDOI
TL;DR: A universal statistical model for texture images in the context of an overcomplete complex wavelet transform is presented, demonstrating the necessity of subgroups of the parameter set by showing examples of texture synthesis that fail when those parameters are removed from the set.
Abstract: We present a universal statistical model for texture images in the context of an overcomplete complex wavelet transform. The model is parameterized by a set of statistics computed on pairs of coefficients corresponding to basis functions at adjacent spatial locations, orientations, and scales. We develop an efficient algorithm for synthesizing random images subject to these constraints, by iteratively projecting onto the set of images satisfying each constraint, and we use this to test the perceptual validity of the model. In particular, we demonstrate the necessity of subgroups of the parameter set by showing examples of texture synthesis that fail when those parameters are removed from the set. We also demonstrate the power of our model by successfully synthesizing examples drawn from a diverse collection of artificial and natural textures.

1,978 citations
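
The synthesis algorithm above works by alternating projections onto constraint sets. As a hedged illustration of that projection idea only, the toy sketch below uses two much simpler constraints (the target's Fourier amplitude spectrum and its pixel mean/variance) in place of the paper's joint wavelet-coefficient statistics; all names and parameters are illustrative.

```python
# Toy synthesis-by-alternating-projections; the constraints stand in for the
# paper's joint wavelet statistics and are purely illustrative.
import numpy as np

def synthesize(target, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    amp = np.abs(np.fft.fft2(target))          # constraint 1: amplitude spectrum
    mu, sigma = target.mean(), target.std()    # constraint 2: marginal moments
    x = rng.normal(mu, sigma, target.shape)    # start from noise
    for _ in range(n_iter):
        # Project onto the set of images with the target amplitude spectrum.
        X = np.fft.fft2(x)
        x = np.real(np.fft.ifft2(amp * np.exp(1j * np.angle(X))))
        # Project onto the set of images with the target mean and variance.
        x = (x - x.mean()) / (x.std() + 1e-12) * sigma + mu
    return x

if __name__ == "__main__":
    target = np.random.default_rng(1).normal(size=(64, 64))  # placeholder "texture"
    out = synthesize(target)
    print(out.shape, round(out.mean(), 3), round(out.std(), 3))
```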


Journal ArticleDOI
TL;DR: In this paper, a general approach and a computationally convenient estimation procedure for the structural analysis of auction data is proposed for first-price sealed-bid auction models within the independent private value paradigm.
Abstract: This paper proposes a general approach and a computationally convenient estimation procedure for the structural analysis of auction data. Considering first-price sealed-bid auction models within the independent private value paradigm, we show that the underlying distribution of bidders' private values is identified from observed bids and the number of actual bidders without any parametric assumptions. Using the theory of minimax, we establish the best rate of uniform convergence at which the latent density of private values can be estimated nonparametrically from available data. We then propose a two-step kernel-based estimator that converges at the optimal rate.

716 citations
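
The two-step construction can be made concrete with a rough sketch (assumptions: symmetric independent private values, a plain Gaussian kernel without the paper's bandwidth choices, trimming, or boundary treatment; all names are illustrative). Pseudo private values are recovered from bids via the first-price first-order condition, and their density is then estimated by a kernel method.

```python
# Hedged sketch of a two-step pseudo-value estimator; not the paper's exact procedure.
import numpy as np
from scipy.stats import gaussian_kde

def pseudo_values(bids, n_bidders):
    bids = np.asarray(bids, dtype=float)
    g_hat = gaussian_kde(bids)                             # kernel estimate of bid density g
    G_hat = np.array([(bids <= b).mean() for b in bids])   # empirical bid cdf G
    # First-order condition of a first-price auction under IPV:
    #   v = b + G(b) / ((n - 1) g(b))
    return bids + G_hat / ((n_bidders - 1) * g_hat(bids))

rng = np.random.default_rng(0)
values = rng.uniform(0, 1, size=500)
n = 4
bids = (n - 1) / n * values                                # equilibrium bids for U(0,1) values
v_hat = pseudo_values(bids, n)
f_hat = gaussian_kde(v_hat)                                # second step: density of private values
print(np.round(f_hat(np.array([0.25, 0.5, 0.75])), 3))
```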


Journal ArticleDOI
TL;DR: It is shown that with proper data preprocessing, Adaptive MultiVariate AutoRegressive (AMVAR) modeling is an effective technique for dealing with nonstationary ERP time series and a bootstrap procedure is proposed to assess the variability in the estimated spectral quantities.
Abstract: In this article we consider the application of parametric spectral analysis to multichannel event-related potentials (ERPs) during cognitive experiments. We show that with proper data preprocessing, Adaptive MultiVariate AutoRegressive (AMVAR) modeling is an effective technique for dealing with nonstationary ERP time series. We propose a bootstrap procedure to assess the variability in the estimated spectral quantities. Finally, we apply AMVAR spectral analysis to a visuomotor integration task, revealing rapidly changing cortical dynamics during different stages of task processing.

608 citations
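
A minimal sketch of the windowed multivariate AR step, assuming the statsmodels VAR estimator as a stand-in; the paper's multi-trial (ensemble) estimation, preprocessing, and bootstrap of spectral quantities are not reproduced here.

```python
# Fit a multivariate AR model on short sliding windows of a multichannel series.
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
data = rng.normal(size=(1000, 3))                  # placeholder 3-channel "ERP" series

order, win, step = 5, 200, 50
for start in range(0, data.shape[0] - win + 1, step):
    window = data[start:start + win]
    window = window - window.mean(axis=0)          # simple per-window demeaning
    res = VAR(window).fit(order)
    # res.coefs holds the AR coefficient matrices A_1..A_p for this window,
    # from which spectral matrices, coherence, etc. can be derived.
    print(start, res.coefs.shape)
```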



Book
24 Feb 2000
TL;DR: The Handbook of Parametric and Nonparametric Statistical Procedures (HPSP) as mentioned in this paper is one of the most widely used handbooks for applied statistics, with more than 200 pages of new material added by the author.
Abstract: Called the "bible of applied statistics," the first two editions of the Handbook of Parametric and Nonparametric Statistical Procedures were unsurpassed in accessibility, practicality, and scope. Now author David Sheskin has gone several steps further and added even more tests, more examples, and more background information: more than 200 pages of new material.

565 citations


Journal ArticleDOI
TL;DR: A new approach to the correction of intensity inhomogeneities in magnetic resonance imaging (MRI) that significantly improves intensity-based tissue segmentation and overcomes limitations of methods based on homomorphic filtering is presented.
Abstract: Presents a new approach to the correction of intensity inhomogeneities in magnetic resonance imaging (MRI) that significantly improves intensity-based tissue segmentation. The distortion of the image brightness values by a low-frequency bias field impedes visual inspection and segmentation. The new correction method called parametric bias field correction (PABIC) is based on a simplified model of the imaging process, a parametric model of tissue class statistics, and a polynomial model of the inhomogeneity field. The authors assume that the image is composed of pixels assigned to a small number of categories with a priori known statistics. Further they assume that the image is corrupted by noise and a low-frequency inhomogeneity field. The estimation of the parametric bias field is formulated as a nonlinear energy minimization problem using an evolution strategy (ES). The resulting bias field is independent of the image region configurations and thus overcomes limitations of methods based on homomorphic filtering. Furthermore, PABIC can correct bias distortions much larger than the image contrast. Input parameters are the intensity statistics of the classes and the degree of the polynomial function. The polynomial approach combines bias correction with histogram adjustment, making it well suited for normalizing the intensity histogram of datasets from serial studies. The authors present simulations and a quantitative validation with phantom and test images. A large number of MR image data acquired with breast, surface, and head coils, both in two dimensions and three dimensions, have been processed and demonstrate the versatility and robustness of this new bias correction scheme.

549 citations
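
A toy sketch of the parametric-bias-field idea under strong simplifications: an additive low-order polynomial bias, a generic optimizer in place of the evolution strategy, and invented class statistics. It illustrates only the energy-minimization structure, not PABIC itself.

```python
# Simplified additive bias-field estimation; class statistics and image are invented.
import numpy as np
from scipy.optimize import minimize

H, W = 64, 64
yy, xx = np.mgrid[0:H, 0:W]
xx = xx / W - 0.5
yy = yy / H - 0.5
basis = np.stack([np.ones_like(xx), xx, yy, xx * yy, xx**2, yy**2], axis=-1)

class_means = np.array([0.2, 0.5, 0.8])      # assumed a priori tissue statistics
class_sigmas = np.array([0.05, 0.05, 0.05])

rng = np.random.default_rng(0)
true = class_means[rng.integers(0, 3, size=(H, W))]
bias_true = 0.3 * xx + 0.2 * yy
image = true + bias_true + rng.normal(0, 0.02, size=(H, W))

def energy(coeffs):
    bias = basis @ coeffs                     # polynomial bias field
    corrected = image - bias
    d = (corrected[..., None] - class_means) / class_sigmas
    return np.min(d**2, axis=-1).sum()        # each pixel attracted to its nearest class

res = minimize(energy, np.zeros(basis.shape[-1]), method="Powell")
print("estimated bias coefficients:", np.round(res.x, 3))
```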


Journal ArticleDOI
TL;DR: Maximum likelihood techniques for the joint estimation of the incidence and latency regression parameters in this model are developed using the nonparametric form of the likelihood and an EM algorithm and are applied to a data set of tonsil cancer patients treated with radiation therapy.
Abstract: Summary. Some failure time data come from a population that consists of some subjects who are susceptible to and others who are nonsusceptible to the event of interest. The data typically have heavy censoring at the end of the follow-up period, and a standard survival analysis would not always be appropriate. In such situations where there is good scientific or empirical evidence of a nonsusceptible population, the mixture or cure model can be used (Farewell, 1982, Biometrics 38, 1041–1046). It assumes a binary distribution to model the incidence probability and a parametric failure time distribution to model the latency. Kuk and Chen (1992, Biometrika 79, 531–541) extended the model by using Cox's proportional hazards regression for the latency. We develop maximum likelihood techniques for the joint estimation of the incidence and latency regression parameters in this model using the nonparametric form of the likelihood and an EM algorithm. A zero-tail constraint is used to reduce the near nonidentifiability of the problem. The inverse of the observed information matrix is used to compute the standard errors. A simulation study shows that the methods are competitive to the parametric methods under ideal conditions and are generally better when censoring from loss to follow-up is heavy. The methods are applied to a data set of tonsil cancer patients treated with radiation therapy.

549 citations
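
As a hedged illustration of the EM structure only, the sketch below fits a heavily simplified parametric cure model (a single incidence probability and an exponential latency, no covariates, no Cox regression, no zero-tail constraint) to simulated data.

```python
# Simplified mixture cure model fitted by EM; not the paper's nonparametric method.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
cured = rng.random(n) < 0.3                    # true cure fraction 0.3 (incidence 0.7)
t_event = rng.exponential(1.0, n)
c = rng.uniform(0, 5, n)                       # administrative censoring
time = np.where(cured, c, np.minimum(t_event, c))
event = (~cured) & (t_event <= c)

p, lam = 0.5, 0.5                              # initial incidence and latency rate
for _ in range(200):
    # E-step: posterior probability that a censored subject is susceptible.
    S = np.exp(-lam * time)                    # exponential survival of the uncured
    w = np.where(event, 1.0, p * S / (1 - p + p * S))
    # M-step: update incidence and latency parameters.
    p = w.mean()
    lam = event.sum() / (w * time).sum()
print(f"estimated incidence p = {p:.3f}, latency rate = {lam:.3f}")
```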


Journal ArticleDOI
TL;DR: The case when enough data paths can be generated according to an accepted parametric or nonparametric stochastic model is discussed; no assumptions on convexity with respect to the random parameters are required.
Abstract: A major issue in any application of multistage stochastic programming is the representation of the underlying random data process. We discuss the case when enough data paths can be generated according to an accepted parametric or nonparametric stochastic model. No assumptions on convexity with respect to the random parameters are required. We emphasize the notion of representative scenarios (or a representative scenario tree) relative to the problem being modeled.

493 citations
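
One simple heuristic for turning many simulated paths into a small set of representative scenarios is clustering; the sketch below does exactly that on toy lognormal paths. This illustrates only the notion of representative scenarios, not the construction developed in the paper.

```python
# Cluster simulated paths and take centroids as representative scenarios.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_paths, horizon = 5000, 12
increments = rng.normal(0.0, 0.05, size=(n_paths, horizon))
paths = 100.0 * np.exp(np.cumsum(increments, axis=1))      # toy price paths

k = 10
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(paths)
scenarios = km.cluster_centers_                            # representative paths
weights = np.bincount(km.labels_, minlength=k) / n_paths   # scenario probabilities
print(scenarios.shape, np.round(weights, 3))
```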


Journal ArticleDOI
TL;DR: A procedure to measure the efficiencies of DMUs with fuzzy observations is presented by applying the α-cut approach; by extending DEA to the fuzzy environment, the approach is made more powerful for applications.

455 citations


Journal ArticleDOI
TL;DR: A general nonparametric mixture model that extends models and improves estimation methods proposed by other researchers and extends Cox's proportional hazards regression model by allowing a proportion of event-free patients and investigating covariate effects on that proportion.
Abstract: Nonparametric methods have attracted less attention than their parametric counterparts for cure rate analysis. In this paper, we study a general nonparametric mixture model. The proportional hazards assumption is employed in modeling the effect of covariates on the failure time of patients who are not cured. The EM algorithm, the marginal likelihood approach, and multiple imputations are employed to estimate parameters of interest in the model. This model extends models and improves estimation methods proposed by other researchers. It also extends Cox's proportional hazards regression model by allowing a proportion of event-free patients and investigating covariate effects on that proportion. The model and its estimation method are investigated by simulations. An application to breast cancer data, including comparisons with previous analyses using a parametric model and an existing nonparametric model by other researchers, confirms the conclusions from the parametric model but not those from the existing nonparametric model.

404 citations


Journal ArticleDOI
TL;DR: In this article, the problem of choosing the number of bootstrap repetitions B for bootstrap standard errors, confidence intervals, confidence regions, hypothesis tests, p-values, and bias correction is considered.
Abstract: This paper considers the problem of choosing the number of bootstrap repetitions B for bootstrap standard errors, confidence intervals, confidence regions, hypothesis tests, p-values, and bias correction. For each of these problems, the paper provides a three-step method for choosing B to achieve a desired level of accuracy. Accuracy is measured by the percentage deviation of the bootstrap standard error estimate, confidence interval length, test's critical value, test's p-value, or bias-corrected estimate based on B bootstrap simulations from the corresponding ideal bootstrap quantities for which B=∞. The results apply quite generally to parametric, semiparametric, and nonparametric models with independent and dependent data. The results apply to the standard nonparametric iid bootstrap, moving block bootstraps for time series data, parametric and semiparametric bootstraps, and bootstraps for regression models based on bootstrapping residuals. Monte Carlo simulations show that the proposed methods work very well.
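
A simplified stand-in for the idea of choosing B adaptively: keep increasing B until the bootstrap standard-error estimate is stable to within a target percentage deviation. This heuristic is not the paper's three-step method; the tolerance and starting value are illustrative.

```python
# Heuristic for choosing the number of bootstrap repetitions B.
import numpy as np

def bootstrap_se(data, stat, B, rng):
    n = len(data)
    reps = np.array([stat(data[rng.integers(0, n, n)]) for _ in range(B)])
    return reps.std(ddof=1)

def choose_B(data, stat, tol_pct=2.0, B0=100, seed=0):
    rng = np.random.default_rng(seed)
    B, se_prev = B0, bootstrap_se(data, stat, B0, rng)
    while True:
        B *= 2
        se = bootstrap_se(data, stat, B, rng)
        if abs(se - se_prev) / se_prev * 100 < tol_pct:
            return B, se
        se_prev = se

data = np.random.default_rng(1).exponential(size=200)
B, se = choose_B(data, np.median)
print(f"chose B = {B}, bootstrap SE of the median = {se:.4f}")
```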

Journal ArticleDOI
TL;DR: Two kinds of adaptive control schemes for a robot manipulator with parametric uncertainties are presented; the proposed controllers are robust not only to structured uncertainty such as payload parameters, but also to unstructured uncertainty such as the friction model and disturbances.
Abstract: This paper presents two kinds of adaptive control schemes for a robot manipulator with parametric uncertainties. In order to compensate for these uncertainties, we use the FLS (fuzzy logic system), which has the capability to approximate any nonlinear function over the compact input space. In the proposed control schemes, we need not derive the linear formulation of the robot dynamic equation or tune its parameters. We also suggest robust adaptive control laws in all proposed schemes for decreasing the effect of approximation error. To reduce the number of fuzzy rules of the FLS, we consider the properties of robot dynamics and the decomposition of the uncertainty function. The proposed controllers are robust not only to structured uncertainty such as the payload parameters, but also to unstructured uncertainty such as the friction model and disturbances. The validity of the control scheme is shown by computer simulations of a two-link planar robot manipulator.

Book
25 Sep 2000
TL;DR: Topics studied include system reliability as a function of component reliability, parametric lifetime distributions, and statistical inference from incomplete data.
Abstract: 1. System Reliability as a Function of Component Reliability. 2. Parametric Lifetime Distributions. 3. Statistical Inference from Incomplete Data. 4. Preventive Maintenance Models Based on the Lifetime Distribution. 5. Preventive Maintenance Based on Parameter Control. 6. Best Time Scale for Age Replacement. 7. Preventive Maintenance with Learning. Solutions to Exercises. Appendix A: Nonhomogeneous Poisson Process (A.1 Definition and Basic Properties; A.2 Parametric Estimation of the Intensity Function). Appendix B: Covariances and Means of Order Statistics. Appendix C: The Laplace Transform. Appendix D: Probability Paper. Appendix E: Renewal Function. References. Glossary of Notation.

Proceedings ArticleDOI
01 Dec 2000
TL;DR: Geometric active contours have many advantages over parametric active contours, such as computational simplicity and the ability to change the curve topology during deformation as discussed by the authors, but the relationship between the two has not always been clear.
Abstract: Geometric active contours have many advantages over parametric active contours, such as computational simplicity and the ability to change the curve topology during deformation. While many of the capabilities of the older parametric active contours have been reproduced in geometric active contours, the relationship between the two has not always been clear. We develop a precise relationship between the two which includes spatially-varying coefficients, both tension and rigidity, and non-conservative external forces. The result is a very general geometric active contour formulation for which the intuitive design principles of parametric active contours can be applied. We demonstrate several novel applications in a series of simulations.

Journal ArticleDOI
TL;DR: In this article, two discretization methods are discussed that transcribe optimal control problems into nonlinear programming problems, for which SQP methods provide efficient solutions; these can also be used to check second-order sufficient conditions and for a post-optimal calculation of adjoint variables.
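
A tiny example of the transcription idea, under assumptions not taken from the paper: a double integrator is discretized by forward Euler and the resulting nonlinear program is handed to an SQP-type solver (SLSQP).

```python
# Direct transcription of a toy optimal control problem into an NLP.
import numpy as np
from scipy.optimize import minimize

N, T = 40, 1.0
h = T / N

def rollout(u):
    x = np.zeros(2)                         # state: position, velocity
    for uk in u:
        x = x + h * np.array([x[1], uk])    # forward Euler step
    return x

objective = lambda u: h * np.sum(u**2)                    # control energy
terminal = lambda u: rollout(u) - np.array([1.0, 0.0])    # reach x=1 at rest

res = minimize(objective, np.zeros(N), method="SLSQP",
               constraints=[{"type": "eq", "fun": terminal}])
print("converged:", res.success, "cost:", round(res.fun, 4))
```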

Journal ArticleDOI
TL;DR: The potential of multilevel models for meta-analysis of trials with binary outcomes is explored for both summary data, such as log-odds ratios, and individual patient data, and the flexibility of multilevel modelling may be exploited in facilitating extensions to standard meta-analysis methods.
Abstract: In this paper we explore the potential of multilevel models for meta-analysis of trials with binary outcomes for both summary data, such as log-odds ratios, and individual patient data. Conventional fixed effect and random effects models are put into a multilevel model framework, which provides maximum likelihood or restricted maximum likelihood estimation. To exemplify the methods, we use the results from 22 trials to prevent respiratory tract infections; we also make comparisons with a second example data set comprising fewer trials. Within summary data methods, confidence intervals for the overall treatment effect and for the between-trial variance may be derived from likelihood based methods or a parametric bootstrap as well as from Wald methods; the bootstrap intervals are preferred because they relax the assumptions required by the other two methods. When modelling individual patient data, a bias corrected bootstrap may be used to provide unbiased estimation and correctly located confidence intervals; this method is particularly valuable for the between-trial variance. The trial effects may be modelled as either fixed or random within individual data models, and we discuss the corresponding assumptions and implications. If random trial effects are used, the covariance between these and the random treatment effects should be included; the resulting model is equivalent to a bivariate approach to meta-analysis. Having implemented these techniques, the flexibility of multilevel modelling may be exploited in facilitating extensions to standard meta-analysis methods.
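
A compact sketch of the summary-data route with a parametric bootstrap interval for the between-trial variance, using a DerSimonian-Laird estimator as a stand-in for the multilevel-model fit; the trial data are invented for illustration.

```python
# Random-effects meta-analysis of log-odds ratios with a parametric bootstrap CI
# for the between-trial variance; a simplified stand-in for the paper's methods.
import numpy as np

def dersimonian_laird(y, v):
    w = 1.0 / v
    mu_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - mu_fixed) ** 2)
    k = len(y)
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1.0 / (v + tau2)
    return np.sum(w_star * y) / np.sum(w_star), tau2

rng = np.random.default_rng(0)
v = rng.uniform(0.02, 0.2, size=22)                  # within-trial variances
y = rng.normal(-0.4, np.sqrt(v + 0.05))              # observed log-odds ratios
mu_hat, tau2_hat = dersimonian_laird(y, v)

# Parametric bootstrap: resample trial effects from the fitted random-effects model.
boot = [dersimonian_laird(rng.normal(mu_hat, np.sqrt(v + tau2_hat)), v)[1]
        for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"mu = {mu_hat:.3f}, tau^2 = {tau2_hat:.3f}, 95% bootstrap CI ({lo:.3f}, {hi:.3f})")
```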

Journal ArticleDOI
TL;DR: In this article, a Lyapunov-based control algorithm is developed for force tracking control of an electro-hydraulic actuator, which relies on an accurate model of the system.

01 Apr 2000
TL;DR: Kernel estimation as mentioned in this paper provides an unbinned and non-parametric estimate of the probability density function from which a set of data is drawn, and is a popular technique for high-energy physics applications.
Abstract: Kernel estimation provides an unbinned and non-parametric estimate of the probability density function from which a set of data is drawn. In the first section, after a brief discussion on parametric and non-parametric methods, the theory of kernel estimation is developed for univariate and multivariate settings. The second section discusses some of the applications of kernel estimation to high-energy physics. The third section provides an overview of the available univariate and multivariate packages. This paper concludes with a discussion of the inherent advantages of kernel estimation techniques and systematic errors associated with the estimation of parent distributions.
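
A minimal univariate kernel density estimate with a Gaussian kernel and Silverman's rule-of-thumb bandwidth, just to make the construction concrete; the packages surveyed in the paper are not used here.

```python
# Plain Gaussian-kernel density estimate with Silverman's rule-of-thumb bandwidth.
import numpy as np

def kde(x_grid, data):
    data = np.asarray(data, dtype=float)
    n = data.size
    h = 1.06 * data.std(ddof=1) * n ** (-1 / 5)        # Silverman's rule of thumb
    u = (x_grid[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (n * h * np.sqrt(2 * np.pi))

data = np.random.default_rng(0).normal(size=300)
grid = np.linspace(-4, 4, 9)
print(np.round(kde(grid, data), 3))
```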

Journal ArticleDOI
TL;DR: The parametric adaptive matched filter (PAMF) for space-time adaptive processing (STAP) is introduced via the matched filter, multichannel linear prediction, and the multichannel LDU decomposition.
Abstract: The parametric adaptive matched filter (PAMF) for space-time adaptive processing (STAP) is introduced via the matched filter (MF), multichannel linear prediction, and the multichannel LDU decomposition. Two alternative algorithmic implementations of the PAMF are discussed. Issues considered include sample training data size and constant false alarm rate (CFAR). Detection test statistics are estimated for airborne phased array radar measurements, and probability of detection is estimated using simulated phased array radar data for airborne surveillance radar scenarios. For large sample sizes, the PAMF performs close to the MF; performance degrades slightly for small sample sizes. In both sample size ranges, the PAMF is tolerant to targets present in the training set.

Journal ArticleDOI
TL;DR: A new method is proposed which requires a small set of measurements of a simple calibration object consisting of two spherical objects that can be considered as 'point' objects; from the ellipses they trace on the detector, the calibration geometry can be determined analytically using explicit formulae.
Abstract: This paper is about calibration of cone-beam (CB) scanners for both x-ray computed tomography and single-photon emission computed tomography. Scanner calibration refers here to the estimation of a set of parameters which fully describe the geometry of data acquisition. Such parameters are needed for the tomographic reconstruction step. The discussion is limited to the usual case where the cone vertex and planar detector move along a circular path relative to the object. It is also assumed that the detector does not have spatial distortions. We propose a new method which requires a small set of measurements of a simple calibration object consisting of two spherical objects, that can be considered as 'point' objects. This object traces two ellipses on the detector and from the parametric description of these ellipses, the calibration geometry can be determined analytically using explicit formulae. The method is robust and easy to implement. However, it is not fully general as it is assumed that the detector is parallel to the rotation axis of the scanner. Implementation details are given for an experimental x-ray CB scanner.

Journal ArticleDOI
TL;DR: The authors adopt a semi-parametric approach based on finite impulse response (FIR) filters, and introduce a Gaussian process prior on the filter parameters to cope with the increase in the number of degrees of freedom.
Abstract: Modeling the hemodynamic response in functional magnetic resonance (fMRI) experiments is an important aspect of the analysis of functional neuroimages. This has been done in the past using parametric response functions from a limited family. In this contribution, the authors adopt a semi-parametric approach based on finite impulse response (FIR) filters. In order to cope with the increase in the number of degrees of freedom, the authors introduce a Gaussian process prior on the filter parameters. They show how to carry out the analysis by incorporating prior knowledge on the filters, optimizing hyper-parameters using the evidence framework, or sampling using a Markov Chain Monte Carlo (MCMC) approach. The authors present a comparison of their model with standard hemodynamic response kernels on simulated data, and perform a full analysis of data acquired during an experiment involving visual stimulation.
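
A rough sketch of the FIR-plus-Gaussian-prior idea: with a fixed smoothness prior on the filter taps, the posterior mean is a regularized least-squares solution. The evidence-based hyper-parameter optimization and MCMC sampling described above are omitted, and the stimulus/response data are simulated.

```python
# MAP estimate of an FIR hemodynamic filter under a Gaussian smoothness prior.
import numpy as np

rng = np.random.default_rng(0)
T, L = 400, 20                                   # scan length, FIR length
stim = (rng.random(T) < 0.05).astype(float)      # random event onsets

# Toeplitz-style design matrix: column j is the stimulus delayed by j samples.
X = np.column_stack([np.concatenate([np.zeros(j), stim[:T - j]]) for j in range(L)])

true_hrf = np.exp(-0.5 * ((np.arange(L) - 6) / 2.0) ** 2)   # toy "HRF"
y = X @ true_hrf + rng.normal(0, 0.5, T)

# Gaussian process prior on the filter taps: squared-exponential covariance.
i = np.arange(L)
K = np.exp(-0.5 * ((i[:, None] - i[None, :]) / 3.0) ** 2) + 1e-6 * np.eye(L)
sigma2 = 0.25                                     # assumed noise variance

# Posterior mean of the FIR coefficients.
beta = np.linalg.solve(X.T @ X / sigma2 + np.linalg.inv(K), X.T @ y / sigma2)
print(np.round(beta[:10], 2))
```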

Journal ArticleDOI
TL;DR: In the paper, randomized algorithms for stability and performance of linear time invariant uncertain systems described by a general M-Δ configuration are studied and efficient polynomial-time algorithms for uncertainty structures Δ consisting of an arbitrary number of full complex blocks and uncertain parameters are developed.
Abstract: There has been a growing interest in developing randomized algorithms for probabilistic robustness of uncertain control systems. Unlike classical worst case methods, these algorithms provide probabilistic estimates assessing, for instance, if a certain design specification is met with a given probability. One of the advantages of this approach is that the robustness margins can be often increased by a considerable amount, at the expense of a small risk. In this sense, randomized algorithms may be used by the control engineer together with standard worst case methods to obtain additional useful information. The applicability of these probabilistic methods to robust control is presently limited by the fact that the sample generation is feasible only in very special cases which include systems affected by real parametric uncertainty bounded in rectangles or spheres. Sampling in more general uncertainty sets is generally performed through overbounding, at the expense of an exponential rejection rate. In the paper, randomized algorithms for stability and performance of linear time invariant uncertain systems described by a general M-Δ configuration are studied. In particular, efficient polynomial-time algorithms for uncertainty structures Δ consisting of an arbitrary number of full complex blocks and uncertain parameters are developed.
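
A bare-bones illustration of the probabilistic approach: sample the uncertain real parameters of a small state matrix and estimate the probability of Hurwitz stability by Monte Carlo. The M-Δ structure and the polynomial-time sample generators of the paper are not reproduced; the system below is invented.

```python
# Monte Carlo estimate of the probability of stability for a toy uncertain system.
import numpy as np

rng = np.random.default_rng(0)
N = 20000
count = 0
for _ in range(N):
    d1, d2 = rng.uniform(-1.0, 1.0, size=2)        # real parametric uncertainty
    A = np.array([[-1.0 + 0.6 * d1, 1.0],
                  [-2.0, -1.5 + 0.8 * d2]])
    if np.max(np.linalg.eigvals(A).real) < 0:      # Hurwitz stability check
        count += 1
print(f"estimated probability of stability: {count / N:.4f}")
```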

Journal ArticleDOI
TL;DR: In this paper, the authors concentrate on the parametric mixing of a signal waveform with a linearly chirped optical pump as the time lens mechanism and analyze all single-lens system configurations including sum-and difference-frequency mixing schemes with positive and negative group velocity dispersions using temporal ray diagrams as an aid in understanding their operation.
Abstract: The recently developed process of temporal imaging expands or compresses time waveforms while preserving the shapes of their envelope profiles. A key element in a temporal imaging system is a time lens which imparts a quadratic phase modulation to the waveform being imaged. Several methods, such as electrooptic modulation, can be used to produce the phase modulation. In this paper, we concentrate on the parametric mixing of a signal waveform with a linearly chirped optical pump as the time lens mechanism. We analyze all single-lens system configurations including sum- and difference-frequency mixing schemes with positive and negative group velocity dispersions using temporal ray diagrams as an aid in understanding their operation.

Journal ArticleDOI
TL;DR: In this paper, the exponential families of distributions are applied to the problem of fitting the univariate histograms and discrete bivariate frequency distributions that often arise in the analysis of test scores.
Abstract: The well-developed theory of exponential families of distributions is applied to the problem of fitting the univariate histograms and discrete bivariate frequency distributions that often arise in the analysis of test scores. These models are powerful tools for many forms of parametric data smoothing and are particularly well-suited to problems in which there is little or no theory to guide a choice of probability models, e.g., smoothing a distribution to eliminate roughness and zero frequencies in order to equate scores from different tests. Attention is given to efficient computation of the maximum likelihood estimates of the parameters using Newton's Method and to computationally efficient methods for obtaining the asymptotic standard errors of the fitted frequencies and proportions. We discuss tools that can be used to diagnose the quality of the fitted frequencies for both the univariate and the bivariate cases. Five examples, using real data, are used to illustrate the methods of this paper.
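
A brief sketch of log-linear (exponential-family) smoothing of a score histogram, assuming a Poisson GLM with a low-degree polynomial predictor as a stand-in for the paper's Newton-based fitting; the score data are simulated.

```python
# Smooth a raw score histogram by fitting a log-linear (Poisson) model with a
# polynomial of the score as the linear predictor.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
scores = np.clip(np.round(rng.normal(20, 5, size=400)), 0, 40).astype(int)
counts = np.bincount(scores, minlength=41)             # raw, possibly rough histogram

x = np.arange(41) / 40.0                                # rescaled score points
X = np.column_stack([x**k for k in range(4)])           # degree-3 polynomial basis
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
smoothed = fit.fittedvalues                             # smoothed frequencies, all > 0
print(np.round(smoothed[:10], 2))
```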

Journal ArticleDOI
TL;DR: A deterministic iterative algorithm is developed to implement the derived contour estimation criterion; the result is an unsupervised parametric deformable contour that adapts its degree of smoothness/complexity (number of control points) and also estimates the observation (image) model parameters.
Abstract: This paper describes a new approach to adaptive estimation of parametric deformable contours based on B-spline representations. The problem is formulated in a statistical framework with the likelihood function being derived from a region-based image model. The parameters of the image model, the contour parameters, and the B-spline parameterization order (i.e., the number of control points) are all considered unknown. The parameterization order is estimated via a minimum description length (MDL) type criterion. A deterministic iterative algorithm is developed to implement the derived contour estimation criterion; the result is an unsupervised parametric deformable contour: it adapts its degree of smoothness/complexity (number of control points) and it also estimates the observation (image) model parameters. The experiments reported in the paper, performed on synthetic and real (medical) images, confirm the good performance of the approach.

Journal ArticleDOI
TL;DR: A profile for situations where neural networks are appropriate is derived from experiments investigating the applicability of neural networks for cost estimation in early phases of product design.
Abstract: Neural networks in a multilayer perceptron architecture are able to classify data and approximate functions based on a set of sample data (curve fitting). These properties are used to investigate experimentally the applicability of neural networks for cost estimation in early phases of product design. Experiments are based on pilot cost data from a manufacturing company. In addition, artificially created simulative data are used for benchmarking. The cost estimation performance is compared to conventional methods, i.e. linear and non-linear parametric regression. Neural networks achieve lower deviations in their cost estimations. Beyond the use of standard neural architectures, simple modifications for a performance improvement are suggested and tested. Finally, a profile for situations where neural networks are appropriate is derived from the results.
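
A small, self-contained comparison in the spirit of the experiments above: an MLP and a linear regression fit the same synthetic attribute-to-cost data and their test errors are compared. The company data and the paper's specific network configurations are not available here, so everything below is illustrative.

```python
# Compare an MLP and a linear regression on synthetic "design attribute -> cost" data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(400, 4))                       # toy design attributes
cost = 50 + 80 * X[:, 0] ** 2 + 30 * X[:, 1] * X[:, 2] + rng.normal(0, 2, 400)

Xtr, Xte, ytr, yte = train_test_split(X, cost, random_state=0)
for name, model in [("linear regression", LinearRegression()),
                    ("MLP", MLPRegressor(hidden_layer_sizes=(32, 32),
                                         max_iter=5000, random_state=0))]:
    model.fit(Xtr, ytr)
    err = mean_absolute_percentage_error(yte, model.predict(Xte))
    print(f"{name}: mean absolute percentage error = {err:.3f}")
```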

Journal ArticleDOI
24 Apr 2000
TL;DR: A new approach to fault detection for robot manipulators is introduced, based on the isolation of fault signatures via filtered torque prediction error estimates, which is formally demonstrated to be robust under uncertainty in the robot parameters.
Abstract: In this paper, we introduce a new approach to fault detection for robot manipulators. The technique, which is based on the isolation of fault signatures via filtered torque prediction error estimates, does not require measurements or estimates of manipulator acceleration as is the case with some previously suggested methods. The method is formally demonstrated to be robust under uncertainty in the robot parameters. Furthermore, an adaptive version of the algorithm is introduced, and shown to both improve coverage and significantly reduce detection times. The effectiveness of the approach is demonstrated by experiments with a two-joint manipulator system.

Journal ArticleDOI
TL;DR: In this paper, the parametric amplification of super-Hubble-scale scalar metric fluctuations at the end of inflation in some specific two-field models of inflation, a class of which is motivated by hybrid inflation, is studied.
Abstract: We study the parametric amplification of super-Hubble-scale scalar metric fluctuations at the end of inflation in some specific two-field models of inflation, a class of which is motivated by hybrid inflation. We demonstrate that there can indeed be a large growth of fluctuations due to parametric resonance and that this effect is not taken into account by the conventional theory of isocurvature perturbations. Scalar field interactions play a crucial role in this analysis. We discuss the conditions under which there can be nontrivial parametric resonance effects on large scales.

Journal ArticleDOI
TL;DR: The proposed method enables the design, based on quadratic stability, of robust sliding hyperplanes in the presence of mismatched parametric uncertainty, and optimizes the sliding motion by applying the guaranteed cost control idea.

Journal ArticleDOI
TL;DR: In this paper, the authors compare non-parametric and parametric approaches to the analysis of data in the form of replicated spatial point patterns in two or more experimental groups, and compare mean K-functions between experimental groups using a bootstrap testing procedure.
Abstract: The paper compares non-parametric (design-based) and parametric (model-based) approaches to the analysis of data in the form of replicated spatial point patterns in two or more experimental groups. Basic questions for data of this kind concern estimating the properties of the underlying spatial point process within each experimental group, and comparing the properties between groups. A non-parametric approach, building on work by Diggle et al. (1991), summarizes each pattern by an estimate of the reduced second moment measure or K-function (Ripley, 1977) and compares mean K-functions between experimental groups using a bootstrap testing procedure. A parametric approach fits particular classes of parametric model to the data, uses the model parameter estimates as summaries and tests for differences between groups by comparing fits with and without the assumption of common parameter values across groups. The paper discusses how either approach can be implemented in the specific context of a single-factor replicated experiment and uses simulations to show how the parametric approach can be more efficient when the underlying model assumptions hold, but potentially misleading otherwise.
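
A schematic version of the non-parametric route: estimate a naive K-function (no edge correction) for each replicated pattern, then bootstrap over patterns to compare group mean K-functions. Simulated patterns stand in for real data, and this is an illustration rather than the authors' exact test.

```python
# Naive K-function estimates per pattern and a bootstrap test for a group difference.
import numpy as np

def k_function(points, r_grid, area=1.0):
    n = len(points)
    d = np.sqrt(((points[:, None, :] - points[None, :, :]) ** 2).sum(-1))
    d = d[np.triu_indices(n, k=1)]                        # pairwise distances
    return np.array([2 * area * (d <= r).sum() / (n * (n - 1)) for r in r_grid])

rng = np.random.default_rng(0)
r_grid = np.linspace(0.01, 0.25, 10)
group_a = [k_function(rng.random((50, 2)), r_grid) for _ in range(8)]
group_b = [k_function(rng.random((50, 2)), r_grid) for _ in range(8)]

obs = np.abs(np.mean(group_a, axis=0) - np.mean(group_b, axis=0)).sum()
pooled = np.array(group_a + group_b)
stats = []
for _ in range(999):                                      # bootstrap under no group effect
    idx = rng.integers(0, len(pooled), len(pooled))
    a, b = pooled[idx[:8]], pooled[idx[8:]]
    stats.append(np.abs(a.mean(axis=0) - b.mean(axis=0)).sum())
p_value = (np.sum(np.array(stats) >= obs) + 1) / (len(stats) + 1)
print(f"bootstrap p-value for a group difference in K: {p_value:.3f}")
```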