
Showing papers on "Parametric statistics published in 2007"


Book
19 Jan 2007
TL;DR: This handbook provides you with everything you need to know about parametric and nonparametric statistical procedures, and helps you choose the best test for your data, interpret the results, and better evaluate the research of others.
Abstract: With more than 500 pages of new material, the Handbook of Parametric and Nonparametric Statistical Procedures, Fourth Edition carries on the esteemed tradition of the previous editions, providing up-to-date, in-depth coverage of now more than 160 statistical procedures. The book also discusses both theoretical and practical statistical topics, such as experimental design, experimental control, and statistical analysis. New to the Fourth Edition:
- Multivariate statistics including matrix algebra, multiple regression, Hotelling's T2, MANOVA, MANCOVA, discriminant function analysis, canonical correlation, logistic regression, and principal components/factor analysis
- Clinical trials, survival analysis, tests of equivalence, analysis of censored data, and analytical procedures for crossover design
- Regression diagnostics that include the Durbin-Watson test
- Log-linear analysis of contingency tables, Mantel-Haenszel analysis of multiple 2 × 2 contingency tables, trend analysis, and analysis of variance for a Latin square design
- Levene and Brown-Forsythe tests for evaluating homogeneity of variance, the Jarque-Bera test of normality, and the extreme studentized deviate test for identifying outliers
- Confidence intervals for computing the population median and the difference between two population medians
- The relationship between the exponential and Poisson distributions
Eliminating the need to search across numerous books, this handbook provides you with everything you need to know about parametric and nonparametric statistical procedures. It helps you choose the best test for your data, interpret the results, and better evaluate the research of others.

5,097 citations


Journal ArticleDOI
TL;DR: A unified approach is proposed that makes it possible for researchers to preprocess data with matching and then to apply the best parametric techniques they would have used anyway; this procedure makes parametric models produce more accurate and considerably less model-dependent causal inferences.
Abstract: Although published works rarely include causal estimates from more than a few model specifications, authors usually choose the presented estimates from numerous trial runs readers never see. Given the often large variation in estimates across choices of control variables, functional forms, and other modeling assumptions, how can researchers ensure that the few estimates presented are accurate or representative? How do readers know that publications are not merely demonstrations that it is possible to find a specification that fits the author's favorite hypothesis? And how do we evaluate or even define statistical properties like unbiasedness or mean squared error when no unique model or estimator even exists? Matching methods, which offer the promise of causal inference with fewer assumptions, constitute one possible way forward, but crucial results in this fast-growing methodological literature are often grossly misinterpreted. We explain how to avoid these misinterpretations and propose a unified approach that makes it possible for researchers to preprocess data with matching (such as with the easy-to-use software we offer) and then to apply the best parametric techniques they would have used anyway. This procedure makes parametric models produce more accurate and considerably less model-dependent causal inferences.

3,601 citations
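The preprocessing idea is easy to demonstrate. Below is a minimal Python sketch with synthetic data, where plain Euclidean nearest-neighbor matching stands in for the authors' matching software; the point is only the two-step workflow (match first, then run the parametric model you would have used anyway):

```python
# Sketch: preprocess with one-to-one nearest-neighbor matching, then fit the
# usual parametric model on the matched sample. Synthetic data; Euclidean
# matching is a simple stand-in for the authors' matching software.
import numpy as np

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 3))                                   # covariates
t = X @ np.array([0.8, -0.5, 0.3]) + rng.normal(size=n) > 0   # treatment
y = 2.0 * t + X @ np.array([1.0, 1.0, -1.0]) + rng.normal(size=n)  # true effect 2

# 1) Matching: pair each treated unit with its nearest control on X.
treated, control = np.where(t)[0], np.where(~t)[0]
d = np.linalg.norm(X[treated][:, None, :] - X[control][None, :, :], axis=2)
keep = np.concatenate([treated, control[d.argmin(axis=1)]])

# 2) Parametric analysis as usual, now on the preprocessed (matched) sample.
Z = np.column_stack([np.ones(keep.size), t[keep].astype(float), X[keep]])
beta, *_ = np.linalg.lstsq(Z, y[keep], rcond=None)
print("estimated treatment effect:", beta[1])                 # close to 2
```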


Journal ArticleDOI
TL;DR: FCS is a semi-parametric and flexible alternative that specifies the multivariate model by a series of conditional models, one for each incomplete variable, but its statistical properties are difficult to establish.
Abstract: The goal of multiple imputation is to provide valid inferences for statistical estimates from incomplete data. To achieve that goal, imputed values should preserve the structure in the data, as well as the uncertainty about this structure, and include any knowledge about the process that generated the missing data. Two approaches for imputing multivariate data exist: joint modeling (JM) and fully conditional specification (FCS). JM is based on parametric statistical theory, and leads to imputation procedures whose statistical properties are known. JM is theoretically sound, but the joint model may lack flexibility needed to represent typical data features, potentially leading to bias. FCS is a semi-parametric and flexible alternative that specifies the multivariate model by a series of conditional models, one for each incomplete variable. FCS provides tremendous flexibility and is easy to apply, but its statistical properties are difficult to establish. Simulation work shows that FCS behaves very well in ...

2,119 citations
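The FCS idea — one conditional model per incomplete variable, cycled repeatedly — fits in a few lines for continuous data. A minimal sketch, assuming linear conditional models; a real implementation (such as chained-equations software) would also draw the regression parameters from their posterior to fully propagate uncertainty:

```python
# Minimal FCS (chained equations) sketch for continuous data: cycle through the
# incomplete variables, regress each on all the others, and impute with draws
# that add residual noise so imputation uncertainty is not understated.
import numpy as np

def fcs_impute(X, n_iter=10, seed=0):
    rng = np.random.default_rng(seed)
    X = X.copy()
    miss = np.isnan(X)
    col_means = np.nanmean(X, axis=0)
    X[miss] = col_means[np.where(miss)[1]]        # start from mean imputation
    for _ in range(n_iter):
        for j in range(X.shape[1]):
            if not miss[:, j].any():
                continue
            obs = ~miss[:, j]
            Z = np.column_stack([np.ones(len(X)), np.delete(X, j, axis=1)])
            beta, *_ = np.linalg.lstsq(Z[obs], X[obs, j], rcond=None)
            sigma = (X[obs, j] - Z[obs] @ beta).std()
            X[miss[:, j], j] = Z[miss[:, j]] @ beta + rng.normal(
                scale=sigma, size=miss[:, j].sum())
    return X

X = np.array([[1.0, 2.1], [2.0, np.nan], [3.0, 6.2], [np.nan, 7.9], [5.0, 9.8]])
print(fcs_impute(X))
```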


Journal ArticleDOI
15 Apr 2007
TL;DR: This paper gives a general overview of techniques in statistical parametric speech synthesis, and contrasts these techniques with the more conventional unit selection technology that has dominated speech synthesis over the last ten years.
Abstract: This paper gives a general overview of techniques in statistical parametric speech synthesis. One of the instances of these techniques, called HMM-based generation synthesis (or simply HMM-based synthesis), has recently been shown to be very effective in generating acceptable synthetic speech. This paper also contrasts these techniques with the more conventional unit selection technology that has dominated speech synthesis over the last ten years. Advantages and disadvantages of statistical parametric synthesis are highlighted, and we identify where we expect the key developments to appear in the immediate future.

1,270 citations


Posted Content
Xiaohong Chen
TL;DR: The method of sieves as discussed by the authors can be used to estimate semi-nonparametric econometric models with various constraints, such as monotonicity, convexity, additivity, multiplicity, exclusion and nonnegativity.
Abstract: Often researchers find parametric models restrictive and sensitive to deviations from the parametric specifications; semi-nonparametric models are more flexible and robust, but lead to other complications such as introducing infinite-dimensional parameter spaces that may not be compact and the optimization problem may no longer be well-posed. The method of sieves provides one way to tackle such difficulties by optimizing an empirical criterion over a sequence of approximating parameter spaces (i.e., sieves); the sieves are less complex but are dense in the original space and the resulting optimization problem becomes well-posed. With different choices of criteria and sieves, the method of sieves is very flexible in estimating complicated semi-nonparametric models with (or without) endogeneity and latent heterogeneity. It can easily incorporate prior information and constraints, often derived from economic theory, such as monotonicity, convexity, additivity, multiplicity, exclusion and nonnegativity. It can simultaneously estimate the parametric and nonparametric parts in semi-nonparametric models, typically with optimal convergence rates for both parts. This chapter describes estimation of semi-nonparametric econometric models via the method of sieves. We present some general results on the large sample properties of the sieve estimates, including consistency of the sieve extremum estimates, convergence rates of the sieve M-estimates, pointwise normality of series estimates of regression functions, root-n asymptotic normality and efficiency of sieve estimates of smooth functionals of infinite-dimensional parameters. Examples are used to illustrate the general results.

654 citations
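The core mechanics of a sieve estimator can be illustrated in a few lines: optimize a least-squares criterion over a finite-dimensional approximating space whose dimension grows with the sample size. A hypothetical Python sketch with a polynomial sieve (the n^(1/3) growth rate is an illustrative choice, not the chapter's general prescription):

```python
# Sieve least-squares sketch: estimate an unknown regression function by OLS on
# a finite-dimensional approximating space (here, polynomials) whose dimension
# grows slowly with the sample size.
import numpy as np

rng = np.random.default_rng(1)
n = 400
x = rng.uniform(-1, 1, n)
y = np.sin(3 * x) + 0.2 * rng.normal(size=n)    # unknown g(x) = sin(3x)

K = int(np.ceil(n ** (1 / 3)))                  # sieve dimension grows like n^(1/3)
basis = np.vander(x, K + 1, increasing=True)    # 1, x, ..., x^K
coef, *_ = np.linalg.lstsq(basis, y, rcond=None)

x0 = np.linspace(-1, 1, 5)
ghat = np.vander(x0, K + 1, increasing=True) @ coef
print(np.round(ghat, 2), np.round(np.sin(3 * x0), 2))   # estimate vs. truth
```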


Journal Article
Dongbin Xiu1
TL;DR: In this paper, a numerical algorithm for effective incorporation of parametric uncertainty into mathematical models is presented, where uncertain parameters are modeled as random variables, and the governing equations are treated as stochastic.
Abstract: A numerical algorithm for effective incorporation of parametric uncertainty into mathematical models is presented. The uncertain parameters are modeled as random variables, and the governing equations are treated as stochastic. The solutions, or quantities of interest, are expressed as convergent series of orthogonal polynomial expansions in terms of the input random parameters. A high-order stochastic collocation method is employed to compute the solution statistics and, more importantly, to reconstruct the polynomial expansion. While retaining the high accuracy of the polynomial expansion, the resulting “pseudo-spectral” type algorithm is straightforward to implement, as it requires only repetitive deterministic simulations. An estimate of the error bound is presented, along with numerical examples for problems with relatively complicated forms of governing equations.

441 citations
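For a single Gaussian input, the pseudo-spectral recipe — repeated deterministic solves at collocation nodes, then projection onto orthogonal polynomials — looks like the sketch below. The one-line `model` is a stand-in for the governing equations; normalization follows the probabilists' Hermite convention:

```python
# Pseudo-spectral stochastic collocation sketch, one Gaussian parameter Z:
# evaluate the deterministic model at Gauss-Hermite nodes, then project onto
# probabilists' Hermite polynomials He_k to recover the expansion coefficients.
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial, sqrt, pi

def model(z):                      # stand-in for a deterministic simulation
    return np.exp(0.3 * z)

nodes, weights = He.hermegauss(12)        # quadrature for weight exp(-z^2/2)
P = 5
coef = np.zeros(P + 1)
for k in range(P + 1):
    Hk = He.hermeval(nodes, [0] * k + [1])            # He_k at the nodes
    coef[k] = (weights * model(nodes) * Hk).sum() / (sqrt(2 * pi) * factorial(k))

# solution statistics follow directly from the coefficients
mean = coef[0]
var = sum(factorial(k) * coef[k] ** 2 for k in range(1, P + 1))
print(mean, var)    # exact mean is exp(0.045) ≈ 1.04603
```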


Book
02 Feb 2007
TL;DR: This book presents methods for event history analysis, covering nonparametric descriptive methods, exponential transition rate models (including piecewise constant variants and time-dependent covariates), parametric models of time-dependence, methods to check parametric assumptions, and semiparametric transition rate models.
Abstract: Contents: Preface. Introduction. Event History Data Structures. Nonparametric Descriptive Methods. Exponential Transition Rate Models. Piecewise Constant Exponential Models. Exponential Models With Time-Dependent Covariates. Parametric Models of Time-Dependence. Methods to Check Parametric Assumptions. Semiparametric Transition Rate Models. Problems of Model Specification.

381 citations


Book
30 Jun 2007
TL;DR: An alternative selection scheme based on relative bounds between estimators is described and studied, and a two-step localization technique which can handle the selection of a parametric model from a family of such models is presented.
Abstract: This monograph deals with adaptive supervised classification, using tools borrowed from statistical mechanics and information theory, stemming from the PAC-Bayesian approach pioneered by David McAllester and applied to a conception of statistical learning theory forged by Vladimir Vapnik. Using convex analysis on the set of posterior probability measures, we show how to get local measures of the complexity of the classification model involving the relative entropy of posterior distributions with respect to Gibbs posterior measures. We then discuss relative bounds, comparing the generalization error of two classification rules, showing how the margin assumption of Mammen and Tsybakov can be replaced with some empirical measure of the covariance structure of the classification model. We show how to associate to any posterior distribution an effective temperature relating it to the Gibbs prior distribution with the same level of expected error rate, and how to estimate this effective temperature from data, resulting in an estimator whose expected error rate converges at the best possible power of the sample size, adaptively under any margin and parametric complexity assumptions. We describe and study an alternative selection scheme based on relative bounds between estimators, and present a two-step localization technique which can handle the selection of a parametric model from a family of such models. We show how to extend systematically all the results obtained in the inductive setting to transductive learning, and use this to improve Vapnik's generalization bounds, extending them to the case when the sample is made of independent non-identically distributed pairs of patterns and labels. Finally, we review briefly the construction of Support Vector Machines and show how to derive generalization bounds for them, measuring the complexity either through the number of support vectors or through the value of the transductive or inductive margin.

369 citations
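The Gibbs posterior at the heart of this construction is simple to compute for a finite classifier family: reweight the prior by an exponential of the empirical error, with the inverse temperature β controlling how sharply the posterior concentrates. A small illustrative sketch:

```python
# Gibbs posterior sketch over a finite classifier family: posterior weights are
# proportional to prior * exp(-beta * n * empirical_error); beta plays the role
# of the inverse temperature discussed in the monograph.
import numpy as np

def gibbs_posterior(emp_errors, prior, beta, n):
    logw = np.log(prior) - beta * n * emp_errors
    w = np.exp(logw - logw.max())       # stabilized softmax
    return w / w.sum()

emp_errors = np.array([0.30, 0.25, 0.24, 0.40])   # empirical error rates
prior = np.full(4, 0.25)
for beta in (0.1, 1.0, 10.0):
    print(beta, np.round(gibbs_posterior(emp_errors, prior, beta, n=100), 3))
# low beta ≈ prior; high beta concentrates on the empirical-risk minimizer
```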


Journal ArticleDOI
TL;DR: A simple, novel, and general method for approximating the sum of independent or arbitrarily correlated lognormal random variables (RV) by a single lognormal RV, without the extremely precise numerical computations at a large number of points that were required by the previously proposed methods.
Abstract: A simple, novel, and general method is presented in this paper for approximating the sum of independent or arbitrarily correlated lognormal random variables (RV) by a single lognormal RV. The method is also shown to be applicable for approximating the sum of lognormal-Rice and Suzuki RVs by a single lognormal RV. A sum consisting of a mixture of the above distributions can also be easily handled. The method uses the moment generating function (MGF) as a tool in the approximation and does so without the extremely precise numerical computations at a large number of points that were required by the previously proposed methods in the literature. Unlike popular approximation methods such as the Fenton-Wilkinson method and the Schwartz-Yeh method, which have their own respective shortcomings, the proposed method provides the parametric flexibility to accurately approximate different portions of the lognormal sum distribution. The accuracy of the method is measured both visually, as has been done in the literature, as well as quantitatively, using curve-fitting metrics. An upper bound on the sensitivity of the method is also provided.

356 citations
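The MGF-matching idea can be sketched directly: evaluate each lognormal's MGF Ψ(s) = E[exp(−sX)] by Gauss-Hermite quadrature, and choose the single lognormal whose MGF agrees with the product of the summands' MGFs at two points. The matching points s1, s2 below are illustrative choices, not the paper's recommended values:

```python
# MGF-matching sketch: approximate a sum of independent lognormals by a single
# lognormal whose MGF agrees with the product of the summands' MGFs at two
# points on the real axis. Gauss-Hermite quadrature evaluates the lognormal MGF.
import numpy as np
from scipy.optimize import fsolve

z, w = np.polynomial.hermite.hermgauss(40)      # weight exp(-z^2)

def lognormal_mgf(s, mu, sigma):
    # Psi(s) = E[exp(-s * exp(mu + sigma*Z))], Z ~ N(0,1)
    return (w * np.exp(-s * np.exp(mu + sigma * np.sqrt(2) * z))).sum() / np.sqrt(np.pi)

mus, sigmas = [0.0, 0.5], [0.6, 0.9]            # summands' parameters
def sum_mgf(s):
    return np.prod([lognormal_mgf(s, m, sg) for m, sg in zip(mus, sigmas)])

s1, s2 = 0.2, 1.0                               # illustrative matching points
def equations(p):
    mu, sigma = p
    return [lognormal_mgf(s1, mu, sigma) - sum_mgf(s1),
            lognormal_mgf(s2, mu, sigma) - sum_mgf(s2)]

mu_hat, sigma_hat = fsolve(equations, x0=[1.0, 0.7])
print(mu_hat, sigma_hat)   # parameters of the single approximating lognormal
```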


Journal ArticleDOI
TL;DR: The main conclusion is that, in terms of statistical computations and data analysis, the SP method is better than the ML and IFM methods when the marginal distributions are unknown, which is almost always the case in practice.
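The SP (semiparametric) method's appeal is that unknown margins are replaced by their empirical CDFs before the copula parameter is estimated. A hypothetical sketch for a Gaussian copula, using the normal-scores correlation as a simple stand-in for full pseudo-likelihood maximization:

```python
# Semiparametric (SP) copula sketch: rank-transform each margin to pseudo-
# observations, then estimate the Gaussian copula parameter from the
# normal scores; no parametric assumption on the margins is needed.
import numpy as np
from scipy.stats import norm, rankdata

def sp_gaussian_copula_rho(x, y):
    n = len(x)
    u = rankdata(x) / (n + 1)          # pseudo-observations: empirical CDFs
    v = rankdata(y) / (n + 1)
    return np.corrcoef(norm.ppf(u), norm.ppf(v))[0, 1]

rng = np.random.default_rng(2)
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=2000)
x, y = np.exp(z[:, 0]), z[:, 1] ** 3   # unknown, non-normal margins
print(sp_gaussian_copula_rho(x, y))    # ≈ 0.6 despite the odd margins
```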

Journal ArticleDOI
TL;DR: In this article, the authors compare the robustness of five widely used techniques, two non-parametric and three parametric, in order: (a) index numbers, (b) data envelopment analysis (DEA), (c) stochastic frontiers, (d) instrumental variables (GMM), and (e) semiparametric estimation.
Abstract: Researchers interested in estimating productivity can choose from an array of methodologies, each with its strengths and weaknesses. We compare the robustness of five widely used techniques, two non-parametric and three parametric, in order: (a) index numbers, (b) data envelopment analysis (DEA), (c) stochastic frontiers, (d) instrumental variables (GMM) and (e) semiparametric estimation. Using simulated samples of firms, we analyze the sensitivity of alternative methods to the way randomness is introduced in the data generating process. Three experiments are considered, introducing randomness via factor price heterogeneity, measurement error and differences in production technology respectively. When measurement error is small, index numbers are excellent for estimating productivity growth and are among the best for estimating productivity levels. DEA excels when technology is heterogeneous and returns to scale are not constant. When measurement or optimization errors are nonnegligible, parametric approaches are preferred. Ranked by the persistence of the productivity differentials between firms (in decreasing order), one should prefer the stochastic frontiers, GMM, or semiparametric estimation methods. The practical relevance of each experiment for applied researchers is discussed explicitly.
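To make one of the five techniques concrete, here is a minimal sketch of (b), input-oriented constant-returns DEA: each firm's efficiency score solves a small linear program asking how far its inputs can be scaled down while a convex cone of observed firms still dominates it. Data are illustrative:

```python
# Input-oriented, constant-returns DEA sketch: min theta subject to
# X'lambda <= theta * x_i,  Y'lambda >= y_i,  lambda >= 0.
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, i):
    """X: (n_firms, n_inputs), Y: (n_firms, n_outputs); theta for firm i."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                  # variables: [theta, lambdas]
    A1 = np.c_[-X[i].reshape(m, 1), X.T]         # input constraints
    A2 = np.c_[np.zeros((s, 1)), -Y.T]           # output constraints
    res = linprog(c, A_ub=np.vstack([A1, A2]),
                  b_ub=np.r_[np.zeros(m), -Y[i]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

X = np.array([[2.0], [4.0], [3.0]])   # one input per firm
Y = np.array([[2.0], [3.0], [3.0]])   # one output per firm
print([round(dea_efficiency(X, Y, i), 3) for i in range(3)])   # [1.0, 0.75, 1.0]
```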

Journal ArticleDOI
TL;DR: The BPM toolbox incorporates information obtained from other modalities as regressors in a voxel-wise analysis, thereby permitting investigation of more sophisticated hypotheses, and has a high degree of integration with the SPM (statistical parametric mapping) software, relying on it for visualization and statistical inference.
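The underlying computation is a regression at every voxel with a second modality as the regressor. A toy numpy sketch of that idea (synthetic data; not the BPM toolbox's actual code):

```python
# Voxel-wise regression sketch of the BPM idea: at every voxel, regress the
# primary modality on a second modality (plus intercept) across subjects.
import numpy as np

rng = np.random.default_rng(3)
n_subj, n_vox = 20, 1000
fmri = rng.normal(size=(n_subj, n_vox))        # primary modality
gray = rng.normal(size=(n_subj, n_vox))        # e.g., gray-matter density
fmri += 0.5 * gray                             # built-in cross-modal coupling

betas = np.empty(n_vox)
for v in range(n_vox):
    Z = np.column_stack([np.ones(n_subj), gray[:, v]])
    b, *_ = np.linalg.lstsq(Z, fmri[:, v], rcond=None)
    betas[v] = b[1]
print(betas.mean())    # ≈ 0.5, the coupling built into the simulation
```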

Journal ArticleDOI
TL;DR: An application of the reduced basis method to Stokes equations in domains with affine parametric dependence is presented; the approach is ideally suited for the repeated and rapid evaluations required in the context of parameter estimation, design, optimization, and real-time control.

Proceedings ArticleDOI
29 Jul 2007
TL;DR: Locally optimal projection (LOP) as discussed by the authors is a local projection operator that does not rely on estimating a local normal, fitting a local plane, or using any other local parametric representation.
Abstract: We introduce a Locally Optimal Projection operator (LOP) for surface approximation from point-set data. The operator is parameterization free, in the sense that it does not rely on estimating a local normal, fitting a local plane, or using any other local parametric representation. Therefore, it can deal with noisy data which clutters the orientation of the points. The method performs well in cases of ambiguous orientation, e.g., if two folds of a surface lie near each other, and other cases of complex geometry in which methods based upon local plane fitting may fail. Although defined by a global minimization problem, the method is effectively local, and it provides a second order approximation to smooth surfaces, hence allowing good surface approximation without using any explicit or implicit approximation space. Furthermore, we show that LOP is highly robust to noise and outliers and demonstrate its effectiveness by applying it to raw scanned data of complex shapes.

Journal ArticleDOI
TL;DR: A novel multivariate exponentially weighted moving average monitoring scheme for a general linear profile is proposed, with two enhancement features to improve its performance, and is illustrated with a deep reactive ion etching example whose profile fits a quadratic polynomial regression model well.
Abstract: We propose a statistical process control scheme that can be implemented in industrial practice, in which the quality of a process can be characterized by a general linear profile. We start by reviewing the general linear profile model and the existing monitoring methods. Based on this, we propose a novel multivariate exponentially weighted moving average monitoring scheme for such a profile. We introduce two other enhancement features, the variable sampling interval and the parametric diagnostic approach, to further improve the performance of the proposed scheme. Throughout the article, we use a deep reactive ion etching example from semiconductor manufacturing, which has a profile that fits a quadratic polynomial regression model well, to illustrate the implementation of the proposed approach.
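A hedged sketch of the charting idea: fit the profile model to each incoming profile, smooth the deviations of the estimated coefficients with a multivariate EWMA, and signal when the resulting T² exceeds a limit. The covariance and control limit below are illustrative, not the calibrated values from the article:

```python
# MEWMA-on-coefficients sketch: fit the quadratic profile model for each new
# profile, smooth the coefficient deviations with an EWMA, and chart T^2.
import numpy as np

lam, h = 0.2, 11.0                               # smoothing and control limit
x = np.linspace(0, 1, 20)
Z = np.column_stack([np.ones_like(x), x, x**2])
beta0 = np.array([1.0, 2.0, -1.0])               # in-control coefficients
Sigma = 0.01 * np.linalg.inv(Z.T @ Z)            # cov of beta-hat, sigma = 0.1
Sz = (lam / (2 - lam)) * Sigma                   # asymptotic EWMA covariance

rng = np.random.default_rng(4)
z = np.zeros(3)
for i in range(30):
    drift = np.array([0, 0, 0.5]) if i >= 15 else 0   # shift after profile 15
    y = Z @ (beta0 + drift) + 0.1 * rng.normal(size=len(x))
    bhat, *_ = np.linalg.lstsq(Z, y, rcond=None)
    z = lam * (bhat - beta0) + (1 - lam) * z
    T2 = z @ np.linalg.solve(Sz, z)
    if T2 > h:
        print("signal at profile", i, "T2 =", round(T2, 1))
        break
```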

Journal ArticleDOI
TL;DR: The robust stabilization method via dynamic surface control (DSC) is proposed for uncertain nonlinear systems with unknown time delays in parametric strict-feedback form, and it is proved that all the signals in the closed-loop system are semiglobally uniformly bounded.
Abstract: The robust stabilization method via the dynamic surface control (DSC) is proposed for uncertain nonlinear systems with unknown time delays in parametric strict-feedback form. That is, the DSC technique is extended to state time delay nonlinear systems with linear parametric uncertainties. The proposed control system can overcome not only the problem of "explosion of complexity" inherent in the backstepping design method but also the uncertainties of the unknown time delays by choosing appropriate Lyapunov-Krasovskii functionals. In addition, we prove that all the signals in the closed-loop system are semiglobally uniformly bounded. Finally, an example is provided to illustrate the effectiveness of the proposed control system.

Journal ArticleDOI
TL;DR: The method, based on an Offline–Online strategy relevant in the reduced basis many-query and real-time context, reduces the Online calculation to a small Linear Program: the objective is a parametric expansion of the underlying Rayleigh quotient; the constraints reflect stability information at optimally selected parameter points.

Journal ArticleDOI
TL;DR: Parametric interaction of counterpropagating photons has the unique property of automatically establishing distributed feedback and thus realizing novel sources of coherent and tunable radiation.
Abstract: Parametric interaction of counterpropagating photons has the unique property of automatically establishing distributed feedback and thus realizing novel sources of coherent and tunable radiation, s ...

Journal ArticleDOI
TL;DR: In this article, a critical examination of parametric curve fitting through a series of numerical experiments shows that the goodness-of-fit of an approximation may be a poor guide to its genetic significance.

Journal ArticleDOI
TL;DR: A novel parametric and global image histogram thresholding method based on the estimation of the statistical parameters of "object" and "background" classes by the expectation-maximization (EM) algorithm, under the assumption that these two classes follow a generalized Gaussian (GG) distribution.
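A minimal sketch of the EM-based thresholding idea, with plain Gaussian components standing in for the paper's generalized Gaussian classes: fit the two-component mixture by EM, then threshold where the weighted class densities cross:

```python
# EM-thresholding sketch: fit a two-component mixture to the gray levels and
# threshold where the weighted densities cross. Plain Gaussians stand in for
# the paper's generalized Gaussian components.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
pix = np.r_[rng.normal(60, 10, 4000), rng.normal(160, 20, 2000)]  # bg / object

w, mu, sd = np.array([0.5, 0.5]), np.array([50.0, 150.0]), np.array([30.0, 30.0])
for _ in range(50):
    # E-step: posterior responsibility of each class for each pixel
    p = w * norm.pdf(pix[:, None], mu, sd)
    r = p / p.sum(axis=1, keepdims=True)
    # M-step: update weights, means, standard deviations
    w = r.mean(axis=0)
    mu = (r * pix[:, None]).sum(axis=0) / r.sum(axis=0)
    sd = np.sqrt((r * (pix[:, None] - mu) ** 2).sum(axis=0) / r.sum(axis=0))

# threshold = gray level between the class means where the densities cross
g = np.linspace(mu.min(), mu.max(), 1000)
thr = g[np.argmin(np.abs(w[0] * norm.pdf(g, mu[0], sd[0])
                         - w[1] * norm.pdf(g, mu[1], sd[1])))]
print(round(thr, 1))   # lands between the two modes
```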

Journal ArticleDOI
TL;DR: In this paper, a nonparametric stochastic frontier (SF) model is proposed based on local maximum likelihood (LML) for estimating the efficiency of a production process.

Journal ArticleDOI
TL;DR: An approximate spatial correlation model for clustered multiple-input multiple-output (MIMO) channels is proposed and used to show that the proposed model is a good fit to the existing parametric models for low angle spreads (i.e., smaller than 10°).
Abstract: An approximate spatial correlation model for clustered multiple-input multiple-output (MIMO) channels is proposed in this paper. The two ingredients for the model are an approximation for uniform linear and circular arrays to avoid numerical integrals and a closed-form expression for the correlation coefficients that is derived for the Laplacian azimuth angle distribution. A new performance metric to compare parametric and nonparametric channel models is proposed and used to show that the proposed model is a good fit to the existing parametric models for low angle spreads (i.e., smaller than 10°). A computational-complexity analysis shows that the proposed method is a numerically efficient way of generating the spatially correlated MIMO channels.
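For reference, this is the numerical-integration baseline that the proposed closed form avoids: the correlation between two uniform-linear-array elements Δ wavelengths apart under a truncated Laplacian azimuth distribution. Parameter values are illustrative:

```python
# Numerical-integration baseline replaced by the paper's closed form: the
# correlation of two ULA elements spaced delta wavelengths apart, for a
# Laplacian azimuth distribution with mean AoA theta0 and spread sigma.
import numpy as np
from scipy.integrate import quad

def ula_correlation(delta, theta0, sigma):
    # truncated Laplacian pdf on [-pi, pi] around theta0
    norm_c, _ = quad(lambda t: np.exp(-np.sqrt(2) * abs(t) / sigma), -np.pi, np.pi)
    pdf = lambda t: np.exp(-np.sqrt(2) * abs(t) / sigma) / norm_c
    phase = lambda t: 2 * np.pi * delta * np.sin(theta0 + t)
    re, _ = quad(lambda t: np.cos(phase(t)) * pdf(t), -np.pi, np.pi)
    im, _ = quad(lambda t: np.sin(phase(t)) * pdf(t), -np.pi, np.pi)
    return re + 1j * im

print(abs(ula_correlation(0.5, np.deg2rad(30), np.deg2rad(5))))  # high at 5° spread
```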

Journal ArticleDOI
Ioana Popescu1
TL;DR: It is proved that for a general class of objective functions, the robust solutions amount to solving a certain deterministic parametric quadratic program; a general projection property for multivariate distributions with given means and covariances is also established.
Abstract: We provide a method for deriving robust solutions to certain stochastic optimization problems, based on mean-covariance information about the distributions underlying the uncertain vector of returns. We prove that for a general class of objective functions, the robust solutions amount to solving a certain deterministic parametric quadratic program. We first prove a general projection property for multivariate distributions with given means and covariances, which reduces our problem to optimizing a univariate mean-variance robust objective. This allows us to use known univariate results in the multidimensional setting, and to add new results in this direction. In particular, we characterize a general class of objective functions (the so-called one- or two-point support functions), for which the robust objective is reduced to a deterministic optimization problem in one variable. Finally, we adapt a result from Geoffrion (1967a) to reduce the main problem to a parametric quadratic program. In particular, our results are true for increasing concave utilities with convex or concave-convex derivatives. Closed-form solutions are obtained for special discontinuous criteria, motivated by bonus- and commission-based incentive schemes for portfolio management. We also investigate a multiproduct pricing application, which motivates extensions of our results for the case of nonnegative and decision-dependent returns.
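A worked toy instance of the reduction: with only mean-covariance information, a robust mean-risk portfolio choice takes the deterministic form max_w μᵀw − κ·sqrt(wᵀΣw). The data and κ below are illustrative, and a general-purpose solver stands in for the paper's parametric quadratic programming:

```python
# Mean-covariance robust portfolio sketch: maximize mu'w - kappa*sqrt(w'Sigma w)
# over the simplex, a deterministic problem once means and covariances are fixed.
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.08, 0.05, 0.03])
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.02, 0.00],
                  [0.00, 0.00, 0.01]])
kappa = 1.5                                      # illustrative risk aversion

def neg_robust(w):
    return -(mu @ w - kappa * np.sqrt(w @ Sigma @ w))

cons = ({'type': 'eq', 'fun': lambda w: w.sum() - 1},)
res = minimize(neg_robust, x0=np.full(3, 1/3), constraints=cons,
               bounds=[(0, 1)] * 3)
print(np.round(res.x, 3))                        # robust portfolio weights
```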

01 Jan 2007
TL;DR: In this article, the authors present an approach to the construction of lower bounds for the coercivity and inf-sup stability constants required in a posteriori error analysis of reduced basis approximations to affinely parametrized partial differential equations.
Abstract: We present an approach to the construction of lower bounds for the coercivity and inf–sup stability constants required in a posteriori error analysis of reduced basis approximations to affinely parametrized partial differential equations. The method, based on an Offline–Online strategy relevant in the reduced basis many-query and real-time context, reduces the Online calculation to a small Linear Program: the objective is a parametric expansion of the underlying Rayleigh quotient; the constraints reflect stability information at optimally selected parameter points. Numerical results are presented for coercive elasticity and non-coercive
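A structural sketch of the Online linear program, with synthetic placeholders for the affine expansion θ(μ), the sampled parameter points, and the box constraints; the real method derives these from the operator and an optimal sampling procedure:

```python
# Structural sketch of the Online LP: the lower bound at a new parameter mu is
# min over y of theta(mu).y, subject to the expansion reproducing known
# stability constants at previously selected parameter points, plus box bounds.
import numpy as np
from scipy.optimize import linprog

def theta(mu):                       # placeholder affine expansion coefficients
    return np.array([1.0, mu, mu ** 2])

mu_train = [0.1, 0.5, 0.9]
alpha_train = [0.8, 0.5, 0.3]        # "true" constants at the sample points
lb, ub = -2.0, 2.0                   # placeholder box constraints on y

def alpha_lower_bound(mu):
    A_ub = -np.array([theta(m) for m in mu_train])   # theta(m).y >= alpha(m)
    b_ub = -np.array(alpha_train)
    res = linprog(theta(mu), A_ub=A_ub, b_ub=b_ub, bounds=[(lb, ub)] * 3)
    return res.fun

print(alpha_lower_bound(0.3))        # Online cost: one small LP per query
```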

Proceedings ArticleDOI
30 Apr 2007
TL;DR: This paper presents an example-based motion synthesis technique that generates continuous streams of high-fidelity, controllable motion for interactive applications, such as video games, through a new data structure called a parametric motion graph.
Abstract: In this paper, we present an example-based motion synthesis technique that generates continuous streams of high-fidelity, controllable motion for interactive applications, such as video games. Our method uses a new data structure called a parametric motion graph to describe valid ways of generating linear blend transitions between motion clips dynamically generated through parametric synthesis in realtime. Our system specifically uses blending-based parametric synthesis to accurately generate any motion clip from an entire space of motions by blending together examples from that space. The key to our technique is using sampling methods to identify and represent good transitions between these spaces of motion parameterized by a continuously valued parameter. This approach allows parametric motion graphs to be constructed with little user effort. Because parametric motion graphs organize all motions of a particular type, such as reaching to different locations on a shelf, using a single, parameterized graph node, they are highly structured, facilitating fast decision-making for interactive character control. We have successfully created interactive characters that perform sequences of requested actions, such as cartwheeling or punching.
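The blending-based synthesis step at the core of a parametric motion graph node can be sketched simply: given example clips annotated with parameter values, compute blend weights for a requested parameter and blend the poses. Inverse-distance weights below are a simple stand-in for the paper's sampling-based construction:

```python
# Blending-based parametric synthesis sketch: to hit a requested parameter
# value (say, reach height), blend example clips weighted by how close their
# parameter annotations are. Poses are flat joint-value arrays for brevity.
import numpy as np

def blend_weights(example_params, target, eps=1e-8):
    d = np.abs(np.asarray(example_params, float) - target) + eps
    w = 1.0 / d
    return w / w.sum()

# three example "reach" clips annotated with reach height (m); each pose is a
# flat vector of joint values, here just 4 numbers per clip for illustration
params = [0.5, 1.0, 1.5]
poses = np.array([[0.1, 0.2, 0.0, 0.3],
                  [0.2, 0.5, 0.1, 0.6],
                  [0.3, 0.9, 0.2, 1.0]])

w = blend_weights(params, target=1.2)
print(np.round(w, 2), np.round(w @ poses, 3))   # blended pose for a 1.2 m reach
```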

Journal ArticleDOI
TL;DR: Self-tuning experience weighted attraction does as well as EWA in predicting behavior in new games, even though it has fewer parameters, and fits reliably better than the QRE equilibrium benchmark.

Journal ArticleDOI
TL;DR: The paper presents the results from applying the nonparametric model and comparing it to the original model estimated using a conventional parametric random effects model, and discusses the implications for future applications of the SF-6D and further work in this field.