Journal ArticleDOI

An Adaptive, Rate-Optimal Test of a Parametric Mean-Regression Model Against a Nonparametric Alternative

01 May 2001-Econometrica (Wiley-Blackwell)-Vol. 69, Iss: 3, pp 599-631
TL;DR: In this paper, the authors developed a new test of a parametric model of a conditional mean function against a nonparametric alternative, which adapts to the unknown smoothness of the alternative model and is uniformly consistent against alternatives whose distance from the parametric model converges to zero at the fastest possible rate.
Abstract: We develop a new test of a parametric model of a conditional mean function against a nonparametric alternative. The test adapts to the unknown smoothness of the alternative model and is uniformly consistent against alternatives whose distance from the parametric model converges to zero at the fastest possible rate. This rate is slower than n^(-1/2). Some existing tests have nontrivial power against restricted classes of alternatives whose distance from the parametric model decreases at the rate n^(-1/2). There are, however, sequences of alternatives against which these tests are inconsistent and ours is consistent. As a consequence, there are alternative models for which the finite-sample power of our test greatly exceeds that of existing tests. This conclusion is illustrated by the results of some Monte Carlo experiments.
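The core idea of the test — take the largest studentized smoothing-based statistic over a range of bandwidths and calibrate it by bootstrap — can be sketched as follows. This is a minimal illustration, not the authors' exact statistic: the Gaussian kernel, the crude studentization, the bandwidth grid, and the sign-flip wild bootstrap are all simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data for which the parametric (linear) null model holds.
n = 200
x = rng.uniform(0, 1, n)
y = 1.0 + 2.0 * x + rng.normal(0, 0.5, n)

# Fit the parametric null model and keep the residuals.
X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta

def smooth_stat(e, h):
    """Studentized quadratic form of kernel-smoothed residuals at bandwidth h."""
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)  # Gaussian kernel weights
    np.fill_diagonal(K, 0.0)                                 # drop own-observation terms
    S = e @ K @ e
    V = np.sum((K * np.outer(e, e)) ** 2)                    # crude variance proxy
    return S / np.sqrt(2.0 * V)

# Adaptive statistic: the largest studentized statistic over a bandwidth range.
bandwidths = [0.05, 0.1, 0.2, 0.4]
def max_stat(e):
    return max(smooth_stat(e, h) for h in bandwidths)

T = max_stat(resid)

# Wild-bootstrap critical value: flip residual signs at random, recompute the max.
B = 200
boot = np.array([max_stat(resid * rng.choice([-1.0, 1.0], n)) for _ in range(B)])
crit = np.quantile(boot, 0.95)
reject = bool(T > crit)
```

Taking the maximum over bandwidths is what makes the test adaptive: no single smoothing parameter has to be chosen in advance, and the bootstrap accounts for the maximization when setting the critical value.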
Citations
Journal ArticleDOI
TL;DR: In this paper, a nonparametric test for Granger non-causality was proposed to avoid the over-rejection observed with the frequently used test of Hiemstra and Jones [1994].

794 citations


Cites methods from "An Adaptive, Rate-Optimal Test of a..."

  • ...For example, as in Horowitz and Spokoiny (2001) one might consider a new test statistic which is the largest value of the standardized and studentized test statistics for a range of bandwidths, and use bootstrap methods to assess the critical values....


Posted Content
Xiaohong Chen
TL;DR: The method of sieves as discussed by the authors can be used to estimate semi-nonparametric econometric models with various constraints, such as monotonicity, convexity, additivity, multiplicity, exclusion and nonnegativity.
Abstract: Often researchers find parametric models restrictive and sensitive to deviations from the parametric specifications; semi-nonparametric models are more flexible and robust, but lead to other complications such as introducing infinite-dimensional parameter spaces that may not be compact and the optimization problem may no longer be well-posed. The method of sieves provides one way to tackle such difficulties by optimizing an empirical criterion over a sequence of approximating parameter spaces (i.e., sieves); the sieves are less complex but are dense in the original space and the resulting optimization problem becomes well-posed. With different choices of criteria and sieves, the method of sieves is very flexible in estimating complicated semi-nonparametric models with (or without) endogeneity and latent heterogeneity. It can easily incorporate prior information and constraints, often derived from economic theory, such as monotonicity, convexity, additivity, multiplicity, exclusion and nonnegativity. It can simultaneously estimate the parametric and nonparametric parts in semi-nonparametric models, typically with optimal convergence rates for both parts. This chapter describes estimation of semi-nonparametric econometric models via the method of sieves. We present some general results on the large sample properties of the sieve estimates, including consistency of the sieve extremum estimates, convergence rates of the sieve M-estimates, pointwise normality of series estimates of regression functions, root-n asymptotic normality and efficiency of sieve estimates of smooth functionals of infinite-dimensional parameters. Examples are used to illustrate the general results.
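The sieve idea — least squares over a finite-dimensional approximating space whose dimension grows with the sample size — can be sketched with a simple polynomial sieve. The power basis, the rule-of-thumb rate for the sieve dimension, and the data-generating process below are illustrative assumptions, not part of the chapter.

```python
import numpy as np

rng = np.random.default_rng(1)

# Data from an unknown regression function g(x) = sin(2*pi*x).
n = 500
x = rng.uniform(0, 1, n)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, n)

def sieve_fit(x, y, J):
    """Least squares over a polynomial sieve: span{1, x, ..., x^J}."""
    basis = np.vander(x, J + 1, increasing=True)
    return np.linalg.lstsq(basis, y, rcond=None)[0]

# Let the sieve dimension grow slowly with n (an illustrative rule of thumb).
J = int(np.ceil(n ** (1 / 5))) + 2
coef = sieve_fit(x, y, J)

# Evaluate the sieve estimate on a grid and compare it with the truth.
grid = np.linspace(0, 1, 101)
estimate = np.vander(grid, J + 1, increasing=True) @ coef
err = float(np.max(np.abs(estimate - np.sin(2 * np.pi * grid))))
```

The optimization is well-posed at each sample size because the sieve space is finite-dimensional, while letting J grow with n lets the estimate approximate any sufficiently smooth regression function.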

654 citations

Posted Content
TL;DR: In this paper, a procedure is proposed for estimating the critical values of the extended Kolmogorov-Smirnov tests of First- and Second-Order Stochastic Dominance in the general K-prospect case.
Abstract: We propose a procedure for estimating the critical values of the extended Kolmogorov-Smirnov tests of First and Second Order Stochastic Dominance in the general K-prospect case. We allow for the observations to be serially dependent and, for the first time, we can accommodate general dependence amongst the prospects which are to be ranked. Also, the prospects may be the residuals from certain conditional models, opening the way for conditional ranking. We also propose a test of Prospect Stochastic Dominance. Our method is subsampling; we show that the resulting tests are consistent and powerful against some N^(-1/2) local alternatives even when computed with a data-based subsample size. We also propose some heuristic methods for selecting subsample size and demonstrate in simulations that they perform reasonably. We show that our test is asymptotically similar on the entire boundary of the null hypothesis, and is unbiased. In comparison, any method based on resampling or simulating from the least favorable distribution does not have these properties and consequently will have less power against some alternatives.
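The subsampling approach to critical values — recompute the full-sample statistic on many small blocks drawn without replacement and use their empirical quantile — can be sketched as follows. The one-sided Kolmogorov-Smirnov form, the fixed subsample size, and the two-prospect i.i.d. setup are simplifying assumptions (the paper allows serial dependence, K prospects, and a data-based subsample size).

```python
import numpy as np

rng = np.random.default_rng(2)

# Two "prospects" drawn from the same distribution (no dominance violation).
n = 300
u = rng.normal(0.0, 1.0, n)
v = rng.normal(0.0, 1.0, n)

def ks_stat(a, b):
    """One-sided Kolmogorov-Smirnov statistic, scaled by sqrt(sample size)."""
    grid = np.sort(np.concatenate([a, b]))
    Fa = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    Fb = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return np.sqrt(len(a)) * np.max(Fa - Fb)

T = ks_stat(u, v)

# Subsampling: recompute the statistic on blocks of size bsize << n drawn
# without replacement, and take an empirical quantile as the critical value.
bsize = 50
stats = []
for _ in range(300):
    idx = rng.choice(n, bsize, replace=False)
    stats.append(ks_stat(u[idx], v[idx]))
crit = np.quantile(stats, 0.95)
reject = bool(T > crit)
```

Subsampling works here where the standard bootstrap can fail, because each subsample statistic is a genuine draw of the statistic at a smaller sample size, whatever the limiting distribution looks like on the boundary of the null.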

406 citations

ReportDOI
TL;DR: It is demonstrated how the Gaussian approximations and the multiplier bootstrap can be used for modern high dimensional estimation, multiple hypothesis testing, and adaptive specification testing.
Abstract: We derive a Gaussian approximation result for the maximum of a sum of high-dimensional random vectors. Specifically, we establish conditions under which the distribution of the maximum is approximated by that of the maximum of a sum of the Gaussian random vectors with the same covariance matrices as the original vectors. This result applies when the dimension of random vectors ($p$) is large compared to the sample size ($n$); in fact, $p$ can be much larger than $n$, without restricting correlations of the coordinates of these vectors. We also show that the distribution of the maximum of a sum of the random vectors with unknown covariance matrices can be consistently estimated by the distribution of the maximum of a sum of the conditional Gaussian random vectors obtained by multiplying the original vectors with i.i.d. Gaussian multipliers. This is the Gaussian multiplier (or wild) bootstrap procedure. Here too, $p$ can be large or even much larger than $n$. These distributional approximations, either Gaussian or conditional Gaussian, yield a high-quality approximation to the distribution of the original maximum, often with approximation error decreasing polynomially in the sample size, and hence are of interest in many applications. We demonstrate how our Gaussian approximations and the multiplier bootstrap can be used for modern high-dimensional estimation, multiple hypothesis testing, and adaptive specification testing. All these results contain nonasymptotic bounds on approximation errors.
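A minimal sketch of the Gaussian multiplier (wild) bootstrap described above, for the maximum coordinate of a normalized sum of high-dimensional vectors with p larger than n; the centering step and the choice of dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# n observations of p-dimensional vectors, with p much larger than n.
n, p = 100, 500
X = rng.normal(0, 1, (n, p))

# Statistic of interest: max coordinate of the normalized column sums.
T = float(np.max(X.sum(axis=0)) / np.sqrt(n))

# Gaussian multiplier (wild) bootstrap: multiply each centered observation
# by an independent N(0, 1) draw and recompute the maximum, many times.
Xc = X - X.mean(axis=0)
B = 500
boot = np.empty(B)
for b in range(B):
    g = rng.normal(0, 1, n)          # i.i.d. Gaussian multipliers
    boot[b] = np.max(g @ Xc) / np.sqrt(n)
crit = float(np.quantile(boot, 0.95))
```

Conditional on the data, each bootstrap draw is exactly Gaussian with the empirical covariance of the observations, which is what lets the approximation work even when p is much larger than n.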

383 citations



Journal ArticleDOI
TL;DR: In this article, a procedure for estimating the critical values of the extended Kolmogorov-Smirnov tests of stochastic dominance of arbitrary order in the general K-prospect case is proposed.
Abstract: We propose a procedure for estimating the critical values of the extended Kolmogorov-Smirnov tests of Stochastic Dominance of arbitrary order in the general K-prospect case. We allow for the observations to be serially dependent and, for the first time, we can accommodate general dependence amongst the prospects which are to be ranked. Also, the prospects may be the residuals from certain conditional models, opening the way for conditional ranking. We also propose a test of Prospect Stochastic Dominance. Our method is based on subsampling and we show that the resulting tests are consistent and powerful against some N^(-1/2) local alternatives. We also propose some heuristic methods for selecting subsample size and demonstrate in simulations that they perform reasonably. We describe an alternative method for obtaining critical values based on recentring the test statistic and using full-sample bootstrap methods. We compare the two methods in theory and in practice.

378 citations

References
Book
01 Jan 2008

1,466 citations

Journal ArticleDOI
TL;DR: In this paper, it was shown that the standard way of bootstrapping the integrated squared difference between a parametric and a nonparametric curve estimate fails, and the wild bootstrap method was applied instead, illustrated by fitting Engel curves in expenditure data analysis.
Abstract: In general, there will be visible differences between a parametric and a nonparametric curve estimate. It is therefore quite natural to compare these in order to decide whether the parametric model could be justified. An asymptotic quantification is the distribution of the integrated squared difference between these curves. We show that the standard way of bootstrapping this statistic fails. We use and analyse a different form of bootstrapping for this task. We call this method the wild bootstrap and apply it to fitting Engel curves in expenditure data analysis.
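The wild bootstrap for this kind of comparison can be sketched as follows: fit the parametric model, form residuals, and regenerate data with the residuals multiplied by Mammen's two-point weights. The linear null model, the Nadaraya-Watson smoother, and the averaged (rather than integrated) squared difference are simplifying assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated data for which the parametric (linear) model is correct.
n = 200
x = np.sort(rng.uniform(0, 1, n))
y = 1.0 + 2.0 * x + rng.normal(0, 0.4, n)

def linear_fit(x, y):
    X = np.column_stack([np.ones_like(x), x])
    return X @ np.linalg.lstsq(X, y, rcond=None)[0]

def kernel_fit(x, y, h=0.1):
    """Nadaraya-Watson estimate evaluated at the design points."""
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    return (K @ y) / K.sum(axis=1)

def isd(x, y):
    """Averaged squared difference between the two curve estimates."""
    return float(np.mean((kernel_fit(x, y) - linear_fit(x, y)) ** 2))

T = isd(x, y)

# Wild bootstrap: rebuild y from the parametric fit plus residuals scaled by
# Mammen's two-point weights, which match the residuals' first three moments.
fit_p = linear_fit(x, y)
resid = y - fit_p
phi = (1 + np.sqrt(5)) / 2
weights = np.array([1 - phi, phi])
probs = np.array([phi / np.sqrt(5), 1 - phi / np.sqrt(5)])

B = 200
boot = np.array([isd(x, fit_p + resid * rng.choice(weights, n, p=probs))
                 for _ in range(B)])
p_value = float(np.mean(boot >= T))
```

Because each bootstrap weight has mean zero, variance one, and third moment one, the resampled errors mimic the conditional distribution of the true errors observation by observation, which is what the standard residual bootstrap gets wrong under heteroskedasticity.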

1,229 citations

Journal ArticleDOI
TL;DR: In this article, the problem of choosing a bandwidth parameter for nonparametric regression is studied: a tapered Fourier series estimate is analyzed, its relationship to a kernel estimate is discussed, and a bandwidth choice based on an unbiased estimate of mean squared error is shown to be asymptotically optimal.
Abstract: This paper is concerned with the problem of choosing a bandwidth parameter for nonparametric regression. We analyze a tapered Fourier series estimate and discuss the relationship of this estimate to a kernel estimate. We first consider a method based on an unbiased estimate of mean square error, and show that the bandwidth thus chosen is asymptotically optimal. Other methods are examined as well and are shown to be asymptotically equivalent. A small simulation shows, however, that for small or moderate sample size, the methods perform quite differently.
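The unbiased-risk idea — pick the bandwidth minimizing RSS/n plus a penalty 2*sigma^2*tr(S_h)/n for the linear smoother matrix S_h — can be sketched with a Gaussian kernel smoother standing in for the paper's tapered Fourier series estimate; the difference-based variance estimator and the bandwidth grid are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

n = 300
x = np.sort(rng.uniform(0, 1, n))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, n)

def smoother_matrix(h):
    """Row-normalized Gaussian kernel weights (a linear smoother)."""
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    return K / K.sum(axis=1, keepdims=True)

# Rough noise-variance estimate from first differences of ordered responses.
sigma2 = float(np.mean(np.diff(y) ** 2) / 2)

def risk(h):
    """Mallows-type unbiased risk estimate: RSS/n + 2*sigma2*tr(S_h)/n."""
    S = smoother_matrix(h)
    rss = np.sum((y - S @ y) ** 2)
    return rss / n + 2 * sigma2 * np.trace(S) / n

grid = [0.01, 0.02, 0.05, 0.1, 0.2]
h_star = min(grid, key=risk)
```

The penalty term corrects the residual sum of squares for the optimism of fitting and evaluating on the same data, so minimizing the criterion tracks the true mean squared error up to a constant.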

674 citations

Book
22 Dec 2012
TL;DR: In this book, the authors present data-driven lack-of-fit tests for general parametric models, with the smoothing parameters chosen from the data rather than specified in advance.
Abstract (table of contents):
1. Introduction
2. Some Basic Ideas of Smoothing
3. Statistical Properties of Smoothers
4. Data-Driven Choice of Smoothing Parameters
5. Classical Lack-of-Fit Tests
6. Lack-of-Fit Tests Based on Linear Smoothers
7. Testing for Association via Automated Order Selection
8. Data-Driven Lack-of-Fit Tests for General Parametric Models
9. Extending the Scope of Application
10. Some Examples
A.2. Bounds for the Distribution of Tcusum
References

519 citations

Journal ArticleDOI
TL;DR: In this paper, the conditional moment test is combined with nonparametric estimation techniques; the test statistic is shown to be asymptotically standard normal under the null hypothesis that the parametric model is correct, while diverging to infinity at a rate arbitrarily close to n under the alternative.

502 citations