Open Access Journal Article

Optimum Kernel Estimators of the Mode

William F. Eddy
01 Jul 1980 - Vol. 8, Iss. 4, pp. 870-882
TLDR
In this article, it is shown that for any particular bandwidth sequence the asymptotic mean squared error is minimized by a certain truncated polynomial kernel, and that the rate at which the mean squared error of the estimator converges to zero can be decreased from $n^{-4/7}$ to $n^{-1+\varepsilon}$ for any positive $\varepsilon$.
Abstract
Let $X_1, \cdots, X_n$ be independent observations with common density $f$. A kernel estimate of the mode is any value of $t$ which maximizes the kernel estimate of the density $f_n$. Conditions are given restricting the density, the kernel, and the bandwidth under which this estimate of the mode has an asymptotic normal distribution. By imposing sufficient restrictions, the rate at which the mean squared error of the estimator converges to zero can be decreased from $n^{-\frac{4}{7}}$ to $n^{-1+\varepsilon}$ for any positive $\varepsilon$. Also, by bounding the support of the kernel it is shown that for any particular bandwidth sequence the asymptotic mean squared error is minimized by a certain truncated polynomial kernel.
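As a concrete illustration of the estimator described above, here is a minimal Python sketch, not taken from the paper, that maximizes a kernel density estimate over a grid; the Gaussian kernel, the rule-of-thumb bandwidth, and the grid resolution are illustrative assumptions rather than the optimal choices derived in the article.

```python
import numpy as np

def kernel_mode_estimate(x, h=None, grid_size=1000):
    """Estimate the mode as the maximizer t of a kernel density estimate f_n(t).

    Illustrative sketch only: a Gaussian kernel and a rule-of-thumb bandwidth
    are used here, not the optimal truncated polynomial kernels or bandwidth
    rates derived in the article.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    if h is None:
        h = x.std(ddof=1) * n ** (-1 / 5)  # rule-of-thumb bandwidth (assumption)
    t = np.linspace(x.min() - 3 * h, x.max() + 3 * h, grid_size)
    # f_n(t) = (1 / (n h)) * sum_i K((t - X_i) / h), with K the standard normal density
    u = (t[:, None] - x[None, :]) / h
    f_n = np.exp(-0.5 * u ** 2).sum(axis=1) / (n * h * np.sqrt(2 * np.pi))
    return t[np.argmax(f_n)]

# Example: 500 observations from a density with mode 0
rng = np.random.default_rng(0)
print(kernel_mode_estimate(rng.normal(size=500)))
```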



Citations
Journal Article

The Dip Test of Unimodality

TL;DR: The dip test, as described in this paper, measures multimodality in a sample by the maximum difference, over all sample points, between the empirical distribution function and the unimodal distribution function that minimizes that maximum difference.
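In symbols (an illustration, not a quotation from the paper), the dip of a sample with empirical distribution function $F_n$ is

$$\mathrm{dip}(F_n) = \min_{G \in \mathcal{U}} \sup_x |F_n(x) - G(x)|,$$

where $\mathcal{U}$ denotes the class of unimodal distribution functions.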
Journal Article

Bayesian Mode Regression

TL;DR: A parametric Bayesian model is developed by employing a likelihood function based on a mode-uniform distribution, and it is shown that, irrespective of the original distribution of the data, this special uniform distribution provides a natural and effective basis for Bayesian mode regression.
Journal Article

Kernels for Nonparametric Curve Estimation

TL;DR: In this article, the choice of kernels for nonparametric estimation of regression functions and their derivatives is investigated, and explicit expressions are obtained for kernels minimizing the asymptotic variance or the IMSE, with the optimality of the latter kernels proved up to order $k = 5$.
Journal Article

Change-points in nonparametric regression analysis

TL;DR: In this paper, the authors show that the continuous mapping theorem can be invoked to obtain asymptotic distributions and corresponding rates of convergence for change-point estimators, which are typically faster than $n^{-1/2}$.
Journal Article

Jump and sharp cusp detection by wavelets

TL;DR: In this paper, a method is proposed to detect jumps and sharp cusps in a function which is observed with noise, by checking if the wavelet transformation of the data has significantly large absolute values across fine scale levels.
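As a loose illustration of this idea (not the authors' actual test statistic or thresholds), the Python sketch below uses PyWavelets to compute the finest-scale detail coefficients of a signal and flags locations where they are unusually large; the Haar wavelet, the MAD-based noise estimate, and the universal threshold are assumptions made for the example.

```python
import numpy as np
import pywt  # PyWavelets

def flag_jumps(y, wavelet="haar"):
    """Flag candidate jump locations where finest-scale wavelet coefficients are large.

    Illustrative sketch only: the cited paper's statistic and thresholds differ.
    """
    y = np.asarray(y, dtype=float)
    detail = pywt.wavedec(y, wavelet, level=1)[-1]    # level-1 detail coefficients
    sigma = np.median(np.abs(detail)) / 0.6745        # robust (MAD-based) noise estimate
    threshold = sigma * np.sqrt(2 * np.log(y.size))   # universal threshold
    # Each level-1 coefficient summarizes two consecutive samples; map back to sample index.
    return np.where(np.abs(detail) > threshold)[0] * 2

# Example: a noisy signal with a jump between samples 500 and 501
rng = np.random.default_rng(1)
signal = np.concatenate([np.zeros(501), np.ones(499)]) + 0.1 * rng.standard_normal(1000)
print(flag_jumps(signal))  # expected to include a location near 500
```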