Open Access · Journal Article · DOI

Central Limit Theorems for Empirical Measures

Richard M. Dudley
01 Dec 1978 · The Annals of Probability, Vol. 6, Iss. 6, pp. 899-929
TLDR
In this article, the convergence in law of the empirical process $\nu_n$, indexed by a class of sets $\mathscr{C}$, to a certain Gaussian process indexed by $\mathscr{C}$ is studied with respect to the supremum norm; classes for which this convergence holds are called Donsker classes.
Abstract
Let $(X, \mathscr{A}, P)$ be a probability space. Let $X_1, X_2, \cdots$ be independent $X$-valued random variables with distribution $P$. Let $P_n := n^{-1}(\delta_{X_1} + \cdots + \delta_{X_n})$ be the empirical measure and let $\nu_n := n^{1/2}(P_n - P)$. Given a class $\mathscr{C} \subset \mathscr{A}$, we study the convergence in law of $\nu_n$, as a stochastic process indexed by $\mathscr{C}$, to a certain Gaussian process indexed by $\mathscr{C}$. If convergence holds with respect to the supremum norm $\sup_{C \in \mathscr{C}}|f(C)|$, in a suitable (usually nonseparable) function space, we call $\mathscr{C}$ a Donsker class. For measurability, $X$ may be a complete separable metric space, $\mathscr{A} =$ Borel sets, and $\mathscr{C}$ a suitable collection of closed sets or open sets. Then for the Donsker property it suffices that for some $m$, and every set $F \subset X$ with $m$ elements, $\mathscr{C}$ does not cut all subsets of $F$ (Vapnik-Cervonenkis classes). Another sufficient condition is based on metric entropy with inclusion. If $\mathscr{C}$ is a sequence $\{C_m\}$ independent for $P$, then $\mathscr{C}$ is a Donsker class if and only if for some $r$, $\sum_m (P(C_m)(1 - P(C_m)))^r < \infty$.
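To make the abstract's objects concrete, here is a minimal simulation sketch, not taken from the paper: it assumes $X = [0, 1]$, $P$ the uniform distribution, and $\mathscr{C}$ the class of half-intervals $[0, t]$. This class cannot pick out all subsets of any two-point set, so it is a Vapnik-Cervonenkis class and hence Donsker; accordingly, $\sup_{C \in \mathscr{C}} |\nu_n(C)|$ (the Kolmogorov statistic in this special case) stays stochastically bounded as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (an assumption, not from the paper): X = [0, 1],
# P = Uniform(0, 1), and C = the class of half-intervals [0, t].

def nu_n(sample, ts):
    """Empirical process nu_n([0, t]) = sqrt(n) * (P_n([0, t]) - P([0, t]))."""
    n = len(sample)
    P_n = np.array([(sample <= t).mean() for t in ts])  # empirical measure of [0, t]
    P = ts                                              # uniform measure of [0, t] is t
    return np.sqrt(n) * (P_n - P)

def picks_out_all_subsets(points):
    """Do half-intervals [0, t] pick out every subset of this finite point set?

    Half-intervals can only pick out 'prefixes' of the sorted points, so they
    shatter one-point sets but never a set with two or more elements.
    """
    pts = np.sort(np.asarray(points, dtype=float))
    thresholds = np.concatenate((pts - 1e-12, pts + 1e-12))  # just below / above each point
    picked = {tuple(pts <= t) for t in thresholds}
    return len(picked) == 2 ** len(pts)

print("shatters a 1-point set:", picks_out_all_subsets(rng.uniform(size=1)))  # True
print("shatters a 2-point set:", picks_out_all_subsets(rng.uniform(size=2)))  # False

ts = np.linspace(0.0, 1.0, 501)
for n in (100, 1000, 10000):
    sample = rng.uniform(size=n)
    sup_norm = np.max(np.abs(nu_n(sample, ts)))
    # For this class, sup_C |nu_n(C)| converges in law to the sup of a Brownian bridge.
    print(f"n = {n:6d}   sup_C |nu_n(C)| = {sup_norm:.3f}")
```

Running the loop for increasing $n$ shows the supremum fluctuating around values of order one rather than growing with $n$, which is the empirical signature of the Donsker property described in the abstract.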


Citations
Book

The Nature of Statistical Learning Theory

TL;DR: Covers the setting of the learning problem, consistency of learning processes, bounds on the rate of convergence of learning processes, controlling the generalization ability of learning processes, constructing learning algorithms, and what is important in learning theory.
Book

The Probabilistic Method

Joel Spencer
TL;DR: A particular set of problems - all dealing with “good” colorings of an underlying set of points relative to a given family of sets - is explored.
Book

Prediction, learning, and games

TL;DR: In this paper, the authors provide a comprehensive treatment of the problem of predicting individual sequences using expert advice, a general framework within which many related problems can be cast and discussed, such as repeated game playing, adaptive data compression, sequential investment in the stock market, sequential pattern analysis, and several other problems.
Book

A Probabilistic Theory of Pattern Recognition

TL;DR: The Bayes error and Vapnik-Chervonenkis theory are applied as a guide for empirical classifier selection, on the basis of explicit specification and explicit enforcement of the maximum likelihood principle.
Book

Random Fields and Geometry

TL;DR: Random Fields and Geometry, as discussed by the authors, is a comprehensive survey of the general theory of Gaussian random fields, focusing on geometric problems arising in their study, including continuity and boundedness, entropy and majorizing measures, and the Borell and Slepian inequalities.