Author

Walter Philipp

Bio: Walter Philipp is an academic researcher from University of Illinois at Urbana–Champaign. The author has contributed to research in topics: Random variable & Invariance principle. The author has an h-index of 34, co-authored 93 publications receiving 3848 citations. Previous affiliations of Walter Philipp include University of North Carolina at Chapel Hill & Urbana University.


Papers
Journal ArticleDOI
TL;DR: In this paper, approximation theorems of the following type are proved: if the conditional characteristic function of each $X_k$ given its past is close to that of a distribution $G_k$ with small tails, then the $X_k$ can be closely coupled with independent $Y_k$ distributed according to $G_k$; as applications, almost sure invariance principles are obtained for sums of independent identically distributed random variables with values in $\mathbb{R}^d$ and for sums of $\phi$-mixing random variables with a logarithmic mixing rate.
Abstract: In this paper we prove approximation theorems of the following type. Let $\{X_k, k \geqslant 1\}$ be a sequence of random variables with values in $\mathbb{R}^{d_k}, d_k \geqslant 1$ and let $\{G_k, k \geqslant 1\}$ be a sequence of probability distributions on $\mathbb{R}^{d_k}$ with characteristic functions $g_k$ respectively. If for each $k \geqslant 1$ the conditional characteristic function of $X_k$ given $X_1, \cdots, X_{k - 1}$ is close to $g_k$ and if $G_k$ has small tails, then there exists a sequence of independent random variables $Y_k$ with distribution $G_k$ such that $|X_k - Y_k|$ is small with large probability. As an application we prove almost sure invariance principles for sums of independent identically distributed random variables with values in $\mathbb{R}^d$ and for sums of $\phi$-mixing random variables with a logarithmic mixing rate.
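The coupling conclusion can be illustrated in one dimension by a quantile coupling, which is a standard device and not the paper's actual construction: realize $X$ and $Y$ from the same uniform variate via their quantile functions, so that $Y$ carries the target distribution exactly and $|X - Y|$ is small whenever the two distribution functions are close. In this hedged sketch, $X$ is a standardized Binomial sum and the target $G$ is standard normal; all names and the choice of $n$ are illustrative.

```python
import math
from statistics import NormalDist

def binom_quantile(n, u):
    """Smallest k with P(Binomial(n, 1/2) <= k) >= u, by scanning the CDF."""
    acc = 0.0
    for k in range(n + 1):
        acc += math.comb(n, k) * 0.5 ** n
        if acc >= u:
            return k
    return n

def coupled_pair(n, u):
    """Quantile coupling: X is a standardized Binomial(n, 1/2) variate and
    Y a standard normal variate, both read off the same uniform u.
    Y has exactly the normal law, and |X - Y| shrinks as the two
    distribution functions approach each other (here via the CLT)."""
    k = binom_quantile(n, u)
    x = (k - n / 2) / math.sqrt(n / 4)
    y = NormalDist().inv_cdf(u)
    return x, y

x, y = coupled_pair(400, 0.3)  # the two draws nearly coincide for large n
```

The same monotone-in-$u$ construction underlies many almost sure approximation arguments: matching quantiles keeps the pair as close as the distributions allow.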

315 citations

Book
19 Aug 2002
TL;DR: The authors provide a survey of classical and modern techniques in the study of empirical processes of dependent data, supply necessary technical tools such as correlation and moment inequalities, and prove central limit theorems for partial sums.
Abstract: Let $(X_k)_{k \geqslant 1}$ be a sequence of random variables with common distribution function $F(x) = P(X_1 \leqslant x)$. Define the empirical distribution function $$F_n(x) = \frac{1}{n}\#\{1 \leqslant i \leqslant n : X_i \leqslant x\},$$ and the empirical process by $\sqrt{n}(F_n(x) - F(x))$. In this chapter we provide a survey of classical as well as modern techniques in the study of empirical processes of dependent data. We begin with a sketch of the early roots of the field in the theory of uniform distribution mod 1, of sequences defined by $X_k = \{n_k \omega\}$, $\omega \in [0, 1]$, dating back to Weyl's celebrated 1916 paper. In the second section we provide the essential tools of empirical process theory, and we prove Donsker's classical empirical process invariance principle for i.i.d. processes. The third section provides an introduction to the subject of weakly dependent random variables. We introduce a variety of mixing concepts, provide necessary technical tools like correlation and moment inequalities, and prove central limit theorems for partial sums. The empirical process of weakly dependent data is investigated in the fourth section, where we put special emphasis on almost sure approximation techniques. The fifth section is devoted to the empirical distribution of U-statistics, defined as $$U_n(x) = \binom{n}{2}^{-1}\#\{1 \leqslant i < j \leqslant n : h(X_i, X_j) \leqslant x\}$$ for some symmetric kernel $h$. We give some applications, e.g., to dimension estimation in the analysis of time series, and prove weak convergence of the corresponding empirical process. Empirical processes of long-range dependent data are the topic of the sixth section. We give an introduction to the area of long-range dependent processes, provide important technical tools for the study of their partial sums and investigate the limit behavior of the empirical process.
It turns out that the limit process is of a completely different type from that in the case of independent or weakly dependent data, and that this has important consequences for various functionals of the empirical process. The final section is devoted to pair correlations, i.e., U-statistics empirical processes over short intervals associated with the kernel $h(x, y) = |x - y|$.
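The objects defined above are directly computable. A minimal sketch, using the uniform distribution on $[0,1]$ (so $F(x) = x$) as an illustrative special case:

```python
import math
import random

def empirical_process(sample, x, cdf):
    """Value of the empirical process sqrt(n) * (F_n(x) - F(x)),
    where F_n is the empirical distribution function of the sample
    and cdf is the true distribution function F."""
    n = len(sample)
    fn = sum(1 for s in sample if s <= x) / n
    return math.sqrt(n) * (fn - cdf(x))

rng = random.Random(0)
sample = [rng.random() for _ in range(1000)]  # Uniform(0,1), so F(x) = x on [0,1]
z = empirical_process(sample, 0.5, lambda t: t)
```

By Donsker's theorem, for i.i.d. data the value at a fixed $x$ is asymptotically normal with variance $F(x)(1 - F(x))$; for the dependent data studied in this chapter both the normalization and the limit can differ.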

254 citations

Journal ArticleDOI
TL;DR: A new proof of the almost sure central limit theorem is given, based on an almost sure invariance principle; the approach extends to weakly dependent random variables.

238 citations

Journal ArticleDOI
TL;DR: In this paper, an approximation of partial sums of $\phi$-mixing random variables with values in a Banach space $B$ by a $B$-valued Brownian motion is obtained, which yields the compact as well as the functional law of the iterated logarithm for these sums.
Abstract: The approximation of partial sums of $\phi$-mixing random variables with values in a Banach space $B$ by a $B$-valued Brownian motion is obtained. This result yields the compact as well as the functional law of the iterated logarithm for these sums. As an application we strengthen a uniform law of the iterated logarithm for classes of functions recently obtained by Kaufman and Philipp (1978). As byproducts we obtain necessary and sufficient conditions for an almost sure invariance principle for independent identically distributed $B$-valued random variables and an almost sure invariance principle for sums of $d$-dimensional random vectors satisfying a strong mixing condition.
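The compact law of the iterated logarithm can be eyeballed by simulation in the real-valued i.i.d. special case: the normalized sums $S_n / \sqrt{2 n \log\log n}$ should fluctuate inside $[-1, 1]$ in the limit, with limsup equal to 1 almost surely. A hedged sketch; the seed and horizon are arbitrary choices, and a finite path only suggests the limiting behavior.

```python
import math
import random

# Simulate partial sums S_n of i.i.d. standard normals and track the
# LIL ratio S_n / sqrt(2 n log log n) for n >= 10 (so log log n > 0).
rng = random.Random(42)
s, ratios = 0.0, []
for n in range(1, 100_001):
    s += rng.gauss(0, 1)
    if n >= 10:
        ratios.append(s / math.sqrt(2 * n * math.log(math.log(n))))

peak = max(abs(r) for r in ratios)  # should hover near 1, not drift off
```

The Banach-space result above says much more: the whole rescaled path clusters, almost surely, in the unit ball of the reproducing kernel Hilbert space of the limiting Brownian motion.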

174 citations


Cited by
Journal ArticleDOI
TL;DR: A review of P. Billingsley's monograph Convergence of Probability Measures (Wiley, 1968), a standard reference on weak convergence of probability measures.
Abstract: Convergence of Probability Measures. By P. Billingsley. Chichester, Sussex, Wiley, 1968. xii, 253 p. 9 1/4″. 117s.

5,689 citations

Journal ArticleDOI
TL;DR: In this article, the authors extend the jackknife and the bootstrap method of estimating standard errors to the case where the observations form a general stationary sequence, and they show that consistency is obtained if $l = l(n) \rightarrow \infty$ and $l(n)/n \rightarrow 0$.
Abstract: We extend the jackknife and the bootstrap method of estimating standard errors to the case where the observations form a general stationary sequence. We do not attempt a reduction to i.i.d. values. The jackknife calculates the sample variance of replicates of the statistic obtained by omitting each block of $l$ consecutive data once. In the case of the arithmetic mean this is shown to be equivalent to a weighted covariance estimate of the spectral density of the observations at zero. Under appropriate conditions consistency is obtained if $l = l(n) \rightarrow \infty$ and $l(n)/n \rightarrow 0$. General statistics are approximated by an arithmetic mean. In regular cases this approximation determines the asymptotic behavior. Bootstrap replicates are constructed by selecting blocks of length $l$ randomly with replacement among the blocks of observations. The procedures are illustrated by using the sunspot numbers and some simulated data.
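The block-resampling scheme described above can be sketched in a few lines. In this hedged sketch, `block_len` plays the role of $l$; the AR(1) test series, sample sizes, and seeds are illustrative assumptions, not part of the paper.

```python
import random
import statistics

def block_bootstrap_se(x, block_len, n_boot=500, seed=0):
    """Estimate the standard error of the sample mean of a stationary
    series x by the block bootstrap: draw overlapping blocks of length
    block_len with replacement until a series of length ~len(x) is
    rebuilt, then take the spread of the replicate means."""
    rng = random.Random(seed)
    n = len(x)
    blocks = [x[i:i + block_len] for i in range(n - block_len + 1)]
    reps = []
    for _ in range(n_boot):
        sample = []
        while len(sample) < n:
            sample.extend(rng.choice(blocks))
        reps.append(statistics.fmean(sample[:n]))
    return statistics.stdev(reps)

# Toy stationary sequence: an AR(1) path with positive correlation,
# where an i.i.d. bootstrap would understate the variance of the mean.
rng = random.Random(1)
x, prev = [], 0.0
for _ in range(400):
    prev = 0.5 * prev + rng.gauss(0, 1)
    x.append(prev)

se = block_bootstrap_se(x, block_len=20)
```

Consistency requires letting the block length grow with the sample size while $l(n)/n \rightarrow 0$, exactly as in the TL;DR above; a fixed `block_len` is only adequate for a fixed illustration.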

2,185 citations

01 Jan 1996

1,282 citations

01 Jan 1972
TL;DR: In this article, it is shown that for stationary sequences with exponentially decreasing dependence the error in the normal approximation is of the order of $n^{-1/2}(\log n)^2$, and that when this assumption is strengthened to $m$-dependence the error is of the order of $n^{-1/2}$.
Abstract: This paper has two aims, one fairly concrete and the other more abstract. In Section 3, bounds are obtained under certain conditions for the departure of the distribution of the sum of $n$ terms of a stationary random sequence from a normal distribution. These bounds are derived from a more abstract normal approximation theorem proved in Section 2. I regret that, in order to complete this paper in time for publication, I have been forced to submit it with many defects remaining. In particular the proof of the concrete results of Section 3 is somewhat incomplete. A well known theorem of A. Berry [1] and C.-G. Esseen [2] asserts that if $X_1, X_2, \ldots$ is a sequence of independent identically distributed random variables with $EX_i = 0$, $EX_i^2 = 1$, and $\beta = E|X_i|^3 < \infty$, then the cumulative distribution function of $(1/\sqrt{n})\sum_{i=1}^n X_i$ differs from the unit normal distribution by at most $K\beta/\sqrt{n}$, where $K$ is a constant, which can be taken to be 2. It seems likely, but has never been proved and will not be proved here, that a similar result holds for stationary sequences in which the dependence falls off sufficiently rapidly and the variance of $(1/\sqrt{n})\sum_{i=1}^n X_i$ approaches a positive constant. I. Ibragimov and Yu. Linnik ([3], pp. 423-432) prove that, under these conditions, the limiting distribution of $(1/\sqrt{n})\sum X_i$ is normal with mean 0 and a certain variance $\sigma^2$. Perhaps the best published results on bounds for the error are those of Philipp [5], who shows that if in addition the $X_i$ are bounded, with exponentially decreasing dependence, then the discrepancy is roughly of the order of $n^{-1/4}$. In Corollary 3.2 of the present paper it is proved that under these conditions the discrepancy is of the order of $n^{-1/2}(\log n)^2$. Actually the assumption of boundedness is weakened to the finiteness of eighth moments.
In Corollary 3.1 it is proved that if the assumption of exponential decrease of dependence is strengthened to $m$-dependence, the error in the normal approximation is of the order of $n^{-1/2}$. The abstract normal approximation theorem of Section 2 is elementary in the sense that it uses only the basic properties of conditional expectation and the elements of analysis, including the solution of a first order linear differential equation. It is also direct, in the sense that the expectation of a fairly arbitrary
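The Berry-Esseen bound quoted above can be checked exactly for symmetric $\pm 1$ coin flips, where $EX_i = 0$, $EX_i^2 = 1$, and $\beta = E|X_i|^3 = 1$. A small sketch; the choice of coin flips and of $n$ is illustrative, and $K = 2$ is the constant mentioned in the abstract.

```python
import math
from statistics import NormalDist

def berry_esseen_gap(n):
    """Exact sup-distance between the CDF of (1/sqrt(n)) * (sum of n
    independent +-1 coin flips) and the standard normal CDF. Since the
    sum is lattice-valued, the supremum is attained at a jump point, so
    it suffices to compare just before and just after each jump."""
    phi = NormalDist().cdf
    gap, acc = 0.0, 0.0
    for k in range(n + 1):
        x = (2 * k - n) / math.sqrt(n)      # standardized value with k heads
        p = math.comb(n, k) * 0.5 ** n      # probability of exactly k heads
        gap = max(gap, abs(acc - phi(x)), abs(acc + p - phi(x)))
        acc += p
    return gap

gap = berry_esseen_gap(100)  # should sit well below K * beta / sqrt(n) = 0.2
```

For lattice summands the true gap is of exact order $n^{-1/2}$, driven by the largest jump of the discrete CDF, so this example also shows why the $n^{-1/2}$ rate cannot be improved in general.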

1,233 citations

BookDOI
01 Jan 2008
TL;DR: This book provides an introduction to empirical processes and semiparametric inference, developing efficient inference for both finite-dimensional and infinite-dimensional parameters.
Abstract: Overview; An Overview of Empirical Processes; Overview of Semiparametric Inference; Case Studies I; Empirical Processes; Introduction to Empirical Processes; Preliminaries for Empirical Processes; Stochastic Convergence; Empirical Process Methods; Entropy Calculations; Bootstrapping Empirical Processes; Additional Empirical Process Results; The Functional Delta Method; Z-Estimators; M-Estimators; Case Studies II; Semiparametric Inference; Introduction to Semiparametric Inference; Preliminaries for Semiparametric Inference; Semiparametric Models and Efficiency; Efficient Inference for Finite-Dimensional Parameters; Efficient Inference for Infinite-Dimensional Parameters; Semiparametric M-Estimation; Case Studies III.

1,141 citations