Journal ISSN: 0894-9840

Journal of Theoretical Probability 

Springer Science+Business Media
About: Journal of Theoretical Probability is an academic journal published by Springer Science+Business Media. The journal publishes mainly in the areas of random walks and central limit theorems. It has the ISSN identifier 0894-9840. Over its lifetime, the journal has published 2003 papers, which have received 30220 citations.


Papers
Journal Article
TL;DR: For simple random walk on an N-vertex graph, the mean time to cover all vertices is at least cN log(N), where c > 0 is an absolute constant; this is deduced from a more general result about stationary finite-state reversible Markov chains.
Abstract: For simple random walk on an N-vertex graph, the mean time to cover all vertices is at least cN log(N), where c > 0 is an absolute constant. This is deduced from a more general result about stationary finite-state reversible Markov chains. Under weak conditions, the covering time for such processes is at least c times the covering time for the corresponding i.i.d. process.
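The lower bound can be checked empirically. Below is a minimal simulation sketch, not taken from the paper, assuming a plain-Python adjacency-list representation and an arbitrarily chosen cycle graph; it compares an empirical mean cover time against N log(N).

```python
import math
import random

def cover_time(adj, start=0, rng=None):
    """Run a simple random walk until every vertex has been visited;
    return the number of steps taken (one sample of the cover time)."""
    rng = rng or random.Random()
    visited = {start}
    v = start
    steps = 0
    while len(visited) < len(adj):
        v = rng.choice(adj[v])
        visited.add(v)
        steps += 1
    return steps

# Cycle on N vertices: vertex i is adjacent to (i - 1) mod N and (i + 1) mod N.
N = 200
adj = [[(i - 1) % N, (i + 1) % N] for i in range(N)]

samples = [cover_time(adj, rng=random.Random(seed)) for seed in range(20)]
mean_cover = sum(samples) / len(samples)
print(f"empirical mean cover time ~ {mean_cover:.0f}")
print(f"N log N                   ~ {N * math.log(N):.0f}")
# For the cycle the cover time is of order N^2, comfortably above the
# universal c*N*log(N) lower bound established in the paper.
```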

942 citations

Journal Article
TL;DR: In this paper, the authors present an interpretation of effective resistance in electrical networks in terms of random walks on the underlying graphs, and use it to give simple and elegant proofs of some known results about random walks and electrical networks.
Abstract: In this article we present an interpretation of effective resistance in electrical networks in terms of random walks on underlying graphs. Using this characterization we provide simple and elegant proofs for some known results in random walks and electrical networks. We also interpret the Reciprocity theorem of electrical networks in terms of traversals in random walks. The byproducts are (a) a precise version of the triangle inequality for effective resistances, and (b) an exact formula for the expected one-way transit time between vertices.
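One concrete instance of the random-walk/electrical-network dictionary is the standard commute-time identity, commute(s, t) = 2|E| · R_eff(s, t). The sketch below is not taken from the article; it assumes numpy and an arbitrary example graph, computes effective resistance from the Laplacian pseudoinverse, and checks it against exact hitting times.

```python
import numpy as np

# A small undirected graph as an edge list (a 4-cycle plus the chord 0-2).
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n, m = 4, len(edges)

# Graph Laplacian L = D - A.
L = np.zeros((n, n))
for u, v in edges:
    L[u, u] += 1; L[v, v] += 1
    L[u, v] -= 1; L[v, u] -= 1
deg = np.diag(L).copy()

# Effective resistance via the Moore-Penrose pseudoinverse of L:
# R_eff(u, v) = L+[u, u] + L+[v, v] - 2 * L+[u, v].
Lp = np.linalg.pinv(L)
def r_eff(u, v):
    return Lp[u, u] + Lp[v, v] - 2 * Lp[u, v]

def hitting_time(s, t):
    """Expected steps of simple random walk from s to t, solved from the
    linear system (D - A) h = deg with the target row/column removed."""
    idx = [u for u in range(n) if u != t]
    h = np.linalg.solve(L[np.ix_(idx, idx)], deg[idx])
    return h[idx.index(s)]

# Commute-time identity: hitting(s, t) + hitting(t, s) = 2|E| * R_eff(s, t).
for s, t in [(0, 3), (1, 3)]:
    commute = hitting_time(s, t) + hitting_time(t, s)
    print(f"commute({s},{t}) = {commute:.4f}   2|E|*R_eff = {2 * m * r_eff(s, t):.4f}")
```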

352 citations

Journal Article
Qi-Man Shao
TL;DR: As discussed by the authors, the comparison theorem on moment inequalities between negatively associated and independent random variables extends the Hoeffding inequality on probability bounds for the sum of a random sample drawn without replacement from a finite population.
Abstract: Let $$\{X_i, 1 \leq i \leq n\}$$ be a negatively associated sequence, and let $$\{X^*_i, 1 \leq i \leq n\}$$ be a sequence of independent random variables such that $$X^*_i$$ and $$X_i$$ have the same distribution for each $$i = 1, 2, \ldots, n$$. It is shown in this paper that $$Ef(\sum_{i=1}^{n} X_i) \leq Ef(\sum_{i=1}^{n} X^*_i)$$ for any convex function $$f$$ on $$\mathbb{R}^1$$, and that $$Ef(\max_{1 \leq k \leq n} \sum_{i=1}^{k} X_i) \leq Ef(\max_{1 \leq k \leq n} \sum_{i=1}^{k} X^*_i)$$ for any increasing convex function $$f$$. Hence, most of the well-known inequalities, such as the Rosenthal maximal inequality and the Kolmogorov exponential inequality, remain true for negatively associated random variables. In particular, the comparison theorem on moment inequalities between negatively associated and independent random variables extends the Hoeffding inequality on the probability bounds for the sum of a random sample without replacement from a finite population.
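The sampling-without-replacement corollary is easy to see numerically. The following Monte Carlo sketch is not from the paper and assumes numpy plus an arbitrary finite population: draws without replacement are negatively associated, draws with replacement play the role of the independent comparison sample, and the comparison theorem predicts that the convex-function expectation is smaller in the first case.

```python
import numpy as np

rng = np.random.default_rng(0)

population = np.arange(20, dtype=float)   # arbitrary finite population
n_draw, trials, lam = 8, 50_000, 0.1      # lam: parameter of the convex test function

def mean_exp_of_sum(replace):
    """Monte Carlo estimate of E[exp(lam * S)] for S = sum of n_draw draws."""
    sums = np.array([rng.choice(population, size=n_draw, replace=replace).sum()
                     for _ in range(trials)])
    return np.exp(lam * sums).mean()

without_repl = mean_exp_of_sum(replace=False)   # negatively associated draws
with_repl = mean_exp_of_sum(replace=True)       # independent draws, same marginals
print(f"E exp(lam*S), without replacement: {without_repl:.2f}")
print(f"E exp(lam*S), with replacement:    {with_repl:.2f}")
# The first estimate should not exceed the second, in line with
# Ef(sum X_i) <= Ef(sum X*_i) for convex f.
```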

322 citations

Book Chapter
Goran Peskir
TL;DR: In this paper, a change-of-variable formula was derived for an Ito diffusion X under weaker conditions on F and applied to free-boundary problems of optimal stopping.
Abstract: Let $$X = (X_t)_{t \geq 0}$$ be a continuous semimartingale and let $$b: \mathbb{R}_+ \rightarrow \mathbb{R}$$ be a continuous function of bounded variation. Setting $$C = \{(t, x) \in \mathbb{R}_+ \times \mathbb{R} \mid x < b(t)\}$$ and $$D = \{(t, x) \in \mathbb{R}_+ \times \mathbb{R} \mid x > b(t)\}$$, suppose that a continuous function $$F: \mathbb{R}_+ \times \mathbb{R} \rightarrow \mathbb{R}$$ is given such that F is $$C^{1,2}$$ on $$\bar{C}$$ and F is $$C^{1,2}$$ on $$\bar{D}$$. Then the following change-of-variable formula holds: $$\eqalign{ F(t,X_t) = F(0,X_0) + \int_0^{t} {1 \over 2} (F_t(s, X_s+) + F_t(s,X_s-))\, ds\cr + \int_0^t {1 \over 2} (F_x(s,X_s+) + F_x(s,X_s-))\, dX_s\cr + {1 \over 2} \int_0^t F_{xx} (s,X_s)\, I (X_s \neq b(s))\, d \langle X, X \rangle_s\cr + {1 \over 2} \int_0^t (F_x(s,X_s+)-F_x(s,X_s-))\, I(X_s = b(s))\, d\ell_{s}^{b} (X),\cr} $$ where $$\ell_{s}^{b}(X)$$ is the local time of X at the curve b given by $$\ell_{s}^{b}(X) = \mathbb{P}\text{-}\lim_{\varepsilon \downarrow 0} {1 \over 2 \varepsilon} \int_0^s I(b(r)- \varepsilon < X_r < b(r) + \varepsilon)\, d \langle X, X \rangle_{r} $$ and $$d\ell_{s}^{b}(X)$$ refers to integration with respect to $$s \mapsto \ell_{s}^{b}(X)$$. A version of the same formula derived for an Itô diffusion X under weaker conditions on F has found applications in free-boundary problems of optimal stopping.
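As a quick illustration of how the formula specializes (this is not part of the abstract), taking b ≡ 0 and F(t, x) = |x| recovers Tanaka's formula; a short LaTeX sketch of that reduction:

```latex
% Special case: b \equiv 0 and F(t,x) = |x|, so that F_t = 0,
% F_x(s,0+) = 1, F_x(s,0-) = -1, and F_{xx}(s,x) = 0 for x \neq 0.
\begin{align*}
|X_t| &= |X_0|
  + \int_0^t \tfrac{1}{2}\bigl(F_x(s,X_s+) + F_x(s,X_s-)\bigr)\,dX_s
  + \tfrac{1}{2}\int_0^t \bigl(F_x(s,X_s+) - F_x(s,X_s-)\bigr) I(X_s = 0)\,d\ell_s^{0}(X) \\
      &= |X_0| + \int_0^t \operatorname{sgn}(X_s)\,dX_s + \ell_t^{0}(X),
\end{align*}
% since the averaged one-sided derivatives give sgn(X_s) (with sgn(0) = 0)
% and the local time \ell^{0}(X) increases only on the set \{X_s = 0\}.
```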

259 citations

Journal Article
TL;DR: In this article, the authors conjecture that the optimal sample size for all distributions with finite fourth moment is O(n) and prove this up to an iterated logarithmic factor.
Abstract: Given a probability distribution in $$\mathbb{R}^n$$ with general (non-white) covariance, a classical estimator of the covariance matrix is the sample covariance matrix obtained from a sample of N independent points. What is the optimal sample size N = N(n) that guarantees estimation with a fixed accuracy in the operator norm? Suppose that the distribution is supported in a centered Euclidean ball of radius $$O(\sqrt{n})$$. We conjecture that the optimal sample size is N = O(n) for all distributions with finite fourth moment, and we prove this up to an iterated logarithmic factor. This problem is motivated by the optimal theorem of Rudelson (J. Funct. Anal. 164:60–72, 1999), which states that N = O(n log n) for distributions with finite second moment, and a recent result of Adamczak et al. (J. Am. Math. Soc. 234:535–561, 2010), which guarantees that N = O(n) for subexponential distributions.
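A minimal numerical sketch of the question being asked, not taken from the paper, assuming numpy and an arbitrarily chosen Gaussian model: draw N = c·n sample points and track the relative operator-norm error of the sample covariance matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 100                                          # ambient dimension
true_cov = np.diag(np.linspace(1.0, 5.0, n))     # an arbitrary non-white covariance

def op_norm_error(N):
    """Relative operator-norm error of the sample covariance from N points."""
    X = rng.multivariate_normal(np.zeros(n), true_cov, size=N)
    sample_cov = X.T @ X / N                     # data are mean zero, so no centering
    return np.linalg.norm(sample_cov - true_cov, 2) / np.linalg.norm(true_cov, 2)

for c in (1, 2, 5, 10):
    print(f"N = {c:>2} * n : relative error ~ {op_norm_error(c * n):.3f}")
# For this (Gaussian, hence subexponential) distribution, N proportional to n
# already gives an error of constant order, consistent with the N = O(n) regime.
```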

228 citations

Performance Metrics
No. of papers from the Journal in previous years
Year    Papers
2023    35
2022    96
2021    144
2020    114
2019    90
2018    85