
On large deviations in testing Ornstein–Uhlenbeck-type models

01 Jun 2008-Statistical Inference for Stochastic Processes (Springer Netherlands)-Vol. 11, Iss: 2, pp 143-155
TL;DR: In this paper, the authors obtained exact large deviation rates for the log-likelihood ratio in testing models with observed Ornstein-Uhlenbeck processes and gave explicit rates of decrease for the error probabilities of Neyman-Pearson, Bayes, and minimax tests.

Summary (1 min read)

1 Introduction

  • Asymptotic properties of likelihood ratios play an important role in statistical testing problems.
  • The results in [4] were obtained by using large deviation techniques for sequences of random variables.
  • The results are applied to the investigation of the rates of decrease for error probabilities of the tests mentioned above.
  • Then the initial continuous-time assertions can be obtained as a limiting case of the corresponding results from the discrete-time case by applying the invariance principle.

2 Large deviation theorems and their applications

  • In this section the authors cite some results from Lin’kov [21] - [23] and use his notation.
  • In the rest of the section the authors refer to some results on the asymptotic behavior of the error probabilities for Neyman-Pearson, Bayes, and minimax tests.
  • The following assertion describes the rate of decrease for the error probabilities of the first and second kind αt and β(αt), respectively, for the test δt(αt) under the regularity condition (2.3).
  • Under some other conditions the last equality in (2.19) was proved by Vajda [28].

3 Ornstein-Uhlenbeck models

  • In this section the authors consider a model where the observation process X = (X_t)_{t≥0} satisfies the stochastic differential equation dX_t = −θX_t dt + dW_t.
  • For this, the authors will investigate the asymptotic behavior of the Hellinger integral (3.5) as t → ∞.
  • The question if the derived rate bounds are optimal as well as the second order expansions for log βt(αt) remain as open problems here.

4 Ornstein-Uhlenbeck-type models with delay

  • Then, by means of the Girsanov-type formula (5.1) in [15], the authors get that under the hypothesis H0 the log-likelihood ratio process (2.1) admits an explicit representation; using the arguments in [21; Theorems 3.1.4, 3.2.2], they then describe the asymptotic behavior of the error probabilities for Neyman-Pearson tests.
  • In the rest of the section the authors give some examples of models of the type (4.1) in which condition (4.7) holds.
  • Financial support from the German Research Foundation and the Foundation of Berlin Parliament is gratefully acknowledged.


On large deviations in testing
Ornstein-Uhlenbeck-type models
Pavel V. Gapeev    Uwe Küchler
Statistical Inference for Stochastic Processes (2008) 11(2) (143–155)
We obtain exact large deviation rates for the log-likelihood ratio in testing models
with observed Ornstein-Uhlenbeck processes and get explicit rates of decrease for the
error probabilities of Neyman-Pearson, Bayes, and minimax tests. Moreover, we give
expressions for the rates of decrease for the error probabilities of Neyman-Pearson tests
in models with observed processes solving affine stochastic delay differential equations.
1 Introduction
Asymptotic properties of likelihood ratios play an important role in statistical testing problems.
Sometimes they can be studied by using large deviation results, for example, in the case of
binary statistical experiments. Chernoff [9] proved large deviation theorems for sums of i.i.d.
observations. Bahadur [1]-[3] studied asymptotic efficiency of tests and estimates for observed
sequences of random variables (see also Bahadur, Zabel and Gupta [4]). Birgé [8] applied the
results of [9] to the investigation of the rate of decrease for error probabilities of Neyman-
Pearson tests. Generalizations of the large deviation results to the case of semimartingale
models and their applications are collected in the monograph [21]. Lin’kov [22] proved large
deviation theorems for extended random variables and applied them to the investigation of
general statistical experiments. Exact large deviation rates for the log-likelihood ratio in testing
models with fractional Brownian motion were derived in [23]. In the present paper we derive
an explicit form of large deviation theorems of Chernoff type for the log-likelihood ratio in
testing models with Ornstein-Uhlenbeck processes by applying the large deviation results from
the general continuous-time semimartingale framework of Lin'kov [21].
This research was supported by Deutsche Forschungsgemeinschaft through the SFB 649 Economic Risk.
Weierstraß Institute for Applied Analysis and Stochastics (WIAS), Mohrenstr. 39, D-10117 Berlin, Germany, e-mail: gapeev@wias-berlin.de
Institute of Control Sciences, Russian Academy of Sciences, Profsoyuznaya Str. 65, 117997 Moscow, Russia
Institute of Mathematics, Humboldt University of Berlin, Unter den Linden 6, D-10099 Berlin, Germany,
e-mail: kuechler@mathematik.hu-berlin.de
Mathematics Subject Classification 2000. Primary 62F05, 60F10. Secondary 62C10, 62C20, 62M02, 62M07.
Key words and phrases: Likelihood ratio, Hellinger integral, Neyman-Pearson test, Bayes test, minimax
test, large deviation theorems, Girsanov formula for diffusion-type processes, Ornstein-Uhlenbeck-type process,
stochastic delay differential equation.

Note that the results in [4] were obtained by using the large deviation techniques for sequences of random variables.
The problem of testing mean reversion for processes of Ornstein-Uhlenbeck type was earlier
studied by Szimayer and Maller [27]. Note that Ornstein-Uhlenbeck processes play a key role in modeling the behavior of interest rates in financial markets (see e.g. [29] or [5]).
In recent years, several statistical problems for models with time delay were studied. Dietz
[11] considered an Ornstein-Uhlenbeck-type model with exponentially decreasing memory and proved the local asymptotic mixed normality (in an extended sense) of the suitably normalized model. Gushchin and Küchler [12] - [14] derived local asymptotic properties of the
likelihood process in (two-parameter) models connected with a special case of affine stochastic
delay differential equations. Putschke [25] continued this investigation for a multi-parametric
case of such affine delay equations. Küchler and Kutoyants [17] studied the asymptotic behavior
of the maximum likelihood and Bayesian estimators of delay in a simple Ornstein-Uhlenbeck-
type model. Küchler and Vasil'ev [18] investigated the almost sure consistency and asymptotic
normality of sequential estimators for multi-parametric affine delay equations. Gushchin and
Küchler [15] derived conditions under which a model with an affine stochastic delay differential
equation satisfies the local asymptotic normality property and where the maximum likelihood
and Bayesian estimators of a parameter are asymptotically normal and efficient. In this paper
we consider the problem of testing hypotheses and study the asymptotic behavior of the error
probabilities for Neyman-Pearson tests in Ornstein-Uhlenbeck-type models with delay. Asymp-
totic properties for tests of delay parameters in the cases of small noise and large sample size
were recently studied by Kutoyants [19] - [20].
The paper is organized as follows. In Section 2, we cite large deviation results for the log-
likelihood ratio process and their applications to the investigation of the rates of decrease for
error probabilities of Neyman-Pearson, Bayes, and minimax tests (cf. [21] - [23]). In Section 3,
by means of explicit expressions for Hellinger integrals, we obtain exact large deviation rates for
the log-likelihood ratio in a model of testing hypotheses about the parameter of an Ornstein-
Uhlenbeck process. We remark that there appears some kind of discontinuity in the solution
when the basic hypothesis is altered. The results are applied to the investigation of the rates
of decrease for error probabilities of the tests mentioned above. It seems possible to derive analogues of some of these results in models with discretely observed data (see e.g. [4]). For this, essentially different techniques, of the kind applied in deriving large deviation results for testing models with sequences of random variables, would have to be used. Then the initial
continuous-time assertions can be obtained as a limiting case of the corresponding results from
the discrete-time case by applying the invariance principle. In Section 4, we get the rates of
decrease for the error probabilities of Neyman-Pearson tests in models with processes that solve
affine stochastic delay differential equations and give two illustrating examples.
2 Large deviation theorems and their applications
Suppose that (Ω, F, P_0, P_1) is a binary statistical experiment and that X = (X_t)_{t≥0} is a real-valued process. Let (F_t)_{t≥0} be the filtration generated by X, that is, F_t = σ(X_s | 0 ≤ s ≤ t), t ≥ 0. Let H_0 and H_1 be two statistical hypotheses under which the distribution of the observed process X is given by the different measures P_0 and P_1, respectively. We will consider the problem of testing the hypothesis H_0 against its alternative H_1. In this section we cite some results from Lin'kov [21] - [23] and use his notation.
2.1. Suppose that the measures P_0 and P_1 are locally equivalent on the filtration (F_t)_{t≥0} and introduce the log-likelihood ratio process Λ = (Λ_t)_{t≥0} defined as the logarithm of the Radon-Nikodym derivative:

\Lambda_t = \log \frac{d(P_1 \,|\, \mathcal{F}_t)}{d(P_0 \,|\, \mathcal{F}_t)}    (2.1)

for all t ≥ 0. Let the process H(ε) = (H_t(ε))_{t≥0} be the Hellinger integral of the order ε ∈ (−∞, ∞) of the restrictions P_1|F_t and P_0|F_t given by:

H_t(\varepsilon) := H_t(\varepsilon; P_1, P_0) = E_0[\exp(\varepsilon \Lambda_t)]    (2.2)

for all t ≥ 0 (see e.g. [16; Chapter IV, Section 1]). Note that the relationship H_t(ε; P_0, P_1) = H_t(1 − ε; P_1, P_0) holds for all ε ∈ (−∞, ∞) and t ≥ 0.

We will say that the Hellinger integral (2.2) satisfies the regularity condition if for some function ψ_t, t ≥ 0, such that ψ_t → ∞ as t → ∞, the (possibly infinite) limit

\kappa(\varepsilon) := \lim_{t \to \infty} \psi_t^{-1} \log H_t(\varepsilon)    (2.3)

exists for all ε ∈ (−∞, ∞). It is known (see e.g. [10; Chapter III, Section 3]) that the function κ(ε) is a strictly convex and continuously differentiable function on (ε_−, ε_+) with

-\infty \le \varepsilon_- := \inf\{\varepsilon \mid \kappa(\varepsilon) < \infty\} < \varepsilon_+ := \sup\{\varepsilon \mid \kappa(\varepsilon) < \infty\}    (2.4)

where ε_− ≤ 0 and ε_+ ≥ 1. If ε_− < 0 or ε_+ > 1 then the derivatives κ'(0) and κ'(1) are well-defined, respectively.

For every γ ∈ R let us define the function I(γ) as the Legendre-Fenchel transform of κ(ε) by:

I(\gamma) := \sup_{\varepsilon \in (\varepsilon_-, \varepsilon_+)} \big( \varepsilon \gamma - \kappa(\varepsilon) \big)    (2.5)

(cf. e.g. [26]) with

\kappa'(\varepsilon_- +) := \lim_{\varepsilon \downarrow \varepsilon_-} \kappa'(\varepsilon) < \kappa'(\varepsilon_+ -) := \lim_{\varepsilon \uparrow \varepsilon_+} \kappa'(\varepsilon)    (2.6)

and define the values:

\gamma_0 := \kappa'(0) \ \text{if} \ \varepsilon_- < 0, \qquad \gamma_0 := \kappa'(0+) = \lim_{\varepsilon \downarrow \varepsilon_-} \kappa'(\varepsilon) \ \text{if} \ \varepsilon_- = 0    (2.7)

\gamma_1 := \kappa'(1) \ \text{if} \ \varepsilon_+ > 1, \qquad \gamma_1 := \kappa'(1-) = \lim_{\varepsilon \uparrow \varepsilon_+} \kappa'(\varepsilon) \ \text{if} \ \varepsilon_+ = 1    (2.8)

where, by virtue of the convexity of κ(ε) on (ε_−, ε_+), we have γ_0 < γ_1.
The following assertion is a large deviation theorem of Chernoff type for the log-likelihood ratio process Λ = (Λ_t)_{t≥0}.

Proposition 2.1. Let the regularity condition (2.3) be satisfied. Then the following conclusions are valid:

(i) if γ_0 < κ'(ε_+−) then for all γ ∈ (γ_0, κ'(ε_+−)) we have:

\lim_{t \to \infty} \psi_t^{-1} \log P_0\big[\psi_t^{-1} \Lambda_t > \gamma\big] = \lim_{t \to \infty} \psi_t^{-1} \log P_0\big[\psi_t^{-1} \Lambda_t \ge \gamma\big] = -I(\gamma);    (2.9)

(ii) if ε_− < 0 and κ'(ε_−+) < κ'(0) then for all γ ∈ (κ'(ε_−+), κ'(0)) we have:

\lim_{t \to \infty} \psi_t^{-1} \log P_0\big[\psi_t^{-1} \Lambda_t < \gamma\big] = \lim_{t \to \infty} \psi_t^{-1} \log P_0\big[\psi_t^{-1} \Lambda_t \le \gamma\big] = -I(\gamma);    (2.10)

(iii) if κ'(ε_−+) < γ_1 then for all γ ∈ (κ'(ε_−+), γ_1) we have:

\lim_{t \to \infty} \psi_t^{-1} \log P_1\big[\psi_t^{-1} \Lambda_t < \gamma\big] = \lim_{t \to \infty} \psi_t^{-1} \log P_1\big[\psi_t^{-1} \Lambda_t \le \gamma\big] = \gamma - I(\gamma);    (2.11)

(iv) if ε_+ > 1 and κ'(1) < κ'(ε_+−) then for all γ ∈ (κ'(1), κ'(ε_+−)) we have:

\lim_{t \to \infty} \psi_t^{-1} \log P_1\big[\psi_t^{-1} \Lambda_t > \gamma\big] = \lim_{t \to \infty} \psi_t^{-1} \log P_1\big[\psi_t^{-1} \Lambda_t \ge \gamma\big] = \gamma - I(\gamma).    (2.12)
This assertion is proved by using large deviation theorems for extended random variables
in [22].
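The rate function (2.5) is easy to evaluate numerically once κ is known. The following Python sketch (not part of the paper; the optimization bounds and the sample values of γ are illustrative choices) computes the Legendre-Fenchel transform by one-dimensional maximization and checks it against the closed form I(γ) = 1/(8(1 − 2γ)) obtained for the Ornstein-Uhlenbeck case θ_1 > θ_0 = 0, where κ(ε) = −√ε(1 − √ε)/2 and ψ_t = θ_1 t (formulas (3.8)-(3.9), quoted later on this page).

```python
import numpy as np
from scipy.optimize import minimize_scalar

def kappa(eps):
    # kappa(eps) for the Ornstein-Uhlenbeck case theta_0 = 0 (formula (3.8) of the paper),
    # used here purely as a test case; there psi_t = theta_1 * t.
    return -np.sqrt(eps) * (1.0 - np.sqrt(eps)) / 2.0

def rate_function(gamma, eps_lo=1e-12, eps_hi=50.0):
    """Numerical Legendre-Fenchel transform I(gamma) = sup_eps (eps*gamma - kappa(eps));
    the supremum is taken over a truncated interval (illustrative bounds)."""
    res = minimize_scalar(lambda e: -(e * gamma - kappa(e)),
                          bounds=(eps_lo, eps_hi), method="bounded")
    return -res.fun

# Compare with the closed form I(gamma) = 1/(8*(1 - 2*gamma)), finite for gamma < 1/2.
for gamma in (-1.0, -0.5, 0.0, 0.2):
    print(gamma, rate_function(gamma), 1.0 / (8.0 * (1.0 - 2.0 * gamma)))
```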
2.2. The results cited above give an opportunity to investigate the rate of decrease of the error probabilities for some statistical tests. In the rest of the section we refer to some results on the asymptotic behavior of the error probabilities for Neyman-Pearson, Bayes, and minimax tests. The proofs of these results can be found in [22] (see also the references in [23]).
Let α_t, t ≥ 0, be an arbitrary function having values in (0, 1), and let δ_t(α_t) be a Neyman-Pearson test of the level α_t ∈ (0, 1) for testing the hypotheses H_0 and H_1 under the observations X_s, 0 ≤ s ≤ t (see e.g. [21; Chapter II, Section 2.1]). The following assertion describes the rate of decrease for the error probabilities of the first and second kind α_t and β(α_t), respectively, for the test δ_t(α_t) under the regularity condition (2.3).
Proposition 2.2. Let (2.3) be satisfied with γ_0 < γ_1. Then the following conclusions are valid:

(i) for all a ∈ (I(γ_0), I(γ_1)) we have:

\lim_{t \to \infty} \psi_t^{-1} \log \alpha_t = -a \quad \text{if and only if} \quad \lim_{t \to \infty} \psi_t^{-1} \log \beta(\alpha_t) = -b(a)    (2.13)

with

b(a) := a - \gamma(a) \in \big( I(\gamma_1) - \gamma_1, \, I(\gamma_0) - \gamma_0 \big)    (2.14)

and γ(a) is the unique solution of the equation I(γ) = a with respect to γ ∈ (γ_0, γ_1);

(ii) for all a ∈ [0, I(γ_0)] we have:

\lim_{t \to \infty} \psi_t^{-1} \log \alpha_t = -a \quad \text{implies} \quad \limsup_{t \to \infty} \psi_t^{-1} \log \beta(\alpha_t) \le \gamma_0 - I(\gamma_0)    (2.15)

and for all a ∈ [I(γ_1), ∞] we have:

\lim_{t \to \infty} \psi_t^{-1} \log \alpha_t = -a \quad \text{implies} \quad \liminf_{t \to \infty} \psi_t^{-1} \log \beta(\alpha_t) \ge \gamma_1 - I(\gamma_1);    (2.16)

(iii) for all b ∈ [0, I(γ_1) − γ_1] we have:

\lim_{t \to \infty} \psi_t^{-1} \log \beta(\alpha_t) = -b \quad \text{implies} \quad \limsup_{t \to \infty} \psi_t^{-1} \log \alpha_t \le -I(\gamma_1)    (2.17)

and for all b ∈ [I(γ_0) − γ_0, ∞] we have:

\lim_{t \to \infty} \psi_t^{-1} \log \beta(\alpha_t) = -b \quad \text{implies} \quad \liminf_{t \to \infty} \psi_t^{-1} \log \alpha_t \ge -I(\gamma_0).    (2.18)
These results were proved under more restrictive conditions in [21]. The 'only if' part in (2.13) for a sequence of observed i.i.d. random variables was proved by Birgé [8].
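To make the trade-off in Proposition 2.2(i) concrete, the short sketch below (again not from the paper; it reuses the closed-form rate function of the Ornstein-Uhlenbeck case θ_1 > θ_0 = 0, and the root-finding bracket is an illustrative choice) solves I(γ) = a for a prescribed first-kind exponent a ∈ (I(γ_0), I(γ_1)) = (0, 1/4) and returns the corresponding second-kind exponent b(a) = a − γ(a).

```python
from scipy.optimize import brentq

def I(gamma):
    # Rate function for the Ornstein-Uhlenbeck case theta_0 = 0 (formula (3.9) of the paper).
    return 1.0 / (8.0 * (1.0 - 2.0 * gamma))

def second_kind_exponent(a):
    """For a in (0, 1/4): solve I(gamma) = a on (gamma_0, gamma_1) = (-inf, 1/4)
    (bracketed at -1e3 for the numerics) and return b(a) = a - gamma(a)."""
    gamma_a = brentq(lambda g: I(g) - a, -1.0e3, 0.25 - 1e-9)
    return a - gamma_a

for a in (0.05, 0.1, 0.2):
    # If alpha_t decays like exp(-psi_t * a), then beta(alpha_t) decays like exp(-psi_t * b(a)).
    print(a, second_kind_exponent(a))
```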
Let δ_t^π be a Bayes test for testing the hypotheses H_0 and H_1 based on the observations X_s, 0 ≤ s ≤ t, where π and 1 − π, π ∈ [0, 1], are the a priori probabilities of the hypotheses H_0 and H_1, respectively (see e.g. [21; Chapter II, Section 2.1]). The following assertion describes the rate of decrease for the error probabilities of the first and second kind α_t(δ_t^π) and β(δ_t^π), and the risk e(δ_t^π) for the test δ_t^π under the regularity condition (2.3).

Proposition 2.3. Let (2.3) be satisfied with γ_0 < 0 < γ_1. Then the following relationships hold:

\lim_{t \to \infty} \psi_t^{-1} \log \alpha(\delta_t^\pi) = \lim_{t \to \infty} \psi_t^{-1} \log \beta(\delta_t^\pi) = \lim_{t \to \infty} \psi_t^{-1} \log e(\delta_t^\pi) = -I(0).    (2.19)
This assertion was proved by Chernoff [9] for the case of i.i.d. random variables. Under
some other conditions the last equality in (2.19) was proved by Vajda [28].
Let δ_t be a minimax test for testing the hypotheses H_0 and H_1 under the observations X_s, 0 ≤ s ≤ t (see e.g. [7; Chapter III, Section 4]). The following assertion describes the rate of decrease for the error probabilities of the first and second kind α_t(δ_t) and β(δ_t), and the risk e(δ_t) for the test δ_t under the regularity condition (2.3).

Proposition 2.4. Suppose that (2.3) is satisfied with γ_0 < 0 < γ_1. Then we have:

\lim_{t \to \infty} \psi_t^{-1} \log \alpha(\delta_t) = \lim_{t \to \infty} \psi_t^{-1} \log \beta(\delta_t) = \lim_{t \to \infty} \psi_t^{-1} \log e(\delta_t) = -I(0).    (2.20)
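As a concrete illustration (anticipating the explicit formulas (3.8)-(3.9) quoted later on this page), in the Ornstein-Uhlenbeck case θ_1 > θ_0 = 0 one has ψ_t = θ_1 t, γ_0 = −∞ < 0 < γ_1 = 1/4 and I(0) = 1/8, so the error probabilities and risks of the Bayes and minimax tests decay roughly like exp(−θ_1 t/8).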
3 Ornstein-Uhlenbeck models
In this section we consider a model where the observation process X = (X_t)_{t≥0} satisfies the following stochastic differential equation:

dX_t = -\theta X_t \, dt + dW_t, \qquad X_0 = x    (3.1)

where W = (W_t)_{t≥0} is a standard Wiener process and θ ≥ 0, x ∈ R are some given constants.
We will study the problem of testing the simple hypothesis H_0 : θ = θ_0 against the simple alternative H_1 : θ = θ_1.

Here we specify the results of the previous section for Ornstein-Uhlenbeck processes in both cases θ_1 > θ_0 = 0 and θ_1 > θ_0 > 0. It is remarkable that the first case cannot be obtained from the second one by letting θ_0 ↓ 0.
3.1. Since equation (3.1) has a pathwise unique continuous solution under both hypotheses H_0 and H_1, by means of the Girsanov formula for diffusion-type processes (see e.g. [24; Chapter VII, Theorem 7.19]) we may conclude that the measures P_0 and P_1 are locally equivalent on (F_t)_{t≥0}, and under the hypothesis H_0 the log-likelihood ratio process (2.1) admits the representation:

\Lambda_t = (\theta_0 - \theta_1) \int_0^t X_s \, dW_s - \frac{(\theta_0 - \theta_1)^2}{2} \int_0^t X_s^2 \, ds.    (3.2)
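The representation (3.2) also lends itself to a quick simulation check. The sketch below (not part of the paper; θ_0 = 0, θ_1 = 1, T = 20 and the discretization are illustrative choices) simulates X under H_0 by an Euler-Maruyama scheme, evaluates Λ_T through (3.2), and compares the Monte Carlo estimate of ψ_T^{-1} log E_0[exp(ε Λ_T)] with the limit κ(ε) = −√ε(1 − √ε)/2 quoted in (3.8); agreement is only approximate since T is finite and the scheme is discrete.

```python
import numpy as np

rng = np.random.default_rng(0)
theta0, theta1, x0 = 0.0, 1.0, 0.0      # illustrative parameter values
T, n_steps, n_paths = 20.0, 4000, 20000
dt = T / n_steps

X = np.full(n_paths, x0)
int_X_dW = np.zeros(n_paths)            # accumulates int_0^T X_s dW_s
int_X2_ds = np.zeros(n_paths)           # accumulates int_0^T X_s^2 ds
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_paths)
    int_X_dW += X * dW
    int_X2_ds += X ** 2 * dt
    X += -theta0 * X * dt + dW          # Euler-Maruyama step under H_0

# Log-likelihood ratio via formula (3.2).
Lam = (theta0 - theta1) * int_X_dW - 0.5 * (theta0 - theta1) ** 2 * int_X2_ds

eps = 0.3
psi_T = theta1 * T
mc_rate = np.log(np.mean(np.exp(eps * Lam))) / psi_T
print("Monte Carlo:", mc_rate, " limit kappa(eps):", -np.sqrt(eps) * (1 - np.sqrt(eps)) / 2)
```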

Citations
Journal ArticleDOI
TL;DR: In this paper, the authors studied hypothesis testing in time inhomogeneous diffusion processes and obtained the negative regions and decay rates of the error probabilities with the help of large and moderate deviations for the log-likelihood ratio process.

13 citations

Journal ArticleDOI
TL;DR: In this paper, the large deviations and moderate deviations for the log-likelihood ratio of the Jacobi model were applied to give negative regions in testing Jacobi models, and the decay rates of the error probabilities were obtained.

12 citations

Journal ArticleDOI
TL;DR: In this article, a simple hypothesis testing problem for the drift/viscosity coefficient for stochastic fractional heat equation driven by additive space-time white noise colored in space is studied.

11 citations


Cites methods from "On large deviations in testing Orns..."

  • ...To accomplish this, we will use appropriately Feynman–Kac formula (for a similar approach, see also Gapeev and Küchler [8])....

    [...]

  • ...[8] Pavel V. Gapeev and Uwe Küchler....

    [...]


Journal ArticleDOI
TL;DR: In this paper, the self-normalized asymptotic properties of the parameter estimators in the fractional Ornstein–Uhlenbeck process were investigated, and deviation inequalities and Cramér-type moderate deviations were analyzed.
Abstract: In this paper, we consider the self-normalized asymptotic properties of the parameter estimators in the fractional Ornstein–Uhlenbeck process The deviation inequalities, Cramer-type moderate devia

11 citations

Journal ArticleDOI
TL;DR: In this paper, the authors studied the sharp large deviations for the log-likelihood ratio of an α-Brownian bridge, and the full expansion of the tail probability was obtained by using a change of measure.

7 citations

References
Journal ArticleDOI
TL;DR: In this article, the authors derived a general form of the term structure of interest rates and showed that the expected rate of return on any bond in excess of the spot rate is proportional to its standard deviation.

6,160 citations


Additional excerpts

  • ...[29] or [5])....

    [...]

Book
01 Jan 1987
TL;DR: In this book, the General Theory of Stochastic Processes, Semimartingales, and Stochastic Integrals is discussed, as well as the convergence of Processes with Independent Increments.
Abstract: I. The General Theory of Stochastic Processes, Semimartingales and Stochastic Integrals.- II. Characteristics of Semimartingales and Processes with Independent Increments.- III. Martingale Problems and Changes of Measures.- IV. Hellinger Processes, Absolute Continuity and Singularity of Measures.- V. Contiguity, Entire Separation, Convergence in Variation.- VI. Skorokhod Topology and Convergence of Processes.- VII. Convergence of Processes with Independent Increments.- VIII. Convergence to a Process with Independent Increments.- IX. Convergence to a Semimartingale.- X. Limit Theorems, Density Processes and Contiguity.- Bibliographical Comments.- References.- Index of Symbols.- Index of Terminology.- Index of Topics.- Index of Conditions for Limit Theorems.

5,987 citations


"On large deviations in testing Orns..." refers background in this paper

  • ...for all t ≥ 0 (see e.g. Jacod and Shiryaev 1987; Chapt....

    [...]

Journal ArticleDOI
TL;DR: In this paper, it was shown that the likelihood ratio test for fixed sample size can be reduced to this form, and that for large samples, a sample of size $n$ with the first test will give about the same probabilities of error as a sample with the second test.
Abstract: In many cases an optimum or computationally convenient test of a simple hypothesis $H_0$ against a simple alternative $H_1$ may be given in the following form. Reject $H_0$ if $S_n = \sum^n_{j=1} X_j \leqq k,$ where $X_1, X_2, \cdots, X_n$ are $n$ independent observations of a chance variable $X$ whose distribution depends on the true hypothesis and where $k$ is some appropriate number. In particular the likelihood ratio test for fixed sample size can be reduced to this form. It is shown that with each test of the above form there is associated an index $\rho$. If $\rho_1$ and $\rho_2$ are the indices corresponding to two alternative tests $e = \log \rho_1/\log \rho_2$ measures the relative efficiency of these tests in the following sense. For large samples, a sample of size $n$ with the first test will give about the same probabilities of error as a sample of size $en$ with the second test. To obtain the above result, use is made of the fact that $P(S_n \leqq na)$ behaves roughly like $m^n$ where $m$ is the minimum value assumed by the moment generating function of $X - a$. It is shown that if $H_0$ and $H_1$ specify probability distributions of $X$ which are very close to each other, one may approximate $\rho$ by assuming that $X$ is normally distributed.

3,760 citations


"On large deviations in testing Orns..." refers background or methods in this paper

  • ...Birgé [8] applied the results of [9] to the investigation of the rate of decrease for error probabilities of Neyman-Pearson tests....

    [...]

  • ...Chernoff [9] proved large deviation theorems for sums of i.i.d....

    [...]

  • ...This assertion was proved by Chernoff [9] for the case of i.i.d....

    [...]

Book
01 Jan 1977
TL;DR: In this paper, the optimal linear non-stationary filtering, interpolation and extrapolation of Partially Observable Random Processes with a Countable Number of States (POMOS) was studied.
Abstract: 1. Essentials of Probability Theory and Mathematical Statistics.- 2. Martingales and Related Processes: Discrete Time.- 3. Martingales and Related Processes: Continuous Time.- 4. The Wiener Process, the Stochastic Integral over the Wiener Process, and Stochastic Differential Equations.- 5. Square Integrable Martingales and Structure of the Functionals on a Wiener Process.- 6. Nonnegative Supermartingales and Martingales, and the Girsanov Theorem.- 7. Absolute Continuity of Measures corresponding to the Ito Processes and Processes of the Diffusion Type.- 8. General Equations of Optimal Nonlinear Filtering, Interpolation and Extrapolation of Partially Observable Random Processes.- 9. Optimal Filtering, Interpolation and Extrapolation of Markov Processes with a Countable Number of States.- 10. Optimal Linear Nonstationary Filtering.

2,481 citations


"On large deviations in testing Orns..." refers methods in this paper

  • ...H0 and H1, by means of the Girsanov formula for diffusion-type processes (see e.g. Liptser and Shiryaev 1977; Chapt....

    [...]

  • ...By applying Ito’s formula (see e.g. Liptser and Shiryaev 1977; Chapt....

    [...]

Book
01 Jun 1996
TL;DR: Theoretically, Brownian motion with drift is a Markov process as mentioned in this paper, which is a generalization of the Bessel process of order 1/2 and the Ornstein-Uhlenbeck process.
Abstract: I: Theory.- I. Stochastic processes in general.- II. Linear diffusions.- III. Stochastic calculus.- IV. Brownian motion.- V. Local time as a Markov process.- VI. Differential systems associated to Brownian motion.- Appendix 1. Briefly on some diffusions.- II: TABLES OF DISTRIBUTIONS OF FUNCTIONALS OF BROWNIAN MOTION AND RELATED PROCESSES.- 1. Brownian motion.- 2. Brownian motion with drift.- 3. Reflecting Brownian motion.- 4. Bessel process of order ?.- 5. Bessel process of order 1/2.- 6. Bessel process of order zero.- 7. Ornstein-Uhlenbeck process.- 8. Radial Ornstein-Uhlenbeck process.- 9. Geometric Brownian motion.- Appendix 2. Special functions.- Appendix 3. Inverse Laplace transforms.- Appendix 4. Differential equations and their solutions.- Appendix 5. Formulae for n-fold differentiation.

2,113 citations


"On large deviations in testing Orns..." refers background in this paper

  • ...From the formula (1.9.3) in Borodin and Salminen (1996) ; Chapt....

    [...]

  • ...(cf. the formula (1.9.3) in Borodin and Salminen 1996; Chapt....

    [...]

  • ...(cf. the formula (1.9.7) in Borodin and Salminen 1996; Chapt....

    [...]

Frequently Asked Questions (8)
Q1. What are the contributions mentioned in the paper "On large deviations in testing Ornstein-Uhlenbeck-type models"?

In this paper, the authors derived large deviation theorems of Chernoff type for the log-likelihood ratio in testing models with Ornstein-Uhlenbeck processes by applying the large deviation results from the general continuous-time semimartingale framework of Lin'kov [ 21 ]. 

Lin’kov [22] proved large deviation theorems for extended random variables and applied them to the investigation of general statistical experiments. 

Gushchin and Küchler [15] derived conditions under which a model with an affine stochastic delay differential equation satisfies the local asymptotic normality property and where the maximum likelihood and Bayesian estimators of a parameter are asymptotically normal and efficient. 

Then for the function ψ_t, t ≥ 0, given by ψ_t = E_0[ (1/2) ∫_0^t Y_s^2 ds ] (4.5), we have that lim_{t→∞} ψ_t^{-1} log α_t = 0 implies lim sup_{t→∞} ψ_t^{-1} log β(α_t) ≤ −1 (4.6), and if the condition

by substituting the expression (3.7) into (2.3), taking ψ_t = θ_1 t and letting t go to ∞, the authors get κ(ε) = −√ε(1 − √ε)/2 and κ'(ε) = −1/(4√ε) + 1/2 (3.8) for ε ∈ (ε_−, ε_+) = (0, ∞), so that κ'(ε_−+) = −∞, κ'(1) = 1/4 and κ'(ε_+−) = 1/2. It is easily seen that the function I(γ) from (2.5) takes the expression I(γ) = sup_{ε>0} (εγ − κ(ε)) = 1/(8(1 − 2γ)) (3.9), and the values in (2.7)-(2.8) can be calculated as γ_0 = κ'(ε_−+) = −∞ and γ_1 = κ'(1) = 1/4 with I(γ_0) = 0 and I(γ_1) = 1/4. Because in this case the authors have γ_0 < 0 < γ_1, from Propositions 2.1-2.4 and the formulas (3.8)-(3.9) the authors get that the following assertion holds.

Then the initial continuous-time assertions can be obtained as a limiting case of the corresponding results from the discrete-time case by applying the invariance principle. 

Asymptotic properties for tests of delay parameters in the cases of small noise and large sample size were recently studied by Kutoyants [19] - [20]. 

Let α_t, t ≥ 0, be the error probability of the first kind of the Neyman-Pearson test in the model (4.1) of testing the hypothesis H_0 : a(ds) ≡ a_0(ds) against the alternative H_1 : a(ds) ≡ a_1(ds) with a_i(ds) ∈