
UPPSALA DISSERTATIONS IN MATHEMATICS 86
Department of Mathematics, Uppsala University
UPPSALA 2014

Gaussian Bridges - Modeling and Inference

Maik Görgens


Dissertation presented at Uppsala University to be publicly examined in Häggsalen, Ångströmlaboratoriet, Lägerhyddsvägen 1, Uppsala, Friday, 7 November 2014 at 13:15 for the degree of Doctor of Philosophy. The examination will be conducted in English. Faculty examiner: Professor Mikhail Lifshits (St. Petersburg State University and Linköping University).

Abstract

Görgens, M. 2014. Gaussian Bridges - Modeling and Inference. Uppsala Dissertations in Mathematics 86. 32 pp. Uppsala: Acta Universitatis Upsaliensis. ISBN 978-91-506-2420-5.

This thesis consists of a summary and five papers, dealing with the modeling of Gaussian bridges and membranes and inference for the α-Brownian bridge.

In Paper I we study continuous Gaussian processes conditioned on the event that certain functionals of their sample paths vanish. We deduce anticipative and non-anticipative representations for them. Generalizations to Gaussian random variables with values in separable Banach spaces are discussed. In Paper II we present a unified approach to the construction of generalized Gaussian random fields. We then show how to extract different Gaussian processes, such as fractional Brownian motion, Gaussian bridges and their generalizations, and Gaussian membranes, from these fields.

In Paper III we study a simple decision problem on the scaling parameter in α-Brownian bridges. We generalize the Karhunen-Loève theorem and obtain the distribution of the involved likelihood ratio based on Karhunen-Loève expansions and Smirnov's formula. The presented approach is applied to a simple decision problem for Ornstein-Uhlenbeck processes as well. In Paper IV we calculate the bias of the maximum likelihood estimator for the scaling parameter and propose a bias-corrected estimator. We compare it with the maximum likelihood estimator and two alternative Bayesian estimators in a simulation study. In Paper V we solve an optimal stopping problem for the α-Brownian bridge. In particular, the limiting behavior as α tends to zero is discussed.

Maik Görgens, Department of Mathematics, Analysis and Probability Theory, Box 480, Uppsala University, SE-75106 Uppsala, Sweden.

© Maik Görgens 2014

ISSN 1401-2049
ISBN 978-91-506-2420-5
urn:nbn:se:uu:diva-232544 (http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-232544)


For Bine and Milo


List of papers

This thesis is based on the following papers, which are referred to in the text by their Roman numerals.

I M. Görgens. Conditioning of Gaussian processes and a zero area Brownian bridge. Manuscript.

II M. Görgens and I. Kaj. Gaussian processes, bridges and membranes extracted from selfsimilar random fields. Manuscript.

III M. Görgens. Inference for α-Brownian bridge based on Karhunen-Loève expansions. Submitted for publication.

IV M. Görgens and M. Thulin. Bias-correction of the maximum likelihood estimator for the α-Brownian bridge. Statistics and Probability Letters, 93, 78–86, 2014.

V M. Görgens. Optimal stopping of an α-Brownian bridge. Submitted for publication.

Reprints were made with permission from the publishers.


Contents

1 Introduction
  1.1 Gaussian processes
    1.1.1 The Brownian bridge
    1.1.2 Representation of Gaussian processes
    1.1.3 Series expansions of Gaussian processes
  1.2 Models for Gaussian bridges and membranes
    1.2.1 Generalized Gaussian bridges
    1.2.2 Gaussian selfsimilar random fields
  1.3 Inference for α-Brownian bridges
    1.3.1 Estimation
    1.3.2 Hypothesis testing
    1.3.3 Optimal stopping

2 Summary of Papers
  2.1 Paper I
  2.2 Paper II
  2.3 Paper III
  2.4 Paper IV
  2.5 Paper V

3 Summary in Swedish

Acknowledgements

References


1. Introduction

This is a thesis in the mathematical field of stochastics. This field is often divided into the areas of probability theory, theoretical statistics, and stochastic processes. The boundaries between these areas are not sharp, and they overlap considerably. Paper I and Paper II contribute to the theory of stochastic modeling of Gaussian bridges and membranes and belong to the intersection of probability theory and stochastic processes, whereas in Papers III–V we study inference for a continuous time stochastic process; those papers thus belong to the intersection of theoretical statistics and stochastic processes. Moreover, throughout the thesis we make use of functional analytic tools.

The term Gaussian bridges is to be understood in a broad sense. While originally introduced to describe Gaussian processes which attain a certain value at a specific time almost surely (see for example [24]), the term was later used (with the prefix "generalized") to denote Gaussian processes conditioned on the event that one or more functionals of the sample paths vanish (we refer to [1] and in particular to [43]). Paper I contributes to the theory of such generalized Gaussian bridges. In Paper II we present a general method to construct selfsimilar Gaussian random fields and study how to extract Gaussian processes, bridges, and membranes from them. In Papers III–V we consider another generalization of Gaussian bridges, the α-Brownian bridges, and study problems of inference for the scaling parameter α and optimal stopping of such bridges. In all five papers of this thesis the Brownian bridge occurs at least as a special case (a plot of the Brownian bridge is given on the cover page).

In this first chapter we give a short introduction to Gaussian processes in general and to the topics studied in this thesis in particular. In Chapter 2 we summarize the included papers, and in Chapter 3 we give an outline of the thesis in Swedish.

1.1 Gaussian processes

Among all probability distributions the normal distribution is of particular importance since, by the central limit theorem, sums of independent and identically distributed (i.i.d.) random variables with finite variance behave roughly like normal random variables. The central limit theorem has a functional analogue as well: random walks $S = (S_n)_{n\in\mathbb{N}}$ of the form $S_n = \sum_{k=1}^{n} X_k$, where the $X_k$'s are i.i.d. random variables with finite variance, behave (suitably scaled) roughly like Brownian motion. Brownian motion is, up to scaling, the unique continuous stochastic process on the real line with independent, stationary, and symmetric increments. It serves as a building block for many other Gaussian and non-Gaussian processes.

Gaussian processes are not only of particular importance, but also very accessible for investigation, since their finite dimensional distributions are determined solely by their mean and covariance functions and, moreover, Gaussian random variables are independent whenever they are orthogonal in the Hilbert space spanned by them.

This importance and tractability make the class of Gaussian processes an object of intensive study. Here we just mention the monographs [12], [27], [29], and [33].

1.1.1 The Brownian bridge

If we consider standard Brownian motion $W = (W_s)_{s\in\mathbb{R}}$ (i.e., Brownian motion scaled to fulfill $\mathbb{E}W_0 = 0$ and $\mathbb{E}W_1^2 = 1$) and tie it down to 0 at time 1, we obtain (restricted to the interval $[0,1]$) the Brownian bridge. As mentioned before, this process appears in all papers included in this thesis.

The Brownian bridge $B = (B_s)_{s\in[0,1]}$ is a continuous centered Gaussian process uniquely defined by its covariance function $\mathbb{E}B_sB_t = s(1-t)$ for $0 \le s \le t \le 1$. It is of particular importance in asymptotic statistics (cf. [23]): Given $n$ i.i.d. random variables with a continuous distribution function $F$, consider their empirical distribution function $F_n$. By the law of large numbers, $F_n(s) \to F(s)$ almost surely as $n \to \infty$. In 1933, Glivenko [25] and Cantelli [15] showed that this convergence is uniform on the real line. Now, by the central limit theorem,
\[
\sqrt{n}\,\big(F_n(s) - F(s)\big) \overset{d}{\longrightarrow} B(F(s)), \qquad \text{as } n \to \infty. \tag{1.1}
\]
Kolmogorov showed in [31] that, as $n \to \infty$,
\[
\sqrt{n}\,\sup_{s\in\mathbb{R}} |F_n(s) - F(s)| \overset{d}{\longrightarrow} \sup_{s\in\mathbb{R}} |B(F(s))| = \sup_{0\le s\le 1} |B(s)|. \tag{1.2}
\]
Moreover, he proved that the law of the left-hand side in (1.2) is independent of $F$ and studied the distribution of the right-hand side in (1.2) (now known as the Kolmogorov distribution). Remarkably, all three papers [15], [25], and [31] were published in 1933, with almost the same title, in the same issue of the Italian Giornale dell'Istituto Italiano degli Attuari. These results, together with the work of Smirnov in [41], form the theoretical foundation of the Kolmogorov-Smirnov goodness-of-fit tests.

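To illustrate (1.2) numerically, the following sketch (not part of the thesis; all function names are illustrative) compares the Kolmogorov statistic $\sqrt{n}\sup_s|F_n(s)-F(s)|$ computed from simulated normal samples with the supremum of simulated Brownian bridge paths obtained from the representation $B_s = W_s - sW_1$; up to discretization error, both samples follow approximately the Kolmogorov distribution.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def std_normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def kolmogorov_statistic(sample, cdf):
    """sqrt(n) * sup_s |F_n(s) - F(s)| for an i.i.d. sample and a continuous cdf F."""
    x = np.sort(sample)
    n = len(x)
    u = np.array([cdf(v) for v in x])
    # the supremum is attained at the jump points of the empirical distribution function
    d_plus = np.max(np.arange(1, n + 1) / n - u)
    d_minus = np.max(u - np.arange(0, n) / n)
    return math.sqrt(n) * max(d_plus, d_minus)

def sup_abs_brownian_bridge(n_steps=1000):
    """sup_s |B_s| for a Brownian bridge simulated as B_s = W_s - s * W_1."""
    t = np.linspace(0.0, 1.0, n_steps + 1)
    w = np.concatenate(([0.0], np.cumsum(rng.normal(scale=math.sqrt(1.0 / n_steps), size=n_steps))))
    return np.max(np.abs(w - t * w[-1]))

ks = [kolmogorov_statistic(rng.normal(size=200), std_normal_cdf) for _ in range(2000)]
bb = [sup_abs_brownian_bridge() for _ in range(2000)]
print(np.median(ks), np.median(bb))   # both close to 0.83, the median of the Kolmogorov distribution
```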


1.1.2 Representation of Gaussian processes

After the aforementioned work of Kolmogorov and others, the Brownian bridge was studied in more detail. In doing so, it was fruitful to work with different representations of it. Introducing the Brownian bridge $B$ as a Brownian motion conditioned to end at 0 at time 1 leads immediately to
\[
B_s = W_s - s\,W_1, \qquad 0 \le s \le 1. \tag{1.3}
\]
This representation is anticipative in the sense that, in order to compute $B_s$, we use $W_1$, a random variable "not available" at time $s < 1$.

An alternative representation of the Brownian bridge is given by the stochastic differential equation
\[
dB_s = dW_s - \frac{B_s}{1-s}\,ds, \qquad B_0 = 0, \quad 0 \le s < 1. \tag{1.4}
\]
The solution of (1.4) is
\[
B_s = \int_0^s \frac{1-s}{1-x}\,dW_x, \qquad 0 \le s < 1, \tag{1.5}
\]
and one has $\lim_{s\nearrow 1} B_s = 0$. Clearly, the stochastic processes defined by (1.3) and (1.5) are different (for example, they induce different filtrations). However, they induce the same probability law on $C([0,1])$, the Banach space of continuous functions on $[0,1]$ equipped with the supremum norm.
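The claim that (1.3) and (1.5) induce the same law on $C([0,1])$ can be checked empirically. The sketch below (illustrative only, not from the thesis) simulates both representations on a grid, using an Euler scheme for (1.4), and compares the empirical covariance at $s = 0.25$, $t = 0.5$ with the exact value $s(1-t) = 0.125$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, n_paths = 500, 20000
t = np.linspace(0.0, 1.0, n + 1)
dt = 1.0 / n

# anticipative representation (1.3): B_s = W_s - s * W_1
dW = rng.normal(scale=np.sqrt(dt), size=(n_paths, n))
W = np.concatenate((np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)), axis=1)
B_anticipative = W - t * W[:, -1:]

# non-anticipative representation: Euler scheme for dB = dW - B/(1-s) ds, cf. (1.4)
B_sde = np.zeros((n_paths, n + 1))
dW2 = rng.normal(scale=np.sqrt(dt), size=(n_paths, n))
for k in range(n):
    B_sde[:, k + 1] = B_sde[:, k] + dW2[:, k] - B_sde[:, k] / (1.0 - t[k]) * dt

# the two constructions use independent noise, yet both should exhibit the
# Brownian bridge covariance E[B_s B_t] = s * (1 - t) for s <= t
i, j = n // 4, n // 2                     # s = 0.25, t = 0.5
print(np.mean(B_anticipative[:, i] * B_anticipative[:, j]))   # approx. 0.125
print(np.mean(B_sde[:, i] * B_sde[:, j]))                     # approx. 0.125
```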

Another way to represent Gaussian processes is by means of series expansions, which we discuss in the following section.

1.1.3 Series expansions of Gaussian processes

Gaussian processes $X = (X_s)_{s\in[0,T]}$ with continuous sample paths may be represented as a random series of the form
\[
X_s \overset{d}{=} \sum_{n=1}^{\infty} \xi_n f_n(s), \qquad 0 \le s \le T, \tag{1.6}
\]
where $(\xi_n)_{n=1}^{\infty}$ is a sequence of i.i.d. standard normal random variables. While there exist many such series representations, we present two of them in more detail.

Operator generated processes

Given a separable Hilbert space $H$ and a linear and bounded operator $u\colon H \to C([0,T])$, we can define a Gaussian process $X = (X_s)_{s\in[0,T]}$ via
\[
X_s = \sum_{n=1}^{\infty} \xi_n (u e_n)(s), \tag{1.7}
\]
where $(e_n)_{n=1}^{\infty}$ is an orthonormal basis in $H$. The series in (1.7) converges almost surely for every fixed $s$. However, the null set on which the right-hand side fails to converge depends in general on $s$, and thus the convergence in (1.7) is in general not uniform.

Now assume that $X = (X_s)_{s\in[0,T]}$ is a stochastic process with almost surely continuous paths. Then (see Theorem 3.5.1 in [12]) there exist a separable Hilbert space $H$ and an operator $u\colon H \to C([0,T])$ such that $X$ can be written as in (1.7) almost surely. In particular, in this case the convergence is uniform in $s \in [0,T]$. The operator $u$ is called the generating operator (or the associated operator) of $X$ (note that the different choices of $u$ and $H$ are equivalent only up to isomorphisms).

The generating operator $u\colon H \to C([0,T])$ encapsulates all information on the distribution of a Gaussian process $X$. In particular, changing the orthonormal basis in (1.7) does not change the distribution of $X$.

In order to give an example, we state that the Brownian bridge $B$ on $[0,1]$ is generated by the operator $u\colon L_2^0([0,1]) \to C([0,1])$, where $L_2^0([0,1])$ is the orthogonal complement of the function $f(x) \equiv 1$ in $L_2([0,1])$ and
\[
(u e)(s) = \int_0^s e(x)\,dx, \qquad 0 \le s \le 1, \quad e \in L_2^0([0,1]).
\]
In particular, the orthonormal basis $\{\sqrt{2}\cos(n\pi x) : n \ge 1\}$ in $L_2^0([0,1])$ yields the representation
\[
B_s = \sqrt{2}\,\sum_{n=1}^{\infty} \xi_n\,\frac{\sin(n\pi s)}{n\pi}. \tag{1.8}
\]
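A hedged numerical illustration of the series (1.8) (not from the thesis): truncating the sum after a few hundred terms already gives sample paths that vanish at both endpoints and reproduce, for example, the variance $\mathbb{E}B_{1/2}^2 = 1/4$.

```python
import numpy as np

rng = np.random.default_rng(2)

def brownian_bridge_series(s, n_terms=500, xi=None):
    """Truncation of the sine series (1.8): B_s ~ sqrt(2) * sum_n xi_n * sin(n*pi*s) / (n*pi)."""
    n = np.arange(1, n_terms + 1)[:, None]       # column of frequencies 1, ..., n_terms
    if xi is None:
        xi = rng.normal(size=n_terms)
    return np.sqrt(2.0) * np.sum(xi[:, None] * np.sin(np.pi * n * s) / (np.pi * n), axis=0)

s = np.linspace(0.0, 1.0, 201)
path = brownian_bridge_series(s)   # one approximate sample path; it vanishes at s = 0 and s = 1

# sanity check of the variance at s = 1/2: E[B_{1/2}^2] = 1/2 * (1 - 1/2) = 1/4
vals = np.array([brownian_bridge_series(np.array([0.5]))[0] for _ in range(5000)])
print(vals.var())                  # approximately 0.25
```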

Karhunen-Loève expansions

Another important series representation of a continuous Gaussian process $X = (X_s)_{s\in[0,T]}$ is given by its Karhunen-Loève expansion. Let $R$ be the covariance function of $X$, $R(s,t) = \mathbb{E}X_sX_t$, and let $\mu$ be a finite measure on $[0,T]$. Let $L_2([0,T],\mu)$ be the space of square integrable measurable functions on $[0,T]$ with respect to the measure $\mu$, and define the covariance operator of $X$, $A_R\colon L_2([0,T],\mu) \to L_2([0,T],\mu)$, by
\[
(A_R e)(t) = \int_0^T R(t,s)\,e(s)\,\mu(ds), \qquad e \in L_2([0,T],\mu).
\]
Then $A_R$ is a linear and bounded, compact and self-adjoint, and non-negative definite operator. Hence, its eigenvalues $(\lambda_n)_{n=1}^{\infty}$ are real and non-negative. Now, an application of a generalized version of the Karhunen-Loève Theorem (see Theorem 34.5.B in [36] for the classical Karhunen-Loève Theorem and Theorem 2 of Paper III for its extension) yields the following series expansion of $X$:
\[
X_s = \sum_{n=1}^{\infty} Z_n e_n(s) \quad \text{with} \quad Z_n = \int_0^T X_s e_n(s)\,\mu(ds), \tag{1.9}
\]
where $(e_n)_{n=1}^{\infty}$ is the sequence of corresponding orthonormalized continuous eigenfunctions of the eigenvalues $(\lambda_n)_{n=1}^{\infty}$, and $(Z_n)_{n=1}^{\infty}$ is a sequence of independent normal random variables with mean 0 and variance $\lambda_n$. The series in (1.9) converges almost surely, uniformly in $s$ over the support of the measure $\mu$. Setting $f_n = \sqrt{\lambda_n}\,e_n$ we obtain a series representation of $X$ of the form (1.6).

The advantage of the Karhunen-Loève expansion is that it gives an orthogonal decomposition of $X$, since the eigenfunctions $(e_n)_{n=1}^{\infty}$ are orthogonal in $L_2([0,T],\mu)$. Moreover, if we sort the eigenvalues in descending order, then truncations based on the Karhunen-Loève expansion minimize the total mean square error, i.e., for all $n \in \mathbb{N}$ the expected $L_2([0,T],\mu)$-norm of the tail sum $\sum_{k=n+1}^{\infty} Z_k e_k$ is minimal among all series expansions of the form (1.6).

Calculating the Karhunen-Loève expansion of the Brownian bridge with respect to the Lebesgue measure on $[0,1]$ yields the eigenvalues $\lambda_n = 1/(n^2\pi^2)$ and the normalized eigenfunctions $e_n(s) = \sqrt{2}\sin(n\pi s)$, $n \ge 1$, which eventually leads to the same representation of the Brownian bridge as in (1.8).
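The Karhunen-Loève eigenvalues can also be obtained numerically by discretizing the covariance operator (a Nyström-type approximation; the sketch below is illustrative and assumes the Lebesgue measure on $[0,1]$). For the Brownian bridge the computed eigenvalues can be compared with the exact values $\lambda_n = 1/(n^2\pi^2)$.

```python
import numpy as np

# Nystrom-type discretization of the covariance operator A_R on a uniform grid:
# with the Lebesgue measure, (A_R e)(t) = int_0^1 R(t,s) e(s) ds is approximated by (R * ds) e
m = 800
s = (np.arange(m) + 0.5) / m
ds = 1.0 / m
R = np.minimum.outer(s, s) - np.outer(s, s)        # Brownian bridge covariance min(s,t) - s*t
numerical = np.linalg.eigvalsh(R * ds)[::-1]       # eigenvalues in descending order

exact = 1.0 / (np.arange(1, 6) ** 2 * np.pi ** 2)
print(numerical[:5])    # numerical Karhunen-Loeve eigenvalues
print(exact)            # 0.1013, 0.0253, 0.0113, 0.0063, 0.0041
```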

1.2 Models for Gaussian bridges and membranes

In Paper I and Paper II we study the modeling of Gaussian bridges and membranes. In Paper I, generalized Gaussian bridges are obtained by conditioning Gaussian processes on the event that certain functionals of their sample paths vanish. In Paper II, Gaussian bridges and their higher dimensional analogue, Gaussian membranes, are extracted from certain selfsimilar Gaussian random fields.

1.2.1 Generalized Gaussian bridges

In (1.1) we have seen that
\[
\sqrt{n}\,\big(F_n(s) - s\big) \overset{d}{\longrightarrow} B(s), \qquad \text{as } n \to \infty, \tag{1.10}
\]
where $B = (B_s)_{s\in[0,1]}$ is the Brownian bridge on $[0,1]$ and $F_n$ is the empirical distribution function of the first $n$ elements in the sequence $U_1, U_2, \dots$ of independent and uniformly distributed random variables on $[0,1]$. In fact, by Donsker's Theorem, the probability measure induced by the left-hand side of (1.10) converges weakly to the probability measure induced by the Brownian bridge on the Skorokhod space $D([0,1])$. Now, for $n \in \mathbb{N}$, let $U^n_1, U^n_2, \dots, U^n_n$ be independent and uniformly distributed random variables on $[0,1]$, conditioned on $\sum_{i=1}^{n} U^n_i = n/2$, and let $G_n$ be the empirical distribution function of the random variables $U^n_1, \dots, U^n_n$. From the conditioning of the $U^n_i$'s it follows that
\[
\int_0^1 \big(G_n(s) - s\big)\,ds = 0.
\]


Figure 1.1. A realization of a zero area Brownian bridge (figure taken from Paper I).

We may thus expect that $\sqrt{n}\,(G_n(s) - s)$ converges, at least in the sense of finite dimensional distributions, to the Brownian bridge conditioned on the event that its integral over $[0,1]$ vanishes. We call this process the zero area Brownian bridge. A typical sample path is shown in Figure 1.1 (taken from Paper I).

The zero area Brownian bridge is one example of a generalized Gaussian bridge (or, as we call it in Paper I, conditioned Gaussian process): given a continuous Gaussian process $X = (X_s)_{s\in[0,T]}$, $T > 0$, and a finite subset $A \subset C([0,T])^*$ of the dual space of $C([0,T])$, let $P_X^{(A)}$ be the conditioned measure
\[
P_X^{(A)}(\cdot) = P_X\Big(\,\cdot\;\Big|\; \bigcap_{a\in A} a^{-1}(0)\Big),
\]
where $P_X$ is the induced measure of $X$ on $C([0,T])$. Every continuous Gaussian process $X^{(A)}$ whose induced measure $P_{X^{(A)}}$ on $C([0,T])$ coincides with $P_X^{(A)}$ is called a generalized Gaussian bridge (or conditioned Gaussian process of $X$ with respect to $A$).

The Brownian bridge on $[0,1]$ is standard Brownian motion on $[0,1]$ conditioned by $A = \{\delta_1\}$, and the zero area Brownian bridge appears by conditioning the Brownian bridge on $[0,1]$ further by $A' = \{a\}$, where $a \in C([0,1])^*$ is integration with respect to Lebesgue measure on $[0,1]$, or alternatively, by conditioning standard Brownian motion on $[0,1]$ by $A'' = \{\delta_1, a\}$.

The random variables $X_s$, $0 \le s \le T$, and $a(X)$, $a \in A$, are centered Gaussian random variables. Hence, conditioning becomes orthogonal projection in the Gaussian Hilbert space spanned by the random variables $X_s$, $0 \le s \le T$. In particular, anticipative representations of generalized Gaussian bridges are obtained easily. For example, the anticipative representation of the Brownian bridge $B$ as a conditioned Brownian motion $W$ is, as in (1.3), $B_s = W_s - sW_1$ for $0 \le s \le 1$.
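Because conditioning reduces to orthogonal projection, generalized bridges can be sampled on a grid by simple linear algebra. The sketch below (a crude discretized illustration, not the construction used in Paper I) produces an approximate realization of the zero area Brownian bridge by projecting a simulated Brownian motion onto the kernel of the two discretized conditions $W_1 = 0$ and $\int_0^1 W_s\,ds = 0$.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000
t = np.linspace(0.0, 1.0, n + 1)
dt = 1.0 / n

# weight vectors representing the two conditions as linear functionals of (W_t) on the grid:
# a1(W) = W_1 (endpoint) and a2(W) = int_0^1 W_s ds (trapezoidal rule)
a1 = np.zeros(n + 1); a1[-1] = 1.0
a2 = np.concatenate(([dt / 2], np.full(n - 1, dt), [dt / 2]))
A = np.vstack([a1, a2])

C = np.minimum.outer(t, t)            # Cov(W_s, W_t) = min(s, t)
cov_XY = C @ A.T                      # Cov(W_s, a_i(W))
cov_YY = A @ C @ A.T                  # Cov(a_i(W), a_j(W))

def zero_area_bridge_path():
    W = np.concatenate(([0.0], np.cumsum(rng.normal(scale=np.sqrt(dt), size=n))))
    Y = A @ W
    # Gaussian conditioning = orthogonal projection: the residual W - Cov(W,Y) Cov(Y)^{-1} Y
    # is independent of Y and has the law of W conditioned on a1(W) = a2(W) = 0
    return W - cov_XY @ np.linalg.solve(cov_YY, Y)

Z = zero_area_bridge_path()
print(Z[-1], a2 @ Z)                  # endpoint and integral are (numerically) zero
```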


Finding non-anticipative representations for the conditioned process $X^{(A)}$, i.e., representations in which the filtrations induced by the processes $X$ and $X^{(A)}$ coincide (such as, for example, the representations (1.4) and (1.5) for the Brownian bridge), is more involved. In general, additional assumptions on $X$ and $A$ are required. Generalized Gaussian bridges and special cases of them were studied before, for example in [1], [8], [9], [19], [24], and [43] (in [1] and [43] under the name generalized Gaussian bridges). In particular, in the recent work [43] non-anticipative representations for generalized bridges of a wide class of Gaussian processes were found.

Generalized bridges as described in this section have been used in connection with insider trading, where the additional information of an insider is modeled by functionals of the price process of some financial derivative. In [8] and [43] the additional expected utility for the insider is calculated for different models. Recently, in [17], generalized Gaussian bridges arising by conditioning Gaussian processes on the event that their first coordinates in the Karhunen-Loève expansion vanish were considered in the context of partial functional quantization.

In Paper I we present another approach to the study of generalized Gaussian bridges and show how this approach extends to the conditioning of Gaussian random variables with values in arbitrary separable Banach spaces.

1.2.2 Gaussian selfsimilar random fields

A stochastic process $X = (X_s)_{s\in\mathbb{R}}$ on the real line is called selfsimilar with Hurst index $H$ if, for all $c > 0$, the processes $(X_{cs})_{s\in\mathbb{R}}$ and $(c^H X_s)_{s\in\mathbb{R}}$ coincide in distribution. It is said to have stationary increments if the distribution of $X_t - X_s$ depends solely on the length $t - s$ for all $s, t \in \mathbb{R}$. The only continuous Gaussian selfsimilar process with stationary increments is, up to constants, the fractional Brownian motion $B^H = (B^H_s)_{s\in\mathbb{R}}$, which has covariance function
\[
\mathbb{E}\, B^H_s B^H_t = \tfrac{1}{2}\big(|s|^{2H} + |t|^{2H} - |s-t|^{2H}\big), \qquad s, t \in \mathbb{R}.
\]
Moreover, the self-similarity index $H$ needs to be restricted to $0 < H < 1$. In the particular case $H = 1/2$ we obtain standard Brownian motion. Fractional Brownian motion is widely used in applications, for example in statistical physics, telecommunications, financial mathematics, and many other areas.
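For moderate grid sizes, fractional Brownian motion can be simulated directly from this covariance function via a Cholesky factorization; the sketch below (illustrative only) also checks the selfsimilarity relation $\operatorname{Var}(B^H_T) = T^{2H}$.

```python
import numpy as np

rng = np.random.default_rng(4)

def fbm_cholesky_factor(hurst, n=400, T=1.0):
    """Cholesky factor of the fBm covariance (|s|^{2H} + |t|^{2H} - |s-t|^{2H}) / 2 on a grid."""
    t = np.linspace(T / n, T, n)                         # omit t = 0, where the variance vanishes
    s, u = np.meshgrid(t, t, indexing="ij")
    cov = 0.5 * (s ** (2 * hurst) + u ** (2 * hurst) - np.abs(s - u) ** (2 * hurst))
    return np.linalg.cholesky(cov + 1e-12 * np.eye(n))   # tiny jitter for numerical stability

for H in (0.3, 0.5, 0.8):                                # H = 1/2 is standard Brownian motion
    L = fbm_cholesky_factor(H)
    paths = L @ rng.normal(size=(L.shape[0], 2000))      # each column is one fBm path on the grid
    print(H, paths[-1].var())                            # approximately T^{2H} = 1 for T = 1
```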

Gaussian random fields, the generalization of Gaussian processes, are usually defined as Gaussian probability measures on the space of distributions $S^*$, the dual space of the Schwartz functions $S$ on $\mathbb{R}^d$. A Gaussian random field is called selfsimilar with index $H$ if, for all $c > 0$, the random fields $(X(\varphi_c))_{\varphi\in S}$ and $(c^H X(\varphi))_{\varphi\in S}$ coincide in distribution, where the dilation $\varphi_c$ of $\varphi$ is defined by $\varphi_c(x) = c^{-d}\varphi(c^{-1}x)$, $x \in \mathbb{R}^d$. For $r \in \mathbb{N}$, it is said to have stationary increments of order $r$ if its restriction to $S_r$ is invariant under translations, where $S_r \subset S$ is defined as
\[
S_r = \Big\{\varphi \in S : \int_{\mathbb{R}^d} x^j \varphi(x)\,dx = 0 \text{ for all } |j| < r\Big\} \subset S,
\]
where $j = (j_1,\dots,j_d)$ is a multi-index, $|j| = \sum_{k=1}^{d} j_k$, and $x^j = \prod_{k=1}^{d} x_k^{j_k}$ for $x = (x_1,\dots,x_d) \in \mathbb{R}^d$.

In [20], Dobrushin gave a complete characterization of the covariance functionals of stationary selfsimilar Gaussian random fields. In particular, he showed that, for $H < r$, the covariance of all $H$-selfsimilar Gaussian random fields with stationary increments of order $r$ equals
\[
\mathbb{E}\, X(\varphi)X(\psi) = \int_{S^{d-1}} \int_0^{\infty} \hat{\varphi}(r x)\,\hat{\psi}(r x)\, r^{-2H-1}\,dr\,\sigma(dx),
\]
where $\hat{\varphi}$ and $\hat{\psi}$ denote the Fourier transforms of $\varphi \in S$ and $\psi \in S$, and $\sigma$ is a finite, positive, and reflection invariant measure on the unit sphere $S^{d-1}$ of $\mathbb{R}^d$. For example, choosing $\sigma = \varpi$, where $\varpi$ is the uniform measure on $S^{d-1}$, and $H = -d/2$ yields, up to constants, Gaussian white noise $M$ on $S$ with covariance functional
\[
\mathbb{E}\, M(\varphi)M(\psi) = \int_{\mathbb{R}^d} \varphi(x)\psi(x)\,dx, \qquad \varphi, \psi \in S.
\]

Selfsimilar and fractional random fields have been studied from different perspectives. The monograph [16] gives a good account of the theory. For example, it was shown that Gaussian selfsimilar random fields appear as scaling limits of certain Poisson random ball models (see for example [10] and the references therein). Moreover, Gaussian random fields were extended from $S$ to a suitable subset of the space of signed measures with finite total variation, and it was described how to extract fractional Brownian motion $B^H$ from certain Gaussian random fields $X$ via $B^H_s = X(\mu_s)$ for suitable choices of measures $(\mu_s)_{s\in\mathbb{R}^d}$.

1.3 Inference for α-Brownian bridges

In Papers III–V we study problems of inference for the α-Brownian bridge.

Inference for continuous time stochastic processes has been studied for a long time. One of the first systematic treatments of such problems was given in [26]. However, the approach described in [26] includes a reduction of the continuous time sample paths to a collection of countably many random variables. How this collection is chosen is a non-trivial problem which, moreover, strongly affects the power of the resulting estimators and tests. Further work (for example [11] and [14]) studied parameter estimation for continuous time stochastic processes without this reduction, but under the assumption of stationarity or ergodicity (or both). Statistical inference for stochastic processes based on continuous observations has been studied extensively ever since. Here we just mention the sources [34] and [35].

We next motivate the introduction of the α-Brownian bridges by an example: assume that Sweden decides to join the European monetary union (EMU). In order to do this, at some date before the planned entrance, the exchange rate at which Swedish crowns will be exchanged for Euro needs to be fixed at some level K. Then, in the time between this rate becoming public and the date of the entrance to the EMU, currency dealers will tend to change Euro into Swedish crowns if the current rate is below K, and to change Swedish crowns into Euro if the current rate is above K. Moreover, this effect will be the stronger the closer the date of entrance is. Considering the exchange rate as a function of time, we thus obtain a mapping which, on the day of entrance to the EMU, attains the fixed value K. α-Brownian bridges have been used as a building block in [42] and [44] to model such behavior.

Given a standard Brownian motion $W = (W_s)_{s\in[0,1]}$ and a real number $\alpha > 0$, consider the stochastic differential equation
\[
dX^{(\alpha)}_s = dW_s - \frac{\alpha X^{(\alpha)}_s}{1-s}\,ds, \qquad X^{(\alpha)}_0 = 0, \quad 0 \le s < 1. \tag{1.11}
\]
The unique strong solution of (1.11) is
\[
X^{(\alpha)}_s = \int_0^s \Big(\frac{1-s}{1-x}\Big)^{\alpha}\,dW_x, \qquad 0 \le s < 1, \tag{1.12}
\]
and is called the α-Brownian bridge. It fulfills $\lim_{s\nearrow 1} X^{(\alpha)}_s = 0$ almost surely, and the parameter α determines how the process returns to 0 at time 1. Hence, $X^{(\alpha)}$ has a continuous extension to $[0,1]$ and therefore the term "bridge" is justified. The (usual) Brownian bridge is covered as the special case α = 1. In (1.11) and (1.12) we may allow non-positive values for α as well. Then Brownian motion is included as the special case α = 0. However, for α ≤ 0, we no longer have $\lim_{s\nearrow 1} X^{(\alpha)}_s = 0$. In order to visualize the effect of the parameter α graphically, we give a plot of the "expected future" $\mathbb{E}\big[X^{(\alpha)}_t \,\big|\, (X^{(\alpha)}_x)_{x\in[0,s]}\big]$, $0 \le s \le t \le 1$, for different values of α in Figure 1.2 (taken from Paper III).

α-Brownian bridges were introduced in [13] for the modeling of riskless profit given some futures contracts in the absence of transaction costs. This extended the earlier work [2], where the arbitrage opportunity is derived from a model including the Brownian bridge. Later they were used in economic ([42] and [44]) and biological ([28]) contexts. In particular, in [44] a situation very similar to our motivating example was studied: the conversion of the exchange rate between the Greek Drachma and the Euro to a fixed exchange rate on January 1st, 2001.


Figure 1.2. The influence of α on the "expected future" for different values of α (α = 0, 0 < α < 1, α = 1, α > 1; figure taken from Paper III).
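The effect of α on the sample paths is easy to reproduce numerically. The following sketch (illustrative, not from the thesis) evaluates the explicit solution (1.12) on a grid with a left-point approximation of the stochastic integral; for α > 0 the simulated paths return to 0 at time 1, whereas for α = 0 one simply recovers Brownian motion.

```python
import numpy as np

rng = np.random.default_rng(5)

def alpha_brownian_bridge(alpha, n=2000):
    """One path of the alpha-Brownian bridge via the explicit solution (1.12),
    X_s = int_0^s ((1-s)/(1-x))^alpha dW_x, with a left-point rule on a uniform grid."""
    t = np.linspace(0.0, 1.0, n + 1)
    dW = rng.normal(scale=np.sqrt(1.0 / n), size=n)
    X = np.zeros(n + 1)
    for k in range(1, n + 1):
        X[k] = np.sum(((1.0 - t[k]) / (1.0 - t[:k])) ** alpha * dW[:k])
    return t, X

for alpha in (0.0, 0.5, 1.0, 4.0):     # alpha = 0: Brownian motion, alpha = 1: Brownian bridge
    t, X = alpha_brownian_bridge(alpha)
    print(alpha, X[-1])                # the endpoint is 0 for alpha > 0, but not for alpha = 0
```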

The first more theoretical investigation of α-Brownian bridges was given in [37]. In this reference it was, among other results, shown that the α-Brownian bridge is not a bridge of a Gaussian Markov process in the sense of Section 1.2.1 unless α = 1. In [5] sample path properties were studied, and in [3] the Karhunen-Loève expansion of $X^{(\alpha)}$ (under the Lebesgue measure) was computed. Some further references are given in subsequent sections. Here, we only remark that generalizations have been discussed in the literature, where the constant α was replaced by a mapping $s \mapsto \alpha(s)$ [4], and where the Brownian motion in (1.11) was replaced by fractional Brownian motion [22].

Returning to the scenario of Sweden joining the European monetary union, assume that, at some time close to the entrance to the EMU, a currency dealer holds Swedish crowns and has to decide whether or not to sell them. Suppose that she works under the assumption that the exchange rate follows an α-Brownian bridge with an unknown parameter α. Then she will have to estimate α based on the past exchange rates (we study hypothesis testing and estimation for α-Brownian bridges in Paper III and Paper IV) and, once she has found a good estimate, she will have to find the best selling strategy in the now fully specified model (we consider this problem in Paper V).

1.3.1 Estimation

An application of Girsanov's Theorem yields the log-likelihood function of α given a sample path of $X^{(\alpha)}$ until time $T$,
\[
\ln L\big(\alpha \,\big|\, (X^{(\alpha)}_s)_{s\in[0,T]}\big) = -\alpha \int_0^T \frac{X^{(\alpha)}_s}{1-s}\,dX^{(\alpha)}_s - \frac{\alpha^2}{2}\int_0^T \frac{(X^{(\alpha)}_s)^2}{(1-s)^2}\,ds. \tag{1.13}
\]


It follows that the maximum likelihood estimator for α equals
\[
\hat{\alpha}_{\mathrm{MLE}}(T) = -\int_0^T \frac{X^{(\alpha)}_s}{1-s}\,dX^{(\alpha)}_s \bigg/ \int_0^T \frac{(X^{(\alpha)}_s)^2}{(1-s)^2}\,ds. \tag{1.14}
\]
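In practice the estimator (1.14) is computed from a discretely observed path, with the stochastic integral replaced by a Riemann-type sum. The sketch below (illustrative; function names are not from the papers) simulates a path by an Euler scheme for (1.11) and evaluates the discretized estimator, which is noisy and, as Paper IV shows, biased unless $T$ is very close to 1.

```python
import numpy as np

rng = np.random.default_rng(6)

def simulate_alpha_bridge(alpha, n=20000):
    """Euler scheme for dX = dW - alpha * X/(1-s) ds, X_0 = 0, cf. (1.11)."""
    t = np.linspace(0.0, 1.0, n + 1)
    X = np.zeros(n + 1)
    dW = rng.normal(scale=np.sqrt(1.0 / n), size=n)
    for k in range(n):
        X[k + 1] = X[k] + dW[k] - alpha * X[k] / (1.0 - t[k]) / n
    return t, X

def mle_alpha(t, X, T):
    """Discretized version of (1.14): the stochastic integral is approximated
    by a left-point sum, the ordinary integral by a Riemann sum."""
    m = np.searchsorted(t, T)
    dX = np.diff(X[: m + 1])
    numerator = -np.sum(X[:m] / (1.0 - t[:m]) * dX)
    denominator = np.sum((X[:m] / (1.0 - t[:m])) ** 2) * (t[1] - t[0])
    return numerator / denominator

t, X = simulate_alpha_bridge(alpha=2.0)
print(mle_alpha(t, X, T=0.9))          # a noisy (and, for moderate T, biased) estimate of 2.0
```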

In [7] it was shown that $\hat{\alpha}_{\mathrm{MLE}}$ is a strongly consistent estimator for α, that is, $\lim_{T\nearrow 1}\hat{\alpha}_{\mathrm{MLE}}(T) = \alpha$ almost surely. Moreover, in [6] (for the first case) and [7] (for the second and third cases) it was shown that, as $T \nearrow 1$,
\[
\sqrt{I_\alpha(T)}\,\big(\hat{\alpha}_{\mathrm{MLE}}(T) - \alpha\big) \overset{d}{\longrightarrow}
\begin{cases}
\zeta, & \text{for } \alpha < 1/2,\\[4pt]
-\dfrac{W_1^2 - 1}{2\sqrt{2\int_0^1 W_s^2\,ds}}, & \text{for } \alpha = 1/2,\\[4pt]
\xi, & \text{for } \alpha > 1/2,
\end{cases}
\]
where $\zeta$ denotes a standard Cauchy-distributed random variable, $\xi$ a standard normal random variable, and $I_\alpha(T)$ is the Fisher information. Moreover, in [45] it was proven that, under the assumption $\alpha > 1/2$, the maximum likelihood estimator $\hat{\alpha}_{\mathrm{MLE}}$ satisfies the large deviation principle with speed $|\ln(1-T)|$ and good rate function
\[
J(x) =
\begin{cases}
\dfrac{(\alpha - x)^2}{2(2x-1)}, & \text{if } x \ge (1+\alpha)/3,\\[6pt]
\dfrac{2\alpha - 4x + 1}{2}, & \text{if } x < (1+\alpha)/3.
\end{cases}
\]

All the aforementioned results are only of an asymptotic nature, in the sense that they describe the behavior of $\hat{\alpha}_{\mathrm{MLE}}(T)$ as $T \nearrow 1$. The aim of Paper III and Paper IV is to give precise results for all values of $T$ smaller than 1. In Paper IV we show that $\hat{\alpha}_{\mathrm{MLE}}(T)$ is a heavily biased estimator for α unless $T$ is very close to 1, and we propose a bias-corrected estimator for α.

1.3.2 Hypothesis testing

Hypothesis testing for the α-Brownian bridge was studied in [46]. More precisely, the simple statistical decision problem
\[
H_0: \alpha = \alpha_0 \quad \text{vs.} \quad H_1: \alpha = \alpha_1, \tag{1.15}
\]
where $\alpha_0, \alpha_1 \ge 1/2$, was considered. The decision should be based on an observed trajectory until time $T < 1$ and should be made in such a way that the probability of an error of the second kind is minimized, while at the same time the probability of an error of the first kind is bounded from above by some value $q < 1$ (usually $q = 0.1$, $q = 0.05$ or $q = 0.01$).

The Neyman-Pearson Lemma provides us with the (in the just described sense) optimal test: we have to reject the null hypothesis whenever
\[
\varphi_{\alpha_0,\alpha_1}(T) := \frac{L\big(\alpha_1 \,\big|\, (X^{(\alpha)}_s)_{s\in[0,T]}\big)}{L\big(\alpha_0 \,\big|\, (X^{(\alpha)}_s)_{s\in[0,T]}\big)} > c_{\alpha_0,\alpha_1,T}(q). \tag{1.16}
\]


Here, the constant $c_{\alpha_0,\alpha_1,T}(q)$ is to be chosen such that
\[
P^{(\alpha_0)}\big(\varphi_{\alpha_0,\alpha_1}(T) > c_{\alpha_0,\alpha_1,T}(q)\big) = q,
\]
that is, in order to find the best decision in (1.15) we need to know the distribution of the likelihood ratio $\varphi_{\alpha_0,\alpha_1}(T)$ under the null hypothesis. In [46] approximations for this distribution were given for $T$ close to 1 by means of large deviations.

In Paper III we consider the same problem (1.15) but with the less restrictive assumption $\alpha_0 + \alpha_1 \ge 1$. Applying Smirnov's formula to the Karhunen-Loève expansion of $X^{(\alpha)}$ under a certain measure $\mu$ allows us to determine the distribution of $\varphi_{\alpha_0,\alpha_1}(T)$ under the null hypothesis exactly for all $T < 1$ (see Section 2.3).
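Paper III determines the distribution of $\varphi_{\alpha_0,\alpha_1}(T)$ under $H_0$ exactly; as a brute-force alternative one can estimate the critical value $c_{\alpha_0,\alpha_1,T}(q)$ by Monte Carlo, simulating paths under the null hypothesis and evaluating the discretized log-likelihood (1.13). The sketch below is such an illustration only and is not the method of Paper III.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_path(alpha, n=2000):
    """Euler scheme for the alpha-Brownian bridge SDE (1.11)."""
    t = np.linspace(0.0, 1.0, n + 1)
    X = np.zeros(n + 1)
    dW = rng.normal(scale=np.sqrt(1.0 / n), size=n)
    for k in range(n):
        X[k + 1] = X[k] + dW[k] - alpha * X[k] / (1.0 - t[k]) / n
    return t, X

def log_likelihood(alpha, t, X, T):
    """Discretized version of (1.13)."""
    m = np.searchsorted(t, T)
    dX = np.diff(X[: m + 1])
    stoch_int = np.sum(X[:m] / (1.0 - t[:m]) * dX)
    energy = np.sum((X[:m] / (1.0 - t[:m])) ** 2) * (t[1] - t[0])
    return -alpha * stoch_int - 0.5 * alpha ** 2 * energy

alpha0, alpha1, T, q = 1.0, 2.0, 0.9, 0.05
log_ratios = []
for _ in range(2000):                         # simulate under the null hypothesis alpha = alpha0
    t, X = simulate_path(alpha0)
    log_ratios.append(log_likelihood(alpha1, t, X, T) - log_likelihood(alpha0, t, X, T))
c = np.quantile(np.exp(log_ratios), 1.0 - q)  # Monte Carlo estimate of the critical value in (1.16)
print(c)
```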

1.3.3 Optimal stopping

Optimal stopping has its roots in sequential analysis, where the size $n$ of the data $(X_1, X_2, \dots, X_n)$ on which decisions and estimates are based is not predefined but depends on some stopping rule $\eta$. This rule is chosen such that the cost of collecting the data is as small as possible while still providing the required level of significance for the inference.

This idea is embedded into a continuous setting in the following way: given a stochastic process $Y = (Y_s)_{s\in[0,T]}$ and a progressively measurable function $G\colon [0,T]\times\mathbb{R}^{[0,T]} \to \mathbb{R}$, we consider
\[
V = \sup_{0\le\tau\le T} \mathbb{E}\, G(\tau, Y),
\]
where the supremum is taken over all stopping times, and where the term progressively measurable means that the value of $G(t,Y)$ is based solely on $t$ and $(Y_s)_{s\in[0,t]}$. However, often the simpler problem
\[
V = \sup_{0\le\tau\le T} \mathbb{E}\, Y_\tau \tag{1.17}
\]
is studied, i.e., the value of $G$ at time $t$ is just $Y_t$. A solution of the optimal stopping problem (1.17) consists of the value $V$ and a stopping time $\tau^*$ for which the supremum is attained (if such a stopping time exists).

If $Y$ is a (possibly time-inhomogeneous) Markov process, one usually considers the augmented problem
\[
V(x,t) = \sup_{t\le\tau\le T} \mathbb{E}_{x,t}\, Y_\tau, \tag{1.18}
\]
where $\mathbb{E}_{x,t}$ denotes expectation under the assumption that $Y_t = x$. Then the solution to (1.17) follows via $V = V(Y_0, 0)$. If we assume that $Y$ is a continuous process, it appears natural to continue the observation as long as $Y_t < V(Y_t, t)$ and to stop immediately as soon as $Y_t = V(Y_t, t)$. Hence, we expect that the stopping time
\[
\tau^* = \inf\{t \ge 0 : Y_t = V(Y_t, t)\}
\]
is optimal in (1.17). In fact, this is true under some regularity conditions on $Y$ (see Theorem 2.4 in [39]). Finding the value function $V(x,t)$ for time-inhomogeneous Markov processes (such as α-Brownian bridges) is non-trivial, and different approaches are discussed in the literature.

In [21] the problem
\[
V = \sup_{0\le\tau\le 1} \mathbb{E}\, X^{(1)}_\tau,
\]
i.e., the optimal stopping problem for the Brownian bridge, was solved. We extend these results in Paper V by replacing the Brownian bridge $X^{(1)}$ by the α-Brownian bridge $X^{(\alpha)}$ for arbitrary $\alpha \ge 0$.
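Paper V (and [21] for α = 1) solves the problem analytically via a free boundary problem; a crude numerical alternative is backward induction on a grid, using the Euler transition of (1.11). The sketch below (illustrative only, for α = 1, with a rough quadrature and clamped interpolation) approximates the value function and prints an estimate of $V(0,0)$, to be compared with the closed-form result of [21].

```python
import numpy as np

# crude backward induction for V(x, t) = sup_{t <= tau <= 1} E_{x,t} X_tau with alpha = 1,
# using the one-step Euler transition X_{t+dt} ~ Normal(x * (1 - dt/(1 - t)), dt)
nt, nx = 200, 401
dt = 1.0 / nt
x = np.linspace(-2.0, 2.0, nx)
z = np.linspace(-4.0, 4.0, 81)                    # quadrature nodes for a standard normal step
w = np.exp(-0.5 * z ** 2)
w /= w.sum()                                      # normalized Gaussian weights

V = x.copy()                                      # at t = 1 we must stop and receive the state
for k in range(nt - 1, -1, -1):
    t = k * dt
    mean = x[:, None] * (1.0 - dt / (1.0 - t))    # conditional mean after one Euler step
    nxt = mean + np.sqrt(dt) * z[None, :]         # possible next states
    cont = np.interp(nxt, x, V) @ w               # continuation value E[V(X_{t+dt}, t + dt)]
    V = np.maximum(x, cont)                       # stop (reward x) or continue

print(V[np.searchsorted(x, 0.0)])                 # approximation of V(0, 0)
```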


2. Summary of Papers

In this chapter we give a short summary of each paper included in the thesis.

2.1 Paper I

The first paper deals with conditioned Gaussian processes as introduced under the name generalized Gaussian bridges in Section 1.2.1. Let $X = (X_s)_{s\in[0,T]}$ be a continuous Gaussian process and let $A \subset C([0,T])^*$ be finite (we call the elements of $A$ conditions). Denote by $X^{(A)}$ the conditioned Gaussian process of $X$ with respect to the set of conditions $A$, and let $P_X$ and $P_{X^{(A)}}$ be the induced measures of $X$ and $X^{(A)}$ on $C([0,T])$.

Let $u\colon H \to C([0,T])$ be a generating operator of $X$ as introduced in Section 1.1.3. We show that the conditioned process $X^{(A)}$ admits a series expansion of the form
\[
X^{(A)}_s = \sum_{n=1}^{\infty} \xi_n (u f_n)(s), \tag{2.1}
\]
where $(\xi_n)_{n=1}^{\infty}$ is a sequence of i.i.d. standard normal random variables and $(f_n)_{n=1}^{\infty}$ is an orthonormal basis in the Hilbert space
\[
H^{(A)} = \{h \in H : a(uh) = 0 \text{ for all } a \in A\} \subset H.
\]
From the series expansion (2.1) we deduce an anticipative representation for the conditioned process $X^{(A)}$, i.e., we express $X^{(A)}$ in terms of $X$, where, in general, the complete realization of $X(\omega)$ is required in order to compute $X^{(A)}_s(\omega)$, $0 \le s \le T$. Moreover, the series expansion (2.1), together with an application of the Cameron-Martin Theorem, leads to a simple criterion for determining the equivalence of the measures $P_X$ and $P_{X^{(A)}}$.

Next, we study non-anticipative representations of $X^{(A)}$. We show that, whenever $X$ is a solution of a stochastic differential equation of the form
\[
dX_s = \alpha\,dW_s + \beta(s, X)\,ds, \qquad X_0 = 0, \quad 0 \le s < T,
\]
where $(W_s)_{s\in[0,T]}$ is standard Brownian motion and $\beta$ a progressively measurable functional, then $X^{(A)}$ solves a stochastic differential equation
\[
dX^{(A)}_s = \alpha\,dW_s + \delta(s, X^{(A)})\,ds, \qquad X^{(A)}_0 = 0, \quad 0 \le s < T,
\]
for some progressively measurable functional $\delta$. Moreover, if $X$ is a Markov process, we determine $\delta$ explicitly.

After giving examples (e.g., the zero area Brownian bridge mentioned in Section 1.2.1), we finally study extensions to arbitrary separable Banach spaces and consider conditioning of Gaussian processes on $[0,\infty)$ and conditioning of Gaussian random measures.

2.2 Paper II

In Paper II we present a unified framework for the construction of selfsimilar generalized Gaussian random fields on $\mathbb{R}^d$. These fields are driven by Gaussian random balls white noise $M_\beta$, defined as a Gaussian random measure on $\mathbb{R}^d\times\mathbb{R}_+$ with control measure $\nu_\beta(dz) = \nu_\beta(dx,du) = dx\,u^{-\beta-1}du$ for some $\beta > 0$.

Given a point $z = (x,u) \in \mathbb{R}^d\times\mathbb{R}_+$ and a function $h\colon \mathbb{R}^d \to \mathbb{R}$, we define the shift and scale map $\tau_z h\colon \mathbb{R}^d \to \mathbb{R}$ by $\tau_z h(y) = h((y-x)/u)$. For a signed measure $\mu$ on $\mathbb{R}^d$ and an $m > 0$, let $(-\Delta)^{-m/2}\mu$ be the absolutely continuous measure with density
\[
\big((-\Delta)^{-m/2}\mu\big)(x) = \int_{\mathbb{R}^d} |x-y|^{-(d-m)}\,\mu(dy), \qquad x \in \mathbb{R}^d.
\]
Denote the evaluation of a function $g\colon \mathbb{R}^d \to \mathbb{R}$ with respect to a signed measure $\eta$ on $\mathbb{R}^d$ by
\[
\langle \eta, g\rangle = \int_{\mathbb{R}^d} g(y)\,\eta(dy)
\]
and consider the Gaussian random field
\[
X(\mu) = \int_{\mathbb{R}^d\times\mathbb{R}_+} \langle (-\Delta)^{-m/2}\mu,\, \tau_z h\rangle\; M_\beta(dz). \tag{2.2}
\]
The notions of stationarity and self-similarity carry over from generalized Gaussian random fields defined on $S$ as in Section 1.2.2 to generalized Gaussian random fields defined on spaces of measures in a natural way. We analyze for which choices of the parameters $m$ and $\beta$, of the measure $\mu$, and of the shot noise function $h$ the random field (2.2) is well defined. Moreover, we study its self-similarity properties in Theorem 2 of Paper II. Modifications of (2.2), where the driving Gaussian random measure is replaced by Gaussian white noise on $\mathbb{R}^d$, are studied in Theorem 3 of Paper II.

We then show how to extract Gaussian processes and Gaussian bridges from these generalized Gaussian random fields. For example, we discuss the extraction of fractional Brownian motion in different representations, the extraction of generalized Gaussian bridges in the sense of Section 1.2.1, and the extraction of Gaussian bridges and membranes on bounded domains $D \subset \mathbb{R}^d$, which are Gaussian random fields $X = (X_s)_{s\in\bar{D}}$ such that $X_s \to 0$ as $s \to s_0 \in \partial D$.


In a final section we study a second approach to the construction of Gaussian membranes, through a modification of the control measure of the driving Gaussian random measure. This yields random fields which are not selfsimilar in a global sense but in a local sense, as shown in Theorem 4 of Paper II.

2.3 Paper III

The statistical decision problem (1.15) is considered in Paper III, i.e., given an observed trajectory of an α-Brownian bridge $X^{(\alpha)}$ with unknown scaling parameter α until time $T < 1$, we want to test
\[
H_0: \alpha = \alpha_0 \quad \text{vs.} \quad H_1: \alpha = \alpha_1.
\]
We assume that $\alpha_0 + \alpha_1 \ge 1$. As pointed out in Section 1.3.2, in order to find the optimal test it is crucial to know the distribution of the likelihood ratio $\varphi_{\alpha_0,\alpha_1}(T)$ (as defined in (1.16)) under the null hypothesis.

We show that $\varphi_{\alpha_0,\alpha_1}(T)$ can be recast as (Proposition 1 of Paper III)
\[
\varphi_{\alpha_0,\alpha_1}(T) = \exp\Big((\alpha_0-\alpha_1)\big(\psi_{\alpha_0,\alpha_1}(T) + \ln(1-T)\big)/2\Big), \tag{2.3}
\]
where
\[
\psi_{\alpha_0,\alpha_1}(T) = \frac{(X^{(\alpha)}_T)^2}{1-T} + (\alpha_0+\alpha_1-1)\int_0^T \frac{(X^{(\alpha)}_s)^2}{(1-s)^2}\,ds. \tag{2.4}
\]
We then generalize the Karhunen-Loève Theorem (Theorem 2 of Paper III) and calculate the Karhunen-Loève expansion of $X^{(\alpha_0)}$ under the positive measure
\[
\mu = \mu_{\alpha_0,\alpha_1,T}(ds) = \frac{\delta_T(ds)}{1-T} + (\alpha_0+\alpha_1-1)\,\frac{I(s\le T)\,ds}{(1-s)^2}
\]
(Theorem 3 of Paper III), where $\delta_T$ denotes the point measure at $T$ and $I$ the indicator function. This yields
\[
X^{(\alpha_0)}_s = \sum_{n=1}^{\infty} Z_n e_n(s), \tag{2.5}
\]
where the series in (2.5) converges almost surely, uniformly in $s \in [0,T]$ (see also Section 1.1.3). Here, $(Z_n)_{n=1}^{\infty}$ is a sequence of independent centered normal random variables with variances $\lambda_n$, with $(\lambda_n)_{n=1}^{\infty}$ being the decreasing sequence of eigenvalues in the Karhunen-Loève expansion of $X^{(\alpha_0)}$, and $(e_n)_{n=1}^{\infty}$ is the sequence of corresponding normalized continuous eigenfunctions. In particular, the sequence $(e_n)_{n=1}^{\infty}$ forms an orthonormal system in the Hilbert space $L_2([0,1],\mu)$. From this fact, (2.4) and (2.5), it follows that under the null hypothesis
\[
\psi_{\alpha_0,\alpha_1}(T) = \|X^{(\alpha_0)}\|_\mu^2 = \sum_{n=1}^{\infty} Z_n^2 \overset{d}{=} \sum_{n=1}^{\infty} \lambda_n \xi_n^2, \tag{2.6}
\]
where $(\xi_n)_{n=1}^{\infty}$ is a sequence of i.i.d. standard normal random variables. Random sums of the form (2.6) were studied by Smirnov [40] and Martynov [38], and concise formulas for their distributions were given. Based on the distribution of $\psi_{\alpha_0,\alpha_1}(T)$, the distribution of $\varphi_{\alpha_0,\alpha_1}(T)$ is obtained via (2.3) (Theorem 1 of Paper III).
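Paper III evaluates the distribution of such weighted sums exactly via Smirnov's formula; a simple (and much cruder) sanity check is to truncate the sum and simulate. In the sketch below the Brownian bridge eigenvalues $\lambda_n = 1/(n^2\pi^2)$ under the Lebesgue measure serve as a stand-in sequence, since the eigenvalues under $\mu$ from Paper III are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(8)

# Monte Carlo approximation of the law of sum_n lambda_n * xi_n^2, cf. (2.6), after truncation.
# The stand-in eigenvalues are those of the Brownian bridge under the Lebesgue measure.
lam = 1.0 / (np.arange(1, 201) ** 2 * np.pi ** 2)
samples = (lam[None, :] * rng.chisquare(df=1, size=(50000, lam.size))).sum(axis=1)

print(samples.mean(), lam.sum())       # the mean equals sum(lambda_n), here approximately 1/6
print(np.quantile(samples, 0.95))      # Monte Carlo 95% quantile (critical value)
```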

In a final section we apply the presented method to hypothesis testing for Ornstein-Uhlenbeck processes (Theorem 4 and Theorem 5 of Paper III).

2.4 Paper IV

In Paper IV we study the bias of the maximum likelihood estimator for α given an observation of a sample path of the α-Brownian bridge $X^{(\alpha)}$ until time $T < 1$. From (2.3) and (2.4) it can be deduced that
\[
\hat{\alpha}_{\mathrm{MLE}}(T) = -\frac{(X^{(\alpha)}_T)^2}{2(1-T)\,I^{(\alpha)}_T} + \frac{1}{2} - \frac{\ln(1-T)}{2\,I^{(\alpha)}_T}, \qquad \text{where } I^{(\alpha)}_T = \int_0^T \frac{(X^{(\alpha)}_s)^2}{(1-s)^2}\,ds.
\]
The moment generating function of a random variable contains information not only about the positive but also about the negative moments of that random variable (provided that they exist). In particular, in [18], formulas for the expected value of quotients of random variables based on their joint moment generating function are derived. The joint Laplace transform of $(X^{(\alpha)}_T)^2$ and $I^{(\alpha)}_T$ was computed in [7]. Based on the mentioned results from [7] and [18], we compute the expected value $\mathbb{E}_\alpha[\hat{\alpha}_{\mathrm{MLE}}(T)]$ of $\hat{\alpha}_{\mathrm{MLE}}(T)$ (Proposition 1 of Paper IV), and show that $\alpha \mapsto \mathbb{E}_\alpha[\hat{\alpha}_{\mathrm{MLE}}(T)]$ is, as a mapping from $\mathbb{R}$ into $\mathbb{R}$, surjective (Proposition 2 and Proposition 3 of Paper IV).

Finally, we propose a bias-corrected maximum likelihood estimator for α and compare its bias and mean squared error with those of the maximum likelihood estimator and of two further Bayesian estimators in a simulation study.

2.5 Paper V

In Section 1.3 we presented a currency dealer who wants to change Swedish crowns into Euro under the assumption that the exchange rate follows an α-Brownian bridge $X^{(\alpha)}$ with unknown parameter α. After estimating α, she will have to solve the optimal stopping problem
\[
V(\alpha) = \sup_{0\le\tau\le 1} \mathbb{E}\, X^{(\alpha)}_\tau, \tag{2.7}
\]
where the supremum is taken over all stopping times $\tau$ with $0 \le \tau \le 1$ almost surely. In Paper V we solve this problem by following the classical steps in optimal stopping theory described, for example, in [39]. First, we augment problem (2.7) as in (1.18) and consider
\[
V(x,t,\alpha) = \sup_{t\le\tau\le 1} \mathbb{E}_{x,t}\, X^{(\alpha)}_\tau. \tag{2.8}
\]
Then we formulate a two-dimensional free boundary problem for the value function $V(\cdot,\cdot,\alpha)$; its solution is computed in Theorem 1 of Paper V and yields a candidate for problem (2.8). The final step would be to verify that the found candidate is actually the correct solution of (2.8). However, this can be done in exactly the same way as in [21] (i.e., by an application of the Itô formula together with the optional sampling theorem).

Of particular interest is the limiting behavior of $V(\alpha)$ as $\alpha \searrow 0$. Since $X^{(0)}$ is a Brownian motion (and thus a martingale), we have $V(0) = 0$. On the other hand, if we consider the stopping time
\[
\tau =
\begin{cases}
1/2, & \text{if } X^{(\alpha)}_{1/2} > 0,\\
1, & \text{otherwise},
\end{cases}
\]
then there exists a constant $c > 0$ such that
\[
\mathbb{E}\, X^{(\alpha)}_\tau = \mathbb{E}\big[X^{(\alpha)}_{1/2}\, I(X^{(\alpha)}_{1/2} > 0)\big] > c
\]
for all $\alpha < 1$. Hence, we cannot expect that $V(\cdot)$ is continuous at 0. The details of the limiting behavior are given in Theorem 2 of Paper V.
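The constant $c$ can be made explicit by a short computation (not taken from Paper V, but consistent with the variance formula implied by (1.12) and the identity $\mathbb{E}[Z\,I(Z>0)] = \sigma/\sqrt{2\pi}$ for $Z \sim N(0,\sigma^2)$): since $X^{(\alpha)}_{1/2}$ is centered Gaussian with
\[
\sigma_\alpha^2 := \operatorname{Var}\big(X^{(\alpha)}_{1/2}\big) = \int_0^{1/2} \Big(\frac{1/2}{1-x}\Big)^{2\alpha} dx = \Big(\tfrac{1}{2}\Big)^{2\alpha}\,\frac{1-(1/2)^{1-2\alpha}}{1-2\alpha} \qquad (\alpha \neq \tfrac{1}{2}),
\]
which is decreasing in α and equals $1/4$ at $\alpha = 1$, one gets
\[
\mathbb{E}\, X^{(\alpha)}_\tau = \mathbb{E}\big[X^{(\alpha)}_{1/2}\, I(X^{(\alpha)}_{1/2} > 0)\big] = \frac{\sigma_\alpha}{\sqrt{2\pi}} \ge \frac{1}{2\sqrt{2\pi}} > 0 \qquad \text{for } 0 < \alpha \le 1,
\]
so $c = 1/(2\sqrt{2\pi})$ works.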


3. Summary in Swedish

This thesis studies the modeling of Gaussian stochastic processes, in particular Gaussian bridges and membranes, and inference for the α-Brownian bridge. The work belongs to the area of mathematical statistics, but throughout we also use tools from functional analysis.

An important special case in all five papers is the Brownian bridge, which arises by conditioning Brownian motion to attain the value 0 at time 1. The process is particularly important in asymptotic statistics, since empirical distribution functions of independent random variables asymptotically behave like the Brownian bridge. In particular, a detailed analysis of the supremum of the Brownian bridge underlies the Kolmogorov-Smirnov test, which is a central method in theoretical statistics. In all papers we study different generalizations of the Brownian bridge.

In Paper I we treat continuous Gaussian processes whose realizations are controlled by a given conditioning functional A. We give series expansions for the unconditioned process X and the conditioned process X^(A). By applying the Cameron-Martin theorem, we use these expansions to prove an equivalence criterion for the distributions induced by X and X^(A). Under certain conditions on the process X and the functional A, we derive explicit canonical representations for X^(A). Finally, we discuss generalizations to Gaussian random variables taking values in separable Banach spaces.

In Paper II we present a unified framework for constructing selfsimilar Gaussian random fields. These fields are indexed by Schwartz functions or by a wider class of signed measures on R^d and can be parametrized by a so-called Hurst index. We show how an extraction method can be used to construct Gaussian processes on R^d by evaluating these fields on a family of measures indexed by R^d. In particular, we show how to obtain different representations of fractional Brownian motion, generalized Gaussian bridges, and Gaussian membranes. The latter are Gaussian processes defined on a bounded domain which attain the value zero on the boundary. We also study local self-similarity of such membranes.

The α-Brownian bridge is a generalization of the Brownian bridge that uses a scaling parameter α ≥ 0 which determines the rate of convergence to 0 at time 1. Inference for the scaling parameter α based on a realization up to time T < 1 has previously been studied in the literature, but only for values of T close to 1.

In Paper III we consider the statistical decision problem H0: α = α0 vs. H1: α = α1 for α-Brownian bridges. We show that the relevant likelihood ratio can be written as a squared L2-norm of the process under a certain measure μ. We generalize the Karhunen-Loève theorem and compute the Karhunen-Loève expansion of the α-Brownian bridge under the measure μ. Based on this expansion, we obtain the distribution of the likelihood ratio by an application of Smirnov's formula. This leads to optimal tests for all 0 < T ≤ 1. We also discuss generalizations of this approach to Ornstein-Uhlenbeck processes.

In Paper IV the bias of the maximum likelihood estimator for α is computed. It turns out that the estimator has a considerable bias when T is not close to 1. We therefore propose a bias-corrected maximum likelihood estimator and compare it with the uncorrected one and with two alternative Bayesian estimators in a simulation study.

In the concluding Paper V we consider optimal stopping problems for the α-Brownian bridge. We follow the classical approach of optimal stopping theory by studying a corresponding free boundary PDE. Solving the free boundary problem yields a candidate solution for the optimal stopping problem, which is then verified to be the true solution. We also study how the non-continuity of the α-Brownian bridge at time 1, as α tends to 0, affects the solution of the optimal stopping problem.

28

Page 29: Gaussian Bridges - Modeling and Inference749343/FULLTEXT01.pdfof probability theory and stochastic processes, whereas in Papers III – V we study inference for a continuous time stochastic

Acknowledgements

I would like to express my deepest gratitude to my supervisor Ingemar Kaj for his support and encouragement throughout my graduate studies. It was only due to his patient guidance that my chaotic first drafts finally turned into (hopefully) coherent papers.

I am also indebted to my second supervisor Svante Janson, who, with his tremendous knowledge, always got me back on track whenever I got stuck (in particular with Paper I).

I would like to thank Allan Gut for always caring about me like a mentor, for reading most of my manuscripts, and for drawing my attention to reference [46], which eventually led to Paper III.

Katja, thank you for your friendship and for sharing your lunch breaks with me. I wish you all the best in the final year of your graduate studies and beyond.

Måns, you have been a great friend, office mate, and co-author. Thank you very much for your companionship. I wish you only the best as well.

I would like to thank the past and present members of the MatStat group for creating a great working atmosphere. I enjoyed uncounted cakes with Fredrik, Ioannis, Jesper, Saeid, Silvelyn, and all the others.

I think the decision about which subject one specializes in is to a large extent influenced by the teachers one has. I am very grateful to Werner Linde from the University of Jena for awakening my interest in stochastic processes.

Finishing this thesis is only the preliminary end of a long journey. I would like to thank my parents and my sister for their love and support over all the years.

Finally, this thesis is dedicated to Bine and Milo. Being with you is my greatest joy and with you I can be myself completely. I am glad to have you by my side for everything to come.


References

[1] L. Alili. Canonical decompositions of certain generalized Brownian bridges. Electron. Comm. Probab., 7:27–36 (electronic), 2002.
[2] C. A. Ball and W. N. Torous. Bond price dynamics and options. The Journal of Financial and Quantitative Analysis, 18(4):517–531, 1983.
[3] M. Barczy and E. Iglói. Karhunen-Loève expansions of α-Wiener bridges. Cent. Eur. J. Math., 9(1):65–84, 2011.
[4] M. Barczy and P. Kern. General alpha-Wiener bridges. Commun. Stoch. Anal., 5(3):585–608, 2011.
[5] M. Barczy and G. Pap. α-Wiener bridges: singularity of induced measures and sample path properties. Stoch. Anal. Appl., 28(3):447–466, 2010.
[6] M. Barczy and G. Pap. Asymptotic behavior of maximum likelihood estimator for time inhomogeneous diffusion processes. J. Statist. Plann. Inference, 140(6):1576–1593, 2010.
[7] M. Barczy and G. Pap. Explicit formulas for Laplace transforms of certain functionals of some time inhomogeneous diffusions. J. Math. Anal. Appl., 380(2):405–424, 2011.
[8] F. Baudoin. Conditioned stochastic differential equations: theory, examples and application to finance. Stochastic Process. Appl., 100:109–145, 2002.
[9] F. Baudoin and L. Coutin. Volterra bridges and applications. Markov Process. Related Fields, 13(3):587–596, 2007.
[10] H. Biermé, A. Estrade, and I. Kaj. Self-similar random fields and rescaled random balls models. J. Theoret. Probab., 23(4):1110–1141, 2010.
[11] P. Billingsley. Statistical inference for Markov processes. Statistical Research Monographs, Vol. II. The University of Chicago Press, Chicago, Ill., 1961.
[12] V. I. Bogachev. Gaussian measures, volume 62 of Mathematical Surveys and Monographs. American Mathematical Society, Providence, RI, 1998.
[13] M. J. Brennan and E. S. Schwartz. Arbitrage in stock index futures. The Journal of Business, 63(1):S7–S31, 1990.
[14] B. M. Brown and J. I. Hewitt. Asymptotic likelihood theory for diffusion processes. Journal of Applied Probability, 12(2):228–238, 1975.
[15] F. P. Cantelli. Sulla determinazione empirica delle leggi di probabilità. Giorn. Ist. Ital. Attuari, 4:221–424, 1933.
[16] S. Cohen and J. Istas. Fractional fields and applications, volume 73 of Mathématiques & Applications. Springer, Heidelberg, 2013.
[17] S. Corlay. Partial functional quantization and generalized bridges. Bernoulli, 20(2):716–746, 2014.
[18] N. Cressie, A. S. Davis, J. L. Folks, and G. E. Policello. The moment-generating function and negative integer moments. Amer. Statist., 35(3):148–150, 1981.
[19] P. Deheuvels. A Karhunen-Loève expansion for a mean-centered Brownian bridge. Statist. Probab. Lett., 77(12):1190–1200, 2007.
[20] R. L. Dobrushin. Gaussian and their subordinated self-similar random generalized fields. Ann. Probab., 7(1):1–28, 1979.
[21] E. Ekström and H. Wanntorp. Optimal stopping of a Brownian bridge. J. Appl. Probab., 46(1):170–180, 2009.
[22] K. Es-Sebaiy and I. Nourdin. Parameter estimation for α-fractional bridges. In Malliavin calculus and stochastic analysis, volume 34 of Springer Proc. Math. Stat., pages 385–412. Springer, New York, 2013.
[23] K. Ford. From Kolmogorov's theorem on empirical distribution to number theory. In Kolmogorov's heritage in mathematics, pages 97–108. Springer, Berlin, 2007.
[24] D. Gasbarra, T. Sottinen, and E. Valkeila. Gaussian bridges. In Stochastic analysis and applications, volume 2 of Abel Symp., pages 361–382. Springer, Berlin, 2007.
[25] V. I. Glivenko. Sulla determinazione empirica della legge di probabilità. Giorn. Ist. Ital. Attuari, 4:92–99, 1933.
[26] U. Grenander. Stochastic processes and statistical inference. Ark. Mat., 1:195–277, 1950.
[27] T. Hida and M. Hitsuda. Gaussian processes, volume 120 of Translations of Mathematical Monographs. American Mathematical Society, Providence, RI, 1993.
[28] J. S. Horne, E. O. Garton, S. M. Krone, and J. S. Lewis. Analyzing animal movements using Brownian bridges. Ecology, 88:2354–2363, 2007.
[29] I. A. Ibragimov and Y. A. Rozanov. Gaussian random processes, volume 9 of Applications of Mathematics. Springer-Verlag, New York-Berlin, 1978.
[30] I. Kaj and M. S. Taqqu. Convergence to fractional Brownian motion and to the Telecom process: the integral representation approach. In In and out of equilibrium. 2, volume 60 of Progr. Probab., pages 383–427. Birkhäuser, Basel, 2008.
[31] A. N. Kolmogorov. Sulla determinazione empirica di una legge di distribuzione. Giorn. Ist. Ital. Attuari, 4:83–91, 1933.
[32] W. V. Li and W. Linde. Approximation, metric entropy and small ball estimates for Gaussian measures. Ann. Probab., 27(3):1556–1578, 1999.
[33] M. Lifshits. Lectures on Gaussian processes. Springer Briefs in Mathematics. Springer, Heidelberg, 2012.
[34] R. S. Liptser and A. N. Shiryaev. Statistics of random processes. I, volume 5 of Applications of Mathematics (New York). Springer-Verlag, Berlin, expanded edition, 2001.
[35] R. S. Liptser and A. N. Shiryaev. Statistics of random processes. II, volume 6 of Applications of Mathematics (New York). Springer-Verlag, Berlin, expanded edition, 2001.
[36] M. Loève. Probability theory. Third edition. D. Van Nostrand Co., Inc., Princeton, N.J.-Toronto, Ont.-London, 1963.
[37] R. Mansuy. On a one-parameter generalization of the Brownian bridge and associated quadratic functionals. J. Theoret. Probab., 17(4):1021–1029, 2004.
[38] G. V. Martynov. Computation of the distribution functions of quadratic forms of normal random variables. Theory Probab. Appl., 20(4):797–809, 1975.
[39] G. Peskir and A. Shiryaev. Optimal stopping and free-boundary problems. Lectures in Mathematics ETH Zürich. Birkhäuser Verlag, Basel, 2006.
[40] N. V. Smirnov. On the distribution of the von Mises ω²-test. Matem. Sb., 2(1):973–993, 1937. In Russian.
[41] N. V. Smirnov. On the estimation of the discrepancy between empirical curves of distribution for two independent samples. Bulletin of Moscow University, 2:3–16, 1939. In Russian.
[42] D. Sondermann, M. Trede, and B. Wilfling. Estimating the degree of interventionist policies in the run-up to EMU. Applied Economics, 43(2):207–218, 2009.
[43] T. Sottinen and A. Yazigi. Generalized Gaussian bridges. Stochastic Process. Appl., 124(9):3084–3105, 2014.
[44] M. Trede and B. Wilfling. Estimating exchange rate dynamics with diffusion processes: an application to Greek EMU data. Empirical Economics, 33(1):23–39, 2007.
[45] S. Zhao and Q. Liu. Large deviations for parameter estimators of α-Brownian bridge. J. Statist. Plann. Inference, 142(3):695–707, 2012.
[46] S. Zhao and Y. Zhou. Sharp large deviations for the log-likelihood ratio of an α-Brownian bridge. Statist. Probab. Lett., 83(12):2750–2758, 2013.
