Title: Stochastic Processes II - Lecture Notes
Description: Columbia Business School - First Year of the Doctoral Program in Decisions, Risk and Operations Notes from Prof David Yao's "Stochastic Processes II".
Stochastic Processes II
PART I – MARTINGALES
Conditional expectations
Measure theory
o In a probability space \((\Omega, \mathcal{F}, \mathbb{P})\), a sigma-field \(\mathcal{F}\) is a collection of events, each of which is a subset of \(\Omega\)
...
Notes:
o (i) \(\Omega \in \mathcal{F}\); (ii) \(\mathcal{F}\) is closed under countable unions, \(A = \bigcup_{i=1}^{\infty} A_i \in \mathcal{F}\); by De Morgan, \(\bigcap_{i=1}^{\infty} A_i = \big(\bigcup_{i=1}^{\infty} A_i^c\big)^c\), so \(\mathcal{F}\) is also closed under infinite (and finite) intersection
...
When we say X is measurable with respect to \(\mathcal{F}\) and write \(X \in \mathcal{F}\), we mean \(\{\omega : X(\omega) \le x\} \in \mathcal{F}\ \forall x\)
...
\(\mathbb{E}(X \mid Y)(\omega) = \mathbb{E}\big(X \mid Y = Y(\omega)\big)\)
...
We then find the expected value of X given we are in that “region”
...
o \(W = \mathbb{E}(X \mid \mathcal{G})\) is a random variable
...
In other words, let A be the smallest element of \(\mathcal{G}\) that contains \(\omega\) – then we restrict ourselves to some region of \(\Omega\) and find the expectation over that region: \(\mathbb{E}(X \mid \mathcal{G})(\omega) = \mathbb{E}(X \mid A)\)
...
\(\mathbb{E}(W 1_A) = \mathbb{E}(X 1_A)\) for all \(A \in \mathcal{G}\): we are now restricting ourselves to a region of \(\Omega\) that is \(\mathcal{G}\)-measurable
...
(If it is not the smallest element, the result requires additional thought)
...
i. \(\mathbb{E}[X \mid \mathcal{G}] = X\) if \(X \in \mathcal{G}\)
ii. ...
iii. \(\mathbb{E}(XZ \mid \mathcal{G}) = Z\,\mathbb{E}(X \mid \mathcal{G})\) if \(Z \in \mathcal{G}\)
iv. ...
Proof: Use \(\mathbb{E}\big[\mathbb{E}(X \mid \mathcal{G}) \mid A\big] = \mathbb{E}(X \mid A)\) for \(A \in \mathcal{G}\)
...
v. Jensen’s: for convex f, \(\mathbb{E}\big[f(X) \mid \mathcal{G}\big] \ge f\big(\mathbb{E}[X \mid \mathcal{G}]\big)\)
o Notes
\(\mathbb{E}[X] = \mathbb{E}\big[X \mid \{\varnothing, \Omega\}\big]\) (the RHS is a constant, because whatever \(\omega\) we choose, the only element of \(\{\varnothing, \Omega\}\) that contains it will be \(\Omega\))
...
(vi) Integrability of X implies integrability of \(\mathbb{E}[X \mid \mathcal{G}]\):
\(\mathbb{E}\big|\mathbb{E}(X \mid \mathcal{G})\big| \le \mathbb{E}\big[\mathbb{E}(|X| \mid \mathcal{G})\big] = \mathbb{E}|X|\)
o Example: Let \(\Omega\) be countable
...
Then \(\mathbb{E}(X \mid \mathcal{G})\) takes value \(\mathbb{E}(X \mid A_i)\) with probability \(\mathbb{P}(A_i)\)
...
Also, \(\mathbb{E}\big[\mathbb{E}(X \mid \mathcal{G})\,1_A\big]\) is the expected value over those \(A_i \subseteq A\)
...
Martingales
Definition: \(\{X_n\}\) is a sub-martingale with respect to \(\{\mathcal{F}_n\}\) (where \(\mathcal{F}_n \subseteq \mathcal{F}_{n+1}\)) if
i. \(\mathbb{E}(X_n^+) < \infty\) [it is often convenient to work with the stronger condition \(\mathbb{E}|X_n| < \infty\)]
...
ii. \(\mathbb{E}\big[X_{n+1} \mid \mathcal{F}_n\big] \ge X_n\) [\(\le\) gives a super-martingale, \(=\) gives a martingale]
...
o An increasing convex function of a submartingale is a submartingale
...
\(\mathbb{E}\big[f(X_{n+1}) \mid \mathcal{F}_n\big] \ge f\big(\mathbb{E}[X_{n+1} \mid \mathcal{F}_n]\big) \ge f(X_n)\)
...
o If \(\mathrm{Var}(X_i) = \sigma^2 < \infty\), then \(S_n^2 - n\sigma^2\) is a martingale (the variance martingale)
...
For example, if \(S_n = \sum_{i=1}^{n} X_i\) with \(p = \mathbb{P}(X_i = 1) = 1 - \mathbb{P}(X_i = -1)\), then \(M_n = \big(\tfrac{1-p}{p}\big)^{S_n}\) is a martingale (take \(e^{\theta} = \tfrac{1-p}{p}\), so that \(\mathbb{E}\,e^{\theta X_1} = 1\))
...
Example: Suppose an urn starts with one black and one white ball
...
\(Y_n\), the proportion of white balls after n draws, is a martingale (mean ½)
...
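A quick simulation sketch (assuming the standard Polya-urn dynamics: the drawn ball is returned together with an extra ball of the same colour) illustrates the constant martingale mean:

```python
import random

def polya_white_proportion(n_draws, rng):
    """Simulate a Polya urn starting with 1 white and 1 black ball.

    At each draw, a uniformly chosen ball is returned together with one
    extra ball of the same colour.  Returns the final white proportion.
    """
    white, black = 1, 1
    for _ in range(n_draws):
        if rng.random() < white / (white + black):
            white += 1
        else:
            black += 1
    return white / (white + black)

rng = random.Random(0)
samples = [polya_white_proportion(50, rng) for _ in range(20000)]
mean = sum(samples) / len(samples)
print(round(mean, 2))  # close to the martingale mean 1/2
```

The individual proportions are spread over (0, 1), but their average stays at ½, exactly as the martingale property predicts.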
\(\{h(X_n)\}\) is then a martingale
...
o ... Choose ...
This implies \(\lim_{n\to\infty} \mathbb{E}(X_n) = \mathbb{E}(X)\), since \(|\mathbb{E}(X_n - X)| \le \mathbb{E}|X_n - X|\), by Jensen
...
Interchange arguments
o Concerned with whether \(X_n \to X\) a.s. implies \(\lim_{n\to\infty} \mathbb{E}(X_n) = \mathbb{E}(X)\)
...
Proof: Define \(B_n^c = A_n = \{|X_n - X| < \epsilon\}\)
...
Use the fact that \(\mathbb{P}(B_n) \to 0\)
...
Monotone Convergence: \(0 \le X_n \uparrow X\) almost surely \(\Rightarrow \mathbb{E}(X_n) \uparrow \mathbb{E}(X)\)
Proof: \(X_n \le X \Rightarrow \mathbb{E}(X_n) \le \mathbb{E}(X) \Rightarrow \limsup_n \mathbb{E}(X_n) \le \mathbb{E}(X)\)
...
Together, the two bounds give the result
Dominated convergence: \(|X_n| \le Y\), Y integrable
...
Proof: Apply Fatou to both \(Y + X_n\) and \(Y - X_n\), and subtract \(\mathbb{E}(Y) < \infty\) from both sides
...
Replace \(Y_m\) with \(\min[Y_m, k]\) – bounded because the \(X_n\)’s are positive
...
Uniform integrability
o Definition: \(\{X_n\}\) is uniformly integrable if
\(\lim_{a\to\infty} \sup_n \mathbb{E}\big[|X_n|\,1_{|X_n| > a}\big] = 0\)
i.e. \(\forall \epsilon > 0, \exists a_0\) s.t. \(\mathbb{E}\big[|X_n|\,1_{|X_n| > a}\big] \le \epsilon\ \forall n, a \ge a_0\)
...
o Lemma: \(\{X_n\}\) u.i. \(\Rightarrow \sup_n \mathbb{E}|X_n| \le K\)
Proof: Write \(\mathbb{E}|X_n| = \mathbb{E}\big(|X_n|\,1_{|X_n| \le a}\big) + \mathbb{E}\big(|X_n|\,1_{|X_n| > a}\big) \le a + \epsilon\)
...
i. ...
\(\mathbb{E}[Y] = \mathbb{E}\big[Y\,1_{|Y| \le a}\big] + \mathbb{E}\big[Y\,1_{|Y| > a}\big]\)
...
o Lemma: \(\exists \delta > 0\) s.t. \(\mathbb{E}|X_n|^{1+\delta} \le K < \infty \Rightarrow \{X_n\}\) u.i.
Proof: \(\mathbb{E}\big(|X_n|\,1_{|X_n| > a}\big) \le \mathbb{E}\Big(|X_n|\,\big[\tfrac{|X_n|}{a}\big]^{\delta}\,1_{|X_n| > a}\Big) = \frac{1}{a^{\delta}}\,\mathbb{E}\big(|X_n|^{1+\delta}\,1_{|X_n| > a}\big) \le \frac{K}{a^{\delta}}\)
...
o Theorem: \(\{X_n\}\) u.i. & \(X_n \to_p X \Rightarrow X_n \to_{L^1} X\)
Proof: First, show X is integrable by writing \(\mathbb{E}|X| = \mathbb{E}\big[\liminf_n |X_n|\big]\), using Fatou’s, writing \(\liminf \le \sup\) and using the definition of u.i.
...
Then, set \(Y_n = |X_n - X| \le |X_n| + |X|\) – it is u.i. because X is integrable
...
The first one can be made arbitrarily small by u.i.; bounded convergence applies to the second one, and since \(Y_n \to_p 0\), it can also be made arbitrarily small by letting \(n \to \infty\)
...
i. ...
Clearly \(x\,1_{|x| \le a-1} \le f_a(x) \le x\,1_{|x| \le a}\)
...
By \(L^1\) convergence (first term) and monotone convergence (second term), this can be made arbitrarily close to \(\mathbb{E}|X| - \mathbb{E}\big[f_a(X)\big]\)
o \(\le \mathbb{E}|X| - \mathbb{E}\big[|X|\,1_{|X| \le a-1}\big]\)
...
It can also be shown that \(\{X_n\}\) u.i. if and only if \(\mathbb{P}(A) < \delta \Rightarrow \mathbb{E}\big[|X_n|\,1_A\big] < \epsilon\) and \(\sup_n \mathbb{E}|X_n| < \infty\)
...
For the “if” part, use Markov’s Inequality \(\big(\sup_{n \ge 1} \mathbb{P}(|X_n| > a) \to 0\big)\), and then use the assumption
...
When arrows enter at different points, each arrow independently is sufficient to prove the bubble
...
Daniel Guetta
[Diagram: implication map between convergence conditions. \(\{X_n\}\) martingale with \(\sup_n \mathbb{E}|X_n| \le K \Rightarrow X_n \to X\) a.s.; \(\sup_n \mathbb{E}|X_n|^{1+\delta} \le K \Rightarrow \{X_n\}\) u.i.; \(\mathbb{E}\big[\sup_n |X_n|\big] < \infty \Rightarrow \{X_n\}\) u.i.; \(\{X_n\}\) u.i. together with \(X_n \to_p X\) \(\Leftrightarrow\) \(X_n \to_{L^1} X\), which gives \(\mathbb{E}(X_n) \to \mathbb{E}(X)\). The three interchange results (\(|X_n| \le Y\) with \(\mathbb{E}Y < \infty\); \(0 \le X_n \uparrow X\); \(|X_n| \le c\)) additionally require \(X_n \to_p X\).]
Optional stopping
Definition (stopping time): If T is an integer-valued random variable, we say it is a stopping time with respect to a filtration \(\mathcal{F}_n\) if \(\{T = n\} \in \mathcal{F}_n\) for all n (or \(\{T \le t\} \in \mathcal{F}_t\) for all t, in continuous time)
...
If \(\{X_n\}\) is a (sub)martingale and T is a stopping time, \(\{X_{T \wedge n}\}\) is also a (sub)martingale
...
i. ...
Proof: Write \(X_{T \wedge n} = \sum_{k < n} X_k\,1_{T = k} + X_n\,1_{T \ge n}\). Conditioning follows
...
Taking limits, \(\sup_n \mathbb{E}\big(X_{T \wedge n}^+\big) \le \sup_n \mathbb{E}\big(X_n^+\big) < \infty\) [the last inequality follows by u.i. of \(\{X_n\}\)]
...
Finally, consider \(\mathbb{E}\big[|X_{T \wedge n}|\,1_{|X_{T \wedge n}| > a}\big]\)
...
Example: Consider a gambler’s ruin with wealth \(S_t\) at time t, with \(S_0 = i\), and with probability ½ of going in each direction at each time step
...
Logic says that \(\mathbb{E}(S_T) = pN + 0 \cdot (1 - p)\)
...
o On \(S_t^2 - t\): the OST says that \(\mathbb{E}(S_T^2 - T) = \mathbb{E}(S_0^2) = i^2\), and so \(\mathbb{E}(T) = \mathbb{E}(S_T^2) - i^2\)
...
Together, \(\mathbb{E}(T) = i(N - i)\)
...
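Both OST conclusions are easy to confirm by simulation (a sketch; i = 3 and N = 10 are illustrative choices):

```python
import random

def ruin_episode(i, N, rng):
    """Run one symmetric gambler's ruin from wealth i until 0 or N is hit.

    Returns (hit_top, steps), where hit_top indicates reaching N first.
    """
    s, steps = i, 0
    while 0 < s < N:
        s += 1 if rng.random() < 0.5 else -1
        steps += 1
    return s == N, steps

rng = random.Random(1)
i, N, reps = 3, 10, 20000
results = [ruin_episode(i, N, rng) for _ in range(reps)]
p_hat = sum(hit for hit, _ in results) / reps      # theory: i/N = 0.3
t_hat = sum(steps for _, steps in results) / reps  # theory: i*(N-i) = 21
print(round(p_hat, 2), round(t_hat, 1))
```

The empirical hitting probability and mean exit time land on top of \(i/N\) and \(i(N-i)\).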
This is well defined, in that \(\mathbb{P}(T' < \infty) = 1\) (because the random walk is an irreducible recurrent Markov chain, which means every state will eventually be visited), but blindly applying the OST gives \(\mathbb{E}(S_{T'}) = i\), which implies that N = i
...
We need to develop conditions under which the OST works
...
Proof: \(\mathbb{E}[X_T] = \mathbb{E}[X_{T \wedge n_0}] = \mathbb{E}[X_0]\) [for T bounded by some \(n_0\)]
Remark: This condition is not satisfied in the counterexample above because even though \(\mathbb{P}(T' < \infty) = 1\), \(T'\) is not bounded – following the MC analogy, \(\mathbb{E}(T') = \infty\)
...
This equality holds for every n (since \(X_{T \wedge n}\) is also a martingale) and so it still holds after we take the limit
...
This equality requires an interchange argument – uniform integrability of \(\{X_{T \wedge n}\}\) will achieve this, for example
...
This equality requires the martingale to converge
...
Example: Let us return to the example above, and justify applying the OST in retrospect:
o In the first case, \(|S_{T \wedge n}| \le a \vee b\ \forall n\), and so \(\sup_n |S_{T \wedge n}| \le a \vee b\)
...
(2) \(\mathbb{E}\big[S_{T \wedge n}^2 - (T \wedge n)\big] = \mathbb{E}\big[S_0^2 - 0\big] = 0 \Rightarrow \mathbb{E}\big[S_{T \wedge n}^2\big] = \mathbb{E}(T \wedge n)\)
...
Since \(S_T\) is two-valued and has finite expectation, T has finite expectation
...
Example (Wald’s Identity): Let \(\{X_i\}\) be IID with \(\mathbb{E}|X_i| < \infty\) and \(\mathbb{E}(X_i) = \mu\)
...
Then \(\mathbb{E}(S_T) = \mathbb{E}(T)\,\mathbb{E}(X_1)\)
...
Applying the OST gives the required result, given that \(\mathbb{E}\big(|S_n - S_{n-1}|\ \big|\ \mathcal{F}_{n-1}\big) = \mathbb{E}|X_n| < \infty\)
...
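Wald’s identity is easy to check numerically; the sketch below uses Exp(1) summands and the stopping time \(T = \inf\{n : S_n > 5\}\) (both illustrative choices, not from the notes):

```python
import random

def wald_sample(rng, level=5.0, mean_x=1.0):
    """Sum IID Exp(mean_x) steps until the partial sum first exceeds `level`.

    Returns (S_T, T) for the stopping time T = inf{n : S_n > level}.
    """
    s, n = 0.0, 0
    while s <= level:
        s += rng.expovariate(1.0 / mean_x)
        n += 1
    return s, n

rng = random.Random(2)
reps = 40000
draws = [wald_sample(rng) for _ in range(reps)]
e_st = sum(s for s, _ in draws) / reps
e_t = sum(n for _, n in draws) / reps
# Wald: E(S_T) = E(T) * E(X_1), and E(X_1) = 1 here
print(round(e_st, 2), round(e_t, 2))
```

The two averages agree, as Wald’s identity requires.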
We would like to find \(p_b = \mathbb{P}(S_T = b)\)
...
Note that \(0 \le M_{T \wedge n} = r^{S_{T \wedge n}} \le r^{a+b} \vee 1\); it is therefore bounded, and we can apply the OST to deduce that \(1 = \mathbb{E}\big(r^{S_0}\big) = \mathbb{E}\big(r^{S_T}\big)\)
...
This allows us to find \(p_b\)
...
The OST gives \(\mathbb{E}\big[S_T - T(p - q)\big] = 0 \Rightarrow \mathbb{E}(S_T) = (p - q)\,\mathbb{E}(T)\)
...
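Concretely, with \(r = q/p\) and barriers \(-a\) and \(b\), \(\mathbb{E}(r^{S_T}) = 1\) forces \(p_b\,r^b + (1 - p_b)\,r^{-a} = 1\), i.e. \(p_b = (1 - r^{-a})/(r^b - r^{-a})\); a simulation sketch (p = 0.6 and a = b = 4 are illustrative choices):

```python
import random

# Asymmetric simple random walk from 0: +1 w.p. p, -1 w.p. q = 1 - p.
p, a, b = 0.6, 4, 4
q = 1.0 - p
r = q / p                       # makes r**S_n a martingale

# Closed form from E[r**S_T] = 1 with S_T in {-a, b}:
p_b = (1 - r**(-a)) / (r**b - r**(-a))

rng = random.Random(3)
reps = 40000
hits = 0
for _ in range(reps):
    s = 0
    while -a < s < b:
        s += 1 if rng.random() < p else -1
    hits += (s == b)
print(round(p_b, 3), round(hits / reps, 3))  # closed form vs simulation
```

The empirical exit probability matches the OST closed form.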
Note that the OST does not necessarily require \(\mathbb{P}(T < \infty) = 1\)
...
It is important to remember that when we invoke the martingale convergence theorem to say that \(X_{T \wedge n} \to_{n\to\infty} X_T\), we are implicitly assuming that \(\mathbb{P}(T < \infty) = 1\)
...
o Theorem: If \(\{X_n\}\) is a submartingale and \(A = \{\max_{0 \le k \le n} X_k \ge a\}\), then
\(a\,\mathbb{P}(A) \le \mathbb{E}[X_n\,1_A] \le \mathbb{E}[X_n^+]\)
Proof: Define \(T = \inf\{k : X_k \ge a \text{ or } k \ge n\}\)
...
Write both sides of the equation by splitting over A and \(A^c\), and note \(X_T = X_n\) over \(A^c\)
...
Finally, note that \(X_n \le X_n^+ \Rightarrow \mathbb{E}[X_n\,1_A] \le \mathbb{E}[X_n^+]\)
...
As such, we get Kolmogorov’s Inequality:
\(\mathbb{P}\big(\max_{0 \le k \le n} |S_k| \ge x\big) = \mathbb{P}\big(\max_{0 \le k \le n} S_k^2 \ge x^2\big) \le \frac{\mathbb{E}[S_n^2]}{x^2} = \frac{n\sigma^2}{x^2}\)
Azuma’s Inequality
o Theorem: \((x^{-1} - x^{-3})\,\varphi(x) \le \overline{\Phi}(x) \le x^{-1}\,\varphi(x)\) for all x > 0 (where \(\varphi\) is the normal density function)
...
For the upper bound, multiply the integrand by \((1 + y^{-2})\)
...
Integrating gives the required result
...
Define \(Y_n = \frac{S_n}{n} - \mu\)
...
(It is, by the way, summable, so we automatically recover the SLLN)
...
Then
\(\mathbb{P}\big(|Z_n| > n\epsilon\big) \le 2\exp\Big(-\frac{2n\epsilon^2}{(a + b)^2}\Big)\)
This bound is not as tight as the CLT’s, but it requires less
...
\(Z_n = S_n - np\) is a martingale with \(-p \le Z_i - Z_{i-1} \le 1 - p\)
...
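For the Bernoulli martingale just described, the increments lie in \([-p, 1-p]\), so a + b = 1 and the bound reads \(\mathbb{P}(|Z_n| > n\epsilon) \le 2e^{-2n\epsilon^2}\); a quick empirical check (parameters illustrative):

```python
import math
import random

# Z_n = S_n - n p for Bernoulli(p) steps: a martingale with increments
# in [-p, 1 - p], so Azuma gives P(|Z_n| > n*eps) <= 2 exp(-2 n eps^2).
p, n, eps = 0.5, 200, 0.1
bound = 2 * math.exp(-2 * n * eps**2)

rng = random.Random(4)
reps = 20000
exceed = 0
for _ in range(reps):
    z = sum((rng.random() < p) - p for _ in range(n))
    exceed += abs(z) > n * eps
freq = exceed / reps
print(freq <= bound)  # the empirical frequency respects the bound
```

As the notes say, the bound is valid but loose: the empirical tail frequency is well below it.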
o Definition (Doob Martingale): Let X be a random variable in \(L^1\) and \(\{\mathcal{F}_n\}\) be a filtration
...
Proof: \(\mathbb{E}(X_{n+1} \mid \mathcal{F}_n) = \mathbb{E}\big[\mathbb{E}(X \mid \mathcal{F}_{n+1}) \mid \mathcal{F}_n\big] = \mathbb{E}(X \mid \mathcal{F}_n) = X_n\)
...
Define \(\mathcal{F}_i = \sigma(X_1, \ldots, X_i)\)
...
Then \(S_i = \mathbb{E}\big[h(X) \mid \mathcal{F}_i\big]\) is a Doob martingale
...
o
Example: Consider a system of n components (indexed by i) and m experiments (indexed by j)
...
For experiment j to work, all the components in the set \(A_j\) must work, and we assume each component is involved in at most 3 experiments
...
We then have
\(Y = \sum_{j=1}^{m} 1\{\text{Experiment } j \text{ can be performed}\} = h(X), \qquad \mathbb{E}(Y) = \sum_{j=1}^{m} \prod_{i \in A_j} p_i\)
Since \(|h(X) - h(Y)| \le 3\), we can apply Azuma’s
...
This is stronger than measurability; it requires \(C_n\) to be predictable from the previous step’s information
...
We write
\((C \cdot X)_n = \sum_{k=1}^{n} C_k (X_k - X_{k-1}), \qquad (C \cdot X)_0 = 0\)
Intuition: \(X_n\) may be the state of a market and \(C_n\) the action we take at each step
...
Proof: Measurability is obvious, and integrability follows from the boundedness of \(C_n\)
...
Using predictability and the submartingale property leads to the result
...
For the latter, predictability is not required, since the expectation is 0 anyway
...
Similarly, we can write ... the transformation with \(\bar{C}_n = 1 - C_n\)
...
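The transform is tiny in code; the sketch below pairs a symmetric random-walk martingale with the bounded, predictable rule \(C_k = 1\{X_{k-1} < 0\}\) (an illustrative choice) and checks that the transform keeps mean 0:

```python
import random

def transform_mean(n_steps, reps, rng):
    """Average (C . X)_n for a symmetric random walk X and the
    predictable rule C_k = 1{X_{k-1} < 0} (decided before step k)."""
    total = 0.0
    for _ in range(reps):
        x, value = 0, 0.0
        for _ in range(n_steps):
            c = 1.0 if x < 0 else 0.0          # uses only past information
            step = 1 if rng.random() < 0.5 else -1
            value += c * step                   # C_k (X_k - X_{k-1})
            x += step
        total += value
    return total / reps

rng = random.Random(5)
m = transform_mean(30, 40000, rng)
print(round(m, 3))  # mean of the transform stays near 0
```

No bounded, predictable betting rule can generate expected profit from a martingale, which is exactly the content of the theorem.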
We define \(T_0 = 0\) and
\(T_{2k-1} = \inf\{m > T_{2k-2} : X_m \le a\}\)
\(T_{2k} = \inf\{m > T_{2k-1} : X_m \ge b\}\)
Odd stopping times mark the first time the martingale downcrosses a since its last upcrossing of b
...
These are clearly stopping times
...
We effectively “buy” only if we’re below a and sell otherwise
...
o Theorem: Let \(U_n = \sup\{k : T_{2k} \le n\}\) – this is the number of upcrossings up to time n – and assume \(X_n\) is a submartingale
...
Remark: This is hardly very encouraging
...
Proof: WLOG, shift the upcrossing range to \([0, b-a]\), and define \(Y_n = (X_n - a)^+\) – this is also a submartingale and has the same number of upcrossings
...
Letting \(\bar{C}_n = 1 - C_n\), we get
\((C \cdot Y)_n + (\bar{C} \cdot Y)_n = \sum_{k=1}^{n} (Y_k - Y_{k-1}) = Y_n - Y_0\)
...
Thus, \(\mathbb{E}\big[(C \cdot Y)_n\big] \le \mathbb{E}[Y_n - Y_0]\)
...
Remark: \(\sup_n \mathbb{E}|X_n| < \infty\) is equivalent to \(\sup_n \mathbb{E}(X_n^+) < \infty\) because \(x^+ \le |x| = 2x^+ - x\), and so, for example, \(\mathbb{E}|X_n| = 2\,\mathbb{E}[X_n^+] - \mathbb{E}[X_n] \le 2\,\mathbb{E}[X_n^+] - \mathbb{E}[X_0]\)
...
However, \(U_n\) is increasing by definition and therefore tends to some U (possibly \(\infty\)), but by monotone convergence, \(\mathbb{E}(U_n) \uparrow \mathbb{E}(U) < \infty\)
...
This is true for any a and b, and so \(\mathbb{P}(\liminf X_n = \limsup X_n) = 1\)
...
Corollary: If \(\{X_n\}\) is a supermartingale and \(X_n \ge 0\), then \(X_n \to X\) and \(\mathbb{E}[X] \le \mathbb{E}[X_0]\)
...
The condition of the theorem above (see the remark) is therefore satisfied
...
Example: Assume \(X_0 = i > 0\), and \(X_n \mid X_{n-1} \sim \mathrm{Po}(X_{n-1})\)
...
Let \(T = \inf\{n : X_n = 0 \text{ or } X_n \ge b\}\)
...
But \(\mathbb{E}(X_T) = i\)
...
Thus, \(p \to 1\)
...
As such, \(\mathrm{Var}(X_n) = ni\); the variable itself tends to 0, but the variance blows up
...
Example: Let \(Y_n = 2^n X_n\)
...
This is a martingale, and, by the SLLN,
\(\frac{1}{n}\log Y_n = \log 2 + \frac{1}{n}\sum_{i=1}^{n}\log U_i \to \log 2 + \mathbb{E}(\log U_1) < 0\) a.s.
So \(Y_n \to 0\)
...
Again, the variance blows up
...
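A short simulation sketch (assuming the recursion \(X_n = U_n X_{n-1}\) with \(U_n \sim \mathrm{Uniform}(0,1)\), consistent with the SLLN computation above) shows the collapse:

```python
import random

rng = random.Random(7)
n, reps = 100, 2000

tiny = 0
for _ in range(reps):
    y = 1.0
    for _ in range(n):
        y *= 2.0 * rng.random()   # Y_n = 2^n * prod(U_i); E[2 U_i] = 1
    tiny += (y < 1e-3)

share_tiny = tiny / reps
print(share_tiny > 0.9)  # almost every path collapses despite E[Y_n] = 1
```

Each factor has mean 1, so \(\mathbb{E}[Y_n] = 1\) for every n, yet almost every sample path has vanished by n = 100 – the rare surviving paths carry all the mass, which is why the variance blows up.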
o Theorem: For a martingale \(\{X_n\}\), the following are equivalent:
i. \(\{X_n\}\) is u.i. (and therefore converges almost surely)
ii. \(X_n\) can be written as a Doob martingale; \(X_n = \mathbb{E}[X \mid \mathcal{F}_n]\), with \(\mathbb{E}|X| < \infty\)
...
Proof: (i) \(\Rightarrow\) (ii) follows from the fact that u.i. implies the boundedness condition in the convergence theorem
...
To show this is equal to \(X_n\), all we need to show is that \(\mathbb{E}[X_n\,1_A] = \mathbb{E}[X\,1_A]\) for any \(A \in \mathcal{F}_n\)
...
(1) Use the martingale property to write,
for any m > n, \(\mathbb{E}[X_n\,1_A] = \mathbb{E}\big[\mathbb{E}[X_m \mid \mathcal{F}_n]\,1_A\big] = \mathbb{E}\big[\mathbb{E}[X_m\,1_A \mid \mathcal{F}_n]\big] = \mathbb{E}[X_m\,1_A]\)
...
Letting \(m \to \infty\) in step 1 therefore gives the required result
...
Want to show that for any \(\epsilon\), there exists \(a_0\) such that \(\mathbb{E}\big[\,|\mathbb{E}[X \mid \mathcal{F}_n]|\,1_{|\mathbb{E}[X \mid \mathcal{F}_n]| > a}\big] < \epsilon\)
...
Integrability of X means that for any \(\epsilon > 0\), there exists a \(\delta > 0\) such that for any A with \(\mathbb{P}(A) < \delta\), \(\mathbb{E}(|X|\,1_A) \le \epsilon\)
...
Applying Markov’s, we get \(\mathbb{P}(A) \le \mathbb{E}|X| / a\)
...
Let \(X_n\) be the proportion of red balls in the urn after draw n
...
PART II – BROWNIAN MOTION
Introduction
A Brownian motion \(\{B(t)\}\), sometimes denoted \(\{B_t\}\), is a continuous process with independent and stationary increments which follow a normal distribution; in particular,
\(B_t \sim N(0, t)\)
A useful way to motivate BM is using a random walk \(S_0 = 0\), \(S_n = X_1 + \cdots + X_n\), with the \(X_n\) IID equally likely to be 1 and –1
...
We then have
\(S(t) = \Delta x\,(X_1 + \cdots + X_n), \qquad n = t/\Delta t, \qquad \mathrm{Var}\big(S(t)\big) = (\Delta x)^2\,\frac{t}{\Delta t}\)
As \(\Delta t \to 0\), this approaches Brownian motion, provided that \((\Delta x)^2 \sim \Delta t\)
...
o Since \(B(t) \sim N(0, t)\), its PDF is \(f_t(x) = \frac{1}{\sqrt{2\pi t}}\exp\big(-\frac{x^2}{2t}\big) = \frac{1}{\sqrt{t}}\,\varphi\big(\frac{x}{\sqrt{t}}\big)\)
...
Then
\(\mathrm{Cov}\big[B(s), B(t)\big] = \mathrm{Cov}\big[B(s), B(t) - B(s) + B(s)\big] = 0 + \mathrm{Var}\big[B(s)\big] = s = s \wedge t\)
o Joint density: Let \(t_1 < t_2\)
...
Then
\(f\big(B(s) = x \mid B(t) = b\big) = \frac{f\big(B(s) = x, B(t) = b\big)}{f\big(B(t) = b\big)} = \frac{f_s(x)\,f_{t-s}(b - x)}{f_t(b)} = \frac{\frac{1}{\sqrt{2\pi s}}\exp\big(-\frac{x^2}{2s}\big)\cdot\frac{1}{\sqrt{2\pi(t-s)}}\exp\big(-\frac{(b-x)^2}{2(t-s)}\big)}{\frac{1}{\sqrt{2\pi t}}\exp\big(-\frac{b^2}{2t}\big)}\)
By multiplying out and then completing the square, we find that
\(\big(B(s) \mid B(t) = b\big) \sim N\Big(b\,\frac{s}{t},\ s\big(1 - \frac{s}{t}\big)\Big)\)
The mean is analogous to what we obtained in a Poisson process
...
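The conditional law can be checked by sampling \((B(s), B(t))\) from independent increments and keeping only paths with \(B(t) \approx b\) (the rejection window and parameters below are illustrative):

```python
import math
import random

rng = random.Random(8)
s, t, b, tol, reps = 0.4, 1.0, 0.5, 0.05, 400000

# Sample (B(s), B(t)) via independent increments, keep paths with B(t) ~ b,
# and compare the conditional mean/variance of B(s) with b*s/t and s*(1 - s/t).
kept = []
for _ in range(reps):
    bs = rng.gauss(0.0, math.sqrt(s))
    bt = bs + rng.gauss(0.0, math.sqrt(t - s))
    if abs(bt - b) < tol:
        kept.append(bs)

m = sum(kept) / len(kept)
v = sum((x - m) ** 2 for x in kept) / len(kept)
print(round(m, 1), round(v, 1))  # theory: mean 0.2, variance 0.24
```

The accepted samples reproduce the mean \(b s/t = 0.2\) and variance \(s(1 - s/t) = 0.24\).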
o Hitting time: Let \(T_a = \inf\{t : B(t) = a\}\), a > 0 (the Brownian path is symmetric, so the result should be identical for –a)
...
Now
\(\mathbb{P}(T_a \le t) = 2\,\mathbb{P}(B_t \ge a) = 2\,\overline{\Phi}\Big(\frac{a}{\sqrt{t}}\Big)\)
Now, as \(t \to \infty\), we would expect this probability to tend to 1, since the Brownian motion should eventually hit a
...
However, Brownian motion is a null-recurrent Markov process – the expected value of any hitting time is infinite:
\(\mathbb{E}(T_a) = \frac{1}{\sqrt{2\pi}}\int_0^{\infty} \frac{a}{\sqrt{t}}\,e^{-a^2/2t}\,dt \ge \frac{a}{\sqrt{2\pi}}\int_1^{\infty} \frac{1}{\sqrt{t}}\,e^{-a^2/2t}\,dt \ge \frac{a\,e^{-a^2/2}}{\sqrt{2\pi}}\int_1^{\infty} \frac{dt}{\sqrt{t}} = \infty\)
Note also that if \(M_t = \max_{s \le t} B_s\) (a quantity always increasing in t), we have
\(\mathbb{P}(M_t \ge a) = \mathbb{P}(T_a \le t) = 2\,\overline{\Phi}\Big(\frac{a}{\sqrt{t}}\Big)\)
Similarly, note that \(\min_{s \le t} B_s = -\max_{s \le t}(-B_s) =_d -\max_{s \le t} B_s\)
...
Note that
\(\mathbb{P}\big(\min_{0\le u\le t} X(u) \le 0 \mid X(0) = a\big) = \mathbb{P}\big(\max_{0\le u\le t} X(u) \ge 0 \mid X(0) = -a\big) = \mathbb{P}\big(\max_{0\le u\le t} X(u) \ge a \mid X(0) = 0\big) = \mathbb{P}(T_a \le t)\)
As such
\(\mathbb{P}\big(\text{At least one 0 in } [t_0, t_1] \mid X(t_0) = a\big) = P(a) = \mathbb{P}(T_a \le t_1 - t_0)\)
Now, assume we know X(0) = 0, but do not know \(X(t_0)\)
...
We can find this by conditioning on the value of \(X(t_0)\), and we get
\(\mathbb{P}\big(\text{At least one 0 in } (t_0, t_1) \mid X(0) = 0\big) = 1 - \frac{2}{\pi}\arcsin\sqrt{\frac{t_0}{t_1}} = \frac{2}{\pi}\arccos\sqrt{\frac{t_0}{t_1}}\)
o Consider
\(A_t(x, y) = \mathbb{P}\big(X(t) > y,\ \min_{0\le u\le t} X(u) > 0 \mid X(0) = x\big)\)
Note that
\(\mathbb{P}\big(X(t) > y \mid X(0) = x\big) = A_t(x, y) + \mathbb{P}\big(X(t) > y,\ \min_{0\le u\le t} X(u) \le 0 \mid X(0) = x\big)\)
Consider the last term; by reflecting at the first time the process hits 0, we find that every sample path satisfying that term has a corresponding path that falls below 0 and with X(t) < –y (and vice versa)
...
o Consider
\(\mathbb{P}(M_t \ge m,\ B_t \le x) = \mathbb{P}(B_t \ge 2m - x)\)
because for every path satisfying the first expression, we can reflect at the first time the path hits m
...
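The reflection identity can be checked by discretising a Brownian path (a sketch; the grid, m = 0.8 and x = 0.3 below are illustrative choices):

```python
import math
import random

rng = random.Random(9)
t, m, x = 1.0, 0.8, 0.3
n_steps, reps = 400, 20000
dt = t / n_steps

count_joint = 0   # M_t >= m and B_t <= x
count_refl = 0    # B_t >= 2m - x
for _ in range(reps):
    b, running_max = 0.0, 0.0
    for _ in range(n_steps):
        b += rng.gauss(0.0, math.sqrt(dt))
        running_max = max(running_max, b)
    count_joint += (running_max >= m and b <= x)
    count_refl += (b >= 2 * m - x)

# Both frequencies should be close (the grid slightly under-counts the max)
print(round(count_joint / reps, 2), round(count_refl / reps, 2))
```

Both counts estimate \(\overline{\Phi}\big((2m - x)/\sqrt{t}\big)\); the small residual gap comes from the discrete grid missing excursions between time steps.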
o Finally, note that Brownian motion is a special case of a Gaussian Process, since \(B(t_1), \ldots, B(t_n)\) can be expressed as a linear combination of IID standard normals:
\(B(t_1) = \sqrt{t_1}\cdot Z_1\)
\(B(t_2) = B(t_1) + \big[B(t_2) - B(t_1)\big] = \sqrt{t_1}\,Z_1 + \sqrt{t_2 - t_1}\,Z_2\)
o Example: Consider \(X(t) = t\,B\big(\tfrac{1}{t}\big)\)
...
To show the process is continuous, show that \(\lim_{t\to 0} X(t) = 0 \Leftrightarrow \lim_{u\to\infty} \frac{B(u)}{u} = 0\)
...
Therefore, it is a Brownian motion; Brownian motion is self-similar under time inversion; all we need to do is to consider the interval \(t \in [0, 1]\) and we have everything we need about the process
...
Another way of stating the Markov property is that given the present, the past and future are independent
...
Now, using the result about the conditional distribution and for 0 < s < t < 1,
\(\mathbb{E}[X_s X_t] = \mathbb{E}\big[B_s B_t \mid B_1 = 0\big]\)
\(\qquad = \mathbb{E}\big[B_t\,\mathbb{E}[B_s \mid B_t] \mid B_1 = 0\big]\)
\(\qquad = \mathbb{E}\big[B_t \cdot \tfrac{s}{t} B_t \mid B_1 = 0\big]\)
\(\qquad = \tfrac{s}{t}\,\mathbb{E}\big[B_t^2 \mid B_1 = 0\big]\)
\(\qquad = \tfrac{s}{t}\cdot t(1 - t)\)
\(\qquad = s(1 - t)\)
(Note that \(\mathrm{Var}(X_t) = t(1 - t)\))
...
Consider the following argument instead:
\(\mathbb{E}[X_s X_t] = \mathbb{E}\big(B_s\,\mathbb{E}[B_t \mid B_s] \mid B_1 = 0\big) = \mathbb{E}\big(B_s^2 \mid B_1 = 0\big) = s(1 - s)\)
This second argument is incorrect, because in going from the first to the second line, we condition \(B_t\) on the past while ignoring that we are at the same time conditioning \(B_t\) on the future (because we are conditioning on \(B_1 = 0\))
...
Defining a stopping time is more tricky; if T satisfies \(\{T \le t\} \in \mathcal{F}_t\), then, since \(\{\mathcal{F}_t\}\) is an increasing family,
\(\{T < t\} = \bigcup_{n=1}^{\infty}\big\{T \le t - \tfrac{1}{n}\big\} \in \mathcal{F}_t\)
Conversely, \(\{T \le t\} = \bigcap_{n=1}^{\infty}\big\{T < t + \tfrac{1}{n}\big\} \in \bigcap_{n=1}^{\infty}\mathcal{F}_{t + \frac{1}{n}}\); to conclude this lies in \(\mathcal{F}_t\), we need to assume right-continuity of the filtration
...
Proof: The first two parts are trivial
...
The exponential martingale \(e^{\theta B_t - \theta^2 t/2}\) can be used to generate many other martingales
...
Feeding this into the martingale property and exchanging summation and expectation, we can use the fact that this holds for any \(\theta\) including \(\theta = 0\), and conclude that for each n, \(\{H_n(t, B_t)\}\) is also a martingale
...
o Example: Define \(T = \inf\{t : B_t = -a \text{ or } b\}\)
...
Using the variance martingale, we can find \(\mathbb{E}(T) = ab\)
...
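A discretised sanity check of \(\mathbb{E}(T) = ab\) (a sketch: the barriers a = 1, b = 2 and the Euler grid are illustrative, and discretisation adds a small bias):

```python
import math
import random

rng = random.Random(10)
a, b, dt, reps = 1.0, 2.0, 0.002, 4000

# Euler scheme for B_t; stop when the path leaves (-a, b).
total = 0.0
for _ in range(reps):
    x, tau = 0.0, 0.0
    while -a < x < b:
        x += rng.gauss(0.0, math.sqrt(dt))
        tau += dt
    total += tau
print(round(total / reps, 2))  # theory: a*b = 2.0
```

The empirical mean exit time lands on \(ab\), as the variance martingale plus OST predicts.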
We are interested in \(T = \inf\{t : X_t = b > 0\}\) (the first time stock depletes)
...
Use the variance martingale for \(\mathrm{Var}(T) = \ldots\)
...
o Example: \(X_t = \mu t + \sigma B_t\), \(T = \inf\{t : X_t \notin (-a, b)\}\)
...
Choose \(\theta = -2\mu/\sigma\), and use the OST (the stopped martingale is bounded); \(\mathbb{E}\big(e^{\theta X_T/\sigma}\big) = 1\)
...
Use the OST on \(B_T = (X_T - \mu T)/\sigma\) to find \(\mathbb{E}(T)\)
...
o Example: Following from the above example and letting \(T = \inf\{t : X_t = b\}\), suppose we want \(\mathbb{E}(e^{-\gamma T})\)
...
Use the OST on \(\exp\big(\theta B_{T \wedge t} - \tfrac{1}{2}\theta^2 [T \wedge t]\big) \le \exp(\theta B_{T \wedge t})\) [bounded]
...
Ito’s Formula
o For a deterministic \(x_t\), \(dx_t = \dot{x}\,dt\) and \(df(x_t) = f'(x_t)\,dx_t + \tfrac{1}{2}f''(x_t)(dx_t)^2 + \cdots\), but \((dx_t)^2 = \dot{x}^2(dt)^2\), which vanishes
...
The first integral above is called Ito’s Integral and can be approximated as
\(\int_0^t X_s\,dB_s \approx \sum_{i=1}^{n} X_{t_{i-1}}\big[B_{t_i} - B_{t_{i-1}}\big]\)
This is a martingale transformation, and is therefore a martingale under boundedness and predictability (\(\Leftarrow\) left-continuity) of \(X_t\)
...
To find its variance, consider
\(\mathbb{E}\bigg(\int_0^t X_s\,dB_s\bigg)^2 \approx \mathbb{E}\bigg(\sum_{i=1}^{n} X_{t_{i-1}}\big[B_{t_i} - B_{t_{i-1}}\big]\bigg)^2 = \sum_{i=1}^{n}\mathbb{E}\big(X_{t_{i-1}}^2\big)(t_i - t_{i-1}) \approx \mathbb{E}\bigg(\int_0^t X_s^2\,ds\bigg)\)
where we have used orthogonality of martingale differences, and conditioned on \(\mathcal{F}_{t_{i-1}}\)
...
More generally, for a bivariate f(t, x),
\(df(t, B_t) = f_t'\,dt + f_x'\,dB_t + \tfrac{1}{2}f_{xx}''\,dt\)
\(f(T, B_T) = f(0, 0) + \int_0^T f_t'\,dt + \int_0^T f_x'\,dB_t + \tfrac{1}{2}\int_0^T f_{xx}''\,dt\)
Two results
Since the second integral above is a martingale, we have that if f(t, x) satisfies \(f_t' = -\tfrac{1}{2}f_{xx}''\), then \(f(t, B_t)\) is a martingale
...
o Example: Letting \(f(x) = \tfrac{1}{2}x^2\) and \(f'(x) = x\), we get
\(\tfrac{1}{2}B_t^2 = \int_0^t B_s\,dB_s + \tfrac{1}{2}t \qquad\Rightarrow\qquad \int_0^t B_s\,dB_s = \tfrac{1}{2}\big(B_t^2 - t\big)\)
o Example: Let \(f(t, x) = e^{\mu t + \sigma x}\) and let \(Y_t = f(t, B_t) = e^{\mu t + \sigma B_t}\)
...
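That identity can be seen numerically: the left-endpoint (Ito) sums computed on a grid match \(\tfrac{1}{2}(B_t^2 - t)\) up to the quadratic-variation error (grid size illustrative):

```python
import math
import random

rng = random.Random(11)
t, n = 1.0, 20000
dt = t / n

b = 0.0
ito_sum = 0.0
for _ in range(n):
    db = rng.gauss(0.0, math.sqrt(dt))
    ito_sum += b * db          # left endpoint B_{t_{i-1}} times increment
    b += db

diff = ito_sum - 0.5 * (b * b - t)
print(abs(diff) < 0.05)  # → True
```

Using the left endpoint is essential: evaluating the integrand at the right endpoint instead would shift the sum by the quadratic variation t.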
o Example: We define integrated Brownian Motion as \(X_t = \int_0^t B_s\,ds\)
...
It is a Gaussian process, and clearly \(\mathbb{E}(X_t) = 0\)
...
Furthermore, using integration by parts, we can write \(X_t = \int_0^t B_s\,ds = \int_0^t (t - s)\,dB_s\), and so, for t < u and repeatedly using independent increments,
\(\mathrm{Cov}(X_t, X_u) = \mathrm{Cov}\Big(\int_0^t (t-s)\,dB_s,\ \int_0^u (u-s)\,dB_s\Big)\)
\(\qquad = \mathrm{Cov}\Big(\int_0^t (t-s)\,dB_s,\ \int_0^t (u-s)\,dB_s\Big)\)
\(\qquad = \mathbb{E}\Big(\int_0^t (t-s)\,dB_s\,\int_0^t (u-s)\,dB_s\Big)\)
\(\qquad = \int_0^t (t-s)(u-s)\,\mathbb{E}\big[(dB_s)^2\big]\)
\(\qquad = \int_0^t (t-s)(u-s)\,ds\)
\(\qquad = \tfrac{1}{2}t^2\big(u - \tfrac{t}{3}\big)\)
So this is not Brownian Motion
...
o Example: Consider an inventory problem in which the cumulative (net) demand up to time t is \(D_t = \lambda t + \sigma B_t\), with initial inventory S > 0
...
Let \(\tau = \inf\{t : D_t = S\}\)
...
Taking expectations, and using the OST on the uniformly integrable martingale \(\int_0^{\tau} \sigma\,dB_s\), we get
\(\mathbb{E}\Big(\int_0^{\tau}(S - D_t)\,dt\Big) = \tfrac{1}{2}\lambda\,\mathbb{E}(\tau^2)\)
Assuming the holding cost per unit time is h and each replenishment costs cS + K, the long-term average cost which we seek to minimize is
\(\min_{\lambda, S}\ \frac{\tfrac{1}{2}\lambda h\,\mathbb{E}(\tau^2) + cS + K}{\mathbb{E}(\tau)}\)
From the previous section, we have \(\mathbb{E}(\tau) = \frac{S}{\lambda}\) and \(\mathbb{E}(\tau^2) = \frac{\sigma^2}{\lambda^2}\,\mathbb{E}(\tau) + \mathbb{E}^2(\tau)\)
...
In other words, a process is stationary if it is “shift invariant”
...
Note that a Gaussian process is entirely defined by its mean and covariance; as such, for a Gaussian process, covariance stationarity implies stationarity
...
Applications:
...
The \(L^p\) limit is unique: if \(X_n \to_{L^p} X\), write \(\|X\|_p = \|(X - X_n) + X_n\|_p\), and use the triangle inequality: \(\|X\|_p \le \epsilon + \|X_n\|_p < \infty\)
...
Theorem: If \(\{X_n\}\) is a martingale that is \(L^p\) bounded with p > 1, then \(X_n \to X\) a.s. and in \(L^p\)
...
Cauchy Criterion: \(\|X_m - X_n\|_p \to 0\) for \(n, m \to \infty \Leftrightarrow \exists X\) s.t. \(\|X_n - X\|_p \to 0\)
...
Note that in general, X will be a random variable
...
Again, X is generally a random variable
...
An intermediate step in proving this is…
Lemma (Maximal Ergodic Lemma): Let \(\{Y_n\}\) be stationary and let \(S_{i,n} = Y_i + \cdots + Y_{i+n-1}\) and \(M_{i,n} = \max\{0, S_{i,1}, \ldots, S_{i,n}\}\)
...
Definition (Shift Operator): \(\varphi\) is a shift operator
...
We say the set A is shift invariant if \(x \in A \Leftrightarrow \varphi(x) \in A\)
...
Note: We can use this definition of ergodicity to show that \(\bar{X}\) is constant
...
Definition (Mixing): \(X = (X_1, X_2, \ldots, X_n, \ldots)\) is mixing if
\(\lim_{n\to\infty} \mathbb{P}\big((X_1, X_2, \ldots) \in A,\ (X_{n+1}, X_{n+2}, \ldots) \in B\big) = \mathbb{P}\big((X_1, X_2, \ldots) \in A\big)\cdot\mathbb{P}\big((X_1, X_2, \ldots) \in B\big)\)
where A and B are sets of infinite sequences (not necessarily shift invariant)
...
Theorem: If a sequence is mixing, it is ergodic
...
Example: Let \(X_0 = Y\) w.p. p and \(Z\) w.p. \(1 - p\)
...
Consider two cases:
o If \(X_n \equiv X_0\), \(\bar{X} = Y\) w.p. p [and \(Z\) w.p. \(1 - p\)]
o If \(X_n =_d X_0\), \(\bar{X} = \mathbb{E}(Y)\) w.p. p [and \(\mathbb{E}(Z)\) w.p. \(1 - p\)]
Neither case is ergodic
...
Example: If \(W = \max(X, Y)\), then \(F_W = F_X F_Y\); if \(W = \min(X, Y)\), then \(\lambda_W = \lambda_X + \lambda_Y\)
...
We define the following variable orderings
o \(\frac{f(x)}{g(x)} \ge \frac{f(y)}{g(y)}\) for all x > y (likelihood-ratio ordering)
...
o Stochastic ordering: \(X \ge_{ST} Y\) if \(\overline{F}(x) \ge \overline{G}(x)\) for all x
...
For (ii) \(\Rightarrow\) (iii), use \(\overline{F}(x) = \exp\Big(-\int_{-\infty}^{x} f(t)/\overline{F}(t)\,dt\Big)\)
...
Theorem: \(X \le_{icx} Y\) and \(\mathbb{E}(X) = \mathbb{E}(Y) \Rightarrow \mathbb{E}\big[f(X)\big] \le \mathbb{E}\big[f(Y)\big]\) for all convex f
...
Theorem: \(X \ge_{ST} Y\) iff \(\mathbb{E}\big[\phi(X)\big] \ge \mathbb{E}\big[\phi(Y)\big]\) for any increasing \(\phi\)
...
Then \(X =_d F^{-1}(U)\) and \(Y =_d G^{-1}(U)\)
...
Pick \(\phi(x) = 1_{\{x \ge a\}}\)
...
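The inverse-CDF coupling used in this proof is easy to demonstrate; the sketch below couples two exponential distributions (rates 1 and 2, an illustrative choice) through a common uniform and confirms the pointwise ordering:

```python
import math
import random

# Couple X ~ Exp(rate 1) and Y ~ Exp(rate 2) through a common uniform U:
# X = F^{-1}(U) >= G^{-1}(U) = Y pointwise, witnessing X >=_ST Y.
def exp_inv_cdf(u, rate):
    return -math.log(1.0 - u) / rate

rng = random.Random(12)
pairs = [(exp_inv_cdf(u, 1.0), exp_inv_cdf(u, 2.0))
         for u in (rng.random() for _ in range(10000))]
print(all(x >= y for x, y in pairs))  # → True
```

Each marginal has the right distribution, yet every coupled pair is ordered, which immediately gives \(\mathbb{E}[\phi(X)] \ge \mathbb{E}[\phi(Y)]\) for increasing \(\phi\).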
Thus, we can define a concave ordering ICV for increasing concave functions, and \(X \ge_{ICX} Y \Leftrightarrow -X \le_{ICV} -Y\)
...
o If \(X_i \ge_{ST} Y_i\) for all i, then \(h(X) \ge_{ST} h(Y)\) for any increasing \(h : \mathbb{R}^n \to \mathbb{R}\)
...
Proof: ...
IFR & NBUE Distributions
We let \(X_t\) denote the residual lifetime of X given that X has survived up to time t
...
Differentiating, we find that the density of \(X_t\) is \(f(x + t)/\overline{F}(t)\), and so the failure rate is \(\lambda_{X_t}(x) = \lambda_X(t + x)\)
...
Theorem: \(X \in \mathrm{IFR} \Leftrightarrow X_t\) stochastically decreasing in t
...
Clearly, this is increasing in t if and only if \(\lambda_X(s)\) is a decreasing function in s
...
Let \(X_e\) be the equilibrium distribution of that renewal process – in other words, at any given time t far into the future, the time since the last renewal
...
Definition: NBUE denotes the class of distributions that satisfies the property of “new better than used in expectation”
...
Proof: Elementary
...
Proof: \(\mathbb{E}(X - t \mid X > t) =_{\text{def of } X_t} \mathbb{E}(X_t) \le_{X_t \text{ stoch. decr.}} \mathbb{E}(X_0) =_{\text{def of } X_t} \mathbb{E}(X)\)
Theorem: \(X \in \mathrm{NBUE} \Rightarrow X \le_{icx} \mathrm{Exp}(1/\mu)\) (the exponential with mean \(\mu\)), where \(\mu = \mathbb{E}(X)\)
...
5...in Ross shows that this implies icx ordering
...
Now, consider the coefficient of variation, \(\mathrm{Var}(Y)/\mathbb{E}^2(Y)\)
...
o Let us revisit problem 1: \(X_0 = i\) and \(X_n \sim \mathrm{Po}(X_{n-1})\)
...
Furthermore, let \(T = \inf\{n : X_n \ge b > i\}\)
...
\(\{X_{T \wedge n}\}\) is non-negative, but NOT uniformly integrable
...
So, \(\mathbb{P}(X_T = 0) \ge 1 - \frac{i}{b}\)
...
\(S_{T \wedge n}\) is a non-negative martingale
...
and tends to 0
...