GENERATING FUNCTIONS:
Definition: Let {a_0, a_1, a_2, ...} be a sequence of real numbers. If

A(s) = a_0 + a_1 s + a_2 s^2 + ...

converges in some interval |s| < s_0, then A(s) is called the generating function of the sequence {a_k}.

Example
Let a_k = 1/k!, k = 0, 1, 2, ... Then

A(s) = 1 + s/1! + s^2/2! + s^3/3! + ... = e^s

A generating function is a convergent power series, which can be finite or infinite.
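A minimal sympy sketch (assuming Python with sympy is available) that checks this example by comparing the partial sums of A(s) with the Maclaurin series of e^s:

```python
import sympy as sp

s = sp.symbols('s')

# Partial sum of A(s) = sum over k of s^k / k!, up to k = 7.
A = sum(s**k / sp.factorial(k) for k in range(8))

# Maclaurin series of e^s to the same order; the difference is 0.
print(sp.expand(sp.series(sp.exp(s), s, 0, 8).removeO() - A))
```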
PROBABILITY GENERATING FUNCTION (p.g.f.):
We consider two definitions.

Definition 1
This definition looks at the probability generating function (pgf) as a special case of a generating function. The corresponding A(s) is called a probability generating function (p.g.f.). Instead of a_k we usually use p_k, the probability that a random variable X takes the non-negative integer value k.
Definition 2
The generating function G(s) of an integer-valued random variable X is defined by

G(s) = E[s^X] = ∑_{k=0}^∞ P(X = k) s^k
Deriving moments of a random variable using the p.g.f.
From G(s) = E[s^X], the first and second derivatives of G(s) with respect to s are

dG(s)/ds = G′(s) = E[X s^{X−1}]

d²G(s)/ds² = G″(s) = E[X(X − 1) s^{X−2}]

Substituting s = 1, we have

G(1) = ∑ p_k = 1
G′(1) = E[X]
G″(1) = E[X(X − 1)]

Hence the mean and variance of X are

E[X] = G′(1)

and

Var(X) = E[X^2] − [E[X]]^2
= E[X(X − 1)] + E[X] − [E[X]]^2
= G″(1) + G′(1) − [G′(1)]^2
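These formulas are easy to apply mechanically; a minimal sympy sketch, using a fair die (X uniform on {1, ..., 6}, p_k = 1/6) as a hypothetical example:

```python
import sympy as sp

s = sp.symbols('s')

# Hypothetical example: X uniform on {1, ..., 6} (a fair die), p_k = 1/6.
G = sum(sp.Rational(1, 6) * s**k for k in range(1, 7))

mean = sp.diff(G, s).subs(s, 1)                       # G'(1) = E[X]
var = sp.diff(G, s, 2).subs(s, 1) + mean - mean**2    # G''(1) + G'(1) - [G'(1)]^2
print(mean, var)                                      # 7/2 35/12
```

The output 7/2 and 35/12 agrees with the usual die mean and variance.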
Examples of p.g.f.:
(i) Let X have a binomial distribution with parameters n and p;

P(X = x) = (n choose x) p^x (1 − p)^{n−x},  x = 0, 1, ..., n

Its p.g.f. is the standard result

G(s) = E[s^X] = ∑_{x=0}^n (n choose x) (ps)^x (1 − p)^{n−x} = (1 − p + ps)^n,

so that G′(1) = np and G″(1) = n(n − 1)p^2. Hence the mean and variance of X are

E[X] = G′(1) = np

and

Var(X) = G″(1) + G′(1) − [G′(1)]^2
= n(n − 1)p^2 + np − (np)^2
= np(1 − p)
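A sympy check of this example, differentiating G(s) = (1 − p + ps)^n and substituting s = 1:

```python
import sympy as sp

s, n, p = sp.symbols('s n p', positive=True)

G = (1 - p + p*s)**n                       # binomial p.g.f.
Gp = sp.diff(G, s).subs(s, 1)              # G'(1)  = n*p
Gpp = sp.diff(G, s, 2).subs(s, 1)          # G''(1) = n*(n - 1)*p**2

print(Gp)                                  # n*p
print(sp.factor(Gpp + Gp - Gp**2))         # n*p*(1 - p), up to sign arrangement
```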
(ii) Let X have a Poisson distribution with parameter λ;

P(X = x) = e^{−λ} λ^x / x!,  x = 0, 1, 2, ...

Its p.g.f. is the standard result G(s) = ∑_{x=0}^∞ (e^{−λ} λ^x / x!) s^x = e^{−λ(1−s)}, giving E[X] = G′(1) = λ and Var(X) = G″(1) + G′(1) − [G′(1)]^2 = λ^2 + λ − λ^2 = λ.
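The summation can be reproduced symbolically; a minimal sympy sketch (the symbol lam stands for λ):

```python
import sympy as sp

s, lam = sp.symbols('s lam', positive=True)
x = sp.symbols('x', integer=True, nonnegative=True)

# G(s) = sum_x e^{-lam} lam^x / x! * s^x; write lam^x s^x as (lam*s)^x.
G = sp.simplify(sp.exp(-lam) * sp.summation((lam*s)**x / sp.factorial(x), (x, 0, sp.oo)))
print(G)                                   # exp(lam*(s - 1)), the Poisson p.g.f.

mean = sp.diff(G, s).subs(s, 1)
var = sp.diff(G, s, 2).subs(s, 1) + mean - mean**2
print(mean, sp.simplify(var))              # lam lam
```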
Probability generating functions of random vectors:
Let X = (X_1, X_2, ..., X_p)′ be a p-dimensional random vector with joint probability distribution function f(x_1, x_2, ..., x_p). The joint probability generating function of the vector is defined by

G(s) = E[s_1^{X_1} s_2^{X_2} ... s_p^{X_p}] = ∑_{x_1} ∑_{x_2} ... ∑_{x_p} s_1^{x_1} s_2^{x_2} ... s_p^{x_p} f(x_1, x_2, ..., x_p),

if the expectation exists for all values of s = [s_1, s_2, ..., s_p]′.

The marginal p.g.f. of X_i, i = 1, 2, ..., p, can be obtained from the joint p.g.f. by substituting s_j = 1 for all j ≠ i; i.e.

G_{X_i}(s_i) = G(1, ..., 1, s_i, 1, 1, ..., 1)

Similarly, the joint marginal p.g.f. of X_1, X_2, ..., X_k, 2 ≤ k < p, is

G(s_1, s_2, s_3, ..., s_k, 1, 1, ..., 1),

obtained by setting s_j, j = k + 1, ..., p, equal to one.

Factorial moments are obtained by differentiating the joint p.g.f. and substituting s = 1; for two components X_i and X_j,

E[X_i(X_i − 1)...(X_i − r + 1) X_j(X_j − 1)...(X_j − m + 1)] = ∂^{r+m} G(s) / (∂s_i^r ∂s_j^m) |_{s=1}
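A minimal sympy illustration of the mixed derivative, using hypothetical independent Bernoulli(p) components so that the joint p.g.f. factorises:

```python
import sympy as sp

s1, s2, p = sp.symbols('s1 s2 p', positive=True)

# Hypothetical example: X1, X2 independent Bernoulli(p), so the joint
# p.g.f. factorises: G(s1, s2) = E[s1^X1 * s2^X2].
G = (1 - p + p*s1) * (1 - p + p*s2)

# r = m = 1: the mixed derivative at s = 1 gives E[X1*X2].
print(sp.diff(G, s1, s2).subs({s1: 1, s2: 1}))   # p**2
```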
MOMENT GENERATING FUNCTIONS:
Definition: The moment generating function of a random variable X is defined by

M_X(t) = E[e^{tX}] = ∫_x e^{tx} f(x) dx

if X is a continuous random variable, and

M_X(t) = E[e^{tX}] = ∑_x e^{tx} P(X = x)

if X is a discrete random variable.

(ii) If X and Y are independent, and Z = X + Y, then M_Z(t) = M_X(t) × M_Y(t). The result can be extended to several independent random variables.

(iii) If Y = aX + b, then M_Y(t) = E[e^{t(aX + b)}] = e^{bt} M_X(at).
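A minimal sympy sketch of property (iii), using the standard normal m.g.f. e^{t^2/2} derived just below:

```python
import sympy as sp

t, a, b = sp.symbols('t a b')

MX = sp.exp(t**2 / 2)                 # standard normal m.g.f. (derived below)
MY = sp.exp(b*t) * MX.subs(t, a*t)    # property (iii) with Y = a*X + b
print(sp.expand(sp.log(MY)))          # a**2*t**2/2 + b*t: normal, mean b, variance a**2
```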
We will make use of the Maclaurin series

e^{tX} = 1 + tX + (tX)^2/2! + ... + (tX)^n/n! + ...

Taking expectations term by term,

M_X(t) = 1 + t E[X] + (t^2/2!) E[X^2] + ... + (t^n/n!) E[X^n] + ...    (2)

In general the r-th derivative of M_X(t) evaluated at t = 0 gives E[X^r]; i.e.

d^r M_X(t)/dt^r |_{t=0} = E[X^r]

e.g. let X have the standard normal distribution; completing the square in

M_X(t) = (1/√(2π)) ∫_{−∞}^{∞} e^{tx − x^2/2} dx

shows that the m.g.f. of a standard normal variable is e^{t^2/2}.
Let us use this m.g.f. to derive the m.g.f. of Y = σX + µ.
Now Y is a linear function of X; thus the m.g.f. of Y is given by

M_Y(t) = e^{µt} M_X(σt) = e^{µt} e^{σ^2 t^2/2}

This is the m.g.f. of a random variable having a normal distribution with mean µ and variance σ^2. To verify, the first derivative of M_Y(t) with respect to t is

dM_Y(t)/dt = d/dt [e^{µt} e^{σ^2 t^2/2}] = e^{µt} tσ^2 e^{σ^2 t^2/2} + µ e^{µt} e^{σ^2 t^2/2}

Substituting t = 0, E[Y] = µ.
The second derivative of M_Y(t) with respect to t is

d²M_Y(t)/dt² = d/dt [e^{µt} tσ^2 e^{σ^2 t^2/2} + µ e^{µt} e^{σ^2 t^2/2}]
= e^{µt} t^2(σ^2)^2 e^{σ^2 t^2/2} + σ^2 e^{µt} e^{σ^2 t^2/2} + µ e^{µt} tσ^2 e^{σ^2 t^2/2} + tσ^2 µ e^{µt} e^{σ^2 t^2/2} + µ^2 e^{µt} e^{σ^2 t^2/2}

Substituting t = 0, E[Y^2] = µ^2 + σ^2; hence

Var(Y) = E[Y^2] − (E[Y])^2 = µ^2 + σ^2 − µ^2 = σ^2
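The same differentiation carried out in sympy:

```python
import sympy as sp

t, mu, sigma = sp.symbols('t mu sigma', positive=True)

MY = sp.exp(mu*t + sigma**2 * t**2 / 2)    # m.g.f. of N(mu, sigma^2)

EY = sp.diff(MY, t).subs(t, 0)             # E[Y]
EY2 = sp.diff(MY, t, 2).subs(t, 0)         # E[Y^2]
print(EY)                                  # mu
print(sp.simplify(EY2 - EY**2))            # sigma**2
```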
(b) Let X have the probability function

P(X = x) = (n choose x) p^x (1 − p)^{n−x},  x = 0, 1, ..., n

Its m.g.f. is the standard result

M_X(t) = E[e^{tX}] = ∑_{x=0}^n (n choose x) (pe^t)^x (1 − p)^{n−x} = (1 − p + pe^t)^n
Moment generating functions of random vectors:
Let X = (X_1, X_2, ..., X_p)′ be a p-dimensional random vector and let t = [t_1, t_2, ..., t_p]′. The joint m.g.f. of X is defined by

M_X(t) = E[exp(t_1X_1 + t_2X_2 + ... + t_pX_p)] = ∫ ... ∫ exp(∑_{k=1}^p t_k x_k) f(x_1, x_2, ..., x_p) dx_1 dx_2 ... dx_p

if X is continuous, and

M_X(t) = E[exp(t_1X_1 + t_2X_2 + ... + t_pX_p)] = ∑_{x_1} ∑_{x_2} ... ∑_{x_p} exp(∑_{k=1}^p t_k x_k) f(x_1, x_2, ..., x_p)

if X is discrete.
Marginal moment generating functions:
The marginal m.g.f. of X_i, i = 1, 2, ..., p, can be obtained by putting t_j = 0 for all j ≠ i; i.e.

M_{X_i}(t_i) = M_X(0, 0, ..., t_i, ..., 0)

Similarly we can obtain the marginal m.g.f. of k variables, (2 ≤ k < p), by putting t_j = 0 for all variables X_j not included in the k variables.

e.g. let (X_1, X_2, X_3) have the joint m.g.f.

M_{X_1,X_2,X_3}(t_1, t_2, t_3) = exp[(1/2)(5t_1^2 + 3t_2^2 + 2t_3^2 + 4t_1t_2 + 6t_1t_3)];

find the marginal m.g.f. of X_2 and the joint marginal m.g.f. of (X_1, X_3).

Solution:

M_{X_2}(t_2) = M_{X_1,X_2,X_3}(0, t_2, 0) = exp[(1/2)(5(0)^2 + 3t_2^2 + 2(0)^2 + 4(0)t_2 + 6(0)(0))] = exp[(1/2)(3t_2^2)]

M_{X_1,X_3}(t_1, t_3) = M_{X_1,X_2,X_3}(t_1, 0, t_3) = exp[(1/2)(5t_1^2 + 2t_3^2 + 6t_1t_3)]
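The substitutions can be reproduced in sympy (the printed exponents may be arranged differently):

```python
import sympy as sp

t1, t2, t3 = sp.symbols('t1 t2 t3')

M = sp.exp((5*t1**2 + 3*t2**2 + 2*t3**2 + 4*t1*t2 + 6*t1*t3) / 2)

print(M.subs({t1: 0, t3: 0}))   # marginal m.g.f. of X2: exp(3*t2**2/2)
print(M.subs(t2, 0))            # joint marginal of (X1, X3): exp((5*t1**2 + 2*t3**2 + 6*t1*t3)/2)
```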
Moments of random variables: For each X_i of X, its r-th moment can be obtained by differentiating the joint m.g.f. r times with respect to t_i and putting all the t_j, j = 1, 2, ..., p, equal to zero. Also E[X_i^r X_j^s] can be obtained from the joint m.g.f. by differentiating it r times with respect to t_i and s times with respect to t_j, then setting all the t_j equal to zero.

e.g. for the joint m.g.f. above, find E[X_1^2].
Solution:

E[X_1^2] = ∂²/∂t_1² exp[(1/2)(5t_1^2 + 3t_2^2 + 2t_3^2 + 4t_1t_2 + 6t_1t_3)] |_{(t_1,t_2,t_3)=(0,0,0)}

and

∂/∂t_1 exp[(1/2)(5t_1^2 + 3t_2^2 + 2t_3^2 + 4t_1t_2 + 6t_1t_3)] = (5t_1 + 2t_2 + 3t_3) exp[(1/2)(5t_1^2 + 3t_2^2 + 2t_3^2 + 4t_1t_2 + 6t_1t_3)]

∂/∂t_1 {(5t_1 + 2t_2 + 3t_3) exp[(1/2)(5t_1^2 + 3t_2^2 + 2t_3^2 + 4t_1t_2 + 6t_1t_3)]} = (5t_1 + 2t_2 + 3t_3)^2 exp[(1/2)(5t_1^2 + 3t_2^2 + 2t_3^2 + 4t_1t_2 + 6t_1t_3)] + 5 exp[(1/2)(5t_1^2 + 3t_2^2 + 2t_3^2 + 4t_1t_2 + 6t_1t_3)]

The second line above is the second partial derivative of the m.g.f. with respect to t_1; putting t_1 = 0, t_2 = 0, t_3 = 0 gives E[X_1^2] = 5.
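The same moment computed in sympy:

```python
import sympy as sp

t1, t2, t3 = sp.symbols('t1 t2 t3')

M = sp.exp((5*t1**2 + 3*t2**2 + 2*t3**2 + 4*t1*t2 + 6*t1*t3) / 2)

# E[X1^2]: differentiate twice w.r.t. t1, then set t1 = t2 = t3 = 0.
print(sp.diff(M, t1, 2).subs({t1: 0, t2: 0, t3: 0}))   # 5
```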
CHARACTERISTIC FUNCTIONS:
The characteristic function of a random variable X is defined by ϕ(t) = E[e^{itX}]. Given that i = √−1 and that e^{itx} = cos(tx) + i sin(tx), then

|e^{itx}| = 1

Hence a characteristic function can be defined for every random variable.
Properties of a characteristic function
(i) Existence: Let f(x) be a density function with characteristic function ϕ(t), where t is a real parameter. Then

|ϕ(t)| = |∫_{−∞}^{∞} e^{itx} f(x) dx| ≤ ∫_{−∞}^{∞} |e^{itx}| f(x) dx = ∫_{−∞}^{∞} f(x) dx = 1

Thus ∀t, the characteristic function always exists.
(ii) Uniform continuity: Let h be real. Then

|ϕ(t + h) − ϕ(t)| = |∫_{−∞}^{∞} e^{itx} (e^{ihx} − 1) f(x) dx| ≤ ∫_{−∞}^{∞} |e^{ihx} − 1| f(x) dx

Thus,

lim_{h→0} |ϕ(t + h) − ϕ(t)| ≤ lim_{h→0} ∫_{−∞}^{∞} |e^{ihx} − 1| f(x) dx = 0

Hence ϕ(t + h) − ϕ(t) → 0 as h → 0, and since the bound does not depend on t, ϕ(t) is a uniformly continuous function of t.
Moments and characteristic function:
Where the relevant moments exist, the characteristic function ϕ(t) is continuously differentiable in its domain, and differentiating r times under the integral sign gives

ϕ^{(r)}(t) = ∫_{−∞}^{∞} (ix)^r e^{itx} f(x) dx, so that ϕ^{(r)}(0) = i^r E[X^r].

Thus the r-th moment of X is given by

E[X^r] = ϕ^{(r)}(0) / i^r

Similarly we can obtain moments for a discrete random variable.
e.g. let X have a Poisson distribution with parameter λ. Then

ϕ_X(t) = ∑_{x=0}^∞ e^{itx} e^{−λ} λ^x / x! = e^{−λ} ∑_{x=0}^∞ (λe^{it})^x / x! = e^{−λ} e^{λe^{it}} = e^{−λ(1−e^{it})}

The first and second derivatives of ϕ(t) are

ϕ′(t) = λi e^{it} e^{−λ(1−e^{it})}

ϕ″(t) = λ^2 i^2 e^{−λ(1−e^{it})} e^{2it} + λi^2 e^{−λ(1−e^{it})} e^{it}

The mean and variance of X are

E[X] = ϕ′(0)/i = λ

and

Var(X) = ϕ″(0)/i^2 − [ϕ′(0)/i]^2
= λ^2 + λ − (λ)^2
= λ
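A sympy check of the Poisson characteristic function moments (lam stands for λ):

```python
import sympy as sp

t, lam = sp.symbols('t lam', positive=True)

phi = sp.exp(-lam * (1 - sp.exp(sp.I * t)))                 # Poisson c.f.

EX = sp.simplify(sp.diff(phi, t).subs(t, 0) / sp.I)         # phi'(0)/i
EX2 = sp.simplify(sp.diff(phi, t, 2).subs(t, 0) / sp.I**2)  # phi''(0)/i^2
print(EX)                        # lam
print(sp.simplify(EX2 - EX**2))  # lam
```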
Characteristic functions of random vectors:
Let X = (X_1, X_2, ..., X_p)′ be a p-dimensional random vector. Further let f(x) = f(x_1, x_2, ..., x_p) be its joint probability distribution function. The characteristic function of X is defined by

ϕ(t_1, t_2, ..., t_p) = E[exp(it_1X_1 + it_2X_2 + ... + it_pX_p)] = ∫ ... ∫ exp(i ∑_{k=1}^p t_k x_k) f(x_1, x_2, ..., x_p) dx_1 dx_2 ... dx_p

if X is continuous, and

ϕ(t_1, t_2, ..., t_p) = E[exp(it_1X_1 + it_2X_2 + ... + it_pX_p)] = ∑_{x_1} ... ∑_{x_p} exp(i ∑_{k=1}^p t_k x_k) f(x_1, ..., x_p)

if X is discrete, where t = (t_1, t_2, ..., t_p)′ and the t_i, i = 1, 2, ..., p, are real.
Properties of joint characteristic functions:
(i) Existence: The characteristic function of a random vector X always exists; ϕ(0, 0, ..., 0) = 1.
(ii) Marginal characteristic functions: Partition X as X = (X_1′, X_2′)′, where X_1 is r × 1 and X_2 is s × 1, r + s = p. The characteristic functions of X_1 and X_2 are simply ϕ(t_1, 0) and ϕ(0, t_2); i.e. setting the other block of t to zero gives the marginal c.f. of X_i (i = 1, 2).
Moments of random variables: For each X_i of X, its r-th moment can be obtained by differentiating the joint c.f. r times with respect to t_i, dividing by i^r, and putting all the t_j, j = 1, 2, ..., p, equal to zero. Also E[X_i^r X_j^s] can be obtained from the joint c.f. by differentiating it r times with respect to t_i and s times with respect to t_j:

E[X_i^r X_j^s] = (1/i^{r+s}) ∂^{r+s} ϕ(t) / (∂t_i^r ∂t_j^s) |_{t=0}
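A minimal sympy illustration using a hypothetical standard bivariate normal with correlation ρ, whose c.f. is exp[−(t_1^2 + 2ρt_1t_2 + t_2^2)/2]:

```python
import sympy as sp

t1, t2, rho = sp.symbols('t1 t2 rho')

# Hypothetical joint c.f.: standard bivariate normal, correlation rho.
phi = sp.exp(-(t1**2 + 2*rho*t1*t2 + t2**2) / 2)

# r = s = 1: E[X1*X2] = (1/i^2) * d^2 phi / dt1 dt2 at t = 0.
print(sp.diff(phi, t1, t2).subs({t1: 0, t2: 0}) / sp.I**2)   # rho
```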