Title: Probability Formula Sheet
Description: This formula sheet collects the useful formulas needed to solve problems in probability and statistics. If you want a discount, a different format, custom changes to the document, any other document, or a professional CV/Resume designed to help you get a job fast, please email me at harishassan1995@gmail.com

Document Preview

Extracts from the notes are below.


Probability Formula Sheet Haris H
...

Axiom 3:
If A ∩ B = ∅, then P[A ∪ B] = P[A] + P[B]
It states that the total probability (mass) in two disjoint objects is the sum of the individual probabilities (masses)
...
If A_1, A_2, ... is a sequence of events such that A_i ∩ A_j = ∅ for all i ≠ j, then
P[⋃_{k=1}^∞ A_k] = ∑_{k=1}^∞ P[A_k]

Addition Rule:
P[A ∪ B] = P[A] + P[B] − P[A ∩ B]
If A & B are mutually exclusive, then P[A ∪ B] = P[A] + P[B]

Multiplication Rule:
P[A ∩ B] = P[A]P[B|A] = P[B]P[A|B]
If A & B are independent, then P[A ∩ B] = P[A]P[B]

Probability Laws

Total probability law:
P[A] = P[A ∩ B_1] + P[A ∩ B_2] + ... + P[A ∩ B_n] OR
P[A] = P[A|B_1]P[B_1] + P[A|B_2]P[B_2] + ... + P[A|B_n]P[B_n] OR
P[A] = P[A|B]P[B] + P[A|B^c]P[B^c]

Corollaries:
P[A^c] = 1 − P[A]
1 = P[S] = P[A^c] + P[A]
P[A] ≤ 1
P[A ∪ B] = P[A] + P[B] − P[A ∩ B]
P[A ∪ B] ≤ P[A] + P[B]
P[∅] = 0
If A ⊂ B, then P[A] ≤ P[B]
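As a quick sanity check of these rules, here is a minimal Python sketch (an illustration added to this preview, not part of the original sheet) that verifies the addition rule, the total probability decomposition, and Axiom 3 by enumeration over a fair six-sided die; the events A and B are arbitrary choices:

    from fractions import Fraction

    # Equiprobable sample space: one roll of a fair six-sided die.
    S = {1, 2, 3, 4, 5, 6}

    def P(event):
        return Fraction(len(event & S), len(S))  # P[E] = |E| / |S|

    A = {2, 4, 6}   # "roll is even"
    B = {1, 2, 3}   # "roll is at most 3"

    # Addition rule: P[A ∪ B] = P[A] + P[B] - P[A ∩ B]
    assert P(A | B) == P(A) + P(B) - P(A & B)

    # Total probability with the partition {B, B^c}: P[A] = P[A ∩ B] + P[A ∩ B^c]
    assert P(A) == P(A & B) + P(A & (S - B))

    # Axiom 3: disjoint events add.
    C, D = {1}, {2, 3}
    assert C & D == set() and P(C | D) == P(C) + P(D)
    print("all identities hold")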

Conditional probability definition:
P[A|B] = P[A ∩ B] / P[B]

A & B are mutually exclusive if P[A ∩ B] = 0
A & B are independent if P[A|B] = P[A] & P[B|A] = P[B] & P[A ∩ B] = P[A]P[B]

Bayes' Law:
P[B_i|A] = P[B_i ∩ A] / P[A] = P[A|B_i]P[B_i] / P[A] = P[A|B_i]P[B_i] / ∑_{k=1}^n P[A|B_k]P[B_k]
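To make Bayes' law concrete, the following Python sketch (my addition, with hypothetical numbers) uses a two-event partition B1 = "condition present", B2 = "condition absent" and an observed event A = "test positive"; the denominator is exactly the total probability law:

    from fractions import Fraction

    P_B = {"B1": Fraction(1, 100), "B2": Fraction(99, 100)}        # priors P[B_k]
    P_A_given = {"B1": Fraction(95, 100), "B2": Fraction(5, 100)}  # P[A|B_k]

    # Total probability law: P[A] = sum over k of P[A|B_k] P[B_k]
    P_A = sum(P_A_given[k] * P_B[k] for k in P_B)

    # Bayes' law: P[B_i|A] = P[A|B_i] P[B_i] / P[A]
    posterior = {k: P_A_given[k] * P_B[k] / P_A for k in P_B}
    print(float(posterior["B1"]))  # ~0.161: still unlikely despite a positive test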


Computing probabilities by counting methods

Sampling with Replacement and with Ordering:
Number of distinct ordered k-tuples = n^k

Sampling without Replacement and with Ordering:
Number of distinct ordered k-tuples = n(n − 1)...(n − k + 1)

Sampling without Replacement and without Ordering:
C_k^n · k! = n(n − 1)...(n − k + 1)
C_k^n = n(n − 1)...(n − k + 1) / k! = n! / (k!(n − k)!) ≜ (n choose k)

Sampling with Replacement and without Ordering:
(n − 1 + k choose k) = (n − 1 + k choose n − 1)
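All four counts are easy to cross-check with the Python standard library (a sketch added here; n and k are arbitrary):

    import math

    n, k = 7, 3

    # With replacement, with ordering: n^k
    assert n ** k == 343

    # Without replacement, with ordering: n(n-1)...(n-k+1)
    assert math.perm(n, k) == n * (n - 1) * (n - 2)

    # Without replacement, without ordering: C(n, k) = n! / (k!(n-k)!)
    assert math.comb(n, k) == math.factorial(n) // (math.factorial(k) * math.factorial(n - k))

    # With replacement, without ordering: C(n-1+k, k) = C(n-1+k, n-1)
    assert math.comb(n - 1 + k, k) == math.comb(n - 1 + k, n - 1)
    print("counting formulas agree")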

Discrete Probability Distributions

Mean of the random variable X:
m_X = E[X] = ∑_{x∈S_X} x p_X(x)

Variance of the random variable X:
σ_X² = VAR[X] = E[(X − m_X)²] = ∑_k (x_k − m_X)² p_X(x_k) = E[X²] − m_X²

Standard deviation of the random variable X:
σ_X = STD[X] = VAR[X]^(1/2)

X         | X counts                                         | p_X                  | Values of x  | E[X] | VAR[X]
Bernoulli | Equals one if the event A occurs, zero otherwise | p_0 = 1 − p, p_1 = p | 0, 1         | p    | p(1 − p)
Binomial  | ...                                              | ...                  | 0, 1, ..., n | np   | np(1 − p)
Geometric | Number of trials up through 1st success          | ...                  | 1, 2, ...    | ...  | ...
Uniform   | Outcomes are equally likely                      | 1/L                  | 1, 2, ..., L | ...  | ...
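The E[X] and VAR[X] columns follow from the mean and variance definitions above; this short Python check (my addition) reproduces the binomial row, np and np(1 − p), by summing over the pmf:

    import math

    n, p = 10, 0.3
    # Binomial pmf: p_X(k) = C(n, k) p^k (1-p)^(n-k)
    pmf = {k: math.comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)}

    mean = sum(k * pk for k, pk in pmf.items())               # E[X] = sum of k p_X(k)
    var = sum((k - mean) ** 2 * pk for k, pk in pmf.items())  # E[(X - m_X)^2]

    assert abs(mean - n * p) < 1e-9
    assert abs(var - n * p * (1 - p)) < 1e-9
    print(mean, var)  # 3.0, 2.1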

Cumulative distribution function: F_X(x) = P[X ≤ x], for −∞ < x < ∞

Properties:
0 ≤ F_X(x) ≤ 1
lim_{x→−∞} F_X(x) = 0
F_X(x) is a non-decreasing function of x; that is, if a ≤ b, then F_X(a) ≤ F_X(b)
...
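For a concrete discrete case, here is a small Python sketch (my example, a fair die) showing the listed CDF properties, values confined to [0, 1] and monotonicity:

    from fractions import Fraction

    def F(x):
        # F_X(x) = P[X <= x] for a fair six-sided die.
        return Fraction(sum(1 for k in range(1, 7) if k <= x), 6)

    xs = [-1, 0, 1, 2.5, 3, 6, 10]
    vals = [F(x) for x in xs]
    assert all(0 <= v <= 1 for v in vals)               # 0 <= F_X(x) <= 1
    assert all(a <= b for a, b in zip(vals, vals[1:]))  # non-decreasing in x
    print([str(v) for v in vals])  # ['0', '0', '1/6', '1/3', '1/2', '1', '1']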


CDF for continuous/discrete random variable:
F_X(x) = ∫_{−∞}^x f(t) dt = ∫_{−∞}^x ∑_k p_X(x_k) δ(t − x_k) dt

Probability density function of X:
f_X(x) = dF_X(x)/dx = ∑_k p_X(x_k) δ(x − x_k)

Properties:
f_X(x) ≥ 0 ; P[a ≤ X ≤ b] = ∫_a^b f_X(x) dx
F_X(x) = ∫_{−∞}^x f_X(t) dt ; 1 = ∫_{−∞}^∞ f_X(t) dt

Expected value or mean of a random variable X:
E[X] = ∫_{−∞}^∞ t f_X(t) dt

The nth moment of the random variable X:
E[X^n] = ∫_{−∞}^∞ x^n f_X(x) dx

Variance of the random variable X:
σ_X² = VAR[X] = E[(X − E[X])²] = E[X²] − E[X]²

Properties of Expected value and Variance:
E[c] = c ; E[X + c] = E[X] + c ; E[cX] = cE[X]
VAR[X] ≥ 0 ; VAR[c] = 0 ; VAR[X + c] = VAR[X] ; VAR[cX] = c² VAR[X]

Uniform (outcomes with equal density):
f_X(x) = 1/(b − a) for a ≤ x ≤ b ; 0 for x < a and x > b
F_X(x) = 0 for x < a ; (x − a)/(b − a) for a ≤ x ≤ b ; 1 for x > b
E[X] = (1/(b − a)) ∫_a^b t dt = (a + b)/2 ; VAR[X] = (b − a)²/12

Exponential (time until an event):
f_X(x) = λe^(−λx) for x ≥ 0 ; 0 for x < 0
F_X(x) = 1 − e^(−λx) for x ≥ 0 ; 0 for x < 0
E[X] = 1/λ ; VAR[X] = 1/λ²

Gaussian (normal), arising as the number of components becomes large:
f_X(x) = (1/(√(2π) σ)) e^(−(x−m)²/(2σ²)) ; F_X(x) = Φ((x − m)/σ)
E[X] = m ; VAR[X] = σ²

where
Q(x) = 1 − Φ(x) = (1/√(2π)) ∫_x^∞ e^(−t²/2) dt ; Q(0) = 1/2 & Q(−x) = 1 − Q(x) ;
Φ(−x) = Q(x) & Q(−x) = Φ(x)
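Q(x) has no elementary closed form, but it can be evaluated with the Python standard library through the identity Q(x) = ½ erfc(x/√2) (a standard identity, used here in a sketch I have added):

    import math

    def Q(x):
        # Q(x) = 1 - Phi(x) = 0.5 * erfc(x / sqrt(2))
        return 0.5 * math.erfc(x / math.sqrt(2))

    def Phi(x):
        return 1.0 - Q(x)

    assert abs(Q(0) - 0.5) < 1e-15              # Q(0) = 1/2
    for x in (0.5, 1.0, 2.0):
        assert abs(Q(-x) - (1 - Q(x))) < 1e-12  # Q(-x) = 1 - Q(x)
        assert abs(Phi(-x) - Q(x)) < 1e-12      # Phi(-x) = Q(x)
    print(Q(1.0))  # ~0.1587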

Bounds for probability




Markov inequality states that
P[X ≥ a] ≤ E[X] / a, for X nonnegative

Chebyshev inequality states that
P[|X − m| ≥ a] ≤ σ² / a²
where σ² is the variance and m is the mean of X
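Both bounds can be seen on an exponential random variable, whose tail P[X ≥ a] = e^(−λa) is known exactly; the Python sketch below (my addition) compares each bound with the true probability:

    import math

    lam = 2.0                         # rate: E[X] = 1/lam, VAR[X] = 1/lam**2
    mean, var = 1 / lam, 1 / lam ** 2

    for a in (0.5, 1.0, 2.0, 4.0):
        tail = math.exp(-lam * a)     # exact P[X >= a]
        assert tail <= mean / a       # Markov: P[X >= a] <= E[X]/a

        # P[|X - mean| >= a] = P[X >= mean + a] + P[X <= mean - a]
        dev = math.exp(-lam * (mean + a))
        if mean - a > 0:
            dev += 1 - math.exp(-lam * (mean - a))
        assert dev <= var / a ** 2    # Chebyshev
    print("both bounds hold on this example")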

Pairs of Random Variables
Joint probability mass function:



p_X,Y(x_j, y_k) ≜ P[{X = x_j} ∩ {Y = y_k}] , (x_j, y_k) ∈ S_X,Y ; ∑_{j=1}^∞ ∑_{k=1}^∞ p_X,Y(x_j, y_k) = 1

Marginal probability mass functions:
p_X(x_j) = P[X = x_j] = P[X = x_j, Y = anything]
= P[{X = x_j and Y = y_1} ∪ {X = x_j and Y = y_2} ∪ ...]
= ∑_{k=1}^∞ p_X,Y(x_j, y_k) , and similarly for p_Y(y_k)
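Storing the joint pmf as a table makes marginalization mechanical; here is a minimal Python sketch (my illustration, with an arbitrary 2x2 joint pmf):

    from fractions import Fraction

    # Hypothetical joint pmf on S_X,Y = {0,1} x {0,1}.
    p_XY = {
        (0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
        (1, 0): Fraction(2, 8), (1, 1): Fraction(2, 8),
    }
    assert sum(p_XY.values()) == 1  # the joint pmf sums to one

    # Marginals: p_X(x) = sum over k of p_XY(x, y_k), and similarly for p_Y.
    p_X = {x: sum(p for (xj, _), p in p_XY.items() if xj == x) for x in (0, 1)}
    p_Y = {y: sum(p for (_, yk), p in p_XY.items() if yk == y) for y in (0, 1)}
    print(p_X, p_Y)  # p_X = {0: 1/2, 1: 1/2}, p_Y = {0: 3/8, 1: 5/8}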
...
For the joint CDF of a uniform joint density on the unit square (f_X,Y(x′, y′) = 1 for 0 ≤ x′, y′ ≤ 1):
If x < 0 or y < 0, then F_X,Y(x, y) = 0
...
If 0 ≤ x ≤ 1 and y > 1,
F_X,Y(x, y) = ∫_0^1 ∫_0^x 1 dx′ dy′ = x
...
Finally, if x > 1 and y > 1,
F_X,Y(x, y) = ∫_0^1 ∫_0^1 1 dx′ dy′ = 1

Independence of two variables:

If X and Y are independent discrete random variables, then the joint pmf is equal to the product of the marginal pmfs
...

Covariance of X and Y:

COV(X,Y)=E[XY]-E[X]E[Y]
Pairs of independent random variables have covariance zero
...
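The zero-covariance property can be checked by constructing an independent pair, where the joint pmf is the product of the marginals (a sketch I have added, with arbitrary marginals):

    from fractions import Fraction
    from itertools import product

    # Independent X and Y: joint pmf = product of marginals.
    p_X = {0: Fraction(1, 4), 1: Fraction(3, 4)}
    p_Y = {1: Fraction(1, 2), 2: Fraction(1, 2)}
    p_XY = {(x, y): p_X[x] * p_Y[y] for x, y in product(p_X, p_Y)}

    E_X = sum(x * p for x, p in p_X.items())
    E_Y = sum(y * p for y, p in p_Y.items())
    E_XY = sum(x * y * p for (x, y), p in p_XY.items())

    # COV(X, Y) = E[XY] - E[X]E[Y] = 0 under independence
    assert E_XY - E_X * E_Y == 0
    print("covariance:", E_XY - E_X * E_Y)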

