Title: Olofsson Chapter 2 Study Guide
Description: This study guide is a complete and well-organized resource for anyone taking Math 402 or reviewing key topics in probability and statistics. It covers a wide range of problems, each with clear, step-by-step solutions that explain the reasoning and math behind every answer. Topics include everything from roulette bets and dice games to exponential waiting times, Poisson processes, and size-biased sampling. The guide walks through each concept carefully—using formulas, examples, and detailed explanations—so you can really understand how and why things work. You’ll find material on expected values, variances, conditional probabilities, memoryless properties, and more. It also connects theory to practical situations, like server times, machine reliability, and how traits show up in populations. Whether you're studying for an exam, brushing up on core concepts, or just need help working through tough problems, this guide will help you learn faster and feel more confident. Plus, with its clear structure and thorough explanations, it's easy to follow and a great resource to refer back to.
Document Preview
Math 402 Study Guide
Problem 1
In a “four-number bet” in roulette, you win if any of the numbers 00, 0, 1, or 2 comes up
...
To compute the fair payout that yields an expected loss of −1/19, consider the
following: There are 38 equally likely outcomes in American roulette
...
winning the four-number bet is 4/38
Let x be the payout upon winning
...
(b) The actual payout for this bet is 8:1
...
With an 8:1 payout, the net gain is 8 dollars when you win, and the loss is 1
dollar when you lose
...
E[net gain] = 8 · (4/38) − 1 · (34/38) = −2/38 = −1/19 ≈ −0.0526
Therefore, the expected loss is −1/19 dollars, or approximately $0.0526, per round.
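As a sanity check, the expectation above can be computed exactly with rational arithmetic. This is a small sketch (the helper name `expected_net` is ours; the 4-in-38 win probability and the 8:1 payout come from the problem):

```python
from fractions import Fraction

def expected_net(win_outcomes: int, total_outcomes: int, payout: int) -> Fraction:
    """Expected net gain per $1 bet: a win pays `payout`, a loss costs 1."""
    p_win = Fraction(win_outcomes, total_outcomes)
    return payout * p_win - (1 - p_win)

# Four-number bet in American roulette: 4 winning outcomes out of 38, 8:1 payout.
ev = expected_net(4, 38, 8)
print(ev, float(ev))  # -1/19 -0.05263157894736842
```

Using `Fraction` avoids rounding, so the −1/19 figure above falls out exactly.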
Problem 2
In a dice game, you bet $2 on a number from 1 to 6
...
If your
number appears on k ∈ {1, 2, 3} dice, you win 2k dollars (and keep your original wager)
...
What is your expected net
profit or loss per round?
Solution
...
We first compute the probabilities P (k) for k = 0, 1, 2, 3 matches, using the binomial distribution:
P (k) = C(3, k) (1/6)^k (5/6)^{3−k} ,  k = 0, 1, 2, 3,
which gives P (0) = 125/216, P (1) = 75/216, P (2) = 15/216, P (3) = 1/216
A $2 stake yields a net profit of 2k dollars when k ≥ 1, and a loss of $2 when k = 0:
X = 2k for k = 1, 2, 3, and X = −2 for k = 0.
E[X] = 2(75/216) + 4(15/216) + 6(1/216) − 2(125/216) = −34/216 ≈ −0.1574 dollars
...
i.e., an expected loss of about $0.16 per round
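The match probabilities and the expectation can be verified with exact fractions; a minimal sketch (the function name is ours, the $2 stake and 2k payout come from the problem):

```python
from fractions import Fraction
from math import comb

def p_matches(k: int) -> Fraction:
    """P(your number shows on exactly k of 3 fair dice)."""
    return comb(3, k) * Fraction(1, 6) ** k * Fraction(5, 6) ** (3 - k)

# Net profit X: +2k dollars when k >= 1 dice match, -2 dollars when none match.
ev = sum(2 * k * p_matches(k) for k in (1, 2, 3)) - 2 * p_matches(0)
print(ev, round(float(ev), 4))  # -17/108 -0.1574
```

Note that −34/216 reduces to −17/108, which is the value `Fraction` reports.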
...
One contains an unknown amount of money, and the
other contains 1.5 times that amount
...
You pick one envelope at random and find $120 inside
...
(a) Compute the expected value if you switch
...
We identify the two possibilities for the unknown envelope:
80 = 120/1.5 (if 120 is the larger amount) and 180 = 1.5 × 120 (if 120 is the smaller amount).
Assigning probability 1/2 to each case gives
E[other envelope] = (1/2)(80) + (1/2)(180) = 130.
Because you already hold $120, this suggests an expected gain of $10 by switching
...
Solution
...
Let X denote the smaller of the two amounts
prepared
...
However, the actual probability distribution over X determines which scenario is more likely,
and the equal probability assumption generally does not follow from a coherent prior
...
This leads to a contradiction of the conservation of
expected value and thus reveals the logical flaw
...
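The "conservation of expected value" point can be checked by simulation. In this sketch the prior on the smaller amount is an arbitrary assumption (uniform on $10 to $100), and the 1.5× ratio is taken from part (a); always switching earns the same long-run average as always keeping:

```python
import random

random.seed(3)

N = 100_000
keep_total = switch_total = 0.0
for _ in range(N):
    x = random.uniform(10, 100)   # smaller amount; this prior is an assumption
    pair = (x, 1.5 * x)           # the two envelopes
    pick = random.randrange(2)    # you grab one uniformly at random
    keep_total += pair[pick]
    switch_total += pair[1 - pick]

# Both strategies have the same expectation (1.25 times the mean smaller
# amount); switching offers no systematic $10 gain.
print(round(keep_total / N, 1), round(switch_total / N, 1))
```

Under any coherent prior the two averages agree, which is exactly the contradiction with the naive $10-gain argument.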
, n, and mean µ = E[X]
...
(a) Show that the conditional distribution for X is:
p̂k = k pk /µ
Solution
...
By Bayes’ theorem,
Pr(X = k | A) = Pr(A | X = k) Pr(X = k) / Pr(A)
...
Since the
proportionality constant cancels during normalization, we may write:
Pr(A | X = k) ∝ k
...
and the normalizing constant is Σ_{j=1}^{n} j pj = µ
Combining the components yields the conditional probability:
Pr(X = k | A) = k pk /µ = p̂k ,  k = 1, . . . , n
(b) Explain why this is referred to as a “size-biased” distribution
...
The term “size-biased” arises because the original mass pk is weighted by the
“size” factor k before renormalization
...
The observation that a genotype is present biases the distribution in favor of larger
group sizes, hence the name size-biased
...
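A short simulation makes the size-biasing visible. The base distribution below is a made-up example (groups of size 1 to 3), not from the problem; sampling one *member* uniformly at random reproduces p̂k = k pk /µ:

```python
import random
from collections import Counter

random.seed(1)

# Illustrative population: number of groups of each size k (made-up counts).
groups = {1: 500, 2: 300, 3: 200}
total = sum(groups.values())
p = {k: n / total for k, n in groups.items()}        # base PMF p_k
mu = sum(k * pk for k, pk in p.items())              # E[X] = 1.7 here

# Theoretical size-biased PMF: k * p_k / mu.
size_biased = {k: k * pk / mu for k, pk in p.items()}

# Empirical check: pick a member uniformly and record their group's size.
members = [k for k, n in groups.items() for _ in range(k * n)]
counts = Counter(random.choice(members) for _ in range(100_000))
for k in sorted(p):
    print(k, round(size_biased[k], 3), round(counts[k] / 100_000, 3))
```

The empirical column tracks the k pk /µ column: larger groups are over-represented exactly by the factor k.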
Of these, nk have k children, where k ∈ {1, 2, 3, 4}
...
You observe one child
playing and note the number of children in their household
...
Show that Pr(family size = k | A) = k pk /µ.
Solution
...
We apply Bayes’ theorem:
Pr(family size = k | A) = Pr(A | k) Pr(family size = k) / Pr(A)
...
Since the total number of children is N = Σj j nj , we
may write:
Pr(A | k) ∝ k
...
Hence the conditional probability becomes:
Pr(family size = k | A) = k pk / Σ_{j=1}^{4} j pj = k pk /µ
...
This result is algebraically identical to the genotype problem in Problem 4
...
The structure is the same,
creating a size-biased distribution; the only difference lies in the interpretation—household
size versus genotype carrier count
...
distribution with mean 12 seconds
...
Solution
...
E[T ] = 1/λ = 12 seconds and Var[T ] = 1/λ² = 144
(b) Find Pr(T ≤ 6)
...
Using the exponential CDF F (t) = 1 − e−λt :
Pr(T ≤ 6) = 1 − e^{−λ·6} = 1 − e^{−6/12} = 1 − e^{−0.5} ≈ 1 − 0.60653 ≈ 0.3935
...
(c) What is the probability that the next job arrives within 4 seconds, given that the last
job arrived 20 seconds ago?
Solution
...
Hence,
Pr(T ≤ 24 | T > 20) = Pr(T ≤ 4) = 1 − e^{−λ·4} = 1 − e^{−4/12} = 1 − e^{−1/3} ≈ 1 − 0.7165 ≈ 0.283
...
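The memoryless step in part (c) can be checked empirically; a sketch with the problem's rate λ = 1/12 (the sample count and seed are arbitrary):

```python
import random

random.seed(0)

RATE = 1 / 12            # exponential rate: mean interarrival time 12 s
N = 200_000

samples = [random.expovariate(RATE) for _ in range(N)]

# Unconditional: Pr(T <= 4).
p_uncond = sum(t <= 4 for t in samples) / N

# Conditional: Pr(T <= 24 | T > 20), i.e. "within 4 more seconds".
tail = [t for t in samples if t > 20]
p_cond = sum(t <= 24 for t in tail) / len(tail)

# Both estimates sit near 1 - e^(-1/3), about 0.283.
print(round(p_uncond, 3), round(p_cond, 3))
```

The two estimates agree to within sampling noise, which is the memoryless property in action.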
Solution
...
0.368
...
0.393, 0.368, respectively
...
The shelf life T
(in hours) of a mango at this facility follows an exponential distribution with a mean of 120
hours
...
Suppose a shipment contains 6 mangoes
...
Solution
...
p = 1 − e^{−0.05} ≈ 0.0488.
Hence, the probability of passing is q = 1 − p ≈ 0.9512
...
For a crate of n = 6 mangoes, the number Y of spoiled mangoes follows a binomial distribution:
Y ∼ Binomial(n = 6, p ≈ 0.0488),  Pr(Y = k) = C(6, k) p^k q^{6−k} ,  k = 0, 1, . . . , 6
...
Solution
...
E[Y ] = np = 6 · 0.04877 ≈ 0.2926
...
(c) Compute the variance of the number of spoiled mangoes per crate
...
The variance of a binomial distribution is:
Var[Y ] = npq = 6 · 0.04877 · 0.95123 ≈ 0.2783
Thus, on average, roughly 0.29 mangoes per crate are spoiled, with a variance of about 0.28
...
(a) Compute Pr(X > 90)
...
Using the exponential survival function Pr(X > t) = e−λt , we determine λ from
the half-life:
λ = ln 2 / 70 ≈ 0.0099 per minute
Then,
Pr(X > 90) = e^{−λ·90} = exp(−0.0099 · 90) ≈ e^{−0.891} ≈ 0.410
...
Solution
...
Then,
Pr(X ≤ 40) = 1 − e^{−λ·40} = 1 − exp(−0.0099 · 40) ≈ 1 − 0.673 ≈ 0.327
...
Solution
...
E[X] = 1/λ = 70/ln 2 ≈ 101 min ≈ 1.01 × 10² min,
Var[X] = 1/λ² = (70/ln 2)² ≈ 1.02 × 10⁴ min²
Problem 9
Calls arrive at a hotline according to a Poisson process with mean rate 3 calls per minute
...
(a) Derive the PMF of Z, the number of calls conditioned on the minute being busy
...
Since Y ∼ Poisson(λ = 3), the conditional PMF of Z given that the minute is
busy (i.e., Y ≥ 1) is:
Pr(Z = k) = Pr(Y = k | Y ≥ 1) = Pr(Y = k)/Pr(Y ≥ 1) = (e^{−3} · 3^k /k!)/(1 − e^{−3} ),
k ≥ 1
...
Solution
...
E[Z] = Σ_{k=1}^{∞} k Pr(Z = k) = (1/(1 − e^{−λ} )) Σ_{k=1}^{∞} k · Pr(Y = k)
Since Σ_{k=1}^{∞} k · Pr(Y = k) = E[Y ] = λ = 3, it follows that:
E[Z] = λ/(1 − e^{−λ} ) = 3/(1 − e^{−3} ) ≈ 3/0.9502 ≈ 3.16 calls
...
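Both the normalization and the conditional mean can be verified numerically; a minimal sketch with λ = 3 from the problem (the truncation at k = 60 is an arbitrary cutoff where the Poisson tail is negligible):

```python
from math import exp, factorial

LAM = 3.0

def pmf_z(k: int) -> float:
    """PMF of Z = (Y | Y >= 1) for Y ~ Poisson(LAM)."""
    return exp(-LAM) * LAM ** k / factorial(k) / (1 - exp(-LAM))

total = sum(pmf_z(k) for k in range(1, 60))
mean_z = sum(k * pmf_z(k) for k in range(1, 60))
print(round(total, 6), round(mean_z, 4))  # 1.0 3.1572
```

The PMF sums to 1 over k ≥ 1, and the mean matches λ/(1 − e^{−λ}), slightly above 3 because the quiet minutes have been conditioned away.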
You want the probability that X falls within c standard deviations of the mean to be at
least 98%
...
0.98
Solution
...
Then,
Pr(µ − cσ ≤ X ≤ µ + cσ) = Pr(|Z| ≤ c) = 2Φ(c) − 1,
where Φ is the standard normal cumulative distribution function
...
2Φ(c) − 1 ≥ 0.98  ⇒  Φ(c) ≥ 0.99
Taking the inverse CDF:
c = Φ⁻¹(0.99) ≈ 2.326,
so c ≈ 2.33
...
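Python's standard library can reproduce this quantile without normal tables; a small sketch using `statistics.NormalDist` (available since Python 3.8):

```python
from statistics import NormalDist

std_normal = NormalDist()           # mean 0, sigma 1

# Pr(|Z| <= c) >= 0.98  <=>  Phi(c) >= 0.99.
c = std_normal.inv_cdf(0.99)
print(round(c, 3))                  # 2.326

# Check the two-sided coverage 2*Phi(c) - 1.
print(round(2 * std_normal.cdf(c) - 1, 4))  # 0.98
```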
(a) Define the discrete failure rate function r(k) = Pr(X = k | X ≥ k)
...
The failure rate function is defined in terms of the PMF and the survival function:
r(k) = Pr(X = k)/Pr(X ≥ k) = pX (k)/SX (k),  k = 1, 2, . . .
...
(b) Suppose X is geometric with success probability p = 0.3
...
Sketch the PMF and failure
rate
...
Solution
...
pX (k) = 0.7^{k−1} · 0.3
...
The survival function becomes:
SX (k) = Σ_{j=k}^{∞} 0.7^{j−1} · 0.3 = 0.7^{k−1}
Thus, the failure rate is:
r(k) = (0.7^{k−1} · 0.3)/0.7^{k−1} = 0.3,
which is constant in k
...
• The PMF pX (k) = 0.3 · 0.7^{k−1} is a strictly decreasing geometric sequence, dropping
from 0.3
...
• The failure rate r(k) = 0.3 is constant
...
The PMF decays because later failures become less likely in absolute terms, whereas the
failure rate reflects a constant conditional chance of failure on the next trial, given survival
so far—demonstrating the memoryless property
...
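The constant hazard is easy to confirm numerically; a sketch with p = 0.3 from part (b) (the function names are ours):

```python
P = 0.3  # geometric success probability from part (b)

def pmf(k: int) -> float:
    """Pr(X = k) = 0.7^(k-1) * 0.3."""
    return (1 - P) ** (k - 1) * P

def survival(k: int) -> float:
    """Pr(X >= k) = 0.7^(k-1)."""
    return (1 - P) ** (k - 1)

# The hazard r(k) = pmf(k) / survival(k) stays at 0.3 for every k,
# while the PMF itself decays geometrically.
rates = [pmf(k) / survival(k) for k in range(1, 8)]
print([round(r, 10) for r in rates])  # [0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3]
```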
The probability that the machine lasts
more than 4000 hours is 0.25
...
We want to double this survival probability by scaling the
failure rate to cr(t) where c < 1
...
The survival function is given by:
S(t) = exp(− ∫₀ᵗ r(u) du) = e^{−H(t)} ,  where H(t) = ∫₀ᵗ r(u) du
...
S(4000) = 0.25  ⇒  H = − ln(0.25) = ln 4
...
Modified hazard under scaling:
Sc (4000) = exp(−cH) = e^{−c ln 4} = (e^{− ln 4} )^c = 0.25^c
...
Set Sc (4000) = 0.50:
0.25^c = 0.50  ⇒  c ln 0.25 = ln 0.50  ⇒  c = ln 0.50 / ln 0.25 = (− ln 2)/(−2 ln 2) = 1/2
Conclusion:
c = 0.5
...
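The algebra can be double-checked in a few lines; a sketch using the survival-hazard relation S(t) = e^{−H(t)} with the values from the problem:

```python
from math import log

S_ORIG = 0.25            # Pr(lifetime > 4000 h)
S_TARGET = 0.50          # doubled survival probability

H = -log(S_ORIG)         # cumulative hazard at t = 4000: H = ln 4

# Scaling the failure rate by c scales H, so S_c(4000) = S_ORIG ** c.
c = log(S_TARGET) / log(S_ORIG)
print(round(c, 6), round(S_ORIG ** c, 6))  # 0.5 0.5
```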
Problem 13
Let X ∼ Poisson(θ) be a random variable representing the number of defective widgets found
during routine inspection at a factory per hour
...
Solution
...
E[X(X − 3)] = Σ_{k=0}^{∞} k(k − 3) e^{−θ} θ^k /k!
Split the expression into two terms, using k(k − 3) = k(k − 1) − 2k:
Σ_{k=0}^{∞} k(k − 1) e^{−θ} θ^k /k! − 2 Σ_{k=0}^{∞} k e^{−θ} θ^k /k!
Shift indices: k ↦ k + 2 and k ↦ k + 1, respectively
...
Σ_{k=0}^{∞} k e^{−θ} θ^k /k! = θ and the first sum gives θ², so E[X(X − 3)] = θ² − 2θ
...
(b) Use the result from part (a) and the identity X 2 = X(X − 3) + 3X to derive E[X 2 ], and
hence find Var(X)
...
Substitute into the identity:
E[X²] = E[X(X − 3)] + 3E[X] = (θ² − 2θ) + 3θ = θ² + θ
...
Since Var(X) = E[X²] − (E[X])² = (θ² + θ) − θ², the variance of a Poisson-distributed variable is Var(X) = θ
...
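The factorial-moment identities can be verified numerically for any rate; a sketch with an arbitrary θ = 2.5 (the cutoff at k = 80 makes the truncated tail negligible):

```python
from math import exp, factorial

THETA = 2.5  # arbitrary positive rate chosen for the check

def pois(k: int) -> float:
    """Poisson(THETA) PMF."""
    return exp(-THETA) * THETA ** k / factorial(k)

ks = range(80)
m1 = sum(k * pois(k) for k in ks)                  # E[X]          = theta
m2 = sum(k * k * pois(k) for k in ks)              # E[X^2]        = theta^2 + theta
shifted = sum(k * (k - 3) * pois(k) for k in ks)   # E[X(X - 3)]   = theta^2 - 2*theta
print(round(m1, 6), round(m2, 6), round(shifted, 6), round(m2 - m1 ** 2, 6))
# 2.5 8.75 1.25 2.5
```

The last printed value is the variance, equal to θ as derived above.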
Show that the new conditional distribution depends on the second moment E[X 2 ], and
compute the expression for the probability that an individual (Mr
...
E[X 3 ]
(a) Conditional distribution after two positive observations
Let pk = P (X = k) and mr = E[X^r ] = Σ_{k≥1} k^r pk
...
P (X = k | 2 positives) = P (2 positives | X = k) pk / Σj P (2 positives | X = j) pj = k² pk /m2
Thus, the new size-biased PMF is
p̂k = k² pk /m2 ,  for k = 1, 2,
...
(b) Probability that Mr. Y is guilty
Suppose guilt attaches to one of the genotype-positive individuals, and Mr
...
Given X = k, Mr
...
Averaging:
P (guilty) = Σ_k (1/k) · (k³ pk /m3 ) = (1/m3 ) Σ_k k² pk = m2 /m3
...
Observing two positives favours large k, but larger k implies smaller
individual guilt probabilities; the third moment normalizes this tradeoff
...
With the same logic, a unique culprit is chosen uniformly among the k j ordered (j +1)-tuples
(the j positives plus Mr
...
Pj (guilty) = mj /mj+1 = E[X^j ]/E[X^{j+1} ]
Thus, each additional observed carrier strengthens the size bias and lowers the guilt probability according to a ratio of successive moments
...
Define the empirical
distribution pk = mk /m, and let µ = Σ_{j=1}^{5} j pj denote the average number of children per
household
...
(a) Show that the probability that the randomly encountered child lives in a k-child household is:
P (household size = k | random child) = k pk /µ
A size-k household contributes k children, and the total number of children is proportional to Σj j pj = µ. Hence, the conditional probability is:
P (size = k | random child) = k pk /µ
(b) Interpretation via biased sampling and weighted-urn analogy
Each household can be viewed as an urn containing k indistinguishable balls (children)
...
Urns with more balls are proportionally more likely to supply the sampled individual, so the
sampling scheme biases toward larger k by a factor of k
...
This mirrors a weighted-urn model, where compartments are drawn according to weights—here,
the household sizes
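The weighted-urn picture translates directly into a simulation. The household counts below are hypothetical (not from the problem); picking a child uniformly at random recovers k pk /µ:

```python
import random
from collections import Counter

random.seed(7)

# Hypothetical neighborhood: m_k households with k children, k = 1..5.
households = {1: 40, 2: 30, 3: 15, 4: 10, 5: 5}
m = sum(households.values())                      # 100 households
p = {k: n / m for k, n in households.items()}     # empirical p_k
mu = sum(k * pk for k, pk in p.items())           # mean children per household

# One list entry per child, labeled with its household's size (balls in urns).
children = [k for k, n in households.items() for _ in range(k * n)]
counts = Counter(random.choice(children) for _ in range(100_000))

for k in sorted(p):
    print(k, round(k * p[k] / mu, 3), round(counts[k] / 100_000, 3))
```

The empirical frequencies match the theoretical k pk /µ column: each urn is sampled in proportion to how many balls it holds.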