Math 461 Fall 2024
Renming Song
University of Illinois Urbana-Champaign
September 13, 2024
Outline
1 General Info
2 3.4 Independent Events
3 4.1 Random Variables
HW2 is due today at the end of class. Please submit your HW2 as
ONE pdf file via the HW2 folder on the course Canvas page. Make
sure that the quality of your pdf file is good enough.
I will post the solutions to HW2 on my homepage later this afternoon.
Example 9 (The problem of points)
Independent trials, each resulting in a success with probability p and a
failure with probability 1 − p, are performed. Find the probability that
n (not necessarily consecutive) successes occur before m (not
necessarily consecutive) failures.
Let E be the event that n (not necessarily consecutive) successes
occur before m (not necessarily consecutive) failures. Then E is
equal to the event that there are at least n successes in the first
n + m − 1 trials, since in n + m − 1 trials either at least n successes
or at least m failures must occur, but not both. So the answer is
$$\sum_{k=n}^{n+m-1} \binom{n+m-1}{k} p^k (1-p)^{n+m-1-k}.$$
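This is not from the slides, but a small Python sketch (with illustrative parameter choices n, m, p) may help: it evaluates the sum above and compares it with a direct Monte Carlo simulation of the trials.

import random
from math import comb

def exact_prob(n, m, p):
    # Sum over k = n, ..., n + m - 1 of C(n+m-1, k) p^k (1-p)^(n+m-1-k)
    return sum(comb(n + m - 1, k) * p**k * (1 - p)**(n + m - 1 - k)
               for k in range(n, n + m))

def simulated_prob(n, m, p, num_runs=100_000):
    # Estimate P(n successes occur before m failures) by simulation
    wins = 0
    for _ in range(num_runs):
        successes = failures = 0
        while successes < n and failures < m:
            if random.random() < p:
                successes += 1
            else:
                failures += 1
        wins += successes == n
    return wins / num_runs

print(exact_prob(3, 4, 0.5))      # 0.65625
print(simulated_prob(3, 4, 0.5))  # should be close to 0.65625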
Conditioning is a useful technique for finding probabilities. Let’s illustrate
this technique with two examples. The first is a homework problem
from Chap. 2.
Example 10
Independent trials, consisting of rolling a pair of fair dice, are
performed. Find the probability that an outcome of 5 appears before
an outcome of 7, where the outcome is the sum of the two dice.
Let E be the event that an outcome of 5 appears before an outcome
of 7, let F be the event that the first trial results in an outcome of 5, G
be the event that the first trial results in an outcome of 7, and H be
the event that the first trial results in neither an outcome of 5 nor an
outcome of 7.
$$
\begin{aligned}
P(E) &= P(F \cap E) + P(G \cap E) + P(H \cap E) \\
&= P(F)P(E \mid F) + P(G)P(E \mid G) + P(H)P(E \mid H) \\
&= \frac{4}{36}\cdot 1 + \frac{6}{36}\cdot 0 + \frac{26}{36}\,P(E).
\end{aligned}
$$
Thus
$$\frac{10}{36}\,P(E) = \frac{4}{36},$$
and so $P(E) = \frac{2}{5}$.
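As a quick check (not part of the slides), the following Python sketch estimates the same probability by simulating successive rolls of a pair of fair dice until a sum of 5 or 7 appears.

import random

def five_before_seven(num_runs=200_000):
    count = 0
    for _ in range(num_runs):
        while True:
            total = random.randint(1, 6) + random.randint(1, 6)
            if total == 5:
                count += 1
                break
            if total == 7:
                break
    return count / num_runs

print(five_before_seven())  # should be close to 2/5 = 0.4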
Example 11 (Gambler’s ruin)
Two gamblers, A and B, bet on the outcomes of successive coin flips.
On each flip, if the coin comes up Heads, A gets $1 from B;
otherwise, B gets $1 from A. They continue to do so until one of them
is out of money. If the successive flips are independent and each flip
is Heads with probability p, what is the probability that A ends up with
all the money if A starts with $i and B with $(N-i)?
Let E be the event that A ends up with all the money. Let P_i be the
probability of E when A starts with $i and B with $(N − i). Then P_0 = 0
and P_N = 1. Let H be the event that the first flip results in Heads.
Then for i = 1, 2, . . . , N − 1,
given H, A has $(i + 1) and B has $(N − i − 1), so P(E|H) = P_{i+1}; similarly, P(E|H^c) = P_{i−1}. Hence
$$P_i = P(H)P(E \mid H) + P(H^c)P(E \mid H^c) = pP_{i+1} + qP_{i-1},$$
where q = 1 − p.
Since p + q = 1, this can be rewritten as
$$(p + q)P_i = pP_{i+1} + qP_{i-1}, \qquad i = 1, \dots, N-1,$$
which is the same as
$$P_{i+1} - P_i = \frac{q}{p}\,(P_i - P_{i-1}), \qquad i = 1, \dots, N-1.$$
Writing these equations out, we get
$$
\begin{aligned}
P_2 - P_1 &= \frac{q}{p}(P_1 - P_0) = \frac{q}{p}\,P_1 \\
P_3 - P_2 &= \frac{q}{p}(P_2 - P_1) = \left(\frac{q}{p}\right)^2 P_1 \\
&\;\vdots \\
P_i - P_{i-1} &= \frac{q}{p}(P_{i-1} - P_{i-2}) = \left(\frac{q}{p}\right)^{i-1} P_1 \\
&\;\vdots \\
P_N - P_{N-1} &= \frac{q}{p}(P_{N-1} - P_{N-2}) = \left(\frac{q}{p}\right)^{N-1} P_1.
\end{aligned}
$$
Adding up the first i − 1 equations, we get
$$P_i - P_1 = P_1\left(\frac{q}{p} + \cdots + \left(\frac{q}{p}\right)^{i-1}\right).$$
Thus
$$P_i = \left(1 + \frac{q}{p} + \cdots + \left(\frac{q}{p}\right)^{i-1}\right)P_1
= \begin{cases} \dfrac{1-(q/p)^i}{1-q/p}\,P_1, & \text{if } p \neq q, \\[6pt] i\,P_1, & \text{if } p = q. \end{cases}$$
Using the fact that P_N = 1, we get
$$P_1 = \begin{cases} \dfrac{1-q/p}{1-(q/p)^N}, & \text{if } p \neq q, \\[6pt] \dfrac{1}{N}, & \text{if } p = q. \end{cases}$$
Thus,
$$P_i = \begin{cases} \dfrac{1-(q/p)^i}{1-(q/p)^N}, & \text{if } p \neq q, \\[6pt] \dfrac{i}{N}, & \text{if } p = q. \end{cases}$$
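The formula can be checked numerically; the following Python sketch (not from the slides, with illustrative choices of i, N, and p) compares it against a direct simulation of the game.

import random

def ruin_formula(i, N, p):
    # P_i = (1 - (q/p)^i) / (1 - (q/p)^N) if p != q, and i/N if p = q
    q = 1 - p
    if p == q:
        return i / N
    r = q / p
    return (1 - r**i) / (1 - r**N)

def ruin_simulation(i, N, p, num_runs=100_000):
    # Estimate P(A ends up with all the money), starting with fortune i
    wins = 0
    for _ in range(num_runs):
        fortune = i
        while 0 < fortune < N:
            fortune += 1 if random.random() < p else -1
        wins += fortune == N
    return wins / num_runs

print(ruin_formula(3, 10, 0.6))     # about 0.716
print(ruin_simulation(3, 10, 0.6))  # should be close to the formula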
4.1 Random Variables
It is often the case that when a random experiment is performed, we
are mainly interested in some function of the outcome, as opposed to
the actual outcome itself. In general, “any” real-valued function on the
sample space is called a random variable.
Example 1
Independent trials, each resulting in a success with probability p and a
failure with probability 1 − p, are performed 5 times. For each
success, you win $1, and for each failure, you lose $1. Obviously, you
are interested in your net winnings.
Let X be your net winnings; then X is a function on the sample space
and thus it is a random variable.
The possible values of X are ±1, ±3, ±5. The probabilities that it
takes each of these values are
$$P(X = 5) = \binom{5}{5}p^5, \qquad P(X = 3) = \binom{5}{4}p^4(1-p),$$
$$P(X = 1) = \binom{5}{3}p^3(1-p)^2, \qquad P(X = -1) = \binom{5}{2}p^2(1-p)^3,$$
$$P(X = -3) = \binom{5}{1}p(1-p)^4, \qquad P(X = -5) = \binom{5}{0}(1-p)^5.$$
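These probabilities come from counting successes: with k successes in 5 trials, X = k − (5 − k) = 2k − 5. A short Python sketch (not from the slides; the choice p = 0.5 is illustrative) computes the whole distribution this way.

from math import comb

def net_winnings_pmf(p, n=5):
    # With k successes in n trials, the net winnings are 2k - n
    return {2 * k - n: comb(n, k) * p**k * (1 - p)**(n - k)
            for k in range(n + 1)}

pmf = net_winnings_pmf(0.5)
print(pmf)                  # keys are -5, -3, -1, 1, 3, 5
print(sum(pmf.values()))    # should be 1.0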
Example 2
3 balls are randomly selected, without replacement, from a box
containing 20 balls labeled 1, . . . , 20. Let X be the smallest number
selected.
X is a random variable. The possible values of X are 1, . . . , 18 and
$$P(X = i) = \frac{\binom{20-i}{2}}{\binom{20}{3}}, \qquad i = 1, \dots, 18.$$
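As a sanity check (not from the slides), the following Python sketch computes this distribution and verifies that the probabilities sum to 1.

from math import comb

pmf = {i: comb(20 - i, 2) / comb(20, 3) for i in range(1, 19)}
print(pmf[1])               # C(19,2)/C(20,3) = 0.15, the largest probability
print(sum(pmf.values()))    # should be 1.0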
Example 3
Independent trials, each resulting in a success with probability p and a
failure with probability 1 − p, are performed. Let X be the number of
trials needed in order to get a success.
X is a random variable. Its possible values are 1, 2, . . . and
$$P(X = i) = (1-p)^{i-1}p, \qquad i = 1, 2, \dots.$$
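A simulation can again confirm this (not part of the slides; the value p = 0.3 is an illustrative choice):

import random

def trials_until_success(p):
    # Count the trials up to and including the first success
    i = 1
    while random.random() >= p:
        i += 1
    return i

p = 0.3
samples = [trials_until_success(p) for _ in range(100_000)]
print(samples.count(3) / len(samples))   # should be close to (1-p)**2 * p = 0.147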
Example 4
Independent trials, each resulting in a success with probability p and a
failure with probability 1 − p, are performed until a success occurs or
a total of n trials are performed. Let X be the number of trials needed.
X is a random variable. Its possible values are 1, 2, . . . , n and
$$P(X = i) = (1-p)^{i-1}p, \quad i = 1, 2, \dots, n-1, \qquad P(X = n) = (1-p)^{n-1}.$$
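Here is a minimal check (not from the slides; p and n are illustrative) that this is a genuine probability mass function, i.e., that the probabilities sum to 1.

p, n = 0.3, 6
pmf = {i: (1 - p)**(i - 1) * p for i in range(1, n)}
pmf[n] = (1 - p)**(n - 1)   # no success in the first n - 1 trials
print(pmf)
print(sum(pmf.values()))    # should be 1.0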
For all the examples above, we describe the random variables by
listing all their possible values and the probabilities that they take
these values. This does not always work.
Example 5
A number is chosen randomly from (0, 1). Let X be the value of the
number.
X is a random variable. Its possible values are in (0, 1). The
probability that it takes any particular value in (0, 1) is 0. For any
sub-interval A of (0, 1),
$$P(X \in A) = |A|,$$
where |A| denotes the length of the interval A.
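A simulation illustrates this (not from the slides; the interval (0.2, 0.7) is an arbitrary choice): the fraction of samples landing in a sub-interval should be close to its length.

import random

a, b = 0.2, 0.7
samples = [random.random() for _ in range(200_000)]
fraction = sum(a < x < b for x in samples) / len(samples)
print(fraction)   # should be close to b - a = 0.5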
For a random variable X, the function
$$F(x) = P(X \le x), \qquad x \in \mathbb{R},$$
is called the (cumulative) distribution function of X.
It is a non-decreasing, right-continuous function with
$$\lim_{x\to\infty} F(x) = 1, \qquad \lim_{x\to-\infty} F(x) = 0.$$
If we know the distribution function F of a random variable X, then we
can find the probability of any event defined in terms of X. For
instance, for any a < b,
$$P(X \in (a, b]) = F(b) - F(a).$$
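For instance, using the geometric random variable of Example 3, the following Python sketch (not from the slides; p, a, b are illustrative) checks that F(b) − F(a) agrees with summing the individual probabilities.

import math

p = 0.3

def F(x):
    # CDF of Example 3: F(x) = 1 - (1-p)^floor(x) for x >= 1, and 0 below 1
    if x < 1:
        return 0.0
    return 1 - (1 - p)**math.floor(x)

a, b = 2, 5
direct = sum((1 - p)**(i - 1) * p for i in range(a + 1, b + 1))
print(F(b) - F(a), direct)   # both should be about 0.32193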