STAT 410
Fall 2016
Fact:

Let X and Y be continuous random variables with joint p.d.f. $f(x, y)$. Then

$$f_{X+Y}(w) \;=\; \int_{-\infty}^{\infty} f(x,\, w - x)\, dx \;=\; \int_{-\infty}^{\infty} f(w - y,\, y)\, dy \qquad \text{(convolution)}$$
Proof:

$$F_{X+Y}(w) \;=\; \int_{-\infty}^{\infty} \int_{-\infty}^{\,w - x} f(x, y)\, dy\, dx.$$

Let $u = y + x$; then $du = dy$, $y = u - x$, and

$$F_{X+Y}(w) \;=\; \int_{-\infty}^{\infty} \int_{-\infty}^{\,w} f(x,\, u - x)\, du\, dx \;=\; \int_{-\infty}^{\,w} \left( \int_{-\infty}^{\infty} f(x,\, u - x)\, dx \right) du.$$

Differentiating with respect to $w$,

$$f_{X+Y}(w) \;=\; F_{X+Y}'(w) \;=\; \int_{-\infty}^{\infty} f(x,\, w - x)\, dx.$$
Fact:

Let X and Y be independent continuous random variables. Then

$$f_{X+Y}(w) \;=\; \int_{-\infty}^{\infty} f_X(x)\, f_Y(w - x)\, dx \;=\; \int_{-\infty}^{\infty} f_X(w - y)\, f_Y(y)\, dy.$$
0.
a)
Let X and Y be two independent Exponential random variables with mean 1.
Find the probability distribution of Z = X + Y. That is, find f Z ( z ) = f X + Y ( z ) .
$$f_X(x) = \begin{cases} e^{-x}, & x > 0 \\ 0, & \text{otherwise} \end{cases}
\qquad
f_Y(w - x) = \begin{cases} e^{-(w - x)} = e^{-w + x}, & w - x > 0 \ \text{(i.e., } x < w\text{)} \\ 0, & \text{otherwise} \end{cases}$$

Case 1:  $w > 0$.

$$f_{X+Y}(w) \;=\; \int_{-\infty}^{\infty} f_X(x)\, f_Y(w - x)\, dx \;=\; \int_0^{\,w} e^{-x}\, e^{-w + x}\, dx \;=\; \int_0^{\,w} e^{-w}\, dx \;=\; w\, e^{-w}, \qquad w > 0.$$

Case 2:  $w < 0$.

$$f_{X+Y}(w) \;=\; \int_{-\infty}^{\infty} f_X(x)\, f_Y(w - x)\, dx \;=\; \int_{-\infty}^{\infty} 0\, dx \;=\; 0, \qquad w < 0.$$
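Not in the original handout: a quick Monte Carlo sanity check of this result, a minimal sketch assuming NumPy is available; the seed, sample size, and binning are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(410)
n = 100_000

# Simulate W = X + Y for independent Exponential(mean 1) X and Y.
w = rng.exponential(scale=1.0, size=n) + rng.exponential(scale=1.0, size=n)

# Compare the empirical density with the derived density f(w) = w * exp(-w).
counts, edges = np.histogram(w, bins=50, range=(0.0, 10.0))
mids = 0.5 * (edges[:-1] + edges[1:])
emp_density = counts / (n * np.diff(edges))
print(np.max(np.abs(emp_density - mids * np.exp(-mids))))   # should be small (~0.01)
```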
b)

Let X be an Exponential random variable with mean 1. Suppose the p.d.f. of Y is
f Y ( y ) = 2 y, 0 < y < 1, zero elsewhere. Assume that X and Y are independent.
Find the p.d.f. of W = X + Y, f W ( w ) = f X + Y ( w ).
$$f_X(x) = \begin{cases} e^{-x}, & x > 0 \\ 0, & \text{otherwise} \end{cases}
\qquad
f_Y(y) = \begin{cases} 2y, & 0 < y < 1 \\ 0, & \text{otherwise} \end{cases}$$

$$f_{X+Y}(w) \;=\; \int_{-\infty}^{\infty} f_X(w - y)\, f_Y(y)\, dy,
\qquad
f_X(w - y) = \begin{cases} e^{-(w - y)} = e^{\,y - w}, & w - y > 0 \ \text{(i.e., } y < w\text{)} \\ 0, & \text{otherwise.} \end{cases}$$

Case 1:  $0 < w < 1$.

$$f_W(w) \;=\; \int_0^{\,w} e^{\,y - w} \cdot 2y\, dy \;=\; 2\,\bigl(e^{-w} - 1 + w\bigr).$$

Case 2:  $w > 1$.

$$f_W(w) \;=\; \int_0^{\,1} e^{\,y - w} \cdot 2y\, dy \;=\; 2\,e^{-w}.$$

OR

$$f_{X+Y}(w) \;=\; \int_{-\infty}^{\infty} f_X(x)\, f_Y(w - x)\, dx,
\qquad
f_Y(w - x) = \begin{cases} 2\,(w - x), & 0 < w - x < 1 \ \text{(i.e., } w - 1 < x < w\text{)} \\ 0, & \text{otherwise.} \end{cases}$$

Case 1:  $0 < w < 1$.

$$f_W(w) \;=\; \int_0^{\,w} e^{-x} \cdot 2\,(w - x)\, dx \;=\; 2\,\bigl(e^{-w} - 1 + w\bigr).$$

Case 2:  $w > 1$.

$$f_W(w) \;=\; \int_{w - 1}^{\,w} e^{-x} \cdot 2\,(w - x)\, dx \;=\; 2\,e^{-w} \qquad \text{(since } w - 1 > 0\text{)}.$$
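Again not part of the handout: a numerical check of both cases, assuming SciPy is available. The helper names (f_X, f_Y, and so on) are ad hoc.

```python
import numpy as np
from scipy.integrate import quad

def f_X(x):                 # Exponential(mean 1) density
    return np.exp(-x) if x > 0 else 0.0

def f_Y(y):                 # f_Y(y) = 2y on (0, 1)
    return 2.0 * y if 0 < y < 1 else 0.0

def f_W_numeric(w):         # convolution integral over y; integrand vanishes for y > min(1, w)
    return quad(lambda y: f_X(w - y) * f_Y(y), 0.0, min(1.0, w))[0]

def f_W_closed(w):          # piecewise answer from the two cases above
    return 2 * (np.exp(-w) - 1 + w) if 0 < w < 1 else 2 * np.exp(-w)

for w in (0.3, 0.8, 1.7, 4.0):
    print(w, f_W_numeric(w), f_W_closed(w))   # the two values agree at each w
```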
1.
Consider two continuous random variables X and Y with joint p.d.f.
$$f_{X,Y}(x, y) = \begin{cases} 60\, x^2 y, & x > 0,\ y > 0,\ x + y < 1 \\ 0, & \text{otherwise} \end{cases}$$
Consider W = X + Y. Find the p.d.f. of W, f W ( w ).
$$f_W(w) \;=\; \int_{-\infty}^{\infty} f(x,\, w - x)\, dx.$$

The integrand is nonzero when $x > 0$, $\,y = w - x > 0$ (i.e., $x < w$), and $x + y = x + (w - x) < 1$ (i.e., $w < 1$). Therefore, for $0 < w < 1$,

$$f_W(w) \;=\; \int_0^{\,w} 60\, x^2\, (w - x)\, dx \;=\; 20\, w^4 - 15\, w^4 \;=\; 5\, w^4, \qquad 0 < w < 1.$$
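A one-line numerical confirmation (not from the handout) that the integral above equals $5w^4$ on (0, 1); it assumes SciPy's quad routine.

```python
from scipy.integrate import quad

# f_W(w) = ∫_0^w 60 x^2 (w - x) dx should equal 5 w^4 for 0 < w < 1.
for w in (0.2, 0.5, 0.9):
    val = quad(lambda x: 60 * x**2 * (w - x), 0.0, w)[0]
    print(w, val, 5 * w**4)   # the last two columns match
```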
2.
a)
When a person applies for citizenship in Neverland, first he/she must wait X
years for an interview, and then Y more years for the oath ceremony. Thus the
total wait is W = X + Y years. Suppose that X and Y are independent, the
p.d.f. of X is
f X ( x ) = 2/x 3 ,
x > 1,
zero otherwise,
and Y has a Uniform distribution on interval ( 0, 1 ).
Find the p.d.f. of W, f W ( w ) = f X + Y ( w ).
Hint: Consider two cases: 1 < w < 2 and w > 2.
$$f_W(w) \;=\; \int_{-\infty}^{\infty} f_X(x)\, f_Y(w - x)\, dx.$$

$$f_X(x) = \frac{2}{x^3}, \quad x > 1, \quad \text{zero otherwise,}
\qquad
f_Y(w - x) = 1, \quad 0 < w - x < 1 \ \text{(i.e., } w - 1 < x < w\text{)}, \quad \text{zero otherwise.}$$

Case 1:  $1 < w < 2$.  Then $0 < w - 1 < 1$, and

$$f_W(w) \;=\; \int_1^{\,w} \frac{2}{x^3} \cdot 1\, dx \;=\; 1 - \frac{1}{w^2}.$$

Case 2:  $w > 2$.  Then $w - 1 > 1$, and

$$f_W(w) \;=\; \int_{w - 1}^{\,w} \frac{2}{x^3} \cdot 1\, dx \;=\; \frac{1}{(w - 1)^2} - \frac{1}{w^2}.$$

Case 3:  $w < 1$.

$$f_W(w) = 0.$$
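Not in the handout: a simulation sketch of this answer, assuming NumPy. X is drawn by inverse transform from its c.d.f. $F(x) = 1 - 1/x^2$, $x > 1$; the seed and sample size are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2016)
n = 200_000

x = 1.0 / np.sqrt(rng.random(n))   # inverse transform: F(x) = 1 - 1/x^2  =>  X = 1/sqrt(U)
y = rng.random(n)                  # Y ~ Uniform(0, 1)
w = x + y

def f_W(w):                        # piecewise density derived above
    if w < 1:
        return 0.0
    if w < 2:
        return 1.0 - 1.0 / w**2
    return 1.0 / (w - 1) ** 2 - 1.0 / w**2

counts, edges = np.histogram(w, bins=60, range=(1.0, 4.0))
mids = 0.5 * (edges[:-1] + edges[1:])
emp_density = counts / (n * np.diff(edges))
print(np.max(np.abs(emp_density - np.array([f_W(m) for m in mids]))))   # small
```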
3.
Let X and Y be two independent Poisson random variables with mean $\lambda_1$ and $\lambda_2$,
respectively. Let W = X + Y.
a)
What is the probability distribution of W?
$$P(W = n) \;=\; \sum_{k=0}^{n} P(X = k)\, P(Y = n - k)
\;=\; \sum_{k=0}^{n} \frac{\lambda_1^{\,k}\, e^{-\lambda_1}}{k!} \cdot \frac{\lambda_2^{\,n - k}\, e^{-\lambda_2}}{(n - k)!}$$

$$=\; \frac{(\lambda_1 + \lambda_2)^n\, e^{-(\lambda_1 + \lambda_2)}}{n!} \sum_{k=0}^{n} \frac{n!}{k!\,(n - k)!} \left( \frac{\lambda_1}{\lambda_1 + \lambda_2} \right)^{k} \left( \frac{\lambda_2}{\lambda_1 + \lambda_2} \right)^{n - k}
\;=\; \frac{(\lambda_1 + \lambda_2)^n\, e^{-(\lambda_1 + \lambda_2)}}{n!}.$$

Therefore, W is a Poisson random variable with mean $\lambda_1 + \lambda_2$.

OR

$$M_W(t) \;=\; M_X(t)\, M_Y(t) \;=\; e^{\lambda_1 (e^t - 1)}\, e^{\lambda_2 (e^t - 1)} \;=\; e^{(\lambda_1 + \lambda_2)(e^t - 1)}.$$

Therefore, W is a Poisson random variable with mean $\lambda_1 + \lambda_2$.
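Not part of the original solution: a numerical check using a discrete convolution, assuming SciPy. The values of $\lambda_1$, $\lambda_2$ and the truncation point are arbitrary.

```python
import numpy as np
from scipy.stats import poisson

lam1, lam2 = 1.3, 2.7        # illustrative values for λ1 and λ2
n_max = 40                   # truncation point; the tail mass beyond it is negligible here
k = np.arange(n_max + 1)

# Convolve the two Poisson p.m.f.s and compare with the Poisson(λ1 + λ2) p.m.f.
pmf_w = np.convolve(poisson.pmf(k, lam1), poisson.pmf(k, lam2))[: n_max + 1]
print(np.max(np.abs(pmf_w - poisson.pmf(k, lam1 + lam2))))   # ~ 1e-16
```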
b)
What is the conditional distribution of X given W = n?
$$P(X = k \mid W = n) \;=\; \frac{P(X = k,\ W = n)}{P(W = n)} \;=\; \frac{P(X = k,\ Y = n - k)}{P(W = n)}$$

$$=\; \frac{\dfrac{\lambda_1^{\,k}\, e^{-\lambda_1}}{k!} \cdot \dfrac{\lambda_2^{\,n - k}\, e^{-\lambda_2}}{(n - k)!}}{\dfrac{(\lambda_1 + \lambda_2)^n\, e^{-(\lambda_1 + \lambda_2)}}{n!}}
\;=\; \frac{n!}{k!\,(n - k)!} \left( \frac{\lambda_1}{\lambda_1 + \lambda_2} \right)^{k} \left( \frac{\lambda_2}{\lambda_1 + \lambda_2} \right)^{n - k}, \qquad k = 0, 1, \ldots, n.$$

X | W = n has a Binomial distribution with $p = \dfrac{\lambda_1}{\lambda_1 + \lambda_2}$.
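Also not in the handout: a direct check (with SciPy, illustrative parameter values) that the conditional probabilities coincide with the Binomial( n, $\lambda_1/(\lambda_1 + \lambda_2)$ ) p.m.f.

```python
from scipy.stats import binom, poisson

lam1, lam2, n = 1.3, 2.7, 6
p = lam1 / (lam1 + lam2)

for k in range(n + 1):
    # P(X = k, Y = n - k) / P(W = n) versus the Binomial(n, p) p.m.f.
    cond = poisson.pmf(k, lam1) * poisson.pmf(n - k, lam2) / poisson.pmf(n, lam1 + lam2)
    print(k, round(cond, 6), round(binom.pmf(k, n, p), 6))   # the two columns agree
```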
4.
Let $X_1$ and $X_2$ be two independent $\chi^2$ random variables with m and n
degrees of freedom, respectively. Find the probability distribution of W = $X_1 + X_2$.
$$f_1(x_1) \;=\; \frac{1}{\Gamma(m/2)\, 2^{m/2}}\; x_1^{\,m/2 - 1}\, e^{-x_1/2}, \qquad x_1 > 0,$$

$$f_2(x_2) \;=\; \frac{1}{\Gamma(n/2)\, 2^{n/2}}\; x_2^{\,n/2 - 1}\, e^{-x_2/2}, \qquad x_2 > 0.$$

$$f_W(w) \;=\; \int_{-\infty}^{\infty} f_1(x)\, f_2(w - x)\, dx
\;=\; \int_0^{\,w} \frac{1}{\Gamma(m/2)\, 2^{m/2}}\; x^{\,m/2 - 1}\, e^{-x/2} \cdot \frac{1}{\Gamma(n/2)\, 2^{n/2}}\; (w - x)^{\,n/2 - 1}\, e^{-(w - x)/2}\, dx$$

$$=\; \frac{e^{-w/2}}{\Gamma(m/2)\, \Gamma(n/2)\, 2^{(m+n)/2}} \int_0^{\,w} x^{\,m/2 - 1}\, (w - x)^{\,n/2 - 1}\, dx.$$

Let $x = w\, y$; then $dx = w\, dy$, and

$$f_W(w) \;=\; \frac{w^{(m+n)/2 - 1}\, e^{-w/2}}{\Gamma\!\left((m+n)/2\right)\, 2^{(m+n)/2}} \int_0^{\,1} \frac{\Gamma\!\left((m+n)/2\right)}{\Gamma(m/2)\, \Gamma(n/2)}\; y^{\,m/2 - 1}\, (1 - y)^{\,n/2 - 1}\, dy
\;=\; \frac{1}{\Gamma\!\left((m+n)/2\right)\, 2^{(m+n)/2}}\; w^{(m+n)/2 - 1}\, e^{-w/2}, \qquad w > 0,$$

since $\dfrac{\Gamma\!\left((m+n)/2\right)}{\Gamma(m/2)\, \Gamma(n/2)}\; y^{\,m/2 - 1}\, (1 - y)^{\,n/2 - 1}$,  $0 < y < 1$,  is the p.d.f. of a Beta
distribution with $\alpha = m/2$, $\beta = n/2$ (so the integral equals 1).

W has a $\chi^2(m + n)$ distribution.
OR

If random variables X and Y are independent, then $M_{X+Y}(t) = M_X(t)\, M_Y(t)$.
$$M_1(t) \;=\; \frac{1}{(1 - 2t)^{m/2}}, \quad t < 1/2,
\qquad
M_2(t) \;=\; \frac{1}{(1 - 2t)^{n/2}}, \quad t < 1/2.$$

$$M_W(t) \;=\; M_1(t)\, M_2(t) \;=\; \frac{1}{(1 - 2t)^{(m+n)/2}}, \qquad t < 1/2.$$

W has a $\chi^2(m + n)$ distribution.
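A simulation sketch (not from the handout, SciPy assumed, arbitrary degrees of freedom and seed) comparing the empirical c.d.f. of W with the $\chi^2(m + n)$ c.d.f.:

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(7)
m, n_df, size = 3, 5, 100_000        # illustrative degrees of freedom and sample size

w = chi2.rvs(m, size=size, random_state=rng) + chi2.rvs(n_df, size=size, random_state=rng)

for q in (2.0, 6.0, 12.0):
    print(q, np.mean(w <= q), chi2.cdf(q, m + n_df))   # empirical vs. exact c.d.f., close
```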
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

If X and Y are independent,

X is Bernoulli( p ), Y is Bernoulli( p )  ⇒  X + Y is Binomial( n = 2, p );

X is Binomial( $n_1$, p ), Y is Binomial( $n_2$, p )  ⇒  X + Y is Binomial( $n_1 + n_2$, p );

X is Geometric( p ), Y is Geometric( p )  ⇒  X + Y is Neg. Binomial( r = 2, p );

X is Neg. Binomial( $r_1$, p ), Y is Neg. Binomial( $r_2$, p )  ⇒  X + Y is Neg. Binomial( $r_1 + r_2$, p );

X is Poisson( $\lambda_1$ ), Y is Poisson( $\lambda_2$ )  ⇒  X + Y is Poisson( $\lambda_1 + \lambda_2$ );

X is Exponential( $\theta$ ), Y is Exponential( $\theta$ )  ⇒  X + Y is Gamma( $\alpha = 2$, $\theta$ );

X is $\chi^2(r_1)$, Y is $\chi^2(r_2)$  ⇒  X + Y is $\chi^2(r_1 + r_2)$;

X is Gamma( $\alpha_1$, $\theta$ ), Y is Gamma( $\alpha_2$, $\theta$ )  ⇒  X + Y is Gamma( $\alpha_1 + \alpha_2$, $\theta$ );

X is Normal( $\mu_1$, $\sigma_1^2$ ), Y is Normal( $\mu_2$, $\sigma_2^2$ )  ⇒  X + Y is Normal( $\mu_1 + \mu_2$, $\sigma_1^2 + \sigma_2^2$ ).
5.
Let X and Y be independent random variables, each geometrically distributed
with the probability of success p, 0 < p < 1. That is,

$$p_X(k) = p_Y(k) = p\, (1 - p)^{k - 1}, \qquad k = 1, 2, 3, \ldots$$

a)

Find P( X + Y = n ),  n = 2, 3, 4, ...
$$P(X + Y = n) \;=\; \sum_{k=1}^{n-1} P(X = k)\, P(Y = n - k)
\;=\; \sum_{k=1}^{n-1} p\, (1 - p)^{k - 1}\, p\, (1 - p)^{n - k - 1}$$

$$=\; p^2\, (1 - p)^{n - 2} \sum_{k=1}^{n-1} 1 \;=\; (n - 1)\, p^2\, (1 - p)^{n - 2}, \qquad n = 2, 3, 4, \ldots$$

If X and Y both have a Geometric( p ) distribution and are independent,
then X + Y has a Negative Binomial distribution with r = 2.
OR
$$M_{X+Y}(t) \;=\; M_X(t)\, M_Y(t) \;=\; \left( \frac{p\, e^t}{1 - (1 - p)\, e^t} \right)^{2}, \qquad t < -\ln(1 - p).$$

b)

Find P( X = k | X + Y = n ),  k = 1, 2, 3, ..., n − 1,  n = 2, 3, 4, ...

$$P(X = k \mid X + Y = n) \;=\; \frac{P(X = k,\ X + Y = n)}{P(X + Y = n)} \;=\; \frac{P(X = k,\ Y = n - k)}{P(X + Y = n)}$$

$$=\; \frac{p\, (1 - p)^{k - 1}\, p\, (1 - p)^{n - k - 1}}{(n - 1)\, p^2\, (1 - p)^{n - 2}} \;=\; \frac{1}{n - 1}, \qquad k = 1, 2, 3, \ldots, n - 1.$$

X | X + Y = n has a Uniform distribution on integers 1, 2, 3, ..., n − 1.
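Not part of the handout: a small enumeration check of parts a) and b) for one illustrative choice of p and n.

```python
p, n = 0.3, 7

def geom_pmf(k):                     # Geometric(p) p.m.f. on k = 1, 2, 3, ...
    return p * (1 - p) ** (k - 1)

# a) P(X + Y = n) by direct summation versus (n - 1) p^2 (1 - p)^(n - 2)
prob_n = sum(geom_pmf(k) * geom_pmf(n - k) for k in range(1, n))
print(prob_n, (n - 1) * p**2 * (1 - p) ** (n - 2))                  # equal

# b) P(X = k | X + Y = n) is constant in k, i.e. Uniform on 1, ..., n - 1
print([round(geom_pmf(k) * geom_pmf(n - k) / prob_n, 6) for k in range(1, n)])
```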
c)
Find P ( X > Y ). [ Hint: First, find P ( X = Y ). ]
$$P(X = Y) \;=\; \sum_{k=1}^{\infty} p_X(k)\, p_Y(k) \;=\; \sum_{k=1}^{\infty} p\, (1 - p)^{k - 1}\, p\, (1 - p)^{k - 1}
\;=\; p^2 \sum_{k=1}^{\infty} \left[ (1 - p)^2 \right]^{k - 1}$$

$$=\; p^2 \sum_{n=0}^{\infty} \left[ (1 - p)^2 \right]^{n} \;=\; \frac{p^2}{1 - (1 - p)^2} \;=\; \frac{p}{2 - p}.$$

P( X > Y ) + P( X = Y ) + P( X < Y ) = 1.  Since P( X > Y ) = P( X < Y ),

$$P(X > Y) \;=\; \frac{1}{2}\, \bigl( 1 - P(X = Y) \bigr) \;=\; \frac{1}{2} \left( 1 - \frac{p}{2 - p} \right) \;=\; \frac{1 - p}{2 - p}.$$

OR

$$P(X > Y) \;=\; \sum_{y=1}^{\infty} \sum_{x=y+1}^{\infty} p\, (1 - p)^{x - 1}\, p\, (1 - p)^{y - 1}
\;=\; \sum_{y=1}^{\infty} p\, (1 - p)^{y - 1}\, p\, \frac{(1 - p)^{y}}{1 - (1 - p)}$$

$$=\; \sum_{y=1}^{\infty} p\, (1 - p)^{2y - 1} \;=\; p\, (1 - p) \sum_{n=0}^{\infty} \left[ (1 - p)^2 \right]^{n}
\;=\; \frac{p\, (1 - p)}{1 - (1 - p)^2} \;=\; \frac{1 - p}{2 - p}.$$
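A simulation check of P( X > Y ) = (1 − p)/(2 − p), not from the handout; NumPy's geometric sampler already counts trials starting at 1, matching the p.m.f. used here. The value of p and the seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(410)
p, size = 0.3, 200_000

x = rng.geometric(p, size=size)      # values in {1, 2, 3, ...}
y = rng.geometric(p, size=size)
print(np.mean(x > y), (1 - p) / (2 - p))   # simulation ≈ closed form
```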
d)

Consider the discrete random variable Q = X / Y.  Find E( X ), E( 1/Y ), E( Q ).

[ Hint:  $\ln(1 - z) = -\displaystyle\sum_{k=1}^{\infty} \frac{z^k}{k}$  for $-1 < z < 1$. ]

E( X ) = 1/p,  since X has a Geometric( p ) distribution.

$$E\!\left( \frac{1}{Y} \right) \;=\; \sum_{k=1}^{\infty} \frac{1}{k}\; p\, (1 - p)^{k - 1}
\;=\; \frac{p}{1 - p} \sum_{k=1}^{\infty} \frac{(1 - p)^k}{k}
\;=\; -\frac{p}{1 - p}\, \ln\bigl( 1 - (1 - p) \bigr)
\;=\; -\frac{p\, \ln(p)}{1 - p}.$$

Since X and Y are independent,

$$E(Q) \;=\; E(X)\; E\!\left( \frac{1}{Y} \right) \;=\; -\frac{\ln(p)}{1 - p}.$$
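Not in the handout: a truncated-series check of E( 1/Y ) and E( Q ) for one illustrative p.

```python
import numpy as np

p = 0.3
k = np.arange(1, 5000)                        # truncate the series deep in the tail

e_inv_y = np.sum((1.0 / k) * p * (1 - p) ** (k - 1))
print(e_inv_y, -p * np.log(p) / (1 - p))      # E(1/Y): series ≈ closed form
print(e_inv_y / p, -np.log(p) / (1 - p))      # E(Q) = E(X) E(1/Y) with E(X) = 1/p
```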
e)

For any positive, irreducible fraction $\dfrac{a}{b}$, find P( Q = $\dfrac{a}{b}$ ).

$$P\!\left( Q = \frac{a}{b} \right) \;=\; \sum_{k=1}^{\infty} p_X(k a)\, p_Y(k b)
\;=\; \sum_{k=1}^{\infty} p\, (1 - p)^{k a - 1}\, p\, (1 - p)^{k b - 1}$$

$$=\; \frac{p^2}{(1 - p)^2} \sum_{k=1}^{\infty} \left[ (1 - p)^{a + b} \right]^{k}
\;=\; \frac{p^2}{(1 - p)^2} \cdot \frac{(1 - p)^{a + b}}{1 - (1 - p)^{a + b}}.$$
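And a matching numerical check of this formula (not from the handout; p, a, b illustrative).

```python
import numpy as np

p, a, b = 0.3, 2, 3                  # a/b is a positive irreducible fraction
k = np.arange(1, 2000)

lhs = np.sum(p * (1 - p) ** (k * a - 1) * p * (1 - p) ** (k * b - 1))
rhs = p**2 * (1 - p) ** (a + b) / ((1 - p) ** 2 * (1 - (1 - p) ** (a + b)))
print(lhs, rhs)                      # equal up to truncation error
```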
6.
Suppose we have two 4-sided dice. Suppose that for the first die ( X ),

p_X( 1 ) = 1/10,  p_X( 2 ) = 2/10,  p_X( 3 ) = 3/10,  p_X( 4 ) = 4/10.

Suppose also that for the second die ( Y ),

p_Y( 1 ) = 1/30,  p_Y( 2 ) = 4/30,  p_Y( 3 ) = 9/30,  p_Y( 4 ) = 16/30.

Find the probability distribution of U = X + Y.
Joint probabilities p_X( x ) p_Y( y ):

                    Y = 1      Y = 2      Y = 3      Y = 4
                    (1/30)     (4/30)     (9/30)     (16/30)
  X = 1  (1/10)     1/300      4/300      9/300      16/300
  X = 2  (2/10)     2/300      8/300     18/300      32/300
  X = 3  (3/10)     3/300     12/300     27/300      48/300
  X = 4  (4/10)     4/300     16/300     36/300      64/300

Adding along the anti-diagonals ( u = x + y ):

  u          2        3        4        5        6        7        8
  p(u)     1/300    6/300    20/300   50/300   75/300   84/300   64/300
OR

$$M_U(t) \;=\; M_X(t)\, M_Y(t)
\;=\; \left( \tfrac{1}{10}\, e^{t} + \tfrac{2}{10}\, e^{2t} + \tfrac{3}{10}\, e^{3t} + \tfrac{4}{10}\, e^{4t} \right)
\left( \tfrac{1}{30}\, e^{t} + \tfrac{4}{30}\, e^{2t} + \tfrac{9}{30}\, e^{3t} + \tfrac{16}{30}\, e^{4t} \right)$$

$$=\; \tfrac{1}{300}\, e^{2t} + \tfrac{6}{300}\, e^{3t} + \tfrac{20}{300}\, e^{4t} + \tfrac{50}{300}\, e^{5t} + \tfrac{75}{300}\, e^{6t} + \tfrac{84}{300}\, e^{7t} + \tfrac{64}{300}\, e^{8t},$$

which gives the same distribution of U as the table above.
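The whole table can also be reproduced with a discrete convolution; this short NumPy sketch is not part of the handout.

```python
import numpy as np

p_x = np.array([1, 2, 3, 4]) / 10        # p.m.f. of X on the faces 1, 2, 3, 4
p_y = np.array([1, 4, 9, 16]) / 30       # p.m.f. of Y on the faces 1, 2, 3, 4

p_u = np.convolve(p_x, p_y)              # support of U = X + Y is 2, 3, ..., 8
print(np.round(p_u * 300).astype(int))   # [ 1  6 20 50 75 84 64], i.e. out of 300
```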
7.
Suppose X and Y are two independent discrete random variables with the following
probability distributions:
p_X( 1 ) = 0.2,  p_X( 2 ) = 0.4,  p_X( 3 ) = 0.3,  p_X( 4 ) = 0.1;

p_Y( 1 ) = 0.3,  p_Y( 3 ) = 0.5,  p_Y( 5 ) = 0.2.
Find the probability distribution of W = X + Y.
$$M_W(t) \;=\; M_X(t)\, M_Y(t)
\;=\; \left( 0.2\, e^{t} + 0.4\, e^{2t} + 0.3\, e^{3t} + 0.1\, e^{4t} \right)
\left( 0.3\, e^{t} + 0.5\, e^{3t} + 0.2\, e^{5t} \right)$$

$$=\; 0.06\, e^{2t} + 0.12\, e^{3t} + 0.19\, e^{4t} + 0.23\, e^{5t} + 0.19\, e^{6t} + 0.13\, e^{7t} + 0.06\, e^{8t} + 0.02\, e^{9t}.$$
p W ( 2 ) = 0.06,
p W ( 3 ) = 0.12,
p W ( 4 ) = 0.19,
p W ( 5 ) = 0.23,
p W ( 6 ) = 0.19,
p W ( 7 ) = 0.13,
p W ( 8 ) = 0.06,
p W ( 9 ) = 0.02.
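The same coefficients come out of a discrete convolution; a NumPy sketch, not part of the handout (zeros fill the values where Y puts no mass).

```python
import numpy as np

p_x = np.array([0.2, 0.4, 0.3, 0.1])          # p.m.f. of X on 1, 2, 3, 4
p_y = np.array([0.3, 0.0, 0.5, 0.0, 0.2])     # p.m.f. of Y on 1, 2, 3, 4, 5

p_w = np.convolve(p_x, p_y)                   # support of W = X + Y is 2, 3, ..., 9
print(np.round(p_w, 2))                       # [0.06 0.12 0.19 0.23 0.19 0.13 0.06 0.02]
```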