Joint Probability Distribution Partial Lecture
The document discusses the concepts of discrete and continuous random variables, including joint, marginal, and conditional probability distributions. It provides definitions, examples, and solutions related to joint probability functions and distributions, as well as the calculation of probabilities in various scenarios. Additionally, it covers the relationship between random variables and how knowledge of one can affect the probabilities of the other.


ENGR. JOSELITO O. DAROY, MEP
Associate Professor IV
TOPICS:
 TWO DISCRETE/CONTINUOUS RANDOM VARIABLES
 MULTIPLE DISCRETE/CONTINUOUS RANDOM VARIABLES
 COVARIANCE AND CORRELATION
 BIVARIATE NORMAL DISTRIBUTION
 LINEAR COMBINATION OF RANDOM VARIABLES
TWO DISCRETE/CONTINUOUS RANDOM VARIABLES
 JOINT PROBABILITY DISTRIBUTIONS
 MARGINAL PROBABILITY DISTRIBUTIONS
 CONDITIONAL PROBABILITY DISTRIBUTIONS
 INDEPENDENCE
MULTIPLE DISCRETE/CONTINUOUS RANDOM VARIABLES
 JOINT PROBABILITY DISTRIBUTIONS
 MULTINOMIAL PROBABILITY DISTRIBUTION
TWO DISCRETE RANDOM VARIABLES
 JOINT PROBABILITY DISTRIBUTIONS
 MARGINAL PROBABILITY DISTRIBUTIONS
 CONDITIONAL PROBABILITY DISTRIBUTIONS
 INDEPENDENCE
INTRODUCTION
 Given two random variables defined on the same probability space, the joint probability distribution is the probability distribution on all possible pairs of their outputs.
 If X and Y are two discrete random variables, the probability distribution that defines their simultaneous behavior is called a joint probability distribution.
INTRODUCTION
 The joint probability distribution of two discrete random variables is also called a bivariate probability distribution.
 The concept generalizes to any number of random variables; the result is called a multivariate distribution.
INTRODUCTION
 The joint probability distribution of two discrete random variables is usually written as
P(X = x, Y = y)
Definition:
 The function f(x, y) is a joint probability distribution or probability mass function of the discrete random variables X and Y if
1. f(x, y) ≥ 0 for all (x, y),
2. Σ_x Σ_y f(x, y) = 1,
3. P(X = x, Y = y) = f(x, y).
For any region A in the xy-plane,
P[(X, Y) ∈ A] = Σ Σ_{(x, y) ∈ A} f(x, y).
Example 1:
 Two ballpoint pens are selected at random from a box that contains 3 blue pens, 2 red pens, and 3 green pens. If X is the number of blue pens selected and Y is the number of red pens selected, find
(a) the joint probability function f(x, y),
(b) P[(X, Y) ∈ A], where A is the region {(x, y) | x + y ≤ 1}.
Solution:
 The possible pairs of values (x, y) are (0, 0), (0, 1), (1, 0), (1, 1), (0, 2), and (2, 0).
(a) The total number of equally likely ways of selecting any two pens from the 8 is
C(8, 2) = 28.
Solution:
The number of ways of selecting 1 red pen from the 2 red pens and 1 green pen from the 3 green pens is
C(2, 1) C(3, 1) = 6.
Hence
f(0, 1) = 6/28 = 3/14.
Solution:
the joint probability distribution table (reconstructed from the values used later in the lecture):

 f(x, y)        x = 0   x = 1   x = 2   Row totals h(y)
 y = 0           3/28    9/28    3/28       15/28
 y = 1           3/14    3/14     0          3/7
 y = 2           1/28     0       0          1/28
 Col totals g(x) 5/14   15/28    3/28         1
Solution:
the joint probability distribution table can be represented by the formula
f(x, y) = C(3, x) C(2, y) C(3, 2 − x − y) / C(8, 2)
for x = 0, 1, 2; y = 0, 1, 2; and 0 ≤ x + y ≤ 2.
Solution:
(b) The probability that (X, Y) falls in the region A is
P[(X, Y) ∈ A] = P(X + Y ≤ 1)
= f(0, 0) + f(0, 1) + f(1, 0)
= 3/28 + 3/14 + 9/28
P[(X, Y) ∈ A] = 9/14.
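The counting argument in Example 1 can be sanity-checked by brute force. A minimal sketch (not part of the lecture): label the 8 pens, enumerate all C(8, 2) = 28 equally likely pairs, and tabulate (X, Y) for each pair.

```python
from itertools import combinations
from fractions import Fraction
from collections import Counter

# Enumerate all C(8, 2) = 28 equally likely pairs of pens and count
# (X, Y) = (number of blue pens, number of red pens) in each pair.
pens = ["blue"] * 3 + ["red"] * 2 + ["green"] * 3
counts = Counter(
    (pair.count("blue"), pair.count("red")) for pair in combinations(pens, 2)
)
total = sum(counts.values())  # 28
f = {xy: Fraction(n, total) for xy, n in counts.items()}

print(f[(0, 1)])                                         # 3/14
print(sum(f.values()))                                   # 1 (condition 2)
print(sum(p for (x, y), p in f.items() if x + y <= 1))   # 9/14
```

The exhaustive enumeration reproduces both f(0, 1) = 3/14 and P(X + Y ≤ 1) = 9/14 from the solution above.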
Definition:
 The function f(x, y) is a joint density function of the continuous random variables X and Y if
1. f(x, y) ≥ 0 for all (x, y),
2. ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1,
3. P[(X, Y) ∈ A] = ∫∫_A f(x, y) dx dy,
for any region A in the xy-plane.
Example 2
 A privately owned business operates both a drive-in facility and a walk-in facility. On a randomly selected day, let X and Y, respectively, be the proportions of the time that the drive-in and the walk-in facilities are in use, and suppose that the joint density function of these random variables is
f(x, y) = (2/5)(2x + 3y), 0 ≤ x ≤ 1, 0 ≤ y ≤ 1,
f(x, y) = 0, elsewhere.
Example 2
(a) Verify condition 2 of the definition of a joint density function.
(b) Find P[(X, Y) ∈ A], where
A = {(x, y) | 0 < x < 1/2, 1/4 < y < 1/2}.
Solution
 (a) The integration of f(x, y) over the whole region is
∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = ∫_0^1 ∫_0^1 (2/5)(2x + 3y) dx dy
= ∫_0^1 [2x²/5 + 6xy/5] (from x = 0 to x = 1) dy
= ∫_0^1 (2/5 + 6y/5) dy
= [2y/5 + 3y²/5] (from y = 0 to y = 1)
= 2/5 + 3/5
∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1.
Solution
 (b) To calculate the probability,
P[(X, Y) ∈ A] = P(0 < X < 1/2, 1/4 < Y < 1/2)
= ∫_{1/4}^{1/2} ∫_0^{1/2} (2/5)(2x + 3y) dx dy
= ∫_{1/4}^{1/2} [2x²/5 + 6xy/5] (from x = 0 to x = 1/2) dy
= ∫_{1/4}^{1/2} (1/10 + 3y/5) dy
= [y/10 + 3y²/10] (from y = 1/4 to y = 1/2)
= (1/10)[(1/2 + 3/4) − (1/4 + 3/16)]
P[(X, Y) ∈ A] = 13/160.
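The double integral in part (b) can be checked numerically. A minimal sketch (not part of the lecture): approximate the integral of f(x, y) over the rectangle A with a midpoint Riemann sum, which is exact here because the integrand is linear in each variable.

```python
# Midpoint Riemann sum for the double integral of Example 2(b):
# f(x, y) = (2/5)(2x + 3y) over 0 < x < 1/2, 1/4 < y < 1/2.
def f(x, y):
    return 2.0 / 5.0 * (2.0 * x + 3.0 * y) if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

n = 400
dx = 0.5 / n    # x runs over (0, 1/2)
dy = 0.25 / n   # y runs over (1/4, 1/2)
prob = sum(
    f((i + 0.5) * dx, 0.25 + (j + 0.5) * dy) * dx * dy
    for i in range(n)
    for j in range(n)
)
print(round(prob, 5))  # 0.08125 = 13/160
```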
 A marginal probability distribution is the individual probability distribution of a random variable.
 The marginal distributions of X alone and Y alone are:
 for the discrete case,
g(x) = Σ_y f(x, y) and h(y) = Σ_x f(x, y);
 for the continuous case,
g(x) = ∫_{−∞}^{∞} f(x, y) dy and h(y) = ∫_{−∞}^{∞} f(x, y) dx.
Example 3:
 Show that the column and row totals of the table in Example 1 give the marginal distributions of X alone and Y alone.
Solution
 For the random variable X,
g(0) = f(0, 0) + f(0, 1) + f(0, 2)
= 3/28 + 3/14 + 1/28
g(0) = 5/14.
Solution
 For the random variable X,
g(1) = f(1, 0) + f(1, 1) + f(1, 2)
= 9/28 + 3/14 + 0
g(1) = 15/28.
Solution
 For the random variable X,
g(2) = f(2, 0) + f(2, 1) + f(2, 2)
= 3/28 + 0 + 0
g(2) = 3/28.
Solution
 For the random variable X, the marginal distribution is g(0) = 5/14, g(1) = 15/28, g(2) = 3/28.
 For the random variable Y, summing across the rows in the same way gives h(0) = 15/28, h(1) = 3/7, h(2) = 1/28.
Example 4:
 Find g(x) and h(y) for the joint density function of Example 2.
Solution:
g(x) = ∫_{−∞}^{∞} f(x, y) dy
= ∫_0^1 (2/5)(2x + 3y) dy
= [4xy/5 + 3y²/5] (from y = 0 to y = 1)
g(x) = (4x + 3)/5, 0 ≤ x ≤ 1
g(x) = 0, elsewhere.
Solution:
h(y) = ∫_{−∞}^{∞} f(x, y) dx
= ∫_0^1 (2/5)(2x + 3y) dx
= [2x²/5 + 6xy/5] (from x = 0 to x = 1)
h(y) = 2(1 + 3y)/5, 0 ≤ y ≤ 1
h(y) = 0, elsewhere.
 When two random variables are defined in a random experiment, knowledge of one can change the probabilities of the other.
 A conditional probability distribution is the probability distribution of a random variable, calculated according to the rules of conditional probability after observing the realization of another random variable.
 A conditional probability distribution describes the probability that a randomly selected person from a sub-population has a given characteristic of interest.
 Let X and Y be two random variables, discrete or continuous.
 The conditional distribution of the random variable Y, given that X = x, is
f(y | x) = f(x, y)/g(x), provided g(x) > 0.
 Similarly, the conditional distribution of X, given that Y = y, is
f(x | y) = f(x, y)/h(y), provided h(y) > 0.
 To find the probability that the discrete random variable X falls between a and b when it is known that the discrete variable Y = y,
P(a < X < b | Y = y) = Σ_{a < x < b} f(x | y),
 where the summation extends over all values of X between a and b.
 When X and Y are continuous,
P(a < X < b | Y = y) = ∫_a^b f(x | y) dx.
Example 5:
 Referring to Example 1, find the conditional
distribution of X, given that Y = 1, and use it
to determine P (X = 0 | Y = 1).
Solution:
 Finding f(x | y), where y = 1:
h(1) = Σ_{x=0}^{2} f(x, 1)
= 3/14 + 3/14 + 0
h(1) = 3/7.
Solution:
 Now
f(x | 1) = f(x, 1)/h(1) = (7/3) f(x, 1), x = 0, 1, 2.
Solution:
 Then
f(0 | 1) = (7/3) f(0, 1) = (7/3)(3/14) = 1/2,
f(1 | 1) = (7/3) f(1, 1) = (7/3)(3/14) = 1/2,
f(2 | 1) = (7/3) f(2, 1) = (7/3)(0) = 0.
Solution:
 The conditional distribution of X, given Y = 1, is

 x          0     1     2
 f(x | 1)  1/2   1/2    0

 Finally,
P(X = 0 | Y = 1) = f(0 | 1)
P(X = 0 | Y = 1) = 1/2.
Solution:
 Therefore, if it is known that 1 of the 2 pens selected is red, we have a probability equal to 1/2 that the other pen is not blue.
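The conditioning step in Example 5 is just a division by the marginal. A minimal sketch:

```python
from fractions import Fraction

# Conditional pmf f(x | Y = 1) = f(x, 1) / h(1), using the joint pmf
# values f(x, 1) from Example 1.
F = Fraction
f_x1 = {0: F(3, 14), 1: F(3, 14), 2: F(0)}   # f(x, 1) for x = 0, 1, 2
h1 = sum(f_x1.values())                      # h(1) = 3/7
cond = {x: p / h1 for x, p in f_x1.items()}
print(cond)          # {0: 1/2, 1: 1/2, 2: 0}
print(cond[0])       # P(X = 0 | Y = 1) = 1/2
```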
 In some random experiments, knowledge of the value of X does not change any of the probabilities associated with the values of Y.
 Two continuous random variables X and Y are said to be independent if
f_{XY}(x, y) = f_X(x) f_Y(y).
Definition
 Let X and Y be two random variables, discrete or continuous, with joint probability distribution f(x, y) and marginal distributions g(x) and h(y), respectively. The random variables X and Y are said to be statistically independent if and only if
f(x, y) = g(x) h(y)
for all (x, y) within their range.
Example 6:
 Show that the random variables of Example 1 are not statistically independent.
Proof:
 Consider the point (0, 1). From the table in Example 1, the three probabilities f(0, 1), g(0), and h(1) are
f(0, 1) = 3/14,
g(0) = Σ_{y=0}^{2} f(0, y) = 3/28 + 3/14 + 1/28 = 5/14,
h(1) = Σ_{x=0}^{2} f(x, 1) = 3/14 + 3/14 + 0 = 3/7.
Proof:
g(0) h(1) = (5/14)(3/7) = 15/98,
so
f(0, 1) = 3/14 ≠ 15/98 = g(0) h(1).
 Therefore X and Y are not statistically independent.
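One failing point is enough to rule out independence, but the product check can also be run over every cell of the table. A minimal sketch, using the Example 1 joint pmf:

```python
from fractions import Fraction

# X and Y are independent iff f(x, y) = g(x) h(y) for every (x, y).
F = Fraction
f = {(0, 0): F(3, 28), (1, 0): F(9, 28), (2, 0): F(3, 28),
     (0, 1): F(3, 14), (1, 1): F(3, 14), (2, 1): F(0),
     (0, 2): F(1, 28), (1, 2): F(0),    (2, 2): F(0)}
g = {x: sum(f[(x, y)] for y in range(3)) for x in range(3)}
h = {y: sum(f[(x, y)] for x in range(3)) for y in range(3)}

independent = all(f[(x, y)] == g[x] * h[y] for (x, y) in f)
print(f[(0, 1)], g[0] * h[1], independent)  # 3/14, 15/98, False
```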
Definition
 Let X1, X2, . . . , Xn be n random variables, discrete or continuous, with joint probability distribution f(x1, x2, . . . , xn) and marginal distributions f1(x1), f2(x2), . . . , fn(xn), respectively.
Definition
 The random variables X1, X2, . . . , Xn are said to be mutually statistically independent if and only if
f(x1, x2, . . . , xn) = f1(x1) f2(x2) · · · fn(xn)
for all (x1, x2, . . . , xn) within their range.
Example 7:
 Suppose that the shelf life, in years, of a certain perishable food product packaged in cardboard containers is a random variable whose probability density function is given by
f(x) = e^(−x), x > 0,
f(x) = 0, elsewhere.
Example 7:
 Let X1, X2, and X3 represent the shelf life for three of these containers selected independently and find
P(X1 < 2, 1 < X2 < 3, X3 > 2).
Solution:
 Since the containers were selected
independently, we can assume that the
random variables X1, X2, and X3 are
statistically independent, having the joint
probability density.
Solution:
f(x1, x2, x3) = f(x1) f(x2) f(x3)
= e^(−x1) e^(−x2) e^(−x3)
f(x1, x2, x3) = e^(−x1 − x2 − x3)
for x1 > 0, x2 > 0, x3 > 0,
and f(x1, x2, x3) = 0, elsewhere.
Hence
P(X1 < 2, 1 < X2 < 3, X3 > 2) = ∫_2^∞ ∫_1^3 ∫_0^2 e^(−x1 − x2 − x3) dx1 dx2 dx3
= (1 − e^(−2))(e^(−1) − e^(−3)) e^(−2)
P(X1 < 2, 1 < X2 < 3, X3 > 2) ≈ 0.0372.
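Because the joint density factors, the probability factors into three one-dimensional exponential probabilities. A minimal sketch of the evaluation:

```python
from math import exp

# Independent Exp(1) shelf lives: the joint probability is the product of
# three one-dimensional exponential probabilities.
p1 = 1 - exp(-2)          # P(X1 < 2)
p2 = exp(-1) - exp(-3)    # P(1 < X2 < 3)
p3 = exp(-2)              # P(X3 > 2)
print(round(p1 * p2 * p3, 4))  # 0.0372
```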
