PDM Notes
4. Examples
Example 1: Two Dice Roll
• Sample Space: When two dice are rolled, there are 6 × 6 = 36 equally likely outcomes.
• The event of rolling a double six (both dice showing six) has only one favourable outcome: (6,6).
Probability of getting a double six: P(Double Six) = 1/36
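A minimal Python sketch (not part of the original notes) that enumerates the sample space and confirms the count:

```python
from itertools import product

# Enumerate the 6 x 6 = 36 equally likely outcomes of rolling two dice.
sample_space = list(product(range(1, 7), repeat=2))
favourable = [outcome for outcome in sample_space if outcome == (6, 6)]

print(len(sample_space))                    # 36
print(len(favourable) / len(sample_space))  # 0.02777... = 1/36
```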
Example: If there’s a 30% chance of rain and a 50% chance of thunder, with both happening together 20% of
the time, the probability of either rain or thunder is: P(Rain ∪ Thunder) = 0.3 + 0.5 − 0.2 = 0.6
• Problem: A box contains 3 coins: 2 regular and 1 two-headed coin. If a coin is picked at random and
tossed, what is the probability it shows heads?
• Solution:
Let C1 and C2 represent picking a regular and a two-headed coin, respectively.
o P(H) = P(H|C1) × P(C1) + P(H|C2) × P(C2)
o P(H) = (1/2) × (2/3) + 1 × (1/3) = 2/3
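The same total-probability calculation as a quick Python check (variable names are illustrative):

```python
# Law of total probability: P(H) = P(H|C1) * P(C1) + P(H|C2) * P(C2)
p_c1, p_c2 = 2/3, 1/3                  # pick a regular coin vs. the two-headed coin
p_h_given_c1, p_h_given_c2 = 1/2, 1.0  # a two-headed coin always shows heads

p_h = p_h_given_c1 * p_c1 + p_h_given_c2 * p_c2
print(p_h)  # 0.666... = 2/3
```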
• Problem: The probability of rain is 1/3, and traffic occurs with probability 1/2 (if it rains) or 1/4 (if it doesn't rain). What is the probability it rains given that you arrive late?
• Solution: Use conditional probability: P(R|L) = P(R ∩ L) / P(L)
o Calculate P(L) using the law of total probability.
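Carrying the calculation through (a sketch, not from the notes): P(L) = (1/2)(1/3) + (1/4)(2/3) = 1/3, so P(R|L) = (1/6)/(1/3) = 1/2.

```python
# Bayes' rule for the rain/late problem, with P(L) from total probability.
p_rain = 1/3
p_late_given_rain = 1/2
p_late_given_no_rain = 1/4

p_late = p_late_given_rain * p_rain + p_late_given_no_rain * (1 - p_rain)
print(p_late)                               # 0.333... = 1/3
print(p_late_given_rain * p_rain / p_late)  # P(R|L) = 0.5
```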
• If a sample space S is partitioned into C1, C2, …, Cn, the probability of event A is given by: P(A) = ∑i P(A|Ci) × P(Ci)
5. Measures of Central Tendency: These measures describe the central value of a data set, providing a
summary of where most data points lie.
1.2 Median
• Definition: The middle value when the data is sorted in ascending or descending order.
For an even number of data points, the median is the average of the two middle terms.
• Median for Grouped Data: Median = l + (h/f) × (N/2 − c)
where: l = lower boundary of the median class, h = class width, f = frequency of the median class, c = cumulative frequency of the class preceding the median class, N = total number of observations.
Solution: Median = 4000 + (1000/20) × (21.5 − 8) = 4675 Rupees
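A small helper (hypothetical parameter names) that applies the grouped-median formula; the arguments reproduce the worked solution above, which implies N/2 = 21.5, i.e. N = 43:

```python
def grouped_median(l, h, f, N, c):
    """Median = l + (h / f) * (N/2 - c) for grouped (class-interval) data."""
    return l + (h / f) * (N / 2 - c)

print(grouped_median(l=4000, h=1000, f=20, N=43, c=8))  # 4675.0 Rupees
```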
• Definition: The n-th root of the product of n observations: G.M. = (x1 × x2 × … × xn)^(1/n)
o Example: Find the G.M. of 2, 4, 8, 12, 16, 24.
o Solution: log(G.M.) = (1/6)(0.3010 + 0.6021 + 0.9031 + 1.0792 + 1.2041 + 1.3802) = 0.9116
o Taking the antilog: G.M. = 10^0.9116 = 8.158
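The same log/antilog steps in Python (a sketch, not from the notes):

```python
import math

data = [2, 4, 8, 12, 16, 24]

# Average the base-10 logs, then take the antilog.
mean_log = sum(math.log10(x) for x in data) / len(data)
print(mean_log)        # ~0.9116
print(10 ** mean_log)  # G.M. ~ 8.158
```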
1. Conditional Probability
• Definition: The probability of event A occurring given that event B has already occurred, denoted by
P(A | B).
P(A|B) = P(A ∩ B) / P(B)
3. Bayes' Theorem - Bayes' Theorem allows us to reverse conditional probabilities: P(A|B) = P(B|A) × P(A) / P(B)
• Generalized Bayes' Theorem: If B1, B2, …, Bn form a partition of the sample space: P(Bj|A) = P(A|Bj) × P(Bj) / ∑i P(A|Bi) × P(Bi)
• Given:
o Machine 1: 30% of production, 5% defective
o Machine 2: 70% of production, 1% defective
Find the probability a defective item came from Machine 1: P(A1|D) = P(D|A1) × P(A1) / (P(D|A1) × P(A1) + P(D|A2) × P(A2))
P(A1|D) = (0.05 × 0.3) / ((0.05 × 0.3) + (0.01 × 0.7)) = 0.682
Thus, there is a 68.2% chance the defective item came from Machine 1.
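A quick Python check of the machine example (variable names are illustrative):

```python
# Posterior probability that a defective item came from Machine 1.
p_m1, p_m2 = 0.3, 0.7                    # production shares
p_d_given_m1, p_d_given_m2 = 0.05, 0.01  # defect rates

p_d = p_d_given_m1 * p_m1 + p_d_given_m2 * p_m2  # total probability of a defect
print(p_d_given_m1 * p_m1 / p_d)                 # ~0.682
```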
• Given:
o 1% of women have cancer (P(Cancer) = 0.01)
o 90% of those with cancer test positive (P(Positive | Cancer) = 0.9)
o 8% of those without cancer also test positive (P(Positive | No Cancer) = 0.08)
Find the probability that a woman actually has cancer given a positive test:
P(Cancer | Positive) = (0.9 × 0.01) / ((0.9 × 0.01) + (0.08 × 0.99)) = 0.009 / (0.009 + 0.0792) ≈ 0.10
Despite the positive test, the probability of actually having cancer is only about 10%, because the disease is rare (a base rate of just 1%).
1. Conditional Probability: P(A|B) = P(A ∩ B) / P(B)
2. Multiplication Rule: P(A ∩ B) = P(A|B) × P(B)
3. Bayes' Theorem: P(A|B) = P(B|A) × P(A) / P(B)
4. Law of Total Probability: P(A) = ∑i P(A|Bi) × P(Bi)
1. Random Variables
• Definition: A random variable is a function that assigns a real number to each outcome in the sample
space of a random experiment.
• Examples:
o The number of heads when a coin is tossed twice: X = {0, 1, 2}
o The outcome of rolling a die: S = {1, 2, 3, 4, 5, 6}, with each outcome having a probability:
P(X = k) = 1/6, k = 1, 2, …, 6
3. Probability Distributions
• Binomial Distribution: Models the number of successes in n independent Bernoulli trials, each with success probability p.
• PMF: P(X = x) = C(n, x) p^x (1 − p)^(n−x), X ∼ Bin(n, p)
• Example: Probability of 3 heads in 5 coin tosses with p = 0.5:
o P(X = 3) = C(5, 3)(0.5)^3(0.5)^2 = 0.3125
• Geometric Distribution: Models the number of trials required for the first success.
• PMF: P(X = x) = (1 − p)^(x−1) p, X ∼ Geo(p)
• Poisson Distribution: Models the number of events occurring in a fixed interval of time or space.
• PMF: P(X = x) = λ^x e^(−λ) / x!, X ∼ Poisson(λ)
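The three PMFs evaluated in Python with the standard library; the geometric and Poisson parameters (x = 3, p = 0.5, λ = 1.5) are illustrative choices, not from the notes:

```python
from math import comb, exp, factorial

# Binomial: P(X = 3) for n = 5 tosses with p = 0.5
print(comb(5, 3) * 0.5**3 * 0.5**2)       # 0.3125

# Geometric: P(first success on trial 3) with p = 0.5
p = 0.5
print((1 - p)**(3 - 1) * p)               # 0.125

# Poisson: P(X = 2) with rate lam = 1.5
lam = 1.5
print(lam**2 * exp(-lam) / factorial(2))  # ~0.2510
```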
6.2 Variance
• Measures the spread of a random variable around its mean: Var(X) = E[(X − μ)^2] = E[X^2] − (E[X])^2
• Definition: The CDF gives the probability that the variable takes a value less than or equal to x.
• F(x) = P(X ≤ x) = ∫_(−∞)^x f(t) dt
• Example: For the PDF f(x) = 1/5 over [0, 5], the CDF is:
o F(x) = x/5, 0 ≤ x ≤ 5
2. Marginal Distribution
• The marginal distribution refers to the distribution of a single variable obtained by summing over the rows or columns in the table: P(X = x) = f_X(x) = ∑y f(x, y) and P(Y = y) = f_Y(y) = ∑x f(x, y).
• Example: If X represents income groups and Y represents expenses, the marginal probability of X = x sums all joint probabilities in the corresponding row.
3. Joint Distribution - Joint distribution refers to the probability distribution of two (or more) random variables
simultaneously.
• For discrete variables X and Y, the joint probability function is: P(X = x, Y = y) = f(x, y)
• Conditions:
o f(x, y) ≥ 0
o ∑x ∑y f(x, y) = 1
5. Conditional Distribution - Conditional distribution gives the probability of one variable given the value of
another.
5.1 Discrete Case
• The conditional probability function of Y given X = x is: P(Y = y | X = x) = f(x, y) / f_X(x)
5.2 Continuous Case
• The conditional density function of Y given X = x is: f(y|x) = f(x, y) / f_X(x)
• Example: Roll of a fair die. The outcomes are 1, 2, 3, 4, 5, 6, each with probability 1/6.
o E(X) = (1 + 2 + 3 + 4 + 5 + 6)/6 = 3.5
o E(X^2) = (1^2 + 2^2 + 3^2 + 4^2 + 5^2 + 6^2)/6 = 91/6 ≈ 15.17
o Variance: Var(X) = E(X^2) − (E(X))^2 = 15.17 − (3.5)^2 = 2.92
o Standard deviation: σ(X) = √2.92 ≈ 1.71
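The same computation in Python (a sketch, not from the notes):

```python
outcomes = range(1, 7)

e_x = sum(outcomes) / 6                  # E(X) = 3.5
e_x2 = sum(k**2 for k in outcomes) / 6   # E(X^2) = 91/6 ~ 15.17
var = e_x2 - e_x**2                      # ~2.92
print(e_x, e_x2, var, var**0.5)          # 3.5, 15.1667, 2.9167, 1.7078
```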
4. Properties of Expectation and Variance
1. Linearity of Expectation: E(aX + b) = aE(X) + b
2. Variance of a Sum: For independent random variables X and Y: Var(X + Y) = Var(X) + Var(Y)
3. Standardized Variable: If X* = (X − μ)/σ, then: E(X*) = 0, Var(X*) = 1
6. Exercises
1. Expected Profit of a Product: A product has probabilities 0.15 (successful), 0.25 (moderately
successful), and 0.6 (unsuccessful). Calculate the expected profit.
2. Cakes Demand: Find the expected daily demand for cakes given the probabilities for different levels
of demand.
3. Accident Damage: Compute the expected damage for car accidents given probabilities and damage
levels.
2. Bernoulli Distribution
• Bernoulli Trial: A process with only two outcomes—success (1) and failure (0).
PMF of Bernoulli distribution: f(x) = p^x (1 − p)^(1−x), x ∈ {0, 1}
• Mean: E(X) = p
• Variance: Var(X) = p(1 − p)
3. Binomial Distribution
The binomial distribution describes the number of successes in n independent Bernoulli trials.
• PMF: P(X = x) = C(n, x) p^x (1 − p)^(n−x)
• Mean: E(X) = np
• Variance: Var(X) = np(1 − p)
• Example: In 9 trials with p = 0.3, the probability of exactly 2 successes is:
o P(X = 2) = C(9, 2)(0.3)^2(0.7)^7 = 0.266
4. Negative Binomial Distribution
• Definition: Describes the number of trials needed to achieve r successes.
• PMF: P(X = x) = C(x − 1, r − 1) p^r (1 − p)^(x−r), x = r, r + 1, …
• Mean: E(X) = r/p
• Variance: Var(X) = r(1 − p)/p^2
5. Poisson Distribution
The Poisson distribution models the number of events in a fixed time or space interval.
• PMF: P(X = x) = λ^x e^(−λ) / x!
• Mean and Variance: E(X) = λ, Var(X) = λ
• Example: A manufacturer finds that 0.1% of bottles are defective. For 500 bottles, the expected
number of defects is: λ = 500 × 0.001 = 0.5
• The probability of no defects: P(X = 0) = e−0.5 = 0.6065
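Checking the defect example in Python:

```python
from math import exp

lam = 500 * 0.001  # expected number of defects in 500 bottles
print(lam)         # 0.5
print(exp(-lam))   # P(X = 0) = e^(-0.5) ~ 0.6065
```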
6. Geometric Distribution
The geometric distribution models the number of trials needed to achieve the first success.
• PMF: P(X = x) = p(1 − p)^(x−1)
• Mean: E(X) = 1/p
• Example: If the probability of a defective bulb is p = 0.04, the probability that the first defective bulb appears on the 6th test is: P(X = 6) = 0.04 × (0.96)^5 = 0.0326
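The bulb example in Python:

```python
p = 0.04                     # probability a bulb is defective
x = 6                        # first defective bulb on the 6th test
print(p * (1 - p)**(x - 1))  # ~0.0326
```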
1. Negative Binomial Distribution - Describes the number of trials needed to achieve a specified number of successes in a sequence of independent Bernoulli trials with a constant success probability p.
2. Geometric Distribution - A special case of the negative binomial distribution that models the number of
trials required to get the first success.
Example Problems
• Given: p = 0.4.
• Formula: P(X = 3) = (1 − 0.4)^2 ⋅ 0.4 = 0.144
3. Hypergeometric Distribution - Describes the probability of successes in a fixed number of draws from a
finite population without replacement.
Formula: P(X = k) = C(K, k) × C(N − K, n − k) / C(N, n)
where N is the population size, K the number of successes in the population, n the sample size, and k the number of observed successes.
Hypergeometric vs. Binomial:
• Hypergeometric: sampling without replacement; the probability changes after each trial; used for small populations.
• Binomial: sampling with replacement; the probability remains constant; used for large populations.
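A sketch of the hypergeometric PMF; the urn numbers below are illustrative, not from the notes:

```python
from math import comb

def hypergeom_pmf(k, N, K, n):
    """P(X = k): k successes in n draws without replacement
    from a population of N containing K successes."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

# Example: 5 red balls in an urn of 20; draw 4 without replacement.
print(hypergeom_pmf(k=2, N=20, K=5, n=4))  # P(exactly 2 red) ~ 0.2167
```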
5. Real-Life Applications
1. Binomial Distribution:
o Problem: A survey of 300 households checks for ownership of 4+ televisions.
o Application: Binomial distribution models the probability based on ownership rates.
2. Poisson Distribution:
o Problem: A caterer serves 15 plates every 10 minutes. How should service be planned?
o Solution: P(X = r) = λ^r e^(−λ) / r!
o For service over a 5-minute interval, P(X = 5) = 0.0378. Proper planning suggests having 2 people serve every 3 minutes to maintain service quality.
1. Continuous Random Variables
• Continuous random variables can take any value within a given range, unlike discrete random variables, which take only specific values.
• The probability density function (PDF) describes the probability of a continuous random variable
falling within a specific interval.
• Key Properties:
1. The area under the PDF curve equals 1.
2. The probability that X lies within an interval [a, b] is the area under the curve between a and
b.
2. Normal Distribution - A symmetric, bell-shaped distribution characterized by its mean μ and standard deviation σ.
3. Exponential Distribution - Models the waiting time between events occurring at a constant average rate.
4. Gamma Distribution - Generalizes the exponential distribution and models the sum of independent
exponential variables.
• PDF Formula: f(x) = μ(μx)^(n−1) e^(−μx) / (n − 1)!, x > 0, μ > 0
• Mean and Variance: E(X) = n/μ, Var(X) = n/μ^2
o Problem: Find Γ(7/2) and evaluate the integral: I = ∫_0^∞ x^6 e^(−5x) dx
o Solution: Γ(7/2) = (5/2) × (3/2) × (1/2) × √π = 15√π/8
o For the integral: I = 6!/5^7 ≈ 0.0092
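Verifying both results with the standard library:

```python
from math import gamma, factorial, pi, sqrt

print(gamma(3.5))           # Γ(7/2) ~ 3.3234
print(15 * sqrt(pi) / 8)    # same value, matching the hand computation

# ∫ from 0 to ∞ of x^6 e^(-5x) dx = Γ(7) / 5^7 = 6! / 5^7
print(factorial(6) / 5**7)  # ~0.0092
```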
1. Normal Distribution: f(x) = (1/(σ√(2π))) e^(−(x−μ)^2/(2σ^2))
2. Z-Score: z = (x − μ)/σ
3. Exponential Distribution: f_T(t) = α e^(−αt), E(T) = 1/α
4. Gamma Distribution: f(x) = μ(μx)^(n−1) e^(−μx) / (n − 1)!, E(X) = n/μ, Var(X) = n/μ^2
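As a closing illustration (the values are hypothetical, not from the notes), the normal CDF can be evaluated from the z-score via the error function:

```python
from math import erf, sqrt

def normal_cdf(x, mu, sigma):
    """P(X <= x) for X ~ N(mu, sigma^2), via the error function."""
    z = (x - mu) / sigma
    return 0.5 * (1 + erf(z / sqrt(2)))

print(normal_cdf(75, mu=70, sigma=5))  # z = 1, so ~0.8413
```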