PDM Notes

Week 1: Introduction to Probability Study Notes

1. Probability: A Measure of Uncertainty


• Probability quantifies the likelihood of an event occurring. It provides a mathematical framework to
make decisions under uncertainty.
• Example: A "40% chance of rain" means if similar weather conditions occur multiple times, 40 out of
100 times it will rain.

2. Operations with Events
Union of Events
• The union represents the event where either or both of two events occur.
Example: If A is getting a head on the first toss and B is getting a head on the second toss, the union
A ∪ B is the event of getting at least one head.
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
Intersection of Events
• The intersection occurs when both events happen simultaneously.
Example: In two coin tosses, A ∩ B represents the event of getting heads on both tosses.
Its probability is written P(A ∩ B).
Complement of an Event
• The complement is the event where the desired outcome does not happen.
Example: If A is the event of getting a head in a coin toss, Ac is the event of not getting a head (i.e.,
getting a tail).
P(Ac) = 1 − P(A)

3. Classical Definition of Probability


• This is the basic formula used for calculating the probability of an event. It applies when all
outcomes are equally likely.
P(A) = Number of favorable outcomes / Total number of outcomes
Example: If a coin is tossed, there are two possible outcomes: heads or tails.
The probability of getting heads is: P(Heads) = 1/2

4. Examples
Example 1: Two Dice Roll
• Sample Space: When two dice are rolled there are 6 × 6 = 36 possible outcomes.
• The event of rolling a double six (both dice showing six) has only one favourable outcome: (6,6).
Probability of getting a double six: P(Double Six) = 1/36

Example 2: Leap Year and Sundays


• A leap year contains 366 days, which equals 52 complete weeks plus 2 extra days. These extra days
can be any of the following combinations:
(Sat, Sun), (Sun, Mon), (Mon, Tue), etc.
• The event of having 53 Sundays happens when the extra days include a Sunday (either (Sat, Sun) or
(Sun, Mon)).
Probability of 53 Sundays: P(53 Sundays) = 2/7
• Thus, the probability of exactly 52 Sundays: P(52 Sundays) = 1 − 2/7 = 5/7

5. Combinatorial Probability Example


Given: A box with 400 bulbs, 10% of which are defective.
To calculate the probability of selecting exactly 5 defective bulbs from 20 selected:
• Explanation:
We need to select 5 defective bulbs from the 40 defective ones and 15 good bulbs from the 360 good ones.
The total number of ways to select 20 bulbs out of 400 is the binomial coefficient C(400, 20).
P(Exactly 5 Defective) = C(40, 5) × C(360, 15) / C(400, 20)
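This ratio of binomial coefficients can be checked numerically; a minimal Python sketch using `math.comb` (the variable name is illustrative):

```python
from math import comb

# 400 bulbs, 40 defective (10%); draw 20 and ask for exactly 5 defective.
# Numerator: choose 5 of the 40 defective and 15 of the 360 good bulbs.
p = comb(40, 5) * comb(360, 15) / comb(400, 20)
print(p)
```

The result is a small probability, since the expected number of defective bulbs in a draw of 20 is only 20 × 0.1 = 2.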
6. Summary of Key Formulas
1. Classical Probability:
This formula applies when outcomes are equally likely. P(A) = f / N, where f is the number of favorable
outcomes and N is the total number of possible outcomes.
2. Union of Events:
This formula calculates the probability of at least one of the events occurring. P(A ∪ B) = P(A) +
P(B) − P(A ∩ B)
3. Complement of an Event:
The probability of an event not occurring. P(Ac ) = 1 − P(A)
4. Probability of Independent Events:
If two events are independent, the probability of both occurring is the product of their individual
probabilities. P(A ∩ B) = P(A) × P(B)
5. Combinatorial Probability:
This is used when selecting subsets from a larger set. P(Combination) = C(n, r) / C(N, R), where C(n, r) is the
binomial coefficient, representing the number of ways to choose r items from n items.

Week 2: Basic Laws of Probability Study Notes

1. Basic Laws of Probability


1.1 Addition Rule
• The probability of either event A or B occurring is given by adding their individual probabilities and
subtracting the intersection (common part).
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

Example: If there’s a 30% chance of rain and a 50% chance of thunder, with both happening together 20% of
the time, the probability of either rain or thunder is: P(Rain ∪ Thunder) = 0.3 + 0.5 − 0.2 = 0.6

1.2 Multiplication Rule


• The probability of both events A and B occurring is given by: P(A ∩ B) = P(A) × P(B|A)
• For independent events (when one event does not affect the other): P(A ∩ B) = P(A) × P(B).

1.3 Conditional Probability


• Conditional probability represents the likelihood of an event happening given that another event has
already occurred. P(B|A) = P(A ∩ B) / P(A)
• Example: If there is a 60% chance that a product lasts at least 2 years and a 40% chance it lasts at least 3 years, the
probability it lasts at least 3 years given it has passed 2 years is: P(T ≥ 3 | T ≥ 2) = P(T ≥ 3) / P(T ≥ 2) = 0.4/0.6 ≈ 0.67

2. Dependent and Independent Events

• Dependent Events: The outcome of one event affects the other.


• Independent Events: The outcome of one event has no effect on the other.
• 5% Rule: If the sample size is smaller than 5% of the population, dependent events can be treated as
independent to simplify calculations.
3. Examples of Probability Calculations

Example 1: Tossing Coins

• Problem: A box contains 3 coins: 2 regular and 1 two-headed coin. If a coin is picked at random and
tossed, what is the probability it shows heads?
• Solution:
Let C1 and C2 represent picking a regular and a two-headed coin, respectively.
o P(H) = P(H|C1) × P(C1) + P(H|C2) × P(C2)
o P(H) = (1/2) × (2/3) + 1 × (1/3) = 2/3

Example 2: Traffic and Weather

• Problem: The probability of rain is 1/3, and traffic occurs with probability 1/2 (if it rains) or 1/4 (if it
doesn't rain). What is the probability it rains given that you arrive late?
• Solution: Use conditional probability: P(R|L) = P(R ∩ L) / P(L)
o Calculate P(L) using the law of total probability.
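The calculation can be completed numerically; a short sketch using exact fractions ("arriving late" is taken to mean traffic occurs, as the problem implies):

```python
from fractions import Fraction

p_rain = Fraction(1, 3)
p_late_given_rain = Fraction(1, 2)
p_late_given_dry = Fraction(1, 4)

# Law of total probability: P(L) = P(L|R)P(R) + P(L|R')P(R')
p_late = p_late_given_rain * p_rain + p_late_given_dry * (1 - p_rain)

# Bayes: P(R|L) = P(L|R)P(R) / P(L)
p_rain_given_late = p_late_given_rain * p_rain / p_late
print(p_late, p_rain_given_late)  # 1/3 1/2
```

Using `Fraction` keeps the arithmetic exact, matching hand computation with the textbook fractions.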

4. Law of Total Probability

• If a sample space S is partitioned into C1 , C2 , … , Cn, the probability of event A is given by:

P(A) = ∑_{i=1}^{n} P(A|Ci) × P(Ci)

5. Measures of Central Tendency: These measures describe the central value of a data set, providing a
summary of where most data points lie.

5.1 Arithmetic Mean (Simple Mean)

• Definition: Sum of all observations divided by the number of observations.


For observations x1, x2, …, xn, the arithmetic mean x̅ is: x̅ = (1/n) ∑_{i=1}^{n} xi
• For Frequency Distributions: x̅ = ∑_{i=1}^{n} fi xi / ∑_{i=1}^{n} fi
o where fi is the frequency of the corresponding value xi.
o Example:
Find the mean of the data: 70, 120, 110, 101, 88, 83, 95, 98, 107, 100.
o Solution: x̅ = (70 + 120 + 110 + 101 + 88 + 83 + 95 + 98 + 107 + 100) / 10 = 97.2

5.2 Median

• Definition: The middle value when the data is sorted in ascending or descending order.
For an even number of data points, the median is the average of the two middle terms.

Median for Grouped Data: Median = l + (h/f) × (N/2 − c)

where:

• l: Lower boundary of the median class


• h: Class width
• f: Frequency of the median class
• c: Cumulative frequency of the class before the median class
• N: Total frequency

Example: Find the median wage for the data:

• Wages (Rupees): 2000–3000, 3000–4000, 4000–5000, 5000–6000, 6000–7000


• Frequency: 3, 5, 20, 10, 5

Solution: The median class is 4000–5000 (N = 43, so N/2 = 21.5; c = 8, f = 20, h = 1000).
Median = 4000 + (1000/20) × (21.5 − 8) = 4675 Rupees

5.3 Geometric Mean (G.M.)

• Definition: The n-th root of the product of n observations: G.M. = (x1 × x2 × … × xn)^(1/n)
o Example: Find the G.M. of 2, 4, 8, 12, 16, 24.
o Solution: log(G.M.) = (1/6)(0.3010 + 0.6021 + 0.9031 + 1.0792 + 1.2041 + 1.3802) = 0.9116
o Taking the antilog: G.M. = 10^0.9116 = 8.158
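The geometric mean can also be computed directly, without log tables; a quick Python check:

```python
from math import prod

data = [2, 4, 8, 12, 16, 24]
gm = prod(data) ** (1 / len(data))  # n-th root of the product
print(gm)
```

This agrees with the four-figure log-table value 8.158 up to rounding.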

Summary of Key Formulas

1. Addition Rule: P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
2. Multiplication Rule: P(A ∩ B) = P(A) × P(B|A)
3. Conditional Probability: P(B|A) = P(A ∩ B) / P(A)
4. Law of Total Probability: P(A) = ∑_{i=1}^{n} P(A|Ci) × P(Ci)

Week 3: Conditional Probability and Bayes' Theorem Study Notes

1. Conditional Probability

• Definition: The probability of event A occurring given that event B has already occurred, denoted by
P(A | B).

P(A|B) = P(A ∩ B) / P(B)

• If A and B are independent, then P(A | B) = P(A).

Multiplication Rule: P(A ∩ B) = P(A|B) × P(B) = P(B|A) × P(A)

2. Generalization to Multiple Events

• For three events A1 , A2 , A3 , the joint probability is given by:

P(A1 ∩ A2 ∩ A3 ) = P(A1 ) × P(A2 |A1 ) × P(A3 |A1 ∩ A2 )

This generalizes to any number of events.


Example 1: Rolling a Die

• Let A = {1, 2, 3} and B = {2, 4} .


• Calculate P(A), P(B), and P(A ∩ B):
o P(A) = 3/6 = 1/2
o P(B) = 2/6 = 1/3
o P(A ∩ B) = 1/6
• Check if A and B are independent: P(A|B) = P(A ∩ B) / P(B) = (1/6) / (1/3) = 1/2 = P(A)
• Since P(A|B) = P(A), the events are independent.

3. Bayes' Theorem - Bayes' Theorem allows us to reverse conditional probabilities: P(A|B) = P(B|A) × P(A) / P(B)

• Generalized Bayes' Theorem: If B1, B2, …, Bn form a partition of the sample space:
P(Bj|A) = P(A|Bj) × P(Bj) / ∑_{i=1}^{n} P(A|Bi) × P(Bi)

Example 2: Defective Items from Two Machines

• Given:
o Machine 1: 30% of production, 5% defective
o Machine 2: 70% of production, 1% defective

Find the probability a defective item came from Machine 1:
P(A1|D) = P(D|A1) × P(A1) / [P(D|A1) × P(A1) + P(D|A2) × P(A2)]

P(A1|D) = (0.05 × 0.3) / [(0.05 × 0.3) + (0.01 × 0.7)] = 0.682

Thus, there is a 68.2% chance the defective item came from Machine 1.
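A quick numerical check of this Bayes computation (variable names are illustrative):

```python
# Production shares and defect rates from the example.
p_m1, p_m2 = 0.3, 0.7
p_def_m1, p_def_m2 = 0.05, 0.01

# Total probability of drawing a defective item.
p_def = p_def_m1 * p_m1 + p_def_m2 * p_m2

# Bayes' theorem: P(Machine 1 | defective).
p_m1_given_def = p_def_m1 * p_m1 / p_def
print(round(p_m1_given_def, 3))  # 0.682
```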

Example 3: Cancer Detection

• Given:
o 1% of women have cancer (P(Cancer) = 0.01)
o 90% of those with cancer test positive (P(Positive | Cancer) = 0.9)
o 8% of those without cancer also test positive P(Positive|No Cancer) = 0.08

Find the probability that a woman actually has cancer given a positive test:

P(Cancer|Positive) = (0.9 × 0.01) / [(0.9 × 0.01) + (0.08 × 0.99)]

P(Cancer|Positive) = 0.009 / (0.009 + 0.0792) ≈ 0.10

Thus, the probability of having cancer given a positive test is 10%.

4. Summary of Key Formulas

1. Conditional Probability: P(A|B) = P(A ∩ B) / P(B)
2. Multiplication Rule: P(A ∩ B) = P(A|B) × P(B)
3. Bayes' Theorem: P(A|B) = P(B|A) × P(A) / P(B)
4. Law of Total Probability: P(A) = ∑ni=1 P(A|Bi ) × P(Bi )

Week 4: Random Variables and Probability Distributions Study Notes

1. Random Variables

• Definition: A random variable is a function that assigns a real number to each outcome in the sample
space of a random experiment.
• Examples:
o The number of heads when a coin is tossed twice: X = {0, 1, 2}
o The outcome of rolling a die: S = {1, 2, 3, 4, 5, 6}, with each outcome having a probability:
P(X = k) = 1/6, k = 1, 2, …, 6

2. Types of Random Variables

1. Discrete Random Variable: Takes a finite or countably infinite set of values.


o Example: The number of children in a family.
2. Continuous Random Variable: Takes an uncountable number of values over an interval.
o Example: The weight of a person.

3. Probability Distributions

3.1 Discrete Random Variables

• Probability Mass Function (PMF):


o Describes the probability of each possible value for a discrete random variable.
▪ P(X = x) = f(x), where 0 ≤ f(x) ≤ 1 and ∑ f(x) = 1
o Example: For a fair coin tossed twice:
▪ S = {HH, HT, TH, TT}
▪ X = {0,1,2} (number of heads)
▪ P(X = 0) = 1/4, P(X = 1) = 1/2, P(X = 2) = 1/4

3.2 Continuous Random Variables

• Probability Density Function (PDF):


o For a continuous random variable, the probability of exact values is zero. The PDF is used to
calculate probabilities over intervals.
▪ ∫_{−∞}^{∞} f(x) dx = 1 and P(a ≤ X ≤ b) = ∫_a^b f(x) dx
▪ Example: The function f(x) = 1/5 over the interval [0, 5] satisfies:
▪ ∫_0^5 (1/5) dx = 1

4. Special Discrete Distributions


4.1 Binomial Distribution

• Definition: Models the number of successes in n independent Bernoulli trials, each with success
probability p.
• PMF: P(X = x) = C(n, x) p^x (1 − p)^(n−x), X ∼ Bin(n, p)
• Example: Probability of 3 heads in 5 coin tosses with p = 0.5:
o P(X = 3) = C(5, 3) (0.5)^3 (0.5)^2 = 0.3125
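The binomial PMF is straightforward to verify in code; a minimal sketch (the helper name is illustrative):

```python
from math import comb

def binomial_pmf(x, n, p):
    """P(X = x) for X ~ Bin(n, p)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# 3 heads in 5 fair coin tosses
print(binomial_pmf(3, 5, 0.5))  # 0.3125
```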

4.2 Geometric Distribution

• Definition: Models the number of trials required for the first success.
• PMF: P(X = x) = (1 − p)^(x−1) p, X ∼ Geo(p)

4.3 Poisson Distribution

• Definition: Models the number of events occurring in a fixed interval of time or space.
• PMF: P(X = x) = λ^x e^(−λ) / x!, X ∼ Poisson(λ)

5. Special Continuous Distributions

5.1 Exponential Distribution

• Definition: Models the time between two events in a Poisson process.


• PDF: f(x) = λ e^(−λx), x ≥ 0

5.2 Normal Distribution

• Definition: Models data that cluster around a mean value.


• PDF: f(x) = (1 / (σ√(2π))) e^(−(x−μ)² / (2σ²)), X ∼ N(μ, σ²)

6. Mean and Variance of a Random Variable

6.1 Mean (Expected Value)

• For a discrete random variable: E[X] = ∑x x P(X = x)



• For a continuous random variable: E[X] = ∫_{−∞}^{∞} x f(x) dx

6.2 Variance

• Measures the spread of a random variable around its mean. Var(X) = E[(X − μ)2 ] = E[X 2 ] − (E[X])2

7. Cumulative Distribution Function (CDF)

• Definition: The CDF gives the probability that the variable takes a value less than or equal to x.
• F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt
• Example: For the PDF f(x) = 1/5 over [0, 5], the CDF is:
o F(x) = x/5, 0 ≤ x ≤ 5

8. Summary of Key Formulas


1. PMF for Discrete Variable: P(X = x) = f(x), ∑ f(x) = 1
2. PDF for Continuous Variable: ∫_{−∞}^{∞} f(x) dx = 1
3. Binomial PMF: P(X = x) = C(n, x) p^x (1 − p)^(n−x)
4. Poisson PMF: P(X = x) = λ^x e^(−λ) / x!
5. Mean: E[X] = ∑_x x P(X = x)
6. Variance: Var(X) = E[X²] − (E[X])²

Week 5: Joint, Marginal, and Conditional Distributions Study Notes

1. Bivariate Frequency Distribution

• Univariate distribution: Involves only one variable.


• Bivariate distribution: Involves two variables, such as income and expenditure or supply and
demand.
• Data are summarized in a two-way frequency table with each cell representing the frequency of the
pair (x, y).

2. Marginal Distribution

• The marginal distribution refers to the distribution of a single variable obtained by summing over the
rows or columns in the table. P(X = x) = fX (x) = ∑y f(x, y) and P(Y = y) = fY (y) = ∑x f(x, y).
• Example: If X represents income groups and Y represents expenses, the marginal probability of X = x
sums all joint probabilities in the corresponding row.

3. Joint Distribution - Joint distribution refers to the probability distribution of two (or more) random variables
simultaneously.

3.1 Discrete Joint Distribution

• For discrete variables X and Y, the joint probability function is: P(X = x, Y = y) = f(x, y)
• Conditions:
o f(x, y) ≥ 0
o ∑x ∑y f(x, y) = 1

3.2 Continuous Joint Distribution


• For continuous variables, the joint density function satisfies: ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1
• Example: The probability that X lies between a and b and Y between c and d is:
P(a < X < b, c < Y < d) = ∫_c^d ∫_a^b f(x, y) dx dy

4. Cumulative Distribution Function (CDF)


• The joint cumulative distribution function (CDF) for random variables X and Y is defined as: F(x, y) =
P(X ≤ x, Y ≤ y) = ∑u≤x ∑v≤y f(u, v)
• For continuous variables: F(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f(u, v) du dv

5. Conditional Distribution - Conditional distribution gives the probability of one variable given the value of
another.
5.1 Discrete Case
• The conditional probability function of Y given X = x is: P(Y = y|X = x) = f(x, y) / fX(x)
5.2 Continuous Case
• The conditional density function of Y given X = x is: f(y|x) = f(x, y) / fX(x)

6. Example: Joint Distribution of Two Variables


o Scenario: Three balls are drawn from a box containing 2 white, 3 red, and 4 black balls.
o Let X be the number of white balls, and Y the number of red balls drawn.
o The joint probability P(X = 1, Y = 2) is: P(X = 1, Y = 2) = C(2, 1) × C(3, 2) / C(9, 3) = 6/84 = 1/14
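This joint probability can be checked with `math.comb`; a small sketch (the `joint` helper is illustrative):

```python
from fractions import Fraction
from math import comb

# 2 white, 3 red, 4 black balls; draw 3 without replacement.
# X = number of white balls drawn, Y = number of red balls drawn.
def joint(x, y):
    # The remaining 3 - x - y balls drawn must be black (4 available).
    return Fraction(comb(2, x) * comb(3, y) * comb(4, 3 - x - y), comb(9, 3))

print(joint(1, 2))  # 1/14
```

Summing `joint(x, y)` over all feasible pairs gives 1, confirming it is a valid joint distribution.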
7. Exercises
1. Probability of Red Balls
o Two red balls are drawn from a pool of 6 blue and 6 red balls.
o Find the probability: P(Two Red Balls) = C(6, 2) / C(12, 2) = 15/66 = 5/22
2. Conditional Probability
o What is the probability of death, given no safety restraint was worn?
o P(D|NR) = f(3, 0) / fY(0) = 0.025/0.40 = 0.0625

8. Summary of Key Formulas


1. Marginal Probability: P(X = x) = ∑_y f(x, y), P(Y = y) = ∑_x f(x, y)
2. Joint Probability: P(X = x, Y = y) = f(x, y)
3. Conditional Probability: P(Y = y|X = x) = f(x, y) / fX(x)
4. CDF: F(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f(u, v) du dv

Week 6: Mathematical Expectation and Variance Study Notes

1. Mathematical Expectation (Expected Value)


• The expected value E(X) is the average value or mean of a random variable over many repetitions of
the experiment. It provides insight into the typical outcome.

1.1 Discrete Random Variable


• For a discrete random variable X with values x1 , x2 , … , xn and corresponding probabilities p(xi ), the
expected value is: E(X) = ∑k xk p(xk )
• Example:
If X takes values 0, 1, 2, 3 with probabilities 0.1, 0.2, 0.4, 0.3, then:
o E(X) = (0 × 0.1) + (1 × 0.2) + (2 × 0.4) + (3 × 0.3) = 1.9
1.2 Continuous Random Variable
• For a continuous random variable X with PDF f(x), the expected value is:

o E(X) = ∫_{−∞}^{∞} x f(x) dx
• Example:
Given f(x) = 1/3 for −1 < x < 2, the expected value is:
o E(X) = ∫_{−1}^{2} x ⋅ (1/3) dx = 1/2

2. Applications of Expected Value


• Finance: Anticipated value of future investments.
• Decision-making: Guides strategies in various fields like economics, insurance, and games of
chance.

Example: Expected Profit from a Bet


• A player pays $100 to draw a ball from a box with 35 black and 5 white balls.
o Prize for black ball: -$100 (loss)
o Prize for white ball: +$750 (win)
Expected Profit: E(X) = (−100) × (35/40) + 750 × (5/40) = 6.25
The expected profit is $6.25.

3. Variance and Standard Deviation


o Variance measures the spread of a distribution and is denoted as Var(X) = E[(X − μX )2 ] =
E[X 2 ] − (E[X])2

o Standard deviation σ(X) is the square root of variance: σ(X) = √Var(X)

Example: Variance of Rolling a Die

• The outcomes are 1, 2, 3, 4, 5, 6, each with probability 1/6.
o E(X) = (1 + 2 + 3 + 4 + 5 + 6) / 6 = 3.5
o E(X²) = (1² + 2² + 3² + 4² + 5² + 6²) / 6 ≈ 15.17
o Variance: Var(X) = E(X²) − (E(X))² = 15.17 − (3.5)² = 2.92
o Standard deviation: σ(X) = √2.92 ≈ 1.71
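The die calculation can be reproduced exactly with fractions:

```python
from fractions import Fraction

outcomes = range(1, 7)
p = Fraction(1, 6)  # fair die

mean = sum(x * p for x in outcomes)               # E(X)
second_moment = sum(x * x * p for x in outcomes)  # E(X^2)
variance = second_moment - mean**2

print(mean, variance)  # 7/2 35/12
```

Here 35/12 ≈ 2.92, matching the rounded value computed by hand.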
4. Properties of Expectation and Variance
1. Linearity of Expectation: E(aX + b) = aE(X) + b
2. Variance of a Sum: For independent random variables X and Y: Var(X + Y) = Var(X) + Var(Y)
3. Standardized Variable: If X* = (X − μ)/σ, then: E(X*) = 0, Var(X*) = 1

5. Covariance and Correlation


• Covariance measures how two random variables change together: Cov(X, Y) = E[(X − E[X])(Y −
E[Y])]
• Correlation normalizes covariance: ρ(X, Y) = Cov(X, Y) / (σ(X) σ(Y))
• Example: Variance and Covariance:
o Given Var(X) = 11/144, Var(Y) = 11/144, and Cov(X, Y) = −1/144, find:
o Var((1/2)X + Y) = (1/4)Var(X) + Var(Y) + Cov(X, Y) = 11/576 + 44/576 − 4/576 = 51/576 ≈ 0.089

6. Exercises
1. Expected Profit of a Product: A product has probabilities 0.15 (successful), 0.25 (moderately
successful), and 0.6 (unsuccessful). Calculate the expected profit.
2. Cakes Demand: Find the expected daily demand for cakes given the probabilities for different levels
of demand.
3. Accident Damage: Compute the expected damage for car accidents given probabilities and damage
levels.

Week 7: Discrete Distribution Models Study Notes

1. Introduction to Discrete Distribution Models


A discrete distribution model describes the probability of different outcomes from a set of discrete events. A
random variable can take on only distinct, countable values like 0, 1, 2, etc.
• Example: A product's sale may vary around 1000 units—less, equal, or more. Here, the sale volume
is a random variable determined by chance.
Key Properties of Discrete Probability Distributions
• Probability Mass Function (PMF): The probability of a specific outcome x must satisfy:
▪ 0 ≤ P(X = x) ≤ 1, ∑_x P(X = x) = 1
• Support and Domain:
▪ Support: All possible values a random variable can take.
▪ Domain: The specific values observed in practice.

2. Bernoulli Distribution
• Bernoulli Trial: A process with only two outcomes—success (1) and failure (0).
PMF of Bernoulli distribution: f(x) = p^x (1 − p)^(1−x), x ∈ {0, 1}
• Mean: E(X) = p
• Variance: Var(X) = p(1 − p)
3. Binomial Distribution
The binomial distribution describes the number of successes in n independent Bernoulli trials.
• PMF: P(X = x) = C(n, x) p^x (1 − p)^(n−x)
• Mean: E(X) = np
• Variance: Var(X) = np(1 − p)
• Example: In 9 trials with p = 0.3, the probability of exactly 2 successes is:
o P(X = 2) = C(9, 2) (0.3)² (0.7)⁷ ≈ 0.266
4. Negative Binomial Distribution
• Definition: Describes the number of trials needed to achieve r successes.
• PMF: P(X = x) = C(x − 1, r − 1) p^r (1 − p)^(x−r)
• Mean: E(X) = r/p
• Variance: Var(X) = r(1 − p)/p²
5. Poisson Distribution
The Poisson distribution models the number of events in a fixed time or space interval.
• PMF: P(X = x) = λ^x e^(−λ) / x!
• Mean and Variance: E(X) = λ, Var(X) = λ
• Example: A manufacturer finds that 0.1% of bottles are defective. For 500 bottles, the expected
number of defects is: λ = 500 × 0.001 = 0.5
• The probability of no defects: P(X = 0) = e^(−0.5) ≈ 0.6065
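A minimal Python check of this Poisson calculation (the helper name is illustrative):

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(X = x) for X ~ Poisson(lam)."""
    return lam**x * exp(-lam) / factorial(x)

lam = 500 * 0.001  # expected number of defective bottles
print(round(poisson_pmf(0, lam), 4))  # 0.6065
```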

6. Geometric Distribution
The geometric distribution models the number of trials needed to achieve the first success.
• PMF: P(X = x) = p(1 − p)^(x−1)
• Mean: E(X) = 1/p
• Example: If the probability of a defective bulb is p = 0.04, the probability that the first
defective bulb appears on the 6th test is: P(X = 6) = 0.04 × (0.96)^5 ≈ 0.0326

7. Summary of Key Formulas

1. Bernoulli Distribution: E(X) = p, Var(X) = p(1 − p)


2. Binomial Distribution: P(X = x) = C(n, x) p^x (1 − p)^(n−x)
3. Negative Binomial Distribution: E(X) = r/p, Var(X) = r(1 − p)/p²
4. Poisson Distribution: P(X = x) = λ^x e^(−λ) / x!, E(X) = λ, Var(X) = λ
5. Geometric Distribution: P(X = x) = p(1 − p)^(x−1), E(X) = 1/p
Week 8: Discrete Distribution Models - Applications and Examples

1. Negative Binomial Distribution

• Definition: Describes the number of trials needed to achieve a specified number of successes in a
sequence of independent Bernoulli trials with a constant success probability p.

Example Problems

1. First strike on the third well drilled:


o Given: p = 0.2, r = 1, x = 3.
o Formula: P(X = 3) = C(3 − 1, 1 − 1) × (1 − p)^(3−1) × p
o P(X = 3) = 0.8² × 0.2 = 0.128
2. Rimi arrives on time for the 8th time on the 10th day:

• Given: p = 0.8, r = 8, x = 10.


• Formula: P(X = 10) = C(9, 7) × 0.8^8 × 0.2² ≈ 0.242

2. Geometric Distribution - A special case of the negative binomial distribution that models the number of
trials required to get the first success.

Example Problems

1. Finding the expected number of donors:


o Given: p = 0.2.
o Formula for Expected Value: E(X) = 1/p = 1/0.2 = 5
2. Hitting the bullseye on the third try:

• Given: p = 0.4.
• Formula: P(X = 3) = (1 − 0.4)² × 0.4 = 0.144

3. Hypergeometric Distribution - Describes the probability of successes in a fixed number of draws from a
finite population without replacement.

Formula: P(X = k) = C(K, k) × C(N − K, n − k) / C(N, n)

• K: Number of successes in the population


• N: Total population size
• n: Number of draws
• k: Number of observed successes

4. Comparison Between Binomial and Hypergeometric Distributions

• Hypergeometric: sampling without replacement; the success probability changes after each trial; used for small populations.
• Binomial: sampling with replacement; the success probability remains constant; used for large populations.
5. Real-Life Applications

1. Binomial Distribution:
o Problem: A survey of 300 households checks for ownership of 4+ televisions.
o Application: Binomial distribution models the probability based on ownership rates.
2. Poisson Distribution:
o Problem: A caterer serves 15 plates every 10 minutes. How to plan for service?
o Solution: P(X = r) = λ^r e^(−λ) / r!
o For service over 5 minutes, P(X = 5) = 0.0378. Proper planning suggests 2 people serve
every 3 minutes to maintain service quality.

6. Fitting Discrete Distributions

• Steps for Goodness-of-Fit Tests:


1. Formulate null and alternative hypotheses.
2. Choose the appropriate distribution (binomial, Poisson, etc.).
3. Use observed vs. expected values to determine fit.
• Application Example: Determining whether a local car colour distribution aligns with global trends
using Chi-Square Goodness-of-Fit test.

7. Summary of Key Formulas

1. Negative Binomial Distribution: P(X = x) = C(x − 1, r − 1) p^r (1 − p)^(x−r)
2. Geometric Distribution: P(X = x) = p(1 − p)^(x−1), E(X) = 1/p
3. Hypergeometric Distribution: P(X = k) = C(K, k) × C(N − K, n − k) / C(N, n)
4. Poisson Distribution: P(X = r) = λ^r e^(−λ) / r!, E(X) = λ

Week 9: Continuous Probability Distribution Models - Applications and Examples

1. Concept of Continuous Distributions

• Continuous random variables can take any value within a given range, unlike discrete random
variables that take only specific values.
• The probability density function (PDF) describes the probability of a continuous random variable
falling within a specific interval.
• Key Properties:
1. The area under the PDF curve equals 1.
2. The probability that X lies within an interval [a, b] is the area under the curve between a and
b.

2. Normal Distribution

• Definition: A continuous distribution characterized by a symmetric, bell-shaped curve.


• PDF Formula: f(x) = (1 / (σ√(2π))) e^(−(x−μ)² / (2σ²))
o μ: Mean
o σ: Standard deviation
• Key Properties:
o The mean, median, and mode are equal.
o The curve is symmetric around the mean μ.
o The total area under the curve equals 1.
o The z-score z = (x − μ)/σ represents the number of standard deviations x is from the mean.
• Example: Lifetimes of Electronic Devices
o Given: Mean μ = 300 hours, σ = 25 hours.
o Probability of lifetime > 350 hours: z = (350 − 300)/25 = 2
o From the z-table, P(X > 350) = 1 − 0.9772 = 0.0228.
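Without a z-table, the same tail probability can be computed from the error function; a short sketch (`normal_cdf` is an illustrative helper):

```python
from math import erf, sqrt

def normal_cdf(x, mu, sigma):
    """Standard normal CDF evaluated at the z-score of x."""
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

# P(lifetime > 350) for mu = 300, sigma = 25
p = 1 - normal_cdf(350, 300, 25)
print(round(p, 4))  # 0.0228
```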

3. Exponential Distribution

• Definition: Models the time between events in a Poisson process.


• PDF Formula: fT(t) = α e^(−αt), t > 0
• Example: Births at a Hospital
o Given: T ∼ Exp(1/7).
o Probability of a birth within 10 days: FT(10) = 1 − e^(−10/7) ≈ 0.760
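The exponential CDF can be evaluated directly (the helper name is illustrative):

```python
from math import exp

def exponential_cdf(t, rate):
    """F(t) = 1 - e^(-rate * t) for T ~ Exp(rate)."""
    return 1 - exp(-rate * t)

# Births arrive at rate 1/7 per day; probability of a birth within 10 days.
print(round(exponential_cdf(10, 1 / 7), 3))  # 0.76
```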

4. Gamma Distribution - Generalizes the exponential distribution and models the sum of independent
exponential variables.

• PDF Formula: f(x) = μ(μx)^(n−1) e^(−μx) / (n − 1)!, x > 0, μ ≥ 0
• Mean and Variance: E(X) = n/μ, Var(X) = n/μ²

5. Applications and Exercises

Problem 1: Lifetimes of Light Bulbs

o Given: μ = 120 days, σ = 20 days.


o Number of bulbs expiring in < 90 days: z = (90 − 120)/20 = −1.5
o From the z-table, P(X < 90) = 0.0668. Thus, about 67 bulbs out of 1000 will
expire within 90 days.

Problem 2: Gamma Distribution Example


o Problem: Find Γ(7/2) and evaluate the integral: I = ∫_0^∞ x^6 e^(−5x) dx
o Solution: Γ(7/2) = (5/2) × (3/2) × (1/2) × √π = 15√π/8
o For the integral: I = 6!/5^7 ≈ 0.0092

Problem 3: Time Between Cars on a Highway

o Given: f(x) = 2e^(−2x), x ≥ 0.


o Mean: E(X) = 1/α = 1/2 = 0.5 seconds.
o Variance: Var(X) = 1/α² = 0.25.
o CDF: F(x) = 1 − e^(−2x)
6. Summary of Key Formulas

1. Normal Distribution: f(x) = (1 / (σ√(2π))) e^(−(x−μ)² / (2σ²))
2. Z-Score: z = (x − μ) / σ
3. Exponential Distribution: fT(t) = α e^(−αt), E(T) = 1/α
4. Gamma Distribution: f(x) = μ(μx)^(n−1) e^(−μx) / (n − 1)!, E(X) = n/μ, Var(X) = n/μ²
