The document provides an overview of basic concepts in probability, including experiments, outcomes, sample spaces, events, and the classical definition of probability. It discusses conditional probability, independence of events, and key theorems such as Bayes' Theorem and the Theorem of Total Probability. Additionally, it covers discrete probability distributions, expectation, and variance, along with examples and problems to illustrate these concepts.

Uploaded by

Joy Sengupta

Basics of Probability

&
Probability Distributions
Some Basic Notions
• Experiment : It is any act or process that can be repeated under
essentially similar conditions.
• Eg. 1. Toss a coin once.
2. Roll a die once.
3. React two volumes of H2 with one volume of O2.
Some Basic Notions contd.
• Random Experiment : It is an experiment whose result cannot be
predicted before it is actually performed.
• Eg 1. Toss a coin once.
2. Roll a die once.
3. Select a person at random from a group of individuals.
Some Basic Notions contd.
• Outcome : The result of a random experiment is known as its
outcome.
• Eg. 1. When we toss a coin once, the possible outcomes are
‘head’ and ‘tail’ (to be further denoted by H and T respectively).
2. When we roll a die once, the possible outcomes are 1, 2, 3,
4, 5 and 6.
Some Basic Notions contd.
• Sample Space : The collection of all possible outcomes with respect
to a given random experiment is known as its sample space. It is
generally denoted by S or Ω.
• Eg. 1. When a coin is tossed once,
S = {H,T}.
2. When a coin is tossed twice,
S = {HH,HT,TH,TT}.
Some Basic Notions contd.
• Eg. 3. When a coin is tossed thrice,
S={HHH,HHT,HTH,THH,HTT,THT,TTH,TTT}.
• Eg. 4. When a die is rolled once,
S={1,2,3,4,5,6}.
• Eg. 5. Suppose a coin is tossed until a head appears. Then,
S={H,TH,TTH,TTTH, …..}
It is an example of infinite sample space.
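The finite sample spaces above can be enumerated mechanically. The following Python sketch (Python is an illustration here, not part of the slides) builds the spaces for two and three tosses:

```python
from itertools import product

# Sample space for tossing a coin twice: every ordered pair of H and T.
S2 = [''.join(t) for t in product('HT', repeat=2)]
print(S2)  # ['HH', 'HT', 'TH', 'TT']

# Tossing thrice gives 2**3 = 8 outcomes.
S3 = [''.join(t) for t in product('HT', repeat=3)]
print(len(S3))  # 8
```

The infinite sample space of Eg. 5 cannot be listed this way; only its elements up to a chosen length can be generated.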
Some Basic Notions contd.
• Event : A part (subset) of the sample space is called an event.
• Eg. 1. Suppose we toss a coin twice.
Let A be an event denoting that a head will appear in the 1st toss. Then,
A={HH,HT} where S={HH,HT,TH,TT}
Here HH and HT are called the outcomes favourable to the event A.

• Eg. 2. Suppose we roll a die once.


Let B be an event denoting the appearance of an even number. Then,
B={2,4,6} where S={1,2,3,4,5,6}
Here 2, 4 and 6 are the outcomes favourable to the event B.
Some Basic Notions contd.
• Equally Likely Outcomes : Outcomes of a random experiment are
said to be equally likely if none of them can be expected in preference
to the others.

In experiments like tossing of a coin, rolling of a die, etc., if the
outcomes may be considered to be equally likely, then the coin or the
die is said to be fair, symmetric, unbiased, etc.
Classical Definition of Probability
Suppose
➢The sample space S with respect to a given random
experiment is finite. Let N be the total number of possible
outcomes in the sample space.
➢All the outcomes are equally likely.

Given an event A, let n(A) be the number of outcomes favourable
to A. Then, by the classical definition, the probability of A is defined
as

P(A) = n(A) / N.
An Example
• Suppose we are asked to find out the probability of getting ‘at least two heads’
when an unbiased coin is tossed thrice.
• We shall here define an event A denoting the appearance of at least two heads.
Here
S={HHH,HHT,HTH,THH,HTT,THT,TTH,TTT}
while,
A={HHT,HTH,THH,HHH}
Thus, n(A) = 4 and N = 8.

Therefore, P(A) = n(A)/N = 4/8 = 1/2.
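The classical definition reduces to counting favourable outcomes, so the example can be checked by enumeration. A minimal Python sketch (an illustration, not part of the slides):

```python
from itertools import product
from fractions import Fraction

S = [''.join(t) for t in product('HT', repeat=3)]  # all 8 outcomes
A = [s for s in S if s.count('H') >= 2]            # at least two heads
p = Fraction(len(A), len(S))                       # n(A) / N
print(p)  # 1/2
```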
Another Approach
• Suppose a random experiment is repeated n times.
• Let fn(A) be the number of times an event A occurs in these n repetitions.
• fn(A) is called the frequency of A and fn(A)/n is called its relative frequency.
• If the experiment is repeated in sets of n, then, for small values of n, these
relative frequencies fluctuate highly.
• However, as n increases, the fluctuations get reduced and the relative
frequencies tend to stabilize around some fixed level.
• This tendency of relative frequencies to stabilize around some fixed
level is called statistical regularity, and the level around which this
stabilization occurs is known as the probability of the event A.
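Statistical regularity can be seen empirically by simulating tosses for increasing n. A minimal Python sketch (the seed and sample sizes are arbitrary choices of this illustration):

```python
import random

random.seed(1)  # fixed seed so the run is reproducible
# Relative frequency of heads for growing numbers of tosses of a fair coin.
for n in (10, 100, 10000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(n, heads / n)  # fluctuates for small n, settles near 0.5
```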
Conditional Probability
Let B be an event such that P(B) > 0. Then for any other event A, the
probability of occurrence of A given that B has already occurred is
defined as
P(A|B) = P(A ∩ B) / P(B).
Example
Suppose a card is drawn at random from a full pack of playing cards.
What is the probability that it is an ‘ace’ when it is known that the card
drawn is a ‘spade’?
Here A = the card is an ace.
B = the card is a spade.
Then, P(A ∩ B) = 1/52 and
P(B) = 13/52.
∴ P(A|B) = (1/52) / (13/52) = 1/13.
Theorem of Compound Probability

Let B be an event such that P(B) > 0. Then for any event A,

P(A ∩ B) = P(B) · P(A|B).
Theorem of Total Probability
Let A1 and A2 be two mutually exclusive and exhaustive events such
that P(A1) > 0 and P(A2) > 0. Then for any event B,

P(B) = P(A1) · P(B|A1) + P(A2) · P(B|A2).
Bayes’ Theorem
Let A1 and A2 be two mutually exclusive and exhaustive events such
that P(A1) > 0 and P(A2) > 0. Let B be another event such that P(B) > 0.
Then,
P(A1|B) = P(A1) · P(B|A1) / [P(A1) · P(B|A1) + P(A2) · P(B|A2)]

P(A2|B) = P(A2) · P(B|A2) / [P(A1) · P(B|A1) + P(A2) · P(B|A2)]
Let us solve
Q. Suppose the probability that a person is suffering from Cancer is
0.3. Further, let the probability that a diagnostic test detects Cancer
correctly be 0.62, and the probability that the test shows a positive
result when the person is not actually suffering from the disease be 0.15.
What is the probability that the test, applied to a randomly selected
person, will show a positive result? Also find the probability that the
person is not actually suffering from Cancer when the test shows a
positive result.
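One way to work through this problem (a sketch using only the numbers given in the question) is to apply the Theorem of Total Probability first and then Bayes' Theorem:

```python
p_cancer = 0.3                # P(C)
p_pos_given_cancer = 0.62     # P(+ | C)
p_pos_given_no_cancer = 0.15  # P(+ | not C)

# Theorem of Total Probability: P(+) = P(C)·P(+|C) + P(not C)·P(+|not C)
p_pos = p_cancer * p_pos_given_cancer + (1 - p_cancer) * p_pos_given_no_cancer
print(round(p_pos, 3))  # 0.291

# Bayes' Theorem: P(not C | +) = P(not C)·P(+|not C) / P(+)
p_no_cancer_given_pos = (1 - p_cancer) * p_pos_given_no_cancer / p_pos
print(round(p_no_cancer_given_pos, 4))  # 0.3608
```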
Independence

Two events A and B are said to be independent if

P(A ∩ B) = P(A) · P(B).
Result
If A and B are two independent events, then

(a) A and BC
(b) AC and B
(c) AC and BC

will also be independent.


Let us solve
Suppose the probabilities that two students solve a given problem are 0.6
and 0.9 respectively. If they try to solve the problem independently,
what is the probability that the problem will be solved?
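Since the events "student 1 fails" and "student 2 fails" are also independent (by the Result above), the failure probabilities multiply. A quick Python check of this problem (an illustration, not from the slides):

```python
p1, p2 = 0.6, 0.9
# P(solved) = 1 - P(neither student solves it) = 1 - (1 - p1)(1 - p2)
p_solved = 1 - (1 - p1) * (1 - p2)
print(round(p_solved, 2))  # 0.96
```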
Measures on a Discrete
Probability Distribution
Expectation (or Mathematical Expectation)
Let X be a discrete random variable assuming the values x1, x2, …, xn (or x1, x2, …) with the
corresponding probabilities p1, p2, …, pn (or p1, p2, …), where pi = P(X = xi) for all i and
Σi pi = 1. The Expectation of X is then defined as

E(X) = Σi xi pi.
Notes
• (i) Observe that E(X) = Σi xi pi = (Σi xi pi) / (Σi pi), since Σi pi = 1. Thus, expectation is the
Arithmetic Mean in the probability setup.

• (ii) We have E(X) = Σi xi pi = Σi xi P(X = xi) = Σx x P(X = x) = Σx x f(x), where
f(x) is the PMF of X.
Example

E: Toss an unbiased coin thrice.
Let X: Number of heads obtained in three tosses.
The probability distribution of X is:

x         0     1     2     3    Total
P(X=x)   1/8   3/8   3/8   1/8     1

The Expectation of X is given by

E(X) = Σx x P(X = x) = 0 × 1/8 + 1 × 3/8 + 2 × 3/8 + 3 × 1/8 = 12/8 = 1.5.
Variance
It is defined as V(X) = E[X − E(X)]² = E(X²) − {E(X)}².
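Both measures can be computed directly from a tabulated distribution. A Python sketch for the three-toss example above (exact arithmetic via Fraction; an illustration, not part of the slides):

```python
from fractions import Fraction

# X = number of heads in three tosses of an unbiased coin.
dist = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

E = sum(x * p for x, p in dist.items())      # E(X)
E2 = sum(x**2 * p for x, p in dist.items())  # E(X^2)
V = E2 - E**2                                # V(X) = E(X^2) - {E(X)}^2
print(E, V)  # 3/2 3/4
```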
Problem 1
A bag contains 5 white and 3 black balls. 3 balls are drawn randomly from the bag. Find the
expected number of white balls drawn.
Solution
• Let X be a random variable denoting the number of white balls drawn. Clearly, X can take
the values 0, 1, 2 and 3.
• Now, P(X = 0) = C(5,0) C(3,3) / C(8,3) = 1/56
• P(X = 1) = C(5,1) C(3,2) / C(8,3) = 15/56
• P(X = 2) = C(5,2) C(3,1) / C(8,3) = 30/56
• P(X = 3) = C(5,3) C(3,0) / C(8,3) = 10/56
where C(n, r) denotes the number of combinations of n things taken r at a time.
Solution
• Thus, E(X) = Σx x P(X = x) = 0 × 1/56 + 1 × 15/56 + 2 × 30/56 + 3 × 10/56 = 105/56 = 1.875.
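The counting above is hypergeometric, so it can be reproduced with math.comb (a sketch, not from the slides; note that 105/56 reduces to 15/8):

```python
from fractions import Fraction
from math import comb

# P(X = k): choose k white from 5 and 3 - k black from 3, out of C(8,3) draws.
probs = {k: Fraction(comb(5, k) * comb(3, 3 - k), comb(8, 3)) for k in range(4)}
E = sum(k * p for k, p in probs.items())
print(E, float(E))  # 15/8 1.875
```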
Problem 2

• Suppose a player A gains Rs. 16 from another player B for getting at least one head and
loses Rs. 40 to B otherwise when an unbiased coin is tossed thrice. Find the expectation
of the gain of A.
Solution
• Let X denote the gain of player A.
• Clearly, X can take the values 16 and −40.
• Now, P(X = 16) = P(at least one head) = 7/8.
• Also, P(X = −40) = P(no head) = 1/8.
• Thus, E(X) = 16 × 7/8 + (−40) × 1/8 = Rs. 9.
Problem 3
• If a person gets Rs. (2X + 5) where X denotes the number appearing when a balanced die
is rolled once, find the expected gain of the person.
Solution
• Here X can take the values 1, 2, 3, 4, 5 and 6.
• Also, the die being balanced, each outcome has the probability 1/6.

• Then, by the problem, the gain of the person may be 7, 9, 11, 13, 15 or 17 respectively, each with probability 1/6.

• Thus, E(gain) = E(2X + 5) = (1/6)(7 + 9 + 11 + 13 + 15 + 17) = 72/6 = 12.
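The same computation, phrased as the expectation of the function 2X + 5 of X, in a short Python sketch (an illustration, not part of the slides):

```python
from fractions import Fraction

# X is the face shown by a balanced die; the gain is g(X) = 2X + 5.
E_gain = sum(Fraction(2 * x + 5, 6) for x in range(1, 7))  # each face has prob 1/6
print(E_gain)  # 12
```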
