SARANSH
Chapter - 15:
Probability
At the foundation level, the concept of probability is used in accounting and finance to understand the likelihood of occurrence or non-occurrence of a variable. It helps in financial forecasting, an area in which you will need to develop expertise at an advanced stage of the chartered accountancy course. In this chapter an attempt is made to explain the concepts of probability with the help of the following questions and solutions.
The terms 'probably', 'in all likelihood', 'chance', 'odds in favour' and 'odds against' are quite familiar nowadays, and they have their origin in a branch of Mathematics known as Probability. In recent times, probability has developed into a full-fledged subject and has become an integral part of statistics.
Random Experiment: An experiment is defined to be random if its results depend on chance only. For example, if a coin is tossed, we get two outcomes: Head (H) and Tail (T). It is impossible to say in advance whether a Head or a Tail would turn up when we toss the coin once. Thus, tossing a coin is an example of a random experiment. Similarly, rolling a die (or any number of dice), drawing items from a box containing both defective and non-defective items, drawing cards from a well-shuffled pack of fifty-two cards, etc. are all random experiments.
Events: The results or outcomes of a random experiment are known as events. Sometimes events may be a combination of outcomes. The events are of two types:
(i) Simple or Elementary,
(ii) Composite or Compound.
An event is known to be simple if it cannot be decomposed into further events. Tossing a coin once provides us with two simple events, namely Head and Tail. On the other hand, a composite event is one that can be decomposed into two or more events. Getting exactly one head when a coin is tossed twice is an example of a composite event, as it can be split into the events HT and TH, which are both elementary events.
Mutually Exclusive Events or Incompatible Events:
A set of events A1, A2, A3, …… is known to be mutually exclusive if not more than one of them can occur simultaneously.
Thus, occurrence of one such event implies the non-occurrence of the other events of the set. Once a coin is
tossed, we get two mutually exclusive events Head and Tail.
Exhaustive Events:
The events A1, A2, A3, ………… are known to form an exhaustive set if one of these events must necessarily occur. As
an example, the two events Head and Tail, when a coin is tossed once, are exhaustive as no other event except
these two can occur.
Equally Likely Events or Mutually Symmetric Events or Equi-Probable Events:
The events of a random experiment are known to be equally likely when, all relevant evidence taken into account, no event is expected to occur more frequently than the other events of the set. The two events Head and Tail, when a coin is tossed once, are an example of a pair of equally likely events, because there is no reason to assume that Head (or Tail) would occur more frequently than Tail (or Head).
Classical Definition of Probability or A Priori Definition
Let us consider a random experiment that results in a finite number n of elementary events, which are assumed to be equally likely. We next assume that out of these n events, nA (nA ≤ n) events are favourable to an event A. Then the probability of occurrence of the event A is defined as the ratio of the number of events favourable to A to the total number of events. Denoting this by P(A), we have
P(A) = nA / n = (Number of equally likely events favourable to A) / (Total number of equally likely events)
However, if instead of considering all elementary events, we focus our attention only on those composite events which are mutually exclusive, exhaustive and equally likely, and if m (m ≤ n) denotes the number of such events while mA (mA ≤ nA) denotes the number of mutually exclusive, exhaustive and equally likely events favourable to A, then we have
P(A) = mA / m = (Number of mutually exclusive, exhaustive and equally likely events favourable to A) / (Total number of mutually exclusive, exhaustive and equally likely events)
For this definition of probability, we are indebted to Bernoulli and Laplace. This definition is also termed as a priori
definition because probability of the event A is defined based on prior knowledge.
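As a quick illustration of the classical definition (a hypothetical example, not taken from the text), the short Python sketch below computes P(A) for the event "an even number turns up" when a fair die is rolled once.

```python
# Classical probability: P(A) = favourable outcomes / total equally likely outcomes.
# Hypothetical example: A = "an even number turns up" on one roll of a fair die.
from fractions import Fraction

outcomes = [1, 2, 3, 4, 5, 6]                      # the n equally likely elementary events
favourable = [x for x in outcomes if x % 2 == 0]   # the nA events favourable to A

p_A = Fraction(len(favourable), len(outcomes))
print(p_A)                                         # 1/2
```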
This classical definition of probability has the following demerits or limitations:
(i) It is applicable only when the total no. of events is finite.
(ii) It can be used only when the events are equally likely or equi-probable. This assumption is made well before
the experiment is performed.
(iii) This definition has only a limited field of application like coin tossing, dice throwing, drawing cards etc. where
the possible events are known well in advance. In the field of uncertainty or where no prior knowledge is
provided, this definition is inapplicable.
In connection with classical definition of probability, we may note the following points:
(a) The probability of an event lies between 0 and 1, both inclusive.
i.e. 0 ≤ P (A) ≤ 1
When P(A) = 0, A is known to be an impossible event and when P(A) = 1, A is known to be a sure event.
(b) Non-occurrence of event A is denoted by A', Aᶜ or Ā, and it is known as the complementary event of A. The event A along with its complementary event A' forms a set of mutually exclusive and exhaustive events.
i.e. P(A) + P(A') = 1, so that P(A') = 1 – P(A) = 1 – mA/m = (m – mA)/m
(c) The ratio of no. of favourable events to the no. of unfavourable events is known as odds in favour of the
event A and its inverse ratio is known as odds against the event A.
i.e. odds in favour of A = mA : (m – mA)
and odds against A = (m – mA) : mA
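The odds can be read off directly from the counts mA and m, as in the small Python sketch below (an illustrative example assuming a standard pack of 52 cards and the event "a king is drawn").

```python
# Odds in favour of A are mA : (m - mA); odds against A are (m - mA) : mA.
def odds_in_favour(m_A, m):
    return m_A, m - m_A

def odds_against(m_A, m):
    return m - m_A, m_A

# Assumed example: drawing a king from a pack of 52 cards, so mA = 4 and m = 52.
print(odds_in_favour(4, 52))   # (4, 48), i.e. 1 : 12
print(odds_against(4, 52))     # (48, 4), i.e. 12 : 1
```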
Statistical definition of Probability:
Owing to the limitations of the classical definition of probability, there are cases when we consider the statistical definition of probability, based on the concept of relative frequency. This definition of probability was first developed by British mathematicians in connection with the survival probability of a group of people.
Let us consider a random experiment repeated a very large number of times, say n, under an identical set of conditions. We next assume that an event A occurs fA times. Then the limiting value of the ratio of fA to n, as n tends to infinity, is defined as the probability of A.
i.e. P(A) = lim (n→∞) fA / n
This statistical definition is applicable if the above limit exists and tends to a finite value.
Two events A and B are mutually exclusive if P (A ∩ B) = 0 or more precisely
P (A ∪ B) = P(A) + P(B)
Similarly, three events A, B and C are mutually exclusive if
P (A ∪ B ∪ C) = P(A) + P(B) + P(C)
Two events A and B are exhaustive if
P (A ∪ B) = 1
Similarly, three events A, B and C are exhaustive if
P (A ∪ B ∪ C) = 1
Three events A, B and C are equally likely if
P(A) = P(B) = P(C)
Axiomatic or modern definition of probability: Let us consider a sample space S in connection with a random experiment. Then a real-valued function P defined on S is known as a probability measure, and P(A) is defined as the probability of A, if P satisfies the following axioms:
(i) P(A) ≥ 0 for every A ⊆ S (subset)
(ii) P(S) = 1
(iii) For any sequence of mutually exclusive events A1, A2, A3, …
P(A1 ∪ A2 ∪ A3 ∪ ….) = P(A1) + P(A2) + P(A3) + ….
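A minimal sketch, assuming a finite sample space with a weight attached to each sample point, can be used to check that such an assignment satisfies the three axioms; the tossing of a single fair coin is used as the assumed example.

```python
# Assumed example: S = {H, T} with equal weights; P(A) is the sum of the weights in A.
S = {'H', 'T'}
p = {'H': 0.5, 'T': 0.5}

def P(A):
    """Probability measure induced on the subsets A of S."""
    return sum(p[s] for s in A)

assert all(P({s}) >= 0 for s in S)              # Axiom (i): non-negativity
assert abs(P(S) - 1) < 1e-12                    # Axiom (ii): P(S) = 1
A, B = {'H'}, {'T'}                             # mutually exclusive events
assert abs(P(A | B) - (P(A) + P(B))) < 1e-12    # Axiom (iii): additivity
```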
Addition theorems or theorems on total probability: For any two mutually exclusive events A and B, the probability that either A or B occurs is given by the sum of the individual probabilities of A and B,
i.e. P(A ∪ B) = P(A) + P(B) whenever A and B are mutually exclusive. [P(A ∪ B) is also written as P(A + B) or P(A or B).]
(d) For any two events A and B, the probability that either A or B occurs is given by the sum of individual
probabilities of A and B less the probability of simultaneous occurrence of the events A and B.
i. e. P(A∪B) = P(A) + P(B) – P(A∩B)
For any three events A, B and C, the probability that at least one of the events occurs is given by
P(A∪B∪C) = P(A) + P(B) + P(C) – P(A∩B) – P(A∩C) – P(B∩C)+ P(A∩B∩C)
(e) Two events A and B are mutually exclusive if
P (A∪B) = P(A) + P(B)
Similarly, three events A, B and C are mutually exclusive if
P(A∪B∪C) = P(A) + P(B) + P(C)
(f) P(A–B) = P (A∩B’) = P(A) – P(A∩B)
And P(B –A) = P(B∩A’) = P(B) – P(A∩B)
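As an illustration of (d) and (f), the sketch below (a hypothetical single roll of a fair die, not taken from the text) verifies the addition theorem and the difference formula by direct counting.

```python
# A = "even number", B = "number greater than 3" on one roll of a fair die (assumed example).
from fractions import Fraction

S = set(range(1, 7))
A = {2, 4, 6}
B = {4, 5, 6}

def P(E):
    return Fraction(len(E), len(S))

assert P(A | B) == P(A) + P(B) - P(A & B)   # (d) P(A∪B) = P(A) + P(B) - P(A∩B)
assert P(A - B) == P(A) - P(A & B)          # (f) P(A-B) = P(A) - P(A∩B)
print(P(A | B), P(A - B))                   # 2/3 1/6
```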
Some important Results
1. If A and B are two independent events, then the probability of occurrence of both is given by P(A∩B) = P(A). P(B)
2. If A, B and C are three events, then P(A∩B∩C) = P(A). P(B/A). P(C/A∩B)
3. If A and B are two mutually exclusive events of a random experiment, then
A∩B = φ and P(A∪B) = P(A) + P(B)
4. If A and B are two events associated with a random experiment, then
P(A∪B) = P(A) + P(B) – P(A∩B)
5. If A, B and C are three events connected with a random experiment, then
P(A∪B∪C) = P(A) + P(B) + P(C) – P(A∩B) – P(B∩C) – P(C∩A) + P(A∩B∩C)
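The short sketch below (an assumed example with two tosses of a fair coin) illustrates result 1, the multiplication rule for independent events.

```python
# A = "head on the first toss", B = "head on the second toss" (assumed example).
from fractions import Fraction
from itertools import product

S = set(product('HT', repeat=2))        # the four equally likely outcomes
A = {s for s in S if s[0] == 'H'}
B = {s for s in S if s[1] == 'H'}

def P(E):
    return Fraction(len(E), len(S))

assert P(A & B) == P(A) * P(B)          # result 1: P(A∩B) = P(A).P(B) for independent events
```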
(g) Compound Probability or Joint Probability
P(B/A) = P(B ∩ A) / P(A) = P(A ∩ B) / P(A)
(h) For any three events A, B and C, the probability that they occur jointly is given by
P(A∩B∩C) = P(A) P(B/A) P(C/(A∩B)), provided P(A∩B) > 0
(i) P(A'/B) = P(A' ∩ B) / P(B) = [P(B) – P(A ∩ B)] / P(B)
(j) P(A/B') = P(A ∩ B') / P(B') = [P(A) – P(A ∩ B)] / [1 – P(B)]
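A small numerical check of identities (g), (i) and (j), using assumed values P(A) = 1/2, P(B) = 2/5 and P(A ∩ B) = 1/5 (these numbers are illustrative only):

```python
from fractions import Fraction as F

P_A, P_B, P_AB = F(1, 2), F(2, 5), F(1, 5)   # assumed values

P_B_given_A  = P_AB / P_A                    # (g) P(B/A)  = 2/5
P_Ac_given_B = (P_B - P_AB) / P_B            # (i) P(A'/B) = 1/2
P_A_given_Bc = (P_A - P_AB) / (1 - P_B)      # (j) P(A/B') = 1/2
print(P_B_given_A, P_Ac_given_B, P_A_given_Bc)   # 2/5 1/2 1/2
```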
(k) P(A'/B') = P(A' ∩ B') / P(B')
= P((A ∪ B)') / P(B') [by De Morgan's Law, A' ∩ B' = (A ∪ B)']
= [1 – P(A ∪ B)] / [1 – P(B)]
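Continuing with the same assumed values, identity (k) can be checked through De Morgan's law:

```python
from fractions import Fraction as F

P_A, P_B, P_AB = F(1, 2), F(2, 5), F(1, 5)   # same assumed values as above
P_AuB = P_A + P_B - P_AB                     # P(A∪B) = 7/10
P_Ac_given_Bc = (1 - P_AuB) / (1 - P_B)      # (k) = (3/10)/(3/5) = 1/2
print(P_Ac_given_Bc)                         # 1/2
```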
(7) A random variable or stochastic variable is a function defined on the sample space associated with a random experiment, assigning a real number to each and every sample point; i.e. it may assume any value from R.
(8) Expected value or Mathematical Expectation or Expectation of a random variable may be defined as the sum
of products of the different values taken by the random variable and the corresponding probabilities.
When x is a discrete random variable with probability mass function f(x), then its expected value is given by
E(x) = µ = ∑ x ƒ(x), the sum being taken over all values of x,
and its variance is
σ² = E(x²) – µ²
where E(x²) = ∑ x² ƒ(x)
For a continuous random variable x defined in (–∞, ∞), its expected value (i.e. mean) and variance are given by
E(x) = ∫ x ƒ(x) dx, the integral being taken from –∞ to ∞,
and σ² = E(x²) – µ²
where E(x²) = ∫ x² ƒ(x) dx, the integral again being taken from –∞ to ∞.
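As a rough numerical illustration (not from the text), the sketch below approximates E(x) and σ² for an assumed uniform density ƒ(x) = 1 on [0, 1] using a simple midpoint rule; the exact values are 1/2 and 1/12.

```python
def integrate(g, a, b, n=100_000):
    """Midpoint-rule numerical integration of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

def f(x):
    # Assumed density: uniform on [0, 1]; it vanishes elsewhere, so we integrate over [0, 1].
    return 1.0

mean = integrate(lambda x: x * f(x), 0, 1)        # E(x)  ≈ 0.5
ex2  = integrate(lambda x: x * x * f(x), 0, 1)    # E(x²) ≈ 1/3
var  = ex2 - mean ** 2                            # σ²   ≈ 1/12
print(round(mean, 4), round(var, 4))              # 0.5 0.0833
```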
Properties of Expected Values
1. Expectation of a constant k is k
i.e. E(k) = k for any constant k
2. Expectation of sum of two random variables is the sum of their expectations.
i.e. E(x + y) = E(x) + E(y) for any two random variables x and y.
3. Expectation of the product of a constant and a random variable is the product of the constant and the
expectation of the random variable.
i.e. E(k x) = k.E(x) for any constant k
4. Expectation of the product of two random variables is the product of the expectation of the two random
variables, provided the two variables are independent.
i.e. E(xy) = E(x) E(y)
Whenever x and y are independent.
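The sketch below (an assumed example with two independent fair dice x and y) checks properties 2, 3 and 4 by direct enumeration of the 36 equally likely pairs.

```python
from fractions import Fraction
from itertools import product

pairs = list(product(range(1, 7), repeat=2))   # all 36 equally likely (x, y) pairs
p = Fraction(1, len(pairs))

def E(g):
    """Expectation of g(x, y) over the 36 pairs."""
    return sum(g(x, y) * p for x, y in pairs)

assert E(lambda x, y: x + y) == E(lambda x, y: x) + E(lambda x, y: y)   # property 2
assert E(lambda x, y: 3 * x) == 3 * E(lambda x, y: x)                   # property 3
assert E(lambda x, y: x * y) == E(lambda x, y: x) * E(lambda x, y: y)   # property 4 (independence)
```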
1. A speaks truth in 60% and B in 75% of the cases. In what percentage of cases are they likely to contradict
each other in stating the same fact?
Solution:
The probability that A speaks the truth and B a lie = (60/100) × (100 – 75)/100 = (60/100) × (25/100) = 3/20
The probability that B speaks the truth and A a lie = (75/100) × (100 – 60)/100 = (75/100) × (40/100) = 3/10
∴ Total probability = 3/20 + 3/10 = 9/20
Hence, the percentage of cases in which they contradict each other = (9/20) × 100 or 45%
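A quick numerical cross-check of this solution (assuming, as the solution does, that A and B speak independently):

```python
p_A_truth, p_B_truth = 0.60, 0.75
p_contradict = p_A_truth * (1 - p_B_truth) + p_B_truth * (1 - p_A_truth)
print(round(p_contradict, 2))   # 0.45, i.e. 45%
```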
2. A committee of 4 persons is to be appointed from 7 men and 3 women. Find the probability that the committee contains (i) exactly two women, and (ii) at least one woman.
Solution:
Total number of persons = 7 + 3 = 10. Since a committee of 4 can be formed out of them in 10C4 ways, the exhaustive number of cases is 10C4 = 210.
(i) P (exactly 2 women in a committee of four) = (3C2 × 7C2) / 210 = 63/210 = 3/10.
(ii) P (at least one woman in the committee)
= 1 – P (no woman) = 1 – 7C4/210 = 1 – 1/6 = 5/6.
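The same counts can be cross-checked with Python's built-in combination function (an illustrative verification, not part of the original solution):

```python
from fractions import Fraction
from math import comb

total = comb(10, 4)                                        # 210 possible committees
p_two_women    = Fraction(comb(3, 2) * comb(7, 2), total)  # 63/210 = 3/10
p_at_least_one = 1 - Fraction(comb(7, 4), total)           # 1 - 1/6 = 5/6
print(p_two_women, p_at_least_one)                         # 3/10 5/6
```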
3. If A and B are two events, such that P(A) = ¼, P(B) = 1/3 and P(A ∪ B) = ½, then P(B/A) is equal to
Solution: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
½ = ¼+1/3 - P(A ∩ B)
Or P(A ∩ B) = ¼+1/3-1/2= 1/12
Hence, P(B/A) = P(A ∩ B) / P(A) = (1/12) / (1/4) = 1/3
4. A person applies for a job in two firms, say X and Y. The probability of his being selected in firm X is 0.7 and of his being rejected in firm Y is 0.5. The probability of at least one of his applications being rejected is 0.6. What is the probability that he will be selected in one of the two firms?
Solution:
Let Event A : rejected by firm X, and
Event B : rejected by firm Y.
P(A) = 1 – 0.7 = 0.3, P(B) = 0.5 and P(A ∪ B) = 0.6
Hence, P(A ∩ B) = P(A) + P(B) – P(A ∪ B) = 0.3 + 0.5 – 0.6 = 0.2
P (selected in at least one of the two firms) = 1 – P (rejected by both) = 1 – P(A ∩ B) = 1 – 0.2 = 0.8
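A quick numerical cross-check of this solution:

```python
p_selected_X, p_rejected_Y, p_rejected_at_least_one = 0.7, 0.5, 0.6
p_rejected_X = 1 - p_selected_X                                           # 0.3
p_rejected_both = p_rejected_X + p_rejected_Y - p_rejected_at_least_one   # 0.2
print(round(1 - p_rejected_both, 2))                                      # 0.8
```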
5. A person is known to hit a target in 5 out of 8 shots, whereas another person is known to hit in 3 out of 5
shots. Find the probability that the target is hit at all when they both try.
Solution: Event A = First person hits the target and
Event B = Another person hits the target.
P (A) = 5/8 and P(B) = 3/5
P(AC) = 1-5/8 = 3/8 and P(BC) = 1-3/5= 2/5
Event X = the target is hit when they both try, i.e.,
when at least one of them hits the target.
P (XC) = P (the target is not hit at all)
= P(Ac ∩ Bc) = P(Ac) × P(Bc) = 3/8 × 2/5 = 3/20 (assuming the two shots are independent)
Hence P(X) = 1-P(XC) = 1-3/20 = 17/20
6. The probability that a man will be alive in 25 years is 3/5, and the probability that his wife will be alive in 25 years is 2/3. Find the probability that:
(i) both will be alive, (ii) at least one of them will be alive.
Solution:
P (M) = 3/5 and P(W) = 2/3
P(MC) = 1 – 3/5 = 2/5 and P(WC) = 1 – 2/3 = 1/3.
The probability that both will be alive
= P(M) × P(W) = 3/5 × 2/3 = 2/5.
Probability that at least one of them will be alive is given by
P(M ∪ W) = P(M) + P(W) – P(M ∩ W)
= 3/5+2/3-6/15 = 13/15.
7. Given the data in the previous problem, find the probability that (i) only the wife will be alive, (ii) only the man will be alive.
Solution:
Probability that only wife will be alive.
= Probability that the wife will be alive but not the man
= P(W) × P(MC) = 2/3×2/5 = 4/15
Probability that only man will be alive
= Probability that the man will be alive but not the wife
= P(M) × P(WC) = 3/5×1/3 = 1/5.
8. A random variable X has the following probability distribution:
Value of X 0 1 2 3
P [ X =x] 1/3 1/2 0 1/6
Find E[{X – E(X)}²].
Solution: E(X) = 0×1/3 + 1×1/2 + 2×0 + 3×1/6 = 1,
E(X²) = 0×1/3 + 1×1/2 + 4×0 + 9×1/6 = 2
E[X – E(X)]² = E(X²) – [E(X)]² = 2 – 1 = 1.
9. Given the data in the previous problem, find Var(Y), where Y = 2X – 1.
Solution:
E (Y) = E(2X-1) = 2E(X) – 1 = 1
E(Y²) = E(2X – 1)² = 4E(X²) – 4E(X) + 1 = 4×2 – 4×1 + 1 = 5
Var(Y) = E(Y²) – [E(Y)]² = 5 – 1 = 4
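A direct numerical cross-check of Problems 8 and 9 (illustrative only):

```python
from fractions import Fraction as F

xs = [0, 1, 2, 3]
ps = [F(1, 3), F(1, 2), F(0), F(1, 6)]

EX    = sum(x * p for x, p in zip(xs, ps))                 # 1
EX2   = sum(x * x * p for x, p in zip(xs, ps))             # 2
var_X = EX2 - EX ** 2                                      # 1

EY    = sum((2 * x - 1) * p for x, p in zip(xs, ps))       # 1
EY2   = sum((2 * x - 1) ** 2 * p for x, p in zip(xs, ps))  # 5
var_Y = EY2 - EY ** 2                                      # 4
print(var_X, var_Y)                                        # 1 4
```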
10. The daily demand for transistors has the following probability distribution. Find the expected demand.
Demand 1 2 3 4 5 6
Probability 0.10 0.15 0.20 0.25 0.18 0.12
Solution:
E(X) = 1×0.10+2×0.15+3×0.20+4×0.25+5×0.18+6×0.12 = 3.62
Given the data in the previous problem, obtain the variance of the demand.
Solution:
E(X²) = 1×0.10 + 4×0.15 + 9×0.20 + 16×0.25 + 25×0.18 + 36×0.12 = 15.32
Var(X) = 15.32 – (3.62)² = 2.22
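A quick cross-check of the expected demand and its variance (illustrative only):

```python
demand = [1, 2, 3, 4, 5, 6]
prob   = [0.10, 0.15, 0.20, 0.25, 0.18, 0.12]

EX  = sum(d * p for d, p in zip(demand, prob))       # 3.62
EX2 = sum(d * d * p for d, p in zip(demand, prob))   # 15.32
print(round(EX, 2), round(EX2 - EX ** 2, 2))         # 3.62 2.22
```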