
JEE Advanced Revision Notes

Maths
Probability

Probability:
● The theory of probability is a branch of mathematics that deals with
uncertain or unpredictable events. Probability is a concept that gives a
numerical measurement for the likelihood of occurrence of an event.
● An act that gives some result is called an experiment.
● A possible result of an experiment is called an outcome; each outcome is also called a sample point of the experiment.
● An experiment repeated under essentially homogeneous and similar
conditions may result in an outcome that is either unique or not unique but
one of the several possible outcomes.

An experiment is called a random experiment if it satisfies the following two conditions:
1. It has more than one possible outcome.
2. It is not possible to predict the outcome in advance.
An experiment that results in a unique outcome is called a deterministic experiment.
● Sample space is the set consisting of all the outcomes, and its cardinality is given by n(S).
● Any subset E of a sample space for an experiment is called an event.
● The empty set ∅ and the sample space S describe events. ∅ is called the impossible event, and S, the whole sample space, is called the sure event.
● Whenever an outcome satisfies the conditions given in the event, we say that the event has occurred.
● If an event E has only one sample point of a sample space, it is called a simple event. In the experiment of tossing a coin, the sample space is {H, T}, and the event of getting an {H} or a {T} is simple.

Compound event: A subset of the sample space with more than one element is called a compound event. In throwing a die, the appearance of an odd number is a compound event because E = {1, 3, 5} has 3 sample points or elements.

Class XI Mathematics www.vedantu.com 1


Events are said to be equally likely if we have no reason to believe that one is
more likely to occur than the other. The outcomes of an unbiased coin are equally
likely.
Probability of an event E is the ratio of the number of elements in the event to the number of elements in the sample space:
1. P(E) = n(E) / n(S)
2. 0 ≤ P(E) ≤ 1
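
The classical formula above can be checked with a short Python sketch (illustrative only, not part of the original notes); the die event is the same one used later for compound events:

```python
from fractions import Fraction

# Sample space for one roll of a fair die
S = {1, 2, 3, 4, 5, 6}
# Event E: an odd number appears
E = {1, 3, 5}

# Classical definition: P(E) = n(E) / n(S)
P_E = Fraction(len(E), len(S))
print(P_E)  # 1/2
```

Using Fraction keeps the probability exact instead of a rounded float.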

Independent Events: Two or more events are said to be independent if the


occurrence or non-occurrence of any of them does not affect the probability of
occurrence or non-occurrence of the other event.
The complement of an event A is the set of all outcomes which are not in A. It is denoted by A′.
Events A and B are mutually exclusive if and only if they have no elements in
common.

Image: Mutually exclusive


P(A or B) = P(A) + P(B)
P(A and B) = 0
When every possible outcome of an experiment is considered, the events are
called exhaustive events.

Image: Exhaustive event

Events E1, E2, …, En are mutually exclusive and exhaustive if
E1 ∪ E2 ∪ … ∪ En = S and Ei ∩ Ej = ∅, for every distinct pair of events.
When the sets A and B are two events associated with a sample space, then A ∪ B is the event "either A or B or both".
Therefore event A or B = A ∪ B = {ω : ω ∈ A or ω ∈ B}

If A and B are events, then the event "A and B" is defined as the set of all the outcomes favourable to both A and B, that is, A and B is the event A ∩ B. This is represented diagrammatically as follows:

Image: A intersection B
If A and B are events, then the event A − B is defined to be the set of all outcomes which are favourable to A but not to B. A − B = A ∩ B′ = {x : x ∈ A and x ∉ B}.
This is represented diagrammatically as:

Image: A but not B


If S is the sample space of an experiment with n equally likely outcomes, S = {w1, w2, …, wn}, then
P(w1) = P(w2) = … = P(wn), and since
P(w1) + P(w2) + … + P(wn) = 1,
each P(wi) = 1/n.
Let S be the sample space of a random experiment. The probability P is a real-valued function with domain the power set of S and range the interval [0, 1], satisfying the axioms:
1. For any event E, P(E) ≥ 0.
2. P(S) = 1
3. With each sample point wi a number P(wi) is associated such that 0 ≤ P(wi) ≤ 1.

The addition theorem of probability: If A and B are any two events, then the probability of occurrence of at least one of the events A and B is given by
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
(a) If A and B are mutually exclusive events, then P(A ∪ B) = P(A) + P(B).

Addition theorem for three events:
P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(B ∩ C) − P(A ∩ C) + P(A ∩ B ∩ C)
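
The three-event addition theorem can be verified by brute-force counting on a small sample space; a minimal Python sketch (the specific events chosen here are illustrative assumptions):

```python
from fractions import Fraction

S = set(range(1, 7))          # one roll of a fair die
P = lambda A: Fraction(len(A), len(S))

A = {1, 2, 3}                 # "at most 3"
B = {2, 4, 6}                 # "even"
C = {3, 6}                    # "multiple of 3"

lhs = P(A | B | C)
rhs = (P(A) + P(B) + P(C)
       - P(A & B) - P(B & C) - P(A & C)
       + P(A & B & C))
print(lhs, lhs == rhs)  # 5/6 True
```
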

If E is any event and E′ is the complement of event E, then
P(E′) = 1 − P(E)

Probability of the difference of events: Let A and B be events.
Then P(A − B) = P(A) − P(A ∩ B)
Addition theorem in terms of difference of events:
P(A ∪ B) = P(A − B) + P(B − A) + P(A ∩ B)

The conditional probability:
If E and F are two events associated with the same sample space of a random experiment, the conditional probability of the event E, given the occurrence of the event F, is
P(E | F) = (number of elementary events favourable to E ∩ F) / (number of elementary events favourable to F)
= n(E ∩ F) / n(F)
Dividing numerator and denominator by n(S),
P(E | F) = [n(E ∩ F) / n(S)] / [n(F) / n(S)] = P(E ∩ F) / P(F)
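
Both forms of the conditional probability (counting and ratio of probabilities) give the same value, as a short Python check shows (the events here are illustrative assumptions):

```python
from fractions import Fraction

S = set(range(1, 7))  # one roll of a fair die
E = {2, 4, 6}         # "even number"
F = {4, 5, 6}         # "greater than 3"

# Counting form: n(E ∩ F) / n(F)
p_count = Fraction(len(E & F), len(F))
# Ratio form: P(E ∩ F) / P(F)
p_ratio = Fraction(len(E & F), len(S)) / Fraction(len(F), len(S))
print(p_count, p_count == p_ratio)  # 2/3 True
```
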

Properties of conditional probability:
Let E and F be events of a sample space S of an experiment. Then we have:
1) P(S | F) = P(F | F) = 1
2) 0 ≤ P(E | F) ≤ 1
3) P(E′ | F) = 1 − P(E | F)
4) P((E ∪ F) | G) = P(E | G) + P(F | G) − P((E ∩ F) | G)

Multiplication theorem on probability:

P(E ∩ F) = P(E) · P(F | E), P(E) ≠ 0
P(E ∩ F) = P(F) · P(E | F), P(F) ≠ 0

If E and F are independent, then:

P(E ∩ F) = P(E) · P(F)
P(E | F) = P(E), P(F) ≠ 0
P(F | E) = P(F), P(E) ≠ 0
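
Independence can be verified exhaustively on the two-dice sample space; a minimal Python sketch (the two events are illustrative assumptions):

```python
from fractions import Fraction
from itertools import product

S = list(product(range(1, 7), repeat=2))   # ordered outcomes of two fair dice
P = lambda pred: Fraction(sum(pred(w) for w in S), len(S))

E = lambda w: w[0] == 6          # first die shows 6
F = lambda w: w[1] % 2 == 0      # second die is even

both = P(lambda w: E(w) and F(w))
print(both, both == P(E) * P(F))  # 1/12 True (E and F are independent)
```
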

Theorem of total probability:

Let E1, E2, …, En be a partition of the sample space S such that each event Ei has non-zero probability, and let A be any event associated with S. Then
P(A) = P(E1) P(A | E1) + P(E2) P(A | E2) + … + P(En) P(A | En)



Bayes' theorem:
If E1, E2, …, En are events which constitute a partition of S, that is, E1, E2, …, En are pairwise disjoint and E1 ∪ E2 ∪ … ∪ En = S, and A is any event with non-zero probability, then
P(Ei | A) = P(Ei) P(A | Ei) / Σj P(Ej) P(A | Ej), the sum running over j = 1, …, n.
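
Bayes' theorem translates directly into code; a minimal sketch, where the function name `bayes` and the two-urn numbers are illustrative assumptions:

```python
from fractions import Fraction

def bayes(priors, likelihoods, i):
    """P(E_i | A) for a partition E_1..E_n, given P(E_j) and P(A | E_j)."""
    total = sum(p * l for p, l in zip(priors, likelihoods))  # P(A), by total probability
    return priors[i] * likelihoods[i] / total

# Hypothetical setup: urn 1 (prior 1/2) has 3 red balls out of 5,
# urn 2 (prior 1/2) has 1 red ball out of 5; a red ball was drawn.
priors = [Fraction(1, 2), Fraction(1, 2)]
likelihoods = [Fraction(3, 5), Fraction(1, 5)]
print(bayes(priors, likelihoods, 0))  # 3/4
```
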
A random variable is a real-valued function whose domain is the sample space of a random experiment.
The probability distribution of a random variable X is the system of numbers
X:    x1   x2   …   xn
P(X): p1   p2   …   pn
where pi > 0 for i = 1, 2, …, n, and p1 + p2 + … + pn = 1.

Let X be a random variable whose possible values x1, x2, …, xn occur with probabilities p1, p2, …, pn respectively. The mean of X, denoted by μ, is the number
μ = x1 p1 + x2 p2 + … + xn pn.

The mean of a random variable X is also called the expectation of X, denoted by E(X).
Let X be a random variable whose possible values x1, x2, …, xn occur with probabilities p(x1), p(x2), …, p(xn). Let μ = E(X) be the mean of X. The variance of X, denoted by Var(X) or σ², is defined as
σ² = Var(X) = Σi (xi − μ)² p(xi), or equivalently
σ² = E(X − μ)²
The non-negative number
σ = √Var(X) = √[Σi (xi − μ)² p(xi)]
is called the standard deviation of the random variable X.
Var(X) = E(X²) − [E(X)]²



Trials of a random experiment are called Bernoulli trials if they satisfy the following conditions:
● There should be a finite number of trials.
● The trials should be independent.
● Each trial has exactly two outcomes: success or failure.
● The probability of success remains the same in each trial.
For the binomial distribution B(n, p), P(X = x) = nCx q^(n−x) p^x, x = 0, 1, 2, …, n, where q = 1 − p.
Mean = np, Variance = npq, Standard deviation = √(npq)
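
The binomial formula and its mean and variance can be checked numerically; a minimal Python sketch (the helper name `binom_pmf` and the choice n = 4, p = 1/2 are illustrative assumptions):

```python
from fractions import Fraction
from math import comb

def binom_pmf(n, p, x):
    """P(X = x) = C(n, x) * q^(n-x) * p^x for X ~ B(n, p)."""
    q = 1 - p
    return comb(n, x) * q ** (n - x) * p ** x

n, p = 4, Fraction(1, 2)
pmf = [binom_pmf(n, p, x) for x in range(n + 1)]
mean = sum(x * px for x, px in enumerate(pmf))
var = sum(x * x * px for x, px in enumerate(pmf)) - mean ** 2
print(sum(pmf) == 1, mean == n * p, var == n * p * (1 - p))  # True True True
```
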

Example Problems:
1. An insurance company insured 2000 scooter drivers, 4000 car drivers, and
6000 truck drivers. The probability of an accident involving a scooter, a car,
and a truck is 0.01, 0.03, and 0.15, respectively. If a driver meets an accident,
what is the chance that the person is a scooter driver? What is the
importance of insurance in everybody’s life?
Ans: Let A be the event that the insured person meets with an accident, and let E1, E2 and E3 be the events that the person is a scooter, car and truck driver respectively. Then we have to find P(E1 | A).
Total number of insured persons = 2000 + 4000 + 6000 = 12000.
P(E1) = 2000/12000 = 1/6
P(E2) = 4000/12000 = 1/3
P(E3) = 6000/12000 = 1/2
Also P(A | E1) = 0.01, P(A | E2) = 0.03, P(A | E3) = 0.15.
Hence, by Bayes' theorem we have
P(E1 | A) = P(A | E1) P(E1) / [P(A | E1) P(E1) + P(A | E2) P(E2) + P(A | E3) P(E3)]
= (0.01 × 1/6) / (0.01 × 1/6 + 0.03 × 1/3 + 0.15 × 1/2)
= 1/52
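
The arithmetic in this answer can be reproduced with exact fractions; a short Python sketch of the same Bayes computation:

```python
from fractions import Fraction

# Priors: proportion of scooter, car and truck drivers among the insured
priors = [Fraction(2000, 12000), Fraction(4000, 12000), Fraction(6000, 12000)]
# Accident probabilities 0.01, 0.03, 0.15 as exact fractions
acc = [Fraction(1, 100), Fraction(3, 100), Fraction(15, 100)]

total = sum(p * a for p, a in zip(priors, acc))   # P(accident)
p_scooter = priors[0] * acc[0] / total            # P(E1 | A)
print(p_scooter)  # 1/52
```
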

2. A card from a pack of 52 cards is lost. From the remaining cards of the
pack, two cards are drawn and are found to be both diamonds. Find the
probability of the lost card being a diamond. Is it better not to tell anyone
about the loss of the card while playing?
Ans: Let E1, E2 and A be the events defined as follows:
E1 = the missing card is a diamond
E2 = the missing card is not a diamond
A = the two drawn cards are both diamonds
Now P(E1) = 13/52 = 1/4
P(E2) = 39/52 = 3/4
P(A | E1) = probability of drawing two diamond cards when one diamond card is missing
= C(12, 2) / C(51, 2) = (12 × 11) / (51 × 50)
Similarly, P(A | E2) = C(13, 2) / C(51, 2) = (13 × 12) / (51 × 50)

By Bayes' theorem,
P(E1 | A) = P(E1) P(A | E1) / [P(E1) P(A | E1) + P(E2) P(A | E2)]
= (1/4 × 12/51 × 11/50) / (1/4 × 12/51 × 11/50 + 3/4 × 13/51 × 12/50)
= (1 × 12 × 11) / (1 × 12 × 11 + 3 × 13 × 12) = 11 / (11 + 39) = 11/50
No; any game we play should be played honestly.
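
This answer, too, can be checked with exact fractions; a short Python sketch of the same computation:

```python
from fractions import Fraction
from math import comb

p_d = Fraction(13, 52)        # P(lost card is a diamond)
p_nd = Fraction(39, 52)       # P(lost card is not a diamond)

# Two diamonds drawn from the remaining 51 cards
like_d = Fraction(comb(12, 2), comb(51, 2))    # only 12 diamonds left
like_nd = Fraction(comb(13, 2), comb(51, 2))   # all 13 diamonds left

posterior = p_d * like_d / (p_d * like_d + p_nd * like_nd)
print(posterior)  # 11/50
```
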

3. A man is known to speak the truth 3 out of 4 times. He throws a die and
reports that it is a six. Find the probability that it is a six, and write at least
one drawback of telling a lie.
Ans: Let S1 be the event that six occurs and S2 be the event that six does not occur. Let E be the event that the man reports a six.
P(S1) = probability that six occurs = 1/6
P(S2) = probability that six does not occur = 5/6
P(E | S1) = probability that the man reports a six when six has actually occurred
= probability that he speaks the truth = 3/4
P(E | S2) = probability that the man reports a six when six has not actually occurred
= probability that he does not speak the truth = 1 − 3/4 = 1/4
Thus by Bayes' theorem, we get
P(S1 | E) = probability that the report of the man that six has occurred is actually a six
= P(S1) P(E | S1) / [P(S1) P(E | S1) + P(S2) P(E | S2)]
= (1/6 × 3/4) / (1/6 × 3/4 + 5/6 × 1/4) = (3/24) / (3/24 + 5/24) = 3/8
Hence, the required probability is 3/8.
One drawback of telling a lie is that it makes people lose their trust in us.
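
As before, the computation can be verified with exact fractions; a short Python sketch:

```python
from fractions import Fraction

p_six = Fraction(1, 6)                 # P(S1): six occurs
p_not_six = Fraction(5, 6)             # P(S2): six does not occur
p_report_given_six = Fraction(3, 4)    # he speaks the truth
p_report_given_not = Fraction(1, 4)    # he lies

num = p_six * p_report_given_six
posterior = num / (num + p_not_six * p_report_given_not)
print(posterior)  # 3/8
```
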
