PROBABILITY
1. Basic Probability Principles
• Experiment: A procedure that produces outcomes.
• Sample Space (S): The set of all possible outcomes of an experiment.
• Event: A specific outcome or set of outcomes.
• Probability of an Event (P(E)): The likelihood that event E occurs. It is calculated as:
P(E) = (Number of favorable outcomes) / (Total number of outcomes in the sample space),
where the total probability of all possible outcomes equals 1.
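For example, the probability of rolling an even number on a fair six-sided die is 3/6 = 0.5. A
minimal Python sketch of the counting (the die and the event are illustrative choices):

    # Probability of rolling an even number on a fair six-sided die.
    sample_space = {1, 2, 3, 4, 5, 6}                 # all possible outcomes
    event = {x for x in sample_space if x % 2 == 0}   # favorable outcomes: {2, 4, 6}
    p_event = len(event) / len(sample_space)          # P(E) = 3/6 = 0.5
    print(p_event)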
Key Rules:
• Complement Rule: The probability that event E does not occur is: P(E′)=1−P(E)
• Addition Rule (for mutually exclusive events): If events A and B cannot happen at the same
time, then: P(A∪B)=P(A)+P(B)
• Addition Rule (for non-mutually exclusive events): If events A and B can happen at the
same time, then: P(A∪B)=P(A)+P(B)−P(A∩B)
• Multiplication Rule (for independent events): If two events A and B are independent, then:
P(A∩B)=P(A)×P(B). If the events are dependent, then: P(A∩B)=P(A)×P(B∣A), where P(B∣A)
is the conditional probability of B given A. A short numerical sketch of these rules follows.
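A minimal Python sketch of these rules, assuming one roll of a fair six-sided die with
A = "roll is even" and B = "roll is at least 5" (illustrative events, not from the original notes):

    from fractions import Fraction

    # One roll of a fair die: A = "roll is even", B = "roll is at least 5".
    p_a = Fraction(3, 6)                  # A = {2, 4, 6}
    p_b = Fraction(2, 6)                  # B = {5, 6}
    p_a_and_b = Fraction(1, 6)            # A∩B = {6}

    p_not_a = 1 - p_a                     # complement rule: 1/2
    p_a_or_b = p_a + p_b - p_a_and_b      # addition rule (non-mutually exclusive): 2/3

    # Two independent rolls: P(even on roll 1 and even on roll 2).
    p_even_twice = p_a * p_a              # multiplication rule (independent events): 1/4

    print(p_not_a, p_a_or_b, p_even_twice)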
2. Types of Events
• Independent Events: The occurrence of one event does not affect the occurrence of another.
• Dependent Events: The occurrence of one event affects the probability of the other.
• Mutually Exclusive Events: Events that cannot occur at the same time (e.g., a single roll of a
die cannot show both a 3 and a 5).
• Non-Mutually Exclusive Events: Events that can occur at the same time (e.g., a card drawn
from a deck can be both red and a king).
3. Conditional Probability
• Conditional Probability (P(A | B)): The probability that event A occurs given that event B
has occurred is calculated as: P(A∣B) = P(A∩B) / P(B), provided P(B) ≠ 0.
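A short worked example in Python, assuming A = "roll is a 6" and B = "roll is even" on a fair
six-sided die (illustrative choices):

    from fractions import Fraction

    p_b = Fraction(3, 6)            # B = {2, 4, 6}
    p_a_and_b = Fraction(1, 6)      # A∩B = {6}

    p_a_given_b = p_a_and_b / p_b   # P(A|B) = (1/6) / (1/2) = 1/3
    print(p_a_given_b)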
4. Permutations and Combinations
• Permutation: An arrangement of objects in a specific order. The formula is: P(n,r) = n! / (n−r)!,
where n is the total number of objects and r is the number of objects to arrange.
• Combination: A selection of objects where order does not matter. The formula is:
C(n,r) = n! / (r!(n−r)!), where n is the total number of objects and r is the number of objects selected.
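Both formulas are available in Python's standard library; a quick check with n = 5 and r = 3
(values chosen for illustration):

    import math

    n, r = 5, 3
    # Permutations: n! / (n - r)! = 5!/2! = 60 ordered arrangements.
    print(math.perm(n, r), math.factorial(n) // math.factorial(n - r))
    # Combinations: n! / (r!(n - r)!) = 5!/(3!*2!) = 10 unordered selections.
    print(math.comb(n, r), math.factorial(n) // (math.factorial(r) * math.factorial(n - r)))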
5. The Binomial Distribution
• A probability distribution used when there are exactly two mutually exclusive outcomes of a
trial (success/failure). The binomial probability formula is: P(X = k) = C(n, k)·p^k·(1−p)^(n−k), where:
• n is the number of trials,
• k is the number of successes,
• p is the probability of success on a single trial,
• C(n, k) is the number of combinations of n items taken k at a time.
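A minimal sketch of the formula, assuming n = 10 fair coin flips with p = 0.5 (values chosen for
illustration):

    import math

    def binomial_pmf(k, n, p):
        # P(X = k) = C(n, k) * p^k * (1 - p)^(n - k)
        return math.comb(n, k) * p**k * (1 - p)**(n - k)

    # Probability of exactly 4 heads in 10 fair coin flips: about 0.205.
    print(binomial_pmf(4, 10, 0.5))
    # The probabilities over k = 0..n sum to 1.
    print(sum(binomial_pmf(k, 10, 0.5) for k in range(11)))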
6. Expected Value
• The expected value of a random variable is the long-term average or mean of the outcomes
of a random experiment. For a discrete random variable, E(X)=∑[x⋅P(x)], where x is each
possible outcome and P(x) is the probability of that outcome.
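For example, the expected value of one roll of a fair six-sided die is (1+2+...+6)·(1/6) = 3.5; a
short sketch:

    from fractions import Fraction

    # One roll of a fair six-sided die: each face has probability 1/6.
    distribution = {x: Fraction(1, 6) for x in range(1, 7)}

    expected_value = sum(x * p for x, p in distribution.items())
    print(expected_value)   # 7/2 = 3.5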
7. Random Variables
• Discrete Random Variables: Can take on only specific values, often integers (e.g., number
of heads in coin tosses).
• Continuous Random Variables: Can take any value within a range (e.g., the height of
individuals).
8. Geometric and Negative Binomial Distributions
• Geometric Distribution: The probability distribution of the number of trials needed to obtain
the first success in a series of independent and identically distributed Bernoulli trials; see the
short sketch after this list.
• Negative Binomial Distribution: Generalization of the geometric distribution, counting the
number of trials needed for a fixed number of successes.
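Under the "number of trials until the first success" convention, the geometric probability is
P(X = k) = (1−p)^(k−1)⋅p. A minimal sketch with p = 1/6, i.e. waiting for the first 6 on a fair
die (an illustrative choice):

    def geometric_pmf(k, p):
        # P(X = k): the first success occurs on trial k (k = 1, 2, 3, ...).
        return (1 - p) ** (k - 1) * p

    # Probability that the first 6 appears on the third roll of a fair die.
    print(geometric_pmf(3, 1 / 6))   # (5/6)^2 * (1/6) ≈ 0.116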
9. Probability Distributions
• Discrete Probability Distribution: The probability of each outcome in a finite sample
space is specified.
• Continuous Probability Distribution: The probability of any specific outcome is zero, but
probabilities are assigned to ranges of outcomes.
10. Key Theorems
• Law of Total Probability: Used to calculate the probability of an event from conditional
probabilities over several mutually exclusive cases: P(B) = ∑[P(B∣Aᵢ)⋅P(Aᵢ)], where the events
Aᵢ partition the sample space; a combined sketch with Bayes' theorem follows below.
• Bayes' Theorem: A method to update the probability of an event based on new information:
P(A∣B) = P(B∣A)⋅P(A) / P(B), where P(A∣B) is the probability of A given B, P(B∣A) is the
probability of B given A, and P(A) and P(B) are the probabilities of A and B, respectively.
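A minimal sketch combining both theorems, assuming a diagnostic test with illustrative numbers
(1% prevalence, 95% sensitivity, 10% false-positive rate); none of these values come from the
notes above:

    # A = "has the condition", B = "test is positive".
    p_a = 0.01              # prior P(A)
    p_b_given_a = 0.95      # P(B|A), sensitivity
    p_b_given_not_a = 0.10  # P(B|A'), false-positive rate

    # Law of total probability: P(B) = P(B|A)P(A) + P(B|A')P(A').
    p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

    # Bayes' theorem: P(A|B) = P(B|A)P(A) / P(B).
    p_a_given_b = p_b_given_a * p_a / p_b
    print(round(p_a_given_b, 4))   # ≈ 0.0876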
11. Applications of Probability
• Real-life Applications: Understanding probability helps with decision-making, risk
assessment, statistical inference, games of chance (cards, dice), insurance, and reliability
analysis.
• Simulations: Using random variables to model real-world situations, such as in computer
simulations (Monte Carlo methods).
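As a small Monte Carlo sketch, the probability that two fair dice sum to 7 (exactly 6/36 ≈ 0.167)
can be estimated by simulation:

    import random

    def estimate_p_sum_is_7(trials=100_000):
        # Monte Carlo estimate of P(sum of two fair dice == 7).
        hits = sum(
            1
            for _ in range(trials)
            if random.randint(1, 6) + random.randint(1, 6) == 7
        )
        return hits / trials

    print(estimate_p_sum_is_7())   # close to 1/6 ≈ 0.167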