Probability

This document provides an overview of probability distributions, including definitions, rules, and types of probability. Key concepts such as random experiments, sample space, events, and various probability rules (addition, multiplication, complement, conditional, and Bayes' theorem) are discussed. Additionally, it covers the characteristics of random variables, expected value, variance, and important theorems like the Law of Large Numbers and Central Limit Theorem.

MODULE-3

PROBABILITY DISTRIBUTION

Probability.
Probability expresses knowledge or belief about whether an event has occurred
or will occur. Probability theory gives this concept an exact mathematical
meaning, which is why it is used so widely in fields such as mathematics,
statistics, finance, gambling, science, and philosophy: it lets us draw
conclusions about the likelihood of potential events and about the hidden
workings of complex systems. In short, probability measures the possibility of
the outcome of a random event, that is, the extent to which the event is
likely to happen.
Rules of probability.
1. Addition Rule
The addition rule is used to find the probability that either of two events
occurs. There are two versions of the addition rule: for mutually exclusive
events and non-mutually exclusive events.
a. Addition Rule for Mutually Exclusive Events
When two events are mutually exclusive (i.e., they cannot occur at the same
time), the probability of either event happening is the sum of their individual
probabilities.
b. Addition Rule for Non-Mutually Exclusive Events
When two events are not mutually exclusive (i.e., they can both occur at the
same time), we need to subtract the probability of both events occurring
together to avoid double-counting.
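As a small Python sketch of both versions (using a fair six-sided die as the example, which is an illustration added here, not from the original notes):

```python
# Addition rule, illustrated with a single fair six-sided die.
# Mutually exclusive: rolling a 1 and rolling a 2 cannot happen together.
p_1 = 1 / 6
p_2 = 1 / 6
p_1_or_2 = p_1 + p_2  # P(A or B) = P(A) + P(B)

# Non-mutually exclusive: "even" and "greater than 3" overlap on {4, 6}.
p_even = 3 / 6          # {2, 4, 6}
p_gt3 = 3 / 6           # {4, 5, 6}
p_even_and_gt3 = 2 / 6  # {4, 6}
p_even_or_gt3 = p_even + p_gt3 - p_even_and_gt3  # subtract the overlap

print(p_1_or_2)       # ≈ 0.333
print(p_even_or_gt3)  # ≈ 0.667
```

Without the subtraction, the overlapping outcomes {4, 6} would be counted twice.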
2. Multiplication Rule
The multiplication rule is used to find the probability that two events occur
together (i.e., the probability of both events happening at the same time).
a. Multiplication Rule for Independent Events
When two events are independent (i.e., the outcome of one does not affect the
outcome of the other), the probability of both events occurring is the product
of their individual probabilities.
b. Multiplication Rule for Dependent Events
When two events are dependent (i.e., the outcome of one affects the outcome
of the other), the probability of both events occurring is the product of the
probability of the first event and the conditional probability of the second
event given the first.
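A minimal sketch of both versions, using coin flips and a standard 52-card deck as the (added) examples:

```python
# Multiplication rule for independent events: two flips of a fair coin.
p_heads = 1 / 2
p_two_heads = p_heads * p_heads  # P(A and B) = P(A) * P(B)

# Multiplication rule for dependent events: drawing two aces from a
# 52-card deck without replacement.
p_first_ace = 4 / 52
p_second_ace_given_first = 3 / 51  # one ace and one card are gone
p_two_aces = p_first_ace * p_second_ace_given_first  # P(A) * P(B | A)

print(p_two_heads)  # 0.25
print(p_two_aces)   # ≈ 0.00452
```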
3. Complement Rule
The complement rule deals with the probability of the complementary event
(i.e., the event that something does not happen). The sum of the probabilities
of an event and its complement is always 1.
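The complement rule is often the easiest route to "at least one" probabilities. A small sketch (the four-roll die example is an added illustration):

```python
# Complement rule: P(at least one six in four rolls) = 1 - P(no sixes).
p_no_six_in_4 = (5 / 6) ** 4       # each roll independently misses the six
p_at_least_one_six = 1 - p_no_six_in_4

print(round(p_at_least_one_six, 4))  # 0.5177
```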
4. Conditional Probability Rule
The conditional probability rule is used to calculate the probability of an event
given that another event has occurred. This rule is used when events are
dependent.
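Conditional probability can be computed by simple counting when all outcomes are equally likely. A sketch with two fair dice (an added example):

```python
# P(A | B) = P(A and B) / P(B), computed by counting equally likely outcomes.
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))   # 36 equally likely pairs
b = [o for o in outcomes if o[0] + o[1] >= 10]    # event B: sum is at least 10
a_and_b = [o for o in b if o[0] == 6]             # A and B: first die shows 6

p_a_given_b = len(a_and_b) / len(b)
print(p_a_given_b)  # 0.5: of the 6 pairs summing to 10+, 3 start with a 6
```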
5. Bayes’ Theorem
Bayes’ Theorem is a powerful rule that allows you to update the probability of
an event based on new information or evidence.
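A sketch of Bayes' theorem for a screening-test scenario; all the numbers here are hypothetical, chosen only to illustrate the update:

```python
# Bayes' theorem: P(D | +) = P(+ | D) * P(D) / P(+).
p_disease = 0.01            # prior probability of having the disease
p_pos_given_disease = 0.95  # test sensitivity (hypothetical)
p_pos_given_healthy = 0.05  # false-positive rate (hypothetical)

# Denominator: total probability of testing positive.
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # ≈ 0.161
```

Even with a fairly accurate test, the low prior keeps the updated probability well below 1, which is the point of the theorem.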
6. Total Probability Rule
This rule allows you to calculate the probability of an event by considering all
possible ways the event can happen, partitioning the sample space.
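A sketch of the total probability rule; the factories and defect rates below are hypothetical, used only to show the partition:

```python
# Total probability rule: P(defect) = sum over factories f of
# P(factory = f) * P(defect | factory = f).
p_factory = {"A": 0.5, "B": 0.3, "C": 0.2}         # partition of the sample space
p_defect_given = {"A": 0.01, "B": 0.02, "C": 0.05} # hypothetical defect rates

p_defect = sum(p_factory[f] * p_defect_given[f] for f in p_factory)
print(round(p_defect, 3))  # 0.021
```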
Types of probability.
1. Classical (Theoretical) Probability
This type of probability is based on the assumption that all outcomes in a
sample space are equally likely to occur. It is used when we have complete
knowledge about the system and the possible outcomes.
2. Empirical (Experimental) Probability
Empirical probability is based on observed data or experiments. It is
calculated as the ratio of the number of times an event occurs to the total
number of trials or observations. This type of probability is used when we
don’t know all possible outcomes but can estimate the probability through
experiments or observations.
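Empirical probability can be sketched with a quick simulation (an added illustration; the seed is arbitrary):

```python
# Empirical probability: estimate P(heads) as (observed heads) / (trials).
import random

random.seed(42)
trials = 10_000
heads = sum(random.random() < 0.5 for _ in range(trials))

empirical_p = heads / trials
print(empirical_p)  # close to the theoretical 0.5, but not exactly equal
```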
3. Subjective Probability
Subjective probability is based on personal judgment, intuition, or experience
rather than data or mathematical reasoning. It represents an individual's
belief about how likely an event is to occur.
4. Conditional Probability
Conditional probability refers to the probability of an event occurring given
that another event has already occurred. It is important in scenarios where
events are dependent on each other.
5. Joint Probability
Joint probability is the probability of two or more events happening
simultaneously. It is used when you want to find the likelihood of the
occurrence of two events together.
6. Marginal Probability
Marginal probability refers to the probability of a single event occurring,
without considering any other events. It is the probability of an event when
you "marginalize" over the other variables or conditions.
7. Independent and Dependent Probability
• Independent Probability: Events are independent if the occurrence of
one event does not affect the probability of the other event. For example,
flipping a coin and rolling a die are independent events.
• Dependent Probability: Events are dependent if the occurrence of one
event affects the probability of the other. For example, drawing two
cards from a deck without replacement is a dependent event.
8. Bayesian Probability
Bayesian probability involves updating the probability of an event as new
evidence or information becomes available. It is based on Bayes' Theorem,
which relates the conditional and marginal probabilities of random events.
9. Cumulative Probability
Cumulative probability is the probability that a random variable takes a value
less than or equal to a specific value. It is the sum of individual probabilities
of all possible outcomes up to a given point.
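For a discrete variable, the cumulative probability is just a running sum of the individual probabilities. A sketch with a fair die (an added example):

```python
# Cumulative probability (CDF) of a fair six-sided die:
# P(X <= x) = sum of P(X = k) for k = 1 .. x.
pmf = {x: 1 / 6 for x in range(1, 7)}

cdf = {}
running = 0.0
for x in sorted(pmf):
    running += pmf[x]
    cdf[x] = running

print(round(cdf[3], 3))  # 0.5: 1/6 + 1/6 + 1/6
print(round(cdf[6], 3))  # 1.0: all outcomes accounted for
```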
Concepts.
1. Random experiment
A random experiment is any process or action that leads to a result. The result
of the experiment is called an outcome.
2. Sample Space
The sample space of an experiment is the set of all possible outcomes. It is
often denoted as S.
3. Event
An event is any subset of the sample space. An event may contain one or more
outcomes. An event can be described as happening or not happening.
4. Outcome
An outcome is a single possible result of an experiment. It is an element of
the sample space.
5. Probability of an Event
The probability of an event is a number between 0 and 1 that expresses the
likelihood of the event occurring. The probability of an event A is denoted
by P(A).
6. Mutually Exclusive Events
Two events are mutually exclusive if they cannot occur at the same time. If
one event happens, the other cannot.
7. Independent Events
Two events are independent if the occurrence of one event does not affect the
occurrence of the other.
8. Dependent Events
Two events are dependent if the occurrence of one event affects the probability
of the other event occurring.
9. Complementary Events
The complement of an event AAA is the event that AAA does not occur. The
probability of the complement of AAA is denoted by P(Ac)P(A^c)P(Ac) or
P(not A)P(\text{not A})P(not A), and it is calculated as:
10. Conditional Probability
Conditional probability is the probability of an event occurring given that
another event has already occurred. It is denoted by P(A | B), which is the
probability of event A occurring given that event B has occurred.
11. Joint Probability
The joint probability is the probability that two events A and B both occur.
It is denoted by P(A ∩ B), the intersection of events A and B.
12. Union of Events
The union of two events A and B represents the event that either A, B, or
both events occur. It is denoted as A ∪ B.
13. Probability Distribution
A probability distribution describes how the probabilities are distributed over
all possible outcomes of a random variable.
14. Random Variable
A random variable is a variable whose value is subject to chance. There are
two types of random variables:
• Discrete Random Variable: Takes on a finite or countably infinite number
of distinct values (e.g., the outcome of a die roll).
• Continuous Random Variable: Takes on any value within a given range or
interval (e.g., a measured height).
15. Expected Value (Mean)
The expected value (also called the mean) of a random variable is the long-
run average value it takes when the experiment is repeated many times. It is
denoted as E(X).
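For a discrete random variable, E(X) is the probability-weighted sum of the values. A sketch with a fair die (an added example):

```python
# Expected value of a discrete random variable: E(X) = sum of x * P(X = x).
# Here X is the outcome of one roll of a fair six-sided die.
pmf = {x: 1 / 6 for x in range(1, 7)}

expected = sum(x * p for x, p in pmf.items())
print(round(expected, 3))  # 3.5, the long-run average of a fair die
```

Note that 3.5 is not a possible outcome of any single roll; the expected value describes the long-run average, not an individual result.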
16. Variance and Standard Deviation
• Variance measures how spread out the values of a random variable are
around the expected value. It is denoted by Var(X).
• Standard Deviation is the square root of the variance and provides a
measure of how much the values of the random variable deviate from
the mean.
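Both quantities can be sketched by continuing the fair-die example (added here for illustration):

```python
# Variance and standard deviation of a fair six-sided die roll.
import math

pmf = {x: 1 / 6 for x in range(1, 7)}
mean = sum(x * p for x, p in pmf.items())                    # E(X) = 3.5

# Var(X) = E[(X - E(X))^2], the probability-weighted squared deviation.
variance = sum((x - mean) ** 2 * p for x, p in pmf.items())
std_dev = math.sqrt(variance)

print(round(variance, 4))  # 2.9167 (exactly 35/12)
print(round(std_dev, 4))   # 1.7078
```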
17. Law of Large Numbers
The Law of Large Numbers states that as the number of trials in an
experiment increases, the empirical probability (observed probability) will get
closer to the theoretical probability.
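The Law of Large Numbers can be seen in a short simulation (an added sketch; the seed and sample sizes are arbitrary):

```python
# Law of Large Numbers: the sample mean of die rolls drifts toward E(X) = 3.5
# as the number of trials grows.
import random

random.seed(0)

def sample_mean(n):
    """Average of n simulated rolls of a fair six-sided die."""
    return sum(random.randint(1, 6) for _ in range(n)) / n

for n in (100, 10_000, 100_000):
    print(n, sample_mean(n))  # deviations from 3.5 shrink as n grows
```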
18. Central Limit Theorem (CLT)
The Central Limit Theorem states that, for a large enough sample size, the
sampling distribution of the sample mean will approximate a normal
distribution, regardless of the shape of the population distribution.
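A quick simulation illustrates the CLT with a uniform (clearly non-normal) population; the seed, sample size, and number of samples below are arbitrary choices:

```python
# Central Limit Theorem: means of samples drawn from a uniform population
# cluster around the population mean, with spread ~ sigma / sqrt(n).
import random
import statistics

random.seed(1)
n = 30  # size of each sample
sample_means = [
    statistics.mean(random.uniform(0, 1) for _ in range(n))
    for _ in range(2000)
]

print(round(statistics.mean(sample_means), 3))   # ≈ 0.5, the population mean
print(round(statistics.stdev(sample_means), 3))  # ≈ 0.053, i.e. (1/sqrt(12))/sqrt(30)
```

A histogram of `sample_means` would look approximately bell-shaped even though each underlying draw is uniform.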
