Module I Complete

- The document discusses signals and spectra, including the advantages and disadvantages of digital communications, and introduces source coding, which converts analog signals to digital bits.
- It covers sampling and quantization; the sampling theorem states that a signal can be represented by its samples if the sampling frequency is at least twice the highest frequency. For a given signal, quantization to 9 levels allowed mapping to binary digits.
- Classification of signals is discussed, including deterministic vs. random signals. Probability theory is introduced for analyzing random signals, and basic probability terms and properties are defined.


Signals and spectra

Fig. 1.2 (BERNARD SKLAR)


Why Digital
• All transmission lines and circuits have a non-ideal frequency
transfer function, which distorts an ideal pulse
• Unwanted electrical noise and other interference corrupt the signal
• Digital circuits are less subject to distortion and
interference

Advantages of digital communications


• More reliable
• More flexible
• Lower cost
• As digital messages are in terms of ones and zeros, messages
can be handled in packets
• Greater immunity to noise
Disadvantages
• Synchronization is more demanding in digital than in analog communication
• When the SNR drops below a certain threshold level, signal
quality changes abruptly from good to poor
Source coding has three parts
• Analog waveform to sequence
• Quantize (sequence to symbols)
• Symbols to bits

Decoding
• Bits to symbols
• Symbols to sequence of numbers
• Sequence to waveform
Formatting and Baseband modulation
Fig 2.2 (BERNARD SKLAR)
Sampling and Quantization
Sampling theorem

• A continuous-time signal x(t) can be completely represented in its
sampled form, and recovered back from the sampled form, if the
sampling frequency fs ≥ 2W, where W is the highest frequency component (bandwidth) of the message signal.
For this signal, a quantization interval of 25 resulted in 9 possible y values:
−100, −75, −50, −25, 0, 25, 50, 75, 100. We can map these 9 values to the 4-bit binary numbers 0000–1000.
(Bernard)
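As a sketch of the mapping above, uniform quantization with a step of 25 and the 9-level-to-4-bit mapping can be written in a few lines; the sample amplitudes below are hypothetical, not taken from the figure.

```python
# Minimal sketch of uniform quantization, assuming sample amplitudes
# lie in [-100, 100]; the sample values here are illustrative only.
samples = [-100.0, -60.0, -30.0, 0.0, 20.0, 55.0, 80.0, 100.0]

STEP = 25
LEVELS = list(range(-100, 101, STEP))  # the 9 possible y values

def quantize(x, step=STEP):
    # Round x to the nearest multiple of `step`.
    return step * round(x / step)

def to_bits(level):
    # Map a quantized level (-100 ... 100) to a 4-bit codeword 0000-1000.
    return format(LEVELS.index(level), "04b")

codewords = [to_bits(quantize(s)) for s in samples]
print(codewords)
```

Each sample is first snapped to the nearest level, then the level's index (0 to 8) becomes the 4-bit codeword.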
Classification of signals

1.Continuous-time and Discrete-time signals.


2.Real and complex signals
3.Deterministic and Random Signals
4.Periodic and Non-periodic signals
5.Even and Odd signals
6.Energy and Power signals
7.Analog and Digital signals
Deterministic and Non-deterministic Signals
A signal is said to be deterministic if there is no
uncertainty with respect to its value at any instant of
time. Or, signals which can be defined exactly by a
mathematical formula are known as deterministic signals.
Power of an energy signal = 0
Energy of a power signal = ∞
• The receiver receives not only the information signal but also noise,
which is random in nature and unpredictable.
• Noise can therefore be modeled as a random signal.
• Random signals are described in terms of their statistical properties.
• Random signals are analyzed with the help of probability theory.

• Hence the study of probability is very important.


Basic terms of probability theory
• Experiment
• Sample space
• Event
A random experiment is a mechanism that produces a definite outcome that cannot be
predicted with certainty. The sample space associated with a random experiment is the set
of all possible outcomes. An event is a subset of the sample space.
Examples
• Construct a sample space for the experiment that consists of rolling a single die. Find
the events that correspond to the phrases “an even number is rolled” and “a number
greater than two is rolled.”
Solution
Sample space= {1,2,3,4,5,6}
Even number is rolled (E) = {2,4,6}
Number greater than 2 is rolled (T) = {3,4,5,6}
Find the probabilities of the events E: “an even number is rolled” and T: “a number greater than two is
rolled.”
P(E) =1/6+1/6+1/6 =3/6 =1/2
P(T) = 1/6+1/6+1/6+1/6 =4/6 =2/3

P(A)= number of possible favorable outcomes/ total number of outcomes
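The die example above can be checked by direct enumeration; a small sketch, with the event names E and T following the text:

```python
from fractions import Fraction

# Enumerate the die experiment: six equally likely outcomes.
sample_space = {1, 2, 3, 4, 5, 6}
E = {s for s in sample_space if s % 2 == 0}  # "an even number is rolled"
T = {s for s in sample_space if s > 2}       # "a number greater than two is rolled"

def prob(event):
    # P(A) = number of favorable outcomes / total number of outcomes
    return Fraction(len(event), len(sample_space))

print(prob(E), prob(T))  # 1/2 2/3
```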


Properties of probabilities
• Property 1: The probability of a certain event is unity, i.e., P(A) = 1
• Property 2: The probability of an event always lies between 0 and 1:
0 ≤ P(A) ≤ 1
• Property 3: If A and B are two mutually exclusive events, then
P(A+B) = P(A) + P(B)
• Property 4: If A is an event, then the probability of A not happening is
P(Ā) = 1 − P(A)
where Ā represents the complement of A
• If A and B are any two events (not mutually exclusive), then
P(A+B) = P(A) + P(B) − P(AB)
where P(AB) is the probability of events A and B both occurring
simultaneously; such an event is known as the joint event of A and B, and
P(AB) is the joint probability
Conditional probability
• Conditional probability, as its name suggests, is the probability of an event happening given that a
condition holds.

• For example, suppose the probability of a boy playing tennis in the evening is 95% (0.95), whereas the
probability that he plays given that it is a rainy day is only 10% (0.1). The former is an ordinary
probability, whereas the latter is a conditional probability. We represent the two
probabilities as P(Play tennis) = 0.95 and P(Play tennis | Rainy day) = 0.1.

• Conditional probability is one of the important concepts in probability and statistics. The "probability of A
given B" (or "probability of A with respect to the condition B") is denoted P(A | B), also written
P(A / B). Thus P(A | B) represents the probability of A occurring after event B has already happened; the
probability of an event may change when a condition is given.

• If A and B are two events associated with the same sample space of a random experiment, the conditional
probability of event A given that B has occurred is given by P(A/B) = P( A ∩ B)/ P (B), provided P(B) ≠ 0.
• The conditional probability of event A given that event B has already happened
P(A/B) =P(AB)/P(B)
• The conditional probability of event B given that event A has already happened
P(B/A) =P(AB)/P(A)
where the joint probability has the commutative property
P(AB) = P(BA)
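As a sketch, the two conditional-probability formulas and the commutativity of the joint event can be verified on the die sample space; the choice of events A and B here is illustrative.

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}     # "even number" (illustrative event)
B = {3, 4, 5, 6}  # "greater than two" (illustrative event)

def prob(event):
    return Fraction(len(event), len(S))

p_ab = prob(A & B)             # joint probability P(AB)
p_a_given_b = p_ab / prob(B)   # P(A/B) = P(AB)/P(B)
p_b_given_a = p_ab / prob(A)   # P(B/A) = P(AB)/P(A)

print(p_a_given_b, p_b_given_a)    # 1/2 2/3
assert prob(A & B) == prob(B & A)  # P(AB) = P(BA)
```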

Statistically Independent Events

Two events are independent if the occurrence of one event does not affect the chances of the occurrence of the other
event. The mathematical formulation of the independence of events A and B is the probability of the occurrence of
both A and B being equal to the product of the probabilities of A and B (i.e., P(A and B) = P(A)P(B)).

Example: A box contains 3 red, 4 white and 5 black balls. One ball is drawn at random. Find the probability that it is
(a) red, (b) not black, (c) black or white.
Solution: a) P(R) = 3/(3+4+5) = 3/12
= 1/4

b) P(B̄) = 1 − P(B)
P(B) = 5/12
P(B̄) = 1 − 5/12
= 7/12
c) P(B+W) = P(B) + P(W)
= 5/12 + 4/12
= 9/12
= 3/4
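The three answers above can be reproduced by direct counting; a short sketch:

```python
from fractions import Fraction

# The box from the example: 3 red, 4 white, 5 black balls.
counts = {"red": 3, "white": 4, "black": 5}
total = sum(counts.values())  # 12 balls

p_red = Fraction(counts["red"], total)              # (a)
p_not_black = 1 - Fraction(counts["black"], total)  # (b) complement rule
# (c) black and white are mutually exclusive, so probabilities add
p_black_or_white = Fraction(counts["black"], total) + Fraction(counts["white"], total)

print(p_red, p_not_black, p_black_or_white)  # 1/4 7/12 3/4
```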
Random variables

● A random variable is a mathematical formalization of a quantity or object
that depends on random events. If s represents the outcome of the
experiment, then the random variable is represented by X(s).
● Two important functions of random variable are cumulative distribution
function(CDF) and probability density function(PDF)
Two types
1) Discrete random variable: Discrete random variables take on a countable
number of distinct values. e.g. coin-flipping(head or tail)
2) Continuous random variable: Continuous random variables can represent any
value within a specified range or interval and can take on an infinite number
of possible values.(Rainfall in a month)
Cumulative Distribution Function(CDF)
The cumulative distribution function (CDF) of a random variable is defined as the
probability that the random variable X takes value less than or equal to x. CDF is also
known as the distribution function
For example, if a fair die is rolled, each outcome 1 through 6 has probability ⅙ ≈
16.67%, and the CDF at x = 3 is FX(3) = P(X ≤ 3) = 3/6 = 1/2.
CDF: FX(x) = P(X ≤ x)
Here, x is a dummy variable and the CDF is denoted by FX(x).
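A sketch of the fair-die CDF as a staircase function, computed directly from the definition F(x) = P(X ≤ x):

```python
from fractions import Fraction

# PMF of a fair die: each face has probability 1/6.
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

def cdf(x):
    # F(x) = P(X <= x): sum the PMF over all outcomes k <= x.
    return sum((p for k, p in pmf.items() if k <= x), Fraction(0))

print(cdf(0), cdf(3), cdf(6))  # 0 1/2 1 -- bounded and non-decreasing
```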
Properties of CDF

The cumulative distribution function F(x) has the following properties:


Property 1. The CDF is always bounded between 0 and 1
0 ≤ Fx(x) ≤ 1
Property 2.
Fx(+∞) = 1
Property 3. Fx(–∞) = 0
Property 4. Fx(x) is a monotone non-decreasing function:
Fx(x1) ≤ Fx(x2) if x1 ≤ x2
Calculation of CDF for discrete random variable
We know that X is a discrete RV, which takes values only at discrete points.
CDF: FX(x) = P(X ≤ x)

X is an RV and we may represent its values as x1, x2, x3, …, xn.

There are no outcomes below x1.
Therefore P(X < x1) = 0, and
Fx(x) = 0 for x < x1
P(x1 ≤ X ≤ xk) = P(X = x1) + P(X = x2) + P(X = x3) + … + P(X = xk)
● Over the complete range, the CDF is the running sum of these point probabilities (a staircase function).
Numerical
A three-digit message is transmitted over a noisy channel having a probability of error
P(E) = 2/5 per digit. Find the corresponding CDF.
Given probability of error P(E) = 2/5
Probability of correct digit = 1-2/5
= 3/5
There are 8 patterns in the sample space:
S = {CCC, CCE, CEC, CEE, ECC, ECE, EEC, EEE}
Let the random variable X denote the number of errors in the received message:
X ∈ {no error, one error, two errors, all three digits in error}
● P(X= x0) is the probability that there is no error in the received signal
P(X=x0) = P(CCC) =P (C) P (C) P (C)
= (3/5) × (3/5) × (3/5)
= 27/125
• P(X = x1) is the probability that there is one error in the received message
x1 = {CCE, CEC, ECC}
P(X = x1) = P(C)P(C)P(E) + P(C)P(E)P(C) + P(E)P(C)P(C)
= P(C)P(C)P(E) × 3 (number of patterns with one error)
= (3/5) × (3/5) × (2/5) × 3
= 54/125
• P(X = x2) is the probability that there are two errors in the received message
x2 = {CEE, ECE, EEC}
P(X = x2) = P(C)P(E)P(E) × 3 (number of patterns with two errors)
= (3/5) × (2/5) × (2/5) × 3
= 36/125
• P(X = x3) is the probability that all three digits are in error
x3 = {EEE}
P(X = x3) = P(E)P(E)P(E)
= (2/5) × (2/5) × (2/5)
= 8/125
● Now calculate CDF Fx(x) at all values of x
Fx(x) = 0 for x < x0
Fx(x0) = P(X < x0) + P(X = x0)
= 0 + 27/125 = 27/125
Fx(x1) = P(X≤ x1)
=P(X<x0)+ P(X= x0)+ P(X= x1)
= 0+27/125+54/125 =81/125
Fx(x2) = P(x≤ x2)
=P(X<x0)+ P(X= x0)+ P(X= x1)+ P(X= x2)
= 0+27/125+54/125+36/125= 117/125
Fx(x3) = P(x≤ x3)
=P(X<x0)+ P(X= x0)+ P(X= x1)+ P(X= x2)+ P(X= x3)
= 0+27/125+54/125+36/125+8/125 =125/125 =1
Probabilities of the random variable X at xi, i = 0, 1, 2, 3
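The four probabilities and CDF values above follow the binomial pattern C(3, k) p^k (1−p)^(3−k) with p = 2/5; a sketch that reproduces the table:

```python
from fractions import Fraction
from math import comb

p_err = Fraction(2, 5)  # per-digit error probability from the problem
n = 3                   # digits per message

def pmf(k):
    # P(X = k) = C(n, k) * p^k * (1 - p)^(n - k)
    return comb(n, k) * p_err**k * (1 - p_err)**(n - k)

def cdf(x):
    # F(x) = P(X <= x)
    return sum(pmf(k) for k in range(0, x + 1))

print([pmf(k) for k in range(4)])  # [27/125, 54/125, 36/125, 8/125]
print([cdf(k) for k in range(4)])  # [27/125, 81/125, 117/125, 1]
```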
Random process (stochastic process)
It is a collection of random variables indexed by time.
X1, X2, …, Xn are random variables; the index set may be discrete (X0, X1, …, Xn) or continuous ({Xt}, t ≥ 0).

Definition: A random process X(A, t) is a function of two variables, the event A and time t.

For a fixed event Ai, X(Ai, t) = Xi(t) is a sample function; the totality of all sample functions
is known as the ensemble.
● The concept of a random process allows us to study systems involving signals
that are not predictable
● Random process represents the mathematical model of random signals
● Random process can be denoted by X(t,s) or simply X(t) where s is the
sample point of random experiment at time t
● A deterministic process has only one possible 'reality' of how the process evolves under time.
● In a stochastic or random process there are some uncertainties in its future evolution
described by probability distributions.
● Many time-varying signals are random in nature:
○ Noises
○ Image, audio: usually unknown to the distant receiver.
● Random process represents the mathematical model of these random signals.
● Definition: A random process (or stochastic process) is a collection of random variables
(functions) indexed by time.
Notation:
● X (t, s)
● where s: the sample point of the random experiment. t: time.
● Simplified Notation is X (t)
● The difference between random variable and random process:
○ Random variable: an outcome is mapped to a number.
○ Random process: an outcome is mapped to a random waveform that is a function of time
● Each sample point represents a time-varying function.
Stochastic Process
Each sample point in S is associated with a sample function x(t).
X(t, s) is a random process:
an ensemble of all time functions together with a probability rule.
X(tk, sj) is a realization or sample function of the random process:
{x1(tk), x2(tk), …, xn(tk)} = {X(tk, s1), X(tk, s2), …, X(tk, sn)}
The probability rule assigns a probability to any meaningful event associated
with an observation; an observation is a sample function of the random
process.

A stochastic process X(t, s) is represented by time indexed ensemble (family)


of random variables {X(t, s)}
Represented compactly by : X(t)
“A stochastic process X(t) is an ensemble of time functions, which,
together with a probability rule, assigns a probability to any meaningful
event associated with an observation of one of the sample functions of the
stochastic process”.
Stationary Process:
If a process divided into a number of time intervals exhibits the
same statistical properties in each interval, it is called stationary.
It arises from a stable phenomenon that has evolved into a steady-state mode of behavior.
Non-Stationary Process:
If a process divided into a number of time intervals exhibits
different statistical properties across intervals, it is called non-stationary. It arises from an
unstable phenomenon.

The stochastic process X(t), initiated at t = −∞, is said to be stationary in the strict sense, or
strictly stationary, if
FX(t1+τ),X(t2+τ),…,X(tk+τ)(x1, x2, …, xk) = FX(t1),X(t2),…,X(tk)(x1, x2, …, xk)
where
X(t1), X(t2), …, X(tk) denote the RVs obtained by sampling the process
X(t) at t1, t2, …, tk respectively;
FX(t1),X(t2),…,X(tk)(x1, x2, …, xk) denotes the joint distribution function of these RVs;
X(t1+τ), X(t2+τ), …, X(tk+τ) denote the new RVs obtained by
sampling the process X(t) at t1+τ, t2+τ, …, tk+τ respectively, where
τ is a fixed time shift; and
FX(t1+τ),X(t2+τ),…,X(tk+τ)(x1, x2, …, xk) denotes the joint distribution function of the new RVs.
● Weakly Stationary Process:

● A stochastic process X(t) is said to be weakly stationary (wide-sense stationary) if its
second-order moments satisfy:
● The mean of the process X(t) is constant for all time t.
● The autocorrelation function of the process X(t) depends solely on the difference between any
two times at which the process is sampled.
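As an illustrative sketch (not from the text), the classic random-phase sinusoid X(t) = cos(2πft + Θ), with Θ uniform on [0, 2π), is wide-sense stationary: its ensemble mean is 0 at every t, and its autocorrelation depends only on the lag τ. Both properties can be checked empirically over a simulated ensemble; the frequency and ensemble size below are arbitrary choices.

```python
import math
import random

random.seed(0)  # reproducible ensemble

F = 1.0    # assumed frequency of the sinusoid
N = 20000  # number of sample functions in the ensemble
thetas = [random.uniform(0.0, 2.0 * math.pi) for _ in range(N)]

def x(theta, t):
    # One sample function of the random-phase sinusoid.
    return math.cos(2.0 * math.pi * F * t + theta)

def ensemble_mean(t):
    # Average of X(t) across the ensemble; should be near 0 for any t.
    return sum(x(th, t) for th in thetas) / N

def autocorr(t, tau):
    # Estimate of R_X(t, t + tau); should be near 0.5*cos(2*pi*F*tau)
    # regardless of the absolute time t.
    return sum(x(th, t) * x(th, t + tau) for th in thetas) / N

print(ensemble_mean(0.3))
print(autocorr(0.1, 0.25), autocorr(0.7, 0.25))
```

With a large ensemble, the two autocorrelation estimates agree even though they are taken at different absolute times, which is the wide-sense stationarity condition stated above.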
