Introduction to Random Processes
1 / 26
Table of Contents
Random or Stochastic processes
Mathematical Tools for Studying Random Processes
Stationary Random Processes
2 / 26
Stochastic processes or random processes
▶ A stochastic process is a mathematical model of a
probabilistic experiment that evolves in time and generates
a sequence of numerical values.
▶ Each numerical value in the sequence is modeled by a random variable.
▶ Formally, it is a sequence of random variables (Xt)t≥0 defined on a common sample space.
3 / 26
Example
▶ the sequence of daily prices of a stock;
▶ the sequence of scores in a football game;
▶ the sequence of failure times of a machine;
▶ the sequence of hourly traffic loads at a node of a
communication network;
▶ the sequence of radar measurements of the position of an
airplane
4 / 26
Example
▶ The collection (S0, S1, S2) is a random process.
▶ S0, S1, S2 are random variables representing the asset price at times 0, 1, 2.
▶ The set of time indices is I = {0, 1, 2}.
▶ The state space is the set of all possible prices, S = {1, 2, 4, 8, 16}.
▶ Corresponding to the coin-toss outcome ω = HH, we have a sample path/trajectory/realization (4, 8, 16).
5 / 26
Use random processes to
▶ model phenomena that evolve over time
▶ take dependence into account, e.g., how knowledge of the asset price up to today affects the behavior of the asset price tomorrow or in the future
▶ forecast future values
▶ evaluate risk
6 / 26
Example
Consider a binomial asset pricing model
▶ S0 = 4
▶ p(H) = p(T) = 1/2
▶ u = 2, d = 1/2
Suppose we know that S1 = 2, S2 = 4, S3 = 8.
Then E(S4 |S1 = 2, S2 = 4, S3 = 8) is used to forecast the asset
price at period 4.
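A minimal numerical sketch of this one-step forecast, assuming (as in the binomial model) that S4 depends on the past only through S3 and the next coin toss; the variable names are illustrative only.

    u, d = 2.0, 0.5          # up and down factors
    p = q = 0.5              # probabilities of head and tail
    s3 = 8.0                 # last observed price

    # S4 is either u*S3 (head) or d*S3 (tail), so the conditional expectation is
    e_s4 = p * u * s3 + q * d * s3
    print(e_s4)              # 10.0 = forecast of the asset price at period 4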
7 / 26
Example - Autoregressive model AR(1)
Let Sn be the asset price at period n and let rn = (Sn − Sn−1)/Sn−1 be the percentage return at period n.
The return at period n depends on the return at period n − 1 and a random noise ϵn:
rn = c + ϕrn−1 + ϵn
ϵ1, ϵ2, . . . are independent (unpredictable terms affecting the return).
8 / 26
▶ Assume that c = 3, ϕ = 1 and ϵn ∼ N(0, 1).
▶ Given that r0 = 3, r1 = 1, r2 = 4, r3 = −1, we have
r4 = 3 + (−1) + ϵ4 = 2 + ϵ4
▶ The conditional distribution of r4 is
r4 | (r0 = 3, r1 = 1, r2 = 4, r3 = −1) ∼ N(2, 1)
▶ Forecast of the return at period 4:
E(r4 | r0 = 3, r1 = 1, r2 = 4, r3 = −1) = 2
▶ Risk that the return at period 4 is negative:
P(r4 < 0 | r0 = 3, r1 = 1, r2 = 4, r3 = −1) = P(X < 0), where X ∼ N(2, 1)
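A short numerical check of the forecast and the risk above; this is a sketch only, and scipy is assumed to be available.

    from scipy.stats import norm

    c, phi = 3.0, 1.0
    r3 = -1.0

    mean_r4 = c + phi * r3                          # conditional mean of r4: 2.0
    risk = norm.cdf(0.0, loc=mean_r4, scale=1.0)    # P(r4 < 0 | past), with r4 | past ~ N(2, 1)

    print(mean_r4, risk)                            # 2.0  0.0228...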
9 / 26
Exercise - AR(2)
The return at period n depends on the last two returns, at periods n − 1 and n − 2, and a random noise ϵn:
rn = 1 + 0.5rn−2 + 2rn−1 + ϵn
ϵ1, ϵ2, . . . are independent and N(0, 1)-distributed.
Given that r0 = 3, r1 = 1, r2 = 4, r3 = −1,
1. forecast the return at period 5;
2. evaluate the risk that the return at period 5 is less than −1.
(A simulation sketch that can be used to check the answers follows below.)
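A minimal Monte Carlo sketch for checking the answers, assuming numpy is available; it simulates ϵ4 and ϵ5 and propagates the AR(2) recursion two steps forward.

    import numpy as np

    rng = np.random.default_rng(0)
    c, phi2, phi1 = 1.0, 0.5, 2.0             # r_n = c + phi2*r_{n-2} + phi1*r_{n-1} + eps_n
    r2, r3 = 4.0, -1.0                        # last two observed returns

    n_sims = 1_000_000
    eps4 = rng.standard_normal(n_sims)
    eps5 = rng.standard_normal(n_sims)

    r4 = c + phi2 * r2 + phi1 * r3 + eps4     # simulated period-4 returns
    r5 = c + phi2 * r3 + phi1 * r4 + eps5     # simulated period-5 returns

    print(r5.mean())                          # Monte Carlo forecast of E(r5 | info)
    print((r5 < -1).mean())                   # Monte Carlo estimate of P(r5 < -1 | info)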
10 / 26
Classification of stochastic processes (Xt )t∈I taking
values in S
▶ Based on the time index set I
▶ Discrete time: I is countable, e.g., {0, 1, 2, . . . }
▶ Continuous time: I is uncountable, e.g., [0, ∞), [0, 1]
▶ Based on the state space S
▶ Discrete state: S is countable, e.g., {0, 1, 2, . . . }, {2^0, 2^(±1), 2^(±2), . . . }
▶ Continuous state: S is uncountable, e.g., [0, ∞), [0, 1]
11 / 26
Discrete time vs Continuous time
[Figures: a sample path of a discrete-time process and a sample path of a continuous-time process]
12 / 26
Discrete state vs Continuous state
[Figures: a sample path of a discrete-state process and a sample path of a continuous-state process]
13 / 26
Random processes in this course
Two important properties: martingale and Markov
14 / 26
Table of Contents
Random or Stochastic processes
Mathematical Tools for Studying Random Processes
Stationary Random Processes
15 / 26
Mathematical Tools for Studying Random Processes
As with random variables, we can describe a random process mathematically in terms of cumulative distribution functions, probability density functions, probability mass functions, conditional pmfs, conditional pdfs, expectations, and so on.
16 / 26
Example
Consider a binomial asset pricing model with S0 = 4, u = 2, d = 1/2, p = q = 0.5.
1. Given S0 = 4, S1 = 8, S2 = 4, S3 = 2, S4 = 1, what is the conditional p.m.f. of S5 and the conditional expectation of S5? (See the sketch after this list.)
2. Given S0 = 4, S1 = 2, S2 = 4, S3 = 1, what is the conditional p.m.f. of S5 and the conditional expectation of S5?
3. Is S5 independent of S0, S1, S2, S3, S4?
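For question 1, a small sketch of the computation, assuming (as in the binomial model) that, given the whole path, the distribution of S5 depends only on S4 and the next toss; the variable names are illustrative only.

    from fractions import Fraction

    u, d = Fraction(2), Fraction(1, 2)
    p = q = Fraction(1, 2)
    s4 = Fraction(1)                     # last observed price in question 1

    pmf_s5 = {u * s4: p, d * s4: q}      # conditional p.m.f. of S5
    e_s5 = sum(price * prob for price, prob in pmf_s5.items())

    print(pmf_s5)                        # S5 = 2 with prob 1/2, S5 = 1/2 with prob 1/2
    print(e_s5)                          # conditional expectation: 5/4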
17 / 26
Mean function
The mean function of a random process X = (Xt )t≥0 is
the expected value of the process at each time period
µX (t) = E(Xt )
It is the simplest way to describe the trend of a random process.
18 / 26
Example
Compute the mean function
µS (n) = E(Sn )
where S = (Sn )n≥0 is a binomial asset pricing model.
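A sketch comparing a Monte Carlo estimate with the closed-form mean E(Sn) = S0(pu + qd)^n, which follows because the multiplicative up/down factors are i.i.d.; numpy is assumed.

    import numpy as np

    s0, u, d, p = 4.0, 2.0, 0.5, 0.5
    n, n_sims = 5, 1_000_000

    rng = np.random.default_rng(1)
    heads = rng.random((n_sims, n)) < p              # True = up move
    sn = s0 * np.where(heads, u, d).prod(axis=1)     # simulated values of S_n

    print(sn.mean())                                 # Monte Carlo estimate of mu_S(n)
    print(s0 * (p * u + (1 - p) * d) ** n)           # analytic mean: 4 * 1.25**5 = 12.207...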
19 / 26
Example - Gambler
Consider a fair game in which the chances of winning and losing equal amounts are the same, i.e., if Xk denotes the outcome of the k-th trial of the game, then E[Xk] = 0. Suppose that the gambler's initial wealth is 0 and that he is allowed to borrow as much as he needs, at no cost, in order to play. Then his total wealth after k trials is

Mk = X1 + · · · + Xk = ∑_{n=1}^{k} Xn

Compute the mean function µM(k) of the wealth process.
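By linearity, µM(k) = E(X1) + · · · + E(Xk) = 0 for every k. A quick simulation sketch under one concrete choice of fair game (±1 bets with equal probability, so that E[Xk] = 0); numpy is assumed.

    import numpy as np

    rng = np.random.default_rng(2)
    n_sims, k = 200_000, 10

    x = rng.choice([-1.0, 1.0], size=(n_sims, k))   # fair +/-1 bets, E[X_n] = 0
    m = x.cumsum(axis=1)                            # wealth paths M_1, ..., M_k

    print(m.mean(axis=0))                           # estimates of mu_M(1), ..., mu_M(10); all near 0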
20 / 26
Autocovariance function
The autocovariance function of a random process (Xt )t≥0
is defined as the covariance of Xt1 and Xt2
CX(t1, t2) = Cov(Xt1, Xt2) = E[(Xt1 − µX(t1))(Xt2 − µX(t2))]
which is equivalent to
CX(t1, t2) = RX(t1, t2) − µX(t1)µX(t2)
where RX(t1, t2) = E(Xt1 Xt2) is the autocorrelation function.
Naturally, the autocorrelation function describes the relationship
(correlation) between two samples of a random process. This
correlation will depend on when the samples are taken; thus,
the autocorrelation function is, in general, a function of two time
variables. Quite often we are interested in how the correlation
between two samples depends on how far apart the samples
are spaced.
CX (t, t + τ ) = Cov(Xt , Xt+τ )
where τ is a time difference variable.
21 / 26
Example
Revisit the AR(1) model, the binomial asset pricing model, and the wealth process in the gambler problem.
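For the gambler's wealth process, assuming in addition that the trial outcomes Xn are independent with a common variance, one gets CM(t1, t2) = min(t1, t2) · Var(X1). A simulation sketch of this, again with ±1 fair bets as a concrete choice; numpy is assumed.

    import numpy as np

    rng = np.random.default_rng(3)
    n_sims, k = 200_000, 10
    x = rng.choice([-1.0, 1.0], size=(n_sims, k))   # fair +/-1 bets with Var(X_1) = 1
    m = x.cumsum(axis=1)                            # wealth paths M_1, ..., M_k

    t1, t2 = 3, 7                                   # column t-1 holds M_t
    c_hat = np.cov(m[:, t1 - 1], m[:, t2 - 1])[0, 1]
    print(c_hat)                                    # close to min(t1, t2) * Var(X_1) = 3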
22 / 26
Table of Contents
Random or Stochastic processes
Mathematical Tools for Studying Random Processes
Stationary Random Processes
23 / 26
Strictly stationary
A random process is strictly stationary if the joint distributions of the process are invariant under a time shift. That is, for any
0 ≤ t1 < t2 < · · · < tn, the random vectors (Xt1, . . . , Xtn) and
(Xt1+τ, . . . , Xtn+τ) have the same distribution for any τ > 0.
24 / 26
Weakly stationary
A random process is weakly stationary if its mean function and autocovariance function are invariant under a time shift. That is,
µX(t) = µ = constant
and
CX(t, t + τ) = CX(0, τ) = CX(τ) (a function of τ only).
If either condition fails, the process is not weakly stationary.
25 / 26
Example - a simple Gaussian process
Suppose the Xn are independent and identically distributed (i.i.d.) N(0, 1). Then the process (Xn)n≥0 is weakly stationary.
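A small sketch checking the two weak-stationarity conditions empirically for this i.i.d. Gaussian process; numpy is assumed.

    import numpy as np

    rng = np.random.default_rng(4)
    x = rng.standard_normal((100_000, 20))   # rows are paths of X_0, ..., X_19, i.i.d. N(0, 1)

    print(x.mean(axis=0)[:5])                # mean function: approximately 0 at every time n
    for tau in range(3):                     # autocovariance should depend only on the lag tau
        covs = [np.cov(x[:, t], x[:, t + tau])[0, 1] for t in range(5)]
        print(tau, np.round(covs, 3))        # ~1 for tau = 0 and ~0 for tau >= 1, for every t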
26 / 26